commit (string, lengths 40–40) | old_file (string, lengths 4–184) | new_file (string, lengths 4–184) | old_contents (string, lengths 1–3.6k) | new_contents (string, lengths 5–3.38k) | subject (string, lengths 15–778) | message (string, lengths 16–6.74k) | lang (string, 201 distinct values) | license (string, 13 distinct values) | repos (string, lengths 6–116k) | config (string, 201 distinct values) | content (string, lengths 137–7.24k) | diff (string, lengths 26–5.55k) | diff_length (int64, 1–123) | relative_diff_length (float64, 0.01–89) | n_lines_added (int64, 0–108) | n_lines_deleted (int64, 0–106) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
48deec729bab46f05b23f7d6b4a20eec88c5f2f0 | pages/static/pages/css/tutorial_detail.css | pages/static/pages/css/tutorial_detail.css | .language-switch {
margin: auto 1em 1em auto;
float: right;
}
p {
text-align: justify;
}
img {
max-width: 100%;
}
.content .alert {
margin-top: 16px;
}
.content .alert .close {
margin-left: 8px;
}
| .language-switch {
margin: auto 1em 1em auto;
}
@media(min-width: 992px) { /* md or up */
.language-switch {
float: right;
}
}
p {
text-align: justify;
}
img {
max-width: 100%;
}
.content .alert {
margin-top: 16px;
}
.content .alert .close {
margin-left: 8px;
}
| Put select in own row on small screens | Put select in own row on small screens
| CSS | mit | djangogirlstaipei/djangogirlstaipei,djangogirlstaipei/djangogirlstaipei,djangogirlstaipei/djangogirlstaipei | css | ## Code Before:
.language-switch {
margin: auto 1em 1em auto;
float: right;
}
p {
text-align: justify;
}
img {
max-width: 100%;
}
.content .alert {
margin-top: 16px;
}
.content .alert .close {
margin-left: 8px;
}
## Instruction:
Put select in own row on small screens
## Code After:
.language-switch {
margin: auto 1em 1em auto;
}
@media(min-width: 992px) { /* md or up */
.language-switch {
float: right;
}
}
p {
text-align: justify;
}
img {
max-width: 100%;
}
.content .alert {
margin-top: 16px;
}
.content .alert .close {
margin-left: 8px;
}
| .language-switch {
margin: auto 1em 1em auto;
+ }
+ @media(min-width: 992px) { /* md or up */
+ .language-switch {
- float: right;
+ float: right;
? ++++
+ }
}
p {
text-align: justify;
}
img {
max-width: 100%;
}
.content .alert {
margin-top: 16px;
}
.content .alert .close {
margin-left: 8px;
} | 6 | 0.375 | 5 | 1 |
9a0fd7b490b1ccc3534f27c3ca6aad654b460006 | _recipes/recipe_1.md | _recipes/recipe_1.md | ---
layout: recipes
title: "Recipe 1"
type: "beef"
picture: ../assets/img/patrick_pigs.jpg
---
Beef is a great source of protein. Grass-fed beef like that raised at Eight Kids Farm is even better! Here are some of our favorite beef recipes.
| ---
layout: recipes
title: "Beef terriyaki"
type: "beef"
picture: ../assets/img/patrick_pigs.jpg
---
Beef is a great source of protein. Grass-fed beef like that raised at Eight Kids Farm is even better! Here are some of our favorite beef recipes.
| Change to long title to see how word wrap works | Change to long title to see how word wrap works
| Markdown | mit | mtschloss/nkidsfarm_website,pschloss/nkidsfarm_website,pschloss/nkidsfarm_website,pschloss/nkidsfarm_website,mtschloss/nkidsfarm_website | markdown | ## Code Before:
---
layout: recipes
title: "Recipe 1"
type: "beef"
picture: ../assets/img/patrick_pigs.jpg
---
Beef is a great source of protein. Grass-fed beef like that raised at Eight Kids Farm is even better! Here are some of our favorite beef recipes.
## Instruction:
Change to long title to see how word wrap works
## Code After:
---
layout: recipes
title: "Beef terriyaki"
type: "beef"
picture: ../assets/img/patrick_pigs.jpg
---
Beef is a great source of protein. Grass-fed beef like that raised at Eight Kids Farm is even better! Here are some of our favorite beef recipes.
| ---
layout: recipes
- title: "Recipe 1"
+ title: "Beef terriyaki"
type: "beef"
picture: ../assets/img/patrick_pigs.jpg
---
Beef is a great source of protein. Grass-fed beef like that raised at Eight Kids Farm is even better! Here are some of our favorite beef recipes. | 2 | 0.25 | 1 | 1 |
0300966ea5a1d97960dd8a0ec7128d363047751e | .travis.yml | .travis.yml | dist: xenial
language: c
compiler:
- clang
- gcc
python:
- "2.7"
sudo: required
before_install:
# - sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
- sudo -H PATH="${PATH}:/usr/local/clang-3.4/bin" pip install -r requirements.txt
- sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
# - sudo brew update
# - sudo brew install openssl
# - sudo brew install python --with-brewed-openssl
- pip install --user cpp-coveralls
env:
global:
- COVERALLS_PARALLEL=true
script:
sh -x ./scripts/do-test.sh
after_success:
- if [ "${CC}" = "gcc" ];
then
coveralls --gcov gcov --gcov-options '\-lp';
fi
| language: c
compiler:
- clang
- gcc
python:
- "2.7"
sudo: required
before_install:
# - sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
- sudo -H PATH="${PATH}:/usr/local/clang-3.4/bin" pip install -r requirements.txt
- sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
# - sudo brew update
# - sudo brew install openssl
# - sudo brew install python --with-brewed-openssl
- pip install --user cpp-coveralls
env:
global:
- COVERALLS_PARALLEL=true
script:
sh -x ./scripts/do-test.sh
after_success:
- if [ "${CC}" = "gcc" ];
then
coveralls --gcov gcov --gcov-options '\-lp';
fi
| Revert "Switch to xenial as being a more modern platform." | Revert "Switch to xenial as being a more modern platform."
This reverts commit 2f093dccf1b196880c0e18c90b7d79fdee3cb6c0.
| YAML | bsd-2-clause | dsanders11/rtpproxy,sippy/rtpproxy,sippy/rtpproxy,dsanders11/rtpproxy,dsanders11/rtpproxy,sippy/rtpproxy | yaml | ## Code Before:
dist: xenial
language: c
compiler:
- clang
- gcc
python:
- "2.7"
sudo: required
before_install:
# - sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
- sudo -H PATH="${PATH}:/usr/local/clang-3.4/bin" pip install -r requirements.txt
- sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
# - sudo brew update
# - sudo brew install openssl
# - sudo brew install python --with-brewed-openssl
- pip install --user cpp-coveralls
env:
global:
- COVERALLS_PARALLEL=true
script:
sh -x ./scripts/do-test.sh
after_success:
- if [ "${CC}" = "gcc" ];
then
coveralls --gcov gcov --gcov-options '\-lp';
fi
## Instruction:
Revert "Switch to xenial as being a more modern platform."
This reverts commit 2f093dccf1b196880c0e18c90b7d79fdee3cb6c0.
## Code After:
language: c
compiler:
- clang
- gcc
python:
- "2.7"
sudo: required
before_install:
# - sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
- sudo -H PATH="${PATH}:/usr/local/clang-3.4/bin" pip install -r requirements.txt
- sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
# - sudo brew update
# - sudo brew install openssl
# - sudo brew install python --with-brewed-openssl
- pip install --user cpp-coveralls
env:
global:
- COVERALLS_PARALLEL=true
script:
sh -x ./scripts/do-test.sh
after_success:
- if [ "${CC}" = "gcc" ];
then
coveralls --gcov gcov --gcov-options '\-lp';
fi
| - dist: xenial
language: c
compiler:
- clang
- gcc
python:
- "2.7"
sudo: required
before_install:
# - sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
- sudo -H PATH="${PATH}:/usr/local/clang-3.4/bin" pip install -r requirements.txt
- sudo -H DEBIAN_FRONTEND=noninteractive apt-get update
# - sudo brew update
# - sudo brew install openssl
# - sudo brew install python --with-brewed-openssl
- pip install --user cpp-coveralls
env:
global:
- COVERALLS_PARALLEL=true
script:
sh -x ./scripts/do-test.sh
after_success:
- if [ "${CC}" = "gcc" ];
then
coveralls --gcov gcov --gcov-options '\-lp';
fi | 1 | 0.038462 | 0 | 1 |
6732862e26b3aab27e136e82cf330fbf23ae1a6c | css/kontakt.css | css/kontakt.css | .portrait {
width: 160px;
height: 160px;
}
#kontakt_table {
margin: 25px 0 50px 50px;
border: #cccccc solid 1px;
}
#kontakt_table_heading {
border: #0a0b43 solid 1px;
}
#kontakt_table th {
background-color: #0a0b43;
color: white;
}
#kontakt_table .portrait_role {
text-align: center;
}
#kontakt_table tr.person {
border-bottom: #ec3434 solid 2px;
}
#kontakt_table tr.person>td {
padding-bottom: 1em;
}
#kontakt_table .name {
background-color: #eeeeee;
color: #0a0b43;
height: 30px;
}
#kontakt_table .name,.role {
text-align: center;
}
#kontakt_table tr.role>td {
border-bottom: #ec3434 solid 2px;
padding-bottom: 0.5em;
padding-top: 0.5em;
}
| .portrait {
width: 160px;
height: 160px;
}
#kontakt_table {
margin: 25px 0 50px 50px;
border: #cccccc solid 1px;
}
#kontakt_table_heading {
border: #0a0b43 solid 1px;
}
#kontakt_table th {
background-color: #0a0b43;
color: white;
}
#kontakt_table .portrait_role {
text-align: center;
}
#kontakt_table .name {
background-color: #eeeeee;
color: #0a0b43;
height: 30px;
}
#kontakt_table .name,.role {
text-align: center;
}
#kontakt_table tr.role>td {
border-bottom: #ec3434 solid 2px;
padding-bottom: 0.5em;
padding-top: 0.5em;
}
| Remove styles of old design | Remove styles of old design
| CSS | mit | leozilla/indoorcup-web,leozilla/indoorcup-web,leozilla/indoorcup-web | css | ## Code Before:
.portrait {
width: 160px;
height: 160px;
}
#kontakt_table {
margin: 25px 0 50px 50px;
border: #cccccc solid 1px;
}
#kontakt_table_heading {
border: #0a0b43 solid 1px;
}
#kontakt_table th {
background-color: #0a0b43;
color: white;
}
#kontakt_table .portrait_role {
text-align: center;
}
#kontakt_table tr.person {
border-bottom: #ec3434 solid 2px;
}
#kontakt_table tr.person>td {
padding-bottom: 1em;
}
#kontakt_table .name {
background-color: #eeeeee;
color: #0a0b43;
height: 30px;
}
#kontakt_table .name,.role {
text-align: center;
}
#kontakt_table tr.role>td {
border-bottom: #ec3434 solid 2px;
padding-bottom: 0.5em;
padding-top: 0.5em;
}
## Instruction:
Remove styles of old design
## Code After:
.portrait {
width: 160px;
height: 160px;
}
#kontakt_table {
margin: 25px 0 50px 50px;
border: #cccccc solid 1px;
}
#kontakt_table_heading {
border: #0a0b43 solid 1px;
}
#kontakt_table th {
background-color: #0a0b43;
color: white;
}
#kontakt_table .portrait_role {
text-align: center;
}
#kontakt_table .name {
background-color: #eeeeee;
color: #0a0b43;
height: 30px;
}
#kontakt_table .name,.role {
text-align: center;
}
#kontakt_table tr.role>td {
border-bottom: #ec3434 solid 2px;
padding-bottom: 0.5em;
padding-top: 0.5em;
}
| .portrait {
width: 160px;
height: 160px;
}
#kontakt_table {
margin: 25px 0 50px 50px;
border: #cccccc solid 1px;
}
#kontakt_table_heading {
border: #0a0b43 solid 1px;
}
#kontakt_table th {
background-color: #0a0b43;
color: white;
}
#kontakt_table .portrait_role {
text-align: center;
}
- #kontakt_table tr.person {
- border-bottom: #ec3434 solid 2px;
- }
-
- #kontakt_table tr.person>td {
- padding-bottom: 1em;
- }
-
-
-
#kontakt_table .name {
background-color: #eeeeee;
color: #0a0b43;
height: 30px;
}
#kontakt_table .name,.role {
text-align: center;
}
#kontakt_table tr.role>td {
border-bottom: #ec3434 solid 2px;
padding-bottom: 0.5em;
padding-top: 0.5em;
} | 10 | 0.208333 | 0 | 10 |
582f29e6bd22cbd9a49656dd4a677deeccf86735 | android/src/main/java/org/kobjects/nativehtml/android/HtmlView.java | android/src/main/java/org/kobjects/nativehtml/android/HtmlView.java | package org.kobjects.nativehtml.android;
import android.content.Context;
import android.widget.FrameLayout;
import java.io.Reader;
import java.net.URI;
import org.kobjects.nativehtml.dom.Element;
import org.kobjects.nativehtml.io.DefaultRequestHandler;
import org.kobjects.nativehtml.io.HtmlParser;
import org.kobjects.nativehtml.io.InternalLinkHandler;
public class HtmlView extends FrameLayout implements InternalLinkHandler {
private final AndroidPlatform platform;
private final HtmlParser htmlParser;
private DefaultRequestHandler requestHandler;
public HtmlView(Context context) {
super(context);
platform = new AndroidPlatform(context);
requestHandler = new DefaultRequestHandler(platform);
requestHandler.setInternalLinkHandler(this);
htmlParser = new HtmlParser(platform, requestHandler, null);
}
public void addInternalLinkPrefix(String s) {
requestHandler.addInternalLinkPrefix(s);
}
public void loadHtml(URI url) {
requestHandler.openInternalLink(url);
}
public void loadHtml(Reader reader, URI baseUrl) {
Element element = htmlParser.parse(reader, baseUrl);
removeAllViews();
if (element instanceof AbstractAndroidComponentElement) {
addView((AbstractAndroidComponentElement) element);
}
}
}
| package org.kobjects.nativehtml.android;
import android.content.Context;
import android.util.AttributeSet;
import android.widget.FrameLayout;
import org.kobjects.nativehtml.dom.Element;
import org.kobjects.nativehtml.io.DefaultRequestHandler;
import org.kobjects.nativehtml.io.HtmlParser;
import org.kobjects.nativehtml.io.InternalLinkHandler;
import java.io.Reader;
import java.net.URI;
public class HtmlView extends FrameLayout implements InternalLinkHandler {
private HtmlParser htmlParser;
private DefaultRequestHandler requestHandler;
public HtmlView(Context context) {
super(context);
init(context);
}
public HtmlView(Context context, AttributeSet attrs) {
super(context, attrs);
init(context);
}
public HtmlView(Context context, AttributeSet attrs, int defStyleAttr) {
super(context, attrs, defStyleAttr);
init(context);
}
private void init(Context context) {
AndroidPlatform platform = new AndroidPlatform(context);
requestHandler = new DefaultRequestHandler(platform);
requestHandler.setInternalLinkHandler(this);
htmlParser = new HtmlParser(platform, requestHandler, null);
}
public void addInternalLinkPrefix(String s) {
requestHandler.addInternalLinkPrefix(s);
}
public void loadHtml(URI url) {
requestHandler.openInternalLink(url);
}
public void loadHtml(Reader reader, URI baseUrl) {
Element element = htmlParser.parse(reader, baseUrl);
removeAllViews();
if (element instanceof AbstractAndroidComponentElement) {
addView((AbstractAndroidComponentElement) element);
}
}
}
| Add the required constructors for direct usage in layout xml files. | Add the required constructors for direct usage in layout xml files.
Usage: <org.kobjects.nativehtml.android.HtmlView>
| Java | apache-2.0 | stefanhaustein/nativehtml,stefanhaustein/nativehtml | java | ## Code Before:
package org.kobjects.nativehtml.android;
import android.content.Context;
import android.widget.FrameLayout;
import java.io.Reader;
import java.net.URI;
import org.kobjects.nativehtml.dom.Element;
import org.kobjects.nativehtml.io.DefaultRequestHandler;
import org.kobjects.nativehtml.io.HtmlParser;
import org.kobjects.nativehtml.io.InternalLinkHandler;
public class HtmlView extends FrameLayout implements InternalLinkHandler {
private final AndroidPlatform platform;
private final HtmlParser htmlParser;
private DefaultRequestHandler requestHandler;
public HtmlView(Context context) {
super(context);
platform = new AndroidPlatform(context);
requestHandler = new DefaultRequestHandler(platform);
requestHandler.setInternalLinkHandler(this);
htmlParser = new HtmlParser(platform, requestHandler, null);
}
public void addInternalLinkPrefix(String s) {
requestHandler.addInternalLinkPrefix(s);
}
public void loadHtml(URI url) {
requestHandler.openInternalLink(url);
}
public void loadHtml(Reader reader, URI baseUrl) {
Element element = htmlParser.parse(reader, baseUrl);
removeAllViews();
if (element instanceof AbstractAndroidComponentElement) {
addView((AbstractAndroidComponentElement) element);
}
}
}
## Instruction:
Add the required constructors for direct usage in layout xml files.
Usage: <org.kobjects.nativehtml.android.HtmlView>
## Code After:
package org.kobjects.nativehtml.android;
import android.content.Context;
import android.util.AttributeSet;
import android.widget.FrameLayout;
import org.kobjects.nativehtml.dom.Element;
import org.kobjects.nativehtml.io.DefaultRequestHandler;
import org.kobjects.nativehtml.io.HtmlParser;
import org.kobjects.nativehtml.io.InternalLinkHandler;
import java.io.Reader;
import java.net.URI;
public class HtmlView extends FrameLayout implements InternalLinkHandler {
private HtmlParser htmlParser;
private DefaultRequestHandler requestHandler;
public HtmlView(Context context) {
super(context);
init(context);
}
public HtmlView(Context context, AttributeSet attrs) {
super(context, attrs);
init(context);
}
public HtmlView(Context context, AttributeSet attrs, int defStyleAttr) {
super(context, attrs, defStyleAttr);
init(context);
}
private void init(Context context) {
AndroidPlatform platform = new AndroidPlatform(context);
requestHandler = new DefaultRequestHandler(platform);
requestHandler.setInternalLinkHandler(this);
htmlParser = new HtmlParser(platform, requestHandler, null);
}
public void addInternalLinkPrefix(String s) {
requestHandler.addInternalLinkPrefix(s);
}
public void loadHtml(URI url) {
requestHandler.openInternalLink(url);
}
public void loadHtml(Reader reader, URI baseUrl) {
Element element = htmlParser.parse(reader, baseUrl);
removeAllViews();
if (element instanceof AbstractAndroidComponentElement) {
addView((AbstractAndroidComponentElement) element);
}
}
}
| package org.kobjects.nativehtml.android;
import android.content.Context;
+ import android.util.AttributeSet;
import android.widget.FrameLayout;
+
- import java.io.Reader;
- import java.net.URI;
import org.kobjects.nativehtml.dom.Element;
import org.kobjects.nativehtml.io.DefaultRequestHandler;
import org.kobjects.nativehtml.io.HtmlParser;
import org.kobjects.nativehtml.io.InternalLinkHandler;
+ import java.io.Reader;
+ import java.net.URI;
+
public class HtmlView extends FrameLayout implements InternalLinkHandler {
- private final AndroidPlatform platform;
- private final HtmlParser htmlParser;
? ------
+ private HtmlParser htmlParser;
private DefaultRequestHandler requestHandler;
public HtmlView(Context context) {
super(context);
+ init(context);
+ }
+
+ public HtmlView(Context context, AttributeSet attrs) {
+ super(context, attrs);
+ init(context);
+ }
+
+ public HtmlView(Context context, AttributeSet attrs, int defStyleAttr) {
+ super(context, attrs, defStyleAttr);
+ init(context);
+ }
+
+ private void init(Context context) {
- platform = new AndroidPlatform(context);
+ AndroidPlatform platform = new AndroidPlatform(context);
? ++++++++++++++++
requestHandler = new DefaultRequestHandler(platform);
requestHandler.setInternalLinkHandler(this);
htmlParser = new HtmlParser(platform, requestHandler, null);
}
public void addInternalLinkPrefix(String s) {
requestHandler.addInternalLinkPrefix(s);
}
public void loadHtml(URI url) {
requestHandler.openInternalLink(url);
}
public void loadHtml(Reader reader, URI baseUrl) {
Element element = htmlParser.parse(reader, baseUrl);
removeAllViews();
if (element instanceof AbstractAndroidComponentElement) {
addView((AbstractAndroidComponentElement) element);
}
}
} | 26 | 0.604651 | 21 | 5 |
297c33b3f790fdbdec51891451703724ca1a2614 | core/client/components/gh-activating-list-item.js | core/client/components/gh-activating-list-item.js | var ActivatingListItem = Ember.Component.extend({
tagName: 'li',
classNameBindings: ['active'],
active: false
});
export default ActivatingListItem;
| var ActivatingListItem = Ember.Component.extend({
tagName: 'li',
classNameBindings: ['active'],
active: false,
unfocusLink: function () {
this.$('a').blur();
}.on('click')
});
export default ActivatingListItem;
| Fix active menu state on settings navigation | Fix active menu state on settings navigation
closes #4622
- unfocus ActivatingListItem link when clicked
| JavaScript | mit | panezhang/Ghost,beautyOfProgram/Ghost,jaswilli/Ghost,qdk0901/Ghost,rchrd2/Ghost,schematical/Ghost,icowan/Ghost,singular78/Ghost,NamedGod/Ghost,GroupxDev/javaPress,Bunk/Ghost,jomahoney/Ghost,SachaG/bjjbot-blog,bisoe/Ghost,kwangkim/Ghost,daihuaye/Ghost,ngosinafrica/SiteForNGOs,ASwitlyk/Ghost,prosenjit-itobuz/Ghost,BayPhillips/Ghost,riyadhalnur/Ghost,dYale/blog,Feitianyuan/Ghost,jomofrodo/ccb-ghost,Yarov/yarov,acburdine/Ghost,mohanambati/Ghost,delgermurun/Ghost,nneko/Ghost,mnitchie/Ghost,Kaenn/Ghost,STANAPO/Ghost,etdev/blog,dggr/Ghost-sr,RufusMbugua/TheoryOfACoder,Netazoic/bad-gateway,katiefenn/Ghost,rollokb/Ghost,janvt/Ghost,francisco-filho/Ghost,chris-yoon90/Ghost,allanjsx/Ghost,Japh/shortcoffee,kaychaks/kaushikc.org,woodyrew/Ghost,dqj/Ghost,ThorstenHans/Ghost,adam-paterson/blog,ghostchina/Ghost-zh,panezhang/Ghost,madole/diverse-learners,akveo/akveo-blog,flomotlik/Ghost,PepijnSenders/whatsontheotherside,tyrikio/Ghost,r1N0Xmk2/Ghost,leonli/ghost,dai-shi/Ghost,jin/Ghost,ManRueda/manrueda-blog,thinq4yourself/Unmistakable-Blog,Romdeau/Ghost,jamesslock/Ghost,dymx101/Ghost,cwonrails/Ghost,carlyledavis/Ghost,dgem/Ghost,schneidmaster/theventriloquist.us,hoxoa/Ghost,developer-prosenjit/Ghost,obsoleted/Ghost,imjerrybao/Ghost,greyhwndz/Ghost,jparyani/GhostSS,jomofrodo/ccb-ghost,davidmenger/nodejsfan,dbalders/Ghost,jeonghwan-kim/Ghost,liftup/ghost,BlueHatbRit/Ghost,mhhf/ghost-latex,manishchhabra/Ghost,phillipalexander/Ghost,BlueHatbRit/Ghost,jamesslock/Ghost,rmoorman/Ghost,patterncoder/patterncoder,tksander/Ghost,ineitzke/Ghost,rameshponnada/Ghost,darvelo/Ghost,Gargol/Ghost,PaulBGD/Ghost-Plus,ClarkGH/Ghost,lukekhamilton/Ghost,wangjun/Ghost,patterncoder/patterncoder,sebgie/Ghost,TribeMedia/Ghost,hnarayanan/narayanan.co,SkynetInc/steam,smedrano/Ghost,delgermurun/Ghost,dYale/blog,carlosmtx/Ghost,petersucks/blog,davidmenger/nodejsfan,ljhsai/Ghost,mayconxhh/Ghost,karmakaze/Ghost,Rovak/Ghost,lukw00/Ghost,adam-paterson/blog,uploadcare/uploadcare-ghost-demo,thinq4yo
urself/Unmistakable-Blog,tadityar/Ghost,sceltoas/Ghost,cysys/ghost-openshift,pollbox/ghostblog,tchapi/igneet-blog,diancloud/Ghost,jiangjian-zh/Ghost,duyetdev/islab,PDXIII/Ghost-FormMailer,edsadr/Ghost,pbevin/Ghost,cicorias/Ghost,denzelwamburu/denzel.xyz,edurangel/Ghost,IbrahimAmin/Ghost,UnbounDev/Ghost,mdbw/ghost,vishnuharidas/Ghost,cysys/ghost-openshift,daihuaye/Ghost,UsmanJ/Ghost,AnthonyCorrado/Ghost,jomofrodo/ccb-ghost,Brunation11/Ghost,barbastan/Ghost,bbmepic/Ghost,ckousik/Ghost,ballPointPenguin/Ghost,jiangjian-zh/Ghost,lukaszklis/Ghost,Loyalsoldier/Ghost,Trendy/Ghost,Azzurrio/Ghost,ivanoats/ivanstorck.com,YY030913/Ghost,Netazoic/bad-gateway,arvidsvensson/Ghost,denzelwamburu/denzel.xyz,janvt/Ghost,Feitianyuan/Ghost,olsio/Ghost,aschmoe/jesse-ghost-app,Brunation11/Ghost,akveo/akveo-blog,disordinary/Ghost,KnowLoading/Ghost,kortemy/Ghost,ManRueda/Ghost,pedroha/Ghost,ballPointPenguin/Ghost,e10/Ghost,ngosinafrica/SiteForNGOs,ghostchina/Ghost-zh,MadeOnMars/Ghost,Klaudit/Ghost,etanxing/Ghost,optikalefx/Ghost,davidenq/Ghost-Blog,manishchhabra/Ghost,beautyOfProgram/Ghost,jorgegilmoreira/ghost,vainglori0us/urban-fortnight,tuan/Ghost,alecho/Ghost,telco2011/Ghost,gabfssilva/Ghost,leonli/ghost,handcode7/Ghost,zackslash/Ghost,notno/Ghost,Kaenn/Ghost,virtuallyearthed/Ghost,djensen47/Ghost,JonathanZWhite/Ghost,situkangsayur/Ghost,lukaszklis/Ghost,blankmaker/Ghost,UsmanJ/Ghost,r1N0Xmk2/Ghost,metadevfoundation/Ghost,optikalefx/Ghost,greenboxindonesia/Ghost,uploadcare/uploadcare-ghost-demo,NikolaiIvanov/Ghost,morficus/Ghost,weareleka/blog,AnthonyCorrado/Ghost,DesenTao/Ghost,mdbw/ghost,claudiordgz/Ghost,rafaelstz/Ghost,rmoorman/Ghost,jaguerra/Ghost,virtuallyearthed/Ghost,Netazoic/bad-gateway,mnitchie/Ghost,aroneiermann/GhostJade,lf2941270/Ghost,jin/Ghost,nmukh/Ghost,llv22/Ghost,neynah/GhostSS,greenboxindonesia/Ghost,cicorias/Ghost,schematical/Ghost,axross/ghost,anijap/PhotoGhost,jaguerra/Ghost,letsjustfixit/Ghost,ryanbrunner/crafters,icowan/Ghost,sajmoon/Ghost,LeandroNascimento/Ghos
t,cwonrails/Ghost,mayconxhh/Ghost,YY030913/Ghost,Bunk/Ghost,achimos/ghost_as,syaiful6/Ghost,ladislas/ghost,davidenq/Ghost-Blog,davidenq/Ghost-Blog,andrewconnell/Ghost,dbalders/Ghost,wemakeweb/Ghost,rizkyario/Ghost,ananthhh/Ghost,hilerchyn/Ghost,JonathanZWhite/Ghost,VillainyStudios/Ghost,fredeerock/atlabghost,JulienBrks/Ghost,pollbox/ghostblog,camilodelvasto/herokughost,imjerrybao/Ghost,tidyui/Ghost,diogogmt/Ghost,morficus/Ghost,mohanambati/Ghost,Dnlyc/Ghost,TribeMedia/Ghost,arvidsvensson/Ghost,zeropaper/Ghost,PDXIII/Ghost-FormMailer,rito/Ghost,smedrano/Ghost,ErisDS/Ghost,Sebastian1011/Ghost,sangcu/Ghost,lanffy/Ghost,letsjustfixit/Ghost,makapen/Ghost,katrotz/blog.katrotz.space,syaiful6/Ghost,pathayes/FoodBlog,ignasbernotas/nullifer,neynah/GhostSS,pensierinmusica/Ghost,johnnymitch/Ghost,jiachenning/Ghost,acburdine/Ghost,smaty1/Ghost,hnarayanan/narayanan.co,trunk-studio/Ghost,yundt/seisenpenji,javorszky/Ghost,dylanchernick/ghostblog,tadityar/Ghost,jomahoney/Ghost,mtvillwock/Ghost,JohnONolan/Ghost,Remchi/Ghost,sergeylukin/Ghost,rito/Ghost,tandrewnichols/ghost,karmakaze/Ghost,vainglori0us/urban-fortnight,exsodus3249/Ghost,JonSmith/Ghost,ashishapy/ghostpy,mttschltz/ghostblog,cobbspur/Ghost,hnq90/Ghost,diancloud/Ghost,NovaDevelopGroup/Academy,Japh/shortcoffee,lowkeyfred/Ghost,ignasbernotas/nullifer,no1lov3sme/Ghost,FredericBernardo/Ghost,Dnlyc/Ghost,JohnONolan/Ghost,Xibao-Lv/Ghost,r14r/fork_nodejs_ghost,Kikobeats/Ghost,atandon/Ghost,AlexKVal/Ghost,theonlypat/Ghost,velimir0xff/Ghost,lukw00/Ghost,ygbhf/Ghost,sfpgmr/Ghost,bisoe/Ghost,krahman/Ghost,FredericBernardo/Ghost,ryansukale/ux.ryansukale.com,vloom/blog,mttschltz/ghostblog,Kaenn/Ghost,Trendy/Ghost,novaugust/Ghost,rchrd2/Ghost,yanntech/Ghost,Yarov/yarov,MadeOnMars/Ghost,javimolla/Ghost,kwangkim/Ghost,alecho/Ghost,katrotz/blog.katrotz.space,UnbounDev/Ghost,cncodog/Ghost-zh-codog,ITJesse/Ghost-zh,darvelo/Ghost,rameshponnada/Ghost,dbalders/Ghost,blankmaker/Ghost,dylanchernick/ghostblog,PeterCxy/Ghost,melissaroman/ghost-blog
,neynah/GhostSS,LeandroNascimento/Ghost,wemakeweb/Ghost,psychobunny/Ghost,sunh3/Ghost,xiongjungit/Ghost,NovaDevelopGroup/Academy,ygbhf/Ghost,zumobi/Ghost,ManRueda/manrueda-blog,sceltoas/Ghost,Japh/Ghost,rouanw/Ghost,bosung90/Ghost,ladislas/ghost,cwonrails/Ghost,tyrikio/Ghost,nneko/Ghost,etdev/blog,tmp-reg/Ghost,JohnONolan/Ghost,telco2011/Ghost,makapen/Ghost,rizkyario/Ghost,cqricky/Ghost,PepijnSenders/whatsontheotherside,theonlypat/Ghost,NamedGod/Ghost,jorgegilmoreira/ghost,jiachenning/Ghost,mtvillwock/Ghost,allanjsx/Ghost,thomasalrin/Ghost,AlexKVal/Ghost,claudiordgz/Ghost,nmukh/Ghost,sifatsultan/js-ghost,ddeveloperr/Ghost,jacostag/Ghost,johngeorgewright/blog.j-g-w.info,GroupxDev/javaPress,augbog/Ghost,PaulBGD/Ghost-Plus,mhhf/ghost-latex,edurangel/Ghost,schneidmaster/theventriloquist.us,skmezanul/Ghost,francisco-filho/Ghost,johnnymitch/Ghost,memezilla/Ghost,shannonshsu/Ghost,vainglori0us/urban-fortnight,yanntech/Ghost,bosung90/Ghost,Remchi/Ghost,sifatsultan/js-ghost,greyhwndz/Ghost,jparyani/GhostSS,obsoleted/Ghost,stridespace/Ghost,kevinansfield/Ghost,r14r/fork_nodejs_ghost,IbrahimAmin/Ghost,bitjson/Ghost,olsio/Ghost,devleague/uber-hackathon,camilodelvasto/localghost,gabfssilva/Ghost,telco2011/Ghost,tksander/Ghost,ryansukale/ux.ryansukale.com,omaracrystal/Ghost,benstoltz/Ghost,leninhasda/Ghost,thehogfather/Ghost,bbmepic/Ghost,netputer/Ghost,klinker-apps/ghost,cncodog/Ghost-zh-codog,GroupxDev/javaPress,cncodog/Ghost-zh-codog,melissaroman/ghost-blog,bastianbin/Ghost,jacostag/Ghost,zhiyishou/Ghost,tidyui/Ghost,lowkeyfred/Ghost,aroneiermann/GhostJade,hnq90/Ghost,cysys/ghost-openshift,dqj/Ghost,camilodelvasto/localghost,chris-yoon90/Ghost,wallmarkets/Ghost,axross/ghost,yangli1990/Ghost,bsansouci/Ghost,epicmiller/pages,sebgie/Ghost,shrimpy/Ghost,hyokosdeveloper/Ghost,pbevin/Ghost,dggr/Ghost-sr,STANAPO/Ghost,sfpgmr/Ghost,Coding-House/Ghost,etanxing/Ghost,sergeylukin/Ghost,Gargol/Ghost,atandon/Ghost,letsjustfixit/Ghost,netputer/Ghost,DesenTao/Ghost,mlabieniec/ghost-env,ineit
zke/Ghost,Shauky/Ghost,Xibao-Lv/Ghost,Kikobeats/Ghost,influitive/crafters,metadevfoundation/Ghost,singular78/Ghost,lethalbrains/Ghost,smaty1/Ghost,sangcu/Ghost,mlabieniec/ghost-env,Netazoic/bad-gateway,hoxoa/Ghost,Loyalsoldier/Ghost,skleung/blog,singular78/Ghost,trepafi/ghost-base,laispace/laiblog,wallmarkets/Ghost,Jai-Chaudhary/Ghost,ManRueda/manrueda-blog,TryGhost/Ghost,wangjun/Ghost,GarrethDottin/Habits-Design,lf2941270/Ghost,GarrethDottin/Habits-Design,hilerchyn/Ghost,andrewconnell/Ghost,julianromera/Ghost,e10/Ghost,wspandihai/Ghost,julianromera/Ghost,ManRueda/Ghost,devleague/uber-hackathon,NovaDevelopGroup/Academy,novaugust/Ghost,woodyrew/Ghost,lanffy/Ghost,Romdeau/Ghost,TryGhost/Ghost,Kikobeats/Ghost,kortemy/Ghost,kolorahl/Ghost,Sebastian1011/Ghost,lukekhamilton/Ghost,allanjsx/Ghost,load11/ghost,jgillich/Ghost,Azzurrio/Ghost,wspandihai/Ghost,v3rt1go/Ghost,trunk-studio/Ghost,mattchupp/blog,exsodus3249/Ghost,ckousik/Ghost,klinker-apps/ghost,epicmiller/pages,Smile42RU/Ghost,cqricky/Ghost,acburdine/Ghost,sunh3/Ghost,mattchupp/blog,JulienBrks/Ghost,no1lov3sme/Ghost,disordinary/Ghost,xiongjungit/Ghost,gleneivey/Ghost,liftup/ghost,djensen47/Ghost,tandrewnichols/ghost,kortemy/Ghost,tanbo800/Ghost,leninhasda/Ghost,PeterCxy/Ghost,augbog/Ghost,Jai-Chaudhary/Ghost,gcamana/Ghost,kevinansfield/Ghost,yangli1990/Ghost,Alxandr/Blog,ryanbrunner/crafters,skleung/blog,dggr/Ghost-sr,ivantedja/ghost,gcamana/Ghost,RufusMbugua/TheoryOfACoder,Elektro1776/javaPress,vloom/blog,phillipalexander/Ghost,daimaqiao/Ghost-Bridge,skmezanul/Ghost,developer-prosenjit/Ghost,daimaqiao/Ghost-Bridge,pedroha/Ghost,uniqname/everydaydelicious,ErisDS/Ghost,shannonshsu/Ghost,Elektro1776/javaPress,dai-shi/Ghost,petersucks/blog,lcamacho/Ghost,ashishapy/ghostpy,jparyani/GhostSS,madole/diverse-learners,rafaelstz/Ghost,rizkyario/Ghost,thehogfather/Ghost,stridespace/Ghost,ljhsai/Ghost,ClarkGH/Ghost,handcode7/Ghost,codeincarnate/Ghost,bsansouci/Ghost,notno/Ghost,omaracrystal/Ghost,riyadhalnur/Ghost,hyokosdevelop
er/Ghost,barbastan/Ghost,camilodelvasto/herokughost,edsadr/Ghost,ThorstenHans/Ghost,ManRueda/Ghost,psychobunny/Ghost,sebgie/Ghost,Rovak/Ghost,sajmoon/Ghost,VillainyStudios/Ghost,dgem/Ghost,k2byew/Ghost,ananthhh/Ghost,jeonghwan-kim/Ghost,ASwitlyk/Ghost,fredeerock/atlabghost,duyetdev/islab,Alxandr/Blog,ivanoats/ivanstorck.com,veyo-care/Ghost,lethalbrains/Ghost,JonSmith/Ghost,kolorahl/Ghost,bitjson/Ghost,v3rt1go/Ghost,jorgegilmoreira/ghost,carlosmtx/Ghost,laispace/laiblog,laispace/laiblog,SkynetInc/steam,benstoltz/Ghost,Klaudit/Ghost,ddeveloperr/Ghost,yundt/seisenpenji,patrickdbakke/ghost-spa,achimos/ghost_as,k2byew/Ghost,flpms/ghost-ad,javimolla/Ghost,gleneivey/Ghost,zeropaper/Ghost,ITJesse/Ghost-zh,kevinansfield/Ghost,Coding-House/Ghost,kaychaks/kaushikc.org,praveenscience/Ghost,influitive/crafters,zhiyishou/Ghost,leonli/ghost,bastianbin/Ghost,daimaqiao/Ghost-Bridge,llv22/Ghost,tmp-reg/Ghost,shrimpy/Ghost,praveenscience/Ghost,johngeorgewright/blog.j-g-w.info,situkangsayur/Ghost,mohanambati/Ghost,KnowLoading/Ghost,NikolaiIvanov/Ghost,zumobi/Ghost,TryGhost/Ghost,devleague/uber-hackathon,Japh/Ghost,dymx101/Ghost,flpms/ghost-ad,anijap/PhotoGhost,AileenCGN/Ghost,rollokb/Ghost,pensierinmusica/Ghost,weareleka/blog,velimir0xff/Ghost,qdk0901/Ghost,memezilla/Ghost,Elektro1776/javaPress,mlabieniec/ghost-env,zackslash/Ghost,prosenjit-itobuz/Ghost,carlyledavis/Ghost,ErisDS/Ghost,javorszky/Ghost,aschmoe/jesse-ghost-app,diogogmt/Ghost,thomasalrin/Ghost,tanbo800/Ghost,jgillich/Ghost,rouanw/Ghost,Alxandr/Blog,chevex/undoctrinate,pathayes/FoodBlog,load11/ghost,SachaG/bjjbot-blog,jaswilli/Ghost,codeincarnate/Ghost,Smile42RU/Ghost,novaugust/Ghost,BayPhillips/Ghost,veyo-care/Ghost,stridespace/Ghost,flomotlik/Ghost | javascript | ## Code Before:
var ActivatingListItem = Ember.Component.extend({
tagName: 'li',
classNameBindings: ['active'],
active: false
});
export default ActivatingListItem;
## Instruction:
Fix active menu state on settings navigation
closes #4622
- unfocus ActivatingListItem link when clicked
## Code After:
var ActivatingListItem = Ember.Component.extend({
tagName: 'li',
classNameBindings: ['active'],
active: false,
unfocusLink: function () {
this.$('a').blur();
}.on('click')
});
export default ActivatingListItem;
| var ActivatingListItem = Ember.Component.extend({
tagName: 'li',
classNameBindings: ['active'],
- active: false
+ active: false,
? +
+
+ unfocusLink: function () {
+ this.$('a').blur();
+ }.on('click')
});
export default ActivatingListItem; | 6 | 0.857143 | 5 | 1 |
f6e42a3d6798ca765256107d1ca906ec8104eda7 | plugins/suppliers/lib/suppliers_plugin/terms_helper.rb | plugins/suppliers/lib/suppliers_plugin/terms_helper.rb | module SuppliersPlugin::TermsHelper
Terms = [:consumer, :supplier]
TermsVariations = [:singular, :plural]
TermsTransformations = [:capitalize]
TermsKeys = Terms.map do |term|
TermsVariations.map{ |variation| [term, variation].join('.') }
end.flatten
protected
def translated_terms keys = TermsKeys, transformations = TermsTransformations
@translated_terms ||= {}
return @translated_terms unless @translated_terms.blank?
@terms_context ||= 'suppliers_plugin'
keys.each do |key|
translation = I18n.t "#{@terms_context}.terms.#{key}"
@translated_terms["terms.#{key}"] = translation
transformations.map do |transformation|
@translated_terms["terms.#{key}.#{transformation}"] = translation.send transformation
end
end
@translated_terms
end
def t key, options = {}
I18n.t(key, options) % translated_terms
end
end
| module SuppliersPlugin::TermsHelper
Terms = [:consumer, :supplier]
# '.' ins't supported by the % format function (altought it works on some newer systems)
TermsSeparator = '_'
TermsVariations = [:singular, :plural]
TermsTransformations = [:capitalize]
TermsKeys = Terms.map do |term|
TermsVariations.map{ |variation| [term, variation].join('.') }
end.flatten
protected
def sub_separator str
str.gsub '.', TermsSeparator
end
def sub_separator_items str
str.gsub!(/\%\{[^\}]*\}/){ |x| sub_separator x }
str
end
def translated_terms keys = TermsKeys, transformations = TermsTransformations, sep = TermsSeparator
@translated_terms ||= HashWithIndifferentAccess.new
return @translated_terms unless @translated_terms.blank?
@terms_context ||= 'suppliers_plugin'
keys.each do |key|
translation = I18n.t "#{@terms_context}.terms.#{key}"
new_key = sub_separator key
@translated_terms["terms#{sep}#{new_key}"] = translation
transformations.map do |transformation|
@translated_terms["terms#{sep}#{new_key}#{sep}#{transformation}"] = translation.send transformation
end
end
@translated_terms
end
def t key, options = {}
translation = I18n.t key, options
sub_separator_items translation
translation % translated_terms
end
end
| Fix use of format on older systems | Fix use of format on older systems
| Ruby | agpl-3.0 | coletivoEITA/noosfero-ecosol,samasti/noosfero,samasti/noosfero,blogoosfero/noosfero,EcoAlternative/noosfero-ecosol,samasti/noosfero,coletivoEITA/noosfero-ecosol,CIRANDAS/noosfero-ecosol,EcoAlternative/noosfero-ecosol,blogoosfero/noosfero,blogoosfero/noosfero,samasti/noosfero,CIRANDAS/noosfero-ecosol,CIRANDAS/noosfero-ecosol,coletivoEITA/noosfero-ecosol,CIRANDAS/noosfero-ecosol,CIRANDAS/noosfero-ecosol,blogoosfero/noosfero,EcoAlternative/noosfero-ecosol,blogoosfero/noosfero,coletivoEITA/noosfero-ecosol,samasti/noosfero,blogoosfero/noosfero,coletivoEITA/noosfero-ecosol,samasti/noosfero,blogoosfero/noosfero,coletivoEITA/noosfero-ecosol,EcoAlternative/noosfero-ecosol,EcoAlternative/noosfero-ecosol,EcoAlternative/noosfero-ecosol,EcoAlternative/noosfero-ecosol | ruby | ## Code Before:
module SuppliersPlugin::TermsHelper
Terms = [:consumer, :supplier]
TermsVariations = [:singular, :plural]
TermsTransformations = [:capitalize]
TermsKeys = Terms.map do |term|
TermsVariations.map{ |variation| [term, variation].join('.') }
end.flatten
protected
def translated_terms keys = TermsKeys, transformations = TermsTransformations
@translated_terms ||= {}
return @translated_terms unless @translated_terms.blank?
@terms_context ||= 'suppliers_plugin'
keys.each do |key|
translation = I18n.t "#{@terms_context}.terms.#{key}"
@translated_terms["terms.#{key}"] = translation
transformations.map do |transformation|
@translated_terms["terms.#{key}.#{transformation}"] = translation.send transformation
end
end
@translated_terms
end
def t key, options = {}
I18n.t(key, options) % translated_terms
end
end
## Instruction:
Fix use of format on older systems
## Code After:
module SuppliersPlugin::TermsHelper
Terms = [:consumer, :supplier]
# '.' ins't supported by the % format function (altought it works on some newer systems)
TermsSeparator = '_'
TermsVariations = [:singular, :plural]
TermsTransformations = [:capitalize]
TermsKeys = Terms.map do |term|
TermsVariations.map{ |variation| [term, variation].join('.') }
end.flatten
protected
def sub_separator str
str.gsub '.', TermsSeparator
end
def sub_separator_items str
str.gsub!(/\%\{[^\}]*\}/){ |x| sub_separator x }
str
end
def translated_terms keys = TermsKeys, transformations = TermsTransformations, sep = TermsSeparator
@translated_terms ||= HashWithIndifferentAccess.new
return @translated_terms unless @translated_terms.blank?
@terms_context ||= 'suppliers_plugin'
keys.each do |key|
translation = I18n.t "#{@terms_context}.terms.#{key}"
new_key = sub_separator key
@translated_terms["terms#{sep}#{new_key}"] = translation
transformations.map do |transformation|
@translated_terms["terms#{sep}#{new_key}#{sep}#{transformation}"] = translation.send transformation
end
end
@translated_terms
end
def t key, options = {}
translation = I18n.t key, options
sub_separator_items translation
translation % translated_terms
end
end
| module SuppliersPlugin::TermsHelper
Terms = [:consumer, :supplier]
+ # '.' ins't supported by the % format function (altought it works on some newer systems)
+ TermsSeparator = '_'
TermsVariations = [:singular, :plural]
TermsTransformations = [:capitalize]
TermsKeys = Terms.map do |term|
TermsVariations.map{ |variation| [term, variation].join('.') }
end.flatten
protected
+ def sub_separator str
+ str.gsub '.', TermsSeparator
+ end
+
+ def sub_separator_items str
+ str.gsub!(/\%\{[^\}]*\}/){ |x| sub_separator x }
+ str
+ end
+
- def translated_terms keys = TermsKeys, transformations = TermsTransformations
+ def translated_terms keys = TermsKeys, transformations = TermsTransformations, sep = TermsSeparator
? ++++++++++++++++++++++
- @translated_terms ||= {}
+ @translated_terms ||= HashWithIndifferentAccess.new
return @translated_terms unless @translated_terms.blank?
@terms_context ||= 'suppliers_plugin'
keys.each do |key|
translation = I18n.t "#{@terms_context}.terms.#{key}"
+ new_key = sub_separator key
- @translated_terms["terms.#{key}"] = translation
? -
+ @translated_terms["terms#{sep}#{new_key}"] = translation
? ++++++++++
transformations.map do |transformation|
- @translated_terms["terms.#{key}.#{transformation}"] = translation.send transformation
? - ^
+ @translated_terms["terms#{sep}#{new_key}#{sep}#{transformation}"] = translation.send transformation
? ++++++++++ ^^^^^^
end
end
@translated_terms
end
def t key, options = {}
- I18n.t(key, options) % translated_terms
+ translation = I18n.t key, options
+ sub_separator_items translation
+ translation % translated_terms
end
end | 24 | 0.75 | 19 | 5 |
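The fix in the row above works around named-reference keys that contain `.`, which the added comment says the `%` format function rejects on older systems. The same pitfall exists in other formatting mini-languages; the following is an illustrative Python sketch of the general idea, not the Ruby code from the commit (Python's `str.format` treats `.` in a field name as attribute access):

```python
# In format mini-languages '.' is often an operator, not a key character:
# str.format parses {terms.supplier} as "attribute supplier of field terms",
# so a flat key containing a dot cannot be referenced directly.
data = {"terms.supplier": "supplier"}
try:
    "{terms.supplier}".format(**data)
    dot_key_worked = True
except KeyError:  # 'terms' is looked up first and is missing
    dot_key_worked = False
assert not dot_key_worked

# Substituting '_' for '.' (as the commit's TermsSeparator does) yields a
# plain named key that every interpreter can resolve.
safe = {k.replace(".", "_"): v for k, v in data.items()}
out = "{terms_supplier}".format(**safe)
assert out == "supplier"
```

This mirrors why the helper rewrites both the translation keys and the `%{...}` references in templates before interpolating.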
a9defbc270859ad311ef0a0adf7fcf65ae3da204 | playbooks/roles/osx/tasks/homebrew.yml | playbooks/roles/osx/tasks/homebrew.yml | ---
- name: Update Homebrew
homebrew: update_homebrew=yes
- name: Install libraries with Homebrew
homebrew: name={{ item }} state=latest
with_items:
- ack
- curl
- fasd
- git
- hub
- nmap
- node
- python
- python3
- readline
- rename
- ssh-copy-id
- vim
- watch
- wget
- zsh
- httpie
- tree
- openssl
- cask
- coreutils
- findutils
- mosh
- name: Tap into the Homebrew Cask Beta apps
homebrew_tap: tap=caskroom/versions state=present
- name: Install libraries with Homebrew Cask
homebrew_cask: name={{ item }} state=present
with_items:
- dropbox
- evernote
- google-chrome
- iterm2
- sublime-text3
- hipchat
- viber
- github
- spotify
- macpass
- virtualbox
- trailer
- spotifree
- skitch
- flux
- sequel-pro
- vagrant
- xmarks-safari
- vagrant-manager
- utorrent
- vlc
- menumeters
- clipmenu
- bartender
- name: Cleanup Brew packages
command: brew cleanup
- name: Cleanup Brew Cask packages
command: brew cask cleanup
| ---
- name: Update Homebrew
homebrew: update_homebrew=yes
- name: Install libraries with Homebrew
homebrew: name={{ item }} state=latest
with_items:
- ack
- curl
- fasd
- git
- hub
- nmap
- node
- python
- python3
- readline
- rename
- ssh-copy-id
- vim
- watch
- wget
- zsh
- httpie
- tree
- openssl
- cask
- coreutils
- findutils
- mosh
- brew-cask
- name: Tap into the Homebrew Cask Beta apps
homebrew_tap: tap=caskroom/versions state=present
- name: Install libraries with Homebrew Cask
homebrew_cask: name={{ item }} state=present
with_items:
- dropbox
- evernote
- google-chrome
- iterm2
- sublime-text3
- hipchat
- viber
- github
- spotify
- macpass
- virtualbox
- trailer
- spotifree
- skitch
- flux
- sequel-pro
- vagrant
- xmarks-safari
- vagrant-manager
- utorrent
- vlc
- menumeters
- clipmenu
- bartender
- name: Cleanup Brew packages
command: brew cleanup
- name: Cleanup Brew Cask packages
command: brew cask cleanup
| Add brew-cask to list of updates | Add brew-cask to list of updates
| YAML | mit | smathson/dotfiles | yaml | ## Code Before:
---
- name: Update Homebrew
homebrew: update_homebrew=yes
- name: Install libraries with Homebrew
homebrew: name={{ item }} state=latest
with_items:
- ack
- curl
- fasd
- git
- hub
- nmap
- node
- python
- python3
- readline
- rename
- ssh-copy-id
- vim
- watch
- wget
- zsh
- httpie
- tree
- openssl
- cask
- coreutils
- findutils
- mosh
- name: Tap into the Homebrew Cask Beta apps
homebrew_tap: tap=caskroom/versions state=present
- name: Install libraries with Homebrew Cask
homebrew_cask: name={{ item }} state=present
with_items:
- dropbox
- evernote
- google-chrome
- iterm2
- sublime-text3
- hipchat
- viber
- github
- spotify
- macpass
- virtualbox
- trailer
- spotifree
- skitch
- flux
- sequel-pro
- vagrant
- xmarks-safari
- vagrant-manager
- utorrent
- vlc
- menumeters
- clipmenu
- bartender
- name: Cleanup Brew packages
command: brew cleanup
- name: Cleanup Brew Cask packages
command: brew cask cleanup
## Instruction:
Add brew-cask to list of updates
## Code After:
---
- name: Update Homebrew
homebrew: update_homebrew=yes
- name: Install libraries with Homebrew
homebrew: name={{ item }} state=latest
with_items:
- ack
- curl
- fasd
- git
- hub
- nmap
- node
- python
- python3
- readline
- rename
- ssh-copy-id
- vim
- watch
- wget
- zsh
- httpie
- tree
- openssl
- cask
- coreutils
- findutils
- mosh
- brew-cask
- name: Tap into the Homebrew Cask Beta apps
homebrew_tap: tap=caskroom/versions state=present
- name: Install libraries with Homebrew Cask
homebrew_cask: name={{ item }} state=present
with_items:
- dropbox
- evernote
- google-chrome
- iterm2
- sublime-text3
- hipchat
- viber
- github
- spotify
- macpass
- virtualbox
- trailer
- spotifree
- skitch
- flux
- sequel-pro
- vagrant
- xmarks-safari
- vagrant-manager
- utorrent
- vlc
- menumeters
- clipmenu
- bartender
- name: Cleanup Brew packages
command: brew cleanup
- name: Cleanup Brew Cask packages
command: brew cask cleanup
| ---
- name: Update Homebrew
homebrew: update_homebrew=yes
- name: Install libraries with Homebrew
homebrew: name={{ item }} state=latest
with_items:
- ack
- curl
- fasd
- git
- hub
- nmap
- node
- python
- python3
- readline
- rename
- ssh-copy-id
- vim
- watch
- wget
- zsh
- httpie
- tree
- openssl
- cask
- coreutils
- findutils
- mosh
+ - brew-cask
- name: Tap into the Homebrew Cask Beta apps
homebrew_tap: tap=caskroom/versions state=present
- name: Install libraries with Homebrew Cask
homebrew_cask: name={{ item }} state=present
with_items:
- dropbox
- evernote
- google-chrome
- iterm2
- sublime-text3
- hipchat
- viber
- github
- spotify
- macpass
- virtualbox
- trailer
- spotifree
- skitch
- flux
- sequel-pro
- vagrant
- xmarks-safari
- vagrant-manager
- utorrent
- vlc
- menumeters
- clipmenu
- bartender
- name: Cleanup Brew packages
command: brew cleanup
- name: Cleanup Brew Cask packages
command: brew cask cleanup | 1 | 0.014925 | 1 | 0 |
f6435094e02b76bc43d4bd2def79121140ace72f | spec/ccspec_spec.cc | spec/ccspec_spec.cc |
using std::cout;
using ccspec::core::formatters::DocumentationFormatter;
using ccspec::core::ExampleGroup;
using ccspec::core::Reporter;
namespace spec {
namespace matchers {
extern ExampleGroup* eq_spec;
} // namespace matchers
} // namespace spec
int main() {
using namespace spec;
DocumentationFormatter formatter(cout);
Reporter reporter(&formatter);
bool succeeded = matchers::eq_spec->run(reporter);
delete matchers::eq_spec;
return !succeeded;
}
|
using std::cout;
using ccspec::core::formatters::DocumentationFormatter;
using ccspec::core::ExampleGroup;
using ccspec::core::Reporter;
namespace spec {
namespace matchers {
extern ExampleGroup* eq_spec;
} // namespace matchers
} // namespace spec
int main() {
DocumentationFormatter formatter(cout);
Reporter reporter(&formatter);
bool succeeded = spec::matchers::eq_spec->run(reporter);
delete spec::matchers::eq_spec;
return !succeeded;
}
| Remove using namespace to silence linter | Remove using namespace to silence linter
| C++ | mit | michaelachrisco/ccspec,michaelachrisco/ccspec,zhangsu/ccspec,tempbottle/ccspec,michaelachrisco/ccspec,zhangsu/ccspec,tempbottle/ccspec,tempbottle/ccspec,zhangsu/ccspec | c++ | ## Code Before:
using std::cout;
using ccspec::core::formatters::DocumentationFormatter;
using ccspec::core::ExampleGroup;
using ccspec::core::Reporter;
namespace spec {
namespace matchers {
extern ExampleGroup* eq_spec;
} // namespace matchers
} // namespace spec
int main() {
using namespace spec;
DocumentationFormatter formatter(cout);
Reporter reporter(&formatter);
bool succeeded = matchers::eq_spec->run(reporter);
delete matchers::eq_spec;
return !succeeded;
}
## Instruction:
Remove using namespace to silence linter
## Code After:
using std::cout;
using ccspec::core::formatters::DocumentationFormatter;
using ccspec::core::ExampleGroup;
using ccspec::core::Reporter;
namespace spec {
namespace matchers {
extern ExampleGroup* eq_spec;
} // namespace matchers
} // namespace spec
int main() {
DocumentationFormatter formatter(cout);
Reporter reporter(&formatter);
bool succeeded = spec::matchers::eq_spec->run(reporter);
delete spec::matchers::eq_spec;
return !succeeded;
}
|
using std::cout;
using ccspec::core::formatters::DocumentationFormatter;
using ccspec::core::ExampleGroup;
using ccspec::core::Reporter;
namespace spec {
namespace matchers {
extern ExampleGroup* eq_spec;
} // namespace matchers
} // namespace spec
int main() {
- using namespace spec;
-
DocumentationFormatter formatter(cout);
Reporter reporter(&formatter);
- bool succeeded = matchers::eq_spec->run(reporter);
+ bool succeeded = spec::matchers::eq_spec->run(reporter);
? ++++++
- delete matchers::eq_spec;
+ delete spec::matchers::eq_spec;
? ++++++
return !succeeded;
} | 6 | 0.230769 | 2 | 4 |
16a30b004f4de3911a81e9ca335a5e8db6b68a60 | site/assets/scss/_callouts.scss | site/assets/scss/_callouts.scss | //
// Callouts
//
.bd-callout {
padding: 1.25rem;
margin-top: 1.25rem;
margin-bottom: 1.25rem;
border: 1px solid $gray-200;
border-left-width: .25rem;
@include border-radius();
h4 {
margin-bottom: .25rem;
}
p:last-child {
margin-bottom: 0;
}
code {
@include border-radius();
}
+ .bd-callout {
margin-top: -.25rem;
}
}
// Variations
@mixin bs-callout-variant($color) {
border-left-color: $color;
h4 { color: $color; }
}
.bd-callout-info { @include bs-callout-variant($bd-info); }
.bd-callout-warning { @include bs-callout-variant($bd-warning); }
.bd-callout-danger { @include bs-callout-variant($bd-danger); }
| //
// Callouts
//
.bd-callout {
padding: 1.25rem;
margin-top: 1.25rem;
margin-bottom: 1.25rem;
border: 1px solid $gray-200;
border-left-width: .25rem;
@include border-radius();
h4 {
margin-bottom: .25rem;
}
p:last-child {
margin-bottom: 0;
}
code {
@include border-radius();
}
+ .bd-callout {
margin-top: -.25rem;
}
}
// Variations
.bd-callout-info {
border-left-color: $bd-info;
}
.bd-callout-warning {
border-left-color: $bd-warning;
}
.bd-callout-danger {
border-left-color: $bd-danger;
}
| Drop h4 color customization in callouts | Drop h4 color customization in callouts
BS5 commits: 7134f6aa38706398a86311547b8479ec46a094f5
| SCSS | mit | todc/todc-bootstrap,todc/todc-bootstrap,todc/todc-bootstrap | scss | ## Code Before:
//
// Callouts
//
.bd-callout {
padding: 1.25rem;
margin-top: 1.25rem;
margin-bottom: 1.25rem;
border: 1px solid $gray-200;
border-left-width: .25rem;
@include border-radius();
h4 {
margin-bottom: .25rem;
}
p:last-child {
margin-bottom: 0;
}
code {
@include border-radius();
}
+ .bd-callout {
margin-top: -.25rem;
}
}
// Variations
@mixin bs-callout-variant($color) {
border-left-color: $color;
h4 { color: $color; }
}
.bd-callout-info { @include bs-callout-variant($bd-info); }
.bd-callout-warning { @include bs-callout-variant($bd-warning); }
.bd-callout-danger { @include bs-callout-variant($bd-danger); }
## Instruction:
Drop h4 color customization in callouts
BS5 commits: 7134f6aa38706398a86311547b8479ec46a094f5
## Code After:
//
// Callouts
//
.bd-callout {
padding: 1.25rem;
margin-top: 1.25rem;
margin-bottom: 1.25rem;
border: 1px solid $gray-200;
border-left-width: .25rem;
@include border-radius();
h4 {
margin-bottom: .25rem;
}
p:last-child {
margin-bottom: 0;
}
code {
@include border-radius();
}
+ .bd-callout {
margin-top: -.25rem;
}
}
// Variations
.bd-callout-info {
border-left-color: $bd-info;
}
.bd-callout-warning {
border-left-color: $bd-warning;
}
.bd-callout-danger {
border-left-color: $bd-danger;
}
| //
// Callouts
//
.bd-callout {
padding: 1.25rem;
margin-top: 1.25rem;
margin-bottom: 1.25rem;
border: 1px solid $gray-200;
border-left-width: .25rem;
@include border-radius();
h4 {
margin-bottom: .25rem;
}
p:last-child {
margin-bottom: 0;
}
code {
@include border-radius();
}
+ .bd-callout {
margin-top: -.25rem;
}
}
// Variations
- @mixin bs-callout-variant($color) {
+ .bd-callout-info {
- border-left-color: $color;
? ^ ---
+ border-left-color: $bd-info;
? ^^^^^^
-
- h4 { color: $color; }
}
- .bd-callout-info { @include bs-callout-variant($bd-info); }
- .bd-callout-warning { @include bs-callout-variant($bd-warning); }
- .bd-callout-danger { @include bs-callout-variant($bd-danger); }
+ .bd-callout-warning {
+ border-left-color: $bd-warning;
+ }
+
+ .bd-callout-danger {
+ border-left-color: $bd-danger;
+ } | 16 | 0.410256 | 9 | 7 |
dd5fd321bb1c27cc46c929d1dd1c44a88420b1f9 | config/initializers/doorkeeper.rb | config/initializers/doorkeeper.rb | Doorkeeper.configure do
orm :active_record
enable_application_owner :confirmation => true
default_scopes :public
optional_scopes :user, :project, :group, :collection
realm "Panoptes"
resource_owner_authenticator do
current_user || warden.authenticate!(scope: :user)
end
resource_owner_from_credentials do |routes|
if u = User.find_for_database_authentication(login: params[:login])
valid_non_disabled_user = u.valid_password?(params[:password]) && !u.disabled?
u if valid_non_disabled_user
end
end
end
| Doorkeeper.configure do
orm :active_record
enable_application_owner :confirmation => true
default_scopes :public
optional_scopes :user, :project, :group, :collection
realm "Panoptes"
resource_owner_authenticator do
u = current_user || warden.authenticate!(scope: :user)
u if !u.disabled?
end
resource_owner_from_credentials do |routes|
if u = User.find_for_database_authentication(login: params[:login])
valid_non_disabled_user = u.valid_password?(params[:password]) && !u.disabled?
else
u = current_user || warden.authenticate!(scope: :user)
valid_non_disabled_user = !u.disabled?
end
u if valid_non_disabled_user
end
end
| Allow resource owner credentials with devise session | Allow resource owner credentials with devise session
| Ruby | apache-2.0 | parrish/Panoptes,camallen/Panoptes,srallen/Panoptes,marten/Panoptes,astopy/Panoptes,zooniverse/Panoptes,astopy/Panoptes,astopy/Panoptes,camallen/Panoptes,edpaget/Panoptes,parrish/Panoptes,rogerhutchings/Panoptes,astopy/Panoptes,srallen/Panoptes,marten/Panoptes,edpaget/Panoptes,edpaget/Panoptes,marten/Panoptes,edpaget/Panoptes,srallen/Panoptes,camallen/Panoptes,zooniverse/Panoptes,rogerhutchings/Panoptes,parrish/Panoptes,zooniverse/Panoptes,rogerhutchings/Panoptes,zooniverse/Panoptes,rogerhutchings/Panoptes,parrish/Panoptes,marten/Panoptes,srallen/Panoptes,camallen/Panoptes | ruby | ## Code Before:
Doorkeeper.configure do
orm :active_record
enable_application_owner :confirmation => true
default_scopes :public
optional_scopes :user, :project, :group, :collection
realm "Panoptes"
resource_owner_authenticator do
current_user || warden.authenticate!(scope: :user)
end
resource_owner_from_credentials do |routes|
if u = User.find_for_database_authentication(login: params[:login])
valid_non_disabled_user = u.valid_password?(params[:password]) && !u.disabled?
u if valid_non_disabled_user
end
end
end
## Instruction:
Allow resource owner credentials with devise session
## Code After:
Doorkeeper.configure do
orm :active_record
enable_application_owner :confirmation => true
default_scopes :public
optional_scopes :user, :project, :group, :collection
realm "Panoptes"
resource_owner_authenticator do
u = current_user || warden.authenticate!(scope: :user)
u if !u.disabled?
end
resource_owner_from_credentials do |routes|
if u = User.find_for_database_authentication(login: params[:login])
valid_non_disabled_user = u.valid_password?(params[:password]) && !u.disabled?
else
u = current_user || warden.authenticate!(scope: :user)
valid_non_disabled_user = !u.disabled?
end
u if valid_non_disabled_user
end
end
| Doorkeeper.configure do
orm :active_record
enable_application_owner :confirmation => true
default_scopes :public
optional_scopes :user, :project, :group, :collection
realm "Panoptes"
resource_owner_authenticator do
- current_user || warden.authenticate!(scope: :user)
+ u = current_user || warden.authenticate!(scope: :user)
? ++++
+ u if !u.disabled?
end
resource_owner_from_credentials do |routes|
if u = User.find_for_database_authentication(login: params[:login])
valid_non_disabled_user = u.valid_password?(params[:password]) && !u.disabled?
- u if valid_non_disabled_user
+ else
+ u = current_user || warden.authenticate!(scope: :user)
+ valid_non_disabled_user = !u.disabled?
end
+
+ u if valid_non_disabled_user
end
end | 9 | 0.428571 | 7 | 2 |
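The resource-owner block in the row above now falls back to the Devise session when no login credentials are posted, while rejecting disabled accounts on both paths. A minimal Python sketch of that decision logic follows; the function name and data shapes are illustrative assumptions, not Doorkeeper's API:

```python
# Hedged sketch: prefer posted credentials; otherwise use the session user.
# Disabled accounts are rejected on both paths, mirroring the Ruby above.
def resolve_owner(params, session_user, users):
    user = users.get(params.get("login"))
    if user is not None:
        valid = user["password"] == params.get("password") and not user["disabled"]
    else:
        user = session_user
        valid = user is not None and not user["disabled"]
    return user if valid else None

users = {"amy": {"password": "pw", "disabled": False}}
# Credentials posted and valid: the looked-up user wins.
assert resolve_owner({"login": "amy", "password": "pw"}, None, users) is users["amy"]
# No credentials posted: fall back to the authenticated session user.
session = {"password": None, "disabled": False}
assert resolve_owner({}, session, users) is session
# A disabled session user yields no resource owner.
assert resolve_owner({}, {"password": None, "disabled": True}, users) is None
```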
d0e702a9b16a71141ac27ba3c851418112aec746 | requirements/base.txt | requirements/base.txt | Django>=1.7
dj-database-url==0.3.0
whitenoise==1.0.3
| Django>=1.7
dj-database-url==0.3.0
requests==2.4.3
sunlight==1.2.6
whitenoise==1.0.3
| Add sunlight and requests to requirements.txt | Add sunlight and requests to requirements.txt
| Text | mit | texastribune/txlege84,texastribune/txlege84,texastribune/txlege84,texastribune/txlege84 | text | ## Code Before:
Django>=1.7
dj-database-url==0.3.0
whitenoise==1.0.3
## Instruction:
Add sunlight and requests to requirements.txt
## Code After:
Django>=1.7
dj-database-url==0.3.0
requests==2.4.3
sunlight==1.2.6
whitenoise==1.0.3
| Django>=1.7
dj-database-url==0.3.0
+ requests==2.4.3
+ sunlight==1.2.6
whitenoise==1.0.3 | 2 | 0.5 | 2 | 0 |
57015bec555ca2a3f2e5893158d00f2dd2ca441c | errs.py | errs.py | import sys
class ConfigError(Exception):
def __init__(self, message):
self.message = message
sys.stdout.write("\nERROR: " + str(message) + "\n\n")
class ParseError(Exception):
def __init__(self, message):
self.message = message
sys.stdout.write("\nERROR: " + str(message) + "\n\n")
| import sys
class GenericException(Exception):
def __init__(self, message):
self.message = message
sys.stdout.write("\nERROR: " + str(message) + "\n\n")
class ConfigError(GenericException):
pass
class ParseError(GenericException):
pass
| Make errors a bit easier to copy | Make errors a bit easier to copy
| Python | agpl-3.0 | OpenTechStrategies/anvil | python | ## Code Before:
import sys
class ConfigError(Exception):
def __init__(self, message):
self.message = message
sys.stdout.write("\nERROR: " + str(message) + "\n\n")
class ParseError(Exception):
def __init__(self, message):
self.message = message
sys.stdout.write("\nERROR: " + str(message) + "\n\n")
## Instruction:
Make errors a bit easier to copy
## Code After:
import sys
class GenericException(Exception):
def __init__(self, message):
self.message = message
sys.stdout.write("\nERROR: " + str(message) + "\n\n")
class ConfigError(GenericException):
pass
class ParseError(GenericException):
pass
| import sys
- class ConfigError(Exception):
+ class GenericException(Exception):
def __init__(self, message):
self.message = message
sys.stdout.write("\nERROR: " + str(message) + "\n\n")
+ class ConfigError(GenericException):
+ pass
- class ParseError(Exception):
- def __init__(self, message):
- self.message = message
-
- sys.stdout.write("\nERROR: " + str(message) + "\n\n")
+ class ParseError(GenericException):
+ pass
+ | 12 | 0.923077 | 6 | 6 |
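The refactor in the row above hoists the duplicated `__init__` into a shared base class. A quick stand-alone check (a sketch modeled on the commit, not the project's file itself) confirms that subclasses inherit the constructor and remain catchable as the base type:

```python
import sys

class GenericException(Exception):
    def __init__(self, message):
        self.message = message
        sys.stdout.write("\nERROR: " + str(message) + "\n\n")

class ConfigError(GenericException):
    pass

class ParseError(GenericException):
    pass

err = ConfigError("missing key")
# The subclass reuses the base __init__ and keeps its own identity.
assert err.message == "missing key"
assert isinstance(err, GenericException)
assert type(err) is ConfigError
```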
502b10ca957e79adcbf0cecc5dbd3ed8f8de04d7 | README.md | README.md |
This is the dairy nutrient calculator module of farm build JavaScript library.
## Getting Started
To get you started simply download this repository using <a href="https://github.com/FarmBuild/farmbuild-soil-sample-importer/archive/master.zip" target="_blank">"Download ZIP"</a> on the right side of this page.
Unzip the downloaded folder and go to "examples" and open one of the html files in a browser.
<a href="https://rawgit.com/FarmBuild/farmbuild-soil-sample-importer/master/docs/farmbuild-soil-sample-importer/0.1.2/index.html" target="_blank">API documentation</a>
For more information about API and Testing see the [Wiki](https://github.com/SpatialVision/farm-build-nutrient-calculator/wiki) section.
[](https://travis-ci.org/FarmBuild/farmbuild-soil-sample-importer)
|
This is the dairy nutrient calculator module of farm build JavaScript library.
## Getting Started
To get you started simply download this repository using <a href="https://github.com/FarmBuild/farmbuild-soil-sample-importer/archive/master.zip" target="_blank">"Download ZIP"</a> on the right side of this page.
Unzip the downloaded folder and go to "examples" and open one of the html files in a browser.
<a href="https://rawgit.com/FarmBuild/farmbuild-soil-sample-importer/master/docs/farmbuild-soil-sample-importer/0.1.2/index.html" target="_blank">API documentation</a>
### AngularJS example
* <a href="https://rawgit.com/FarmBuild/farmbuild-soil-sample-importer/master/examples/angularjs/index.html" target="_blank">Soil Sample - angularJS example</a>
For more information about API and Testing see the [Wiki](https://github.com/SpatialVision/farm-build-nutrient-calculator/wiki) section.
[](https://travis-ci.org/FarmBuild/farmbuild-soil-sample-importer)
| Add link to rawgit hosted example | Add link to rawgit hosted example
| Markdown | apache-2.0 | FarmBuild/farmbuild-soil-sample-importer,FarmBuild/farmbuild-soil-sample-importer | markdown | ## Code Before:
This is the dairy nutrient calculator module of farm build JavaScript library.
## Getting Started
To get you started simply download this repository using <a href="https://github.com/FarmBuild/farmbuild-soil-sample-importer/archive/master.zip" target="_blank">"Download ZIP"</a> on the right side of this page.
Unzip the downloaded folder and go to "examples" and open one of the html files in a browser.
<a href="https://rawgit.com/FarmBuild/farmbuild-soil-sample-importer/master/docs/farmbuild-soil-sample-importer/0.1.2/index.html" target="_blank">API documentation</a>
For more information about API and Testing see the [Wiki](https://github.com/SpatialVision/farm-build-nutrient-calculator/wiki) section.
[](https://travis-ci.org/FarmBuild/farmbuild-soil-sample-importer)
## Instruction:
Add link to rawgit hosted example
## Code After:
This is the dairy nutrient calculator module of farm build JavaScript library.
## Getting Started
To get you started simply download this repository using <a href="https://github.com/FarmBuild/farmbuild-soil-sample-importer/archive/master.zip" target="_blank">"Download ZIP"</a> on the right side of this page.
Unzip the downloaded folder and go to "examples" and open one of the html files in a browser.
<a href="https://rawgit.com/FarmBuild/farmbuild-soil-sample-importer/master/docs/farmbuild-soil-sample-importer/0.1.2/index.html" target="_blank">API documentation</a>
### AngularJS example
* <a href="https://rawgit.com/FarmBuild/farmbuild-soil-sample-importer/master/examples/angularjs/index.html" target="_blank">Soil Sample - angularJS example</a>
For more information about API and Testing see the [Wiki](https://github.com/SpatialVision/farm-build-nutrient-calculator/wiki) section.
[](https://travis-ci.org/FarmBuild/farmbuild-soil-sample-importer)
|
This is the dairy nutrient calculator module of farm build JavaScript library.
## Getting Started
To get you started simply download this repository using <a href="https://github.com/FarmBuild/farmbuild-soil-sample-importer/archive/master.zip" target="_blank">"Download ZIP"</a> on the right side of this page.
Unzip the downloaded folder and go to "examples" and open one of the html files in a browser.
<a href="https://rawgit.com/FarmBuild/farmbuild-soil-sample-importer/master/docs/farmbuild-soil-sample-importer/0.1.2/index.html" target="_blank">API documentation</a>
+ ### AngularJS example
+ * <a href="https://rawgit.com/FarmBuild/farmbuild-soil-sample-importer/master/examples/angularjs/index.html" target="_blank">Soil Sample - angularJS example</a>
For more information about API and Testing see the [Wiki](https://github.com/SpatialVision/farm-build-nutrient-calculator/wiki) section.
[](https://travis-ci.org/FarmBuild/farmbuild-soil-sample-importer) | 2 | 0.125 | 2 | 0 |
1d12f0a74e95381d1caca8b7525bdc08355aaee9 | db/migrate/20170116103602_add_tokens_to_course_lesson_plan_items.rb | db/migrate/20170116103602_add_tokens_to_course_lesson_plan_items.rb | class AddTokensToCourseLessonPlanItems < ActiveRecord::Migration
def change
add_column :course_lesson_plan_items, :opening_reminder_token, :float
add_column :course_lesson_plan_items, :closing_reminder_token, :float
end
end
| class AddTokensToCourseLessonPlanItems < ActiveRecord::Migration
def change
add_column :course_lesson_plan_items, :opening_reminder_token, :float
add_column :course_lesson_plan_items, :closing_reminder_token, :float
Course::Assessment.joins(:lesson_plan_item).
where('course_lesson_plan_items.start_at > ?', Time.zone.now).find_each do |assessment|
# Remove milliseconds part of the assessment
assessment.lesson_plan_item.update_column(:start_at, assessment.start_at.change(usec: 0))
# Create the new reminder job
token = Time.zone.now.to_f.round(5)
assessment.lesson_plan_item.update_column(:opening_reminder_token, token)
Course::Assessment::OpeningReminderJob.set(wait_until: assessment.start_at).
perform_later(assessment.updater, assessment, token)
end
Course::Assessment.joins(:lesson_plan_item).
where('course_lesson_plan_items.end_at > ?', 1.day.from_now).find_each do |assessment|
# Remove milliseconds part of the assessment
assessment.lesson_plan_item.update_column(:end_at, assessment.end_at.change(usec: 0))
# Create the new reminder job
token = Time.zone.now.to_f.round(5)
assessment.lesson_plan_item.update_column(:closing_reminder_token, token)
Course::Assessment::ClosingReminderJob.set(wait_until: assessment.end_at - 1.day).
perform_later(assessment.updater, assessment, token)
end
end
end
| Remove milliseconds part of the time and create new jobs for assessments | Remove milliseconds part of the time and create new jobs for assessments
| Ruby | mit | Coursemology/coursemology2,cysjonathan/coursemology2,Coursemology/coursemology2,Coursemology/coursemology2,cysjonathan/coursemology2,Coursemology/coursemology2,Coursemology/coursemology2,cysjonathan/coursemology2,Coursemology/coursemology2,Coursemology/coursemology2 | ruby | ## Code Before:
class AddTokensToCourseLessonPlanItems < ActiveRecord::Migration
def change
add_column :course_lesson_plan_items, :opening_reminder_token, :float
add_column :course_lesson_plan_items, :closing_reminder_token, :float
end
end
## Instruction:
Remove milliseconds part of the time and create new jobs for assessments
## Code After:
class AddTokensToCourseLessonPlanItems < ActiveRecord::Migration
def change
add_column :course_lesson_plan_items, :opening_reminder_token, :float
add_column :course_lesson_plan_items, :closing_reminder_token, :float
Course::Assessment.joins(:lesson_plan_item).
where('course_lesson_plan_items.start_at > ?', Time.zone.now).find_each do |assessment|
# Remove milliseconds part of the assessment
assessment.lesson_plan_item.update_column(:start_at, assessment.start_at.change(usec: 0))
# Create the new reminder job
token = Time.zone.now.to_f.round(5)
assessment.lesson_plan_item.update_column(:opening_reminder_token, token)
Course::Assessment::OpeningReminderJob.set(wait_until: assessment.start_at).
perform_later(assessment.updater, assessment, token)
end
Course::Assessment.joins(:lesson_plan_item).
where('course_lesson_plan_items.end_at > ?', 1.day.from_now).find_each do |assessment|
# Remove milliseconds part of the assessment
assessment.lesson_plan_item.update_column(:end_at, assessment.end_at.change(usec: 0))
# Create the new reminder job
token = Time.zone.now.to_f.round(5)
assessment.lesson_plan_item.update_column(:closing_reminder_token, token)
Course::Assessment::ClosingReminderJob.set(wait_until: assessment.end_at - 1.day).
perform_later(assessment.updater, assessment, token)
end
end
end
| class AddTokensToCourseLessonPlanItems < ActiveRecord::Migration
def change
add_column :course_lesson_plan_items, :opening_reminder_token, :float
add_column :course_lesson_plan_items, :closing_reminder_token, :float
+
+ Course::Assessment.joins(:lesson_plan_item).
+ where('course_lesson_plan_items.start_at > ?', Time.zone.now).find_each do |assessment|
+ # Remove milliseconds part of the assessment
+ assessment.lesson_plan_item.update_column(:start_at, assessment.start_at.change(usec: 0))
+
+ # Create the new reminder job
+ token = Time.zone.now.to_f.round(5)
+ assessment.lesson_plan_item.update_column(:opening_reminder_token, token)
+ Course::Assessment::OpeningReminderJob.set(wait_until: assessment.start_at).
+ perform_later(assessment.updater, assessment, token)
+ end
+
+ Course::Assessment.joins(:lesson_plan_item).
+ where('course_lesson_plan_items.end_at > ?', 1.day.from_now).find_each do |assessment|
+ # Remove milliseconds part of the assessment
+ assessment.lesson_plan_item.update_column(:end_at, assessment.end_at.change(usec: 0))
+
+ # Create the new reminder job
+ token = Time.zone.now.to_f.round(5)
+ assessment.lesson_plan_item.update_column(:closing_reminder_token, token)
+ Course::Assessment::ClosingReminderJob.set(wait_until: assessment.end_at - 1.day).
+ perform_later(assessment.updater, assessment, token)
+ end
end
end | 24 | 4 | 24 | 0 |
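The migration in this record leans on ActiveSupport's `Time#change(usec: 0)` to drop sub-second precision before re-enqueuing the reminder jobs. As a rough plain-Ruby sketch (no Rails loaded; `truncate_usec` and `reminder_token` are illustrative names, not methods from the codebase above), the two ingredients look like:

```ruby
# Floor a Time to whole seconds -- the same effect as ActiveSupport's
# time.change(usec: 0) for this migration's purpose.
def truncate_usec(time)
  Time.at(time.to_i).utc # integer epoch seconds, so usec is always 0
end

# A float-time token of the kind the migration stores alongside each item
# and hands to the reminder job.
def reminder_token
  Time.now.to_f.round(5)
end

t = Time.utc(2016, 5, 1, 12, 30, 15.123456)
puts truncate_usec(t).usec # => 0
```

Presumably the point of the truncation is that the persisted `start_at`/`end_at` and the job's `wait_until` then agree to the exact second.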
ccfd0a23d2ba3363a4b86fdc9807f39eddbffc83 | src/main/java/de/codecentric/centerdevice/glass/MacApplicationAdapter.java | src/main/java/de/codecentric/centerdevice/glass/MacApplicationAdapter.java | package de.codecentric.centerdevice.glass;
import com.sun.glass.ui.Application;
import de.codecentric.centerdevice.util.ReflectionUtils;
import javafx.application.Platform;
public class MacApplicationAdapter {
private Application app;
private boolean forceQuitOnCmdQ = true;
public MacApplicationAdapter() throws ReflectiveOperationException {
app = Application.GetApplication();
}
public void hide() {
ReflectionUtils.invokeQuietly(app, "_hide");
}
public void hideOtherApplications() {
ReflectionUtils.invokeQuietly(app, "_hideOtherApplications");
}
public void unhideAllApplications() {
ReflectionUtils.invokeQuietly(app, "_unhideAllApplications");
}
public void quit() {
Application.EventHandler eh = app.getEventHandler();
if (eh != null) {
eh.handleQuitAction(Application.GetApplication(), System.nanoTime());
}
if (forceQuitOnCmdQ) {
Platform.exit();
}
}
public void setForceQuitOnCmdQ(boolean forceQuit) {
this.forceQuitOnCmdQ = forceQuit;
}
}
| package de.codecentric.centerdevice.glass;
import com.sun.glass.ui.Application;
import de.codecentric.centerdevice.util.ReflectionUtils;
import javafx.application.Platform;
public class MacApplicationAdapter {
private Application app;
private boolean forceQuitOnCmdQ = true;
public MacApplicationAdapter() {
app = Application.GetApplication();
}
public void hide() {
ReflectionUtils.invokeQuietly(app, "_hide");
}
public void hideOtherApplications() {
ReflectionUtils.invokeQuietly(app, "_hideOtherApplications");
}
public void unhideAllApplications() {
ReflectionUtils.invokeQuietly(app, "_unhideAllApplications");
}
public void quit() {
Application.EventHandler eh = app.getEventHandler();
if (eh != null) {
eh.handleQuitAction(Application.GetApplication(), System.nanoTime());
}
if (forceQuitOnCmdQ) {
Platform.exit();
}
}
public void setForceQuitOnCmdQ(boolean forceQuit) {
this.forceQuitOnCmdQ = forceQuit;
}
}
| Remove exception from method signature | Remove exception from method signature
| Java | bsd-3-clause | codecentric/NSMenuFX | java | ## Code Before:
package de.codecentric.centerdevice.glass;
import com.sun.glass.ui.Application;
import de.codecentric.centerdevice.util.ReflectionUtils;
import javafx.application.Platform;
public class MacApplicationAdapter {
private Application app;
private boolean forceQuitOnCmdQ = true;
public MacApplicationAdapter() throws ReflectiveOperationException {
app = Application.GetApplication();
}
public void hide() {
ReflectionUtils.invokeQuietly(app, "_hide");
}
public void hideOtherApplications() {
ReflectionUtils.invokeQuietly(app, "_hideOtherApplications");
}
public void unhideAllApplications() {
ReflectionUtils.invokeQuietly(app, "_unhideAllApplications");
}
public void quit() {
Application.EventHandler eh = app.getEventHandler();
if (eh != null) {
eh.handleQuitAction(Application.GetApplication(), System.nanoTime());
}
if (forceQuitOnCmdQ) {
Platform.exit();
}
}
public void setForceQuitOnCmdQ(boolean forceQuit) {
this.forceQuitOnCmdQ = forceQuit;
}
}
## Instruction:
Remove exception from method signature
## Code After:
package de.codecentric.centerdevice.glass;
import com.sun.glass.ui.Application;
import de.codecentric.centerdevice.util.ReflectionUtils;
import javafx.application.Platform;
public class MacApplicationAdapter {
private Application app;
private boolean forceQuitOnCmdQ = true;
public MacApplicationAdapter() {
app = Application.GetApplication();
}
public void hide() {
ReflectionUtils.invokeQuietly(app, "_hide");
}
public void hideOtherApplications() {
ReflectionUtils.invokeQuietly(app, "_hideOtherApplications");
}
public void unhideAllApplications() {
ReflectionUtils.invokeQuietly(app, "_unhideAllApplications");
}
public void quit() {
Application.EventHandler eh = app.getEventHandler();
if (eh != null) {
eh.handleQuitAction(Application.GetApplication(), System.nanoTime());
}
if (forceQuitOnCmdQ) {
Platform.exit();
}
}
public void setForceQuitOnCmdQ(boolean forceQuit) {
this.forceQuitOnCmdQ = forceQuit;
}
}
| package de.codecentric.centerdevice.glass;
import com.sun.glass.ui.Application;
import de.codecentric.centerdevice.util.ReflectionUtils;
import javafx.application.Platform;
public class MacApplicationAdapter {
private Application app;
private boolean forceQuitOnCmdQ = true;
- public MacApplicationAdapter() throws ReflectiveOperationException {
+ public MacApplicationAdapter() {
app = Application.GetApplication();
}
public void hide() {
ReflectionUtils.invokeQuietly(app, "_hide");
}
public void hideOtherApplications() {
ReflectionUtils.invokeQuietly(app, "_hideOtherApplications");
}
public void unhideAllApplications() {
ReflectionUtils.invokeQuietly(app, "_unhideAllApplications");
}
public void quit() {
Application.EventHandler eh = app.getEventHandler();
if (eh != null) {
eh.handleQuitAction(Application.GetApplication(), System.nanoTime());
}
if (forceQuitOnCmdQ) {
Platform.exit();
}
}
public void setForceQuitOnCmdQ(boolean forceQuit) {
this.forceQuitOnCmdQ = forceQuit;
}
} | 2 | 0.045455 | 1 | 1 |
fd2324be6a55dc5a72e800b5807d30e0051a2b0b | charts/covidDataExplorer/CovidRadioControl.tsx | charts/covidDataExplorer/CovidRadioControl.tsx | import React from "react"
import { observer } from "mobx-react"
import { action } from "mobx"
export interface RadioOption {
label: string
checked: boolean
onChange: (checked: boolean) => void
}
@observer
export class CovidRadioControl extends React.Component<{
name: string
isCheckbox?: boolean
options: RadioOption[]
}> {
@action.bound onChange(ev: React.ChangeEvent<HTMLInputElement>) {
this.props.options[parseInt(ev.currentTarget.value)].onChange(
ev.currentTarget.checked
)
}
render() {
const name = this.props.name
return (
<div className="CovidDataExplorerControl">
<div className="ControlHeader">{this.props.name}</div>
{this.props.options.map((option, index) => (
<div key={index}>
<label
className={
option.checked ? "SelectedOption" : "Option"
}
>
<input
onChange={this.onChange}
type={
this.props.isCheckbox ? "checkbox" : "radio"
}
name={name}
data-track-note={`covid-click-${name}`}
checked={option.checked}
value={index}
/>{" "}
{option.label}
</label>
</div>
))}
</div>
)
}
}
| import React from "react"
import { observer } from "mobx-react"
import { action } from "mobx"
export interface RadioOption {
label: string
checked: boolean
onChange: (checked: boolean) => void
}
@observer
export class CovidRadioControl extends React.Component<{
name: string
isCheckbox?: boolean
options: RadioOption[]
}> {
@action.bound onChange(ev: React.ChangeEvent<HTMLInputElement>) {
this.props.options[parseInt(ev.currentTarget.value)].onChange(
ev.currentTarget.checked
)
}
render() {
const name = this.props.name
return (
<div className="CovidDataExplorerControl">
<div className="ControlHeader">{this.props.name}</div>
{this.props.options.map((option, index) => (
<div key={index}>
<label
className={
option.checked ? "SelectedOption" : "Option"
}
data-track-note={`covid-click-${name}`}
>
<input
onChange={this.onChange}
type={
this.props.isCheckbox ? "checkbox" : "radio"
}
name={name}
checked={option.checked}
value={index}
/>{" "}
{option.label}
</label>
</div>
))}
</div>
)
}
}
| Move data-track-note to <label> to capture label text content | Move data-track-note to <label> to capture label text content
| TypeScript | mit | owid/owid-grapher,OurWorldInData/owid-grapher,owid/owid-grapher,owid/owid-grapher,OurWorldInData/owid-grapher,OurWorldInData/owid-grapher,owid/owid-grapher,owid/owid-grapher,OurWorldInData/owid-grapher,OurWorldInData/owid-grapher | typescript | ## Code Before:
import React from "react"
import { observer } from "mobx-react"
import { action } from "mobx"
export interface RadioOption {
label: string
checked: boolean
onChange: (checked: boolean) => void
}
@observer
export class CovidRadioControl extends React.Component<{
name: string
isCheckbox?: boolean
options: RadioOption[]
}> {
@action.bound onChange(ev: React.ChangeEvent<HTMLInputElement>) {
this.props.options[parseInt(ev.currentTarget.value)].onChange(
ev.currentTarget.checked
)
}
render() {
const name = this.props.name
return (
<div className="CovidDataExplorerControl">
<div className="ControlHeader">{this.props.name}</div>
{this.props.options.map((option, index) => (
<div key={index}>
<label
className={
option.checked ? "SelectedOption" : "Option"
}
>
<input
onChange={this.onChange}
type={
this.props.isCheckbox ? "checkbox" : "radio"
}
name={name}
data-track-note={`covid-click-${name}`}
checked={option.checked}
value={index}
/>{" "}
{option.label}
</label>
</div>
))}
</div>
)
}
}
## Instruction:
Move data-track-note to <label> to capture label text content
## Code After:
import React from "react"
import { observer } from "mobx-react"
import { action } from "mobx"
export interface RadioOption {
label: string
checked: boolean
onChange: (checked: boolean) => void
}
@observer
export class CovidRadioControl extends React.Component<{
name: string
isCheckbox?: boolean
options: RadioOption[]
}> {
@action.bound onChange(ev: React.ChangeEvent<HTMLInputElement>) {
this.props.options[parseInt(ev.currentTarget.value)].onChange(
ev.currentTarget.checked
)
}
render() {
const name = this.props.name
return (
<div className="CovidDataExplorerControl">
<div className="ControlHeader">{this.props.name}</div>
{this.props.options.map((option, index) => (
<div key={index}>
<label
className={
option.checked ? "SelectedOption" : "Option"
}
data-track-note={`covid-click-${name}`}
>
<input
onChange={this.onChange}
type={
this.props.isCheckbox ? "checkbox" : "radio"
}
name={name}
checked={option.checked}
value={index}
/>{" "}
{option.label}
</label>
</div>
))}
</div>
)
}
}
| import React from "react"
import { observer } from "mobx-react"
import { action } from "mobx"
export interface RadioOption {
label: string
checked: boolean
onChange: (checked: boolean) => void
}
@observer
export class CovidRadioControl extends React.Component<{
name: string
isCheckbox?: boolean
options: RadioOption[]
}> {
@action.bound onChange(ev: React.ChangeEvent<HTMLInputElement>) {
this.props.options[parseInt(ev.currentTarget.value)].onChange(
ev.currentTarget.checked
)
}
render() {
const name = this.props.name
return (
<div className="CovidDataExplorerControl">
<div className="ControlHeader">{this.props.name}</div>
{this.props.options.map((option, index) => (
<div key={index}>
<label
className={
option.checked ? "SelectedOption" : "Option"
}
+ data-track-note={`covid-click-${name}`}
>
<input
onChange={this.onChange}
type={
this.props.isCheckbox ? "checkbox" : "radio"
}
name={name}
- data-track-note={`covid-click-${name}`}
checked={option.checked}
value={index}
/>{" "}
{option.label}
</label>
</div>
))}
</div>
)
}
} | 2 | 0.038462 | 1 | 1 |
fdd915f668f5c07e9642243d59548767780e0c36 | app/graphql/mutations/create_team_member.rb | app/graphql/mutations/create_team_member.rb | class Mutations::CreateTeamMember < Mutations::BaseMutation
field :team_member, Types::TeamMemberType, null: false
field :ticket, Types::TicketType, null: true
field :converted_signups, [Types::SignupType], null: false
field :moved_signups, [Types::SignupMoveResultType], null: false
argument :event_id, Integer, required: true, camelize: false
argument :user_con_profile_id, Integer, required: true, camelize: false
argument :team_member, Types::TeamMemberInputType, required: true, camelize: false
argument :provide_ticket_type_id, Integer, required: false, camelize: false
def resolve(**args)
event = convention.events.find(args[:event_id])
user_con_profile = convention.user_con_profiles.find(args[:user_con_profile_id])
result = CreateTeamMemberService.new(
event: event,
user_con_profile: user_con_profile,
team_member_attrs: args[:team_member],
provide_ticket_type_id: args[:provide_ticket_type_id]
).call!
{
team_member: result.team_member,
ticket: result.ticket,
converted_signups: result.converted_signups,
moved_signups: result.moved_signups
}
end
end
| class Mutations::CreateTeamMember < Mutations::BaseMutation
field :team_member, Types::TeamMemberType, null: false
field :ticket, Types::TicketType, null: true
field :converted_signups, [Types::SignupType], null: false
field :move_results, [Types::SignupMoveResultType], null: false
argument :event_id, Integer, required: true, camelize: false
argument :user_con_profile_id, Integer, required: true, camelize: false
argument :team_member, Types::TeamMemberInputType, required: true, camelize: false
argument :provide_ticket_type_id, Integer, required: false, camelize: false
def resolve(**args)
event = convention.events.find(args[:event_id])
user_con_profile = convention.user_con_profiles.find(args[:user_con_profile_id])
result = CreateTeamMemberService.new(
event: event,
user_con_profile: user_con_profile,
team_member_attrs: args[:team_member],
provide_ticket_type_id: args[:provide_ticket_type_id]
).call!
{
team_member: result.team_member,
ticket: result.ticket,
converted_signups: result.converted_signups,
move_results: result.move_results
}
end
end
| Use the right field name | Use the right field name
| Ruby | mit | neinteractiveliterature/intercode,neinteractiveliterature/intercode,neinteractiveliterature/intercode,neinteractiveliterature/intercode,neinteractiveliterature/intercode | ruby | ## Code Before:
class Mutations::CreateTeamMember < Mutations::BaseMutation
field :team_member, Types::TeamMemberType, null: false
field :ticket, Types::TicketType, null: true
field :converted_signups, [Types::SignupType], null: false
field :moved_signups, [Types::SignupMoveResultType], null: false
argument :event_id, Integer, required: true, camelize: false
argument :user_con_profile_id, Integer, required: true, camelize: false
argument :team_member, Types::TeamMemberInputType, required: true, camelize: false
argument :provide_ticket_type_id, Integer, required: false, camelize: false
def resolve(**args)
event = convention.events.find(args[:event_id])
user_con_profile = convention.user_con_profiles.find(args[:user_con_profile_id])
result = CreateTeamMemberService.new(
event: event,
user_con_profile: user_con_profile,
team_member_attrs: args[:team_member],
provide_ticket_type_id: args[:provide_ticket_type_id]
).call!
{
team_member: result.team_member,
ticket: result.ticket,
converted_signups: result.converted_signups,
moved_signups: result.moved_signups
}
end
end
## Instruction:
Use the right field name
## Code After:
class Mutations::CreateTeamMember < Mutations::BaseMutation
field :team_member, Types::TeamMemberType, null: false
field :ticket, Types::TicketType, null: true
field :converted_signups, [Types::SignupType], null: false
field :move_results, [Types::SignupMoveResultType], null: false
argument :event_id, Integer, required: true, camelize: false
argument :user_con_profile_id, Integer, required: true, camelize: false
argument :team_member, Types::TeamMemberInputType, required: true, camelize: false
argument :provide_ticket_type_id, Integer, required: false, camelize: false
def resolve(**args)
event = convention.events.find(args[:event_id])
user_con_profile = convention.user_con_profiles.find(args[:user_con_profile_id])
result = CreateTeamMemberService.new(
event: event,
user_con_profile: user_con_profile,
team_member_attrs: args[:team_member],
provide_ticket_type_id: args[:provide_ticket_type_id]
).call!
{
team_member: result.team_member,
ticket: result.ticket,
converted_signups: result.converted_signups,
move_results: result.move_results
}
end
end
| class Mutations::CreateTeamMember < Mutations::BaseMutation
field :team_member, Types::TeamMemberType, null: false
field :ticket, Types::TicketType, null: true
field :converted_signups, [Types::SignupType], null: false
- field :moved_signups, [Types::SignupMoveResultType], null: false
? - --- ^
+ field :move_results, [Types::SignupMoveResultType], null: false
? ++ ^^
argument :event_id, Integer, required: true, camelize: false
argument :user_con_profile_id, Integer, required: true, camelize: false
argument :team_member, Types::TeamMemberInputType, required: true, camelize: false
argument :provide_ticket_type_id, Integer, required: false, camelize: false
def resolve(**args)
event = convention.events.find(args[:event_id])
user_con_profile = convention.user_con_profiles.find(args[:user_con_profile_id])
result = CreateTeamMemberService.new(
event: event,
user_con_profile: user_con_profile,
team_member_attrs: args[:team_member],
provide_ticket_type_id: args[:provide_ticket_type_id]
).call!
{
team_member: result.team_member,
ticket: result.ticket,
converted_signups: result.converted_signups,
- moved_signups: result.moved_signups
? - --- ^ - --- ^
+ move_results: result.move_results
? ++ ^^ ++ ^^
}
end
end | 4 | 0.137931 | 2 | 2 |
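The resolver in this record pushes all the work into a service object and fans the service's result out into the mutation's declared fields. A minimal self-contained sketch of that pattern (every name below is a hypothetical stand-in, not one of Intercode's real classes):

```ruby
# Service-object pattern: inputs go to the constructor, the work happens
# in #call!, and the return value is a result whose readers map 1:1 onto
# the mutation's response fields.
ServiceResult = Struct.new(:team_member, :ticket, :converted_signups, :move_results)

class CreateMemberService
  def initialize(event:, profile:)
    @event = event
    @profile = profile
  end

  def call!
    # Real code would create records and might raise; this only shapes the result.
    ServiceResult.new("#{@profile}@#{@event}", nil, [], [])
  end
end

result = CreateMemberService.new(event: "intercode", profile: "alice").call!
response = {
  team_member: result.team_member,
  ticket: result.ticket,
  converted_signups: result.converted_signups,
  move_results: result.move_results,
}
puts response[:team_member] # => alice@intercode
```

A field rename like the one in this commit then has to touch both the `field` declaration and the response-hash key, which is exactly what the diff above shows.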
aa6e42790847f36ad7a13be5baf9774f031f53f7 | conda-r/exercise/README.md | conda-r/exercise/README.md | Conda Hands-on Activity
=======================
Goal
----
- Use conda to install dependencies, then run a toolbox that won't otherwise work.
- Kevin suggested spatial methods in scikit-learn, look at that. Failing that, we should do something else which uses scikit-learn (or something similarly useful)
Steps
-----
1. Open toolbox, `sdm.pyt`, populate values.
2. Run it -- fails! Why's that?
3. Use conda -- command line, the UI doesn't support UAC yet.
conda install scikit-learn
4. Add back data to toolbox, run again.
Bonus
-----
Jupyter notebooks allow interactive coding to be much simpler. We have the ArcGIS API for Python, which enables spatial data analysis moving between local data, ArcGIS Online data, and server data, powered by (you guessed it) the SciPy stack.
To install with Conda:
conda install arcgis
See details at the [ArcGIS API for Python](https://developers.arcgis.com/python/) site.
| Conda Hands-on Activity
=======================
Goal
----
- Use conda to install dependencies, then run a toolbox that won't otherwise work.
- Kevin suggested spatial methods in scikit-learn, look at that. Failing that, we should do something else which uses scikit-learn (or something similarly useful)
Steps
-----
1. Open toolbox, `sdm.pyt`, populate values.
2. Run it -- fails! Why's that?
3. Use conda, either from the command line, or from the UI.
+ To install from the command line, run:
```sh
conda install scikit-learn
```
+ To install from Pro:
1. Click on "Project" then select "Python".
2. Select the "Add Packages" tab.
3. In the search box type "scikit". Select the `scikit-learn` package.
4. Click "Install".
4. Add back data to toolbox, run again.
Bonus
-----
Jupyter notebooks allow interactive coding to be much simpler. We have the ArcGIS API for Python, which enables spatial data analysis moving between local data, ArcGIS Online data, and server data, powered by (you guessed it) the SciPy stack.
To install with Conda:
conda install arcgis
See details at the [ArcGIS API for Python](https://developers.arcgis.com/python/) site.
| Add better scikit-learn installation instructions | Add better scikit-learn installation instructions | Markdown | apache-2.0 | EsriOceans/oceans-workshop-2016,EsriOceans/oceans-workshop-2016,EsriOceans/oceans-workshop-2016 | markdown | ## Code Before:
Conda Hands-on Activity
=======================
Goal
----
- Use conda to install dependencies, then run a toolbox that won't otherwise work.
- Kevin suggested spatial methods in scikit-learn, look at that. Failing that, we should do something else which uses scikit-learn (or something similarly useful)
Steps
-----
1. Open toolbox, `sdm.pyt`, populate values.
2. Run it -- fails! Why's that?
3. Use conda -- command line, the UI doesn't support UAC yet.
conda install scikit-learn
4. Add back data to toolbox, run again.
Bonus
-----
Jupyter notebooks allow interactive coding to be much simpler. We have the ArcGIS API for Python, which enables spatial data analysis moving between local data, ArcGIS Online data, and server data, powered by (you guessed it) the SciPy stack.
To install with Conda:
conda install arcgis
See details at the [ArcGIS API for Python](https://developers.arcgis.com/python/) site.
## Instruction:
Add better scikit-learn installation instructions
## Code After:
Conda Hands-on Activity
=======================
Goal
----
- Use conda to install dependencies, then run a toolbox that won't otherwise work.
- Kevin suggested spatial methods in scikit-learn, look at that. Failing that, we should do something else which uses scikit-learn (or something similarly useful)
Steps
-----
1. Open toolbox, `sdm.pyt`, populate values.
2. Run it -- fails! Why's that?
3. Use conda, either from the command line, or from the UI.
+ To install from the command line, run:
```sh
conda install scikit-learn
```
+ To install from Pro:
1. Click on "Project" then select "Python".
2. Select the "Add Packages" tab.
3. In the search box type "scikit". Select the `scikit-learn` package.
4. Click "Install".
4. Add back data to toolbox, run again.
Bonus
-----
Jupyter notebooks allow interactive coding to be much simpler. We have the ArcGIS API for Python, which enables spatial data analysis moving between local data, ArcGIS Online data, and server data, powered by (you guessed it) the SciPy stack.
To install with Conda:
conda install arcgis
See details at the [ArcGIS API for Python](https://developers.arcgis.com/python/) site.
| Conda Hands-on Activity
=======================
Goal
----
- Use conda to install dependencies, then run a toolbox that won't otherwise work.
- Kevin suggested spatial methods in scikit-learn, look at that. Failing that, we should do something else which uses scikit-learn (or something similarly useful)
Steps
-----
1. Open toolbox, `sdm.pyt`, populate values.
2. Run it -- fails! Why's that?
- 3. Use conda -- command line, the UI doesn't support UAC yet.
+ 3. Use conda, either from the command line, or from the UI.
+ + To install from the command line, run:
+ ```sh
- conda install scikit-learn
+ conda install scikit-learn
? ++++
+ ```
+
+ + To install from Pro:
+ 1. Click on "Project" then select "Python".
+ 2. Select the "Add Packages" tab.
+ 3. In the search box type "scikit". Select the `scikit-learn` package.
+ 4. Click "Install".
+
4. Add back data to toolbox, run again.
Bonus
-----
Jupyter notebooks allow interactive coding to be much simpler. We have the ArcGIS API for Python, which enables spatial data analysis moving between local data, ArcGIS Online data, and server data, powered by (you guessed it) the SciPy stack.
To install with Conda:
conda install arcgis
See details at the [ArcGIS API for Python](https://developers.arcgis.com/python/) site. | 14 | 0.424242 | 12 | 2 |
35ef09494170aa3b4f80c15b0f591cc5728b57b7 | app/templates/arethusa.core/navbar_buttons_collapsed.html | app/templates/arethusa.core/navbar_buttons_collapsed.html | <li>
<a
class="button"
title="Menu"
dropdown-toggle="#navbar_collapsed_buttons_menu">
<i class="fi-align-justify"></i>
</a>
<ul id="navbar_collapsed_buttons_menu" class="navbar-dropdown">
<li><a saver/></li>
<li><a hist-undo/></li>
<li><a hist-redo/></li>
<li><a sidepanel-folder/></li>
<li><a deselector/></li>
<li><a title="Messages" reveal-toggle="all-messages" slide="true"><i class="fi-mail"></i></a></li>
<li><a title="Contact us" id="uservoicebutton" data-uv-trigger="contact"><i class="fi-comment"></i></a></li>
<li><a title="{{ 'LANGUAGE' | translate }}" translate-language/></li>
</ul>
</li>
| <li>
<a
class="button"
title="Menu"
dropdown-toggle="#navbar_collapsed_buttons_menu">
<i class="fi-align-justify"></i>
</a>
<ul id="navbar_collapsed_buttons_menu" class="navbar-dropdown">
<li><a saver/></li>
<li><a hist-undo/></li>
<li><a hist-redo/></li>
<li><a sidepanel-folder/></li>
<li><a deselector/></li>
<li><a title="Messages" reveal-toggle="all-messages" slide="true"><i class="fi-mail"></i></a></li>
<li><a title="Contact us" id="uservoicebutton" data-uv-trigger="contact"><i class="fi-comment"></i></a></li>
<li><a title="{{ 'LANGUAGE' | translate }}" translate-language/></li>
<li><a exit/></li>
</ul>
</li>
| Add exit to collapsed navbar buttons | Add exit to collapsed navbar buttons
| HTML | mit | latin-language-toolkit/arethusa,alpheios-project/arethusa,alpheios-project/arethusa,PonteIneptique/arethusa,alpheios-project/arethusa,Masoumeh/arethusa,fbaumgardt/arethusa,Masoumeh/arethusa,PonteIneptique/arethusa,fbaumgardt/arethusa,latin-language-toolkit/arethusa,fbaumgardt/arethusa | html | ## Code Before:
<li>
<a
class="button"
title="Menu"
dropdown-toggle="#navbar_collapsed_buttons_menu">
<i class="fi-align-justify"></i>
</a>
<ul id="navbar_collapsed_buttons_menu" class="navbar-dropdown">
<li><a saver/></li>
<li><a hist-undo/></li>
<li><a hist-redo/></li>
<li><a sidepanel-folder/></li>
<li><a deselector/></li>
<li><a title="Messages" reveal-toggle="all-messages" slide="true"><i class="fi-mail"></i></a></li>
<li><a title="Contact us" id="uservoicebutton" data-uv-trigger="contact"><i class="fi-comment"></i></a></li>
<li><a title="{{ 'LANGUAGE' | translate }}" translate-language/></li>
</ul>
</li>
## Instruction:
Add exit to collapsed navbar buttons
## Code After:
<li>
<a
class="button"
title="Menu"
dropdown-toggle="#navbar_collapsed_buttons_menu">
<i class="fi-align-justify"></i>
</a>
<ul id="navbar_collapsed_buttons_menu" class="navbar-dropdown">
<li><a saver/></li>
<li><a hist-undo/></li>
<li><a hist-redo/></li>
<li><a sidepanel-folder/></li>
<li><a deselector/></li>
<li><a title="Messages" reveal-toggle="all-messages" slide="true"><i class="fi-mail"></i></a></li>
<li><a title="Contact us" id="uservoicebutton" data-uv-trigger="contact"><i class="fi-comment"></i></a></li>
<li><a title="{{ 'LANGUAGE' | translate }}" translate-language/></li>
<li><a exit/></li>
</ul>
</li>
| <li>
<a
class="button"
title="Menu"
dropdown-toggle="#navbar_collapsed_buttons_menu">
<i class="fi-align-justify"></i>
</a>
<ul id="navbar_collapsed_buttons_menu" class="navbar-dropdown">
<li><a saver/></li>
<li><a hist-undo/></li>
<li><a hist-redo/></li>
<li><a sidepanel-folder/></li>
<li><a deselector/></li>
<li><a title="Messages" reveal-toggle="all-messages" slide="true"><i class="fi-mail"></i></a></li>
<li><a title="Contact us" id="uservoicebutton" data-uv-trigger="contact"><i class="fi-comment"></i></a></li>
<li><a title="{{ 'LANGUAGE' | translate }}" translate-language/></li>
+ <li><a exit/></li>
</ul>
</li> | 1 | 0.055556 | 1 | 0 |
a96a533ac026d9b543e642eb5e59a9f092f24ae4 | project/Build.scala | project/Build.scala | import sbt._
import Keys._
import play.Project._
object ApplicationBuild extends Build {
val appName = "liumsg"
val appVersion = "1.0-SNAPSHOT"
val appDependencies = Seq(
// Add your project dependencies here,
jdbc,
anorm,
"com.wordnik" %% "swagger-play2-utils" % "1.2.4"
)
val main = play.Project(appName, appVersion, appDependencies).settings(
// Add your own project settings here
resolvers := Seq(
"Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository",
"sonatype-snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
"sonatype-releases" at "https://oss.sonatype.org/content/repositories/releases",
"java-net" at "http://download.java.net/maven/2",
"Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"))
}
| import sbt._
import Keys._
import play.Project._
object ApplicationBuild extends Build {
val appName = "liumsg"
val appVersion = "1.0-SNAPSHOT"
val appDependencies = Seq(
// Add your project dependencies here,
jdbc,
anorm,
"com.wordnik" %% "swagger-play2-utils" % "1.2.4"
)
val main = play.Project(appName, appVersion, appDependencies).settings(
// Add your own project settings here
scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature"),
resolvers := Seq(
"Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository",
"sonatype-snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
"sonatype-releases" at "https://oss.sonatype.org/content/repositories/releases",
"java-net" at "http://download.java.net/maven/2",
"Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"))
}
| Add compiler flags for deprecated features | Add compiler flags for deprecated features
| Scala | mit | bsalimi/speech-recognition-api,SG-LIUM/SGL-SpeechWeb-Demo,bsalimi/speech-recognition-api,SG-LIUM/SGL-SpeechWeb-Demo,bsalimi/speech-recognition-api,SG-LIUM/SGL-SpeechWeb-Demo,bsalimi/speech-recognition-api | scala | ## Code Before:
import sbt._
import Keys._
import play.Project._
object ApplicationBuild extends Build {
val appName = "liumsg"
val appVersion = "1.0-SNAPSHOT"
val appDependencies = Seq(
// Add your project dependencies here,
jdbc,
anorm,
"com.wordnik" %% "swagger-play2-utils" % "1.2.4"
)
val main = play.Project(appName, appVersion, appDependencies).settings(
// Add your own project settings here
resolvers := Seq(
"Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository",
"sonatype-snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
"sonatype-releases" at "https://oss.sonatype.org/content/repositories/releases",
"java-net" at "http://download.java.net/maven/2",
"Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"))
}
## Instruction:
Add compiler flags for deprecated features
## Code After:
import sbt._
import Keys._
import play.Project._
object ApplicationBuild extends Build {
val appName = "liumsg"
val appVersion = "1.0-SNAPSHOT"
val appDependencies = Seq(
// Add your project dependencies here,
jdbc,
anorm,
"com.wordnik" %% "swagger-play2-utils" % "1.2.4"
)
val main = play.Project(appName, appVersion, appDependencies).settings(
// Add your own project settings here
scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature"),
resolvers := Seq(
"Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository",
"sonatype-snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
"sonatype-releases" at "https://oss.sonatype.org/content/repositories/releases",
"java-net" at "http://download.java.net/maven/2",
"Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"))
}
| import sbt._
import Keys._
import play.Project._
object ApplicationBuild extends Build {
val appName = "liumsg"
val appVersion = "1.0-SNAPSHOT"
val appDependencies = Seq(
// Add your project dependencies here,
jdbc,
anorm,
"com.wordnik" %% "swagger-play2-utils" % "1.2.4"
)
val main = play.Project(appName, appVersion, appDependencies).settings(
// Add your own project settings here
+ scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature"),
resolvers := Seq(
"Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository",
"sonatype-snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
"sonatype-releases" at "https://oss.sonatype.org/content/repositories/releases",
"java-net" at "http://download.java.net/maven/2",
"Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"))
} | 1 | 0.04 | 1 | 0 |
6ef6e9e6b259f4da27b9a0566d9cc12e760df360 | app/models/post.rb | app/models/post.rb | class Post < ActiveRecord::Base
include Taggable
belongs_to :category
belongs_to :user
has_and_belongs_to_many :joined_users,
class_name: "User",
join_table: "user_joined_post",
foreign_key: "post_id",
association_foreign_key: "user_id"
# acts_as_taggable rescue nil
# HACK: there is a known issue that acts_as_taggable breaks asset precompilation on Heroku.
default_scope ->{ order('posts.created_at DESC') }
def self.categorized(cat=nil)
cat ? where(category_id: cat) : self
end
def to_s
title
end
end
| class Post < ActiveRecord::Base
include Taggable
belongs_to :category
belongs_to :user
has_and_belongs_to_many :joined_users,
class_name: "User",
join_table: "user_joined_post",
foreign_key: "post_id",
association_foreign_key: "user_id"
# acts_as_taggable rescue nil
# HACK: there is a known issue that acts_as_taggable breaks asset precompilation on Heroku.
default_scope ->{ order('posts.created_at DESC') }
scope :categorized, ->(cat) { where(category_id: cat) if cat }
def to_s
title
end
end
| Fix for the a scope that removed previous filters... :( | Fix for the a scope that removed previous filters... :( | Ruby | agpl-3.0 | d2bit/timeoverflow,d2bit/timeoverflow,coopdevs/timeoverflow,jordipujollarena/timeoverflow,coopdevs/timeoverflow,jordipujollarena/timeoverflow,coopdevs/timeoverflow,d2bit/timeoverflow,jordipujollarena/timeoverflow,coopdevs/timeoverflow,d2bit/timeoverflow | ruby | ## Code Before:
class Post < ActiveRecord::Base
include Taggable
belongs_to :category
belongs_to :user
has_and_belongs_to_many :joined_users,
class_name: "User",
join_table: "user_joined_post",
foreign_key: "post_id",
association_foreign_key: "user_id"
# acts_as_taggable rescue nil
# HACK: there is a known issue that acts_as_taggable breaks asset precompilation on Heroku.
default_scope ->{ order('posts.created_at DESC') }
def self.categorized(cat=nil)
cat ? where(category_id: cat) : self
end
def to_s
title
end
end
## Instruction:
Fix for a scope that removed previous filters... :(
## Code After:
class Post < ActiveRecord::Base
include Taggable
belongs_to :category
belongs_to :user
has_and_belongs_to_many :joined_users,
class_name: "User",
join_table: "user_joined_post",
foreign_key: "post_id",
association_foreign_key: "user_id"
# acts_as_taggable rescue nil
# HACK: there is a known issue that acts_as_taggable breaks asset precompilation on Heroku.
default_scope ->{ order('posts.created_at DESC') }
scope :categorized, ->(cat) { where(category_id: cat) if cat }
def to_s
title
end
end
| class Post < ActiveRecord::Base
include Taggable
belongs_to :category
belongs_to :user
has_and_belongs_to_many :joined_users,
class_name: "User",
join_table: "user_joined_post",
foreign_key: "post_id",
association_foreign_key: "user_id"
# acts_as_taggable rescue nil
# HACK: there is a known issue that acts_as_taggable breaks asset precompilation on Heroku.
default_scope ->{ order('posts.created_at DESC') }
+ scope :categorized, ->(cat) { where(category_id: cat) if cat }
- def self.categorized(cat=nil)
- cat ? where(category_id: cat) : self
- end
def to_s
title
end
end | 4 | 0.16 | 1 | 3 |
113c176ed098a26ce52e36e7c5fc3ce4b08298b4 | .travis.yml | .travis.yml | sudo: false
language: python
python:
- "2.7"
- "3.3"
- "3.4"
- "3.5"
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/lib/python$TRAVIS_PYTHON_VERSION/site-packages
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/bin
install:
- env
- pip install -r requirements/test.txt
- pip install coveralls
- python setup.py bdist_egg
# command to run tests
script: nosetests
after_success:
- coveralls
| sudo: false
language: python
python:
- "2.7"
- "3.3"
- "3.4"
- "3.5"
matrix:
include:
- python: "2.7"
env: JYTHON=true
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/lib/python$TRAVIS_PYTHON_VERSION/site-packages
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/bin
before_install:
- export JYTHON_URL='http://search.maven.org/remotecontent?filepath=org/python/jython-installer/2.7.1b3/jython-installer-2.7.1b3.jar'
- if [ "$JYTHON" == "true" ]; then
wget $JYTHON_URL -O jython_installer.jar;
java -jar jython_installer.jar -s -d $HOME/jython;
export PATH=$HOME/jython/bin:$PATH;
$HOME/jython/bin/easy_install nose;
fi
install:
- env
- pip install -r requirements/test.txt
- pip install coveralls
- python setup.py bdist_egg
before_script:
- if [ "$JYTHON" == "true" ]; then
export NOSETESTS=$HOME/jython/bin/nosetests;
else
export NOSETESTS=nosetests;
fi
script: $NOSETESTS
after_success:
- coveralls
| Add Jython testing to Travis builds | Add Jython testing to Travis builds
| YAML | apache-2.0 | dls-controls/scanpointgenerator | yaml | ## Code Before:
sudo: false
language: python
python:
- "2.7"
- "3.3"
- "3.4"
- "3.5"
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/lib/python$TRAVIS_PYTHON_VERSION/site-packages
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/bin
install:
- env
- pip install -r requirements/test.txt
- pip install coveralls
- python setup.py bdist_egg
# command to run tests
script: nosetests
after_success:
- coveralls
## Instruction:
Add Jython testing to Travis builds
## Code After:
sudo: false
language: python
python:
- "2.7"
- "3.3"
- "3.4"
- "3.5"
matrix:
include:
- python: "2.7"
env: JYTHON=true
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/lib/python$TRAVIS_PYTHON_VERSION/site-packages
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/bin
before_install:
- export JYTHON_URL='http://search.maven.org/remotecontent?filepath=org/python/jython-installer/2.7.1b3/jython-installer-2.7.1b3.jar'
- if [ "$JYTHON" == "true" ]; then
wget $JYTHON_URL -O jython_installer.jar;
java -jar jython_installer.jar -s -d $HOME/jython;
export PATH=$HOME/jython/bin:$PATH;
$HOME/jython/bin/easy_install nose;
fi
install:
- env
- pip install -r requirements/test.txt
- pip install coveralls
- python setup.py bdist_egg
before_script:
- if [ "$JYTHON" == "true" ]; then
export NOSETESTS=$HOME/jython/bin/nosetests;
else
export NOSETESTS=nosetests;
fi
script: $NOSETESTS
after_success:
- coveralls
| sudo: false
language: python
python:
- "2.7"
- "3.3"
- "3.4"
- "3.5"
+ matrix:
+ include:
+ - python: "2.7"
+ env: JYTHON=true
+
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/lib/python$TRAVIS_PYTHON_VERSION/site-packages
- $HOME/virtualenv/python$TRAVIS_PYTHON_VERSION/bin
+ before_install:
+ - export JYTHON_URL='http://search.maven.org/remotecontent?filepath=org/python/jython-installer/2.7.1b3/jython-installer-2.7.1b3.jar'
+ - if [ "$JYTHON" == "true" ]; then
+ wget $JYTHON_URL -O jython_installer.jar;
+ java -jar jython_installer.jar -s -d $HOME/jython;
+ export PATH=$HOME/jython/bin:$PATH;
+ $HOME/jython/bin/easy_install nose;
+ fi
+
install:
- env
- pip install -r requirements/test.txt
- pip install coveralls
- python setup.py bdist_egg
-
- # command to run tests
- script: nosetests
+
+ before_script:
+ - if [ "$JYTHON" == "true" ]; then
+ export NOSETESTS=$HOME/jython/bin/nosetests;
+ else
+ export NOSETESTS=nosetests;
+ fi
+
+ script: $NOSETESTS
after_success:
- coveralls
| 26 | 0.962963 | 23 | 3 |
e5a677fd88d98eaf697f56e59b5322a45a284606 | content/en/post/_index.md | content/en/post/_index.md | ---
title: Blog
url: "/blog/"
top_graphic: 2
---
| ---
title: Blog
url: "/blog/"
top_graphic: 2
menu:
main:
weight: 100
parent: about
---
| Add blog to the main nav, as the last item of the about menu. | Add blog to the main nav, as the last item of the about menu.
| Markdown | mpl-2.0 | letsencrypt/website,letsencrypt/website,letsencrypt/website | markdown | ## Code Before:
---
title: Blog
url: "/blog/"
top_graphic: 2
---
## Instruction:
Add blog to the main nav, as the last item of the about menu.
## Code After:
---
title: Blog
url: "/blog/"
top_graphic: 2
menu:
main:
weight: 100
parent: about
---
| ---
title: Blog
url: "/blog/"
top_graphic: 2
+ menu:
+ main:
+ weight: 100
+ parent: about
--- | 4 | 0.8 | 4 | 0 |
25494622a88f172fb14abf10eb5936246d475066 | other/wrapping-cpp/swig/cpointerproblem/test_examples.py | other/wrapping-cpp/swig/cpointerproblem/test_examples.py |
import os
import pytest
#print("pwd:")
#os.system('pwd')
#import subprocess
#subprocess.check_output('pwd')
os.system('make all')
import example1
def test_f():
assert example1.f(1) - 1 <= 10 ** -7
def test_myfun():
"""Demonstrate that calling code with wrong object type results
in TypeError exception."""
with pytest.raises(TypeError):
assert example1.myfun(example1.f, 2.0) - 4.0 <= 10 ** -7
os.system('make alternate')
import example2
def test2_f():
assert example2.f(1) - 1 <= 10 ** -7
def test2_myfun():
assert example2.myfun(example2.f, 2.0) - 4.0 <= 10 ** -7
os.system('make clean')
|
import os
import pytest
# Need to call Makefile in directory where this test file is
def call_make(target):
# where is this file
this_file = os.path.realpath(__file__)
this_dir = os.path.split(this_file)[0]
cd_command = "cd {}".format(this_dir)
make_command = "make {}".format(target)
command = '{}; {}'.format(cd_command, make_command)
print("About to execute: '{}'".format(command))
os.system(command)
call_make('all')
import example1
def test_f():
assert example1.f(1) - 1 <= 10 ** -7
def test_myfun():
"""Demonstrate that calling code with wrong object type results
in TypeError exception."""
with pytest.raises(TypeError):
assert example1.myfun(example1.f, 2.0) - 4.0 <= 10 ** -7
call_make('alternate')
import example2
def test2_f():
assert example2.f(1) - 1 <= 10 ** -7
def test2_myfun():
assert example2.myfun(example2.f, 2.0) - 4.0 <= 10 ** -7
call_make('clean')
| Modify testing code to work if executed from above its own directory | Modify testing code to work if executed from above its own directory
| Python | bsd-2-clause | ryanpepper/oommf-python,ryanpepper/oommf-python,ryanpepper/oommf-python,fangohr/oommf-python,fangohr/oommf-python,fangohr/oommf-python,ryanpepper/oommf-python | python | ## Code Before:
import os
import pytest
#print("pwd:")
#os.system('pwd')
#import subprocess
#subprocess.check_output('pwd')
os.system('make all')
import example1
def test_f():
assert example1.f(1) - 1 <= 10 ** -7
def test_myfun():
"""Demonstrate that calling code with wrong object type results
in TypeError exception."""
with pytest.raises(TypeError):
assert example1.myfun(example1.f, 2.0) - 4.0 <= 10 ** -7
os.system('make alternate')
import example2
def test2_f():
assert example2.f(1) - 1 <= 10 ** -7
def test2_myfun():
assert example2.myfun(example2.f, 2.0) - 4.0 <= 10 ** -7
os.system('make clean')
## Instruction:
Modify testing code to work if executed from above its own directory
## Code After:
import os
import pytest
# Need to call Makefile in directory where this test file is
def call_make(target):
# where is this file
this_file = os.path.realpath(__file__)
this_dir = os.path.split(this_file)[0]
cd_command = "cd {}".format(this_dir)
make_command = "make {}".format(target)
command = '{}; {}'.format(cd_command, make_command)
print("About to execute: '{}'".format(command))
os.system(command)
call_make('all')
import example1
def test_f():
assert example1.f(1) - 1 <= 10 ** -7
def test_myfun():
"""Demonstrate that calling code with wrong object type results
in TypeError exception."""
with pytest.raises(TypeError):
assert example1.myfun(example1.f, 2.0) - 4.0 <= 10 ** -7
call_make('alternate')
import example2
def test2_f():
assert example2.f(1) - 1 <= 10 ** -7
def test2_myfun():
assert example2.myfun(example2.f, 2.0) - 4.0 <= 10 ** -7
call_make('clean')
|
import os
import pytest
- #print("pwd:")
- #os.system('pwd')
- #import subprocess
- #subprocess.check_output('pwd')
+
+ # Need to call Makefile in directory where this test file is
+ def call_make(target):
+ # where is this file
+ this_file = os.path.realpath(__file__)
+ this_dir = os.path.split(this_file)[0]
+ cd_command = "cd {}".format(this_dir)
+ make_command = "make {}".format(target)
+ command = '{}; {}'.format(cd_command, make_command)
+ print("About to execute: '{}'".format(command))
+ os.system(command)
- os.system('make all')
+ call_make('all')
import example1
def test_f():
assert example1.f(1) - 1 <= 10 ** -7
def test_myfun():
"""Demonstrate that calling code with wrong object type results
in TypeError exception."""
with pytest.raises(TypeError):
assert example1.myfun(example1.f, 2.0) - 4.0 <= 10 ** -7
- os.system('make alternate')
+ call_make('alternate')
import example2
def test2_f():
assert example2.f(1) - 1 <= 10 ** -7
def test2_myfun():
assert example2.myfun(example2.f, 2.0) - 4.0 <= 10 ** -7
- os.system('make clean')
+ call_make('clean') | 21 | 0.552632 | 14 | 7 |
f55f89cfa11c6ba2c8c6fdbec3d5c9d31dea7ac2 | libraries/_autoload.rb | libraries/_autoload.rb | ENV['GEM_PATH'] = ([File.expand_path('../../files/default/vendor', __FILE__)] + Gem.path).join(Gem.path_separator)
Gem.paths = ENV
gem 'docker-api', '~> 1.24'
$LOAD_PATH.unshift *Dir[File.expand_path('..', __FILE__)]
| $LOAD_PATH.push *Dir[File.expand_path('../../files/default/vendor/gems/**/lib', __FILE__)]
$LOAD_PATH.unshift *Dir[File.expand_path('..', __FILE__)]
| Revert "Allow distributed gems to be activated normally via Ruby" | Revert "Allow distributed gems to be activated normally via Ruby"
This reverts commit af2a7a194110672678c11252666d3dd081fe81a8.
| Ruby | apache-2.0 | chef-cookbooks/docker,chef-cookbooks/docker,fxposter/chef-docker,fxposter/chef-docker,chef-cookbooks/docker,fxposter/chef-docker | ruby | ## Code Before:
ENV['GEM_PATH'] = ([File.expand_path('../../files/default/vendor', __FILE__)] + Gem.path).join(Gem.path_separator)
Gem.paths = ENV
gem 'docker-api', '~> 1.24'
$LOAD_PATH.unshift *Dir[File.expand_path('..', __FILE__)]
## Instruction:
Revert "Allow distributed gems to be activated normally via Ruby"
This reverts commit af2a7a194110672678c11252666d3dd081fe81a8.
## Code After:
$LOAD_PATH.push *Dir[File.expand_path('../../files/default/vendor/gems/**/lib', __FILE__)]
$LOAD_PATH.unshift *Dir[File.expand_path('..', __FILE__)]
| + $LOAD_PATH.push *Dir[File.expand_path('../../files/default/vendor/gems/**/lib', __FILE__)]
+ $LOAD_PATH.unshift *Dir[File.expand_path('..', __FILE__)]
- ENV['GEM_PATH'] = ([File.expand_path('../../files/default/vendor', __FILE__)] + Gem.path).join(Gem.path_separator)
- Gem.paths = ENV
- gem 'docker-api', '~> 1.24'
- $LOAD_PATH.unshift *Dir[File.expand_path('..', __FILE__)] | 6 | 1.2 | 2 | 4 |
5eb0e36faf7f0682ca3aa35f9bf952a61f217383 | src/components/fields/DocLabel.jsx | src/components/fields/DocLabel.jsx | import React from 'react'
import input from '../../config/input.js'
import colors from '../../config/colors.js'
import { margins, fontSizes } from '../../config/scales.js'
export default class DocLabel extends React.Component {
static propTypes = {
label: React.PropTypes.string.isRequired,
doc: React.PropTypes.string.isRequired,
style: React.PropTypes.object,
}
constructor(props) {
super(props)
this.state = { showDoc: false }
}
render() {
return <label
style={{
...input.label,
...this.props.style,
position: 'relative',
}}
>
<span
onMouseOver={e => this.setState({showDoc: true})}
onMouseOut={e => this.setState({showDoc: false})}
style={{
cursor: 'help',
}}
>
{this.props.label}
</span>
<div style={{
backgroundColor: colors.gray,
padding: margins[1],
position: 'absolute',
top: 20,
left: 0,
width: 100,
display: this.state.showDoc ? null : 'none',
zIndex: 3,
}}>
{this.props.doc}
</div>
</label>
}
}
| import React from 'react'
import input from '../../config/input.js'
import colors from '../../config/colors.js'
import { margins, fontSizes } from '../../config/scales.js'
export default class DocLabel extends React.Component {
static propTypes = {
label: React.PropTypes.string.isRequired,
doc: React.PropTypes.string.isRequired,
style: React.PropTypes.object,
}
constructor(props) {
super(props)
this.state = { showDoc: false }
}
render() {
return <label
style={{
...input.label,
...this.props.style,
position: 'relative',
}}
>
<span
onMouseOver={e => this.setState({showDoc: true})}
onMouseOut={e => this.setState({showDoc: false})}
style={{
cursor: 'help',
}}
>
{this.props.label}
</span>
<div style={{
backgroundColor: colors.gray,
padding: margins[1],
fontSize: 10,
position: 'absolute',
top: 20,
left: 0,
width: 120,
display: this.state.showDoc ? null : 'none',
zIndex: 3,
}}>
{this.props.doc}
</div>
</label>
}
}
| Decrease doc label font size | Decrease doc label font size
| JSX | mit | maputnik/editor,maputnik/editor | jsx | ## Code Before:
import React from 'react'
import input from '../../config/input.js'
import colors from '../../config/colors.js'
import { margins, fontSizes } from '../../config/scales.js'
export default class DocLabel extends React.Component {
static propTypes = {
label: React.PropTypes.string.isRequired,
doc: React.PropTypes.string.isRequired,
style: React.PropTypes.object,
}
constructor(props) {
super(props)
this.state = { showDoc: false }
}
render() {
return <label
style={{
...input.label,
...this.props.style,
position: 'relative',
}}
>
<span
onMouseOver={e => this.setState({showDoc: true})}
onMouseOut={e => this.setState({showDoc: false})}
style={{
cursor: 'help',
}}
>
{this.props.label}
</span>
<div style={{
backgroundColor: colors.gray,
padding: margins[1],
position: 'absolute',
top: 20,
left: 0,
width: 100,
display: this.state.showDoc ? null : 'none',
zIndex: 3,
}}>
{this.props.doc}
</div>
</label>
}
}
## Instruction:
Decrease doc label font size
## Code After:
import React from 'react'
import input from '../../config/input.js'
import colors from '../../config/colors.js'
import { margins, fontSizes } from '../../config/scales.js'
export default class DocLabel extends React.Component {
static propTypes = {
label: React.PropTypes.string.isRequired,
doc: React.PropTypes.string.isRequired,
style: React.PropTypes.object,
}
constructor(props) {
super(props)
this.state = { showDoc: false }
}
render() {
return <label
style={{
...input.label,
...this.props.style,
position: 'relative',
}}
>
<span
onMouseOver={e => this.setState({showDoc: true})}
onMouseOut={e => this.setState({showDoc: false})}
style={{
cursor: 'help',
}}
>
{this.props.label}
</span>
<div style={{
backgroundColor: colors.gray,
padding: margins[1],
fontSize: 10,
position: 'absolute',
top: 20,
left: 0,
width: 120,
display: this.state.showDoc ? null : 'none',
zIndex: 3,
}}>
{this.props.doc}
</div>
</label>
}
}
| import React from 'react'
import input from '../../config/input.js'
import colors from '../../config/colors.js'
import { margins, fontSizes } from '../../config/scales.js'
export default class DocLabel extends React.Component {
static propTypes = {
label: React.PropTypes.string.isRequired,
doc: React.PropTypes.string.isRequired,
style: React.PropTypes.object,
}
constructor(props) {
super(props)
this.state = { showDoc: false }
}
render() {
return <label
style={{
...input.label,
...this.props.style,
position: 'relative',
}}
>
<span
onMouseOver={e => this.setState({showDoc: true})}
onMouseOut={e => this.setState({showDoc: false})}
style={{
cursor: 'help',
}}
>
{this.props.label}
</span>
<div style={{
backgroundColor: colors.gray,
padding: margins[1],
+ fontSize: 10,
position: 'absolute',
top: 20,
left: 0,
- width: 100,
? ^
+ width: 120,
? ^
display: this.state.showDoc ? null : 'none',
zIndex: 3,
}}>
{this.props.doc}
</div>
</label>
}
} | 3 | 0.061224 | 2 | 1 |
8154a848fd1192378ab03d32b7fd4084bdc74a62 | requirements.txt | requirements.txt | Flask==0.10.1
Flask-Bootstrap==3.3.0.1
Flask-Script==2.0.5
requests==2.5.1
| Flask==0.10.1
Flask-Bootstrap==3.3.0.1
Flask-Script==2.0.5
requests==2.5.1
# Required for SNI to work in requests
pyOpenSSL==0.14
ndg-httpsclient==0.3.3
pyasn1==0.1.7
| Add dependencies required for SNI to work with requests | Add dependencies required for SNI to work with requests
| Text | mit | mtekel/digitalmarketplace-admin-frontend,alphagov/digitalmarketplace-admin-frontend,mtekel/digitalmarketplace-admin-frontend,mtekel/digitalmarketplace-admin-frontend,mtekel/digitalmarketplace-admin-frontend,alphagov/digitalmarketplace-admin-frontend,alphagov/digitalmarketplace-admin-frontend,alphagov/digitalmarketplace-admin-frontend | text | ## Code Before:
Flask==0.10.1
Flask-Bootstrap==3.3.0.1
Flask-Script==2.0.5
requests==2.5.1
## Instruction:
Add dependencies required for SNI to work with requests
## Code After:
Flask==0.10.1
Flask-Bootstrap==3.3.0.1
Flask-Script==2.0.5
requests==2.5.1
# Required for SNI to work in requests
pyOpenSSL==0.14
ndg-httpsclient==0.3.3
pyasn1==0.1.7
| Flask==0.10.1
Flask-Bootstrap==3.3.0.1
Flask-Script==2.0.5
requests==2.5.1
+
+ # Required for SNI to work in requests
+ pyOpenSSL==0.14
+ ndg-httpsclient==0.3.3
+ pyasn1==0.1.7 | 5 | 1.25 | 5 | 0 |
419d30f74993f817552a46caf87085e46199ef1b | spec/unit/recipes/global_spec.rb | spec/unit/recipes/global_spec.rb | require "spec_helper"
describe "logrotate::global" do
let(:chef_run) { ChefSpec::SoloRunner.new.converge(described_recipe) }
it "includes the default recipe" do
expect(chef_run).to include_recipe("logrotate::default")
end
it "writes the configuration template" do
template = chef_run.template("/etc/logrotate.conf")
expect(template).to be
expect(template.source).to eq("logrotate-global.erb")
expect(template.mode).to eq("0644")
end
end
| require "spec_helper"
describe "logrotate::global" do
let(:chef_run) { ChefSpec::SoloRunner.new.converge(described_recipe) }
it "includes the default recipe" do
expect(chef_run).to include_recipe("logrotate::default")
end
it "writes the configuration template" do
template = chef_run.template("/etc/logrotate.conf")
expect(template).to be
expect(template.source).to eq("logrotate-global.erb")
expect(template.mode).to eq("0644")
end
shared_examples "script in global context" do
it "puts the script in the configuration file" do
expect(chef_run).to render_file("/etc/logrotate.conf").with_content(content_regexp)
end
end
%w{postrotate prerotate firstaction lastaction}.each do |script_type|
context "when a #{script_type} script is present in the global attribute" do
let(:script) { "/usr/bin/test_#{script_type}_script" }
let(:chef_run) do
ChefSpec::SoloRunner.new do |node|
node.override["logrotate"]["global"][script_type] = script
end.converge(described_recipe)
end
let(:content_regexp) { /#{script_type}\n#{script}\nendscript/ }
it_behaves_like "script in global context"
end
end
end
| Add unit test for the global scripts feature | Add unit test for the global scripts feature | Ruby | apache-2.0 | stevendanna/logrotate,stevendanna/logrotate | ruby | ## Code Before:
require "spec_helper"
describe "logrotate::global" do
let(:chef_run) { ChefSpec::SoloRunner.new.converge(described_recipe) }
it "includes the default recipe" do
expect(chef_run).to include_recipe("logrotate::default")
end
it "writes the configuration template" do
template = chef_run.template("/etc/logrotate.conf")
expect(template).to be
expect(template.source).to eq("logrotate-global.erb")
expect(template.mode).to eq("0644")
end
end
## Instruction:
Add unit test for the global scripts feature
## Code After:
require "spec_helper"
describe "logrotate::global" do
let(:chef_run) { ChefSpec::SoloRunner.new.converge(described_recipe) }
it "includes the default recipe" do
expect(chef_run).to include_recipe("logrotate::default")
end
it "writes the configuration template" do
template = chef_run.template("/etc/logrotate.conf")
expect(template).to be
expect(template.source).to eq("logrotate-global.erb")
expect(template.mode).to eq("0644")
end
shared_examples "script in global context" do
it "puts the script in the configuration file" do
expect(chef_run).to render_file("/etc/logrotate.conf").with_content(content_regexp)
end
end
%w{postrotate prerotate firstaction lastaction}.each do |script_type|
context "when a #{script_type} script is present in the global attribute" do
let(:script) { "/usr/bin/test_#{script_type}_script" }
let(:chef_run) do
ChefSpec::SoloRunner.new do |node|
node.override["logrotate"]["global"][script_type] = script
end.converge(described_recipe)
end
let(:content_regexp) { /#{script_type}\n#{script}\nendscript/ }
it_behaves_like "script in global context"
end
end
end
| require "spec_helper"
describe "logrotate::global" do
let(:chef_run) { ChefSpec::SoloRunner.new.converge(described_recipe) }
it "includes the default recipe" do
expect(chef_run).to include_recipe("logrotate::default")
end
it "writes the configuration template" do
template = chef_run.template("/etc/logrotate.conf")
expect(template).to be
expect(template.source).to eq("logrotate-global.erb")
expect(template.mode).to eq("0644")
end
+
+ shared_examples "script in global context" do
+ it "puts the script in the configuration file" do
+ expect(chef_run).to render_file("/etc/logrotate.conf").with_content(content_regexp)
+ end
+ end
+
+ %w{postrotate prerotate firstaction lastaction}.each do |script_type|
+ context "when a #{script_type} script is present in the global attribute" do
+ let(:script) { "/usr/bin/test_#{script_type}_script" }
+ let(:chef_run) do
+ ChefSpec::SoloRunner.new do |node|
+ node.override["logrotate"]["global"][script_type] = script
+ end.converge(described_recipe)
+ end
+ let(:content_regexp) { /#{script_type}\n#{script}\nendscript/ }
+
+ it_behaves_like "script in global context"
+ end
+ end
end | 20 | 1.25 | 20 | 0 |
2ab47ed05ef419a2ea0deece62d8d55bf3f3c8d6 | spec/puppet-lint/plugins/check_strings/puppet_url_without_modules_spec.rb | spec/puppet-lint/plugins/check_strings/puppet_url_without_modules_spec.rb | require 'spec_helper'
describe 'puppet_url_without_modules' do
let(:msg) { 'puppet:// URL without modules/ found' }
context 'puppet:// url with modules' do
let(:code) { "'puppet:///modules/foo'" }
it 'should not detect any problems' do
expect(problems).to have(0).problems
end
end
context 'puppet:// url without modules' do
let(:code) { "'puppet:///foo'" }
it 'should only detect a single problem' do
expect(problems).to have(1).problem
end
it 'should create a warning' do
expect(problems).to contain_warning(msg).on_line(1).in_column(1)
end
end
context 'double string wrapped puppet:// urls' do
let(:code) { File.read('spec/fixtures/test/manifests/url_interpolation.pp') }
it 'should only detect a single problem' do
expect(problems).to have(4).problem
end
end
end
| require 'spec_helper'
describe 'puppet_url_without_modules' do
let(:msg) { 'puppet:// URL without modules/ found' }
context 'puppet:// url with modules' do
let(:code) { "'puppet:///modules/foo'" }
it 'should not detect any problems' do
expect(problems).to have(0).problems
end
end
context 'puppet:// url without modules' do
let(:code) { "'puppet:///foo'" }
it 'should only detect a single problem' do
expect(problems).to have(1).problem
end
it 'should create a warning' do
expect(problems).to contain_warning(msg).on_line(1).in_column(1)
end
end
context 'double string wrapped puppet:// urls' do
let(:code) { File.read('spec/fixtures/test/manifests/url_interpolation.pp') }
it 'should detect several problems' do
expect(problems).to have(4).problem
end
end
end
| Fix the spec name again | Fix the spec name again
| Ruby | mit | keeleysam/puppet-lint,rodjek/puppet-lint,toby82/puppet-lint,danzilio/puppet-lint,rothsa/puppet-lint,Rovanion/puppet-lint,yulis/puppet-lint | ruby | ## Code Before:
require 'spec_helper'
describe 'puppet_url_without_modules' do
let(:msg) { 'puppet:// URL without modules/ found' }
context 'puppet:// url with modules' do
let(:code) { "'puppet:///modules/foo'" }
it 'should not detect any problems' do
expect(problems).to have(0).problems
end
end
context 'puppet:// url without modules' do
let(:code) { "'puppet:///foo'" }
it 'should only detect a single problem' do
expect(problems).to have(1).problem
end
it 'should create a warning' do
expect(problems).to contain_warning(msg).on_line(1).in_column(1)
end
end
context 'double string wrapped puppet:// urls' do
let(:code) { File.read('spec/fixtures/test/manifests/url_interpolation.pp') }
it 'should only detect a single problem' do
expect(problems).to have(4).problem
end
end
end
## Instruction:
Fix the spec name again
## Code After:
require 'spec_helper'
describe 'puppet_url_without_modules' do
let(:msg) { 'puppet:// URL without modules/ found' }
context 'puppet:// url with modules' do
let(:code) { "'puppet:///modules/foo'" }
it 'should not detect any problems' do
expect(problems).to have(0).problems
end
end
context 'puppet:// url without modules' do
let(:code) { "'puppet:///foo'" }
it 'should only detect a single problem' do
expect(problems).to have(1).problem
end
it 'should create a warning' do
expect(problems).to contain_warning(msg).on_line(1).in_column(1)
end
end
context 'double string wrapped puppet:// urls' do
let(:code) { File.read('spec/fixtures/test/manifests/url_interpolation.pp') }
it 'should detect several problems' do
expect(problems).to have(4).problem
end
end
end
| require 'spec_helper'
describe 'puppet_url_without_modules' do
let(:msg) { 'puppet:// URL without modules/ found' }
context 'puppet:// url with modules' do
let(:code) { "'puppet:///modules/foo'" }
it 'should not detect any problems' do
expect(problems).to have(0).problems
end
end
context 'puppet:// url without modules' do
let(:code) { "'puppet:///foo'" }
it 'should only detect a single problem' do
expect(problems).to have(1).problem
end
it 'should create a warning' do
expect(problems).to contain_warning(msg).on_line(1).in_column(1)
end
end
context 'double string wrapped puppet:// urls' do
let(:code) { File.read('spec/fixtures/test/manifests/url_interpolation.pp') }
- it 'should only detect a single problem' do
? ----- ----- -
+ it 'should detect several problems' do
? +++++ +
expect(problems).to have(4).problem
end
end
end | 2 | 0.058824 | 1 | 1 |
072fbbe02b22ae9033cf66994dee89282c3a7048 | php/bsonrpc.php | php/bsonrpc.php | <?php
/*
NOTE: This module requires bson_encode() and bson_decode(),
which can be obtained by installing the MongoDB driver.
For example, on Debian/Ubuntu:
$ sudo apt-get install php5-mongo
*/
require_once('gorpc.php');
class BsonRpcClient extends GoRpcClient {
const LEN_PACK_FORMAT = 'V';
const LEN_PACK_SIZE = 4;
protected function send_request(GoRpcRequest $req) {
$this->write(bson_encode($req->header));
$this->write(bson_encode($req->body));
}
protected function read_response() {
// Read the header.
$data = $this->read_n(LEN_PACK_SIZE);
$len = unpack(LEN_PACK_FORMAT, $data)[0];
$header = $data . $this->read_n($len - LEN_PACK_SIZE);
// Read the body.
$data = $this->read_n(LEN_PACK_SIZE);
$len = unpack(LEN_PACK_FORMAT, $data)[0];
$body = $data . $this->read_n($len - LEN_PACK_SIZE);
// Decode and return.
return new GoRpcResponse(bson_decode($header), bson_decode($body));
}
}
| <?php
/*
NOTE: This module requires bson_encode() and bson_decode(),
which can be obtained by installing the MongoDB driver.
For example, on Debian/Ubuntu:
$ sudo apt-get install php5-mongo
*/
require_once('gorpc.php');
class BsonRpcClient extends GoRpcClient {
const LEN_PACK_FORMAT = 'V';
const LEN_PACK_SIZE = 4;
protected function send_request(GoRpcRequest $req) {
$this->write(bson_encode($req->header));
$this->write(bson_encode($req->body));
}
protected function read_response() {
// Read the header.
$data = $this->read_n(self::LEN_PACK_SIZE);
$len = unpack(self::LEN_PACK_FORMAT, $data)[0];
$header = $data . $this->read_n($len - self::LEN_PACK_SIZE);
// Read the body.
$data = $this->read_n(self::LEN_PACK_SIZE);
$len = unpack(self::LEN_PACK_FORMAT, $data)[0];
$body = $data . $this->read_n($len - self::LEN_PACK_SIZE);
// Decode and return.
return new GoRpcResponse(bson_decode($header), bson_decode($body));
}
}
| Fix class-level constants in PHP. | Fix class-level constants in PHP.
| PHP | apache-2.0 | AndyDiamondstein/vitess,nurblieh/vitess,skyportsystems/vitess,mattharden/vitess,guokeno0/vitess,kmiku7/vitess-annotated,AndyDiamondstein/vitess,mlc0202/vitess,cloudbearings/vitess,anusornc/vitess,mahak/vitess,pivanof/vitess,netroby/vitess,kuipertan/vitess,xgwubin/vitess,xgwubin/vitess,fengshao0907/vitess,aaijazi/vitess,erzel/vitess,fengshao0907/vitess,michael-berlin/vitess,fengshao0907/vitess,tjyang/vitess,mapbased/vitess,kuipertan/vitess,pivanof/vitess,vitessio/vitess,mlc0202/vitess,mlc0202/vitess,ptomasroos/vitess,mapbased/vitess,tirsen/vitess,tirsen/vitess,SDHM/vitess,tjyang/vitess,michael-berlin/vitess,skyportsystems/vitess,michael-berlin/vitess,aaijazi/vitess,cloudbearings/vitess,applift/vitess,vitessio/vitess,aaijazi/vitess,alainjobart/vitess,tirsen/vitess,cgvarela/vitess,yangzhongj/vitess,skyportsystems/vitess,yangzhongj/vitess,sougou/vitess,erzel/vitess,davygeek/vitess,AndyDiamondstein/vitess,atyenoria/vitess,AndyDiamondstein/vitess,tjyang/vitess,dcadevil/vitess,kmiku7/vitess-annotated,mahak/vitess,yangzhongj/vitess,kmiku7/vitess-annotated,erzel/vitess,tjyang/vitess,fengshao0907/vitess,tjyang/vitess,nurblieh/vitess,applift/vitess,AndyDiamondstein/vitess,michael-berlin/vitess,kuipertan/vitess,ptomasroos/vitess,erzel/vitess,guokeno0/vitess,tinyspeck/vitess,cgvarela/vitess,fengshao0907/vitess,guokeno0/vitess,dumbunny/vitess,ptomasroos/vitess,fengshao0907/vitess,yaoshengzhe/vitess,kmiku7/vitess-annotated,dcadevil/vitess,yangzhongj/vitess,enisoc/vitess,mattharden/vitess,alainjobart/vitess,atyenoria/vitess,tinyspeck/vitess,pivanof/vitess,sougou/vitess,kuipertan/vitess,fengshao0907/vitess,rnavarro/vitess,fengshao0907/vitess,SDHM/vitess,aaijazi/vitess,pivanof/vitess,tirsen/vitess,netroby/vitess,michael-berlin/vitess,SDHM/vitess,atyenoria/vitess,mlc0202/vitess,rnavarro/vitess,pivanof/vitess,HubSpot/vitess,erzel/vitess,anusornc/vitess,applift/vitess,dumbunny/vitess,davygeek/vitess,cgvarela/vitess,sougou/vitess,atyenoria/vitess,cloudbearings/vitess
,ptomasroos/vitess,erzel/vitess,tinyspeck/vitess,xgwubin/vitess,nurblieh/vitess,cloudbearings/vitess,AndyDiamondstein/vitess,tinyspeck/vitess,dumbunny/vitess,sougou/vitess,mahak/vitess,vitessio/vitess,erzel/vitess,yaoshengzhe/vitess,xgwubin/vitess,applift/vitess,ptomasroos/vitess,guokeno0/vitess,enisoc/vitess,sougou/vitess,aaijazi/vitess,mapbased/vitess,mapbased/vitess,enisoc/vitess,aaijazi/vitess,mattharden/vitess,sougou/vitess,kuipertan/vitess,guokeno0/vitess,applift/vitess,cgvarela/vitess,rnavarro/vitess,dcadevil/vitess,dcadevil/vitess,ptomasroos/vitess,HubSpot/vitess,anusornc/vitess,nurblieh/vitess,mattharden/vitess,yaoshengzhe/vitess,xgwubin/vitess,mapbased/vitess,nurblieh/vitess,applift/vitess,atyenoria/vitess,mlc0202/vitess,tjyang/vitess,nurblieh/vitess,tirsen/vitess,netroby/vitess,yaoshengzhe/vitess,davygeek/vitess,mahak/vitess,dumbunny/vitess,enisoc/vitess,dcadevil/vitess,davygeek/vitess,atyenoria/vitess,alainjobart/vitess,michael-berlin/vitess,mlc0202/vitess,skyportsystems/vitess,aaijazi/vitess,AndyDiamondstein/vitess,ptomasroos/vitess,skyportsystems/vitess,yaoshengzhe/vitess,alainjobart/vitess,kmiku7/vitess-annotated,vitessio/vitess,HubSpot/vitess,tinyspeck/vitess,yaoshengzhe/vitess,netroby/vitess,anusornc/vitess,michael-berlin/vitess,erzel/vitess,netroby/vitess,cgvarela/vitess,yaoshengzhe/vitess,cgvarela/vitess,michael-berlin/vitess,pivanof/vitess,enisoc/vitess,mahak/vitess,sougou/vitess,mlc0202/vitess,skyportsystems/vitess,cgvarela/vitess,applift/vitess,skyportsystems/vitess,alainjobart/vitess,xgwubin/vitess,anusornc/vitess,applift/vitess,anusornc/vitess,mahak/vitess,kuipertan/vitess,SDHM/vitess,mattharden/vitess,yaoshengzhe/vitess,SDHM/vitess,mapbased/vitess,erzel/vitess,atyenoria/vitess,mattharden/vitess,atyenoria/vitess,nurblieh/vitess,mattharden/vitess,tinyspeck/vitess,xgwubin/vitess,enisoc/vitess,davygeek/vitess,ptomasroos/vitess,HubSpot/vitess,tinyspeck/vitess,mlc0202/vitess,SDHM/vitess,kmiku7/vitess-annotated,dumbunny/vitess,tjyang/vitess,rnavarr
o/vitess,mattharden/vitess,netroby/vitess,HubSpot/vitess,dumbunny/vitess,dumbunny/vitess,AndyDiamondstein/vitess,anusornc/vitess,mattharden/vitess,cloudbearings/vitess,mapbased/vitess,dumbunny/vitess,SDHM/vitess,nurblieh/vitess,dcadevil/vitess,rnavarro/vitess,rnavarro/vitess,vitessio/vitess,ptomasroos/vitess,fengshao0907/vitess,mapbased/vitess,netroby/vitess,yangzhongj/vitess,mahak/vitess,alainjobart/vitess,enisoc/vitess,tirsen/vitess,vitessio/vitess,kuipertan/vitess,kmiku7/vitess-annotated,yaoshengzhe/vitess,kmiku7/vitess-annotated,tirsen/vitess,tirsen/vitess,aaijazi/vitess,SDHM/vitess,HubSpot/vitess,xgwubin/vitess,mapbased/vitess,HubSpot/vitess,applift/vitess,cgvarela/vitess,skyportsystems/vitess,yangzhongj/vitess,anusornc/vitess,yangzhongj/vitess,dumbunny/vitess,HubSpot/vitess,tirsen/vitess,cloudbearings/vitess,sougou/vitess,anusornc/vitess,kuipertan/vitess,mlc0202/vitess,netroby/vitess,guokeno0/vitess,netroby/vitess,mahak/vitess,aaijazi/vitess,cloudbearings/vitess,alainjobart/vitess,cloudbearings/vitess,kuipertan/vitess,kmiku7/vitess-annotated,yangzhongj/vitess,pivanof/vitess,guokeno0/vitess,tjyang/vitess,atyenoria/vitess,rnavarro/vitess,michael-berlin/vitess,alainjobart/vitess,davygeek/vitess,xgwubin/vitess,vitessio/vitess,pivanof/vitess,pivanof/vitess,davygeek/vitess,skyportsystems/vitess,tjyang/vitess,SDHM/vitess,rnavarro/vitess,dumbunny/vitess,cgvarela/vitess,guokeno0/vitess,dcadevil/vitess,applift/vitess,rnavarro/vitess,cloudbearings/vitess,yangzhongj/vitess,mapbased/vitess,mattharden/vitess,guokeno0/vitess,AndyDiamondstein/vitess,erzel/vitess,nurblieh/vitess,vitessio/vitess | php | ## Code Before:
<?php
/*
NOTE: This module requires bson_encode() and bson_decode(),
which can be obtained by installing the MongoDB driver.
For example, on Debian/Ubuntu:
$ sudo apt-get install php5-mongo
*/
require_once('gorpc.php');
class BsonRpcClient extends GoRpcClient {
const LEN_PACK_FORMAT = 'V';
const LEN_PACK_SIZE = 4;
protected function send_request(GoRpcRequest $req) {
$this->write(bson_encode($req->header));
$this->write(bson_encode($req->body));
}
protected function read_response() {
// Read the header.
$data = $this->read_n(LEN_PACK_SIZE);
$len = unpack(LEN_PACK_FORMAT, $data)[0];
$header = $data . $this->read_n($len - LEN_PACK_SIZE);
// Read the body.
$data = $this->read_n(LEN_PACK_SIZE);
$len = unpack(LEN_PACK_FORMAT, $data)[0];
$body = $data . $this->read_n($len - LEN_PACK_SIZE);
// Decode and return.
return new GoRpcResponse(bson_decode($header), bson_decode($body));
}
}
## Instruction:
Fix class-level constants in PHP.
## Code After:
<?php
/*
NOTE: This module requires bson_encode() and bson_decode(),
which can be obtained by installing the MongoDB driver.
For example, on Debian/Ubuntu:
$ sudo apt-get install php5-mongo
*/
require_once('gorpc.php');
class BsonRpcClient extends GoRpcClient {
const LEN_PACK_FORMAT = 'V';
const LEN_PACK_SIZE = 4;
protected function send_request(GoRpcRequest $req) {
$this->write(bson_encode($req->header));
$this->write(bson_encode($req->body));
}
protected function read_response() {
// Read the header.
$data = $this->read_n(self::LEN_PACK_SIZE);
$len = unpack(self::LEN_PACK_FORMAT, $data)[0];
$header = $data . $this->read_n($len - self::LEN_PACK_SIZE);
// Read the body.
$data = $this->read_n(self::LEN_PACK_SIZE);
$len = unpack(self::LEN_PACK_FORMAT, $data)[0];
$body = $data . $this->read_n($len - self::LEN_PACK_SIZE);
// Decode and return.
return new GoRpcResponse(bson_decode($header), bson_decode($body));
}
}
| <?php
/*
NOTE: This module requires bson_encode() and bson_decode(),
which can be obtained by installing the MongoDB driver.
For example, on Debian/Ubuntu:
$ sudo apt-get install php5-mongo
*/
require_once('gorpc.php');
class BsonRpcClient extends GoRpcClient {
const LEN_PACK_FORMAT = 'V';
const LEN_PACK_SIZE = 4;
protected function send_request(GoRpcRequest $req) {
$this->write(bson_encode($req->header));
$this->write(bson_encode($req->body));
}
protected function read_response() {
// Read the header.
- $data = $this->read_n(LEN_PACK_SIZE);
+ $data = $this->read_n(self::LEN_PACK_SIZE);
? ++++++
- $len = unpack(LEN_PACK_FORMAT, $data)[0];
+ $len = unpack(self::LEN_PACK_FORMAT, $data)[0];
? ++++++
- $header = $data . $this->read_n($len - LEN_PACK_SIZE);
+ $header = $data . $this->read_n($len - self::LEN_PACK_SIZE);
? ++++++
// Read the body.
- $data = $this->read_n(LEN_PACK_SIZE);
+ $data = $this->read_n(self::LEN_PACK_SIZE);
? ++++++
- $len = unpack(LEN_PACK_FORMAT, $data)[0];
+ $len = unpack(self::LEN_PACK_FORMAT, $data)[0];
? ++++++
- $body = $data . $this->read_n($len - LEN_PACK_SIZE);
+ $body = $data . $this->read_n($len - self::LEN_PACK_SIZE);
? ++++++
// Decode and return.
return new GoRpcResponse(bson_decode($header), bson_decode($body));
}
} | 12 | 0.342857 | 6 | 6 |
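The per-record diffs in this dump use `- `, `+ `, and `? ` guide lines, which appears to be the convention of Python's `difflib.ndiff` (the `?` rows mark changed character positions). A minimal sketch reproducing the markers of the PHP hunk above; the two sample lines are copied from that record:

```python
import difflib

old = ["        $data = $this->read_n(LEN_PACK_SIZE);"]
new = ["        $data = $this->read_n(self::LEN_PACK_SIZE);"]

# ndiff yields the removed line ("- "), the added line ("+ "), and a
# "? " guide line whose "+" characters sit under the inserted "self::".
diff = list(difflib.ndiff(old, new))
for line in diff:
    print(line, end="" if line.endswith("\n") else "\n")
```

The same three-line pattern (minus line, plus line, guide line) recurs in every hunk shown in these records.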
3fa7031aa1a47d53d384583cbc5364d9ea623ff4 | neovim/.config/nvim/ftplugin/gitcommit.vim | neovim/.config/nvim/ftplugin/gitcommit.vim | " Turn on spell checking.
setlocal spell
" Mapping to insert current git ticket into file.
inoremap <C-g><C-t> <C-r>=execute('!git ticket')<cr>
" Mapping to retrieve commit message after failed commit.
function! GetPrevCommit()
let l:git_toplevel = glob("`git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG")
:0read `git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG
endfunction
nnoremap <buffer> <silent> <LocalLeader>b :call GetPrevCommit()<CR>
" vim:fdm=marker ft=vim et sts=2 sw=2 ts=2
| " Turn on spell checking.
setlocal spell
" Mapping to insert current git ticket into file.
nnoremap <buffer> <silent> <LocalLeader>t :r !git ticket<cr>
" Mapping to retrieve commit message after failed commit.
function! GetPrevCommit()
let l:git_toplevel = glob("`git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG")
:0read `git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG
endfunction
nnoremap <buffer> <silent> <LocalLeader>b :call GetPrevCommit()<CR>
" vim:fdm=marker ft=vim et sts=2 sw=2 ts=2
| Add ticket mapping to git commit filetype | Add ticket mapping to git commit filetype
Convenience mapping to insert the ticket name/number into the git commit
message.
| VimL | mit | bronzehedwick/dotfiles,bronzehedwick/dotfiles,bronzehedwick/dotfiles,bronzehedwick/dotfiles | viml | ## Code Before:
" Turn on spell checking.
setlocal spell
" Mapping to insert current git ticket into file.
inoremap <C-g><C-t> <C-r>=execute('!git ticket')<cr>
" Mapping to retrieve commit message after failed commit.
function! GetPrevCommit()
let l:git_toplevel = glob("`git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG")
:0read `git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG
endfunction
nnoremap <buffer> <silent> <LocalLeader>b :call GetPrevCommit()<CR>
" vim:fdm=marker ft=vim et sts=2 sw=2 ts=2
## Instruction:
Add ticket mapping to git commit filetype
Convenience mapping to insert the ticket name/number into the git commit
message.
## Code After:
" Turn on spell checking.
setlocal spell
" Mapping to insert current git ticket into file.
nnoremap <buffer> <silent> <LocalLeader>t :r !git ticket<cr>
" Mapping to retrieve commit message after failed commit.
function! GetPrevCommit()
let l:git_toplevel = glob("`git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG")
:0read `git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG
endfunction
nnoremap <buffer> <silent> <LocalLeader>b :call GetPrevCommit()<CR>
" vim:fdm=marker ft=vim et sts=2 sw=2 ts=2
| " Turn on spell checking.
setlocal spell
" Mapping to insert current git ticket into file.
- inoremap <C-g><C-t> <C-r>=execute('!git ticket')<cr>
+ nnoremap <buffer> <silent> <LocalLeader>t :r !git ticket<cr>
" Mapping to retrieve commit message after failed commit.
function! GetPrevCommit()
let l:git_toplevel = glob("`git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG")
:0read `git rev-parse --show-toplevel`/.git/COMMIT_EDITMSG
endfunction
nnoremap <buffer> <silent> <LocalLeader>b :call GetPrevCommit()<CR>
" vim:fdm=marker ft=vim et sts=2 sw=2 ts=2 | 2 | 0.142857 | 1 | 1 |
c47692c5c283c5afd6245224dd686b6cd2f3c666 | .travis.yml | .travis.yml | language: python
sudo: false
cache: pip
python:
- '3.6'
env:
matrix:
- TOXENV=qa
- DJANGO=111
- DJANGO=20
- DJANGO=master
matrix:
fast_finish: true
allow_failures:
- env: DJANGO=master
install:
- pip install --upgrade pip tox
- pip install -U coveralls
before_script:
- |
if [[ -z $TOXENV ]]; then
export TOXENV=py$(echo $TRAVIS_PYTHON_VERSION | sed -e 's/\.//g')-dj$DJANGO
fi
- echo $TOXENV
script:
- tox -e $TOXENV
after_success:
- coveralls
deploy:
provider: pypi
user: codingjoe
password:
secure: jJyadofJm7F1Qco+EDCyN/aMZaYSbfQ0GAE02Bx7I499MkjPYvv38X2btg+PjdW3rzGD0d/kq24lfWLkgKncyQ/YMgLQ7H/GuCCHHYbKUklxllaoFXActBjstmKOvXyWWC5oEb+YEJ4HTwgkvS6wkp69B7C1d4BAOqGs5IKnCSo=
on:
tags: true
distributions: sdist bdist_wheel
repo: KristianOellegaard/django-health-check
branch: master
| language: python
sudo: false
cache: pip
python:
- '3.6'
env:
matrix:
- TOXENV=qa
- DJANGO=111
- DJANGO=20
- DJANGO=master
matrix:
fast_finish: true
allow_failures:
- env: DJANGO=master
install:
- pip install --upgrade codecov tox
before_script:
- |
if [[ -z $TOXENV ]]; then
export TOXENV=py$(echo $TRAVIS_PYTHON_VERSION | sed -e 's/\.//g')-dj$DJANGO
fi
- echo $TOXENV
script:
- tox -e $TOXENV
after_success:
- codecov
deploy:
provider: pypi
user: codingjoe
password:
secure: jJyadofJm7F1Qco+EDCyN/aMZaYSbfQ0GAE02Bx7I499MkjPYvv38X2btg+PjdW3rzGD0d/kq24lfWLkgKncyQ/YMgLQ7H/GuCCHHYbKUklxllaoFXActBjstmKOvXyWWC5oEb+YEJ4HTwgkvS6wkp69B7C1d4BAOqGs5IKnCSo=
on:
tags: true
distributions: sdist bdist_wheel
repo: KristianOellegaard/django-health-check
branch: master
| Switch to CodeCov for coverage tracking | Switch to CodeCov for coverage tracking
| YAML | mit | KristianOellegaard/django-health-check,KristianOellegaard/django-health-check | yaml | ## Code Before:
language: python
sudo: false
cache: pip
python:
- '3.6'
env:
matrix:
- TOXENV=qa
- DJANGO=111
- DJANGO=20
- DJANGO=master
matrix:
fast_finish: true
allow_failures:
- env: DJANGO=master
install:
- pip install --upgrade pip tox
- pip install -U coveralls
before_script:
- |
if [[ -z $TOXENV ]]; then
export TOXENV=py$(echo $TRAVIS_PYTHON_VERSION | sed -e 's/\.//g')-dj$DJANGO
fi
- echo $TOXENV
script:
- tox -e $TOXENV
after_success:
- coveralls
deploy:
provider: pypi
user: codingjoe
password:
secure: jJyadofJm7F1Qco+EDCyN/aMZaYSbfQ0GAE02Bx7I499MkjPYvv38X2btg+PjdW3rzGD0d/kq24lfWLkgKncyQ/YMgLQ7H/GuCCHHYbKUklxllaoFXActBjstmKOvXyWWC5oEb+YEJ4HTwgkvS6wkp69B7C1d4BAOqGs5IKnCSo=
on:
tags: true
distributions: sdist bdist_wheel
repo: KristianOellegaard/django-health-check
branch: master
## Instruction:
Switch to CodeCov for coverage tracking
## Code After:
language: python
sudo: false
cache: pip
python:
- '3.6'
env:
matrix:
- TOXENV=qa
- DJANGO=111
- DJANGO=20
- DJANGO=master
matrix:
fast_finish: true
allow_failures:
- env: DJANGO=master
install:
- pip install --upgrade codecov tox
before_script:
- |
if [[ -z $TOXENV ]]; then
export TOXENV=py$(echo $TRAVIS_PYTHON_VERSION | sed -e 's/\.//g')-dj$DJANGO
fi
- echo $TOXENV
script:
- tox -e $TOXENV
after_success:
- codecov
deploy:
provider: pypi
user: codingjoe
password:
secure: jJyadofJm7F1Qco+EDCyN/aMZaYSbfQ0GAE02Bx7I499MkjPYvv38X2btg+PjdW3rzGD0d/kq24lfWLkgKncyQ/YMgLQ7H/GuCCHHYbKUklxllaoFXActBjstmKOvXyWWC5oEb+YEJ4HTwgkvS6wkp69B7C1d4BAOqGs5IKnCSo=
on:
tags: true
distributions: sdist bdist_wheel
repo: KristianOellegaard/django-health-check
branch: master
| language: python
sudo: false
cache: pip
python:
- '3.6'
env:
matrix:
- TOXENV=qa
- DJANGO=111
- DJANGO=20
- DJANGO=master
matrix:
fast_finish: true
allow_failures:
- env: DJANGO=master
install:
- - pip install --upgrade pip tox
? ^^^
+ - pip install --upgrade codecov tox
? ^^^^^^^
- - pip install -U coveralls
before_script:
- |
if [[ -z $TOXENV ]]; then
export TOXENV=py$(echo $TRAVIS_PYTHON_VERSION | sed -e 's/\.//g')-dj$DJANGO
fi
- echo $TOXENV
script:
- tox -e $TOXENV
after_success:
- - coveralls
+ - codecov
deploy:
provider: pypi
user: codingjoe
password:
secure: jJyadofJm7F1Qco+EDCyN/aMZaYSbfQ0GAE02Bx7I499MkjPYvv38X2btg+PjdW3rzGD0d/kq24lfWLkgKncyQ/YMgLQ7H/GuCCHHYbKUklxllaoFXActBjstmKOvXyWWC5oEb+YEJ4HTwgkvS6wkp69B7C1d4BAOqGs5IKnCSo=
on:
tags: true
distributions: sdist bdist_wheel
repo: KristianOellegaard/django-health-check
branch: master | 5 | 0.131579 | 2 | 3 |
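The `before_script` in this record builds `TOXENV` by deleting the dot from `$TRAVIS_PYTHON_VERSION` and appending the Django matrix value. A rough Python rendering of that shell substitution (the function name is illustrative, not part of the record):

```python
def toxenv(python_version: str, django: str) -> str:
    # Mirrors: TOXENV=py$(echo $TRAVIS_PYTHON_VERSION | sed -e 's/\.//g')-dj$DJANGO
    return "py" + python_version.replace(".", "") + "-dj" + django

print(toxenv("3.6", "111"))  # expected: py36-dj111
```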
036c9693a26d40f7a5ebebef8e27f2a98cc6a102 | data/apply-planet_osm_line.sql | data/apply-planet_osm_line.sql | DO $$
BEGIN
--------------------------------------------------------------------------------
-- planet_osm_line
--------------------------------------------------------------------------------
ALTER TABLE planet_osm_line
ADD COLUMN mz_id TEXT,
ADD COLUMN mz_road_level SMALLINT,
ADD COLUMN mz_road_sort_key FLOAT;
UPDATE planet_osm_line AS line SET
mz_id = mz_normalize_id(road.osm_id, road.way),
mz_road_level = road.mz_road_level,
mz_road_sort_key = (CASE WHEN road.mz_road_level IS NULL
THEN NULL
ELSE mz_calculate_road_sort_key(road.layer, road.bridge, road.tunnel, road.highway, road.railway)
END)
FROM (
SELECT osm_id, way, layer, bridge, tunnel, highway, railway, mz_calculate_road_level(highway, railway) AS mz_road_level
FROM planet_osm_line
) road
WHERE line.osm_id = road.osm_id;
PERFORM mz_create_partial_index_if_not_exists('planet_osm_line_mz_road_level_index', 'planet_osm_line', 'mz_road_level', 'mz_road_level IS NOT NULL');
END $$;
| DO $$
BEGIN
--------------------------------------------------------------------------------
-- planet_osm_line
--------------------------------------------------------------------------------
ALTER TABLE planet_osm_line
ADD COLUMN mz_id TEXT,
ADD COLUMN mz_road_level SMALLINT,
ADD COLUMN mz_road_sort_key FLOAT;
UPDATE planet_osm_line AS line SET
mz_id = mz_normalize_id(road.osm_id, road.way),
mz_road_level = road.mz_road_level,
mz_road_sort_key = (CASE WHEN road.mz_road_level IS NULL
THEN NULL
ELSE mz_calculate_road_sort_key(road.layer, road.bridge, road.tunnel, road.highway, road.railway)
END)
FROM (
SELECT osm_id, way, layer, bridge, tunnel, highway, railway, mz_calculate_road_level(highway, railway) AS mz_road_level
FROM planet_osm_line
) road
WHERE line.osm_id = road.osm_id;
PERFORM mz_create_partial_index_if_not_exists('planet_osm_line_mz_road_level_index', 'planet_osm_line', 'mz_road_level', 'mz_road_level IS NOT NULL');
PERFORM mz_create_partial_index_if_not_exists('planet_osm_line_waterway', 'planet_osm_line', 'waterway', 'waterway IS NOT NULL');
END $$;
| Add index on waterway lines | Add index on waterway lines
| SQL | mit | zoondka/vector-datasource,mapzen/vector-datasource,gronke/vector-datasource,mapzen/vector-datasource,adncentral/vector-datasource,mapzen/vector-datasource,adncentral/vector-datasource,kyroskoh/vector-datasource,zoondka/vector-datasource,kyroskoh/vector-datasource,adncentral/vector-datasource,gronke/vector-datasource,adncentral/vector-datasource | sql | ## Code Before:
DO $$
BEGIN
--------------------------------------------------------------------------------
-- planet_osm_line
--------------------------------------------------------------------------------
ALTER TABLE planet_osm_line
ADD COLUMN mz_id TEXT,
ADD COLUMN mz_road_level SMALLINT,
ADD COLUMN mz_road_sort_key FLOAT;
UPDATE planet_osm_line AS line SET
mz_id = mz_normalize_id(road.osm_id, road.way),
mz_road_level = road.mz_road_level,
mz_road_sort_key = (CASE WHEN road.mz_road_level IS NULL
THEN NULL
ELSE mz_calculate_road_sort_key(road.layer, road.bridge, road.tunnel, road.highway, road.railway)
END)
FROM (
SELECT osm_id, way, layer, bridge, tunnel, highway, railway, mz_calculate_road_level(highway, railway) AS mz_road_level
FROM planet_osm_line
) road
WHERE line.osm_id = road.osm_id;
PERFORM mz_create_partial_index_if_not_exists('planet_osm_line_mz_road_level_index', 'planet_osm_line', 'mz_road_level', 'mz_road_level IS NOT NULL');
END $$;
## Instruction:
Add index on waterway lines
## Code After:
DO $$
BEGIN
--------------------------------------------------------------------------------
-- planet_osm_line
--------------------------------------------------------------------------------
ALTER TABLE planet_osm_line
ADD COLUMN mz_id TEXT,
ADD COLUMN mz_road_level SMALLINT,
ADD COLUMN mz_road_sort_key FLOAT;
UPDATE planet_osm_line AS line SET
mz_id = mz_normalize_id(road.osm_id, road.way),
mz_road_level = road.mz_road_level,
mz_road_sort_key = (CASE WHEN road.mz_road_level IS NULL
THEN NULL
ELSE mz_calculate_road_sort_key(road.layer, road.bridge, road.tunnel, road.highway, road.railway)
END)
FROM (
SELECT osm_id, way, layer, bridge, tunnel, highway, railway, mz_calculate_road_level(highway, railway) AS mz_road_level
FROM planet_osm_line
) road
WHERE line.osm_id = road.osm_id;
PERFORM mz_create_partial_index_if_not_exists('planet_osm_line_mz_road_level_index', 'planet_osm_line', 'mz_road_level', 'mz_road_level IS NOT NULL');
PERFORM mz_create_partial_index_if_not_exists('planet_osm_line_waterway', 'planet_osm_line', 'waterway', 'waterway IS NOT NULL');
END $$;
| DO $$
BEGIN
--------------------------------------------------------------------------------
-- planet_osm_line
--------------------------------------------------------------------------------
ALTER TABLE planet_osm_line
ADD COLUMN mz_id TEXT,
ADD COLUMN mz_road_level SMALLINT,
ADD COLUMN mz_road_sort_key FLOAT;
UPDATE planet_osm_line AS line SET
mz_id = mz_normalize_id(road.osm_id, road.way),
mz_road_level = road.mz_road_level,
mz_road_sort_key = (CASE WHEN road.mz_road_level IS NULL
THEN NULL
ELSE mz_calculate_road_sort_key(road.layer, road.bridge, road.tunnel, road.highway, road.railway)
END)
FROM (
SELECT osm_id, way, layer, bridge, tunnel, highway, railway, mz_calculate_road_level(highway, railway) AS mz_road_level
FROM planet_osm_line
) road
WHERE line.osm_id = road.osm_id;
PERFORM mz_create_partial_index_if_not_exists('planet_osm_line_mz_road_level_index', 'planet_osm_line', 'mz_road_level', 'mz_road_level IS NOT NULL');
+ PERFORM mz_create_partial_index_if_not_exists('planet_osm_line_waterway', 'planet_osm_line', 'waterway', 'waterway IS NOT NULL');
END $$; | 1 | 0.033333 | 1 | 0 |
da28503eb702a7877e208da106f6388d7a92e98d | README.md | README.md | [](https://travis-ci.org/start/up)
Up (the easiest way to write structured content for the web)
============================================================
Up is a set of human-friendly conventions for writing structured content in plain text.
For software developers, Up is also a tool that converts Up documents into HTML.
I'm not accepting contributions at this time!
| [](https://travis-ci.org/start/up)
Up (a markup language)
======================
Up offers conventions for writing structured documents in plain text.
For more information, check out [writeup.io](https://writeup.io).
I'm not accepting contributions at this time!
| Add link to writeup.io in readme | Add link to writeup.io in readme
| Markdown | mit | start/up,start/up | markdown | ## Code Before:
[](https://travis-ci.org/start/up)
Up (the easiest way to write structured content for the web)
============================================================
Up is a set of human-friendly conventions for writing structured content in plain text.
For software developers, Up is also a tool that converts Up documents into HTML.
I'm not accepting contributions at this time!
## Instruction:
Add link to writeup.io in readme
## Code After:
[](https://travis-ci.org/start/up)
Up (a markup language)
======================
Up offers conventions for writing structured documents in plain text.
For more information, check out [writeup.io](https://writeup.io).
I'm not accepting contributions at this time!
| [](https://travis-ci.org/start/up)
- Up (the easiest way to write structured content for the web)
- ============================================================
+ Up (a markup language)
+ ======================
- Up is a set of human-friendly conventions for writing structured content in plain text.
? --------- ------- ^^^^^^ ^^^
+ Up offers conventions for writing structured documents in plain text.
? + ^ ++ ^^ +
+ For more information, check out [writeup.io](https://writeup.io).
- For software developers, Up is also a tool that converts Up documents into HTML.
-
I'm not accepting contributions at this time! | 9 | 0.818182 | 4 | 5 |
1d5cf6f63f2c72eee9862a35eace9ab58919bb36 | slaves/bin/meta-measured_setup.sh | slaves/bin/meta-measured_setup.sh |
SLAVE_ROOT=/var/lib/buildbot/slaves
MEASURED_BUILD=${SLAVE_ROOT}/meta-measured/build/
MEASURED_CONF=${MEASURED_BUILD}/conf/local.conf
MEASURED_LAYERS=${MEASURED_BUILD}/LAYERS
MEASURED_FETCH_CONF=${MEASURED_BUILD}/fetch.conf
# values we'll be setting in local.conf
DL_DIR="/mnt/openembedded/downloads/"
GIT_MIRROR="file:///var/lib/git"
if [ ! -f ${MEASURED_CONF} ]; then
echo "Missing config file for meta-measured. Halting."
exit 1
fi
# set DL_DIR
sed -i "s&^\([[:space:]]*DL_DIR[[:space:]]*\)\(\?=\|\+=\|=\+\|=\).*$&\1\2 \"${DL_DIR}\"&" ${MEASURED_CONF}
if [ $? -ne 0 ]; then
exit $?
fi
echo "GIT_MIRROR=\"${GIT_MIRROR}\"" > ${MEASURED_FETCH_CONF}
if [ $? -ne 0 ]; then
exit $?
fi
|
SLAVE_ROOT=/var/lib/buildbot/slaves
MEASURED_BUILD=${SLAVE_ROOT}/meta-measured/build/
MEASURED_AUTO_CONF=${MEASURED_BUILD}/conf/auto.conf
MEASURED_FETCH_CONF=${MEASURED_BUILD}/fetch.conf
# set values in bitbake auto builder config file
cat << EOL > ${MEASURED_AUTO_CONF}
DL_DIR ?= "/mnt/openembedded/downloads/"
EOL
# set value in fetch config file
cat << EOL > ${MEASURED_FETCH_CONF}
GIT_MIRROR="file:///var/lib/git"
EOL
| Clean up meta-measured setup script and use an auto.conf file. | Clean up meta-measured setup script and use an auto.conf file.
| Shell | unlicense | flihp/twobit-buildbot,flihp/twobit-buildbot | shell | ## Code Before:
SLAVE_ROOT=/var/lib/buildbot/slaves
MEASURED_BUILD=${SLAVE_ROOT}/meta-measured/build/
MEASURED_CONF=${MEASURED_BUILD}/conf/local.conf
MEASURED_LAYERS=${MEASURED_BUILD}/LAYERS
MEASURED_FETCH_CONF=${MEASURED_BUILD}/fetch.conf
# values we'll be setting in local.conf
DL_DIR="/mnt/openembedded/downloads/"
GIT_MIRROR="file:///var/lib/git"
if [ ! -f ${MEASURED_CONF} ]; then
echo "Missing config file for meta-measured. Halting."
exit 1
fi
# set DL_DIR
sed -i "s&^\([[:space:]]*DL_DIR[[:space:]]*\)\(\?=\|\+=\|=\+\|=\).*$&\1\2 \"${DL_DIR}\"&" ${MEASURED_CONF}
if [ $? -ne 0 ]; then
exit $?
fi
echo "GIT_MIRROR=\"${GIT_MIRROR}\"" > ${MEASURED_FETCH_CONF}
if [ $? -ne 0 ]; then
exit $?
fi
## Instruction:
Clean up meta-measured setup script and use an auto.conf file.
## Code After:
SLAVE_ROOT=/var/lib/buildbot/slaves
MEASURED_BUILD=${SLAVE_ROOT}/meta-measured/build/
MEASURED_AUTO_CONF=${MEASURED_BUILD}/conf/auto.conf
MEASURED_FETCH_CONF=${MEASURED_BUILD}/fetch.conf
# set values in bitbake auto builder config file
cat << EOL > ${MEASURED_AUTO_CONF}
DL_DIR ?= "/mnt/openembedded/downloads/"
EOL
# set value in fetch config file
cat << EOL > ${MEASURED_FETCH_CONF}
GIT_MIRROR="file:///var/lib/git"
EOL
|
SLAVE_ROOT=/var/lib/buildbot/slaves
MEASURED_BUILD=${SLAVE_ROOT}/meta-measured/build/
- MEASURED_CONF=${MEASURED_BUILD}/conf/local.conf
? ^ ---
+ MEASURED_AUTO_CONF=${MEASURED_BUILD}/conf/auto.conf
? +++++ ^^^
- MEASURED_LAYERS=${MEASURED_BUILD}/LAYERS
MEASURED_FETCH_CONF=${MEASURED_BUILD}/fetch.conf
- # values we'll be setting in local.conf
+ # set values in bitbake auto builder config file
+ cat << EOL > ${MEASURED_AUTO_CONF}
- DL_DIR="/mnt/openembedded/downloads/"
+ DL_DIR ?= "/mnt/openembedded/downloads/"
? ++ +
+ EOL
+
+ # set value in fetch config file
+ cat << EOL > ${MEASURED_FETCH_CONF}
GIT_MIRROR="file:///var/lib/git"
+ EOL
- if [ ! -f ${MEASURED_CONF} ]; then
- echo "Missing config file for meta-measured. Halting."
- exit 1
- fi
-
- # set DL_DIR
- sed -i "s&^\([[:space:]]*DL_DIR[[:space:]]*\)\(\?=\|\+=\|=\+\|=\).*$&\1\2 \"${DL_DIR}\"&" ${MEASURED_CONF}
- if [ $? -ne 0 ]; then
- exit $?
- fi
-
- echo "GIT_MIRROR=\"${GIT_MIRROR}\"" > ${MEASURED_FETCH_CONF}
- if [ $? -ne 0 ]; then
- exit $?
- fi
- | 29 | 1.074074 | 9 | 20 |
6d400878da6b8cbf7092bd57c5eefac935d6cff8 | src/main/groovy/org/geocommands/vector/LayerOutCommand.groovy | src/main/groovy/org/geocommands/vector/LayerOutCommand.groovy | package org.geocommands.vector
import geoscript.layer.Layer
import geoscript.workspace.Memory
import org.geocommands.Command
/**
* A Command base class for writing Layers.
* @author Jared Erickson
*/
abstract class LayerOutCommand<T extends LayerOutOptions> extends Command<T> {
abstract String getName()
abstract String getDescription()
abstract T getOptions()
abstract Layer createLayer(T options, Reader reader, Writer writer) throws Exception
void execute(T options, Reader reader, Writer writer) throws Exception {
Layer layer = createLayer(options, reader, writer)
try {
if (layer.workspace instanceof Memory) {
writer.write(new geoscript.layer.io.CsvWriter().write(layer))
}
}
finally {
layer.workspace.close()
}
}
protected String getOutputLayerName(T options, String defaultName) {
String outName = options.outputLayer ? options.outputLayer : defaultName
if (options.outputWorkspace && (options.outputWorkspace.endsWith(".shp") ||
options.outputWorkspace.endsWith(".properties") ||
options.outputWorkspace.endsWith(".pbf")
)) {
String fileName = new File(options.outputWorkspace).name
outName = fileName.substring(0, fileName.lastIndexOf("."))
}
outName
}
}
| package org.geocommands.vector
import geoscript.layer.Layer
import geoscript.workspace.Memory
import org.geocommands.Command
/**
* A Command base class for writing Layers.
* @author Jared Erickson
*/
abstract class LayerOutCommand<T extends LayerOutOptions> extends Command<T> {
abstract String getName()
abstract String getDescription()
abstract T getOptions()
abstract Layer createLayer(T options, Reader reader, Writer writer) throws Exception
void execute(T options, Reader reader, Writer writer) throws Exception {
Layer layer = createLayer(options, reader, writer)
try {
if (layer.workspace instanceof Memory) {
writer.write(new geoscript.layer.io.CsvWriter().write(layer))
}
}
finally {
layer.workspace.close()
}
}
protected String getOutputLayerName(T options, String defaultName) {
String outName = options.outputLayer
if (!outName) {
if (options.outputWorkspace && (options.outputWorkspace.endsWith(".shp") ||
options.outputWorkspace.endsWith(".properties") ||
options.outputWorkspace.endsWith(".pbf")
)) {
String fileName = new File(options.outputWorkspace).name
outName = fileName.substring(0, fileName.lastIndexOf("."))
} else {
outName = defaultName
}
}
outName
}
}
| Fix output layer default name calculation. | Fix output layer default name calculation.
| Groovy | mit | jericks/geoc,jericks/geoc | groovy | ## Code Before:
package org.geocommands.vector
import geoscript.layer.Layer
import geoscript.workspace.Memory
import org.geocommands.Command
/**
* A Command base class for writing Layers.
* @author Jared Erickson
*/
abstract class LayerOutCommand<T extends LayerOutOptions> extends Command<T> {
abstract String getName()
abstract String getDescription()
abstract T getOptions()
abstract Layer createLayer(T options, Reader reader, Writer writer) throws Exception
void execute(T options, Reader reader, Writer writer) throws Exception {
Layer layer = createLayer(options, reader, writer)
try {
if (layer.workspace instanceof Memory) {
writer.write(new geoscript.layer.io.CsvWriter().write(layer))
}
}
finally {
layer.workspace.close()
}
}
protected String getOutputLayerName(T options, String defaultName) {
String outName = options.outputLayer ? options.outputLayer : defaultName
if (options.outputWorkspace && (options.outputWorkspace.endsWith(".shp") ||
options.outputWorkspace.endsWith(".properties") ||
options.outputWorkspace.endsWith(".pbf")
)) {
String fileName = new File(options.outputWorkspace).name
outName = fileName.substring(0, fileName.lastIndexOf("."))
}
outName
}
}
## Instruction:
Fix output layer default name calculation.
## Code After:
package org.geocommands.vector
import geoscript.layer.Layer
import geoscript.workspace.Memory
import org.geocommands.Command
/**
* A Command base class for writing Layers.
* @author Jared Erickson
*/
abstract class LayerOutCommand<T extends LayerOutOptions> extends Command<T> {
abstract String getName()
abstract String getDescription()
abstract T getOptions()
abstract Layer createLayer(T options, Reader reader, Writer writer) throws Exception
void execute(T options, Reader reader, Writer writer) throws Exception {
Layer layer = createLayer(options, reader, writer)
try {
if (layer.workspace instanceof Memory) {
writer.write(new geoscript.layer.io.CsvWriter().write(layer))
}
}
finally {
layer.workspace.close()
}
}
protected String getOutputLayerName(T options, String defaultName) {
String outName = options.outputLayer
if (!outName) {
if (options.outputWorkspace && (options.outputWorkspace.endsWith(".shp") ||
options.outputWorkspace.endsWith(".properties") ||
options.outputWorkspace.endsWith(".pbf")
)) {
String fileName = new File(options.outputWorkspace).name
outName = fileName.substring(0, fileName.lastIndexOf("."))
} else {
outName = defaultName
}
}
outName
}
}
| package org.geocommands.vector
import geoscript.layer.Layer
import geoscript.workspace.Memory
import org.geocommands.Command
/**
* A Command base class for writing Layers.
* @author Jared Erickson
*/
abstract class LayerOutCommand<T extends LayerOutOptions> extends Command<T> {
abstract String getName()
abstract String getDescription()
abstract T getOptions()
abstract Layer createLayer(T options, Reader reader, Writer writer) throws Exception
void execute(T options, Reader reader, Writer writer) throws Exception {
Layer layer = createLayer(options, reader, writer)
try {
if (layer.workspace instanceof Memory) {
writer.write(new geoscript.layer.io.CsvWriter().write(layer))
}
}
finally {
layer.workspace.close()
}
}
protected String getOutputLayerName(T options, String defaultName) {
- String outName = options.outputLayer ? options.outputLayer : defaultName
+ String outName = options.outputLayer
+ if (!outName) {
- if (options.outputWorkspace && (options.outputWorkspace.endsWith(".shp") ||
+ if (options.outputWorkspace && (options.outputWorkspace.endsWith(".shp") ||
? ++++
- options.outputWorkspace.endsWith(".properties") ||
+ options.outputWorkspace.endsWith(".properties") ||
? ++++
- options.outputWorkspace.endsWith(".pbf")
+ options.outputWorkspace.endsWith(".pbf")
? ++++
- )) {
+ )) {
? ++++
- String fileName = new File(options.outputWorkspace).name
+ String fileName = new File(options.outputWorkspace).name
? ++++
- outName = fileName.substring(0, fileName.lastIndexOf("."))
+ outName = fileName.substring(0, fileName.lastIndexOf("."))
? ++++
+ } else {
+ outName = defaultName
+ }
}
outName
}
} | 18 | 0.409091 | 11 | 7 |
95518e94aeee9c952bf5884088e42de3f4a0c1ec | WaniKani/src/tr/xip/wanikani/settings/SettingsActivity.java | WaniKani/src/tr/xip/wanikani/settings/SettingsActivity.java | package tr.xip.wanikani.settings;
import android.os.Build;
import android.os.Bundle;
import android.preference.Preference;
import android.preference.PreferenceActivity;
import tr.xip.wanikani.R;
import tr.xip.wanikani.managers.PrefManager;
/**
* Created by xihsa_000 on 4/4/14.
*/
public class SettingsActivity extends PreferenceActivity {
PrefManager prefMan;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
addPreferencesFromResource(R.xml.preferences);
prefMan = new PrefManager(this);
Preference mApiKey = findPreference(PrefManager.PREF_API_KEY);
mApiKey.setSummary(prefMan.getApiKey());
}
@Override
public void onBackPressed() {
if (Build.VERSION.SDK_INT >= 16) {
super.onNavigateUp();
} else {
super.onBackPressed();
}
}
}
| package tr.xip.wanikani.settings;
import android.os.Build;
import android.os.Bundle;
import android.preference.Preference;
import android.preference.PreferenceActivity;
import tr.xip.wanikani.R;
import tr.xip.wanikani.managers.PrefManager;
/**
* Created by xihsa_000 on 4/4/14.
*/
public class SettingsActivity extends PreferenceActivity {
PrefManager prefMan;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
addPreferencesFromResource(R.xml.preferences);
prefMan = new PrefManager(this);
Preference mApiKey = findPreference(PrefManager.PREF_API_KEY);
String apiKey = prefMan.getApiKey();
String maskedApiKey = "************************";
for (int i = 25; i < apiKey.length(); i++) {
maskedApiKey += apiKey.charAt(i);
}
mApiKey.setSummary(maskedApiKey);
}
@Override
public void onBackPressed() {
if (Build.VERSION.SDK_INT >= 16) {
super.onNavigateUp();
} else {
super.onBackPressed();
}
}
}
| Mask the API key shown in settings | Mask the API key shown in settings
| Java | bsd-2-clause | gkathir15/WaniKani-for-Android,0359xiaodong/WaniKani-for-Android,0359xiaodong/WaniKani-for-Android,leerduo/WaniKani-for-Android,leerduo/WaniKani-for-Android,dhamofficial/WaniKani-for-Android,msdgwzhy6/WaniKani-for-Android,jiangzhonghui/WaniKani-for-Android,dhamofficial/WaniKani-for-Android,gkathir15/WaniKani-for-Android,Gustorn/WaniKani-for-Android,diptakobu/WaniKani-for-Android,msdgwzhy6/WaniKani-for-Android,Gustorn/WaniKani-for-Android,jiangzhonghui/WaniKani-for-Android,shekibobo/WaniKani-for-Android,shekibobo/WaniKani-for-Android,diptakobu/WaniKani-for-Android | java | ## Code Before:
package tr.xip.wanikani.settings;
import android.os.Build;
import android.os.Bundle;
import android.preference.Preference;
import android.preference.PreferenceActivity;
import tr.xip.wanikani.R;
import tr.xip.wanikani.managers.PrefManager;
/**
* Created by xihsa_000 on 4/4/14.
*/
public class SettingsActivity extends PreferenceActivity {
PrefManager prefMan;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
addPreferencesFromResource(R.xml.preferences);
prefMan = new PrefManager(this);
Preference mApiKey = findPreference(PrefManager.PREF_API_KEY);
mApiKey.setSummary(prefMan.getApiKey());
}
@Override
public void onBackPressed() {
if (Build.VERSION.SDK_INT >= 16) {
super.onNavigateUp();
} else {
super.onBackPressed();
}
}
}
## Instruction:
Mask the API key shown in settings
## Code After:
package tr.xip.wanikani.settings;
import android.os.Build;
import android.os.Bundle;
import android.preference.Preference;
import android.preference.PreferenceActivity;
import tr.xip.wanikani.R;
import tr.xip.wanikani.managers.PrefManager;
/**
* Created by xihsa_000 on 4/4/14.
*/
public class SettingsActivity extends PreferenceActivity {
PrefManager prefMan;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
addPreferencesFromResource(R.xml.preferences);
prefMan = new PrefManager(this);
Preference mApiKey = findPreference(PrefManager.PREF_API_KEY);
String apiKey = prefMan.getApiKey();
String maskedApiKey = "************************";
for (int i = 25; i < apiKey.length(); i++) {
maskedApiKey += apiKey.charAt(i);
}
mApiKey.setSummary(maskedApiKey);
}
@Override
public void onBackPressed() {
if (Build.VERSION.SDK_INT >= 16) {
super.onNavigateUp();
} else {
super.onBackPressed();
}
}
}
| package tr.xip.wanikani.settings;
import android.os.Build;
import android.os.Bundle;
import android.preference.Preference;
import android.preference.PreferenceActivity;
import tr.xip.wanikani.R;
import tr.xip.wanikani.managers.PrefManager;
/**
* Created by xihsa_000 on 4/4/14.
*/
public class SettingsActivity extends PreferenceActivity {
PrefManager prefMan;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
addPreferencesFromResource(R.xml.preferences);
prefMan = new PrefManager(this);
Preference mApiKey = findPreference(PrefManager.PREF_API_KEY);
+
+ String apiKey = prefMan.getApiKey();
+ String maskedApiKey = "************************";
+
+ for (int i = 25; i < apiKey.length(); i++) {
+ maskedApiKey += apiKey.charAt(i);
+ }
+
- mApiKey.setSummary(prefMan.getApiKey());
? ^^ ^^^^^^^^ --
+ mApiKey.setSummary(maskedApiKey);
? ^^^^ ^
}
@Override
public void onBackPressed() {
if (Build.VERSION.SDK_INT >= 16) {
super.onNavigateUp();
} else {
super.onBackPressed();
}
}
} | 10 | 0.27027 | 9 | 1 |
d7a2392fff7a6b43022a7be91253fe14f05206b0 | week-4/math/my_solution.rb | week-4/math/my_solution.rb |
def add(num_1, num_2)
num_1 + num_2
end
def subtract(num_1, num_2)
num_1 - num_2
end
def multiply(num_1, num_2)
num_1 * num_2
end
def divide(num_1, num_2)
num_1.to_f / num_2.to_f
end |
def add(num_1, num_2)
num_1 + num_2
end
def subtract(num_1, num_2)
num_1 - num_2
end
def multiply(num_1, num_2)
num_1 * num_2
end
def divide(num_1, num_2)
num_1.to_f / num_2.to_f
end
puts add(10, 2)
puts subtract(10, 2)
puts multiply(10, 2)
puts divide(10, 2) | Update my solution with output | Update my solution with output
| Ruby | mit | nataliecodes/phase-0,nataliecodes/phase-0,nataliecodes/phase-0 | ruby | ## Code Before:
def add(num_1, num_2)
num_1 + num_2
end
def subtract(num_1, num_2)
num_1 - num_2
end
def multiply(num_1, num_2)
num_1 * num_2
end
def divide(num_1, num_2)
num_1.to_f / num_2.to_f
end
## Instruction:
Update my solution with output
## Code After:
def add(num_1, num_2)
num_1 + num_2
end
def subtract(num_1, num_2)
num_1 - num_2
end
def multiply(num_1, num_2)
num_1 * num_2
end
def divide(num_1, num_2)
num_1.to_f / num_2.to_f
end
puts add(10, 2)
puts subtract(10, 2)
puts multiply(10, 2)
puts divide(10, 2) |
def add(num_1, num_2)
num_1 + num_2
end
def subtract(num_1, num_2)
num_1 - num_2
end
def multiply(num_1, num_2)
num_1 * num_2
end
def divide(num_1, num_2)
num_1.to_f / num_2.to_f
end
+
+ puts add(10, 2)
+ puts subtract(10, 2)
+ puts multiply(10, 2)
+ puts divide(10, 2) | 5 | 0.3125 | 5 | 0 |
97fd6a9e9febff06d782c4a3f757c5b21542a059 | features/support/env.rb | features/support/env.rb |
$: << File.expand_path(File.join(File.dirname(__FILE__), '..', '..', 'lib'))
require 'yajl'
require 'beanstalk-client'
#require 'flapjack/inifile'
#require 'flapjack/filters/ok'
#require 'flapjack/filters/any_parents_failed'
#require 'flapjack/notifier_engine'
#require 'flapjack/transports/result'
#require 'flapjack/transports/beanstalkd'
#require 'flapjack/patches'
#require 'flapjack/inifile'
#require 'flapjack/cli/worker_manager'
#require 'flapjack/cli/notifier_manager'
#require 'flapjack/cli/notifier'
#require 'flapjack/cli/worker'
#require 'flapjack/persistence/couch'
#require 'flapjack/persistence/sqlite3'
#require 'flapjack/applications/notifier'
#require 'flapjack/applications/worker'
#require 'flapjack/notifiers/mailer/init'
#require 'flapjack/notifiers/xmpp/init'
|
$: << File.expand_path(File.join(File.dirname(__FILE__), '..', '..', 'lib'))
require 'pathname'
require 'yajl'
require 'beanstalk-client'
#require 'flapjack/inifile'
#require 'flapjack/filters/ok'
#require 'flapjack/filters/any_parents_failed'
#require 'flapjack/notifier_engine'
#require 'flapjack/transports/result'
#require 'flapjack/transports/beanstalkd'
#require 'flapjack/patches'
#require 'flapjack/inifile'
#require 'flapjack/cli/worker_manager'
#require 'flapjack/cli/notifier_manager'
#require 'flapjack/cli/notifier'
#require 'flapjack/cli/worker'
#require 'flapjack/persistence/couch'
#require 'flapjack/persistence/sqlite3'
#require 'flapjack/applications/notifier'
#require 'flapjack/applications/worker'
#require 'flapjack/notifiers/mailer/init'
#require 'flapjack/notifiers/xmpp/init'
| Make pathname available across all features | Make pathname available across all features
| Ruby | mit | flapjack/flapjack,ikusalic/flapjack,flapjack/flapjack,ikusalic/flapjack,cloudevelops/flapjack,cloudevelops/flapjack,necula01/flapjack,necula01/flapjack,flapjack/flapjack,flapjack/flapjack,cloudevelops/flapjack,cloudevelops/flapjack,flapjack/flapjack,ikusalic/flapjack,necula01/flapjack,ikusalic/flapjack,necula01/flapjack,ikusalic/flapjack,necula01/flapjack | ruby | ## Code Before:
$: << File.expand_path(File.join(File.dirname(__FILE__), '..', '..', 'lib'))
require 'yajl'
require 'beanstalk-client'
#require 'flapjack/inifile'
#require 'flapjack/filters/ok'
#require 'flapjack/filters/any_parents_failed'
#require 'flapjack/notifier_engine'
#require 'flapjack/transports/result'
#require 'flapjack/transports/beanstalkd'
#require 'flapjack/patches'
#require 'flapjack/inifile'
#require 'flapjack/cli/worker_manager'
#require 'flapjack/cli/notifier_manager'
#require 'flapjack/cli/notifier'
#require 'flapjack/cli/worker'
#require 'flapjack/persistence/couch'
#require 'flapjack/persistence/sqlite3'
#require 'flapjack/applications/notifier'
#require 'flapjack/applications/worker'
#require 'flapjack/notifiers/mailer/init'
#require 'flapjack/notifiers/xmpp/init'
## Instruction:
Make pathname available across all features
## Code After:
$: << File.expand_path(File.join(File.dirname(__FILE__), '..', '..', 'lib'))
require 'pathname'
require 'yajl'
require 'beanstalk-client'
#require 'flapjack/inifile'
#require 'flapjack/filters/ok'
#require 'flapjack/filters/any_parents_failed'
#require 'flapjack/notifier_engine'
#require 'flapjack/transports/result'
#require 'flapjack/transports/beanstalkd'
#require 'flapjack/patches'
#require 'flapjack/inifile'
#require 'flapjack/cli/worker_manager'
#require 'flapjack/cli/notifier_manager'
#require 'flapjack/cli/notifier'
#require 'flapjack/cli/worker'
#require 'flapjack/persistence/couch'
#require 'flapjack/persistence/sqlite3'
#require 'flapjack/applications/notifier'
#require 'flapjack/applications/worker'
#require 'flapjack/notifiers/mailer/init'
#require 'flapjack/notifiers/xmpp/init'
|
$: << File.expand_path(File.join(File.dirname(__FILE__), '..', '..', 'lib'))
+ require 'pathname'
require 'yajl'
require 'beanstalk-client'
#require 'flapjack/inifile'
#require 'flapjack/filters/ok'
#require 'flapjack/filters/any_parents_failed'
#require 'flapjack/notifier_engine'
#require 'flapjack/transports/result'
#require 'flapjack/transports/beanstalkd'
#require 'flapjack/patches'
#require 'flapjack/inifile'
#require 'flapjack/cli/worker_manager'
#require 'flapjack/cli/notifier_manager'
#require 'flapjack/cli/notifier'
#require 'flapjack/cli/worker'
#require 'flapjack/persistence/couch'
#require 'flapjack/persistence/sqlite3'
#require 'flapjack/applications/notifier'
#require 'flapjack/applications/worker'
#require 'flapjack/notifiers/mailer/init'
#require 'flapjack/notifiers/xmpp/init' | 1 | 0.041667 | 1 | 0 |
d0b930e6d7ce3bff833bd177bc13a908cb1bed0d | setup.py | setup.py | import os
from setuptools import setup
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name='timeflow',
packages=['timeflow'],
version='0.2',
description='Small CLI time logger',
author='Justas Trimailovas',
author_email='j.trimailvoas@gmail.com',
url='https://github.com/trimailov/timeflow',
keywords=['timelogger', 'logging', 'timetracker', 'tracker'],
long_description=read('README.rst'),
entry_points='''
[console_scripts]
timeflow=timeflow.main:main
tf=timeflow.main:main
''',
)
| import os
from setuptools import setup
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name='timeflow',
packages=['timeflow'],
version='0.2',
description='Small CLI time logger',
author='Justas Trimailovas',
author_email='j.trimailovas@gmail.com',
url='https://github.com/trimailov/timeflow',
license='MIT',
keywords=['timelogger', 'logging', 'timetracker', 'tracker'],
long_description=read('README.rst'),
entry_points='''
[console_scripts]
timeflow=timeflow.main:main
tf=timeflow.main:main
''',
)
| Add license type and fix typo | Add license type and fix typo
| Python | mit | trimailov/timeflow | python | ## Code Before:
import os
from setuptools import setup
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name='timeflow',
packages=['timeflow'],
version='0.2',
description='Small CLI time logger',
author='Justas Trimailovas',
author_email='j.trimailvoas@gmail.com',
url='https://github.com/trimailov/timeflow',
keywords=['timelogger', 'logging', 'timetracker', 'tracker'],
long_description=read('README.rst'),
entry_points='''
[console_scripts]
timeflow=timeflow.main:main
tf=timeflow.main:main
''',
)
## Instruction:
Add license type and fix typo
## Code After:
import os
from setuptools import setup
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name='timeflow',
packages=['timeflow'],
version='0.2',
description='Small CLI time logger',
author='Justas Trimailovas',
author_email='j.trimailovas@gmail.com',
url='https://github.com/trimailov/timeflow',
license='MIT',
keywords=['timelogger', 'logging', 'timetracker', 'tracker'],
long_description=read('README.rst'),
entry_points='''
[console_scripts]
timeflow=timeflow.main:main
tf=timeflow.main:main
''',
)
| import os
from setuptools import setup
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name='timeflow',
packages=['timeflow'],
version='0.2',
description='Small CLI time logger',
author='Justas Trimailovas',
- author_email='j.trimailvoas@gmail.com',
? -
+ author_email='j.trimailovas@gmail.com',
? +
url='https://github.com/trimailov/timeflow',
+ license='MIT',
keywords=['timelogger', 'logging', 'timetracker', 'tracker'],
long_description=read('README.rst'),
entry_points='''
[console_scripts]
timeflow=timeflow.main:main
tf=timeflow.main:main
''',
) | 3 | 0.096774 | 2 | 1 |
b9aeae6d3b8702e8709594f64a24c9795d3f2a5b | Manifold/Expression+Boolean.swift | Manifold/Expression+Boolean.swift | // Copyright © 2015 Rob Rix. All rights reserved.
extension Expression where Recur: TermType {
public static var boolean: Module<Recur> {
let Boolean = Declaration("Boolean",
type: .Type(0),
value: lambda(.Type) { A in Recur.lambda(A, A, const(A)) })
let `true` = Declaration("true",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { a, _ in a } })
let `false` = Declaration("false",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { _, a in a } })
let not = Declaration("not",
type: FunctionType(Boolean.ref, Boolean.ref),
value: lambda(Boolean.ref, .Type) { b, A in Recur.lambda(A, A) { t, f in b[A, f, t] } })
let `if` = Declaration("if",
type: lambda(.Type, Boolean.ref) { t, _ in .FunctionType(t, t, t) },
value: lambda(.Type, Boolean.ref) { t, condition in Recur.lambda(t, t) { condition[$0, $1] } })
return Module([ Boolean, `true`, `false`, not, `if` ])
}
}
import Prelude
| // Copyright © 2015 Rob Rix. All rights reserved.
extension Expression where Recur: TermType {
public static var boolean: Module<Recur> {
let Boolean = Declaration("Boolean",
type: .Type(0),
value: lambda(.Type) { A in Recur.lambda(A, A, const(A)) })
let `true` = Declaration("true",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { a, _ in a } })
let `false` = Declaration("false",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { _, a in a } })
let not = Declaration("not",
type: FunctionType(Boolean.ref, Boolean.ref),
value: lambda(Boolean.ref, .Type) { b, A in Recur.lambda(A, A) { t, f in b[A, f, t] } })
return Module([ Boolean, `true`, `false`, not ])
}
}
import Prelude
| Revert "Add an `if` binding." | Revert "Add an `if` binding."
This reverts commit 4967447ba330f7a486d4d78341906157b5ad9218.
| Swift | mit | antitypical/Manifold,antitypical/Manifold | swift | ## Code Before:
// Copyright © 2015 Rob Rix. All rights reserved.
extension Expression where Recur: TermType {
public static var boolean: Module<Recur> {
let Boolean = Declaration("Boolean",
type: .Type(0),
value: lambda(.Type) { A in Recur.lambda(A, A, const(A)) })
let `true` = Declaration("true",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { a, _ in a } })
let `false` = Declaration("false",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { _, a in a } })
let not = Declaration("not",
type: FunctionType(Boolean.ref, Boolean.ref),
value: lambda(Boolean.ref, .Type) { b, A in Recur.lambda(A, A) { t, f in b[A, f, t] } })
let `if` = Declaration("if",
type: lambda(.Type, Boolean.ref) { t, _ in .FunctionType(t, t, t) },
value: lambda(.Type, Boolean.ref) { t, condition in Recur.lambda(t, t) { condition[$0, $1] } })
return Module([ Boolean, `true`, `false`, not, `if` ])
}
}
import Prelude
## Instruction:
Revert "Add an `if` binding."
This reverts commit 4967447ba330f7a486d4d78341906157b5ad9218.
## Code After:
// Copyright © 2015 Rob Rix. All rights reserved.
extension Expression where Recur: TermType {
public static var boolean: Module<Recur> {
let Boolean = Declaration("Boolean",
type: .Type(0),
value: lambda(.Type) { A in Recur.lambda(A, A, const(A)) })
let `true` = Declaration("true",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { a, _ in a } })
let `false` = Declaration("false",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { _, a in a } })
let not = Declaration("not",
type: FunctionType(Boolean.ref, Boolean.ref),
value: lambda(Boolean.ref, .Type) { b, A in Recur.lambda(A, A) { t, f in b[A, f, t] } })
return Module([ Boolean, `true`, `false`, not ])
}
}
import Prelude
| // Copyright © 2015 Rob Rix. All rights reserved.
extension Expression where Recur: TermType {
public static var boolean: Module<Recur> {
let Boolean = Declaration("Boolean",
type: .Type(0),
value: lambda(.Type) { A in Recur.lambda(A, A, const(A)) })
let `true` = Declaration("true",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { a, _ in a } })
let `false` = Declaration("false",
type: Boolean.ref.out,
value: lambda(.Type) { A in Recur.lambda(A, A) { _, a in a } })
let not = Declaration("not",
type: FunctionType(Boolean.ref, Boolean.ref),
value: lambda(Boolean.ref, .Type) { b, A in Recur.lambda(A, A) { t, f in b[A, f, t] } })
- let `if` = Declaration("if",
- type: lambda(.Type, Boolean.ref) { t, _ in .FunctionType(t, t, t) },
- value: lambda(.Type, Boolean.ref) { t, condition in Recur.lambda(t, t) { condition[$0, $1] } })
-
- return Module([ Boolean, `true`, `false`, not, `if` ])
? ------
+ return Module([ Boolean, `true`, `false`, not ])
}
}
import Prelude | 6 | 0.2 | 1 | 5 |
8d8bfecab60a4104d0e7871d51216e2524da8dea | src/Reflow/Parser.hs | src/Reflow/Parser.hs | module Reflow.Parser where
import Data.Monoid ((<>))
import Data.Text (Text)
import qualified Data.Text as T
import Text.ParserCombinators.Parsec
import Reflow.Types
parseFile :: Text -> [Content]
parseFile t = either (const []) id $ parse parseContent "content" (T.unpack t)
parseContent :: Parser [Content]
parseContent = many (quoted <|> codeBlock <|> normal) <* eof
normal :: Parser Content
normal = Normal <$> singleLine
quoted :: Parser Content
quoted = do
q <- quoteChar
l <- singleLine
return $ Quoted (q <> l)
codeBlock :: Parser Content
codeBlock = do
s <- codeBlockChar
c <- codeBlockContents
e <- codeBlockChar
eol
return $ CodeBlock $ s <> c <> e
singleLine :: Parser Text
singleLine = T.pack <$> manyTill anyChar (try eol)
quoteChar :: Parser Text
quoteChar = T.pack <$> (string ">")
codeBlockChar :: Parser Text
codeBlockChar = T.pack <$> (string "```")
codeBlockContents :: Parser Text
codeBlockContents = T.pack <$> manyTill anyChar (lookAhead codeBlockChar)
eol :: Parser String
eol = try (string "\n\r")
<|> try (string "\r\n")
<|> string "\n"
<|> string "\r"
<?> "end of line"
| module Reflow.Parser where
import Data.Monoid ((<>))
import Data.Text (Text, pack)
import Text.Parsec
import Text.Parsec.Text (Parser)
import Reflow.Types
parseFile :: Text -> [Content]
parseFile t = either (const []) id $ parse parseContent "content" t
parseContent :: Parser [Content]
parseContent = many (quoted <|> codeBlock <|> normal) <* eof
normal :: Parser Content
normal = Normal <$> singleLine
quoted :: Parser Content
quoted = do
q <- quoteChar
l <- singleLine
return $ Quoted (q <> l)
codeBlock :: Parser Content
codeBlock = do
s <- codeBlockChar
c <- codeBlockContents
e <- codeBlockChar
eol
return $ CodeBlock $ s <> c <> e
singleLine :: Parser Text
singleLine = pack <$> manyTill anyChar (try eol)
quoteChar :: Parser Text
quoteChar = pack <$> (string ">")
codeBlockChar :: Parser Text
codeBlockChar = pack <$> (string "```")
codeBlockContents :: Parser Text
codeBlockContents = pack <$> manyTill anyChar (lookAhead codeBlockChar)
eol :: Parser String
eol = try (string "\n\r")
<|> try (string "\r\n")
<|> string "\n"
<|> string "\r"
<?> "end of line"
| Make parsing Data.Text.Text slightly nicer | Make parsing Data.Text.Text slightly nicer
Parsec has had support for parsing Text Since version 3.1.2, so we don't need
`T.unpack text` as long as we also import `Text.Parsec.Text`.
This also means that the code only requires `(Text, pack)` from `Data.Text`, so
import only those (unqualified).
| Haskell | mit | gfontenot/reflow | haskell | ## Code Before:
module Reflow.Parser where
import Data.Monoid ((<>))
import Data.Text (Text)
import qualified Data.Text as T
import Text.ParserCombinators.Parsec
import Reflow.Types
parseFile :: Text -> [Content]
parseFile t = either (const []) id $ parse parseContent "content" (T.unpack t)
parseContent :: Parser [Content]
parseContent = many (quoted <|> codeBlock <|> normal) <* eof
normal :: Parser Content
normal = Normal <$> singleLine
quoted :: Parser Content
quoted = do
q <- quoteChar
l <- singleLine
return $ Quoted (q <> l)
codeBlock :: Parser Content
codeBlock = do
s <- codeBlockChar
c <- codeBlockContents
e <- codeBlockChar
eol
return $ CodeBlock $ s <> c <> e
singleLine :: Parser Text
singleLine = T.pack <$> manyTill anyChar (try eol)
quoteChar :: Parser Text
quoteChar = T.pack <$> (string ">")
codeBlockChar :: Parser Text
codeBlockChar = T.pack <$> (string "```")
codeBlockContents :: Parser Text
codeBlockContents = T.pack <$> manyTill anyChar (lookAhead codeBlockChar)
eol :: Parser String
eol = try (string "\n\r")
<|> try (string "\r\n")
<|> string "\n"
<|> string "\r"
<?> "end of line"
## Instruction:
Make parsing Data.Text.Text slightly nicer
Parsec has had support for parsing Text Since version 3.1.2, so we don't need
`T.unpack text` as long as we also import `Text.Parsec.Text`.
This also means that the code only requires `(Text, pack)` from `Data.Text`, so
import only those (unqualified).
## Code After:
module Reflow.Parser where
import Data.Monoid ((<>))
import Data.Text (Text, pack)
import Text.Parsec
import Text.Parsec.Text (Parser)
import Reflow.Types
parseFile :: Text -> [Content]
parseFile t = either (const []) id $ parse parseContent "content" t
parseContent :: Parser [Content]
parseContent = many (quoted <|> codeBlock <|> normal) <* eof
normal :: Parser Content
normal = Normal <$> singleLine
quoted :: Parser Content
quoted = do
q <- quoteChar
l <- singleLine
return $ Quoted (q <> l)
codeBlock :: Parser Content
codeBlock = do
s <- codeBlockChar
c <- codeBlockContents
e <- codeBlockChar
eol
return $ CodeBlock $ s <> c <> e
singleLine :: Parser Text
singleLine = pack <$> manyTill anyChar (try eol)
quoteChar :: Parser Text
quoteChar = pack <$> (string ">")
codeBlockChar :: Parser Text
codeBlockChar = pack <$> (string "```")
codeBlockContents :: Parser Text
codeBlockContents = pack <$> manyTill anyChar (lookAhead codeBlockChar)
eol :: Parser String
eol = try (string "\n\r")
<|> try (string "\r\n")
<|> string "\n"
<|> string "\r"
<?> "end of line"
| module Reflow.Parser where
import Data.Monoid ((<>))
- import Data.Text (Text)
+ import Data.Text (Text, pack)
? ++++++
- import qualified Data.Text as T
- import Text.ParserCombinators.Parsec
+ import Text.Parsec
+ import Text.Parsec.Text (Parser)
import Reflow.Types
parseFile :: Text -> [Content]
- parseFile t = either (const []) id $ parse parseContent "content" (T.unpack t)
? ---------- -
+ parseFile t = either (const []) id $ parse parseContent "content" t
parseContent :: Parser [Content]
parseContent = many (quoted <|> codeBlock <|> normal) <* eof
normal :: Parser Content
normal = Normal <$> singleLine
quoted :: Parser Content
quoted = do
q <- quoteChar
l <- singleLine
return $ Quoted (q <> l)
codeBlock :: Parser Content
codeBlock = do
s <- codeBlockChar
c <- codeBlockContents
e <- codeBlockChar
eol
return $ CodeBlock $ s <> c <> e
singleLine :: Parser Text
- singleLine = T.pack <$> manyTill anyChar (try eol)
? --
+ singleLine = pack <$> manyTill anyChar (try eol)
quoteChar :: Parser Text
- quoteChar = T.pack <$> (string ">")
? --
+ quoteChar = pack <$> (string ">")
codeBlockChar :: Parser Text
- codeBlockChar = T.pack <$> (string "```")
? --
+ codeBlockChar = pack <$> (string "```")
codeBlockContents :: Parser Text
- codeBlockContents = T.pack <$> manyTill anyChar (lookAhead codeBlockChar)
? --
+ codeBlockContents = pack <$> manyTill anyChar (lookAhead codeBlockChar)
eol :: Parser String
eol = try (string "\n\r")
<|> try (string "\r\n")
<|> string "\n"
<|> string "\r"
<?> "end of line" | 16 | 0.32 | 8 | 8 |
fb2ce0079a664803b2e1f7c12c3ac2f6f66e4e3a | ansible/roles/testing/files/testing/tests.d/007-heat-cfntools.sh | ansible/roles/testing/files/testing/tests.d/007-heat-cfntools.sh | . $(dirname $0)/../assert.sh
if [ -f /etc/os-release ]; then
. /etc/os-release
elif [ -f /etc/redhat-release ]; then
NAME=$(sed -e 's/\(.*\)release\? \([0-9]\+\).*/\1/' /etc/redhat-release)
ID=$(echo $NAME | tr '[:upper:]' '[:lower:]')
VERSION_ID=$(sed -e 's/.*release\? \([0-9]\+\).*/\1/' /etc/redhat-release)
else
echo "Disto ID check fail"
exit 1
fi
### heat-cfntools for everything except CentOS 5
skip_if "test \"$ID\" = \"centos\" -a $VERSION_ID -lt 6"
assert_raises "which cfn-create-aws-symlinks cfn-get-metadata cfn-push-stats cfn-hup cfn-init cfn-signal"
assert_end "heat-cfntools are installed"
| . $(dirname $0)/../assert.sh
### heat-cfntools installed in path
assert_raises "which cfn-create-aws-symlinks cfn-get-metadata cfn-push-stats cfn-hup cfn-init cfn-signal"
assert_end "heat-cfntools are installed"
| Remove CentOS 5 exception for heat-cfntools test now it's EOL | Remove CentOS 5 exception for heat-cfntools test now it's EOL
Change-Id: I9a5f470199f80420dfee9f6f4dcd16c2455ce7ba
| Shell | apache-2.0 | NeCTAR-RC/nectar-images,NeCTAR-RC/nectar-images | shell | ## Code Before:
. $(dirname $0)/../assert.sh
if [ -f /etc/os-release ]; then
. /etc/os-release
elif [ -f /etc/redhat-release ]; then
NAME=$(sed -e 's/\(.*\)release\? \([0-9]\+\).*/\1/' /etc/redhat-release)
ID=$(echo $NAME | tr '[:upper:]' '[:lower:]')
VERSION_ID=$(sed -e 's/.*release\? \([0-9]\+\).*/\1/' /etc/redhat-release)
else
echo "Disto ID check fail"
exit 1
fi
### heat-cfntools for everything except CentOS 5
skip_if "test \"$ID\" = \"centos\" -a $VERSION_ID -lt 6"
assert_raises "which cfn-create-aws-symlinks cfn-get-metadata cfn-push-stats cfn-hup cfn-init cfn-signal"
assert_end "heat-cfntools are installed"
## Instruction:
Remove CentOS 5 exception for heat-cfntools test now it's EOL
Change-Id: I9a5f470199f80420dfee9f6f4dcd16c2455ce7ba
## Code After:
. $(dirname $0)/../assert.sh
### heat-cfntools installed in path
assert_raises "which cfn-create-aws-symlinks cfn-get-metadata cfn-push-stats cfn-hup cfn-init cfn-signal"
assert_end "heat-cfntools are installed"
| . $(dirname $0)/../assert.sh
+ ### heat-cfntools installed in path
- if [ -f /etc/os-release ]; then
- . /etc/os-release
- elif [ -f /etc/redhat-release ]; then
- NAME=$(sed -e 's/\(.*\)release\? \([0-9]\+\).*/\1/' /etc/redhat-release)
- ID=$(echo $NAME | tr '[:upper:]' '[:lower:]')
- VERSION_ID=$(sed -e 's/.*release\? \([0-9]\+\).*/\1/' /etc/redhat-release)
- else
- echo "Disto ID check fail"
- exit 1
- fi
-
- ### heat-cfntools for everything except CentOS 5
- skip_if "test \"$ID\" = \"centos\" -a $VERSION_ID -lt 6"
assert_raises "which cfn-create-aws-symlinks cfn-get-metadata cfn-push-stats cfn-hup cfn-init cfn-signal"
assert_end "heat-cfntools are installed" | 14 | 0.823529 | 1 | 13 |
a6a21026fa32e4f016f928fea1f9a53a82c22844 | docs/installation.md | docs/installation.md |
Install via npm.
npm install lambda-tester --save-dev
**Note:** Make sure you are using the correct version of Node.js. `lambda-tester` by default will throw an exception if the version of
node is not `6.10.x`, which is the version supported by AWS Lambda.
|
Install via npm.
npm install lambda-tester --save-dev
| Remove note about Node version check | Remove note about Node version check | Markdown | bsd-3-clause | vandium-io/lambda-tester | markdown | ## Code Before:
Install via npm.
npm install lambda-tester --save-dev
**Note:** Make sure you are using the correct version of Node.js. `lambda-tester` by default will throw an exception if the version of
node is not `6.10.x`, which is the version supported by AWS Lambda.
## Instruction:
Remove note about Node version check
## Code After:
Install via npm.
npm install lambda-tester --save-dev
|
Install via npm.
npm install lambda-tester --save-dev
-
- **Note:** Make sure you are using the correct version of Node.js. `lambda-tester` by default will throw an exception if the version of
- node is not `6.10.x`, which is the version supported by AWS Lambda. | 3 | 0.428571 | 0 | 3 |
5f11fbf5274079317127fb4ea69b57e2b29cddf0 | src/onInit.js | src/onInit.js | export default function onInit(){
this.raw_data.forEach(function(d){
d.FormField = d.Form + ": "+d.Field
})
}; | export default function onInit(){
this.raw_data.forEach(function(d){
d.FormField = d[this.config.form_col] + ": "+d[this.config.field_col]
})
}; | Use user specified columns for data transform | Use user specified columns for data transform
| JavaScript | mit | RhoInc/query-overview,RhoInc/query-overview | javascript | ## Code Before:
export default function onInit(){
this.raw_data.forEach(function(d){
d.FormField = d.Form + ": "+d.Field
})
};
## Instruction:
Use user specified columns for data transform
## Code After:
export default function onInit(){
this.raw_data.forEach(function(d){
d.FormField = d[this.config.form_col] + ": "+d[this.config.field_col]
})
}; | export default function onInit(){
this.raw_data.forEach(function(d){
- d.FormField = d.Form + ": "+d.Field
+ d.FormField = d[this.config.form_col] + ": "+d[this.config.field_col]
})
}; | 2 | 0.4 | 1 | 1 |
b5e7e1101ee6e6f6fe30f3d186f99b518cac624f | weave_todo.md | weave_todo.md |
* Weave: Implement `target_host`.
* Figure out how to deprecate `rsync`
## Open Questions
* Should we keep or deprecate the behavior of adding settings to the global namespace?
* Should we include the "user@" portion of `target_host` or leave it off?
|
* Weave: Implement `target_host`.
* Figure out how to deprecate `rsync`
* Test everything in ruby 1.8.7
## Open Questions
* Should we keep or deprecate the behavior of adding settings to the global namespace?
* Should we include the "user@" portion of `target_host` or leave it off?
## Breaking changes in the weave branch
* `target_host` no longer includes "user@", it's just the domain
* `rsync` no longer exists (unless we write a new helper function)
* `host` is now an alias for `target_host`, not a method
| Add a todo and list of breaking changes | Weave: Add a todo and list of breaking changes
| Markdown | mit | dmac/fezzik,dmac/fezzik | markdown | ## Code Before:
* Weave: Implement `target_host`.
* Figure out how to deprecate `rsync`
## Open Questions
* Should we keep or deprecate the behavior of adding settings to the global namespace?
* Should we include the "user@" portion of `target_host` or leave it off?
## Instruction:
Weave: Add a todo and list of breaking changes
## Code After:
* Weave: Implement `target_host`.
* Figure out how to deprecate `rsync`
* Test everything in ruby 1.8.7
## Open Questions
* Should we keep or deprecate the behavior of adding settings to the global namespace?
* Should we include the "user@" portion of `target_host` or leave it off?
## Breaking changes in the weave branch
* `target_host` no longer includes "user@", it's just the domain
* `rsync` no longer exists (unless we write a new helper function)
* `host` is now an alias for `target_host`, not a method
|
* Weave: Implement `target_host`.
* Figure out how to deprecate `rsync`
+ * Test everything in ruby 1.8.7
## Open Questions
* Should we keep or deprecate the behavior of adding settings to the global namespace?
* Should we include the "user@" portion of `target_host` or leave it off?
+
+ ## Breaking changes in the weave branch
+
+ * `target_host` no longer includes "user@", it's just the domain
+ * `rsync` no longer exists (unless we write a new helper function)
+ * `host` is now an alias for `target_host`, not a method | 7 | 0.875 | 7 | 0 |
dc071864e6a560f34f8230cde69cb2d340faaa8e | .eslintrc.js | .eslintrc.js | module.exports = {
parser: 'babel-eslint',
extends: 'google',
rules: {
'no-var': 2
}
};
| module.exports = {
parser: 'babel-eslint',
extends: 'google',
rules: {
'no-var': 2,
'max-len': [2, 120, 2]
}
};
| Increase maximum valid line width | chore(linter): Increase maximum valid line width
| JavaScript | mit | base16-builder/base16-builder,base16-builder/base16-builder,aloisdg/base16-builder,aloisdg/base16-builder,aloisdg/base16-builder,base16-builder/base16-builder | javascript | ## Code Before:
module.exports = {
parser: 'babel-eslint',
extends: 'google',
rules: {
'no-var': 2
}
};
## Instruction:
chore(linter): Increase maximum valid line width
## Code After:
module.exports = {
parser: 'babel-eslint',
extends: 'google',
rules: {
'no-var': 2,
'max-len': [2, 120, 2]
}
};
| module.exports = {
parser: 'babel-eslint',
extends: 'google',
rules: {
- 'no-var': 2
+ 'no-var': 2,
? +
+ 'max-len': [2, 120, 2]
}
}; | 3 | 0.428571 | 2 | 1 |
e255b92589000c2d485d35f9008b78e0313b4374 | pystache/template_spec.py | pystache/template_spec.py |
# TODO: finish the class docstring.
class TemplateSpec(object):
"""
A mixin or interface for specifying custom template information.
The "spec" in TemplateSpec can be taken to mean that the template
information is either "specified" or "special."
A view should subclass this class only if customized template loading
is needed. The following attributes allow one to customize/override
template information on a per view basis. A None value means to use
default behavior for that value and perform no customization. All
attributes are initialized to None.
Attributes:
template: the template as a string.
template_rel_path: the path to the template file, relative to the
directory containing the module defining the class.
template_rel_directory: the directory containing the template file, relative
to the directory containing the module defining the class.
template_extension: the template file extension. Defaults to "mustache".
Pass False for no extension (i.e. extensionless template files).
"""
template = None
template_rel_path = None
template_rel_directory = None
template_name = None
template_extension = None
template_encoding = None
|
class TemplateSpec(object):
"""
A mixin or interface for specifying custom template information.
The "spec" in TemplateSpec can be taken to mean that the template
information is either "specified" or "special."
A view should subclass this class only if customized template loading
is needed. The following attributes allow one to customize/override
template information on a per view basis. A None value means to use
default behavior for that value and perform no customization. All
attributes are initialized to None.
Attributes:
template: the template as a string.
template_encoding: the encoding used by the template.
template_extension: the template file extension. Defaults to "mustache".
Pass False for no extension (i.e. extensionless template files).
template_name: the name of the template.
template_rel_directory: the directory containing the template file,
relative to the directory containing the module defining the class.
template_rel_path: the path to the template file, relative to the
directory containing the module defining the class.
"""
template = None
template_encoding = None
template_extension = None
template_name = None
template_rel_directory = None
template_rel_path = None
| Reorder TemplateSpec attributes and add to docstring. | Reorder TemplateSpec attributes and add to docstring.
| Python | mit | nitish116/pystache,rismalrv/pystache,charbeljc/pystache,rismalrv/pystache,harsh00008/pystache,arlenesr28/pystache,defunkt/pystache,beni55/pystache,nitish116/pystache,nitish116/pystache,rismalrv/pystache,jrnold/pystache,jrnold/pystache,harsh00008/pystache,harsh00008/pystache,charbeljc/pystache,arlenesr28/pystache,beni55/pystache,arlenesr28/pystache | python | ## Code Before:
# TODO: finish the class docstring.
class TemplateSpec(object):
"""
A mixin or interface for specifying custom template information.
The "spec" in TemplateSpec can be taken to mean that the template
information is either "specified" or "special."
A view should subclass this class only if customized template loading
is needed. The following attributes allow one to customize/override
template information on a per view basis. A None value means to use
default behavior for that value and perform no customization. All
attributes are initialized to None.
Attributes:
template: the template as a string.
template_rel_path: the path to the template file, relative to the
directory containing the module defining the class.
template_rel_directory: the directory containing the template file, relative
to the directory containing the module defining the class.
template_extension: the template file extension. Defaults to "mustache".
Pass False for no extension (i.e. extensionless template files).
"""
template = None
template_rel_path = None
template_rel_directory = None
template_name = None
template_extension = None
template_encoding = None
## Instruction:
Reorder TemplateSpec attributes and add to docstring.
## Code After:
class TemplateSpec(object):
"""
A mixin or interface for specifying custom template information.
The "spec" in TemplateSpec can be taken to mean that the template
information is either "specified" or "special."
A view should subclass this class only if customized template loading
is needed. The following attributes allow one to customize/override
template information on a per view basis. A None value means to use
default behavior for that value and perform no customization. All
attributes are initialized to None.
Attributes:
template: the template as a string.
template_encoding: the encoding used by the template.
template_extension: the template file extension. Defaults to "mustache".
Pass False for no extension (i.e. extensionless template files).
template_name: the name of the template.
template_rel_directory: the directory containing the template file,
relative to the directory containing the module defining the class.
template_rel_path: the path to the template file, relative to the
directory containing the module defining the class.
"""
template = None
template_encoding = None
template_extension = None
template_name = None
template_rel_directory = None
template_rel_path = None
|
- # TODO: finish the class docstring.
class TemplateSpec(object):
"""
A mixin or interface for specifying custom template information.
The "spec" in TemplateSpec can be taken to mean that the template
information is either "specified" or "special."
A view should subclass this class only if customized template loading
is needed. The following attributes allow one to customize/override
template information on a per view basis. A None value means to use
default behavior for that value and perform no customization. All
attributes are initialized to None.
Attributes:
template: the template as a string.
+ template_encoding: the encoding used by the template.
- template_rel_path: the path to the template file, relative to the
- directory containing the module defining the class.
-
- template_rel_directory: the directory containing the template file, relative
- to the directory containing the module defining the class.
template_extension: the template file extension. Defaults to "mustache".
Pass False for no extension (i.e. extensionless template files).
+ template_name: the name of the template.
+
+ template_rel_directory: the directory containing the template file,
+ relative to the directory containing the module defining the class.
+
+ template_rel_path: the path to the template file, relative to the
+ directory containing the module defining the class.
+
"""
template = None
+ template_encoding = None
+ template_extension = None
+ template_name = None
+ template_rel_directory = None
template_rel_path = None
- template_rel_directory = None
- template_name = None
- template_extension = None
- template_encoding = None | 23 | 0.621622 | 13 | 10 |
e9df3a82de5251bbbd8ee31e907f602a4aee1797 | ci-build.sh | ci-build.sh |
python3 --version
py.test --version
pep8 --version
# Target directory for all build files
BUILD=${1:-ci-build}
mkdir -p $BUILD
pep8 --ignore E501 i3pystatus tests
# Check that the setup.py script works
rm -rf ${BUILD}/test-install ${BUILD}/test-install-bin
mkdir ${BUILD}/test-install ${BUILD}/test-install-bin
PYTHONPATH=${BUILD}/test-install python3 setup.py --quiet install --install-lib ${BUILD}/test-install --install-scripts ${BUILD}/test-install-bin
test -f ${BUILD}/test-install-bin/i3pystatus
PYTHONPATH=${BUILD}/test-install py.test --junitxml ${BUILD}/testlog.xml tests
# Check that the docs build w/o warnings (-W flag)
sphinx-build -b html -W docs ${BUILD}/docs/
|
python3 --version
py.test --version
python3 -mpep8 --version
# Target directory for all build files
BUILD=${1:-ci-build}
mkdir -p $BUILD
python3 -mpep8 --ignore E501 i3pystatus tests
# Check that the setup.py script works
rm -rf ${BUILD}/test-install ${BUILD}/test-install-bin
mkdir ${BUILD}/test-install ${BUILD}/test-install-bin
PYTHONPATH=${BUILD}/test-install python3 setup.py --quiet install --install-lib ${BUILD}/test-install --install-scripts ${BUILD}/test-install-bin
test -f ${BUILD}/test-install-bin/i3pystatus
PYTHONPATH=${BUILD}/test-install py.test --junitxml ${BUILD}/testlog.xml tests
# Check that the docs build w/o warnings (-W flag)
sphinx-build -b html -W docs ${BUILD}/docs/
| Make sure that we use python3 pep8 | Make sure that we use python3 pep8
| Shell | mit | Elder-of-Ozone/i3pystatus,asmikhailov/i3pystatus,ncoop/i3pystatus,richese/i3pystatus,eBrnd/i3pystatus,juliushaertl/i3pystatus,opatut/i3pystatus,enkore/i3pystatus,Elder-of-Ozone/i3pystatus,richese/i3pystatus,m45t3r/i3pystatus,drwahl/i3pystatus,fmarchenko/i3pystatus,juliushaertl/i3pystatus,Arvedui/i3pystatus,ismaelpuerto/i3pystatus,enkore/i3pystatus,schroeji/i3pystatus,Arvedui/i3pystatus,fmarchenko/i3pystatus,asmikhailov/i3pystatus,yang-ling/i3pystatus,teto/i3pystatus,plumps/i3pystatus,yang-ling/i3pystatus,MaicoTimmerman/i3pystatus,ncoop/i3pystatus,claria/i3pystatus,paulollivier/i3pystatus,ismaelpuerto/i3pystatus,onkelpit/i3pystatus,plumps/i3pystatus,facetoe/i3pystatus,facetoe/i3pystatus,m45t3r/i3pystatus,paulollivier/i3pystatus,teto/i3pystatus,schroeji/i3pystatus,claria/i3pystatus,onkelpit/i3pystatus,eBrnd/i3pystatus,drwahl/i3pystatus,opatut/i3pystatus,MaicoTimmerman/i3pystatus | shell | ## Code Before:
python3 --version
py.test --version
pep8 --version
# Target directory for all build files
BUILD=${1:-ci-build}
mkdir -p $BUILD
pep8 --ignore E501 i3pystatus tests
# Check that the setup.py script works
rm -rf ${BUILD}/test-install ${BUILD}/test-install-bin
mkdir ${BUILD}/test-install ${BUILD}/test-install-bin
PYTHONPATH=${BUILD}/test-install python3 setup.py --quiet install --install-lib ${BUILD}/test-install --install-scripts ${BUILD}/test-install-bin
test -f ${BUILD}/test-install-bin/i3pystatus
PYTHONPATH=${BUILD}/test-install py.test --junitxml ${BUILD}/testlog.xml tests
# Check that the docs build w/o warnings (-W flag)
sphinx-build -b html -W docs ${BUILD}/docs/
## Instruction:
Make sure that we use python3 pep8
## Code After:
python3 --version
py.test --version
python3 -mpep8 --version
# Target directory for all build files
BUILD=${1:-ci-build}
mkdir -p $BUILD
python3 -mpep8 --ignore E501 i3pystatus tests
# Check that the setup.py script works
rm -rf ${BUILD}/test-install ${BUILD}/test-install-bin
mkdir ${BUILD}/test-install ${BUILD}/test-install-bin
PYTHONPATH=${BUILD}/test-install python3 setup.py --quiet install --install-lib ${BUILD}/test-install --install-scripts ${BUILD}/test-install-bin
test -f ${BUILD}/test-install-bin/i3pystatus
PYTHONPATH=${BUILD}/test-install py.test --junitxml ${BUILD}/testlog.xml tests
# Check that the docs build w/o warnings (-W flag)
sphinx-build -b html -W docs ${BUILD}/docs/
|
python3 --version
py.test --version
- pep8 --version
+ python3 -mpep8 --version
# Target directory for all build files
BUILD=${1:-ci-build}
mkdir -p $BUILD
- pep8 --ignore E501 i3pystatus tests
+ python3 -mpep8 --ignore E501 i3pystatus tests
? ++++++++++
# Check that the setup.py script works
rm -rf ${BUILD}/test-install ${BUILD}/test-install-bin
mkdir ${BUILD}/test-install ${BUILD}/test-install-bin
PYTHONPATH=${BUILD}/test-install python3 setup.py --quiet install --install-lib ${BUILD}/test-install --install-scripts ${BUILD}/test-install-bin
test -f ${BUILD}/test-install-bin/i3pystatus
PYTHONPATH=${BUILD}/test-install py.test --junitxml ${BUILD}/testlog.xml tests
# Check that the docs build w/o warnings (-W flag)
sphinx-build -b html -W docs ${BUILD}/docs/
| 4 | 0.166667 | 2 | 2 |
c8970d9d16234cdadca19eb0976300c7548180c2 | addon/templates/components/goal/goal-overview.hbs | addon/templates/components/goal/goal-overview.hbs | <div class="mdl-card__title">
<h2 class="mdl-card__title-text">{{model.title}}</h2>
</div>
{{#if showAll}}
<div class="mdl-card__supporting-text">
{{model.description}}
{{component 'bepstore-goals@goal/core-team' team=model}}
</div>
<div class="mdl-card__actions mdl-card--border">
{{mdl-button isRaised=true text='Go to goal' action="open"}}
</div>
{{/if}}
<div class="mdl-card__menu">
{{mdl-icon badge=model.contributors.length icon='group' class='mdl-badge--overlap vbottom '}}
{{component 'bepstore-goals@badge/status-badge' status=model.status size='small' class='inline-block'}}
{{#if showAll}}
{{mdl-button icon='expand_less' isColored=false action='showDescr'}}
{{else}}
{{mdl-button icon='expand_more' isColored=false action='showDescr'}}
{{/if}}
</div>
| <div class="mdl-card__title" {{action 'open'}}>
<h2 class="mdl-card__title-text">{{model.title}}</h2>
</div>
{{#if showAll}}
<div class="mdl-card__supporting-text">
{{model.description}}
{{component 'bepstore-goals@goal/core-team' team=model}}
</div>
<div class="mdl-card__actions mdl-card--border">
{{mdl-button isRaised=true text='Go to goal' action="open"}}
</div>
{{/if}}
<div class="mdl-card__menu">
{{mdl-icon badge=model.contributors.length icon='group' class='mdl-badge--overlap vbottom '}}
{{component 'bepstore-goals@badge/status-badge' status=model.status size='small' class='inline-block'}}
{{#if showAll}}
{{mdl-button icon='expand_less' isColored=false action='showDescr'}}
{{else}}
{{mdl-button icon='expand_more' isColored=false action='showDescr'}}
{{/if}}
</div>
| Add possibility to directly go to a goal. | Add possibility to directly go to a goal.
| Handlebars | mit | feedbackfruits/bepstore-ui-goals,feedbackfruits/bepstore-ui-goals | handlebars | ## Code Before:
<div class="mdl-card__title">
<h2 class="mdl-card__title-text">{{model.title}}</h2>
</div>
{{#if showAll}}
<div class="mdl-card__supporting-text">
{{model.description}}
{{component 'bepstore-goals@goal/core-team' team=model}}
</div>
<div class="mdl-card__actions mdl-card--border">
{{mdl-button isRaised=true text='Go to goal' action="open"}}
</div>
{{/if}}
<div class="mdl-card__menu">
{{mdl-icon badge=model.contributors.length icon='group' class='mdl-badge--overlap vbottom '}}
{{component 'bepstore-goals@badge/status-badge' status=model.status size='small' class='inline-block'}}
{{#if showAll}}
{{mdl-button icon='expand_less' isColored=false action='showDescr'}}
{{else}}
{{mdl-button icon='expand_more' isColored=false action='showDescr'}}
{{/if}}
</div>
## Instruction:
Add possibility to directly go to a goal.
## Code After:
<div class="mdl-card__title" {{action 'open'}}>
<h2 class="mdl-card__title-text">{{model.title}}</h2>
</div>
{{#if showAll}}
<div class="mdl-card__supporting-text">
{{model.description}}
{{component 'bepstore-goals@goal/core-team' team=model}}
</div>
<div class="mdl-card__actions mdl-card--border">
{{mdl-button isRaised=true text='Go to goal' action="open"}}
</div>
{{/if}}
<div class="mdl-card__menu">
{{mdl-icon badge=model.contributors.length icon='group' class='mdl-badge--overlap vbottom '}}
{{component 'bepstore-goals@badge/status-badge' status=model.status size='small' class='inline-block'}}
{{#if showAll}}
{{mdl-button icon='expand_less' isColored=false action='showDescr'}}
{{else}}
{{mdl-button icon='expand_more' isColored=false action='showDescr'}}
{{/if}}
</div>
| - <div class="mdl-card__title">
+ <div class="mdl-card__title" {{action 'open'}}>
? ++++++++++++++++++
<h2 class="mdl-card__title-text">{{model.title}}</h2>
</div>
{{#if showAll}}
<div class="mdl-card__supporting-text">
{{model.description}}
{{component 'bepstore-goals@goal/core-team' team=model}}
</div>
<div class="mdl-card__actions mdl-card--border">
{{mdl-button isRaised=true text='Go to goal' action="open"}}
</div>
{{/if}}
<div class="mdl-card__menu">
{{mdl-icon badge=model.contributors.length icon='group' class='mdl-badge--overlap vbottom '}}
{{component 'bepstore-goals@badge/status-badge' status=model.status size='small' class='inline-block'}}
{{#if showAll}}
{{mdl-button icon='expand_less' isColored=false action='showDescr'}}
{{else}}
{{mdl-button icon='expand_more' isColored=false action='showDescr'}}
{{/if}}
</div> | 2 | 0.095238 | 1 | 1 |
e90c7d034f070361893f77d7a257640d647be0c7 | mbuild/tests/test_xyz.py | mbuild/tests/test_xyz.py | import numpy as np
import pytest
import mbuild as mb
from mbuild.utils.io import get_fn
from mbuild.tests.base_test import BaseTest
from mbuild.exceptions import MBuildError
class TestXYZ(BaseTest):
def test_load_no_top(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert ethane_in.n_bonds == 0
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_wrong_n_atoms(self):
with pytest.raises(MBuildError):
mb.load(get_fn('too_few_atoms.xyz'))
with pytest.raises(MBuildError):
mb.load(get_fn('too_many_atoms.xyz'))
def test_save(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_coordinates(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert np.allclose(ethane.xyz, ethane_in.xyz)
| import numpy as np
import pytest
import mbuild as mb
from mbuild.formats.xyz import write_xyz
from mbuild.utils.io import get_fn
from mbuild.tests.base_test import BaseTest
from mbuild.exceptions import MBuildError
class TestXYZ(BaseTest):
def test_load_no_top(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert ethane_in.n_bonds == 0
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_wrong_n_atoms(self):
with pytest.raises(MBuildError):
mb.load(get_fn('too_few_atoms.xyz'))
with pytest.raises(MBuildError):
mb.load(get_fn('too_many_atoms.xyz'))
def test_bad_input(self, ethane):
with pytest.raises(ValueError):
assert isinstance(ethane, mb.Compound)
write_xyz(ethane, 'compound.xyz')
def test_save(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_coordinates(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert np.allclose(ethane.xyz, ethane_in.xyz)
| Add test to ensure write_xyz does not directly take in compound | Add test to ensure write_xyz does not directly take in compound
| Python | mit | iModels/mbuild,iModels/mbuild | python | ## Code Before:
import numpy as np
import pytest
import mbuild as mb
from mbuild.utils.io import get_fn
from mbuild.tests.base_test import BaseTest
from mbuild.exceptions import MBuildError
class TestXYZ(BaseTest):
def test_load_no_top(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert ethane_in.n_bonds == 0
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_wrong_n_atoms(self):
with pytest.raises(MBuildError):
mb.load(get_fn('too_few_atoms.xyz'))
with pytest.raises(MBuildError):
mb.load(get_fn('too_many_atoms.xyz'))
def test_save(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_coordinates(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert np.allclose(ethane.xyz, ethane_in.xyz)
## Instruction:
Add test to ensure write_xyz does not directly take in compound
## Code After:
import numpy as np
import pytest
import mbuild as mb
from mbuild.formats.xyz import write_xyz
from mbuild.utils.io import get_fn
from mbuild.tests.base_test import BaseTest
from mbuild.exceptions import MBuildError
class TestXYZ(BaseTest):
def test_load_no_top(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert ethane_in.n_bonds == 0
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_wrong_n_atoms(self):
with pytest.raises(MBuildError):
mb.load(get_fn('too_few_atoms.xyz'))
with pytest.raises(MBuildError):
mb.load(get_fn('too_many_atoms.xyz'))
def test_bad_input(self, ethane):
with pytest.raises(ValueError):
assert isinstance(ethane, mb.Compound)
write_xyz(ethane, 'compound.xyz')
def test_save(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_coordinates(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert np.allclose(ethane.xyz, ethane_in.xyz)
| import numpy as np
import pytest
import mbuild as mb
+ from mbuild.formats.xyz import write_xyz
from mbuild.utils.io import get_fn
from mbuild.tests.base_test import BaseTest
from mbuild.exceptions import MBuildError
class TestXYZ(BaseTest):
def test_load_no_top(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert ethane_in.n_bonds == 0
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_wrong_n_atoms(self):
with pytest.raises(MBuildError):
mb.load(get_fn('too_few_atoms.xyz'))
with pytest.raises(MBuildError):
mb.load(get_fn('too_many_atoms.xyz'))
+ def test_bad_input(self, ethane):
+ with pytest.raises(ValueError):
+ assert isinstance(ethane, mb.Compound)
+ write_xyz(ethane, 'compound.xyz')
+
def test_save(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert len(ethane_in.children) == 8
assert set([child.name for child in ethane_in.children]) == {'C', 'H'}
def test_coordinates(self, ethane):
ethane.save(filename='ethane.xyz')
ethane_in = mb.load('ethane.xyz')
assert np.allclose(ethane.xyz, ethane_in.xyz) | 6 | 0.181818 | 6 | 0 |
5df65bc90a4c206fd5e35930ed5995077b94db91 | scuole/static_src/scss/main.scss | scuole/static_src/scss/main.scss | // Variables and mixins
@import 'variables';
@import 'mixins';
// Reset and overrides
@import 'reset';
@import 'overrides';
// Third-party libraries
@import 'node_modules/sass-mq/mq';
// Base CSS
@import 'grid';
@import 'type';
@import 'tables';
@import 'buttons';
@import 'cards';
@import 'fonts';
@import 'animations';
// Components
@import 'components/header';
@import 'components/tribune-logo';
@import 'components/landing-content';
@import 'components/breadcrumbs';
@import 'components/search';
@import 'components/map';
@import 'components/ads';
// @import 'components/stats';
@import 'components/metrics';
@import 'components/menu';
@import 'components/bar-chart';
@import 'components/sections';
@import 'components/campus-list';
@import 'components/story-grid';
@import 'components/footer';
@import 'components/reminder-bar';
@import 'components/listbox';
| // Variables and mixins
@import 'variables';
@import 'mixins';
// Reset and overrides
@import 'reset';
@import 'overrides';
// Third-party libraries
@import 'node_modules/sass-mq/mq';
// Base CSS
@import 'grid';
@import 'type';
@import 'tables';
@import 'buttons';
@import 'cards';
@import 'fonts';
@import 'animations';
// Components
@import 'components/header';
@import 'components/tribune-logo';
@import 'components/landing-content';
@import 'components/breadcrumbs';
@import 'components/search';
@import 'components/map';
@import 'components/ads';
@import 'components/metrics';
@import 'components/menu';
@import 'components/bar-chart';
@import 'components/sections';
@import 'components/campus-list';
@import 'components/story-grid';
@import 'components/footer';
@import 'components/reminder-bar';
@import 'components/listbox';
@import 'components/about';
| Remove unused stats import, add about | Remove unused stats import, add about
| SCSS | mit | texastribune/scuole,texastribune/scuole,texastribune/scuole,texastribune/scuole | scss | ## Code Before:
// Variables and mixins
@import 'variables';
@import 'mixins';
// Reset and overrides
@import 'reset';
@import 'overrides';
// Third-party libraries
@import 'node_modules/sass-mq/mq';
// Base CSS
@import 'grid';
@import 'type';
@import 'tables';
@import 'buttons';
@import 'cards';
@import 'fonts';
@import 'animations';
// Components
@import 'components/header';
@import 'components/tribune-logo';
@import 'components/landing-content';
@import 'components/breadcrumbs';
@import 'components/search';
@import 'components/map';
@import 'components/ads';
// @import 'components/stats';
@import 'components/metrics';
@import 'components/menu';
@import 'components/bar-chart';
@import 'components/sections';
@import 'components/campus-list';
@import 'components/story-grid';
@import 'components/footer';
@import 'components/reminder-bar';
@import 'components/listbox';
## Instruction:
Remove unused stats import, add about
## Code After:
// Variables and mixins
@import 'variables';
@import 'mixins';
// Reset and overrides
@import 'reset';
@import 'overrides';
// Third-party libraries
@import 'node_modules/sass-mq/mq';
// Base CSS
@import 'grid';
@import 'type';
@import 'tables';
@import 'buttons';
@import 'cards';
@import 'fonts';
@import 'animations';
// Components
@import 'components/header';
@import 'components/tribune-logo';
@import 'components/landing-content';
@import 'components/breadcrumbs';
@import 'components/search';
@import 'components/map';
@import 'components/ads';
@import 'components/metrics';
@import 'components/menu';
@import 'components/bar-chart';
@import 'components/sections';
@import 'components/campus-list';
@import 'components/story-grid';
@import 'components/footer';
@import 'components/reminder-bar';
@import 'components/listbox';
@import 'components/about';
| // Variables and mixins
@import 'variables';
@import 'mixins';
// Reset and overrides
@import 'reset';
@import 'overrides';
// Third-party libraries
@import 'node_modules/sass-mq/mq';
// Base CSS
@import 'grid';
@import 'type';
@import 'tables';
@import 'buttons';
@import 'cards';
@import 'fonts';
@import 'animations';
// Components
@import 'components/header';
@import 'components/tribune-logo';
@import 'components/landing-content';
@import 'components/breadcrumbs';
@import 'components/search';
@import 'components/map';
@import 'components/ads';
- // @import 'components/stats';
@import 'components/metrics';
@import 'components/menu';
@import 'components/bar-chart';
@import 'components/sections';
@import 'components/campus-list';
@import 'components/story-grid';
@import 'components/footer';
@import 'components/reminder-bar';
@import 'components/listbox';
+ @import 'components/about'; | 2 | 0.052632 | 1 | 1 |
f146d34622af5e73b72af16a88a3f9e3a92a3a6d | app/views/users/_chance.haml | app/views/users/_chance.haml | / Chance
.col-md-4.col-xs-12.widget.widget_tally_box
%button.x_panel.ui-ribbon-container.fixed_height_390{"data-target" => "#contact", "data-toggle" => "modal", :type => "button"}
- if percent > 75
.ui-ribbon-wrapper
.ui-ribbon
Prioritet
/ .x_title
/ %h2= type
/ .clearfix
.x_content
%div{style: "text-align: center; margin-bottom: 17px"}
%span.chart{"data-percent" => percent}
%span.percent= percent
%canvas{height: "110", width: "110"}
%h3.name_title= "#{name} #{type}"
%br
%div
%ul.list-inline.widget_tally
- @chance.factors.where(product: name).order(id: :desc).limit(3).each do |factor|
%li
%span.month
= factor.title
%span.count
= factor.amount > 0 ? '+' : '-'
= "#{factor.amount.abs}%"
/ %p Šansa za kupovinu
/ .divider
/ %p If you've decided to go in development mode and tweak all of this a bit, there are few things you should do.
| / Chance
.col-md-4.col-xs-12.widget.widget_tally_box
%button.x_panel.ui-ribbon-container.fixed_height_390{"data-target" => "#contact", "data-toggle" => "modal", :type => "button"}
- if percent > 75
.ui-ribbon-wrapper
.ui-ribbon
Prioritet
/ .x_title
/ %h2= type
/ .clearfix
.x_content
%div{style: "text-align: center; margin-bottom: 17px"}
%span.chart{"data-percent" => percent}
%span.percent= percent
%canvas{height: "110", width: "110"}
%h3.name_title= "#{name} #{type}"
%br
%div
%ul.list-inline.widget_tally
- @chance.factors.where(product: name).order(id: :desc).limit(3).each do |factor|
%li
%span.month
- if factor.title.include? 'kredit'
%i.fa.fa-bar-chart{style: "color: #3b5998;"}
- else
%i.fa.fa-facebook-official{style: "color: #3b5998;"}
= factor.title
%span.count
= factor.amount > 0 ? '+' : '-'
= "#{factor.amount.abs}%"
/ %p Šansa za kupovinu
/ .divider
/ %p If you've decided to go in development mode and tweak all of this a bit, there are few things you should do.
| Add font-awesome icons to factors | Add font-awesome icons to factors
| Haml | mit | aaleksandar/Societe-Generale-Fintech-Hackathon-Dashboard,aaleksandar/Societe-Generale-Fintech-Hackathon-Dashboard,aaleksandar/Societe-Generale-Fintech-Hackathon-Dashboard | haml | ## Code Before:
/ Chance
.col-md-4.col-xs-12.widget.widget_tally_box
%button.x_panel.ui-ribbon-container.fixed_height_390{"data-target" => "#contact", "data-toggle" => "modal", :type => "button"}
- if percent > 75
.ui-ribbon-wrapper
.ui-ribbon
Prioritet
/ .x_title
/ %h2= type
/ .clearfix
.x_content
%div{style: "text-align: center; margin-bottom: 17px"}
%span.chart{"data-percent" => percent}
%span.percent= percent
%canvas{height: "110", width: "110"}
%h3.name_title= "#{name} #{type}"
%br
%div
%ul.list-inline.widget_tally
- @chance.factors.where(product: name).order(id: :desc).limit(3).each do |factor|
%li
%span.month
= factor.title
%span.count
= factor.amount > 0 ? '+' : '-'
= "#{factor.amount.abs}%"
/ %p Šansa za kupovinu
/ .divider
/ %p If you've decided to go in development mode and tweak all of this a bit, there are few things you should do.
## Instruction:
Add font-awesome icons to factors
## Code After:
/ Chance
.col-md-4.col-xs-12.widget.widget_tally_box
%button.x_panel.ui-ribbon-container.fixed_height_390{"data-target" => "#contact", "data-toggle" => "modal", :type => "button"}
- if percent > 75
.ui-ribbon-wrapper
.ui-ribbon
Prioritet
/ .x_title
/ %h2= type
/ .clearfix
.x_content
%div{style: "text-align: center; margin-bottom: 17px"}
%span.chart{"data-percent" => percent}
%span.percent= percent
%canvas{height: "110", width: "110"}
%h3.name_title= "#{name} #{type}"
%br
%div
%ul.list-inline.widget_tally
- @chance.factors.where(product: name).order(id: :desc).limit(3).each do |factor|
%li
%span.month
- if factor.title.include? 'kredit'
%i.fa.fa-bar-chart{style: "color: #3b5998;"}
- else
%i.fa.fa-facebook-official{style: "color: #3b5998;"}
= factor.title
%span.count
= factor.amount > 0 ? '+' : '-'
= "#{factor.amount.abs}%"
/ %p Šansa za kupovinu
/ .divider
/ %p If you've decided to go in development mode and tweak all of this a bit, there are few things you should do.
| / Chance
.col-md-4.col-xs-12.widget.widget_tally_box
%button.x_panel.ui-ribbon-container.fixed_height_390{"data-target" => "#contact", "data-toggle" => "modal", :type => "button"}
- if percent > 75
.ui-ribbon-wrapper
.ui-ribbon
Prioritet
/ .x_title
/ %h2= type
/ .clearfix
.x_content
%div{style: "text-align: center; margin-bottom: 17px"}
%span.chart{"data-percent" => percent}
%span.percent= percent
%canvas{height: "110", width: "110"}
%h3.name_title= "#{name} #{type}"
%br
%div
%ul.list-inline.widget_tally
- @chance.factors.where(product: name).order(id: :desc).limit(3).each do |factor|
%li
%span.month
+ - if factor.title.include? 'kredit'
+ %i.fa.fa-bar-chart{style: "color: #3b5998;"}
+ - else
+ %i.fa.fa-facebook-official{style: "color: #3b5998;"}
= factor.title
%span.count
= factor.amount > 0 ? '+' : '-'
= "#{factor.amount.abs}%"
/ %p Šansa za kupovinu
/ .divider
/ %p If you've decided to go in development mode and tweak all of this a bit, there are few things you should do. | 4 | 0.133333 | 4 | 0 |
b53313b55e721420f0d1edbfc41927e3477017ea | project/Release.scala | project/Release.scala | package ml.combust.mleap
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport
import sbt.Keys._
import sbtrelease.ReleasePlugin.autoImport.{ReleaseStep, _}
import sbtrelease.ReleaseStateTransformations._
import xerial.sbt.Sonatype.SonatypeCommand
object Release {
lazy val settings = Seq(releaseVersionBump := sbtrelease.Version.Bump.Minor,
releaseCrossBuild := true,
releaseProcess := Seq[ReleaseStep](
checkSnapshotDependencies,
inquireVersions,
setReleaseVersion,
runClean,
runTest,
tagRelease,
publishArtifacts,
commitReleaseVersion,
releaseStepCommand(SonatypeCommand.sonatypeRelease),
releaseStepTask(publish in autoImport.Docker in MleapProject.serving),
setNextVersion,
commitNextVersion,
pushChanges
))
} | package ml.combust.mleap
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport
import sbt.Keys._
import sbtrelease.ReleasePlugin.autoImport.{ReleaseStep, _}
import sbtrelease.ReleaseStateTransformations._
import xerial.sbt.Sonatype.SonatypeCommand
object Release {
lazy val settings = Seq(releaseVersionBump := sbtrelease.Version.Bump.Minor,
releaseCrossBuild := true,
releaseProcess := Seq[ReleaseStep](
checkSnapshotDependencies,
inquireVersions,
setReleaseVersion,
runClean,
runTest,
tagRelease,
commitReleaseVersion,
publishArtifacts,
releaseStepCommand(SonatypeCommand.sonatypeRelease),
releaseStepTask(publish in autoImport.Docker in MleapProject.serving),
setNextVersion,
commitNextVersion,
pushChanges
))
} | Update release process once more. | Update release process once more.
| Scala | apache-2.0 | combust/mleap,combust-ml/mleap,combust-ml/mleap,combust/mleap,combust-ml/mleap,combust/mleap | scala | ## Code Before:
package ml.combust.mleap
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport
import sbt.Keys._
import sbtrelease.ReleasePlugin.autoImport.{ReleaseStep, _}
import sbtrelease.ReleaseStateTransformations._
import xerial.sbt.Sonatype.SonatypeCommand
object Release {
lazy val settings = Seq(releaseVersionBump := sbtrelease.Version.Bump.Minor,
releaseCrossBuild := true,
releaseProcess := Seq[ReleaseStep](
checkSnapshotDependencies,
inquireVersions,
setReleaseVersion,
runClean,
runTest,
tagRelease,
publishArtifacts,
commitReleaseVersion,
releaseStepCommand(SonatypeCommand.sonatypeRelease),
releaseStepTask(publish in autoImport.Docker in MleapProject.serving),
setNextVersion,
commitNextVersion,
pushChanges
))
}
## Instruction:
Update release process once more.
## Code After:
package ml.combust.mleap
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport
import sbt.Keys._
import sbtrelease.ReleasePlugin.autoImport.{ReleaseStep, _}
import sbtrelease.ReleaseStateTransformations._
import xerial.sbt.Sonatype.SonatypeCommand
object Release {
lazy val settings = Seq(releaseVersionBump := sbtrelease.Version.Bump.Minor,
releaseCrossBuild := true,
releaseProcess := Seq[ReleaseStep](
checkSnapshotDependencies,
inquireVersions,
setReleaseVersion,
runClean,
runTest,
tagRelease,
commitReleaseVersion,
publishArtifacts,
releaseStepCommand(SonatypeCommand.sonatypeRelease),
releaseStepTask(publish in autoImport.Docker in MleapProject.serving),
setNextVersion,
commitNextVersion,
pushChanges
))
} | package ml.combust.mleap
import com.typesafe.sbt.packager.docker.DockerPlugin.autoImport
import sbt.Keys._
import sbtrelease.ReleasePlugin.autoImport.{ReleaseStep, _}
import sbtrelease.ReleaseStateTransformations._
import xerial.sbt.Sonatype.SonatypeCommand
object Release {
lazy val settings = Seq(releaseVersionBump := sbtrelease.Version.Bump.Minor,
releaseCrossBuild := true,
releaseProcess := Seq[ReleaseStep](
checkSnapshotDependencies,
inquireVersions,
setReleaseVersion,
runClean,
runTest,
tagRelease,
+ commitReleaseVersion,
publishArtifacts,
- commitReleaseVersion,
releaseStepCommand(SonatypeCommand.sonatypeRelease),
releaseStepTask(publish in autoImport.Docker in MleapProject.serving),
setNextVersion,
commitNextVersion,
pushChanges
))
} | 2 | 0.071429 | 1 | 1 |
6b1ebb4eb2e3038d7315b9fe0dc908245bc57d36 | lib/nikeplus/http_utils.rb | lib/nikeplus/http_utils.rb | module NikePlus
module HTTPUtils
def post_request(url, body)
Excon.post(url,
body: URI.encode_www_form(body),
headers: { 'Content-Type' => 'application/x-www-form-urlencoded' })
end
def get_request(url)
Excon.get(url)
end
def extract_hash_from_json_response_body(response)
JSON.parse(response.data[:body])
end
def build_url(options = {})
uri = URI.parse(self.class.const_get('API_URL'))
uri.query = [uri.query, URI.encode_www_form(options)].compact.join('&')
uri.to_s
end
end
end
| module NikePlus
module HTTPUtils
def post_request(url, body)
Excon.post(url,
body: URI.encode_www_form(body),
headers: { 'Content-Type' => 'application/x-www-form-urlencoded' })
end
def get_request(url)
Excon.get(url)
end
def extract_hash_from_json_response_body(response)
JSON.parse(response.data[:body])
end
def build_url(options = {})
uri = URI.parse(self.class.const_get('API_URL'))
token = NikePlus.configuration[:access_token]
options.merge!(access_token: token) if token
uri.query = [uri.query, URI.encode_www_form(options)].compact.join('&')
uri.to_s
end
end
end
| Add token to all get request via class variable on module | Add token to all get request via class variable on module
| Ruby | mit | jumichot/nikeplus | ruby | ## Code Before:
module NikePlus
module HTTPUtils
def post_request(url, body)
Excon.post(url,
body: URI.encode_www_form(body),
headers: { 'Content-Type' => 'application/x-www-form-urlencoded' })
end
def get_request(url)
Excon.get(url)
end
def extract_hash_from_json_response_body(response)
JSON.parse(response.data[:body])
end
def build_url(options = {})
uri = URI.parse(self.class.const_get('API_URL'))
uri.query = [uri.query, URI.encode_www_form(options)].compact.join('&')
uri.to_s
end
end
end
## Instruction:
Add token to all get request via class variable on module
## Code After:
module NikePlus
module HTTPUtils
def post_request(url, body)
Excon.post(url,
body: URI.encode_www_form(body),
headers: { 'Content-Type' => 'application/x-www-form-urlencoded' })
end
def get_request(url)
Excon.get(url)
end
def extract_hash_from_json_response_body(response)
JSON.parse(response.data[:body])
end
def build_url(options = {})
uri = URI.parse(self.class.const_get('API_URL'))
token = NikePlus.configuration[:access_token]
options.merge!(access_token: token) if token
uri.query = [uri.query, URI.encode_www_form(options)].compact.join('&')
uri.to_s
end
end
end
| module NikePlus
module HTTPUtils
def post_request(url, body)
Excon.post(url,
body: URI.encode_www_form(body),
headers: { 'Content-Type' => 'application/x-www-form-urlencoded' })
end
def get_request(url)
Excon.get(url)
end
def extract_hash_from_json_response_body(response)
JSON.parse(response.data[:body])
end
def build_url(options = {})
uri = URI.parse(self.class.const_get('API_URL'))
+ token = NikePlus.configuration[:access_token]
+ options.merge!(access_token: token) if token
uri.query = [uri.query, URI.encode_www_form(options)].compact.join('&')
uri.to_s
end
end
end | 2 | 0.086957 | 2 | 0 |
a98da6a323a098bdebfce6f18d919af93b8f4f0a | crawl.sh | crawl.sh |
if [[ -f $1 ]]; then
log_file=$1
else
# If multiple log files, just use first found
log_file=`find . -iname '*.log' -print -quit`
fi
if [[ $@ != *"-A"* ]]; then
conn_ip=$(echo $SSH_CLIENT | awk '{ print $1 }')
else
conn_ip='//'
fi
printf "Following log file %s, ignoring %s\n" $log_file $conn_ip
tail -f ${log_file} | grep -v ${conn_ip}
|
if [[ -f $1 ]]; then
log_file=$1
else
# If multiple log files, just use first found
log_file=`find . -iname '*.log' -print -quit`
fi
if [[ $@ != *"-A"* ]]; then
conn_ip=$(echo $SSH_CLIENT | awk '{ print $1 }')
else
conn_ip='//'
fi
printf "\n==> Following log file %s, ignoring %s <==\n\n" $log_file $conn_ip
tail -f ${log_file} | grep -v ${conn_ip}
| Format informational line to be more semantic, explicit | Format informational line to be more semantic, explicit
| Shell | mit | jm-janzen/scripts,jm-janzen/scripts,jm-janzen/scripts | shell | ## Code Before:
if [[ -f $1 ]]; then
log_file=$1
else
# If multiple log files, just use first found
log_file=`find . -iname '*.log' -print -quit`
fi
if [[ $@ != *"-A"* ]]; then
conn_ip=$(echo $SSH_CLIENT | awk '{ print $1 }')
else
conn_ip='//'
fi
printf "Following log file %s, ignoring %s\n" $log_file $conn_ip
tail -f ${log_file} | grep -v ${conn_ip}
## Instruction:
Format informational line to be more semantic, explicit
## Code After:
if [[ -f $1 ]]; then
log_file=$1
else
# If multiple log files, just use first found
log_file=`find . -iname '*.log' -print -quit`
fi
if [[ $@ != *"-A"* ]]; then
conn_ip=$(echo $SSH_CLIENT | awk '{ print $1 }')
else
conn_ip='//'
fi
printf "\n==> Following log file %s, ignoring %s <==\n\n" $log_file $conn_ip
tail -f ${log_file} | grep -v ${conn_ip}
|
if [[ -f $1 ]]; then
log_file=$1
else
# If multiple log files, just use first found
log_file=`find . -iname '*.log' -print -quit`
fi
if [[ $@ != *"-A"* ]]; then
conn_ip=$(echo $SSH_CLIENT | awk '{ print $1 }')
else
conn_ip='//'
fi
- printf "Following log file %s, ignoring %s\n" $log_file $conn_ip
+ printf "\n==> Following log file %s, ignoring %s <==\n\n" $log_file $conn_ip
? ++++++ ++++++
tail -f ${log_file} | grep -v ${conn_ip} | 2 | 0.117647 | 1 | 1 |
6377284c022f26cfd9528b09af3ec61fc91a2c54 | api/tests/__init__.py | api/tests/__init__.py | import json
from django.test import TestCase, Client
# Create your tests here.
from login.models import myuser
from rest_framework.authtoken.models import Token
class APITestCase(TestCase):
test_schema = 'schema1'
test_table = 'population2'
@classmethod
def setUpClass(cls):
super(APITestCase, cls).setUpClass()
cls.user = myuser.objects.create(name='MrTest')
cls.user.save()
cls.token = Token.objects.get(user=cls.user)
cls.client = Client()
def assertDictEqualKeywise(self, d1, d2, excluded=None):
if not excluded:
excluded = []
self.assertEqual(set(d1.keys()).union(excluded), set(d2.keys()).union(excluded), "Key sets do not match")
for key in d1:
if key not in excluded:
value = d1[key]
covalue = d2[key]
self.assertEqual(value, covalue,
"Key '{key}' does not match.".format(key=key)) | import json
from django.test import TestCase, Client
# Create your tests here.
from login.models import myuser
from rest_framework.authtoken.models import Token
class APITestCase(TestCase):
test_schema = 'schema1'
test_table = 'population2'
@classmethod
def setUpClass(cls):
super(APITestCase, cls).setUpClass()
cls.user = myuser.objects.create(name='MrTest', mail_address='mrtest@test.com')
cls.user.save()
cls.token = Token.objects.get(user=cls.user)
cls.other_user = myuser.objects.create(name='NotMrTest', mail_address='notmrtest@test.com')
cls.other_user.save()
cls.other_token = Token.objects.get(user=cls.other_user)
cls.client = Client()
def assertDictEqualKeywise(self, d1, d2, excluded=None):
if not excluded:
excluded = []
self.assertEqual(set(d1.keys()).union(excluded), set(d2.keys()).union(excluded), "Key sets do not match")
for key in d1:
if key not in excluded:
value = d1[key]
covalue = d2[key]
self.assertEqual(value, covalue,
"Key '{key}' does not match.".format(key=key)) | Add another user for permission testing | Add another user for permission testing
| Python | agpl-3.0 | tom-heimbrodt/oeplatform,tom-heimbrodt/oeplatform,tom-heimbrodt/oeplatform,openego/oeplatform,openego/oeplatform,openego/oeplatform,openego/oeplatform | python | ## Code Before:
import json
from django.test import TestCase, Client
# Create your tests here.
from login.models import myuser
from rest_framework.authtoken.models import Token
class APITestCase(TestCase):
test_schema = 'schema1'
test_table = 'population2'
@classmethod
def setUpClass(cls):
super(APITestCase, cls).setUpClass()
cls.user = myuser.objects.create(name='MrTest')
cls.user.save()
cls.token = Token.objects.get(user=cls.user)
cls.client = Client()
def assertDictEqualKeywise(self, d1, d2, excluded=None):
if not excluded:
excluded = []
self.assertEqual(set(d1.keys()).union(excluded), set(d2.keys()).union(excluded), "Key sets do not match")
for key in d1:
if key not in excluded:
value = d1[key]
covalue = d2[key]
self.assertEqual(value, covalue,
"Key '{key}' does not match.".format(key=key))
## Instruction:
Add another user for permission testing
## Code After:
import json
from django.test import TestCase, Client
# Create your tests here.
from login.models import myuser
from rest_framework.authtoken.models import Token
class APITestCase(TestCase):
test_schema = 'schema1'
test_table = 'population2'
@classmethod
def setUpClass(cls):
super(APITestCase, cls).setUpClass()
cls.user = myuser.objects.create(name='MrTest', mail_address='mrtest@test.com')
cls.user.save()
cls.token = Token.objects.get(user=cls.user)
cls.other_user = myuser.objects.create(name='NotMrTest', mail_address='notmrtest@test.com')
cls.other_user.save()
cls.other_token = Token.objects.get(user=cls.other_user)
cls.client = Client()
def assertDictEqualKeywise(self, d1, d2, excluded=None):
if not excluded:
excluded = []
self.assertEqual(set(d1.keys()).union(excluded), set(d2.keys()).union(excluded), "Key sets do not match")
for key in d1:
if key not in excluded:
value = d1[key]
covalue = d2[key]
self.assertEqual(value, covalue,
"Key '{key}' does not match.".format(key=key)) | import json
from django.test import TestCase, Client
# Create your tests here.
from login.models import myuser
from rest_framework.authtoken.models import Token
class APITestCase(TestCase):
test_schema = 'schema1'
test_table = 'population2'
@classmethod
def setUpClass(cls):
super(APITestCase, cls).setUpClass()
- cls.user = myuser.objects.create(name='MrTest')
+ cls.user = myuser.objects.create(name='MrTest', mail_address='mrtest@test.com')
? ++++++++++++++++++++++++++++++++
cls.user.save()
cls.token = Token.objects.get(user=cls.user)
+
+ cls.other_user = myuser.objects.create(name='NotMrTest', mail_address='notmrtest@test.com')
+ cls.other_user.save()
+ cls.other_token = Token.objects.get(user=cls.other_user)
+
cls.client = Client()
def assertDictEqualKeywise(self, d1, d2, excluded=None):
if not excluded:
excluded = []
self.assertEqual(set(d1.keys()).union(excluded), set(d2.keys()).union(excluded), "Key sets do not match")
for key in d1:
if key not in excluded:
value = d1[key]
covalue = d2[key]
self.assertEqual(value, covalue,
"Key '{key}' does not match.".format(key=key)) | 7 | 0.194444 | 6 | 1 |
6388ebf131929da987221bf513914ef6e7bb0003 | .travis.yml | .travis.yml | language: c++
sudo: false
script:
- cmake -DCMAKE_BUILD_TYPE=Release
- make -j
- bin/unitTests
before_script:
- if [[ "${COMPILER}" != "" ]]; then export CXX=${COMPILER}; fi
matrix:
include:
- os: linux
compiler: gcc
addons: &gcc49
apt:
sources: ['ubuntu-toolchain-r-test']
packages: ['g++-4.9']
env: COMPILER='g++-4.9'
- os: linux
compiler: clang
addons: &clang35
apt:
sources: ['llvm-toolchain-precise-3.5', 'ubuntu-toolchain-r-test']
packages: ['clang-3.5']
env: COMPILER='clang++-3.5'
- os: osx
compiler: clang
addons:
apt:
sources:
- george-edison55-precise-backports
packages:
- cmake-data
- cmake | language: c++
sudo: false
script:
- cmake -DCMAKE_BUILD_TYPE=Release
- make -j
- bin/unitTests
before_script:
- if [[ "${COMPILER}" != "" ]]; then export CXX=${COMPILER}; fi
matrix:
include:
- os: linux
compiler: gcc
addons: &gcc49
apt:
sources: ['ubuntu-toolchain-r-test']
packages: ['g++-4.9']
env: COMPILER='g++-4.9'
- os: linux
compiler: clang
- os: osx
compiler: clang
addons:
apt:
sources:
- george-edison55-precise-backports
packages:
- cmake-data
- cmake | Speed up clang build on linux. | Speed up clang build on linux.
| YAML | mit | emilrowland/LifeSimulator | yaml | ## Code Before:
language: c++
sudo: false
script:
- cmake -DCMAKE_BUILD_TYPE=Release
- make -j
- bin/unitTests
before_script:
- if [[ "${COMPILER}" != "" ]]; then export CXX=${COMPILER}; fi
matrix:
include:
- os: linux
compiler: gcc
addons: &gcc49
apt:
sources: ['ubuntu-toolchain-r-test']
packages: ['g++-4.9']
env: COMPILER='g++-4.9'
- os: linux
compiler: clang
addons: &clang35
apt:
sources: ['llvm-toolchain-precise-3.5', 'ubuntu-toolchain-r-test']
packages: ['clang-3.5']
env: COMPILER='clang++-3.5'
- os: osx
compiler: clang
addons:
apt:
sources:
- george-edison55-precise-backports
packages:
- cmake-data
- cmake
## Instruction:
Speed up clang build on linux.
## Code After:
language: c++
sudo: false
script:
- cmake -DCMAKE_BUILD_TYPE=Release
- make -j
- bin/unitTests
before_script:
- if [[ "${COMPILER}" != "" ]]; then export CXX=${COMPILER}; fi
matrix:
include:
- os: linux
compiler: gcc
addons: &gcc49
apt:
sources: ['ubuntu-toolchain-r-test']
packages: ['g++-4.9']
env: COMPILER='g++-4.9'
- os: linux
compiler: clang
- os: osx
compiler: clang
addons:
apt:
sources:
- george-edison55-precise-backports
packages:
- cmake-data
- cmake | language: c++
sudo: false
script:
- cmake -DCMAKE_BUILD_TYPE=Release
- make -j
- bin/unitTests
before_script:
- if [[ "${COMPILER}" != "" ]]; then export CXX=${COMPILER}; fi
matrix:
include:
- os: linux
compiler: gcc
addons: &gcc49
apt:
sources: ['ubuntu-toolchain-r-test']
packages: ['g++-4.9']
env: COMPILER='g++-4.9'
- os: linux
compiler: clang
- addons: &clang35
- apt:
- sources: ['llvm-toolchain-precise-3.5', 'ubuntu-toolchain-r-test']
- packages: ['clang-3.5']
- env: COMPILER='clang++-3.5'
- os: osx
compiler: clang
addons:
apt:
sources:
- george-edison55-precise-backports
packages:
- cmake-data
- cmake | 5 | 0.138889 | 0 | 5 |
cff6281155104eb548924f05f3da2316e95fddf2 | app/views/time_slots/new.html.erb | app/views/time_slots/new.html.erb | <% title "New Time Slot" %>
<h2>Add New Time Slot</h2>
<% if current_user.is_admin_of?(@department) %>
<p>
<%= check_box_tag :repeating_event, false, false, onclick: "$('.toggle_me').toggle()" %> Repeating event?
</p>
<% end %>
<div class="toggle_me">
<%= form_for @time_slot do |f| %>
<%= f.error_messages %>
<%= render partial: 'time_slots/form', locals: {f: f} %>
<p><%= submit_tag "Create New" %></p>
<% end %>
</div>
<% if current_user.is_admin_of?(@department) %>
<div class="toggle_me" style="display:none">
<%= form_for (@repeating_event = RepeatingEvent.new) do |f| %>
<%= render partial: 'time_slots/form_repeating', locals: {f: f} %>
<%= hidden_field_tag 'rerender_date', params[:date] %>
<p><%= submit_tag "Create New Repeating Event", url: { controller: 'repeating_events', action: 'create', calendar: params[:calendar] } %></p>
<% end %>
</div>
<% end %>
<p><%= link_to "Back to List", time_slots_path %></p>
| <% title "New Time Slot" %>
<% if current_user.is_admin_of?(@department) %>
<p>
<%= check_box_tag :repeating_event, false, false, onclick: "$('.toggle_me').toggle()" %> Repeating event?
</p>
<% end %>
<div class="toggle_me">
<%= form_for @time_slot do |f| %>
<%= f.error_messages %>
<%= render partial: 'time_slots/form', locals: {f: f} %>
<p><%= submit_tag "Create New" %></p>
<% end %>
</div>
<% if current_user.is_admin_of?(@department) %>
<div class="toggle_me" style="display:none">
<%= form_for (@repeating_event = RepeatingEvent.new) do |f| %>
<%= render partial: 'time_slots/form_repeating', locals: {f: f} %>
<%= hidden_field_tag 'rerender_date', params[:date] %>
<p><%= submit_tag "Create New Repeating Event", url: { controller: 'repeating_events', action: 'create', calendar: params[:calendar] } %></p>
<% end %>
</div>
<% end %>
<p><%= link_to "Back to List", time_slots_path %></p>
| Remove redundant title in new form | Remove redundant title in new form
| HTML+ERB | mit | YaleSTC/shifts,YaleSTC/shifts,YaleSTC/shifts | html+erb | ## Code Before:
<% title "New Time Slot" %>
<h2>Add New Time Slot</h2>
<% if current_user.is_admin_of?(@department) %>
<p>
<%= check_box_tag :repeating_event, false, false, onclick: "$('.toggle_me').toggle()" %> Repeating event?
</p>
<% end %>
<div class="toggle_me">
<%= form_for @time_slot do |f| %>
<%= f.error_messages %>
<%= render partial: 'time_slots/form', locals: {f: f} %>
<p><%= submit_tag "Create New" %></p>
<% end %>
</div>
<% if current_user.is_admin_of?(@department) %>
<div class="toggle_me" style="display:none">
<%= form_for (@repeating_event = RepeatingEvent.new) do |f| %>
<%= render partial: 'time_slots/form_repeating', locals: {f: f} %>
<%= hidden_field_tag 'rerender_date', params[:date] %>
<p><%= submit_tag "Create New Repeating Event", url: { controller: 'repeating_events', action: 'create', calendar: params[:calendar] } %></p>
<% end %>
</div>
<% end %>
<p><%= link_to "Back to List", time_slots_path %></p>
## Instruction:
Remove redundant title in new form
## Code After:
<% title "New Time Slot" %>
<% if current_user.is_admin_of?(@department) %>
<p>
<%= check_box_tag :repeating_event, false, false, onclick: "$('.toggle_me').toggle()" %> Repeating event?
</p>
<% end %>
<div class="toggle_me">
<%= form_for @time_slot do |f| %>
<%= f.error_messages %>
<%= render partial: 'time_slots/form', locals: {f: f} %>
<p><%= submit_tag "Create New" %></p>
<% end %>
</div>
<% if current_user.is_admin_of?(@department) %>
<div class="toggle_me" style="display:none">
<%= form_for (@repeating_event = RepeatingEvent.new) do |f| %>
<%= render partial: 'time_slots/form_repeating', locals: {f: f} %>
<%= hidden_field_tag 'rerender_date', params[:date] %>
<p><%= submit_tag "Create New Repeating Event", url: { controller: 'repeating_events', action: 'create', calendar: params[:calendar] } %></p>
<% end %>
</div>
<% end %>
<p><%= link_to "Back to List", time_slots_path %></p>
| <% title "New Time Slot" %>
- <h2>Add New Time Slot</h2>
<% if current_user.is_admin_of?(@department) %>
<p>
<%= check_box_tag :repeating_event, false, false, onclick: "$('.toggle_me').toggle()" %> Repeating event?
</p>
<% end %>
<div class="toggle_me">
<%= form_for @time_slot do |f| %>
<%= f.error_messages %>
<%= render partial: 'time_slots/form', locals: {f: f} %>
<p><%= submit_tag "Create New" %></p>
<% end %>
</div>
<% if current_user.is_admin_of?(@department) %>
<div class="toggle_me" style="display:none">
<%= form_for (@repeating_event = RepeatingEvent.new) do |f| %>
<%= render partial: 'time_slots/form_repeating', locals: {f: f} %>
<%= hidden_field_tag 'rerender_date', params[:date] %>
<p><%= submit_tag "Create New Repeating Event", url: { controller: 'repeating_events', action: 'create', calendar: params[:calendar] } %></p>
<% end %>
</div>
<% end %>
<p><%= link_to "Back to List", time_slots_path %></p> | 1 | 0.033333 | 0 | 1 |
764163154e8b7dfbd23c7a5bd2bef49fdc454400 | consul-registrator-setup/docker-compose.yml | consul-registrator-setup/docker-compose.yml | consul:
image: consul:v0.6.4
container_name: consul
hostname: consul
command: "agent"
ports:
- "8400:8400"
- "8500:8500"
- "8600:8600/udp"
environment:
SERVICE_IGNORE: "yes"
volumes:
- ./consul.json:/consul/config/config.json
# http://gliderlabs.com/registrator/latest/user/run/
registrator:
image: gliderlabs/registrator:v7
container_name: registrator
hostname: registrator
command: "-internal -cleanup=true -retry-attempts -1 -resync 2 -ttl 10 -ttl-refresh 2 consul://consul:8500"
links:
- consul
volumes:
- /var/run/docker.sock:/tmp/docker.sock
dns:
- 172.17.0.1
- 8.8.8.8
dns_search: service.docker
| consul:
image: consul:1.7
container_name: consul
hostname: consul
command: "agent"
ports:
- "8400:8400"
- "8500:8500"
- "8600:8600/udp"
environment:
SERVICE_IGNORE: "yes"
volumes:
- ./consul.json:/consul/config/config.json
# http://gliderlabs.com/registrator/latest/user/run/
registrator:
image: gliderlabs/registrator:latest
container_name: registrator
hostname: registrator
command: "-internal -cleanup=true -retry-attempts -1 -resync 2 -ttl 10 -ttl-refresh 2 consul://consul:8500"
links:
- consul
volumes:
- /var/run/docker.sock:/tmp/docker.sock
dns:
- 172.17.0.1
- 8.8.8.8
dns_search: service.docker
| Update consul and registrator versions | Update consul and registrator versions
| YAML | mit | adamaig/dev-with-docker-on-ubuntu | yaml | ## Code Before:
consul:
image: consul:v0.6.4
container_name: consul
hostname: consul
command: "agent"
ports:
- "8400:8400"
- "8500:8500"
- "8600:8600/udp"
environment:
SERVICE_IGNORE: "yes"
volumes:
- ./consul.json:/consul/config/config.json
# http://gliderlabs.com/registrator/latest/user/run/
registrator:
image: gliderlabs/registrator:v7
container_name: registrator
hostname: registrator
command: "-internal -cleanup=true -retry-attempts -1 -resync 2 -ttl 10 -ttl-refresh 2 consul://consul:8500"
links:
- consul
volumes:
- /var/run/docker.sock:/tmp/docker.sock
dns:
- 172.17.0.1
- 8.8.8.8
dns_search: service.docker
## Instruction:
Update consul and registrator versions
## Code After:
consul:
image: consul:1.7
container_name: consul
hostname: consul
command: "agent"
ports:
- "8400:8400"
- "8500:8500"
- "8600:8600/udp"
environment:
SERVICE_IGNORE: "yes"
volumes:
- ./consul.json:/consul/config/config.json
# http://gliderlabs.com/registrator/latest/user/run/
registrator:
image: gliderlabs/registrator:latest
container_name: registrator
hostname: registrator
command: "-internal -cleanup=true -retry-attempts -1 -resync 2 -ttl 10 -ttl-refresh 2 consul://consul:8500"
links:
- consul
volumes:
- /var/run/docker.sock:/tmp/docker.sock
dns:
- 172.17.0.1
- 8.8.8.8
dns_search: service.docker
| consul:
- image: consul:v0.6.4
? ^^ ^^^
+ image: consul:1.7
? ^ ^
container_name: consul
hostname: consul
command: "agent"
ports:
- "8400:8400"
- "8500:8500"
- "8600:8600/udp"
environment:
SERVICE_IGNORE: "yes"
volumes:
- ./consul.json:/consul/config/config.json
# http://gliderlabs.com/registrator/latest/user/run/
registrator:
- image: gliderlabs/registrator:v7
? ^^
+ image: gliderlabs/registrator:latest
? ^^^^^^
container_name: registrator
hostname: registrator
command: "-internal -cleanup=true -retry-attempts -1 -resync 2 -ttl 10 -ttl-refresh 2 consul://consul:8500"
links:
- consul
volumes:
- /var/run/docker.sock:/tmp/docker.sock
dns:
- 172.17.0.1
- 8.8.8.8
dns_search: service.docker | 4 | 0.142857 | 2 | 2 |
8054982b3aa106a9551e792f6453993484a17f2a | tests/unit/test_factory.py | tests/unit/test_factory.py | import os
import flask
import json
import pytest
from app import create_app
def test_default_config():
"""Test the default config class is the DevelopmentConfig"""
assert isinstance(create_app(), flask.app.Flask)
def test_testing_config():
"""Test the app.testing variable is set when using the testing config."""
assert create_app(config_name='testing', jobs_enabled=False).testing | import os
import flask
import json
import pytest
from app import create_app
@pytest.mark.skip(reason="Scheduler is not functioning and needs to be replaced.")
def test_default_config():
"""Test the default config class is the DevelopmentConfig"""
assert isinstance(create_app(), flask.app.Flask)
def test_testing_config():
"""Test the app.testing variable is set when using the testing config."""
assert create_app(config_name='testing', jobs_enabled=False).testing | Mark test_default_config as skip; Scheduler needs to be rewritten | Mark test_default_config as skip; Scheduler needs to be rewritten
| Python | apache-2.0 | CityOfNewYork/NYCOpenRecords,CityOfNewYork/NYCOpenRecords,CityOfNewYork/NYCOpenRecords,CityOfNewYork/NYCOpenRecords,CityOfNewYork/NYCOpenRecords | python | ## Code Before:
import os
import flask
import json
import pytest
from app import create_app
def test_default_config():
"""Test the default config class is the DevelopmentConfig"""
assert isinstance(create_app(), flask.app.Flask)
def test_testing_config():
"""Test the app.testing variable is set when using the testing config."""
assert create_app(config_name='testing', jobs_enabled=False).testing
## Instruction:
Mark test_default_config as skip; Scheduler needs to be rewritten
## Code After:
import os
import flask
import json
import pytest
from app import create_app
@pytest.mark.skip(reason="Scheduler is not functioning and needs to be replaced.")
def test_default_config():
"""Test the default config class is the DevelopmentConfig"""
assert isinstance(create_app(), flask.app.Flask)
def test_testing_config():
"""Test the app.testing variable is set when using the testing config."""
assert create_app(config_name='testing', jobs_enabled=False).testing | import os
import flask
import json
import pytest
from app import create_app
-
+ @pytest.mark.skip(reason="Scheduler is not functioning and needs to be replaced.")
def test_default_config():
"""Test the default config class is the DevelopmentConfig"""
assert isinstance(create_app(), flask.app.Flask)
def test_testing_config():
"""Test the app.testing variable is set when using the testing config."""
assert create_app(config_name='testing', jobs_enabled=False).testing | 2 | 0.111111 | 1 | 1 |
375fdcd825a877a4200c391b18996d17658788c5 | dev-requirements.txt | dev-requirements.txt | dash_core_components==0.26.0
dash_html_components==0.11.0rc5
dash==0.23.1
percy
selenium
mock
six
| dash_core_components
dash_html_components
dash
percy
selenium
mock
six
| Update Dash ecosystem requirements to latest | Update Dash ecosystem requirements to latest
| Text | mit | plotly/dash,plotly/dash,plotly/dash,plotly/dash,plotly/dash | text | ## Code Before:
dash_core_components==0.26.0
dash_html_components==0.11.0rc5
dash==0.23.1
percy
selenium
mock
six
## Instruction:
Update Dash ecosystem requirements to latest
## Code After:
dash_core_components
dash_html_components
dash
percy
selenium
mock
six
| - dash_core_components==0.26.0
? --------
+ dash_core_components
- dash_html_components==0.11.0rc5
? -----------
+ dash_html_components
- dash==0.23.1
+ dash
percy
selenium
mock
six | 6 | 0.857143 | 3 | 3 |
f86f4873e8aef4d54ea45abc70b5f45af204186a | fitz/stm_filter.c | fitz/stm_filter.c |
fz_error
fz_process(fz_filter *f, fz_buffer *in, fz_buffer *out)
{
fz_error reason;
unsigned char *oldrp;
unsigned char *oldwp;
assert(!out->eof);
oldrp = in->rp;
oldwp = out->wp;
if (f->done)
return fz_iodone;
reason = f->process(f, in, out);
assert(in->rp <= in->wp);
assert(out->wp <= out->ep);
f->consumed = in->rp > oldrp;
f->produced = out->wp > oldwp;
f->count += out->wp - oldwp;
/* iodone or error */
if (reason != fz_ioneedin && reason != fz_ioneedout)
{
if (reason != fz_iodone)
reason = fz_rethrow(reason, "cannot process filter");
out->eof = 1;
f->done = 1;
}
return reason;
}
fz_filter *
fz_keepfilter(fz_filter *f)
{
f->refs ++;
return f;
}
void
fz_dropfilter(fz_filter *f)
{
if (--f->refs == 0)
{
if (f->drop)
f->drop(f);
fz_free(f);
}
}
|
fz_error
fz_process(fz_filter *f, fz_buffer *in, fz_buffer *out)
{
fz_error reason;
unsigned char *oldrp;
unsigned char *oldwp;
oldrp = in->rp;
oldwp = out->wp;
if (f->done)
return fz_iodone;
assert(!out->eof);
reason = f->process(f, in, out);
assert(in->rp <= in->wp);
assert(out->wp <= out->ep);
f->consumed = in->rp > oldrp;
f->produced = out->wp > oldwp;
f->count += out->wp - oldwp;
/* iodone or error */
if (reason != fz_ioneedin && reason != fz_ioneedout)
{
if (reason != fz_iodone)
reason = fz_rethrow(reason, "cannot process filter");
out->eof = 1;
f->done = 1;
}
return reason;
}
fz_filter *
fz_keepfilter(fz_filter *f)
{
f->refs ++;
return f;
}
void
fz_dropfilter(fz_filter *f)
{
if (--f->refs == 0)
{
if (f->drop)
f->drop(f);
fz_free(f);
}
}
| Move assert test of EOF to after testing if a filter is done. | Move assert test of EOF to after testing if a filter is done.
| C | agpl-3.0 | tophyr/mupdf,PuzzleFlow/mupdf,derek-watson/mupdf,cgogolin/penandpdf,ArtifexSoftware/mupdf,asbloomf/mupdf,lamemate/mupdf,lolo32/mupdf-mirror,lolo32/mupdf-mirror,loungeup/mupdf,knielsen/mupdf,kobolabs/mupdf,asbloomf/mupdf,TamirEvan/mupdf,hjiayz/forkmupdf,hackqiang/mupdf,seagullua/MuPDF,MokiMobility/muPDF,nqv/mupdf,sebras/mupdf,poor-grad-student/mupdf,benoit-pierre/mupdf,Kalp695/mupdf,ArtifexSoftware/mupdf,lolo32/mupdf-mirror,robamler/mupdf-nacl,ArtifexSoftware/mupdf,Kalp695/mupdf,derek-watson/mupdf,TamirEvan/mupdf,isavin/humblepdf,knielsen/mupdf,tribals/mupdf,lolo32/mupdf-mirror,lamemate/mupdf,michaelcadilhac/pdfannot,seagullua/MuPDF,geetakaur/NewApps,cgogolin/penandpdf,MokiMobility/muPDF,benoit-pierre/mupdf,michaelcadilhac/pdfannot,ArtifexSoftware/mupdf,PuzzleFlow/mupdf,FabriceSalvaire/mupdf-cmake,robamler/mupdf-nacl,lustersir/MuPDF,xiangxw/mupdf,wild0/opened_mupdf,FabriceSalvaire/mupdf-v1.3,poor-grad-student/mupdf,cgogolin/penandpdf,clchiou/mupdf,derek-watson/mupdf,sebras/mupdf,isavin/humblepdf,hxx0215/MuPDFMirror,FabriceSalvaire/mupdf-cmake,wzhsunn/mupdf,clchiou/mupdf,lamemate/mupdf,tophyr/mupdf,FabriceSalvaire/mupdf-v1.3,geetakaur/NewApps,wild0/opened_mupdf,FabriceSalvaire/mupdf-cmake,issuestand/mupdf,FabriceSalvaire/mupdf-v1.3,github201407/MuPDF,seagullua/MuPDF,TamirEvan/mupdf,github201407/MuPDF,cgogolin/penandpdf,zeniko/mupdf,fluks/mupdf-x11-bookmarks,ccxvii/mupdf,TamirEvan/mupdf,asbloomf/mupdf,wzhsunn/mupdf,ArtifexSoftware/mupdf,flipstudio/MuPDF,hjiayz/forkmupdf,xiangxw/mupdf,lolo32/mupdf-mirror,ziel/mupdf,benoit-pierre/mupdf,sebras/mupdf,fluks/mupdf-x11-bookmarks,wzhsunn/mupdf,flipstudio/MuPDF,kobolabs/mupdf,fluks/mupdf-x11-bookmarks,flipstudio/MuPDF,github201407/MuPDF,hxx0215/MuPDFMirror,crow-misia/mupdf,ArtifexSoftware/mupdf,hxx0215/MuPDFMirror,xiangxw/mupdf,Kalp695/mupdf,hackqiang/mupdf,lolo32/mupdf-mirror,zeniko/mupdf,seagullua/MuPDF,muennich/mupdf,sebras/mupdf,MokiMobility/muPDF,issuestand/mupdf,ylixir/mupdf,ccxvii/mupdf,ziel/mupdf,asbloo
mf/mupdf,samturneruk/mupdf_secure_android,clchiou/mupdf,flipstudio/MuPDF,samturneruk/mupdf_secure_android,nqv/mupdf,hackqiang/mupdf,crow-misia/mupdf,nqv/mupdf,isavin/humblepdf,seagullua/MuPDF,geetakaur/NewApps,TamirEvan/mupdf,ylixir/mupdf,robamler/mupdf-nacl,ccxvii/mupdf,hackqiang/mupdf,seagullua/MuPDF,Kalp695/mupdf,FabriceSalvaire/mupdf-v1.3,crow-misia/mupdf,benoit-pierre/mupdf,ArtifexSoftware/mupdf,tophyr/mupdf,derek-watson/mupdf,samturneruk/mupdf_secure_android,isavin/humblepdf,poor-grad-student/mupdf,loungeup/mupdf,ziel/mupdf,lustersir/MuPDF,ziel/mupdf,ylixir/mupdf,TamirEvan/mupdf,knielsen/mupdf,wzhsunn/mupdf,lustersir/MuPDF,TamirEvan/mupdf,loungeup/mupdf,andyhan/mupdf,knielsen/mupdf,lustersir/MuPDF,tribals/mupdf,derek-watson/mupdf,wzhsunn/mupdf,xiangxw/mupdf,FabriceSalvaire/mupdf-v1.3,PuzzleFlow/mupdf,zeniko/mupdf,zeniko/mupdf,hjiayz/forkmupdf,FabriceSalvaire/mupdf-cmake,asbloomf/mupdf,wzhsunn/mupdf,muennich/mupdf,knielsen/mupdf,andyhan/mupdf,wild0/opened_mupdf,andyhan/mupdf,crow-misia/mupdf,ccxvii/mupdf,kobolabs/mupdf,MokiMobility/muPDF,ylixir/mupdf,loungeup/mupdf,samturneruk/mupdf_secure_android,sebras/mupdf,samturneruk/mupdf_secure_android,flipstudio/MuPDF,xiangxw/mupdf,wild0/opened_mupdf,kobolabs/mupdf,wild0/opened_mupdf,lamemate/mupdf,samturneruk/mupdf_secure_android,github201407/MuPDF,crow-misia/mupdf,hackqiang/mupdf,wild0/opened_mupdf,issuestand/mupdf,andyhan/mupdf,benoit-pierre/mupdf,fluks/mupdf-x11-bookmarks,muennich/mupdf,lolo32/mupdf-mirror,benoit-pierre/mupdf,hjiayz/forkmupdf,PuzzleFlow/mupdf,michaelcadilhac/pdfannot,andyhan/mupdf,Kalp695/mupdf,hxx0215/MuPDFMirror,FabriceSalvaire/mupdf-v1.3,zeniko/mupdf,derek-watson/mupdf,FabriceSalvaire/mupdf-cmake,lamemate/mupdf,cgogolin/penandpdf,MokiMobility/muPDF,robamler/mupdf-nacl,cgogolin/penandpdf,lustersir/MuPDF,clchiou/mupdf,tribals/mupdf,tophyr/mupdf,ArtifexSoftware/mupdf,fluks/mupdf-x11-bookmarks,tophyr/mupdf,zeniko/mupdf,loungeup/mupdf,FabriceSalvaire/mupdf-cmake,sebras/mupdf,kobolabs/mupdf,geetakaur/N
ewApps,ylixir/mupdf,issuestand/mupdf,github201407/MuPDF,tribals/mupdf,ccxvii/mupdf,poor-grad-student/mupdf,poor-grad-student/mupdf,tophyr/mupdf,poor-grad-student/mupdf,ziel/mupdf,hxx0215/MuPDFMirror,Kalp695/mupdf,muennich/mupdf,kobolabs/mupdf,hxx0215/MuPDFMirror,issuestand/mupdf,hjiayz/forkmupdf,hjiayz/forkmupdf,isavin/humblepdf,knielsen/mupdf,nqv/mupdf,kobolabs/mupdf,xiangxw/mupdf,PuzzleFlow/mupdf,github201407/MuPDF,muennich/mupdf,nqv/mupdf,isavin/humblepdf,geetakaur/NewApps,wild0/opened_mupdf,muennich/mupdf,fluks/mupdf-x11-bookmarks,geetakaur/NewApps,fluks/mupdf-x11-bookmarks,michaelcadilhac/pdfannot,issuestand/mupdf,michaelcadilhac/pdfannot,robamler/mupdf-nacl,xiangxw/mupdf,isavin/humblepdf,asbloomf/mupdf,clchiou/mupdf,tribals/mupdf,clchiou/mupdf,muennich/mupdf,ccxvii/mupdf,ziel/mupdf,ylixir/mupdf,michaelcadilhac/pdfannot,nqv/mupdf,robamler/mupdf-nacl,MokiMobility/muPDF,andyhan/mupdf,TamirEvan/mupdf,tribals/mupdf,PuzzleFlow/mupdf,crow-misia/mupdf,Kalp695/mupdf,lamemate/mupdf,hackqiang/mupdf,loungeup/mupdf,flipstudio/MuPDF,lustersir/MuPDF,PuzzleFlow/mupdf | c | ## Code Before:
fz_error
fz_process(fz_filter *f, fz_buffer *in, fz_buffer *out)
{
fz_error reason;
unsigned char *oldrp;
unsigned char *oldwp;
assert(!out->eof);
oldrp = in->rp;
oldwp = out->wp;
if (f->done)
return fz_iodone;
reason = f->process(f, in, out);
assert(in->rp <= in->wp);
assert(out->wp <= out->ep);
f->consumed = in->rp > oldrp;
f->produced = out->wp > oldwp;
f->count += out->wp - oldwp;
/* iodone or error */
if (reason != fz_ioneedin && reason != fz_ioneedout)
{
if (reason != fz_iodone)
reason = fz_rethrow(reason, "cannot process filter");
out->eof = 1;
f->done = 1;
}
return reason;
}
fz_filter *
fz_keepfilter(fz_filter *f)
{
f->refs ++;
return f;
}
void
fz_dropfilter(fz_filter *f)
{
if (--f->refs == 0)
{
if (f->drop)
f->drop(f);
fz_free(f);
}
}
## Instruction:
Move assert test of EOF to after testing if a filter is done.
## Code After:
fz_error
fz_process(fz_filter *f, fz_buffer *in, fz_buffer *out)
{
fz_error reason;
unsigned char *oldrp;
unsigned char *oldwp;
oldrp = in->rp;
oldwp = out->wp;
if (f->done)
return fz_iodone;
assert(!out->eof);
reason = f->process(f, in, out);
assert(in->rp <= in->wp);
assert(out->wp <= out->ep);
f->consumed = in->rp > oldrp;
f->produced = out->wp > oldwp;
f->count += out->wp - oldwp;
/* iodone or error */
if (reason != fz_ioneedin && reason != fz_ioneedout)
{
if (reason != fz_iodone)
reason = fz_rethrow(reason, "cannot process filter");
out->eof = 1;
f->done = 1;
}
return reason;
}
fz_filter *
fz_keepfilter(fz_filter *f)
{
f->refs ++;
return f;
}
void
fz_dropfilter(fz_filter *f)
{
if (--f->refs == 0)
{
if (f->drop)
f->drop(f);
fz_free(f);
}
}
|
fz_error
fz_process(fz_filter *f, fz_buffer *in, fz_buffer *out)
{
fz_error reason;
unsigned char *oldrp;
unsigned char *oldwp;
- assert(!out->eof);
-
oldrp = in->rp;
oldwp = out->wp;
if (f->done)
return fz_iodone;
+
+ assert(!out->eof);
reason = f->process(f, in, out);
assert(in->rp <= in->wp);
assert(out->wp <= out->ep);
f->consumed = in->rp > oldrp;
f->produced = out->wp > oldwp;
f->count += out->wp - oldwp;
/* iodone or error */
if (reason != fz_ioneedin && reason != fz_ioneedout)
{
if (reason != fz_iodone)
reason = fz_rethrow(reason, "cannot process filter");
out->eof = 1;
f->done = 1;
}
return reason;
}
fz_filter *
fz_keepfilter(fz_filter *f)
{
f->refs ++;
return f;
}
void
fz_dropfilter(fz_filter *f)
{
if (--f->refs == 0)
{
if (f->drop)
f->drop(f);
fz_free(f);
}
}
| 4 | 0.072727 | 2 | 2 |
ae89d5de1a9d248951bed0b992500121860c47d4 | tvsort_sl/messages.py | tvsort_sl/messages.py | import os
from sendgrid import sendgrid, Email
from sendgrid.helpers.mail import Content, Mail, MailSettings, SandBoxMode
def send_email(subject, content):
sg = sendgrid.SendGridAPIClient(apikey=os.environ.get('SENDGRID_API_KEY'))
from_email = Email(name='TV sort', email='tvsortsl@gmail.com')
to_email = Email(name='TV sort', email='tvsortsl@gmail.com')
content = Content("text/plain", content)
mail = Mail(from_email, subject, to_email, content)
sand_box = os.environ.get('SAND_BOX')
if sand_box == 'true':
mail_settings = MailSettings()
mail_settings.sandbox_mode = SandBoxMode(True)
mail.mail_settings = mail_settings
return sg.client.mail.send.post(request_body=mail.get())
| import os
from sendgrid import sendgrid, Email
from sendgrid.helpers.mail import Content, Mail, MailSettings, SandBoxMode
def send_email(subject, content):
sg = sendgrid.SendGridAPIClient(apikey=os.environ.get('SENDGRID_API_KEY'))
from_email = Email(name='TV sort', email='tvsortsl@gmail.com')
to_email = Email(name='TV sort', email='tvsortsl@gmail.com')
content = Content("text/plain", content)
mail = Mail(from_email, subject, to_email, content)
sand_box = os.environ.get('SAND_BOX')
print(os.environ.get('SENDGRID_API_KEY'))
print(os.environ.get('SAND_BOX'))
if sand_box == 'true':
mail_settings = MailSettings()
mail_settings.sandbox_mode = SandBoxMode(True)
mail.mail_settings = mail_settings
return sg.client.mail.send.post(request_body=mail.get())
| Add debug prints to send_email | Add debug prints to send_email
| Python | mit | shlomiLan/tvsort_sl | python | ## Code Before:
import os
from sendgrid import sendgrid, Email
from sendgrid.helpers.mail import Content, Mail, MailSettings, SandBoxMode
def send_email(subject, content):
sg = sendgrid.SendGridAPIClient(apikey=os.environ.get('SENDGRID_API_KEY'))
from_email = Email(name='TV sort', email='tvsortsl@gmail.com')
to_email = Email(name='TV sort', email='tvsortsl@gmail.com')
content = Content("text/plain", content)
mail = Mail(from_email, subject, to_email, content)
sand_box = os.environ.get('SAND_BOX')
if sand_box == 'true':
mail_settings = MailSettings()
mail_settings.sandbox_mode = SandBoxMode(True)
mail.mail_settings = mail_settings
return sg.client.mail.send.post(request_body=mail.get())
## Instruction:
Add debug prints to send_email
## Code After:
import os
from sendgrid import sendgrid, Email
from sendgrid.helpers.mail import Content, Mail, MailSettings, SandBoxMode
def send_email(subject, content):
sg = sendgrid.SendGridAPIClient(apikey=os.environ.get('SENDGRID_API_KEY'))
from_email = Email(name='TV sort', email='tvsortsl@gmail.com')
to_email = Email(name='TV sort', email='tvsortsl@gmail.com')
content = Content("text/plain", content)
mail = Mail(from_email, subject, to_email, content)
sand_box = os.environ.get('SAND_BOX')
print(os.environ.get('SENDGRID_API_KEY'))
print(os.environ.get('SAND_BOX'))
if sand_box == 'true':
mail_settings = MailSettings()
mail_settings.sandbox_mode = SandBoxMode(True)
mail.mail_settings = mail_settings
return sg.client.mail.send.post(request_body=mail.get())
| import os
from sendgrid import sendgrid, Email
from sendgrid.helpers.mail import Content, Mail, MailSettings, SandBoxMode
def send_email(subject, content):
sg = sendgrid.SendGridAPIClient(apikey=os.environ.get('SENDGRID_API_KEY'))
from_email = Email(name='TV sort', email='tvsortsl@gmail.com')
to_email = Email(name='TV sort', email='tvsortsl@gmail.com')
content = Content("text/plain", content)
mail = Mail(from_email, subject, to_email, content)
sand_box = os.environ.get('SAND_BOX')
+
+ print(os.environ.get('SENDGRID_API_KEY'))
+ print(os.environ.get('SAND_BOX'))
+
if sand_box == 'true':
mail_settings = MailSettings()
mail_settings.sandbox_mode = SandBoxMode(True)
mail.mail_settings = mail_settings
return sg.client.mail.send.post(request_body=mail.get()) | 4 | 0.210526 | 4 | 0 |
aec2dd3853efe31c06373c6b999b4f2a9451debf | src/connect/index.js | src/connect/index.js | import Socket from './socket'
import { actions } from '../store/actions'
import {
supportsWebSockets,
supportsGetUserMedia
} from '../utils/feature-detection'
const {
setToken,
setWebSocketSupport,
setGumSupport,
setAuthenticated
} = actions
function setSupport() {
setWebSocketSupport(supportsWebSockets)
setGumSupport(supportsGetUserMedia)
}
export default function connect(jwt) {
setSupport()
if (supportsWebSockets) {
const socket = new Socket
socket.connect(jwt)
setToken(jwt)
setAuthenticated(true)
return socket
} else {
// console.warn('WebSockets not supported')
}
}
| import Socket from './socket'
import { actions } from '../store/actions'
import {
supportsWebSockets,
supportsGetUserMedia
} from '../utils/feature-detection'
const { setWebSocketSupport, setGumSupport } = actions
function setSupport() {
setWebSocketSupport(supportsWebSockets)
setGumSupport(supportsGetUserMedia)
}
export default function connect(jwt) {
setSupport()
try {
if (!supportsWebSockets) throw 'WebSockets not supported'
const socket = new Socket
socket.connect(jwt)
return socket
}
catch(err) {
console.log(err)
}
}
| Use try/catch for initiating WebSocket | Use try/catch for initiating WebSocket
| JavaScript | mit | onfido/onfido-sdk-core | javascript | ## Code Before:
import Socket from './socket'
import { actions } from '../store/actions'
import {
supportsWebSockets,
supportsGetUserMedia
} from '../utils/feature-detection'
const {
setToken,
setWebSocketSupport,
setGumSupport,
setAuthenticated
} = actions
function setSupport() {
setWebSocketSupport(supportsWebSockets)
setGumSupport(supportsGetUserMedia)
}
export default function connect(jwt) {
setSupport()
if (supportsWebSockets) {
const socket = new Socket
socket.connect(jwt)
setToken(jwt)
setAuthenticated(true)
return socket
} else {
// console.warn('WebSockets not supported')
}
}
## Instruction:
Use try/catch for initiating WebSocket
## Code After:
import Socket from './socket'
import { actions } from '../store/actions'
import {
supportsWebSockets,
supportsGetUserMedia
} from '../utils/feature-detection'
const { setWebSocketSupport, setGumSupport } = actions
function setSupport() {
setWebSocketSupport(supportsWebSockets)
setGumSupport(supportsGetUserMedia)
}
export default function connect(jwt) {
setSupport()
try {
if (!supportsWebSockets) throw 'WebSockets not supported'
const socket = new Socket
socket.connect(jwt)
return socket
}
catch(err) {
console.log(err)
}
}
| import Socket from './socket'
import { actions } from '../store/actions'
import {
supportsWebSockets,
supportsGetUserMedia
} from '../utils/feature-detection'
+ const { setWebSocketSupport, setGumSupport } = actions
- const {
- setToken,
- setWebSocketSupport,
- setGumSupport,
- setAuthenticated
- } = actions
function setSupport() {
setWebSocketSupport(supportsWebSockets)
setGumSupport(supportsGetUserMedia)
}
export default function connect(jwt) {
setSupport()
- if (supportsWebSockets) {
+ try {
+ if (!supportsWebSockets) throw 'WebSockets not supported'
const socket = new Socket
socket.connect(jwt)
- setToken(jwt)
- setAuthenticated(true)
return socket
- } else {
- // console.warn('WebSockets not supported')
+ }
+ catch(err) {
+ console.log(err)
}
} | 17 | 0.548387 | 6 | 11 |
56c8ca132599e2bb722cb0605c65e86d4fd1a031 | lib/overlord/connection/finalized_connection.rb | lib/overlord/connection/finalized_connection.rb | module CarbonMU
class Connection
private
class FinalizedConnection < ConnectionState
def handle_input(input)
command_context = CommandContext.new(@conn)
Parser.parse(input, command_context)
end
end
end
end
| module CarbonMU
class Connection
private
class FinalizedConnection < ConnectionState
def handle_input(input)
command_context = CommandContext.new(enacting_connection_id: @conn.id, command: input)
Parser.parse(command_context)
end
end
end
end
| Update FinalizedConnection to new CommandContext format | Update FinalizedConnection to new CommandContext format
| Ruby | mit | tkrajcar/carbonmu,1337807/carbonmu | ruby | ## Code Before:
module CarbonMU
class Connection
private
class FinalizedConnection < ConnectionState
def handle_input(input)
command_context = CommandContext.new(@conn)
Parser.parse(input, command_context)
end
end
end
end
## Instruction:
Update FinalizedConnection to new CommandContext format
## Code After:
module CarbonMU
class Connection
private
class FinalizedConnection < ConnectionState
def handle_input(input)
command_context = CommandContext.new(enacting_connection_id: @conn.id, command: input)
Parser.parse(command_context)
end
end
end
end
| module CarbonMU
class Connection
private
class FinalizedConnection < ConnectionState
def handle_input(input)
- command_context = CommandContext.new(@conn)
+ command_context = CommandContext.new(enacting_connection_id: @conn.id, command: input)
- Parser.parse(input, command_context)
? -------
+ Parser.parse(command_context)
end
end
end
end | 4 | 0.307692 | 2 | 2 |
823bc32664343b7abffea1619510c47fcb0db955 | testing/CMakeLists.txt | testing/CMakeLists.txt | find_package(CUnit)
IF(CUNIT_FOUND)
include_directories(${CUNIT_INCLUDE_DIR} . ../include/)
set(LIBS ${CUNIT_LIBRARY})
add_executable(openomf_test_main
test_main.c
test_str.c
test_hashmap.c
test_vector.c
test_list.c
test_array.c
../src/utils/hashmap.c
../src/utils/vector.c
../src/utils/iterator.c
../src/utils/str.c
../src/utils/list.c
../src/utils/array.c
)
# On unix platforms, add libm (sometimes needed, it seems)
IF(UNIX)
SET(LIBS ${LIBS} -lm)
ENDIF(UNIX)
target_link_libraries(openomf_test_main ${LIBS})
add_custom_target(test openomf_test_main)
ENDIF(CUNIT_FOUND)
| find_package(CUnit)
IF(CUNIT_FOUND)
include_directories(${CUNIT_INCLUDE_DIR} . ../include/)
set(LIBS ${CUNIT_LIBRARY})
add_executable(openomf_test_main
test_main.c
test_str.c
test_hashmap.c
test_vector.c
test_list.c
test_array.c
../src/utils/hashmap.c
../src/utils/vector.c
../src/utils/iterator.c
../src/utils/str.c
../src/utils/list.c
../src/utils/array.c
)
# On unix platforms, add libm (sometimes needed, it seems)
IF(UNIX)
SET(LIBS ${LIBS} -lm)
ENDIF(UNIX)
target_link_libraries(openomf_test_main ${LIBS})
# Change policy to allow "test" target name
cmake_policy(PUSH)
if(POLICY CMP0037)
cmake_policy(SET CMP0037 OLD)
endif()
add_custom_target(test openomf_test_main)
cmake_policy(POP)
ENDIF(CUNIT_FOUND)
| Set cmake policy to allow test target name. | Set cmake policy to allow test target name.
| Text | mit | pmjdebruijn/openomf,gdeda/openomf,gdeda/openomf,gdeda/openomf,omf2097/openomf,pmjdebruijn/openomf,omf2097/openomf,pmjdebruijn/openomf,omf2097/openomf | text | ## Code Before:
find_package(CUnit)
IF(CUNIT_FOUND)
include_directories(${CUNIT_INCLUDE_DIR} . ../include/)
set(LIBS ${CUNIT_LIBRARY})
add_executable(openomf_test_main
test_main.c
test_str.c
test_hashmap.c
test_vector.c
test_list.c
test_array.c
../src/utils/hashmap.c
../src/utils/vector.c
../src/utils/iterator.c
../src/utils/str.c
../src/utils/list.c
../src/utils/array.c
)
# On unix platforms, add libm (sometimes needed, it seems)
IF(UNIX)
SET(LIBS ${LIBS} -lm)
ENDIF(UNIX)
target_link_libraries(openomf_test_main ${LIBS})
add_custom_target(test openomf_test_main)
ENDIF(CUNIT_FOUND)
## Instruction:
Set cmake policy to allow test target name.
## Code After:
find_package(CUnit)
IF(CUNIT_FOUND)
include_directories(${CUNIT_INCLUDE_DIR} . ../include/)
set(LIBS ${CUNIT_LIBRARY})
add_executable(openomf_test_main
test_main.c
test_str.c
test_hashmap.c
test_vector.c
test_list.c
test_array.c
../src/utils/hashmap.c
../src/utils/vector.c
../src/utils/iterator.c
../src/utils/str.c
../src/utils/list.c
../src/utils/array.c
)
# On unix platforms, add libm (sometimes needed, it seems)
IF(UNIX)
SET(LIBS ${LIBS} -lm)
ENDIF(UNIX)
target_link_libraries(openomf_test_main ${LIBS})
# Change policy to allow "test" target name
cmake_policy(PUSH)
if(POLICY CMP0037)
cmake_policy(SET CMP0037 OLD)
endif()
add_custom_target(test openomf_test_main)
cmake_policy(POP)
ENDIF(CUNIT_FOUND)
| find_package(CUnit)
IF(CUNIT_FOUND)
include_directories(${CUNIT_INCLUDE_DIR} . ../include/)
set(LIBS ${CUNIT_LIBRARY})
add_executable(openomf_test_main
test_main.c
test_str.c
test_hashmap.c
test_vector.c
test_list.c
test_array.c
../src/utils/hashmap.c
../src/utils/vector.c
../src/utils/iterator.c
../src/utils/str.c
../src/utils/list.c
../src/utils/array.c
)
# On unix platforms, add libm (sometimes needed, it seems)
IF(UNIX)
SET(LIBS ${LIBS} -lm)
ENDIF(UNIX)
target_link_libraries(openomf_test_main ${LIBS})
+ # Change policy to allow "test" target name
+ cmake_policy(PUSH)
+ if(POLICY CMP0037)
+ cmake_policy(SET CMP0037 OLD)
+ endif()
add_custom_target(test openomf_test_main)
+ cmake_policy(POP)
ENDIF(CUNIT_FOUND)
| 6 | 0.193548 | 6 | 0 |
ca06378b83a2cef1902bff1204cb3f506433f974 | setup.py | setup.py | try:
from setuptools import setup, find_packages
except ImportError:
from distutils.core import setup, find_packages
DESCRIPTION = "Convert Matplotlib plots into Leaflet web maps"
LONG_DESCRIPTION = DESCRIPTION
NAME = "mplleaflet"
AUTHOR = "Jacob Wasserman"
AUTHOR_EMAIL = "jwasserman@gmail.com"
MAINTAINER = "Jacob Wasserman"
MAINTAINER_EMAIL = "jwasserman@gmail.com"
DOWNLOAD_URL = 'http://github.com/jwass/mplleaflet'
LICENSE = 'BSD 3-clause'
VERSION = '0.0.2'
setup(
name=NAME,
version=VERSION,
description=DESCRIPTION,
long_description=LONG_DESCRIPTION,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
maintainer=MAINTAINER,
maintainer_email=MAINTAINER_EMAIL,
url=DOWNLOAD_URL,
download_url=DOWNLOAD_URL,
license=LICENSE,
packages=find_packages(),
package_data={'': ['*.html']}, # Include the templates
install_requires=[
"jinja2",
"six",
],
)
| try:
from setuptools import setup, find_packages
except ImportError:
from distutils.core import setup, find_packages
with open('AUTHORS.md') as f:
authors = f.read()
description = "Convert Matplotlib plots into Leaflet web maps"
long_description = description + "\n\n" + authors
NAME = "mplleaflet"
AUTHOR = "Jacob Wasserman"
AUTHOR_EMAIL = "jwasserman@gmail.com"
MAINTAINER = "Jacob Wasserman"
MAINTAINER_EMAIL = "jwasserman@gmail.com"
DOWNLOAD_URL = 'http://github.com/jwass/mplleaflet'
LICENSE = 'BSD 3-clause'
VERSION = '0.0.3'
setup(
name=NAME,
version=VERSION,
description=description,
long_description=long_description,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
maintainer=MAINTAINER,
maintainer_email=MAINTAINER_EMAIL,
url=DOWNLOAD_URL,
download_url=DOWNLOAD_URL,
license=LICENSE,
packages=find_packages(),
package_data={'': ['*.html']}, # Include the templates
install_requires=[
"jinja2",
"six",
],
)
| Add authors to long description. | Add authors to long description.
| Python | bsd-3-clause | jwass/mplleaflet,ocefpaf/mplleaflet,jwass/mplleaflet,ocefpaf/mplleaflet | python | ## Code Before:
try:
from setuptools import setup, find_packages
except ImportError:
from distutils.core import setup, find_packages
DESCRIPTION = "Convert Matplotlib plots into Leaflet web maps"
LONG_DESCRIPTION = DESCRIPTION
NAME = "mplleaflet"
AUTHOR = "Jacob Wasserman"
AUTHOR_EMAIL = "jwasserman@gmail.com"
MAINTAINER = "Jacob Wasserman"
MAINTAINER_EMAIL = "jwasserman@gmail.com"
DOWNLOAD_URL = 'http://github.com/jwass/mplleaflet'
LICENSE = 'BSD 3-clause'
VERSION = '0.0.2'
setup(
name=NAME,
version=VERSION,
description=DESCRIPTION,
long_description=LONG_DESCRIPTION,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
maintainer=MAINTAINER,
maintainer_email=MAINTAINER_EMAIL,
url=DOWNLOAD_URL,
download_url=DOWNLOAD_URL,
license=LICENSE,
packages=find_packages(),
package_data={'': ['*.html']}, # Include the templates
install_requires=[
"jinja2",
"six",
],
)
## Instruction:
Add authors to long description.
## Code After:
try:
from setuptools import setup, find_packages
except ImportError:
from distutils.core import setup, find_packages
with open('AUTHORS.md') as f:
authors = f.read()
description = "Convert Matplotlib plots into Leaflet web maps"
long_description = description + "\n\n" + authors
NAME = "mplleaflet"
AUTHOR = "Jacob Wasserman"
AUTHOR_EMAIL = "jwasserman@gmail.com"
MAINTAINER = "Jacob Wasserman"
MAINTAINER_EMAIL = "jwasserman@gmail.com"
DOWNLOAD_URL = 'http://github.com/jwass/mplleaflet'
LICENSE = 'BSD 3-clause'
VERSION = '0.0.3'
setup(
name=NAME,
version=VERSION,
description=description,
long_description=long_description,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
maintainer=MAINTAINER,
maintainer_email=MAINTAINER_EMAIL,
url=DOWNLOAD_URL,
download_url=DOWNLOAD_URL,
license=LICENSE,
packages=find_packages(),
package_data={'': ['*.html']}, # Include the templates
install_requires=[
"jinja2",
"six",
],
)
| try:
from setuptools import setup, find_packages
except ImportError:
from distutils.core import setup, find_packages
+ with open('AUTHORS.md') as f:
+ authors = f.read()
+
- DESCRIPTION = "Convert Matplotlib plots into Leaflet web maps"
? ^^^^^^^^^^^
+ description = "Convert Matplotlib plots into Leaflet web maps"
? ^^^^^^^^^^^
- LONG_DESCRIPTION = DESCRIPTION
+ long_description = description + "\n\n" + authors
NAME = "mplleaflet"
AUTHOR = "Jacob Wasserman"
AUTHOR_EMAIL = "jwasserman@gmail.com"
MAINTAINER = "Jacob Wasserman"
MAINTAINER_EMAIL = "jwasserman@gmail.com"
DOWNLOAD_URL = 'http://github.com/jwass/mplleaflet'
LICENSE = 'BSD 3-clause'
- VERSION = '0.0.2'
? ^
+ VERSION = '0.0.3'
? ^
setup(
name=NAME,
version=VERSION,
- description=DESCRIPTION,
- long_description=LONG_DESCRIPTION,
+ description=description,
+ long_description=long_description,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
maintainer=MAINTAINER,
maintainer_email=MAINTAINER_EMAIL,
url=DOWNLOAD_URL,
download_url=DOWNLOAD_URL,
license=LICENSE,
packages=find_packages(),
package_data={'': ['*.html']}, # Include the templates
install_requires=[
"jinja2",
"six",
],
) | 13 | 0.371429 | 8 | 5 |
3db170d40a0b7f9e8e01727116fb06b893fe26b2 | src/application/components/Application/Model/model.css | src/application/components/Application/Model/model.css | @import "variables.css";
.header {
}
.title {
font-weight: 700;
}
.container {
margin: 1rem 5rem 1rem 5rem;
padding: 0 0 1rem 1rem;
}
.fields-container {
padding: 0 1rem 1rem 1rem;
border: 1px solid var(--color-concrete);
border-bottom-right-radius: var(--border-radius);
border-bottom-left-radius: var(--border-radius);
}
.description {
margin: 0;
column-count: 2;
}
.json {
padding: 1rem 0 1rem 0;
} | @import "variables.css";
.header {
margin: 5rem 0 2rem 0;
}
.title {
font-weight: 700;
margin: 0 0 1rem 0;
}
.container {
margin: 1rem 5rem 1rem 5rem;
padding: 0 0 1rem 1rem;
}
.fields-container {
padding: 0 1rem 1rem 1rem;
border: 1px solid var(--color-concrete);
border-bottom-right-radius: var(--border-radius);
border-bottom-left-radius: var(--border-radius);
}
.description {
margin: 0;
}
.json {
padding: 1rem 0 1rem 0;
}
| Add vertical spacing for consistency | Add vertical spacing for consistency
| CSS | mit | movio/apidoc-ui,movio/apidoc-ui | css | ## Code Before:
@import "variables.css";
.header {
}
.title {
font-weight: 700;
}
.container {
margin: 1rem 5rem 1rem 5rem;
padding: 0 0 1rem 1rem;
}
.fields-container {
padding: 0 1rem 1rem 1rem;
border: 1px solid var(--color-concrete);
border-bottom-right-radius: var(--border-radius);
border-bottom-left-radius: var(--border-radius);
}
.description {
margin: 0;
column-count: 2;
}
.json {
padding: 1rem 0 1rem 0;
}
## Instruction:
Add vertical spacing for consistency
## Code After:
@import "variables.css";
.header {
margin: 5rem 0 2rem 0;
}
.title {
font-weight: 700;
margin: 0 0 1rem 0;
}
.container {
margin: 1rem 5rem 1rem 5rem;
padding: 0 0 1rem 1rem;
}
.fields-container {
padding: 0 1rem 1rem 1rem;
border: 1px solid var(--color-concrete);
border-bottom-right-radius: var(--border-radius);
border-bottom-left-radius: var(--border-radius);
}
.description {
margin: 0;
}
.json {
padding: 1rem 0 1rem 0;
}
| @import "variables.css";
.header {
+ margin: 5rem 0 2rem 0;
}
.title {
font-weight: 700;
+ margin: 0 0 1rem 0;
}
.container {
margin: 1rem 5rem 1rem 5rem;
padding: 0 0 1rem 1rem;
}
.fields-container {
padding: 0 1rem 1rem 1rem;
border: 1px solid var(--color-concrete);
border-bottom-right-radius: var(--border-radius);
border-bottom-left-radius: var(--border-radius);
}
.description {
margin: 0;
- column-count: 2;
}
.json {
padding: 1rem 0 1rem 0;
} | 3 | 0.103448 | 2 | 1 |
483040e559089a95f98148becb0284584d400a49 | gun.js | gun.js | var wpi = require("wiring-pi"),
http = require("http"),
port = 5000,
host = "localhost",
method = "POST";
var IRin = 12;
var cut_video = true;
wpi.setup("phys");
wpi.pinMode(IRin, wpi.INPUT);
wpi.pullUpDnControl(IRin, wpi.PUD_DOWN);
wpi.wiringPiISR(IRin, wpi.INT_EDGE_BOTH, function(){
//This part stop all the LOCAL tracks (Audio and Video)
if(!cut_video){
http.request({
method:method,
host:host,
port: port,
path: "/on"
}).end();
}else{
http.request({
method:method,
host:host,
port: port,
path: "/off"
}).end();
}
cut_video = !cut_video;
});
| var wpi = require("wiring-pi"),
http = require("http"),
port = 5000,
host = "https://portal-ns.herokuapp.com",
method = "POST";
var IRin = 12;
var cut_video = true;
wpi.setup("phys");
wpi.pinMode(IRin, wpi.INPUT);
wpi.pullUpDnControl(IRin, wpi.PUD_DOWN);
wpi.wiringPiISR(IRin, wpi.INT_EDGE_BOTH, function(){
//This part stop all the LOCAL tracks (Audio and Video)
if(!cut_video){
http.request({
method:method,
host:host,
port: port,
path: "/on"
}).end();
}else{
http.request({
method:method,
host:host,
port: port,
path: "/off"
}).end();
}
cut_video = !cut_video;
});
| Change local to heroku host | Change local to heroku host
| JavaScript | apache-2.0 | CarlosMiguelCuevas/portal-ns,CarlosMiguelCuevas/portal-ns,CarlosMiguelCuevas/portal-ns,CarlosMiguelCuevas/portal-ns,Nearsoft/office-portal,Nearsoft/office-portal,Nearsoft/office-portal,Nearsoft/office-portal | javascript | ## Code Before:
var wpi = require("wiring-pi"),
http = require("http"),
port = 5000,
host = "localhost",
method = "POST";
var IRin = 12;
var cut_video = true;
wpi.setup("phys");
wpi.pinMode(IRin, wpi.INPUT);
wpi.pullUpDnControl(IRin, wpi.PUD_DOWN);
wpi.wiringPiISR(IRin, wpi.INT_EDGE_BOTH, function(){
//This part stop all the LOCAL tracks (Audio and Video)
if(!cut_video){
http.request({
method:method,
host:host,
port: port,
path: "/on"
}).end();
}else{
http.request({
method:method,
host:host,
port: port,
path: "/off"
}).end();
}
cut_video = !cut_video;
});
## Instruction:
Change local to heroku host
## Code After:
var wpi = require("wiring-pi"),
http = require("http"),
port = 5000,
host = "https://portal-ns.herokuapp.com",
method = "POST";
var IRin = 12;
var cut_video = true;
wpi.setup("phys");
wpi.pinMode(IRin, wpi.INPUT);
wpi.pullUpDnControl(IRin, wpi.PUD_DOWN);
wpi.wiringPiISR(IRin, wpi.INT_EDGE_BOTH, function(){
//This part stop all the LOCAL tracks (Audio and Video)
if(!cut_video){
http.request({
method:method,
host:host,
port: port,
path: "/on"
}).end();
}else{
http.request({
method:method,
host:host,
port: port,
path: "/off"
}).end();
}
cut_video = !cut_video;
});
| var wpi = require("wiring-pi"),
http = require("http"),
port = 5000,
- host = "localhost",
+ host = "https://portal-ns.herokuapp.com",
method = "POST";
var IRin = 12;
var cut_video = true;
wpi.setup("phys");
wpi.pinMode(IRin, wpi.INPUT);
wpi.pullUpDnControl(IRin, wpi.PUD_DOWN);
wpi.wiringPiISR(IRin, wpi.INT_EDGE_BOTH, function(){
//This part stop all the LOCAL tracks (Audio and Video)
if(!cut_video){
http.request({
method:method,
host:host,
port: port,
path: "/on"
}).end();
}else{
http.request({
method:method,
host:host,
port: port,
path: "/off"
}).end();
}
cut_video = !cut_video;
}); | 2 | 0.054054 | 1 | 1 |
e1e39d99e03fdc7586403fa0dc667fff4709704f | dist.ini | dist.ini | name = Log-Dispatch
author = Dave Rolsky <autarch@urth.org>
license = Artistic_2_0
copyright_holder = Dave Rolsky
[@DROLSKY]
dist = Log-Dispatch
pod_coverage_skip = Log::Dispatch::ApacheLog
pod_coverage_skip = Log::Dispatch::Conflicts
pod_coverage_trustme = Log::Dispatch => qr/^(?:warn|err|crit|emerg)$/
pod_coverage_trustme = Log::Dispatch => qr/^is_\w+$/
pod_coverage_trustme = Log::Dispatch::File => qr/^(?:O_)?APPEND$/
pod_coverage_trustme = Log::Dispatch::Output => qr/^new$/
stopwords_file = .stopwords
prereqs_skip = Apache2?::Log
prereqs_skip = ^Mail::
prereqs_skip = MIME::Lite
prereqs_skip = threads
prereqs_skip = threads::shared
-remove = Test::Compile
-remove = Test::Pod::LinkCheck
-remove = Test::Synopsis
-remove = Test::Version
[FileFinder::ByName / MostLibs]
dir = lib
skip = Log/Dispatch/Conflicts.pm
[Test::Version]
finder = MostLibs
[Conflicts]
Log::Dispatch::File::Stamped = 0.10
| name = Log-Dispatch
author = Dave Rolsky <autarch@urth.org>
license = Artistic_2_0
copyright_holder = Dave Rolsky
[@DROLSKY]
dist = Log-Dispatch
pod_coverage_skip = Log::Dispatch::ApacheLog
pod_coverage_skip = Log::Dispatch::Conflicts
pod_coverage_trustme = Log::Dispatch => qr/^(?:warn|err|crit|emerg)$/
pod_coverage_trustme = Log::Dispatch => qr/^is_\w+$/
pod_coverage_trustme = Log::Dispatch::File => qr/^(?:O_)?APPEND$/
pod_coverage_trustme = Log::Dispatch::Output => qr/^new$/
stopwords_file = .stopwords
prereqs_skip = Apache2?::Log
prereqs_skip = ^Mail::
prereqs_skip = MIME::Lite
prereqs_skip = threads
prereqs_skip = threads::shared
-remove = Test::Compile
-remove = Test::Pod::LinkCheck
-remove = Test::Synopsis
-remove = Test::Version
[Prereqs / DevelopRequires]
MIME::Lite = 0
Mail::Send = 0
Mail::Sender = 0
Mail::Sendmail = 0
[FileFinder::ByName / MostLibs]
dir = lib
skip = Log/Dispatch/Conflicts.pm
[Test::Version]
finder = MostLibs
[Conflicts]
Log::Dispatch::File::Stamped = 0.10
| Add various email modules to develop prereqs | Add various email modules to develop prereqs
| INI | artistic-2.0 | rivy/Log-Dispatch,rivy/Log-Dispatch | ini | ## Code Before:
name = Log-Dispatch
author = Dave Rolsky <autarch@urth.org>
license = Artistic_2_0
copyright_holder = Dave Rolsky
[@DROLSKY]
dist = Log-Dispatch
pod_coverage_skip = Log::Dispatch::ApacheLog
pod_coverage_skip = Log::Dispatch::Conflicts
pod_coverage_trustme = Log::Dispatch => qr/^(?:warn|err|crit|emerg)$/
pod_coverage_trustme = Log::Dispatch => qr/^is_\w+$/
pod_coverage_trustme = Log::Dispatch::File => qr/^(?:O_)?APPEND$/
pod_coverage_trustme = Log::Dispatch::Output => qr/^new$/
stopwords_file = .stopwords
prereqs_skip = Apache2?::Log
prereqs_skip = ^Mail::
prereqs_skip = MIME::Lite
prereqs_skip = threads
prereqs_skip = threads::shared
-remove = Test::Compile
-remove = Test::Pod::LinkCheck
-remove = Test::Synopsis
-remove = Test::Version
[FileFinder::ByName / MostLibs]
dir = lib
skip = Log/Dispatch/Conflicts.pm
[Test::Version]
finder = MostLibs
[Conflicts]
Log::Dispatch::File::Stamped = 0.10
## Instruction:
Add various email modules to develop prereqs
## Code After:
name = Log-Dispatch
author = Dave Rolsky <autarch@urth.org>
license = Artistic_2_0
copyright_holder = Dave Rolsky
[@DROLSKY]
dist = Log-Dispatch
pod_coverage_skip = Log::Dispatch::ApacheLog
pod_coverage_skip = Log::Dispatch::Conflicts
pod_coverage_trustme = Log::Dispatch => qr/^(?:warn|err|crit|emerg)$/
pod_coverage_trustme = Log::Dispatch => qr/^is_\w+$/
pod_coverage_trustme = Log::Dispatch::File => qr/^(?:O_)?APPEND$/
pod_coverage_trustme = Log::Dispatch::Output => qr/^new$/
stopwords_file = .stopwords
prereqs_skip = Apache2?::Log
prereqs_skip = ^Mail::
prereqs_skip = MIME::Lite
prereqs_skip = threads
prereqs_skip = threads::shared
-remove = Test::Compile
-remove = Test::Pod::LinkCheck
-remove = Test::Synopsis
-remove = Test::Version
[Prereqs / DevelopRequires]
MIME::Lite = 0
Mail::Send = 0
Mail::Sender = 0
Mail::Sendmail = 0
[FileFinder::ByName / MostLibs]
dir = lib
skip = Log/Dispatch/Conflicts.pm
[Test::Version]
finder = MostLibs
[Conflicts]
Log::Dispatch::File::Stamped = 0.10
| name = Log-Dispatch
author = Dave Rolsky <autarch@urth.org>
license = Artistic_2_0
copyright_holder = Dave Rolsky
[@DROLSKY]
dist = Log-Dispatch
pod_coverage_skip = Log::Dispatch::ApacheLog
pod_coverage_skip = Log::Dispatch::Conflicts
pod_coverage_trustme = Log::Dispatch => qr/^(?:warn|err|crit|emerg)$/
pod_coverage_trustme = Log::Dispatch => qr/^is_\w+$/
pod_coverage_trustme = Log::Dispatch::File => qr/^(?:O_)?APPEND$/
pod_coverage_trustme = Log::Dispatch::Output => qr/^new$/
stopwords_file = .stopwords
prereqs_skip = Apache2?::Log
prereqs_skip = ^Mail::
prereqs_skip = MIME::Lite
prereqs_skip = threads
prereqs_skip = threads::shared
-remove = Test::Compile
-remove = Test::Pod::LinkCheck
-remove = Test::Synopsis
-remove = Test::Version
+ [Prereqs / DevelopRequires]
+ MIME::Lite = 0
+ Mail::Send = 0
+ Mail::Sender = 0
+ Mail::Sendmail = 0
+
[FileFinder::ByName / MostLibs]
dir = lib
skip = Log/Dispatch/Conflicts.pm
[Test::Version]
finder = MostLibs
[Conflicts]
Log::Dispatch::File::Stamped = 0.10 | 6 | 0.181818 | 6 | 0 |
77bc35648cc3b628b013d8690afb9d234750f211 | bindings/python-examples/parses-demo-sql.txt | bindings/python-examples/parses-demo-sql.txt | % This file contains test sentences to verify that the SQL dict
% works. It contains more than one sentence to check that memory
% is freed properly (e.g by using LSAN).
Ithis is a test
O
O +------WV------+--Osm--+
O +---Wd---+-Ss*b+ +-Ds-+
O | | | | |
OLEFT-WALL this.p is.v a test.n
O
Ithis is another test
O
O +------WV------+-----Osm-----+
O +---Wd---+-Ss*b+ +---Ds--+
O | | | | |
OLEFT-WALL this.p is.v another test.n
O
| % This file contains test sentences to verify that the SQL dict
% works. It contains more than one sentence to check that memory
% is freed properly (e.g by using LSAN).
Ithis is a test
O
O +------WV------+--Osm--+
O +---Wd---+-Ss*b+ +-Ds-+
O | | | | |
OLEFT-WALL this.p is.v a test.n
O
Ithis is another test
O
O +------WV------+-----Osm-----+
O +---Wd---+-Ss*b+ +---Ds--+
O | | | | |
OLEFT-WALL this.p is.v another test.n
O
-max_null_count=1
IThis is a a test
O
O +------WV------+----Osm----+
O +---Wd---+-Ss*b+ +-Ds-+
O | | | | |
OLEFT-WALL this.p is.v [a] a test.n
O
| Validate parsing with nulls using SQLite dict | tests.py: Validate parsing with nulls using SQLite dict
| Text | lgpl-2.1 | linas/link-grammar,opencog/link-grammar,opencog/link-grammar,ampli/link-grammar,linas/link-grammar,ampli/link-grammar,linas/link-grammar,opencog/link-grammar,linas/link-grammar,opencog/link-grammar,linas/link-grammar,linas/link-grammar,opencog/link-grammar,ampli/link-grammar,ampli/link-grammar,linas/link-grammar,ampli/link-grammar,linas/link-grammar,ampli/link-grammar,opencog/link-grammar,ampli/link-grammar,opencog/link-grammar,ampli/link-grammar,opencog/link-grammar,linas/link-grammar,opencog/link-grammar,ampli/link-grammar | text | ## Code Before:
% This file contains test sentences to verify that the SQL dict
% works. It contains more than one sentence to check that memory
% is freed properly (e.g by using LSAN).
Ithis is a test
O
O +------WV------+--Osm--+
O +---Wd---+-Ss*b+ +-Ds-+
O | | | | |
OLEFT-WALL this.p is.v a test.n
O
Ithis is another test
O
O +------WV------+-----Osm-----+
O +---Wd---+-Ss*b+ +---Ds--+
O | | | | |
OLEFT-WALL this.p is.v another test.n
O
## Instruction:
tests.py: Validate parsing with nulls using SQLite dict
## Code After:
% This file contains test sentences to verify that the SQL dict
% works. It contains more than one sentence to check that memory
% is freed properly (e.g by using LSAN).
Ithis is a test
O
O +------WV------+--Osm--+
O +---Wd---+-Ss*b+ +-Ds-+
O | | | | |
OLEFT-WALL this.p is.v a test.n
O
Ithis is another test
O
O +------WV------+-----Osm-----+
O +---Wd---+-Ss*b+ +---Ds--+
O | | | | |
OLEFT-WALL this.p is.v another test.n
O
-max_null_count=1
IThis is a a test
O
O +------WV------+----Osm----+
O +---Wd---+-Ss*b+ +-Ds-+
O | | | | |
OLEFT-WALL this.p is.v [a] a test.n
O
| % This file contains test sentences to verify that the SQL dict
% works. It contains more than one sentence to check that memory
% is freed properly (e.g by using LSAN).
Ithis is a test
O
O +------WV------+--Osm--+
O +---Wd---+-Ss*b+ +-Ds-+
O | | | | |
OLEFT-WALL this.p is.v a test.n
O
Ithis is another test
O
O +------WV------+-----Osm-----+
O +---Wd---+-Ss*b+ +---Ds--+
O | | | | |
OLEFT-WALL this.p is.v another test.n
O
+
+ -max_null_count=1
+ IThis is a a test
+ O
+ O +------WV------+----Osm----+
+ O +---Wd---+-Ss*b+ +-Ds-+
+ O | | | | |
+ OLEFT-WALL this.p is.v [a] a test.n
+ O | 9 | 0.473684 | 9 | 0 |
a46192c85ea898bc6969a13853424ef2782901fb | README.md | README.md | [Lil' Bits](https://www.youtube.com/watch?v=Gj4-E5Hs3Kc) of code that I might want to reuse. Feel free to steal them(with credit).
## Table of Contents
### Python
* [list comprehension building a string incrementally](#incremental-string-builder)
### Java
### C#
<br>
<hr>
<br>
## Python
#### Incremental String Builder [*](https://github.com/jabocg/scraps/blob/master/python/incremental-string-builder)
```python
[string[:i] for i in range(1, len(string))]
```
| [Lil' Bits](https://www.youtube.com/watch?v=Gj4-E5Hs3Kc) of code that I might want to reuse. Feel free to steal them(with credit).
## Table of Contents
### Python
* [list comprehension building a string incrementally](#incremental-string-builder)
### Java
### C#
<br>
<hr>
<br>
## Python
#### Incremental String Builder [*](https://github.com/jabocg/scraps/blob/master/python/incremental-string-builder.py)
```python
[string[:i] for i in range(1, len(string))]
```
#### File copy via `pathlib` [*](https://github.com/jabocg/scraps/blob/master/python/pathlib-file-copy.py)
```python
import pathlib
source = Path(...)
dest = Path(...)
with source.open() as i:
with dest.open(mode='w') as o:
o.write(i.read())
```
| Add pathlib file copy to readme | Add pathlib file copy to readme | Markdown | mit | jabocg/scraps | markdown | ## Code Before:
[Lil' Bits](https://www.youtube.com/watch?v=Gj4-E5Hs3Kc) of code that I might want to reuse. Feel free to steal them(with credit).
## Table of Contents
### Python
* [list comprehension building a string incrementally](#incremental-string-builder)
### Java
### C#
<br>
<hr>
<br>
## Python
#### Incremental String Builder [*](https://github.com/jabocg/scraps/blob/master/python/incremental-string-builder)
```python
[string[:i] for i in range(1, len(string))]
```
## Instruction:
Add pathlib file copy to readme
## Code After:
[Lil' Bits](https://www.youtube.com/watch?v=Gj4-E5Hs3Kc) of code that I might want to reuse. Feel free to steal them(with credit).
## Table of Contents
### Python
* [list comprehension building a string incrementally](#incremental-string-builder)
### Java
### C#
<br>
<hr>
<br>
## Python
#### Incremental String Builder [*](https://github.com/jabocg/scraps/blob/master/python/incremental-string-builder.py)
```python
[string[:i] for i in range(1, len(string))]
```
#### File copy via `pathlib` [*](https://github.com/jabocg/scraps/blob/master/python/pathlib-file-copy.py)
```python
import pathlib
source = Path(...)
dest = Path(...)
with source.open() as i:
with dest.open(mode='w') as o:
o.write(i.read())
```
| [Lil' Bits](https://www.youtube.com/watch?v=Gj4-E5Hs3Kc) of code that I might want to reuse. Feel free to steal them(with credit).
## Table of Contents
### Python
* [list comprehension building a string incrementally](#incremental-string-builder)
### Java
### C#
<br>
<hr>
<br>
## Python
+
- #### Incremental String Builder [*](https://github.com/jabocg/scraps/blob/master/python/incremental-string-builder)
+ #### Incremental String Builder [*](https://github.com/jabocg/scraps/blob/master/python/incremental-string-builder.py)
? +++
```python
[string[:i] for i in range(1, len(string))]
```
+
+ #### File copy via `pathlib` [*](https://github.com/jabocg/scraps/blob/master/python/pathlib-file-copy.py)
+ ```python
+ import pathlib
+
+ source = Path(...)
+ dest = Path(...)
+
+ with source.open() as i:
+ with dest.open(mode='w') as o:
+ o.write(i.read())
+ ``` | 15 | 0.833333 | 14 | 1 |
6778ad6f014931c68c75edc7083e0b74a7f5afbf | spec/cli/cmd/analyze_cmd_spec.rb | spec/cli/cmd/analyze_cmd_spec.rb |
require_relative '../../../lib/csv2psql/cli/cli'
describe Csv2Psql do
it 'help analyze' do
run_cli(%w(help analyze))
end
it 'analyze data/cia-data-all.csv' do
run_cli(%w(analyze data/cia-data-all.csv))
end
it 'analyze data/sample.csv' do
run_cli(%w(analyze data/sample.csv))
end
end
|
require_relative '../../../lib/csv2psql/cli/cli'
describe Csv2Psql do
it 'help analyze' do
run_cli(%w(help analyze))
end
it 'analyze data/cia-data-all.csv' do
run_cli(%w(analyze data/cia-data-all.csv))
end
it 'analyze data/sample.csv' do
run_cli(%w(analyze data/sample.csv))
end
it 'analyze --format table data/sample.csv' do
run_cli(%w(analyze --format table data/sample.csv))
end
end
| Test analyze --format table data/sample.csv | Test analyze --format table data/sample.csv
| Ruby | mit | korczis/csv2psql,korczis/csv2psql | ruby | ## Code Before:
require_relative '../../../lib/csv2psql/cli/cli'
describe Csv2Psql do
it 'help analyze' do
run_cli(%w(help analyze))
end
it 'analyze data/cia-data-all.csv' do
run_cli(%w(analyze data/cia-data-all.csv))
end
it 'analyze data/sample.csv' do
run_cli(%w(analyze data/sample.csv))
end
end
## Instruction:
Test analyze --format table data/sample.csv
## Code After:
require_relative '../../../lib/csv2psql/cli/cli'
describe Csv2Psql do
it 'help analyze' do
run_cli(%w(help analyze))
end
it 'analyze data/cia-data-all.csv' do
run_cli(%w(analyze data/cia-data-all.csv))
end
it 'analyze data/sample.csv' do
run_cli(%w(analyze data/sample.csv))
end
it 'analyze --format table data/sample.csv' do
run_cli(%w(analyze --format table data/sample.csv))
end
end
|
require_relative '../../../lib/csv2psql/cli/cli'
describe Csv2Psql do
it 'help analyze' do
run_cli(%w(help analyze))
end
it 'analyze data/cia-data-all.csv' do
run_cli(%w(analyze data/cia-data-all.csv))
end
it 'analyze data/sample.csv' do
run_cli(%w(analyze data/sample.csv))
end
+
+ it 'analyze --format table data/sample.csv' do
+ run_cli(%w(analyze --format table data/sample.csv))
+ end
end | 4 | 0.25 | 4 | 0 |
d526acf1af5153c0ee39195b2e0f8873d5c76f22 | various/migrate/config/example.databases.php | various/migrate/config/example.databases.php | <?php
/**
* @author Reto Schneider, 2011, github@reto-schneider.ch
*/
$drupalprefix = ''; // Drupal prefix (e.g. 'drupal_')
$config['naturvielfalt_dev'] = array( // Drupal database
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
$config['evab'] = array( // 2nd database for import from evab
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
?>
| <?php
/**
* @author Reto Schneider, 2011, github@reto-schneider.ch
*/
$drupalprefix = ''; // Drupal prefix (e.g. 'drupal_')
$config['naturvielfalt_dev'] = array( // Drupal database
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
$config['evab'] = array( // 2nd database for import from evab
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
$config['swissmon'] = array(
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
?>
| Update example configuration to include the swissmon database configuration | Update example configuration to include the swissmon database configuration
| PHP | bsd-3-clause | rettichschnidi/naturvielfalt,rettichschnidi/naturvielfalt,rettichschnidi/naturvielfalt,rettichschnidi/naturvielfalt | php | ## Code Before:
<?php
/**
* @author Reto Schneider, 2011, github@reto-schneider.ch
*/
$drupalprefix = ''; // Drupal prefix (e.g. 'drupal_')
$config['naturvielfalt_dev'] = array( // Drupal database
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
$config['evab'] = array( // 2nd database for import from evab
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
?>
## Instruction:
Update example configuration to include the swissmon database configuration
## Code After:
<?php
/**
* @author Reto Schneider, 2011, github@reto-schneider.ch
*/
$drupalprefix = ''; // Drupal prefix (e.g. 'drupal_')
$config['naturvielfalt_dev'] = array( // Drupal database
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
$config['evab'] = array( // 2nd database for import from evab
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
$config['swissmon'] = array(
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
?>
| <?php
/**
* @author Reto Schneider, 2011, github@reto-schneider.ch
*/
$drupalprefix = ''; // Drupal prefix (e.g. 'drupal_')
$config['naturvielfalt_dev'] = array( // Drupal database
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
$config['evab'] = array( // 2nd database for import from evab
'name' => '',
'user' => '',
'password' => '',
'host' => '',
'driver' => 'pgsql'
);
+
+ $config['swissmon'] = array(
+ 'name' => '',
+ 'user' => '',
+ 'password' => '',
+ 'host' => '',
+ 'driver' => 'pgsql'
+ );
?> | 8 | 0.347826 | 8 | 0 |
7e153be7452c49b214041c948f0bc65c51d74f32 | app/controllers/reviews_controller.rb | app/controllers/reviews_controller.rb | class ReviewsController < ApplicationController
before_action :authenticate_user!
def new
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
end
def create
review = Review.new(review_params)
review.update_attributes(reviewer_id: current_user.id)
reviewee = User.find(review.reviewee_id)
if review.save
reputation = reviewee.get_reputation
reviewee.update_attributes(reputation: reputation)
redirect_to user_path(reviewee.id)
else
flash[:error] = review.errors.full_messages.join('<br>')
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
render :new
end
end
def destroy
review = Review.find(params[:id])
review.destroy
redirect_to current_user_profile
end
private
def review_params
params.require(:review).permit(:rating, :content, :trip_id, :parcel_id, :reviewee_id)
end
end
| class ReviewsController < ApplicationController
before_action :authenticate_user!
def new
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
end
def create
review = Review.new(review_params)
reviewee = User.find(review.reviewee_id)
if review.save
reputation = reviewee.get_reputation
reviewee.update_attributes(reputation: reputation)
redirect_to user_path(reviewee.id)
else
flash[:error] = review.errors.full_messages.join('<br>')
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
render :new
end
end
def destroy
review = Review.find(params[:id])
review.destroy
redirect_to current_user_profile
end
private
def review_params
params.require(:review).permit(:rating, :content, :trip_id, :parcel_id, :reviewee_id).merge(reviewer_id: current_user.id)
end
end
| Add merge to review params in controller to assign reviewer id to current user id | Add merge to review params in controller to assign reviewer id to current user id
| Ruby | mit | grantziolkowski/caravan,nyc-rock-doves-2015/caravan,bicyclethief/caravan,grantziolkowski/caravan,nyc-rock-doves-2015/caravan,nyc-rock-doves-2015/caravan,bicyclethief/caravan,grantziolkowski/caravan,bicyclethief/caravan | ruby | ## Code Before:
class ReviewsController < ApplicationController
before_action :authenticate_user!
def new
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
end
def create
review = Review.new(review_params)
review.update_attributes(reviewer_id: current_user.id)
reviewee = User.find(review.reviewee_id)
if review.save
reputation = reviewee.get_reputation
reviewee.update_attributes(reputation: reputation)
redirect_to user_path(reviewee.id)
else
flash[:error] = review.errors.full_messages.join('<br>')
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
render :new
end
end
def destroy
review = Review.find(params[:id])
review.destroy
redirect_to current_user_profile
end
private
def review_params
params.require(:review).permit(:rating, :content, :trip_id, :parcel_id, :reviewee_id)
end
end
## Instruction:
Add merge to review params in controller to assign reviewer id to current user id
## Code After:
class ReviewsController < ApplicationController
before_action :authenticate_user!
def new
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
end
def create
review = Review.new(review_params)
reviewee = User.find(review.reviewee_id)
if review.save
reputation = reviewee.get_reputation
reviewee.update_attributes(reputation: reputation)
redirect_to user_path(reviewee.id)
else
flash[:error] = review.errors.full_messages.join('<br>')
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
render :new
end
end
def destroy
review = Review.find(params[:id])
review.destroy
redirect_to current_user_profile
end
private
def review_params
params.require(:review).permit(:rating, :content, :trip_id, :parcel_id, :reviewee_id).merge(reviewer_id: current_user.id)
end
end
| class ReviewsController < ApplicationController
before_action :authenticate_user!
def new
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
end
def create
review = Review.new(review_params)
- review.update_attributes(reviewer_id: current_user.id)
reviewee = User.find(review.reviewee_id)
if review.save
reputation = reviewee.get_reputation
reviewee.update_attributes(reputation: reputation)
redirect_to user_path(reviewee.id)
else
flash[:error] = review.errors.full_messages.join('<br>')
@review = Review.new
@review_carrier = Trip.find(params[:trip_id]) if params[:trip_id]
@review_carrier = Parcel.find(params[:parcel_id]) if params[:parcel_id]
render :new
end
end
def destroy
review = Review.find(params[:id])
review.destroy
redirect_to current_user_profile
end
private
def review_params
- params.require(:review).permit(:rating, :content, :trip_id, :parcel_id, :reviewee_id)
+ params.require(:review).permit(:rating, :content, :trip_id, :parcel_id, :reviewee_id).merge(reviewer_id: current_user.id)
? ++++++++++++++++++++++++++++++++++++
end
end | 3 | 0.076923 | 1 | 2 |
de4234259ed64d922e7dd6300b4f4bc1efebb2ef | lib/trashed/request_measurement.rb | lib/trashed/request_measurement.rb | module Trashed
module RequestMeasurement
def self.included(base)
base.send :around_filter, :measure_resource_usage
end
protected
def measure_resource_usage
before = Measurement.measure
yield
ensure
change = Measurement.change_since(before)
Rails.logger.info "STATS: #{change.pp}"
end
class Measurement < Struct.new(:time, :memory, :objects, :gc_runs, :gc_time)
PP_FORMAT = '%d ms, %.2f KB, %d obj, %d GCs in %d ms'.freeze
def self.change_since(before)
measure - before
end
def self.measure
new(Time.now.to_f,
GC.allocated_size, ObjectSpace.allocated_objects,
GC.collections, GC.time)
end
def -(other)
self.class.new(time - other.time,
memory - other.memory, objects - other.objects,
gc_runs - other.gc_runs, gc_time - other.gc_time)
end
def pp
PP_FORMAT % [time * 1000,
memory / 1024.0, objects,
gc_runs, gc_time / 1000.0]
end
end
end
end
| module Trashed
module RequestMeasurement
LOG_MESSAGE = 'STATS: %s | %s [%s]'.freeze
def self.included(base)
base.send :around_filter, :measure_resource_usage
end
protected
def measure_resource_usage
before = Measurement.now
yield
ensure
change = Measurement.now - before
Rails.logger.info(LOG_MESSAGE % [change.to_s,
headers['Status'].to_s, (complete_request_uri rescue 'unknown')])
end
class Measurement < Struct.new(:time, :memory, :objects, :gc_runs, :gc_time)
PP_FORMAT = '%d ms, %.2f KB, %d obj, %d GCs in %d ms'.freeze
def self.now
new(Time.now.to_f,
GC.allocated_size, ObjectSpace.allocated_objects,
GC.collections, GC.time)
end
def -(other)
self.class.new(time - other.time,
memory - other.memory, objects - other.objects,
gc_runs - other.gc_runs, gc_time - other.gc_time)
end
def to_s
PP_FORMAT % [time * 1000,
memory / 1024.0, objects,
gc_runs, gc_time / 1000.0]
end
end
end
end
| Include request status and uri on the stats line for easy tracking | Include request status and uri on the stats line for easy tracking
| Ruby | mit | basecamp/trashed | ruby | ## Code Before:
module Trashed
module RequestMeasurement
def self.included(base)
base.send :around_filter, :measure_resource_usage
end
protected
def measure_resource_usage
before = Measurement.measure
yield
ensure
change = Measurement.change_since(before)
Rails.logger.info "STATS: #{change.pp}"
end
class Measurement < Struct.new(:time, :memory, :objects, :gc_runs, :gc_time)
PP_FORMAT = '%d ms, %.2f KB, %d obj, %d GCs in %d ms'.freeze
def self.change_since(before)
measure - before
end
def self.measure
new(Time.now.to_f,
GC.allocated_size, ObjectSpace.allocated_objects,
GC.collections, GC.time)
end
def -(other)
self.class.new(time - other.time,
memory - other.memory, objects - other.objects,
gc_runs - other.gc_runs, gc_time - other.gc_time)
end
def pp
PP_FORMAT % [time * 1000,
memory / 1024.0, objects,
gc_runs, gc_time / 1000.0]
end
end
end
end
## Instruction:
Include request status and uri on the stats line for easy tracking
## Code After:
module Trashed
module RequestMeasurement
LOG_MESSAGE = 'STATS: %s | %s [%s]'.freeze
def self.included(base)
base.send :around_filter, :measure_resource_usage
end
protected
def measure_resource_usage
before = Measurement.now
yield
ensure
change = Measurement.now - before
Rails.logger.info(LOG_MESSAGE % [change.to_s,
headers['Status'].to_s, (complete_request_uri rescue 'unknown')])
end
class Measurement < Struct.new(:time, :memory, :objects, :gc_runs, :gc_time)
PP_FORMAT = '%d ms, %.2f KB, %d obj, %d GCs in %d ms'.freeze
def self.now
new(Time.now.to_f,
GC.allocated_size, ObjectSpace.allocated_objects,
GC.collections, GC.time)
end
def -(other)
self.class.new(time - other.time,
memory - other.memory, objects - other.objects,
gc_runs - other.gc_runs, gc_time - other.gc_time)
end
def to_s
PP_FORMAT % [time * 1000,
memory / 1024.0, objects,
gc_runs, gc_time / 1000.0]
end
end
end
end
| module Trashed
module RequestMeasurement
+ LOG_MESSAGE = 'STATS: %s | %s [%s]'.freeze
+
def self.included(base)
base.send :around_filter, :measure_resource_usage
end
protected
def measure_resource_usage
- before = Measurement.measure
? ^^^^^^^
+ before = Measurement.now
? ^^^
yield
ensure
- change = Measurement.change_since(before)
? --- ^^^^^^^^^ -
+ change = Measurement.now - before
? ^^^^^
- Rails.logger.info "STATS: #{change.pp}"
+ Rails.logger.info(LOG_MESSAGE % [change.to_s,
+ headers['Status'].to_s, (complete_request_uri rescue 'unknown')])
end
class Measurement < Struct.new(:time, :memory, :objects, :gc_runs, :gc_time)
PP_FORMAT = '%d ms, %.2f KB, %d obj, %d GCs in %d ms'.freeze
- def self.change_since(before)
- measure - before
- end
-
- def self.measure
? ^^^^^^^
+ def self.now
? ^^^
new(Time.now.to_f,
GC.allocated_size, ObjectSpace.allocated_objects,
GC.collections, GC.time)
end
def -(other)
self.class.new(time - other.time,
memory - other.memory, objects - other.objects,
gc_runs - other.gc_runs, gc_time - other.gc_time)
end
- def pp
? ^^
+ def to_s
? ^^^^
PP_FORMAT % [time * 1000,
memory / 1024.0, objects,
gc_runs, gc_time / 1000.0]
end
end
end
end | 17 | 0.404762 | 8 | 9 |
b90c05fdd7e3592e19f43637dee646b65f88af5d | CHCarouselView/Views/CarouselView.swift | CHCarouselView/Views/CarouselView.swift | //
// CarouselView.swift
// CHCarouselView
//
// Created by Calvin on 8/5/16.
// Copyright © 2016 CapsLock. All rights reserved.
//
import UIKit
public class CarouselView: UIScrollView {
@IBOutlet weak var pageControl: UIPageControl!
@IBOutlet public var views: [UIView] = []
public var currentPage: Int = 0
public var selectedCallback: ((currentPage: Int) -> ())?
override public init(frame: CGRect) {
super.init(frame: frame)
self.configure()
}
public required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
self.configure()
}
override public func drawRect(rect: CGRect) {
}
// MARK: - Private Methods
private func configure() {
self.delegate = self
self.pagingEnabled = true
self.showsVerticalScrollIndicator = false
self.showsHorizontalScrollIndicator = false
self.scrollsToTop = false
}
}
extension CarouselView: UIScrollViewDelegate {
public func scrollViewDidScroll(scrollView: UIScrollView) {
}
}
| //
// CarouselView.swift
// CHCarouselView
//
// Created by Calvin on 8/5/16.
// Copyright © 2016 CapsLock. All rights reserved.
//
import UIKit
public class CarouselView: UIScrollView {
@IBOutlet weak var pageControl: UIPageControl!
@IBOutlet public var views: [UIView] = []
public var currentPage: Int = 0
public var selectedCallback: ((currentPage: Int) -> ())?
override public init(frame: CGRect) {
super.init(frame: frame)
self.configure()
}
public required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
self.configure()
}
override public func drawRect(rect: CGRect) {
views.enumerate().forEach { (index: Int, view: UIView) in
let viewOffset = CGPoint(x: CGFloat(index) * self.bounds.width, y: 0)
view.frame = CGRect(origin: viewOffset, size: self.bounds.size)
self.addSubview(view)
}
self.contentSize = CGSize(width: CGFloat(views.count) * self.bounds.width, height: self.bounds.height)
}
// MARK: - Private Methods
private func configure() {
self.delegate = self
self.pagingEnabled = true
self.showsVerticalScrollIndicator = false
self.showsHorizontalScrollIndicator = false
self.scrollsToTop = false
}
}
extension CarouselView: UIScrollViewDelegate {
public func scrollViewDidScroll(scrollView: UIScrollView) {
}
}
| Arrange views to carousel's content and set content size. | Arrange views to carousel's content and set content size.
| Swift | mit | Calvin-Huang/CHCarouselView,Calvin-Huang/CHCarouselView | swift | ## Code Before:
//
// CarouselView.swift
// CHCarouselView
//
// Created by Calvin on 8/5/16.
// Copyright © 2016 CapsLock. All rights reserved.
//
import UIKit
public class CarouselView: UIScrollView {
@IBOutlet weak var pageControl: UIPageControl!
@IBOutlet public var views: [UIView] = []
public var currentPage: Int = 0
public var selectedCallback: ((currentPage: Int) -> ())?
override public init(frame: CGRect) {
super.init(frame: frame)
self.configure()
}
public required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
self.configure()
}
override public func drawRect(rect: CGRect) {
}
// MARK: - Private Methods
private func configure() {
self.delegate = self
self.pagingEnabled = true
self.showsVerticalScrollIndicator = false
self.showsHorizontalScrollIndicator = false
self.scrollsToTop = false
}
}
extension CarouselView: UIScrollViewDelegate {
public func scrollViewDidScroll(scrollView: UIScrollView) {
}
}
## Instruction:
Arrange views to carousel's content and set content size.
## Code After:
//
// CarouselView.swift
// CHCarouselView
//
// Created by Calvin on 8/5/16.
// Copyright © 2016 CapsLock. All rights reserved.
//
import UIKit
public class CarouselView: UIScrollView {
@IBOutlet weak var pageControl: UIPageControl!
@IBOutlet public var views: [UIView] = []
public var currentPage: Int = 0
public var selectedCallback: ((currentPage: Int) -> ())?
override public init(frame: CGRect) {
super.init(frame: frame)
self.configure()
}
public required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
self.configure()
}
override public func drawRect(rect: CGRect) {
views.enumerate().forEach { (index: Int, view: UIView) in
let viewOffset = CGPoint(x: CGFloat(index) * self.bounds.width, y: 0)
view.frame = CGRect(origin: viewOffset, size: self.bounds.size)
self.addSubview(view)
}
self.contentSize = CGSize(width: CGFloat(views.count) * self.bounds.width, height: self.bounds.height)
}
// MARK: - Private Methods
private func configure() {
self.delegate = self
self.pagingEnabled = true
self.showsVerticalScrollIndicator = false
self.showsHorizontalScrollIndicator = false
self.scrollsToTop = false
}
}
extension CarouselView: UIScrollViewDelegate {
public func scrollViewDidScroll(scrollView: UIScrollView) {
}
}
| //
// CarouselView.swift
// CHCarouselView
//
// Created by Calvin on 8/5/16.
// Copyright © 2016 CapsLock. All rights reserved.
//
import UIKit
public class CarouselView: UIScrollView {
@IBOutlet weak var pageControl: UIPageControl!
@IBOutlet public var views: [UIView] = []
public var currentPage: Int = 0
public var selectedCallback: ((currentPage: Int) -> ())?
override public init(frame: CGRect) {
super.init(frame: frame)
self.configure()
}
public required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
self.configure()
}
override public func drawRect(rect: CGRect) {
+ views.enumerate().forEach { (index: Int, view: UIView) in
+ let viewOffset = CGPoint(x: CGFloat(index) * self.bounds.width, y: 0)
+ view.frame = CGRect(origin: viewOffset, size: self.bounds.size)
+
+ self.addSubview(view)
+ }
+
+ self.contentSize = CGSize(width: CGFloat(views.count) * self.bounds.width, height: self.bounds.height)
}
// MARK: - Private Methods
private func configure() {
self.delegate = self
self.pagingEnabled = true
self.showsVerticalScrollIndicator = false
self.showsHorizontalScrollIndicator = false
self.scrollsToTop = false
}
}
extension CarouselView: UIScrollViewDelegate {
public func scrollViewDidScroll(scrollView: UIScrollView) {
}
} | 8 | 0.170213 | 8 | 0 |
f0c6fbfa23714cd0996886bcd76ed4789c85932b | src/__tests__/components/CloseButton.js | src/__tests__/components/CloseButton.js | /* eslint-env jest */
import React from 'react';
import { shallow } from 'enzyme';
import CloseButton from './../../components/CloseButton';
const closeToast = jest.fn();
describe('CloseButton', () => {
it('Should call closeToast on click', () => {
const component = shallow(<CloseButton closeToast={closeToast} />);
expect(closeToast).not.toHaveBeenCalled();
component.simulate('click');
expect(closeToast).toHaveBeenCalled();
});
});
| /* eslint-env jest */
import React from 'react';
import { shallow } from 'enzyme';
import CloseButton from './../../components/CloseButton';
const closeToast = jest.fn();
describe('CloseButton', () => {
it('Should call closeToast on click', () => {
const component = shallow(<CloseButton closeToast={closeToast} />);
expect(closeToast).not.toHaveBeenCalled();
component.simulate('click', { stopPropagation: () => undefined });
expect(closeToast).toHaveBeenCalled();
});
});
| Fix failing test event undefined when shallow rendering | Fix failing test event undefined when shallow rendering
| JavaScript | mit | fkhadra/react-toastify,sniphpet/react-toastify,fkhadra/react-toastify,fkhadra/react-toastify,sniphpet/react-toastify,fkhadra/react-toastify | javascript | ## Code Before:
/* eslint-env jest */
import React from 'react';
import { shallow } from 'enzyme';
import CloseButton from './../../components/CloseButton';
const closeToast = jest.fn();
describe('CloseButton', () => {
it('Should call closeToast on click', () => {
const component = shallow(<CloseButton closeToast={closeToast} />);
expect(closeToast).not.toHaveBeenCalled();
component.simulate('click');
expect(closeToast).toHaveBeenCalled();
});
});
## Instruction:
Fix failing test event undefined when shadow rendering
## Code After:
/* eslint-env jest */
import React from 'react';
import { shallow } from 'enzyme';
import CloseButton from './../../components/CloseButton';
const closeToast = jest.fn();
describe('CloseButton', () => {
it('Should call closeToast on click', () => {
const component = shallow(<CloseButton closeToast={closeToast} />);
expect(closeToast).not.toHaveBeenCalled();
component.simulate('click', { stopPropagation: () => undefined });
expect(closeToast).toHaveBeenCalled();
});
});
| /* eslint-env jest */
import React from 'react';
import { shallow } from 'enzyme';
import CloseButton from './../../components/CloseButton';
const closeToast = jest.fn();
describe('CloseButton', () => {
it('Should call closeToast on click', () => {
const component = shallow(<CloseButton closeToast={closeToast} />);
expect(closeToast).not.toHaveBeenCalled();
- component.simulate('click');
+ component.simulate('click', { stopPropagation: () => undefined });
expect(closeToast).toHaveBeenCalled();
});
}); | 2 | 0.117647 | 1 | 1 |
c2719013c00c400e72ff362ba6b61b3460c2fa98 | userguide/src/en/index.adoc | userguide/src/en/index.adoc | = Activiti User Guide
v 5.17.1-SNAPSHOT
:doctype: book
:toc: left
:icons: font
:numbered:
:source-highlighter: pygments
:pygments-css: class
:pygments-linenums-mode: table
:compat-mode:
include::ch01-Introduction.adoc[]
include::ch02-GettingStarted.adoc[]
include::ch03-Configuration.adoc[]
include::ch04-API.adoc[]
include::ch05-Spring.adoc[]
include::ch06-Deployment.adoc[]
include::ch07a-BPMN-Introduction.adoc[]
include::ch07b-BPMN-Constructs.adoc[]
include::ch08-Forms.adoc[]
include::ch09-JPA.adoc[]
include::ch10-History.adoc[]
include::ch11-Designer.adoc[]
include::ch12-Explorer.adoc[]
include::ch13-Modeler.adoc[]
include::ch14-REST.adoc[]
include::ch15-Cdi.adoc[]
include::ch16-Ldap.adoc[]
include::ch17-Advanced.adoc[]
include::ch18-Simulation.adoc[]
include::ch19-operation-control.adoc[]
| = Activiti User Guide
v 5.17.1-SNAPSHOT
:doctype: book
:toc: left
:toclevels: 5
:icons: font
:numbered:
:source-highlighter: pygments
:pygments-css: class
:pygments-linenums-mode: table
:compat-mode:
include::ch01-Introduction.adoc[]
include::ch02-GettingStarted.adoc[]
include::ch03-Configuration.adoc[]
include::ch04-API.adoc[]
include::ch05-Spring.adoc[]
include::ch06-Deployment.adoc[]
include::ch07a-BPMN-Introduction.adoc[]
include::ch07b-BPMN-Constructs.adoc[]
include::ch08-Forms.adoc[]
include::ch09-JPA.adoc[]
include::ch10-History.adoc[]
include::ch11-Designer.adoc[]
include::ch12-Explorer.adoc[]
include::ch13-Modeler.adoc[]
include::ch14-REST.adoc[]
include::ch15-Cdi.adoc[]
include::ch16-Ldap.adoc[]
include::ch17-Advanced.adoc[]
include::ch18-Simulation.adoc[]
include::ch19-operation-control.adoc[]
| Update userguide: show more levels in the TOC | Update userguide: show more levels in the TOC
| AsciiDoc | apache-2.0 | flowable/flowable-engine,gro-mar/flowable-engine,lsmall/flowable-engine,dbmalkovsky/flowable-engine,sibok666/flowable-engine,dbmalkovsky/flowable-engine,lsmall/flowable-engine,martin-grofcik/flowable-engine,zwets/flowable-engine,yvoswillens/flowable-engine,marcus-nl/flowable-engine,marcus-nl/flowable-engine,gro-mar/flowable-engine,sibok666/flowable-engine,stefan-ziel/Activiti,martin-grofcik/flowable-engine,roberthafner/flowable-engine,paulstapleton/flowable-engine,yvoswillens/flowable-engine,robsoncardosoti/flowable-engine,zwets/flowable-engine,motorina0/flowable-engine,stephraleigh/flowable-engine,marcus-nl/flowable-engine,robsoncardosoti/flowable-engine,zwets/flowable-engine,roberthafner/flowable-engine,sibok666/flowable-engine,gro-mar/flowable-engine,lsmall/flowable-engine,flowable/flowable-engine,stefan-ziel/Activiti,robsoncardosoti/flowable-engine,yvoswillens/flowable-engine,flowable/flowable-engine,martin-grofcik/flowable-engine,paulstapleton/flowable-engine,stephraleigh/flowable-engine,dbmalkovsky/flowable-engine,stephraleigh/flowable-engine,motorina0/flowable-engine,motorina0/flowable-engine,Activiti/Activiti,roberthafner/flowable-engine,stefan-ziel/Activiti,lsmall/flowable-engine,yvoswillens/flowable-engine,paulstapleton/flowable-engine,marcus-nl/flowable-engine,robsoncardosoti/flowable-engine,roberthafner/flowable-engine,stephraleigh/flowable-engine,stefan-ziel/Activiti,zwets/flowable-engine,Activiti/Activiti,martin-grofcik/flowable-engine,dbmalkovsky/flowable-engine,paulstapleton/flowable-engine,sibok666/flowable-engine,gro-mar/flowable-engine,motorina0/flowable-engine,flowable/flowable-engine | asciidoc | ## Code Before:
= Activiti User Guide
v 5.17.1-SNAPSHOT
:doctype: book
:toc: left
:icons: font
:numbered:
:source-highlighter: pygments
:pygments-css: class
:pygments-linenums-mode: table
:compat-mode:
include::ch01-Introduction.adoc[]
include::ch02-GettingStarted.adoc[]
include::ch03-Configuration.adoc[]
include::ch04-API.adoc[]
include::ch05-Spring.adoc[]
include::ch06-Deployment.adoc[]
include::ch07a-BPMN-Introduction.adoc[]
include::ch07b-BPMN-Constructs.adoc[]
include::ch08-Forms.adoc[]
include::ch09-JPA.adoc[]
include::ch10-History.adoc[]
include::ch11-Designer.adoc[]
include::ch12-Explorer.adoc[]
include::ch13-Modeler.adoc[]
include::ch14-REST.adoc[]
include::ch15-Cdi.adoc[]
include::ch16-Ldap.adoc[]
include::ch17-Advanced.adoc[]
include::ch18-Simulation.adoc[]
include::ch19-operation-control.adoc[]
## Instruction:
Update userguide: show more levels in the TOC
## Code After:
= Activiti User Guide
v 5.17.1-SNAPSHOT
:doctype: book
:toc: left
:toclevels: 5
:icons: font
:numbered:
:source-highlighter: pygments
:pygments-css: class
:pygments-linenums-mode: table
:compat-mode:
include::ch01-Introduction.adoc[]
include::ch02-GettingStarted.adoc[]
include::ch03-Configuration.adoc[]
include::ch04-API.adoc[]
include::ch05-Spring.adoc[]
include::ch06-Deployment.adoc[]
include::ch07a-BPMN-Introduction.adoc[]
include::ch07b-BPMN-Constructs.adoc[]
include::ch08-Forms.adoc[]
include::ch09-JPA.adoc[]
include::ch10-History.adoc[]
include::ch11-Designer.adoc[]
include::ch12-Explorer.adoc[]
include::ch13-Modeler.adoc[]
include::ch14-REST.adoc[]
include::ch15-Cdi.adoc[]
include::ch16-Ldap.adoc[]
include::ch17-Advanced.adoc[]
include::ch18-Simulation.adoc[]
include::ch19-operation-control.adoc[]
| = Activiti User Guide
v 5.17.1-SNAPSHOT
:doctype: book
:toc: left
+ :toclevels: 5
:icons: font
:numbered:
:source-highlighter: pygments
:pygments-css: class
:pygments-linenums-mode: table
:compat-mode:
include::ch01-Introduction.adoc[]
include::ch02-GettingStarted.adoc[]
include::ch03-Configuration.adoc[]
include::ch04-API.adoc[]
include::ch05-Spring.adoc[]
include::ch06-Deployment.adoc[]
include::ch07a-BPMN-Introduction.adoc[]
include::ch07b-BPMN-Constructs.adoc[]
include::ch08-Forms.adoc[]
include::ch09-JPA.adoc[]
include::ch10-History.adoc[]
include::ch11-Designer.adoc[]
include::ch12-Explorer.adoc[]
include::ch13-Modeler.adoc[]
include::ch14-REST.adoc[]
include::ch15-Cdi.adoc[]
include::ch16-Ldap.adoc[]
include::ch17-Advanced.adoc[]
include::ch18-Simulation.adoc[]
include::ch19-operation-control.adoc[] | 1 | 0.019608 | 1 | 0 |
b513cf037a087383d161ac1b2b85bb60dd282526 | alacritty/.config/alacritty/alacritty.yml | alacritty/.config/alacritty/alacritty.yml | font:
# The normal (roman) font face to use.
normal:
family: IBM Plex Mono
style: Regular
bold:
family: IBM Plex Mono
style: Bold
italic:
family: IBM Plex Mono
style: Italic
| font:
size: 12.0
# The normal (roman) font face to use.
normal:
family: IBM Plex Mono
style: Regular
bold:
family: IBM Plex Mono
style: Bold
italic:
family: IBM Plex Mono
style: Italic
| Change the terminal font size | Change the terminal font size
| YAML | mpl-2.0 | steveno/dotfiles | yaml | ## Code Before:
font:
# The normal (roman) font face to use.
normal:
family: IBM Plex Mono
style: Regular
bold:
family: IBM Plex Mono
style: Bold
italic:
family: IBM Plex Mono
style: Italic
## Instruction:
Change the terminal font size
## Code After:
font:
size: 12.0
# The normal (roman) font face to use.
normal:
family: IBM Plex Mono
style: Regular
bold:
family: IBM Plex Mono
style: Bold
italic:
family: IBM Plex Mono
style: Italic
| font:
+ size: 12.0
# The normal (roman) font face to use.
normal:
family: IBM Plex Mono
style: Regular
bold:
family: IBM Plex Mono
style: Bold
italic:
family: IBM Plex Mono
style: Italic | 1 | 0.076923 | 1 | 0 |
ca3556d80ec7ff39a96536c656c7280bdc679ebe | .travis.yml | .travis.yml | language: csharp
solution: BencodeNET.sln
install:
- nuget restore BencodeNET.sln
- nuget install xunit.runners -Version 1.9.2 -OutputDirectory xunit
script:
- xbuild /p:Configuration=Release BencodeNET.sln
- mono ./xunit/xunit.runners.1.9.2/tools/xunit.console.clr4.exe ./BencodeNET.Tests/bin/Release/BencodeNET.Tests.dll
| language: csharp
solution: BencodeNET.sln
mono:
- 4.2
install:
- nuget restore BencodeNET.sln
- nuget install xunit.runner.console -Version 2.0.0 -OutputDirectory xunit
script:
- xbuild /p:Configuration=Release BencodeNET.sln
- mono ./xunit/xunit.runner.console.2.0.0/tools/xunit.console.exe ./BencodeNET.Tests/bin/Release/BencodeNET.Tests.dll
| Revert to xunit 2.0 and mono 4.2 | Revert to xunit 2.0 and mono 4.2 | YAML | unlicense | Krusen/BencodeNET | yaml | ## Code Before:
language: csharp
solution: BencodeNET.sln
install:
- nuget restore BencodeNET.sln
- nuget install xunit.runners -Version 1.9.2 -OutputDirectory xunit
script:
- xbuild /p:Configuration=Release BencodeNET.sln
- mono ./xunit/xunit.runners.1.9.2/tools/xunit.console.clr4.exe ./BencodeNET.Tests/bin/Release/BencodeNET.Tests.dll
## Instruction:
Revert to xunit 2.0 and mono 4.2
## Code After:
language: csharp
solution: BencodeNET.sln
mono:
- 4.2
install:
- nuget restore BencodeNET.sln
- nuget install xunit.runner.console -Version 2.0.0 -OutputDirectory xunit
script:
- xbuild /p:Configuration=Release BencodeNET.sln
- mono ./xunit/xunit.runner.console.2.0.0/tools/xunit.console.exe ./BencodeNET.Tests/bin/Release/BencodeNET.Tests.dll
| language: csharp
solution: BencodeNET.sln
+ mono:
+ - 4.2
install:
- nuget restore BencodeNET.sln
- - nuget install xunit.runners -Version 1.9.2 -OutputDirectory xunit
? ^ ^ ^
+ - nuget install xunit.runner.console -Version 2.0.0 -OutputDirectory xunit
? ++++ +++ ^ ^ ^
script:
- xbuild /p:Configuration=Release BencodeNET.sln
- - mono ./xunit/xunit.runners.1.9.2/tools/xunit.console.clr4.exe ./BencodeNET.Tests/bin/Release/BencodeNET.Tests.dll
? ^^^^ -----
+ - mono ./xunit/xunit.runner.console.2.0.0/tools/xunit.console.exe ./BencodeNET.Tests/bin/Release/BencodeNET.Tests.dll
? ++++ ^^^ ++++
| 6 | 0.6 | 4 | 2 |
eff661b69750fba61fd49e19c1918eac139f5e7d | KSPhotoBrowser/KSPhotoItem.h | KSPhotoBrowser/KSPhotoItem.h | //
// KSPhotoItem.h
// KSPhotoBrowser
//
// Created by Kyle Sun on 12/25/16.
// Copyright © 2016 Kyle Sun. All rights reserved.
//
#import <UIKit/UIKit.h>
NS_ASSUME_NONNULL_BEGIN
@interface KSPhotoItem : NSObject
@property (nonatomic, strong, readonly) UIView *sourceView;
@property (nonatomic, strong, readonly) UIImage *thumbImage;
@property (nonatomic, strong, readonly) UIImage *image;
@property (nonatomic, strong, readonly) NSURL *imageUrl;
@property (nonatomic, assign) BOOL finished;
- (instancetype)initWithSourceView:(UIView *)view
thumbImage:(UIImage *)image
imageUrl:(NSURL *)url;
- (instancetype)initWithSourceView:(UIImageView *)view
imageUrl:(NSURL *)url;
- (instancetype)initWithSourceView:(UIImageView *)view
image:(UIImage *)image;
+ (instancetype)itemWithSourceView:(UIView *)view
thumbImage:(UIImage *)image
imageUrl:(NSURL *)url;
+ (instancetype)itemWithSourceView:(UIImageView *)view
imageUrl:(NSURL *)url;
+ (instancetype)itemWithSourceView:(UIImageView *)view
image:(UIImage *)image;
@end
NS_ASSUME_NONNULL_END
| //
// KSPhotoItem.h
// KSPhotoBrowser
//
// Created by Kyle Sun on 12/25/16.
// Copyright © 2016 Kyle Sun. All rights reserved.
//
#import <UIKit/UIKit.h>
@interface KSPhotoItem : NSObject
@property (nonatomic, strong, readonly, nullable) UIView *sourceView;
@property (nonatomic, strong, readonly, nullable) UIImage *thumbImage;
@property (nonatomic, strong, readonly, nullable) UIImage *image;
@property (nonatomic, strong, readonly, nullable) NSURL *imageUrl;
@property (nonatomic, assign) BOOL finished;
- (nonnull instancetype)initWithSourceView:(nullable UIView *)view
thumbImage:(nullable UIImage *)image
imageUrl:(nullable NSURL *)url;
- (nonnull instancetype)initWithSourceView:(nullable UIImageView * )view
imageUrl:(nullable NSURL *)url;
- (nonnull instancetype)initWithSourceView:(nullable UIImageView *)view
image:(nullable UIImage *)image;
+ (nonnull instancetype)itemWithSourceView:(nullable UIView *)view
thumbImage:(nullable UIImage *)image
imageUrl:(nullable NSURL *)url;
+ (nonnull instancetype)itemWithSourceView:(nullable UIImageView *)view
imageUrl:(nullable NSURL *)url;
+ (nonnull instancetype)itemWithSourceView:(nullable UIImageView *)view
image:(nullable UIImage *)image;
@end
| Support nullable Item sourceView in Swift. | Support nullable Item sourceView in Swift.
| C | mit | skx926/KSPhotoBrowser | c | ## Code Before:
//
// KSPhotoItem.h
// KSPhotoBrowser
//
// Created by Kyle Sun on 12/25/16.
// Copyright © 2016 Kyle Sun. All rights reserved.
//
#import <UIKit/UIKit.h>
NS_ASSUME_NONNULL_BEGIN
@interface KSPhotoItem : NSObject
@property (nonatomic, strong, readonly) UIView *sourceView;
@property (nonatomic, strong, readonly) UIImage *thumbImage;
@property (nonatomic, strong, readonly) UIImage *image;
@property (nonatomic, strong, readonly) NSURL *imageUrl;
@property (nonatomic, assign) BOOL finished;
- (instancetype)initWithSourceView:(UIView *)view
thumbImage:(UIImage *)image
imageUrl:(NSURL *)url;
- (instancetype)initWithSourceView:(UIImageView *)view
imageUrl:(NSURL *)url;
- (instancetype)initWithSourceView:(UIImageView *)view
image:(UIImage *)image;
+ (instancetype)itemWithSourceView:(UIView *)view
thumbImage:(UIImage *)image
imageUrl:(NSURL *)url;
+ (instancetype)itemWithSourceView:(UIImageView *)view
imageUrl:(NSURL *)url;
+ (instancetype)itemWithSourceView:(UIImageView *)view
image:(UIImage *)image;
@end
NS_ASSUME_NONNULL_END
## Instruction:
Support nullable Item sourceView in Swift.
## Code After:
//
// KSPhotoItem.h
// KSPhotoBrowser
//
// Created by Kyle Sun on 12/25/16.
// Copyright © 2016 Kyle Sun. All rights reserved.
//
#import <UIKit/UIKit.h>
@interface KSPhotoItem : NSObject
@property (nonatomic, strong, readonly, nullable) UIView *sourceView;
@property (nonatomic, strong, readonly, nullable) UIImage *thumbImage;
@property (nonatomic, strong, readonly, nullable) UIImage *image;
@property (nonatomic, strong, readonly, nullable) NSURL *imageUrl;
@property (nonatomic, assign) BOOL finished;
- (nonnull instancetype)initWithSourceView:(nullable UIView *)view
thumbImage:(nullable UIImage *)image
imageUrl:(nullable NSURL *)url;
- (nonnull instancetype)initWithSourceView:(nullable UIImageView * )view
imageUrl:(nullable NSURL *)url;
- (nonnull instancetype)initWithSourceView:(nullable UIImageView *)view
image:(nullable UIImage *)image;
+ (nonnull instancetype)itemWithSourceView:(nullable UIView *)view
thumbImage:(nullable UIImage *)image
imageUrl:(nullable NSURL *)url;
+ (nonnull instancetype)itemWithSourceView:(nullable UIImageView *)view
imageUrl:(nullable NSURL *)url;
+ (nonnull instancetype)itemWithSourceView:(nullable UIImageView *)view
image:(nullable UIImage *)image;
@end
| //
// KSPhotoItem.h
// KSPhotoBrowser
//
// Created by Kyle Sun on 12/25/16.
// Copyright © 2016 Kyle Sun. All rights reserved.
//
#import <UIKit/UIKit.h>
- NS_ASSUME_NONNULL_BEGIN
-
@interface KSPhotoItem : NSObject
- @property (nonatomic, strong, readonly) UIView *sourceView;
+ @property (nonatomic, strong, readonly, nullable) UIView *sourceView;
? ++++++++++
- @property (nonatomic, strong, readonly) UIImage *thumbImage;
+ @property (nonatomic, strong, readonly, nullable) UIImage *thumbImage;
? ++++++++++
- @property (nonatomic, strong, readonly) UIImage *image;
+ @property (nonatomic, strong, readonly, nullable) UIImage *image;
? ++++++++++
- @property (nonatomic, strong, readonly) NSURL *imageUrl;
+ @property (nonatomic, strong, readonly, nullable) NSURL *imageUrl;
? ++++++++++
@property (nonatomic, assign) BOOL finished;
- - (instancetype)initWithSourceView:(UIView *)view
+ - (nonnull instancetype)initWithSourceView:(nullable UIView *)view
? ++++++++ +++++++++
- thumbImage:(UIImage *)image
+ thumbImage:(nullable UIImage *)image
? ++++++++ +++++++++
- imageUrl:(NSURL *)url;
+ imageUrl:(nullable NSURL *)url;
? ++++++++ +++++++++
+ - (nonnull instancetype)initWithSourceView:(nullable UIImageView * )view
+ imageUrl:(nullable NSURL *)url;
- - (instancetype)initWithSourceView:(UIImageView *)view
+ - (nonnull instancetype)initWithSourceView:(nullable UIImageView *)view
? ++++++++ +++++++++
- imageUrl:(NSURL *)url;
- - (instancetype)initWithSourceView:(UIImageView *)view
- image:(UIImage *)image;
+ image:(nullable UIImage *)image;
? ++++++++ +++++++++
- + (instancetype)itemWithSourceView:(UIView *)view
+ + (nonnull instancetype)itemWithSourceView:(nullable UIView *)view
? ++++++++ +++++++++
- thumbImage:(UIImage *)image
+ thumbImage:(nullable UIImage *)image
? +++++++ +++++++++
- imageUrl:(NSURL *)url;
+ imageUrl:(nullable NSURL *)url;
? +++++++ +++++++++
- + (instancetype)itemWithSourceView:(UIImageView *)view
+ + (nonnull instancetype)itemWithSourceView:(nullable UIImageView *)view
? ++++++++ +++++++++
- imageUrl:(NSURL *)url;
+ imageUrl:(nullable NSURL *)url;
? +++++++ +++++++++
- + (instancetype)itemWithSourceView:(UIImageView *)view
+ + (nonnull instancetype)itemWithSourceView:(nullable UIImageView *)view
? ++++++++ +++++++++
- image:(UIImage *)image;
+ image:(nullable UIImage *)image;
? +++++++ +++++++++
@end
-
- NS_ASSUME_NONNULL_END | 40 | 1.025641 | 18 | 22 |
d77af9be56c614a2b6643a33564f7ec3d40c6617 | app/templates/_Gruntfile.js | app/templates/_Gruntfile.js | module.exports = function(grunt) {
var requirejsOptions = require('./config');
requirejsOptions.optimize = 'none';
// Project configuration.
grunt.initConfig({
pkg: grunt.file.readJSON('package.json'),
jshint: {
all: ['Gruntfile.js', 'js/**/*.js', 'tests/*.js']
},
karma: {
options: {
configFile: 'tests/karma.conf.js',
runnerPort: 9999,
browsers: ['Chrome']
},
dev: {
autoWatch: true
},
ci: {
singleRun: true,
reporters: ['dots', 'junit', 'coverage'],
junitReporter: {
outputFile: 'test-results.xml'
},
coverageReporter: {
type : 'cobertura',
dir : 'coverage/'
}
// SauceLabs stuff comes here
}
},
requirejs: {
options: requirejsOptions,
bundle: {
options: {
name: 'node_modules/almond/almond.js',
include: '<%= _.slugify(packageName) %>-bundle',
insertRequire: ['<%= _.slugify(packageName) %>-bundle'],
out: 'build/bundle.min.js',
excludeShallow: ['jquery']
}
}
}
});
grunt.loadNpmTasks('grunt-contrib-requirejs');
grunt.loadNpmTasks('grunt-contrib-jshint');
grunt.loadNpmTasks('grunt-karma');
grunt.registerTask('default', ['requirejs:bundle']);
};
| module.exports = function(grunt) {
var requirejsOptions = require('./config');
requirejsOptions.optimize = 'none';
// Project configuration.
grunt.initConfig({
pkg: grunt.file.readJSON('package.json'),
jshint: {
all: ['Gruntfile.js', 'js/**/*.js', 'tests/*.js']
},
karma: {
options: {
configFile: 'tests/karma.conf.js',
runnerPort: 9999,
browsers: ['Chrome']
},
dev: {
autoWatch: true
},
ci: {
singleRun: true,
reporters: ['dots', 'junit', 'coverage'],
junitReporter: {
outputFile: 'test-results.xml'
},
coverageReporter: {
type : 'cobertura',
dir : 'coverage/'
}
// SauceLabs stuff comes here
}
},
requirejs: {
options: requirejsOptions,
bundle: {
options: {
name: 'node_modules/almond/almond.js',
include: '<%= _.slugify(packageName) %>-bundle',
insertRequire: ['<%= _.slugify(packageName) %>-bundle'],
out: 'build/bundle.js',
excludeShallow: ['jquery']
}
}
},
uglify: {
widgets: {
files: {
'build/bundle.min.js': ['build/bundle.js']
}
}
}
});
grunt.loadNpmTasks('grunt-contrib-requirejs');
grunt.loadNpmTasks('grunt-contrib-jshint');
grunt.loadNpmTasks('grunt-karma');
grunt.registerTask('default', ['requirejs:bundle']);
};
| Include an uglified version when compiling | Include an uglified version when compiling
| JavaScript | mit | collective/generator-plonemockup,collective/generator-plonemockup | javascript | ## Code Before:
module.exports = function(grunt) {
var requirejsOptions = require('./config');
requirejsOptions.optimize = 'none';
// Project configuration.
grunt.initConfig({
pkg: grunt.file.readJSON('package.json'),
jshint: {
all: ['Gruntfile.js', 'js/**/*.js', 'tests/*.js']
},
karma: {
options: {
configFile: 'tests/karma.conf.js',
runnerPort: 9999,
browsers: ['Chrome']
},
dev: {
autoWatch: true
},
ci: {
singleRun: true,
reporters: ['dots', 'junit', 'coverage'],
junitReporter: {
outputFile: 'test-results.xml'
},
coverageReporter: {
type : 'cobertura',
dir : 'coverage/'
}
// SauceLabs stuff comes here
}
},
requirejs: {
options: requirejsOptions,
bundle: {
options: {
name: 'node_modules/almond/almond.js',
include: '<%= _.slugify(packageName) %>-bundle',
insertRequire: ['<%= _.slugify(packageName) %>-bundle'],
out: 'build/bundle.min.js',
excludeShallow: ['jquery']
}
}
}
});
grunt.loadNpmTasks('grunt-contrib-requirejs');
grunt.loadNpmTasks('grunt-contrib-jshint');
grunt.loadNpmTasks('grunt-karma');
grunt.registerTask('default', ['requirejs:bundle']);
};
## Instruction:
Include an uglified version when compiling
## Code After:
module.exports = function(grunt) {
var requirejsOptions = require('./config');
requirejsOptions.optimize = 'none';
// Project configuration.
grunt.initConfig({
pkg: grunt.file.readJSON('package.json'),
jshint: {
all: ['Gruntfile.js', 'js/**/*.js', 'tests/*.js']
},
karma: {
options: {
configFile: 'tests/karma.conf.js',
runnerPort: 9999,
browsers: ['Chrome']
},
dev: {
autoWatch: true
},
ci: {
singleRun: true,
reporters: ['dots', 'junit', 'coverage'],
junitReporter: {
outputFile: 'test-results.xml'
},
coverageReporter: {
type : 'cobertura',
dir : 'coverage/'
}
// SauceLabs stuff comes here
}
},
requirejs: {
options: requirejsOptions,
bundle: {
options: {
name: 'node_modules/almond/almond.js',
include: '<%= _.slugify(packageName) %>-bundle',
insertRequire: ['<%= _.slugify(packageName) %>-bundle'],
out: 'build/bundle.js',
excludeShallow: ['jquery']
}
}
},
uglify: {
widgets: {
files: {
'build/bundle.min.js': ['build/bundle.js']
}
}
}
});
grunt.loadNpmTasks('grunt-contrib-requirejs');
grunt.loadNpmTasks('grunt-contrib-jshint');
grunt.loadNpmTasks('grunt-karma');
grunt.registerTask('default', ['requirejs:bundle']);
};
| module.exports = function(grunt) {
var requirejsOptions = require('./config');
requirejsOptions.optimize = 'none';
// Project configuration.
grunt.initConfig({
pkg: grunt.file.readJSON('package.json'),
jshint: {
all: ['Gruntfile.js', 'js/**/*.js', 'tests/*.js']
},
karma: {
options: {
configFile: 'tests/karma.conf.js',
runnerPort: 9999,
browsers: ['Chrome']
},
dev: {
autoWatch: true
},
ci: {
singleRun: true,
reporters: ['dots', 'junit', 'coverage'],
junitReporter: {
outputFile: 'test-results.xml'
},
coverageReporter: {
type : 'cobertura',
dir : 'coverage/'
}
// SauceLabs stuff comes here
}
},
requirejs: {
options: requirejsOptions,
bundle: {
options: {
name: 'node_modules/almond/almond.js',
include: '<%= _.slugify(packageName) %>-bundle',
insertRequire: ['<%= _.slugify(packageName) %>-bundle'],
- out: 'build/bundle.min.js',
? ----
+ out: 'build/bundle.js',
excludeShallow: ['jquery']
+ }
+ }
+ },
+ uglify: {
+ widgets: {
+ files: {
+ 'build/bundle.min.js': ['build/bundle.js']
}
}
}
});
grunt.loadNpmTasks('grunt-contrib-requirejs');
grunt.loadNpmTasks('grunt-contrib-jshint');
grunt.loadNpmTasks('grunt-karma');
grunt.registerTask('default', ['requirejs:bundle']);
}; | 9 | 0.163636 | 8 | 1 |
6546d57a7a3a4612dd6137b90b1319b2834479cc | pom.xml | pom.xml | <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>plugin</artifactId>
<version>1.285</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>promoted-builds</artifactId>
<packaging>hpi</packaging>
<name>Hudson promoted builds plugin</name>
<version>1.2-SNAPSHOT</version>
<dependencies>
<!-- for testing purpose -->
<dependency>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>batch-task</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
| <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>plugin</artifactId>
<version>1.285</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>promoted-builds</artifactId>
<packaging>hpi</packaging>
<name>Hudson promoted builds plugin</name>
<url>http://wiki.hudson-ci.org//display/HUDSON/Promoted+Builds+Plugin</url>
<version>1.2-SNAPSHOT</version>
<repositories>
<repository>
<id>m.g.o-public</id>
<url>http://maven.glassfish.org/content/groups/public/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<dependencies>
<!-- for testing purpose -->
<dependency>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>batch-task</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
| Add m.g.o repository so that plugin release tasks work Add URL so that wiki page is correctly referenced in release notice RSS feed etc | Add m.g.o repository so that plugin release tasks work
Add URL so that wiki page is correctly referenced in release notice RSS feed etc | XML | mit | Brantone/promoted-builds-plugin,bekriebel/promoted-builds-plugin,vStone/promoted-builds-plugin,oleg-nenashev/promoted-builds-plugin,J-cztery/promoted-builds-plugin,oleg-nenashev/promoted-builds-plugin,bbideep/promoted-builds-plugin,bekriebel/promoted-builds-plugin,denschu/promoted-builds-plugin,tomrom95/promoted-builds-plugin,bbideep/promoted-builds-plugin,tomrom95/promoted-builds-plugin,dnozay/promoted-builds-plugin,dnozay/promoted-builds-plugin,vStone/promoted-builds-plugin,jenkinsci/promoted-builds-plugin,dnozay/promoted-builds-plugin,denschu/promoted-builds-plugin,bbideep/promoted-builds-plugin,Brantone/promoted-builds-plugin,vStone/promoted-builds-plugin,jenkinsci/promoted-builds-plugin,J-cztery/promoted-builds-plugin,J-cztery/promoted-builds-plugin,Brantone/promoted-builds-plugin,bekriebel/promoted-builds-plugin,denschu/promoted-builds-plugin,oleg-nenashev/promoted-builds-plugin,tomrom95/promoted-builds-plugin | xml | ## Code Before:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>plugin</artifactId>
<version>1.285</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>promoted-builds</artifactId>
<packaging>hpi</packaging>
<name>Hudson promoted builds plugin</name>
<version>1.2-SNAPSHOT</version>
<dependencies>
<!-- for testing purpose -->
<dependency>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>batch-task</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
## Instruction:
Add m.g.o repository so that plugin release tasks work
Add URL so that wiki page is correctly referenced in release notice RSS feed etc
## Code After:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>plugin</artifactId>
<version>1.285</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>promoted-builds</artifactId>
<packaging>hpi</packaging>
<name>Hudson promoted builds plugin</name>
<url>http://wiki.hudson-ci.org//display/HUDSON/Promoted+Builds+Plugin</url>
<version>1.2-SNAPSHOT</version>
<repositories>
<repository>
<id>m.g.o-public</id>
<url>http://maven.glassfish.org/content/groups/public/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<dependencies>
<!-- for testing purpose -->
<dependency>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>batch-task</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
| <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>plugin</artifactId>
<version>1.285</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>promoted-builds</artifactId>
<packaging>hpi</packaging>
<name>Hudson promoted builds plugin</name>
+ <url>http://wiki.hudson-ci.org//display/HUDSON/Promoted+Builds+Plugin</url>
<version>1.2-SNAPSHOT</version>
+ <repositories>
+ <repository>
+ <id>m.g.o-public</id>
+ <url>http://maven.glassfish.org/content/groups/public/</url>
+ <releases>
+ <enabled>true</enabled>
+ </releases>
+ <snapshots>
+ <enabled>false</enabled>
+ </snapshots>
+ </repository>
+ </repositories>
<dependencies>
<!-- for testing purpose -->
<dependency>
<groupId>org.jvnet.hudson.plugins</groupId>
<artifactId>batch-task</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
</dependencies>
</project> | 13 | 0.541667 | 13 | 0 |
80311f8e2a56cd176ef220533f71ecde20f1fa15 | lib/frontend_server.rb | lib/frontend_server.rb | require "frontend_server/version"
require 'yaml'
require 'rack/reverse_proxy'
require 'rack/rewrite'
require 'rake-pipeline'
require 'rake-pipeline/middleware'
require 'rake-pipeline-web-filters'
require 'rake-pipeline-web-filters/erb_filter'
require 'frontend_server/middleware/add_header'
require 'frontend_server/middleware/pipeline_reloader'
require 'frontend_server/pipeline'
require 'frontend_server/rack'
module FrontendServer
class Application
include Pipeline
include Rack
attr_accessor :root
attr_reader :config
class << self
def configure(&block)
@callbacks ||= []
@callbacks << block
end
def configurations
@callbacks || []
end
def root=(val)
@root = val
end
def root
@root
end
end
def initialize
boot!
end
def env
ENV['RACK_ENV'] || 'development'
end
def boot!
@config = OpenStruct.new(YAML.load(ERB.new(File.read("#{root}/config/application.yml")).result)[env])
begin
require "#{root}/config/environment.rb"
# Now require the individual enviroment files
# that can be used to add middleware and all the
# other standard rack stuff
require "#{root}/config/#{env}.rb"
rescue LoadError
end
end
def root
self.class.root
end
def production?
env == 'production'
end
def development?
env == 'development'
end
end
end
| require "frontend_server/version"
require 'yaml'
require 'rack/reverse_proxy'
require 'rack/rewrite'
require 'rake-pipeline'
require 'rake-pipeline/middleware'
require 'rake-pipeline-web-filters'
require 'rake-pipeline-web-filters/erb_filter'
require 'frontend_server/middleware/add_header'
require 'frontend_server/middleware/pipeline_reloader'
require 'frontend_server/pipeline'
require 'frontend_server/rack'
module FrontendServer
class Application
include Pipeline
include Rack
attr_accessor :root
attr_reader :config
class << self
def configure(&block)
@callbacks ||= []
@callbacks << block
end
def configurations
@callbacks || []
end
def root=(val)
@root = val
end
def root
@root
end
end
def initialize
boot!
end
def env
ENV['RACK_ENV'] || 'development'
end
def boot!
@config = OpenStruct.new(YAML.load(ERB.new(File.read("#{root}/config/settings.yml")).result)[env])
begin
require "#{root}/config/application.rb"
# Now require the individual enviroment files
# that can be used to add middleware and all the
# other standard rack stuff
require "#{root}/config/#{env}.rb"
rescue LoadError
end
end
def root
self.class.root
end
def production?
env == 'production'
end
def development?
env == 'development'
end
end
end
| Reorganize for heroku build pack | Reorganize for heroku build pack
| Ruby | mit | radiumsoftware/iridium,radiumsoftware/iridium,johnthethird/frontend_server,radiumsoftware/iridium | ruby | ## Code Before:
require "frontend_server/version"
require 'yaml'
require 'rack/reverse_proxy'
require 'rack/rewrite'
require 'rake-pipeline'
require 'rake-pipeline/middleware'
require 'rake-pipeline-web-filters'
require 'rake-pipeline-web-filters/erb_filter'
require 'frontend_server/middleware/add_header'
require 'frontend_server/middleware/pipeline_reloader'
require 'frontend_server/pipeline'
require 'frontend_server/rack'
module FrontendServer
class Application
include Pipeline
include Rack
attr_accessor :root
attr_reader :config
class << self
def configure(&block)
@callbacks ||= []
@callbacks << block
end
def configurations
@callbacks || []
end
def root=(val)
@root = val
end
def root
@root
end
end
def initialize
boot!
end
def env
ENV['RACK_ENV'] || 'development'
end
def boot!
@config = OpenStruct.new(YAML.load(ERB.new(File.read("#{root}/config/application.yml")).result)[env])
begin
require "#{root}/config/environment.rb"
# Now require the individual enviroment files
# that can be used to add middleware and all the
# other standard rack stuff
require "#{root}/config/#{env}.rb"
rescue LoadError
end
end
def root
self.class.root
end
def production?
env == 'production'
end
def development?
env == 'development'
end
end
end
## Instruction:
Reorganize for heroku build pack
## Code After:
require "frontend_server/version"
require 'yaml'
require 'rack/reverse_proxy'
require 'rack/rewrite'
require 'rake-pipeline'
require 'rake-pipeline/middleware'
require 'rake-pipeline-web-filters'
require 'rake-pipeline-web-filters/erb_filter'
require 'frontend_server/middleware/add_header'
require 'frontend_server/middleware/pipeline_reloader'
require 'frontend_server/pipeline'
require 'frontend_server/rack'
module FrontendServer
class Application
include Pipeline
include Rack
attr_accessor :root
attr_reader :config
class << self
def configure(&block)
@callbacks ||= []
@callbacks << block
end
def configurations
@callbacks || []
end
def root=(val)
@root = val
end
def root
@root
end
end
def initialize
boot!
end
def env
ENV['RACK_ENV'] || 'development'
end
def boot!
@config = OpenStruct.new(YAML.load(ERB.new(File.read("#{root}/config/settings.yml")).result)[env])
begin
require "#{root}/config/application.rb"
# Now require the individual enviroment files
# that can be used to add middleware and all the
# other standard rack stuff
require "#{root}/config/#{env}.rb"
rescue LoadError
end
end
def root
self.class.root
end
def production?
env == 'production'
end
def development?
env == 'development'
end
end
end
| require "frontend_server/version"
require 'yaml'
require 'rack/reverse_proxy'
require 'rack/rewrite'
require 'rake-pipeline'
require 'rake-pipeline/middleware'
require 'rake-pipeline-web-filters'
require 'rake-pipeline-web-filters/erb_filter'
require 'frontend_server/middleware/add_header'
require 'frontend_server/middleware/pipeline_reloader'
require 'frontend_server/pipeline'
require 'frontend_server/rack'
module FrontendServer
class Application
include Pipeline
include Rack
attr_accessor :root
attr_reader :config
class << self
def configure(&block)
@callbacks ||= []
@callbacks << block
end
def configurations
@callbacks || []
end
def root=(val)
@root = val
end
def root
@root
end
end
def initialize
boot!
end
def env
ENV['RACK_ENV'] || 'development'
end
def boot!
- @config = OpenStruct.new(YAML.load(ERB.new(File.read("#{root}/config/application.yml")).result)[env])
? ^^^^^^^ -
+ @config = OpenStruct.new(YAML.load(ERB.new(File.read("#{root}/config/settings.yml")).result)[env])
? ^^^ ++
begin
- require "#{root}/config/environment.rb"
? ^^^ ^ ----
+ require "#{root}/config/application.rb"
? ^^^^ ^^^^
# Now require the individual enviroment files
# that can be used to add middleware and all the
# other standard rack stuff
require "#{root}/config/#{env}.rb"
rescue LoadError
end
end
def root
self.class.root
end
def production?
env == 'production'
end
def development?
env == 'development'
end
end
end | 4 | 0.051282 | 2 | 2 |
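The `boot!` method in the frontend_server.rb record above loads an environment-scoped YAML file through ERB and wraps the chosen section in an `OpenStruct`. A self-contained sketch of that pattern (the YAML content and the `proxy_url` key are invented for illustration, not taken from the project):

```ruby
require 'erb'
require 'yaml'
require 'ostruct'

# Mirrors the load sequence in boot!: render the file through ERB first,
# parse the result as YAML, then expose the env-specific section as an
# OpenStruct so keys read as methods (config.proxy_url).
def load_settings(raw_yaml, env)
  OpenStruct.new(YAML.load(ERB.new(raw_yaml).result)[env])
end

# Hypothetical settings content keyed by RACK_ENV value.
raw = <<~YAML
  development:
    proxy_url: http://localhost:3000
  production:
    proxy_url: <%= ENV.fetch('PROXY_URL', 'https://example.com') %>
YAML

config = load_settings(raw, 'development')
puts config.proxy_url  # => http://localhost:3000
```

Renaming the file from `application.yml` to `settings.yml`, as the commit does, only changes the path passed to `File.read`; the load sequence itself is unchanged.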
d7cd3c3635bc6200cd9c8668a025826818f19a80 | app/services/ci/create_pipeline_schedule_service.rb | app/services/ci/create_pipeline_schedule_service.rb | module Ci
class CreatePipelineScheduleService < BaseService
def execute
pipeline_schedule = project.pipeline_schedules.build(pipeline_schedule_params)
if variable_keys_duplicated?
pipeline_schedule.errors.add('variables.key', "keys are duplicated")
return pipeline_schedule
end
pipeline_schedule.save
pipeline_schedule
end
def update(pipeline_schedule)
if variable_keys_duplicated?
pipeline_schedule.errors.add('variables.key', "keys are duplicated")
return false
end
pipeline_schedule.update(pipeline_schedule_params)
end
private
def pipeline_schedule_params
@pipeline_schedule_params ||= params.merge(owner: current_user)
end
def variable_keys_duplicated?
attributes = pipeline_schedule_params['variables_attributes']
return false unless attributes.is_a?(Array)
attributes.map { |v| v['key'] }.uniq.length != attributes.length
end
end
end
| module Ci
class CreatePipelineScheduleService < BaseService
def execute
project.pipeline_schedules.create(pipeline_schedule_params)
end
private
def pipeline_schedule_params
params.merge(owner: current_user)
end
end
end
| Revert extra validation for duplication between same keys on a submit | Revert extra validation for duplication between same keys on a submit
| Ruby | mit | dreampet/gitlab,dplarson/gitlabhq,stoplightio/gitlabhq,mmkassem/gitlabhq,iiet/iiet-git,t-zuehlsdorff/gitlabhq,dreampet/gitlab,iiet/iiet-git,dplarson/gitlabhq,iiet/iiet-git,dreampet/gitlab,axilleas/gitlabhq,dplarson/gitlabhq,stoplightio/gitlabhq,jirutka/gitlabhq,stoplightio/gitlabhq,iiet/iiet-git,axilleas/gitlabhq,dplarson/gitlabhq,mmkassem/gitlabhq,t-zuehlsdorff/gitlabhq,jirutka/gitlabhq,dreampet/gitlab,axilleas/gitlabhq,stoplightio/gitlabhq,axilleas/gitlabhq,jirutka/gitlabhq,t-zuehlsdorff/gitlabhq,jirutka/gitlabhq,mmkassem/gitlabhq,t-zuehlsdorff/gitlabhq,mmkassem/gitlabhq | ruby | ## Code Before:
module Ci
class CreatePipelineScheduleService < BaseService
def execute
pipeline_schedule = project.pipeline_schedules.build(pipeline_schedule_params)
if variable_keys_duplicated?
pipeline_schedule.errors.add('variables.key', "keys are duplicated")
return pipeline_schedule
end
pipeline_schedule.save
pipeline_schedule
end
def update(pipeline_schedule)
if variable_keys_duplicated?
pipeline_schedule.errors.add('variables.key', "keys are duplicated")
return false
end
pipeline_schedule.update(pipeline_schedule_params)
end
private
def pipeline_schedule_params
@pipeline_schedule_params ||= params.merge(owner: current_user)
end
def variable_keys_duplicated?
attributes = pipeline_schedule_params['variables_attributes']
return false unless attributes.is_a?(Array)
attributes.map { |v| v['key'] }.uniq.length != attributes.length
end
end
end
## Instruction:
Revert extra validation for duplication between same keys on a submit
## Code After:
module Ci
class CreatePipelineScheduleService < BaseService
def execute
project.pipeline_schedules.create(pipeline_schedule_params)
end
private
def pipeline_schedule_params
params.merge(owner: current_user)
end
end
end
| module Ci
class CreatePipelineScheduleService < BaseService
def execute
- pipeline_schedule = project.pipeline_schedules.build(pipeline_schedule_params)
-
- if variable_keys_duplicated?
- pipeline_schedule.errors.add('variables.key', "keys are duplicated")
-
- return pipeline_schedule
- end
-
- pipeline_schedule.save
- pipeline_schedule
- end
-
- def update(pipeline_schedule)
- if variable_keys_duplicated?
- pipeline_schedule.errors.add('variables.key', "keys are duplicated")
-
- return false
- end
-
- pipeline_schedule.update(pipeline_schedule_params)
? ^^^
+ project.pipeline_schedules.create(pipeline_schedule_params)
? ++++++++ + ^^^
end
private
def pipeline_schedule_params
+ params.merge(owner: current_user)
- @pipeline_schedule_params ||= params.merge(owner: current_user)
- end
-
- def variable_keys_duplicated?
- attributes = pipeline_schedule_params['variables_attributes']
- return false unless attributes.is_a?(Array)
-
- attributes.map { |v| v['key'] }.uniq.length != attributes.length
end
end
end | 30 | 0.769231 | 2 | 28 |
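The validation this commit reverts can be exercised in isolation. Below is a stand-alone copy of the deleted `variable_keys_duplicated?` check, taking the attribute array as a parameter instead of reading it from `params`:

```ruby
# Stand-alone copy of the duplicate-key check that this commit deletes,
# operating on the same shape of data as variables_attributes.
def variable_keys_duplicated?(attributes)
  return false unless attributes.is_a?(Array)

  attributes.map { |v| v['key'] }.uniq.length != attributes.length
end

puts variable_keys_duplicated?([{ 'key' => 'A' }, { 'key' => 'A' }])  # => true
puts variable_keys_duplicated?([{ 'key' => 'A' }, { 'key' => 'B' }])  # => false
puts variable_keys_duplicated?(nil)                                   # => false
```

After the revert, `execute` simply delegates to `project.pipeline_schedules.create`, so the service itself no longer guards against duplicate variable keys.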
ad6c489c343fba4d8121f2797c6a06dc71087f23 | app/models/promotion_spend_condition.rb | app/models/promotion_spend_condition.rb | class PromotionSpendCondition < PromotionCondition
desc "Minimum Spend"
condition_scope :order
exclusivity_scope :none
position 5
metadata(:config) do
float :amount, :required => true, :greater_than => 0, :default => 0
end
def check(order)
spend = SpookAndPuff::Money.new(amount.to_s)
if order.product_total >= spend
success
else
failure(:insufficient_spend, "Must spend at least #{spend}")
end
end
end
| class PromotionSpendCondition < PromotionCondition
desc "Minimum Spend"
condition_scope :order
exclusivity_scope :none
position 5
metadata(:config) do
float :amount, :required => true, :greater_than => 0, :default => 0
end
def amount_and_kind
SpookAndPuff::Money.new(amount.to_s)
end
def check(order)
spend = SpookAndPuff::Money.new(amount.to_s)
if order.product_total >= spend
success
else
failure(:insufficient_spend, "Must spend at least #{spend}")
end
end
end
| Fix summarising mimimum spend condition. | Fix summarising mimimum spend condition.
| Ruby | mit | spookandpuff/islay-shop,spookandpuff/islay-shop,spookandpuff/islay-shop | ruby | ## Code Before:
class PromotionSpendCondition < PromotionCondition
desc "Minimum Spend"
condition_scope :order
exclusivity_scope :none
position 5
metadata(:config) do
float :amount, :required => true, :greater_than => 0, :default => 0
end
def check(order)
spend = SpookAndPuff::Money.new(amount.to_s)
if order.product_total >= spend
success
else
failure(:insufficient_spend, "Must spend at least #{spend}")
end
end
end
## Instruction:
Fix summarising mimimum spend condition.
## Code After:
class PromotionSpendCondition < PromotionCondition
desc "Minimum Spend"
condition_scope :order
exclusivity_scope :none
position 5
metadata(:config) do
float :amount, :required => true, :greater_than => 0, :default => 0
end
def amount_and_kind
SpookAndPuff::Money.new(amount.to_s)
end
def check(order)
spend = SpookAndPuff::Money.new(amount.to_s)
if order.product_total >= spend
success
else
failure(:insufficient_spend, "Must spend at least #{spend}")
end
end
end
| class PromotionSpendCondition < PromotionCondition
desc "Minimum Spend"
condition_scope :order
exclusivity_scope :none
position 5
metadata(:config) do
float :amount, :required => true, :greater_than => 0, :default => 0
+ end
+
+ def amount_and_kind
+ SpookAndPuff::Money.new(amount.to_s)
end
def check(order)
spend = SpookAndPuff::Money.new(amount.to_s)
if order.product_total >= spend
success
else
failure(:insufficient_spend, "Must spend at least #{spend}")
end
end
end | 4 | 0.2 | 4 | 0 |
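The spend comparison inside `#check` in the promotion_spend_condition.rb record depends on `SpookAndPuff::Money`, which isn't available here; the sketch below substitutes `BigDecimal` (an assumption — the real class also carries currency semantics) to show the same exact-decimal comparison logic:

```ruby
require 'bigdecimal'

# BigDecimal stands in for SpookAndPuff::Money (not available here); like
# the money class, it compares decimal amounts exactly, without float
# rounding error.
def spend_check(product_total, minimum)
  spend = BigDecimal(minimum.to_s)
  if BigDecimal(product_total.to_s) >= spend
    :success
  else
    [:insufficient_spend, "Must spend at least #{spend.to_s('F')}"]
  end
end

puts spend_check(30.0, 25).inspect  # => :success
puts spend_check(10.0, 25).inspect  # => [:insufficient_spend, "Must spend at least 25.0"]
```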
eae73f1969ac2772e05dd04a27606c3bb164f901 | src/themes/bootstrap/templates/footer.tpl | src/themes/bootstrap/templates/footer.tpl | <div class="container" id="footer">
<div class="footer_space"></div>
<div class="rights"><p>© <{if $copyright_since && $copyright_since != date('Y')}><{$copyright_since}>-<{/if}><{'Y'|date}> · <{$copyright}>, All rights reserved. Built with <a href="https://www.activecollab.com/labs/shade" title="Shade builds help portals from Markdown files">Shade v<{shade_version}></a>.</p></div>
<{if !empty($project->getSocialLinks())}>
<div class="social">
<p>Stay up to date with all new features:</p>
<ul class="links">
<{foreach $project->getSocialLinks() as $service}>
<li><a href="<{$service.url}>" target="_blank"><img src="<{theme_asset name=$service.icon page_level=$page_level current_locale=$current_locale}>" title="{$service.name}" alt="{$service.name} icon"></a></li>
<{/foreach}>
</ul>
</div>
<{/if}>
</div>
<{foreach $plugins as $plugin}>
<{$plugin->renderFoot() nofilter}>
<{/foreach}>
| <div class="container" id="footer">
<div class="footer_space"></div>
<div class="rights"><p>©<{if $copyright_since && $copyright_since != date('Y')}><{$copyright_since}>-<{/if}><{'Y'|date}> <{$copyright}>, All rights reserved. Built with <a href="https://www.activecollab.com/labs/shade" title="Shade builds help portals from Markdown files">Shade v<{shade_version}></a>.</p></div>
<{if !empty($project->getSocialLinks())}>
<div class="social">
<p>Stay up to date with all new features:</p>
<ul class="links">
<{foreach $project->getSocialLinks() as $service}>
<li><a href="<{$service.url}>" target="_blank"><img src="<{theme_asset name=$service.icon page_level=$page_level current_locale=$current_locale}>" title="{$service.name}" alt="{$service.name} icon"></a></li>
<{/foreach}>
</ul>
</div>
<{/if}>
</div>
<{foreach $plugins as $plugin}>
<{$plugin->renderFoot() nofilter}>
<{/foreach}>
| Make copyright read like a sentence | Make copyright read like a sentence
| Smarty | mit | activecollab/shade | smarty | ## Code Before:
<div class="container" id="footer">
<div class="footer_space"></div>
<div class="rights"><p>© <{if $copyright_since && $copyright_since != date('Y')}><{$copyright_since}>-<{/if}><{'Y'|date}> · <{$copyright}>, All rights reserved. Built with <a href="https://www.activecollab.com/labs/shade" title="Shade builds help portals from Markdown files">Shade v<{shade_version}></a>.</p></div>
<{if !empty($project->getSocialLinks())}>
<div class="social">
<p>Stay up to date with all new features:</p>
<ul class="links">
<{foreach $project->getSocialLinks() as $service}>
<li><a href="<{$service.url}>" target="_blank"><img src="<{theme_asset name=$service.icon page_level=$page_level current_locale=$current_locale}>" title="{$service.name}" alt="{$service.name} icon"></a></li>
<{/foreach}>
</ul>
</div>
<{/if}>
</div>
<{foreach $plugins as $plugin}>
<{$plugin->renderFoot() nofilter}>
<{/foreach}>
## Instruction:
Make copyright read like a sentence
## Code After:
<div class="container" id="footer">
<div class="footer_space"></div>
<div class="rights"><p>©<{if $copyright_since && $copyright_since != date('Y')}><{$copyright_since}>-<{/if}><{'Y'|date}> <{$copyright}>, All rights reserved. Built with <a href="https://www.activecollab.com/labs/shade" title="Shade builds help portals from Markdown files">Shade v<{shade_version}></a>.</p></div>
<{if !empty($project->getSocialLinks())}>
<div class="social">
<p>Stay up to date with all new features:</p>
<ul class="links">
<{foreach $project->getSocialLinks() as $service}>
<li><a href="<{$service.url}>" target="_blank"><img src="<{theme_asset name=$service.icon page_level=$page_level current_locale=$current_locale}>" title="{$service.name}" alt="{$service.name} icon"></a></li>
<{/foreach}>
</ul>
</div>
<{/if}>
</div>
<{foreach $plugins as $plugin}>
<{$plugin->renderFoot() nofilter}>
<{/foreach}>
| <div class="container" id="footer">
<div class="footer_space"></div>
- <div class="rights"><p>© <{if $copyright_since && $copyright_since != date('Y')}><{$copyright_since}>-<{/if}><{'Y'|date}> · <{$copyright}>, All rights reserved. Built with <a href="https://www.activecollab.com/labs/shade" title="Shade builds help portals from Markdown files">Shade v<{shade_version}></a>.</p></div>
? - ---------
+ <div class="rights"><p>©<{if $copyright_since && $copyright_since != date('Y')}><{$copyright_since}>-<{/if}><{'Y'|date}> <{$copyright}>, All rights reserved. Built with <a href="https://www.activecollab.com/labs/shade" title="Shade builds help portals from Markdown files">Shade v<{shade_version}></a>.</p></div>
<{if !empty($project->getSocialLinks())}>
<div class="social">
<p>Stay up to date with all new features:</p>
<ul class="links">
<{foreach $project->getSocialLinks() as $service}>
<li><a href="<{$service.url}>" target="_blank"><img src="<{theme_asset name=$service.icon page_level=$page_level current_locale=$current_locale}>" title="{$service.name}" alt="{$service.name} icon"></a></li>
<{/foreach}>
</ul>
</div>
<{/if}>
</div>
<{foreach $plugins as $plugin}>
<{$plugin->renderFoot() nofilter}>
<{/foreach}> | 2 | 0.111111 | 1 | 1 |
900fb9299968c543f0621a993026911163ac3878 | src/kato-http-api.coffee | src/kato-http-api.coffee |
module.exports = (robot) ->
robot.on 'kato-http-post', (kato-http-post) ->
sendMessage kato-http-post
robot.brain.data.kato-http-api.rooms or= {}
sendMessage = (message) ->
data = JSON.stringify({
room: message.room || 'all',
text: message.text,
from: message.from || robot.name,
color: message.color || 'grey',
renderer: 'markdown'
})
roomURL = robot.brain.data.kato-http-api.rooms[data.room]
robot.http(roomURL)
.header('Content-Type', 'application/json')
.post(data) (err, res, body) ->
if err
robot.emit 'error', err, res, body
return
robot.emit res, body |
module.exports = (robot) ->
config = secrets:
katoRoomURL: process.env.HUBOT_KATO_ALL_ROOM_HTTP_POST_URL
robot.on 'kato-http-post', (kato-http-post) ->
sendMessage kato-http-post
sendMessage = (message) ->
data = JSON.stringify({
text: message.text,
from: message.from || robot.name,
color: message.color || 'grey',
renderer: 'markdown'
})
robot.http(config.secrets.katoRoomURL)
.header('Content-Type', 'application/json')
.post(data) (err, res, body) ->
if err
robot.emit 'error', err, res, body
return
robot.emit res, body
| Set up environment variable to hold Kato "All" room HTTP POST URL. | Set up environment variable to hold Kato "All" room HTTP POST URL.
| CoffeeScript | mit | ocean/hubot-kato-http-api | coffeescript | ## Code Before:
module.exports = (robot) ->
robot.on 'kato-http-post', (kato-http-post) ->
sendMessage kato-http-post
robot.brain.data.kato-http-api.rooms or= {}
sendMessage = (message) ->
data = JSON.stringify({
room: message.room || 'all',
text: message.text,
from: message.from || robot.name,
color: message.color || 'grey',
renderer: 'markdown'
})
roomURL = robot.brain.data.kato-http-api.rooms[data.room]
robot.http(roomURL)
.header('Content-Type', 'application/json')
.post(data) (err, res, body) ->
if err
robot.emit 'error', err, res, body
return
robot.emit res, body
## Instruction:
Set up environment variable to hold Kato "All" room HTTP POST URL.
## Code After:
module.exports = (robot) ->
config = secrets:
katoRoomURL: process.env.HUBOT_KATO_ALL_ROOM_HTTP_POST_URL
robot.on 'kato-http-post', (kato-http-post) ->
sendMessage kato-http-post
sendMessage = (message) ->
data = JSON.stringify({
text: message.text,
from: message.from || robot.name,
color: message.color || 'grey',
renderer: 'markdown'
})
robot.http(config.secrets.katoRoomURL)
.header('Content-Type', 'application/json')
.post(data) (err, res, body) ->
if err
robot.emit 'error', err, res, body
return
robot.emit res, body
|
module.exports = (robot) ->
+
+ config = secrets:
+ katoRoomURL: process.env.HUBOT_KATO_ALL_ROOM_HTTP_POST_URL
+
robot.on 'kato-http-post', (kato-http-post) ->
sendMessage kato-http-post
+
-
- robot.brain.data.kato-http-api.rooms or= {}
-
sendMessage = (message) ->
data = JSON.stringify({
- room: message.room || 'all',
text: message.text,
from: message.from || robot.name,
color: message.color || 'grey',
renderer: 'markdown'
})
+ robot.http(config.secrets.katoRoomURL)
- roomURL = robot.brain.data.kato-http-api.rooms[data.room]
-
- robot.http(roomURL)
.header('Content-Type', 'application/json')
.post(data) (err, res, body) ->
if err
robot.emit 'error', err, res, body
return
robot.emit res, body | 13 | 0.5 | 6 | 7 |
47d20dd2c506326c4acb0289f3abd787f42f807a | src/main/java/com/solsticesquared/schelling/AppEntry.java | src/main/java/com/solsticesquared/schelling/AppEntry.java | /*
* Copyright 2016 Will Knez <wbknez.dev@gmail.com>
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.solsticesquared.schelling;
/**
* The main driver for the Schelling project.
*/
public final class AppEntry {
/**
* The application entry point.
*
* @param args
* The array of command line arguments, if any.
*/
public static void main(final String[] args) {
}
/**
* Constructor (private).
*/
private AppEntry() {
}
}
| /*
* Copyright 2016 Will Knez <wbknez.dev@gmail.com>
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.solsticesquared.schelling;
import com.solsticesquared.schelling.ui.SchellingExplorerConsole;
import com.solsticesquared.schelling.ui.SchellingExplorerWithUi;
import sim.display.Console;
import sim.display.GUIState;
import sim.engine.SimState;
import javax.swing.SwingUtilities;
/**
* The main driver for the Schelling project.
*/
public final class AppEntry {
/**
* The application entry point.
*
* @param args
* The array of command line arguments, if any.
*/
public static void main(final String[] args) {
SwingUtilities.invokeLater(() -> {
// The random number generation seed.
final long seed = System.currentTimeMillis();
// The underlying Schelling-based model.
final SimState model =
SchellingExplorerUtils.createTwoGroupModel(seed);
// The user interface.
final GUIState view = new SchellingExplorerWithUi(model);
// The user interface controller.
final Console controller = new SchellingExplorerConsole(view);
// Run.
controller.setVisible(true);
});
}
/**
* Constructor (private).
*/
private AppEntry() {
}
}
| Update application entry point to use custom controller implementation. | Update application entry point to use custom controller implementation.
| Java | apache-2.0 | wbknez/schelling,wbknez/schelling | java | ## Code Before:
/*
* Copyright 2016 Will Knez <wbknez.dev@gmail.com>
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.solsticesquared.schelling;
/**
* The main driver for the Schelling project.
*/
public final class AppEntry {
/**
* The application entry point.
*
* @param args
* The array of command line arguments, if any.
*/
public static void main(final String[] args) {
}
/**
* Constructor (private).
*/
private AppEntry() {
}
}
## Instruction:
Update application entry point to use custom controller implementation.
## Code After:
/*
* Copyright 2016 Will Knez <wbknez.dev@gmail.com>
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.solsticesquared.schelling;
import com.solsticesquared.schelling.ui.SchellingExplorerConsole;
import com.solsticesquared.schelling.ui.SchellingExplorerWithUi;
import sim.display.Console;
import sim.display.GUIState;
import sim.engine.SimState;
import javax.swing.SwingUtilities;
/**
* The main driver for the Schelling project.
*/
public final class AppEntry {
/**
* The application entry point.
*
* @param args
* The array of command line arguments, if any.
*/
public static void main(final String[] args) {
SwingUtilities.invokeLater(() -> {
// The random number generation seed.
final long seed = System.currentTimeMillis();
// The underlying Schelling-based model.
final SimState model =
SchellingExplorerUtils.createTwoGroupModel(seed);
// The user interface.
final GUIState view = new SchellingExplorerWithUi(model);
// The user interface controller.
final Console controller = new SchellingExplorerConsole(view);
// Run.
controller.setVisible(true);
});
}
/**
* Constructor (private).
*/
private AppEntry() {
}
}
| /*
* Copyright 2016 Will Knez <wbknez.dev@gmail.com>
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.solsticesquared.schelling;
+ import com.solsticesquared.schelling.ui.SchellingExplorerConsole;
+ import com.solsticesquared.schelling.ui.SchellingExplorerWithUi;
+ import sim.display.Console;
+ import sim.display.GUIState;
+ import sim.engine.SimState;
+
+ import javax.swing.SwingUtilities;
+
/**
* The main driver for the Schelling project.
*/
public final class AppEntry {
/**
* The application entry point.
*
* @param args
* The array of command line arguments, if any.
*/
public static void main(final String[] args) {
+ SwingUtilities.invokeLater(() -> {
+ // The random number generation seed.
+ final long seed = System.currentTimeMillis();
+
+ // The underlying Schelling-based model.
+ final SimState model =
+ SchellingExplorerUtils.createTwoGroupModel(seed);
+ // The user interface.
+ final GUIState view = new SchellingExplorerWithUi(model);
+ // The user interface controller.
+ final Console controller = new SchellingExplorerConsole(view);
+
+ // Run.
+ controller.setVisible(true);
+ });
}
/**
* Constructor (private).
*/
private AppEntry() {
}
} | 23 | 0.605263 | 23 | 0 |
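The record above moves model/view/controller construction onto the Swing event-dispatch thread via `SwingUtilities.invokeLater`. The underlying idea, posting work to a single consumer thread that runs tasks in FIFO order, can be sketched in Python (this is an illustrative analogue, not part of the record):

```python
import queue
import threading

class EventLoop:
    """A minimal single-consumer event loop, loosely mirroring the
    'post work to the UI thread' pattern of SwingUtilities.invokeLater."""

    def __init__(self):
        self._tasks = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def invoke_later(self, fn):
        # Enqueue fn; it runs later on the loop's own thread, in FIFO order.
        self._tasks.put(fn)

    def _run(self):
        while True:
            fn = self._tasks.get()
            if fn is None:  # sentinel pushed by shutdown()
                break
            fn()

    def shutdown(self):
        # Push the sentinel and wait for all previously queued tasks to finish.
        self._tasks.put(None)
        self._thread.join()

calls = []
loop = EventLoop()
loop.invoke_later(lambda: calls.append("model"))
loop.invoke_later(lambda: calls.append("view"))
loop.invoke_later(lambda: calls.append("controller"))
loop.shutdown()
```

After `shutdown()` returns, all queued tasks have executed on the loop thread in submission order.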
7aa140778cd689a8efa86f0890c4ccb8fc7f0d43 | infrastructure/tests/test_api_views.py | infrastructure/tests/test_api_views.py | from django.test import Client, TestCase
from infrastructure import utils
from infrastructure import models
import json
from infrastructure.models import FinancialYear, QuarterlySpendFile, Expenditure, Project
from scorecard.models import Geography
from scorecard.profiles import MunicipalityProfile
from scorecard.admin import MunicipalityProfilesCompilationAdmin
class TestProject(TestCase):
def setUp(self):
fixtures = ["test_infrastructure.json"]
TestProject.geography = Geography.objects.create(
geo_level="municipality",
geo_code="BUF",
province_name="Eastern Cape",
province_code="EC",
category="A",
)
def test_infrastructure_project_search(self):
response = self.client.get(
"/api/v1/infrastructure/search/?province=Eastern+Cape&municipality=Buffalo+City&q=&budget_phase=Budget+year&financial_year=2019%2F2020&ordering=-total_forecast_budget")
self.assertEqual(response.status_code, 200)
js = response.json()
self.assertEquals(len(js["results"]), 3)
| from django.test import TestCase
class TestProject(TestCase):
fixtures = ["test_infrastructure.json"]
def test_infrastructure_project_filters(self):
response = self.client.get(
"/api/v1/infrastructure/search/?q=&province=Western+Cape&municipality=City+of+Cape+Town&project_type=New&function=Administrative+and+Corporate+Support&budget_phase=Budget+year&quarterly_phase=Original+Budget&financial_year=2019%2F2020&ordering=-total_forecast_budget")
self.assertEqual(response.status_code, 200)
js = response.json()
self.assertEquals(js["count"], 2)
self.assertEquals(len(js["results"]["projects"]), 2)
def test_infrastructure_project_search(self):
response = self.client.get(
"/api/v1/infrastructure/search/?q=PC001002004002_00473&budget_phase=Budget+year&quarterly_phase=Original+Budget&financial_year=2019%2F2020&ordering=-total_forecast_budget")
self.assertEqual(response.status_code, 200)
js = response.json()
self.assertEquals(js["count"], 1)
self.assertEquals(len(js["results"]["projects"]), 1)
response = self.client.get(
"/api/v1/infrastructure/search/?q=Acquisition&budget_phase=Budget+year&quarterly_phase=Original+Budget&financial_year=2019%2F2020&ordering=-total_forecast_budget")
self.assertEqual(response.status_code, 200)
js = response.json()
self.assertEquals(js["count"], 1)
self.assertEquals(len(js["results"]["projects"]), 1) | Add test for infra search API and some refactoring | Add test for infra search API and some refactoring
| Python | mit | Code4SA/municipal-data,Code4SA/municipal-data,Code4SA/municipal-data,Code4SA/municipal-data | python | ## Code Before:
from django.test import Client, TestCase
from infrastructure import utils
from infrastructure import models
import json
from infrastructure.models import FinancialYear, QuarterlySpendFile, Expenditure, Project
from scorecard.models import Geography
from scorecard.profiles import MunicipalityProfile
from scorecard.admin import MunicipalityProfilesCompilationAdmin
class TestProject(TestCase):
def setUp(self):
fixtures = ["test_infrastructure.json"]
TestProject.geography = Geography.objects.create(
geo_level="municipality",
geo_code="BUF",
province_name="Eastern Cape",
province_code="EC",
category="A",
)
def test_infrastructure_project_search(self):
response = self.client.get(
"/api/v1/infrastructure/search/?province=Eastern+Cape&municipality=Buffalo+City&q=&budget_phase=Budget+year&financial_year=2019%2F2020&ordering=-total_forecast_budget")
self.assertEqual(response.status_code, 200)
js = response.json()
self.assertEquals(len(js["results"]), 3)
## Instruction:
Add test for infra search API and some refactoring
## Code After:
from django.test import TestCase
class TestProject(TestCase):
fixtures = ["test_infrastructure.json"]
def test_infrastructure_project_filters(self):
response = self.client.get(
"/api/v1/infrastructure/search/?q=&province=Western+Cape&municipality=City+of+Cape+Town&project_type=New&function=Administrative+and+Corporate+Support&budget_phase=Budget+year&quarterly_phase=Original+Budget&financial_year=2019%2F2020&ordering=-total_forecast_budget")
self.assertEqual(response.status_code, 200)
js = response.json()
self.assertEquals(js["count"], 2)
self.assertEquals(len(js["results"]["projects"]), 2)
def test_infrastructure_project_search(self):
response = self.client.get(
"/api/v1/infrastructure/search/?q=PC001002004002_00473&budget_phase=Budget+year&quarterly_phase=Original+Budget&financial_year=2019%2F2020&ordering=-total_forecast_budget")
self.assertEqual(response.status_code, 200)
js = response.json()
self.assertEquals(js["count"], 1)
self.assertEquals(len(js["results"]["projects"]), 1)
response = self.client.get(
"/api/v1/infrastructure/search/?q=Acquisition&budget_phase=Budget+year&quarterly_phase=Original+Budget&financial_year=2019%2F2020&ordering=-total_forecast_budget")
self.assertEqual(response.status_code, 200)
js = response.json()
self.assertEquals(js["count"], 1)
self.assertEquals(len(js["results"]["projects"]), 1) | - from django.test import Client, TestCase
? --------
+ from django.test import TestCase
- from infrastructure import utils
- from infrastructure import models
- import json
-
- from infrastructure.models import FinancialYear, QuarterlySpendFile, Expenditure, Project
-
- from scorecard.models import Geography
- from scorecard.profiles import MunicipalityProfile
- from scorecard.admin import MunicipalityProfilesCompilationAdmin
class TestProject(TestCase):
- def setUp(self):
- fixtures = ["test_infrastructure.json"]
? ----
+ fixtures = ["test_infrastructure.json"]
- TestProject.geography = Geography.objects.create(
- geo_level="municipality",
- geo_code="BUF",
- province_name="Eastern Cape",
- province_code="EC",
- category="A",
- )
+ def test_infrastructure_project_filters(self):
+ response = self.client.get(
+ "/api/v1/infrastructure/search/?q=&province=Western+Cape&municipality=City+of+Cape+Town&project_type=New&function=Administrative+and+Corporate+Support&budget_phase=Budget+year&quarterly_phase=Original+Budget&financial_year=2019%2F2020&ordering=-total_forecast_budget")
+ self.assertEqual(response.status_code, 200)
+ js = response.json()
+ self.assertEquals(js["count"], 2)
+ self.assertEquals(len(js["results"]["projects"]), 2)
def test_infrastructure_project_search(self):
- response = self.client.get(
? -
+ response = self.client.get(
- "/api/v1/infrastructure/search/?province=Eastern+Cape&municipality=Buffalo+City&q=&budget_phase=Budget+year&financial_year=2019%2F2020&ordering=-total_forecast_budget")
+ "/api/v1/infrastructure/search/?q=PC001002004002_00473&budget_phase=Budget+year&quarterly_phase=Original+Budget&financial_year=2019%2F2020&ordering=-total_forecast_budget")
- self.assertEqual(response.status_code, 200)
? -
+ self.assertEqual(response.status_code, 200)
- js = response.json()
? -
+ js = response.json()
+ self.assertEquals(js["count"], 1)
- self.assertEquals(len(js["results"]), 3)
? - ^
+ self.assertEquals(len(js["results"]["projects"]), 1)
? ++++++++++++ ^
+
+ response = self.client.get(
+ "/api/v1/infrastructure/search/?q=Acquisition&budget_phase=Budget+year&quarterly_phase=Original+Budget&financial_year=2019%2F2020&ordering=-total_forecast_budget")
+ self.assertEqual(response.status_code, 200)
+ js = response.json()
+ self.assertEquals(js["count"], 1)
+ self.assertEquals(len(js["results"]["projects"]), 1) | 46 | 1.533333 | 22 | 24 |
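The tests in the record above assert on a JSON shape like `{"count": N, "results": {"projects": [...]}}` filtered by query-string parameters. A hedged, in-memory sketch of that search behaviour (the data and field names below are hypothetical, chosen only to echo the assertions in the record):

```python
PROJECTS = [
    {"code": "PC001002004002_00473", "name": "Acquisition of land",
     "province": "Western Cape", "type": "New"},
    {"code": "PC002", "name": "Road upgrade",
     "province": "Western Cape", "type": "New"},
    {"code": "PC003", "name": "Water treatment works",
     "province": "Eastern Cape", "type": "Upgrade"},
]

def search(projects, q="", **filters):
    """Free-text q matches code or name (case-insensitive);
    keyword filters must match their field exactly."""
    results = []
    for p in projects:
        if q and q.lower() not in (p["code"] + " " + p["name"]).lower():
            continue
        if any(p.get(k) != v for k, v in filters.items()):
            continue
        results.append(p)
    return {"count": len(results), "results": {"projects": results}}
```

Mirroring the record's tests: an exact project-code query returns one hit, and a province-plus-type filter returns two.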
b40f8f9e76c3f1060e1f90661b0689069b518a86 | index.js | index.js | 'use strict';
const compiler = require('vueify').compiler;
const fs = require('fs');
class VueBrunch {
constructor(config) {
this.config = config && config.plugins && config.plugins.vue || {};
this.styles = {};
}
compile(file) {
if (this.config) {
compiler.applyConfig(this.config);
}
compiler.on('style', args => {
this.styles[args.file] = args.style;
});
return new Promise((resolve, reject) => {
compiler.compile(file.data, file.path, (error, result) => {
if (error) {
reject(error);
}
resolve(result);
});
});
}
onCompile() {
if (this.config.extractCSS) {
this.extractCSS();
}
}
extractCSS() {
var outPath = this.config.out || this.config.o || 'bundle.css';
var css = Object.keys(this.styles || [])
.map(file => this.styles[file])
.join('\n');
if (typeof outPath === 'object' && outPath.write) {
outPath.write(css);
outPath.end();
} else if (typeof outPath === 'string') {
fs.writeFileSync(outPath, css);
}
}
}
VueBrunch.prototype.brunchPlugin = true;
VueBrunch.prototype.type = 'template';
VueBrunch.prototype.extension = 'vue';
module.exports = VueBrunch;
| 'use strict';
const compiler = require('vueify').compiler;
const fs = require('fs');
class VueBrunch {
constructor(config) {
this.config = config && config.plugins && config.plugins.vue || {};
this.styles = {};
}
compile(file) {
if (this.config) {
compiler.applyConfig(this.config);
}
compiler.on('style', args => {
this.styles[args.file] = args.style;
});
return new Promise((resolve, reject) => {
compiler.compile(file.data, file.path, (error, result) => {
if (error) {
reject(error);
}
resolve(result);
});
});
}
onCompile() {
if (this.config.extractCSS) {
this.extractCSS();
}
}
extractCSS() {
var outPath = this.config.out || this.config.o || 'bundle.css';
var css = Object.keys(this.styles || [])
.map(file => this.styles[file])
.join('\n');
if (typeof outPath === 'object' && outPath.write) {
outPath.write(css);
outPath.end();
} else if (typeof outPath === 'string') {
fs.writeFileSync(outPath, css);
}
}
}
VueBrunch.prototype.brunchPlugin = true;
VueBrunch.prototype.type = 'javascript';
VueBrunch.prototype.extension = 'vue';
module.exports = VueBrunch;
| Set type to javascript to force JS compilation | Set type to javascript to force JS compilation
| JavaScript | mit | nblackburn/vue-brunch | javascript | ## Code Before:
'use strict';
const compiler = require('vueify').compiler;
const fs = require('fs');
class VueBrunch {
constructor(config) {
this.config = config && config.plugins && config.plugins.vue || {};
this.styles = {};
}
compile(file) {
if (this.config) {
compiler.applyConfig(this.config);
}
compiler.on('style', args => {
this.styles[args.file] = args.style;
});
return new Promise((resolve, reject) => {
compiler.compile(file.data, file.path, (error, result) => {
if (error) {
reject(error);
}
resolve(result);
});
});
}
onCompile() {
if (this.config.extractCSS) {
this.extractCSS();
}
}
extractCSS() {
var outPath = this.config.out || this.config.o || 'bundle.css';
var css = Object.keys(this.styles || [])
.map(file => this.styles[file])
.join('\n');
if (typeof outPath === 'object' && outPath.write) {
outPath.write(css);
outPath.end();
} else if (typeof outPath === 'string') {
fs.writeFileSync(outPath, css);
}
}
}
VueBrunch.prototype.brunchPlugin = true;
VueBrunch.prototype.type = 'template';
VueBrunch.prototype.extension = 'vue';
module.exports = VueBrunch;
## Instruction:
Set type to javascript to force JS compilation
## Code After:
'use strict';
const compiler = require('vueify').compiler;
const fs = require('fs');
class VueBrunch {
constructor(config) {
this.config = config && config.plugins && config.plugins.vue || {};
this.styles = {};
}
compile(file) {
if (this.config) {
compiler.applyConfig(this.config);
}
compiler.on('style', args => {
this.styles[args.file] = args.style;
});
return new Promise((resolve, reject) => {
compiler.compile(file.data, file.path, (error, result) => {
if (error) {
reject(error);
}
resolve(result);
});
});
}
onCompile() {
if (this.config.extractCSS) {
this.extractCSS();
}
}
extractCSS() {
var outPath = this.config.out || this.config.o || 'bundle.css';
var css = Object.keys(this.styles || [])
.map(file => this.styles[file])
.join('\n');
if (typeof outPath === 'object' && outPath.write) {
outPath.write(css);
outPath.end();
} else if (typeof outPath === 'string') {
fs.writeFileSync(outPath, css);
}
}
}
VueBrunch.prototype.brunchPlugin = true;
VueBrunch.prototype.type = 'javascript';
VueBrunch.prototype.extension = 'vue';
module.exports = VueBrunch;
| 'use strict';
const compiler = require('vueify').compiler;
const fs = require('fs');
class VueBrunch {
constructor(config) {
this.config = config && config.plugins && config.plugins.vue || {};
this.styles = {};
}
compile(file) {
if (this.config) {
compiler.applyConfig(this.config);
}
compiler.on('style', args => {
this.styles[args.file] = args.style;
});
return new Promise((resolve, reject) => {
compiler.compile(file.data, file.path, (error, result) => {
if (error) {
reject(error);
}
resolve(result);
});
});
}
onCompile() {
if (this.config.extractCSS) {
this.extractCSS();
}
}
extractCSS() {
var outPath = this.config.out || this.config.o || 'bundle.css';
var css = Object.keys(this.styles || [])
.map(file => this.styles[file])
.join('\n');
if (typeof outPath === 'object' && outPath.write) {
outPath.write(css);
outPath.end();
} else if (typeof outPath === 'string') {
fs.writeFileSync(outPath, css);
}
}
}
VueBrunch.prototype.brunchPlugin = true;
- VueBrunch.prototype.type = 'template';
? -------
+ VueBrunch.prototype.type = 'javascript';
? +++++++++
VueBrunch.prototype.extension = 'vue';
module.exports = VueBrunch; | 2 | 0.033333 | 1 | 1 |
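The diff cell above, with its `-`/`+` lines and `?` guide lines marking changed characters, looks like Python `difflib.ndiff` output. Treating that resemblance as an assumption about how these diffs were produced, a minimal reproduction:

```python
import difflib

old = ["VueBrunch.prototype.type = 'template';"]
new = ["VueBrunch.prototype.type = 'javascript';"]

# ndiff emits '- ' for removed lines, '+ ' for added lines,
# '  ' for unchanged context, and '? ' guide lines that point
# at the changed characters within a line.
diff = list(difflib.ndiff(old, new))
for line in diff:
    print(line.rstrip("\n"))
```

Only the two-character prefixes are stable across Python versions; the exact placement of the `?` guide characters is an implementation detail, so the checks below assert on prefixes rather than full guide-line content.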
be6e068031aa9f3b37e86ec2a3d7336bd8d780e8 | lib/linters/duplicate_property.js | lib/linters/duplicate_property.js | 'use strict';
var find = require('lodash.find');
var path = require('path');
module.exports = function (options) {
var filename = path.basename(options.path);
var config = options.config;
var node = options.node;
var properties = [];
var errors = [];
// Bail if the linter isn't wanted
if (config.duplicateProperty && !config.duplicateProperty.enabled) {
return null;
}
// Not applicable, bail
if (node.type !== 'block') {
return null;
}
node.forEach('declaration', function (declaration) {
var property = declaration.first('property').content[0];
if (property && property.type !== 'ident') {
return;
}
if (properties.indexOf(property.content) !== -1) {
errors.push({
message: 'Duplicate property: "' + property.content + '".',
column: property.start.column,
line: property.start.line
});
}
properties.push(property.content);
});
if (errors.length) {
return errors.map(function (error) {
return {
column: error.column,
file: filename,
line: error.line,
linter: 'duplicateProperty',
message: error.message
};
});
}
return true;
};
| 'use strict';
var path = require('path');
module.exports = function (options) {
var filename = path.basename(options.path);
var config = options.config;
var node = options.node;
var properties = [];
var errors = [];
// Bail if the linter isn't wanted
if (config.duplicateProperty && !config.duplicateProperty.enabled) {
return null;
}
// Not applicable, bail
if (node.type !== 'block') {
return null;
}
node.forEach('declaration', function (declaration) {
var property = declaration.first('property').content[0];
if (property && property.type !== 'ident') {
return;
}
if (properties.indexOf(property.content) !== -1) {
errors.push({
message: 'Duplicate property: "' + property.content + '".',
column: property.start.column,
line: property.start.line
});
}
properties.push(property.content);
});
if (errors.length) {
return errors.map(function (error) {
return {
column: error.column,
file: filename,
line: error.line,
linter: 'duplicateProperty',
message: error.message
};
});
}
return true;
};
| Remove unused lodash.find reference in duplicateProperty | Remove unused lodash.find reference in duplicateProperty
| JavaScript | mit | lesshint/lesshint,runarberg/lesshint,gilt/lesshint,JoshuaKGoldberg/lesshint | javascript | ## Code Before:
'use strict';
var find = require('lodash.find');
var path = require('path');
module.exports = function (options) {
var filename = path.basename(options.path);
var config = options.config;
var node = options.node;
var properties = [];
var errors = [];
// Bail if the linter isn't wanted
if (config.duplicateProperty && !config.duplicateProperty.enabled) {
return null;
}
// Not applicable, bail
if (node.type !== 'block') {
return null;
}
node.forEach('declaration', function (declaration) {
var property = declaration.first('property').content[0];
if (property && property.type !== 'ident') {
return;
}
if (properties.indexOf(property.content) !== -1) {
errors.push({
message: 'Duplicate property: "' + property.content + '".',
column: property.start.column,
line: property.start.line
});
}
properties.push(property.content);
});
if (errors.length) {
return errors.map(function (error) {
return {
column: error.column,
file: filename,
line: error.line,
linter: 'duplicateProperty',
message: error.message
};
});
}
return true;
};
## Instruction:
Remove unused lodash.find reference in duplicateProperty
## Code After:
'use strict';
var path = require('path');
module.exports = function (options) {
var filename = path.basename(options.path);
var config = options.config;
var node = options.node;
var properties = [];
var errors = [];
// Bail if the linter isn't wanted
if (config.duplicateProperty && !config.duplicateProperty.enabled) {
return null;
}
// Not applicable, bail
if (node.type !== 'block') {
return null;
}
node.forEach('declaration', function (declaration) {
var property = declaration.first('property').content[0];
if (property && property.type !== 'ident') {
return;
}
if (properties.indexOf(property.content) !== -1) {
errors.push({
message: 'Duplicate property: "' + property.content + '".',
column: property.start.column,
line: property.start.line
});
}
properties.push(property.content);
});
if (errors.length) {
return errors.map(function (error) {
return {
column: error.column,
file: filename,
line: error.line,
linter: 'duplicateProperty',
message: error.message
};
});
}
return true;
};
| 'use strict';
- var find = require('lodash.find');
var path = require('path');
module.exports = function (options) {
var filename = path.basename(options.path);
var config = options.config;
var node = options.node;
var properties = [];
var errors = [];
// Bail if the linter isn't wanted
if (config.duplicateProperty && !config.duplicateProperty.enabled) {
return null;
}
// Not applicable, bail
if (node.type !== 'block') {
return null;
}
node.forEach('declaration', function (declaration) {
var property = declaration.first('property').content[0];
if (property && property.type !== 'ident') {
return;
}
if (properties.indexOf(property.content) !== -1) {
errors.push({
message: 'Duplicate property: "' + property.content + '".',
column: property.start.column,
line: property.start.line
});
}
properties.push(property.content);
});
if (errors.length) {
return errors.map(function (error) {
return {
column: error.column,
file: filename,
line: error.line,
linter: 'duplicateProperty',
message: error.message
};
});
}
return true;
}; | 1 | 0.018519 | 0 | 1 |
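The linter in the record above detects repeated properties with `properties.indexOf(...)`, a linear scan per declaration. The same duplicate-detection idea, sketched in Python with a set for O(1) membership checks (an illustrative variant, not the record's code):

```python
def find_duplicates(names):
    """Return (index, name) pairs for every occurrence of a name
    that was already seen earlier in the sequence."""
    seen = set()
    dups = []
    for i, name in enumerate(names):
        if name in seen:
            # Report each repeat, like the linter reports each
            # duplicate declaration after the first.
            dups.append((i, name))
        seen.add(name)
    return dups
```

Every repeat after the first occurrence is reported, matching the linter's behaviour of flagging each duplicate declaration.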
548bdcc408ddd7bfc9b98aa69ac8e06548bf2c62 | test/gtest/CMakeLists.txt | test/gtest/CMakeLists.txt | project(gtest)
find_package(Threads)
add_library(gtest src/gtest-all.cc)
target_link_libraries(gtest ${CMAKE_THREAD_LIBS_INIT})
target_include_directories(gtest SYSTEM PUBLIC ${PROJECT_SOURCE_DIR}/include)
add_library(gtest_main src/gtest_main.cc)
target_link_libraries(gtest_main gtest ${CMAKE_THREAD_LIBS_INIT})
| project(gtest)
find_package(Threads)
add_library(gtest src/gtest-all.cc)
target_link_libraries(gtest ${CMAKE_THREAD_LIBS_INIT})
target_include_directories(gtest SYSTEM PUBLIC ${PROJECT_SOURCE_DIR}/include)
if(BUILD_SHARED_LIBS)
target_compile_definitions(gtest
PRIVATE GTEST_CREATE_SHARED_LIBRARY
INTERFACE GTEST_LINKED_AS_SHARED_LIBRARY)
endif()
add_library(gtest_main src/gtest_main.cc)
target_link_libraries(gtest_main gtest ${CMAKE_THREAD_LIBS_INIT})
| Define symbols for gtest exports | Define symbols for gtest exports
Modify build of gtest to add the symbol definitions necessary to
correctly define its export decoration symbol on Windows. This should
newly enable gtest to be built as a shared library on Windows.
| Text | lgpl-2.1 | bluesquall/lcm,bluesquall/lcm,bluesquall/lcm,lcm-proj/lcm,bluesquall/lcm,lcm-proj/lcm,lcm-proj/lcm,adeschamps/lcm,adeschamps/lcm,lcm-proj/lcm,adeschamps/lcm,bluesquall/lcm,adeschamps/lcm,bluesquall/lcm,adeschamps/lcm,lcm-proj/lcm,bluesquall/lcm,lcm-proj/lcm,adeschamps/lcm,lcm-proj/lcm,adeschamps/lcm,adeschamps/lcm,lcm-proj/lcm | text | ## Code Before:
project(gtest)
find_package(Threads)
add_library(gtest src/gtest-all.cc)
target_link_libraries(gtest ${CMAKE_THREAD_LIBS_INIT})
target_include_directories(gtest SYSTEM PUBLIC ${PROJECT_SOURCE_DIR}/include)
add_library(gtest_main src/gtest_main.cc)
target_link_libraries(gtest_main gtest ${CMAKE_THREAD_LIBS_INIT})
## Instruction:
Define symbols for gtest exports
Modify build of gtest to add the symbol definitions necessary to
correctly define its export decoration symbol on Windows. This should
newly enable gtest to be built as a shared library on Windows.
## Code After:
project(gtest)
find_package(Threads)
add_library(gtest src/gtest-all.cc)
target_link_libraries(gtest ${CMAKE_THREAD_LIBS_INIT})
target_include_directories(gtest SYSTEM PUBLIC ${PROJECT_SOURCE_DIR}/include)
if(BUILD_SHARED_LIBS)
target_compile_definitions(gtest
PRIVATE GTEST_CREATE_SHARED_LIBRARY
INTERFACE GTEST_LINKED_AS_SHARED_LIBRARY)
endif()
add_library(gtest_main src/gtest_main.cc)
target_link_libraries(gtest_main gtest ${CMAKE_THREAD_LIBS_INIT})
| project(gtest)
find_package(Threads)
add_library(gtest src/gtest-all.cc)
target_link_libraries(gtest ${CMAKE_THREAD_LIBS_INIT})
target_include_directories(gtest SYSTEM PUBLIC ${PROJECT_SOURCE_DIR}/include)
+ if(BUILD_SHARED_LIBS)
+ target_compile_definitions(gtest
+ PRIVATE GTEST_CREATE_SHARED_LIBRARY
+ INTERFACE GTEST_LINKED_AS_SHARED_LIBRARY)
+ endif()
add_library(gtest_main src/gtest_main.cc)
target_link_libraries(gtest_main gtest ${CMAKE_THREAD_LIBS_INIT}) | 5 | 0.5 | 5 | 0 |
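Each record ends with numeric columns (diff_length, n_lines_added, n_lines_deleted). Assuming the straightforward line-counting definition, they can be recomputed from the diff itself; the excerpt below is transcribed from the gtest record above, and the counts come out as 5 added, 0 deleted, diff length 5, matching its trailing columns (the denominator behind relative_diff_length is not shown in the dump, so it is left out here):

```python
def diff_stats(diff_lines):
    """Count added/removed lines in an ndiff-style diff.
    Context lines ('  ') and '? ' guide lines are ignored."""
    added = sum(1 for line in diff_lines if line.startswith("+ "))
    deleted = sum(1 for line in diff_lines if line.startswith("- "))
    return {"n_lines_added": added,
            "n_lines_deleted": deleted,
            "diff_length": added + deleted}

gtest_diff = [
    "  target_include_directories(gtest SYSTEM PUBLIC ${PROJECT_SOURCE_DIR}/include)",
    "+ if(BUILD_SHARED_LIBS)",
    "+   target_compile_definitions(gtest",
    "+     PRIVATE GTEST_CREATE_SHARED_LIBRARY",
    "+     INTERFACE GTEST_LINKED_AS_SHARED_LIBRARY)",
    "+ endif()",
    "  add_library(gtest_main src/gtest_main.cc)",
]
stats = diff_stats(gtest_diff)
```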
01247c7c28acfb7bcccf8fc371e2b0f54373b1da | db/v48/rogers16.bib | db/v48/rogers16.bib | @InProceedings{rogers16,
supplementary = {Supplementary:rogers16-supp.pdf},
title = {Differentially Private Chi-Squared Hypothesis Testing: Goodness of Fit and Independence Testing},
author = {Ryan Rogers and Salil Vadhan and Hyun Lim and Marco Gaboardi},
pages = {2111-2120},
abstract = {Hypothesis testing is a useful statistical tool in determining whether a given model should be rejected based on a sample from the population. Sample data may contain sensitive information about individuals, such as medical information. Thus it is important to design statistical tests that guarantee the privacy of subjects in the data. In this work, we study hypothesis testing subject to differential privacy, specifically chi-squared tests for goodness of fit for multinomial data and independence between two categorical variables.},
}
| @InProceedings{rogers16,
supplementary = {Supplementary:rogers16-supp.pdf},
title = {Differentially Private Chi-Squared Hypothesis Testing: Goodness of Fit and Independence Testing},
author = {Marco Gaboardi and Hyun Lim and Ryan Rogers and Salil Vadhan},
pages = {2111-2120},
abstract = {Hypothesis testing is a useful statistical tool in determining whether a given model should be rejected based on a sample from the population. Sample data may contain sensitive information about individuals, such as medical information. Thus it is important to design statistical tests that guarantee the privacy of subjects in the data. In this work, we study hypothesis testing subject to differential privacy, specifically chi-squared tests for goodness of fit for multinomial data and independence between two categorical variables.},
}
| Change order of author names | Change order of author names
| TeX | mit | mreid/papersite,mreid/papersite,mreid/papersite | tex | ## Code Before:
@InProceedings{rogers16,
supplementary = {Supplementary:rogers16-supp.pdf},
title = {Differentially Private Chi-Squared Hypothesis Testing: Goodness of Fit and Independence Testing},
author = {Ryan Rogers and Salil Vadhan and Hyun Lim and Marco Gaboardi},
pages = {2111-2120},
abstract = {Hypothesis testing is a useful statistical tool in determining whether a given model should be rejected based on a sample from the population. Sample data may contain sensitive information about individuals, such as medical information. Thus it is important to design statistical tests that guarantee the privacy of subjects in the data. In this work, we study hypothesis testing subject to differential privacy, specifically chi-squared tests for goodness of fit for multinomial data and independence between two categorical variables.},
}
## Instruction:
Change order of author names
## Code After:
@InProceedings{rogers16,
supplementary = {Supplementary:rogers16-supp.pdf},
title = {Differentially Private Chi-Squared Hypothesis Testing: Goodness of Fit and Independence Testing},
author = {Marco Gaboardi and Hyun Lim and Ryan Rogers and Salil Vadhan},
pages = {2111-2120},
abstract = {Hypothesis testing is a useful statistical tool in determining whether a given model should be rejected based on a sample from the population. Sample data may contain sensitive information about individuals, such as medical information. Thus it is important to design statistical tests that guarantee the privacy of subjects in the data. In this work, we study hypothesis testing subject to differential privacy, specifically chi-squared tests for goodness of fit for multinomial data and independence between two categorical variables.},
}
| @InProceedings{rogers16,
supplementary = {Supplementary:rogers16-supp.pdf},
title = {Differentially Private Chi-Squared Hypothesis Testing: Goodness of Fit and Independence Testing},
- author = {Ryan Rogers and Salil Vadhan and Hyun Lim and Marco Gaboardi},
+ author = {Marco Gaboardi and Hyun Lim and Ryan Rogers and Salil Vadhan},
pages = {2111-2120},
abstract = {Hypothesis testing is a useful statistical tool in determining whether a given model should be rejected based on a sample from the population. Sample data may contain sensitive information about individuals, such as medical information. Thus it is important to design statistical tests that guarantee the privacy of subjects in the data. In this work, we study hypothesis testing subject to differential privacy, specifically chi-squared tests for goodness of fit for multinomial data and independence between two categorical variables.},
} | 2 | 0.285714 | 1 | 1 |
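The BibTeX commit above reorders the `author` field, and the new order happens to be alphabetical by surname. A sketch of that reordering, assuming simple "First Last" names joined by " and " (no braced groups or von/Jr parts, which real BibTeX name parsing would need to handle):

```python
def reorder_authors(author_field, key=lambda name: name.split()[-1]):
    """Sort a BibTeX 'and'-separated author string;
    the default key sorts by the last whitespace-separated token (surname)."""
    names = [n.strip() for n in author_field.split(" and ")]
    return " and ".join(sorted(names, key=key))

before = "Ryan Rogers and Salil Vadhan and Hyun Lim and Marco Gaboardi"
after = reorder_authors(before)
```

Applied to the record's original field, this yields exactly the committed order.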
2450955e2beb14e4c6ba0394e4bcd64e2ce2e4ec | wordcloud/views.py | wordcloud/views.py | import json
import os
from django.conf import settings
from django.http import HttpResponse
from django.views.decorators.cache import cache_page
from .wordcloud import popular_words
@cache_page(60*60*4)
def wordcloud(request, max_entries=30):
""" Return tag cloud JSON results"""
max_entries = int(max_entries)
leaf_name = 'wordcloud-{0}.json'.format(max_entries)
cache_path = os.path.join(
settings.MEDIA_ROOT, 'wordcloud_cache', leaf_name
)
if os.path.exists(cache_path):
response = HttpResponse()
response['Content-Type'] = 'application/json'
response['X-Sendfile'] = cache_path.encode('utf-8')
return response
content = json.dumps(popular_words(max_entries=max_entries))
return HttpResponse(
content,
content_type='application/json',
)
| import json
import os
from django.conf import settings
from django.http import HttpResponse
from django.views.decorators.cache import cache_page
from .wordcloud import popular_words
@cache_page(60*60*4)
def wordcloud(request, max_entries=30):
""" Return tag cloud JSON results"""
max_entries = int(max_entries)
leaf_name = 'wordcloud-{0}.json'.format(max_entries)
cache_path = os.path.join(
settings.MEDIA_ROOT, 'wordcloud_cache', leaf_name
)
if os.path.exists(cache_path):
response = HttpResponse(json.dumps({
'error':
("If you can see this, then X-SendFile isn't configured "
"correctly in your webserver. (If you're using Nginx, you'll "
"have to change the code to add a X-Accel-Redirect header - "
"this hasn't currently been tested.)")
}))
response['Content-Type'] = 'application/json'
response['X-Sendfile'] = cache_path.encode('utf-8')
return response
content = json.dumps(popular_words(max_entries=max_entries))
return HttpResponse(
content,
content_type='application/json',
)
| Add diagnostic output for when X-SendFile is misconfigured | Add diagnostic output for when X-SendFile is misconfigured
| Python | agpl-3.0 | mysociety/pombola,geoffkilpin/pombola,mysociety/pombola,geoffkilpin/pombola,geoffkilpin/pombola,geoffkilpin/pombola,mysociety/pombola,geoffkilpin/pombola,mysociety/pombola,mysociety/pombola,geoffkilpin/pombola,mysociety/pombola | python | ## Code Before:
import json
import os
from django.conf import settings
from django.http import HttpResponse
from django.views.decorators.cache import cache_page
from .wordcloud import popular_words
@cache_page(60*60*4)
def wordcloud(request, max_entries=30):
""" Return tag cloud JSON results"""
max_entries = int(max_entries)
leaf_name = 'wordcloud-{0}.json'.format(max_entries)
cache_path = os.path.join(
settings.MEDIA_ROOT, 'wordcloud_cache', leaf_name
)
if os.path.exists(cache_path):
response = HttpResponse()
response['Content-Type'] = 'application/json'
response['X-Sendfile'] = cache_path.encode('utf-8')
return response
content = json.dumps(popular_words(max_entries=max_entries))
return HttpResponse(
content,
content_type='application/json',
)
## Instruction:
Add diagnostic output for when X-SendFile is misconfigured
## Code After:
import json
import os
from django.conf import settings
from django.http import HttpResponse
from django.views.decorators.cache import cache_page
from .wordcloud import popular_words
@cache_page(60*60*4)
def wordcloud(request, max_entries=30):
""" Return tag cloud JSON results"""
max_entries = int(max_entries)
leaf_name = 'wordcloud-{0}.json'.format(max_entries)
cache_path = os.path.join(
settings.MEDIA_ROOT, 'wordcloud_cache', leaf_name
)
if os.path.exists(cache_path):
response = HttpResponse(json.dumps({
'error':
("If you can see this, then X-SendFile isn't configured "
"correctly in your webserver. (If you're using Nginx, you'll "
"have to change the code to add a X-Accel-Redirect header - "
"this hasn't currently been tested.)")
}))
response['Content-Type'] = 'application/json'
response['X-Sendfile'] = cache_path.encode('utf-8')
return response
content = json.dumps(popular_words(max_entries=max_entries))
return HttpResponse(
content,
content_type='application/json',
)
| import json
import os
from django.conf import settings
from django.http import HttpResponse
from django.views.decorators.cache import cache_page
from .wordcloud import popular_words
@cache_page(60*60*4)
def wordcloud(request, max_entries=30):
""" Return tag cloud JSON results"""
max_entries = int(max_entries)
leaf_name = 'wordcloud-{0}.json'.format(max_entries)
cache_path = os.path.join(
settings.MEDIA_ROOT, 'wordcloud_cache', leaf_name
)
if os.path.exists(cache_path):
- response = HttpResponse()
? ^
+ response = HttpResponse(json.dumps({
? ^^^^^^^^^^^^
+ 'error':
+ ("If you can see this, then X-SendFile isn't configured "
+ "correctly in your webserver. (If you're using Nginx, you'll "
+ "have to change the code to add a X-Accel-Redirect header - "
+ "this hasn't currently been tested.)")
+ }))
response['Content-Type'] = 'application/json'
response['X-Sendfile'] = cache_path.encode('utf-8')
return response
content = json.dumps(popular_words(max_entries=max_entries))
return HttpResponse(
content,
content_type='application/json',
) | 8 | 0.258065 | 7 | 1 |
ec69f926a9599ce1389b5db8a0ef1f7653f57e4b | spec/client_spec.rb | spec/client_spec.rb |
require 'spec_helper'
require 'ephemeral/client'
describe 'Ephemeral::Client' do
it 'expects 3 arguments' do
test_client = Ephemeral::Client.new
expect{
test_client.build
}.to raise_error ArgumentError
end
end |
require 'spec_helper'
require 'ephemeral/client'
describe 'Ephemeral::Client' do
it 'expects 3 arguments' do
test_client = Ephemeral::Client.new
expect{
test_client.build
}.to raise_error ArgumentError
end
it 'returns an id' do
test_client = Ephemeral::Client.new
response = test_client.build("test", "test", "test")
expect(response).to have_type String
end
end | Add test for returned id | Add test for returned id
| Ruby | mit | factor-io/ephemeral-client | ruby | ## Code Before:
require 'spec_helper'
require 'ephemeral/client'
describe 'Ephemeral::Client' do
it 'expects 3 arguments' do
test_client = Ephemeral::Client.new
expect{
test_client.build
}.to raise_error ArgumentError
end
end
## Instruction:
Add test for returned id
## Code After:
require 'spec_helper'
require 'ephemeral/client'
describe 'Ephemeral::Client' do
it 'expects 3 arguments' do
test_client = Ephemeral::Client.new
expect{
test_client.build
}.to raise_error ArgumentError
end
it 'returns an id' do
test_client = Ephemeral::Client.new
response = test_client.build("test", "test", "test")
expect(response).to have_type String
end
end |
require 'spec_helper'
require 'ephemeral/client'
describe 'Ephemeral::Client' do
it 'expects 3 arguments' do
test_client = Ephemeral::Client.new
expect{
test_client.build
}.to raise_error ArgumentError
end
+
+ it 'returns an id' do
+ test_client = Ephemeral::Client.new
+ response = test_client.build("test", "test", "test")
+ expect(response).to have_type String
+ end
end | 6 | 0.461538 | 6 | 0 |
44761810be4a93f5087d60f7a464931fbaf74eb0 | theme/thumbnail.scss | theme/thumbnail.scss | .thumbnail {
width: 50px;
border-radius: 70px;
&.default, &.self, &.image, &.spoiler {
background-position-x: -10px; // spritesheets, eh?
}
img {
height: 50px;
width: auto;
}
}
| .thumbnail {
width: 50px;
border-radius: 70px;
background-position-x: -10px !important;
img {
height: 50px;
width: auto;
}
}
| Update background center for all types. | Update background center for all types. | SCSS | mit | avinashbot/focus-theme,avinashbot/focus-theme | scss | ## Code Before:
.thumbnail {
width: 50px;
border-radius: 70px;
&.default, &.self, &.image, &.spoiler {
background-position-x: -10px; // spritesheets, eh?
}
img {
height: 50px;
width: auto;
}
}
## Instruction:
Update background center for all types.
## Code After:
.thumbnail {
width: 50px;
border-radius: 70px;
background-position-x: -10px !important;
img {
height: 50px;
width: auto;
}
}
| .thumbnail {
width: 50px;
border-radius: 70px;
+ background-position-x: -10px !important;
-
- &.default, &.self, &.image, &.spoiler {
- background-position-x: -10px; // spritesheets, eh?
- }
img {
height: 50px;
width: auto;
}
} | 5 | 0.384615 | 1 | 4 |