| commit (string, 40-40 chars) | old_file (string, 4-184 chars) | new_file (string, 4-184 chars) | old_contents (string, 1-3.6k chars) | new_contents (string, 5-3.38k chars) | subject (string, 15-778 chars) | message (string, 16-6.74k chars) | lang (201 classes) | license (13 classes) | repos (string, 6-116k chars) | config (201 classes) | content (string, 137-7.24k chars) | diff (string, 26-5.55k chars) | diff_length (int64, 1-123) | relative_diff_length (float64, 0.01-89) | n_lines_added (int64, 0-108) | n_lines_deleted (int64, 0-106) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
73c1d735a2582c2c46ad19fcf9d543f6497b0108 | config/initializers/mime_types.rb | config/initializers/mime_types.rb | Mime::Type.register_alias "text/html", :full
Mime::Type.register_alias "text/html", :mobile
Mime::Type.register_alias "text/javascript", :jsmobile
| Mime::Type.register "font/truetype", :ttf
Mime::Type.register "font/opentype", :otf
Mime::Type.register "application/vnd.ms-fontobject", :eot
Mime::Type.register "application/x-font-woff", :woff
Mime::Type.register_alias "text/html", :full
Mime::Type.register_alias "text/html", :mobile
Mime::Type.register_alias "text/javascript", :jsmobile
| Add mime types for fonts | Add mime types for fonts
| Ruby | agpl-3.0 | suec78/vish_storyrobin,ging/vish,suec78/vish_storyrobin,rogervaas/vish,ging/vish_orange,suec78/vish_storyrobin,ging/vish,agordillo/vish,agordillo/vish,ging/vish_orange,rogervaas/vish,ging/vish_orange,agordillo/vish,suec78/vish_storyrobin,ging/vish_orange,rogervaas/vish,ging/vish,nathanV38/vishTst,ging/vish,agordillo/vish,ging/vish_orange,nathanV38/vishTst,rogervaas/vish,ging/vish | ruby | ## Code Before:
Mime::Type.register_alias "text/html", :full
Mime::Type.register_alias "text/html", :mobile
Mime::Type.register_alias "text/javascript", :jsmobile
## Instruction:
Add mime types for fonts
## Code After:
Mime::Type.register "font/truetype", :ttf
Mime::Type.register "font/opentype", :otf
Mime::Type.register "application/vnd.ms-fontobject", :eot
Mime::Type.register "application/x-font-woff", :woff
Mime::Type.register_alias "text/html", :full
Mime::Type.register_alias "text/html", :mobile
Mime::Type.register_alias "text/javascript", :jsmobile
| + Mime::Type.register "font/truetype", :ttf
+ Mime::Type.register "font/opentype", :otf
+ Mime::Type.register "application/vnd.ms-fontobject", :eot
+ Mime::Type.register "application/x-font-woff", :woff
+
Mime::Type.register_alias "text/html", :full
Mime::Type.register_alias "text/html", :mobile
Mime::Type.register_alias "text/javascript", :jsmobile | 5 | 1.666667 | 5 | 0 |
95278a78ff02ade1c5cc2a3003dd20dd51a10289 | translations/ru.json | translations/ru.json | {
"LOADING": "Загрузка …",
"TODAY": "Сегодня",
"TOMORROW": "Завтра",
"DAYAFTERTOMORROW": "Послезавтра",
"RUNNING": "Заканчивается через",
"EMPTY": "Нет предстоящих событий",
"N": "С",
"NNE": "ССВ",
"NE": "СВ",
"ENE": "ВСВ",
"E": "В",
"ESE": "ВЮВ",
"SE": "ЮВ",
"SSE": "ЮЮВ",
"S": "Ю",
"SSW": "ЮЮЗ",
"SW": "ЮЗ",
"WSW": "ЗЮЗ",
"W": "З",
"WNW": "ЗСЗ",
"NW": "СЗ",
"NNW": "ССЗ",
"UPDATE_NOTIFICATION": "Есть обновление для MagicMirror².",
"UPDATE_NOTIFICATION_MODULE": "Есть обновление для MODULE_NAME модуля.",
"UPDATE_INFO": "Данная инсталляция позади BRANCH_NAME ветки на COMMIT_COUNT коммитов."
}
| {
"LOADING": "Загрузка …",
"TODAY": "Сегодня",
"TOMORROW": "Завтра",
"DAYAFTERTOMORROW": "Послезавтра",
"RUNNING": "Заканчивается через",
"EMPTY": "Нет предстоящих событий",
"WEEK": "Неделя",
"N": "С",
"NNE": "ССВ",
"NE": "СВ",
"ENE": "ВСВ",
"E": "В",
"ESE": "ВЮВ",
"SE": "ЮВ",
"SSE": "ЮЮВ",
"S": "Ю",
"SSW": "ЮЮЗ",
"SW": "ЮЗ",
"WSW": "ЗЮЗ",
"W": "З",
"WNW": "ЗСЗ",
"NW": "СЗ",
"NNW": "ССЗ",
"UPDATE_NOTIFICATION": "Есть обновление для MagicMirror².",
"UPDATE_NOTIFICATION_MODULE": "Есть обновление для MODULE_NAME модуля.",
"UPDATE_INFO": "Данная инсталляция позади BRANCH_NAME ветки на COMMIT_COUNT коммитов."
}
| Add Russian translation for week | Add Russian translation for week
| JSON | mit | Leniox/MagicMirror,vyazadji/MagicMirror,vyazadji/MagicMirror,MichMich/MagicMirror,thobach/MagicMirror,wszgxa/magic-mirror,roramirez/MagicMirror,Leniox/MagicMirror,n8many/MagicMirror,morozgrafix/MagicMirror,ShivamShrivastava/Smart-Mirror,morozgrafix/MagicMirror,heyheyhexi/MagicMirror,heyheyhexi/MagicMirror,mbalfour/MagicMirror,MichMich/MagicMirror,Tyvonne/MagicMirror,berlincount/MagicMirror,berlincount/MagicMirror,heyheyhexi/MagicMirror,thobach/MagicMirror,Devan0369/AI-Project,marc-86/MagicMirror,berlincount/MagicMirror,Tyvonne/MagicMirror,wszgxa/magic-mirror,marc-86/MagicMirror,morozgrafix/MagicMirror,wszgxa/magic-mirror,morozgrafix/MagicMirror,ShivamShrivastava/Smart-Mirror,Devan0369/AI-Project,mbalfour/MagicMirror,Tyvonne/MagicMirror,thobach/MagicMirror,marc-86/MagicMirror,mbalfour/MagicMirror,Leniox/MagicMirror,MichMich/MagicMirror,heyheyhexi/MagicMirror,Tyvonne/MagicMirror,n8many/MagicMirror,vyazadji/MagicMirror,berlincount/MagicMirror,marc-86/MagicMirror,roramirez/MagicMirror,MichMich/MagicMirror,n8many/MagicMirror,n8many/MagicMirror,thobach/MagicMirror,ShivamShrivastava/Smart-Mirror,roramirez/MagicMirror,Devan0369/AI-Project,n8many/MagicMirror,roramirez/MagicMirror | json | ## Code Before:
{
"LOADING": "Загрузка …",
"TODAY": "Сегодня",
"TOMORROW": "Завтра",
"DAYAFTERTOMORROW": "Послезавтра",
"RUNNING": "Заканчивается через",
"EMPTY": "Нет предстоящих событий",
"N": "С",
"NNE": "ССВ",
"NE": "СВ",
"ENE": "ВСВ",
"E": "В",
"ESE": "ВЮВ",
"SE": "ЮВ",
"SSE": "ЮЮВ",
"S": "Ю",
"SSW": "ЮЮЗ",
"SW": "ЮЗ",
"WSW": "ЗЮЗ",
"W": "З",
"WNW": "ЗСЗ",
"NW": "СЗ",
"NNW": "ССЗ",
"UPDATE_NOTIFICATION": "Есть обновление для MagicMirror².",
"UPDATE_NOTIFICATION_MODULE": "Есть обновление для MODULE_NAME модуля.",
"UPDATE_INFO": "Данная инсталляция позади BRANCH_NAME ветки на COMMIT_COUNT коммитов."
}
## Instruction:
Add Russian translation for week
## Code After:
{
"LOADING": "Загрузка …",
"TODAY": "Сегодня",
"TOMORROW": "Завтра",
"DAYAFTERTOMORROW": "Послезавтра",
"RUNNING": "Заканчивается через",
"EMPTY": "Нет предстоящих событий",
"WEEK": "Неделя",
"N": "С",
"NNE": "ССВ",
"NE": "СВ",
"ENE": "ВСВ",
"E": "В",
"ESE": "ВЮВ",
"SE": "ЮВ",
"SSE": "ЮЮВ",
"S": "Ю",
"SSW": "ЮЮЗ",
"SW": "ЮЗ",
"WSW": "ЗЮЗ",
"W": "З",
"WNW": "ЗСЗ",
"NW": "СЗ",
"NNW": "ССЗ",
"UPDATE_NOTIFICATION": "Есть обновление для MagicMirror².",
"UPDATE_NOTIFICATION_MODULE": "Есть обновление для MODULE_NAME модуля.",
"UPDATE_INFO": "Данная инсталляция позади BRANCH_NAME ветки на COMMIT_COUNT коммитов."
}
| {
"LOADING": "Загрузка …",
"TODAY": "Сегодня",
"TOMORROW": "Завтра",
"DAYAFTERTOMORROW": "Послезавтра",
"RUNNING": "Заканчивается через",
"EMPTY": "Нет предстоящих событий",
+
+ "WEEK": "Неделя",
"N": "С",
"NNE": "ССВ",
"NE": "СВ",
"ENE": "ВСВ",
"E": "В",
"ESE": "ВЮВ",
"SE": "ЮВ",
"SSE": "ЮЮВ",
"S": "Ю",
"SSW": "ЮЮЗ",
"SW": "ЮЗ",
"WSW": "ЗЮЗ",
"W": "З",
"WNW": "ЗСЗ",
"NW": "СЗ",
"NNW": "ССЗ",
"UPDATE_NOTIFICATION": "Есть обновление для MagicMirror².",
"UPDATE_NOTIFICATION_MODULE": "Есть обновление для MODULE_NAME модуля.",
"UPDATE_INFO": "Данная инсталляция позади BRANCH_NAME ветки на COMMIT_COUNT коммитов."
} | 2 | 0.066667 | 2 | 0 |
46a473407c1c5a663cf37c79c851f4347f0f8506 | .travis.yml | .travis.yml | language: node_js
cache:
directories:
- $HOME/.npm
- $HOME/.yarn-cache
- node_modules
- js-given/node_modules
- documentation/node_modules
- examples/jest-es2015/node_modules
node_js:
- "4.8.3"
- "6.10.3"
- "7.10.0"
before_install:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
- sleep 3 # give xvfb some time to start
install:
- yarn
| language: node_js
cache:
directories:
- $HOME/.npm
- $HOME/.yarn-cache
- node_modules
- js-given/node_modules
- documentation/node_modules
- examples/jest-es2015/node_modules
node_js:
- "4.8.3"
- "6.10.3"
- "7.10.0"
before_install:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
install:
- yarn
| Remove sleep in xvfb start | Remove sleep in xvfb start
| YAML | mit | jsGiven/jsGiven | yaml | ## Code Before:
language: node_js
cache:
directories:
- $HOME/.npm
- $HOME/.yarn-cache
- node_modules
- js-given/node_modules
- documentation/node_modules
- examples/jest-es2015/node_modules
node_js:
- "4.8.3"
- "6.10.3"
- "7.10.0"
before_install:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
- sleep 3 # give xvfb some time to start
install:
- yarn
## Instruction:
Remove sleep in xvfb start
## Code After:
language: node_js
cache:
directories:
- $HOME/.npm
- $HOME/.yarn-cache
- node_modules
- js-given/node_modules
- documentation/node_modules
- examples/jest-es2015/node_modules
node_js:
- "4.8.3"
- "6.10.3"
- "7.10.0"
before_install:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
install:
- yarn
| language: node_js
cache:
directories:
- $HOME/.npm
- $HOME/.yarn-cache
- node_modules
- js-given/node_modules
- documentation/node_modules
- examples/jest-es2015/node_modules
node_js:
- "4.8.3"
- "6.10.3"
- "7.10.0"
before_install:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
- - sleep 3 # give xvfb some time to start
install:
- yarn | 1 | 0.052632 | 0 | 1 |
2f7adfb482a3929d0b753ed81250159cdee63eb7 | Manifold/Module+Vector.swift | Manifold/Module+Vector.swift | // Copyright © 2015 Rob Rix. All rights reserved.
extension Module {
public static var vector: Module {
let Natural: Recur = "Natural"
let Vector: Recur = "Vector"
let vector = Declaration("Vector",
type: .Type --> Natural --> .Type,
value: (.Type, Natural, .Type) => { A, n, B in n[.Type, B --> B, Natural => { n in (A --> Vector[A, n] --> B) --> B --> B }] })
let successor: Recur = "successor"
let cons = Declaration("cons",
type: (.Type, Natural) => { A, n in A --> Vector[A, n] --> Vector[A, successor[n]] },
value: (.Type, Natural) => { (A: Recur, n) in (A, Vector[A, n], .Type) => { head, tail, B in (A --> Vector[A, n] --> B, B) => { ifCons, _ in ifCons[head, tail] } } })
let zero: Recur = "zero"
let `nil` = Declaration("nil",
type: .Type => { (A: Recur) in Vector[A, zero] },
value: (.Type, Natural, .Type) => { A, n, B in ((A --> Vector[A, n] --> B), B) => { _, other in other } })
return Module("Vector", [ natural ], [ vector, cons, `nil` ])
}
}
| // Copyright © 2015 Rob Rix. All rights reserved.
extension Module {
public static var vector: Module {
let Natural: Recur = "Natural"
let Vector: Recur = "Vector"
let vector = Declaration("Vector",
type: .Type --> Natural --> .Type,
value: (.Type, Natural, .Type) => { A, n, B in n[.Type, B --> B, Natural => { n in (A --> Vector[A, n] --> B) --> B --> B }] })
let successor: Recur = "successor"
let cons = Declaration("cons",
type: (.Type, Natural) => { A, n in A --> Vector[A, n] --> Vector[A, successor[n]] },
value: (.Type, Natural) => { (A: Recur, n) in (A, Vector[A, n], .Type) => { head, tail, B in (A --> Vector[A, n] --> B, B) => { ifCons, _ in ifCons[head, tail] } } })
let zero: Recur = "zero"
let `nil` = Declaration("nil",
type: .Type => { (A: Recur) in Vector[A, zero] },
value: (.Type, .Type) => { A, B in B => id })
return Module("Vector", [ natural ], [ vector, cons, `nil` ])
}
}
import Prelude
| Correct the definition of `Vector.nil`. | Correct the definition of `Vector.nil`.
| Swift | mit | antitypical/Manifold,antitypical/Manifold | swift | ## Code Before:
// Copyright © 2015 Rob Rix. All rights reserved.
extension Module {
public static var vector: Module {
let Natural: Recur = "Natural"
let Vector: Recur = "Vector"
let vector = Declaration("Vector",
type: .Type --> Natural --> .Type,
value: (.Type, Natural, .Type) => { A, n, B in n[.Type, B --> B, Natural => { n in (A --> Vector[A, n] --> B) --> B --> B }] })
let successor: Recur = "successor"
let cons = Declaration("cons",
type: (.Type, Natural) => { A, n in A --> Vector[A, n] --> Vector[A, successor[n]] },
value: (.Type, Natural) => { (A: Recur, n) in (A, Vector[A, n], .Type) => { head, tail, B in (A --> Vector[A, n] --> B, B) => { ifCons, _ in ifCons[head, tail] } } })
let zero: Recur = "zero"
let `nil` = Declaration("nil",
type: .Type => { (A: Recur) in Vector[A, zero] },
value: (.Type, Natural, .Type) => { A, n, B in ((A --> Vector[A, n] --> B), B) => { _, other in other } })
return Module("Vector", [ natural ], [ vector, cons, `nil` ])
}
}
## Instruction:
Correct the definition of `Vector.nil`.
## Code After:
// Copyright © 2015 Rob Rix. All rights reserved.
extension Module {
public static var vector: Module {
let Natural: Recur = "Natural"
let Vector: Recur = "Vector"
let vector = Declaration("Vector",
type: .Type --> Natural --> .Type,
value: (.Type, Natural, .Type) => { A, n, B in n[.Type, B --> B, Natural => { n in (A --> Vector[A, n] --> B) --> B --> B }] })
let successor: Recur = "successor"
let cons = Declaration("cons",
type: (.Type, Natural) => { A, n in A --> Vector[A, n] --> Vector[A, successor[n]] },
value: (.Type, Natural) => { (A: Recur, n) in (A, Vector[A, n], .Type) => { head, tail, B in (A --> Vector[A, n] --> B, B) => { ifCons, _ in ifCons[head, tail] } } })
let zero: Recur = "zero"
let `nil` = Declaration("nil",
type: .Type => { (A: Recur) in Vector[A, zero] },
value: (.Type, .Type) => { A, B in B => id })
return Module("Vector", [ natural ], [ vector, cons, `nil` ])
}
}
import Prelude
| // Copyright © 2015 Rob Rix. All rights reserved.
extension Module {
public static var vector: Module {
let Natural: Recur = "Natural"
let Vector: Recur = "Vector"
let vector = Declaration("Vector",
type: .Type --> Natural --> .Type,
value: (.Type, Natural, .Type) => { A, n, B in n[.Type, B --> B, Natural => { n in (A --> Vector[A, n] --> B) --> B --> B }] })
let successor: Recur = "successor"
let cons = Declaration("cons",
type: (.Type, Natural) => { A, n in A --> Vector[A, n] --> Vector[A, successor[n]] },
value: (.Type, Natural) => { (A: Recur, n) in (A, Vector[A, n], .Type) => { head, tail, B in (A --> Vector[A, n] --> B, B) => { ifCons, _ in ifCons[head, tail] } } })
let zero: Recur = "zero"
let `nil` = Declaration("nil",
type: .Type => { (A: Recur) in Vector[A, zero] },
- value: (.Type, Natural, .Type) => { A, n, B in ((A --> Vector[A, n] --> B), B) => { _, other in other } })
+ value: (.Type, .Type) => { A, B in B => id })
return Module("Vector", [ natural ], [ vector, cons, `nil` ])
}
}
+
+
+ import Prelude | 5 | 0.217391 | 4 | 1 |
da96ace72f3e46966f4b9d156ce3972d7cde31d9 | spec/views/meetings/show.html.erb_spec.rb | spec/views/meetings/show.html.erb_spec.rb | require 'rails_helper'
RSpec.describe "meetings/show", type: :view do
before(:each) do
@meeting = assign(:meeting, FactoryGirl.create(:meeting))
end
it "renders attributes in <p>" do
render
end
end
| require 'rails_helper'
RSpec.describe "meetings/show", type: :view do
before(:each) do
@meeting = assign(:meeting, FactoryGirl.create(:meeting))
end
it "renders the div" do
skip('Gives me hell about current_user')
render
assert_select('div.show-meeting')
end
end
| Mark spec as pending because it was giving me hell for no apparent reason pertaining to bcrypt | Mark spec as pending because it was giving me hell for no apparent reason pertaining to bcrypt
| Ruby | unlicense | danascheider/testrubypdx,danascheider/testrubypdx,danascheider/testrubypdx | ruby | ## Code Before:
require 'rails_helper'
RSpec.describe "meetings/show", type: :view do
before(:each) do
@meeting = assign(:meeting, FactoryGirl.create(:meeting))
end
it "renders attributes in <p>" do
render
end
end
## Instruction:
Mark spec as pending because it was giving me hell for no apparent reason pertaining to bcrypt
## Code After:
require 'rails_helper'
RSpec.describe "meetings/show", type: :view do
before(:each) do
@meeting = assign(:meeting, FactoryGirl.create(:meeting))
end
it "renders the div" do
skip('Gives me hell about current_user')
render
assert_select('div.show-meeting')
end
end
| require 'rails_helper'
RSpec.describe "meetings/show", type: :view do
before(:each) do
@meeting = assign(:meeting, FactoryGirl.create(:meeting))
end
- it "renders attributes in <p>" do
+ it "renders the div" do
+ skip('Gives me hell about current_user')
render
+
+ assert_select('div.show-meeting')
end
end | 5 | 0.454545 | 4 | 1 |
fda6d0e4fad79691aa0398ea307ec6758d6827dd | src/components/semantic-ui/ui-dropdown.js | src/components/semantic-ui/ui-dropdown.js | import { inject, bindable, bindingMode, containerless } from 'aurelia-framework';
@inject(Element)
export class UiDropdownCustomElement {
@bindable({ defaultBindingMode: bindingMode.twoWay }) value;
@bindable placeholder;
@bindable required;
constructor(element) {
this.element = element;
}
attached() {
this.dropdown = jQuery('.ui.selection.dropdown', this.element);
this.dropdown.dropdown();
// The set up gets called before the elements are in place thus
// not actually selecting them. This prevents that from happening.
setTimeout(() => {
//this.dropdown.dropdown('refresh');
this.valueChanged(this.value);
}, 1);
}
valueChanged(n) {
if (this.dropdown) {
this.value = n;
// Clear cache since that seems to stop it from actually selecting anything
this.dropdown.dropdown('refresh');
// Set the value as selected to make sure updates happen properly when values
// are added dynamically.
this.dropdown.dropdown('set selected', this.value);
}
}
}
| import { inject, bindable, bindingMode, containerless } from 'aurelia-framework';
@inject(Element)
export class UiDropdownCustomElement {
@bindable({ defaultBindingMode: bindingMode.twoWay }) value;
@bindable placeholder;
@bindable required;
constructor(element) {
this.element = element;
}
attached() {
this.dropdown = jQuery('.ui.selection.dropdown', this.element);
this.dropdown.dropdown();
// The set up gets called before the elements are in place thus
// not actually selecting them. This prevents that from happening.
setTimeout(() => {
//this.dropdown.dropdown('refresh');
this.valueChanged(this.value);
}, 1);
}
valueChanged(n) {
if (this.dropdown) {
this.value = n;
if (this.value == "") {
this.dropdown.dropdown('clear');
}
// Clear cache since that seems to stop it from actually selecting anything
this.dropdown.dropdown('refresh');
// Set the value as selected to make sure updates happen properly when values
// are added dynamically.
this.dropdown.dropdown('set selected', this.value);
}
}
}
| Fix dropdown not being cleared when no value set | Fix dropdown not being cleared when no value set
| JavaScript | mit | GETLIMS/LIMS-Frontend,GETLIMS/LIMS-Frontend | javascript | ## Code Before:
import { inject, bindable, bindingMode, containerless } from 'aurelia-framework';
@inject(Element)
export class UiDropdownCustomElement {
@bindable({ defaultBindingMode: bindingMode.twoWay }) value;
@bindable placeholder;
@bindable required;
constructor(element) {
this.element = element;
}
attached() {
this.dropdown = jQuery('.ui.selection.dropdown', this.element);
this.dropdown.dropdown();
// The set up gets called before the elements are in place thus
// not actually selecting them. This prevents that from happening.
setTimeout(() => {
//this.dropdown.dropdown('refresh');
this.valueChanged(this.value);
}, 1);
}
valueChanged(n) {
if (this.dropdown) {
this.value = n;
// Clear cache since that seems to stop it from actually selecting anything
this.dropdown.dropdown('refresh');
// Set the value as selected to make sure updates happen properly when values
// are added dynamically.
this.dropdown.dropdown('set selected', this.value);
}
}
}
## Instruction:
Fix dropdown not being cleared when no value set
## Code After:
import { inject, bindable, bindingMode, containerless } from 'aurelia-framework';
@inject(Element)
export class UiDropdownCustomElement {
@bindable({ defaultBindingMode: bindingMode.twoWay }) value;
@bindable placeholder;
@bindable required;
constructor(element) {
this.element = element;
}
attached() {
this.dropdown = jQuery('.ui.selection.dropdown', this.element);
this.dropdown.dropdown();
// The set up gets called before the elements are in place thus
// not actually selecting them. This prevents that from happening.
setTimeout(() => {
//this.dropdown.dropdown('refresh');
this.valueChanged(this.value);
}, 1);
}
valueChanged(n) {
if (this.dropdown) {
this.value = n;
if (this.value == "") {
this.dropdown.dropdown('clear');
}
// Clear cache since that seems to stop it from actually selecting anything
this.dropdown.dropdown('refresh');
// Set the value as selected to make sure updates happen properly when values
// are added dynamically.
this.dropdown.dropdown('set selected', this.value);
}
}
}
| import { inject, bindable, bindingMode, containerless } from 'aurelia-framework';
@inject(Element)
export class UiDropdownCustomElement {
@bindable({ defaultBindingMode: bindingMode.twoWay }) value;
@bindable placeholder;
@bindable required;
constructor(element) {
this.element = element;
}
attached() {
this.dropdown = jQuery('.ui.selection.dropdown', this.element);
this.dropdown.dropdown();
// The set up gets called before the elements are in place thus
// not actually selecting them. This prevents that from happening.
setTimeout(() => {
//this.dropdown.dropdown('refresh');
this.valueChanged(this.value);
}, 1);
}
valueChanged(n) {
if (this.dropdown) {
this.value = n;
+ if (this.value == "") {
+ this.dropdown.dropdown('clear');
+ }
// Clear cache since that seems to stop it from actually selecting anything
this.dropdown.dropdown('refresh');
// Set the value as selected to make sure updates happen properly when values
// are added dynamically.
this.dropdown.dropdown('set selected', this.value);
}
}
} | 3 | 0.085714 | 3 | 0 |
7437163c20dd5d52b5fe1400667863f2f7d2652c | tox.ini | tox.ini | [tox]
envlist = py26
[testenv]
changedir=tests
deps=pytest
commands=py.test --junitxml=junit-{envname}.xml
| [tox]
envlist = py267,py271
[testenv]
changedir=tests
deps=pytest
commands=py.test --junitxml=junit-{envname}.xml
[testenv:py271]
basepython=/var/lib/jenkins/.pythonbrew/pythons/Python-2.7.1/
[testenv:py267]
basepython=/var/lib/jenkins/.pythonbrew/pythons/Python-2.6.7/
| Use pybrew for jenkins user | Use pybrew for jenkins user
| INI | bsd-3-clause | myint/yolk,myint/yolk | ini | ## Code Before:
[tox]
envlist = py26
[testenv]
changedir=tests
deps=pytest
commands=py.test --junitxml=junit-{envname}.xml
## Instruction:
Use pybrew for jenkins user
## Code After:
[tox]
envlist = py267,py271
[testenv]
changedir=tests
deps=pytest
commands=py.test --junitxml=junit-{envname}.xml
[testenv:py271]
basepython=/var/lib/jenkins/.pythonbrew/pythons/Python-2.7.1/
[testenv:py267]
basepython=/var/lib/jenkins/.pythonbrew/pythons/Python-2.6.7/
| [tox]
- envlist = py26
+ envlist = py267,py271
? +++++++
[testenv]
changedir=tests
deps=pytest
commands=py.test --junitxml=junit-{envname}.xml
+ [testenv:py271]
+ basepython=/var/lib/jenkins/.pythonbrew/pythons/Python-2.7.1/
+ [testenv:py267]
+ basepython=/var/lib/jenkins/.pythonbrew/pythons/Python-2.6.7/
+
+ | 8 | 1 | 7 | 1 |
5b1fb7de6a87fb66393d784f71f020f2d8a097e7 | .travis.yml | .travis.yml | language: android
jdk: oraclejdk7
android:
components:
- build-tools-22.0.1
- android-22
- extra-android-m2repository
- sys-img-armeabi-v7a-android-19
before_script:
- echo no | android create avd --force -n test -t android-19 --abi armeabi-v7a
- emulator -avd test -no-skin -no-audio -no-window &
- android-wait-for-emulator
- adb shell input keyevent 82 &
script:
- echo "twitterConsumerKey=$TWITTER_CONSUMER_KEY" >> samples/app/fabric.properties
- echo "twitterConsumerSecret=$TWITTER_CONSUMER_SECRET" >> samples/app/fabric.properties
- echo "apiKey=$API_KEY" >> samples/app/fabric.properties
- echo "apiSecret=$API_SECRET" >> samples/app/fabric.properties
- ./gradlew assemble test connectedCheck --parallel
| language: android
jdk: oraclejdk7
android:
components:
- build-tools-22.0.1
- android-22
- extra-android-m2repository
script:
- echo "twitterConsumerKey=$TWITTER_CONSUMER_KEY" >> samples/app/fabric.properties
- echo "twitterConsumerSecret=$TWITTER_CONSUMER_SECRET" >> samples/app/fabric.properties
- echo "apiKey=$API_KEY" >> samples/app/fabric.properties
- echo "apiSecret=$API_SECRET" >> samples/app/fabric.properties
- ./gradlew assemble test
| Remove parallel and connectedCheck from Travis builds | Remove parallel and connectedCheck from Travis builds
Change-Id: Ibb20e719141bf3b41a83f1e2fa34c0587caafa64
| YAML | apache-2.0 | afeiluo/twitter-kit-android,twitter/twitter-kit-android,t9nf/twitter-kit-android,rcastro78/twitter-kit-android,afeiluo/twitter-kit-android | yaml | ## Code Before:
language: android
jdk: oraclejdk7
android:
components:
- build-tools-22.0.1
- android-22
- extra-android-m2repository
- sys-img-armeabi-v7a-android-19
before_script:
- echo no | android create avd --force -n test -t android-19 --abi armeabi-v7a
- emulator -avd test -no-skin -no-audio -no-window &
- android-wait-for-emulator
- adb shell input keyevent 82 &
script:
- echo "twitterConsumerKey=$TWITTER_CONSUMER_KEY" >> samples/app/fabric.properties
- echo "twitterConsumerSecret=$TWITTER_CONSUMER_SECRET" >> samples/app/fabric.properties
- echo "apiKey=$API_KEY" >> samples/app/fabric.properties
- echo "apiSecret=$API_SECRET" >> samples/app/fabric.properties
- ./gradlew assemble test connectedCheck --parallel
## Instruction:
Remove parallel and connectedCheck from Travis builds
Change-Id: Ibb20e719141bf3b41a83f1e2fa34c0587caafa64
## Code After:
language: android
jdk: oraclejdk7
android:
components:
- build-tools-22.0.1
- android-22
- extra-android-m2repository
script:
- echo "twitterConsumerKey=$TWITTER_CONSUMER_KEY" >> samples/app/fabric.properties
- echo "twitterConsumerSecret=$TWITTER_CONSUMER_SECRET" >> samples/app/fabric.properties
- echo "apiKey=$API_KEY" >> samples/app/fabric.properties
- echo "apiSecret=$API_SECRET" >> samples/app/fabric.properties
- ./gradlew assemble test
| language: android
jdk: oraclejdk7
android:
components:
- build-tools-22.0.1
- android-22
- extra-android-m2repository
- - sys-img-armeabi-v7a-android-19
-
- before_script:
- - echo no | android create avd --force -n test -t android-19 --abi armeabi-v7a
- - emulator -avd test -no-skin -no-audio -no-window &
- - android-wait-for-emulator
- - adb shell input keyevent 82 &
script:
- echo "twitterConsumerKey=$TWITTER_CONSUMER_KEY" >> samples/app/fabric.properties
- echo "twitterConsumerSecret=$TWITTER_CONSUMER_SECRET" >> samples/app/fabric.properties
- echo "apiKey=$API_KEY" >> samples/app/fabric.properties
- echo "apiSecret=$API_SECRET" >> samples/app/fabric.properties
- - ./gradlew assemble test connectedCheck --parallel
+ - ./gradlew assemble test | 9 | 0.428571 | 1 | 8 |
97a178995f3c5bc89da4fb6a90bf53b39c81156a | app/modules/redis/InsertLogPubSub.js | app/modules/redis/InsertLogPubSub.js | 'use strict';
exports.on = function(logs) {
var redis = mainevent.requireModule('redis').createInstance();
redis.connect();
redis.client.publish('InsertLog', JSON.stringify(logs), function() {
redis.client.end();
});
};
| 'use strict';
var redis = mainevent.requireModule('redis').createInstance();
redis.connect();
exports.on = function(logs) {
redis.client.publish('InsertLog', JSON.stringify(logs));
};
| Fix attempt: stop creating new clients for each message | Fix attempt: stop creating new clients for each message
| JavaScript | mit | codeactual/mainevent | javascript | ## Code Before:
'use strict';
exports.on = function(logs) {
var redis = mainevent.requireModule('redis').createInstance();
redis.connect();
redis.client.publish('InsertLog', JSON.stringify(logs), function() {
redis.client.end();
});
};
## Instruction:
Fix attempt: stop creating new clients for each message
## Code After:
'use strict';
var redis = mainevent.requireModule('redis').createInstance();
redis.connect();
exports.on = function(logs) {
redis.client.publish('InsertLog', JSON.stringify(logs));
};
| 'use strict';
+ var redis = mainevent.requireModule('redis').createInstance();
+ redis.connect();
+
exports.on = function(logs) {
- var redis = mainevent.requireModule('redis').createInstance();
- redis.connect();
- redis.client.publish('InsertLog', JSON.stringify(logs), function() {
? ----------- ^^
+ redis.client.publish('InsertLog', JSON.stringify(logs));
? ^
- redis.client.end();
- });
}; | 9 | 1 | 4 | 5 |
15edf1a4aad40d284afd224056ac84645cb20fd7 | CONTRIBUTING.md | CONTRIBUTING.md |
See the [Developer / Contributor
Guide](http://docs.cilium.io/en/stable/contributing/contributing/) for detailed information on
how to contribute, get started and find good first issues.
|
See the [Developer / Contributor
Guide](https://docs.cilium.io/en/stable/contributing/development/contributing_guide/) for detailed information on
how to contribute, get started and find good first issues.
| Update link to the guide | Update link to the guide
The link was outdated and pointing nowhere
Signed-off-by: Manuel Buil <e425ad3a02d493517ddd7945e45944816381397e@suse.com>
| Markdown | apache-2.0 | tklauser/cilium,michi-covalent/cilium,michi-covalent/cilium,tklauser/cilium,cilium/cilium,tgraf/cilium,cilium/cilium,tklauser/cilium,michi-covalent/cilium,tgraf/cilium,michi-covalent/cilium,cilium/cilium,tgraf/cilium,tgraf/cilium,tklauser/cilium,cilium-team/cilium,cilium/cilium,tgraf/cilium,michi-covalent/cilium,tgraf/cilium,cilium-team/cilium,tklauser/cilium,cilium/cilium | markdown | ## Code Before:
See the [Developer / Contributor
Guide](http://docs.cilium.io/en/stable/contributing/contributing/) for detailed information on
how to contribute, get started and find good first issues.
## Instruction:
Update link to the guide
The link was outdated and pointing nowhere
Signed-off-by: Manuel Buil <e425ad3a02d493517ddd7945e45944816381397e@suse.com>
## Code After:
See the [Developer / Contributor
Guide](https://docs.cilium.io/en/stable/contributing/development/contributing_guide/) for detailed information on
how to contribute, get started and find good first issues.
|
See the [Developer / Contributor
- Guide](http://docs.cilium.io/en/stable/contributing/contributing/) for detailed information on
+ Guide](https://docs.cilium.io/en/stable/contributing/development/contributing_guide/) for detailed information on
? + ++++++++++++ ++++++
how to contribute, get started and find good first issues. | 2 | 0.5 | 1 | 1 |
43bdea60b9dcd23167f84a0684fcfa02d629516a | templates/jvm-guy-spring-boot-project/gradle/wrapper/gradle-wrapper.properties | templates/jvm-guy-spring-boot-project/gradle/wrapper/gradle-wrapper.properties | distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-2.12-all.zip
| distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
#distributionUrl=https\://services.gradle.org/distributions/gradle-2.12-all.zip
distributionUrl=http\://192.168.1.229:8081/artifactory/gradle-releases/gradle-2.12-all.zip
| Use Artifactory for Gradle downloads | Use Artifactory for Gradle downloads
| INI | apache-2.0 | kurron/lazybones-experiment,kurron/lazybones-experiment,kurron/lazybones-experiment,kurron/lazybones-experiment | ini | ## Code Before:
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-2.12-all.zip
## Instruction:
Use Artifactory for Gradle downloads
## Code After:
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
#distributionUrl=https\://services.gradle.org/distributions/gradle-2.12-all.zip
distributionUrl=http\://192.168.1.229:8081/artifactory/gradle-releases/gradle-2.12-all.zip
| distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
- distributionUrl=https\://services.gradle.org/distributions/gradle-2.12-all.zip
+ #distributionUrl=https\://services.gradle.org/distributions/gradle-2.12-all.zip
? +
+ distributionUrl=http\://192.168.1.229:8081/artifactory/gradle-releases/gradle-2.12-all.zip | 3 | 0.6 | 2 | 1 |
c8b3ceaf1927fdbe4c4cb0417958121e8823ab30 | src/routes/Morph/Help/index.js | src/routes/Morph/Help/index.js | import React from 'react'
import PropTypes from 'prop-types'
import { siteTitle } from '../../../App'
import Help from '../../../components/Help'
import Notice from '../../../components/Notice'
import { morphInfo } from '../'
const MorphHelp = props => (
<Help toolInfo={morphInfo}>
<Notice>This tool is still in development.</Notice>
<h3 id='using'>
Using {siteTitle}
{morphInfo.title}
</h3>
<h3 id='acknowledgments'>Acknowledgments</h3>
<p>
Much thanks should be given to Mark Rosenfelder and{' '}
<a
href='http://www.zompist.com/sca2.html'
target='_blank'
rel='noopener noreferrer'
>
the Sound Change Applier 2 (SCA
<sup>2</sup>)
</a>
. {siteTitle}
{morphInfo.title} was mainly built as a modernized and updated version of
SCA
<sup>2</sup>.
</p>
</Help>
)
MorphHelp.propTypes = {
classes: PropTypes.object
}
export default MorphHelp
| import React from 'react'
import PropTypes from 'prop-types'
import { siteTitle } from '../../../App'
import Help from '../../../components/Help'
import Notice from '../../../components/Notice'
import { morphInfo } from '../'
const MorphHelp = props => (
<Help toolInfo={morphInfo}>
<Notice>This tool is still in development.</Notice>
<h3 id='using'>
Using {siteTitle}
{morphInfo.title}
</h3>
<h3 id='acknowledgments'>Acknowledgments</h3>
<p>
Much thanks should be given to Mark Rosenfelder and{' '}
<a
href='http://www.zompist.com/sca2.html'
target='_blank'
rel='noopener noreferrer'
>
the Sound Change Applier 2
</a>{' '}
(SCA
<sup>2</sup>
). {siteTitle}
{morphInfo.title} was mainly built as a modernized and updated version of
SCA
<sup>2</sup>.
</p>
</Help>
)
MorphHelp.propTypes = {
classes: PropTypes.object
}
export default MorphHelp
| Move SCA2 outside of link in acknowledgment | Move SCA2 outside of link in acknowledgment
| JavaScript | agpl-3.0 | nai888/langua,nai888/langua | javascript | ## Code Before:
import React from 'react'
import PropTypes from 'prop-types'
import { siteTitle } from '../../../App'
import Help from '../../../components/Help'
import Notice from '../../../components/Notice'
import { morphInfo } from '../'
const MorphHelp = props => (
<Help toolInfo={morphInfo}>
<Notice>This tool is still in development.</Notice>
<h3 id='using'>
Using {siteTitle}
{morphInfo.title}
</h3>
<h3 id='acknowledgments'>Acknowledgments</h3>
<p>
Much thanks should be given to Mark Rosenfelder and{' '}
<a
href='http://www.zompist.com/sca2.html'
target='_blank'
rel='noopener noreferrer'
>
the Sound Change Applier 2 (SCA
<sup>2</sup>)
</a>
. {siteTitle}
{morphInfo.title} was mainly built as a modernized and updated version of
SCA
<sup>2</sup>.
</p>
</Help>
)
MorphHelp.propTypes = {
classes: PropTypes.object
}
export default MorphHelp
## Instruction:
Move SCA2 outside of link in acknowledgment
## Code After:
import React from 'react'
import PropTypes from 'prop-types'
import { siteTitle } from '../../../App'
import Help from '../../../components/Help'
import Notice from '../../../components/Notice'
import { morphInfo } from '../'
const MorphHelp = props => (
<Help toolInfo={morphInfo}>
<Notice>This tool is still in development.</Notice>
<h3 id='using'>
Using {siteTitle}
{morphInfo.title}
</h3>
<h3 id='acknowledgments'>Acknowledgments</h3>
<p>
Much thanks should be given to Mark Rosenfelder and{' '}
<a
href='http://www.zompist.com/sca2.html'
target='_blank'
rel='noopener noreferrer'
>
the Sound Change Applier 2
</a>{' '}
(SCA
<sup>2</sup>
). {siteTitle}
{morphInfo.title} was mainly built as a modernized and updated version of
SCA
<sup>2</sup>.
</p>
</Help>
)
MorphHelp.propTypes = {
classes: PropTypes.object
}
export default MorphHelp
| import React from 'react'
import PropTypes from 'prop-types'
import { siteTitle } from '../../../App'
import Help from '../../../components/Help'
import Notice from '../../../components/Notice'
import { morphInfo } from '../'
const MorphHelp = props => (
<Help toolInfo={morphInfo}>
<Notice>This tool is still in development.</Notice>
<h3 id='using'>
Using {siteTitle}
{morphInfo.title}
</h3>
<h3 id='acknowledgments'>Acknowledgments</h3>
<p>
Much thanks should be given to Mark Rosenfelder and{' '}
<a
href='http://www.zompist.com/sca2.html'
target='_blank'
rel='noopener noreferrer'
>
- the Sound Change Applier 2 (SCA
? -----
+ the Sound Change Applier 2
+ </a>{' '}
+ (SCA
- <sup>2</sup>)
? -- -
+ <sup>2</sup>
- </a>
- . {siteTitle}
+ ). {siteTitle}
? +
{morphInfo.title} was mainly built as a modernized and updated version of
SCA
<sup>2</sup>.
</p>
</Help>
)
MorphHelp.propTypes = {
classes: PropTypes.object
}
export default MorphHelp | 9 | 0.225 | 5 | 4 |
38428c80124afc0200972868f528a8655def00c6 | bin/dp.js | bin/dp.js |
'use strict';
const program = require('commander');
const pkg = require('../package.json');
program
.version(pkg.version)
.command('diff <left> <right>', 'compare the given routes')
.command('show <id>', 'show a diff of the given id')
.command('run', 'run the diff suite', { isDefault: true })
.parse(process.argv);
|
'use strict';
const program = require('commander');
const pkg = require('../package.json');
program
.version(pkg.version)
.allowUnknownOption()
.command('diff <left> <right>', 'compare the given routes')
.command('show <id>', 'show a diff of the given id')
.command('run', 'run the diff suite', { isDefault: true })
.parse(process.argv);
| Allow unknown options on "root level" | Allow unknown options on "root level"
| JavaScript | mit | maxdome/dumpinator,maxdome/dumpinator | javascript | ## Code Before:
'use strict';
const program = require('commander');
const pkg = require('../package.json');
program
.version(pkg.version)
.command('diff <left> <right>', 'compare the given routes')
.command('show <id>', 'show a diff of the given id')
.command('run', 'run the diff suite', { isDefault: true })
.parse(process.argv);
## Instruction:
Allow unknown options on "root level"
## Code After:
'use strict';
const program = require('commander');
const pkg = require('../package.json');
program
.version(pkg.version)
.allowUnknownOption()
.command('diff <left> <right>', 'compare the given routes')
.command('show <id>', 'show a diff of the given id')
.command('run', 'run the diff suite', { isDefault: true })
.parse(process.argv);
|
'use strict';
const program = require('commander');
const pkg = require('../package.json');
program
.version(pkg.version)
+ .allowUnknownOption()
+
.command('diff <left> <right>', 'compare the given routes')
.command('show <id>', 'show a diff of the given id')
.command('run', 'run the diff suite', { isDefault: true })
.parse(process.argv); | 2 | 0.142857 | 2 | 0 |
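The `allowUnknownOption()` call added above is a real commander API that stops the root command from rejecting flags it does not declare, which matters here because unrecognized options should fall through to the subcommands rather than abort. For comparison only (the commit itself touches nothing but commander), Python's `argparse` exposes the same escape hatch as `parse_known_args`:

```python
import argparse

# A strict parser errors out on unknown flags; parse_known_args tolerates
# them and hands the leftovers back, much like allowUnknownOption().
parser = argparse.ArgumentParser()
parser.add_argument("--version", action="store_true")

args, unknown = parser.parse_known_args(["--version", "--some-unknown-flag"])
print(args.version)  # True
print(unknown)       # ['--some-unknown-flag']
```

The unknown flags are returned instead of raising, so a dispatcher can forward them to whichever subcommand was selected.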
74739018fcbb127050f5c6577b9f7dc50b1f13aa | lib/model/user.coffee | lib/model/user.coffee | root = exports ? this
root.User = class User
@findOne: (idOrName) ->
data = Meteor.users.findOne(idOrName)
if not data
data = Meteor.users.findOne({'profile.name': new RegExp('^' + idOrName + '$', 'i')})
new User(data)
@current: ->
Meteor.userId() and new User(Meteor.user())
constructor: (@data) ->
name: -> @data.profile.name
photoUrl: (height) ->
if @data.services
if @data.services.google
picture = @data.services.google.picture
if picture and height
picture = "#{picture}?sz=#{height}"
else if @data.services.github
picture = @data.services.github.picture
if picture and height
picture = "#{picture}&s=#{height}"
else if @data.services.facebook
picture = "http://graph.facebook.com/#{@data.services.facebook.id}/picture/"
if height
picture = "#{picture}?height=#{height/2}"
else if @data.services.twitter
picture = @data.services.twitter.profile_image_url
if not picture
if height
picture = Cdn.cdnify("/img/user-#{height}x#{height}.png")
else
picture = Cdn.cdnify('/img/user.png')
picture
| root = exports ? this
root.User = class User
@findOne: (idOrName) ->
data = Meteor.users.findOne(idOrName)
if not data
data = Meteor.users.findOne({'profile.name': new RegExp('^' + idOrName + '$', 'i')})
new User(data)
@current: ->
Meteor.userId() and new User(Meteor.user())
constructor: (@data) ->
name: -> @data.profile.name
photoUrl: (height) ->
if @data.services
if @data.services.google
picture = @data.services.google.picture
if picture and height
picture = "#{picture}?sz=#{height}"
else if @data.services.github
picture = @data.services.github.picture
if picture and height
picture = "#{picture}&s=#{height}"
else if @data.services.facebook
picture = "http://graph.facebook.com/#{@data.services.facebook.id}/picture/"
if height
picture = "#{picture}?height=#{height/2}&width=#{height/2}"
else if @data.services.twitter
picture = @data.services.twitter.profile_image_url
if not picture
if height
picture = Cdn.cdnify("/img/user-#{height}x#{height}.png")
else
picture = Cdn.cdnify('/img/user.png')
picture
| Fix facebook images to make sure they are squares | Fix facebook images to make sure they are squares
| CoffeeScript | apache-2.0 | rantav/reversim-summit-2014,rantav/reversim-summit-2015,rantav/reversim-summit-2015,rantav/reversim-summit-2014,rantav/reversim-summit-2015 | coffeescript | ## Code Before:
root = exports ? this
root.User = class User
@findOne: (idOrName) ->
data = Meteor.users.findOne(idOrName)
if not data
data = Meteor.users.findOne({'profile.name': new RegExp('^' + idOrName + '$', 'i')})
new User(data)
@current: ->
Meteor.userId() and new User(Meteor.user())
constructor: (@data) ->
name: -> @data.profile.name
photoUrl: (height) ->
if @data.services
if @data.services.google
picture = @data.services.google.picture
if picture and height
picture = "#{picture}?sz=#{height}"
else if @data.services.github
picture = @data.services.github.picture
if picture and height
picture = "#{picture}&s=#{height}"
else if @data.services.facebook
picture = "http://graph.facebook.com/#{@data.services.facebook.id}/picture/"
if height
picture = "#{picture}?height=#{height/2}"
else if @data.services.twitter
picture = @data.services.twitter.profile_image_url
if not picture
if height
picture = Cdn.cdnify("/img/user-#{height}x#{height}.png")
else
picture = Cdn.cdnify('/img/user.png')
picture
## Instruction:
Fix facebook images to make sure they are squares
## Code After:
root = exports ? this
root.User = class User
@findOne: (idOrName) ->
data = Meteor.users.findOne(idOrName)
if not data
data = Meteor.users.findOne({'profile.name': new RegExp('^' + idOrName + '$', 'i')})
new User(data)
@current: ->
Meteor.userId() and new User(Meteor.user())
constructor: (@data) ->
name: -> @data.profile.name
photoUrl: (height) ->
if @data.services
if @data.services.google
picture = @data.services.google.picture
if picture and height
picture = "#{picture}?sz=#{height}"
else if @data.services.github
picture = @data.services.github.picture
if picture and height
picture = "#{picture}&s=#{height}"
else if @data.services.facebook
picture = "http://graph.facebook.com/#{@data.services.facebook.id}/picture/"
if height
picture = "#{picture}?height=#{height/2}&width=#{height/2}"
else if @data.services.twitter
picture = @data.services.twitter.profile_image_url
if not picture
if height
picture = Cdn.cdnify("/img/user-#{height}x#{height}.png")
else
picture = Cdn.cdnify('/img/user.png')
picture
| root = exports ? this
root.User = class User
@findOne: (idOrName) ->
data = Meteor.users.findOne(idOrName)
if not data
data = Meteor.users.findOne({'profile.name': new RegExp('^' + idOrName + '$', 'i')})
new User(data)
@current: ->
Meteor.userId() and new User(Meteor.user())
constructor: (@data) ->
name: -> @data.profile.name
photoUrl: (height) ->
if @data.services
if @data.services.google
picture = @data.services.google.picture
if picture and height
picture = "#{picture}?sz=#{height}"
else if @data.services.github
picture = @data.services.github.picture
if picture and height
picture = "#{picture}&s=#{height}"
else if @data.services.facebook
picture = "http://graph.facebook.com/#{@data.services.facebook.id}/picture/"
if height
- picture = "#{picture}?height=#{height/2}"
+ picture = "#{picture}?height=#{height/2}&width=#{height/2}"
? ++++++++++++++++++
else if @data.services.twitter
picture = @data.services.twitter.profile_image_url
if not picture
if height
picture = Cdn.cdnify("/img/user-#{height}x#{height}.png")
else
picture = Cdn.cdnify('/img/user.png')
picture
| 2 | 0.051282 | 1 | 1 |
18e5fe55934899ba1e4454e1383ef49c5068dafb | ember-cli-build.js | ember-cli-build.js | /* eslint-env node */
/* global require, module */
const EmberAddon = require('ember-cli/lib/broccoli/ember-addon')
module.exports = function (defaults) {
var app = new EmberAddon(defaults, {
babel: {
optional: ['es7.decorators']
},
'ember-cli-babel': {
includePolyfill: true
},
'ember-cli-mocha': {
useLintTree: false
},
sassOptions: {
includePaths: [
'addon/styles'
]
},
snippetPaths: [
'code-snippets'
],
snippetSearchPaths: [
'tests/dummy'
]
})
app.import('bower_components/highlightjs/styles/github.css')
app.import(app.project.addonPackages['ember-source']
? 'vendor/ember/ember-template-compiler.js' : 'bower_components/ember/ember-template-compiler.js')
return app.toTree()
}
| /* eslint-env node */
/* global require, module */
const EmberAddon = require('ember-cli/lib/broccoli/ember-addon')
module.exports = function (defaults) {
var app = new EmberAddon(defaults, {
babel: {
optional: ['es7.decorators']
},
'ember-cli-babel': {
includePolyfill: true
},
sassOptions: {
includePaths: [
'addon/styles'
]
},
snippetPaths: [
'code-snippets'
],
snippetSearchPaths: [
'tests/dummy'
]
})
app.import('bower_components/highlightjs/styles/github.css')
app.import(app.project.addonPackages['ember-source']
? 'vendor/ember/ember-template-compiler.js' : 'bower_components/ember/ember-template-compiler.js')
return app.toTree()
}
| Remove useLintTree ember-cli-mocha build configuration | Remove useLintTree ember-cli-mocha build configuration
| JavaScript | mit | EWhite613/ember-frost-core,dafortin/ember-frost-core,ciena-frost/ember-frost-core,dafortin/ember-frost-core,ciena-frost/ember-frost-core,dafortin/ember-frost-core,ciena-frost/ember-frost-core,EWhite613/ember-frost-core,EWhite613/ember-frost-core | javascript | ## Code Before:
/* eslint-env node */
/* global require, module */
const EmberAddon = require('ember-cli/lib/broccoli/ember-addon')
module.exports = function (defaults) {
var app = new EmberAddon(defaults, {
babel: {
optional: ['es7.decorators']
},
'ember-cli-babel': {
includePolyfill: true
},
'ember-cli-mocha': {
useLintTree: false
},
sassOptions: {
includePaths: [
'addon/styles'
]
},
snippetPaths: [
'code-snippets'
],
snippetSearchPaths: [
'tests/dummy'
]
})
app.import('bower_components/highlightjs/styles/github.css')
app.import(app.project.addonPackages['ember-source']
? 'vendor/ember/ember-template-compiler.js' : 'bower_components/ember/ember-template-compiler.js')
return app.toTree()
}
## Instruction:
Remove useLintTree ember-cli-mocha build configuration
## Code After:
/* eslint-env node */
/* global require, module */
const EmberAddon = require('ember-cli/lib/broccoli/ember-addon')
module.exports = function (defaults) {
var app = new EmberAddon(defaults, {
babel: {
optional: ['es7.decorators']
},
'ember-cli-babel': {
includePolyfill: true
},
sassOptions: {
includePaths: [
'addon/styles'
]
},
snippetPaths: [
'code-snippets'
],
snippetSearchPaths: [
'tests/dummy'
]
})
app.import('bower_components/highlightjs/styles/github.css')
app.import(app.project.addonPackages['ember-source']
? 'vendor/ember/ember-template-compiler.js' : 'bower_components/ember/ember-template-compiler.js')
return app.toTree()
}
| /* eslint-env node */
/* global require, module */
const EmberAddon = require('ember-cli/lib/broccoli/ember-addon')
module.exports = function (defaults) {
var app = new EmberAddon(defaults, {
babel: {
optional: ['es7.decorators']
},
'ember-cli-babel': {
includePolyfill: true
- },
- 'ember-cli-mocha': {
- useLintTree: false
},
sassOptions: {
includePaths: [
'addon/styles'
]
},
snippetPaths: [
'code-snippets'
],
snippetSearchPaths: [
'tests/dummy'
]
})
app.import('bower_components/highlightjs/styles/github.css')
app.import(app.project.addonPackages['ember-source']
? 'vendor/ember/ember-template-compiler.js' : 'bower_components/ember/ember-template-compiler.js')
return app.toTree()
} | 3 | 0.088235 | 0 | 3 |
abfb87284cdeb8cdbd30187cf9ab833d6c68c54f | .travis.yml | .travis.yml | sudo: false
language: java
jdk:
- oraclejdk8
- openjdk7
before_install:
- wget https://github.com/google/protobuf/releases/download/v3.0.0-beta-3/protoc-3.0.0-beta-3-linux-x86_64.zip
- unzip protoc-3.0.0-beta-3-linux-x86_64.zip
- export PROTOC_COMPILER="$(pwd)/protoc"
os:
- linux
| sudo: false
language: java
jdk:
- oraclejdk8
before_install:
- wget https://github.com/google/protobuf/releases/download/v3.0.0-beta-3/protoc-3.0.0-beta-3-linux-x86_64.zip
- unzip protoc-3.0.0-beta-3-linux-x86_64.zip
- export PROTOC_COMPILER="$(pwd)/protoc"
os:
- linux
| Remove java7 build constraints. Api Compiler is moving to java8 only for open source build. | Remove java7 build constraints. Api Compiler is moving to java8 only for open source build.
| YAML | apache-2.0 | googleapis/api-compiler,googleapis/api-compiler | yaml | ## Code Before:
sudo: false
language: java
jdk:
- oraclejdk8
- openjdk7
before_install:
- wget https://github.com/google/protobuf/releases/download/v3.0.0-beta-3/protoc-3.0.0-beta-3-linux-x86_64.zip
- unzip protoc-3.0.0-beta-3-linux-x86_64.zip
- export PROTOC_COMPILER="$(pwd)/protoc"
os:
- linux
## Instruction:
Remove java7 build constraints. Api Compiler is moving to java8 only for open source build.
## Code After:
sudo: false
language: java
jdk:
- oraclejdk8
before_install:
- wget https://github.com/google/protobuf/releases/download/v3.0.0-beta-3/protoc-3.0.0-beta-3-linux-x86_64.zip
- unzip protoc-3.0.0-beta-3-linux-x86_64.zip
- export PROTOC_COMPILER="$(pwd)/protoc"
os:
- linux
| sudo: false
language: java
jdk:
- oraclejdk8
- - openjdk7
before_install:
- wget https://github.com/google/protobuf/releases/download/v3.0.0-beta-3/protoc-3.0.0-beta-3-linux-x86_64.zip
- unzip protoc-3.0.0-beta-3-linux-x86_64.zip
- export PROTOC_COMPILER="$(pwd)/protoc"
os:
- linux
| 1 | 0.071429 | 0 | 1 |
1bb1fc7e0b7bf4c9a0a655ed502cf922bea38286 | README.md | README.md | Maya plugin with tools that operate on meshes using symmetry tables calculated by topology.
#### Description
See the [Wiki](https://github.com/yantor3d/polySymmetry/wiki) for full details.
| Maya plugin with tools that operate on meshes using symmetry tables calculated by topology. Requires Maya 2016 or later version.
#### Description
See the [Wiki](https://github.com/yantor3d/polySymmetry/wiki) for full details.
| Update Readme with Maya version restriction. | Update Readme with Maya version restriction. | Markdown | mit | yantor3d/polySymmetry,yantor3d/polySymmetry | markdown | ## Code Before:
Maya plugin with tools that operate on meshes using symmetry tables calculated by topology.
#### Description
See the [Wiki](https://github.com/yantor3d/polySymmetry/wiki) for full details.
## Instruction:
Update Readme with Maya version restriction.
## Code After:
Maya plugin with tools that operate on meshes using symmetry tables calculated by topology. Requires Maya 2016 or later version.
#### Description
See the [Wiki](https://github.com/yantor3d/polySymmetry/wiki) for full details.
| - Maya plugin with tools that operate on meshes using symmetry tables calculated by topology.
+ Maya plugin with tools that operate on meshes using symmetry tables calculated by topology. Requires Maya 2016 or later version.
? +++++++++++++++++++++++++++++++++++++
#### Description
See the [Wiki](https://github.com/yantor3d/polySymmetry/wiki) for full details. | 2 | 0.5 | 1 | 1 |
247fa8e27849df44cbe0c922d9fd9469d6eca7b4 | src/Core/User.php | src/Core/User.php | <?php
namespace Vela\Core;
use \Sabre\HTTP\Request;
Class User
{
private $request;
public function __construct(Request $request)
{
$this->request = $request;
}
public function getUserAgent()
{
return $this->request->getRawServerValue('HTTP_USER_AGENT');
}
public function getUserIp()
{
$userIp = '';
if (getenv('HTTP_CLIENT_IP'))
$userIp = getenv('HTTP_CLIENT_IP');
else if(getenv('HTTP_X_FORWARDED_FOR'))
$userIp = getenv('HTTP_X_FORWARDED_FOR');
else if(getenv('HTTP_X_FORWARDED'))
$userIp = getenv('HTTP_X_FORWARDED');
else if(getenv('HTTP_FORWARDED_FOR'))
$userIp = getenv('HTTP_FORWARDED_FOR');
else if(getenv('HTTP_FORWARDED'))
$userIp = getenv('HTTP_FORWARDED');
else if(getenv('REMOTE_ADDR'))
$userIp = getenv('REMOTE_ADDR');
else
$userIp = 'UNKNOWN';
return $userIp;
}
}
| <?php
namespace Vela\Core;
use \Sabre\HTTP\Request;
Class User
{
private $request;
public function __construct(Request $request)
{
$this->request = $request;
}
public function getUserAgent()
{
return $this->request->getRawServerValue('HTTP_USER_AGENT');
}
public function getUserIp()
{
$userIp = '';
if (getenv('HTTP_CLIENT_IP'))
$userIp = getenv('HTTP_CLIENT_IP');
else if(getenv('HTTP_X_FORWARDED_FOR'))
$userIp = getenv('HTTP_X_FORWARDED_FOR');
else if(getenv('HTTP_X_FORWARDED'))
$userIp = getenv('HTTP_X_FORWARDED');
else if(getenv('HTTP_FORWARDED_FOR'))
$userIp = getenv('HTTP_FORWARDED_FOR');
else if(getenv('HTTP_FORWARDED'))
$userIp = getenv('HTTP_FORWARDED');
else if(getenv('REMOTE_ADDR'))
$userIp = getenv('REMOTE_ADDR');
else
$userIp = 'UNKNOWN';
return $userIp;
}
public function isRobot($userAgent)
{
$robots = require 'src/Config/Robots.php';
foreach ($robots as $robot)
{
if(strpos(strtolower($userAgent), $robot) !== false)
{
return true;
break;
}
}
return false;
}
}
| Move robot check code to USer class | Move robot check code to USer class | PHP | mit | acidvertigo/vela-commerce | php | ## Code Before:
<?php
namespace Vela\Core;
use \Sabre\HTTP\Request;
Class User
{
private $request;
public function __construct(Request $request)
{
$this->request = $request;
}
public function getUserAgent()
{
return $this->request->getRawServerValue('HTTP_USER_AGENT');
}
public function getUserIp()
{
$userIp = '';
if (getenv('HTTP_CLIENT_IP'))
$userIp = getenv('HTTP_CLIENT_IP');
else if(getenv('HTTP_X_FORWARDED_FOR'))
$userIp = getenv('HTTP_X_FORWARDED_FOR');
else if(getenv('HTTP_X_FORWARDED'))
$userIp = getenv('HTTP_X_FORWARDED');
else if(getenv('HTTP_FORWARDED_FOR'))
$userIp = getenv('HTTP_FORWARDED_FOR');
else if(getenv('HTTP_FORWARDED'))
$userIp = getenv('HTTP_FORWARDED');
else if(getenv('REMOTE_ADDR'))
$userIp = getenv('REMOTE_ADDR');
else
$userIp = 'UNKNOWN';
return $userIp;
}
}
## Instruction:
Move robot check code to USer class
## Code After:
<?php
namespace Vela\Core;
use \Sabre\HTTP\Request;
Class User
{
private $request;
public function __construct(Request $request)
{
$this->request = $request;
}
public function getUserAgent()
{
return $this->request->getRawServerValue('HTTP_USER_AGENT');
}
public function getUserIp()
{
$userIp = '';
if (getenv('HTTP_CLIENT_IP'))
$userIp = getenv('HTTP_CLIENT_IP');
else if(getenv('HTTP_X_FORWARDED_FOR'))
$userIp = getenv('HTTP_X_FORWARDED_FOR');
else if(getenv('HTTP_X_FORWARDED'))
$userIp = getenv('HTTP_X_FORWARDED');
else if(getenv('HTTP_FORWARDED_FOR'))
$userIp = getenv('HTTP_FORWARDED_FOR');
else if(getenv('HTTP_FORWARDED'))
$userIp = getenv('HTTP_FORWARDED');
else if(getenv('REMOTE_ADDR'))
$userIp = getenv('REMOTE_ADDR');
else
$userIp = 'UNKNOWN';
return $userIp;
}
public function isRobot($userAgent)
{
$robots = require 'src/Config/Robots.php';
foreach ($robots as $robot)
{
if(strpos(strtolower($userAgent), $robot) !== false)
{
return true;
break;
}
}
return false;
}
}
| <?php
namespace Vela\Core;
use \Sabre\HTTP\Request;
Class User
{
private $request;
public function __construct(Request $request)
{
$this->request = $request;
}
public function getUserAgent()
{
return $this->request->getRawServerValue('HTTP_USER_AGENT');
}
public function getUserIp()
{
$userIp = '';
if (getenv('HTTP_CLIENT_IP'))
$userIp = getenv('HTTP_CLIENT_IP');
else if(getenv('HTTP_X_FORWARDED_FOR'))
$userIp = getenv('HTTP_X_FORWARDED_FOR');
else if(getenv('HTTP_X_FORWARDED'))
$userIp = getenv('HTTP_X_FORWARDED');
else if(getenv('HTTP_FORWARDED_FOR'))
$userIp = getenv('HTTP_FORWARDED_FOR');
else if(getenv('HTTP_FORWARDED'))
$userIp = getenv('HTTP_FORWARDED');
else if(getenv('REMOTE_ADDR'))
$userIp = getenv('REMOTE_ADDR');
else
$userIp = 'UNKNOWN';
return $userIp;
}
+
+ public function isRobot($userAgent)
+ {
+ $robots = require 'src/Config/Robots.php';
+
+ foreach ($robots as $robot)
+ {
+ if(strpos(strtolower($userAgent), $robot) !== false)
+ {
+ return true;
+ break;
+ }
+ }
+
+ return false;
+ }
} | 16 | 0.390244 | 16 | 0 |
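The new `isRobot` method reduces to case-insensitive substring matching of the user agent against a list loaded from `src/Config/Robots.php`, whose contents are not shown in this row. A Python sketch of the same logic, with a purely hypothetical robot list standing in for that config file:

```python
# Hypothetical stand-in for src/Config/Robots.php (the real list is not shown).
ROBOTS = ["googlebot", "bingbot", "slurp", "crawler", "spider"]

def is_robot(user_agent):
    # Same shape as the PHP: lowercase the UA, return True as soon as
    # any robot token is found as a substring.
    ua = user_agent.lower()
    return any(robot in ua for robot in ROBOTS)

print(is_robot("Mozilla/5.0 (compatible; Googlebot/2.1)"))       # True
print(is_robot("Mozilla/5.0 (X11; Linux x86_64) Firefox/68.0"))  # False
```

Incidentally, the `break` after `return true` in the PHP version is unreachable, though harmless.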
8cb072590831370ee51de42aa308d81b5cd522c6 | src/main/java/com/alexrnl/request/shows/ShowArchive.java | src/main/java/com/alexrnl/request/shows/ShowArchive.java | package com.alexrnl.request.shows;
import java.util.HashMap;
import java.util.Map;
import com.alexrnl.request.Request;
import com.alexrnl.request.Verb;
/**
* Archive a show in the logged member account.<br />
* @author Alex
*/
public class ShowArchive extends Request {
/** Address of the method */
private static final String ADDRESS = "/shows/archive";
/** Name of the id parameter */
private static final String PARAMETER_ID = "id";
/** */
private final String id;
/**
* Constructor #1.<br />
* @param id
* the id of the show to archive.
*/
public ShowArchive (final String id) {
super(Verb.GET, ADDRESS);
this.id = id;
}
/**
* Return the attribute id.
* @return the attribute id.
*/
public String getId () {
return id;
}
@Override
public Map<String, String> getParameters () {
final Map<String, String> parameters = new HashMap<>();
parameters.put(PARAMETER_ID, getId());
return parameters;
}
}
| package com.alexrnl.request.shows;
import java.util.HashMap;
import java.util.Map;
import com.alexrnl.request.Request;
import com.alexrnl.request.Verb;
/**
* Archive a show in the logged member account.<br />
* @author Alex
*/
public class ShowArchive extends Request {
/** Address of the method */
private static final String ADDRESS = "/shows/archive";
/** Name of the id parameter */
private static final String PARAMETER_ID = "id";
/** The id of the show to archive */
private final Integer id;
/**
* Constructor #1.<br />
* @param id
* the id of the show to archive.
*/
public ShowArchive (final Integer id) {
super(Verb.GET, ADDRESS);
this.id = id;
}
/**
* Return the attribute id.
* @return the attribute id.
*/
public Integer getId () {
return id;
}
@Override
public Map<String, String> getParameters () {
final Map<String, String> parameters = new HashMap<>();
parameters.put(PARAMETER_ID, getId().toString());
return parameters;
}
}
| Make the id of the show an Integer | Make the id of the show an Integer
| Java | bsd-3-clause | AlexRNL/jSeries | java | ## Code Before:
package com.alexrnl.request.shows;
import java.util.HashMap;
import java.util.Map;
import com.alexrnl.request.Request;
import com.alexrnl.request.Verb;
/**
* Archive a show in the logged member account.<br />
* @author Alex
*/
public class ShowArchive extends Request {
/** Address of the method */
private static final String ADDRESS = "/shows/archive";
/** Name of the id parameter */
private static final String PARAMETER_ID = "id";
/** */
private final String id;
/**
* Constructor #1.<br />
* @param id
* the id of the show to archive.
*/
public ShowArchive (final String id) {
super(Verb.GET, ADDRESS);
this.id = id;
}
/**
* Return the attribute id.
* @return the attribute id.
*/
public String getId () {
return id;
}
@Override
public Map<String, String> getParameters () {
final Map<String, String> parameters = new HashMap<>();
parameters.put(PARAMETER_ID, getId());
return parameters;
}
}
## Instruction:
Make the id of the show an Integer
## Code After:
package com.alexrnl.request.shows;
import java.util.HashMap;
import java.util.Map;
import com.alexrnl.request.Request;
import com.alexrnl.request.Verb;
/**
* Archive a show in the logged member account.<br />
* @author Alex
*/
public class ShowArchive extends Request {
/** Address of the method */
private static final String ADDRESS = "/shows/archive";
/** Name of the id parameter */
private static final String PARAMETER_ID = "id";
/** The id of the show to archive */
private final Integer id;
/**
* Constructor #1.<br />
* @param id
* the id of the show to archive.
*/
public ShowArchive (final Integer id) {
super(Verb.GET, ADDRESS);
this.id = id;
}
/**
* Return the attribute id.
* @return the attribute id.
*/
public Integer getId () {
return id;
}
@Override
public Map<String, String> getParameters () {
final Map<String, String> parameters = new HashMap<>();
parameters.put(PARAMETER_ID, getId().toString());
return parameters;
}
}
| package com.alexrnl.request.shows;
import java.util.HashMap;
import java.util.Map;
import com.alexrnl.request.Request;
import com.alexrnl.request.Verb;
/**
* Archive a show in the logged member account.<br />
* @author Alex
*/
public class ShowArchive extends Request {
/** Address of the method */
private static final String ADDRESS = "/shows/archive";
/** Name of the id parameter */
private static final String PARAMETER_ID = "id";
- /** */
+ /** The id of the show to archive */
- private final String id;
? ^ ---
+ private final Integer id;
? ^^ +++
/**
* Constructor #1.<br />
* @param id
* the id of the show to archive.
*/
- public ShowArchive (final String id) {
? ^ ---
+ public ShowArchive (final Integer id) {
? ^^ +++
super(Verb.GET, ADDRESS);
this.id = id;
}
/**
* Return the attribute id.
* @return the attribute id.
*/
- public String getId () {
? ^ ---
+ public Integer getId () {
? ^^ +++
return id;
}
@Override
public Map<String, String> getParameters () {
final Map<String, String> parameters = new HashMap<>();
- parameters.put(PARAMETER_ID, getId());
+ parameters.put(PARAMETER_ID, getId().toString());
? +++++++++++
return parameters;
}
} | 10 | 0.208333 | 5 | 5 |
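The Java change above moves the show id from `String` to `Integer` in the model and converts it back to a string only where `getParameters()` has to satisfy its `Map<String, String>` return type, via `getId().toString()`. The same keep-it-typed-until-the-boundary pattern, sketched in Python with the standard library (the `/shows/archive` path comes from the row; the client code around it is hypothetical):

```python
from urllib.parse import urlencode

show_id = 1234  # numeric in the model, like the new Integer field

# Stringify only at the HTTP boundary, as getParameters() now does.
query = urlencode({"id": str(show_id)})
print("/shows/archive?" + query)  # /shows/archive?id=1234
```

Keeping the id numeric until request time lets the rest of the code validate and compare it as a number instead of a raw string.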
d069fe16cc151d69a040cbe0d1ba266a48e831a8 | client/Library/Listing.ts | client/Library/Listing.ts | import { Listable, ServerListing } from "../../common/Listable";
import { Store } from "../Utility/Store";
export type ListingOrigin = "server" | "account" | "localStorage";
export class Listing<T extends Listable> implements ServerListing {
constructor(
public Id: string,
public Name: string,
public Path: string,
public SearchHint: string,
public Link: string,
public Origin: ListingOrigin,
value?: T
) {
if (value) {
this.value(value);
}
}
private value = ko.observable<T>();
public SetValue = value => this.value(value);
public GetAsyncWithUpdatedId(callback: (item: T) => any) {
if (this.value()) {
return callback(this.value());
}
if (this.Origin === "localStorage") {
const item = Store.Load<T>(this.Link, this.Id);
item.Id = this.Id;
if (item !== null) {
return callback(item);
} else {
console.error(`Couldn't load item keyed '${this.Id}' from localStorage.`);
}
}
return $.getJSON(this.Link).done(item => {
this.value(item);
return callback(item);
});
}
public CurrentName = ko.computed(() => {
const current = this.value();
if (current !== undefined) {
return current.Name || this.Name;
}
return this.Name;
});
}
| import { Listable, ServerListing } from "../../common/Listable";
import { Store } from "../Utility/Store";
export type ListingOrigin = "server" | "account" | "localStorage";
export class Listing<T extends Listable> implements ServerListing {
constructor(
public Id: string,
public Name: string,
public Path: string,
public SearchHint: string,
public Link: string,
public Origin: ListingOrigin,
value?: T
) {
if (value) {
this.value(value);
}
}
private value = ko.observable<T>();
public SetValue = value => this.value(value);
public GetAsyncWithUpdatedId(callback: (item: T) => any) {
if (this.value()) {
return callback(this.value());
}
if (this.Origin === "localStorage") {
const item = Store.Load<T>(this.Link, this.Id);
item.Id = this.Id;
if (item !== null) {
return callback(item);
} else {
console.error(`Couldn't load item keyed '${this.Id}' from localStorage.`);
}
}
return $.getJSON(this.Link).done(item => {
item.Id = this.Id;
this.value(item);
return callback(item);
});
}
public CurrentName = ko.computed(() => {
const current = this.value();
if (current !== undefined) {
return current.Name || this.Name;
}
return this.Name;
});
}
| Update item Id when getting from server | Update item Id when getting from server
| TypeScript | mit | cynicaloptimist/improved-initiative,cynicaloptimist/improved-initiative,cynicaloptimist/improved-initiative | typescript | ## Code Before:
import { Listable, ServerListing } from "../../common/Listable";
import { Store } from "../Utility/Store";
export type ListingOrigin = "server" | "account" | "localStorage";
export class Listing<T extends Listable> implements ServerListing {
constructor(
public Id: string,
public Name: string,
public Path: string,
public SearchHint: string,
public Link: string,
public Origin: ListingOrigin,
value?: T
) {
if (value) {
this.value(value);
}
}
private value = ko.observable<T>();
public SetValue = value => this.value(value);
public GetAsyncWithUpdatedId(callback: (item: T) => any) {
if (this.value()) {
return callback(this.value());
}
if (this.Origin === "localStorage") {
const item = Store.Load<T>(this.Link, this.Id);
item.Id = this.Id;
if (item !== null) {
return callback(item);
} else {
console.error(`Couldn't load item keyed '${this.Id}' from localStorage.`);
}
}
return $.getJSON(this.Link).done(item => {
this.value(item);
return callback(item);
});
}
public CurrentName = ko.computed(() => {
const current = this.value();
if (current !== undefined) {
return current.Name || this.Name;
}
return this.Name;
});
}
## Instruction:
Update item Id when getting from server
## Code After:
import { Listable, ServerListing } from "../../common/Listable";
import { Store } from "../Utility/Store";
export type ListingOrigin = "server" | "account" | "localStorage";
export class Listing<T extends Listable> implements ServerListing {
constructor(
public Id: string,
public Name: string,
public Path: string,
public SearchHint: string,
public Link: string,
public Origin: ListingOrigin,
value?: T
) {
if (value) {
this.value(value);
}
}
private value = ko.observable<T>();
public SetValue = value => this.value(value);
public GetAsyncWithUpdatedId(callback: (item: T) => any) {
if (this.value()) {
return callback(this.value());
}
if (this.Origin === "localStorage") {
const item = Store.Load<T>(this.Link, this.Id);
item.Id = this.Id;
if (item !== null) {
return callback(item);
} else {
console.error(`Couldn't load item keyed '${this.Id}' from localStorage.`);
}
}
return $.getJSON(this.Link).done(item => {
item.Id = this.Id;
this.value(item);
return callback(item);
});
}
public CurrentName = ko.computed(() => {
const current = this.value();
if (current !== undefined) {
return current.Name || this.Name;
}
return this.Name;
});
}
| import { Listable, ServerListing } from "../../common/Listable";
import { Store } from "../Utility/Store";
export type ListingOrigin = "server" | "account" | "localStorage";
export class Listing<T extends Listable> implements ServerListing {
constructor(
public Id: string,
public Name: string,
public Path: string,
public SearchHint: string,
public Link: string,
public Origin: ListingOrigin,
value?: T
) {
if (value) {
this.value(value);
}
}
private value = ko.observable<T>();
public SetValue = value => this.value(value);
public GetAsyncWithUpdatedId(callback: (item: T) => any) {
if (this.value()) {
return callback(this.value());
}
if (this.Origin === "localStorage") {
const item = Store.Load<T>(this.Link, this.Id);
item.Id = this.Id;
if (item !== null) {
return callback(item);
} else {
console.error(`Couldn't load item keyed '${this.Id}' from localStorage.`);
}
}
return $.getJSON(this.Link).done(item => {
+ item.Id = this.Id;
this.value(item);
return callback(item);
});
}
public CurrentName = ko.computed(() => {
const current = this.value();
if (current !== undefined) {
return current.Name || this.Name;
}
return this.Name;
});
} | 1 | 0.018519 | 1 | 0 |
60533ab4d1d0b1b049b98f206445ac5dbc437913 | lib/anticuado/ios/carthage.rb | lib/anticuado/ios/carthage.rb | module Anticuado
module IOS
class Carthage < Anticuado::Base
# @param [String] project Path to project directory.
# @return [String] The result of command `carthage outdated`.
def self.outdated(project = nil)
return puts "have no carthage command" if `which carthage`.empty?
if project
`carthage outdated --project-directory #{project}`
else
`carthage outdated`
end
end
# @param [String] outdated The result of command `carthage outdated`
      # @return [Array] Array of outdated data.
      #          If the target project has no outdated data, then return a blank array such as `[]`
def self.format(outdated)
array = outdated.split(/\R/).map(&:strip)
index = array.find_index("The following dependencies are outdated:")
return [] if index.nil?
array[index + 1..array.size].map do |library|
versions = library.split(/[\s|"]/) # e.g. ["Result", "", "2.0.0", "", "->", "", "2.1.3"]
{
library_name: versions[0],
current_version: versions[2],
available_version: versions[6],
latest_version: versions[6]
}
end
end
end # class Carthage
end # module IOS
end # module Anticuado
| module Anticuado
module IOS
class Carthage < Anticuado::Base
# @param [String] project Path to project directory.
# @return [String] The result of command `carthage outdated`.
def self.outdated(project = nil)
return puts "have no carthage command" if `which carthage`.empty?
if project
`carthage outdated --project-directory #{project}`
else
`carthage outdated`
end
end
# @param [String] outdated The result of command `carthage outdated`
      # @return [Array] Array of outdated data.
      #          If the target project has no outdated data, then return a blank array such as `[]`
def self.format(outdated)
array = outdated.split(/\R/).map(&:strip)
index = array.find_index("The following dependencies are outdated:")
return [] if index.nil?
array[index + 1..array.size].map do |library|
versions = library.split(/[\s|"]/)
if versions[8] =~ /Latest/
# e.g. ["RxSwift", "", "4.1.0", "", "->", "", "4.1.2", "", "(Latest:", "", "4.1.2", ")"]
{
library_name: versions[0],
current_version: versions[2],
available_version: versions[6],
latest_version: versions[10]
}
else
# e.g. ["Result", "", "2.0.0", "", "->", "", "2.1.3"]
{
library_name: versions[0],
current_version: versions[2],
available_version: versions[6],
latest_version: versions[6]
}
end
end
end
end # class Carthage
end # module IOS
end # module Anticuado
| Support newer syntax of Carthage | Support newer syntax of Carthage
| Ruby | mit | KazuCocoa/anticuado,KazuCocoa/anticuado | ruby | ## Code Before:
module Anticuado
module IOS
class Carthage < Anticuado::Base
# @param [String] project Path to project directory.
# @return [String] The result of command `carthage outdated`.
def self.outdated(project = nil)
return puts "have no carthage command" if `which carthage`.empty?
if project
`carthage outdated --project-directory #{project}`
else
`carthage outdated`
end
end
# @param [String] outdated The result of command `carthage outdated`
      # @return [Array] Array of outdated data.
      #          If the target project has no outdated data, then return a blank array such as `[]`
def self.format(outdated)
array = outdated.split(/\R/).map(&:strip)
index = array.find_index("The following dependencies are outdated:")
return [] if index.nil?
array[index + 1..array.size].map do |library|
versions = library.split(/[\s|"]/) # e.g. ["Result", "", "2.0.0", "", "->", "", "2.1.3"]
{
library_name: versions[0],
current_version: versions[2],
available_version: versions[6],
latest_version: versions[6]
}
end
end
end # class Carthage
end # module IOS
end # module Anticuado
## Instruction:
Support newer syntax of Carthage
## Code After:
module Anticuado
module IOS
class Carthage < Anticuado::Base
# @param [String] project Path to project directory.
# @return [String] The result of command `carthage outdated`.
def self.outdated(project = nil)
return puts "have no carthage command" if `which carthage`.empty?
if project
`carthage outdated --project-directory #{project}`
else
`carthage outdated`
end
end
# @param [String] outdated The result of command `carthage outdated`
      # @return [Array] Array of outdated data.
      #          If the target project has no outdated data, then return a blank array such as `[]`
def self.format(outdated)
array = outdated.split(/\R/).map(&:strip)
index = array.find_index("The following dependencies are outdated:")
return [] if index.nil?
array[index + 1..array.size].map do |library|
versions = library.split(/[\s|"]/)
if versions[8] =~ /Latest/
# e.g. ["RxSwift", "", "4.1.0", "", "->", "", "4.1.2", "", "(Latest:", "", "4.1.2", ")"]
{
library_name: versions[0],
current_version: versions[2],
available_version: versions[6],
latest_version: versions[10]
}
else
# e.g. ["Result", "", "2.0.0", "", "->", "", "2.1.3"]
{
library_name: versions[0],
current_version: versions[2],
available_version: versions[6],
latest_version: versions[6]
}
end
end
end
end # class Carthage
end # module IOS
end # module Anticuado
| module Anticuado
module IOS
class Carthage < Anticuado::Base
# @param [String] project Path to project directory.
# @return [String] The result of command `carthage outdated`.
def self.outdated(project = nil)
return puts "have no carthage command" if `which carthage`.empty?
if project
`carthage outdated --project-directory #{project}`
else
`carthage outdated`
end
end
# @param [String] outdated The result of command `carthage outdated`
      # @return [Array] Array of outdated data.
      #          If the target project has no outdated data, then return a blank array such as `[]`
def self.format(outdated)
array = outdated.split(/\R/).map(&:strip)
index = array.find_index("The following dependencies are outdated:")
return [] if index.nil?
array[index + 1..array.size].map do |library|
- versions = library.split(/[\s|"]/) # e.g. ["Result", "", "2.0.0", "", "->", "", "2.1.3"]
+ versions = library.split(/[\s|"]/)
+ if versions[8] =~ /Latest/
+ # e.g. ["RxSwift", "", "4.1.0", "", "->", "", "4.1.2", "", "(Latest:", "", "4.1.2", ")"]
- {
+ {
? ++
+ library_name: versions[0],
+ current_version: versions[2],
+ available_version: versions[6],
+ latest_version: versions[10]
+ }
+ else
+ # e.g. ["Result", "", "2.0.0", "", "->", "", "2.1.3"]
+ {
library_name: versions[0],
current_version: versions[2],
available_version: versions[6],
latest_version: versions[6]
- }
+ }
? ++
+ end
end
end
end # class Carthage
end # module IOS
end # module Anticuado | 17 | 0.459459 | 14 | 3 |
1c6557bf527f4d26aa81965324b99ba3aa97dc5d | _data/events_gdcr2019/pune.json | _data/events_gdcr2019/pune.json | {
"title": "Global Day of Coderetreat 2019 @ Equal Experts",
"url": "https://www.meetup.com/expert-talks-Pune/events/264152214/",
"moderators": [],
"sponsors": [
{
"name": "Equal Experts, Pune",
"url": "https://github.com/EqualExperts"
}
],
"date": {
"start": "2019-11-16T09:00:00+05:30",
"end": "2019-11-16T17:00:00+05:30"
},
"location": {
"city": "Pune",
"country": "India",
"coordinates": {
"latitude": 18.544905,
"longitude": 73.909751
}
}
}
| {
"title": "Global Day of Coderetreat 2019 @ Equal Experts",
"url": "https://www.meetup.com/expert-talks-Pune/events/264152214/",
"moderators": [
{
"name": "Aashish Ghogre",
"url": "https://github.com/ashishghogre"
},
{
"name": "Sagar Mate",
"url": "https://www.linkedin.com/in/sagar-mate-6063125b/"
}
],
"sponsors": [
{
"name": "Equal Experts, Pune",
"url": "https://github.com/EqualExperts"
}
],
"date": {
"start": "2019-11-16T09:00:00+05:30",
"end": "2019-11-16T17:00:00+05:30"
},
"location": {
"city": "Pune",
"country": "India",
"coordinates": {
"latitude": 18.544905,
"longitude": 73.909751
}
}
}
| Add facilitators for Equal Experts, Pune | Add facilitators for Equal Experts, Pune
| JSON | mit | coderetreat/coderetreat.github.io,coderetreat/coderetreat.github.io,coderetreat/coderetreat.github.io,coderetreat/coderetreat.github.io | json | ## Code Before:
{
"title": "Global Day of Coderetreat 2019 @ Equal Experts",
"url": "https://www.meetup.com/expert-talks-Pune/events/264152214/",
"moderators": [],
"sponsors": [
{
"name": "Equal Experts, Pune",
"url": "https://github.com/EqualExperts"
}
],
"date": {
"start": "2019-11-16T09:00:00+05:30",
"end": "2019-11-16T17:00:00+05:30"
},
"location": {
"city": "Pune",
"country": "India",
"coordinates": {
"latitude": 18.544905,
"longitude": 73.909751
}
}
}
## Instruction:
Add facilitators for Equal Experts, Pune
## Code After:
{
"title": "Global Day of Coderetreat 2019 @ Equal Experts",
"url": "https://www.meetup.com/expert-talks-Pune/events/264152214/",
"moderators": [
{
"name": "Aashish Ghogre",
"url": "https://github.com/ashishghogre"
},
{
"name": "Sagar Mate",
"url": "https://www.linkedin.com/in/sagar-mate-6063125b/"
}
],
"sponsors": [
{
"name": "Equal Experts, Pune",
"url": "https://github.com/EqualExperts"
}
],
"date": {
"start": "2019-11-16T09:00:00+05:30",
"end": "2019-11-16T17:00:00+05:30"
},
"location": {
"city": "Pune",
"country": "India",
"coordinates": {
"latitude": 18.544905,
"longitude": 73.909751
}
}
}
| {
"title": "Global Day of Coderetreat 2019 @ Equal Experts",
"url": "https://www.meetup.com/expert-talks-Pune/events/264152214/",
- "moderators": [],
? --
+ "moderators": [
+ {
+ "name": "Aashish Ghogre",
+ "url": "https://github.com/ashishghogre"
+ },
+ {
+ "name": "Sagar Mate",
+ "url": "https://www.linkedin.com/in/sagar-mate-6063125b/"
+ }
+ ],
"sponsors": [
{
"name": "Equal Experts, Pune",
"url": "https://github.com/EqualExperts"
}
],
"date": {
"start": "2019-11-16T09:00:00+05:30",
"end": "2019-11-16T17:00:00+05:30"
},
"location": {
"city": "Pune",
"country": "India",
"coordinates": {
"latitude": 18.544905,
"longitude": 73.909751
}
}
}
| 11 | 0.458333 | 10 | 1 |
4027ee5ab468e8688551234875562e280809e40a | tasks/storage.yml | tasks/storage.yml | ---
- name: Create filesystem on storage block device
filesystem:
dev: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
force: "{{ docker_storage_force }}"
become: true
ignore_errors: true
- name: Create docker directory
file:
path: /var/lib/docker
state: directory
owner: root
group: root
mode: 0711
become: true
tags:
- skip_ansible_lint
- name: Mount storage block device to docker directory
mount:
# NOTE: changes to path after 2.3
name: /var/lib/docker
src: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
state: mounted
become: true
| ---
- name: Create filesystem on storage block device
filesystem:
dev: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
force: "{{ docker_storage_force }}"
become: true
ignore_errors: true
- name: Create docker directory
file:
path: /var/lib/docker
state: directory
owner: root
group: root
mode: 0711
become: true
tags:
- skip_ansible_lint
- name: Mount storage block device to docker directory
mount:
path: /var/lib/docker
src: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
state: mounted
become: true
| Use path instead of name | Use path instead of name
| YAML | apache-2.0 | betacloud/ansible-docker,betacloud/ansible-docker | yaml | ## Code Before:
---
- name: Create filesystem on storage block device
filesystem:
dev: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
force: "{{ docker_storage_force }}"
become: true
ignore_errors: true
- name: Create docker directory
file:
path: /var/lib/docker
state: directory
owner: root
group: root
mode: 0711
become: true
tags:
- skip_ansible_lint
- name: Mount storage block device to docker directory
mount:
# NOTE: changes to path after 2.3
name: /var/lib/docker
src: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
state: mounted
become: true
## Instruction:
Use path instead of name
## Code After:
---
- name: Create filesystem on storage block device
filesystem:
dev: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
force: "{{ docker_storage_force }}"
become: true
ignore_errors: true
- name: Create docker directory
file:
path: /var/lib/docker
state: directory
owner: root
group: root
mode: 0711
become: true
tags:
- skip_ansible_lint
- name: Mount storage block device to docker directory
mount:
path: /var/lib/docker
src: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
state: mounted
become: true
| ---
- name: Create filesystem on storage block device
filesystem:
dev: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
force: "{{ docker_storage_force }}"
become: true
ignore_errors: true
- name: Create docker directory
file:
path: /var/lib/docker
state: directory
owner: root
group: root
mode: 0711
become: true
tags:
- skip_ansible_lint
- name: Mount storage block device to docker directory
mount:
- # NOTE: changes to path after 2.3
- name: /var/lib/docker
? ^ ^^
+ path: /var/lib/docker
? ^ ^^
src: "{{ docker_storage_block_device }}"
fstype: "{{ docker_storage_filesystem }}"
state: mounted
become: true | 3 | 0.107143 | 1 | 2 |
2c36f2ef7ca925b97659324ace77b40541fb7553 | .nodeset.yml | .nodeset.yml | default_set: 'centos-64-x64'
sets:
'centos-64-x64':
nodes:
"main.foo.vm":
prefab: 'centos-64-x64'
| default_set: 'centos-64-x64'
sets:
'centos-64-x64':
nodes:
"main.foo.vm":
prefab: 'centos-64-x64'
'debian-70rc1-x64':
nodes:
"main":
prefab: 'debian-70rc1-x64'
| Add debian wheezy node for rspec-system | Add debian wheezy node for rspec-system
| YAML | apache-2.0 | MelanieGault/puppet-wget,blackcobra1973/puppet-wget,maestrodev/puppet-wget,azcender/puppet-wget,DSI-Ville-Noumea/puppet-wget,seanscottking/puppet-wget,PierrickI3/puppet-wget | yaml | ## Code Before:
default_set: 'centos-64-x64'
sets:
'centos-64-x64':
nodes:
"main.foo.vm":
prefab: 'centos-64-x64'
## Instruction:
Add debian wheezy node for rspec-system
## Code After:
default_set: 'centos-64-x64'
sets:
'centos-64-x64':
nodes:
"main.foo.vm":
prefab: 'centos-64-x64'
'debian-70rc1-x64':
nodes:
"main":
prefab: 'debian-70rc1-x64'
| default_set: 'centos-64-x64'
sets:
'centos-64-x64':
nodes:
"main.foo.vm":
prefab: 'centos-64-x64'
+ 'debian-70rc1-x64':
+ nodes:
+ "main":
+ prefab: 'debian-70rc1-x64' | 4 | 0.666667 | 4 | 0 |
326022e902902cc05ed10a656b9cccc3c78590ea | app/models/houston/itsm/issue.rb | app/models/houston/itsm/issue.rb | require "ntlm/http"
module Houston
module Itsm
class Issue < Struct.new(:summary, :url, :assigned_to_email, :assigned_to_user)
def self.open
http = Net::HTTP.start("ecphhelper", 80)
req = Net::HTTP::Get.new("/ITSM.asmx/GetOpenCallsEmergingProducts")
req.ntlm_auth("Houston", "cph.pri", "gKfub6mFy9BHDs6")
response = http.request(req)
parse_issues(response.body).map do |issue|
url = Nokogiri::HTML::fragment(issue["CallDetailLink"]).children.first[:href]
self.new(issue["Summary"], url, issue["AssignedToEmailAddress"].try(:downcase))
end
end
private
def self.parse_issues(xml)
Array.wrap(
Hash.from_xml(xml)
.fetch("ArrayOfOpenCallData", {})
.fetch("OpenCallData", []))
rescue REXML::ParseException # malformed response upstream
Rails.logger.error "\e[31;1m#{$!.class}\e[0;31m: #{$!.message}"
[]
end
end
end
end
| require "ntlm/http"
module Houston
module Itsm
class Issue < Struct.new(:key, :summary, :url, :assigned_to_email, :assigned_to_user)
def self.open
http = Net::HTTP.start("ecphhelper", 80)
req = Net::HTTP::Get.new("/ITSM.asmx/GetOpenCallsEmergingProducts")
req.ntlm_auth("Houston", "cph.pri", "gKfub6mFy9BHDs6")
response = http.request(req)
parse_issues(response.body).map do |issue|
self.new(
issue["SupportCallID"],
issue["Summary"],
href_of(issue["CallDetailLink"]),
issue["AssignedToEmailAddress"].try(:downcase))
end
end
private
def self.parse_issues(xml)
Array.wrap(
Hash.from_xml(xml)
.fetch("ArrayOfOpenCallData", {})
.fetch("OpenCallData", []))
rescue REXML::ParseException # malformed response upstream
Rails.logger.error "\e[31;1m#{$!.class}\e[0;31m: #{$!.message}"
[]
end
def self.href_of(link)
Nokogiri::HTML::fragment(link).children.first[:href]
end
end
end
end
| Add `key` to Issue (2m) | [refactor] Add `key` to Issue (2m)
| Ruby | mit | cph/houston-dashboards,concordia-publishing-house/houston-dashboards,cph/houston-dashboards,concordia-publishing-house/houston-dashboards,concordia-publishing-house/houston-dashboards,cph/houston-dashboards | ruby | ## Code Before:
require "ntlm/http"
module Houston
module Itsm
class Issue < Struct.new(:summary, :url, :assigned_to_email, :assigned_to_user)
def self.open
http = Net::HTTP.start("ecphhelper", 80)
req = Net::HTTP::Get.new("/ITSM.asmx/GetOpenCallsEmergingProducts")
req.ntlm_auth("Houston", "cph.pri", "gKfub6mFy9BHDs6")
response = http.request(req)
parse_issues(response.body).map do |issue|
url = Nokogiri::HTML::fragment(issue["CallDetailLink"]).children.first[:href]
self.new(issue["Summary"], url, issue["AssignedToEmailAddress"].try(:downcase))
end
end
private
def self.parse_issues(xml)
Array.wrap(
Hash.from_xml(xml)
.fetch("ArrayOfOpenCallData", {})
.fetch("OpenCallData", []))
rescue REXML::ParseException # malformed response upstream
Rails.logger.error "\e[31;1m#{$!.class}\e[0;31m: #{$!.message}"
[]
end
end
end
end
## Instruction:
[refactor] Add `key` to Issue (2m)
## Code After:
require "ntlm/http"
module Houston
module Itsm
class Issue < Struct.new(:key, :summary, :url, :assigned_to_email, :assigned_to_user)
def self.open
http = Net::HTTP.start("ecphhelper", 80)
req = Net::HTTP::Get.new("/ITSM.asmx/GetOpenCallsEmergingProducts")
req.ntlm_auth("Houston", "cph.pri", "gKfub6mFy9BHDs6")
response = http.request(req)
parse_issues(response.body).map do |issue|
self.new(
issue["SupportCallID"],
issue["Summary"],
href_of(issue["CallDetailLink"]),
issue["AssignedToEmailAddress"].try(:downcase))
end
end
private
def self.parse_issues(xml)
Array.wrap(
Hash.from_xml(xml)
.fetch("ArrayOfOpenCallData", {})
.fetch("OpenCallData", []))
rescue REXML::ParseException # malformed response upstream
Rails.logger.error "\e[31;1m#{$!.class}\e[0;31m: #{$!.message}"
[]
end
def self.href_of(link)
Nokogiri::HTML::fragment(link).children.first[:href]
end
end
end
end
| require "ntlm/http"
module Houston
module Itsm
- class Issue < Struct.new(:summary, :url, :assigned_to_email, :assigned_to_user)
+ class Issue < Struct.new(:key, :summary, :url, :assigned_to_email, :assigned_to_user)
? ++++++
def self.open
http = Net::HTTP.start("ecphhelper", 80)
req = Net::HTTP::Get.new("/ITSM.asmx/GetOpenCallsEmergingProducts")
req.ntlm_auth("Houston", "cph.pri", "gKfub6mFy9BHDs6")
response = http.request(req)
parse_issues(response.body).map do |issue|
- url = Nokogiri::HTML::fragment(issue["CallDetailLink"]).children.first[:href]
+ self.new(
+ issue["SupportCallID"],
+ issue["Summary"],
+ href_of(issue["CallDetailLink"]),
- self.new(issue["Summary"], url, issue["AssignedToEmailAddress"].try(:downcase))
? -------------------------- ----
+ issue["AssignedToEmailAddress"].try(:downcase))
end
end
private
def self.parse_issues(xml)
Array.wrap(
Hash.from_xml(xml)
.fetch("ArrayOfOpenCallData", {})
.fetch("OpenCallData", []))
rescue REXML::ParseException # malformed response upstream
Rails.logger.error "\e[31;1m#{$!.class}\e[0;31m: #{$!.message}"
[]
end
+ def self.href_of(link)
+ Nokogiri::HTML::fragment(link).children.first[:href]
+ end
+
end
end
end | 13 | 0.382353 | 10 | 3 |
699c43e5bf71fc09c3c7cd3820ae1034ab0830f8 | credentials/remove-newlines.sh | credentials/remove-newlines.sh | TMPFILE="$(mktemp)"
for f in "$(dirname "$0")/"*".txt"; do
cat "$f" | tr -d '\n' > "$TMPFILE"
mv "$TMPFILE" "$f"
done
rm "$TMPFILE"
| TMPFILE="$(mktemp)"
for f in "$(dirname "$0")/"*".txt"; do
cat "$f" | tr -d '\n' > "$TMPFILE"
mv "$TMPFILE" "$f"
done
[ -e "$TMPFILE" ] && rm "$TMPFILE"
| Remove temp file only if it exists | Remove temp file only if it exists
| Shell | agpl-3.0 | lcorbasson/progicilia-watch,lcorbasson/progicilia-hack,lcorbasson/progicilia-watch,lcorbasson/progicilia-hack | shell | ## Code Before:
TMPFILE="$(mktemp)"
for f in "$(dirname "$0")/"*".txt"; do
cat "$f" | tr -d '\n' > "$TMPFILE"
mv "$TMPFILE" "$f"
done
rm "$TMPFILE"
## Instruction:
Remove temp file only if it exists
## Code After:
TMPFILE="$(mktemp)"
for f in "$(dirname "$0")/"*".txt"; do
cat "$f" | tr -d '\n' > "$TMPFILE"
mv "$TMPFILE" "$f"
done
[ -e "$TMPFILE" ] && rm "$TMPFILE"
| TMPFILE="$(mktemp)"
for f in "$(dirname "$0")/"*".txt"; do
cat "$f" | tr -d '\n' > "$TMPFILE"
mv "$TMPFILE" "$f"
done
- rm "$TMPFILE"
+ [ -e "$TMPFILE" ] && rm "$TMPFILE"
| 2 | 0.285714 | 1 | 1 |
5bbd2968530821b2f3a3b265dcb30178a45e4444 | y86_machine.7 | y86_machine.7 | .\"
.\" Copyright (c) 2020 Scott Bennett <scottb@fastmail.com>
.\"
.Dd March 21, 2020
.Dt Y86_MACHINE 7
.Os
.Sh NAME
.Nm Y86 Machine Reference
.Sh DESCRIPTION
The Y86
.Pq Dq YES
machine is a theoretical processor based on a subset of x86 concepts.
Being a teaching processor, Y86 embodies elements of both RISC and CISC architectures.
The Y86 machine is a little endian machine consisting of three main components:
memory, program registers, and processor registers.
.Ss Memory
The Y86 memory consists of 1024, 32-bit words.
.Sh SEE ALSO
.Xr yess 1 ,
.Xr y86_obj_code 7
| .\"
.\" Copyright (c) 2020 Scott Bennett <scottb@fastmail.com>
.\"
.Dd March 21, 2020
.Dt Y86_MACHINE 7
.Os
.Sh NAME
.Nm Y86 Machine Reference
.Sh DESCRIPTION
The Y86
.Pq Dq YES
machine is a theoretical processor based on a subset of x86 concepts.
Being a teaching processor, Y86 embodies elements of both RISC and CISC architectures.
The Y86 machine is a little endian machine consisting of three main components:
memory, program registers, and processor registers.
.Ss Memory
The Y86 memory consists of 1024, 32-bit words.
.Pp
The relationship between word addresses and byte addresses can be seen below.
.\" Table of memory addresses
.TS
allbox;
cw14 cz s s s
r c c c c.
Word Address Byte Address
0 3 2 1 0
1 7 6 5 4
2 11 10 9 8
3 15 14 13 12
4 19 18 17 16
\&. \&. \&. \&. \&.
1023 4095 4094 4093 4092
.TE
.Sh SEE ALSO
.Xr yess 1 ,
.Xr y86_obj_code 7
| Make a table of address relationships | Make a table of address relationships
| Groff | isc | sbennett1990/YESS,sbennett1990/YESS,sbennett1990/YESS | groff | ## Code Before:
.\"
.\" Copyright (c) 2020 Scott Bennett <scottb@fastmail.com>
.\"
.Dd March 21, 2020
.Dt Y86_MACHINE 7
.Os
.Sh NAME
.Nm Y86 Machine Reference
.Sh DESCRIPTION
The Y86
.Pq Dq YES
machine is a theoretical processor based on a subset of x86 concepts.
Being a teaching processor, Y86 embodies elements of both RISC and CISC architectures.
The Y86 machine is a little endian machine consisting of three main components:
memory, program registers, and processor registers.
.Ss Memory
The Y86 memory consists of 1024, 32-bit words.
.Sh SEE ALSO
.Xr yess 1 ,
.Xr y86_obj_code 7
## Instruction:
Make a table of address relationships
## Code After:
.\"
.\" Copyright (c) 2020 Scott Bennett <scottb@fastmail.com>
.\"
.Dd March 21, 2020
.Dt Y86_MACHINE 7
.Os
.Sh NAME
.Nm Y86 Machine Reference
.Sh DESCRIPTION
The Y86
.Pq Dq YES
machine is a theoretical processor based on a subset of x86 concepts.
Being a teaching processor, Y86 embodies elements of both RISC and CISC architectures.
The Y86 machine is a little endian machine consisting of three main components:
memory, program registers, and processor registers.
.Ss Memory
The Y86 memory consists of 1024, 32-bit words.
.Pp
The relationship between word addresses and byte addresses can be seen below.
.\" Table of memory addresses
.TS
allbox;
cw14 cz s s s
r c c c c.
Word Address Byte Address
0 3 2 1 0
1 7 6 5 4
2 11 10 9 8
3 15 14 13 12
4 19 18 17 16
\&. \&. \&. \&. \&.
1023 4095 4094 4093 4092
.TE
.Sh SEE ALSO
.Xr yess 1 ,
.Xr y86_obj_code 7
| .\"
.\" Copyright (c) 2020 Scott Bennett <scottb@fastmail.com>
.\"
.Dd March 21, 2020
.Dt Y86_MACHINE 7
.Os
.Sh NAME
.Nm Y86 Machine Reference
.Sh DESCRIPTION
The Y86
.Pq Dq YES
machine is a theoretical processor based on a subset of x86 concepts.
Being a teaching processor, Y86 embodies elements of both RISC and CISC architectures.
The Y86 machine is a little endian machine consisting of three main components:
memory, program registers, and processor registers.
.Ss Memory
The Y86 memory consists of 1024, 32-bit words.
+ .Pp
+ The relationship between word addresses and byte addresses can be seen below.
+ .\" Table of memory addresses
+ .TS
+ allbox;
+ cw14 cz s s s
+ r c c c c.
+ Word Address Byte Address
+ 0 3 2 1 0
+ 1 7 6 5 4
+ 2 11 10 9 8
+ 3 15 14 13 12
+ 4 19 18 17 16
+ \&. \&. \&. \&. \&.
+ 1023 4095 4094 4093 4092
+ .TE
.Sh SEE ALSO
.Xr yess 1 ,
.Xr y86_obj_code 7 | 16 | 0.8 | 16 | 0 |
cfdbfa2200aa500af3a80e78ad2e673c8446113e | README.md | README.md | [language-colors](http://language-colors.herokuapp.com)
===============
A simple web app and service for viewing GitHub's language colors.
You may access the simple API at
[`/index.json`](http://language-colors.herokuapp.com/index.json).
| [language-colors](http://language-colors.herokuapp.com)
===============
[](https://travis-ci.org/nicolasmccurdy/language-colors)
A simple web app and service for viewing GitHub's language colors.
You may access the simple API at
[`/index.json`](http://language-colors.herokuapp.com/index.json).
| Add a Travis badge to the readme | Add a Travis badge to the readme | Markdown | mit | nicolasmccurdy/language-colors,nicolasmccurdy/language-colors | markdown | ## Code Before:
[language-colors](http://language-colors.herokuapp.com)
===============
A simple web app and service for viewing GitHub's language colors.
You may access the simple API at
[`/index.json`](http://language-colors.herokuapp.com/index.json).
## Instruction:
Add a Travis badge to the readme
## Code After:
[language-colors](http://language-colors.herokuapp.com)
===============
[](https://travis-ci.org/nicolasmccurdy/language-colors)
A simple web app and service for viewing GitHub's language colors.
You may access the simple API at
[`/index.json`](http://language-colors.herokuapp.com/index.json).
| [language-colors](http://language-colors.herokuapp.com)
===============
+ [](https://travis-ci.org/nicolasmccurdy/language-colors)
+
A simple web app and service for viewing GitHub's language colors.
You may access the simple API at
[`/index.json`](http://language-colors.herokuapp.com/index.json). | 2 | 0.333333 | 2 | 0 |
ebaa046ea915dc8e1e1abf0b079f4cb081a7f674 | Kwc/Basic/LinkTag/FirstChildPage/Data.php | Kwc/Basic/LinkTag/FirstChildPage/Data.php | <?php
class Kwc_Basic_LinkTag_FirstChildPage_Data extends Kwf_Component_Data
{
private $_pageCache = false;
public function __get($var)
{
if ($var == 'url') {
$page = $this->_getFirstChildPage();
return $page ? $page->url : '';
} else if ($var == 'rel') {
$page = $this->_getFirstChildPage();
return $page ? $page->rel : '';
} else {
return parent::__get($var);
}
}
public function getAbsoluteUrl()
{
return $this->_getFirstChildPage()->getAbsoluteUrl();
}
public function _getFirstChildPage()
{
if ($this->_pageCache === false) {
            // first check whether there is a manually created child page
$page = $this->getChildPage(array('pageGenerator' => true));
if (!$page) {
$page = $this->getChildPage(array('inherit'=>false),
array(
'inherit'=>false,
'page'=>false
));
}
$this->_pageCache = $page;
}
return $this->_pageCache;
}
public function getLinkDataAttributes()
{
return $this->_getFirstChildPage()->getLinkDataAttributes();
}
}
| <?php
class Kwc_Basic_LinkTag_FirstChildPage_Data extends Kwf_Component_Data
{
private $_pageCache = false;
public function __get($var)
{
if ($var == 'url') {
$page = $this->_getFirstChildPage();
return $page ? $page->url : '';
} else if ($var == 'rel') {
$page = $this->_getFirstChildPage();
return $page ? $page->rel : '';
} else {
return parent::__get($var);
}
}
public function getAbsoluteUrl()
{
$page = $this->_getFirstChildPage();
return $page ? $page->getAbsoluteUrl() : '';
}
public function _getFirstChildPage()
{
if ($this->_pageCache === false) {
            // first check whether there is a manually created child page
$page = $this->getChildPage(array('pageGenerator' => true));
if (!$page) {
$page = $this->getChildPage(array('inherit'=>false),
array(
'inherit'=>false,
'page'=>false
));
}
$this->_pageCache = $page;
}
return $this->_pageCache;
}
public function getLinkDataAttributes()
{
$page = $this->_getFirstChildPage();
return $page ? $page->getLinkDataAttributes() : array();
}
}
| Fix FirstChildPage if child page doesn't exist | Fix FirstChildPage if child page doesn't exist
| PHP | bsd-2-clause | koala-framework/koala-framework,kaufmo/koala-framework,koala-framework/koala-framework,kaufmo/koala-framework,kaufmo/koala-framework | php | ## Code Before:
<?php
class Kwc_Basic_LinkTag_FirstChildPage_Data extends Kwf_Component_Data
{
private $_pageCache = false;
public function __get($var)
{
if ($var == 'url') {
$page = $this->_getFirstChildPage();
return $page ? $page->url : '';
} else if ($var == 'rel') {
$page = $this->_getFirstChildPage();
return $page ? $page->rel : '';
} else {
return parent::__get($var);
}
}
public function getAbsoluteUrl()
{
return $this->_getFirstChildPage()->getAbsoluteUrl();
}
public function _getFirstChildPage()
{
if ($this->_pageCache === false) {
// zuerst prüfen ob es eine händisch angelegte child page gibt
$page = $this->getChildPage(array('pageGenerator' => true));
if (!$page) {
$page = $this->getChildPage(array('inherit'=>false),
array(
'inherit'=>false,
'page'=>false
));
}
$this->_pageCache = $page;
}
return $this->_pageCache;
}
public function getLinkDataAttributes()
{
return $this->_getFirstChildPage()->getLinkDataAttributes();
}
}
## Instruction:
Fix FirstChildPage if child page doesn't exist
## Code After:
<?php
class Kwc_Basic_LinkTag_FirstChildPage_Data extends Kwf_Component_Data
{
private $_pageCache = false;
public function __get($var)
{
if ($var == 'url') {
$page = $this->_getFirstChildPage();
return $page ? $page->url : '';
} else if ($var == 'rel') {
$page = $this->_getFirstChildPage();
return $page ? $page->rel : '';
} else {
return parent::__get($var);
}
}
public function getAbsoluteUrl()
{
$page = $this->_getFirstChildPage();
return $page ? $page->getAbsoluteUrl() : '';
}
public function _getFirstChildPage()
{
if ($this->_pageCache === false) {
// zuerst prüfen ob es eine händisch angelegte child page gibt
$page = $this->getChildPage(array('pageGenerator' => true));
if (!$page) {
$page = $this->getChildPage(array('inherit'=>false),
array(
'inherit'=>false,
'page'=>false
));
}
$this->_pageCache = $page;
}
return $this->_pageCache;
}
public function getLinkDataAttributes()
{
$page = $this->_getFirstChildPage();
return $page ? $page->getLinkDataAttributes() : array();
}
}
| <?php
class Kwc_Basic_LinkTag_FirstChildPage_Data extends Kwf_Component_Data
{
private $_pageCache = false;
public function __get($var)
{
if ($var == 'url') {
$page = $this->_getFirstChildPage();
return $page ? $page->url : '';
} else if ($var == 'rel') {
$page = $this->_getFirstChildPage();
return $page ? $page->rel : '';
} else {
return parent::__get($var);
}
}
public function getAbsoluteUrl()
{
- return $this->_getFirstChildPage()->getAbsoluteUrl();
+ $page = $this->_getFirstChildPage();
+ return $page ? $page->getAbsoluteUrl() : '';
}
public function _getFirstChildPage()
{
if ($this->_pageCache === false) {
// zuerst prüfen ob es eine händisch angelegte child page gibt
$page = $this->getChildPage(array('pageGenerator' => true));
if (!$page) {
$page = $this->getChildPage(array('inherit'=>false),
array(
'inherit'=>false,
'page'=>false
));
}
$this->_pageCache = $page;
}
return $this->_pageCache;
}
public function getLinkDataAttributes()
{
- return $this->_getFirstChildPage()->getLinkDataAttributes();
+ $page = $this->_getFirstChildPage();
+ return $page ? $page->getLinkDataAttributes() : array();
}
} | 6 | 0.133333 | 4 | 2 |
d147da405c4e9726b53195e63d9c7621c3c88edf | zero-storage-core/src/test/java/com/unidev/zerostorage/TestIndexStorage.java | zero-storage-core/src/test/java/com/unidev/zerostorage/TestIndexStorage.java | package com.unidev.zerostorage;
import com.unidev.zerostorage.index.IndexStorage;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TemporaryFolder;
import java.io.File;
/**
* Storage index tests
*/
public class TestIndexStorage {
@Rule
public TemporaryFolder folder= new TemporaryFolder();
@Test
public void testBlankIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage blankIndex = new Storage();
indexStorage.index(blankIndex);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
@Test
public void testIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage index = new Storage();
index.details().put("Potato", "Tomato");
index.details().put("Tomato", null);
indexStorage.index(index);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
}
| package com.unidev.zerostorage;
import com.unidev.zerostorage.index.IndexStorage;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TemporaryFolder;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
/**
* Storage index tests
*/
public class TestIndexStorage {
@Rule
public TemporaryFolder folder= new TemporaryFolder();
@Test
public void testBlankIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage blankIndex = new Storage();
indexStorage.index(blankIndex);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
@Test
public void testIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage index = new Storage();
index.details().put("Potato", "Tomato");
index.details().put("Tomato", null);
indexStorage.index(index);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
@Test
public void testLoadIndexStorage() throws IOException {
InputStream inputStream = TestStorage.class.getResourceAsStream("/blankIndex.json");
File root = folder.getRoot();
File index = new File(root, IndexStorage.INDEX_FILE);
Files.copy(inputStream, index.toPath());
IndexStorage indexStorage = new IndexStorage();
indexStorage.storageRoot(root);
indexStorage.load();
}
}
| Test loading of blank index file | Test loading of blank index file
| Java | apache-2.0 | universal-development/zulu-storage | java | ## Code Before:
package com.unidev.zerostorage;
import com.unidev.zerostorage.index.IndexStorage;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TemporaryFolder;
import java.io.File;
/**
* Storage index tests
*/
public class TestIndexStorage {
@Rule
public TemporaryFolder folder= new TemporaryFolder();
@Test
public void testBlankIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage blankIndex = new Storage();
indexStorage.index(blankIndex);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
@Test
public void testIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage index = new Storage();
index.details().put("Potato", "Tomato");
index.details().put("Tomato", null);
indexStorage.index(index);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
}
## Instruction:
Test loading of blank index file
## Code After:
package com.unidev.zerostorage;
import com.unidev.zerostorage.index.IndexStorage;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TemporaryFolder;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
/**
* Storage index tests
*/
public class TestIndexStorage {
@Rule
public TemporaryFolder folder= new TemporaryFolder();
@Test
public void testBlankIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage blankIndex = new Storage();
indexStorage.index(blankIndex);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
@Test
public void testIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage index = new Storage();
index.details().put("Potato", "Tomato");
index.details().put("Tomato", null);
indexStorage.index(index);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
@Test
public void testLoadIndexStorage() throws IOException {
InputStream inputStream = TestStorage.class.getResourceAsStream("/blankIndex.json");
File root = folder.getRoot();
File index = new File(root, IndexStorage.INDEX_FILE);
Files.copy(inputStream, index.toPath());
IndexStorage indexStorage = new IndexStorage();
indexStorage.storageRoot(root);
indexStorage.load();
}
}
| package com.unidev.zerostorage;
import com.unidev.zerostorage.index.IndexStorage;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TemporaryFolder;
import java.io.File;
+ import java.io.IOException;
+ import java.io.InputStream;
+ import java.nio.file.Files;
/**
* Storage index tests
*/
public class TestIndexStorage {
@Rule
public TemporaryFolder folder= new TemporaryFolder();
@Test
public void testBlankIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage blankIndex = new Storage();
indexStorage.index(blankIndex);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
@Test
public void testIndexStorageSave() {
IndexStorage indexStorage = new IndexStorage();
Storage index = new Storage();
index.details().put("Potato", "Tomato");
index.details().put("Tomato", null);
indexStorage.index(index);
File root = folder.getRoot();
indexStorage.storageRoot(root);
indexStorage.save();
}
+ @Test
+ public void testLoadIndexStorage() throws IOException {
+ InputStream inputStream = TestStorage.class.getResourceAsStream("/blankIndex.json");
+ File root = folder.getRoot();
+
+ File index = new File(root, IndexStorage.INDEX_FILE);
+ Files.copy(inputStream, index.toPath());
+
+ IndexStorage indexStorage = new IndexStorage();
+ indexStorage.storageRoot(root);
+ indexStorage.load();
+ }
+
} | 16 | 0.333333 | 16 | 0 |
1cc1bff2ca38d0d8fca50c562c88424e70ff0cc8 | app/commands/put_publish_intent.rb | app/commands/put_publish_intent.rb | module Commands
class PutPublishIntent < BaseCommand
def call
PathReservation.reserve_base_path!(base_path, payload[:publishing_app])
if downstream
payload = Presenters::DownstreamPresenter::V1.present(publish_intent, transmitted_at: false)
Adapters::ContentStore.put_publish_intent(base_path, payload)
end
Success.new(payload)
end
private
def publish_intent
payload.except(:base_path).deep_symbolize_keys
end
def base_path
payload.fetch(:base_path)
end
end
end
| module Commands
class PutPublishIntent < BaseCommand
def call
PathReservation.reserve_base_path!(base_path, payload[:publishing_app])
if downstream
payload = Presenters::DownstreamPresenter::V1.present(publish_intent, payload_version: false)
Adapters::ContentStore.put_publish_intent(base_path, payload)
end
Success.new(payload)
end
private
def publish_intent
payload.except(:base_path).deep_symbolize_keys
end
def base_path
payload.fetch(:base_path)
end
end
end
| Update `PutPublishIntent` command to omit `payload_version` | Update `PutPublishIntent` command to omit `payload_version`
| Ruby | mit | alphagov/publishing-api,alphagov/publishing-api | ruby | ## Code Before:
module Commands
class PutPublishIntent < BaseCommand
def call
PathReservation.reserve_base_path!(base_path, payload[:publishing_app])
if downstream
payload = Presenters::DownstreamPresenter::V1.present(publish_intent, transmitted_at: false)
Adapters::ContentStore.put_publish_intent(base_path, payload)
end
Success.new(payload)
end
private
def publish_intent
payload.except(:base_path).deep_symbolize_keys
end
def base_path
payload.fetch(:base_path)
end
end
end
## Instruction:
Update `PutPublishIntent` command to omit `payload_version`
## Code After:
module Commands
class PutPublishIntent < BaseCommand
def call
PathReservation.reserve_base_path!(base_path, payload[:publishing_app])
if downstream
payload = Presenters::DownstreamPresenter::V1.present(publish_intent, payload_version: false)
Adapters::ContentStore.put_publish_intent(base_path, payload)
end
Success.new(payload)
end
private
def publish_intent
payload.except(:base_path).deep_symbolize_keys
end
def base_path
payload.fetch(:base_path)
end
end
end
| module Commands
class PutPublishIntent < BaseCommand
def call
PathReservation.reserve_base_path!(base_path, payload[:publishing_app])
if downstream
- payload = Presenters::DownstreamPresenter::V1.present(publish_intent, transmitted_at: false)
? ^^ ^^^^^^^ ^^
+ payload = Presenters::DownstreamPresenter::V1.present(publish_intent, payload_version: false)
? ^ ^^^^ ^^^^^^^
Adapters::ContentStore.put_publish_intent(base_path, payload)
end
Success.new(payload)
end
private
def publish_intent
payload.except(:base_path).deep_symbolize_keys
end
def base_path
payload.fetch(:base_path)
end
end
end | 2 | 0.083333 | 1 | 1 |
349b9ab3e72f188c50b5f84f3ce5ee2fddcfbf80 | roles/ceph-common/tasks/generate_ceph_conf.yml | roles/ceph-common/tasks/generate_ceph_conf.yml | ---
- name: create ceph conf directory
file:
path: /etc/ceph
state: directory
owner: "ceph"
group: "ceph"
mode: "0755"
- name: "generate ceph configuration file: {{ cluster }}.conf"
action: config_template
args:
src: ceph.conf.j2
dest: /etc/ceph/{{ cluster }}.conf
owner: "ceph"
group: "ceph"
mode: "0644"
config_overrides: "{{ ceph_conf_overrides }}"
config_type: ini
notify:
- restart ceph mons
- restart ceph osds
- restart ceph mdss
- restart ceph rgws
- restart ceph nfss
| ---
- name: create ceph conf directory and assemble directory
file:
path: "{{ item }}"
state: directory
owner: "ceph"
group: "ceph"
mode: "0755"
with_items:
- /etc/ceph/
- /etc/ceph/ceph.d/
- name: "generate ceph configuration file: {{ cluster }}.conf"
action: config_template
args:
src: ceph.conf.j2
dest: /etc/ceph/ceph.d/{{ cluster }}.conf
owner: "ceph"
group: "ceph"
mode: "0644"
config_overrides: "{{ ceph_conf_overrides }}"
config_type: ini
- name: assemble {{ cluster }}.conf and fragments
assemble:
src: /etc/ceph/ceph.d/
dest: /etc/ceph/{{ cluster }}.conf
owner: "ceph"
group: "ceph"
mode: "0644"
notify:
- restart ceph mons
- restart ceph osds
- restart ceph mdss
- restart ceph rgws
- restart ceph nfss
| Make ceph-common aware off osd config fragments | Make ceph-common aware off osd config fragments
This removes the implicit order requirement when using OSD fragments.
When you use OSD fragments and ceph-osd role is not the last one,
the fragments get removed from ceph.conf by ceph-common.
It is not nice to have this code at two locations, but this is
necessary to prevent problems, when ceph-osd is the last role as
ceph-common gets executed before ceph-osd.
This could be prevented when ceph-common would be explicitly called
at the end of the playbook.
Signed-off-by: Christian Zunker <55c21f91ef088e50e9390ce29a26f69b20baf7b0@codecentric.de>
| YAML | apache-2.0 | jtaleric/ceph-ansible,font/ceph-ansible,ceph/ceph-ansible,WingkaiHo/ceph-ansible,WingkaiHo/ceph-ansible,fgal/ceph-ansible,albertomurillo/ceph-ansible,bengland2/ceph-ansible,albertomurillo/ceph-ansible,travmi/ceph-ansible,bengland2/ceph-ansible,jtaleric/ceph-ansible,font/ceph-ansible,WingkaiHo/ceph-ansible,ceph/ceph-ansible,albertomurillo/ceph-ansible,travmi/ceph-ansible,fgal/ceph-ansible | yaml | ## Code Before:
---
- name: create ceph conf directory
file:
path: /etc/ceph
state: directory
owner: "ceph"
group: "ceph"
mode: "0755"
- name: "generate ceph configuration file: {{ cluster }}.conf"
action: config_template
args:
src: ceph.conf.j2
dest: /etc/ceph/{{ cluster }}.conf
owner: "ceph"
group: "ceph"
mode: "0644"
config_overrides: "{{ ceph_conf_overrides }}"
config_type: ini
notify:
- restart ceph mons
- restart ceph osds
- restart ceph mdss
- restart ceph rgws
- restart ceph nfss
## Instruction:
Make ceph-common aware off osd config fragments
This removes the implicit order requirement when using OSD fragments.
When you use OSD fragments and ceph-osd role is not the last one,
the fragments get removed from ceph.conf by ceph-common.
It is not nice to have this code at two locations, but this is
necessary to prevent problems, when ceph-osd is the last role as
ceph-common gets executed before ceph-osd.
This could be prevented when ceph-common would be explicitly called
at the end of the playbook.
Signed-off-by: Christian Zunker <55c21f91ef088e50e9390ce29a26f69b20baf7b0@codecentric.de>
## Code After:
---
- name: create ceph conf directory and assemble directory
file:
path: "{{ item }}"
state: directory
owner: "ceph"
group: "ceph"
mode: "0755"
with_items:
- /etc/ceph/
- /etc/ceph/ceph.d/
- name: "generate ceph configuration file: {{ cluster }}.conf"
action: config_template
args:
src: ceph.conf.j2
dest: /etc/ceph/ceph.d/{{ cluster }}.conf
owner: "ceph"
group: "ceph"
mode: "0644"
config_overrides: "{{ ceph_conf_overrides }}"
config_type: ini
- name: assemble {{ cluster }}.conf and fragments
assemble:
src: /etc/ceph/ceph.d/
dest: /etc/ceph/{{ cluster }}.conf
owner: "ceph"
group: "ceph"
mode: "0644"
notify:
- restart ceph mons
- restart ceph osds
- restart ceph mdss
- restart ceph rgws
- restart ceph nfss
| ---
- - name: create ceph conf directory
+ - name: create ceph conf directory and assemble directory
file:
- path: /etc/ceph
+ path: "{{ item }}"
state: directory
owner: "ceph"
group: "ceph"
mode: "0755"
+ with_items:
+ - /etc/ceph/
+ - /etc/ceph/ceph.d/
- name: "generate ceph configuration file: {{ cluster }}.conf"
action: config_template
args:
src: ceph.conf.j2
- dest: /etc/ceph/{{ cluster }}.conf
+ dest: /etc/ceph/ceph.d/{{ cluster }}.conf
? +++++++
owner: "ceph"
group: "ceph"
mode: "0644"
config_overrides: "{{ ceph_conf_overrides }}"
config_type: ini
+
+ - name: assemble {{ cluster }}.conf and fragments
+ assemble:
+ src: /etc/ceph/ceph.d/
+ dest: /etc/ceph/{{ cluster }}.conf
+ owner: "ceph"
+ group: "ceph"
+ mode: "0644"
notify:
- restart ceph mons
- restart ceph osds
- restart ceph mdss
- restart ceph rgws
- restart ceph nfss | 17 | 0.68 | 14 | 3 |
5b25b2ce8237a64f843daa8088a5b2eba0a30f1f | custom/gnome-terminal.zsh | custom/gnome-terminal.zsh | bindkey '\e[A' up-line-or-search
bindkey '\e[B' down-line-or-search
# Delete
bindkey '[3~' delete-char
# Home/End
bindkey 'OH' beginning-of-line
bindkey 'OF' end-of-line
# Ctrl Left/Right (^[[1;5D & ^[[1;5C)
bindkey '[1;5D' backward-word
bindkey '[1;5C' forward-word
| bindkey '\e[A' up-line-or-search
bindkey '\e[B' down-line-or-search
# Delete
bindkey '[3~' delete-char
# Home/End
bindkey 'OH' beginning-of-line
bindkey 'OF' end-of-line
bindkey '[1~' beginning-of-line
bindkey '[4~' end-of-line
# Ctrl Left/Right (^[[1;5D & ^[[1;5C)
bindkey '[1;5D' backward-word
bindkey '[1;5C' forward-word
bindkey 'OD' backward-word
bindkey 'OC' forward-word
| Add more bindings for home/end/ctrl-left/right | Add more bindings for home/end/ctrl-left/right
| Shell | mit | jreese/oh-my-zsh,jreese/oh-my-zsh,jreese/oh-my-zsh | shell | ## Code Before:
bindkey '\e[A' up-line-or-search
bindkey '\e[B' down-line-or-search
# Delete
bindkey '[3~' delete-char
# Home/End
bindkey 'OH' beginning-of-line
bindkey 'OF' end-of-line
# Ctrl Left/Right (^[[1;5D & ^[[1;5C)
bindkey '[1;5D' backward-word
bindkey '[1;5C' forward-word
## Instruction:
Add more bindings for home/end/ctrl-left/right
## Code After:
bindkey '\e[A' up-line-or-search
bindkey '\e[B' down-line-or-search
# Delete
bindkey '[3~' delete-char
# Home/End
bindkey 'OH' beginning-of-line
bindkey 'OF' end-of-line
bindkey '[1~' beginning-of-line
bindkey '[4~' end-of-line
# Ctrl Left/Right (^[[1;5D & ^[[1;5C)
bindkey '[1;5D' backward-word
bindkey '[1;5C' forward-word
bindkey 'OD' backward-word
bindkey 'OC' forward-word
| bindkey '\e[A' up-line-or-search
bindkey '\e[B' down-line-or-search
# Delete
bindkey '[3~' delete-char
# Home/End
bindkey 'OH' beginning-of-line
bindkey 'OF' end-of-line
+ bindkey '[1~' beginning-of-line
+ bindkey '[4~' end-of-line
# Ctrl Left/Right (^[[1;5D & ^[[1;5C)
bindkey '[1;5D' backward-word
bindkey '[1;5C' forward-word
+ bindkey 'OD' backward-word
+ bindkey 'OC' forward-word
| 4 | 0.266667 | 4 | 0 |
f2832134f676e6a291987c47f398707f1954bf53 | sample-config/server-multi-port.json | sample-config/server-multi-port.json | {
"port_password": {
"8387": "foobar",
"8388": "barfoo"
},
"timeout": 60,
"cache_enctable": true
}
| {
"port_password": {
"8387": "foobar",
"8388": "barfoo"
},
"timeout": 600,
}
| Delete removed option in multi port config sample. | Delete removed option in multi port config sample.
 | JSON | apache-2.0 | joejang/shadowsocks-go,silentred/shadowsocks-go,dearplain/fast-shadowsocks,sjzhao/shadowsocks-go,heihei1252/shadowsocks-go,cappiewu/shadowsocks-go,aOJzQTORG/shadowsocks-go,shines77/shadowsocks-go,sunclx/shadowsocks-go,mervin0502/shadowsocks-go,boolstudio/shadowsocks-go,xvweirong/shadowsocks-go,bigeagle/shadowsocks-go,beneo/shadowsocks-go,bjjyd/shadowsocks-go,ijyd/shadowss,sjzhao/shadowsocks-go,heihei1252/shadowsocks-go,cappiewu/shadowsocks-go,aOJzQTORG/shadowsocks-go,shines77/shadowsocks-go,sunclx/shadowsocks-go,mervin0502/shadowsocks-go,boolstudio/shadowsocks-go,xvweirong/shadowsocks-go,bigeagle/shadowsocks-go,beneo/shadowsocks-go,bjjyd/shadowsocks-go,ijyd/shadowss,shuihuo/shadowsocks-go,ljtnine/shadowsocks-go,sunclx/shadowsocks-go,bowwowxx/shadowsocks-go,pomelolee/shadowsocks-go,genzj/shadowsocks-go,will7723/shadowsocks-go,orvice/shadowsocks-go,memoz/shadowsocks-go,JeremyHe-cn/shadowsocks-go,dadi-sisi/shadowsocks-go-autocheck,myself659/shadowsocks-go,ijyd/shadowss,shanghaikid/shadowsocks-go,shanghaikid/shadowsocks-go,yxonic/shadowsocks-go,draco1023/shadowsocks-go,TohnoSakuya/shadowsocks-go,willjunspecial/shadowsocks-go,sllt/shadowsocks-go,HerringtonDarkholme/shadowsocks-go,qibinghua/shadowsocks-go,wfxiang08/shadowsocks-go,luinnx/shadowsocks-go,sdvdxl/shadowsocks-go,blacktear23/shadowsocks,zhuqling/shadowsocks-go,geligaoli/shadowsocks-go,flyer103/shadowsocks-go,scutdk/shadowsocks-go,goodjob1114/shadowsocks-go,jinyuli/shadowsocks-go,hugozhu/shadowsocks-go,jcteng/shadowsocks-go,Archs/shadowsocks-go,mengzhuo/shadowsocks-go,kk71/shadowsocks-go,sdvdxl/shadowsocks-go,dearplain/fast-shadowsocks,126ium/shadowsocks-go,silentred/shadowsocks-go,beneo/shadowsocks-go,menghan/shadowsocks-go,TohnoSakuya/shadowsocks-go,janstk/shadowsocks-go,yalay/shadowsocks-go,mikeqian/shadowsocks-go,linc01n/shadowsocks-go,laplaceliu/shadowsocks-go,stormltf/shadowsocks-go,frtmelody/shadowsocks-go,linc01n/shadowsocks-go,yimouleng/shadowsocks-go,willjunspecial/shadowsocks-go,billypon/shadowsocks-go,zhengnanlee/shadowsocks-go,yunkai/shadowsocks-go,ljtnine/shadowsocks-go,DJMIN/shadowsocks-go,yunkai/shadowsocks-go,aOJzQTORG/shadowsocks-go,126ium/shadowsocks-go,bowwowxx/shadowsocks-go,JeremyHe-cn/shadowsocks-go,scutdk/shadowsocks-go,froggatt/shadowsocks-go,jabez1314/shadowsocks-go,boolstudio/shadowsocks-go,lijinpei/shadowsocks-go,lijinpei/shadowsocks-go,kk71/shadowsocks-go,dmiedema/shadowsocks-go,wufabeishang/shadowsocks-go,geligaoli/shadowsocks-go,bingyuxq/shadowsocks-go,arthurkiller/shadowsocks-go,sutun2008/shadowsocks-go,dadi-sisi/shadowsocks-go-autocheck,wfxiang08/shadowsocks-go,pomelolee/shadowsocks-go,sutun2008/shadowsocks-go,FlexibleBroadband/socks5-go,mervin0502/shadowsocks-go,abv76/shadowsocks-go,yxonic/shadowsocks-go,lixin9311/shadowsocks-go,hcyGo/shadowsocks-go,sllt/shadowsocks-go,dmiedema/shadowsocks-go,zhuqling/shadowsocks-go,dadi-sisi/shadowsocks-go-autocheck,cgcgbcbc/shadowsocks-go,chenzhenjia/shadowsocks-go,qq844863921/shadowsocks-go,jcteng/shadowsocks-go,menghan/shadowsocks-go,cappiewu/shadowsocks-go,xzturn/shadowsocks-go,hcyGo/shadowsocks-go,ArkBriar/shadowsocks-go,bigeagle/shadowsocks-go,memoz/shadowsocks-go,Archs/shadowsocks-go,wooberlong/shadowsocks-go,xvweirong/shadowsocks-go,kid551/shadowsocks-go,wooberlong/shadowsocks-go,hongqn/shadowsocks-go,orvice/shadowsocks-go,ArkBriar/shadowsocks-go,jinyuli/shadowsocks-go,heihei1252/shadowsocks-go | json | ## Code Before:
{
"port_password": {
"8387": "foobar",
"8388": "barfoo"
},
"timeout": 60,
"cache_enctable": true
}
## Instruction:
Delete removed option in multi port config sample.
## Code After:
{
"port_password": {
"8387": "foobar",
"8388": "barfoo"
},
"timeout": 600,
}
| {
"port_password": {
"8387": "foobar",
"8388": "barfoo"
},
- "timeout": 60,
+ "timeout": 600,
? +
- "cache_enctable": true
} | 3 | 0.375 | 1 | 2 |
34ab7f1090d878bf8328f25f0fa4be4e575e7f43 | numba/sigutils.py | numba/sigutils.py | from __future__ import print_function, division, absolute_import
from numba import types, typing
def is_signature(sig):
"""
Return whether *sig* is a potentially valid signature
specification (for user-facing APIs).
"""
return isinstance(sig, (str, tuple, typing.Signature))
def _parse_signature_string(signature_str):
# Just eval signature_str using the types submodules as globals
return eval(signature_str, {}, types.__dict__)
def normalize_signature(sig):
"""
From *sig* (a signature specification), return a ``(return_type, args)``
tuple, where ``args`` itself is a tuple of types, and ``return_type``
can be None if not specified.
"""
if isinstance(sig, str):
parsed = _parse_signature_string(sig)
else:
parsed = sig
if isinstance(parsed, tuple):
args, return_type = parsed, None
elif isinstance(parsed, typing.Signature):
args, return_type = parsed.args, parsed.return_type
else:
raise TypeError("invalid signature: %r (type: %r) evaluates to %r "
"instead of tuple or Signature" % (
sig, sig.__class__.__name__,
parsed.__class__.__name__
))
def check_type(ty):
if not isinstance(ty, types.Type):
raise TypeError("invalid type in signature: expected a type "
"instance, got %r" % (ty,))
if return_type is not None:
check_type(return_type)
for ty in args:
check_type(ty)
return args, return_type
| from __future__ import print_function, division, absolute_import
from numba import types, typing
def is_signature(sig):
"""
Return whether *sig* is a potentially valid signature
specification (for user-facing APIs).
"""
return isinstance(sig, (str, tuple, typing.Signature))
def _parse_signature_string(signature_str):
# Just eval signature_str using the types submodules as globals
return eval(signature_str, {}, types.__dict__)
def normalize_signature(sig):
"""
From *sig* (a signature specification), return a ``(args, return_type)``
tuple, where ``args`` itself is a tuple of types, and ``return_type``
can be None if not specified.
"""
if isinstance(sig, str):
parsed = _parse_signature_string(sig)
else:
parsed = sig
if isinstance(parsed, tuple):
args, return_type = parsed, None
elif isinstance(parsed, typing.Signature):
args, return_type = parsed.args, parsed.return_type
else:
raise TypeError("invalid signature: %r (type: %r) evaluates to %r "
"instead of tuple or Signature" % (
sig, sig.__class__.__name__,
parsed.__class__.__name__
))
def check_type(ty):
if not isinstance(ty, types.Type):
raise TypeError("invalid type in signature: expected a type "
"instance, got %r" % (ty,))
if return_type is not None:
check_type(return_type)
for ty in args:
check_type(ty)
return args, return_type
| Update return value order in normalize_signature docstring | Update return value order in normalize_signature docstring [skip ci]
| Python | bsd-2-clause | IntelLabs/numba,gmarkall/numba,jriehl/numba,jriehl/numba,seibert/numba,numba/numba,IntelLabs/numba,stonebig/numba,stonebig/numba,stonebig/numba,stonebig/numba,sklam/numba,stuartarchibald/numba,seibert/numba,stuartarchibald/numba,jriehl/numba,seibert/numba,cpcloud/numba,cpcloud/numba,cpcloud/numba,numba/numba,IntelLabs/numba,stuartarchibald/numba,seibert/numba,gmarkall/numba,numba/numba,numba/numba,IntelLabs/numba,stuartarchibald/numba,cpcloud/numba,gmarkall/numba,seibert/numba,cpcloud/numba,numba/numba,stuartarchibald/numba,jriehl/numba,sklam/numba,gmarkall/numba,IntelLabs/numba,sklam/numba,stonebig/numba,jriehl/numba,sklam/numba,gmarkall/numba,sklam/numba | python | ## Code Before:
from __future__ import print_function, division, absolute_import
from numba import types, typing
def is_signature(sig):
"""
Return whether *sig* is a potentially valid signature
specification (for user-facing APIs).
"""
return isinstance(sig, (str, tuple, typing.Signature))
def _parse_signature_string(signature_str):
# Just eval signature_str using the types submodules as globals
return eval(signature_str, {}, types.__dict__)
def normalize_signature(sig):
"""
From *sig* (a signature specification), return a ``(return_type, args)``
tuple, where ``args`` itself is a tuple of types, and ``return_type``
can be None if not specified.
"""
if isinstance(sig, str):
parsed = _parse_signature_string(sig)
else:
parsed = sig
if isinstance(parsed, tuple):
args, return_type = parsed, None
elif isinstance(parsed, typing.Signature):
args, return_type = parsed.args, parsed.return_type
else:
raise TypeError("invalid signature: %r (type: %r) evaluates to %r "
"instead of tuple or Signature" % (
sig, sig.__class__.__name__,
parsed.__class__.__name__
))
def check_type(ty):
if not isinstance(ty, types.Type):
raise TypeError("invalid type in signature: expected a type "
"instance, got %r" % (ty,))
if return_type is not None:
check_type(return_type)
for ty in args:
check_type(ty)
return args, return_type
## Instruction:
Update return value order in normalize_signature docstring [skip ci]
## Code After:
from __future__ import print_function, division, absolute_import
from numba import types, typing
def is_signature(sig):
"""
Return whether *sig* is a potentially valid signature
specification (for user-facing APIs).
"""
return isinstance(sig, (str, tuple, typing.Signature))
def _parse_signature_string(signature_str):
# Just eval signature_str using the types submodules as globals
return eval(signature_str, {}, types.__dict__)
def normalize_signature(sig):
"""
From *sig* (a signature specification), return a ``(args, return_type)``
tuple, where ``args`` itself is a tuple of types, and ``return_type``
can be None if not specified.
"""
if isinstance(sig, str):
parsed = _parse_signature_string(sig)
else:
parsed = sig
if isinstance(parsed, tuple):
args, return_type = parsed, None
elif isinstance(parsed, typing.Signature):
args, return_type = parsed.args, parsed.return_type
else:
raise TypeError("invalid signature: %r (type: %r) evaluates to %r "
"instead of tuple or Signature" % (
sig, sig.__class__.__name__,
parsed.__class__.__name__
))
def check_type(ty):
if not isinstance(ty, types.Type):
raise TypeError("invalid type in signature: expected a type "
"instance, got %r" % (ty,))
if return_type is not None:
check_type(return_type)
for ty in args:
check_type(ty)
return args, return_type
| from __future__ import print_function, division, absolute_import
from numba import types, typing
def is_signature(sig):
"""
Return whether *sig* is a potentially valid signature
specification (for user-facing APIs).
"""
return isinstance(sig, (str, tuple, typing.Signature))
def _parse_signature_string(signature_str):
# Just eval signature_str using the types submodules as globals
return eval(signature_str, {}, types.__dict__)
def normalize_signature(sig):
"""
- From *sig* (a signature specification), return a ``(return_type, args)``
? ------
+ From *sig* (a signature specification), return a ``(args, return_type)``
? ++++++
tuple, where ``args`` itself is a tuple of types, and ``return_type``
can be None if not specified.
"""
if isinstance(sig, str):
parsed = _parse_signature_string(sig)
else:
parsed = sig
if isinstance(parsed, tuple):
args, return_type = parsed, None
elif isinstance(parsed, typing.Signature):
args, return_type = parsed.args, parsed.return_type
else:
raise TypeError("invalid signature: %r (type: %r) evaluates to %r "
"instead of tuple or Signature" % (
sig, sig.__class__.__name__,
parsed.__class__.__name__
))
def check_type(ty):
if not isinstance(ty, types.Type):
raise TypeError("invalid type in signature: expected a type "
"instance, got %r" % (ty,))
if return_type is not None:
check_type(return_type)
for ty in args:
check_type(ty)
return args, return_type | 2 | 0.04 | 1 | 1 |
2c736b7885fefbb8622e6b8572ff09a4506b8dd0 | Sources/DoubleLayout.swift | Sources/DoubleLayout.swift | //
// DoubleLayout.swift
// PhotoCollectionView
//
// Created by luan on 9/2/17.
//
//
import UIKit
class DoubleLayout: PhotoLayoutProtocol {
var contentSize: CGSize = CGSize.zero
var maxPhoto: Int {
return 2
}
func frame(at index: Int, in photoCollectionView: PhotoCollectionView) -> CGRect {
guard index >= 0 && index < maxPhoto else {
return CGRect.zero
}
contentSize = photoCollectionView.bounds.size
guard let image = photoCollectionView.image(at: index) else {
return CGRect(origin: .zero, size: contentSize)
}
let width = (photoCollectionView.bounds.width - spacing * CGFloat(maxPhoto + 1)) / CGFloat(maxPhoto)
var height = width * image.size.height / image.size.width
height = min(height, width * 1.25)
contentSize = CGSize(width: width, height: height)
return CGRect(origin: .zero, size: contentSize)
}
func contentSize(of photoCollectionView: PhotoCollectionView) -> CGSize {
return contentSize
}
}
| //
// DoubleLayout.swift
// PhotoCollectionView
//
// Created by luan on 9/2/17.
//
//
import UIKit
class DoubleLayout: PhotoLayoutProtocol {
var contentSize: CGSize = CGSize.zero
var itemSize: CGSize = CGSize.zero
var maxPhoto: Int {
return 2
}
func frame(at index: Int, in photoCollectionView: PhotoCollectionView) -> CGRect {
guard index >= 0 && index < maxPhoto else {
return CGRect.zero
}
contentSize = photoCollectionView.bounds.size
guard let image = photoCollectionView.image(at: index) else {
return CGRect(origin: .zero, size: contentSize)
}
let width = (photoCollectionView.bounds.width - spacing * CGFloat(maxPhoto + 1)) / CGFloat(maxPhoto)
var height = width * image.size.height / image.size.width
height = min(height, width * 1.25)
itemSize = CGSize(width: width, height: height)
contentSize = CGSize(width: photoCollectionView.bounds.width, height: height)
return CGRect(origin: .zero, size: contentSize)
}
func contentSize(of photoCollectionView: PhotoCollectionView) -> CGSize {
return contentSize
}
}
| Add item size and fix content size for double layout | Add item size and fix content size for double layout
| Swift | mit | noblakit01/PhotoCollectionView,noblakit01/PhotoCollectionView | swift | ## Code Before:
//
// DoubleLayout.swift
// PhotoCollectionView
//
// Created by luan on 9/2/17.
//
//
import UIKit
class DoubleLayout: PhotoLayoutProtocol {
var contentSize: CGSize = CGSize.zero
var maxPhoto: Int {
return 2
}
func frame(at index: Int, in photoCollectionView: PhotoCollectionView) -> CGRect {
guard index >= 0 && index < maxPhoto else {
return CGRect.zero
}
contentSize = photoCollectionView.bounds.size
guard let image = photoCollectionView.image(at: index) else {
return CGRect(origin: .zero, size: contentSize)
}
let width = (photoCollectionView.bounds.width - spacing * CGFloat(maxPhoto + 1)) / CGFloat(maxPhoto)
var height = width * image.size.height / image.size.width
height = min(height, width * 1.25)
contentSize = CGSize(width: width, height: height)
return CGRect(origin: .zero, size: contentSize)
}
func contentSize(of photoCollectionView: PhotoCollectionView) -> CGSize {
return contentSize
}
}
## Instruction:
Add item size and fix content size for double layout
## Code After:
//
// DoubleLayout.swift
// PhotoCollectionView
//
// Created by luan on 9/2/17.
//
//
import UIKit
class DoubleLayout: PhotoLayoutProtocol {
var contentSize: CGSize = CGSize.zero
var itemSize: CGSize = CGSize.zero
var maxPhoto: Int {
return 2
}
func frame(at index: Int, in photoCollectionView: PhotoCollectionView) -> CGRect {
guard index >= 0 && index < maxPhoto else {
return CGRect.zero
}
contentSize = photoCollectionView.bounds.size
guard let image = photoCollectionView.image(at: index) else {
return CGRect(origin: .zero, size: contentSize)
}
let width = (photoCollectionView.bounds.width - spacing * CGFloat(maxPhoto + 1)) / CGFloat(maxPhoto)
var height = width * image.size.height / image.size.width
height = min(height, width * 1.25)
itemSize = CGSize(width: width, height: height)
contentSize = CGSize(width: photoCollectionView.bounds.width, height: height)
return CGRect(origin: .zero, size: contentSize)
}
func contentSize(of photoCollectionView: PhotoCollectionView) -> CGSize {
return contentSize
}
}
| //
// DoubleLayout.swift
// PhotoCollectionView
//
// Created by luan on 9/2/17.
//
//
import UIKit
class DoubleLayout: PhotoLayoutProtocol {
var contentSize: CGSize = CGSize.zero
+ var itemSize: CGSize = CGSize.zero
var maxPhoto: Int {
return 2
}
func frame(at index: Int, in photoCollectionView: PhotoCollectionView) -> CGRect {
guard index >= 0 && index < maxPhoto else {
return CGRect.zero
}
contentSize = photoCollectionView.bounds.size
guard let image = photoCollectionView.image(at: index) else {
return CGRect(origin: .zero, size: contentSize)
}
let width = (photoCollectionView.bounds.width - spacing * CGFloat(maxPhoto + 1)) / CGFloat(maxPhoto)
var height = width * image.size.height / image.size.width
height = min(height, width * 1.25)
- contentSize = CGSize(width: width, height: height)
? ^^^ ^^
+ itemSize = CGSize(width: width, height: height)
? ^ ^
+ contentSize = CGSize(width: photoCollectionView.bounds.width, height: height)
return CGRect(origin: .zero, size: contentSize)
}
func contentSize(of photoCollectionView: PhotoCollectionView) -> CGSize {
return contentSize
}
} | 4 | 0.105263 | 3 | 1 |
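The size arithmetic in the Swift record above (item width splits the row between `maxPhoto` items plus surrounding spacing; item height follows the image's aspect ratio but is capped at 1.25 times the width) can be checked numerically. Below is a sketch of the same arithmetic in Python, with `spacing` treated as a plain parameter since its definition lies outside the snippet:

```python
def double_layout_item_size(bounds_width, image_w, image_h, spacing, max_photo=2):
    # Width: divide the row among max_photo items, leaving `spacing` gaps
    # on both edges and between items (max_photo + 1 gaps total).
    width = (bounds_width - spacing * (max_photo + 1)) / max_photo
    # Height: preserve the image aspect ratio, capped at 1.25 * width.
    height = min(width * image_h / image_w, width * 1.25)
    return width, height
```

For a 320-point row with 10-point spacing, each item is (320 - 30) / 2 = 145 points wide; a square image keeps a 145-point height, while a very tall image is clamped to 181.25 points.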
ac27e0a945895038e1ab3f97b7595cbaf259512a | henrste/utils/set_sysctl_tcp_mem.sh | henrste/utils/set_sysctl_tcp_mem.sh | sysctl -w net.ipv4.tcp_rmem='4096 87380 8388608'
sysctl -w net.ipv4.tcp_wmem='4096 65536 8388608'
# default on DARASK-X250:
# rmem: 4096 87380 6291456
# wmem: 4096 16384 4194304
|
if [ -f /testbed-is-docker ]; then
echo "Don't run this inside Docker"
exit 1
fi
if [ -n "$1" ]; then
max_window=$1
sysctl -w net.ipv4.tcp_rmem="4096 87380 $(($max_window * 1448 * 2))"
sysctl -w net.ipv4.tcp_wmem="4096 16384 $(($max_window * 1448 * 3))"
else
# no argument will reset to "default"
rmem=6291456
wmem=4194304
sysctl -w net.ipv4.tcp_rmem="4096 87380 $rmem"
sysctl -w net.ipv4.tcp_wmem="4096 16384 $wmem"
echo "Maximum window size is now approx. (using 1448 sized segments):"
echo " $((rmem / 1448 / 2)) packets for receiving side"
echo " $((wmem / 1448 / 3)) packets for sending side"
fi
| Improve script for setting tcp_mem sysctl settings | Improve script for setting tcp_mem sysctl settings
| Shell | mit | henrist/aqmt,henrist/aqmt,henrist/aqmt,henrist/aqmt | shell | ## Code Before:
sysctl -w net.ipv4.tcp_rmem='4096 87380 8388608'
sysctl -w net.ipv4.tcp_wmem='4096 65536 8388608'
# default on DARASK-X250:
# rmem: 4096 87380 6291456
# wmem: 4096 16384 4194304
## Instruction:
Improve script for setting tcp_mem sysctl settings
## Code After:
if [ -f /testbed-is-docker ]; then
echo "Don't run this inside Docker"
exit 1
fi
if [ -n "$1" ]; then
max_window=$1
sysctl -w net.ipv4.tcp_rmem="4096 87380 $(($max_window * 1448 * 2))"
sysctl -w net.ipv4.tcp_wmem="4096 16384 $(($max_window * 1448 * 3))"
else
# no argument will reset to "default"
rmem=6291456
wmem=4194304
sysctl -w net.ipv4.tcp_rmem="4096 87380 $rmem"
sysctl -w net.ipv4.tcp_wmem="4096 16384 $wmem"
echo "Maximum window size is now approx. (using 1448 sized segments):"
echo " $((rmem / 1448 / 2)) packets for receiving side"
echo " $((wmem / 1448 / 3)) packets for sending side"
fi
| - sysctl -w net.ipv4.tcp_rmem='4096 87380 8388608'
- sysctl -w net.ipv4.tcp_wmem='4096 65536 8388608'
- # default on DARASK-X250:
- # rmem: 4096 87380 6291456
- # wmem: 4096 16384 4194304
+ if [ -f /testbed-is-docker ]; then
+ echo "Don't run this inside Docker"
+ exit 1
+ fi
+
+ if [ -n "$1" ]; then
+ max_window=$1
+ sysctl -w net.ipv4.tcp_rmem="4096 87380 $(($max_window * 1448 * 2))"
+ sysctl -w net.ipv4.tcp_wmem="4096 16384 $(($max_window * 1448 * 3))"
+ else
+ # no argument will reset to "default"
+ rmem=6291456
+ wmem=4194304
+ sysctl -w net.ipv4.tcp_rmem="4096 87380 $rmem"
+ sysctl -w net.ipv4.tcp_wmem="4096 16384 $wmem"
+
+ echo "Maximum window size is now approx. (using 1448 sized segments):"
+ echo " $((rmem / 1448 / 2)) packets for receiving side"
+ echo " $((wmem / 1448 / 3)) packets for sending side"
+ fi | 25 | 4.166667 | 20 | 5 |
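The shell record above sizes the rmem/wmem ceilings from a target window expressed in 1448-byte segments, using a factor of 2 on the receive side and 3 on the send side, and its summary output inverts the same formula. A small Python sketch of that arithmetic (the 1448-byte segment size and the 2x/3x factors are taken from the script itself, not from kernel documentation):

```python
MSS = 1448  # segment size assumed by the script

def tcp_mem_ceilings(max_window_packets):
    # sysctl maxima as computed in the script: rmem scales by 2, wmem by 3.
    return max_window_packets * MSS * 2, max_window_packets * MSS * 3

def approx_window_packets(rmem_max, wmem_max):
    # Inverse used by the script's "Maximum window size is now approx." output.
    return rmem_max // (MSS * 2), wmem_max // (MSS * 3)
```

With the script's default ceilings (rmem 6291456, wmem 4194304) this reproduces the printed approximations of 2172 receive-side and 965 send-side packets.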
760ce74fca8fa9a640167eabb4af83e31e902500 | openedx/core/djangoapps/api_admin/utils.py | openedx/core/djangoapps/api_admin/utils.py | """ Course Discovery API Service. """
from django.conf import settings
from edx_rest_api_client.client import EdxRestApiClient
from openedx.core.djangoapps.theming import helpers
from openedx.core.lib.token_utils import get_id_token
from provider.oauth2.models import Client
CLIENT_NAME = 'course-discovery'
def course_discovery_api_client(user):
""" Returns a Course Discovery API client setup with authentication for the specified user. """
course_discovery_client = Client.objects.get(name=CLIENT_NAME)
secret_key = helpers.get_value('JWT_AUTH', settings.JWT_AUTH)['JWT_SECRET_KEY']
return EdxRestApiClient(
course_discovery_client.url,
jwt=get_id_token(user, CLIENT_NAME, secret_key=secret_key)
)
| """ Course Discovery API Service. """
import datetime
from django.conf import settings
from edx_rest_api_client.client import EdxRestApiClient
import jwt
from openedx.core.djangoapps.theming import helpers
from provider.oauth2.models import Client
from student.models import UserProfile, anonymous_id_for_user
CLIENT_NAME = 'course-discovery'
def get_id_token(user):
"""
Return a JWT for `user`, suitable for use with the course discovery service.
Arguments:
user (User): User for whom to generate the JWT.
Returns:
str: The JWT.
"""
try:
# Service users may not have user profiles.
full_name = UserProfile.objects.get(user=user).name
except UserProfile.DoesNotExist:
full_name = None
now = datetime.datetime.utcnow()
expires_in = getattr(settings, 'OAUTH_ID_TOKEN_EXPIRATION', 30)
payload = {
'preferred_username': user.username,
'name': full_name,
'email': user.email,
'administrator': user.is_staff,
'iss': helpers.get_value('OAUTH_OIDC_ISSUER', settings.OAUTH_OIDC_ISSUER),
'exp': now + datetime.timedelta(seconds=expires_in),
'iat': now,
'aud': helpers.get_value('JWT_AUTH', settings.JWT_AUTH)['JWT_AUDIENCE'],
'sub': anonymous_id_for_user(user, None),
}
secret_key = helpers.get_value('JWT_AUTH', settings.JWT_AUTH)['JWT_SECRET_KEY']
return jwt.encode(payload, secret_key)
def course_discovery_api_client(user):
""" Returns a Course Discovery API client setup with authentication for the specified user. """
course_discovery_client = Client.objects.get(name=CLIENT_NAME)
return EdxRestApiClient(course_discovery_client.url, jwt=get_id_token(user))
| Use correct JWT audience when connecting to course discovery. | Use correct JWT audience when connecting to course discovery.
| Python | agpl-3.0 | cecep-edu/edx-platform,ahmedaljazzar/edx-platform,fintech-circle/edx-platform,waheedahmed/edx-platform,proversity-org/edx-platform,pabloborrego93/edx-platform,mbareta/edx-platform-ft,ESOedX/edx-platform,longmen21/edx-platform,pepeportela/edx-platform,chrisndodge/edx-platform,procangroup/edx-platform,ampax/edx-platform,fintech-circle/edx-platform,Stanford-Online/edx-platform,proversity-org/edx-platform,EDUlib/edx-platform,Edraak/edraak-platform,eduNEXT/edx-platform,eduNEXT/edx-platform,BehavioralInsightsTeam/edx-platform,JioEducation/edx-platform,stvstnfrd/edx-platform,gsehub/edx-platform,mitocw/edx-platform,proversity-org/edx-platform,chrisndodge/edx-platform,mitocw/edx-platform,lduarte1991/edx-platform,jzoldak/edx-platform,mitocw/edx-platform,waheedahmed/edx-platform,prarthitm/edxplatform,Livit/Livit.Learn.EdX,amir-qayyum-khan/edx-platform,CredoReference/edx-platform,appsembler/edx-platform,jjmiranda/edx-platform,amir-qayyum-khan/edx-platform,msegado/edx-platform,gymnasium/edx-platform,mbareta/edx-platform-ft,pepeportela/edx-platform,kmoocdev2/edx-platform,kmoocdev2/edx-platform,angelapper/edx-platform,eduNEXT/edx-platform,ahmedaljazzar/edx-platform,romain-li/edx-platform,philanthropy-u/edx-platform,ahmedaljazzar/edx-platform,Lektorium-LLC/edx-platform,angelapper/edx-platform,arbrandes/edx-platform,edx-solutions/edx-platform,jolyonb/edx-platform,caesar2164/edx-platform,deepsrijit1105/edx-platform,ampax/edx-platform,arbrandes/edx-platform,philanthropy-u/edx-platform,ESOedX/edx-platform,TeachAtTUM/edx-platform,cecep-edu/edx-platform,longmen21/edx-platform,romain-li/edx-platform,raccoongang/edx-platform,a-parhom/edx-platform,procangroup/edx-platform,kmoocdev2/edx-platform,shabab12/edx-platform,pepeportela/edx-platform,TeachAtTUM/edx-platform,jzoldak/edx-platform,proversity-org/edx-platform,jolyonb/edx-platform,appsembler/edx-platform,teltek/edx-platform,caesar2164/edx-platform,ahmedaljazzar/edx-platform,JioEducation/edx-platform,pabloborrego93/ed
x-platform,Stanford-Online/edx-platform,chrisndodge/edx-platform,mbareta/edx-platform-ft,Edraak/edraak-platform,lduarte1991/edx-platform,cpennington/edx-platform,lduarte1991/edx-platform,deepsrijit1105/edx-platform,deepsrijit1105/edx-platform,Edraak/edraak-platform,procangroup/edx-platform,shabab12/edx-platform,Edraak/edraak-platform,CredoReference/edx-platform,stvstnfrd/edx-platform,longmen21/edx-platform,itsjeyd/edx-platform,naresh21/synergetics-edx-platform,eduNEXT/edunext-platform,gymnasium/edx-platform,louyihua/edx-platform,msegado/edx-platform,louyihua/edx-platform,prarthitm/edxplatform,jjmiranda/edx-platform,msegado/edx-platform,TeachAtTUM/edx-platform,Lektorium-LLC/edx-platform,pabloborrego93/edx-platform,synergeticsedx/deployment-wipro,arbrandes/edx-platform,marcore/edx-platform,naresh21/synergetics-edx-platform,chrisndodge/edx-platform,marcore/edx-platform,louyihua/edx-platform,gymnasium/edx-platform,procangroup/edx-platform,gsehub/edx-platform,amir-qayyum-khan/edx-platform,kmoocdev2/edx-platform,marcore/edx-platform,marcore/edx-platform,eduNEXT/edunext-platform,a-parhom/edx-platform,ESOedX/edx-platform,stvstnfrd/edx-platform,Stanford-Online/edx-platform,BehavioralInsightsTeam/edx-platform,JioEducation/edx-platform,fintech-circle/edx-platform,synergeticsedx/deployment-wipro,msegado/edx-platform,edx/edx-platform,cpennington/edx-platform,TeachAtTUM/edx-platform,longmen21/edx-platform,Stanford-Online/edx-platform,Livit/Livit.Learn.EdX,philanthropy-u/edx-platform,miptliot/edx-platform,raccoongang/edx-platform,EDUlib/edx-platform,synergeticsedx/deployment-wipro,a-parhom/edx-platform,BehavioralInsightsTeam/edx-platform,mbareta/edx-platform-ft,hastexo/edx-platform,jolyonb/edx-platform,Livit/Livit.Learn.EdX,Livit/Livit.Learn.EdX,teltek/edx-platform,kmoocdev2/edx-platform,eduNEXT/edunext-platform,jjmiranda/edx-platform,edx/edx-platform,gymnasium/edx-platform,jolyonb/edx-platform,teltek/edx-platform,waheedahmed/edx-platform,ampax/edx-platform,EDUlib/edx-platform,dee
psrijit1105/edx-platform,tanmaykm/edx-platform,Lektorium-LLC/edx-platform,tanmaykm/edx-platform,eduNEXT/edx-platform,philanthropy-u/edx-platform,cpennington/edx-platform,cecep-edu/edx-platform,amir-qayyum-khan/edx-platform,edx-solutions/edx-platform,edx-solutions/edx-platform,cecep-edu/edx-platform,cpennington/edx-platform,CredoReference/edx-platform,romain-li/edx-platform,romain-li/edx-platform,angelapper/edx-platform,edx/edx-platform,eduNEXT/edunext-platform,lduarte1991/edx-platform,waheedahmed/edx-platform,appsembler/edx-platform,miptliot/edx-platform,appsembler/edx-platform,itsjeyd/edx-platform,jzoldak/edx-platform,jjmiranda/edx-platform,pepeportela/edx-platform,ampax/edx-platform,gsehub/edx-platform,pabloborrego93/edx-platform,hastexo/edx-platform,tanmaykm/edx-platform,angelapper/edx-platform,gsehub/edx-platform,teltek/edx-platform,EDUlib/edx-platform,CredoReference/edx-platform,naresh21/synergetics-edx-platform,tanmaykm/edx-platform,arbrandes/edx-platform,hastexo/edx-platform,raccoongang/edx-platform,itsjeyd/edx-platform,raccoongang/edx-platform,Lektorium-LLC/edx-platform,synergeticsedx/deployment-wipro,cecep-edu/edx-platform,shabab12/edx-platform,fintech-circle/edx-platform,edx/edx-platform,naresh21/synergetics-edx-platform,waheedahmed/edx-platform,mitocw/edx-platform,louyihua/edx-platform,prarthitm/edxplatform,romain-li/edx-platform,caesar2164/edx-platform,msegado/edx-platform,caesar2164/edx-platform,miptliot/edx-platform,JioEducation/edx-platform,miptliot/edx-platform,shabab12/edx-platform,hastexo/edx-platform,itsjeyd/edx-platform,stvstnfrd/edx-platform,a-parhom/edx-platform,BehavioralInsightsTeam/edx-platform,ESOedX/edx-platform,longmen21/edx-platform,prarthitm/edxplatform,edx-solutions/edx-platform,jzoldak/edx-platform | python | ## Code Before:
""" Course Discovery API Service. """
from django.conf import settings
from edx_rest_api_client.client import EdxRestApiClient
from openedx.core.djangoapps.theming import helpers
from openedx.core.lib.token_utils import get_id_token
from provider.oauth2.models import Client
CLIENT_NAME = 'course-discovery'
def course_discovery_api_client(user):
""" Returns a Course Discovery API client setup with authentication for the specified user. """
course_discovery_client = Client.objects.get(name=CLIENT_NAME)
secret_key = helpers.get_value('JWT_AUTH', settings.JWT_AUTH)['JWT_SECRET_KEY']
return EdxRestApiClient(
course_discovery_client.url,
jwt=get_id_token(user, CLIENT_NAME, secret_key=secret_key)
)
## Instruction:
Use correct JWT audience when connecting to course discovery.
## Code After:
""" Course Discovery API Service. """
import datetime
from django.conf import settings
from edx_rest_api_client.client import EdxRestApiClient
import jwt
from openedx.core.djangoapps.theming import helpers
from provider.oauth2.models import Client
from student.models import UserProfile, anonymous_id_for_user
CLIENT_NAME = 'course-discovery'
def get_id_token(user):
"""
Return a JWT for `user`, suitable for use with the course discovery service.
Arguments:
user (User): User for whom to generate the JWT.
Returns:
str: The JWT.
"""
try:
# Service users may not have user profiles.
full_name = UserProfile.objects.get(user=user).name
except UserProfile.DoesNotExist:
full_name = None
now = datetime.datetime.utcnow()
expires_in = getattr(settings, 'OAUTH_ID_TOKEN_EXPIRATION', 30)
payload = {
'preferred_username': user.username,
'name': full_name,
'email': user.email,
'administrator': user.is_staff,
'iss': helpers.get_value('OAUTH_OIDC_ISSUER', settings.OAUTH_OIDC_ISSUER),
'exp': now + datetime.timedelta(seconds=expires_in),
'iat': now,
'aud': helpers.get_value('JWT_AUTH', settings.JWT_AUTH)['JWT_AUDIENCE'],
'sub': anonymous_id_for_user(user, None),
}
secret_key = helpers.get_value('JWT_AUTH', settings.JWT_AUTH)['JWT_SECRET_KEY']
return jwt.encode(payload, secret_key)
def course_discovery_api_client(user):
""" Returns a Course Discovery API client setup with authentication for the specified user. """
course_discovery_client = Client.objects.get(name=CLIENT_NAME)
return EdxRestApiClient(course_discovery_client.url, jwt=get_id_token(user))
| """ Course Discovery API Service. """
+ import datetime
+
from django.conf import settings
+ from edx_rest_api_client.client import EdxRestApiClient
+ import jwt
- from edx_rest_api_client.client import EdxRestApiClient
from openedx.core.djangoapps.theming import helpers
- from openedx.core.lib.token_utils import get_id_token
from provider.oauth2.models import Client
+ from student.models import UserProfile, anonymous_id_for_user
CLIENT_NAME = 'course-discovery'
+
+
+ def get_id_token(user):
+ """
+ Return a JWT for `user`, suitable for use with the course discovery service.
+
+ Arguments:
+ user (User): User for whom to generate the JWT.
+
+ Returns:
+ str: The JWT.
+ """
+ try:
+ # Service users may not have user profiles.
+ full_name = UserProfile.objects.get(user=user).name
+ except UserProfile.DoesNotExist:
+ full_name = None
+
+ now = datetime.datetime.utcnow()
+ expires_in = getattr(settings, 'OAUTH_ID_TOKEN_EXPIRATION', 30)
+
+ payload = {
+ 'preferred_username': user.username,
+ 'name': full_name,
+ 'email': user.email,
+ 'administrator': user.is_staff,
+ 'iss': helpers.get_value('OAUTH_OIDC_ISSUER', settings.OAUTH_OIDC_ISSUER),
+ 'exp': now + datetime.timedelta(seconds=expires_in),
+ 'iat': now,
+ 'aud': helpers.get_value('JWT_AUTH', settings.JWT_AUTH)['JWT_AUDIENCE'],
+ 'sub': anonymous_id_for_user(user, None),
+ }
+ secret_key = helpers.get_value('JWT_AUTH', settings.JWT_AUTH)['JWT_SECRET_KEY']
+
+ return jwt.encode(payload, secret_key)
def course_discovery_api_client(user):
""" Returns a Course Discovery API client setup with authentication for the specified user. """
course_discovery_client = Client.objects.get(name=CLIENT_NAME)
+ return EdxRestApiClient(course_discovery_client.url, jwt=get_id_token(user))
- secret_key = helpers.get_value('JWT_AUTH', settings.JWT_AUTH)['JWT_SECRET_KEY']
- return EdxRestApiClient(
- course_discovery_client.url,
- jwt=get_id_token(user, CLIENT_NAME, secret_key=secret_key)
- ) | 48 | 2.526316 | 41 | 7 |
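The openedx record above inlines JWT construction so the `aud` claim can come from `JWT_AUTH['JWT_AUDIENCE']`; the real code delegates signing to PyJWT's `jwt.encode(payload, secret_key)`. To show the token shape without that dependency, here is a minimal stdlib HS256 encoder; the claim values are made-up placeholders, not values from the commit:

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> bytes:
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def encode_hs256(payload: dict, secret: str) -> str:
    # header.payload.signature, each part base64url-encoded without padding
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (
        _b64url(json.dumps(header, separators=(",", ":")).encode())
        + b"."
        + _b64url(json.dumps(payload, separators=(",", ":")).encode())
    )
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return (signing_input + b"." + _b64url(sig)).decode()

payload = {
    "preferred_username": "staff",      # placeholder values
    "administrator": True,
    "iss": "https://example.com/oauth2",
    "aud": "course-discovery",          # the claim the commit sets correctly
    "sub": "anonymous-user-id",
}
token = encode_hs256(payload, "secret-key")
```

Decoding the middle segment of `token` recovers the payload, including the `aud` claim the commit is about.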
eedefe208fcd01e20a4cc86b1838e95f81cbb628 | npm/package.json | npm/package.json | {
"name": "parserlib",
"version": "@VERSION@",
"description": "CSSLint",
"author": "Nicholas C. Zakas",
"contributors": [
],
"engines": {
"node" : ">=0.2.0"
},
"directories": {
"lib" : "lib"
},
"main": "./lib/node-parserlib.js",
"licenses":[
{
"type" : "MIT",
"url" : "https://github.com/nzakas/parser-lib/blob/master/LICENSE"
}
],
"repository": {
"type":"git",
"url":"http://github.com/nzakas/parser-lib.git"
}
} | {
"name": "parserlib",
"version": "@VERSION@",
"description": "CSSLint",
"author": "Nicholas C. Zakas",
"description": "CSS3 SAX-inspired parser",
"keywords": [ "parser", "css", "css3", "sax", "style", "stylesheet" ],
"contributors": [
],
"engines": {
"node" : ">=0.2.0"
},
"directories": {
"lib" : "lib"
},
"main": "./lib/node-parserlib.js",
"licenses":[
{
"type" : "MIT",
"url" : "https://github.com/nzakas/parser-lib/blob/master/LICENSE"
}
],
"repository": {
"type":"git",
"url":"http://github.com/nzakas/parser-lib.git"
},
"scripts": {
"test": "ant -f ../../build.xml test",
"prepublish": "npm test"
}
} | Test hooks & keyword data for npm | Test hooks & keyword data for npm
| JSON | mit | malept/parser-lib,medikoo/parser-lib,ash1982ok/parser-lib,cscott/parser-lib,malept/parser-lib,ovanderzee/parser-lib,medikoo/parser-lib,ideadapt/parser-lib,ovanderzee/parser-lib,cscott/parser-lib,ash1982ok/parser-lib,ideadapt/parser-lib | json | ## Code Before:
{
"name": "parserlib",
"version": "@VERSION@",
"description": "CSSLint",
"author": "Nicholas C. Zakas",
"contributors": [
],
"engines": {
"node" : ">=0.2.0"
},
"directories": {
"lib" : "lib"
},
"main": "./lib/node-parserlib.js",
"licenses":[
{
"type" : "MIT",
"url" : "https://github.com/nzakas/parser-lib/blob/master/LICENSE"
}
],
"repository": {
"type":"git",
"url":"http://github.com/nzakas/parser-lib.git"
}
}
## Instruction:
Test hooks & keyword data for npm
## Code After:
{
"name": "parserlib",
"version": "@VERSION@",
"description": "CSSLint",
"author": "Nicholas C. Zakas",
"description": "CSS3 SAX-inspired parser",
"keywords": [ "parser", "css", "css3", "sax", "style", "stylesheet" ],
"contributors": [
],
"engines": {
"node" : ">=0.2.0"
},
"directories": {
"lib" : "lib"
},
"main": "./lib/node-parserlib.js",
"licenses":[
{
"type" : "MIT",
"url" : "https://github.com/nzakas/parser-lib/blob/master/LICENSE"
}
],
"repository": {
"type":"git",
"url":"http://github.com/nzakas/parser-lib.git"
},
"scripts": {
"test": "ant -f ../../build.xml test",
"prepublish": "npm test"
}
} | {
"name": "parserlib",
"version": "@VERSION@",
"description": "CSSLint",
"author": "Nicholas C. Zakas",
+ "description": "CSS3 SAX-inspired parser",
+ "keywords": [ "parser", "css", "css3", "sax", "style", "stylesheet" ],
"contributors": [
],
"engines": {
"node" : ">=0.2.0"
},
"directories": {
"lib" : "lib"
},
"main": "./lib/node-parserlib.js",
"licenses":[
{
"type" : "MIT",
"url" : "https://github.com/nzakas/parser-lib/blob/master/LICENSE"
}
],
"repository": {
"type":"git",
"url":"http://github.com/nzakas/parser-lib.git"
+ },
+ "scripts": {
+ "test": "ant -f ../../build.xml test",
+ "prepublish": "npm test"
}
} | 6 | 0.24 | 6 | 0 |
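Two details of the package.json record above are worth noting: the `prepublish` script chains `npm test` so publishing runs the test suite, and the resulting file ends up with two `"description"` keys. Most JSON parsers resolve duplicate keys last-wins, which the following Python sketch demonstrates on a trimmed fragment of the file:

```python
import json

pkg_fragment = """
{
  "name": "parserlib",
  "description": "CSSLint",
  "description": "CSS3 SAX-inspired parser",
  "scripts": {
    "test": "ant -f ../../build.xml test",
    "prepublish": "npm test"
  }
}
"""

pkg = json.loads(pkg_fragment)
# Python's json module keeps the last occurrence of a duplicated key,
# so the earlier "CSSLint" description is silently dropped.
```

Here `pkg["description"]` is `"CSS3 SAX-inspired parser"`; whether other tools agree is parser-dependent, which is one reason the duplicate key would be better removed.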
8aff666c35c61b49ed6b04d3576bfeb4b7224874 | index.js | index.js | const Koa = require('koa');
const fetch = require('node-fetch');
const app = new Koa();
const port = process.env.PORT || 3000;
app.use(async (ctx, next) => {
var response = await fetch('http://inspirobot.me/api?generate=true');
var url = await response.text();
ctx.body = url;
});
if (!module.parent) app.listen(port); | const Koa = require('koa');
const fetch = require('node-fetch');
const app = new Koa();
const port = process.env.PORT || 3000;
app.use(async (ctx, next) => {
const response = await fetch('http://inspirobot.me/api?generate=true');
const url = await response.text();
const result = {
parse: "full",
response_type: "in_channel",
text: url,
attachments: [{ image_url: url }],
unfurl_media: true,
unfurl_links: true
};
ctx.set('Content-Type', 'application/json');
ctx.body = result;
});
if (!module.parent) app.listen(port); | Format response to slack json | Format response to slack json
| JavaScript | mit | Shtian/inspireme | javascript | ## Code Before:
const Koa = require('koa');
const fetch = require('node-fetch');
const app = new Koa();
const port = process.env.PORT || 3000;
app.use(async (ctx, next) => {
var response = await fetch('http://inspirobot.me/api?generate=true');
var url = await response.text();
ctx.body = url;
});
if (!module.parent) app.listen(port);
## Instruction:
Format response to slack json
## Code After:
const Koa = require('koa');
const fetch = require('node-fetch');
const app = new Koa();
const port = process.env.PORT || 3000;
app.use(async (ctx, next) => {
const response = await fetch('http://inspirobot.me/api?generate=true');
const url = await response.text();
const result = {
parse: "full",
response_type: "in_channel",
text: url,
attachments: [{ image_url: url }],
unfurl_media: true,
unfurl_links: true
};
ctx.set('Content-Type', 'application/json');
ctx.body = result;
});
if (!module.parent) app.listen(port); | const Koa = require('koa');
const fetch = require('node-fetch');
const app = new Koa();
const port = process.env.PORT || 3000;
app.use(async (ctx, next) => {
- var response = await fetch('http://inspirobot.me/api?generate=true');
? ^^^
+ const response = await fetch('http://inspirobot.me/api?generate=true');
? ^^^^^
- var url = await response.text();
? ^^^
+ const url = await response.text();
? ^^^^^
+ const result = {
+ parse: "full",
+ response_type: "in_channel",
+ text: url,
+ attachments: [{ image_url: url }],
+ unfurl_media: true,
+ unfurl_links: true
+ };
+ ctx.set('Content-Type', 'application/json');
- ctx.body = url;
? -
+ ctx.body = result;
? +++ +
});
if (!module.parent) app.listen(port); | 15 | 1.25 | 12 | 3 |
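The Koa record above reshapes the raw image URL into a Slack-style payload (an in_channel response carrying an image attachment). The same structure built as a Python dict; the field names are copied from the commit, and whether Slack still honors all of them is outside the record:

```python
def slack_message(url):
    # Mirrors the JS handler's result object field-for-field.
    return {
        "parse": "full",
        "response_type": "in_channel",   # visible to the whole channel
        "text": url,
        "attachments": [{"image_url": url}],
        "unfurl_media": True,
        "unfurl_links": True,
    }
```

The same URL is deliberately used both as the message text and as the attachment image, matching the handler.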
fb67dd5793ba6163e66f325fa3adcb12c4fc8038 | gulpfile.js | gulpfile.js | var stexDev = require("stex-dev");
var gulp = stexDev.gulp();
var plugins = stexDev.gulpPlugins();
var paths = stexDev.paths;
// composite gulp tasks
gulp.task('default', ['test']);
gulp.task('dist', ['']);
gulp.task('test', ['lint', 'mocha']);
// gulp.task('db:setup', ['db:ensure-created', 'db:migrate']);
// end composite tasks
// you can find individual gulp tasks in ./node_modules/stex-dev/lib/gulp.js
// // expose the app globals to other tasks
// gulp.task('app', function(next) {
// var stex = require("./lib/app");
// stex.activate();
// next();
// });
// gulp.task('db:ensure-created', ['app'], function() {
// var Knex = require("knex");
// var dbConfig = conf.get("db");
// var dbToCreate = dbConfig.connection.database;
// // create a connection to the db without specifying the db
// delete dbConfig.connection.database;
// var db = Knex.initialize(dbConfig);
// return db.raw("CREATE DATABASE IF NOT EXISTS `" + dbToCreate + "`")
// .then(function() { /* noop */ })
// .finally(function(){
// db.client.pool.destroy();
// });
// });
// gulp.task('db:migrate', function(next) {
// var spawn = require('child_process').spawn;
// var proc = spawn("stex", ["db-migrate", "up"], { stdio: 'inherit' });
// proc.on('close', function (code) {
// if(code === 0) {
// next();
// } else {
// next(new Error("Process failed: " + code));
// }
// });
// });
| var Stex = require("stex");
var stexDev = require("stex-dev");
var gulp = stexDev.gulp();
var plugins = stexDev.gulpPlugins();
var paths = stexDev.paths;
paths.root = __dirname; //HACK: can't think of a better way to expose the app root prior to stex init
gulp.task('default', ['test']);
gulp.task('dist', ['']);
gulp.task('test', ['lint', 'mocha']);
gulp.task('db:setup', ['db:ensure-created', 'db:migrate']);
// you can find individual gulp tasks in ./node_modules/stex-dev/lib/gulp.js
| Move app/db tasks into stex-dev | Move app/db tasks into stex-dev
| JavaScript | isc | Payshare/stellar-wallet,stellar/stellar-wallet,stellar/stellar-wallet,Payshare/stellar-wallet | javascript | ## Code Before:
var stexDev = require("stex-dev");
var gulp = stexDev.gulp();
var plugins = stexDev.gulpPlugins();
var paths = stexDev.paths;
// composite gulp tasks
gulp.task('default', ['test']);
gulp.task('dist', ['']);
gulp.task('test', ['lint', 'mocha']);
// gulp.task('db:setup', ['db:ensure-created', 'db:migrate']);
// end composite tasks
// you can find individual gulp tasks in ./node_modules/stex-dev/lib/gulp.js
// // expose the app globals to other tasks
// gulp.task('app', function(next) {
// var stex = require("./lib/app");
// stex.activate();
// next();
// });
// gulp.task('db:ensure-created', ['app'], function() {
// var Knex = require("knex");
// var dbConfig = conf.get("db");
// var dbToCreate = dbConfig.connection.database;
// // create a connection to the db without specifying the db
// delete dbConfig.connection.database;
// var db = Knex.initialize(dbConfig);
// return db.raw("CREATE DATABASE IF NOT EXISTS `" + dbToCreate + "`")
// .then(function() { /* noop */ })
// .finally(function(){
// db.client.pool.destroy();
// });
// });
// gulp.task('db:migrate', function(next) {
// var spawn = require('child_process').spawn;
// var proc = spawn("stex", ["db-migrate", "up"], { stdio: 'inherit' });
// proc.on('close', function (code) {
// if(code === 0) {
// next();
// } else {
// next(new Error("Process failed: " + code));
// }
// });
// });
## Instruction:
Move app/db tasks into stex-dev
## Code After:
var Stex = require("stex");
var stexDev = require("stex-dev");
var gulp = stexDev.gulp();
var plugins = stexDev.gulpPlugins();
var paths = stexDev.paths;
paths.root = __dirname; //HACK: can't think of a better way to expose the app root prior to stex init
gulp.task('default', ['test']);
gulp.task('dist', ['']);
gulp.task('test', ['lint', 'mocha']);
gulp.task('db:setup', ['db:ensure-created', 'db:migrate']);
// you can find individual gulp tasks in ./node_modules/stex-dev/lib/gulp.js
| + var Stex = require("stex");
var stexDev = require("stex-dev");
var gulp = stexDev.gulp();
var plugins = stexDev.gulpPlugins();
var paths = stexDev.paths;
+ paths.root = __dirname; //HACK: can't think of a better way to expose the app root prior to stex init
- // composite gulp tasks
gulp.task('default', ['test']);
gulp.task('dist', ['']);
gulp.task('test', ['lint', 'mocha']);
- // gulp.task('db:setup', ['db:ensure-created', 'db:migrate']);
? ---
+ gulp.task('db:setup', ['db:ensure-created', 'db:migrate']);
-
- // end composite tasks
-
// you can find individual gulp tasks in ./node_modules/stex-dev/lib/gulp.js
-
-
- // // expose the app globals to other tasks
- // gulp.task('app', function(next) {
- // var stex = require("./lib/app");
- // stex.activate();
- // next();
- // });
-
- // gulp.task('db:ensure-created', ['app'], function() {
- // var Knex = require("knex");
- // var dbConfig = conf.get("db");
- // var dbToCreate = dbConfig.connection.database;
-
- // // create a connection to the db without specifying the db
- // delete dbConfig.connection.database;
- // var db = Knex.initialize(dbConfig);
-
- // return db.raw("CREATE DATABASE IF NOT EXISTS `" + dbToCreate + "`")
- // .then(function() { /* noop */ })
- // .finally(function(){
- // db.client.pool.destroy();
- // });
- // });
-
-
- // gulp.task('db:migrate', function(next) {
- // var spawn = require('child_process').spawn;
-
- // var proc = spawn("stex", ["db-migrate", "up"], { stdio: 'inherit' });
- // proc.on('close', function (code) {
- // if(code === 0) {
- // next();
- // } else {
- // next(new Error("Process failed: " + code));
- // }
- // });
- // });
-
- | 48 | 0.842105 | 3 | 45 |
de3aa8fc5e6b525f06794086900331ece437b08a | app/controllers/admin/events_controller.rb | app/controllers/admin/events_controller.rb | class Admin::EventsController < Admin::BaseController
before_filter :authenticate_user!
skip_authorization_check # XXX because I'm using the BaseController for auth
def index
@events = Event.all
@num_competitors = Registrant.where({:competitor => true}).count
@num_non_competitors = Registrant.where({:competitor => false}).count
@num_registrants = @num_competitors + @num_non_competitors
end
def show
@event = Event.find(params[:id])
end
end
| class Admin::EventsController < Admin::BaseController
before_filter :authenticate_user!
load_and_authorize_resource
def index
@num_competitors = Registrant.where({:competitor => true}).count
@num_non_competitors = Registrant.where({:competitor => false}).count
@num_registrants = @num_competitors + @num_non_competitors
end
def show
end
end
| Fix authorization check on the /admin/events page | Fix authorization check on the /admin/events page
| Ruby | mit | rdunlop/unicycling-registration,rdunlop/unicycling-registration,scotthue/unicycling-registration,rdunlop/unicycling-registration,scotthue/unicycling-registration,scotthue/unicycling-registration,rdunlop/unicycling-registration | ruby | ## Code Before:
class Admin::EventsController < Admin::BaseController
before_filter :authenticate_user!
skip_authorization_check # XXX because I'm using the BaseController for auth
def index
@events = Event.all
@num_competitors = Registrant.where({:competitor => true}).count
@num_non_competitors = Registrant.where({:competitor => false}).count
@num_registrants = @num_competitors + @num_non_competitors
end
def show
@event = Event.find(params[:id])
end
end
## Instruction:
Fix authorization check on the /admin/events page
## Code After:
class Admin::EventsController < Admin::BaseController
before_filter :authenticate_user!
load_and_authorize_resource
def index
@num_competitors = Registrant.where({:competitor => true}).count
@num_non_competitors = Registrant.where({:competitor => false}).count
@num_registrants = @num_competitors + @num_non_competitors
end
def show
end
end
| class Admin::EventsController < Admin::BaseController
before_filter :authenticate_user!
- skip_authorization_check # XXX because I'm using the BaseController for auth
+ load_and_authorize_resource
def index
- @events = Event.all
@num_competitors = Registrant.where({:competitor => true}).count
@num_non_competitors = Registrant.where({:competitor => false}).count
@num_registrants = @num_competitors + @num_non_competitors
end
def show
- @event = Event.find(params[:id])
end
end | 4 | 0.266667 | 1 | 3 |
cc6ebf12d4ad415a61363f84fd84364591224760 | docs/vault.md | docs/vault.md |
Vault is a tool for managing secrets, such as API keys and other credentials. In the context of builder, these credentials are mostly necessary to access or modify infrastructure.
## User scenario
builder assumes a logged-in Vault client, probably running on the master server.
Run the following command to log in:
`./bldr vault.login`
You should be asked for an access token that your system administrators should provide you with.
In case it's necessary to log out, run:
`./bldr vault.logout`
Terraform, wrapped by builder, will read the ~/.vault-token file during its operations.
## Creating new tokens
After logging in with a root token, run:
`./bldr vault.token_create`
This token will be associated with the `builder-user` policy which gives it read-only access to secrets needed by builder.
To lookup information about a token:
`./bldr vault.token_lookup:<token>`
To revoke a token:
`./bldr vault.token_revoke:<token>`
|
Vault is a tool for managing secrets, such as API keys and other credentials. In the context of builder, these credentials are mostly necessary to access or modify infrastructure.
## User scenario
builder assumes a logged-in Vault client, probably running on the master server.
Run the following command to log in:
`./bldr vault.login`
You should be asked for an access token that your system administrators should provide you with.
In case it's necessary to log out, run:
`./bldr vault.logout`
Terraform, wrapped by builder, will read the ~/.vault-token file during its operations.
## Creating new tokens
After logging in with a root token, run:
`./bldr vault.token_create`
This token will be associated with the `builder-user` policy which gives it read-only access to secrets needed by builder.
To lookup information about a token:
`./bldr vault.token_lookup:<token>`
To revoke a token:
`./bldr vault.token_revoke:<token>`
## Reading and writing secrets (admin only)
Some commands can be manually run to directly interact with Vault's key-value secrets store:
```
$ VAULT_ADDR=https://master-server.elifesciences.org:8200 vault kv get secret/builder/apikey/fastly-gcs-logging
Key Value
--- -----
email fastly@elife-fastly.iam.gserviceaccount.com
secret_key -----BEGIN PRIVATE KEY-----
...
-----END PRIVATE KEY-----
```
```
$ VAULT_ADDR=https://master-server.elifesciences.org:8200 vault kv put secret/builder/apikey/fastly-gcp-logging email=fastly@elife-fastly.iam.gserviceaccount.com secret_key=@../../fastly-gcp-logging.secret
Success! Data written to: secret/builder/apikey/fastly-gcp-logging
```
| Store commands for reading and writing secrets from Vault | Store commands for reading and writing secrets from Vault
| Markdown | mit | elifesciences/builder,elifesciences/builder | markdown | ## Code Before:
Vault is a tool for managing secrets, such as API keys and other credentials. In the context of builder, these credentials are mostly necessary to access or modify infrastructure.
## User scenario
builder assumes a logged-in Vault client, probably running on the master server.
Run the following command to log in:
`./bldr vault.login`
You should be asked for an access token that your system administrators should provide you with.
In case it's necessary to log out, run:
`./bldr vault.logout`
Terraform, wrapped by builder, will read the ~/.vault-token file during its operations.
## Creating new tokens
After logging in with a root token, run:
`./bldr vault.token_create`
This token will be associated with the `builder-user` policy which gives it read-only access to secrets needed by builder.
To lookup information about a token:
`./bldr vault.token_lookup:<token>`
To revoke a token:
`./bldr vault.token_revoke:<token>`
## Instruction:
Store commands for reading and writing secrets from Vault
## Code After:
Vault is a tool for managing secrets, such as API keys and other credentials. In the context of builder, these credentials are mostly necessary to access or modify infrastructure.
## User scenario
builder assumes a logged-in Vault client, probably running on the master server.
Run the following command to log in:
`./bldr vault.login`
You should be asked for an access token that your system administrators should provide you with.
In case it's necessary to log out, run:
`./bldr vault.logout`
Terraform, wrapped by builder, will read the ~/.vault-token file during its operations.
## Creating new tokens
After logging in with a root token, run:
`./bldr vault.token_create`
This token will be associated with the `builder-user` policy which gives it read-only access to secrets needed by builder.
To lookup information about a token:
`./bldr vault.token_lookup:<token>`
To revoke a token:
`./bldr vault.token_revoke:<token>`
## Reading and writing secrets (admin only)
Some commands can be manually run to directly interact with Vault's key-value secrets store:
```
$ VAULT_ADDR=https://master-server.elifesciences.org:8200 vault kv get secret/builder/apikey/fastly-gcs-logging
Key Value
--- -----
email fastly@elife-fastly.iam.gserviceaccount.com
secret_key -----BEGIN PRIVATE KEY-----
...
-----END PRIVATE KEY-----
```
```
$ VAULT_ADDR=https://master-server.elifesciences.org:8200 vault kv put secret/builder/apikey/fastly-gcp-logging email=fastly@elife-fastly.iam.gserviceaccount.com secret_key=@../../fastly-gcp-logging.secret
Success! Data written to: secret/builder/apikey/fastly-gcp-logging
```
|
Vault is a tool for managing secrets, such as API keys and other credentials. In the context of builder, these credentials are mostly necessary to access or modify infrastructure.
## User scenario
builder assumes a logged-in Vault client, probably running on the master server.
Run the following command to log in:
`./bldr vault.login`
You should be asked for an access token that your system administrators should provide you with.
In case it's necessary to log out, run:
`./bldr vault.logout`
Terraform, wrapped by builder, will read the ~/.vault-token file during its operations.
## Creating new tokens
After logging in with a root token, run:
`./bldr vault.token_create`
This token will be associated with the `builder-user` policy which gives it read-only access to secrets needed by builder.
To lookup information about a token:
`./bldr vault.token_lookup:<token>`
To revoke a token:
`./bldr vault.token_revoke:<token>`
+
+ ## Reading and writing secrets (admin only)
+
+ Some commands can be manually run to directly interact with Vault's key-value secrets store:
+
+ ```
+ $ VAULT_ADDR=https://master-server.elifesciences.org:8200 vault kv get secret/builder/apikey/fastly-gcs-logging
+ Key Value
+ --- -----
+ email fastly@elife-fastly.iam.gserviceaccount.com
+ secret_key -----BEGIN PRIVATE KEY-----
+ ...
+ -----END PRIVATE KEY-----
+ ```
+
+ ```
+ $ VAULT_ADDR=https://master-server.elifesciences.org:8200 vault kv put secret/builder/apikey/fastly-gcp-logging email=fastly@elife-fastly.iam.gserviceaccount.com secret_key=@../../fastly-gcp-logging.secret
+ Success! Data written to: secret/builder/apikey/fastly-gcp-logging
+ ``` | 19 | 0.558824 | 19 | 0 |
34ea5331f8e05dacf356096dfc1b63682fa78654 | Wikipedia/Code/BITHockeyManager+WMFExtensions.h | Wikipedia/Code/BITHockeyManager+WMFExtensions.h |
@interface BITHockeyManager (WMFExtensions) <BITHockeyManagerDelegate>
/**
* Configure and startup in one line.
* This will call the methods below as part of the configuration process.
* This method will use the current bundle id of the app
*/
- (void)wmf_setupAndStart;
/**
* Configure the alert to be displayed when a user is prompeted to send a crash report
*/
- (void)wmf_setupCrashNotificationAlert;
@end
| @import HockeySDK;
@interface BITHockeyManager (WMFExtensions) <BITHockeyManagerDelegate>
/**
* Configure and startup in one line.
* This will call the methods below as part of the configuration process.
* This method will use the current bundle id of the app
*/
- (void)wmf_setupAndStart;
/**
* Configure the alert to be displayed when a user is prompeted to send a crash report
*/
- (void)wmf_setupCrashNotificationAlert;
@end
| Revert "use old import syntax for HockeySDK" | Revert "use old import syntax for HockeySDK"
This reverts commit 0babdd70b3ab330f032790521002f2e171fcf3e6.
| C | mit | wikimedia/wikipedia-ios,wikimedia/wikipedia-ios,josve05a/wikipedia-ios,julienbodet/wikipedia-ios,wikimedia/apps-ios-wikipedia,wikimedia/apps-ios-wikipedia,montehurd/apps-ios-wikipedia,wikimedia/apps-ios-wikipedia,wikimedia/wikipedia-ios,montehurd/apps-ios-wikipedia,wikimedia/wikipedia-ios,montehurd/apps-ios-wikipedia,josve05a/wikipedia-ios,wikimedia/apps-ios-wikipedia,josve05a/wikipedia-ios,wikimedia/apps-ios-wikipedia,josve05a/wikipedia-ios,wikimedia/apps-ios-wikipedia,wikimedia/wikipedia-ios,wikimedia/wikipedia-ios,julienbodet/wikipedia-ios,josve05a/wikipedia-ios,montehurd/apps-ios-wikipedia,wikimedia/apps-ios-wikipedia,julienbodet/wikipedia-ios,montehurd/apps-ios-wikipedia,josve05a/wikipedia-ios,julienbodet/wikipedia-ios,julienbodet/wikipedia-ios,josve05a/wikipedia-ios,julienbodet/wikipedia-ios,montehurd/apps-ios-wikipedia,montehurd/apps-ios-wikipedia,montehurd/apps-ios-wikipedia,julienbodet/wikipedia-ios,julienbodet/wikipedia-ios,josve05a/wikipedia-ios,wikimedia/wikipedia-ios,wikimedia/apps-ios-wikipedia | c | ## Code Before:
@interface BITHockeyManager (WMFExtensions) <BITHockeyManagerDelegate>
/**
* Configure and startup in one line.
* This will call the methods below as part of the configuration process.
* This method will use the current bundle id of the app
*/
- (void)wmf_setupAndStart;
/**
* Configure the alert to be displayed when a user is prompeted to send a crash report
*/
- (void)wmf_setupCrashNotificationAlert;
@end
## Instruction:
Revert "use old import syntax for HockeySDK"
This reverts commit 0babdd70b3ab330f032790521002f2e171fcf3e6.
## Code After:
@import HockeySDK;
@interface BITHockeyManager (WMFExtensions) <BITHockeyManagerDelegate>
/**
* Configure and startup in one line.
* This will call the methods below as part of the configuration process.
* This method will use the current bundle id of the app
*/
- (void)wmf_setupAndStart;
/**
* Configure the alert to be displayed when a user is prompeted to send a crash report
*/
- (void)wmf_setupCrashNotificationAlert;
@end
| + @import HockeySDK;
@interface BITHockeyManager (WMFExtensions) <BITHockeyManagerDelegate>
/**
* Configure and startup in one line.
* This will call the methods below as part of the configuration process.
* This method will use the current bundle id of the app
*/
- (void)wmf_setupAndStart;
/**
* Configure the alert to be displayed when a user is prompeted to send a crash report
*/
- (void)wmf_setupCrashNotificationAlert;
@end | 1 | 0.0625 | 1 | 0 |
14ff798a0795fa39ee762fedbd832f19463a2190 | assets/desktop/common/noty/layouts.js | assets/desktop/common/noty/layouts.js | /*global window, jQuery*/
;(function($) {
'use strict';
$.noty.layouts.progress = {
name: 'progress',
options: {}, // overrides options
container: {
object: '<div id="notice-progress-container" />',
selector: 'div#notice-progress-container',
style: $.noop,
},
parent: {
object: '<div class="notice-element" />',
selector: 'div.notice-element',
css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
$.noty.layouts.notification = {
name: 'notification',
options: {}, // overrides options
container: {
object: '<div id="notice-notification-container" />',
selector: 'div#notice-notification-container',
style: $.noop,
},
parent: {
object: '<div class="notice-element" />',
selector: 'div.notice-element',
css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
})(jQuery);
| /*global window, jQuery*/
;(function($) {
'use strict';
$.noty.layouts.progress = {
name: 'progress',
options: {}, // overrides options
container: {
object: '<div id="notice-progress-container" />',
selector: 'div#notice-progress-container',
style: $.noop,
},
parent: {
object: '<div class="notice-element" />',
selector: 'div.notice-element',
css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
$.noty.layouts.notification = {
name: 'notification',
options: {}, // overrides options
container: {
object: '<ul id="noty_topRight_layout_container" />',
selector: 'ul#noty_topRight_layout_container',
style: function() {
$(this).css({
top: 20,
right: 20,
position: 'fixed',
width: '310px',
height: 'auto',
margin: 0,
padding: 0,
listStyleType: 'none',
zIndex: 10000000
});
if (window.innerWidth < 600) {
$(this).css({
right: 5
});
}
}
},
parent: {
object: '<li />',
selector: 'li',
css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
})(jQuery);
| Revert "Cleanup notification notice layout" | Revert "Cleanup notification notice layout"
This reverts commit d1f46897609f900bd1e8f79745751bc3789e8ef8.
| JavaScript | mit | nodeca/nodeca.core,nodeca/nodeca.core | javascript | ## Code Before:
/*global window, jQuery*/
;(function($) {
'use strict';
$.noty.layouts.progress = {
name: 'progress',
options: {}, // overrides options
container: {
object: '<div id="notice-progress-container" />',
selector: 'div#notice-progress-container',
style: $.noop,
},
parent: {
object: '<div class="notice-element" />',
selector: 'div.notice-element',
css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
$.noty.layouts.notification = {
name: 'notification',
options: {}, // overrides options
container: {
object: '<div id="notice-notification-container" />',
selector: 'div#notice-notification-container',
style: $.noop,
},
parent: {
object: '<div class="notice-element" />',
selector: 'div.notice-element',
css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
})(jQuery);
## Instruction:
Revert "Cleanup notification notice layout"
This reverts commit d1f46897609f900bd1e8f79745751bc3789e8ef8.
## Code After:
/*global window, jQuery*/
;(function($) {
'use strict';
$.noty.layouts.progress = {
name: 'progress',
options: {}, // overrides options
container: {
object: '<div id="notice-progress-container" />',
selector: 'div#notice-progress-container',
style: $.noop,
},
parent: {
object: '<div class="notice-element" />',
selector: 'div.notice-element',
css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
$.noty.layouts.notification = {
name: 'notification',
options: {}, // overrides options
container: {
object: '<ul id="noty_topRight_layout_container" />',
selector: 'ul#noty_topRight_layout_container',
style: function() {
$(this).css({
top: 20,
right: 20,
position: 'fixed',
width: '310px',
height: 'auto',
margin: 0,
padding: 0,
listStyleType: 'none',
zIndex: 10000000
});
if (window.innerWidth < 600) {
$(this).css({
right: 5
});
}
}
},
parent: {
object: '<li />',
selector: 'li',
css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
})(jQuery);
| /*global window, jQuery*/
;(function($) {
'use strict';
$.noty.layouts.progress = {
name: 'progress',
options: {}, // overrides options
container: {
object: '<div id="notice-progress-container" />',
selector: 'div#notice-progress-container',
style: $.noop,
},
parent: {
object: '<div class="notice-element" />',
selector: 'div.notice-element',
css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
$.noty.layouts.notification = {
name: 'notification',
options: {}, // overrides options
container: {
- object: '<div id="notice-notification-container" />',
- selector: 'div#notice-notification-container',
- style: $.noop,
+ object: '<ul id="noty_topRight_layout_container" />',
+ selector: 'ul#noty_topRight_layout_container',
+ style: function() {
+ $(this).css({
+ top: 20,
+ right: 20,
+ position: 'fixed',
+ width: '310px',
+ height: 'auto',
+ margin: 0,
+ padding: 0,
+ listStyleType: 'none',
+ zIndex: 10000000
+ });
+
+ if (window.innerWidth < 600) {
+ $(this).css({
+ right: 5
+ });
+ }
+ }
},
parent: {
- object: '<div class="notice-element" />',
- selector: 'div.notice-element',
+ object: '<li />',
+ selector: 'li',
- css: {}
? -----
+ css: {}
},
css: {
display: 'none',
width: '310px'
},
addClass: ''
};
})(jQuery);
| 30 | 0.612245 | 24 | 6 |
97cd7a8adb9b9a675a2f7a470d33d0ee754d6ce0 | README.rst | README.rst | osbridge-volunteer
==================
Open Source Bridge volunteer management app
| osbridge-volunteer
==================
Open Source Bridge volunteer management app
Install Requirements
====================
Create a virtualenvironment based on Python 3.4
virtualenv --python=/usr/bin/python3.4 virtualenv
or
virtualenv --python=/usr/bin/python3.4 virtualenv
| Save 3.4 installation command for the virtualenv | Save 3.4 installation command for the virtualenv
| reStructuredText | mit | ChrisFreeman/osbridge-volunteer,ChrisFreeman/osbridge-volunteer,ChrisFreeman/osbridge-volunteer | restructuredtext | ## Code Before:
osbridge-volunteer
==================
Open Source Bridge volunteer management app
## Instruction:
Save 3.4 installation command for the virtualenv
## Code After:
osbridge-volunteer
==================
Open Source Bridge volunteer management app
Install Requirements
====================
Create a virtualenvironment based on Python 3.4
virtualenv --python=/usr/bin/python3.4 virtualenv
or
virtualenv --python=/usr/bin/python3.4 virtualenv
| osbridge-volunteer
==================
Open Source Bridge volunteer management app
+
+ Install Requirements
+ ====================
+
+ Create a virtualenvironment based on Python 3.4
+ virtualenv --python=/usr/bin/python3.4 virtualenv
+ or
+ virtualenv --python=/usr/bin/python3.4 virtualenv | 8 | 2 | 8 | 0 |
ee8395b757f7f30d2967eff1699a542dd4e72fb0 | fizzbuzz/clojure/fizzbuzz/test/fizzbuzz/core_test.clj | fizzbuzz/clojure/fizzbuzz/test/fizzbuzz/core_test.clj | (ns fizzbuzz.core-test
(:require [clojure.test :refer :all]
[fizzbuzz.core :refer :all]))
(deftest a-test
(testing "FIXME, I fail."
(is (= 0 1))))
| (ns fizzbuzz.core-test
(:require [clojure.test :refer :all]
[fizzbuzz.core :refer :all]))
(deftest fizzbuzz-val-test
(testing "Returns fizz for multiple of 3"
(is (= "fizz" (fizzbuzz-val 9))))
(testing "Returns buzz for multiple of 5"
(is (= "buzz" (fizzbuzz-val 10))))
(testing "Returns fizzbuzz for multiple of 15"
(is (= "fizzbuzz" (fizzbuzz-val 15))))
(testing "Returns stringified number for other values")
(is (= "13" (fizzbuzz-val 13))))
(deftest generic-fizzbuzz-test
(testing "Returns vector of fizzbuzz results"
(is (= ["1" "2" "fizz" "4" "buzz" "fizz" "7" "8" "fizz"]
(generic-fizzbuzz 9)))))
(deftest fizzbuzz-test
(testing "Returns vector of 100 fizzbuzz results"
(is (= 100 (count (fizzbuzz))))))
| Update the clojure fizzbuzz tests. | Update the clojure fizzbuzz tests.
| Clojure | mit | jbranchaud/hello-world,jbranchaud/hello-world,jbranchaud/hello-world,jbranchaud/hello-world,jbranchaud/hello-world,jbranchaud/hello-world | clojure | ## Code Before:
(ns fizzbuzz.core-test
(:require [clojure.test :refer :all]
[fizzbuzz.core :refer :all]))
(deftest a-test
(testing "FIXME, I fail."
(is (= 0 1))))
## Instruction:
Update the clojure fizzbuzz tests.
## Code After:
(ns fizzbuzz.core-test
(:require [clojure.test :refer :all]
[fizzbuzz.core :refer :all]))
(deftest fizzbuzz-val-test
(testing "Returns fizz for multiple of 3"
(is (= "fizz" (fizzbuzz-val 9))))
(testing "Returns buzz for multiple of 5"
(is (= "buzz" (fizzbuzz-val 10))))
(testing "Returns fizzbuzz for multiple of 15"
(is (= "fizzbuzz" (fizzbuzz-val 15))))
(testing "Returns stringified number for other values")
(is (= "13" (fizzbuzz-val 13))))
(deftest generic-fizzbuzz-test
(testing "Returns vector of fizzbuzz results"
(is (= ["1" "2" "fizz" "4" "buzz" "fizz" "7" "8" "fizz"]
(generic-fizzbuzz 9)))))
(deftest fizzbuzz-test
(testing "Returns vector of 100 fizzbuzz results"
(is (= 100 (count (fizzbuzz))))))
| (ns fizzbuzz.core-test
(:require [clojure.test :refer :all]
[fizzbuzz.core :refer :all]))
+ (deftest fizzbuzz-val-test
+ (testing "Returns fizz for multiple of 3"
+ (is (= "fizz" (fizzbuzz-val 9))))
+ (testing "Returns buzz for multiple of 5"
+ (is (= "buzz" (fizzbuzz-val 10))))
+ (testing "Returns fizzbuzz for multiple of 15"
+ (is (= "fizzbuzz" (fizzbuzz-val 15))))
+ (testing "Returns stringified number for other values")
+ (is (= "13" (fizzbuzz-val 13))))
+
+ (deftest generic-fizzbuzz-test
+ (testing "Returns vector of fizzbuzz results"
+ (is (= ["1" "2" "fizz" "4" "buzz" "fizz" "7" "8" "fizz"]
+ (generic-fizzbuzz 9)))))
+
- (deftest a-test
? ^
+ (deftest fizzbuzz-test
? ^^^^^^^^
- (testing "FIXME, I fail."
- (is (= 0 1))))
+ (testing "Returns vector of 100 fizzbuzz results"
+ (is (= 100 (count (fizzbuzz)))))) | 21 | 3 | 18 | 3 |
20036b79e7730f76e77e5ad989bc4e0084bf49c1 | .travis.yml | .travis.yml | sudo: false
dist: xenial
language: python
cache: pip
python:
- "2.7"
- "3.5"
- "3.5"
- "3.6"
- "3.7"
- "nightly"
install:
- pip install -r dev-requirements.txt
- pip install -e .
- pip freeze
script:
- coverage erase
- coverage run --source pykwalify -p -m py.test -v
- py.test
- python setup.py sdist bdist
after_success:
- coverage combine
- coveralls
- "if [[ $TEST_PYCODESTYLE == '1' ]]; then pycodestyle --repeat --show-source --exclude=.venv,.tox,dist,docs,build,*.egg,pykwalify_install .; fi"
matrix:
allow_failures:
- python: "nightly"
| sudo: false
dist: xenial
language: python
cache: pip
python:
- "2.7"
- "3.5"
- "3.5"
- "3.6"
- "3.7"
- "nightly"
install:
- pip install -r dev-requirements.txt
- pip install -e .
- pip freeze
script:
- coverage erase
- coverage run --source pykwalify -p -m py.test -v
- py.test
- python setup.py sdist bdist
after_success:
- coverage combine
- coveralls
- "if [[ $TEST_PYCODESTYLE == '1' ]]; then pycodestyle --repeat --show-source --exclude=.venv,.tox,dist,docs,build,*.egg,pykwalify_install .; fi"
matrix:
allow_failures:
- python: "nightly"
- python: 2.7
env: TEST_PYCODESTYLE=1
- python: 3.6
env: TEST_PYCODESTYLE=1
| Add envrionment variables for testing pycodestyle | Add envrionment variables for testing pycodestyle
| YAML | mit | grokzen/pykwalify | yaml | ## Code Before:
sudo: false
dist: xenial
language: python
cache: pip
python:
- "2.7"
- "3.5"
- "3.5"
- "3.6"
- "3.7"
- "nightly"
install:
- pip install -r dev-requirements.txt
- pip install -e .
- pip freeze
script:
- coverage erase
- coverage run --source pykwalify -p -m py.test -v
- py.test
- python setup.py sdist bdist
after_success:
- coverage combine
- coveralls
- "if [[ $TEST_PYCODESTYLE == '1' ]]; then pycodestyle --repeat --show-source --exclude=.venv,.tox,dist,docs,build,*.egg,pykwalify_install .; fi"
matrix:
allow_failures:
- python: "nightly"
## Instruction:
Add envrionment variables for testing pycodestyle
## Code After:
sudo: false
dist: xenial
language: python
cache: pip
python:
- "2.7"
- "3.5"
- "3.5"
- "3.6"
- "3.7"
- "nightly"
install:
- pip install -r dev-requirements.txt
- pip install -e .
- pip freeze
script:
- coverage erase
- coverage run --source pykwalify -p -m py.test -v
- py.test
- python setup.py sdist bdist
after_success:
- coverage combine
- coveralls
- "if [[ $TEST_PYCODESTYLE == '1' ]]; then pycodestyle --repeat --show-source --exclude=.venv,.tox,dist,docs,build,*.egg,pykwalify_install .; fi"
matrix:
allow_failures:
- python: "nightly"
- python: 2.7
env: TEST_PYCODESTYLE=1
- python: 3.6
env: TEST_PYCODESTYLE=1
| sudo: false
dist: xenial
language: python
cache: pip
python:
- "2.7"
- "3.5"
- "3.5"
- "3.6"
- "3.7"
- "nightly"
install:
- pip install -r dev-requirements.txt
- pip install -e .
- pip freeze
script:
- coverage erase
- coverage run --source pykwalify -p -m py.test -v
- py.test
- python setup.py sdist bdist
after_success:
- coverage combine
- coveralls
- "if [[ $TEST_PYCODESTYLE == '1' ]]; then pycodestyle --repeat --show-source --exclude=.venv,.tox,dist,docs,build,*.egg,pykwalify_install .; fi"
matrix:
allow_failures:
- python: "nightly"
+ - python: 2.7
+ env: TEST_PYCODESTYLE=1
+ - python: 3.6
+ env: TEST_PYCODESTYLE=1 | 4 | 0.125 | 4 | 0 |
bf7341afca3ac5b9418ad2e73d05bed14ddd9293 | README.md | README.md | [](https://travis-ci.org/cparram/ap-networking-proxy-cache)
# ap-networking-proxy-cache
Projecto with Academic Purpose about Proxy cache implementation.
## Usage
### Build
* `$ ./gradlew buildClientJar`: Creates a jar for client source set
* `$ ./gradlew buildNodeJar`: Creates a jar for node source set
### Daemons
**You must create your own topology.properties file**
* `./proxy-node.sh [start|stop|restart] [port] [path]`
### Run Client
* `java -jar build/libs/proxy-cache-client-<version>.jar [command [file]] [options]`
### Assumptions
* The port used to listen nodes will be the master consultations 8181 which can not be used for other purposes
* Is used configuration files used in each node to store the file list (if it is a proxy) and store ips of synchronized proxies (if master). These files are small
### Vagrant
For testing purposes vagrant file is created for creating virtual machines:
#### Steps:
* `$ bundle install`
* `$ bundle exec librarian-chef install`
* `$ vagrant up --provision`: this could take a long time
* `$ vagrant ssh master`: to enter a master machine
* `$ vagrant ssh node1`: to enter a node machine | [](https://travis-ci.org/cparram/ap-networking-proxy-cache)
# ap-networking-proxy-cache
Projecto with Academic Purpose about Proxy cache implementation.
## Usage
### Build
* `$ ./gradlew buildClientJar`: Creates a jar for client source set
* `$ ./gradlew buildNodeJar`: Creates a jar for node source set
### Daemons
**You must create your own topology.properties file**
* `./proxy-node.sh [start|stop|restart] [port] [path]`
### Run Client
* `java -jar build/libs/proxy-cache-client-<version>.jar [command [file]] [options]`
### Assumptions
* The port used to listen nodes will be the master consultations 8181 which can not be used for other purposes
* Is used configuration files used in each node to store the file list (if it is a proxy) and store ips of synchronized proxies (if master). These files are small
* It is not accepted that upload files with the same name.
### Vagrant
For testing purposes vagrant file is created for creating virtual machines:
#### Steps:
* `$ bundle install`
* `$ bundle exec librarian-chef install`
* `$ vagrant up --provision`: this could take a long time
* `$ vagrant ssh master`: to enter a master machine
* `$ vagrant ssh node1`: to enter a node machine | Add a new assumption on readme | Add a new assumption on readme
| Markdown | mit | cparram/ap-networking-proxy-cache,cparram/ap-networking-proxy-cache,cparram/ap-networking-proxy-cache | markdown | ## Code Before:
[](https://travis-ci.org/cparram/ap-networking-proxy-cache)
# ap-networking-proxy-cache
Projecto with Academic Purpose about Proxy cache implementation.
## Usage
### Build
* `$ ./gradlew buildClientJar`: Creates a jar for client source set
* `$ ./gradlew buildNodeJar`: Creates a jar for node source set
### Daemons
**You must create your own topology.properties file**
* `./proxy-node.sh [start|stop|restart] [port] [path]`
### Run Client
* `java -jar build/libs/proxy-cache-client-<version>.jar [command [file]] [options]`
### Assumptions
* The port used to listen nodes will be the master consultations 8181 which can not be used for other purposes
* Is used configuration files used in each node to store the file list (if it is a proxy) and store ips of synchronized proxies (if master). These files are small
### Vagrant
For testing purposes vagrant file is created for creating virtual machines:
#### Steps:
* `$ bundle install`
* `$ bundle exec librarian-chef install`
* `$ vagrant up --provision`: this could take a long time
* `$ vagrant ssh master`: to enter a master machine
* `$ vagrant ssh node1`: to enter a node machine
## Instruction:
Add a new assumption on readme
## Code After:
[](https://travis-ci.org/cparram/ap-networking-proxy-cache)
# ap-networking-proxy-cache
Projecto with Academic Purpose about Proxy cache implementation.
## Usage
### Build
* `$ ./gradlew buildClientJar`: Creates a jar for client source set
* `$ ./gradlew buildNodeJar`: Creates a jar for node source set
### Daemons
**You must create your own topology.properties file**
* `./proxy-node.sh [start|stop|restart] [port] [path]`
### Run Client
* `java -jar build/libs/proxy-cache-client-<version>.jar [command [file]] [options]`
### Assumptions
* The port used to listen nodes will be the master consultations 8181 which can not be used for other purposes
* Is used configuration files used in each node to store the file list (if it is a proxy) and store ips of synchronized proxies (if master). These files are small
* It is not accepted that upload files with the same name.
### Vagrant
For testing purposes vagrant file is created for creating virtual machines:
#### Steps:
* `$ bundle install`
* `$ bundle exec librarian-chef install`
* `$ vagrant up --provision`: this could take a long time
* `$ vagrant ssh master`: to enter a master machine
* `$ vagrant ssh node1`: to enter a node machine | [](https://travis-ci.org/cparram/ap-networking-proxy-cache)
# ap-networking-proxy-cache
Projecto with Academic Purpose about Proxy cache implementation.
## Usage
### Build
* `$ ./gradlew buildClientJar`: Creates a jar for client source set
* `$ ./gradlew buildNodeJar`: Creates a jar for node source set
### Daemons
**You must create your own topology.properties file**
* `./proxy-node.sh [start|stop|restart] [port] [path]`
### Run Client
* `java -jar build/libs/proxy-cache-client-<version>.jar [command [file]] [options]`
### Assumptions
* The port used to listen nodes will be the master consultations 8181 which can not be used for other purposes
* Is used configuration files used in each node to store the file list (if it is a proxy) and store ips of synchronized proxies (if master). These files are small
+ * It is not accepted that upload files with the same name.
### Vagrant
For testing purposes vagrant file is created for creating virtual machines:
#### Steps:
* `$ bundle install`
* `$ bundle exec librarian-chef install`
* `$ vagrant up --provision`: this could take a long time
* `$ vagrant ssh master`: to enter a master machine
* `$ vagrant ssh node1`: to enter a node machine | 1 | 0.032258 | 1 | 0 |
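The four numeric columns closing the record above correspond, per the schema header, to `diff_length`, `relative_diff_length`, `n_lines_added`, and `n_lines_deleted`. A hedged stdlib sketch of how such counts can be derived from a line diff; the dataset's exact formula is an assumption, so this is shown on a toy input rather than on the record itself:

```python
import difflib

def diff_stats(old: str, new: str):
    """Rough line-diff statistics in the spirit of the record columns.
    The precise definitions used by the dataset are assumptions here."""
    diff = list(difflib.ndiff(old.splitlines(), new.splitlines()))
    added = sum(1 for line in diff if line.startswith("+ "))
    deleted = sum(1 for line in diff if line.startswith("- "))
    length = added + deleted  # total changed lines
    relative = length / max(len(old.splitlines()), 1)
    return added, deleted, length, relative

# One added line against a two-line original:
# 1 added, 0 deleted, diff length 1, relative length 1/2.
print(diff_stats("a\nb", "a\nb\nc"))
```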
0a09554a503394fd1fe8555100049f59e6aa7ddb | src/client/react/user/components/Results/HideResults.js | src/client/react/user/components/Results/HideResults.js | import React from 'react';
import autobind from 'core-decorators/es/autobind';
import { NonIdealState, Button, Intent } from '@blueprintjs/core';
import LoadingSpinner from '../../../components/LoadingSpinner';
@autobind
class HideResults extends React.Component {
refreshPage() {
location.reload();
}
render() {
return (
<div>
<h2 style={{textAlign:'center'}}>Who will reign victorious?</h2>
<NonIdealState
description='Results will be out soon. Check this page again in a few minutes!'
visual={<LoadingSpinner hideText/>}
action={<Button intent={Intent.PRIMARY} text='Refresh page' onClick={this.refreshPage}/>}/>
</div>
);
}
}
export default HideResults;
| import React from 'react';
import autobind from 'core-decorators/es/autobind';
import { NonIdealState, Button, Intent } from '@blueprintjs/core';
import LoadingSpinner from '../../../components/LoadingSpinner';
@autobind
class HideResults extends React.Component {
refreshPage() {
location.reload();
}
render() {
return (
<div>
<h2 style={{textAlign:'center'}}>Who will reign victorious?</h2>
<NonIdealState
title='Results will be out soon'
description='Check this page again in a few minutes!'
visual={<LoadingSpinner hideText/>}
action={<Button intent={Intent.PRIMARY} text='Refresh page' onClick={this.refreshPage}/>}/>
</div>
);
}
}
export default HideResults;
| Improve pending results screen text | [FIX] Improve pending results screen text
| JavaScript | mit | bwyap/ptc-amazing-g-race,bwyap/ptc-amazing-g-race | javascript | ## Code Before:
import React from 'react';
import autobind from 'core-decorators/es/autobind';
import { NonIdealState, Button, Intent } from '@blueprintjs/core';
import LoadingSpinner from '../../../components/LoadingSpinner';
@autobind
class HideResults extends React.Component {
refreshPage() {
location.reload();
}
render() {
return (
<div>
<h2 style={{textAlign:'center'}}>Who will reign victorious?</h2>
<NonIdealState
description='Results will be out soon. Check this page again in a few minutes!'
visual={<LoadingSpinner hideText/>}
action={<Button intent={Intent.PRIMARY} text='Refresh page' onClick={this.refreshPage}/>}/>
</div>
);
}
}
export default HideResults;
## Instruction:
[FIX] Improve pending results screen text
## Code After:
import React from 'react';
import autobind from 'core-decorators/es/autobind';
import { NonIdealState, Button, Intent } from '@blueprintjs/core';
import LoadingSpinner from '../../../components/LoadingSpinner';
@autobind
class HideResults extends React.Component {
refreshPage() {
location.reload();
}
render() {
return (
<div>
<h2 style={{textAlign:'center'}}>Who will reign victorious?</h2>
<NonIdealState
title='Results will be out soon'
description='Check this page again in a few minutes!'
visual={<LoadingSpinner hideText/>}
action={<Button intent={Intent.PRIMARY} text='Refresh page' onClick={this.refreshPage}/>}/>
</div>
);
}
}
export default HideResults;
| import React from 'react';
import autobind from 'core-decorators/es/autobind';
import { NonIdealState, Button, Intent } from '@blueprintjs/core';
import LoadingSpinner from '../../../components/LoadingSpinner';
@autobind
class HideResults extends React.Component {
refreshPage() {
location.reload();
}
render() {
return (
<div>
<h2 style={{textAlign:'center'}}>Who will reign victorious?</h2>
<NonIdealState
+ title='Results will be out soon'
- description='Results will be out soon. Check this page again in a few minutes!'
? --------------------------
+ description='Check this page again in a few minutes!'
visual={<LoadingSpinner hideText/>}
action={<Button intent={Intent.PRIMARY} text='Refresh page' onClick={this.refreshPage}/>}/>
</div>
);
}
}
export default HideResults; | 3 | 0.111111 | 2 | 1 |
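The `?` guide lines in the diff above, with `-` and `^` markers pointing at changed characters inside a line, are the signature of Python's `difflib.ndiff`; the diffs in these records appear to be produced that way. A minimal reproduction on the standard library's documented example inputs:

```python
import difflib

# ndiff emits '- old' / '+ new' pairs plus optional '? ' hint lines
# whose '^', '-' and '+' markers flag intraline changes, the same
# notation seen in the diff cells of these records.
old = ["one", "two", "three"]
new = ["ore", "tree", "emu"]
for line in difflib.ndiff(old, new):
    print(line)
```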
b36dabefa3e70c07e622fa4dcc54bdc7c1eca59f | app/src/main/res/layout/adapter_videos.xml | app/src/main/res/layout/adapter_videos.xml | <?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="240dp"
android:layout_height="180dp">
<ImageView
android:id="@+id/image"
android:layout_width="240dp"
android:layout_height="180dp"
android:contentDescription="@string/backdrop"/>
<TextView
android:id="@+id/title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal|top"
android:gravity="center"
android:textAppearance="@android:style/TextAppearance.Medium"
android:textColor="@color/colorAccent"
tools:text="Title"/>
<TextView
android:id="@+id/type"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal|bottom"
android:textAppearance="@android:style/TextAppearance.Large"
android:textColor="@color/colorAccent"
tools:text="Type"/>
</FrameLayout> | <?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="240dp"
android:layout_height="180dp">
<ImageView
android:id="@+id/image"
android:layout_width="240dp"
android:layout_height="180dp"
android:contentDescription="@string/backdrop"/>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_gravity="bottom"
android:background="#B3000000"
android:orientation="vertical"
android:padding="4dp">
<TextView
android:id="@+id/type"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:singleLine="true"
android:textAppearance="@android:style/TextAppearance.Medium"
android:textColor="@color/colorAccent"
tools:text="Type"/>
<TextView
android:id="@+id/title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:gravity="center"
android:singleLine="true"
android:textAppearance="@android:style/TextAppearance.Small"
android:textColor="@color/colorAccent"
tools:text="Title"/>
</LinearLayout>
</FrameLayout> | Change layout for Adapter Videos | [UPDATE] Change layout for Adapter Videos
| XML | apache-2.0 | Skalaw/Video-Training-App | xml | ## Code Before:
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="240dp"
android:layout_height="180dp">
<ImageView
android:id="@+id/image"
android:layout_width="240dp"
android:layout_height="180dp"
android:contentDescription="@string/backdrop"/>
<TextView
android:id="@+id/title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal|top"
android:gravity="center"
android:textAppearance="@android:style/TextAppearance.Medium"
android:textColor="@color/colorAccent"
tools:text="Title"/>
<TextView
android:id="@+id/type"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal|bottom"
android:textAppearance="@android:style/TextAppearance.Large"
android:textColor="@color/colorAccent"
tools:text="Type"/>
</FrameLayout>
## Instruction:
[UPDATE] Change layout for Adapter Videos
## Code After:
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="240dp"
android:layout_height="180dp">
<ImageView
android:id="@+id/image"
android:layout_width="240dp"
android:layout_height="180dp"
android:contentDescription="@string/backdrop"/>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_gravity="bottom"
android:background="#B3000000"
android:orientation="vertical"
android:padding="4dp">
<TextView
android:id="@+id/type"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:singleLine="true"
android:textAppearance="@android:style/TextAppearance.Medium"
android:textColor="@color/colorAccent"
tools:text="Type"/>
<TextView
android:id="@+id/title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:gravity="center"
android:singleLine="true"
android:textAppearance="@android:style/TextAppearance.Small"
android:textColor="@color/colorAccent"
tools:text="Title"/>
</LinearLayout>
</FrameLayout> | <?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="240dp"
android:layout_height="180dp">
<ImageView
android:id="@+id/image"
android:layout_width="240dp"
android:layout_height="180dp"
android:contentDescription="@string/backdrop"/>
+ <LinearLayout
- <TextView
- android:id="@+id/title"
- android:layout_width="wrap_content"
? ^ -------
+ android:layout_width="match_parent"
? ^^^^^^^^
android:layout_height="wrap_content"
- android:layout_gravity="center_horizontal|top"
? ^^^ -------------- ^
+ android:layout_gravity="bottom"
? ^^ ^
+ android:background="#B3000000"
+ android:orientation="vertical"
+ android:padding="4dp">
- android:gravity="center"
- android:textAppearance="@android:style/TextAppearance.Medium"
- android:textColor="@color/colorAccent"
- tools:text="Title"/>
- <TextView
+ <TextView
? ++++
- android:id="@+id/type"
+ android:id="@+id/type"
? ++++
- android:layout_width="wrap_content"
+ android:layout_width="wrap_content"
? ++++
- android:layout_height="wrap_content"
+ android:layout_height="wrap_content"
? ++++
- android:layout_gravity="center_horizontal|bottom"
+ android:singleLine="true"
- android:textAppearance="@android:style/TextAppearance.Large"
? ^^^^
+ android:textAppearance="@android:style/TextAppearance.Medium"
? ++++ ^ ++++
- android:textColor="@color/colorAccent"
+ android:textColor="@color/colorAccent"
? ++++
- tools:text="Type"/>
+ tools:text="Type"/>
? ++++
+
+ <TextView
+ android:id="@+id/title"
+ android:layout_width="wrap_content"
+ android:layout_height="wrap_content"
+ android:gravity="center"
+ android:singleLine="true"
+ android:textAppearance="@android:style/TextAppearance.Small"
+ android:textColor="@color/colorAccent"
+ tools:text="Title"/>
+ </LinearLayout>
</FrameLayout> | 41 | 1.322581 | 25 | 16 |
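A quick structural check of the reworked layout in the record above, using only the standard library. The XML here is abbreviated to the elements that matter (attributes trimmed), so it is a sketch of the layout's shape, not the full resource file:

```python
import xml.etree.ElementTree as ET

# Skeleton of the new adapter_videos.xml: both labels now sit inside
# a LinearLayout overlay instead of floating directly on the FrameLayout.
layout = """
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android">
    <ImageView android:id="@+id/image"/>
    <LinearLayout android:orientation="vertical">
        <TextView android:id="@+id/type"/>
        <TextView android:id="@+id/title"/>
    </LinearLayout>
</FrameLayout>
"""

root = ET.fromstring(layout)
overlay = root.find("LinearLayout")
print([child.tag for child in overlay])  # the two TextView labels
```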
55336f22bd7d2b406bc3846acc91d622c7b3ba03 | src/components/pool/resource-index.js | src/components/pool/resource-index.js | import lunr from 'lunr';
export default lunr(function index() {
this.field('name', { boost: 100 });
this.field('location', { boost: 10 });
this.field('website');
this.ref('key');
});
| import elasticlunr from 'elasticlunr';
elasticlunr.clearStopWords();
export const index = elasticlunr(function index() {
this.addField('name');
this.addField('location');
this.addField('website');
this.setRef('key');
});
export const booster = {
fields: {
name: { boost: 3 },
location: { boost: 2 },
website: { boost: 1 },
},
expand: true,
};
| Update index from lunr to elasticlunr | Update index from lunr to elasticlunr
| JavaScript | mit | tuelsch/poolparty,tuelsch/poolparty | javascript | ## Code Before:
import lunr from 'lunr';
export default lunr(function index() {
this.field('name', { boost: 100 });
this.field('location', { boost: 10 });
this.field('website');
this.ref('key');
});
## Instruction:
Update index from lunr to elasticlunr
## Code After:
import elasticlunr from 'elasticlunr';
elasticlunr.clearStopWords();
export const index = elasticlunr(function index() {
this.addField('name');
this.addField('location');
this.addField('website');
this.setRef('key');
});
export const booster = {
fields: {
name: { boost: 3 },
location: { boost: 2 },
website: { boost: 1 },
},
expand: true,
};
| - import lunr from 'lunr';
+ import elasticlunr from 'elasticlunr';
? +++++++ +++++++
+ elasticlunr.clearStopWords();
+
- export default lunr(function index() {
? ^ ^^ ^
+ export const index = elasticlunr(function index() {
? ++++++++ ^^^^^^ ^ ^^
- this.field('name', { boost: 100 });
- this.field('location', { boost: 10 });
+ this.addField('name');
+ this.addField('location');
- this.field('website');
? ^
+ this.addField('website');
? ^^^^
- this.ref('key');
? ^
+ this.setRef('key');
? ^^^^
});
+
+ export const booster = {
+ fields: {
+ name: { boost: 3 },
+ location: { boost: 2 },
+ website: { boost: 1 },
+ },
+ expand: true,
+ }; | 23 | 2.875 | 17 | 6 |
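The `booster` object added in the record above is meant to be passed to elasticlunr's search call so that matches in `name` outweigh matches in `location` and `website`. As a concept sketch only, in plain Python rather than the elasticlunr API, and with a much cruder scoring rule than its tf-idf:

```python
def boosted_score(field_hits, boosts):
    """Toy scoring: each field's hit count times its boost factor."""
    return sum(hits * boosts.get(field, 1) for field, hits in field_hits.items())

boosts = {"name": 3, "location": 2, "website": 1}

# A name hit outweighs a location hit, matching the intent of the config.
print(boosted_score({"name": 1}, boosts))      # 3
print(boosted_score({"location": 1}, boosts))  # 2
```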
4c3ed3ca379257ce8bacfffa69978669cb80b1e7 | lib/letter_opener.rb | lib/letter_opener.rb | require "fileutils"
require "digest/sha1"
require "cgi"
require "letter_opener/message"
require "letter_opener/delivery_method"
require "letter_opener/railtie" if defined? Rails
| require "fileutils"
require "digest/sha1"
require "cgi"
require "launchy"
require "letter_opener/message"
require "letter_opener/delivery_method"
require "letter_opener/railtie" if defined? Rails
| Add a require on launchy | Add a require on launchy
| Ruby | mit | ryanb/letter_opener,fheisler/letter_opener,e-accent/letter_opener,arxpoetica/letter_opener_single_tab,arxpoetica/letter_opener_single_tab,e-accent/letter_opener,ryanb/letter_opener,fheisler/letter_opener | ruby | ## Code Before:
require "fileutils"
require "digest/sha1"
require "cgi"
require "letter_opener/message"
require "letter_opener/delivery_method"
require "letter_opener/railtie" if defined? Rails
## Instruction:
Add a require on launchy
## Code After:
require "fileutils"
require "digest/sha1"
require "cgi"
require "launchy"
require "letter_opener/message"
require "letter_opener/delivery_method"
require "letter_opener/railtie" if defined? Rails
| require "fileutils"
require "digest/sha1"
require "cgi"
+ require "launchy"
require "letter_opener/message"
require "letter_opener/delivery_method"
require "letter_opener/railtie" if defined? Rails | 1 | 0.142857 | 1 | 0 |
34507b5e3631f67b38f5c22902d995053437302a | src/discordcr/rest.cr | src/discordcr/rest.cr | require "http/client"
require "openssl/ssl/context"
require "./mappings/*"
require "./version"
module Discord
module REST
SSL_CONTEXT = OpenSSL::SSL::Context::Client.new
USER_AGENT = "DiscordBot (https://github.com/meew0/discordcr, #{Discord::VERSION})"
API_BASE = "https://discordapp.com/api/v6"
def request(endpoint_key : Symbol, method : String, path : String, headers : HTTP::Headers, body : String?)
headers["Authorization"] = @token
headers["User-Agent"] = USER_AGENT
HTTP::Client.exec(method: method, url: API_BASE + path, headers: headers, body: body, tls: SSL_CONTEXT)
end
def get_gateway
response = request(
:get_gateway,
"GET",
"/gateway",
HTTP::Headers.new,
nil
)
GatewayResponse.from_json(response.body)
end
def create_message(channel_id, content)
response = request(
:create_message,
"POST",
"/channels/#{channel_id}/messages",
HTTP::Headers{"Content-Type" => "application/json"},
{content: content}.to_json
)
Message.from_json(response.body)
end
end
end
| require "http/client"
require "openssl/ssl/context"
require "./mappings/*"
require "./version"
module Discord
module REST
SSL_CONTEXT = OpenSSL::SSL::Context::Client.new
USER_AGENT = "DiscordBot (https://github.com/meew0/discordcr, #{Discord::VERSION})"
API_BASE = "https://discordapp.com/api/v6"
def request(endpoint_key : Symbol, method : String, path : String, headers : HTTP::Headers, body : String?)
headers["Authorization"] = @token
headers["User-Agent"] = USER_AGENT
HTTP::Client.exec(method: method, url: API_BASE + path, headers: headers, body: body, tls: SSL_CONTEXT)
end
def get_gateway
response = request(
:get_gateway,
"GET",
"/gateway",
HTTP::Headers.new,
nil
)
GatewayResponse.from_json(response.body)
end
def get_channel(channel_id)
response = request(
:get_channel,
"GET",
"/channels/#{channel_id}",
HTTP::Headers.new,
nil
)
Channel.from_json(response.body)
end
def create_message(channel_id, content)
response = request(
:create_message,
"POST",
"/channels/#{channel_id}/messages",
HTTP::Headers{"Content-Type" => "application/json"},
{content: content}.to_json
)
Message.from_json(response.body)
end
end
end
| Add a method for the get channel endpoint | Add a method for the get channel endpoint
| Crystal | mit | meew0/discordcr | crystal | ## Code Before:
require "http/client"
require "openssl/ssl/context"
require "./mappings/*"
require "./version"
module Discord
module REST
SSL_CONTEXT = OpenSSL::SSL::Context::Client.new
USER_AGENT = "DiscordBot (https://github.com/meew0/discordcr, #{Discord::VERSION})"
API_BASE = "https://discordapp.com/api/v6"
def request(endpoint_key : Symbol, method : String, path : String, headers : HTTP::Headers, body : String?)
headers["Authorization"] = @token
headers["User-Agent"] = USER_AGENT
HTTP::Client.exec(method: method, url: API_BASE + path, headers: headers, body: body, tls: SSL_CONTEXT)
end
def get_gateway
response = request(
:get_gateway,
"GET",
"/gateway",
HTTP::Headers.new,
nil
)
GatewayResponse.from_json(response.body)
end
def create_message(channel_id, content)
response = request(
:create_message,
"POST",
"/channels/#{channel_id}/messages",
HTTP::Headers{"Content-Type" => "application/json"},
{content: content}.to_json
)
Message.from_json(response.body)
end
end
end
## Instruction:
Add a method for the get channel endpoint
## Code After:
require "http/client"
require "openssl/ssl/context"
require "./mappings/*"
require "./version"
module Discord
module REST
SSL_CONTEXT = OpenSSL::SSL::Context::Client.new
USER_AGENT = "DiscordBot (https://github.com/meew0/discordcr, #{Discord::VERSION})"
API_BASE = "https://discordapp.com/api/v6"
def request(endpoint_key : Symbol, method : String, path : String, headers : HTTP::Headers, body : String?)
headers["Authorization"] = @token
headers["User-Agent"] = USER_AGENT
HTTP::Client.exec(method: method, url: API_BASE + path, headers: headers, body: body, tls: SSL_CONTEXT)
end
def get_gateway
response = request(
:get_gateway,
"GET",
"/gateway",
HTTP::Headers.new,
nil
)
GatewayResponse.from_json(response.body)
end
def get_channel(channel_id)
response = request(
:get_channel,
"GET",
"/channels/#{channel_id}",
HTTP::Headers.new,
nil
)
Channel.from_json(response.body)
end
def create_message(channel_id, content)
response = request(
:create_message,
"POST",
"/channels/#{channel_id}/messages",
HTTP::Headers{"Content-Type" => "application/json"},
{content: content}.to_json
)
Message.from_json(response.body)
end
end
end
| require "http/client"
require "openssl/ssl/context"
require "./mappings/*"
require "./version"
module Discord
module REST
SSL_CONTEXT = OpenSSL::SSL::Context::Client.new
USER_AGENT = "DiscordBot (https://github.com/meew0/discordcr, #{Discord::VERSION})"
API_BASE = "https://discordapp.com/api/v6"
def request(endpoint_key : Symbol, method : String, path : String, headers : HTTP::Headers, body : String?)
headers["Authorization"] = @token
headers["User-Agent"] = USER_AGENT
HTTP::Client.exec(method: method, url: API_BASE + path, headers: headers, body: body, tls: SSL_CONTEXT)
end
def get_gateway
response = request(
:get_gateway,
"GET",
"/gateway",
HTTP::Headers.new,
nil
)
GatewayResponse.from_json(response.body)
end
+ def get_channel(channel_id)
+ response = request(
+ :get_channel,
+ "GET",
+ "/channels/#{channel_id}",
+ HTTP::Headers.new,
+ nil
+ )
+
+ Channel.from_json(response.body)
+ end
+
def create_message(channel_id, content)
response = request(
:create_message,
"POST",
"/channels/#{channel_id}/messages",
HTTP::Headers{"Content-Type" => "application/json"},
{content: content}.to_json
)
Message.from_json(response.body)
end
end
end | 12 | 0.272727 | 12 | 0 |
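The Crystal module in the record above routes every endpoint method through one shared `request` helper that injects the `Authorization` and `User-Agent` headers. The same pattern, translated into a Python sketch with placeholder token and user-agent values and no network call made:

```python
API_BASE = "https://discordapp.com/api/v6"

def build_request(token, method, path, headers=None, body=None):
    """Assemble the pieces a single shared HTTP helper would send.
    Token and user-agent values here are placeholders, not credentials."""
    merged = {"Authorization": token, "User-Agent": "ExampleBot (sketch)"}
    merged.update(headers or {})
    return {"method": method, "url": API_BASE + path,
            "headers": merged, "body": body}

# Every endpoint wrapper (get_gateway, get_channel, ...) reduces to
# a call like this with its own method and path.
req = build_request("Bot TOKEN", "GET", "/channels/1234")
print(req["url"])
```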
6fa003f5f67b3055180f66775eef1384d6de6534 | supervisord-monitor/README.md | supervisord-monitor/README.md | https://github.com/mlazarov/supervisord-monitor
# repo
https://github.com/banjocat/docker-devops-examples/tree/master/supervisord-monitor
## Required Environmental variables
| Variable | Syntax |What it does|
|----------|--------|------|
|SERVERS|server1=127.0.0.1,server2=localhost| List of servers by display_name=url|
|USERNAME|admin|username to login remove supervisord|
|PASSWORD|admin|password to login
## Optional Environmental variables
| Variable | Default |What it does|
|----------|--------|------|
|PORT| 9001 | Port to access supervisord|
## Example
docker run -e URL=www.google.com banjocat/wrk
| https://github.com/mlazarov/supervisord-monitor
# repo
https://github.com/banjocat/docker-devops-examples/tree/master/supervisord-monitor
## Required Environmental variables
| Variable | Syntax |What it does|
|----------|--------|------|
|SERVERS|server1=127.0.0.1,server2=localhost| List of servers by display_name=url|
|USERNAME|admin|username to login remove supervisord|
|PASSWORD|admin|password to login
## Optional Environmental variables
| Variable | Default |What it does|
|----------|--------|------|
|PORT| 9001 | Port to access supervisord|
## Example
docker run -e SERVERS="server1=localhost",USERNAME=admin,PASSWORD=admin banjocat/supervisord-monitor
| Update readme with better example | Update readme with better example
| Markdown | mit | banjocat/docker-devops-examples,banjocat/docker-devops-examples,banjocat/docker-devops-examples | markdown | ## Code Before:
https://github.com/mlazarov/supervisord-monitor
# repo
https://github.com/banjocat/docker-devops-examples/tree/master/supervisord-monitor
## Required Environmental variables
| Variable | Syntax |What it does|
|----------|--------|------|
|SERVERS|server1=127.0.0.1,server2=localhost| List of servers by display_name=url|
|USERNAME|admin|username to login remove supervisord|
|PASSWORD|admin|password to login
## Optional Environmental variables
| Variable | Default |What it does|
|----------|--------|------|
|PORT| 9001 | Port to access supervisord|
## Example
docker run -e URL=www.google.com banjocat/wrk
## Instruction:
Update readme with better example
## Code After:
https://github.com/mlazarov/supervisord-monitor
# repo
https://github.com/banjocat/docker-devops-examples/tree/master/supervisord-monitor
## Required Environmental variables
| Variable | Syntax |What it does|
|----------|--------|------|
|SERVERS|server1=127.0.0.1,server2=localhost| List of servers by display_name=url|
|USERNAME|admin|username to login remove supervisord|
|PASSWORD|admin|password to login
## Optional Environmental variables
| Variable | Default |What it does|
|----------|--------|------|
|PORT| 9001 | Port to access supervisord|
## Example
docker run -e SERVERS="server1=localhost",USERNAME=admin,PASSWORD=admin banjocat/supervisord-monitor
| ERROR: type should be string, got " https://github.com/mlazarov/supervisord-monitor\n # repo\n https://github.com/banjocat/docker-devops-examples/tree/master/supervisord-monitor\n \n ## Required Environmental variables\n | Variable | Syntax |What it does|\n |----------|--------|------|\n |SERVERS|server1=127.0.0.1,server2=localhost| List of servers by display_name=url|\n |USERNAME|admin|username to login remove supervisord|\n |PASSWORD|admin|password to login\n \n ## Optional Environmental variables\n | Variable | Default |What it does|\n |----------|--------|------|\n |PORT| 9001 | Port to access supervisord|\n \n \n \n ## Example\n \n- docker run -e URL=www.google.com banjocat/wrk\n+ docker run -e SERVERS=\"server1=localhost\",USERNAME=admin,PASSWORD=admin banjocat/supervisord-monitor" | 2 | 0.095238 | 1 | 1 |
f590db8181f2a0bc2ff0c634b1e453dd052ea4d9 | Casks/font-anonymous-pro.rb | Casks/font-anonymous-pro.rb | class FontAnonymousPro < Cask
url 'http://www.marksimonson.com/assets/content/fonts/AnonymousPro-1.002.zip'
homepage 'http://www.marksimonson.com/fonts/view/anonymous-pro'
version '1.002'
sha1 '87651de93312fdd3f27e50741d2a0630a41ec30d'
font 'AnonymousPro-1.002.001/Anonymous Pro B.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro BI.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro I.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro.ttf'
end
| class FontAnonymousPro < Cask
url 'http://www.marksimonson.com/assets/content/fonts/AnonymousPro-1.002.zip'
homepage 'http://www.marksimonson.com/fonts/view/anonymous-pro'
version '1.002'
sha256 '86665847a51cdfb58a1e1dfd8b1ba33f183485affe50b53e3304f63d3d3552ab'
font 'AnonymousPro-1.002.001/Anonymous Pro B.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro BI.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro I.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro.ttf'
end
| Update Anonymous Pro to sha256 checksums | Update Anonymous Pro to sha256 checksums
| Ruby | bsd-2-clause | bkudria/homebrew-fonts,andrewsardone/homebrew-fonts,bkudria/homebrew-fonts,andrewsardone/homebrew-fonts,sscotth/homebrew-fonts,mtakayuki/homebrew-fonts,herblover/homebrew-fonts,mtakayuki/homebrew-fonts,zorosteven/homebrew-fonts,elmariofredo/homebrew-fonts,kkung/homebrew-fonts,victorpopkov/homebrew-fonts,caskroom/homebrew-fonts,rstacruz/homebrew-fonts,RJHsiao/homebrew-fonts,joeyhoer/homebrew-fonts,guerrero/homebrew-fonts,alerque/homebrew-fonts,scw/homebrew-fonts,ahbeng/homebrew-fonts,psibre/homebrew-fonts,kostasdizas/homebrew-fonts,RJHsiao/homebrew-fonts,kostasdizas/homebrew-fonts,psibre/homebrew-fonts,alerque/homebrew-fonts,unasuke/homebrew-fonts,rstacruz/homebrew-fonts,ahbeng/homebrew-fonts,kkung/homebrew-fonts,scw/homebrew-fonts,caskroom/homebrew-fonts,victorpopkov/homebrew-fonts,herblover/homebrew-fonts,joeyhoer/homebrew-fonts,sscotth/homebrew-fonts,elmariofredo/homebrew-fonts,zorosteven/homebrew-fonts,guerrero/homebrew-fonts | ruby | ## Code Before:
class FontAnonymousPro < Cask
url 'http://www.marksimonson.com/assets/content/fonts/AnonymousPro-1.002.zip'
homepage 'http://www.marksimonson.com/fonts/view/anonymous-pro'
version '1.002'
sha1 '87651de93312fdd3f27e50741d2a0630a41ec30d'
font 'AnonymousPro-1.002.001/Anonymous Pro B.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro BI.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro I.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro.ttf'
end
## Instruction:
Update Anonymous Pro to sha256 checksums
## Code After:
class FontAnonymousPro < Cask
url 'http://www.marksimonson.com/assets/content/fonts/AnonymousPro-1.002.zip'
homepage 'http://www.marksimonson.com/fonts/view/anonymous-pro'
version '1.002'
sha256 '86665847a51cdfb58a1e1dfd8b1ba33f183485affe50b53e3304f63d3d3552ab'
font 'AnonymousPro-1.002.001/Anonymous Pro B.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro BI.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro I.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro.ttf'
end
| class FontAnonymousPro < Cask
url 'http://www.marksimonson.com/assets/content/fonts/AnonymousPro-1.002.zip'
homepage 'http://www.marksimonson.com/fonts/view/anonymous-pro'
version '1.002'
- sha1 '87651de93312fdd3f27e50741d2a0630a41ec30d'
+ sha256 '86665847a51cdfb58a1e1dfd8b1ba33f183485affe50b53e3304f63d3d3552ab'
font 'AnonymousPro-1.002.001/Anonymous Pro B.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro BI.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro I.ttf'
font 'AnonymousPro-1.002.001/Anonymous Pro.ttf'
end | 2 | 0.2 | 1 | 1 |
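The cask change above swaps a `sha1` pin for a `sha256` pin; both are hex digests of the downloaded archive, just computed with different hash functions. With the standard library on a stand-in input (the real zip is not fetched here):

```python
import hashlib

data = b"abc"  # stand-in for the downloaded archive bytes
# SHA-1 yields a 40-hex-char digest, SHA-256 a 64-hex-char digest.
print("sha1   ", hashlib.sha1(data).hexdigest())
print("sha256 ", hashlib.sha256(data).hexdigest())
```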
ef42bbd15d3bba0571810451ed2d6062218a2f29 | module/User/src/User/Form/UserForm.php | module/User/src/User/Form/UserForm.php | <?php
namespace User\Form;
use Zend\Form\Form;
class UserForm extends Form
{
public function __construct($name = null)
{
parent::__construct('user');
$this->setAttribute('method', 'post');
$this->add(array(
'name' => 'id',
'type' => 'Hidden',
));
$this->add(array(
'name' => 'username',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'email',
'type' => 'Email',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'full_name',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'password',
'type' => 'Password',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'submit',
'type' => 'Submit',
'attributes' => array(
'value' => 'Submit',
'id' => 'submit-btn',
'class' => 'btn btn-primary',
),
));
}
} | <?php
namespace User\Form;
use Zend\Form\Form;
class UserForm extends Form
{
public function __construct($name = null)
{
parent::__construct('user');
$this->setAttribute('method', 'post');
$this->add(array(
'name' => 'id',
'type' => 'Hidden',
));
$this->add(array(
'name' => 'username',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'email',
'type' => 'Email',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'full_name',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'password',
'type' => 'Password',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'confirm_password',
'type' => 'Password',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'remember_me',
'type' => 'checkbox',
));
$this->add(array(
'name' => 'submit',
'type' => 'Submit',
'attributes' => array(
'value' => 'Submit',
'id' => 'submit-btn',
'class' => 'btn btn-primary',
),
));
}
} | Add field `confirm_password` and `remember_me` to form | Add field `confirm_password` and `remember_me` to form
| PHP | bsd-3-clause | BichPhuongUITTeam/shop-ban-do-go,BichPhuongUITTeam/shop-ban-do-go | php | ## Code Before:
<?php
namespace User\Form;
use Zend\Form\Form;
class UserForm extends Form
{
public function __construct($name = null)
{
parent::__construct('user');
$this->setAttribute('method', 'post');
$this->add(array(
'name' => 'id',
'type' => 'Hidden',
));
$this->add(array(
'name' => 'username',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'email',
'type' => 'Email',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'full_name',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'password',
'type' => 'Password',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'submit',
'type' => 'Submit',
'attributes' => array(
'value' => 'Submit',
'id' => 'submit-btn',
'class' => 'btn btn-primary',
),
));
}
}
## Instruction:
Add field `confirm_password` and `remember_me` to form
## Code After:
<?php
namespace User\Form;
use Zend\Form\Form;
class UserForm extends Form
{
public function __construct($name = null)
{
parent::__construct('user');
$this->setAttribute('method', 'post');
$this->add(array(
'name' => 'id',
'type' => 'Hidden',
));
$this->add(array(
'name' => 'username',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'email',
'type' => 'Email',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'full_name',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'password',
'type' => 'Password',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'confirm_password',
'type' => 'Password',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'remember_me',
'type' => 'checkbox',
));
$this->add(array(
'name' => 'submit',
'type' => 'Submit',
'attributes' => array(
'value' => 'Submit',
'id' => 'submit-btn',
'class' => 'btn btn-primary',
),
));
}
} | <?php
namespace User\Form;
use Zend\Form\Form;
class UserForm extends Form
{
public function __construct($name = null)
{
parent::__construct('user');
$this->setAttribute('method', 'post');
$this->add(array(
'name' => 'id',
'type' => 'Hidden',
));
$this->add(array(
'name' => 'username',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'email',
'type' => 'Email',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'full_name',
'type' => 'Text',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
'name' => 'password',
'type' => 'Password',
'attributes' => array(
'class' => 'form-control',
),
));
$this->add(array(
+ 'name' => 'confirm_password',
+ 'type' => 'Password',
+ 'attributes' => array(
+ 'class' => 'form-control',
+ ),
+ ));
+
+ $this->add(array(
+ 'name' => 'remember_me',
+ 'type' => 'checkbox',
+ ));
+
+ $this->add(array(
'name' => 'submit',
'type' => 'Submit',
'attributes' => array(
'value' => 'Submit',
'id' => 'submit-btn',
'class' => 'btn btn-primary',
),
));
}
} | 13 | 0.213115 | 13 | 0 |
91fe307bd362b02e88c59e5d7543e792d4cc447a | system/aliases.zsh | system/aliases.zsh | alias grep="grep --color=auto"
alias pgrep='ps aux | grep -v grep | grep'
alias ss="open -a 'screen sharing'"
alias ls='ls -GF'
alias rm='nocorrect rm'
alias rvmnew='rvm --create --ruby-version'
alias rvmc='rvm current'
# =======================================================
# Override the basic aliases if we have GNU coreutils available
# `brew install coreutils` to get these...
# =======================================================
if $(gls &>/dev/null)
then
alias ls="gls -F --color"
alias l="gls -lAh --color"
alias ll="gls -l --color"
alias la='gls -A --color'
fi
# =======================================================
# Linux specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Linux" ]]
then
alias ls="ls -GF --color"
alias ge='vi'
fi
# =======================================================
# OSX specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Darwin" ]]
then
alias vi="mvim -v"
alias ge='mvim --remote-tab-silent'
fi
| alias grep="grep --color=auto"
alias pgrep='ps aux | grep -v grep | grep'
alias ss="open -a 'screen sharing'"
alias ls='ls -GF'
alias rm='nocorrect rm'
alias rvmnew='rvm --create --ruby-version'
alias rvmc='rvm current'
alias rvmempty='rvm --force gemset empty'
# =======================================================
# Override the basic aliases if we have GNU coreutils available
# `brew install coreutils` to get these...
# =======================================================
if $(gls &>/dev/null)
then
alias ls="gls -F --color"
alias l="gls -lAh --color"
alias ll="gls -l --color"
alias la='gls -A --color'
fi
# =======================================================
# Linux specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Linux" ]]
then
alias ls="ls -GF --color"
alias ge='vi'
fi
# =======================================================
# OSX specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Darwin" ]]
then
alias vi="mvim -v"
alias ge='mvim --remote-tab-silent'
fi
| Add alias rvmempty to flush a gemset. | Add alias rvmempty to flush a gemset.
| Shell | mit | dpilone/dotfiles,dpilone/dotfiles | shell | ## Code Before:
alias grep="grep --color=auto"
alias pgrep='ps aux | grep -v grep | grep'
alias ss="open -a 'screen sharing'"
alias ls='ls -GF'
alias rm='nocorrect rm'
alias rvmnew='rvm --create --ruby-version'
alias rvmc='rvm current'
# =======================================================
# Override the basic aliases if we have GNU coreutils available
# `brew install coreutils` to get these...
# =======================================================
if $(gls &>/dev/null)
then
alias ls="gls -F --color"
alias l="gls -lAh --color"
alias ll="gls -l --color"
alias la='gls -A --color'
fi
# =======================================================
# Linux specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Linux" ]]
then
alias ls="ls -GF --color"
alias ge='vi'
fi
# =======================================================
# OSX specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Darwin" ]]
then
alias vi="mvim -v"
alias ge='mvim --remote-tab-silent'
fi
## Instruction:
Add alias rvmempty to flush a gemset.
## Code After:
alias grep="grep --color=auto"
alias pgrep='ps aux | grep -v grep | grep'
alias ss="open -a 'screen sharing'"
alias ls='ls -GF'
alias rm='nocorrect rm'
alias rvmnew='rvm --create --ruby-version'
alias rvmc='rvm current'
alias rvmempty='rvm --force gemset empty'
# =======================================================
# Override the basic aliases if we have GNU coreutils available
# `brew install coreutils` to get these...
# =======================================================
if $(gls &>/dev/null)
then
alias ls="gls -F --color"
alias l="gls -lAh --color"
alias ll="gls -l --color"
alias la='gls -A --color'
fi
# =======================================================
# Linux specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Linux" ]]
then
alias ls="ls -GF --color"
alias ge='vi'
fi
# =======================================================
# OSX specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Darwin" ]]
then
alias vi="mvim -v"
alias ge='mvim --remote-tab-silent'
fi
| alias grep="grep --color=auto"
alias pgrep='ps aux | grep -v grep | grep'
alias ss="open -a 'screen sharing'"
alias ls='ls -GF'
alias rm='nocorrect rm'
alias rvmnew='rvm --create --ruby-version'
alias rvmc='rvm current'
+ alias rvmempty='rvm --force gemset empty'
# =======================================================
# Override the basic aliases if we have GNU coreutils available
# `brew install coreutils` to get these...
# =======================================================
if $(gls &>/dev/null)
then
alias ls="gls -F --color"
alias l="gls -lAh --color"
alias ll="gls -l --color"
alias la='gls -A --color'
fi
# =======================================================
# Linux specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Linux" ]]
then
alias ls="ls -GF --color"
alias ge='vi'
fi
# =======================================================
# OSX specific aliases / fixes.
# =======================================================
if [[ "$(uname -s)" == "Darwin" ]]
then
alias vi="mvim -v"
alias ge='mvim --remote-tab-silent'
fi | 1 | 0.027027 | 1 | 0 |
15e5ba4b535b4d9f0a83fe00cb5775d86c0301ff | entity-services/src/test/resources/model-units/instance-0.0.2.xml | entity-services/src/test/resources/model-units/instance-0.0.2.xml | <?xml version="1.0"?>
<x xmlns:es="http://marklogic.com/entity-services">
<es:instance>
<ETOne>
<pk>http://example.org/some-iri/1</pk>
<p1>1</p1>
<p2>3</p2>
<p3 datatype="array">32</p3>
<p4 datatype="array">333</p4>
<p5>12</p5>
<p6>4</p6>
</ETOne>
</es:instance>
<es:instance>
<ETTwo>
<pk>http://example.org/some-iri/2</pk>
<parent><ETOne>http://example.org/some-iri</ETOne></parent>
<p7><ETTwo>3</ETTwo></p7>
<p8><ETTwo>23</ETTwo></p8>
<p9 datatype="array"><ETTwo>19</ETTwo></p9>
<p9 datatype="array"><ETTwo>14</ETTwo></p9>
<p9 datatype="array"><ETTwo>16</ETTwo></p9>
<p10/>
<p12>100293</p12>
<p13 datatype="array">100293</p13>
</ETTwo></es:instance>
<es:instance>
<ETThree/>
</es:instance>
</x>
| <x xmlns:es="http://marklogic.com/entity-services">
<es:instance>
<ETOne>
<pk>http://example.org/some-iri/1</pk>
<p1>1</p1>
<p2>3</p2>
<p3 datatype="array">32</p3>
<p4 datatype="array">333</p4>
<p5>12</p5>
<p6>4</p6>
</ETOne>
</es:instance>
<es:instance>
<ETTwo>
<pk>http://example.org/some-iri/2</pk>
<parent><ETOne><pk/><p1/><p2/></ETOne></parent>
<p7><ETTwo>3</ETTwo></p7>
<p8><ETTwo>23</ETTwo></p8>
<p9 datatype="array"><ETTwo>19</ETTwo></p9>
<p9 datatype="array"><ETTwo>14</ETTwo></p9>
<p9 datatype="array"><ETTwo>16</ETTwo></p9>
<p10/>
<p12>100293</p12>
<p13 datatype="array">100293</p13>
</ETTwo>
</es:instance>
<es:instance>
<ETThree/>
</es:instance>
</x>
| Fix key in version translator | Fix key in version translator
| XML | lgpl-2.1 | bsrikan/entity-services,bsrikan/entity-services,rashiatmarklogic/entity-services,grechaw/entity-services,grechaw/entity-services | xml | ## Code Before:
<?xml version="1.0"?>
<x xmlns:es="http://marklogic.com/entity-services">
<es:instance>
<ETOne>
<pk>http://example.org/some-iri/1</pk>
<p1>1</p1>
<p2>3</p2>
<p3 datatype="array">32</p3>
<p4 datatype="array">333</p4>
<p5>12</p5>
<p6>4</p6>
</ETOne>
</es:instance>
<es:instance>
<ETTwo>
<pk>http://example.org/some-iri/2</pk>
<parent><ETOne>http://example.org/some-iri</ETOne></parent>
<p7><ETTwo>3</ETTwo></p7>
<p8><ETTwo>23</ETTwo></p8>
<p9 datatype="array"><ETTwo>19</ETTwo></p9>
<p9 datatype="array"><ETTwo>14</ETTwo></p9>
<p9 datatype="array"><ETTwo>16</ETTwo></p9>
<p10/>
<p12>100293</p12>
<p13 datatype="array">100293</p13>
</ETTwo></es:instance>
<es:instance>
<ETThree/>
</es:instance>
</x>
## Instruction:
Fix key in version translator
## Code After:
<x xmlns:es="http://marklogic.com/entity-services">
<es:instance>
<ETOne>
<pk>http://example.org/some-iri/1</pk>
<p1>1</p1>
<p2>3</p2>
<p3 datatype="array">32</p3>
<p4 datatype="array">333</p4>
<p5>12</p5>
<p6>4</p6>
</ETOne>
</es:instance>
<es:instance>
<ETTwo>
<pk>http://example.org/some-iri/2</pk>
<parent><ETOne><pk/><p1/><p2/></ETOne></parent>
<p7><ETTwo>3</ETTwo></p7>
<p8><ETTwo>23</ETTwo></p8>
<p9 datatype="array"><ETTwo>19</ETTwo></p9>
<p9 datatype="array"><ETTwo>14</ETTwo></p9>
<p9 datatype="array"><ETTwo>16</ETTwo></p9>
<p10/>
<p12>100293</p12>
<p13 datatype="array">100293</p13>
</ETTwo>
</es:instance>
<es:instance>
<ETThree/>
</es:instance>
</x>
| - <?xml version="1.0"?>
<x xmlns:es="http://marklogic.com/entity-services">
<es:instance>
- <ETOne>
+ <ETOne>
? ++
- <pk>http://example.org/some-iri/1</pk>
+ <pk>http://example.org/some-iri/1</pk>
? ++
- <p1>1</p1>
+ <p1>1</p1>
? ++
- <p2>3</p2>
+ <p2>3</p2>
? ++
- <p3 datatype="array">32</p3>
+ <p3 datatype="array">32</p3>
? ++
- <p4 datatype="array">333</p4>
+ <p4 datatype="array">333</p4>
? ++
- <p5>12</p5>
+ <p5>12</p5>
? ++
- <p6>4</p6>
+ <p6>4</p6>
? ++
- </ETOne>
+ </ETOne>
? ++
</es:instance>
<es:instance>
- <ETTwo>
+ <ETTwo>
? ++
- <pk>http://example.org/some-iri/2</pk>
+ <pk>http://example.org/some-iri/2</pk>
? ++
- <parent><ETOne>http://example.org/some-iri</ETOne></parent>
+ <parent><ETOne><pk/><p1/><p2/></ETOne></parent>
- <p7><ETTwo>3</ETTwo></p7>
+ <p7><ETTwo>3</ETTwo></p7>
? ++
- <p8><ETTwo>23</ETTwo></p8>
+ <p8><ETTwo>23</ETTwo></p8>
? ++
- <p9 datatype="array"><ETTwo>19</ETTwo></p9>
+ <p9 datatype="array"><ETTwo>19</ETTwo></p9>
? ++
- <p9 datatype="array"><ETTwo>14</ETTwo></p9>
+ <p9 datatype="array"><ETTwo>14</ETTwo></p9>
? ++
- <p9 datatype="array"><ETTwo>16</ETTwo></p9>
+ <p9 datatype="array"><ETTwo>16</ETTwo></p9>
? ++
- <p10/>
+ <p10/>
? ++
- <p12>100293</p12>
+ <p12>100293</p12>
? ++
- <p13 datatype="array">100293</p13>
+ <p13 datatype="array">100293</p13>
? ++
+ </ETTwo>
- </ETTwo></es:instance>
? --------
+ </es:instance>
<es:instance>
- <ETThree/>
+ <ETThree/>
? ++
</es:instance>
</x> | 46 | 1.533333 | 23 | 23 |
2c2004859bfb38fde111d3a5f1c0d4df3d501c07 | x/validator/types/proposed_disable_validator.go | x/validator/types/proposed_disable_validator.go | package types
import sdk "github.com/cosmos/cosmos-sdk/types"
func (disabledValidator ProposedDisableValidator) HasApprovalFrom(address sdk.AccAddress) bool {
addrStr := address.String()
for _, approval := range disabledValidator.Approvals {
if approval.Address == addrStr {
return true
}
}
return false
}
| package types
import sdk "github.com/cosmos/cosmos-sdk/types"
func (disabledValidator ProposedDisableValidator) HasApprovalFrom(address sdk.AccAddress) bool {
addrStr := address.String()
for _, approval := range disabledValidator.Approvals {
if approval.Address == addrStr {
return true
}
}
return false
}
func (disabledValidator ProposedDisableValidator) HasRejectDisableFrom(address sdk.AccAddress) bool {
addrStr := address.String()
for _, rejectDisable := range disabledValidator.RejectApprovals {
if rejectDisable.Address == addrStr {
return true
}
}
return false
}
| Add function for checking has reject disable from some account | Add function for checking has reject disable from some account
| Go | apache-2.0 | zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger | go | ## Code Before:
package types
import sdk "github.com/cosmos/cosmos-sdk/types"
func (disabledValidator ProposedDisableValidator) HasApprovalFrom(address sdk.AccAddress) bool {
addrStr := address.String()
for _, approval := range disabledValidator.Approvals {
if approval.Address == addrStr {
return true
}
}
return false
}
## Instruction:
Add a function for checking whether there is a reject-disable from a given account
## Code After:
package types
import sdk "github.com/cosmos/cosmos-sdk/types"
func (disabledValidator ProposedDisableValidator) HasApprovalFrom(address sdk.AccAddress) bool {
addrStr := address.String()
for _, approval := range disabledValidator.Approvals {
if approval.Address == addrStr {
return true
}
}
return false
}
func (disabledValidator ProposedDisableValidator) HasRejectDisableFrom(address sdk.AccAddress) bool {
addrStr := address.String()
for _, rejectDisable := range disabledValidator.RejectApprovals {
if rejectDisable.Address == addrStr {
return true
}
}
return false
}
| package types
import sdk "github.com/cosmos/cosmos-sdk/types"
func (disabledValidator ProposedDisableValidator) HasApprovalFrom(address sdk.AccAddress) bool {
addrStr := address.String()
for _, approval := range disabledValidator.Approvals {
if approval.Address == addrStr {
return true
}
}
return false
}
+
+ func (disabledValidator ProposedDisableValidator) HasRejectDisableFrom(address sdk.AccAddress) bool {
+ addrStr := address.String()
+ for _, rejectDisable := range disabledValidator.RejectApprovals {
+ if rejectDisable.Address == addrStr {
+ return true
+ }
+ }
+
+ return false
+ } | 11 | 0.785714 | 11 | 0 |
1de238dc7bf8902f1d210df841eed6426b0928b5 | bench_map_key_lookup_string_from_bytes_test.go | bench_map_key_lookup_string_from_bytes_test.go | package main
import (
"fmt"
"testing"
)
func BenchmarkMapLookupKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
// lookup[string(find)] = i
if _, ok := lookup[string(find)]; !ok {
b.Fatalf("key %s should exist", string(find))
}
}
}
func BenchmarkMapSetKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
lookup[string(find)] = i
}
}
| package main
/*
Results
--
$ go test -v -bench BenchmarkMap -benchmem
testing: warning: no tests to run
BenchmarkMapLookupKeyStringFromBytes-4 100000000 19.6 ns/op 0 B/op 0 allocs/op
BenchmarkMapSetKeyStringFromBytes-4 20000000 73.8 ns/op 5 B/op 1 allocs/op
PASS
ok github.com/robskillington/benchmarks-go 3.561s
*/
import (
"fmt"
"testing"
)
func BenchmarkMapLookupKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
if _, ok := lookup[string(find)]; !ok {
b.Fatalf("key %s should exist", string(find))
}
}
}
func BenchmarkMapSetKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
lookup[string(find)] = i
}
}
| Add results for BenchmarkMapLookupKeyStringFromBytes and BenchmarkMapSetKeyStringFromBytes | Add results for BenchmarkMapLookupKeyStringFromBytes and BenchmarkMapSetKeyStringFromBytes
| Go | mit | robskillington/benchmarks-go | go | ## Code Before:
package main
import (
"fmt"
"testing"
)
func BenchmarkMapLookupKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
// lookup[string(find)] = i
if _, ok := lookup[string(find)]; !ok {
b.Fatalf("key %s should exist", string(find))
}
}
}
func BenchmarkMapSetKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
lookup[string(find)] = i
}
}
## Instruction:
Add results for BenchmarkMapLookupKeyStringFromBytes and BenchmarkMapSetKeyStringFromBytes
## Code After:
package main
/*
Results
--
$ go test -v -bench BenchmarkMap -benchmem
testing: warning: no tests to run
BenchmarkMapLookupKeyStringFromBytes-4 100000000 19.6 ns/op 0 B/op 0 allocs/op
BenchmarkMapSetKeyStringFromBytes-4 20000000 73.8 ns/op 5 B/op 1 allocs/op
PASS
ok github.com/robskillington/benchmarks-go 3.561s
*/
import (
"fmt"
"testing"
)
func BenchmarkMapLookupKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
if _, ok := lookup[string(find)]; !ok {
b.Fatalf("key %s should exist", string(find))
}
}
}
func BenchmarkMapSetKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
lookup[string(find)] = i
}
}
| package main
+
+ /*
+ Results
+ --
+ $ go test -v -bench BenchmarkMap -benchmem
+ testing: warning: no tests to run
+ BenchmarkMapLookupKeyStringFromBytes-4 100000000 19.6 ns/op 0 B/op 0 allocs/op
+ BenchmarkMapSetKeyStringFromBytes-4 20000000 73.8 ns/op 5 B/op 1 allocs/op
+ PASS
+ ok github.com/robskillington/benchmarks-go 3.561s
+ */
import (
"fmt"
"testing"
)
func BenchmarkMapLookupKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
- // lookup[string(find)] = i
if _, ok := lookup[string(find)]; !ok {
b.Fatalf("key %s should exist", string(find))
}
}
}
func BenchmarkMapSetKeyStringFromBytes(b *testing.B) {
entries := 4096
lookup := make(map[string]int, entries)
for i := 0; i < entries; i++ {
lookup[fmt.Sprintf("foo.%d", i)] = -1
}
find := []byte("foo.0")
b.ResetTimer()
for i := 0; i < b.N; i++ {
lookup[string(find)] = i
}
} | 12 | 0.292683 | 11 | 1 |
2d87d34d97d92ee8cfa6e731ac60518cde180959 | src/Accordion/accordion.js | src/Accordion/accordion.js | // @flow
import React, { Component, type Node } from 'react';
import { Provider } from 'mobx-react';
import { createAccordionStore } from '../accordionStore/accordionStore';
type AccordionProps = {
accordion: boolean,
children: Node,
// activeItems: Array<string | number>,
className: string,
onChange: Function,
};
class Accordion extends Component<AccordionProps, *> {
static defaultProps = {
accordion: true,
onChange: () => {},
className: 'accordion',
activeItems: [],
};
accordionStore = createAccordionStore({
accordion: this.props.accordion,
onChange: this.props.onChange,
});
render() {
const { className, children } = this.props;
const { accordion } = this.accordionStore;
return (
<Provider accordionStore={this.accordionStore}>
<div role={accordion ? 'tablist' : null} className={className}>
{children}
</div>
</Provider>
);
}
}
export default Accordion;
| // @flow
import React, { Component, type Node } from 'react';
import { Provider } from 'mobx-react';
import { createAccordionStore } from '../accordionStore/accordionStore';
type AccordionProps = {
accordion: boolean,
children: Node,
// activeItems: Array<string | number>,
className: string,
onChange: Function,
};
class Accordion extends Component<AccordionProps, *> {
static defaultProps = {
accordion: true,
onChange: () => {},
className: 'accordion',
activeItems: [],
children: null,
};
accordionStore = createAccordionStore({
accordion: this.props.accordion,
onChange: this.props.onChange,
});
render() {
const { className, children } = this.props;
const { accordion } = this.accordionStore;
return (
<Provider accordionStore={this.accordionStore}>
<div role={accordion ? 'tablist' : null} className={className}>
{children}
</div>
</Provider>
);
}
}
export default Accordion;
| Set defaultProp for Accordion's children | Set defaultProp for Accordion's children
| JavaScript | mit | springload/react-accessible-accordion,springload/react-accessible-accordion,springload/react-accessible-accordion | javascript | ## Code Before:
// @flow
import React, { Component, type Node } from 'react';
import { Provider } from 'mobx-react';
import { createAccordionStore } from '../accordionStore/accordionStore';
type AccordionProps = {
accordion: boolean,
children: Node,
// activeItems: Array<string | number>,
className: string,
onChange: Function,
};
class Accordion extends Component<AccordionProps, *> {
static defaultProps = {
accordion: true,
onChange: () => {},
className: 'accordion',
activeItems: [],
};
accordionStore = createAccordionStore({
accordion: this.props.accordion,
onChange: this.props.onChange,
});
render() {
const { className, children } = this.props;
const { accordion } = this.accordionStore;
return (
<Provider accordionStore={this.accordionStore}>
<div role={accordion ? 'tablist' : null} className={className}>
{children}
</div>
</Provider>
);
}
}
export default Accordion;
## Instruction:
Set defaultProp for Accordion's children
## Code After:
// @flow
import React, { Component, type Node } from 'react';
import { Provider } from 'mobx-react';
import { createAccordionStore } from '../accordionStore/accordionStore';
type AccordionProps = {
accordion: boolean,
children: Node,
// activeItems: Array<string | number>,
className: string,
onChange: Function,
};
class Accordion extends Component<AccordionProps, *> {
static defaultProps = {
accordion: true,
onChange: () => {},
className: 'accordion',
activeItems: [],
children: null,
};
accordionStore = createAccordionStore({
accordion: this.props.accordion,
onChange: this.props.onChange,
});
render() {
const { className, children } = this.props;
const { accordion } = this.accordionStore;
return (
<Provider accordionStore={this.accordionStore}>
<div role={accordion ? 'tablist' : null} className={className}>
{children}
</div>
</Provider>
);
}
}
export default Accordion;
| // @flow
import React, { Component, type Node } from 'react';
import { Provider } from 'mobx-react';
import { createAccordionStore } from '../accordionStore/accordionStore';
type AccordionProps = {
accordion: boolean,
children: Node,
// activeItems: Array<string | number>,
className: string,
onChange: Function,
};
class Accordion extends Component<AccordionProps, *> {
static defaultProps = {
accordion: true,
onChange: () => {},
className: 'accordion',
activeItems: [],
+ children: null,
};
accordionStore = createAccordionStore({
accordion: this.props.accordion,
onChange: this.props.onChange,
});
render() {
const { className, children } = this.props;
const { accordion } = this.accordionStore;
return (
<Provider accordionStore={this.accordionStore}>
<div role={accordion ? 'tablist' : null} className={className}>
{children}
</div>
</Provider>
);
}
}
export default Accordion; | 1 | 0.02381 | 1 | 0 |
f3337c870645cd2c48f2a22da5590f9ad281982d | src/test-common/java/org/thymeleaf/engine/domselector/DOMSelectorTest.java | src/test-common/java/org/thymeleaf/engine/domselector/DOMSelectorTest.java | /*
* =============================================================================
*
* Copyright (c) 2011-2013, The THYMELEAF team (http://www.thymeleaf.org)
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
* =============================================================================
*/
package org.thymeleaf.engine.domselector;
import org.junit.Assert;
import org.junit.Test;
import org.thymeleaf.testing.templateengine.engine.TestExecutor;
public class DOMSelectorTest {
public DOMSelectorTest() {
super();
}
@Test
public void testDOMSelector() throws Exception {
final TestExecutor executor = new TestExecutor();
executor.execute("classpath:engine/domselector");
Assert.assertTrue(executor.isAllOK());
}
@Test
public void testDOMSelector14() throws Exception {
final TestExecutor executor = new TestExecutor();
executor.execute("classpath:engine/domselector/domselector14.thtest");
Assert.assertTrue(executor.isAllOK());
}
}
| /*
* =============================================================================
*
* Copyright (c) 2011-2013, The THYMELEAF team (http://www.thymeleaf.org)
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
* =============================================================================
*/
package org.thymeleaf.engine.domselector;
import org.junit.Assert;
import org.junit.Test;
import org.thymeleaf.testing.templateengine.engine.TestExecutor;
public class DOMSelectorTest {
public DOMSelectorTest() {
super();
}
@Test
public void testDOMSelector() throws Exception {
final TestExecutor executor = new TestExecutor();
executor.execute("classpath:engine/domselector");
Assert.assertTrue(executor.isAllOK());
}
}
| Remove unneeded specific test method | Remove unneeded specific test method
| Java | apache-2.0 | thymeleaf/thymeleaf-tests,thymeleaf/thymeleaf-tests,thymeleaf/thymeleaf-tests | java | ## Code Before:
/*
* =============================================================================
*
* Copyright (c) 2011-2013, The THYMELEAF team (http://www.thymeleaf.org)
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
* =============================================================================
*/
package org.thymeleaf.engine.domselector;
import org.junit.Assert;
import org.junit.Test;
import org.thymeleaf.testing.templateengine.engine.TestExecutor;
public class DOMSelectorTest {
public DOMSelectorTest() {
super();
}
@Test
public void testDOMSelector() throws Exception {
final TestExecutor executor = new TestExecutor();
executor.execute("classpath:engine/domselector");
Assert.assertTrue(executor.isAllOK());
}
@Test
public void testDOMSelector14() throws Exception {
final TestExecutor executor = new TestExecutor();
executor.execute("classpath:engine/domselector/domselector14.thtest");
Assert.assertTrue(executor.isAllOK());
}
}
## Instruction:
Remove unneeded specific test method
## Code After:
/*
* =============================================================================
*
* Copyright (c) 2011-2013, The THYMELEAF team (http://www.thymeleaf.org)
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
* =============================================================================
*/
package org.thymeleaf.engine.domselector;
import org.junit.Assert;
import org.junit.Test;
import org.thymeleaf.testing.templateengine.engine.TestExecutor;
public class DOMSelectorTest {
public DOMSelectorTest() {
super();
}
@Test
public void testDOMSelector() throws Exception {
final TestExecutor executor = new TestExecutor();
executor.execute("classpath:engine/domselector");
Assert.assertTrue(executor.isAllOK());
}
}
| /*
* =============================================================================
*
* Copyright (c) 2011-2013, The THYMELEAF team (http://www.thymeleaf.org)
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*
* =============================================================================
*/
package org.thymeleaf.engine.domselector;
import org.junit.Assert;
import org.junit.Test;
import org.thymeleaf.testing.templateengine.engine.TestExecutor;
public class DOMSelectorTest {
public DOMSelectorTest() {
super();
}
@Test
public void testDOMSelector() throws Exception {
final TestExecutor executor = new TestExecutor();
executor.execute("classpath:engine/domselector");
Assert.assertTrue(executor.isAllOK());
}
- @Test
- public void testDOMSelector14() throws Exception {
-
- final TestExecutor executor = new TestExecutor();
- executor.execute("classpath:engine/domselector/domselector14.thtest");
-
- Assert.assertTrue(executor.isAllOK());
-
- }
-
} | 10 | 0.172414 | 0 | 10 |
9c6530203d08853c8a5f0061f9691b335c012e3d | .travis.yml | .travis.yml | language: node_js
node_js:
- "0.12"
script:
- "node tools/build.js -t node"
- "npm test"
sudo: false # Use container-based architecture
| language: node_js
node_js:
- "0.12"
- "iojs"
script:
- "node tools/build.js -t node"
- "npm test"
sudo: false # Use container-based architecture
| Add iojs to engines being tested | Add iojs to engines being tested
YAML | bsd-3-clause | highlightjs/highlight.js,delebash/highlight.js,ysbaddaden/highlight.js,martijnrusschen/highlight.js,abhishekgahlot/highlight.js,zachaysan/highlight.js,Delermando/highlight.js,axter/highlight.js,snegovick/highlight.js,adam-lynch/highlight.js,lizhil/highlight.js,MakeNowJust/highlight.js,jean/highlight.js,ehornbostel/highlight.js,bluepichu/highlight.js,brennced/highlight.js,MakeNowJust/highlight.js,Delermando/highlight.js,CausalityLtd/highlight.js,ehornbostel/highlight.js,Ajunboys/highlight.js,kayyyy/highlight.js,kayyyy/highlight.js,carlokok/highlight.js,Ajunboys/highlight.js,highlightjs/highlight.js,aristidesstaffieri/highlight.js,tenbits/highlight.js,ilovezy/highlight.js,teambition/highlight.js,zachaysan/highlight.js,Ankirama/highlight.js,Amrit01/highlight.js,palmin/highlight.js,adjohnson916/highlight.js,0x7fffffff/highlight.js,J2TeaM/highlight.js,bogachev-pa/highlight.js,sourrust/highlight.js,CausalityLtd/highlight.js,liang42hao/highlight.js,carlokok/highlight.js,kba/highlight.js,kayyyy/highlight.js,STRML/highlight.js,VoldemarLeGrand/highlight.js,VoldemarLeGrand/highlight.js,1st1/highlight.js,ilovezy/highlight.js,alex-zhang/highlight.js,weiyibin/highlight.js,STRML/highlight.js,dublebuble/highlight.js,delebash/highlight.js,daimor/highlight.js,dbkaplun/highlight.js,yxxme/highlight.js,bogachev-pa/highlight.js,daimor/highlight.js,dx285/highlight.js,brennced/highlight.js,dbkaplun/highlight.js,Delermando/highlight.js,brennced/highlight.js,ponylang/highlight.js,highlightjs/highlight.js,xing-zhi/highlight.js,xing-zhi/highlight.js,StanislawSwierc/highlight.js,dYale/highlight.js,krig/highlight.js,martijnrusschen/highlight.js,teambition/highlight.js,SibuStephen/highlight.js,krig/highlight.js,Ankirama/highlight.js,bogachev-pa/highlight.js,carlokok/highlight.js,yxxme/highlight.js,abhishekgahlot/highlight.js,aurusov/highlight.js,ponylang/highlight.js,Ankirama/highlight.js,dx285/highlight.js,sourrust/highlight.js,krig/highlight.js,MakeNowJust/highlight.js,taoger/highlight.js,robconery/highlight.js,palmin/highlight.js,isagalaev/highlight.js,aurusov/highlight.js,snegovick/highlight.js,lizhil/highlight.js,1st1/highlight.js,J2TeaM/highlight.js,adam-lynch/highlight.js,kevinrodbe/highlight.js,aristidesstaffieri/highlight.js,cicorias/highlight.js,bluepichu/highlight.js,adjohnson916/highlight.js,devmario/highlight.js,CausalityLtd/highlight.js,robconery/highlight.js,dublebuble/highlight.js,tenbits/highlight.js,aurusov/highlight.js,dublebuble/highlight.js,ehornbostel/highlight.js,bluepichu/highlight.js,sourrust/highlight.js,Amrit01/highlight.js,kba/highlight.js,axter/highlight.js,aristidesstaffieri/highlight.js,tenbits/highlight.js,StanislawSwierc/highlight.js,christoffer/highlight.js,SibuStephen/highlight.js,delebash/highlight.js,martijnrusschen/highlight.js,alex-zhang/highlight.js,dbkaplun/highlight.js,STRML/highlight.js,ponylang/highlight.js,Sannis/highlight.js,kevinrodbe/highlight.js,robconery/highlight.js,0x7fffffff/highlight.js,adjohnson916/highlight.js,kevinrodbe/highlight.js,palmin/highlight.js,abhishekgahlot/highlight.js,ysbaddaden/highlight.js,cicorias/highlight.js,devmario/highlight.js,taoger/highlight.js,weiyibin/highlight.js,daimor/highlight.js,Ajunboys/highlight.js,highlightjs/highlight.js,weiyibin/highlight.js,snegovick/highlight.js,teambition/highlight.js,devmario/highlight.js,yxxme/highlight.js,xing-zhi/highlight.js,SibuStephen/highlight.js,VoldemarLeGrand/highlight.js,alex-zhang/highlight.js,christoffer/highlight.js,0x7fffffff/highlight.js,zachaysan/highlight.js,carlokok/highlight.js,christoffer/highlight.js,Sannis/highlight.js,kba/highlight.js,lizhil/highlight.js,isagalaev/highlight.js,dYale/highlight.js,ysbaddaden/highlight.js,Amrit01/highlight.js,lead-auth/highlight.js,jean/highlight.js,taoger/highlight.js,ilovezy/highlight.js,liang42hao/highlight.js,dx285/highlight.js,1st1/highlight.js,axter/highlight.js,J2TeaM/highlight.js,Sannis/highlight.js,adam-lynch/highlight.js,dYale/highlight.js,jean/highlight.js | yaml | ## Code Before:
language: node_js
node_js:
- "0.12"
script:
- "node tools/build.js -t node"
- "npm test"
sudo: false # Use container-based architecture
## Instruction:
Add iojs to engines being tested
## Code After:
language: node_js
node_js:
- "0.12"
- "iojs"
script:
- "node tools/build.js -t node"
- "npm test"
sudo: false # Use container-based architecture
| language: node_js
node_js:
- "0.12"
+ - "iojs"
script:
- "node tools/build.js -t node"
- "npm test"
sudo: false # Use container-based architecture | 1 | 0.125 | 1 | 0 |
6bf80a675bdce13f44a8d4bb56b12af065ec14d2 | Source/View/Connecting/VConnectingPinListCell.swift | Source/View/Connecting/VConnectingPinListCell.swift | import UIKit
class VConnectingPinListCell:UICollectionViewCell
{
override init(frame:CGRect)
{
super.init(frame:frame)
}
required init?(coder:NSCoder)
{
return nil
}
}
| import UIKit
class VConnectingPinListCell:UICollectionViewCell
{
private let kCornerRadius:CGFloat = 6
override init(frame:CGRect)
{
super.init(frame:frame)
isUserInteractionEnabled = false
let background:UIView = UIView()
background.isUserInteractionEnabled = false
background.backgroundColor = UIColor(white:1, alpha:0.6)
background.translatesAutoresizingMaskIntoConstraints = false
background.layer.cornerRadius = kCornerRadius
addSubview(background)
NSLayoutConstraint.equals(
view:background,
toView:self)
}
required init?(coder:NSCoder)
{
return nil
}
}
| Add background view to cell | Add background view to cell
| Swift | mit | devpunk/velvet_room,devpunk/velvet_room | swift | ## Code Before:
import UIKit
class VConnectingPinListCell:UICollectionViewCell
{
override init(frame:CGRect)
{
super.init(frame:frame)
}
required init?(coder:NSCoder)
{
return nil
}
}
## Instruction:
Add background view to cell
## Code After:
import UIKit
class VConnectingPinListCell:UICollectionViewCell
{
private let kCornerRadius:CGFloat = 6
override init(frame:CGRect)
{
super.init(frame:frame)
isUserInteractionEnabled = false
let background:UIView = UIView()
background.isUserInteractionEnabled = false
background.backgroundColor = UIColor(white:1, alpha:0.6)
background.translatesAutoresizingMaskIntoConstraints = false
background.layer.cornerRadius = kCornerRadius
addSubview(background)
NSLayoutConstraint.equals(
view:background,
toView:self)
}
required init?(coder:NSCoder)
{
return nil
}
}
| import UIKit
class VConnectingPinListCell:UICollectionViewCell
{
+ private let kCornerRadius:CGFloat = 6
+
override init(frame:CGRect)
{
super.init(frame:frame)
+ isUserInteractionEnabled = false
+
+ let background:UIView = UIView()
+ background.isUserInteractionEnabled = false
+ background.backgroundColor = UIColor(white:1, alpha:0.6)
+ background.translatesAutoresizingMaskIntoConstraints = false
+ background.layer.cornerRadius = kCornerRadius
+
+ addSubview(background)
+
+ NSLayoutConstraint.equals(
+ view:background,
+ toView:self)
}
required init?(coder:NSCoder)
{
return nil
}
} | 15 | 1.071429 | 15 | 0 |
491072136ab5709a8dc48a34af2e91c320c0b772 | .travis.yml | .travis.yml | language: python
sudo: false
cache:
- pip
python:
- "2.7"
- "3.5"
env:
- DJANGO_VERSION=1.10
- DJANGO_VERSION=1.11rc1
install:
- pip install -r requirements.txt
- pip uninstall -y Django
- pip install -q Django==$DJANGO_VERSION
before_script:
- psql -c 'create database access_mo_django;' -U postgres
- psql -c "CREATE EXTENSION postgis;" -U postgres -d access_mo_django
script:
- flake8 am
- python am/manage.py migrate
- coverage run am/manage.py test house_scraper
- coverage run am/manage.py test senate_scraper
after_success:
- coveralls
addons:
postgresql: "9.4"
| language: python
sudo: false
cache:
- pip
python:
- "2.7"
- "3.5"
env:
- DJANGO_VERSION=1.10 DATABASE_SETTINGS='settings_local.py'
- DJANGO_VERSION=1.11rc1 DATABASE_SETTINGS='settings_local.py'
install:
- pip install -r requirements.txt
- pip uninstall -y Django
- pip install -q Django==$DJANGO_VERSION
before_script:
- psql -c 'create database access_mo_django;' -U postgres
- psql -c "CREATE EXTENSION postgis;" -U postgres -d access_mo_django
script:
- flake8 am
- python am/manage.py migrate
- coverage run am/manage.py test house_scraper
- coverage run am/manage.py test senate_scraper
after_success:
- coveralls
addons:
postgresql: "9.4"
| Add database settings to ci tests | Add database settings to ci tests
| YAML | bsd-2-clause | access-missouri/am-django-project,access-missouri/am-django-project,access-missouri/am-django-project,access-missouri/am-django-project | yaml | ## Code Before:
language: python
sudo: false
cache:
- pip
python:
- "2.7"
- "3.5"
env:
- DJANGO_VERSION=1.10
- DJANGO_VERSION=1.11rc1
install:
- pip install -r requirements.txt
- pip uninstall -y Django
- pip install -q Django==$DJANGO_VERSION
before_script:
- psql -c 'create database access_mo_django;' -U postgres
- psql -c "CREATE EXTENSION postgis;" -U postgres -d access_mo_django
script:
- flake8 am
- python am/manage.py migrate
- coverage run am/manage.py test house_scraper
- coverage run am/manage.py test senate_scraper
after_success:
- coveralls
addons:
postgresql: "9.4"
## Instruction:
Add database settings to ci tests
## Code After:
language: python
sudo: false
cache:
- pip
python:
- "2.7"
- "3.5"
env:
- DJANGO_VERSION=1.10 DATABASE_SETTINGS='settings_local.py'
- DJANGO_VERSION=1.11rc1 DATABASE_SETTINGS='settings_local.py'
install:
- pip install -r requirements.txt
- pip uninstall -y Django
- pip install -q Django==$DJANGO_VERSION
before_script:
- psql -c 'create database access_mo_django;' -U postgres
- psql -c "CREATE EXTENSION postgis;" -U postgres -d access_mo_django
script:
- flake8 am
- python am/manage.py migrate
- coverage run am/manage.py test house_scraper
- coverage run am/manage.py test senate_scraper
after_success:
- coveralls
addons:
postgresql: "9.4"
| language: python
sudo: false
cache:
- pip
python:
- "2.7"
- "3.5"
env:
- - DJANGO_VERSION=1.10
- - DJANGO_VERSION=1.11rc1
+ - DJANGO_VERSION=1.10 DATABASE_SETTINGS='settings_local.py'
+ - DJANGO_VERSION=1.11rc1 DATABASE_SETTINGS='settings_local.py'
install:
- pip install -r requirements.txt
- pip uninstall -y Django
- pip install -q Django==$DJANGO_VERSION
before_script:
- psql -c 'create database access_mo_django;' -U postgres
- psql -c "CREATE EXTENSION postgis;" -U postgres -d access_mo_django
script:
- flake8 am
- python am/manage.py migrate
- coverage run am/manage.py test house_scraper
- coverage run am/manage.py test senate_scraper
after_success:
- coveralls
addons:
postgresql: "9.4" | 4 | 0.114286 | 2 | 2 |
caa7f042cd333599ce75b9eb36927bdc44011628 | src/cheats/hll-compiler.pir | src/cheats/hll-compiler.pir |
.namespace ['HLL';'Compiler']
.sub '' :anon :init :load
load_bytecode 'PCT.pbc'
.local pmc p6meta
p6meta = get_hll_global 'P6metaclass'
p6meta.'new_class'('HLL::Compiler', 'parent'=>'PCT::HLLCompiler')
.end
.sub 'parse' :method
.param pmc source
.param pmc options :slurpy :named
.local pmc parsegrammar, parseactions, match
parsegrammar = self.'parsegrammar'()
parseactions = self.'parseactions'()
$I0 = isa parsegrammar, ['Regex';'Cursor']
unless $I0 goto parse_old
match = parsegrammar.'parse'(source, 'from'=>0, 'action'=>parseactions)
unless match goto err_parsefail
.return (match)
err_parsefail:
self.'panic'('Unable to parse source')
.return (match)
parse_old:
$I0 = isa parsegrammar, ['NameSpace']
if $I0 goto parse_old_1
## switch from protoobjects to classes, then call parent
$P0 = parsegrammar.'HOW'()
$P0 = getattribute $P0, 'parrotclass'
parsegrammar = $P0.'get_namespace'()
self.'parsegrammar'(parsegrammar)
$P0 = parseactions.'HOW'()
$P0 = getattribute $P0, 'parrotclass'
parseactions = $P0.'get_namespace'()
self.'parseactions'(parseactions)
parse_old_1:
$P0 = get_hll_global ['PCT'], 'HLLCompiler'
$P1 = find_method $P0, 'parse'
.tailcall self.$P1(source, options :flat :named)
.end
|
.namespace ['HLL';'Compiler']
.sub '' :anon :init :load
load_bytecode 'PCT.pbc'
.local pmc p6meta
p6meta = get_hll_global 'P6metaclass'
p6meta.'new_class'('HLL::Compiler', 'parent'=>'PCT::HLLCompiler')
.end
.sub 'parse' :method
.param pmc source
.param pmc options :slurpy :named
.local pmc parsegrammar, parseactions, match
parsegrammar = self.'parsegrammar'()
null parseactions
$S0 = options['target']
# if $S0 == 'parse' goto have_parseactions
parseactions = self.'parseactions'()
have_parseactions:
match = parsegrammar.'parse'(source, 'from'=>0, 'action'=>parseactions)
unless match goto err_parsefail
.return (match)
err_parsefail:
self.'panic'('Unable to parse source')
.return (match)
.end
| Remove obsolete emulation of old grammar+action setup. | [hll]: Remove obsolete emulation of old grammar+action setup.
| Parrot Internal Representation | artistic-2.0 | cygx/nqp,cygx/nqp,perl6/nqp-rx,cygx/nqp,cygx/nqp,cygx/nqp,cygx/nqp,cygx/nqp,cygx/nqp,perl6/nqp-rx | parrot-internal-representation | ## Code Before:
.namespace ['HLL';'Compiler']
.sub '' :anon :init :load
load_bytecode 'PCT.pbc'
.local pmc p6meta
p6meta = get_hll_global 'P6metaclass'
p6meta.'new_class'('HLL::Compiler', 'parent'=>'PCT::HLLCompiler')
.end
.sub 'parse' :method
.param pmc source
.param pmc options :slurpy :named
.local pmc parsegrammar, parseactions, match
parsegrammar = self.'parsegrammar'()
parseactions = self.'parseactions'()
$I0 = isa parsegrammar, ['Regex';'Cursor']
unless $I0 goto parse_old
match = parsegrammar.'parse'(source, 'from'=>0, 'action'=>parseactions)
unless match goto err_parsefail
.return (match)
err_parsefail:
self.'panic'('Unable to parse source')
.return (match)
parse_old:
$I0 = isa parsegrammar, ['NameSpace']
if $I0 goto parse_old_1
## switch from protoobjects to classes, then call parent
$P0 = parsegrammar.'HOW'()
$P0 = getattribute $P0, 'parrotclass'
parsegrammar = $P0.'get_namespace'()
self.'parsegrammar'(parsegrammar)
$P0 = parseactions.'HOW'()
$P0 = getattribute $P0, 'parrotclass'
parseactions = $P0.'get_namespace'()
self.'parseactions'(parseactions)
parse_old_1:
$P0 = get_hll_global ['PCT'], 'HLLCompiler'
$P1 = find_method $P0, 'parse'
.tailcall self.$P1(source, options :flat :named)
.end
## Instruction:
[hll]: Remove obsolete emulation of old grammar+action setup.
## Code After:
.namespace ['HLL';'Compiler']
.sub '' :anon :init :load
load_bytecode 'PCT.pbc'
.local pmc p6meta
p6meta = get_hll_global 'P6metaclass'
p6meta.'new_class'('HLL::Compiler', 'parent'=>'PCT::HLLCompiler')
.end
.sub 'parse' :method
.param pmc source
.param pmc options :slurpy :named
.local pmc parsegrammar, parseactions, match
parsegrammar = self.'parsegrammar'()
null parseactions
$S0 = options['target']
# if $S0 == 'parse' goto have_parseactions
parseactions = self.'parseactions'()
have_parseactions:
match = parsegrammar.'parse'(source, 'from'=>0, 'action'=>parseactions)
unless match goto err_parsefail
.return (match)
err_parsefail:
self.'panic'('Unable to parse source')
.return (match)
.end
|
.namespace ['HLL';'Compiler']
.sub '' :anon :init :load
load_bytecode 'PCT.pbc'
.local pmc p6meta
p6meta = get_hll_global 'P6metaclass'
p6meta.'new_class'('HLL::Compiler', 'parent'=>'PCT::HLLCompiler')
.end
.sub 'parse' :method
.param pmc source
.param pmc options :slurpy :named
.local pmc parsegrammar, parseactions, match
parsegrammar = self.'parsegrammar'()
+
+ null parseactions
+ $S0 = options['target']
+ # if $S0 == 'parse' goto have_parseactions
parseactions = self.'parseactions'()
+ have_parseactions:
- $I0 = isa parsegrammar, ['Regex';'Cursor']
- unless $I0 goto parse_old
match = parsegrammar.'parse'(source, 'from'=>0, 'action'=>parseactions)
unless match goto err_parsefail
.return (match)
err_parsefail:
self.'panic'('Unable to parse source')
.return (match)
-
- parse_old:
- $I0 = isa parsegrammar, ['NameSpace']
- if $I0 goto parse_old_1
- ## switch from protoobjects to classes, then call parent
- $P0 = parsegrammar.'HOW'()
- $P0 = getattribute $P0, 'parrotclass'
- parsegrammar = $P0.'get_namespace'()
- self.'parsegrammar'(parsegrammar)
-
- $P0 = parseactions.'HOW'()
- $P0 = getattribute $P0, 'parrotclass'
- parseactions = $P0.'get_namespace'()
- self.'parseactions'(parseactions)
-
- parse_old_1:
- $P0 = get_hll_global ['PCT'], 'HLLCompiler'
- $P1 = find_method $P0, 'parse'
- .tailcall self.$P1(source, options :flat :named)
.end | 26 | 0.530612 | 5 | 21 |
3f17def59779608c455f00c1fcea3ceaf23fd60e | test/fixture.sh | test/fixture.sh | TMP=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )/tmp
# Recreate tmp folder
rm -rf "$TMP"
mkdir -p "$TMP"
cd "$TMP"
# Create repository and master branch
git init
# Make fixture
git commit -m "Dummy" --allow-empty
git checkout -b feature_one 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_two 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_three 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout feature_two 2> /dev/null
git checkout master 2> /dev/null
| TMP=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )/tmp
# Recreate tmp folder
rm -rf "$TMP"
mkdir -p "$TMP"
cd "$TMP"
# Create repository and master branch
git init
# Make fixture
git commit -m "Dummy" --allow-empty
git checkout -b feature_one 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_two 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_three 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout feature_two 2> /dev/null
git checkout master 2> /dev/null
# Create base configuration
git config --local --add switch.count 9
git config --local --add switch.order modified
| Set default configuration before running tests | Set default configuration before running tests
| Shell | mit | san650/git-switch,san650/git-switch | shell | ## Code Before:
TMP=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )/tmp
# Recreate tmp folder
rm -rf "$TMP"
mkdir -p "$TMP"
cd "$TMP"
# Create repository and master branch
git init
# Make fixture
git commit -m "Dummy" --allow-empty
git checkout -b feature_one 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_two 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_three 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout feature_two 2> /dev/null
git checkout master 2> /dev/null
## Instruction:
Set default configuration before running tests
## Code After:
TMP=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )/tmp
# Recreate tmp folder
rm -rf "$TMP"
mkdir -p "$TMP"
cd "$TMP"
# Create repository and master branch
git init
# Make fixture
git commit -m "Dummy" --allow-empty
git checkout -b feature_one 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_two 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_three 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout feature_two 2> /dev/null
git checkout master 2> /dev/null
# Create base configuration
git config --local --add switch.count 9
git config --local --add switch.order modified
| TMP=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )/tmp
# Recreate tmp folder
rm -rf "$TMP"
mkdir -p "$TMP"
cd "$TMP"
# Create repository and master branch
git init
# Make fixture
git commit -m "Dummy" --allow-empty
git checkout -b feature_one 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_two 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout -b feature_three 2> /dev/null
git commit -m "Dummy" --allow-empty
git checkout feature_two 2> /dev/null
git checkout master 2> /dev/null
+
+ # Create base configuration
+ git config --local --add switch.count 9
+ git config --local --add switch.order modified | 4 | 0.2 | 4 | 0 |
cb798ae8f7f6e810a87137a56cd04be76596a2dd | photutils/tests/test_psfs.py | photutils/tests/test_psfs.py | from __future__ import division
import numpy as np
from astropy.tests.helper import pytest
from photutils.psf import GaussianPSF
try:
from scipy import optimize
HAS_SCIPY = True
except ImportError:
HAS_SCIPY = False
widths = [0.001, 0.01, 0.1, 1]
@pytest.mark.skipif('not HAS_SCIPY')
@pytest.mark.parametrize(('width'), widths)
def test_subpixel_gauss_psf(width):
"""
Test subpixel accuracy of Gaussian PSF by checking the sum o pixels.
"""
gauss_psf = GaussianPSF(width)
y, x = np.mgrid[-10:11, -10:11]
assert np.abs(gauss_psf(x, y).sum() - 1) < 1E-12
@pytest.mark.skipif('not HAS_SCIPY')
def test_gaussian_PSF_integral():
"""
Test if Gaussian PSF integrates to unity on larger scales.
"""
psf = GaussianPSF(10)
y, x = np.mgrid[-100:101, -100:101]
assert np.abs(psf(y, x).sum() - 1) < 1E-12
| from __future__ import division
import numpy as np
from astropy.tests.helper import pytest
from ..psf import GaussianPSF
try:
from scipy import optimize
HAS_SCIPY = True
except ImportError:
HAS_SCIPY = False
widths = [0.001, 0.01, 0.1, 1]
@pytest.mark.skipif('not HAS_SCIPY')
@pytest.mark.parametrize(('width'), widths)
def test_subpixel_gauss_psf(width):
"""
Test subpixel accuracy of Gaussian PSF by checking the sum o pixels.
"""
gauss_psf = GaussianPSF(width)
y, x = np.mgrid[-10:11, -10:11]
assert np.abs(gauss_psf(x, y).sum() - 1) < 1E-12
@pytest.mark.skipif('not HAS_SCIPY')
def test_gaussian_PSF_integral():
"""
Test if Gaussian PSF integrates to unity on larger scales.
"""
psf = GaussianPSF(10)
y, x = np.mgrid[-100:101, -100:101]
assert np.abs(psf(y, x).sum() - 1) < 1E-12
| Use relative imports for consistency; pep8 | Use relative imports for consistency; pep8
| Python | bsd-3-clause | larrybradley/photutils,astropy/photutils | python | ## Code Before:
from __future__ import division
import numpy as np
from astropy.tests.helper import pytest
from photutils.psf import GaussianPSF
try:
from scipy import optimize
HAS_SCIPY = True
except ImportError:
HAS_SCIPY = False
widths = [0.001, 0.01, 0.1, 1]
@pytest.mark.skipif('not HAS_SCIPY')
@pytest.mark.parametrize(('width'), widths)
def test_subpixel_gauss_psf(width):
"""
Test subpixel accuracy of Gaussian PSF by checking the sum o pixels.
"""
gauss_psf = GaussianPSF(width)
y, x = np.mgrid[-10:11, -10:11]
assert np.abs(gauss_psf(x, y).sum() - 1) < 1E-12
@pytest.mark.skipif('not HAS_SCIPY')
def test_gaussian_PSF_integral():
"""
Test if Gaussian PSF integrates to unity on larger scales.
"""
psf = GaussianPSF(10)
y, x = np.mgrid[-100:101, -100:101]
assert np.abs(psf(y, x).sum() - 1) < 1E-12
## Instruction:
Use relative imports for consistency; pep8
## Code After:
from __future__ import division
import numpy as np
from astropy.tests.helper import pytest
from ..psf import GaussianPSF
try:
from scipy import optimize
HAS_SCIPY = True
except ImportError:
HAS_SCIPY = False
widths = [0.001, 0.01, 0.1, 1]
@pytest.mark.skipif('not HAS_SCIPY')
@pytest.mark.parametrize(('width'), widths)
def test_subpixel_gauss_psf(width):
"""
Test subpixel accuracy of Gaussian PSF by checking the sum o pixels.
"""
gauss_psf = GaussianPSF(width)
y, x = np.mgrid[-10:11, -10:11]
assert np.abs(gauss_psf(x, y).sum() - 1) < 1E-12
@pytest.mark.skipif('not HAS_SCIPY')
def test_gaussian_PSF_integral():
"""
Test if Gaussian PSF integrates to unity on larger scales.
"""
psf = GaussianPSF(10)
y, x = np.mgrid[-100:101, -100:101]
assert np.abs(psf(y, x).sum() - 1) < 1E-12
| from __future__ import division
-
import numpy as np
-
from astropy.tests.helper import pytest
- from photutils.psf import GaussianPSF
? ^^^^^^^^^ -
+ from ..psf import GaussianPSF
? ^
-
try:
from scipy import optimize
HAS_SCIPY = True
except ImportError:
HAS_SCIPY = False
widths = [0.001, 0.01, 0.1, 1]
+
@pytest.mark.skipif('not HAS_SCIPY')
@pytest.mark.parametrize(('width'), widths)
def test_subpixel_gauss_psf(width):
"""
Test subpixel accuracy of Gaussian PSF by checking the sum o pixels.
"""
gauss_psf = GaussianPSF(width)
y, x = np.mgrid[-10:11, -10:11]
assert np.abs(gauss_psf(x, y).sum() - 1) < 1E-12
-
+
+
- @pytest.mark.skipif('not HAS_SCIPY')
? ----
+ @pytest.mark.skipif('not HAS_SCIPY')
def test_gaussian_PSF_integral():
"""
Test if Gaussian PSF integrates to unity on larger scales.
"""
psf = GaussianPSF(10)
y, x = np.mgrid[-100:101, -100:101]
- assert np.abs(psf(y, x).sum() - 1) < 1E-12
? -
+ assert np.abs(psf(y, x).sum() - 1) < 1E-12
-
- | 15 | 0.416667 | 6 | 9 |
49fd555ed9e25e437916166783c7d5a1778c1372 | lisp/init-projectile.el | lisp/init-projectile.el | (when (maybe-require-package 'projectile)
(add-hook 'after-init-hook 'projectile-mode)
(after-load 'projectile
(define-key projectile-mode-map (kbd "C-c C-p") 'projectile-command-map)
;; Shorter modeline
(setq-default projectile-mode-line-lighter " Proj"))
(maybe-require-package 'ibuffer-projectile))
(provide 'init-projectile)
| (when (maybe-require-package 'projectile)
(add-hook 'after-init-hook 'projectile-mode)
;; Shorter modeline
(setq-default projectile-mode-line-lighter " Proj")
(after-load 'projectile
(define-key projectile-mode-map (kbd "C-c C-p") 'projectile-command-map))
(maybe-require-package 'ibuffer-projectile))
(provide 'init-projectile)
| Set shorter projectile modeline correctly | Set shorter projectile modeline correctly
| Emacs Lisp | bsd-2-clause | benkha/emacs.d,cjqw/emacs.d,blueabysm/emacs.d,qianwan/emacs.d,emuio/emacs.d,krzysz00/emacs.d,purcell/emacs.d,braveoyster/emacs.d,me020523/emacs.d,baohaojun/emacs.d,kindoblue/emacs.d,gsmlg/emacs.d,svenyurgensson/emacs.d,blueseason/emacs.d,kongfy/emacs.d,dcorking/emacs.d,arthurl/emacs.d,wegatron/emacs.d,mmqmzk/emacs.d,lust4life/emacs.d,sgarciac/emacs.d | emacs-lisp | ## Code Before:
(when (maybe-require-package 'projectile)
(add-hook 'after-init-hook 'projectile-mode)
(after-load 'projectile
(define-key projectile-mode-map (kbd "C-c C-p") 'projectile-command-map)
;; Shorter modeline
(setq-default projectile-mode-line-lighter " Proj"))
(maybe-require-package 'ibuffer-projectile))
(provide 'init-projectile)
## Instruction:
Set shorter projectile modeline correctly
## Code After:
(when (maybe-require-package 'projectile)
(add-hook 'after-init-hook 'projectile-mode)
;; Shorter modeline
(setq-default projectile-mode-line-lighter " Proj")
(after-load 'projectile
(define-key projectile-mode-map (kbd "C-c C-p") 'projectile-command-map))
(maybe-require-package 'ibuffer-projectile))
(provide 'init-projectile)
| (when (maybe-require-package 'projectile)
(add-hook 'after-init-hook 'projectile-mode)
+ ;; Shorter modeline
+ (setq-default projectile-mode-line-lighter " Proj")
+
(after-load 'projectile
- (define-key projectile-mode-map (kbd "C-c C-p") 'projectile-command-map)
+ (define-key projectile-mode-map (kbd "C-c C-p") 'projectile-command-map))
? +
-
- ;; Shorter modeline
- (setq-default projectile-mode-line-lighter " Proj"))
(maybe-require-package 'ibuffer-projectile))
(provide 'init-projectile) | 8 | 0.615385 | 4 | 4 |
cb41c14fdbca0f385449ca143f9594dc1a8c5b3a | install-win.ps1 | install-win.ps1 | if (Test-Path $PROFILE) {
$file = Get-Item $PROFILE -Force -ea 0
$symlink = $file.Attributes -band [IO.FileAttributes]::ReparsePoint
if (-Not $symlink) {
Remove-Item $PROFILE
cmd /c mklink "$PROFILE" "$($args[0])"
}
} else {
$profilePath = Split-Path $PROFILE
if (!(Test-Path -Path $profilePath)) {
New-Item -ItemType directory -Path $profilePath | Out-Null
}
cmd /c mklink "$PROFILE" "$($args[0])"
}
. $PROFILE
iex (new-object net.webclient).downloadstring('https://get.scoop.sh')
scoop install concfg
concfg import -n solarized-light small concfg\source-code-pro.json
concfg clean
| if (Test-Path $PROFILE) {
$file = Get-Item $PROFILE -Force -ea 0
$symlink = $file.Attributes -band [IO.FileAttributes]::ReparsePoint
if (-Not $symlink) {
Remove-Item $PROFILE
cmd /c mklink "$PROFILE" "$($args[0])"
}
} else {
$profilePath = Split-Path $PROFILE
if (!(Test-Path -Path $profilePath)) {
New-Item -ItemType directory -Path $profilePath | Out-Null
}
cmd /c mklink "$PROFILE" "$($args[0])"
}
. $PROFILE
iex (new-object net.webclient).downloadstring('https://get.scoop.sh')
scoop update
scoop install concfg
scoop update concfg
concfg import -n solarized-light small concfg\source-code-pro.json
concfg clean
| Update scoop and concfg in install script | Update scoop and concfg in install script
| PowerShell | mit | ArloL/dotfiles,ArloL/dotfiles | powershell | ## Code Before:
if (Test-Path $PROFILE) {
$file = Get-Item $PROFILE -Force -ea 0
$symlink = $file.Attributes -band [IO.FileAttributes]::ReparsePoint
if (-Not $symlink) {
Remove-Item $PROFILE
cmd /c mklink "$PROFILE" "$($args[0])"
}
} else {
$profilePath = Split-Path $PROFILE
if (!(Test-Path -Path $profilePath)) {
New-Item -ItemType directory -Path $profilePath | Out-Null
}
cmd /c mklink "$PROFILE" "$($args[0])"
}
. $PROFILE
iex (new-object net.webclient).downloadstring('https://get.scoop.sh')
scoop install concfg
concfg import -n solarized-light small concfg\source-code-pro.json
concfg clean
## Instruction:
Update scoop and concfg in install script
## Code After:
if (Test-Path $PROFILE) {
$file = Get-Item $PROFILE -Force -ea 0
$symlink = $file.Attributes -band [IO.FileAttributes]::ReparsePoint
if (-Not $symlink) {
Remove-Item $PROFILE
cmd /c mklink "$PROFILE" "$($args[0])"
}
} else {
$profilePath = Split-Path $PROFILE
if (!(Test-Path -Path $profilePath)) {
New-Item -ItemType directory -Path $profilePath | Out-Null
}
cmd /c mklink "$PROFILE" "$($args[0])"
}
. $PROFILE
iex (new-object net.webclient).downloadstring('https://get.scoop.sh')
scoop update
scoop install concfg
scoop update concfg
concfg import -n solarized-light small concfg\source-code-pro.json
concfg clean
| if (Test-Path $PROFILE) {
$file = Get-Item $PROFILE -Force -ea 0
$symlink = $file.Attributes -band [IO.FileAttributes]::ReparsePoint
if (-Not $symlink) {
Remove-Item $PROFILE
cmd /c mklink "$PROFILE" "$($args[0])"
}
} else {
$profilePath = Split-Path $PROFILE
if (!(Test-Path -Path $profilePath)) {
New-Item -ItemType directory -Path $profilePath | Out-Null
}
cmd /c mklink "$PROFILE" "$($args[0])"
}
. $PROFILE
iex (new-object net.webclient).downloadstring('https://get.scoop.sh')
+ scoop update
scoop install concfg
+ scoop update concfg
concfg import -n solarized-light small concfg\source-code-pro.json
concfg clean | 2 | 0.095238 | 2 | 0 |
bb922e3b9e93e481e20d77589892dc520b862007 | docker-resources/init-minerva-db-and-instance.sh | docker-resources/init-minerva-db-and-instance.sh | export MINERVA_DB_NAME=minerva
export PGDATABASE=$MINERVA_DB_NAME
export PGUSER=postgres
export ADD_PGTAB_EXTENSION=true
export PYTHONUNBUFFERED=1
create-minerva-database
minerva initialize -i /instance
| export MINERVA_DB_NAME=minerva
export PGDATABASE=$MINERVA_DB_NAME
export PGUSER=postgres
export ADD_PGTAB_EXTENSION=true
export PYTHONUNBUFFERED=1
create-minerva-database
if [[ ! -z "$LOAD_SAMPLE_DATA" ]]
then
minerva initialize -i /instance --load-sample-data
else
minerva initialize -i /instance
fi
| Add support for environment variable LOAD_SAMPLE_DATA | Add support for environment variable LOAD_SAMPLE_DATA
| Shell | agpl-3.0 | hendrikx-itc/minerva,hendrikx-itc/minerva | shell | ## Code Before:
export MINERVA_DB_NAME=minerva
export PGDATABASE=$MINERVA_DB_NAME
export PGUSER=postgres
export ADD_PGTAB_EXTENSION=true
export PYTHONUNBUFFERED=1
create-minerva-database
minerva initialize -i /instance
## Instruction:
Add support for environment variable LOAD_SAMPLE_DATA
## Code After:
export MINERVA_DB_NAME=minerva
export PGDATABASE=$MINERVA_DB_NAME
export PGUSER=postgres
export ADD_PGTAB_EXTENSION=true
export PYTHONUNBUFFERED=1
create-minerva-database
if [[ ! -z "$LOAD_SAMPLE_DATA" ]]
then
minerva initialize -i /instance --load-sample-data
else
minerva initialize -i /instance
fi
| export MINERVA_DB_NAME=minerva
export PGDATABASE=$MINERVA_DB_NAME
export PGUSER=postgres
export ADD_PGTAB_EXTENSION=true
export PYTHONUNBUFFERED=1
create-minerva-database
+ if [[ ! -z "$LOAD_SAMPLE_DATA" ]]
+ then
+ minerva initialize -i /instance --load-sample-data
+ else
- minerva initialize -i /instance
+ minerva initialize -i /instance
? ++++
+ fi
+ | 8 | 0.888889 | 7 | 1 |
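The diff gates `--load-sample-data` on `[[ ! -z "$LOAD_SAMPLE_DATA" ]]`, so any non-empty value enables sample data while an unset variable and an empty string both leave it off. A minimal Python sketch of the same decision — argument names are copied from the script, the logic is only illustrative:

```python
def initialize_args(env):
    """Build the `minerva initialize` argument list the way the script's
    [[ ! -z "$LOAD_SAMPLE_DATA" ]] test does: any non-empty value enables
    sample data; unset and empty-string both leave it off."""
    args = ["minerva", "initialize", "-i", "/instance"]
    if env.get("LOAD_SAMPLE_DATA"):  # falsy for missing or "" — same as ! -z
        args.append("--load-sample-data")
    return args
```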
416972f118431d1bd8794e9f55cab7edd82fc884 | lib/rp_capistrano/resque.rb | lib/rp_capistrano/resque.rb | module RPCapistrano
module Resque
def self.load_into(configuration)
configuration.load do
after 'deploy:restart', 'rp:resque:load_god_config'
after 'rp:resque:load_god_config', 'rp:resque:kill_processes'
namespace :rp do
namespace :resque do
desc "Load god config for resque"
task :load_god_config, :roles => :god, :on_no_matching_servers => :continue do
puts " ** LOADING RESQUE GOD CONFIG ====================================="
run "sudo /usr/local/rvm/bin/boot_god load #{release_path}/config/god/resque.god"
end
desc "Kill resque workers and reload god config"
task :kill_processes, :roles => :god, :on_no_matching_servers => :continue do
puts " ** KILLING RESQUE WORKERS ========================================"
run "sudo ps -e -o pid,command | grep resque-1 | awk '{ if ($2!=\"grep\") system(\"echo Killing \" $2 \" \" $1 \";sudo kill -3 \" $1)}'" do |ch, stream, data|
puts " > #{data}"
end
end
end
end
end
end
end
end
if Capistrano::Configuration.instance
RPCapistrano::Resque.load_into(Capistrano::Configuration.instance)
end
| module RPCapistrano
module Resque
def self.load_into(configuration)
configuration.load do
after 'deploy:restart', 'rp:resque:load_god_config'
after 'rp:resque:load_god_config', 'rp:resque:restart_god_workers'
namespace :rp do
namespace :resque do
desc "Load god config for resque"
task :load_god_config, :roles => :god, :on_no_matching_servers => :continue do
puts " ** LOADING RESQUE GOD CONFIG ====================================="
run "sudo /usr/local/rvm/bin/boot_god load #{release_path}/config/god/resque.god"
end
desc "Restart god workers"
task :restart_god_workers, :roles => :god, :on_no_matching_servers => :continue do
puts " ** LOADING RESQUE GOD CONFIG ====================================="
run "sudo /usr/local/rvm/bin/boot_god restart #{app_name}-resque" do |ch, stream, data|
puts data
end
end
desc "Kill resque workers and reload god config"
task :kill_processes, :roles => :god, :on_no_matching_servers => :continue do
puts " ** KILLING RESQUE WORKERS ========================================"
run "sudo ps -e -o pid,command | grep resque-1 | awk '{ if ($2!=\"grep\") system(\"echo Killing \" $2 \" \" $1 \";sudo kill -3 \" $1)}'" do |ch, stream, data|
puts " > #{data}"
end
end
end
end
end
end
end
end
if Capistrano::Configuration.instance
RPCapistrano::Resque.load_into(Capistrano::Configuration.instance)
end
| Add a task to gracefully restart workers using god | Add a task to gracefully restart workers using god
| Ruby | mit | RevolutionPrep/rp_capistrano | ruby | ## Code Before:
module RPCapistrano
module Resque
def self.load_into(configuration)
configuration.load do
after 'deploy:restart', 'rp:resque:load_god_config'
after 'rp:resque:load_god_config', 'rp:resque:kill_processes'
namespace :rp do
namespace :resque do
desc "Load god config for resque"
task :load_god_config, :roles => :god, :on_no_matching_servers => :continue do
puts " ** LOADING RESQUE GOD CONFIG ====================================="
run "sudo /usr/local/rvm/bin/boot_god load #{release_path}/config/god/resque.god"
end
desc "Kill resque workers and reload god config"
task :kill_processes, :roles => :god, :on_no_matching_servers => :continue do
puts " ** KILLING RESQUE WORKERS ========================================"
run "sudo ps -e -o pid,command | grep resque-1 | awk '{ if ($2!=\"grep\") system(\"echo Killing \" $2 \" \" $1 \";sudo kill -3 \" $1)}'" do |ch, stream, data|
puts " > #{data}"
end
end
end
end
end
end
end
end
if Capistrano::Configuration.instance
RPCapistrano::Resque.load_into(Capistrano::Configuration.instance)
end
## Instruction:
Add a task to gracefully restart workers using god
## Code After:
module RPCapistrano
module Resque
def self.load_into(configuration)
configuration.load do
after 'deploy:restart', 'rp:resque:load_god_config'
after 'rp:resque:load_god_config', 'rp:resque:restart_god_workers'
namespace :rp do
namespace :resque do
desc "Load god config for resque"
task :load_god_config, :roles => :god, :on_no_matching_servers => :continue do
puts " ** LOADING RESQUE GOD CONFIG ====================================="
run "sudo /usr/local/rvm/bin/boot_god load #{release_path}/config/god/resque.god"
end
desc "Restart god workers"
task :restart_god_workers, :roles => :god, :on_no_matching_servers => :continue do
puts " ** LOADING RESQUE GOD CONFIG ====================================="
run "sudo /usr/local/rvm/bin/boot_god restart #{app_name}-resque" do |ch, stream, data|
puts data
end
end
desc "Kill resque workers and reload god config"
task :kill_processes, :roles => :god, :on_no_matching_servers => :continue do
puts " ** KILLING RESQUE WORKERS ========================================"
run "sudo ps -e -o pid,command | grep resque-1 | awk '{ if ($2!=\"grep\") system(\"echo Killing \" $2 \" \" $1 \";sudo kill -3 \" $1)}'" do |ch, stream, data|
puts " > #{data}"
end
end
end
end
end
end
end
end
if Capistrano::Configuration.instance
RPCapistrano::Resque.load_into(Capistrano::Configuration.instance)
end
| module RPCapistrano
module Resque
def self.load_into(configuration)
configuration.load do
after 'deploy:restart', 'rp:resque:load_god_config'
- after 'rp:resque:load_god_config', 'rp:resque:kill_processes'
? ------ -- ^
+ after 'rp:resque:load_god_config', 'rp:resque:restart_god_workers'
? ^^^^^^^^^^^^^ +
namespace :rp do
namespace :resque do
desc "Load god config for resque"
task :load_god_config, :roles => :god, :on_no_matching_servers => :continue do
puts " ** LOADING RESQUE GOD CONFIG ====================================="
run "sudo /usr/local/rvm/bin/boot_god load #{release_path}/config/god/resque.god"
+ end
+
+ desc "Restart god workers"
+ task :restart_god_workers, :roles => :god, :on_no_matching_servers => :continue do
+ puts " ** LOADING RESQUE GOD CONFIG ====================================="
+ run "sudo /usr/local/rvm/bin/boot_god restart #{app_name}-resque" do |ch, stream, data|
+ puts data
+ end
end
desc "Kill resque workers and reload god config"
task :kill_processes, :roles => :god, :on_no_matching_servers => :continue do
puts " ** KILLING RESQUE WORKERS ========================================"
run "sudo ps -e -o pid,command | grep resque-1 | awk '{ if ($2!=\"grep\") system(\"echo Killing \" $2 \" \" $1 \";sudo kill -3 \" $1)}'" do |ch, stream, data|
puts " > #{data}"
end
end
end
end
end
end
end
end
if Capistrano::Configuration.instance
RPCapistrano::Resque.load_into(Capistrano::Configuration.instance)
end | 10 | 0.3125 | 9 | 1 |
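The recipe chains `after 'deploy:restart', 'rp:resque:load_god_config'` and then `after 'rp:resque:load_god_config', 'rp:resque:restart_god_workers'`, so the god config is always loaded before the workers are restarted. A rough Python sketch of that hook chaining — a toy registry, not Capistrano's real implementation:

```python
hooks = {}

def after(task, follow_up):
    """Register follow_up to run when task completes (Capistrano's `after`)."""
    hooks.setdefault(task, []).append(follow_up)

def run(task, log=None):
    """Run a task, then its registered follow-ups, depth-first."""
    if log is None:
        log = []
    log.append(task)
    for follow_up in hooks.get(task, []):
        run(follow_up, log)
    return log

# The two hook declarations from the recipe above:
after("deploy:restart", "rp:resque:load_god_config")
after("rp:resque:load_god_config", "rp:resque:restart_god_workers")
```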
dd0bbab7f31ac5e71dfdaf097f6c2aa5abf66982 | tensorflow_datasets/community-datasets.toml | tensorflow_datasets/community-datasets.toml | [Namespaces]
# You can add your own datasets here to register them in TFDS. See details
# at: https://www.tensorflow.org/datasets/community_catalog/overview
huggingface = "github://huggingface/datasets/tree/master/datasets"
robotics = "gs://gresearch/robotics"
| [Namespaces]
# You can add your own datasets here to register them in TFDS. See details
# at: https://www.tensorflow.org/datasets/community_catalog/overview
huggingface = "github://huggingface/datasets/tree/master/datasets"
kubric = "gs://kubric-public/tfds"
robotics = "gs://gresearch/robotics"
| Add Kubric to community datasets | Add Kubric to community datasets
PiperOrigin-RevId: 436724480
| TOML | apache-2.0 | tensorflow/datasets,tensorflow/datasets,tensorflow/datasets,tensorflow/datasets,tensorflow/datasets | toml | ## Code Before:
[Namespaces]
# You can add your own datasets here to register them in TFDS. See details
# at: https://www.tensorflow.org/datasets/community_catalog/overview
huggingface = "github://huggingface/datasets/tree/master/datasets"
robotics = "gs://gresearch/robotics"
## Instruction:
Add Kubric to community datasets
PiperOrigin-RevId: 436724480
## Code After:
[Namespaces]
# You can add your own datasets here to register them in TFDS. See details
# at: https://www.tensorflow.org/datasets/community_catalog/overview
huggingface = "github://huggingface/datasets/tree/master/datasets"
kubric = "gs://kubric-public/tfds"
robotics = "gs://gresearch/robotics"
| [Namespaces]
# You can add your own datasets here to register them in TFDS. See details
# at: https://www.tensorflow.org/datasets/community_catalog/overview
huggingface = "github://huggingface/datasets/tree/master/datasets"
+ kubric = "gs://kubric-public/tfds"
robotics = "gs://gresearch/robotics" | 1 | 0.2 | 1 | 0 |
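The change adds one `key = "value"` line under `[Namespaces]`. To make the shape of that table concrete, here is a deliberately tiny Python reader for it — a real consumer would use a TOML parser; the parsing rules below are an assumption that covers only this file's layout:

```python
def parse_namespaces(text):
    """Tiny reader for the `key = "value"` lines in a [Namespaces] table.
    Only covers this file's shape; real code would use a TOML parser."""
    namespaces, in_section = {}, False
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("["):
            in_section = (line == "[Namespaces]")
            continue
        if in_section and "=" in line:
            key, value = line.split("=", 1)
            namespaces[key.strip()] = value.strip().strip('"')
    return namespaces

CONFIG = """
[Namespaces]
huggingface = "github://huggingface/datasets/tree/master/datasets"
kubric = "gs://kubric-public/tfds"
robotics = "gs://gresearch/robotics"
"""
```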
993b472d93a0d3d5c4309af3a0685fd80db62499 | packages/co/constructible.yaml | packages/co/constructible.yaml | homepage: http://andersk.mit.edu/haskell/constructible/
changelog-type: ''
hash: 0795284c06b54dd8f7594a4263f594ae4088525fbc4122ff5db72cf40480b0c0
test-bench-deps: {}
maintainer: Anders Kaseorg <andersk@mit.edu>
synopsis: Exact computation with constructible real numbers
changelog: ''
basic-deps:
binary-search: ! '>=0.0'
complex-generic: ! '>=0.1'
base: ==4.*
arithmoi: ! '>=0.1'
all-versions:
- '0.1'
- 0.1.0.1
author: Anders Kaseorg <andersk@mit.edu>
latest: 0.1.0.1
description-type: haddock
description: ! 'The constructible reals are the subset of the real numbers that can
be represented exactly using field operations (addition,
subtraction, multiplication, division) and positive square roots.
They support exact computations, equality comparisons, and ordering.'
license-name: BSD-3-Clause
| homepage: http://andersk.mit.edu/haskell/constructible/
changelog-type: ''
hash: 8deb5610cc26c648194edba30ce734a9097b5eb43a1ea27b157cb2273968ef8a
test-bench-deps: {}
maintainer: Anders Kaseorg <andersk@mit.edu>
synopsis: Exact computation with constructible real numbers
changelog: ''
basic-deps:
binary-search: ! '>=0.0'
complex-generic: ! '>=0.1'
base: ==4.*
integer-roots: ! '>=1.0'
all-versions:
- '0.1'
- 0.1.0.1
- 0.1.1
author: Anders Kaseorg <andersk@mit.edu>
latest: 0.1.1
description-type: haddock
description: |-
The constructible reals are the subset of the real numbers that can
be represented exactly using field operations (addition,
subtraction, multiplication, division) and positive square roots.
They support exact computations, equality comparisons, and ordering.
license-name: BSD-3-Clause
| Update from Hackage at 2020-02-09T13:49:24Z | Update from Hackage at 2020-02-09T13:49:24Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: http://andersk.mit.edu/haskell/constructible/
changelog-type: ''
hash: 0795284c06b54dd8f7594a4263f594ae4088525fbc4122ff5db72cf40480b0c0
test-bench-deps: {}
maintainer: Anders Kaseorg <andersk@mit.edu>
synopsis: Exact computation with constructible real numbers
changelog: ''
basic-deps:
binary-search: ! '>=0.0'
complex-generic: ! '>=0.1'
base: ==4.*
arithmoi: ! '>=0.1'
all-versions:
- '0.1'
- 0.1.0.1
author: Anders Kaseorg <andersk@mit.edu>
latest: 0.1.0.1
description-type: haddock
description: ! 'The constructible reals are the subset of the real numbers that can
be represented exactly using field operations (addition,
subtraction, multiplication, division) and positive square roots.
They support exact computations, equality comparisons, and ordering.'
license-name: BSD-3-Clause
## Instruction:
Update from Hackage at 2020-02-09T13:49:24Z
## Code After:
homepage: http://andersk.mit.edu/haskell/constructible/
changelog-type: ''
hash: 8deb5610cc26c648194edba30ce734a9097b5eb43a1ea27b157cb2273968ef8a
test-bench-deps: {}
maintainer: Anders Kaseorg <andersk@mit.edu>
synopsis: Exact computation with constructible real numbers
changelog: ''
basic-deps:
binary-search: ! '>=0.0'
complex-generic: ! '>=0.1'
base: ==4.*
integer-roots: ! '>=1.0'
all-versions:
- '0.1'
- 0.1.0.1
- 0.1.1
author: Anders Kaseorg <andersk@mit.edu>
latest: 0.1.1
description-type: haddock
description: |-
The constructible reals are the subset of the real numbers that can
be represented exactly using field operations (addition,
subtraction, multiplication, division) and positive square roots.
They support exact computations, equality comparisons, and ordering.
license-name: BSD-3-Clause
| homepage: http://andersk.mit.edu/haskell/constructible/
changelog-type: ''
- hash: 0795284c06b54dd8f7594a4263f594ae4088525fbc4122ff5db72cf40480b0c0
+ hash: 8deb5610cc26c648194edba30ce734a9097b5eb43a1ea27b157cb2273968ef8a
test-bench-deps: {}
maintainer: Anders Kaseorg <andersk@mit.edu>
synopsis: Exact computation with constructible real numbers
changelog: ''
basic-deps:
binary-search: ! '>=0.0'
complex-generic: ! '>=0.1'
base: ==4.*
- arithmoi: ! '>=0.1'
+ integer-roots: ! '>=1.0'
all-versions:
- '0.1'
- 0.1.0.1
+ - 0.1.1
author: Anders Kaseorg <andersk@mit.edu>
- latest: 0.1.0.1
? --
+ latest: 0.1.1
description-type: haddock
+ description: |-
- description: ! 'The constructible reals are the subset of the real numbers that can
? ------------ - -
+ The constructible reals are the subset of the real numbers that can
-
be represented exactly using field operations (addition,
-
subtraction, multiplication, division) and positive square roots.
-
- They support exact computations, equality comparisons, and ordering.'
? -
+ They support exact computations, equality comparisons, and ordering.
license-name: BSD-3-Clause | 15 | 0.576923 | 7 | 8 |
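The metadata bump adds `0.1.1` to `all-versions` and moves `latest` with it. Picking `latest` amounts to comparing dotted version strings numerically rather than lexically, which can be sketched as:

```python
def version_key(version):
    """'0.1.0.1' -> (0, 1, 0, 1), so versions order numerically; plain
    string order happens to work for these values but breaks once a
    component reaches 10."""
    return tuple(int(part) for part in version.split("."))

all_versions = ["0.1", "0.1.0.1", "0.1.1"]  # from the updated metadata
latest = max(all_versions, key=version_key)
```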
922d7db1b9c8ae53c123ac4bb4e1a6a2d07bd664 | Tools/package.ps1 | Tools/package.ps1 | $ErrorActionPreference = "Stop"
$rootFolder = ".."
$sourceFolder = "$rootFolder\Source"
$buildFolder = "$rootFolder\build"
$packageFolder = "$rootFolder\package"
function cleanup_package_folder()
{
if (Test-Path $packageFolder)
{
Remove-Item $packageFolder -Recurse -Force
}
New-Item $packageFolder -Type Directory | Out-Null
}
function zip($sourceDir, $outputFileName)
{
Add-Type -Assembly System.IO.Compression.FileSystem | Out-Null
$compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
[System.IO.Compression.ZipFile]::CreateFromDirectory($sourceDir, $outputFileName, $compressionLevel, $false)
}
function pack_nuget()
{
Write-Host "Preparing NuGet package..."
$nuget = "$sourceFolder\.nuget\nuget.exe"
&$nuget pack FasterTests.nuspec -BasePath $buildFolder -OutputDirectory $packageFolder -Verbosity quiet
}
function pack_zip()
{
Write-Host "Preparing zip package..."
zip $buildFolder "$packageFolder\latest.zip"
}
cleanup_package_folder
pack_nuget
pack_zip
Write-Host "Package succeeded" | $ErrorActionPreference = "Stop"
$rootFolder = ".."
$sourceFolder = "$rootFolder\Source"
$buildFolder = "$rootFolder\build"
$packageFolder = "$rootFolder\package"
$nuspecFile = "FasterTests.nuspec"
function cleanup_package_folder()
{
if (Test-Path $packageFolder)
{
Remove-Item $packageFolder -Recurse -Force
}
New-Item $packageFolder -Type Directory | Out-Null
}
function zip($sourceDir, $outputFileName)
{
Add-Type -Assembly System.IO.Compression.FileSystem | Out-Null
$compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
[System.IO.Compression.ZipFile]::CreateFromDirectory($sourceDir, $outputFileName, $compressionLevel, $false)
}
function pack_nuget()
{
Write-Host "Preparing NuGet package..."
$nuget = "$sourceFolder\.nuget\nuget.exe"
&$nuget pack $nuspecFile -BasePath $buildFolder -OutputDirectory $packageFolder -Verbosity quiet
}
function pack_zip()
{
Write-Host "Preparing zip package..."
[xml]$nuspec = Get-Content $nuspecFile
$version = $nuspec.package.metadata.version
zip $buildFolder "$packageFolder\FasterTests-$version.zip"
}
cleanup_package_folder
pack_nuget
pack_zip
Write-Host "Package succeeded" | Package script modified to generate proper ZIP file name | Package script modified to generate proper ZIP file name
| PowerShell | bsd-3-clause | devoyster/FasterTests | powershell | ## Code Before:
$ErrorActionPreference = "Stop"
$rootFolder = ".."
$sourceFolder = "$rootFolder\Source"
$buildFolder = "$rootFolder\build"
$packageFolder = "$rootFolder\package"
function cleanup_package_folder()
{
if (Test-Path $packageFolder)
{
Remove-Item $packageFolder -Recurse -Force
}
New-Item $packageFolder -Type Directory | Out-Null
}
function zip($sourceDir, $outputFileName)
{
Add-Type -Assembly System.IO.Compression.FileSystem | Out-Null
$compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
[System.IO.Compression.ZipFile]::CreateFromDirectory($sourceDir, $outputFileName, $compressionLevel, $false)
}
function pack_nuget()
{
Write-Host "Preparing NuGet package..."
$nuget = "$sourceFolder\.nuget\nuget.exe"
&$nuget pack FasterTests.nuspec -BasePath $buildFolder -OutputDirectory $packageFolder -Verbosity quiet
}
function pack_zip()
{
Write-Host "Preparing zip package..."
zip $buildFolder "$packageFolder\latest.zip"
}
cleanup_package_folder
pack_nuget
pack_zip
Write-Host "Package succeeded"
## Instruction:
Package script modified to generate proper ZIP file name
## Code After:
$ErrorActionPreference = "Stop"
$rootFolder = ".."
$sourceFolder = "$rootFolder\Source"
$buildFolder = "$rootFolder\build"
$packageFolder = "$rootFolder\package"
$nuspecFile = "FasterTests.nuspec"
function cleanup_package_folder()
{
if (Test-Path $packageFolder)
{
Remove-Item $packageFolder -Recurse -Force
}
New-Item $packageFolder -Type Directory | Out-Null
}
function zip($sourceDir, $outputFileName)
{
Add-Type -Assembly System.IO.Compression.FileSystem | Out-Null
$compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
[System.IO.Compression.ZipFile]::CreateFromDirectory($sourceDir, $outputFileName, $compressionLevel, $false)
}
function pack_nuget()
{
Write-Host "Preparing NuGet package..."
$nuget = "$sourceFolder\.nuget\nuget.exe"
&$nuget pack $nuspecFile -BasePath $buildFolder -OutputDirectory $packageFolder -Verbosity quiet
}
function pack_zip()
{
Write-Host "Preparing zip package..."
[xml]$nuspec = Get-Content $nuspecFile
$version = $nuspec.package.metadata.version
zip $buildFolder "$packageFolder\FasterTests-$version.zip"
}
cleanup_package_folder
pack_nuget
pack_zip
Write-Host "Package succeeded" | $ErrorActionPreference = "Stop"
$rootFolder = ".."
$sourceFolder = "$rootFolder\Source"
$buildFolder = "$rootFolder\build"
$packageFolder = "$rootFolder\package"
+ $nuspecFile = "FasterTests.nuspec"
function cleanup_package_folder()
{
if (Test-Path $packageFolder)
{
Remove-Item $packageFolder -Recurse -Force
}
New-Item $packageFolder -Type Directory | Out-Null
}
function zip($sourceDir, $outputFileName)
{
Add-Type -Assembly System.IO.Compression.FileSystem | Out-Null
$compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
[System.IO.Compression.ZipFile]::CreateFromDirectory($sourceDir, $outputFileName, $compressionLevel, $false)
}
function pack_nuget()
{
Write-Host "Preparing NuGet package..."
$nuget = "$sourceFolder\.nuget\nuget.exe"
- &$nuget pack FasterTests.nuspec -BasePath $buildFolder -OutputDirectory $packageFolder -Verbosity quiet
? ^^^^^^^^^^^^
+ &$nuget pack $nuspecFile -BasePath $buildFolder -OutputDirectory $packageFolder -Verbosity quiet
? ^ ++++
}
function pack_zip()
{
Write-Host "Preparing zip package..."
+ [xml]$nuspec = Get-Content $nuspecFile
+ $version = $nuspec.package.metadata.version
+
- zip $buildFolder "$packageFolder\latest.zip"
? ^
+ zip $buildFolder "$packageFolder\FasterTests-$version.zip"
? ^ + +++ ++++++++++
}
cleanup_package_folder
pack_nuget
pack_zip
Write-Host "Package succeeded" | 8 | 0.181818 | 6 | 2 |
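The new `pack_zip` reads the package version out of the nuspec XML (`$nuspec.package.metadata.version`) and bakes it into the archive name. The same extraction in Python, under the simplifying assumption that the XML namespace is omitted (real `.nuspec` files carry one, which PowerShell's XML DOM handles transparently) and with a made-up version value:

```python
import xml.etree.ElementTree as ET

NUSPEC = """<package>
  <metadata>
    <id>FasterTests</id>
    <version>1.2.3</version>
  </metadata>
</package>"""  # version value is invented for the sketch

def zip_name(nuspec_xml):
    """Read metadata/version, as `$nuspec.package.metadata.version` does,
    and build the archive name from it."""
    version = ET.fromstring(nuspec_xml).find("./metadata/version").text
    return "FasterTests-{}.zip".format(version)
```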
5fffb9f49cd7b1237a0bfed0faebf16ef5cdeec1 | lib/sanitizer_common/sanitizer_platform_interceptors.h | lib/sanitizer_common/sanitizer_platform_interceptors.h | //===-- sanitizer_platform_interceptors.h -----------------------*- C++ -*-===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
//
// This file defines macro telling whether sanitizer tools can/should intercept
// given library functions on a given platform.
//
//===----------------------------------------------------------------------===//
#include "sanitizer_internal_defs.h"
#if !defined(_WIN32)
# define SI_NOT_WINDOWS 1
#else
# define SI_NOT_WINDOWS 0
#endif
#if defined(__linux__) && !defined(ANDROID)
# define SI_LINUX_NOT_ANDROID 1
#else
# define SI_LINUX_NOT_ANDROID 0
#endif
# define SANITIZER_INTERCEPT_READ SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_WRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PWRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PWRITE64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PRCTL SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_SCANF 1
| //===-- sanitizer_platform_interceptors.h -----------------------*- C++ -*-===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
//
// This file defines macro telling whether sanitizer tools can/should intercept
// given library functions on a given platform.
//
//===----------------------------------------------------------------------===//
#include "sanitizer_internal_defs.h"
#if !defined(_WIN32)
# define SI_NOT_WINDOWS 1
#else
# define SI_NOT_WINDOWS 0
#endif
#if defined(__linux__) && !defined(ANDROID)
# define SI_LINUX_NOT_ANDROID 1
#else
# define SI_LINUX_NOT_ANDROID 0
#endif
# define SANITIZER_INTERCEPT_READ SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_WRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PWRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PWRITE64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PRCTL SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_SCANF SI_NOT_WINDOWS
| Disable scanf interceptor on windows. | [sanitizer] Disable scanf interceptor on windows.
git-svn-id: c199f293c43da69278bea8e88f92242bf3aa95f7@173037 91177308-0d34-0410-b5e6-96231b3b80d8
| C | apache-2.0 | llvm-mirror/compiler-rt,llvm-mirror/compiler-rt,llvm-mirror/compiler-rt,llvm-mirror/compiler-rt,llvm-mirror/compiler-rt | c | ## Code Before:
//===-- sanitizer_platform_interceptors.h -----------------------*- C++ -*-===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
//
// This file defines macro telling whether sanitizer tools can/should intercept
// given library functions on a given platform.
//
//===----------------------------------------------------------------------===//
#include "sanitizer_internal_defs.h"
#if !defined(_WIN32)
# define SI_NOT_WINDOWS 1
#else
# define SI_NOT_WINDOWS 0
#endif
#if defined(__linux__) && !defined(ANDROID)
# define SI_LINUX_NOT_ANDROID 1
#else
# define SI_LINUX_NOT_ANDROID 0
#endif
# define SANITIZER_INTERCEPT_READ SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_WRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PWRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PWRITE64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PRCTL SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_SCANF 1
## Instruction:
[sanitizer] Disable scanf interceptor on windows.
git-svn-id: c199f293c43da69278bea8e88f92242bf3aa95f7@173037 91177308-0d34-0410-b5e6-96231b3b80d8
## Code After:
//===-- sanitizer_platform_interceptors.h -----------------------*- C++ -*-===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
//
// This file defines macro telling whether sanitizer tools can/should intercept
// given library functions on a given platform.
//
//===----------------------------------------------------------------------===//
#include "sanitizer_internal_defs.h"
#if !defined(_WIN32)
# define SI_NOT_WINDOWS 1
#else
# define SI_NOT_WINDOWS 0
#endif
#if defined(__linux__) && !defined(ANDROID)
# define SI_LINUX_NOT_ANDROID 1
#else
# define SI_LINUX_NOT_ANDROID 0
#endif
# define SANITIZER_INTERCEPT_READ SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_WRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PWRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PWRITE64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PRCTL SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_SCANF SI_NOT_WINDOWS
| //===-- sanitizer_platform_interceptors.h -----------------------*- C++ -*-===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
//
// This file defines macro telling whether sanitizer tools can/should intercept
// given library functions on a given platform.
//
//===----------------------------------------------------------------------===//
#include "sanitizer_internal_defs.h"
#if !defined(_WIN32)
# define SI_NOT_WINDOWS 1
#else
# define SI_NOT_WINDOWS 0
#endif
#if defined(__linux__) && !defined(ANDROID)
# define SI_LINUX_NOT_ANDROID 1
#else
# define SI_LINUX_NOT_ANDROID 0
#endif
# define SANITIZER_INTERCEPT_READ SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_WRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PWRITE SI_NOT_WINDOWS
# define SANITIZER_INTERCEPT_PREAD64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PWRITE64 SI_LINUX_NOT_ANDROID
# define SANITIZER_INTERCEPT_PRCTL SI_LINUX_NOT_ANDROID
- # define SANITIZER_INTERCEPT_SCANF 1
? ^
+ # define SANITIZER_INTERCEPT_SCANF SI_NOT_WINDOWS
? ^^^^^^^^^^^^^^
| 2 | 0.052632 | 1 | 1 |
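The fix replaces the hard-coded `1` with `SI_NOT_WINDOWS`, so the scanf interceptor now follows the same platform gating as the other entries. The header's preprocessor logic can be sketched as an ordinary function:

```python
def interceptor_flags(is_windows, is_linux, is_android):
    """Evaluate the header's preprocessor conditions for one platform."""
    si_not_windows = 0 if is_windows else 1
    si_linux_not_android = 1 if (is_linux and not is_android) else 0
    return {
        "SANITIZER_INTERCEPT_READ": si_not_windows,
        "SANITIZER_INTERCEPT_PRCTL": si_linux_not_android,
        "SANITIZER_INTERCEPT_SCANF": si_not_windows,  # was a hard-coded 1
    }
```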
3a67d50d5858806f090833cf81da46c1780b0814 | lib/numbering.rb | lib/numbering.rb | module Numbering
extend ActiveSupport::Concern
included do
before_create :generate_number
end
module ClassMethods
def numbering_parent_column(column_name)
@numbering_parent_column = column_name
end
end
def generate_number
column = self.class.instance_variable_get(:'@numbering_parent_column')
sql = <<-SQL
SELECT MAX(number)
FROM #{self.class.table_name}
WHERE #{column}=#{send(column).to_i}
SQL
self.number = connection.select_value(sql).to_i + 1
end
end
| module Numbering
extend ActiveSupport::Concern
included do
before_create :generate_number
end
module ClassMethods
def numbering_parent_column(column_name)
@numbering_parent_column = column_name
end
end
def generate_number
column = self.class.instance_variable_get(:'@numbering_parent_column')
max = self.class.where(column => send(column).to_i).maximum('number').to_i
self.number = max + 1
end
end
| Use Rails methods for max number retrieval. | Use Rails methods for max number retrieval.
| Ruby | agpl-3.0 | ontohub/ontohub,ontohub/ontohub,ontohub/ontohub,ontohub/ontohub,ontohub/ontohub,ontohub/ontohub | ruby | ## Code Before:
module Numbering
extend ActiveSupport::Concern
included do
before_create :generate_number
end
module ClassMethods
def numbering_parent_column(column_name)
@numbering_parent_column = column_name
end
end
def generate_number
column = self.class.instance_variable_get(:'@numbering_parent_column')
sql = <<-SQL
SELECT MAX(number)
FROM #{self.class.table_name}
WHERE #{column}=#{send(column).to_i}
SQL
self.number = connection.select_value(sql).to_i + 1
end
end
## Instruction:
Use Rails methods for max number retrieval.
## Code After:
module Numbering
extend ActiveSupport::Concern
included do
before_create :generate_number
end
module ClassMethods
def numbering_parent_column(column_name)
@numbering_parent_column = column_name
end
end
def generate_number
column = self.class.instance_variable_get(:'@numbering_parent_column')
max = self.class.where(column => send(column).to_i).maximum('number').to_i
self.number = max + 1
end
end
| module Numbering
extend ActiveSupport::Concern
included do
before_create :generate_number
end
module ClassMethods
def numbering_parent_column(column_name)
@numbering_parent_column = column_name
end
end
def generate_number
column = self.class.instance_variable_get(:'@numbering_parent_column')
+ max = self.class.where(column => send(column).to_i).maximum('number').to_i
+ self.number = max + 1
- sql = <<-SQL
- SELECT MAX(number)
- FROM #{self.class.table_name}
- WHERE #{column}=#{send(column).to_i}
- SQL
- self.number = connection.select_value(sql).to_i + 1
end
end | 8 | 0.347826 | 2 | 6 |
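The rewrite replaces hand-built SQL with `where(column => ...).maximum('number')`, where the trailing `.to_i` turns a `nil` maximum (no sibling rows yet) into `0`. An in-memory Python sketch of that next-number rule — the row and column names are hypothetical:

```python
def next_number(rows, parent_column, parent_id):
    """In-memory version of
    `self.class.where(column => parent_id).maximum('number').to_i + 1`:
    scope to the parent, take the max, default to 0 when empty."""
    numbers = [row["number"] for row in rows if row[parent_column] == parent_id]
    return (max(numbers) if numbers else 0) + 1

rows = [
    {"parent_id": 1, "number": 1},
    {"parent_id": 1, "number": 2},
    {"parent_id": 2, "number": 1},
]
```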
8a70626a042d220047fdc6d5c911c495b24a151e | _posts/2000-01-03-details.md | _posts/2000-01-03-details.md | ---
title: "setup details"
bg: '#27ae60'
color: black
fa-icon: cog
style: center
---
# Add the Docker Alias
You can run coala inside docker on your source code with just one command. We recommend setting an alias in your `.bashrc` or `.zshrc`:
### `alias coala="docker run -ti -v $(pwd):/app --workdir=/app coala/base coala"`
On windows - or if you prefer to run docker directly, skip the "try it" section and just run `docker run -ti -v $(pwd):/app --workdir=/app coala/base coala --files="**/*.py" --bears=PEP8Bear --save`.
-------------------------
# Try It!
### `cd project && coala --files="**/*.py" --bears=PEP8Bear --save`
### Your configuration will be automatically saved to the `.coafile` in the current directory. Go to our [bear documentation](https://coala.io/languages) to see what bears are available. Bears can be installed as pip packages.
-------------------------
# Commit It!
### `git add .coafile && git commit -m "Add coala configuration"`
-------------------------
# Tell Us What You Think!
### Simply join our channel at [Gitter](https://coala.io/chat)! We'd love to speak to you!
### Look at [our tutorial](https://coala.io/tutorial) for a deeper introduction.
| ---
title: "setup details"
bg: '#27ae60'
color: black
fa-icon: cog
style: center
---
# Add the Docker Alias
You can run coala inside docker on your source code with just one command. We recommend setting an alias in your `.bashrc` or `.zshrc`:
### `alias coala="docker run -ti -v $(pwd):/app --workdir=/app coala/base coala"`
On windows - or if you prefer to run docker directly, skip the "try it" section and just run `docker run -ti -v $(pwd):/app --workdir=/app coala/base coala --files="**/*.py" --bears=PEP8Bear --save`.
If you do not want or cannot use docker, consult [our installation documentation](http://docs.coala.io/en/latest/Users/Install.html).
-------------------------
# Try It!
### `cd project && coala --files="**/*.py" --bears=PEP8Bear --save`
### Your configuration will be automatically saved to the `.coafile` in the current directory. Go to our [bear documentation](https://coala.io/languages) to see what bears are available. Bears can be installed as pip packages.
-------------------------
# Commit It!
### `git add .coafile && git commit -m "Add coala configuration"`
-------------------------
# Tell Us What You Think!
### Simply join our channel at [Gitter](https://coala.io/chat)! We'd love to speak to you!
### Look at [our tutorial](https://coala.io/tutorial) for a deeper introduction.
| Add link to install docs | Add link to install docs
| Markdown | mit | coala-analyzer/website,coala-analyzer/website | markdown | ## Code Before:
---
title: "setup details"
bg: '#27ae60'
color: black
fa-icon: cog
style: center
---
# Add the Docker Alias
You can run coala inside docker on your source code with just one command. We recommend setting an alias in your `.bashrc` or `.zshrc`:
### `alias coala="docker run -ti -v $(pwd):/app --workdir=/app coala/base coala"`
On windows - or if you prefer to run docker directly, skip the "try it" section and just run `docker run -ti -v $(pwd):/app --workdir=/app coala/base coala --files="**/*.py" --bears=PEP8Bear --save`.
-------------------------
# Try It!
### `cd project && coala --files="**/*.py" --bears=PEP8Bear --save`
### Your configuration will be automatically saved to the `.coafile` in the current directory. Go to our [bear documentation](https://coala.io/languages) to see what bears are available. Bears can be installed as pip packages.
-------------------------
# Commit It!
### `git add .coafile && git commit -m "Add coala configuration"`
-------------------------
# Tell Us What You Think!
### Simply join our channel at [Gitter](https://coala.io/chat)! We'd love to speak to you!
### Look at [our tutorial](https://coala.io/tutorial) for a deeper introduction.
## Instruction:
Add link to install docs
## Code After:
---
title: "setup details"
bg: '#27ae60'
color: black
fa-icon: cog
style: center
---
# Add the Docker Alias
You can run coala inside docker on your source code with just one command. We recommend setting an alias in your `.bashrc` or `.zshrc`:
### `alias coala="docker run -ti -v $(pwd):/app --workdir=/app coala/base coala"`
On windows - or if you prefer to run docker directly, skip the "try it" section and just run `docker run -ti -v $(pwd):/app --workdir=/app coala/base coala --files="**/*.py" --bears=PEP8Bear --save`.
If you do not want or cannot use docker, consult [our installation documentation](http://docs.coala.io/en/latest/Users/Install.html).
-------------------------
# Try It!
### `cd project && coala --files="**/*.py" --bears=PEP8Bear --save`
### Your configuration will be automatically saved to the `.coafile` in the current directory. Go to our [bear documentation](https://coala.io/languages) to see what bears are available. Bears can be installed as pip packages.
-------------------------
# Commit It!
### `git add .coafile && git commit -m "Add coala configuration"`
-------------------------
# Tell Us What You Think!
### Simply join our channel at [Gitter](https://coala.io/chat)! We'd love to speak to you!
### Look at [our tutorial](https://coala.io/tutorial) for a deeper introduction.
| ---
title: "setup details"
bg: '#27ae60'
color: black
fa-icon: cog
style: center
---
# Add the Docker Alias
You can run coala inside docker on your source code with just one command. We recommend setting an alias in your `.bashrc` or `.zshrc`:
### `alias coala="docker run -ti -v $(pwd):/app --workdir=/app coala/base coala"`
On windows - or if you prefer to run docker directly, skip the "try it" section and just run `docker run -ti -v $(pwd):/app --workdir=/app coala/base coala --files="**/*.py" --bears=PEP8Bear --save`.
+
+ If you do not want or cannot use docker, consult [our installation documentation](http://docs.coala.io/en/latest/Users/Install.html).
-------------------------
# Try It!
### `cd project && coala --files="**/*.py" --bears=PEP8Bear --save`
### Your configuration will be automatically saved to the `.coafile` in the current directory. Go to our [bear documentation](https://coala.io/languages) to see what bears are available. Bears can be installed as pip packages.
-------------------------
# Commit It!
### `git add .coafile && git commit -m "Add coala configuration"`
-------------------------
# Tell Us What You Think!
### Simply join our channel at [Gitter](https://coala.io/chat)! We'd love to speak to you!
### Look at [our tutorial](https://coala.io/tutorial) for a deeper introduction. | 2 | 0.054054 | 2 | 0 |
d698c3d88f9a93f72254a1b2a63c0c8e1258b286 | lib/delayed/plugins/honeybadger.rb | lib/delayed/plugins/honeybadger.rb | module Delayed
module Plugins
class Honeybadger < Plugin
module Notify
def error(job, error)
::Honeybadger.notify_or_ignore(
:error_class => error.class.name,
:error_message => "#{ error.class.name }: #{ error.message }",
:backtrace => error.backtrace,
:context => {
:job_id => job.id,
:handler => job.handler,
:last_error => job.last_error,
:attempts => job.attempts,
:queue => job.queue
}
)
super if defined?(super)
end
end
callbacks do |lifecycle|
lifecycle.before(:invoke_job) do |job|
payload = job.payload_object
payload = payload.object if payload.is_a? Delayed::PerformableMethod
payload.extend Notify
end
lifecycle.around(:perform) do |worker, &block|
begin
block.call(worker)
ensure
::Honeybadger.context.clear!
end
end
end
end
end
end
| module Delayed
module Plugins
class Honeybadger < Plugin
callbacks do |lifecycle|
lifecycle.around(:invoke_job) do |job, *args, &block|
begin
block.call(job, *args)
rescue Exception => error
::Honeybadger.notify_or_ignore(
:error_class => error.class.name,
:error_message => "#{ error.class.name }: #{ error.message }",
:backtrace => error.backtrace,
:context => {
:job_id => job.id,
:handler => job.handler,
:last_error => job.last_error,
:attempts => job.attempts,
:queue => job.queue
}
)
raise error
ensure
::Honeybadger.context.clear!
end
end
end
end
end
end
| Fix segfault running in Ruby 2.1.0 | Fix segfault running in Ruby 2.1.0
| Ruby | mit | honeybadger-io/delayed_job_honeybadger | ruby | ## Code Before:
module Delayed
module Plugins
class Honeybadger < Plugin
module Notify
def error(job, error)
::Honeybadger.notify_or_ignore(
:error_class => error.class.name,
:error_message => "#{ error.class.name }: #{ error.message }",
:backtrace => error.backtrace,
:context => {
:job_id => job.id,
:handler => job.handler,
:last_error => job.last_error,
:attempts => job.attempts,
:queue => job.queue
}
)
super if defined?(super)
end
end
callbacks do |lifecycle|
lifecycle.before(:invoke_job) do |job|
payload = job.payload_object
payload = payload.object if payload.is_a? Delayed::PerformableMethod
payload.extend Notify
end
lifecycle.around(:perform) do |worker, &block|
begin
block.call(worker)
ensure
::Honeybadger.context.clear!
end
end
end
end
end
end
## Instruction:
Fix segfault running in Ruby 2.1.0
## Code After:
module Delayed
module Plugins
class Honeybadger < Plugin
callbacks do |lifecycle|
lifecycle.around(:invoke_job) do |job, *args, &block|
begin
block.call(job, *args)
rescue Exception => error
::Honeybadger.notify_or_ignore(
:error_class => error.class.name,
:error_message => "#{ error.class.name }: #{ error.message }",
:backtrace => error.backtrace,
:context => {
:job_id => job.id,
:handler => job.handler,
:last_error => job.last_error,
:attempts => job.attempts,
:queue => job.queue
}
)
raise error
ensure
::Honeybadger.context.clear!
end
end
end
end
end
end
| module Delayed
module Plugins
class Honeybadger < Plugin
- module Notify
- def error(job, error)
- ::Honeybadger.notify_or_ignore(
- :error_class => error.class.name,
- :error_message => "#{ error.class.name }: #{ error.message }",
- :backtrace => error.backtrace,
- :context => {
- :job_id => job.id,
- :handler => job.handler,
- :last_error => job.last_error,
- :attempts => job.attempts,
- :queue => job.queue
- }
- )
- super if defined?(super)
- end
- end
-
callbacks do |lifecycle|
- lifecycle.before(:invoke_job) do |job|
- payload = job.payload_object
- payload = payload.object if payload.is_a? Delayed::PerformableMethod
- payload.extend Notify
- end
-
- lifecycle.around(:perform) do |worker, &block|
? ^ ^^ ^^ ^ ^^^
+ lifecycle.around(:invoke_job) do |job, *args, &block|
? ^^^^^ ^^ ^ ^ +++++ ^^
begin
- block.call(worker)
? ^ ^^^
+ block.call(job, *args)
? ^ +++++ ^^
+ rescue Exception => error
+ ::Honeybadger.notify_or_ignore(
+ :error_class => error.class.name,
+ :error_message => "#{ error.class.name }: #{ error.message }",
+ :backtrace => error.backtrace,
+ :context => {
+ :job_id => job.id,
+ :handler => job.handler,
+ :last_error => job.last_error,
+ :attempts => job.attempts,
+ :queue => job.queue
+ }
+ )
+ raise error
ensure
::Honeybadger.context.clear!
end
end
end
end
end
end | 42 | 1.076923 | 16 | 26 |
62aa1dd73a41392218713bb9c7353ea72d14e8ba | spec/providers/service_runit_spec.rb | spec/providers/service_runit_spec.rb | require 'spec_helper'
describe 'service_runit provider' do
let(:runner) do
ChefSpec::Runner.new(step_into: ['graphite_service']) do |node|
node.automatic["platform_family"] = "debian"
end
end
let(:node) { runner.node }
let(:chef_run) { runner.converge("graphite_fixtures::graphite_service_runit_enable") }
describe 'with the create action' do
it "writes the runit service file" do
expect(chef_run).to enable_runit_service("carbon-cache-a").with(
cookbook: "graphite",
run_template_name: "carbon",
default_logger: true,
finish_script_template_name: "carbon",
finish: true,
options: { type: "cache", instance: "a" }
)
end
end
describe 'with the delete action' do
it "removes the runit service file" do
end
end
end
| require 'spec_helper'
describe 'service_runit provider' do
let(:runner) do
ChefSpec::Runner.new(
platform: "ubuntu",
version: "12.04",
step_into: ['graphite_service']
)
end
let(:node) { runner.node }
let(:chef_run) { runner.converge("graphite_fixtures::graphite_service_runit_enable") }
describe 'with the create action' do
it "writes the runit service file" do
expect(chef_run).to enable_runit_service("carbon-cache-a").with(
cookbook: "graphite",
run_template_name: "carbon",
default_logger: true,
finish_script_template_name: "carbon",
finish: true,
options: { type: "cache", instance: "a" }
)
end
end
describe 'with the delete action' do
it "removes the runit service file" do
end
end
end
| Refactor graphite_service_runit provider spec to use fauxhai data. | Refactor graphite_service_runit provider spec to use fauxhai data.
| Ruby | apache-2.0 | mbabic/graphite,bitmonk/chef-graphite,bitmonk/chef-graphite,tas50/graphite,diegobill/graphite,mbabic/graphite,hw-cookbooks/graphite,tas50/graphite,hw-cookbooks/graphite,diegobill/graphite,hw-cookbooks/graphite,diegobill/graphite,tas50/graphite,bitmonk/chef-graphite,mbabic/graphite | ruby | ## Code Before:
require 'spec_helper'
describe 'service_runit provider' do
let(:runner) do
ChefSpec::Runner.new(step_into: ['graphite_service']) do |node|
node.automatic["platform_family"] = "debian"
end
end
let(:node) { runner.node }
let(:chef_run) { runner.converge("graphite_fixtures::graphite_service_runit_enable") }
describe 'with the create action' do
it "writes the runit service file" do
expect(chef_run).to enable_runit_service("carbon-cache-a").with(
cookbook: "graphite",
run_template_name: "carbon",
default_logger: true,
finish_script_template_name: "carbon",
finish: true,
options: { type: "cache", instance: "a" }
)
end
end
describe 'with the delete action' do
it "removes the runit service file" do
end
end
end
## Instruction:
Refactor graphite_service_runit provider spec to use fauxhai data.
## Code After:
require 'spec_helper'
describe 'service_runit provider' do
let(:runner) do
ChefSpec::Runner.new(
platform: "ubuntu",
version: "12.04",
step_into: ['graphite_service']
)
end
let(:node) { runner.node }
let(:chef_run) { runner.converge("graphite_fixtures::graphite_service_runit_enable") }
describe 'with the create action' do
it "writes the runit service file" do
expect(chef_run).to enable_runit_service("carbon-cache-a").with(
cookbook: "graphite",
run_template_name: "carbon",
default_logger: true,
finish_script_template_name: "carbon",
finish: true,
options: { type: "cache", instance: "a" }
)
end
end
describe 'with the delete action' do
it "removes the runit service file" do
end
end
end
| require 'spec_helper'
describe 'service_runit provider' do
let(:runner) do
- ChefSpec::Runner.new(step_into: ['graphite_service']) do |node|
- node.automatic["platform_family"] = "debian"
- end
+ ChefSpec::Runner.new(
+ platform: "ubuntu",
+ version: "12.04",
+ step_into: ['graphite_service']
+ )
end
let(:node) { runner.node }
let(:chef_run) { runner.converge("graphite_fixtures::graphite_service_runit_enable") }
describe 'with the create action' do
it "writes the runit service file" do
expect(chef_run).to enable_runit_service("carbon-cache-a").with(
cookbook: "graphite",
run_template_name: "carbon",
default_logger: true,
finish_script_template_name: "carbon",
finish: true,
options: { type: "cache", instance: "a" }
)
end
end
describe 'with the delete action' do
it "removes the runit service file" do
end
end
end | 8 | 0.266667 | 5 | 3 |
7e81919e27bf5a4e8c3a31cafbdbfadbe1923b93 | samples/build-scan/build.gradle.kts | samples/build-scan/build.gradle.kts | plugins {
id("com.gradle.build-scan") version "1.8"
}
buildScan {
setLicenseAgreementUrl("https://gradle.com/terms-of-service")
setLicenseAgree("yes")
}
| plugins {
`build-scan`
}
buildScan {
setLicenseAgreementUrl("https://gradle.com/terms-of-service")
setLicenseAgree("yes")
}
| Update the build-scan sample to use the `build-scan` plugin accessor | Update the build-scan sample to use the `build-scan` plugin accessor
See #404
| Kotlin | apache-2.0 | blindpirate/gradle,gradle/gradle,robinverduijn/gradle,blindpirate/gradle,robinverduijn/gradle,gradle/gradle,robinverduijn/gradle,gradle/gradle,blindpirate/gradle,gradle/gradle,blindpirate/gradle,robinverduijn/gradle,robinverduijn/gradle,gradle/gradle,blindpirate/gradle,gradle/gradle,robinverduijn/gradle,robinverduijn/gradle,robinverduijn/gradle,gradle/gradle,robinverduijn/gradle,gradle/gradle-script-kotlin,blindpirate/gradle,robinverduijn/gradle,blindpirate/gradle,gradle/gradle,gradle/gradle,robinverduijn/gradle,blindpirate/gradle,blindpirate/gradle,blindpirate/gradle,gradle/gradle,gradle/gradle-script-kotlin | kotlin | ## Code Before:
plugins {
id("com.gradle.build-scan") version "1.8"
}
buildScan {
setLicenseAgreementUrl("https://gradle.com/terms-of-service")
setLicenseAgree("yes")
}
## Instruction:
Update the build-scan sample to use the `build-scan` plugin accessor
See #404
## Code After:
plugins {
`build-scan`
}
buildScan {
setLicenseAgreementUrl("https://gradle.com/terms-of-service")
setLicenseAgree("yes")
}
| plugins {
- id("com.gradle.build-scan") version "1.8"
+ `build-scan`
}
buildScan {
setLicenseAgreementUrl("https://gradle.com/terms-of-service")
setLicenseAgree("yes")
} | 2 | 0.25 | 1 | 1 |
030c1d7457407c4f27e02d11f6e5aa8e9d6b161f | .travis.yml | .travis.yml | before_install:
- sudo chmod 755 ./vendor/bin/youtube-dl
- sudo apt-get update -qq
rvm:
- 1.9.3 # EOL, but still supporting
- 2.0.0 # Oldest stable
- 2.2.3 # Current stable
- jruby-head # JRuby
- rbx-2 # Rubinius
cache: bundler
matrix:
allow_failures:
- rvm: jruby-head
notifications:
slack: layer8x:cvixULdjCINfq7u9Zs1rS0VY
| before_install:
- sudo chmod 755 ./vendor/bin/youtube-dl
- sudo apt-get update -qq
rvm:
- 1.9.3 # EOL, but still supporting
- 2.0.0 # Oldest stable
- 2.2.3 # Current stable
- jruby-head # JRuby
- rbx-2 # Rubinius
cache: bundler
matrix:
allow_failures:
- rvm: jruby-head
notifications:
slack: layer8x:cvixULdjCINfq7u9Zs1rS0VY
email: false
| Stop Travis CI from emailing me all the fucking time | Stop Travis CI from emailing me all the fucking time
| YAML | mit | layer8x/youtube-dl.rb | yaml | ## Code Before:
before_install:
- sudo chmod 755 ./vendor/bin/youtube-dl
- sudo apt-get update -qq
rvm:
- 1.9.3 # EOL, but still supporting
- 2.0.0 # Oldest stable
- 2.2.3 # Current stable
- jruby-head # JRuby
- rbx-2 # Rubinius
cache: bundler
matrix:
allow_failures:
- rvm: jruby-head
notifications:
slack: layer8x:cvixULdjCINfq7u9Zs1rS0VY
## Instruction:
Stop Travis CI from emailing me all the fucking time
## Code After:
before_install:
- sudo chmod 755 ./vendor/bin/youtube-dl
- sudo apt-get update -qq
rvm:
- 1.9.3 # EOL, but still supporting
- 2.0.0 # Oldest stable
- 2.2.3 # Current stable
- jruby-head # JRuby
- rbx-2 # Rubinius
cache: bundler
matrix:
allow_failures:
- rvm: jruby-head
notifications:
slack: layer8x:cvixULdjCINfq7u9Zs1rS0VY
email: false
| before_install:
- sudo chmod 755 ./vendor/bin/youtube-dl
- sudo apt-get update -qq
rvm:
- 1.9.3 # EOL, but still supporting
- 2.0.0 # Oldest stable
- 2.2.3 # Current stable
- jruby-head # JRuby
- rbx-2 # Rubinius
cache: bundler
matrix:
allow_failures:
- rvm: jruby-head
notifications:
slack: layer8x:cvixULdjCINfq7u9Zs1rS0VY
+ email: false | 1 | 0.052632 | 1 | 0 |
f4230952eb59d1fe4065bbf51245737dbbf71701 | .travis.yml | .travis.yml | sudo: false
language: java
cache:
directories:
- $HOME/.m2
install: mvn clean install -q -Dmaven.test.failure.ignore=true
branches:
only:
- develop
#jdk:
# - openjdk8
| sudo: false
language: java
cache:
directories:
- $HOME/.m2
install: mvn clean install -Dorg.slf4j.simpleLogger.defaultLogLevel=warn -Dmaven.test.failure.ignore=true
branches:
only:
- develop
#jdk:
# - openjdk8
| Adjust log level to warning | Adjust log level to warning | YAML | apache-2.0 | orientechnologies/orientdb,orientechnologies/orientdb,orientechnologies/orientdb,orientechnologies/orientdb | yaml | ## Code Before:
sudo: false
language: java
cache:
directories:
- $HOME/.m2
install: mvn clean install -q -Dmaven.test.failure.ignore=true
branches:
only:
- develop
#jdk:
# - openjdk8
## Instruction:
Adjust log level to warning
## Code After:
sudo: false
language: java
cache:
directories:
- $HOME/.m2
install: mvn clean install -Dorg.slf4j.simpleLogger.defaultLogLevel=warn -Dmaven.test.failure.ignore=true
branches:
only:
- develop
#jdk:
# - openjdk8
| sudo: false
language: java
cache:
directories:
- $HOME/.m2
- install: mvn clean install -q -Dmaven.test.failure.ignore=true
+ install: mvn clean install -Dorg.slf4j.simpleLogger.defaultLogLevel=warn -Dmaven.test.failure.ignore=true
branches:
only:
- develop
#jdk:
# - openjdk8 | 2 | 0.181818 | 1 | 1 |
33309df85823bde19fcdd2b21b73db9f1da131ab | requests_oauthlib/compliance_fixes/facebook.py | requests_oauthlib/compliance_fixes/facebook.py | from json import dumps
from oauthlib.common import urldecode
from urlparse import parse_qsl
def facebook_compliance_fix(session):
def _compliance_fix(r):
# if Facebook claims to be sending us json, let's trust them.
if 'application/json' in r.headers['content-type']:
return r
# Facebook returns a content-type of text/plain when sending their
# x-www-form-urlencoded responses, along with a 200. If not, let's
# assume we're getting JSON and bail on the fix.
if 'text/plain' in r.headers['content-type'] and r.status_code == 200:
token = dict(parse_qsl(r.text, keep_blank_values=True))
else:
return r
expires = token.get('expires')
if expires is not None:
token['expires_in'] = expires
token['token_type'] = 'Bearer'
r._content = dumps(token)
return r
session.register_compliance_hook('access_token_response', _compliance_fix)
return session
| from json import dumps
try:
from urlparse import parse_qsl
except ImportError:
from urllib.parse import parse_qsl
def facebook_compliance_fix(session):
def _compliance_fix(r):
# if Facebook claims to be sending us json, let's trust them.
if 'application/json' in r.headers['content-type']:
return r
# Facebook returns a content-type of text/plain when sending their
# x-www-form-urlencoded responses, along with a 200. If not, let's
# assume we're getting JSON and bail on the fix.
if 'text/plain' in r.headers['content-type'] and r.status_code == 200:
token = dict(parse_qsl(r.text, keep_blank_values=True))
else:
return r
expires = token.get('expires')
if expires is not None:
token['expires_in'] = expires
token['token_type'] = 'Bearer'
r._content = dumps(token)
return r
session.register_compliance_hook('access_token_response', _compliance_fix)
return session
| Remove unused import. Facebook compliance support python3 | Remove unused import. Facebook compliance support python3
| Python | isc | abhi931375/requests-oauthlib,gras100/asks-oauthlib,requests/requests-oauthlib,singingwolfboy/requests-oauthlib,jayvdb/requests-oauthlib,lucidbard/requests-oauthlib,dongguangming/requests-oauthlib,jsfan/requests-oauthlib,jayvdb/requests-oauthlib,sigmavirus24/requests-oauthlib,elafarge/requests-oauthlib | python | ## Code Before:
from json import dumps
from oauthlib.common import urldecode
from urlparse import parse_qsl
def facebook_compliance_fix(session):
def _compliance_fix(r):
# if Facebook claims to be sending us json, let's trust them.
if 'application/json' in r.headers['content-type']:
return r
# Facebook returns a content-type of text/plain when sending their
# x-www-form-urlencoded responses, along with a 200. If not, let's
# assume we're getting JSON and bail on the fix.
if 'text/plain' in r.headers['content-type'] and r.status_code == 200:
token = dict(parse_qsl(r.text, keep_blank_values=True))
else:
return r
expires = token.get('expires')
if expires is not None:
token['expires_in'] = expires
token['token_type'] = 'Bearer'
r._content = dumps(token)
return r
session.register_compliance_hook('access_token_response', _compliance_fix)
return session
## Instruction:
Remove unused import. Facebook compliance support python3
## Code After:
from json import dumps
try:
from urlparse import parse_qsl
except ImportError:
from urllib.parse import parse_qsl
def facebook_compliance_fix(session):
def _compliance_fix(r):
# if Facebook claims to be sending us json, let's trust them.
if 'application/json' in r.headers['content-type']:
return r
# Facebook returns a content-type of text/plain when sending their
# x-www-form-urlencoded responses, along with a 200. If not, let's
# assume we're getting JSON and bail on the fix.
if 'text/plain' in r.headers['content-type'] and r.status_code == 200:
token = dict(parse_qsl(r.text, keep_blank_values=True))
else:
return r
expires = token.get('expires')
if expires is not None:
token['expires_in'] = expires
token['token_type'] = 'Bearer'
r._content = dumps(token)
return r
session.register_compliance_hook('access_token_response', _compliance_fix)
return session
| from json import dumps
- from oauthlib.common import urldecode
+ try:
- from urlparse import parse_qsl
+ from urlparse import parse_qsl
? ++++
+ except ImportError:
+ from urllib.parse import parse_qsl
def facebook_compliance_fix(session):
def _compliance_fix(r):
# if Facebook claims to be sending us json, let's trust them.
if 'application/json' in r.headers['content-type']:
return r
# Facebook returns a content-type of text/plain when sending their
# x-www-form-urlencoded responses, along with a 200. If not, let's
# assume we're getting JSON and bail on the fix.
if 'text/plain' in r.headers['content-type'] and r.status_code == 200:
token = dict(parse_qsl(r.text, keep_blank_values=True))
else:
return r
expires = token.get('expires')
if expires is not None:
token['expires_in'] = expires
token['token_type'] = 'Bearer'
r._content = dumps(token)
return r
session.register_compliance_hook('access_token_response', _compliance_fix)
return session | 6 | 0.206897 | 4 | 2 |
854d3e53635542ae62e9c1dccc13fcd322647b49 | docs/_config.yml | docs/_config.yml | theme: jekyll-theme-hacker | theme: jekyll-theme-hacker
title: OpenKh
description: This is a project centralizes all the technical knowledge of Kingdom Hearts series in one place, providing documentation, tools, code libraries and the foundation for modding the commercial games.
google_analytics: UA-56422469-2 | Update homepage with title, description and analytics | Update homepage with title, description and analytics
| YAML | mit | Xeeynamo/KingdomHearts | yaml | ## Code Before:
theme: jekyll-theme-hacker
## Instruction:
Update homepage with title, description and analytics
## Code After:
theme: jekyll-theme-hacker
title: OpenKh
description: This is a project centralizes all the technical knowledge of Kingdom Hearts series in one place, providing documentation, tools, code libraries and the foundation for modding the commercial games.
google_analytics: UA-56422469-2 | theme: jekyll-theme-hacker
+ title: OpenKh
+ description: This is a project centralizes all the technical knowledge of Kingdom Hearts series in one place, providing documentation, tools, code libraries and the foundation for modding the commercial games.
+ google_analytics: UA-56422469-2 | 3 | 3 | 3 | 0 |
28b29665a72930917eeb16554e001737d65def34 | requirements.txt | requirements.txt | --editable git+https://github.com/openfisca/openfisca-core.git#egg=OpenFisca-Core
--editable .
| --editable git+https://github.com/benjello/openfisca-core.git@baremes-ipp#egg=OpenFisca-Core
--editable .
| Modify requirement.txt to use openfisca-core baremes-ipp branch | Modify requirement.txt to use openfisca-core baremes-ipp branch
| Text | agpl-3.0 | antoinearnoud/openfisca-france,antoinearnoud/openfisca-france,sgmap/openfisca-france,sgmap/openfisca-france | text | ## Code Before:
--editable git+https://github.com/openfisca/openfisca-core.git#egg=OpenFisca-Core
--editable .
## Instruction:
Modify requirement.txt to use openfisca-core baremes-ipp branch
## Code After:
--editable git+https://github.com/benjello/openfisca-core.git@baremes-ipp#egg=OpenFisca-Core
--editable .
| - --editable git+https://github.com/openfisca/openfisca-core.git#egg=OpenFisca-Core
? ^^ ^^^^^
+ --editable git+https://github.com/benjello/openfisca-core.git@baremes-ipp#egg=OpenFisca-Core
? ^ ^^^^^ ++++++++++++
--editable . | 2 | 1 | 1 | 1 |
e6b7ddb16d6d3e55c6f096a0559ae8e89a1e90c0 | README.md | README.md | Useful functions gathered in one DLL
| Useful functions gathered in one DLL
# Build Status
TeamCity: master:
Travis: master:
AppVeyor:
| Update READ me file to add buil status | Update READ me file to add buil status | Markdown | mit | fredatgithub/UsefulFunctions | markdown | ## Code Before:
Useful functions gathered in one DLL
## Instruction:
Update READ me file to add buil status
## Code After:
Useful functions gathered in one DLL
# Build Status
TeamCity: master:
Travis: master:
AppVeyor:
| Useful functions gathered in one DLL
+
+ # Build Status
+ TeamCity: master:
+ Travis: master:
+ AppVeyor: | 5 | 5 | 5 | 0 |
be9c37b2e80c883e2381cc492be2b560240b6196 | README.md | README.md | vmu.js
===========
Parsing vmu files on the browser
Test it on http://reicast.github.io/vmujs
| vmu.js
===========
Parsing vmu files on the browser
Test it on [http://reicast.github.io/vmujs](http://reicast.github.io/vmujs/#?u=https%3A%2F%2Fdl.dropboxusercontent.com%2Fs%2Fzln799lp4hrygej%2Fvmu-test.bin)
| Test url w/ dropbox test file | Test url w/ dropbox test file | Markdown | mit | reicast/vmujs | markdown | ## Code Before:
vmu.js
===========
Parsing vmu files on the browser
Test it on http://reicast.github.io/vmujs
## Instruction:
Test url w/ dropbox test file
## Code After:
vmu.js
===========
Parsing vmu files on the browser
Test it on [http://reicast.github.io/vmujs](http://reicast.github.io/vmujs/#?u=https%3A%2F%2Fdl.dropboxusercontent.com%2Fs%2Fzln799lp4hrygej%2Fvmu-test.bin)
| vmu.js
===========
Parsing vmu files on the browser
- Test it on http://reicast.github.io/vmujs
+ Test it on [http://reicast.github.io/vmujs](http://reicast.github.io/vmujs/#?u=https%3A%2F%2Fdl.dropboxusercontent.com%2Fs%2Fzln799lp4hrygej%2Fvmu-test.bin) | 2 | 0.333333 | 1 | 1 |
87ad807e10e808ec706eca212932915c7619ddb7 | frontend/src/backendQuery.js | frontend/src/backendQuery.js | 'use strict'
var util = require('./util')
export function createSchedule (queryArray, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
callback(null, 'Chyba při tvorbě rozvrhu: ' + response['error'])
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('POST', 'solverquery/', JSON.stringify(queryArray), onResponse)
}
export function addCourse (courseCode, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
callback(null, 'Nepodařilo se najít předmět ' + courseCode + '; je kód zadán správně?')
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('GET', 'sisquery/' + encodeURIComponent(courseCode), null, onResponse)
}
| 'use strict'
var util = require('./util')
export function createSchedule (queryArray, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
callback(null, 'Chyba při tvorbě rozvrhu: ' + response['error'])
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('POST', 'solverquery/', JSON.stringify(queryArray), onResponse)
}
export function addCourse (courseCode, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
if (response['error'] === 'The course has no scheduled events') {
callback(null, 'Předmět ' + courseCode + ' není rozvržený.')
} else {
callback(null, 'Nepodařilo se najít předmět ' + courseCode + '; je kód zadán správně?')
}
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('GET', 'sisquery/' + encodeURIComponent(courseCode), null, onResponse)
}
| Improve error message when the subject's events are not scheduled | Improve error message when the subject's events are not scheduled
| JavaScript | mit | IAmWave/samorozvrh,IAmWave/samorozvrh,IAmWave/samorozvrh,IAmWave/samorozvrh | javascript | ## Code Before:
'use strict'
var util = require('./util')
export function createSchedule (queryArray, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
callback(null, 'Chyba při tvorbě rozvrhu: ' + response['error'])
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('POST', 'solverquery/', JSON.stringify(queryArray), onResponse)
}
export function addCourse (courseCode, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
callback(null, 'Nepodařilo se najít předmět ' + courseCode + '; je kód zadán správně?')
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('GET', 'sisquery/' + encodeURIComponent(courseCode), null, onResponse)
}
## Instruction:
Improve error message when the subject's events are not scheduled
## Code After:
'use strict'
var util = require('./util')
export function createSchedule (queryArray, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
callback(null, 'Chyba při tvorbě rozvrhu: ' + response['error'])
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('POST', 'solverquery/', JSON.stringify(queryArray), onResponse)
}
export function addCourse (courseCode, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
if (response['error'] === 'The course has no scheduled events') {
callback(null, 'Předmět ' + courseCode + ' není rozvržený.')
} else {
callback(null, 'Nepodařilo se najít předmět ' + courseCode + '; je kód zadán správně?')
}
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('GET', 'sisquery/' + encodeURIComponent(courseCode), null, onResponse)
}
| 'use strict'
var util = require('./util')
export function createSchedule (queryArray, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
callback(null, 'Chyba při tvorbě rozvrhu: ' + response['error'])
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('POST', 'solverquery/', JSON.stringify(queryArray), onResponse)
}
export function addCourse (courseCode, callback) {
var onResponse = function (responseString, error) {
if (error) {
callback(null, error)
} else if (!responseString) {
callback(null, 'Server neodpovídá.')
} else {
var response = JSON.parse(responseString)
if (response['error']) {
+ if (response['error'] === 'The course has no scheduled events') {
+ callback(null, 'Předmět ' + courseCode + ' není rozvržený.')
+ } else {
- callback(null, 'Nepodařilo se najít předmět ' + courseCode + '; je kód zadán správně?')
+ callback(null, 'Nepodařilo se najít předmět ' + courseCode + '; je kód zadán správně?')
? ++
+ }
} else {
callback(response['data'], null)
}
}
}
util.makeHttpRequest('GET', 'sisquery/' + encodeURIComponent(courseCode), null, onResponse)
} | 6 | 0.15 | 5 | 1 |
f2fc4c7e1c6a2ef3b15914a693df17681ef23460 | ansible.cfg | ansible.cfg | [ssh_connection]
ssh_args=-o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=60s
| [default]
remote_tmp = /tmp/.ansible-${USER}/tmp
[ssh_connection]
ssh_args=-o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=60s
| Change remote_tmp to avoid stuff in home | Change remote_tmp to avoid stuff in home
| INI | mit | henrik-farre/ansible,henrik-farre/ansible,henrik-farre/ansible | ini | ## Code Before:
[ssh_connection]
ssh_args=-o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=60s
## Instruction:
Change remote_tmp to avoid stuff in home
## Code After:
[default]
remote_tmp = /tmp/.ansible-${USER}/tmp
[ssh_connection]
ssh_args=-o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=60s
| + [default]
+ remote_tmp = /tmp/.ansible-${USER}/tmp
+
[ssh_connection]
ssh_args=-o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=60s | 3 | 1.5 | 3 | 0 |
1aa33733bc121308d15c2e8667f01156cd94ebfe | app/views/application/_content_header.html.haml | app/views/application/_content_header.html.haml | %header.content-header
.content-header__main
%a.content-header__button.content-header__button--menu.yyy(data-sidebar='left') menu
%p.content-header__title
= link_to current_site.home_url.presence || root_url, class: 'content-header__logo' do
= image_tag current_site.logo_image_url || current_site.logo_url, alt: current_site.name, class: 'content-header__logo-image'
- if current_site.content_header_url.present?
.external-component{data: {external_src: current_site.content_header_url}}
.content-header__sub.yyy
- if current_site.tagline.present?
%p.content-header__tagline= current_site.tagline
| %header.content-header
.content-header__main
%div.content-header__menu
%a.content-header__button.content-header__button--menu(data-sidebar='left')
%span.yyy menu
%p.content-header__title
= link_to current_site.home_url.presence || root_url, class: 'content-header__logo' do
= image_tag current_site.logo_image_url || current_site.logo_url, alt: current_site.name, class: 'content-header__logo-image'
- if current_site.content_header_url.present?
.external-component{data: {external_src: current_site.content_header_url}}
.content-header__sub.yyy
- if current_site.tagline.present?
%p.content-header__tagline= current_site.tagline
| Apply style for header menu button | Apply style for header menu button
| Haml | mit | bm-sms/daimon-news,bm-sms/daimon-news,bm-sms/daimon-news | haml | ## Code Before:
%header.content-header
.content-header__main
%a.content-header__button.content-header__button--menu.yyy(data-sidebar='left') menu
%p.content-header__title
= link_to current_site.home_url.presence || root_url, class: 'content-header__logo' do
= image_tag current_site.logo_image_url || current_site.logo_url, alt: current_site.name, class: 'content-header__logo-image'
- if current_site.content_header_url.present?
.external-component{data: {external_src: current_site.content_header_url}}
.content-header__sub.yyy
- if current_site.tagline.present?
%p.content-header__tagline= current_site.tagline
## Instruction:
Apply style for header menu button
## Code After:
%header.content-header
.content-header__main
%div.content-header__menu
%a.content-header__button.content-header__button--menu(data-sidebar='left')
%span.yyy menu
%p.content-header__title
= link_to current_site.home_url.presence || root_url, class: 'content-header__logo' do
= image_tag current_site.logo_image_url || current_site.logo_url, alt: current_site.name, class: 'content-header__logo-image'
- if current_site.content_header_url.present?
.external-component{data: {external_src: current_site.content_header_url}}
.content-header__sub.yyy
- if current_site.tagline.present?
%p.content-header__tagline= current_site.tagline
| %header.content-header
.content-header__main
+ %div.content-header__menu
- %a.content-header__button.content-header__button--menu.yyy(data-sidebar='left') menu
? ---- -----
+ %a.content-header__button.content-header__button--menu(data-sidebar='left')
? ++
+ %span.yyy menu
%p.content-header__title
= link_to current_site.home_url.presence || root_url, class: 'content-header__logo' do
= image_tag current_site.logo_image_url || current_site.logo_url, alt: current_site.name, class: 'content-header__logo-image'
- if current_site.content_header_url.present?
.external-component{data: {external_src: current_site.content_header_url}}
.content-header__sub.yyy
- if current_site.tagline.present?
%p.content-header__tagline= current_site.tagline | 4 | 0.285714 | 3 | 1 |
0cd4b05e45d0e3660c83960ca458471fbcf9c6e2 | bin/console.php | bin/console.php | <?php
/**
* @license http://opensource.org/licenses/BSD-3-Clause BSD-3-Clause
* @copyright Copyright (c) 2015 Matthew Weier O'Phinney (https://mwop.net)
*/
require __DIR__ . '/../vendor/autoload.php';
use Zend\Console\Console;
use ZF\Console\Application;
use ZF\Console\Dispatcher;
$version = '0.0.1';
$application = new Application(
'Component Installer',
$version,
include __DIR__ . '/../config/routes.php',
Console::getInstance(),
new Dispatcher()
);
$exit = $application->run();
exit($exit);
| <?php
/**
* @license http://opensource.org/licenses/BSD-3-Clause BSD-3-Clause
* @copyright Copyright (c) 2015 Matthew Weier O'Phinney (https://mwop.net)
*/
require __DIR__ . '/../vendor/autoload.php';
use Zend\Console\Console;
use ZF\Console\Application;
use ZF\Console\Dispatcher;
$version = '@package_version@';
$application = new Application(
'Component Installer',
$version,
include __DIR__ . '/../config/routes.php',
Console::getInstance(),
new Dispatcher()
);
$exit = $application->run();
exit($exit);
| Use git version for package version | Use git version for package version
| PHP | bsd-3-clause | zendframework/zend-component-installer,weierophinney/component-installer,weierophinney/component-installer | php | ## Code Before:
<?php
/**
* @license http://opensource.org/licenses/BSD-3-Clause BSD-3-Clause
* @copyright Copyright (c) 2015 Matthew Weier O'Phinney (https://mwop.net)
*/
require __DIR__ . '/../vendor/autoload.php';
use Zend\Console\Console;
use ZF\Console\Application;
use ZF\Console\Dispatcher;
$version = '0.0.1';
$application = new Application(
'Component Installer',
$version,
include __DIR__ . '/../config/routes.php',
Console::getInstance(),
new Dispatcher()
);
$exit = $application->run();
exit($exit);
## Instruction:
Use git version for package version
## Code After:
<?php
/**
* @license http://opensource.org/licenses/BSD-3-Clause BSD-3-Clause
* @copyright Copyright (c) 2015 Matthew Weier O'Phinney (https://mwop.net)
*/
require __DIR__ . '/../vendor/autoload.php';
use Zend\Console\Console;
use ZF\Console\Application;
use ZF\Console\Dispatcher;
$version = '@package_version@';
$application = new Application(
'Component Installer',
$version,
include __DIR__ . '/../config/routes.php',
Console::getInstance(),
new Dispatcher()
);
$exit = $application->run();
exit($exit);
| <?php
/**
* @license http://opensource.org/licenses/BSD-3-Clause BSD-3-Clause
* @copyright Copyright (c) 2015 Matthew Weier O'Phinney (https://mwop.net)
*/
require __DIR__ . '/../vendor/autoload.php';
use Zend\Console\Console;
use ZF\Console\Application;
use ZF\Console\Dispatcher;
- $version = '0.0.1';
+ $version = '@package_version@';
$application = new Application(
'Component Installer',
$version,
include __DIR__ . '/../config/routes.php',
Console::getInstance(),
new Dispatcher()
);
$exit = $application->run();
exit($exit); | 2 | 0.083333 | 1 | 1 |
721703801654af88e8b5064d1bc65569ce1555cf | thumbnails/engines/__init__.py | thumbnails/engines/__init__.py |
def get_current_engine():
return None
| from thumbnails.engines.pillow import PillowEngine
def get_current_engine():
return PillowEngine()
| Set pillow engine as default | Set pillow engine as default
| Python | mit | python-thumbnails/python-thumbnails,relekang/python-thumbnails | python | ## Code Before:
def get_current_engine():
return None
## Instruction:
Set pillow engine as default
## Code After:
from thumbnails.engines.pillow import PillowEngine
def get_current_engine():
return PillowEngine()
| + from thumbnails.engines.pillow import PillowEngine
def get_current_engine():
- return None
+ return PillowEngine() | 3 | 0.75 | 2 | 1 |
fd5d03cf3d96354b69adedb6cd9adaf5675adcc9 | src/index.ts | src/index.ts | import { NgModule, ModuleWithProviders } from '@angular/core';
import { CommonModule } from '@angular/common';
import { DeviceDetectorService } from './device-detector.service';
@NgModule({
imports: [
CommonModule
]
})
export class DeviceDetectorModule {
static forRoot(): ModuleWithProviders {
return {
ngModule: DeviceDetectorModule,
providers: [DeviceDetectorService]
};
}
}
export { DeviceDetectorService, DeviceInfo } from './device-detector.service';
export { ReTree } from './retree';
export * from './device-detector.constants';
| import { NgModule, ModuleWithProviders } from '@angular/core';
import { CommonModule } from '@angular/common';
import { DeviceDetectorService } from './device-detector.service';
@NgModule({
imports: [
CommonModule
]
})
export class DeviceDetectorModule {
static forRoot(): ModuleWithProviders<DeviceDetectorModule> {
return {
ngModule: DeviceDetectorModule,
providers: [DeviceDetectorService]
};
}
}
export { DeviceDetectorService, DeviceInfo } from './device-detector.service';
export { ReTree } from './retree';
export * from './device-detector.constants';
| Use generic version for ModuleWithProviders | Use generic version for ModuleWithProviders
This should add some level of Angular 9 compatibility and fix #144 | TypeScript | mit | KoderLabs/ng2-device-detector,KoderLabs/ng2-device-detector | typescript | ## Code Before:
import { NgModule, ModuleWithProviders } from '@angular/core';
import { CommonModule } from '@angular/common';
import { DeviceDetectorService } from './device-detector.service';
@NgModule({
imports: [
CommonModule
]
})
export class DeviceDetectorModule {
static forRoot(): ModuleWithProviders {
return {
ngModule: DeviceDetectorModule,
providers: [DeviceDetectorService]
};
}
}
export { DeviceDetectorService, DeviceInfo } from './device-detector.service';
export { ReTree } from './retree';
export * from './device-detector.constants';
## Instruction:
Use generic version for ModuleWithProviders
This should add some level of Angular 9 compatibility and fix #144
## Code After:
import { NgModule, ModuleWithProviders } from '@angular/core';
import { CommonModule } from '@angular/common';
import { DeviceDetectorService } from './device-detector.service';
@NgModule({
imports: [
CommonModule
]
})
export class DeviceDetectorModule {
static forRoot(): ModuleWithProviders<DeviceDetectorModule> {
return {
ngModule: DeviceDetectorModule,
providers: [DeviceDetectorService]
};
}
}
export { DeviceDetectorService, DeviceInfo } from './device-detector.service';
export { ReTree } from './retree';
export * from './device-detector.constants';
| import { NgModule, ModuleWithProviders } from '@angular/core';
import { CommonModule } from '@angular/common';
import { DeviceDetectorService } from './device-detector.service';
@NgModule({
imports: [
CommonModule
]
})
export class DeviceDetectorModule {
- static forRoot(): ModuleWithProviders {
+ static forRoot(): ModuleWithProviders<DeviceDetectorModule> {
? ++++++++++++++++++++++
return {
ngModule: DeviceDetectorModule,
providers: [DeviceDetectorService]
};
}
}
export { DeviceDetectorService, DeviceInfo } from './device-detector.service';
export { ReTree } from './retree';
export * from './device-detector.constants';
| 2 | 0.090909 | 1 | 1 |
6bec90c8b084308cc7d542349dcfa044b05d8295 | cmd/tchaik/ui/static/sass/components/retro.scss | cmd/tchaik/ui/static/sass/components/retro.scss | @import '../variables.scss';
@import '../mixins.scss';
div.retro {
position:relative;
width: 100%;
height: 100%;
div.blur {
position: fixed;
left: 0;
top: 0;
height: 100%;
width: 100%;
background-size: cover;
z-index: 0;
height: 20%; width: 20%; /* Using Glen Maddern's trick /via @mente */
transform: scale(5);
transform-origin: top left;
-webkit-filter: blur(10px);
@include fade-in(background-image);
}
div.content {
position: relative;
display: flex;
align-items: center;
justify-content: center;
z-index: 1;
.playlist {
max-width: 600px;
background-color: $transparent-bar-dark-background;
}
}
.current-artwork {
position: absolute;
top: 0px;
right: 100px;
img {
border: 10px solid rgba(17, 17, 17, 0.2);
max-height: 600px;
margin-right: 30px;
}
}
}
| @import '../variables.scss';
@import '../mixins.scss';
div.retro {
position:relative;
width: 100%;
height: 100%;
div.blur {
position: fixed;
left: 0;
top: 0;
height: 100%;
width: 100%;
background-size: cover;
z-index: 0;
height: 20%; width: 20%; /* Using Glen Maddern's trick /via @mente */
transform: scale(5);
transform-origin: top left;
-webkit-filter: blur(10px);
@include fade-in(background-image);
}
div.content {
position: relative;
display: flex;
align-items: center;
justify-content: center;
z-index: 1;
.playlist {
max-width: 600px;
background-color: $transparent-bar-dark-background;
}
}
.current-artwork {
position: fixed;
top: 100px;
right: 100px;
img {
border: 10px solid rgba(17, 17, 17, 0.2);
max-height: 600px;
margin-right: 30px;
}
}
}
| Fix the current playing artwork to the top right | [Retro] Fix the current playing artwork to the top right
| SCSS | bsd-2-clause | tchaik/tchaik,tchaik/tchaik,GrahamGoudeau21/tchaik,GrahamGoudeau21/tchaik,GrahamGoudeau21/tchaik,GrahamGoudeau21/tchaik,tchaik/tchaik,tchaik/tchaik | scss | ## Code Before:
@import '../variables.scss';
@import '../mixins.scss';
div.retro {
position:relative;
width: 100%;
height: 100%;
div.blur {
position: fixed;
left: 0;
top: 0;
height: 100%;
width: 100%;
background-size: cover;
z-index: 0;
height: 20%; width: 20%; /* Using Glen Maddern's trick /via @mente */
transform: scale(5);
transform-origin: top left;
-webkit-filter: blur(10px);
@include fade-in(background-image);
}
div.content {
position: relative;
display: flex;
align-items: center;
justify-content: center;
z-index: 1;
.playlist {
max-width: 600px;
background-color: $transparent-bar-dark-background;
}
}
.current-artwork {
position: absolute;
top: 0px;
right: 100px;
img {
border: 10px solid rgba(17, 17, 17, 0.2);
max-height: 600px;
margin-right: 30px;
}
}
}
## Instruction:
[Retro] Fix the current playing artwork to the top right
## Code After:
@import '../variables.scss';
@import '../mixins.scss';
div.retro {
position:relative;
width: 100%;
height: 100%;
div.blur {
position: fixed;
left: 0;
top: 0;
height: 100%;
width: 100%;
background-size: cover;
z-index: 0;
height: 20%; width: 20%; /* Using Glen Maddern's trick /via @mente */
transform: scale(5);
transform-origin: top left;
-webkit-filter: blur(10px);
@include fade-in(background-image);
}
div.content {
position: relative;
display: flex;
align-items: center;
justify-content: center;
z-index: 1;
.playlist {
max-width: 600px;
background-color: $transparent-bar-dark-background;
}
}
.current-artwork {
position: fixed;
top: 100px;
right: 100px;
img {
border: 10px solid rgba(17, 17, 17, 0.2);
max-height: 600px;
margin-right: 30px;
}
}
}
| @import '../variables.scss';
@import '../mixins.scss';
div.retro {
position:relative;
width: 100%;
height: 100%;
div.blur {
position: fixed;
left: 0;
top: 0;
height: 100%;
width: 100%;
background-size: cover;
z-index: 0;
height: 20%; width: 20%; /* Using Glen Maddern's trick /via @mente */
transform: scale(5);
transform-origin: top left;
-webkit-filter: blur(10px);
@include fade-in(background-image);
}
div.content {
position: relative;
display: flex;
align-items: center;
justify-content: center;
z-index: 1;
.playlist {
max-width: 600px;
background-color: $transparent-bar-dark-background;
}
}
.current-artwork {
- position: absolute;
+ position: fixed;
- top: 0px;
+ top: 100px;
? ++
right: 100px;
img {
border: 10px solid rgba(17, 17, 17, 0.2);
max-height: 600px;
margin-right: 30px;
}
}
} | 4 | 0.081633 | 2 | 2 |
7ea6f649a748cd99922eb63de67864cc6ecf355a | .travis.yml | .travis.yml | language: python
sudo: false
env:
- PYTHON_VERSION=2.7
- PYTHON_VERSION=3.4
before_install:
- wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh
- chmod +x miniconda.sh
- ./miniconda.sh -b
- export PATH=/home/travis/miniconda3/bin:$PATH
# Update conda itself
- conda update --yes conda
install:
- travis_retry conda create --yes -n env_name python=$PYTHON_VERSION pip nose flake8 coverage numpy
- source activate env_name
- travis_retry pip install .
script:
- nosetests --with-doctest --with-coverage
- flake8 qtp_biom setup.py scripts
after_success:
- coveralls
| language: python
sudo: false
env:
- PYTHON_VERSION=2.7
- PYTHON_VERSION=3.4
before_install:
- wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh
- chmod +x miniconda.sh
- ./miniconda.sh -b
- export PATH=/home/travis/miniconda3/bin:$PATH
# Update conda itself
- conda update --yes conda
install:
- travis_retry conda create --yes -n env_name python=$PYTHON_VERSION pip nose flake8 coverage numpy
- source activate env_name
- pip install https://github.com/qiita-spots/qiita_client/archive/master.zip
- travis_retry pip install .
script:
- nosetests --with-doctest --with-coverage
- flake8 qtp_biom setup.py scripts
after_success:
- coveralls
| Install qiita client from master | Install qiita client from master
| YAML | bsd-3-clause | qiita-spots/qtp-biom | yaml | ## Code Before:
language: python
sudo: false
env:
- PYTHON_VERSION=2.7
- PYTHON_VERSION=3.4
before_install:
- wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh
- chmod +x miniconda.sh
- ./miniconda.sh -b
- export PATH=/home/travis/miniconda3/bin:$PATH
# Update conda itself
- conda update --yes conda
install:
- travis_retry conda create --yes -n env_name python=$PYTHON_VERSION pip nose flake8 coverage numpy
- source activate env_name
- travis_retry pip install .
script:
- nosetests --with-doctest --with-coverage
- flake8 qtp_biom setup.py scripts
after_success:
- coveralls
## Instruction:
Install qiita client from master
## Code After:
language: python
sudo: false
env:
- PYTHON_VERSION=2.7
- PYTHON_VERSION=3.4
before_install:
- wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh
- chmod +x miniconda.sh
- ./miniconda.sh -b
- export PATH=/home/travis/miniconda3/bin:$PATH
# Update conda itself
- conda update --yes conda
install:
- travis_retry conda create --yes -n env_name python=$PYTHON_VERSION pip nose flake8 coverage numpy
- source activate env_name
- pip install https://github.com/qiita-spots/qiita_client/archive/master.zip
- travis_retry pip install .
script:
- nosetests --with-doctest --with-coverage
- flake8 qtp_biom setup.py scripts
after_success:
- coveralls
| language: python
sudo: false
env:
- PYTHON_VERSION=2.7
- PYTHON_VERSION=3.4
before_install:
- wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh
- chmod +x miniconda.sh
- ./miniconda.sh -b
- export PATH=/home/travis/miniconda3/bin:$PATH
# Update conda itself
- conda update --yes conda
install:
- travis_retry conda create --yes -n env_name python=$PYTHON_VERSION pip nose flake8 coverage numpy
- source activate env_name
+ - pip install https://github.com/qiita-spots/qiita_client/archive/master.zip
- travis_retry pip install .
script:
- nosetests --with-doctest --with-coverage
- flake8 qtp_biom setup.py scripts
after_success:
- coveralls | 1 | 0.047619 | 1 | 0 |
5ee15158c31190fc7fb58d9a2016b0e74f268490 | .travis.yml | .travis.yml | language: ruby
rvm:
- 1.8.7
- 1.9.2
- 1.9.3
- 2.0.0
- 2.1.0
gemfile:
- test/gemfiles/1.0
- test/gemfiles/1.1
- test/gemfiles/1.2
- test/gemfiles/1.3
- test/gemfiles/1.4
- test/gemfiles/1.5
- test/gemfiles/1.6
- test/gemfiles/1.7
| language: ruby
rvm:
- 1.9.2
- 1.9.3
- 2.0.0
- 2.1.0
gemfile:
- test/gemfiles/1.0
- test/gemfiles/1.1
- test/gemfiles/1.2
- test/gemfiles/1.3
- test/gemfiles/1.4
- test/gemfiles/1.5
- test/gemfiles/1.6
- test/gemfiles/1.7
| Remove 1.8.7 from text matrix | Remove 1.8.7 from text matrix
ExecJS 2.0 will only support 1.9.3 and above
| YAML | mit | rails/ruby-coffee-script,rails/ruby-coffee-script,mavenlink/ruby-coffee-script,eileencodes/ruby-coffee-script,josh/ruby-coffee-script,eileencodes/ruby-coffee-script,HubSpot/ruby-coffee-script-multi,mavenlink/ruby-coffee-script | yaml | ## Code Before:
language: ruby
rvm:
- 1.8.7
- 1.9.2
- 1.9.3
- 2.0.0
- 2.1.0
gemfile:
- test/gemfiles/1.0
- test/gemfiles/1.1
- test/gemfiles/1.2
- test/gemfiles/1.3
- test/gemfiles/1.4
- test/gemfiles/1.5
- test/gemfiles/1.6
- test/gemfiles/1.7
## Instruction:
Remove 1.8.7 from text matrix
ExecJS 2.0 will only support 1.9.3 and above
## Code After:
language: ruby
rvm:
- 1.9.2
- 1.9.3
- 2.0.0
- 2.1.0
gemfile:
- test/gemfiles/1.0
- test/gemfiles/1.1
- test/gemfiles/1.2
- test/gemfiles/1.3
- test/gemfiles/1.4
- test/gemfiles/1.5
- test/gemfiles/1.6
- test/gemfiles/1.7
| language: ruby
rvm:
- - 1.8.7
- 1.9.2
- 1.9.3
- 2.0.0
- 2.1.0
gemfile:
- test/gemfiles/1.0
- test/gemfiles/1.1
- test/gemfiles/1.2
- test/gemfiles/1.3
- test/gemfiles/1.4
- test/gemfiles/1.5
- test/gemfiles/1.6
- test/gemfiles/1.7 | 1 | 0.055556 | 0 | 1 |
86666d4c79109608aa735db1e80c40abca31a78d | README.md | README.md |
Following the [DHH Rails 5: Action Cable Demo](https://medium.com/@dhh/rails-5-action-cable-demo-8bba4ccfc55e) or [Railscast](http://railscasts-china.com/episodes/action-cable-rails-5) but confirm works at Windows!
Need Rails 5.0.0.beta3.
Thanks @maclover7 [PR to enable using PG as storage](https://github.com/rails/rails/pull/22950) and @matthewd [PR to remove EventMachine](https://github.com/rails/rails/pull/23152) [Round 2](https://github.com/rails/rails/pull/23305), it's now possible to starting using ActionCable in Windows.
### Step to setup the PG DB in Windows
[Download](http://www.postgresql.org/download/windows/) and install, recommand PostgreSQL v9.4.5
```bat
psql -U postgres
# or running if in MacOS
# psql -U `whoami` -d postgres
```
```psql
CREATE DATABASE pgac_dev;
GRANT ALL PRIVILEGES ON DATABASE pgac_dev to <current_user>;
```
|
Following the [DHH Rails 5: Action Cable Demo](https://medium.com/@dhh/rails-5-action-cable-demo-8bba4ccfc55e) or [Railscast](http://railscasts-china.com/episodes/action-cable-rails-5) but confirm works at Windows!
Need Rails 5.0.0.beta3.
Thanks @maclover7 [PR to enable using PG as storage](https://github.com/rails/rails/pull/22950) and @matthewd [PR to remove EventMachine](https://github.com/rails/rails/pull/23152) [Round 2](https://github.com/rails/rails/pull/23305), it's now possible to starting using ActionCable in Windows.
### Step to setup the PG DB in Windows
[Download](http://www.postgresql.org/download/windows/) and install, recommand PostgreSQL v9.4.5
```bat
psql -U postgres
# or running if in MacOS
# psql -U `whoami` -d postgres
```
```psql
CREATE DATABASE pgac_dev;
GRANT ALL PRIVILEGES ON DATABASE pgac_dev to <current_user>;
```
### Heroku Deploy
```bash
heroku create
git push heroku master
heroku run rake db:migrate
heroku open
```
| Add notes about deploy heroku | Add notes about deploy heroku
| Markdown | unlicense | Eric-Guo/pgac_demo,Eric-Guo/pgac_demo,Eric-Guo/pgac_demo | markdown | ## Code Before:
Following the [DHH Rails 5: Action Cable Demo](https://medium.com/@dhh/rails-5-action-cable-demo-8bba4ccfc55e) or [Railscast](http://railscasts-china.com/episodes/action-cable-rails-5) but confirm works at Windows!
Need Rails 5.0.0.beta3.
Thanks @maclover7 [PR to enable using PG as storage](https://github.com/rails/rails/pull/22950) and @matthewd [PR to remove EventMachine](https://github.com/rails/rails/pull/23152) [Round 2](https://github.com/rails/rails/pull/23305), it's now possible to starting using ActionCable in Windows.
### Step to setup the PG DB in Windows
[Download](http://www.postgresql.org/download/windows/) and install, recommand PostgreSQL v9.4.5
```bat
psql -U postgres
# or running if in MacOS
# psql -U `whoami` -d postgres
```
```psql
CREATE DATABASE pgac_dev;
GRANT ALL PRIVILEGES ON DATABASE pgac_dev to <current_user>;
```
## Instruction:
Add notes about deploy heroku
## Code After:
Following the [DHH Rails 5: Action Cable Demo](https://medium.com/@dhh/rails-5-action-cable-demo-8bba4ccfc55e) or [Railscast](http://railscasts-china.com/episodes/action-cable-rails-5) but confirm works at Windows!
Need Rails 5.0.0.beta3.
Thanks @maclover7 [PR to enable using PG as storage](https://github.com/rails/rails/pull/22950) and @matthewd [PR to remove EventMachine](https://github.com/rails/rails/pull/23152) [Round 2](https://github.com/rails/rails/pull/23305), it's now possible to starting using ActionCable in Windows.
### Step to setup the PG DB in Windows
[Download](http://www.postgresql.org/download/windows/) and install, recommand PostgreSQL v9.4.5
```bat
psql -U postgres
# or running if in MacOS
# psql -U `whoami` -d postgres
```
```psql
CREATE DATABASE pgac_dev;
GRANT ALL PRIVILEGES ON DATABASE pgac_dev to <current_user>;
```
### Heroku Deploy
```bash
heroku create
git push heroku master
heroku run rake db:migrate
heroku open
```
|
Following the [DHH Rails 5: Action Cable Demo](https://medium.com/@dhh/rails-5-action-cable-demo-8bba4ccfc55e) or [Railscast](http://railscasts-china.com/episodes/action-cable-rails-5) but confirm works at Windows!
Need Rails 5.0.0.beta3.
Thanks @maclover7 [PR to enable using PG as storage](https://github.com/rails/rails/pull/22950) and @matthewd [PR to remove EventMachine](https://github.com/rails/rails/pull/23152) [Round 2](https://github.com/rails/rails/pull/23305), it's now possible to starting using ActionCable in Windows.
### Step to setup the PG DB in Windows
[Download](http://www.postgresql.org/download/windows/) and install, recommand PostgreSQL v9.4.5
```bat
psql -U postgres
# or running if in MacOS
# psql -U `whoami` -d postgres
```
```psql
CREATE DATABASE pgac_dev;
GRANT ALL PRIVILEGES ON DATABASE pgac_dev to <current_user>;
```
+
+ ### Heroku Deploy
+
+ ```bash
+ heroku create
+ git push heroku master
+ heroku run rake db:migrate
+ heroku open
+ ``` | 9 | 0.428571 | 9 | 0 |
609c8e37b082dd3c91fadb3cb11130ba345527f8 | readme.md | readme.md |
> Email input component for virtual-dom
## Install
```
$ npm install --save email-input
```
## Usage
```js
var EmailInput = require('email-input')
var emailInput = EmailInput()
function render (state) {
var vtree = EmailInput.render(state)
//=> use virtual-dom to patch vtree into real DOM
}
emailInput(render)
```
## API
#### `EmailInput(data)` -> `function`
Create a new email input observable.
##### data
Type: `object`
The initial state of the input.
###### value
Type: `string`
The email address.
###### valid
Type: `boolean`
The validity state of the email. Treat as read only.
###### placeholder
Type: `string`
The placeholder for the `<input>`
#### `EmailInput.render(state)` -> `object`
Render an email state to a vtree object.
## License
MIT © [Ben Drucker](http://bendrucker.me)
|
> Email input component for virtual-dom
## Install
```
$ npm install --save email-input
```
## Usage
```js
var EmailInput = require('email-input')
var emailInput = EmailInput()
function render (state) {
var vtree = EmailInput.render(state)
//=> use virtual-dom to patch vtree into real DOM
}
emailInput(render)
```
## API
#### `EmailInput(data)` -> `function`
Create a new email input observable.
##### data
Type: `object`
The initial state of the input.
###### value
Type: `string`
The email address.
###### valid
Type: `boolean`
The validity state of the email. Treat as read only.
#### `EmailInput.render(state, options)` -> `object`
Render an email state to a vtree object. `options` will be merged with the defaults (`{type: 'email', name: 'email'}`) and passed to [virtual-hyperscript](https://github.com/Matt-Esch/virtual-dom/tree/master/virtual-hyperscript).
## License
MIT © [Ben Drucker](http://bendrucker.me)
| Document new options + removal of placeholder | Document new options + removal of placeholder
| Markdown | mit | bendrucker/email-input | markdown | ## Code Before:
> Email input component for virtual-dom
## Install
```
$ npm install --save email-input
```
## Usage
```js
var EmailInput = require('email-input')
var emailInput = EmailInput()
function render (state) {
var vtree = EmailInput.render(state)
//=> use virtual-dom to patch vtree into real DOM
}
emailInput(render)
```
## API
#### `EmailInput(data)` -> `function`
Create a new email input observable.
##### data
Type: `object`
The initial state of the input.
###### value
Type: `string`
The email address.
###### valid
Type: `boolean`
The validity state of the email. Treat as read only.
###### placeholder
Type: `string`
The placeholder for the `<input>`
#### `EmailInput.render(state)` -> `object`
Render an email state to a vtree object.
## License
MIT © [Ben Drucker](http://bendrucker.me)
## Instruction:
Document new options + removal of placeholder
## Code After:
> Email input component for virtual-dom
## Install
```
$ npm install --save email-input
```
## Usage
```js
var EmailInput = require('email-input')
var emailInput = EmailInput()
function render (state) {
var vtree = EmailInput.render(state)
//=> use virtual-dom to patch vtree into real DOM
}
emailInput(render)
```
## API
#### `EmailInput(data)` -> `function`
Create a new email input observable.
##### data
Type: `object`
The initial state of the input.
###### value
Type: `string`
The email address.
###### valid
Type: `boolean`
The validity state of the email. Treat as read only.
#### `EmailInput.render(state, options)` -> `object`
Render an email state to a vtree object. `options` will be merged with the defaults (`{type: 'email', name: 'email'}`) and passed to [virtual-hyperscript](https://github.com/Matt-Esch/virtual-dom/tree/master/virtual-hyperscript).
## License
MIT © [Ben Drucker](http://bendrucker.me)
|
> Email input component for virtual-dom
## Install
```
$ npm install --save email-input
```
## Usage
```js
var EmailInput = require('email-input')
var emailInput = EmailInput()
function render (state) {
var vtree = EmailInput.render(state)
//=> use virtual-dom to patch vtree into real DOM
}
emailInput(render)
```
## API
#### `EmailInput(data)` -> `function`
Create a new email input observable.
##### data
Type: `object`
The initial state of the input.
###### value
Type: `string`
The email address.
###### valid
Type: `boolean`
The validity state of the email. Treat as read only.
- ###### placeholder
+ #### `EmailInput.render(state, options)` -> `object`
+ Render an email state to a vtree object. `options` will be merged with the defaults (`{type: 'email', name: 'email'}`) and passed to [virtual-hyperscript](https://github.com/Matt-Esch/virtual-dom/tree/master/virtual-hyperscript).
- Type: `string`
-
- The placeholder for the `<input>`
-
- #### `EmailInput.render(state)` -> `object`
-
- Render an email state to a vtree object.
## License
MIT © [Ben Drucker](http://bendrucker.me) | 10 | 0.16129 | 2 | 8 |