| commit | old_file | new_file | old_contents | new_contents | subject | message | lang | license | repos | config | content |
|---|---|---|---|---|---|---|---|---|---|---|---|
ccb774b58ab7dbe704abfb7df3fa29915fad8f8f | examples/memnn/download.py | examples/memnn/download.py |
from six.moves.urllib import request
def main():
    opener = request.FancyURLopener()
    opener.addheaders = [('User-Agent', '')]
    opener.retrieve(
        'http://www.thespermwhale.com/jaseweston/babi/tasks_1-20_v1-2.tar.gz',
        'tasks_1-20_v1-2.tar.gz')
if __name__ == '__main__':
    main()
|
from six.moves.urllib import request
def main():
    request.urlretrieve(
        'http://www.thespermwhale.com/jaseweston/babi/tasks_1-20_v1-2.tar.gz',
        'tasks_1-20_v1-2.tar.gz')
if __name__ == '__main__':
    main()
| Replace deprecated URLopener in `donwload.py` | Replace deprecated URLopener in `donwload.py`
| Python | mit | niboshi/chainer,keisuke-umezawa/chainer,wkentaro/chainer,wkentaro/chainer,pfnet/chainer,keisuke-umezawa/chainer,wkentaro/chainer,niboshi/chainer,niboshi/chainer,okuta/chainer,okuta/chainer,chainer/chainer,hvy/chainer,chainer/chainer,keisuke-umezawa/chainer,wkentaro/chainer,okuta/chainer,keisuke-umezawa/chainer,hvy/chainer,hvy/chainer,tkerola/chainer,hvy/chainer,chainer/chainer,chainer/chainer,okuta/chainer,niboshi/chainer | python | ## Code Before:
from six.moves.urllib import request
def main():
    opener = request.FancyURLopener()
    opener.addheaders = [('User-Agent', '')]
    opener.retrieve(
        'http://www.thespermwhale.com/jaseweston/babi/tasks_1-20_v1-2.tar.gz',
        'tasks_1-20_v1-2.tar.gz')
if __name__ == '__main__':
    main()
## Instruction:
Replace deprecated URLopener in `donwload.py`
## Code After:
from six.moves.urllib import request
def main():
    request.urlretrieve(
        'http://www.thespermwhale.com/jaseweston/babi/tasks_1-20_v1-2.tar.gz',
        'tasks_1-20_v1-2.tar.gz')
if __name__ == '__main__':
    main()
|
c868d416d90c448be733056505b48e339da582e0 | CHANGELOG.md | CHANGELOG.md | Multi User Test Runner
======================
# 0.3.0
## Breaking changes
* `SpringMultiUserTestClassRunner` is in its own artifact: spring-test-class-runner
* Default TestUsers runner is changed from `SpringMultiUserTestClassRunner` to `BlockMultiUserTestClassRunner`
* Runner configuration is done via `MultiUserTestConfig` annotation. Annotation can be added to a base class to reduce boilerplage code. | Multi User Test Runner
======================
# 0.3.0
## Breaking changes
* `SpringMultiUserTestClassRunner` is in its own artifact: spring-test-class-runner
* Default TestUsers runner is changed from `SpringMultiUserTestClassRunner` to `BlockMultiUserTestClassRunner`
* Runner configuration is done via `MultiUserTestConfig` annotation. Annotation can be added to a base class to reduce boilerplage code.
* Creator is logged after `@Before` methods but just before calling the test method. Previously
it was called in `AbstractUserRoleIT`class's `@Before` method which made impossible to create custom users
which could be used as creator user. | Add breaking change note about logInAs(Creator) call | Add breaking change note about logInAs(Creator) call
| Markdown | apache-2.0 | Vincit/multi-user-test-runner,Vincit/multi-user-test-runner,Vincit/multi-user-test-runner | markdown | ## Code Before:
Multi User Test Runner
======================
# 0.3.0
## Breaking changes
* `SpringMultiUserTestClassRunner` is in its own artifact: spring-test-class-runner
* Default TestUsers runner is changed from `SpringMultiUserTestClassRunner` to `BlockMultiUserTestClassRunner`
* Runner configuration is done via `MultiUserTestConfig` annotation. Annotation can be added to a base class to reduce boilerplage code.
## Instruction:
Add breaking change note about logInAs(Creator) call
## Code After:
Multi User Test Runner
======================
# 0.3.0
## Breaking changes
* `SpringMultiUserTestClassRunner` is in its own artifact: spring-test-class-runner
* Default TestUsers runner is changed from `SpringMultiUserTestClassRunner` to `BlockMultiUserTestClassRunner`
* Runner configuration is done via `MultiUserTestConfig` annotation. Annotation can be added to a base class to reduce boilerplage code.
* Creator is logged after `@Before` methods but just before calling the test method. Previously
it was called in `AbstractUserRoleIT`class's `@Before` method which made impossible to create custom users
which could be used as creator user. |
4640afc8ea101090c4205f0300cfa5c7c10d9960 | roles/dd-agent/tasks/main.yml | roles/dd-agent/tasks/main.yml | ---
- name: install dd repo key
  apt_key:
    id: C7A7DA52
    keyserver: keyserver.ubuntu.com
- name: install dd repo
  apt_repository:
    repo: "deb https://apt.datadoghq.com/ stable main"
    update_cache: yes
- name: install dependency packages
  apt:
    name: "{{ item }}"
    state: installed
  with_items:
    - apt-transport-https
- name: install datadog agent package
  apt:
    name: datadog-agent
    state: installed
  notify:
    - restart dd-agent
# This is required to call datadog_monitor with ansible and later to use the
# datadog callback for more integration into ansible runs
# within datadog.
- name: install datadog pip
  pip:
    name: datadog
    virtualenv: /opt/datadog
- name: Ensure config directories
  file:
    state: directory
    path: "{{ item }}"
  with_items:
    - "{{ datadog_config_dir }}"
    - "{{ datadog_check_dir }}"
    - "{{ datadog_frag_dir }}"
- name: configure datadog
  template:
    src: etc/dd-agent/datadog.conf
    dest: "{{ datadog_config_dir }}/datadog.conf"
    owner: dd-agent
    group: root
  notify:
    - restart dd-agent
# Flush handlers to avoid a start + restart scenario
- meta: flush_handlers
- name: start datadog agent
  service:
    name: datadog-agent
    state: started
    enabled: yes
| ---
- name: install dd repo key
  apt_key:
    id: C7A7DA52
    keyserver: keyserver.ubuntu.com
- name: install dd repo
  apt_repository:
    repo: "deb https://apt.datadoghq.com/ stable main"
    update_cache: yes
- name: install dependency packages
  apt:
    name: "{{ item }}"
    state: installed
  with_items:
    - apt-transport-https
- name: install datadog agent package
  apt:
    name: datadog-agent
    state: installed
  notify:
    - restart dd-agent
- name: Ensure config directories
  file:
    state: directory
    path: "{{ item }}"
  with_items:
    - "{{ datadog_config_dir }}"
    - "{{ datadog_check_dir }}"
    - "{{ datadog_frag_dir }}"
- name: configure datadog
  template:
    src: etc/dd-agent/datadog.conf
    dest: "{{ datadog_config_dir }}/datadog.conf"
    owner: dd-agent
    group: root
  notify:
    - restart dd-agent
# Flush handlers to avoid a start + restart scenario
- meta: flush_handlers
- name: start datadog agent
  service:
    name: datadog-agent
    state: started
    enabled: yes
| Stop install pip datadog to /opt/datadog | Stop install pip datadog to /opt/datadog
I think this is a remnant of some earlier attempt at ansible datadog
integration, however ansible gets installed into /opt/ansible and there
is no other reference to this directory I can find in hoist.
I don't think it's required any more.
Signed-off-by: Jamie Lennox <1b9801e8e46f6bd96338b24e58c84c4f37762fc2@gmail.com>
| YAML | apache-2.0 | jamielennox/hoist,jamielennox/hoist,SpamapS/hoist,BonnyCI/hoist,SpamapS/hoist,BonnyCI/hoist | yaml | ## Code Before:
---
- name: install dd repo key
  apt_key:
    id: C7A7DA52
    keyserver: keyserver.ubuntu.com
- name: install dd repo
  apt_repository:
    repo: "deb https://apt.datadoghq.com/ stable main"
    update_cache: yes
- name: install dependency packages
  apt:
    name: "{{ item }}"
    state: installed
  with_items:
    - apt-transport-https
- name: install datadog agent package
  apt:
    name: datadog-agent
    state: installed
  notify:
    - restart dd-agent
# This is required to call datadog_monitor with ansible and later to use the
# datadog callback for more integration into ansible runs
# within datadog.
- name: install datadog pip
  pip:
    name: datadog
    virtualenv: /opt/datadog
- name: Ensure config directories
  file:
    state: directory
    path: "{{ item }}"
  with_items:
    - "{{ datadog_config_dir }}"
    - "{{ datadog_check_dir }}"
    - "{{ datadog_frag_dir }}"
- name: configure datadog
  template:
    src: etc/dd-agent/datadog.conf
    dest: "{{ datadog_config_dir }}/datadog.conf"
    owner: dd-agent
    group: root
  notify:
    - restart dd-agent
# Flush handlers to avoid a start + restart scenario
- meta: flush_handlers
- name: start datadog agent
  service:
    name: datadog-agent
    state: started
    enabled: yes
## Instruction:
Stop install pip datadog to /opt/datadog
I think this is a remnant of some earlier attempt at ansible datadog
integration, however ansible gets installed into /opt/ansible and there
is no other reference to this directory I can find in hoist.
I don't think it's required any more.
Signed-off-by: Jamie Lennox <1b9801e8e46f6bd96338b24e58c84c4f37762fc2@gmail.com>
## Code After:
---
- name: install dd repo key
  apt_key:
    id: C7A7DA52
    keyserver: keyserver.ubuntu.com
- name: install dd repo
  apt_repository:
    repo: "deb https://apt.datadoghq.com/ stable main"
    update_cache: yes
- name: install dependency packages
  apt:
    name: "{{ item }}"
    state: installed
  with_items:
    - apt-transport-https
- name: install datadog agent package
  apt:
    name: datadog-agent
    state: installed
  notify:
    - restart dd-agent
- name: Ensure config directories
  file:
    state: directory
    path: "{{ item }}"
  with_items:
    - "{{ datadog_config_dir }}"
    - "{{ datadog_check_dir }}"
    - "{{ datadog_frag_dir }}"
- name: configure datadog
  template:
    src: etc/dd-agent/datadog.conf
    dest: "{{ datadog_config_dir }}/datadog.conf"
    owner: dd-agent
    group: root
  notify:
    - restart dd-agent
# Flush handlers to avoid a start + restart scenario
- meta: flush_handlers
- name: start datadog agent
  service:
    name: datadog-agent
    state: started
    enabled: yes
|
abfc91a687b33dda2659025af254bc9f50c077b5 | publishers/migrations/0008_fix_name_indices.py | publishers/migrations/0008_fix_name_indices.py |
from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        ('publishers', '0007_publisher_romeo_parent_id'),
    ]
    operations = [
        migrations.AlterField(
            model_name='journal',
            name='title',
            field=models.CharField(max_length=256),
        ),
        migrations.AlterField(
            model_name='publisher',
            name='name',
            field=models.CharField(max_length=256),
        ),
        migrations.RunSQL(
            sql="""
            CREATE INDEX CONCURRENTLY papers_journal_title_upper ON public.papers_journal USING btree (UPPER(title));
            """,
            reverse_sql="""
            DROP INDEX CONCURRENTLY papers_journal_title_upper;
            """
        ),
        migrations.RunSQL(
            sql="""
            CREATE INDEX CONCURRENTLY papers_publisher_name_upper ON public.papers_publisher USING btree (UPPER(name));
            """,
            reverse_sql="""
            DROP INDEX CONCURRENTLY papers_publisher_name_upper;
            """
        ),
    ]
|
from django.db import migrations, models
class Migration(migrations.Migration):
    atomic = False
    dependencies = [
        ('publishers', '0007_publisher_romeo_parent_id'),
    ]
    operations = [
        migrations.AlterField(
            model_name='journal',
            name='title',
            field=models.CharField(max_length=256),
        ),
        migrations.AlterField(
            model_name='publisher',
            name='name',
            field=models.CharField(max_length=256),
        ),
        migrations.RunSQL(
            sql="""
            CREATE INDEX CONCURRENTLY papers_journal_title_upper ON public.papers_journal USING btree (UPPER(title));
            """,
            reverse_sql="""
            DROP INDEX CONCURRENTLY papers_journal_title_upper;
            """,
        ),
        migrations.RunSQL(
            sql="""
            CREATE INDEX CONCURRENTLY papers_publisher_name_upper ON public.papers_publisher USING btree (UPPER(name));
            """,
            reverse_sql="""
            DROP INDEX CONCURRENTLY papers_publisher_name_upper;
            """
        ),
    ]
| Mark index migration as non atomic | Mark index migration as non atomic
| Python | agpl-3.0 | dissemin/dissemin,wetneb/dissemin,wetneb/dissemin,dissemin/dissemin,dissemin/dissemin,dissemin/dissemin,wetneb/dissemin,dissemin/dissemin,wetneb/dissemin | python | ## Code Before:
from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        ('publishers', '0007_publisher_romeo_parent_id'),
    ]
    operations = [
        migrations.AlterField(
            model_name='journal',
            name='title',
            field=models.CharField(max_length=256),
        ),
        migrations.AlterField(
            model_name='publisher',
            name='name',
            field=models.CharField(max_length=256),
        ),
        migrations.RunSQL(
            sql="""
            CREATE INDEX CONCURRENTLY papers_journal_title_upper ON public.papers_journal USING btree (UPPER(title));
            """,
            reverse_sql="""
            DROP INDEX CONCURRENTLY papers_journal_title_upper;
            """
        ),
        migrations.RunSQL(
            sql="""
            CREATE INDEX CONCURRENTLY papers_publisher_name_upper ON public.papers_publisher USING btree (UPPER(name));
            """,
            reverse_sql="""
            DROP INDEX CONCURRENTLY papers_publisher_name_upper;
            """
        ),
    ]
## Instruction:
Mark index migration as non atomic
## Code After:
from django.db import migrations, models
class Migration(migrations.Migration):
    atomic = False
    dependencies = [
        ('publishers', '0007_publisher_romeo_parent_id'),
    ]
    operations = [
        migrations.AlterField(
            model_name='journal',
            name='title',
            field=models.CharField(max_length=256),
        ),
        migrations.AlterField(
            model_name='publisher',
            name='name',
            field=models.CharField(max_length=256),
        ),
        migrations.RunSQL(
            sql="""
            CREATE INDEX CONCURRENTLY papers_journal_title_upper ON public.papers_journal USING btree (UPPER(title));
            """,
            reverse_sql="""
            DROP INDEX CONCURRENTLY papers_journal_title_upper;
            """,
        ),
        migrations.RunSQL(
            sql="""
            CREATE INDEX CONCURRENTLY papers_publisher_name_upper ON public.papers_publisher USING btree (UPPER(name));
            """,
            reverse_sql="""
            DROP INDEX CONCURRENTLY papers_publisher_name_upper;
            """
        ),
    ]
|
77223cb9cc13961c2762bec24a348c72e63a6217 | source/init/constants.ts | source/init/constants.ts | import {setVersionInfo, setAppName, setTimezone} from '@frogpond/constants'
export const SENTRY_DSN =
'https://6f70285364b7417181e17db8bcf4de11@sentry.frogpond.tech/2'
import {version, name} from '../../package.json'
setVersionInfo(version)
setAppName(name)
setTimezone('America/Chicago')
| import {setVersionInfo, setAppName, setTimezone} from '@frogpond/constants'
export const SENTRY_DSN =
'https://7f68e814c5c24c32a582f2ddc3d42b4c@o524787.ingest.sentry.io/5637838'
import {version, name} from '../../package.json'
setVersionInfo(version)
setAppName(name)
setTimezone('America/Chicago')
| Switch DSN over to hosted sentry url | sentry: Switch DSN over to hosted sentry url
Signed-off-by: Kristofer Rye <1ed31cfd0b53bc3d1689a6fee6dbfc9507dffd22@gmail.com>
| TypeScript | agpl-3.0 | StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native | typescript | ## Code Before:
import {setVersionInfo, setAppName, setTimezone} from '@frogpond/constants'
export const SENTRY_DSN =
'https://6f70285364b7417181e17db8bcf4de11@sentry.frogpond.tech/2'
import {version, name} from '../../package.json'
setVersionInfo(version)
setAppName(name)
setTimezone('America/Chicago')
## Instruction:
sentry: Switch DSN over to hosted sentry url
Signed-off-by: Kristofer Rye <1ed31cfd0b53bc3d1689a6fee6dbfc9507dffd22@gmail.com>
## Code After:
import {setVersionInfo, setAppName, setTimezone} from '@frogpond/constants'
export const SENTRY_DSN =
'https://7f68e814c5c24c32a582f2ddc3d42b4c@o524787.ingest.sentry.io/5637838'
import {version, name} from '../../package.json'
setVersionInfo(version)
setAppName(name)
setTimezone('America/Chicago')
|
f64f9dacf3f8370edd02c45bf577f40100fb9846 | composer.json | composer.json | {
    "name": "unisharp/laravel-settings",
    "description": "Persistent settings manager for laravel 5",
    "authors": [
        {
            "name": "Unisharp Ltd.",
            "email": "service@unisharp.com"
        }
    ],
    "require": {
        "php": ">=5.5.0",
        "illuminate/support": ">=5.0.0"
    },
    "autoload": {
        "psr-4": {
            "Unisharp\\Setting\\": "src/Setting/"
        }
    }
}
| {
    "name": "unisharp/laravel-settings",
    "description": "Persistent settings manager for laravel 5",
    "type": "library",
    "keywords": [ "key-value", "storage", "settings", "persistent", "laravel"],
    "license": "MIT",
    "authors": [
        {
            "name": "Unisharp Ltd.",
            "email": "service@unisharp.com"
        }
    ],
    "require": {
        "php": ">=5.5.0",
        "illuminate/support": ">=5.0.0"
    },
    "autoload": {
        "psr-4": {
            "Unisharp\\Setting\\": "src/Setting/"
        }
    }
}
| Add license, keywords and type | Add license, keywords and type
| JSON | mit | UniSharp/laravel-settings | json | ## Code Before:
{
    "name": "unisharp/laravel-settings",
    "description": "Persistent settings manager for laravel 5",
    "authors": [
        {
            "name": "Unisharp Ltd.",
            "email": "service@unisharp.com"
        }
    ],
    "require": {
        "php": ">=5.5.0",
        "illuminate/support": ">=5.0.0"
    },
    "autoload": {
        "psr-4": {
            "Unisharp\\Setting\\": "src/Setting/"
        }
    }
}
## Instruction:
Add license, keywords and type
## Code After:
{
    "name": "unisharp/laravel-settings",
    "description": "Persistent settings manager for laravel 5",
    "type": "library",
    "keywords": [ "key-value", "storage", "settings", "persistent", "laravel"],
    "license": "MIT",
    "authors": [
        {
            "name": "Unisharp Ltd.",
            "email": "service@unisharp.com"
        }
    ],
    "require": {
        "php": ">=5.5.0",
        "illuminate/support": ">=5.0.0"
    },
    "autoload": {
        "psr-4": {
            "Unisharp\\Setting\\": "src/Setting/"
        }
    }
}
|
538e4b799675744d72ce47b067673f4b2f3700ea | test/fs-walk-parallel-test.js | test/fs-walk-parallel-test.js | "use strict";
var fsWalkParallel = require("../lib/fs-walk-parallel");
var mockFs = require("mock-fs");
var test = require("tape");
function setup(t, fakeFs) {
  mockFs.restore();
  mockFs(fakeFs);
}
test("nonexistant folder should return error", function(t) {
  setup(t, {});
  t.plan(1);
  fsWalkParallel("doesntExist", function() {
    throw "should never happen";
  }, function(err) {
    t.ok(err, "should have an error");
    t.end();
  });
});
test("empty folder should return no errors", function(t) {
  setup(t, { "folder": {} });
  t.plan(1);
  fsWalkParallel("folder", function() {
    throw "should never happen";
  }, function(err) {
    t.notOk(err, "should have no errors");
    t.end();
  });
});
test("single file should call iterator once with stat", function(t) {
  setup(t, { "folder": { "file": "contents" } });
  t.plan(2);
  fsWalkParallel("folder", function(stat) {
    t.ok(stat.isFile(), "should return a fs.Stats");
  }, function(err) {
    t.notOk(err, "should have no errors");
    t.end();
  });
});
| "use strict";
var fsWalkParallel = require("../lib/fs-walk-parallel");
var mockFs = require("mock-fs");
var test = require("tape");
function setup(t, fakeFs) {
  mockFs.restore();
  mockFs(fakeFs);
}
test("nonexistant folder should return error", function(t) {
  setup(t, {});
  t.plan(1);
  fsWalkParallel("doesntExist", function() {
    throw "should never happen";
  }, function(err) {
    t.ok(err, "should have an error");
    t.end();
  });
});
test("empty folder should return no errors", function(t) {
  setup(t, { "folder": {} });
  t.plan(1);
  fsWalkParallel("folder", function() {
    throw "should never happen";
  }, function(err) {
    t.notOk(err, "should have no errors");
    t.end();
  });
});
test("two files should call iterator twice with stat", function(t) {
  setup(t, { "folder": { "file1": "contents1", "file2": "contents2" } });
  t.plan(3);
  fsWalkParallel("folder", function(stat) {
    t.ok(stat.isFile(), "should return a fs.Stats");
  }, function(err) {
    t.notOk(err, "should have no errors");
    t.end();
  });
});
| Add test for two files | Add test for two files
| JavaScript | mit | ericlathrop/node-fs-walk-breadth-first | javascript | ## Code Before:
"use strict";
var fsWalkParallel = require("../lib/fs-walk-parallel");
var mockFs = require("mock-fs");
var test = require("tape");
function setup(t, fakeFs) {
  mockFs.restore();
  mockFs(fakeFs);
}
test("nonexistant folder should return error", function(t) {
  setup(t, {});
  t.plan(1);
  fsWalkParallel("doesntExist", function() {
    throw "should never happen";
  }, function(err) {
    t.ok(err, "should have an error");
    t.end();
  });
});
test("empty folder should return no errors", function(t) {
  setup(t, { "folder": {} });
  t.plan(1);
  fsWalkParallel("folder", function() {
    throw "should never happen";
  }, function(err) {
    t.notOk(err, "should have no errors");
    t.end();
  });
});
test("single file should call iterator once with stat", function(t) {
  setup(t, { "folder": { "file": "contents" } });
  t.plan(2);
  fsWalkParallel("folder", function(stat) {
    t.ok(stat.isFile(), "should return a fs.Stats");
  }, function(err) {
    t.notOk(err, "should have no errors");
    t.end();
  });
});
## Instruction:
Add test for two files
## Code After:
"use strict";
var fsWalkParallel = require("../lib/fs-walk-parallel");
var mockFs = require("mock-fs");
var test = require("tape");
function setup(t, fakeFs) {
  mockFs.restore();
  mockFs(fakeFs);
}
test("nonexistant folder should return error", function(t) {
  setup(t, {});
  t.plan(1);
  fsWalkParallel("doesntExist", function() {
    throw "should never happen";
  }, function(err) {
    t.ok(err, "should have an error");
    t.end();
  });
});
test("empty folder should return no errors", function(t) {
  setup(t, { "folder": {} });
  t.plan(1);
  fsWalkParallel("folder", function() {
    throw "should never happen";
  }, function(err) {
    t.notOk(err, "should have no errors");
    t.end();
  });
});
test("two files should call iterator twice with stat", function(t) {
  setup(t, { "folder": { "file1": "contents1", "file2": "contents2" } });
  t.plan(3);
  fsWalkParallel("folder", function(stat) {
    t.ok(stat.isFile(), "should return a fs.Stats");
  }, function(err) {
    t.notOk(err, "should have no errors");
    t.end();
  });
});
|
f1351bc1e34801dd26a84c37cf10449428d7a4ff | RubyTest.rb | RubyTest.rb | puts "Enter your name:"
name = gets.chomp
puts "Your name is #{name}!"
name.reverse!
puts "Your name backwards is #{name}"
name.upcase!
puts "Your name backwards and upcase is #{name}!" | puts "Enter your name:"
name = gets.chomp
puts "Your name is #{name}!"
name.reverse!
puts "Your name backwards is #{name}"
name.upcase!
puts "Your name backwards and upcase is #{name}!"
puts "Enter your last name:"
lastname = gets.chomp
name.reverse!
name.capitalize!
puts "Your name is #{name} #{lastname}" | Add last name functionality for double the fun...ctionality | Add last name functionality for double the fun...ctionality
| Ruby | mit | jnicholea/phase-1 | ruby | ## Code Before:
puts "Enter your name:"
name = gets.chomp
puts "Your name is #{name}!"
name.reverse!
puts "Your name backwards is #{name}"
name.upcase!
puts "Your name backwards and upcase is #{name}!"
## Instruction:
Add last name functionality for double the fun...ctionality
## Code After:
puts "Enter your name:"
name = gets.chomp
puts "Your name is #{name}!"
name.reverse!
puts "Your name backwards is #{name}"
name.upcase!
puts "Your name backwards and upcase is #{name}!"
puts "Enter your last name:"
lastname = gets.chomp
name.reverse!
name.capitalize!
puts "Your name is #{name} #{lastname}" |
74748c58c01f441a2a910f250a8d9736c25ea233 | .travis.yml | .travis.yml | language: generic
os: osx
osx_image: xcode9.2
branches:
  only:
    - master
before_install:
  - export XCODEBUILD_PATH=$(which xcodebuild)
  - export PATH=$PATH:$XCODEBUILD_PATH
  - export PATH=$PATH:~/bin
  - brew update
  - gem env
  - gem install xcpretty --no-rdoc --no-ri --user-install -n~/bin
  - gem install fastlane --no-rdoc --no-ri --user-install -n~/bin
install:
  - xcrun simctl list
  - carthage update --platform 'iOS'
  - carthage update --platform 'tvOS'
script:
  - ./test
| language: generic
os: osx
osx_image: xcode9.2
branches:
  only:
    - master
before_install:
  - export XCODEBUILD_PATH=$(which xcodebuild)
  - export PATH=$PATH:$XCODEBUILD_PATH
  - export PATH=$PATH:~/bin
  - brew update
  - gem install xcpretty --no-rdoc --no-ri --user-install -n~/bin
  - gem install fastlane --no-rdoc --no-ri --user-install -n~/bin
install:
  - xcrun simctl list
  - carthage update --platform 'iOS'
  - carthage update --platform 'tvOS'
script:
  - ./test
| Remove diagnostic gem env print statement from Travis config | Remove diagnostic gem env print statement from Travis config
| YAML | apache-2.0 | jwfriese/Fleet,jwfriese/Fleet,jwfriese/Fleet,jwfriese/Fleet | yaml | ## Code Before:
language: generic
os: osx
osx_image: xcode9.2
branches:
  only:
    - master
before_install:
  - export XCODEBUILD_PATH=$(which xcodebuild)
  - export PATH=$PATH:$XCODEBUILD_PATH
  - export PATH=$PATH:~/bin
  - brew update
  - gem env
  - gem install xcpretty --no-rdoc --no-ri --user-install -n~/bin
  - gem install fastlane --no-rdoc --no-ri --user-install -n~/bin
install:
  - xcrun simctl list
  - carthage update --platform 'iOS'
  - carthage update --platform 'tvOS'
script:
  - ./test
## Instruction:
Remove diagnostic gem env print statement from Travis config
## Code After:
language: generic
os: osx
osx_image: xcode9.2
branches:
  only:
    - master
before_install:
  - export XCODEBUILD_PATH=$(which xcodebuild)
  - export PATH=$PATH:$XCODEBUILD_PATH
  - export PATH=$PATH:~/bin
  - brew update
  - gem install xcpretty --no-rdoc --no-ri --user-install -n~/bin
  - gem install fastlane --no-rdoc --no-ri --user-install -n~/bin
install:
  - xcrun simctl list
  - carthage update --platform 'iOS'
  - carthage update --platform 'tvOS'
script:
  - ./test
|
f5ebd2f805663ed2520d268d1cda97a27e038890 | lib/generators/ambry_generator.rb | lib/generators/ambry_generator.rb | require 'rails/generators'
require 'rails/generators/actions'
# This generator adds an initializer and default empty database to your Rails
# application. It can be invoked on the command line like:
#
# rails generate ambry
#
class AmbryGenerator < Rails::Generators::Base
  # Create the initializer and empty database.
  def create_files
    initializer("ambry.rb") do
      <<-EOI
require "ambry/adapters/yaml"
require "ambry/active_model"
Ambry::Adapters::YAML.new :file => Rails.root.join('db', 'ambry.yml')
      EOI
    end
    create_file("db/ambry.yml", '')
  end
end | require 'rails/generators'
require 'rails/generators/actions'
# This generator adds an initializer and default empty database to your Rails
# application. It can be invoked on the command line like:
#
# rails generate ambry
#
class AmbryGenerator < Rails::Generators::Base
  # Create the initializer and empty database.
  def create_files
    initializer("ambry.rb") do
      <<-EOI
require "ambry/adapters/yaml"
require "ambry/active_model"
Ambry.remove_adapter :main
Ambry::Adapters::YAML.new :file => Rails.root.join('db', 'ambry.yml')
      EOI
    end
    create_file("db/ambry.yml", '')
  end
end | Remove main adapter by default | Remove main adapter by default
The default behavior on Rails should probably not be to raise an error.
Resolves issue #12
| Ruby | mit | norman/ambry | ruby | ## Code Before:
require 'rails/generators'
require 'rails/generators/actions'
# This generator adds an initializer and default empty database to your Rails
# application. It can be invoked on the command line like:
#
# rails generate ambry
#
class AmbryGenerator < Rails::Generators::Base
  # Create the initializer and empty database.
  def create_files
    initializer("ambry.rb") do
      <<-EOI
require "ambry/adapters/yaml"
require "ambry/active_model"
Ambry::Adapters::YAML.new :file => Rails.root.join('db', 'ambry.yml')
      EOI
    end
    create_file("db/ambry.yml", '')
  end
end
## Instruction:
Remove main adapter by default
The default behavior on Rails should probably not be to raise an error.
Resolves issue #12
## Code After:
require 'rails/generators'
require 'rails/generators/actions'
# This generator adds an initializer and default empty database to your Rails
# application. It can be invoked on the command line like:
#
# rails generate ambry
#
class AmbryGenerator < Rails::Generators::Base
  # Create the initializer and empty database.
  def create_files
    initializer("ambry.rb") do
      <<-EOI
require "ambry/adapters/yaml"
require "ambry/active_model"
Ambry.remove_adapter :main
Ambry::Adapters::YAML.new :file => Rails.root.join('db', 'ambry.yml')
      EOI
    end
    create_file("db/ambry.yml", '')
  end
end |
968fdefc7d48e7877f3729fd7d4b2fa2658144d2 | config/routes.rb | config/routes.rb | Qunit::Rails::Engine.routes.draw do
  root to: 'test#index'
end
Rails.application.routes.draw do
  mount Qunit::Rails::Engine => '/qunit'
end
| Qunit::Rails::Engine.routes.draw do
  root to: 'test#index'
  match ':action', controller: 'test'
end
Rails.application.routes.draw do
  mount Qunit::Rails::Engine => '/qunit'
end
| Allow for arbitrary test suites | Allow for arbitrary test suites
| Ruby | unlicense | frodsan/qunit-rails,frodsan/qunit-rails | ruby | ## Code Before:
Qunit::Rails::Engine.routes.draw do
  root to: 'test#index'
end
Rails.application.routes.draw do
  mount Qunit::Rails::Engine => '/qunit'
end
## Instruction:
Allow for arbitrary test suites
## Code After:
Qunit::Rails::Engine.routes.draw do
  root to: 'test#index'
  match ':action', controller: 'test'
end
Rails.application.routes.draw do
  mount Qunit::Rails::Engine => '/qunit'
end
|
b9cb2db713bba95597fa2bfdd1748a03e695b80f | config/routes.rb | config/routes.rb | Rails.application.routes.draw do
  devise_for :users
  get 'welcome/index'
  root 'welcome#index'
  resources :docs
end
| Rails.application.routes.draw do
  devise_for :users
  get 'welcome/index'
  resources :docs
  authenticated :user do
    root "docs#index", as: "authenticated_root"
  end
  root 'welcome#index'
end
| Change root of application for authenticated user | Change root of application for authenticated user
| Ruby | mit | hugoangeles025/cabinet,hugoangeles025/cabinet,hugoangeles025/cabinet | ruby | ## Code Before:
Rails.application.routes.draw do
  devise_for :users
  get 'welcome/index'
  root 'welcome#index'
  resources :docs
end
## Instruction:
Change root of application for authenticated user
## Code After:
Rails.application.routes.draw do
  devise_for :users
  get 'welcome/index'
  resources :docs
  authenticated :user do
    root "docs#index", as: "authenticated_root"
  end
  root 'welcome#index'
end
|
4a950e67c2a16962c403b73a807d59c60b9d8ac5 | packages/basemap/requirements-doc.txt | packages/basemap/requirements-doc.txt | six >= 1.10, < 1.16; python_version >= "3.5"
sphinx >= 3.0, < 3.4; python_version >= "3.5"
| sphinx >= 3.0, < 3.4; python_version >= "3.5"
| Remove six also from doc requirements | Remove six also from doc requirements
| Text | mit | matplotlib/basemap,guziy/basemap,matplotlib/basemap,guziy/basemap | text | ## Code Before:
six >= 1.10, < 1.16; python_version >= "3.5"
sphinx >= 3.0, < 3.4; python_version >= "3.5"
## Instruction:
Remove six also from doc requirements
## Code After:
sphinx >= 3.0, < 3.4; python_version >= "3.5"
|
1201c6806bac6db17dde4bf5e98e55a7cb81c087 | manifests/bits-service-webdav-certs.yml | manifests/bits-service-webdav-certs.yml | properties:
  bits-service:
    buildpacks:
      webdav_config:
        ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
    droplets:
      webdav_config:
        ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
    packages:
      webdav_config:
        ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
    app_stash:
      webdav_config:
        ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
  blobstore:
    tls:
      cert: (( file "standalone-blobstore-certs/server.crt" ))
      private_key: (( file "standalone-blobstore-certs/server.key" ))
      ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
| - type: replace
  path: /properties?/bits-service/buildpacks/webdav_config/ca_cert
  value: ((blobstore_ssl.ca))
- type: replace
  path: /properties?/bits-service/droplets/webdav_config/ca_cert
  value: ((blobstore_ssl.ca))
- type: replace
  path: /properties?/bits-service/packages/webdav_config/ca_cert
  value: ((blobstore_ssl.ca))
- type: replace
  path: /properties?/bits-service/app_stash/webdav_config/ca_cert
  value: ((blobstore_ssl.ca))
- type: replace
  path: /properties?/blobstore/tls
  value:
    cert: ((blobstore_ssl.certificate))
    private_key: ((blobstore_ssl.private_key))
    ca_cert: ((blobstore_ssl.ca))
- type: replace
  path: /variables?/name=blobstore_ssl
  value:
    name: blobstore_ssl
    type: certificate
    options:
      common_name: not_configured
      alternative_names:
      - blobstore.10.175.96.245.xip.io
      - '*.10.175.96.245.xip.io'
      - 10.175.96.245.xip.io
      ca: default_ca
- type: replace
  path: /variables?/name=default_ca
  value:
    name: default_ca
    options:
      common_name: ca
      is_ca: true
    type: certificate
| Fix certificates for bits-service standalone deployment | Fix certificates for bits-service standalone deployment
We never noticed that these certificates were broken, because the Ruby implementation does not validate them. Bitsgo does.
| YAML | apache-2.0 | cloudfoundry-incubator/bits-service-ci,cloudfoundry-incubator/bits-service-ci,cloudfoundry-incubator/bits-service-ci | yaml | ## Code Before:
properties:
bits-service:
buildpacks:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
droplets:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
packages:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
app_stash:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
blobstore:
tls:
cert: (( file "standalone-blobstore-certs/server.crt" ))
private_key: (( file "standalone-blobstore-certs/server.key" ))
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
## Instruction:
Fix certificates for bits-service standalone deployment
We never noticed that these certificates were broken, because the Ruby implementation does not validate them. Bitsgo does.
## Code After:
- type: replace
path: /properties?/bits-service/buildpacks/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/bits-service/droplets/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/bits-service/packages/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/bits-service/app_stash/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/blobstore/tls
value:
cert: ((blobstore_ssl.certificate))
private_key: ((blobstore_ssl.private_key))
ca_cert: ((blobstore_ssl.ca))
- type: replace
path: /variables?/name=blobstore_ssl
value:
name: blobstore_ssl
type: certificate
options:
common_name: not_configured
alternative_names:
- blobstore.10.175.96.245.xip.io
- '*.10.175.96.245.xip.io'
- 10.175.96.245.xip.io
ca: default_ca
- type: replace
path: /variables?/name=default_ca
value:
name: default_ca
options:
common_name: ca
is_ca: true
type: certificate
|
ebf64c605c034d892275be70771f96be3b62ae28 | src/cosy/loader/lua.lua | src/cosy/loader/lua.lua | if #setmetatable ({}, { __len = function () return 1 end }) ~= 1
then
error "Cosy requires Lua >= 5.2 or Luajit with 5.2 compatibility to run."
end
return function (t)
t = t or {}
local loader = {}
local modules = setmetatable ({}, { __mode = "kv" })
loader.hotswap = t.hotswap
or require "hotswap".new {}
loader.require = function (name)
return loader.hotswap.require (name)
end
loader.load = function (name)
if modules [name] then
return modules [name]
end
local module = loader.require (name) (loader) or true
modules [name] = module
return module
end
loader.logto = t.logto
loader.coroutine = loader.require "coroutine.make" ()
loader.scheduler = t.scheduler
or loader.require "copas.ev"
loader.request = t.request
or loader.require "socket.http".request
loader.load "cosy.string"
return loader
end
| if #setmetatable ({}, { __len = function () return 1 end }) ~= 1
then
error "Cosy requires Lua >= 5.2 or Luajit with 5.2 compatibility to run."
end
return function (t)
t = t or {}
local loader = {}
local modules = setmetatable ({}, { __mode = "kv" })
loader.hotswap = t.hotswap
or require "hotswap".new {}
loader.require = function (name)
local back = _G.require
_G.require = loader.hotswap.require
local result = loader.hotswap.require (name)
_G.require = back
return result
end
-- loader.require = function (name)
-- return loader.hotswap.require (name)
-- end
loader.load = function (name)
if modules [name] then
return modules [name]
end
local module = loader.require (name) (loader) or true
modules [name] = module
return module
end
loader.logto = t.logto
loader.coroutine = loader.require "coroutine.make" ()
loader.scheduler = t.scheduler
or loader.require "copas.ev"
loader.request = t.request
or loader.require "socket.http".request
loader.load "cosy.string"
return loader
end
| Fix require in the loader. | Fix require in the loader.
| Lua | mit | CosyVerif/library,CosyVerif/library,CosyVerif/library | lua | ## Code Before:
if #setmetatable ({}, { __len = function () return 1 end }) ~= 1
then
error "Cosy requires Lua >= 5.2 or Luajit with 5.2 compatibility to run."
end
return function (t)
t = t or {}
local loader = {}
local modules = setmetatable ({}, { __mode = "kv" })
loader.hotswap = t.hotswap
or require "hotswap".new {}
loader.require = function (name)
return loader.hotswap.require (name)
end
loader.load = function (name)
if modules [name] then
return modules [name]
end
local module = loader.require (name) (loader) or true
modules [name] = module
return module
end
loader.logto = t.logto
loader.coroutine = loader.require "coroutine.make" ()
loader.scheduler = t.scheduler
or loader.require "copas.ev"
loader.request = t.request
or loader.require "socket.http".request
loader.load "cosy.string"
return loader
end
## Instruction:
Fix require in the loader.
## Code After:
if #setmetatable ({}, { __len = function () return 1 end }) ~= 1
then
error "Cosy requires Lua >= 5.2 or Luajit with 5.2 compatibility to run."
end
return function (t)
t = t or {}
local loader = {}
local modules = setmetatable ({}, { __mode = "kv" })
loader.hotswap = t.hotswap
or require "hotswap".new {}
loader.require = function (name)
local back = _G.require
_G.require = loader.hotswap.require
local result = loader.hotswap.require (name)
_G.require = back
return result
end
-- loader.require = function (name)
-- return loader.hotswap.require (name)
-- end
loader.load = function (name)
if modules [name] then
return modules [name]
end
local module = loader.require (name) (loader) or true
modules [name] = module
return module
end
loader.logto = t.logto
loader.coroutine = loader.require "coroutine.make" ()
loader.scheduler = t.scheduler
or loader.require "copas.ev"
loader.request = t.request
or loader.require "socket.http".request
loader.load "cosy.string"
return loader
end
|
f6cfbfde296ca7839a91c55fc04ccfaeafeeed5a | eg_rest.psgi | eg_rest.psgi | use strict;
use warnings;
use Plack::Builder;
use EnsEMBL::REST;
use Config;
use Plack::Util;
use File::Basename;
use File::Spec;
my $app = EnsEMBL::REST->psgi_app;
builder {
my $dirname = dirname(__FILE__);
my $rootdir = File::Spec->rel2abs(File::Spec->catdir($dirname, File::Spec->updir(), File::Spec->updir()));
my $staticdir = File::Spec->catdir($rootdir, 'root');
Plack::Util::load_class('BSD::Resource') if $Config{osname} eq 'darwin';
enable 'SizeLimit' => (
max_unshared_size_in_kb => (300 * 1024), # 100MB per process (memory assigned just to the process)
check_every_n_requests => 10,
);
enable "Plack::Middleware::ReverseProxy";
enable 'StackTrace';
enable 'Runtime';
enable "ContentLength";
enable 'CrossOrigin', origins => '*', headers => '*', methods => ['GET','POST','OPTIONS'];
$app;
}
| use strict;
use warnings;
use Plack::Builder;
use EnsEMBL::REST;
use Config;
use Plack::Util;
use File::Basename;
use File::Spec;
my $app = EnsEMBL::REST->psgi_app;
builder {
my $dirname = dirname(__FILE__);
my $rootdir = File::Spec->rel2abs(File::Spec->catdir($dirname, File::Spec->updir(), File::Spec->updir()));
my $staticdir = File::Spec->catdir($rootdir, 'root');
Plack::Util::load_class('BSD::Resource') if $Config{osname} eq 'darwin';
enable 'SizeLimit' => (
max_unshared_size_in_kb => (1300 * 1024), # 1300MB per process (memory assigned just to the process)
check_every_n_requests => 10,
log_when_limits_exceeded => 1
);
enable "Plack::Middleware::ReverseProxy";
enable 'StackTrace';
enable 'Runtime';
enable "ContentLength";
enable 'CrossOrigin', origins => '*', headers => '*', methods => ['GET','POST','OPTIONS'];
$app;
}
| Change configuration for maximum starman worker process memory size | Change configuration for maximum starman worker process memory size
Previously this was 300MB. But the virtual and resident sizes for a single
worker process are both significantly larger. Because we were checking it
every 10 requests, the worker processes were restarted after serving 10
requests.
Change it to 1300MB so that it is slightly larger than the virtual size for
a single worker which is 1260MB minimum.
| Perl | apache-2.0 | EnsemblGenomes/eg-rest,EnsemblGenomes/eg-rest | perl | ## Code Before:
use strict;
use warnings;
use Plack::Builder;
use EnsEMBL::REST;
use Config;
use Plack::Util;
use File::Basename;
use File::Spec;
my $app = EnsEMBL::REST->psgi_app;
builder {
my $dirname = dirname(__FILE__);
my $rootdir = File::Spec->rel2abs(File::Spec->catdir($dirname, File::Spec->updir(), File::Spec->updir()));
my $staticdir = File::Spec->catdir($rootdir, 'root');
Plack::Util::load_class('BSD::Resource') if $Config{osname} eq 'darwin';
enable 'SizeLimit' => (
max_unshared_size_in_kb => (300 * 1024), # 100MB per process (memory assigned just to the process)
check_every_n_requests => 10,
);
enable "Plack::Middleware::ReverseProxy";
enable 'StackTrace';
enable 'Runtime';
enable "ContentLength";
enable 'CrossOrigin', origins => '*', headers => '*', methods => ['GET','POST','OPTIONS'];
$app;
}
## Instruction:
Change configuration for maximum starman worker process memory size
Previously this was 300MB. But the virtual and resident sizes for a single
worker process are both significantly larger. Because we were checking it
every 10 requests, the worker processes were restarted after serving 10
requests.
Change it to 1300MB so that it is slightly larger than the virtual size for
a single worker which is 1260MB minimum.
## Code After:
use strict;
use warnings;
use Plack::Builder;
use EnsEMBL::REST;
use Config;
use Plack::Util;
use File::Basename;
use File::Spec;
my $app = EnsEMBL::REST->psgi_app;
builder {
my $dirname = dirname(__FILE__);
my $rootdir = File::Spec->rel2abs(File::Spec->catdir($dirname, File::Spec->updir(), File::Spec->updir()));
my $staticdir = File::Spec->catdir($rootdir, 'root');
Plack::Util::load_class('BSD::Resource') if $Config{osname} eq 'darwin';
enable 'SizeLimit' => (
max_unshared_size_in_kb => (1300 * 1024), # 1300MB per process (memory assigned just to the process)
check_every_n_requests => 10,
log_when_limits_exceeded => 1
);
enable "Plack::Middleware::ReverseProxy";
enable 'StackTrace';
enable 'Runtime';
enable "ContentLength";
enable 'CrossOrigin', origins => '*', headers => '*', methods => ['GET','POST','OPTIONS'];
$app;
}
|
1a5df01638a71b8ecaba2b311f08e2bcad1710b7 | views/identify.erb | views/identify.erb | <%
user_agent = UserAgent.parse(request.user_agent)
$id_vars[:os] = user_agent.os
$id_vars[:platform] = user_agent.platform
$id_vars[:browser] = user_agent.browser
$id_vars[:version] = user_agent.version
$id_vars[:build] = user_agent.build
$id_vars[:localization] = user_agent.localization
id_json =$id_vars.to_json
id_md5 = Digest::MD5.hexdigest(id_json)
$redis.incr(id_md5) unless cookies[:open_panopticlick]
id_match = $redis.get(id_md5)
id_total = $redis.keys.count
cookies[:open_panopticlick] = 'locked'
%>
<table class="table table-bordered table-condensed">
<tr><td>Fingerprint</td><td class="text-center"><%=id_md5%></td></tr>
<tr><td>Uniqueness</td><td class="text-center"><%="#{id_match} / #{id_total}"%></td></tr>
</table>
<small><%=id_json%></small>
| <%
user_agent = UserAgent.parse(request.user_agent)
$id_vars[:os] = user_agent.os
$id_vars[:platform] = user_agent.platform
$id_vars[:browser] = user_agent.browser
$id_vars[:version] = user_agent.version
$id_vars[:build] = user_agent.build
$id_vars[:localization] = user_agent.localization
id_json =$id_vars.to_json
id_md5 = Digest::MD5.hexdigest(id_json)
$redis.incr(id_md5) unless cookies[:open_panopticlick]
id_match = $redis.get(id_md5)
id_total = $redis.keys.count
cookies[:open_panopticlick] = 'locked'
%>
<table class="table table-bordered table-condensed">
<tr><td>Fingerprint</td><td class="text-center"><%=id_md5%></td></tr>
<tr><td>Uniqueness</td><td class="text-center"><%="#{id_match} / #{id_total}"%></td></tr>
</table>
<hr>
<h3>About:</h3>
<p>In 2010 the <a href="https://www.eff.org/">Electronic Frontier Foundation (EFF)</a> announced <a href="https://panopticlick.eff.org/">Panopticlick</a>, a web application designed to identify unique users by fingerprinting their web browsers.</p>
<p>In 2013 after Panopticlick was reposted <sup><small>(Again...)</small></sup> on <a href="https://news.ycombinator.com/">Hacker News</a>, I became curious about how exactly some parts of it worked. On discovering that the EFF have never published any source code for Panopticlick, I decided to create a FOSS (GPLv3) reimplementation.</p>
| Add 'About', remove debug output | Add 'About', remove debug output
| HTML+ERB | agpl-3.0 | p8952/open-panopticlick | html+erb | ## Code Before:
<%
user_agent = UserAgent.parse(request.user_agent)
$id_vars[:os] = user_agent.os
$id_vars[:platform] = user_agent.platform
$id_vars[:browser] = user_agent.browser
$id_vars[:version] = user_agent.version
$id_vars[:build] = user_agent.build
$id_vars[:localization] = user_agent.localization
id_json =$id_vars.to_json
id_md5 = Digest::MD5.hexdigest(id_json)
$redis.incr(id_md5) unless cookies[:open_panopticlick]
id_match = $redis.get(id_md5)
id_total = $redis.keys.count
cookies[:open_panopticlick] = 'locked'
%>
<table class="table table-bordered table-condensed">
<tr><td>Fingerprint</td><td class="text-center"><%=id_md5%></td></tr>
<tr><td>Uniqueness</td><td class="text-center"><%="#{id_match} / #{id_total}"%></td></tr>
</table>
<small><%=id_json%></small>
## Instruction:
Add 'About', remove debug output
## Code After:
<%
user_agent = UserAgent.parse(request.user_agent)
$id_vars[:os] = user_agent.os
$id_vars[:platform] = user_agent.platform
$id_vars[:browser] = user_agent.browser
$id_vars[:version] = user_agent.version
$id_vars[:build] = user_agent.build
$id_vars[:localization] = user_agent.localization
id_json =$id_vars.to_json
id_md5 = Digest::MD5.hexdigest(id_json)
$redis.incr(id_md5) unless cookies[:open_panopticlick]
id_match = $redis.get(id_md5)
id_total = $redis.keys.count
cookies[:open_panopticlick] = 'locked'
%>
<table class="table table-bordered table-condensed">
<tr><td>Fingerprint</td><td class="text-center"><%=id_md5%></td></tr>
<tr><td>Uniqueness</td><td class="text-center"><%="#{id_match} / #{id_total}"%></td></tr>
</table>
<hr>
<h3>About:</h3>
<p>In 2010 the <a href="https://www.eff.org/">Electronic Frontier Foundation (EFF)</a> announced <a href="https://panopticlick.eff.org/">Panopticlick</a>, a web application designed to identify unique users by fingerprinting their web browsers.</p>
<p>In 2013 after Panopticlick was reposted <sup><small>(Again...)</small></sup> on <a href="https://news.ycombinator.com/">Hacker News</a>, I became curious about how exactly some parts of it worked. On discovering that the EFF have never published any source code for Panopticlick, I decided to create a FOSS (GPLv3) reimplementation.</p>
|
e49fd692e1281f91164d216708befd81a1a3c102 | test/simple.js | test/simple.js | var test = require("tape")
var crypto = require('crypto')
var cryptoB = require('../')
var assert = require('assert')
function assertSame (fn) {
test(fn.name, function (t) {
fn(crypto, function (err, expected) {
fn(cryptoB, function (err, actual) {
t.equal(actual, expected)
t.end()
})
})
})
}
assertSame(function sha1 (crypto, cb) {
cb(null, crypto.createHash('sha1').update('hello', 'utf-8').digest('hex'))
})
assertSame(function md5(crypto, cb) {
cb(null, crypto.createHash('md5').update('hello', 'utf-8').digest('hex'))
})
assert.equal(cryptoB.randomBytes(10).length, 10)
test('randomBytes', function (t) {
cryptoB.randomBytes(10, function(ex, bytes) {
assert.ifError(ex)
bytes.forEach(function(bite) {
assert.equal(typeof bite, 'number')
})
t.end()
})
})
| var test = require("tape")
var crypto = require('crypto')
var cryptoB = require('../')
function assertSame (fn) {
test(fn.name, function (t) {
t.plan(1)
fn(crypto, function (err, expected) {
fn(cryptoB, function (err, actual) {
t.equal(actual, expected)
t.end()
})
})
})
}
assertSame(function sha1 (crypto, cb) {
cb(null, crypto.createHash('sha1').update('hello', 'utf-8').digest('hex'))
})
assertSame(function md5(crypto, cb) {
cb(null, crypto.createHash('md5').update('hello', 'utf-8').digest('hex'))
})
test('randomBytes', function (t) {
t.plan(3 + 10)
t.equal(cryptoB.randomBytes(10).length, 10)
cryptoB.randomBytes(10, function(ex, bytes) {
t.error(ex)
t.equal(bytes.length, 10)
bytes.forEach(function(bite) {
t.equal(typeof bite, 'number')
})
t.end()
})
})
| Use tape for asserts to better detect callbacks not being fired | Use tape for asserts to better detect callbacks not being fired | JavaScript | mit | crypto-browserify/crypto-browserify,crypto-browserify/crypto-browserify | javascript | ## Code Before:
var test = require("tape")
var crypto = require('crypto')
var cryptoB = require('../')
var assert = require('assert')
function assertSame (fn) {
test(fn.name, function (t) {
fn(crypto, function (err, expected) {
fn(cryptoB, function (err, actual) {
t.equal(actual, expected)
t.end()
})
})
})
}
assertSame(function sha1 (crypto, cb) {
cb(null, crypto.createHash('sha1').update('hello', 'utf-8').digest('hex'))
})
assertSame(function md5(crypto, cb) {
cb(null, crypto.createHash('md5').update('hello', 'utf-8').digest('hex'))
})
assert.equal(cryptoB.randomBytes(10).length, 10)
test('randomBytes', function (t) {
cryptoB.randomBytes(10, function(ex, bytes) {
assert.ifError(ex)
bytes.forEach(function(bite) {
assert.equal(typeof bite, 'number')
})
t.end()
})
})
## Instruction:
Use tape for asserts to better detect callbacks not being fired
## Code After:
var test = require("tape")
var crypto = require('crypto')
var cryptoB = require('../')
function assertSame (fn) {
test(fn.name, function (t) {
t.plan(1)
fn(crypto, function (err, expected) {
fn(cryptoB, function (err, actual) {
t.equal(actual, expected)
t.end()
})
})
})
}
assertSame(function sha1 (crypto, cb) {
cb(null, crypto.createHash('sha1').update('hello', 'utf-8').digest('hex'))
})
assertSame(function md5(crypto, cb) {
cb(null, crypto.createHash('md5').update('hello', 'utf-8').digest('hex'))
})
test('randomBytes', function (t) {
t.plan(3 + 10)
t.equal(cryptoB.randomBytes(10).length, 10)
cryptoB.randomBytes(10, function(ex, bytes) {
t.error(ex)
t.equal(bytes.length, 10)
bytes.forEach(function(bite) {
t.equal(typeof bite, 'number')
})
t.end()
})
})
|
e2cf64cc8bba62a718e26bdd9c8e60aa8ce7fa98 | README.md | README.md | PairUp app
==========
General notes
-------------
All dependencies have been kept local, so there are minimal assumptions about the machine upon which this application is installed.
The repository has been called PairUp just as a working title, and this may change.
Installation instructions
-------------------------
Create a directory and get the repo:
mkdir PairUp && cd PairUp
git clone git@github.com:Lets-Build-Something/PairUp.git
Install Composer:
curl -sS https://getcomposer.org/installer | php
Get dependencies:
php composer.phar install
| PairUp app
==========
General notes
-------------
All dependencies have been kept local, so there are minimal assumptions about the machine upon which this application is installed.
The repository has been called PairUp just as a working title, and this may change.
Installation instructions
-------------------------
Create a directory and get the repo:
mkdir PairUp && cd PairUp
git clone git@github.com:Lets-Build-Something/PairUp.git
Install Composer:
curl -sS https://getcomposer.org/installer | php
Get dependencies:
php composer.phar install
Set encryption keys inside the app:
php artisan key:generate
In a development version only, to see errors verbosely:
cp .env.example .env
On a spare console, fire up an instance of the app on http:localhost:8000
php artisan serve
Or of course, create an Apache vhost.
| Add a couple of things to get Laravel working | Add a couple of things to get Laravel working
| Markdown | mit | Lets-Build-Something/PairUp,khoparzi/PairUp,Lets-Build-Something/PairUp,khoparzi/PairUp | markdown | ## Code Before:
PairUp app
==========
General notes
-------------
All dependencies have been kept local, so there are minimal assumptions about the machine upon which this application is installed.
The repository has been called PairUp just as a working title, and this may change.
Installation instructions
-------------------------
Create a directory and get the repo:
mkdir PairUp && cd PairUp
git clone git@github.com:Lets-Build-Something/PairUp.git
Install Composer:
curl -sS https://getcomposer.org/installer | php
Get dependencies:
php composer.phar install
## Instruction:
Add a couple of things to get Laravel working
## Code After:
PairUp app
==========
General notes
-------------
All dependencies have been kept local, so there are minimal assumptions about the machine upon which this application is installed.
The repository has been called PairUp just as a working title, and this may change.
Installation instructions
-------------------------
Create a directory and get the repo:
mkdir PairUp && cd PairUp
git clone git@github.com:Lets-Build-Something/PairUp.git
Install Composer:
curl -sS https://getcomposer.org/installer | php
Get dependencies:
php composer.phar install
Set encryption keys inside the app:
php artisan key:generate
In a development version only, to see errors verbosely:
cp .env.example .env
On a spare console, fire up an instance of the app on http:localhost:8000
php artisan serve
Or of course, create an Apache vhost.
|
c9e2c70e05ade220e5aa6a4790ee2a9b720cc46e | sorting_test.py | sorting_test.py | import mergesort.merge_sort
import quicksort.quicksort
import sys
import time
from random import randint
def main(max_len):
for n in [2**(n+1) for n in range(max_len)]:
print 'Array size: %d' % n
arr = [randint(0, 2**max_len) for n in range(n)]
current_time = time.time()
quicksort.quicksort.check(mergesort.merge_sort.sort(arr))
print 'Merge sort: %f' % (time.time() - current_time)
current_time = time.time()
quicksort.quicksort.check(quicksort.quicksort.sort(arr, 0, len(arr)))
print 'Quicksort: %f' % (time.time() - current_time)
print '-----------------'
if __name__ == '__main__':
try:
max_len = int(sys.argv[1])
except (IndexError, ValueError):
print 'Format: python sorting_test.py <log(max input)>'
main(max_len) | import mergesort.merge_sort
import quicksort.quicksort
import sys
import time
from random import randint
def multi_size(max_len):
for n in [2**(n+1) for n in range(max_len)]:
print 'Array size: %d' % n
arr = [randint(0, 2**max_len) for n in range(n)]
current_time = time.time()
quicksort.quicksort.check(mergesort.merge_sort.sort(arr), n+1)
print 'Merge sort: %f' % (time.time() - current_time)
current_time = time.time()
quicksort.quicksort.check(quicksort.quicksort.sort(arr, 0, n+1), n+1)
print 'Quicksort: %f' % (time.time() - current_time)
print '-----------------'
def fixed_time(sec, length):
count = 0
start = time.time()
end = start + sec
while time.time() < end:
arr = [randint(0, length) for n in range(length)]
mergesort.merge_sort.sort(arr)
count += 1
print 'Merge sort: %d %d-element arrays in %d seconds' % (count, length, sec)
count = 0
start = time.time()
end = start + sec
while time.time() < end:
arr = [randint(0, length) for n in range(length)]
quicksort.quicksort.sort(arr, 0, length)
count += 1
print 'Quicksort: %d %d-element arrays in %d seconds' % (count, length, sec)
if __name__ == '__main__':
if len(sys.argv) > 2:
fixed_time(int(sys.argv[1]), int(sys.argv[2]))
else:
multi_size(int(sys.argv[1])) | Allow comparison within a fixed time period | Allow comparison within a fixed time period
To get an idea of average run-time, I wanted to be able to test
mergesort and quicksort with the same inputs many times over;
now by specifying a time limit and array length, the script will
run each algorithm as many times as possible on random arrays
and report how many arrays were sorted within the time period.
| Python | mit | timpel/stanford-algs,timpel/stanford-algs | python | ## Code Before:
import mergesort.merge_sort
import quicksort.quicksort
import sys
import time
from random import randint
def main(max_len):
for n in [2**(n+1) for n in range(max_len)]:
print 'Array size: %d' % n
arr = [randint(0, 2**max_len) for n in range(n)]
current_time = time.time()
quicksort.quicksort.check(mergesort.merge_sort.sort(arr))
print 'Merge sort: %f' % (time.time() - current_time)
current_time = time.time()
quicksort.quicksort.check(quicksort.quicksort.sort(arr, 0, len(arr)))
print 'Quicksort: %f' % (time.time() - current_time)
print '-----------------'
if __name__ == '__main__':
try:
max_len = int(sys.argv[1])
except (IndexError, ValueError):
print 'Format: python sorting_test.py <log(max input)>'
main(max_len)
## Instruction:
Allow comparison within a fixed time period
To get an idea of average run-time, I wanted to be able to test
mergesort and quicksort with the same inputs many times over;
now by specifying a time limit and array length, the script will
run each algorithm as many times as possible on random arrays
and report how many arrays were sorted within the time period.
## Code After:
import mergesort.merge_sort
import quicksort.quicksort
import sys
import time
from random import randint
def multi_size(max_len):
for n in [2**(n+1) for n in range(max_len)]:
print 'Array size: %d' % n
arr = [randint(0, 2**max_len) for n in range(n)]
current_time = time.time()
quicksort.quicksort.check(mergesort.merge_sort.sort(arr), n+1)
print 'Merge sort: %f' % (time.time() - current_time)
current_time = time.time()
quicksort.quicksort.check(quicksort.quicksort.sort(arr, 0, n+1), n+1)
print 'Quicksort: %f' % (time.time() - current_time)
print '-----------------'
def fixed_time(sec, length):
count = 0
start = time.time()
end = start + sec
while time.time() < end:
arr = [randint(0, length) for n in range(length)]
mergesort.merge_sort.sort(arr)
count += 1
print 'Merge sort: %d %d-element arrays in %d seconds' % (count, length, sec)
count = 0
start = time.time()
end = start + sec
while time.time() < end:
arr = [randint(0, length) for n in range(length)]
quicksort.quicksort.sort(arr, 0, length)
count += 1
print 'Quicksort: %d %d-element arrays in %d seconds' % (count, length, sec)
if __name__ == '__main__':
if len(sys.argv) > 2:
fixed_time(int(sys.argv[1]), int(sys.argv[2]))
else:
multi_size(int(sys.argv[1])) |
10fbce820f580d058bdfb1ab89cf01a027da3161 | gitst.c | gitst.c |
int
main(int argc, char **argv)
{
int pid, ret;
int pipes[2];
char gitbuff[GITBUF];
size_t gitlen;
char b;
char *br;
int childst;
if(pipe(pipes) != 0)
{
perror("Error creating pipes");
exit(1);
}
if((pid = fork()) == -1)
{
perror("Error forking git");
exit(1);
}
if(pid == 0) //child
{
if(dup2(pipes[1], STDOUT_FILENO) == -1)
{
perror("Error duplicating stdout");
exit(1);
}
close(STDERR_FILENO);
ret = execlp("git", "git", "status", "-z", "-b", (char*)0);
}
waitpid(pid, &childst, 0);
if(childst != 0) {
exit(2);
}
gitlen = read(pipes[0], gitbuff, GITBUF);
br = &gitbuff[3];
putchar('(');
while(*br != '\0')
{
// Three dots separate the branch from the tracking branch
if(*br == '.' && *(br+1) == '.' && *(br+2) == '.') break;
putchar(*br++);
}
putchar(')');
putchar('\n');
}
|
int
main(int argc, char **argv)
{
int pid, ret;
int pipes[2];
char gitbuff[GITBUF];
size_t gitlen;
char b;
char *br;
int childst;
if(pipe(pipes) != 0)
{
perror("Error creating pipes");
exit(1);
}
if((pid = fork()) == -1)
{
perror("Error forking git");
exit(1);
}
if(pid == 0) //child
{
if(dup2(pipes[1], STDOUT_FILENO) == -1)
{
perror("Error duplicating stdout");
exit(1);
}
close(STDERR_FILENO);
ret = execlp("git", "git", "branch", "--list", (char*)0);
}
waitpid(pid, &childst, 0);
if(childst != 0) {
exit(2);
}
gitlen = read(pipes[0], gitbuff, GITBUF);
br = gitbuff;
putchar('(');
while(*br++ != '*') {}
// skip the '*' and the space after it
br++;
while(*br != '\n')
{
putchar(*br++);
}
putchar(')');
putchar('\n');
}
| Convert to `git branch` for performance | Convert to `git branch` for performance
- Using `git status` is eificient to get only the current branch but in
a repo with a large number of untracked files it can stall for a long
time getting the status of a repository. Git Branch is much better in
these cases.
| C | bsd-3-clause | wnh/prompt_utils | c | ## Code Before:
int
main(int argc, char **argv)
{
int pid, ret;
int pipes[2];
char gitbuff[GITBUF];
size_t gitlen;
char b;
char *br;
int childst;
if(pipe(pipes) != 0)
{
perror("Error creating pipes");
exit(1);
}
if((pid = fork()) == -1)
{
perror("Error forking git");
exit(1);
}
if(pid == 0) //child
{
if(dup2(pipes[1], STDOUT_FILENO) == -1)
{
perror("Error duplicating stdout");
exit(1);
}
close(STDERR_FILENO);
ret = execlp("git", "git", "status", "-z", "-b", (char*)0);
}
waitpid(pid, &childst, 0);
if(childst != 0) {
exit(2);
}
gitlen = read(pipes[0], gitbuff, GITBUF);
br = &gitbuff[3];
putchar('(');
while(*br != '\0')
{
// Three dots separate the branch from the tracking branch
if(*br == '.' && *(br+1) == '.' && *(br+2) == '.') break;
putchar(*br++);
}
putchar(')');
putchar('\n');
}
## Instruction:
Convert to `git branch` for performance
- Using `git status` is efficient to get only the current branch but in
a repo with a large number of untracked files it can stall for a long
time getting the status of a repository. Git Branch is much better in
these cases.
## Code After:
int
main(int argc, char **argv)
{
int pid, ret;
int pipes[2];
char gitbuff[GITBUF];
size_t gitlen;
char b;
char *br;
int childst;
if(pipe(pipes) != 0)
{
perror("Error creating pipes");
exit(1);
}
if((pid = fork()) == -1)
{
perror("Error forking git");
exit(1);
}
if(pid == 0) //child
{
if(dup2(pipes[1], STDOUT_FILENO) == -1)
{
perror("Error duplicating stdout");
exit(1);
}
close(STDERR_FILENO);
ret = execlp("git", "git", "branch", "--list", (char*)0);
}
waitpid(pid, &childst, 0);
if(childst != 0) {
exit(2);
}
gitlen = read(pipes[0], gitbuff, GITBUF);
br = gitbuff;
putchar('(');
while(*br++ != '*') {}
// skip the '*' and the space after it
br++;
while(*br != '\n')
{
putchar(*br++);
}
putchar(')');
putchar('\n');
}
|
75611b5d82438d20f6a959778712bfa335c1ed0e | source/css/sass/_layouts.sass | source/css/sass/_layouts.sass | // Master layouts file - import all needed layout styles here
@import layouts/navbar
@import layouts/header
@import layouts/footer
| // Master layouts file - import all needed layout styles here
@import layouts/navbar
@import layouts/header
@import layouts/footer
@import layouts/l-container
@import layouts/l-content
| Add l-content and l-container to layouts | Add l-content and l-container to layouts
| Sass | mit | pcvonz/pattern_lab_playground,pcvonz/pattern_lab_playground,pcvonz/pattern_lab_playground | sass | ## Code Before:
// Master layouts file - import all needed layout styles here
@import layouts/navbar
@import layouts/header
@import layouts/footer
## Instruction:
Add l-content and l-container to layouts
## Code After:
// Master layouts file - import all needed layout styles here
@import layouts/navbar
@import layouts/header
@import layouts/footer
@import layouts/l-container
@import layouts/l-content
|
ea85a1586617816974232ca5374919e37b2b490c | .github/mergify.yml | .github/mergify.yml | pull_request_rules:
- name: Integration with @gradle-bot
conditions:
- check-success=Bot Says OK
- check-success=Quick Feedback - Linux Only (Trigger) (Check)
actions:
merge:
method: merge
strict_method: rebase
strict: true
- name: Trigger mergify on TestAndMerge command
conditions:
- check-success=Bot Triggers Mergify
actions:
merge:
method: merge
strict_method: rebase
strict: true
| pull_request_rules:
- name: Integration with @gradle-bot
conditions:
- check-success=Bot Says OK
- check-success=Quick Feedback - Linux Only (Trigger) (Check)
actions:
merge:
method: merge
strict: true
- name: Trigger mergify on TestAndMerge command
conditions:
- check-success=Bot Triggers Mergify
actions:
merge:
method: merge
strict: true
| Use merge method for now | Use merge method for now
See https://github.com/Mergifyio/mergify-engine/issues/1529
Mergify engine may have issues with "rebase" strict_method,
so we fallback to "merge" for now.
| YAML | apache-2.0 | gradle/gradle,blindpirate/gradle,gradle/gradle,blindpirate/gradle,blindpirate/gradle,gradle/gradle,blindpirate/gradle,blindpirate/gradle,gradle/gradle,gradle/gradle,gradle/gradle,gradle/gradle,gradle/gradle,blindpirate/gradle,gradle/gradle,blindpirate/gradle,blindpirate/gradle,blindpirate/gradle,blindpirate/gradle,gradle/gradle | yaml | ## Code Before:
pull_request_rules:
- name: Integration with @gradle-bot
conditions:
- check-success=Bot Says OK
- check-success=Quick Feedback - Linux Only (Trigger) (Check)
actions:
merge:
method: merge
strict_method: rebase
strict: true
- name: Trigger mergify on TestAndMerge command
conditions:
- check-success=Bot Triggers Mergify
actions:
merge:
method: merge
strict_method: rebase
strict: true
## Instruction:
Use merge method for now
See https://github.com/Mergifyio/mergify-engine/issues/1529
Mergify engine may have issues with "rebase" strict_method,
so we fallback to "merge" for now.
## Code After:
pull_request_rules:
- name: Integration with @gradle-bot
conditions:
- check-success=Bot Says OK
- check-success=Quick Feedback - Linux Only (Trigger) (Check)
actions:
merge:
method: merge
strict: true
- name: Trigger mergify on TestAndMerge command
conditions:
- check-success=Bot Triggers Mergify
actions:
merge:
method: merge
strict: true
|
24006c946b8a6634a4116422acf912a820c7f238 | src/middlewares/logger.ts | src/middlewares/logger.ts | import { Store, Dispatch, Action } from "redux";
const crashReporter = (store: Store) => (next: Dispatch) => (
action: Action
) => {
const { getState } = store;
const beforeActionState = getState();
const result = next(action);
const afterActionState = getState();
console.info(
"action",
action,
"beforeState",
beforeActionState,
"afterState",
afterActionState
);
return result;
};
export default crashReporter;
| import { Store, Dispatch, Action } from "redux";
const logger = (store: Store) => (next: Dispatch) => (action: Action) => {
const { getState } = store;
const beforeActionState = getState();
const result = next(action);
const afterActionState = getState();
console.info(
"action",
action,
"beforeState",
beforeActionState,
"afterState",
afterActionState
);
return result;
};
export default logger;
| Change name to match middlware behavior | Change name to match middlware behavior
| TypeScript | bsd-3-clause | wakatime/desktop,wakatime/wakatime-desktop,wakatime/wakatime-desktop,wakatime/desktop,wakatime/desktop | typescript | ## Code Before:
import { Store, Dispatch, Action } from "redux";
const crashReporter = (store: Store) => (next: Dispatch) => (
action: Action
) => {
const { getState } = store;
const beforeActionState = getState();
const result = next(action);
const afterActionState = getState();
console.info(
"action",
action,
"beforeState",
beforeActionState,
"afterState",
afterActionState
);
return result;
};
export default crashReporter;
## Instruction:
Change name to match middlware behavior
## Code After:
import { Store, Dispatch, Action } from "redux";
const logger = (store: Store) => (next: Dispatch) => (action: Action) => {
const { getState } = store;
const beforeActionState = getState();
const result = next(action);
const afterActionState = getState();
console.info(
"action",
action,
"beforeState",
beforeActionState,
"afterState",
afterActionState
);
return result;
};
export default logger;
|
7a206bc1e76afea90139a5a2683f00cbc4e2a5d1 | src/turbolinks/error_renderer.coffee | src/turbolinks/error_renderer.coffee |
class Turbolinks.ErrorRenderer extends Turbolinks.Renderer
constructor: (@html) ->
render: (callback) ->
@renderView =>
@replaceDocumentHTML()
@activateBodyScriptElements()
callback()
replaceDocumentHTML: ->
document.documentElement.innerHTML = @html
activateBodyScriptElements: ->
for replaceableElement in @getScriptElements()
element = @createScriptElement(replaceableElement)
replaceableElement.parentNode.replaceChild(element, replaceableElement)
getScriptElements: ->
document.documentElement.querySelectorAll("script")
|
class Turbolinks.ErrorRenderer extends Turbolinks.Renderer
constructor: (html) ->
htmlElement = document.createElement("html")
htmlElement.innerHTML = html
@newHead = htmlElement.querySelector("head")
@newBody = htmlElement.querySelector("body")
render: (callback) ->
@renderView =>
@replaceHeadAndBody()
@activateBodyScriptElements()
callback()
replaceHeadAndBody: ->
{head, body} = document
head.parentNode.replaceChild(@newHead, head)
body.parentNode.replaceChild(@newBody, body)
activateBodyScriptElements: ->
for replaceableElement in @getScriptElements()
element = @createScriptElement(replaceableElement)
replaceableElement.parentNode.replaceChild(element, replaceableElement)
getScriptElements: ->
document.documentElement.querySelectorAll("script")
| Set newBody property when rendering errors | Set newBody property when rendering errors
It's expected by the Turbolinks.Renderer super class
| CoffeeScript | mit | turbolinks/turbolinks,turbolinks/turbolinks,turbolinks/turbolinks | coffeescript | ## Code Before:
class Turbolinks.ErrorRenderer extends Turbolinks.Renderer
constructor: (@html) ->
render: (callback) ->
@renderView =>
@replaceDocumentHTML()
@activateBodyScriptElements()
callback()
replaceDocumentHTML: ->
document.documentElement.innerHTML = @html
activateBodyScriptElements: ->
for replaceableElement in @getScriptElements()
element = @createScriptElement(replaceableElement)
replaceableElement.parentNode.replaceChild(element, replaceableElement)
getScriptElements: ->
document.documentElement.querySelectorAll("script")
## Instruction:
Set newBody property when rendering errors
It's expected by the Turbolinks.Renderer super class
## Code After:
class Turbolinks.ErrorRenderer extends Turbolinks.Renderer
constructor: (html) ->
htmlElement = document.createElement("html")
htmlElement.innerHTML = html
@newHead = htmlElement.querySelector("head")
@newBody = htmlElement.querySelector("body")
render: (callback) ->
@renderView =>
@replaceHeadAndBody()
@activateBodyScriptElements()
callback()
replaceHeadAndBody: ->
{head, body} = document
head.parentNode.replaceChild(@newHead, head)
body.parentNode.replaceChild(@newBody, body)
activateBodyScriptElements: ->
for replaceableElement in @getScriptElements()
element = @createScriptElement(replaceableElement)
replaceableElement.parentNode.replaceChild(element, replaceableElement)
getScriptElements: ->
document.documentElement.querySelectorAll("script")
|
65cb681a26530e038c7f45ac26dbd2009d6c0e3b | components/media-bar/media-bar.html | components/media-bar/media-bar.html | <app-scroll-scroller
overlay
horizontal
class="section fill-darker media-bar"
:class="{
'-is-loading': !mediaItems || !mediaItems.length,
}"
>
<div class="-items">
<div class="-loading-container" v-if="!mediaItems || !mediaItems.length">
<app-loading centered no-color stationary hide-label />
</div>
<template v-if="mediaItems && mediaItems.length">
<app-media-bar-item
v-for="item of mediaItems"
:key="item.id"
:item="item"
/>
</template>
</div>
</app-scroll-scroller>
| <app-scroll-scroller overlay horizontal>
<div
class="section fill-darker media-bar"
:class="{
'-is-loading': !mediaItems || !mediaItems.length,
}"
>
<div class="-items">
<div class="-loading-container" v-if="!mediaItems || !mediaItems.length">
<app-loading centered no-color stationary hide-label />
</div>
<template v-if="mediaItems && mediaItems.length">
<app-media-bar-item
v-for="item of mediaItems"
:key="item.id"
:item="item"
/>
</template>
</div>
</div>
</app-scroll-scroller>
| Fix media bar with overlay scroll on chrome | Fix media bar with overlay scroll on chrome
| HTML | mit | gamejolt/frontend-lib,gamejolt/frontend-lib,gamejolt/frontend-lib,gamejolt/frontend-lib | html | ## Code Before:
<app-scroll-scroller
overlay
horizontal
class="section fill-darker media-bar"
:class="{
'-is-loading': !mediaItems || !mediaItems.length,
}"
>
<div class="-items">
<div class="-loading-container" v-if="!mediaItems || !mediaItems.length">
<app-loading centered no-color stationary hide-label />
</div>
<template v-if="mediaItems && mediaItems.length">
<app-media-bar-item
v-for="item of mediaItems"
:key="item.id"
:item="item"
/>
</template>
</div>
</app-scroll-scroller>
## Instruction:
Fix media bar with overlay scroll on chrome
## Code After:
<app-scroll-scroller overlay horizontal>
<div
class="section fill-darker media-bar"
:class="{
'-is-loading': !mediaItems || !mediaItems.length,
}"
>
<div class="-items">
<div class="-loading-container" v-if="!mediaItems || !mediaItems.length">
<app-loading centered no-color stationary hide-label />
</div>
<template v-if="mediaItems && mediaItems.length">
<app-media-bar-item
v-for="item of mediaItems"
:key="item.id"
:item="item"
/>
</template>
</div>
</div>
</app-scroll-scroller>
|
1aa65a9da9c73cc212244e9fee7e9e18699c5972 | app/views/partials/foyer/ressources/montants.html | app/views/partials/foyer/ressources/montants.html | <div class="frame-foyer">
<h1>{{ pageTitle }}</h1>
<p>
Indiquez toutes vos ressources <strong>nettes</strong> perçues en France comme à l'étranger.
</p>
<form method="post" name="form" novalidate ng-submit="submit(form)">
<div class="ressources-individu">
<div class="clearfix"></div>
<div class="content">
<div ng-if="ressources.length">
<div ng-include="'partials/foyer/ressources/detail-montants.html'"></div>
</div>
</div>
<hr>
</div>
<div class="text-right">
<a title="Déclarer d'autres ressources" ui-sref="foyer.ressources.individu.types" class="btn btn-success">
Déclarer d'autres ressources
</a>
<button type="submit" class="btn btn-primary">
Valider <i class="fa fa-arrow-circle-right" aria-hidden="true"></i>
</button>
</div>
</form>
</div>
| <div class="frame-foyer">
<h1>{{ pageTitle }}</h1>
<p>
Indiquez toutes vos ressources <strong>nettes</strong> perçues en France comme à l'étranger.
</p>
<form method="post" name="form" novalidate ng-submit="submit(form)">
<div class="ressources-individu">
<div class="clearfix"></div>
<div class="content">
<div ng-if="ressources.length">
<div ng-include="'partials/foyer/ressources/detail-montants.html'"></div>
</div>
</div>
<hr>
</div>
<div class="pull-left">
<a title="Déclarer d'autres ressources" ui-sref="foyer.ressources.individu.types" class="btn btn-default">
<i class="fa fa-sm fa-arrow-left" aria-hidden="true"></i>
Déclarer d'autres ressources
</a>
</div>
<div class="text-right">
<button type="submit" class="btn btn-primary">
Valider <i class="fa fa-arrow-circle-right" aria-hidden="true"></i>
</button>
</div>
</form>
</div>
| Move and make ressources back button more discrete | Move and make ressources back button more discrete
| HTML | agpl-3.0 | sgmap/mes-aides-ui,sgmap/mes-aides-ui,sgmap/mes-aides-ui,sgmap/mes-aides-ui | html | ## Code Before:
<div class="frame-foyer">
<h1>{{ pageTitle }}</h1>
<p>
Indiquez toutes vos ressources <strong>nettes</strong> perçues en France comme à l'étranger.
</p>
<form method="post" name="form" novalidate ng-submit="submit(form)">
<div class="ressources-individu">
<div class="clearfix"></div>
<div class="content">
<div ng-if="ressources.length">
<div ng-include="'partials/foyer/ressources/detail-montants.html'"></div>
</div>
</div>
<hr>
</div>
<div class="text-right">
<a title="Déclarer d'autres ressources" ui-sref="foyer.ressources.individu.types" class="btn btn-success">
Déclarer d'autres ressources
</a>
<button type="submit" class="btn btn-primary">
Valider <i class="fa fa-arrow-circle-right" aria-hidden="true"></i>
</button>
</div>
</form>
</div>
## Instruction:
Move and make ressources back button more discrete
## Code After:
<div class="frame-foyer">
<h1>{{ pageTitle }}</h1>
<p>
Indiquez toutes vos ressources <strong>nettes</strong> perçues en France comme à l'étranger.
</p>
<form method="post" name="form" novalidate ng-submit="submit(form)">
<div class="ressources-individu">
<div class="clearfix"></div>
<div class="content">
<div ng-if="ressources.length">
<div ng-include="'partials/foyer/ressources/detail-montants.html'"></div>
</div>
</div>
<hr>
</div>
<div class="pull-left">
<a title="Déclarer d'autres ressources" ui-sref="foyer.ressources.individu.types" class="btn btn-default">
<i class="fa fa-sm fa-arrow-left" aria-hidden="true"></i>
Déclarer d'autres ressources
</a>
</div>
<div class="text-right">
<button type="submit" class="btn btn-primary">
Valider <i class="fa fa-arrow-circle-right" aria-hidden="true"></i>
</button>
</div>
</form>
</div>
|
4ebac589cfcddcfcef4b4e5160a07c5f67d999f6 | .github/ISSUE_TEMPLATE/feature_request.md | .github/ISSUE_TEMPLATE/feature_request.md | ---
name: Feature request
about: Suggest an idea for this project
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| ---
name: Feature request
about: Suggest an idea for this project
---
**Current Problem**
Is your feature request related to a problem? Please provide a clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Proposed Solution**
A clear and concise description of what you want to happen.
**Alternatives Considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context here.
| Update feature request issue template | Update feature request issue template | Markdown | apache-2.0 | dwavesystems/dimod,dwavesystems/dimod | markdown | ## Code Before:
---
name: Feature request
about: Suggest an idea for this project
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
## Instruction:
Update feature request issue template
## Code After:
---
name: Feature request
about: Suggest an idea for this project
---
**Current Problem**
Is your feature request related to a problem? Please provide a clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Proposed Solution**
A clear and concise description of what you want to happen.
**Alternatives Considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context here.
|
ef62582ea460420ffb877a53682a2fb25df3fdd8 | CHANGELOG.md | CHANGELOG.md | 3.2.0
=====
* Added `gluggi_dummy()` for dummy content. Currently supported is only `gluggi_dummy("content")`.
* Allow Symfony 5.
| 3.2.0
=====
* Implement new layout.
* Added `gluggi_dummy()` for dummy content. Currently supported is only `gluggi_dummy("content")`.
* Allow Symfony 5.
| Add changelog entry about new layout | Add changelog entry about new layout
| Markdown | bsd-3-clause | Becklyn/GluggiBundle,Becklyn/GluggiBundle,Becklyn/GluggiBundle | markdown | ## Code Before:
3.2.0
=====
* Added `gluggi_dummy()` for dummy content. Currently supported is only `gluggi_dummy("content")`.
* Allow Symfony 5.
## Instruction:
Add changelog entry about new layout
## Code After:
3.2.0
=====
* Implement new layout.
* Added `gluggi_dummy()` for dummy content. Currently supported is only `gluggi_dummy("content")`.
* Allow Symfony 5.
|
11ac7a9fdecf5bd9082eb358bc4c2e2e98821ed6 | nubis/builder/artifacts/AMIs.json | nubis/builder/artifacts/AMIs.json | {
"builds": [
{
"name": "amazon-ebs-ubuntu",
"builder_type": "amazon-ebs",
"build_time": 1491922866,
"files": null,
"artifact_id": "ap-northeast-1:ami-ef7f5d88,ap-northeast-2:ami-47c61429,ap-southeast-1:ami-7448f617,ap-southeast-2:ami-59808e3a,eu-central-1:ami-7f8a5810,eu-west-1:ami-9d2e14fb,sa-east-1:ami-df4725b3,us-east-1:ami-c0941cd6,us-west-1:ami-da2500ba,us-west-2:ami-860595e6",
"packer_run_uuid": "13d967ea-2ee5-b52f-7561-6634dfe6a1e0"
}
],
"last_run_uuid": "13d967ea-2ee5-b52f-7561-6634dfe6a1e0"
} | {
"builds": [
{
"name": "amazon-ebs-ubuntu",
"builder_type": "amazon-ebs",
"build_time": 1491928656,
"files": null,
"artifact_id": "ap-northeast-1:ami-db4a68bc,ap-northeast-2:ami-85c113eb,ap-southeast-1:ami-3642fc55,ap-southeast-2:ami-bd9b95de,eu-central-1:ami-3a845655,eu-west-1:ami-9e201af8,sa-east-1:ami-6347250f,us-east-1:ami-00a42c16,us-west-1:ami-0c21046c,us-west-2:ami-fc07979c",
"packer_run_uuid": "6fa1cc48-7cc0-4352-1b74-b268bd8fa068"
}
],
"last_run_uuid": "6fa1cc48-7cc0-4352-1b74-b268bd8fa068"
} | Update builder artifacts for v1.5.0-dev release | Update builder artifacts for v1.5.0-dev release [skip ci]
| JSON | mpl-2.0 | gozer/nubis-dpaste,Nubisproject/nubis-dpaste,limed/nubis-dpaste,gozer/nubis-dpaste,tinnightcap/nubis-dpaste,gozer/nubis-dpaste,Nubisproject/nubis-dpaste,tinnightcap/nubis-dpaste,limed/nubis-dpaste | json | ## Code Before:
{
"builds": [
{
"name": "amazon-ebs-ubuntu",
"builder_type": "amazon-ebs",
"build_time": 1491922866,
"files": null,
"artifact_id": "ap-northeast-1:ami-ef7f5d88,ap-northeast-2:ami-47c61429,ap-southeast-1:ami-7448f617,ap-southeast-2:ami-59808e3a,eu-central-1:ami-7f8a5810,eu-west-1:ami-9d2e14fb,sa-east-1:ami-df4725b3,us-east-1:ami-c0941cd6,us-west-1:ami-da2500ba,us-west-2:ami-860595e6",
"packer_run_uuid": "13d967ea-2ee5-b52f-7561-6634dfe6a1e0"
}
],
"last_run_uuid": "13d967ea-2ee5-b52f-7561-6634dfe6a1e0"
}
## Instruction:
Update builder artifacts for v1.5.0-dev release [skip ci]
## Code After:
{
"builds": [
{
"name": "amazon-ebs-ubuntu",
"builder_type": "amazon-ebs",
"build_time": 1491928656,
"files": null,
"artifact_id": "ap-northeast-1:ami-db4a68bc,ap-northeast-2:ami-85c113eb,ap-southeast-1:ami-3642fc55,ap-southeast-2:ami-bd9b95de,eu-central-1:ami-3a845655,eu-west-1:ami-9e201af8,sa-east-1:ami-6347250f,us-east-1:ami-00a42c16,us-west-1:ami-0c21046c,us-west-2:ami-fc07979c",
"packer_run_uuid": "6fa1cc48-7cc0-4352-1b74-b268bd8fa068"
}
],
"last_run_uuid": "6fa1cc48-7cc0-4352-1b74-b268bd8fa068"
} |
89e9b3bae2060f47b92a6ff1e3613a8f7d4a602a | lib/console-vmc-plugin/plugin.rb | lib/console-vmc-plugin/plugin.rb | require "vmc/plugin"
require "console-vmc-plugin/console"
module VMCConsole
class Console < VMC::CLI
desc "Open a console connected to your app"
group :apps, :manage
input :app, :argument => :required, :from_given => by_name("app"),
:desc => "App to connect to"
input :port, :default => 10000
def console
app = input[:app]
console = CFConsole.new(client, app)
port = console.pick_port!(input[:port])
with_progress("Opening console on port #{c(port, :name)}") do
console.open!
console.wait_for_start
end
console.start_console
end
filter(:start, :start_app) do |app|
if app.framework.name == "rails3"
app.console = true
end
app
end
end
end
| require "vmc/plugin"
require "console-vmc-plugin/console"
module VMCConsole
class Console < VMC::CLI
desc "Open a console connected to your app"
group :apps, :manage
input :app, :argument => :required, :from_given => by_name("app"),
:desc => "App to connect to"
input :port, :default => 10000
def console
app = input[:app]
console = CFConsole.new(client, app)
port = console.pick_port!(input[:port])
with_progress("Opening console on port #{c(port, :name)}") do
console.open!
console.wait_for_start
end
console.start_console
end
filter(:start, :start_app) do |app|
if app.framework.name == "rails3" || app.framework.name == "buildpack"
app.console = true
end
app
end
end
end
| Provision console port for buildpack framework | Provision console port for buildpack framework
Change-Id: I7572a97f328cfe60791ac89dd41650774a2afdfd
| Ruby | apache-2.0 | cloudfoundry-attic/console-vmc-plugin,cloudfoundry-attic/console-vmc-plugin | ruby | ## Code Before:
require "vmc/plugin"
require "console-vmc-plugin/console"
module VMCConsole
class Console < VMC::CLI
desc "Open a console connected to your app"
group :apps, :manage
input :app, :argument => :required, :from_given => by_name("app"),
:desc => "App to connect to"
input :port, :default => 10000
def console
app = input[:app]
console = CFConsole.new(client, app)
port = console.pick_port!(input[:port])
with_progress("Opening console on port #{c(port, :name)}") do
console.open!
console.wait_for_start
end
console.start_console
end
filter(:start, :start_app) do |app|
if app.framework.name == "rails3"
app.console = true
end
app
end
end
end
## Instruction:
Provision console port for buildpack framework
Change-Id: I7572a97f328cfe60791ac89dd41650774a2afdfd
## Code After:
require "vmc/plugin"
require "console-vmc-plugin/console"
module VMCConsole
class Console < VMC::CLI
desc "Open a console connected to your app"
group :apps, :manage
input :app, :argument => :required, :from_given => by_name("app"),
:desc => "App to connect to"
input :port, :default => 10000
def console
app = input[:app]
console = CFConsole.new(client, app)
port = console.pick_port!(input[:port])
with_progress("Opening console on port #{c(port, :name)}") do
console.open!
console.wait_for_start
end
console.start_console
end
filter(:start, :start_app) do |app|
if app.framework.name == "rails3" || app.framework.name == "buildpack"
app.console = true
end
app
end
end
end
|
36c922d6e2b662eee8c2dc10cb53855a80c6db14 | lib/serum/util.ex | lib/serum/util.ex | defmodule Serum.Util do
@moduledoc """
This module provides some frequently used shortcut functions. These functions
are inlined by the compiler.
"""
@doc "Prints a warning message to stderr."
@spec warn(binary) :: Macro.t()
defmacro warn(str) do
quote do
IO.puts(:stderr, "\x1b[33m * #{unquote(str)}\x1b[0m")
end
end
@doc "Displays which file is generated."
@spec msg_gen(binary) :: Macro.t()
defmacro msg_gen(dest) do
quote do
IO.puts("\x1b[92m GEN \x1b[0m#{unquote(dest)}")
end
end
@doc "Displays which file is generated."
@spec msg_gen(binary, binary) :: Macro.t()
defmacro msg_gen(src, dest) do
quote do
IO.puts("\x1b[92m GEN \x1b[0m#{unquote(src)} -> #{unquote(dest)}")
end
end
@doc "Displays which directory is created."
@spec msg_mkdir(binary) :: Macro.t()
defmacro msg_mkdir(dir) do
quote do
IO.puts("\x1b[96m MKDIR \x1b[0m#{unquote(dir)}")
end
end
end
| defmodule Serum.Util do
@moduledoc """
This module provides some frequently used shortcut functions. These functions
are inlined by the compiler.
"""
@doc "Prints a warning message to stderr."
@spec warn(binary) :: Macro.t()
defmacro warn(str) do
quote do
IO.puts(:stderr, "\x1b[33m * #{unquote(str)}\x1b[0m")
end
end
@doc "Displays which directory is created."
@spec msg_mkdir(binary) :: Macro.t()
defmacro msg_mkdir(dir) do
quote do
IO.puts("\x1b[96m MKDIR \x1b[0m#{unquote(dir)}")
end
end
end
| Remove unused macros from Serum.Util | Remove unused macros from Serum.Util
| Elixir | mit | Dalgona/Serum | elixir | ## Code Before:
defmodule Serum.Util do
@moduledoc """
This module provides some frequently used shortcut functions. These functions
are inlined by the compiler.
"""
@doc "Prints a warning message to stderr."
@spec warn(binary) :: Macro.t()
defmacro warn(str) do
quote do
IO.puts(:stderr, "\x1b[33m * #{unquote(str)}\x1b[0m")
end
end
@doc "Displays which file is generated."
@spec msg_gen(binary) :: Macro.t()
defmacro msg_gen(dest) do
quote do
IO.puts("\x1b[92m GEN \x1b[0m#{unquote(dest)}")
end
end
@doc "Displays which file is generated."
@spec msg_gen(binary, binary) :: Macro.t()
defmacro msg_gen(src, dest) do
quote do
IO.puts("\x1b[92m GEN \x1b[0m#{unquote(src)} -> #{unquote(dest)}")
end
end
@doc "Displays which directory is created."
@spec msg_mkdir(binary) :: Macro.t()
defmacro msg_mkdir(dir) do
quote do
IO.puts("\x1b[96m MKDIR \x1b[0m#{unquote(dir)}")
end
end
end
## Instruction:
Remove unused macros from Serum.Util
## Code After:
defmodule Serum.Util do
@moduledoc """
This module provides some frequently used shortcut functions. These functions
are inlined by the compiler.
"""
@doc "Prints a warning message to stderr."
@spec warn(binary) :: Macro.t()
defmacro warn(str) do
quote do
IO.puts(:stderr, "\x1b[33m * #{unquote(str)}\x1b[0m")
end
end
@doc "Displays which directory is created."
@spec msg_mkdir(binary) :: Macro.t()
defmacro msg_mkdir(dir) do
quote do
IO.puts("\x1b[96m MKDIR \x1b[0m#{unquote(dir)}")
end
end
end
|
cee8c5cd8b3502f677e05f11641c90436862a08f | test/index.js | test/index.js | require('dotenv/config');
const request = require('then-request');
let server = null;
let url = null;
switch (process.argv[2]) {
case 'local':
url = 'http://localhost:3000';
server = require('../server');
process.env.PIPERMAIL_SOURCE = 'https://mail.mozilla.org/pipermail/es-discuss/';
break;
case 'staging':
url = 'https://esdiscuss-bot-staging.herokuapp.com';
break;
case 'prod':
url = 'https://bot.esdiscuss.org';
break;
default:
console.error('Unrecognised environment ' + process.argv[2]);
process.exit(1);
break;
}
const timeout = Date.now() + 10 * 60 * 1000;
function poll() {
request('GET', url).done(res => {
if (res.statusCode === 503 && Date.now() < timeout) {
return poll();
}
if (res.statusCode !== 200) {
console.log(res.body.toString('utf8'));
process.exit(1);
} else if (server) {
process.exit(0);
}
});
}
poll(); | require('dotenv/config');
const request = require('then-request');
let server = null;
let url = null;
switch (process.argv[2]) {
case 'local':
url = 'http://localhost:3000';
server = require('../server');
process.env.PIPERMAIL_SOURCE = 'https://mail.mozilla.org/pipermail/es-discuss/';
break;
case 'staging':
url = 'https://esdiscuss-bot-staging.herokuapp.com';
break;
case 'prod':
url = 'https://bot.esdiscuss.org';
break;
default:
console.error('Unrecognised environment ' + process.argv[2]);
process.exit(1);
break;
}
const slow = Date.now() + 60 * 1000;
const timeout = Date.now() + 5 * 60 * 1000;
function poll() {
request('GET', url).done(res => {
if (res.statusCode === 503 && Date.now() < timeout) {
if (Date.now() > slow) {
console.log(res.body.toString('utf8'));
}
return poll();
}
if (res.statusCode !== 200) {
console.log(res.body.toString('utf8'));
process.exit(1);
} else if (server) {
process.exit(0);
}
});
}
poll();
| Reduce test timeout to 5 minutes | Reduce test timeout to 5 minutes | JavaScript | mit | esdiscuss/bot,esdiscuss/bot | javascript | ## Code Before:
require('dotenv/config');
const request = require('then-request');
let server = null;
let url = null;
switch (process.argv[2]) {
case 'local':
url = 'http://localhost:3000';
server = require('../server');
process.env.PIPERMAIL_SOURCE = 'https://mail.mozilla.org/pipermail/es-discuss/';
break;
case 'staging':
url = 'https://esdiscuss-bot-staging.herokuapp.com';
break;
case 'prod':
url = 'https://bot.esdiscuss.org';
break;
default:
console.error('Unrecognised environment ' + process.argv[2]);
process.exit(1);
break;
}
const timeout = Date.now() + 10 * 60 * 1000;
function poll() {
request('GET', url).done(res => {
if (res.statusCode === 503 && Date.now() < timeout) {
return poll();
}
if (res.statusCode !== 200) {
console.log(res.body.toString('utf8'));
process.exit(1);
} else if (server) {
process.exit(0);
}
});
}
poll();
## Instruction:
Reduce test timeout to 5 minutes
## Code After:
require('dotenv/config');
const request = require('then-request');
let server = null;
let url = null;
switch (process.argv[2]) {
case 'local':
url = 'http://localhost:3000';
server = require('../server');
process.env.PIPERMAIL_SOURCE = 'https://mail.mozilla.org/pipermail/es-discuss/';
break;
case 'staging':
url = 'https://esdiscuss-bot-staging.herokuapp.com';
break;
case 'prod':
url = 'https://bot.esdiscuss.org';
break;
default:
console.error('Unrecognised environment ' + process.argv[2]);
process.exit(1);
break;
}
const slow = Date.now() + 60 * 1000;
const timeout = Date.now() + 5 * 60 * 1000;
function poll() {
request('GET', url).done(res => {
if (res.statusCode === 503 && Date.now() < timeout) {
if (Date.now() > slow) {
console.log(res.body.toString('utf8'));
}
return poll();
}
if (res.statusCode !== 200) {
console.log(res.body.toString('utf8'));
process.exit(1);
} else if (server) {
process.exit(0);
}
});
}
poll();
|
9dfe4a4ad31dd8f13c8c54b172652a8af8983975 | .github/workflows/docker-release.yml | .github/workflows/docker-release.yml | name: Docker build and push
on:
push:
branches:
- master
jobs:
Publish-to-docker:
runs-on: ubuntu-latest
env:
DOCKER_TAG: latest-multi
steps:
-
name: Checkout
uses: actions/checkout@v1
-
name: Set up Docker Buildx
id: buildx
uses: crazy-max/ghaction-docker-buildx@v1
with:
version: latest
-
name: List available platforms
run: echo ${{ steps.buildx.outputs.platforms }}
-
name: Docker login (set DOCKER_USERNAME and DOCKER_PASSWORD in secrets)
run:
docker login -u ${{ secrets.DOCKER_USERNAME }} -p ${{ secrets.DOCKER_PASSWORD }}
-
name: Publish to docker as voucher/vouch-proxy
if: ${{ success() && startsWith(github.repository, 'vouch/')}}
run: |
docker buildx build \
--platform linux/arm/v7,linux/arm64 \
--push \
-f ./.build/Dockerfile \
-t voucher/vouch-proxy:$DOCKER_TAG \
.
-
name: Publish to docker as github_user/github_repo
if: ${{ success() && !startsWith(github.repository, 'vouch/')}}
run: |
docker buildx build \
--platform linux/amd64,linux/arm/v7,linux/arm64 \
--push \
-t $GITHUB_REPOSITORY:latest \
. | name: Docker build and push
on:
push:
branches:
- master
jobs:
Publish-to-docker:
runs-on: ubuntu-latest
env:
DOCKER_TAG: latest-multi
steps:
-
name: Checkout
uses: actions/checkout@v2
-
name: Set up Docker Buildx
id: buildx
uses: crazy-max/ghaction-docker-buildx@v1
with:
version: latest
-
name: List available platforms
run: echo ${{ steps.buildx.outputs.platforms }}
-
name: Docker login (set DOCKER_USERNAME and DOCKER_PASSWORD in secrets)
if: ${{ success() && startsWith(github.repository, 'vouch/')}} # Remove this line, if you want everybody to publish to docker hub
run:
docker login -u ${{ secrets.DOCKER_USERNAME }} -p ${{ secrets.DOCKER_PASSWORD }}
-
name: Publish to docker as voucher/vouch-proxy
if: ${{ success() && startsWith(github.repository, 'vouch/')}}
run: |
docker buildx build \
--platform linux/arm/v7,linux/arm64 \
--push \
-t voucher/vouch-proxy:$DOCKER_TAG \
.
# Uncomment below to have github build to docker for every user. Watch out for indentation
# -
# name: Publish to docker as github_user/github_repo
# if: ${{ success() && !startsWith(github.repository, 'vouch/')}}
# run: |
# docker buildx build \
# --platform linux/amd64,linux/arm/v7,linux/arm64 \
# --push \
# -t $GITHUB_REPOSITORY:latest \
# . | Disable build for other users | chore: Disable build for other users
| YAML | mit | bnfinet/lasso,bnfinet/lasso | yaml | ## Code Before:
name: Docker build and push
on:
push:
branches:
- master
jobs:
Publish-to-docker:
runs-on: ubuntu-latest
env:
DOCKER_TAG: latest-multi
steps:
-
name: Checkout
uses: actions/checkout@v1
-
name: Set up Docker Buildx
id: buildx
uses: crazy-max/ghaction-docker-buildx@v1
with:
version: latest
-
name: List available platforms
run: echo ${{ steps.buildx.outputs.platforms }}
-
name: Docker login (set DOCKER_USERNAME and DOCKER_PASSWORD in secrets)
run:
docker login -u ${{ secrets.DOCKER_USERNAME }} -p ${{ secrets.DOCKER_PASSWORD }}
-
name: Publish to docker as voucher/vouch-proxy
if: ${{ success() && startsWith(github.repository, 'vouch/')}}
run: |
docker buildx build \
--platform linux/arm/v7,linux/arm64 \
--push \
-f ./.build/Dockerfile \
-t voucher/vouch-proxy:$DOCKER_TAG \
.
-
name: Publish to docker as github_user/github_repo
if: ${{ success() && !startsWith(github.repository, 'vouch/')}}
run: |
docker buildx build \
--platform linux/amd64,linux/arm/v7,linux/arm64 \
--push \
-t $GITHUB_REPOSITORY:latest \
.
## Instruction:
chore: Disable build for other users
## Code After:
name: Docker build and push
on:
push:
branches:
- master
jobs:
Publish-to-docker:
runs-on: ubuntu-latest
env:
DOCKER_TAG: latest-multi
steps:
-
name: Checkout
uses: actions/checkout@v2
-
name: Set up Docker Buildx
id: buildx
uses: crazy-max/ghaction-docker-buildx@v1
with:
version: latest
-
name: List available platforms
run: echo ${{ steps.buildx.outputs.platforms }}
-
name: Docker login (set DOCKER_USERNAME and DOCKER_PASSWORD in secrets)
if: ${{ success() && startsWith(github.repository, 'vouch/')}} # Remove this line, if you want everybody to publish to docker hub
run:
docker login -u ${{ secrets.DOCKER_USERNAME }} -p ${{ secrets.DOCKER_PASSWORD }}
-
name: Publish to docker as voucher/vouch-proxy
if: ${{ success() && startsWith(github.repository, 'vouch/')}}
run: |
docker buildx build \
--platform linux/arm/v7,linux/arm64 \
--push \
-t voucher/vouch-proxy:$DOCKER_TAG \
.
# Uncomment below to have github build to docker for every user. Watch out for indentation
# -
# name: Publish to docker as github_user/github_repo
# if: ${{ success() && !startsWith(github.repository, 'vouch/')}}
# run: |
# docker buildx build \
# --platform linux/amd64,linux/arm/v7,linux/arm64 \
# --push \
# -t $GITHUB_REPOSITORY:latest \
# . |
43ef6bc110a28670a7a97a433373222acc990944 | readme.md | readme.md |
[Grunt](http://gruntjs.com/) plugin for [Sass Lint](https://github.com/sasstools/sass-lint).
## Install
```
npm install grunt-sass-lint --save-dev
```
## Examples
```js
grunt.initConfig({
sasslint: {
options: {
configFile: 'config/.sass-lint.yml',
},
target: ['location/*.scss']
}
});
```
## Options
See the [sass-lint options](https://github.com/sasstools/sass-lint#options).
In addition the following options are supported:
### configFile
Type: `string`
Default: ``
Will fallback to `.sass-lint.yml` or the file location set at the `"sasslintConfig"` key inside of `package.json`
|
[Grunt](http://gruntjs.com/) plugin for [Sass Lint](https://github.com/sasstools/sass-lint).
## Install
```
npm install grunt-sass-lint --save-dev
```
Once the plugin has been installed, it may be enabled inside your Gruntfile with this line of JavaScript:
```
grunt.loadNpmTasks('grunt-sass-lint');
```
## Examples
```js
grunt.initConfig({
sasslint: {
options: {
configFile: 'config/.sass-lint.yml',
},
target: ['location/*.scss']
}
});
```
## Options
See the [sass-lint options](https://github.com/sasstools/sass-lint#options).
In addition the following options are supported:
### configFile
Type: `string`
Default: ``
Will fallback to `.sass-lint.yml` or the file location set at the `"sasslintConfig"` key inside of `package.json`
| Document how to enable grunt-sass-lint in Gruntfile. | Document how to enable grunt-sass-lint in Gruntfile.
For easy copy + paste. | Markdown | mit | sasstools/grunt-sass-lint | markdown | ## Code Before:
[Grunt](http://gruntjs.com/) plugin for [Sass Lint](https://github.com/sasstools/sass-lint).
## Install
```
npm install grunt-sass-lint --save-dev
```
## Examples
```js
grunt.initConfig({
sasslint: {
options: {
configFile: 'config/.sass-lint.yml',
},
target: ['location/*.scss']
}
});
```
## Options
See the [sass-lint options](https://github.com/sasstools/sass-lint#options).
In addition the following options are supported:
### configFile
Type: `string`
Default: ``
Will fallback to `.sass-lint.yml` or the file location set at the `"sasslintConfig"` key inside of `package.json`
## Instruction:
Document how to enable grunt-sass-lint in Gruntfile.
For easy copy + paste.
## Code After:
[Grunt](http://gruntjs.com/) plugin for [Sass Lint](https://github.com/sasstools/sass-lint).
## Install
```
npm install grunt-sass-lint --save-dev
```
Once the plugin has been installed, it may be enabled inside your Gruntfile with this line of JavaScript:
```
grunt.loadNpmTasks('grunt-sass-lint');
```
## Examples
```js
grunt.initConfig({
sasslint: {
options: {
configFile: 'config/.sass-lint.yml',
},
target: ['location/*.scss']
}
});
```
## Options
See the [sass-lint options](https://github.com/sasstools/sass-lint#options).
In addition the following options are supported:
### configFile
Type: `string`
Default: ``
Will fallback to `.sass-lint.yml` or the file location set at the `"sasslintConfig"` key inside of `package.json`
|
fcd9aa3292727ac6c442caff74dc1cde747e0462 | README.md | README.md | Australian Virtual Herbarium Hub (avh-hub)
=======
Grails application that provides UI and customisations to the [ALA Biocache](https://github.com/AtlasOfLivingAustralia/biocache-hubs)
(Grails plugin) for [http://avh.ala.org.au/](http://avh.ala.org.au/).
|
Australian Virtual Herbarium Hub (avh-hub)
==========================================
Grails application that provides UI and customisations to the [ALA Biocache](https://github.com/AtlasOfLivingAustralia/biocache-hubs)
(Grails plugin) for [http://avh.ala.org.au/](http://avh.ala.org.au/).
Deploying AVH
=============
If you have not yet installed Ansible, Vagrant, or VirtualBox, use the instructions at the [ALA Install README.md file](https://github.com/AtlasOfLivingAustralia/ala-install/blob/master/README.md) to install those first for your operating system.
Then, to deploy AVH onto a local virtual box install use the following instructions:
```
$ cd gitrepos
$ git clone git@github.com:AtlasOfLivingAustralia/ala-install.git
$ (cd ala-install/vagrant/ubuntu-trusty && vagrant up)
```
Add a line to your /etc/hosts file with the following information, replacing '10.1.1.3' with whatever IP address is assigned to the virtual machine that Vagrant starts up in VirtualBox:
```
10.1.1.3 avh.ala.org.au
```
Then you can clone the ansible instructions and install it onto the given machine:
```
$ git clone git@github.com:AtlasOfLivingAustralia/ansible-inventories.git
$ ansible-playbook -i ansible-inventories/avh.ala.org.au ala-install/ansible/avh-hub-standalone.yml --private-key ~/.vagrant.d/insecure_private_key -vvvv --user vagrant --sudo
```
| Add command line instructions for deploying AVH | Add command line instructions for deploying AVH | Markdown | mpl-2.0 | AtlasOfLivingAustralia/avh-hub,AtlasOfLivingAustralia/avh-hub | markdown | ## Code Before:
Australian Virtual Herbarium Hub (avh-hub)
=======
Grails application that provides UI and customisations to the [ALA Biocache](https://github.com/AtlasOfLivingAustralia/biocache-hubs)
(Grails plugin) for [http://avh.ala.org.au/](http://avh.ala.org.au/).
## Instruction:
Add command line instructions for deploying AVH
## Code After:
Australian Virtual Herbarium Hub (avh-hub)
==========================================
Grails application that provides UI and customisations to the [ALA Biocache](https://github.com/AtlasOfLivingAustralia/biocache-hubs)
(Grails plugin) for [http://avh.ala.org.au/](http://avh.ala.org.au/).
Deploying AVH
=============
If you have not yet installed Ansible, Vagrant, or VirtualBox, use the instructions at the [ALA Install README.md file](https://github.com/AtlasOfLivingAustralia/ala-install/blob/master/README.md) to install those first for your operating system.
Then, to deploy AVH onto a local virtual box install use the following instructions:
```
$ cd gitrepos
$ git clone git@github.com:AtlasOfLivingAustralia/ala-install.git
$ (cd ala-install/vagrant/ubuntu-trusty && vagrant up)
```
Add a line to your /etc/hosts file with the following information, replacing '10.1.1.3' with whatever IP address is assigned to the virtual machine that Vagrant starts up in VirtualBox:
```
10.1.1.3 avh.ala.org.au
```
Then you can clone the ansible instructions and install it onto the given machine:
```
$ git clone git@github.com:AtlasOfLivingAustralia/ansible-inventories.git
$ ansible-playbook -i ansible-inventories/avh.ala.org.au ala-install/ansible/avh-hub-standalone.yml --private-key ~/.vagrant.d/insecure_private_key -vvvv --user vagrant --sudo
```
|
800706f5835293ee20dd9505d1d11c28eb38bbb2 | tests/shipane_sdk/matchers/dataframe_matchers.py | tests/shipane_sdk/matchers/dataframe_matchers.py |
import re
from hamcrest.core.base_matcher import BaseMatcher
class HasColumn(BaseMatcher):
def __init__(self, column):
self._column = column
def _matches(self, df):
return self._column in df.columns
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have colum [{0}]'.format(self._column))
def has_column(column):
return HasColumn(column)
class HasColumnMatches(BaseMatcher):
def __init__(self, column_pattern):
self._column_pattern = re.compile(column_pattern)
def _matches(self, df):
return df.filter(regex=self._column_pattern).columns.size > 0
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have colum matches [{0}]'.format(self._column_pattern))
def has_column_matches(column_pattern):
return HasColumnMatches(column_pattern)
class HasRow(BaseMatcher):
def __init__(self, row):
self._row = row
def _matches(self, df):
return self._row in df.index
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have row [%s]'.format(self._row))
def has_row(row):
return HasRow(row)
|
import re
from hamcrest.core.base_matcher import BaseMatcher
class HasColumn(BaseMatcher):
def __init__(self, column):
self._column = column
def _matches(self, df):
return self._column in df.columns
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have colum [{0}]'.format(self._column))
def has_column(column):
return HasColumn(column)
class HasColumnMatches(BaseMatcher):
def __init__(self, column_pattern):
self._column_pattern = re.compile(column_pattern)
def _matches(self, df):
return len(list(filter(self._column_pattern.match, df.columns.values))) > 0
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have colum matches [{0}]'.format(self._column_pattern))
def has_column_matches(column_pattern):
return HasColumnMatches(column_pattern)
class HasRow(BaseMatcher):
def __init__(self, row):
self._row = row
def _matches(self, df):
return self._row in df.index
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have row [%s]'.format(self._row))
def has_row(row):
return HasRow(row)
| Fix HasColumn matcher for dataframe with duplicated columns | Fix HasColumn matcher for dataframe with duplicated columns
| Python | mit | sinall/ShiPanE-Python-SDK,sinall/ShiPanE-Python-SDK | python | ## Code Before:
import re
from hamcrest.core.base_matcher import BaseMatcher
class HasColumn(BaseMatcher):
def __init__(self, column):
self._column = column
def _matches(self, df):
return self._column in df.columns
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have colum [{0}]'.format(self._column))
def has_column(column):
return HasColumn(column)
class HasColumnMatches(BaseMatcher):
def __init__(self, column_pattern):
self._column_pattern = re.compile(column_pattern)
def _matches(self, df):
return df.filter(regex=self._column_pattern).columns.size > 0
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have colum matches [{0}]'.format(self._column_pattern))
def has_column_matches(column_pattern):
return HasColumnMatches(column_pattern)
class HasRow(BaseMatcher):
def __init__(self, row):
self._row = row
def _matches(self, df):
return self._row in df.index
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have row [%s]'.format(self._row))
def has_row(row):
return HasRow(row)
## Instruction:
Fix HasColumn matcher for dataframe with duplicated columns
## Code After:
import re
from hamcrest.core.base_matcher import BaseMatcher
class HasColumn(BaseMatcher):
def __init__(self, column):
self._column = column
def _matches(self, df):
return self._column in df.columns
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have colum [{0}]'.format(self._column))
def has_column(column):
return HasColumn(column)
class HasColumnMatches(BaseMatcher):
def __init__(self, column_pattern):
self._column_pattern = re.compile(column_pattern)
def _matches(self, df):
return len(list(filter(self._column_pattern.match, df.columns.values))) > 0
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have colum matches [{0}]'.format(self._column_pattern))
def has_column_matches(column_pattern):
return HasColumnMatches(column_pattern)
class HasRow(BaseMatcher):
def __init__(self, row):
self._row = row
def _matches(self, df):
return self._row in df.index
def describe_to(self, description):
description.append_text(u'Dataframe doesn\'t have row [%s]'.format(self._row))
def has_row(row):
return HasRow(row)
|
3bb371f5b2671f35a1563c46b58980796c59bf17 | .travis.yml | .travis.yml | language: scala
jdk:
- oraclejdk8
sudo: required
dist: trusty
scala:
- 2.10.6
- 2.11.8
env:
- CASSANDRA_VERSION=2.1.15
- CASSANDRA_VERSION=3.6
script:
- "sbt ++$TRAVIS_SCALA_VERSION -Dtest.cassandra.version=$CASSANDRA_VERSION -Dtravis=true -Dtravis=true it:test" #Integration Suite
- "sbt ++$TRAVIS_SCALA_VERSION -Dtest.cassandra.version=$CASSANDRA_VERSION -Dtravis=true -Dtravis=true assembly" #Test Suite and Assembly Test
| language: scala
jdk:
- oraclejdk8
sudo: required
dist: trusty
scala:
- 2.11.8
env:
- CASSANDRA_VERSION=2.1.15
- CASSANDRA_VERSION=3.6
script:
- "sbt ++$TRAVIS_SCALA_VERSION -Dtest.cassandra.version=$CASSANDRA_VERSION -Dtravis=true -Dtravis=true it:test" #Integration Suite
- "sbt ++$TRAVIS_SCALA_VERSION -Dtest.cassandra.version=$CASSANDRA_VERSION -Dtravis=true -Dtravis=true assembly" #Test Suite and Assembly Test
| Disable 2.10 Scala Testing in Travis | Disable 2.10 Scala Testing in Travis
Since 2.3 no longer supports Scala 2.10 we are removing it from
the travis test matrix.
| YAML | apache-2.0 | datastax/spark-cassandra-connector,datastax/spark-cassandra-connector | yaml | ## Code Before:
language: scala
jdk:
- oraclejdk8
sudo: required
dist: trusty
scala:
- 2.10.6
- 2.11.8
env:
- CASSANDRA_VERSION=2.1.15
- CASSANDRA_VERSION=3.6
script:
- "sbt ++$TRAVIS_SCALA_VERSION -Dtest.cassandra.version=$CASSANDRA_VERSION -Dtravis=true -Dtravis=true it:test" #Integration Suite
- "sbt ++$TRAVIS_SCALA_VERSION -Dtest.cassandra.version=$CASSANDRA_VERSION -Dtravis=true -Dtravis=true assembly" #Test Suite and Assembly Test
## Instruction:
Disable 2.10 Scala Testing in Travis
Since 2.3 no longer supports Scala 2.10 we are removing it from
the travis test matrix.
## Code After:
language: scala
jdk:
- oraclejdk8
sudo: required
dist: trusty
scala:
- 2.11.8
env:
- CASSANDRA_VERSION=2.1.15
- CASSANDRA_VERSION=3.6
script:
- "sbt ++$TRAVIS_SCALA_VERSION -Dtest.cassandra.version=$CASSANDRA_VERSION -Dtravis=true -Dtravis=true it:test" #Integration Suite
- "sbt ++$TRAVIS_SCALA_VERSION -Dtest.cassandra.version=$CASSANDRA_VERSION -Dtravis=true -Dtravis=true assembly" #Test Suite and Assembly Test
|
d95cefe45a48ab5427a2cca03f80fb1169a36e0d | lib/yumlcmd.rb | lib/yumlcmd.rb | require "net/http"
require "optparse"
class YumlCmd
def YumlCmd.generate(args)
ext = "png"
input = ""
type = "/scruffy"
opts = OptionParser.new do |o|
o.banner = "Usage: #{File.basename($0)} [options]"
o.on('-f', '--file FILENAME', 'File containing yuml.me diagram.') do |filename|
input = filename
end
o.on('-t', '--type EXTENSION', 'Output format: png (default), jpg') do |extension|
ext = extension if extension
end
o.on('-o', '--orderly', 'Generate orderly') do |type|
type = "" if type
end
o.on_tail('-h', '--help', 'Display this help and exit') do
puts opts
exit
end
end
opts.parse!(args)
lines = IO.readlines(input).collect!{|l| l.gsub("\n", "")}.reject{|l| l =~ /#/}
writer = open("yuml-output.#{ext}", "wb")
res = Net::HTTP.start("yuml.me", 80) {|http|
http.get(URI.escape("/diagram#{type}/class/#{lines.join(",")}"))
}
writer.write(res.body)
writer.close
end
end
if $0 == __FILE__
YumlCmd.generate(ARGV)
end | require "net/http"
require "optparse"
class YumlCmd
def YumlCmd.generate(args)
ext = "png"
input = ""
type = "/scruffy"
opts = OptionParser.new do |o|
o.banner = "Usage: #{File.basename($0)} [options]"
o.on('-f', '--file FILENAME', 'File containing yuml.me diagram.') do |filename|
input = filename
end
o.on('-t', '--type EXTENSION', 'Output format: png (default), jpg') do |extension|
ext = extension if extension
end
o.on('-o', '--orderly', 'Generate orderly') do |type|
type = "" if type
end
o.on_tail('-h', '--help', 'Display this help and exit') do
puts opts
exit
end
end
opts.parse!(args)
lines = IO.readlines(input).collect!{|l| l.gsub("\n", "")}.reject{|l| l =~ /#/}
output = input.replace(".", "-")
output = output + "-output"
writer = open(output + ".#{ext}", "wb")
res = Net::HTTP.start("yuml.me", 80) {|http|
http.get(URI.escape("/diagram#{type}/class/#{lines.join(",")}"))
}
writer.write(res.body)
writer.close
end
end
if $0 == __FILE__
YumlCmd.generate(ARGV)
end | Use input filename as base for output filename | Use input filename as base for output filename
| Ruby | mit | atog/yumlcmd | ruby | ## Code Before:
require "net/http"
require "optparse"
class YumlCmd
def YumlCmd.generate(args)
ext = "png"
input = ""
type = "/scruffy"
opts = OptionParser.new do |o|
o.banner = "Usage: #{File.basename($0)} [options]"
o.on('-f', '--file FILENAME', 'File containing yuml.me diagram.') do |filename|
input = filename
end
o.on('-t', '--type EXTENSION', 'Output format: png (default), jpg') do |extension|
ext = extension if extension
end
o.on('-o', '--orderly', 'Generate orderly') do |type|
type = "" if type
end
o.on_tail('-h', '--help', 'Display this help and exit') do
puts opts
exit
end
end
opts.parse!(args)
lines = IO.readlines(input).collect!{|l| l.gsub("\n", "")}.reject{|l| l =~ /#/}
writer = open("yuml-output.#{ext}", "wb")
res = Net::HTTP.start("yuml.me", 80) {|http|
http.get(URI.escape("/diagram#{type}/class/#{lines.join(",")}"))
}
writer.write(res.body)
writer.close
end
end
if $0 == __FILE__
YumlCmd.generate(ARGV)
end
## Instruction:
Use input filename as base for output filename
## Code After:
require "net/http"
require "optparse"
class YumlCmd
def YumlCmd.generate(args)
ext = "png"
input = ""
type = "/scruffy"
opts = OptionParser.new do |o|
o.banner = "Usage: #{File.basename($0)} [options]"
o.on('-f', '--file FILENAME', 'File containing yuml.me diagram.') do |filename|
input = filename
end
o.on('-t', '--type EXTENSION', 'Output format: png (default), jpg') do |extension|
ext = extension if extension
end
o.on('-o', '--orderly', 'Generate orderly') do |type|
type = "" if type
end
o.on_tail('-h', '--help', 'Display this help and exit') do
puts opts
exit
end
end
opts.parse!(args)
lines = IO.readlines(input).collect!{|l| l.gsub("\n", "")}.reject{|l| l =~ /#/}
output = input.replace(".", "-")
output = output + "-output"
writer = open(output + ".#{ext}", "wb")
res = Net::HTTP.start("yuml.me", 80) {|http|
http.get(URI.escape("/diagram#{type}/class/#{lines.join(",")}"))
}
writer.write(res.body)
writer.close
end
end
if $0 == __FILE__
YumlCmd.generate(ARGV)
end |
1a6f45011f653a4039fd2ae847d3de65ff216c57 | providers/soundcloud.yml | providers/soundcloud.yml | ---
- provider_name: SoundCloud
provider_url: http://soundcloud.com/
endpoints:
- schemes:
- http://soundcloud.com/*
url: https://soundcloud.com/oembed
example_urls:
- https://soundcloud.com/oembed?url=https%3A%2F%2Fsoundcloud.com%2Fforss%2Fflickermood
- https://soundcloud.com/oembed?url=https%3A%2F%2Fsoundcloud.com%2Fforss%2Fflickermood&format=json
...
| ---
- provider_name: SoundCloud
provider_url: http://soundcloud.com/
endpoints:
- schemes:
- http://soundcloud.com/*
- https://soundcloud.com/*
url: https://soundcloud.com/oembed
example_urls:
- https://soundcloud.com/oembed?url=https%3A%2F%2Fsoundcloud.com%2Fforss%2Fflickermood
- https://soundcloud.com/oembed?url=https%3A%2F%2Fsoundcloud.com%2Fforss%2Fflickermood&format=json
...
| Add https scheme for SoundCloud | Add https scheme for SoundCloud
no issue
- SoundCloud is by default https so https://soundcloud/* URLs should be recognised | YAML | mit | iamcal/oembed,iamcal/oembed,iamcal/oembed,iamcal/oembed | yaml | ## Code Before:
---
- provider_name: SoundCloud
provider_url: http://soundcloud.com/
endpoints:
- schemes:
- http://soundcloud.com/*
url: https://soundcloud.com/oembed
example_urls:
- https://soundcloud.com/oembed?url=https%3A%2F%2Fsoundcloud.com%2Fforss%2Fflickermood
- https://soundcloud.com/oembed?url=https%3A%2F%2Fsoundcloud.com%2Fforss%2Fflickermood&format=json
...
## Instruction:
Add https scheme for SoundCloud
no issue
- SoundCloud is by default https so https://soundcloud/* URLs should be recognised
## Code After:
---
- provider_name: SoundCloud
provider_url: http://soundcloud.com/
endpoints:
- schemes:
- http://soundcloud.com/*
- https://soundcloud.com/*
url: https://soundcloud.com/oembed
example_urls:
- https://soundcloud.com/oembed?url=https%3A%2F%2Fsoundcloud.com%2Fforss%2Fflickermood
- https://soundcloud.com/oembed?url=https%3A%2F%2Fsoundcloud.com%2Fforss%2Fflickermood&format=json
...
|
b4167b3315e16e279c34d1624de0de604072b0cd | Resources/views/Form/fields.html.twig | Resources/views/Form/fields.html.twig | {% block file_row %}
{% spaceless %}
{% if panda_uploader %}
{{ block("panda_uploader_widget") }}
{% else %}
<div>
{{ form_label(form) }}
{{ form_errors(form) }}
{{ form_widget(form) }}
</div>
{% endif %}
{% endspaceless %}
{% endblock %}
{% block panda_uploader_widget %}
<div class="panda-uploader-widget">
{{ form_label(form) }}
{{ form_errors(form) }}
{% set type = "hidden" %}
{{ block("form_widget_simple") }}
{{ block("browse_button_widget") }}
{{ block("cancel_button_widget") }}
{{ block("progress_bar_widget") }}
</div>
{% endblock %}
{% block browse_button_widget %}
<div class="browse-button" id="{{ browse_button_id }}">{{ browse_button_label }}</div>
{% endblock %}
{% block cancel_button_widget %}
{% if cancel_button %}
<div class="cancel-button" id="{{ cancel_button_id }}">{{ cancel_button_label }}</div>
{% endif %}
{% endblock %}
{% block progress_bar_widget %}
{% if progress_bar %}
<div class="progress-bar" id="{{ progress_bar_id }}">
0%
</div>
{% endif %}
{% endblock %}
| {% block file_row -%}
{%- if panda_uploader -%}
{{- block("panda_uploader_widget") -}}
{%- else -%}
<div>
{{- form_label(form) -}}
{{- form_errors(form) -}}
{{- form_widget(form) -}}
</div>
{%- endif -%}
{%- endblock %}
{% block panda_uploader_widget -%}
<div class="panda-uploader-widget">
{{- form_label(form) -}}
{{- form_errors(form) -}}
{%- set type = "hidden" %}
{{- block("form_widget_simple") -}}
{{- block("browse_button_widget") -}}
{{- block("cancel_button_widget") -}}
{{- block("progress_bar_widget") -}}
</div>
{% endblock %}
{% block browse_button_widget -%}
<div class="browse-button" id="{{ browse_button_id }}">{{ browse_button_label }}</div>
{%- endblock %}
{% block cancel_button_widget -%}
{%- if cancel_button -%}
<div class="cancel-button" id="{{ cancel_button_id }}">{{ cancel_button_label }}</div>
{%- endif -%}
{%- endblock %}
{% block progress_bar_widget -%}
{%- if progress_bar -%}
<div class="progress-bar" id="{{ progress_bar_id }}">
0%
</div>
{%- endif -%}
{%- endblock %}
| Migrate from spaceless to whitespace control in form theme | Migrate from spaceless to whitespace control in form theme
| Twig | mit | xabbuh/PandaBundle,xabbuh/PandaBundle | twig | ## Code Before:
{% block file_row %}
{% spaceless %}
{% if panda_uploader %}
{{ block("panda_uploader_widget") }}
{% else %}
<div>
{{ form_label(form) }}
{{ form_errors(form) }}
{{ form_widget(form) }}
</div>
{% endif %}
{% endspaceless %}
{% endblock %}
{% block panda_uploader_widget %}
<div class="panda-uploader-widget">
{{ form_label(form) }}
{{ form_errors(form) }}
{% set type = "hidden" %}
{{ block("form_widget_simple") }}
{{ block("browse_button_widget") }}
{{ block("cancel_button_widget") }}
{{ block("progress_bar_widget") }}
</div>
{% endblock %}
{% block browse_button_widget %}
<div class="browse-button" id="{{ browse_button_id }}">{{ browse_button_label }}</div>
{% endblock %}
{% block cancel_button_widget %}
{% if cancel_button %}
<div class="cancel-button" id="{{ cancel_button_id }}">{{ cancel_button_label }}</div>
{% endif %}
{% endblock %}
{% block progress_bar_widget %}
{% if progress_bar %}
<div class="progress-bar" id="{{ progress_bar_id }}">
0%
</div>
{% endif %}
{% endblock %}
## Instruction:
Migrate from spaceless to whitespace control in form theme
## Code After:
{% block file_row -%}
{%- if panda_uploader -%}
{{- block("panda_uploader_widget") -}}
{%- else -%}
<div>
{{- form_label(form) -}}
{{- form_errors(form) -}}
{{- form_widget(form) -}}
</div>
{%- endif -%}
{%- endblock %}
{% block panda_uploader_widget -%}
<div class="panda-uploader-widget">
{{- form_label(form) -}}
{{- form_errors(form) -}}
{%- set type = "hidden" %}
{{- block("form_widget_simple") -}}
{{- block("browse_button_widget") -}}
{{- block("cancel_button_widget") -}}
{{- block("progress_bar_widget") -}}
</div>
{% endblock %}
{% block browse_button_widget -%}
<div class="browse-button" id="{{ browse_button_id }}">{{ browse_button_label }}</div>
{%- endblock %}
{% block cancel_button_widget -%}
{%- if cancel_button -%}
<div class="cancel-button" id="{{ cancel_button_id }}">{{ cancel_button_label }}</div>
{%- endif -%}
{%- endblock %}
{% block progress_bar_widget -%}
{%- if progress_bar -%}
<div class="progress-bar" id="{{ progress_bar_id }}">
0%
</div>
{%- endif -%}
{%- endblock %}
|
1e9dc2c02a8aa21195d1e487ec389fc7a7f5a5f9 | src/main.rs | src/main.rs | extern crate rand;
use rand::thread_rng;
use rand::distributions::Range;
use individual::Individual;
mod individual;
fn main() {
let mut rng = thread_rng();
let range = Range::new(-512.03_f64, 511.97); // range for Schwefel problem
// initialize population
let mut population: Vec<_> = (0..128).map(|_| {
Individual::new(&range, &mut rng)
}).collect();
for i in 0..10000 {
// generate mutated offspring
population = population.iter().map(|x| {
x.mutate(&range, &mut rng)
}).collect();
let best = population.iter().min().unwrap();
if i % 100 == 0 {
println!("{}th fitness: {}", i, best.fitness);
}
if best.fitness < 1000_f64 {
println!("Solution: {:?}", best.solution);
return;
}
}
println!("Failed to converge.");
}
| extern crate rand;
use rand::{Rng, thread_rng};
use rand::distributions::Range;
use individual::Individual;
mod individual;
fn select<'a, R: Rng>(population: &'a Vec<Individual>, rng: &mut R)
-> &'a Individual {
let population: Vec<_> = (0..4).map(|_| rng.choose(population)).collect();
if let Some(selected) = population.iter().min() {
return selected.unwrap();
}
unimplemented!();
}
fn main() {
let mut rng = thread_rng();
let range = Range::new(-512.03_f64, 511.97); // range for Schwefel problem
// initialize population
let mut population: Vec<_> = (0..128).map(|_| {
Individual::new(&range, &mut rng)
}).collect();
for i in 0..10000 {
// select and mutate individuals for next population
population = (0..128).map(|_| {
select(&population, &mut rng).mutate(&range, &mut rng)
}).collect();
let best = population.iter().min().unwrap();
if i % 100 == 0 {
println!("{}th fitness: {}", i, best.fitness);
}
if best.fitness < 1000_f64 {
println!("{}th solution converged at {}: {:?}",
i, best.fitness, best.solution);
return;
}
}
println!("Failed to converge.");
}
| Implement tournament selection to obtain some convergence | Implement tournament selection to obtain some convergence
| Rust | agpl-3.0 | andschwa/rust-genetic-algorithm | rust | ## Code Before:
extern crate rand;
use rand::thread_rng;
use rand::distributions::Range;
use individual::Individual;
mod individual;
fn main() {
let mut rng = thread_rng();
let range = Range::new(-512.03_f64, 511.97); // range for Schwefel problem
// initialize population
let mut population: Vec<_> = (0..128).map(|_| {
Individual::new(&range, &mut rng)
}).collect();
for i in 0..10000 {
// generate mutated offspring
population = population.iter().map(|x| {
x.mutate(&range, &mut rng)
}).collect();
let best = population.iter().min().unwrap();
if i % 100 == 0 {
println!("{}th fitness: {}", i, best.fitness);
}
if best.fitness < 1000_f64 {
println!("Solution: {:?}", best.solution);
return;
}
}
println!("Failed to converge.");
}
## Instruction:
Implement tournament selection to obtain some convergence
## Code After:
extern crate rand;
use rand::{Rng, thread_rng};
use rand::distributions::Range;
use individual::Individual;
mod individual;
fn select<'a, R: Rng>(population: &'a Vec<Individual>, rng: &mut R)
-> &'a Individual {
let population: Vec<_> = (0..4).map(|_| rng.choose(population)).collect();
if let Some(selected) = population.iter().min() {
return selected.unwrap();
}
unimplemented!();
}
fn main() {
let mut rng = thread_rng();
let range = Range::new(-512.03_f64, 511.97); // range for Schwefel problem
// initialize population
let mut population: Vec<_> = (0..128).map(|_| {
Individual::new(&range, &mut rng)
}).collect();
for i in 0..10000 {
// select and mutate individuals for next population
population = (0..128).map(|_| {
select(&population, &mut rng).mutate(&range, &mut rng)
}).collect();
let best = population.iter().min().unwrap();
if i % 100 == 0 {
println!("{}th fitness: {}", i, best.fitness);
}
if best.fitness < 1000_f64 {
println!("{}th solution converged at {}: {:?}",
i, best.fitness, best.solution);
return;
}
}
println!("Failed to converge.");
}
|
afd2e2635bf73ce8b4dbe229ae0148a73060287b | app/views/films/show.html.erb | app/views/films/show.html.erb | <h1><%= film.title %></h1>
<% @films.ratings.each do |film| %>
<h4><%= film.average_score %></h4>
<p><%= render @film.reviews %></p>
<% end %>
| <h1><%= film.title %></h1>
<h4><%= film.average_score %></h4>
<p><%= render @film.reviews %></p>
| Fix average score display on a film | Fix average score display on a film
| HTML+ERB | mit | JacobCrofts/wintergarten,JacobCrofts/wintergarten,JacobCrofts/wintergarten | html+erb | ## Code Before:
<h1><%= film.title %></h1>
<% @films.ratings.each do |film| %>
<h4><%= film.average_score %></h4>
<p><%= render @film.reviews %></p>
<% end %>
## Instruction:
Fix average score display on a film
## Code After:
<h1><%= film.title %></h1>
<h4><%= film.average_score %></h4>
<p><%= render @film.reviews %></p>
|
98339c030b63a5befcee41d5d2a26642f93fed40 | unsplash/resources/js/Unsplash_Script.js | unsplash/resources/js/Unsplash_Script.js | jQuery(document).ready(function($) {
var container = $('#splashing_images');
$.LoadingOverlaySetup({
color : "rgba(241,241,241,0.8)",
maxSize : "80px",
minSize : "20px",
resizeInterval : 0,
size : "30%"
});
$('a.upload').click(function(e){
var element = $(this);
var image = element.find('img');
// If not saving, then proceed
if(!element.hasClass('saving')){
element.addClass('saving');
e.preventDefault();
var payload = { source : $(this).data('source'), author: $(this).data('author'), credit: $(this).data('credit')};
payload[window.csrfTokenName] = window.csrfTokenValue;
$.ajax({
type: 'POST',
url: Craft.getActionUrl('unsplash/download/save'),
dataType: 'JSON',
data: payload,
beforeSend: function() {
image.LoadingOverlay("show");
},
success: function(response) {
image.LoadingOverlay("hide");
Craft.cp.displayNotice(Craft.t('Image saved!'));
},
error: function(xhr, status, error) {
Craft.cp.displayError(Craft.t('Oops, something went wrong!'));
}
});
};
});
}); | jQuery(document).ready(function($) {
var container = $('#splashing_images');
$.LoadingOverlaySetup({
color : "rgba(241,241,241,0.8)",
maxSize : "80px",
minSize : "20px",
resizeInterval : 0,
size : "30%"
});
$('a.upload').click(function(e){
var element = $(this);
var image = element.find('img');
// If not saving, then proceed
if(!element.hasClass('saving')){
element.addClass('saving');
e.preventDefault();
var payload = { source : $(this).data('source'), author: $(this).data('author'), credit: $(this).data('credit')};
payload[window.csrfTokenName] = window.csrfTokenValue;
$.ajax({
type: 'POST',
url: Craft.getActionUrl('unsplash/download/save'),
dataType: 'JSON',
data: payload,
beforeSend: function() {
image.LoadingOverlay("show");
},
success: function(response) {
image.LoadingOverlay("hide");
Craft.cp.displayNotice(Craft.t('Image saved!'));
},
error: function(xhr, status, error) {
image.LoadingOverlay("hide");
Craft.cp.displayError(Craft.t('Oops, something went wrong!'));
}
});
};
});
}); | Hide overlay on ajax error | Hide overlay on ajax error
| JavaScript | mit | studioespresso/craft-unsplash,studioespresso/craft-unsplash,studioespresso/craft-unsplash | javascript | ## Code Before:
jQuery(document).ready(function($) {
var container = $('#splashing_images');
$.LoadingOverlaySetup({
color : "rgba(241,241,241,0.8)",
maxSize : "80px",
minSize : "20px",
resizeInterval : 0,
size : "30%"
});
$('a.upload').click(function(e){
var element = $(this);
var image = element.find('img');
// If not saving, then proceed
if(!element.hasClass('saving')){
element.addClass('saving');
e.preventDefault();
var payload = { source : $(this).data('source'), author: $(this).data('author'), credit: $(this).data('credit')};
payload[window.csrfTokenName] = window.csrfTokenValue;
$.ajax({
type: 'POST',
url: Craft.getActionUrl('unsplash/download/save'),
dataType: 'JSON',
data: payload,
beforeSend: function() {
image.LoadingOverlay("show");
},
success: function(response) {
image.LoadingOverlay("hide");
Craft.cp.displayNotice(Craft.t('Image saved!'));
},
error: function(xhr, status, error) {
Craft.cp.displayError(Craft.t('Oops, something went wrong!'));
}
});
};
});
});
## Instruction:
Hide overlay on ajax error
## Code After:
jQuery(document).ready(function($) {
var container = $('#splashing_images');
$.LoadingOverlaySetup({
color : "rgba(241,241,241,0.8)",
maxSize : "80px",
minSize : "20px",
resizeInterval : 0,
size : "30%"
});
$('a.upload').click(function(e){
var element = $(this);
var image = element.find('img');
// If not saving, then proceed
if(!element.hasClass('saving')){
element.addClass('saving');
e.preventDefault();
var payload = { source : $(this).data('source'), author: $(this).data('author'), credit: $(this).data('credit')};
payload[window.csrfTokenName] = window.csrfTokenValue;
$.ajax({
type: 'POST',
url: Craft.getActionUrl('unsplash/download/save'),
dataType: 'JSON',
data: payload,
beforeSend: function() {
image.LoadingOverlay("show");
},
success: function(response) {
image.LoadingOverlay("hide");
Craft.cp.displayNotice(Craft.t('Image saved!'));
},
error: function(xhr, status, error) {
image.LoadingOverlay("hide");
Craft.cp.displayError(Craft.t('Oops, something went wrong!'));
}
});
};
});
}); |
0ef129be9341f9668bba8e8cfdef06c7dfde5983 | link/config/nvim/plugin/mapleader.vim | link/config/nvim/plugin/mapleader.vim | " Change mapleader
let mapleader=","
" Save session for restoring later with 'vim -S' (,ss)
map <leader>ss :mksession<CR>
" Save a file as root (,sr)
map <leader>sr :w !sudo tee % > /dev/null<CR>
" Use ',t' to toggle absolute and relative line numbers
map <leader>t :call ToggleNumber()<CR>
" Open ag.vim with '\'
map \ :Ag<SPACE>
" Use ',.' to go to end of current line
map <leader>. $
" Use ',m' to go to beginning of current line
map <leader>m ^
" Pasting from clipboard into vim is formatted correctly (,p)
map <silent> <leader>p :set paste<CR>"*p:set nopaste<CR>
" Quickly edit/reload the ~/.vimrc file
" ',ev' opens ~/.vimrc in a new buffer to edit.
" ',sv' sources the ~/.vimrc file.
map <silent> <leader>ev :e $MYVIMRC<CR>
map <silent> <leader>sv :so $MYVIMRC<CR>:call SourceConfigs()<CR>
" Switch CWD to the directory of the open buffer
map <leader>cd :cd %:p:h<cr>:pwd<cr>
" vim:foldenable:foldmethod=marker
| " Change mapleader
let mapleader=","
" Save a file as root (,sr)
map <leader>sr :w !sudo tee % > /dev/null<CR>
" Use ',t' to toggle absolute and relative line numbers
map <leader>t :call ToggleNumber()<CR>
" Pasting from clipboard into vim is formatted correctly (,p)
map <silent> <leader>p :set paste<CR>"*p:set nopaste<CR>
" Quickly edit/reload the ~/.vimrc file
" ',ev' opens ~/.vimrc in a new buffer to edit.
" ',sv' sources the ~/.vimrc file.
map <silent> <leader>ev :e $MYVIMRC<CR>
map <silent> <leader>sv :so $MYVIMRC<CR>:call SourceConfigs()<CR>
" Switch CWD to the directory of the open buffer
map <leader>cd :cd %:p:h<cr>:pwd<CR>
" Update all plugins
nmap <leader>pu :PackerUpdate<CR>
" vim:foldenable:foldmethod=marker
| Remove unused nvim leader shortcuts, add new one | Remove unused nvim leader shortcuts, add new one
| VimL | mit | tjtrabue/dotfiles,tjtrabue/dotfiles,tjtrabue/dotfiles,tjtrabue/dotfiles,tjtrabue/dotfiles | viml | ## Code Before:
" Change mapleader
let mapleader=","
" Save session for restoring later with 'vim -S' (,ss)
map <leader>ss :mksession<CR>
" Save a file as root (,sr)
map <leader>sr :w !sudo tee % > /dev/null<CR>
" Use ',t' to toggle absolute and relative line numbers
map <leader>t :call ToggleNumber()<CR>
" Open ag.vim with '\'
map \ :Ag<SPACE>
" Use ',.' to go to end of current line
map <leader>. $
" Use ',m' to go to beginning of current line
map <leader>m ^
" Pasting from clipboard into vim is formatted correctly (,p)
map <silent> <leader>p :set paste<CR>"*p:set nopaste<CR>
" Quickly edit/reload the ~/.vimrc file
" ',ev' opens ~/.vimrc in a new buffer to edit.
" ',sv' sources the ~/.vimrc file.
map <silent> <leader>ev :e $MYVIMRC<CR>
map <silent> <leader>sv :so $MYVIMRC<CR>:call SourceConfigs()<CR>
" Switch CWD to the directory of the open buffer
map <leader>cd :cd %:p:h<cr>:pwd<cr>
" vim:foldenable:foldmethod=marker
## Instruction:
Remove unused nvim leader shortcuts, add new one
## Code After:
" Change mapleader
let mapleader=","
" Save a file as root (,sr)
map <leader>sr :w !sudo tee % > /dev/null<CR>
" Use ',t' to toggle absolute and relative line numbers
map <leader>t :call ToggleNumber()<CR>
" Pasting from clipboard into vim is formatted correctly (,p)
map <silent> <leader>p :set paste<CR>"*p:set nopaste<CR>
" Quickly edit/reload the ~/.vimrc file
" ',ev' opens ~/.vimrc in a new buffer to edit.
" ',sv' sources the ~/.vimrc file.
map <silent> <leader>ev :e $MYVIMRC<CR>
map <silent> <leader>sv :so $MYVIMRC<CR>:call SourceConfigs()<CR>
" Switch CWD to the directory of the open buffer
map <leader>cd :cd %:p:h<cr>:pwd<CR>
" Update all plugins
nmap <leader>pu :PackerUpdate<CR>
" vim:foldenable:foldmethod=marker
|
4e49556ec45db5dbfa1383065f802f84b0ab4ce3 | lib/greek_string/container.rb | lib/greek_string/container.rb | class GreekString
class Container
require 'forwardable'
extend Forwardable
def_delegators :@container, :[], :<<, :each, :map
def initialize
@container = []
end
def letters
@container.flat_map(&:letters)
end
def to_a
@container
end
def to_s(type)
@container.flat_map(&type)
end
def methods(meth)
@container.map! do |letter|
if letter.instance_variable_defined?("@" + meth.to_s)
letter
end
end
@container.compact
self
end
def method_missing(meth)
blk = Proc.new { methods(meth) }
self.class.send(:define_method, meth, &blk)
end
end
end
| class GreekString
class Container
require 'forwardable'
extend Forwardable
def_delegators :@container, :[], :<<, :each, :map, :flat_map
def initialize
@container = []
end
def to_a
@container
end
def to_s(type)
@container.flat_map(&type)
end
end
end
| Delete letters method, add flat_map delegator | Delete letters method, add flat_map delegator
| Ruby | mit | latin-language-toolkit/greek_string | ruby | ## Code Before:
class GreekString
class Container
require 'forwardable'
extend Forwardable
def_delegators :@container, :[], :<<, :each, :map
def initialize
@container = []
end
def letters
@container.flat_map(&:letters)
end
def to_a
@container
end
def to_s(type)
@container.flat_map(&type)
end
def methods(meth)
@container.map! do |letter|
if letter.instance_variable_defined?("@" + meth.to_s)
letter
end
end
@container.compact
self
end
def method_missing(meth)
blk = Proc.new { methods(meth) }
self.class.send(:define_method, meth, &blk)
end
end
end
## Instruction:
Delete letters method, add flat_map delegator
## Code After:
class GreekString
class Container
require 'forwardable'
extend Forwardable
def_delegators :@container, :[], :<<, :each, :map, :flat_map
def initialize
@container = []
end
def to_a
@container
end
def to_s(type)
@container.flat_map(&type)
end
end
end
|
fd3bf885acc4cc222378fbb9d29b12bf2632eef6 | app/components/class-field-description.js | app/components/class-field-description.js | import { inject as service } from '@ember/service';
import Component from '@ember/component';
import { tracked } from '@glimmer/tracking';
class Field {
@tracked name;
@tracked class;
}
export default class ClassFieldDescription extends Component {
@service
legacyModuleMappings;
field = new Field();
get hasImportExample() {
return this.legacyModuleMappings.hasFunctionMapping(
this.field.name,
this.field.class
);
}
/**
* Callback for updating the anchor with the field name that was clicked by a user.
*
* @method updateAnchor
* @method fieldName String The name representing the field that was clicked.
*/
updateAnchor() {}
}
| import { inject as service } from '@ember/service';
import Component from '@ember/component';
export default class ClassFieldDescription extends Component {
@service
legacyModuleMappings;
get hasImportExample() {
return this.legacyModuleMappings.hasFunctionMapping(
this.args.field.name,
this.args.field.class
);
}
/**
* Callback for updating the anchor with the field name that was clicked by a user.
*
* @method updateAnchor
* @method fieldName String The name representing the field that was clicked.
*/
updateAnchor() {}
}
| Remove @tracked as it is not used | Remove @tracked as it is not used
| JavaScript | mit | ember-learn/ember-api-docs,ember-learn/ember-api-docs | javascript | ## Code Before:
import { inject as service } from '@ember/service';
import Component from '@ember/component';
import { tracked } from '@glimmer/tracking';
class Field {
@tracked name;
@tracked class;
}
export default class ClassFieldDescription extends Component {
@service
legacyModuleMappings;
field = new Field();
get hasImportExample() {
return this.legacyModuleMappings.hasFunctionMapping(
this.field.name,
this.field.class
);
}
/**
* Callback for updating the anchor with the field name that was clicked by a user.
*
* @method updateAnchor
* @method fieldName String The name representing the field that was clicked.
*/
updateAnchor() {}
}
## Instruction:
Remove @tracked as it is not used
## Code After:
import { inject as service } from '@ember/service';
import Component from '@ember/component';
export default class ClassFieldDescription extends Component {
@service
legacyModuleMappings;
get hasImportExample() {
return this.legacyModuleMappings.hasFunctionMapping(
this.args.field.name,
this.args.field.class
);
}
/**
* Callback for updating the anchor with the field name that was clicked by a user.
*
* @method updateAnchor
* @method fieldName String The name representing the field that was clicked.
*/
updateAnchor() {}
}
|
8ea1ff90e4b5071e3e99970cb20368551a72d259 | src/PortNumber.php | src/PortNumber.php | <?php
namespace Krixon\URL;
class PortNumber
{
private $number;
/**
* @param int $value
*
* @throws \InvalidArgumentException
*/
public function __construct($value)
{
$options = [
'options' => [
'min_range' => 0,
'max_range' => 65535,
],
];
if (!filter_var($value, FILTER_VALIDATE_INT, $options)) {
throw new \InvalidArgumentException('Invalid port number.');
}
}
/**
* @return string
*/
public function __toString()
{
return $this->toString();
}
/**
* @return string
*/
public function toString()
{
return (string)$this->toInt();
}
/**
* @return int
*/
public function toInt()
{
return $this->number;
}
}
| <?php
namespace Krixon\URL;
class PortNumber
{
private $number;
/**
* @param int $value
*
* @throws \InvalidArgumentException
*/
public function __construct($value)
{
$options = [
'options' => [
'min_range' => 0,
'max_range' => 65535,
],
];
if (!filter_var($value, FILTER_VALIDATE_INT, $options)) {
throw new \InvalidArgumentException('Invalid port number.');
}
$this->number = $value;
}
/**
* @return string
*/
public function __toString()
{
return $this->toString();
}
/**
* @return string
*/
public function toString()
{
return (string)$this->toInt();
}
/**
* @return int
*/
public function toInt()
{
return $this->number;
}
}
| Make sure port number is stored correctly | Make sure port number is stored correctly
| PHP | mit | krixon/url | php | ## Code Before:
<?php
namespace Krixon\URL;
class PortNumber
{
private $number;
/**
* @param int $value
*
* @throws \InvalidArgumentException
*/
public function __construct($value)
{
$options = [
'options' => [
'min_range' => 0,
'max_range' => 65535,
],
];
if (!filter_var($value, FILTER_VALIDATE_INT, $options)) {
throw new \InvalidArgumentException('Invalid port number.');
}
}
/**
* @return string
*/
public function __toString()
{
return $this->toString();
}
/**
* @return string
*/
public function toString()
{
return (string)$this->toInt();
}
/**
* @return int
*/
public function toInt()
{
return $this->number;
}
}
## Instruction:
Make sure port number is stored correctly
## Code After:
<?php
namespace Krixon\URL;
class PortNumber
{
private $number;
/**
* @param int $value
*
* @throws \InvalidArgumentException
*/
public function __construct($value)
{
$options = [
'options' => [
'min_range' => 0,
'max_range' => 65535,
],
];
if (!filter_var($value, FILTER_VALIDATE_INT, $options)) {
throw new \InvalidArgumentException('Invalid port number.');
}
$this->number = $value;
}
/**
* @return string
*/
public function __toString()
{
return $this->toString();
}
/**
* @return string
*/
public function toString()
{
return (string)$this->toInt();
}
/**
* @return int
*/
public function toInt()
{
return $this->number;
}
}
|
79d4615df86e915113b13835b0f806f8927b8ec8 | metadata/com.wesaphzt.privatelocation.yml | metadata/com.wesaphzt.privatelocation.yml | Categories:
- Development
- Navigation
- System
License: GPL-3.0-or-later
SourceCode: https://github.com/wesaphzt/privatelocation
IssueTracker: https://github.com/wesaphzt/privatelocation/issues
Bitcoin: 1GCkvAg9oG79niQTbh6EH9rPALQDXKyHKK
Litecoin: LV687s3wVdhmLZyJMFxomJHdHFXeFAKT5R
AutoName: Private Location
Description: |-
A simple app to set your location to anywhere in the world, and improve
general phone location privacy.
Apps on your phone can require location permissions to work, and make
repeated and unnecessary location requests in the background throughout
the day.
Setting your location somewhere else will help protect your privacy when
using these apps.
This is one of the few location apps that doesn't rely on Google Maps.
RepoType: git
Repo: https://github.com/wesaphzt/privatelocation
Builds:
- versionName: '1.0'
versionCode: 1
commit: '1.0'
subdir: app
gradle:
- yes
AutoUpdateMode: Version %v
UpdateCheckMode: Tags
CurrentVersion: '1.0'
CurrentVersionCode: 1
| Categories:
- Development
- Navigation
- System
License: GPL-3.0-or-later
SourceCode: https://github.com/wesaphzt/privatelocation
IssueTracker: https://github.com/wesaphzt/privatelocation/issues
Bitcoin: 1GCkvAg9oG79niQTbh6EH9rPALQDXKyHKK
Litecoin: LV687s3wVdhmLZyJMFxomJHdHFXeFAKT5R
AutoName: Private Location
Description: |-
A simple app to set your location to anywhere in the world, and improve
general phone location privacy.
Apps on your phone can require location permissions to work, and make
repeated and unnecessary location requests in the background throughout
the day.
Setting your location somewhere else will help protect your privacy when
using these apps.
This is one of the few location apps that doesn't rely on Google Maps.
RepoType: git
Repo: https://github.com/wesaphzt/privatelocation
Builds:
- versionName: '1.0'
versionCode: 1
commit: '1.0'
subdir: app
gradle:
- yes
- versionName: '1.1'
versionCode: 2
commit: '1.1'
subdir: app
gradle:
- yes
AutoUpdateMode: Version %v
UpdateCheckMode: Tags
CurrentVersion: '1.1'
CurrentVersionCode: 2
| Update Private Location to 1.1 (2) | Update Private Location to 1.1 (2)
| YAML | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroiddata | yaml | ## Code Before:
Categories:
- Development
- Navigation
- System
License: GPL-3.0-or-later
SourceCode: https://github.com/wesaphzt/privatelocation
IssueTracker: https://github.com/wesaphzt/privatelocation/issues
Bitcoin: 1GCkvAg9oG79niQTbh6EH9rPALQDXKyHKK
Litecoin: LV687s3wVdhmLZyJMFxomJHdHFXeFAKT5R
AutoName: Private Location
Description: |-
A simple app to set your location to anywhere in the world, and improve
general phone location privacy.
Apps on your phone can require location permissions to work, and make
repeated and unnecessary location requests in the background throughout
the day.
Setting your location somewhere else will help protect your privacy when
using these apps.
This is one of the few location apps that doesn't rely on Google Maps.
RepoType: git
Repo: https://github.com/wesaphzt/privatelocation
Builds:
- versionName: '1.0'
versionCode: 1
commit: '1.0'
subdir: app
gradle:
- yes
AutoUpdateMode: Version %v
UpdateCheckMode: Tags
CurrentVersion: '1.0'
CurrentVersionCode: 1
## Instruction:
Update Private Location to 1.1 (2)
## Code After:
Categories:
- Development
- Navigation
- System
License: GPL-3.0-or-later
SourceCode: https://github.com/wesaphzt/privatelocation
IssueTracker: https://github.com/wesaphzt/privatelocation/issues
Bitcoin: 1GCkvAg9oG79niQTbh6EH9rPALQDXKyHKK
Litecoin: LV687s3wVdhmLZyJMFxomJHdHFXeFAKT5R
AutoName: Private Location
Description: |-
A simple app to set your location to anywhere in the world, and improve
general phone location privacy.
Apps on your phone can require location permissions to work, and make
repeated and unnecessary location requests in the background throughout
the day.
Setting your location somewhere else will help protect your privacy when
using these apps.
This is one of the few location apps that doesn't rely on Google Maps.
RepoType: git
Repo: https://github.com/wesaphzt/privatelocation
Builds:
- versionName: '1.0'
versionCode: 1
commit: '1.0'
subdir: app
gradle:
- yes
- versionName: '1.1'
versionCode: 2
commit: '1.1'
subdir: app
gradle:
- yes
AutoUpdateMode: Version %v
UpdateCheckMode: Tags
CurrentVersion: '1.1'
CurrentVersionCode: 2
|
352afe3f55179b60e0cecdf3076742bb5ccb77ff | src/Http/Middleware/ForgetLocaleRouteParameter.php | src/Http/Middleware/ForgetLocaleRouteParameter.php | <?php
declare(strict_types=1);
namespace Cortex\Foundation\Http\Middleware;
use Closure;
class ForgetLocaleRouteParameter
{
/**
* Handle an incoming request.
*
* @param \Illuminate\Http\Request $request
* @param \Closure $next
*
* @return mixed
*/
public function handle($request, Closure $next)
{
// Bind {locale} route parameter
$request->route()->forgetParameter('locale');
return $next($request);
}
}
| <?php
declare(strict_types=1);
namespace Cortex\Foundation\Http\Middleware;
use Closure;
class ForgetLocaleRouteParameter
{
/**
* Handle an incoming request.
*
* @param \Illuminate\Http\Request $request
* @param \Closure $next
*
* @return mixed
*/
public function handle($request, Closure $next)
{
// unBind {locale} route parameter
! config('cortex.foundation.route.locale_prefix') || $request->route()->forgetParameter('locale');
return $next($request);
}
}
| Check config option before forgetting route parameters | Check config option before forgetting route parameters
| PHP | mit | rinvex/cortex-foundation | php | ## Code Before:
<?php
declare(strict_types=1);
namespace Cortex\Foundation\Http\Middleware;
use Closure;
class ForgetLocaleRouteParameter
{
/**
* Handle an incoming request.
*
* @param \Illuminate\Http\Request $request
* @param \Closure $next
*
* @return mixed
*/
public function handle($request, Closure $next)
{
// Bind {locale} route parameter
$request->route()->forgetParameter('locale');
return $next($request);
}
}
## Instruction:
Check config option before forgetting route parameters
## Code After:
<?php
declare(strict_types=1);
namespace Cortex\Foundation\Http\Middleware;
use Closure;
class ForgetLocaleRouteParameter
{
/**
* Handle an incoming request.
*
* @param \Illuminate\Http\Request $request
* @param \Closure $next
*
* @return mixed
*/
public function handle($request, Closure $next)
{
// unBind {locale} route parameter
! config('cortex.foundation.route.locale_prefix') || $request->route()->forgetParameter('locale');
return $next($request);
}
}
|
7842919b2af368c640363b4e4e05144049b111ba | ovp_core/emails.py | ovp_core/emails.py | from django.core.mail import EmailMultiAlternatives
from django.template import Context, Template
from django.template.loader import get_template
from django.conf import settings
import threading
class EmailThread(threading.Thread):
def __init__(self, msg):
self.msg = msg
threading.Thread.__init__(self)
def run (self):
return self.msg.send() > 0
class BaseMail:
"""
This class is responsible for firing emails
"""
from_email = ''
def __init__(self, user, async_mail=None):
self.user = user
self.async_mail = async_mail
def sendEmail(self, template_name, subject, context):
ctx = Context(context)
text_content = get_template('email/{}.txt'.format(template_name)).render(ctx)
html_content = get_template('email/{}.html'.format(template_name)).render(ctx)
msg = EmailMultiAlternatives(subject, text_content, self.from_email, [self.user.email])
msg.attach_alternative(text_content, "text/plain")
msg.attach_alternative(html_content, "text/html")
if self.async_mail:
async_flag="async"
else:
async_flag=getattr(settings, "DEFAULT_SEND_EMAIL", "async")
if async_flag == "async":
t = EmailThread(msg)
t.start()
return t
else:
return msg.send() > 0
| from django.core.mail import EmailMultiAlternatives
from django.template import Context, Template
from django.template.loader import get_template
from django.conf import settings
import threading
class EmailThread(threading.Thread):
def __init__(self, msg):
self.msg = msg
threading.Thread.__init__(self)
def run (self):
return self.msg.send() > 0
class BaseMail:
"""
This class is responsible for firing emails
"""
from_email = ''
def __init__(self, email_address, async_mail=None):
self.email_address = email_address
self.async_mail = async_mail
def sendEmail(self, template_name, subject, context):
ctx = Context(context)
text_content = get_template('email/{}.txt'.format(template_name)).render(ctx)
html_content = get_template('email/{}.html'.format(template_name)).render(ctx)
msg = EmailMultiAlternatives(subject, text_content, self.from_email, [self.email_address])
msg.attach_alternative(text_content, "text/plain")
msg.attach_alternative(html_content, "text/html")
if self.async_mail:
async_flag="async"
else:
async_flag=getattr(settings, "DEFAULT_SEND_EMAIL", "async")
if async_flag == "async":
t = EmailThread(msg)
t.start()
return t
else:
return msg.send() > 0
| Remove BaseMail dependency on User object | Remove BaseMail dependency on User object
| Python | agpl-3.0 | OpenVolunteeringPlatform/django-ovp-core,OpenVolunteeringPlatform/django-ovp-core | python | ## Code Before:
from django.core.mail import EmailMultiAlternatives
from django.template import Context, Template
from django.template.loader import get_template
from django.conf import settings
import threading
class EmailThread(threading.Thread):
def __init__(self, msg):
self.msg = msg
threading.Thread.__init__(self)
def run (self):
return self.msg.send() > 0
class BaseMail:
"""
This class is responsible for firing emails
"""
from_email = ''
def __init__(self, user, async_mail=None):
self.user = user
self.async_mail = async_mail
def sendEmail(self, template_name, subject, context):
ctx = Context(context)
text_content = get_template('email/{}.txt'.format(template_name)).render(ctx)
html_content = get_template('email/{}.html'.format(template_name)).render(ctx)
msg = EmailMultiAlternatives(subject, text_content, self.from_email, [self.user.email])
msg.attach_alternative(text_content, "text/plain")
msg.attach_alternative(html_content, "text/html")
if self.async_mail:
async_flag="async"
else:
async_flag=getattr(settings, "DEFAULT_SEND_EMAIL", "async")
if async_flag == "async":
t = EmailThread(msg)
t.start()
return t
else:
return msg.send() > 0
## Instruction:
Remove BaseMail dependency on User object
## Code After:
from django.core.mail import EmailMultiAlternatives
from django.template import Context, Template
from django.template.loader import get_template
from django.conf import settings
import threading
class EmailThread(threading.Thread):
def __init__(self, msg):
self.msg = msg
threading.Thread.__init__(self)
def run (self):
return self.msg.send() > 0
class BaseMail:
"""
This class is responsible for firing emails
"""
from_email = ''
def __init__(self, email_address, async_mail=None):
self.email_address = email_address
self.async_mail = async_mail
def sendEmail(self, template_name, subject, context):
ctx = Context(context)
text_content = get_template('email/{}.txt'.format(template_name)).render(ctx)
html_content = get_template('email/{}.html'.format(template_name)).render(ctx)
msg = EmailMultiAlternatives(subject, text_content, self.from_email, [self.email_address])
msg.attach_alternative(text_content, "text/plain")
msg.attach_alternative(html_content, "text/html")
if self.async_mail:
async_flag="async"
else:
async_flag=getattr(settings, "DEFAULT_SEND_EMAIL", "async")
if async_flag == "async":
t = EmailThread(msg)
t.start()
return t
else:
return msg.send() > 0
|
ec8b3381a9529bc76062014730ed626f73f97e55 | config/admin_users.yml | config/admin_users.yml | ---
- email: "alex.tomlins@digital.cabinet-office.gov.uk"
origin: "google"
- email: "andras.ferencz-szabo@digital.cabinet-office.gov.uk"
origin: "google"
- email: "chris.farmiloe@digital.cabinet-office.gov.uk"
origin: "google"
- email: "lee.porte@digital.cabinet-office.gov.uk"
origin: "google"
- email: "michael.mokrysz@digital.cabinet-office.gov.uk"
origin: "google"
- email: "rafal.proszowski@digital.cabinet-office.gov.uk"
origin: "google"
- email: "richard.towers@digital.cabinet-office.gov.uk"
origin: "google"
- email: "sam.crang@digital.cabinet-office.gov.uk"
origin: "google"
- email: "tom.whitwell@digital.cabinet-office.gov.uk"
origin: "google"
| ---
- email: "alex.tomlins@digital.cabinet-office.gov.uk"
origin: "google"
- email: "andras.ferencz-szabo@digital.cabinet-office.gov.uk"
origin: "google"
- email: "chris.farmiloe@digital.cabinet-office.gov.uk"
origin: "google"
- email: "lee.porte@digital.cabinet-office.gov.uk"
origin: "google"
- email: "michael.mokrysz@digital.cabinet-office.gov.uk"
origin: "google"
- email: "rafal.proszowski@digital.cabinet-office.gov.uk"
origin: "google"
- email: "richard.towers@digital.cabinet-office.gov.uk"
origin: "google"
- email: "russell.howe@digital.cabinet-office.gov.uk"
origin: "google"
- email: "sam.crang@digital.cabinet-office.gov.uk"
origin: "google"
- email: "tom.whitwell@digital.cabinet-office.gov.uk"
origin: "google"
| Add Russell to the list of CF admin users. | Add Russell to the list of CF admin users.
| YAML | mit | alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf | yaml | ## Code Before:
---
- email: "alex.tomlins@digital.cabinet-office.gov.uk"
origin: "google"
- email: "andras.ferencz-szabo@digital.cabinet-office.gov.uk"
origin: "google"
- email: "chris.farmiloe@digital.cabinet-office.gov.uk"
origin: "google"
- email: "lee.porte@digital.cabinet-office.gov.uk"
origin: "google"
- email: "michael.mokrysz@digital.cabinet-office.gov.uk"
origin: "google"
- email: "rafal.proszowski@digital.cabinet-office.gov.uk"
origin: "google"
- email: "richard.towers@digital.cabinet-office.gov.uk"
origin: "google"
- email: "sam.crang@digital.cabinet-office.gov.uk"
origin: "google"
- email: "tom.whitwell@digital.cabinet-office.gov.uk"
origin: "google"
## Instruction:
Add Russell to the list of CF admin users.
## Code After:
---
- email: "alex.tomlins@digital.cabinet-office.gov.uk"
origin: "google"
- email: "andras.ferencz-szabo@digital.cabinet-office.gov.uk"
origin: "google"
- email: "chris.farmiloe@digital.cabinet-office.gov.uk"
origin: "google"
- email: "lee.porte@digital.cabinet-office.gov.uk"
origin: "google"
- email: "michael.mokrysz@digital.cabinet-office.gov.uk"
origin: "google"
- email: "rafal.proszowski@digital.cabinet-office.gov.uk"
origin: "google"
- email: "richard.towers@digital.cabinet-office.gov.uk"
origin: "google"
- email: "russell.howe@digital.cabinet-office.gov.uk"
origin: "google"
- email: "sam.crang@digital.cabinet-office.gov.uk"
origin: "google"
- email: "tom.whitwell@digital.cabinet-office.gov.uk"
origin: "google"
|
68eceecffb4ea457fe92d623f5722fd25c09e718 | src/js/routers/app-router.js | src/js/routers/app-router.js | import * as Backbone from 'backbone';
import Items from '../collections/items';
import SearchBoxView from '../views/searchBox-view';
import SearchResultsView from '../views/searchResults-view';
import AppView from '../views/app-view';
import DocumentSet from '../helpers/search';
import dispatcher from '../helpers/dispatcher';
class AppRouter extends Backbone.Router {
get routes() {
return {
'': 'loadDefault',
'search/(?*queryString)': 'showSearchResults'
};
}
initialize() {
this.listenTo(dispatcher, 'router:go', this.go);
}
go(route) {
this.navigate(route, {trigger: true});
}
loadDefault() {
new AppView();
}
showSearchResults(queryString) {
let q = queryString.substring(2);
DocumentSet.search(q).then(function(result) {
let currentDocuments = new DocumentSet(result).documents;
let docCollection = new Items(currentDocuments);
new SearchResultsView({collection: docCollection}).render();
});
}
}
export default AppRouter;
| import * as Backbone from 'backbone';
import Items from '../collections/items';
import SearchBoxView from '../views/searchBox-view';
import SearchResultsView from '../views/searchResults-view';
import AppView from '../views/app-view';
import Events from '../helpers/backbone-events';
import DocumentSet from '../helpers/search';
class AppRouter extends Backbone.Router {
get routes() {
return {
'': 'loadDefault',
'search/(?*queryString)': 'showSearchResults'
};
}
initialize() {
this.listenTo(Events, 'router:go', this.go);
}
go(route) {
this.navigate(route, {trigger: true});
}
loadDefault() {
new AppView();
}
showSearchResults(queryString) {
let q = queryString.substring(2);
DocumentSet.search(q).then(function(result) {
let currentDocuments = new DocumentSet(result).documents;
let docCollection = new Items(currentDocuments);
new SearchResultsView({collection: docCollection}).render();
});
}
}
export default AppRouter;
| Use Events instead of custom dispatcher (more) | Use Events instead of custom dispatcher (more)
| JavaScript | mit | trevormunoz/katherine-anne,trevormunoz/katherine-anne,trevormunoz/katherine-anne | javascript | ## Code Before:
import * as Backbone from 'backbone';
import Items from '../collections/items';
import SearchBoxView from '../views/searchBox-view';
import SearchResultsView from '../views/searchResults-view';
import AppView from '../views/app-view';
import DocumentSet from '../helpers/search';
import dispatcher from '../helpers/dispatcher';
class AppRouter extends Backbone.Router {
get routes() {
return {
'': 'loadDefault',
'search/(?*queryString)': 'showSearchResults'
};
}
initialize() {
this.listenTo(dispatcher, 'router:go', this.go);
}
go(route) {
this.navigate(route, {trigger: true});
}
loadDefault() {
new AppView();
}
showSearchResults(queryString) {
let q = queryString.substring(2);
DocumentSet.search(q).then(function(result) {
let currentDocuments = new DocumentSet(result).documents;
let docCollection = new Items(currentDocuments);
new SearchResultsView({collection: docCollection}).render();
});
}
}
export default AppRouter;
## Instruction:
Use Events instead of custom dispatcher (more)
## Code After:
import * as Backbone from 'backbone';
import Items from '../collections/items';
import SearchBoxView from '../views/searchBox-view';
import SearchResultsView from '../views/searchResults-view';
import AppView from '../views/app-view';
import Events from '../helpers/backbone-events';
import DocumentSet from '../helpers/search';
class AppRouter extends Backbone.Router {
get routes() {
return {
'': 'loadDefault',
'search/(?*queryString)': 'showSearchResults'
};
}
initialize() {
this.listenTo(Events, 'router:go', this.go);
}
go(route) {
this.navigate(route, {trigger: true});
}
loadDefault() {
new AppView();
}
showSearchResults(queryString) {
let q = queryString.substring(2);
DocumentSet.search(q).then(function(result) {
let currentDocuments = new DocumentSet(result).documents;
let docCollection = new Items(currentDocuments);
new SearchResultsView({collection: docCollection}).render();
});
}
}
export default AppRouter;
|
386d74d660c347e2f3ae70b11caf51c1a63d1ebf | appveyor.yml | appveyor.yml | version: '1.0.{build}'
configuration:
- Release
platform: Any CPU
branches:
only:
- master
environment:
# Don't report back to the mothership
DOTNET_CLI_TELEMETRY_OPTOUT: 1
init:
- ps: $Env:LABEL = "CI" + $Env:APPVEYOR_BUILD_NUMBER.PadLeft(5, "0")
before_build:
- appveyor-retry dotnet restore -v Minimal
assembly_info:
patch: true
file: AssemblyInfo.cs
assembly_version: '{version}'
assembly_file_version: '{version}'
assembly_informational_version: '{version}'
build_script:
- ps: src\dotnet-make\update_version.ps1
- dotnet build "src\dotnet-make" -c %CONFIGURATION% --no-dependencies --version-suffix %LABEL%
after_build:
- dotnet pack "src\dotnet-make" -c Release
test_script:
artifacts:
- path: '**\*.nupkg'
name: NuGet package
deploy:
provider: NuGet
api_key:
secure: 2yGf9wWYVIETYfooKme+gT7pKIbw94M2iwnMBZLlsZR5abhhBp6xpmsaIrodL1eC
skip_symbols: true
artifact: /.*\.nupkg/
cache:
- '%USERPROFILE%\.nuget\packages'
on_finish:
| version: '1.0.{build}'
configuration:
- Release
platform: Any CPU
branches:
only:
- master
environment:
# Don't report back to the mothership
DOTNET_CLI_TELEMETRY_OPTOUT: 1
init:
- ps: $Env:LABEL = "CI" + $Env:APPVEYOR_BUILD_NUMBER.PadLeft(5, "0")
before_build:
- appveyor-retry dotnet restore -v Minimal
assembly_info:
patch: true
file: AssemblyInfo.cs
assembly_version: '{version}'
assembly_file_version: '{version}'
assembly_informational_version: '{version}'
build_script:
- ps: C:\Projects\dotnet-make\src\dotnet-make\update_version.ps1
- dotnet build "src\dotnet-make" -c %CONFIGURATION% --no-dependencies --version-suffix %LABEL%
after_build:
- dotnet pack "src\dotnet-make" -c Release
test_script:
artifacts:
- path: '**\*.nupkg'
name: NuGet package
deploy:
provider: NuGet
api_key:
secure: 2yGf9wWYVIETYfooKme+gT7pKIbw94M2iwnMBZLlsZR5abhhBp6xpmsaIrodL1eC
skip_symbols: true
artifact: /.*\.nupkg/
cache:
- '%USERPROFILE%\.nuget\packages'
on_finish:
| Fix - Run powershell script. | Fix - Run powershell script.
| YAML | apache-2.0 | springcomp/dotnet-make | yaml | ## Code Before:
version: '1.0.{build}'
configuration:
- Release
platform: Any CPU
branches:
only:
- master
environment:
# Don't report back to the mothership
DOTNET_CLI_TELEMETRY_OPTOUT: 1
init:
- ps: $Env:LABEL = "CI" + $Env:APPVEYOR_BUILD_NUMBER.PadLeft(5, "0")
before_build:
- appveyor-retry dotnet restore -v Minimal
assembly_info:
patch: true
file: AssemblyInfo.cs
assembly_version: '{version}'
assembly_file_version: '{version}'
assembly_informational_version: '{version}'
build_script:
- ps: src\dotnet-make\update_version.ps1
- dotnet build "src\dotnet-make" -c %CONFIGURATION% --no-dependencies --version-suffix %LABEL%
after_build:
- dotnet pack "src\dotnet-make" -c Release
test_script:
artifacts:
- path: '**\*.nupkg'
name: NuGet package
deploy:
provider: NuGet
api_key:
secure: 2yGf9wWYVIETYfooKme+gT7pKIbw94M2iwnMBZLlsZR5abhhBp6xpmsaIrodL1eC
skip_symbols: true
artifact: /.*\.nupkg/
cache:
- '%USERPROFILE%\.nuget\packages'
on_finish:
## Instruction:
Fix - Run powershell script.
## Code After:
version: '1.0.{build}'
configuration:
- Release
platform: Any CPU
branches:
only:
- master
environment:
# Don't report back to the mothership
DOTNET_CLI_TELEMETRY_OPTOUT: 1
init:
- ps: $Env:LABEL = "CI" + $Env:APPVEYOR_BUILD_NUMBER.PadLeft(5, "0")
before_build:
- appveyor-retry dotnet restore -v Minimal
assembly_info:
patch: true
file: AssemblyInfo.cs
assembly_version: '{version}'
assembly_file_version: '{version}'
assembly_informational_version: '{version}'
build_script:
- ps: C:\Projects\dotnet-make\src\dotnet-make\update_version.ps1
- dotnet build "src\dotnet-make" -c %CONFIGURATION% --no-dependencies --version-suffix %LABEL%
after_build:
- dotnet pack "src\dotnet-make" -c Release
test_script:
artifacts:
- path: '**\*.nupkg'
name: NuGet package
deploy:
provider: NuGet
api_key:
secure: 2yGf9wWYVIETYfooKme+gT7pKIbw94M2iwnMBZLlsZR5abhhBp6xpmsaIrodL1eC
skip_symbols: true
artifact: /.*\.nupkg/
cache:
- '%USERPROFILE%\.nuget\packages'
on_finish:
|
4dabda647bdd6180c724594953306cc4dc2d8c84 | _plugins/asset_img_size.rb | _plugins/asset_img_size.rb | module Jekyll
module Filter
def asset_img_size(input)
file = input.to_s.split('/').last.split('.').first
if file.include? '_'
file = file.split('_').last
if file.include? 'x'
file = file.to_s.split('x')
file.map! { |i| if i == '' then '100%' else i + 'px' end }
if file.size < 2 then file.push('100%') end
else
nil
end
else
nil
end
end
end
end
Liquid::Template.register_filter(Jekyll::Filter)
| module Jekyll
module Filter
def asset_img_size(input)
file = input.to_s.split('/').last.split('.').first
if file.include? '_'
file = file.split('_').last
if file.include? 'x'
file = file.to_s.split('x')
file.map! { |i| if i == '' then '100%' else i + 'px' end }
if file.size < 2 then file.push('100%') end
return file
else
nil
end
else
nil
end
end
end
end
Liquid::Template.register_filter(Jekyll::Filter)
 | Fix return point issue when using widthx syntax | Fix return point issue when using widthx syntax
| Ruby | mit | rosa89n20/jekyll-asset_img_size | ruby | ## Code Before:
module Jekyll
module Filter
def asset_img_size(input)
file = input.to_s.split('/').last.split('.').first
if file.include? '_'
file = file.split('_').last
if file.include? 'x'
file = file.to_s.split('x')
file.map! { |i| if i == '' then '100%' else i + 'px' end }
if file.size < 2 then file.push('100%') end
else
nil
end
else
nil
end
end
end
end
Liquid::Template.register_filter(Jekyll::Filter)
## Instruction:
Fix return point issue when using widthx syntax
## Code After:
module Jekyll
module Filter
def asset_img_size(input)
file = input.to_s.split('/').last.split('.').first
if file.include? '_'
file = file.split('_').last
if file.include? 'x'
file = file.to_s.split('x')
file.map! { |i| if i == '' then '100%' else i + 'px' end }
if file.size < 2 then file.push('100%') end
return file
else
nil
end
else
nil
end
end
end
end
Liquid::Template.register_filter(Jekyll::Filter)
|
6af4563d934fa55ab102fa1b9c3cea1b434771ab | src/less/print.less | src/less/print.less | @media print {
.header h3, nav, footer {
display: none;
}
nav > div {
display: none;
}
.navbar-header {
display: none;
}
body {
font: 12pt Georgia, "Times New Roman", Times, serif;
line-height: 1.3;
}
h1 {
font-size: 24pt;
}
h2 {
font-size: 14pt;
margin-top: 25px;
}
aside h2 {
font-size: 18pt;
}
ul {
margin: 0;
}
li {
content: "» ";
}
/*@page :left {
margin: 0.4cm;
}
@page :right {
margin: 0.4cm;
}*/
}
| @media print {
nav, footer {
display: none;
}
nav > div {
display: none;
}
.navbar-header {
display: none;
}
.action-btn {
display: none;
}
span > button {
display: none;
}
body {
font: 12pt Georgia, "Times New Roman", Times, serif;
line-height: 1.3;
}
h1 {
font-size: 24pt;
}
h2 {
font-size: 14pt;
margin-top: 25px;
}
h3 {
font-size: 24pt;
margin-top: 0px;
}
h3 span {
font-size: 26pt;
}
aside h2 {
font-size: 18pt;
}
ul {
margin: 0;
}
li {
content: "» ";
}
/*@page :left {
margin: 0.4cm;
}
@page :right {
margin: 0.4cm;
}*/
}
 | Check pointing adding print stylesheet for facility detailed view | Check pointing adding print stylesheet for facility detailed view
| Less | mit | MasterFacilityList/mfl_web,urandu/mfl_web,urandu/mfl_web,MasterFacilityList/mfl_web,MasterFacilityList/mfl_web,MasterFacilityList/mfl_web | less | ## Code Before:
@media print {
.header h3, nav, footer {
display: none;
}
nav > div {
display: none;
}
.navbar-header {
display: none;
}
body {
font: 12pt Georgia, "Times New Roman", Times, serif;
line-height: 1.3;
}
h1 {
font-size: 24pt;
}
h2 {
font-size: 14pt;
margin-top: 25px;
}
aside h2 {
font-size: 18pt;
}
ul {
margin: 0;
}
li {
content: "» ";
}
/*@page :left {
margin: 0.4cm;
}
@page :right {
margin: 0.4cm;
}*/
}
## Instruction:
Check pointing adding print stylesheet for facility detailed view
## Code After:
@media print {
nav, footer {
display: none;
}
nav > div {
display: none;
}
.navbar-header {
display: none;
}
.action-btn {
display: none;
}
span > button {
display: none;
}
body {
font: 12pt Georgia, "Times New Roman", Times, serif;
line-height: 1.3;
}
h1 {
font-size: 24pt;
}
h2 {
font-size: 14pt;
margin-top: 25px;
}
h3 {
font-size: 24pt;
margin-top: 0px;
}
h3 span {
font-size: 26pt;
}
aside h2 {
font-size: 18pt;
}
ul {
margin: 0;
}
li {
content: "» ";
}
/*@page :left {
margin: 0.4cm;
}
@page :right {
margin: 0.4cm;
}*/
}
|
b5b35ffa5169946958f25f31aa040d8e3d8291de | cico_do_build_che.sh | cico_do_build_che.sh |
. config
mvnche() {
which scl 2>/dev/null
if [ $? -eq 0 ]
then
if [ `scl -l 2> /dev/null | grep rh-maven33` != "" ]
then
scl enable rh-maven33 rh-nodejs4 "mvn $*"
else
mvn $*
fi
else
mvn $*
fi
}
mkdir $NPM_CONFIG_PREFIX 2>/dev/null
mvnche -B $* install -U
if [ $? -ne 0 ]; then
echo "Error building che/rh-che with dashboard"
exit 1;
fi
if [ "$DeveloperBuild" != "true" ]
then
mvnche -B -P'!checkout-base-che' -DwithoutDashboard $* install -U
if [ $? -ne 0 ]; then
echo "Error building che/rh-che without dashboard"
exit 1;
fi
fi
|
. config
mvnche() {
which scl 2>/dev/null
if [ $? -eq 0 ]
then
if [ `scl -l 2> /dev/null | grep rh-maven33` != "" ]
then
# gulp-cli is needed to build the dashboard
scl enable rh-nodejs4 "npm install --global gulp-cli"
scl enable rh-maven33 rh-nodejs4 "mvn $*"
else
mvn $*
fi
else
mvn $*
fi
}
mkdir $NPM_CONFIG_PREFIX 2>/dev/null
mvnche -B $* install -U
if [ $? -ne 0 ]; then
echo "Error building che/rh-che with dashboard"
exit 1;
fi
if [ "$DeveloperBuild" != "true" ]
then
mvnche -B -P'!checkout-base-che' -DwithoutDashboard $* install -U
if [ $? -ne 0 ]; then
echo "Error building che/rh-che without dashboard"
exit 1;
fi
fi
| Fix build error because gulp could not be found | Fix build error because gulp could not be found | Shell | epl-1.0 | sunix/rh-che,sunix/rh-che,sunix/rh-che,sunix/rh-che | shell | ## Code Before:
. config
mvnche() {
which scl 2>/dev/null
if [ $? -eq 0 ]
then
if [ `scl -l 2> /dev/null | grep rh-maven33` != "" ]
then
scl enable rh-maven33 rh-nodejs4 "mvn $*"
else
mvn $*
fi
else
mvn $*
fi
}
mkdir $NPM_CONFIG_PREFIX 2>/dev/null
mvnche -B $* install -U
if [ $? -ne 0 ]; then
echo "Error building che/rh-che with dashboard"
exit 1;
fi
if [ "$DeveloperBuild" != "true" ]
then
mvnche -B -P'!checkout-base-che' -DwithoutDashboard $* install -U
if [ $? -ne 0 ]; then
echo "Error building che/rh-che without dashboard"
exit 1;
fi
fi
## Instruction:
Fix build error because gulp could not be found
## Code After:
. config
mvnche() {
which scl 2>/dev/null
if [ $? -eq 0 ]
then
if [ `scl -l 2> /dev/null | grep rh-maven33` != "" ]
then
# gulp-cli is needed to build the dashboard
scl enable rh-nodejs4 "npm install --global gulp-cli"
scl enable rh-maven33 rh-nodejs4 "mvn $*"
else
mvn $*
fi
else
mvn $*
fi
}
mkdir $NPM_CONFIG_PREFIX 2>/dev/null
mvnche -B $* install -U
if [ $? -ne 0 ]; then
echo "Error building che/rh-che with dashboard"
exit 1;
fi
if [ "$DeveloperBuild" != "true" ]
then
mvnche -B -P'!checkout-base-che' -DwithoutDashboard $* install -U
if [ $? -ne 0 ]; then
echo "Error building che/rh-che without dashboard"
exit 1;
fi
fi
|
1d870bb6dee02f98a84836f5f989c84dfec6dbb3 | .travis.yml | .travis.yml | language: csharp
solution: ./src/FluentValidation.Validators.UnitTestExtension.sln
script:
- xbuild /p:Configuration=Release ./src/FluentValidation.Validators.UnitTestExtension.sln
- mono ./src/packages/xunit.runner.console.*/tools/xunit.console.exe ./src/FluentValidation.Validators.UnitTestExtension.Tests/bin/Release/FluentValidation.Validators.UnitTestExtension.Tests.dll | language: csharp
solution: ./src/FluentValidation.Validators.UnitTestExtension.sln
script:
- xbuild /p:Configuration=Release ./src/FluentValidation.Validators.UnitTestExtension.sln
- mono ./src/packages/xunit.runner.console.*/tools/xunit.console.exe ./src/FluentValidation.Validators.UnitTestExtension.Tests/bin/Release/FluentValidation.Validators.UnitTestExtension.Tests.dll
language: csharp
solution: ./src/FluentValidation.Validators.UnitTestExtension.NuGet.Tests.sln
script:
- xbuild /p:Configuration=Release ./src/FluentValidation.Validators.UnitTestExtension.NuGet.Tests.sln
- mono ./src/packages/xunit.runner.console.*/tools/xunit.console.exe ./src/FluentValidation.Validators.UnitTestExtension.NuGet.Tests/bin/Release/FluentValidation.Validators.UnitTestExtension.NuGet.Tests.dll | Change build script - added NuGet dll tests | Change build script - added NuGet dll tests
| YAML | mit | MichalJankowskii/FluentValidation.Validators.UnitTestExtension | yaml | ## Code Before:
language: csharp
solution: ./src/FluentValidation.Validators.UnitTestExtension.sln
script:
- xbuild /p:Configuration=Release ./src/FluentValidation.Validators.UnitTestExtension.sln
- mono ./src/packages/xunit.runner.console.*/tools/xunit.console.exe ./src/FluentValidation.Validators.UnitTestExtension.Tests/bin/Release/FluentValidation.Validators.UnitTestExtension.Tests.dll
## Instruction:
Change build script - added NuGet dll tests
## Code After:
language: csharp
solution: ./src/FluentValidation.Validators.UnitTestExtension.sln
script:
- xbuild /p:Configuration=Release ./src/FluentValidation.Validators.UnitTestExtension.sln
- mono ./src/packages/xunit.runner.console.*/tools/xunit.console.exe ./src/FluentValidation.Validators.UnitTestExtension.Tests/bin/Release/FluentValidation.Validators.UnitTestExtension.Tests.dll
language: csharp
solution: ./src/FluentValidation.Validators.UnitTestExtension.NuGet.Tests.sln
script:
- xbuild /p:Configuration=Release ./src/FluentValidation.Validators.UnitTestExtension.NuGet.Tests.sln
- mono ./src/packages/xunit.runner.console.*/tools/xunit.console.exe ./src/FluentValidation.Validators.UnitTestExtension.NuGet.Tests/bin/Release/FluentValidation.Validators.UnitTestExtension.NuGet.Tests.dll |
3e7fb60e9a22ad03f70bde63b43ecf5c661295c7 | src/client/app/components/toasts/toasts.template.js | src/client/app/components/toasts/toasts.template.js | import { Util } from "../../util.js";
export class ToastsTemplate {
static update(render, { message = "" } = {}) {
const now = Util.formatDate(new Date());
if (message) {
setTimeout(() => {
render`<span class="bg-yellow">${now} ${message}.</span>`;
setTimeout(() => {
render`<span>${now} Ready.</span>`;
}, 10000);
}, 1000);
} else {
render`<span>${now} Ready.</span>`;
}
}
}
| import { Util } from "../../util.js";
export class ToastsTemplate {
static update(render, { message = "" } = {}) {
if (!message) {
return;
}
const toasts = ToastsTemplate.toasts;
toasts.push(message);
if (ToastsTemplate.isRunnning) {
return;
}
ToastsTemplate.run(render, toasts, () => {
ToastsTemplate.isRunnning = false;
ToastsTemplate.renderReady(render);
});
}
static run(render, toasts, callback) {
ToastsTemplate.isRunnning = true;
if (toasts.length > 0) {
ToastsTemplate.renderToast(render, toasts.shift());
setTimeout(() => {
ToastsTemplate.run(render, toasts, callback);
}, 2000);
} else {
return callback();
}
return true;
}
static renderReady(render) {
const now = Util.formatDate(new Date());
console.warn("Ready");
render`<span>${now} Ready.</span>`;
}
static renderToast(render, message) {
const now = Util.formatDate(new Date());
console.warn(message);
render`<span class="bg-yellow">${now} ${message}.</span>`;
}
}
ToastsTemplate.toasts = [];
ToastsTemplate.isRunnning = false;
| Add queue to the toasts. | Add queue to the toasts.
| JavaScript | mit | albertosantini/node-conpa,albertosantini/node-conpa | javascript | ## Code Before:
import { Util } from "../../util.js";
export class ToastsTemplate {
static update(render, { message = "" } = {}) {
const now = Util.formatDate(new Date());
if (message) {
setTimeout(() => {
render`<span class="bg-yellow">${now} ${message}.</span>`;
setTimeout(() => {
render`<span>${now} Ready.</span>`;
}, 10000);
}, 1000);
} else {
render`<span>${now} Ready.</span>`;
}
}
}
## Instruction:
Add queue to the toasts.
## Code After:
import { Util } from "../../util.js";
export class ToastsTemplate {
static update(render, { message = "" } = {}) {
if (!message) {
return;
}
const toasts = ToastsTemplate.toasts;
toasts.push(message);
if (ToastsTemplate.isRunnning) {
return;
}
ToastsTemplate.run(render, toasts, () => {
ToastsTemplate.isRunnning = false;
ToastsTemplate.renderReady(render);
});
}
static run(render, toasts, callback) {
ToastsTemplate.isRunnning = true;
if (toasts.length > 0) {
ToastsTemplate.renderToast(render, toasts.shift());
setTimeout(() => {
ToastsTemplate.run(render, toasts, callback);
}, 2000);
} else {
return callback();
}
return true;
}
static renderReady(render) {
const now = Util.formatDate(new Date());
console.warn("Ready");
render`<span>${now} Ready.</span>`;
}
static renderToast(render, message) {
const now = Util.formatDate(new Date());
console.warn(message);
render`<span class="bg-yellow">${now} ${message}.</span>`;
}
}
ToastsTemplate.toasts = [];
ToastsTemplate.isRunnning = false;
|
1f3164f95f0ce40bac38ac384bf5fdd181ab5fa1 | importlib_metadata/__init__.py | importlib_metadata/__init__.py | from .api import (
Distribution, PackageNotFoundError, distribution, distributions,
entry_points, files, metadata, requires, version)
# Import for installation side-effects.
from . import _hooks # noqa: F401
__all__ = [
'Distribution',
'PackageNotFoundError',
'distribution',
'distributions',
'entry_points',
'files',
'metadata',
'requires',
'version',
]
__version__ = version(__name__)
| from .api import (
Distribution, PackageNotFoundError, distribution, distributions,
entry_points, files, metadata, requires, version)
# Import for installation side-effects.
__import__('importlib_metadata._hooks')
__all__ = [
'Distribution',
'PackageNotFoundError',
'distribution',
'distributions',
'entry_points',
'files',
'metadata',
'requires',
'version',
]
__version__ = version(__name__)
 | Use imperative import to avoid lint (import order) and as a good convention when side-effects are the intention. | Use imperative import to avoid lint (import order) and as a good convention when side-effects are the intention.
| Python | apache-2.0 | python/importlib_metadata | python | ## Code Before:
from .api import (
Distribution, PackageNotFoundError, distribution, distributions,
entry_points, files, metadata, requires, version)
# Import for installation side-effects.
from . import _hooks # noqa: F401
__all__ = [
'Distribution',
'PackageNotFoundError',
'distribution',
'distributions',
'entry_points',
'files',
'metadata',
'requires',
'version',
]
__version__ = version(__name__)
## Instruction:
Use imperative import to avoid lint (import order) and as a good convention when side-effects are the intention.
## Code After:
from .api import (
Distribution, PackageNotFoundError, distribution, distributions,
entry_points, files, metadata, requires, version)
# Import for installation side-effects.
__import__('importlib_metadata._hooks')
__all__ = [
'Distribution',
'PackageNotFoundError',
'distribution',
'distributions',
'entry_points',
'files',
'metadata',
'requires',
'version',
]
__version__ = version(__name__)
|
6dd975a1ac18788efd21d6b546dd90ef001f4bc7 | .travis.yml | .travis.yml | notifications:
email: false
language: node_js
node_js:
- "6"
script:
- "NODE_ENV=test npm run-script unit-tests"
- "npm run-script style"
cache:
- apt: true
- directories:
- "$HOME/.npm"
- "$HOME/.electron"
- "/Library/Caches/Homebrew"
addons:
apt:
packages:
- xvfb
before_install:
- "npm config set spin false"
install:
- export DISPLAY=':99.0'
- Xvfb :99 -screen 0 1024x768x24 > /dev/null 2>&1 &
- npm install
| notifications:
email: false
language: node_js
node_js:
- "6"
script:
- "NODE_ENV=test npm run-script unit-tests"
- "npm run-script style"
cache:
- apt: true
- directories:
- "$HOME/.npm"
- "$HOME/.electron"
- "/Library/Caches/Homebrew"
env:
- CC=clang CXX=clang++ npm_config_clang=1
addons:
apt:
packages:
- xvfb
- gnome-keyring
- libgnome-keyring-dev
before_install:
- "npm config set spin false"
install:
- export DISPLAY=':99.0'
- Xvfb :99 -screen 0 1024x768x24 > /dev/null 2>&1 &
- npm install
| Update Travis config for keytar | Update Travis config for keytar
| YAML | mit | cheshire137/gh-notifications-snoozer,cheshire137/gh-notifications-snoozer | yaml | ## Code Before:
notifications:
email: false
language: node_js
node_js:
- "6"
script:
- "NODE_ENV=test npm run-script unit-tests"
- "npm run-script style"
cache:
- apt: true
- directories:
- "$HOME/.npm"
- "$HOME/.electron"
- "/Library/Caches/Homebrew"
addons:
apt:
packages:
- xvfb
before_install:
- "npm config set spin false"
install:
- export DISPLAY=':99.0'
- Xvfb :99 -screen 0 1024x768x24 > /dev/null 2>&1 &
- npm install
## Instruction:
Update Travis config for keytar
## Code After:
notifications:
email: false
language: node_js
node_js:
- "6"
script:
- "NODE_ENV=test npm run-script unit-tests"
- "npm run-script style"
cache:
- apt: true
- directories:
- "$HOME/.npm"
- "$HOME/.electron"
- "/Library/Caches/Homebrew"
env:
- CC=clang CXX=clang++ npm_config_clang=1
addons:
apt:
packages:
- xvfb
- gnome-keyring
- libgnome-keyring-dev
before_install:
- "npm config set spin false"
install:
- export DISPLAY=':99.0'
- Xvfb :99 -screen 0 1024x768x24 > /dev/null 2>&1 &
- npm install
|
8e0d9ed317239a880f77bdd5ed6c2e45ed862f39 | server/APIv1/follows/getUserFollowState.sql | server/APIv1/follows/getUserFollowState.sql | /**
* input
* follower: user_id<Number>
* target: user_id<Number>
*
* returns:
* | followcount | togglefollow |
* |-------------|--------------|
* | number | boolean |
*/
WITH user_of_interest AS (
SELECT
follower
FROM
followers_users_users
WHERE
target = ${target}
)
SELECT
count(*) AS followCount,
(EXISTS
(SELECT
1
FROM
user_of_interest
WHERE
follower = ${follower})
) AS toggleFollow
FROM
user_of_interest;
| /**
* input
* targetID: user_id<Number>
* userID: user_id<Number>
*
* returns:
* | followcount | togglefollow |
* |-------------|--------------|
* | number | boolean |
*/
WITH user_of_interest AS (
SELECT
follower
FROM
followers_users_users
WHERE
target = ${targetID}
)
SELECT
count(*) AS followCount,
(EXISTS
(SELECT
1
FROM
user_of_interest
WHERE
follower = ${userID})
) AS toggleFollow
FROM
user_of_interest;
| Update sql query for removing a user follow | Update sql query for removing a user follow
| SQL | mit | forkful/forkful,rompingstalactite/rompingstalactite,forkful/forkful,forkful/forkful,rompingstalactite/rompingstalactite,rompingstalactite/rompingstalactite | sql | ## Code Before:
/**
* input
* follower: user_id<Number>
* target: user_id<Number>
*
* returns:
* | followcount | togglefollow |
* |-------------|--------------|
* | number | boolean |
*/
WITH user_of_interest AS (
SELECT
follower
FROM
followers_users_users
WHERE
target = ${target}
)
SELECT
count(*) AS followCount,
(EXISTS
(SELECT
1
FROM
user_of_interest
WHERE
follower = ${follower})
) AS toggleFollow
FROM
user_of_interest;
## Instruction:
Update sql query for removing a user follow
## Code After:
/**
* input
* targetID: user_id<Number>
* userID: user_id<Number>
*
* returns:
* | followcount | togglefollow |
* |-------------|--------------|
* | number | boolean |
*/
WITH user_of_interest AS (
SELECT
follower
FROM
followers_users_users
WHERE
target = ${targetID}
)
SELECT
count(*) AS followCount,
(EXISTS
(SELECT
1
FROM
user_of_interest
WHERE
follower = ${userID})
) AS toggleFollow
FROM
user_of_interest;
|
e979d4a5deeac020477d565b97d68ff4b07c1aca | spec/receiver/pop3_spec.rb | spec/receiver/pop3_spec.rb | require File.expand_path(File.join(File.dirname(__FILE__), '..', '/spec_helper'))
describe 'POP3 receiver' do
before do
@connection = mock('POP3 Connection')
@connection.stub!(:start).and_return(true)
@receiver_options = { :username => 'user',
:password => 'pass',
:connection => @connection }
@receiver = Mailman::Receiver::POP3.new(@receiver_options)
end
describe 'connecting' do
it 'should connect to a POP3 server' do
@receiver.connect.should be_true
end
end
end
| require File.expand_path(File.join(File.dirname(__FILE__), '..', '/spec_helper'))
describe 'POP3 receiver' do
before do
@receiver_options = { :username => 'user',
:password => 'pass',
:connection => MockPOP3.new }
@receiver = Mailman::Receiver::POP3.new(@receiver_options)
end
describe 'connecting' do
it 'should connect to a POP3 server' do
@receiver.connect.should be_true
end
end
end
class MockPOP3
def start(account, password)
return self if account == 'user' && password == 'pass'
end
end
| Change how POP3 spec mocks | Change how POP3 spec mocks
| Ruby | mit | sr-education/mailman,mailman/mailman | ruby | ## Code Before:
require File.expand_path(File.join(File.dirname(__FILE__), '..', '/spec_helper'))
describe 'POP3 receiver' do
before do
@connection = mock('POP3 Connection')
@connection.stub!(:start).and_return(true)
@receiver_options = { :username => 'user',
:password => 'pass',
:connection => @connection }
@receiver = Mailman::Receiver::POP3.new(@receiver_options)
end
describe 'connecting' do
it 'should connect to a POP3 server' do
@receiver.connect.should be_true
end
end
end
## Instruction:
Change how POP3 spec mocks
## Code After:
require File.expand_path(File.join(File.dirname(__FILE__), '..', '/spec_helper'))
describe 'POP3 receiver' do
before do
@receiver_options = { :username => 'user',
:password => 'pass',
:connection => MockPOP3.new }
@receiver = Mailman::Receiver::POP3.new(@receiver_options)
end
describe 'connecting' do
it 'should connect to a POP3 server' do
@receiver.connect.should be_true
end
end
end
class MockPOP3
def start(account, password)
return self if account == 'user' && password == 'pass'
end
end
|
097fdf38dbd59c3c29b86cbc1b4f1415f45b9a92 | docs/configuration-directives/WSGIRestrictEmbedded.rst | docs/configuration-directives/WSGIRestrictEmbedded.rst | ====================
WSGIRestrictEmbedded
====================
:Description: Enable restrictions on use of embedded mode.
:Syntax: ``WSGIRestrictEmbedded On|Off``
:Default: ``WSGIRestrictEmbedded Off``
:Context: server config
The WSGIRestrictEmbedded directive determines whether mod_wsgi embedded
mode is enabled or not. If set to 'On' and the restriction on embedded mode
is therefore enabled, any attempt to make a request against a WSGI
application which hasn't been properly configured so as to be delegated to
a daemon mode process will fail with a HTTP internal server error response.
This option does not exist on Windows, or Apache 1.3 or any other
configuration where daemon mode is not available.
| ====================
WSGIRestrictEmbedded
====================
:Description: Enable restrictions on use of embedded mode.
:Syntax: ``WSGIRestrictEmbedded On|Off``
:Default: ``WSGIRestrictEmbedded Off``
:Context: server config
The WSGIRestrictEmbedded directive determines whether mod_wsgi embedded
mode is enabled or not. If set to 'On' and the restriction on embedded mode
is therefore enabled, any attempt to make a request against a WSGI
application which hasn't been properly configured so as to be delegated to
a daemon mode process will fail with a HTTP internal server error response.
For historical reasons and to maintain backward compatibility with old
configurations this option is 'Off' by default. As daemon mode is the
preferred deployment method, it is good practice to override the default
and set this to 'On', ensuring you have set up and are always using daemon
mode.
This option does not exist on Windows or any other configuration where
daemon mode is not available.
| Improve warning on restricting use of embedded mode. | Improve warning on restricting use of embedded mode.
| reStructuredText | apache-2.0 | GrahamDumpleton/mod_wsgi,GrahamDumpleton/mod_wsgi,GrahamDumpleton/mod_wsgi | restructuredtext | ## Code Before:
====================
WSGIRestrictEmbedded
====================
:Description: Enable restrictions on use of embedded mode.
:Syntax: ``WSGIRestrictEmbedded On|Off``
:Default: ``WSGIRestrictEmbedded Off``
:Context: server config
The WSGIRestrictEmbedded directive determines whether mod_wsgi embedded
mode is enabled or not. If set to 'On' and the restriction on embedded mode
is therefore enabled, any attempt to make a request against a WSGI
application which hasn't been properly configured so as to be delegated to
a daemon mode process will fail with a HTTP internal server error response.
This option does not exist on Windows, or Apache 1.3 or any other
configuration where daemon mode is not available.
## Instruction:
Improve warning on restricting use of embedded mode.
## Code After:
====================
WSGIRestrictEmbedded
====================
:Description: Enable restrictions on use of embedded mode.
:Syntax: ``WSGIRestrictEmbedded On|Off``
:Default: ``WSGIRestrictEmbedded Off``
:Context: server config
The WSGIRestrictEmbedded directive determines whether mod_wsgi embedded
mode is enabled or not. If set to 'On' and the restriction on embedded mode
is therefore enabled, any attempt to make a request against a WSGI
application which hasn't been properly configured so as to be delegated to
a daemon mode process will fail with a HTTP internal server error response.
For historical reasons and to maintain backward compatibility with old
configurations this option is 'Off' by default. As daemon mode is the
preferred deployment method, it is good practice to override the default
and set this to 'On', ensuring you have set up and are always using daemon
mode.
This option does not exist on Windows or any other configuration where
daemon mode is not available.
|
abf499e6cb1cb1636693600af68c45ce05ae14cd | code/folder/MicroPostActivityFilter.php | code/folder/MicroPostActivityFilter.php | <?php
/**
*
*
* @author <marcus@silverstripe.com.au>
* @license BSD License http://www.silverstripe.org/bsd-license
*/
class MicroPostActivityFilter implements RequestFilter {
/**
* @var MicroBlogService
*/
public $microBlogService;
public function postRequest(\SS_HTTPRequest $request, \SS_HTTPResponse $response, \DataModel $model) {
$actions = $this->microBlogService->getUserActions();
if ($actions && count($actions)) {
$members = Member::get()->filter('ID', array_keys($actions));
foreach ($members as $member) {
$member->LastPostView = SS_Datetime::now()->getValue();
$member->write();
}
}
}
public function preRequest(\SS_HTTPRequest $request, \Session $session, \DataModel $model) {
}
}
| <?php
/**
*
*
* @author <marcus@silverstripe.com.au>
* @license BSD License http://www.silverstripe.org/bsd-license
*/
class MicroPostActivityFilter implements RequestFilter {
/**
* @var MicroBlogService
*/
public $microBlogService;
public function postRequest(\SS_HTTPRequest $request, \SS_HTTPResponse $response, \DataModel $model) {
$actions = $this->microBlogService->getUserActions();
if ($actions && count($actions)) {
$members = Member::get()->filter('ID', array_keys($actions));
foreach ($members as $member) {
if ($member->exists()) {
$member->LastPostView = SS_Datetime::now()->getValue();
$member->write();
}
}
}
}
public function preRequest(\SS_HTTPRequest $request, \Session $session, \DataModel $model) {
}
}
| FIX Only update member actions that have occurred for member objects that exist in the DB | FIX Only update member actions that have occurred for member objects that exist in the DB
| PHP | bsd-3-clause | nyeholt/silverstripe-microblog,nyeholt/silverstripe-microblog,nyeholt/silverstripe-microblog,nyeholt/silverstripe-microblog,nyeholt/silverstripe-microblog,Neumes/silverstripe-microblog,Neumes/silverstripe-microblog | php | ## Code Before:
<?php
/**
*
*
* @author <marcus@silverstripe.com.au>
* @license BSD License http://www.silverstripe.org/bsd-license
*/
class MicroPostActivityFilter implements RequestFilter {
/**
* @var MicroBlogService
*/
public $microBlogService;
public function postRequest(\SS_HTTPRequest $request, \SS_HTTPResponse $response, \DataModel $model) {
$actions = $this->microBlogService->getUserActions();
if ($actions && count($actions)) {
$members = Member::get()->filter('ID', array_keys($actions));
foreach ($members as $member) {
$member->LastPostView = SS_Datetime::now()->getValue();
$member->write();
}
}
}
public function preRequest(\SS_HTTPRequest $request, \Session $session, \DataModel $model) {
}
}
## Instruction:
FIX Only update member actions that have occurred for member objects that exist in the DB
## Code After:
<?php
/**
*
*
* @author <marcus@silverstripe.com.au>
* @license BSD License http://www.silverstripe.org/bsd-license
*/
class MicroPostActivityFilter implements RequestFilter {
/**
* @var MicroBlogService
*/
public $microBlogService;
public function postRequest(\SS_HTTPRequest $request, \SS_HTTPResponse $response, \DataModel $model) {
$actions = $this->microBlogService->getUserActions();
if ($actions && count($actions)) {
$members = Member::get()->filter('ID', array_keys($actions));
foreach ($members as $member) {
if ($member->exists()) {
$member->LastPostView = SS_Datetime::now()->getValue();
$member->write();
}
}
}
}
public function preRequest(\SS_HTTPRequest $request, \Session $session, \DataModel $model) {
}
}
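
The fix in this record, skipping ORM records that no longer exist before writing to them, is a general defensive pattern. A minimal Python sketch of the same idea (the `Member` class and method names below are hypothetical stand-ins, not the SilverStripe API):

```python
class Member:
    """Hypothetical stand-in for an ORM record object."""

    def __init__(self, member_id, in_db):
        self.member_id = member_id
        self._in_db = in_db
        self.last_post_view = None

    def exists(self):
        # A record object can be instantiated for an ID whose row
        # was deleted; check before attempting to write it back.
        return self._in_db

    def write(self, timestamp):
        if not self._in_db:
            raise RuntimeError("cannot write a non-existent record")
        self.last_post_view = timestamp


def touch_members(members, now):
    """Update the last-post-view time only for members that still exist."""
    touched = []
    for member in members:
        if member.exists():  # the fix: guard before writing
            member.write(now)
            touched.append(member.member_id)
    return touched


members = [Member(1, True), Member(2, False), Member(3, True)]
print(touch_members(members, "2019-08-01 00:00:00"))  # [1, 3]
```

Only members still present in the database are touched; the deleted one is skipped instead of raising on write.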
|
2529b01321abbf0810387e68fa3080f8b814ac08 | custom.html | custom.html | <!DOCTYPE html>
<html>
<head>
<title>Guitar Customization</title>
</head>
<body>
<h1>Post your custom Fenders, Gibsons, Ibanez, etc.</h1>
<p><i>The blog forum is upcoming!</i></p>
<footer><a href="index.html">Home</a></footer>
</body>
</html> | <!DOCTYPE html>
<html>
<head>
<title>Guitar Customization</title>
</head>
<body>
<img src="images/slash-lespaul.jpg" alt="Slash Les Paul" />
<p>Photo by: Flickr user <a href="https://www.flickr.com/photos/55111293@N08/5105487032/in/photolist-8M9Xqu-4SDEks-8YLMwv-4o8XSG-chyz5m-4SDBCA-4SDAKf-7wJx4z-4Szm94-apz2QW-9N5E8k-4o4Th8-9N8mNb-dsAavP-4iKMtq-5kEyFu-FjpbFy-dWmVjA-bbWV7c-8nAgCD-eTY3Kt-8fzgdb-9p24JQ-dsxS5B-4o4TFH-9N5Bst-8MSLuv-9yn2au-p4cW1L-8trMbc-tAvH7Q-D5WH3Z-dsFwgF-9o93z1-8M6LLi-daELRS-dDvmYD-7a4juY-55Gb8j-arxYHS-9o93Fy-95GrT8-4BGpQa-9shxgf-5xGGLf-5xGHdS-6Qe9VN-owZRHs-fruzEq-5yoS69">jhonmer</a>, <a href="https://creativecommons.org/publicdomain/mark/1.0/">License</a>
<h1>Post your custom Fenders, Gibsons, Ibanez, etc.</h1>
<p><i>The blog forum is upcoming!</i></p>
<footer><a href="index.html">Home</a></footer>
</body>
</html> | Add images and links to page | Add images and links to page
| HTML | mit | jmarsh433/jmarsh433.github.io | html | ## Code Before:
<!DOCTYPE html>
<html>
<head>
<title>Guitar Customization</title>
</head>
<body>
<h1>Post your custom Fenders, Gibsons, Ibanez, etc.</h1>
<p><i>The blog forum is upcoming!</i></p>
<footer><a href="index.html">Home</a></footer>
</body>
</html>
## Instruction:
Add images and links to page
## Code After:
<!DOCTYPE html>
<html>
<head>
<title>Guitar Customization</title>
</head>
<body>
<img src="images/slash-lespaul.jpg" alt="Slash Les Paul" />
<p>Photo by: Flickr user <a href="https://www.flickr.com/photos/55111293@N08/5105487032/in/photolist-8M9Xqu-4SDEks-8YLMwv-4o8XSG-chyz5m-4SDBCA-4SDAKf-7wJx4z-4Szm94-apz2QW-9N5E8k-4o4Th8-9N8mNb-dsAavP-4iKMtq-5kEyFu-FjpbFy-dWmVjA-bbWV7c-8nAgCD-eTY3Kt-8fzgdb-9p24JQ-dsxS5B-4o4TFH-9N5Bst-8MSLuv-9yn2au-p4cW1L-8trMbc-tAvH7Q-D5WH3Z-dsFwgF-9o93z1-8M6LLi-daELRS-dDvmYD-7a4juY-55Gb8j-arxYHS-9o93Fy-95GrT8-4BGpQa-9shxgf-5xGGLf-5xGHdS-6Qe9VN-owZRHs-fruzEq-5yoS69">jhonmer</a>, <a href="https://creativecommons.org/publicdomain/mark/1.0/">License</a>
<h1>Post your custom Fenders, Gibsons, Ibanez, etc.</h1>
<p><i>The blog forum is upcoming!</i></p>
<footer><a href="index.html">Home</a></footer>
</body>
</html> |
09ce2e218fb34ec0ad182e3d378614257fdb22e6 | subsys/random/rand32_entropy_device.c | subsys/random/rand32_entropy_device.c | /*
* Copyright (c) 2017 Intel Corporation
*
* SPDX-License-Identifier: Apache-2.0
*/
#include <atomic.h>
#include <kernel.h>
#include <entropy.h>
static atomic_t entropy_driver;
u32_t sys_rand32_get(void)
{
struct device *dev = (struct device *)atomic_get(&entropy_driver);
u32_t random_num;
int ret;
if (unlikely(!dev)) {
/* Only one entropy device exists, so this is safe even
* if the whole operation isn't atomic.
*/
dev = device_get_binding(CONFIG_ENTROPY_NAME);
atomic_set(&entropy_driver, (atomic_t)(uintptr_t)dev);
}
ret = entropy_get_entropy(dev, (u8_t *)&random_num,
sizeof(random_num));
if (unlikely(ret < 0)) {
/* Use system timer in case the entropy device couldn't deliver
* 32-bit of data. There's not much that can be done in this
* situation. An __ASSERT() isn't used here as the HWRNG might
* still be gathering entropy during early boot situations.
*/
random_num = k_cycle_get_32();
}
return random_num;
}
| /*
* Copyright (c) 2017 Intel Corporation
*
* SPDX-License-Identifier: Apache-2.0
*/
#include <atomic.h>
#include <kernel.h>
#include <entropy.h>
static atomic_t entropy_driver;
u32_t sys_rand32_get(void)
{
struct device *dev = (struct device *)atomic_get(&entropy_driver);
u32_t random_num;
int ret;
if (unlikely(!dev)) {
/* Only one entropy device exists, so this is safe even
* if the whole operation isn't atomic.
*/
dev = device_get_binding(CONFIG_ENTROPY_NAME);
__ASSERT((dev != NULL),
"Device driver for %s (CONFIG_ENTROPY_NAME) not found. "
"Check your build configuration!",
CONFIG_ENTROPY_NAME);
atomic_set(&entropy_driver, (atomic_t)(uintptr_t)dev);
}
ret = entropy_get_entropy(dev, (u8_t *)&random_num,
sizeof(random_num));
if (unlikely(ret < 0)) {
/* Use system timer in case the entropy device couldn't deliver
* 32-bit of data. There's not much that can be done in this
* situation. An __ASSERT() isn't used here as the HWRNG might
* still be gathering entropy during early boot situations.
*/
random_num = k_cycle_get_32();
}
return random_num;
}
| Add _ASSERT() test on returned device_get_binding | subsys/random: Add _ASSERT() test on returned device_get_binding
If there is a build setup problem where a device driver has not been
set up for the entropy driver, then the call to device_get_binding()
will return a NULL value and the code will continue to use this NULL
value. The result is a hard fault later in code execution.
Note that CONFIG_ASSERT is by default off so one has to turn this
configuration on to catch this problem.
Signed-off-by: David Leach <98707a1e43c521c5b5b81e36afef696dde6831a3@nxp.com>
| C | apache-2.0 | mbolivar/zephyr,mbolivar/zephyr,Vudentz/zephyr,finikorg/zephyr,kraj/zephyr,mbolivar/zephyr,aceofall/zephyr-iotos,nashif/zephyr,galak/zephyr,galak/zephyr,ldts/zephyr,zephyriot/zephyr,explora26/zephyr,galak/zephyr,zephyrproject-rtos/zephyr,galak/zephyr,mbolivar/zephyr,ldts/zephyr,punitvara/zephyr,GiulianoFranchetto/zephyr,GiulianoFranchetto/zephyr,punitvara/zephyr,mbolivar/zephyr,punitvara/zephyr,zephyriot/zephyr,aceofall/zephyr-iotos,nashif/zephyr,kraj/zephyr,GiulianoFranchetto/zephyr,nashif/zephyr,punitvara/zephyr,zephyrproject-rtos/zephyr,finikorg/zephyr,explora26/zephyr,aceofall/zephyr-iotos,explora26/zephyr,zephyriot/zephyr,GiulianoFranchetto/zephyr,zephyrproject-rtos/zephyr,Vudentz/zephyr,ldts/zephyr,ldts/zephyr,kraj/zephyr,kraj/zephyr,ldts/zephyr,aceofall/zephyr-iotos,Vudentz/zephyr,zephyrproject-rtos/zephyr,nashif/zephyr,explora26/zephyr,punitvara/zephyr,GiulianoFranchetto/zephyr,Vudentz/zephyr,zephyriot/zephyr,zephyriot/zephyr,finikorg/zephyr,aceofall/zephyr-iotos,Vudentz/zephyr,galak/zephyr,nashif/zephyr,kraj/zephyr,Vudentz/zephyr,explora26/zephyr,finikorg/zephyr,finikorg/zephyr,zephyrproject-rtos/zephyr | c | ## Code Before:
/*
* Copyright (c) 2017 Intel Corporation
*
* SPDX-License-Identifier: Apache-2.0
*/
#include <atomic.h>
#include <kernel.h>
#include <entropy.h>
static atomic_t entropy_driver;
u32_t sys_rand32_get(void)
{
struct device *dev = (struct device *)atomic_get(&entropy_driver);
u32_t random_num;
int ret;
if (unlikely(!dev)) {
/* Only one entropy device exists, so this is safe even
* if the whole operation isn't atomic.
*/
dev = device_get_binding(CONFIG_ENTROPY_NAME);
atomic_set(&entropy_driver, (atomic_t)(uintptr_t)dev);
}
ret = entropy_get_entropy(dev, (u8_t *)&random_num,
sizeof(random_num));
if (unlikely(ret < 0)) {
/* Use system timer in case the entropy device couldn't deliver
* 32-bit of data. There's not much that can be done in this
* situation. An __ASSERT() isn't used here as the HWRNG might
* still be gathering entropy during early boot situations.
*/
random_num = k_cycle_get_32();
}
return random_num;
}
## Instruction:
subsys/random: Add _ASSERT() test on returned device_get_binding
If there is a build setup problem where a device driver has not been
set up for the entropy driver, then the call to device_get_binding()
will return a NULL value and the code will continue to use this NULL
value. The result is a hard fault later in code execution.
Note that CONFIG_ASSERT is by default off so one has to turn this
configuration on to catch this problem.
Signed-off-by: David Leach <98707a1e43c521c5b5b81e36afef696dde6831a3@nxp.com>
## Code After:
/*
* Copyright (c) 2017 Intel Corporation
*
* SPDX-License-Identifier: Apache-2.0
*/
#include <atomic.h>
#include <kernel.h>
#include <entropy.h>
static atomic_t entropy_driver;
u32_t sys_rand32_get(void)
{
struct device *dev = (struct device *)atomic_get(&entropy_driver);
u32_t random_num;
int ret;
if (unlikely(!dev)) {
/* Only one entropy device exists, so this is safe even
* if the whole operation isn't atomic.
*/
dev = device_get_binding(CONFIG_ENTROPY_NAME);
__ASSERT((dev != NULL),
"Device driver for %s (CONFIG_ENTROPY_NAME) not found. "
"Check your build configuration!",
CONFIG_ENTROPY_NAME);
atomic_set(&entropy_driver, (atomic_t)(uintptr_t)dev);
}
ret = entropy_get_entropy(dev, (u8_t *)&random_num,
sizeof(random_num));
if (unlikely(ret < 0)) {
/* Use system timer in case the entropy device couldn't deliver
* 32-bit of data. There's not much that can be done in this
* situation. An __ASSERT() isn't used here as the HWRNG might
* still be gathering entropy during early boot situations.
*/
random_num = k_cycle_get_32();
}
return random_num;
}
|
0a67db3bef58c2032368fa3b4bec32a6a8b69995 | README.md | README.md |
Thumbnails for Django, Flask and other Python projects.
|
Thumbnails for Django, Flask and other Python projects.
## Install
```
pip install -e https://github.com/relekang/python-thumbnails.git#egg=python-thumbnails
```
## Usage
```python
from thumbnails import get_thumbnail
get_thumbnail('path/to/image.png', '300x300', 'center')
```
----------------------
MIT © Rolf Erik Lekang
| Add installation and usage in readme | Add installation and usage in readme
| Markdown | mit | relekang/python-thumbnails,python-thumbnails/python-thumbnails | markdown | ## Code Before:
Thumbnails for Django, Flask and other Python projects.
## Instruction:
Add installation and usage in readme
## Code After:
Thumbnails for Django, Flask and other Python projects.
## Install
```
pip install -e https://github.com/relekang/python-thumbnails.git#egg=python-thumbnails
```
## Usage
```python
from thumbnails import get_thumbnail
get_thumbnail('path/to/image.png', '300x300', 'center')
```
----------------------
MIT © Rolf Erik Lekang
|
628ff819e0e8632823bd8092543d0a58a469f9b6 | server/package.yaml | server/package.yaml | name: tempgres-server
version: "2.0.0"
synopsis: REST service for creating temporary PostgreSQL databases.
description: >
REST service for conveniently creating temporary PostgreSQL databases
for use in tests.
.
See <https://github.com/ClockworkConsulting/tempgres-server/blob/master/README.md README.md> for
detailed usage and setup instructions.
license: AGPL-3
license-file: LICENSE.txt
author: Bardur Arantsson
maintainer: bardur@scientician.net
copyright: Copyright (c) 2014-2021 Bardur Arantsson
category: Database Testing Web
data-dir: data
github: ClockworkConsulting/tempgres-server
ghc-options: -Wall
default-extensions:
- DeriveGeneric
- DerivingVia
- ImportQualifiedPost
- LambdaCase
- OverloadedStrings
dependencies:
- base
- async
- envy
- postgresql-simple
- random
- scotty
- text
- transformers
- warp
executables:
tempgres-server:
source-dirs: src
main: Main.hs
other-modules:
- Tempgres.Configuration
- Tempgres.Mutex
- Tempgres.DatabaseId
| name: tempgres-server
version: "2.0.0"
synopsis: REST service for creating temporary PostgreSQL databases.
description: >
REST service for conveniently creating temporary PostgreSQL databases
for use in tests.
.
See <https://github.com/ClockworkConsulting/tempgres-server/blob/master/README.md README.md> for
detailed usage and setup instructions.
license: AGPL-3
license-file: LICENSE.txt
author: Bardur Arantsson
maintainer: bardur@scientician.net
copyright: Copyright (c) 2014-2021 Bardur Arantsson
category: Database Testing Web
data-dir: data
github: ClockworkConsulting/tempgres-server
ghc-options:
- -Wall
- -Wmissing-fields
- -threaded
default-extensions:
- DeriveGeneric
- DerivingVia
- ImportQualifiedPost
- LambdaCase
- OverloadedStrings
dependencies:
- base
- async
- envy
- postgresql-simple
- random
- scotty
- text
- transformers
- warp
executables:
tempgres-server:
source-dirs: src
main: Main.hs
other-modules:
- Tempgres.Configuration
- Tempgres.Mutex
- Tempgres.DatabaseId
| Add '-Wmissing-fields' and '-threaded' GHC options | Add '-Wmissing-fields' and '-threaded' GHC options
| YAML | agpl-3.0 | ClockworkConsulting/tempgres-server | yaml | ## Code Before:
name: tempgres-server
version: "2.0.0"
synopsis: REST service for creating temporary PostgreSQL databases.
description: >
REST service for conveniently creating temporary PostgreSQL databases
for use in tests.
.
See <https://github.com/ClockworkConsulting/tempgres-server/blob/master/README.md README.md> for
detailed usage and setup instructions.
license: AGPL-3
license-file: LICENSE.txt
author: Bardur Arantsson
maintainer: bardur@scientician.net
copyright: Copyright (c) 2014-2021 Bardur Arantsson
category: Database Testing Web
data-dir: data
github: ClockworkConsulting/tempgres-server
ghc-options: -Wall
default-extensions:
- DeriveGeneric
- DerivingVia
- ImportQualifiedPost
- LambdaCase
- OverloadedStrings
dependencies:
- base
- async
- envy
- postgresql-simple
- random
- scotty
- text
- transformers
- warp
executables:
tempgres-server:
source-dirs: src
main: Main.hs
other-modules:
- Tempgres.Configuration
- Tempgres.Mutex
- Tempgres.DatabaseId
## Instruction:
Add '-Wmissing-fields' and '-threaded' GHC options
## Code After:
name: tempgres-server
version: "2.0.0"
synopsis: REST service for creating temporary PostgreSQL databases.
description: >
REST service for conveniently creating temporary PostgreSQL databases
for use in tests.
.
See <https://github.com/ClockworkConsulting/tempgres-server/blob/master/README.md README.md> for
detailed usage and setup instructions.
license: AGPL-3
license-file: LICENSE.txt
author: Bardur Arantsson
maintainer: bardur@scientician.net
copyright: Copyright (c) 2014-2021 Bardur Arantsson
category: Database Testing Web
data-dir: data
github: ClockworkConsulting/tempgres-server
ghc-options:
- -Wall
- -Wmissing-fields
- -threaded
default-extensions:
- DeriveGeneric
- DerivingVia
- ImportQualifiedPost
- LambdaCase
- OverloadedStrings
dependencies:
- base
- async
- envy
- postgresql-simple
- random
- scotty
- text
- transformers
- warp
executables:
tempgres-server:
source-dirs: src
main: Main.hs
other-modules:
- Tempgres.Configuration
- Tempgres.Mutex
- Tempgres.DatabaseId
|
cb09cb74a4dea6ddb73915f592ea629ece3572f4 | .travis.yml | .travis.yml | language: php
services: mongodb
sudo: false
php:
- 5.5
- 5.6
- 7.0
- hhvm
before_script:
- yes "" | pecl install mongodb
- ./composer.phar install
script: vendor/bin/phpunit
| language: php
services: mongodb
sudo: false
php:
- 5.6
- 7.0
- hhvm-3.11
- hhvm-3.12
before_script:
- yes "" | pecl install mongodb
- ./composer.phar install
script: vendor/bin/phpunit
| Drop php5.5, try hhvm 3.11 and 3.12 | Drop php5.5, try hhvm 3.11 and 3.12
| YAML | mit | netom/mandango,netom/mandango,netom/mandango | yaml | ## Code Before:
language: php
services: mongodb
sudo: false
php:
- 5.5
- 5.6
- 7.0
- hhvm
before_script:
- yes "" | pecl install mongodb
- ./composer.phar install
script: vendor/bin/phpunit
## Instruction:
Drop php5.5, try hhvm 3.11 and 3.12
## Code After:
language: php
services: mongodb
sudo: false
php:
- 5.6
- 7.0
- hhvm-3.11
- hhvm-3.12
before_script:
- yes "" | pecl install mongodb
- ./composer.phar install
script: vendor/bin/phpunit
|
a27c15024c200132f5ce6ef86dd5dd9e077c3ce9 | README.md | README.md | package.json files are central for node.js/npm projects. Beyond just valid json, there are required fields to follow the Packages [1.0](http://wiki.commonjs.org/wiki/Packages/1.0)/[1.1](http://wiki.commonjs.org/wiki/Packages/1.0) specifications.
This tool verifies the package.json against the spec, letting you know about errors that you MUST have, and making recommendations for optional fields that you SHOULD have.
# Usages
* Online copy hosted courtesy of Nick Sullivan at [http://package-json-validator.com/](http://package-json-validator.com/)
Want to run your own copy? You are welcome to clone or fork this repo
# Future ideas...
* An API
* A node based command line interface, that would allow for plugins for editors
# Issues/Requests
Please check out [the existing issues](https://github.com/pilotfish/pilotfish/issues), and if you don't see that your problem is already being worked on, please [file an issue](https://github.com/pilotfish/pilotfish/issues/new)
# License
See LICENSE
| package.json files are central for node.js/npm projects. Beyond just valid json, there are required fields to follow the Packages [1.0](http://wiki.commonjs.org/wiki/Packages/1.0)/[1.1](http://wiki.commonjs.org/wiki/Packages/1.0) specifications.
This tool verifies the package.json against the spec, letting you know about errors that you MUST have, and making recommendations for optional fields that you SHOULD have.
# Usages
* Online copy hosted courtesy of Nick Sullivan at [http://package-json-validator.com/](http://package-json-validator.com/)
Want to run your own copy? You are welcome to clone or fork this repo
# Future ideas...
* Support for NPM specific fields
* Choice between the type of validation
* An API
* A node based command line interface, that would allow for plugins for editors
# Issues/Requests
Please check out [the existing issues](https://github.com/pilotfish/pilotfish/issues), and if you don't see that your problem is already being worked on, please [file an issue](https://github.com/pilotfish/pilotfish/issues/new)
# License
See LICENSE
| Update readme with future ideas | Update readme with future ideas
| Markdown | mit | gorillamania/package.json-validator,gtanner/package.json-validator,gorillamania/package.json-validator,gorillamania/package.json-validator,gtanner/package.json-validator | markdown | ## Code Before:
package.json files are central for node.js/npm projects. Beyond just valid json, there are required fields to follow the Packages [1.0](http://wiki.commonjs.org/wiki/Packages/1.0)/[1.1](http://wiki.commonjs.org/wiki/Packages/1.0) specifications.
This tool verifies the package.json against the spec, letting you know about errors that you MUST have, and making recommendations for optional fields that you SHOULD have.
# Usages
* Online copy hosted courtesy of Nick Sullivan at [http://package-json-validator.com/](http://package-json-validator.com/)
Want to run your own copy? You are welcome to clone or fork this repo
# Future ideas...
* An API
* A node based command line interface, that would allow for plugins for editors
# Issues/Requests
Please check out [the existing issues](https://github.com/pilotfish/pilotfish/issues), and if you don't see that your problem is already being worked on, please [file an issue](https://github.com/pilotfish/pilotfish/issues/new)
# License
See LICENSE
## Instruction:
Update readme with future ideas
## Code After:
package.json files are central for node.js/npm projects. Beyond just valid json, there are required fields to follow the Packages [1.0](http://wiki.commonjs.org/wiki/Packages/1.0)/[1.1](http://wiki.commonjs.org/wiki/Packages/1.0) specifications.
This tool verifies the package.json against the spec, letting you know about errors that you MUST have, and making recommendations for optional fields that you SHOULD have.
# Usages
* Online copy hosted courtesy of Nick Sullivan at [http://package-json-validator.com/](http://package-json-validator.com/)
Want to run your own copy? You are welcome to clone or fork this repo
# Future ideas...
* Support for NPM specific fields
* Choice between the type of validation
* An API
* A node based command line interface, that would allow for plugins for editors
# Issues/Requests
Please check out [the existing issues](https://github.com/pilotfish/pilotfish/issues), and if you don't see that your problem is already being worked on, please [file an issue](https://github.com/pilotfish/pilotfish/issues/new)
# License
See LICENSE
|
432f831b3b8bdf20df7f8eab55653e434aa492cf | app/views/organizations/index.html.erb | app/views/organizations/index.html.erb | <% content_for :title, "Organizations" %>
<% content_for :content do %>
<% if current_user.can_create_organization? %>
<div class="row input-row">
<div class="col-xs-12">
<div class="pull-right">
<%= link_to "New Organization", new_organization_path, class: "btn btn-primary" %>
</div>
</div>
</div>
<% end %>
<div class="table-responsive">
<table class="table table-striped data-table sort-asc">
<thead>
<tr>
<th>County</th>
<th>Name</th>
<th>Mailing Address</th>
<th><%= t :organization_phone %></th>
<th>Email</th>
</tr>
</thead>
<tbody>
<% current_user.organizations_with_permission_enabled(:can_update_organization_at?).each do |organization| %>
<tr data-href="<%= edit_organization_path(organization) %>">
<td><%= organization.county %></td>
<td><%= organization.name %></td>
<td><%= organization.address %></td>
<td><%= organization.phone_number %></td>
<td><%= organization.email %></td>
</tr>
<% end %>
</tbody>
</table>
</div>
<% end %>
| <% content_for :title, "Organizations" %>
<% content_for :content do %>
<% if current_user.can_create_organization? %>
<div class="row input-row">
<div class="col-xs-12">
<div class="pull-right">
<%= link_to "New Organization", new_organization_path, class: "btn btn-primary" %>
</div>
</div>
</div>
<% end %>
<div class="table-responsive">
<table class="table table-striped data-table sort-asc">
<thead>
<tr>
<th>County</th>
<th>Name</th>
<th>Mailing Address</th>
<th><%= t :organization_phone %></th>
<th>Email</th>
</tr>
</thead>
<tbody>
<% current_user.organizations_with_permission_enabled(:can_update_organization_at?).each do |organization| %>
<tr data-href="<%= edit_organization_path(organization) %>">
<td><%= organization.county %></td>
<td><%= organization.name %></td>
<td><%= organization.addresses.first.address %></td>
<td><%= organization.phone_number %></td>
<td><%= organization.email %></td>
</tr>
<% end %>
</tbody>
</table>
</div>
<% end %>
| Update organization index to use first address for display | Update organization index to use first address for display
| HTML+ERB | mit | on-site/StockAid,on-site/StockAid,on-site/StockAid | html+erb | ## Code Before:
<% content_for :title, "Organizations" %>
<% content_for :content do %>
<% if current_user.can_create_organization? %>
<div class="row input-row">
<div class="col-xs-12">
<div class="pull-right">
<%= link_to "New Organization", new_organization_path, class: "btn btn-primary" %>
</div>
</div>
</div>
<% end %>
<div class="table-responsive">
<table class="table table-striped data-table sort-asc">
<thead>
<tr>
<th>County</th>
<th>Name</th>
<th>Mailing Address</th>
<th><%= t :organization_phone %></th>
<th>Email</th>
</tr>
</thead>
<tbody>
<% current_user.organizations_with_permission_enabled(:can_update_organization_at?).each do |organization| %>
<tr data-href="<%= edit_organization_path(organization) %>">
<td><%= organization.county %></td>
<td><%= organization.name %></td>
<td><%= organization.address %></td>
<td><%= organization.phone_number %></td>
<td><%= organization.email %></td>
</tr>
<% end %>
</tbody>
</table>
</div>
<% end %>
## Instruction:
Update organization index to use first address for display
## Code After:
<% content_for :title, "Organizations" %>
<% content_for :content do %>
<% if current_user.can_create_organization? %>
<div class="row input-row">
<div class="col-xs-12">
<div class="pull-right">
<%= link_to "New Organization", new_organization_path, class: "btn btn-primary" %>
</div>
</div>
</div>
<% end %>
<div class="table-responsive">
<table class="table table-striped data-table sort-asc">
<thead>
<tr>
<th>County</th>
<th>Name</th>
<th>Mailing Address</th>
<th><%= t :organization_phone %></th>
<th>Email</th>
</tr>
</thead>
<tbody>
<% current_user.organizations_with_permission_enabled(:can_update_organization_at?).each do |organization| %>
<tr data-href="<%= edit_organization_path(organization) %>">
<td><%= organization.county %></td>
<td><%= organization.name %></td>
<td><%= organization.addresses.first.address %></td>
<td><%= organization.phone_number %></td>
<td><%= organization.email %></td>
</tr>
<% end %>
</tbody>
</table>
</div>
<% end %>
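
The view change reflects a data-model move from a single `address` attribute to a one-to-many `addresses` association, displaying the first associated address. A small Python sketch of the same idea (hypothetical classes, not the StockAid models):

```python
class Address:
    def __init__(self, address):
        self.address = address


class Organization:
    """Hypothetical model with a has-many addresses association."""

    def __init__(self, name, addresses):
        self.name = name
        self.addresses = [Address(a) for a in addresses]

    def display_address(self):
        # Mirrors `organization.addresses.first.address` in the view;
        # guard the empty case, which would raise in the template.
        return self.addresses[0].address if self.addresses else ""


org = Organization("Food Bank", ["123 Main St", "PO Box 9"])
print(org.display_address())  # 123 Main St
```

The guard for the empty list matters: `organization.addresses.first` would be `nil` for an organization with no addresses on record, and calling `.address` on it would raise in the template.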
|
062740cc255ab3fabb61e81a5d0662379681ceab | operations/backup-and-restore/enable-backup-restore.yml | operations/backup-and-restore/enable-backup-restore.yml | - type: replace
path: /releases/-
value:
name: backup-and-restore-sdk
sha1: 71a7659e5aac18031ece278e0f4793c8b5282c8b
url: https://bosh.io/d/github.com/cloudfoundry-incubator/backup-and-restore-sdk-release?v=1.17.0
version: 1.17.0
- type: replace
path: /instance_groups/-
value:
azs:
- z1
instances: 1
jobs:
- name: database-backup-restorer
release: backup-and-restore-sdk
- name: bbr-cfnetworkingdb
properties:
release_level_backup: true
release: cf-networking
- name: bbr-cloudcontrollerdb
release: capi
- name: bbr-routingdb
release: routing
- name: bbr-uaadb
release: uaa
- name: bbr-credhubdb
properties:
release_level_backup: true
release: credhub
name: backup-restore
networks:
- name: default
persistent_disk_type: 10GB
stemcell: default
vm_type: minimal
- type: replace
path: /instance_groups/name=api/jobs/name=routing-api/properties/release_level_backup?
value: true
- type: replace
path: /instance_groups/name=uaa/jobs/name=uaa/properties/release_level_backup?
value: true
| - type: replace
path: /releases/-
value:
name: backup-and-restore-sdk
sha1: 71a7659e5aac18031ece278e0f4793c8b5282c8b
url: https://bosh.io/d/github.com/cloudfoundry-incubator/backup-and-restore-sdk-release?v=1.17.0
version: 1.17.0
- type: replace
path: /instance_groups/-
value:
azs:
- z1
instances: 1
jobs:
- name: database-backup-restorer
release: backup-and-restore-sdk
- name: bbr-cfnetworkingdb
properties:
release_level_backup: true
release: cf-networking
- name: bbr-cloudcontrollerdb
release: capi
- name: bbr-routingdb
release: routing
- name: bbr-uaadb
properties:
release_level_backup: true
release: uaa
- name: bbr-credhubdb
properties:
release_level_backup: true
release: credhub
name: backup-restore
networks:
- name: default
persistent_disk_type: 10GB
stemcell: default
vm_type: minimal
- type: replace
path: /instance_groups/name=api/jobs/name=routing-api/properties/release_level_backup?
value: true
| Fix UAA BBR release_level_backup property | Fix UAA BBR release_level_backup property
- it now needs to be specified on the bbr-uaadb job, instead of the uaa
job
[#167748864](https://www.pivotaltracker.com/story/show/167748864)
| YAML | apache-2.0 | cloudfoundry/cf-deployment,cloudfoundry/cf-deployment | yaml | ## Code Before:
- type: replace
path: /releases/-
value:
name: backup-and-restore-sdk
sha1: 71a7659e5aac18031ece278e0f4793c8b5282c8b
url: https://bosh.io/d/github.com/cloudfoundry-incubator/backup-and-restore-sdk-release?v=1.17.0
version: 1.17.0
- type: replace
path: /instance_groups/-
value:
azs:
- z1
instances: 1
jobs:
- name: database-backup-restorer
release: backup-and-restore-sdk
- name: bbr-cfnetworkingdb
properties:
release_level_backup: true
release: cf-networking
- name: bbr-cloudcontrollerdb
release: capi
- name: bbr-routingdb
release: routing
- name: bbr-uaadb
release: uaa
- name: bbr-credhubdb
properties:
release_level_backup: true
release: credhub
name: backup-restore
networks:
- name: default
persistent_disk_type: 10GB
stemcell: default
vm_type: minimal
- type: replace
path: /instance_groups/name=api/jobs/name=routing-api/properties/release_level_backup?
value: true
- type: replace
path: /instance_groups/name=uaa/jobs/name=uaa/properties/release_level_backup?
value: true
## Instruction:
Fix UAA BBR release_level_backup property
- it now needs to be specified on the bbr-uaadb job, instead of the uaa
job
[#167748864](https://www.pivotaltracker.com/story/show/167748864)
## Code After:
- type: replace
path: /releases/-
value:
name: backup-and-restore-sdk
sha1: 71a7659e5aac18031ece278e0f4793c8b5282c8b
url: https://bosh.io/d/github.com/cloudfoundry-incubator/backup-and-restore-sdk-release?v=1.17.0
version: 1.17.0
- type: replace
path: /instance_groups/-
value:
azs:
- z1
instances: 1
jobs:
- name: database-backup-restorer
release: backup-and-restore-sdk
- name: bbr-cfnetworkingdb
properties:
release_level_backup: true
release: cf-networking
- name: bbr-cloudcontrollerdb
release: capi
- name: bbr-routingdb
release: routing
- name: bbr-uaadb
properties:
release_level_backup: true
release: uaa
- name: bbr-credhubdb
properties:
release_level_backup: true
release: credhub
name: backup-restore
networks:
- name: default
persistent_disk_type: 10GB
stemcell: default
vm_type: minimal
- type: replace
path: /instance_groups/name=api/jobs/name=routing-api/properties/release_level_backup?
value: true
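
An ops file like the one above is a list of `type: replace` operations applied to a base manifest: a path ending in `/-` appends to an array, and a trailing `?` marks the final key as optional. A deliberately simplified Python interpreter illustrates the mechanics (real BOSH paths also support `name=foo` element matchers, which are omitted here):

```python
def apply_replace(doc, path, value):
    """Apply a simplified BOSH-style 'replace' op to a nested structure."""
    keys = path.strip("/").split("/")
    node = doc
    for key in keys[:-1]:
        node = node[key]                 # descend; matchers not supported
    last = keys[-1]
    if last == "-":
        node.append(value)               # '/-' appends to an array
    else:
        node[last.rstrip("?")] = value   # '?' marks an optional key
    return doc


manifest = {"releases": [], "instance_groups": {"backup-restore": {}}}
apply_replace(manifest, "/releases/-",
              {"name": "backup-and-restore-sdk", "version": "1.17.0"})
apply_replace(manifest,
              "/instance_groups/backup-restore/release_level_backup?",
              True)
print(manifest["releases"][0]["name"])  # backup-and-restore-sdk
```

This also shows why the fix in the record works: moving `release_level_backup` under a different job is just changing the path prefix of the replace operation.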
|
1d20367a4704816c9138e6389487fb9e317aa317 | README.md | README.md |
A list of ambitious web applications built using [ember.js](http://emberjs.com/).
This project was inspired by a Google Drive spreadsheet that was shared on Twitter sometime ago. We thought that it might be a good idea to have a site listing all the cool projects built using ember.js so we made it.
### Submissions
We are trying to keep the site submission quality as high as possible. Please only submit real world applications running in production environments. Make sure that you can link to the application directly and no demos or proof-of-concepts. Only finished applications allowed.
### How to submit
To submit a site suggestions, [open an issue](https://github.com/GetBlimp/built-with-ember/issues/new) or create a pull request. Pull requests will be given higher priority since they are easier to include.
Make sure the screenshot is **1000x800** and please double check that everything looks good before submitting.
|
A list of ambitious web applications built using [ember.js](http://emberjs.com/).
This project was inspired by a Google Drive spreadsheet that was shared on Twitter sometime ago. We thought that it might be a good idea to have a site listing all the cool projects built using ember.js so we made it.
### Submissions
We are trying to keep the site submission quality as high as possible. Please only submit real world applications running in production environments. Make sure that you can link to the application directly and no demos or proof-of-concepts. Only finished applications allowed.
### How to submit
To submit a site suggestions, [open an issue](https://github.com/GetBlimp/built-with-ember/issues/new) or create a pull request. Pull requests will be given higher priority since they are easier to include.
Make sure the screenshot is **1000x800** and please double check that everything looks good before submitting.
### Running the site locally
```
$ gem install jekyll
$ git clone https://github.com/GetBlimp/built-with-ember.git
$ cd built-with-ember
$ jekyll serve --watch
```
| Add running instructions to readme | Add running instructions to readme
| Markdown | mit | nicolaschenet/built-with-ember,jpadilla/built-with-ember,kellysutton/built-with-ember,kellysutton/built-with-ember,GetBlimp/built-with-ember,1024inc/built-with-ember,GetBlimp/built-with-ember,toddjordan/built-with-ember,toddjordan/built-with-ember,jpadilla/built-with-ember,nicolaschenet/built-with-ember | markdown | ## Code Before:
A list of ambitious web applications built using [ember.js](http://emberjs.com/).
This project was inspired by a Google Drive spreadsheet that was shared on Twitter sometime ago. We thought that it might be a good idea to have a site listing all the cool projects built using ember.js so we made it.
### Submissions
We are trying to keep the site submission quality as high as possible. Please only submit real-world applications running in production environments. Make sure that you can link to the application directly; no demos or proof-of-concepts. Only finished applications are allowed.
### How to submit
To submit a site suggestion, [open an issue](https://github.com/GetBlimp/built-with-ember/issues/new) or create a pull request. Pull requests will be given higher priority since they are easier to include.
Make sure the screenshot is **1000x800** and please double check that everything looks good before submitting.
## Instruction:
Add running instructions to readme
## Code After:
A list of ambitious web applications built using [ember.js](http://emberjs.com/).
This project was inspired by a Google Drive spreadsheet that was shared on Twitter some time ago. We thought that it might be a good idea to have a site listing all the cool projects built using ember.js, so we made it.
### Submissions
We are trying to keep the site submission quality as high as possible. Please only submit real-world applications running in production environments. Make sure that you can link to the application directly; no demos or proof-of-concepts. Only finished applications are allowed.
### How to submit
To submit a site suggestion, [open an issue](https://github.com/GetBlimp/built-with-ember/issues/new) or create a pull request. Pull requests will be given higher priority since they are easier to include.
Make sure the screenshot is **1000x800** and please double check that everything looks good before submitting.
### Running the site locally
```
$ gem install jekyll
$ git clone https://github.com/GetBlimp/built-with-ember.git
$ cd built-with-ember
$ jekyll serve --watch
```
|
e56b5a5256dcb7f791ad3d68703fe51123ce3513 | tools/seec-trace-view/Annotations.cpp | tools/seec-trace-view/Annotations.cpp | //===- tools/seec-trace-view/Annotations.cpp ------------------------------===//
//
// SeeC
//
// This file is distributed under The MIT License (MIT). See LICENSE.TXT for
// details.
//
//===----------------------------------------------------------------------===//
///
/// \file
///
//===----------------------------------------------------------------------===//
#include "seec/Util/MakeUnique.hpp"
#include <wx/archive.h>
#include <wx/xml/xml.h>
#include "Annotations.hpp"
AnnotationCollection::
AnnotationCollection(std::unique_ptr<wxXmlDocument> XmlDocument)
: m_XmlDocument(std::move(XmlDocument))
{}
AnnotationCollection::AnnotationCollection()
{
m_XmlDocument = seec::makeUnique<wxXmlDocument>();
auto const Root = new wxXmlNode(nullptr, wxXML_ELEMENT_NODE, "annotations");
m_XmlDocument->SetRoot(Root);
}
AnnotationCollection::~AnnotationCollection() = default;
seec::Maybe<AnnotationCollection>
AnnotationCollection::fromDoc(std::unique_ptr<wxXmlDocument> Doc)
{
return AnnotationCollection(std::move(Doc));
}
bool AnnotationCollection::writeToArchive(wxArchiveOutputStream &Stream)
{
return Stream.PutNextEntry("annotations.xml")
&& m_XmlDocument->Save(Stream);
}
| //===- tools/seec-trace-view/Annotations.cpp ------------------------------===//
//
// SeeC
//
// This file is distributed under The MIT License (MIT). See LICENSE.TXT for
// details.
//
//===----------------------------------------------------------------------===//
///
/// \file
///
//===----------------------------------------------------------------------===//
#include "seec/Util/MakeUnique.hpp"
#include <wx/archive.h>
#include <wx/xml/xml.h>
#include "Annotations.hpp"
namespace {
bool isAnnotationCollection(wxXmlDocument &Doc)
{
if (!Doc.IsOk())
return false;
auto const RootNode = Doc.GetRoot();
if (!RootNode || RootNode->GetName() != "annotations")
return false;
return true;
}
} // anonymous namespace
AnnotationCollection::
AnnotationCollection(std::unique_ptr<wxXmlDocument> XmlDocument)
: m_XmlDocument(std::move(XmlDocument))
{}
AnnotationCollection::AnnotationCollection()
{
m_XmlDocument = seec::makeUnique<wxXmlDocument>();
auto const Root = new wxXmlNode(nullptr, wxXML_ELEMENT_NODE, "annotations");
m_XmlDocument->SetRoot(Root);
}
AnnotationCollection::~AnnotationCollection() = default;
seec::Maybe<AnnotationCollection>
AnnotationCollection::fromDoc(std::unique_ptr<wxXmlDocument> Doc)
{
if (!isAnnotationCollection(*Doc))
return seec::Maybe<AnnotationCollection>();
return AnnotationCollection(std::move(Doc));
}
bool AnnotationCollection::writeToArchive(wxArchiveOutputStream &Stream)
{
return Stream.PutNextEntry("annotations.xml")
&& m_XmlDocument->Save(Stream);
}
| Check that annotations.xml documents match our expectations. | Check that annotations.xml documents match our expectations.
| C++ | mit | mheinsen/seec,seec-team/seec,mheinsen/seec,seec-team/seec,seec-team/seec,seec-team/seec,mheinsen/seec,seec-team/seec,mheinsen/seec,mheinsen/seec | c++ | ## Code Before:
//===- tools/seec-trace-view/Annotations.cpp ------------------------------===//
//
// SeeC
//
// This file is distributed under The MIT License (MIT). See LICENSE.TXT for
// details.
//
//===----------------------------------------------------------------------===//
///
/// \file
///
//===----------------------------------------------------------------------===//
#include "seec/Util/MakeUnique.hpp"
#include <wx/archive.h>
#include <wx/xml/xml.h>
#include "Annotations.hpp"
AnnotationCollection::
AnnotationCollection(std::unique_ptr<wxXmlDocument> XmlDocument)
: m_XmlDocument(std::move(XmlDocument))
{}
AnnotationCollection::AnnotationCollection()
{
m_XmlDocument = seec::makeUnique<wxXmlDocument>();
auto const Root = new wxXmlNode(nullptr, wxXML_ELEMENT_NODE, "annotations");
m_XmlDocument->SetRoot(Root);
}
AnnotationCollection::~AnnotationCollection() = default;
seec::Maybe<AnnotationCollection>
AnnotationCollection::fromDoc(std::unique_ptr<wxXmlDocument> Doc)
{
return AnnotationCollection(std::move(Doc));
}
bool AnnotationCollection::writeToArchive(wxArchiveOutputStream &Stream)
{
return Stream.PutNextEntry("annotations.xml")
&& m_XmlDocument->Save(Stream);
}
## Instruction:
Check that annotations.xml documents match our expectations.
## Code After:
//===- tools/seec-trace-view/Annotations.cpp ------------------------------===//
//
// SeeC
//
// This file is distributed under The MIT License (MIT). See LICENSE.TXT for
// details.
//
//===----------------------------------------------------------------------===//
///
/// \file
///
//===----------------------------------------------------------------------===//
#include "seec/Util/MakeUnique.hpp"
#include <wx/archive.h>
#include <wx/xml/xml.h>
#include "Annotations.hpp"
namespace {
bool isAnnotationCollection(wxXmlDocument &Doc)
{
if (!Doc.IsOk())
return false;
auto const RootNode = Doc.GetRoot();
if (!RootNode || RootNode->GetName() != "annotations")
return false;
return true;
}
} // anonymous namespace
AnnotationCollection::
AnnotationCollection(std::unique_ptr<wxXmlDocument> XmlDocument)
: m_XmlDocument(std::move(XmlDocument))
{}
AnnotationCollection::AnnotationCollection()
{
m_XmlDocument = seec::makeUnique<wxXmlDocument>();
auto const Root = new wxXmlNode(nullptr, wxXML_ELEMENT_NODE, "annotations");
m_XmlDocument->SetRoot(Root);
}
AnnotationCollection::~AnnotationCollection() = default;
seec::Maybe<AnnotationCollection>
AnnotationCollection::fromDoc(std::unique_ptr<wxXmlDocument> Doc)
{
if (!isAnnotationCollection(*Doc))
return seec::Maybe<AnnotationCollection>();
return AnnotationCollection(std::move(Doc));
}
bool AnnotationCollection::writeToArchive(wxArchiveOutputStream &Stream)
{
return Stream.PutNextEntry("annotations.xml")
&& m_XmlDocument->Save(Stream);
}
|
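The guard added in this commit (the document parses and its root element is named `annotations`) is easy to exercise outside wxWidgets. Here is a minimal Python sketch of the same validation idea using the standard-library parser; the function name is ours, not SeeC's:

```python
import xml.etree.ElementTree as ET

def is_annotation_collection(xml_text):
    """Accept only documents whose root element is <annotations>,
    mirroring the C++ isAnnotationCollection() check."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False  # corresponds to !Doc.IsOk()
    return root.tag == "annotations"
```

With a check like this, a `fromDoc`-style factory can return an empty result for anything that fails validation instead of blindly wrapping arbitrary XML.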
4200157057dcf5b520308ae7391f94e48c025331 | montecarlo/pythia6/CMakeLists.txt | montecarlo/pythia6/CMakeLists.txt |
ROOT_USE_PACKAGE(montecarlo/eg)
ROOT_USE_PACKAGE(montecarlo/vmc)
ROOT_USE_PACKAGE(math/physics)
ROOT_GENERATE_DICTIONARY(G__Pythia6 *.h LINKDEF LinkDef.h)
ROOT_GENERATE_ROOTMAP(EGPythia6 LINKDEF LinkDef.h DEPENDENCIES EG Graf VMC Physics )
if(pythia6_nolink)
string(REGEX REPLACE "-Wl,--no-undefined" "" CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS}")
ROOT_LINKER_LIBRARY(EGPythia6 *.cxx G__Pythia6.cxx LIBRARIES Core DEPENDENCIES EG Graf VMC Physics)
else()
ROOT_LINKER_LIBRARY(EGPythia6 *.cxx G__Pythia6.cxx LIBRARIES Core ${PYTHIA6_LIBRARIES} DEPENDENCIES EG Graf VMC Physics)
endif()
ROOT_INSTALL_HEADERS()
|
ROOT_USE_PACKAGE(montecarlo/eg)
ROOT_USE_PACKAGE(montecarlo/vmc)
ROOT_USE_PACKAGE(math/physics)
ROOT_GENERATE_DICTIONARY(G__Pythia6 *.h LINKDEF LinkDef.h)
ROOT_GENERATE_ROOTMAP(EGPythia6 LINKDEF LinkDef.h DEPENDENCIES EG Graf VMC Physics )
if(pythia6_nolink)
string(REGEX REPLACE "-Wl,--no-undefined" "" CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS}")
ROOT_LINKER_LIBRARY(EGPythia6 *.cxx G__Pythia6.cxx LIBRARIES Core DEPENDENCIES EG Graf VMC Physics)
else()
if(MSVC)
SET(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} /SAFESEH:NO ")
link_directories($ENV{LIB})
endif()
ROOT_LINKER_LIBRARY(EGPythia6 *.cxx G__Pythia6.cxx LIBRARIES Core ${PYTHIA6_LIBRARIES} DEPENDENCIES EG Graf VMC Physics)
endif()
ROOT_INSTALL_HEADERS()
| Fix compilation error on Windows | Fix compilation error on Windows
| Text | lgpl-2.1 | tc3t/qoot,tc3t/qoot,tc3t/qoot,tc3t/qoot,tc3t/qoot,tc3t/qoot,tc3t/qoot,tc3t/qoot,tc3t/qoot,tc3t/qoot | text | ## Code Before:
ROOT_USE_PACKAGE(montecarlo/eg)
ROOT_USE_PACKAGE(montecarlo/vmc)
ROOT_USE_PACKAGE(math/physics)
ROOT_GENERATE_DICTIONARY(G__Pythia6 *.h LINKDEF LinkDef.h)
ROOT_GENERATE_ROOTMAP(EGPythia6 LINKDEF LinkDef.h DEPENDENCIES EG Graf VMC Physics )
if(pythia6_nolink)
string(REGEX REPLACE "-Wl,--no-undefined" "" CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS}")
ROOT_LINKER_LIBRARY(EGPythia6 *.cxx G__Pythia6.cxx LIBRARIES Core DEPENDENCIES EG Graf VMC Physics)
else()
ROOT_LINKER_LIBRARY(EGPythia6 *.cxx G__Pythia6.cxx LIBRARIES Core ${PYTHIA6_LIBRARIES} DEPENDENCIES EG Graf VMC Physics)
endif()
ROOT_INSTALL_HEADERS()
## Instruction:
Fix compilation error on Windows
## Code After:
ROOT_USE_PACKAGE(montecarlo/eg)
ROOT_USE_PACKAGE(montecarlo/vmc)
ROOT_USE_PACKAGE(math/physics)
ROOT_GENERATE_DICTIONARY(G__Pythia6 *.h LINKDEF LinkDef.h)
ROOT_GENERATE_ROOTMAP(EGPythia6 LINKDEF LinkDef.h DEPENDENCIES EG Graf VMC Physics )
if(pythia6_nolink)
string(REGEX REPLACE "-Wl,--no-undefined" "" CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS}")
ROOT_LINKER_LIBRARY(EGPythia6 *.cxx G__Pythia6.cxx LIBRARIES Core DEPENDENCIES EG Graf VMC Physics)
else()
if(MSVC)
SET(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} /SAFESEH:NO ")
link_directories($ENV{LIB})
endif()
ROOT_LINKER_LIBRARY(EGPythia6 *.cxx G__Pythia6.cxx LIBRARIES Core ${PYTHIA6_LIBRARIES} DEPENDENCIES EG Graf VMC Physics)
endif()
ROOT_INSTALL_HEADERS()
|
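The `string(REGEX REPLACE ...)` call that both versions keep strips the GNU `--no-undefined` linker flag from the flag string before building the unlinked variant. The same transformation, sketched in Python purely to show the effect (not part of the ROOT build; the whitespace collapse is a small extra tidy-up the CMake code does not do):

```python
import re

def strip_no_undefined(flags):
    """Drop -Wl,--no-undefined from a linker-flag string, as the
    CMake REGEX REPLACE does, then collapse leftover whitespace."""
    flags = re.sub(r"-Wl,--no-undefined", "", flags)
    return re.sub(r"\s+", " ", flags).strip()
```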
94dc41159a503f3be907442f1fbe8f922683de05 | bin/run.sh | bin/run.sh |
set -o errexit
out=$(mktemp)
trap "rm -f $out" EXIT
java -jar "$1" - > $out
printf '\377' # 255 in octal
cat $out
|
set -o errexit
out=$(mktemp)
trap "rm -f $out" EXIT
# Only lets programs run for 30 seconds
timeout 30 java -jar "$1" - > $out
printf '\377' # 255 in octal
cat $out
| Add a timeout, as the user may hang it. | Add a timeout, as the user may hang it.
| Shell | mit | cogumbreiro/sepi-playpen,drlagos/mpisessions-playpen,drlagos/unify-playpen,cogumbreiro/sepi-playpen,drlagos/mpisessions-playpen,drlagos/unify-playpen | shell | ## Code Before:
set -o errexit
out=$(mktemp)
trap "rm -f $out" EXIT
java -jar "$1" - > $out
printf '\377' # 255 in octal
cat $out
## Instruction:
Add a timeout, as the user may hang it.
## Code After:
set -o errexit
out=$(mktemp)
trap "rm -f $out" EXIT
# Only lets programs run for 30 seconds
timeout 30 java -jar "$1" - > $out
printf '\377' # 255 in octal
cat $out
|
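The `timeout 30` prefix is what keeps a runaway user program from hanging the service. The same guard can be sketched in Python with `subprocess`; the limit is shortened below so the example runs quickly:

```python
import subprocess
import sys

def run_with_timeout(cmd, seconds):
    """Run cmd; kill it and return False if it exceeds the time limit."""
    try:
        subprocess.run(cmd, timeout=seconds, check=False)
        return True
    except subprocess.TimeoutExpired:
        return False
```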
6932233d8ce0619fd6abb42efd4acf6f564cc056 | docs/content/lockfile.md | docs/content/lockfile.md | What is the Lockfile?
=====================
Consider the following [Packages file](packages_file.html):
source "http://nuget.org/api/v2"
nuget "Castle.Windsor-log4net" "~> 3.2"
nuget "Rx-Main" "~> 2.0"
In this case we specify dependencies to `Castle.Windsor-log4net` and `Rx-Main`.
Both packages have dependencies to further NuGet packages.
The `package.lock` file is a concrete resolution of all direct or indirect dependencies of your application.
[lang=textfile]
NUGET
remote: http://nuget.org/api/v2
specs:
Castle.Windsor (2.1)
Castle.Windsor-log4net (3.3)
Rx-Core (2.1)
Rx-Main (2.0)
log (1.2)
log4net (1.1)
Further runs of [paket install](packet_install.htm) will not analyze the `packages.fsx` file again.
So if you commit `package.lock` to your version control system,
it ensures that other developers, as well as your build servers,
will always use the same packages that you are using now. | What is the Lockfile?
=====================
Consider the following [Packages file](packages_file.html):
source "http://nuget.org/api/v2"
nuget "Castle.Windsor-log4net" "~> 3.2"
nuget "Rx-Main" "~> 2.0"
In this case we specify dependencies to `Castle.Windsor-log4net` and `Rx-Main`.
Both packages have dependencies to further NuGet packages.
The `package.lock` file is a concrete resolution of all direct or indirect dependencies of your application.
[lang=textfile]
NUGET
remote: http://nuget.org/api/v2
specs:
Castle.Core (3.3.0)
Castle.Core-log4net (3.3.0)
Castle.LoggingFacility (3.3.0)
Castle.Windsor (3.3.0)
Castle.Windsor-log4net (3.3.0)
Rx-Core (2.2.5)
Rx-Interfaces (2.2.5)
Rx-Linq (2.2.5)
Rx-Main (2.2.5)
Rx-PlatformServices (2.3)
log4net (1.2.10)
Further runs of [paket install](packet_install.htm) will not analyze the `packages.fsx` file again.
So if you commit `package.lock` to your version control system,
it ensures that other developers, as well as your build servers,
will always use the same packages that you are using now. | Use the correct dependency resolution in the docs | Use the correct dependency resolution in the docs
| Markdown | mit | baronfel/Paket,vasily-kirichenko/Paket,matthid/Paket,0x53A/Paket,matthid/Paket,dedale/Paket,simonhdickson/Paket,mrinaldi/Paket,inosik/Paket,colinbull/Paket,betgenius/Paket,robertpi/Paket,snowcrazed/Paket,jonathankarsh/Paket,Stift/Paket,0x53A/Paket,vasily-kirichenko/Paket,Stift/Paket,thinkbeforecoding/Paket,ascjones/Paket,lexarchik/Paket,LexxFedoroff/Paket,0x53A/Paket,fsprojects/Paket,colinbull/Paket,kjellski/Paket,MorganPersson/Paket,magicmonty/Paket,NatElkins/Paket,simonhdickson/Paket,thinkbeforecoding/Paket,lexarchik/Paket,theimowski/Paket,cloudRoutine/Paket,ctaggart/Paket,0x53A/Paket,lexarchik/Paket,matthid/Paket,jam40jeff/Paket,vbfox/Paket,thinkbeforecoding/Paket,mavnn/Paket,fsprojects/Paket,ascjones/Paket,LexxFedoroff/Paket,vbfox/Paket,robertpi/Paket,jam40jeff/Paket,konste/Paket,cloudRoutine/Paket,magicmonty/Paket,MorganPersson/Paket,sergey-tihon/Paket,MorganPersson/Paket,MorganPersson/Paket,simonhdickson/Paket,Thorium/Paket,kjellski/Paket,thinkbeforecoding/Paket,snowcrazed/Paket,ovatsus/Paket,robertpi/Paket,snowcrazed/Paket,magicmonty/Paket,konste/Paket,isaacabraham/Paket,cloudRoutine/Paket,NatElkins/Paket,antymon4o/Paket,Stift/Paket,robertpi/Paket,baronfel/Paket,isaacabraham/Paket,magicmonty/Paket,vbfox/Paket,mexx/Paket,mexx/Paket,mrinaldi/Paket,mrinaldi/Paket,Thorium/Paket,Thorium/Paket,isaacabraham/Paket,vbfox/Paket,betgenius/Paket,devboy/Paket,ovatsus/Paket,ctaggart/Paket,mausch/Paket,mrinaldi/Paket,sergey-tihon/Paket,mausch/Paket,inosik/Paket,isaacabraham/Paket,jonathankarsh/Paket,antymon4o/Paket,matthid/Paket,mavnn/Paket,dedale/Paket,lexarchik/Paket,NatElkins/Paket,Stift/Paket,Thorium/Paket,theimowski/Paket,cloudRoutine/Paket,NatElkins/Paket,simonhdickson/Paket,artur-s/Paket,snowcrazed/Paket,artur-s/Paket,devboy/Paket | markdown | ## Code Before:
What is the Lockfile?
=====================
Consider the following [Packages file](packages_file.html):
source "http://nuget.org/api/v2"
nuget "Castle.Windsor-log4net" "~> 3.2"
nuget "Rx-Main" "~> 2.0"
In this case we specify dependencies to `Castle.Windsor-log4net` and `Rx-Main`.
Both packages have dependencies to further NuGet packages.
The `package.lock` file is a concrete resolution of all direct or indirect dependencies of your application.
[lang=textfile]
NUGET
remote: http://nuget.org/api/v2
specs:
Castle.Windsor (2.1)
Castle.Windsor-log4net (3.3)
Rx-Core (2.1)
Rx-Main (2.0)
log (1.2)
log4net (1.1)
Further runs of [paket install](packet_install.htm) will not analyze the `packages.fsx` file again.
So if you commit `package.lock` to your version control system,
it ensures that other developers, as well as your build servers,
will always use the same packages that you are using now.
## Instruction:
Use the correct dependency resolution in the docs
## Code After:
What is the Lockfile?
=====================
Consider the following [Packages file](packages_file.html):
source "http://nuget.org/api/v2"
nuget "Castle.Windsor-log4net" "~> 3.2"
nuget "Rx-Main" "~> 2.0"
In this case we specify dependencies to `Castle.Windsor-log4net` and `Rx-Main`.
Both packages have dependencies to further NuGet packages.
The `package.lock` file is a concrete resolution of all direct or indirect dependencies of your application.
[lang=textfile]
NUGET
remote: http://nuget.org/api/v2
specs:
Castle.Core (3.3.0)
Castle.Core-log4net (3.3.0)
Castle.LoggingFacility (3.3.0)
Castle.Windsor (3.3.0)
Castle.Windsor-log4net (3.3.0)
Rx-Core (2.2.5)
Rx-Interfaces (2.2.5)
Rx-Linq (2.2.5)
Rx-Main (2.2.5)
Rx-PlatformServices (2.3)
log4net (1.2.10)
Further runs of [paket install](packet_install.htm) will not analyze the `packages.fsx` file again.
So if you commit `package.lock` to your version control system,
it ensures that other developers, as well as your build servers,
will always use the same packages that you are using now. |
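What the lock file records is the transitive closure of the dependency graph, pinned to concrete versions. A toy resolver makes the idea concrete (the registry contents are invented for illustration; Paket's real resolver, which honours version ranges and sources, is far more involved):

```python
def resolve(direct_deps, registry):
    """Pin every direct and transitive dependency to its registry version."""
    pinned = {}
    queue = list(direct_deps)
    while queue:
        name = queue.pop()
        if name in pinned:
            continue
        version, deps = registry[name]
        pinned[name] = version
        queue.extend(deps)
    return pinned

TOY_REGISTRY = {
    "Rx-Main": ("2.2.5", ["Rx-Core", "Rx-Linq"]),
    "Rx-Core": ("2.2.5", ["Rx-Interfaces"]),
    "Rx-Linq": ("2.2.5", ["Rx-Core"]),
    "Rx-Interfaces": ("2.2.5", []),
}
```

Committing the pinned result is what lets every developer and build server restore the exact same set.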
fbc8c87a1b4cfa1f5de5f38b326216889e55a89f | 2013/thought_sauce/hkid_checksum/lib/Hkid.rb | 2013/thought_sauce/hkid_checksum/lib/Hkid.rb |
class Hkid
def check_digit(hkid)
sum = 0
leading_char = []
hkid.downcase.reverse.split(//).each_with_index do |char, index|
if index < 6
sum += char.to_i * (index + 2)
else
sum += (char.ord - 96) * (index + 2)
end
end
checksum = 11 - sum % 11
if checksum < 11
checksum.to_s
else
"A"
end
end
end
|
class Hkid
def check_digit(hkid)
sum = 0
leading_char = []
hkid.downcase.reverse.split(//).each_with_index do |char, index|
if index < 6
sum += char.to_i * (index + 2)
else
sum += (char.ord - 96) * (index + 2)
end
end
(11 - sum % 11).to_s(11).upcase
end
end
| Fix code to pass check digit is “A” | Fix code to pass check digit is “A”
| Ruby | mit | linc01n/interview | ruby | ## Code Before:
class Hkid
def check_digit(hkid)
sum = 0
leading_char = []
hkid.downcase.reverse.split(//).each_with_index do |char, index|
if index < 6
sum += char.to_i * (index + 2)
else
sum += (char.ord - 96) * (index + 2)
end
end
checksum = 11 - sum % 11
if checksum < 11
checksum.to_s
else
"A"
end
end
end
## Instruction:
Fix code to pass check digit is “A”
## Code After:
class Hkid
def check_digit(hkid)
sum = 0
leading_char = []
hkid.downcase.reverse.split(//).each_with_index do |char, index|
if index < 6
sum += char.to_i * (index + 2)
else
sum += (char.ord - 96) * (index + 2)
end
end
(11 - sum % 11).to_s(11).upcase
end
end
|
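The fixed Ruby above folds both branches into one base-11 conversion: a remainder that makes `11 - sum % 11` equal 10 becomes the letter check digit "A". An equivalent Python sketch follows; the two test IDs are constructed for illustration, not taken from real cards:

```python
def check_digit(hkid):
    """Weighted HKID checksum, mirroring the fixed Ruby version."""
    total = 0
    for index, char in enumerate(reversed(hkid.lower())):
        if index < 6:                  # the six numeric positions
            total += int(char) * (index + 2)
        else:                          # letter prefix: a=1, b=2, ...
            total += (ord(char) - 96) * (index + 2)
    value = 11 - total % 11            # in 1..11
    # Ruby's (11 - sum % 11).to_s(11).upcase maps 10 -> "A" and 11 -> "10".
    return "10" if value == 11 else "0123456789A"[value]
```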
b91102ed6f6b8bff0d012485f54e103a0f9a2ddf | scripts/get-release-version.sh | scripts/get-release-version.sh | caf_version=`grep "define CAF_VERSION" libcaf_core/caf/config.hpp | awk '{ printf "%d.%d.%d", int($3 / 10000), int($3 / 100) % 100, $3 % 100 }'`
# Get SHA from Git.
git_sha=`git log --pretty=format:%h -n 1`
# Check whether the current SHA is a tag.
if git describe --tags --contains $git_sha 1>release.txt 2>/dev/null
then
# Tags indicate stable release -> use tag version.
# On success, we'll have the tag version in release.txt now, so we're done.
exit 0
fi
# Generate default release version.
echo "$caf_version+exp.sha.$git_sha" >release.txt
| caf_version=`grep "define CAF_VERSION" libcaf_core/caf/config.hpp | awk '{ printf "%d.%d.%d", int($3 / 10000), int($3 / 100) % 100, $3 % 100 }'`
# Get SHA from Git.
git_sha=`git log --pretty=format:%h -n 1`
# Check whether the current SHA is a tag.
if git describe --tags --contains $git_sha 1>release.txt 2>/dev/null
then
# Tags indicate stable release -> use tag version.
# On success, we'll have the tag version in release.txt now, so we're done.
echo "build a tagged release: $caf_version"
else
# Generate default release version.
caf_release_version="$caf_version+exp.sha.$git_sha"
echo "build a commit version: $caf_release_version"
echo "$caf_release_version" >release.txt
fi
| Print build when getting release version | Print build when getting release version
| Shell | bsd-3-clause | DavadDi/actor-framework,DavadDi/actor-framework,actor-framework/actor-framework,DavadDi/actor-framework,DavadDi/actor-framework,actor-framework/actor-framework,actor-framework/actor-framework,actor-framework/actor-framework | shell | ## Code Before:
caf_version=`grep "define CAF_VERSION" libcaf_core/caf/config.hpp | awk '{ printf "%d.%d.%d", int($3 / 10000), int($3 / 100) % 100, $3 % 100 }'`
# Get SHA from Git.
git_sha=`git log --pretty=format:%h -n 1`
# Check whether the current SHA is a tag.
if git describe --tags --contains $git_sha 1>release.txt 2>/dev/null
then
# Tags indicate stable release -> use tag version.
# On success, we'll have the tag version in release.txt now, so we're done.
exit 0
fi
# Generate default release version.
echo "$caf_version+exp.sha.$git_sha" >release.txt
## Instruction:
Print build when getting release version
## Code After:
caf_version=`grep "define CAF_VERSION" libcaf_core/caf/config.hpp | awk '{ printf "%d.%d.%d", int($3 / 10000), int($3 / 100) % 100, $3 % 100 }'`
# Get SHA from Git.
git_sha=`git log --pretty=format:%h -n 1`
# Check whether the current SHA is a tag.
if git describe --tags --contains $git_sha 1>release.txt 2>/dev/null
then
# Tags indicate stable release -> use tag version.
# On success, we'll have the tag version in release.txt now, so we're done.
echo "build a tagged release: $caf_version"
else
# Generate default release version.
caf_release_version="$caf_version+exp.sha.$git_sha"
echo "build a commit version: $caf_release_version"
echo "$caf_release_version" >release.txt
fi
|
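Two pieces of the script are worth unpacking: the awk program decodes `CAF_VERSION` (an integer such as 1705) into dotted form, and the fallback branch appends the commit SHA. Both steps in Python, with made-up version and SHA values:

```python
def caf_version_string(encoded):
    """Decode e.g. 1705 -> "0.17.5", exactly as the awk expression does."""
    return f"{encoded // 10000}.{encoded // 100 % 100}.{encoded % 100}"

def release_version(encoded, git_sha, tagged):
    """Tagged builds use the plain version; others append +exp.sha.<sha>."""
    base = caf_version_string(encoded)
    return base if tagged else f"{base}+exp.sha.{git_sha}"
```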
c275dfe386fc8430a87e0cd5649239af966a8c62 | keymaps/autocomplete-plus.cson | keymaps/autocomplete-plus.cson | ".autocomplete-plus input.hidden-input":
"tab": "autocomplete-plus:confirm"
"down": "autocomplete-plus:select-next"
"up": "autocomplete-plus:select-previous"
"escape": "autocomplete-plus:cancel"
".editor":
"ctrl-shift-space": "autocomplete-plus:activate"
| ".autocomplete-plus input.hidden-input":
"tab": "autocomplete-plus:confirm"
"down": "autocomplete-plus:select-next"
"ctrl-n": "autocomplete-plus:select-next"
"up": "autocomplete-plus:select-previous"
"ctrl-p": "autocomplete-plus:select-previous"
"escape": "autocomplete-plus:cancel"
".editor":
"ctrl-shift-space": "autocomplete-plus:activate"
| Add ctrl-n,p emacs bindings for next, previous | Add ctrl-n,p emacs bindings for next, previous
Emacs binds are supported through most of the rest of Atom. Plus it's rough having to drop down to the arrow keys in the midst of typing :) | CoffeeScript | mit | Lyleo/autocomplete-plus,atom/autocomplete-plus,lestat220255/autocomplete-plus,brntbeer/autocomplete-plus,atom-community/autocomplete-plus,kevinsawicki/autocomplete-plus,wakermahmud/autocomplete-plus,kevinsawicki/autocomplete-plus | coffeescript | ## Code Before:
".autocomplete-plus input.hidden-input":
"tab": "autocomplete-plus:confirm"
"down": "autocomplete-plus:select-next"
"up": "autocomplete-plus:select-previous"
"escape": "autocomplete-plus:cancel"
".editor":
"ctrl-shift-space": "autocomplete-plus:activate"
## Instruction:
Add ctrl-n,p emacs bindings for next, previous
Emacs binds are supported through most of the rest of Atom. Plus it's rough having to drop down to the arrow keys in the midst of typing :)
## Code After:
".autocomplete-plus input.hidden-input":
"tab": "autocomplete-plus:confirm"
"down": "autocomplete-plus:select-next"
"ctrl-n": "autocomplete-plus:select-next"
"up": "autocomplete-plus:select-previous"
"ctrl-p": "autocomplete-plus:select-previous"
"escape": "autocomplete-plus:cancel"
".editor":
"ctrl-shift-space": "autocomplete-plus:activate"
|
d82580647ea3c85fc14dbcc3db470867ef7499a6 | ql/test/qlpack.yml | ql/test/qlpack.yml | name: codeql/go-tests
version: 0.0.2
dependencies:
codeql/go-queries: ^0.0.2
codeql/go-all: ^0.0.2
extractor: go
| name: codeql/go-tests
version: 0.0.2
dependencies:
codeql/go-queries: ^0.0.2
codeql/go-all: ^0.0.2
codeql/go-examples: ^0.0.2
extractor: go
| Add reference to `codeql/go-examples` pack from test pack | Add reference to `codeql/go-examples` pack from test pack
| YAML | mit | github/codeql,github/codeql,github/codeql-go,github/codeql,github/codeql,github/codeql,github/codeql-go,github/codeql,github/codeql,github/codeql-go,github/codeql,github/codeql,github/codeql,github/codeql,github/codeql,github/codeql,github/codeql,github/codeql,github/codeql-go | yaml | ## Code Before:
name: codeql/go-tests
version: 0.0.2
dependencies:
codeql/go-queries: ^0.0.2
codeql/go-all: ^0.0.2
extractor: go
## Instruction:
Add reference to `codeql/go-examples` pack from test pack
## Code After:
name: codeql/go-tests
version: 0.0.2
dependencies:
codeql/go-queries: ^0.0.2
codeql/go-all: ^0.0.2
codeql/go-examples: ^0.0.2
extractor: go
|
d6dcdf1c6b804d1f8e3b0a147b712e2ed9d45431 | ConfigureChecks.cmake | ConfigureChecks.cmake | INCLUDE(CheckIncludeFile)
INCLUDE(CheckLibraryExists)
# Only build statically on Windows.
IF(WIN32)
SET(CMAKE_EXE_LINKER_FLAGS "-static")
SET(CMAKE_FIND_LIBRARY_SUFFIXES .lib .a ${CMAKE_FIND_LIBRARY_SUFFIXES})
ELSE(WIN32)
SET(CMAKE_FIND_LIBRARY_SUFFIXES .a ${CMAKE_FIND_LIBRARY_SUFFIXES})
ENDIF(WIN32)
# Ignore depreciation warnings on OSX because GLUT is deprecated.
IF(APPLE)
add_compile_options(-Wno-deprecated)
ENDIF(APPLE)
# Find required libraries.
find_package(OpenGL REQUIRED)
find_package(GLUT REQUIRED)
find_package(Threads REQUIRED)
# Check for some include files.
check_include_files(dirent.h HAVE_DIRENT_H)
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/config.h.in ${CMAKE_CURRENT_BINARY_DIR}/config.h)
| INCLUDE(CheckIncludeFile)
INCLUDE(CheckLibraryExists)
# Only build statically on Windows.
IF(WIN32)
SET(CMAKE_EXE_LINKER_FLAGS "-static")
SET(CMAKE_FIND_LIBRARY_SUFFIXES .lib .a ${CMAKE_FIND_LIBRARY_SUFFIXES})
ELSE(WIN32)
SET(CMAKE_FIND_LIBRARY_SUFFIXES .a ${CMAKE_FIND_LIBRARY_SUFFIXES})
ENDIF(WIN32)
# Ignore depreciation warnings on OSX because GLUT is deprecated.
IF(APPLE)
add_compile_options(-Wno-deprecated)
ENDIF(APPLE)
# Find required libraries.
find_package(OpenGL REQUIRED)
find_package(GLUT REQUIRED)
find_package(Threads REQUIRED)
find_package(PNG REQUIRED)
# Check for some include files.
check_include_files(dirent.h HAVE_DIRENT_H)
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/config.h.in ${CMAKE_CURRENT_BINARY_DIR}/config.h)
| Add libpng as a required package for CMake. | Add libpng as a required package for CMake.
| CMake | apache-2.0 | nol888/0x40hues,nol888/0x40hues,nol888/0x40hues | cmake | ## Code Before:
INCLUDE(CheckIncludeFile)
INCLUDE(CheckLibraryExists)
# Only build statically on Windows.
IF(WIN32)
SET(CMAKE_EXE_LINKER_FLAGS "-static")
SET(CMAKE_FIND_LIBRARY_SUFFIXES .lib .a ${CMAKE_FIND_LIBRARY_SUFFIXES})
ELSE(WIN32)
SET(CMAKE_FIND_LIBRARY_SUFFIXES .a ${CMAKE_FIND_LIBRARY_SUFFIXES})
ENDIF(WIN32)
# Ignore depreciation warnings on OSX because GLUT is deprecated.
IF(APPLE)
add_compile_options(-Wno-deprecated)
ENDIF(APPLE)
# Find required libraries.
find_package(OpenGL REQUIRED)
find_package(GLUT REQUIRED)
find_package(Threads REQUIRED)
# Check for some include files.
check_include_files(dirent.h HAVE_DIRENT_H)
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/config.h.in ${CMAKE_CURRENT_BINARY_DIR}/config.h)
## Instruction:
Add libpng as a required package for CMake.
## Code After:
INCLUDE(CheckIncludeFile)
INCLUDE(CheckLibraryExists)
# Only build statically on Windows.
IF(WIN32)
SET(CMAKE_EXE_LINKER_FLAGS "-static")
SET(CMAKE_FIND_LIBRARY_SUFFIXES .lib .a ${CMAKE_FIND_LIBRARY_SUFFIXES})
ELSE(WIN32)
SET(CMAKE_FIND_LIBRARY_SUFFIXES .a ${CMAKE_FIND_LIBRARY_SUFFIXES})
ENDIF(WIN32)
# Ignore depreciation warnings on OSX because GLUT is deprecated.
IF(APPLE)
add_compile_options(-Wno-deprecated)
ENDIF(APPLE)
# Find required libraries.
find_package(OpenGL REQUIRED)
find_package(GLUT REQUIRED)
find_package(Threads REQUIRED)
find_package(PNG REQUIRED)
# Check for some include files.
check_include_files(dirent.h HAVE_DIRENT_H)
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/config.h.in ${CMAKE_CURRENT_BINARY_DIR}/config.h)
|
8f60692f34da1ad83e5f39d62de8c35aac257308 | spec/lib/core/parser_spec.rb | spec/lib/core/parser_spec.rb | require 'spec_helper'
describe Parser do
class ParseTestCommand < Command
syntax "testing_good_command"
end
context '.parse_and_execute' do
it "handles a bad command" do
expect_any_instance_of(UnknownCommand).to receive(:execute)
subject.parse_and_execute(1, "DEFINITELY_NEVER_GOING_TO_BE_A_GOOD_COMMAND")
end
it "calls .execute on a good command" do
expect_any_instance_of(ParseTestCommand).to receive(:execute)
subject.parse_and_execute(1, "testing_good_command")
end
end
context '.register_syntax' do
it "remembers a given syntax and class" do
class TestEmptyClass; end
Parser.register_syntax("foo", TestEmptyClass)
expect(Parser.syntaxes["foo"]).to eq(TestEmptyClass)
end
end
end
| require 'spec_helper'
describe Parser do
class ParseTestCommand < Command
syntax "testing_good_command"
end
let (:connection) { double(Connection).as_null_object }
context '.parse_and_execute' do
it "handles a bad command" do
expect_any_instance_of(UnknownCommand).to receive(:execute)
subject.parse_and_execute(connection, "DEFINITELY_NEVER_GOING_TO_BE_A_GOOD_COMMAND")
end
it "calls .execute on a good command" do
expect_any_instance_of(ParseTestCommand).to receive(:execute)
subject.parse_and_execute(connection, "testing_good_command")
end
end
context '.register_syntax' do
it "remembers a given syntax and class" do
class TestEmptyClass; end
Parser.register_syntax("foo", TestEmptyClass)
expect(Parser.syntaxes["foo"]).to eq(TestEmptyClass)
end
end
end
| Fix parser spec to not explode | Fix parser spec to not explode
| Ruby | mit | tkrajcar/carbonmu,1337807/carbonmu | ruby | ## Code Before:
require 'spec_helper'
describe Parser do
class ParseTestCommand < Command
syntax "testing_good_command"
end
context '.parse_and_execute' do
it "handles a bad command" do
expect_any_instance_of(UnknownCommand).to receive(:execute)
subject.parse_and_execute(1, "DEFINITELY_NEVER_GOING_TO_BE_A_GOOD_COMMAND")
end
it "calls .execute on a good command" do
expect_any_instance_of(ParseTestCommand).to receive(:execute)
subject.parse_and_execute(1, "testing_good_command")
end
end
context '.register_syntax' do
it "remembers a given syntax and class" do
class TestEmptyClass; end
Parser.register_syntax("foo", TestEmptyClass)
expect(Parser.syntaxes["foo"]).to eq(TestEmptyClass)
end
end
end
## Instruction:
Fix parser spec to not explode
## Code After:
require 'spec_helper'
describe Parser do
class ParseTestCommand < Command
syntax "testing_good_command"
end
let (:connection) { double(Connection).as_null_object }
context '.parse_and_execute' do
it "handles a bad command" do
expect_any_instance_of(UnknownCommand).to receive(:execute)
subject.parse_and_execute(connection, "DEFINITELY_NEVER_GOING_TO_BE_A_GOOD_COMMAND")
end
it "calls .execute on a good command" do
expect_any_instance_of(ParseTestCommand).to receive(:execute)
subject.parse_and_execute(connection, "testing_good_command")
end
end
context '.register_syntax' do
it "remembers a given syntax and class" do
class TestEmptyClass; end
Parser.register_syntax("foo", TestEmptyClass)
expect(Parser.syntaxes["foo"]).to eq(TestEmptyClass)
end
end
end
|
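The fixture change works because `double(Connection).as_null_object` yields a stub that silently accepts any message, so the parser can poke the connection without the spec blowing up. Python's `unittest.mock.MagicMock` gives the same null-object behaviour; the `parse_and_execute` stand-in below is illustrative, not the project's real parser:

```python
from unittest.mock import MagicMock

def parse_and_execute(connection, text):
    """Stand-in parser that, like the real one, writes to the connection."""
    connection.puts(f"executing: {text}")
    return text == "testing_good_command"

null_connection = MagicMock(name="connection")  # ~ double(...).as_null_object
```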
c9861ef2d8f76d622c6ea558e9007a746388c2a3 | src/main.ts | src/main.ts | import * as rtl from 'rtl-css-js';
export interface JssRTLOptions {
enabled?: boolean;
opt?: 'in' | 'out';
}
export default function jssRTL({ enabled = true, opt = 'out' }: JssRTLOptions = {}) {
return {
onProcessStyle(style: any, _: any, sheet: any) {
if (!enabled) {
if (typeof style.flip === 'boolean') {
delete style.flip;
}
return style;
}
let flip = opt === 'out'; // If it's set to opt-out, then it should flip by default
if (typeof sheet.options.flip === 'boolean') {
flip = sheet.options.flip;
}
if (typeof style.flip === 'boolean') {
flip = style.flip;
delete style.flip;
}
if (!flip) {
return style;
}
return rtl(style);
},
};
}
| import * as rtl from 'rtl-css-js';
const convert = rtl['default'] || rtl;
export interface JssRTLOptions {
enabled?: boolean;
opt?: 'in' | 'out';
}
export default function jssRTL({ enabled = true, opt = 'out' }: JssRTLOptions = {}) {
return {
onProcessStyle(style: any, _: any, sheet: any) {
if (!enabled) {
if (typeof style.flip === 'boolean') {
delete style.flip;
}
return style;
}
let flip = opt === 'out'; // If it's set to opt-out, then it should flip by default
if (typeof sheet.options.flip === 'boolean') {
flip = sheet.options.flip;
}
if (typeof style.flip === 'boolean') {
flip = style.flip;
delete style.flip;
}
if (!flip) {
return style;
}
return convert(style);
},
};
}
| Fix for cases where esm version of rtl-css-js is hit | Fix for cases where esm version of rtl-css-js is hit
| TypeScript | mit | alitaheri/jss-rtl | typescript | ## Code Before:
import * as rtl from 'rtl-css-js';
export interface JssRTLOptions {
enabled?: boolean;
opt?: 'in' | 'out';
}
export default function jssRTL({ enabled = true, opt = 'out' }: JssRTLOptions = {}) {
return {
onProcessStyle(style: any, _: any, sheet: any) {
if (!enabled) {
if (typeof style.flip === 'boolean') {
delete style.flip;
}
return style;
}
let flip = opt === 'out'; // If it's set to opt-out, then it should flip by default
if (typeof sheet.options.flip === 'boolean') {
flip = sheet.options.flip;
}
if (typeof style.flip === 'boolean') {
flip = style.flip;
delete style.flip;
}
if (!flip) {
return style;
}
return rtl(style);
},
};
}
## Instruction:
Fix for cases where esm version of rtl-css-js is hit
## Code After:
import * as rtl from 'rtl-css-js';
const convert = rtl['default'] || rtl;
export interface JssRTLOptions {
enabled?: boolean;
opt?: 'in' | 'out';
}
export default function jssRTL({ enabled = true, opt = 'out' }: JssRTLOptions = {}) {
return {
onProcessStyle(style: any, _: any, sheet: any) {
if (!enabled) {
if (typeof style.flip === 'boolean') {
delete style.flip;
}
return style;
}
let flip = opt === 'out'; // If it's set to opt-out, then it should flip by default
if (typeof sheet.options.flip === 'boolean') {
flip = sheet.options.flip;
}
if (typeof style.flip === 'boolean') {
flip = style.flip;
delete style.flip;
}
if (!flip) {
return style;
}
return convert(style);
},
};
}
|
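The guard introduced in the record above (`const convert = rtl['default'] || rtl;`) is a common interop pattern: a namespace import of a package shipped as both CommonJS and ES modules may resolve to the function itself, or to a module object carrying the function on its `default` property. A minimal self-contained sketch of that pattern — `StyleFn` and `pickCallable` are illustrative names, not part of jss-rtl or rtl-css-js:

```typescript
// Sketch: why `rtl['default'] || rtl` works for a dual CJS/ESM package.
// A CJS build exposes the function directly; an ESM build wraps it in a
// module object whose `default` property holds the function.
type StyleFn = (style: Record<string, unknown>) => Record<string, unknown>;

function pickCallable(mod: StyleFn | { default: StyleFn }): StyleFn {
  // A function type satisfies `{ default?: StyleFn }` (the property is
  // optional), so this cast is safe for both shapes of `mod`.
  const maybeWrapped = mod as { default?: StyleFn };
  // Same idea as `rtl['default'] || rtl`: prefer `.default` when it exists.
  return maybeWrapped.default || (mod as StyleFn);
}

// CJS-style: the module *is* the function.
const cjsModule: StyleFn = (style) => ({ ...style, direction: "rtl" });
// ESM-style: the function sits on `.default`.
const esmModule = { default: cjsModule };

console.log(pickCallable(cjsModule) === cjsModule); // true
console.log(pickCallable(esmModule) === cjsModule); // true
```

Either shape ends up calling the same underlying function, which is why the single-line guard fixes the "esm version is hit" case without touching the rest of the plugin.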
ca4feae4515df8796aa13dac7379117fa2b79728 | frontend/README.md | frontend/README.md |
- `$ gulp` to build an optimized version of your application in folder dist
- `$ gulp serve` to start BrowserSync server on your source files with live reload
- `$ gulp serve:dist` to start BrowserSync server on your optimized application without live reload
- `$ gulp test` to run your unit tests with Karma
- `$ gulp test:auto` to run your unit tests with Karma in watch mode
- `$ gulp protractor` to launch your e2e tests with Protractor
- `$ gulp protractor:dist` to launch your e2e tests with Protractor on the dist files
|
Install dependencies
npm install
bower install
Develop
- `gulp serve` to start BrowserSync server on your source files with live reload
Test
- `gulp test` to run your unit tests with Karma
- `gulp test:auto` to run your unit tests with Karma in watch mode
- `gulp protractor` to launch your e2e tests with Protractor
Build dist version
- `gulp` to build an optimized version of your application in folder dist
- `gulp serve:dist` to start BrowserSync server on your optimized application without live reload
- `gulp protractor:dist` to launch your e2e tests with Protractor on the dist files
| Update readme with common commands | Update readme with common commands
| Markdown | mit | Vilsepi/after,Vilsepi/after,Vilsepi/after,Vilsepi/after,Vilsepi/after | markdown | ## Code Before:
- `$ gulp` to build an optimized version of your application in folder dist
- `$ gulp serve` to start BrowserSync server on your source files with live reload
- `$ gulp serve:dist` to start BrowserSync server on your optimized application without live reload
- `$ gulp test` to run your unit tests with Karma
- `$ gulp test:auto` to run your unit tests with Karma in watch mode
- `$ gulp protractor` to launch your e2e tests with Protractor
- `$ gulp protractor:dist` to launch your e2e tests with Protractor on the dist files
## Instruction:
Update readme with common commands
## Code After:
Install dependencies
npm install
bower install
Develop
- `gulp serve` to start BrowserSync server on your source files with live reload
Test
- `gulp test` to run your unit tests with Karma
- `gulp test:auto` to run your unit tests with Karma in watch mode
- `gulp protractor` to launch your e2e tests with Protractor
Build dist version
- `gulp` to build an optimized version of your application in folder dist
- `gulp serve:dist` to start BrowserSync server on your optimized application without live reload
- `gulp protractor:dist` to launch your e2e tests with Protractor on the dist files
|
317a5f93b212c0ec9fc566ee3d6887c57da555ed | spec/session_spec.rb | spec/session_spec.rb | require_relative "spec_helper"
describe "session handling" do
include CookieJar
it "should give a warning if session variable is not available" do
app do |r|
begin
session
rescue Exception => e
e.message
end
end
body.must_match("You're missing a session handler, try using the sessions plugin.")
end
it "should return session if rack session middleware is used" do
app(:bare) do
use Rack::Session::Cookie, :secret=>'1'
route do |r|
r.on do
(session[1] ||= 'a'.dup) << 'b'
session[1]
end
end
end
_, _, b = req
b.join.must_equal 'ab'
_, _, b = req
b.join.must_equal 'abb'
_, _, b = req
b.join.must_equal 'abbb'
end
end
| require_relative "spec_helper"
describe "session handling" do
include CookieJar
it "should give a warning if session variable is not available" do
app do |r|
begin
session
rescue Exception => e
e.message
end
end
body.must_match("You're missing a session handler, try using the sessions plugin.")
end
it "should return session if session middleware is used" do
require 'roda/session_middleware'
app(:bare) do
if RUBY_VERSION >= '2.0'
require 'roda/session_middleware'
use RodaSessionMiddleware, :secret=>'1'*64
else
use Rack::Session::Cookie, :secret=>'1'*64
end
route do |r|
r.on do
(session[1] ||= 'a'.dup) << 'b'
session[1]
end
end
end
_, _, b = req
b.join.must_equal 'ab'
_, _, b = req
b.join.must_equal 'abb'
_, _, b = req
b.join.must_equal 'abbb'
end
end
| Use roda session middleware instead of rack session middleware in session spec on Ruby 2+ | Use roda session middleware instead of rack session middleware in session spec on Ruby 2+
This works around the issue that Rack::Session::Cookie isn't
available in Rack starting in Rack 3+.
| Ruby | mit | jeremyevans/roda,jeremyevans/roda,jeremyevans/roda | ruby | ## Code Before:
require_relative "spec_helper"
describe "session handling" do
include CookieJar
it "should give a warning if session variable is not available" do
app do |r|
begin
session
rescue Exception => e
e.message
end
end
body.must_match("You're missing a session handler, try using the sessions plugin.")
end
it "should return session if rack session middleware is used" do
app(:bare) do
use Rack::Session::Cookie, :secret=>'1'
route do |r|
r.on do
(session[1] ||= 'a'.dup) << 'b'
session[1]
end
end
end
_, _, b = req
b.join.must_equal 'ab'
_, _, b = req
b.join.must_equal 'abb'
_, _, b = req
b.join.must_equal 'abbb'
end
end
## Instruction:
Use roda session middleware instead of rack session middleware in session spec on Ruby 2+
This works around the issue that Rack::Session::Cookie isn't
available in Rack starting in Rack 3+.
## Code After:
require_relative "spec_helper"
describe "session handling" do
include CookieJar
it "should give a warning if session variable is not available" do
app do |r|
begin
session
rescue Exception => e
e.message
end
end
body.must_match("You're missing a session handler, try using the sessions plugin.")
end
it "should return session if session middleware is used" do
require 'roda/session_middleware'
app(:bare) do
if RUBY_VERSION >= '2.0'
require 'roda/session_middleware'
use RodaSessionMiddleware, :secret=>'1'*64
else
use Rack::Session::Cookie, :secret=>'1'*64
end
route do |r|
r.on do
(session[1] ||= 'a'.dup) << 'b'
session[1]
end
end
end
_, _, b = req
b.join.must_equal 'ab'
_, _, b = req
b.join.must_equal 'abb'
_, _, b = req
b.join.must_equal 'abbb'
end
end
|
5ef1d115dfed0314dd2787a76f1edea64b36ebb1 | .travis.yml | .travis.yml | language: ruby
before_install:
- gem install bundler -v '= 1.5.1'
rvm:
- 1.9.3
- 2.0
- 2.1
env:
- rails=3.1.8
- rails=3.2.8
- rails=3.2.16
- rails=4.0.2
- rails=4.1.0.beta1
- orm=mongoid2
- orm=mongoid3
- orm=mongoid4
- orm=mongo_mapper
- table_name_prefix=h_
- table_name_suffix=_h
services:
- mongodb
| language: ruby
before_install:
- gem install bundler -v '= 1.5.1'
rvm:
- 1.9.3
- 2.0
- 2.1
env:
- rails=3.1.12
- rails=3.2.18
- rails=4.0.5
- rails=4.1.1
- orm=mongoid2
- orm=mongoid3
- orm=mongoid4
- orm=mongo_mapper
- table_name_prefix=h_
- table_name_suffix=_h
services:
- mongodb
| Move to latest Rails releases for 3.1.x - 4.1.x | Move to latest Rails releases for 3.1.x - 4.1.x
| YAML | mit | AICIDNN/doorkeeper,GeekOnCoffee/doorkeeper,Uysim/doorkeeper,nbulaj/doorkeeper,identification-io/doorkeeper,kolorahl/doorkeeper,pmdeazeta/doorkeeper,telekomatrix/doorkeeper,shivakumaarmgs/doorkeeper,dollarshaveclub/doorkeeper,lalithr95/doorkeeper,jasl/doorkeeper,outstand/doorkeeper,ValMilkevich/doorkeeper,doorkeeper-gem/doorkeeper,ValMilkevich/doorkeeper,daande/doorkeeper,stormz/doorkeeper,kolorahl/doorkeeper,Uysim/doorkeeper,stefania11/doorkeeper,doorkeeper-gem/doorkeeper,mavenlink/doorkeeper,ifeelgoods/doorkeeper,stefania11/doorkeeper,pmdeazeta/doorkeeper,dk1234567/doorkeeper,dk1234567/doorkeeper,jasl/doorkeeper,outstand/doorkeeper,AdStage/doorkeeper,doorkeeper-gem/doorkeeper,mavenlink/doorkeeper,telekomatrix/doorkeeper,vovimayhem/doorkeeper,ngpestelos/doorkeeper,daande/doorkeeper,jasl/doorkeeper,6fusion/doorkeeper,AICIDNN/doorkeeper,lalithr95/doorkeeper,EasterAndJay/doorkeeper,EasterAndJay/doorkeeper,moneytree/doorkeeper,vanboom/doorkeeper,ifeelgoods/doorkeeper,dollarshaveclub/doorkeeper,ezilocchi/doorkeeper,phillbaker/doorkeeper,ezilocchi/doorkeeper,mavenlink/doorkeeper,ngpestelos/doorkeeper,6fusion/doorkeeper,outstand/doorkeeper,ukasiu/doorkeeper,stormz/doorkeeper,identification-io/doorkeeper,ukasiu/doorkeeper,calfzhou/doorkeeper,shivakumaarmgs/doorkeeper,nbulaj/doorkeeper,GeekOnCoffee/doorkeeper,AdStage/doorkeeper,vovimayhem/doorkeeper,ifeelgoods/doorkeeper,moneytree/doorkeeper,calfzhou/doorkeeper,ministryofjustice/doorkeeper,ministryofjustice/doorkeeper,vanboom/doorkeeper | yaml | ## Code Before:
language: ruby
before_install:
- gem install bundler -v '= 1.5.1'
rvm:
- 1.9.3
- 2.0
- 2.1
env:
- rails=3.1.8
- rails=3.2.8
- rails=3.2.16
- rails=4.0.2
- rails=4.1.0.beta1
- orm=mongoid2
- orm=mongoid3
- orm=mongoid4
- orm=mongo_mapper
- table_name_prefix=h_
- table_name_suffix=_h
services:
- mongodb
## Instruction:
Move to latest Rails releases for 3.1.x - 4.1.x
## Code After:
language: ruby
before_install:
- gem install bundler -v '= 1.5.1'
rvm:
- 1.9.3
- 2.0
- 2.1
env:
- rails=3.1.12
- rails=3.2.18
- rails=4.0.5
- rails=4.1.1
- orm=mongoid2
- orm=mongoid3
- orm=mongoid4
- orm=mongo_mapper
- table_name_prefix=h_
- table_name_suffix=_h
services:
- mongodb
|
49153e142eacb556d7564033b270bd47b3aa0d18 | mbinfo/mbinfo.go | mbinfo/mbinfo.go | /* Read from /sys/devices/virtual/dmi/id/board_{name,serial,vendor} */
package mbinfo
import (
"fmt"
"os/exec"
)
const MB_NAME_FILE = "/sys/devices/virtual/dmi/id/board_name"
const MB_SERIAL_FILE = "/sys/devices/virtual/dmi/id/board_serial"
const MB_VENDOR_FILE = "/sys/devices/virtual/dmi/id/board_vendor"
type MBstats struct {
Model string
Serial string
}
var Motherboard MBstats
func cat_file(filepath string) (ret string) {
cmd := exec.Command("/bin/cat", filepath)
buf, err := cmd.Output()
if err != nil {
fmt.Println(err)
return
}
ret = string(buf)
return
}
func init() {
Motherboard.Model = cat_file(MB_VENDOR_FILE) + cat_file(MB_NAME_FILE)
Motherboard.Serial = cat_file(MB_SERIAL_FILE)
}
| /* Read from /sys/devices/virtual/dmi/id/board_{name,serial,vendor} */
package mbinfo
import (
"fmt"
"os/exec"
"strings"
)
const MB_NAME_FILE = "/sys/devices/virtual/dmi/id/board_name"
const MB_SERIAL_FILE = "/sys/devices/virtual/dmi/id/board_serial"
const MB_VENDOR_FILE = "/sys/devices/virtual/dmi/id/board_vendor"
type MBstats struct {
Model string
Serial string
}
var Motherboard MBstats
func cat_file(filepath string) (ret string) {
cmd := exec.Command("/bin/cat", filepath)
buf, err := cmd.Output()
if err != nil {
fmt.Println(err)
return
}
ret = string(buf)
return
}
func init() {
Motherboard.Model = cat_file(MB_VENDOR_FILE) + cat_file(MB_NAME_FILE)
Motherboard.Model = strings.Replace(Motherboard.Model, "\n", " ", -1)
Motherboard.Serial = cat_file(MB_SERIAL_FILE)
Motherboard.Serial = strings.Replace(Motherboard.Serial, "\n", " ", -1)
}
| Remove '\n' character in Motherboard's Model/Serial | Remove '\n' character in Motherboard's Model/Serial
| Go | mit | PiScale/hwinfo-lib | go | ## Code Before:
/* Read from /sys/devices/virtual/dmi/id/board_{name,serial,vendor} */
package mbinfo
import (
"fmt"
"os/exec"
)
const MB_NAME_FILE = "/sys/devices/virtual/dmi/id/board_name"
const MB_SERIAL_FILE = "/sys/devices/virtual/dmi/id/board_serial"
const MB_VENDOR_FILE = "/sys/devices/virtual/dmi/id/board_vendor"
type MBstats struct {
Model string
Serial string
}
var Motherboard MBstats
func cat_file(filepath string) (ret string) {
cmd := exec.Command("/bin/cat", filepath)
buf, err := cmd.Output()
if err != nil {
fmt.Println(err)
return
}
ret = string(buf)
return
}
func init() {
Motherboard.Model = cat_file(MB_VENDOR_FILE) + cat_file(MB_NAME_FILE)
Motherboard.Serial = cat_file(MB_SERIAL_FILE)
}
## Instruction:
Remove '\n' character in Motherboard's Model/Serial
## Code After:
/* Read from /sys/devices/virtual/dmi/id/board_{name,serial,vendor} */
package mbinfo
import (
"fmt"
"os/exec"
"strings"
)
const MB_NAME_FILE = "/sys/devices/virtual/dmi/id/board_name"
const MB_SERIAL_FILE = "/sys/devices/virtual/dmi/id/board_serial"
const MB_VENDOR_FILE = "/sys/devices/virtual/dmi/id/board_vendor"
type MBstats struct {
Model string
Serial string
}
var Motherboard MBstats
func cat_file(filepath string) (ret string) {
cmd := exec.Command("/bin/cat", filepath)
buf, err := cmd.Output()
if err != nil {
fmt.Println(err)
return
}
ret = string(buf)
return
}
func init() {
Motherboard.Model = cat_file(MB_VENDOR_FILE) + cat_file(MB_NAME_FILE)
Motherboard.Model = strings.Replace(Motherboard.Model, "\n", " ", -1)
Motherboard.Serial = cat_file(MB_SERIAL_FILE)
Motherboard.Serial = strings.Replace(Motherboard.Serial, "\n", " ", -1)
}
|
c43eca9d029740d877a2e6d3ba7ce7813878e86a | test/representers/api/musical_work_representer_test.rb | test/representers/api/musical_work_representer_test.rb |
require 'test_helper'
require 'musical_work' if !defined?(AudioFile)
describe Api::MusicalWorkRepresenter do
let(:musical_work) { FactoryGirl.create(:musical_work) }
let(:representer) { Api::MusicalWorkRepresenter.new(musical_work) }
let(:json) { JSON.parse(representer.to_json) }
it 'create representer' do
representer.wont_be_nil
end
it 'use representer to create json' do
json['id'].must_equal musical_work.id
end
it 'will have a nested self link' do
self_href = "/api/v1/stories/#{musical_work.story.id}/musical_works/#{musical_work.id}"
json['_links']['self']['href'].must_equal self_href
end
end
|
require 'test_helper'
require 'musical_work' if !defined?(AudioFile)
describe Api::MusicalWorkRepresenter do
let(:musical_work) { FactoryGirl.create(:musical_work) }
let(:representer) { Api::MusicalWorkRepresenter.new(musical_work) }
let(:json) { JSON.parse(representer.to_json) }
it 'create representer' do
representer.wont_be_nil
end
it 'use representer to create json' do
json['id'].must_equal musical_work.id
end
it 'will have a nested self link' do
self_href = "/api/v1/stories/#{musical_work.story.id}/musical_works/#{musical_work.id}"
json['_links']['self']['href'].must_equal self_href
end
it 'serializes the length of the story as duration' do
musical_work.stub(:excerpt_length, 123) do
json['duration'].must_equal 123
end
end
end
| Add testing for musical work duration | Add testing for musical work duration
| Ruby | agpl-3.0 | PRX/cms.prx.org,PRX/cms.prx.org,PRX/cms.prx.org,PRX/cms.prx.org | ruby | ## Code Before:
require 'test_helper'
require 'musical_work' if !defined?(AudioFile)
describe Api::MusicalWorkRepresenter do
let(:musical_work) { FactoryGirl.create(:musical_work) }
let(:representer) { Api::MusicalWorkRepresenter.new(musical_work) }
let(:json) { JSON.parse(representer.to_json) }
it 'create representer' do
representer.wont_be_nil
end
it 'use representer to create json' do
json['id'].must_equal musical_work.id
end
it 'will have a nested self link' do
self_href = "/api/v1/stories/#{musical_work.story.id}/musical_works/#{musical_work.id}"
json['_links']['self']['href'].must_equal self_href
end
end
## Instruction:
Add testing for musical work duration
## Code After:
require 'test_helper'
require 'musical_work' if !defined?(AudioFile)
describe Api::MusicalWorkRepresenter do
let(:musical_work) { FactoryGirl.create(:musical_work) }
let(:representer) { Api::MusicalWorkRepresenter.new(musical_work) }
let(:json) { JSON.parse(representer.to_json) }
it 'create representer' do
representer.wont_be_nil
end
it 'use representer to create json' do
json['id'].must_equal musical_work.id
end
it 'will have a nested self link' do
self_href = "/api/v1/stories/#{musical_work.story.id}/musical_works/#{musical_work.id}"
json['_links']['self']['href'].must_equal self_href
end
it 'serializes the length of the story as duration' do
musical_work.stub(:excerpt_length, 123) do
json['duration'].must_equal 123
end
end
end
|
76c3947b6033d812bf02d9d1d2445460e888f427 | _includes/header.html | _includes/header.html | <nav class="navbar navbar-default">
<div class="container">
<div class="navbar-header">
<a class="navbar-brand" href="{{ site.baseurl }}">Hautahi Kingi</a>
</div>
<div id="navbar">
<ul class="nav navbar-nav navbar-right">
<li><a {% if page.url == '/' %} class="active"{% endif %} href="{{ site.baseurl }}">Home</a></li>
<li><a {% if page.url == '/work' %} class="active"{% endif %} href="{{ site.baseurl }}work">Work</a></li>
<li><a {% if page.url == '/play' %} class="active"{% endif %} href="{{ site.baseurl }}play">Play</a></li>
<li><a target="blank" href="{{ site.baseurl }}static/docs/CV_H.Kingi.pdf">CV</a></li>
</ul>
</div>
</div>
</nav>
| <nav class="navbar navbar-default">
<div class="container">
<div class="navbar-header">
<a class="navbar-brand" href="{{ site.baseurl }}">Hautahi Kingi</a>
</div>
<div id="navbar" class="hidden-xs">
<ul class="nav navbar-nav navbar-right">
<li><a {% if page.url == '/' %} class="active"{% endif %} href="{{ site.baseurl }}">Home</a></li>
<li><a {% if page.url == '/work' %} class="active"{% endif %} href="{{ site.baseurl }}work">Work</a></li>
<li><a {% if page.url == '/play' %} class="active"{% endif %} href="{{ site.baseurl }}play">Play</a></li>
<li><a target="blank" href="{{ site.baseurl }}static/docs/CV_H.Kingi.pdf">CV</a></li>
</ul>
</div>
<div id="navbar" class="visible-xs">
<button type="button" data-toggle="collapse" data-target="#my-collapsible-menu"> MENU </button>
<div class="collapse navbar-collapse topmarg" id="my-collapsible-menu">
<ul class="nav navbar-nav">
<li><a {% if page.url == '/' %} class="active"{% endif %} href="{{ site.baseurl }}">Home</a></li>
<li><a {% if page.url == '/work' %} class="active"{% endif %} href="{{ site.baseurl }}work">Work</a></li>
<li><a {% if page.url == '/play' %} class="active"{% endif %} href="{{ site.baseurl }}play">Play</a></li>
<li><a target="blank" href="{{ site.baseurl }}static/docs/CV_H.Kingi.pdf">CV</a></li>
</ul>
</div>
</div>
</div>
</nav>
| Fix menu for small screen. | Fix menu for small screen.
| HTML | mit | hautahi/hautahi.github.io,hautahi/hautahi.github.io | html | ## Code Before:
<nav class="navbar navbar-default">
<div class="container">
<div class="navbar-header">
<a class="navbar-brand" href="{{ site.baseurl }}">Hautahi Kingi</a>
</div>
<div id="navbar">
<ul class="nav navbar-nav navbar-right">
<li><a {% if page.url == '/' %} class="active"{% endif %} href="{{ site.baseurl }}">Home</a></li>
<li><a {% if page.url == '/work' %} class="active"{% endif %} href="{{ site.baseurl }}work">Work</a></li>
<li><a {% if page.url == '/play' %} class="active"{% endif %} href="{{ site.baseurl }}play">Play</a></li>
<li><a target="blank" href="{{ site.baseurl }}static/docs/CV_H.Kingi.pdf">CV</a></li>
</ul>
</div>
</div>
</nav>
## Instruction:
Fix menu for small screen.
## Code After:
<nav class="navbar navbar-default">
<div class="container">
<div class="navbar-header">
<a class="navbar-brand" href="{{ site.baseurl }}">Hautahi Kingi</a>
</div>
<div id="navbar" class="hidden-xs">
<ul class="nav navbar-nav navbar-right">
<li><a {% if page.url == '/' %} class="active"{% endif %} href="{{ site.baseurl }}">Home</a></li>
<li><a {% if page.url == '/work' %} class="active"{% endif %} href="{{ site.baseurl }}work">Work</a></li>
<li><a {% if page.url == '/play' %} class="active"{% endif %} href="{{ site.baseurl }}play">Play</a></li>
<li><a target="blank" href="{{ site.baseurl }}static/docs/CV_H.Kingi.pdf">CV</a></li>
</ul>
</div>
<div id="navbar" class="visible-xs">
<button type="button" data-toggle="collapse" data-target="#my-collapsible-menu"> MENU </button>
<div class="collapse navbar-collapse topmarg" id="my-collapsible-menu">
<ul class="nav navbar-nav">
<li><a {% if page.url == '/' %} class="active"{% endif %} href="{{ site.baseurl }}">Home</a></li>
<li><a {% if page.url == '/work' %} class="active"{% endif %} href="{{ site.baseurl }}work">Work</a></li>
<li><a {% if page.url == '/play' %} class="active"{% endif %} href="{{ site.baseurl }}play">Play</a></li>
<li><a target="blank" href="{{ site.baseurl }}static/docs/CV_H.Kingi.pdf">CV</a></li>
</ul>
</div>
</div>
</div>
</nav>
|
6e96905b97bbb3c154a15e90b3dc3d118db7c96e | src/OI.h | src/OI.h |
class OI
{
private:
Joystick joystick;
public:
OI();
inline float GetXplusY(){
float val;
val = FractionOmitted(joystick.GetY() + joystick.GetX());
if(val > 1.2) val = 1.2;
if(val < -1.2) val = -1.2;
return val;
}
inline float GetXminusY(){
float val;
val = FractionOmitted(joystick.GetY() - joystick.GetX());
if(val > 1.2) val = 1.2;
if(val < -1.2) val = -1.2;
return val;
}
inline float GetStickX(){ return FractionOmitted(joystick.GetX()); }
inline float GetStickY(){ return FractionOmitted(joystick.GetY()); }
inline float GetStickTwist(){ return FractionOmitted(joystick.GetTwist()); }
inline float GetStickThrottle(){ return FractionOmitted(joystick.GetThrottle()); }
inline float FractionOmitted(float original){
if(fabsf(original) < 0.01 ){
original = 0;
}
return original;
}
};
#endif
|
class OI
{
private:
Joystick joystick;
public:
OI();
inline float GetXplusY(){
float val;
val = FractionOmitted(joystick.GetY() + joystick.GetX());
if(val > 1.2) val = 1.2;
if(val < -1.2) val = -1.2;
return val;
}
inline float GetXminusY(){
float val;
val = FractionOmitted(joystick.GetY() - joystick.GetX());
if(val > 1.2) val = 1.2;
if(val < -1.2) val = -1.2;
return val;
}
inline float GetStickX(){ return FractionOmitted(joystick.GetX()); }
inline float GetStickY(){ return FractionOmitted(joystick.GetY()); }
inline float GetStickTwist(){ return FractionOmitted(joystick.GetTwist()); }
inline float GetStickThrottle(){ return FractionOmitted(joystick.GetThrottle()); }
inline float GetStickRightX(){ return FractionOmitted(joystick.GetRawAxis(5)); }
inline float GetStcikRightY(){ return FractionOmitted(joystick.GetRawAxis(6)); }
inline float FractionOmitted(float original){
if(fabsf(original) < 0.01 ){
original = 0;
}
return original;
}
};
#endif
| Create RightStick X and Y value get Function | Create RightStick X and Y value get Function | C | epl-1.0 | tokyotechnicalsamurai/shougun | c | ## Code Before:
class OI
{
private:
Joystick joystick;
public:
OI();
inline float GetXplusY(){
float val;
val = FractionOmitted(joystick.GetY() + joystick.GetX());
if(val > 1.2) val = 1.2;
if(val < -1.2) val = -1.2;
return val;
}
inline float GetXminusY(){
float val;
val = FractionOmitted(joystick.GetY() - joystick.GetX());
if(val > 1.2) val = 1.2;
if(val < -1.2) val = -1.2;
return val;
}
inline float GetStickX(){ return FractionOmitted(joystick.GetX()); }
inline float GetStickY(){ return FractionOmitted(joystick.GetY()); }
inline float GetStickTwist(){ return FractionOmitted(joystick.GetTwist()); }
inline float GetStickThrottle(){ return FractionOmitted(joystick.GetThrottle()); }
inline float FractionOmitted(float original){
if(fabsf(original) < 0.01 ){
original = 0;
}
return original;
}
};
#endif
## Instruction:
Create RightStick X and Y value get Function
## Code After:
class OI
{
private:
Joystick joystick;
public:
OI();
inline float GetXplusY(){
float val;
val = FractionOmitted(joystick.GetY() + joystick.GetX());
if(val > 1.2) val = 1.2;
if(val < -1.2) val = -1.2;
return val;
}
inline float GetXminusY(){
float val;
val = FractionOmitted(joystick.GetY() - joystick.GetX());
if(val > 1.2) val = 1.2;
if(val < -1.2) val = -1.2;
return val;
}
inline float GetStickX(){ return FractionOmitted(joystick.GetX()); }
inline float GetStickY(){ return FractionOmitted(joystick.GetY()); }
inline float GetStickTwist(){ return FractionOmitted(joystick.GetTwist()); }
inline float GetStickThrottle(){ return FractionOmitted(joystick.GetThrottle()); }
inline float GetStickRightX(){ return FractionOmitted(joystick.GetRawAxis(5)); }
inline float GetStcikRightY(){ return FractionOmitted(joystick.GetRawAxis(6)); }
inline float FractionOmitted(float original){
if(fabsf(original) < 0.01 ){
original = 0;
}
return original;
}
};
#endif
|
5c954f2c750c0e391b0fea1444d288bb2048cd38 | config/mongoid.yml | config/mongoid.yml | <%= IO.read("/etc/db_config/database.yml") %>
development:
<<: *developer_mongodb_development
test:
<<: *developer_mongodb_test
staging_int:
<<: *developer_mongodb_staging_int
staging:
<<: *developer_mongodb_staging
production:
<<: *developer_mongodb_production
| <%= IO.read("/etc/db_config/database.yml") %>
development:
<<: *developer_mongodb_development
test:
<<: *developer_mongodb_test
staging_int:
<<: *developer_mongodb_staging_int
production:
<<: *developer_mongodb_production
| Remove our dummy two staging servers, since we're all internal for now. | Remove our dummy two staging servers, since we're all internal for now.
| YAML | mit | johan--/api-umbrella-web,NREL/api-umbrella,NREL/api-umbrella-web,NREL/api-umbrella,NREL/api-umbrella-web,NREL/api-umbrella,apinf/api-umbrella,johan--/api-umbrella-web,apinf/api-umbrella,apinf/api-umbrella,cmc333333/api-umbrella-web,johan--/api-umbrella-web,johan--/api-umbrella-web,apinf/api-umbrella,NREL/api-umbrella-web,NREL/api-umbrella-web,apinf/api-umbrella,cmc333333/api-umbrella-web,NREL/api-umbrella,cmc333333/api-umbrella-web,cmc333333/api-umbrella-web | yaml | ## Code Before:
<%= IO.read("/etc/db_config/database.yml") %>
development:
<<: *developer_mongodb_development
test:
<<: *developer_mongodb_test
staging_int:
<<: *developer_mongodb_staging_int
staging:
<<: *developer_mongodb_staging
production:
<<: *developer_mongodb_production
## Instruction:
Remove our dummy two staging servers, since we're all internal for now.
## Code After:
<%= IO.read("/etc/db_config/database.yml") %>
development:
<<: *developer_mongodb_development
test:
<<: *developer_mongodb_test
staging_int:
<<: *developer_mongodb_staging_int
production:
<<: *developer_mongodb_production
|
bfd7f8864c584f7c17ca55630cb41f576724e15f | polling_stations/apps/nus_wales/templates/nus_wales/base.html | polling_stations/apps/nus_wales/templates/nus_wales/base.html | {% extends "dc_base.html" %}
{% load i18n %}
{% load staticfiles %}
{% block body_base %}
<body class="embed">
<header style="background-color:#00677f;padding:2em;color:#FFF;margin:2em 0;">
<div class="row">
<div class="columns large-12">
<h2>Find your polling station</h2>
</div>
</div>
</header>
<div class="container">
<div class="col-md-12">
{% block content %}{% endblock content %}
</div>
</div>
<footer class="footer">
<div class="row">
<div class="columns large-12">
<h1><img src="{% static 'images/nus_logo.jpg' %}" width=200>
<img src="{% static 'images/logo-with-text.png' %}" width=200></h1>
</div>
</div>
</footer>
{% endblock body_base %}
| {% extends "dc_base.html" %}
{% load i18n %}
{% load staticfiles %}
{% block body_base %}
<body class="embed">
<header style="background-color:#00677f;padding:2em;color:#FFF;margin:2em 0;">
<div class="row">
<div class="columns large-12">
<h2>{% trans "Find your polling station" %}</h2>
</div>
</div>
</header>
<div class="container">
<div class="col-md-12">
{% block content %}{% endblock content %}
</div>
</div>
<footer class="footer">
<div class="row">
<div class="columns large-12">
<h1><img src="{% static 'images/nus_logo.jpg' %}" width=200>
<img src="{% static 'images/logo-with-text.png' %}" width=200></h1>
</div>
</div>
</footer>
{% endblock body_base %}
| Add trans tag to NUS theme | Add trans tag to NUS theme
| HTML | bsd-3-clause | DemocracyClub/UK-Polling-Stations,andylolz/UK-Polling-Stations,DemocracyClub/UK-Polling-Stations,chris48s/UK-Polling-Stations,chris48s/UK-Polling-Stations,chris48s/UK-Polling-Stations,andylolz/UK-Polling-Stations,DemocracyClub/UK-Polling-Stations,andylolz/UK-Polling-Stations | html | ## Code Before:
{% extends "dc_base.html" %}
{% load i18n %}
{% load staticfiles %}
{% block body_base %}
<body class="embed">
<header style="background-color:#00677f;padding:2em;color:#FFF;margin:2em 0;">
<div class="row">
<div class="columns large-12">
<h2>Find your polling station</h2>
</div>
</div>
</header>
<div class="container">
<div class="col-md-12">
{% block content %}{% endblock content %}
</div>
</div>
<footer class="footer">
<div class="row">
<div class="columns large-12">
<h1><img src="{% static 'images/nus_logo.jpg' %}" width=200>
<img src="{% static 'images/logo-with-text.png' %}" width=200></h1>
</div>
</div>
</footer>
{% endblock body_base %}
## Instruction:
Add trans tag to NUS theme
## Code After:
{% extends "dc_base.html" %}
{% load i18n %}
{% load staticfiles %}
{% block body_base %}
<body class="embed">
<header style="background-color:#00677f;padding:2em;color:#FFF;margin:2em 0;">
<div class="row">
<div class="columns large-12">
<h2>{% trans "Find your polling station" %}</h2>
</div>
</div>
</header>
<div class="container">
<div class="col-md-12">
{% block content %}{% endblock content %}
</div>
</div>
<footer class="footer">
<div class="row">
<div class="columns large-12">
<h1><img src="{% static 'images/nus_logo.jpg' %}" width=200>
<img src="{% static 'images/logo-with-text.png' %}" width=200></h1>
</div>
</div>
</footer>
{% endblock body_base %}
|
6a243d4072603cf0a06ee497a8a3424e5709612e | _data/posters.yml | _data/posters.yml | - year: 2016
posters:
- authors: "Heywood P, Richmond P, Maddock S, Jung M"
date: "2016-04-04"
event: "GPU Technology Conference 2016"
event_url: "http://www.gputechconf.com/"
title: "Accelerated Transport System Simulation using CUDA"
note: "Poster P6203"
pdf: "http://on-demand.gputechconf.com/gtc/2016/posters/GTC_2016_Virtual_Reality_Augmented_Reality_VRAR_01_P6203_WEB.pdf"
| - year: 2017
posters:
- authors: "Heywood P, Richmond P, Maddock S, Bradley R, Wright I, Swain D, Mawson M, Fletcher G, Guichard R, Himlin R"
date: "2017-05-08"
event: "GPU Technology Conference 2017"
event_url: "http://www.gputechconf.com/"
title: "Shortest Path Calculations for Low-Density High-Diameter Road Network Simulation"
note: "Poster P7196"
pdf:
- year: 2016
posters:
- authors: "Heywood P, Richmond P, Maddock S, Jung M"
date: "2016-04-04"
event: "GPU Technology Conference 2016"
event_url: "http://www.gputechconf.com/"
title: "Accelerated Transport System Simulation using CUDA"
note: "Poster P6203"
pdf: "http://on-demand.gputechconf.com/gtc/2016/posters/GTC_2016_Virtual_Reality_Augmented_Reality_VRAR_01_P6203_WEB.pdf"
| Add GTC 2017 poster details | Add GTC 2017 poster details
| YAML | mit | ptheywood/ptheywood.github.io,ptheywood/ptheywood.github.io,ptheywood/ptheywood.github.io | yaml | ## Code Before:
- year: 2016
posters:
- authors: "Heywood P, Richmond P, Maddock S, Jung M"
date: "2016-04-04"
event: "GPU Technology Conference 2016"
event_url: "http://www.gputechconf.com/"
title: "Accelerated Transport System Simulation using CUDA"
note: "Poster P6203"
pdf: "http://on-demand.gputechconf.com/gtc/2016/posters/GTC_2016_Virtual_Reality_Augmented_Reality_VRAR_01_P6203_WEB.pdf"
## Instruction:
Add GTC 2017 poster details
## Code After:
- year: 2017
posters:
- authors: "Heywood P, Richmond P, Maddock S, Bradley R, Wright I, Swain D, Mawson M, Fletcher G, Guichard R, Himlin R"
date: "2017-05-08"
event: "GPU Technology Conference 2017"
event_url: "http://www.gputechconf.com/"
title: "Shortest Path Calculations for Low-Density High-Diameter Road Network Simulation"
note: "Poster P7196"
pdf:
- year: 2016
posters:
- authors: "Heywood P, Richmond P, Maddock S, Jung M"
date: "2016-04-04"
event: "GPU Technology Conference 2016"
event_url: "http://www.gputechconf.com/"
title: "Accelerated Transport System Simulation using CUDA"
note: "Poster P6203"
pdf: "http://on-demand.gputechconf.com/gtc/2016/posters/GTC_2016_Virtual_Reality_Augmented_Reality_VRAR_01_P6203_WEB.pdf"
|
5e70a05c26a54e7e6dd5c8bb88e92ea966e36430 | Gruntfile.js | Gruntfile.js | module.exports = function(grunt) {
grunt.loadNpmTasks("grunt-mocha-test");
grunt.loadNpmTasks("grunt-mocha-istanbul");
var testOutputLocation = process.env.CIRCLE_TEST_REPORTS || "test_output";
var artifactsLocation = process.env.CIRCLE_ARTIFACTS || "build_artifacts";
grunt.initConfig({
mochaTest: {
test: {
src: ["test/**/*.js"]
},
ci: {
src: ["test/**/*.js"],
options: {
reporter: "xunit",
captureFile: testOutputLocation + "/mocha/results.xml",
quiet: true
}
}
},
mocha_istanbul: {
coverage: {
src: ["test/**/*.js"],
options: {
coverageFolder: artifactsLocation + "/coverage",
check: {
lines: 100,
statements: 100
},
reportFormats: ["lcov"]
}
}
}
});
grunt.registerTask("test", ["mochaTest:test", "mocha_istanbul"]);
grunt.registerTask("ci-test", ["mochaTest:ci", "mocha_istanbul"]);
grunt.registerTask("default", "test");
};
| module.exports = function(grunt) {
grunt.loadNpmTasks("grunt-mocha-test");
grunt.loadNpmTasks("grunt-mocha-istanbul");
var testOutputLocation = process.env.CIRCLE_TEST_REPORTS || "test_output";
var artifactsLocation = process.env.CIRCLE_ARTIFACTS || "build_artifacts";
grunt.initConfig({
mochaTest: {
test: {
src: ["test/**/*.js"]
},
ci: {
src: ["test/**/*.js"],
options: {
reporter: "xunit",
captureFile: testOutputLocation + "/mocha/results.xml",
quiet: true
}
}
},
mocha_istanbul: {
coverage: {
src: ["test/**/*.js"],
options: {
coverageFolder: artifactsLocation + "/coverage",
check: {
lines: 100,
statements: 100,
branches: 100,
functions: 100
},
reportFormats: ["lcov"]
}
}
}
});
grunt.registerTask("test", ["mochaTest:test", "mocha_istanbul"]);
grunt.registerTask("ci-test", ["mochaTest:ci", "mocha_istanbul"]);
grunt.registerTask("default", "test");
};
| Check function and branch coverage | Check function and branch coverage
| JavaScript | mit | MantasSutula/todo-grad-project,stevenjob/todo-grad-project,RMcNeillSL/todo-grad-project,thullSL/todo-grad-project,seanworkcode/todo-grad-impl,MantasSutula/todo-grad-project,JMVHill/todo-grad-project,jdhodges431/todo-grad-project,seanworkcode/todo-grad-project,WPFerg/todo-grad-project,seanworkcode/todo-grad-impl,jdhodges431/todo-grad-project,WPFerg/todo-grad-project,JMVHill/todo-grad-project,jamiemorris1991/todo-grad-project,RMcNeillSL/todo-grad-project,thullSL/todo-angular-grad,seanworkcode/todo-grad-project,thullSL/todo-angular-grad,thullSL/todo-grad-project,jamiemorris1991/todo-grad-project,stevenjob/todo-grad-project | javascript | ## Code Before:
module.exports = function(grunt) {
grunt.loadNpmTasks("grunt-mocha-test");
grunt.loadNpmTasks("grunt-mocha-istanbul");
var testOutputLocation = process.env.CIRCLE_TEST_REPORTS || "test_output";
var artifactsLocation = process.env.CIRCLE_ARTIFACTS || "build_artifacts";
grunt.initConfig({
mochaTest: {
test: {
src: ["test/**/*.js"]
},
ci: {
src: ["test/**/*.js"],
options: {
reporter: "xunit",
captureFile: testOutputLocation + "/mocha/results.xml",
quiet: true
}
}
},
mocha_istanbul: {
coverage: {
src: ["test/**/*.js"],
options: {
coverageFolder: artifactsLocation + "/coverage",
check: {
lines: 100,
statements: 100
},
reportFormats: ["lcov"]
}
}
}
});
grunt.registerTask("test", ["mochaTest:test", "mocha_istanbul"]);
grunt.registerTask("ci-test", ["mochaTest:ci", "mocha_istanbul"]);
grunt.registerTask("default", "test");
};
## Instruction:
Check function and branch coverage
## Code After:
module.exports = function(grunt) {
grunt.loadNpmTasks("grunt-mocha-test");
grunt.loadNpmTasks("grunt-mocha-istanbul");
var testOutputLocation = process.env.CIRCLE_TEST_REPORTS || "test_output";
var artifactsLocation = process.env.CIRCLE_ARTIFACTS || "build_artifacts";
grunt.initConfig({
mochaTest: {
test: {
src: ["test/**/*.js"]
},
ci: {
src: ["test/**/*.js"],
options: {
reporter: "xunit",
captureFile: testOutputLocation + "/mocha/results.xml",
quiet: true
}
}
},
mocha_istanbul: {
coverage: {
src: ["test/**/*.js"],
options: {
coverageFolder: artifactsLocation + "/coverage",
check: {
lines: 100,
statements: 100,
branches: 100,
functions: 100
},
reportFormats: ["lcov"]
}
}
}
});
grunt.registerTask("test", ["mochaTest:test", "mocha_istanbul"]);
grunt.registerTask("ci-test", ["mochaTest:ci", "mocha_istanbul"]);
grunt.registerTask("default", "test");
};
|
fc95348177a60f40313f7ece442ac67f981f2ad7 | service/security/authentication/api/src/main/java/org/eclipse/kapua/service/authentication/credential/CredentialFactory.java | service/security/authentication/api/src/main/java/org/eclipse/kapua/service/authentication/credential/CredentialFactory.java | /*******************************************************************************
* Copyright (c) 2011, 2016 Eurotech and/or its affiliates and others
*
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* Eurotech - initial API and implementation
*
*******************************************************************************/
package org.eclipse.kapua.service.authentication.credential;
import org.eclipse.kapua.model.KapuaEntityFactory;
import org.eclipse.kapua.model.KapuaObjectFactory;
import org.eclipse.kapua.model.id.KapuaId;
/**
* Credential factory service definition.
*
* @since 1.0
*
*/
public interface CredentialFactory extends KapuaEntityFactory<Credential, CredentialCreator, CredentialQuery, CredentialListResult>
{
/**
* Create a new {@link Credential}
*
* @param scopeId
* @param userId
* @param credentialType
* @param credentialKey
* @return
*/
public Credential newCredential(KapuaId scopeId, KapuaId userId, CredentialType credentialType, String credentialKey);
/**
* Create a new {@link CredentialCreator} for the specific credential type
*
* @param scopeId
* @param userId
* @param credentialType
* @param credentialKey
* @return
*/
public CredentialCreator newCreator(KapuaId scopeId, KapuaId userId, CredentialType credentialType, String credentialKey);
}
| /*******************************************************************************
* Copyright (c) 2011, 2016 Eurotech and/or its affiliates and others
*
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* Eurotech - initial API and implementation
*
*******************************************************************************/
package org.eclipse.kapua.service.authentication.credential;
import org.eclipse.kapua.model.KapuaEntityFactory;
import org.eclipse.kapua.model.id.KapuaId;
/**
* Credential factory service definition.
*
* @since 1.0
*
*/
public interface CredentialFactory extends KapuaEntityFactory<Credential, CredentialCreator, CredentialQuery, CredentialListResult> {
/**
* Create a new {@link Credential}
*
* @param scopeId
* @param userId
* @param credentialType
* @param credentialKey
* @return
*/
public Credential newCredential(KapuaId scopeId, KapuaId userId, CredentialType credentialType, String credentialKey);
/**
* Create a new {@link CredentialCreator} for the specific credential type
*
* @param scopeId
* @param userId
* @param credentialType
* @param credentialKey
* @return
*/
public CredentialCreator newCreator(KapuaId scopeId, KapuaId userId, CredentialType credentialType, String credentialKey);
}
| Fix warnings and apply code style | Fix warnings and apply code style | Java | epl-1.0 | stzilli/kapua,LeoNerdoG/kapua,LeoNerdoG/kapua,LeoNerdoG/kapua,stzilli/kapua,stzilli/kapua,LeoNerdoG/kapua,stzilli/kapua,stzilli/kapua,LeoNerdoG/kapua | java | ## Code Before:
/*******************************************************************************
* Copyright (c) 2011, 2016 Eurotech and/or its affiliates and others
*
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* Eurotech - initial API and implementation
*
*******************************************************************************/
package org.eclipse.kapua.service.authentication.credential;
import org.eclipse.kapua.model.KapuaEntityFactory;
import org.eclipse.kapua.model.KapuaObjectFactory;
import org.eclipse.kapua.model.id.KapuaId;
/**
* Credential factory service definition.
*
* @since 1.0
*
*/
public interface CredentialFactory extends KapuaEntityFactory<Credential, CredentialCreator, CredentialQuery, CredentialListResult>
{
/**
* Create a new {@link Credential}
*
* @param scopeId
* @param userId
* @param credentialType
* @param credentialKey
* @return
*/
public Credential newCredential(KapuaId scopeId, KapuaId userId, CredentialType credentialType, String credentialKey);
/**
* Create a new {@link CredentialCreator} for the specific credential type
*
* @param scopeId
* @param userId
* @param credentialType
* @param credentialKey
* @return
*/
public CredentialCreator newCreator(KapuaId scopeId, KapuaId userId, CredentialType credentialType, String credentialKey);
}
## Instruction:
Fix warnings and apply code style
## Code After:
/*******************************************************************************
* Copyright (c) 2011, 2016 Eurotech and/or its affiliates and others
*
* All rights reserved. This program and the accompanying materials
* are made available under the terms of the Eclipse Public License v1.0
* which accompanies this distribution, and is available at
* http://www.eclipse.org/legal/epl-v10.html
*
* Contributors:
* Eurotech - initial API and implementation
*
*******************************************************************************/
package org.eclipse.kapua.service.authentication.credential;
import org.eclipse.kapua.model.KapuaEntityFactory;
import org.eclipse.kapua.model.id.KapuaId;
/**
* Credential factory service definition.
*
* @since 1.0
*
*/
public interface CredentialFactory extends KapuaEntityFactory<Credential, CredentialCreator, CredentialQuery, CredentialListResult> {
/**
* Create a new {@link Credential}
*
* @param scopeId
* @param userId
* @param credentialType
* @param credentialKey
* @return
*/
public Credential newCredential(KapuaId scopeId, KapuaId userId, CredentialType credentialType, String credentialKey);
/**
* Create a new {@link CredentialCreator} for the specific credential type
*
* @param scopeId
* @param userId
* @param credentialType
* @param credentialKey
* @return
*/
public CredentialCreator newCreator(KapuaId scopeId, KapuaId userId, CredentialType credentialType, String credentialKey);
}
|
23c73808ac87bb5701d77ba0aa3717decdeb33ac | pkgs/os-specific/linux/procps/default.nix | pkgs/os-specific/linux/procps/default.nix | { stdenv, fetchurl, ncurses }:
stdenv.mkDerivation {
name = "procps-ng-3.3.10";
src = fetchurl {
url = "mirror://sourceforge/procps-ng/Production/${name}.tar.xz";
sha256 = "0d8mki0q4yamnkk4533kx8mc0jd879573srxhg6r2fs3lkc6iv8i";
};
buildInputs = [ ncurses ];
makeFlags = "DESTDIR=$(out)";
crossAttrs = {
CC = stdenv.cross.config + "-gcc";
};
meta = {
homepage = http://procps.sourceforge.net/;
description = "Utilities that give information about processes using the /proc filesystem";
platforms = platforms.linux;
maintainers = with maintainers; [ wkennington ];
};
}
| { stdenv, fetchurl, ncurses }:
stdenv.mkDerivation {
name = "procps-3.2.8";
src = fetchurl {
url = http://procps.sourceforge.net/procps-3.2.8.tar.gz;
sha256 = "0d8mki0q4yamnkk4533kx8mc0jd879573srxhg6r2fs3lkc6iv8i";
};
patches =
[ ./makefile.patch
./procps-build.patch
./gnumake3.82.patch
./linux-ver-init.patch
];
buildInputs = [ ncurses ];
makeFlags = "DESTDIR=$(out)";
crossAttrs = {
CC = stdenv.cross.config + "-gcc";
};
meta = {
homepage = http://procps.sourceforge.net/;
description = "Utilities that give information about processes using the /proc filesystem";
};
}
| Revert "procps: 3.2.8 -> 3.3.10" | Revert "procps: 3.2.8 -> 3.3.10"
This reverts commit 63675e8c2775932851cdccf2917c1bc11d157145.
This should be the old version.
| Nix | mit | NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,triton/triton,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,triton/triton,triton/triton,NixOS/nixpkgs,SymbiFlow/nixpkgs,triton/triton,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,triton/triton,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,triton/triton,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,triton/triton,SymbiFlow/nixpkgs,NixOS/nixpkgs,triton/triton | nix | ## Code Before:
{ stdenv, fetchurl, ncurses }:
stdenv.mkDerivation {
name = "procps-ng-3.3.10";
src = fetchurl {
url = "mirror://sourceforge/procps-ng/Production/${name}.tar.xz";
sha256 = "0d8mki0q4yamnkk4533kx8mc0jd879573srxhg6r2fs3lkc6iv8i";
};
buildInputs = [ ncurses ];
makeFlags = "DESTDIR=$(out)";
crossAttrs = {
CC = stdenv.cross.config + "-gcc";
};
meta = {
homepage = http://procps.sourceforge.net/;
description = "Utilities that give information about processes using the /proc filesystem";
platforms = platforms.linux;
maintainers = with maintainers; [ wkennington ];
};
}
## Instruction:
Revert "procps: 3.2.8 -> 3.3.10"
This reverts commit 63675e8c2775932851cdccf2917c1bc11d157145.
This should be the old version.
## Code After:
{ stdenv, fetchurl, ncurses }:
stdenv.mkDerivation {
name = "procps-3.2.8";
src = fetchurl {
url = http://procps.sourceforge.net/procps-3.2.8.tar.gz;
sha256 = "0d8mki0q4yamnkk4533kx8mc0jd879573srxhg6r2fs3lkc6iv8i";
};
patches =
[ ./makefile.patch
./procps-build.patch
./gnumake3.82.patch
./linux-ver-init.patch
];
buildInputs = [ ncurses ];
makeFlags = "DESTDIR=$(out)";
crossAttrs = {
CC = stdenv.cross.config + "-gcc";
};
meta = {
homepage = http://procps.sourceforge.net/;
description = "Utilities that give information about processes using the /proc filesystem";
};
}
|
815b1b0a589d59b6fde4406b55bca78bec11caa8 | .travis.yml | .travis.yml | language: python
python: 3.5 # 3.5 requires manual download, so it needs to be selected
install: pip install tox
script: tox
env:
# Github deployment key
- secure: "d6xg3/CKxF2SDhkBNb94s7KbQPlCKwrO0jIJhS59djtyHApY/0tE594LXzrdQR9/yz998Zm/aCy+mQ4JbGToUZd4jKImelJguHCTOVXzjVEZqIH2dakpiYtLzWkyphQkqqWvIIBL8DuxX3BbVbrwqUEAFnlwZ1SHWwjwx3Xv3QU="
matrix:
fast_finish: true
# This is required to enable container based Travis CI infrastructure.
# See http://docs.travis-ci.com/user/migrating-from-legacy/
sudo: false
cache: pip
after_success:
- "curl -o /tmp/travis-automerge https://raw.githubusercontent.com/cdown/travis-automerge/master/travis-automerge"
- "chmod a+x /tmp/travis-automerge"
- "BRANCHES_TO_MERGE_REGEX='^f/' BRANCH_TO_MERGE_INTO=develop GITHUB_REPO=cdown/srt /tmp/travis-automerge"
| language: python
python: 3.5 # 3.5 requires manual download, so it needs to be selected
install: pip install tox
script: tox
env:
# Github deployment key
- secure: "d6xg3/CKxF2SDhkBNb94s7KbQPlCKwrO0jIJhS59djtyHApY/0tE594LXzrdQR9/yz998Zm/aCy+mQ4JbGToUZd4jKImelJguHCTOVXzjVEZqIH2dakpiYtLzWkyphQkqqWvIIBL8DuxX3BbVbrwqUEAFnlwZ1SHWwjwx3Xv3QU="
matrix:
fast_finish: true
# This is required to enable container based Travis CI infrastructure.
# See http://docs.travis-ci.com/user/migrating-from-legacy/
sudo: false
cache: pip
after_success:
- "curl -o /tmp/travis-automerge https://raw.githubusercontent.com/cdown/travis-automerge/master/travis-automerge"
- "chmod a+x /tmp/travis-automerge"
- "BRANCHES_TO_MERGE_REGEX='^f/' BRANCH_TO_MERGE_INTO=develop GITHUB_REPO=cdown/srt /tmp/travis-automerge"
notifications:
email:
on_success: never
on_failure: always
| Disable e-mails on Travis success | Disable e-mails on Travis success
Fixes #12.
| YAML | mit | cdown/srt | yaml | ## Code Before:
language: python
python: 3.5 # 3.5 requires manual download, so it needs to be selected
install: pip install tox
script: tox
env:
# Github deployment key
- secure: "d6xg3/CKxF2SDhkBNb94s7KbQPlCKwrO0jIJhS59djtyHApY/0tE594LXzrdQR9/yz998Zm/aCy+mQ4JbGToUZd4jKImelJguHCTOVXzjVEZqIH2dakpiYtLzWkyphQkqqWvIIBL8DuxX3BbVbrwqUEAFnlwZ1SHWwjwx3Xv3QU="
matrix:
fast_finish: true
# This is required to enable container based Travis CI infrastructure.
# See http://docs.travis-ci.com/user/migrating-from-legacy/
sudo: false
cache: pip
after_success:
- "curl -o /tmp/travis-automerge https://raw.githubusercontent.com/cdown/travis-automerge/master/travis-automerge"
- "chmod a+x /tmp/travis-automerge"
- "BRANCHES_TO_MERGE_REGEX='^f/' BRANCH_TO_MERGE_INTO=develop GITHUB_REPO=cdown/srt /tmp/travis-automerge"
## Instruction:
Disable e-mails on Travis success
Fixes #12.
## Code After:
language: python
python: 3.5 # 3.5 requires manual download, so it needs to be selected
install: pip install tox
script: tox
env:
# Github deployment key
- secure: "d6xg3/CKxF2SDhkBNb94s7KbQPlCKwrO0jIJhS59djtyHApY/0tE594LXzrdQR9/yz998Zm/aCy+mQ4JbGToUZd4jKImelJguHCTOVXzjVEZqIH2dakpiYtLzWkyphQkqqWvIIBL8DuxX3BbVbrwqUEAFnlwZ1SHWwjwx3Xv3QU="
matrix:
fast_finish: true
# This is required to enable container based Travis CI infrastructure.
# See http://docs.travis-ci.com/user/migrating-from-legacy/
sudo: false
cache: pip
after_success:
- "curl -o /tmp/travis-automerge https://raw.githubusercontent.com/cdown/travis-automerge/master/travis-automerge"
- "chmod a+x /tmp/travis-automerge"
- "BRANCHES_TO_MERGE_REGEX='^f/' BRANCH_TO_MERGE_INTO=develop GITHUB_REPO=cdown/srt /tmp/travis-automerge"
notifications:
email:
on_success: never
on_failure: always
|
151503bab3972bdd07f644b5a94225dda05d82d4 | client/views/forgotPassword/forgotPassword.html | client/views/forgotPassword/forgotPassword.html | <template name='entryForgotPassword'>
<div class="{{containerCSSClass}}">
<div class="{{rowCSSClass}}">
{{#if logo}}
<div class="entry-logo">
<a href="/"><img src="{{logo}}" alt="logo"></a>
</div>
{{/if}}
<div class="entry col-md-4 col-md-offset-4">
{{#if error}}
<div class='alert alert-danger'>{{error}}</div>
{{/if}}
<h3>{{t9n 'forgotPassword'}}</h3>
<form id='forgotPassword'>
<div class="form-group">
<input type="email" name="forgottenEmail" class="form-control" placeholder="{{t9n 'emailAddress'}}" value=''>
</div>
<button type="submit" class="btn btn-default">{{t9n 'emailResetLink'}}</button>
</form>
{{#if showSignupCode}}
<p class="entry-signup-cta">{{t9n 'dontHaveAnAccount'}} <a href="{{pathFor 'entrySignUp'}}">{{t9n 'signUp'}}</a></p>
{{/if}}
</div>
</div>
</div>
</template>
| <template name='entryForgotPassword'>
<div class="sign-in-page forgot-password-page">
<div class="{{containerCSSClass}}">
<div class="{{rowCSSClass}}">
{{#if logo}}
<div class="entry-logo">
<a href="/"><img src="{{logo}}" alt="logo"></a>
</div>
{{/if}}
<div class="col-md-12">
<div class="header-signin">
<h1>forgot password</h1>
<p>put your email address you registered with lumin, we will send an email with a link to recover your password</p>
</div>
</div>
</div>
<div class="row">
<div class="entry col-md-offset-3 col-md-6">
{{#if error}}
<div class='alert alert-danger'>{{error}}</div>
{{/if}}
<form class="entry-form" id='forgotPassword'>
<div class="form-group">
<label>Your email address:</label>
<input type="email" name="forgottenEmail" class="form-control" value=''>
</div>
<button type="submit" class="submit btn btn-default">send reset link</button>
</form>
{{#if showSignupCode}}
<p class="entry-signup-cta">{{t9n 'dontHaveAnAccount'}} <a href="{{pathFor 'entrySignUp'}}">{{t9n 'signUp'}}</a></p>
{{/if}}
</div>
</div>
</div>
</div>
</template>
| Update html code for forgot-password page | Update html code for forgot-password page
| HTML | mit | maxkferg/accounts-entry,vhmh2005/accounts-entry,maxkferg/accounts-entry,vhmh2005/accounts-entry | html | ## Code Before:
<template name='entryForgotPassword'>
<div class="{{containerCSSClass}}">
<div class="{{rowCSSClass}}">
{{#if logo}}
<div class="entry-logo">
<a href="/"><img src="{{logo}}" alt="logo"></a>
</div>
{{/if}}
<div class="entry col-md-4 col-md-offset-4">
{{#if error}}
<div class='alert alert-danger'>{{error}}</div>
{{/if}}
<h3>{{t9n 'forgotPassword'}}</h3>
<form id='forgotPassword'>
<div class="form-group">
<input type="email" name="forgottenEmail" class="form-control" placeholder="{{t9n 'emailAddress'}}" value=''>
</div>
<button type="submit" class="btn btn-default">{{t9n 'emailResetLink'}}</button>
</form>
{{#if showSignupCode}}
<p class="entry-signup-cta">{{t9n 'dontHaveAnAccount'}} <a href="{{pathFor 'entrySignUp'}}">{{t9n 'signUp'}}</a></p>
{{/if}}
</div>
</div>
</div>
</template>
## Instruction:
Update html code for forgot-password page
## Code After:
<template name='entryForgotPassword'>
<div class="sign-in-page forgot-password-page">
<div class="{{containerCSSClass}}">
<div class="{{rowCSSClass}}">
{{#if logo}}
<div class="entry-logo">
<a href="/"><img src="{{logo}}" alt="logo"></a>
</div>
{{/if}}
<div class="col-md-12">
<div class="header-signin">
<h1>forgot password</h1>
<p>put your email address you registered with lumin, we will send an email with a link to recover your password</p>
</div>
</div>
</div>
<div class="row">
<div class="entry col-md-offset-3 col-md-6">
{{#if error}}
<div class='alert alert-danger'>{{error}}</div>
{{/if}}
<form class="entry-form" id='forgotPassword'>
<div class="form-group">
<label>Your email address:</label>
<input type="email" name="forgottenEmail" class="form-control" value=''>
</div>
<button type="submit" class="submit btn btn-default">send reset link</button>
</form>
{{#if showSignupCode}}
<p class="entry-signup-cta">{{t9n 'dontHaveAnAccount'}} <a href="{{pathFor 'entrySignUp'}}">{{t9n 'signUp'}}</a></p>
{{/if}}
</div>
</div>
</div>
</div>
</template>
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.