| column | dtype | stats |
|---|---|---|
| commit | string | length 40-40 |
| old_file | string | length 4-184 |
| new_file | string | length 4-184 |
| old_contents | string | length 1-3.6k |
| new_contents | string | length 5-3.38k |
| subject | string | length 15-778 |
| message | string | length 16-6.74k |
| lang | string | 201 classes |
| license | string | 13 classes |
| repos | string | length 6-116k |
| config | string | 201 classes |
| content | string | length 137-7.24k |
| diff | string | length 26-5.55k |
| diff_length | int64 | 1-123 |
| relative_diff_length | float64 | 0.01-89 |
| n_lines_added | int64 | 0-108 |
| n_lines_deleted | int64 | 0-106 |
c17c451b7c2f191442d7e2d4c3149b11cebf16f6 | README.md | README.md | desmoj
======
Desmo-J is a Framework for Discrete-Event Modelling and Simulation Library developing object-oriented simulation models in Java.
Developed by Department of Computer Science, University of Hamburg. Main contributors for this work are Johannes Göbel, Gunnar Kiesel, Nicolas Knaak, Julia Kuck, Tim Lechler, Ruth Meyer, Gaby Neumann, Volker Wohlgemuth, Bernd Page.
Full documentation is available on http://desmoj.sourceforge.net/home.html.
The source code is available for free on http://desmoj.sourceforge.net/download.html
This github fork the framework at version 2.4.1
| desmoj
======
Desmo-J is a Framework for Discrete-Event Modelling and Simulation Library developing object-oriented simulation models in Java.
Developed by Department of Computer Science, University of Hamburg. Main contributors for this work are Johannes Göbel, Gunnar Kiesel, Nicolas Knaak, Julia Kuck, Tim Lechler, Ruth Meyer, Gaby Neumann, Volker Wohlgemuth, Bernd Page, Wolfgang Kreutzer.
Full documentation is available on http://desmoj.sourceforge.net/home.html.
The source code is available for free on http://desmoj.sourceforge.net/download.html
This github fork the framework at version 2.4.1
| Add another one contributor's name. | Add another one contributor's name.
| Markdown | apache-2.0 | muhd7rosli/desmoj | markdown | ## Code Before:
desmoj
======
Desmo-J is a Framework for Discrete-Event Modelling and Simulation Library developing object-oriented simulation models in Java.
Developed by Department of Computer Science, University of Hamburg. Main contributors for this work are Johannes Göbel, Gunnar Kiesel, Nicolas Knaak, Julia Kuck, Tim Lechler, Ruth Meyer, Gaby Neumann, Volker Wohlgemuth, Bernd Page.
Full documentation is available on http://desmoj.sourceforge.net/home.html.
The source code is available for free on http://desmoj.sourceforge.net/download.html
This github fork the framework at version 2.4.1
## Instruction:
Add another one contributor's name.
## Code After:
desmoj
======
Desmo-J is a Framework for Discrete-Event Modelling and Simulation Library developing object-oriented simulation models in Java.
Developed by Department of Computer Science, University of Hamburg. Main contributors for this work are Johannes Göbel, Gunnar Kiesel, Nicolas Knaak, Julia Kuck, Tim Lechler, Ruth Meyer, Gaby Neumann, Volker Wohlgemuth, Bernd Page, Wolfgang Kreutzer.
Full documentation is available on http://desmoj.sourceforge.net/home.html.
The source code is available for free on http://desmoj.sourceforge.net/download.html
This github fork the framework at version 2.4.1
| desmoj
======
Desmo-J is a Framework for Discrete-Event Modelling and Simulation Library developing object-oriented simulation models in Java.
- Developed by Department of Computer Science, University of Hamburg. Main contributors for this work are Johannes Göbel, Gunnar Kiesel, Nicolas Knaak, Julia Kuck, Tim Lechler, Ruth Meyer, Gaby Neumann, Volker Wohlgemuth, Bernd Page.
+ Developed by Department of Computer Science, University of Hamburg. Main contributors for this work are Johannes Göbel, Gunnar Kiesel, Nicolas Knaak, Julia Kuck, Tim Lechler, Ruth Meyer, Gaby Neumann, Volker Wohlgemuth, Bernd Page, Wolfgang Kreutzer.
? +++++++++++++++++++
Full documentation is available on http://desmoj.sourceforge.net/home.html.
The source code is available for free on http://desmoj.sourceforge.net/download.html
This github fork the framework at version 2.4.1 | 2 | 0.181818 | 1 | 1 |
7822c56d8d623e03682b46a836e68e36d07053ee | src/utils/promises.ts | src/utils/promises.ts | export async function waitFor<T>(action: () => T, checkEveryMilliseconds: number, tryForMilliseconds: number, token?: { isCancellationRequested: boolean }): Promise<T | undefined> {
let timeRemaining = tryForMilliseconds;
while (timeRemaining > 0 && !(token && token.isCancellationRequested)) {
const res = action();
if (res)
return res;
await new Promise((resolve) => setTimeout(resolve, checkEveryMilliseconds));
timeRemaining -= 20;
}
}
| export async function waitFor<T>(action: () => T, checkEveryMilliseconds: number, tryForMilliseconds: number, token?: { isCancellationRequested: boolean }): Promise<T | undefined> {
let timeRemaining = tryForMilliseconds;
while (timeRemaining > 0 && !(token && token.isCancellationRequested)) {
const res = action();
if (res)
return res;
await new Promise((resolve) => setTimeout(resolve, checkEveryMilliseconds));
timeRemaining -= checkEveryMilliseconds;
}
}
| Fix bug in waitFor not waiting long enough | Fix bug in waitFor not waiting long enough
| TypeScript | mit | Dart-Code/Dart-Code,Dart-Code/Dart-Code,Dart-Code/Dart-Code,Dart-Code/Dart-Code,Dart-Code/Dart-Code | typescript | ## Code Before:
export async function waitFor<T>(action: () => T, checkEveryMilliseconds: number, tryForMilliseconds: number, token?: { isCancellationRequested: boolean }): Promise<T | undefined> {
let timeRemaining = tryForMilliseconds;
while (timeRemaining > 0 && !(token && token.isCancellationRequested)) {
const res = action();
if (res)
return res;
await new Promise((resolve) => setTimeout(resolve, checkEveryMilliseconds));
timeRemaining -= 20;
}
}
## Instruction:
Fix bug in waitFor not waiting long enough
## Code After:
export async function waitFor<T>(action: () => T, checkEveryMilliseconds: number, tryForMilliseconds: number, token?: { isCancellationRequested: boolean }): Promise<T | undefined> {
let timeRemaining = tryForMilliseconds;
while (timeRemaining > 0 && !(token && token.isCancellationRequested)) {
const res = action();
if (res)
return res;
await new Promise((resolve) => setTimeout(resolve, checkEveryMilliseconds));
timeRemaining -= checkEveryMilliseconds;
}
}
| export async function waitFor<T>(action: () => T, checkEveryMilliseconds: number, tryForMilliseconds: number, token?: { isCancellationRequested: boolean }): Promise<T | undefined> {
let timeRemaining = tryForMilliseconds;
while (timeRemaining > 0 && !(token && token.isCancellationRequested)) {
const res = action();
if (res)
return res;
await new Promise((resolve) => setTimeout(resolve, checkEveryMilliseconds));
- timeRemaining -= 20;
+ timeRemaining -= checkEveryMilliseconds;
}
} | 2 | 0.2 | 1 | 1 |
8e029786ff8fad8bbe851fe28acb8253e0721d0e | test/functional/schemes_controller_test.rb | test/functional/schemes_controller_test.rb | require 'test_helper'
class SchemesControllerTest < ActionController::TestCase
test "should get new" do
get :new
assert_response :success
end
end
| require 'test_helper'
class SchemesControllerTest < ActionController::TestCase
test 'redirect to splash if not logged in' do
get :new, {}, {id: nil}
assert_redirected_to :root
end
test 'gets new if logged in' do
get :new, {}, {id: true}
assert_response :success
end
end
| Change some tests for schemes controller | Change some tests for schemes controller
| Ruby | mit | mybuddymichael/skeemah,mybuddymichael/skeemah | ruby | ## Code Before:
require 'test_helper'
class SchemesControllerTest < ActionController::TestCase
test "should get new" do
get :new
assert_response :success
end
end
## Instruction:
Change some tests for schemes controller
## Code After:
require 'test_helper'
class SchemesControllerTest < ActionController::TestCase
test 'redirect to splash if not logged in' do
get :new, {}, {id: nil}
assert_redirected_to :root
end
test 'gets new if logged in' do
get :new, {}, {id: true}
assert_response :success
end
end
| require 'test_helper'
class SchemesControllerTest < ActionController::TestCase
- test "should get new" do
- get :new
+ test 'redirect to splash if not logged in' do
+ get :new, {}, {id: nil}
+ assert_redirected_to :root
+ end
+
+ test 'gets new if logged in' do
+ get :new, {}, {id: true}
assert_response :success
end
end | 9 | 1 | 7 | 2 |
1592c50612b4fdf45aa32294a96b59cc0bfb3c49 | .travis.yml | .travis.yml | language: php
sudo: true
php:
- '5.5'
- '5.6'
- '7.0'
- hhvm
- nightly
before_install:
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- pecl install libsodium
install:
- travis_retry composer install --no-interaction --prefer-source
script: vendor/bin/phpunit | language: php
sudo: true
php:
- '5.5'
- '5.6'
- '7.0'
before_install:
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- pecl install libsodium
install:
- travis_retry composer install --no-interaction --prefer-source
script: vendor/bin/phpunit | Remove HHVM and nightly from the build targets. | Remove HHVM and nightly from the build targets.
| YAML | mit | simpleapisecurity/php | yaml | ## Code Before:
language: php
sudo: true
php:
- '5.5'
- '5.6'
- '7.0'
- hhvm
- nightly
before_install:
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- pecl install libsodium
install:
- travis_retry composer install --no-interaction --prefer-source
script: vendor/bin/phpunit
## Instruction:
Remove HHVM and nightly from the build targets.
## Code After:
language: php
sudo: true
php:
- '5.5'
- '5.6'
- '7.0'
before_install:
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- pecl install libsodium
install:
- travis_retry composer install --no-interaction --prefer-source
script: vendor/bin/phpunit | language: php
sudo: true
php:
- '5.5'
- '5.6'
- '7.0'
- - hhvm
- - nightly
before_install:
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- wget http://ppa.launchpad.net/anton+/dnscrypt/ubuntu/pool/main/libs/libsodium/libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium13_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dev_1.0.3-1pmo1~precise_amd64.deb
- sudo dpkg -i libsodium-dbg_1.0.3-1pmo1~precise_amd64.deb
- pecl install libsodium
install:
- travis_retry composer install --no-interaction --prefer-source
script: vendor/bin/phpunit | 2 | 0.083333 | 0 | 2 |
7f513be538a0aab943afed25ee1f64295b25644d | Cargo.toml | Cargo.toml | [package]
name = "rsciidoc"
version = "0.1.0"
authors = [
"Andrew Aday <andrewladay@gmail.com>",
"Alan Du <alanhdu@gmail.com>",
"Dennis Wei <dw2654@columbia.edu>",
]
[dependencies]
| [package]
name = "rsciidoc"
version = "0.1.0"
authors = [
"Andrew Aday <andrewladay@gmail.com>",
"Alan Du <alanhdu@gmail.com>",
"Dennis Wei <dw2654@columbia.edu>",
]
[dependencies]
nom = "^1.2.2"
| Add nom as a dependency | Add nom as a dependency
| TOML | mpl-2.0 | ADI-Labs/rsciidoc | toml | ## Code Before:
[package]
name = "rsciidoc"
version = "0.1.0"
authors = [
"Andrew Aday <andrewladay@gmail.com>",
"Alan Du <alanhdu@gmail.com>",
"Dennis Wei <dw2654@columbia.edu>",
]
[dependencies]
## Instruction:
Add nom as a dependency
## Code After:
[package]
name = "rsciidoc"
version = "0.1.0"
authors = [
"Andrew Aday <andrewladay@gmail.com>",
"Alan Du <alanhdu@gmail.com>",
"Dennis Wei <dw2654@columbia.edu>",
]
[dependencies]
nom = "^1.2.2"
| [package]
name = "rsciidoc"
version = "0.1.0"
authors = [
"Andrew Aday <andrewladay@gmail.com>",
"Alan Du <alanhdu@gmail.com>",
"Dennis Wei <dw2654@columbia.edu>",
]
[dependencies]
+ nom = "^1.2.2" | 1 | 0.1 | 1 | 0 |
a16e2a2db996469e0112297d5214cf8eaf1aaf97 | deploy-to-nodes.sh | deploy-to-nodes.sh |
nodes=($(docker-machine ls -q | grep hyperquick))
version_tag() {
if [ ! "${VERSION_TAG+x}" = "x" ] ; then git --git-dir "${KUBE_ROOT:-../}/.git" describe; fi
}
docker save -o "node.tar" "errordeveloper/hyperquick:node-$(version_tag)"
for n in "${nodes[@]}" ; do
eval "$(docker-machine env --shell bash "${n}")"
make clean
docker load -i "node.tar"
eval "$(docker-machine env --shell bash --unset)"
done
rm -f "node.tar"
|
cd "$(dirname "${BASH_SOURCE[0]}")"
nodes=($(docker-machine ls -q | grep hyperquick))
version_tag() {
if [ ! "${VERSION_TAG+x}" = "x" ] ; then git --git-dir "${KUBE_ROOT:-../}/.git" describe; fi
}
docker save -o "node.tar" "errordeveloper/hyperquick:node-$(version_tag)"
for n in "${nodes[@]}" ; do
eval "$(docker-machine env --shell bash "${n}")"
make clean
docker load -i "node.tar"
eval "$(docker-machine env --shell bash --unset)"
done
rm -f "node.tar"
| Make sure this can be called from any subdir | Make sure this can be called from any subdir
| Shell | apache-2.0 | errordeveloper/t8s | shell | ## Code Before:
nodes=($(docker-machine ls -q | grep hyperquick))
version_tag() {
if [ ! "${VERSION_TAG+x}" = "x" ] ; then git --git-dir "${KUBE_ROOT:-../}/.git" describe; fi
}
docker save -o "node.tar" "errordeveloper/hyperquick:node-$(version_tag)"
for n in "${nodes[@]}" ; do
eval "$(docker-machine env --shell bash "${n}")"
make clean
docker load -i "node.tar"
eval "$(docker-machine env --shell bash --unset)"
done
rm -f "node.tar"
## Instruction:
Make sure this can be called from any subdir
## Code After:
cd "$(dirname "${BASH_SOURCE[0]}")"
nodes=($(docker-machine ls -q | grep hyperquick))
version_tag() {
if [ ! "${VERSION_TAG+x}" = "x" ] ; then git --git-dir "${KUBE_ROOT:-../}/.git" describe; fi
}
docker save -o "node.tar" "errordeveloper/hyperquick:node-$(version_tag)"
for n in "${nodes[@]}" ; do
eval "$(docker-machine env --shell bash "${n}")"
make clean
docker load -i "node.tar"
eval "$(docker-machine env --shell bash --unset)"
done
rm -f "node.tar"
| +
+ cd "$(dirname "${BASH_SOURCE[0]}")"
nodes=($(docker-machine ls -q | grep hyperquick))
version_tag() {
if [ ! "${VERSION_TAG+x}" = "x" ] ; then git --git-dir "${KUBE_ROOT:-../}/.git" describe; fi
}
docker save -o "node.tar" "errordeveloper/hyperquick:node-$(version_tag)"
for n in "${nodes[@]}" ; do
eval "$(docker-machine env --shell bash "${n}")"
make clean
docker load -i "node.tar"
eval "$(docker-machine env --shell bash --unset)"
done
rm -f "node.tar" | 2 | 0.117647 | 2 | 0 |
9e9acac9b5c956f514678ee23e6ef72ce73ef158 | lib/crepe/versioning/endpoint.rb | lib/crepe/versioning/endpoint.rb | require 'crepe/versioning/request'
module Crepe
module Versioning
module Endpoint
# The most acceptable format requested, e.g. +:json+.
#
# @return [Symbol]
def format
return @format if defined? @format
formats = Util.media_types config[:formats]
media_type = request.accepts.best_of formats
@format = config[:formats].find { |f| Util.media_type(f) == media_type }
end
end
end
end
Crepe::Endpoint.send(:prepend, Crepe::Versioning::Endpoint)
| require 'crepe/versioning/request'
module Crepe
module Versioning
# Crepe::Endpoint needs a format override.
module Endpoint
# The most acceptable format requested, e.g. +:json+.
#
# @return [Symbol]
def format
return @format if defined? @format
formats = Util.media_types config[:formats]
media_type = request.accepts.best_of formats
@format = config[:formats].find { |f| Util.media_type(f) == media_type }
end
end
end
end
Crepe::Endpoint.send(:prepend, Crepe::Versioning::Endpoint)
| Add explanatory comment to Crepe::Versioning::Endpoint | Add explanatory comment to Crepe::Versioning::Endpoint
Signed-off-by: David Celis <b1c1d8736f20db3fb6c1c66bb1455ed43909f0d8@davidcel.is>
| Ruby | mit | crepe/crepe-versioning | ruby | ## Code Before:
require 'crepe/versioning/request'
module Crepe
module Versioning
module Endpoint
# The most acceptable format requested, e.g. +:json+.
#
# @return [Symbol]
def format
return @format if defined? @format
formats = Util.media_types config[:formats]
media_type = request.accepts.best_of formats
@format = config[:formats].find { |f| Util.media_type(f) == media_type }
end
end
end
end
Crepe::Endpoint.send(:prepend, Crepe::Versioning::Endpoint)
## Instruction:
Add explanatory comment to Crepe::Versioning::Endpoint
Signed-off-by: David Celis <b1c1d8736f20db3fb6c1c66bb1455ed43909f0d8@davidcel.is>
## Code After:
require 'crepe/versioning/request'
module Crepe
module Versioning
# Crepe::Endpoint needs a format override.
module Endpoint
# The most acceptable format requested, e.g. +:json+.
#
# @return [Symbol]
def format
return @format if defined? @format
formats = Util.media_types config[:formats]
media_type = request.accepts.best_of formats
@format = config[:formats].find { |f| Util.media_type(f) == media_type }
end
end
end
end
Crepe::Endpoint.send(:prepend, Crepe::Versioning::Endpoint)
| require 'crepe/versioning/request'
module Crepe
module Versioning
+ # Crepe::Endpoint needs a format override.
module Endpoint
# The most acceptable format requested, e.g. +:json+.
#
# @return [Symbol]
def format
return @format if defined? @format
formats = Util.media_types config[:formats]
media_type = request.accepts.best_of formats
@format = config[:formats].find { |f| Util.media_type(f) == media_type }
end
end
end
end
Crepe::Endpoint.send(:prepend, Crepe::Versioning::Endpoint) | 1 | 0.052632 | 1 | 0 |
b6ddc2ba05bd552290853a78952025f8dbc44409 | deployment_scripts/puppet/modules/lma_monitoring_analytics/metadata.json | deployment_scripts/puppet/modules/lma_monitoring_analytics/metadata.json | {
"name": "lma_monitoring_analytics",
"version": "1.0.0",
"author": "Guillaume Thouvenin <gthouvenin@mirantis.com>",
"summary": "Provides Grafana for InfluxDB for LMA monitoring analytics",
"license": "Apache-2.0",
"source": "git://git.openstack.org/cgit/stackforge/fuel-plugin-influxdb-grafana.git",
"project_page": "none",
"issues_url": "none",
"operatingsystem_support": [
{
"operatingsystem": "Ubuntu",
"operatingsystemrelease": ["14.04"]
},
{
"operatingsystem": "CentOS",
"operatingsystemrelease": ["6"]
}
],
"description": "Puppet module for configuring the Grafana dashboard and InfluxDB",
"dependencies": [
{"name": "puppetlabs/stdlib", "version_requirement": "4.x"},
{"name": "jfryman/nginx", "version_requirement": ">= 0.2.2"},
{"name": "bfraser/puppet-grafana", "version_requirement": ">= 2.1.0"}
]
}
| {
"name": "lma_monitoring_analytics",
"version": "1.0.0",
"author": "Guillaume Thouvenin <gthouvenin@mirantis.com>",
"summary": "Provides Grafana for InfluxDB for LMA monitoring analytics",
"license": "Apache-2.0",
"source": "git://git.openstack.org/cgit/stackforge/fuel-plugin-influxdb-grafana.git",
"project_page": "none",
"issues_url": "none",
"operatingsystem_support": [
{
"operatingsystem": "Ubuntu",
"operatingsystemrelease": ["14.04"]
},
{
"operatingsystem": "CentOS",
"operatingsystemrelease": ["6"]
}
],
"description": "Puppet module for configuring the Grafana dashboard and InfluxDB",
"dependencies": [
{"name": "puppetlabs/stdlib", "version_requirement": "4.x"},
{"name": "bfraser/puppet-grafana", "version_requirement": ">= 2.1.0"}
]
}
| Remove nginx Puppet module dependency | Remove nginx Puppet module dependency
Change-Id: I005f738acaf49117dd4f737a5385c45de2e38f0a
| JSON | apache-2.0 | stackforge/fuel-plugin-influxdb-grafana,stackforge/fuel-plugin-influxdb-grafana,stackforge/fuel-plugin-influxdb-grafana,stackforge/fuel-plugin-influxdb-grafana | json | ## Code Before:
{
"name": "lma_monitoring_analytics",
"version": "1.0.0",
"author": "Guillaume Thouvenin <gthouvenin@mirantis.com>",
"summary": "Provides Grafana for InfluxDB for LMA monitoring analytics",
"license": "Apache-2.0",
"source": "git://git.openstack.org/cgit/stackforge/fuel-plugin-influxdb-grafana.git",
"project_page": "none",
"issues_url": "none",
"operatingsystem_support": [
{
"operatingsystem": "Ubuntu",
"operatingsystemrelease": ["14.04"]
},
{
"operatingsystem": "CentOS",
"operatingsystemrelease": ["6"]
}
],
"description": "Puppet module for configuring the Grafana dashboard and InfluxDB",
"dependencies": [
{"name": "puppetlabs/stdlib", "version_requirement": "4.x"},
{"name": "jfryman/nginx", "version_requirement": ">= 0.2.2"},
{"name": "bfraser/puppet-grafana", "version_requirement": ">= 2.1.0"}
]
}
## Instruction:
Remove nginx Puppet module dependency
Change-Id: I005f738acaf49117dd4f737a5385c45de2e38f0a
## Code After:
{
"name": "lma_monitoring_analytics",
"version": "1.0.0",
"author": "Guillaume Thouvenin <gthouvenin@mirantis.com>",
"summary": "Provides Grafana for InfluxDB for LMA monitoring analytics",
"license": "Apache-2.0",
"source": "git://git.openstack.org/cgit/stackforge/fuel-plugin-influxdb-grafana.git",
"project_page": "none",
"issues_url": "none",
"operatingsystem_support": [
{
"operatingsystem": "Ubuntu",
"operatingsystemrelease": ["14.04"]
},
{
"operatingsystem": "CentOS",
"operatingsystemrelease": ["6"]
}
],
"description": "Puppet module for configuring the Grafana dashboard and InfluxDB",
"dependencies": [
{"name": "puppetlabs/stdlib", "version_requirement": "4.x"},
{"name": "bfraser/puppet-grafana", "version_requirement": ">= 2.1.0"}
]
}
| {
"name": "lma_monitoring_analytics",
"version": "1.0.0",
"author": "Guillaume Thouvenin <gthouvenin@mirantis.com>",
"summary": "Provides Grafana for InfluxDB for LMA monitoring analytics",
"license": "Apache-2.0",
"source": "git://git.openstack.org/cgit/stackforge/fuel-plugin-influxdb-grafana.git",
"project_page": "none",
"issues_url": "none",
"operatingsystem_support": [
{
"operatingsystem": "Ubuntu",
"operatingsystemrelease": ["14.04"]
},
{
"operatingsystem": "CentOS",
"operatingsystemrelease": ["6"]
}
],
"description": "Puppet module for configuring the Grafana dashboard and InfluxDB",
"dependencies": [
{"name": "puppetlabs/stdlib", "version_requirement": "4.x"},
- {"name": "jfryman/nginx", "version_requirement": ">= 0.2.2"},
{"name": "bfraser/puppet-grafana", "version_requirement": ">= 2.1.0"}
]
} | 1 | 0.038462 | 0 | 1 |
25a93f9091e2b0c744a11a6aa4074e1ab657e333 | README.md | README.md | gmap-fm
=======
Retrieve online digital map data from Google to provide local calculation resource.
| gmap-fm
=======
Retrieve online digital map data from Google to visualize local resource.
Framework
---------
1. Frontend uses jQuery to present the content to users.
2. Use Google Map API to visualize data.
3. Backend uses MongoDB database to store and manage trajectory data
4. Server is built on python BaseHTTPServer
| Update the project readme file | Update the project readme file
| Markdown | mit | thekingofkings/gmap-fm,thekingofkings/gmap-fm,thekingofkings/gmap-fm | markdown | ## Code Before:
gmap-fm
=======
Retrieve online digital map data from Google to provide local calculation resource.
## Instruction:
Update the project readme file
## Code After:
gmap-fm
=======
Retrieve online digital map data from Google to visualize local resource.
Framework
---------
1. Frontend uses jQuery to present the content to users.
2. Use Google Map API to visualize data.
3. Backend uses MongoDB database to store and manage trajectory data
4. Server is built on python BaseHTTPServer
| gmap-fm
=======
- Retrieve online digital map data from Google to provide local calculation resource.
? --- ^ ------------
+ Retrieve online digital map data from Google to visualize local resource.
? ^^^^^^
+
+ Framework
+ ---------
+ 1. Frontend uses jQuery to present the content to users.
+ 2. Use Google Map API to visualize data.
+ 3. Backend uses MongoDB database to store and manage trajectory data
+ 4. Server is built on python BaseHTTPServer | 9 | 2.25 | 8 | 1 |
87a3ce1f57961d68daead09340c2da9bff568501 | src/components/Home.vue | src/components/Home.vue | <template>
<div class="container">
<h1>{{ title }}</h1>
<div id="search">
</div>
<ul>
<serie v-for="serie in series" :key="serie.id" :serie-details="serie" @clicked="toggleFavorites($event)"></serie>
</ul>
</div>
</template>
<script>
import Serie from '@/components/Serie.vue'
import seriesService from '@/services/series.service'
import favoritesService from '@/services/favorites.service'
export default {
components: {
Serie
},
data () {
return {
title: 'Liste des séries',
series: []
}
},
mounted () {
seriesService.getSeries().then(response => (this.series = response.data.map(item => item.show)))
},
methods: {
toggleFavorites (serie) {
favoritesService.isFavorite(serie) ? favoritesService.removeFavorite(serie) : favoritesService.addFavorite(serie)
}
}
}
</script>
<style scoped>
#search {
padding-bottom: 15px;
}
ul {
-webkit-padding-start: 0px;
}
</style>
| <template>
<div class="container">
<h1>{{ title }}</h1>
<div id="search">
<input type="text" v-model="search" class="form-control" placeholder="Filtrer...">
</div>
<ul>
<serie v-for="serie in filteredSeries" :key="serie.id" :serie-details="serie" @clicked="toggleFavorites($event)"></serie>
</ul>
</div>
</template>
<script>
import Serie from '@/components/Serie.vue'
import seriesService from '@/services/series.service'
import favoritesService from '@/services/favorites.service'
export default {
components: {
Serie
},
data () {
return {
title: 'Liste des séries',
series: [],
search: ''
}
},
mounted () {
seriesService.getSeries().then(response => (this.series = response.data.map(item => item.show)))
},
methods: {
toggleFavorites (serie) {
favoritesService.isFavorite(serie) ? favoritesService.removeFavorite(serie) : favoritesService.addFavorite(serie)
}
},
computed: {
filteredSeries () {
return this.search === '' ? this.series : this.series.filter(serie => serie.name.toLowerCase().includes(this.search.toLowerCase()))
}
}
}
</script>
<style scoped>
#search {
padding-bottom: 15px;
}
ul {
-webkit-padding-start: 0px;
}
</style>
| Add search input to filter series | Add search input to filter series
| Vue | apache-2.0 | GregoryBevan/devfest-vuejs,GregoryBevan/devfest-vuejs | vue | ## Code Before:
<template>
<div class="container">
<h1>{{ title }}</h1>
<div id="search">
</div>
<ul>
<serie v-for="serie in series" :key="serie.id" :serie-details="serie" @clicked="toggleFavorites($event)"></serie>
</ul>
</div>
</template>
<script>
import Serie from '@/components/Serie.vue'
import seriesService from '@/services/series.service'
import favoritesService from '@/services/favorites.service'
export default {
components: {
Serie
},
data () {
return {
title: 'Liste des séries',
series: []
}
},
mounted () {
seriesService.getSeries().then(response => (this.series = response.data.map(item => item.show)))
},
methods: {
toggleFavorites (serie) {
favoritesService.isFavorite(serie) ? favoritesService.removeFavorite(serie) : favoritesService.addFavorite(serie)
}
}
}
</script>
<style scoped>
#search {
padding-bottom: 15px;
}
ul {
-webkit-padding-start: 0px;
}
</style>
## Instruction:
Add search input to filter series
## Code After:
<template>
<div class="container">
<h1>{{ title }}</h1>
<div id="search">
<input type="text" v-model="search" class="form-control" placeholder="Filtrer...">
</div>
<ul>
<serie v-for="serie in filteredSeries" :key="serie.id" :serie-details="serie" @clicked="toggleFavorites($event)"></serie>
</ul>
</div>
</template>
<script>
import Serie from '@/components/Serie.vue'
import seriesService from '@/services/series.service'
import favoritesService from '@/services/favorites.service'
export default {
components: {
Serie
},
data () {
return {
title: 'Liste des séries',
series: [],
search: ''
}
},
mounted () {
seriesService.getSeries().then(response => (this.series = response.data.map(item => item.show)))
},
methods: {
toggleFavorites (serie) {
favoritesService.isFavorite(serie) ? favoritesService.removeFavorite(serie) : favoritesService.addFavorite(serie)
}
},
computed: {
filteredSeries () {
return this.search === '' ? this.series : this.series.filter(serie => serie.name.toLowerCase().includes(this.search.toLowerCase()))
}
}
}
</script>
<style scoped>
#search {
padding-bottom: 15px;
}
ul {
-webkit-padding-start: 0px;
}
</style>
| <template>
<div class="container">
<h1>{{ title }}</h1>
<div id="search">
+ <input type="text" v-model="search" class="form-control" placeholder="Filtrer...">
</div>
<ul>
- <serie v-for="serie in series" :key="serie.id" :serie-details="serie" @clicked="toggleFavorites($event)"></serie>
? ^
+ <serie v-for="serie in filteredSeries" :key="serie.id" :serie-details="serie" @clicked="toggleFavorites($event)"></serie>
? ^^^^^^^^^
</ul>
</div>
</template>
<script>
import Serie from '@/components/Serie.vue'
import seriesService from '@/services/series.service'
import favoritesService from '@/services/favorites.service'
export default {
components: {
Serie
},
data () {
return {
title: 'Liste des séries',
- series: []
+ series: [],
? +
+ search: ''
}
},
mounted () {
seriesService.getSeries().then(response => (this.series = response.data.map(item => item.show)))
},
methods: {
toggleFavorites (serie) {
favoritesService.isFavorite(serie) ? favoritesService.removeFavorite(serie) : favoritesService.addFavorite(serie)
+ }
+ },
+ computed: {
+ filteredSeries () {
+ return this.search === '' ? this.series : this.series.filter(serie => serie.name.toLowerCase().includes(this.search.toLowerCase()))
}
}
}
</script>
<style scoped>
#search {
padding-bottom: 15px;
}
ul {
-webkit-padding-start: 0px;
}
</style> | 11 | 0.244444 | 9 | 2 |
019beb13dcf5d594dd7c6a201d0b9db5d3b36e91 | config-template.json | config-template.json | {
"api_key": "<discord api key here>",
"services": {
"BiblesOrg": {
"api_key": "<bibles.org api key here>"
},
"BibleGateway": {}
},
"bibles": {
"esv": {
"name": "English Standard Version",
"service": "BiblesOrg",
"service_version": "eng-ESV"
},
"kjv": {
"name": "King James Version",
"service": "BiblesOrg",
"service_version": "eng-KJV"
},
"kjva": {
"name": "King James Version with Apocrypha",
"service": "BiblesOrg",
"service_version": "eng-KJVA"
},
"nasb": {
"name": "New American Standard Bible",
"service": "BiblesOrg",
"service_version": "eng-NASB"
},
"sbl": {
"name": "SBL Greek New Testament",
"service": "BibleGateway",
"service_version": "SBLGNT"
},
"niv": {
"name": "New International Version",
"service": "BibleGateway",
"service_version": "NIV"
}
}
}
| {
"api_key": "<discord api key here>",
"services": {
"BiblesOrg": {
"api_key": "<bibles.org api key here>"
},
"BibleGateway": {}
},
"bibles": {
"esv": {
"name": "English Standard Version",
"abbr": "ESV",
"service": "BiblesOrg",
"service_version": "eng-ESV"
},
"kjv": {
"name": "King James Version",
"abbr": "KJV",
"service": "BiblesOrg",
"service_version": "eng-KJV"
},
"kjva": {
"name": "King James Version with Apocrypha",
"abbr": "KJV w/ Apocrypha",
"service": "BiblesOrg",
"service_version": "eng-KJVA"
},
"nasb": {
"name": "New American Standard Bible",
"abbr": "NASB",
"service": "BiblesOrg",
"service_version": "eng-NASB"
},
"sbl": {
"name": "SBL Greek New Testament",
"abbr": "SBL GNT",
"service": "BibleGateway",
"service_version": "SBLGNT"
},
"niv": {
"name": "New International Version",
"abbr": "NIV",
"service": "BibleGateway",
"service_version": "NIV"
}
}
}
| Add "abbr" to Bible objects in config template | Add "abbr" to Bible objects in config template
| JSON | bsd-3-clause | bryanforbes/Erasmus | json | ## Code Before:
{
"api_key": "<discord api key here>",
"services": {
"BiblesOrg": {
"api_key": "<bibles.org api key here>"
},
"BibleGateway": {}
},
"bibles": {
"esv": {
"name": "English Standard Version",
"service": "BiblesOrg",
"service_version": "eng-ESV"
},
"kjv": {
"name": "King James Version",
"service": "BiblesOrg",
"service_version": "eng-KJV"
},
"kjva": {
"name": "King James Version with Apocrypha",
"service": "BiblesOrg",
"service_version": "eng-KJVA"
},
"nasb": {
"name": "New American Standard Bible",
"service": "BiblesOrg",
"service_version": "eng-NASB"
},
"sbl": {
"name": "SBL Greek New Testament",
"service": "BibleGateway",
"service_version": "SBLGNT"
},
"niv": {
"name": "New International Version",
"service": "BibleGateway",
"service_version": "NIV"
}
}
}
## Instruction:
Add "abbr" to Bible objects in config template
## Code After:
{
"api_key": "<discord api key here>",
"services": {
"BiblesOrg": {
"api_key": "<bibles.org api key here>"
},
"BibleGateway": {}
},
"bibles": {
"esv": {
"name": "English Standard Version",
"abbr": "ESV",
"service": "BiblesOrg",
"service_version": "eng-ESV"
},
"kjv": {
"name": "King James Version",
"abbr": "KJV",
"service": "BiblesOrg",
"service_version": "eng-KJV"
},
"kjva": {
"name": "King James Version with Apocrypha",
"abbr": "KJV w/ Apocrypha",
"service": "BiblesOrg",
"service_version": "eng-KJVA"
},
"nasb": {
"name": "New American Standard Bible",
"abbr": "NASB",
"service": "BiblesOrg",
"service_version": "eng-NASB"
},
"sbl": {
"name": "SBL Greek New Testament",
"abbr": "SBL GNT",
"service": "BibleGateway",
"service_version": "SBLGNT"
},
"niv": {
"name": "New International Version",
"abbr": "NIV",
"service": "BibleGateway",
"service_version": "NIV"
}
}
}
| {
"api_key": "<discord api key here>",
"services": {
"BiblesOrg": {
"api_key": "<bibles.org api key here>"
},
"BibleGateway": {}
},
"bibles": {
"esv": {
"name": "English Standard Version",
+ "abbr": "ESV",
"service": "BiblesOrg",
"service_version": "eng-ESV"
},
"kjv": {
"name": "King James Version",
+ "abbr": "KJV",
"service": "BiblesOrg",
"service_version": "eng-KJV"
},
"kjva": {
"name": "King James Version with Apocrypha",
+ "abbr": "KJV w/ Apocrypha",
"service": "BiblesOrg",
"service_version": "eng-KJVA"
},
"nasb": {
"name": "New American Standard Bible",
+ "abbr": "NASB",
"service": "BiblesOrg",
"service_version": "eng-NASB"
},
"sbl": {
"name": "SBL Greek New Testament",
+ "abbr": "SBL GNT",
"service": "BibleGateway",
"service_version": "SBLGNT"
},
"niv": {
"name": "New International Version",
+ "abbr": "NIV",
"service": "BibleGateway",
"service_version": "NIV"
}
}
} | 6 | 0.146341 | 6 | 0 |
106ea580471387a3645877f52018ff2880db34f3 | live_studio/config/forms.py | live_studio/config/forms.py | from django import forms
from .models import Config
class ConfigForm(forms.ModelForm):
class Meta:
model = Config
exclude = ('created', 'user')
PAGES = (
('base',),
('distribution',),
('media_type',),
('architecture',),
('installer',),
('locale', 'keyboard_layout'),
)
WIZARD_FORMS = []
for fields in PAGES:
meta = type('Meta', (), {
'model': Config,
'fields': fields,
})
WIZARD_FORMS.append(type('', (forms.ModelForm,), {'Meta': meta}))
| from django import forms
from .models import Config
class ConfigForm(forms.ModelForm):
class Meta:
model = Config
exclude = ('created', 'user')
PAGES = (
('base',),
('distribution',),
('media_type',),
('architecture',),
('installer',),
('locale', 'keyboard_layout'),
)
WIZARD_FORMS = []
for fields in PAGES:
meta = type('Meta', (), {
'model': Config,
'fields': fields,
'widgets': {
'base': forms.RadioSelect(),
'distribution': forms.RadioSelect(),
'media_type': forms.RadioSelect(),
'architecture': forms.RadioSelect(),
'installer': forms.RadioSelect(),
},
})
WIZARD_FORMS.append(type('', (forms.ModelForm,), {'Meta': meta}))
| Use radio buttons for most of the interface. | Use radio buttons for most of the interface.
Signed-off-by: Chris Lamb <29e6d179a8d73471df7861382db6dd7e64138033@debian.org>
| Python | agpl-3.0 | lamby/live-studio,lamby/live-studio,lamby/live-studio,debian-live/live-studio,debian-live/live-studio,debian-live/live-studio | python | ## Code Before:
from django import forms
from .models import Config
class ConfigForm(forms.ModelForm):
class Meta:
model = Config
exclude = ('created', 'user')
PAGES = (
('base',),
('distribution',),
('media_type',),
('architecture',),
('installer',),
('locale', 'keyboard_layout'),
)
WIZARD_FORMS = []
for fields in PAGES:
meta = type('Meta', (), {
'model': Config,
'fields': fields,
})
WIZARD_FORMS.append(type('', (forms.ModelForm,), {'Meta': meta}))
## Instruction:
Use radio buttons for most of the interface.
Signed-off-by: Chris Lamb <29e6d179a8d73471df7861382db6dd7e64138033@debian.org>
## Code After:
from django import forms
from .models import Config
class ConfigForm(forms.ModelForm):
class Meta:
model = Config
exclude = ('created', 'user')
PAGES = (
('base',),
('distribution',),
('media_type',),
('architecture',),
('installer',),
('locale', 'keyboard_layout'),
)
WIZARD_FORMS = []
for fields in PAGES:
meta = type('Meta', (), {
'model': Config,
'fields': fields,
'widgets': {
'base': forms.RadioSelect(),
'distribution': forms.RadioSelect(),
'media_type': forms.RadioSelect(),
'architecture': forms.RadioSelect(),
'installer': forms.RadioSelect(),
},
})
WIZARD_FORMS.append(type('', (forms.ModelForm,), {'Meta': meta}))
| from django import forms
from .models import Config
class ConfigForm(forms.ModelForm):
class Meta:
model = Config
exclude = ('created', 'user')
PAGES = (
('base',),
('distribution',),
('media_type',),
('architecture',),
('installer',),
('locale', 'keyboard_layout'),
)
WIZARD_FORMS = []
for fields in PAGES:
meta = type('Meta', (), {
'model': Config,
'fields': fields,
+ 'widgets': {
+ 'base': forms.RadioSelect(),
+ 'distribution': forms.RadioSelect(),
+ 'media_type': forms.RadioSelect(),
+ 'architecture': forms.RadioSelect(),
+ 'installer': forms.RadioSelect(),
+ },
})
WIZARD_FORMS.append(type('', (forms.ModelForm,), {'Meta': meta})) | 7 | 0.269231 | 7 | 0 |
7159908eb64ebe4a1d0b94435e8d2ba318b44b63 | setup.py | setup.py | from setuptools import setup
import re
import os
import requests
def get_pip_version(pkginfo_url):
pkginfo = requests.get(pkginfo_url).text
for record in pkginfo.split('\n'):
if record.startswith('Version'):
current_version = str(record).split(':',1)
return (current_version[1]).strip()
if os.path.isfile('branch_master'):
current_version = get_pip_version('https://pypi.python.org/pypi?name=taskcat&:action=display_pkginfo')
else:
current_version = get_pip_version('https://testpypi.python.org/pypi?name=taskcat&:action=display_pkginfo')
new_version =re.sub('\d$', lambda x: str(int(x.group(0)) + 1), current_version)
setup(
name='taskcat',
packages=['taskcat'],
description='An OpenSource Cloudformation Deployment Framework',
author='Tony Vattathil, Santiago Cardenas, Shivansh Singh',
author_email='tonynv@amazon.com, sancard@amazon.com, sshvans@amazon.com',
url='https://aws-quickstart.github.io/taskcat/',
version=new_version,
license='Apache License 2.0',
download_url='https://github.com/aws-quickstart/taskcat/tarball/master',
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Topic :: Software Development :: Libraries',
],
keywords=['aws', 'cloudformation', 'cloud', 'cloudformation testing', 'cloudformation deploy', 'taskcat'],
install_requires=['boto3', 'pyfiglet', 'pyyaml', 'tabulate', 'yattag']
)
| from setuptools import setup
setup(
name='taskcat',
packages=['taskcat'],
description='An OpenSource Cloudformation Deployment Framework',
author='Tony Vattathil, Santiago Cardenas, Shivansh Singh',
author_email='tonynv@amazon.com, sancard@amazon.com, sshvans@amazon.com',
url='https://aws-quickstart.github.io/taskcat/',
version='0.0.0.dev11',
license='Apache License 2.0',
download_url='https://github.com/aws-quickstart/taskcat/tarball/master',
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Topic :: Software Development :: Libraries',
],
keywords=['aws', 'cloudformation', 'cloud', 'cloudformation testing', 'cloudformation deploy', 'taskcat'],
install_requires=['boto3', 'pyfiglet', 'pyyaml', 'tabulate', 'yattag']
)
| Revert Auto Version (Versioning is now managed by CI) | Revert Auto Version (Versioning is now managed by CI)
| Python | apache-2.0 | aws-quickstart/taskcat,aws-quickstart/taskcat,aws-quickstart/taskcat | python | ## Code Before:
from setuptools import setup
import re
import os
import requests
def get_pip_version(pkginfo_url):
pkginfo = requests.get(pkginfo_url).text
for record in pkginfo.split('\n'):
if record.startswith('Version'):
current_version = str(record).split(':',1)
return (current_version[1]).strip()
if os.path.isfile('branch_master'):
current_version = get_pip_version('https://pypi.python.org/pypi?name=taskcat&:action=display_pkginfo')
else:
current_version = get_pip_version('https://testpypi.python.org/pypi?name=taskcat&:action=display_pkginfo')
new_version =re.sub('\d$', lambda x: str(int(x.group(0)) + 1), current_version)
setup(
name='taskcat',
packages=['taskcat'],
description='An OpenSource Cloudformation Deployment Framework',
author='Tony Vattathil, Santiago Cardenas, Shivansh Singh',
author_email='tonynv@amazon.com, sancard@amazon.com, sshvans@amazon.com',
url='https://aws-quickstart.github.io/taskcat/',
version=new_version,
license='Apache License 2.0',
download_url='https://github.com/aws-quickstart/taskcat/tarball/master',
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Topic :: Software Development :: Libraries',
],
keywords=['aws', 'cloudformation', 'cloud', 'cloudformation testing', 'cloudformation deploy', 'taskcat'],
install_requires=['boto3', 'pyfiglet', 'pyyaml', 'tabulate', 'yattag']
)
## Instruction:
Revert Auto Version (Versioning is now managed by CI)
## Code After:
from setuptools import setup
setup(
name='taskcat',
packages=['taskcat'],
description='An OpenSource Cloudformation Deployment Framework',
author='Tony Vattathil, Santiago Cardenas, Shivansh Singh',
author_email='tonynv@amazon.com, sancard@amazon.com, sshvans@amazon.com',
url='https://aws-quickstart.github.io/taskcat/',
version='0.0.0.dev11',
license='Apache License 2.0',
download_url='https://github.com/aws-quickstart/taskcat/tarball/master',
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Topic :: Software Development :: Libraries',
],
keywords=['aws', 'cloudformation', 'cloud', 'cloudformation testing', 'cloudformation deploy', 'taskcat'],
install_requires=['boto3', 'pyfiglet', 'pyyaml', 'tabulate', 'yattag']
)
| from setuptools import setup
- import re
- import os
- import requests
- def get_pip_version(pkginfo_url):
- pkginfo = requests.get(pkginfo_url).text
- for record in pkginfo.split('\n'):
- if record.startswith('Version'):
- current_version = str(record).split(':',1)
- return (current_version[1]).strip()
-
-
- if os.path.isfile('branch_master'):
- current_version = get_pip_version('https://pypi.python.org/pypi?name=taskcat&:action=display_pkginfo')
- else:
- current_version = get_pip_version('https://testpypi.python.org/pypi?name=taskcat&:action=display_pkginfo')
-
- new_version =re.sub('\d$', lambda x: str(int(x.group(0)) + 1), current_version)
-
setup(
name='taskcat',
packages=['taskcat'],
description='An OpenSource Cloudformation Deployment Framework',
author='Tony Vattathil, Santiago Cardenas, Shivansh Singh',
author_email='tonynv@amazon.com, sancard@amazon.com, sshvans@amazon.com',
url='https://aws-quickstart.github.io/taskcat/',
- version=new_version,
+ version='0.0.0.dev11',
license='Apache License 2.0',
download_url='https://github.com/aws-quickstart/taskcat/tarball/master',
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Topic :: Software Development :: Libraries',
-
],
-
keywords=['aws', 'cloudformation', 'cloud', 'cloudformation testing', 'cloudformation deploy', 'taskcat'],
-
install_requires=['boto3', 'pyfiglet', 'pyyaml', 'tabulate', 'yattag']
-
) | 24 | 0.545455 | 1 | 23 |
2dba897ed903815e6f904aa04a16abfbfeb02f7b | CMakeLists.txt | CMakeLists.txt | cmake_minimum_required(VERSION 2.8)
project(amqp-cpp)
# ensure c++11 on all compilers
include(set_cxx_norm.cmake)
set_cxx_norm (${CXX_NORM_CXX11})
macro (add_sources)
file (RELATIVE_PATH _relPath "${PROJECT_SOURCE_DIR}" "${CMAKE_CURRENT_SOURCE_DIR}")
foreach (_src ${ARGN})
if (_relPath)
list (APPEND SRCS "${_relPath}/${_src}")
else()
list (APPEND SRCS "${_src}")
endif()
endforeach()
if (_relPath)
# propagate SRCS to parent directory
set (SRCS ${SRCS} PARENT_SCOPE)
endif()
endmacro()
add_subdirectory(src)
add_subdirectory(include)
add_library(amqp-cpp STATIC ${SRCS})
target_include_directories(amqp-cpp SYSTEM PUBLIC ${PROJECT_SOURCE_DIR})
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR})
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR} PARENT_SCOPE)
| cmake_minimum_required(VERSION 2.8)
project(amqp-cpp)
# ensure c++11 on all compilers
include(set_cxx_norm.cmake)
set_cxx_norm (${CXX_NORM_CXX11})
macro (add_sources)
file (RELATIVE_PATH _relPath "${PROJECT_SOURCE_DIR}" "${CMAKE_CURRENT_SOURCE_DIR}")
foreach (_src ${ARGN})
if (_relPath)
list (APPEND SRCS "${_relPath}/${_src}")
else()
list (APPEND SRCS "${_src}")
endif()
endforeach()
if (_relPath)
# propagate SRCS to parent directory
set (SRCS ${SRCS} PARENT_SCOPE)
endif()
endmacro()
add_subdirectory(src)
add_subdirectory(include)
add_library(amqp-cpp STATIC ${SRCS})
target_include_directories(amqp-cpp SYSTEM PUBLIC ${PROJECT_SOURCE_DIR})
install(TARGETS amqp-cpp
ARCHIVE DESTINATION lib
)
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR})
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR} PARENT_SCOPE)
| Add install target to CMake. | Add install target to CMake.
| Text | apache-2.0 | tm604/AMQP-CPP,tangkingchun/AMQP-CPP,Kojoley/AMQP-CPP,tangkingchun/AMQP-CPP,antoniomonty/AMQP-CPP,toolking/AMQP-CPP,fantastory/AMQP-CPP,tm604/AMQP-CPP,fantastory/AMQP-CPP,Kojoley/AMQP-CPP,antoniomonty/AMQP-CPP,toolking/AMQP-CPP,CopernicaMarketingSoftware/AMQP-CPP,CopernicaMarketingSoftware/AMQP-CPP | text | ## Code Before:
cmake_minimum_required(VERSION 2.8)
project(amqp-cpp)
# ensure c++11 on all compilers
include(set_cxx_norm.cmake)
set_cxx_norm (${CXX_NORM_CXX11})
macro (add_sources)
file (RELATIVE_PATH _relPath "${PROJECT_SOURCE_DIR}" "${CMAKE_CURRENT_SOURCE_DIR}")
foreach (_src ${ARGN})
if (_relPath)
list (APPEND SRCS "${_relPath}/${_src}")
else()
list (APPEND SRCS "${_src}")
endif()
endforeach()
if (_relPath)
# propagate SRCS to parent directory
set (SRCS ${SRCS} PARENT_SCOPE)
endif()
endmacro()
add_subdirectory(src)
add_subdirectory(include)
add_library(amqp-cpp STATIC ${SRCS})
target_include_directories(amqp-cpp SYSTEM PUBLIC ${PROJECT_SOURCE_DIR})
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR})
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR} PARENT_SCOPE)
## Instruction:
Add install target to CMake.
## Code After:
cmake_minimum_required(VERSION 2.8)
project(amqp-cpp)
# ensure c++11 on all compilers
include(set_cxx_norm.cmake)
set_cxx_norm (${CXX_NORM_CXX11})
macro (add_sources)
file (RELATIVE_PATH _relPath "${PROJECT_SOURCE_DIR}" "${CMAKE_CURRENT_SOURCE_DIR}")
foreach (_src ${ARGN})
if (_relPath)
list (APPEND SRCS "${_relPath}/${_src}")
else()
list (APPEND SRCS "${_src}")
endif()
endforeach()
if (_relPath)
# propagate SRCS to parent directory
set (SRCS ${SRCS} PARENT_SCOPE)
endif()
endmacro()
add_subdirectory(src)
add_subdirectory(include)
add_library(amqp-cpp STATIC ${SRCS})
target_include_directories(amqp-cpp SYSTEM PUBLIC ${PROJECT_SOURCE_DIR})
install(TARGETS amqp-cpp
ARCHIVE DESTINATION lib
)
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR})
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR} PARENT_SCOPE)
| cmake_minimum_required(VERSION 2.8)
project(amqp-cpp)
# ensure c++11 on all compilers
include(set_cxx_norm.cmake)
set_cxx_norm (${CXX_NORM_CXX11})
macro (add_sources)
file (RELATIVE_PATH _relPath "${PROJECT_SOURCE_DIR}" "${CMAKE_CURRENT_SOURCE_DIR}")
foreach (_src ${ARGN})
if (_relPath)
list (APPEND SRCS "${_relPath}/${_src}")
else()
list (APPEND SRCS "${_src}")
endif()
endforeach()
if (_relPath)
# propagate SRCS to parent directory
set (SRCS ${SRCS} PARENT_SCOPE)
endif()
endmacro()
add_subdirectory(src)
add_subdirectory(include)
add_library(amqp-cpp STATIC ${SRCS})
target_include_directories(amqp-cpp SYSTEM PUBLIC ${PROJECT_SOURCE_DIR})
+ install(TARGETS amqp-cpp
+ ARCHIVE DESTINATION lib
+ )
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR})
set(AMQP-CPP_INCLUDE_PATH ${CMAKE_CURRENT_SOURCE_DIR} PARENT_SCOPE) | 3 | 0.096774 | 3 | 0 |
5aa8bcf6400824b4856eafb178cee723f0927c27 | utils/Cargo.toml | utils/Cargo.toml | [package]
name = "asn1-parse-utils"
version = "0.1.0"
authors = ["Josh <keeperofdakeys@gmail.comm>"]
[[bin]]
path = "src/asn1-spec-dump.rs"
name = "dump"
[dependencies]
asn1-parse = { path = "../../asn1-parse" }
nom = "^1.2"
argparse = "0.2"
| [package]
name = "asn1-parse-utils"
version = "0.1.0"
authors = ["Josh <keeperofdakeys@gmail.comm>"]
[[bin]]
path = "src/asn1-spec-dump.rs"
name = "dump"
[dependencies]
asn1-parse = { path = "../" }
# nom = "^1.2"
nom = { path = "../../nom" }
argparse = "0.2"
| Use local nom until eol patch is landed | Use local nom until eol patch is landed
| TOML | apache-2.0 | keeperofdakeys/asn1-parse,keeperofdakeys/asn1-parse | toml | ## Code Before:
[package]
name = "asn1-parse-utils"
version = "0.1.0"
authors = ["Josh <keeperofdakeys@gmail.comm>"]
[[bin]]
path = "src/asn1-spec-dump.rs"
name = "dump"
[dependencies]
asn1-parse = { path = "../../asn1-parse" }
nom = "^1.2"
argparse = "0.2"
## Instruction:
Use local nom until eol patch is landed
## Code After:
[package]
name = "asn1-parse-utils"
version = "0.1.0"
authors = ["Josh <keeperofdakeys@gmail.comm>"]
[[bin]]
path = "src/asn1-spec-dump.rs"
name = "dump"
[dependencies]
asn1-parse = { path = "../" }
# nom = "^1.2"
nom = { path = "../../nom" }
argparse = "0.2"
| [package]
name = "asn1-parse-utils"
version = "0.1.0"
authors = ["Josh <keeperofdakeys@gmail.comm>"]
[[bin]]
path = "src/asn1-spec-dump.rs"
name = "dump"
[dependencies]
- asn1-parse = { path = "../../asn1-parse" }
? -------------
+ asn1-parse = { path = "../" }
- nom = "^1.2"
+ # nom = "^1.2"
? ++
+ nom = { path = "../../nom" }
argparse = "0.2" | 5 | 0.384615 | 3 | 2 |
6c2ef049a9ef00b4c753f614830d2d51be87dcf3 | tools/renew-certificates.sh | tools/renew-certificates.sh |
moziot_dir="/home/pi/mozilla-iot"
certbot renew \
--config-dir "$moziot_dir/etc" \
--logs-dir "$moziot_dir/var/log" \
--work-dir "$moziot_dir/var/lib" \
--deploy-hook "$moziot_dir/gateway/tools/deploy-certificates.sh"
|
moziot_dir="/home/pi/mozilla-iot"
certbot renew \
--config-dir "$moziot_dir/etc" \
--logs-dir "$moziot_dir/var/log" \
--work-dir "$moziot_dir/var/lib" \
--deploy-hook "$moziot_dir/gateway/tools/deploy-certificates.sh"
chown -R pi:pi "$moziot_dir/etc" "$moziot_dir/var"
| Fix permissions after renewing certs. | Fix permissions after renewing certs.
| Shell | mpl-2.0 | mozilla-iot/gateway,moziot/gateway,moziot/gateway,mozilla-iot/gateway,mozilla-iot/gateway,mozilla-iot/gateway,moziot/gateway,mozilla-iot/gateway | shell | ## Code Before:
moziot_dir="/home/pi/mozilla-iot"
certbot renew \
--config-dir "$moziot_dir/etc" \
--logs-dir "$moziot_dir/var/log" \
--work-dir "$moziot_dir/var/lib" \
--deploy-hook "$moziot_dir/gateway/tools/deploy-certificates.sh"
## Instruction:
Fix permissions after renewing certs.
## Code After:
moziot_dir="/home/pi/mozilla-iot"
certbot renew \
--config-dir "$moziot_dir/etc" \
--logs-dir "$moziot_dir/var/log" \
--work-dir "$moziot_dir/var/lib" \
--deploy-hook "$moziot_dir/gateway/tools/deploy-certificates.sh"
chown -R pi:pi "$moziot_dir/etc" "$moziot_dir/var"
|
moziot_dir="/home/pi/mozilla-iot"
certbot renew \
--config-dir "$moziot_dir/etc" \
--logs-dir "$moziot_dir/var/log" \
--work-dir "$moziot_dir/var/lib" \
--deploy-hook "$moziot_dir/gateway/tools/deploy-certificates.sh"
+
+ chown -R pi:pi "$moziot_dir/etc" "$moziot_dir/var" | 2 | 0.25 | 2 | 0 |
254d2831fcab758f55302a01032fe73c3fe49e10 | www/modules/core/components/session.service.js | www/modules/core/components/session.service.js | (function() {
'use strict';
angular.module('Core')
.service('sessionService', sessionService);
sessionService.$inject = ['commonService'];
function sessionService(commonService) {
var service = this;
service.isUserLoggedIn = isUserLoggedIn;
/* ======================================== Var ==================================================== */
service.userData = {
};
/* ======================================== Services =============================================== */
/* ======================================== Public Methods ========================================= */
function isUserLoggedIn() {
// Check if user is logged in
}
/* ======================================== Private Methods ======================================== */
function init() {
}
init();
}
})();
| (function() {
'use strict';
angular.module('Core')
.service('sessionService', sessionService);
sessionService.$inject = [];
function sessionService() {
var service = this;
service.isUserLoggedIn = isUserLoggedIn;
/* ======================================== Var ==================================================== */
service.userData = {
};
/* ======================================== Services =============================================== */
/* ======================================== Public Methods ========================================= */
function isUserLoggedIn() {
// Check if user is logged in
}
/* ======================================== Private Methods ======================================== */
function init() {
}
init();
}
})();
| Remove commonSvc dependency to prevent circular dependency | Remove commonSvc dependency to prevent circular dependency
| JavaScript | mit | tlkiong/cxa_test,tlkiong/cxa_test | javascript | ## Code Before:
(function() {
'use strict';
angular.module('Core')
.service('sessionService', sessionService);
sessionService.$inject = ['commonService'];
function sessionService(commonService) {
var service = this;
service.isUserLoggedIn = isUserLoggedIn;
/* ======================================== Var ==================================================== */
service.userData = {
};
/* ======================================== Services =============================================== */
/* ======================================== Public Methods ========================================= */
function isUserLoggedIn() {
// Check if user is logged in
}
/* ======================================== Private Methods ======================================== */
function init() {
}
init();
}
})();
## Instruction:
Remove commonSvc dependency to prevent circular dependency
## Code After:
(function() {
'use strict';
angular.module('Core')
.service('sessionService', sessionService);
sessionService.$inject = [];
function sessionService() {
var service = this;
service.isUserLoggedIn = isUserLoggedIn;
/* ======================================== Var ==================================================== */
service.userData = {
};
/* ======================================== Services =============================================== */
/* ======================================== Public Methods ========================================= */
function isUserLoggedIn() {
// Check if user is logged in
}
/* ======================================== Private Methods ======================================== */
function init() {
}
init();
}
})();
| (function() {
'use strict';
angular.module('Core')
.service('sessionService', sessionService);
- sessionService.$inject = ['commonService'];
? ---------------
+ sessionService.$inject = [];
- function sessionService(commonService) {
? -------------
+ function sessionService() {
var service = this;
service.isUserLoggedIn = isUserLoggedIn;
/* ======================================== Var ==================================================== */
service.userData = {
};
/* ======================================== Services =============================================== */
/* ======================================== Public Methods ========================================= */
function isUserLoggedIn() {
// Check if user is logged in
}
/* ======================================== Private Methods ======================================== */
function init() {
}
init();
}
})(); | 4 | 0.117647 | 2 | 2 |
74ff9b6416a1225f2051ff7dde7f6e5b16b5f847 | app/js/services.js | app/js/services.js | 'use strict';
/* Services */
// Demonstrate how to register services
// In this case it is a simple value service.
angular.module('myApp.services', []).
value('version', '0.1')
.factory("moviesService", function($http){
var _movies = [];
var _getMovies = function(){
$http.get("js/data/movies.json")
.then(function(results){
//Success
angular.copy(results.data, _movies); //this is the preferred; instead of $scope.movies = result.data
}, function(results){
//Error
})
}
var _addNewMovie = function(movie){
_movies.splice(0, 0, movie);
}
var _removeMovie = function(idx){
var person_to_delete = _movies[idx.id];
_movies.splice(idx.id, 1);
/*
angular.forEach(_movies, function(value, key){
if(value.id == movie.id){
_movies.splice(idx, 1);
}
});
*/
}
return{
movies: _movies,
getMovies: _getMovies,
addNewMovie: _addNewMovie,
removeMovie:_removeMovie
};
});
| 'use strict';
/* Services */
// Demonstrate how to register services
// In this case it is a simple value service.
angular.module('myApp.services', []).
value('version', '0.1')
.factory("moviesService", function($http){
var _movies = [];
var _getMovies = function(){
$http.get("js/data/movies.json")
.then(function(results){
//Success
angular.copy(results.data, _movies); //this is the preferred; instead of $scope.movies = result.data
}, function(results){
//Error
})
}
var _addNewMovie = function(movie){
_movies.splice(0, 0, movie);
}
var _removeMovie = function(idx){
var person_to_delete = _movies[idx.id];
_movies.splice(idx.id, 1);
/*
angular.forEach(_movies, function(value, key){
if(value.id == movie.id){
_movies.splice(idx, 1);
}
});
*/
}
return{
movies: _movies, // revealing module pattern applied here
getMovies: _getMovies,
addNewMovie: _addNewMovie,
removeMovie:_removeMovie
};
});
| Comment for revealing module pattern | Comment for revealing module pattern
| JavaScript | mit | compufreakjosh/add-delete-routing,compufreakjosh/add-delete-routing,compufreakjosh/add-delete-routing,compufreakjosh/add-delete-routing | javascript | ## Code Before:
'use strict';
/* Services */
// Demonstrate how to register services
// In this case it is a simple value service.
angular.module('myApp.services', []).
value('version', '0.1')
.factory("moviesService", function($http){
var _movies = [];
var _getMovies = function(){
$http.get("js/data/movies.json")
.then(function(results){
//Success
angular.copy(results.data, _movies); //this is the preferred; instead of $scope.movies = result.data
}, function(results){
//Error
})
}
var _addNewMovie = function(movie){
_movies.splice(0, 0, movie);
}
var _removeMovie = function(idx){
var person_to_delete = _movies[idx.id];
_movies.splice(idx.id, 1);
/*
angular.forEach(_movies, function(value, key){
if(value.id == movie.id){
_movies.splice(idx, 1);
}
});
*/
}
return{
movies: _movies,
getMovies: _getMovies,
addNewMovie: _addNewMovie,
removeMovie:_removeMovie
};
});
## Instruction:
Comment for revealing module pattern
## Code After:
'use strict';
/* Services */
// Demonstrate how to register services
// In this case it is a simple value service.
angular.module('myApp.services', []).
value('version', '0.1')
.factory("moviesService", function($http){
var _movies = [];
var _getMovies = function(){
$http.get("js/data/movies.json")
.then(function(results){
//Success
angular.copy(results.data, _movies); //this is the preferred; instead of $scope.movies = result.data
}, function(results){
//Error
})
}
var _addNewMovie = function(movie){
_movies.splice(0, 0, movie);
}
var _removeMovie = function(idx){
var person_to_delete = _movies[idx.id];
_movies.splice(idx.id, 1);
/*
angular.forEach(_movies, function(value, key){
if(value.id == movie.id){
_movies.splice(idx, 1);
}
});
*/
}
return{
movies: _movies, // revealing module pattern applied here
getMovies: _getMovies,
addNewMovie: _addNewMovie,
removeMovie:_removeMovie
};
});
| 'use strict';
/* Services */
// Demonstrate how to register services
// In this case it is a simple value service.
angular.module('myApp.services', []).
value('version', '0.1')
.factory("moviesService", function($http){
var _movies = [];
var _getMovies = function(){
$http.get("js/data/movies.json")
.then(function(results){
//Success
angular.copy(results.data, _movies); //this is the preferred; instead of $scope.movies = result.data
}, function(results){
//Error
})
}
var _addNewMovie = function(movie){
_movies.splice(0, 0, movie);
}
var _removeMovie = function(idx){
var person_to_delete = _movies[idx.id];
_movies.splice(idx.id, 1);
/*
angular.forEach(_movies, function(value, key){
if(value.id == movie.id){
_movies.splice(idx, 1);
}
});
*/
}
return{
- movies: _movies,
+ movies: _movies, // revealing module pattern applied here
getMovies: _getMovies,
addNewMovie: _addNewMovie,
removeMovie:_removeMovie
};
});
| 2 | 0.044444 | 1 | 1 |
806885d20b798a4120e301790efc5fb9077a853e | lib/osc-ruby/em_server.rb | lib/osc-ruby/em_server.rb | require 'eventmachine'
module OSC
Channel = EM::Channel.new
class Connection < EventMachine::Connection
def receive_data(data)
ip_info = get_peername[2,6].unpack("nC4")
Channel << OSC::OSCPacket.messages_from_network(data, ip_info)
end
end
class EMServer
def initialize(port = 3333)
@port = port
setup_dispatcher
@tuples = []
end
def run
EM.error_handler{ |e|
puts "Error raised in EMServer: #{e.message}"
puts e.backtrace
}
EM.run do
EM::open_datagram_socket "0.0.0.0", @port, Connection
end
end
def add_method(address_pattern, &proc)
matcher = AddressPattern.new(address_pattern)
@tuples << [matcher, proc]
end
private
def setup_dispatcher
Channel.subscribe do |messages|
messages.each do |message|
diff = (message.time || 0) - Time.now.to_ntp
if diff <= 0
sendmesg(message)
else
EM.defer do
sleep(diff)
sendmesg(message)
end
end
end
end
end
def sendmesg(mesg)
@tuples.each do |matcher, obj|
if matcher.match?(mesg.address)
obj.call(mesg)
end
end
end
end
end
| require 'eventmachine'
module OSC
Channel = EM::Channel.new
class Connection < EventMachine::Connection
def receive_data(data)
ip_info = get_peername[2,6].unpack("nC4")
Channel << OSC::OSCPacket.messages_from_network(data, ip_info)
end
end
class EMServer
def initialize(port = 3333)
@port = port
setup_dispatcher
@tuples = []
end
def run
EM.error_handler{ |e|
Thread.main.raise e
}
EM.run do
EM::open_datagram_socket "0.0.0.0", @port, Connection
end
end
def add_method(address_pattern, &proc)
matcher = AddressPattern.new(address_pattern)
@tuples << [matcher, proc]
end
private
def setup_dispatcher
Channel.subscribe do |messages|
messages.each do |message|
diff = (message.time || 0) - Time.now.to_ntp
if diff <= 0
sendmesg(message)
else
EM.defer do
sleep(diff)
sendmesg(message)
end
end
end
end
end
def sendmesg(mesg)
@tuples.each do |matcher, obj|
if matcher.match?(mesg.address)
obj.call(mesg)
end
end
end
end
end
| Use actual exceptions for handler errors | Use actual exceptions for handler errors
| Ruby | mit | aberant/osc-ruby,aberant/osc-ruby | ruby | ## Code Before:
require 'eventmachine'
module OSC
Channel = EM::Channel.new
class Connection < EventMachine::Connection
def receive_data(data)
ip_info = get_peername[2,6].unpack("nC4")
Channel << OSC::OSCPacket.messages_from_network(data, ip_info)
end
end
class EMServer
def initialize(port = 3333)
@port = port
setup_dispatcher
@tuples = []
end
def run
EM.error_handler{ |e|
puts "Error raised in EMServer: #{e.message}"
puts e.backtrace
}
EM.run do
EM::open_datagram_socket "0.0.0.0", @port, Connection
end
end
def add_method(address_pattern, &proc)
matcher = AddressPattern.new(address_pattern)
@tuples << [matcher, proc]
end
private
def setup_dispatcher
Channel.subscribe do |messages|
messages.each do |message|
diff = (message.time || 0) - Time.now.to_ntp
if diff <= 0
sendmesg(message)
else
EM.defer do
sleep(diff)
sendmesg(message)
end
end
end
end
end
def sendmesg(mesg)
@tuples.each do |matcher, obj|
if matcher.match?(mesg.address)
obj.call(mesg)
end
end
end
end
end
## Instruction:
Use actual exceptions for handler errors
## Code After:
require 'eventmachine'
module OSC
Channel = EM::Channel.new
class Connection < EventMachine::Connection
def receive_data(data)
ip_info = get_peername[2,6].unpack("nC4")
Channel << OSC::OSCPacket.messages_from_network(data, ip_info)
end
end
class EMServer
def initialize(port = 3333)
@port = port
setup_dispatcher
@tuples = []
end
def run
EM.error_handler{ |e|
Thread.main.raise e
}
EM.run do
EM::open_datagram_socket "0.0.0.0", @port, Connection
end
end
def add_method(address_pattern, &proc)
matcher = AddressPattern.new(address_pattern)
@tuples << [matcher, proc]
end
private
def setup_dispatcher
Channel.subscribe do |messages|
messages.each do |message|
diff = (message.time || 0) - Time.now.to_ntp
if diff <= 0
sendmesg(message)
else
EM.defer do
sleep(diff)
sendmesg(message)
end
end
end
end
end
def sendmesg(mesg)
@tuples.each do |matcher, obj|
if matcher.match?(mesg.address)
obj.call(mesg)
end
end
end
end
end
| require 'eventmachine'
module OSC
Channel = EM::Channel.new
class Connection < EventMachine::Connection
def receive_data(data)
ip_info = get_peername[2,6].unpack("nC4")
Channel << OSC::OSCPacket.messages_from_network(data, ip_info)
end
end
class EMServer
def initialize(port = 3333)
@port = port
setup_dispatcher
@tuples = []
end
def run
EM.error_handler{ |e|
+ Thread.main.raise e
- puts "Error raised in EMServer: #{e.message}"
- puts e.backtrace
}
EM.run do
EM::open_datagram_socket "0.0.0.0", @port, Connection
end
end
def add_method(address_pattern, &proc)
matcher = AddressPattern.new(address_pattern)
@tuples << [matcher, proc]
end
private
def setup_dispatcher
Channel.subscribe do |messages|
messages.each do |message|
diff = (message.time || 0) - Time.now.to_ntp
if diff <= 0
sendmesg(message)
else
EM.defer do
sleep(diff)
sendmesg(message)
end
end
end
end
end
def sendmesg(mesg)
@tuples.each do |matcher, obj|
if matcher.match?(mesg.address)
obj.call(mesg)
end
end
end
end
end | 3 | 0.046875 | 1 | 2 |
051992e0dd7bc5b9a4f8d84eb0b7dcca31dc0bfb | project.clj | project.clj | (defproject com.taoensso/faraday "0.5.2"
:description "Clojure DynamoDB client"
:url "https://github.com/ptaoussanis/faraday"
:license {:name "Eclipse Public License"
:url "http://www.eclipse.org/legal/epl-v10.html"}
:dependencies [[org.clojure/clojure "1.5.1"]
[org.clojure/tools.macro "0.1.1"]
[com.amazonaws/aws-java-sdk "1.4.4.1"]
[com.taoensso/nippy "1.2.1"]
[com.taoensso/timbre "2.0.1"]]
:profiles {:1.5 {:dependencies [[org.clojure/clojure "1.5.1"]]}
:dev {:dependencies []}
:test {:dependencies []}
:bench {:dependencies []}}
:aliases {"test-all" ["with-profile" "test,1.5" "test"]}
:plugins [[codox "0.6.4"]]
:min-lein-version "2.0.0"
:warn-on-reflection true)
| (defproject com.taoensso/faraday "0.5.2"
:description "Clojure DynamoDB client"
:url "https://github.com/ptaoussanis/faraday"
:license {:name "Eclipse Public License"
:url "http://www.eclipse.org/legal/epl-v10.html"}
:dependencies [[org.clojure/clojure "1.5.1"]
[org.clojure/tools.macro "0.1.1"]
[com.amazonaws/aws-java-sdk "1.4.4.1"]
[expectations "1.4.43"]
[com.taoensso/nippy "1.2.1"]
[com.taoensso/timbre "2.0.1"]]
:profiles {:1.5 {:dependencies [[org.clojure/clojure "1.5.1"]]}
:dev {:dependencies []}
:test {:dependencies []}
:bench {:dependencies []}}
:aliases {"test-all" ["with-profile" "test,1.5" "test"]}
:plugins [[lein-expectations "0.0.7"]
[lein-autoexpect "0.2.5"]
[codox "0.6.4"]]
:min-lein-version "2.0.0"
:warn-on-reflection true)
| Add dependency (Expectations 1.4.43 & plugins) | Add dependency (Expectations 1.4.43 & plugins)
| Clojure | epl-1.0 | jeffh/faraday,ptaoussanis/faraday,langford/faraday,marcuswr/faraday-rotary | clojure | ## Code Before:
(defproject com.taoensso/faraday "0.5.2"
:description "Clojure DynamoDB client"
:url "https://github.com/ptaoussanis/faraday"
:license {:name "Eclipse Public License"
:url "http://www.eclipse.org/legal/epl-v10.html"}
:dependencies [[org.clojure/clojure "1.5.1"]
[org.clojure/tools.macro "0.1.1"]
[com.amazonaws/aws-java-sdk "1.4.4.1"]
[com.taoensso/nippy "1.2.1"]
[com.taoensso/timbre "2.0.1"]]
:profiles {:1.5 {:dependencies [[org.clojure/clojure "1.5.1"]]}
:dev {:dependencies []}
:test {:dependencies []}
:bench {:dependencies []}}
:aliases {"test-all" ["with-profile" "test,1.5" "test"]}
:plugins [[codox "0.6.4"]]
:min-lein-version "2.0.0"
:warn-on-reflection true)
## Instruction:
Add dependency (Expectations 1.4.43 & plugins)
## Code After:
(defproject com.taoensso/faraday "0.5.2"
:description "Clojure DynamoDB client"
:url "https://github.com/ptaoussanis/faraday"
:license {:name "Eclipse Public License"
:url "http://www.eclipse.org/legal/epl-v10.html"}
:dependencies [[org.clojure/clojure "1.5.1"]
[org.clojure/tools.macro "0.1.1"]
[com.amazonaws/aws-java-sdk "1.4.4.1"]
[expectations "1.4.43"]
[com.taoensso/nippy "1.2.1"]
[com.taoensso/timbre "2.0.1"]]
:profiles {:1.5 {:dependencies [[org.clojure/clojure "1.5.1"]]}
:dev {:dependencies []}
:test {:dependencies []}
:bench {:dependencies []}}
:aliases {"test-all" ["with-profile" "test,1.5" "test"]}
:plugins [[lein-expectations "0.0.7"]
[lein-autoexpect "0.2.5"]
[codox "0.6.4"]]
:min-lein-version "2.0.0"
:warn-on-reflection true)
| (defproject com.taoensso/faraday "0.5.2"
:description "Clojure DynamoDB client"
:url "https://github.com/ptaoussanis/faraday"
:license {:name "Eclipse Public License"
:url "http://www.eclipse.org/legal/epl-v10.html"}
:dependencies [[org.clojure/clojure "1.5.1"]
[org.clojure/tools.macro "0.1.1"]
[com.amazonaws/aws-java-sdk "1.4.4.1"]
+ [expectations "1.4.43"]
[com.taoensso/nippy "1.2.1"]
[com.taoensso/timbre "2.0.1"]]
:profiles {:1.5 {:dependencies [[org.clojure/clojure "1.5.1"]]}
:dev {:dependencies []}
:test {:dependencies []}
:bench {:dependencies []}}
:aliases {"test-all" ["with-profile" "test,1.5" "test"]}
- :plugins [[codox "0.6.4"]]
+ :plugins [[lein-expectations "0.0.7"]
+ [lein-autoexpect "0.2.5"]
+ [codox "0.6.4"]]
:min-lein-version "2.0.0"
:warn-on-reflection true) | 5 | 0.277778 | 4 | 1 |
2fa22fb28e11abbbfbe5dc4a43ce94b287f71e43 | project/plugins.sbt | project/plugins.sbt | resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "0.4.2")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.10.0")
addSbtPlugin("com.openstudy" % "sbt-resource-management" % "0.4.1-SNAPSHOT")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")
| resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "0.4.2")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.10.0")
addSbtPlugin("com.openstudy" % "sbt-resource-management" % "0.4.2")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")
| Use release 0.4.2 of sbt-resource-management. | Use release 0.4.2 of sbt-resource-management.
| Scala | apache-2.0 | farmdawgnation/anchortab,farmdawgnation/anchortab,farmdawgnation/anchortab,farmdawgnation/anchortab | scala | ## Code Before:
resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "0.4.2")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.10.0")
addSbtPlugin("com.openstudy" % "sbt-resource-management" % "0.4.1-SNAPSHOT")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")
## Instruction:
Use release 0.4.2 of sbt-resource-management.
## Code After:
resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "0.4.2")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.10.0")
addSbtPlugin("com.openstudy" % "sbt-resource-management" % "0.4.2")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")
| resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "0.4.2")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.10.0")
- addSbtPlugin("com.openstudy" % "sbt-resource-management" % "0.4.1-SNAPSHOT")
? ^^^^^^^^^^
+ addSbtPlugin("com.openstudy" % "sbt-resource-management" % "0.4.2")
? ^
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4") | 2 | 0.222222 | 1 | 1 |
248f8e37846e06a53305785ec09de42213fb0876 | st2tests/st2tests/fixtures/generic/actionchains/chain_with_output.json | st2tests/st2tests/fixtures/generic/actionchains/chain_with_output.json | {
"vars": {
"strtype": "{{system.a}}",
"inttype": 1
},
"chain": [
{
"name": "c1",
"ref": "wolfpack.a2",
"params":
{
"inttype": "{{inttype}}",
"strtype": "{{strtype}}",
"booltype": true
},
"output":
{
"o1": "{{c1.foo.bar}}"
}
}
],
"default": "c1"
}
| {
"vars": {
"strtype": "{{system.a}}",
"inttype": 1
},
"chain": [
{
"name": "c1",
"ref": "wolfpack.a2",
"params":
{
"inttype": "{{inttype}}",
"strtype": "{{strtype}}",
"booltype": true
},
"output":
{
"o1": "{{c1.foo.bar}}"
},
"on-success": "c2"
},
{
"name": "c2",
"ref": "wolfpack.a2",
"params":
{
"inttype": "{{inttype}}",
"strtype": "{{c1.o1}}",
"booltype": true
}
}
],
"default": "c1"
}
| Update fixture to include output and multiple nodes. | Update fixture to include output and multiple nodes.
| JSON | apache-2.0 | StackStorm/st2,nzlosh/st2,pixelrebel/st2,peak6/st2,Plexxi/st2,StackStorm/st2,lakshmi-kannan/st2,tonybaloney/st2,grengojbo/st2,pinterb/st2,pinterb/st2,emedvedev/st2,dennybaa/st2,armab/st2,alfasin/st2,nzlosh/st2,Itxaka/st2,peak6/st2,Itxaka/st2,punalpatel/st2,grengojbo/st2,emedvedev/st2,armab/st2,armab/st2,lakshmi-kannan/st2,StackStorm/st2,nzlosh/st2,Plexxi/st2,alfasin/st2,Plexxi/st2,grengojbo/st2,punalpatel/st2,dennybaa/st2,tonybaloney/st2,tonybaloney/st2,pinterb/st2,nzlosh/st2,alfasin/st2,jtopjian/st2,pixelrebel/st2,dennybaa/st2,Plexxi/st2,peak6/st2,jtopjian/st2,jtopjian/st2,lakshmi-kannan/st2,emedvedev/st2,StackStorm/st2,punalpatel/st2,pixelrebel/st2,Itxaka/st2 | json | ## Code Before:
{
"vars": {
"strtype": "{{system.a}}",
"inttype": 1
},
"chain": [
{
"name": "c1",
"ref": "wolfpack.a2",
"params":
{
"inttype": "{{inttype}}",
"strtype": "{{strtype}}",
"booltype": true
},
"output":
{
"o1": "{{c1.foo.bar}}"
}
}
],
"default": "c1"
}
## Instruction:
Update fixture to include output and multiple nodes.
## Code After:
{
"vars": {
"strtype": "{{system.a}}",
"inttype": 1
},
"chain": [
{
"name": "c1",
"ref": "wolfpack.a2",
"params":
{
"inttype": "{{inttype}}",
"strtype": "{{strtype}}",
"booltype": true
},
"output":
{
"o1": "{{c1.foo.bar}}"
},
"on-success": "c2"
},
{
"name": "c2",
"ref": "wolfpack.a2",
"params":
{
"inttype": "{{inttype}}",
"strtype": "{{c1.o1}}",
"booltype": true
}
}
],
"default": "c1"
}
| {
"vars": {
"strtype": "{{system.a}}",
"inttype": 1
},
"chain": [
{
"name": "c1",
"ref": "wolfpack.a2",
"params":
{
"inttype": "{{inttype}}",
"strtype": "{{strtype}}",
"booltype": true
},
"output":
{
"o1": "{{c1.foo.bar}}"
+ },
+ "on-success": "c2"
+ },
+ {
+ "name": "c2",
+ "ref": "wolfpack.a2",
+ "params":
+ {
+ "inttype": "{{inttype}}",
+ "strtype": "{{c1.o1}}",
+ "booltype": true
}
}
],
"default": "c1"
} | 11 | 0.478261 | 11 | 0 |
4ab058dbc4cd8e38355012d19b00f956df8d85da | README.md | README.md |
A [less] mixin for [flexbox].
## Available Mixins
The mixin names and their values correspond to the official draft.
* `display`
* `justify-content`
* `align-items`
* `align-self`
* `align-content`
* `order`
* `flex`
* `flex-grow`
* `flex-shrink`
* `flex-basis`
* `flex-direction`
* `flex-wrap`
* `flex-grow`
## Browser Compatibility
## Sources
* [Official W3C Editor's Draft]
* [Centering Elements with Flexbox] from Smashing Magazine
[less]: http://lesscss.org
[flexbox]: http://dev.w3.org/csswg/css-flexbox
[Official W3C Editor's Draft]: http://dev.w3.org/csswg/css-flexbox
[Centering Elements with Flexbox]: http://coding.smashingmagazine.com/2013/05/22/centering-elements-with-flexbox/
|
A [less] mixin for [flexbox].
## Available Mixins
The mixin names and their values correspond to the official draft.
* `display`
* `justify-content`
* `align-items`
* `align-self`
* `align-content`
* `order`
* `flex`
* `flex-grow`
* `flex-shrink`
* `flex-basis`
* `flex-direction`
* `flex-wrap`
* `flex-grow`
## Browser Compatibility
In Firefox < 22 you need to add the following snippet to every element that
you set `display: flex`.
```css
/*
* Legacy Firefox implementation treats all flex containers
* as inline-block elements.
*/
@-moz-document url-prefix() {
#selector {
width: 100%;
-moz-box-sizing: border-box;
}
}
```
## Sources
* [Official W3C Editor's Draft]
* [Centering Elements with Flexbox] from Smashing Magazine
[less]: http://lesscss.org
[flexbox]: http://dev.w3.org/csswg/css-flexbox
[Official W3C Editor's Draft]: http://dev.w3.org/csswg/css-flexbox
[Centering Elements with Flexbox]: http://coding.smashingmagazine.com/2013/05/22/centering-elements-with-flexbox/
| Add notice about shitty firefox | Add notice about shitty firefox
| Markdown | mit | codio/Flex.less,jianliaoim/Flex.less | markdown | ## Code Before:
A [less] mixin for [flexbox].
## Available Mixins
The mixin names and their values correspond to the official draft.
* `display`
* `justify-content`
* `align-items`
* `align-self`
* `align-content`
* `order`
* `flex`
* `flex-grow`
* `flex-shrink`
* `flex-basis`
* `flex-direction`
* `flex-wrap`
* `flex-grow`
## Browser Compatibility
## Sources
* [Official W3C Editor's Draft]
* [Centering Elements with Flexbox] from Smashing Magazine
[less]: http://lesscss.org
[flexbox]: http://dev.w3.org/csswg/css-flexbox
[Official W3C Editor's Draft]: http://dev.w3.org/csswg/css-flexbox
[Centering Elements with Flexbox]: http://coding.smashingmagazine.com/2013/05/22/centering-elements-with-flexbox/
## Instruction:
Add notice about shitty firefox
## Code After:
A [less] mixin for [flexbox].
## Available Mixins
The mixin names and their values correspond to the official draft.
* `display`
* `justify-content`
* `align-items`
* `align-self`
* `align-content`
* `order`
* `flex`
* `flex-grow`
* `flex-shrink`
* `flex-basis`
* `flex-direction`
* `flex-wrap`
* `flex-grow`
## Browser Compatibility
In Firefox < 22 you need to add the following snippet to every element that
you set `display: flex`.
```css
/*
* Legacy Firefox implementation treats all flex containers
* as inline-block elements.
*/
@-moz-document url-prefix() {
#selector {
width: 100%;
-moz-box-sizing: border-box;
}
}
```
## Sources
* [Official W3C Editor's Draft]
* [Centering Elements with Flexbox] from Smashing Magazine
[less]: http://lesscss.org
[flexbox]: http://dev.w3.org/csswg/css-flexbox
[Official W3C Editor's Draft]: http://dev.w3.org/csswg/css-flexbox
[Centering Elements with Flexbox]: http://coding.smashingmagazine.com/2013/05/22/centering-elements-with-flexbox/
|
A [less] mixin for [flexbox].
## Available Mixins
The mixin names and their values correspond to the official draft.
* `display`
* `justify-content`
* `align-items`
* `align-self`
* `align-content`
* `order`
* `flex`
* `flex-grow`
* `flex-shrink`
* `flex-basis`
* `flex-direction`
* `flex-wrap`
* `flex-grow`
## Browser Compatibility
+ In Firefox < 22 you need to add the following snippet to every element that
+ you set `display: flex`.
+
+ ```css
+ /*
+ * Legacy Firefox implementation treats all flex containers
+ * as inline-block elements.
+ */
+ @-moz-document url-prefix() {
+ #selector {
+ width: 100%;
+ -moz-box-sizing: border-box;
+ }
+ }
+ ```
## Sources
* [Official W3C Editor's Draft]
* [Centering Elements with Flexbox] from Smashing Magazine
[less]: http://lesscss.org
[flexbox]: http://dev.w3.org/csswg/css-flexbox
[Official W3C Editor's Draft]: http://dev.w3.org/csswg/css-flexbox
[Centering Elements with Flexbox]: http://coding.smashingmagazine.com/2013/05/22/centering-elements-with-flexbox/
| 15 | 0.384615 | 15 | 0 |
01535f9ad9e25ff8856b01ab7dc14ad69fc45b53 | src/Repositories/SeriesRepository.php | src/Repositories/SeriesRepository.php | <?php namespace Bishopm\Connexion\Repositories;
use Bishopm\Connexion\Repositories\EloquentBaseRepository;
class SeriesRepository extends EloquentBaseRepository
{
public function findwithsermons($id)
{
return $series=$this->model->with('sermons')->where('id',$id)->first();
}
public function allwithsermons()
{
return $this->model->has('sermons')->get();
}
}
| <?php namespace Bishopm\Connexion\Repositories;
use Bishopm\Connexion\Repositories\EloquentBaseRepository;
class SeriesRepository extends EloquentBaseRepository
{
public function findwithsermons($id)
{
return $this->model->with('sermons.comments')->where('id',$id)->first();
}
public function allwithsermons()
{
return $this->model->has('sermons')->get();
}
}
| Include sermon comments for API | Include sermon comments for API
| PHP | mit | bishopm/base,bishopm/base,bishopm/connexion,bishopm/connexion,bishopm/base,bishopm/connexion | php | ## Code Before:
<?php namespace Bishopm\Connexion\Repositories;
use Bishopm\Connexion\Repositories\EloquentBaseRepository;
class SeriesRepository extends EloquentBaseRepository
{
public function findwithsermons($id)
{
return $series=$this->model->with('sermons')->where('id',$id)->first();
}
public function allwithsermons()
{
return $this->model->has('sermons')->get();
}
}
## Instruction:
Include sermon comments for API
## Code After:
<?php namespace Bishopm\Connexion\Repositories;
use Bishopm\Connexion\Repositories\EloquentBaseRepository;
class SeriesRepository extends EloquentBaseRepository
{
public function findwithsermons($id)
{
return $this->model->with('sermons.comments')->where('id',$id)->first();
}
public function allwithsermons()
{
return $this->model->has('sermons')->get();
}
}
| <?php namespace Bishopm\Connexion\Repositories;
use Bishopm\Connexion\Repositories\EloquentBaseRepository;
class SeriesRepository extends EloquentBaseRepository
{
public function findwithsermons($id)
{
- return $series=$this->model->with('sermons')->where('id',$id)->first();
? --------
+ return $this->model->with('sermons.comments')->where('id',$id)->first();
? +++++++++
}
public function allwithsermons()
{
return $this->model->has('sermons')->get();
}
} | 2 | 0.117647 | 1 | 1 |
3ccaf18243232d756ed139d9f84a6b3903af15f7 | exploratory_analysis/author_scan.py | exploratory_analysis/author_scan.py | import os
from utils import Reader
import code
import sys
author_dict = dict()
def extract_authors(tweets):
# code.interact(local=dict(globals(), **locals()))
for t in tweets:
if t.is_post():
actor = t.actor()
create_key(actor['id'])
increment_author(actor, t.is_post())
elif t.is_share():
original_tweet = t.data['object']
actor = original_tweet['actor']
create_key(actor['id'])
increment_author(actor, t.is_post())
else:
print 'Neither post nor share:', t.id()
def increment_author(actor, is_post):
dict_value = author_dict[actor['id']]
dict_value[0] = actor['link']
dict_value[1] = actor['preferredUsername']
dict_value[2] = actor['displayName']
if is_post:
dict_value[3] += 1
else:
dict_value[4] += 1
def create_key(actor_id):
if actor_id not in author_dict.keys():
# link, username, display_name, post, post that gotten shared
default_value = ['', '', '', 0, 0]
author_dict[actor_id] = default_value
def print_all():
for k in author_dict.keys():
value = author_dict[k]
print '"{}","{}","{}","{}",{},{}'.format(k, value[0], value[1], value[2], value[3], value[4])
if __name__ == '__main__':
# coding=utf-8
reload(sys)
sys.setdefaultencoding('utf-8')
working_directory = os.getcwd()
files = Reader.read_directory(working_directory)
for f in files:
extract_authors(Reader.read_file(f))
print_all()
# code.interact(local=dict(globals(), **locals()))
| import os
from utils import Reader
import code
import sys
def extract_authors(tweets):
for t in tweets:
if t.is_post():
actor = t.actor()
print '"{}","{}","{}","{}",{},{}'.format(actor['id'],
actor['link'],
actor['preferredUsername'],
actor['displayName'], 1, 0)
elif t.is_share():
original_tweet = t.data['object']
actor = original_tweet['actor']
print '"{}","{}","{}","{}",{},{}'.format(actor['id'],
actor['link'],
actor['preferredUsername'],
actor['displayName'], 0, 1)
else:
print 'Neither post nor share:', t.id()
if __name__ == '__main__':
# coding=utf-8
reload(sys)
sys.setdefaultencoding('utf-8')
working_directory = os.getcwd()
files = Reader.read_directory(working_directory)
for f in files:
extract_authors(Reader.read_file(f))
# code.interact(local=dict(globals(), **locals()))
| Print everything out in csv and use tableau to do calculation | Print everything out in csv and use tableau to do calculation
| Python | apache-2.0 | chuajiesheng/twitter-sentiment-analysis | python | ## Code Before:
import os
from utils import Reader
import code
import sys
author_dict = dict()
def extract_authors(tweets):
# code.interact(local=dict(globals(), **locals()))
for t in tweets:
if t.is_post():
actor = t.actor()
create_key(actor['id'])
increment_author(actor, t.is_post())
elif t.is_share():
original_tweet = t.data['object']
actor = original_tweet['actor']
create_key(actor['id'])
increment_author(actor, t.is_post())
else:
print 'Neither post nor share:', t.id()
def increment_author(actor, is_post):
dict_value = author_dict[actor['id']]
dict_value[0] = actor['link']
dict_value[1] = actor['preferredUsername']
dict_value[2] = actor['displayName']
if is_post:
dict_value[3] += 1
else:
dict_value[4] += 1
def create_key(actor_id):
if actor_id not in author_dict.keys():
# link, username, display_name, post, post that gotten shared
default_value = ['', '', '', 0, 0]
author_dict[actor_id] = default_value
def print_all():
for k in author_dict.keys():
value = author_dict[k]
print '"{}","{}","{}","{}",{},{}'.format(k, value[0], value[1], value[2], value[3], value[4])
if __name__ == '__main__':
# coding=utf-8
reload(sys)
sys.setdefaultencoding('utf-8')
working_directory = os.getcwd()
files = Reader.read_directory(working_directory)
for f in files:
extract_authors(Reader.read_file(f))
print_all()
# code.interact(local=dict(globals(), **locals()))
## Instruction:
Print everything out in csv and use tableau to do calculation
## Code After:
import os
from utils import Reader
import code
import sys
def extract_authors(tweets):
for t in tweets:
if t.is_post():
actor = t.actor()
print '"{}","{}","{}","{}",{},{}'.format(actor['id'],
actor['link'],
actor['preferredUsername'],
actor['displayName'], 1, 0)
elif t.is_share():
original_tweet = t.data['object']
actor = original_tweet['actor']
print '"{}","{}","{}","{}",{},{}'.format(actor['id'],
actor['link'],
actor['preferredUsername'],
actor['displayName'], 0, 1)
else:
print 'Neither post nor share:', t.id()
if __name__ == '__main__':
# coding=utf-8
reload(sys)
sys.setdefaultencoding('utf-8')
working_directory = os.getcwd()
files = Reader.read_directory(working_directory)
for f in files:
extract_authors(Reader.read_file(f))
# code.interact(local=dict(globals(), **locals()))
| import os
from utils import Reader
import code
import sys
- author_dict = dict()
-
def extract_authors(tweets):
- # code.interact(local=dict(globals(), **locals()))
-
for t in tweets:
if t.is_post():
actor = t.actor()
- create_key(actor['id'])
- increment_author(actor, t.is_post())
+
+ print '"{}","{}","{}","{}",{},{}'.format(actor['id'],
+ actor['link'],
+ actor['preferredUsername'],
+ actor['displayName'], 1, 0)
elif t.is_share():
original_tweet = t.data['object']
actor = original_tweet['actor']
- create_key(actor['id'])
- increment_author(actor, t.is_post())
+
+ print '"{}","{}","{}","{}",{},{}'.format(actor['id'],
+ actor['link'],
+ actor['preferredUsername'],
+ actor['displayName'], 0, 1)
else:
print 'Neither post nor share:', t.id()
-
-
- def increment_author(actor, is_post):
- dict_value = author_dict[actor['id']]
-
- dict_value[0] = actor['link']
- dict_value[1] = actor['preferredUsername']
- dict_value[2] = actor['displayName']
-
- if is_post:
- dict_value[3] += 1
- else:
- dict_value[4] += 1
-
-
- def create_key(actor_id):
- if actor_id not in author_dict.keys():
- # link, username, display_name, post, post that gotten shared
- default_value = ['', '', '', 0, 0]
- author_dict[actor_id] = default_value
-
-
- def print_all():
- for k in author_dict.keys():
- value = author_dict[k]
- print '"{}","{}","{}","{}",{},{}'.format(k, value[0], value[1], value[2], value[3], value[4])
if __name__ == '__main__':
# coding=utf-8
reload(sys)
sys.setdefaultencoding('utf-8')
working_directory = os.getcwd()
files = Reader.read_directory(working_directory)
for f in files:
extract_authors(Reader.read_file(f))
- print_all()
-
# code.interact(local=dict(globals(), **locals())) | 46 | 0.69697 | 10 | 36 |
5fc87a678fc30482b854618f11cba187d0a67e16 | dart_test.yaml | dart_test.yaml | presets:
travis:
# Travis is sloooow.
timeout: 3x
| presets:
travis:
# Travis is sloooow.
timeout: 3x
# Pub has a huge number of small suites, which means test is frequently
# loading new ones. Keeping concurrency low helps limit load timeouts.
concurrency: 4
| Use low concurrency on Travis. | Use low concurrency on Travis.
4 is just a guess here.
| YAML | bsd-3-clause | dart-lang/pub,dart-lang/pub | yaml | ## Code Before:
presets:
travis:
# Travis is sloooow.
timeout: 3x
## Instruction:
Use low concurrency on Travis.
4 is just a guess here.
## Code After:
presets:
travis:
# Travis is sloooow.
timeout: 3x
# Pub has a huge number of small suites, which means test is frequently
# loading new ones. Keeping concurrency low helps limit load timeouts.
concurrency: 4
| presets:
travis:
# Travis is sloooow.
timeout: 3x
+
+ # Pub has a huge number of small suites, which means test is frequently
+ # loading new ones. Keeping concurrency low helps limit load timeouts.
+ concurrency: 4 | 4 | 1 | 4 | 0 |
d079caadc86f718720cf61e20b3b27d66cc00102 | .expeditor/update_version.sh | .expeditor/update_version.sh |
set -evx
VERSION=$(cat VERSION)
sed -i -r "s/^(\s*)VERSION = \".+\"/\1VERSION = \"${VERSION}\"/" lib/chef_fixie/version.rb
# Once Expeditor finishes executing this script, it will commit the changes and push
# the commit as a new tag corresponding to the value in the VERSION file.
|
set -evx
sed -i -r "s/^(\s*)VERSION = \".+\"/\1VERSION = \"$(cat VERSION)\"/" lib/chef_fixie/version.rb
# Once Expeditor finishes executing this script, it will commit the changes and push
# the commit as a new tag corresponding to the value in the VERSION file.
| Revert "Fix version bump for fixie" | Revert "Fix version bump for fixie"
This reverts commit 20485ba57b86457181272fbc86c86101d62abfe2.
| Shell | apache-2.0 | chef/fixie,chef/fixie | shell | ## Code Before:
set -evx
VERSION=$(cat VERSION)
sed -i -r "s/^(\s*)VERSION = \".+\"/\1VERSION = \"${VERSION}\"/" lib/chef_fixie/version.rb
# Once Expeditor finishes executing this script, it will commit the changes and push
# the commit as a new tag corresponding to the value in the VERSION file.
## Instruction:
Revert "Fix version bump for fixie"
This reverts commit 20485ba57b86457181272fbc86c86101d62abfe2.
## Code After:
set -evx
sed -i -r "s/^(\s*)VERSION = \".+\"/\1VERSION = \"$(cat VERSION)\"/" lib/chef_fixie/version.rb
# Once Expeditor finishes executing this script, it will commit the changes and push
# the commit as a new tag corresponding to the value in the VERSION file.
|
set -evx
- VERSION=$(cat VERSION)
+
- sed -i -r "s/^(\s*)VERSION = \".+\"/\1VERSION = \"${VERSION}\"/" lib/chef_fixie/version.rb
? ^ ^
+ sed -i -r "s/^(\s*)VERSION = \".+\"/\1VERSION = \"$(cat VERSION)\"/" lib/chef_fixie/version.rb
? ^^^^^ ^
# Once Expeditor finishes executing this script, it will commit the changes and push
# the commit as a new tag corresponding to the value in the VERSION file. | 4 | 0.571429 | 2 | 2 |
573db628a9882d20339b9d66a583981abc2e3446 | assets/stylesheets/modules/_panels.scss | assets/stylesheets/modules/_panels.scss | // Elements overrides
.form-group {
.panel-border-narrow {
padding-bottom: em(15, 19);
}
}
| // Elements overrides
.panel {
margin-bottom: $gutter-half;
@include media(tablet) {
margin-bottom: $gutter;
}
&:last-child {
margin-bottom: 0;
}
}
.form-group {
.panel-border-narrow {
padding-bottom: em(15, 19);
}
}
| Increase bottom margin of panels | Increase bottom margin of panels
Increase the bottom margin of panels to follow form-group spacing and allow more space between these and other form elements.
| SCSS | mit | UKHomeOffice/passports-frontend-toolkit | scss | ## Code Before:
// Elements overrides
.form-group {
.panel-border-narrow {
padding-bottom: em(15, 19);
}
}
## Instruction:
Increase bottom margin of panels
Increase the bottom margin of panels to follow form-group spacing and allow more space between these and other form elements.
## Code After:
// Elements overrides
.panel {
margin-bottom: $gutter-half;
@include media(tablet) {
margin-bottom: $gutter;
}
&:last-child {
margin-bottom: 0;
}
}
.form-group {
.panel-border-narrow {
padding-bottom: em(15, 19);
}
}
| // Elements overrides
+
+ .panel {
+ margin-bottom: $gutter-half;
+ @include media(tablet) {
+ margin-bottom: $gutter;
+ }
+
+ &:last-child {
+ margin-bottom: 0;
+ }
+ }
+
.form-group {
.panel-border-narrow {
padding-bottom: em(15, 19);
}
} | 12 | 2 | 12 | 0 |
8e87598d8b641647100f4e05508995948f1282f0 | Tests/Integration/BlockRepositoryTest.php | Tests/Integration/BlockRepositoryTest.php | <?php namespace Modules\Block\Tests\Integration;
class BlockRepositoryTest extends BaseBlockTest
{
/** @test */
public function it_creates_blocks()
{
$block = $this->block->create(['name' => 'testBlock', 'en' => ['body' => 'lorem en'], 'fr' => ['body' => 'lorem fr']]);
$blocks = $this->block->all();
$this->assertCount(1, $blocks);
$this->assertEquals('testBlock', $block->name);
$this->assertEquals('lorem en', $block->translate('en')->body);
$this->assertEquals('lorem fr', $block->translate('fr')->body);
}
}
| <?php namespace Modules\Block\Tests\Integration;
use Faker\Factory;
class BlockRepositoryTest extends BaseBlockTest
{
/** @test */
public function it_creates_blocks()
{
$block = $this->block->create(['name' => 'testBlock', 'en' => ['body' => 'lorem en'], 'fr' => ['body' => 'lorem fr']]);
$blocks = $this->block->all();
$this->assertCount(1, $blocks);
$this->assertEquals('testBlock', $block->name);
$this->assertEquals('lorem en', $block->translate('en')->body);
$this->assertEquals('lorem fr', $block->translate('fr')->body);
}
/** @test */
public function it_gets_only_online_blocks()
{
$this->createRandomBlock();
$this->createRandomBlock(true, true);
$this->createRandomBlock(true, true);
$this->createRandomBlock(true, false);
$allBlocks = $this->block->all();
$onlineBlocksFr = $this->block->allOnlineInLang('fr');
$onlineBlocksEn = $this->block->allOnlineInLang('en');
$this->assertCount(4, $allBlocks);
$this->assertCount(2, $onlineBlocksFr);
$this->assertCount(3, $onlineBlocksEn);
}
/**
* Create a block with random properties
* @param bool $statusEn
* @param bool $statusFr
* @return mixed
*/
private function createRandomBlock($statusEn = false, $statusFr = false)
{
$factory = Factory::create();
$data = [
'name' => $factory->word,
'en' => [
'body' => $factory->text,
'online' => $statusEn,
],
'fr' => [
'body' => $factory->text,
'online' => $statusFr,
],
];
return $this->block->create($data);
}
}
| Test for getting all blocks in a given language | Test for getting all blocks in a given language

| PHP | mit | oimken/Block,simonfunk/Block,oimken/Block | php | ## Code Before:
<?php namespace Modules\Block\Tests\Integration;
class BlockRepositoryTest extends BaseBlockTest
{
/** @test */
public function it_creates_blocks()
{
$block = $this->block->create(['name' => 'testBlock', 'en' => ['body' => 'lorem en'], 'fr' => ['body' => 'lorem fr']]);
$blocks = $this->block->all();
$this->assertCount(1, $blocks);
$this->assertEquals('testBlock', $block->name);
$this->assertEquals('lorem en', $block->translate('en')->body);
$this->assertEquals('lorem fr', $block->translate('fr')->body);
}
}
## Instruction:
Test for getting all blocks in a given language
## Code After:
<?php namespace Modules\Block\Tests\Integration;
use Faker\Factory;
class BlockRepositoryTest extends BaseBlockTest
{
/** @test */
public function it_creates_blocks()
{
$block = $this->block->create(['name' => 'testBlock', 'en' => ['body' => 'lorem en'], 'fr' => ['body' => 'lorem fr']]);
$blocks = $this->block->all();
$this->assertCount(1, $blocks);
$this->assertEquals('testBlock', $block->name);
$this->assertEquals('lorem en', $block->translate('en')->body);
$this->assertEquals('lorem fr', $block->translate('fr')->body);
}
/** @test */
public function it_gets_only_online_blocks()
{
$this->createRandomBlock();
$this->createRandomBlock(true, true);
$this->createRandomBlock(true, true);
$this->createRandomBlock(true, false);
$allBlocks = $this->block->all();
$onlineBlocksFr = $this->block->allOnlineInLang('fr');
$onlineBlocksEn = $this->block->allOnlineInLang('en');
$this->assertCount(4, $allBlocks);
$this->assertCount(2, $onlineBlocksFr);
$this->assertCount(3, $onlineBlocksEn);
}
/**
* Create a block with random properties
* @param bool $statusEn
* @param bool $statusFr
* @return mixed
*/
private function createRandomBlock($statusEn = false, $statusFr = false)
{
$factory = Factory::create();
$data = [
'name' => $factory->word,
'en' => [
'body' => $factory->text,
'online' => $statusEn,
],
'fr' => [
'body' => $factory->text,
'online' => $statusFr,
],
];
return $this->block->create($data);
}
}
| <?php namespace Modules\Block\Tests\Integration;
+
+ use Faker\Factory;
class BlockRepositoryTest extends BaseBlockTest
{
/** @test */
public function it_creates_blocks()
{
$block = $this->block->create(['name' => 'testBlock', 'en' => ['body' => 'lorem en'], 'fr' => ['body' => 'lorem fr']]);
$blocks = $this->block->all();
$this->assertCount(1, $blocks);
$this->assertEquals('testBlock', $block->name);
$this->assertEquals('lorem en', $block->translate('en')->body);
$this->assertEquals('lorem fr', $block->translate('fr')->body);
}
+
+ /** @test */
+ public function it_gets_only_online_blocks()
+ {
+ $this->createRandomBlock();
+ $this->createRandomBlock(true, true);
+ $this->createRandomBlock(true, true);
+ $this->createRandomBlock(true, false);
+
+ $allBlocks = $this->block->all();
+ $onlineBlocksFr = $this->block->allOnlineInLang('fr');
+ $onlineBlocksEn = $this->block->allOnlineInLang('en');
+
+ $this->assertCount(4, $allBlocks);
+ $this->assertCount(2, $onlineBlocksFr);
+ $this->assertCount(3, $onlineBlocksEn);
+ }
+
+ /**
+ * Create a block with random properties
+ * @param bool $statusEn
+ * @param bool $statusFr
+ * @return mixed
+ */
+ private function createRandomBlock($statusEn = false, $statusFr = false)
+ {
+ $factory = Factory::create();
+
+ $data = [
+ 'name' => $factory->word,
+ 'en' => [
+ 'body' => $factory->text,
+ 'online' => $statusEn,
+ ],
+ 'fr' => [
+ 'body' => $factory->text,
+ 'online' => $statusFr,
+ ],
+ ];
+
+ return $this->block->create($data);
+ }
} | 44 | 2.75 | 44 | 0 |
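Each record closes with four numeric columns (`diff_length`, `relative_diff_length`, `n_lines_added`, `n_lines_deleted` per the schema). Reading them against the sample rows — `12 | 2 | 12 | 0` for the SCSS record, `44 | 2.75 | 44 | 0` for this PHP test record — they appear to satisfy `diff_length = added + deleted` and `relative_diff_length = diff_length / (line count of the old file)`. That is an inference from the samples, not a documented definition; a small sketch that recomputes them from a difflib-style diff:

```python
def diff_stats(n_old_lines, diff_lines):
    """Recompute a record's trailing numeric columns.

    Column meanings are inferred from the sample rows:
      (diff_length, relative_diff_length, n_lines_added, n_lines_deleted)
    "? " guide lines are annotations and are deliberately not counted.
    """
    added = sum(1 for line in diff_lines if line.startswith("+ "))
    deleted = sum(1 for line in diff_lines if line.startswith("- "))
    diff_length = added + deleted
    return diff_length, diff_length / n_old_lines, added, deleted

# The PHP record above: a 16-line old file and 44 purely added lines
# (the diff list here is a hypothetical stand-in for the real one).
print(diff_stats(16, ["+ x\n"] * 44))  # → (44, 2.75, 44, 0)
```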
2d95a5ba5b20b0b46c587d6ea68d3f4cbc956ba6 | apuniverse/templates/blog/base_blog.html | apuniverse/templates/blog/base_blog.html | {% extends "base.html" %}
{% load taggit_extras %}
{% block title %}Blog{% endblock title %}
{% block extrahead %}
{% endblock extrahead %}
{% block page_title %}
BLOG!
{% endblock page_title %}
{% block aside %}
<div class="links-box">
<h3>Categories</h3>
<nav>
{% get_taglist as all_tags %}
<ul>
{% for tag in all_tags %}
<a href="{% url 'blogtags' tag.slug %} ">
<li>{{ tag }} ({{ tag.num_times }})</li>
</a>
{% endfor %}
</ul>
</nav>
</div>
<div class="links-box">
<h3>Archive</h3>
<ul>
{% for year, count in archive_links %}
<a href="{% url "post_year_archive" year %}">
<li>{{ year }} ({{ count }})</li>
</a>
{% endfor %}
</ul>
</div>
{% endblock aside %}
| {% extends "base.html" %}
{% load taggit_extras %}
{% block title %}Blog{% endblock title %}
{% block extrahead %}
{% endblock extrahead %}
{% block page_title %}
BLOG!
{% endblock page_title %}
{% block aside %}
<div class="links-box">
<h3>Categories</h3>
<nav>
{% get_taglist as all_tags for 'blog'%}
<ul>
{% for tag in all_tags %}
<a href="{% url 'blogtags' tag.slug %} ">
<li>{{ tag }}</li>
</a>
{% endfor %}
</ul>
</nav>
</div>
<div class="links-box">
<h3>Archive</h3>
<ul>
{% for year, count in archive_links %}
<a href="{% url "post_year_archive" year %}">
<li>{{ year }} ({{ count }})</li>
</a>
{% endfor %}
</ul>
</div>
{% endblock aside %}
| Remove tag counts because they're wrong | Remove tag counts because they're wrong
| HTML | mit | dangerdak/apuniverse,dangerdak/apuniverse,dangerdak/apuniverse | html | ## Code Before:
{% extends "base.html" %}
{% load taggit_extras %}
{% block title %}Blog{% endblock title %}
{% block extrahead %}
{% endblock extrahead %}
{% block page_title %}
BLOG!
{% endblock page_title %}
{% block aside %}
<div class="links-box">
<h3>Categories</h3>
<nav>
{% get_taglist as all_tags %}
<ul>
{% for tag in all_tags %}
<a href="{% url 'blogtags' tag.slug %} ">
<li>{{ tag }} ({{ tag.num_times }})</li>
</a>
{% endfor %}
</ul>
</nav>
</div>
<div class="links-box">
<h3>Archive</h3>
<ul>
{% for year, count in archive_links %}
<a href="{% url "post_year_archive" year %}">
<li>{{ year }} ({{ count }})</li>
</a>
{% endfor %}
</ul>
</div>
{% endblock aside %}
## Instruction:
Remove tag counts because they're wrong
## Code After:
{% extends "base.html" %}
{% load taggit_extras %}
{% block title %}Blog{% endblock title %}
{% block extrahead %}
{% endblock extrahead %}
{% block page_title %}
BLOG!
{% endblock page_title %}
{% block aside %}
<div class="links-box">
<h3>Categories</h3>
<nav>
{% get_taglist as all_tags for 'blog'%}
<ul>
{% for tag in all_tags %}
<a href="{% url 'blogtags' tag.slug %} ">
<li>{{ tag }}</li>
</a>
{% endfor %}
</ul>
</nav>
</div>
<div class="links-box">
<h3>Archive</h3>
<ul>
{% for year, count in archive_links %}
<a href="{% url "post_year_archive" year %}">
<li>{{ year }} ({{ count }})</li>
</a>
{% endfor %}
</ul>
</div>
{% endblock aside %}
| {% extends "base.html" %}
{% load taggit_extras %}
{% block title %}Blog{% endblock title %}
{% block extrahead %}
{% endblock extrahead %}
{% block page_title %}
BLOG!
{% endblock page_title %}
{% block aside %}
<div class="links-box">
<h3>Categories</h3>
<nav>
- {% get_taglist as all_tags %}
+ {% get_taglist as all_tags for 'blog'%}
? ++++++++++
<ul>
{% for tag in all_tags %}
<a href="{% url 'blogtags' tag.slug %} ">
- <li>{{ tag }} ({{ tag.num_times }})</li>
+ <li>{{ tag }}</li>
</a>
{% endfor %}
</ul>
</nav>
</div>
<div class="links-box">
<h3>Archive</h3>
<ul>
{% for year, count in archive_links %}
<a href="{% url "post_year_archive" year %}">
<li>{{ year }} ({{ count }})</li>
</a>
{% endfor %}
</ul>
</div>
{% endblock aside %} | 4 | 0.095238 | 2 | 2 |
859e891a1f5418a877615a510d03f0e421a8140d | layouts/partials/jumbotron.html | layouts/partials/jumbotron.html | {{ $isHomePage := eq .Title .Site.Title }}
<div class="jumbotron" style="background-image: url({{ if $isHomePage }}{{ with .Site.Params.image }}{{ . }}{{ end }}{{ else }}{{ with .Params.image }}{{ . }}{{ end }}{{ end }})">
<div class="container">
<h1>{{ if $isHomePage }}{{ .Site.Title }}{{ else }}{{ .Title }}{{ end }}</h1>
<p>{{ if $isHomePage }}{{ with .Site.Params.description }}{{ . }}{{ end }}{{ else }}{{ with .Params.description }}{{ . }}{{ end }}{{ end }}</p>
</div>
</div>
| {{ $isHomePage := eq .Title .Site.Title }}
{{ $baseUrl := .Site.BaseURL }}
<div class="jumbotron" style="background-image: url({{ if isset .Params "image" }}{{ .Params.image }}{{ else }}{{ with .Site.Params.image }}{{ if $isHomePage }}{{ . }}{{ else }}{{ $baseUrl }}/{{ . }}{{ end }}{{ end }}{{ end }});">
<div class="container">
<h1>{{ if $isHomePage }}{{ .Site.Title }}{{ else }}{{ .Title }}{{ end }}</h1>
<p>{{ if $isHomePage }}{{ with .Site.Params.description }}{{ . }}{{ end }}{{ else }}{{ with .Params.description }}{{ . }}{{ end }}{{ end }}</p>
</div>
</div>
| Make .Site.Params.image default on other pages | Make .Site.Params.image default on other pages
If a page does not have .Params.image defined, it will use
.Site.Params.image in the jumbotron.
This means all pages will hae jumbotron images.
Signed-off-by: Ethan Madison <0a27e12d062ad71673d57f9c2799b207af316885@ethanmad.com>
| HTML | mit | UM-Fencing/club-theme,UM-Fencing/club-theme | html | ## Code Before:
{{ $isHomePage := eq .Title .Site.Title }}
<div class="jumbotron" style="background-image: url({{ if $isHomePage }}{{ with .Site.Params.image }}{{ . }}{{ end }}{{ else }}{{ with .Params.image }}{{ . }}{{ end }}{{ end }})">
<div class="container">
<h1>{{ if $isHomePage }}{{ .Site.Title }}{{ else }}{{ .Title }}{{ end }}</h1>
<p>{{ if $isHomePage }}{{ with .Site.Params.description }}{{ . }}{{ end }}{{ else }}{{ with .Params.description }}{{ . }}{{ end }}{{ end }}</p>
</div>
</div>
## Instruction:
Make .Site.Params.image default on other pages
If a page does not have .Params.image defined, it will use
.Site.Params.image in the jumbotron.
This means all pages will hae jumbotron images.
Signed-off-by: Ethan Madison <0a27e12d062ad71673d57f9c2799b207af316885@ethanmad.com>
## Code After:
{{ $isHomePage := eq .Title .Site.Title }}
{{ $baseUrl := .Site.BaseURL }}
<div class="jumbotron" style="background-image: url({{ if isset .Params "image" }}{{ .Params.image }}{{ else }}{{ with .Site.Params.image }}{{ if $isHomePage }}{{ . }}{{ else }}{{ $baseUrl }}/{{ . }}{{ end }}{{ end }}{{ end }});">
<div class="container">
<h1>{{ if $isHomePage }}{{ .Site.Title }}{{ else }}{{ .Title }}{{ end }}</h1>
<p>{{ if $isHomePage }}{{ with .Site.Params.description }}{{ . }}{{ end }}{{ else }}{{ with .Params.description }}{{ . }}{{ end }}{{ end }}</p>
</div>
</div>
| {{ $isHomePage := eq .Title .Site.Title }}
+ {{ $baseUrl := .Site.BaseURL }}
- <div class="jumbotron" style="background-image: url({{ if $isHomePage }}{{ with .Site.Params.image }}{{ . }}{{ end }}{{ else }}{{ with .Params.image }}{{ . }}{{ end }}{{ end }})">
+ <div class="jumbotron" style="background-image: url({{ if isset .Params "image" }}{{ .Params.image }}{{ else }}{{ with .Site.Params.image }}{{ if $isHomePage }}{{ . }}{{ else }}{{ $baseUrl }}/{{ . }}{{ end }}{{ end }}{{ end }});">
<div class="container">
<h1>{{ if $isHomePage }}{{ .Site.Title }}{{ else }}{{ .Title }}{{ end }}</h1>
<p>{{ if $isHomePage }}{{ with .Site.Params.description }}{{ . }}{{ end }}{{ else }}{{ with .Params.description }}{{ . }}{{ end }}{{ end }}</p>
</div>
</div> | 3 | 0.375 | 2 | 1 |
e884b8a6bda740c9ed0c53761672cacaaf3f7cb6 | appveyor.yml | appveyor.yml | environment:
matrix:
- nodejs_version: '6'
- nodejs_version: '5'
- nodejs_version: '4'
install:
- ps: Install-Product node $env:nodejs_version
- set CI=true
- set CASH_APPVEYOR=true
- npm install -g npm@latest || (timeout 30 && npm install -g npm@latest)
- set PATH=%APPDATA%\npm;%PATH%
- npm install || (timeout 30 && npm install)
matrix:
fast_finish: true
build: off
version: '{build}'
shallow_clone: true
clone_depth: 1
test_script:
- node --version
- npm --version
- npm run test-win || (timeout 30 && npm run test-win)
| environment:
matrix:
- nodejs_version: '7'
- nodejs_version: '6'
- nodejs_version: '5'
- nodejs_version: '4'
install:
- ps: Install-Product node $env:nodejs_version
- set CI=true
- set CASH_APPVEYOR=true
- npm install -g npm@latest || (timeout 30 && npm install -g npm@latest)
- set PATH=%APPDATA%\npm;%PATH%
- npm install || (timeout 30 && npm install)
matrix:
fast_finish: true
build: off
version: '{build}'
shallow_clone: true
clone_depth: 1
test_script:
- node --version
- npm --version
- npm run test-win || (timeout 30 && npm run test-win)
| Update Appveyor to use Node v7 | Update Appveyor to use Node v7 | YAML | mit | dthree/cash | yaml | ## Code Before:
environment:
matrix:
- nodejs_version: '6'
- nodejs_version: '5'
- nodejs_version: '4'
install:
- ps: Install-Product node $env:nodejs_version
- set CI=true
- set CASH_APPVEYOR=true
- npm install -g npm@latest || (timeout 30 && npm install -g npm@latest)
- set PATH=%APPDATA%\npm;%PATH%
- npm install || (timeout 30 && npm install)
matrix:
fast_finish: true
build: off
version: '{build}'
shallow_clone: true
clone_depth: 1
test_script:
- node --version
- npm --version
- npm run test-win || (timeout 30 && npm run test-win)
## Instruction:
Update Appveyor to use Node v7
## Code After:
environment:
matrix:
- nodejs_version: '7'
- nodejs_version: '6'
- nodejs_version: '5'
- nodejs_version: '4'
install:
- ps: Install-Product node $env:nodejs_version
- set CI=true
- set CASH_APPVEYOR=true
- npm install -g npm@latest || (timeout 30 && npm install -g npm@latest)
- set PATH=%APPDATA%\npm;%PATH%
- npm install || (timeout 30 && npm install)
matrix:
fast_finish: true
build: off
version: '{build}'
shallow_clone: true
clone_depth: 1
test_script:
- node --version
- npm --version
- npm run test-win || (timeout 30 && npm run test-win)
| environment:
matrix:
+ - nodejs_version: '7'
- nodejs_version: '6'
- nodejs_version: '5'
- nodejs_version: '4'
install:
- ps: Install-Product node $env:nodejs_version
- set CI=true
- set CASH_APPVEYOR=true
- npm install -g npm@latest || (timeout 30 && npm install -g npm@latest)
- set PATH=%APPDATA%\npm;%PATH%
- npm install || (timeout 30 && npm install)
matrix:
fast_finish: true
build: off
version: '{build}'
shallow_clone: true
clone_depth: 1
test_script:
- node --version
- npm --version
- npm run test-win || (timeout 30 && npm run test-win) | 1 | 0.045455 | 1 | 0 |
e64358edc12b9a2fcbf57ecb2ce17ca609df8d43 | Core/Assembler.h | Core/Assembler.h |
enum class ArmipsMode { FILE, MEMORY };
struct LabelDefinition
{
std::wstring name;
int64_t value;
};
struct EquationDefinition
{
std::wstring name;
std::wstring value;
};
struct ArmipsArguments
{
// common
ArmipsMode mode;
int symFileVersion;
bool errorOnWarning;
bool silent;
StringList* errorsResult;
std::vector<EquationDefinition> equList;
std::vector<LabelDefinition> labels;
// file mode
std::wstring inputFileName;
std::wstring tempFileName;
std::wstring symFileName;
bool useAbsoluteFileNames;
// memory mode
std::shared_ptr<AssemblerFile> memoryFile;
std::wstring content;
ArmipsArguments()
{
mode = ArmipsMode::FILE;
errorOnWarning = false;
silent = false;
errorsResult = nullptr;
useAbsoluteFileNames = true;
}
};
bool runArmips(ArmipsArguments& arguments);
|
enum class ArmipsMode { FILE, MEMORY };
struct LabelDefinition
{
std::wstring name;
int64_t value;
};
struct EquationDefinition
{
std::wstring name;
std::wstring value;
};
struct ArmipsArguments
{
// common
ArmipsMode mode;
int symFileVersion;
bool errorOnWarning;
bool silent;
StringList* errorsResult;
std::vector<EquationDefinition> equList;
std::vector<LabelDefinition> labels;
// file mode
std::wstring inputFileName;
std::wstring tempFileName;
std::wstring symFileName;
bool useAbsoluteFileNames;
// memory mode
std::shared_ptr<AssemblerFile> memoryFile;
std::wstring content;
ArmipsArguments()
{
mode = ArmipsMode::FILE;
symFileVersion = 0;
errorOnWarning = false;
silent = false;
errorsResult = nullptr;
useAbsoluteFileNames = true;
}
};
bool runArmips(ArmipsArguments& arguments);
| Fix symFileVersion not being initialized in ArmipsArguments constructor | Fix symFileVersion not being initialized in ArmipsArguments constructor
| C | mit | Kingcom/armips,Kingcom/armips,sp1187/armips,sp1187/armips,Kingcom/armips,sp1187/armips | c | ## Code Before:
enum class ArmipsMode { FILE, MEMORY };
struct LabelDefinition
{
std::wstring name;
int64_t value;
};
struct EquationDefinition
{
std::wstring name;
std::wstring value;
};
struct ArmipsArguments
{
// common
ArmipsMode mode;
int symFileVersion;
bool errorOnWarning;
bool silent;
StringList* errorsResult;
std::vector<EquationDefinition> equList;
std::vector<LabelDefinition> labels;
// file mode
std::wstring inputFileName;
std::wstring tempFileName;
std::wstring symFileName;
bool useAbsoluteFileNames;
// memory mode
std::shared_ptr<AssemblerFile> memoryFile;
std::wstring content;
ArmipsArguments()
{
mode = ArmipsMode::FILE;
errorOnWarning = false;
silent = false;
errorsResult = nullptr;
useAbsoluteFileNames = true;
}
};
bool runArmips(ArmipsArguments& arguments);
## Instruction:
Fix symFileVersion not being initialized in ArmipsArguments constructor
## Code After:
enum class ArmipsMode { FILE, MEMORY };
struct LabelDefinition
{
std::wstring name;
int64_t value;
};
struct EquationDefinition
{
std::wstring name;
std::wstring value;
};
struct ArmipsArguments
{
// common
ArmipsMode mode;
int symFileVersion;
bool errorOnWarning;
bool silent;
StringList* errorsResult;
std::vector<EquationDefinition> equList;
std::vector<LabelDefinition> labels;
// file mode
std::wstring inputFileName;
std::wstring tempFileName;
std::wstring symFileName;
bool useAbsoluteFileNames;
// memory mode
std::shared_ptr<AssemblerFile> memoryFile;
std::wstring content;
ArmipsArguments()
{
mode = ArmipsMode::FILE;
symFileVersion = 0;
errorOnWarning = false;
silent = false;
errorsResult = nullptr;
useAbsoluteFileNames = true;
}
};
bool runArmips(ArmipsArguments& arguments);
|
enum class ArmipsMode { FILE, MEMORY };
struct LabelDefinition
{
std::wstring name;
int64_t value;
};
struct EquationDefinition
{
std::wstring name;
std::wstring value;
};
struct ArmipsArguments
{
// common
ArmipsMode mode;
int symFileVersion;
bool errorOnWarning;
bool silent;
StringList* errorsResult;
std::vector<EquationDefinition> equList;
std::vector<LabelDefinition> labels;
// file mode
std::wstring inputFileName;
std::wstring tempFileName;
std::wstring symFileName;
bool useAbsoluteFileNames;
// memory mode
std::shared_ptr<AssemblerFile> memoryFile;
std::wstring content;
ArmipsArguments()
{
mode = ArmipsMode::FILE;
+ symFileVersion = 0;
errorOnWarning = false;
silent = false;
errorsResult = nullptr;
useAbsoluteFileNames = true;
}
};
bool runArmips(ArmipsArguments& arguments); | 1 | 0.021277 | 1 | 0 |
1778588444dca3c3c87d05411561fddefa1ecd64 | app/views/spree/admin/container_taxonomies/_list.html.erb | app/views/spree/admin/container_taxonomies/_list.html.erb | <table class="index" id='listing_container_taxonomies' data-hook>
<tr data-hook="container_taxonomies_header">
<th><%= t(:name) %></th>
<th><%= t(:count)%></th>
<th></th>
</tr>
<% @container_taxonomies.each do |container_taxonomy| %>
<tr id="<%= dom_id container_taxonomy %>" data-hook="container_taxonomies_row">
<td><%= container_taxonomy.name %></td>
<td><%= container_taxonomy.container_taxons.count %></td>
<td class="actions">
<%= link_to_edit container_taxonomy.id, :class => 'edit' %>
<%= link_to_delete container_taxonomy %>
</td>
</tr>
<% end %>
</table>
| <table class="index" id='listing_container_taxonomies' data-hook>
<tr data-hook="container_taxonomies_header">
<th><%= t(:name) %></th>
<th><%= t(:count)%></th>
<th></th>
</tr>
<% @container_taxonomies.each do |container_taxonomy| %>
<tr id="<%= dom_id container_taxonomy %>" data-hook="container_taxonomies_row">
<td><%= link_to container_taxonomy.name, admin_container_taxonomy_container_taxons_path(container_taxonomy) %></td>
<td><%= container_taxonomy.container_taxons.count %></td>
<td class="actions">
<%= link_to_edit container_taxonomy.id, :class => 'edit' %>
<%= link_to_delete container_taxonomy %>
</td>
</tr>
<% end %>
</table>
| Add a show link to container_taxonomies index view | Add a show link to container_taxonomies index view
| HTML+ERB | bsd-3-clause | Genshin/spree_warehouse,Genshin/spree_warehouse | html+erb | ## Code Before:
<table class="index" id='listing_container_taxonomies' data-hook>
<tr data-hook="container_taxonomies_header">
<th><%= t(:name) %></th>
<th><%= t(:count)%></th>
<th></th>
</tr>
<% @container_taxonomies.each do |container_taxonomy| %>
<tr id="<%= dom_id container_taxonomy %>" data-hook="container_taxonomies_row">
<td><%= container_taxonomy.name %></td>
<td><%= container_taxonomy.container_taxons.count %></td>
<td class="actions">
<%= link_to_edit container_taxonomy.id, :class => 'edit' %>
<%= link_to_delete container_taxonomy %>
</td>
</tr>
<% end %>
</table>
## Instruction:
Add a show link to container_taxonomies index view
## Code After:
<table class="index" id='listing_container_taxonomies' data-hook>
<tr data-hook="container_taxonomies_header">
<th><%= t(:name) %></th>
<th><%= t(:count)%></th>
<th></th>
</tr>
<% @container_taxonomies.each do |container_taxonomy| %>
<tr id="<%= dom_id container_taxonomy %>" data-hook="container_taxonomies_row">
<td><%= link_to container_taxonomy.name, admin_container_taxonomy_container_taxons_path(container_taxonomy) %></td>
<td><%= container_taxonomy.container_taxons.count %></td>
<td class="actions">
<%= link_to_edit container_taxonomy.id, :class => 'edit' %>
<%= link_to_delete container_taxonomy %>
</td>
</tr>
<% end %>
</table>
| <table class="index" id='listing_container_taxonomies' data-hook>
<tr data-hook="container_taxonomies_header">
<th><%= t(:name) %></th>
<th><%= t(:count)%></th>
<th></th>
</tr>
<% @container_taxonomies.each do |container_taxonomy| %>
<tr id="<%= dom_id container_taxonomy %>" data-hook="container_taxonomies_row">
- <td><%= container_taxonomy.name %></td>
+ <td><%= link_to container_taxonomy.name, admin_container_taxonomy_container_taxons_path(container_taxonomy) %></td>
<td><%= container_taxonomy.container_taxons.count %></td>
<td class="actions">
<%= link_to_edit container_taxonomy.id, :class => 'edit' %>
<%= link_to_delete container_taxonomy %>
+
</td>
</tr>
<% end %>
</table> | 3 | 0.166667 | 2 | 1 |
93d65e5163d52024708f03caf4799f0b40458a00 | README.md | README.md | infix-doller-reader
===================
Infix $ operator for reducing redundant paranthesis
example:
(fun x y $ fun2 z)
This form above would be expanded to
(fun x y (fun2 z))
| infix-doller-reader
===================
Infix $ operator for reducing redundant paranthesis.
### usage:
1. load package;
(asdf:load-system :infix-doller-reader)
2. enable to use infix $ notation;
(idoller:use-infix-doller)
CAUTION: this operation replaces current \*READTABLE\* with copied one.
3. write your forms with infix $ notation s.t. (the fixnum $ + x 10)
4. rollback \*READTABLE\*;
(idoller:unuse-infix-doller)
### example:
(fun x y $ fun2 z)
This form above would be expanded to below form.
(fun x y (fun2 z))
And also,
(= (+ 1 2 $ * 3 4 $ + 5 6) (+ 1 2 (* 3 4 (+ 5 6)))) ; => T
| Update readme with usage instructions and more examples | Update readme with usage instructions and more examples
| Markdown | mit | ichimal/infix-dollar-reader | markdown | ## Code Before:
infix-doller-reader
===================
Infix $ operator for reducing redundant paranthesis
example:
(fun x y $ fun2 z)
This form above would be expanded to
(fun x y (fun2 z))
## Instruction:
Update readme with usage instructions and more examples
## Code After:
infix-doller-reader
===================
Infix $ operator for reducing redundant paranthesis.
### usage:
1. load package;
(asdf:load-system :infix-doller-reader)
2. enable to use infix $ notation;
(idoller:use-infix-doller)
CAUTION: this operation replaces current \*READTABLE\* with copied one.
3. write your forms with infix $ notation s.t. (the fixnum $ + x 10)
4. rollback \*READTABLE\*;
(idoller:unuse-infix-doller)
### example:
(fun x y $ fun2 z)
This form above would be expanded to below form.
(fun x y (fun2 z))
And also,
(= (+ 1 2 $ * 3 4 $ + 5 6) (+ 1 2 (* 3 4 (+ 5 6)))) ; => T
| infix-doller-reader
===================
- Infix $ operator for reducing redundant paranthesis
+ Infix $ operator for reducing redundant paranthesis.
? +
+ ### usage:
- example:
- (fun x y $ fun2 z)
- This form above would be expanded to
- (fun x y (fun2 z))
+ 1. load package;
+
+ (asdf:load-system :infix-doller-reader)
+
+ 2. enable to use infix $ notation;
+
+ (idoller:use-infix-doller)
+
+ CAUTION: this operation replaces current \*READTABLE\* with copied one.
+
+ 3. write your forms with infix $ notation s.t. (the fixnum $ + x 10)
+
+
+ 4. rollback \*READTABLE\*;
+
+ (idoller:unuse-infix-doller)
+
+ ### example:
+
+ (fun x y $ fun2 z)
+
+ This form above would be expanded to below form.
+
+ (fun x y (fun2 z))
+
+ And also,
+
+ (= (+ 1 2 $ * 3 4 $ + 5 6) (+ 1 2 (* 3 4 (+ 5 6)))) ; => T
+ | 36 | 3.6 | 31 | 5 |
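The README in the record above specifies the `$` rewrite purely by example: everything after `$` is wrapped in one extra pair of parentheses, and repeated `$`s nest right-to-left, so `(+ 1 2 $ * 3 4 $ + 5 6)` becomes `(+ 1 2 (* 3 4 (+ 5 6)))`. A Python sketch of that rewrite on already-parsed forms — the real project implements it as a Common Lisp reader macro, so this only illustrates the rule, not the actual implementation:

```python
def expand_dollar(form):
    """Rewrite (f a $ g b) into (f a (g b)), per the README's examples.

    Forms are modelled as nested Python lists; "$" is a plain token.
    """
    if not isinstance(form, list):
        return form
    # Expand nested sub-forms first, then handle a top-level "$".
    form = [expand_dollar(sub) for sub in form]
    if "$" in form:
        i = form.index("$")
        # Everything after "$" becomes one nested form; recursion
        # handles any further "$" tokens inside that tail.
        return form[:i] + [expand_dollar(form[i + 1:])]
    return form

# (fun x y $ fun2 z) -> (fun x y (fun2 z))
print(expand_dollar(["fun", "x", "y", "$", "fun2", "z"]))
```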
e4249fd1a9187086d8eff3097d0b322728c83448 | flake.nix | flake.nix | {
description = "glualint - Linter and pretty printer for Garry's Mod's variant of Lua.";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }: flake-utils.lib.eachDefaultSystem (system:
let
pkgs = nixpkgs.legacyPackages.${system};
haskellPackages = pkgs.haskellPackages;
staticHaskellPackages = pkgs.pkgsStatic.haskellPackages;
in {
packages.glualint = haskellPackages.callPackage ./default.nix {};
packages.glualint-static = staticHaskellPackages.callPackage ./default.nix {};
defaultPackage = self.packages.${system}.glualint-static;
devShell = pkgs.mkShell {
buildInputs = with haskellPackages; [
cabal-install
(ghcWithPackages (self: with self; [
aeson array base bytestring containers directory filemanip filepath
ListLike MissingH mtl optparse-applicative parsec pretty signal
uu-parsinglib uuagc uuagc-cabal deepseq
]))
];
};
}
);
}
| {
description = "glualint - Linter and pretty printer for Garry's Mod's variant of Lua.";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }: flake-utils.lib.eachDefaultSystem (system:
let
pkgs = nixpkgs.legacyPackages.${system};
haskellPackages = pkgs.haskellPackages;
staticHaskellPackages = pkgs.pkgsStatic.haskellPackages;
in {
packages.glualint = haskellPackages.callPackage ./default.nix {};
packages.glualint-static = staticHaskellPackages.callPackage ./default.nix {};
defaultPackage = self.packages.${system}.glualint-static;
devShell = pkgs.mkShell {
buildInputs = with haskellPackages; [
cabal-install
(ghcWithPackages (self: with self; [
aeson array base bytestring containers directory filemanip filepath
ListLike MissingH mtl optparse-applicative parsec pretty signal
uu-parsinglib uuagc uuagc-cabal deepseq
]))
haskell-language-server
];
};
}
);
}
| Add HLS to the development environment | Add HLS to the development environment
| Nix | lgpl-2.1 | FPtje/GLuaFixer | nix | ## Code Before:
{
description = "glualint - Linter and pretty printer for Garry's Mod's variant of Lua.";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }: flake-utils.lib.eachDefaultSystem (system:
let
pkgs = nixpkgs.legacyPackages.${system};
haskellPackages = pkgs.haskellPackages;
staticHaskellPackages = pkgs.pkgsStatic.haskellPackages;
in {
packages.glualint = haskellPackages.callPackage ./default.nix {};
packages.glualint-static = staticHaskellPackages.callPackage ./default.nix {};
defaultPackage = self.packages.${system}.glualint-static;
devShell = pkgs.mkShell {
buildInputs = with haskellPackages; [
cabal-install
(ghcWithPackages (self: with self; [
aeson array base bytestring containers directory filemanip filepath
ListLike MissingH mtl optparse-applicative parsec pretty signal
uu-parsinglib uuagc uuagc-cabal deepseq
]))
];
};
}
);
}
## Instruction:
Add HLS to the development environment
## Code After:
{
description = "glualint - Linter and pretty printer for Garry's Mod's variant of Lua.";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }: flake-utils.lib.eachDefaultSystem (system:
let
pkgs = nixpkgs.legacyPackages.${system};
haskellPackages = pkgs.haskellPackages;
staticHaskellPackages = pkgs.pkgsStatic.haskellPackages;
in {
packages.glualint = haskellPackages.callPackage ./default.nix {};
packages.glualint-static = staticHaskellPackages.callPackage ./default.nix {};
defaultPackage = self.packages.${system}.glualint-static;
devShell = pkgs.mkShell {
buildInputs = with haskellPackages; [
cabal-install
(ghcWithPackages (self: with self; [
aeson array base bytestring containers directory filemanip filepath
ListLike MissingH mtl optparse-applicative parsec pretty signal
uu-parsinglib uuagc uuagc-cabal deepseq
]))
haskell-language-server
];
};
}
);
}
| {
description = "glualint - Linter and pretty printer for Garry's Mod's variant of Lua.";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, flake-utils }: flake-utils.lib.eachDefaultSystem (system:
let
pkgs = nixpkgs.legacyPackages.${system};
haskellPackages = pkgs.haskellPackages;
staticHaskellPackages = pkgs.pkgsStatic.haskellPackages;
in {
packages.glualint = haskellPackages.callPackage ./default.nix {};
packages.glualint-static = staticHaskellPackages.callPackage ./default.nix {};
defaultPackage = self.packages.${system}.glualint-static;
devShell = pkgs.mkShell {
buildInputs = with haskellPackages; [
cabal-install
(ghcWithPackages (self: with self; [
aeson array base bytestring containers directory filemanip filepath
ListLike MissingH mtl optparse-applicative parsec pretty signal
uu-parsinglib uuagc uuagc-cabal deepseq
]))
+ haskell-language-server
];
};
}
);
} | 1 | 0.033333 | 1 | 0 |
db6b875ec1aeae640aac351d55d819fe7baeeb41 | bower.json | bower.json | {
"name": "automation-inspector",
"description": "Inspection app for Chrome Automation API",
"main": "build/app/manifest.json",
"authors": [
"Aaron Leventhal"
],
"license": "Apache-2.0",
"keywords": [
"Chrome",
"automation",
"accessibility"
],
"homepage": "https://github.com/google/automation-inspector",
"private": true,
"ignore": [
"**/.*",
"node_modules",
"bower_components",
"test",
"tests"
],
"dependencies": {
"jquery": "^3.2.1",
"jquery-ui": "^1.12.1",
"fancytree": "git://github.com/aleventhal/fancytree#dist"
}
}
| {
"name": "automation-inspector",
"description": "Inspection app for Chrome Automation API",
"main": "build/app/manifest.json",
"authors": [
"Aaron Leventhal"
],
"license": "Apache-2.0",
"keywords": [
"Chrome",
"automation",
"accessibility"
],
"homepage": "https://github.com/google/automation-inspector",
"private": true,
"ignore": [
"**/.*",
"node_modules",
"bower_components",
"test",
"tests"
],
"dependencies": {
"jquery": "^3.2.1",
"jquery-ui": "^1.12.1",
"fancytree": "git://github.com/aleventhal/fancytree#fix-chrome-names"
}
}
| Use fancytree with workaround for Chrome treeitem names including too much descendant text | Use fancytree with workaround for Chrome treeitem names including too much descendant text
| JSON | apache-2.0 | google/automation-inspector,google/automation-inspector | json | ## Code Before:
{
"name": "automation-inspector",
"description": "Inspection app for Chrome Automation API",
"main": "build/app/manifest.json",
"authors": [
"Aaron Leventhal"
],
"license": "Apache-2.0",
"keywords": [
"Chrome",
"automation",
"accessibility"
],
"homepage": "https://github.com/google/automation-inspector",
"private": true,
"ignore": [
"**/.*",
"node_modules",
"bower_components",
"test",
"tests"
],
"dependencies": {
"jquery": "^3.2.1",
"jquery-ui": "^1.12.1",
"fancytree": "git://github.com/aleventhal/fancytree#dist"
}
}
## Instruction:
Use fancytree with workaround for Chrome treeitem names including too much descendant text
## Code After:
{
"name": "automation-inspector",
"description": "Inspection app for Chrome Automation API",
"main": "build/app/manifest.json",
"authors": [
"Aaron Leventhal"
],
"license": "Apache-2.0",
"keywords": [
"Chrome",
"automation",
"accessibility"
],
"homepage": "https://github.com/google/automation-inspector",
"private": true,
"ignore": [
"**/.*",
"node_modules",
"bower_components",
"test",
"tests"
],
"dependencies": {
"jquery": "^3.2.1",
"jquery-ui": "^1.12.1",
"fancytree": "git://github.com/aleventhal/fancytree#fix-chrome-names"
}
}
| {
"name": "automation-inspector",
"description": "Inspection app for Chrome Automation API",
"main": "build/app/manifest.json",
"authors": [
"Aaron Leventhal"
],
"license": "Apache-2.0",
"keywords": [
"Chrome",
"automation",
"accessibility"
],
"homepage": "https://github.com/google/automation-inspector",
"private": true,
"ignore": [
"**/.*",
"node_modules",
"bower_components",
"test",
"tests"
],
"dependencies": {
"jquery": "^3.2.1",
"jquery-ui": "^1.12.1",
- "fancytree": "git://github.com/aleventhal/fancytree#dist"
? ^ -
+ "fancytree": "git://github.com/aleventhal/fancytree#fix-chrome-names"
? ^ +++++++++++++
}
} | 2 | 0.071429 | 1 | 1 |
648cdac73ab608dc56fa3ed3faf0f514fdf8b348 | ci/ansible/all.yml | ci/ansible/all.yml | ---
node_config_directory: "/etc/ovn-scale-test"
container_config_directory: "/var/lib/ovn-scale-test/config_files"
deploy_user: "root"
###################
# Docker options
###################
ovn_db_image: "ovn-scale-test-ovn"
ovn_chassis_image: "ovn-scale-test-ovn"
rally_image: "ovn-scale-test-rally"
# Valid options are [ missing, always ]
image_pull_policy: "missing"
###################
# Emulation options
###################
ovn_database_alias_ip: "172.16.20.100"
ovn_database_device: "eth0"
ovn_chassis_start_cidr: "172.16.200.10/16"
ovn_chassis_device: "eth0"
# Total number of emulated chassis
ovn_number_chassis: 5
########################
# Rally workload options
########################
network_start_cidr: "172.16.201.0/24"
network_number: "5"
ports_per_network: "50"
acls_per_port: "5"
################
# OVS Repository
################
ovs_repo: "https://github.com/openvswitch/ovs.git"
ovs_branch: "master"
| ---
node_config_directory: "/etc/ovn-scale-test"
container_config_directory: "/var/lib/ovn-scale-test/config_files"
deploy_user: "root"
###################
# Docker options
###################
ovn_db_image: "huikang/ovn-scale-test-ovn"
ovn_chassis_image: "huikang/ovn-scale-test-ovn"
rally_image: "huikang/ovn-scale-test-rally"
# Valid options are [ missing, always ]
image_pull_policy: "missing"
###################
# Emulation options
###################
ovn_database_alias_ip: "172.16.20.100"
ovn_database_device: "eth0"
ovn_chassis_start_cidr: "172.16.200.10/16"
ovn_chassis_device: "eth0"
# Total number of emulated chassis
ovn_number_chassis: 5
########################
# Rally workload options
########################
network_start_cidr: "172.16.201.0/24"
network_number: "5"
ports_per_network: "50"
acls_per_port: "5"
################
# OVS Repository
################
ovs_repo: "https://github.com/openvswitch/ovs.git"
ovs_branch: "master"
| Update docker image names in ci | Update docker image names in ci
It's unclear how this used to work, but a recent issue arose (see
Isse #74) where running ovn-scale-test using the CI scripts was
failing. This was tracked down to incorrect docker image names.
Closes Issue #74
Signed-off-by: Kyle Mestery <9cfbda34e625c7b0454bcb19a48d07c46f9097d2@mestery.com>
| YAML | apache-2.0 | openvswitch/ovn-scale-test,openvswitch/ovn-scale-test,sivakom/ovn-scale-test,sivakom/ovn-scale-test | yaml | ## Code Before:
---
node_config_directory: "/etc/ovn-scale-test"
container_config_directory: "/var/lib/ovn-scale-test/config_files"
deploy_user: "root"
###################
# Docker options
###################
ovn_db_image: "ovn-scale-test-ovn"
ovn_chassis_image: "ovn-scale-test-ovn"
rally_image: "ovn-scale-test-rally"
# Valid options are [ missing, always ]
image_pull_policy: "missing"
###################
# Emulation options
###################
ovn_database_alias_ip: "172.16.20.100"
ovn_database_device: "eth0"
ovn_chassis_start_cidr: "172.16.200.10/16"
ovn_chassis_device: "eth0"
# Total number of emulated chassis
ovn_number_chassis: 5
########################
# Rally workload options
########################
network_start_cidr: "172.16.201.0/24"
network_number: "5"
ports_per_network: "50"
acls_per_port: "5"
################
# OVS Repository
################
ovs_repo: "https://github.com/openvswitch/ovs.git"
ovs_branch: "master"
## Instruction:
Update docker image names in ci
It's unclear how this used to work, but a recent issue arose (see
Isse #74) where running ovn-scale-test using the CI scripts was
failing. This was tracked down to incorrect docker image names.
Closes Issue #74
Signed-off-by: Kyle Mestery <9cfbda34e625c7b0454bcb19a48d07c46f9097d2@mestery.com>
## Code After:
---
node_config_directory: "/etc/ovn-scale-test"
container_config_directory: "/var/lib/ovn-scale-test/config_files"
deploy_user: "root"
###################
# Docker options
###################
ovn_db_image: "huikang/ovn-scale-test-ovn"
ovn_chassis_image: "huikang/ovn-scale-test-ovn"
rally_image: "huikang/ovn-scale-test-rally"
# Valid options are [ missing, always ]
image_pull_policy: "missing"
###################
# Emulation options
###################
ovn_database_alias_ip: "172.16.20.100"
ovn_database_device: "eth0"
ovn_chassis_start_cidr: "172.16.200.10/16"
ovn_chassis_device: "eth0"
# Total number of emulated chassis
ovn_number_chassis: 5
########################
# Rally workload options
########################
network_start_cidr: "172.16.201.0/24"
network_number: "5"
ports_per_network: "50"
acls_per_port: "5"
################
# OVS Repository
################
ovs_repo: "https://github.com/openvswitch/ovs.git"
ovs_branch: "master"
| ---
node_config_directory: "/etc/ovn-scale-test"
container_config_directory: "/var/lib/ovn-scale-test/config_files"
deploy_user: "root"
###################
# Docker options
###################
- ovn_db_image: "ovn-scale-test-ovn"
+ ovn_db_image: "huikang/ovn-scale-test-ovn"
? ++++++++
- ovn_chassis_image: "ovn-scale-test-ovn"
+ ovn_chassis_image: "huikang/ovn-scale-test-ovn"
? ++++++++
- rally_image: "ovn-scale-test-rally"
+ rally_image: "huikang/ovn-scale-test-rally"
? ++++++++
# Valid options are [ missing, always ]
image_pull_policy: "missing"
###################
# Emulation options
###################
ovn_database_alias_ip: "172.16.20.100"
ovn_database_device: "eth0"
ovn_chassis_start_cidr: "172.16.200.10/16"
ovn_chassis_device: "eth0"
# Total number of emulated chassis
ovn_number_chassis: 5
########################
# Rally workload options
########################
network_start_cidr: "172.16.201.0/24"
network_number: "5"
ports_per_network: "50"
acls_per_port: "5"
################
# OVS Repository
################
ovs_repo: "https://github.com/openvswitch/ovs.git"
ovs_branch: "master" | 6 | 0.130435 | 3 | 3 |
56c90f495c31ed61aeda21ae4cc2216a2723b464 | core/db/default/spree/stores.rb | core/db/default/spree/stores.rb | unless Spree::Store.where(code: 'spree').exists?
Spree::Store.new do |s|
s.code = 'spree'
s.name = 'Spree Demo Site'
s.url = 'demo.spreecommerce.com'
s.mail_from_address = 'spree@example.com'
end.save!
end
| unless Spree::Store.where(code: 'spree').exists?
Spree::Store.new do |s|
s.code = 'spree'
s.name = 'Spree Demo Site'
s.url = 'demo.spreecommerce.com'
s.mail_from_address = 'spree@example.com'
s.cart_tax_country_iso = Spree::Config.admin_vat_location
end.save!
end
| Initialize the demo store with a sensible default tax country column | Initialize the demo store with a sensible default tax country column
The current default behaviour in Solidus is one setting for two
behaviour changes: What taxes are included in the prices in the backend,
and how carts without a shipping address are taxed.
While doing the work in separating the two settings, it does make sense for
most shops to have both be equal. This just sets the `cart_tax_country_iso`
for a new store to be equal to the Spree::Config for the backend prices.
| Ruby | bsd-3-clause | Arpsara/solidus,Arpsara/solidus,Arpsara/solidus,pervino/solidus,jordan-brough/solidus,pervino/solidus,pervino/solidus,pervino/solidus,jordan-brough/solidus,jordan-brough/solidus,Arpsara/solidus,jordan-brough/solidus | ruby | ## Code Before:
unless Spree::Store.where(code: 'spree').exists?
Spree::Store.new do |s|
s.code = 'spree'
s.name = 'Spree Demo Site'
s.url = 'demo.spreecommerce.com'
s.mail_from_address = 'spree@example.com'
end.save!
end
## Instruction:
Initialize the demo store with a sensible default tax country column
The current default behaviour in Solidus is one setting for two
behaviour changes: What taxes are included in the prices in the backend,
and how carts without a shipping address are taxed.
While doing the work in separating the two settings, it does make sense for
most shops to have both be equal. This just sets the `cart_tax_country_iso`
for a new store to be equal to the Spree::Config for the backend prices.
## Code After:
unless Spree::Store.where(code: 'spree').exists?
Spree::Store.new do |s|
s.code = 'spree'
s.name = 'Spree Demo Site'
s.url = 'demo.spreecommerce.com'
s.mail_from_address = 'spree@example.com'
s.cart_tax_country_iso = Spree::Config.admin_vat_location
end.save!
end
| unless Spree::Store.where(code: 'spree').exists?
Spree::Store.new do |s|
s.code = 'spree'
s.name = 'Spree Demo Site'
s.url = 'demo.spreecommerce.com'
s.mail_from_address = 'spree@example.com'
+ s.cart_tax_country_iso = Spree::Config.admin_vat_location
end.save!
end | 1 | 0.125 | 1 | 0 |
0a418d04f67a853d178b16d1ed6a815fc5a63e3b | lib/tytus.rb | lib/tytus.rb |
require "tytus/version"
require "tytus/compatibility"
require "tytus/controller_extensions"
require "tytus/view_extensions"
module Tytus
if defined? Rails::Railtie
require "tytus/railtie"
else
Tytus::Railtie.insert_view
Tytus::Railtie.insert_controller
end
end # Tytus
|
require "rails"
require "tytus/version"
require "tytus/railtie"
require "tytus/compatibility"
require "tytus/controller_extensions"
require "tytus/view_extensions"
module Tytus
if defined? Rails::Railtie
require "tytus/railtie"
else
Tytus::Railtie.insert_view
Tytus::Railtie.insert_controller
end
end # Tytus
| Add railtie and rails to gem requires | Add railtie and rails to gem requires
| Ruby | mit | peter-murach/tytus,peter-murach/tytus | ruby | ## Code Before:
require "tytus/version"
require "tytus/compatibility"
require "tytus/controller_extensions"
require "tytus/view_extensions"
module Tytus
if defined? Rails::Railtie
require "tytus/railtie"
else
Tytus::Railtie.insert_view
Tytus::Railtie.insert_controller
end
end # Tytus
## Instruction:
Add railtie and rails to gem requires
## Code After:
require "rails"
require "tytus/version"
require "tytus/railtie"
require "tytus/compatibility"
require "tytus/controller_extensions"
require "tytus/view_extensions"
module Tytus
if defined? Rails::Railtie
require "tytus/railtie"
else
Tytus::Railtie.insert_view
Tytus::Railtie.insert_controller
end
end # Tytus
|
+ require "rails"
require "tytus/version"
+ require "tytus/railtie"
require "tytus/compatibility"
require "tytus/controller_extensions"
require "tytus/view_extensions"
module Tytus
if defined? Rails::Railtie
require "tytus/railtie"
else
Tytus::Railtie.insert_view
Tytus::Railtie.insert_controller
end
end # Tytus | 2 | 0.125 | 2 | 0 |
cf2b6d959acfc700119aaf081b371f940b89c7e8 | conf/pgp-keyservers.php | conf/pgp-keyservers.php | <?php
return [
'keys.openpgp.org',
'keys.fedoraproject.org',
'keyserver.ubuntu.com',
'hkps.pool.sks-keyservers.net'
];
| <?php
return [
'keys.openpgp.org',
'keyserver.ubuntu.com',
'hkps.pool.sks-keyservers.net'
];
| Remove keys.fedoraproject.org from server list | Remove keys.fedoraproject.org from server list
| PHP | bsd-3-clause | phar-io/phive | php | ## Code Before:
<?php
return [
'keys.openpgp.org',
'keys.fedoraproject.org',
'keyserver.ubuntu.com',
'hkps.pool.sks-keyservers.net'
];
## Instruction:
Remove keys.fedoraproject.org from server list
## Code After:
<?php
return [
'keys.openpgp.org',
'keyserver.ubuntu.com',
'hkps.pool.sks-keyservers.net'
];
| <?php
return [
'keys.openpgp.org',
- 'keys.fedoraproject.org',
'keyserver.ubuntu.com',
'hkps.pool.sks-keyservers.net'
]; | 1 | 0.125 | 0 | 1 |
eeef2a8459090a7fc4dfca1dc3c93f064b4e01e9 | .travis.yml | .travis.yml | sudo: false
language: python
python:
- '2.7'
- '3.5'
env:
- CONDA=true
- CONDA=false
install:
- lsb_release -a
- if [[ "${CONDA}" == "true" ]]; then
source auto_version/travis_install_conda.sh numpy scipy numba pip pytest;
fi
- python setup.py install
script:
- py.test
after_success:
- chmod +x ./.conda_deploy.sh
deploy:
matrix:
- provider: script
script: "./.conda_deploy.sh"
skip_cleanup: true
- provider: pypi
user: moboyle79
password:
secure: D0OzSdZn5hWjXX5H41g4pnqme1TJaABKGmUpe14PGMaRf8DjisVMZStAsVVWfocxFqQX3gress+KKtxEvdySXxcfkXAp8VTBvZ+V/uzQQFSYmf5KFwTR/yywff7vdCO+eSHztIcdOhz8Uw2poL/f4/BmO9Y5OoHXJPTkvr6MbIk=
on:
distributions: sdist bdist_wheel
repo: moble/quaternion
| sudo: false
language: python
python:
- '2.7'
- '3.5'
env:
- CONDA=true
- CONDA=false
install:
- lsb_release -a
- if [[ "${CONDA}" == "true" ]]; then
source auto_version/travis_install_conda.sh numpy scipy numba pip pytest;
fi
- python setup.py install
script:
- py.test
after_success:
- chmod +x ./.conda_deploy.sh
deploy:
- provider: script
script: "./.conda_deploy.sh"
skip_cleanup: true
- provider: pypi
user: moboyle79
password:
secure: D0OzSdZn5hWjXX5H41g4pnqme1TJaABKGmUpe14PGMaRf8DjisVMZStAsVVWfocxFqQX3gress+KKtxEvdySXxcfkXAp8VTBvZ+V/uzQQFSYmf5KFwTR/yywff7vdCO+eSHztIcdOhz8Uw2poL/f4/BmO9Y5OoHXJPTkvr6MbIk=
on:
distributions: sdist bdist_wheel
repo: moble/quaternion
python: '3.5'
condition: $CONDA = 'false'
| Remove "matrix", but add conditions for pypi deployment | Remove "matrix", but add conditions for pypi deployment
| YAML | mit | moble/quaternion,moble/quaternion | yaml | ## Code Before:
sudo: false
language: python
python:
- '2.7'
- '3.5'
env:
- CONDA=true
- CONDA=false
install:
- lsb_release -a
- if [[ "${CONDA}" == "true" ]]; then
source auto_version/travis_install_conda.sh numpy scipy numba pip pytest;
fi
- python setup.py install
script:
- py.test
after_success:
- chmod +x ./.conda_deploy.sh
deploy:
matrix:
- provider: script
script: "./.conda_deploy.sh"
skip_cleanup: true
- provider: pypi
user: moboyle79
password:
secure: D0OzSdZn5hWjXX5H41g4pnqme1TJaABKGmUpe14PGMaRf8DjisVMZStAsVVWfocxFqQX3gress+KKtxEvdySXxcfkXAp8VTBvZ+V/uzQQFSYmf5KFwTR/yywff7vdCO+eSHztIcdOhz8Uw2poL/f4/BmO9Y5OoHXJPTkvr6MbIk=
on:
distributions: sdist bdist_wheel
repo: moble/quaternion
## Instruction:
Remove "matrix", but add conditions for pypi deployment
## Code After:
sudo: false
language: python
python:
- '2.7'
- '3.5'
env:
- CONDA=true
- CONDA=false
install:
- lsb_release -a
- if [[ "${CONDA}" == "true" ]]; then
source auto_version/travis_install_conda.sh numpy scipy numba pip pytest;
fi
- python setup.py install
script:
- py.test
after_success:
- chmod +x ./.conda_deploy.sh
deploy:
- provider: script
script: "./.conda_deploy.sh"
skip_cleanup: true
- provider: pypi
user: moboyle79
password:
secure: D0OzSdZn5hWjXX5H41g4pnqme1TJaABKGmUpe14PGMaRf8DjisVMZStAsVVWfocxFqQX3gress+KKtxEvdySXxcfkXAp8VTBvZ+V/uzQQFSYmf5KFwTR/yywff7vdCO+eSHztIcdOhz8Uw2poL/f4/BmO9Y5OoHXJPTkvr6MbIk=
on:
distributions: sdist bdist_wheel
repo: moble/quaternion
python: '3.5'
condition: $CONDA = 'false'
| sudo: false
language: python
python:
- '2.7'
- '3.5'
env:
- CONDA=true
- CONDA=false
install:
- lsb_release -a
- if [[ "${CONDA}" == "true" ]]; then
source auto_version/travis_install_conda.sh numpy scipy numba pip pytest;
fi
- python setup.py install
script:
- py.test
after_success:
- chmod +x ./.conda_deploy.sh
deploy:
- matrix:
- provider: script
script: "./.conda_deploy.sh"
skip_cleanup: true
- provider: pypi
user: moboyle79
password:
secure: D0OzSdZn5hWjXX5H41g4pnqme1TJaABKGmUpe14PGMaRf8DjisVMZStAsVVWfocxFqQX3gress+KKtxEvdySXxcfkXAp8VTBvZ+V/uzQQFSYmf5KFwTR/yywff7vdCO+eSHztIcdOhz8Uw2poL/f4/BmO9Y5OoHXJPTkvr6MbIk=
on:
distributions: sdist bdist_wheel
repo: moble/quaternion
+ python: '3.5'
+ condition: $CONDA = 'false' | 3 | 0.081081 | 2 | 1 |
01e79f1297b94ddce9e9f31f16f8d05db6989d0d | lib/awspec/type/route53_hosted_zone.rb | lib/awspec/type/route53_hosted_zone.rb | module Awspec::Type
class Route53HostedZone < ResourceBase
def resource_via_client
@resource_via_client ||= find_hosted_zone(@display_name)
end
def id
@id ||= resource_via_client.id if resource_via_client
end
def resource_via_client_record_sets
@resource_via_client_record_sets ||= select_record_sets_by_hosted_zone_id(id)
end
def has_record_set?(name, type, value, options = {})
name.gsub!(/\*/, '\\\052') # wildcard support
ret = resource_via_client_record_sets.find do |record_set|
# next if record_set.type != type.upcase
next unless record_set.type.casecmp(type) == 0
options[:ttl] = record_set[:ttl] unless options[:ttl]
if !record_set.resource_records.empty?
v = record_set.resource_records.map { |r| r.value }.join("\n")
record_set.name == name && \
value == v && \
record_set.ttl == options[:ttl]
else
# ALIAS
record_set.name == name && \
record_set.alias_target.dns_name == options[:alias_dns_name] && \
record_set.alias_target.hosted_zone_id == options[:alias_hosted_zone_id]
end
end
end
end
end
| module Awspec::Type
class Route53HostedZone < ResourceBase
def resource_via_client
@resource_via_client ||= find_hosted_zone(@display_name)
end
def id
@id ||= resource_via_client.id if resource_via_client
end
def resource_via_client_record_sets
@resource_via_client_record_sets ||= select_record_sets_by_hosted_zone_id(id)
end
def has_record_set?(name, type, value, options = {})
name.gsub!(/\*/, '\\\052') # wildcard support
ret = resource_via_client_record_sets.find do |record_set|
# next if record_set.type != type.upcase
next unless record_set.type.casecmp(type) == 0
options[:ttl] = record_set[:ttl] unless options[:ttl]
if !record_set.resource_records.empty?
sorted = record_set.resource_records.map { |r| r.value }.sort.join("\n")
record_set.name == name && \
value.split("\n").sort.join("\n") == sorted && \
record_set.ttl == options[:ttl]
else
# ALIAS
record_set.name == name && \
record_set.alias_target.dns_name == options[:alias_dns_name] && \
record_set.alias_target.hosted_zone_id == options[:alias_hosted_zone_id]
end
end
end
end
end
| Support "no order guarantee Route53 record value" | Support "no order guarantee Route53 record value"
| Ruby | mit | k1LoW/awspec,k1LoW/awspec | ruby | ## Code Before:
module Awspec::Type
class Route53HostedZone < ResourceBase
def resource_via_client
@resource_via_client ||= find_hosted_zone(@display_name)
end
def id
@id ||= resource_via_client.id if resource_via_client
end
def resource_via_client_record_sets
@resource_via_client_record_sets ||= select_record_sets_by_hosted_zone_id(id)
end
def has_record_set?(name, type, value, options = {})
name.gsub!(/\*/, '\\\052') # wildcard support
ret = resource_via_client_record_sets.find do |record_set|
# next if record_set.type != type.upcase
next unless record_set.type.casecmp(type) == 0
options[:ttl] = record_set[:ttl] unless options[:ttl]
if !record_set.resource_records.empty?
v = record_set.resource_records.map { |r| r.value }.join("\n")
record_set.name == name && \
value == v && \
record_set.ttl == options[:ttl]
else
# ALIAS
record_set.name == name && \
record_set.alias_target.dns_name == options[:alias_dns_name] && \
record_set.alias_target.hosted_zone_id == options[:alias_hosted_zone_id]
end
end
end
end
end
## Instruction:
Support "no order guarantee Route53 record value"
## Code After:
module Awspec::Type
class Route53HostedZone < ResourceBase
def resource_via_client
@resource_via_client ||= find_hosted_zone(@display_name)
end
def id
@id ||= resource_via_client.id if resource_via_client
end
def resource_via_client_record_sets
@resource_via_client_record_sets ||= select_record_sets_by_hosted_zone_id(id)
end
def has_record_set?(name, type, value, options = {})
name.gsub!(/\*/, '\\\052') # wildcard support
ret = resource_via_client_record_sets.find do |record_set|
# next if record_set.type != type.upcase
next unless record_set.type.casecmp(type) == 0
options[:ttl] = record_set[:ttl] unless options[:ttl]
if !record_set.resource_records.empty?
sorted = record_set.resource_records.map { |r| r.value }.sort.join("\n")
record_set.name == name && \
value.split("\n").sort.join("\n") == sorted && \
record_set.ttl == options[:ttl]
else
# ALIAS
record_set.name == name && \
record_set.alias_target.dns_name == options[:alias_dns_name] && \
record_set.alias_target.hosted_zone_id == options[:alias_hosted_zone_id]
end
end
end
end
end
| module Awspec::Type
class Route53HostedZone < ResourceBase
def resource_via_client
@resource_via_client ||= find_hosted_zone(@display_name)
end
def id
@id ||= resource_via_client.id if resource_via_client
end
def resource_via_client_record_sets
@resource_via_client_record_sets ||= select_record_sets_by_hosted_zone_id(id)
end
def has_record_set?(name, type, value, options = {})
name.gsub!(/\*/, '\\\052') # wildcard support
ret = resource_via_client_record_sets.find do |record_set|
# next if record_set.type != type.upcase
next unless record_set.type.casecmp(type) == 0
options[:ttl] = record_set[:ttl] unless options[:ttl]
if !record_set.resource_records.empty?
- v = record_set.resource_records.map { |r| r.value }.join("\n")
? ^
+ sorted = record_set.resource_records.map { |r| r.value }.sort.join("\n")
? ^^^^^^ +++++
record_set.name == name && \
- value == v && \
+ value.split("\n").sort.join("\n") == sorted && \
record_set.ttl == options[:ttl]
else
# ALIAS
record_set.name == name && \
record_set.alias_target.dns_name == options[:alias_dns_name] && \
record_set.alias_target.hosted_zone_id == options[:alias_hosted_zone_id]
end
end
end
end
end | 4 | 0.114286 | 2 | 2 |
636ea5cdc91b5dcf51ffb69e17c0a09296f1afae | docs/developer/other/IDE_INTEGRATION.md | docs/developer/other/IDE_INTEGRATION.md |
These scripts automatically generate project configuration for the Intellij and Eclipse editors.
Using these scripts removes the need to run custom import and configuration to integrate uPortal with the Integrated Development Environment (IDE).
## Intellij
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew idea
```
4. Open intellij
5. Go to the welcome page
6. Select open

7. Navigate to the uPortal folder
8. Open it
## Eclipse
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew eclipse
```
4. Open eclipse
6. Select file > import
7. Search for "Existing Projects into Workspace"

7. Navigate to the uPortal folder
8. Open it
|
These scripts automatically generate project configuration for the Intellij and Eclipse editors.
Using these scripts removes the need to run custom import and configuration to integrate uPortal with the Integrated Development Environment (IDE).
## Intellij
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew idea
```
4. Open intellij
5. Go to the welcome page
6. Select open

7. Navigate to the uPortal folder
8. Open it
## Eclipse
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew eclipse
```
4. Open eclipse
5. Select file > import
6. Search for "Existing Projects into Workspace"

7. Select the the uPortal folder as root directory
8. Select the "Search for nested projects" option
8. Open it
| Add nested projects option to Eclipse guide. | Add nested projects option to Eclipse guide. | Markdown | apache-2.0 | groybal/uPortal,groybal/uPortal,jonathanmtran/uPortal,cousquer/uPortal,Jasig/uPortal,groybal/uPortal,jonathanmtran/uPortal,cousquer/uPortal,cousquer/uPortal,bjagg/uPortal,jonathanmtran/uPortal,bjagg/uPortal,Jasig/uPortal,groybal/uPortal,mgillian/uPortal,mgillian/uPortal,mgillian/uPortal,ChristianMurphy/uPortal,ChristianMurphy/uPortal,ChristianMurphy/uPortal,groybal/uPortal,Jasig/uPortal,bjagg/uPortal | markdown | ## Code Before:
These scripts automatically generate project configuration for the Intellij and Eclipse editors.
Using these scripts removes the need to run custom import and configuration to integrate uPortal with the Integrated Development Environment (IDE).
## Intellij
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew idea
```
4. Open intellij
5. Go to the welcome page
6. Select open

7. Navigate to the uPortal folder
8. Open it
## Eclipse
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew eclipse
```
4. Open eclipse
6. Select file > import
7. Search for "Existing Projects into Workspace"

7. Navigate to the uPortal folder
8. Open it
## Instruction:
Add nested projects option to Eclipse guide.
## Code After:
These scripts automatically generate project configuration for the Intellij and Eclipse editors.
Using these scripts removes the need to run custom import and configuration to integrate uPortal with the Integrated Development Environment (IDE).
## Intellij
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew idea
```
4. Open intellij
5. Go to the welcome page
6. Select open

7. Navigate to the uPortal folder
8. Open it
## Eclipse
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew eclipse
```
4. Open eclipse
5. Select file > import
6. Search for "Existing Projects into Workspace"

7. Select the the uPortal folder as root directory
8. Select the "Search for nested projects" option
8. Open it
|
These scripts automatically generate project configuration for the Intellij and Eclipse editors.
Using these scripts removes the need to run custom import and configuration to integrate uPortal with the Integrated Development Environment (IDE).
## Intellij
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew idea
```
4. Open intellij
5. Go to the welcome page
6. Select open

7. Navigate to the uPortal folder
8. Open it
## Eclipse
1. Open a terminal
2. `cd` to the uPortal folder
3. run
```sh
./gradlew eclipse
```
4. Open eclipse
- 6. Select file > import
? ^
+ 5. Select file > import
? ^
- 7. Search for "Existing Projects into Workspace"
? ^
+ 6. Search for "Existing Projects into Workspace"
? ^

- 7. Navigate to the uPortal folder
+ 7. Select the the uPortal folder as root directory
+ 8. Select the "Search for nested projects" option
8. Open it | 7 | 0.212121 | 4 | 3 |
867087fcb10ee9540696a4c7c24737ac4456437c | app/views/events/index.html.haml | app/views/events/index.html.haml | - provide(:title, '近日開催の道場まとめ')
- provide(:desc, '開催される道場情報をまとめています。')
- provide(:url, @url)
%section.cover
= image_tag "events_cover.png"
%section#events.text-center
%br
%h1 📅 近日開催の道場 (β版)
%br
%p{style: "margin: 0 30px 40px 0px;"}
☯️開催予定のイベントをチェックしましょう! ✅
%br
(Facebook イベントにも
%a{href: "https://github.com/coderdojo-japan/coderdojo.jp/issues/393"}<> 今後対応する予定
です)
= render partial: 'upcoming_events', locals: { upcoming_events: @upcoming_events }
%br
%br
| - provide(:title, '近日開催の道場まとめ')
- provide(:desc, '開催される道場情報をまとめています。')
- provide(:url, @url)
%section.cover
= image_tag "events_cover.png"
%section#events.text-center
%br
%h1 📅 近日開催の道場 (β版)
%br
%p{style: "margin: 0 30px 40px 0px;"}
☯️開催予定のイベントをチェックしましょう! ✅
%br
(Facebook イベントにも
%a{href: "https://github.com/coderdojo-japan/coderdojo.jp/issues/393"}<> 今後対応する予定
です)
= render partial: 'upcoming_events', locals: { upcoming_events: @upcoming_events }
%section#dojos.dojos.text-center
%br
%h3{style: "margin-top: 60px; font-weight: bold;"} 🛠 関連リンク
%ul{:style => "list-style: none; margin-left: -40px; margin-bottom: 40px;"}
%li
%a{:href => "https://github.com/coderdojo-japan/coderdojo.jp/issues/258"} 直近の CoderDojo 開催情報を表示したい #258
%li
%a{:href => "https://github.com/coderdojo-japan/coderdojo.jp/issues/375"} イベント履歴収集スクリプトの改修 #375
%li
%a{:href => "https://github.com/coderdojo-japan/coderdojo.jp/issues/393"} Facebook のイベント履歴収集を自動化したい #393
%br
%br
%br
| Add links to Issue/PR in GitHub from /events page | Add links to Issue/PR in GitHub from /events page
| Haml | mit | yasslab/coderdojo.jp,coderdojo-japan/coderdojo.jp,coderdojo-japan/coderdojo.jp,yasslab/coderdojo.jp,coderdojo-japan/coderdojo.jp,coderdojo-japan/coderdojo.jp,yasslab/coderdojo.jp | haml | ## Code Before:
- provide(:title, '近日開催の道場まとめ')
- provide(:desc, '開催される道場情報をまとめています。')
- provide(:url, @url)
%section.cover
= image_tag "events_cover.png"
%section#events.text-center
%br
%h1 📅 近日開催の道場 (β版)
%br
%p{style: "margin: 0 30px 40px 0px;"}
☯️開催予定のイベントをチェックしましょう! ✅
%br
(Facebook イベントにも
%a{href: "https://github.com/coderdojo-japan/coderdojo.jp/issues/393"}<> 今後対応する予定
です)
= render partial: 'upcoming_events', locals: { upcoming_events: @upcoming_events }
%br
%br
## Instruction:
Add links to Issue/PR in GitHub from /events page
## Code After:
- provide(:title, '近日開催の道場まとめ')
- provide(:desc, '開催される道場情報をまとめています。')
- provide(:url, @url)
%section.cover
= image_tag "events_cover.png"
%section#events.text-center
%br
%h1 📅 近日開催の道場 (β版)
%br
%p{style: "margin: 0 30px 40px 0px;"}
☯️開催予定のイベントをチェックしましょう! ✅
%br
(Facebook イベントにも
%a{href: "https://github.com/coderdojo-japan/coderdojo.jp/issues/393"}<> 今後対応する予定
です)
= render partial: 'upcoming_events', locals: { upcoming_events: @upcoming_events }
%section#dojos.dojos.text-center
%br
%h3{style: "margin-top: 60px; font-weight: bold;"} 🛠 関連リンク
%ul{:style => "list-style: none; margin-left: -40px; margin-bottom: 40px;"}
%li
%a{:href => "https://github.com/coderdojo-japan/coderdojo.jp/issues/258"} 直近の CoderDojo 開催情報を表示したい #258
%li
%a{:href => "https://github.com/coderdojo-japan/coderdojo.jp/issues/375"} イベント履歴収集スクリプトの改修 #375
%li
%a{:href => "https://github.com/coderdojo-japan/coderdojo.jp/issues/393"} Facebook のイベント履歴収集を自動化したい #393
%br
%br
%br
| - provide(:title, '近日開催の道場まとめ')
- provide(:desc, '開催される道場情報をまとめています。')
- provide(:url, @url)
%section.cover
= image_tag "events_cover.png"
%section#events.text-center
%br
%h1 📅 近日開催の道場 (β版)
%br
%p{style: "margin: 0 30px 40px 0px;"}
☯️開催予定のイベントをチェックしましょう! ✅
%br
(Facebook イベントにも
%a{href: "https://github.com/coderdojo-japan/coderdojo.jp/issues/393"}<> 今後対応する予定
です)
= render partial: 'upcoming_events', locals: { upcoming_events: @upcoming_events }
+ %section#dojos.dojos.text-center
+ %br
+ %h3{style: "margin-top: 60px; font-weight: bold;"} 🛠 関連リンク
+ %ul{:style => "list-style: none; margin-left: -40px; margin-bottom: 40px;"}
+ %li
+ %a{:href => "https://github.com/coderdojo-japan/coderdojo.jp/issues/258"} 直近の CoderDojo 開催情報を表示したい #258
+ %li
+ %a{:href => "https://github.com/coderdojo-japan/coderdojo.jp/issues/375"} イベント履歴収集スクリプトの改修 #375
+ %li
+ %a{:href => "https://github.com/coderdojo-japan/coderdojo.jp/issues/393"} Facebook のイベント履歴収集を自動化したい #393
+ %br
+
%br
%br | 12 | 0.571429 | 12 | 0 |
86fbbb5ca26a1e1f601ca469332d5e19a5e7474d | .travis.yml | .travis.yml | language: erlang
branches:
only:
- master
notifications:
email: mongoose-im@erlang-solutions.com
otp_release:
- 21.3
- 22.0
sudo: required
services:
- docker
install:
- make
before_script:
- make mongooseim-start
script:
- make test
- make ct
- make dialyzer
after_script:
- make mongooseim-stop
cache:
directories:
- $HOME/.cache/rebar3
| language: erlang
branches:
only:
- master
notifications:
email: mongoose-im@erlang-solutions.com
otp_release:
- 21.3
- 22.3
- 23.0
sudo: required
services:
- docker
install:
- make
before_script:
- make mongooseim-start
script:
- make test
- make ct
- make dialyzer
after_script:
- make mongooseim-stop
cache:
directories:
- $HOME/.cache/rebar3
| Introduce OTP23 and upgrade OTP22 | Introduce OTP23 and upgrade OTP22
| YAML | apache-2.0 | esl/escalus | yaml | ## Code Before:
language: erlang
branches:
only:
- master
notifications:
email: mongoose-im@erlang-solutions.com
otp_release:
- 21.3
- 22.0
sudo: required
services:
- docker
install:
- make
before_script:
- make mongooseim-start
script:
- make test
- make ct
- make dialyzer
after_script:
- make mongooseim-stop
cache:
directories:
- $HOME/.cache/rebar3
## Instruction:
Introduce OTP23 and upgrade OTP22
## Code After:
language: erlang
branches:
only:
- master
notifications:
email: mongoose-im@erlang-solutions.com
otp_release:
- 21.3
- 22.3
- 23.0
sudo: required
services:
- docker
install:
- make
before_script:
- make mongooseim-start
script:
- make test
- make ct
- make dialyzer
after_script:
- make mongooseim-stop
cache:
directories:
- $HOME/.cache/rebar3
| language: erlang
branches:
only:
- master
notifications:
email: mongoose-im@erlang-solutions.com
otp_release:
- 21.3
- - 22.0
? ^
+ - 22.3
? ^
+ - 23.0
sudo: required
services:
- docker
install:
- make
before_script:
- make mongooseim-start
script:
- make test
- make ct
- make dialyzer
after_script:
- make mongooseim-stop
cache:
directories:
- $HOME/.cache/rebar3 | 3 | 0.090909 | 2 | 1 |
47a7cedf2cc90d495973195a4bf52ec4818fd753 | _drafts/2019-03-26-mistakes.md | _drafts/2019-03-26-mistakes.md | ---
layout: post
title: "List of Programming Mistakes"
description: "A list of programming mistakes that I have encountered."
keywords: ""
---
This article is really a list of a programming mistakes that I have encountered or made myself. This article won't really have any structure.
- Unexpected side-effects
Recently, I came across the following code snippet:
```C++
class HoleDetector {
private:
std::vector<int> holeIds;
public:
bool isHole(int id) {
//
}
}
```
- HashMap that associates key to true value
- Not setting invariants in the constructor
- Use for-each loop instead of classic for loop
| ---
layout: post
title: "List of Programming Mistakes"
description: "A list of programming mistakes that I have encountered."
keywords: ""
---
This article is really a list of a programming mistakes that I have encountered or made myself. This article won't really have any structure.
- Unexpected side-effects
Recently, I came across the following code snippet:
```C++
class HoleDetector {
private:
std::vector<int> holeIds;
public:
bool isHole(int id) {
//
}
}
```
- HashMap that associates key to true value
- Not setting invariants in the constructor
- Use for-each loop instead of classic for loop when possible
- Use `std::numeric_limits<double>::lowest()` instead of `DBL_MIN`
| Use `numeric_limits` instead of `DBL_MIN` | Use `numeric_limits` instead of `DBL_MIN`
| Markdown | mit | aaronang/aaronang.github.io,aaronang/aaronang.github.io | markdown | ## Code Before:
---
layout: post
title: "List of Programming Mistakes"
description: "A list of programming mistakes that I have encountered."
keywords: ""
---
This article is really a list of a programming mistakes that I have encountered or made myself. This article won't really have any structure.
- Unexpected side-effects
Recently, I came across the following code snippet:
```C++
class HoleDetector {
private:
std::vector<int> holeIds;
public:
bool isHole(int id) {
//
}
}
```
- HashMap that associates key to true value
- Not setting invariants in the constructor
- Use for-each loop instead of classic for loop
## Instruction:
Use `numeric_limits` instead of `DBL_MIN`
## Code After:
---
layout: post
title: "List of Programming Mistakes"
description: "A list of programming mistakes that I have encountered."
keywords: ""
---
This article is really a list of a programming mistakes that I have encountered or made myself. This article won't really have any structure.
- Unexpected side-effects
Recently, I came across the following code snippet:
```C++
class HoleDetector {
private:
std::vector<int> holeIds;
public:
bool isHole(int id) {
//
}
}
```
- HashMap that associates key to true value
- Not setting invariants in the constructor
- Use for-each loop instead of classic for loop when possible
- Use `std::numeric_limits<double>::lowest()` instead of `DBL_MIN`
| ---
layout: post
title: "List of Programming Mistakes"
description: "A list of programming mistakes that I have encountered."
keywords: ""
---
This article is really a list of a programming mistakes that I have encountered or made myself. This article won't really have any structure.
- Unexpected side-effects
Recently, I came across the following code snippet:
```C++
class HoleDetector {
private:
std::vector<int> holeIds;
public:
bool isHole(int id) {
//
}
}
```
- HashMap that associates key to true value
- Not setting invariants in the constructor
- - Use for-each loop instead of classic for loop
+ - Use for-each loop instead of classic for loop when possible
? ++++++++++++++
+ - Use `std::numeric_limits<double>::lowest()` instead of `DBL_MIN` | 3 | 0.103448 | 2 | 1 |
40a66c58fab00f05c84602eaf258a455a532a77d | scala3doc-testcases/src/tests/objectSignatures.scala | scala3doc-testcases/src/tests/objectSignatures.scala | package tests
package objectSignatures
class A[T]
{
val a: String = "asd"
def method3() = "asd"
}
object A
trait C
object Base
object A2 extends A[String] with C
// We are not going to add final below
// final object B
| package tests
package objectSignatures
class A[T]
{
val a: String = "asd"
def method3() = "asd"
}
object A
trait C
object Base
object A2 extends A[String] with C
object <
object >
// We are not going to add final below
// final object B
| Add testcases to check uniqueness of page links for symbols with special characters | Add testcases to check uniqueness of page links for symbols with special characters
| Scala | apache-2.0 | lampepfl/dotty,dotty-staging/dotty,lampepfl/dotty,lampepfl/dotty,sjrd/dotty,sjrd/dotty,dotty-staging/dotty,dotty-staging/dotty,sjrd/dotty,lampepfl/dotty,dotty-staging/dotty,sjrd/dotty,dotty-staging/dotty,lampepfl/dotty,sjrd/dotty | scala | ## Code Before:
package tests
package objectSignatures
class A[T]
{
val a: String = "asd"
def method3() = "asd"
}
object A
trait C
object Base
object A2 extends A[String] with C
// We are not going to add final below
// final object B
## Instruction:
Add testcases to check uniqueness of page links for symbols with special characters
## Code After:
package tests
package objectSignatures
class A[T]
{
val a: String = "asd"
def method3() = "asd"
}
object A
trait C
object Base
object A2 extends A[String] with C
object <
object >
// We are not going to add final below
// final object B
| package tests
package objectSignatures
class A[T]
{
val a: String = "asd"
def method3() = "asd"
}
object A
trait C
object Base
object A2 extends A[String] with C
+ object <
+
+ object >
+
// We are not going to add final below
// final object B | 4 | 0.210526 | 4 | 0 |
dcd399ba3d9c1e3c1ce8a9f654e681d9791e0143 | libraries/index.css | libraries/index.css | justify-content: center;
}
#libraries-index .tabs {
flex-flow: column;
align-items: center;
}
#libraries-index .tabs:first-child {
margin-bottom: -1.5em;
}
#libraries-index .tabs.contains-title > h1 {
margin-left: 0;
margin-right: 0;
}
#libraries-index .tabs ul > li.contains-children {
align-items: center;
}
#libraries-index .tabs ul > li.contains-children > ul {
flex-flow: column;
font-size: initial;
}
#libraries-index .tabs ul > li.contains-children > ul > li {
text-align: left;
}
#libraries-index .tabs > ul > li > a {
font-size: larger;
font-weight: bold;
align-self: stretch;
}
#misc-index {
margin-top: 2em;
}
@media (max-width: 25em) {
#page-header h1 {
font-size: 13vw;
}
}
| flex-flow: column;
align-items: center;
}
#libraries-index .tabs:first-child {
margin-bottom: -1.5em;
}
#libraries-index .tabs.contains-title > h1 {
margin-left: 0;
margin-right: 0;
}
#libraries-index .tabs ul > li.contains-children {
align-items: center;
}
#libraries-index .tabs ul > li.contains-children > ul {
flex-flow: column;
font-size: initial;
}
#libraries-index .tabs ul > li.contains-children > ul > li {
text-align: left;
}
#libraries-index .tabs > ul > li > a {
font-size: larger;
font-weight: bold;
align-self: stretch;
}
#misc-index {
margin-top: 2em;
}
@media (max-width: 25em) {
#page-header h1 {
font-size: 13vw;
}
}
| Use new global support for centered tab groups. | Use new global support for centered tab groups.
| CSS | unlicense | Hexstream/hexstreamsoft.com,Hexstream/www.hexstreamsoft.com,Hexstream/www.hexstreamsoft.com,Hexstream/www.hexstreamsoft.com | css | ## Code Before:
justify-content: center;
}
#libraries-index .tabs {
flex-flow: column;
align-items: center;
}
#libraries-index .tabs:first-child {
margin-bottom: -1.5em;
}
#libraries-index .tabs.contains-title > h1 {
margin-left: 0;
margin-right: 0;
}
#libraries-index .tabs ul > li.contains-children {
align-items: center;
}
#libraries-index .tabs ul > li.contains-children > ul {
flex-flow: column;
font-size: initial;
}
#libraries-index .tabs ul > li.contains-children > ul > li {
text-align: left;
}
#libraries-index .tabs > ul > li > a {
font-size: larger;
font-weight: bold;
align-self: stretch;
}
#misc-index {
margin-top: 2em;
}
@media (max-width: 25em) {
#page-header h1 {
font-size: 13vw;
}
}
## Instruction:
Use new global support for centered tab groups.
## Code After:
flex-flow: column;
align-items: center;
}
#libraries-index .tabs:first-child {
margin-bottom: -1.5em;
}
#libraries-index .tabs.contains-title > h1 {
margin-left: 0;
margin-right: 0;
}
#libraries-index .tabs ul > li.contains-children {
align-items: center;
}
#libraries-index .tabs ul > li.contains-children > ul {
flex-flow: column;
font-size: initial;
}
#libraries-index .tabs ul > li.contains-children > ul > li {
text-align: left;
}
#libraries-index .tabs > ul > li > a {
font-size: larger;
font-weight: bold;
align-self: stretch;
}
#misc-index {
margin-top: 2em;
}
@media (max-width: 25em) {
#page-header h1 {
font-size: 13vw;
}
}
| - justify-content: center;
- }
-
- #libraries-index .tabs {
flex-flow: column;
align-items: center;
}
#libraries-index .tabs:first-child {
margin-bottom: -1.5em;
}
#libraries-index .tabs.contains-title > h1 {
margin-left: 0;
margin-right: 0;
}
#libraries-index .tabs ul > li.contains-children {
align-items: center;
}
#libraries-index .tabs ul > li.contains-children > ul {
flex-flow: column;
font-size: initial;
}
#libraries-index .tabs ul > li.contains-children > ul > li {
text-align: left;
}
#libraries-index .tabs > ul > li > a {
font-size: larger;
font-weight: bold;
align-self: stretch;
}
#misc-index {
margin-top: 2em;
}
@media (max-width: 25em) {
#page-header h1 {
font-size: 13vw;
}
} | 4 | 0.085106 | 0 | 4 |
5b869f49ae68196656d3806bc3e738a0e60dfb87 | CHANGES.md | CHANGES.md | 0.1.0 (2016-01-05):
* Initial public release
| 0.1.1 (2017-04-24)
------------------
* Switch from Oasis to topkg
* Switch from ctypes.foreign to ctypes.stubs in the tests
* Drop OCaml 4.00.0 support
0.1.0 (2016-01-05)
------------------
* Initial public release
| Update change log for 0.1.1. | Update change log for 0.1.1.
| Markdown | mit | yallop/ocaml-unix-type-representations | markdown | ## Code Before:
0.1.0 (2016-01-05):
* Initial public release
## Instruction:
Update change log for 0.1.1.
## Code After:
0.1.1 (2017-04-24)
------------------
* Switch from Oasis to topkg
* Switch from ctypes.foreign to ctypes.stubs in the tests
* Drop OCaml 4.00.0 support
0.1.0 (2016-01-05)
------------------
* Initial public release
| + 0.1.1 (2017-04-24)
+ ------------------
+ * Switch from Oasis to topkg
+ * Switch from ctypes.foreign to ctypes.stubs in the tests
+ * Drop OCaml 4.00.0 support
+
- 0.1.0 (2016-01-05):
? -
+ 0.1.0 (2016-01-05)
+ ------------------
* Initial public release | 9 | 4.5 | 8 | 1 |
5b087ed18a88f9c3ffe340eb31ee734e55ff5ecc | symlink_git_hooks.sh | symlink_git_hooks.sh | set -eo pipefail
cd .git/hooks/
SCRIPTS=$HOME/github.com/hkjn/scripts
ln -vs $SCRIPTS/pre-commit.sh pre-commit
ln -vs $SCRIPTS/pre-push.sh pre-push
| set -eo pipefail
cd .git/hooks/
SCRIPTS="$GOPATH/src/hkjn.me/scripts"
ln -vs $SCRIPTS/pre-commit.sh pre-commit
ln -vs $SCRIPTS/pre-push.sh pre-push
| Use GOPATH for scripts/ link | Use GOPATH for scripts/ link
| Shell | mit | hkjn/scripts,hkjn/scripts | shell | ## Code Before:
set -eo pipefail
cd .git/hooks/
SCRIPTS=$HOME/github.com/hkjn/scripts
ln -vs $SCRIPTS/pre-commit.sh pre-commit
ln -vs $SCRIPTS/pre-push.sh pre-push
## Instruction:
Use GOPATH for scripts/ link
## Code After:
set -eo pipefail
cd .git/hooks/
SCRIPTS="$GOPATH/src/hkjn.me/scripts"
ln -vs $SCRIPTS/pre-commit.sh pre-commit
ln -vs $SCRIPTS/pre-push.sh pre-push
| set -eo pipefail
cd .git/hooks/
- SCRIPTS=$HOME/github.com/hkjn/scripts
+ SCRIPTS="$GOPATH/src/hkjn.me/scripts"
ln -vs $SCRIPTS/pre-commit.sh pre-commit
ln -vs $SCRIPTS/pre-push.sh pre-push | 2 | 0.333333 | 1 | 1 |
6332d7c0b3e3dddfdc6f0a77c63bd2684dda3b69 | ocm-provider/index.php | ocm-provider/index.php | <?php
/**
* @copyright Copyright (c) 2018 Bjoern Schiessle <bjoern@schiessle.org>
*
* @license GNU AGPL version 3 or any later version
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*
*/
require_once __DIR__ . '/../lib/base.php';
header('Content-Type: application/json');
$server = \OC::$server;
$isEnabled = $server->getAppManager()->isEnabledForUser('cloud_federation_api');
if ($isEnabled) {
$capabilities = new OCA\CloudFederationAPI\Capabilities($server->getURLGenerator());
header('Content-Type: application/json');
echo json_encode($capabilities->getCapabilities()['ocm']);
} else {
header($_SERVER["SERVER_PROTOCOL"]." 501 Not Implemented", true, 501);
exit("501 Not Implemented");
}
| <?php
/**
* @copyright Copyright (c) 2018 Bjoern Schiessle <bjoern@schiessle.org>
*
* @license GNU AGPL version 3 or any later version
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*
*/
require_once __DIR__ . '/../lib/base.php';
header('Content-Type: application/json');
$server = \OC::$server;
$isEnabled = $server->getAppManager()->isEnabledForUser('cloud_federation_api');
if ($isEnabled) {
// Make sure the routes are loaded
\OC_App::loadApp('cloud_federation_api');
$capabilities = new OCA\CloudFederationAPI\Capabilities($server->getURLGenerator());
header('Content-Type: application/json');
echo json_encode($capabilities->getCapabilities()['ocm']);
} else {
header($_SERVER["SERVER_PROTOCOL"]." 501 Not Implemented", true, 501);
exit("501 Not Implemented");
}
| Make sure the cloud_federation_api routes are loaded | Make sure the cloud_federation_api routes are loaded
Signed-off-by: Joas Schilling <ab43a7c9cb5b2380afc4ddf8b3e2583169b39a02@schilljs.com>
| PHP | agpl-3.0 | andreas-p/nextcloud-server,nextcloud/server,andreas-p/nextcloud-server,andreas-p/nextcloud-server,nextcloud/server,nextcloud/server,nextcloud/server,andreas-p/nextcloud-server,andreas-p/nextcloud-server | php | ## Code Before:
<?php
/**
* @copyright Copyright (c) 2018 Bjoern Schiessle <bjoern@schiessle.org>
*
* @license GNU AGPL version 3 or any later version
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*
*/
require_once __DIR__ . '/../lib/base.php';
header('Content-Type: application/json');
$server = \OC::$server;
$isEnabled = $server->getAppManager()->isEnabledForUser('cloud_federation_api');
if ($isEnabled) {
$capabilities = new OCA\CloudFederationAPI\Capabilities($server->getURLGenerator());
header('Content-Type: application/json');
echo json_encode($capabilities->getCapabilities()['ocm']);
} else {
header($_SERVER["SERVER_PROTOCOL"]." 501 Not Implemented", true, 501);
exit("501 Not Implemented");
}
## Instruction:
Make sure the cloud_federation_api routes are loaded
Signed-off-by: Joas Schilling <ab43a7c9cb5b2380afc4ddf8b3e2583169b39a02@schilljs.com>
## Code After:
<?php
/**
* @copyright Copyright (c) 2018 Bjoern Schiessle <bjoern@schiessle.org>
*
* @license GNU AGPL version 3 or any later version
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*
*/
require_once __DIR__ . '/../lib/base.php';
header('Content-Type: application/json');
$server = \OC::$server;
$isEnabled = $server->getAppManager()->isEnabledForUser('cloud_federation_api');
if ($isEnabled) {
// Make sure the routes are loaded
\OC_App::loadApp('cloud_federation_api');
$capabilities = new OCA\CloudFederationAPI\Capabilities($server->getURLGenerator());
header('Content-Type: application/json');
echo json_encode($capabilities->getCapabilities()['ocm']);
} else {
header($_SERVER["SERVER_PROTOCOL"]." 501 Not Implemented", true, 501);
exit("501 Not Implemented");
}
| <?php
/**
* @copyright Copyright (c) 2018 Bjoern Schiessle <bjoern@schiessle.org>
*
* @license GNU AGPL version 3 or any later version
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*
*/
require_once __DIR__ . '/../lib/base.php';
header('Content-Type: application/json');
$server = \OC::$server;
$isEnabled = $server->getAppManager()->isEnabledForUser('cloud_federation_api');
if ($isEnabled) {
+ // Make sure the routes are loaded
+ \OC_App::loadApp('cloud_federation_api');
$capabilities = new OCA\CloudFederationAPI\Capabilities($server->getURLGenerator());
header('Content-Type: application/json');
echo json_encode($capabilities->getCapabilities()['ocm']);
} else {
header($_SERVER["SERVER_PROTOCOL"]." 501 Not Implemented", true, 501);
exit("501 Not Implemented");
} | 2 | 0.052632 | 2 | 0 |
95e30fb8c7e00a1e494392d22b1bede70978d587 | app/interactors/handle_policy_notification/create_policy_or_update_continuation.rb | app/interactors/handle_policy_notification/create_policy_or_update_continuation.rb | module HandlePolicyNotification
# Create or update the 'main' policy in glue to bring the data
# into line with the changes we are about to transmit.
class CreatePolicyOrUpdateContinuation
include Interactor
# Context requires:
# - primary_policy_action (a HandlePolicyNotification::PolicyAction)
def call
primary_action = context.primary_policy_action
member_details = primary_action.member_detail_collection
enrollees = member_details.map do |md|
Enrollee.new(md.enrollee_attributes)
end
Policy.create!(primary_action.new_policy_attributes.merge({
:enrollees => enrollees
}))
end
end
end
| module HandlePolicyNotification
# Create or update the 'main' policy in glue to bring the data
# into line with the changes we are about to transmit.
class CreatePolicyOrUpdateContinuation
include Interactor
# Context requires:
# - primary_policy_action (a HandlePolicyNotification::PolicyAction)
def call
# Don't build me for now
=begin
primary_action = context.primary_policy_action
member_details = primary_action.member_detail_collection
raise member_details.inspect
enrollees = member_details.map do |md|
Enrollee.new(md.enrollee_attributes)
end
Policy.create!(primary_action.new_policy_attributes.merge({
:enrollees => enrollees
}))
=end
end
end
end
| Disable db munging for now. | Disable db munging for now.
| Ruby | mit | blackeaglejs/gluedb,blackeaglejs/gluedb,blackeaglejs/gluedb,blackeaglejs/gluedb | ruby | ## Code Before:
module HandlePolicyNotification
# Create or update the 'main' policy in glue to bring the data
# into line with the changes we are about to transmit.
class CreatePolicyOrUpdateContinuation
include Interactor
# Context requires:
# - primary_policy_action (a HandlePolicyNotification::PolicyAction)
def call
primary_action = context.primary_policy_action
member_details = primary_action.member_detail_collection
enrollees = member_details.map do |md|
Enrollee.new(md.enrollee_attributes)
end
Policy.create!(primary_action.new_policy_attributes.merge({
:enrollees => enrollees
}))
end
end
end
## Instruction:
Disable db munging for now.
## Code After:
module HandlePolicyNotification
# Create or update the 'main' policy in glue to bring the data
# into line with the changes we are about to transmit.
class CreatePolicyOrUpdateContinuation
include Interactor
# Context requires:
# - primary_policy_action (a HandlePolicyNotification::PolicyAction)
def call
# Don't build me for now
=begin
primary_action = context.primary_policy_action
member_details = primary_action.member_detail_collection
raise member_details.inspect
enrollees = member_details.map do |md|
Enrollee.new(md.enrollee_attributes)
end
Policy.create!(primary_action.new_policy_attributes.merge({
:enrollees => enrollees
}))
=end
end
end
end
| module HandlePolicyNotification
# Create or update the 'main' policy in glue to bring the data
# into line with the changes we are about to transmit.
class CreatePolicyOrUpdateContinuation
include Interactor
# Context requires:
# - primary_policy_action (a HandlePolicyNotification::PolicyAction)
def call
+ # Don't build me for now
+ =begin
primary_action = context.primary_policy_action
member_details = primary_action.member_detail_collection
+ raise member_details.inspect
enrollees = member_details.map do |md|
Enrollee.new(md.enrollee_attributes)
end
Policy.create!(primary_action.new_policy_attributes.merge({
:enrollees => enrollees
}))
+ =end
end
end
end | 4 | 0.2 | 4 | 0 |
50941019c3a55bfd03bcf4b0bedfc00a5ff58564 | public/examples/Reactive/Draw.elm | public/examples/Reactive/Draw.elm | -- Try this out on an iOS or Android device by removing "edit"
-- from the URL: http://elm-lang.org/examples/Reactive/Touches.elm
import Dict
import Touch
import Window
main = lift2 scene Window.dimensions
(Dict.values <~ foldp addN Dict.empty Touch.touches)
addN : [Touch.Touch] -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
addN ts dict = foldl add1 dict ts
add1 : Touch.Touch -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
add1 t d = let vs = Dict.findWithDefault [] t.id d
in Dict.insert t.id ((t.x,t.y) :: vs) d
scene : (Int,Int) -> [[(Int,Int)]] -> Element
scene (w,h) paths =
let float (a,b) = (toFloat a, toFloat -b)
pathForms = group (map (traced thickLine . path . map float) paths)
picture = collage w h [ move (float (-w `div` 2, -h `div` 2)) pathForms ]
in layers [ picture, message ]
thickLine : LineStyle
thickLine = { defaultLine | color <- rgba 123 123 123 0.3, width <- 8 }
message = [markdown|
<a href="/examples/Reactive/Draw.elm" target="_top">Try it fullscreen</a>
if you are on iOS or Android.
|]
|
import Dict
import Touch
import Window
main = lift2 scene Window.dimensions
(Dict.values <~ foldp addN Dict.empty Touch.touches)
addN : [Touch.Touch] -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
addN ts dict = foldl add1 dict ts
add1 : Touch.Touch -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
add1 t d = let vs = Dict.findWithDefault [] t.id d
in Dict.insert t.id ((t.x,t.y) :: vs) d
scene : (Int,Int) -> [[(Int,Int)]] -> Element
scene (w,h) paths =
let float (a,b) = (toFloat a, toFloat -b)
pathForms = group (map (traced thickLine . path . map float) paths)
picture = collage w h [ move (float (-w `div` 2, -h `div` 2)) pathForms ]
in layers [ picture, message ]
thickLine : LineStyle
thickLine = { defaultLine | color <- rgba 123 123 123 0.3, width <- 8 }
message = [markdown|
<a href="/examples/Reactive/Draw.elm" target="_top">Try it fullscreen</a>
if you are on iOS or Android.
|]
| Remove incorrect and redundant comment | Remove incorrect and redundant comment
Thanks @jvoigtlaender!
| Elm | bsd-3-clause | liubko/elm-lang.org,jvoigtlaender/elm-lang.org,yang-wei/elm-lang.org,rtfeldman/elm-lang.org,frankschmitt/elm-lang.org,kevinrood/elm-lang.org,randomer/elm-lang.org,jazmit/elm-lang.org,elm-lang/debug.elm-lang.org,amitaibu/elm-lang.org,randomer/elm-lang.org,elm-lang/debug.elm-lang.org,yang-wei/elm-lang.org,FranklinChen/elm-lang.org,inchingforward/elm-lang.org,inchingforward/elm-lang.org,coreyhaines/elm-lang.org,kevinrood/elm-lang.org,ThomasWeiser/elm-lang.org,rtorr/elm-lang.org,amitaibu/elm-lang.org,ThomasWeiser/elm-lang.org,frankschmitt/elm-lang.org,FranklinChen/elm-lang.org,liubko/elm-lang.org,elm-guides/elm-lang.org,akeeton/elm-lang.org,coreyhaines/elm-lang.org,elm-guides/elm-lang.org,akeeton/elm-lang.org,jazmit/elm-lang.org,rtfeldman/elm-lang.org,eeue56/elm-lang.org,eeue56/elm-lang.org,avh4/elm-lang.org,jvoigtlaender/elm-lang.org,semaj/jameslarisch.com-elm,rtorr/elm-lang.org | elm | ## Code Before:
-- Try this out on an iOS or Android device by removing "edit"
-- from the URL: http://elm-lang.org/examples/Reactive/Touches.elm
import Dict
import Touch
import Window
main = lift2 scene Window.dimensions
(Dict.values <~ foldp addN Dict.empty Touch.touches)
addN : [Touch.Touch] -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
addN ts dict = foldl add1 dict ts
add1 : Touch.Touch -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
add1 t d = let vs = Dict.findWithDefault [] t.id d
in Dict.insert t.id ((t.x,t.y) :: vs) d
scene : (Int,Int) -> [[(Int,Int)]] -> Element
scene (w,h) paths =
let float (a,b) = (toFloat a, toFloat -b)
pathForms = group (map (traced thickLine . path . map float) paths)
picture = collage w h [ move (float (-w `div` 2, -h `div` 2)) pathForms ]
in layers [ picture, message ]
thickLine : LineStyle
thickLine = { defaultLine | color <- rgba 123 123 123 0.3, width <- 8 }
message = [markdown|
<a href="/examples/Reactive/Draw.elm" target="_top">Try it fullscreen</a>
if you are on iOS or Android.
|]
## Instruction:
Remove incorrect and redundant comment
Thanks @jvoigtlaender!
## Code After:
import Dict
import Touch
import Window
main = lift2 scene Window.dimensions
(Dict.values <~ foldp addN Dict.empty Touch.touches)
addN : [Touch.Touch] -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
addN ts dict = foldl add1 dict ts
add1 : Touch.Touch -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
add1 t d = let vs = Dict.findWithDefault [] t.id d
in Dict.insert t.id ((t.x,t.y) :: vs) d
scene : (Int,Int) -> [[(Int,Int)]] -> Element
scene (w,h) paths =
let float (a,b) = (toFloat a, toFloat -b)
pathForms = group (map (traced thickLine . path . map float) paths)
picture = collage w h [ move (float (-w `div` 2, -h `div` 2)) pathForms ]
in layers [ picture, message ]
thickLine : LineStyle
thickLine = { defaultLine | color <- rgba 123 123 123 0.3, width <- 8 }
message = [markdown|
<a href="/examples/Reactive/Draw.elm" target="_top">Try it fullscreen</a>
if you are on iOS or Android.
|]
| - -- Try this out on an iOS or Android device by removing "edit"
- -- from the URL: http://elm-lang.org/examples/Reactive/Touches.elm
import Dict
import Touch
import Window
main = lift2 scene Window.dimensions
(Dict.values <~ foldp addN Dict.empty Touch.touches)
addN : [Touch.Touch] -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
addN ts dict = foldl add1 dict ts
add1 : Touch.Touch -> Dict.Dict Int [(Int,Int)] -> Dict.Dict Int [(Int,Int)]
add1 t d = let vs = Dict.findWithDefault [] t.id d
in Dict.insert t.id ((t.x,t.y) :: vs) d
scene : (Int,Int) -> [[(Int,Int)]] -> Element
scene (w,h) paths =
let float (a,b) = (toFloat a, toFloat -b)
pathForms = group (map (traced thickLine . path . map float) paths)
picture = collage w h [ move (float (-w `div` 2, -h `div` 2)) pathForms ]
in layers [ picture, message ]
thickLine : LineStyle
thickLine = { defaultLine | color <- rgba 123 123 123 0.3, width <- 8 }
message = [markdown|
<a href="/examples/Reactive/Draw.elm" target="_top">Try it fullscreen</a>
if you are on iOS or Android.
|]
| 2 | 0.058824 | 0 | 2 |
6ead56f0cc7c82d7ecab9a41a0e36d2502005e57 | actions/norm-names.js | actions/norm-names.js | registerAction(function (node) {
if (!settings["fixNames"]) return;
node = node || document.body;
toArray(node.querySelectorAll(".content > .l_profile, .lbody > .l_profile, .name > .l_profile, .foaf > .l_profile"))
.forEach(function (node) {
var login = node.getAttribute("href").substr(1);
if (node.innerHTML != login) {
node.dataset["displayName"] = node.innerHTML;
node.innerHTML = login;
}
});
// В попапе показываем оригинальное имя
toArray(node.querySelectorAll("#popup .name > .l_profile"))
.forEach(function (node) {
node.innerHTML = node.dataset["displayName"];
});
});
| registerAction(function (node) {
if (!settings["fixNames"]) return;
node = node || document.body;
toArray(node.querySelectorAll(".content > .l_profile, .lbody > .l_profile, .name > .l_profile, .foaf > .l_profile"))
.forEach(function (node) {
var login = node.getAttribute("href").substr(1);
if (node.innerHTML != login) {
node.dataset["displayName"] = node.innerHTML;
node.innerHTML = login;
}
});
    // Show the original name in the popup
toArray(node.querySelectorAll("#popup .name > .l_profile"))
.forEach(function (node) {
var sn = node.dataset["displayName"];
if (sn !== undefined) node.innerHTML = node.dataset["displayName"];
});
});
| FIX Users without a display name were handled incorrectly | FIX Users without a display name were handled incorrectly
| JavaScript | mit | davidmz/friendfeed-and-co,davidmz/friendfeed-and-co | javascript | ## Code Before:
registerAction(function (node) {
if (!settings["fixNames"]) return;
node = node || document.body;
toArray(node.querySelectorAll(".content > .l_profile, .lbody > .l_profile, .name > .l_profile, .foaf > .l_profile"))
.forEach(function (node) {
var login = node.getAttribute("href").substr(1);
if (node.innerHTML != login) {
node.dataset["displayName"] = node.innerHTML;
node.innerHTML = login;
}
});
    // Show the original name in the popup
toArray(node.querySelectorAll("#popup .name > .l_profile"))
.forEach(function (node) {
node.innerHTML = node.dataset["displayName"];
});
});
## Instruction:
FIX Users without a display name were handled incorrectly
## Code After:
registerAction(function (node) {
if (!settings["fixNames"]) return;
node = node || document.body;
toArray(node.querySelectorAll(".content > .l_profile, .lbody > .l_profile, .name > .l_profile, .foaf > .l_profile"))
.forEach(function (node) {
var login = node.getAttribute("href").substr(1);
if (node.innerHTML != login) {
node.dataset["displayName"] = node.innerHTML;
node.innerHTML = login;
}
});
    // Show the original name in the popup
toArray(node.querySelectorAll("#popup .name > .l_profile"))
.forEach(function (node) {
var sn = node.dataset["displayName"];
if (sn !== undefined) node.innerHTML = node.dataset["displayName"];
});
});
| registerAction(function (node) {
if (!settings["fixNames"]) return;
node = node || document.body;
toArray(node.querySelectorAll(".content > .l_profile, .lbody > .l_profile, .name > .l_profile, .foaf > .l_profile"))
.forEach(function (node) {
var login = node.getAttribute("href").substr(1);
if (node.innerHTML != login) {
node.dataset["displayName"] = node.innerHTML;
node.innerHTML = login;
}
});
    // Show the original name in the popup
toArray(node.querySelectorAll("#popup .name > .l_profile"))
.forEach(function (node) {
+ var sn = node.dataset["displayName"];
- node.innerHTML = node.dataset["displayName"];
+ if (sn !== undefined) node.innerHTML = node.dataset["displayName"];
? ++++++++++++++++++++++
});
}); | 3 | 0.157895 | 2 | 1 |
b5a0b4d4750bec3c80f50f4653aed37603bcd679 | app/jobs/fetch_link_data_job.rb | app/jobs/fetch_link_data_job.rb | class FetchLinkDataJob < ApplicationJob
queue_as :http_service
def perform(post_id, url, retry_count=1)
post = Post.find(post_id)
post.link_crawler_cache = LinkCrawlerCache.find_or_create_by(url: url)
post.save
rescue ActiveRecord::RecordNotFound
FetchLinkDataJob.set(wait: 1.minute).
perform_later(post_id, url, retry_count+1) unless retry_count > 3
rescue LinkThumbnailer::Exceptions
FetchLinkDataJob.set(wait: 1.minute).
perform_later(post_id, url, retry_count+1) unless retry_count > 3
end
end
| class FetchLinkDataJob < ApplicationJob
queue_as :http_service
def perform(post_id, url, retry_count=1)
post = Post.find(post_id)
post.link_crawler_cache = LinkCrawlerCache.find_or_create_by(url: url)
post.save
rescue ActiveRecord::RecordNotFound
FetchLinkDataJob.set(wait: 1.minute).
perform_later(post_id, url, retry_count+1) unless retry_count > 3
rescue LinkThumbnailer::Exceptions
FetchLinkDataJob.set(wait: retry_count * 1.minute).
perform_later(post_id, url, retry_count+1) unless retry_count > 3
end
end
| Update wait time on LinkThumbnailer::Exceptions | Update wait time on LinkThumbnailer::Exceptions
| Ruby | mit | exsules/exsules-api,exsules/exsules-api | ruby | ## Code Before:
class FetchLinkDataJob < ApplicationJob
queue_as :http_service
def perform(post_id, url, retry_count=1)
post = Post.find(post_id)
post.link_crawler_cache = LinkCrawlerCache.find_or_create_by(url: url)
post.save
rescue ActiveRecord::RecordNotFound
FetchLinkDataJob.set(wait: 1.minute).
perform_later(post_id, url, retry_count+1) unless retry_count > 3
rescue LinkThumbnailer::Exceptions
FetchLinkDataJob.set(wait: 1.minute).
perform_later(post_id, url, retry_count+1) unless retry_count > 3
end
end
## Instruction:
Update wait time on LinkThumbnailer::Exceptions
## Code After:
class FetchLinkDataJob < ApplicationJob
queue_as :http_service
def perform(post_id, url, retry_count=1)
post = Post.find(post_id)
post.link_crawler_cache = LinkCrawlerCache.find_or_create_by(url: url)
post.save
rescue ActiveRecord::RecordNotFound
FetchLinkDataJob.set(wait: 1.minute).
perform_later(post_id, url, retry_count+1) unless retry_count > 3
rescue LinkThumbnailer::Exceptions
FetchLinkDataJob.set(wait: retry_count * 1.minute).
perform_later(post_id, url, retry_count+1) unless retry_count > 3
end
end
| class FetchLinkDataJob < ApplicationJob
queue_as :http_service
def perform(post_id, url, retry_count=1)
post = Post.find(post_id)
post.link_crawler_cache = LinkCrawlerCache.find_or_create_by(url: url)
post.save
rescue ActiveRecord::RecordNotFound
FetchLinkDataJob.set(wait: 1.minute).
perform_later(post_id, url, retry_count+1) unless retry_count > 3
rescue LinkThumbnailer::Exceptions
- FetchLinkDataJob.set(wait: 1.minute).
+ FetchLinkDataJob.set(wait: retry_count * 1.minute).
? ++++++++++++++
perform_later(post_id, url, retry_count+1) unless retry_count > 3
end
end | 2 | 0.133333 | 1 | 1 |
d51a843e91150edd506614fdf78502f0f1441ce9 | spec/integration/veritas/finding_one_object_spec.rb | spec/integration/veritas/finding_one_object_spec.rb | require 'spec_helper_integration'
describe 'Finding One Object' do
include_context 'Models and Mappers'
before(:all) do
setup_db
insert_user 1, 'John', 18
insert_user 2, 'Jane', 21
insert_user 3, 'Jane', 22
insert_user 4, 'Piotr', 20
user_mapper
end
it 'finds one object matching search criteria' do
user = DM_ENV[user_model].one(:name => 'Jane', :age => 22)
user.should be_instance_of(user_model)
user.name.should eql('Jane')
user.age.should eql(22)
end
it 'raises an exception if more than one objects were found' do
expect { DM_ENV[user_model].one(:name => 'Jane') }.to raise_error(
"#{DM_ENV[user_model]}#one returned more than one result")
end
end
| require 'spec_helper_integration'
describe 'Finding One Object' do
include_context 'Models and Mappers'
before(:all) do
setup_db
insert_user 1, 'John', 18
insert_user 2, 'Jane', 21
insert_user 3, 'Jane', 22
insert_user 4, 'Piotr', 20
user_mapper
end
it 'finds one object matching search criteria' do
user = DM_ENV[user_model].one(:name => 'Jane', :age => 22)
user.should be_instance_of(user_model)
user.name.should eql('Jane')
user.age.should eql(22)
end
it 'raises an exception if more than one objects were found' do
expect { DM_ENV[user_model].one(:name => 'Jane') }.to raise_error(
ManyTuplesError, "one tuple expected, but 2 were returned")
end
end
| Fix veritas integration specs for "finding one object" | Fix veritas integration specs for "finding one object" | Ruby | mit | solnic/rom-relation | ruby | ## Code Before:
require 'spec_helper_integration'
describe 'Finding One Object' do
include_context 'Models and Mappers'
before(:all) do
setup_db
insert_user 1, 'John', 18
insert_user 2, 'Jane', 21
insert_user 3, 'Jane', 22
insert_user 4, 'Piotr', 20
user_mapper
end
it 'finds one object matching search criteria' do
user = DM_ENV[user_model].one(:name => 'Jane', :age => 22)
user.should be_instance_of(user_model)
user.name.should eql('Jane')
user.age.should eql(22)
end
it 'raises an exception if more than one objects were found' do
expect { DM_ENV[user_model].one(:name => 'Jane') }.to raise_error(
"#{DM_ENV[user_model]}#one returned more than one result")
end
end
## Instruction:
Fix veritas integration specs for "finding one object"
## Code After:
require 'spec_helper_integration'
describe 'Finding One Object' do
include_context 'Models and Mappers'
before(:all) do
setup_db
insert_user 1, 'John', 18
insert_user 2, 'Jane', 21
insert_user 3, 'Jane', 22
insert_user 4, 'Piotr', 20
user_mapper
end
it 'finds one object matching search criteria' do
user = DM_ENV[user_model].one(:name => 'Jane', :age => 22)
user.should be_instance_of(user_model)
user.name.should eql('Jane')
user.age.should eql(22)
end
it 'raises an exception if more than one objects were found' do
expect { DM_ENV[user_model].one(:name => 'Jane') }.to raise_error(
ManyTuplesError, "one tuple expected, but 2 were returned")
end
end
| require 'spec_helper_integration'
describe 'Finding One Object' do
include_context 'Models and Mappers'
before(:all) do
setup_db
insert_user 1, 'John', 18
insert_user 2, 'Jane', 21
insert_user 3, 'Jane', 22
insert_user 4, 'Piotr', 20
user_mapper
end
it 'finds one object matching search criteria' do
user = DM_ENV[user_model].one(:name => 'Jane', :age => 22)
user.should be_instance_of(user_model)
user.name.should eql('Jane')
user.age.should eql(22)
end
it 'raises an exception if more than one objects were found' do
expect { DM_ENV[user_model].one(:name => 'Jane') }.to raise_error(
- "#{DM_ENV[user_model]}#one returned more than one result")
+ ManyTuplesError, "one tuple expected, but 2 were returned")
end
end | 2 | 0.066667 | 1 | 1 |
115e299d4ea176099ec7c14af0404ad84deedc63 | decisioncenter/script/addDCApplications.sh | decisioncenter/script/addDCApplications.sh |
applicationXml=$1
if [ ! ${applicationXml} ]; then
applicationXml="/config/application.xml"
fi
function addApplication {
if [ -e "${APPS}/$1.war" ]
then
cat "/config/application-$1.xml" >> ${applicationXml}
fi
}
# Begin - Add DC Apps
touch ${applicationXml}
echo "<server>" >> ${applicationXml}
cat ${applicationXml}
addApplication decisioncenter
addApplication decisionmodel
addApplication teamserver
addApplication decisioncenter-api
addApplication teamserver-dbdump
echo "</server>" >> ${applicationXml}
cat ${applicationXml}
# End - Add DC Apps
|
applicationXml=$1
if [ ! ${applicationXml} ]; then
applicationXml="/config/application.xml"
fi
function addApplication {
if [ -e "${APPS}/$1.war" ]
then
cat "/config/application-$1.xml" >> ${applicationXml}
fi
}
# Begin - Add DC Apps
touch ${applicationXml}
echo "<server>" >> ${applicationXml}
addApplication decisioncenter
addApplication decisionmodel
addApplication teamserver
addApplication decisioncenter-api
addApplication teamserver-dbdump
echo "</server>" >> ${applicationXml}
# End - Add DC Apps
| Fix RTC Defect : 115704 | Fix RTC Defect : 115704
| Shell | mit | lgrateau/ODM-on-Docker,lgrateau/ODM-on-Docker,lgrateau/odm-ondocker,lgrateau/odm-ondocker,lgrateau/odm-ondocker | shell | ## Code Before:
applicationXml=$1
if [ ! ${applicationXml} ]; then
applicationXml="/config/application.xml"
fi
function addApplication {
if [ -e "${APPS}/$1.war" ]
then
cat "/config/application-$1.xml" >> ${applicationXml}
fi
}
# Begin - Add DC Apps
touch ${applicationXml}
echo "<server>" >> ${applicationXml}
cat ${applicationXml}
addApplication decisioncenter
addApplication decisionmodel
addApplication teamserver
addApplication decisioncenter-api
addApplication teamserver-dbdump
echo "</server>" >> ${applicationXml}
cat ${applicationXml}
# End - Add DC Apps
## Instruction:
Fix RTC Defect : 115704
## Code After:
applicationXml=$1
if [ ! ${applicationXml} ]; then
applicationXml="/config/application.xml"
fi
function addApplication {
if [ -e "${APPS}/$1.war" ]
then
cat "/config/application-$1.xml" >> ${applicationXml}
fi
}
# Begin - Add DC Apps
touch ${applicationXml}
echo "<server>" >> ${applicationXml}
addApplication decisioncenter
addApplication decisionmodel
addApplication teamserver
addApplication decisioncenter-api
addApplication teamserver-dbdump
echo "</server>" >> ${applicationXml}
# End - Add DC Apps
|
applicationXml=$1
if [ ! ${applicationXml} ]; then
applicationXml="/config/application.xml"
fi
function addApplication {
if [ -e "${APPS}/$1.war" ]
then
cat "/config/application-$1.xml" >> ${applicationXml}
fi
}
# Begin - Add DC Apps
touch ${applicationXml}
echo "<server>" >> ${applicationXml}
- cat ${applicationXml}
addApplication decisioncenter
addApplication decisionmodel
addApplication teamserver
addApplication decisioncenter-api
addApplication teamserver-dbdump
echo "</server>" >> ${applicationXml}
- cat ${applicationXml}
+
# End - Add DC Apps | 3 | 0.096774 | 1 | 2 |
0558a38c82b663756a68530c374c5ed4eec9e97a | src/main/resources/webapp/WEB-INF/jsp/sidebar/navigation.jsp | src/main/resources/webapp/WEB-INF/jsp/sidebar/navigation.jsp | <%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>
<c:url var="homeUrl" value="/" />
<c:url var="aboutUrl" value="/legal/about" />
<div class="sectionHeading">Go...</div>
<div class="sectionContent">
<ul class="nav">
<li><a href="${homeUrl}">Home</a></li>
<li class="navbreak"><a href="https://bintray.com/insideo/randomcoder-release">Download</a></li>
<li><a href="https://github.com/search?q=user%3Ainsideo+randomcoder-">Git repositories</a></li>
<li class="navbreak"><a href="${aboutUrl}">About this site</a></li>
</ul>
</div>
| <%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>
<c:url var="homeUrl" value="/" />
<c:url var="aboutUrl" value="/legal/about" />
<div class="sectionHeading">Go...</div>
<div class="sectionContent">
<ul class="nav">
<li><a href="${homeUrl}">Home</a></li>
<li class="navbreak"><a href="https://bintray.com/insideo/randomcoder-release" class="external">Download</a></li>
<li><a href="https://github.com/search?q=user%3Ainsideo+randomcoder-" class="external">Git repositories</a></li>
<li class="navbreak"><a href="${aboutUrl}">About this site</a></li>
</ul>
</div>
| Make download / git repo links external | Make download / git repo links external
| Java Server Pages | bsd-2-clause | insideo/randomcoder-website,insideo/randomcoder-website,insideo/randomcoder-website | java-server-pages | ## Code Before:
<%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>
<c:url var="homeUrl" value="/" />
<c:url var="aboutUrl" value="/legal/about" />
<div class="sectionHeading">Go...</div>
<div class="sectionContent">
<ul class="nav">
<li><a href="${homeUrl}">Home</a></li>
<li class="navbreak"><a href="https://bintray.com/insideo/randomcoder-release">Download</a></li>
<li><a href="https://github.com/search?q=user%3Ainsideo+randomcoder-">Git repositories</a></li>
<li class="navbreak"><a href="${aboutUrl}">About this site</a></li>
</ul>
</div>
## Instruction:
Make download / git repo links external
## Code After:
<%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>
<c:url var="homeUrl" value="/" />
<c:url var="aboutUrl" value="/legal/about" />
<div class="sectionHeading">Go...</div>
<div class="sectionContent">
<ul class="nav">
<li><a href="${homeUrl}">Home</a></li>
<li class="navbreak"><a href="https://bintray.com/insideo/randomcoder-release" class="external">Download</a></li>
<li><a href="https://github.com/search?q=user%3Ainsideo+randomcoder-" class="external">Git repositories</a></li>
<li class="navbreak"><a href="${aboutUrl}">About this site</a></li>
</ul>
</div>
| <%@ taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c" %>
<c:url var="homeUrl" value="/" />
<c:url var="aboutUrl" value="/legal/about" />
<div class="sectionHeading">Go...</div>
<div class="sectionContent">
<ul class="nav">
<li><a href="${homeUrl}">Home</a></li>
- <li class="navbreak"><a href="https://bintray.com/insideo/randomcoder-release">Download</a></li>
+ <li class="navbreak"><a href="https://bintray.com/insideo/randomcoder-release" class="external">Download</a></li>
? +++++++++++++++++
- <li><a href="https://github.com/search?q=user%3Ainsideo+randomcoder-">Git repositories</a></li>
+ <li><a href="https://github.com/search?q=user%3Ainsideo+randomcoder-" class="external">Git repositories</a></li>
? +++++++++++++++++
<li class="navbreak"><a href="${aboutUrl}">About this site</a></li>
</ul>
</div> | 4 | 0.333333 | 2 | 2 |
9822c184d44e728fcf79aa911498cb59df9d70c4 | recipes/_plugin_manager.rb | recipes/_plugin_manager.rb | add_vim_git_plugin({ "vundle" => "https://github.com/gmarik/vundle.git",
"unbundle" => "https://github.com/sunaku/vim-unbundle.git",
"pathogen" => "https://github.com/tpope/vim-pathogen.git" }[node[:vim_config][:plugin_manager].to_s])
| add_vim_git_plugin({ "vundle" => "https://github.com/gmarik/vundle.git",
"unbundle" => "https://github.com/sunaku/vim-unbundle.git",
"pathogen" => "https://github.com/tpope/vim-pathogen.git" }[node[:vim_config][:plugin_manager].to_s])
case node[:vim_config][:plugin_manager]
when "pathogen"
node.override[:vim_config][:vimrc][:config][:system_wide] = node[:vim_config][:vimrc][:config][:system_wide].dup << "" << "source /etc/vim/bundle/vim-pathogen/autoload/pathogen.vim" << "execute pathogen#infect('/etc/vim/bundle/{}', 'bundle/{}')"
end
| Configure pathogen for autoload system-wide bundles | Configure pathogen for autoload system-wide bundles
| Ruby | mit | promisedlandt/cookbook-vim_config,promisedlandt/cookbook-vim_config | ruby | ## Code Before:
add_vim_git_plugin({ "vundle" => "https://github.com/gmarik/vundle.git",
"unbundle" => "https://github.com/sunaku/vim-unbundle.git",
"pathogen" => "https://github.com/tpope/vim-pathogen.git" }[node[:vim_config][:plugin_manager].to_s])
## Instruction:
Configure pathogen for autoload system-wide bundles
## Code After:
add_vim_git_plugin({ "vundle" => "https://github.com/gmarik/vundle.git",
"unbundle" => "https://github.com/sunaku/vim-unbundle.git",
"pathogen" => "https://github.com/tpope/vim-pathogen.git" }[node[:vim_config][:plugin_manager].to_s])
case node[:vim_config][:plugin_manager]
when "pathogen"
node.override[:vim_config][:vimrc][:config][:system_wide] = node[:vim_config][:vimrc][:config][:system_wide].dup << "" << "source /etc/vim/bundle/vim-pathogen/autoload/pathogen.vim" << "execute pathogen#infect('/etc/vim/bundle/{}', 'bundle/{}')"
end
| add_vim_git_plugin({ "vundle" => "https://github.com/gmarik/vundle.git",
"unbundle" => "https://github.com/sunaku/vim-unbundle.git",
"pathogen" => "https://github.com/tpope/vim-pathogen.git" }[node[:vim_config][:plugin_manager].to_s])
+
+ case node[:vim_config][:plugin_manager]
+ when "pathogen"
+ node.override[:vim_config][:vimrc][:config][:system_wide] = node[:vim_config][:vimrc][:config][:system_wide].dup << "" << "source /etc/vim/bundle/vim-pathogen/autoload/pathogen.vim" << "execute pathogen#infect('/etc/vim/bundle/{}', 'bundle/{}')"
+ end | 5 | 1.666667 | 5 | 0 |
b86bfcfeeeb61b2f4f60164e18a974129c899bec | packages/lesswrong/components/users/UsersSingle.jsx | packages/lesswrong/components/users/UsersSingle.jsx | import { Components, registerComponent, Utils } from 'meteor/vulcan:core';
import React from 'react';
import { withRouter } from 'react-router';
import Users from "meteor/vulcan:users";
const UsersSingle = ({params, router}) => {
const slug = Utils.slugify(params.slug);
const canonicalUrl = Users.getProfileUrlFromSlug(slug);
if (router.location.pathname !== canonicalUrl) {
router.replace(canonicalUrl);
return null;
} else {
return <Components.UsersProfile userId={params._id} slug={slug} />
}
};
UsersSingle.displayName = "UsersSingle";
registerComponent('UsersSingle', UsersSingle, withRouter);
| import { Components, registerComponent, Utils } from 'meteor/vulcan:core';
import React from 'react';
import { withRouter } from 'react-router';
import Users from "meteor/vulcan:users";
const UsersSingle = ({params, router}) => {
const slug = Utils.slugify(params.slug);
const canonicalUrl = Users.getProfileUrlFromSlug(slug);
if (router.location.pathname !== canonicalUrl) {
// A Javascript redirect, which replaces the history entry (so you don't
// have a redirector interfering with the back button). Does not cause a
// pageload.
router.replace(canonicalUrl);
return null;
} else {
return <Components.UsersProfile userId={params._id} slug={slug} />
}
};
UsersSingle.displayName = "UsersSingle";
registerComponent('UsersSingle', UsersSingle, withRouter);
| Comment about what router.replace does | Comment about what router.replace does
| JSX | mit | Discordius/Telescope,Discordius/Lesswrong2,Discordius/Lesswrong2,Discordius/Telescope,Discordius/Lesswrong2,Discordius/Lesswrong2,Discordius/Telescope,Discordius/Telescope | jsx | ## Code Before:
import { Components, registerComponent, Utils } from 'meteor/vulcan:core';
import React from 'react';
import { withRouter } from 'react-router';
import Users from "meteor/vulcan:users";
const UsersSingle = ({params, router}) => {
const slug = Utils.slugify(params.slug);
const canonicalUrl = Users.getProfileUrlFromSlug(slug);
if (router.location.pathname !== canonicalUrl) {
router.replace(canonicalUrl);
return null;
} else {
return <Components.UsersProfile userId={params._id} slug={slug} />
}
};
UsersSingle.displayName = "UsersSingle";
registerComponent('UsersSingle', UsersSingle, withRouter);
## Instruction:
Comment about what router.replace does
## Code After:
import { Components, registerComponent, Utils } from 'meteor/vulcan:core';
import React from 'react';
import { withRouter } from 'react-router';
import Users from "meteor/vulcan:users";
const UsersSingle = ({params, router}) => {
const slug = Utils.slugify(params.slug);
const canonicalUrl = Users.getProfileUrlFromSlug(slug);
if (router.location.pathname !== canonicalUrl) {
// A Javascript redirect, which replaces the history entry (so you don't
// have a redirector interfering with the back button). Does not cause a
// pageload.
router.replace(canonicalUrl);
return null;
} else {
return <Components.UsersProfile userId={params._id} slug={slug} />
}
};
UsersSingle.displayName = "UsersSingle";
registerComponent('UsersSingle', UsersSingle, withRouter);
| import { Components, registerComponent, Utils } from 'meteor/vulcan:core';
import React from 'react';
import { withRouter } from 'react-router';
import Users from "meteor/vulcan:users";
const UsersSingle = ({params, router}) => {
const slug = Utils.slugify(params.slug);
const canonicalUrl = Users.getProfileUrlFromSlug(slug);
if (router.location.pathname !== canonicalUrl) {
+ // A Javascript redirect, which replaces the history entry (so you don't
+ // have a redirector interfering with the back button). Does not cause a
+ // pageload.
router.replace(canonicalUrl);
return null;
} else {
return <Components.UsersProfile userId={params._id} slug={slug} />
}
};
UsersSingle.displayName = "UsersSingle";
registerComponent('UsersSingle', UsersSingle, withRouter); | 3 | 0.157895 | 3 | 0 |
fe02b6422225d1d23b2367f006a9c229b78af8dc | lib/roo_on_rails/railties/sidekiq.rb | lib/roo_on_rails/railties/sidekiq.rb | require 'sidekiq'
require 'roo_on_rails/sidekiq/settings'
module RooOnRails
module Railties
class Sidekiq < Rails::Railtie
initializer 'roo_on_rails.sidekiq' do |app|
require 'hirefire-resource'
$stderr.puts 'initializer roo_on_rails.sidekiq'
break unless ENV.fetch('SIDEKIQ_ENABLED', 'true').to_s =~ /\A(YES|TRUE|ON|1)\Z/i
config_sidekiq
config_hirefire(app)
end
def config_hirefire(app)
unless ENV['HIREFIRE_TOKEN']
warn 'No HIREFIRE_TOKEN token set, auto scaling not enabled'
return
end
add_middleware(app)
end
def config_sidekiq
::Sidekiq.configure_server do |x|
x.options[:concurrency] = RooOnRails::Sidekiq::Settings.concurrency.to_i
x.options[:queues] = RooOnRails::Sidekiq::Settings.queues
end
end
def add_middleware(app)
$stderr.puts 'HIREFIRE_TOKEN set'
app.middleware.use HireFire::Middleware
HireFire::Resource.configure do |config|
config.dyno(:worker) do
RooOnRails::SidekiqSla.queue
end
end
end
end
end
end
| require 'sidekiq'
require 'roo_on_rails/sidekiq/settings'
require 'roo_on_rails/sidekiq/sla_metric'
module RooOnRails
module Railties
class Sidekiq < Rails::Railtie
initializer 'roo_on_rails.sidekiq' do |app|
require 'hirefire-resource'
$stderr.puts 'initializer roo_on_rails.sidekiq'
break unless ENV.fetch('SIDEKIQ_ENABLED', 'true').to_s =~ /\A(YES|TRUE|ON|1)\Z/i
config_sidekiq
config_hirefire(app)
end
def config_hirefire(app)
unless ENV['HIREFIRE_TOKEN']
warn 'No HIREFIRE_TOKEN token set, auto scaling not enabled'
return
end
add_middleware(app)
end
def config_sidekiq
::Sidekiq.configure_server do |x|
x.options[:concurrency] = RooOnRails::Sidekiq::Settings.concurrency.to_i
x.options[:queues] = RooOnRails::Sidekiq::Settings.queues
end
end
def add_middleware(app)
$stderr.puts 'HIREFIRE_TOKEN set'
app.middleware.use HireFire::Middleware
HireFire::Resource.configure do |config|
config.dyno(:worker) do
RooOnRails::Sidekiq::SlaMetric.queue
end
end
end
end
end
end
| Use correct SLA class name | HOTFIX: Use correct SLA class name
This could really do with some tests, but I’m not quite sure how to
test it and this is causing errors in production so I’d prefer to patch
it quickly and then work out how to test it later.
@danielcooper maybe you have some thoughts on how to test?
| Ruby | mit | deliveroo/roo_on_rails,deliveroo/roo_on_rails,deliveroo/roo_on_rails | ruby | ## Code Before:
require 'sidekiq'
require 'roo_on_rails/sidekiq/settings'
module RooOnRails
module Railties
class Sidekiq < Rails::Railtie
initializer 'roo_on_rails.sidekiq' do |app|
require 'hirefire-resource'
$stderr.puts 'initializer roo_on_rails.sidekiq'
break unless ENV.fetch('SIDEKIQ_ENABLED', 'true').to_s =~ /\A(YES|TRUE|ON|1)\Z/i
config_sidekiq
config_hirefire(app)
end
def config_hirefire(app)
unless ENV['HIREFIRE_TOKEN']
warn 'No HIREFIRE_TOKEN token set, auto scaling not enabled'
return
end
add_middleware(app)
end
def config_sidekiq
::Sidekiq.configure_server do |x|
x.options[:concurrency] = RooOnRails::Sidekiq::Settings.concurrency.to_i
x.options[:queues] = RooOnRails::Sidekiq::Settings.queues
end
end
def add_middleware(app)
$stderr.puts 'HIREFIRE_TOKEN set'
app.middleware.use HireFire::Middleware
HireFire::Resource.configure do |config|
config.dyno(:worker) do
RooOnRails::SidekiqSla.queue
end
end
end
end
end
end
## Instruction:
HOTFIX: Use correct SLA class name
This could really do with some tests, but I’m not quite sure how to
test it and this is causing errors in production so I’d prefer to patch
it quickly and then work out how to test it later.
@danielcooper maybe you have some thoughts on how to test?
## Code After:
require 'sidekiq'
require 'roo_on_rails/sidekiq/settings'
require 'roo_on_rails/sidekiq/sla_metric'
module RooOnRails
module Railties
class Sidekiq < Rails::Railtie
initializer 'roo_on_rails.sidekiq' do |app|
require 'hirefire-resource'
$stderr.puts 'initializer roo_on_rails.sidekiq'
break unless ENV.fetch('SIDEKIQ_ENABLED', 'true').to_s =~ /\A(YES|TRUE|ON|1)\Z/i
config_sidekiq
config_hirefire(app)
end
def config_hirefire(app)
unless ENV['HIREFIRE_TOKEN']
warn 'No HIREFIRE_TOKEN token set, auto scaling not enabled'
return
end
add_middleware(app)
end
def config_sidekiq
::Sidekiq.configure_server do |x|
x.options[:concurrency] = RooOnRails::Sidekiq::Settings.concurrency.to_i
x.options[:queues] = RooOnRails::Sidekiq::Settings.queues
end
end
def add_middleware(app)
$stderr.puts 'HIREFIRE_TOKEN set'
app.middleware.use HireFire::Middleware
HireFire::Resource.configure do |config|
config.dyno(:worker) do
RooOnRails::Sidekiq::SlaMetric.queue
end
end
end
end
end
end
| require 'sidekiq'
require 'roo_on_rails/sidekiq/settings'
+ require 'roo_on_rails/sidekiq/sla_metric'
+
module RooOnRails
module Railties
class Sidekiq < Rails::Railtie
initializer 'roo_on_rails.sidekiq' do |app|
require 'hirefire-resource'
$stderr.puts 'initializer roo_on_rails.sidekiq'
break unless ENV.fetch('SIDEKIQ_ENABLED', 'true').to_s =~ /\A(YES|TRUE|ON|1)\Z/i
config_sidekiq
config_hirefire(app)
end
def config_hirefire(app)
unless ENV['HIREFIRE_TOKEN']
warn 'No HIREFIRE_TOKEN token set, auto scaling not enabled'
return
end
add_middleware(app)
end
def config_sidekiq
::Sidekiq.configure_server do |x|
x.options[:concurrency] = RooOnRails::Sidekiq::Settings.concurrency.to_i
x.options[:queues] = RooOnRails::Sidekiq::Settings.queues
end
end
def add_middleware(app)
$stderr.puts 'HIREFIRE_TOKEN set'
app.middleware.use HireFire::Middleware
HireFire::Resource.configure do |config|
config.dyno(:worker) do
- RooOnRails::SidekiqSla.queue
+ RooOnRails::Sidekiq::SlaMetric.queue
? ++ ++++++
end
end
end
end
end
end | 4 | 0.1 | 3 | 1 |
590f04e426a72ae905f3ea9209588fdfd249cf33 | recipes/bokeh_conda_requirements/meta.yaml | recipes/bokeh_conda_requirements/meta.yaml | {% set name = "bokeh_conda_requirements" %}
{% set version = "0.12.6" %}
package:
name: {{ name }}
version: {{ version }}
requirements:
run:
- python
- six >=1.5.2
- requests >=1.2.3
- pyyaml >=3.10
- python-dateutil >=2.1
- jinja2 >=2.7
- numpy >=1.7.1
- tornado >=4.3
- futures >=3.0.3 # [py2k]
- bkcharts >=0.2
about:
summary: Requirements meta-package for bokeh
| {% set name = "bokeh_conda_requirements" %}
{% set version = "0.12.6" %}
package:
name: {{ name }}
version: {{ version }}
build:
number: 0
noarch: True
string: py2_0 # [py2k]
string: py3_0 # [py3k]
requirements:
run:
- python 2.* # [py2k]
- python 3.* # [py3k]
- six >=1.5.2
- requests >=1.2.3
- pyyaml >=3.10
- python-dateutil >=2.1
- jinja2 >=2.7
- numpy >=1.7.1
- tornado >=4.3
- futures >=3.0.3 # [py2k]
- bkcharts >=0.2
about:
summary: Requirements meta-package for bokeh
| Make bokeh_conda_requirements a noarch package | Make bokeh_conda_requirements a noarch package
| YAML | bsd-3-clause | jjhelmus/berryconda,jjhelmus/berryconda,jjhelmus/berryconda,jjhelmus/berryconda,jjhelmus/berryconda | yaml | ## Code Before:
{% set name = "bokeh_conda_requirements" %}
{% set version = "0.12.6" %}
package:
name: {{ name }}
version: {{ version }}
requirements:
run:
- python
- six >=1.5.2
- requests >=1.2.3
- pyyaml >=3.10
- python-dateutil >=2.1
- jinja2 >=2.7
- numpy >=1.7.1
- tornado >=4.3
- futures >=3.0.3 # [py2k]
- bkcharts >=0.2
about:
summary: Requirements meta-package for bokeh
## Instruction:
Make bokeh_conda_requirements a noarch package
## Code After:
{% set name = "bokeh_conda_requirements" %}
{% set version = "0.12.6" %}
package:
name: {{ name }}
version: {{ version }}
build:
number: 0
noarch: True
string: py2_0 # [py2k]
string: py3_0 # [py3k]
requirements:
run:
- python 2.* # [py2k]
- python 3.* # [py3k]
- six >=1.5.2
- requests >=1.2.3
- pyyaml >=3.10
- python-dateutil >=2.1
- jinja2 >=2.7
- numpy >=1.7.1
- tornado >=4.3
- futures >=3.0.3 # [py2k]
- bkcharts >=0.2
about:
summary: Requirements meta-package for bokeh
| {% set name = "bokeh_conda_requirements" %}
{% set version = "0.12.6" %}
package:
name: {{ name }}
version: {{ version }}
+ build:
+ number: 0
+ noarch: True
+ string: py2_0 # [py2k]
+ string: py3_0 # [py3k]
+
requirements:
run:
- - python
+ - python 2.* # [py2k]
+ - python 3.* # [py3k]
- six >=1.5.2
- requests >=1.2.3
- pyyaml >=3.10
- python-dateutil >=2.1
- jinja2 >=2.7
- numpy >=1.7.1
- tornado >=4.3
- futures >=3.0.3 # [py2k]
- bkcharts >=0.2
about:
summary: Requirements meta-package for bokeh | 9 | 0.409091 | 8 | 1 |
d7d55dbbfa73e39a7b8ea687444e2cb78e5cb50d | ReadMe.md | ReadMe.md |
This plugin provides a wiki for hawtio
### Basic usage
#### Running this plugin locally
First clone the source
git clone https://github.com/hawtio/hawtio-wiki
cd hawtio-wiki
Next you'll need to [install NodeJS](http://nodejs.org/download/) and then install the default global npm dependencies:
npm install -g bower gulp slush slush-hawtio-javascript slush-hawtio-typescript typescript
Then install all local nodejs packages and update bower dependencies via:
npm install
bower update
Then to run the web application:
gulp
#### Install the bower package
`bower install --save hawtio-wiki` |
The source is now in [fabric8-console](https://github.com/fabric8io/fabric8-console)
See the new [source code here](https://github.com/fabric8io/fabric8-console/tree/master/plugins/wiki)
| Update readme with new source location | Update readme with new source location
| Markdown | apache-2.0 | hawtio/hawtio-wiki,hawtio/hawtio-wiki,hawtio/hawtio-wiki | markdown | ## Code Before:
This plugin provides a wiki for hawtio
### Basic usage
#### Running this plugin locally
First clone the source
git clone https://github.com/hawtio/hawtio-wiki
cd hawtio-wiki
Next you'll need to [install NodeJS](http://nodejs.org/download/) and then install the default global npm dependencies:
npm install -g bower gulp slush slush-hawtio-javascript slush-hawtio-typescript typescript
Then install all local nodejs packages and update bower dependencies via:
npm install
bower update
Then to run the web application:
gulp
#### Install the bower package
`bower install --save hawtio-wiki`
## Instruction:
Update readme with new source location
## Code After:
The source is now in [fabric8-console](https://github.com/fabric8io/fabric8-console)
See the new [source code here](https://github.com/fabric8io/fabric8-console/tree/master/plugins/wiki)
|
- This plugin provides a wiki for hawtio
+ The source is now in [fabric8-console](https://github.com/fabric8io/fabric8-console)
- ### Basic usage
+ See the new [source code here](https://github.com/fabric8io/fabric8-console/tree/master/plugins/wiki)
- #### Running this plugin locally
-
- First clone the source
-
- git clone https://github.com/hawtio/hawtio-wiki
- cd hawtio-wiki
-
- Next you'll need to [install NodeJS](http://nodejs.org/download/) and then install the default global npm dependencies:
-
- npm install -g bower gulp slush slush-hawtio-javascript slush-hawtio-typescript typescript
-
- Then install all local nodejs packages and update bower dependencies via:
-
- npm install
- bower update
-
- Then to run the web application:
-
- gulp
-
- #### Install the bower package
-
- `bower install --save hawtio-wiki` | 27 | 0.964286 | 2 | 25 |
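The diff column in each of these records uses Python's `difflib.ndiff` notation: unchanged lines are two-space-indented context, removals and additions are prefixed `- `/`+ `, and the `? ` lines carry intraline hint markers (`^` under replaced characters, `+`/`-` under inserted or deleted ones). A minimal sketch of how that output is produced (the sample strings are illustrative, not taken from any record):

```python
import difflib

old = ["s.version = '0.3.0'"]
new = ["s.version = '0.4.0'"]

# ndiff interleaves "- "/"+ " lines with "? " hint lines that point at the
# exact characters which changed inside a rewritten line.
diff = list(difflib.ndiff(old, new))
for line in diff:
    print(line.rstrip("\n"))
```

For two nearly identical lines like these, `ndiff` emits the four-line pattern seen throughout this file: the old line, a `? ` hint with a caret under the changed digit, the new line, and its matching hint.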
a62be2b6891068481edb556af690cb0919dbeb4f | r01/src/main/java/edu/virginia/psyc/r01/persistence/CC.java | r01/src/main/java/edu/virginia/psyc/r01/persistence/CC.java | package edu.virginia.psyc.r01.persistence;
import lombok.Data;
import lombok.EqualsAndHashCode;
import org.apache.commons.collections.map.HashedMap;
import org.mindtrails.domain.questionnaire.LinkedQuestionnaireData;
import org.mindtrails.domain.questionnaire.MeasureField;
import javax.persistence.Entity;
import javax.persistence.Table;
import javax.validation.constraints.NotNull;
import java.util.Collections;
import java.util.Map;
/**
* Created by samportnow on 7/21/14.
*/
@Entity
@Table(name="CC")
@EqualsAndHashCode(callSuper = true)
@Data
public class CC extends LinkedQuestionnaireData {
@MeasureField(desc="While reading the brief stories in the training program, how much did you relate to the situations presented?")
@NotNull
private Integer related;
@MeasureField(desc="While reading the brief stories in the training program, how much did you feel like it could be you behaving that way in those stories?")
@NotNull
private Integer compare;
@Override
public Map<Integer, String> getScale(String scale) {
Map<Integer, String> tmpScale = new HashedMap();
tmpScale.put(1, "Not at all");
tmpScale.put(2, "Slightly");
tmpScale.put(3, "Somewhat");
tmpScale.put(4, "Mostly");
tmpScale.put(5, "Very much");
return Collections.unmodifiableMap(tmpScale);
}
} | package edu.virginia.psyc.r01.persistence;
import lombok.Data;
import lombok.EqualsAndHashCode;
import org.apache.commons.collections.map.HashedMap;
import org.mindtrails.domain.questionnaire.LinkedQuestionnaireData;
import org.mindtrails.domain.questionnaire.MeasureField;
import javax.persistence.Entity;
import javax.persistence.Table;
import javax.validation.constraints.NotNull;
import java.util.Collections;
import java.util.Map;
import java.util.TreeMap;
/**
* Created by samportnow on 7/21/14.
*/
@Entity
@Table(name="CC")
@EqualsAndHashCode(callSuper = true)
@Data
public class CC extends LinkedQuestionnaireData {
@MeasureField(desc="While reading the brief stories in the training program, how much did you relate to the situations presented?")
@NotNull
private Integer related;
@MeasureField(desc="While reading the brief stories in the training program, how much did you feel like it could be you behaving that way in those stories?")
@NotNull
private Integer compare;
@Override
public Map<Integer, String> getScale(String scale) {
Map<Integer, String> tmpScale = new TreeMap<>();
tmpScale.put(1, "Not at all");
tmpScale.put(2, "Slightly");
tmpScale.put(3, "Somewhat");
tmpScale.put(4, "Mostly");
tmpScale.put(5, "Very much");
return Collections.unmodifiableMap(tmpScale);
}
} | Return the scale in the order it was created. | Return the scale in the order it was created.
| Java | mit | danfunk/MindTrails,danfunk/MindTrails,danfunk/MindTrails,danfunk/MindTrails | java | ## Code Before:
package edu.virginia.psyc.r01.persistence;
import lombok.Data;
import lombok.EqualsAndHashCode;
import org.apache.commons.collections.map.HashedMap;
import org.mindtrails.domain.questionnaire.LinkedQuestionnaireData;
import org.mindtrails.domain.questionnaire.MeasureField;
import javax.persistence.Entity;
import javax.persistence.Table;
import javax.validation.constraints.NotNull;
import java.util.Collections;
import java.util.Map;
/**
* Created by samportnow on 7/21/14.
*/
@Entity
@Table(name="CC")
@EqualsAndHashCode(callSuper = true)
@Data
public class CC extends LinkedQuestionnaireData {
@MeasureField(desc="While reading the brief stories in the training program, how much did you relate to the situations presented?")
@NotNull
private Integer related;
@MeasureField(desc="While reading the brief stories in the training program, how much did you feel like it could be you behaving that way in those stories?")
@NotNull
private Integer compare;
@Override
public Map<Integer, String> getScale(String scale) {
Map<Integer, String> tmpScale = new HashedMap();
tmpScale.put(1, "Not at all");
tmpScale.put(2, "Slightly");
tmpScale.put(3, "Somewhat");
tmpScale.put(4, "Mostly");
tmpScale.put(5, "Very much");
return Collections.unmodifiableMap(tmpScale);
}
}
## Instruction:
Return the scale in the order it was created.
## Code After:
package edu.virginia.psyc.r01.persistence;
import lombok.Data;
import lombok.EqualsAndHashCode;
import org.apache.commons.collections.map.HashedMap;
import org.mindtrails.domain.questionnaire.LinkedQuestionnaireData;
import org.mindtrails.domain.questionnaire.MeasureField;
import javax.persistence.Entity;
import javax.persistence.Table;
import javax.validation.constraints.NotNull;
import java.util.Collections;
import java.util.Map;
import java.util.TreeMap;
/**
* Created by samportnow on 7/21/14.
*/
@Entity
@Table(name="CC")
@EqualsAndHashCode(callSuper = true)
@Data
public class CC extends LinkedQuestionnaireData {
@MeasureField(desc="While reading the brief stories in the training program, how much did you relate to the situations presented?")
@NotNull
private Integer related;
@MeasureField(desc="While reading the brief stories in the training program, how much did you feel like it could be you behaving that way in those stories?")
@NotNull
private Integer compare;
@Override
public Map<Integer, String> getScale(String scale) {
Map<Integer, String> tmpScale = new TreeMap<>();
tmpScale.put(1, "Not at all");
tmpScale.put(2, "Slightly");
tmpScale.put(3, "Somewhat");
tmpScale.put(4, "Mostly");
tmpScale.put(5, "Very much");
return Collections.unmodifiableMap(tmpScale);
}
} | package edu.virginia.psyc.r01.persistence;
import lombok.Data;
import lombok.EqualsAndHashCode;
import org.apache.commons.collections.map.HashedMap;
import org.mindtrails.domain.questionnaire.LinkedQuestionnaireData;
import org.mindtrails.domain.questionnaire.MeasureField;
import javax.persistence.Entity;
import javax.persistence.Table;
import javax.validation.constraints.NotNull;
import java.util.Collections;
import java.util.Map;
+ import java.util.TreeMap;
/**
* Created by samportnow on 7/21/14.
*/
@Entity
@Table(name="CC")
@EqualsAndHashCode(callSuper = true)
@Data
public class CC extends LinkedQuestionnaireData {
@MeasureField(desc="While reading the brief stories in the training program, how much did you relate to the situations presented?")
@NotNull
private Integer related;
@MeasureField(desc="While reading the brief stories in the training program, how much did you feel like it could be you behaving that way in those stories?")
@NotNull
private Integer compare;
@Override
public Map<Integer, String> getScale(String scale) {
- Map<Integer, String> tmpScale = new HashedMap();
? ^^^^ ^
+ Map<Integer, String> tmpScale = new TreeMap<>();
? ^^ ^ ++
tmpScale.put(1, "Not at all");
tmpScale.put(2, "Slightly");
tmpScale.put(3, "Somewhat");
tmpScale.put(4, "Mostly");
tmpScale.put(5, "Very much");
return Collections.unmodifiableMap(tmpScale);
}
} | 3 | 0.066667 | 2 | 1 |
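A side note on the record above: `TreeMap` iterates in sorted-key order, not creation order, so it satisfies the commit message only because the scale keys 1 through 5 happen to be inserted in ascending order (a `LinkedHashMap` preserves insertion order directly). A rough Python analogy of the two behaviours, with hypothetical labels that are not from the repository:

```python
# Python dicts (3.7+) preserve insertion order, like Java's LinkedHashMap.
scale = {}
for key, label in [(2, "Slightly"), (1, "Not at all"), (3, "Somewhat")]:
    scale[key] = label

insertion_order = list(scale)  # order the entries were created in
key_order = sorted(scale)      # order a TreeMap would iterate in
```

Here `insertion_order` is `[2, 1, 3]` while `key_order` is `[1, 2, 3]`; for the questionnaire scale the two coincide, which is why the swap fixes the observed symptom.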
7a22c6301dc5bb41bae4fcb4e1031204db5156ef | .travis.yml | .travis.yml | language: node_js
node_js:
- '0.11'
script:
- npm test -- --single-run
- npm run lint
cache:
directories:
- node_modules
| sudo: false
language: node_js
node_js:
- '0.11'
script:
- npm test -- --single-run
- npm run lint
cache:
directories:
- node_modules
| Upgrade to new Travis CI infrastructure | Upgrade to new Travis CI infrastructure
| YAML | mit | js-fns/date-fns,date-fns/date-fns,date-fns/date-fns,js-fns/date-fns,date-fns/date-fns | yaml | ## Code Before:
language: node_js
node_js:
- '0.11'
script:
- npm test -- --single-run
- npm run lint
cache:
directories:
- node_modules
## Instruction:
Upgrade to new Travis CI infrastructure
## Code After:
sudo: false
language: node_js
node_js:
- '0.11'
script:
- npm test -- --single-run
- npm run lint
cache:
directories:
- node_modules
| + sudo: false
language: node_js
node_js:
- '0.11'
script:
- npm test -- --single-run
- npm run lint
cache:
directories:
- node_modules
| 1 | 0.1 | 1 | 0 |
a71425b57660c767534cd36687f8bb03ba4c7eaf | tox.ini | tox.ini | [tox]
envlist = py27,py33,py34,py35,py36,py37,pypy,pypy3
[testenv]
commands = python run_tests.py
[testenv:py27-flake8]
deps =
flake8
commands = flake8 {posargs} pep8ext_naming.py
[testenv:py37-flake8]
basepython = python3.7
deps =
flake8
commands = flake8 {posargs} pep8ext_naming.py
[testenv:release]
deps =
twine >= 1.4.0
wheel
commands =
python setup.py sdist bdist_wheel
twine upload {posargs} dist/*
| [tox]
envlist = py27,py33,py34,py35,py36,py37,pypy,pypy3
[testenv]
commands = python run_tests.py
[testenv:py27-flake8]
deps =
flake8
commands =
flake8 {posargs} pep8ext_naming.py
python setup.py check --restructuredtext
[testenv:py37-flake8]
basepython = python3.7
deps =
flake8
commands =
flake8 {posargs} pep8ext_naming.py
python setup.py check --restructuredtext
[testenv:release]
deps =
twine >= 1.4.0
wheel
commands =
python setup.py sdist bdist_wheel
twine upload {posargs} dist/*
| Check reStructuredText syntax along with flake8 | Check reStructuredText syntax along with flake8
This verifies the integrity of the setup metadata so we can be even more
confident in our PyPI releases.
| INI | mit | flintwork/pep8-naming | ini | ## Code Before:
[tox]
envlist = py27,py33,py34,py35,py36,py37,pypy,pypy3
[testenv]
commands = python run_tests.py
[testenv:py27-flake8]
deps =
flake8
commands = flake8 {posargs} pep8ext_naming.py
[testenv:py37-flake8]
basepython = python3.7
deps =
flake8
commands = flake8 {posargs} pep8ext_naming.py
[testenv:release]
deps =
twine >= 1.4.0
wheel
commands =
python setup.py sdist bdist_wheel
twine upload {posargs} dist/*
## Instruction:
Check reStructuredText syntax along with flake8
This verifies the integrity of the setup metadata so we can be even more
confident in our PyPI releases.
## Code After:
[tox]
envlist = py27,py33,py34,py35,py36,py37,pypy,pypy3
[testenv]
commands = python run_tests.py
[testenv:py27-flake8]
deps =
flake8
commands =
flake8 {posargs} pep8ext_naming.py
python setup.py check --restructuredtext
[testenv:py37-flake8]
basepython = python3.7
deps =
flake8
commands =
flake8 {posargs} pep8ext_naming.py
python setup.py check --restructuredtext
[testenv:release]
deps =
twine >= 1.4.0
wheel
commands =
python setup.py sdist bdist_wheel
twine upload {posargs} dist/*
| [tox]
envlist = py27,py33,py34,py35,py36,py37,pypy,pypy3
[testenv]
commands = python run_tests.py
[testenv:py27-flake8]
deps =
flake8
+ commands =
- commands = flake8 {posargs} pep8ext_naming.py
? -------- ^
+ flake8 {posargs} pep8ext_naming.py
? ^^
+ python setup.py check --restructuredtext
[testenv:py37-flake8]
basepython = python3.7
deps =
flake8
+ commands =
- commands = flake8 {posargs} pep8ext_naming.py
? -------- ^
+ flake8 {posargs} pep8ext_naming.py
? ^^
+ python setup.py check --restructuredtext
[testenv:release]
deps =
twine >= 1.4.0
wheel
commands =
python setup.py sdist bdist_wheel
twine upload {posargs} dist/* | 8 | 0.333333 | 6 | 2 |
477f306724cee65c93f18cf10a1be9328f7dbdd5 | jekyll-itafroma-archive.gemspec | jekyll-itafroma-archive.gemspec | Gem::Specification.new do |s|
s.name = 'jekyll-itafroma-archive'
s.version = '0.3.0'
s.date = '2015-01-07'
s.summary = 'Jekyll plugin to create a set of archive pages.'
s.description = <<-EOF
Jekyll Archive Generator creates a set of archive pages for a Jekyll
website.
Oddly, Jekyll doesn't include a date-based archive for posts out of the box.
    For example, if you have a permalink structure like `blog/2014/01/01/title`,
URL hacking won't work because going to `blog/2014` will return 404 Page Not
Found.
Jekyll Archive Generator fixes that by generating all the necessary archive
    pages for each part of your blog URL structure.
EOF
s.authors = ['Mark Trapp']
s.email = 'mark@marktrapp.com'
s.files = [
'lib/jekyll/itafroma/archive.rb',
'lib/jekyll/itafroma/archive_generator.rb',
'lib/jekyll/itafroma/archive_page.rb',
'lib/jekyll/itafroma/archive_pager.rb',
]
s.homepage = 'http://marktrapp.com/projects/jekyll-archive'
s.license = 'MIT'
end
| Gem::Specification.new do |s|
s.name = 'jekyll-itafroma-archive'
s.version = '0.4.0'
s.date = '2015-01-15'
s.summary = 'Jekyll plugin to create a set of archive pages.'
s.description = <<-EOF
Jekyll Archive Generator creates a set of archive pages for a Jekyll
website.
Oddly, Jekyll doesn't include a date-based archive for posts out of the box.
    For example, if you have a permalink structure like `blog/2014/01/01/title`,
URL hacking won't work because going to `blog/2014` will return 404 Page Not
Found.
Jekyll Archive Generator fixes that by generating all the necessary archive
    pages for each part of your blog URL structure.
EOF
s.authors = ['Mark Trapp']
s.email = 'mark@marktrapp.com'
s.files = [
'lib/jekyll/itafroma/archive.rb',
'lib/jekyll/itafroma/archive_generator.rb',
'lib/jekyll/itafroma/archive_page.rb',
'lib/jekyll/itafroma/archive_pager.rb',
]
s.homepage = 'http://marktrapp.com/projects/jekyll-archive'
s.license = 'MIT'
s.required_ruby_version = '>= 2.0.0'
s.add_runtime_dependency 'jekyll', '~> 2.1'
end
| Prepare gemspec for 0.4.0 release. | Prepare gemspec for 0.4.0 release.
| Ruby | mit | itafroma/jekyll-archive,itafroma/jekyll-archive | ruby | ## Code Before:
Gem::Specification.new do |s|
s.name = 'jekyll-itafroma-archive'
s.version = '0.3.0'
s.date = '2015-01-07'
s.summary = 'Jekyll plugin to create a set of archive pages.'
s.description = <<-EOF
Jekyll Archive Generator creates a set of archive pages for a Jekyll
website.
Oddly, Jekyll doesn't include a date-based archive for posts out of the box.
    For example, if you have a permalink structure like `blog/2014/01/01/title`,
URL hacking won't work because going to `blog/2014` will return 404 Page Not
Found.
Jekyll Archive Generator fixes that by generating all the necessary archive
    pages for each part of your blog URL structure.
EOF
s.authors = ['Mark Trapp']
s.email = 'mark@marktrapp.com'
s.files = [
'lib/jekyll/itafroma/archive.rb',
'lib/jekyll/itafroma/archive_generator.rb',
'lib/jekyll/itafroma/archive_page.rb',
'lib/jekyll/itafroma/archive_pager.rb',
]
s.homepage = 'http://marktrapp.com/projects/jekyll-archive'
s.license = 'MIT'
end
## Instruction:
Prepare gemspec for 0.4.0 release.
## Code After:
Gem::Specification.new do |s|
s.name = 'jekyll-itafroma-archive'
s.version = '0.4.0'
s.date = '2015-01-15'
s.summary = 'Jekyll plugin to create a set of archive pages.'
s.description = <<-EOF
Jekyll Archive Generator creates a set of archive pages for a Jekyll
website.
Oddly, Jekyll doesn't include a date-based archive for posts out of the box.
    For example, if you have a permalink structure like `blog/2014/01/01/title`,
URL hacking won't work because going to `blog/2014` will return 404 Page Not
Found.
Jekyll Archive Generator fixes that by generating all the necessary archive
    pages for each part of your blog URL structure.
EOF
s.authors = ['Mark Trapp']
s.email = 'mark@marktrapp.com'
s.files = [
'lib/jekyll/itafroma/archive.rb',
'lib/jekyll/itafroma/archive_generator.rb',
'lib/jekyll/itafroma/archive_page.rb',
'lib/jekyll/itafroma/archive_pager.rb',
]
s.homepage = 'http://marktrapp.com/projects/jekyll-archive'
s.license = 'MIT'
s.required_ruby_version = '>= 2.0.0'
s.add_runtime_dependency 'jekyll', '~> 2.1'
end
| Gem::Specification.new do |s|
s.name = 'jekyll-itafroma-archive'
- s.version = '0.3.0'
? ^
+ s.version = '0.4.0'
? ^
- s.date = '2015-01-07'
? ^^
+ s.date = '2015-01-15'
? ^^
s.summary = 'Jekyll plugin to create a set of archive pages.'
s.description = <<-EOF
Jekyll Archive Generator creates a set of archive pages for a Jekyll
website.
Oddly, Jekyll doesn't include a date-based archive for posts out of the box.
      For example, if you have a permalink structure like `blog/2014/01/01/title`,
URL hacking won't work because going to `blog/2014` will return 404 Page Not
Found.
Jekyll Archive Generator fixes that by generating all the necessary archive
      pages for each part of your blog URL structure.
EOF
s.authors = ['Mark Trapp']
s.email = 'mark@marktrapp.com'
s.files = [
'lib/jekyll/itafroma/archive.rb',
'lib/jekyll/itafroma/archive_generator.rb',
'lib/jekyll/itafroma/archive_page.rb',
'lib/jekyll/itafroma/archive_pager.rb',
]
s.homepage = 'http://marktrapp.com/projects/jekyll-archive'
s.license = 'MIT'
+
+ s.required_ruby_version = '>= 2.0.0'
+ s.add_runtime_dependency 'jekyll', '~> 2.1'
end | 7 | 0.25 | 5 | 2 |
2e30fa87d0f66a45da8bc8b74aaf208c32ab37ab | test/test_camera.rb | test/test_camera.rb | require_relative 'test_helper'
class TestCamera < Poke::Test
class Player
attr_reader :x, :y
def initialize
@x = 32
@y = 64
end
end
def setup
@window = TWindow.new
@player = Player.new
@camera = Poke::Camera.new(window: @window, player: @player)
end
end
| require_relative 'test_helper'
class TestCamera < Poke::Test
class Player
attr_reader :x, :y
def initialize
@x = 32
@y = 64
end
end
def setup
@player = Player.new
@camera = Poke::Camera.new(window: @window, player: @player)
end
end
| Remove Camera test Window assignment for now | Remove Camera test Window assignment for now
| Ruby | mit | mybuddymichael/poke | ruby | ## Code Before:
require_relative 'test_helper'
class TestCamera < Poke::Test
class Player
attr_reader :x, :y
def initialize
@x = 32
@y = 64
end
end
def setup
@window = TWindow.new
@player = Player.new
@camera = Poke::Camera.new(window: @window, player: @player)
end
end
## Instruction:
Remove Camera test Window assignment for now
## Code After:
require_relative 'test_helper'
class TestCamera < Poke::Test
class Player
attr_reader :x, :y
def initialize
@x = 32
@y = 64
end
end
def setup
@player = Player.new
@camera = Poke::Camera.new(window: @window, player: @player)
end
end
| require_relative 'test_helper'
class TestCamera < Poke::Test
class Player
attr_reader :x, :y
def initialize
@x = 32
@y = 64
end
end
def setup
- @window = TWindow.new
@player = Player.new
@camera = Poke::Camera.new(window: @window, player: @player)
end
end | 1 | 0.05 | 0 | 1 |
c11d0b3d616fb1d03cdbf0453dcdc5cb39ac122a | lib/node_modules/@stdlib/types/ndarray/ctor/lib/getnd.js | lib/node_modules/@stdlib/types/ndarray/ctor/lib/getnd.js | 'use strict';
// MODULES //
var isInteger = require( '@stdlib/assert/is-integer' ).isPrimitive;
// FUNCTIONS //
/**
* Returns an array element.
*
* @private
* @param {...integer} idx - indices
* @throws {TypeError} provided indices must be integer valued
* @throws {RangeError} index exceeds array dimensions
* @returns {*} array element
*/
function get() {
/* eslint-disable no-invalid-this */
var len;
var idx;
var i;
// TODO: support index modes
len = arguments.length;
idx = this._offset;
for ( i = 0; i < len; i++ ) {
if ( !isInteger( arguments[ i ] ) ) {
throw new TypeError( 'invalid input argument. Indices must be integer valued. Argument: '+i+'. Value: `'+arguments[i]+'`.' );
}
idx += this._strides[ i ] * arguments[ i ];
}
return this._buffer[ idx ];
} // end FUNCTION get()
// EXPORTS //
module.exports = get;
| 'use strict';
// MODULES //
var isInteger = require( '@stdlib/assert/is-integer' ).isPrimitive;
var getIndex = require( './get_index.js' );
// FUNCTIONS //
/**
* Returns an array element.
*
* @private
* @param {...integer} idx - indices
* @throws {TypeError} provided indices must be integer valued
* @throws {RangeError} index exceeds array dimensions
* @returns {*} array element
*/
function get() {
/* eslint-disable no-invalid-this */
var len;
var idx;
var ind;
var i;
len = arguments.length;
idx = this._offset;
for ( i = 0; i < len; i++ ) {
if ( !isInteger( arguments[ i ] ) ) {
throw new TypeError( 'invalid input argument. Indices must be integer valued. Argument: '+i+'. Value: `'+arguments[i]+'`.' );
}
ind = getIndex( arguments[ i ], this._shape[ i ], this._mode );
idx += this._strides[ i ] * ind;
}
return this._buffer[ idx ];
} // end FUNCTION get()
// EXPORTS //
module.exports = get;
| Add support for an index mode | Add support for an index mode
| JavaScript | apache-2.0 | stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib | javascript | ## Code Before:
'use strict';
// MODULES //
var isInteger = require( '@stdlib/assert/is-integer' ).isPrimitive;
// FUNCTIONS //
/**
* Returns an array element.
*
* @private
* @param {...integer} idx - indices
* @throws {TypeError} provided indices must be integer valued
* @throws {RangeError} index exceeds array dimensions
* @returns {*} array element
*/
function get() {
/* eslint-disable no-invalid-this */
var len;
var idx;
var i;
// TODO: support index modes
len = arguments.length;
idx = this._offset;
for ( i = 0; i < len; i++ ) {
if ( !isInteger( arguments[ i ] ) ) {
throw new TypeError( 'invalid input argument. Indices must be integer valued. Argument: '+i+'. Value: `'+arguments[i]+'`.' );
}
idx += this._strides[ i ] * arguments[ i ];
}
return this._buffer[ idx ];
} // end FUNCTION get()
// EXPORTS //
module.exports = get;
## Instruction:
Add support for an index mode
## Code After:
'use strict';
// MODULES //
var isInteger = require( '@stdlib/assert/is-integer' ).isPrimitive;
var getIndex = require( './get_index.js' );
// FUNCTIONS //
/**
* Returns an array element.
*
* @private
* @param {...integer} idx - indices
* @throws {TypeError} provided indices must be integer valued
* @throws {RangeError} index exceeds array dimensions
* @returns {*} array element
*/
function get() {
/* eslint-disable no-invalid-this */
var len;
var idx;
var ind;
var i;
len = arguments.length;
idx = this._offset;
for ( i = 0; i < len; i++ ) {
if ( !isInteger( arguments[ i ] ) ) {
throw new TypeError( 'invalid input argument. Indices must be integer valued. Argument: '+i+'. Value: `'+arguments[i]+'`.' );
}
ind = getIndex( arguments[ i ], this._shape[ i ], this._mode );
idx += this._strides[ i ] * ind;
}
return this._buffer[ idx ];
} // end FUNCTION get()
// EXPORTS //
module.exports = get;
| 'use strict';
// MODULES //
var isInteger = require( '@stdlib/assert/is-integer' ).isPrimitive;
+ var getIndex = require( './get_index.js' );
// FUNCTIONS //
/**
* Returns an array element.
*
* @private
* @param {...integer} idx - indices
* @throws {TypeError} provided indices must be integer valued
* @throws {RangeError} index exceeds array dimensions
* @returns {*} array element
*/
function get() {
/* eslint-disable no-invalid-this */
var len;
var idx;
+ var ind;
var i;
-
- // TODO: support index modes
len = arguments.length;
idx = this._offset;
for ( i = 0; i < len; i++ ) {
if ( !isInteger( arguments[ i ] ) ) {
throw new TypeError( 'invalid input argument. Indices must be integer valued. Argument: '+i+'. Value: `'+arguments[i]+'`.' );
}
+ ind = getIndex( arguments[ i ], this._shape[ i ], this._mode );
- idx += this._strides[ i ] * arguments[ i ];
? ^^^^^^ ^^^^^^^
+ idx += this._strides[ i ] * ind;
? ^ ^
}
return this._buffer[ idx ];
} // end FUNCTION get()
// EXPORTS //
module.exports = get; | 7 | 0.170732 | 4 | 3 |
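The `get_index.js` helper imported above is not shown in the record, but the surrounding loop makes the arithmetic clear: the linear buffer position is the offset plus each subscript, normalized by the index mode, times its stride. A small Python sketch of that calculation, with a hypothetical "clamp" mode standing in for whatever modes the real helper implements:

```python
def clamp(ind, dim):
    # Hypothetical index mode: pin an out-of-range subscript to the nearest edge.
    return max(0, min(ind, dim - 1))

def linear_index(subscripts, shape, strides, offset, mode=clamp):
    # Mirrors the loop in get(): idx = offset + sum(stride * normalized subscript).
    idx = offset
    for ind, dim, stride in zip(subscripts, shape, strides):
        idx += stride * mode(ind, dim)
    return idx
```

For a row-major 2x3 array (`shape = (2, 3)`, `strides = (3, 1)`, `offset = 0`), subscripts `(1, 2)` map to buffer position 5, and an out-of-range row such as `(5, 2)` clamps back to the last row.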
790c4a4e4fa6b022ec664933711452aa27751d98 | .travis.yml | .travis.yml | sudo: required
dist: trusty
install:
- sudo pip install -r requirements.txt
script:
sudo python -m doctest main.py
| sudo: required
dist: trusty
install:
- sudo pip install -r requirements.txt
- mv config.example.py config.py
script:
sudo python -m doctest main.py
| Use example config during tests | Use example config during tests
| YAML | mit | jncraton/gCal-iCal-Sync | yaml | ## Code Before:
sudo: required
dist: trusty
install:
- sudo pip install -r requirements.txt
script:
sudo python -m doctest main.py
## Instruction:
Use example config during tests
## Code After:
sudo: required
dist: trusty
install:
- sudo pip install -r requirements.txt
- mv config.example.py config.py
script:
sudo python -m doctest main.py
| sudo: required
dist: trusty
install:
- sudo pip install -r requirements.txt
+ - mv config.example.py config.py
script:
sudo python -m doctest main.py | 1 | 0.142857 | 1 | 0 |
b7433c0d18a01a9e1340123f7c0423d1fdec04a3 | sphinxdoc/urls.py | sphinxdoc/urls.py | from django.conf.urls.defaults import patterns, url
from django.views.generic import list_detail
from sphinxdoc import models
from sphinxdoc.views import ProjectSearchView
project_info = {
'queryset': models.Project.objects.all().order_by('name'),
'template_object_name': 'project',
}
urlpatterns = patterns('sphinxdoc.views',
url(
r'^$',
list_detail.object_list,
project_info,
),
url(
r'^(?P<slug>[\w-]+)/search/$',
ProjectSearchView(),
name='doc-search',
),
# These URLs have to be without the / at the end so that relative links in
# static HTML files work correctly and that browsers know how to name files
# for download
url(
r'^(?P<slug>[\w-]+)/(?P<type_>_images|_static|_downloads|_source)/' + \
r'(?P<path>.+)$',
'sphinx_serve',
),
url(
r'^(?P<slug>[\w-]+)/_objects/$',
'objects_inventory',
name='objects-inv',
),
url(
r'^(?P<slug>[\w-]+)/$',
'documentation',
{'path': ''},
name='doc-index',
),
url(
r'^(?P<slug>[\w-]+)/(?P<path>.+)/$',
'documentation',
name='doc-detail',
),
)
| from django.conf.urls.defaults import patterns, url
from django.views.generic.list import ListView
from sphinxdoc import models
from sphinxdoc.views import ProjectSearchView
urlpatterns = patterns('sphinxdoc.views',
url(
r'^$',
ListView.as_view(queryset=models.Project.objects.all().order_by('name'))
),
url(
r'^(?P<slug>[\w-]+)/search/$',
ProjectSearchView(),
name='doc-search',
),
# These URLs have to be without the / at the end so that relative links in
# static HTML files work correctly and that browsers know how to name files
# for download
url(
r'^(?P<slug>[\w-]+)/(?P<type_>_images|_static|_downloads|_source)/' + \
r'(?P<path>.+)$',
'sphinx_serve',
),
url(
r'^(?P<slug>[\w-]+)/_objects/$',
'objects_inventory',
name='objects-inv',
),
url(
r'^(?P<slug>[\w-]+)/$',
'documentation',
{'path': ''},
name='doc-index',
),
url(
r'^(?P<slug>[\w-]+)/(?P<path>.+)/$',
'documentation',
name='doc-detail',
),
)
| Change function-based generic view to class-based. | Change function-based generic view to class-based.
As per their deprecation policy, Django 1.5 removed function-based
generic views.
| Python | mit | kamni/django-sphinxdoc | python | ## Code Before:
from django.conf.urls.defaults import patterns, url
from django.views.generic import list_detail
from sphinxdoc import models
from sphinxdoc.views import ProjectSearchView
project_info = {
'queryset': models.Project.objects.all().order_by('name'),
'template_object_name': 'project',
}
urlpatterns = patterns('sphinxdoc.views',
url(
r'^$',
list_detail.object_list,
project_info,
),
url(
r'^(?P<slug>[\w-]+)/search/$',
ProjectSearchView(),
name='doc-search',
),
# These URLs have to be without the / at the end so that relative links in
# static HTML files work correctly and that browsers know how to name files
# for download
url(
r'^(?P<slug>[\w-]+)/(?P<type_>_images|_static|_downloads|_source)/' + \
r'(?P<path>.+)$',
'sphinx_serve',
),
url(
r'^(?P<slug>[\w-]+)/_objects/$',
'objects_inventory',
name='objects-inv',
),
url(
r'^(?P<slug>[\w-]+)/$',
'documentation',
{'path': ''},
name='doc-index',
),
url(
r'^(?P<slug>[\w-]+)/(?P<path>.+)/$',
'documentation',
name='doc-detail',
),
)
## Instruction:
Change function-based generic view to class-based.
As per their deprecation policy, Django 1.5 removed function-based
generic views.
## Code After:
from django.conf.urls.defaults import patterns, url
from django.views.generic.list import ListView
from sphinxdoc import models
from sphinxdoc.views import ProjectSearchView
urlpatterns = patterns('sphinxdoc.views',
url(
r'^$',
ListView.as_view(queryset=models.Project.objects.all().order_by('name'))
),
url(
r'^(?P<slug>[\w-]+)/search/$',
ProjectSearchView(),
name='doc-search',
),
# These URLs have to be without the / at the end so that relative links in
# static HTML files work correctly and that browsers know how to name files
# for download
url(
r'^(?P<slug>[\w-]+)/(?P<type_>_images|_static|_downloads|_source)/' + \
r'(?P<path>.+)$',
'sphinx_serve',
),
url(
r'^(?P<slug>[\w-]+)/_objects/$',
'objects_inventory',
name='objects-inv',
),
url(
r'^(?P<slug>[\w-]+)/$',
'documentation',
{'path': ''},
name='doc-index',
),
url(
r'^(?P<slug>[\w-]+)/(?P<path>.+)/$',
'documentation',
name='doc-detail',
),
)
| from django.conf.urls.defaults import patterns, url
- from django.views.generic import list_detail
? ^ ^^ ^^^^
+ from django.views.generic.list import ListView
? +++++ ^ ^^ ^
from sphinxdoc import models
from sphinxdoc.views import ProjectSearchView
- project_info = {
- 'queryset': models.Project.objects.all().order_by('name'),
- 'template_object_name': 'project',
- }
-
urlpatterns = patterns('sphinxdoc.views',
url(
r'^$',
+ ListView.as_view(queryset=models.Project.objects.all().order_by('name'))
- list_detail.object_list,
- project_info,
),
url(
r'^(?P<slug>[\w-]+)/search/$',
ProjectSearchView(),
name='doc-search',
),
# These URLs have to be without the / at the end so that relative links in
# static HTML files work correctly and that browsers know how to name files
# for download
url(
r'^(?P<slug>[\w-]+)/(?P<type_>_images|_static|_downloads|_source)/' + \
r'(?P<path>.+)$',
'sphinx_serve',
),
url(
r'^(?P<slug>[\w-]+)/_objects/$',
'objects_inventory',
name='objects-inv',
),
url(
r'^(?P<slug>[\w-]+)/$',
'documentation',
{'path': ''},
name='doc-index',
),
url(
r'^(?P<slug>[\w-]+)/(?P<path>.+)/$',
'documentation',
name='doc-detail',
),
) | 10 | 0.208333 | 2 | 8 |
1c2270a6363a2c2b0d84ca7baf1845219b64839f | metadata/it.feio.android.omninotes.txt | metadata/it.feio.android.omninotes.txt | Categories:Office
License:Apache2
Web Site:https://github.com/federicoiosue/Omni-Notes
Source Code:https://github.com/federicoiosue/Omni-Notes
Issue Tracker:https://github.com/federicoiosue/Omni-Notes/issues
Auto Name:Omni Notes
Summary:Note taking
Description:
Note taking application aimed at having both a lightweight footprint and a
simple interface while keeping a smart user experience.
.
Repo Type:git
Repo:https://github.com/federicoiosue/Omni-Notes.git
Build:4.4.1,91
disable=google analytics, other jars
commit=v_4_4_1
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:4.4.2
Current Version Code:92
| Categories:Office
License:Apache2
Web Site:https://github.com/federicoiosue/Omni-Notes
Source Code:https://github.com/federicoiosue/Omni-Notes
Issue Tracker:https://github.com/federicoiosue/Omni-Notes/issues
Auto Name:Omni Notes
Summary:Note taking
Description:
Note taking application aimed at having oth a lightweight footprint and a
simple interface while keeping a smart user experience.
.
Repo Type:git
Repo:https://github.com/federicoiosue/Omni-Notes.git
Build:4.4.1,91
disable=google analytics, other jars
commit=v_4_4_1
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:4.5.0 Beta 3
Current Version Code:96
| Update CV of Omni Notes to 4.5.0 Beta 3 (96) | Update CV of Omni Notes to 4.5.0 Beta 3 (96)
| Text | agpl-3.0 | f-droid/fdroid-data,f-droid/fdroiddata,f-droid/fdroiddata | text | ## Code Before:
Categories:Office
License:Apache2
Web Site:https://github.com/federicoiosue/Omni-Notes
Source Code:https://github.com/federicoiosue/Omni-Notes
Issue Tracker:https://github.com/federicoiosue/Omni-Notes/issues
Auto Name:Omni Notes
Summary:Note taking
Description:
Note taking application aimed at having oth a lightweight footprint and a
simple interface while keeping a smart user experience.
.
Repo Type:git
Repo:https://github.com/federicoiosue/Omni-Notes.git
Build:4.4.1,91
disable=google analytics, other jars
commit=v_4_4_1
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:4.4.2
Current Version Code:92
## Instruction:
Update CV of Omni Notes to 4.5.0 Beta 3 (96)
## Code After:
Categories:Office
License:Apache2
Web Site:https://github.com/federicoiosue/Omni-Notes
Source Code:https://github.com/federicoiosue/Omni-Notes
Issue Tracker:https://github.com/federicoiosue/Omni-Notes/issues
Auto Name:Omni Notes
Summary:Note taking
Description:
Note taking application aimed at having oth a lightweight footprint and a
simple interface while keeping a smart user experience.
.
Repo Type:git
Repo:https://github.com/federicoiosue/Omni-Notes.git
Build:4.4.1,91
disable=google analytics, other jars
commit=v_4_4_1
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:4.5.0 Beta 3
Current Version Code:96
| Categories:Office
License:Apache2
Web Site:https://github.com/federicoiosue/Omni-Notes
Source Code:https://github.com/federicoiosue/Omni-Notes
Issue Tracker:https://github.com/federicoiosue/Omni-Notes/issues
Auto Name:Omni Notes
Summary:Note taking
Description:
Note taking application aimed at having oth a lightweight footprint and a
simple interface while keeping a smart user experience.
.
Repo Type:git
Repo:https://github.com/federicoiosue/Omni-Notes.git
Build:4.4.1,91
disable=google analytics, other jars
commit=v_4_4_1
Auto Update Mode:None
Update Check Mode:RepoManifest
- Current Version:4.4.2
? ^ ^
+ Current Version:4.5.0 Beta 3
? ^ ^^^^^^^^
- Current Version Code:92
? ^
+ Current Version Code:96
? ^
| 4 | 0.16 | 2 | 2 |
890de402df0a64598abbde5cd135c62548e7e75c | static/templates/announce_stream_docs.hbs | static/templates/announce_stream_docs.hbs | {{! Explanation of what "announce stream" does when creating a stream }}
<div>
{{#tr}}
<p>Stream will be announced in <b>#{notifications_stream}</b>.</p>
{{/tr}}
<p>{{t 'Organization administrators can change this in the organization settings.' }}</p>
</div>
| {{! Explanation of what "announce stream" does when creating a stream }}
<div>
{{#tr}}
<p>Stream will be announced in <b>#{notifications_stream}</b>.</p>
{{/tr}}
<p>{{t 'Organization administrators can change the announcement stream in the organization settings.' }}</p>
</div>
| Modify the message of the "Announce stream" hint tooltip. | stream_settings: Modify the message of the "Announce stream" hint tooltip.
Follow-up to commit bbda7a5bb05898fd87b99e8b20176ea6bce88aa3.
The "Announce stream" hint tooltip earlier read "Organization
administrators can change this in the organization settings."
It wasn't obvious that "this" refers to the stream the notification
will go to, so the tooltip message has been modified to reflect this.
| Handlebars | apache-2.0 | zulip/zulip,rht/zulip,zulip/zulip,zulip/zulip,zulip/zulip,andersk/zulip,andersk/zulip,andersk/zulip,rht/zulip,zulip/zulip,zulip/zulip,andersk/zulip,rht/zulip,andersk/zulip,rht/zulip,andersk/zulip,rht/zulip,rht/zulip,rht/zulip,zulip/zulip,andersk/zulip | handlebars | ## Code Before:
{{! Explanation of what "announce stream" does when creating a stream }}
<div>
{{#tr}}
<p>Stream will be announced in <b>#{notifications_stream}</b>.</p>
{{/tr}}
<p>{{t 'Organization administrators can change this in the organization settings.' }}</p>
</div>
## Instruction:
stream_settings: Modify the message of the "Announce stream" hint tooltip.
Follow-up to commit bbda7a5bb05898fd87b99e8b20176ea6bce88aa3.
The "Announce stream" hint tooltip earlier read "Organization
administrators can change this in the organization settings."
It wasn't obvious that "this" refers to the stream the notification
will go to, so the tooltip message has been modified to reflect this.
## Code After:
{{! Explanation of what "announce stream" does when creating a stream }}
<div>
{{#tr}}
<p>Stream will be announced in <b>#{notifications_stream}</b>.</p>
{{/tr}}
<p>{{t 'Organization administrators can change the announcement stream in the organization settings.' }}</p>
</div>
| {{! Explanation of what "announce stream" does when creating a stream }}
<div>
{{#tr}}
<p>Stream will be announced in <b>#{notifications_stream}</b>.</p>
{{/tr}}
- <p>{{t 'Organization administrators can change this in the organization settings.' }}</p>
? ^
+ <p>{{t 'Organization administrators can change the announcement stream in the organization settings.' }}</p>
? ^^^^^^^^^^^^^^^ +++++
</div> | 2 | 0.222222 | 1 | 1 |
771c2eef39547dd1c61c53080b7eb2e769e0fc6a | .travis.yml | .travis.yml | language: node_js
node_js:
- 'lts/*'
env:
- NODE_TEST_ENV=development
- NODE_TEST_ENV=production
addons:
chrome: stable # headless chrome testing
before_script:
- yarn build
| language: node_js
node_js:
- 'node'
- 'lts/*'
env:
- NODE_TEST_ENV=development
- NODE_TEST_ENV=production
addons:
chrome: stable # headless chrome testing
before_script:
- yarn build
| Add back latest non-LTS node to CI/CD matrix | Add back latest non-LTS node to CI/CD matrix
| YAML | mit | thesephist/torus | yaml | ## Code Before:
language: node_js
node_js:
- 'lts/*'
env:
- NODE_TEST_ENV=development
- NODE_TEST_ENV=production
addons:
chrome: stable # headless chrome testing
before_script:
- yarn build
## Instruction:
Add back latest non-LTS node to CI/CD matrix
## Code After:
language: node_js
node_js:
- 'node'
- 'lts/*'
env:
- NODE_TEST_ENV=development
- NODE_TEST_ENV=production
addons:
chrome: stable # headless chrome testing
before_script:
- yarn build
| language: node_js
node_js:
+ - 'node'
- 'lts/*'
env:
- NODE_TEST_ENV=development
- NODE_TEST_ENV=production
addons:
chrome: stable # headless chrome testing
before_script:
- yarn build | 1 | 0.083333 | 1 | 0 |
646b0f8babf346f3ec21ae688453deee24fb410f | tests/core/tests/base_formats_tests.py | tests/core/tests/base_formats_tests.py | from __future__ import unicode_literals
import os
from django.test import TestCase
from django.utils.text import force_text
from import_export.formats import base_formats
class XLSTest(TestCase):
def test_binary_format(self):
self.assertTrue(base_formats.XLS().is_binary())
class CSVTest(TestCase):
def setUp(self):
self.format = base_formats.CSV()
def test_import_dos(self):
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-dos.csv')
in_stream = open(filename, self.format.get_read_mode()).read()
expected = 'id,name,author_email\n1,Some book,test@example.com\n'
self.assertEqual(in_stream, expected)
def test_import_unicode(self):
# importing csv UnicodeEncodeError 347
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-unicode.csv')
in_stream = open(filename, self.format.get_read_mode())
data = force_text(in_stream.read())
base_formats.CSV().create_dataset(data)
| from __future__ import unicode_literals
import os
from django.test import TestCase
try:
from django.utils.encoding import force_text
except ImportError:
from django.utils.encoding import force_unicode as force_text
from import_export.formats import base_formats
class XLSTest(TestCase):
def test_binary_format(self):
self.assertTrue(base_formats.XLS().is_binary())
class CSVTest(TestCase):
def setUp(self):
self.format = base_formats.CSV()
def test_import_dos(self):
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-dos.csv')
in_stream = open(filename, self.format.get_read_mode()).read()
expected = 'id,name,author_email\n1,Some book,test@example.com\n'
self.assertEqual(in_stream, expected)
def test_import_unicode(self):
# importing csv UnicodeEncodeError 347
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-unicode.csv')
in_stream = open(filename, self.format.get_read_mode())
data = force_text(in_stream.read())
base_formats.CSV().create_dataset(data)
| Fix importing force_text tests for 1.4 compatibility | Fix importing force_text tests for 1.4 compatibility
use 1.4 compat code
| Python | bsd-2-clause | copperleaftech/django-import-export,PetrDlouhy/django-import-export,PetrDlouhy/django-import-export,rhunwicks/django-import-export,copperleaftech/django-import-export,Apkawa/django-import-export,jnns/django-import-export,PetrDlouhy/django-import-export,daniell/django-import-export,django-import-export/django-import-export,django-import-export/django-import-export,pajod/django-import-export,daniell/django-import-export,brillgen/django-import-export,pajod/django-import-export,bmihelac/django-import-export,manelclos/django-import-export,jnns/django-import-export,brillgen/django-import-export,jnns/django-import-export,jnns/django-import-export,copperleaftech/django-import-export,PetrDlouhy/django-import-export,bmihelac/django-import-export,pajod/django-import-export,bmihelac/django-import-export,Apkawa/django-import-export,bmihelac/django-import-export,Apkawa/django-import-export,daniell/django-import-export,daniell/django-import-export,copperleaftech/django-import-export,manelclos/django-import-export,brillgen/django-import-export,django-import-export/django-import-export,rhunwicks/django-import-export,pajod/django-import-export,django-import-export/django-import-export,manelclos/django-import-export,rhunwicks/django-import-export,brillgen/django-import-export | python | ## Code Before:
from __future__ import unicode_literals
import os
from django.test import TestCase
from django.utils.text import force_text
from import_export.formats import base_formats
class XLSTest(TestCase):
def test_binary_format(self):
self.assertTrue(base_formats.XLS().is_binary())
class CSVTest(TestCase):
def setUp(self):
self.format = base_formats.CSV()
def test_import_dos(self):
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-dos.csv')
in_stream = open(filename, self.format.get_read_mode()).read()
expected = 'id,name,author_email\n1,Some book,test@example.com\n'
self.assertEqual(in_stream, expected)
def test_import_unicode(self):
# importing csv UnicodeEncodeError 347
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-unicode.csv')
in_stream = open(filename, self.format.get_read_mode())
data = force_text(in_stream.read())
base_formats.CSV().create_dataset(data)
## Instruction:
Fix force_text import in tests for Django 1.4 compatibility
use 1.4 compat code
## Code After:
from __future__ import unicode_literals
import os
from django.test import TestCase
try:
from django.utils.encoding import force_text
except ImportError:
from django.utils.encoding import force_unicode as force_text
from import_export.formats import base_formats
class XLSTest(TestCase):
def test_binary_format(self):
self.assertTrue(base_formats.XLS().is_binary())
class CSVTest(TestCase):
def setUp(self):
self.format = base_formats.CSV()
def test_import_dos(self):
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-dos.csv')
in_stream = open(filename, self.format.get_read_mode()).read()
expected = 'id,name,author_email\n1,Some book,test@example.com\n'
self.assertEqual(in_stream, expected)
def test_import_unicode(self):
# importing csv UnicodeEncodeError 347
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-unicode.csv')
in_stream = open(filename, self.format.get_read_mode())
data = force_text(in_stream.read())
base_formats.CSV().create_dataset(data)
| from __future__ import unicode_literals
import os
from django.test import TestCase
+
+ try:
- from django.utils.text import force_text
? - ^^
+ from django.utils.encoding import force_text
? ++++ ^^^^^^^
+ except ImportError:
+ from django.utils.encoding import force_unicode as force_text
from import_export.formats import base_formats
class XLSTest(TestCase):
def test_binary_format(self):
self.assertTrue(base_formats.XLS().is_binary())
class CSVTest(TestCase):
def setUp(self):
self.format = base_formats.CSV()
def test_import_dos(self):
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-dos.csv')
in_stream = open(filename, self.format.get_read_mode()).read()
expected = 'id,name,author_email\n1,Some book,test@example.com\n'
self.assertEqual(in_stream, expected)
def test_import_unicode(self):
# importing csv UnicodeEncodeError 347
filename = os.path.join(
os.path.dirname(__file__),
os.path.pardir,
'exports',
'books-unicode.csv')
in_stream = open(filename, self.format.get_read_mode())
data = force_text(in_stream.read())
base_formats.CSV().create_dataset(data) | 6 | 0.146341 | 5 | 1 |
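The fix above uses a try/except import shim so the tests work both on Django versions that provide `force_text` and on 1.4, which only ships `force_unicode`. A self-contained sketch of the same pattern (the fallback body is a hypothetical stand-in so the snippet also runs with no Django installed at all):

```python
try:
    # Newer Django versions expose force_text in django.utils.encoding.
    from django.utils.encoding import force_text
except ImportError:
    # Hypothetical fallback so this sketch runs without Django:
    # decode bytes, stringify everything else (roughly what force_text did).
    def force_text(value, encoding="utf-8"):
        if isinstance(value, bytes):
            return value.decode(encoding)
        return str(value)

print(force_text(b"caf\xc3\xa9"))  # café
```

Binding the compatible name at import time keeps the rest of the module on a single spelling, so only the import site knows about the version split.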
96c8eb90358eb5d1136338a053dbeda6303caa1c | ORIGINSTORY.md | ORIGINSTORY.md |
This project came to be when I saw a really badly implemented [*Icelandic name generator*](http://www.vg.no/spesial/2016/islandsk-navnegenerator/?lang=en) which swapped out characters just because, added random suffixes and didn't use genitive form. It was horrible to use as an Icelander but it was a nice idea. So I just wanted to do a better job of generating correct and thus **better names** (hence the name of the project).
So in short hacking session I threw this together. Sorry for sloppy code. I just wanted to get this out of my system.
|
This project came to be when I saw a really badly implemented [*Icelandic name generator*](http://www.vg.no/spesial/2016/islandsk-navnegenerator/?lang=en) which swapped out characters just because, added random suffixes and didn't use genitive form. It was horrible to use as an Icelander but it was a nice idea. So I just wanted to do a better job of generating correct and thus **better names** (hence the name of the project).
So in an evening hacking session I threw this together. Sorry for sloppy code. I just wanted to get this out of my system.
| Fix typo by adding a better explanation | Fix typo by adding a better explanation
It was really just one evening.
| Markdown | agpl-3.0 | trickvi/better-names | markdown | ## Code Before:
This project came to be when I saw a really badly implemented [*Icelandic name generator*](http://www.vg.no/spesial/2016/islandsk-navnegenerator/?lang=en) which swapped out characters just because, added random suffixes and didn't use genitive form. It was horrible to use as an Icelander but it was a nice idea. So I just wanted to do a better job of generating correct and thus **better names** (hence the name of the project).
So in short hacking session I threw this together. Sorry for sloppy code. I just wanted to get this out of my system.
## Instruction:
Fix typo by adding a better explanation
It was really just one evening.
## Code After:
This project came to be when I saw a really badly implemented [*Icelandic name generator*](http://www.vg.no/spesial/2016/islandsk-navnegenerator/?lang=en) which swapped out characters just because, added random suffixes and didn't use genitive form. It was horrible to use as an Icelander but it was a nice idea. So I just wanted to do a better job of generating correct and thus **better names** (hence the name of the project).
So in an evening hacking session I threw this together. Sorry for sloppy code. I just wanted to get this out of my system.
|
This project came to be when I saw a really badly implemented [*Icelandic name generator*](http://www.vg.no/spesial/2016/islandsk-navnegenerator/?lang=en) which swapped out characters just because, added random suffixes and didn't use genitive form. It was horrible to use as an Icelander but it was a nice idea. So I just wanted to do a better job of generating correct and thus **better names** (hence the name of the project).
- So in short hacking session I threw this together. Sorry for sloppy code. I just wanted to get this out of my system.
? ^^^^^
+ So in an evening hacking session I threw this together. Sorry for sloppy code. I just wanted to get this out of my system.
? ^^^^^^^^^^
| 2 | 0.5 | 1 | 1 |
e2eaf48c0cc9198a388e42d8c2ca5aa446df409e | tests/contentSpec.js | tests/contentSpec.js | describe('traffic', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: ''
};
subject = window.matrix.content;
sandbox = sinon.sandbox.create();
server = sinon.fakeServer.create();
});
afterEach(function() {
sandbox.restore();
server.restore();
});
it('has initial points', function() {
expect(subject.pages).to.eql([]);
});
describe('#endpoint', function() {
it('returns the path to the servers realtime endpoint', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
context('with profileId', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: 'Test'
};
});
it('returns correct profile Id in the endpoint path', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:Test&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
});
});
});
| describe('traffic', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: ''
};
subject = window.matrix.content;
sandbox = sinon.sandbox.create();
server = sinon.fakeServer.create();
});
afterEach(function() {
sandbox.restore();
server.restore();
});
it('empty pages', function() {
expect(subject.pages).to.eql([]);
});
describe('#init', function() {
it("calls reload", function() {
mock = sandbox.mock(subject).expects("reload").once();
subject.init();
mock.verify();
});
it("calls reload with interval of 30 minute", function() {
clock = sinon.useFakeTimers(Date.now());
mock = sandbox.mock(subject).expects("reload").twice();
subject.init();
clock.tick(1800000);
mock.verify();
clock.restore();
});
});
describe('#endpoint', function() {
it('returns the path to the servers realtime endpoint', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
context('with profileId', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: 'Test'
};
});
it('returns correct profile Id in the endpoint path', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:Test&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
});
});
});
| Add tests for init method | Add tests for init method
| JavaScript | mit | codeforamerica/city-analytics-dashboard,codeforamerica/city-analytics-dashboard,codeforamerica/city-analytics-dashboard | javascript | ## Code Before:
describe('traffic', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: ''
};
subject = window.matrix.content;
sandbox = sinon.sandbox.create();
server = sinon.fakeServer.create();
});
afterEach(function() {
sandbox.restore();
server.restore();
});
it('has initial points', function() {
expect(subject.pages).to.eql([]);
});
describe('#endpoint', function() {
it('returns the path to the servers realtime endpoint', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
context('with profileId', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: 'Test'
};
});
it('returns correct profile Id in the endpoint path', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:Test&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
});
});
});
## Instruction:
Add tests for init method
## Code After:
describe('traffic', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: ''
};
subject = window.matrix.content;
sandbox = sinon.sandbox.create();
server = sinon.fakeServer.create();
});
afterEach(function() {
sandbox.restore();
server.restore();
});
it('empty pages', function() {
expect(subject.pages).to.eql([]);
});
describe('#init', function() {
it("calls reload", function() {
mock = sandbox.mock(subject).expects("reload").once();
subject.init();
mock.verify();
});
it("calls reload with interval of 30 minute", function() {
clock = sinon.useFakeTimers(Date.now());
mock = sandbox.mock(subject).expects("reload").twice();
subject.init();
clock.tick(1800000);
mock.verify();
clock.restore();
});
});
describe('#endpoint', function() {
it('returns the path to the servers realtime endpoint', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
context('with profileId', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: 'Test'
};
});
it('returns correct profile Id in the endpoint path', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:Test&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
});
});
});
| describe('traffic', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: ''
};
subject = window.matrix.content;
sandbox = sinon.sandbox.create();
server = sinon.fakeServer.create();
});
afterEach(function() {
sandbox.restore();
server.restore();
});
- it('has initial points', function() {
+ it('empty pages', function() {
expect(subject.pages).to.eql([]);
+ });
+ describe('#init', function() {
+ it("calls reload", function() {
+ mock = sandbox.mock(subject).expects("reload").once();
+ subject.init();
+ mock.verify();
+ });
+ it("calls reload with interval of 30 minute", function() {
+ clock = sinon.useFakeTimers(Date.now());
+ mock = sandbox.mock(subject).expects("reload").twice();
+ subject.init();
+ clock.tick(1800000);
+ mock.verify();
+ clock.restore();
+ });
});
describe('#endpoint', function() {
it('returns the path to the servers realtime endpoint', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
context('with profileId', function() {
beforeEach(function() {
window.matrix.settings = {
profileId: 'Test'
};
});
it('returns correct profile Id in the endpoint path', function() {
expect(subject.endpoint()).to.eql('/realtime?ids=ga:Test&metrics=rt:pageviews&dimensions=rt:pageTitle,rt:pagePath&max-results=10&sort=-rt%3Apageviews');
});
});
});
}); | 17 | 0.53125 | 16 | 1 |
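The second spec above relies on sinon's fake timers: `useFakeTimers` freezes the clock, `clock.tick(1800000)` advances it 30 minutes, and the mock verifies that `reload` ran twice. The same idea can be sketched in Python by injecting a hand-rolled fake timer into the object under test (all names here are hypothetical; this is not the project's code):

```python
class FakeTimer:
    """Test double for setInterval: callbacks fire only when the test ticks."""

    def __init__(self):
        self._callbacks = []

    def set_interval(self, callback, interval_ms):
        self._callbacks.append((callback, interval_ms))

    def tick(self, elapsed_ms):
        # Fire each registered callback once per full interval elapsed.
        for callback, interval_ms in self._callbacks:
            for _ in range(elapsed_ms // interval_ms):
                callback()


class Content:
    """Hypothetical stand-in for the widget: reloads now, then every 30 min."""

    RELOAD_INTERVAL_MS = 30 * 60 * 1000

    def __init__(self, timer):
        self._timer = timer
        self.reload_count = 0

    def reload(self):
        self.reload_count += 1

    def init(self):
        self.reload()  # immediate first load
        self._timer.set_interval(self.reload, self.RELOAD_INTERVAL_MS)


timer = FakeTimer()
content = Content(timer)
content.init()
timer.tick(Content.RELOAD_INTERVAL_MS)  # advance 30 simulated minutes
print(content.reload_count)  # 2: once at init, once after the tick
```

The test never sleeps: time only moves when `tick()` is called, so a 30-minute interval is verified in microseconds.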
46d81f2bcfa56dc2579d26b1d38b86051119c7ca | src/Core/Framework/QScriptEngineHelpers.h | src/Core/Framework/QScriptEngineHelpers.h | // For conditions of distribution and use, see copyright notice in license.txt
#include <QScriptEngine>
#include <QMetaType>
// The following functions help register a custom QObject-derived class to a QScriptEngine.
// See http://lists.trolltech.com/qt-interest/2007-12/thread00158-0.html .
template <typename Tp>
QScriptValue qScriptValueFromQObject(QScriptEngine *engine, Tp const &qobject)
{
return engine->newQObject(qobject);
}
template <typename Tp>
void qScriptValueToQObject(const QScriptValue &value, Tp &qobject)
{
qobject = qobject_cast<Tp>(value.toQObject());
}
template <typename Tp>
int qScriptRegisterQObjectMetaType(QScriptEngine *engine, const QScriptValue &prototype = QScriptValue()
#ifndef qdoc
, Tp * = 0
#endif
)
{
return qScriptRegisterMetaType<Tp>(engine, qScriptValueFromQObject, qScriptValueToQObject, prototype);
}
| // For conditions of distribution and use, see copyright notice in license.txt
#include <QScriptEngine>
#include <QMetaType>
// The following functions help register a custom QObject-derived class to a QScriptEngine.
// See http://lists.trolltech.com/qt-interest/2007-12/thread00158-0.html .
template <typename Tp>
QScriptValue qScriptValueFromQObject(QScriptEngine *engine, Tp const &qobject)
{
return engine->newQObject(qobject);
}
template <typename Tp>
void qScriptValueToQObject(const QScriptValue &value, Tp &qobject)
{
qobject = dynamic_cast<Tp>(value.toQObject());
if (!qobject)
{
// qobject_cast has been observed to fail for some metatypes, such as Entity*, so prefer dynamic_cast.
// However, to see that there are no regressions from that, check that if dynamic_cast fails, so does qobject_cast
Tp ptr = qobject_cast<Tp>(value.toQObject());
assert(!ptr);
if (ptr)
::LogError("qScriptValueToQObject: dynamic_cast was null, but qobject_cast was not!");
}
}
template <typename Tp>
int qScriptRegisterQObjectMetaType(QScriptEngine *engine, const QScriptValue &prototype = QScriptValue()
#ifndef qdoc
, Tp * = 0
#endif
)
{
return qScriptRegisterMetaType<Tp>(engine, qScriptValueFromQObject, qScriptValueToQObject, prototype);
}
| Use dynamic_cast in qScriptValueToQObject as qobject_cast fails for some metatypes such as Entity*. However, assert that if dynamic_cast is null, so is qobject_cast. | Use dynamic_cast in qScriptValueToQObject as qobject_cast fails for some metatypes such as Entity*. However, assert that if dynamic_cast is null, so is qobject_cast.
| C | apache-2.0 | realXtend/tundra,jesterKing/naali,pharos3d/tundra,jesterKing/naali,jesterKing/naali,AlphaStaxLLC/tundra,AlphaStaxLLC/tundra,BogusCurry/tundra,jesterKing/naali,pharos3d/tundra,jesterKing/naali,AlphaStaxLLC/tundra,jesterKing/naali,BogusCurry/tundra,jesterKing/naali,BogusCurry/tundra,BogusCurry/tundra,pharos3d/tundra,realXtend/tundra,pharos3d/tundra,AlphaStaxLLC/tundra,realXtend/tundra,pharos3d/tundra,BogusCurry/tundra,AlphaStaxLLC/tundra,realXtend/tundra,pharos3d/tundra,AlphaStaxLLC/tundra,realXtend/tundra,realXtend/tundra,BogusCurry/tundra | c | ## Code Before:
// For conditions of distribution and use, see copyright notice in license.txt
#include <QScriptEngine>
#include <QMetaType>
// The following functions help register a custom QObject-derived class to a QScriptEngine.
// See http://lists.trolltech.com/qt-interest/2007-12/thread00158-0.html .
template <typename Tp>
QScriptValue qScriptValueFromQObject(QScriptEngine *engine, Tp const &qobject)
{
return engine->newQObject(qobject);
}
template <typename Tp>
void qScriptValueToQObject(const QScriptValue &value, Tp &qobject)
{
qobject = qobject_cast<Tp>(value.toQObject());
}
template <typename Tp>
int qScriptRegisterQObjectMetaType(QScriptEngine *engine, const QScriptValue &prototype = QScriptValue()
#ifndef qdoc
, Tp * = 0
#endif
)
{
return qScriptRegisterMetaType<Tp>(engine, qScriptValueFromQObject, qScriptValueToQObject, prototype);
}
## Instruction:
Use dynamic_cast in qScriptValueToQObject as qobject_cast fails for some metatypes such as Entity*. However, assert that if dynamic_cast is null, so is qobject_cast.
## Code After:
// For conditions of distribution and use, see copyright notice in license.txt
#include <QScriptEngine>
#include <QMetaType>
// The following functions help register a custom QObject-derived class to a QScriptEngine.
// See http://lists.trolltech.com/qt-interest/2007-12/thread00158-0.html .
template <typename Tp>
QScriptValue qScriptValueFromQObject(QScriptEngine *engine, Tp const &qobject)
{
return engine->newQObject(qobject);
}
template <typename Tp>
void qScriptValueToQObject(const QScriptValue &value, Tp &qobject)
{
qobject = dynamic_cast<Tp>(value.toQObject());
if (!qobject)
{
// qobject_cast has been observed to fail for some metatypes, such as Entity*, so prefer dynamic_cast.
// However, to see that there are no regressions from that, check that if dynamic_cast fails, so does qobject_cast
Tp ptr = qobject_cast<Tp>(value.toQObject());
assert(!ptr);
if (ptr)
::LogError("qScriptValueToQObject: dynamic_cast was null, but qobject_cast was not!");
}
}
template <typename Tp>
int qScriptRegisterQObjectMetaType(QScriptEngine *engine, const QScriptValue &prototype = QScriptValue()
#ifndef qdoc
, Tp * = 0
#endif
)
{
return qScriptRegisterMetaType<Tp>(engine, qScriptValueFromQObject, qScriptValueToQObject, prototype);
}
| // For conditions of distribution and use, see copyright notice in license.txt
#include <QScriptEngine>
#include <QMetaType>
// The following functions help register a custom QObject-derived class to a QScriptEngine.
// See http://lists.trolltech.com/qt-interest/2007-12/thread00158-0.html .
template <typename Tp>
QScriptValue qScriptValueFromQObject(QScriptEngine *engine, Tp const &qobject)
{
return engine->newQObject(qobject);
}
template <typename Tp>
void qScriptValueToQObject(const QScriptValue &value, Tp &qobject)
{
- qobject = qobject_cast<Tp>(value.toQObject());
? ^^^^^ -
+ qobject = dynamic_cast<Tp>(value.toQObject());
? ^^^^^^
+ if (!qobject)
+ {
+ // qobject_cast has been observed to fail for some metatypes, such as Entity*, so prefer dynamic_cast.
+ // However, to see that there are no regressions from that, check that if dynamic_cast fails, so does qobject_cast
+ Tp ptr = qobject_cast<Tp>(value.toQObject());
+ assert(!ptr);
+ if (ptr)
+ ::LogError("qScriptValueToQObject: dynamic_cast was null, but qobject_cast was not!");
+ }
}
template <typename Tp>
int qScriptRegisterQObjectMetaType(QScriptEngine *engine, const QScriptValue &prototype = QScriptValue()
#ifndef qdoc
, Tp * = 0
#endif
)
{
return qScriptRegisterMetaType<Tp>(engine, qScriptValueFromQObject, qScriptValueToQObject, prototype);
} | 11 | 0.392857 | 10 | 1 |
1577651697ad587fecb657cabc981f63d2fa683c | spec/fc-reminder/gateways/twilio_spec.rb | spec/fc-reminder/gateways/twilio_spec.rb | require 'spec_helper'
describe FCReminder::Gateways::Twilio do
subject(:gateway) { FCReminder::Gateways::Twilio.new }
let(:config) do
{
account_sid: "accound sid",
auth_token: "auth token",
phone_number: "+1234567890"
}
end
context "#initialize" do
it { expect(gateway).to be_kind_of(FCReminder::Gateways::Base) }
end
context "#client" do
before { gateway.config = config }
it { expect(gateway.client).to be_instance_of(Twilio::REST::Client) }
it { expect{ gateway.client }.not_to change{ gateway.client } }
end
context "#send" do
before { gateway.config = config }
before do
allow(gateway.client)
.to receive_message_chain(:account, :messages, :create)
end
it { expect(gateway).to respond_to(:send).with(2).arguments }
it "sends message using Twilio REST API" do
args = {
from: config[:phone_number],
to: "recipient",
body: anything()
}
expect(gateway.client.account.messages)
.to receive(:create)
.once
.with(args)
gateway.send("recipient", {})
end
end
end
| require 'spec_helper'
describe FCReminder::Gateways::Twilio do
subject(:gateway) { FCReminder::Gateways::Twilio.new }
let(:config) do
{
account_sid: "accound sid",
auth_token: "auth token",
phone_number: "+1234567890"
}
end
context "#initialize" do
it { expect(gateway).to be_kind_of(FCReminder::Gateways::Base) }
end
context "#client" do
before { gateway.config = config }
it { expect(gateway.client).to be_instance_of(Twilio::REST::Client) }
it { expect{ gateway.client }.not_to change{ gateway.client } }
end
context "#send" do
before do
gateway.config = config
allow(gateway.client)
.to receive_message_chain(:account, :messages, :create)
end
it { expect(gateway).to respond_to(:send).with(2).arguments }
it "sends message using Twilio REST API" do
args = {
from: config[:phone_number],
to: "recipient",
body: anything()
}
expect(gateway.client.account.messages)
.to receive(:create)
.once
.with(args)
gateway.send("recipient", {})
end
end
end
| Use only one before block | Use only one before block | Ruby | bsd-3-clause | kiela/fc-reminder | ruby | ## Code Before:
require 'spec_helper'
describe FCReminder::Gateways::Twilio do
subject(:gateway) { FCReminder::Gateways::Twilio.new }
let(:config) do
{
account_sid: "accound sid",
auth_token: "auth token",
phone_number: "+1234567890"
}
end
context "#initialize" do
it { expect(gateway).to be_kind_of(FCReminder::Gateways::Base) }
end
context "#client" do
before { gateway.config = config }
it { expect(gateway.client).to be_instance_of(Twilio::REST::Client) }
it { expect{ gateway.client }.not_to change{ gateway.client } }
end
context "#send" do
before { gateway.config = config }
before do
allow(gateway.client)
.to receive_message_chain(:account, :messages, :create)
end
it { expect(gateway).to respond_to(:send).with(2).arguments }
it "sends message using Twilio REST API" do
args = {
from: config[:phone_number],
to: "recipient",
body: anything()
}
expect(gateway.client.account.messages)
.to receive(:create)
.once
.with(args)
gateway.send("recipient", {})
end
end
end
## Instruction:
Use only one before block
## Code After:
require 'spec_helper'
describe FCReminder::Gateways::Twilio do
subject(:gateway) { FCReminder::Gateways::Twilio.new }
let(:config) do
{
account_sid: "accound sid",
auth_token: "auth token",
phone_number: "+1234567890"
}
end
context "#initialize" do
it { expect(gateway).to be_kind_of(FCReminder::Gateways::Base) }
end
context "#client" do
before { gateway.config = config }
it { expect(gateway.client).to be_instance_of(Twilio::REST::Client) }
it { expect{ gateway.client }.not_to change{ gateway.client } }
end
context "#send" do
before do
gateway.config = config
allow(gateway.client)
.to receive_message_chain(:account, :messages, :create)
end
it { expect(gateway).to respond_to(:send).with(2).arguments }
it "sends message using Twilio REST API" do
args = {
from: config[:phone_number],
to: "recipient",
body: anything()
}
expect(gateway.client.account.messages)
.to receive(:create)
.once
.with(args)
gateway.send("recipient", {})
end
end
end
| require 'spec_helper'
describe FCReminder::Gateways::Twilio do
subject(:gateway) { FCReminder::Gateways::Twilio.new }
let(:config) do
{
account_sid: "accound sid",
auth_token: "auth token",
phone_number: "+1234567890"
}
end
context "#initialize" do
it { expect(gateway).to be_kind_of(FCReminder::Gateways::Base) }
end
context "#client" do
before { gateway.config = config }
it { expect(gateway.client).to be_instance_of(Twilio::REST::Client) }
it { expect{ gateway.client }.not_to change{ gateway.client } }
end
context "#send" do
- before { gateway.config = config }
before do
+ gateway.config = config
+
allow(gateway.client)
.to receive_message_chain(:account, :messages, :create)
end
it { expect(gateway).to respond_to(:send).with(2).arguments }
it "sends message using Twilio REST API" do
args = {
from: config[:phone_number],
to: "recipient",
body: anything()
}
expect(gateway.client.account.messages)
.to receive(:create)
.once
.with(args)
gateway.send("recipient", {})
end
end
end | 3 | 0.06383 | 2 | 1 |
5d332259e16758bc43201073db91409390be9134 | UM/Operations/GroupedOperation.py | UM/Operations/GroupedOperation.py |
from . import Operation
## An operation that groups several other operations together.
#
# The intent of this operation is to hide an underlying chain of operations
# from the user if they correspond to only one interaction with the user, such
# as an operation applied to multiple scene nodes or a re-arrangement of
# multiple items in the scene.
class GroupedOperation(Operation.Operation):
## Creates a new grouped operation.
#
# The grouped operation is empty after its initialisation.
def __init__(self):
super().__init__()
self._children = []
## Adds an operation to this group.
#
# The operation will be undone together with the rest of the operations in
# this group.
# Note that when the order matters, the operations are undone in reverse
# order as the order in which they are added.
def addOperation(self, op):
self._children.append(op)
## Removes an operation from this group.
def removeOperation(self, index):
del self._children[index]
## Undo all operations in this group.
#
# The operations are undone in reverse order as the order in which they
# were added.
def undo(self):
for op in reversed(self._children):
op.undo()
## Redoes all operations in this group.
def redo(self):
for op in self._children:
op.redo()
|
from . import Operation
## An operation that groups several other operations together.
#
# The intent of this operation is to hide an underlying chain of operations
# from the user if they correspond to only one interaction with the user, such
# as an operation applied to multiple scene nodes or a re-arrangement of
# multiple items in the scene.
class GroupedOperation(Operation.Operation):
## Creates a new grouped operation.
#
# The grouped operation is empty after its initialisation.
def __init__(self):
super().__init__()
self._children = []
## Adds an operation to this group.
#
# The operation will be undone together with the rest of the operations in
# this group.
# Note that when the order matters, the operations are undone in reverse
# order as the order in which they are added.
def addOperation(self, op):
self._children.append(op)
## Undo all operations in this group.
#
# The operations are undone in reverse order as the order in which they
# were added.
def undo(self):
for op in reversed(self._children):
op.undo()
## Redoes all operations in this group.
def redo(self):
for op in self._children:
op.redo()
| Remove removeOperation from grouped operation | Remove removeOperation from grouped operation
This function is never used and actually should never be used. The operation may not be modified after it is used, so removing an operation from the list makes no sense.
| Python | agpl-3.0 | onitake/Uranium,onitake/Uranium | python | ## Code Before:
from . import Operation
## An operation that groups several other operations together.
#
# The intent of this operation is to hide an underlying chain of operations
# from the user if they correspond to only one interaction with the user, such
# as an operation applied to multiple scene nodes or a re-arrangement of
# multiple items in the scene.
class GroupedOperation(Operation.Operation):
## Creates a new grouped operation.
#
# The grouped operation is empty after its initialisation.
def __init__(self):
super().__init__()
self._children = []
## Adds an operation to this group.
#
# The operation will be undone together with the rest of the operations in
# this group.
# Note that when the order matters, the operations are undone in reverse
# order as the order in which they are added.
def addOperation(self, op):
self._children.append(op)
## Removes an operation from this group.
def removeOperation(self, index):
del self._children[index]
## Undo all operations in this group.
#
# The operations are undone in reverse order as the order in which they
# were added.
def undo(self):
for op in reversed(self._children):
op.undo()
## Redoes all operations in this group.
def redo(self):
for op in self._children:
op.redo()
## Instruction:
Remove removeOperation from grouped operation
This function is never used and actually should never be used. The operation may not be modified after it is used, so removing an operation from the list makes no sense.
## Code After:
from . import Operation
## An operation that groups several other operations together.
#
# The intent of this operation is to hide an underlying chain of operations
# from the user if they correspond to only one interaction with the user, such
# as an operation applied to multiple scene nodes or a re-arrangement of
# multiple items in the scene.
class GroupedOperation(Operation.Operation):
## Creates a new grouped operation.
#
# The grouped operation is empty after its initialisation.
def __init__(self):
super().__init__()
self._children = []
## Adds an operation to this group.
#
# The operation will be undone together with the rest of the operations in
# this group.
# Note that when the order matters, the operations are undone in reverse
# order as the order in which they are added.
def addOperation(self, op):
self._children.append(op)
## Undo all operations in this group.
#
# The operations are undone in reverse order as the order in which they
# were added.
def undo(self):
for op in reversed(self._children):
op.undo()
## Redoes all operations in this group.
def redo(self):
for op in self._children:
op.redo()
|
from . import Operation
## An operation that groups several other operations together.
#
# The intent of this operation is to hide an underlying chain of operations
# from the user if they correspond to only one interaction with the user, such
# as an operation applied to multiple scene nodes or a re-arrangement of
# multiple items in the scene.
class GroupedOperation(Operation.Operation):
## Creates a new grouped operation.
#
# The grouped operation is empty after its initialisation.
def __init__(self):
super().__init__()
self._children = []
## Adds an operation to this group.
#
# The operation will be undone together with the rest of the operations in
# this group.
# Note that when the order matters, the operations are undone in reverse
# order as the order in which they are added.
def addOperation(self, op):
self._children.append(op)
- ## Removes an operation from this group.
- def removeOperation(self, index):
- del self._children[index]
-
## Undo all operations in this group.
#
# The operations are undone in reverse order as the order in which they
# were added.
def undo(self):
for op in reversed(self._children):
op.undo()
## Redoes all operations in this group.
def redo(self):
for op in self._children:
op.redo() | 4 | 0.095238 | 0 | 4 |
5ccc167150e63ba58522d0bf62301980f2b456c0 | views/instagram.rb | views/instagram.rb | module Views
module Instagram
class << self
def parse(message)
message_array = message.split(' ')
message_array.map! do |word|
if word.include?(Views::HASHMARK)
populate_tag_url(word)
elsif word.start_with?(Views::AT_SIGN)
"<a href='https://www.instagram.com/#{word.delete(Views::AT_SIGN)}'>#{word}</a>"
else
word
end
end
message_array.join(' ')
end
private
def populate_tag_url(word)
if word.end_with?(Views::POINT)
"#{url_for(word.delete(Views::POINT))}#{Views::POINT}"
elsif word.end_with?(Views::COMMA)
"#{url_for(word.delete(Views::COMMA))}#{Views::COMMA}"
else
url_for(word)
end
end
def url_for(tag)
"<a href='https://www.instagram.com/explore/tags/#{tag.delete(Views::HASHMARK)}/'>#{tag}</a>"
end
end
end
end
| module Views
module Instagram
class << self
def parse(message)
message_array = message.split(' ')
message_array.map! do |word|
next populate_tag_url(word) if word.include?(Views::HASHMARK)
next account_url_for(word) if word.start_with?(Views::AT_SIGN)
word
end
message_array.join(' ')
end
private
def populate_tag_url(word)
return "#{tag_url_for(word.delete(Views::POINT))}#{Views::POINT}" if word.end_with?(Views::POINT)
return "#{tag_url_for(word.delete(Views::COMMA))}#{Views::COMMA}" if word.end_with?(Views::COMMA)
tag_url_for(word)
end
def tag_url_for(word)
"<a href='https://www.instagram.com/explore/tags/#{word.delete(Views::HASHMARK)}/'>#{word}</a>"
end
def account_url_for(word)
"<a href='https://www.instagram.com/#{word.delete(Views::AT_SIGN)}'>#{word}</a>"
end
end
end
end
| Simplify methods of Instagram module | Simplify methods of Instagram module
* Use early returns
| Ruby | mit | spaceneedle2019/FranksHat,spaceneedle2019/FranksHat,spaceneedle2019/FranksHat | ruby | ## Code Before:
module Views
module Instagram
class << self
def parse(message)
message_array = message.split(' ')
message_array.map! do |word|
if word.include?(Views::HASHMARK)
populate_tag_url(word)
elsif word.start_with?(Views::AT_SIGN)
"<a href='https://www.instagram.com/#{word.delete(Views::AT_SIGN)}'>#{word}</a>"
else
word
end
end
message_array.join(' ')
end
private
def populate_tag_url(word)
if word.end_with?(Views::POINT)
"#{url_for(word.delete(Views::POINT))}#{Views::POINT}"
elsif word.end_with?(Views::COMMA)
"#{url_for(word.delete(Views::COMMA))}#{Views::COMMA}"
else
url_for(word)
end
end
def url_for(tag)
"<a href='https://www.instagram.com/explore/tags/#{tag.delete(Views::HASHMARK)}/'>#{tag}</a>"
end
end
end
end
## Instruction:
Simplify methods of Instagram module
* Use early returns
## Code After:
module Views
module Instagram
class << self
def parse(message)
message_array = message.split(' ')
message_array.map! do |word|
next populate_tag_url(word) if word.include?(Views::HASHMARK)
next account_url_for(word) if word.start_with?(Views::AT_SIGN)
word
end
message_array.join(' ')
end
private
def populate_tag_url(word)
return "#{tag_url_for(word.delete(Views::POINT))}#{Views::POINT}" if word.end_with?(Views::POINT)
return "#{tag_url_for(word.delete(Views::COMMA))}#{Views::COMMA}" if word.end_with?(Views::COMMA)
tag_url_for(word)
end
def tag_url_for(word)
"<a href='https://www.instagram.com/explore/tags/#{word.delete(Views::HASHMARK)}/'>#{word}</a>"
end
def account_url_for(word)
"<a href='https://www.instagram.com/#{word.delete(Views::AT_SIGN)}'>#{word}</a>"
end
end
end
end
| module Views
module Instagram
class << self
def parse(message)
message_array = message.split(' ')
message_array.map! do |word|
- if word.include?(Views::HASHMARK)
+ next populate_tag_url(word) if word.include?(Views::HASHMARK)
? ++++++++++++++++++++++++++++
- populate_tag_url(word)
- elsif word.start_with?(Views::AT_SIGN)
? ^
+ next account_url_for(word) if word.start_with?(Views::AT_SIGN)
? + +++++++++++++ ^^^^^^^^^^^
- "<a href='https://www.instagram.com/#{word.delete(Views::AT_SIGN)}'>#{word}</a>"
- else
- word
? --
+ word
- end
end
message_array.join(' ')
end
private
def populate_tag_url(word)
+ return "#{tag_url_for(word.delete(Views::POINT))}#{Views::POINT}" if word.end_with?(Views::POINT)
+ return "#{tag_url_for(word.delete(Views::COMMA))}#{Views::COMMA}" if word.end_with?(Views::COMMA)
- if word.end_with?(Views::POINT)
- "#{url_for(word.delete(Views::POINT))}#{Views::POINT}"
- elsif word.end_with?(Views::COMMA)
- "#{url_for(word.delete(Views::COMMA))}#{Views::COMMA}"
- else
- url_for(word)
? ^^
+ tag_url_for(word)
? ^^^^
- end
end
- def url_for(tag)
? ^^^
+ def tag_url_for(word)
? ++++ ^^^^
- "<a href='https://www.instagram.com/explore/tags/#{tag.delete(Views::HASHMARK)}/'>#{tag}</a>"
? ^^^ ^^^
+ "<a href='https://www.instagram.com/explore/tags/#{word.delete(Views::HASHMARK)}/'>#{word}</a>"
? ^^^^ ^^^^
+ end
+
+ def account_url_for(word)
+ "<a href='https://www.instagram.com/#{word.delete(Views::AT_SIGN)}'>#{word}</a>"
end
end
end
end | 28 | 0.8 | 12 | 16 |
adb6288e0414a95c3001bae364e7d84fa67dbd7a | dev-build-push.sh | dev-build-push.sh |
dcos marathon app remove /scale-db
dcos marathon app remove /scale-fluentd
dcos marathon app remove /scale-webserver
dcos marathon app stop /scale
if [ ! -f scale-ui/deploy/scale-ui-master.tar.gz ]
then
cd scale-ui
tar xf node_modules.tar.gz
tar xf bower_components.tar.gz
npm install
node node_modules/gulp/bin/gulp.js deploy
cd ..
fi
docker build -t $1 -f Dockerfile-dev .
docker push $1
dcos marathon app start /scale 1
|
dcos marathon app remove /scale-db
dcos marathon app remove /scale-fluentd
dcos marathon app remove /scale-webserver
dcos marathon app stop /scale
# Assumes an adjacent scale-ui source code checkout to grab UI assets from
if [ ! -f ./scale-ui/index.html ]
then
cd ../scale-ui
npm run builddev:prod
cd -
cp -R ../scale-ui/dist/developer ./scale-ui
fi
docker build -t $1 -f Dockerfile-dev .
docker push $1
dcos marathon app start /scale 1
| Build UI in for testing | Build UI in for testing
| Shell | apache-2.0 | ngageoint/scale,ngageoint/scale,ngageoint/scale,ngageoint/scale | shell | ## Code Before:
dcos marathon app remove /scale-db
dcos marathon app remove /scale-fluentd
dcos marathon app remove /scale-webserver
dcos marathon app stop /scale
if [ ! -f scale-ui/deploy/scale-ui-master.tar.gz ]
then
cd scale-ui
tar xf node_modules.tar.gz
tar xf bower_components.tar.gz
npm install
node node_modules/gulp/bin/gulp.js deploy
cd ..
fi
docker build -t $1 -f Dockerfile-dev .
docker push $1
dcos marathon app start /scale 1
## Instruction:
Build UI in for testing
## Code After:
dcos marathon app remove /scale-db
dcos marathon app remove /scale-fluentd
dcos marathon app remove /scale-webserver
dcos marathon app stop /scale
# Assumes an adjacent scale-ui source code checkout to grab UI assets from
if [ ! -f ./scale-ui/index.html ]
then
cd ../scale-ui
npm run builddev:prod
cd -
cp -R ../scale-ui/dist/developer ./scale-ui
fi
docker build -t $1 -f Dockerfile-dev .
docker push $1
dcos marathon app start /scale 1
|
dcos marathon app remove /scale-db
dcos marathon app remove /scale-fluentd
dcos marathon app remove /scale-webserver
dcos marathon app stop /scale
- if [ ! -f scale-ui/deploy/scale-ui-master.tar.gz ]
+ # Assumes an adjacent scale-ui source code checkout to grab UI assets from
+ if [ ! -f ./scale-ui/index.html ]
then
- cd scale-ui
+ cd ../scale-ui
? +++
+ npm run builddev:prod
- tar xf node_modules.tar.gz
- tar xf bower_components.tar.gz
- npm install
- node node_modules/gulp/bin/gulp.js deploy
- cd ..
? ^^
+ cd -
? ^
+ cp -R ../scale-ui/dist/developer ./scale-ui
fi
docker build -t $1 -f Dockerfile-dev .
docker push $1
dcos marathon app start /scale 1 | 13 | 0.65 | 6 | 7 |
01dd232947895052939d7279ba699565db6a8d77 | services/value-stat-getter.js | services/value-stat-getter.js | 'use strict';
var _ = require('lodash');
var P = require('bluebird');
var FilterParser = require('./filter-parser');
function ValueStatGetter(model, params, opts) {
function getAggregateField() {
// jshint sub: true
return params['aggregate_field'] || '_id';
}
this.perform = function () {
return new P(function (resolve, reject) {
var query = model.aggregate();
if (params.filters) {
_.each(params.filters, function (filter) {
query = new FilterParser(model, opts).perform(query, filter.field,
filter.value, 'match');
});
}
var sum = 1;
if (params['aggregate_field']) {
sum = '$' + params['aggregate_field'];
}
query
.group({
_id: null,
total: { $sum: sum }
})
.exec(function (err, records) {
if (err) { return reject(err); }
resolve({ value: records[0].total });
});
});
};
}
module.exports = ValueStatGetter;
| 'use strict';
var _ = require('lodash');
var P = require('bluebird');
var FilterParser = require('./filter-parser');
function ValueStatGetter(model, params, opts) {
function getAggregateField() {
// jshint sub: true
return params['aggregate_field'] || '_id';
}
this.perform = function () {
return new P(function (resolve, reject) {
var query = model.aggregate();
if (params.filters) {
_.each(params.filters, function (filter) {
query = new FilterParser(model, opts).perform(query, filter.field,
filter.value, 'match');
});
}
var sum = 1;
if (params['aggregate_field']) {
sum = '$' + params['aggregate_field'];
}
query
.group({
_id: null,
total: { $sum: sum }
})
.exec(function (err, records) {
if (err) { return reject(err); }
if (!records || !records.length) { return resolve({ value: 0 }); }
resolve({ value: records[0].total });
});
});
};
}
module.exports = ValueStatGetter;
| Fix the value stat getter when no records available | Fix the value stat getter when no records available
| JavaScript | mit | SeyZ/forest-express-mongoose | javascript | ## Code Before:
'use strict';
var _ = require('lodash');
var P = require('bluebird');
var FilterParser = require('./filter-parser');
function ValueStatGetter(model, params, opts) {
function getAggregateField() {
// jshint sub: true
return params['aggregate_field'] || '_id';
}
this.perform = function () {
return new P(function (resolve, reject) {
var query = model.aggregate();
if (params.filters) {
_.each(params.filters, function (filter) {
query = new FilterParser(model, opts).perform(query, filter.field,
filter.value, 'match');
});
}
var sum = 1;
if (params['aggregate_field']) {
sum = '$' + params['aggregate_field'];
}
query
.group({
_id: null,
total: { $sum: sum }
})
.exec(function (err, records) {
if (err) { return reject(err); }
resolve({ value: records[0].total });
});
});
};
}
module.exports = ValueStatGetter;
## Instruction:
Fix the value stat getter when no records available
## Code After:
'use strict';
var _ = require('lodash');
var P = require('bluebird');
var FilterParser = require('./filter-parser');
function ValueStatGetter(model, params, opts) {
function getAggregateField() {
// jshint sub: true
return params['aggregate_field'] || '_id';
}
this.perform = function () {
return new P(function (resolve, reject) {
var query = model.aggregate();
if (params.filters) {
_.each(params.filters, function (filter) {
query = new FilterParser(model, opts).perform(query, filter.field,
filter.value, 'match');
});
}
var sum = 1;
if (params['aggregate_field']) {
sum = '$' + params['aggregate_field'];
}
query
.group({
_id: null,
total: { $sum: sum }
})
.exec(function (err, records) {
if (err) { return reject(err); }
if (!records || !records.length) { return resolve({ value: 0 }); }
resolve({ value: records[0].total });
});
});
};
}
module.exports = ValueStatGetter;
| 'use strict';
var _ = require('lodash');
var P = require('bluebird');
var FilterParser = require('./filter-parser');
function ValueStatGetter(model, params, opts) {
function getAggregateField() {
// jshint sub: true
return params['aggregate_field'] || '_id';
}
this.perform = function () {
return new P(function (resolve, reject) {
var query = model.aggregate();
if (params.filters) {
_.each(params.filters, function (filter) {
query = new FilterParser(model, opts).perform(query, filter.field,
filter.value, 'match');
});
}
var sum = 1;
if (params['aggregate_field']) {
sum = '$' + params['aggregate_field'];
}
query
.group({
_id: null,
total: { $sum: sum }
})
.exec(function (err, records) {
if (err) { return reject(err); }
+ if (!records || !records.length) { return resolve({ value: 0 }); }
+
resolve({ value: records[0].total });
});
});
};
}
module.exports = ValueStatGetter; | 2 | 0.047619 | 2 | 0 |
2282238d2c4fd629258ef1702217444aa21936d0 | lib/_request.js | lib/_request.js | 'use strict';
const https = require('https');
function _request(options, body) {
return new Promise(function(resolve, reject) {
const request = https.request(Object.assign({
hostname: 'api.mojang.com',
headers: { 'Content-Type': 'application/json' },
}, options), function(resp) {
let data = [];
resp.on('data', function(chunk) {
data.push(chunk);
});
resp.on('end', function() {
data = JSON.parse(Buffer.concat(data).toString());
resolve(data);
});
resp.on('error', reject);
});
request.on('error', reject);
if (typeof body === 'object' && !(body instanceof Buffer)) {
request.write(JSON.stringify(body));
} else if (typeof body !== 'undefined') {
request.write(body);
}
request.end();
});
}
module.exports = _request;
| 'use strict';
const https = require('https');
function _request(options, body) {
return new Promise(function(resolve, reject) {
const request = https.request(Object.assign({
hostname: 'api.mojang.com',
headers: { 'Content-Type': 'application/json' },
}, options), function(resp) {
let data = [];
resp.on('data', function(chunk) {
data.push(chunk);
});
resp.on('end', function() {
if(data.length < 1){
return reject(new Error("No data received."));
}
data = JSON.parse(Buffer.concat(data).toString());
resolve(data);
});
resp.on('error', reject);
});
request.on('error', reject);
if (typeof body === 'object' && !(body instanceof Buffer)) {
request.write(JSON.stringify(body));
} else if (typeof body !== 'undefined') {
request.write(body);
}
request.end();
});
}
module.exports = _request;
| Patch when no data received. | Patch when no data received.
If there was not a response from the Mojang servers then an error would occur:
```
SyntaxError: Unexpected end of input
at Object.parse (native)
at IncomingMessage.<anonymous> (/home/player-info/node_modules/mojang/lib/_request.js:18:21)
at emitNone (events.js:85:20)
at IncomingMessage.emit (events.js:179:7)
at endReadableNT (_stream_readable.js:913:12)
at _combinedTickCallback (node.js:377:13)
at process._tickCallback (node.js:401:11)
```
You can replicate by looking up a username which does not exist. | JavaScript | unknown | jamen/node-mojang,jamen/mcnode,maccelerated/node-mojang,JamenMarz/mcnode | javascript | ## Code Before:
'use strict';
const https = require('https');
function _request(options, body) {
return new Promise(function(resolve, reject) {
const request = https.request(Object.assign({
hostname: 'api.mojang.com',
headers: { 'Content-Type': 'application/json' },
}, options), function(resp) {
let data = [];
resp.on('data', function(chunk) {
data.push(chunk);
});
resp.on('end', function() {
data = JSON.parse(Buffer.concat(data).toString());
resolve(data);
});
resp.on('error', reject);
});
request.on('error', reject);
if (typeof body === 'object' && !(body instanceof Buffer)) {
request.write(JSON.stringify(body));
} else if (typeof body !== 'undefined') {
request.write(body);
}
request.end();
});
}
module.exports = _request;
## Instruction:
Patch when no data received.
If there was not a response from the Mojang servers then an error would occur:
```
SyntaxError: Unexpected end of input
at Object.parse (native)
at IncomingMessage.<anonymous> (/home/player-info/node_modules/mojang/lib/_request.js:18:21)
at emitNone (events.js:85:20)
at IncomingMessage.emit (events.js:179:7)
at endReadableNT (_stream_readable.js:913:12)
at _combinedTickCallback (node.js:377:13)
at process._tickCallback (node.js:401:11)
```
You can replicate by looking up a username which does not exist.
## Code After:
'use strict';
const https = require('https');
function _request(options, body) {
return new Promise(function(resolve, reject) {
const request = https.request(Object.assign({
hostname: 'api.mojang.com',
headers: { 'Content-Type': 'application/json' },
}, options), function(resp) {
let data = [];
resp.on('data', function(chunk) {
data.push(chunk);
});
resp.on('end', function() {
if(data.length < 1){
return reject(new Error("No data received."));
}
data = JSON.parse(Buffer.concat(data).toString());
resolve(data);
});
resp.on('error', reject);
});
request.on('error', reject);
if (typeof body === 'object' && !(body instanceof Buffer)) {
request.write(JSON.stringify(body));
} else if (typeof body !== 'undefined') {
request.write(body);
}
request.end();
});
}
module.exports = _request;
| 'use strict';
const https = require('https');
function _request(options, body) {
return new Promise(function(resolve, reject) {
const request = https.request(Object.assign({
hostname: 'api.mojang.com',
headers: { 'Content-Type': 'application/json' },
}, options), function(resp) {
let data = [];
resp.on('data', function(chunk) {
data.push(chunk);
});
resp.on('end', function() {
+ if(data.length < 1){
+ return reject(new Error("No data received."));
+ }
data = JSON.parse(Buffer.concat(data).toString());
resolve(data);
});
resp.on('error', reject);
});
request.on('error', reject);
if (typeof body === 'object' && !(body instanceof Buffer)) {
request.write(JSON.stringify(body));
} else if (typeof body !== 'undefined') {
request.write(body);
}
request.end();
});
}
module.exports = _request; | 3 | 0.085714 | 3 | 0 |
a4ac01a10007bd63b2ab225198f1badee606b8b2 | src/bem/blocks/sv-select-base.scss | src/bem/blocks/sv-select-base.scss | .sv-select-base {
padding: 0;
}
.sv-select-base__item {
position: relative;
margin-top: 0.3em;
margin-bottom: 1.07em;
vertical-align: middle;
}
.sv-select-base__item--inline {
display: inline-block;
padding-right: 2%;
}
.sv-select-base__column {
display: inline-block;
width: 19%;
vertical-align: top;
}
.sv-select-base__label {
display: inline-block;
}
.sv-select-base__control {
margin: 0;
vertical-align: middle;
}
.sv-select-base__control-label {
margin-left: 1em;
vertical-align: middle;
}
| .sv-select-base {
padding: 0;
}
.sv-select-base__item {
position: relative;
margin-top: 0.3em;
margin-bottom: 1.07em;
vertical-align: middle;
}
.sv-select-base__item--inline {
display: inline-block;
padding-right: 2%;
}
.sv-select-base__column {
display: inline-block;
width: 19%;
vertical-align: top;
}
.sv-select-base__label {
display: inline-block;
width: 100%;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
.sv-select-base__control {
margin: 0;
vertical-align: middle;
}
.sv-select-base__control-label {
margin-left: 1em;
vertical-align: middle;
}
| Fix labels in the select-base for small width case | Fix labels in the select-base for small width case
| SCSS | mit | surveyjs/surveyjs,andrewtelnov/surveyjs,surveyjs/surveyjs,surveyjs/surveyjs,andrewtelnov/surveyjs,andrewtelnov/surveyjs,surveyjs/surveyjs | scss | ## Code Before:
.sv-select-base {
padding: 0;
}
.sv-select-base__item {
position: relative;
margin-top: 0.3em;
margin-bottom: 1.07em;
vertical-align: middle;
}
.sv-select-base__item--inline {
display: inline-block;
padding-right: 2%;
}
.sv-select-base__column {
display: inline-block;
width: 19%;
vertical-align: top;
}
.sv-select-base__label {
display: inline-block;
}
.sv-select-base__control {
margin: 0;
vertical-align: middle;
}
.sv-select-base__control-label {
margin-left: 1em;
vertical-align: middle;
}
## Instruction:
Fix labels in the select-base for small width case
## Code After:
.sv-select-base {
padding: 0;
}
.sv-select-base__item {
position: relative;
margin-top: 0.3em;
margin-bottom: 1.07em;
vertical-align: middle;
}
.sv-select-base__item--inline {
display: inline-block;
padding-right: 2%;
}
.sv-select-base__column {
display: inline-block;
width: 19%;
vertical-align: top;
}
.sv-select-base__label {
display: inline-block;
width: 100%;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
.sv-select-base__control {
margin: 0;
vertical-align: middle;
}
.sv-select-base__control-label {
margin-left: 1em;
vertical-align: middle;
}
| .sv-select-base {
padding: 0;
}
.sv-select-base__item {
position: relative;
margin-top: 0.3em;
margin-bottom: 1.07em;
vertical-align: middle;
}
.sv-select-base__item--inline {
display: inline-block;
padding-right: 2%;
}
.sv-select-base__column {
display: inline-block;
width: 19%;
vertical-align: top;
}
.sv-select-base__label {
display: inline-block;
+ width: 100%;
+ overflow: hidden;
+ text-overflow: ellipsis;
+ white-space: nowrap;
}
.sv-select-base__control {
margin: 0;
vertical-align: middle;
}
.sv-select-base__control-label {
margin-left: 1em;
vertical-align: middle;
} | 4 | 0.137931 | 4 | 0 |
2fbd88f1a8fbf02601a1121a88096271279bfca2 | src/scripts/hubot-zoi.coffee | src/scripts/hubot-zoi.coffee |
request = require 'request'
uri = 'http://zoi.herokuapp.com/js/services.js'
getZois = (cb) ->
request(uri, (err, response, body) ->
if err
cb.onError(err)
else
zois = body.match(/image: 'https:\/\/pbs.twimg.com\/media\/.+.jpg:large'/mg).
map((l) -> l.match("'(.+)'")[1])
(cb.onZoi || cb.onSuccess)(zois)
)
module.exports = (robot) ->
robot.respond /zoi/i, (msg) ->
getZois
onZoi: (zois) -> msg.send msg.random zois
onError: (err) -> msg.send "頑張れなかった...: #{err}"
|
request = require 'request'
uri = 'http://zoi.herokuapp.com/js/services.js'
getZois = (cb) ->
request(uri, (err, response, body) ->
if err
cb.onError(err)
else
zois = body.match(/image: 'https:\/\/pbs.twimg.com\/media\/.+.jpg:large'/mg).
map((l) -> l.match("'(https.+\.jpg):large'")[1])
(cb.onZoi || cb.onSuccess)(zois)
)
module.exports = (robot) ->
robot.respond /zoi/i, (msg) ->
getZois
onZoi: (zois) -> msg.send msg.random zois
onError: (err) -> msg.send "頑張れなかった...: #{err}"
| Remove :large suffix to usefully expand on limechat | Remove :large suffix to usefully expand on limechat
| CoffeeScript | mit | udzura/hubot-zoi | coffeescript | ## Code Before:
request = require 'request'
uri = 'http://zoi.herokuapp.com/js/services.js'
getZois = (cb) ->
request(uri, (err, response, body) ->
if err
cb.onError(err)
else
zois = body.match(/image: 'https:\/\/pbs.twimg.com\/media\/.+.jpg:large'/mg).
map((l) -> l.match("'(.+)'")[1])
(cb.onZoi || cb.onSuccess)(zois)
)
module.exports = (robot) ->
robot.respond /zoi/i, (msg) ->
getZois
onZoi: (zois) -> msg.send msg.random zois
onError: (err) -> msg.send "頑張れなかった...: #{err}"
## Instruction:
Remove :large suffix to usefully expand on limechat
## Code After:
request = require 'request'
uri = 'http://zoi.herokuapp.com/js/services.js'
getZois = (cb) ->
request(uri, (err, response, body) ->
if err
cb.onError(err)
else
zois = body.match(/image: 'https:\/\/pbs.twimg.com\/media\/.+.jpg:large'/mg).
map((l) -> l.match("'(https.+\.jpg):large'")[1])
(cb.onZoi || cb.onSuccess)(zois)
)
module.exports = (robot) ->
robot.respond /zoi/i, (msg) ->
getZois
onZoi: (zois) -> msg.send msg.random zois
onError: (err) -> msg.send "頑張れなかった...: #{err}"
|
request = require 'request'
uri = 'http://zoi.herokuapp.com/js/services.js'
getZois = (cb) ->
request(uri, (err, response, body) ->
if err
cb.onError(err)
else
zois = body.match(/image: 'https:\/\/pbs.twimg.com\/media\/.+.jpg:large'/mg).
- map((l) -> l.match("'(.+)'")[1])
+ map((l) -> l.match("'(https.+\.jpg):large'")[1])
? +++++ +++++ ++++++
(cb.onZoi || cb.onSuccess)(zois)
)
module.exports = (robot) ->
robot.respond /zoi/i, (msg) ->
getZois
onZoi: (zois) -> msg.send msg.random zois
onError: (err) -> msg.send "頑張れなかった...: #{err}"
| 2 | 0.095238 | 1 | 1 |
404a349ff820cc9897ce4f95cec460620bf21e4d | .travis.yml | .travis.yml | language: java
matrix:
include:
- jdk: openjdk8
env: TARGET=checkstyle
- jdk: openjdk8
env: TARGET=pmd
- jdk: openjdk8
env: TARGET=cpd
- jdk: openjdk8
env: TARGET=test
services: xvfb
- jdk: openjdk11
env: TARGET=test
services: xvfb
- os: osx
osx_image: xcode9.2
env: TARGET=test
before_install:
- if [ $TARGET = test ]; then
if [ $TRAVIS_OS_NAME = osx ]; then
brew update; brew install clasp;
else
sudo apt-get -qq update; sudo apt-get install -y clasp;
fi
fi
install:
- ./gradlew assemble
script:
- ./gradlew $TARGET
- if [ $TARGET = test ]; then ./ci/run.sh; fi
after_success:
- if [ $TARGET = test -a $TRAVIS_JDK_VERSION = openjdk8 ]; then ./gradlew coveralls; fi
| language: java
matrix:
include:
- jdk: openjdk8
env: TARGET=checkstyle
- jdk: openjdk8
env: TARGET=pmd
- jdk: openjdk8
env: TARGET=cpd
- jdk: openjdk8
env: TARGET=test
services: xvfb
- jdk: openjdk11
env: TARGET=test
services: xvfb
- os: osx
osx_image: xcode11
env: TARGET=test
addons:
apt:
packages: clasp
homebrew:
packages: clasp
install:
- ./gradlew assemble
script:
- ./gradlew $TARGET
- if [ $TARGET = test ]; then ./ci/run.sh; fi
after_success:
- if [ $TARGET = test -a $TRAVIS_JDK_VERSION = openjdk8 ]; then ./gradlew coveralls; fi
| Use Travis addons for installing Clasp | Use Travis addons for installing Clasp
| YAML | mit | tuura/workcraft,danilovesky/workcraft,danilovesky/workcraft,workcraft/workcraft,danilovesky/workcraft,workcraft/workcraft,workcraft/workcraft,danilovesky/workcraft,tuura/workcraft,tuura/workcraft,workcraft/workcraft | yaml | ## Code Before:
language: java
matrix:
include:
- jdk: openjdk8
env: TARGET=checkstyle
- jdk: openjdk8
env: TARGET=pmd
- jdk: openjdk8
env: TARGET=cpd
- jdk: openjdk8
env: TARGET=test
services: xvfb
- jdk: openjdk11
env: TARGET=test
services: xvfb
- os: osx
osx_image: xcode9.2
env: TARGET=test
before_install:
- if [ $TARGET = test ]; then
if [ $TRAVIS_OS_NAME = osx ]; then
brew update; brew install clasp;
else
sudo apt-get -qq update; sudo apt-get install -y clasp;
fi
fi
install:
- ./gradlew assemble
script:
- ./gradlew $TARGET
- if [ $TARGET = test ]; then ./ci/run.sh; fi
after_success:
- if [ $TARGET = test -a $TRAVIS_JDK_VERSION = openjdk8 ]; then ./gradlew coveralls; fi
## Instruction:
Use Travis addons for installing Clasp
## Code After:
language: java
matrix:
include:
- jdk: openjdk8
env: TARGET=checkstyle
- jdk: openjdk8
env: TARGET=pmd
- jdk: openjdk8
env: TARGET=cpd
- jdk: openjdk8
env: TARGET=test
services: xvfb
- jdk: openjdk11
env: TARGET=test
services: xvfb
- os: osx
osx_image: xcode11
env: TARGET=test
addons:
apt:
packages: clasp
homebrew:
packages: clasp
install:
- ./gradlew assemble
script:
- ./gradlew $TARGET
- if [ $TARGET = test ]; then ./ci/run.sh; fi
after_success:
- if [ $TARGET = test -a $TRAVIS_JDK_VERSION = openjdk8 ]; then ./gradlew coveralls; fi
| language: java
matrix:
include:
- jdk: openjdk8
env: TARGET=checkstyle
- jdk: openjdk8
env: TARGET=pmd
- jdk: openjdk8
env: TARGET=cpd
- jdk: openjdk8
env: TARGET=test
services: xvfb
- jdk: openjdk11
env: TARGET=test
services: xvfb
- os: osx
- osx_image: xcode9.2
? ^^^
+ osx_image: xcode11
? ^^
env: TARGET=test
+ addons:
+ apt:
+ packages: clasp
+ homebrew:
+ packages: clasp
+
- before_install:
- - if [ $TARGET = test ]; then
- if [ $TRAVIS_OS_NAME = osx ]; then
- brew update; brew install clasp;
- else
- sudo apt-get -qq update; sudo apt-get install -y clasp;
- fi
- fi
install:
- ./gradlew assemble
script:
- ./gradlew $TARGET
- if [ $TARGET = test ]; then ./ci/run.sh; fi
after_success:
- if [ $TARGET = test -a $TRAVIS_JDK_VERSION = openjdk8 ]; then ./gradlew coveralls; fi | 16 | 0.432432 | 7 | 9 |
43f9b4f23c2eae1414acac7a6196fd428f4c826a | app/views/sprint/confirmation.blade.php | app/views/sprint/confirmation.blade.php | @extends('layouts.default')
@section('content')
<h1>Successfully created "{{ $sprint->title }}"</h1>
<ul>
<li>
{{ link_to(
$_ENV['PHABRICATOR_URL'] . 'project/view/' . $sprint->phabricator_id,
$sprint->title . ' on Phabricator'
) }}
</li>
<li>
{{ link_to_route(
'sprint_path',
$sprint->title . ' on Phragile',
$sprint->phid
) }}
</li>
<li>
{{ link_to_route(
'project_path',
$sprint->project->title . ' on Phragile',
$sprint->project->slug
) }}
</li>
</ul>
@stop
| @extends('layouts.default')
@section('content')
<h1>Successfully created "{{ $sprint->title }}"</h1>
<ul>
<li>
{{ link_to(
$_ENV['PHABRICATOR_URL'] . 'project/view/' . $sprint->phabricator_id,
$sprint->title . ' on Phabricator'
) }}
</li>
<li>
{{ link_to_route(
'sprint_path',
$sprint->title . ' on Phragile',
$sprint->phabricator_id
) }}
</li>
<li>
{{ link_to_route(
'project_path',
$sprint->project->title . ' on Phragile',
$sprint->project->slug
) }}
</li>
</ul>
@stop
| Fix confirmation page id again. | Fix confirmation page id again.
| PHP | apache-2.0 | christopher-johnson/phabricator,christopher-johnson/phabricator,christopher-johnson/phabricator,christopher-johnson/phabricator,christopher-johnson/phabricator | php | ## Code Before:
@extends('layouts.default')
@section('content')
<h1>Successfully created "{{ $sprint->title }}"</h1>
<ul>
<li>
{{ link_to(
$_ENV['PHABRICATOR_URL'] . 'project/view/' . $sprint->phabricator_id,
$sprint->title . ' on Phabricator'
) }}
</li>
<li>
{{ link_to_route(
'sprint_path',
$sprint->title . ' on Phragile',
$sprint->phid
) }}
</li>
<li>
{{ link_to_route(
'project_path',
$sprint->project->title . ' on Phragile',
$sprint->project->slug
) }}
</li>
</ul>
@stop
## Instruction:
Fix confirmation page id again.
## Code After:
@extends('layouts.default')
@section('content')
<h1>Successfully created "{{ $sprint->title }}"</h1>
<ul>
<li>
{{ link_to(
$_ENV['PHABRICATOR_URL'] . 'project/view/' . $sprint->phabricator_id,
$sprint->title . ' on Phabricator'
) }}
</li>
<li>
{{ link_to_route(
'sprint_path',
$sprint->title . ' on Phragile',
$sprint->phabricator_id
) }}
</li>
<li>
{{ link_to_route(
'project_path',
$sprint->project->title . ' on Phragile',
$sprint->project->slug
) }}
</li>
</ul>
@stop
| @extends('layouts.default')
@section('content')
<h1>Successfully created "{{ $sprint->title }}"</h1>
<ul>
<li>
{{ link_to(
$_ENV['PHABRICATOR_URL'] . 'project/view/' . $sprint->phabricator_id,
$sprint->title . ' on Phabricator'
) }}
</li>
<li>
{{ link_to_route(
'sprint_path',
$sprint->title . ' on Phragile',
- $sprint->phid
+ $sprint->phabricator_id
? ++++++++++
) }}
</li>
<li>
{{ link_to_route(
'project_path',
$sprint->project->title . ' on Phragile',
$sprint->project->slug
) }}
</li>
</ul>
@stop | 2 | 0.071429 | 1 | 1 |
98059cc0ecbcce5cd87545a02b233b6819b7821a | .travis-ci.sh | .travis-ci.sh | echo "build here"
echo "Compiling new static content"
make release
BUILD_WWW=`make build_www`
echo "chmod"
chmod 600 deploy-key
echo "eval"
eval `ssh-agent -s`
echo "ssh"
ssh-add deploy-key
echo "git clone"
git clone git@github.com:hazelgrove/hazel.git
echo "git conf"
git config --global user.email "travis@travis-ci.org"
git config --global user.name "Push From Travis"
echo "move to hazel"
cd hazel
echo "switch to gh-pages"
git checkout gh-pages
echo "clear gh-pages subdir $TRAVIS_BRANCH"
if [ -d "$TRAVIS_BRANCH" ]
then
echo "subdir found, clearing contents"
git rm -rf $TRAVIS_BRANCH/*
else
echo "subdir not found, creating new"
mkdir "$TRAVIS_BRANCH"
fi
echo "cp new build contents into subdir"
cp -r ../$BUILD_WWW/* $TRAVIS_BRANCH
echo "git add"
git add .
echo "commit"
git commit -m "Travis Build"
echo "push"
git push origin gh-pages
| echo "build here"
echo "compile new static content"
make release
BUILD_WWW=`make build_www`
# cf.
# http://markbucciarelli.com/posts/2019-01-26_how-to-push-to-github-from-travis-ci.html
# https://stackoverflow.com/questions/18935539/authenticate-with-github-using-a-token/22977235#22977235
echo "set up private key to push to hazel-build"
openssl aes-256-cbc -k "$travis_key_password" -d -md sha256 -a -in hazel-build-key.enc -out hazel-build-key
echo "chmod hazel-build-key"
chmod 600 hazel-build-key
echo "ssh-add hazel-build-key"
eval `ssh-agent -s`
ssh-add hazel-build-key
echo "git clone"
git clone git@github.com:hazelgrove/hazel-build.git
echo "git config"
git config --global user.email "travis@travis-ci.org"
git config --global user.name "Push From Travis"
echo "move to hazel-build"
cd hazel-build
echo "clear contents of subdir $TRAVIS_BRANCH"
if [ -d "$TRAVIS_BRANCH" ]
then
echo "subdir found, clearing contents"
git rm -rf $TRAVIS_BRANCH/*
else
echo "subdir not found, creating new"
mkdir "$TRAVIS_BRANCH"
fi
echo "cp new build contents into subdir"
cp -r ../$BUILD_WWW/* $TRAVIS_BRANCH
echo "git add"
git add .
echo "git commit"
git commit -m "Travis Build"
echo "git push"
git push origin master
| Update travis script to push to hazel-build | Update travis script to push to hazel-build
| Shell | mit | hazelgrove/hazel,hazelgrove/hazel,hazelgrove/hazel | shell | ## Code Before:
echo "build here"
echo "Compiling new static content"
make release
BUILD_WWW=`make build_www`
echo "chmod"
chmod 600 deploy-key
echo "eval"
eval `ssh-agent -s`
echo "ssh"
ssh-add deploy-key
echo "git clone"
git clone git@github.com:hazelgrove/hazel.git
echo "git conf"
git config --global user.email "travis@travis-ci.org"
git config --global user.name "Push From Travis"
echo "move to hazel"
cd hazel
echo "switch to gh-pages"
git checkout gh-pages
echo "clear gh-pages subdir $TRAVIS_BRANCH"
if [ -d "$TRAVIS_BRANCH" ]
then
echo "subdir found, clearing contents"
git rm -rf $TRAVIS_BRANCH/*
else
echo "subdir not found, creating new"
mkdir "$TRAVIS_BRANCH"
fi
echo "cp new build contents into subdir"
cp -r ../$BUILD_WWW/* $TRAVIS_BRANCH
echo "git add"
git add .
echo "commit"
git commit -m "Travis Build"
echo "push"
git push origin gh-pages
## Instruction:
Update travis script to push to hazel-build
## Code After:
echo "build here"
echo "compile new static content"
make release
BUILD_WWW=`make build_www`
# cf.
# http://markbucciarelli.com/posts/2019-01-26_how-to-push-to-github-from-travis-ci.html
# https://stackoverflow.com/questions/18935539/authenticate-with-github-using-a-token/22977235#22977235
echo "set up private key to push to hazel-build"
openssl aes-256-cbc -k "$travis_key_password" -d -md sha256 -a -in hazel-build-key.enc -out hazel-build-key
echo "chmod hazel-build-key"
chmod 600 hazel-build-key
echo "ssh-add hazel-build-key"
eval `ssh-agent -s`
ssh-add hazel-build-key
echo "git clone"
git clone git@github.com:hazelgrove/hazel-build.git
echo "git config"
git config --global user.email "travis@travis-ci.org"
git config --global user.name "Push From Travis"
echo "move to hazel-build"
cd hazel-build
echo "clear contents of subdir $TRAVIS_BRANCH"
if [ -d "$TRAVIS_BRANCH" ]
then
echo "subdir found, clearing contents"
git rm -rf $TRAVIS_BRANCH/*
else
echo "subdir not found, creating new"
mkdir "$TRAVIS_BRANCH"
fi
echo "cp new build contents into subdir"
cp -r ../$BUILD_WWW/* $TRAVIS_BRANCH
echo "git add"
git add .
echo "git commit"
git commit -m "Travis Build"
echo "git push"
git push origin master
| echo "build here"
-
- echo "Compiling new static content"
? ^ ^^^
+ echo "compile new static content"
? ^ ^
make release
BUILD_WWW=`make build_www`
- echo "chmod"
- chmod 600 deploy-key
- echo "eval"
+ # cf.
+ # http://markbucciarelli.com/posts/2019-01-26_how-to-push-to-github-from-travis-ci.html
+ # https://stackoverflow.com/questions/18935539/authenticate-with-github-using-a-token/22977235#22977235
+ echo "set up private key to push to hazel-build"
+ openssl aes-256-cbc -k "$travis_key_password" -d -md sha256 -a -in hazel-build-key.enc -out hazel-build-key
+
+ echo "chmod hazel-build-key"
+ chmod 600 hazel-build-key
+ echo "ssh-add hazel-build-key"
eval `ssh-agent -s`
+ ssh-add hazel-build-key
- echo "ssh"
- ssh-add deploy-key
-
echo "git clone"
- git clone git@github.com:hazelgrove/hazel.git
+ git clone git@github.com:hazelgrove/hazel-build.git
? ++++++
- echo "git conf"
+ echo "git config"
? ++
git config --global user.email "travis@travis-ci.org"
git config --global user.name "Push From Travis"
- echo "move to hazel"
+ echo "move to hazel-build"
? ++++++
+ cd hazel-build
- cd hazel
- echo "switch to gh-pages"
- git checkout gh-pages
- echo "clear gh-pages subdir $TRAVIS_BRANCH"
? ^^^^^^
+ echo "clear contents of subdir $TRAVIS_BRANCH"
? ^^^^ ++ +++
if [ -d "$TRAVIS_BRANCH" ]
then
echo "subdir found, clearing contents"
git rm -rf $TRAVIS_BRANCH/*
else
echo "subdir not found, creating new"
mkdir "$TRAVIS_BRANCH"
fi
echo "cp new build contents into subdir"
cp -r ../$BUILD_WWW/* $TRAVIS_BRANCH
echo "git add"
git add .
- echo "commit"
+ echo "git commit"
? ++++
git commit -m "Travis Build"
- echo "push"
+ echo "git push"
? ++++
- git push origin gh-pages
? ^^^^ ^ ^
+ git push origin master
? ^ ^^ ^
| 37 | 0.880952 | 19 | 18 |
24ffb225bf6a7777461884f7c13f5feba63a43e7 | t/020-construct.t | t/020-construct.t |
use v6.c;
use Test;
use GDBM;
my $obj;
lives-ok { $obj = GDBM.new('foo.db') }, "create one";
isa-ok $obj, GDBM, "and it's the right sort of object";
ok 'foo.db'.IO.e, "and the file exists";
"foo.db".IO.unlink;
done-testing;
# vim: expandtab shiftwidth=4 ft=perl6
|
use v6.c;
use Test;
use GDBM;
my $filename = 'tmp-' ~ $*PID ~ '.db';
subtest {
my $obj;
lives-ok { $obj = GDBM.new($filename) }, "create one";
isa-ok $obj, GDBM, "and it's the right sort of object";
ok $obj.filename.IO.e, "and the file exists";
$obj.close;
$filename.IO.unlink;
}, "positional constructor";
subtest {
my $obj;
lives-ok { $obj = GDBM.new(:$filename) }, "create one";
isa-ok $obj, GDBM, "and it's the right sort of object";
ok $obj.filename.IO.e, "and the file exists";
$obj.close;
$filename.IO.unlink;
}, "named constructor";
done-testing;
# vim: expandtab shiftwidth=4 ft=perl6
| Test the new style constructor | Test the new style constructor
| Perl | artistic-2.0 | jonathanstowe/p6-GDBM,jonathanstowe/p6-GDBM | perl | ## Code Before:
use v6.c;
use Test;
use GDBM;
my $obj;
lives-ok { $obj = GDBM.new('foo.db') }, "create one";
isa-ok $obj, GDBM, "and it's the right sort of object";
ok 'foo.db'.IO.e, "and the file exists";
"foo.db".IO.unlink;
done-testing;
# vim: expandtab shiftwidth=4 ft=perl6
## Instruction:
Test the new style constructor
## Code After:
use v6.c;
use Test;
use GDBM;
my $filename = 'tmp-' ~ $*PID ~ '.db';
subtest {
my $obj;
lives-ok { $obj = GDBM.new($filename) }, "create one";
isa-ok $obj, GDBM, "and it's the right sort of object";
ok $obj.filename.IO.e, "and the file exists";
$obj.close;
$filename.IO.unlink;
}, "positional constructor";
subtest {
my $obj;
lives-ok { $obj = GDBM.new(:$filename) }, "create one";
isa-ok $obj, GDBM, "and it's the right sort of object";
ok $obj.filename.IO.e, "and the file exists";
$obj.close;
$filename.IO.unlink;
}, "named constructor";
done-testing;
# vim: expandtab shiftwidth=4 ft=perl6
|
use v6.c;
use Test;
use GDBM;
- my $obj;
- lives-ok { $obj = GDBM.new('foo.db') }, "create one";
+ my $filename = 'tmp-' ~ $*PID ~ '.db';
- isa-ok $obj, GDBM, "and it's the right sort of object";
+ subtest {
+ my $obj;
+ lives-ok { $obj = GDBM.new($filename) }, "create one";
- ok 'foo.db'.IO.e, "and the file exists";
+ isa-ok $obj, GDBM, "and it's the right sort of object";
- "foo.db".IO.unlink;
+ ok $obj.filename.IO.e, "and the file exists";
+ $obj.close;
+ $filename.IO.unlink;
+ }, "positional constructor";
+
+ subtest {
+ my $obj;
+ lives-ok { $obj = GDBM.new(:$filename) }, "create one";
+
+ isa-ok $obj, GDBM, "and it's the right sort of object";
+
+ ok $obj.filename.IO.e, "and the file exists";
+ $obj.close;
+ $filename.IO.unlink;
+ }, "named constructor";
done-testing;
# vim: expandtab shiftwidth=4 ft=perl6 | 25 | 1.388889 | 20 | 5 |
af79dfa608ceba4ac757fcc7797e01401869ff94 | Entity/ReferralRegistrationRepository.php | Entity/ReferralRegistrationRepository.php | <?php
namespace Shaygan\AffiliateBundle\Entity;
use Doctrine\ORM\EntityRepository;
/**
* ReferralRegistrationRepository
*
* This class was generated by the Doctrine ORM. Add your own custom
* repository methods below.
*/
class ReferralRegistrationRepository extends EntityRepository
{
public function getRegistrationByUser($user)
{
return $this->_em
->createQuery('SELECT rr,r FROM ShayganAffiliateBundle:ReferralRegistration rr '
. 'JOIN rr.referrer r '
. 'WHERE r.id=:id '
. 'ORDER BY rr.id DESC ')
->setParameter("id", $user->getId())
;
}
public function getRegistrationCountByUser($user)
{
return $this->_em
->createQuery('SELECT COUNT(*) FROM ShayganAffiliateBundle:ReferralRegistration rr '
. 'WHERE r.id=:id ')
->setParameter("id", $user->getId())
->getSingleScalarResult()
;
}
}
| <?php
namespace Shaygan\AffiliateBundle\Entity;
use Doctrine\ORM\EntityRepository;
/**
* ReferralRegistrationRepository
*
* This class was generated by the Doctrine ORM. Add your own custom
* repository methods below.
*/
class ReferralRegistrationRepository extends EntityRepository
{
public function getRegistrationByUser($user)
{
return $this->_em
->createQuery('SELECT rr,r FROM ShayganAffiliateBundle:ReferralRegistration rr '
. 'JOIN rr.referrer r '
. 'WHERE r.id=:id '
. 'ORDER BY rr.id DESC ')
->setParameter("id", $user->getId())
;
}
public function getRegistrationCountByUser($user)
{
return $this->_em
->createQuery('SELECT COUNT(*) FROM ShayganAffiliateBundle:ReferralRegistration rr '
. 'JOIN rr.referrer r '
. 'WHERE r.id=:id ')
->setParameter("id", $user->getId())
->getSingleScalarResult()
;
}
}
| Add get registration count to service | Add get registration count to service | PHP | mit | Shaygan/AffiliateBundle | php | ## Code Before:
<?php
namespace Shaygan\AffiliateBundle\Entity;
use Doctrine\ORM\EntityRepository;
/**
* ReferralRegistrationRepository
*
* This class was generated by the Doctrine ORM. Add your own custom
* repository methods below.
*/
class ReferralRegistrationRepository extends EntityRepository
{
public function getRegistrationByUser($user)
{
return $this->_em
->createQuery('SELECT rr,r FROM ShayganAffiliateBundle:ReferralRegistration rr '
. 'JOIN rr.referrer r '
. 'WHERE r.id=:id '
. 'ORDER BY rr.id DESC ')
->setParameter("id", $user->getId())
;
}
public function getRegistrationCountByUser($user)
{
return $this->_em
->createQuery('SELECT COUNT(*) FROM ShayganAffiliateBundle:ReferralRegistration rr '
. 'WHERE r.id=:id ')
->setParameter("id", $user->getId())
->getSingleScalarResult()
;
}
}
## Instruction:
Add get registration count to service
## Code After:
<?php
namespace Shaygan\AffiliateBundle\Entity;
use Doctrine\ORM\EntityRepository;
/**
* ReferralRegistrationRepository
*
* This class was generated by the Doctrine ORM. Add your own custom
* repository methods below.
*/
class ReferralRegistrationRepository extends EntityRepository
{
public function getRegistrationByUser($user)
{
return $this->_em
->createQuery('SELECT rr,r FROM ShayganAffiliateBundle:ReferralRegistration rr '
. 'JOIN rr.referrer r '
. 'WHERE r.id=:id '
. 'ORDER BY rr.id DESC ')
->setParameter("id", $user->getId())
;
}
public function getRegistrationCountByUser($user)
{
return $this->_em
->createQuery('SELECT COUNT(*) FROM ShayganAffiliateBundle:ReferralRegistration rr '
. 'JOIN rr.referrer r '
. 'WHERE r.id=:id ')
->setParameter("id", $user->getId())
->getSingleScalarResult()
;
}
}
| <?php
namespace Shaygan\AffiliateBundle\Entity;
use Doctrine\ORM\EntityRepository;
/**
* ReferralRegistrationRepository
*
* This class was generated by the Doctrine ORM. Add your own custom
* repository methods below.
*/
class ReferralRegistrationRepository extends EntityRepository
{
public function getRegistrationByUser($user)
{
return $this->_em
->createQuery('SELECT rr,r FROM ShayganAffiliateBundle:ReferralRegistration rr '
. 'JOIN rr.referrer r '
. 'WHERE r.id=:id '
. 'ORDER BY rr.id DESC ')
->setParameter("id", $user->getId())
;
}
public function getRegistrationCountByUser($user)
{
return $this->_em
->createQuery('SELECT COUNT(*) FROM ShayganAffiliateBundle:ReferralRegistration rr '
+ . 'JOIN rr.referrer r '
. 'WHERE r.id=:id ')
->setParameter("id", $user->getId())
->getSingleScalarResult()
;
}
} | 1 | 0.027027 | 1 | 0 |
25cc54cacd4eb8f997dba2290e9d09818bd6a269 | lib/gitlab/ci/templates/Serverless.gitlab-ci.yml | lib/gitlab/ci/templates/Serverless.gitlab-ci.yml |
image: alpine:latest
stages:
- build
- test
- deploy
.serverless:build:image:
stage: build
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
script: /usr/bin/gitlabktl app build
.serverless:deploy:image:
stage: deploy
image: gcr.io/triggermesh/tm@sha256:3cfdd470a66b741004fb02354319d79f1598c70117ce79978d2e07e192bfb336 # v0.0.11
environment: development
script:
- echo "$CI_REGISTRY_IMAGE"
- tm -n "$KUBE_NAMESPACE" --config "$KUBECONFIG" deploy service "$CI_PROJECT_NAME" --from-image "$CI_REGISTRY_IMAGE" --wait
.serverless:build:functions:
stage: build
environment: development
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
script: /usr/bin/gitlabktl serverless build
.serverless:deploy:functions:
stage: deploy
environment: development
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
script: /usr/bin/gitlabktl serverless deploy
|
image: alpine:latest
stages:
- build
- test
- deploy
.serverless:build:image:
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: build
script: /usr/bin/gitlabktl app build
.serverless:deploy:image:
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: deploy
environment: development
script: /usr/bin/gitlabktl app deploy
.serverless:build:functions:
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: build
script: /usr/bin/gitlabktl serverless build
.serverless:deploy:functions:
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: deploy
environment: development
script: /usr/bin/gitlabktl serverless deploy
| Deploy serverless apps with `gitlabktl` | Deploy serverless apps with `gitlabktl`
| YAML | mit | mmkassem/gitlabhq,stoplightio/gitlabhq,stoplightio/gitlabhq,mmkassem/gitlabhq,stoplightio/gitlabhq,mmkassem/gitlabhq,stoplightio/gitlabhq,mmkassem/gitlabhq | yaml | ## Code Before:
image: alpine:latest
stages:
- build
- test
- deploy
.serverless:build:image:
stage: build
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
script: /usr/bin/gitlabktl app build
.serverless:deploy:image:
stage: deploy
image: gcr.io/triggermesh/tm@sha256:3cfdd470a66b741004fb02354319d79f1598c70117ce79978d2e07e192bfb336 # v0.0.11
environment: development
script:
- echo "$CI_REGISTRY_IMAGE"
- tm -n "$KUBE_NAMESPACE" --config "$KUBECONFIG" deploy service "$CI_PROJECT_NAME" --from-image "$CI_REGISTRY_IMAGE" --wait
.serverless:build:functions:
stage: build
environment: development
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
script: /usr/bin/gitlabktl serverless build
.serverless:deploy:functions:
stage: deploy
environment: development
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
script: /usr/bin/gitlabktl serverless deploy
## Instruction:
Deploy serverless apps with `gitlabktl`
## Code After:
image: alpine:latest
stages:
- build
- test
- deploy
.serverless:build:image:
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: build
script: /usr/bin/gitlabktl app build
.serverless:deploy:image:
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: deploy
environment: development
script: /usr/bin/gitlabktl app deploy
.serverless:build:functions:
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: build
script: /usr/bin/gitlabktl serverless build
.serverless:deploy:functions:
image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: deploy
environment: development
script: /usr/bin/gitlabktl serverless deploy
|
image: alpine:latest
stages:
- build
- test
- deploy
.serverless:build:image:
+ image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: build
- image: registry.gitlab.com/gitlab-org/gitlabktl:latest
script: /usr/bin/gitlabktl app build
.serverless:deploy:image:
+ image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: deploy
- image: gcr.io/triggermesh/tm@sha256:3cfdd470a66b741004fb02354319d79f1598c70117ce79978d2e07e192bfb336 # v0.0.11
environment: development
+ script: /usr/bin/gitlabktl app deploy
- script:
- - echo "$CI_REGISTRY_IMAGE"
- - tm -n "$KUBE_NAMESPACE" --config "$KUBECONFIG" deploy service "$CI_PROJECT_NAME" --from-image "$CI_REGISTRY_IMAGE" --wait
.serverless:build:functions:
+ image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: build
- environment: development
- image: registry.gitlab.com/gitlab-org/gitlabktl:latest
script: /usr/bin/gitlabktl serverless build
.serverless:deploy:functions:
+ image: registry.gitlab.com/gitlab-org/gitlabktl:latest
stage: deploy
environment: development
- image: registry.gitlab.com/gitlab-org/gitlabktl:latest
script: /usr/bin/gitlabktl serverless deploy | 13 | 0.40625 | 5 | 8 |
27f5676656e7507883ba365d2639e5f3cb5b0b58 | snippets/keras_testing.py | snippets/keras_testing.py | from keras.models import Sequential
from keras.layers import Dense, Dropout
import sys
import numpy as np
def main():
input_filename = sys.argv[1]
training = np.loadtxt('data/%s.csv' % input_filename, delimiter=',')
test = np.loadtxt('data/%s_test.csv' % input_filename, delimiter=',')
y_train = training[:,3:4]
x_train = training[:,0:3]
y_test = test[:,3:4]
x_test = test[:,0:3]
model = Sequential()
model.add(Dense(10, activation='tanh', input_dim=3))
model.add(Dense(10, activation='tanh'))
model.add(Dense(1, activation='linear'))
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(x_train, y_train, epochs=100)
print('Test score: ', model.evaluate(x_test, y_test))
y_network = model.predict(x_test)
out = np.concatenate((x_test, y_test, y_network), axis=1)
np.savetxt('results/%s_kera.csv' % input_filename, out, delimiter=',')
if __name__ == "__main__":
main()
| from keras.models import Sequential
from keras.layers import Dense, Dropout
import sys
import numpy as np
def main():
input_filename = sys.argv[1]
num_networks = int(sys.argv[2])
training = np.loadtxt('data/%s.csv' % input_filename, delimiter=',')
test = np.loadtxt('data/%s_test.csv' % input_filename, delimiter=',')
y_train = training[:, 3:4]
x_train = training[:, 0:3]
y_test = test[:, 3:4]
x_test = test[:, 0:3]
test_score = 0
result = np.zeros((1,5))
for _ in range(num_networks):
model = Sequential()
model.add(Dense(10, activation='tanh', input_dim=3))
model.add(Dense(10, activation='tanh'))
model.add(Dense(1, activation='linear'))
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(x_train, y_train, epochs=20, shuffle=True)
y_network = model.predict_on_batch(x_test)
result = np.concatenate((result, np.concatenate((x_test, y_test, y_network), axis=1)), axis=0)
test_score += model.evaluate(x_test, y_test)
print()
print('Test score: ', test_score / num_networks)
result = np.delete(result, 0, 0)
np.savetxt('results/%s_kera.csv' % input_filename, result, delimiter=',')
if __name__ == "__main__":
main()
| Tweak parameters and allow runs over multiple networks | Tweak parameters and allow runs over multiple networks
| Python | mit | farthir/msc-project | python | ## Code Before:
from keras.models import Sequential
from keras.layers import Dense, Dropout
import sys
import numpy as np
def main():
input_filename = sys.argv[1]
training = np.loadtxt('data/%s.csv' % input_filename, delimiter=',')
test = np.loadtxt('data/%s_test.csv' % input_filename, delimiter=',')
y_train = training[:,3:4]
x_train = training[:,0:3]
y_test = test[:,3:4]
x_test = test[:,0:3]
model = Sequential()
model.add(Dense(10, activation='tanh', input_dim=3))
model.add(Dense(10, activation='tanh'))
model.add(Dense(1, activation='linear'))
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(x_train, y_train, epochs=100)
print('Test score: ', model.evaluate(x_test, y_test))
y_network = model.predict(x_test)
out = np.concatenate((x_test, y_test, y_network), axis=1)
np.savetxt('results/%s_kera.csv' % input_filename, out, delimiter=',')
if __name__ == "__main__":
main()
## Instruction:
Tweak parameters and allow runs over multiple networks
## Code After:
from keras.models import Sequential
from keras.layers import Dense, Dropout
import sys
import numpy as np
def main():
input_filename = sys.argv[1]
num_networks = int(sys.argv[2])
training = np.loadtxt('data/%s.csv' % input_filename, delimiter=',')
test = np.loadtxt('data/%s_test.csv' % input_filename, delimiter=',')
y_train = training[:, 3:4]
x_train = training[:, 0:3]
y_test = test[:, 3:4]
x_test = test[:, 0:3]
test_score = 0
result = np.zeros((1,5))
for _ in range(num_networks):
model = Sequential()
model.add(Dense(10, activation='tanh', input_dim=3))
model.add(Dense(10, activation='tanh'))
model.add(Dense(1, activation='linear'))
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(x_train, y_train, epochs=20, shuffle=True)
y_network = model.predict_on_batch(x_test)
result = np.concatenate((result, np.concatenate((x_test, y_test, y_network), axis=1)), axis=0)
test_score += model.evaluate(x_test, y_test)
print()
print('Test score: ', test_score / num_networks)
result = np.delete(result, 0, 0)
np.savetxt('results/%s_kera.csv' % input_filename, result, delimiter=',')
if __name__ == "__main__":
main()
| from keras.models import Sequential
from keras.layers import Dense, Dropout
import sys
import numpy as np
def main():
input_filename = sys.argv[1]
+ num_networks = int(sys.argv[2])
training = np.loadtxt('data/%s.csv' % input_filename, delimiter=',')
test = np.loadtxt('data/%s_test.csv' % input_filename, delimiter=',')
- y_train = training[:,3:4]
+ y_train = training[:, 3:4]
? +
- x_train = training[:,0:3]
+ x_train = training[:, 0:3]
? +
- y_test = test[:,3:4]
+ y_test = test[:, 3:4]
? +
- x_test = test[:,0:3]
+ x_test = test[:, 0:3]
? +
+ test_score = 0
+ result = np.zeros((1,5))
- model = Sequential()
- model.add(Dense(10, activation='tanh', input_dim=3))
- model.add(Dense(10, activation='tanh'))
- model.add(Dense(1, activation='linear'))
- model.compile(optimizer='sgd', loss='mean_squared_error')
- model.fit(x_train, y_train, epochs=100)
- print('Test score: ', model.evaluate(x_test, y_test))
+ for _ in range(num_networks):
+ model = Sequential()
+ model.add(Dense(10, activation='tanh', input_dim=3))
+ model.add(Dense(10, activation='tanh'))
+ model.add(Dense(1, activation='linear'))
+ model.compile(optimizer='sgd', loss='mean_squared_error')
+ model.fit(x_train, y_train, epochs=20, shuffle=True)
- y_network = model.predict(x_test)
+ y_network = model.predict_on_batch(x_test)
? ++++ +++++++++
+ result = np.concatenate((result, np.concatenate((x_test, y_test, y_network), axis=1)), axis=0)
+ test_score += model.evaluate(x_test, y_test)
- out = np.concatenate((x_test, y_test, y_network), axis=1)
+ print()
+ print('Test score: ', test_score / num_networks)
+
+ result = np.delete(result, 0, 0)
- np.savetxt('results/%s_kera.csv' % input_filename, out, delimiter=',')
? ^
+ np.savetxt('results/%s_kera.csv' % input_filename, result, delimiter=',')
? ^^^ +
if __name__ == "__main__":
main() | 36 | 1.2 | 22 | 14 |
545aba006876923681b167596d0f931f542f7ec4 | CHANGELOG.md | CHANGELOG.md | Changelog
=========
### 0.1.1 March 11, 2014
**Features**
* Add changelog overview (#14)
* Add Change map bounds when loading GeoJSON (#13)
**Bugfixes**
* Fix undefined evaluation when snapping layer is not avaiable (#15)
* Fail gracefully when loading invalid GeoJSON (#12)
### 0.1.0 March 10, 2014
* Implements `#loadGeoJSON()` method (#3)
#### Backwards compability note
* Format of `properties.waypoints` in `#toGeoJSON()` is changed according to #3.
| Changelog
=========
### 0.2.0 (Unreadlesed)
**Features**
* Add option for overriding draw shortcuts (#23)
* Disable shortcuts by setting `options.shortcut` to `false`.
**Breaking changes**
* Default shortcut key for draw enable is `d`
* Default shortcut key for draw disable is `q`
### 0.1.1 March 11, 2014
**Features**
* Add changelog overview (#14)
* Add Change map bounds when loading GeoJSON (#13)
**Bugfixes**
* Fix undefined evaluation when snapping layer is not avaiable (#15)
* Fail gracefully when loading invalid GeoJSON (#12)
### 0.1.0 March 10, 2014
* Implements `#loadGeoJSON()` method (#3)
#### Backwards compability note
* Format of `properties.waypoints` in `#toGeoJSON()` is changed according to #3.
| Add shortcut changes to Changelog | Add shortcut changes to Changelog
| Markdown | bsd-2-clause | Turistforeningen/leaflet-routing,Turistforeningen/leaflet-routing | markdown | ## Code Before:
Changelog
=========
### 0.1.1 March 11, 2014
**Features**
* Add changelog overview (#14)
* Add Change map bounds when loading GeoJSON (#13)
**Bugfixes**
* Fix undefined evaluation when snapping layer is not avaiable (#15)
* Fail gracefully when loading invalid GeoJSON (#12)
### 0.1.0 March 10, 2014
* Implements `#loadGeoJSON()` method (#3)
#### Backwards compability note
* Format of `properties.waypoints` in `#toGeoJSON()` is changed according to #3.
## Instruction:
Add shortcut changes to Changelog
## Code After:
Changelog
=========
### 0.2.0 (Unreadlesed)
**Features**
* Add option for overriding draw shortcuts (#23)
* Disable shortcuts by setting `options.shortcut` to `false`.
**Breaking changes**
* Default shortcut key for draw enable is `d`
* Default shortcut key for draw disable is `q`
### 0.1.1 March 11, 2014
**Features**
* Add changelog overview (#14)
* Add Change map bounds when loading GeoJSON (#13)
**Bugfixes**
* Fix undefined evaluation when snapping layer is not avaiable (#15)
* Fail gracefully when loading invalid GeoJSON (#12)
### 0.1.0 March 10, 2014
* Implements `#loadGeoJSON()` method (#3)
#### Backwards compability note
* Format of `properties.waypoints` in `#toGeoJSON()` is changed according to #3.
| Changelog
=========
+
+ ### 0.2.0 (Unreadlesed)
+
+ **Features**
+
+ * Add option for overriding draw shortcuts (#23)
+ * Disable shortcuts by setting `options.shortcut` to `false`.
+
+ **Breaking changes**
+
+ * Default shortcut key for draw enable is `d`
+ * Default shortcut key for draw disable is `q`
### 0.1.1 March 11, 2014
**Features**
* Add changelog overview (#14)
* Add Change map bounds when loading GeoJSON (#13)
**Bugfixes**
* Fix undefined evaluation when snapping layer is not avaiable (#15)
* Fail gracefully when loading invalid GeoJSON (#12)
### 0.1.0 March 10, 2014
* Implements `#loadGeoJSON()` method (#3)
#### Backwards compability note
* Format of `properties.waypoints` in `#toGeoJSON()` is changed according to #3.
| 12 | 0.521739 | 12 | 0 |
8f399e4211b45fb5eba8f21888b7c84b1d46a26c | site/README.md | site/README.md |
This directory contains the code for the Apache ORC docs site, [orc.apache.org](https://orc.apache.org/).
## Running locally
You can preview your contributions before opening a pull request by running from within the directory:
1. `gem install bundler`
2. `bundle install`
3. `bundle exec jekyll serve`
|
This directory contains the code for the Apache ORC web site,
[orc.apache.org](https://orc.apache.org/).
## Setup
1. `cd site`
2. `git clone https://git-wip-us.apache.org/repos/asf/orc.git -b asf-site target`
3. `sudo gem install bundler`
4. `sudo gem install github-pages jeykll`
4. `bundle install`
## Running locally
You can preview your contributions before opening a pull request by running from within the directory:
3. `bundle exec jekyll serve`
## Pushing to site
1. `cd site/target`
2. `git status`
3. You'll need to `git add` any new files
4. `git commit -a`
5. `git push origin asf-site` | Update the readme for the site. | Update the readme for the site.
| Markdown | apache-2.0 | chunyang-wen/orc,pudidic/orc,chunyang-wen/orc,chunyang-wen/orc,AnatoliShein/orc,omalley/orc,pudidic/orc,AnatoliShein/orc,omalley/orc,apache/orc,apache/orc,xunzhang/orc,apache/orc,omalley/orc,majetideepak/orc,AnatoliShein/orc,omalley/orc,xunzhang/orc,xunzhang/orc,AnatoliShein/orc,pudidic/orc,xunzhang/orc,omalley/orc,majetideepak/orc,AnatoliShein/orc,xunzhang/orc,AnatoliShein/orc,majetideepak/orc,xunzhang/orc,chunyang-wen/orc,chunyang-wen/orc,xunzhang/orc,xunzhang/orc,apache/orc,AnatoliShein/orc,chunyang-wen/orc,apache/orc,chunyang-wen/orc,pudidic/orc,xunzhang/orc,majetideepak/orc,chunyang-wen/orc,omalley/orc,pudidic/orc,majetideepak/orc,pudidic/orc,pudidic/orc,omalley/orc,apache/orc,majetideepak/orc,majetideepak/orc,AnatoliShein/orc,chunyang-wen/orc,apache/orc,apache/orc | markdown | ## Code Before:
This directory contains the code for the Apache ORC docs site, [orc.apache.org](https://orc.apache.org/).
## Running locally
You can preview your contributions before opening a pull request by running from within the directory:
1. `gem install bundler`
2. `bundle install`
3. `bundle exec jekyll serve`
## Instruction:
Update the readme for the site.
## Code After:
This directory contains the code for the Apache ORC web site,
[orc.apache.org](https://orc.apache.org/).
## Setup
1. `cd site`
2. `git clone https://git-wip-us.apache.org/repos/asf/orc.git -b asf-site target`
3. `sudo gem install bundler`
4. `sudo gem install github-pages jeykll`
4. `bundle install`
## Running locally
You can preview your contributions before opening a pull request by running from within the directory:
3. `bundle exec jekyll serve`
## Pushing to site
1. `cd site/target`
2. `git status`
3. You'll need to `git add` any new files
4. `git commit -a`
5. `git push origin asf-site` |
- This directory contains the code for the Apache ORC docs site, [orc.apache.org](https://orc.apache.org/).
+ This directory contains the code for the Apache ORC web site,
+ [orc.apache.org](https://orc.apache.org/).
+
+ ## Setup
+
+ 1. `cd site`
+ 2. `git clone https://git-wip-us.apache.org/repos/asf/orc.git -b asf-site target`
+ 3. `sudo gem install bundler`
+ 4. `sudo gem install github-pages jeykll`
+ 4. `bundle install`
## Running locally
You can preview your contributions before opening a pull request by running from within the directory:
- 1. `gem install bundler`
- 2. `bundle install`
3. `bundle exec jekyll serve`
+
+ ## Pushing to site
+
+ 1. `cd site/target`
+ 2. `git status`
+ 3. You'll need to `git add` any new files
+ 4. `git commit -a`
+ 5. `git push origin asf-site` | 21 | 2.333333 | 18 | 3 |
e24d07e42053207fc4b13339a3f5f86e144e3de9 | doc/ops-guide/source/ops_projects.rst | doc/ops-guide/source/ops_projects.rst | =================
Managing Projects
=================
Users must be associated with at least one project, though they may
belong to many. Therefore, you should add at least one project before
adding users.
Adding Projects
~~~~~~~~~~~~~~~
To create a project through the OpenStack dashboard:
#. Log in as an administrative user.
#. Select the :guilabel:`Identity` tab in the left navigation bar.
#. Under Identity tab, click :guilabel:`Projects`.
#. Click the :guilabel:`Create Project` button.
You are prompted for a project name and an optional, but recommended,
description. Select the checkbox at the bottom of the form to enable
this project. By default, it is enabled, as shown in
:ref:`figure_create_project`.
.. _figure_create_project:
.. figure:: figures/osog_0901.png
:alt: Dashboard's Create Project form
Figure Dashboard's Create Project form
It is also possible to add project members and adjust the project
quotas. We'll discuss those actions later, but in practice, it can be
quite convenient to deal with all these operations at one time.
To add a project through the command line, you must use the OpenStack
command line client.
.. code-block:: console
# openstack project create demo
This command creates a project named "demo." Optionally, you can add a
description string by appending :option:`--description tenant-description`,
which can be very useful. You can also
create a group in a disabled state by appending :option:`--disable` to the
command. By default, projects are created in an enabled state.
| =================
Managing Projects
=================
Users must be associated with at least one project, though they may
belong to many. Therefore, you should add at least one project before
adding users.
Adding Projects
~~~~~~~~~~~~~~~
To create a project through the OpenStack dashboard:
#. Log in as an administrative user.
#. Select the :guilabel:`Identity` tab in the left navigation bar.
#. Under Identity tab, click :guilabel:`Projects`.
#. Click the :guilabel:`Create Project` button.
You are prompted for a project name and an optional, but recommended,
description. Select the checkbox at the bottom of the form to enable
this project. By default, it is enabled, as shown in
:ref:`figure_create_project`.
.. _figure_create_project:
.. figure:: figures/osog_0901.png
:alt: Dashboard's Create Project form
Figure Dashboard's Create Project form
It is also possible to add project members and adjust the project
quotas. We'll discuss those actions later, but in practice, it can be
quite convenient to deal with all these operations at one time.
To add a project through the command line, you must use the OpenStack
command line client.
.. code-block:: console
# openstack project create demo
This command creates a project named ``demo``. Optionally, you can add a
description string by appending :option:`--description PROJECT_DESCRIPTION`,
which can be very useful. You can also
create a group in a disabled state by appending :option:`--disable` to the
command. By default, projects are created in an enabled state.
| Fix typos and enhance description | Fix typos and enhance description
In section 'Adding Projects', 1) “demo.” below might need to
be changed to '“demo”.', 2) 'tenant-description' might be
better if changed to 'project-description'
Change-Id: Ib7a96c86ae95c95563aa51e9c4241103c922676e
Closes-bug: 1604670
| reStructuredText | apache-2.0 | openstack/openstack-manuals,AlekhyaMallina-Vedams/openstack-manuals,openstack/openstack-manuals,AlekhyaMallina-Vedams/openstack-manuals,openstack/openstack-manuals,AlekhyaMallina-Vedams/openstack-manuals,openstack/openstack-manuals | restructuredtext | ## Code Before:
=================
Managing Projects
=================
Users must be associated with at least one project, though they may
belong to many. Therefore, you should add at least one project before
adding users.
Adding Projects
~~~~~~~~~~~~~~~
To create a project through the OpenStack dashboard:
#. Log in as an administrative user.
#. Select the :guilabel:`Identity` tab in the left navigation bar.
#. Under Identity tab, click :guilabel:`Projects`.
#. Click the :guilabel:`Create Project` button.
You are prompted for a project name and an optional, but recommended,
description. Select the checkbox at the bottom of the form to enable
this project. By default, it is enabled, as shown in
:ref:`figure_create_project`.
.. _figure_create_project:
.. figure:: figures/osog_0901.png
:alt: Dashboard's Create Project form
Figure Dashboard's Create Project form
It is also possible to add project members and adjust the project
quotas. We'll discuss those actions later, but in practice, it can be
quite convenient to deal with all these operations at one time.
To add a project through the command line, you must use the OpenStack
command line client.
.. code-block:: console
# openstack project create demo
This command creates a project named "demo." Optionally, you can add a
description string by appending :option:`--description tenant-description`,
which can be very useful. You can also
create a group in a disabled state by appending :option:`--disable` to the
command. By default, projects are created in an enabled state.
## Instruction:
Fix typos and enhance description
In section 'Adding Projects', 1) “demo.” below might need to
be changed to '“demo”.', 2) 'tenant-description' might be
better if changed to 'project-description'
Change-Id: Ib7a96c86ae95c95563aa51e9c4241103c922676e
Closes-bug: 1604670
## Code After:
=================
Managing Projects
=================
Users must be associated with at least one project, though they may
belong to many. Therefore, you should add at least one project before
adding users.
Adding Projects
~~~~~~~~~~~~~~~
To create a project through the OpenStack dashboard:
#. Log in as an administrative user.
#. Select the :guilabel:`Identity` tab in the left navigation bar.
#. Under Identity tab, click :guilabel:`Projects`.
#. Click the :guilabel:`Create Project` button.
You are prompted for a project name and an optional, but recommended,
description. Select the checkbox at the bottom of the form to enable
this project. By default, it is enabled, as shown in
:ref:`figure_create_project`.
.. _figure_create_project:
.. figure:: figures/osog_0901.png
:alt: Dashboard's Create Project form
Figure Dashboard's Create Project form
It is also possible to add project members and adjust the project
quotas. We'll discuss those actions later, but in practice, it can be
quite convenient to deal with all these operations at one time.
To add a project through the command line, you must use the OpenStack
command line client.
.. code-block:: console
# openstack project create demo
This command creates a project named ``demo``. Optionally, you can add a
description string by appending :option:`--description PROJECT_DESCRIPTION`,
which can be very useful. You can also
create a group in a disabled state by appending :option:`--disable` to the
command. By default, projects are created in an enabled state.
| =================
Managing Projects
=================
Users must be associated with at least one project, though they may
belong to many. Therefore, you should add at least one project before
adding users.
Adding Projects
~~~~~~~~~~~~~~~
To create a project through the OpenStack dashboard:
#. Log in as an administrative user.
#. Select the :guilabel:`Identity` tab in the left navigation bar.
#. Under Identity tab, click :guilabel:`Projects`.
#. Click the :guilabel:`Create Project` button.
You are prompted for a project name and an optional, but recommended,
description. Select the checkbox at the bottom of the form to enable
this project. By default, it is enabled, as shown in
:ref:`figure_create_project`.
.. _figure_create_project:
.. figure:: figures/osog_0901.png
:alt: Dashboard's Create Project form
Figure Dashboard's Create Project form
It is also possible to add project members and adjust the project
quotas. We'll discuss those actions later, but in practice, it can be
quite convenient to deal with all these operations at one time.
To add a project through the command line, you must use the OpenStack
command line client.
.. code-block:: console
# openstack project create demo
- This command creates a project named "demo." Optionally, you can add a
? ^ -
+ This command creates a project named ``demo``. Optionally, you can add a
? ^^ ++
- description string by appending :option:`--description tenant-description`,
? ^^^^^^^^^^^^^^^^^^
+ description string by appending :option:`--description PROJECT_DESCRIPTION`,
? ^^^^^^^^^^^^^^^^^^^
which can be very useful. You can also
create a group in a disabled state by appending :option:`--disable` to the
command. By default, projects are created in an enabled state. | 4 | 0.081633 | 2 | 2 |
eb26baf49634cc1547b2a8a9be0befac551c11ac | src/app-webapp/app/components/new-question/new-question.js | src/app-webapp/app/components/new-question/new-question.js | import React from 'react'
class NewQuestion extends React.Component {
render () {
return <div>
<h2>Create a New Question</h2>
</div>
}
}
export default NewQuestion
| import React from 'react'
import {
Form, FormGroup, ControlLabel, FormControl, Col
} from 'react-bootstrap'
import Button from 'react-bootstrap-button-loader'
class NewQuestion extends React.Component {
constructor (props) {
super(props)
this.state = { question: '', answer: '', isLoading: false }
this.handleChange = this.handleChange.bind(this)
this.onSubmit = this.onSubmit.bind(this)
}
render () {
return <div>
<h2>Create a New Question</h2>
<Form horizontal>
<FormGroup>
<Col sm={2} componentClass={ControlLabel}>Question</Col>
<Col sm={10}>
<FormControl
componentClass='textarea'
name='question'
value={this.state.question}
placeholder='Enter your question here...'
onChange={this.handleChange}
/>
</Col>
</FormGroup>
<FormGroup>
<Col sm={2} componentClass={ControlLabel}>Answer</Col>
<Col sm={10}>
<FormControl
type='text'
name='answer'
value={this.state.answer}
placeholder='Enter the correct answer here...'
onChange={this.handleChange}
/>
</Col>
</FormGroup>
<FormGroup>
<Col smOffset={2} sm={10}>
<Button
bsStyle='primary'
bsSize='large'
loading={this.state.isLoading}
disabled={!this.validFormInput()}
onClick={this.onSubmit}
>
Create
</Button>
</Col>
</FormGroup>
</Form>
</div>
}
handleChange (event) {
let target = event.target
this.setState({ [target.name]: target.value })
}
onSubmit () {
this.setState({ isLoading: true })
}
validFormInput () {
return this.state.question !== '' && this.state.answer !== ''
}
}
export default NewQuestion
| Add form for for creating a new question. | Add form for for creating a new question.
| JavaScript | mit | Charterhouse/NextBuild2017,Charterhouse/NextBuild2017,Charterhouse/NextBuild2017 | javascript | ## Code Before:
import React from 'react'
class NewQuestion extends React.Component {
render () {
return <div>
<h2>Create a New Question</h2>
</div>
}
}
export default NewQuestion
## Instruction:
Add form for for creating a new question.
## Code After:
import React from 'react'
import {
Form, FormGroup, ControlLabel, FormControl, Col
} from 'react-bootstrap'
import Button from 'react-bootstrap-button-loader'
class NewQuestion extends React.Component {
constructor (props) {
super(props)
this.state = { question: '', answer: '', isLoading: false }
this.handleChange = this.handleChange.bind(this)
this.onSubmit = this.onSubmit.bind(this)
}
render () {
return <div>
<h2>Create a New Question</h2>
<Form horizontal>
<FormGroup>
<Col sm={2} componentClass={ControlLabel}>Question</Col>
<Col sm={10}>
<FormControl
componentClass='textarea'
name='question'
value={this.state.question}
placeholder='Enter your question here...'
onChange={this.handleChange}
/>
</Col>
</FormGroup>
<FormGroup>
<Col sm={2} componentClass={ControlLabel}>Answer</Col>
<Col sm={10}>
<FormControl
type='text'
name='answer'
value={this.state.answer}
placeholder='Enter the correct answer here...'
onChange={this.handleChange}
/>
</Col>
</FormGroup>
<FormGroup>
<Col smOffset={2} sm={10}>
<Button
bsStyle='primary'
bsSize='large'
loading={this.state.isLoading}
disabled={!this.validFormInput()}
onClick={this.onSubmit}
>
Create
</Button>
</Col>
</FormGroup>
</Form>
</div>
}
handleChange (event) {
let target = event.target
this.setState({ [target.name]: target.value })
}
onSubmit () {
this.setState({ isLoading: true })
}
validFormInput () {
return this.state.question !== '' && this.state.answer !== ''
}
}
export default NewQuestion
| import React from 'react'
+ import {
+ Form, FormGroup, ControlLabel, FormControl, Col
+ } from 'react-bootstrap'
+ import Button from 'react-bootstrap-button-loader'
class NewQuestion extends React.Component {
+
+ constructor (props) {
+ super(props)
+ this.state = { question: '', answer: '', isLoading: false }
+ this.handleChange = this.handleChange.bind(this)
+ this.onSubmit = this.onSubmit.bind(this)
+ }
+
render () {
return <div>
<h2>Create a New Question</h2>
+ <Form horizontal>
+ <FormGroup>
+ <Col sm={2} componentClass={ControlLabel}>Question</Col>
+ <Col sm={10}>
+ <FormControl
+ componentClass='textarea'
+ name='question'
+ value={this.state.question}
+ placeholder='Enter your question here...'
+ onChange={this.handleChange}
+ />
+ </Col>
+ </FormGroup>
+ <FormGroup>
+ <Col sm={2} componentClass={ControlLabel}>Answer</Col>
+ <Col sm={10}>
+ <FormControl
+ type='text'
+ name='answer'
+ value={this.state.answer}
+ placeholder='Enter the correct answer here...'
+ onChange={this.handleChange}
+ />
+ </Col>
+ </FormGroup>
+ <FormGroup>
+ <Col smOffset={2} sm={10}>
+ <Button
+ bsStyle='primary'
+ bsSize='large'
+ loading={this.state.isLoading}
+ disabled={!this.validFormInput()}
+ onClick={this.onSubmit}
+ >
+ Create
+ </Button>
+ </Col>
+ </FormGroup>
+ </Form>
</div>
}
+
+ handleChange (event) {
+ let target = event.target
+ this.setState({ [target.name]: target.value })
+ }
+
+ onSubmit () {
+ this.setState({ isLoading: true })
+ }
+
+ validFormInput () {
+ return this.state.question !== '' && this.state.answer !== ''
+ }
+
}
export default NewQuestion | 65 | 5.909091 | 65 | 0 |
77ea218dd7d99d34ba635034e2c70436864d8bb1 | metadata/fr.xgouchet.packageexplorer.yml | metadata/fr.xgouchet.packageexplorer.yml | Categories:
- Development
License: MIT
SourceCode: https://github.com/xgouchet/Stanley
IssueTracker: https://github.com/xgouchet/Stanley/issues
Donate: https://www.paypal.me/xaviergouchet
AutoName: Stanley
RepoType: git
Repo: https://github.com/xgouchet/Stanley
Builds:
- versionName: '2.3'
versionCode: 15
commit: v2.3
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
- versionName: '2.4'
versionCode: 16
commit: v2.4
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: '2.4'
CurrentVersionCode: 16
| Categories:
- Development
License: MIT
SourceCode: https://github.com/xgouchet/Stanley
IssueTracker: https://github.com/xgouchet/Stanley/issues
Donate: https://www.paypal.me/xaviergouchet
AutoName: Stanley
RepoType: git
Repo: https://github.com/xgouchet/Stanley
Builds:
- versionName: '2.3'
versionCode: 15
commit: v2.3
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
- versionName: '2.4'
versionCode: 16
commit: v2.4
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
- versionName: '2.5'
versionCode: 17
commit: v2.5
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: '2.5'
CurrentVersionCode: 17
| Update Stanley to 2.5 (17) | Update Stanley to 2.5 (17)
| YAML | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroiddata | yaml | ## Code Before:
Categories:
- Development
License: MIT
SourceCode: https://github.com/xgouchet/Stanley
IssueTracker: https://github.com/xgouchet/Stanley/issues
Donate: https://www.paypal.me/xaviergouchet
AutoName: Stanley
RepoType: git
Repo: https://github.com/xgouchet/Stanley
Builds:
- versionName: '2.3'
versionCode: 15
commit: v2.3
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
- versionName: '2.4'
versionCode: 16
commit: v2.4
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: '2.4'
CurrentVersionCode: 16
## Instruction:
Update Stanley to 2.5 (17)
## Code After:
Categories:
- Development
License: MIT
SourceCode: https://github.com/xgouchet/Stanley
IssueTracker: https://github.com/xgouchet/Stanley/issues
Donate: https://www.paypal.me/xaviergouchet
AutoName: Stanley
RepoType: git
Repo: https://github.com/xgouchet/Stanley
Builds:
- versionName: '2.3'
versionCode: 15
commit: v2.3
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
- versionName: '2.4'
versionCode: 16
commit: v2.4
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
- versionName: '2.5'
versionCode: 17
commit: v2.5
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: '2.5'
CurrentVersionCode: 17
| Categories:
- Development
License: MIT
SourceCode: https://github.com/xgouchet/Stanley
IssueTracker: https://github.com/xgouchet/Stanley/issues
Donate: https://www.paypal.me/xaviergouchet
AutoName: Stanley
RepoType: git
Repo: https://github.com/xgouchet/Stanley
Builds:
- versionName: '2.3'
versionCode: 15
commit: v2.3
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
- versionName: '2.4'
versionCode: 16
commit: v2.4
subdir: app
gradle:
- yes
scanignore:
- buildSrc/build/
+ - versionName: '2.5'
+ versionCode: 17
+ commit: v2.5
+ subdir: app
+ gradle:
+ - yes
+ scanignore:
+ - buildSrc/build/
+
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
- CurrentVersion: '2.4'
? ^
+ CurrentVersion: '2.5'
? ^
- CurrentVersionCode: 16
? ^
+ CurrentVersionCode: 17
? ^
| 13 | 0.371429 | 11 | 2 |
0c3e1fb2e5ad9cfde759f9a6c262387e0a989065 | spec/integration/shared/valid_and_invalid_model.rb | spec/integration/shared/valid_and_invalid_model.rb | describe "valid model", :shared => true do
it "is valid" do
@model.should be_valid
end
it "has no error messages" do
@model.errors.should be_empty
end
end
describe "invalid model", :shared => true do
it "is NOT valid" do
@model.should_not be_valid
end
end
| describe "valid model", :shared => true do
it "is valid" do
@model.should be_valid
end
it "has no error messages" do
@model.errors.should be_empty
end
it "has empty list of full error messages" do
@model.errors.full_messages.should be_empty
end
end
describe "invalid model", :shared => true do
it "is NOT valid" do
@model.should_not be_valid
end
it "has error messages" do
@model.errors.should_not be_blank
end
it "has list of full error messages" do
@model.errors.full_messages.should_not be_blank
end
end | Add error message expectations to 'valid model' and 'invalid model' shared spec groups | [dm-validations] Add error message expectations to 'valid model' and 'invalid model' shared spec groups
| Ruby | mit | emmanuel/aequitas,datamapper/dm-validations,mbj/aequitas | ruby | ## Code Before:
describe "valid model", :shared => true do
it "is valid" do
@model.should be_valid
end
it "has no error messages" do
@model.errors.should be_empty
end
end
describe "invalid model", :shared => true do
it "is NOT valid" do
@model.should_not be_valid
end
end
## Instruction:
[dm-validations] Add error message expectations to 'valid model' and 'invalid model' shared spec groups
## Code After:
describe "valid model", :shared => true do
it "is valid" do
@model.should be_valid
end
it "has no error messages" do
@model.errors.should be_empty
end
it "has empty list of full error messages" do
@model.errors.full_messages.should be_empty
end
end
describe "invalid model", :shared => true do
it "is NOT valid" do
@model.should_not be_valid
end
it "has error messages" do
@model.errors.should_not be_blank
end
it "has list of full error messages" do
@model.errors.full_messages.should_not be_blank
end
end | describe "valid model", :shared => true do
it "is valid" do
@model.should be_valid
end
it "has no error messages" do
@model.errors.should be_empty
end
+
+ it "has empty list of full error messages" do
+ @model.errors.full_messages.should be_empty
+ end
end
describe "invalid model", :shared => true do
it "is NOT valid" do
@model.should_not be_valid
end
+
+ it "has error messages" do
+ @model.errors.should_not be_blank
+ end
+
+ it "has list of full error messages" do
+ @model.errors.full_messages.should_not be_blank
+ end
end | 12 | 0.8 | 12 | 0 |