| commit (string, 40 chars) | old_file (string, 4–184 chars) | new_file (string, 4–184 chars) | old_contents (string, 1–3.6k chars) | new_contents (string, 5–3.38k chars) | subject (string, 15–778 chars) | message (string, 16–6.74k chars) | lang (201 classes) | license (13 classes) | repos (string, 6–116k chars) | config (201 classes) | content (string, 137–7.24k chars) | diff (string, 26–5.55k chars) | diff_length (int64, 1–123) | relative_diff_length (float64, 0.01–89) | n_lines_added (int64, 0–108) | n_lines_deleted (int64, 0–106) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
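Each record below is one flattened row of this schema. As a minimal sketch of working with the rows programmatically (assuming they are exported as JSON Lines with exactly these field names; that export format is an assumption, since the distribution format is not shown in this dump), the records can be loaded and sanity-checked like this:

```python
import json

# Field names taken from the schema header above.
FIELDS = [
    "commit", "old_file", "new_file", "old_contents", "new_contents",
    "subject", "message", "lang", "license", "repos", "config",
    "content", "diff", "diff_length", "relative_diff_length",
    "n_lines_added", "n_lines_deleted",
]

def load_records(path):
    """Yield one dict per JSONL line, checking every expected column is present.

    `path` is a hypothetical local file; the real export location is not given here.
    """
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            rec = json.loads(line)
            missing = [f for f in FIELDS if f not in rec]
            if missing:
                raise ValueError(f"record missing fields: {missing}")
            yield rec
```

A loader like this makes it easy to spot-check, for example, that `commit` is always a 40-character hash before feeding the rows to anything downstream.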
dc42b819082dfae314643d3440fe0e19967b5cfc | README.md | README.md | dotfiles
========
Gilbert's dot files. Thanks to @mikedodge04 for helping me setup a bunch of
these a while ago.
Please note that there may be files from other repositories. These files are
have been accumalted over a long period of time. If you find some code the
belongs to someone else, please let me know so I can properly attribute them.
# Pre-Requisites
- ZSH
- Oh-my-zsh: https://github.com/robbyrussell/oh-my-zsh
- pathogen: https://github.com/tpope/vim-pathogen
- Solarized theme: http://ethanschoonover.com/solarized/vim-colors-solarized
# Recommended
- Powerline Fonthttps://github.com/powerline/fonts
- Menlo font: https://gist.github.com/qrush/1595572
| dotfiles
========
Gilbert's dot files. Thanks to @mikedodge04 for helping me setup a bunch of
these a while ago.
Please note that there may be files from other repositories. These files are
have been accumalted over a long period of time. If you find some code the
belongs to someone else, please let me know so I can properly attribute them.
# Pre-Requisites
- ZSH
- Oh-my-zsh: https://github.com/robbyrussell/oh-my-zsh
- pathogen: https://github.com/tpope/vim-pathogen
- Solarized theme: http://ethanschoonover.com/solarized/vim-colors-solarized
# Recommended
- Powerline Fonthttps://github.com/powerline/fonts
- Menlo font: https://gist.github.com/qrush/1595572
# Setup
git clone git@github.com:HeyItsGilbert/dotfiles.git
mv dotfiles/*(DN) ~/
chmod +x .shell/setup.sh
. .shell/setup.sh
| Add setup instructions to readme | Add setup instructions to readme
I always forget the mv shortcuts. Why not put them in the readme? | Markdown | mit | HeyItsGilbert/dotfiles | markdown | ## Code Before:
dotfiles
========
Gilbert's dot files. Thanks to @mikedodge04 for helping me setup a bunch of
these a while ago.
Please note that there may be files from other repositories. These files are
have been accumalted over a long period of time. If you find some code the
belongs to someone else, please let me know so I can properly attribute them.
# Pre-Requisites
- ZSH
- Oh-my-zsh: https://github.com/robbyrussell/oh-my-zsh
- pathogen: https://github.com/tpope/vim-pathogen
- Solarized theme: http://ethanschoonover.com/solarized/vim-colors-solarized
# Recommended
- Powerline Fonthttps://github.com/powerline/fonts
- Menlo font: https://gist.github.com/qrush/1595572
## Instruction:
Add setup instructions to readme
I always forget the mv shortcuts. Why not put them in the readme?
## Code After:
dotfiles
========
Gilbert's dot files. Thanks to @mikedodge04 for helping me setup a bunch of
these a while ago.
Please note that there may be files from other repositories. These files are
have been accumalted over a long period of time. If you find some code the
belongs to someone else, please let me know so I can properly attribute them.
# Pre-Requisites
- ZSH
- Oh-my-zsh: https://github.com/robbyrussell/oh-my-zsh
- pathogen: https://github.com/tpope/vim-pathogen
- Solarized theme: http://ethanschoonover.com/solarized/vim-colors-solarized
# Recommended
- Powerline Fonthttps://github.com/powerline/fonts
- Menlo font: https://gist.github.com/qrush/1595572
# Setup
git clone git@github.com:HeyItsGilbert/dotfiles.git
mv dotfiles/*(DN) ~/
chmod +x .shell/setup.sh
. .shell/setup.sh
| dotfiles
========
Gilbert's dot files. Thanks to @mikedodge04 for helping me setup a bunch of
these a while ago.
Please note that there may be files from other repositories. These files are
have been accumalted over a long period of time. If you find some code the
belongs to someone else, please let me know so I can properly attribute them.
# Pre-Requisites
- ZSH
- Oh-my-zsh: https://github.com/robbyrussell/oh-my-zsh
- pathogen: https://github.com/tpope/vim-pathogen
- Solarized theme: http://ethanschoonover.com/solarized/vim-colors-solarized
# Recommended
- Powerline Fonthttps://github.com/powerline/fonts
- Menlo font: https://gist.github.com/qrush/1595572
+
+ # Setup
+ git clone git@github.com:HeyItsGilbert/dotfiles.git
+ mv dotfiles/*(DN) ~/
+ chmod +x .shell/setup.sh
+ . .shell/setup.sh | 6 | 0.315789 | 6 | 0 |
7350e5112a37ad19fe219462056a8f7ca55fd49f | src/Genes/King_Protection_Gene.cpp | src/Genes/King_Protection_Gene.cpp |
double King_Protection_Gene::score_board(const Board& board, Piece_Color perspective, size_t, double) const noexcept
{
auto square_count = 0;
for(size_t attack_index = 0; attack_index < 16; ++attack_index)
{
auto step = Move::attack_direction_from_index(attack_index);
for(auto square : Square::square_line_from(board.find_king(perspective), step))
{
if(board.piece_on_square(square))
{
break;
}
else
{
++square_count;
}
if(attack_index >= 8) // knight move
{
break;
}
}
}
constexpr const int max_square_count = 8 // knight attack
+ 7 + 7 // rooks/queen row/column attack
+ 7 + 6; // bishop/queen/pawn attack
return double(max_square_count - square_count)/max_square_count; // return score [0, 1]
}
std::string King_Protection_Gene::name() const noexcept
{
return "King Protection Gene";
}
|
double King_Protection_Gene::score_board(const Board& board, Piece_Color perspective, size_t, double) const noexcept
{
auto square_count = 0;
auto king_square = board.find_king(perspective);
for(size_t attack_index = 0; attack_index < 16; ++attack_index)
{
auto step = Move::attack_direction_from_index(attack_index);
for(auto square : Square::square_line_from(king_square, step))
{
if(board.piece_on_square(square))
{
break;
}
else
{
++square_count;
}
if(attack_index >= 8) // knight move
{
break;
}
}
}
constexpr const int max_square_count = 8 // knight attack
+ 7 + 7 // rooks/queen row/column attack
+ 7 + 6; // bishop/queen/pawn attack
return double(max_square_count - square_count)/max_square_count; // return score [0, 1]
}
std::string King_Protection_Gene::name() const noexcept
{
return "King Protection Gene";
}
| Move unchanging lookup to outside of loop | Move unchanging lookup to outside of loop
| C++ | mit | MarkZH/Genetic_Chess,MarkZH/Genetic_Chess,MarkZH/Genetic_Chess,MarkZH/Genetic_Chess,MarkZH/Genetic_Chess | c++ | ## Code Before:
double King_Protection_Gene::score_board(const Board& board, Piece_Color perspective, size_t, double) const noexcept
{
auto square_count = 0;
for(size_t attack_index = 0; attack_index < 16; ++attack_index)
{
auto step = Move::attack_direction_from_index(attack_index);
for(auto square : Square::square_line_from(board.find_king(perspective), step))
{
if(board.piece_on_square(square))
{
break;
}
else
{
++square_count;
}
if(attack_index >= 8) // knight move
{
break;
}
}
}
constexpr const int max_square_count = 8 // knight attack
+ 7 + 7 // rooks/queen row/column attack
+ 7 + 6; // bishop/queen/pawn attack
return double(max_square_count - square_count)/max_square_count; // return score [0, 1]
}
std::string King_Protection_Gene::name() const noexcept
{
return "King Protection Gene";
}
## Instruction:
Move unchanging lookup to outside of loop
## Code After:
double King_Protection_Gene::score_board(const Board& board, Piece_Color perspective, size_t, double) const noexcept
{
auto square_count = 0;
auto king_square = board.find_king(perspective);
for(size_t attack_index = 0; attack_index < 16; ++attack_index)
{
auto step = Move::attack_direction_from_index(attack_index);
for(auto square : Square::square_line_from(king_square, step))
{
if(board.piece_on_square(square))
{
break;
}
else
{
++square_count;
}
if(attack_index >= 8) // knight move
{
break;
}
}
}
constexpr const int max_square_count = 8 // knight attack
+ 7 + 7 // rooks/queen row/column attack
+ 7 + 6; // bishop/queen/pawn attack
return double(max_square_count - square_count)/max_square_count; // return score [0, 1]
}
std::string King_Protection_Gene::name() const noexcept
{
return "King Protection Gene";
}
|
double King_Protection_Gene::score_board(const Board& board, Piece_Color perspective, size_t, double) const noexcept
{
auto square_count = 0;
+ auto king_square = board.find_king(perspective);
for(size_t attack_index = 0; attack_index < 16; ++attack_index)
{
auto step = Move::attack_direction_from_index(attack_index);
- for(auto square : Square::square_line_from(board.find_king(perspective), step))
? ----------- ^^ ----------
+ for(auto square : Square::square_line_from(king_square, step))
? ^^^^^^
{
if(board.piece_on_square(square))
{
break;
}
else
{
++square_count;
}
if(attack_index >= 8) // knight move
{
break;
}
}
}
constexpr const int max_square_count = 8 // knight attack
+ 7 + 7 // rooks/queen row/column attack
+ 7 + 6; // bishop/queen/pawn attack
return double(max_square_count - square_count)/max_square_count; // return score [0, 1]
}
std::string King_Protection_Gene::name() const noexcept
{
return "King Protection Gene";
} | 3 | 0.083333 | 2 | 1 |
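The `diff` column in these records mixes `-` and `+` lines with `? ` guide lines whose markers point at the changed characters inside a line. That shape matches the delta format of Python's `difflib.ndiff`; the following is a sketch of reproducing it for the changed line in the record above (an assumption about how the column was generated, not a confirmed pipeline detail):

```python
import difflib

# The two versions of the changed loop line from the record above.
old = ["        for(auto square : Square::square_line_from(board.find_king(perspective), step))\n"]
new = ["        for(auto square : Square::square_line_from(king_square, step))\n"]

# ndiff emits "- "/"+ " lines for the old/new text plus "? " guide lines
# that mark the intra-line edits, the same shape as the diff column here.
for line in difflib.ndiff(old, new):
    print(line, end="")
```

Running this prints a `-` line, a `? ` guide under it, a `+` line, and another `? ` guide, mirroring the four-line group shown in the record.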
44e2156e97dd19af928b517dade7b717acfacb38 | README.md | README.md | FactoryJill
===========
## Usage
```gradle
sourceCompatibility = 1.8
dependencies {
testCompile('com.github.markymarkmcdonald:FactoryJill:2.0.0')
}
```
## Features
Supported:
- Defining reusable factories
- Overriding fields
- Associations: Defining fields as instances of other factories
Not Supported:
- Aliases
- Sequences
- Building a list of factories
Not Planned to be supported:
- Factory Inheritance
- Lazily setting fields (based off of other properties)
- Callbacks (pre-build, post-build)
## Requirements
- Java 8
- A public getter and setter for each field defined or overridden.
## Examples
Defining a factory:
```java
factory("truck", Car.class, ImmutableMap.of(
"make", "ford",
"convertible", false,
"yearsOwned", 5,
"releaseDate", new Date()
));
```
Building an instance from a factory:
```java
Car pickupTruck = build("truck");
assert pickupTruck.getMake().equals("ford");
```
Sometimes you need to override specific properties:
```java
Car convertible = build("truck", ImmutableMap.of("convertible", true));
assert convertible.getConvertible().equals(true);
```
## Full API
For now FactoryJill provides a public interface with two methods, `FactoryJill.factory` and `FactoryJill.build`.
Check out the test directory for a complete set of usage examples.
| FactoryJill
===========
## Usage
```gradle
sourceCompatibility = 1.8
dependencies {
testCompile('com.github.markymarkmcdonald:FactoryJill:2.0.0')
}
```
## Features
Supported:
- Defining reusable factories
- Overriding fields
- Associations: Defining fields as instances of other factories
Not Supported:
- Aliases
- Sequences
- Building a list of factories
Not Planned to be supported:
- Factory Inheritance
- Lazily setting fields (based off of other properties)
- Callbacks (pre-build, post-build)
## Requirements
- Java 8
- A public getter and setter for each field defined or overridden.
## Examples
Defining a factory:
```java
factory("truck", Car.class, ImmutableMap.of(
"make", "ford",
"convertible", false,
"yearsOwned", 5,
"releaseDate", new Date()
));
```
Building an instance from a factory:
```java
Car pickupTruck = build("truck");
```
Sometimes you need to override specific properties:
```java
Car convertible = build("truck", ImmutableMap.of("convertible", true));
```
Building multiple instances from a factoy:
```java
List<Car> pickupTrucks = buildMultiple("truck", 5);
```
## Full API
For now FactoryJill provides a public interface with three methods, `FactoryJill.factory`, `FactoryJill.build`, and `FactoryJill.buildMultiple`.
Check out the test directory for a complete set of usage examples.
| Add buildMutliple examples to Readme | Add buildMutliple examples to Readme
| Markdown | mit | MarkyMarkMcDonald/FactoryJill | markdown | ## Code Before:
FactoryJill
===========
## Usage
```gradle
sourceCompatibility = 1.8
dependencies {
testCompile('com.github.markymarkmcdonald:FactoryJill:2.0.0')
}
```
## Features
Supported:
- Defining reusable factories
- Overriding fields
- Associations: Defining fields as instances of other factories
Not Supported:
- Aliases
- Sequences
- Building a list of factories
Not Planned to be supported:
- Factory Inheritance
- Lazily setting fields (based off of other properties)
- Callbacks (pre-build, post-build)
## Requirements
- Java 8
- A public getter and setter for each field defined or overridden.
## Examples
Defining a factory:
```java
factory("truck", Car.class, ImmutableMap.of(
"make", "ford",
"convertible", false,
"yearsOwned", 5,
"releaseDate", new Date()
));
```
Building an instance from a factory:
```java
Car pickupTruck = build("truck");
assert pickupTruck.getMake().equals("ford");
```
Sometimes you need to override specific properties:
```java
Car convertible = build("truck", ImmutableMap.of("convertible", true));
assert convertible.getConvertible().equals(true);
```
## Full API
For now FactoryJill provides a public interface with two methods, `FactoryJill.factory` and `FactoryJill.build`.
Check out the test directory for a complete set of usage examples.
## Instruction:
Add buildMutliple examples to Readme
## Code After:
FactoryJill
===========
## Usage
```gradle
sourceCompatibility = 1.8
dependencies {
testCompile('com.github.markymarkmcdonald:FactoryJill:2.0.0')
}
```
## Features
Supported:
- Defining reusable factories
- Overriding fields
- Associations: Defining fields as instances of other factories
Not Supported:
- Aliases
- Sequences
- Building a list of factories
Not Planned to be supported:
- Factory Inheritance
- Lazily setting fields (based off of other properties)
- Callbacks (pre-build, post-build)
## Requirements
- Java 8
- A public getter and setter for each field defined or overridden.
## Examples
Defining a factory:
```java
factory("truck", Car.class, ImmutableMap.of(
"make", "ford",
"convertible", false,
"yearsOwned", 5,
"releaseDate", new Date()
));
```
Building an instance from a factory:
```java
Car pickupTruck = build("truck");
```
Sometimes you need to override specific properties:
```java
Car convertible = build("truck", ImmutableMap.of("convertible", true));
```
Building multiple instances from a factoy:
```java
List<Car> pickupTrucks = buildMultiple("truck", 5);
```
## Full API
For now FactoryJill provides a public interface with three methods, `FactoryJill.factory`, `FactoryJill.build`, and `FactoryJill.buildMultiple`.
Check out the test directory for a complete set of usage examples.
| FactoryJill
===========
## Usage
```gradle
sourceCompatibility = 1.8
dependencies {
testCompile('com.github.markymarkmcdonald:FactoryJill:2.0.0')
}
```
## Features
Supported:
- Defining reusable factories
- Overriding fields
- Associations: Defining fields as instances of other factories
Not Supported:
- Aliases
- Sequences
- Building a list of factories
Not Planned to be supported:
- Factory Inheritance
- Lazily setting fields (based off of other properties)
- Callbacks (pre-build, post-build)
## Requirements
- Java 8
- A public getter and setter for each field defined or overridden.
## Examples
Defining a factory:
```java
factory("truck", Car.class, ImmutableMap.of(
"make", "ford",
"convertible", false,
"yearsOwned", 5,
"releaseDate", new Date()
));
```
Building an instance from a factory:
```java
Car pickupTruck = build("truck");
- assert pickupTruck.getMake().equals("ford");
```
Sometimes you need to override specific properties:
```java
Car convertible = build("truck", ImmutableMap.of("convertible", true));
- assert convertible.getConvertible().equals(true);
+ ```
+
+ Building multiple instances from a factoy:
+ ```java
+ List<Car> pickupTrucks = buildMultiple("truck", 5);
```
## Full API
- For now FactoryJill provides a public interface with two methods, `FactoryJill.factory` and `FactoryJill.build`.
? ^^
+ For now FactoryJill provides a public interface with three methods, `FactoryJill.factory`, `FactoryJill.build`, and `FactoryJill.buildMultiple`.
? ^^^^ ++++++++++++++++++++++ ++++++++
Check out the test directory for a complete set of usage examples. | 9 | 0.157895 | 6 | 3 |
be46a106ef9961bef03860a950077bd8f737a263 | README.md | README.md | [](https://travis-ci.org/caulagi/toystori)

toystori - A website for all the lovely mothers and children to
* Share your toys
* Borrow interesting ones
* Make new friends
* And best of all... **free to use**
Built with love using Nodejs and Mongodb during Bangalore Hack
## Dev setup
* Install mongodb - [download](http://www.mongodb.org/downloads)
* Install dependencies
$ npm install
* GO!
$ npm start
## Tests
Yup!
$ npm test
## License
Licensed under [MIT][1]
[1]: https://github.com/caulagi/toystori/blob/master/LICENSE.mit
[2]: http://www.toystori.com
|
** THIS PROJECT IS NOT MAINTAINED ANYMORE **

toystori - A website for all the lovely mothers and children to
* Share your toys
* Borrow interesting ones
* Make new friends
* And best of all... **free to use**
Built with love using Nodejs and Mongodb during Bangalore Hack
## Dev setup
* Install mongodb - [download](http://www.mongodb.org/downloads)
* Install dependencies
$ npm install
* GO!
$ npm start
## Tests
Yup!
$ npm test
## License
Licensed under [MIT][1]
[1]: https://github.com/caulagi/toystori/blob/master/LICENSE.mit
[2]: http://www.toystori.com
| Update status of project in readme | Update status of project in readme
| Markdown | mit | caulagi/toystori | markdown | ## Code Before:
[](https://travis-ci.org/caulagi/toystori)

toystori - A website for all the lovely mothers and children to
* Share your toys
* Borrow interesting ones
* Make new friends
* And best of all... **free to use**
Built with love using Nodejs and Mongodb during Bangalore Hack
## Dev setup
* Install mongodb - [download](http://www.mongodb.org/downloads)
* Install dependencies
$ npm install
* GO!
$ npm start
## Tests
Yup!
$ npm test
## License
Licensed under [MIT][1]
[1]: https://github.com/caulagi/toystori/blob/master/LICENSE.mit
[2]: http://www.toystori.com
## Instruction:
Update status of project in readme
## Code After:
** THIS PROJECT IS NOT MAINTAINED ANYMORE **

toystori - A website for all the lovely mothers and children to
* Share your toys
* Borrow interesting ones
* Make new friends
* And best of all... **free to use**
Built with love using Nodejs and Mongodb during Bangalore Hack
## Dev setup
* Install mongodb - [download](http://www.mongodb.org/downloads)
* Install dependencies
$ npm install
* GO!
$ npm start
## Tests
Yup!
$ npm test
## License
Licensed under [MIT][1]
[1]: https://github.com/caulagi/toystori/blob/master/LICENSE.mit
[2]: http://www.toystori.com
| - [](https://travis-ci.org/caulagi/toystori)
+
+ ** THIS PROJECT IS NOT MAINTAINED ANYMORE **

toystori - A website for all the lovely mothers and children to
* Share your toys
* Borrow interesting ones
* Make new friends
* And best of all... **free to use**
Built with love using Nodejs and Mongodb during Bangalore Hack
## Dev setup
* Install mongodb - [download](http://www.mongodb.org/downloads)
* Install dependencies
$ npm install
* GO!
$ npm start
## Tests
Yup!
$ npm test
## License
Licensed under [MIT][1]
[1]: https://github.com/caulagi/toystori/blob/master/LICENSE.mit
[2]: http://www.toystori.com | 3 | 0.078947 | 2 | 1 |
ef96a54e717ead84d24a77955a815793f30b77d5 | db/migrate/20150520102406_add_preferences_to_spree_promotion_action.rb | db/migrate/20150520102406_add_preferences_to_spree_promotion_action.rb | class AddPreferencesToSpreePromotionAction < ActiveRecord::Migration
def change
unless column_exists? :preferences
add_column :spree_promotion_actions, :preferences, :text
end
end
end
| class AddPreferencesToSpreePromotionAction < ActiveRecord::Migration
def change
unless column_exists?(:spree_promotion_actions, :preferences)
add_column :spree_promotion_actions, :preferences, :text
end
end
end
| Fix checking column existence on migration | Fix checking column existence on migration
| Ruby | bsd-3-clause | onedanshow/spree_conditional_promotion_actions,onedanshow/spree_conditional_promotion_actions,onedanshow/spree_conditional_promotion_actions | ruby | ## Code Before:
class AddPreferencesToSpreePromotionAction < ActiveRecord::Migration
def change
unless column_exists? :preferences
add_column :spree_promotion_actions, :preferences, :text
end
end
end
## Instruction:
Fix checking column existence on migration
## Code After:
class AddPreferencesToSpreePromotionAction < ActiveRecord::Migration
def change
unless column_exists?(:spree_promotion_actions, :preferences)
add_column :spree_promotion_actions, :preferences, :text
end
end
end
| class AddPreferencesToSpreePromotionAction < ActiveRecord::Migration
def change
- unless column_exists? :preferences
+ unless column_exists?(:spree_promotion_actions, :preferences)
add_column :spree_promotion_actions, :preferences, :text
end
end
end | 2 | 0.285714 | 1 | 1 |
5cc8c628db922b1b2e49bbdb12ba7fb09259a4fa | presto-server-main/etc/catalog/hive.properties | presto-server-main/etc/catalog/hive.properties |
connector.name=hive-hadoop2
# Configuration appropriate for Hive as started by product test environment, e.g.
# presto-product-tests-launcher/bin/run-launcher env up --environment singlenode --without-presto
# On Mac, this additionally requires that you add "<your external IP> hadoop-master" to /etc/hosts
hive.metastore.uri=thrift://localhost:9083
hive.hdfs.socks-proxy=localhost:1180
# Fail-fast in development
hive.metastore.thrift.client.max-retry-time=1s
|
connector.name=hive-hadoop2
# Configuration appropriate for Hive as started by product test environment, e.g.
# presto-product-tests-launcher/bin/run-launcher env up --environment singlenode --without-presto
# On Mac, this additionally requires that you add "<your external IP> hadoop-master" to /etc/hosts
hive.metastore.uri=thrift://localhost:9083
hive.hdfs.socks-proxy=localhost:1180
# Fail-fast in development
hive.metastore.thrift.client.max-retry-time=1s
# Be permissive in development
hive.allow-add-column=true
hive.allow-drop-column=true
hive.allow-drop-table=true
hive.allow-rename-table=true
hive.allow-comment-table=true
hive.allow-comment-column=true
hive.allow-rename-column=true
| Allow drop table and others in developer's config | Allow drop table and others in developer's config
| INI | apache-2.0 | smartnews/presto,losipiuk/presto,ebyhr/presto,ebyhr/presto,dain/presto,11xor6/presto,smartnews/presto,11xor6/presto,erichwang/presto,ebyhr/presto,electrum/presto,ebyhr/presto,smartnews/presto,electrum/presto,ebyhr/presto,erichwang/presto,smartnews/presto,Praveen2112/presto,dain/presto,Praveen2112/presto,electrum/presto,erichwang/presto,losipiuk/presto,11xor6/presto,losipiuk/presto,Praveen2112/presto,11xor6/presto,11xor6/presto,electrum/presto,Praveen2112/presto,erichwang/presto,dain/presto,electrum/presto,losipiuk/presto,dain/presto,losipiuk/presto,Praveen2112/presto,dain/presto,smartnews/presto,erichwang/presto | ini | ## Code Before:
connector.name=hive-hadoop2
# Configuration appropriate for Hive as started by product test environment, e.g.
# presto-product-tests-launcher/bin/run-launcher env up --environment singlenode --without-presto
# On Mac, this additionally requires that you add "<your external IP> hadoop-master" to /etc/hosts
hive.metastore.uri=thrift://localhost:9083
hive.hdfs.socks-proxy=localhost:1180
# Fail-fast in development
hive.metastore.thrift.client.max-retry-time=1s
## Instruction:
Allow drop table and others in developer's config
## Code After:
connector.name=hive-hadoop2
# Configuration appropriate for Hive as started by product test environment, e.g.
# presto-product-tests-launcher/bin/run-launcher env up --environment singlenode --without-presto
# On Mac, this additionally requires that you add "<your external IP> hadoop-master" to /etc/hosts
hive.metastore.uri=thrift://localhost:9083
hive.hdfs.socks-proxy=localhost:1180
# Fail-fast in development
hive.metastore.thrift.client.max-retry-time=1s
# Be permissive in development
hive.allow-add-column=true
hive.allow-drop-column=true
hive.allow-drop-table=true
hive.allow-rename-table=true
hive.allow-comment-table=true
hive.allow-comment-column=true
hive.allow-rename-column=true
|
connector.name=hive-hadoop2
# Configuration appropriate for Hive as started by product test environment, e.g.
# presto-product-tests-launcher/bin/run-launcher env up --environment singlenode --without-presto
# On Mac, this additionally requires that you add "<your external IP> hadoop-master" to /etc/hosts
hive.metastore.uri=thrift://localhost:9083
hive.hdfs.socks-proxy=localhost:1180
# Fail-fast in development
hive.metastore.thrift.client.max-retry-time=1s
+
+ # Be permissive in development
+ hive.allow-add-column=true
+ hive.allow-drop-column=true
+ hive.allow-drop-table=true
+ hive.allow-rename-table=true
+ hive.allow-comment-table=true
+ hive.allow-comment-column=true
+ hive.allow-rename-column=true | 9 | 0.818182 | 9 | 0 |
e22381da15686e88121c1d160cf1c4160080d654 | t/tags.t | t/tags.t | use strict;
use Test::Simple tests => 6;
use Font::TTF::OTTags qw( %tttags %ttnames %iso639 readtagsfile);
ok($tttags{'SCRIPT'}{'Cypriot Syllabary'} eq 'cprt', 'tttags{SCRIPT}');
ok($ttnames{'LANGUAGE'}{'AFK '} eq 'Afrikaans', 'ttnames{LANGUAGE}');
ok($ttnames{'LANGUAGE'}{'DHV '} eq 'Dhivehi (deprecated)' && $ttnames{'LANGUAGE'}{'DIV '} eq 'Dhivehi', 'ttnames{LANGUAGE} Dhivehi');
ok($ttnames{'FEATURE'}{'cv01'} eq 'Character Variants 01', 'ttnames{FEATURE}');
ok($iso639{'atv'} eq 'ALT ', 'iso639{atv}');
ok($iso639{'ALT '}->[0] eq 'atv' && $iso639{'ALT '}->[1] eq 'alt', 'iso639{ALT}'); | use strict;
use Test::Simple tests => 6;
use Font::TTF::OTTags qw( %tttags %ttnames %iso639 readtagsfile);
ok($tttags{'SCRIPT'}{'Cypriot Syllabary'} eq 'cprt', 'tttags{SCRIPT}');
ok($ttnames{'LANGUAGE'}{'AFK '} eq 'Afrikaans', 'ttnames{LANGUAGE}');
ok($ttnames{'LANGUAGE'}{'DHV '} eq 'Divehi (Dhivehi, Maldivian) (deprecated)' && $ttnames{'LANGUAGE'}{'DIV '} eq 'Divehi (Dhivehi, Maldivian)', 'ttnames{LANGUAGE} Dhivehi');
ok($ttnames{'FEATURE'}{'cv01'} eq 'Character Variants 01', 'ttnames{FEATURE}');
ok($iso639{'atv'} eq 'ALT ', 'iso639{atv}');
ok($iso639{'ALT '}->[0] eq 'atv' && $iso639{'ALT '}->[1] eq 'alt', 'iso639{ALT}'); | Correct test for changes in OTTags.pm (r1030) | Correct test for changes in OTTags.pm (r1030)
git-svn-id: f75843d1f2d20a890c1b7b8c44655b4ad9095f8f@1034 4dbf3599-f9e3-0310-9a96-835ab5cc0019
| Perl | artistic-2.0 | silnrsi/font-ttf,silnrsi/font-ttf | perl | ## Code Before:
use strict;
use Test::Simple tests => 6;
use Font::TTF::OTTags qw( %tttags %ttnames %iso639 readtagsfile);
ok($tttags{'SCRIPT'}{'Cypriot Syllabary'} eq 'cprt', 'tttags{SCRIPT}');
ok($ttnames{'LANGUAGE'}{'AFK '} eq 'Afrikaans', 'ttnames{LANGUAGE}');
ok($ttnames{'LANGUAGE'}{'DHV '} eq 'Dhivehi (deprecated)' && $ttnames{'LANGUAGE'}{'DIV '} eq 'Dhivehi', 'ttnames{LANGUAGE} Dhivehi');
ok($ttnames{'FEATURE'}{'cv01'} eq 'Character Variants 01', 'ttnames{FEATURE}');
ok($iso639{'atv'} eq 'ALT ', 'iso639{atv}');
ok($iso639{'ALT '}->[0] eq 'atv' && $iso639{'ALT '}->[1] eq 'alt', 'iso639{ALT}');
## Instruction:
Correct test for changes in OTTags.pm (r1030)
git-svn-id: f75843d1f2d20a890c1b7b8c44655b4ad9095f8f@1034 4dbf3599-f9e3-0310-9a96-835ab5cc0019
## Code After:
use strict;
use Test::Simple tests => 6;
use Font::TTF::OTTags qw( %tttags %ttnames %iso639 readtagsfile);
ok($tttags{'SCRIPT'}{'Cypriot Syllabary'} eq 'cprt', 'tttags{SCRIPT}');
ok($ttnames{'LANGUAGE'}{'AFK '} eq 'Afrikaans', 'ttnames{LANGUAGE}');
ok($ttnames{'LANGUAGE'}{'DHV '} eq 'Divehi (Dhivehi, Maldivian) (deprecated)' && $ttnames{'LANGUAGE'}{'DIV '} eq 'Divehi (Dhivehi, Maldivian)', 'ttnames{LANGUAGE} Dhivehi');
ok($ttnames{'FEATURE'}{'cv01'} eq 'Character Variants 01', 'ttnames{FEATURE}');
ok($iso639{'atv'} eq 'ALT ', 'iso639{atv}');
ok($iso639{'ALT '}->[0] eq 'atv' && $iso639{'ALT '}->[1] eq 'alt', 'iso639{ALT}'); | use strict;
use Test::Simple tests => 6;
use Font::TTF::OTTags qw( %tttags %ttnames %iso639 readtagsfile);
ok($tttags{'SCRIPT'}{'Cypriot Syllabary'} eq 'cprt', 'tttags{SCRIPT}');
ok($ttnames{'LANGUAGE'}{'AFK '} eq 'Afrikaans', 'ttnames{LANGUAGE}');
- ok($ttnames{'LANGUAGE'}{'DHV '} eq 'Dhivehi (deprecated)' && $ttnames{'LANGUAGE'}{'DIV '} eq 'Dhivehi', 'ttnames{LANGUAGE} Dhivehi');
+ ok($ttnames{'LANGUAGE'}{'DHV '} eq 'Divehi (Dhivehi, Maldivian) (deprecated)' && $ttnames{'LANGUAGE'}{'DIV '} eq 'Divehi (Dhivehi, Maldivian)', 'ttnames{LANGUAGE} Dhivehi');
? ++++++++ ++++++++++++ ++++++++ ++++++++++++
ok($ttnames{'FEATURE'}{'cv01'} eq 'Character Variants 01', 'ttnames{FEATURE}');
ok($iso639{'atv'} eq 'ALT ', 'iso639{atv}');
ok($iso639{'ALT '}->[0] eq 'atv' && $iso639{'ALT '}->[1] eq 'alt', 'iso639{ALT}'); | 2 | 0.125 | 1 | 1 |
de8f946cbfcc5c43bf70344fb07a64dcc4c0d4f7 | .rultor.yml | .rultor.yml |
readers:
- urn:github:1119686
#docker:
# image: "ubuntu:12.10"
merge: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
- "mvn clean test install"
release: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
- "git commit -am \"${tag}\""
- "mvn clean test site:site install"
|
readers:
- urn:github:1119686
#docker:
# image: "ubuntu:12.10"
merge: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
- "mvn clean test install"
release: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
# - maven version ----
# - "git commit -am \"${tag}\""
- "mvn clean test site:site install"
| Fix git commit without files changed :) | Fix git commit without files changed :)
| YAML | apache-2.0 | LaurentTardif/mavenTraining | yaml | ## Code Before:
readers:
- urn:github:1119686
#docker:
# image: "ubuntu:12.10"
merge: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
- "mvn clean test install"
release: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
- "git commit -am \"${tag}\""
- "mvn clean test site:site install"
## Instruction:
Fix git commit without files changed :)
## Code After:
readers:
- urn:github:1119686
#docker:
# image: "ubuntu:12.10"
merge: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
- "mvn clean test install"
release: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
# - maven version ----
# - "git commit -am \"${tag}\""
- "mvn clean test site:site install"
|
readers:
- urn:github:1119686
#docker:
# image: "ubuntu:12.10"
merge: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
- "mvn clean test install"
release: # or "deploy" or "release"
commanders:
- LaurentTardif
env:
MAVEN_OPTS: "-XX:MaxPermSize=256m -Xmx512m"
script:
+ # - maven version ----
- - "git commit -am \"${tag}\""
+ # - "git commit -am \"${tag}\""
? +
- "mvn clean test site:site install"
| 3 | 0.125 | 2 | 1 |
01c4d9ddf99cde2c3ef48185b9e239580ef9832c | _posts/2015-01-07-gioi-thieu-series-discourse-co-ban.markdown | _posts/2015-01-07-gioi-thieu-series-discourse-co-ban.markdown | ---
layout: post
title: Giới thiệu Series Discourse cơ bản
date: 2015-01-07 08:30:20.000000000 -05:00
---
Discourse là một dự án mã nguồn mở để xây dựng một nền tảng thảo luận online có thể dùng với các mục đích sau :
* Danh sách gửi mail
* Một diễn đàn thảo luận
* Một chat room
Mục tiêu của Discourse là trở thành Wordpress trong thế giới mã nguồn Forum.
## Giới thiệu Series
Trong series này, mình sẽ hướng dẫn các bạn những phần cơ bản nhất để có một Forum chạy Discourse như cài đặt, cấu hình, các plugin cần thiết,...
## Các bài viết có trong Series:
1. [Giới thiệu Discourse](http://khoanguyen.me/gioi-thieu-discourse/)
2. Cài đặt Discourse
3. Cấu hình cần thiết sau khi cài đặt
4. Một số plugin
| ---
layout: post
title: Giới thiệu Series Discourse cơ bản
date: 2015-01-07 08:30:20.000000000 -05:00
tags: discourse
---
Discourse là một dự án mã nguồn mở để xây dựng một nền tảng thảo luận online có thể dùng với các mục đích sau :
* Danh sách gửi mail
* Một diễn đàn thảo luận
* Một chat room
Mục tiêu của Discourse là trở thành Wordpress trong thế giới mã nguồn Forum.
## Giới thiệu Series
Trong series này, mình sẽ hướng dẫn các bạn những phần cơ bản nhất để có một Forum chạy Discourse như cài đặt, cấu hình, các plugin cần thiết,...
## Các bài viết có trong Series:
1. [Giới thiệu Discourse](/gioi-thieu-discourse/)
2. Cài đặt Discourse
3. Cấu hình cần thiết sau khi cài đặt
4. Một số plugin
| Add tags and use relative link | Add tags and use relative link | Markdown | mit | thangngoc89/blog-archive,thangngoc89/blog-archive | markdown | ## Code Before:
---
layout: post
title: Giới thiệu Series Discourse cơ bản
date: 2015-01-07 08:30:20.000000000 -05:00
---
Discourse là một dự án mã nguồn mở để xây dựng một nền tảng thảo luận online có thể dùng với các mục đích sau :
* Danh sách gửi mail
* Một diễn đàn thảo luận
* Một chat room
Mục tiêu của Discourse là trở thành Wordpress trong thế giới mã nguồn Forum.
## Giới thiệu Series
Trong series này, mình sẽ hướng dẫn các bạn những phần cơ bản nhất để có một Forum chạy Discourse như cài đặt, cấu hình, các plugin cần thiết,...
## Các bài viết có trong Series:
1. [Giới thiệu Discourse](http://khoanguyen.me/gioi-thieu-discourse/)
2. Cài đặt Discourse
3. Cấu hình cần thiết sau khi cài đặt
4. Một số plugin
## Instruction:
Add tags and use relative link
## Code After:
---
layout: post
title: Giới thiệu Series Discourse cơ bản
date: 2015-01-07 08:30:20.000000000 -05:00
tags: discourse
---
Discourse là một dự án mã nguồn mở để xây dựng một nền tảng thảo luận online có thể dùng với các mục đích sau :
* Danh sách gửi mail
* Một diễn đàn thảo luận
* Một chat room
Mục tiêu của Discourse là trở thành Wordpress trong thế giới mã nguồn Forum.
## Giới thiệu Series
Trong series này, mình sẽ hướng dẫn các bạn những phần cơ bản nhất để có một Forum chạy Discourse như cài đặt, cấu hình, các plugin cần thiết,...
## Các bài viết có trong Series:
1. [Giới thiệu Discourse](/gioi-thieu-discourse/)
2. Cài đặt Discourse
3. Cấu hình cần thiết sau khi cài đặt
4. Một số plugin
| ---
layout: post
title: Giới thiệu Series Discourse cơ bản
date: 2015-01-07 08:30:20.000000000 -05:00
+ tags: discourse
---
Discourse là một dự án mã nguồn mở để xây dựng một nền tảng thảo luận online có thể dùng với các mục đích sau :
* Danh sách gửi mail
* Một diễn đàn thảo luận
* Một chat room
Mục tiêu của Discourse là trở thành Wordpress trong thế giới mã nguồn Forum.
## Giới thiệu Series
Trong series này, mình sẽ hướng dẫn các bạn những phần cơ bản nhất để có một Forum chạy Discourse như cài đặt, cấu hình, các plugin cần thiết,...
## Các bài viết có trong Series:
- 1. [Giới thiệu Discourse](http://khoanguyen.me/gioi-thieu-discourse/)
? --------------------
+ 1. [Giới thiệu Discourse](/gioi-thieu-discourse/)
2. Cài đặt Discourse
3. Cấu hình cần thiết sau khi cài đặt
4. Một số plugin | 3 | 0.15 | 2 | 1 |
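Besides adding the `tags:` front-matter key, the commit above rewrites an absolute link into a root-relative one so the post survives a domain move. A small sketch of that rewrite using Python's `re` — the function name is illustrative, not part of the blog's tooling:

```python
import re

def relativize_links(markdown, host):
    """Turn absolute links pointing at `host` into root-relative ones,
    e.g. http://khoanguyen.me/gioi-thieu-discourse/ -> /gioi-thieu-discourse/."""
    pattern = r"https?://" + re.escape(host) + r"(/[^)\s]*)"
    return re.sub(pattern, r"\1", markdown)
```

Links to other hosts are left untouched, matching the intent of the diff above.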
9ecf877dcdd92fc03082ec737e25cf1334eedd3f | lib/fastlane/plugin/slack_train/actions/slack_train_action.rb | lib/fastlane/plugin/slack_train/actions/slack_train_action.rb | module Fastlane
module Actions
class SlackTrainAction < Action
def self.run(params)
train_emoji = lane_context[SharedValues::SLACK_TRAIN_EMOJI]
rail_emoji = lane_context[SharedValues::SLACK_TRAIN_RAIL]
total_distance = lane_context[SharedValues::SLACK_TRAIN_DISTANCE]
current_position = lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION]
speed = lane_context[SharedValues::SLACK_TRAIN_DIRECTION]
before = rail_emoji * current_position
after = rail_emoji * (total_distance - current_position - 1)
message = [before, train_emoji, after].join("")
other_action.slack(message: message)
UI.message(message)
lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION] += speed
end
def self.description
"Show a train of the fastlane progress"
end
def self.authors
["@KrauseFx"]
end
def self.return_value
"A string that is being sent to slack"
end
def self.available_options
[]
end
def self.is_supported?(platform)
true
end
end
end
end
| module Fastlane
module Actions
class SlackTrainAction < Action
def self.run(params)
train_emoji = lane_context[SharedValues::SLACK_TRAIN_EMOJI]
rail_emoji = lane_context[SharedValues::SLACK_TRAIN_RAIL]
total_distance = lane_context[SharedValues::SLACK_TRAIN_DISTANCE]
current_position = lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION]
speed = lane_context[SharedValues::SLACK_TRAIN_DIRECTION]
UI.user_error!("train drove too far") if current_position < 0
UI.user_error!("train drove too far") if total_distance == current_position
before = rail_emoji * current_position
after = rail_emoji * (total_distance - current_position - 1)
message = [before, train_emoji, after].join("")
other_action.slack(message: message)
UI.message(message)
lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION] += speed
end
def self.description
"Show a train of the fastlane progress"
end
def self.authors
["@KrauseFx"]
end
def self.return_value
"A string that is being sent to slack"
end
def self.available_options
[]
end
def self.is_supported?(platform)
true
end
end
end
end
| Fix crash for out of bounds | Fix crash for out of bounds
| Ruby | mit | KrauseFx/fastlane-plugin-slack_train | ruby | ## Code Before:
module Fastlane
module Actions
class SlackTrainAction < Action
def self.run(params)
train_emoji = lane_context[SharedValues::SLACK_TRAIN_EMOJI]
rail_emoji = lane_context[SharedValues::SLACK_TRAIN_RAIL]
total_distance = lane_context[SharedValues::SLACK_TRAIN_DISTANCE]
current_position = lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION]
speed = lane_context[SharedValues::SLACK_TRAIN_DIRECTION]
before = rail_emoji * current_position
after = rail_emoji * (total_distance - current_position - 1)
message = [before, train_emoji, after].join("")
other_action.slack(message: message)
UI.message(message)
lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION] += speed
end
def self.description
"Show a train of the fastlane progress"
end
def self.authors
["@KrauseFx"]
end
def self.return_value
"A string that is being sent to slack"
end
def self.available_options
[]
end
def self.is_supported?(platform)
true
end
end
end
end
## Instruction:
Fix crash for out of bounds
## Code After:
module Fastlane
module Actions
class SlackTrainAction < Action
def self.run(params)
train_emoji = lane_context[SharedValues::SLACK_TRAIN_EMOJI]
rail_emoji = lane_context[SharedValues::SLACK_TRAIN_RAIL]
total_distance = lane_context[SharedValues::SLACK_TRAIN_DISTANCE]
current_position = lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION]
speed = lane_context[SharedValues::SLACK_TRAIN_DIRECTION]
UI.user_error!("train drove too far") if current_position < 0
UI.user_error!("train drove too far") if total_distance == current_position
before = rail_emoji * current_position
after = rail_emoji * (total_distance - current_position - 1)
message = [before, train_emoji, after].join("")
other_action.slack(message: message)
UI.message(message)
lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION] += speed
end
def self.description
"Show a train of the fastlane progress"
end
def self.authors
["@KrauseFx"]
end
def self.return_value
"A string that is being sent to slack"
end
def self.available_options
[]
end
def self.is_supported?(platform)
true
end
end
end
end
| module Fastlane
module Actions
class SlackTrainAction < Action
def self.run(params)
train_emoji = lane_context[SharedValues::SLACK_TRAIN_EMOJI]
rail_emoji = lane_context[SharedValues::SLACK_TRAIN_RAIL]
total_distance = lane_context[SharedValues::SLACK_TRAIN_DISTANCE]
current_position = lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION]
speed = lane_context[SharedValues::SLACK_TRAIN_DIRECTION]
+
+ UI.user_error!("train drove too far") if current_position < 0
+ UI.user_error!("train drove too far") if total_distance == current_position
before = rail_emoji * current_position
after = rail_emoji * (total_distance - current_position - 1)
message = [before, train_emoji, after].join("")
other_action.slack(message: message)
UI.message(message)
lane_context[SharedValues::SLACK_TRAIN_CURRENT_TRAIN_POSITION] += speed
end
def self.description
"Show a train of the fastlane progress"
end
def self.authors
["@KrauseFx"]
end
def self.return_value
"A string that is being sent to slack"
end
def self.available_options
[]
end
def self.is_supported?(platform)
true
end
end
end
end | 3 | 0.071429 | 3 | 0 |
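The guard added above stops the train from being rendered past either end of the track before the string arithmetic goes wrong. The same logic in a short Python sketch — names are illustrative, and a slightly stricter `>=` stands in for the Ruby `total_distance == current_position` comparison:

```python
def render_train(position, distance, train="T", rail="-"):
    """Render the progress train, refusing out-of-range positions
    before they corrupt the output string."""
    if position < 0 or position >= distance:
        raise ValueError("train drove too far")
    return rail * position + train + rail * (distance - position - 1)
```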
1263f31f1469518740327c038cb932534493ce5c | install_files/ansible-base/roles/development/tasks/main.yml | install_files/ansible-base/roles/development/tasks/main.yml | ---
- name: update apt package lists
apt:
update_cache: yes
tags:
- apt
- development
# TODO: this is copied from roles/common/tasks/create_users.yml
# A more DRY solution would be better. This task is necessary
# to ensure that the default user exists in DigitalOcean instances,
# which are used for testing. It's also simply a good idea
# to ensure an assumed user account does actually exist, even though
# Vagrant VirtualBox images come with the "vagrant" user already configured.
- name: ensure SecureDrop admin user accounts exist
user:
name: "{{ item }}"
shell: /bin/bash
groups: sudo,ssh
with_items: ssh_users
tags:
- users
- sudoers
- name: install apt package deps for dev environment
apt:
pkg: "{{ item }}"
state: latest
with_items: dev_deps
tags:
- apt
- development
- name: install pip dependencies for securedrop
pip:
requirements: "{{ securedrop_pip_requirements }}"
tags:
- pip
- development
- name: set SECUREDROP_ENV in bashrc
lineinfile:
dest: /home/{{ securedrop_user }}/.bashrc
state: present
insertafter: EOF
line: "export SECUREDROP_ENV=dev"
tags:
- environment
- development
| ---
- name: update apt package lists
apt:
update_cache: yes
tags:
- apt
- development
- name: install apt package deps for dev environment
apt:
pkg: "{{ item }}"
state: latest
with_items: dev_deps
tags:
- apt
- development
- name: install pip dependencies for securedrop
pip:
requirements: "{{ securedrop_pip_requirements }}"
tags:
- pip
- development
- name: set SECUREDROP_ENV in bashrc
lineinfile:
dest: /home/{{ securedrop_user }}/.bashrc
state: present
insertafter: EOF
line: "export SECUREDROP_ENV=dev"
tags:
- environment
- development
| Revert "ensure "ssh_users" account exists in development" | Revert "ensure "ssh_users" account exists in development"
In order to keep the development playbook clean, the user
provisioning specific to DigitalOcean droplets has been
moved into the Vagrantfile.
This reverts commit f179503239aba4c8f7d32dfdddc795ce1093b342.
| YAML | agpl-3.0 | conorsch/securedrop,chadmiller/securedrop,jrosco/securedrop,jaseg/securedrop,ehartsuyker/securedrop,chadmiller/securedrop,jeann2013/securedrop,heartsucker/securedrop,kelcecil/securedrop,garrettr/securedrop,ageis/securedrop,garrettr/securedrop,jaseg/securedrop,chadmiller/securedrop,ehartsuyker/securedrop,pwplus/securedrop,jrosco/securedrop,ageis/securedrop,chadmiller/securedrop,pwplus/securedrop,jaseg/securedrop,jrosco/securedrop,jrosco/securedrop,pwplus/securedrop,jeann2013/securedrop,jeann2013/securedrop,ehartsuyker/securedrop,garrettr/securedrop,conorsch/securedrop,jaseg/securedrop,ehartsuyker/securedrop,jaseg/securedrop,conorsch/securedrop,micahflee/securedrop,pwplus/securedrop,ehartsuyker/securedrop,heartsucker/securedrop,heartsucker/securedrop,pwplus/securedrop,kelcecil/securedrop,kelcecil/securedrop,conorsch/securedrop,ageis/securedrop,micahflee/securedrop,conorsch/securedrop,micahflee/securedrop,ehartsuyker/securedrop,chadmiller/securedrop,garrettr/securedrop,heartsucker/securedrop,jeann2013/securedrop,pwplus/securedrop,ageis/securedrop,chadmiller/securedrop,heartsucker/securedrop,jrosco/securedrop,jeann2013/securedrop,jaseg/securedrop,micahflee/securedrop,jeann2013/securedrop,kelcecil/securedrop,kelcecil/securedrop,jrosco/securedrop,kelcecil/securedrop | yaml | ## Code Before:
---
- name: update apt package lists
apt:
update_cache: yes
tags:
- apt
- development
# TODO: this is copied from roles/common/tasks/create_users.yml
# A more DRY solution would be better. This task is necessary
# to ensure that the default user exists in DigitalOcean instances,
# which are used for testing. It's also simply a good idea
# to ensure an assumed user account does actually exist, even though
# Vagrant VirtualBox images come with the "vagrant" user already configured.
- name: ensure SecureDrop admin user accounts exist
user:
name: "{{ item }}"
shell: /bin/bash
groups: sudo,ssh
with_items: ssh_users
tags:
- users
- sudoers
- name: install apt package deps for dev environment
apt:
pkg: "{{ item }}"
state: latest
with_items: dev_deps
tags:
- apt
- development
- name: install pip dependencies for securedrop
pip:
requirements: "{{ securedrop_pip_requirements }}"
tags:
- pip
- development
- name: set SECUREDROP_ENV in bashrc
lineinfile:
dest: /home/{{ securedrop_user }}/.bashrc
state: present
insertafter: EOF
line: "export SECUREDROP_ENV=dev"
tags:
- environment
- development
## Instruction:
Revert "ensure "ssh_users" account exists in development"
In order to keep the development playbook clean, the user
provisioning specific to DigitalOcean droplets has been
moved into the Vagrantfile.
This reverts commit f179503239aba4c8f7d32dfdddc795ce1093b342.
## Code After:
---
- name: update apt package lists
apt:
update_cache: yes
tags:
- apt
- development
- name: install apt package deps for dev environment
apt:
pkg: "{{ item }}"
state: latest
with_items: dev_deps
tags:
- apt
- development
- name: install pip dependencies for securedrop
pip:
requirements: "{{ securedrop_pip_requirements }}"
tags:
- pip
- development
- name: set SECUREDROP_ENV in bashrc
lineinfile:
dest: /home/{{ securedrop_user }}/.bashrc
state: present
insertafter: EOF
line: "export SECUREDROP_ENV=dev"
tags:
- environment
- development
| ---
- name: update apt package lists
apt:
update_cache: yes
tags:
- apt
- development
-
- # TODO: this is copied from roles/common/tasks/create_users.yml
- # A more DRY solution would be better. This task is necessary
- # to ensure that the default user exists in DigitalOcean instances,
- # which are used for testing. It's also simply a good idea
- # to ensure an assumed user account does actually exist, even though
- # Vagrant VirtualBox images come with the "vagrant" user already configured.
- - name: ensure SecureDrop admin user accounts exist
- user:
- name: "{{ item }}"
- shell: /bin/bash
- groups: sudo,ssh
- with_items: ssh_users
- tags:
- - users
- - sudoers
- name: install apt package deps for dev environment
apt:
pkg: "{{ item }}"
state: latest
with_items: dev_deps
tags:
- apt
- development
- name: install pip dependencies for securedrop
pip:
requirements: "{{ securedrop_pip_requirements }}"
tags:
- pip
- development
- name: set SECUREDROP_ENV in bashrc
lineinfile:
dest: /home/{{ securedrop_user }}/.bashrc
state: present
insertafter: EOF
line: "export SECUREDROP_ENV=dev"
tags:
- environment
- development | 16 | 0.326531 | 0 | 16 |
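The task removed above relied on Ansible's `user` module being idempotent: declaring the account on every run is safe because it is only created when absent. A toy Python model of that create-if-absent behaviour — the dict-based store is purely illustrative, not how Ansible works internally:

```python
def ensure_users(accounts, names, groups=("sudo", "ssh")):
    """Create each account only if it does not exist yet; running the
    function twice leaves the store unchanged (idempotent)."""
    for name in names:
        accounts.setdefault(name, {"shell": "/bin/bash", "groups": list(groups)})
    return accounts
```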
c6c3b022aaa56aaa783472b912481ad7943e5a4d | XLib/CMakeLists.txt | XLib/CMakeLists.txt | project (XLib VERSION 0.0.0.0)
configure_file ("${PROJECT_SOURCE_DIR}/Version.h.in" "${PROJECT_BINARY_DIR}/Generated/Version.h")
include_directories("${PROJECT_BINARY_DIR}/Generated")
add_library(XLib ${PROJ_LIB_TYPES} dllmain.cpp)
message(STATUS "building XLib libray as: ${PROJ_LIB_TYPES}") | project (XLib VERSION 0.0.0.0)
configure_file ("${PROJECT_SOURCE_DIR}/Version.h.in" "${PROJECT_BINARY_DIR}/Generated/Version.h")
include_directories("${PROJECT_BINARY_DIR}/Generated")
file(GLOB_RECURSE CPP_FILES "${CMAKE_CURRENT_SOURCE_DIR}" "*.cpp")
file(GLOB_RECURSE H_FILES "${CMAKE_CURRENT_SOURCE_DIR}" "*.h")
source_group("Code" FILES ${CPP_FILES} ${H_FILES})
add_library(XLib ${PROJ_LIB_TYPES} ${CPP_FILES} ${H_FILES})
message(STATUS "building XLib libray as: ${PROJ_LIB_TYPES}") | Fix source code finding for XLib | Fix source code finding for XLib
Automatically find files when they're added.
| Text | apache-2.0 | Feoggou/zLib | text | ## Code Before:
project (XLib VERSION 0.0.0.0)
configure_file ("${PROJECT_SOURCE_DIR}/Version.h.in" "${PROJECT_BINARY_DIR}/Generated/Version.h")
include_directories("${PROJECT_BINARY_DIR}/Generated")
add_library(XLib ${PROJ_LIB_TYPES} dllmain.cpp)
message(STATUS "building XLib libray as: ${PROJ_LIB_TYPES}")
## Instruction:
Fix source code finding for XLib
Automatically find files when they're added.
## Code After:
project (XLib VERSION 0.0.0.0)
configure_file ("${PROJECT_SOURCE_DIR}/Version.h.in" "${PROJECT_BINARY_DIR}/Generated/Version.h")
include_directories("${PROJECT_BINARY_DIR}/Generated")
file(GLOB_RECURSE CPP_FILES "${CMAKE_CURRENT_SOURCE_DIR}" "*.cpp")
file(GLOB_RECURSE H_FILES "${CMAKE_CURRENT_SOURCE_DIR}" "*.h")
source_group("Code" FILES ${CPP_FILES} ${H_FILES})
add_library(XLib ${PROJ_LIB_TYPES} ${CPP_FILES} ${H_FILES})
message(STATUS "building XLib libray as: ${PROJ_LIB_TYPES}") | project (XLib VERSION 0.0.0.0)
configure_file ("${PROJECT_SOURCE_DIR}/Version.h.in" "${PROJECT_BINARY_DIR}/Generated/Version.h")
include_directories("${PROJECT_BINARY_DIR}/Generated")
- add_library(XLib ${PROJ_LIB_TYPES} dllmain.cpp)
+ file(GLOB_RECURSE CPP_FILES "${CMAKE_CURRENT_SOURCE_DIR}" "*.cpp")
+ file(GLOB_RECURSE H_FILES "${CMAKE_CURRENT_SOURCE_DIR}" "*.h")
+ source_group("Code" FILES ${CPP_FILES} ${H_FILES})
+
+ add_library(XLib ${PROJ_LIB_TYPES} ${CPP_FILES} ${H_FILES})
message(STATUS "building XLib libray as: ${PROJ_LIB_TYPES}") | 6 | 0.857143 | 5 | 1 |
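`file(GLOB_RECURSE ...)` collects matching paths once, at configure time — CMake's documentation cautions that a newly added file is still not compiled until CMake re-runs, which is why globbing sources is sometimes discouraged. The collection itself is just a recursive glob; a Python sketch of the equivalent lookup (function name is illustrative):

```python
from pathlib import Path

def find_sources(root):
    """Recursively collect *.cpp then *.h files, sorted for stability --
    roughly what the GLOB_RECURSE calls above hand to add_library."""
    root = Path(root)
    return sorted(str(p) for p in root.rglob("*.cpp")) + \
           sorted(str(p) for p in root.rglob("*.h"))
```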
1f80fe9831734355e4e0f51396568f590d483b44 | README.md | README.md | Aldryn FAQ App
===============
Simple faq application. It allows you to:
- write a questions and answers
Installation
============
Aldryn Platrofm Users
---------------------
Choose a site you want to install the add-on to from the dashboard. Then go to ``Apps -> Install app`` and click ``Install`` next to ``FAQ`` app.
Redeploy the site.
Manuall Installation
--------------------
Run ``pip install aldryn-faq``.
Add below apps to ``INSTALLED_APPS``:
INSTALLED_APPS = [
…
'aldryn_faq',
'djangocms_text_ckeditor',
'adminsortable',
'sortedm2m',
'hvad',
…
]
Listing
=======
You can add question/answer in the admin interface now. Search for the label ``Aldryn_Faq``.
In order to display them, create a CMS page and install the app there (choose ``FAQ`` from the ``Advanced Settings -> Application`` dropdown).
Now redeploy/restart the site again.
The above CMS site has become a faq list view.
| [](https://travis-ci.org/mkoistinen/aldryn-faq/)
Aldryn FAQ App
===============
Simple faq application. It allows you to:
- write a questions and answers
Installation
============
Aldryn Platrofm Users
---------------------
Choose a site you want to install the add-on to from the dashboard. Then go to ``Apps -> Install app`` and click ``Install`` next to ``FAQ`` app.
Redeploy the site.
Manuall Installation
--------------------
Run ``pip install aldryn-faq``.
Add below apps to ``INSTALLED_APPS``:
INSTALLED_APPS = [
…
'aldryn_faq',
'djangocms_text_ckeditor',
'adminsortable',
'sortedm2m',
'hvad',
…
]
Listing
=======
You can add question/answer in the admin interface now. Search for the label ``Aldryn_Faq``.
In order to display them, create a CMS page and install the app there (choose ``FAQ`` from the ``Advanced Settings -> Application`` dropdown).
Now redeploy/restart the site again.
The above CMS site has become a faq list view.
| Add Travis CI status badge | Add Travis CI status badge
| Markdown | bsd-3-clause | czpython/aldryn-faq,czpython/aldryn-faq,czpython/aldryn-faq,czpython/aldryn-faq | markdown | ## Code Before:
Aldryn FAQ App
===============
Simple faq application. It allows you to:
- write a questions and answers
Installation
============
Aldryn Platrofm Users
---------------------
Choose a site you want to install the add-on to from the dashboard. Then go to ``Apps -> Install app`` and click ``Install`` next to ``FAQ`` app.
Redeploy the site.
Manuall Installation
--------------------
Run ``pip install aldryn-faq``.
Add below apps to ``INSTALLED_APPS``:
INSTALLED_APPS = [
…
'aldryn_faq',
'djangocms_text_ckeditor',
'adminsortable',
'sortedm2m',
'hvad',
…
]
Listing
=======
You can add question/answer in the admin interface now. Search for the label ``Aldryn_Faq``.
In order to display them, create a CMS page and install the app there (choose ``FAQ`` from the ``Advanced Settings -> Application`` dropdown).
Now redeploy/restart the site again.
The above CMS site has become a faq list view.
## Instruction:
Add Travis CI status badge
## Code After:
[](https://travis-ci.org/mkoistinen/aldryn-faq/)
Aldryn FAQ App
===============
Simple faq application. It allows you to:
- write a questions and answers
Installation
============
Aldryn Platrofm Users
---------------------
Choose a site you want to install the add-on to from the dashboard. Then go to ``Apps -> Install app`` and click ``Install`` next to ``FAQ`` app.
Redeploy the site.
Manuall Installation
--------------------
Run ``pip install aldryn-faq``.
Add below apps to ``INSTALLED_APPS``:
INSTALLED_APPS = [
…
'aldryn_faq',
'djangocms_text_ckeditor',
'adminsortable',
'sortedm2m',
'hvad',
…
]
Listing
=======
You can add question/answer in the admin interface now. Search for the label ``Aldryn_Faq``.
In order to display them, create a CMS page and install the app there (choose ``FAQ`` from the ``Advanced Settings -> Application`` dropdown).
Now redeploy/restart the site again.
The above CMS site has become a faq list view.
| + [](https://travis-ci.org/mkoistinen/aldryn-faq/)
+
Aldryn FAQ App
===============
Simple faq application. It allows you to:
- write a questions and answers
Installation
============
Aldryn Platrofm Users
---------------------
Choose a site you want to install the add-on to from the dashboard. Then go to ``Apps -> Install app`` and click ``Install`` next to ``FAQ`` app.
Redeploy the site.
Manuall Installation
--------------------
Run ``pip install aldryn-faq``.
Add below apps to ``INSTALLED_APPS``:
INSTALLED_APPS = [
…
'aldryn_faq',
'djangocms_text_ckeditor',
'adminsortable',
'sortedm2m',
'hvad',
…
]
Listing
=======
You can add question/answer in the admin interface now. Search for the label ``Aldryn_Faq``.
In order to display them, create a CMS page and install the app there (choose ``FAQ`` from the ``Advanced Settings -> Application`` dropdown).
Now redeploy/restart the site again.
The above CMS site has become a faq list view.
| 2 | 0.042553 | 2 | 0 |
00cf98727720037142e760ad59d677d09f327147 | lib/index.js | lib/index.js | var fs = require('fs');
var storage = [];
module.exports = {
init: (config, callback) => {
var directories = fs.readdirSync('.');
if (directories.indexOf(config.directory) === -1) {
fs.mkdirSync(config.directory);
config.documents.forEach(name => {
fs.writeFileSync(`${config.directory}/${name}.db`, '{}');
});
return 'done';
}
config.documents.forEach(name => {
var file = fs.readFileSync(`${config.directory}/${name}.db`);
if (file) {
storage[name] = new Map();
file = JSON.parse(file);
Object.keys(file).forEach((key) => {
storage[name].set(key, file[key]);
});
} else {
fs.writeFileSync(`${config.directory}/${name}.db`, '{}');
}
});
return 'test';
},
get: (document, key) => {
return storage[document].get(key);
}
};
| var fs = require('fs');
var storage = [];
module.exports = {
init: (config, callback) => {
var directories = fs.readdirSync('.');
if (directories.indexOf(config.directory) === -1) {
fs.mkdirSync(config.directory);
config.documents.forEach(name => {
fs.writeFileSync(`${config.directory}/${name}.db`, '');
storage[name] = new Map();
});
return true;
}
config.documents.forEach(name => {
if (!fs.existsSync(`${config.directory}/${name}.db`)) {
fs.writeFileSync(`${config.directory}/${name}.db`, '');
}
var file = fs.readFileSync(`${config.directory}/${name}.db`);
storage[name] = new Map(file);
});
setInterval(() => {
//console.log('----');
//console.log(storage);
storage.forEach((document) => {
console.log('document');
// fs.writeFileSync(`${config.directory}/${}.db`);
});
}, 1000);
return true;
},
get: (document, key) => {
return storage[document].get(key);
},
set: (document, key, value) => {
storage[document].set(key, value);
return true;
}
};
| Fix crash on start when documents doesn't exists | Fix crash on start when documents doesn't exists
| JavaScript | mit | ItsJimi/store-data | javascript | ## Code Before:
var fs = require('fs');
var storage = [];
module.exports = {
init: (config, callback) => {
var directories = fs.readdirSync('.');
if (directories.indexOf(config.directory) === -1) {
fs.mkdirSync(config.directory);
config.documents.forEach(name => {
fs.writeFileSync(`${config.directory}/${name}.db`, '{}');
});
return 'done';
}
config.documents.forEach(name => {
var file = fs.readFileSync(`${config.directory}/${name}.db`);
if (file) {
storage[name] = new Map();
file = JSON.parse(file);
Object.keys(file).forEach((key) => {
storage[name].set(key, file[key]);
});
} else {
fs.writeFileSync(`${config.directory}/${name}.db`, '{}');
}
});
return 'test';
},
get: (document, key) => {
return storage[document].get(key);
}
};
## Instruction:
Fix crash on start when documents doesn't exists
## Code After:
var fs = require('fs');
var storage = [];
module.exports = {
init: (config, callback) => {
var directories = fs.readdirSync('.');
if (directories.indexOf(config.directory) === -1) {
fs.mkdirSync(config.directory);
config.documents.forEach(name => {
fs.writeFileSync(`${config.directory}/${name}.db`, '');
storage[name] = new Map();
});
return true;
}
config.documents.forEach(name => {
if (!fs.existsSync(`${config.directory}/${name}.db`)) {
fs.writeFileSync(`${config.directory}/${name}.db`, '');
}
var file = fs.readFileSync(`${config.directory}/${name}.db`);
storage[name] = new Map(file);
});
setInterval(() => {
//console.log('----');
//console.log(storage);
storage.forEach((document) => {
console.log('document');
// fs.writeFileSync(`${config.directory}/${}.db`);
});
}, 1000);
return true;
},
get: (document, key) => {
return storage[document].get(key);
},
set: (document, key, value) => {
storage[document].set(key, value);
return true;
}
};
| var fs = require('fs');
var storage = [];
module.exports = {
init: (config, callback) => {
var directories = fs.readdirSync('.');
if (directories.indexOf(config.directory) === -1) {
fs.mkdirSync(config.directory);
config.documents.forEach(name => {
- fs.writeFileSync(`${config.directory}/${name}.db`, '{}');
? --
+ fs.writeFileSync(`${config.directory}/${name}.db`, '');
+ storage[name] = new Map();
});
- return 'done';
? ^^^^ -
+ return true;
? ^^^
}
config.documents.forEach(name => {
- var file = fs.readFileSync(`${config.directory}/${name}.db`);
? ^^^ ^^^^^^^ - ^^^ ^^ ^
+ if (!fs.existsSync(`${config.directory}/${name}.db`)) {
? ^^ ^^ ^ ^^^ ^^^
- if (file) {
- storage[name] = new Map();
- file = JSON.parse(file);
- Object.keys(file).forEach((key) => {
- storage[name].set(key, file[key]);
- });
- } else {
- fs.writeFileSync(`${config.directory}/${name}.db`, '{}');
? --
+ fs.writeFileSync(`${config.directory}/${name}.db`, '');
}
+ var file = fs.readFileSync(`${config.directory}/${name}.db`);
+ storage[name] = new Map(file);
});
+ setInterval(() => {
+ //console.log('----');
+ //console.log(storage);
+ storage.forEach((document) => {
+ console.log('document');
+ // fs.writeFileSync(`${config.directory}/${}.db`);
+ });
+ }, 1000);
- return 'test';
? - ---
+ return true;
? ++
},
get: (document, key) => {
return storage[document].get(key);
+ },
+ set: (document, key, value) => {
+ storage[document].set(key, value);
+ return true;
}
}; | 32 | 1 | 20 | 12 |
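The fix above boils down to a create-if-missing guard before reading each document file (the rewritten JS still builds `new Map(...)` directly from a raw buffer, which only works while the file is empty). The same load path as a Python sketch with a JSON-backed dict — names are illustrative:

```python
import json
import os

def load_document(path):
    """Create the backing file on first use, then load it into a dict --
    the existsSync guard from the commit above, minus the startup crash."""
    if not os.path.exists(path):
        with open(path, "w") as fh:
            json.dump({}, fh)  # seed an empty store
    with open(path) as fh:
        return json.load(fh)
```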
c6e4e0262b52262c5304f84b31d5073a7e8e8cdd | src/api-umbrella/admin-ui/app/helpers/html-safe.js | src/api-umbrella/admin-ui/app/helpers/html-safe.js | import { helper } from '@ember/component/helper';
export function htmlSafe(params) {
let value = params[0];
return new htmlSafe(value);
}
export default helper(htmlSafe);
| import { helper } from '@ember/component/helper';
import { htmlSafe } from '@ember/string';
export function htmlSafeHelper(params) {
let value = params[0];
return new htmlSafe(value);
}
export default helper(htmlSafeHelper);
| Fix htmlSafe helper after ember upgrades. | Fix htmlSafe helper after ember upgrades.
| JavaScript | mit | NREL/api-umbrella,apinf/api-umbrella,NREL/api-umbrella,NREL/api-umbrella,NREL/api-umbrella,apinf/api-umbrella,apinf/api-umbrella,apinf/api-umbrella,apinf/api-umbrella | javascript | ## Code Before:
import { helper } from '@ember/component/helper';
export function htmlSafe(params) {
let value = params[0];
return new htmlSafe(value);
}
export default helper(htmlSafe);
## Instruction:
Fix htmlSafe helper after ember upgrades.
## Code After:
import { helper } from '@ember/component/helper';
import { htmlSafe } from '@ember/string';
export function htmlSafeHelper(params) {
let value = params[0];
return new htmlSafe(value);
}
export default helper(htmlSafeHelper);
| import { helper } from '@ember/component/helper';
+ import { htmlSafe } from '@ember/string';
- export function htmlSafe(params) {
+ export function htmlSafeHelper(params) {
? ++++++
let value = params[0];
return new htmlSafe(value);
}
- export default helper(htmlSafe);
+ export default helper(htmlSafeHelper);
? ++++++
| 5 | 0.625 | 3 | 2 |
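The original helper shadowed the `htmlSafe` it meant to wrap, so `new htmlSafe(value)` re-entered the helper itself; the fix imports the real function and renames the local one. The same shadowing trap, and the aliased-import cure, shown in Python with the stdlib's `html.escape` standing in for Ember's `htmlSafe`:

```python
from html import escape as html_escape  # alias keeps the real function reachable

def escape(text):
    """Wrapper helper. If the import above were `from html import escape`,
    this def would shadow it and the call below would recurse forever --
    the same bug the commit fixes by renaming the Ember helper."""
    return html_escape(str(text))
```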
c604e4167dbfca0246fb73f8842f5eaf601e3da8 | test/controllers/graphql_controller_test.rb | test/controllers/graphql_controller_test.rb | require 'test_helper'
class GraphqlControllerTest < ActionDispatch::IntegrationTest
let(:user_con_profile) { create :user_con_profile }
let(:convention) { user_con_profile.convention }
setup do
set_convention convention
sign_in user_con_profile.user
end
test 'should return my profile' do
query = <<~GRAPHQL
query {
myProfile {
__typename
id
name
}
}
GRAPHQL
post graphql_url, params: { 'query' => query }
assert_response :success
json = JSON.parse(response.body)
refute json['errors'].present?, json['errors'].to_s
assert_equal 'UserConProfile', json['data']['myProfile']['__typename']
assert_equal user_con_profile.id, json['data']['myProfile']['id']
assert_equal user_con_profile.name, json['data']['myProfile']['name']
end
end
| require 'test_helper'
class GraphqlControllerTest < ActionDispatch::IntegrationTest
let(:user_con_profile) { create :user_con_profile }
let(:convention) { user_con_profile.convention }
setup do
set_convention convention
sign_in user_con_profile.user
end
test 'should return my profile' do
query = <<~GRAPHQL
query {
convention: conventionByRequestHost {
my_profile {
__typename
id
name
}
}
}
GRAPHQL
post graphql_url, params: { 'query' => query }
assert_response :success
json = JSON.parse(response.body)
refute json['errors'].present?, json['errors'].to_s
assert_equal 'UserConProfile', json['data']['convention']['my_profile']['__typename']
assert_equal user_con_profile.id.to_s, json['data']['convention']['my_profile']['id']
assert_equal user_con_profile.name, json['data']['convention']['my_profile']['name']
end
end
| Fix more deprecated GraphQL queries in tests | Fix more deprecated GraphQL queries in tests
| Ruby | mit | neinteractiveliterature/intercode,neinteractiveliterature/intercode,neinteractiveliterature/intercode,neinteractiveliterature/intercode,neinteractiveliterature/intercode | ruby | ## Code Before:
require 'test_helper'
class GraphqlControllerTest < ActionDispatch::IntegrationTest
let(:user_con_profile) { create :user_con_profile }
let(:convention) { user_con_profile.convention }
setup do
set_convention convention
sign_in user_con_profile.user
end
test 'should return my profile' do
query = <<~GRAPHQL
query {
myProfile {
__typename
id
name
}
}
GRAPHQL
post graphql_url, params: { 'query' => query }
assert_response :success
json = JSON.parse(response.body)
refute json['errors'].present?, json['errors'].to_s
assert_equal 'UserConProfile', json['data']['myProfile']['__typename']
assert_equal user_con_profile.id, json['data']['myProfile']['id']
assert_equal user_con_profile.name, json['data']['myProfile']['name']
end
end
## Instruction:
Fix more deprecated GraphQL queries in tests
## Code After:
require 'test_helper'
class GraphqlControllerTest < ActionDispatch::IntegrationTest
let(:user_con_profile) { create :user_con_profile }
let(:convention) { user_con_profile.convention }
setup do
set_convention convention
sign_in user_con_profile.user
end
test 'should return my profile' do
query = <<~GRAPHQL
query {
convention: conventionByRequestHost {
my_profile {
__typename
id
name
}
}
}
GRAPHQL
post graphql_url, params: { 'query' => query }
assert_response :success
json = JSON.parse(response.body)
refute json['errors'].present?, json['errors'].to_s
assert_equal 'UserConProfile', json['data']['convention']['my_profile']['__typename']
assert_equal user_con_profile.id.to_s, json['data']['convention']['my_profile']['id']
assert_equal user_con_profile.name, json['data']['convention']['my_profile']['name']
end
end
| require 'test_helper'
class GraphqlControllerTest < ActionDispatch::IntegrationTest
let(:user_con_profile) { create :user_con_profile }
let(:convention) { user_con_profile.convention }
setup do
set_convention convention
sign_in user_con_profile.user
end
test 'should return my profile' do
query = <<~GRAPHQL
query {
+ convention: conventionByRequestHost {
- myProfile {
? ^
+ my_profile {
? ++ ^^
- __typename
+ __typename
? ++
- id
+ id
? ++
- name
+ name
? ++
+ }
}
}
GRAPHQL
post graphql_url, params: { 'query' => query }
assert_response :success
json = JSON.parse(response.body)
refute json['errors'].present?, json['errors'].to_s
- assert_equal 'UserConProfile', json['data']['myProfile']['__typename']
? ^
+ assert_equal 'UserConProfile', json['data']['convention']['my_profile']['__typename']
? ++++++++++++++ ^^
- assert_equal user_con_profile.id, json['data']['myProfile']['id']
? ^
+ assert_equal user_con_profile.id.to_s, json['data']['convention']['my_profile']['id']
? +++++ ++++++++++++++ ^^
- assert_equal user_con_profile.name, json['data']['myProfile']['name']
? ^
+ assert_equal user_con_profile.name, json['data']['convention']['my_profile']['name']
? ++++++++++++++ ^^
end
end | 16 | 0.5 | 9 | 7 |
732de8e5106b62dd765bc6c537537507615c6391 | lib/domgen/imit/templates/graph_enum.java.erb | lib/domgen/imit/templates/graph_enum.java.erb | /* DO NOT EDIT: File is auto-generated */
package <%= repository.imit.entity_package %>;
@javax.annotation.Generated( "Domgen" )
public enum <%= repository.imit.graph_enum_name %>
{
<%= repository.imit.graphs.collect {|g| Domgen::Naming.uppercase_constantize(g.name) } .join(",\n ") %>
}
| /* DO NOT EDIT: File is auto-generated */
package <%= repository.imit.entity_package %>;
@javax.annotation.Generated( "Domgen" )
public enum <%= repository.imit.graph_enum_name %>
{
<%= repository.imit.graphs.collect do |g|
elements = g.instance_root? ? g.reachable_entities.sort : g.type_roots
data = <<-DATA
/**
* #{g.name} is a #{g.instance_root? ? 'instance' : 'type'}-based replication graph.
* <p>It includes the following entities:</p>
* <ul>
DATA
data += elements.collect{|e| " * <li>#{e}</li>"}.join("\n")
data += "\n * </ul>\n */\n #{Domgen::Naming.uppercase_constantize(g.name)}"
end.join(",\n") %>
}
| Document the set of entities in each replication graph | Document the set of entities in each replication graph
| HTML+ERB | apache-2.0 | realityforge/domgen,icaughley/domgen,icaughley/domgen,realityforge/domgen | html+erb | ## Code Before:
/* DO NOT EDIT: File is auto-generated */
package <%= repository.imit.entity_package %>;
@javax.annotation.Generated( "Domgen" )
public enum <%= repository.imit.graph_enum_name %>
{
<%= repository.imit.graphs.collect {|g| Domgen::Naming.uppercase_constantize(g.name) } .join(",\n ") %>
}
## Instruction:
Document the set of entities in each replication graph
## Code After:
/* DO NOT EDIT: File is auto-generated */
package <%= repository.imit.entity_package %>;
@javax.annotation.Generated( "Domgen" )
public enum <%= repository.imit.graph_enum_name %>
{
<%= repository.imit.graphs.collect do |g|
elements = g.instance_root? ? g.reachable_entities.sort : g.type_roots
data = <<-DATA
/**
* #{g.name} is a #{g.instance_root? ? 'instance' : 'type'}-based replication graph.
* <p>It includes the following entities:</p>
* <ul>
DATA
data += elements.collect{|e| " * <li>#{e}</li>"}.join("\n")
data += "\n * </ul>\n */\n #{Domgen::Naming.uppercase_constantize(g.name)}"
end.join(",\n") %>
}
| /* DO NOT EDIT: File is auto-generated */
package <%= repository.imit.entity_package %>;
@javax.annotation.Generated( "Domgen" )
public enum <%= repository.imit.graph_enum_name %>
{
- <%= repository.imit.graphs.collect {|g| Domgen::Naming.uppercase_constantize(g.name) } .join(",\n ") %>
+ <%= repository.imit.graphs.collect do |g|
+ elements = g.instance_root? ? g.reachable_entities.sort : g.type_roots
+ data = <<-DATA
+ /**
+ * #{g.name} is a #{g.instance_root? ? 'instance' : 'type'}-based replication graph.
+ * <p>It includes the following entities:</p>
+ * <ul>
+ DATA
+ data += elements.collect{|e| " * <li>#{e}</li>"}.join("\n")
+ data += "\n * </ul>\n */\n #{Domgen::Naming.uppercase_constantize(g.name)}"
+ end.join(",\n") %>
} | 12 | 1.5 | 11 | 1 |
0fd593290ebfd9a90c4d94dc54b7ad61958f12dc | .travis.yml | .travis.yml | sudo: required
matrix:
include:
- os: linux
dist: jessie
# env:
# - ARCH_TRAVIS_VERBOSE=1
arch:
repos:
- xyne-any=http://xyne.archlinux.ca/repos/xyne
- papyros=http://dash.papyros.io/repos/$repo/$arch
packages:
- base
- reflector
# aur
#- archey3-git
# papyros
- papyros-shell
- python2-pip
script:
#- "sudo archey3"
- sudo pip2 install ansible
- "reflector --verbose -l 20 --sort rate -p http"
- echo localhost > inventory
- sudo ansible-galaxy install --force -r requirements.yml -p roles
# Check syntax
- ansible-playbook --syntax-check -i inventory test.yml
# First run
- ansible-playbook -i inventory test.yml --connection=local --sudo -v
# Second run Idempotence test
- >
ansible-playbook -i inventory test.yml --connection=local --sudo
| grep -q 'failed=0'
&& (echo 'Idempotence test: pass' && exit 0)
|| (echo 'Idempotence test: fail' && exit 1)
# Further checks:
- yaourt --help || exit 1
script:
- "bash .travis/arch-travis.sh"
| sudo: required
matrix:
include:
- os: linux
dist: jessie
# env:
# - ARCH_TRAVIS_VERBOSE=1
arch:
repos:
- xyne-any=http://xyne.archlinux.ca/repos/xyne
- papyros=http://dash.papyros.io/repos/$repo/$arch
packages:
- base
- reflector
# aur
#- archey3-git
# papyros
- papyros-shell
- python2-pip
script:
#- "sudo archey3"
- rm /dev/shm
- mkdir /dev/shm
- chmod 777 /dev/shm
- sudo pip2 install ansible
- "reflector --verbose -l 20 --sort rate -p http"
- echo localhost > inventory
- sudo ansible-galaxy install --force -r requirements.yml -p roles
# Check syntax
- ansible-playbook --syntax-check -i inventory test.yml
# First run
- ansible-playbook -i inventory test.yml --connection=local --sudo -v
# Second run Idempotence test
- >
ansible-playbook -i inventory test.yml --connection=local --sudo
| grep -q 'failed=0'
&& (echo 'Idempotence test: pass' && exit 0)
|| (echo 'Idempotence test: fail' && exit 1)
# Further checks:
- yaourt --help || exit 1
script:
- "bash .travis/arch-travis.sh"
| Add OS fix from github | Add OS fix from github
| YAML | bsd-2-clause | Haevas/Haevas.yaourt | yaml | ## Code Before:
sudo: required
matrix:
include:
- os: linux
dist: jessie
# env:
# - ARCH_TRAVIS_VERBOSE=1
arch:
repos:
- xyne-any=http://xyne.archlinux.ca/repos/xyne
- papyros=http://dash.papyros.io/repos/$repo/$arch
packages:
- base
- reflector
# aur
#- archey3-git
# papyros
- papyros-shell
- python2-pip
script:
#- "sudo archey3"
- sudo pip2 install ansible
- "reflector --verbose -l 20 --sort rate -p http"
- echo localhost > inventory
- sudo ansible-galaxy install --force -r requirements.yml -p roles
# Check syntax
- ansible-playbook --syntax-check -i inventory test.yml
# First run
- ansible-playbook -i inventory test.yml --connection=local --sudo -v
# Second run Idempotence test
- >
ansible-playbook -i inventory test.yml --connection=local --sudo
| grep -q 'failed=0'
&& (echo 'Idempotence test: pass' && exit 0)
|| (echo 'Idempotence test: fail' && exit 1)
# Further checks:
- yaourt --help || exit 1
script:
- "bash .travis/arch-travis.sh"
## Instruction:
Add OS fix from github
## Code After:
sudo: required
matrix:
include:
- os: linux
dist: jessie
# env:
# - ARCH_TRAVIS_VERBOSE=1
arch:
repos:
- xyne-any=http://xyne.archlinux.ca/repos/xyne
- papyros=http://dash.papyros.io/repos/$repo/$arch
packages:
- base
- reflector
# aur
#- archey3-git
# papyros
- papyros-shell
- python2-pip
script:
#- "sudo archey3"
- rm /dev/shm
- mkdir /dev/shm
- chmod 777 /dev/shm
- sudo pip2 install ansible
- "reflector --verbose -l 20 --sort rate -p http"
- echo localhost > inventory
- sudo ansible-galaxy install --force -r requirements.yml -p roles
# Check syntax
- ansible-playbook --syntax-check -i inventory test.yml
# First run
- ansible-playbook -i inventory test.yml --connection=local --sudo -v
# Second run Idempotence test
- >
ansible-playbook -i inventory test.yml --connection=local --sudo
| grep -q 'failed=0'
&& (echo 'Idempotence test: pass' && exit 0)
|| (echo 'Idempotence test: fail' && exit 1)
# Further checks:
- yaourt --help || exit 1
script:
- "bash .travis/arch-travis.sh"
| sudo: required
matrix:
include:
- os: linux
dist: jessie
# env:
# - ARCH_TRAVIS_VERBOSE=1
arch:
repos:
- xyne-any=http://xyne.archlinux.ca/repos/xyne
- papyros=http://dash.papyros.io/repos/$repo/$arch
packages:
- base
- reflector
# aur
#- archey3-git
# papyros
- papyros-shell
- python2-pip
script:
#- "sudo archey3"
+ - rm /dev/shm
+ - mkdir /dev/shm
+ - chmod 777 /dev/shm
- sudo pip2 install ansible
- "reflector --verbose -l 20 --sort rate -p http"
- echo localhost > inventory
- sudo ansible-galaxy install --force -r requirements.yml -p roles
# Check syntax
- ansible-playbook --syntax-check -i inventory test.yml
# First run
- ansible-playbook -i inventory test.yml --connection=local --sudo -v
# Second run Idempotence test
- >
ansible-playbook -i inventory test.yml --connection=local --sudo
| grep -q 'failed=0'
&& (echo 'Idempotence test: pass' && exit 0)
|| (echo 'Idempotence test: fail' && exit 1)
# Further checks:
- yaourt --help || exit 1
script:
- "bash .travis/arch-travis.sh"
| 3 | 0.061224 | 3 | 0 |
4c763ddb300c77fa7f7c1357481afcd654149713 | CONTRIBUTING.md | CONTRIBUTING.md | Contributions are welcomed. Just follow the same style and common sense
| Contributions are welcomed. Just follow the same style and common sense
# Running tests
Install and run tox
# Publishing a new version
```bash
python3 setup.py sdist_wheel
python2 setup.py sdist_wheel
python3 -m twine upload dist/*
```
| Add instructions on how to test and publish | Add instructions on how to test and publish
Publish cannot be done via universal wheel as they have different dependencies | Markdown | mit | Mariocj89/dothub | markdown | ## Code Before:
Contributions are welcomed. Just follow the same style and common sense
## Instruction:
Add instructions on how to test and publish
Publish cannot be done via universal wheel as they have different dependencies
## Code After:
Contributions are welcomed. Just follow the same style and common sense
# Running tests
Install and run tox
# Publishing a new version
```bash
python3 setup.py sdist_wheel
python2 setup.py sdist_wheel
python3 -m twine upload dist/*
```
| Contributions are welcomed. Just follow the same style and common sense
+
+ # Running tests
+ Install and run tox
+
+ # Publishing a new version
+ ```bash
+ python3 setup.py sdist bdist_wheel
+ python2 setup.py bdist_wheel
+ python3 -m twine upload dist/*
+ ```
+ | 11 | 11 | 11 | 0 |
ad71cb0d5026adcb58dae6fe243eec96b762e148 | README.md | README.md | [](https://codeclimate.com/github/CoastDigitalGroup/subengine)
[](https://hakiri.io/github/CoastDigitalGroup/cdg-subengine/master)
# CoastDigitalGroup - Subscription Engine
The main authentication and subscription engine for all CoastDigitalGroup's apps open sourced. | [![Code Climate](https://codeclimate.com/github/CoastDigitalGroup/subengine/badges/gpa.svg)](https://codeclimate.com/github/CoastDigitalGroup/subengine)
[](https://hakiri.io/github/CoastDigitalGroup/cdg-subengine/master)
# CoastDigitalGroup - Subscription Engine
The main authentication and subscription engine for all CoastDigitalGroup's apps open sourced.
===== Step 1: Required in gemfile
* For Development - Remote Github - gem 'cdgsubengine', git: 'http://github.com/CoastDigitalGroup/cdg-subengine.git'
* For Production - Local Folder - gem 'cdgsubengine', path: 'cdgsubengine' Download cdg-subengine into root folder.
===== Step 2: Add below line to load required file to the top of routes file. (routes.rb)
require "cdgsubengine/constraints/subdomain_required"
===== Step 3: Add constraints to routes that require multitenancy function between the two lines below. (routes.rb)
constraints(Cdgsubengine::Constraints::SubdomainRequired) do
end
===== Step 4: Copy below line insert at the bottom of the routes file. (routes.rb)
mount Cdgsubengine::Engine, :at => '/'
------
===== Security and Login
* Uses Warden for Authentication
* Authentication through Subdomain
===== Account Data Management
* Sub-domain Restriction
* Sub-domain Validation
* Separated Account Data
===== User UI and Interaction
* Gravatar Support
==== InProgress
Stripe Plan & Payment System
* Stripe Payment Integration
* Plan/Subscription Stripe Integration
* Plan Switching & Management
==== Licence and Terms of Use
This project rocks and uses MIT-LICENSE and is open sourced.
Notice: Some major changes can break applications and we cannot be held liable. Under production, load the engine locally within the root of your project instead of using github repository. | Add old instructions to new project. Will need to adjust. | Add old instructions to new project. Will need to adjust.
| Markdown | mit | CoastDigitalGroup/cdg-subengine-devise,PHCNetworks/multi-tenancy-devise,CoastDigitalGroup/cdg-subengine,CoastDigitalGroup/cdg-subengine-devise,PHCNetworks/multi-tenancy-devise,CoastDigitalGroup/cdg-subengine,CoastDigitalGroup/cdg-subengine-devise,CoastDigitalGroup/cdg-subengine,PHCNetworks/multi-tenancy-devise | markdown | ## Code Before:
[](https://codeclimate.com/github/CoastDigitalGroup/subengine)
[](https://hakiri.io/github/CoastDigitalGroup/cdg-subengine/master)
# CoastDigitalGroup - Subscription Engine
The main authentication and subscription engine for all CoastDigitalGroup's apps open sourced.
## Instruction:
Add old instructions to new project. Will need to adjust.
## Code After:
[](https://codeclimate.com/github/CoastDigitalGroup/subengine)
[](https://hakiri.io/github/CoastDigitalGroup/cdg-subengine/master)
# CoastDigitalGroup - Subscription Engine
The main authentication and subscription engine for all CoastDigitalGroup's apps open sourced.
===== Step 1: Required in gemfile
* For Development - Remote Github - gem 'cdgsubengine', git: 'http://github.com/CoastDigitalGroup/cdg-subengine.git'
* For Production - Local Folder - gem 'cdgsubengine', path: 'cdgsubengine' Download cdg-subengine into root folder.
===== Step 2: Add below line to load required file to the top of routes file. (routes.rb)
require "cdgsubengine/constraints/subdomain_required"
===== Step 3: Add constraints to routes that require multitenancy function between the two lines below. (routes.rb)
constraints(Cdgsubengine::Constraints::SubdomainRequired) do
end
===== Step 4: Copy below line insert at the bottom of the routes file. (routes.rb)
mount Cdgsubengine::Engine, :at => '/'
------
===== Security and Login
* Uses Warden for Authentication
* Authentication through Subdomain
===== Account Data Management
* Sub-domain Restriction
* Sub-domain Validation
* Separated Account Data
===== User UI and Interaction
* Gravatar Support
==== InProgress
Stripe Plan & Payment System
* Stripe Payment Integration
* Plan/Subscription Stripe Integration
* Plan Switching & Management
==== Licence and Terms of Use
This project rocks and uses MIT-LICENSE and is open sourced.
Notice: Some major changes can break applications and we cannot be held liable. Under production, load the engine locally within the root of your project instead of using github repository. | [](https://codeclimate.com/github/CoastDigitalGroup/subengine)
[](https://hakiri.io/github/CoastDigitalGroup/cdg-subengine/master)
# CoastDigitalGroup - Subscription Engine
The main authentication and subscription engine for all CoastDigitalGroup's apps open sourced.
+
+ ===== Step 1: Required in gemfile
+ * For Development - Remote Github - gem 'cdgsubengine', git: 'http://github.com/CoastDigitalGroup/cdg-subengine.git'
+ * For Production - Local Folder - gem 'cdgsubengine', path: 'cdgsubengine' Download cdg-subengine into root folder.
+
+ ===== Step 2: Add below line to load required file to the top of routes file. (routes.rb)
+ require "cdgsubengine/constraints/subdomain_required"
+
+ ===== Step 3: Add constraints to routes that require multitenancy function between the two lines below. (routes.rb)
+ constraints(Cdgsubengine::Constraints::SubdomainRequired) do
+
+ end
+
+ ===== Step 4: Copy below line insert at the bottom of the routes file. (routes.rb)
+ mount Cdgsubengine::Engine, :at => '/'
+
+
+
+ ------
+
+ ===== Security and Login
+ * Uses Warden for Authentication
+ * Authentication through Subdomain
+
+ ===== Account Data Management
+ * Sub-domain Restriction
+ * Sub-domain Validation
+ * Separated Account Data
+
+ ===== User UI and Interaction
+ * Gravatar Support
+
+ ==== InProgress
+
+ Stripe Plan & Payment System
+ * Stripe Payment Integration
+ * Plan/Subscription Stripe Integration
+ * Plan Switching & Management
+
+ ==== Licence and Terms of Use
+
+ This project rocks and uses MIT-LICENSE and is open sourced.
+
+ Notice: Some major changes can break applications and we cannot be held liable. Under production, load the engine locally within the root of your project instead of using github repository. | 44 | 7.333333 | 44 | 0 |
c004edc9cb2c8cb37949065adc0c645808674064 | perforce/perforce_update.bat | perforce/perforce_update.bat | REM This Windows batch script logs into Perforce and updates a file named `last_updated` in the depot with the current date and timestamp.
REM This can be set up to be executed upon startup/login to avoid expiration of access permissions upon a lack of recent Perforce updates.
SET P4PORT=
SET P4CLIENT=
SET P4USER=
SET P4PASSWD=
SET DIR=
SET FILEPATH=%DIR%\last_updated
p4 login -s
p4 info
p4 sync -q -f
REM If the file does not exist, ADD will be successful; if the file does exist, EDIT will be successful
p4 add %FILEPATH%
p4 edit %FILEPATH%
ECHO %DATE%-%TIME% > %FILEPATH%
p4 submit -d "Update" %FILEPATH%
p4 logout
| @ECHO OFF
SETLOCAL ENABLEEXTENSIONS
REM This Windows batch script logs into Perforce and updates a file named `last_updated` in the depot with the current date and timestamp.
REM This can be set up to be executed upon startup/login to avoid expiration of access permissions upon a lack of recent Perforce updates.
ECHO [Script executed at %DATE%-%TIME%]
SET P4PORT=
SET P4CLIENT=
SET P4USER=
SET P4PASSWD=
SET DIR=
SET FILEPATH=%DIR%\last_updated
p4 login -s
p4 info
p4 sync -q -f
REM If the file does not exist, ADD will be successful; if the file does exist, EDIT will be successful
p4 add %FILEPATH%
p4 edit %FILEPATH%
ECHO %DATE%-%TIME% > %FILEPATH%
p4 submit -d "Update" %FILEPATH%
p4 logout
EXIT /B 0
| Add experience enhancers to perf_update | [Perforce] Add experience enhancers to perf_update
| Batchfile | mit | jleung51/scripts,jleung51/scripts,jleung51/scripts | batchfile | ## Code Before:
REM This Windows batch script logs into Perforce and updates a file named `last_updated` in the depot with the current date and timestamp.
REM This can be set up to be executed upon startup/login to avoid expiration of access permissions upon a lack of recent Perforce updates.
SET P4PORT=
SET P4CLIENT=
SET P4USER=
SET P4PASSWD=
SET DIR=
SET FILEPATH=%DIR%\last_updated
p4 login -s
p4 info
p4 sync -q -f
REM If the file does not exist, ADD will be successful; if the file does exist, EDIT will be successful
p4 add %FILEPATH%
p4 edit %FILEPATH%
ECHO %DATE%-%TIME% > %FILEPATH%
p4 submit -d "Update" %FILEPATH%
p4 logout
## Instruction:
[Perforce] Add experience enhancers to perf_update
## Code After:
@ECHO OFF
SETLOCAL ENABLEEXTENSIONS
REM This Windows batch script logs into Perforce and updates a file named `last_updated` in the depot with the current date and timestamp.
REM This can be set up to be executed upon startup/login to avoid expiration of access permissions upon a lack of recent Perforce updates.
ECHO [Script executed at %DATE%-%TIME%]
SET P4PORT=
SET P4CLIENT=
SET P4USER=
SET P4PASSWD=
SET DIR=
SET FILEPATH=%DIR%\last_updated
p4 login -s
p4 info
p4 sync -q -f
REM If the file does not exist, ADD will be successful; if the file does exist, EDIT will be successful
p4 add %FILEPATH%
p4 edit %FILEPATH%
ECHO %DATE%-%TIME% > %FILEPATH%
p4 submit -d "Update" %FILEPATH%
p4 logout
EXIT /B 0
| + @ECHO OFF
+ SETLOCAL ENABLEEXTENSIONS
+
REM This Windows batch script logs into Perforce and updates a file named `last_updated` in the depot with the current date and timestamp.
REM This can be set up to be executed upon startup/login to avoid expiration of access permissions upon a lack of recent Perforce updates.
+
+ ECHO [Script executed at %DATE%-%TIME%]
SET P4PORT=
SET P4CLIENT=
SET P4USER=
SET P4PASSWD=
SET DIR=
SET FILEPATH=%DIR%\last_updated
p4 login -s
p4 info
p4 sync -q -f
REM If the file does not exist, ADD will be successful; if the file does exist, EDIT will be successful
p4 add %FILEPATH%
p4 edit %FILEPATH%
ECHO %DATE%-%TIME% > %FILEPATH%
p4 submit -d "Update" %FILEPATH%
p4 logout
+
+ EXIT /B 0 | 7 | 0.304348 | 7 | 0 |
315bc1b131c72d57e3a2ea412a9ee7db2aaa85ae | services/nx_graph_generator/README.md | services/nx_graph_generator/README.md | (TBD) |
1. Install Docker: https://store.docker.com/search?type=edition&offering=community
1. (Optional) Install jq
1. Make sure you also have latest version of Docker Compose
1. From this directory, type ```docker-compose build && docker-compose up```
1. ```curl -d "@./sample-data/sample.cx" -H "Content-Type: application/json" -X POST localhost | jq .```
1. Now you should get a network with some new network attributes.
## Parameters
### Algorithm
You can select a graph generator by adding query-string `algorithm`
e.g.
` curl -d "@./sample-data/sample.cx" -H "Content-Type: application/json" -X POST "localhost?algorithm=gnm_random_graph" | jq
`
Random graph generators you can select are below.
- `fast_gnp_random_graph`
- `gnp_random_graph`
- `dense_gnm_random_graph`
- `gnm_random_graph`
- `erdos_renyi_graph`
- `binomial_graph`
- `newman_watts_strogatz_graph`
- `watts_strogatz_graph`
- `connected_watts_strogatz_graph`
- `random_regular_graph`
- `barabasi_albert_graph`
- `powerlaw_cluster_graph`
- `random_lobster`
- `random_powerlaw_tree`
### Generator's Parameters
You can also pass graph generator's parameters by adding query-string.
You can see parameters you can pass in the following link
https://networkx.github.io/documentation/development/reference/generators.html#module-networkx.generators.random_graphs
If you don't pass parameters, default parameters are passed.
| Add readme of graph generator | Add readme of graph generator
| Markdown | mit | idekerlab/graph-services | markdown | ## Code Before:
(TBD)
## Instruction:
Add readme of graph generator
## Code After:
1. Install Docker: https://store.docker.com/search?type=edition&offering=community
1. (Optional) Install jq
1. Make sure you also have latest version of Docker Compose
1. From this directory, type ```docker-compose build && docker-compose up```
1. ```curl -d "@./sample-data/sample.cx" -H "Content-Type: application/json" -X POST localhost | jq .```
1. Now you should get a network with some new network attributes.
## Parameters
### Algorithm
You can select a graph generator by adding query-string `algorithm`
e.g.
` curl -d "@./sample-data/sample.cx" -H "Content-Type: application/json" -X POST "localhost?algorithm=gnm_random_graph" | jq
`
Random graph generators you can select are below.
- `fast_gnp_random_graph`
- `gnp_random_graph`
- `dense_gnm_random_graph`
- `gnm_random_graph`
- `erdos_renyi_graph`
- `binomial_graph`
- `newman_watts_strogatz_graph`
- `watts_strogatz_graph`
- `connected_watts_strogatz_graph`
- `random_regular_graph`
- `barabasi_albert_graph`
- `powerlaw_cluster_graph`
- `random_lobster`
- `random_powerlaw_tree`
### Generator's Parameters
You can also pass graph generator's parameters by adding query-string.
You can see parameters you can pass in the following link
https://networkx.github.io/documentation/development/reference/generators.html#module-networkx.generators.random_graphs
If you don't pass parameters, default parameters are passed.
| - (TBD)
+
+ 1. Install Docker: https://store.docker.com/search?type=edition&offering=community
+ 1. (Optional) Install jq
+ 1. Make sure you also have latest version of Docker Compose
+ 1. From this directory, type ```docker-compose build && docker-compose up```
+ 1. ```curl -d "@./sample-data/sample.cx" -H "Content-Type: application/json" -X POST localhost | jq .```
+ 1. Now you should get a network with some new network attributes.
+
+
+ ## Parameters
+ ### Algorithm
+ You can select a graph generator by adding query-string `algorithm`
+
+ e.g.
+ ` curl -d "@./sample-data/sample.cx" -H "Content-Type: application/json" -X POST localhost?algorithm=gnm_random_graph" | jq
+ `
+
+ Random graph generator you can select are below.
+ - `fast_gnp_random_graph`
+ - `gnp_random_graph`
+ - `dense_gnm_random_graph`
+ - `gnm_random_graph`
+ - `erdos_renyi_graph`
+ - `binomial_graph`
+ - `newman_watts_strogatz_graph`
+ - `watts_strogatz_graph`
+ - `connected_watts_strogatz_graph`
+ - `random_regular_graph`
+ - `barabasi_albert_graph`
+ - `powerlaw_cluster_graph`
+ - `random_lobster`
+ - `random_powerlaw_tree`
+
+
+ ### Generator's Parameters
+ You can also pass graph generator's parameters by adding query-string.
+ You can see parameters you can pass in the following link
+ https://networkx.github.io/documentation/development/reference/generators.html#module-networkx.generators.random_graphs
+
+ If you don't pass parameters, default parameters are passed. | 41 | 41 | 40 | 1 |
47ad1a3c58f2acb09235c4a5463d3a8be8b3f800 | src/js/services/tracker.js | src/js/services/tracker.js | /**
* Event tracker (analytics)
*/
var module = angular.module('tracker', []);
var mixpanel;
module.factory('rpTracker', ['$rootScope', function ($scope) {
var track = function (event,properties) {
if (Options.mixpanel && Options.mixpanel.track && mixpanel) {
mixpanel.track(event,properties);
}
};
return {
track: track
};
}]);
| /**
* Event tracker (analytics)
*/
var module = angular.module('tracker', []);
module.factory('rpTracker', ['$rootScope', function ($scope) {
var track = function (event,properties) {
console.log('track?',Options.mixpanel, Options.mixpanel.track, typeof mixpanel !== 'undefined');
if (Options.mixpanel && Options.mixpanel.track && mixpanel) {
console.log('track');
mixpanel.track(event,properties);
}
};
return {
track: track
};
}]);
| Check if variable is defined. | Tracker: Check if variable is defined.
| JavaScript | isc | h0vhannes/ripple-client,wangbibo/ripple-client,vhpoet/ripple-client-desktop,Madsn/ripple-client,darkdarkdragon/ripple-client-desktop,Madsn/ripple-client,bankonme/ripple-client-desktop,h0vhannes/ripple-client,MatthewPhinney/ripple-client,vhpoet/ripple-client-desktop,h0vhannes/ripple-client,resilience-me/DEPRICATED_ripple-client,darkdarkdragon/ripple-client,Madsn/ripple-client,xdv/ripple-client-desktop,xdv/ripple-client,MatthewPhinney/ripple-client,darkdarkdragon/ripple-client,arturomc/ripple-client,vhpoet/ripple-client,dncohen/ripple-client-desktop,yongsoo/ripple-client,wangbibo/ripple-client,yongsoo/ripple-client,xdv/ripple-client-desktop,MatthewPhinney/ripple-client-desktop,mrajvanshy/ripple-client,ripple/ripple-client,thics/ripple-client-desktop,yxxyun/ripple-client-desktop,thics/ripple-client-desktop,arturomc/ripple-client,ripple/ripple-client,mrajvanshy/ripple-client,wangbibo/ripple-client,bsteinlo/ripple-client,darkdarkdragon/ripple-client,arturomc/ripple-client,MatthewPhinney/ripple-client,MatthewPhinney/ripple-client,mrajvanshy/ripple-client,mrajvanshy/ripple-client,xdv/ripple-client-desktop,yongsoo/ripple-client-desktop,darkdarkdragon/ripple-client-desktop,xdv/ripple-client,arturomc/ripple-client,yongsoo/ripple-client,ripple/ripple-client,vhpoet/ripple-client,vhpoet/ripple-client,bsteinlo/ripple-client,ripple/ripple-client,vhpoet/ripple-client,MatthewPhinney/ripple-client-desktop,dncohen/ripple-client-desktop,resilience-me/DEPRICATED_ripple-client,xdv/ripple-client,xdv/ripple-client,yongsoo/ripple-client-desktop,yxxyun/ripple-client-desktop,mrajvanshy/ripple-client-desktop,bankonme/ripple-client-desktop,yongsoo/ripple-client,ripple/ripple-client-desktop,h0vhannes/ripple-client,darkdarkdragon/ripple-client,wangbibo/ripple-client,ripple/ripple-client-desktop,mrajvanshy/ripple-client-desktop,Madsn/ripple-client | javascript | ## Code Before:
/**
* Event tracker (analytics)
*/
var module = angular.module('tracker', []);
var mixpanel;
module.factory('rpTracker', ['$rootScope', function ($scope) {
var track = function (event,properties) {
if (Options.mixpanel && Options.mixpanel.track && mixpanel) {
mixpanel.track(event,properties);
}
};
return {
track: track
};
}]);
## Instruction:
Tracker: Check if variable is defined.
## Code After:
/**
* Event tracker (analytics)
*/
var module = angular.module('tracker', []);
module.factory('rpTracker', ['$rootScope', function ($scope) {
var track = function (event,properties) {
console.log('track?',Options.mixpanel, Options.mixpanel.track, typeof mixpanel !== 'undefined');
if (Options.mixpanel && Options.mixpanel.track && mixpanel) {
console.log('track');
mixpanel.track(event,properties);
}
};
return {
track: track
};
}]);
| /**
* Event tracker (analytics)
*/
var module = angular.module('tracker', []);
- var mixpanel;
module.factory('rpTracker', ['$rootScope', function ($scope) {
var track = function (event,properties) {
+ console.log('track?',Options.mixpanel, Options.mixpanel.track, typeof mixpanel !== 'undefined');
if (Options.mixpanel && Options.mixpanel.track && mixpanel) {
+ console.log('track');
mixpanel.track(event,properties);
}
};
return {
track: track
};
}]); | 3 | 0.166667 | 2 | 1 |
2b2e76d560e2c46846df6962ad2684df812aec45 | .travis.yml | .travis.yml | language: objective-c
osx_image: xcode7.3
matrix:
include:
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: iphonesimulator
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-tvOS
xcode_sdk: appletvsimulator
- xcode_workspace: Example-OSX/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: macosx
| language: objective-c
osx_image: xcode7.3
before_install:
- brew update > /dev/null; if brew outdated | grep -qx xctool; then brew upgrade xctool; fi
matrix:
include:
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: iphonesimulator
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-tvOS
xcode_sdk: appletvsimulator
- xcode_workspace: Example-OSX/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: macosx
| Revert change to remove xctool update on Travis | Revert change to remove xctool update on Travis
| YAML | mit | chillpop/VOKBenkode,chillpop/VOKBenkode,vokal/VOKBenkode,chillpop/VOKBenkode | yaml | ## Code Before:
language: objective-c
osx_image: xcode7.3
matrix:
include:
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: iphonesimulator
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-tvOS
xcode_sdk: appletvsimulator
- xcode_workspace: Example-OSX/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: macosx
## Instruction:
Revert change to remove xctool update on Travis
## Code After:
language: objective-c
osx_image: xcode7.3
before_install:
- brew update > /dev/null; if brew outdated | grep -qx xctool; then brew upgrade xctool; fi
matrix:
include:
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: iphonesimulator
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-tvOS
xcode_sdk: appletvsimulator
- xcode_workspace: Example-OSX/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: macosx
| language: objective-c
osx_image: xcode7.3
+ before_install:
+ - brew update > /dev/null; if brew outdated | grep -qx xctool; then brew upgrade xctool; fi
matrix:
include:
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: iphonesimulator
- xcode_workspace: Example-iOS/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-tvOS
xcode_sdk: appletvsimulator
- xcode_workspace: Example-OSX/VOKBenkode.xcworkspace
xcode_scheme: VOKBenkode-Example
xcode_sdk: macosx | 2 | 0.153846 | 2 | 0 |
e0a166c5198182b271cad9946976d4b9d187a24b | spec/core/kernel/eval_spec.rb | spec/core/kernel/eval_spec.rb | require File.expand_path('../../../spec_helper', __FILE__)
require File.expand_path('../fixtures/classes', __FILE__)
describe "Kernel#eval" do
before :each do
@cache = tmp("eval_cache.rbc")
name = fixture(__FILE__, "eval_cache.rb")
KernelSpecs.cache_file name, @cache
end
after :each do
rm_r @cache
end
it "creates a CompiledMethod that can be cached and re-run" do
KernelSpecs.run_cache(@cache).should == "Object:Object"
end
end
| require File.expand_path('../../../spec_helper', __FILE__)
require File.expand_path('../fixtures/classes', __FILE__)
describe "Kernel#eval" do
before :each do
@cache = tmp("eval_cache.rbc")
name = fixture(__FILE__, "eval_cache.rb")
KernelSpecs.cache_file name, @cache
end
after :each do
rm_r @cache
end
it "creates a CompiledMethod that can be cached and re-run" do
KernelSpecs.run_cache(@cache).should == "Object:Object"
end
it "should not use the cache if the line-number differs" do
eval("__LINE__", binding, "(file)", 1).should == 1
eval("__LINE__", binding, "(file)", 2).should == 2
end
end
| Add a spec for __LINE__ in eval | Add a spec for __LINE__ in eval
| Ruby | bsd-3-clause | benlovell/rubinius,ruipserra/rubinius,benlovell/rubinius,digitalextremist/rubinius,heftig/rubinius,mlarraz/rubinius,ngpestelos/rubinius,kachick/rubinius,heftig/rubinius,sferik/rubinius,heftig/rubinius,jemc/rubinius,ruipserra/rubinius,kachick/rubinius,Azizou/rubinius,dblock/rubinius,pH14/rubinius,kachick/rubinius,sferik/rubinius,Wirachmat/rubinius,digitalextremist/rubinius,benlovell/rubinius,lgierth/rubinius,ruipserra/rubinius,jsyeo/rubinius,Wirachmat/rubinius,digitalextremist/rubinius,jemc/rubinius,Azizou/rubinius,lgierth/rubinius,ruipserra/rubinius,jemc/rubinius,Wirachmat/rubinius,ruipserra/rubinius,Wirachmat/rubinius,Azizou/rubinius,ngpestelos/rubinius,jemc/rubinius,Azizou/rubinius,mlarraz/rubinius,ruipserra/rubinius,benlovell/rubinius,sferik/rubinius,jsyeo/rubinius,ngpestelos/rubinius,jsyeo/rubinius,mlarraz/rubinius,lgierth/rubinius,dblock/rubinius,Azizou/rubinius,Azizou/rubinius,jsyeo/rubinius,mlarraz/rubinius,ngpestelos/rubinius,ngpestelos/rubinius,pH14/rubinius,sferik/rubinius,pH14/rubinius,kachick/rubinius,kachick/rubinius,heftig/rubinius,lgierth/rubinius,mlarraz/rubinius,kachick/rubinius,jsyeo/rubinius,kachick/rubinius,digitalextremist/rubinius,kachick/rubinius,pH14/rubinius,benlovell/rubinius,ruipserra/rubinius,lgierth/rubinius,benlovell/rubinius,Wirachmat/rubinius,jemc/rubinius,mlarraz/rubinius,pH14/rubinius,digitalextremist/rubinius,sferik/rubinius,ngpestelos/rubinius,dblock/rubinius,dblock/rubinius,pH14/rubinius,heftig/rubinius,dblock/rubinius,heftig/rubinius,Azizou/rubinius,sferik/rubinius,Wirachmat/rubinius,jsyeo/rubinius,dblock/rubinius,sferik/rubinius,mlarraz/rubinius,ngpestelos/rubinius,pH14/rubinius,jemc/rubinius,lgierth/rubinius,dblock/rubinius,heftig/rubinius,digitalextremist/rubinius,benlovell/rubinius,Wirachmat/rubinius,jemc/rubinius,lgierth/rubinius,jsyeo/rubinius,digitalextremist/rubinius | ruby | ## Code Before:
require File.expand_path('../../../spec_helper', __FILE__)
require File.expand_path('../fixtures/classes', __FILE__)
describe "Kernel#eval" do
before :each do
@cache = tmp("eval_cache.rbc")
name = fixture(__FILE__, "eval_cache.rb")
KernelSpecs.cache_file name, @cache
end
after :each do
rm_r @cache
end
it "creates a CompiledMethod that can be cached and re-run" do
KernelSpecs.run_cache(@cache).should == "Object:Object"
end
end
## Instruction:
Add a spec for __LINE__ in eval
## Code After:
require File.expand_path('../../../spec_helper', __FILE__)
require File.expand_path('../fixtures/classes', __FILE__)
describe "Kernel#eval" do
before :each do
@cache = tmp("eval_cache.rbc")
name = fixture(__FILE__, "eval_cache.rb")
KernelSpecs.cache_file name, @cache
end
after :each do
rm_r @cache
end
it "creates a CompiledMethod that can be cached and re-run" do
KernelSpecs.run_cache(@cache).should == "Object:Object"
end
it "should not use the cache if the line-number differs" do
eval("__LINE__", binding, "(file)", 1).should == 1
eval("__LINE__", binding, "(file)", 2).should == 2
end
end
| require File.expand_path('../../../spec_helper', __FILE__)
require File.expand_path('../fixtures/classes', __FILE__)
describe "Kernel#eval" do
before :each do
@cache = tmp("eval_cache.rbc")
name = fixture(__FILE__, "eval_cache.rb")
KernelSpecs.cache_file name, @cache
end
after :each do
rm_r @cache
end
it "creates a CompiledMethod that can be cached and re-run" do
KernelSpecs.run_cache(@cache).should == "Object:Object"
end
+
+ it "should not use the cache if the line-number differs" do
+ eval("__LINE__", binding, "(file)", 1).should == 1
+ eval("__LINE__", binding, "(file)", 2).should == 2
+ end
end | 5 | 0.277778 | 5 | 0 |
515acfe7cf4ce3c5e5c7028c441347d33e47996e | .travis.yml | .travis.yml | language: node_js
node_js:
- '4.0'
before_install: npm install -g grunt-cli
before_script:
- ./node_modules/.bin/webdriver-manager update
env:
SAUCE_ACCESS_KEY=af8788b9-bde3-4e85-8557-cacfbac98cdc
SAUCE_USERNAME=alfonso-presa
| language: node_js
node_js:
- 4
- 5
- 6
before_install: npm install -g grunt-cli
env:
SAUCE_ACCESS_KEY=af8788b9-bde3-4e85-8557-cacfbac98cdc
SAUCE_USERNAME=alfonso-presa
| Test in node 4, 5, and 6 | Test in node 4, 5, and 6
Webdriver update is now happening in `grunt test` so we don't need to
do it in Travis anymore.
| YAML | mit | alfonso-presa/protractor-testability-plugin,alfonso-presa/protractor-testability-plugin | yaml | ## Code Before:
language: node_js
node_js:
- '4.0'
before_install: npm install -g grunt-cli
before_script:
- ./node_modules/.bin/webdriver-manager update
env:
SAUCE_ACCESS_KEY=af8788b9-bde3-4e85-8557-cacfbac98cdc
SAUCE_USERNAME=alfonso-presa
## Instruction:
Test in node 4, 5, and 6
Webdriver update is now happening in `grunt test` so we don't need to
do it in Travis anymore.
## Code After:
language: node_js
node_js:
- 4
- 5
- 6
before_install: npm install -g grunt-cli
env:
SAUCE_ACCESS_KEY=af8788b9-bde3-4e85-8557-cacfbac98cdc
SAUCE_USERNAME=alfonso-presa
| language: node_js
node_js:
- - '4.0'
+ - 4
+ - 5
+ - 6
before_install: npm install -g grunt-cli
- before_script:
- - ./node_modules/.bin/webdriver-manager update
env:
SAUCE_ACCESS_KEY=af8788b9-bde3-4e85-8557-cacfbac98cdc
SAUCE_USERNAME=alfonso-presa | 6 | 0.666667 | 3 | 3 |
6788de7995b6d93944f573b4c7fe41f20915115c | pkg/machines/hostvmslist.scss | pkg/machines/hostvmslist.scss | // PF4 issue https://github.com/patternfly/patternfly-react/issues/4612
@import "~@patternfly/patternfly/components/Table/table.scss";
@import "~@patternfly/patternfly/components/Table/table-grid.scss";
@import "~@patternfly/patternfly/components/Toolbar/toolbar.scss";
// Mimic .pf-c-table__action
#virtual-machines-listing td:last-child {
--pf-c-table--cell--Width: 1%;
}
#virtual-machines-page-main-nav {
padding: 0;
}
// Expand the link to the container, for easier clickability
#virtual-machines-listing th a {
display: block;
text-align: left;
} | // PF4 issue https://github.com/patternfly/patternfly-react/issues/4612
@import "~@patternfly/patternfly/components/Table/table.scss";
@import "~@patternfly/patternfly/components/Table/table-grid.scss";
@import "~@patternfly/patternfly/components/Toolbar/toolbar.scss";
// Mimic .pf-c-table__action
#virtual-machines-listing td:last-child {
--pf-c-table--cell--Width: 1%;
}
#virtual-machines-page-main-nav {
padding: 0;
}
// Expand the link to the container, for easier clickability
#virtual-machines-listing th a {
display: block;
text-align: left;
}
/* Add the missing space when the list's toolbar wraps
* patternfly/patternfly#3348
*/
.pf-c-toolbar__content-section {
gap: var(--pf-global--spacer--sm) 0;
}
| Add a vertical space when toolbar in the main page wraps | machines: Add a vertical space when toolbar in the main page wraps
| SCSS | lgpl-2.1 | garrett/cockpit,cockpit-project/cockpit,garrett/cockpit,garrett/cockpit,martinpitt/cockpit,mvollmer/cockpit,martinpitt/cockpit,deryni/cockpit,cockpit-project/cockpit,deryni/cockpit,deryni/cockpit,deryni/cockpit,garrett/cockpit,martinpitt/cockpit,cockpit-project/cockpit,deryni/cockpit,deryni/cockpit,garrett/cockpit,cockpit-project/cockpit,deryni/cockpit,martinpitt/cockpit,cockpit-project/cockpit,martinpitt/cockpit,mvollmer/cockpit,mvollmer/cockpit,mvollmer/cockpit,mvollmer/cockpit | scss | ## Code Before:
// PF4 issue https://github.com/patternfly/patternfly-react/issues/4612
@import "~@patternfly/patternfly/components/Table/table.scss";
@import "~@patternfly/patternfly/components/Table/table-grid.scss";
@import "~@patternfly/patternfly/components/Toolbar/toolbar.scss";
// Mimic .pf-c-table__action
#virtual-machines-listing td:last-child {
--pf-c-table--cell--Width: 1%;
}
#virtual-machines-page-main-nav {
padding: 0;
}
// Expand the link to the container, for easier clickability
#virtual-machines-listing th a {
display: block;
text-align: left;
}
## Instruction:
machines: Add a vertical space when toolbar in the main page wraps
## Code After:
// PF4 issue https://github.com/patternfly/patternfly-react/issues/4612
@import "~@patternfly/patternfly/components/Table/table.scss";
@import "~@patternfly/patternfly/components/Table/table-grid.scss";
@import "~@patternfly/patternfly/components/Toolbar/toolbar.scss";
// Mimic .pf-c-table__action
#virtual-machines-listing td:last-child {
--pf-c-table--cell--Width: 1%;
}
#virtual-machines-page-main-nav {
padding: 0;
}
// Expand the link to the container, for easier clickability
#virtual-machines-listing th a {
display: block;
text-align: left;
}
/* Add the missing space when the list's toolbar wraps
* patternfly/patternfly#3348
*/
.pf-c-toolbar__content-section {
gap: var(--pf-global--spacer--sm) 0;
}
| // PF4 issue https://github.com/patternfly/patternfly-react/issues/4612
@import "~@patternfly/patternfly/components/Table/table.scss";
@import "~@patternfly/patternfly/components/Table/table-grid.scss";
@import "~@patternfly/patternfly/components/Toolbar/toolbar.scss";
// Mimic .pf-c-table__action
#virtual-machines-listing td:last-child {
--pf-c-table--cell--Width: 1%;
}
#virtual-machines-page-main-nav {
padding: 0;
}
// Expand the link to the container, for easier clickability
#virtual-machines-listing th a {
display: block;
text-align: left;
}
+
+ /* Add the missing space when the list's toolbar wraps
+ * patternfly/patternfly#3348
+ */
+ .pf-c-toolbar__content-section {
+ gap: var(--pf-global--spacer--sm) 0;
+ } | 7 | 0.368421 | 7 | 0 |
e068fc34a68ad08364c870fbba4389212c40d1f5 | README.md | README.md |
Check out the repo and run "run.bat" with the path to a git repository as a parameter. You should end up with a "glue.html" file in output with some very basic stats regarding that repo.
### It can't find git!
I've defaulted the git bin path to: ``@"C:\Program Files (x86)\Git\bin\git.exe"``. You may need to override this as an environment variable (``git``) or just change it in the ``src/readLog.fsx`` file
### I want to analysis different things!
That's kind of the whole point :). Crack open ``generate/glue.fsx`` and start adding things. The data manipulation is being done by [Deedle](http://bluemountaincapital.github.io/Deedle) which I'm enjoying a lot, but I'm not an expert in by any means.
|
Check out the repo and run "run.bat" with the path to a git repository as a parameter. You should end up with a "glue.html" file in output with some very basic stats regarding that repo.
### It can't find git!
I've defaulted the git bin path to: ``@"C:\Program Files (x86)\Git\bin\git.exe"``. You may need to override this as an environment variable (``git``) or just change it in the ``src/readLog.fsx`` file
### I want to analysis different things!
That's kind of the whole point :). Crack open ``generate/glue.fsx`` and start adding things. The data manipulation is being done by [Deedle](http://bluemountaincapital.github.io/Deedle) which I'm enjoying a lot, but I'm not an expert in by any means.
### What does it look like?
Something like this:

| Add example graph (I hope) | Add example graph (I hope)
| Markdown | mit | mavnn/GitStuff,mavnn/GitStuff | markdown | ## Code Before:
Check out the repo and run "run.bat" with the path to a git repository as a parameter. You should end up with a "glue.html" file in output with some very basic stats regarding that repo.
### It can't find git!
I've defaulted the git bin path to: ``@"C:\Program Files (x86)\Git\bin\git.exe"``. You may need to override this as an environment variable (``git``) or just change it in the ``src/readLog.fsx`` file
### I want to analysis different things!
That's kind of the whole point :). Crack open ``generate/glue.fsx`` and start adding things. The data manipulation is being done by [Deedle](http://bluemountaincapital.github.io/Deedle) which I'm enjoying a lot, but I'm not an expert in by any means.
## Instruction:
Add example graph (I hope)
## Code After:
Check out the repo and run "run.bat" with the path to a git repository as a parameter. You should end up with a "glue.html" file in output with some very basic stats regarding that repo.
### It can't find git!
I've defaulted the git bin path to: ``@"C:\Program Files (x86)\Git\bin\git.exe"``. You may need to override this as an environment variable (``git``) or just change it in the ``src/readLog.fsx`` file
### I want to analysis different things!
That's kind of the whole point :). Crack open ``generate/glue.fsx`` and start adding things. The data manipulation is being done by [Deedle](http://bluemountaincapital.github.io/Deedle) which I'm enjoying a lot, but I'm not an expert in by any means.
### What does it look like?
Something like this:

|
Check out the repo and run "run.bat" with the path to a git repository as a parameter. You should end up with a "glue.html" file in output with some very basic stats regarding that repo.
### It can't find git!
I've defaulted the git bin path to: ``@"C:\Program Files (x86)\Git\bin\git.exe"``. You may need to override this as an environment variable (``git``) or just change it in the ``src/readLog.fsx`` file
### I want to analysis different things!
That's kind of the whole point :). Crack open ``generate/glue.fsx`` and start adding things. The data manipulation is being done by [Deedle](http://bluemountaincapital.github.io/Deedle) which I'm enjoying a lot, but I'm not an expert in by any means.
+
+ ### What does it look like?
+
+ Something like this:
+
+  | 6 | 0.6 | 6 | 0 |
38169a161b80c2b0ab6e2c7e7c39f52406967956 | app/views/board.html | app/views/board.html | <div ng-controller="BoardCtrl" ng-init="init()">
<h1>Board</h1>
<h2>Total: {{ calculateTotal() }}</h2>
<h2>{{ pointsTotal }}</h2>
<div ng-show="tasksLoaded()" ng-repeat="task in tasks">
<h2> {{ task.name }}</h2>
<div ng-controller="TaskCtrl"
ng-init="loadTaskPoints()"
ng-show="taskPointsLoaded()">
<p style="color: white; background: red; font-size: 20pt; padding: 2px;">{{ points }}</p>
<div ng-show="tagsLoaded()">
<select ng-model="taskPointTag" ng-options="pointTag.name for pointTag in getPointTags(tags)">
<option value="">Select a point</option>
</select>
</div>
</div>
<hr>
</div>
</div>
| <div ng-controller="BoardCtrl" ng-init="init()">
<h1>Board</h1>
<h2>Total: {{ calculateTotal() }}</h2>
<h2>{{ pointsTotal }}</h2>
<div ng-show="tasksLoaded()" ng-repeat="task in tasks">
<h2> {{ task.name }}</h2>
<div ng-controller="TaskCtrl"
ng-init="loadTaskPoints()"
ng-show="taskPointsLoaded()">
<p style="color: white; background: red; font-size: 20pt; padding: 2px;">{{ points }}</p>
<div ng-show="tagsLoaded()">
<select ng-model="taskPointTag"
ng-options="pointTag.name for pointTag in getPointTags(tags) | orderBy: 'name'">
<option value="">Select a point</option>
</select>
</div>
</div>
<hr>
</div>
</div>
| Order points in ascending order. | Order points in ascending order.
| HTML | mit | astrohckr/sprint-board | html | ## Code Before:
<div ng-controller="BoardCtrl" ng-init="init()">
<h1>Board</h1>
<h2>Total: {{ calculateTotal() }}</h2>
<h2>{{ pointsTotal }}</h2>
<div ng-show="tasksLoaded()" ng-repeat="task in tasks">
<h2> {{ task.name }}</h2>
<div ng-controller="TaskCtrl"
ng-init="loadTaskPoints()"
ng-show="taskPointsLoaded()">
<p style="color: white; background: red; font-size: 20pt; padding: 2px;">{{ points }}</p>
<div ng-show="tagsLoaded()">
<select ng-model="taskPointTag" ng-options="pointTag.name for pointTag in getPointTags(tags)">
<option value="">Select a point</option>
</select>
</div>
</div>
<hr>
</div>
</div>
## Instruction:
Order points in ascending order.
## Code After:
<div ng-controller="BoardCtrl" ng-init="init()">
<h1>Board</h1>
<h2>Total: {{ calculateTotal() }}</h2>
<h2>{{ pointsTotal }}</h2>
<div ng-show="tasksLoaded()" ng-repeat="task in tasks">
<h2> {{ task.name }}</h2>
<div ng-controller="TaskCtrl"
ng-init="loadTaskPoints()"
ng-show="taskPointsLoaded()">
<p style="color: white; background: red; font-size: 20pt; padding: 2px;">{{ points }}</p>
<div ng-show="tagsLoaded()">
<select ng-model="taskPointTag"
ng-options="pointTag.name for pointTag in getPointTags(tags) | orderBy: 'name'">
<option value="">Select a point</option>
</select>
</div>
</div>
<hr>
</div>
</div>
| <div ng-controller="BoardCtrl" ng-init="init()">
<h1>Board</h1>
<h2>Total: {{ calculateTotal() }}</h2>
<h2>{{ pointsTotal }}</h2>
<div ng-show="tasksLoaded()" ng-repeat="task in tasks">
<h2> {{ task.name }}</h2>
<div ng-controller="TaskCtrl"
ng-init="loadTaskPoints()"
ng-show="taskPointsLoaded()">
<p style="color: white; background: red; font-size: 20pt; padding: 2px;">{{ points }}</p>
<div ng-show="tagsLoaded()">
- <select ng-model="taskPointTag" ng-options="pointTag.name for pointTag in getPointTags(tags)">
+ <select ng-model="taskPointTag"
+ ng-options="pointTag.name for pointTag in getPointTags(tags) | orderBy: 'name'">
<option value="">Select a point</option>
</select>
</div>
</div>
<hr>
</div>
</div>
| 3 | 0.096774 | 2 | 1 |
b8451fc0fc4f2899df56d3171e1e6f6c691e65a8 | promo/app/models/spree/promotion_rule.rb | promo/app/models/spree/promotion_rule.rb | module Spree
class PromotionRule < ActiveRecord::Base
belongs_to :promotion, :foreign_key => 'activator_id'
scope :of_type, lambda {|t| {:conditions => {:type => t}}}
validate :promotion, :presence => true
validate :unique_per_activator, :on => :create
def eligible?(order, options = {})
raise 'eligible? should be implemented in a sub-class of Promotion::PromotionRule'
end
private
def unique_per_activator
if Spree::PromotionRule.exists?(:activator_id => activator_id, :type => self.class.name)
errors[:base] << "Promotion already contains this rule type"
end
end
end
end
| module Spree
class PromotionRule < ActiveRecord::Base
belongs_to :promotion, :foreign_key => 'activator_id'
scope :of_type, lambda {|t| {:conditions => {:type => t}}}
validate :promotion, :presence => true
validate :unique_per_activator, :on => :create
attr_accessible :preferred_operator, :preferred_amount
def eligible?(order, options = {})
raise 'eligible? should be implemented in a sub-class of Promotion::PromotionRule'
end
private
def unique_per_activator
if Spree::PromotionRule.exists?(:activator_id => activator_id, :type => self.class.name)
errors[:base] << "Promotion already contains this rule type"
end
end
end
end
| Make preferred_operator and preferred_amount accessible attributes on PromotionRule | Make preferred_operator and preferred_amount accessible attributes on PromotionRule
 | Ruby | bsd-3-clause | DynamoMTL/spree,HealthWave/spree,ayb/spree,vinsol/spree,joanblake/spree,JDutil/spree,trigrass2/spree,sliaquat/spree,gautamsawhney/spree,raow/spree,jparr/spree,jsurdilla/solidus,archSeer/spree,fahidnasir/spree,jimblesm/spree,raow/spree,bonobos/solidus,Antdesk/karpal-spree,progsri/spree,shekibobo/spree,tomash/spree,keatonrow/spree,shaywood2/spree,zamiang/spree,piousbox/spree,project-eutopia/spree,Mayvenn/spree,groundctrl/spree,LBRapid/spree,karlitxo/spree,project-eutopia/spree,priyank-gupta/spree,rbngzlv/spree,hifly/spree,knuepwebdev/FloatTubeRodHolders,bjornlinder/Spree,mleglise/spree,wolfieorama/spree,azranel/spree,sideci-sample/sideci-sample-spree,grzlus/spree,Senjai/solidus,wolfieorama/spree,Senjai/spree,yushine/spree,derekluo/spree,brchristian/spree,Hates/spree,abhishekjain16/spree,kewaunited/spree,jspizziri/spree,Hawaiideveloper/shoppingcart,mleglise/spree,alvinjean/spree,woboinc/spree,RatioClothing/spree,DarkoP/spree,ayb/spree,yiqing95/spree,pervino/spree,groundctrl/spree,FadliKun/spree,DarkoP/spree,jasonfb/spree,dandanwei/spree,tomash/spree,Boomkat/spree,jeffboulet/spree,builtbybuffalo/spree,archSeer/spree,Hawaiideveloper/shoppingcart,mvollmer/spree,DynamoMTL/spree,Boomkat/spree,groundctrl/spree,fahidnasir/spree,fahidnasir/spree,derekluo/spree,agient/agientstorefront,AgilTec/spree,lyzxsc/spree,rajeevriitm/spree,TimurTarasenko/spree,patdec/spree,biagidp/spree,jeffboulet/spree,builtbybuffalo/spree,wolfieorama/spree,Ropeney/spree,yushine/spree,athal7/solidus,archSeer/spree,Mayvenn/spree,watg/spree,gregoryrikson/spree-sample,lzcabrera/spree-1-3-stable,jeffboulet/spree,freerunningtech/spree,kewaunited/spree,jordan-brough/solidus,athal7/solidus,nooysters/spree,CiscoCloud/spree,ramkumar-kr/spree,KMikhaylovCTG/spree,Senjai/solidus,HealthWave/spree,progsri/spree,brchristian/spree,TimurTarasenko/spree,Nevensoft/spree,StemboltHQ/spree,shioyama/spree,welitonfreitas/spree,azranel/spree,jparr/spree,firman/spree,Boomkat/spree,jeffboulet/spree,abhishekjain16/spree,patdec/spree,biagidp/spree,jimblesm/spree,raow/spree,Nevensoft/spree,jspizziri/spree,adaddeo/spree,spree,surfdome/spree,jimblesm/spree,Hates/spree,DarkoP/spree,ayb/spree,lzcabrera/spree-1-3-stable,jasonfb/spree,nooysters/spree,Engeltj/spree,shaywood2/spree,alvinjean/spree,azclick/spree,knuepwebdev/FloatTubeRodHolders,Hawaiideveloper/shoppingcart,lzcabrera/spree-1-3-stable,mleglise/spree,wolfieorama/spree,xuewenfei/solidus,yiqing95/spree,useiichi/spree,tomash/spree,locomotivapro/spree,Nevensoft/spree,JDutil/spree,karlitxo/spree,forkata/solidus,gregoryrikson/spree-sample,reidblomquist/spree,CJMrozek/spree,rakibulislam/spree,robodisco/spree,joanblake/spree,odk211/spree,beni55/spree,zaeznet/spree,scottcrawford03/solidus,alejandromangione/spree,athal7/solidus,CJMrozek/spree,moneyspyder/spree,FadliKun/spree,yomishra/pce,vinayvinsol/spree,rajeevriitm/spree,xuewenfei/solidus,ramkumar-kr/spree,piousbox/spree,xuewenfei/solidus,robodisco/spree,madetech/spree,Machpowersystems/spree_mach,pervino/solidus,SadTreeFriends/spree,Machpowersystems/spree_mach,mindvolt/spree,bricesanchez/spree,degica/spree,orenf/spree,berkes/spree,rajeevriitm/spree,RatioClothing/spree,rakibulislam/spree,kitwalker12/spree,TimurTarasenko/spree,alvinjean/spree,devilcoders/solidus,gregoryrikson/spree-sample,abhishekjain16/spree,JuandGirald/spree,brchristian/spree,caiqinghua/spree,NerdsvilleCEO/spree,volpejoaquin/spree,tailic/spree,vcavallo/spree,Lostmyname/spree,jspizziri/spree,quentinuys/spree,project-eutopia/spree,orenf/spree,tailic/spree,calvinl/spree,carlesjove/spree,hifly/spree,dotandbo/spree,codesavvy/sandbox,KMikhaylovCTG/spree,vulk/spree,Arpsara/solidus,NerdsvilleCEO/spree,groundctrl/spree,beni55/spree,watg/spree,thogg4/spree,shaywood2/spree,jordan-brough/spree,Ropeney/spree,rbngzlv/spree,SadTreeFriends/spree,alejandromangione/spree,cutefrank/spree,maybii/spree,APohio/spree,jaspreet21anand/spree,Mayvenn/spree,orenf/spree,pervino/solidus,jsurdilla/solidus,cutefrank/spree,forkata/solidus,sliaquat/spree,firman/spree,reidblomquist/spree,camelmasa/spree,agient/agientstorefront,beni55/spree,jimblesm/spree,urimikhli/spree,abhishekjain16/spree,azclick/spree,sideci-sample/sideci-sample-spree,moneyspyder/spree,HealthWave/spree,vinayvinsol/spree,StemboltHQ/spree,hoanghiep90/spree,azclick/spree,ahmetabdi/spree,piousbox/spree,PhoenixTeam/spree_phoenix,locomotivapro/spree,NerdsvilleCEO/spree,vinsol/spree,reidblomquist/spree,quentinuys/spree,reidblomquist/spree,pjmj777/spree,builtbybuffalo/spree,grzlus/spree,judaro13/spree-fork,lsirivong/solidus,Engeltj/spree,cutefrank/spree,surfdome/spree,ahmetabdi/spree,mleglise/spree,lyzxsc/spree,tesserakt/clean_spree,pjmj777/spree,hoanghiep90/spree,assembledbrands/spree,locomotivapro/spree,pulkit21/spree,sliaquat/spree,APohio/spree,urimikhli/spree,jordan-brough/spree,gautamsawhney/spree,trigrass2/spree,lyzxsc/spree,hoanghiep90/spree,mindvolt/spree,vmatekole/spree,vinsol/spree,lsirivong/solidus,hoanghiep90/spree,camelmasa/spree,yiqing95/spree,joanblake/spree,grzlus/spree,jordan-brough/spree,derekluo/spree,useiichi/spree,Kagetsuki/spree,zamiang/spree,pulkit21/spree,Boomkat/spree,softr8/spree,APohio/spree,FadliKun/spree,robodisco/spree,madetech/spree,PhoenixTeam/spree_phoenix,adaddeo/spree,AgilTec/spree,lsirivong/spree,beni55/spree,TrialGuides/spree,patdec/spree,ramkumar-kr/spree,berkes/spree,Lostmyname/spree,archSeer/spree,sfcgeorge/spree,tancnle/spree,trigrass2/spree,dandanwei/spree,omarsar/spree,quentinuys/spree,patdec/spree,njerrywerry/spree,vinsol/spree,carlesjove/spree,karlitxo/spree,edgward/spree,alvinjean/spree,welitonfreitas/spree,watg/spree,dandanwei/spree,ayb/spree,tomash/spree,Migweld/spree,alejandromangione/spree,sunny2601/spree,omarsar/spree,pervino/spree,lsirivong/spree,welitonfreitas/spree,dotandbo/spree,radarseesradar/spree,keatonrow/spree,njerrywerry/spree,caiqinghua/spree,rbngzlv/spree,CJMrozek/spree,edgward/spree,PhoenixTeam/spree_phoenix,codesavvy/sandbox,siddharth28/spree,biagidp/spree,zaeznet/spree,omarsar/spree,codesavvy/sandbox,camelmasa/spree,jspizziri/spree,reinaris/spree,pervino/solidus,zamiang/spree,APohio/spree,sunny2601/spree,useiichi/spree,vcavallo/spree,vulk/spree,progsri/spree,odk211/spree,vmatekole/spree,hifly/spree,Machpowersystems/spree_mach,camelmasa/spree,JuandGirald/spree,DynamoMTL/spree,forkata/solidus,jhawthorn/spree,dotandbo/spree,radarseesradar/spree,degica/spree,bjornlinder/Spree,moneyspyder/spree,miyazawatomoka/spree,berkes/spree,delphsoft/spree-store-ballchair,ckk-scratch/solidus,ahmetabdi/spree,shekibobo/spree,tancnle/spree,derekluo/spree,cutefrank/spree,priyank-gupta/spree,shekibobo/spree,Ropeney/spree,assembledbrands/spree,devilcoders/solidus,jasonfb/spree,Migweld/spree,yomishra/pce,grzlus/solidus,dotandbo/spree,jsurdilla/solidus,Ropeney/spree,ckk-scratch/solidus,caiqinghua/spree,CiscoCloud/spree,brchristian/spree,thogg4/spree,jimblesm/spree,alepore/spree,quentinuys/spree,pervino/solidus,vcavallo/spree,sfcgeorge/spree,Boomkat/spree,jeffboulet/spree,shioyama/spree,reinaris/spree,JuandGirald/spree,Hates/spree,SadTreeFriends/spree,zamiang/spree,DynamoMTL/spree,jordan-brough/solidus,delphsoft/spree-store-ballchair,odk211/spree,Kagetsuki/spree,grzlus/solidus,Mayvenn/spree,zaeznet/spree,softr8/spree,orenf/spree,PhoenixTeam/spree_phoenix,dafontaine/spree,miyazawatomoka/spree,freerunningtech/spree,progsri/spree,nooysters/spree,reinaris/spree,madetech/spree,sfcgeorge/spree,jhawthorn/spree,adaddeo/spree,shioyama/spree,DarkoP/spree,karlitxo/spree,tailic/spree,richardnuno/solidus,judaro13/spree-fork,knuepwebdev/FloatTubeRodHolders,gregoryrikson/spree-sample,robodisco/spree,pulkit21/spree,sliaquat/spree,Antdesk/karpal-spree,Arpsara/solidus,hifly/spree,AgilTec/spree,JuandGirald/spree,grzlus/solidus,softr8/spree,ramkumar-kr/spree,agient/agientstorefront,yushine/spree,fahidnasir/spree,net2b/spree,JDutil/spree,Engeltj/spree,jhawthorn/spree,shaywood2/spree,maybii/spree,pulkit21/spree,LBRapid/spree,scottcrawford03/solidus,richardnuno/solidus,vinayvinsol/spree,azclick/spree,caiqinghua/spree,pjmj777/spree,biagidp/spree,softr8/spree,Antdesk/karpal-spree,Senjai/solidus,shekibobo/spree,azranel/spree,surfdome/spree,volpejoaquin/spree,kitwalker12/spree,gautamsawhney/spree,delphsoft/spree-store-ballchair,miyazawatomoka/spree,lyzxsc/spree,Kagetsuki/spree,kewaunited/spree,firman/spree,CiscoCloud/spree,rakibulislam/spree,rajeevriitm/spree,Migweld/spree,alepore/spree,yushine/spree,JDutil/spree | ruby | ## Code Before:
module Spree
class PromotionRule < ActiveRecord::Base
belongs_to :promotion, :foreign_key => 'activator_id'
scope :of_type, lambda {|t| {:conditions => {:type => t}}}
validate :promotion, :presence => true
validate :unique_per_activator, :on => :create
def eligible?(order, options = {})
raise 'eligible? should be implemented in a sub-class of Promotion::PromotionRule'
end
private
def unique_per_activator
if Spree::PromotionRule.exists?(:activator_id => activator_id, :type => self.class.name)
errors[:base] << "Promotion already contains this rule type"
end
end
end
end
## Instruction:
Make preferred_operator and preferred_amount accessible attributes on PromotionRule
## Code After:
module Spree
class PromotionRule < ActiveRecord::Base
belongs_to :promotion, :foreign_key => 'activator_id'
scope :of_type, lambda {|t| {:conditions => {:type => t}}}
validate :promotion, :presence => true
validate :unique_per_activator, :on => :create
attr_accessible :preferred_operator, :preferred_amount
def eligible?(order, options = {})
raise 'eligible? should be implemented in a sub-class of Promotion::PromotionRule'
end
private
def unique_per_activator
if Spree::PromotionRule.exists?(:activator_id => activator_id, :type => self.class.name)
errors[:base] << "Promotion already contains this rule type"
end
end
end
end
| module Spree
class PromotionRule < ActiveRecord::Base
belongs_to :promotion, :foreign_key => 'activator_id'
scope :of_type, lambda {|t| {:conditions => {:type => t}}}
validate :promotion, :presence => true
validate :unique_per_activator, :on => :create
+
+ attr_accessible :preferred_operator, :preferred_amount
def eligible?(order, options = {})
raise 'eligible? should be implemented in a sub-class of Promotion::PromotionRule'
end
private
def unique_per_activator
if Spree::PromotionRule.exists?(:activator_id => activator_id, :type => self.class.name)
errors[:base] << "Promotion already contains this rule type"
end
end
end
end | 2 | 0.090909 | 2 | 0 |
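The record above whitelists `preferred_operator` and `preferred_amount` for mass assignment via Rails' `attr_accessible`. A minimal Python sketch of the same pattern — a constructor that copies only whitelisted keys from untrusted params (class and attribute names mirror the record, but this is an illustration, not Spree's implementation):

```python
class PromotionRule:
    # Whitelist mirroring attr_accessible: only these keys may be
    # mass-assigned from untrusted request params.
    ACCESSIBLE = {"preferred_operator", "preferred_amount"}

    def __init__(self, **params):
        self.preferred_operator = None
        self.preferred_amount = None
        self.assign_attributes(params)

    def assign_attributes(self, params):
        # Silently drop any key that is not whitelisted, so a request
        # cannot overwrite protected fields such as an id or type.
        for key, value in params.items():
            if key in self.ACCESSIBLE:
                setattr(self, key, value)


rule = PromotionRule(preferred_operator="gte", preferred_amount=50, id=999)
```

Here the hostile `id=999` param is dropped while the two whitelisted attributes are assigned.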
5a2890e09a74f4e4f1b62b626db150a83d9a304c | packages/components/components/button/Button.tsx | packages/components/components/button/Button.tsx | import React from 'react';
import ButtonLike, { ButtonLikeProps } from './ButtonLike';
export interface ButtonProps extends Omit<ButtonLikeProps<'button'>, 'as' | 'ref'> {}
const Button = (props: ButtonProps, ref: React.Ref<HTMLButtonElement>) => {
return <ButtonLike type="button" ref={ref} {...props} as="button" />;
};
export default React.forwardRef<HTMLButtonElement, ButtonProps>(Button);
| import React from 'react';
import ButtonLike, { ButtonLikeProps } from './ButtonLike';
export interface ButtonProps extends Omit<ButtonLikeProps<'button'>, 'as' | 'ref'> {}
const Button = React.forwardRef<HTMLButtonElement, ButtonProps>(
(props: ButtonProps, ref: React.Ref<HTMLButtonElement>) => {
return <ButtonLike type="button" ref={ref} {...props} as="button" />;
}
);
export default Button;
| Change usage of forwardRef to enable typescript docgen again | Change usage of forwardRef to enable typescript docgen again
| TypeScript | mit | ProtonMail/WebClient,ProtonMail/WebClient,ProtonMail/WebClient | typescript | ## Code Before:
import React from 'react';
import ButtonLike, { ButtonLikeProps } from './ButtonLike';
export interface ButtonProps extends Omit<ButtonLikeProps<'button'>, 'as' | 'ref'> {}
const Button = (props: ButtonProps, ref: React.Ref<HTMLButtonElement>) => {
return <ButtonLike type="button" ref={ref} {...props} as="button" />;
};
export default React.forwardRef<HTMLButtonElement, ButtonProps>(Button);
## Instruction:
Change usage of forwardRef to enable typescript docgen again
## Code After:
import React from 'react';
import ButtonLike, { ButtonLikeProps } from './ButtonLike';
export interface ButtonProps extends Omit<ButtonLikeProps<'button'>, 'as' | 'ref'> {}
const Button = React.forwardRef<HTMLButtonElement, ButtonProps>(
(props: ButtonProps, ref: React.Ref<HTMLButtonElement>) => {
return <ButtonLike type="button" ref={ref} {...props} as="button" />;
}
);
export default Button;
| import React from 'react';
import ButtonLike, { ButtonLikeProps } from './ButtonLike';
export interface ButtonProps extends Omit<ButtonLikeProps<'button'>, 'as' | 'ref'> {}
+ const Button = React.forwardRef<HTMLButtonElement, ButtonProps>(
- const Button = (props: ButtonProps, ref: React.Ref<HTMLButtonElement>) => {
? ----- ------ ^
+ (props: ButtonProps, ref: React.Ref<HTMLButtonElement>) => {
? ^
- return <ButtonLike type="button" ref={ref} {...props} as="button" />;
+ return <ButtonLike type="button" ref={ref} {...props} as="button" />;
? ++++
- };
+ }
+ );
- export default React.forwardRef<HTMLButtonElement, ButtonProps>(Button);
+ export default Button; | 10 | 0.909091 | 6 | 4 |
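The React change above moves the component definition inside `React.forwardRef` so the docgen tool can still see the component's name and props. The general problem — a wrapper hiding metadata from documentation tooling — has a close Python analogue in `functools.wraps` (an analogy only, not a port of the TSX code; `forward_ref` is a made-up name):

```python
import functools


def forward_ref(fn):
    """Toy higher-order wrapper standing in for React.forwardRef."""
    @functools.wraps(fn)  # copy __name__/__doc__ so doc tools still see fn
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)
    return wrapper


@forward_ref
def button(label):
    """Render a button element."""
    return f"<button>{label}</button>"
```

Without `functools.wraps`, `button.__name__` would report `wrapper` and its docstring would be lost — the same symptom the commit fixes for TypeScript docgen.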
4579736e8cca4d7850bafde325e0ec825106e1f5 | files/name.ridgewell.unlock.sh | files/name.ridgewell.unlock.sh | echo "Checking if volume is locked."
if [ "`diskutil cs list | grep Locked`" ]
then
PASSWORD=`security 2>&1 >/dev/null find -gs name.ridgewell.unlock "/Library/Keychains/System.keychain" \
| ruby -e 'print $1 if STDIN.gets =~ /^password: "(.*)"$/'`
echo "Unlocking volume."
diskutil cs unlockVolume 32585426-99F0-4991-965C-DB5C809400B5 -passphrase $PASSWORD
fi
| echo "Checking if volume is locked."
if [ "`diskutil cs list | grep Locked`" ]
then
PASSWORD=`security 2>&1 >/dev/null find -gs name.ridgewell.unlock "/Library/Keychains/System.keychain" | cut -d '"' -f 2`
echo "Unlocking volume."
diskutil cs unlockVolume 32585426-99F0-4991-965C-DB5C809400B5 -passphrase $PASSWORD
fi
| Use cut instead of ruby. | Use cut instead of ruby. | Shell | mit | jridgewell/Unlock | shell | ## Code Before:
echo "Checking if volume is locked."
if [ "`diskutil cs list | grep Locked`" ]
then
PASSWORD=`security 2>&1 >/dev/null find -gs name.ridgewell.unlock "/Library/Keychains/System.keychain" \
| ruby -e 'print $1 if STDIN.gets =~ /^password: "(.*)"$/'`
echo "Unlocking volume."
diskutil cs unlockVolume 32585426-99F0-4991-965C-DB5C809400B5 -passphrase $PASSWORD
fi
## Instruction:
Use cut instead of ruby.
## Code After:
echo "Checking if volume is locked."
if [ "`diskutil cs list | grep Locked`" ]
then
PASSWORD=`security 2>&1 >/dev/null find -gs name.ridgewell.unlock "/Library/Keychains/System.keychain" | cut -d '"' -f 2`
echo "Unlocking volume."
diskutil cs unlockVolume 32585426-99F0-4991-965C-DB5C809400B5 -passphrase $PASSWORD
fi
| echo "Checking if volume is locked."
if [ "`diskutil cs list | grep Locked`" ]
then
- PASSWORD=`security 2>&1 >/dev/null find -gs name.ridgewell.unlock "/Library/Keychains/System.keychain" \
? ^
+ PASSWORD=`security 2>&1 >/dev/null find -gs name.ridgewell.unlock "/Library/Keychains/System.keychain" | cut -d '"' -f 2`
? ^^^^^^^^^^^^^^^^^^
- | ruby -e 'print $1 if STDIN.gets =~ /^password: "(.*)"$/'`
echo "Unlocking volume."
diskutil cs unlockVolume 32585426-99F0-4991-965C-DB5C809400B5 -passphrase $PASSWORD
fi | 3 | 0.375 | 1 | 2 |
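The diff above replaces a Ruby regex one-liner with `cut -d '"' -f 2` to pull the quoted passphrase out of `security`'s output. The equivalent field-splitting in Python (the sample output below is made up for illustration; real `security` output may differ):

```python
def extract_password(security_output: str) -> str:
    # security prints a line like:  password: "s3cret"
    # Splitting on double quotes and taking index 1 mirrors
    # the shell pipeline:  cut -d '"' -f 2
    for line in security_output.splitlines():
        if line.startswith("password:"):
            return line.split('"')[1]
    raise ValueError("no password line found")


sample = 'keychain: "/Library/Keychains/System.keychain"\npassword: "hunter2"'
```

Like `cut`, this is simpler than a regex but assumes the passphrase itself contains no double quote.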
474fa603da793174c3dfcfc90378a98b88a40e5e | CMakeLists.txt | CMakeLists.txt | cmake_minimum_required (VERSION 3.2)
project (libcpputil)
set (CMAKE_CXX_STANDARD 14)
set(include-dir ${CMAKE_CURRENT_SOURCE_DIR}/include)
set(test-dir ${CMAKE_CURRENT_SOURCE_DIR}/tests)
set(test-sources
${test-dir}/tuple_for_each.cxx
${test-dir}/tuple_at.cxx
${test-dir}/call_seq.cxx
${test-dir}/curl.cxx
${test-dir}/make_ref_tuple.cxx
${test-dir}/argv.cxx
${test-dir}/for_each_param.cxx
${test-dir}/stack_allocator.cxx
${test-dir}/bitview.cxx
${test-dir}/int_type.cxx
${test-dir}/sizeof_tuple.cxx
${test-dir}/marshall.cxx
)
add_executable(tests ${test-sources})
target_include_directories(tests PRIVATE ${tests} ${include-dir})
target_link_libraries(tests curl)
include(CTest)
add_test(
NAME "unit tests"
COMMAND tests)
add_subdirectory(examples)
| cmake_minimum_required (VERSION 3.2)
project (libcpputil)
set (CMAKE_CXX_STANDARD 14)
option (CONFIG_TESTS "build unit tests" OFF)
option (CONFIG_EXAMPLES "build examples" OFF)
set(include-dir ${CMAKE_CURRENT_SOURCE_DIR}/include)
set(test-dir ${CMAKE_CURRENT_SOURCE_DIR}/tests)
set(test-sources
${test-dir}/tuple_for_each.cxx
${test-dir}/tuple_at.cxx
${test-dir}/call_seq.cxx
${test-dir}/curl.cxx
${test-dir}/make_ref_tuple.cxx
${test-dir}/argv.cxx
${test-dir}/for_each_param.cxx
${test-dir}/stack_allocator.cxx
${test-dir}/bitview.cxx
${test-dir}/int_type.cxx
${test-dir}/sizeof_tuple.cxx
${test-dir}/marshall.cxx
)
if (CONFIG_TESTS)
add_executable(tests ${test-sources})
target_include_directories(tests PRIVATE ${tests} ${include-dir})
target_link_libraries(tests curl)
include(CTest)
add_test(
NAME "unit tests"
COMMAND tests)
endif (CONFIG_TESTS)
if (CONFIG_EXAMPLES)
add_subdirectory(examples)
endif (CONFIG_EXAMPLES)
| Build tests and examples conditionally | Build tests and examples conditionally
| Text | mit | lukaszgemborowski/cpptoolbox | text | ## Code Before:
cmake_minimum_required (VERSION 3.2)
project (libcpputil)
set (CMAKE_CXX_STANDARD 14)
set(include-dir ${CMAKE_CURRENT_SOURCE_DIR}/include)
set(test-dir ${CMAKE_CURRENT_SOURCE_DIR}/tests)
set(test-sources
${test-dir}/tuple_for_each.cxx
${test-dir}/tuple_at.cxx
${test-dir}/call_seq.cxx
${test-dir}/curl.cxx
${test-dir}/make_ref_tuple.cxx
${test-dir}/argv.cxx
${test-dir}/for_each_param.cxx
${test-dir}/stack_allocator.cxx
${test-dir}/bitview.cxx
${test-dir}/int_type.cxx
${test-dir}/sizeof_tuple.cxx
${test-dir}/marshall.cxx
)
add_executable(tests ${test-sources})
target_include_directories(tests PRIVATE ${tests} ${include-dir})
target_link_libraries(tests curl)
include(CTest)
add_test(
NAME "unit tests"
COMMAND tests)
add_subdirectory(examples)
## Instruction:
Build tests and examples conditionally
## Code After:
cmake_minimum_required (VERSION 3.2)
project (libcpputil)
set (CMAKE_CXX_STANDARD 14)
option (CONFIG_TESTS "build unit tests" OFF)
option (CONFIG_EXAMPLES "build examples" OFF)
set(include-dir ${CMAKE_CURRENT_SOURCE_DIR}/include)
set(test-dir ${CMAKE_CURRENT_SOURCE_DIR}/tests)
set(test-sources
${test-dir}/tuple_for_each.cxx
${test-dir}/tuple_at.cxx
${test-dir}/call_seq.cxx
${test-dir}/curl.cxx
${test-dir}/make_ref_tuple.cxx
${test-dir}/argv.cxx
${test-dir}/for_each_param.cxx
${test-dir}/stack_allocator.cxx
${test-dir}/bitview.cxx
${test-dir}/int_type.cxx
${test-dir}/sizeof_tuple.cxx
${test-dir}/marshall.cxx
)
if (CONFIG_TESTS)
add_executable(tests ${test-sources})
target_include_directories(tests PRIVATE ${tests} ${include-dir})
target_link_libraries(tests curl)
include(CTest)
add_test(
NAME "unit tests"
COMMAND tests)
endif (CONFIG_TESTS)
if (CONFIG_EXAMPLES)
add_subdirectory(examples)
endif (CONFIG_EXAMPLES)
| cmake_minimum_required (VERSION 3.2)
project (libcpputil)
set (CMAKE_CXX_STANDARD 14)
+
+ option (CONFIG_TESTS "build unit tests" OFF)
+ option (CONFIG_EXAMPLES "build examples" OFF)
set(include-dir ${CMAKE_CURRENT_SOURCE_DIR}/include)
set(test-dir ${CMAKE_CURRENT_SOURCE_DIR}/tests)
set(test-sources
${test-dir}/tuple_for_each.cxx
${test-dir}/tuple_at.cxx
${test-dir}/call_seq.cxx
${test-dir}/curl.cxx
${test-dir}/make_ref_tuple.cxx
${test-dir}/argv.cxx
${test-dir}/for_each_param.cxx
${test-dir}/stack_allocator.cxx
${test-dir}/bitview.cxx
${test-dir}/int_type.cxx
${test-dir}/sizeof_tuple.cxx
${test-dir}/marshall.cxx
)
+ if (CONFIG_TESTS)
- add_executable(tests ${test-sources})
+ add_executable(tests ${test-sources})
? ++++
- target_include_directories(tests PRIVATE ${tests} ${include-dir})
+ target_include_directories(tests PRIVATE ${tests} ${include-dir})
? ++++
- target_link_libraries(tests curl)
+ target_link_libraries(tests curl)
? ++++
- include(CTest)
+ include(CTest)
? ++++
- add_test(
+ add_test(
? ++++
NAME "unit tests"
COMMAND tests)
+ endif (CONFIG_TESTS)
+ if (CONFIG_EXAMPLES)
- add_subdirectory(examples)
+ add_subdirectory(examples)
? ++++
+ endif (CONFIG_EXAMPLES) | 19 | 0.612903 | 13 | 6 |
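The CMake change gates tests and examples behind `option()` flags that default to `OFF`. A sketch of the same configure-time switch in plain Python (hypothetical target names; only the on/off-by-default behavior is being illustrated):

```python
def plan_build(config_tests: bool = False, config_examples: bool = False):
    # Mirror of CMake's option(... OFF): both optional components
    # default to off and join the build plan only when enabled.
    targets = ["libcpputil"]  # always built
    if config_tests:
        targets.append("tests")
    if config_examples:
        targets.append("examples")
    return targets
```

A bare configure builds only the library; CI can opt in with both flags, matching `-DCONFIG_TESTS=ON -DCONFIG_EXAMPLES=ON`.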
700fc7227e3cbdb73af3ce876107d4d1720641d1 | app/views/admin/settings/_notification.html.slim | app/views/admin/settings/_notification.html.slim | li id="notification-#{notification.id}"
span.date-time-text
- if !notification.sent?
' Scheduled to be sent on
- else
' Already sent on
span.trigger-at
= notification.formatted_trigger_time
span.actions
= link_to "Edit", "#", class: "edit-notification if-no-js-hide"
- if policy(notification).destroy?
span.if-no-js-hide
' |
= button_to "Delete", admin_settings_email_notification_path(notification), { onclick: "return confirm('Are you sure?')", method: :delete, remote: true }
.notification-edit-form.well
= simple_form_for notification, url: admin_settings_email_notification_path(notification, year: params[:year]), remote: true, authenticity_token: true do |f|
.control-date
label.control-label Edit schedule
= f.input :trigger_at,
as: :separated_date_time,
input_html: { id: '' }
.control-action
= link_to "Cancel", "#", class: "btn btn-default btn-cancel if-no-js-hide"
= f.submit "Save", class: "btn btn-primary btn-submit"
.clear
| li id="notification-#{notification.id}"
span.date-time-text
- if !notification.sent?
' Scheduled to be sent on
- else
' Already sent on
span.trigger-at
= notification.formatted_trigger_time
- if !notification.sent?
span.actions
= link_to "Edit", "#", class: "edit-notification if-no-js-hide"
- if policy(notification).destroy?
span.if-no-js-hide
' |
= button_to "Delete", admin_settings_email_notification_path(notification), { onclick: "return confirm('Are you sure?')", method: :delete, remote: true }
.notification-edit-form.well
= simple_form_for notification, url: admin_settings_email_notification_path(notification, year: params[:year]), remote: true, authenticity_token: true do |f|
.control-date
label.control-label Edit schedule
= f.input :trigger_at,
as: :separated_date_time,
input_html: { id: '' }
.control-action
= link_to "Cancel", "#", class: "btn btn-default btn-cancel if-no-js-hide"
= f.submit "Save", class: "btn btn-primary btn-submit"
.clear
| Hide settings email date's edit and delete links when email sent | Hide settings email date's edit and delete links when email sent
| Slim | mit | bitzesty/qae,bitzesty/qae,bitzesty/qae,bitzesty/qae | slim | ## Code Before:
li id="notification-#{notification.id}"
span.date-time-text
- if !notification.sent?
' Scheduled to be sent on
- else
' Already sent on
span.trigger-at
= notification.formatted_trigger_time
span.actions
= link_to "Edit", "#", class: "edit-notification if-no-js-hide"
- if policy(notification).destroy?
span.if-no-js-hide
' |
= button_to "Delete", admin_settings_email_notification_path(notification), { onclick: "return confirm('Are you sure?')", method: :delete, remote: true }
.notification-edit-form.well
= simple_form_for notification, url: admin_settings_email_notification_path(notification, year: params[:year]), remote: true, authenticity_token: true do |f|
.control-date
label.control-label Edit schedule
= f.input :trigger_at,
as: :separated_date_time,
input_html: { id: '' }
.control-action
= link_to "Cancel", "#", class: "btn btn-default btn-cancel if-no-js-hide"
= f.submit "Save", class: "btn btn-primary btn-submit"
.clear
## Instruction:
Hide settings email date's edit and delete links when email sent
## Code After:
li id="notification-#{notification.id}"
span.date-time-text
- if !notification.sent?
' Scheduled to be sent on
- else
' Already sent on
span.trigger-at
= notification.formatted_trigger_time
- if !notification.sent?
span.actions
= link_to "Edit", "#", class: "edit-notification if-no-js-hide"
- if policy(notification).destroy?
span.if-no-js-hide
' |
= button_to "Delete", admin_settings_email_notification_path(notification), { onclick: "return confirm('Are you sure?')", method: :delete, remote: true }
.notification-edit-form.well
= simple_form_for notification, url: admin_settings_email_notification_path(notification, year: params[:year]), remote: true, authenticity_token: true do |f|
.control-date
label.control-label Edit schedule
= f.input :trigger_at,
as: :separated_date_time,
input_html: { id: '' }
.control-action
= link_to "Cancel", "#", class: "btn btn-default btn-cancel if-no-js-hide"
= f.submit "Save", class: "btn btn-primary btn-submit"
.clear
| li id="notification-#{notification.id}"
span.date-time-text
- if !notification.sent?
' Scheduled to be sent on
- else
' Already sent on
span.trigger-at
= notification.formatted_trigger_time
+ - if !notification.sent?
- span.actions
+ span.actions
? ++
- = link_to "Edit", "#", class: "edit-notification if-no-js-hide"
+ = link_to "Edit", "#", class: "edit-notification if-no-js-hide"
? ++
- - if policy(notification).destroy?
+ - if policy(notification).destroy?
? ++
- span.if-no-js-hide
+ span.if-no-js-hide
? ++
- ' |
+ ' |
? ++
- = button_to "Delete", admin_settings_email_notification_path(notification), { onclick: "return confirm('Are you sure?')", method: :delete, remote: true }
+ = button_to "Delete", admin_settings_email_notification_path(notification), { onclick: "return confirm('Are you sure?')", method: :delete, remote: true }
? ++
- .notification-edit-form.well
+ .notification-edit-form.well
? ++
- = simple_form_for notification, url: admin_settings_email_notification_path(notification, year: params[:year]), remote: true, authenticity_token: true do |f|
+ = simple_form_for notification, url: admin_settings_email_notification_path(notification, year: params[:year]), remote: true, authenticity_token: true do |f|
? ++
- .control-date
+ .control-date
? ++
- label.control-label Edit schedule
+ label.control-label Edit schedule
? ++
- = f.input :trigger_at,
+ = f.input :trigger_at,
? ++
- as: :separated_date_time,
+ as: :separated_date_time,
? ++
- input_html: { id: '' }
+ input_html: { id: '' }
? ++
- .control-action
+ .control-action
? ++
- = link_to "Cancel", "#", class: "btn btn-default btn-cancel if-no-js-hide"
+ = link_to "Cancel", "#", class: "btn btn-default btn-cancel if-no-js-hide"
? ++
- = f.submit "Save", class: "btn btn-primary btn-submit"
+ = f.submit "Save", class: "btn btn-primary btn-submit"
? ++
- .clear
+ .clear
? ++
| 35 | 1.296296 | 18 | 17 |
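The Slim change wraps the edit/delete actions in `- if !notification.sent?` so they disappear once the email has gone out, while the schedule line always renders. The same guard, sketched in Python with a simplified stand-in for the notification view:

```python
def render_notification(name: str, sent: bool) -> str:
    # Always show the schedule line; show the action links only while
    # the notification has not been sent yet (mirrors the Slim guard).
    status = "Already sent" if sent else "Scheduled to be sent"
    lines = [f"{status}: {name}"]
    if not sent:
        lines.append("[Edit] | [Delete]")
    return "\n".join(lines)
```

Sent notifications keep their status line but lose the mutating links, which is the whole point of the commit.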
732045eea9392357517b1c927bc9345d3af478ac | pvpython_setup.sh | pvpython_setup.sh |
die () {
echo >&2 "$@"
exit 1
}
[ "$#" -eq 1 ] || die "Please provide path to pvpython"
echo "Installing requests ..."
REQUESTS_DIR=`mktemp -d`
wget --no-check-certificate https://github.com/kennethreitz/requests/tarball/v2.8.1 -O - | tar xz -C $REQUESTS_DIR
pushd .
cd $REQUESTS_DIR/*requests*
$1 setup.py install
popd
rm -rf $REQUESTS_DIR
echo "Installing requests-toolbelt ..."
# Install setuptools
wget https://bootstrap.pypa.io/ez_setup.py -O - | $1
REQUESTS_TOOLBELT_DIR=`mktemp -d`
wget --no-check-certificate https://github.com/sigmavirus24/requests-toolbelt/tarball/0.4.0 -O - | tar xz -C $REQUESTS_TOOLBELT_DIR
pushd .
cd $REQUESTS_TOOLBELT_DIR/*requests-toolbelt*
$1 setup.py install
popd
rm -rf $REQUESTS_TOOLBELT_DIR
|
die () {
echo >&2 "$@"
exit 1
}
[ "$#" -eq 1 ] || die "Please provide path to pvpython"
echo "Installing requests ..."
REQUESTS_DIR=`mktemp -d`
curl -Lk https://github.com/kennethreitz/requests/tarball/v2.8.1 -o - | tar xz -C $REQUESTS_DIR
pushd .
cd $REQUESTS_DIR/*requests*
$1 setup.py install
popd
rm -rf $REQUESTS_DIR
echo "Installing requests-toolbelt ..."
# Install setuptools
wget https://bootstrap.pypa.io/ez_setup.py -O - | $1
REQUESTS_TOOLBELT_DIR=`mktemp -d`
curl -Lk https://github.com/sigmavirus24/requests-toolbelt/tarball/0.4.0 -o - | tar xz -C $REQUESTS_TOOLBELT_DIR
pushd .
cd $REQUESTS_TOOLBELT_DIR/*requests-toolbelt*
$1 setup.py install
popd
rm -rf $REQUESTS_TOOLBELT_DIR
| Use curl to download file instead of wget | Use curl to download file instead of wget
| Shell | apache-2.0 | Kitware/HPCCloud-deploy,Kitware/HPCCloud-deploy,Kitware/HPCCloud-deploy | shell | ## Code Before:
die () {
echo >&2 "$@"
exit 1
}
[ "$#" -eq 1 ] || die "Please provide path to pvpython"
echo "Installing requests ..."
REQUESTS_DIR=`mktemp -d`
wget --no-check-certificate https://github.com/kennethreitz/requests/tarball/v2.8.1 -O - | tar xz -C $REQUESTS_DIR
pushd .
cd $REQUESTS_DIR/*requests*
$1 setup.py install
popd
rm -rf $REQUESTS_DIR
echo "Installing requests-toolbelt ..."
# Install setuptools
wget https://bootstrap.pypa.io/ez_setup.py -O - | $1
REQUESTS_TOOLBELT_DIR=`mktemp -d`
wget --no-check-certificate https://github.com/sigmavirus24/requests-toolbelt/tarball/0.4.0 -O - | tar xz -C $REQUESTS_TOOLBELT_DIR
pushd .
cd $REQUESTS_TOOLBELT_DIR/*requests-toolbelt*
$1 setup.py install
popd
rm -rf $REQUESTS_TOOLBELT_DIR
## Instruction:
Use curl to download file instead of wget
## Code After:
die () {
echo >&2 "$@"
exit 1
}
[ "$#" -eq 1 ] || die "Please provide path to pvpython"
echo "Installing requests ..."
REQUESTS_DIR=`mktemp -d`
curl -Lk https://github.com/kennethreitz/requests/tarball/v2.8.1 -o - | tar xz -C $REQUESTS_DIR
pushd .
cd $REQUESTS_DIR/*requests*
$1 setup.py install
popd
rm -rf $REQUESTS_DIR
echo "Installing requests-toolbelt ..."
# Install setuptools
wget https://bootstrap.pypa.io/ez_setup.py -O - | $1
REQUESTS_TOOLBELT_DIR=`mktemp -d`
curl -Lk https://github.com/sigmavirus24/requests-toolbelt/tarball/0.4.0 -o - | tar xz -C $REQUESTS_TOOLBELT_DIR
pushd .
cd $REQUESTS_TOOLBELT_DIR/*requests-toolbelt*
$1 setup.py install
popd
rm -rf $REQUESTS_TOOLBELT_DIR
|
die () {
echo >&2 "$@"
exit 1
}
[ "$#" -eq 1 ] || die "Please provide path to pvpython"
echo "Installing requests ..."
REQUESTS_DIR=`mktemp -d`
- wget --no-check-certificate https://github.com/kennethreitz/requests/tarball/v2.8.1 -O - | tar xz -C $REQUESTS_DIR
? ^^^^ ^^^^^^^^ ------------ ^
+ curl -Lk https://github.com/kennethreitz/requests/tarball/v2.8.1 -o - | tar xz -C $REQUESTS_DIR
? ^^^^ ^ ^
pushd .
cd $REQUESTS_DIR/*requests*
$1 setup.py install
popd
rm -rf $REQUESTS_DIR
echo "Installing requests-toolbelt ..."
# Install setuptools
wget https://bootstrap.pypa.io/ez_setup.py -O - | $1
REQUESTS_TOOLBELT_DIR=`mktemp -d`
- wget --no-check-certificate https://github.com/sigmavirus24/requests-toolbelt/tarball/0.4.0 -O - | tar xz -C $REQUESTS_TOOLBELT_DIR
? ^^^^ ^^^^^^^^ ------------ ^
+ curl -Lk https://github.com/sigmavirus24/requests-toolbelt/tarball/0.4.0 -o - | tar xz -C $REQUESTS_TOOLBELT_DIR
? ^^^^ ^ ^
pushd .
cd $REQUESTS_TOOLBELT_DIR/*requests-toolbelt*
$1 setup.py install
popd
rm -rf $REQUESTS_TOOLBELT_DIR
| 4 | 0.125 | 2 | 2 |
ed47f4707b9907fcbc81ca922aff87821b47affb | app/controllers/features_controller.rb | app/controllers/features_controller.rb | class FeaturesController < ApplicationController
before_filter :admin_required
before_filter :find_feature, :only => [:edit, :destroy, :update]
def new
@feature = Feature.new
@feature.product_id = params[:product_id]
redirect_to products_path and return unless product_valid?
@feature.required = true
end
def create
@feature = Feature.new(params[:feature])
redirect_to products_path and return unless product_valid?
if @feature.save
flash[:notice] = "Successfully added new feature."
redirect_to product_path(@feature.product)
else
render :action => "new"
end
end
def update
if @feature.update_attributes(params[:feature])
flash[:notice] = "Feature successfully updated."
redirect_to product_path(@feature.product.id)
else
render :action => "edit"
end
end
def edit
end
def destroy
redirect_to products_path and return unless product_valid?
@feature.destroy
flash[:notice] = "Feature deleted."
redirect_to edit_product_path(@feature.product)
end
protected
def find_feature
@feature = Feature.find(params[:id])
end
def product_valid?
if Product.find_by_id_and_website_id(@feature.product_id, @w.id)
true
else
flash[:notice] = 'Invalid product.'
false
end
end
end
| class FeaturesController < ApplicationController
before_filter :admin_required
before_filter :find_feature, :only => [:edit, :destroy, :update]
def new
@feature = Feature.new
@feature.product_id = params[:product_id]
redirect_to products_path and return unless product_valid?
@feature.required = true
end
def create
@feature = Feature.new(params[:feature])
redirect_to products_path and return unless product_valid?
if @feature.save
flash[:notice] = "Successfully added new feature."
redirect_to product_path(@feature.product)
else
render :action => "new"
end
end
def update
if @feature.update_attributes(params[:feature])
flash[:notice] = "Feature successfully updated."
redirect_to product_path(@feature.product.id)
else
render :action => "edit"
end
end
def edit
end
def destroy
@feature.destroy
flash[:notice] = "Feature deleted."
redirect_to edit_product_path(@feature.product)
end
protected
def find_feature
@feature = Feature.find(params[:id])
redirect_to products_path and return unless product_valid?
end
def product_valid?
if Product.find_by_id_and_website_id(@feature.product_id, @w.id)
true
else
flash[:notice] = 'Invalid product.'
false
end
end
end
| Refactor product validity check for destroying features | Refactor product validity check for destroying features
| Ruby | mit | ianfleeton/zmey,ianfleeton/zmey,ianfleeton/zmey,ianfleeton/zmey | ruby | ## Code Before:
class FeaturesController < ApplicationController
before_filter :admin_required
before_filter :find_feature, :only => [:edit, :destroy, :update]
def new
@feature = Feature.new
@feature.product_id = params[:product_id]
redirect_to products_path and return unless product_valid?
@feature.required = true
end
def create
@feature = Feature.new(params[:feature])
redirect_to products_path and return unless product_valid?
if @feature.save
flash[:notice] = "Successfully added new feature."
redirect_to product_path(@feature.product)
else
render :action => "new"
end
end
def update
if @feature.update_attributes(params[:feature])
flash[:notice] = "Feature successfully updated."
redirect_to product_path(@feature.product.id)
else
render :action => "edit"
end
end
def edit
end
def destroy
redirect_to products_path and return unless product_valid?
@feature.destroy
flash[:notice] = "Feature deleted."
redirect_to edit_product_path(@feature.product)
end
protected
def find_feature
@feature = Feature.find(params[:id])
end
def product_valid?
if Product.find_by_id_and_website_id(@feature.product_id, @w.id)
true
else
flash[:notice] = 'Invalid product.'
false
end
end
end
## Instruction:
Refactor product validity check for destroying features
## Code After:
class FeaturesController < ApplicationController
before_filter :admin_required
before_filter :find_feature, :only => [:edit, :destroy, :update]
def new
@feature = Feature.new
@feature.product_id = params[:product_id]
redirect_to products_path and return unless product_valid?
@feature.required = true
end
def create
@feature = Feature.new(params[:feature])
redirect_to products_path and return unless product_valid?
if @feature.save
flash[:notice] = "Successfully added new feature."
redirect_to product_path(@feature.product)
else
render :action => "new"
end
end
def update
if @feature.update_attributes(params[:feature])
flash[:notice] = "Feature successfully updated."
redirect_to product_path(@feature.product.id)
else
render :action => "edit"
end
end
def edit
end
def destroy
@feature.destroy
flash[:notice] = "Feature deleted."
redirect_to edit_product_path(@feature.product)
end
protected
def find_feature
@feature = Feature.find(params[:id])
redirect_to products_path and return unless product_valid?
end
def product_valid?
if Product.find_by_id_and_website_id(@feature.product_id, @w.id)
true
else
flash[:notice] = 'Invalid product.'
false
end
end
end
| class FeaturesController < ApplicationController
before_filter :admin_required
before_filter :find_feature, :only => [:edit, :destroy, :update]
def new
@feature = Feature.new
@feature.product_id = params[:product_id]
redirect_to products_path and return unless product_valid?
@feature.required = true
end
def create
@feature = Feature.new(params[:feature])
redirect_to products_path and return unless product_valid?
if @feature.save
flash[:notice] = "Successfully added new feature."
redirect_to product_path(@feature.product)
else
render :action => "new"
end
end
def update
if @feature.update_attributes(params[:feature])
flash[:notice] = "Feature successfully updated."
redirect_to product_path(@feature.product.id)
else
render :action => "edit"
end
end
def edit
end
def destroy
- redirect_to products_path and return unless product_valid?
@feature.destroy
flash[:notice] = "Feature deleted."
redirect_to edit_product_path(@feature.product)
end
protected
def find_feature
@feature = Feature.find(params[:id])
+ redirect_to products_path and return unless product_valid?
end
def product_valid?
if Product.find_by_id_and_website_id(@feature.product_id, @w.id)
true
else
flash[:notice] = 'Invalid product.'
false
end
end
end | 2 | 0.035714 | 1 | 1 |
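The Rails refactor above moves the `product_valid?` guard out of the individual actions and into the `find_feature` before-filter, so every action that loads a feature gets the check for free. A minimal Python sketch of the same consolidation (the data and `destroy` shape are hypothetical stand-ins, not the Spree models):

```python
class InvalidProduct(Exception):
    pass


VALID_PRODUCT_IDS = {1, 2, 3}  # stand-in for this website's products


def find_feature(feature):
    # Single choke point: validate the product here once instead of
    # repeating the guard inside every controller action.
    if feature["product_id"] not in VALID_PRODUCT_IDS:
        raise InvalidProduct("Invalid product.")
    return feature


def destroy(feature):
    feature = find_feature(feature)  # the guard now runs automatically
    return f"deleted feature {feature['id']}"
```

Any new action that goes through `find_feature` inherits the check, which is why hoisting it into the finder is safer than per-action guards.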
412466f9d11f229d5a118028292c69c2b1b3814f | commands/say.js | commands/say.js | var l10n_file = __dirname + '/../l10n/commands/say.yml';
var l10n = require('../src/l10n')(l10n_file);
var CommandUtil = require('../src/command_util').CommandUtil;
exports.command = function(rooms, items, players, npcs, Commands) {
return function(args, player) {
if (args) {
player.sayL10n(l10n, 'YOU_SAY', args);
players.eachIf(function(p) {
otherPlayersInRoom(p);
}, function(p) {
p.sayL10n(l10n, 'THEY_SAY', player.getName(), args);
});
return;
}
player.sayL10n(l10n, 'NOTHING_SAID');
return;
}
function otherPlayersInRoom(p) {
if (p)
return (p.getName() !== player.getName() && p.getLocation() === player.getLocation());
};
}; | var l10n_file = __dirname + '/../l10n/commands/say.yml';
var l10n = require('../src/l10n')(l10n_file);
var CommandUtil = require('../src/command_util').CommandUtil;
exports.command = function(rooms, items, players, npcs, Commands) {
return function(args, player) {
if (args) {
player.sayL10n(l10n, 'YOU_SAY', args);
players.eachIf(function(p) {
return otherPlayersInRoom(p);
}, function(p) {
if (p.getName() != player.getName())
p.sayL10n(l10n, 'THEY_SAY', player.getName(), args);
});
return;
}
player.sayL10n(l10n, 'NOTHING_SAID');
return;
}
function otherPlayersInRoom(p) {
if (p)
return (p.getName() !== player.getName() && p.getLocation() === player.getLocation());
};
}; | Add conditional to avoid duplicate messages, fix EachIf block. | Add conditional to avoid duplicate messages, fix EachIf block.
| JavaScript | mit | seanohue/ranviermud,seanohue/ranviermud,shawncplus/ranviermud | javascript | ## Code Before:
var l10n_file = __dirname + '/../l10n/commands/say.yml';
var l10n = require('../src/l10n')(l10n_file);
var CommandUtil = require('../src/command_util').CommandUtil;
exports.command = function(rooms, items, players, npcs, Commands) {
return function(args, player) {
if (args) {
player.sayL10n(l10n, 'YOU_SAY', args);
players.eachIf(function(p) {
otherPlayersInRoom(p);
}, function(p) {
p.sayL10n(l10n, 'THEY_SAY', player.getName(), args);
});
return;
}
player.sayL10n(l10n, 'NOTHING_SAID');
return;
}
function otherPlayersInRoom(p) {
if (p)
return (p.getName() !== player.getName() && p.getLocation() === player.getLocation());
};
};
## Instruction:
Add conditional to avoid duplicate messages, fix EachIf block.
## Code After:
var l10n_file = __dirname + '/../l10n/commands/say.yml';
var l10n = require('../src/l10n')(l10n_file);
var CommandUtil = require('../src/command_util').CommandUtil;
exports.command = function(rooms, items, players, npcs, Commands) {
return function(args, player) {
if (args) {
player.sayL10n(l10n, 'YOU_SAY', args);
players.eachIf(function(p) {
return otherPlayersInRoom(p);
}, function(p) {
if (p.getName() != player.getName())
p.sayL10n(l10n, 'THEY_SAY', player.getName(), args);
});
return;
}
player.sayL10n(l10n, 'NOTHING_SAID');
return;
}
function otherPlayersInRoom(p) {
if (p)
return (p.getName() !== player.getName() && p.getLocation() === player.getLocation());
};
}; | var l10n_file = __dirname + '/../l10n/commands/say.yml';
var l10n = require('../src/l10n')(l10n_file);
var CommandUtil = require('../src/command_util').CommandUtil;
exports.command = function(rooms, items, players, npcs, Commands) {
return function(args, player) {
if (args) {
player.sayL10n(l10n, 'YOU_SAY', args);
players.eachIf(function(p) {
- otherPlayersInRoom(p);
+ return otherPlayersInRoom(p);
? +++++++
}, function(p) {
+ if (p.getName() != player.getName())
- p.sayL10n(l10n, 'THEY_SAY', player.getName(), args);
+ p.sayL10n(l10n, 'THEY_SAY', player.getName(), args);
? ++
});
return;
}
player.sayL10n(l10n, 'NOTHING_SAID');
return;
}
function otherPlayersInRoom(p) {
if (p)
return (p.getName() !== player.getName() && p.getLocation() === player.getLocation());
};
}; | 5 | 0.192308 | 3 | 2 |
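The fix in the record above hinges on the filter callback actually *returning* its boolean: `eachIf(filter, action)` runs the action only where the filter is truthy, and the buggy wrapper called `otherPlayersInRoom(p)` without returning its result, so every check came back `undefined`. A minimal Python sketch of the same filter/action pattern — `each_if` and all names here are hypothetical, not ranviermud's API:

```python
def each_if(items, predicate, action):
    """Run `action` only on items for which `predicate` is truthy."""
    for item in items:
        if predicate(item):
            action(item)

players = ["alice", "bob", "carol"]
speaker = "bob"
heard = []

# The predicate must return the comparison; dropping the return (as in
# the buggy version above) would make every check falsy and silence
# the message for everyone.
each_if(
    players,
    lambda p: p != speaker,     # filter: everyone but the speaker
    lambda p: heard.append(p),  # action: deliver the message
)
# heard == ["alice", "carol"]
```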
48b4229d23105486185a652a6b711c5b6d54dad5 | README.md | README.md | ZeroDB [http://www.zerodb.io/] is an end-to-end encrypted database.
Data can be stored on untrusted database servers without ever exposing the
encryption key. Clients can execute remote queries against the encrypted data
without downloading it or suffering an excessive performance hit.
### Technical white paper: [https://www.zerodb.io/zerodb.pdf](https://www.zerodb.io/zerodb.pdf)
### Documentation: [https://docs.zerodb.io/](https://docs.zerodb.io/)
### OpenSource Community: [https://slack.zerodb.io/](https://slack.zerodb.io/)
Based on ZODB
| ZeroDB [http://www.zerodb.io/] is an end-to-end encrypted database.
Data can be stored on untrusted database servers without ever exposing the
encryption key. Clients can execute remote queries against the encrypted data
without downloading it or suffering an excessive performance hit.
Special thanks to ZODB community on which ZeroDB is based.
### Technical white paper: [https://www.zerodb.io/zerodb.pdf](https://www.zerodb.io/zerodb.pdf)
### Documentation: [https://docs.zerodb.io/](https://docs.zerodb.io/)
### OpenSource Community: [https://slack.zerodb.io/](https://slack.zerodb.io/)
| Update the Readme file for cleanup | Update the Readme file for cleanup
Quick patch | Markdown | agpl-3.0 | zero-db/zerodb,zerodb/zerodb,zero-db/zerodb,zerodb/zerodb | markdown | ## Code Before:
ZeroDB [http://www.zerodb.io/] is an end-to-end encrypted database.
Data can be stored on untrusted database servers without ever exposing the
encryption key. Clients can execute remote queries against the encrypted data
without downloading it or suffering an excessive performance hit.
### Technical white paper: [https://www.zerodb.io/zerodb.pdf](https://www.zerodb.io/zerodb.pdf)
### Documentation: [https://docs.zerodb.io/](https://docs.zerodb.io/)
### OpenSource Community: [https://slack.zerodb.io/](https://slack.zerodb.io/)
Based on ZODB
## Instruction:
Update the Readme file for cleanup
Quick patch
## Code After:
ZeroDB [http://www.zerodb.io/] is an end-to-end encrypted database.
Data can be stored on untrusted database servers without ever exposing the
encryption key. Clients can execute remote queries against the encrypted data
without downloading it or suffering an excessive performance hit.
Special thanks to ZODB community on which ZeroDB is based.
### Technical white paper: [https://www.zerodb.io/zerodb.pdf](https://www.zerodb.io/zerodb.pdf)
### Documentation: [https://docs.zerodb.io/](https://docs.zerodb.io/)
### OpenSource Community: [https://slack.zerodb.io/](https://slack.zerodb.io/)
| ZeroDB [http://www.zerodb.io/] is an end-to-end encrypted database.
Data can be stored on untrusted database servers without ever exposing the
encryption key. Clients can execute remote queries against the encrypted data
without downloading it or suffering an excessive performance hit.
+
+ Special thanks to ZODB community on which ZeroDB is based.
### Technical white paper: [https://www.zerodb.io/zerodb.pdf](https://www.zerodb.io/zerodb.pdf)
### Documentation: [https://docs.zerodb.io/](https://docs.zerodb.io/)
### OpenSource Community: [https://slack.zerodb.io/](https://slack.zerodb.io/)
- Based on ZODB | 3 | 0.25 | 2 | 1 |
506767f581bc365a5f192cfc9e97c85abe93d964 | app/models/group_hierarchy.rb | app/models/group_hierarchy.rb | class GroupHierarchy
def initialize(root)
@root = root
end
def to_hash
export(@root)
end
private
def export(group)
{
name: group.name,
url: Rails.application.routes.url_helpers.group_path(group),
children: group.children.map { |g| export(g) }
}
end
end
| class GroupHierarchy
FIELDS = [:id, :name, :slug, :parent_id]
LiteGroup = Struct.new(*FIELDS)
def initialize(root)
@root = root
end
def to_hash
export(@root)
end
private
def export(group)
{
name: group.name,
url: group_path(group),
children: children(group).map { |g| export(g) }
}
end
def children(group)
lookup.select { |g| g.parent_id == group.id }.sort_by(&:name)
end
def group_path(lg)
Rails.application.routes.url_helpers.group_path(id: lg.slug)
end
def lookup
@lookup ||= Group.pluck(*FIELDS).map { |a| LiteGroup.new(*a) }
end
end
| Make group hierarchy generation efficient | Make group hierarchy generation efficient
It now uses one query instead of one for every group in the database.
| Ruby | mit | MjAbuz/peoplefinder,MjAbuz/peoplefinder,ministryofjustice/peoplefinder,heathd/moj_peoplefinder,ministryofjustice/peoplefinder,ministryofjustice/peoplefinder,MjAbuz/peoplefinder,ministryofjustice/peoplefinder,heathd/moj_peoplefinder,ministryofjustice/peoplefinder,MjAbuz/peoplefinder | ruby | ## Code Before:
class GroupHierarchy
def initialize(root)
@root = root
end
def to_hash
export(@root)
end
private
def export(group)
{
name: group.name,
url: Rails.application.routes.url_helpers.group_path(group),
children: group.children.map { |g| export(g) }
}
end
end
## Instruction:
Make group hierarchy generation efficient
It now uses one query instead of one for every group in the database.
## Code After:
class GroupHierarchy
FIELDS = [:id, :name, :slug, :parent_id]
LiteGroup = Struct.new(*FIELDS)
def initialize(root)
@root = root
end
def to_hash
export(@root)
end
private
def export(group)
{
name: group.name,
url: group_path(group),
children: children(group).map { |g| export(g) }
}
end
def children(group)
lookup.select { |g| g.parent_id == group.id }.sort_by(&:name)
end
def group_path(lg)
Rails.application.routes.url_helpers.group_path(id: lg.slug)
end
def lookup
@lookup ||= Group.pluck(*FIELDS).map { |a| LiteGroup.new(*a) }
end
end
| class GroupHierarchy
+ FIELDS = [:id, :name, :slug, :parent_id]
+ LiteGroup = Struct.new(*FIELDS)
+
def initialize(root)
@root = root
end
def to_hash
export(@root)
end
private
def export(group)
{
name: group.name,
- url: Rails.application.routes.url_helpers.group_path(group),
+ url: group_path(group),
- children: group.children.map { |g| export(g) }
? ------
+ children: children(group).map { |g| export(g) }
? +++++++
}
end
+
+ def children(group)
+ lookup.select { |g| g.parent_id == group.id }.sort_by(&:name)
+ end
+
+ def group_path(lg)
+ Rails.application.routes.url_helpers.group_path(id: lg.slug)
+ end
+
+ def lookup
+ @lookup ||= Group.pluck(*FIELDS).map { |a| LiteGroup.new(*a) }
+ end
end | 19 | 1 | 17 | 2 |
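The rewrite above trades one `group.children` query per node for a single bulk `pluck` plus an in-memory filter. The same one-query-then-filter idea can be sketched in Python, with plain tuples standing in for database rows (the row data below is invented for illustration; it is not the actual Rails model):

```python
from collections import namedtuple

LiteGroup = namedtuple("LiteGroup", ["id", "name", "slug", "parent_id"])

# Pretend this list came back from a single bulk query (Rails' `pluck`).
rows = [
    LiteGroup(1, "Root", "root", None),
    LiteGroup(2, "Beta", "beta", 1),
    LiteGroup(3, "Alpha", "alpha", 1),
    LiteGroup(4, "Leaf", "leaf", 3),
]

def children(group, lookup):
    """Filter the cached rows instead of issuing a query per node."""
    return sorted((g for g in lookup if g.parent_id == group.id),
                  key=lambda g: g.name)

def export(group, lookup):
    return {
        "name": group.name,
        "url": f"/groups/{group.slug}",
        "children": [export(g, lookup) for g in children(group, lookup)],
    }

tree = export(rows[0], rows)
# Children come back name-sorted, and no per-node query was needed:
# [c["name"] for c in tree["children"]] == ["Alpha", "Beta"]
```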
cbb68032f3452e63565a8f51032a22f1e18733a6 | .travis.yml | .travis.yml | language: python
python:
- "2.6"
- "2.7"
- "3.2"
- "3.3"
- "pypy"
install:
- pip install -r dev-requirements.txt
script:
- inv test
# Jinja2, used by Sphinx, does not work on Python 3.2 (but does on 3.3)
- "if [[ $TRAVIS_PYTHON_VERSION != '3.2' ]]; then inv docs; fi"
| language: python
python:
- "2.6"
- "2.7"
- "3.2"
- "3.3"
- "pypy"
install:
- pip install -r dev-requirements.txt
script:
# Primary test suite
- inv test
# Integration-level test suite (hard to reliably test full stack in-code :()
- inv test -o --tests=integration
# Jinja2, used by Sphinx, does not work on Python 3.2 (but does on 3.3)
- "if [[ $TRAVIS_PYTHON_VERSION != '3.2' ]]; then inv docs; fi"
| Add integration tests to Travis | Add integration tests to Travis
| YAML | bsd-2-clause | bitprophet/releases | yaml | ## Code Before:
language: python
python:
- "2.6"
- "2.7"
- "3.2"
- "3.3"
- "pypy"
install:
- pip install -r dev-requirements.txt
script:
- inv test
# Jinja2, used by Sphinx, does not work on Python 3.2 (but does on 3.3)
- "if [[ $TRAVIS_PYTHON_VERSION != '3.2' ]]; then inv docs; fi"
## Instruction:
Add integration tests to Travis
## Code After:
language: python
python:
- "2.6"
- "2.7"
- "3.2"
- "3.3"
- "pypy"
install:
- pip install -r dev-requirements.txt
script:
# Primary test suite
- inv test
# Integration-level test suite (hard to reliably test full stack in-code :()
- inv test -o --tests=integration
# Jinja2, used by Sphinx, does not work on Python 3.2 (but does on 3.3)
- "if [[ $TRAVIS_PYTHON_VERSION != '3.2' ]]; then inv docs; fi"
| language: python
python:
- "2.6"
- "2.7"
- "3.2"
- "3.3"
- "pypy"
install:
- pip install -r dev-requirements.txt
script:
+ # Primary test suite
- inv test
+ # Integration-level test suite (hard to reliably test full stack in-code :()
+ - inv test -o --tests=integration
# Jinja2, used by Sphinx, does not work on Python 3.2 (but does on 3.3)
- "if [[ $TRAVIS_PYTHON_VERSION != '3.2' ]]; then inv docs; fi" | 3 | 0.230769 | 3 | 0 |
0731a34fd55477b20ffcd19c9b41cda0dd084d75 | ggplot/utils/date_breaks.py | ggplot/utils/date_breaks.py | from matplotlib.dates import DayLocator, WeekdayLocator, MonthLocator, YearLocator
def parse_break_str(txt):
"parses '10 weeks' into tuple (10, week)."
txt = txt.strip()
if len(txt.split()) == 2:
n, units = txt.split()
else:
n,units = 1, txt
units = units.rstrip('s') # e.g. weeks => week
n = int(n)
return n, units
# matplotlib's YearLocator uses different named
# arguments than the others
LOCATORS = {
'day': DayLocator,
'week': WeekdayLocator,
'month': MonthLocator,
'year': lambda interval: YearLocator(base=interval)
}
def date_breaks(width):
"""
"Regularly spaced dates."
width:
an interval specification. must be one of [day, week, month, year]
usage:
date_breaks(width = '1 year')
date_breaks(width = '6 weeks')
date_breaks('months')
"""
period, units = parse_break_str(width)
Locator = LOCATORS.get(units)
locator = Locator(interval=period)
return locator
| from matplotlib.dates import MinuteLocator, HourLocator, DayLocator
from matplotlib.dates import WeekdayLocator, MonthLocator, YearLocator
def parse_break_str(txt):
"parses '10 weeks' into tuple (10, week)."
txt = txt.strip()
if len(txt.split()) == 2:
n, units = txt.split()
else:
n,units = 1, txt
units = units.rstrip('s') # e.g. weeks => week
n = int(n)
return n, units
# matplotlib's YearLocator uses different named
# arguments than the others
LOCATORS = {
'minute': MinuteLocator,
'hour': HourLocator,
'day': DayLocator,
'week': WeekdayLocator,
'month': MonthLocator,
'year': lambda interval: YearLocator(base=interval)
}
def date_breaks(width):
"""
"Regularly spaced dates."
width:
an interval specification. must be one of [minute, hour, day, week, month, year]
usage:
date_breaks(width = '1 year')
date_breaks(width = '6 weeks')
date_breaks('months')
"""
period, units = parse_break_str(width)
Locator = LOCATORS.get(units)
locator = Locator(interval=period)
return locator
| Add more granular date locators | Add more granular date locators
| Python | bsd-2-clause | xguse/ggplot,andnovar/ggplot,benslice/ggplot,bitemyapp/ggplot,kmather73/ggplot,benslice/ggplot,udacity/ggplot,ricket1978/ggplot,mizzao/ggplot,wllmtrng/ggplot,smblance/ggplot,assad2012/ggplot,Cophy08/ggplot,xguse/ggplot,ricket1978/ggplot,mizzao/ggplot | python | ## Code Before:
from matplotlib.dates import DayLocator, WeekdayLocator, MonthLocator, YearLocator
def parse_break_str(txt):
"parses '10 weeks' into tuple (10, week)."
txt = txt.strip()
if len(txt.split()) == 2:
n, units = txt.split()
else:
n,units = 1, txt
units = units.rstrip('s') # e.g. weeks => week
n = int(n)
return n, units
# matplotlib's YearLocator uses different named
# arguments than the others
LOCATORS = {
'day': DayLocator,
'week': WeekdayLocator,
'month': MonthLocator,
'year': lambda interval: YearLocator(base=interval)
}
def date_breaks(width):
"""
"Regularly spaced dates."
width:
an interval specification. must be one of [day, week, month, year]
usage:
date_breaks(width = '1 year')
date_breaks(width = '6 weeks')
date_breaks('months')
"""
period, units = parse_break_str(width)
Locator = LOCATORS.get(units)
locator = Locator(interval=period)
return locator
## Instruction:
Add more granular date locators
## Code After:
from matplotlib.dates import MinuteLocator, HourLocator, DayLocator
from matplotlib.dates import WeekdayLocator, MonthLocator, YearLocator
def parse_break_str(txt):
"parses '10 weeks' into tuple (10, week)."
txt = txt.strip()
if len(txt.split()) == 2:
n, units = txt.split()
else:
n,units = 1, txt
units = units.rstrip('s') # e.g. weeks => week
n = int(n)
return n, units
# matplotlib's YearLocator uses different named
# arguments than the others
LOCATORS = {
'minute': MinuteLocator,
'hour': HourLocator,
'day': DayLocator,
'week': WeekdayLocator,
'month': MonthLocator,
'year': lambda interval: YearLocator(base=interval)
}
def date_breaks(width):
"""
"Regularly spaced dates."
width:
an interval specification. must be one of [minute, hour, day, week, month, year]
usage:
date_breaks(width = '1 year')
date_breaks(width = '6 weeks')
date_breaks('months')
"""
period, units = parse_break_str(width)
Locator = LOCATORS.get(units)
locator = Locator(interval=period)
return locator
| + from matplotlib.dates import MinuteLocator, HourLocator, DayLocator
- from matplotlib.dates import DayLocator, WeekdayLocator, MonthLocator, YearLocator
? ------------
+ from matplotlib.dates import WeekdayLocator, MonthLocator, YearLocator
def parse_break_str(txt):
"parses '10 weeks' into tuple (10, week)."
txt = txt.strip()
if len(txt.split()) == 2:
n, units = txt.split()
else:
n,units = 1, txt
units = units.rstrip('s') # e.g. weeks => week
n = int(n)
return n, units
# matplotlib's YearLocator uses different named
# arguments than the others
LOCATORS = {
+ 'minute': MinuteLocator,
+ 'hour': HourLocator,
'day': DayLocator,
'week': WeekdayLocator,
'month': MonthLocator,
'year': lambda interval: YearLocator(base=interval)
}
def date_breaks(width):
"""
"Regularly spaced dates."
width:
- an interval specification. must be one of [day, week, month, year]
+ an interval specification. must be one of [minute, hour, day, week, month, year]
? ++++++++++++++
usage:
date_breaks(width = '1 year')
date_breaks(width = '6 weeks')
date_breaks('months')
"""
period, units = parse_break_str(width)
Locator = LOCATORS.get(units)
locator = Locator(interval=period)
return locator | 7 | 0.189189 | 5 | 2 |
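The `parse_break_str` helper in the record above is pure string handling and easy to exercise on its own; a standalone copy (no matplotlib required) behaves like this:

```python
def parse_break_str(txt):
    """Parse '10 weeks' into (10, 'week'); a bare 'months' means (1, 'month')."""
    txt = txt.strip()
    if len(txt.split()) == 2:
        n, units = txt.split()
    else:
        n, units = 1, txt
    units = units.rstrip("s")  # e.g. weeks -> week
    return int(n), units

print(parse_break_str("6 weeks"))  # (6, 'week')
print(parse_break_str("months"))   # (1, 'month')
print(parse_break_str("1 year"))   # (1, 'year')
```

The parsed `(interval, unit)` pair is then what selects and configures the matplotlib locator in the `LOCATORS` table.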
4dca6a526ad8deca6925c54f99843b72df642afd | src/website/content/dev.md | src/website/content/dev.md | ---
title: Development
---
### Code
Spiel is developed in [this Git repository](https://thewordnerd.info/scm/projects/spiel). To clone, run:
$ git clone https://thewordnerd.info/scm/projects/spiel
Please ask on the [mailing list](https://groups.google.com/forum/#!forum/spielproject) if you'd like commit access.
### Translation
A new hosted translation infrastructure is coming soon.
| ---
title: Development
---
### Code
Spiel is developed in [this Git repository](https://thewordnerd.info/scm/projects/spiel). To clone, run:
$ git clone https://thewordnerd.info/scm/projects/spiel
Please ask on the [mailing list](https://groups.google.com/forum/#!forum/spielproject) if you'd like commit access.
### Translations
Translations can now be maintained via [this web interface](https://hosted.weblate.org/projects/spiel/).
| Update info on hosted translation service. | Update info on hosted translation service.
| Markdown | apache-2.0 | beqabeqa473/Asra,bramd/spiel,beqabeqa473/Asra,bramd/spiel,bramd/spiel,beqabeqa473/Asra | markdown | ## Code Before:
---
title: Development
---
### Code
Spiel is developed in [this Git repository](https://thewordnerd.info/scm/projects/spiel). To clone, run:
$ git clone https://thewordnerd.info/scm/projects/spiel
Please ask on the [mailing list](https://groups.google.com/forum/#!forum/spielproject) if you'd like commit access.
### Translation
A new hosted translation infrastructure is coming soon.
## Instruction:
Update info on hosted translation service.
## Code After:
---
title: Development
---
### Code
Spiel is developed in [this Git repository](https://thewordnerd.info/scm/projects/spiel). To clone, run:
$ git clone https://thewordnerd.info/scm/projects/spiel
Please ask on the [mailing list](https://groups.google.com/forum/#!forum/spielproject) if you'd like commit access.
### Translations
Translations can now be maintained via [this web interface](https://hosted.weblate.org/projects/spiel/).
| ---
title: Development
---
### Code
Spiel is developed in [this Git repository](https://thewordnerd.info/scm/projects/spiel). To clone, run:
$ git clone https://thewordnerd.info/scm/projects/spiel
Please ask on the [mailing list](https://groups.google.com/forum/#!forum/spielproject) if you'd like commit access.
- ### Translation
+ ### Translations
? +
- A new hosted translation infrastructure is coming soon.
+ Translations can now be maintained via [this web interface](https://hosted.weblate.org/projects/spiel/). | 4 | 0.285714 | 2 | 2 |
c8c31a70ac48e62f55962ec6576a838781a5e289 | metadata/com.fabienli.dokuwiki.yml | metadata/com.fabienli.dokuwiki.yml | Categories:
- Internet
License: GPL-3.0-or-later
SourceCode: https://github.com/fabienli/DokuwikiAndroid
IssueTracker: https://github.com/fabienli/DokuwikiAndroid/issues
AutoName: Dokuwiki
Description: Android application to access a dokuwiki and keep pages in local cache.
RepoType: git
Repo: https://github.com/fabienli/DokuwikiAndroid
Builds:
- versionName: v0.8
versionCode: 52
commit: v0.8
subdir: app
gradle:
- yes
- versionName: v0.10
versionCode: 54
commit: v0.10
subdir: app
gradle:
- yes
AutoUpdateMode: Version %v
UpdateCheckMode: Tags v\d+\.\d+
CurrentVersion: v0.10
CurrentVersionCode: 54
| Categories:
- Internet
License: GPL-3.0-or-later
SourceCode: https://github.com/fabienli/DokuwikiAndroid
IssueTracker: https://github.com/fabienli/DokuwikiAndroid/issues
AutoName: Dokuwiki
Description: Android application to access a dokuwiki and keep pages in local cache.
RepoType: git
Repo: https://github.com/fabienli/DokuwikiAndroid
Builds:
- versionName: v0.8
versionCode: 52
commit: v0.8
subdir: app
gradle:
- yes
- versionName: v0.10
versionCode: 54
commit: v0.10
subdir: app
gradle:
- yes
- versionName: v0.11
versionCode: 55
commit: v0.11
subdir: app
gradle:
- yes
AutoUpdateMode: Version %v
UpdateCheckMode: Tags v\d+\.\d+
CurrentVersion: v0.11
CurrentVersionCode: 55
| Update Dokuwiki to v0.11 (55) | Update Dokuwiki to v0.11 (55)
| YAML | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroiddata | yaml | ## Code Before:
Categories:
- Internet
License: GPL-3.0-or-later
SourceCode: https://github.com/fabienli/DokuwikiAndroid
IssueTracker: https://github.com/fabienli/DokuwikiAndroid/issues
AutoName: Dokuwiki
Description: Android application to access a dokuwiki and keep pages in local cache.
RepoType: git
Repo: https://github.com/fabienli/DokuwikiAndroid
Builds:
- versionName: v0.8
versionCode: 52
commit: v0.8
subdir: app
gradle:
- yes
- versionName: v0.10
versionCode: 54
commit: v0.10
subdir: app
gradle:
- yes
AutoUpdateMode: Version %v
UpdateCheckMode: Tags v\d+\.\d+
CurrentVersion: v0.10
CurrentVersionCode: 54
## Instruction:
Update Dokuwiki to v0.11 (55)
## Code After:
Categories:
- Internet
License: GPL-3.0-or-later
SourceCode: https://github.com/fabienli/DokuwikiAndroid
IssueTracker: https://github.com/fabienli/DokuwikiAndroid/issues
AutoName: Dokuwiki
Description: Android application to access a dokuwiki and keep pages in local cache.
RepoType: git
Repo: https://github.com/fabienli/DokuwikiAndroid
Builds:
- versionName: v0.8
versionCode: 52
commit: v0.8
subdir: app
gradle:
- yes
- versionName: v0.10
versionCode: 54
commit: v0.10
subdir: app
gradle:
- yes
- versionName: v0.11
versionCode: 55
commit: v0.11
subdir: app
gradle:
- yes
AutoUpdateMode: Version %v
UpdateCheckMode: Tags v\d+\.\d+
CurrentVersion: v0.11
CurrentVersionCode: 55
| Categories:
- Internet
License: GPL-3.0-or-later
SourceCode: https://github.com/fabienli/DokuwikiAndroid
IssueTracker: https://github.com/fabienli/DokuwikiAndroid/issues
AutoName: Dokuwiki
Description: Android application to access a dokuwiki and keep pages in local cache.
RepoType: git
Repo: https://github.com/fabienli/DokuwikiAndroid
Builds:
- versionName: v0.8
versionCode: 52
commit: v0.8
subdir: app
gradle:
- yes
- versionName: v0.10
versionCode: 54
commit: v0.10
subdir: app
gradle:
- yes
+ - versionName: v0.11
+ versionCode: 55
+ commit: v0.11
+ subdir: app
+ gradle:
+ - yes
+
AutoUpdateMode: Version %v
UpdateCheckMode: Tags v\d+\.\d+
- CurrentVersion: v0.10
? ^
+ CurrentVersion: v0.11
? ^
- CurrentVersionCode: 54
? ^
+ CurrentVersionCode: 55
? ^
| 11 | 0.354839 | 9 | 2 |
b99763cbf5e6ffbf39231ec4234676cef4f08d52 | lib/thinking_sphinx/railtie.rb | lib/thinking_sphinx/railtie.rb | require 'thinking_sphinx'
require 'rails'
module ThinkingSphinx
class Railtie < Rails::Railtie
initializer 'thinking_sphinx.sphinx' do
ThinkingSphinx::AutoVersion.detect
end
initializer "thinking_sphinx.active_record" do
ActiveSupport.on_load :active_record do
include ThinkingSphinx::ActiveRecord
end
end
initializer "thinking_sphinx.action_controller" do
ActiveSupport.on_load :action_controller do
require 'thinking_sphinx/action_controller'
include ThinkingSphinx::ActionController
end
end
initializer "thinking_sphinx.set_app_root" do |app|
ThinkingSphinx::Configuration.instance.reset # Rails has setup app now
end
config.to_prepare do
I18n.backend.reload!
I18n.backend.available_locales
# ActiveRecord::Base.to_crc32s is dependant on the subclasses being loaded
# consistently. When the environment is reset, subclasses/descendants will
# be lost but our context will not reload them for us.
#
# We reset the context which causes the subclasses/descendants to be
# reloaded next time the context is called.
#
ThinkingSphinx.reset_context!
end
rake_tasks do
load File.expand_path('../tasks.rb', __FILE__)
end
end
end
| require 'thinking_sphinx'
require 'rails'
module ThinkingSphinx
class Railtie < Rails::Railtie
initializer 'thinking_sphinx.sphinx' do
ThinkingSphinx::AutoVersion.detect
end
initializer "thinking_sphinx.active_record" do
ActiveSupport.on_load :active_record do
include ThinkingSphinx::ActiveRecord
end
end
initializer "thinking_sphinx.action_controller" do
ActiveSupport.on_load :action_controller do
require 'thinking_sphinx/action_controller'
include ThinkingSphinx::ActionController
end
end
initializer "thinking_sphinx.set_app_root" do |app|
ThinkingSphinx::Configuration.instance.reset # Rails has setup app now
end
config.to_prepare do
# ActiveRecord::Base.to_crc32s is dependant on the subclasses being loaded
# consistently. When the environment is reset, subclasses/descendants will
# be lost but our context will not reload them for us.
#
# We reset the context which causes the subclasses/descendants to be
# reloaded next time the context is called.
#
ThinkingSphinx.reset_context!
end
rake_tasks do
load File.expand_path('../tasks.rb', __FILE__)
end
end
end
| Load order doesn't seem to matter for translations now, so no need to force the reload of I18n setup. | Load order doesn't seem to matter for translations now, so no need to force the reload of I18n setup.
| Ruby | mit | agibralter/thinking-sphinx,Napolskih/thinking-sphinx | ruby | ## Code Before:
require 'thinking_sphinx'
require 'rails'
module ThinkingSphinx
class Railtie < Rails::Railtie
initializer 'thinking_sphinx.sphinx' do
ThinkingSphinx::AutoVersion.detect
end
initializer "thinking_sphinx.active_record" do
ActiveSupport.on_load :active_record do
include ThinkingSphinx::ActiveRecord
end
end
initializer "thinking_sphinx.action_controller" do
ActiveSupport.on_load :action_controller do
require 'thinking_sphinx/action_controller'
include ThinkingSphinx::ActionController
end
end
initializer "thinking_sphinx.set_app_root" do |app|
ThinkingSphinx::Configuration.instance.reset # Rails has setup app now
end
config.to_prepare do
I18n.backend.reload!
I18n.backend.available_locales
# ActiveRecord::Base.to_crc32s is dependant on the subclasses being loaded
# consistently. When the environment is reset, subclasses/descendants will
# be lost but our context will not reload them for us.
#
# We reset the context which causes the subclasses/descendants to be
# reloaded next time the context is called.
#
ThinkingSphinx.reset_context!
end
rake_tasks do
load File.expand_path('../tasks.rb', __FILE__)
end
end
end
## Instruction:
Load order doesn't seem to matter for translations now, so no need to force the reload of I18n setup.
## Code After:
require 'thinking_sphinx'
require 'rails'
module ThinkingSphinx
class Railtie < Rails::Railtie
initializer 'thinking_sphinx.sphinx' do
ThinkingSphinx::AutoVersion.detect
end
initializer "thinking_sphinx.active_record" do
ActiveSupport.on_load :active_record do
include ThinkingSphinx::ActiveRecord
end
end
initializer "thinking_sphinx.action_controller" do
ActiveSupport.on_load :action_controller do
require 'thinking_sphinx/action_controller'
include ThinkingSphinx::ActionController
end
end
initializer "thinking_sphinx.set_app_root" do |app|
ThinkingSphinx::Configuration.instance.reset # Rails has setup app now
end
config.to_prepare do
# ActiveRecord::Base.to_crc32s is dependant on the subclasses being loaded
# consistently. When the environment is reset, subclasses/descendants will
# be lost but our context will not reload them for us.
#
# We reset the context which causes the subclasses/descendants to be
# reloaded next time the context is called.
#
ThinkingSphinx.reset_context!
end
rake_tasks do
load File.expand_path('../tasks.rb', __FILE__)
end
end
end
| require 'thinking_sphinx'
require 'rails'
module ThinkingSphinx
class Railtie < Rails::Railtie
-
+
initializer 'thinking_sphinx.sphinx' do
ThinkingSphinx::AutoVersion.detect
end
initializer "thinking_sphinx.active_record" do
ActiveSupport.on_load :active_record do
include ThinkingSphinx::ActiveRecord
end
end
initializer "thinking_sphinx.action_controller" do
ActiveSupport.on_load :action_controller do
require 'thinking_sphinx/action_controller'
include ThinkingSphinx::ActionController
end
end
initializer "thinking_sphinx.set_app_root" do |app|
ThinkingSphinx::Configuration.instance.reset # Rails has setup app now
end
config.to_prepare do
- I18n.backend.reload!
- I18n.backend.available_locales
-
# ActiveRecord::Base.to_crc32s is dependant on the subclasses being loaded
# consistently. When the environment is reset, subclasses/descendants will
# be lost but our context will not reload them for us.
#
# We reset the context which causes the subclasses/descendants to be
# reloaded next time the context is called.
- #
? -
+ #
ThinkingSphinx.reset_context!
end
rake_tasks do
load File.expand_path('../tasks.rb', __FILE__)
end
end
end | 7 | 0.152174 | 2 | 5 |
f82adffec548240c02997ef84cad1da0fda13315 | src/server/index.js | src/server/index.js | var express = require('express');
var webpack = require('webpack');
var webpackDevMiddleware = require('webpack-dev-middleware');
var webpackHotMiddleware = require('webpack-hot-middleware');
var yargs = require('yargs');
var args = yargs
.alias('p', 'production')
.argv;
process.env.NODE_ENV = args.production ? 'production' : 'development';
console.log('NODE_ENV => ', process.env.NODE_ENV);
var app = new express();
var port = 8000;
// In development mode serve the scripts using webpack-dev-middleware
if (process.env.NODE_ENV === 'development') {
var config = require('../../webpack.config');
var compiler = webpack(config);
app.use(webpackDevMiddleware(compiler, { noInfo: true, publicPath: config.output.publicPath }));
app.use(webpackHotMiddleware(compiler));
} else {
app.use('/dist', express.static('dist', {maxAge: '200d'}));
}
app.use('/assets', express.static('assets', {maxAge: '200d'}));
app.get("*", function(req, res) {
res.sendFile(__dirname + '/index.html');
})
app.listen(port, function(error) {
if (error) {
console.error(error);
} else {
console.info("==> 🌎 Listening on port %s. Open up http://localhost:%s/ in your browser.", port, port);
}
})
| var express = require('express');
var webpack = require('webpack');
var webpackDevMiddleware = require('webpack-dev-middleware');
var webpackHotMiddleware = require('webpack-hot-middleware');
var yargs = require('yargs');
var args = yargs
.alias('p', 'production')
.argv;
var app = new express();
var port = 8000;
process.env.NODE_ENV = args.production ? 'production' : 'development';
console.log('NODE_ENV => ', process.env.NODE_ENV);
// In development mode serve the scripts using webpack-dev-middleware
if (process.env.NODE_ENV === 'development') {
var config = require('../../webpack.config');
var compiler = webpack(config);
app.use(webpackDevMiddleware(compiler, { noInfo: true, publicPath: config.output.publicPath }));
app.use(webpackHotMiddleware(compiler));
} else {
app.use('/dist', express.static('dist', {maxAge: '200d'}));
}
app.use('/assets', express.static('assets', {maxAge: '200d'}));
app.get("*", function(req, res) {
res.sendFile(__dirname + '/index.html');
})
app.listen(port, function(error) {
if (error) {
console.error(error);
} else {
console.info("==> 🌎 Listening on port %s. Open up http://localhost:%s/ in your browser.", port, port);
}
})
| Reorganize the code a bit. | Reorganize the code a bit.
| JavaScript | mit | mareksuscak/surfing-gallery,mareksuscak/surfing-gallery | javascript | ## Code Before:
var express = require('express');
var webpack = require('webpack');
var webpackDevMiddleware = require('webpack-dev-middleware');
var webpackHotMiddleware = require('webpack-hot-middleware');
var yargs = require('yargs');
var args = yargs
.alias('p', 'production')
.argv;
process.env.NODE_ENV = args.production ? 'production' : 'development';
console.log('NODE_ENV => ', process.env.NODE_ENV);
var app = new express();
var port = 8000;
// In development mode serve the scripts using webpack-dev-middleware
if (process.env.NODE_ENV === 'development') {
var config = require('../../webpack.config');
var compiler = webpack(config);
app.use(webpackDevMiddleware(compiler, { noInfo: true, publicPath: config.output.publicPath }));
app.use(webpackHotMiddleware(compiler));
} else {
app.use('/dist', express.static('dist', {maxAge: '200d'}));
}
app.use('/assets', express.static('assets', {maxAge: '200d'}));
app.get("*", function(req, res) {
res.sendFile(__dirname + '/index.html');
})
app.listen(port, function(error) {
if (error) {
console.error(error);
} else {
console.info("==> 🌎 Listening on port %s. Open up http://localhost:%s/ in your browser.", port, port);
}
})
## Instruction:
Reorganize the code a bit.
## Code After:
var express = require('express');
var webpack = require('webpack');
var webpackDevMiddleware = require('webpack-dev-middleware');
var webpackHotMiddleware = require('webpack-hot-middleware');
var yargs = require('yargs');
var args = yargs
.alias('p', 'production')
.argv;
var app = new express();
var port = 8000;
process.env.NODE_ENV = args.production ? 'production' : 'development';
console.log('NODE_ENV => ', process.env.NODE_ENV);
// In development mode serve the scripts using webpack-dev-middleware
if (process.env.NODE_ENV === 'development') {
var config = require('../../webpack.config');
var compiler = webpack(config);
app.use(webpackDevMiddleware(compiler, { noInfo: true, publicPath: config.output.publicPath }));
app.use(webpackHotMiddleware(compiler));
} else {
app.use('/dist', express.static('dist', {maxAge: '200d'}));
}
app.use('/assets', express.static('assets', {maxAge: '200d'}));
app.get("*", function(req, res) {
res.sendFile(__dirname + '/index.html');
})
app.listen(port, function(error) {
if (error) {
console.error(error);
} else {
console.info("==> 🌎 Listening on port %s. Open up http://localhost:%s/ in your browser.", port, port);
}
})
| var express = require('express');
var webpack = require('webpack');
var webpackDevMiddleware = require('webpack-dev-middleware');
var webpackHotMiddleware = require('webpack-hot-middleware');
var yargs = require('yargs');
var args = yargs
.alias('p', 'production')
.argv;
+ var app = new express();
+ var port = 8000;
+
process.env.NODE_ENV = args.production ? 'production' : 'development';
console.log('NODE_ENV => ', process.env.NODE_ENV);
-
- var app = new express();
- var port = 8000;
// In development mode serve the scripts using webpack-dev-middleware
if (process.env.NODE_ENV === 'development') {
var config = require('../../webpack.config');
var compiler = webpack(config);
app.use(webpackDevMiddleware(compiler, { noInfo: true, publicPath: config.output.publicPath }));
app.use(webpackHotMiddleware(compiler));
} else {
app.use('/dist', express.static('dist', {maxAge: '200d'}));
}
app.use('/assets', express.static('assets', {maxAge: '200d'}));
app.get("*", function(req, res) {
res.sendFile(__dirname + '/index.html');
})
app.listen(port, function(error) {
if (error) {
console.error(error);
} else {
console.info("==> 🌎 Listening on port %s. Open up http://localhost:%s/ in your browser.", port, port);
}
}) | 6 | 0.153846 | 3 | 3 |
ec9a7cdd832bc19dbff3e3001cd42dc4e14d5b14 | .travis.yml | .travis.yml | language: php
php:
- 5.3
- 5.4
before_script:
- wget http://getcomposer.org/composer.phar
- php composer.phar install
script:
- vendor/bin/atoum
| language: php
php:
- 5.3
- 5.4
before_script:
- echo "extension = memcache.so" >> ~/.phpenv/versions/$(phpenv version-name)/etc/php.ini
- wget http://getcomposer.org/composer.phar
- php composer.phar install
script:
- vendor/bin/atoum
| Add memcache module to Travis config | Add memcache module to Travis config | YAML | mit | KuiKui/MemcacheServiceProvider | yaml | ## Code Before:
language: php
php:
- 5.3
- 5.4
before_script:
- wget http://getcomposer.org/composer.phar
- php composer.phar install
script:
- vendor/bin/atoum
## Instruction:
Add memcache module to Travis config
## Code After:
language: php
php:
- 5.3
- 5.4
before_script:
- echo "extension = memcache.so" >> ~/.phpenv/versions/$(phpenv version-name)/etc/php.ini
- wget http://getcomposer.org/composer.phar
- php composer.phar install
script:
- vendor/bin/atoum
| language: php
php:
- 5.3
- 5.4
before_script:
+ - echo "extension = memcache.so" >> ~/.phpenv/versions/$(phpenv version-name)/etc/php.ini
- wget http://getcomposer.org/composer.phar
- php composer.phar install
script:
- vendor/bin/atoum | 1 | 0.083333 | 1 | 0 |
bc08748535b296c8668c1a8cf8cebcb765e27660 | .travis.yml | .travis.yml | language: ruby
rvm:
- 2.3.0
before_script:
- bundle install
- bundle exec rake db:create
- bundle exec rake db:migrate
- bundle exec rake db:seed
- bundle exec rails s -d
script:
- RAILS_ENV=development bin/delayed_job start forstarting background jobs
- bundle exec rake recurring:init
- bundle exec rake secure_pipeline:network_attack
- bundle exec rake secure_pipeline:ssl_attack
- bundle exec rake secure_pipeline:xss
- bundle exec rake secure_pipeline:information_leakage
- bundle exec rake secure_pipeline:sql_injection
| language: ruby
rvm:
- 2.3.0
before_script:
- bundle install
- bundle exec rake db:create
- bundle exec rake db:migrate
- bundle exec rake db:seed
- bundle exec rails s -d
script:
- RAILS_ENV=development bin/delayed_job start forstarting background jobs
- bundle exec rake recurring:init
| Remove pipeline security tasks from build script | Remove pipeline security tasks from build script
| YAML | mit | tonyvince/alexa_rank_tracker,tonyvince/alexa_rank_tracker,tonyvince/alexa_rank_tracker | yaml | ## Code Before:
language: ruby
rvm:
- 2.3.0
before_script:
- bundle install
- bundle exec rake db:create
- bundle exec rake db:migrate
- bundle exec rake db:seed
- bundle exec rails s -d
script:
- RAILS_ENV=development bin/delayed_job start forstarting background jobs
- bundle exec rake recurring:init
- bundle exec rake secure_pipeline:network_attack
- bundle exec rake secure_pipeline:ssl_attack
- bundle exec rake secure_pipeline:xss
- bundle exec rake secure_pipeline:information_leakage
- bundle exec rake secure_pipeline:sql_injection
## Instruction:
Remove pipeline security tasks from build script
## Code After:
language: ruby
rvm:
- 2.3.0
before_script:
- bundle install
- bundle exec rake db:create
- bundle exec rake db:migrate
- bundle exec rake db:seed
- bundle exec rails s -d
script:
- RAILS_ENV=development bin/delayed_job start forstarting background jobs
- bundle exec rake recurring:init
| language: ruby
rvm:
- 2.3.0
before_script:
- bundle install
- bundle exec rake db:create
- bundle exec rake db:migrate
- bundle exec rake db:seed
- bundle exec rails s -d
script:
- RAILS_ENV=development bin/delayed_job start forstarting background jobs
- bundle exec rake recurring:init
- - bundle exec rake secure_pipeline:network_attack
- - bundle exec rake secure_pipeline:ssl_attack
- - bundle exec rake secure_pipeline:xss
- - bundle exec rake secure_pipeline:information_leakage
- - bundle exec rake secure_pipeline:sql_injection
| 5 | 0.263158 | 0 | 5 |
467c1a5f04c6fe62b1964893526ce4f0eb2966e5 | .travis.yml | .travis.yml | sudo: required
dist: trusty
language: cpp
compiler:
- gcc
- clang
before_install:
- sudo apt-get update -qq
- sudo apt-get install -y qt5-default libqt5svg5-dev
script:
- mkdir build
- cd build
- qmake ..
- make
| sudo: required
dist: trusty
language: cpp
compiler:
- gcc
- clang
before_install:
- sudo apt-get update -qq
- sudo apt-get install -y qt5-default libqt5svg5-dev
script:
- export QMAKE_CC=$CC
- export QMAKE_CXX=$CXX
- mkdir build
- cd build
- qmake ..
- make
| Set QMAKE_CC and CXX flag for Travis builds | Set QMAKE_CC and CXX flag for Travis builds
| YAML | bsd-2-clause | csete/softrig,csete/softrig,csete/softrig | yaml | ## Code Before:
sudo: required
dist: trusty
language: cpp
compiler:
- gcc
- clang
before_install:
- sudo apt-get update -qq
- sudo apt-get install -y qt5-default libqt5svg5-dev
script:
- mkdir build
- cd build
- qmake ..
- make
## Instruction:
Set QMAKE_CC and CXX flag for Travis builds
## Code After:
sudo: required
dist: trusty
language: cpp
compiler:
- gcc
- clang
before_install:
- sudo apt-get update -qq
- sudo apt-get install -y qt5-default libqt5svg5-dev
script:
- export QMAKE_CC=$CC
- export QMAKE_CXX=$CXX
- mkdir build
- cd build
- qmake ..
- make
| sudo: required
dist: trusty
language: cpp
compiler:
- gcc
- clang
before_install:
- sudo apt-get update -qq
- sudo apt-get install -y qt5-default libqt5svg5-dev
script:
+ - export QMAKE_CC=$CC
+ - export QMAKE_CXX=$CXX
- mkdir build
- cd build
- qmake ..
- make
| 2 | 0.1 | 2 | 0 |
db78eb4f27ac20695ec573c42954771acc232e6c | scratchpad/gamepad/lib/ThrusterClient.js | scratchpad/gamepad/lib/ThrusterClient.js | class ThrusterClient {
constructor() {
// TODO: initialize our websocket connection
}
sendMessage(controller, type, index, value) {
// build message binary payload per services/controllers/message.py
let fa = new Float32Array([value]);
let ba = new Int8Array(fa.buffer);
console.log(ba);
// send websocket packet
}
}
| class ThrusterClient {
constructor(cb) {
this.cb = cb;
this.socket = new WebSocket("ws://127.0.0.1:9997/");
this.socket.type = "arraybuffer";
this.socket.addEventListener("message", (e) => this.message(e));
if (this.cb !== null && this.cb !== undefined) {
this.socket.addEventListener("open", cb);
}
}
sendMessage(controller, type, index, value) {
console.log(`C: ${controller}, T: ${type}, I: ${index}, V: ${value}`);
// build message binary payload per services/controllers/message.py
let b1 =
(controller & 0x03) << 6 |
(type & 0x03) << 4 |
(index & 0x0F);
let floatArray = new Float32Array([value]);
let byteArray = new Int8Array(floatArray.buffer);
let result = new Int8Array([
b1,
byteArray[0],
byteArray[1],
byteArray[2],
byteArray[3],
]);
// send websocket packet
this.socket.send(result);
}
message(e) {
}
}
| Add socket and message packing | Add socket and message packing
| JavaScript | bsd-3-clause | gizmo-cda/g2x-submarine-v2,gizmo-cda/g2x-submarine-v2,gizmo-cda/g2x-submarine-v2,gizmo-cda/g2x-submarine-v2 | javascript | ## Code Before:
class ThrusterClient {
constructor() {
// TODO: initialize our websocket connection
}
sendMessage(controller, type, index, value) {
// build message binary payload per services/controllers/message.py
let fa = new Float32Array([value]);
let ba = new Int8Array(fa.buffer);
console.log(ba);
// send websocket packet
}
}
## Instruction:
Add socket and message packing
## Code After:
class ThrusterClient {
constructor(cb) {
this.cb = cb;
this.socket = new WebSocket("ws://127.0.0.1:9997/");
this.socket.type = "arraybuffer";
this.socket.addEventListener("message", (e) => this.message(e));
if (this.cb !== null && this.cb !== undefined) {
this.socket.addEventListener("open", cb);
}
}
sendMessage(controller, type, index, value) {
console.log(`C: ${controller}, T: ${type}, I: ${index}, V: ${value}`);
// build message binary payload per services/controllers/message.py
let b1 =
(controller & 0x03) << 6 |
(type & 0x03) << 4 |
(index & 0x0F);
let floatArray = new Float32Array([value]);
let byteArray = new Int8Array(floatArray.buffer);
let result = new Int8Array([
b1,
byteArray[0],
byteArray[1],
byteArray[2],
byteArray[3],
]);
// send websocket packet
this.socket.send(result);
}
message(e) {
}
}
| class ThrusterClient {
- constructor() {
+ constructor(cb) {
? ++
- // TODO: initialize our websocket connection
+ this.cb = cb;
+ this.socket = new WebSocket("ws://127.0.0.1:9997/");
+ this.socket.type = "arraybuffer";
+ this.socket.addEventListener("message", (e) => this.message(e));
+
+ if (this.cb !== null && this.cb !== undefined) {
+ this.socket.addEventListener("open", cb);
+ }
}
sendMessage(controller, type, index, value) {
+ console.log(`C: ${controller}, T: ${type}, I: ${index}, V: ${value}`);
+
// build message binary payload per services/controllers/message.py
+ let b1 =
+ (controller & 0x03) << 6 |
+ (type & 0x03) << 4 |
+ (index & 0x0F);
+
- let fa = new Float32Array([value]);
+ let floatArray = new Float32Array([value]);
? ++ ++++++
- let ba = new Int8Array(fa.buffer);
+ let byteArray = new Int8Array(floatArray.buffer);
? ++++++ + ++ ++++++
- console.log(ba);
+
+ let result = new Int8Array([
+ b1,
+ byteArray[0],
+ byteArray[1],
+ byteArray[2],
+ byteArray[3],
+ ]);
// send websocket packet
+ this.socket.send(result);
+ }
+
+ message(e) {
+
}
} | 36 | 2.571429 | 31 | 5 |
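As an aside, the 5-byte frame that `ThrusterClient#sendMessage` above builds — one packed header byte followed by the four bytes of a 32-bit float — can be sketched outside JavaScript. The method name and the mirrored Python layout are assumptions inferred from the JS code (not taken from the repo or from `services/controllers/message.py`), and the little-endian float order assumes the usual host byte order that an `Int8Array` view of a `Float32Array` exposes:

```ruby
# Illustrative sketch (not repo code): the wire format implied by the
# JavaScript above — one header byte carrying 2 bits of controller id,
# 2 bits of message type and a 4-bit index, followed by an IEEE-754
# float32 in little-endian byte order.
def pack_thruster_message(controller, type, index, value)
  header = ((controller & 0x03) << 6) | ((type & 0x03) << 4) | (index & 0x0F)
  [header].pack('C') + [value].pack('e') # 'C' = uint8, 'e' = LE float32
end

frame = pack_thruster_message(1, 2, 3, 1.0)
frame.bytes # => [99, 0, 0, 128, 63]  (header 0b01_10_0011, then 1.0f)
```

A receiver would presumably reverse the same masks and shifts to recover the three header fields before reading the float; that decoding step is inferred from the packing code, not shown in the record.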
93b9c951674088835f1f0b81abd7c0b4f51b7b2c | tests/PhpBrew/Command/InstallCommandTest.php | tests/PhpBrew/Command/InstallCommandTest.php | <?php
use PhpBrew\Testing\CommandTestCase;
class InstallCommandTest extends CommandTestCase
{
/**
* @outputBuffering enabled
*/
public function testInstallCommandLatestMinorVersion() {
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4")); // we will likely get 5.4.34 - 2014-11-02
}
/**
* @outputBuffering enabled
*/
public function testInstallCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4.29 +sqlite +intl +icu"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testCleanCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet clean 5.4.29"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testInstallLikeCommand() {
$this->assertTrue($this->runCommand("phpbrew --quiet install -d --like 5.4.29 5.5.18 +soap"));
}
}
| <?php
use PhpBrew\Testing\CommandTestCase;
class InstallCommandTest extends CommandTestCase
{
/**
* @outputBuffering enabled
*/
public function testInstallCommandLatestMinorVersion() {
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4")); // we will likely get 5.4.34 - 2014-11-02
}
/**
* @outputBuffering enabled
*/
public function testInstallCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4.29 +sqlite +intl +icu"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testListCommand()
{
$this->assertTrue($this->runCommand("phpbrew list -v -d"));
$this->assertTrue($this->runCommand("phpbrew list --dir --variants"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testCleanCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet clean 5.4.29"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testInstallLikeCommand() {
$this->assertTrue($this->runCommand("phpbrew --quiet install -d --like 5.4.29 5.5.18 +soap"));
}
}
| Test list command after the installation | Test list command after the installation
| PHP | mit | phpbrew/phpbrew,mengzyou/phpbrew,phpbrew/phpbrew,williamn/phpbrew,hechunwen/phpbrew,wellic/phpbrew,pierredup/phpbrew,svetlyak40wt/phpbrew,morozov/phpbrew,viniciusferreira/phpbrew,phpbrew/phpbrew,viniciusferreira/phpbrew,konomae/phpbrew,kayyyy/phpbrew,svetlyak40wt/phpbrew,zonuexe/phpbrew,davidfuhr/phpbrew,viniciusferreira/phpbrew,hechunwen/phpbrew,davidfuhr/phpbrew,wellic/phpbrew,davidfuhr/phpbrew,morozov/phpbrew,zonuexe/phpbrew,pinkbigmacmedia/phpbrew,wellic/phpbrew,pierredup/phpbrew,kayyyy/phpbrew,svetlyak40wt/phpbrew,davidfuhr/phpbrew,williamn/phpbrew,zonuexe/phpbrew,pinkbigmacmedia/phpbrew,mengzyou/phpbrew,mengzyou/phpbrew,pierredup/phpbrew,dduca-sugarcrm/phpbrew,pinkbigmacmedia/phpbrew,konomae/phpbrew,kayyyy/phpbrew,dduca-sugarcrm/phpbrew,pierredup/phpbrew,kayyyy/phpbrew,konomae/phpbrew,mengzyou/phpbrew,williamn/phpbrew,williamn/phpbrew,zonuexe/phpbrew,svetlyak40wt/phpbrew,hechunwen/phpbrew,wellic/phpbrew,konomae/phpbrew,hechunwen/phpbrew,dduca-sugarcrm/phpbrew,viniciusferreira/phpbrew,dduca-sugarcrm/phpbrew,davidfuhr/phpbrew,pinkbigmacmedia/phpbrew | php | ## Code Before:
<?php
use PhpBrew\Testing\CommandTestCase;
class InstallCommandTest extends CommandTestCase
{
/**
* @outputBuffering enabled
*/
public function testInstallCommandLatestMinorVersion() {
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4")); // we will likely get 5.4.34 - 2014-11-02
}
/**
* @outputBuffering enabled
*/
public function testInstallCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4.29 +sqlite +intl +icu"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testCleanCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet clean 5.4.29"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testInstallLikeCommand() {
$this->assertTrue($this->runCommand("phpbrew --quiet install -d --like 5.4.29 5.5.18 +soap"));
}
}
## Instruction:
Test list command after the installation
## Code After:
<?php
use PhpBrew\Testing\CommandTestCase;
class InstallCommandTest extends CommandTestCase
{
/**
* @outputBuffering enabled
*/
public function testInstallCommandLatestMinorVersion() {
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4")); // we will likely get 5.4.34 - 2014-11-02
}
/**
* @outputBuffering enabled
*/
public function testInstallCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4.29 +sqlite +intl +icu"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testListCommand()
{
$this->assertTrue($this->runCommand("phpbrew list -v -d"));
$this->assertTrue($this->runCommand("phpbrew list --dir --variants"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testCleanCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet clean 5.4.29"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testInstallLikeCommand() {
$this->assertTrue($this->runCommand("phpbrew --quiet install -d --like 5.4.29 5.5.18 +soap"));
}
}
| <?php
use PhpBrew\Testing\CommandTestCase;
class InstallCommandTest extends CommandTestCase
{
/**
* @outputBuffering enabled
*/
public function testInstallCommandLatestMinorVersion() {
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4")); // we will likely get 5.4.34 - 2014-11-02
}
/**
* @outputBuffering enabled
*/
public function testInstallCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet install 5.4.29 +sqlite +intl +icu"));
+ }
+
+ /**
+ * @outputBuffering enabled
+ * @depends testInstallCommand
+ */
+ public function testListCommand()
+ {
+ $this->assertTrue($this->runCommand("phpbrew list -v -d"));
+ $this->assertTrue($this->runCommand("phpbrew list --dir --variants"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testCleanCommand()
{
$this->assertTrue($this->runCommand("phpbrew --quiet clean 5.4.29"));
}
/**
* @outputBuffering enabled
* @depends testInstallCommand
*/
public function testInstallLikeCommand() {
$this->assertTrue($this->runCommand("phpbrew --quiet install -d --like 5.4.29 5.5.18 +soap"));
}
}
| 10 | 0.232558 | 10 | 0 |
a1ff3b4dd6e635cfb7e3ee28a36ee3d26e67903d | app.rb | app.rb | require 'sinatra'
require 'sinatra/reloader'
configure do
set :public_folder, File.dirname(__FILE__) + '/json'
end
get '/:name' do
content_type 'application/json'
if params[:name].nil? then
File.readlines(params[:name])
else
"File not found"
end
end
| require 'sinatra'
require 'sinatra/reloader'
configure do
set :public_folder, File.dirname(__FILE__) + '/json'
end
get '/:name' do
content_type 'application/json'
if params[:name].nil? then
File.readlines(params[:name])
else
status 400
end
end
| Return 404 if the request file is not there | Return 404 if the request file is not there
| Ruby | apache-2.0 | yelinaung/JSONatra | ruby | ## Code Before:
require 'sinatra'
require 'sinatra/reloader'
configure do
set :public_folder, File.dirname(__FILE__) + '/json'
end
get '/:name' do
content_type 'application/json'
if params[:name].nil? then
File.readlines(params[:name])
else
"File not found"
end
end
## Instruction:
Return 404 if the request file is not there
## Code After:
require 'sinatra'
require 'sinatra/reloader'
configure do
set :public_folder, File.dirname(__FILE__) + '/json'
end
get '/:name' do
content_type 'application/json'
if params[:name].nil? then
File.readlines(params[:name])
else
status 400
end
end
| require 'sinatra'
require 'sinatra/reloader'
configure do
set :public_folder, File.dirname(__FILE__) + '/json'
end
get '/:name' do
content_type 'application/json'
if params[:name].nil? then
File.readlines(params[:name])
else
- "File not found"
+ status 400
end
end | 2 | 0.133333 | 1 | 1 |
57805ec1c70faffcd613adc9a1725fa1a6521737 | styles/markers.atom-text-editor.less | styles/markers.atom-text-editor.less | @import "ui-variables";
:host(:host) .gutter .line-number {
&.latex-error {
border-left: 2px solid @background-color-error;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-warning {
border-left: 2px solid @background-color-warning;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-info {
border-left: 2px solid @background-color-info;
padding-left: ~"calc(0.5em - 2px)";
}
}
| @import "ui-variables";
atom-text-editor .gutter .line-number {
&.latex-error {
border-left: 2px solid @background-color-error;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-warning {
border-left: 2px solid @background-color-warning;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-info {
border-left: 2px solid @background-color-info;
padding-left: ~"calc(0.5em - 2px)";
}
}
| Remove usage of the Shadow DOM | Remove usage of the Shadow DOM
This commit rewrites deprecated Shadow DOM based selectors that will
stop working in Atom v1.13.
| Less | mit | thomasjo/atom-latex,thomasjo/atom-latex,thomasjo/atom-latex | less | ## Code Before:
@import "ui-variables";
:host(:host) .gutter .line-number {
&.latex-error {
border-left: 2px solid @background-color-error;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-warning {
border-left: 2px solid @background-color-warning;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-info {
border-left: 2px solid @background-color-info;
padding-left: ~"calc(0.5em - 2px)";
}
}
## Instruction:
Remove usage of the Shadow DOM
This commit rewrites deprecated Shadow DOM based selectors that will
stop working in Atom v1.13.
## Code After:
@import "ui-variables";
atom-text-editor .gutter .line-number {
&.latex-error {
border-left: 2px solid @background-color-error;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-warning {
border-left: 2px solid @background-color-warning;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-info {
border-left: 2px solid @background-color-info;
padding-left: ~"calc(0.5em - 2px)";
}
}
| @import "ui-variables";
- :host(:host) .gutter .line-number {
+ atom-text-editor .gutter .line-number {
&.latex-error {
border-left: 2px solid @background-color-error;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-warning {
border-left: 2px solid @background-color-warning;
padding-left: ~"calc(0.5em - 2px)";
}
&.latex-info {
border-left: 2px solid @background-color-info;
padding-left: ~"calc(0.5em - 2px)";
}
} | 2 | 0.125 | 1 | 1 |
3f0c5e538f16ef9c1e6980f9e03a8b8b317de6c5 | packages/as/aspell-pipe.yaml | packages/as/aspell-pipe.yaml | homepage: ''
changelog-type: markdown
hash: 551513bf792cec38a9d919e1d8462eb3f318558765b5f7544fbd7c9e88f993e6
test-bench-deps: {}
maintainer: jtd@galois.com
synopsis: Pipe-based interface to the Aspell program
changelog: ! '
0.1
---
Initial release.
'
basic-deps:
base: ! '>=4.8 && <4.9'
text: -any
process: ! '>=1.6'
all-versions:
- '0.1'
author: Jonathan Daugherty
latest: '0.1'
description-type: markdown
description: ! '
aspell-pipe
-----------
This package provides a simple interface to the `aspell` spell checker
tool. In particular, this library communicates with `aspell` using
its pipe interface rather than linking directly to the Aspell shared
library. This is because in some cases it is desirable to avoid a shared
library dependency and to deal gracefully with an absent spell checker.
To get started, see the Haddock documentation for `Text.Aspell`.
'
license-name: BSD3
| homepage: ''
changelog-type: markdown
hash: 731f3a8a3417763a6235adf9b0434949136f22c85bb2e4c97442bad29931c138
test-bench-deps: {}
maintainer: jtd@galois.com
synopsis: Pipe-based interface to the Aspell program
changelog: ! '
0.2
---
* Relaxed base upper bound to < 5.
0.1
---
Initial release.
'
basic-deps:
base: ! '>=4.8 && <5'
text: -any
process: ! '>=1.6'
all-versions:
- '0.1'
- '0.2'
author: Jonathan Daugherty
latest: '0.2'
description-type: markdown
description: ! '
aspell-pipe
-----------
This package provides a simple interface to the `aspell` spell checker
tool. In particular, this library communicates with `aspell` using
its pipe interface rather than linking directly to the Aspell shared
library. This is because in some cases it is desirable to avoid a shared
library dependency and to deal gracefully with an absent spell checker.
To get started, see the Haddock documentation for `Text.Aspell`.
'
license-name: BSD3
| Update from Hackage at 2017-06-27T17:03:25Z | Update from Hackage at 2017-06-27T17:03:25Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: ''
changelog-type: markdown
hash: 551513bf792cec38a9d919e1d8462eb3f318558765b5f7544fbd7c9e88f993e6
test-bench-deps: {}
maintainer: jtd@galois.com
synopsis: Pipe-based interface to the Aspell program
changelog: ! '
0.1
---
Initial release.
'
basic-deps:
base: ! '>=4.8 && <4.9'
text: -any
process: ! '>=1.6'
all-versions:
- '0.1'
author: Jonathan Daugherty
latest: '0.1'
description-type: markdown
description: ! '
aspell-pipe
-----------
This package provides a simple interface to the `aspell` spell checker
tool. In particular, this library communicates with `aspell` using
its pipe interface rather than linking directly to the Aspell shared
library. This is because in some cases it is desirable to avoid a shared
library dependency and to deal gracefully with an absent spell checker.
To get started, see the Haddock documentation for `Text.Aspell`.
'
license-name: BSD3
## Instruction:
Update from Hackage at 2017-06-27T17:03:25Z
## Code After:
homepage: ''
changelog-type: markdown
hash: 731f3a8a3417763a6235adf9b0434949136f22c85bb2e4c97442bad29931c138
test-bench-deps: {}
maintainer: jtd@galois.com
synopsis: Pipe-based interface to the Aspell program
changelog: ! '
0.2
---
* Relaxed base upper bound to < 5.
0.1
---
Initial release.
'
basic-deps:
base: ! '>=4.8 && <5'
text: -any
process: ! '>=1.6'
all-versions:
- '0.1'
- '0.2'
author: Jonathan Daugherty
latest: '0.2'
description-type: markdown
description: ! '
aspell-pipe
-----------
This package provides a simple interface to the `aspell` spell checker
tool. In particular, this library communicates with `aspell` using
its pipe interface rather than linking directly to the Aspell shared
library. This is because in some cases it is desirable to avoid a shared
library dependency and to deal gracefully with an absent spell checker.
To get started, see the Haddock documentation for `Text.Aspell`.
'
license-name: BSD3
| homepage: ''
changelog-type: markdown
- hash: 551513bf792cec38a9d919e1d8462eb3f318558765b5f7544fbd7c9e88f993e6
+ hash: 731f3a8a3417763a6235adf9b0434949136f22c85bb2e4c97442bad29931c138
test-bench-deps: {}
maintainer: jtd@galois.com
synopsis: Pipe-based interface to the Aspell program
changelog: ! '
+
+ 0.2
+
+ ---
+
+
+ * Relaxed base upper bound to < 5.
+
0.1
---
Initial release.
'
basic-deps:
- base: ! '>=4.8 && <4.9'
? ^^^
+ base: ! '>=4.8 && <5'
? ^
text: -any
process: ! '>=1.6'
all-versions:
- '0.1'
+ - '0.2'
author: Jonathan Daugherty
- latest: '0.1'
? ^
+ latest: '0.2'
? ^
description-type: markdown
description: ! '
aspell-pipe
-----------
This package provides a simple interface to the `aspell` spell checker
tool. In particular, this library communicates with `aspell` using
its pipe interface rather than linking directly to the Aspell shared
library. This is because in some cases it is desirable to avoid a shared
library dependency and to deal gracefully with an absent spell checker.
To get started, see the Haddock documentation for `Text.Aspell`.
'
license-name: BSD3 | 15 | 0.319149 | 12 | 3 |
b17595d58aca21b941b678b23b2a3fe908915efe | _posts/2000-01-01-intro.md | _posts/2000-01-01-intro.md | ---
title: "home"
bg: white
color: black
style: center
---
#### Downtown Incubator starting at 6pm
<h1 class="tron">Open CoMo</h1>
# This weekend! 13th-14th of November 2015
Help make Columbia better using open local, state or federal data. Join us to for two evenings to create *applications*, *visualizations* and *analyses* using open data!
Signups are closed - but you should come see the demo on Saturday at 9pm!
| ---
title: "home"
bg: white
color: black
style: center
---
#### Downtown Incubator starting at 6pm
<h1 class="tron">Open CoMo</h1>
# 13th-14th of November 2015
## (That's this weekend!)
Help make Columbia better using open local, state or federal data. Join us to for two evenings to create *applications*, *visualizations* and *analyses* using open data!
Signups are closed - but you should come see the demo on Saturday at 9pm!
| Make the this weekend sign suck less | Make the this weekend sign suck less | Markdown | mit | jakewins/opencomo-site,jakewins/opencomo-site | markdown | ## Code Before:
---
title: "home"
bg: white
color: black
style: center
---
#### Downtown Incubator starting at 6pm
<h1 class="tron">Open CoMo</h1>
# This weekend! 13th-14th of November 2015
Help make Columbia better using open local, state or federal data. Join us to for two evenings to create *applications*, *visualizations* and *analyses* using open data!
Signups are closed - but you should come see the demo on Saturday at 9pm!
## Instruction:
Make the this weekend sign suck less
## Code After:
---
title: "home"
bg: white
color: black
style: center
---
#### Downtown Incubator starting at 6pm
<h1 class="tron">Open CoMo</h1>
# 13th-14th of November 2015
## (That's this weekend!)
Help make Columbia better using open local, state or federal data. Join us to for two evenings to create *applications*, *visualizations* and *analyses* using open data!
Signups are closed - but you should come see the demo on Saturday at 9pm!
| ---
title: "home"
bg: white
color: black
style: center
---
#### Downtown Incubator starting at 6pm
<h1 class="tron">Open CoMo</h1>
- # This weekend! 13th-14th of November 2015
? --------------
+ # 13th-14th of November 2015
+
+ ## (That's this weekend!)
Help make Columbia better using open local, state or federal data. Join us to for two evenings to create *applications*, *visualizations* and *analyses* using open data!
Signups are closed - but you should come see the demo on Saturday at 9pm! | 4 | 0.25 | 3 | 1 |
84dd2633fe82cd41ba267cc9566ef02069d57640 | gradle.properties | gradle.properties | PROJECT_VERSION=0.13.2
PROJECT_GROUP_ID=com.jaynewstrom
PROJECT_VCS_URL=https://github.com/JayNewstrom/Concrete.git
PROJECT_DESCRIPTION=Concrete - An Android System Service that helps with Dependency Injection scoping using Dagger 2
PROJECT_REPO=Concrete
android.useAndroidX=true
android.enableJetifier=true
| PROJECT_VERSION=0.13.3-SNAPSHOT
PROJECT_GROUP_ID=com.jaynewstrom
PROJECT_VCS_URL=https://github.com/JayNewstrom/Concrete.git
PROJECT_DESCRIPTION=Concrete - An Android System Service that helps with Dependency Injection scoping using Dagger 2
PROJECT_REPO=Concrete
android.useAndroidX=true
android.enableJetifier=true
| Prepare for next development iteration | Prepare for next development iteration
| INI | apache-2.0 | JayNewstrom/Concrete | ini | ## Code Before:
PROJECT_VERSION=0.13.2
PROJECT_GROUP_ID=com.jaynewstrom
PROJECT_VCS_URL=https://github.com/JayNewstrom/Concrete.git
PROJECT_DESCRIPTION=Concrete - An Android System Service that helps with Dependency Injection scoping using Dagger 2
PROJECT_REPO=Concrete
android.useAndroidX=true
android.enableJetifier=true
## Instruction:
Prepare for next development iteration
## Code After:
PROJECT_VERSION=0.13.3-SNAPSHOT
PROJECT_GROUP_ID=com.jaynewstrom
PROJECT_VCS_URL=https://github.com/JayNewstrom/Concrete.git
PROJECT_DESCRIPTION=Concrete - An Android System Service that helps with Dependency Injection scoping using Dagger 2
PROJECT_REPO=Concrete
android.useAndroidX=true
android.enableJetifier=true
| - PROJECT_VERSION=0.13.2
? ^
+ PROJECT_VERSION=0.13.3-SNAPSHOT
? ^^^^^^^^^^
PROJECT_GROUP_ID=com.jaynewstrom
PROJECT_VCS_URL=https://github.com/JayNewstrom/Concrete.git
PROJECT_DESCRIPTION=Concrete - An Android System Service that helps with Dependency Injection scoping using Dagger 2
PROJECT_REPO=Concrete
android.useAndroidX=true
android.enableJetifier=true | 2 | 0.285714 | 1 | 1 |
fd642666255d1633925028251b4258102badaffa | src/lib/provider_selection/chainable_strategy.rb | src/lib/provider_selection/chainable_strategy.rb |
module ProviderSelection
module ChainableStrategy
module InstanceMethods
def initialize(strategies, options = {})
@strategies = strategies
options ||= {}
if self.class.method_names.include?('default_options')
@options = self.class.default_options.with_indifferent_access.merge(options)
else
@options = options
end
end
def method_missing(method, *args)
args.empty? ? @strategies.send(method) : @strategies.send(method, args)
end
end
end
end
|
module ProviderSelection
module ChainableStrategy
module InstanceMethods
def initialize(strategies, options = {})
@strategies = strategies
options ||= {}
if self.class.methods.map(&:to_s).include?('default_options')
@options = self.class.default_options.with_indifferent_access.merge(options)
else
@options = options
end
end
def method_missing(method, *args)
args.empty? ? @strategies.send(method) : @strategies.send(method, args)
end
end
end
end
| Refactor Ruby 1.8 compatibility fix in provider selection | Refactor Ruby 1.8 compatibility fix in provider selection
The change was introduced by 7f26b8846b36515a6648b3f94ab24036471d8370 which is using a Rails method to solve an incompatility issue between 1.8 and 1.9.
It's better no to couple Rails with the provider selection because in the future we may want to seperate the provider selection code into a separate gem.
| Ruby | apache-2.0 | aeolusproject/conductor,aeolusproject/conductor,aeolusproject/conductor,aeolusproject/conductor | ruby | ## Code Before:
module ProviderSelection
module ChainableStrategy
module InstanceMethods
def initialize(strategies, options = {})
@strategies = strategies
options ||= {}
if self.class.method_names.include?('default_options')
@options = self.class.default_options.with_indifferent_access.merge(options)
else
@options = options
end
end
def method_missing(method, *args)
args.empty? ? @strategies.send(method) : @strategies.send(method, args)
end
end
end
end
## Instruction:
Refactor Ruby 1.8 compatibility fix in provider selection
The change was introduced by 7f26b8846b36515a6648b3f94ab24036471d8370 which is using a Rails method to solve an incompatility issue between 1.8 and 1.9.
It's better no to couple Rails with the provider selection because in the future we may want to seperate the provider selection code into a separate gem.
## Code After:
module ProviderSelection
module ChainableStrategy
module InstanceMethods
def initialize(strategies, options = {})
@strategies = strategies
options ||= {}
if self.class.methods.map(&:to_s).include?('default_options')
@options = self.class.default_options.with_indifferent_access.merge(options)
else
@options = options
end
end
def method_missing(method, *args)
args.empty? ? @strategies.send(method) : @strategies.send(method, args)
end
end
end
end
|
module ProviderSelection
module ChainableStrategy
module InstanceMethods
def initialize(strategies, options = {})
@strategies = strategies
options ||= {}
- if self.class.method_names.include?('default_options')
? ----
+ if self.class.methods.map(&:to_s).include?('default_options')
? ++++++++++ +
@options = self.class.default_options.with_indifferent_access.merge(options)
else
@options = options
end
end
def method_missing(method, *args)
args.empty? ? @strategies.send(method) : @strategies.send(method, args)
end
end
end
end | 2 | 0.08 | 1 | 1 |
a380410f493f349c05255a58448538e16a12b04d | .travis.yml | .travis.yml | notifications:
recipients:
- koji.shimada@enishi-tech.com
language: ruby
rvm:
- 1.9.3
before_install:
- curl https://raw.github.com/groonga/groonga/master/data/travis/setup.sh | sh
| notifications:
recipients:
- koji.shimada@enishi-tech.com
language: ruby
rvm:
- 1.9.3
before_install:
- gem install bundler
- curl https://raw.github.com/groonga/groonga/master/data/travis/setup.sh | sh
| Install latest bundler on Travis | Install latest bundler on Travis
| YAML | mit | snoozer05/gglog | yaml | ## Code Before:
notifications:
recipients:
- koji.shimada@enishi-tech.com
language: ruby
rvm:
- 1.9.3
before_install:
- curl https://raw.github.com/groonga/groonga/master/data/travis/setup.sh | sh
## Instruction:
Install latest bundler on Travis
## Code After:
notifications:
recipients:
- koji.shimada@enishi-tech.com
language: ruby
rvm:
- 1.9.3
before_install:
- gem install bundler
- curl https://raw.github.com/groonga/groonga/master/data/travis/setup.sh | sh
| notifications:
recipients:
- koji.shimada@enishi-tech.com
language: ruby
rvm:
- 1.9.3
before_install:
+ - gem install bundler
- curl https://raw.github.com/groonga/groonga/master/data/travis/setup.sh | sh | 1 | 0.125 | 1 | 0 |
7672bb5020cbca4123cf81b948e01a462bba9377 | MultiColumnForm/css/styles.css | MultiColumnForm/css/styles.css | * {
font-family: 'Lato', Calibri, Arial, sans-serif;
background-color: #47a3da;
color: #fff;
border: 2px;
box-sizing: border-box;
}
container {
width: 90%;
}
body, .main, div, form, header, h1, h2, nav, span {
border: 2px solid #47a3da;
}
label, .first_name {
border: 2px solid #47a3da;
text-transform: uppercase;
display:block;
}
.cbp-mc-form input, .cbp-mc-form textarea, .cbp-mc-form select {
padding: 5px 10px;
color: #FFF;
line-height: 1.5px;
width: 33%;
border: 2px solid white;
}
.cbp-mc-form label, .cbp-mc-form input, .cbp-mc-form textarea{
float:left;
margin: 15px 10px;
width: 33%;
padding: 10px 10px;
}
| * {
font-family: 'Lato', Calibri, Arial, sans-serif;
background-color: #47a3da;
color: #fff;
border: 2px;
}
container {
width: 90%;
}
.cbp-mc-column {
float: left;
border: 1px solid black;
width: 33%;
}
.cbp-mc-column:nth-child(3) {
background-color: orange;
float: left;
width: 33%;
}
.cbp-mc-submit-wrap {
background-color: magenta;
clear:both;
}
.cbp-mc-form input, .cbp-mc-form textarea, .cbp-mc-form select {
padding: 5px 10px;
color: #FFF;
line-height: 1.5px;
width: 33%;
border: 2px solid white;
}
.cbp-mc-form label, input, textarea, select{
margin: 15px 10px;
width: 33%;
padding: 10px 10px;
}
label {
text-transform: uppercase;
display:block;
}
| Fix float and display. Add tracer bullets | Fix float and display. Add tracer bullets
| CSS | cc0-1.0 | ShayHall/TIY-Assignments,ShayHall/TIY-Assignments | css | ## Code Before:
* {
font-family: 'Lato', Calibri, Arial, sans-serif;
background-color: #47a3da;
color: #fff;
border: 2px;
box-sizing: border-box;
}
container {
width: 90%;
}
body, .main, div, form, header, h1, h2, nav, span {
border: 2px solid #47a3da;
}
label, .first_name {
border: 2px solid #47a3da;
text-transform: uppercase;
display:block;
}
.cbp-mc-form input, .cbp-mc-form textarea, .cbp-mc-form select {
padding: 5px 10px;
color: #FFF;
line-height: 1.5px;
width: 33%;
border: 2px solid white;
}
.cbp-mc-form label, .cbp-mc-form input, .cbp-mc-form textarea{
float:left;
margin: 15px 10px;
width: 33%;
padding: 10px 10px;
}
## Instruction:
Fix float and display. Add tracer bullets
## Code After:
* {
font-family: 'Lato', Calibri, Arial, sans-serif;
background-color: #47a3da;
color: #fff;
border: 2px;
}
container {
width: 90%;
}
.cbp-mc-column {
float: left;
border: 1px solid black;
width: 33%;
}
.cbp-mc-column:nth-child(3) {
background-color: orange;
float: left;
width: 33%;
}
.cbp-mc-submit-wrap {
background-color: magenta;
clear:both;
}
.cbp-mc-form input, .cbp-mc-form textarea, .cbp-mc-form select {
padding: 5px 10px;
color: #FFF;
line-height: 1.5px;
width: 33%;
border: 2px solid white;
}
.cbp-mc-form label, input, textarea, select{
margin: 15px 10px;
width: 33%;
padding: 10px 10px;
}
label {
text-transform: uppercase;
display:block;
}
| * {
font-family: 'Lato', Calibri, Arial, sans-serif;
background-color: #47a3da;
color: #fff;
border: 2px;
- box-sizing: border-box;
}
container {
width: 90%;
}
- body, .main, div, form, header, h1, h2, nav, span {
+ .cbp-mc-column {
+ float: left;
- border: 2px solid #47a3da;
? -- ^ ^^^ ^^^
+ border: 1px solid black;
? ^ ^^ ^^
+ width: 33%;
}
- label, .first_name {
- border: 2px solid #47a3da;
- text-transform: uppercase;
- display:block;
+ .cbp-mc-column:nth-child(3) {
+ background-color: orange;
+ float: left;
+ width: 33%;
+ }
+
+ .cbp-mc-submit-wrap {
+ background-color: magenta;
+ clear:both;
}
.cbp-mc-form input, .cbp-mc-form textarea, .cbp-mc-form select {
padding: 5px 10px;
color: #FFF;
line-height: 1.5px;
width: 33%;
border: 2px solid white;
-
}
+ .cbp-mc-form label, input, textarea, select{
- .cbp-mc-form label, .cbp-mc-form input, .cbp-mc-form textarea{
- float:left;
margin: 15px 10px;
width: 33%;
padding: 10px 10px;
}
+
+ label {
+ text-transform: uppercase;
+ display:block;
+ } | 29 | 0.783784 | 19 | 10 |
5afdd695b58247f2f4594cd8b2ee17eef9aca24c | erudite.gemspec | erudite.gemspec |
require_relative 'lib/erudite/version'
Gem::Specification.new do |gem|
gem.name = 'erudite'
gem.version = Erudite::VERSION.to_s
gem.summary = 'Executable documentation.'
gem.description = gem.summary
gem.homepage = 'https://github.com/tfausak/erudite'
gem.author = 'Taylor Fausak'
gem.email = 'taylor@fausak.me'
gem.license = 'MIT'
gem.executable = 'erudite'
gem.files = %w(CHANGELOG.md CONTRIBUTING.md LICENSE.md README.md) +
Dir.glob(File.join(gem.require_path, '**', '*.rb'))
gem.test_files = Dir.glob(File.join('spec', '**', '*.rb'))
gem.required_ruby_version = '>= 1.9.3'
gem.add_development_dependency 'coveralls', '~> 0.7'
gem.add_development_dependency 'rake', '~> 10.4'
gem.add_development_dependency 'rspec', '~> 3.1'
gem.add_development_dependency 'rubocop', '~> 0.28'
end
|
Gem::Specification.new do |gem|
gem.name = 'erudite'
gem.version = '0.2.0'
gem.summary = 'Executable documentation.'
gem.description = gem.summary
gem.homepage = 'https://github.com/tfausak/erudite'
gem.author = 'Taylor Fausak'
gem.email = 'taylor@fausak.me'
gem.license = 'MIT'
gem.executable = 'erudite'
gem.files = %w(CHANGELOG.md CONTRIBUTING.md LICENSE.md README.md) +
Dir.glob(File.join(gem.require_path, '**', '*.rb'))
gem.test_files = Dir.glob(File.join('spec', '**', '*.rb'))
gem.required_ruby_version = '>= 1.9.3'
gem.add_development_dependency 'coveralls', '~> 0.7'
gem.add_development_dependency 'rake', '~> 10.4'
gem.add_development_dependency 'rspec', '~> 3.1'
gem.add_development_dependency 'rubocop', '~> 0.28'
end
| Revert "Dynamically load the version from Erudite::VERSION" | Revert "Dynamically load the version from Erudite::VERSION"
This reverts commit e77c6a962259d4181cbb90cfc19e40cb818ab02d.
| Ruby | mit | tfausak/erudite | ruby | ## Code Before:
require_relative 'lib/erudite/version'
Gem::Specification.new do |gem|
gem.name = 'erudite'
gem.version = Erudite::VERSION.to_s
gem.summary = 'Executable documentation.'
gem.description = gem.summary
gem.homepage = 'https://github.com/tfausak/erudite'
gem.author = 'Taylor Fausak'
gem.email = 'taylor@fausak.me'
gem.license = 'MIT'
gem.executable = 'erudite'
gem.files = %w(CHANGELOG.md CONTRIBUTING.md LICENSE.md README.md) +
Dir.glob(File.join(gem.require_path, '**', '*.rb'))
gem.test_files = Dir.glob(File.join('spec', '**', '*.rb'))
gem.required_ruby_version = '>= 1.9.3'
gem.add_development_dependency 'coveralls', '~> 0.7'
gem.add_development_dependency 'rake', '~> 10.4'
gem.add_development_dependency 'rspec', '~> 3.1'
gem.add_development_dependency 'rubocop', '~> 0.28'
end
## Instruction:
Revert "Dynamically load the version from Erudite::VERSION"
This reverts commit e77c6a962259d4181cbb90cfc19e40cb818ab02d.
## Code After:
Gem::Specification.new do |gem|
gem.name = 'erudite'
gem.version = '0.2.0'
gem.summary = 'Executable documentation.'
gem.description = gem.summary
gem.homepage = 'https://github.com/tfausak/erudite'
gem.author = 'Taylor Fausak'
gem.email = 'taylor@fausak.me'
gem.license = 'MIT'
gem.executable = 'erudite'
gem.files = %w(CHANGELOG.md CONTRIBUTING.md LICENSE.md README.md) +
Dir.glob(File.join(gem.require_path, '**', '*.rb'))
gem.test_files = Dir.glob(File.join('spec', '**', '*.rb'))
gem.required_ruby_version = '>= 1.9.3'
gem.add_development_dependency 'coveralls', '~> 0.7'
gem.add_development_dependency 'rake', '~> 10.4'
gem.add_development_dependency 'rspec', '~> 3.1'
gem.add_development_dependency 'rubocop', '~> 0.28'
end
| -
- require_relative 'lib/erudite/version'
Gem::Specification.new do |gem|
gem.name = 'erudite'
- gem.version = Erudite::VERSION.to_s
+ gem.version = '0.2.0'
gem.summary = 'Executable documentation.'
gem.description = gem.summary
gem.homepage = 'https://github.com/tfausak/erudite'
gem.author = 'Taylor Fausak'
gem.email = 'taylor@fausak.me'
gem.license = 'MIT'
gem.executable = 'erudite'
gem.files = %w(CHANGELOG.md CONTRIBUTING.md LICENSE.md README.md) +
Dir.glob(File.join(gem.require_path, '**', '*.rb'))
gem.test_files = Dir.glob(File.join('spec', '**', '*.rb'))
gem.required_ruby_version = '>= 1.9.3'
gem.add_development_dependency 'coveralls', '~> 0.7'
gem.add_development_dependency 'rake', '~> 10.4'
gem.add_development_dependency 'rspec', '~> 3.1'
gem.add_development_dependency 'rubocop', '~> 0.28'
end | 4 | 0.16 | 1 | 3 |
1862f018fc23828b852dc00bd79d71a59d09098f | travis-deploy-to-docker-registry.sh | travis-deploy-to-docker-registry.sh |
DOCKER_USERNAME=${DOCKER_USERNAME-foo}
DOCKER_PASSWORD=${DOCKER_PASSWORD-bar}
# Deploy only if the following conditions are satisfied
# 1. The build is for the project paypal/docker-selion, not on the fork
# 3. The build is not on a pull request
# 4. The build is on the develop branch
if [ "$TRAVIS_REPO_SLUG" = "paypal/docker-selion" ] && [ "$TRAVIS_PULL_REQUEST" = "false" ] && [ "$TRAVIS_BRANCH" = "develop" ]; then
echo "Deploying to Docker registry...\n"
# verify that we are on develop branch, otherwise exit with error code
git checkout develop
output=$(git rev-parse --abbrev-ref HEAD)
if [ "$output" = "develop" ]; then
docker login -u "$DOCKER_USERNAME" -p "$DOCKER_PASSWORD"
if [ $? -ne 0 ]; then
echo "Failed to authenticate with Docker registry."
exit 1
fi
make dev_release
else
echo "Not on the develop branch."
exit 1
fi
else
echo "Deployment selection criteria not met."
fi
|
DOCKER_USERNAME=${DOCKER_USERNAME-foo}
DOCKER_PASSWORD=${DOCKER_PASSWORD-bar}
# Deploy only if the following conditions are satisfied
# 1. The build is for the project paypal/docker-selion, not on the fork
# 3. The build is not on a pull request
# 4. The build is on the develop branch
if [ "$TRAVIS_REPO_SLUG" = "paypal/docker-selion" ] && [ "$TRAVIS_PULL_REQUEST" = "false" ] && [ "$TRAVIS_BRANCH" = "develop" ]; then
echo "Deploying to Docker registry...\n"
# verify that we are on develop branch, otherwise exit with error code
git checkout develop
output=$(git rev-parse --abbrev-ref HEAD)
if [ "$output" = "develop" ]; then
docker login -u "$DOCKER_USERNAME" -p "$DOCKER_PASSWORD" -e "$DOCKER_EMAIL"
if [ $? -ne 0 ]; then
echo "Failed to authenticate with Docker registry."
exit 1
fi
make dev_release
else
echo "Not on the develop branch."
exit 1
fi
else
echo "Deployment selection criteria not met."
fi
| Add email for 'docker login' publishing step | Add email for 'docker login' publishing step
| Shell | apache-2.0 | paypal/docker-selion,paypal/docker-selion | shell | ## Code Before:
DOCKER_USERNAME=${DOCKER_USERNAME-foo}
DOCKER_PASSWORD=${DOCKER_PASSWORD-bar}
# Deploy only if the following conditions are satisfied
# 1. The build is for the project paypal/docker-selion, not on the fork
# 3. The build is not on a pull request
# 4. The build is on the develop branch
if [ "$TRAVIS_REPO_SLUG" = "paypal/docker-selion" ] && [ "$TRAVIS_PULL_REQUEST" = "false" ] && [ "$TRAVIS_BRANCH" = "develop" ]; then
echo "Deploying to Docker registry...\n"
# verify that we are on develop branch, otherwise exit with error code
git checkout develop
output=$(git rev-parse --abbrev-ref HEAD)
if [ "$output" = "develop" ]; then
docker login -u "$DOCKER_USERNAME" -p "$DOCKER_PASSWORD"
if [ $? -ne 0 ]; then
echo "Failed to authenticate with Docker registry."
exit 1
fi
make dev_release
else
echo "Not on the develop branch."
exit 1
fi
else
echo "Deployment selection criteria not met."
fi
## Instruction:
Add email for 'docker login' publishing step
## Code After:
DOCKER_USERNAME=${DOCKER_USERNAME-foo}
DOCKER_PASSWORD=${DOCKER_PASSWORD-bar}
# Deploy only if the following conditions are satisfied
# 1. The build is for the project paypal/docker-selion, not on the fork
# 3. The build is not on a pull request
# 4. The build is on the develop branch
if [ "$TRAVIS_REPO_SLUG" = "paypal/docker-selion" ] && [ "$TRAVIS_PULL_REQUEST" = "false" ] && [ "$TRAVIS_BRANCH" = "develop" ]; then
echo "Deploying to Docker registry...\n"
# verify that we are on develop branch, otherwise exit with error code
git checkout develop
output=$(git rev-parse --abbrev-ref HEAD)
if [ "$output" = "develop" ]; then
docker login -u "$DOCKER_USERNAME" -p "$DOCKER_PASSWORD" -e "$DOCKER_EMAIL"
if [ $? -ne 0 ]; then
echo "Failed to authenticate with Docker registry."
exit 1
fi
make dev_release
else
echo "Not on the develop branch."
exit 1
fi
else
echo "Deployment selection criteria not met."
fi
|
DOCKER_USERNAME=${DOCKER_USERNAME-foo}
DOCKER_PASSWORD=${DOCKER_PASSWORD-bar}
# Deploy only if the following conditions are satisfied
# 1. The build is for the project paypal/docker-selion, not on the fork
# 3. The build is not on a pull request
# 4. The build is on the develop branch
if [ "$TRAVIS_REPO_SLUG" = "paypal/docker-selion" ] && [ "$TRAVIS_PULL_REQUEST" = "false" ] && [ "$TRAVIS_BRANCH" = "develop" ]; then
echo "Deploying to Docker registry...\n"
# verify that we are on develop branch, otherwise exit with error code
git checkout develop
output=$(git rev-parse --abbrev-ref HEAD)
if [ "$output" = "develop" ]; then
- docker login -u "$DOCKER_USERNAME" -p "$DOCKER_PASSWORD"
+ docker login -u "$DOCKER_USERNAME" -p "$DOCKER_PASSWORD" -e "$DOCKER_EMAIL"
? +++++++++++++++++++
if [ $? -ne 0 ]; then
echo "Failed to authenticate with Docker registry."
exit 1
fi
make dev_release
else
echo "Not on the develop branch."
exit 1
fi
else
echo "Deployment selection criteria not met."
fi | 2 | 0.074074 | 1 | 1 |
c529ee1291490d306118e561f5d0062381f8a6a0 | .github/workflows/verify-sling-release.yml | .github/workflows/verify-sling-release.yml |
name: Verify Sling Release
# Controls when the action will run.
on:
workflow_dispatch:
inputs:
releaseId:
description: Sling Release ID
required: true
jobs:
verify:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
repository: apache/sling-org-apache-sling-committer-cli
- name: Set up JDK 11
uses: actions/setup-java@v1
with:
java-version: 11
- name: Build Sling Committers CLI
run: mvn clean package
- name: Verify Release
run: docker run -e ASF_USERNAME -e ASF_PASSWORD apache/sling-cli release verify -r ${{ github.event.inputs.releaseId }}
env:
ASF_USERNAME: ${{ secrets.ASF_USERNAME }}
ASF_PASSWORD: ${{ secrets.ASF_PASSWORD }}
|
name: Verify Sling Release
# Controls when the action will run.
on:
workflow_dispatch:
inputs:
releaseId:
description: Sling Release ID
required: true
jobs:
verify:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
repository: apache/sling-org-apache-sling-committer-cli
ref: SLING-10294-status-code
- name: Set up JDK 11
uses: actions/setup-java@v1
with:
java-version: 11
- name: Build Sling Committers CLI
run: mvn clean package
- name: Verify Release
run: docker run -e ASF_USERNAME -e ASF_PASSWORD apache/sling-cli release verify -r ${{ github.event.inputs.releaseId }}
env:
ASF_USERNAME: ${{ secrets.ASF_USERNAME }}
ASF_PASSWORD: ${{ secrets.ASF_PASSWORD }}
| Set to checkout a branch of the Committer CLI supporting return codes | Set to checkout a branch of the Committer CLI supporting return codes | YAML | mit | klcodanr/Scripts | yaml | ## Code Before:
name: Verify Sling Release
# Controls when the action will run.
on:
workflow_dispatch:
inputs:
releaseId:
description: Sling Release ID
required: true
jobs:
verify:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
repository: apache/sling-org-apache-sling-committer-cli
- name: Set up JDK 11
uses: actions/setup-java@v1
with:
java-version: 11
- name: Build Sling Committers CLI
run: mvn clean package
- name: Verify Release
run: docker run -e ASF_USERNAME -e ASF_PASSWORD apache/sling-cli release verify -r ${{ github.event.inputs.releaseId }}
env:
ASF_USERNAME: ${{ secrets.ASF_USERNAME }}
ASF_PASSWORD: ${{ secrets.ASF_PASSWORD }}
## Instruction:
Set to checkout a branch of the Committer CLI supporting return codes
## Code After:
name: Verify Sling Release
# Controls when the action will run.
on:
workflow_dispatch:
inputs:
releaseId:
description: Sling Release ID
required: true
jobs:
verify:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
repository: apache/sling-org-apache-sling-committer-cli
ref: SLING-10294-status-code
- name: Set up JDK 11
uses: actions/setup-java@v1
with:
java-version: 11
- name: Build Sling Committers CLI
run: mvn clean package
- name: Verify Release
run: docker run -e ASF_USERNAME -e ASF_PASSWORD apache/sling-cli release verify -r ${{ github.event.inputs.releaseId }}
env:
ASF_USERNAME: ${{ secrets.ASF_USERNAME }}
ASF_PASSWORD: ${{ secrets.ASF_PASSWORD }}
|
name: Verify Sling Release
# Controls when the action will run.
on:
workflow_dispatch:
inputs:
releaseId:
description: Sling Release ID
required: true
jobs:
verify:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
repository: apache/sling-org-apache-sling-committer-cli
+ ref: SLING-10294-status-code
- name: Set up JDK 11
uses: actions/setup-java@v1
with:
java-version: 11
- name: Build Sling Committers CLI
run: mvn clean package
- name: Verify Release
run: docker run -e ASF_USERNAME -e ASF_PASSWORD apache/sling-cli release verify -r ${{ github.event.inputs.releaseId }}
env:
ASF_USERNAME: ${{ secrets.ASF_USERNAME }}
ASF_PASSWORD: ${{ secrets.ASF_PASSWORD }} | 1 | 0.035714 | 1 | 0 |
8f13edd6cf1fe4996cf26b73afbfd537024757dc | Assignment1/DatasetDescriptions.csv | Assignment1/DatasetDescriptions.csv | Your name; Dataset name; Dataset URL; Dataset brief description
Sandra Saez; AragoDBPedia; http://opendata.aragon.es/aragopedia; This dataset contains data obtained through an automatic extraction process from Aragopedia and contains information about all the municipalities, boroughs, and provinces of the region of Aragón, in Spain.
Nestor Lopez Rivas;2001 Spanish Census to RDF;http://dataweb.infor.uva.es/census2001;El conjunto de datos se refiere a un muestreo del 5% del censo realizado en españa en 2001.
Oscar Fernandez;DBpedia ES;http://es.dbpedia.org;The DBpedia project generates semantic information from the wikipedia(Spanish language)
Jorge Gonzalez;Didactalia;http://didactalia.net/comunidad/materialeducativo;Didactalia.net is a global community and an storage place for teachers, students and parents to create, share and find open educational resources
Marco López de Miguel;Datos abiertos Zaragoza;https://www.zaragoza.es/ciudad/risp/; Datos Abiertos Zaragoza es una iniciativa del Ayuntamiento de Zaragoza para el fomento de la reutilización de la información publicada en su web por parte de la ciudadanía, las empresas y otros organismos. | Your name; Dataset name; Dataset URL; Dataset brief description
Sandra Saez; AragoDBPedia; http://opendata.aragon.es/aragopedia; This dataset contains data obtained through an automatic extraction process from Aragopedia and contains information about all the municipalities, boroughs, and provinces of the region of Aragón, in Spain.
Nestor Lopez Rivas;2001 Spanish Census to RDF;http://dataweb.infor.uva.es/census2001;El conjunto de datos se refiere a un muestreo del 5% del censo realizado en españa en 2001.
Oscar Fernandez;DBpedia ES;http://es.dbpedia.org;The DBpedia project generates semantic information from the wikipedia(Spanish language)
Jorge Gonzalez;Didactalia;http://didactalia.net/comunidad/materialeducativo;Didactalia.net is a global community and an storage place for teachers, students and parents to create, share and find open educational resources | Revert "Include Datos abiertos Zaragoza" | Revert "Include Datos abiertos Zaragoza"
This reverts commit b5b5dbd27e7647bb8ac903070e86b20e66cbbfb3.
| CSV | apache-2.0 | AlejandroHerrero/Curso2014-2015,FacultadInformatica-LinkedData/Curso2014-2015,AlejandroHerrero/Curso2014-2015,FacultadInformatica-LinkedData/Curso2014-2015,FacultadInformatica-LinkedData/Curso2014-2015,AlejandroHerrero/Curso2014-2015,FacultadInformatica-LinkedData/Curso2014-2015 | csv | ## Code Before:
Your name; Dataset name; Dataset URL; Dataset brief description
Sandra Saez; AragoDBPedia; http://opendata.aragon.es/aragopedia; This dataset contains data obtained through an automatic extraction process from Aragopedia and contains information about all the municipalities, boroughs, and provinces of the region of Aragón, in Spain.
Nestor Lopez Rivas;2001 Spanish Census to RDF;http://dataweb.infor.uva.es/census2001;El conjunto de datos se refiere a un muestreo del 5% del censo realizado en españa en 2001.
Oscar Fernandez;DBpedia ES;http://es.dbpedia.org;The DBpedia project generates semantic information from the wikipedia(Spanish language)
Jorge Gonzalez;Didactalia;http://didactalia.net/comunidad/materialeducativo;Didactalia.net is a global community and an storage place for teachers, students and parents to create, share and find open educational resources
Marco López de Miguel;Datos abiertos Zaragoza;https://www.zaragoza.es/ciudad/risp/; Datos Abiertos Zaragoza es una iniciativa del Ayuntamiento de Zaragoza para el fomento de la reutilización de la información publicada en su web por parte de la ciudadanía, las empresas y otros organismos.
## Instruction:
Revert "Include Datos abiertos Zaragoza"
This reverts commit b5b5dbd27e7647bb8ac903070e86b20e66cbbfb3.
## Code After:
Your name; Dataset name; Dataset URL; Dataset brief description
Sandra Saez; AragoDBPedia; http://opendata.aragon.es/aragopedia; This dataset contains data obtained through an automatic extraction process from Aragopedia and contains information about all the municipalities, boroughs, and provinces of the region of Aragón, in Spain.
Nestor Lopez Rivas;2001 Spanish Census to RDF;http://dataweb.infor.uva.es/census2001;El conjunto de datos se refiere a un muestreo del 5% del censo realizado en españa en 2001.
Oscar Fernandez;DBpedia ES;http://es.dbpedia.org;The DBpedia project generates semantic information from the wikipedia(Spanish language)
Jorge Gonzalez;Didactalia;http://didactalia.net/comunidad/materialeducativo;Didactalia.net is a global community and an storage place for teachers, students and parents to create, share and find open educational resources | Your name; Dataset name; Dataset URL; Dataset brief description
Sandra Saez; AragoDBPedia; http://opendata.aragon.es/aragopedia; This dataset contains data obtained through an automatic extraction process from Aragopedia and contains information about all the municipalities, boroughs, and provinces of the region of Aragón, in Spain.
Nestor Lopez Rivas;2001 Spanish Census to RDF;http://dataweb.infor.uva.es/census2001;El conjunto de datos se refiere a un muestreo del 5% del censo realizado en españa en 2001.
Oscar Fernandez;DBpedia ES;http://es.dbpedia.org;The DBpedia project generates semantic information from the wikipedia(Spanish language)
Jorge Gonzalez;Didactalia;http://didactalia.net/comunidad/materialeducativo;Didactalia.net is a global community and an storage place for teachers, students and parents to create, share and find open educational resources
- Marco López de Miguel;Datos abiertos Zaragoza;https://www.zaragoza.es/ciudad/risp/; Datos Abiertos Zaragoza es una iniciativa del Ayuntamiento de Zaragoza para el fomento de la reutilización de la información publicada en su web por parte de la ciudadanía, las empresas y otros organismos. | 1 | 0.166667 | 0 | 1 |
59f81978ebd922d3ad3870f5102f591cf5880e04 | lib/caches_constants.rb | lib/caches_constants.rb | module ActiveRecord
class Base
class << self
def caches_constants(additional_options = {})
options = {:key => :name, :limit => 64}.merge(additional_options)
find(:all).each do |model|
const = Viget::Format.to_const(model.send(options[:key].to_sym))
const = const[0, options[:limit]] unless const.blank?
set_constant = !const.blank? && !const_defined?(const)
const_set(const, model) if set_constant
end
end
end
end
end | module ActiveRecord
class Base
class << self
def caches_constants(additional_options = {})
options = {:key => :name, :limit => 64}.merge(additional_options)
find(:all).each do |model|
const = Viget::Format.to_const(model.send(options[:key].to_sym))
unless const.blank?
const = const[0, options[:limit]]
const_set(const, model) if !const.blank? && !const_defined?(const)
end
end
end
end
end
end | Refactor code per Ben's suggestions | Refactor code per Ben's suggestions
git-svn-id: 61896eb63f1561c389be937852d9629e8857f2e4@134 e959a6d6-1924-0410-92b3-a6fa492f4c66
| Ruby | mit | theworldbright/constant_cache,tpitale/constant_cache | ruby | ## Code Before:
module ActiveRecord
class Base
class << self
def caches_constants(additional_options = {})
options = {:key => :name, :limit => 64}.merge(additional_options)
find(:all).each do |model|
const = Viget::Format.to_const(model.send(options[:key].to_sym))
const = const[0, options[:limit]] unless const.blank?
set_constant = !const.blank? && !const_defined?(const)
const_set(const, model) if set_constant
end
end
end
end
end
## Instruction:
Refactor code per Ben's suggestions
git-svn-id: 61896eb63f1561c389be937852d9629e8857f2e4@134 e959a6d6-1924-0410-92b3-a6fa492f4c66
## Code After:
module ActiveRecord
class Base
class << self
def caches_constants(additional_options = {})
options = {:key => :name, :limit => 64}.merge(additional_options)
find(:all).each do |model|
const = Viget::Format.to_const(model.send(options[:key].to_sym))
unless const.blank?
const = const[0, options[:limit]]
const_set(const, model) if !const.blank? && !const_defined?(const)
end
end
end
end
end
end | module ActiveRecord
class Base
class << self
def caches_constants(additional_options = {})
options = {:key => :name, :limit => 64}.merge(additional_options)
find(:all).each do |model|
const = Viget::Format.to_const(model.send(options[:key].to_sym))
+ unless const.blank?
- const = const[0, options[:limit]] unless const.blank?
? --------------------
+ const = const[0, options[:limit]]
? ++
- set_constant = !const.blank? && !const_defined?(const)
? ^^^^ ^ ^
+ const_set(const, model) if !const.blank? && !const_defined?(const)
? ^^ ^^^^^^^ + + ^^^^^^^^^
- const_set(const, model) if set_constant
+ end
end
end
end
end
end | 7 | 0.466667 | 4 | 3 |
2128e8337a1b39fdad17e3388d1c70e9d67099b0 | diytrainer/templates/guides/feedback_form.html | diytrainer/templates/guides/feedback_form.html | {% extends 'base_guides.html' %}
{% load crispy_forms_tags %}
{% block title %}Feedback | {{ guide.name }}{{ block.super }}{% endblock title %}
{% block view %}feedback{% endblock view %}
{% block header %}
<header>
<hgroup>
<h1>DIY Trainer</h1>
</hgroup>
</header>
{% endblock header %}
{% block content %}
<div id="feedback_thanks">
<h2>Thank you</h2>
<h3>Thanks for your interest!<br>We're working hard to get DIY Trainer in shape for you.</h3>
</div>
<div id="feedback_form">
<form method="post">
{% csrf_token %}
{{ form|crispy }}
<div class="range_helper">
<span class="start"><span class="range_number">1</span><br>Completely<br>Mortified</span>
<span class="finish"><span class="range_number">10</span><br>Utterly<br>Fearless</span>
</div>
<input type="submit" value="Submit ›">
</form>
</div>
{% endblock content %} | {% extends 'base_guides.html' %}
{% load crispy_forms_tags %}
{% block title %}Feedback | {{ guide.name }}{{ block.super }}{% endblock title %}
{% block view %}feedback{% endblock view %}
{% block header %}
<header>
<hgroup>
<h1>DIY Trainer</h1>
</hgroup>
</header>
{% endblock header %}
{% block content %}
<div id="feedback_thanks">
<h2>Thank you</h2>
<h3>Thanks for your interest!<br>We're working hard to get DIY Trainer in shape for you.</h3>
</div>
<div id="feedback_form">
<form method="post">
{% csrf_token %}
{{ form|crispy }}
<div class="range_helper">
<span class="start"><span class="range_number">1</span><br>Completely<br>Mortified</span>
<span class="finish"><span class="range_number">10</span><br>Utterly<br>Fearless</span>
</div>
<input type="submit" value="Submit ›">
</form>
</div>
{% endblock content %}
{% block extra_js %}
{% if guide.version == 2 %}
<script>
// Grab the first question label, and rewrite the copy for this guide.
$( "#div_id_project_recommendation label" ).text( "We're busy building out our library of home improvement experts. Tell us which projects you'd like advice on the most..." );
</script>
{% endif %}
{{ block.super }}
{% endblock extra_js %} | Add method of changing the form label for guide 2. | Add method of changing the form label for guide 2.
| HTML | mit | patrickbeeson/diy-trainer | html | ## Code Before:
{% extends 'base_guides.html' %}
{% load crispy_forms_tags %}
{% block title %}Feedback | {{ guide.name }}{{ block.super }}{% endblock title %}
{% block view %}feedback{% endblock view %}
{% block header %}
<header>
<hgroup>
<h1>DIY Trainer</h1>
</hgroup>
</header>
{% endblock header %}
{% block content %}
<div id="feedback_thanks">
<h2>Thank you</h2>
<h3>Thanks for your interest!<br>We're working hard to get DIY Trainer in shape for you.</h3>
</div>
<div id="feedback_form">
<form method="post">
{% csrf_token %}
{{ form|crispy }}
<div class="range_helper">
<span class="start"><span class="range_number">1</span><br>Completely<br>Mortified</span>
<span class="finish"><span class="range_number">10</span><br>Utterly<br>Fearless</span>
</div>
<input type="submit" value="Submit ›">
</form>
</div>
{% endblock content %}
## Instruction:
Add method of changing the form label for guide 2.
## Code After:
{% extends 'base_guides.html' %}
{% load crispy_forms_tags %}
{% block title %}Feedback | {{ guide.name }}{{ block.super }}{% endblock title %}
{% block view %}feedback{% endblock view %}
{% block header %}
<header>
<hgroup>
<h1>DIY Trainer</h1>
</hgroup>
</header>
{% endblock header %}
{% block content %}
<div id="feedback_thanks">
<h2>Thank you</h2>
<h3>Thanks for your interest!<br>We're working hard to get DIY Trainer in shape for you.</h3>
</div>
<div id="feedback_form">
<form method="post">
{% csrf_token %}
{{ form|crispy }}
<div class="range_helper">
<span class="start"><span class="range_number">1</span><br>Completely<br>Mortified</span>
<span class="finish"><span class="range_number">10</span><br>Utterly<br>Fearless</span>
</div>
<input type="submit" value="Submit ›">
</form>
</div>
{% endblock content %}
{% block extra_js %}
{% if guide.version == 2 %}
<script>
// Grab the first question label, and rewrite the copy for this guide.
$( "#div_id_project_recommendation label" ).text( "We're busy building out our library of home improvement experts. Tell us which projects you'd like advice on the most..." );
</script>
{% endif %}
{{ block.super }}
{% endblock extra_js %} | {% extends 'base_guides.html' %}
{% load crispy_forms_tags %}
{% block title %}Feedback | {{ guide.name }}{{ block.super }}{% endblock title %}
{% block view %}feedback{% endblock view %}
{% block header %}
<header>
<hgroup>
<h1>DIY Trainer</h1>
</hgroup>
</header>
{% endblock header %}
{% block content %}
<div id="feedback_thanks">
<h2>Thank you</h2>
<h3>Thanks for your interest!<br>We're working hard to get DIY Trainer in shape for you.</h3>
</div>
<div id="feedback_form">
<form method="post">
{% csrf_token %}
{{ form|crispy }}
<div class="range_helper">
<span class="start"><span class="range_number">1</span><br>Completely<br>Mortified</span>
<span class="finish"><span class="range_number">10</span><br>Utterly<br>Fearless</span>
</div>
<input type="submit" value="Submit ›">
</form>
</div>
{% endblock content %}
+
+ {% block extra_js %}
+ {% if guide.version == 2 %}
+ <script>
+ // Grab the first question label, and rewrite the copy for this guide.
+ $( "#div_id_project_recommendation label" ).text( "We're busy building out our library of home improvement experts. Tell us which projects you'd like advice on the most..." );
+ </script>
+ {% endif %}
+
+ {{ block.super }}
+ {% endblock extra_js %} | 11 | 0.314286 | 11 | 0 |
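The change above rewrites a crispy-forms label in the browser with jQuery. The same per-version copy override can also live server-side as a plain lookup — a minimal sketch, where the `label_for` helper and the default label text are illustrative, not part of the diy-trainer project:

```python
# Per-guide-version label overrides; fall back to the default copy.
# The default string and the version key are illustrative assumptions.
DEFAULT_LABEL = "Which home improvement project would you like advice on?"

LABEL_OVERRIDES = {
    2: (
        "We're busy building out our library of home improvement experts. "
        "Tell us which projects you'd like advice on the most..."
    ),
}

def label_for(version):
    """Return the question label for a given guide version."""
    return LABEL_OVERRIDES.get(version, DEFAULT_LABEL)
```

A view (or a form's `__init__`) could assign `label_for(guide.version)` to the field's `label`, which avoids a flash of the old copy before the client-side script runs.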
ef0ea057994b56678ce462a23de4b0a69243ec29 | scripts/test.sh | scripts/test.sh |
set -v # print commands as they're executed
pip install tensorflow==$TF_VERSION pytest
pytest .
|
set -ev # print commands as they're executed and immediately fail if one has a non zero exit code
pip install tensorflow==$TF_VERSION pytest
pytest .
| Switch to .sh to see if it fixes CI | Switch to .sh to see if it fixes CI
| Shell | apache-2.0 | larq/larq | shell | ## Code Before:
set -v # print commands as they're executed
pip install tensorflow==$TF_VERSION pytest
pytest .
## Instruction:
Switch to .sh to see if it fixes CI
## Code After:
set -ev # print commands as they're executed and immediately fail if one has a non zero exit code
pip install tensorflow==$TF_VERSION pytest
pytest .
|
- set -v # print commands as they're executed
+ set -ev # print commands as they're executed and immediately fail if one has a non zero exit code
pip install tensorflow==$TF_VERSION pytest
pytest . | 2 | 0.333333 | 1 | 1 |
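The one-character change matters because `-e` aborts the script at the first failing command instead of carrying on. A small demonstration of the difference, driving a POSIX shell from Python (assumes an `sh` on PATH):

```python
import subprocess

def run(script):
    """Run a short shell script and return (exit_code, stdout)."""
    proc = subprocess.run(["sh", "-c", script], capture_output=True, text=True)
    return proc.returncode, proc.stdout

# Without -e, the failed `false` is ignored and the script "succeeds".
code_plain, out_plain = run("false; echo reached")

# With -e, the script stops at `false` and reports failure.
code_strict, out_strict = run("set -e; false; echo reached")
```

In CI this is the difference between `pip install` failing silently (with `pytest` then running against a stale or missing install) and the job stopping immediately with a clear non-zero status.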
c9813eed9c663a8eff0d1184f6ec4ad6724d90a9 | cloudformation.yml | cloudformation.yml | AWSTemplateFormatVersion: '2010-09-09'
Description: PhotoS3
Parameters:
BucketPrefix:
Type: String
Description: Prefix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: photos3-
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
BucketSuffix:
Type: String
Description: Suffix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: ''
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
Conditions: {}
Mappings: {}
Outputs:
UploadBucket:
Description: S3 bucket for uploads
Value: !Ref PhotoBucket
Resources:
PhotoBucket:
Type: AWS::S3::Bucket
Properties:
BucketName: !Sub ${BucketPrefix}${AWS::AccountId}${BucketSuffix}
# TODO Add event notification
# http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-s3-bucket-notificationconfig.html
# vim: ts=2:sw=2
| AWSTemplateFormatVersion: '2010-09-09'
Description: PhotoS3
Parameters:
BucketPrefix:
Type: String
Description: Prefix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: photos3-
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
BucketSuffix:
Type: String
Description: Suffix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: ''
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
Conditions: {}
Mappings: {}
Outputs:
UploadBucket:
Description: S3 bucket for uploads
Value: !Ref PhotoBucket
Resources:
PhotoBucket:
Type: AWS::S3::Bucket
Properties:
BucketName: !Sub ${BucketPrefix}${AWS::AccountId}${BucketSuffix}
NotificationConfiguration:
QueueConfigurations:
- Event: s3:ObjectCreated:*
Filter:
S3Key:
Rules:
- Name: prefix
Value: source
Queue: !Sub ${NewImageQueue.Arn}
NewImageQueue:
Type: AWS::SQS::Queue
Properties:
VisibilityTimeout: 300
NewImageQueuePolicy:
Type: AWS::SQS::QueuePolicy
Properties:
Queues:
- !Ref NewImageQueue
PolicyDocument:
Version: 2012-10-17
Id: S3Publish
Statement:
- Sid: S3PublishMessage
Effect: Allow
Principal: {Service: s3.amazonaws.com}
Action: ['SQS:SendMessage']
Resource: !Sub ${NewImageQueue.Arn}
Condition:
ArnLike:
'aws:SourceArn': !Sub ${PhotoBucket.Arn}
# vim: ts=2:sw=2
| Add SQS queue & event notification | Add SQS queue & event notification
| YAML | mit | tomislacker/photos3,tomislacker/photos3 | yaml | ## Code Before:
AWSTemplateFormatVersion: '2010-09-09'
Description: PhotoS3
Parameters:
BucketPrefix:
Type: String
Description: Prefix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: photos3-
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
BucketSuffix:
Type: String
Description: Suffix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: ''
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
Conditions: {}
Mappings: {}
Outputs:
UploadBucket:
Description: S3 bucket for uploads
Value: !Ref PhotoBucket
Resources:
PhotoBucket:
Type: AWS::S3::Bucket
Properties:
BucketName: !Sub ${BucketPrefix}${AWS::AccountId}${BucketSuffix}
# TODO Add event notification
# http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-s3-bucket-notificationconfig.html
# vim: ts=2:sw=2
## Instruction:
Add SQS queue & event notification
## Code After:
AWSTemplateFormatVersion: '2010-09-09'
Description: PhotoS3
Parameters:
BucketPrefix:
Type: String
Description: Prefix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: photos3-
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
BucketSuffix:
Type: String
Description: Suffix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: ''
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
Conditions: {}
Mappings: {}
Outputs:
UploadBucket:
Description: S3 bucket for uploads
Value: !Ref PhotoBucket
Resources:
PhotoBucket:
Type: AWS::S3::Bucket
Properties:
BucketName: !Sub ${BucketPrefix}${AWS::AccountId}${BucketSuffix}
NotificationConfiguration:
QueueConfigurations:
- Event: s3:ObjectCreated:*
Filter:
S3Key:
Rules:
- Name: prefix
Value: source
Queue: !Sub ${NewImageQueue.Arn}
NewImageQueue:
Type: AWS::SQS::Queue
Properties:
VisibilityTimeout: 300
NewImageQueuePolicy:
Type: AWS::SQS::QueuePolicy
Properties:
Queues:
- !Ref NewImageQueue
PolicyDocument:
Version: 2012-10-17
Id: S3Publish
Statement:
- Sid: S3PublishMessage
Effect: Allow
Principal: {Service: s3.amazonaws.com}
Action: ['SQS:SendMessage']
Resource: !Sub ${NewImageQueue.Arn}
Condition:
ArnLike:
'aws:SourceArn': !Sub ${PhotoBucket.Arn}
# vim: ts=2:sw=2
| AWSTemplateFormatVersion: '2010-09-09'
Description: PhotoS3
Parameters:
BucketPrefix:
Type: String
Description: Prefix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: photos3-
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
BucketSuffix:
Type: String
Description: Suffix for S3 bucket name
ConstraintDescription: must be valid S3 bucket characters and prefix+accountnumber+suffix <=63 characters
Default: ''
MinLength: 0
MaxLength: 63
AllowedPattern: '[-_\.a-zA-Z0-9]*'
Conditions: {}
Mappings: {}
Outputs:
UploadBucket:
Description: S3 bucket for uploads
Value: !Ref PhotoBucket
Resources:
PhotoBucket:
Type: AWS::S3::Bucket
Properties:
BucketName: !Sub ${BucketPrefix}${AWS::AccountId}${BucketSuffix}
- # TODO Add event notification
- # http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-s3-bucket-notificationconfig.html
+ NotificationConfiguration:
+ QueueConfigurations:
+ - Event: s3:ObjectCreated:*
+ Filter:
+ S3Key:
+ Rules:
+ - Name: prefix
+ Value: source
+ Queue: !Sub ${NewImageQueue.Arn}
+
+ NewImageQueue:
+ Type: AWS::SQS::Queue
+ Properties:
+ VisibilityTimeout: 300
+
+ NewImageQueuePolicy:
+ Type: AWS::SQS::QueuePolicy
+ Properties:
+ Queues:
+ - !Ref NewImageQueue
+ PolicyDocument:
+ Version: 2012-10-17
+ Id: S3Publish
+ Statement:
+ - Sid: S3PublishMessage
+ Effect: Allow
+ Principal: {Service: s3.amazonaws.com}
+ Action: ['SQS:SendMessage']
+ Resource: !Sub ${NewImageQueue.Arn}
+ Condition:
+ ArnLike:
+ 'aws:SourceArn': !Sub ${PhotoBucket.Arn}
# vim: ts=2:sw=2 | 34 | 0.894737 | 32 | 2 |
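Once the notification is wired up, S3 delivers a JSON event to the queue for each object created under the `source` prefix. A minimal consumer-side sketch that pulls out the bucket and key — the sample payload is trimmed to the fields shown here (real events carry many more, e.g. `eventTime` and object size), and in practice the JSON arrives as the SQS message body:

```python
import json

# Trimmed example of an s3:ObjectCreated event body.
SAMPLE_BODY = json.dumps({
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "photos3-123456789012"},
                "object": {"key": "source/holiday.jpg"},
            },
        }
    ]
})

def new_objects(body):
    """Yield (bucket, key) for each created object in one message body."""
    for record in json.loads(body).get("Records", []):
        if record.get("eventName", "").startswith("ObjectCreated"):
            s3 = record["s3"]
            yield s3["bucket"]["name"], s3["object"]["key"]
```

The queue's 300-second `VisibilityTimeout` gives a consumer that long to process and delete a message before SQS makes it visible to another worker again.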
04b7e79ce3fed1afac129098badb632ca226fdee | dispatch.py | dispatch.py | import config
import steam
steam.set_api_key(config.api_key)
from optf2.backend import openid
from optf2.frontend import render
openid.set_session(render.session)
import web
if config.enable_fastcgi:
web.wsgi.runwsgi = lambda func, addr=None: web.wsgi.runfcgi(func, addr)
if __name__ == "__main__":
render.application.run()
| import config
import steam
steam.set_api_key(config.api_key)
from optf2.backend import openid
from optf2.frontend import render
openid.set_session(render.session)
import web
# wsgi
application = render.application.wsgifunc()
if config.enable_fastcgi:
web.wsgi.runwsgi = lambda func, addr=None: web.wsgi.runfcgi(func, addr)
if __name__ == "__main__":
render.application.run()
| Add wsgi handler by default | Add wsgi handler by default
| Python | isc | Lagg/optf2,FlaminSarge/optf2,Lagg/optf2,FlaminSarge/optf2,Lagg/optf2,FlaminSarge/optf2 | python | ## Code Before:
import config
import steam
steam.set_api_key(config.api_key)
from optf2.backend import openid
from optf2.frontend import render
openid.set_session(render.session)
import web
if config.enable_fastcgi:
web.wsgi.runwsgi = lambda func, addr=None: web.wsgi.runfcgi(func, addr)
if __name__ == "__main__":
render.application.run()
## Instruction:
Add wsgi handler by default
## Code After:
import config
import steam
steam.set_api_key(config.api_key)
from optf2.backend import openid
from optf2.frontend import render
openid.set_session(render.session)
import web
# wsgi
application = render.application.wsgifunc()
if config.enable_fastcgi:
web.wsgi.runwsgi = lambda func, addr=None: web.wsgi.runfcgi(func, addr)
if __name__ == "__main__":
render.application.run()
| import config
import steam
steam.set_api_key(config.api_key)
from optf2.backend import openid
from optf2.frontend import render
openid.set_session(render.session)
import web
+ # wsgi
+ application = render.application.wsgifunc()
+
if config.enable_fastcgi:
web.wsgi.runwsgi = lambda func, addr=None: web.wsgi.runfcgi(func, addr)
if __name__ == "__main__":
render.application.run() | 3 | 0.230769 | 3 | 0 |
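`web.py`'s `wsgifunc()` returns a standard WSGI callable, which is what servers like gunicorn or mod_wsgi look up under the module-level name `application`. The protocol itself is tiny — here is a hand-rolled equivalent plus a sketch of how a server invokes it:

```python
def application(environ, start_response):
    """Minimal WSGI app: read the request environ, emit status/headers
    through start_response, return an iterable of body bytes."""
    body = b"hello from wsgi"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

# Drive it the way a WSGI server would.
captured = {}

def fake_start_response(status, headers):
    captured["status"] = status
    captured["headers"] = dict(headers)

response_body = b"".join(
    application({"REQUEST_METHOD": "GET", "PATH_INFO": "/"}, fake_start_response)
)
```

Keeping the `application` name at module level means the same `dispatch.py` works unchanged under a plain WSGI server, while the `runfcgi` branch still covers FastCGI deployments.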
18603df8d13cf0eff753cb713e6c52ae30179f30 | experiments/camera-stdin-stdout/bg-subtract.py | experiments/camera-stdin-stdout/bg-subtract.py |
import cv2
import ImageReader
escape_key = 27
reader = ImageReader.ImageReader()
fgbg = cv2.createBackgroundSubtractorMOG2()
while True:
imgread, img = reader.decode()
if imgread:
fgmask = fgbg.apply(img)
cv2.imshow("frame", fgmask)
key = cv2.waitKey(1) & 0xFF
if key == escape_key:
break
cv2.destroyAllWindows()
|
import cv2
import ImageReader
import numpy as np
import sys
escape_key = 27
reader = ImageReader.ImageReader()
fgbg = cv2.createBackgroundSubtractorMOG2()
while True:
imgread, img = reader.decode()
if imgread:
fgmask = fgbg.apply(img)
retval, encodedimage = cv2.imencode(".jpg", fgmask)
np.savetxt(sys.stdout, encodedimage, fmt='%c', newline='')
key = cv2.waitKey(1) & 0xFF
if key == escape_key:
break
cv2.destroyAllWindows()
| Print to stdout the changed video | Print to stdout the changed video
| Python | mit | Mindavi/Positiebepalingssysteem | python | ## Code Before:
import cv2
import ImageReader
escape_key = 27
reader = ImageReader.ImageReader()
fgbg = cv2.createBackgroundSubtractorMOG2()
while True:
imgread, img = reader.decode()
if imgread:
fgmask = fgbg.apply(img)
cv2.imshow("frame", fgmask)
key = cv2.waitKey(1) & 0xFF
if key == escape_key:
break
cv2.destroyAllWindows()
## Instruction:
Print to stdout the changed video
## Code After:
import cv2
import ImageReader
import numpy as np
import sys
escape_key = 27
reader = ImageReader.ImageReader()
fgbg = cv2.createBackgroundSubtractorMOG2()
while True:
imgread, img = reader.decode()
if imgread:
fgmask = fgbg.apply(img)
retval, encodedimage = cv2.imencode(".jpg", fgmask)
np.savetxt(sys.stdout, encodedimage, fmt='%c', newline='')
key = cv2.waitKey(1) & 0xFF
if key == escape_key:
break
cv2.destroyAllWindows()
|
import cv2
import ImageReader
+ import numpy as np
+ import sys
escape_key = 27
reader = ImageReader.ImageReader()
fgbg = cv2.createBackgroundSubtractorMOG2()
while True:
imgread, img = reader.decode()
if imgread:
fgmask = fgbg.apply(img)
- cv2.imshow("frame", fgmask)
+ retval, encodedimage = cv2.imencode(".jpg", fgmask)
+ np.savetxt(sys.stdout, encodedimage, fmt='%c', newline='')
- key = cv2.waitKey(1) & 0xFF
+ key = cv2.waitKey(1) & 0xFF
? ++++
- if key == escape_key:
+ if key == escape_key:
? ++++
- break
+ break
? ++++
cv2.destroyAllWindows() | 11 | 0.55 | 7 | 4 |
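The rewritten script dumps each encoded JPEG to stdout back-to-back, so a downstream reader has to locate frame boundaries itself (typically via the JPEG start/end markers). An alternative worth noting — not what the script above does — is length-prefixing each frame, sketched here without OpenCV:

```python
import io
import struct

def write_frame(stream, payload: bytes):
    """Write one frame as a 4-byte big-endian length followed by the bytes."""
    stream.write(struct.pack(">I", len(payload)))
    stream.write(payload)

def read_frames(stream):
    """Yield frame payloads until the stream is exhausted."""
    while True:
        header = stream.read(4)
        if len(header) < 4:
            return
        (size,) = struct.unpack(">I", header)
        yield stream.read(size)

# Round-trip two fake "encoded frames" through an in-memory pipe.
buf = io.BytesIO()
write_frame(buf, b"\xff\xd8 frame one \xff\xd9")
write_frame(buf, b"\xff\xd8 frame two \xff\xd9")
buf.seek(0)
frames = list(read_frames(buf))
```

When writing real binary data to stdout, prefer `sys.stdout.buffer` over the text stream to avoid newline and encoding translation mangling the bytes.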
dff656a69714bc9c13c02960b80fd9668d147a6b | web/static/js/components/stage_change_info_voting.jsx | web/static/js/components/stage_change_info_voting.jsx | import React from "react"
export default () => (
<React.Fragment>
The skinny on Labeling + Voting:
<div className="ui basic segment">
<ul className="ui list">
<li>Work as a team to arrive at a sensible label for each group <span className="optional">(optional)</span></li>
<li>Apply votes to the items you feel are <strong>most important</strong> to discuss.
<p className="list-item-note">
<strong>Note: </strong>voting is blind.
Totals will be revealed when the facilitator advances the retro.
</p>
</li>
</ul>
</div>
</React.Fragment>
)
| import React from "react"
export default () => (
<React.Fragment>
The skinny on Labeling + Voting:
<div className="ui basic segment">
<ul className="ui list">
<li>Work as a team to arrive at a sensible label for each group <span className="optional">(optional)</span></li>
<li>Apply votes to the items you feel are <strong>most important</strong> to discuss.
<p className="list-item-note"><strong>Note: </strong>voting is blind.
Totals will be revealed when the facilitator advances the retro.
</p>
</li>
</ul>
</div>
</React.Fragment>
)
| Remove unnecessary whitespace from 'Note:' bullet point | Remove unnecessary whitespace from 'Note:' bullet point
| JSX | mit | stride-nyc/remote_retro,stride-nyc/remote_retro,stride-nyc/remote_retro | jsx | ## Code Before:
import React from "react"
export default () => (
<React.Fragment>
The skinny on Labeling + Voting:
<div className="ui basic segment">
<ul className="ui list">
<li>Work as a team to arrive at a sensible label for each group <span className="optional">(optional)</span></li>
<li>Apply votes to the items you feel are <strong>most important</strong> to discuss.
<p className="list-item-note">
<strong>Note: </strong>voting is blind.
Totals will be revealed when the facilitator advances the retro.
</p>
</li>
</ul>
</div>
</React.Fragment>
)
## Instruction:
Remove unnecessary whitespace from 'Note:' bullet point
## Code After:
import React from "react"
export default () => (
<React.Fragment>
The skinny on Labeling + Voting:
<div className="ui basic segment">
<ul className="ui list">
<li>Work as a team to arrive at a sensible label for each group <span className="optional">(optional)</span></li>
<li>Apply votes to the items you feel are <strong>most important</strong> to discuss.
<p className="list-item-note"><strong>Note: </strong>voting is blind.
Totals will be revealed when the facilitator advances the retro.
</p>
</li>
</ul>
</div>
</React.Fragment>
)
| import React from "react"
export default () => (
<React.Fragment>
The skinny on Labeling + Voting:
<div className="ui basic segment">
<ul className="ui list">
<li>Work as a team to arrive at a sensible label for each group <span className="optional">(optional)</span></li>
<li>Apply votes to the items you feel are <strong>most important</strong> to discuss.
- <p className="list-item-note">
- <strong>Note: </strong>voting is blind.
? ^
+ <p className="list-item-note"><strong>Note: </strong>voting is blind.
? ++ ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Totals will be revealed when the facilitator advances the retro.
</p>
</li>
</ul>
</div>
</React.Fragment>
) | 3 | 0.166667 | 1 | 2 |
dbb53da60c5366295a4269fec2577b34be1a862f | .lgtm.yml | .lgtm.yml | queries:
# we don't print any sensitive data by default and we are giving a loud warning about using
# flags that enable such output
- exclude: py/clear-text-logging-sensitive-data
| queries:
# we don't print any sensitive data by default and we are giving a loud warning about using
# flags that enable such output
- exclude: py/clear-text-logging-sensitive-data
# we import functions for output using "from X import Y" to shorten these calls
# to *improve* maintainability (see https://github.com/egnyte/gitlabform/pull/267)
- exclude: py/import-and-import-from
| Make LGTM stop complaining on these - this is on purpose | Make LGTM stop complaining on these - this is on purpose
| YAML | mit | egnyte/gitlabform,egnyte/gitlabform | yaml | ## Code Before:
queries:
# we don't print any sensitive data by default and we are giving a loud warning about using
# flags that enable such output
- exclude: py/clear-text-logging-sensitive-data
## Instruction:
Make LGTM stop complaining on these - this is on purpose
## Code After:
queries:
# we don't print any sensitive data by default and we are giving a loud warning about using
# flags that enable such output
- exclude: py/clear-text-logging-sensitive-data
# we import functions for output using "from X import Y" to shorten these calls
# to *improve* maintainability (see https://github.com/egnyte/gitlabform/pull/267)
- exclude: py/import-and-import-from
| queries:
# we don't print any sensitive data by default and we are giving a loud warning about using
# flags that enable such output
- exclude: py/clear-text-logging-sensitive-data
+ # we import functions for output using "from X import Y" to shorten these calls
+ # to *improve* maintainability (see https://github.com/egnyte/gitlabform/pull/267)
+ - exclude: py/import-and-import-from | 3 | 0.75 | 3 | 0 |
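The excluded `py/import-and-import-from` query flags modules that both `import x` and `from x import y`. The commit message argues the pattern is deliberate: importing a name directly keeps busy call sites short while the module import stays available for everything else. A small illustration:

```python
import math
from math import sqrt  # direct import keeps frequent call sites short

# Both names refer to the same function object, so behaviour is
# identical; the only difference is verbosity at the call site.
short_form = sqrt(16)
long_form = math.sqrt(16)
```

Linters flag the mix because it can obscure where a name comes from; excluding the query is a team-level judgement that the shorter output calls are worth that trade-off.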
69aedd709be7de4b43e32d84b526bca92d5cb0b2 | util/channel_util_test.go | util/channel_util_test.go | package util
import (
"reflect"
"testing"
)
func doubleIt(v reflect.Value) reflect.Value {
return reflect.ValueOf(v.Int() * 2)
}
func TestMergeChannel(t *testing.T) {
chan1 := make(chan reflect.Value)
chan2 := make(chan reflect.Value)
go func() {
chan1 <- reflect.ValueOf(1)
close(chan1)
}()
go func() {
chan2 <- reflect.ValueOf(2)
close(chan2)
}()
outChan := make(chan reflect.Value, 2)
MergeChannelTo([]chan reflect.Value{chan1, chan2}, doubleIt, outChan)
got := make([]int64, 0, 2)
for v := range outChan {
got = append(got, v.Int())
}
want := []int64{2, 4}
if !reflect.DeepEqual(got, want) {
t.Errorf("Got %v want %v", got, want)
}
}
| package util
import (
"reflect"
"sort"
"testing"
)
func doubleIt(v reflect.Value) reflect.Value {
return reflect.ValueOf(v.Int() * 2)
}
func TestMergeChannel(t *testing.T) {
chan1 := make(chan reflect.Value)
chan2 := make(chan reflect.Value)
go func() {
chan1 <- reflect.ValueOf(1)
close(chan1)
}()
go func() {
chan2 <- reflect.ValueOf(2)
close(chan2)
}()
outChan := make(chan reflect.Value, 2)
MergeChannelTo([]chan reflect.Value{chan1, chan2}, doubleIt, outChan)
got := make([]int, 0, 2)
for v := range outChan {
got = append(got, int(v.Int()))
}
sort.Ints(got)
want := []int{2, 4}
if !reflect.DeepEqual(got, want) {
t.Errorf("Got %v want %v", got, want)
}
}
| Sort the result list to eliminate test flakiness. | Sort the result list to eliminate test flakiness.
| Go | apache-2.0 | justicezyx/glow,chrislusf/glow,chrislusf/glow,justicezyx/glow | go | ## Code Before:
package util
import (
"reflect"
"testing"
)
func doubleIt(v reflect.Value) reflect.Value {
return reflect.ValueOf(v.Int() * 2)
}
func TestMergeChannel(t *testing.T) {
chan1 := make(chan reflect.Value)
chan2 := make(chan reflect.Value)
go func() {
chan1 <- reflect.ValueOf(1)
close(chan1)
}()
go func() {
chan2 <- reflect.ValueOf(2)
close(chan2)
}()
outChan := make(chan reflect.Value, 2)
MergeChannelTo([]chan reflect.Value{chan1, chan2}, doubleIt, outChan)
got := make([]int64, 0, 2)
for v := range outChan {
got = append(got, v.Int())
}
want := []int64{2, 4}
if !reflect.DeepEqual(got, want) {
t.Errorf("Got %v want %v", got, want)
}
}
## Instruction:
Sort the result list to eliminate test flakiness.
## Code After:
package util
import (
"reflect"
"sort"
"testing"
)
func doubleIt(v reflect.Value) reflect.Value {
return reflect.ValueOf(v.Int() * 2)
}
func TestMergeChannel(t *testing.T) {
chan1 := make(chan reflect.Value)
chan2 := make(chan reflect.Value)
go func() {
chan1 <- reflect.ValueOf(1)
close(chan1)
}()
go func() {
chan2 <- reflect.ValueOf(2)
close(chan2)
}()
outChan := make(chan reflect.Value, 2)
MergeChannelTo([]chan reflect.Value{chan1, chan2}, doubleIt, outChan)
got := make([]int, 0, 2)
for v := range outChan {
got = append(got, int(v.Int()))
}
sort.Ints(got)
want := []int{2, 4}
if !reflect.DeepEqual(got, want) {
t.Errorf("Got %v want %v", got, want)
}
}
| package util
import (
"reflect"
+ "sort"
"testing"
)
func doubleIt(v reflect.Value) reflect.Value {
return reflect.ValueOf(v.Int() * 2)
}
func TestMergeChannel(t *testing.T) {
chan1 := make(chan reflect.Value)
chan2 := make(chan reflect.Value)
go func() {
chan1 <- reflect.ValueOf(1)
close(chan1)
}()
go func() {
chan2 <- reflect.ValueOf(2)
close(chan2)
}()
outChan := make(chan reflect.Value, 2)
MergeChannelTo([]chan reflect.Value{chan1, chan2}, doubleIt, outChan)
- got := make([]int64, 0, 2)
? --
+ got := make([]int, 0, 2)
for v := range outChan {
- got = append(got, v.Int())
+ got = append(got, int(v.Int()))
? ++++ +
}
+ sort.Ints(got)
- want := []int64{2, 4}
? --
+ want := []int{2, 4}
if !reflect.DeepEqual(got, want) {
t.Errorf("Got %v want %v", got, want)
}
} | 8 | 0.216216 | 5 | 3 |
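The flakiness came from asserting on arrival order: values merged from two concurrently closed channels can land in either order, so the test sometimes saw `[4 2]` instead of `[2 4]`. Sorting before comparing makes the assertion order-independent. The same pattern in Python, using threads and a queue:

```python
import queue
import threading

def double_into(q, value):
    q.put(value * 2)

results = queue.Queue()
workers = [threading.Thread(target=double_into, args=(results, v)) for v in (1, 2)]
for w in workers:
    w.start()
for w in workers:
    w.join()

# Drain in whatever order the threads finished, then sort so the
# assertion does not depend on scheduling.
got = sorted(results.get_nowait() for _ in workers)
```

When ordering *is* part of the contract, sorting would mask real bugs; here the merge makes no ordering promise, so sorting is the right fix.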
daa19ed58c1378bb23800a72d937412d18aa50dd | spec/scenario/db/migrate/20100401102949_create_tables.rb | spec/scenario/db/migrate/20100401102949_create_tables.rb | class CreateTables < ActiveRecord::Migration
def self.up
create_table :users do |t|
t.cas_authenticatable
t.rememberable
t.string :email
t.timestamps
end
end
def self.down
drop_table :users
end
end | class CreateTables < ActiveRecord::Migration
def self.up
create_table :users do |t|
t.string :username, :null => false
t.datetime :remember_created_at
t.string :email
t.timestamps
end
add_index :users, :username, :unique => true
end
def self.down
drop_table :users
end
end
| Switch schema to work with Devise 2.1 | Switch schema to work with Devise 2.1
| Ruby | mit | lutechspa/devise_cas_authenticatable,tomascharad/devise_cas_authenticatable,nbudin/devise_cas_authenticatable,identification-io/devise_cas_authenticatable,tomascharad/devise_cas_authenticatable,nbudin/devise_cas_authenticatable,identification-io/devise_cas_authenticatable | ruby | ## Code Before:
class CreateTables < ActiveRecord::Migration
def self.up
create_table :users do |t|
t.cas_authenticatable
t.rememberable
t.string :email
t.timestamps
end
end
def self.down
drop_table :users
end
end
## Instruction:
Switch schema to work with Devise 2.1
## Code After:
class CreateTables < ActiveRecord::Migration
def self.up
create_table :users do |t|
t.string :username, :null => false
t.datetime :remember_created_at
t.string :email
t.timestamps
end
add_index :users, :username, :unique => true
end
def self.down
drop_table :users
end
end
| class CreateTables < ActiveRecord::Migration
def self.up
create_table :users do |t|
- t.cas_authenticatable
- t.rememberable
+ t.string :username, :null => false
+ t.datetime :remember_created_at
t.string :email
t.timestamps
end
+
+ add_index :users, :username, :unique => true
end
def self.down
drop_table :users
end
end | 6 | 0.428571 | 4 | 2 |
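Devise 2.1 removed the `cas_authenticatable`/`rememberable` schema helpers, so the migration now spells the columns out itself. Roughly the DDL it produces — sketched with SQLite and approximate column types (Rails' exact types vary by database adapter):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE users (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    username VARCHAR NOT NULL,
    remember_created_at DATETIME,
    email VARCHAR,
    created_at DATETIME,
    updated_at DATETIME
);
CREATE UNIQUE INDEX index_users_on_username ON users (username);
""")

con.execute("INSERT INTO users (username) VALUES ('alice')")

# The unique index rejects a second 'alice'.
try:
    con.execute("INSERT INTO users (username) VALUES ('alice')")
    duplicate_allowed = True
except sqlite3.IntegrityError:
    duplicate_allowed = False
```

`:null => false` plus the unique index enforces at the database level what Devise validates at the model level, so a race between two simultaneous signups cannot create duplicate usernames.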
a8791965a4053192927e4db0274aae50635f0105 | README.md | README.md | This is the code to run a survey across mySociety sites in conjuction
with the University of Manchester
| This is the code to run a survey across mySociety sites in conjuction
with the University of Manchester
## Install instructions for Debian
Install necessary packages
sudo apt-get update -y
sudo apt-get upgrade -y
sudo apt-get install -y $(cut -d " " -f 1 conf/packages | egrep -v "^#")
Install the project's requirements into a virtualenv
virtualenv ~/virtualenv-manchester-survey
source ~/virtualenv-manchester-survey/bin/activate
pip install -r requirements.txt
Create a user and a database in postgres
sudo -u postgres psql --command="CREATE USER questions WITH PASSWORD 'questions' CREATEDB;"
sudo -u postgres psql --command='CREATE DATABASE questions OWNER questions;'
Add required configuration
cp conf/general.yml-example conf/general.yml
- Edit MANSURV_DB_USER to be 'questions' in conf/general.yml
- Edit MANSURV_DB_NAME to be 'questions' in conf/general.yml
- Edit MANSURV_DB_PASS to be 'questions' in conf/general.yml
- Edit DJANGO_SECRET_KEY in conf/general.yml
Install gems and compile CSS
sudo gem install --conservative --no-ri --no-rdoc compass zurb-foundation
compass compile web
| Add some basic install instructions for debian | Add some basic install instructions for debian
These are tested in a fresh precise64 vagrant box using the
hashicorp/precise64 image.
| Markdown | agpl-3.0 | mysociety/manchester-survey,mysociety/manchester-survey,mysociety/manchester-survey,mysociety/manchester-survey,mysociety/manchester-survey | markdown | ## Code Before:
This is the code to run a survey across mySociety sites in conjuction
with the University of Manchester
## Instruction:
Add some basic install instructions for debian
These are tested in a fresh precise64 vagrant box using the
hashicorp/precise64 image.
## Code After:
This is the code to run a survey across mySociety sites in conjuction
with the University of Manchester
## Install instructions for Debian
Install necessary packages
sudo apt-get update -y
sudo apt-get upgrade -y
sudo apt-get install -y $(cut -d " " -f 1 conf/packages | egrep -v "^#")
Install the project's requirements into a virtualenv
virtualenv ~/virtualenv-manchester-survey
source ~/virtualenv-manchester-survey/bin/activate
pip install -r requirements.txt
Create a user and a database in postgres
sudo -u postgres psql --command="CREATE USER questions WITH PASSWORD 'questions' CREATEDB;"
sudo -u postgres psql --command='CREATE DATABASE questions OWNER questions;'
Add required configuration
cp conf/general.yml-example conf/general.yml
- Edit MANSURV_DB_USER to be 'questions' in conf/general.yml
- Edit MANSURV_DB_NAME to be 'questions' in conf/general.yml
- Edit MANSURV_DB_PASS to be 'questions' in conf/general.yml
- Edit DJANGO_SECRET_KEY in conf/general.yml
Install gems and compile CSS
sudo gem install --conservative --no-ri --no-rdoc compass zurb-foundation
compass compile web
| This is the code to run a survey across mySociety sites in conjuction
with the University of Manchester
+
+ ## Install instructions for Debian
+
+ Install necessary packages
+
+ sudo apt-get update -y
+ sudo apt-get upgrade -y
+ sudo apt-get install -y $(cut -d " " -f 1 conf/packages | egrep -v "^#")
+
+ Install the project's requirements into a virtualenv
+
+ virtualenv ~/virtualenv-manchester-survey
+ source ~/virtualenv-manchester-survey/bin/activate
+ pip install -r requirements.txt
+
+ Create a user and a database in postgres
+
+ sudo -u postgres psql --command="CREATE USER questions WITH PASSWORD 'questions' CREATEDB;"
+ sudo -u postgres psql --command='CREATE DATABASE questions OWNER questions;'
+
+ Add required configuration
+
+ cp conf/general.yml-example conf/general.yml
+
+ - Edit MANSURV_DB_USER to be 'questions' in conf/general.yml
+ - Edit MANSURV_DB_NAME to be 'questions' in conf/general.yml
+ - Edit MANSURV_DB_PASS to be 'questions' in conf/general.yml
+ - Edit DJANGO_SECRET_KEY in conf/general.yml
+
+ Install gems and compile CSS
+
+ sudo gem install --conservative --no-ri --no-rdoc compass zurb-foundation
+ compass compile web | 33 | 16.5 | 33 | 0 |
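The `apt-get install` line feeds itself from a `conf/packages` file via `cut -d " " -f 1 conf/packages | egrep -v "^#"`: take the first space-separated field of each line, then drop comment lines. The same filter in Python for reference — the sample file contents below are made up, and blank lines are skipped here as well (the shell pipeline leaves them in):

```python
def package_names(lines):
    """First space-separated field per line, skipping comments and blanks."""
    names = []
    for line in lines:
        first = line.strip().split(" ", 1)[0]
        if first and not first.startswith("#"):
            names.append(first)
    return names

# Illustrative conf/packages contents, not the project's real file.
sample = [
    "# core services",
    "postgresql 9.1 needed for the questions database",
    "python-dev",
    "",
    "ruby-compass   # for compiling the CSS",
]
packages = package_names(sample)
```

Keeping the package list in a data file like this lets provisioning scripts, documentation, and CI all install from one source of truth.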
b0b1a305e3d8d227617e8c9ee3d7c61e1f61ad34 | intermine/src/java/org/intermine/dataconversion/DataConverter.java | intermine/src/java/org/intermine/dataconversion/DataConverter.java | package org.flymine.dataconversion;
/*
* Copyright (C) 2002-2003 FlyMine
*
* This code may be freely distributed and modified under the
* terms of the GNU Lesser General Public Licence. This should
* be distributed with the code. See the LICENSE file for more
* information or http://www.gnu.org/copyleft/lesser.html.
*
*/
/**
* Abstract parent class of all DataConverters
* @author Mark Woodbridge
*/
public abstract class DataConverter
{
protected ItemWriter writer;
/**
* Constructor that should be called by children
* @param writer an ItemWriter used to handle the resultant Items
*/
public DataConverter(ItemWriter writer) {
this.writer = writer;
}
/**
* Perform the data conversion
* @throws Exception if an error occurs during processing
*/
public abstract void process() throws Exception;
}
| package org.flymine.dataconversion;
/*
* Copyright (C) 2002-2003 FlyMine
*
* This code may be freely distributed and modified under the
* terms of the GNU Lesser General Public Licence. This should
* be distributed with the code. See the LICENSE file for more
* information or http://www.gnu.org/copyleft/lesser.html.
*
*/
import java.util.Map;
import java.util.HashMap;
/**
* Abstract parent class of all DataConverters
* @author Mark Woodbridge
*/
public abstract class DataConverter
{
protected ItemWriter writer;
protected Map aliases = new HashMap();
protected int nextClsId = 0;
/**
* Constructor that should be called by children
* @param writer an ItemWriter used to handle the resultant Items
*/
public DataConverter(ItemWriter writer) {
this.writer = writer;
}
/**
* Perform the data conversion
* @throws Exception if an error occurs during processing
*/
public abstract void process() throws Exception;
/**
* Uniquely alias a className
* @param className the class name
* @return the alias
*/
protected String alias(String className) {
String alias = (String) aliases.get(className);
if (alias != null) {
return alias;
}
String nextIndex = "" + (nextClsId++);
aliases.put(className, nextIndex);
return nextIndex;
}
}
| Add functionality to alias classes | Add functionality to alias classes
| Java | lgpl-2.1 | justincc/intermine,Arabidopsis-Information-Portal/intermine,JoeCarlson/intermine,tomck/intermine,julie-sullivan/phytomine,joshkh/intermine,kimrutherford/intermine,elsiklab/intermine,zebrafishmine/intermine,Arabidopsis-Information-Portal/intermine,joshkh/intermine,elsiklab/intermine,justincc/intermine,zebrafishmine/intermine,kimrutherford/intermine,drhee/toxoMine,elsiklab/intermine,Arabidopsis-Information-Portal/intermine,julie-sullivan/phytomine,JoeCarlson/intermine,kimrutherford/intermine,JoeCarlson/intermine,Arabidopsis-Information-Portal/intermine,joshkh/intermine,JoeCarlson/intermine,JoeCarlson/intermine,drhee/toxoMine,Arabidopsis-Information-Portal/intermine,justincc/intermine,kimrutherford/intermine,julie-sullivan/phytomine,drhee/toxoMine,tomck/intermine,elsiklab/intermine,elsiklab/intermine,tomck/intermine,tomck/intermine,JoeCarlson/intermine,julie-sullivan/phytomine,Arabidopsis-Information-Portal/intermine,drhee/toxoMine,tomck/intermine,JoeCarlson/intermine,Arabidopsis-Information-Portal/intermine,joshkh/intermine,joshkh/intermine,zebrafishmine/intermine,zebrafishmine/intermine,zebrafishmine/intermine,elsiklab/intermine,kimrutherford/intermine,julie-sullivan/phytomine,joshkh/intermine,drhee/toxoMine,elsiklab/intermine,Arabidopsis-Information-Portal/intermine,tomck/intermine,kimrutherford/intermine,zebrafishmine/intermine,Arabidopsis-Information-Portal/intermine,tomck/intermine,drhee/toxoMine,tomck/intermine,elsiklab/intermine,drhee/toxoMine,julie-sullivan/phytomine,justincc/intermine,joshkh/intermine,zebrafishmine/intermine,drhee/toxoMine,joshkh/intermine,julie-sullivan/phytomine,justincc/intermine,drhee/toxoMine,justincc/intermine,justincc/intermine,zebrafishmine/intermine,zebrafishmine/intermine,elsiklab/intermine,JoeCarlson/intermine,justincc/intermine,kimrutherford/intermine,justincc/intermine,kimrutherford/intermine,joshkh/intermine,JoeCarlson/intermine,kimrutherford/intermine,tomck/intermine | java | ## Code Before:
package org.flymine.dataconversion;
/*
* Copyright (C) 2002-2003 FlyMine
*
* This code may be freely distributed and modified under the
* terms of the GNU Lesser General Public Licence. This should
* be distributed with the code. See the LICENSE file for more
* information or http://www.gnu.org/copyleft/lesser.html.
*
*/
/**
* Abstract parent class of all DataConverters
* @author Mark Woodbridge
*/
public abstract class DataConverter
{
protected ItemWriter writer;
/**
* Constructor that should be called by children
* @param writer an ItemWriter used to handle the resultant Items
*/
public DataConverter(ItemWriter writer) {
this.writer = writer;
}
/**
* Perform the data conversion
* @throws Exception if an error occurs during processing
*/
public abstract void process() throws Exception;
}
## Instruction:
Add functionality to alias classes
## Code After:
package org.flymine.dataconversion;
/*
* Copyright (C) 2002-2003 FlyMine
*
* This code may be freely distributed and modified under the
* terms of the GNU Lesser General Public Licence. This should
* be distributed with the code. See the LICENSE file for more
* information or http://www.gnu.org/copyleft/lesser.html.
*
*/
import java.util.Map;
import java.util.HashMap;
/**
* Abstract parent class of all DataConverters
* @author Mark Woodbridge
*/
public abstract class DataConverter
{
protected ItemWriter writer;
protected Map aliases = new HashMap();
protected int nextClsId = 0;
/**
* Constructor that should be called by children
* @param writer an ItemWriter used to handle the resultant Items
*/
public DataConverter(ItemWriter writer) {
this.writer = writer;
}
/**
* Perform the data conversion
* @throws Exception if an error occurs during processing
*/
public abstract void process() throws Exception;
/**
* Uniquely alias a className
* @param className the class name
* @return the alias
*/
protected String alias(String className) {
String alias = (String) aliases.get(className);
if (alias != null) {
return alias;
}
String nextIndex = "" + (nextClsId++);
aliases.put(className, nextIndex);
return nextIndex;
}
}
| package org.flymine.dataconversion;
/*
* Copyright (C) 2002-2003 FlyMine
*
* This code may be freely distributed and modified under the
* terms of the GNU Lesser General Public Licence. This should
* be distributed with the code. See the LICENSE file for more
* information or http://www.gnu.org/copyleft/lesser.html.
*
*/
+ import java.util.Map;
+ import java.util.HashMap;
+
- /**
? -
+ /**
* Abstract parent class of all DataConverters
* @author Mark Woodbridge
*/
public abstract class DataConverter
{
protected ItemWriter writer;
+ protected Map aliases = new HashMap();
+ protected int nextClsId = 0;
/**
* Constructor that should be called by children
* @param writer an ItemWriter used to handle the resultant Items
*/
public DataConverter(ItemWriter writer) {
this.writer = writer;
}
/**
* Perform the data conversion
* @throws Exception if an error occurs during processing
*/
public abstract void process() throws Exception;
+
+ /**
+ * Uniquely alias a className
+ * @param className the class name
+ * @return the alias
+ */
+ protected String alias(String className) {
+ String alias = (String) aliases.get(className);
+ if (alias != null) {
+ return alias;
+ }
+ String nextIndex = "" + (nextClsId++);
+ aliases.put(className, nextIndex);
+ return nextIndex;
+ }
} | 22 | 0.647059 | 21 | 1 |
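The `alias` method added in this commit is a counter-backed memoization: the first lookup of a class name mints the next integer as its alias, and every later lookup returns the cached value. A minimal Python sketch of the same idea (the class and method names here are illustrative, not part of the InterMine code):

```python
class AliasRegistry:
    """Hands out a stable numeric alias per class name, mirroring the patch's logic."""

    def __init__(self):
        self.aliases = {}  # class name -> alias string
        self.next_id = 0   # counter for the next unused alias

    def alias(self, class_name):
        # Reuse the alias if this name was seen before.
        if class_name in self.aliases:
            return self.aliases[class_name]
        # Otherwise mint a new alias from the counter and remember it.
        new_alias = str(self.next_id)
        self.next_id += 1
        self.aliases[class_name] = new_alias
        return new_alias
```

Repeated calls with the same name return the same alias, while previously unseen names receive consecutive integers — the same behaviour as the Java `alias` method above.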
6c46c0985eeeec4e9dfd31fd27b4634594d5a425 | lib/devise/models/validatable.rb | lib/devise/models/validatable.rb | module Devise
module Models
# Validatable creates all needed validations for a user email and password.
# It's optional, given you may want to create the validations by yourself.
# Automatically validate if the email is present, unique and it's format is
# valid. Also tests presence of password, confirmation and length
module Validatable
# Email regex used to validate email formats. Retrieved from authlogic.
EMAIL_REGEX = /\A[\w\.%\+\-]+@(?:[A-Z0-9\-]+\.)+(?:[A-Z]{2,4}|museum|travel)\z/i
def self.included(base)
base.class_eval do
validates_presence_of :email
validates_uniqueness_of :email, :allow_blank => true
validates_format_of :email, :with => EMAIL_REGEX, :allow_blank => true
validates_presence_of :password, :if => :password_required?
validates_confirmation_of :password, :if => :password_required?
validates_length_of :password, :within => 6..20, :allow_blank => true, :if => :password_required?
end
end
protected
# Checks whether a password is needed or not. For validations only.
# Passwords are always required if it's a new record, or if the password
# or confirmation are being set somewhere.
def password_required?
new_record? || !password.nil? || !password_confirmation.nil?
end
end
end
end
| module Devise
module Models
# Validatable creates all needed validations for a user email and password.
# It's optional, given you may want to create the validations by yourself.
# Automatically validate if the email is present, unique and it's format is
# valid. Also tests presence of password, confirmation and length
module Validatable
# Email regex used to validate email formats. Retrieved from authlogic.
EMAIL_REGEX = /\A[\w\.%\+\-]+@(?:[A-Z0-9\-]+\.)+(?:[A-Z]{2,4}|museum|travel)\z/i
def self.included(base)
base.class_eval do
attribute = authentication_keys.first
validates_presence_of attribute
validates_uniqueness_of attribute, :allow_blank => true
validates_format_of attribute, :with => EMAIL_REGEX, :allow_blank => true,
:scope => authentication_keys[1..-1]
with_options :if => :password_required? do |v|
v.validates_presence_of :password
v.validates_confirmation_of :password
v.validates_length_of :password, :within => 6..20, :allow_blank => true
end
end
end
protected
# Checks whether a password is needed or not. For validations only.
# Passwords are always required if it's a new record, or if the password
# or confirmation are being set somewhere.
def password_required?
new_record? || !password.nil? || !password_confirmation.nil?
end
end
end
end
| Make validations based on authentication keys. | Make validations based on authentication keys.
| Ruby | mit | RailsDev42/devise,didacte/devise,rivalmetrics/rm-devise,vincentwoo/devise,Bazay/devise,silvaelton/devise,thiagorp/devise,thejourneydude/devise,nambrot/devise,denysnando/devise,twalpole/devise,randoum/devise,tiarly/devise,sideci-sample/sideci-sample-devise,NeilvB/devise,Legalyze/devise,plataformatec/devise,havok2905/devise,eaconde/devise,DanFerreira/devise,JanChw/devise,tjgrathwell/devise,Ataraxic/devise,onursarikaya/devise,evopark/devise,chadzilla2080/devise,shtzr840329/devise,yui-knk/devise,MAubreyK/devise,havok2905/devise,LimeBlast/devise,rubyrider/devise,yannis/biologyconf,eaconde/devise,alan707/startup_theme,bobegir/devise,SadTreeFriends/devise,buzybee83/devise,gauravtiwari/devise,dgynn/devise,bonekost/devise,ddisel/interestingforme,kewaunited/devise,DanFerreira/devise,JohnSmall/device-plus-mods,Course-Master/devise,eltonchrls/devise,sferik/devise,ram-krishan/devise,Diego5529/devise,chrisnordqvist/devise,gauravtiwari/devise,ngpestelos/devise,andyhite/devise,evopark/devise,testerzz/devise,LimeBlast/devise,hauledev/devise,bolshakov/devise,JanChw/devise,davidrv/devise,4nshu19/devise,posgarou/devise,ram-krishan/devise,Bazay/devise,yizjiang/devise,bonekost/devise,SiddhiPrabha/devise,SadTreeFriends/devise,LimeBlast/devise,sferik/devise,thiagorp/devise,solutelabs-savan/deviseOne,rlhh/devise,yang70/devise,yizjiang/devise,yakovenkodenis/devise,joycedelatorre/devise,yakovenkodenis/devise,jnappy/devise,ngpestelos/devise,RailsDev42/devise,SadTreeFriends/devise,bolshakov/devise,nambrot/devise,cassioscabral/devise,sharma1nitish/devise,silvaelton/devise,solutelabs-savan/deviseOne,JaisonBrooks/devise,zeiv/devise-archangel,thejourneydude/devise,digideskio/devise,riskgod/devise,MAubreyK/devise,yannis/biologyconf,HopeCommUnityCenter/devise,abevoelker/devise,hazah/devise,LauraWN/devise,chrisnordqvist/devise,RailsDev42/devise,xiuxian123/loyal_device,NeilvB/devise,solutelabs-savan/deviseOne,9040044/devise,zarinn3pal/devise,shtzr840329/devise,Angeldude/devise,silvaelton
/devise,janocat/devise,Serg09/devise,fjg/devise,buzybee83/devise,twalpole/devise,onursarikaya/devise,chrisnordqvist/devise,jinzhu/devise,Legalyze/devise,yui-knk/devise,jahammo2/devise,sharma1nitish/devise,testerzz/devise,hazah/devise,Angeldude/devise,Course-Master/devise,yizjiang/devise,Odyl/devise,hauledev/devise,Legalyze/devise,Odyl/devise,0x000000/devise,betesh/devise,davidrv/devise,chadzilla2080/devise,djfpaagman/devise,DanFerreira/devise,bobegir/devise,tiarly/devise,ar-shestopal/devise,sulestone/devise,fjg/devise,digideskio/devise,zarinn3pal/devise,SiddhiPrabha/devise,JanChw/devise,eltonchrls/devise,LauraWN/devise,jahammo2/devise,randoum/devise,sulestone/devise,djfpaagman/devise,satyaki123/devise,cefigueiredo/devise,testerzz/devise,bonekost/devise,xymbol/devise,keeliker/devise,dgynn/devise,Diego5529/devise,zhaoxiao10/devise,Umar-malik007/devise,Odyl/devise,MAubreyK/devise,zarinn3pal/devise,joycedelatorre/devise,ar-shestopal/devise,thejourneydude/devise,posgarou/devise,satyaki123/devise,betesh/devise,ram-krishan/devise,janocat/devise,JaisonBrooks/devise,alan707/startup_theme,travis-repos/devise,absk1317/devise,Fuseit/devise,Qwiklabs/devise,marlonandrade/devise,kewaunited/devise,joycedelatorre/devise,4nshu19/devise,openeducation/devise,Bazay/devise,timoschilling/devise,andyhite/devise,rlhh/devise,pcantrell/devise,xymbol/devise,vincentwoo/devise,Umar-malik007/devise,cefigueiredo/devise,yang70/devise,riskgod/devise,gauravtiwari/devise,yui-knk/devise,stanhu/devise,leonardoprg/devise,JaisonBrooks/devise,cassioscabral/devise,betesh/devise,tjgrathwell/devise,jamescallmebrent/devise,zvikamenahemi/devise,sespindola/devise,Qwiklabs/devise,deivid-rodriguez/devise,Serg09/devise,rubyrider/devise,shtzr840329/devise,sulestone/devise,tiarly/devise,marlonandrade/devise,bobegir/devise,eaconde/devise,sespindola/devise,Fuseit/devise,Umar-malik007/devise,timoschilling/devise,eltonchrls/devise,deivid-rodriguez/devise,absk1317/devise,Angeldude/devise,marlonandrade/devise,jahammo2/devi
se,imaboldcompany/devise,9040044/devise,HopeCommUnityCenter/devise,Ataraxic/devise,sharma1nitish/devise,9040044/devise,riskgod/devise,andyhite/devise,leonardoprg/devise,robertquinn/devise,twalpole/devise,kewaunited/devise,didacte/devise,janocat/devise,yakovenkodenis/devise,Diego5529/devise,tjgrathwell/devise,evopark/devise,imaboldcompany/devise,4nshu19/devise,rlhh/devise,thiagorp/devise,plataformatec/devise,zvikamenahemi/devise,LauraWN/devise,ar-shestopal/devise,denysnando/devise,satyaki123/devise,jamescook/devise,hazah/devise,Ataraxic/devise,djfpaagman/devise,sespindola/devise,keeliker/devise,jnappy/devise,NeilvB/devise,didacte/devise,stanhu/devise,onursarikaya/devise,keeliker/devise,Serg09/devise,hauledev/devise,rubyrider/devise,davidrv/devise,xymbol/devise,posgarou/devise,havok2905/devise,buzybee83/devise,cefigueiredo/devise,sideci-sample/sideci-sample-devise,HopeCommUnityCenter/devise,randoum/devise,nambrot/devise,ddisel/interestingforme,yang70/devise,vincentwoo/devise,dgynn/devise,denysnando/devise,yannis/biologyconf,chadzilla2080/devise,SiddhiPrabha/devise | ruby | ## Code Before:
module Devise
module Models
# Validatable creates all needed validations for a user email and password.
# It's optional, given you may want to create the validations by yourself.
# Automatically validate if the email is present, unique and it's format is
# valid. Also tests presence of password, confirmation and length
module Validatable
# Email regex used to validate email formats. Retrieved from authlogic.
EMAIL_REGEX = /\A[\w\.%\+\-]+@(?:[A-Z0-9\-]+\.)+(?:[A-Z]{2,4}|museum|travel)\z/i
def self.included(base)
base.class_eval do
validates_presence_of :email
validates_uniqueness_of :email, :allow_blank => true
validates_format_of :email, :with => EMAIL_REGEX, :allow_blank => true
validates_presence_of :password, :if => :password_required?
validates_confirmation_of :password, :if => :password_required?
validates_length_of :password, :within => 6..20, :allow_blank => true, :if => :password_required?
end
end
protected
# Checks whether a password is needed or not. For validations only.
# Passwords are always required if it's a new record, or if the password
# or confirmation are being set somewhere.
def password_required?
new_record? || !password.nil? || !password_confirmation.nil?
end
end
end
end
## Instruction:
Make validations based on authentication keys.
## Code After:
module Devise
module Models
# Validatable creates all needed validations for a user email and password.
# It's optional, given you may want to create the validations by yourself.
# Automatically validate if the email is present, unique and it's format is
# valid. Also tests presence of password, confirmation and length
module Validatable
# Email regex used to validate email formats. Retrieved from authlogic.
EMAIL_REGEX = /\A[\w\.%\+\-]+@(?:[A-Z0-9\-]+\.)+(?:[A-Z]{2,4}|museum|travel)\z/i
def self.included(base)
base.class_eval do
attribute = authentication_keys.first
validates_presence_of attribute
validates_uniqueness_of attribute, :allow_blank => true
validates_format_of attribute, :with => EMAIL_REGEX, :allow_blank => true,
:scope => authentication_keys[1..-1]
with_options :if => :password_required? do |v|
v.validates_presence_of :password
v.validates_confirmation_of :password
v.validates_length_of :password, :within => 6..20, :allow_blank => true
end
end
end
protected
# Checks whether a password is needed or not. For validations only.
# Passwords are always required if it's a new record, or if the password
# or confirmation are being set somewhere.
def password_required?
new_record? || !password.nil? || !password_confirmation.nil?
end
end
end
end
| module Devise
module Models
# Validatable creates all needed validations for a user email and password.
# It's optional, given you may want to create the validations by yourself.
# Automatically validate if the email is present, unique and it's format is
# valid. Also tests presence of password, confirmation and length
module Validatable
# Email regex used to validate email formats. Retrieved from authlogic.
EMAIL_REGEX = /\A[\w\.%\+\-]+@(?:[A-Z0-9\-]+\.)+(?:[A-Z]{2,4}|museum|travel)\z/i
def self.included(base)
base.class_eval do
+ attribute = authentication_keys.first
- validates_presence_of :email
? ^^^ ----
+ validates_presence_of attribute
? ^^^^^^^^
- validates_uniqueness_of :email, :allow_blank => true
? ^^^ ----
+ validates_uniqueness_of attribute, :allow_blank => true
? ^^^^^^^^
- validates_format_of :email, :with => EMAIL_REGEX, :allow_blank => true
? ^^^ ----
+ validates_format_of attribute, :with => EMAIL_REGEX, :allow_blank => true,
? ^^^^^^^^ +
+ :scope => authentication_keys[1..-1]
- validates_presence_of :password, :if => :password_required?
- validates_confirmation_of :password, :if => :password_required?
+ with_options :if => :password_required? do |v|
+ v.validates_presence_of :password
+ v.validates_confirmation_of :password
- validates_length_of :password, :within => 6..20, :allow_blank => true, :if => :password_required?
? ----------------------------
+ v.validates_length_of :password, :within => 6..20, :allow_blank => true
? ++++
+ end
end
end
protected
# Checks whether a password is needed or not. For validations only.
# Passwords are always required if it's a new record, or if the password
# or confirmation are being set somewhere.
def password_required?
new_record? || !password.nil? || !password_confirmation.nil?
end
end
end
end | 16 | 0.444444 | 10 | 6 |
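The rewritten validations key off `authentication_keys` instead of a hard-coded `:email`, and the check is scoped to the remaining keys, so the same value may recur as long as the scope attributes differ. A rough Python sketch of that scoped-uniqueness rule (the record shape and attribute names are invented for illustration, not taken from Devise):

```python
def is_unique(records, candidate, attribute, scope):
    """True if no existing record shares `attribute` within the same `scope` values."""
    for record in records:
        same_value = record[attribute] == candidate[attribute]
        same_scope = all(record[key] == candidate[key] for key in scope)
        if same_value and same_scope:
            return False
    return True

users = [{"email": "a@example.com", "subdomain": "acme"}]
# Scoped uniqueness: the same email is allowed under a different subdomain.
ok = is_unique(users, {"email": "a@example.com", "subdomain": "umbrella"},
               "email", scope=["subdomain"])
```

With an empty scope this degenerates to plain global uniqueness, which is what the original hard-coded `:email` validation expressed.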
0177066012b3373753cba8baf86f00a365d7147b | findaconf/tests/config.py | findaconf/tests/config.py |
from decouple import config
from findaconf.tests.fake_data import fake_conference, seed
def set_app(app, db=False):
unset_app(db)
app.config['TESTING'] = True
app.config['WTF_CSRF_ENABLED'] = False
if db:
app.config['SQLALCHEMY_DATABASE_URI'] = config(
'DATABASE_URL_TEST',
default='sqlite:///' + app.config['BASEDIR'].child('findaconf',
'tests',
'tests.db')
)
test_app = app.test_client()
if db:
db.create_all()
seed(app, db)
[db.session.add(fake_conference(db)) for i in range(1, 43)]
db.session.commit()
return test_app
def unset_app(db=False):
if db:
db.session.remove()
db.drop_all()
|
from decouple import config
from findaconf.tests.fake_data import fake_conference, seed
def set_app(app, db=False):
# set test vars
app.config['TESTING'] = True
app.config['WTF_CSRF_ENABLED'] = False
# set test db
if db:
app.config['SQLALCHEMY_DATABASE_URI'] = config(
'DATABASE_URL_TEST',
default='sqlite:///' + app.config['BASEDIR'].child('findaconf',
'tests',
'tests.db')
)
# create test app
test_app = app.test_client()
# create and feed db tables
if db:
# start from a clean db
db.session.remove()
db.drop_all()
# create tables and feed them
db.create_all()
seed(app, db)
[db.session.add(fake_conference(db)) for i in range(1, 43)]
db.session.commit()
# return test app
return test_app
def unset_app(db=False):
if db:
db.session.remove()
db.drop_all()
| Fix bug that used dev db instead of test db | Fix bug that used dev db instead of test db
| Python | mit | cuducos/findaconf,koorukuroo/findaconf,cuducos/findaconf,koorukuroo/findaconf,koorukuroo/findaconf,cuducos/findaconf | python | ## Code Before:
from decouple import config
from findaconf.tests.fake_data import fake_conference, seed
def set_app(app, db=False):
unset_app(db)
app.config['TESTING'] = True
app.config['WTF_CSRF_ENABLED'] = False
if db:
app.config['SQLALCHEMY_DATABASE_URI'] = config(
'DATABASE_URL_TEST',
default='sqlite:///' + app.config['BASEDIR'].child('findaconf',
'tests',
'tests.db')
)
test_app = app.test_client()
if db:
db.create_all()
seed(app, db)
[db.session.add(fake_conference(db)) for i in range(1, 43)]
db.session.commit()
return test_app
def unset_app(db=False):
if db:
db.session.remove()
db.drop_all()
## Instruction:
Fix bug that used dev db instead of test db
## Code After:
from decouple import config
from findaconf.tests.fake_data import fake_conference, seed
def set_app(app, db=False):
# set test vars
app.config['TESTING'] = True
app.config['WTF_CSRF_ENABLED'] = False
# set test db
if db:
app.config['SQLALCHEMY_DATABASE_URI'] = config(
'DATABASE_URL_TEST',
default='sqlite:///' + app.config['BASEDIR'].child('findaconf',
'tests',
'tests.db')
)
# create test app
test_app = app.test_client()
# create and feed db tables
if db:
# start from a clean db
db.session.remove()
db.drop_all()
# create tables and feed them
db.create_all()
seed(app, db)
[db.session.add(fake_conference(db)) for i in range(1, 43)]
db.session.commit()
# return test app
return test_app
def unset_app(db=False):
if db:
db.session.remove()
db.drop_all()
|
from decouple import config
from findaconf.tests.fake_data import fake_conference, seed
def set_app(app, db=False):
- unset_app(db)
+
+ # set test vars
app.config['TESTING'] = True
app.config['WTF_CSRF_ENABLED'] = False
+
+ # set test db
if db:
app.config['SQLALCHEMY_DATABASE_URI'] = config(
'DATABASE_URL_TEST',
default='sqlite:///' + app.config['BASEDIR'].child('findaconf',
'tests',
'tests.db')
)
+
+ # create test app
test_app = app.test_client()
+
+ # create and feed db tables
if db:
+
+ # start from a clean db
+ db.session.remove()
+ db.drop_all()
+
+ # create tables and feed them
db.create_all()
seed(app, db)
[db.session.add(fake_conference(db)) for i in range(1, 43)]
db.session.commit()
+
+ # return test app
return test_app
def unset_app(db=False):
if db:
db.session.remove()
db.drop_all() | 17 | 0.586207 | 16 | 1 |
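The bug here was one of ordering: the old `set_app` dropped tables before `SQLALCHEMY_DATABASE_URI` had been pointed at the test database, so the cleanup ran against whatever database the app was already configured with. The fix sets the test URI first and only then drops and recreates tables, so every run starts from a clean schema in the right database. The drop-then-create reset itself looks like this with plain `sqlite3` (table and column names are illustrative):

```python
import sqlite3

def fresh_table(conn):
    # Start from a clean slate so stale rows from an earlier run can't leak in.
    conn.execute("DROP TABLE IF EXISTS conferences")
    conn.execute("CREATE TABLE conferences (id INTEGER PRIMARY KEY, name TEXT)")

conn = sqlite3.connect(":memory:")
fresh_table(conn)
conn.execute("INSERT INTO conferences (name) VALUES ('PyCon')")
# Running the setup again wipes the previously seeded row.
fresh_table(conn)
count = conn.execute("SELECT COUNT(*) FROM conferences").fetchone()[0]
```

Because the drop happens inside the setup itself, a crashed previous run (which never reached teardown) can no longer leave data behind for the next run — `count` is 0 after the second `fresh_table` call.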
de50a519b2d18c1e53c86305d37b19c58cdd674d | app/src/main/res/layout/photo_item.xml | app/src/main/res/layout/photo_item.xml | <?xml version="1.0" encoding="utf-8"?>
<layout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<data>
</data>
<androidx.constraintlayout.widget.ConstraintLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/image"
android:layout_width="match_parent"
android:layout_height="150dp"
android:contentDescription="@string/app_name"
android:scaleType="centerCrop"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>
</layout> | <?xml version="1.0" encoding="utf-8"?>
<layout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<data>
</data>
<androidx.constraintlayout.widget.ConstraintLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/image"
android:layout_width="match_parent"
android:layout_height="0dp"
android:contentDescription="@string/app_name"
android:scaleType="centerCrop"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>
</layout> | Add vertical constraint for photo item | fix: Add vertical constraint for photo item
| XML | apache-2.0 | WenhaoWu/BingWallpaper | xml | ## Code Before:
<?xml version="1.0" encoding="utf-8"?>
<layout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<data>
</data>
<androidx.constraintlayout.widget.ConstraintLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/image"
android:layout_width="match_parent"
android:layout_height="150dp"
android:contentDescription="@string/app_name"
android:scaleType="centerCrop"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>
</layout>
## Instruction:
fix: Add vertical constraint for photo item
## Code After:
<?xml version="1.0" encoding="utf-8"?>
<layout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<data>
</data>
<androidx.constraintlayout.widget.ConstraintLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/image"
android:layout_width="match_parent"
android:layout_height="0dp"
android:contentDescription="@string/app_name"
android:scaleType="centerCrop"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>
</layout> | <?xml version="1.0" encoding="utf-8"?>
<layout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto">
<data>
</data>
<androidx.constraintlayout.widget.ConstraintLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ImageView
android:id="@+id/image"
android:layout_width="match_parent"
- android:layout_height="150dp"
? --
+ android:layout_height="0dp"
android:contentDescription="@string/app_name"
android:scaleType="centerCrop"
+ app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
- app:layout_constraintStart_toStartOf="parent" />
? ---
+ app:layout_constraintStart_toStartOf="parent"
+ app:layout_constraintTop_toTopOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>
</layout> | 6 | 0.25 | 4 | 2 |
82aba7f7de57908630ffe583ecb0414801b46b30 | uspek/build.gradle.kts | uspek/build.gradle.kts | plugins {
kotlin("multiplatform")
`maven-publish`
}
group = "com.github.langara.uspek"
version = "0.0.12"
repositories {
mavenCentral()
}
kotlin {
jvm()
js {
nodejs()
browser {
testTask {
enabled = System.getenv("JITPACK") != "true"
}
}
}
// linuxX64()
sourceSets {
val commonMain by getting {
dependencies {
api(Deps.kotlinTestCommon)
api(Deps.kotlinTestAnnotationsCommon)
}
}
val jvmMain by getting {
dependencies {
implementation(Deps.kotlinTestJUnit)
implementation(Deps.junit5)
}
}
val jsMain by getting {
dependencies {
implementation(Deps.kotlinTestJs)
}
}
// val linuxX64Main by getting {
// dependencies {
// // TODO
// }
// }
}
}
//// Create sources Jar from main kotlin sources
//val sourcesJar by tasks.creating(Jar::class) {
// group = JavaBasePlugin.DOCUMENTATION_GROUP
// description = "Assembles sources JAR"
// classifier = "sources"
// from(sourceSets.getByName("main").allSource)
//}
//
//publishing {
// publications {
// create("default", MavenPublication::class.java) {
// from(components.getByName("java"))
// artifact(sourcesJar)
// }
// }
// repositories {
// maven {
// url = uri("$buildDir/repository")
// }
// }
//}
| plugins {
kotlin("multiplatform")
`maven-publish`
}
group = "com.github.langara.uspek"
version = "0.0.12"
repositories {
mavenCentral()
}
kotlin {
jvm()
js {
nodejs()
browser {
testTask {
enabled = System.getenv("JITPACK") != "true"
}
}
}
linuxX64()
sourceSets {
val commonMain by getting {
dependencies {
api(Deps.kotlinTestCommon)
api(Deps.kotlinTestAnnotationsCommon)
}
}
val jvmMain by getting {
dependencies {
implementation(Deps.kotlinTestJUnit)
implementation(Deps.junit5)
}
}
val jsMain by getting {
dependencies {
implementation(Deps.kotlinTestJs)
}
}
val linuxX64Main by getting
}
}
//// Create sources Jar from main kotlin sources
//val sourcesJar by tasks.creating(Jar::class) {
// group = JavaBasePlugin.DOCUMENTATION_GROUP
// description = "Assembles sources JAR"
// classifier = "sources"
// from(sourceSets.getByName("main").allSource)
//}
//
//publishing {
// publications {
// create("default", MavenPublication::class.java) {
// from(components.getByName("java"))
// artifact(sourcesJar)
// }
// }
// repositories {
// maven {
// url = uri("$buildDir/repository")
// }
// }
//}
| Enable native target and tests (works on ubuntu after installing libtinfo5 package) | Enable native target and tests (works on ubuntu after installing libtinfo5 package)
| Kotlin | apache-2.0 | langara/USpek,langara/USpek,langara/USpek | kotlin | ## Code Before:
plugins {
kotlin("multiplatform")
`maven-publish`
}
group = "com.github.langara.uspek"
version = "0.0.12"
repositories {
mavenCentral()
}
kotlin {
jvm()
js {
nodejs()
browser {
testTask {
enabled = System.getenv("JITPACK") != "true"
}
}
}
// linuxX64()
sourceSets {
val commonMain by getting {
dependencies {
api(Deps.kotlinTestCommon)
api(Deps.kotlinTestAnnotationsCommon)
}
}
val jvmMain by getting {
dependencies {
implementation(Deps.kotlinTestJUnit)
implementation(Deps.junit5)
}
}
val jsMain by getting {
dependencies {
implementation(Deps.kotlinTestJs)
}
}
// val linuxX64Main by getting {
// dependencies {
// // TODO
// }
// }
}
}
//// Create sources Jar from main kotlin sources
//val sourcesJar by tasks.creating(Jar::class) {
// group = JavaBasePlugin.DOCUMENTATION_GROUP
// description = "Assembles sources JAR"
// classifier = "sources"
// from(sourceSets.getByName("main").allSource)
//}
//
//publishing {
// publications {
// create("default", MavenPublication::class.java) {
// from(components.getByName("java"))
// artifact(sourcesJar)
// }
// }
// repositories {
// maven {
// url = uri("$buildDir/repository")
// }
// }
//}
## Instruction:
Enable native target and tests (works on ubuntu after installing libtinfo5 package)
## Code After:
plugins {
kotlin("multiplatform")
`maven-publish`
}
group = "com.github.langara.uspek"
version = "0.0.12"
repositories {
mavenCentral()
}
kotlin {
jvm()
js {
nodejs()
browser {
testTask {
enabled = System.getenv("JITPACK") != "true"
}
}
}
linuxX64()
sourceSets {
val commonMain by getting {
dependencies {
api(Deps.kotlinTestCommon)
api(Deps.kotlinTestAnnotationsCommon)
}
}
val jvmMain by getting {
dependencies {
implementation(Deps.kotlinTestJUnit)
implementation(Deps.junit5)
}
}
val jsMain by getting {
dependencies {
implementation(Deps.kotlinTestJs)
}
}
val linuxX64Main by getting
}
}
//// Create sources Jar from main kotlin sources
//val sourcesJar by tasks.creating(Jar::class) {
// group = JavaBasePlugin.DOCUMENTATION_GROUP
// description = "Assembles sources JAR"
// classifier = "sources"
// from(sourceSets.getByName("main").allSource)
//}
//
//publishing {
// publications {
// create("default", MavenPublication::class.java) {
// from(components.getByName("java"))
// artifact(sourcesJar)
// }
// }
// repositories {
// maven {
// url = uri("$buildDir/repository")
// }
// }
//}
| plugins {
kotlin("multiplatform")
`maven-publish`
}
group = "com.github.langara.uspek"
version = "0.0.12"
repositories {
mavenCentral()
}
kotlin {
jvm()
js {
nodejs()
browser {
testTask {
enabled = System.getenv("JITPACK") != "true"
}
}
}
- // linuxX64()
? --
+ linuxX64()
sourceSets {
val commonMain by getting {
dependencies {
api(Deps.kotlinTestCommon)
api(Deps.kotlinTestAnnotationsCommon)
}
}
val jvmMain by getting {
dependencies {
implementation(Deps.kotlinTestJUnit)
implementation(Deps.junit5)
}
}
val jsMain by getting {
dependencies {
implementation(Deps.kotlinTestJs)
}
}
- // val linuxX64Main by getting {
? -- --
+ val linuxX64Main by getting
- // dependencies {
- // // TODO
- // }
- // }
}
}
//// Create sources Jar from main kotlin sources
//val sourcesJar by tasks.creating(Jar::class) {
// group = JavaBasePlugin.DOCUMENTATION_GROUP
// description = "Assembles sources JAR"
// classifier = "sources"
// from(sourceSets.getByName("main").allSource)
//}
//
//publishing {
// publications {
// create("default", MavenPublication::class.java) {
// from(components.getByName("java"))
// artifact(sourcesJar)
// }
// }
// repositories {
// maven {
// url = uri("$buildDir/repository")
// }
// }
//} | 8 | 0.112676 | 2 | 6 |
1e87b2a8ddc7a0476f22ed8f5f889a1dcc7ddee1 | CONTRIBUTORS.rst | CONTRIBUTORS.rst | Original Authors
================
* Matthew Westcott matthew.westcott@torchbox.com
* David Cranwell david.cranwell@torchbox.com
* Karl Hobley karl.hobley@torchbox.com
* Helen Chapman helen.chapman@torchbox.com
Contributors
============
* Balazs Endresz balazs.endresz@torchbox.com
* Neal Todd neal.todd@torchbox.com
* Paul Hallett (twilio) hello@phalt.co
* Tom Dyson
* Serafeim Papastefanos
* Łukasz Bołdys
* peterarenot
* Levi Gross
* Delgermurun Purevkhuu
* Lewis Cowper
* Stephen Newey
* Ryan Foster
* v1kku
* Miguel Vieira
* Ben Emery
Translators
===========
* Basque: Unai Zalakain
* Bulgarian: Lyuboslav Petrov
* Catalan: David Llop
* Chinese: Lihan Li
* French: Sylvain Fankhauser
* Galician: fooflare
* German: Karl Sander, Johannes Spielmann
* Greek: Serafeim Papastefanos
* Mongolian: Delgermurun Purevkhuu
* Polish: Łukasz Bołdys
* Romanian: Dan Braghis
* Spanish: Unai Zalakain, fooflare
| Original Authors
================
* Matthew Westcott matthew.westcott@torchbox.com
* David Cranwell david.cranwell@torchbox.com
* Karl Hobley karl.hobley@torchbox.com
* Helen Chapman helen.chapman@torchbox.com
Contributors
============
* Balazs Endresz balazs.endresz@torchbox.com
* Neal Todd neal.todd@torchbox.com
* Paul Hallett (twilio) hello@phalt.co
* Tom Dyson
* Serafeim Papastefanos
* Łukasz Bołdys
* peterarenot
* Levi Gross
* Delgermurun Purevkhuu
* Lewis Cowper
* Stephen Newey
* Ryan Foster
* v1kku
* Miguel Vieira
* Ben Emery
Translators
===========
* Basque: Unai Zalakain
* Bulgarian: Lyuboslav Petrov
* Catalan: David Llop
* Chinese: Lihan Li
* French: Sylvain Fankhauser
* Galician: fooflare
* German: Karl Sander, Johannes Spielmann
* Greek: Serafeim Papastefanos
* Mongolian: Delgermurun Purevkhuu
* Polish: Łukasz Bołdys
* Romanian: Dan Braghis
* Spanish: Unai Zalakain, fooflare
* Portuguese Brazil: Gilson Filho
| Insert responsible of translation in Portuguese Brazil | Insert responsible of translation in Portuguese Brazil
| reStructuredText | bsd-3-clause | wagtail/wagtail,mephizzle/wagtail,JoshBarr/wagtail,kurtrwall/wagtail,takeshineshiro/wagtail,100Shapes/wagtail,zerolab/wagtail,rjsproxy/wagtail,dresiu/wagtail,takeshineshiro/wagtail,marctc/wagtail,kurtw/wagtail,rsalmaso/wagtail,taedori81/wagtail,wagtail/wagtail,thenewguy/wagtail,rv816/wagtail,willcodefortea/wagtail,takeshineshiro/wagtail,bjesus/wagtail,darith27/wagtail,WQuanfeng/wagtail,nrsimha/wagtail,tangentlabs/wagtail,Klaudit/wagtail,torchbox/wagtail,nutztherookie/wagtail,jnns/wagtail,benemery/wagtail,mikedingjan/wagtail,stevenewey/wagtail,rv816/wagtail,marctc/wagtail,Tivix/wagtail,nimasmi/wagtail,benjaoming/wagtail,jnns/wagtail,kurtw/wagtail,Toshakins/wagtail,kurtw/wagtail,iansprice/wagtail,iansprice/wagtail,iho/wagtail,chrxr/wagtail,hamsterbacke23/wagtail,nrsimha/wagtail,dresiu/wagtail,zerolab/wagtail,dresiu/wagtail,mephizzle/wagtail,chrxr/wagtail,janusnic/wagtail,kaedroho/wagtail,nilnvoid/wagtail,KimGlazebrook/wagtail-experiment,Toshakins/wagtail,mjec/wagtail,chimeno/wagtail,takeshineshiro/wagtail,nimasmi/wagtail,benemery/wagtail,nilnvoid/wagtail,dresiu/wagtail,bjesus/wagtail,bjesus/wagtail,rjsproxy/wagtail,stevenewey/wagtail,bjesus/wagtail,Tivix/wagtail,quru/wagtail,mephizzle/wagtail,darith27/wagtail,inonit/wagtail,jnns/wagtail,serzans/wagtail,davecranwell/wagtail,kaedroho/wagtail,Pennebaker/wagtail,kaedroho/wagtail,mikedingjan/wagtail,gasman/wagtail,FlipperPA/wagtail,jorge-marques/wagtail,mixxorz/wagtail,mixxorz/wagtail,nrsimha/wagtail,davecranwell/wagtail,nrsimha/wagtail,inonit/wagtail,mikedingjan/wagtail,iho/wagtail,kurtrwall/wagtail,torchbox/wagtail,JoshBarr/wagtail,gogobook/wagtail,KimGlazebrook/wagtail-experiment,willcodefortea/wagtail,helenwarren/pied-wagtail,FlipperPA/wagtail,willcodefortea/wagtail,hanpama/wagtail,janusnic/wagtail,jnns/wagtail,torchbox/wagtail,taedori81/wagtail,benjaoming/wagtail,m-sanders/wagtail,gasman/wagtail,rv816/wagtail,nealtodd/wagtail,gasman/wagtail,nutztherookie/wagtail,nilnvoid/wagtail,ben
emery/wagtail,rjsproxy/wagtail,nilnvoid/wagtail,mjec/wagtail,jordij/wagtail,tangentlabs/wagtail,JoshBarr/wagtail,100Shapes/wagtail,dresiu/wagtail,tangentlabs/wagtail,inonit/wagtail,takeflight/wagtail,KimGlazebrook/wagtail-experiment,WQuanfeng/wagtail,iansprice/wagtail,kaedroho/wagtail,kurtrwall/wagtail,iho/wagtail,tangentlabs/wagtail,thenewguy/wagtail,Pennebaker/wagtail,chrxr/wagtail,timorieber/wagtail,thenewguy/wagtail,quru/wagtail,mikedingjan/wagtail,chimeno/wagtail,rsalmaso/wagtail,rsalmaso/wagtail,jordij/wagtail,Pennebaker/wagtail,hanpama/wagtail,stevenewey/wagtail,chimeno/wagtail,Tivix/wagtail,mayapurmedia/wagtail,mephizzle/wagtail,zerolab/wagtail,gogobook/wagtail,taedori81/wagtail,benjaoming/wagtail,hanpama/wagtail,janusnic/wagtail,lojack/wagtail,thenewguy/wagtail,hamsterbacke23/wagtail,m-sanders/wagtail,kurtrwall/wagtail,iho/wagtail,jorge-marques/wagtail,lojack/wagtail,timorieber/wagtail,WQuanfeng/wagtail,serzans/wagtail,torchbox/wagtail,gasman/wagtail,mixxorz/wagtail,serzans/wagtail,chrxr/wagtail,gasman/wagtail,mjec/wagtail,nimasmi/wagtail,mjec/wagtail,willcodefortea/wagtail,mixxorz/wagtail,Klaudit/wagtail,inonit/wagtail,darith27/wagtail,wagtail/wagtail,nimasmi/wagtail,marctc/wagtail,marctc/wagtail,JoshBarr/wagtail,nealtodd/wagtail,Pennebaker/wagtail,chimeno/wagtail,taedori81/wagtail,mixxorz/wagtail,Tivix/wagtail,quru/wagtail,benemery/wagtail,wagtail/wagtail,helenwarren/pied-wagtail,jorge-marques/wagtail,FlipperPA/wagtail,iansprice/wagtail,WQuanfeng/wagtail,KimGlazebrook/wagtail-experiment,FlipperPA/wagtail,davecranwell/wagtail,takeflight/wagtail,takeflight/wagtail,davecranwell/wagtail,Klaudit/wagtail,rjsproxy/wagtail,rv816/wagtail,rsalmaso/wagtail,kurtw/wagtail,hamsterbacke23/wagtail,100Shapes/wagtail,jorge-marques/wagtail,Toshakins/wagtail,m-sanders/wagtail,wagtail/wagtail,nutztherookie/wagtail,nealtodd/wagtail,taedori81/wagtail,mayapurmedia/wagtail,chimeno/wagtail,janusnic/wagtail,zerolab/wagtail,m-sanders/wagtail,quru/wagtail,helenwarren/pied-wagtail,may
apurmedia/wagtail,nutztherookie/wagtail,thenewguy/wagtail,benjaoming/wagtail,hamsterbacke23/wagtail,jordij/wagtail,zerolab/wagtail,Toshakins/wagtail,jorge-marques/wagtail,serzans/wagtail,kaedroho/wagtail,timorieber/wagtail,gogobook/wagtail,hanpama/wagtail,timorieber/wagtail,rsalmaso/wagtail,stevenewey/wagtail,darith27/wagtail,Klaudit/wagtail,jordij/wagtail,takeflight/wagtail,lojack/wagtail,nealtodd/wagtail,mayapurmedia/wagtail,gogobook/wagtail | restructuredtext | ## Code Before:
Original Authors
================
* Matthew Westcott matthew.westcott@torchbox.com
* David Cranwell david.cranwell@torchbox.com
* Karl Hobley karl.hobley@torchbox.com
* Helen Chapman helen.chapman@torchbox.com
Contributors
============
* Balazs Endresz balazs.endresz@torchbox.com
* Neal Todd neal.todd@torchbox.com
* Paul Hallett (twilio) hello@phalt.co
* Tom Dyson
* Serafeim Papastefanos
* Łukasz Bołdys
* peterarenot
* Levi Gross
* Delgermurun Purevkhuu
* Lewis Cowper
* Stephen Newey
* Ryan Foster
* v1kku
* Miguel Vieira
* Ben Emery
Translators
===========
* Basque: Unai Zalakain
* Bulgarian: Lyuboslav Petrov
* Catalan: David Llop
* Chinese: Lihan Li
* French: Sylvain Fankhauser
* Galician: fooflare
* German: Karl Sander, Johannes Spielmann
* Greek: Serafeim Papastefanos
* Mongolian: Delgermurun Purevkhuu
* Polish: Łukasz Bołdys
* Romanian: Dan Braghis
* Spanish: Unai Zalakain, fooflare
## Instruction:
Insert responsible of translation in Portuguese Brazil
## Code After:
Original Authors
================
* Matthew Westcott matthew.westcott@torchbox.com
* David Cranwell david.cranwell@torchbox.com
* Karl Hobley karl.hobley@torchbox.com
* Helen Chapman helen.chapman@torchbox.com
Contributors
============
* Balazs Endresz balazs.endresz@torchbox.com
* Neal Todd neal.todd@torchbox.com
* Paul Hallett (twilio) hello@phalt.co
* Tom Dyson
* Serafeim Papastefanos
* Łukasz Bołdys
* peterarenot
* Levi Gross
* Delgermurun Purevkhuu
* Lewis Cowper
* Stephen Newey
* Ryan Foster
* v1kku
* Miguel Vieira
* Ben Emery
Translators
===========
* Basque: Unai Zalakain
* Bulgarian: Lyuboslav Petrov
* Catalan: David Llop
* Chinese: Lihan Li
* French: Sylvain Fankhauser
* Galician: fooflare
* German: Karl Sander, Johannes Spielmann
* Greek: Serafeim Papastefanos
* Mongolian: Delgermurun Purevkhuu
* Polish: Łukasz Bołdys
* Romanian: Dan Braghis
* Spanish: Unai Zalakain, fooflare
* Portuguese Brazil: Gilson Filho
| Original Authors
================
* Matthew Westcott matthew.westcott@torchbox.com
* David Cranwell david.cranwell@torchbox.com
* Karl Hobley karl.hobley@torchbox.com
* Helen Chapman helen.chapman@torchbox.com
Contributors
============
* Balazs Endresz balazs.endresz@torchbox.com
* Neal Todd neal.todd@torchbox.com
* Paul Hallett (twilio) hello@phalt.co
* Tom Dyson
* Serafeim Papastefanos
* Łukasz Bołdys
* peterarenot
* Levi Gross
* Delgermurun Purevkhuu
* Lewis Cowper
* Stephen Newey
* Ryan Foster
* v1kku
* Miguel Vieira
* Ben Emery
Translators
===========
* Basque: Unai Zalakain
* Bulgarian: Lyuboslav Petrov
* Catalan: David Llop
* Chinese: Lihan Li
* French: Sylvain Fankhauser
* Galician: fooflare
* German: Karl Sander, Johannes Spielmann
* Greek: Serafeim Papastefanos
* Mongolian: Delgermurun Purevkhuu
* Polish: Łukasz Bołdys
* Romanian: Dan Braghis
* Spanish: Unai Zalakain, fooflare
+ * Portuguese Brazil: Gilson Filho | 1 | 0.02381 | 1 | 0 |
c2c81ed89938155ce91acb5173ac38580f630e3d | .travis.yml | .travis.yml | language: ruby
rvm:
- 2.0.0
- 2.1.8
- 2.2.4
bundler_args: --without guard
before_script: gem install bundler -v 1.10.6 && sudo service redis-server status && sudo service mongodb status
script: bundle exec rubocop && bundle exec rake
services:
- redis-server
- mongodb
- memcached
env:
- RAILS_VERSION=5.0.0
- RAILS_VERSION=4.2.5
- RAILS_VERSION=3.2.21
matrix:
# don't run rails 5 on ruby versions that can't install rack 2
exclude:
- rvm: 2.0.0
env: RAILS_VERSION=5.0.0
- rvm: 2.1.8
env: RAILS_VERSION=5.0.0
| language: ruby
rvm:
- 2.1
- 2.2
- 2.3
bundler_args: --without guard
before_script: gem install bundler -v 1.10.6 && sudo service redis-server status && sudo service mongodb status
script: bundle exec rubocop && bundle exec rake
services:
- redis-server
- mongodb
- memcached
env:
- RAILS_VERSION=5.0.0
- RAILS_VERSION=4.2.5
- RAILS_VERSION=3.2.21
matrix:
# don't run rails 5 on ruby versions that can't install rack 2
exclude:
- rvm: 2.1.8
env: RAILS_VERSION=5.0.0
| Simplify ruby versions checked and drop 2.0 support | Simplify ruby versions checked and drop 2.0 support
2.0 is EOL'd so let's stop supporting it too.
| YAML | mit | jnunemaker/flipper,jnunemaker/flipper,jnunemaker/flipper,jnunemaker/flipper | yaml | ## Code Before:
language: ruby
rvm:
- 2.0.0
- 2.1.8
- 2.2.4
bundler_args: --without guard
before_script: gem install bundler -v 1.10.6 && sudo service redis-server status && sudo service mongodb status
script: bundle exec rubocop && bundle exec rake
services:
- redis-server
- mongodb
- memcached
env:
- RAILS_VERSION=5.0.0
- RAILS_VERSION=4.2.5
- RAILS_VERSION=3.2.21
matrix:
# don't run rails 5 on ruby versions that can't install rack 2
exclude:
- rvm: 2.0.0
env: RAILS_VERSION=5.0.0
- rvm: 2.1.8
env: RAILS_VERSION=5.0.0
## Instruction:
Simplify ruby versions checked and drop 2.0 support
2.0 is EOL'd so let's stop supporting it too.
## Code After:
language: ruby
rvm:
- 2.1
- 2.2
- 2.3
bundler_args: --without guard
before_script: gem install bundler -v 1.10.6 && sudo service redis-server status && sudo service mongodb status
script: bundle exec rubocop && bundle exec rake
services:
- redis-server
- mongodb
- memcached
env:
- RAILS_VERSION=5.0.0
- RAILS_VERSION=4.2.5
- RAILS_VERSION=3.2.21
matrix:
# don't run rails 5 on ruby versions that can't install rack 2
exclude:
- rvm: 2.1.8
env: RAILS_VERSION=5.0.0
| language: ruby
rvm:
- - 2.0.0
- - 2.1.8
? --
+ - 2.1
- - 2.2.4
? --
+ - 2.2
+ - 2.3
bundler_args: --without guard
before_script: gem install bundler -v 1.10.6 && sudo service redis-server status && sudo service mongodb status
script: bundle exec rubocop && bundle exec rake
services:
- redis-server
- mongodb
- memcached
env:
- RAILS_VERSION=5.0.0
- RAILS_VERSION=4.2.5
- RAILS_VERSION=3.2.21
matrix:
# don't run rails 5 on ruby versions that can't install rack 2
exclude:
- - rvm: 2.0.0
- env: RAILS_VERSION=5.0.0
- rvm: 2.1.8
env: RAILS_VERSION=5.0.0 | 8 | 0.347826 | 3 | 5 |
77c2bc0acf8567606e42b611fe00162ad7cfa2de | capstone-project/src/main/angular-webapp/src/app/marker-filter/marker-filter.component.html | capstone-project/src/main/angular-webapp/src/app/marker-filter/marker-filter.component.html | <form class="example-form">
<mat-form-field class="example-full-width">
<input type="text"
placeholder="Filter by animal"
matInput
[formControl]="myControl"
[matAutocomplete]="auto">
<mat-autocomplete autoActiveFirstOption #auto="matAutocomplete" [displayWith]="filterMarkers.bind(this)">
<mat-option *ngFor="let option of filteredOptions | async" [value]="option">
{{option}}
</mat-option>
</mat-autocomplete>
</mat-form-field>
</form> | <form>
<mat-form-field>
<input type="text"
placeholder="Filter by animal"
matInput
[formControl]="myControl"
[matAutocomplete]="auto">
<mat-autocomplete autoActiveFirstOption #auto="matAutocomplete" [displayWith]="filterMarkers.bind(this)">
<mat-option *ngFor="let option of filteredOptions | async" [value]="option">
{{option}}
</mat-option>
</mat-autocomplete>
</mat-form-field>
</form> | Remove class names in html | Remove class names in html
| HTML | apache-2.0 | googleinterns/step263-2020,googleinterns/step263-2020,googleinterns/step263-2020,googleinterns/step263-2020 | html | ## Code Before:
<form class="example-form">
<mat-form-field class="example-full-width">
<input type="text"
placeholder="Filter by animal"
matInput
[formControl]="myControl"
[matAutocomplete]="auto">
<mat-autocomplete autoActiveFirstOption #auto="matAutocomplete" [displayWith]="filterMarkers.bind(this)">
<mat-option *ngFor="let option of filteredOptions | async" [value]="option">
{{option}}
</mat-option>
</mat-autocomplete>
</mat-form-field>
</form>
## Instruction:
Remove class names in html
## Code After:
<form>
<mat-form-field>
<input type="text"
placeholder="Filter by animal"
matInput
[formControl]="myControl"
[matAutocomplete]="auto">
<mat-autocomplete autoActiveFirstOption #auto="matAutocomplete" [displayWith]="filterMarkers.bind(this)">
<mat-option *ngFor="let option of filteredOptions | async" [value]="option">
{{option}}
</mat-option>
</mat-autocomplete>
</mat-form-field>
</form> | - <form class="example-form">
- <mat-form-field class="example-full-width">
+ <form>
+ <mat-form-field>
<input type="text"
placeholder="Filter by animal"
matInput
[formControl]="myControl"
[matAutocomplete]="auto">
<mat-autocomplete autoActiveFirstOption #auto="matAutocomplete" [displayWith]="filterMarkers.bind(this)">
<mat-option *ngFor="let option of filteredOptions | async" [value]="option">
{{option}}
</mat-option>
</mat-autocomplete>
</mat-form-field>
</form> | 4 | 0.285714 | 2 | 2 |
11661aefd80bdd37d1cfc5274c285c79ab21160a | README.md | README.md | Active Merchant library for integration with order fulfillment services.
[](https://travis-ci.org/Shopify/active_fulfillment)
# Installation
Add to your gem file
```
gem 'active_fulfillment', :git => 'git://github.com/Shopify/active_fulfillment.git'
```
# Usage
# Contributors
* Adrian Irving-Beer
* Arthur Nogueira Neves
* Blake Mesdag
* Chris Saunders (<http://christophersaunders.ca>)
* Cody Fauser
* Denis Odorcic
* Dennis Theisen
* James MacAulay
* Jesse HK
* Jesse Storimer
* John Duff
* John Tajima
* Jonathan Rudenberg
* Ryan Romanchuk
* Simon Eskildsen
* Tobias Lütke
* Tom Burns
| Active Merchant library for integration with order fulfillment services.
[](https://travis-ci.org/Shopify/active_fulfillment)
# Installation
Add to your gem file
```
gem 'active_fulfillment', :git => 'git://github.com/Shopify/active_fulfillment.git'
```
# Contributing
Contributions are encouraged of course! The basic Fulfillment Service API can be seen in
[active_fulfillment/fulfillment/service.rb](https://github.com/Shopify/active_fulfillment/blob/master/lib/active_fulfillment/fulfillment/service.rb).
New features and patches will have a better chance of getting merged if you include a strong
set of tests for your patch or feature.
It is also encouraged to try to keep your functions as concise as possible to make review easier
as well as to make testing more straightforward.
# Usage
# Contributors
* Adrian Irving-Beer
* Arthur Nogueira Neves
* Blake Mesdag
* Chris Saunders (<http://christophersaunders.ca>)
* Cody Fauser
* Denis Odorcic
* Dennis Theisen
* James MacAulay
* Jesse HK
* Jesse Storimer
* John Duff
* John Tajima
* Jonathan Rudenberg
* Ryan Romanchuk
* Simon Eskildsen
* Tobias Lütke
* Tom Burns
| Add some contribution guidelines to the readme | Add some contribution guidelines to the readme
| Markdown | mit | ddrmanxbxfr/active_fulfillment | markdown | ## Code Before:
Active Merchant library for integration with order fulfillment services.
[](https://travis-ci.org/Shopify/active_fulfillment)
# Installation
Add to your gem file
```
gem 'active_fulfillment', :git => 'git://github.com/Shopify/active_fulfillment.git'
```
# Usage
# Contributors
* Adrian Irving-Beer
* Arthur Nogueira Neves
* Blake Mesdag
* Chris Saunders (<http://christophersaunders.ca>)
* Cody Fauser
* Denis Odorcic
* Dennis Theisen
* James MacAulay
* Jesse HK
* Jesse Storimer
* John Duff
* John Tajima
* Jonathan Rudenberg
* Ryan Romanchuk
* Simon Eskildsen
* Tobias Lütke
* Tom Burns
## Instruction:
Add some contribution guidelines to the readme
## Code After:
Active Merchant library for integration with order fulfillment services.
[](https://travis-ci.org/Shopify/active_fulfillment)
# Installation
Add to your gem file
```
gem 'active_fulfillment', :git => 'git://github.com/Shopify/active_fulfillment.git'
```
# Contributing
Contributions are encouraged of course! The basic Fulfillment Service API can be seen in
[active_fulfillment/fulfillment/service.rb](https://github.com/Shopify/active_fulfillment/blob/master/lib/active_fulfillment/fulfillment/service.rb).
New features and patches will have a better chance of getting merged if you include a strong
set of tests for your patch or feature.
It is also encouraged to try to keep your functions as concise as possible to make review easier
as well as to make testing more straightforward.
# Usage
# Contributors
* Adrian Irving-Beer
* Arthur Nogueira Neves
* Blake Mesdag
* Chris Saunders (<http://christophersaunders.ca>)
* Cody Fauser
* Denis Odorcic
* Dennis Theisen
* James MacAulay
* Jesse HK
* Jesse Storimer
* John Duff
* John Tajima
* Jonathan Rudenberg
* Ryan Romanchuk
* Simon Eskildsen
* Tobias Lütke
* Tom Burns
| Active Merchant library for integration with order fulfillment services.
[](https://travis-ci.org/Shopify/active_fulfillment)
# Installation
Add to your gem file
```
gem 'active_fulfillment', :git => 'git://github.com/Shopify/active_fulfillment.git'
```
-
+
+ # Contributing
+
+ Contributions are encouraged of course! The basic Fulfillment Service API can be seen in
+ [active_fulfillment/fulfillment/service.rb](https://github.com/Shopify/active_fulfillment/blob/master/lib/active_fulfillment/fulfillment/service.rb).
+
+ New features and patches will have a better chance of getting merged if you include a strong
+ set of tests for your patch or feature.
+
+ It is also encouraged to try to keep your functions as concise as possible to make review easier
+ as well as to make testing more straightforward.
+
# Usage
# Contributors
* Adrian Irving-Beer
* Arthur Nogueira Neves
* Blake Mesdag
* Chris Saunders (<http://christophersaunders.ca>)
* Cody Fauser
* Denis Odorcic
* Dennis Theisen
* James MacAulay
* Jesse HK
* Jesse Storimer
* John Duff
* John Tajima
* Jonathan Rudenberg
* Ryan Romanchuk
* Simon Eskildsen
* Tobias Lütke
* Tom Burns | 13 | 0.40625 | 12 | 1 |
d99ef1ab1dc414294a200d4dafcb0d21c2d3f6d8 | webapp/byceps/blueprints/board/formatting.py | webapp/byceps/blueprints/board/formatting.py |
import bbcode
def render_html(value):
"""Render text as HTML, interpreting BBcode."""
parser = bbcode.Parser(replace_cosmetic=False)
# Replace image tags.
def render_image(name, value, options, parent, context):
return '<img src="{}"/>'.format(value)
parser.add_formatter('img', render_image, replace_links=False)
# Render quotes with optional author.
def render_quote(name, value, options, parent, context):
intro = ''
if 'author' in options:
author = options['author']
intro = '<p class="quote-intro"><cite>{}</cite> schrieb:</p>\n' \
.format(author)
return '{}<blockquote>{}</blockquote>'.format(intro, value)
parser.add_formatter('quote', render_quote, strip=True)
return parser.format(value)
|
import bbcode
def create_parser():
"""Create a customized BBcode parser."""
parser = bbcode.Parser(replace_cosmetic=False)
# Replace image tags.
def render_image(name, value, options, parent, context):
return '<img src="{}"/>'.format(value)
parser.add_formatter('img', render_image, replace_links=False)
# Render quotes with optional author.
def render_quote(name, value, options, parent, context):
intro = ''
if 'author' in options:
author = options['author']
intro = '<p class="quote-intro"><cite>{}</cite> schrieb:</p>\n' \
.format(author)
return '{}<blockquote>{}</blockquote>'.format(intro, value)
parser.add_formatter('quote', render_quote, strip=True)
return parser
PARSER = create_parser()
def render_html(value):
"""Render text as HTML, interpreting BBcode."""
return PARSER.format(value)
| Create and reuse a single BBcode parser instance. | Create and reuse a single BBcode parser instance.
| Python | bsd-3-clause | m-ober/byceps,m-ober/byceps,homeworkprod/byceps,homeworkprod/byceps,m-ober/byceps,homeworkprod/byceps | python | ## Code Before:
import bbcode
def render_html(value):
"""Render text as HTML, interpreting BBcode."""
parser = bbcode.Parser(replace_cosmetic=False)
# Replace image tags.
def render_image(name, value, options, parent, context):
return '<img src="{}"/>'.format(value)
parser.add_formatter('img', render_image, replace_links=False)
# Render quotes with optional author.
def render_quote(name, value, options, parent, context):
intro = ''
if 'author' in options:
author = options['author']
intro = '<p class="quote-intro"><cite>{}</cite> schrieb:</p>\n' \
.format(author)
return '{}<blockquote>{}</blockquote>'.format(intro, value)
parser.add_formatter('quote', render_quote, strip=True)
return parser.format(value)
## Instruction:
Create and reuse a single BBcode parser instance.
## Code After:
import bbcode
def create_parser():
"""Create a customized BBcode parser."""
parser = bbcode.Parser(replace_cosmetic=False)
# Replace image tags.
def render_image(name, value, options, parent, context):
return '<img src="{}"/>'.format(value)
parser.add_formatter('img', render_image, replace_links=False)
# Render quotes with optional author.
def render_quote(name, value, options, parent, context):
intro = ''
if 'author' in options:
author = options['author']
intro = '<p class="quote-intro"><cite>{}</cite> schrieb:</p>\n' \
.format(author)
return '{}<blockquote>{}</blockquote>'.format(intro, value)
parser.add_formatter('quote', render_quote, strip=True)
return parser
PARSER = create_parser()
def render_html(value):
"""Render text as HTML, interpreting BBcode."""
return PARSER.format(value)
|
import bbcode
- def render_html(value):
- """Render text as HTML, interpreting BBcode."""
+ def create_parser():
+ """Create a customized BBcode parser."""
parser = bbcode.Parser(replace_cosmetic=False)
# Replace image tags.
def render_image(name, value, options, parent, context):
return '<img src="{}"/>'.format(value)
parser.add_formatter('img', render_image, replace_links=False)
# Render quotes with optional author.
def render_quote(name, value, options, parent, context):
intro = ''
if 'author' in options:
author = options['author']
intro = '<p class="quote-intro"><cite>{}</cite> schrieb:</p>\n' \
.format(author)
return '{}<blockquote>{}</blockquote>'.format(intro, value)
parser.add_formatter('quote', render_quote, strip=True)
+ return parser
+
+
+ PARSER = create_parser()
+
+
+ def render_html(value):
+ """Render text as HTML, interpreting BBcode."""
- return parser.format(value)
? ^^^^^^
+ return PARSER.format(value)
? ^^^^^^
| 14 | 0.583333 | 11 | 3 |
edda8744772f368a1ac10d3d11866145e47fd394 | lib/app_frame/controller_methods.rb | lib/app_frame/controller_methods.rb | module AppFrame
module ControllerMethods
def self.included(klass)
klass.extend(ClassMethods)
end
module ClassMethods
def app_frame(options = {})
layout "#{AppFrame::theme}/#{options[:layout] || 'default'}"
unless options[:resource] == false
inherit_resources
include PaginationSupport
end
end
end
module PaginationSupport
# paginate collection
def collection
get_collection_ivar || set_collection_ivar(end_of_association_chain.scoped.paginate(:page => params[:page], :per_page => per_page))
end
def per_page
20
end
end
end
end | module AppFrame
module ControllerMethods
def self.included(klass)
klass.extend(ClassMethods)
end
module ClassMethods
def app_frame(options = {})
layout "#{AppFrame::theme}/#{options[:layout] || 'default'}"
unless options[:resource] == false
inherit_resources
include PaginationSupport
end
end
end
module PaginationSupport
# paginate collection
def collection
get_collection_ivar || set_collection_ivar(end_of_association_chain.page(params[:page]).per(per_page))
end
def per_page
20
end
end
end
end | Use kaminari syntax for pagination | Use kaminari syntax for pagination
| Ruby | mit | mateomurphy/app_frame,mateomurphy/app_frame,mateomurphy/app_frame | ruby | ## Code Before:
module AppFrame
module ControllerMethods
def self.included(klass)
klass.extend(ClassMethods)
end
module ClassMethods
def app_frame(options = {})
layout "#{AppFrame::theme}/#{options[:layout] || 'default'}"
unless options[:resource] == false
inherit_resources
include PaginationSupport
end
end
end
module PaginationSupport
# paginate collection
def collection
get_collection_ivar || set_collection_ivar(end_of_association_chain.scoped.paginate(:page => params[:page], :per_page => per_page))
end
def per_page
20
end
end
end
end
## Instruction:
Use kaminari syntax for pagination
## Code After:
module AppFrame
module ControllerMethods
def self.included(klass)
klass.extend(ClassMethods)
end
module ClassMethods
def app_frame(options = {})
layout "#{AppFrame::theme}/#{options[:layout] || 'default'}"
unless options[:resource] == false
inherit_resources
include PaginationSupport
end
end
end
module PaginationSupport
# paginate collection
def collection
get_collection_ivar || set_collection_ivar(end_of_association_chain.page(params[:page]).per(per_page))
end
def per_page
20
end
end
end
end | module AppFrame
module ControllerMethods
def self.included(klass)
klass.extend(ClassMethods)
end
module ClassMethods
def app_frame(options = {})
layout "#{AppFrame::theme}/#{options[:layout] || 'default'}"
unless options[:resource] == false
inherit_resources
include PaginationSupport
end
end
end
module PaginationSupport
# paginate collection
def collection
- get_collection_ivar || set_collection_ivar(end_of_association_chain.scoped.paginate(:page => params[:page], :per_page => per_page))
? ----------------- ^^^^ ^^^ ^^^^^^^^^
+ get_collection_ivar || set_collection_ivar(end_of_association_chain.page(params[:page]).per(per_page))
? ^ ^^ ^
end
def per_page
20
end
end
end
end | 2 | 0.066667 | 1 | 1 |
3ce2e23cdc865fea23635004bfd53d4c7fb1f4f9 | src/view/bindings/route_binding.coffee | src/view/bindings/route_binding.coffee |
class Batman.DOM.RouteBinding extends Batman.DOM.AbstractBinding
onAnchorTag: false
onlyObserve: Batman.BindingDefinitionOnlyObserve.Data
@accessor 'dispatcher', ->
@view.lookupKeypath('dispatcher') || Batman.App.get('current.dispatcher')
bind: ->
if @node.nodeName.toUpperCase() is 'A'
@onAnchorTag = true
super
Batman.DOM.events.click(@node, @routeClick)
routeClick: (node, event) =>
return if event.__batmanActionTaken
event.__batmanActionTaken = true
params = @pathFromValue(@get('filteredValue'))
Batman.redirect(params) if params?
dataChange: (value) ->
if value
path = @pathFromValue(value)
if @onAnchorTag
if path and Batman.navigator
path = Batman.navigator.linkTo(path)
else
path = "#"
@node.href = path
pathFromValue: (value) ->
if value
if value.isNamedRouteQuery
value.get('path')
else
@get('dispatcher')?.pathFromParams(value)
|
class Batman.DOM.RouteBinding extends Batman.DOM.AbstractBinding
onAnchorTag: false
onlyObserve: Batman.BindingDefinitionOnlyObserve.Data
@accessor 'dispatcher', ->
@view.lookupKeypath('dispatcher') || Batman.App.get('current.dispatcher')
bind: ->
if @node.nodeName.toUpperCase() is 'A'
@onAnchorTag = true
super
return if @onAnchorTag && @node.getAttribute('target')
Batman.DOM.events.click(@node, @routeClick)
routeClick: (node, event) =>
return if event.__batmanActionTaken
event.__batmanActionTaken = true
params = @pathFromValue(@get('filteredValue'))
Batman.redirect(params) if params?
dataChange: (value) ->
if value
path = @pathFromValue(value)
if @onAnchorTag
if path and Batman.navigator
path = Batman.navigator.linkTo(path)
else
path = "#"
@node.href = path
pathFromValue: (value) ->
if value
if value.isNamedRouteQuery
value.get('path')
else
@get('dispatcher')?.pathFromParams(value)
| Allow click events to propagate for anchors with target="_blank" | Allow click events to propagate for anchors with target="_blank"
| CoffeeScript | mit | getshuvo/batman,getshuvo/batman | coffeescript | ## Code Before:
class Batman.DOM.RouteBinding extends Batman.DOM.AbstractBinding
onAnchorTag: false
onlyObserve: Batman.BindingDefinitionOnlyObserve.Data
@accessor 'dispatcher', ->
@view.lookupKeypath('dispatcher') || Batman.App.get('current.dispatcher')
bind: ->
if @node.nodeName.toUpperCase() is 'A'
@onAnchorTag = true
super
Batman.DOM.events.click(@node, @routeClick)
routeClick: (node, event) =>
return if event.__batmanActionTaken
event.__batmanActionTaken = true
params = @pathFromValue(@get('filteredValue'))
Batman.redirect(params) if params?
dataChange: (value) ->
if value
path = @pathFromValue(value)
if @onAnchorTag
if path and Batman.navigator
path = Batman.navigator.linkTo(path)
else
path = "#"
@node.href = path
pathFromValue: (value) ->
if value
if value.isNamedRouteQuery
value.get('path')
else
@get('dispatcher')?.pathFromParams(value)
## Instruction:
Allow click events to propagate for anchors with target="_blank"
## Code After:
class Batman.DOM.RouteBinding extends Batman.DOM.AbstractBinding
onAnchorTag: false
onlyObserve: Batman.BindingDefinitionOnlyObserve.Data
@accessor 'dispatcher', ->
@view.lookupKeypath('dispatcher') || Batman.App.get('current.dispatcher')
bind: ->
if @node.nodeName.toUpperCase() is 'A'
@onAnchorTag = true
super
return if @onAnchorTag && @node.getAttribute('target')
Batman.DOM.events.click(@node, @routeClick)
routeClick: (node, event) =>
return if event.__batmanActionTaken
event.__batmanActionTaken = true
params = @pathFromValue(@get('filteredValue'))
Batman.redirect(params) if params?
dataChange: (value) ->
if value
path = @pathFromValue(value)
if @onAnchorTag
if path and Batman.navigator
path = Batman.navigator.linkTo(path)
else
path = "#"
@node.href = path
pathFromValue: (value) ->
if value
if value.isNamedRouteQuery
value.get('path')
else
@get('dispatcher')?.pathFromParams(value)
|
class Batman.DOM.RouteBinding extends Batman.DOM.AbstractBinding
onAnchorTag: false
onlyObserve: Batman.BindingDefinitionOnlyObserve.Data
@accessor 'dispatcher', ->
@view.lookupKeypath('dispatcher') || Batman.App.get('current.dispatcher')
bind: ->
if @node.nodeName.toUpperCase() is 'A'
@onAnchorTag = true
super
+ return if @onAnchorTag && @node.getAttribute('target')
Batman.DOM.events.click(@node, @routeClick)
routeClick: (node, event) =>
- return if event.__batmanActionTaken
? --
+ return if event.__batmanActionTaken
- event.__batmanActionTaken = true
? --
+ event.__batmanActionTaken = true
- params = @pathFromValue(@get('filteredValue'))
? --
+ params = @pathFromValue(@get('filteredValue'))
- Batman.redirect(params) if params?
? --
+ Batman.redirect(params) if params?
dataChange: (value) ->
if value
path = @pathFromValue(value)
if @onAnchorTag
if path and Batman.navigator
path = Batman.navigator.linkTo(path)
else
path = "#"
@node.href = path
pathFromValue: (value) ->
if value
if value.isNamedRouteQuery
value.get('path')
else
@get('dispatcher')?.pathFromParams(value) | 9 | 0.225 | 5 | 4 |
cb218ebab467c528b3448d86a37bc11de2613d7e | busytown/androidx_sandbox.sh | busytown/androidx_sandbox.sh | set -e
cd "$(dirname $0)"
# build just KMP projects. This will also enable native targets.
export ANDROIDX_PROJECTS=KMP
# disable cache, NS does not allow it yet: b/235227707
export USE_ANDROIDX_REMOTE_BUILD_CACHE=false
# run build in a sandbox
../development/sandbox/run-without-network.sh impl/build.sh buildOnServer --no-configuration-cache --no-daemon --offline
| set -e
cd "$(dirname $0)"
# build just KMP projects. This will also enable native targets.
export ANDROIDX_PROJECTS=KMP
# disable cache, NS does not allow it yet: b/235227707
export USE_ANDROIDX_REMOTE_BUILD_CACHE=false
# run build in a sandbox
../development/sandbox/run-without-network.sh impl/build.sh buildOnServer --no-configuration-cache --no-daemon
| Remove --offline from the sandbox build | Remove --offline from the sandbox build
With Kotlin 1.7.0, Kotlin Gradle Plugin inherits the --offline
flag for its `airplaneMode`.
Unfortunately, with that flag, it doesn't consider any of its
repository parameters for pulling dependencies, hence it fails.
This CL works around that issue by removing the `--offline` flag
from the sandbox build. Note that the sandbox build runs in nsjail
hence it cannot be making any network requests so this is a rather
safe change.
Ideally, we need to work with a proper long term solution but
this is a quick band-aid to get the build working again.
Bug: https://youtrack.jetbrains.com/issue/KT-49247
Bug: 236115214
Test: CI (abdt run L41300000955135022)
Change-Id: Ie7bd8461c0eb4df6a74e5badae4b8f8d75c8bca2
| Shell | apache-2.0 | AndroidX/androidx,AndroidX/androidx,AndroidX/androidx,androidx/androidx,AndroidX/androidx,androidx/androidx,androidx/androidx,AndroidX/androidx,androidx/androidx,androidx/androidx,AndroidX/androidx,androidx/androidx,androidx/androidx,androidx/androidx,androidx/androidx,AndroidX/androidx,AndroidX/androidx,AndroidX/androidx,androidx/androidx,AndroidX/androidx | shell | ## Code Before:
set -e
cd "$(dirname $0)"
# build just KMP projects. This will also enable native targets.
export ANDROIDX_PROJECTS=KMP
# disable cache, NS does not allow it yet: b/235227707
export USE_ANDROIDX_REMOTE_BUILD_CACHE=false
# run build in a sandbox
../development/sandbox/run-without-network.sh impl/build.sh buildOnServer --no-configuration-cache --no-daemon --offline
## Instruction:
Remove --offline from the sandbox build
With Kotlin 1.7.0, Kotlin Gradle Plugin inherits the --offline
flag for its `airplaneMode`.
Unfortunately, with that flag, it doesn't consider any of its
repository parameters for pulling dependencies, hence it fails.
This CL works around that issue by removing the `--offline` flag
from the sandbox build. Note that the sandbox build runs in nsjail
hence it cannot be making any network requests so this is a rather
safe change.
Ideally, we need to work with a proper long term solution but
this is a quick band-aid to get the build working again.
Bug: https://youtrack.jetbrains.com/issue/KT-49247
Bug: 236115214
Test: CI (abdt run L41300000955135022)
Change-Id: Ie7bd8461c0eb4df6a74e5badae4b8f8d75c8bca2
## Code After:
set -e
cd "$(dirname $0)"
# build just KMP projects. This will also enable native targets.
export ANDROIDX_PROJECTS=KMP
# disable cache, NS does not allow it yet: b/235227707
export USE_ANDROIDX_REMOTE_BUILD_CACHE=false
# run build in a sandbox
../development/sandbox/run-without-network.sh impl/build.sh buildOnServer --no-configuration-cache --no-daemon
| set -e
cd "$(dirname $0)"
# build just KMP projects. This will also enable native targets.
export ANDROIDX_PROJECTS=KMP
# disable cache, NS does not allow it yet: b/235227707
export USE_ANDROIDX_REMOTE_BUILD_CACHE=false
# run build in a sandbox
- ../development/sandbox/run-without-network.sh impl/build.sh buildOnServer --no-configuration-cache --no-daemon --offline
? ----------
+ ../development/sandbox/run-without-network.sh impl/build.sh buildOnServer --no-configuration-cache --no-daemon | 2 | 0.181818 | 1 | 1 |
ed4d75911f74561afe8875eacfa7250b21b77796 | collection/source/ceylon/collection/makeCellArray.ceylon | collection/source/ceylon/collection/makeCellArray.ceylon |
Array<Cell<Key->Item>?> makeCellEntryArray<Key,Item>(Integer size)
given Key satisfies Object
given Item satisfies Object {
return makeArray<Cell<Key->Item>?>(size, (Integer index) null);
}
Array<Cell<Element>?> makeCellElementArray<Element>(Integer size) {
return makeArray<Cell<Element>?>(size, (Integer index) null);
} |
Array<Cell<Key->Item>?> makeCellEntryArray<Key,Item>(Integer size)
given Key satisfies Object
given Item satisfies Object {
return arrayOfSize<Cell<Key->Item>?>(size, null);
}
Array<Cell<Element>?> makeCellElementArray<Element>(Integer size) {
return arrayOfSize<Cell<Element>?>(size, null);
} | Change makeArray -> arrayOfSize to fix compilation error | Change makeArray -> arrayOfSize to fix compilation error
| Ceylon | apache-2.0 | vietj/ceylon-sdk,ASzc/ceylon-sdk,jeancharles-roger/ceylon-sdk,unratito/ceylon-sdk,jeancharles-roger/ceylon-sdk,rohitmohan96/ceylon-sdk,unratito/ceylon-sdk,ASzc/ceylon-sdk,jvasileff/ceylon-sdk,DiegoCoronel/ceylon-sdk,lucaswerkmeister/ceylon-sdk,DiegoCoronel/ceylon-sdk,vietj/ceylon-sdk,lucaswerkmeister/ceylon-sdk,jvasileff/ceylon-sdk,matejonnet/ceylon-sdk | ceylon | ## Code Before:
Array<Cell<Key->Item>?> makeCellEntryArray<Key,Item>(Integer size)
given Key satisfies Object
given Item satisfies Object {
return makeArray<Cell<Key->Item>?>(size, (Integer index) null);
}
Array<Cell<Element>?> makeCellElementArray<Element>(Integer size) {
return makeArray<Cell<Element>?>(size, (Integer index) null);
}
## Instruction:
Change makeArray -> arrayOfSize to fix compilation error
## Code After:
Array<Cell<Key->Item>?> makeCellEntryArray<Key,Item>(Integer size)
given Key satisfies Object
given Item satisfies Object {
return arrayOfSize<Cell<Key->Item>?>(size, null);
}
Array<Cell<Element>?> makeCellElementArray<Element>(Integer size) {
return arrayOfSize<Cell<Element>?>(size, null);
} |
Array<Cell<Key->Item>?> makeCellEntryArray<Key,Item>(Integer size)
given Key satisfies Object
given Item satisfies Object {
- return makeArray<Cell<Key->Item>?>(size, (Integer index) null);
? - --- ----------------
+ return arrayOfSize<Cell<Key->Item>?>(size, null);
? ++++++
}
Array<Cell<Element>?> makeCellElementArray<Element>(Integer size) {
- return makeArray<Cell<Element>?>(size, (Integer index) null);
? - --- ----------------
+ return arrayOfSize<Cell<Element>?>(size, null);
? ++++++
} | 4 | 0.4 | 2 | 2 |
afc58fda363c27ed3a2f66d145cdfc51666dcdf6 | datasift-writer/src/main/resources/log4j.properties | datasift-writer/src/main/resources/log4j.properties | log4j.rootLogger=INFO, file, stdout
# Direct log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/datasift/writer.log
log4j.appender.file.MaxFileSize=250MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.file.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
log4j.logger.org.apache.zookeeper=WARN
| log4j.rootLogger=INFO, file, stdout
# Direct log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/datasift/writer.log
log4j.appender.file.MaxFileSize=250MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.file.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
log4j.logger.org.apache.zookeeper=WARN
log4j.logger.org.apache.http.wire=ERROR
| Reduce log spam by filtering out apache http wire logging. | Reduce log spam by filtering out apache http wire logging.
| INI | mit | yvonq/datasift-connector,datasift/datasift-connector,datasift/datasift-connector,yvonq/datasift-connector,yvonq/datasift-connector,datasift/datasift-connector,yvonq/datasift-connector,datasift/datasift-connector,yvonq/datasift-connector,datasift/datasift-connector | ini | ## Code Before:
log4j.rootLogger=INFO, file, stdout
# Direct log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/datasift/writer.log
log4j.appender.file.MaxFileSize=250MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.file.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
log4j.logger.org.apache.zookeeper=WARN
## Instruction:
Reduce log spam by filtering out apache http wire logging.
## Code After:
log4j.rootLogger=INFO, file, stdout
# Direct log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/datasift/writer.log
log4j.appender.file.MaxFileSize=250MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.file.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
log4j.logger.org.apache.zookeeper=WARN
log4j.logger.org.apache.http.wire=ERROR
| log4j.rootLogger=INFO, file, stdout
# Direct log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/datasift/writer.log
log4j.appender.file.MaxFileSize=250MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.file.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{(yyyy,MM,dd,HH,mm,ss,(SSS))} %5p %c{3.}:%L - %m%n %throwable{10}
log4j.logger.org.apache.zookeeper=WARN
+ log4j.logger.org.apache.http.wire=ERROR | 1 | 0.058824 | 1 | 0 |
e4bea785c3001fec941cfda0251c5627c2df229a | confluent_osdeploy/el8/profiles/default/initprofile.sh | confluent_osdeploy/el8/profiles/default/initprofile.sh | ln -s $1/images/pxeboot/vmlinuz $2/boot/kernel && \
ln -s $1/images/pxeboot/initrd.img $1/boot/initramfs/distribution
mkdir -p $2/boot/media/EFI/BOOT && \
ln -s $1/EFI/BOOT/BOOTX64.EFI $1/1/EFI/BOOT/grubx64.efi $2/boot/media/EFI/BOOT/
| sed -i s/centos/CentOS/ "s/rhel/Red Hat Enterprise Linux/" $2/profile.yaml
ln -s $1/images/pxeboot/vmlinuz $2/boot/kernel && \
ln -s $1/images/pxeboot/initrd.img $2/boot/initramfs/distribution
mkdir -p $2/boot/media/EFI/BOOT && \
ln -s $1/EFI/BOOT/BOOTX64.EFI $1/1/EFI/BOOT/grubx64.efi $2/boot/media/EFI/BOOT/
| Fix RHEL osimport init script | Fix RHEL osimport init script
| Shell | apache-2.0 | xcat2/confluent,xcat2/confluent,jjohnson42/confluent,xcat2/confluent,xcat2/confluent,jjohnson42/confluent,jjohnson42/confluent,jjohnson42/confluent,xcat2/confluent,jjohnson42/confluent | shell | ## Code Before:
ln -s $1/images/pxeboot/vmlinuz $2/boot/kernel && \
ln -s $1/images/pxeboot/initrd.img $1/boot/initramfs/distribution
mkdir -p $2/boot/media/EFI/BOOT && \
ln -s $1/EFI/BOOT/BOOTX64.EFI $1/1/EFI/BOOT/grubx64.efi $2/boot/media/EFI/BOOT/
## Instruction:
Fix RHEL osimport init script
## Code After:
sed -i s/centos/CentOS/ "s/rhel/Red Hat Enterprise Linux/" $2/profile.yaml
ln -s $1/images/pxeboot/vmlinuz $2/boot/kernel && \
ln -s $1/images/pxeboot/initrd.img $2/boot/initramfs/distribution
mkdir -p $2/boot/media/EFI/BOOT && \
ln -s $1/EFI/BOOT/BOOTX64.EFI $1/1/EFI/BOOT/grubx64.efi $2/boot/media/EFI/BOOT/
| + sed -i s/centos/CentOS/ "s/rhel/Red Hat Enterprise Linux/" $2/profile.yaml
ln -s $1/images/pxeboot/vmlinuz $2/boot/kernel && \
- ln -s $1/images/pxeboot/initrd.img $1/boot/initramfs/distribution
? ^
+ ln -s $1/images/pxeboot/initrd.img $2/boot/initramfs/distribution
? ^
mkdir -p $2/boot/media/EFI/BOOT && \
ln -s $1/EFI/BOOT/BOOTX64.EFI $1/1/EFI/BOOT/grubx64.efi $2/boot/media/EFI/BOOT/
| 3 | 0.6 | 2 | 1 |
99be8919a0bc274dc311ebe3201dfc490a1d0d07 | setup.py | setup.py | import os
from distutils.core import setup, find_packages
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name = "DataShape",
version = "0.1.0",
author = "Continuum Analytics",
author_email = "blaze-dev@continuum.io",
description = ("A data description language."),
license = "BSD",
keywords = "data language",
url = "http://packages.python.org/datashape",
packages = ["datashape", "datashape.test"],
long_description = read('README.md'),
classifiers = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"Topic :: Software Development",
"License :: OSI Approved :: BSD License",
],
)
| import os
from distutils.core import setup
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name = "DataShape",
version = "0.1.0",
author = "Continuum Analytics",
author_email = "blaze-dev@continuum.io",
description = ("A data description language."),
license = "BSD",
keywords = "data language",
url = "http://packages.python.org/datashape",
packages = ["datashape", "datashape.tests"],
long_description = read('README.md'),
classifiers = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"Topic :: Software Development",
"License :: OSI Approved :: BSD License",
],
)
| Remove find_packages import, it's not in distutils | Remove find_packages import, it's not in distutils
| Python | bsd-2-clause | blaze/datashape,cowlicks/datashape,ContinuumIO/datashape,cpcloud/datashape,aterrel/datashape,quantopian/datashape,FrancescAlted/datashape,quantopian/datashape,aterrel/datashape,cowlicks/datashape,markflorisson/datashape,ContinuumIO/datashape,cpcloud/datashape,blaze/datashape,llllllllll/datashape,markflorisson/datashape,FrancescAlted/datashape,llllllllll/datashape | python | ## Code Before:
import os
from distutils.core import setup, find_packages
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name = "DataShape",
version = "0.1.0",
author = "Continuum Analytics",
author_email = "blaze-dev@continuum.io",
description = ("A data description language."),
license = "BSD",
keywords = "data language",
url = "http://packages.python.org/datashape",
packages = ["datashape", "datashape.test"],
long_description = read('README.md'),
classifiers = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"Topic :: Software Development",
"License :: OSI Approved :: BSD License",
],
)
## Instruction:
Remove find_packages import, it's not in distutils
## Code After:
import os
from distutils.core import setup
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name = "DataShape",
version = "0.1.0",
author = "Continuum Analytics",
author_email = "blaze-dev@continuum.io",
description = ("A data description language."),
license = "BSD",
keywords = "data language",
url = "http://packages.python.org/datashape",
packages = ["datashape", "datashape.tests"],
long_description = read('README.md'),
classifiers = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"Topic :: Software Development",
"License :: OSI Approved :: BSD License",
],
)
| import os
- from distutils.core import setup, find_packages
? ---------------
+ from distutils.core import setup
# Utility function to read the README file.
# Used for the long_description. It's nice, because now 1) we have a top level
# README file and 2) it's easier to type in the README file than to put a raw
# string in below ...
def read(fname):
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(
name = "DataShape",
version = "0.1.0",
author = "Continuum Analytics",
author_email = "blaze-dev@continuum.io",
description = ("A data description language."),
license = "BSD",
keywords = "data language",
url = "http://packages.python.org/datashape",
- packages = ["datashape", "datashape.test"],
+ packages = ["datashape", "datashape.tests"],
? +
long_description = read('README.md'),
classifiers = [
"Development Status :: 3 - Alpha",
"Intended Audience :: Developers",
"Topic :: Software Development",
"License :: OSI Approved :: BSD License",
],
) | 4 | 0.142857 | 2 | 2 |
eed25be4f49870d006a510c5f1f9566700353082 | .travis.yml | .travis.yml | language: java
before_install:
- chmod +x mvnw
jdk:
- oraclejdk8
script: mvn clean install | language: java
before_install:
- chmod +x mvnw
jdk:
- openjdk8
- openjdk11
script: mvn clean install | Build with OpenJDK 8 and 11 | Build with OpenJDK 8 and 11
| YAML | apache-2.0 | xSAVIKx/AndroidScreencast | yaml | ## Code Before:
language: java
before_install:
- chmod +x mvnw
jdk:
- oraclejdk8
script: mvn clean install
## Instruction:
Build with OpenJDK 8 and 11
## Code After:
language: java
before_install:
- chmod +x mvnw
jdk:
- openjdk8
- openjdk11
script: mvn clean install | language: java
before_install:
- chmod +x mvnw
jdk:
- - oraclejdk8
? ^^^^
+ - openjdk8
? ^ +
+ - openjdk11
script: mvn clean install | 3 | 0.375 | 2 | 1 |
dd87e6fe48962ed587cce1dcff90ca2dd0351d94 | .travis.yml | .travis.yml | language: python
cache: pip
# Matrix of build options
python:
- '3.4'
- '3.5'
env:
global:
- DJANGO_SETTINGS_MODULE="tests.settings"
- TOX_ENV=
matrix:
- DJANGO='19' WAGTAIL='16'
- DJANGO='110' WAGTAIL='16'
before_install:
# - sudo add-apt-repository ppa:mc3man/trusty-media -y // Inlcudes all multimedia options but 404s currently :(
- sudo apt-get update -qq
- sudo apt-get install ffmpeg -y
install:
- pip install --upgrade pip wheel tox
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv
script:
# Run tox using either a specific environment from TOX_ENV,
# or building one from the environment variables
- tox -e "${TOX_ENV:-py${TRAVIS_PYTHON_VERSION/./}-dj${DJANGO}-wt${WAGTAIL}}"
| language: python
cache: pip
# Matrix of build options
python:
- '2.7'
- '3.4'
- '3.5'
env:
global:
- DJANGO_SETTINGS_MODULE="tests.settings"
- TOX_ENV=
matrix:
- DJANGO='19' WAGTAIL='16'
- DJANGO='110' WAGTAIL='16'
before_install:
# - sudo add-apt-repository ppa:mc3man/trusty-media -y // Inlcudes all multimedia options but 404s currently :(
- sudo apt-get update -qq
- sudo apt-get install ffmpeg -y
install:
- pip install --upgrade pip wheel tox
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv
script:
# Run tox using either a specific environment from TOX_ENV,
# or building one from the environment variables
- tox -e "${TOX_ENV:-py${TRAVIS_PYTHON_VERSION/./}-dj${DJANGO}-wt${WAGTAIL}}"
| Add py2.7 tests to Travis | Add py2.7 tests to Travis
| YAML | bsd-3-clause | takeflight/wagtail-metadata,takeflight/wagtail-metadata | yaml | ## Code Before:
language: python
cache: pip
# Matrix of build options
python:
- '3.4'
- '3.5'
env:
global:
- DJANGO_SETTINGS_MODULE="tests.settings"
- TOX_ENV=
matrix:
- DJANGO='19' WAGTAIL='16'
- DJANGO='110' WAGTAIL='16'
before_install:
# - sudo add-apt-repository ppa:mc3man/trusty-media -y // Inlcudes all multimedia options but 404s currently :(
- sudo apt-get update -qq
- sudo apt-get install ffmpeg -y
install:
- pip install --upgrade pip wheel tox
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv
script:
# Run tox using either a specific environment from TOX_ENV,
# or building one from the environment variables
- tox -e "${TOX_ENV:-py${TRAVIS_PYTHON_VERSION/./}-dj${DJANGO}-wt${WAGTAIL}}"
## Instruction:
Add py2.7 tests to Travis
## Code After:
language: python
cache: pip
# Matrix of build options
python:
- '2.7'
- '3.4'
- '3.5'
env:
global:
- DJANGO_SETTINGS_MODULE="tests.settings"
- TOX_ENV=
matrix:
- DJANGO='19' WAGTAIL='16'
- DJANGO='110' WAGTAIL='16'
before_install:
# - sudo add-apt-repository ppa:mc3man/trusty-media -y // Inlcudes all multimedia options but 404s currently :(
- sudo apt-get update -qq
- sudo apt-get install ffmpeg -y
install:
- pip install --upgrade pip wheel tox
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv
script:
# Run tox using either a specific environment from TOX_ENV,
# or building one from the environment variables
- tox -e "${TOX_ENV:-py${TRAVIS_PYTHON_VERSION/./}-dj${DJANGO}-wt${WAGTAIL}}"
| language: python
cache: pip
# Matrix of build options
python:
+ - '2.7'
- '3.4'
- '3.5'
env:
global:
- DJANGO_SETTINGS_MODULE="tests.settings"
- TOX_ENV=
matrix:
- DJANGO='19' WAGTAIL='16'
- DJANGO='110' WAGTAIL='16'
before_install:
# - sudo add-apt-repository ppa:mc3man/trusty-media -y // Inlcudes all multimedia options but 404s currently :(
- sudo apt-get update -qq
- sudo apt-get install ffmpeg -y
install:
- pip install --upgrade pip wheel tox
cache:
directories:
- $HOME/.cache/pip
- $HOME/virtualenv
script:
# Run tox using either a specific environment from TOX_ENV,
# or building one from the environment variables
- tox -e "${TOX_ENV:-py${TRAVIS_PYTHON_VERSION/./}-dj${DJANGO}-wt${WAGTAIL}}" | 1 | 0.030303 | 1 | 0 |
458b8598db2ca89cd13a2d7abc6c25d51bf2827c | .travis.yml | .travis.yml | language: node_js
node_js:
- '0.10'
env:
global:
- secure: YIGqeIndEd4iy6xBBF3WGzEL11HLKwpEP7j8oa0/b/UgnSNAoQ6ZgC5T0WEZk28dKDV8coBVpqNG/5B7gSUA00OejRBvKfT3ZURYsQhoNCkjuSmhYUKALptYu0OGfg58Jl3vuesPpF4A3A960Nilz4F63DMz26pnmy12cZScuPY=
- secure: TQpzd8v3dM13DfJf59LYYVD2Btetfd1d5D71dozhHtZC3liTZronG96UyvgZNGz4nS76rNoU5e2QhMxoi2odXBQdIL6bXEgWzbnNKsg5d+0ZfJZMyhD/4o/WPbLAVbjGIWy73U1MqUSenNdMar2490pEv9AHoOVtA9aYMy9ImYw=
addons:
sauce_connect: true | language: node_js
node_js:
- '0.10'
before_script:
- npm install -g karma-cli
env:
global:
- secure: YIGqeIndEd4iy6xBBF3WGzEL11HLKwpEP7j8oa0/b/UgnSNAoQ6ZgC5T0WEZk28dKDV8coBVpqNG/5B7gSUA00OejRBvKfT3ZURYsQhoNCkjuSmhYUKALptYu0OGfg58Jl3vuesPpF4A3A960Nilz4F63DMz26pnmy12cZScuPY=
- secure: TQpzd8v3dM13DfJf59LYYVD2Btetfd1d5D71dozhHtZC3liTZronG96UyvgZNGz4nS76rNoU5e2QhMxoi2odXBQdIL6bXEgWzbnNKsg5d+0ZfJZMyhD/4o/WPbLAVbjGIWy73U1MqUSenNdMar2490pEv9AHoOVtA9aYMy9ImYw=
addons:
sauce_connect: true | Install karma-cli before running tests on Travis CI | Install karma-cli before running tests on Travis CI
| YAML | mit | katranci/PositionSticky | yaml | ## Code Before:
language: node_js
node_js:
- '0.10'
env:
global:
- secure: YIGqeIndEd4iy6xBBF3WGzEL11HLKwpEP7j8oa0/b/UgnSNAoQ6ZgC5T0WEZk28dKDV8coBVpqNG/5B7gSUA00OejRBvKfT3ZURYsQhoNCkjuSmhYUKALptYu0OGfg58Jl3vuesPpF4A3A960Nilz4F63DMz26pnmy12cZScuPY=
- secure: TQpzd8v3dM13DfJf59LYYVD2Btetfd1d5D71dozhHtZC3liTZronG96UyvgZNGz4nS76rNoU5e2QhMxoi2odXBQdIL6bXEgWzbnNKsg5d+0ZfJZMyhD/4o/WPbLAVbjGIWy73U1MqUSenNdMar2490pEv9AHoOVtA9aYMy9ImYw=
addons:
sauce_connect: true
## Instruction:
Install karma-cli before running tests on Travis CI
## Code After:
language: node_js
node_js:
- '0.10'
before_script:
- npm install -g karma-cli
env:
global:
- secure: YIGqeIndEd4iy6xBBF3WGzEL11HLKwpEP7j8oa0/b/UgnSNAoQ6ZgC5T0WEZk28dKDV8coBVpqNG/5B7gSUA00OejRBvKfT3ZURYsQhoNCkjuSmhYUKALptYu0OGfg58Jl3vuesPpF4A3A960Nilz4F63DMz26pnmy12cZScuPY=
- secure: TQpzd8v3dM13DfJf59LYYVD2Btetfd1d5D71dozhHtZC3liTZronG96UyvgZNGz4nS76rNoU5e2QhMxoi2odXBQdIL6bXEgWzbnNKsg5d+0ZfJZMyhD/4o/WPbLAVbjGIWy73U1MqUSenNdMar2490pEv9AHoOVtA9aYMy9ImYw=
addons:
sauce_connect: true | language: node_js
node_js:
- '0.10'
+ before_script:
+ - npm install -g karma-cli
env:
global:
- secure: YIGqeIndEd4iy6xBBF3WGzEL11HLKwpEP7j8oa0/b/UgnSNAoQ6ZgC5T0WEZk28dKDV8coBVpqNG/5B7gSUA00OejRBvKfT3ZURYsQhoNCkjuSmhYUKALptYu0OGfg58Jl3vuesPpF4A3A960Nilz4F63DMz26pnmy12cZScuPY=
- secure: TQpzd8v3dM13DfJf59LYYVD2Btetfd1d5D71dozhHtZC3liTZronG96UyvgZNGz4nS76rNoU5e2QhMxoi2odXBQdIL6bXEgWzbnNKsg5d+0ZfJZMyhD/4o/WPbLAVbjGIWy73U1MqUSenNdMar2490pEv9AHoOVtA9aYMy9ImYw=
addons:
sauce_connect: true | 2 | 0.222222 | 2 | 0 |
d1123db13be25fedcc3f8cac057c684a88a2060a | lib/game_icons/tasks/update.rb | lib/game_icons/tasks/update.rb | require 'game_icons'
require 'open-uri'
require 'zip'
require 'fileutils'
module GameIcons
class Update
@@URL = 'http://game-icons.net/archives/svg/zip/ffffff/000000/game-icons.net.svg.zip'
@@TMP_ZIP = 'game-icons.net.svg.zip'
def self.run
puts "Clearing..."
clear
puts "Downloading..."
download
puts "Unzipping..."
unzip
puts "Done."
end
private
def self.clear
FileUtils.rm_rf('resources/icons')
FileUtils.mkdir('resources/icons')
end
def self.download
File.open("resources/#{@@TMP_ZIP}", 'wb+') do |save_file|
open(@@URL, 'rb') { |read_file| save_file.write(read_file.read) }
end
end
def self.unzip
Zip.on_exists_proc = true #overwrite files if they already exist
Dir.chdir('./resources/') do
zip_file = Zip::File.new(@@TMP_ZIP)
zip_file.each do |entry|
fpath = "./#{entry.name}"
FileUtils.mkdir_p File.dirname entry.name
zip_file.extract(entry, fpath) unless File.exist?(fpath)
print '.'
end
end
end
end
end | require 'fileutils'
require 'game_icons'
require 'open-uri'
require 'zip'
module GameIcons
class Update
@@URL = 'https://game-icons.net/archives/svg/zip/ffffff/000000/game-icons.net.svg.zip'
@@TMP_ZIP = 'game-icons.net.svg.zip'
def self.run
puts "Clearing..."
clear
puts "Downloading..."
download
puts "Unzipping..."
unzip
puts "Done."
end
private
def self.clear
FileUtils.rm_rf('resources/icons')
FileUtils.mkdir('resources/icons')
end
def self.download
File.open("resources/#{@@TMP_ZIP}", 'wb+') do |save_file|
URI.open(@@URL, 'rb') { |read_file| save_file.write(read_file.read) }
end
end
def self.unzip
Zip.on_exists_proc = true #overwrite files if they already exist
Dir.chdir('./resources/') do
zip_file = Zip::File.new(@@TMP_ZIP)
zip_file.each do |entry|
fpath = "./#{entry.name}"
FileUtils.mkdir_p File.dirname entry.name
zip_file.extract(entry, fpath) unless File.exist?(fpath)
print '.'
end
end
end
end
end | Update deprecated calls for Ruby 3.0 | Update deprecated calls for Ruby 3.0
| Ruby | mit | andymeneely/game_icons | ruby | ## Code Before:
require 'game_icons'
require 'open-uri'
require 'zip'
require 'fileutils'
module GameIcons
class Update
@@URL = 'http://game-icons.net/archives/svg/zip/ffffff/000000/game-icons.net.svg.zip'
@@TMP_ZIP = 'game-icons.net.svg.zip'
def self.run
puts "Clearing..."
clear
puts "Downloading..."
download
puts "Unzipping..."
unzip
puts "Done."
end
private
def self.clear
FileUtils.rm_rf('resources/icons')
FileUtils.mkdir('resources/icons')
end
def self.download
File.open("resources/#{@@TMP_ZIP}", 'wb+') do |save_file|
open(@@URL, 'rb') { |read_file| save_file.write(read_file.read) }
end
end
def self.unzip
Zip.on_exists_proc = true #overwrite files if they already exist
Dir.chdir('./resources/') do
zip_file = Zip::File.new(@@TMP_ZIP)
zip_file.each do |entry|
fpath = "./#{entry.name}"
FileUtils.mkdir_p File.dirname entry.name
zip_file.extract(entry, fpath) unless File.exist?(fpath)
print '.'
end
end
end
end
end
## Instruction:
Update deprecated calls for Ruby 3.0
## Code After:
require 'fileutils'
require 'game_icons'
require 'open-uri'
require 'zip'
module GameIcons
class Update
@@URL = 'https://game-icons.net/archives/svg/zip/ffffff/000000/game-icons.net.svg.zip'
@@TMP_ZIP = 'game-icons.net.svg.zip'
def self.run
puts "Clearing..."
clear
puts "Downloading..."
download
puts "Unzipping..."
unzip
puts "Done."
end
private
def self.clear
FileUtils.rm_rf('resources/icons')
FileUtils.mkdir('resources/icons')
end
def self.download
File.open("resources/#{@@TMP_ZIP}", 'wb+') do |save_file|
URI.open(@@URL, 'rb') { |read_file| save_file.write(read_file.read) }
end
end
def self.unzip
Zip.on_exists_proc = true #overwrite files if they already exist
Dir.chdir('./resources/') do
zip_file = Zip::File.new(@@TMP_ZIP)
zip_file.each do |entry|
fpath = "./#{entry.name}"
FileUtils.mkdir_p File.dirname entry.name
zip_file.extract(entry, fpath) unless File.exist?(fpath)
print '.'
end
end
end
end
end | + require 'fileutils'
require 'game_icons'
require 'open-uri'
require 'zip'
- require 'fileutils'
module GameIcons
class Update
- @@URL = 'http://game-icons.net/archives/svg/zip/ffffff/000000/game-icons.net.svg.zip'
+ @@URL = 'https://game-icons.net/archives/svg/zip/ffffff/000000/game-icons.net.svg.zip'
? +
@@TMP_ZIP = 'game-icons.net.svg.zip'
def self.run
puts "Clearing..."
clear
puts "Downloading..."
download
puts "Unzipping..."
unzip
puts "Done."
end
private
def self.clear
FileUtils.rm_rf('resources/icons')
FileUtils.mkdir('resources/icons')
end
def self.download
File.open("resources/#{@@TMP_ZIP}", 'wb+') do |save_file|
- open(@@URL, 'rb') { |read_file| save_file.write(read_file.read) }
+ URI.open(@@URL, 'rb') { |read_file| save_file.write(read_file.read) }
? ++++
end
end
def self.unzip
Zip.on_exists_proc = true #overwrite files if they already exist
Dir.chdir('./resources/') do
zip_file = Zip::File.new(@@TMP_ZIP)
zip_file.each do |entry|
fpath = "./#{entry.name}"
FileUtils.mkdir_p File.dirname entry.name
zip_file.extract(entry, fpath) unless File.exist?(fpath)
print '.'
end
end
end
end
end | 6 | 0.125 | 3 | 3 |
d2e002a8da0858167c72b95483a8a76969721735 | features/exit_statuses.feature | features/exit_statuses.feature | Feature: exit statuses
In order to specify expected exit statuses
As a developer using Cucumber
I want to use the "the exit status should be" step
Scenario: exit status of 0
When I run `true`
Then the exit status should be 0
Scenario: non-zero exit status
When I run `false`
Then the exit status should be 1
And the exit status should not be 0
Scenario: Successfully run something
When I successfully run `true`
Scenario: Successfully run something for a long time
Given The default aruba timeout is 0 seconds
When I successfully run `ruby -e 'sleep 1'` for up to 2 seconds
Scenario: Unsuccessfully run something that takes too long
Given The default aruba timeout is 0 seconds
When I do aruba I successfully run `ruby -e 'sleep 1'`
Then aruba should fail with "process still alive after 0 seconds"
Scenario: Unsuccessfully run something
When I do aruba I successfully run `false`
Then aruba should fail with ""
| Feature: exit statuses
In order to specify expected exit statuses
As a developer using Cucumber
I want to use the "the exit status should be" step
Scenario: exit status of 0
When I run `true`
Then the exit status should be 0
Scenario: non-zero exit status
When I run `false`
Then the exit status should be 1
And the exit status should not be 0
Scenario: Successfully run something
When I successfully run `true`
Scenario: Successfully run something for a long time
Given The default aruba timeout is 0 seconds
When I successfully run `sleep 1` for up to 2 seconds
Scenario: Unsuccessfully run something that takes too long
Given The default aruba timeout is 0 seconds
When I do aruba I successfully run `sleep 1`
Then aruba should fail with "process still alive after 0 seconds"
Scenario: Unsuccessfully run something
When I do aruba I successfully run `false`
Then aruba should fail with ""
| Change sleep commands to use bash command. | Change sleep commands to use bash command.
| Cucumber | mit | dg-ratiodata/aruba,maxmeyer/aruba,jasnow/aruba,ducthanh/aruba,maxmeyer/aruba,AdrieanKhisbe/aruba,weedySeaDragon/aruba,jasnow/aruba,weedySeaDragon/aruba,ducthanh/aruba,e2/aruba,dg-ratiodata/aruba,e2/aruba,jasnow/aruba,jasnow/aruba,ducthanh/aruba,rapid7/aruba,dg-ratiodata/aruba,cucumber/aruba,AdrieanKhisbe/aruba,e2/aruba,rapid7/aruba,dg-ratiodata/aruba,mvz/aruba,weedySeaDragon/aruba,ducthanh/aruba,AdrieanKhisbe/aruba,mvz/aruba,e2/aruba,AdrieanKhisbe/aruba,weedySeaDragon/aruba | cucumber | ## Code Before:
Feature: exit statuses
In order to specify expected exit statuses
As a developer using Cucumber
I want to use the "the exit status should be" step
Scenario: exit status of 0
When I run `true`
Then the exit status should be 0
Scenario: non-zero exit status
When I run `false`
Then the exit status should be 1
And the exit status should not be 0
Scenario: Successfully run something
When I successfully run `true`
Scenario: Successfully run something for a long time
Given The default aruba timeout is 0 seconds
When I successfully run `ruby -e 'sleep 1'` for up to 2 seconds
Scenario: Unsuccessfully run something that takes too long
Given The default aruba timeout is 0 seconds
When I do aruba I successfully run `ruby -e 'sleep 1'`
Then aruba should fail with "process still alive after 0 seconds"
Scenario: Unsuccessfully run something
When I do aruba I successfully run `false`
Then aruba should fail with ""
## Instruction:
Change sleep commands to use bash command.
## Code After:
Feature: exit statuses
In order to specify expected exit statuses
As a developer using Cucumber
I want to use the "the exit status should be" step
Scenario: exit status of 0
When I run `true`
Then the exit status should be 0
Scenario: non-zero exit status
When I run `false`
Then the exit status should be 1
And the exit status should not be 0
Scenario: Successfully run something
When I successfully run `true`
Scenario: Successfully run something for a long time
Given The default aruba timeout is 0 seconds
When I successfully run `sleep 1` for up to 2 seconds
Scenario: Unsuccessfully run something that takes too long
Given The default aruba timeout is 0 seconds
When I do aruba I successfully run `sleep 1`
Then aruba should fail with "process still alive after 0 seconds"
Scenario: Unsuccessfully run something
When I do aruba I successfully run `false`
Then aruba should fail with ""
| Feature: exit statuses
In order to specify expected exit statuses
As a developer using Cucumber
I want to use the "the exit status should be" step
Scenario: exit status of 0
When I run `true`
Then the exit status should be 0
Scenario: non-zero exit status
When I run `false`
Then the exit status should be 1
And the exit status should not be 0
Scenario: Successfully run something
When I successfully run `true`
Scenario: Successfully run something for a long time
Given The default aruba timeout is 0 seconds
- When I successfully run `ruby -e 'sleep 1'` for up to 2 seconds
? --------- -
+ When I successfully run `sleep 1` for up to 2 seconds
Scenario: Unsuccessfully run something that takes too long
Given The default aruba timeout is 0 seconds
- When I do aruba I successfully run `ruby -e 'sleep 1'`
? --------- -
+ When I do aruba I successfully run `sleep 1`
Then aruba should fail with "process still alive after 0 seconds"
Scenario: Unsuccessfully run something
When I do aruba I successfully run `false`
Then aruba should fail with "" | 4 | 0.133333 | 2 | 2 |
ca1fe4024501e802035719b4bc8a093c7e35b28a | index.js | index.js | /* jshint node: true */
'use strict';
var path = require('path');
module.exports = {
name: 'ember-cli-timecop',
treeFor: function(name) {
if (name !== 'vendor') { return; }
return this.treeGenerator(path.join(__dirname, 'node_modules'));
},
included: function(app) {
this._super.included(app);
if (app.env !== 'production') {
this.app.import('vendor/timecop/timecop.js');
}
}
};
| /* jshint node: true */
'use strict';
var path = require('path');
module.exports = {
name: 'ember-cli-timecop',
treeFor: function(name) {
if (name !== 'vendor') { return; }
var assetsPath = require('path').join('timecop', 'timecop.js');
return this.treeGenerator(require.resolve('timecop').replace(assetsPath, ''));
},
included: function(app) {
this._super.included(app);
if (app.env !== 'production') {
this.app.import('vendor/timecop/timecop.js');
}
}
};
| Add support for npm 3 | Add support for npm 3
| JavaScript | mit | matteodepalo/ember-cli-timecop,matteodepalo/ember-cli-timecop | javascript | ## Code Before:
/* jshint node: true */
'use strict';
var path = require('path');
module.exports = {
name: 'ember-cli-timecop',
treeFor: function(name) {
if (name !== 'vendor') { return; }
return this.treeGenerator(path.join(__dirname, 'node_modules'));
},
included: function(app) {
this._super.included(app);
if (app.env !== 'production') {
this.app.import('vendor/timecop/timecop.js');
}
}
};
## Instruction:
Add support for npm 3
## Code After:
/* jshint node: true */
'use strict';
var path = require('path');
module.exports = {
name: 'ember-cli-timecop',
treeFor: function(name) {
if (name !== 'vendor') { return; }
var assetsPath = require('path').join('timecop', 'timecop.js');
return this.treeGenerator(require.resolve('timecop').replace(assetsPath, ''));
},
included: function(app) {
this._super.included(app);
if (app.env !== 'production') {
this.app.import('vendor/timecop/timecop.js');
}
}
};
| /* jshint node: true */
'use strict';
var path = require('path');
module.exports = {
name: 'ember-cli-timecop',
treeFor: function(name) {
if (name !== 'vendor') { return; }
- return this.treeGenerator(path.join(__dirname, 'node_modules'));
+ var assetsPath = require('path').join('timecop', 'timecop.js');
+ return this.treeGenerator(require.resolve('timecop').replace(assetsPath, ''));
},
included: function(app) {
this._super.included(app);
if (app.env !== 'production') {
this.app.import('vendor/timecop/timecop.js');
}
}
}; | 3 | 0.136364 | 2 | 1 |
222f2ee18c0255363ec01452fb9fd850c55d9b30 | app.js | app.js | const express = require('express');
const hbs = require('hbs');
// instantiate Express.js
const app = express();
// Tell Handlebars where to look for partials
hbs.registerPartials(__dirname + '/views/partials');
// Set Handlebars as default templating engine
app.set('view engine', 'hbs');
// app.use(express.static(__dirname + '/public'));
// root route
app.get('/', (req, res) => {
res.render('home.hbs', {
pageTitle: 'Home Page'
});
});
// Specify port and run local server
let port = 3000;
app.listen(port, () => {
console.log(`listening on ${port}`);
});
| const express = require('express');
const hbs = require('hbs');
// instantiate Express.js
const app = express();
// Tell Handlebars where to look for partials
hbs.registerPartials(__dirname + '/views/partials');
// Set Handlebars as default templating engine
app.set('view engine', 'hbs');
// app.use(express.static(__dirname + '/public'));
// root route
app.get('/', (req, res) => {
res.render('home.hbs', {
pageTitle: 'Home Page'
});
});
// route for e-commerce site
app.get('/', (req, res) => {
res.render('shop.hbs', {
pageTitle: 'E-Commerce Shop'
});
});
// Specify port and run local server
let port = 3000;
app.listen(port, () => {
console.log(`listening on ${port}`);
});
| Add route for e-commerce site | Add route for e-commerce site
| JavaScript | mit | dshaps10/full-stack-demo-site,dshaps10/full-stack-demo-site | javascript | ## Code Before:
const express = require('express');
const hbs = require('hbs');
// instantiate Express.js
const app = express();
// Tell Handlebars where to look for partials
hbs.registerPartials(__dirname + '/views/partials');
// Set Handlebars as default templating engine
app.set('view engine', 'hbs');
// app.use(express.static(__dirname + '/public'));
// root route
app.get('/', (req, res) => {
res.render('home.hbs', {
pageTitle: 'Home Page'
});
});
// Specify port and run local server
let port = 3000;
app.listen(port, () => {
console.log(`listening on ${port}`);
});
## Instruction:
Add route for e-commerce site
## Code After:
const express = require('express');
const hbs = require('hbs');
// instantiate Express.js
const app = express();
// Tell Handlebars where to look for partials
hbs.registerPartials(__dirname + '/views/partials');
// Set Handlebars as default templating engine
app.set('view engine', 'hbs');
// app.use(express.static(__dirname + '/public'));
// root route
app.get('/', (req, res) => {
res.render('home.hbs', {
pageTitle: 'Home Page'
});
});
// route for e-commerce site
app.get('/', (req, res) => {
res.render('shop.hbs', {
pageTitle: 'E-Commerce Shop'
});
});
// Specify port and run local server
let port = 3000;
app.listen(port, () => {
console.log(`listening on ${port}`);
});
| const express = require('express');
const hbs = require('hbs');
// instantiate Express.js
const app = express();
// Tell Handlebars where to look for partials
hbs.registerPartials(__dirname + '/views/partials');
// Set Handlebars as default templating engine
app.set('view engine', 'hbs');
// app.use(express.static(__dirname + '/public'));
// root route
app.get('/', (req, res) => {
res.render('home.hbs', {
pageTitle: 'Home Page'
});
});
+ // route for e-commerce site
+ app.get('/', (req, res) => {
+ res.render('shop.hbs', {
+ pageTitle: 'E-Commerce Shop'
+ });
+ });
+
// Specify port and run local server
let port = 3000;
app.listen(port, () => {
console.log(`listening on ${port}`);
}); | 7 | 0.259259 | 7 | 0 |
bd990fea02b82da3ed1fcc033094bd04247dda77 | meinberlin/assets/scss/components/_filter_bar.scss | meinberlin/assets/scss/components/_filter_bar.scss | .control-bar {
@include clearfix;
font-size: $font-size-sm;
.dropdown {
display: inline-block;
}
@media (max-width: $breakpoint) {
> * {
@include grid-same-width(2);
}
.dropdown {
display: block;
}
.dropdown-toggle {
width: 100%;
max-width: none;
}
}
.control-bar__right {
float: right;
@media (max-width: $breakpoint) {
float: none;
// push to the right
// NOTE: needs to overwrite grid-tiles
@include grid-position(6);
}
}
.form-group {
margin-bottom: 0;
}
}
.control-bar__top-overlap {
position: relative;
z-index: 1;
bottom: -1em;
.dropdown {
margin-bottom: 0;
}
}
.control-bar__bottom-overlap {
position: relative;
z-index: 1;
top: -1em;
}
| .control-bar {
@include clearfix;
font-size: $font-size-sm;
.dropdown {
display: inline-block;
}
@media (max-width: $breakpoint) {
> * {
@include grid-same-width(2);
}
.dropdown {
display: block;
}
.dropdown-toggle {
width: 100%;
max-width: none;
}
}
.control-bar__right {
float: right;
@media (max-width: $breakpoint) {
// grid-position does not include grid-cell automatically
@include grid-cell();
// push to the right
// NOTE: needs to overwrite grid-tiles
@include grid-position(6);
}
}
.form-group {
margin-bottom: 0;
}
}
.control-bar__top-overlap {
position: relative;
z-index: 1;
bottom: -1em;
.dropdown {
margin-bottom: 0;
}
}
.control-bar__bottom-overlap {
position: relative;
z-index: 1;
top: -1em;
}
| Fix grid for right aligned filters within control bars | Fix grid for right aligned filters within control bars
The float is set for larger screen to position the filter to the right.
The grid does not work as expected if the element does not float.
As grid-position does not include grid-cell automatically it is required
to include it manually to reset the correct grid floating and margins.
| SCSS | agpl-3.0 | liqd/a4-meinberlin,liqd/a4-meinberlin,liqd/a4-meinberlin,liqd/a4-meinberlin | scss | ## Code Before:
.control-bar {
@include clearfix;
font-size: $font-size-sm;
.dropdown {
display: inline-block;
}
@media (max-width: $breakpoint) {
> * {
@include grid-same-width(2);
}
.dropdown {
display: block;
}
.dropdown-toggle {
width: 100%;
max-width: none;
}
}
.control-bar__right {
float: right;
@media (max-width: $breakpoint) {
float: none;
// push to the right
// NOTE: needs to overwrite grid-tiles
@include grid-position(6);
}
}
.form-group {
margin-bottom: 0;
}
}
.control-bar__top-overlap {
position: relative;
z-index: 1;
bottom: -1em;
.dropdown {
margin-bottom: 0;
}
}
.control-bar__bottom-overlap {
position: relative;
z-index: 1;
top: -1em;
}
## Instruction:
Fix grid for right aligned filters within control bars
The float is set for larger screen to position the filter to the right.
The grid does not work as expected if the element does not float.
As grid-position does not include grid-cell automatically it is required
to include it manually to reset the correct grid floating and margins.
## Code After:
.control-bar {
@include clearfix;
font-size: $font-size-sm;
.dropdown {
display: inline-block;
}
@media (max-width: $breakpoint) {
> * {
@include grid-same-width(2);
}
.dropdown {
display: block;
}
.dropdown-toggle {
width: 100%;
max-width: none;
}
}
.control-bar__right {
float: right;
@media (max-width: $breakpoint) {
// grid-position does not include grid-cell automatically
@include grid-cell();
// push to the right
// NOTE: needs to overwrite grid-tiles
@include grid-position(6);
}
}
.form-group {
margin-bottom: 0;
}
}
.control-bar__top-overlap {
position: relative;
z-index: 1;
bottom: -1em;
.dropdown {
margin-bottom: 0;
}
}
.control-bar__bottom-overlap {
position: relative;
z-index: 1;
top: -1em;
}
| .control-bar {
@include clearfix;
font-size: $font-size-sm;
.dropdown {
display: inline-block;
}
@media (max-width: $breakpoint) {
> * {
@include grid-same-width(2);
}
.dropdown {
display: block;
}
.dropdown-toggle {
width: 100%;
max-width: none;
}
}
.control-bar__right {
float: right;
@media (max-width: $breakpoint) {
- float: none;
+ // grid-position does not include grid-cell automatically
+ @include grid-cell();
// push to the right
// NOTE: needs to overwrite grid-tiles
@include grid-position(6);
}
}
.form-group {
margin-bottom: 0;
}
}
.control-bar__top-overlap {
position: relative;
z-index: 1;
bottom: -1em;
.dropdown {
margin-bottom: 0;
}
}
.control-bar__bottom-overlap {
position: relative;
z-index: 1;
top: -1em;
} | 3 | 0.055556 | 2 | 1 |
caf2fa4f5a2e497785223902527341f650d2916b | config-model/src/main/java/com/yahoo/vespa/model/admin/LogserverContainerCluster.java | config-model/src/main/java/com/yahoo/vespa/model/admin/LogserverContainerCluster.java | // Copyright 2019 Oath Inc. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
package com.yahoo.vespa.model.admin;
import com.yahoo.config.model.deploy.DeployState;
import com.yahoo.config.model.producer.AbstractConfigProducer;
import com.yahoo.container.handler.ThreadpoolConfig;
import com.yahoo.search.config.QrStartConfig;
import com.yahoo.vespa.model.container.ContainerCluster;
import com.yahoo.vespa.model.container.component.Handler;
/**
* @author hmusum
*/
public class LogserverContainerCluster extends ContainerCluster<LogserverContainer> {
public LogserverContainerCluster(AbstractConfigProducer<?> parent, String name, DeployState deployState) {
super(parent, name, name, deployState);
addDefaultHandlersWithVip();
addLogHandler();
}
@Override
protected void doPrepare(DeployState deployState) { }
@Override
public void getConfig(ThreadpoolConfig.Builder builder) {
builder.maxthreads(10);
}
@Override
public void getConfig(QrStartConfig.Builder builder) {
super.getConfig(builder);
builder.jvm.heapsize(384);
}
protected boolean messageBusEnabled() { return false; }
private void addLogHandler() {
Handler<?> logHandler = Handler.fromClassName(ContainerCluster.LOG_HANDLER_CLASS);
logHandler.addServerBindings("http://*/logs");
addComponent(logHandler);
}
}
| // Copyright 2019 Oath Inc. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
package com.yahoo.vespa.model.admin;
import com.yahoo.config.model.deploy.DeployState;
import com.yahoo.config.model.producer.AbstractConfigProducer;
import com.yahoo.container.handler.ThreadpoolConfig;
import com.yahoo.search.config.QrStartConfig;
import com.yahoo.vespa.model.container.ContainerCluster;
import com.yahoo.vespa.model.container.component.Handler;
/**
* @author hmusum
*/
public class LogserverContainerCluster extends ContainerCluster<LogserverContainer> {
public LogserverContainerCluster(AbstractConfigProducer<?> parent, String name, DeployState deployState) {
super(parent, name, name, deployState);
addDefaultHandlersWithVip();
addLogHandler();
}
@Override
protected void doPrepare(DeployState deployState) { }
@Override
public void getConfig(ThreadpoolConfig.Builder builder) {
builder.maxthreads(10);
}
@Override
public void getConfig(QrStartConfig.Builder builder) {
super.getConfig(builder);
builder.jvm.heapsize(384)
.verbosegc(true);
}
protected boolean messageBusEnabled() { return false; }
private void addLogHandler() {
Handler<?> logHandler = Handler.fromClassName(ContainerCluster.LOG_HANDLER_CLASS);
logHandler.addServerBindings("http://*/logs");
addComponent(logHandler);
}
}
| Use verbose gc logging for logserver-container | Use verbose gc logging for logserver-container
| Java | apache-2.0 | vespa-engine/vespa,vespa-engine/vespa,vespa-engine/vespa,vespa-engine/vespa,vespa-engine/vespa,vespa-engine/vespa,vespa-engine/vespa,vespa-engine/vespa,vespa-engine/vespa,vespa-engine/vespa | java | ## Code Before:
// Copyright 2019 Oath Inc. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
package com.yahoo.vespa.model.admin;
import com.yahoo.config.model.deploy.DeployState;
import com.yahoo.config.model.producer.AbstractConfigProducer;
import com.yahoo.container.handler.ThreadpoolConfig;
import com.yahoo.search.config.QrStartConfig;
import com.yahoo.vespa.model.container.ContainerCluster;
import com.yahoo.vespa.model.container.component.Handler;
/**
* @author hmusum
*/
public class LogserverContainerCluster extends ContainerCluster<LogserverContainer> {
public LogserverContainerCluster(AbstractConfigProducer<?> parent, String name, DeployState deployState) {
super(parent, name, name, deployState);
addDefaultHandlersWithVip();
addLogHandler();
}
@Override
protected void doPrepare(DeployState deployState) { }
@Override
public void getConfig(ThreadpoolConfig.Builder builder) {
builder.maxthreads(10);
}
@Override
public void getConfig(QrStartConfig.Builder builder) {
super.getConfig(builder);
builder.jvm.heapsize(384);
}
protected boolean messageBusEnabled() { return false; }
private void addLogHandler() {
Handler<?> logHandler = Handler.fromClassName(ContainerCluster.LOG_HANDLER_CLASS);
logHandler.addServerBindings("http://*/logs");
addComponent(logHandler);
}
}
## Instruction:
Use verbose gc logging for logserver-container
## Code After:
// Copyright 2019 Oath Inc. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
package com.yahoo.vespa.model.admin;
import com.yahoo.config.model.deploy.DeployState;
import com.yahoo.config.model.producer.AbstractConfigProducer;
import com.yahoo.container.handler.ThreadpoolConfig;
import com.yahoo.search.config.QrStartConfig;
import com.yahoo.vespa.model.container.ContainerCluster;
import com.yahoo.vespa.model.container.component.Handler;
/**
* @author hmusum
*/
public class LogserverContainerCluster extends ContainerCluster<LogserverContainer> {
public LogserverContainerCluster(AbstractConfigProducer<?> parent, String name, DeployState deployState) {
super(parent, name, name, deployState);
addDefaultHandlersWithVip();
addLogHandler();
}
@Override
protected void doPrepare(DeployState deployState) { }
@Override
public void getConfig(ThreadpoolConfig.Builder builder) {
builder.maxthreads(10);
}
@Override
public void getConfig(QrStartConfig.Builder builder) {
super.getConfig(builder);
builder.jvm.heapsize(384)
.verbosegc(true);
}
protected boolean messageBusEnabled() { return false; }
private void addLogHandler() {
Handler<?> logHandler = Handler.fromClassName(ContainerCluster.LOG_HANDLER_CLASS);
logHandler.addServerBindings("http://*/logs");
addComponent(logHandler);
}
}
| // Copyright 2019 Oath Inc. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
package com.yahoo.vespa.model.admin;
import com.yahoo.config.model.deploy.DeployState;
import com.yahoo.config.model.producer.AbstractConfigProducer;
import com.yahoo.container.handler.ThreadpoolConfig;
import com.yahoo.search.config.QrStartConfig;
import com.yahoo.vespa.model.container.ContainerCluster;
import com.yahoo.vespa.model.container.component.Handler;
/**
* @author hmusum
*/
public class LogserverContainerCluster extends ContainerCluster<LogserverContainer> {
public LogserverContainerCluster(AbstractConfigProducer<?> parent, String name, DeployState deployState) {
super(parent, name, name, deployState);
addDefaultHandlersWithVip();
addLogHandler();
}
@Override
protected void doPrepare(DeployState deployState) { }
@Override
public void getConfig(ThreadpoolConfig.Builder builder) {
builder.maxthreads(10);
}
@Override
public void getConfig(QrStartConfig.Builder builder) {
super.getConfig(builder);
- builder.jvm.heapsize(384);
? -
+ builder.jvm.heapsize(384)
+ .verbosegc(true);
}
protected boolean messageBusEnabled() { return false; }
private void addLogHandler() {
Handler<?> logHandler = Handler.fromClassName(ContainerCluster.LOG_HANDLER_CLASS);
logHandler.addServerBindings("http://*/logs");
addComponent(logHandler);
}
} | 3 | 0.066667 | 2 | 1 |
97534be9f7df628177a886718fd39c31293d2f11 | lib/node_modules/@stdlib/utils/try-function/docs/repl.txt | lib/node_modules/@stdlib/utils/try-function/docs/repl.txt |
{{alias}}( fcn[, thisArg] )
Wraps a function in a try/catch block.
Parameters
----------
fcn: Function
Function to wrap.
thisArg: any (optional)
Function context.
Returns
-------
wrapped: Function
Wrapped function.
Examples
--------
> function fcn() {
> throw new Error( 'beep boop' );
> }
> var f = wrap( fcn );
> var out = f();
> out.message
'beep boop'
See Also
--------
|
{{alias}}( fcn[, thisArg] )
Wraps a function in a try/catch block.
If provided an asynchronous function, the returned function only traps
errors which occur during the current event loop tick.
Parameters
----------
fcn: Function
Function to wrap.
thisArg: any (optional)
Function context.
Returns
-------
out: Function
Wrapped function.
Examples
--------
> function fcn() {
> throw new Error( 'beep boop' );
> };
> var f = wrap( fcn );
> var out = f();
> out.message
'beep boop'
See Also
--------
| Add semicolon and add note | Add semicolon and add note
| Text | apache-2.0 | stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib | text | ## Code Before:
{{alias}}( fcn[, thisArg] )
Wraps a function in a try/catch block.
Parameters
----------
fcn: Function
Function to wrap.
thisArg: any (optional)
Function context.
Returns
-------
wrapped: Function
Wrapped function.
Examples
--------
> function fcn() {
> throw new Error( 'beep boop' );
> }
> var f = wrap( fcn );
> var out = f();
> out.message
'beep boop'
See Also
--------
## Instruction:
Add semicolon and add note
## Code After:
{{alias}}( fcn[, thisArg] )
Wraps a function in a try/catch block.
If provided an asynchronous function, the returned function only traps
errors which occur during the current event loop tick.
Parameters
----------
fcn: Function
Function to wrap.
thisArg: any (optional)
Function context.
Returns
-------
out: Function
Wrapped function.
Examples
--------
> function fcn() {
> throw new Error( 'beep boop' );
> };
> var f = wrap( fcn );
> var out = f();
> out.message
'beep boop'
See Also
--------
|
{{alias}}( fcn[, thisArg] )
Wraps a function in a try/catch block.
+
+ If provided an asynchronous function, the returned function only traps
+ errors which occur during the current event loop tick.
Parameters
----------
fcn: Function
Function to wrap.
thisArg: any (optional)
Function context.
Returns
-------
- wrapped: Function
+ out: Function
Wrapped function.
Examples
--------
> function fcn() {
> throw new Error( 'beep boop' );
- > }
+ > };
? +
> var f = wrap( fcn );
> var out = f();
> out.message
'beep boop'
See Also
--------
| 7 | 0.233333 | 5 | 2 |