commit stringlengths 40 40 | old_file stringlengths 4 184 | new_file stringlengths 4 184 | old_contents stringlengths 1 3.6k | new_contents stringlengths 5 3.38k | subject stringlengths 15 778 | message stringlengths 16 6.74k | lang stringclasses 201 values | license stringclasses 13 values | repos stringlengths 6 116k | config stringclasses 201 values | content stringlengths 137 7.24k | diff stringlengths 26 5.55k | diff_length int64 1 123 | relative_diff_length float64 0.01 89 | n_lines_added int64 0 108 | n_lines_deleted int64 0 106 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2adda7e21c11a425956bd18f0540dd16f8cf9d53 | run_docker.sh | run_docker.sh | docker run --add-host=db:172.17.0.2 -v $(pwd)/src:/code -p 8000:8000 -t -i django_web python /code/manage.py runserver 0.0.0.0:8000
| id=$(docker ps -q -f image=postgres)
if [ -z "$id" ]
then
echo "[!] Error: Start the postgres image before running django! Try: docker run -d postgres"
return
fi
docker run --link $id:db -v $(pwd)/src:/code -p 8000:8000 -t -i django_web python /code/manage.py runserver 0.0.0.0:8000
| Change docker run script to check for postgres | Change docker run script to check for postgres
| Shell | mit | Dbof/django-docker-example | shell | ## Code Before:
docker run --add-host=db:172.17.0.2 -v $(pwd)/src:/code -p 8000:8000 -t -i django_web python /code/manage.py runserver 0.0.0.0:8000
## Instruction:
Change docker run script to check for postgres
## Code After:
id=$(docker ps -q -f image=postgres)
if [ -z "$id" ]
then
echo "[!] Error: Start the postgres image before running django! Try: docker run -d postgres"
return
fi
docker run --link $id:db -v $(pwd)/src:/code -p 8000:8000 -t -i django_web python /code/manage.py runserver 0.0.0.0:8000
| + id=$(docker ps -q -f image=postgres)
+ if [ -z "$id" ]
+ then
+ echo "[!] Error: Start the postgres image before running django! Try: docker run -d postgres"
+ return
+ fi
+
- docker run --add-host=db:172.17.0.2 -v $(pwd)/src:/code -p 8000:8000 -t -i django_web python /code/manage.py runserver 0.0.0.0:8000
? ^ ^^^^^^^ -----------
+ docker run --link $id:db -v $(pwd)/src:/code -p 8000:8000 -t -i django_web python /code/manage.py runserver 0.0.0.0:8000
? ^^^^^^^ ^
| 9 | 9 | 8 | 1 |
e5a8592786d503c84d7439fc1aeb8cbec9e0713c | tests/cases/fourslash/completionListInFunctionExpression.ts | tests/cases/fourslash/completionListInFunctionExpression.ts | /// <reference path="fourslash.ts"/>
////interface Number {
//// toString(radix?: number): string;
//// toFixed(fractionDigits?: number): string;
//// toExponential(fractionDigits?: number): string;
//// toPrecision(precision: number): string;
////}
////
////() => {
//// var foo = 0;
//// /*requestCompletion*/
//// foo./*memberCompletion*/toString;
////}/*editDeclaration*/
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential");
// Now change the decl by adding a semicolon
goTo.marker("editDeclaration");
edit.insert(";");
// foo should still be there
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential");
| /// <reference path="fourslash.ts"/>
////() => {
//// var foo = 0;
//// /*requestCompletion*/
//// foo./*memberCompletion*/toString;
////}/*editDeclaration*/
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential");
// Now change the decl by adding a semicolon
goTo.marker("editDeclaration");
edit.insert(";");
// foo should still be there
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential");
| Remove extraneous Number decls now that lib.d.ts is always present | Remove extraneous Number decls now that lib.d.ts is always present
| TypeScript | apache-2.0 | mbebenita/shumway.ts,hippich/typescript,hippich/typescript,fdecampredon/jsx-typescript-old-version,mbrowne/typescript-dci,hippich/typescript,mbrowne/typescript-dci,popravich/typescript,mbrowne/typescript-dci,mbebenita/shumway.ts,fdecampredon/jsx-typescript-old-version,mbebenita/shumway.ts,fdecampredon/jsx-typescript-old-version,mbrowne/typescript-dci,popravich/typescript,popravich/typescript | typescript | ## Code Before:
/// <reference path="fourslash.ts"/>
////interface Number {
//// toString(radix?: number): string;
//// toFixed(fractionDigits?: number): string;
//// toExponential(fractionDigits?: number): string;
//// toPrecision(precision: number): string;
////}
////
////() => {
//// var foo = 0;
//// /*requestCompletion*/
//// foo./*memberCompletion*/toString;
////}/*editDeclaration*/
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential");
// Now change the decl by adding a semicolon
goTo.marker("editDeclaration");
edit.insert(";");
// foo should still be there
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential");
## Instruction:
Remove extraneous Number decls now that lib.d.ts is always present
## Code After:
/// <reference path="fourslash.ts"/>
////() => {
//// var foo = 0;
//// /*requestCompletion*/
//// foo./*memberCompletion*/toString;
////}/*editDeclaration*/
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential");
// Now change the decl by adding a semicolon
goTo.marker("editDeclaration");
edit.insert(";");
// foo should still be there
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential");
| /// <reference path="fourslash.ts"/>
- ////interface Number {
- //// toString(radix?: number): string;
- //// toFixed(fractionDigits?: number): string;
- //// toExponential(fractionDigits?: number): string;
- //// toPrecision(precision: number): string;
- ////}
- ////
////() => {
//// var foo = 0;
//// /*requestCompletion*/
//// foo./*memberCompletion*/toString;
////}/*editDeclaration*/
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential");
// Now change the decl by adding a semicolon
goTo.marker("editDeclaration");
edit.insert(";");
// foo should still be there
goTo.marker("requestCompletion");
verify.memberListContains("foo");
goTo.marker("memberCompletion");
verify.memberListContains("toExponential"); | 7 | 0.225806 | 0 | 7 |
3f683f5b6f6e77a8453bc578c330d08f57284331 | src/vistk/image/instantiate.h | src/vistk/image/instantiate.h | /*ckwg +5
* Copyright 2011-2012 by Kitware, Inc. All Rights Reserved. Please refer to
* KITWARE_LICENSE.TXT for licensing information, or contact General Counsel,
* Kitware, Inc., 28 Corporate Drive, Clifton Park, NY 12065.
*/
#include <boost/cstdint.hpp>
#define VISTK_IMAGE_INSTANTIATE(cls) \
template class cls<bool>; \
template class cls<uint8_t>; \
template class cls<float>; \
template class cls<double>
| /*ckwg +5
* Copyright 2011-2012 by Kitware, Inc. All Rights Reserved. Please refer to
* KITWARE_LICENSE.TXT for licensing information, or contact General Counsel,
* Kitware, Inc., 28 Corporate Drive, Clifton Park, NY 12065.
*/
#ifndef VISTK_IMAGE_INSTANTIATE_H
#define VISTK_IMAGE_INSTANTIATE_H
#include <boost/cstdint.hpp>
/**
* \file image/instantiate.h
*
* \brief Types for instantiation of image library templates.
*/
#define VISTK_IMAGE_INSTANTIATE_INT(cls) \
template class cls<bool>; \
template class cls<uint8_t>
#define VISTK_IMAGE_INSTANTIATE_FLOAT(cls) \
template class cls<float>; \
template class cls<double>
#define VISTK_IMAGE_INSTANTIATE(cls) \
VISTK_IMAGE_INSTANTIATE_INT(cls); \
VISTK_IMAGE_INSTANTIATE_FLOAT(cls)
#endif // VISTK_IMAGE_INSTANTIATE_H
| Split int from float instantiations | Split int from float instantiations
| C | bsd-3-clause | linus-sherrill/sprokit,Kitware/sprokit,linus-sherrill/sprokit,Kitware/sprokit,linus-sherrill/sprokit,Kitware/sprokit,linus-sherrill/sprokit,mathstuf/sprokit,mathstuf/sprokit,mathstuf/sprokit,mathstuf/sprokit,Kitware/sprokit | c | ## Code Before:
/*ckwg +5
* Copyright 2011-2012 by Kitware, Inc. All Rights Reserved. Please refer to
* KITWARE_LICENSE.TXT for licensing information, or contact General Counsel,
* Kitware, Inc., 28 Corporate Drive, Clifton Park, NY 12065.
*/
#include <boost/cstdint.hpp>
#define VISTK_IMAGE_INSTANTIATE(cls) \
template class cls<bool>; \
template class cls<uint8_t>; \
template class cls<float>; \
template class cls<double>
## Instruction:
Split int from float instantiations
## Code After:
/*ckwg +5
* Copyright 2011-2012 by Kitware, Inc. All Rights Reserved. Please refer to
* KITWARE_LICENSE.TXT for licensing information, or contact General Counsel,
* Kitware, Inc., 28 Corporate Drive, Clifton Park, NY 12065.
*/
#ifndef VISTK_IMAGE_INSTANTIATE_H
#define VISTK_IMAGE_INSTANTIATE_H
#include <boost/cstdint.hpp>
/**
* \file image/instantiate.h
*
* \brief Types for instantiation of image library templates.
*/
#define VISTK_IMAGE_INSTANTIATE_INT(cls) \
template class cls<bool>; \
template class cls<uint8_t>
#define VISTK_IMAGE_INSTANTIATE_FLOAT(cls) \
template class cls<float>; \
template class cls<double>
#define VISTK_IMAGE_INSTANTIATE(cls) \
VISTK_IMAGE_INSTANTIATE_INT(cls); \
VISTK_IMAGE_INSTANTIATE_FLOAT(cls)
#endif // VISTK_IMAGE_INSTANTIATE_H
| /*ckwg +5
* Copyright 2011-2012 by Kitware, Inc. All Rights Reserved. Please refer to
* KITWARE_LICENSE.TXT for licensing information, or contact General Counsel,
* Kitware, Inc., 28 Corporate Drive, Clifton Park, NY 12065.
*/
+ #ifndef VISTK_IMAGE_INSTANTIATE_H
+ #define VISTK_IMAGE_INSTANTIATE_H
+
#include <boost/cstdint.hpp>
+ /**
+ * \file image/instantiate.h
+ *
+ * \brief Types for instantiation of image library templates.
+ */
+
+ #define VISTK_IMAGE_INSTANTIATE_INT(cls) \
+ template class cls<bool>; \
+ template class cls<uint8_t>
+
+ #define VISTK_IMAGE_INSTANTIATE_FLOAT(cls) \
+ template class cls<float>; \
+ template class cls<double>
+
#define VISTK_IMAGE_INSTANTIATE(cls) \
- template class cls<bool>; \
- template class cls<uint8_t>; \
- template class cls<float>; \
- template class cls<double>
+ VISTK_IMAGE_INSTANTIATE_INT(cls); \
+ VISTK_IMAGE_INSTANTIATE_FLOAT(cls)
+
+ #endif // VISTK_IMAGE_INSTANTIATE_H | 25 | 1.923077 | 21 | 4 |
b0a0e5b72ee78c0a03b95a092ac3d50b2ee1bc48 | README.md | README.md | Swift-Algorithms
================
Implementation of various algorithms and data structures in Swift.
Feel free to add more implementations!
Each implementation is done in Xcode playground. To view code, either:
1. Clone the repo and import in Xcode, *or*
2. View the .swift file in each .playground folder (on Github)
## Algorithms
### Sorting
1. [Insertion Sort](https://github.com/karan/Swift-Algorithms/tree/master/InsertionSort.playground)
2. [Selection Sort](https://github.com/karan/Swift-Algorithms/tree/master/SelectionSort.playground)
| Swift-Algorithms
================
Implementation of various algorithms and data structures in Swift.
Feel free to add more implementations!
Each implementation is done in Xcode playground. To view code, either:
1. Clone the repo and import in Xcode, *or*
2. View the .swift file in each .playground folder (on Github)
## Algorithms
### Sorting
1. [Insertion Sort](https://github.com/karan/Swift-Algorithms/tree/master/InsertionSort.playground)
2. [Selection Sort](https://github.com/karan/Swift-Algorithms/tree/master/SelectionSort.playground)
3. [Bubble Sort](https://github.com/karan/Swift-Algorithms/tree/master/BubbleSort.playground)
4. [Quick Sort](https://github.com/karan/Swift-Algorithms/tree/master/QuickSort.playground)
| Add links to Bubble and Quick sorts | Add links to Bubble and Quick sorts | Markdown | mit | karan/Swift-Algorithms,cdare77/Swift-Algorithms,alvinvarghese/Swift-Algorithms | markdown | ## Code Before:
Swift-Algorithms
================
Implementation of various algorithms and data structures in Swift.
Feel free to add more implementations!
Each implementation is done in Xcode playground. To view code, either:
1. Clone the repo and import in Xcode, *or*
2. View the .swift file in each .playground folder (on Github)
## Algorithms
### Sorting
1. [Insertion Sort](https://github.com/karan/Swift-Algorithms/tree/master/InsertionSort.playground)
2. [Selection Sort](https://github.com/karan/Swift-Algorithms/tree/master/SelectionSort.playground)
## Instruction:
Add links to Bubble and Quick sorts
## Code After:
Swift-Algorithms
================
Implementation of various algorithms and data structures in Swift.
Feel free to add more implementations!
Each implementation is done in Xcode playground. To view code, either:
1. Clone the repo and import in Xcode, *or*
2. View the .swift file in each .playground folder (on Github)
## Algorithms
### Sorting
1. [Insertion Sort](https://github.com/karan/Swift-Algorithms/tree/master/InsertionSort.playground)
2. [Selection Sort](https://github.com/karan/Swift-Algorithms/tree/master/SelectionSort.playground)
3. [Bubble Sort](https://github.com/karan/Swift-Algorithms/tree/master/BubbleSort.playground)
4. [Quick Sort](https://github.com/karan/Swift-Algorithms/tree/master/QuickSort.playground)
| Swift-Algorithms
================
Implementation of various algorithms and data structures in Swift.
Feel free to add more implementations!
Each implementation is done in Xcode playground. To view code, either:
1. Clone the repo and import in Xcode, *or*
2. View the .swift file in each .playground folder (on Github)
## Algorithms
### Sorting
1. [Insertion Sort](https://github.com/karan/Swift-Algorithms/tree/master/InsertionSort.playground)
2. [Selection Sort](https://github.com/karan/Swift-Algorithms/tree/master/SelectionSort.playground)
+
+ 3. [Bubble Sort](https://github.com/karan/Swift-Algorithms/tree/master/BubbleSort.playground)
+
+ 4. [Quick Sort](https://github.com/karan/Swift-Algorithms/tree/master/QuickSort.playground) | 4 | 0.2 | 4 | 0 |
af44ce54fe427986398cf76b9cd0c856bf215f83 | src/FlashServiceProvider.php | src/FlashServiceProvider.php | <?php
namespace Jalle19\Laravel\UnshittyFlash;
use Illuminate\Support\ServiceProvider;
/**
* Class FlashServiceProvider
* @package Jalle19\Laravel\UnshittyFlash
*/
class FlashServiceProvider extends ServiceProvider
{
/**
* Registers bindings into the container
*/
public function register()
{
$this->app->singleton(FlashService::class, function() {
return new FlashService(config('flash'));
});
}
}
| <?php
namespace Jalle19\Laravel\UnshittyFlash;
use Illuminate\Support\ServiceProvider;
/**
* Class FlashServiceProvider
* @package Jalle19\Laravel\UnshittyFlash
*/
class FlashServiceProvider extends ServiceProvider
{
/**
* Registers bindings into the container
*/
public function register()
{
$this->app->singleton(FlashService::class, function() {
return new FlashService(config('flash'));
});
}
/**
*
*/
public function boot()
{
$this->publishes([
__DIR__ . '/../config/flash.php' => config_path('flash.php'),
]);
}
}
| Add configuration file publishing to the service provider | Add configuration file publishing to the service provider
| PHP | mit | Jalle19/laravel-unshitty-flash | php | ## Code Before:
<?php
namespace Jalle19\Laravel\UnshittyFlash;
use Illuminate\Support\ServiceProvider;
/**
* Class FlashServiceProvider
* @package Jalle19\Laravel\UnshittyFlash
*/
class FlashServiceProvider extends ServiceProvider
{
/**
* Registers bindings into the container
*/
public function register()
{
$this->app->singleton(FlashService::class, function() {
return new FlashService(config('flash'));
});
}
}
## Instruction:
Add configuration file publishing to the service provider
## Code After:
<?php
namespace Jalle19\Laravel\UnshittyFlash;
use Illuminate\Support\ServiceProvider;
/**
* Class FlashServiceProvider
* @package Jalle19\Laravel\UnshittyFlash
*/
class FlashServiceProvider extends ServiceProvider
{
/**
* Registers bindings into the container
*/
public function register()
{
$this->app->singleton(FlashService::class, function() {
return new FlashService(config('flash'));
});
}
/**
*
*/
public function boot()
{
$this->publishes([
__DIR__ . '/../config/flash.php' => config_path('flash.php'),
]);
}
}
| <?php
namespace Jalle19\Laravel\UnshittyFlash;
use Illuminate\Support\ServiceProvider;
/**
* Class FlashServiceProvider
* @package Jalle19\Laravel\UnshittyFlash
*/
class FlashServiceProvider extends ServiceProvider
{
/**
* Registers bindings into the container
*/
public function register()
{
$this->app->singleton(FlashService::class, function() {
return new FlashService(config('flash'));
});
}
+
+ /**
+ *
+ */
+ public function boot()
+ {
+ $this->publishes([
+ __DIR__ . '/../config/flash.php' => config_path('flash.php'),
+ ]);
+ }
+
} | 11 | 0.458333 | 11 | 0 |
672ab800d07fb43757d8dae5ed3f4969e5cbf24f | recipes/mapsplice/meta.yaml | recipes/mapsplice/meta.yaml | package:
name: mapsplice
version: 2.2.0
source:
fn: MapSplice-v2.2.0.zip
url: http://protocols.netlab.uky.edu/~zeng/MapSplice-v2.2.0.zip
md5: 171975d48d2ddcc111a314d926f3527a
patches:
- mapsplice.patch
- samtools.patch
build:
number: 1
skip: True # [osx]
string: "ncurses{{CONDA_NCURSES}}_{{PKG_BUILDNUM}}"
requirements:
build:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
run:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
test:
commands:
- mapsplice.py -v 2>&1 |grep MapSplice > /dev/null
about:
license: Custom
summary: MapSplice is a software for mapping RNA-seq data to reference genome for splice junction discovery that depends only on reference genome, and not on any further annotations.
license_file: Copyright.txt
| package:
name: mapsplice
version: 2.2.0
source:
fn: MapSplice-v2.2.0.zip
url: http://protocols.netlab.uky.edu/~zeng/MapSplice-v2.2.0.zip
md5: 171975d48d2ddcc111a314d926f3527a
patches:
- mapsplice.patch
- samtools.patch
build:
number: 1
skip: True # [osx]
string: "py{{CONDA_PY}}_ncurses{{CONDA_NCURSES}}_{{PKG_BUILDNUM}}"
requirements:
build:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
run:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
test:
commands:
- mapsplice.py -v 2>&1 |grep MapSplice > /dev/null
about:
license: Custom
summary: MapSplice is a software for mapping RNA-seq data to reference genome for splice junction discovery that depends only on reference genome, and not on any further annotations.
license_file: Copyright.txt
| Fix build string of mapsplice. | Fix build string of mapsplice.
 | YAML | mit | pinguinkiste/bioconda-recipes,gregvonkuster/bioconda-recipes,hardingnj/bioconda-recipes,joachimwolff/bioconda-recipes,JenCabral/bioconda-recipes,bow/bioconda-recipes,peterjc/bioconda-recipes,keuv-grvl/bioconda-recipes,ivirshup/bioconda-recipes,pinguinkiste/bioconda-recipes,omicsnut/bioconda-recipes,shenwei356/bioconda-recipes,ostrokach/bioconda-recipes,lpantano/recipes,zachcp/bioconda-recipes,dmaticzka/bioconda-recipes,mcornwell1957/bioconda-recipes,colinbrislawn/bioconda-recipes,ivirshup/bioconda-recipes,pinguinkiste/bioconda-recipes,omicsnut/bioconda-recipes,JenCabral/bioconda-recipes,bow/bioconda-recipes,Luobiny/bioconda-recipes,zachcp/bioconda-recipes,shenwei356/bioconda-recipes,gvlproject/bioconda-recipes,hardingnj/bioconda-recipes,colinbrislawn/bioconda-recipes,bebatut/bioconda-recipes,oena/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,npavlovikj/bioconda-recipes,daler/bioconda-recipes,oena/bioconda-recipes,daler/bioconda-recipes,lpantano/recipes,joachimwolff/bioconda-recipes,omicsnut/bioconda-recipes,JenCabral/bioconda-recipes,martin-mann/bioconda-recipes,matthdsm/bioconda-recipes,bow/bioconda-recipes,Luobiny/bioconda-recipes,zachcp/bioconda-recipes,hippich/typescript,mbrowne/typescript-dci,mbebenita/shumway.ts,fdecampredon/jsx-typescript-old-version,mbrowne/typescript-dci,popravich/typescript,chapmanb/bioconda-recipes,saketkc/bioconda-recipes,gregvonkuster/bioconda-recipes,shenwei356/bioconda-recipes,martin-mann/bioconda-recipes,rvalieris/bioconda-recipes,guowei-he/bioconda-recipes,CGATOxford/bioconda-recipes,abims-sbr/bioconda-recipes,xguse/bioconda-recipes,cokelaer/bioconda-recipes,cokelaer/bioconda-recipes,saketkc/bioconda-recipes,rvalieris/bioconda-recipes,HassanAmr/bioconda-recipes,guowei-he/bioconda-recipes,xguse/bioconda-recipes,gvlproject/bioconda-recipes,phac-nml/bioconda-recipes,zwanli/bioconda-recipes,gvlproject/bioconda-recipes,ivirshup/bioconda-recipes,ostrokach/bioconda-recipes,bow/bioconda-recipes,dmaticzka/bioconda-recipes,saketkc/bioconda-recipes,keuv-grvl/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,rvalieris/bioconda-recipes,roryk/recipes,hardingnj/bioconda-recipes,bioconda/bioconda-recipes,blankenberg/bioconda-recipes,abims-sbr/bioconda-recipes,jfallmann/bioconda-recipes,joachimwolff/bioconda-recipes,npavlovikj/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,peterjc/bioconda-recipes,abims-sbr/bioconda-recipes,jfallmann/bioconda-recipes,npavlovikj/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,Luobiny/bioconda-recipes,ivirshup/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,saketkc/bioconda-recipes,colinbrislawn/bioconda-recipes,gregvonkuster/bioconda-recipes,omicsnut/bioconda-recipes,jfallmann/bioconda-recipes,CGATOxford/bioconda-recipes,matthdsm/bioconda-recipes,guowei-he/bioconda-recipes,jasper1918/bioconda-recipes,dkoppstein/recipes,HassanAmr/bioconda-recipes,yesimon/bioconda-recipes,jasper1918/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,daler/bioconda-recipes,mdehollander/bioconda-recipes,oena/bioconda-recipes,blankenberg/bioconda-recipes,mcornwell1957/bioconda-recipes,bebatut/bioconda-recipes,ivirshup/bioconda-recipes,joachimwolff/bioconda-recipes,martin-mann/bioconda-recipes,saketkc/bioconda-recipes,yesimon/bioconda-recipes,mdehollander/bioconda-recipes,oena/bioconda-recipes,lpantano/recipes,keuv-grvl/bioconda-recipes,phac-nml/bioconda-recipes,rob-p/bioconda-recipes,gregvonkuster/bioconda-recipes,ThomasWollmann/bioconda-recipes,colinbrislawn/bioconda-recipes,saketkc/bioconda-recipes,bioconda/bioconda-recipes,mcornwell1957/bioconda-recipes,dkoppstein/recipes,acaprez/recipes,HassanAmr/bioconda-recipes,joachimwolff/bioconda-recipes,keuv-grvl/bioconda-recipes,guowei-he/bioconda-recipes,matthdsm/bioconda-recipes,matthdsm/bioconda-recipes,jfallmann/bioconda-recipes,cokelaer/bioconda-recipes,daler/bioconda-recipes,HassanAmr/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,matthdsm/bioconda-recipes,bebatut/bioconda-recipes,jasper1918/bioconda-recipes,zwanli/bioconda-recipes,shenwei356/bioconda-recipes,npavlovikj/bioconda-recipes,joachimwolff/bioconda-recipes,mdehollander/bioconda-recipes,peterjc/bioconda-recipes,dmaticzka/bioconda-recipes,rob-p/bioconda-recipes,dkoppstein/recipes,xguse/bioconda-recipes,pinguinkiste/bioconda-recipes,chapmanb/bioconda-recipes,hardingnj/bioconda-recipes,mdehollander/bioconda-recipes,zwanli/bioconda-recipes,bow/bioconda-recipes,chapmanb/bioconda-recipes,zachcp/bioconda-recipes,zwanli/bioconda-recipes,chapmanb/bioconda-recipes,mdehollander/bioconda-recipes,rob-p/bioconda-recipes,JenCabral/bioconda-recipes,phac-nml/bioconda-recipes,rob-p/bioconda-recipes,bow/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,ThomasWollmann/bioconda-recipes,gvlproject/bioconda-recipes,abims-sbr/bioconda-recipes,bioconda/bioconda-recipes,JenCabral/bioconda-recipes,ThomasWollmann/bioconda-recipes,rvalieris/bioconda-recipes,dmaticzka/bioconda-recipes,ThomasWollmann/bioconda-recipes,npavlovikj/bioconda-recipes,abims-sbr/bioconda-recipes,martin-mann/bioconda-recipes,keuv-grvl/bioconda-recipes,ThomasWollmann/bioconda-recipes,martin-mann/bioconda-recipes,roryk/recipes,pinguinkiste/bioconda-recipes,ostrokach/bioconda-recipes,jasper1918/bioconda-recipes,dmaticzka/bioconda-recipes,bioconda/recipes,roryk/recipes,ivirshup/bioconda-recipes,yesimon/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,pinguinkiste/bioconda-recipes,bioconda/bioconda-recipes,omicsnut/bioconda-recipes,CGATOxford/bioconda-recipes,ostrokach/bioconda-recipes,rvalieris/bioconda-recipes,hardingnj/bioconda-recipes,bioconda/recipes,daler/bioconda-recipes,zwanli/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,chapmanb/bioconda-recipes,oena/bioconda-recipes,cokelaer/bioconda-recipes,zwanli/bioconda-recipes,peterjc/bioconda-recipes,lpantano/recipes,bebatut/bioconda-recipes,peterjc/bioconda-recipes,gvlproject/bioconda-recipes,rvalieris/bioconda-recipes,peterjc/bioconda-recipes,mcornwell1957/bioconda-recipes,ThomasWollmann/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,abims-sbr/bioconda-recipes,mcornwell1957/bioconda-recipes,yesimon/bioconda-recipes,blankenberg/bioconda-recipes,keuv-grvl/bioconda-recipes,CGATOxford/bioconda-recipes,matthdsm/bioconda-recipes,guowei-he/bioconda-recipes,HassanAmr/bioconda-recipes,xguse/bioconda-recipes,CGATOxford/bioconda-recipes,acaprez/recipes,blankenberg/bioconda-recipes,bioconda/recipes,Luobiny/bioconda-recipes,phac-nml/bioconda-recipes,phac-nml/bioconda-recipes,JenCabral/bioconda-recipes,dmaticzka/bioconda-recipes | yaml | ## Code Before:
package:
name: mapsplice
version: 2.2.0
source:
fn: MapSplice-v2.2.0.zip
url: http://protocols.netlab.uky.edu/~zeng/MapSplice-v2.2.0.zip
md5: 171975d48d2ddcc111a314d926f3527a
patches:
- mapsplice.patch
- samtools.patch
build:
number: 1
skip: True # [osx]
string: "ncurses{{CONDA_NCURSES}}_{{PKG_BUILDNUM}}"
requirements:
build:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
run:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
test:
commands:
- mapsplice.py -v 2>&1 |grep MapSplice > /dev/null
about:
license: Custom
summary: MapSplice is a software for mapping RNA-seq data to reference genome for splice junction discovery that depends only on reference genome, and not on any further annotations.
license_file: Copyright.txt
## Instruction:
Fix build string of mapsplice.
## Code After:
package:
name: mapsplice
version: 2.2.0
source:
fn: MapSplice-v2.2.0.zip
url: http://protocols.netlab.uky.edu/~zeng/MapSplice-v2.2.0.zip
md5: 171975d48d2ddcc111a314d926f3527a
patches:
- mapsplice.patch
- samtools.patch
build:
number: 1
skip: True # [osx]
string: "py{{CONDA_PY}}_ncurses{{CONDA_NCURSES}}_{{PKG_BUILDNUM}}"
requirements:
build:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
run:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
test:
commands:
- mapsplice.py -v 2>&1 |grep MapSplice > /dev/null
about:
license: Custom
summary: MapSplice is a software for mapping RNA-seq data to reference genome for splice junction discovery that depends only on reference genome, and not on any further annotations.
license_file: Copyright.txt
| package:
name: mapsplice
version: 2.2.0
source:
fn: MapSplice-v2.2.0.zip
url: http://protocols.netlab.uky.edu/~zeng/MapSplice-v2.2.0.zip
md5: 171975d48d2ddcc111a314d926f3527a
patches:
- mapsplice.patch
- samtools.patch
build:
number: 1
skip: True # [osx]
- string: "ncurses{{CONDA_NCURSES}}_{{PKG_BUILDNUM}}"
+ string: "py{{CONDA_PY}}_ncurses{{CONDA_NCURSES}}_{{PKG_BUILDNUM}}"
? +++++++++++++++
requirements:
build:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
run:
- python
- zlib
- ncurses {{CONDA_NCURSES}}*
test:
commands:
- mapsplice.py -v 2>&1 |grep MapSplice > /dev/null
about:
license: Custom
summary: MapSplice is a software for mapping RNA-seq data to reference genome for splice junction discovery that depends only on reference genome, and not on any further annotations.
license_file: Copyright.txt | 2 | 0.052632 | 1 | 1 |
ad8b1dc1b2054ce794315b9ef6691aa83686c126 | beryl/src/org/beryl/intents/IntentHelper.java | beryl/src/org/beryl/intents/IntentHelper.java | package org.beryl.intents;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.net.Uri;
import android.webkit.MimeTypeMap;
public class IntentHelper {
public static boolean canHandleIntent(final Context context, final Intent intent) {
if (intent != null) {
final PackageManager pm = context.getPackageManager();
if (pm != null) {
if (pm.queryIntentActivities(intent, 0).size() > 0) {
return true;
}
}
}
return false;
}
public static String getMimeTypeFromUrl(Uri url) {
return getMimeTypeFromUrl(url.getPath());
}
public static String getMimeTypeFromUrl(String url) {
final MimeTypeMap mimeMap = MimeTypeMap.getSingleton();
final String extension = MimeTypeMap.getFileExtensionFromUrl(url);
return mimeMap.getMimeTypeFromExtension(extension);
}
public static final Intent getContentByType(String mimeType) {
final Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
intent.setType(mimeType);
return intent;
}
}
| package org.beryl.intents;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.net.Uri;
import android.webkit.MimeTypeMap;
public class IntentHelper {
public static boolean canHandleIntent(final Context context, final Intent intent) {
if (intent != null) {
final PackageManager pm = context.getPackageManager();
if (pm != null) {
if (pm.queryIntentActivities(intent, 0).size() > 0) {
return true;
}
}
}
return false;
}
public static String getMimeTypeFromUrl(Uri url) {
return getMimeTypeFromUrl(url.getPath());
}
public static String getMimeTypeFromUrl(String url) {
final MimeTypeMap mimeMap = MimeTypeMap.getSingleton();
final String extension = MimeTypeMap.getFileExtensionFromUrl(url);
return mimeMap.getMimeTypeFromExtension(extension);
}
public static final Intent getContentByType(String mimeType) {
final Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
intent.setType(mimeType);
intent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_WHEN_TASK_RESET);
return intent;
}
}
| Clear activity on pick image intents. | Clear activity on pick image intents.
| Java | mit | weimingtom/android-beryl,weimingtom/android-beryl,Letractively/android-beryl,Letractively/android-beryl | java | ## Code Before:
package org.beryl.intents;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.net.Uri;
import android.webkit.MimeTypeMap;
public class IntentHelper {
public static boolean canHandleIntent(final Context context, final Intent intent) {
if (intent != null) {
final PackageManager pm = context.getPackageManager();
if (pm != null) {
if (pm.queryIntentActivities(intent, 0).size() > 0) {
return true;
}
}
}
return false;
}
public static String getMimeTypeFromUrl(Uri url) {
return getMimeTypeFromUrl(url.getPath());
}
public static String getMimeTypeFromUrl(String url) {
final MimeTypeMap mimeMap = MimeTypeMap.getSingleton();
final String extension = MimeTypeMap.getFileExtensionFromUrl(url);
return mimeMap.getMimeTypeFromExtension(extension);
}
public static final Intent getContentByType(String mimeType) {
final Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
intent.setType(mimeType);
return intent;
}
}
## Instruction:
Clear activity on pick image intents.
## Code After:
package org.beryl.intents;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.net.Uri;
import android.webkit.MimeTypeMap;
public class IntentHelper {
public static boolean canHandleIntent(final Context context, final Intent intent) {
if (intent != null) {
final PackageManager pm = context.getPackageManager();
if (pm != null) {
if (pm.queryIntentActivities(intent, 0).size() > 0) {
return true;
}
}
}
return false;
}
public static String getMimeTypeFromUrl(Uri url) {
return getMimeTypeFromUrl(url.getPath());
}
public static String getMimeTypeFromUrl(String url) {
final MimeTypeMap mimeMap = MimeTypeMap.getSingleton();
final String extension = MimeTypeMap.getFileExtensionFromUrl(url);
return mimeMap.getMimeTypeFromExtension(extension);
}
public static final Intent getContentByType(String mimeType) {
final Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
intent.setType(mimeType);
intent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_WHEN_TASK_RESET);
return intent;
}
}
| package org.beryl.intents;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.net.Uri;
import android.webkit.MimeTypeMap;
public class IntentHelper {
public static boolean canHandleIntent(final Context context, final Intent intent) {
if (intent != null) {
final PackageManager pm = context.getPackageManager();
if (pm != null) {
if (pm.queryIntentActivities(intent, 0).size() > 0) {
return true;
}
}
}
return false;
}
public static String getMimeTypeFromUrl(Uri url) {
return getMimeTypeFromUrl(url.getPath());
}
public static String getMimeTypeFromUrl(String url) {
final MimeTypeMap mimeMap = MimeTypeMap.getSingleton();
final String extension = MimeTypeMap.getFileExtensionFromUrl(url);
return mimeMap.getMimeTypeFromExtension(extension);
}
public static final Intent getContentByType(String mimeType) {
final Intent intent = new Intent(Intent.ACTION_GET_CONTENT);
intent.setType(mimeType);
+ intent.setFlags(Intent.FLAG_ACTIVITY_CLEAR_WHEN_TASK_RESET);
return intent;
}
} | 1 | 0.025 | 1 | 0 |
a9a9f75146074b41cf6b73de1ef19037521bee56 | src/bindings/javascript/logger.cpp | src/bindings/javascript/logger.cpp |
using namespace emscripten;
EMSCRIPTEN_BINDINGS(libcellml_logger) {
class_<libcellml::Logger>("Logger")
.function("addIssue", &libcellml::Logger::addIssue)
.function("removeAllIssues", &libcellml::Logger::removeAllIssues)
.function("error", &libcellml::Logger::error)
.function("warning", &libcellml::Logger::warning)
.function("hint", &libcellml::Logger::hint)
;
}
|
using namespace emscripten;
EMSCRIPTEN_BINDINGS(libcellml_logger) {
class_<libcellml::Logger>("Logger")
.function("addIssue", &libcellml::Logger::addIssue)
.function("removeAllIssues", &libcellml::Logger::removeAllIssues)
.function("error", &libcellml::Logger::error)
.function("warning", &libcellml::Logger::warning)
.function("hint", &libcellml::Logger::hint)
.function("issueCount", &libcellml::Logger::issueCount)
.function("errorCount", &libcellml::Logger::errorCount)
.function("warningCount", &libcellml::Logger::warningCount)
.function("hintCount", &libcellml::Logger::hintCount)
;
}
| Add missing API from Javascript bindings Logger class. | Add missing API from Javascript bindings Logger class.
| C++ | apache-2.0 | hsorby/libcellml,cellml/libcellml,nickerso/libcellml,cellml/libcellml,cellml/libcellml,hsorby/libcellml,nickerso/libcellml,nickerso/libcellml,nickerso/libcellml,hsorby/libcellml,hsorby/libcellml,cellml/libcellml | c++ | ## Code Before:
using namespace emscripten;
EMSCRIPTEN_BINDINGS(libcellml_logger) {
class_<libcellml::Logger>("Logger")
.function("addIssue", &libcellml::Logger::addIssue)
.function("removeAllIssues", &libcellml::Logger::removeAllIssues)
.function("error", &libcellml::Logger::error)
.function("warning", &libcellml::Logger::warning)
.function("hint", &libcellml::Logger::hint)
;
}
## Instruction:
Add missing API from Javascript bindings Logger class.
## Code After:
using namespace emscripten;
EMSCRIPTEN_BINDINGS(libcellml_logger) {
class_<libcellml::Logger>("Logger")
.function("addIssue", &libcellml::Logger::addIssue)
.function("removeAllIssues", &libcellml::Logger::removeAllIssues)
.function("error", &libcellml::Logger::error)
.function("warning", &libcellml::Logger::warning)
.function("hint", &libcellml::Logger::hint)
.function("issueCount", &libcellml::Logger::issueCount)
.function("errorCount", &libcellml::Logger::errorCount)
.function("warningCount", &libcellml::Logger::warningCount)
.function("hintCount", &libcellml::Logger::hintCount)
;
}
|
using namespace emscripten;
EMSCRIPTEN_BINDINGS(libcellml_logger) {
class_<libcellml::Logger>("Logger")
.function("addIssue", &libcellml::Logger::addIssue)
.function("removeAllIssues", &libcellml::Logger::removeAllIssues)
.function("error", &libcellml::Logger::error)
.function("warning", &libcellml::Logger::warning)
.function("hint", &libcellml::Logger::hint)
+ .function("issueCount", &libcellml::Logger::issueCount)
+ .function("errorCount", &libcellml::Logger::errorCount)
+ .function("warningCount", &libcellml::Logger::warningCount)
+ .function("hintCount", &libcellml::Logger::hintCount)
;
} | 4 | 0.285714 | 4 | 0 |
12cb8fe299278af20f4a4e380e4a58624c41a778 | snippets/language-restructuredtext.cson | snippets/language-restructuredtext.cson | '.text.restructuredtext':
'image':
'prefix': 'image'
'body': '.. image:: ${1:path}\n$0'
'link':
'prefix': 'link'
'body': '\\`${1:Title} <${2:http://link}>\\`_$0'
'section 1':
'prefix': 'sec'
'body': '${1:subsection name}\n================$0\n'
'section 2':
'prefix': 'subs'
'body': '${1:subsection name}\n****************$0\n'
'section 3':
'prefix': 'sss'
'body': '${1:subsection name}\n----------------$0\n'
| '.text.restructuredtext':
'image':
'prefix': 'image'
'body': '.. image:: ${1:path}\n$0'
'link':
'prefix': 'link'
'body': '\\`${1:Title} <${2:http://link}>\\`_$0'
'section 1':
'prefix': 'sec'
'body': '${1:subsection name}\n================$0\n'
'section 2':
'prefix': 'subs'
'body': '${1:subsection name}\n****************$0\n'
'section 3':
'prefix': 'sss'
'body': '${1:subsection name}\n----------------$0\n'
'code block':
'prefix': 'code'
'body': '.. code-block::\n\n\t$1'
| Add snippet for inserting a vanilla code-block | Add snippet for inserting a vanilla code-block
| CoffeeScript | mit | Lukasa/language-restructuredtext | coffeescript | ## Code Before:
'.text.restructuredtext':
'image':
'prefix': 'image'
'body': '.. image:: ${1:path}\n$0'
'link':
'prefix': 'link'
'body': '\\`${1:Title} <${2:http://link}>\\`_$0'
'section 1':
'prefix': 'sec'
'body': '${1:subsection name}\n================$0\n'
'section 2':
'prefix': 'subs'
'body': '${1:subsection name}\n****************$0\n'
'section 3':
'prefix': 'sss'
'body': '${1:subsection name}\n----------------$0\n'
## Instruction:
Add snippet for inserting a vanilla code-block
## Code After:
'.text.restructuredtext':
'image':
'prefix': 'image'
'body': '.. image:: ${1:path}\n$0'
'link':
'prefix': 'link'
'body': '\\`${1:Title} <${2:http://link}>\\`_$0'
'section 1':
'prefix': 'sec'
'body': '${1:subsection name}\n================$0\n'
'section 2':
'prefix': 'subs'
'body': '${1:subsection name}\n****************$0\n'
'section 3':
'prefix': 'sss'
'body': '${1:subsection name}\n----------------$0\n'
'code block':
'prefix': 'code'
'body': '.. code-block::\n\n\t$1'
| '.text.restructuredtext':
'image':
'prefix': 'image'
'body': '.. image:: ${1:path}\n$0'
'link':
'prefix': 'link'
'body': '\\`${1:Title} <${2:http://link}>\\`_$0'
'section 1':
'prefix': 'sec'
'body': '${1:subsection name}\n================$0\n'
'section 2':
'prefix': 'subs'
'body': '${1:subsection name}\n****************$0\n'
'section 3':
'prefix': 'sss'
'body': '${1:subsection name}\n----------------$0\n'
+ 'code block':
+ 'prefix': 'code'
+ 'body': '.. code-block::\n\n\t$1' | 3 | 0.1875 | 3 | 0 |
007f65f02c6f6ddc07b1fb9a97f398c3db5d8c1f | src/components/FilterSuggester/FilterSuggester.jsx | src/components/FilterSuggester/FilterSuggester.jsx | const React = require('react')
class FilterSuggester extends React.Component {
onChange(event) {
event.target.blur()
if (this.props.onChange) {
this.props.onChange(event)
}
}
render() {
const inputClass = this.props.className || ''
const value = this.props.value || ''
return (
<div>
<input
type="text"
className={inputClass}
name="filterValue"
value={value}
onChange={e => this.onChange(e)}
placeholder="e.g., team:org/team-name is:open sort:updated-desc"
/>
</div>
)
}
}
FilterSuggester.propTypes = {
className: React.PropTypes.string,
onChange: React.PropTypes.func,
value: React.PropTypes.string,
}
module.exports = FilterSuggester
| const React = require('react')
class FilterSuggester extends React.Component {
constructor(props) {
super(props)
this.state = { value: props.value || '' }
}
onChange(event) {
this.setState({ value: event.target.value })
if (this.props.onChange) {
this.props.onChange(event)
}
}
render() {
const inputClass = this.props.className || ''
return (
<div>
<input
type="text"
className={inputClass}
name="filterValue"
value={this.state.value}
onChange={e => this.onChange(e)}
placeholder="e.g., team:org/team-name is:open sort:updated-desc"
/>
</div>
)
}
}
FilterSuggester.propTypes = {
className: React.PropTypes.string,
onChange: React.PropTypes.func,
value: React.PropTypes.string,
}
module.exports = FilterSuggester
| Fix not being able to type in 'new filter' query field | Fix not being able to type in 'new filter' query field
| JSX | mit | cheshire137/gh-notifications-snoozer,cheshire137/gh-notifications-snoozer | jsx | ## Code Before:
const React = require('react')
class FilterSuggester extends React.Component {
onChange(event) {
event.target.blur()
if (this.props.onChange) {
this.props.onChange(event)
}
}
render() {
const inputClass = this.props.className || ''
const value = this.props.value || ''
return (
<div>
<input
type="text"
className={inputClass}
name="filterValue"
value={value}
onChange={e => this.onChange(e)}
placeholder="e.g., team:org/team-name is:open sort:updated-desc"
/>
</div>
)
}
}
FilterSuggester.propTypes = {
className: React.PropTypes.string,
onChange: React.PropTypes.func,
value: React.PropTypes.string,
}
module.exports = FilterSuggester
## Instruction:
Fix not being able to type in 'new filter' query field
## Code After:
const React = require('react')
class FilterSuggester extends React.Component {
constructor(props) {
super(props)
this.state = { value: props.value || '' }
}
onChange(event) {
this.setState({ value: event.target.value })
if (this.props.onChange) {
this.props.onChange(event)
}
}
render() {
const inputClass = this.props.className || ''
return (
<div>
<input
type="text"
className={inputClass}
name="filterValue"
value={this.state.value}
onChange={e => this.onChange(e)}
placeholder="e.g., team:org/team-name is:open sort:updated-desc"
/>
</div>
)
}
}
FilterSuggester.propTypes = {
className: React.PropTypes.string,
onChange: React.PropTypes.func,
value: React.PropTypes.string,
}
module.exports = FilterSuggester
| const React = require('react')
class FilterSuggester extends React.Component {
+ constructor(props) {
+ super(props)
+ this.state = { value: props.value || '' }
+ }
+
onChange(event) {
- event.target.blur()
+ this.setState({ value: event.target.value })
if (this.props.onChange) {
this.props.onChange(event)
}
}
render() {
const inputClass = this.props.className || ''
- const value = this.props.value || ''
return (
<div>
<input
type="text"
className={inputClass}
name="filterValue"
- value={value}
+ value={this.state.value}
? +++++++++++
onChange={e => this.onChange(e)}
placeholder="e.g., team:org/team-name is:open sort:updated-desc"
/>
</div>
)
}
}
FilterSuggester.propTypes = {
className: React.PropTypes.string,
onChange: React.PropTypes.func,
value: React.PropTypes.string,
}
module.exports = FilterSuggester | 10 | 0.285714 | 7 | 3 |
bd290c8c9ff50c457c51387412037059a9c701f4 | .travis.yml | .travis.yml | language: go
go:
- 1.1
- 1.2
- 1.3
before_install:
- wget http://apache.mirror.nexicom.net/kafka/0.8.1.1/kafka_2.10-0.8.1.1.tgz -O kafka.tgz
- mkdir -p kafka && tar xzf kafka.tgz -C kafka --strip-components 1
- nohup bash -c "cd kafka && bin/zookeeper-server-start.sh config/zookeeper.properties &"
- sleep 5
- nohup bash -c "cd kafka && bin/kafka-server-start.sh config/server.properties &"
- sleep 5
- kafka/bin/kafka-topics.sh --create --partitions 1 --replication-factor 1 --topic single_partition --zookeeper localhost:2181
- kafka/bin/kafka-topics.sh --create --partitions 2 --replication-factor 1 --topic multi_partition --zookeeper localhost:2181
| language: go
go:
- 1.1
- 1.2
- 1.3
before_install:
- wget http://apache.mirror.nexicom.net/kafka/0.8.1.1/kafka_2.10-0.8.1.1.tgz -O kafka.tgz
- mkdir -p kafka && tar xzf kafka.tgz -C kafka --strip-components 1
- nohup bash -c "cd kafka && bin/zookeeper-server-start.sh config/zookeeper.properties &"
- sleep 5
- nohup bash -c "cd kafka && bin/kafka-server-start.sh config/server.properties &"
- sleep 5
- kafka/bin/kafka-topics.sh --create --partitions 1 --replication-factor 1 --topic single_partition --zookeeper localhost:2181
- kafka/bin/kafka-topics.sh --create --partitions 2 --replication-factor 1 --topic multi_partition --zookeeper localhost:2181
notifications:
flowdock: 15e08f7ed3a8fd2d89ddb36435301c1a
| Add flowdock notifications to Travis | Add flowdock notifications to Travis
| YAML | mit | VividCortex/sarama,mailgun/sarama,DataDog/sarama,raisemarketplace/sarama,DataDog/sarama,jepst/sarama,winlinvip/sarama,aybabtme/sarama,remerge/sarama,VividCortex/sarama,crast/sarama,gyroscope-innovations/legacy_sarama,pronix/sarama,scalingdata/sarama,cep21/sarama,Shopify/sarama,cep21/sarama,uovobw/sarama,kmiku7/sarama,jimmykuo/sarama,winlinvip/sarama,crast/sarama,rafrombrc/sarama,vrischmann/sarama,kraveio/sarama,pronix/sarama,Neuw84/sarama,kmiku7/sarama,kraveio/sarama,LastManStanding/sarama,LastManStanding/sarama,rafrombrc/sarama,scalingdata/sarama,betable/sarama,Neuw84/sarama,raisemarketplace/sarama,yangfan876/sarama,HelixEducation/sarama,Shopify/sarama,vrischmann/sarama,betable/sarama,aybabtme/sarama,jepst/sarama,mailgun/sarama,remerge/sarama,HelixEducation/sarama,uovobw/sarama,jimmykuo/sarama,yangfan876/sarama,gyroscope-innovations/legacy_sarama | yaml | ## Code Before:
language: go
go:
- 1.1
- 1.2
- 1.3
before_install:
- wget http://apache.mirror.nexicom.net/kafka/0.8.1.1/kafka_2.10-0.8.1.1.tgz -O kafka.tgz
- mkdir -p kafka && tar xzf kafka.tgz -C kafka --strip-components 1
- nohup bash -c "cd kafka && bin/zookeeper-server-start.sh config/zookeeper.properties &"
- sleep 5
- nohup bash -c "cd kafka && bin/kafka-server-start.sh config/server.properties &"
- sleep 5
- kafka/bin/kafka-topics.sh --create --partitions 1 --replication-factor 1 --topic single_partition --zookeeper localhost:2181
- kafka/bin/kafka-topics.sh --create --partitions 2 --replication-factor 1 --topic multi_partition --zookeeper localhost:2181
## Instruction:
Add flowdock notifications to Travis
## Code After:
language: go
go:
- 1.1
- 1.2
- 1.3
before_install:
- wget http://apache.mirror.nexicom.net/kafka/0.8.1.1/kafka_2.10-0.8.1.1.tgz -O kafka.tgz
- mkdir -p kafka && tar xzf kafka.tgz -C kafka --strip-components 1
- nohup bash -c "cd kafka && bin/zookeeper-server-start.sh config/zookeeper.properties &"
- sleep 5
- nohup bash -c "cd kafka && bin/kafka-server-start.sh config/server.properties &"
- sleep 5
- kafka/bin/kafka-topics.sh --create --partitions 1 --replication-factor 1 --topic single_partition --zookeeper localhost:2181
- kafka/bin/kafka-topics.sh --create --partitions 2 --replication-factor 1 --topic multi_partition --zookeeper localhost:2181
notifications:
flowdock: 15e08f7ed3a8fd2d89ddb36435301c1a
| language: go
go:
- 1.1
- 1.2
- 1.3
before_install:
- wget http://apache.mirror.nexicom.net/kafka/0.8.1.1/kafka_2.10-0.8.1.1.tgz -O kafka.tgz
- mkdir -p kafka && tar xzf kafka.tgz -C kafka --strip-components 1
- nohup bash -c "cd kafka && bin/zookeeper-server-start.sh config/zookeeper.properties &"
- sleep 5
- nohup bash -c "cd kafka && bin/kafka-server-start.sh config/server.properties &"
- sleep 5
- kafka/bin/kafka-topics.sh --create --partitions 1 --replication-factor 1 --topic single_partition --zookeeper localhost:2181
- kafka/bin/kafka-topics.sh --create --partitions 2 --replication-factor 1 --topic multi_partition --zookeeper localhost:2181
+
+ notifications:
+ flowdock: 15e08f7ed3a8fd2d89ddb36435301c1a | 3 | 0.2 | 3 | 0 |
8570084eb8c7f594697955bb01db89877938f74e | FitNesseRoot/FitNesse/UserGuide/CommandLineOption/content.txt | FitNesseRoot/FitNesse/UserGuide/CommandLineOption/content.txt | There are several commands you can run to control the FitNesse wiki, as shown below. (The values shown between braces are the default values.)
{{{ java -jar fitnesse-standalone.jar [-pdrleoa]
-p <port number> {80} or {9123 if -c}
-d <working directory> {.}
-r <page root directory> {FitNesseRoot}
-l <log directory> {no logging}
-e <days> {14} Number of days before page versions expire
-o omit updates
-a {user:pwd | user-file-name} enable authentication.
-i Install only, do not run fitnesse after install.
-c <command> Run a Rest Command and then exit.
(Return status is the number of test pages that failed as a result of the command.)
}}}
You can find more information on the -c options at CommandLineRestCommands.
If you are using SliM, you can also pass in -Dslim.port to hard code the port that the SliM server uses. This can be used in conjunction with Jenkins Port Allocator Plugin or the reserver-network-port mojo of the Maven build-helper plugin.
| There are several commands you can run to control the FitNesse wiki, as shown below. (The values shown between braces are the default values.)
{{{ java -jar fitnesse-standalone.jar [-pdrleoa]
-p <port number> {80} or {9123 if -c}
-d <working directory> {.}
-r <page root directory> {FitNesseRoot}
-l <log directory> {no logging}
-e <days> {14} Number of days before page versions expire
-o omit updates
-a {user:pwd | user-file-name} enable authentication.
-i Install only, do not run fitnesse after install.
-c <command> Run a Rest Command and then exit.
(Return status is the number of test pages that failed as a result of the command.)
-b <filename> redirect command output.
}}}
You can find more information on the -c options at CommandLineRestCommands.
If you are using SliM, you can also pass in -Dslim.port to hard code the port that the SliM server uses. This can be used in conjunction with Jenkins Port Allocator Plugin or the reserver-network-port mojo of the Maven build-helper plugin.
| Add -b output redirection flag to the documentation. | Add -b output redirection flag to the documentation.
| Text | epl-1.0 | rbevers/fitnesse,amolenaar/fitnesse,amolenaar/fitnesse,jdufner/fitnesse,rbevers/fitnesse,jdufner/fitnesse,hansjoachim/fitnesse,hansjoachim/fitnesse,jdufner/fitnesse,hansjoachim/fitnesse,rbevers/fitnesse,amolenaar/fitnesse | text | ## Code Before:
There are several commands you can run to control the FitNesse wiki, as shown below. (The values shown between braces are the default values.)
{{{ java -jar fitnesse-standalone.jar [-pdrleoa]
-p <port number> {80} or {9123 if -c}
-d <working directory> {.}
-r <page root directory> {FitNesseRoot}
-l <log directory> {no logging}
-e <days> {14} Number of days before page versions expire
-o omit updates
-a {user:pwd | user-file-name} enable authentication.
-i Install only, do not run fitnesse after install.
-c <command> Run a Rest Command and then exit.
(Return status is the number of test pages that failed as a result of the command.)
}}}
You can find more information on the -c options at CommandLineRestCommands.
If you are using SliM, you can also pass in -Dslim.port to hard code the port that the SliM server uses. This can be used in conjunction with Jenkins Port Allocator Plugin or the reserver-network-port mojo of the Maven build-helper plugin.
## Instruction:
Add -b output redirection flag to the documentation.
## Code After:
There are several commands you can run to control the FitNesse wiki, as shown below. (The values shown between braces are the default values.)
{{{ java -jar fitnesse-standalone.jar [-pdrleoa]
-p <port number> {80} or {9123 if -c}
-d <working directory> {.}
-r <page root directory> {FitNesseRoot}
-l <log directory> {no logging}
-e <days> {14} Number of days before page versions expire
-o omit updates
-a {user:pwd | user-file-name} enable authentication.
-i Install only, do not run fitnesse after install.
-c <command> Run a Rest Command and then exit.
(Return status is the number of test pages that failed as a result of the command.)
-b <filename> redirect command output.
}}}
You can find more information on the -c options at CommandLineRestCommands.
If you are using SliM, you can also pass in -Dslim.port to hard code the port that the SliM server uses. This can be used in conjunction with Jenkins Port Allocator Plugin or the reserver-network-port mojo of the Maven build-helper plugin.
| There are several commands you can run to control the FitNesse wiki, as shown below. (The values shown between braces are the default values.)
{{{ java -jar fitnesse-standalone.jar [-pdrleoa]
-p <port number> {80} or {9123 if -c}
-d <working directory> {.}
-r <page root directory> {FitNesseRoot}
-l <log directory> {no logging}
-e <days> {14} Number of days before page versions expire
-o omit updates
-a {user:pwd | user-file-name} enable authentication.
-i Install only, do not run fitnesse after install.
-c <command> Run a Rest Command and then exit.
(Return status is the number of test pages that failed as a result of the command.)
+ -b <filename> redirect command output.
}}}
You can find more information on the -c options at CommandLineRestCommands.
If you are using SliM, you can also pass in -Dslim.port to hard code the port that the SliM server uses. This can be used in conjunction with Jenkins Port Allocator Plugin or the reserver-network-port mojo of the Maven build-helper plugin.
| 1 | 0.052632 | 1 | 0 |
6f3cb0f47bda684708a58e33fd6222267f5f09a5 | README.md | README.md |
This is for labeling issues and PR's according to the library files they modify.
Put a Github api-key in your home directory.
```console
$ docker build -t go-label .
$ docker run -it --rm go-label go-wrapper run "$(cat ~/.github)"
```
|
This is for labeling issues and PR's according to the library files they modify.
Put a Github api-key in your home directory.
```console
$ docker build -t official-images-issue-labeler .
$ docker run -it --rm official-images-issue-labeler --help
Usage:
app [OPTIONS]
Application Options:
--token=deadbeef GitHub API access token
--owner=docker-library
--repo=official-images
--state=[open|closed|all]
Help Options:
-h, --help Show this help message
$ docker run -it --rm official-images-issue-labeler --token "$(cat ~/.github)"
```
| Update readme with usage and new travis badge | Update readme with usage and new travis badge
| Markdown | mit | yosifkit/official-images-issue-labeler | markdown | ## Code Before:
This is for labeling issues and PR's according to the library files they modify.
Put a Github api-key in your home directory.
```console
$ docker build -t go-label .
$ docker run -it --rm go-label go-wrapper run "$(cat ~/.github)"
```
## Instruction:
Update readme with usage and new travis badge
## Code After:
This is for labeling issues and PR's according to the library files they modify.
Put a Github api-key in your home directory.
```console
$ docker build -t official-images-issue-labeler .
$ docker run -it --rm official-images-issue-labeler --help
Usage:
app [OPTIONS]
Application Options:
--token=deadbeef GitHub API access token
--owner=docker-library
--repo=official-images
--state=[open|closed|all]
Help Options:
-h, --help Show this help message
$ docker run -it --rm official-images-issue-labeler --token "$(cat ~/.github)"
```
|
This is for labeling issues and PR's according to the library files they modify.
Put a Github api-key in your home directory.
```console
- $ docker build -t go-label .
- $ docker run -it --rm go-label go-wrapper run "$(cat ~/.github)"
+ $ docker build -t official-images-issue-labeler .
+ $ docker run -it --rm official-images-issue-labeler --help
+ Usage:
+ app [OPTIONS]
+
+ Application Options:
+ --token=deadbeef GitHub API access token
+ --owner=docker-library
+ --repo=official-images
+ --state=[open|closed|all]
+
+ Help Options:
+ -h, --help Show this help message
+
+ $ docker run -it --rm official-images-issue-labeler --token "$(cat ~/.github)"
``` | 17 | 1.888889 | 15 | 2 |
21a78a8a88b315f484dab76fec207ede9bae4579 | lib/geocoder/results/yandex.rb | lib/geocoder/results/yandex.rb | require 'geocoder/results/base'
module Geocoder::Result
class Yandex < Base
def coordinates
@data['GeoObject']['Point']['pos'].split(' ').reverse.map(&:to_f)
end
def address(format = :full)
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['text']
end
def city
if state.empty?
address_details['Locality']['LocalityName']
elsif sub_state.empty?
address_details['AdministrativeArea']['Locality']['LocalityName']
else
address_details['AdministrativeArea']['SubAdministrativeArea']['Locality']['LocalityName']
end
end
def country
address_details['CountryName']
end
def country_code
address_details['CountryNameCode']
end
def state
if address_details['AdministrativeArea']
address_details['AdministrativeArea']['AdministrativeAreaName']
else
""
end
end
def sub_state
if !state.empty? and address_details['AdministrativeArea']['SubAdministrativeArea']
address_details['AdministrativeArea']['SubAdministrativeArea']['SubAdministrativeAreaName']
else
""
end
end
def state_code
""
end
def postal_code
""
end
def premise_name
address_details['Locality']['Premise']['PremiseName']
end
private # ----------------------------------------------------------------
def address_details
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['AddressDetails']['Country']
end
end
end
| require 'geocoder/results/base'
module Geocoder::Result
class Yandex < Base
def coordinates
@data['GeoObject']['Point']['pos'].split(' ').reverse.map(&:to_f)
end
def address(format = :full)
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['text']
end
def city
if state.empty?
address_details['Locality']['LocalityName']
elsif sub_state.empty?
address_details['AdministrativeArea']['Locality']['LocalityName']
else
address_details['AdministrativeArea']['SubAdministrativeArea']['Locality']['LocalityName']
end
end
def country
address_details['CountryName']
end
def country_code
address_details['CountryNameCode']
end
def state
if address_details['AdministrativeArea']
address_details['AdministrativeArea']['AdministrativeAreaName']
else
""
end
end
def sub_state
if !state.empty? and address_details['AdministrativeArea']['SubAdministrativeArea']
address_details['AdministrativeArea']['SubAdministrativeArea']['SubAdministrativeAreaName']
else
""
end
end
def state_code
""
end
def postal_code
""
end
def premise_name
address_details['Locality']['Premise']['PremiseName']
end
def precision
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['precision']
end
private # ----------------------------------------------------------------
def address_details
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['AddressDetails']['Country']
end
end
end
| Add precision method to Geocoder::Result::Yandex. | Add precision method to Geocoder::Result::Yandex.
| Ruby | mit | brian-ewell/geocoder,jaigouk/geocoder-olleh,andreamazz/geocoder,sjaveed/geocoder,malandrina/geocoder,sr-education/geocoder,AndyDangerous/geocoder,TrangPham/geocoder,boone/geocoder,rayzeller/geocoder,rdlugosz/geocoder,jaigouk/geocoder-olleh,FanaHOVA/geocoder,alexreisner/geocoder,Devetzis/geocoder,aceunreal/geocoder,crdunwel/geocoder,mveer99/geocoder,donbobka/geocoder,epyatopal/geocoder,gogovan/geocoder-olleh,nerdyglasses/geocoder,mehlah/geocoder,moofish32/geocoder,gogovan/geocoder-olleh,jgrciv/geocoder,guavapass/geocoder,3mundi/geocoder,ankane/geocoder,tiramizoo/geocoder,rios1993/geocoder,ivanzotov/geocoder,eugenehp/ruby-geocoder,StudioMelipone/geocoder,sideci-sample/sideci-sample-geocoder,mdebo/geocoder,tortuba/geocoder,davydovanton/geocoder,GermanDZ/geocoder | ruby | ## Code Before:
require 'geocoder/results/base'
module Geocoder::Result
class Yandex < Base
def coordinates
@data['GeoObject']['Point']['pos'].split(' ').reverse.map(&:to_f)
end
def address(format = :full)
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['text']
end
def city
if state.empty?
address_details['Locality']['LocalityName']
elsif sub_state.empty?
address_details['AdministrativeArea']['Locality']['LocalityName']
else
address_details['AdministrativeArea']['SubAdministrativeArea']['Locality']['LocalityName']
end
end
def country
address_details['CountryName']
end
def country_code
address_details['CountryNameCode']
end
def state
if address_details['AdministrativeArea']
address_details['AdministrativeArea']['AdministrativeAreaName']
else
""
end
end
def sub_state
if !state.empty? and address_details['AdministrativeArea']['SubAdministrativeArea']
address_details['AdministrativeArea']['SubAdministrativeArea']['SubAdministrativeAreaName']
else
""
end
end
def state_code
""
end
def postal_code
""
end
def premise_name
address_details['Locality']['Premise']['PremiseName']
end
private # ----------------------------------------------------------------
def address_details
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['AddressDetails']['Country']
end
end
end
## Instruction:
Add precision method to Geocoder::Result::Yandex.
## Code After:
require 'geocoder/results/base'
module Geocoder::Result
class Yandex < Base
def coordinates
@data['GeoObject']['Point']['pos'].split(' ').reverse.map(&:to_f)
end
def address(format = :full)
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['text']
end
def city
if state.empty?
address_details['Locality']['LocalityName']
elsif sub_state.empty?
address_details['AdministrativeArea']['Locality']['LocalityName']
else
address_details['AdministrativeArea']['SubAdministrativeArea']['Locality']['LocalityName']
end
end
def country
address_details['CountryName']
end
def country_code
address_details['CountryNameCode']
end
def state
if address_details['AdministrativeArea']
address_details['AdministrativeArea']['AdministrativeAreaName']
else
""
end
end
def sub_state
if !state.empty? and address_details['AdministrativeArea']['SubAdministrativeArea']
address_details['AdministrativeArea']['SubAdministrativeArea']['SubAdministrativeAreaName']
else
""
end
end
def state_code
""
end
def postal_code
""
end
def premise_name
address_details['Locality']['Premise']['PremiseName']
end
def precision
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['precision']
end
private # ----------------------------------------------------------------
def address_details
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['AddressDetails']['Country']
end
end
end
| require 'geocoder/results/base'
module Geocoder::Result
class Yandex < Base
def coordinates
@data['GeoObject']['Point']['pos'].split(' ').reverse.map(&:to_f)
end
def address(format = :full)
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['text']
end
def city
if state.empty?
address_details['Locality']['LocalityName']
elsif sub_state.empty?
address_details['AdministrativeArea']['Locality']['LocalityName']
else
address_details['AdministrativeArea']['SubAdministrativeArea']['Locality']['LocalityName']
end
end
def country
address_details['CountryName']
end
def country_code
address_details['CountryNameCode']
end
def state
if address_details['AdministrativeArea']
address_details['AdministrativeArea']['AdministrativeAreaName']
else
""
end
end
def sub_state
if !state.empty? and address_details['AdministrativeArea']['SubAdministrativeArea']
address_details['AdministrativeArea']['SubAdministrativeArea']['SubAdministrativeAreaName']
else
""
end
end
def state_code
""
end
def postal_code
""
end
def premise_name
address_details['Locality']['Premise']['PremiseName']
end
+ def precision
+ @data['GeoObject']['metaDataProperty']['GeocoderMetaData']['precision']
+ end
+
private # ----------------------------------------------------------------
def address_details
@data['GeoObject']['metaDataProperty']['GeocoderMetaData']['AddressDetails']['Country']
end
end
end | 4 | 0.060606 | 4 | 0 |
42a38b012ae01477845ef926d4620b12f0a418f6 | prime/index.test.js | prime/index.test.js | /* eslint-env node, jest */
const shogi = require('./index.js');
const Slack = require('../lib/slackMock.js');
jest.mock('../achievements/index.ts');
let slack = null;
beforeEach(() => {
slack = new Slack();
process.env.CHANNEL_SANDBOX = slack.fakeChannel;
shogi(slack);
});
describe('shogi', () => {
it('responds to "素数大富豪"', async () => {
const {username, text} = await slack.getResponseTo('素数大富豪');
expect(username).toBe('primebot');
expect(text).toContain('手札');
});
});
| /* eslint-env node, jest */
jest.mock('../achievements/index.ts');
const shogi = require('./index.js');
const Slack = require('../lib/slackMock.js');
let slack = null;
beforeEach(() => {
slack = new Slack();
process.env.CHANNEL_SANDBOX = slack.fakeChannel;
shogi(slack);
});
describe('shogi', () => {
it('responds to "素数大富豪"', async () => {
const {username, text} = await slack.getResponseTo('素数大富豪');
expect(username).toBe('primebot');
expect(text).toContain('手札');
});
});
 | Fix order of invocation of achievement mocking | Fix order of invocation of achievement mocking
| JavaScript | mit | tsg-ut/slackbot,tsg-ut/slackbot,tsg-ut/slackbot,tsg-ut/slackbot,tsg-ut/slackbot,tsg-ut/slackbot | javascript | ## Code Before:
/* eslint-env node, jest */
const shogi = require('./index.js');
const Slack = require('../lib/slackMock.js');
jest.mock('../achievements/index.ts');
let slack = null;
beforeEach(() => {
slack = new Slack();
process.env.CHANNEL_SANDBOX = slack.fakeChannel;
shogi(slack);
});
describe('shogi', () => {
it('responds to "素数大富豪"', async () => {
const {username, text} = await slack.getResponseTo('素数大富豪');
expect(username).toBe('primebot');
expect(text).toContain('手札');
});
});
## Instruction:
Fix order of invocation of achievement mocking
## Code After:
/* eslint-env node, jest */
jest.mock('../achievements/index.ts');
const shogi = require('./index.js');
const Slack = require('../lib/slackMock.js');
let slack = null;
beforeEach(() => {
slack = new Slack();
process.env.CHANNEL_SANDBOX = slack.fakeChannel;
shogi(slack);
});
describe('shogi', () => {
it('responds to "素数大富豪"', async () => {
const {username, text} = await slack.getResponseTo('素数大富豪');
expect(username).toBe('primebot');
expect(text).toContain('手札');
});
});
| /* eslint-env node, jest */
+
+ jest.mock('../achievements/index.ts');
const shogi = require('./index.js');
const Slack = require('../lib/slackMock.js');
-
- jest.mock('../achievements/index.ts');
let slack = null;
beforeEach(() => {
slack = new Slack();
process.env.CHANNEL_SANDBOX = slack.fakeChannel;
shogi(slack);
});
describe('shogi', () => {
it('responds to "素数大富豪"', async () => {
const {username, text} = await slack.getResponseTo('素数大富豪');
expect(username).toBe('primebot');
expect(text).toContain('手札');
});
}); | 4 | 0.173913 | 2 | 2 |
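The four numeric fields that close each record (here `4 | 0.173913 | 2 | 2`) are consistent with a line-level diff of the before/after pair: changed-line count, that count divided by the old file's line count, lines added, lines deleted. A minimal sketch of that derivation using Python's `difflib` — the field semantics are an assumption, and the toy strings below are hypothetical:

```python
import difflib

def diff_stats(old: str, new: str):
    """Recompute the record-closing stats under the assumed semantics:
    diff_length = added + deleted, relative = diff_length / old line count."""
    old_lines = old.splitlines(keepends=True)
    new_lines = new.splitlines(keepends=True)
    ndiff = list(difflib.ndiff(old_lines, new_lines))
    added = sum(1 for ln in ndiff if ln.startswith('+ '))
    deleted = sum(1 for ln in ndiff if ln.startswith('- '))
    return added + deleted, (added + deleted) / len(old_lines), added, deleted

# Toy before/after mirroring the change above (a mock line moved up):
before = "a\nmock\nb\n"
after = "mock\na\nb\n"
print(diff_stats(before, after))  # (2, 0.666..., 1, 1)
```

Applied to the record above, the same arithmetic reproduces the stated ratio: 4 changed lines over a 23-line original file gives 4 / 23 ≈ 0.173913.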
beaa06471602627bc9f9602ef84208c24223add3 | metadata/com.renard.ocr.txt | metadata/com.renard.ocr.txt | Categories:Office,Multimedia
License:Apache2
Web Site:https://github.com/renard314/textfairy
Source Code:https://github.com/renard314/textfairy
Issue Tracker:https://github.com/renard314/textfairy/issues
Auto Name:Text Fairy
Summary:Android OCR App
Description:
An Android OCR App based on Tesseract.
Features:
* convert images to pdf
* recognize text in images
* basic document management (delete/edit/join documents, view toc)
.
Repo Type:git
Repo:https://github.com/renard314/textfairy.git
Build:1.1.12,17
disable=various gradle problems
commit=752a45ace49825e852b0a12efd5774e863448974
submodules=yes
subdir=textfairy
gradle=yes
prebuild=sed -i 's@ndkDir = .*@ndkDir = $$NDK$$@' ../gradle.properties
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:1.1.12
Current Version Code:17
| Categories:Office,Multimedia
License:Apache2
Web Site:https://github.com/renard314/textfairy
Source Code:https://github.com/renard314/textfairy
Issue Tracker:https://github.com/renard314/textfairy/issues
Auto Name:Text Fairy
Summary:Android OCR App
Description:
An Android OCR App based on Tesseract.
Features:
* convert images to pdf
* recognize text in images
* basic document management (delete/edit/join documents, view toc)
.
Repo Type:git
Repo:https://github.com/renard314/textfairy.git
Build:1.1.12,17
disable=various gradle problems
commit=752a45ace49825e852b0a12efd5774e863448974
subdir=textfairy
submodules=yes
gradle=yes
prebuild=sed -i 's@ndkDir = .*@ndkDir = $$NDK$$@' ../gradle.properties
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:1.1.19
Current Version Code:24
| Update CV of Text Fairy to 1.1.19 (24) | Update CV of Text Fairy to 1.1.19 (24)
| Text | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroid-data,f-droid/fdroiddata | text | ## Code Before:
Categories:Office,Multimedia
License:Apache2
Web Site:https://github.com/renard314/textfairy
Source Code:https://github.com/renard314/textfairy
Issue Tracker:https://github.com/renard314/textfairy/issues
Auto Name:Text Fairy
Summary:Android OCR App
Description:
An Android OCR App based on Tesseract.
Features:
* convert images to pdf
* recognize text in images
* basic document management (delete/edit/join documents, view toc)
.
Repo Type:git
Repo:https://github.com/renard314/textfairy.git
Build:1.1.12,17
disable=various gradle problems
commit=752a45ace49825e852b0a12efd5774e863448974
submodules=yes
subdir=textfairy
gradle=yes
prebuild=sed -i 's@ndkDir = .*@ndkDir = $$NDK$$@' ../gradle.properties
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:1.1.12
Current Version Code:17
## Instruction:
Update CV of Text Fairy to 1.1.19 (24)
## Code After:
Categories:Office,Multimedia
License:Apache2
Web Site:https://github.com/renard314/textfairy
Source Code:https://github.com/renard314/textfairy
Issue Tracker:https://github.com/renard314/textfairy/issues
Auto Name:Text Fairy
Summary:Android OCR App
Description:
An Android OCR App based on Tesseract.
Features:
* convert images to pdf
* recognize text in images
* basic document management (delete/edit/join documents, view toc)
.
Repo Type:git
Repo:https://github.com/renard314/textfairy.git
Build:1.1.12,17
disable=various gradle problems
commit=752a45ace49825e852b0a12efd5774e863448974
subdir=textfairy
submodules=yes
gradle=yes
prebuild=sed -i 's@ndkDir = .*@ndkDir = $$NDK$$@' ../gradle.properties
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:1.1.19
Current Version Code:24
| Categories:Office,Multimedia
License:Apache2
Web Site:https://github.com/renard314/textfairy
Source Code:https://github.com/renard314/textfairy
Issue Tracker:https://github.com/renard314/textfairy/issues
Auto Name:Text Fairy
Summary:Android OCR App
Description:
An Android OCR App based on Tesseract.
Features:
* convert images to pdf
* recognize text in images
* basic document management (delete/edit/join documents, view toc)
.
Repo Type:git
Repo:https://github.com/renard314/textfairy.git
Build:1.1.12,17
disable=various gradle problems
commit=752a45ace49825e852b0a12efd5774e863448974
+ subdir=textfairy
submodules=yes
- subdir=textfairy
gradle=yes
prebuild=sed -i 's@ndkDir = .*@ndkDir = $$NDK$$@' ../gradle.properties
Auto Update Mode:None
Update Check Mode:RepoManifest
- Current Version:1.1.12
? ^
+ Current Version:1.1.19
? ^
- Current Version Code:17
? ^^
+ Current Version Code:24
? ^^
| 6 | 0.181818 | 3 | 3 |
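The `? ^` guide lines in the diff above are the intra-line change markers that Python's `difflib.ndiff` emits under modified lines. A small sketch reproducing the version-bump hunk, assuming (not confirmed by the dataset itself) that its diffs were produced this way:

```python
import difflib

# ndiff prints '- old', '+ new', and '? ' guide lines whose carets point
# at the characters that changed inside an otherwise-matching line.
hunk = list(difflib.ndiff(["Current Version:1.1.12\n"],
                          ["Current Version:1.1.19\n"]))
print("".join(hunk))
```

For this pair the carets sit under the final digit, exactly as in the record's `Current Version` lines.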
01bf8edd36862104b78044efb300cac8179352a3 | requirements.md | requirements.md |
1. enable transcription of latin handwriting
2. 2 main use cases:
a) marking rectangular or polygon parts of scans of the handwriting to be one kind of text: original handwriting, first, second or even third order annotation
b) annotating these handwritten parts using markdown after a certain review process to be specified further
3. user registration / login + logout etc.
|
1. enable transcription of latin handwriting
* datastructure example could be Omeka
* Gathering transcriptions from every citizen scientist, everyone has his own handle
* assuming sufficient transcription quality after 100 transcriptions for one piece of text
2. 2 main use cases
* marking rectangular or polygon parts of scans of the handwriting to be one kind of text: original handwriting, first, second or even third order annotation
* using drag and drop on the image, save as CTS-URN
* rectangular sometimes not optimal
* entire readable page in the background, enabling reporting of wrong annotated area of interest, option to mark another area of interest if the user wishes to further annotate a page
* requesting the user to mark single lines as areas of interest in the user manual
* annotating these handwritten parts using markdown after a certain review process to be specified further
* use markdown strikethrough
* offer compiled markdown on another pane
3. user registration / login + logout etc.
4. merging of different transcriptions for one part of text
* tba
#non-functional#
1. preferring small tasks for users over transcribing an entire page
2. intuitive web interface
| Update Requirements after meeting 10th June | Update Requirements after meeting 10th June
| Markdown | mit | runjak/TranscriptionDesk,runjak/TranscriptionDesk,runjak/TranscriptionDesk,runjak/TranscriptionDesk | markdown | ## Code Before:
1. enable transcription of latin handwriting
2. 2 main use cases:
a) marking rectangular or polygon parts of scans of the handwriting to be one kind of text: original handwriting, first, second or even third order annotation
b) annotating these handwritten parts using markdown after a certain review process to be specified further
3. user registration / login + logout etc.
## Instruction:
Update Requirements after meeting 10th June
## Code After:
1. enable transcription of latin handwriting
* datastructure example could be Omeka
* Gathering transcriptions from every citizen scientist, everyone has his own handle
* assuming sufficient transcription quality after 100 transcriptions for one piece of text
2. 2 main use cases
* marking rectangular or polygon parts of scans of the handwriting to be one kind of text: original handwriting, first, second or even third order annotation
* using drag and drop on the image, save as CTS-URN
* rectangular sometimes not optimal
* entire readable page in the background, enabling reporting of wrong annotated area of interest, option to mark another area of interest if the user wishes to further annotate a page
* requesting the user to mark single lines as areas of interest in the user manual
* annotating these handwritten parts using markdown after a certain review process to be specified further
* use markdown strikethrough
* offer compiled markdown on another pane
3. user registration / login + logout etc.
4. merging of different transcriptions for one part of text
* tba
#non-functional#
1. preferring small tasks for users over transcribing an entire page
2. intuitive web interface
|
1. enable transcription of latin handwriting
+
+ * datastructure example could be Omeka
+ * Gathering transcriptions from every citizen scientist, everyone has his own handle
+ * assuming sufficient transcription quality after 100 transcriptions for one piece of text
- 2. 2 main use cases:
? -
+ 2. 2 main use cases
+
- a) marking rectangular or polygon parts of scans of the handwriting to be one kind of text: original handwriting, first, second or even third order annotation
? ^^
+ * marking rectangular or polygon parts of scans of the handwriting to be one kind of text: original handwriting, first, second or even third order annotation
? ^
+ * using drag and drop on the image, save as CTS-URN
+ * rectangular sometimes not optimal
+ * entire readable page in the background, enabling reporting of wrong annotated area of interest, option to mark another area of interest if the user wishes to further annotate a page
+ * requesting the user to mark single lines as areas of interest in the user manual
- b) annotating these handwritten parts using markdown after a certain review process to be specified further
? ^^
+ * annotating these handwritten parts using markdown after a certain review process to be specified further
? ^
+ * use markdown strikethrough
+ * offer compiled markdown on another pane
3. user registration / login + logout etc.
+ 4. merging of different transcriptions for one part of text
+
+ * tba
+
+ #non-functional#
+
+ 1. preferring small tasks for users over transcribing an entire page
+ 2. intuitive web interface
+
+ | 27 | 3.375 | 24 | 3 |
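Note that the second stat can exceed 1: under the assumed definition it is changed lines divided by the old file's length, so a short file that mostly grows — as in this requirements rewrite — scores well above 1. A one-line check of the figures above (the old line count of 8 is inferred from the stated ratio, not given explicitly):

```python
# Stats from the record: 24 lines added, 3 deleted, old file ~8 lines long.
n_added, n_deleted, old_line_count = 24, 3, 8
diff_length = n_added + n_deleted
print(diff_length, diff_length / old_line_count)  # 27 3.375
```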
0812b5431df465e65d4dddfaf1c5485be0d8705e | Lib/Source/Calculator.cpp | Lib/Source/Calculator.cpp |
Calculator::Calculator()
{
ready = false;
}
Calculator::Calculator(int a, int b)
{
ready = false;
create(a, b);
}
Calculator::~Calculator()
{
if (valid())
{
destroy();
}
}
bool Calculator::create(int a, int b)
{
if (valid())
{
throw std::exception("Calculator object is initialised!");
}
this->a = a;
this->b = b;
ready = !ready;
return true;
}
bool Calculator::valid()
{
return ready;
}
void Calculator::destroy()
{
if (!valid())
{
throw std::exception("Calculator object is not initialised!");
}
a = b = 0;
ready = !ready;
}
int Calculator::sum()
{
return a + b;
}
int Calculator::mul()
{
return a * b;
}
int Calculator::diff()
{
return a - b;
}
int Calculator::div()
{
if (!a)
{
return 0;
}
return a / b;
}
|
Calculator::Calculator()
{
ready = false;
}
Calculator::Calculator(int a, int b)
{
ready = false;
create(a, b);
}
Calculator::~Calculator()
{
if (valid())
{
destroy();
}
}
bool Calculator::create(int a, int b)
{
if (valid())
{
throw std::runtime_error("Calculator object is initialised!");
}
this->a = a;
this->b = b;
ready = !ready;
return true;
}
bool Calculator::valid()
{
return ready;
}
void Calculator::destroy()
{
if (!valid())
{
throw std::runtime_error("Calculator object is not initialised!");
}
a = b = 0;
ready = !ready;
}
int Calculator::sum()
{
return a + b;
}
int Calculator::mul()
{
return a * b;
}
int Calculator::diff()
{
return a - b;
}
int Calculator::div()
{
if (!a)
{
return 0;
}
return a / b;
}
 | Fix exception name for Linux platform. | Fix exception name for Linux platform.
| C++ | mit | cristian-szabo/cmake-seed | c++ | ## Code Before:
Calculator::Calculator()
{
ready = false;
}
Calculator::Calculator(int a, int b)
{
ready = false;
create(a, b);
}
Calculator::~Calculator()
{
if (valid())
{
destroy();
}
}
bool Calculator::create(int a, int b)
{
if (valid())
{
throw std::exception("Calculator object is initialised!");
}
this->a = a;
this->b = b;
ready = !ready;
return true;
}
bool Calculator::valid()
{
return ready;
}
void Calculator::destroy()
{
if (!valid())
{
throw std::exception("Calculator object is not initialised!");
}
a = b = 0;
ready = !ready;
}
int Calculator::sum()
{
return a + b;
}
int Calculator::mul()
{
return a * b;
}
int Calculator::diff()
{
return a - b;
}
int Calculator::div()
{
if (!a)
{
return 0;
}
return a / b;
}
## Instruction:
Fix exception name for Linux platform.
## Code After:
Calculator::Calculator()
{
ready = false;
}
Calculator::Calculator(int a, int b)
{
ready = false;
create(a, b);
}
Calculator::~Calculator()
{
if (valid())
{
destroy();
}
}
bool Calculator::create(int a, int b)
{
if (valid())
{
throw std::runtime_error("Calculator object is initialised!");
}
this->a = a;
this->b = b;
ready = !ready;
return true;
}
bool Calculator::valid()
{
return ready;
}
void Calculator::destroy()
{
if (!valid())
{
throw std::runtime_error("Calculator object is not initialised!");
}
a = b = 0;
ready = !ready;
}
int Calculator::sum()
{
return a + b;
}
int Calculator::mul()
{
return a * b;
}
int Calculator::diff()
{
return a - b;
}
int Calculator::div()
{
if (!a)
{
return 0;
}
return a / b;
}
|
Calculator::Calculator()
{
ready = false;
}
Calculator::Calculator(int a, int b)
{
ready = false;
create(a, b);
}
Calculator::~Calculator()
{
if (valid())
{
destroy();
}
}
bool Calculator::create(int a, int b)
{
if (valid())
{
- throw std::exception("Calculator object is initialised!");
? ^^^^^ ^
+ throw std::runtime_error("Calculator object is initialised!");
? ^^^ ++++++ ^
}
this->a = a;
this->b = b;
ready = !ready;
return true;
}
bool Calculator::valid()
{
return ready;
}
void Calculator::destroy()
{
if (!valid())
{
- throw std::exception("Calculator object is not initialised!");
? ^^^^^ ^
+ throw std::runtime_error("Calculator object is not initialised!");
? ^^^ ++++++ ^
}
a = b = 0;
ready = !ready;
}
int Calculator::sum()
{
return a + b;
}
int Calculator::mul()
{
return a * b;
}
int Calculator::diff()
{
return a - b;
}
int Calculator::div()
{
if (!a)
{
return 0;
}
return a / b;
} | 4 | 0.051282 | 2 | 2 |
eb1313914daa9e48ebdaa1634cc9d477ab8c6687 | src/FileEncrypter/GpgFileEncrypter.php | src/FileEncrypter/GpgFileEncrypter.php | <?php
/*
* This file is part of the DB Backup utility.
*
* (c) Tom Adam <tomadam@instantiate.co.uk>
*
* This source file is subject to the MIT license that is bundled
* with this source code in the file LICENSE.
*/
namespace Instantiate\DatabaseBackup\FileEncrypter;
use Instantiate\DatabaseBackup\Util\Process;
class GpgFileEncrypter extends AbstractFileEncrypter
{
/**
* @param string $inputFile
* @param string $outputFile
*/
protected function encryptFile($inputFile, $outputFile)
{
Process::exec(
'gpg2 -v --encrypt {sign} --recipient {recipient} --batch --yes {passphrase} --output {output_file} {input_file}',
[
'{sign}' => $this->config['sign'] ? '--sign' : '',
'{recipient}' => $this->config['recipient'],
'{passphrase}' => $this->config['sign'] && $this->config['key_file']
? '--passphrase-file '.$this->config['key_file']
: '',
'{input_file}' => $inputFile,
'{output_file}' => $outputFile,
],
[],
$this->logger
);
}
protected function getEncryptedFilename($inputFile)
{
return $inputFile.'.gpg';
}
}
| <?php
/*
* This file is part of the DB Backup utility.
*
* (c) Tom Adam <tomadam@instantiate.co.uk>
*
* This source file is subject to the MIT license that is bundled
* with this source code in the file LICENSE.
*/
namespace Instantiate\DatabaseBackup\FileEncrypter;
use Instantiate\DatabaseBackup\Util\Process;
class GpgFileEncrypter extends AbstractFileEncrypter
{
/**
* @param string $inputFile
* @param string $outputFile
*/
protected function encryptFile($inputFile, $outputFile)
{
Process::exec(
'gpg2 -v --encrypt {sign} --recipient {recipient} --batch --pinentry-mode loopback --yes {passphrase} --output {output_file} {input_file}',
[
'{sign}' => $this->config['sign'] ? '--sign' : '',
'{recipient}' => $this->config['recipient'],
'{passphrase}' => $this->config['sign'] && $this->config['key_file']
? '--passphrase-file '.$this->config['key_file']
: '',
'{input_file}' => $inputFile,
'{output_file}' => $outputFile,
],
[],
$this->logger
);
}
protected function getEncryptedFilename($inputFile)
{
return $inputFile.'.gpg';
}
}
| Add pinentry mode loopback to gpg command | Add pinentry mode loopback to gpg command
| PHP | mit | TomAdam/db-backup | php | ## Code Before:
<?php
/*
* This file is part of the DB Backup utility.
*
* (c) Tom Adam <tomadam@instantiate.co.uk>
*
* This source file is subject to the MIT license that is bundled
* with this source code in the file LICENSE.
*/
namespace Instantiate\DatabaseBackup\FileEncrypter;
use Instantiate\DatabaseBackup\Util\Process;
class GpgFileEncrypter extends AbstractFileEncrypter
{
/**
* @param string $inputFile
* @param string $outputFile
*/
protected function encryptFile($inputFile, $outputFile)
{
Process::exec(
'gpg2 -v --encrypt {sign} --recipient {recipient} --batch --yes {passphrase} --output {output_file} {input_file}',
[
'{sign}' => $this->config['sign'] ? '--sign' : '',
'{recipient}' => $this->config['recipient'],
'{passphrase}' => $this->config['sign'] && $this->config['key_file']
? '--passphrase-file '.$this->config['key_file']
: '',
'{input_file}' => $inputFile,
'{output_file}' => $outputFile,
],
[],
$this->logger
);
}
protected function getEncryptedFilename($inputFile)
{
return $inputFile.'.gpg';
}
}
## Instruction:
Add pinentry mode loopback to gpg command
## Code After:
<?php
/*
* This file is part of the DB Backup utility.
*
* (c) Tom Adam <tomadam@instantiate.co.uk>
*
* This source file is subject to the MIT license that is bundled
* with this source code in the file LICENSE.
*/
namespace Instantiate\DatabaseBackup\FileEncrypter;
use Instantiate\DatabaseBackup\Util\Process;
class GpgFileEncrypter extends AbstractFileEncrypter
{
/**
* @param string $inputFile
* @param string $outputFile
*/
protected function encryptFile($inputFile, $outputFile)
{
Process::exec(
'gpg2 -v --encrypt {sign} --recipient {recipient} --batch --pinentry-mode loopback --yes {passphrase} --output {output_file} {input_file}',
[
'{sign}' => $this->config['sign'] ? '--sign' : '',
'{recipient}' => $this->config['recipient'],
'{passphrase}' => $this->config['sign'] && $this->config['key_file']
? '--passphrase-file '.$this->config['key_file']
: '',
'{input_file}' => $inputFile,
'{output_file}' => $outputFile,
],
[],
$this->logger
);
}
protected function getEncryptedFilename($inputFile)
{
return $inputFile.'.gpg';
}
}
| <?php
/*
* This file is part of the DB Backup utility.
*
* (c) Tom Adam <tomadam@instantiate.co.uk>
*
* This source file is subject to the MIT license that is bundled
* with this source code in the file LICENSE.
*/
namespace Instantiate\DatabaseBackup\FileEncrypter;
use Instantiate\DatabaseBackup\Util\Process;
class GpgFileEncrypter extends AbstractFileEncrypter
{
/**
* @param string $inputFile
* @param string $outputFile
*/
protected function encryptFile($inputFile, $outputFile)
{
Process::exec(
- 'gpg2 -v --encrypt {sign} --recipient {recipient} --batch --yes {passphrase} --output {output_file} {input_file}',
+ 'gpg2 -v --encrypt {sign} --recipient {recipient} --batch --pinentry-mode loopback --yes {passphrase} --output {output_file} {input_file}',
? +++++++++++++++++++++++++
[
'{sign}' => $this->config['sign'] ? '--sign' : '',
'{recipient}' => $this->config['recipient'],
'{passphrase}' => $this->config['sign'] && $this->config['key_file']
? '--passphrase-file '.$this->config['key_file']
: '',
'{input_file}' => $inputFile,
'{output_file}' => $outputFile,
],
[],
$this->logger
);
}
protected function getEncryptedFilename($inputFile)
{
return $inputFile.'.gpg';
}
} | 2 | 0.046512 | 1 | 1 |
d2d7ae3714917305f15b10e436a5712f5caa6a6b | README.md | README.md | A simple Python 3 script that checks if you can order a specified Nexus device in the Google Store. It prints the non-availability to standard output and triggers a pushbullet notification as soon as the device is available.
## Usage
The script is supposed to be run via systemd-timer or as a cronjob.
You need a valid Pushbullet Access Token, which you can find in the settings: https://www.pushbullet.com/#settings
Define the Nexus-variant you want to buy by providing it's name. The following versions have been tested:
* 5x
* 6p
* 6
* 9
Execute the script as follows:
```
PB_API_KEY=your_token NEXUS_VARIANT=your_device ./nexus_availability.py
```
## Caveats
* Currently german and english responses are supported. The Google Store uses your IP address to determine your location. If you are using this script on a system that might trigger a localized response from the Google Store, please add your translation of "soon" to the script. I'll gladly accept your pull request.
* Make sure you're able to disable the checker as soon as the device is available to ensure you do not end up being spammed with availability-messages as soon as your Nexus is available. 😅
## Todo:
Check if device store availability notification has already been triggered in order to not be spammed with notifications.
| A simple Python 3 script that checks if you can order a specified Nexus device in the Google Store. It prints the non-availability to standard output and triggers a pushbullet notification as soon as the device is available.
## Requirements
Install from provided requirements.txt:
```
pip3 install -r requirements.txt
````
Install manually:
```
pip3 install requests lxml pushbullet.py
```
## Usage
The script is supposed to be run via systemd-timer or as a cronjob.
You need a valid Pushbullet Access Token, which you can find in the settings: https://www.pushbullet.com/#settings
Define the Nexus-variant you want to buy by providing it's name. The following versions have been tested:
* 5x
* 6p
* 6
* 9
Execute the script as follows:
```
PB_API_KEY=your_token NEXUS_VARIANT=your_device ./nexus_availability.py
```
## Caveats
* Currently german and english responses are supported. The Google Store uses your IP address to determine your location. If you are using this script on a system that might trigger a localized response from the Google Store, please add your translation of "soon" to the script. I'll gladly accept your pull request.
* Make sure you're able to disable the checker as soon as the device is available to ensure you do not end up being spammed with availability-messages as soon as your Nexus is available. 😅
## Todo:
Check if device store availability notification has already been triggered in order to not be spammed with notifications.
| Add required module installation to documentation | Add required module installation to documentation | Markdown | mit | dictvm/nexus_checker | markdown | ## Code Before:
A simple Python 3 script that checks if you can order a specified Nexus device in the Google Store. It prints the non-availability to standard output and triggers a pushbullet notification as soon as the device is available.
## Usage
The script is supposed to be run via systemd-timer or as a cronjob.
You need a valid Pushbullet Access Token, which you can find in the settings: https://www.pushbullet.com/#settings
Define the Nexus-variant you want to buy by providing it's name. The following versions have been tested:
* 5x
* 6p
* 6
* 9
Execute the script as follows:
```
PB_API_KEY=your_token NEXUS_VARIANT=your_device ./nexus_availability.py
```
## Caveats
* Currently german and english responses are supported. The Google Store uses your IP address to determine your location. If you are using this script on a system that might trigger a localized response from the Google Store, please add your translation of "soon" to the script. I'll gladly accept your pull request.
* Make sure you're able to disable the checker as soon as the device is available to ensure you do not end up being spammed with availability-messages as soon as your Nexus is available. 😅
## Todo:
Check if device store availability notification has already been triggered in order to not be spammed with notifications.
## Instruction:
Add required module installation to documentation
## Code After:
A simple Python 3 script that checks if you can order a specified Nexus device in the Google Store. It prints the non-availability to standard output and triggers a pushbullet notification as soon as the device is available.
## Requirements
Install from provided requirements.txt:
```
pip3 install -r requirements.txt
````
Install manually:
```
pip3 install requests lxml pushbullet.py
```
## Usage
The script is supposed to be run via systemd-timer or as a cronjob.
You need a valid Pushbullet Access Token, which you can find in the settings: https://www.pushbullet.com/#settings
Define the Nexus-variant you want to buy by providing it's name. The following versions have been tested:
* 5x
* 6p
* 6
* 9
Execute the script as follows:
```
PB_API_KEY=your_token NEXUS_VARIANT=your_device ./nexus_availability.py
```
## Caveats
* Currently german and english responses are supported. The Google Store uses your IP address to determine your location. If you are using this script on a system that might trigger a localized response from the Google Store, please add your translation of "soon" to the script. I'll gladly accept your pull request.
* Make sure you're able to disable the checker as soon as the device is available to ensure you do not end up being spammed with availability-messages as soon as your Nexus is available. 😅
## Todo:
Check if device store availability notification has already been triggered in order to not be spammed with notifications.
| A simple Python 3 script that checks if you can order a specified Nexus device in the Google Store. It prints the non-availability to standard output and triggers a pushbullet notification as soon as the device is available.
+
+ ## Requirements
+
+ Install from provided requirements.txt:
+ ```
+ pip3 install -r requirements.txt
+ ````
+
+ Install manually:
+ ```
+ pip3 install requests lxml pushbullet.py
+ ```
## Usage
The script is supposed to be run via systemd-timer or as a cronjob.
You need a valid Pushbullet Access Token, which you can find in the settings: https://www.pushbullet.com/#settings
Define the Nexus-variant you want to buy by providing it's name. The following versions have been tested:
* 5x
* 6p
* 6
* 9
Execute the script as follows:
```
PB_API_KEY=your_token NEXUS_VARIANT=your_device ./nexus_availability.py
```
## Caveats
* Currently german and english responses are supported. The Google Store uses your IP address to determine your location. If you are using this script on a system that might trigger a localized response from the Google Store, please add your translation of "soon" to the script. I'll gladly accept your pull request.
* Make sure you're able to disable the checker as soon as the device is available to ensure you do not end up being spammed with availability-messages as soon as your Nexus is available. 😅
## Todo:
Check if device store availability notification has already been triggered in order to not be spammed with notifications. | 12 | 0.48 | 12 | 0 |
ae33f745ddf00ef8c03466b5c0d18d8f28339822 | autogen.sh | autogen.sh | rm -rf autom4te.cache
aclocal -I m4
autoheader
libtoolize --copy
automake --add-missing --copy
autoconf
| rm -rf autom4te.cache
aclocal -I m4
autoheader
case `uname` in Darwin*) glibtoolize --copy ;;
*) libtoolize --copy ;; esac
automake --add-missing --copy
autoconf
| Use glibtoolize rather than libtoolize on mac | Use glibtoolize rather than libtoolize on mac
| Shell | bsd-3-clause | couchbase/snappy,apage43/snappy,apage43/snappy,apage43/snappy,couchbase/snappy,couchbase/snappy,apage43/snappy,couchbase/snappy | shell | ## Code Before:
rm -rf autom4te.cache
aclocal -I m4
autoheader
libtoolize --copy
automake --add-missing --copy
autoconf
## Instruction:
Use glibtoolize rather than libtoolize on mac
## Code After:
rm -rf autom4te.cache
aclocal -I m4
autoheader
case `uname` in Darwin*) glibtoolize --copy ;;
*) libtoolize --copy ;; esac
automake --add-missing --copy
autoconf
| rm -rf autom4te.cache
aclocal -I m4
autoheader
- libtoolize --copy
+ case `uname` in Darwin*) glibtoolize --copy ;;
+ *) libtoolize --copy ;; esac
automake --add-missing --copy
autoconf | 3 | 0.5 | 2 | 1 |
ba8cd626bec5071799b62e8d6f50e60c948ff9c1 | lib/llt/diff/alignment/difference/nrefs.rb | lib/llt/diff/alignment/difference/nrefs.rb | module LLT
module Diff::Alignment::Difference
class Nrefs
include Diff::Helpers::HashContainable
include Diff::Helpers::DiffReporter
xml_tag :nrefs
def initialize(original, new)
@id = id
@original = original
@new = new
@container = {}
end
def id
xml_tag
end
def xml_attributes
{ original: @original, new: @new, unique: @unique }
end
end
end
end
| module LLT
module Diff::Alignment::Difference
# This whole class could arguably deleted, not sure there is any need for it
class Nrefs
include Diff::Helpers::HashContainable
include Diff::Helpers::DiffReporter
xml_tag :nrefs
def initialize(original, new)
@id = id
@original = original
@new = new
@container = {}
end
def id
xml_tag
end
def xml_attributes
{ original: @original, new: @new, unique: @unique }
end
end
end
end
| Add comment about possible deprecation of Nrefs | Add comment about possible deprecation of Nrefs
| Ruby | mit | latin-language-toolkit/llt-review,latin-language-toolkit/llt-review | ruby | ## Code Before:
module LLT
module Diff::Alignment::Difference
class Nrefs
include Diff::Helpers::HashContainable
include Diff::Helpers::DiffReporter
xml_tag :nrefs
def initialize(original, new)
@id = id
@original = original
@new = new
@container = {}
end
def id
xml_tag
end
def xml_attributes
{ original: @original, new: @new, unique: @unique }
end
end
end
end
## Instruction:
Add comment about possible deprecation of Nrefs
## Code After:
module LLT
module Diff::Alignment::Difference
# This whole class could arguably deleted, not sure there is any need for it
class Nrefs
include Diff::Helpers::HashContainable
include Diff::Helpers::DiffReporter
xml_tag :nrefs
def initialize(original, new)
@id = id
@original = original
@new = new
@container = {}
end
def id
xml_tag
end
def xml_attributes
{ original: @original, new: @new, unique: @unique }
end
end
end
end
| module LLT
module Diff::Alignment::Difference
+
+ # This whole class could arguably be deleted, not sure there is any need for it
+
class Nrefs
include Diff::Helpers::HashContainable
include Diff::Helpers::DiffReporter
xml_tag :nrefs
def initialize(original, new)
@id = id
@original = original
@new = new
@container = {}
end
def id
xml_tag
end
def xml_attributes
{ original: @original, new: @new, unique: @unique }
end
end
end
end
| 3 | 0.115385 | 3 | 0 |
ff1129b597b5d25daddb85e76e1ebad6701cebbd | recipes-bsp/common-csl-ip/common-csl-ip_git.bb | recipes-bsp/common-csl-ip/common-csl-ip_git.bb | DESCRIPTION = "Chip support library low level interface"
LICENSE = "BSD-3-Clause"
LIC_FILES_CHKSUM = "file://COPYING.txt;md5=5857833e20836213677fac33f9aded21"
COMPATIBLE_MACHINE = "keystone"
ALLOW_EMPTY_${PN} = "1"
PR = "r2"
BRANCH="master"
SRC_URI = "git://git.ti.com/keystone-rtos/common-csl-ip.git;protocol=git;branch=${BRANCH}"
# commit ID corresponds to DEV.CSL_KEYSTONE2.02.01.00.06A
SRCREV = "6e39222e13244c285929bda2f90f5224f9ea5144"
S = "${WORKDIR}/git"
do_install () {
install -d ${D}${includedir}/ti/csl
find . -name "*.h" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
find ./src/ip/serdes_sb/V0 -name "*.c" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
}
| DESCRIPTION = "Chip support library low level interface"
LICENSE = "BSD-3-Clause"
LIC_FILES_CHKSUM = "file://COPYING.txt;md5=5857833e20836213677fac33f9aded21"
COMPATIBLE_MACHINE = "keystone"
ALLOW_EMPTY_${PN} = "1"
PR = "r3"
BRANCH="master"
SRC_URI = "git://git.ti.com/keystone-rtos/common-csl-ip.git;protocol=git;branch=${BRANCH}"
# commit ID corresponds to DEV.CSL_KEYSTONE2.02.01.00.07A
SRCREV = "c78867df9165fdf8042fb692fcea776fc0102326"
S = "${WORKDIR}/git"
do_install () {
install -d ${D}${includedir}/ti/csl
find . -name "*.h" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
find ./src/ip/serdes_sb/V0 -name "*.c" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
}
| Update to new version 2.1.0.7 | common-csl-ip: Update to new version 2.1.0.7
Signed-off-by: Sam Nelson <922b89dc606ff5f34a79f0527810322597be4ea8@ti.com>
Signed-off-by: Denys Dmytriyenko <d29de71aea38aad3a87d486929cb0aad173ae612@ti.com>
| BitBake | mit | RedFIR/meta-ti,joelagnel/meta-ti,joelagnel/meta-ti,joelagnel/meta-ti,tylerwhall/meta-ti,rcn-ee/meta-ti,rcn-ee/meta-ti,rcn-ee/meta-ti,tylerwhall/meta-ti,RedFIR/meta-ti,RedFIR/meta-ti,joelagnel/meta-ti,tylerwhall/meta-ti,tylerwhall/meta-ti,RedFIR/meta-ti,rcn-ee/meta-ti | bitbake | ## Code Before:
DESCRIPTION = "Chip support library low level interface"
LICENSE = "BSD-3-Clause"
LIC_FILES_CHKSUM = "file://COPYING.txt;md5=5857833e20836213677fac33f9aded21"
COMPATIBLE_MACHINE = "keystone"
ALLOW_EMPTY_${PN} = "1"
PR = "r2"
BRANCH="master"
SRC_URI = "git://git.ti.com/keystone-rtos/common-csl-ip.git;protocol=git;branch=${BRANCH}"
# commit ID corresponds to DEV.CSL_KEYSTONE2.02.01.00.06A
SRCREV = "6e39222e13244c285929bda2f90f5224f9ea5144"
S = "${WORKDIR}/git"
do_install () {
install -d ${D}${includedir}/ti/csl
find . -name "*.h" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
find ./src/ip/serdes_sb/V0 -name "*.c" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
}
## Instruction:
common-csl-ip: Update to new version 2.1.0.7
Signed-off-by: Sam Nelson <922b89dc606ff5f34a79f0527810322597be4ea8@ti.com>
Signed-off-by: Denys Dmytriyenko <d29de71aea38aad3a87d486929cb0aad173ae612@ti.com>
## Code After:
DESCRIPTION = "Chip support library low level interface"
LICENSE = "BSD-3-Clause"
LIC_FILES_CHKSUM = "file://COPYING.txt;md5=5857833e20836213677fac33f9aded21"
COMPATIBLE_MACHINE = "keystone"
ALLOW_EMPTY_${PN} = "1"
PR = "r3"
BRANCH="master"
SRC_URI = "git://git.ti.com/keystone-rtos/common-csl-ip.git;protocol=git;branch=${BRANCH}"
# commit ID corresponds to DEV.CSL_KEYSTONE2.02.01.00.07A
SRCREV = "c78867df9165fdf8042fb692fcea776fc0102326"
S = "${WORKDIR}/git"
do_install () {
install -d ${D}${includedir}/ti/csl
find . -name "*.h" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
find ./src/ip/serdes_sb/V0 -name "*.c" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
}
| DESCRIPTION = "Chip support library low level interface"
LICENSE = "BSD-3-Clause"
LIC_FILES_CHKSUM = "file://COPYING.txt;md5=5857833e20836213677fac33f9aded21"
COMPATIBLE_MACHINE = "keystone"
ALLOW_EMPTY_${PN} = "1"
- PR = "r2"
? ^
+ PR = "r3"
? ^
BRANCH="master"
SRC_URI = "git://git.ti.com/keystone-rtos/common-csl-ip.git;protocol=git;branch=${BRANCH}"
- # commit ID corresponds to DEV.CSL_KEYSTONE2.02.01.00.06A
? ^
+ # commit ID corresponds to DEV.CSL_KEYSTONE2.02.01.00.07A
? ^
- SRCREV = "6e39222e13244c285929bda2f90f5224f9ea5144"
+ SRCREV = "c78867df9165fdf8042fb692fcea776fc0102326"
S = "${WORKDIR}/git"
do_install () {
install -d ${D}${includedir}/ti/csl
find . -name "*.h" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
find ./src/ip/serdes_sb/V0 -name "*.c" -type f | xargs -I {} cp --parents {} ${D}${includedir}/ti/csl
} | 6 | 0.3 | 3 | 3 |
b0d1e6fa73eed12f61d57bffdcadbf96b748e24d | vertx-web-examples-it/src/it/web-examples/src/main/web-react-run.json | vertx-web-examples-it/src/it/web-examples/src/main/web-react-run.json | {
"name": "web-react",
"tags": ["web", "chat", "react"],
"executions": {
"java" : {
"directory" : "${base}/src/main/java/io/vertx/example/web/react",
"command" : "vertx run io.vertx.example.web.react.Server -cp ${base}/target/classes:."
},
"js" : {
"directory" : "${base}/src/main/js/io/vertx/example/web/chat",
"command" : "vertx run server.js -cp ${base}/target/classes:."
},
"groovy": {
"directory" : "${base}/src/main/groovy/io/vertx/example/web/chat",
"command" : "vertx run server.groovy -cp ${base}/target/classes:."
},
"ruby": {
"directory" : "${base}/src/main/rb/io/vertx/example/web/chat",
"command" : "vertx run server.rb -cp ${base}/target/classes:."
}
},
"grace-text": "Succeeded in deploying verticle",
"client-check": "web-react-check.groovy"
} | {
"name": "web-react",
"tags": ["web", "chat", "react"],
"executions": {
"java" : {
"directory" : "${base}/src/main/java/io/vertx/example/web/react",
"command" : "vertx run io.vertx.example.web.react.Server -cp ${base}/target/classes:."
}
},
"grace-text": "Succeeded in deploying verticle",
"client-check": "web-react-check.groovy"
} | Remove the execution of non-polyglot example (react)
Remove the execution of non-polyglot example (react)
{
"name": "web-react",
"tags": ["web", "chat", "react"],
"executions": {
"java" : {
"directory" : "${base}/src/main/java/io/vertx/example/web/react",
"command" : "vertx run io.vertx.example.web.react.Server -cp ${base}/target/classes:."
},
"js" : {
"directory" : "${base}/src/main/js/io/vertx/example/web/chat",
"command" : "vertx run server.js -cp ${base}/target/classes:."
},
"groovy": {
"directory" : "${base}/src/main/groovy/io/vertx/example/web/chat",
"command" : "vertx run server.groovy -cp ${base}/target/classes:."
},
"ruby": {
"directory" : "${base}/src/main/rb/io/vertx/example/web/chat",
"command" : "vertx run server.rb -cp ${base}/target/classes:."
}
},
"grace-text": "Succeeded in deploying verticle",
"client-check": "web-react-check.groovy"
}
## Instruction:
Remove the execution of non-polyglot example (react)
## Code After:
{
"name": "web-react",
"tags": ["web", "chat", "react"],
"executions": {
"java" : {
"directory" : "${base}/src/main/java/io/vertx/example/web/react",
"command" : "vertx run io.vertx.example.web.react.Server -cp ${base}/target/classes:."
}
},
"grace-text": "Succeeded in deploying verticle",
"client-check": "web-react-check.groovy"
} | {
"name": "web-react",
"tags": ["web", "chat", "react"],
"executions": {
"java" : {
"directory" : "${base}/src/main/java/io/vertx/example/web/react",
"command" : "vertx run io.vertx.example.web.react.Server -cp ${base}/target/classes:."
- },
-
- "js" : {
- "directory" : "${base}/src/main/js/io/vertx/example/web/chat",
- "command" : "vertx run server.js -cp ${base}/target/classes:."
- },
-
- "groovy": {
- "directory" : "${base}/src/main/groovy/io/vertx/example/web/chat",
- "command" : "vertx run server.groovy -cp ${base}/target/classes:."
- },
-
- "ruby": {
- "directory" : "${base}/src/main/rb/io/vertx/example/web/chat",
- "command" : "vertx run server.rb -cp ${base}/target/classes:."
}
-
},
"grace-text": "Succeeded in deploying verticle",
"client-check": "web-react-check.groovy"
} | 16 | 0.5 | 0 | 16 |
50041233658459ff5805276309e8ed2ee478407e | docs/source/reference/routines.rst | docs/source/reference/routines.rst | --------
Routines
--------
The following pages describe NumPy-compatible routines.
These functions cover a subset of
`NumPy routines <https://docs.scipy.org/doc/numpy/reference/routines.html>`_.
.. currentmodule:: cupy
.. toctree::
:maxdepth: 2
creation
manipulation
binary
dtype
fft
functional
indexing
io
linalg
logic
math
pad
polynomials
random
sorting
statistics
ext
CUB backend for reduction routines
----------------------------------
Some CuPy reduction routines, including :func:`~cupy.sum`, :func:`~cupy.min`, :func:`~cupy.max`,
:func:`~cupy.argmin`, :func:`~cupy.argmax`, and other functions built on top of them, can be
accelerated by switching to the `CUB`_ backend. The switch can be toggled on or off at runtime
by setting the bool :data:`cupy.cuda.cub_enabled`, which is set to ``False`` by default. Note
that while in general CUB-backed reductions are faster, there could be exceptions depending on
the data layout. We recommend users to perform some benchmarks to determine whether CUB offers
better performance or not.
.. _CUB: http://nvlabs.github.io/cub/
| --------
Routines
--------
The following pages describe NumPy-compatible routines.
These functions cover a subset of
`NumPy routines <https://docs.scipy.org/doc/numpy/reference/routines.html>`_.
.. currentmodule:: cupy
.. toctree::
:maxdepth: 2
creation
manipulation
binary
dtype
fft
functional
indexing
io
linalg
logic
math
pad
polynomials
random
sorting
statistics
ext
CUB/cuTENSOR backend for reduction routines
----------------------------------
Some CuPy reduction routines, including :func:`~cupy.sum`, :func:`~cupy.min`, :func:`~cupy.max`,
:func:`~cupy.argmin`, :func:`~cupy.argmax`, and other functions built on top of them, can be
accelerated by switching to the `CUB`_ or `cuTENSOR`_ backend. These backends can be enabled
by setting ``CUPY_ACCELERATORS`` environement variable. Note that while in general the accelerated
reductions are faster, there could be exceptions depending on the data layout. We recommend
users to perform some benchmarks to determine whether CUB offers better performance or not.
.. _CUB: https://nvlabs.github.io/cub/
.. _cuTENSOR: https://docs.nvidia.com/cuda/cutensor/index.html
| Fix documents to reflect CUPY_ACCELERATORS | Fix documents to reflect CUPY_ACCELERATORS
| reStructuredText | mit | cupy/cupy,cupy/cupy,cupy/cupy,cupy/cupy | restructuredtext | ## Code Before:
--------
Routines
--------
The following pages describe NumPy-compatible routines.
These functions cover a subset of
`NumPy routines <https://docs.scipy.org/doc/numpy/reference/routines.html>`_.
.. currentmodule:: cupy
.. toctree::
:maxdepth: 2
creation
manipulation
binary
dtype
fft
functional
indexing
io
linalg
logic
math
pad
polynomials
random
sorting
statistics
ext
CUB backend for reduction routines
----------------------------------
Some CuPy reduction routines, including :func:`~cupy.sum`, :func:`~cupy.min`, :func:`~cupy.max`,
:func:`~cupy.argmin`, :func:`~cupy.argmax`, and other functions built on top of them, can be
accelerated by switching to the `CUB`_ backend. The switch can be toggled on or off at runtime
by setting the bool :data:`cupy.cuda.cub_enabled`, which is set to ``False`` by default. Note
that while in general CUB-backed reductions are faster, there could be exceptions depending on
the data layout. We recommend users to perform some benchmarks to determine whether CUB offers
better performance or not.
.. _CUB: http://nvlabs.github.io/cub/
## Instruction:
Fix documents to reflect CUPY_ACCELERATORS
## Code After:
--------
Routines
--------
The following pages describe NumPy-compatible routines.
These functions cover a subset of
`NumPy routines <https://docs.scipy.org/doc/numpy/reference/routines.html>`_.
.. currentmodule:: cupy
.. toctree::
:maxdepth: 2
creation
manipulation
binary
dtype
fft
functional
indexing
io
linalg
logic
math
pad
polynomials
random
sorting
statistics
ext
CUB/cuTENSOR backend for reduction routines
----------------------------------
Some CuPy reduction routines, including :func:`~cupy.sum`, :func:`~cupy.min`, :func:`~cupy.max`,
:func:`~cupy.argmin`, :func:`~cupy.argmax`, and other functions built on top of them, can be
accelerated by switching to the `CUB`_ or `cuTENSOR`_ backend. These backends can be enabled
by setting the ``CUPY_ACCELERATORS`` environment variable. Note that while in general the accelerated
reductions are faster, there could be exceptions depending on the data layout. We recommend
users to perform some benchmarks to determine whether CUB offers better performance or not.
.. _CUB: https://nvlabs.github.io/cub/
.. _cuTENSOR: https://docs.nvidia.com/cuda/cutensor/index.html
| --------
Routines
--------
The following pages describe NumPy-compatible routines.
These functions cover a subset of
`NumPy routines <https://docs.scipy.org/doc/numpy/reference/routines.html>`_.
.. currentmodule:: cupy
.. toctree::
:maxdepth: 2
creation
manipulation
binary
dtype
fft
functional
indexing
io
linalg
logic
math
pad
polynomials
random
sorting
statistics
ext
- CUB backend for reduction routines
+ CUB/cuTENSOR backend for reduction routines
? +++++++++
----------------------------------
Some CuPy reduction routines, including :func:`~cupy.sum`, :func:`~cupy.min`, :func:`~cupy.max`,
:func:`~cupy.argmin`, :func:`~cupy.argmax`, and other functions built on top of them, can be
+ accelerated by switching to the `CUB`_ or `cuTENSOR`_ backend. These backends can be enabled
+ by setting the ``CUPY_ACCELERATORS`` environment variable. Note that while in general the accelerated
+ reductions are faster, there could be exceptions depending on the data layout. We recommend
+ users to perform some benchmarks to determine whether CUB offers better performance or not.
- accelerated by switching to the `CUB`_ backend. The switch can be toggled on or off at runtime
- by setting the bool :data:`cupy.cuda.cub_enabled`, which is set to ``False`` by default. Note
- that while in general CUB-backed reductions are faster, there could be exceptions depending on
- the data layout. We recommend users to perform some benchmarks to determine whether CUB offers
- better performance or not.
- .. _CUB: http://nvlabs.github.io/cub/
+ .. _CUB: https://nvlabs.github.io/cub/
? +
+ .. _cuTENSOR: https://docs.nvidia.com/cuda/cutensor/index.html | 14 | 0.325581 | 7 | 7 |
0ac085666a3736acf68dd49b1b0d685d46afcae5 | spec/spec_helper.rb | spec/spec_helper.rb | PADRINO_ENV = 'test'
PADRINO_ROOT = File.dirname(__FILE__) unless defined?(PADRINO_ROOT)
require 'rubygems' unless defined?(Gem)
require 'bundler'
Bundler.require(:default, PADRINO_ENV)
require 'padrino-contrib'
RSpec.configure do |config|
config.include Rack::Test::Methods
config.include Webrat::Matchers
# Sets up a Sinatra::Base subclass defined with the block
# given. Used in setup or individual spec methods to establish
# the application.
def mock_app(base=Padrino::Application, &block)
@app = Sinatra.new(base, &block)
end
def app
@app
end
end
| RACK_ENV = 'test'
PADRINO_ROOT = File.dirname(__FILE__) unless defined?(PADRINO_ROOT)
require 'rubygems' unless defined?(Gem)
require 'bundler'
Bundler.require(:default, RACK_ENV)
require 'padrino-contrib'
RSpec.configure do |config|
config.include Rack::Test::Methods
config.include Webrat::Matchers
# Sets up a Sinatra::Base subclass defined with the block
# given. Used in setup or individual spec methods to establish
# the application.
def mock_app(base=Padrino::Application, &block)
@app = Sinatra.new(base, &block)
end
def app
@app
end
end
| Use RACK_ENV instead of deprecated PADRINO_ENV | Use RACK_ENV instead of deprecated PADRINO_ENV
| Ruby | mit | padrino/padrino-contrib,padrino/padrino-contrib | ruby | ## Code Before:
PADRINO_ENV = 'test'
PADRINO_ROOT = File.dirname(__FILE__) unless defined?(PADRINO_ROOT)
require 'rubygems' unless defined?(Gem)
require 'bundler'
Bundler.require(:default, PADRINO_ENV)
require 'padrino-contrib'
RSpec.configure do |config|
config.include Rack::Test::Methods
config.include Webrat::Matchers
# Sets up a Sinatra::Base subclass defined with the block
# given. Used in setup or individual spec methods to establish
# the application.
def mock_app(base=Padrino::Application, &block)
@app = Sinatra.new(base, &block)
end
def app
@app
end
end
## Instruction:
Use RACK_ENV instead of deprecated PADRINO_ENV
## Code After:
RACK_ENV = 'test'
PADRINO_ROOT = File.dirname(__FILE__) unless defined?(PADRINO_ROOT)
require 'rubygems' unless defined?(Gem)
require 'bundler'
Bundler.require(:default, RACK_ENV)
require 'padrino-contrib'
RSpec.configure do |config|
config.include Rack::Test::Methods
config.include Webrat::Matchers
# Sets up a Sinatra::Base subclass defined with the block
# given. Used in setup or individual spec methods to establish
# the application.
def mock_app(base=Padrino::Application, &block)
@app = Sinatra.new(base, &block)
end
def app
@app
end
end
| - PADRINO_ENV = 'test'
? ^ ^^^^^
+ RACK_ENV = 'test'
? ^ ^^
PADRINO_ROOT = File.dirname(__FILE__) unless defined?(PADRINO_ROOT)
require 'rubygems' unless defined?(Gem)
require 'bundler'
- Bundler.require(:default, PADRINO_ENV)
? ^ ^^^^^
+ Bundler.require(:default, RACK_ENV)
? ^ ^^
require 'padrino-contrib'
RSpec.configure do |config|
config.include Rack::Test::Methods
config.include Webrat::Matchers
# Sets up a Sinatra::Base subclass defined with the block
# given. Used in setup or individual spec methods to establish
# the application.
def mock_app(base=Padrino::Application, &block)
@app = Sinatra.new(base, &block)
end
def app
@app
end
end | 4 | 0.173913 | 2 | 2 |
eea3c24404e92dad143cb9d6b5dcb2078042922a | app/serializers/me_serializer.rb | app/serializers/me_serializer.rb | class MeSerializer < UserSerializer
cached # Make sure to cache the user object.
attributes :read_terms, :default_avatar, :email, :email_verified
# Public: Checks if user is using default avatar.
# Returns true or false.
def default_avatar
not object.avatar.present?
end
# Public: Checks if user has read terms of agreements.
# Returns true or false.
def read_terms
object.tnc_last_accepted.present?
end
# Public: Check if email is verified.
# Returns true or false.
def email_verified
object.email_verified_at.present?
end
# Public: The cache key for the MeSerializer, this behaves a little
# different than user serializers because it contains more data.
def cache_key
"me/#{object.id}-#{object.updated_at.utc.to_s(:number)}"
end
end
| class MeSerializer < UserSerializer
cached # Make sure to cache the user object.
attributes :read_terms, :default_avatar, :email, :email_verified
# Public: Checks if user is using default avatar.
# Returns true or false.
def default_avatar
not object.avatar.present?
end
# Public: Checks if user has read terms of agreements.
# Returns true or false.
def read_terms
object.tnc_last_accepted.present?
end
# Public: Check if email is verified.
# Returns true or false.
def email_verified
object.email_verified_at.present?
end
# Public: The cache key for the MeSerializer, this behaves a little
# different than user serializers because it contains more data.
def cache_key
"me/#{object.id}-#{object.updated_at.utc.to_s(:number)}"
end
def refs
[{ rel: 'self', href: v2_me_url(format: :json, host: $api_host) }]
end
end
| Make the ref for /v2/me be to itself. | Make the ref for /v2/me be to itself.
| Ruby | mit | cloudsdaleapp/cloudsdale-web,cloudsdaleapp/cloudsdale-web,cloudsdaleapp/cloudsdale-web,cloudsdaleapp/cloudsdale-web | ruby | ## Code Before:
class MeSerializer < UserSerializer
cached # Make sure to cache the user object.
attributes :read_terms, :default_avatar, :email, :email_verified
# Public: Checks if user is using default avatar.
# Returns true or false.
def default_avatar
not object.avatar.present?
end
# Public: Checks if user has read terms of agreements.
# Returns true or false.
def read_terms
object.tnc_last_accepted.present?
end
# Public: Check if email is verified.
# Returns true or false.
def email_verified
object.email_verified_at.present?
end
# Public: The cache key for the MeSerializer, this behaves a little
# different than user serializers because it contains more data.
def cache_key
"me/#{object.id}-#{object.updated_at.utc.to_s(:number)}"
end
end
## Instruction:
Make the ref for /v2/me be to itself.
## Code After:
class MeSerializer < UserSerializer
cached # Make sure to cache the user object.
attributes :read_terms, :default_avatar, :email, :email_verified
# Public: Checks if user is using default avatar.
# Returns true or false.
def default_avatar
not object.avatar.present?
end
# Public: Checks if user has read terms of agreements.
# Returns true or false.
def read_terms
object.tnc_last_accepted.present?
end
# Public: Check if email is verified.
# Returns true or false.
def email_verified
object.email_verified_at.present?
end
# Public: The cache key for the MeSerializer, this behaves a little
# different than user serializers because it contains more data.
def cache_key
"me/#{object.id}-#{object.updated_at.utc.to_s(:number)}"
end
def refs
[{ rel: 'self', href: v2_me_url(format: :json, host: $api_host) }]
end
end
| class MeSerializer < UserSerializer
cached # Make sure to cache the user object.
attributes :read_terms, :default_avatar, :email, :email_verified
# Public: Checks if user is using default avatar.
# Returns true or false.
def default_avatar
not object.avatar.present?
end
# Public: Checks if user has read terms of agreements.
# Returns true or false.
def read_terms
object.tnc_last_accepted.present?
end
# Public: Check if email is verified.
# Returns true or false.
def email_verified
object.email_verified_at.present?
end
# Public: The cache key for the MeSerializer, this behaves a little
# different than user serializers because it contains more data.
def cache_key
"me/#{object.id}-#{object.updated_at.utc.to_s(:number)}"
end
+ def refs
+ [{ rel: 'self', href: v2_me_url(format: :json, host: $api_host) }]
+ end
+
end | 4 | 0.129032 | 4 | 0 |
7ce51c412d86c54fc5aa604597facc7b84e8e971 | msa/docker/docker-update-service.sh | msa/docker/docker-update-service.sh |
docker-compose up -d --no-deps --build $@
|
docker-compose pull $@
docker-compose up -d --no-deps --build $@
| Improve docker update service script. | Improve docker update service script.
| Shell | apache-2.0 | volodymyr-babak/thingsboard,volodymyr-babak/thingsboard,volodymyr-babak/thingsboard,thingsboard/thingsboard,thingsboard/thingsboard,thingsboard/thingsboard,thingsboard/thingsboard,volodymyr-babak/thingsboard,volodymyr-babak/thingsboard,thingsboard/thingsboard,volodymyr-babak/thingsboard,thingsboard/thingsboard | shell | ## Code Before:
docker-compose up -d --no-deps --build $@
## Instruction:
Improve docker update service script.
## Code After:
docker-compose pull $@
docker-compose up -d --no-deps --build $@
|
+ docker-compose pull $@
docker-compose up -d --no-deps --build $@ | 1 | 0.5 | 1 | 0 |
0690887add9f609e4ec365fe42a43c6299eeac33 | Resources/config/services.xml | Resources/config/services.xml | <?xml version="1.0" encoding="UTF-8"?>
<container xmlns="http://symfony.com/schema/dic/services" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="ecommit_util.helper" class="Ecommit\UtilBundle\Helper\UtilHelper">
</service>
<service id="ecommit_util.twig.util_extension" class="Ecommit\UtilBundle\Twig\UtilExtension">
<argument type="service" id="ecommit_util.helper" />
<tag name="twig.extension" />
</service>
<service id="ecommit_util.cache" class="Ecommit\UtilBundle\Cache\Cache" abstract="true" public="false">
<argument /> <!-- Options -->
</service>
<service id="ecommit_util.clear_entity_manager" class="Ecommit\UtilBundle\Doctrine\ClearEntityManager">
<argument type="service" id="doctrine" />
</service>
</services>
</container>
| <?xml version="1.0" encoding="UTF-8"?>
<container xmlns="http://symfony.com/schema/dic/services" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="ecommit_util.helper" class="Ecommit\UtilBundle\Helper\UtilHelper">
</service>
<service id="ecommit_util.twig.util_extension" class="Ecommit\UtilBundle\Twig\UtilExtension">
<argument type="service" id="ecommit_util.helper" />
<tag name="twig.extension" />
</service>
<service id="ecommit_util.cache" class="Ecommit\UtilBundle\Cache\Cache" abstract="true" public="false">
<argument /> <!-- Options -->
</service>
<service id="ecommit_util.clear_entity_manager" class="Ecommit\UtilBundle\Doctrine\ClearEntityManager">
<argument type="service" id="doctrine" on-invalid="null" />
</service>
</services>
</container>
| Fix Doctrine dependency when doctrine isn't used | Fix Doctrine dependency when doctrine isn't used
| XML | mit | e-commit/EcommitUtilBundle,e-commit/EcommitUtilBundle | xml | ## Code Before:
<?xml version="1.0" encoding="UTF-8"?>
<container xmlns="http://symfony.com/schema/dic/services" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="ecommit_util.helper" class="Ecommit\UtilBundle\Helper\UtilHelper">
</service>
<service id="ecommit_util.twig.util_extension" class="Ecommit\UtilBundle\Twig\UtilExtension">
<argument type="service" id="ecommit_util.helper" />
<tag name="twig.extension" />
</service>
<service id="ecommit_util.cache" class="Ecommit\UtilBundle\Cache\Cache" abstract="true" public="false">
<argument /> <!-- Options -->
</service>
<service id="ecommit_util.clear_entity_manager" class="Ecommit\UtilBundle\Doctrine\ClearEntityManager">
<argument type="service" id="doctrine" />
</service>
</services>
</container>
## Instruction:
Fix Doctrine dependency when doctrine isn't used
## Code After:
<?xml version="1.0" encoding="UTF-8"?>
<container xmlns="http://symfony.com/schema/dic/services" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="ecommit_util.helper" class="Ecommit\UtilBundle\Helper\UtilHelper">
</service>
<service id="ecommit_util.twig.util_extension" class="Ecommit\UtilBundle\Twig\UtilExtension">
<argument type="service" id="ecommit_util.helper" />
<tag name="twig.extension" />
</service>
<service id="ecommit_util.cache" class="Ecommit\UtilBundle\Cache\Cache" abstract="true" public="false">
<argument /> <!-- Options -->
</service>
<service id="ecommit_util.clear_entity_manager" class="Ecommit\UtilBundle\Doctrine\ClearEntityManager">
<argument type="service" id="doctrine" on-invalid="null" />
</service>
</services>
</container>
| <?xml version="1.0" encoding="UTF-8"?>
<container xmlns="http://symfony.com/schema/dic/services" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
<services>
<service id="ecommit_util.helper" class="Ecommit\UtilBundle\Helper\UtilHelper">
</service>
<service id="ecommit_util.twig.util_extension" class="Ecommit\UtilBundle\Twig\UtilExtension">
<argument type="service" id="ecommit_util.helper" />
<tag name="twig.extension" />
</service>
<service id="ecommit_util.cache" class="Ecommit\UtilBundle\Cache\Cache" abstract="true" public="false">
<argument /> <!-- Options -->
</service>
<service id="ecommit_util.clear_entity_manager" class="Ecommit\UtilBundle\Doctrine\ClearEntityManager">
- <argument type="service" id="doctrine" />
+ <argument type="service" id="doctrine" on-invalid="null" />
? ++++++++++++++++++
</service>
</services>
</container> | 2 | 0.095238 | 1 | 1 |
3cdafa6b55a7d5915cd798444176a6f8c7c5f3b9 | static/js/directives/timeSince.js | static/js/directives/timeSince.js | define([
'app',
'utils',
'bootstrap/tooltip'
], function(app, utils, tooltip_func) {
'use strict';
// we use a shared timeout for the page loop to avoid
// extremely large amounts of $digest cycles and wasteful
// amounts of defers
window.setInterval(function(){
$('.ng-timesince').each(function(_, element){
var $element = angular.element(element),
value = $element.data('datetime');
$element.text(utils.time.timeSince(value));
});
}, 1000);
app.directive('timeSince', ['$timeout', function($timeout) {
return function timeSince(scope, element, attrs) {
var tagname = element[0].tagName.toLowerCase();
if (tagname == 'td' || tagname == 'dd') {
var err = "applying the timeSince directive to <" + tagname + "> tags" +
" can cause rendering issues. Instead, please put this on an" +
" inner <span> tag";
console.warn(err);
}
var value = scope.$eval(attrs.timeSince);
element.addClass('ng-timesince');
element.data('datetime', value);
element.attr('title', (new Date(value)).toUTCString());
element.attr('data-placement', 'left');
tooltip_func.bind(element)();
};
}]);
});
| define([
'app',
'utils',
'moment',
'bootstrap/tooltip'
], function(app, utils, moment, tooltip_func) {
'use strict';
// we use a shared timeout for the page loop to avoid
// extremely large amounts of $digest cycles and wasteful
// amounts of defers
window.setInterval(function(){
$('.ng-timesince').each(function(_, element){
var $element = angular.element(element),
value = $element.data('datetime');
$element.text(utils.time.timeSince(value));
});
}, 1000);
app.directive('timeSince', ['$timeout', function($timeout) {
return function timeSince(scope, element, attrs) {
var tagname = element[0].tagName.toLowerCase();
if (tagname == 'td' || tagname == 'dd') {
var err = "applying the timeSince directive to <" + tagname + "> tags" +
" can cause rendering issues. Instead, please put this on an" +
" inner <span> tag";
console.warn(err);
}
var value = scope.$eval(attrs.timeSince);
element.addClass('ng-timesince');
element.data('datetime', value);
element.attr('title', moment.utc(value).toString());
element.attr('data-placement', 'left');
tooltip_func.bind(element)();
};
}]);
});
| Use moment instead of new Date for timestamps | Use moment instead of new Date for timestamps
Summary:
new Date has inconsistent behavior between browsers: chrome assumes
its input is utc by default, while firefox assumes its the local time. Just use
moment instead.
https://tails.corp.dropbox.com/F196477
Test Plan: see screenshot
Reviewers: haoyi, josiah
Reviewed By: josiah
Subscribers: changesbot, mkedia
Differential Revision: https://tails.corp.dropbox.com/D118110
| JavaScript | apache-2.0 | bowlofstew/changes,dropbox/changes,dropbox/changes,wfxiang08/changes,wfxiang08/changes,bowlofstew/changes,bowlofstew/changes,bowlofstew/changes,dropbox/changes,wfxiang08/changes,wfxiang08/changes,dropbox/changes | javascript | ## Code Before:
define([
'app',
'utils',
'bootstrap/tooltip'
], function(app, utils, tooltip_func) {
'use strict';
// we use a shared timeout for the page loop to avoid
// extremely large amounts of $digest cycles and wasteful
// amounts of defers
window.setInterval(function(){
$('.ng-timesince').each(function(_, element){
var $element = angular.element(element),
value = $element.data('datetime');
$element.text(utils.time.timeSince(value));
});
}, 1000);
app.directive('timeSince', ['$timeout', function($timeout) {
return function timeSince(scope, element, attrs) {
var tagname = element[0].tagName.toLowerCase();
if (tagname == 'td' || tagname == 'dd') {
var err = "applying the timeSince directive to <" + tagname + "> tags" +
" can cause rendering issues. Instead, please put this on an" +
" inner <span> tag";
console.warn(err);
}
var value = scope.$eval(attrs.timeSince);
element.addClass('ng-timesince');
element.data('datetime', value);
element.attr('title', (new Date(value)).toUTCString());
element.attr('data-placement', 'left');
tooltip_func.bind(element)();
};
}]);
});
## Instruction:
Use moment instead of new Date for timestamps
Summary:
new Date has inconsistent behavior between browsers: chrome assumes
its input is utc by default, while firefox assumes its the local time. Just use
moment instead.
https://tails.corp.dropbox.com/F196477
Test Plan: see screenshot
Reviewers: haoyi, josiah
Reviewed By: josiah
Subscribers: changesbot, mkedia
Differential Revision: https://tails.corp.dropbox.com/D118110
## Code After:
define([
'app',
'utils',
'moment',
'bootstrap/tooltip'
], function(app, utils, moment, tooltip_func) {
'use strict';
// we use a shared timeout for the page loop to avoid
// extremely large amounts of $digest cycles and wasteful
// amounts of defers
window.setInterval(function(){
$('.ng-timesince').each(function(_, element){
var $element = angular.element(element),
value = $element.data('datetime');
$element.text(utils.time.timeSince(value));
});
}, 1000);
app.directive('timeSince', ['$timeout', function($timeout) {
return function timeSince(scope, element, attrs) {
var tagname = element[0].tagName.toLowerCase();
if (tagname == 'td' || tagname == 'dd') {
var err = "applying the timeSince directive to <" + tagname + "> tags" +
" can cause rendering issues. Instead, please put this on an" +
" inner <span> tag";
console.warn(err);
}
var value = scope.$eval(attrs.timeSince);
element.addClass('ng-timesince');
element.data('datetime', value);
element.attr('title', moment.utc(value).toString());
element.attr('data-placement', 'left');
tooltip_func.bind(element)();
};
}]);
});
| define([
'app',
'utils',
+ 'moment',
'bootstrap/tooltip'
- ], function(app, utils, tooltip_func) {
+ ], function(app, utils, moment, tooltip_func) {
? ++++++++
'use strict';
// we use a shared timeout for the page loop to avoid
// extremely large amounts of $digest cycles and wasteful
// amounts of defers
window.setInterval(function(){
$('.ng-timesince').each(function(_, element){
var $element = angular.element(element),
value = $element.data('datetime');
$element.text(utils.time.timeSince(value));
});
}, 1000);
app.directive('timeSince', ['$timeout', function($timeout) {
return function timeSince(scope, element, attrs) {
var tagname = element[0].tagName.toLowerCase();
if (tagname == 'td' || tagname == 'dd') {
var err = "applying the timeSince directive to <" + tagname + "> tags" +
" can cause rendering issues. Instead, please put this on an" +
" inner <span> tag";
console.warn(err);
}
var value = scope.$eval(attrs.timeSince);
element.addClass('ng-timesince');
element.data('datetime', value);
- element.attr('title', (new Date(value)).toUTCString());
? ^ ----- ^ - ---
+ element.attr('title', moment.utc(value).toString());
? ^^^^ ^^^^
element.attr('data-placement', 'left');
tooltip_func.bind(element)();
};
}]);
}); | 5 | 0.131579 | 3 | 2 |
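The pitfall described in the commit message above — a parser silently picking a timezone for an ambiguous timestamp — has a close analogue in Python, where `strptime` yields a naive datetime. Below is a minimal sketch of the same "pin the zone explicitly instead of trusting a default" discipline; it is illustrative only and not code from the Changes repository.

```python
from datetime import datetime, timezone

# "2014-06-01 12:00:00" carries no zone information, so parsing it
# yields a *naive* datetime that different consumers may interpret
# differently -- the same ambiguity the commit above fixes in JS.
raw = "2014-06-01 12:00:00"
naive = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S")

# Pin the value to UTC explicitly rather than relying on a default.
explicit_utc = naive.replace(tzinfo=timezone.utc)

print(naive.tzinfo)              # None -> still ambiguous
print(explicit_utc.isoformat())  # 2014-06-01T12:00:00+00:00
```

The same reasoning is why the diff above switches from `new Date(value)` to `moment.utc(value)`: the zone is stated once, explicitly, instead of being guessed per browser.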
f1e4894d47541a4328b3f4fee60de74727ff918a | .travis.yml | .travis.yml | language: ruby
bundler_args: --deployment --without development
rvm:
- 1.9.3-p448
script: bundle exec rspec spec
before_script:
- mysql -u root -e "DELETE FROM mysql.user WHERE Host='localhost' AND User=''"
- mysql -u root -e "SHOW VARIABLES"
| language: ruby
bundler_args: --deployment --without development
rvm:
- 1.9.3-p448
script: bundle exec rspec spec
before_script:
- mysql -u root -e "DELETE FROM mysql.user WHERE Host='localhost' AND User=''"
- mysql -u root -e "SET GLOBAL innodb_stats_auto_recalc=ON, innodb_stats_on_metadata=ON, innodb_stats_persistent=OFF;"
- mysql -u root -e "SHOW VARIABLES"
| Set MySQL stats configuration for Travis | Set MySQL stats configuration for Travis
| YAML | apache-2.0 | cloudfoundry/cf-mysql-broker,cloudfoundry/cf-mysql-broker,SudarsananRengarajan/cf-mysql-broker,SudarsananRengarajan/cf-mysql-broker,cloudfoundry/cf-mysql-broker | yaml | ## Code Before:
language: ruby
bundler_args: --deployment --without development
rvm:
- 1.9.3-p448
script: bundle exec rspec spec
before_script:
- mysql -u root -e "DELETE FROM mysql.user WHERE Host='localhost' AND User=''"
- mysql -u root -e "SHOW VARIABLES"
## Instruction:
Set MySQL stats configuration for Travis
## Code After:
language: ruby
bundler_args: --deployment --without development
rvm:
- 1.9.3-p448
script: bundle exec rspec spec
before_script:
- mysql -u root -e "DELETE FROM mysql.user WHERE Host='localhost' AND User=''"
- mysql -u root -e "SET GLOBAL innodb_stats_auto_recalc=ON, innodb_stats_on_metadata=ON, innodb_stats_persistent=OFF;"
- mysql -u root -e "SHOW VARIABLES"
| language: ruby
bundler_args: --deployment --without development
rvm:
- 1.9.3-p448
script: bundle exec rspec spec
before_script:
- mysql -u root -e "DELETE FROM mysql.user WHERE Host='localhost' AND User=''"
+ - mysql -u root -e "SET GLOBAL innodb_stats_auto_recalc=ON, innodb_stats_on_metadata=ON, innodb_stats_persistent=OFF;"
- mysql -u root -e "SHOW VARIABLES"
+ | 2 | 0.166667 | 2 | 0 |
2d3d5c8b22604fb5954fc5b9a8969a6a9ea52ee9 | assemble-plugins.sh | assemble-plugins.sh |
MPS_PATH='/Applications/MPS 3.1'
echo "params $#"
echo $@
if [ "$#" == "0" ]; then
#we are running manually from the command line
ANT_BIN=ant
PROPS="-Dmps_home=\"${MPS_PATH}\""
else
#we are likely running with jenkins, the first parameter is the ant executable path, the others are the properties to pass to ant execution(s)
ANT_BIN="$1/ant"
shift
PROPS="$@"
fi
function assemble-plugin {
xml=$1
keyword=$2
#Ignore generate errors
rm -fr build/artifacts/*${keyword}* && "${ANT_BIN}" ${PROPS} -f ${xml} generate || true
"${ANT_BIN}" ${PROPS} -f ${xml} && cp build/artifacts/*${keyword}*/*.zip target/plugins
}
mkdir -p target/plugins
assemble-plugin ui.xml UI && \
assemble-plugin TextOutput.xml TextOutput && \
assemble-plugin logger.xml Logger && \
assemble-plugin background.xml Background && \
assemble-plugin ClusterConfig.xml ClusterConfig && \
assemble-plugin NYoSh.xml NYoSh && \
assemble-plugin GobyWeb.xml GobyWeb && \
assemble-plugin Interactive.xml Interactive
|
MPS_PATH='/Applications/MPS 3.1'
echo "params $#"
echo $@
if [ "$#" == "0" ]; then
#we are running manually from the command line
ANT_BIN=ant
PROPS="-Dmps_home=\"${MPS_PATH}\""
else
#we are likely running with jenkins, the first parameter is the ant executable path, the others are the properties to pass to ant execution(s)
ANT_BIN="$1/ant"
shift
PROPS="$@"
rm -rf ${MPS_HOME}/config/plugins/XChart
cp ../XChart/build/artifacts/XChart/XChart_*.zip ${MPS_HOME}/config/plugins/
cd ${MPS_HOME}/config/plugins/
unzip XChart_*.zip
cd -
fi
function assemble-plugin {
xml=$1
keyword=$2
#Ignore generate errors
rm -fr build/artifacts/*${keyword}* && "${ANT_BIN}" ${PROPS} -f ${xml} generate || true
"${ANT_BIN}" ${PROPS} -f ${xml} && cp build/artifacts/*${keyword}*/*.zip target/plugins
}
mkdir -p target/plugins
assemble-plugin ui.xml UI && \
assemble-plugin TextOutput.xml TextOutput && \
assemble-plugin logger.xml Logger && \
assemble-plugin background.xml Background && \
assemble-plugin ClusterConfig.xml ClusterConfig && \
assemble-plugin NYoSh.xml NYoSh && \
assemble-plugin GobyWeb.xml GobyWeb && \
assemble-plugin Interactive.xml Interactive
| Deploy XChart plugin before assembling the plugins | Deploy XChart plugin before assembling the plugins
| Shell | apache-2.0 | CampagneLaboratory/NYoSh,CampagneLaboratory/NYoSh | shell | ## Code Before:
MPS_PATH='/Applications/MPS 3.1'
echo "params $#"
echo $@
if [ "$#" == "0" ]; then
#we are running manually from the command line
ANT_BIN=ant
PROPS="-Dmps_home=\"${MPS_PATH}\""
else
#we are likely running with jenkins, the first parameter is the ant executable path, the others are the properties to pass to ant execution(s)
ANT_BIN="$1/ant"
shift
PROPS="$@"
fi
function assemble-plugin {
xml=$1
keyword=$2
#Ignore generate errors
rm -fr build/artifacts/*${keyword}* && "${ANT_BIN}" ${PROPS} -f ${xml} generate || true
"${ANT_BIN}" ${PROPS} -f ${xml} && cp build/artifacts/*${keyword}*/*.zip target/plugins
}
mkdir -p target/plugins
assemble-plugin ui.xml UI && \
assemble-plugin TextOutput.xml TextOutput && \
assemble-plugin logger.xml Logger && \
assemble-plugin background.xml Background && \
assemble-plugin ClusterConfig.xml ClusterConfig && \
assemble-plugin NYoSh.xml NYoSh && \
assemble-plugin GobyWeb.xml GobyWeb && \
assemble-plugin Interactive.xml Interactive
## Instruction:
Deploy XChart plugin before assembling the plugins
## Code After:
MPS_PATH='/Applications/MPS 3.1'
echo "params $#"
echo $@
if [ "$#" == "0" ]; then
#we are running manually from the command line
ANT_BIN=ant
PROPS="-Dmps_home=\"${MPS_PATH}\""
else
#we are likely running with jenkins, the first parameter is the ant executable path, the others are the properties to pass to ant execution(s)
ANT_BIN="$1/ant"
shift
PROPS="$@"
rm -rf ${MPS_HOME}/config/plugins/XChart
cp ../XChart/build/artifacts/XChart/XChart_*.zip ${MPS_HOME}/config/plugins/
cd ${MPS_HOME}/config/plugins/
unzip XChart_*.zip
cd -
fi
function assemble-plugin {
xml=$1
keyword=$2
#Ignore generate errors
rm -fr build/artifacts/*${keyword}* && "${ANT_BIN}" ${PROPS} -f ${xml} generate || true
"${ANT_BIN}" ${PROPS} -f ${xml} && cp build/artifacts/*${keyword}*/*.zip target/plugins
}
mkdir -p target/plugins
assemble-plugin ui.xml UI && \
assemble-plugin TextOutput.xml TextOutput && \
assemble-plugin logger.xml Logger && \
assemble-plugin background.xml Background && \
assemble-plugin ClusterConfig.xml ClusterConfig && \
assemble-plugin NYoSh.xml NYoSh && \
assemble-plugin GobyWeb.xml GobyWeb && \
assemble-plugin Interactive.xml Interactive
|
MPS_PATH='/Applications/MPS 3.1'
echo "params $#"
echo $@
if [ "$#" == "0" ]; then
#we are running manually from the command line
ANT_BIN=ant
PROPS="-Dmps_home=\"${MPS_PATH}\""
else
#we are likely running with jenkins, the first parameter is the ant executable path, the others are the properties to pass to ant execution(s)
ANT_BIN="$1/ant"
shift
PROPS="$@"
+ rm -rf ${MPS_HOME}/config/plugins/XChart
+ cp ../XChart/build/artifacts/XChart/XChart_*.zip ${MPS_HOME}/config/plugins/
+ cd ${MPS_HOME}/config/plugins/
+ unzip XChart_*.zip
+ cd -
fi
function assemble-plugin {
xml=$1
keyword=$2
#Ignore generate errors
rm -fr build/artifacts/*${keyword}* && "${ANT_BIN}" ${PROPS} -f ${xml} generate || true
"${ANT_BIN}" ${PROPS} -f ${xml} && cp build/artifacts/*${keyword}*/*.zip target/plugins
}
mkdir -p target/plugins
assemble-plugin ui.xml UI && \
assemble-plugin TextOutput.xml TextOutput && \
assemble-plugin logger.xml Logger && \
assemble-plugin background.xml Background && \
assemble-plugin ClusterConfig.xml ClusterConfig && \
assemble-plugin NYoSh.xml NYoSh && \
assemble-plugin GobyWeb.xml GobyWeb && \
assemble-plugin Interactive.xml Interactive | 5 | 0.147059 | 5 | 0 |
569ab8870cfa67fc910965158c142c3ea03e9592 | CommonCheckTargets.cmake | CommonCheckTargets.cmake |
include(GetSourceFilesFromTarget)
include(CommonClangCheck)
include(CommonCPPCheck)
include(CommonCPPLint)
function(common_check_targets _name)
set(_exclude_pattern ".*moc_|.*qrc_.*\\.c.*$") # Qt moc and qrc files
# Get the list of files once for all check targets
get_source_files(${_name} ${_exclude_pattern})
if(NOT ${_name}_FILES)
return()
endif()
common_clangcheck(${_name} FILES ${${_name}_FILES})
common_cppcheck(${_name} FILES ${${_name}_FILES}
POSSIBLE_ERROR FAIL_ON_WARNINGS)
common_cpplint(${_name} FILES ${${_name}_FILES} CATEGORY_FILTER_OUT readability/streams)
endfunction()
|
include(GetSourceFilesFromTarget)
include(CommonClangCheck)
include(CommonCPPCheck)
include(CommonCPPLint)
function(common_check_targets _name)
# Qt moc & qrc files, C and Objective-C files
set(_exclude_pattern ".*moc_|.*qrc_.*\\.c.*|.*\\.mm.*$")
# Get the list of files once for all check targets
get_source_files(${_name} ${_exclude_pattern})
if(NOT ${_name}_FILES)
return()
endif()
common_clangcheck(${_name} FILES ${${_name}_FILES})
common_cppcheck(${_name} FILES ${${_name}_FILES}
POSSIBLE_ERROR FAIL_ON_WARNINGS)
common_cpplint(${_name} FILES ${${_name}_FILES} CATEGORY_FILTER_OUT readability/streams)
endfunction()
| Exclude Objective-C files from cppcheck | Exclude Objective-C files from cppcheck
| CMake | bsd-3-clause | ptoharia/CMake,ptoharia/CMake,ptoharia/CMake,ptoharia/CMake | cmake | ## Code Before:
include(GetSourceFilesFromTarget)
include(CommonClangCheck)
include(CommonCPPCheck)
include(CommonCPPLint)
function(common_check_targets _name)
set(_exclude_pattern ".*moc_|.*qrc_.*\\.c.*$") # Qt moc and qrc files
# Get the list of files once for all check targets
get_source_files(${_name} ${_exclude_pattern})
if(NOT ${_name}_FILES)
return()
endif()
common_clangcheck(${_name} FILES ${${_name}_FILES})
common_cppcheck(${_name} FILES ${${_name}_FILES}
POSSIBLE_ERROR FAIL_ON_WARNINGS)
common_cpplint(${_name} FILES ${${_name}_FILES} CATEGORY_FILTER_OUT readability/streams)
endfunction()
## Instruction:
Exclude Objective-C files from cppcheck
## Code After:
include(GetSourceFilesFromTarget)
include(CommonClangCheck)
include(CommonCPPCheck)
include(CommonCPPLint)
function(common_check_targets _name)
# Qt moc & qrc files, C and Objective-C files
set(_exclude_pattern ".*moc_|.*qrc_.*\\.c.*|.*\\.mm.*$")
# Get the list of files once for all check targets
get_source_files(${_name} ${_exclude_pattern})
if(NOT ${_name}_FILES)
return()
endif()
common_clangcheck(${_name} FILES ${${_name}_FILES})
common_cppcheck(${_name} FILES ${${_name}_FILES}
POSSIBLE_ERROR FAIL_ON_WARNINGS)
common_cpplint(${_name} FILES ${${_name}_FILES} CATEGORY_FILTER_OUT readability/streams)
endfunction()
|
include(GetSourceFilesFromTarget)
include(CommonClangCheck)
include(CommonCPPCheck)
include(CommonCPPLint)
function(common_check_targets _name)
- set(_exclude_pattern ".*moc_|.*qrc_.*\\.c.*$") # Qt moc and qrc files
+ # Qt moc & qrc files, C and Objective-C files
+ set(_exclude_pattern ".*moc_|.*qrc_.*\\.c.*|.*\\.mm.*$")
# Get the list of files once for all check targets
get_source_files(${_name} ${_exclude_pattern})
if(NOT ${_name}_FILES)
return()
endif()
common_clangcheck(${_name} FILES ${${_name}_FILES})
common_cppcheck(${_name} FILES ${${_name}_FILES}
POSSIBLE_ERROR FAIL_ON_WARNINGS)
common_cpplint(${_name} FILES ${${_name}_FILES} CATEGORY_FILTER_OUT readability/streams)
endfunction() | 3 | 0.15 | 2 | 1 |
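The exclude pattern in the commit above is simple enough to sanity-check outside CMake. Python's `re` module accepts the same expression (treating the two regex dialects as equivalent for this particular pattern is an assumption — CMake's engine is more limited than Python's), so the intended matches can be exercised directly:

```python
import re

# The pattern from CommonCheckTargets.cmake: Qt moc/qrc artifacts,
# plus the Objective-C(++) .mm files added by the commit above.
exclude = re.compile(r".*moc_|.*qrc_.*\.c.*|.*\.mm.*$")

assert exclude.search("moc_mainwindow.cpp")   # Qt moc output
assert exclude.search("qrc_resources.cpp")    # Qt qrc output
assert exclude.search("CocoaWindow.mm")       # Objective-C++ source
assert exclude.search("Widget.cpp") is None   # ordinary C++ stays in
```

Note that the alternation binds loosely: the trailing `$` anchors only the third alternative, so `.*moc_` and `.*qrc_.*\.c.*` match anywhere in the file name.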
91d57caee910c9095988f6d86959ed1c1353e3c7 | pubspec.yaml | pubspec.yaml | name: statemachine
version: 1.4.2
author: Lukas Renggli <renggli@gmail.com>
homepage: https://github.com/renggli/dart-statemachine
description: A simple, yet powerful state machine framework for Dart supporting
Flutter and web apps.
environment:
sdk: ^2.0.0
dev_dependencies:
build_runner: ^0.9.0
build_test: ^0.10.0
build_web_compilers: ^0.4.0
test: ^1.3.0
| name: statemachine
version: 1.4.2
author: Lukas Renggli <renggli@gmail.com>
homepage: https://github.com/renggli/dart-statemachine
description: A simple, yet powerful state machine framework for Dart supporting
Flutter and web apps.
environment:
sdk: '>=2.0.0-dev.61.0 <3.0.0'
dev_dependencies:
build_runner: ^0.9.0
build_test: ^0.10.0
build_web_compilers: ^0.4.0
test: ^1.3.0
| Revert SDK version to make Travis and Flutter work again | Revert SDK version to make Travis and Flutter work again
| YAML | mit | renggli/dart-statemachine,renggli/dart-statemachine | yaml | ## Code Before:
name: statemachine
version: 1.4.2
author: Lukas Renggli <renggli@gmail.com>
homepage: https://github.com/renggli/dart-statemachine
description: A simple, yet powerful state machine framework for Dart supporting
Flutter and web apps.
environment:
sdk: ^2.0.0
dev_dependencies:
build_runner: ^0.9.0
build_test: ^0.10.0
build_web_compilers: ^0.4.0
test: ^1.3.0
## Instruction:
Revert SDK version to make Travis and Flutter work again
## Code After:
name: statemachine
version: 1.4.2
author: Lukas Renggli <renggli@gmail.com>
homepage: https://github.com/renggli/dart-statemachine
description: A simple, yet powerful state machine framework for Dart supporting
Flutter and web apps.
environment:
sdk: '>=2.0.0-dev.61.0 <3.0.0'
dev_dependencies:
build_runner: ^0.9.0
build_test: ^0.10.0
build_web_compilers: ^0.4.0
test: ^1.3.0
| name: statemachine
version: 1.4.2
author: Lukas Renggli <renggli@gmail.com>
homepage: https://github.com/renggli/dart-statemachine
description: A simple, yet powerful state machine framework for Dart supporting
Flutter and web apps.
environment:
- sdk: ^2.0.0
+ sdk: '>=2.0.0-dev.61.0 <3.0.0'
dev_dependencies:
build_runner: ^0.9.0
build_test: ^0.10.0
build_web_compilers: ^0.4.0
test: ^1.3.0 | 2 | 0.133333 | 1 | 1 |
6593e83468f4f47fdfabca3f397fd82432bab1f0 | README.md | README.md | > Automatically generate queries and mutations from Sequelize models
---
## Why
- Less error prone development. No more keeping GraphQL in sync with Database fields.
- [Don't Repeat Yourself](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself).
- Power of GraphQL and Relay with rapid database development of Sequelize
## Features
- [ ] Generated GraphQL API only from Sequelize Models defintitions
- [ ] [Relay](https://facebook.github.io/relay/) compatiable GraphQL API
- [ ] Generate READ Queries
- [ ] Generate CREATE, UPDATE, DELETE Mutations
- [ ] Custom queries and mutations within Sequelize Models defitions
|
[](https://nodei.co/npm/graphql-sequelize-crud/)
[](https://nodei.co/npm/graphql-sequelize-crud/)
> Automatically generate queries and mutations from Sequelize models
---
## Why
- Less error prone development. No more keeping GraphQL in sync with Database fields.
- [Don't Repeat Yourself](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself).
- Power of GraphQL and Relay with rapid database development of Sequelize
## Features
- [ ] Generated GraphQL API only from Sequelize Models defintitions
- [ ] [Relay](https://facebook.github.io/relay/) compatiable GraphQL API
- [ ] Generate READ Queries
- [ ] Generate CREATE, UPDATE, DELETE Mutations
- [ ] Custom queries and mutations within Sequelize Models defitions
| Add Travis CI and Nodei.co badges | Add Travis CI and Nodei.co badges | Markdown | mit | jayprakash1/graphql-sequelize-crud,Glavin001/graphql-sequelize-crud | markdown | ## Code Before:
> Automatically generate queries and mutations from Sequelize models
---
## Why
- Less error prone development. No more keeping GraphQL in sync with Database fields.
- [Don't Repeat Yourself](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself).
- Power of GraphQL and Relay with rapid database development of Sequelize
## Features
- [ ] Generated GraphQL API only from Sequelize Models defintitions
- [ ] [Relay](https://facebook.github.io/relay/) compatiable GraphQL API
- [ ] Generate READ Queries
- [ ] Generate CREATE, UPDATE, DELETE Mutations
- [ ] Custom queries and mutations within Sequelize Models defitions
## Instruction:
Add Travis CI and Nodei.co badges
## Code After:
[](https://nodei.co/npm/graphql-sequelize-crud/)
[](https://nodei.co/npm/graphql-sequelize-crud/)
> Automatically generate queries and mutations from Sequelize models
---
## Why
- Less error prone development. No more keeping GraphQL in sync with Database fields.
- [Don't Repeat Yourself](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself).
- Power of GraphQL and Relay with rapid database development of Sequelize
## Features
- [ ] Generated GraphQL API only from Sequelize Models defintitions
- [ ] [Relay](https://facebook.github.io/relay/) compatiable GraphQL API
- [ ] Generate READ Queries
- [ ] Generate CREATE, UPDATE, DELETE Mutations
- [ ] Custom queries and mutations within Sequelize Models defitions
| +
+ [](https://nodei.co/npm/graphql-sequelize-crud/)
+ [](https://nodei.co/npm/graphql-sequelize-crud/)
+
> Automatically generate queries and mutations from Sequelize models
---
## Why
- Less error prone development. No more keeping GraphQL in sync with Database fields.
- [Don't Repeat Yourself](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself).
- Power of GraphQL and Relay with rapid database development of Sequelize
## Features
- [ ] Generated GraphQL API only from Sequelize Models defintitions
- [ ] [Relay](https://facebook.github.io/relay/) compatiable GraphQL API
- [ ] Generate READ Queries
- [ ] Generate CREATE, UPDATE, DELETE Mutations
- [ ] Custom queries and mutations within Sequelize Models defitions | 4 | 0.25 | 4 | 0 |
22915dc814062ba75ed1efb1502106db74f1e679 | .travis.yml | .travis.yml | language: php
sudo: required
services:
- docker
php:
- '5.6'
before_install:
- docker --version
- docker-compose --version
- docker-compose build >> /dev/null
- docker-compose run --rm php composer install --prefer-source --no-interaction --dev >> /dev/null
- docker-compose run --rm node npm install --loglevel error
before_script:
- export DISPLAY=:99.0
- sh -e /etc/init.d/xvfb start
script:
- make phpunit
- make phpcs
- make karma options='--single-run'
notifications:
on_success: never
on_failure: always
| language: php
sudo: required
services:
- docker
php:
- '5.6'
before_install:
- docker --version
- docker-compose --version
- docker-compose build --no-cache >> /dev/null
- docker-compose run --rm php composer install --prefer-source --no-interaction --dev >> /dev/null
- docker-compose run --rm node npm install --loglevel error
before_script:
- export DISPLAY=:99.0
- sh -e /etc/init.d/xvfb start
script:
- make phpunit
- make phpcs
- make karma options='--single-run'
notifications:
on_success: never
on_failure: always
| Rebuild docker images in Travis. | Rebuild docker images in Travis.
| YAML | mit | IDCI-Consulting/ExtraFormBundle,IDCI-Consulting/ExtraFormBundle | yaml | ## Code Before:
language: php
sudo: required
services:
- docker
php:
- '5.6'
before_install:
- docker --version
- docker-compose --version
- docker-compose build >> /dev/null
- docker-compose run --rm php composer install --prefer-source --no-interaction --dev >> /dev/null
- docker-compose run --rm node npm install --loglevel error
before_script:
- export DISPLAY=:99.0
- sh -e /etc/init.d/xvfb start
script:
- make phpunit
- make phpcs
- make karma options='--single-run'
notifications:
on_success: never
on_failure: always
## Instruction:
Rebuild docker images in Travis.
## Code After:
language: php
sudo: required
services:
- docker
php:
- '5.6'
before_install:
- docker --version
- docker-compose --version
- docker-compose build --no-cache >> /dev/null
- docker-compose run --rm php composer install --prefer-source --no-interaction --dev >> /dev/null
- docker-compose run --rm node npm install --loglevel error
before_script:
- export DISPLAY=:99.0
- sh -e /etc/init.d/xvfb start
script:
- make phpunit
- make phpcs
- make karma options='--single-run'
notifications:
on_success: never
on_failure: always
| language: php
sudo: required
services:
- docker
php:
- '5.6'
before_install:
- docker --version
- docker-compose --version
- - docker-compose build >> /dev/null
+ - docker-compose build --no-cache >> /dev/null
? +++++++++++
- docker-compose run --rm php composer install --prefer-source --no-interaction --dev >> /dev/null
- docker-compose run --rm node npm install --loglevel error
before_script:
- export DISPLAY=:99.0
- sh -e /etc/init.d/xvfb start
script:
- make phpunit
- make phpcs
- make karma options='--single-run'
notifications:
on_success: never
on_failure: always | 2 | 0.064516 | 1 | 1 |
e06f04df16b8b0b4404673fcfa381cfe04013b9a | server/features/plebiscite.js | server/features/plebiscite.js | 'use strict';
const Promise = require('bluebird');
const {MongoClient} = require('mongodb');
const MONGO_URL = process.env.MONGO_URL;
const MONGO_COLLECTION = process.env.MONGO_COLLECTION;
const mdb_delayed = MongoClient.connect(MONGO_URL);
class Plebiscite {
constructor({db}) {
this._db = db;
}
* _getCollection() {
return (yield this._db).collection(MONGO_COLLECTION);
}
* find({substance, offset, limit}) {
const collection = yield* this._getCollection();
return yield collection.find({
'substanceInfo.substance': substance
})
.sort({'meta.published': -1})
.skip(offset)
.limit(limit)
.toArray();
}
}
module.exports = new Plebiscite({db: mdb_delayed});
| 'use strict';
const Promise = require('bluebird');
const {MongoClient} = require('mongodb');
const MONGO_URL = process.env.MONGO_URL;
const MONGO_COLLECTION = process.env.MONGO_COLLECTION;
const mdb_delayed = MongoClient.connect(MONGO_URL);
class Plebiscite {
constructor({db}) {
this._db = db;
}
* _getCollection() {
return (yield this._db).collection(MONGO_COLLECTION);
}
* find({substance, offset, limit}) {
const collection = yield* this._getCollection();
const query = {};
if (!substance === true) {
query['substanceInfo.substance'] = substance;
}
return yield collection.find(query)
.sort({'meta.published': -1})
.skip(offset)
.limit(limit)
.toArray();
}
}
module.exports = new Plebiscite({db: mdb_delayed});
| Allow empty query for substances | Allow empty query for substances
| JavaScript | mit | psychonautwiki/bifrost,officialpsychonautwiki/bifrost | javascript | ## Code Before:
'use strict';
const Promise = require('bluebird');
const {MongoClient} = require('mongodb');
const MONGO_URL = process.env.MONGO_URL;
const MONGO_COLLECTION = process.env.MONGO_COLLECTION;
const mdb_delayed = MongoClient.connect(MONGO_URL);
class Plebiscite {
constructor({db}) {
this._db = db;
}
* _getCollection() {
return (yield this._db).collection(MONGO_COLLECTION);
}
* find({substance, offset, limit}) {
const collection = yield* this._getCollection();
return yield collection.find({
'substanceInfo.substance': substance
})
.sort({'meta.published': -1})
.skip(offset)
.limit(limit)
.toArray();
}
}
module.exports = new Plebiscite({db: mdb_delayed});
## Instruction:
Allow empty query for substances
## Code After:
'use strict';
const Promise = require('bluebird');
const {MongoClient} = require('mongodb');
const MONGO_URL = process.env.MONGO_URL;
const MONGO_COLLECTION = process.env.MONGO_COLLECTION;
const mdb_delayed = MongoClient.connect(MONGO_URL);
class Plebiscite {
constructor({db}) {
this._db = db;
}
* _getCollection() {
return (yield this._db).collection(MONGO_COLLECTION);
}
* find({substance, offset, limit}) {
const collection = yield* this._getCollection();
const query = {};
if (!substance === true) {
query['substanceInfo.substance'] = substance;
}
return yield collection.find(query)
.sort({'meta.published': -1})
.skip(offset)
.limit(limit)
.toArray();
}
}
module.exports = new Plebiscite({db: mdb_delayed});
| 'use strict';
const Promise = require('bluebird');
const {MongoClient} = require('mongodb');
const MONGO_URL = process.env.MONGO_URL;
const MONGO_COLLECTION = process.env.MONGO_COLLECTION;
const mdb_delayed = MongoClient.connect(MONGO_URL);
class Plebiscite {
constructor({db}) {
this._db = db;
}
* _getCollection() {
return (yield this._db).collection(MONGO_COLLECTION);
}
* find({substance, offset, limit}) {
const collection = yield* this._getCollection();
- return yield collection.find({
+ const query = {};
+
+ if (!substance === true) {
- 'substanceInfo.substance': substance
? ^
+ query['substanceInfo.substance'] = substance;
? ++++++ ^^^ +
- })
? -
+ }
+
+ return yield collection.find(query)
.sort({'meta.published': -1})
.skip(offset)
.limit(limit)
.toArray();
}
}
module.exports = new Plebiscite({db: mdb_delayed}); | 10 | 0.294118 | 7 | 3 |
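The intent stated in the commit above — start from an empty query object and add a filter clause only when the caller actually supplied one, so an absent filter matches every document — is a common pattern with any document store. Here is a hedged Python sketch of the same idea against a plain list of dicts; the names are illustrative and this is not the MongoDB driver API:

```python
def build_query(substance=None):
    """Return a Mongo-style filter dict; an empty dict matches everything."""
    query = {}
    if substance:  # add the clause only when a value was provided
        query["substanceInfo.substance"] = substance
    return query

def matches(doc, query):
    """Tiny stand-in for a driver's find(): dotted keys walk nested dicts."""
    for dotted_key, expected in query.items():
        value = doc
        for part in dotted_key.split("."):
            value = value.get(part, {})
        if value != expected:
            return False
    return True

docs = [
    {"substanceInfo": {"substance": "caffeine"}},
    {"substanceInfo": {"substance": "theanine"}},
]
print([matches(d, build_query()) for d in docs])            # [True, True]
print([matches(d, build_query("caffeine")) for d in docs])  # [True, False]
```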
835e312d80bc5c6fc3c2931be66618cec91d0c78 | _config.yml | _config.yml | name: Alykhan Kanji
exclude: [README.md, LICENSE.md, CNAME, Gemfile, Gemfile.lock, vendor]
permalink: blog/:categories/:year/:month/:day/:title/
defaults:
-
scope:
path: ""
type: "pages"
values:
layout: "default"
-
scope:
path: ""
type: "posts"
values:
layout: "post"
gems:
- jekyll_feed
| name: Alykhan Kanji
exclude: [README.md, LICENSE.md, CNAME, Gemfile, Gemfile.lock, vendor]
permalink: blog/:categories/:year/:month/:day/:title/
defaults:
-
scope:
path: ""
type: "pages"
values:
layout: "default"
-
scope:
path: ""
type: "posts"
values:
layout: "post"
| Revert "Set up automatic Atom feed generation" | Revert "Set up automatic Atom feed generation"
This reverts commit 35469e2231d2a9b92bf320b02c7878d2cbf9fa0b.
| YAML | mit | alykhank/alykhank.github.io,alykhank/alykhank.github.io,alykhank/alykhank.github.io | yaml | ## Code Before:
name: Alykhan Kanji
exclude: [README.md, LICENSE.md, CNAME, Gemfile, Gemfile.lock, vendor]
permalink: blog/:categories/:year/:month/:day/:title/
defaults:
-
scope:
path: ""
type: "pages"
values:
layout: "default"
-
scope:
path: ""
type: "posts"
values:
layout: "post"
gems:
- jekyll_feed
## Instruction:
Revert "Set up automatic Atom feed generation"
This reverts commit 35469e2231d2a9b92bf320b02c7878d2cbf9fa0b.
## Code After:
name: Alykhan Kanji
exclude: [README.md, LICENSE.md, CNAME, Gemfile, Gemfile.lock, vendor]
permalink: blog/:categories/:year/:month/:day/:title/
defaults:
-
scope:
path: ""
type: "pages"
values:
layout: "default"
-
scope:
path: ""
type: "posts"
values:
layout: "post"
| name: Alykhan Kanji
exclude: [README.md, LICENSE.md, CNAME, Gemfile, Gemfile.lock, vendor]
permalink: blog/:categories/:year/:month/:day/:title/
defaults:
-
scope:
path: ""
type: "pages"
values:
layout: "default"
-
scope:
path: ""
type: "posts"
values:
layout: "post"
- gems:
- - jekyll_feed | 2 | 0.111111 | 0 | 2 |
741352c97e8a274d0ae3ea5a3933f50ec0038594 | app/players/philipp-lahm.json | app/players/philipp-lahm.json | {
"name": "Philipp Lahm",
"id": "philipp-lahm",
"image": "img/players/philipp-lahm.jpg",
"videos": ["Xjmy1P9gr8E","PabsSYHoQ-s"],
"nickname": "The Magic Dwarf."
}
| {
"name": "Philipp Lahm",
"id": "philipp-lahm",
"image": "img/players/philipp-lahm.jpg",
"videos": ["b3GaUVJOprM","PabsSYHoQ-s"],
"nickname": "The Magic Dwarf."
}
| Fix broken video for Philipp Lahm | Fix broken video for Philipp Lahm
| JSON | mit | rishadafc/potw,rishadafc/potw | json | ## Code Before:
{
"name": "Philipp Lahm",
"id": "philipp-lahm",
"image": "img/players/philipp-lahm.jpg",
"videos": ["Xjmy1P9gr8E","PabsSYHoQ-s"],
"nickname": "The Magic Dwarf."
}
## Instruction:
Fix broken video for Philipp Lahm
## Code After:
{
"name": "Philipp Lahm",
"id": "philipp-lahm",
"image": "img/players/philipp-lahm.jpg",
"videos": ["b3GaUVJOprM","PabsSYHoQ-s"],
"nickname": "The Magic Dwarf."
}
| {
"name": "Philipp Lahm",
"id": "philipp-lahm",
"image": "img/players/philipp-lahm.jpg",
- "videos": ["Xjmy1P9gr8E","PabsSYHoQ-s"],
? ^^^^^^^^ ^^
+ "videos": ["b3GaUVJOprM","PabsSYHoQ-s"],
? ^^^^^^^^^ ^
"nickname": "The Magic Dwarf."
} | 2 | 0.285714 | 1 | 1 |
4b7c5967068b5b37aa20a7dca36e9eddf204a625 | README.md | README.md |
This repository contains lectures for [CAS frontend engineering course](http://www.hsr.ch/Front-End-Engineering.12432.0.html) at the technical college in Rapperswil, Switzerland.
The lectures are:
- [Client Testing and Project Automation](https://github.com/tjunghans/lectures/blob/master/fe-testing/main.md)
- [Deployment, Scaling & Performance](https://github.com/tjunghans/lectures/tree/master/fe-performance/main.md)
The lecture notes and examples are in english since english is the language of the software and frontend developer. The lecture notes are written in markdown on Github so that they can be versioned and improved online by myself as well as participants and the community using [pull requests](https://help.github.com/articles/using-pull-requests).
If you have any questions feel free to contact me at thomas.junghans@gmail.com.
|
This repository contains lectures for [CAS frontend engineering course](http://www.hsr.ch/Front-End-Engineering.12432.0.html) at the technical college in Rapperswil, Switzerland.
The lectures are:
- [Client Testing and Project Automation](https://github.com/tjunghans/lectures/blob/master/fe-testing/main.md)
- ~~[Deployment, Scaling & Performance](https://github.com/tjunghans/lectures/tree/master/fe-performance/main.md)~~ Notes are in a Powerpoint which will be linked during the course.
The lecture notes and examples are in english since english is the language of the software and frontend developer. The lecture notes are written in markdown on Github so that they can be versioned and improved online by myself as well as participants and the community using [pull requests](https://help.github.com/articles/using-pull-requests).
If you have any questions feel free to contact me at thomas.junghans@gmail.com.
| Add note to performance link | Add note to performance link | Markdown | mit | tjunghans/lectures | markdown | ## Code Before:
This repository contains lectures for [CAS frontend engineering course](http://www.hsr.ch/Front-End-Engineering.12432.0.html) at the technical college in Rapperswil, Switzerland.
The lectures are:
- [Client Testing and Project Automation](https://github.com/tjunghans/lectures/blob/master/fe-testing/main.md)
- [Deployment, Scaling & Performance](https://github.com/tjunghans/lectures/tree/master/fe-performance/main.md)
The lecture notes and examples are in english since english is the language of the software and frontend developer. The lecture notes are written in markdown on Github so that they can be versioned and improved online by myself as well as participants and the community using [pull requests](https://help.github.com/articles/using-pull-requests).
If you have any questions feel free to contact me at thomas.junghans@gmail.com.
## Instruction:
Add note to performance link
## Code After:
This repository contains lectures for [CAS frontend engineering course](http://www.hsr.ch/Front-End-Engineering.12432.0.html) at the technical college in Rapperswil, Switzerland.
The lectures are:
- [Client Testing and Project Automation](https://github.com/tjunghans/lectures/blob/master/fe-testing/main.md)
- ~~[Deployment, Scaling & Performance](https://github.com/tjunghans/lectures/tree/master/fe-performance/main.md)~~ Notes are in a Powerpoint which will be linked during the course.
The lecture notes and examples are in english since english is the language of the software and frontend developer. The lecture notes are written in markdown on Github so that they can be versioned and improved online by myself as well as participants and the community using [pull requests](https://help.github.com/articles/using-pull-requests).
If you have any questions feel free to contact me at thomas.junghans@gmail.com.
|
This repository contains lectures for [CAS frontend engineering course](http://www.hsr.ch/Front-End-Engineering.12432.0.html) at the technical college in Rapperswil, Switzerland.
The lectures are:
- [Client Testing and Project Automation](https://github.com/tjunghans/lectures/blob/master/fe-testing/main.md)
- - [Deployment, Scaling & Performance](https://github.com/tjunghans/lectures/tree/master/fe-performance/main.md)
+ - ~~[Deployment, Scaling & Performance](https://github.com/tjunghans/lectures/tree/master/fe-performance/main.md)~~ Notes are in a Powerpoint which will be linked during the course.
? ++ ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
The lecture notes and examples are in english since english is the language of the software and frontend developer. The lecture notes are written in markdown on Github so that they can be versioned and improved online by myself as well as participants and the community using [pull requests](https://help.github.com/articles/using-pull-requests).
If you have any questions feel free to contact me at thomas.junghans@gmail.com. | 2 | 0.181818 | 1 | 1 |
8513491340ae4158310f79b79e69fb08577ab96e | library/base64/decode64_spec.rb | library/base64/decode64_spec.rb | require_relative '../../spec_helper'
require 'base64'
describe "Base64#decode64" do
it "returns the Base64-decoded version of the given string" do
Base64.decode64("U2VuZCByZWluZm9yY2VtZW50cw==\n").should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given shared string" do
Base64.decode64("base64: U2VuZCByZWluZm9yY2VtZW50cw==\n".split(" ").last).should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given string with wrong padding" do
Base64.decode64("XU2VuZCByZWluZm9yY2VtZW50cw===").should == "]M\x95\xB9\x90\x81\xC9\x95\xA5\xB9\x99\xBD\xC9\x8D\x95\xB5\x95\xB9\xD1\xCC".b
end
it "returns the Base64-decoded version of the given string that contains an invalid character" do
Base64.decode64("%3D").should == "\xDC".b
end
it "returns a binary encoded string" do
Base64.decode64("SEk=").encoding.should == Encoding::BINARY
end
end
| require_relative '../../spec_helper'
require 'base64'
describe "Base64#decode64" do
it "returns the Base64-decoded version of the given string" do
Base64.decode64("U2VuZCByZWluZm9yY2VtZW50cw==\n").should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given shared string" do
Base64.decode64("base64: U2VuZCByZWluZm9yY2VtZW50cw==\n".split(" ").last).should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given string with wrong padding" do
Base64.decode64("XU2VuZCByZWluZm9yY2VtZW50cw===").should == "]M\x95\xB9\x90\x81\xC9\x95\xA5\xB9\x99\xBD\xC9\x8D\x95\xB5\x95\xB9\xD1\xCC".b
end
it "returns the Base64-decoded version of the given string that contains an invalid character" do
Base64.decode64("%3D").should == "\xDC".b
end
it "returns a binary encoded string" do
Base64.decode64("SEk=").encoding.should == Encoding::BINARY
end
it "decodes without padding suffix ==" do
Base64.decode64("eyJrZXkiOnsibiI6InR0dCJ9fQ").should == "{\"key\":{\"n\":\"ttt\"}}"
end
end
| Fix base64 decoding issue with unpack m | Fix base64 decoding issue with unpack m
| Ruby | mit | nobu/rubyspec,ruby/spec,ruby/spec,nobu/rubyspec,ruby/spec,nobu/rubyspec | ruby | ## Code Before:
require_relative '../../spec_helper'
require 'base64'
describe "Base64#decode64" do
it "returns the Base64-decoded version of the given string" do
Base64.decode64("U2VuZCByZWluZm9yY2VtZW50cw==\n").should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given shared string" do
Base64.decode64("base64: U2VuZCByZWluZm9yY2VtZW50cw==\n".split(" ").last).should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given string with wrong padding" do
Base64.decode64("XU2VuZCByZWluZm9yY2VtZW50cw===").should == "]M\x95\xB9\x90\x81\xC9\x95\xA5\xB9\x99\xBD\xC9\x8D\x95\xB5\x95\xB9\xD1\xCC".b
end
it "returns the Base64-decoded version of the given string that contains an invalid character" do
Base64.decode64("%3D").should == "\xDC".b
end
it "returns a binary encoded string" do
Base64.decode64("SEk=").encoding.should == Encoding::BINARY
end
end
## Instruction:
Fix base64 decoding issue with unpack m
## Code After:
require_relative '../../spec_helper'
require 'base64'
describe "Base64#decode64" do
it "returns the Base64-decoded version of the given string" do
Base64.decode64("U2VuZCByZWluZm9yY2VtZW50cw==\n").should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given shared string" do
Base64.decode64("base64: U2VuZCByZWluZm9yY2VtZW50cw==\n".split(" ").last).should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given string with wrong padding" do
Base64.decode64("XU2VuZCByZWluZm9yY2VtZW50cw===").should == "]M\x95\xB9\x90\x81\xC9\x95\xA5\xB9\x99\xBD\xC9\x8D\x95\xB5\x95\xB9\xD1\xCC".b
end
it "returns the Base64-decoded version of the given string that contains an invalid character" do
Base64.decode64("%3D").should == "\xDC".b
end
it "returns a binary encoded string" do
Base64.decode64("SEk=").encoding.should == Encoding::BINARY
end
it "decodes without padding suffix ==" do
Base64.decode64("eyJrZXkiOnsibiI6InR0dCJ9fQ").should == "{\"key\":{\"n\":\"ttt\"}}"
end
end
| require_relative '../../spec_helper'
require 'base64'
describe "Base64#decode64" do
it "returns the Base64-decoded version of the given string" do
Base64.decode64("U2VuZCByZWluZm9yY2VtZW50cw==\n").should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given shared string" do
Base64.decode64("base64: U2VuZCByZWluZm9yY2VtZW50cw==\n".split(" ").last).should == "Send reinforcements"
end
it "returns the Base64-decoded version of the given string with wrong padding" do
Base64.decode64("XU2VuZCByZWluZm9yY2VtZW50cw===").should == "]M\x95\xB9\x90\x81\xC9\x95\xA5\xB9\x99\xBD\xC9\x8D\x95\xB5\x95\xB9\xD1\xCC".b
end
it "returns the Base64-decoded version of the given string that contains an invalid character" do
Base64.decode64("%3D").should == "\xDC".b
end
it "returns a binary encoded string" do
Base64.decode64("SEk=").encoding.should == Encoding::BINARY
end
+
+ it "decodes without padding suffix ==" do
+ Base64.decode64("eyJrZXkiOnsibiI6InR0dCJ9fQ").should == "{\"key\":{\"n\":\"ttt\"}}"
+ end
end | 4 | 0.16 | 4 | 0 |
c5bf7332dee2eee19b5f8098784c6e09cc6188b6 | tasks/remove-duplicate-elements/src/main.rs | tasks/remove-duplicate-elements/src/main.rs | use std::collections::HashSet;
use std::hash::Hash;
fn remove_duplicate_elements_hashing<T: Hash + Eq>(elements: &mut Vec<T>) {
let set: HashSet<_> = elements.drain(..).collect();
elements.extend(set.into_iter());
}
fn remove_duplicate_elements_sorting<T: Ord>(elements: &mut Vec<T>) {
elements.sort_unstable();
elements.dedup();
}
fn main() {
let mut sample_elements = vec![0, 0, 1, 1, 2, 3, 2];
println!("Before removal of duplicates : {:?}", sample_elements);
remove_duplicate_elements_sorting(&mut sample_elements);
println!("After removal of duplicates : {:?}", sample_elements);
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_remove_duplicate_elements_hashing() {
let mut sample_elements = vec![0, 0, 1, 1, 2, 3, 2];
remove_duplicate_elements_hashing(&mut sample_elements);
sample_elements.sort_unstable();
assert_eq!(sample_elements, [0, 1, 2, 3])
}
#[test]
fn test_remove_duplicate_elements_sorting() {
let mut sample_elements = vec![0, 0, 1, 1, 2, 3, 2];
remove_duplicate_elements_sorting(&mut sample_elements);
assert_eq!(sample_elements, [0, 1, 2, 3])
}
}
| use std::vec::Vec;
use std::collections::HashSet;
use std::hash::Hash;
use std::cmp::Eq;
fn main() {
let mut sample_elements = vec![0u8, 0, 1, 1, 2, 3, 2];
println!("Before removal of duplicates : {:?}", sample_elements);
remove_duplicate_elements(&mut sample_elements);
println!("After removal of duplicates : {:?}", sample_elements);
}
fn remove_duplicate_elements<T: Hash + Eq>(elements: &mut Vec<T>) {
let set: HashSet<_> = elements.drain(..).collect();
elements.extend(set.into_iter());
}
#[test]
fn test_remove_duplicate_elements() {
let mut sample_elements = vec![0u8, 0, 1, 1, 2, 3, 2];
remove_duplicate_elements(&mut sample_elements);
sample_elements.sort();
assert_eq!(sample_elements, vec![0, 1, 2, 3])
}
| Revert "cleanup & add sorting method" | Revert "cleanup & add sorting method"
This reverts commit 98dd45f7d5419ee5189d12e43d9794ef29b816e9.
| Rust | unlicense | Hoverbear/rust-rosetta | rust | ## Code Before:
use std::collections::HashSet;
use std::hash::Hash;
fn remove_duplicate_elements_hashing<T: Hash + Eq>(elements: &mut Vec<T>) {
let set: HashSet<_> = elements.drain(..).collect();
elements.extend(set.into_iter());
}
fn remove_duplicate_elements_sorting<T: Ord>(elements: &mut Vec<T>) {
elements.sort_unstable();
elements.dedup();
}
fn main() {
let mut sample_elements = vec![0, 0, 1, 1, 2, 3, 2];
println!("Before removal of duplicates : {:?}", sample_elements);
remove_duplicate_elements_sorting(&mut sample_elements);
println!("After removal of duplicates : {:?}", sample_elements);
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_remove_duplicate_elements_hashing() {
let mut sample_elements = vec![0, 0, 1, 1, 2, 3, 2];
remove_duplicate_elements_hashing(&mut sample_elements);
sample_elements.sort_unstable();
assert_eq!(sample_elements, [0, 1, 2, 3])
}
#[test]
fn test_remove_duplicate_elements_sorting() {
let mut sample_elements = vec![0, 0, 1, 1, 2, 3, 2];
remove_duplicate_elements_sorting(&mut sample_elements);
assert_eq!(sample_elements, [0, 1, 2, 3])
}
}
## Instruction:
Revert "cleanup & add sorting method"
This reverts commit 98dd45f7d5419ee5189d12e43d9794ef29b816e9.
## Code After:
use std::vec::Vec;
use std::collections::HashSet;
use std::hash::Hash;
use std::cmp::Eq;
fn main() {
let mut sample_elements = vec![0u8, 0, 1, 1, 2, 3, 2];
println!("Before removal of duplicates : {:?}", sample_elements);
remove_duplicate_elements(&mut sample_elements);
println!("After removal of duplicates : {:?}", sample_elements);
}
fn remove_duplicate_elements<T: Hash + Eq>(elements: &mut Vec<T>) {
let set: HashSet<_> = elements.drain(..).collect();
elements.extend(set.into_iter());
}
#[test]
fn test_remove_duplicate_elements() {
let mut sample_elements = vec![0u8, 0, 1, 1, 2, 3, 2];
remove_duplicate_elements(&mut sample_elements);
sample_elements.sort();
assert_eq!(sample_elements, vec![0, 1, 2, 3])
}
| + use std::vec::Vec;
use std::collections::HashSet;
use std::hash::Hash;
+ use std::cmp::Eq;
+ fn main() {
+ let mut sample_elements = vec![0u8, 0, 1, 1, 2, 3, 2];
+ println!("Before removal of duplicates : {:?}", sample_elements);
+ remove_duplicate_elements(&mut sample_elements);
+ println!("After removal of duplicates : {:?}", sample_elements);
+ }
+
- fn remove_duplicate_elements_hashing<T: Hash + Eq>(elements: &mut Vec<T>) {
? --------
+ fn remove_duplicate_elements<T: Hash + Eq>(elements: &mut Vec<T>) {
let set: HashSet<_> = elements.drain(..).collect();
elements.extend(set.into_iter());
}
- fn remove_duplicate_elements_sorting<T: Ord>(elements: &mut Vec<T>) {
- elements.sort_unstable();
- elements.dedup();
+ #[test]
+ fn test_remove_duplicate_elements() {
+ let mut sample_elements = vec![0u8, 0, 1, 1, 2, 3, 2];
+ remove_duplicate_elements(&mut sample_elements);
+ sample_elements.sort();
+ assert_eq!(sample_elements, vec![0, 1, 2, 3])
}
-
- fn main() {
- let mut sample_elements = vec![0, 0, 1, 1, 2, 3, 2];
- println!("Before removal of duplicates : {:?}", sample_elements);
- remove_duplicate_elements_sorting(&mut sample_elements);
- println!("After removal of duplicates : {:?}", sample_elements);
- }
-
- #[cfg(test)]
- mod tests {
- use super::*;
-
- #[test]
- fn test_remove_duplicate_elements_hashing() {
- let mut sample_elements = vec![0, 0, 1, 1, 2, 3, 2];
- remove_duplicate_elements_hashing(&mut sample_elements);
- sample_elements.sort_unstable();
- assert_eq!(sample_elements, [0, 1, 2, 3])
- }
-
- #[test]
- fn test_remove_duplicate_elements_sorting() {
- let mut sample_elements = vec![0, 0, 1, 1, 2, 3, 2];
- remove_duplicate_elements_sorting(&mut sample_elements);
- assert_eq!(sample_elements, [0, 1, 2, 3])
- }
- } | 47 | 1.205128 | 16 | 31 |
1a59d7d527eae74eecaa03a61314cdda9b5b50d7 | src/Payum/Core/GatewayInterface.php | src/Payum/Core/GatewayInterface.php | <?php
namespace Payum\Core;
interface GatewayInterface
{
/**
* @param mixed $request
* @param boolean $catchReply
*
* @throws \Payum\Core\Exception\RequestNotSupportedException if there is not an action which able to process the request.
* @throws \Payum\Core\Reply\ReplyInterface when a gateway needs some external tasks to be executed. like a redirect to a gateway site or a page with credit card form. if $catchReply set to false the reply will be returned.
*
* @return \Payum\Core\Reply\ReplyInterface|null
*/
public function execute($request, $catchReply = false);
}
| <?php
namespace Payum\Core;
interface GatewayInterface
{
/**
* @param mixed $request
* @param boolean $catchReply If false the reply behave like an exception. If true the reply will be caught internally and returned.
*
* @throws \Payum\Core\Exception\RequestNotSupportedException If there is not an action which able to process the request.
* @throws \Payum\Core\Reply\ReplyInterface Gateway throws reply if some external tasks have to be done. For example show a credit card form, an iframe or perform a redirect.
*
* @return \Payum\Core\Reply\ReplyInterface|null
*/
public function execute($request, $catchReply = false);
}
| Correct description for catchReply argument. | Correct description for catchReply argument.
https://github.com/Payum/Payum/pull/626 | PHP | mit | Payum/Payum,Payum/Payum | php | ## Code Before:
<?php
namespace Payum\Core;
interface GatewayInterface
{
/**
* @param mixed $request
* @param boolean $catchReply
*
* @throws \Payum\Core\Exception\RequestNotSupportedException if there is not an action which able to process the request.
* @throws \Payum\Core\Reply\ReplyInterface when a gateway needs some external tasks to be executed. like a redirect to a gateway site or a page with credit card form. if $catchReply set to false the reply will be returned.
*
* @return \Payum\Core\Reply\ReplyInterface|null
*/
public function execute($request, $catchReply = false);
}
## Instruction:
Correct description for catchReply argument.
https://github.com/Payum/Payum/pull/626
## Code After:
<?php
namespace Payum\Core;
interface GatewayInterface
{
/**
* @param mixed $request
* @param boolean $catchReply If false the reply behave like an exception. If true the reply will be caught internally and returned.
*
* @throws \Payum\Core\Exception\RequestNotSupportedException If there is not an action which able to process the request.
* @throws \Payum\Core\Reply\ReplyInterface Gateway throws reply if some external tasks have to be done. For example show a credit card form, an iframe or perform a redirect.
*
* @return \Payum\Core\Reply\ReplyInterface|null
*/
public function execute($request, $catchReply = false);
}
| <?php
namespace Payum\Core;
interface GatewayInterface
{
/**
* @param mixed $request
- * @param boolean $catchReply
+ * @param boolean $catchReply If false the reply behave like an exception. If true the reply will be caught internally and returned.
*
- * @throws \Payum\Core\Exception\RequestNotSupportedException if there is not an action which able to process the request.
? ^
+ * @throws \Payum\Core\Exception\RequestNotSupportedException If there is not an action which able to process the request.
? ^
- * @throws \Payum\Core\Reply\ReplyInterface when a gateway needs some external tasks to be executed. like a redirect to a gateway site or a page with credit card form. if $catchReply set to false the reply will be returned.
+ * @throws \Payum\Core\Reply\ReplyInterface Gateway throws reply if some external tasks have to be done. For example show a credit card form, an iframe or perform a redirect.
*
* @return \Payum\Core\Reply\ReplyInterface|null
*/
public function execute($request, $catchReply = false);
} | 6 | 0.375 | 3 | 3 |
59874bb2baacb64b3c625a694c469b592d9e4696 | package.json | package.json | {
"name": "hyperflow",
"version": "1.0.0-beta-2",
"engines": {
"node": "0.10.x"
},
"env": {
"PYTHON": "c:/programs/python27"
},
"dependencies": {
"express":"3.x",
"consolidate":"0.9.1",
"cradle":"0.6.6",
"ejs":"0.8.4",
"eyes":"0.1.8",
"async":"0.2.x",
"redis":"0.9.1",
"xml2js":"0.2.7",
"underscore":"1.x",
"uuid":"",
"amqplib":"",
"when":"",
"eventemitter2":"",
"optimist":"0.6.0",
"walkdir":"0.0.7",
"path":"0.4.9",
"z-schema": "2.0.x",
"walk": "2.x",
"mtwitter": "1.5.2",
"crc": "0.2.1",
"log4js": "0.6.9",
"nconf": "0.6.8",
"file": "0.2.x",
"value": "0.3.0",
"form-data": "",
"nodeunit": "",
"request": "",
"q": ""
}
}
| {
"name": "hyperflow",
"version": "1.0.0-beta-2",
"engines": {
"node": "0.10.x"
},
"env": {
"PYTHON": "c:/programs/python27"
},
"dependencies": {
"express":"3.x",
"consolidate":"0.9.1",
"cradle":"0.6.6",
"ejs":"0.8.4",
"eyes":"0.1.8",
"async":"0.2.x",
"redis":"0.9.1",
"xml2js":"0.2.7",
"underscore":"1.x",
"uuid":"",
"amqplib":"",
"when":"",
"eventemitter2":"",
"optimist":"0.6.0",
"walkdir":"0.0.7",
"path":"0.4.9",
"z-schema": "2.0.x",
"walk": "2.x",
"mtwitter": "1.5.2",
"crc": "0.2.1",
"log4js": "0.6.9",
"nconf": "0.6.8",
"file": "0.2.x",
"value": "0.3.0",
"form-data": "",
"nodeunit": "",
"request": "",
"q": "",
"traverse": ""
}
}
| Add traverse to dependencies, it's needed for copy(). | Add traverse to dependencies, it's needed for copy().
| JSON | mit | dice-cyfronet/hyperflow,dice-cyfronet/hyperflow,dice-cyfronet/hyperflow | json | ## Code Before:
{
"name": "hyperflow",
"version": "1.0.0-beta-2",
"engines": {
"node": "0.10.x"
},
"env": {
"PYTHON": "c:/programs/python27"
},
"dependencies": {
"express":"3.x",
"consolidate":"0.9.1",
"cradle":"0.6.6",
"ejs":"0.8.4",
"eyes":"0.1.8",
"async":"0.2.x",
"redis":"0.9.1",
"xml2js":"0.2.7",
"underscore":"1.x",
"uuid":"",
"amqplib":"",
"when":"",
"eventemitter2":"",
"optimist":"0.6.0",
"walkdir":"0.0.7",
"path":"0.4.9",
"z-schema": "2.0.x",
"walk": "2.x",
"mtwitter": "1.5.2",
"crc": "0.2.1",
"log4js": "0.6.9",
"nconf": "0.6.8",
"file": "0.2.x",
"value": "0.3.0",
"form-data": "",
"nodeunit": "",
"request": "",
"q": ""
}
}
## Instruction:
Add traverse to dependencies, it's needed for copy().
## Code After:
{
"name": "hyperflow",
"version": "1.0.0-beta-2",
"engines": {
"node": "0.10.x"
},
"env": {
"PYTHON": "c:/programs/python27"
},
"dependencies": {
"express":"3.x",
"consolidate":"0.9.1",
"cradle":"0.6.6",
"ejs":"0.8.4",
"eyes":"0.1.8",
"async":"0.2.x",
"redis":"0.9.1",
"xml2js":"0.2.7",
"underscore":"1.x",
"uuid":"",
"amqplib":"",
"when":"",
"eventemitter2":"",
"optimist":"0.6.0",
"walkdir":"0.0.7",
"path":"0.4.9",
"z-schema": "2.0.x",
"walk": "2.x",
"mtwitter": "1.5.2",
"crc": "0.2.1",
"log4js": "0.6.9",
"nconf": "0.6.8",
"file": "0.2.x",
"value": "0.3.0",
"form-data": "",
"nodeunit": "",
"request": "",
"q": "",
"traverse": ""
}
}
| {
"name": "hyperflow",
"version": "1.0.0-beta-2",
"engines": {
"node": "0.10.x"
},
"env": {
"PYTHON": "c:/programs/python27"
},
"dependencies": {
"express":"3.x",
"consolidate":"0.9.1",
"cradle":"0.6.6",
"ejs":"0.8.4",
"eyes":"0.1.8",
"async":"0.2.x",
"redis":"0.9.1",
"xml2js":"0.2.7",
"underscore":"1.x",
"uuid":"",
"amqplib":"",
"when":"",
"eventemitter2":"",
"optimist":"0.6.0",
"walkdir":"0.0.7",
"path":"0.4.9",
"z-schema": "2.0.x",
"walk": "2.x",
"mtwitter": "1.5.2",
"crc": "0.2.1",
"log4js": "0.6.9",
"nconf": "0.6.8",
"file": "0.2.x",
"value": "0.3.0",
"form-data": "",
"nodeunit": "",
"request": "",
- "q": ""
+ "q": "",
? +
+ "traverse": ""
}
} | 3 | 0.075 | 2 | 1 |
365bd7b4c1ae1bf6ed010cfb818e2da9b3ed53b7 | app/models/tiny_mce_article.rb | app/models/tiny_mce_article.rb | require 'white_list_filter'
class TinyMceArticle < TextArticle
def self.short_description
_('Text article with visual editor')
end
def self.description
_('Not accessible for visually impaired users.')
end
xss_terminate :only => [ ]
xss_terminate :only => [ :name, :abstract, :body ], :with => 'white_list', :on => 'validation'
include WhiteListFilter
filter_iframes :abstract, :body
def iframe_whitelist
profile && profile.environment && profile.environment.trusted_sites_for_iframe
end
def notifiable?
true
end
def tiny_mce?
true
end
end
| require 'white_list_filter'
class TinyMceArticle < TextArticle
def self.short_description
_('Article')
end
def self.description
_('Add a new text article.')
end
xss_terminate :only => [ ]
xss_terminate :only => [ :name, :abstract, :body ], :with => 'white_list', :on => 'validation'
include WhiteListFilter
filter_iframes :abstract, :body
def iframe_whitelist
profile && profile.environment && profile.environment.trusted_sites_for_iframe
end
def notifiable?
true
end
def tiny_mce?
true
end
end
| Make a more inviting text for new text articles. | Make a more inviting text for new text articles.
| Ruby | agpl-3.0 | blogoosfero/noosfero,CIRANDAS/noosfero-ecosol,EcoAlternative/noosfero-ecosol,EcoAlternative/noosfero-ecosol,blogoosfero/noosfero,blogoosfero/noosfero,samasti/noosfero,coletivoEITA/noosfero-ecosol,blogoosfero/noosfero,samasti/noosfero,samasti/noosfero,EcoAlternative/noosfero-ecosol,EcoAlternative/noosfero-ecosol,blogoosfero/noosfero,coletivoEITA/noosfero-ecosol,CIRANDAS/noosfero-ecosol,CIRANDAS/noosfero-ecosol,CIRANDAS/noosfero-ecosol,coletivoEITA/noosfero-ecosol,EcoAlternative/noosfero-ecosol,blogoosfero/noosfero,EcoAlternative/noosfero-ecosol,coletivoEITA/noosfero-ecosol,CIRANDAS/noosfero-ecosol,coletivoEITA/noosfero-ecosol,samasti/noosfero,coletivoEITA/noosfero-ecosol,EcoAlternative/noosfero-ecosol,blogoosfero/noosfero,samasti/noosfero,samasti/noosfero | ruby | ## Code Before:
require 'white_list_filter'
class TinyMceArticle < TextArticle
def self.short_description
_('Text article with visual editor')
end
def self.description
_('Not accessible for visually impaired users.')
end
xss_terminate :only => [ ]
xss_terminate :only => [ :name, :abstract, :body ], :with => 'white_list', :on => 'validation'
include WhiteListFilter
filter_iframes :abstract, :body
def iframe_whitelist
profile && profile.environment && profile.environment.trusted_sites_for_iframe
end
def notifiable?
true
end
def tiny_mce?
true
end
end
## Instruction:
Make a more inviting text for new text articles.
## Code After:
require 'white_list_filter'
class TinyMceArticle < TextArticle
def self.short_description
_('Article')
end
def self.description
_('Add a new text article.')
end
xss_terminate :only => [ ]
xss_terminate :only => [ :name, :abstract, :body ], :with => 'white_list', :on => 'validation'
include WhiteListFilter
filter_iframes :abstract, :body
def iframe_whitelist
profile && profile.environment && profile.environment.trusted_sites_for_iframe
end
def notifiable?
true
end
def tiny_mce?
true
end
end
| require 'white_list_filter'
class TinyMceArticle < TextArticle
def self.short_description
- _('Text article with visual editor')
+ _('Article')
end
def self.description
- _('Not accessible for visually impaired users.')
+ _('Add a new text article.')
end
xss_terminate :only => [ ]
xss_terminate :only => [ :name, :abstract, :body ], :with => 'white_list', :on => 'validation'
include WhiteListFilter
filter_iframes :abstract, :body
def iframe_whitelist
profile && profile.environment && profile.environment.trusted_sites_for_iframe
end
def notifiable?
true
end
def tiny_mce?
true
end
end | 4 | 0.129032 | 2 | 2 |
eb2949361eb0a7c99668ec57b69d9f24622e01b9 | .travis.yml | .travis.yml | language: python
addons:
apt:
config:
retries: true
cache:
apt: true
pip: true
matrix:
include:
- os: linux
dist: xenial
python: 2.7
env: TOXENV=py27
name: "2.7 Xenial"
- os: linux
dist: xenial
python: 3.5
env: TOXENV=py35
name: "3.5 Xenial"
- os: linux
dist: xenial
python: 3.6
env: TOXENV=py36
name: "3.6 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=py37
name: "3.7 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=pep8
name: "3.7 Xenial pep8"
- os: linux
dist: bionic
python: 3.7
env: TOXENV=py37
name: "3.7 Bionic"
before_install:
- sudo apt-get update
- sudo apt-get install -y tesseract-ocr
- sudo apt-get install -y tesseract-ocr-fra
- tesseract --version
- tesseract --list-langs
install:
pip install tox
script:
tox
notifications:
email: false
| language: python
addons:
apt:
config:
retries: true
packages:
- tesseract-ocr
- tesseract-ocr-fra
cache:
apt: true
pip: true
matrix:
include:
- os: linux
dist: xenial
python: 2.7
env: TOXENV=py27
name: "2.7 Xenial"
- os: linux
dist: xenial
python: 3.5
env: TOXENV=py35
name: "3.5 Xenial"
- os: linux
dist: xenial
python: 3.6
env: TOXENV=py36
name: "3.6 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=py37
name: "3.7 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=pep8
name: "3.7 Xenial pep8"
- os: linux
dist: bionic
python: 3.7
env: TOXENV=py37
name: "3.7 Bionic"
before_install:
# - sudo apt-get update
# - sudo apt-get install -y tesseract-ocr
# - sudo apt-get install -y tesseract-ocr-fra
- tesseract --version
- tesseract --list-langs
install:
pip install tox
script:
tox
notifications:
email: false
| Use TravisCI apt packages addon | Use TravisCI apt packages addon | YAML | apache-2.0 | madmaze/pytesseract | yaml | ## Code Before:
language: python
addons:
apt:
config:
retries: true
cache:
apt: true
pip: true
matrix:
include:
- os: linux
dist: xenial
python: 2.7
env: TOXENV=py27
name: "2.7 Xenial"
- os: linux
dist: xenial
python: 3.5
env: TOXENV=py35
name: "3.5 Xenial"
- os: linux
dist: xenial
python: 3.6
env: TOXENV=py36
name: "3.6 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=py37
name: "3.7 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=pep8
name: "3.7 Xenial pep8"
- os: linux
dist: bionic
python: 3.7
env: TOXENV=py37
name: "3.7 Bionic"
before_install:
- sudo apt-get update
- sudo apt-get install -y tesseract-ocr
- sudo apt-get install -y tesseract-ocr-fra
- tesseract --version
- tesseract --list-langs
install:
pip install tox
script:
tox
notifications:
email: false
## Instruction:
Use TravisCI apt packages addon
## Code After:
language: python
addons:
apt:
config:
retries: true
packages:
- tesseract-ocr
- tesseract-ocr-fra
cache:
apt: true
pip: true
matrix:
include:
- os: linux
dist: xenial
python: 2.7
env: TOXENV=py27
name: "2.7 Xenial"
- os: linux
dist: xenial
python: 3.5
env: TOXENV=py35
name: "3.5 Xenial"
- os: linux
dist: xenial
python: 3.6
env: TOXENV=py36
name: "3.6 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=py37
name: "3.7 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=pep8
name: "3.7 Xenial pep8"
- os: linux
dist: bionic
python: 3.7
env: TOXENV=py37
name: "3.7 Bionic"
before_install:
# - sudo apt-get update
# - sudo apt-get install -y tesseract-ocr
# - sudo apt-get install -y tesseract-ocr-fra
- tesseract --version
- tesseract --list-langs
install:
pip install tox
script:
tox
notifications:
email: false
| language: python
addons:
apt:
config:
retries: true
+ packages:
+ - tesseract-ocr
+ - tesseract-ocr-fra
cache:
apt: true
pip: true
matrix:
include:
- os: linux
dist: xenial
python: 2.7
env: TOXENV=py27
name: "2.7 Xenial"
- os: linux
dist: xenial
python: 3.5
env: TOXENV=py35
name: "3.5 Xenial"
- os: linux
dist: xenial
python: 3.6
env: TOXENV=py36
name: "3.6 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=py37
name: "3.7 Xenial"
- os: linux
dist: xenial
python: 3.7
env: TOXENV=pep8
name: "3.7 Xenial pep8"
- os: linux
dist: bionic
python: 3.7
env: TOXENV=py37
name: "3.7 Bionic"
before_install:
- - sudo apt-get update
+ # - sudo apt-get update
? +
- - sudo apt-get install -y tesseract-ocr
+ # - sudo apt-get install -y tesseract-ocr
? +
- - sudo apt-get install -y tesseract-ocr-fra
+ # - sudo apt-get install -y tesseract-ocr-fra
? +
- tesseract --version
- tesseract --list-langs
install:
pip install tox
script:
tox
notifications:
email: false | 9 | 0.152542 | 6 | 3 |
935bb29bd5116a08a476592b49214d312d871244 | src/addon/pnacl/lib/axiom_pnacl/executables.js | src/addon/pnacl/lib/axiom_pnacl/executables.js | // Copyright (c) 2014 The Axiom Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
import AxiomError from 'axiom/core/error';
import environment from 'axiom_shell/environment';
import PnaclCommand from 'axiom_pnacl/pnacl_command';
// @note ExecuteContext from 'axiom/bindings/fs/execute_context'
var createCommand = function(commandName, sourceUrl, tarFilename) {
var command = new PnaclCommand(commandName, sourceUrl, tarFilename);
return command.run.bind(command);
};
export var executables = function(sourceUrl) {
return {
'curl(@)': createCommand('curl', sourceUrl),
'nano(@)': createCommand('nano', sourceUrl, 'nano.tar'),
'nethack(@)': createCommand('nethack', sourceUrl, 'nethack.tar'),
'python(@)': createCommand('python', sourceUrl, 'pydata_pnacl.tar'),
'unzip(@)': createCommand('unzip', sourceUrl),
'vim(@)': createCommand('vim', sourceUrl, 'vim.tar'),
};
};
export default executables;
| // Copyright (c) 2014 The Axiom Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
import AxiomError from 'axiom/core/error';
import environment from 'axiom_shell/environment';
import PnaclCommand from 'axiom_pnacl/pnacl_command';
// @note ExecuteContext from 'axiom/bindings/fs/execute_context'
var createCommand = function(name, sourceUrl, opt_tarFilename, opt_env) {
var command = new PnaclCommand(name, sourceUrl, opt_tarFilename, opt_env);
return command.run.bind(command);
};
export var executables = function(sourceUrl) {
return {
'curl(@)': createCommand('curl', sourceUrl),
'nano(@)': createCommand('nano', sourceUrl, 'nano.tar'),
'nethack(@)': createCommand('nethack', sourceUrl, 'nethack.tar'),
'python(@)':
// Note: We set PYTHONHOME to '/' so that the python library files
// are loaded from '/lib/python2.7', which is the prefix path used in
      // 'pydata_pnacl.tar'. By default, python sets PYTHONHOME to be the
// prefix of the executable from arg[0]. This conflicts with the way
      // we set arg[0] to tell pnacl where to load the .tar file from.
createCommand('python', sourceUrl, 'pydata_pnacl.tar', {
'$PYTHONHOME': '/'
}),
'unzip(@)': createCommand('unzip', sourceUrl),
'vim(@)': createCommand('vim', sourceUrl, 'vim.tar'),
};
};
export default executables;
| Fix running python by setting PYTHONHOME to '/'. | Fix running python by setting PYTHONHOME to '/'.
| JavaScript | apache-2.0 | umop/axiom_old_private,chromium/axiom,rpaquay/axiom,ussuri/axiom,mcanthony/axiom | javascript | ## Code Before:
// Copyright (c) 2014 The Axiom Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
import AxiomError from 'axiom/core/error';
import environment from 'axiom_shell/environment';
import PnaclCommand from 'axiom_pnacl/pnacl_command';
// @note ExecuteContext from 'axiom/bindings/fs/execute_context'
var createCommand = function(commandName, sourceUrl, tarFilename) {
var command = new PnaclCommand(commandName, sourceUrl, tarFilename);
return command.run.bind(command);
};
export var executables = function(sourceUrl) {
return {
'curl(@)': createCommand('curl', sourceUrl),
'nano(@)': createCommand('nano', sourceUrl, 'nano.tar'),
'nethack(@)': createCommand('nethack', sourceUrl, 'nethack.tar'),
'python(@)': createCommand('python', sourceUrl, 'pydata_pnacl.tar'),
'unzip(@)': createCommand('unzip', sourceUrl),
'vim(@)': createCommand('vim', sourceUrl, 'vim.tar'),
};
};
export default executables;
## Instruction:
Fix running python by setting PYTHONHOME to '/'.
## Code After:
// Copyright (c) 2014 The Axiom Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
import AxiomError from 'axiom/core/error';
import environment from 'axiom_shell/environment';
import PnaclCommand from 'axiom_pnacl/pnacl_command';
// @note ExecuteContext from 'axiom/bindings/fs/execute_context'
var createCommand = function(name, sourceUrl, opt_tarFilename, opt_env) {
var command = new PnaclCommand(name, sourceUrl, opt_tarFilename, opt_env);
return command.run.bind(command);
};
export var executables = function(sourceUrl) {
return {
'curl(@)': createCommand('curl', sourceUrl),
'nano(@)': createCommand('nano', sourceUrl, 'nano.tar'),
'nethack(@)': createCommand('nethack', sourceUrl, 'nethack.tar'),
'python(@)':
// Note: We set PYTHONHOME to '/' so that the python library files
// are loaded from '/lib/python2.7', which is the prefix path used in
      // 'pydata_pnacl.tar'. By default, python sets PYTHONHOME to be the
// prefix of the executable from arg[0]. This conflicts with the way
      // we set arg[0] to tell pnacl where to load the .tar file from.
createCommand('python', sourceUrl, 'pydata_pnacl.tar', {
'$PYTHONHOME': '/'
}),
'unzip(@)': createCommand('unzip', sourceUrl),
'vim(@)': createCommand('vim', sourceUrl, 'vim.tar'),
};
};
export default executables;
| // Copyright (c) 2014 The Axiom Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
import AxiomError from 'axiom/core/error';
import environment from 'axiom_shell/environment';
import PnaclCommand from 'axiom_pnacl/pnacl_command';
// @note ExecuteContext from 'axiom/bindings/fs/execute_context'
- var createCommand = function(commandName, sourceUrl, tarFilename) {
? ----- --
+ var createCommand = function(name, sourceUrl, opt_tarFilename, opt_env) {
? ++++ +++++++++
- var command = new PnaclCommand(commandName, sourceUrl, tarFilename);
? ----- --
+ var command = new PnaclCommand(name, sourceUrl, opt_tarFilename, opt_env);
? ++++ +++++++++
return command.run.bind(command);
};
export var executables = function(sourceUrl) {
return {
'curl(@)': createCommand('curl', sourceUrl),
'nano(@)': createCommand('nano', sourceUrl, 'nano.tar'),
'nethack(@)': createCommand('nethack', sourceUrl, 'nethack.tar'),
+ 'python(@)':
+ // Note: We set PYTHONHOME to '/' so that the python library files
+ // are loaded from '/lib/python2.7', which is the prefix path used in
+       // 'pydata_pnacl.tar'. By default, python sets PYTHONHOME to be the
+ // prefix of the executable from arg[0]. This conflicts with the way
+       // we set arg[0] to tell pnacl where to load the .tar file from.
- 'python(@)': createCommand('python', sourceUrl, 'pydata_pnacl.tar'),
? ^^^^^^^^^^^^ -
+ createCommand('python', sourceUrl, 'pydata_pnacl.tar', {
? ^^^ ++
+ '$PYTHONHOME': '/'
+ }),
'unzip(@)': createCommand('unzip', sourceUrl),
'vim(@)': createCommand('vim', sourceUrl, 'vim.tar'),
};
};
export default executables; | 14 | 0.482759 | 11 | 3 |
bcc64cb8049de02a8e2a3e17e690fc696789e749 | app/models/time_based_repo_rater.rb | app/models/time_based_repo_rater.rb | require "date"
module HazCommitz
class TimeBasedRepoRater < RepoRater
def rate(github_repo)
rating = MIN
time_now = DateTime.now
return MIN if github_repo.nil?
last_commit_time = github_repo.latest_commit.date # fix
if last_commit_time.nil? || !last_commit_time.is_a?(Time) # catch when not Time object?
return MIN
end
rating = case (time_now - last_commit_time.to_datetime).round
when 1..7 then MAX
when 8..30 then 8
when 31..90 then 6
when 91..180 then 4
when 181..365 then 2
when 366..Float::INFINITY then 1
end
rating
end
end
end | require "date"
require_relative "repo_rater"
module HazCommitz
class TimeBasedRepoRater < RepoRater
def rate(github_repo)
rating = MIN
time_now = DateTime.now
return MIN if github_repo.nil?
last_commit_time = github_repo.latest_commit.date # fix
if last_commit_time.nil? || !last_commit_time.is_a?(Time) # catch when not Time object?
return MIN
end
rating = case (time_now - last_commit_time.to_datetime).round
when 1..7 then MAX
when 8..30 then 8
when 31..90 then 6
when 91..180 then 4
when 181..365 then 2
when 366..Float::INFINITY then 1
end
rating
end
end
end | Add require for super class | Add require for super class
| Ruby | mit | rob-murray/haz-commitz,rob-murray/haz-commitz,rob-murray/haz-commitz | ruby | ## Code Before:
require "date"
module HazCommitz
class TimeBasedRepoRater < RepoRater
def rate(github_repo)
rating = MIN
time_now = DateTime.now
return MIN if github_repo.nil?
last_commit_time = github_repo.latest_commit.date # fix
if last_commit_time.nil? || !last_commit_time.is_a?(Time) # catch when not Time object?
return MIN
end
rating = case (time_now - last_commit_time.to_datetime).round
when 1..7 then MAX
when 8..30 then 8
when 31..90 then 6
when 91..180 then 4
when 181..365 then 2
when 366..Float::INFINITY then 1
end
rating
end
end
end
## Instruction:
Add require for super class
## Code After:
require "date"
require_relative "repo_rater"
module HazCommitz
class TimeBasedRepoRater < RepoRater
def rate(github_repo)
rating = MIN
time_now = DateTime.now
return MIN if github_repo.nil?
last_commit_time = github_repo.latest_commit.date # fix
if last_commit_time.nil? || !last_commit_time.is_a?(Time) # catch when not Time object?
return MIN
end
rating = case (time_now - last_commit_time.to_datetime).round
when 1..7 then MAX
when 8..30 then 8
when 31..90 then 6
when 91..180 then 4
when 181..365 then 2
when 366..Float::INFINITY then 1
end
rating
end
end
end | require "date"
+ require_relative "repo_rater"
module HazCommitz
class TimeBasedRepoRater < RepoRater
def rate(github_repo)
rating = MIN
time_now = DateTime.now
return MIN if github_repo.nil?
last_commit_time = github_repo.latest_commit.date # fix
if last_commit_time.nil? || !last_commit_time.is_a?(Time) # catch when not Time object?
return MIN
end
rating = case (time_now - last_commit_time.to_datetime).round
when 1..7 then MAX
when 8..30 then 8
when 31..90 then 6
when 91..180 then 4
when 181..365 then 2
when 366..Float::INFINITY then 1
end
rating
end
end
end | 1 | 0.032258 | 1 | 0 |
249293336d2bfcc018c44d9279b89b31522c37da | u2fserver/jsobjects.py | u2fserver/jsobjects.py | from u2flib_server.jsapi import (JSONDict, RegisterRequest, RegisterResponse,
SignRequest, SignResponse)
__all__ = [
'RegisterRequestData',
'RegisterResponseData',
'AuthenticateRequestData',
'AuthenticateResponseData'
]
class RegisterRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
@property
def registerRequests(self):
return map(RegisterRequest, self['registerRequests'])
class RegisterResponseData(JSONDict):
@property
def registerResponse(self):
return RegisterResponse(self['registerResponse'])
@property
def getProps(self):
return self.get('getProps', [])
@property
def setProps(self):
return self.get('setProps', {})
class AuthenticateRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
class AuthenticateResponseData(JSONDict):
@property
def authenticateResponse(self):
return SignResponse(self['authenticateResponse'])
@property
def getProps(self):
return self.get('getProps', [])
@property
def setProps(self):
return self.get('setProps', {})
|
from u2flib_server.jsapi import (JSONDict, RegisterRequest, RegisterResponse,
SignRequest, SignResponse)
__all__ = [
'RegisterRequestData',
'RegisterResponseData',
'AuthenticateRequestData',
'AuthenticateResponseData'
]
class WithProps(object):
@property
def getProps(self):
return self.get('getProps', [])
@property
def setProps(self):
return self.get('setProps', {})
class RegisterRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
@property
def registerRequests(self):
return map(RegisterRequest, self['registerRequests'])
class RegisterResponseData(JSONDict, WithProps):
@property
def registerResponse(self):
return RegisterResponse(self['registerResponse'])
class AuthenticateRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
class AuthenticateResponseData(JSONDict, WithProps):
@property
def authenticateResponse(self):
return SignResponse(self['authenticateResponse'])
| Use a mixin for get-/setProps | Use a mixin for get-/setProps
| Python | bsd-2-clause | moreati/u2fval,Yubico/u2fval | python | ## Code Before:
from u2flib_server.jsapi import (JSONDict, RegisterRequest, RegisterResponse,
SignRequest, SignResponse)
__all__ = [
'RegisterRequestData',
'RegisterResponseData',
'AuthenticateRequestData',
'AuthenticateResponseData'
]
class RegisterRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
@property
def registerRequests(self):
return map(RegisterRequest, self['registerRequests'])
class RegisterResponseData(JSONDict):
@property
def registerResponse(self):
return RegisterResponse(self['registerResponse'])
@property
def getProps(self):
return self.get('getProps', [])
@property
def setProps(self):
return self.get('setProps', {})
class AuthenticateRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
class AuthenticateResponseData(JSONDict):
@property
def authenticateResponse(self):
return SignResponse(self['authenticateResponse'])
@property
def getProps(self):
return self.get('getProps', [])
@property
def setProps(self):
return self.get('setProps', {})
## Instruction:
Use a mixin for get-/setProps
## Code After:
from u2flib_server.jsapi import (JSONDict, RegisterRequest, RegisterResponse,
SignRequest, SignResponse)
__all__ = [
'RegisterRequestData',
'RegisterResponseData',
'AuthenticateRequestData',
'AuthenticateResponseData'
]
class WithProps(object):
@property
def getProps(self):
return self.get('getProps', [])
@property
def setProps(self):
return self.get('setProps', {})
class RegisterRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
@property
def registerRequests(self):
return map(RegisterRequest, self['registerRequests'])
class RegisterResponseData(JSONDict, WithProps):
@property
def registerResponse(self):
return RegisterResponse(self['registerResponse'])
class AuthenticateRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
class AuthenticateResponseData(JSONDict, WithProps):
@property
def authenticateResponse(self):
return SignResponse(self['authenticateResponse'])
| +
from u2flib_server.jsapi import (JSONDict, RegisterRequest, RegisterResponse,
SignRequest, SignResponse)
__all__ = [
'RegisterRequestData',
'RegisterResponseData',
'AuthenticateRequestData',
'AuthenticateResponseData'
]
+
+
+ class WithProps(object):
+
+ @property
+ def getProps(self):
+ return self.get('getProps', [])
+
+ @property
+ def setProps(self):
+ return self.get('setProps', {})
class RegisterRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
@property
def registerRequests(self):
return map(RegisterRequest, self['registerRequests'])
- class RegisterResponseData(JSONDict):
+ class RegisterResponseData(JSONDict, WithProps):
? +++++++++++
@property
def registerResponse(self):
return RegisterResponse(self['registerResponse'])
-
- @property
- def getProps(self):
- return self.get('getProps', [])
-
- @property
- def setProps(self):
- return self.get('setProps', {})
class AuthenticateRequestData(JSONDict):
@property
def authenticateRequests(self):
return map(SignRequest, self['authenticateRequests'])
- class AuthenticateResponseData(JSONDict):
+ class AuthenticateResponseData(JSONDict, WithProps):
? +++++++++++
@property
def authenticateResponse(self):
return SignResponse(self['authenticateResponse'])
-
- @property
- def getProps(self):
- return self.get('getProps', [])
-
- @property
- def setProps(self):
- return self.get('setProps', {}) | 32 | 0.561404 | 14 | 18 |
8e59dc8e06a71d98902bad8114e66cae7000b62f | lib/yt/models/statistics_set.rb | lib/yt/models/statistics_set.rb | require 'yt/models/base'
module Yt
module Models
# @private
# Encapsulates statistics about the resource, such as the number of times
# the resource has been viewed or liked.
# @see https://developers.google.com/youtube/v3/docs/videos#resource
class StatisticsSet < Base
attr_reader :data
def initialize(options = {})
@data = options[:data]
end
has_attribute :view_count, type: Integer
has_attribute :comment_count, type: Integer
has_attribute :like_count, type: Integer
has_attribute :dislike_count, type: Integer
has_attribute :favorite_count, type: Integer
has_attribute :video_count, type: Integer
has_attribute :subscriber_count, type: Integer
has_attribute :hidden_subscriber_count
end
end
end | require 'yt/models/base'
module Yt
module Models
# @private
# Encapsulates statistics about the resource, such as the number of times
# the resource has been viewed or liked.
# @see https://developers.google.com/youtube/v3/docs/videos#resource
# @see https://developers.google.com/youtube/v3/docs/channels#resource-representation
class StatisticsSet < Base
attr_reader :data
def initialize(options = {})
@data = options[:data]
end
has_attribute :view_count, type: Integer
has_attribute :comment_count, type: Integer
has_attribute :like_count, type: Integer
has_attribute :dislike_count, type: Integer
has_attribute :favorite_count, type: Integer
has_attribute :video_count, type: Integer
has_attribute :subscriber_count, type: Integer
has_attribute :hidden_subscriber_count
end
end
end
| Add channel resource documentation link | [skip ci] Add channel resource documentation link
| Ruby | mit | jacknguyen/yt,Fullscreen/yt | ruby | ## Code Before:
require 'yt/models/base'
module Yt
module Models
# @private
# Encapsulates statistics about the resource, such as the number of times
# the resource has been viewed or liked.
# @see https://developers.google.com/youtube/v3/docs/videos#resource
class StatisticsSet < Base
attr_reader :data
def initialize(options = {})
@data = options[:data]
end
has_attribute :view_count, type: Integer
has_attribute :comment_count, type: Integer
has_attribute :like_count, type: Integer
has_attribute :dislike_count, type: Integer
has_attribute :favorite_count, type: Integer
has_attribute :video_count, type: Integer
has_attribute :subscriber_count, type: Integer
has_attribute :hidden_subscriber_count
end
end
end
## Instruction:
[skip ci] Add channel resource documentation link
## Code After:
require 'yt/models/base'
module Yt
module Models
# @private
# Encapsulates statistics about the resource, such as the number of times
# the resource has been viewed or liked.
# @see https://developers.google.com/youtube/v3/docs/videos#resource
# @see https://developers.google.com/youtube/v3/docs/channels#resource-representation
class StatisticsSet < Base
attr_reader :data
def initialize(options = {})
@data = options[:data]
end
has_attribute :view_count, type: Integer
has_attribute :comment_count, type: Integer
has_attribute :like_count, type: Integer
has_attribute :dislike_count, type: Integer
has_attribute :favorite_count, type: Integer
has_attribute :video_count, type: Integer
has_attribute :subscriber_count, type: Integer
has_attribute :hidden_subscriber_count
end
end
end
| require 'yt/models/base'
module Yt
module Models
# @private
# Encapsulates statistics about the resource, such as the number of times
# the resource has been viewed or liked.
# @see https://developers.google.com/youtube/v3/docs/videos#resource
+ # @see https://developers.google.com/youtube/v3/docs/channels#resource-representation
class StatisticsSet < Base
attr_reader :data
def initialize(options = {})
@data = options[:data]
end
has_attribute :view_count, type: Integer
has_attribute :comment_count, type: Integer
has_attribute :like_count, type: Integer
has_attribute :dislike_count, type: Integer
has_attribute :favorite_count, type: Integer
has_attribute :video_count, type: Integer
has_attribute :subscriber_count, type: Integer
has_attribute :hidden_subscriber_count
end
end
end | 1 | 0.038462 | 1 | 0 |
34d45982b70e3e8c309c7e5448aeb858c4771956 | utils/init.sh | utils/init.sh |
if [ ! -d elation ]; then
git clone https://github.com/jbaicoianu/elation.git
cd elation
git clone https://github.com/jbaicoianu/elation-engine.git components/engine
git clone https://github.com/jbaicoianu/cyclone-physics-js.git components/physics
git clone https://github.com/jbaicoianu/elation-share.git components/share
#git clone https://github.com/jbaicoianu/janusweb.git components/janusweb
ln -s `pwd`/.. components/janusweb
./elation web init
./elation component enable engine physics share janusweb
fi
|
echo "Installing dependencies..."
npm install
echo "done"
echo
echo "Creating directory tree..."
DEPENDENCYPATHS=$(npm ls -parseable)
declare -A dependencies
for DEP in $DEPENDENCYPATHS; do
DEPNAME=$(basename $DEP)
dependencies[$DEPNAME]=$DEP
done
if [ ! -d elation ]; then
ln -s ${dependencies["elation"]}
cd elation/components
ln -s ${dependencies["elation-engine"]} engine
ln -s ${dependencies["elation-share"]} share
ln -s ${dependencies["cyclone-physics"]} physics
ln -s ${dependencies["janusweb"]} janusweb
cd ..
./elation web init
./elation component enable engine physics share janusweb
fi
echo "done"
 | Use npm for package installation | Use npm for package installation
| Shell | mit | jbaicoianu/janusweb,jbaicoianu/janusweb,jbaicoianu/janusweb | shell | ## Code Before:
if [ ! -d elation ]; then
git clone https://github.com/jbaicoianu/elation.git
cd elation
git clone https://github.com/jbaicoianu/elation-engine.git components/engine
git clone https://github.com/jbaicoianu/cyclone-physics-js.git components/physics
git clone https://github.com/jbaicoianu/elation-share.git components/share
#git clone https://github.com/jbaicoianu/janusweb.git components/janusweb
ln -s `pwd`/.. components/janusweb
./elation web init
./elation component enable engine physics share janusweb
fi
## Instruction:
Use npm for package installation
## Code After:
echo "Installing dependencies..."
npm install
echo "done"
echo
echo "Creating directory tree..."
DEPENDENCYPATHS=$(npm ls -parseable)
declare -A dependencies
for DEP in $DEPENDENCYPATHS; do
DEPNAME=$(basename $DEP)
dependencies[$DEPNAME]=$DEP
done
if [ ! -d elation ]; then
ln -s ${dependencies["elation"]}
cd elation/components
ln -s ${dependencies["elation-engine"]} engine
ln -s ${dependencies["elation-share"]} share
ln -s ${dependencies["cyclone-physics"]} physics
ln -s ${dependencies["janusweb"]} janusweb
cd ..
./elation web init
./elation component enable engine physics share janusweb
fi
echo "done"
| +
+ echo "Installing dependencies..."
+ npm install
+ echo "done"
+ echo
+
+ echo "Creating directory tree..."
+ DEPENDENCYPATHS=$(npm ls -parseable)
+
+ declare -A dependencies
+
+ for DEP in $DEPENDENCYPATHS; do
+ DEPNAME=$(basename $DEP)
+ dependencies[$DEPNAME]=$DEP
+ done
if [ ! -d elation ]; then
- git clone https://github.com/jbaicoianu/elation.git
- cd elation
- git clone https://github.com/jbaicoianu/elation-engine.git components/engine
- git clone https://github.com/jbaicoianu/cyclone-physics-js.git components/physics
- git clone https://github.com/jbaicoianu/elation-share.git components/share
- #git clone https://github.com/jbaicoianu/janusweb.git components/janusweb
- ln -s `pwd`/.. components/janusweb
+ ln -s ${dependencies["elation"]}
+
+ cd elation/components
+ ln -s ${dependencies["elation-engine"]} engine
+ ln -s ${dependencies["elation-share"]} share
+ ln -s ${dependencies["cyclone-physics"]} physics
+ ln -s ${dependencies["janusweb"]} janusweb
+
+ cd ..
./elation web init
./elation component enable engine physics share janusweb
fi
+ echo "done" | 32 | 2.666667 | 25 | 7 |
a751b17f671eef7dfd509452c61e64820d0078a4 | requirements.txt | requirements.txt | colorama==0.3.9
nose==1.3.7
PyYAML>=3.12,<4
requests>=2.18.3,<3
Requires==0.0.3
-e git+https://github.com/tedivm/python-screeps.git@master#egg=screepsapi
six>=1.10.0,<2
urwid==1.3.1
websocket-client==0.44.0
| colorama==0.3.9
nose==1.3.7
PyYAML>=3.12,<4
requests>=2.18.3,<3
Requires==0.0.3
screepsapi>=0.4.2
six>=1.10.0,<2
urwid==1.3.1
websocket-client==0.44.0
| Switch screeps_console to using the pypi version of screepsapi | Switch screeps_console to using the pypi version of screepsapi
| Text | mit | screepers/screeps_console,screepers/screeps_console | text | ## Code Before:
colorama==0.3.9
nose==1.3.7
PyYAML>=3.12,<4
requests>=2.18.3,<3
Requires==0.0.3
-e git+https://github.com/tedivm/python-screeps.git@master#egg=screepsapi
six>=1.10.0,<2
urwid==1.3.1
websocket-client==0.44.0
## Instruction:
Switch screeps_console to using the pypi version of screepsapi
## Code After:
colorama==0.3.9
nose==1.3.7
PyYAML>=3.12,<4
requests>=2.18.3,<3
Requires==0.0.3
screepsapi>=0.4.2
six>=1.10.0,<2
urwid==1.3.1
websocket-client==0.44.0
| colorama==0.3.9
nose==1.3.7
PyYAML>=3.12,<4
requests>=2.18.3,<3
Requires==0.0.3
- -e git+https://github.com/tedivm/python-screeps.git@master#egg=screepsapi
+ screepsapi>=0.4.2
six>=1.10.0,<2
urwid==1.3.1
websocket-client==0.44.0 | 2 | 0.222222 | 1 | 1 |
b76b3cbe0d86bd5037ccfd21086ab50803606ec2 | autobuilder/webhooks.py | autobuilder/webhooks.py | from buildbot.status.web.hooks.github import GitHubEventHandler
from twisted.python import log
import abconfig
class AutobuilderGithubEventHandler(GitHubEventHandler):
def handle_push(self, payload):
# This field is unused:
user = None
# user = payload['pusher']['name']
repo = payload['repository']['name']
repo_url = payload['repository']['url']
# NOTE: what would be a reasonable value for project?
# project = request.args.get('project', [''])[0]
project = abconfig.get_project_for_url(repo_url,
default_if_not_found=payload['repository']['full_name'])
changes = self._process_change(payload, user, repo, repo_url, project)
log.msg("Received %d changes from github" % len(changes))
return changes, 'git'
| from buildbot.status.web.hooks.github import GitHubEventHandler
from twisted.python import log
import abconfig
def codebasemap(payload):
return abconfig.get_project_for_url(payload['repository']['url'])
class AutobuilderGithubEventHandler(GitHubEventHandler):
    def __init__(self, secret, strict, codebase=None):
if codebase is None:
codebase = codebasemap
GitHubEventHandler.__init__(self, secret, strict, codebase)
def handle_push(self, payload):
# This field is unused:
user = None
# user = payload['pusher']['name']
repo = payload['repository']['name']
repo_url = payload['repository']['url']
# NOTE: what would be a reasonable value for project?
# project = request.args.get('project', [''])[0]
project = abconfig.get_project_for_url(repo_url,
default_if_not_found=payload['repository']['full_name'])
changes = self._process_change(payload, user, repo, repo_url, project)
log.msg("Received %d changes from github" % len(changes))
return changes, 'git'
 | Add a codebase generator to the Github web hook handler, to map the URL to the repo name for use as the codebase. | Add a codebase generator to the Github web hook handler,
to map the URL to the repo name for use as the codebase.
| Python | mit | madisongh/autobuilder | python | ## Code Before:
from buildbot.status.web.hooks.github import GitHubEventHandler
from twisted.python import log
import abconfig
class AutobuilderGithubEventHandler(GitHubEventHandler):
def handle_push(self, payload):
# This field is unused:
user = None
# user = payload['pusher']['name']
repo = payload['repository']['name']
repo_url = payload['repository']['url']
# NOTE: what would be a reasonable value for project?
# project = request.args.get('project', [''])[0]
project = abconfig.get_project_for_url(repo_url,
default_if_not_found=payload['repository']['full_name'])
changes = self._process_change(payload, user, repo, repo_url, project)
log.msg("Received %d changes from github" % len(changes))
return changes, 'git'
## Instruction:
Add a codebase generator to the Github web hook handler,
to map the URL to the repo name for use as the codebase.
## Code After:
from buildbot.status.web.hooks.github import GitHubEventHandler
from twisted.python import log
import abconfig
def codebasemap(payload):
return abconfig.get_project_for_url(payload['repository']['url'])
class AutobuilderGithubEventHandler(GitHubEventHandler):
    def __init__(self, secret, strict, codebase=None):
if codebase is None:
codebase = codebasemap
GitHubEventHandler.__init__(self, secret, strict, codebase)
def handle_push(self, payload):
# This field is unused:
user = None
# user = payload['pusher']['name']
repo = payload['repository']['name']
repo_url = payload['repository']['url']
# NOTE: what would be a reasonable value for project?
# project = request.args.get('project', [''])[0]
project = abconfig.get_project_for_url(repo_url,
default_if_not_found=payload['repository']['full_name'])
changes = self._process_change(payload, user, repo, repo_url, project)
log.msg("Received %d changes from github" % len(changes))
return changes, 'git'
| from buildbot.status.web.hooks.github import GitHubEventHandler
from twisted.python import log
import abconfig
+ def codebasemap(payload):
+ return abconfig.get_project_for_url(payload['repository']['url'])
+
class AutobuilderGithubEventHandler(GitHubEventHandler):
+
+     def __init__(self, secret, strict, codebase=None):
+ if codebase is None:
+ codebase = codebasemap
+ GitHubEventHandler.__init__(self, secret, strict, codebase)
def handle_push(self, payload):
# This field is unused:
user = None
# user = payload['pusher']['name']
repo = payload['repository']['name']
repo_url = payload['repository']['url']
# NOTE: what would be a reasonable value for project?
# project = request.args.get('project', [''])[0]
project = abconfig.get_project_for_url(repo_url,
default_if_not_found=payload['repository']['full_name'])
changes = self._process_change(payload, user, repo, repo_url, project)
log.msg("Received %d changes from github" % len(changes))
return changes, 'git' | 8 | 0.363636 | 8 | 0 |
c8814756a3fda2ae2b2f9ac9f4c0004bb7b60dd2 | client/Main/navigation/navigationmachineitem.coffee | client/Main/navigation/navigationmachineitem.coffee | class NavigationMachineItem extends JView
{Running, Stopped} = Machine.State
stateClasses = ''
stateClasses += "#{state.toLowerCase()} " for state in Object.keys Machine.State
constructor:(options = {}, data)->
machine = data
@alias = machine.getName()
path = KD.utils.groupifyLink "/IDE/VM/#{machine.uid}"
options.tagName = 'a'
options.cssClass = "vm #{machine.status.state.toLowerCase()} #{machine.provider}"
options.attributes =
href : path
title : "Open IDE for #{@alias}"
super options, data
@machine = @getData()
@progress = new KDProgressBarView
cssClass : 'hidden'
# initial : Math.floor Math.random() * 100
{ computeController } = KD.singletons
computeController.on "public-#{@machine._id}", (event)=>
{percentage, status} = event
if percentage?
if @progress.bar
@progress.show()
@progress.updateBar percentage
if percentage is 100
KD.utils.wait 1000, @progress.bound 'hide'
else
@progress.hide()
if status?
@unsetClass stateClasses
@setClass status.toLowerCase()
pistachio:->
"""
<figure></figure>#{@alias}<span></span>
{{> @progress}}
""" | class NavigationMachineItem extends JView
{Running, Stopped} = Machine.State
stateClasses = ''
stateClasses += "#{state.toLowerCase()} " for state in Object.keys Machine.State
constructor:(options = {}, data)->
machine = data
@alias = machine.getName()
path = KD.utils.groupifyLink "/IDE/VM/#{machine.uid}"
options.tagName = 'a'
options.cssClass = "vm #{machine.status.state.toLowerCase()} #{machine.provider}"
options.attributes =
href : path
title : "Open IDE for #{@alias}"
super options, data
@machine = @getData()
@label = new KDCustomHTMLView
partial : @alias
@progress = new KDProgressBarView
cssClass : 'hidden'
{ computeController } = KD.singletons
computeController.on "public-#{@machine._id}", (event)=>
{percentage, status} = event
if percentage?
if @progress.bar
@progress.show()
@progress.updateBar percentage
if percentage is 100
KD.utils.wait 1000, @progress.bound 'hide'
else
@progress.hide()
if status?
@unsetClass stateClasses
@setClass status.toLowerCase()
pistachio:->
return """
<figure></figure>
{{> @label}}
<span></span>
{{> @progress}}
"""
| Replace inline pistachio string with a KDCustomHTMLView. | NavigationMachineItem: Replace inline pistachio string with a KDCustomHTMLView.
| CoffeeScript | agpl-3.0 | andrewjcasal/koding,gokmen/koding,sinan/koding,usirin/koding,sinan/koding,mertaytore/koding,usirin/koding,alex-ionochkin/koding,koding/koding,kwagdy/koding-1,rjeczalik/koding,sinan/koding,szkl/koding,usirin/koding,kwagdy/koding-1,usirin/koding,koding/koding,drewsetski/koding,cihangir/koding,usirin/koding,mertaytore/koding,rjeczalik/koding,drewsetski/koding,andrewjcasal/koding,drewsetski/koding,szkl/koding,sinan/koding,koding/koding,cihangir/koding,gokmen/koding,szkl/koding,jack89129/koding,mertaytore/koding,kwagdy/koding-1,gokmen/koding,gokmen/koding,acbodine/koding,rjeczalik/koding,koding/koding,mertaytore/koding,cihangir/koding,alex-ionochkin/koding,mertaytore/koding,andrewjcasal/koding,acbodine/koding,gokmen/koding,jack89129/koding,drewsetski/koding,jack89129/koding,szkl/koding,alex-ionochkin/koding,gokmen/koding,cihangir/koding,alex-ionochkin/koding,usirin/koding,rjeczalik/koding,alex-ionochkin/koding,alex-ionochkin/koding,jack89129/koding,drewsetski/koding,mertaytore/koding,koding/koding,andrewjcasal/koding,rjeczalik/koding,drewsetski/koding,mertaytore/koding,kwagdy/koding-1,andrewjcasal/koding,sinan/koding,szkl/koding,rjeczalik/koding,szkl/koding,szkl/koding,szkl/koding,mertaytore/koding,cihangir/koding,sinan/koding,acbodine/koding,drewsetski/koding,alex-ionochkin/koding,andrewjcasal/koding,acbodine/koding,jack89129/koding,cihangir/koding,andrewjcasal/koding,cihangir/koding,andrewjcasal/koding,sinan/koding,kwagdy/koding-1,jack89129/koding,kwagdy/koding-1,gokmen/koding,gokmen/koding,kwagdy/koding-1,alex-ionochkin/koding,rjeczalik/koding,kwagdy/koding-1,acbodine/koding,usirin/koding,rjeczalik/koding,jack89129/koding,koding/koding,jack89129/koding,acbodine/koding,acbodine/koding,acbodine/koding,drewsetski/koding,sinan/koding,koding/koding,usirin/koding,cihangir/koding,koding/koding | coffeescript | ## Code Before:
class NavigationMachineItem extends JView
{Running, Stopped} = Machine.State
stateClasses = ''
stateClasses += "#{state.toLowerCase()} " for state in Object.keys Machine.State
constructor:(options = {}, data)->
machine = data
@alias = machine.getName()
path = KD.utils.groupifyLink "/IDE/VM/#{machine.uid}"
options.tagName = 'a'
options.cssClass = "vm #{machine.status.state.toLowerCase()} #{machine.provider}"
options.attributes =
href : path
title : "Open IDE for #{@alias}"
super options, data
@machine = @getData()
@progress = new KDProgressBarView
cssClass : 'hidden'
# initial : Math.floor Math.random() * 100
{ computeController } = KD.singletons
computeController.on "public-#{@machine._id}", (event)=>
{percentage, status} = event
if percentage?
if @progress.bar
@progress.show()
@progress.updateBar percentage
if percentage is 100
KD.utils.wait 1000, @progress.bound 'hide'
else
@progress.hide()
if status?
@unsetClass stateClasses
@setClass status.toLowerCase()
pistachio:->
"""
<figure></figure>#{@alias}<span></span>
{{> @progress}}
"""
## Instruction:
NavigationMachineItem: Replace inline pistachio string with a KDCustomHTMLView.
## Code After:
class NavigationMachineItem extends JView
{Running, Stopped} = Machine.State
stateClasses = ''
stateClasses += "#{state.toLowerCase()} " for state in Object.keys Machine.State
constructor:(options = {}, data)->
machine = data
@alias = machine.getName()
path = KD.utils.groupifyLink "/IDE/VM/#{machine.uid}"
options.tagName = 'a'
options.cssClass = "vm #{machine.status.state.toLowerCase()} #{machine.provider}"
options.attributes =
href : path
title : "Open IDE for #{@alias}"
super options, data
@machine = @getData()
@label = new KDCustomHTMLView
partial : @alias
@progress = new KDProgressBarView
cssClass : 'hidden'
{ computeController } = KD.singletons
computeController.on "public-#{@machine._id}", (event)=>
{percentage, status} = event
if percentage?
if @progress.bar
@progress.show()
@progress.updateBar percentage
if percentage is 100
KD.utils.wait 1000, @progress.bound 'hide'
else
@progress.hide()
if status?
@unsetClass stateClasses
@setClass status.toLowerCase()
pistachio:->
return """
<figure></figure>
{{> @label}}
<span></span>
{{> @progress}}
"""
| class NavigationMachineItem extends JView
{Running, Stopped} = Machine.State
stateClasses = ''
stateClasses += "#{state.toLowerCase()} " for state in Object.keys Machine.State
constructor:(options = {}, data)->
machine = data
@alias = machine.getName()
path = KD.utils.groupifyLink "/IDE/VM/#{machine.uid}"
options.tagName = 'a'
options.cssClass = "vm #{machine.status.state.toLowerCase()} #{machine.provider}"
options.attributes =
href : path
title : "Open IDE for #{@alias}"
super options, data
- @machine = @getData()
+ @machine = @getData()
? ++
+
+ @label = new KDCustomHTMLView
+ partial : @alias
+
- @progress = new KDProgressBarView
+ @progress = new KDProgressBarView
? +
cssClass : 'hidden'
- # initial : Math.floor Math.random() * 100
{ computeController } = KD.singletons
computeController.on "public-#{@machine._id}", (event)=>
{percentage, status} = event
if percentage?
if @progress.bar
@progress.show()
@progress.updateBar percentage
if percentage is 100
KD.utils.wait 1000, @progress.bound 'hide'
else
@progress.hide()
if status?
@unsetClass stateClasses
@setClass status.toLowerCase()
pistachio:->
+ return """
+ <figure></figure>
+ {{> @label}}
+ <span></span>
+ {{> @progress}}
"""
- <figure></figure>#{@alias}<span></span>
- {{> @progress}}
- """ | 17 | 0.288136 | 11 | 6 |
e5ae2a615dad480ceb4e7b5122c5b20e6a976791 | scripts/deploy/rebuild_demo.sh | scripts/deploy/rebuild_demo.sh | DEMO_HEROKU_APP_NAME=somerville-teacher-tool-demo
# Deploy to demo app and migrate
echo "🚨 🚨 🚨 DANGER: About to destroy and rebuild the ⬢ $DEMO_HEROKU_APP_NAME database."
echo
read -p "🚨 🚨 🚨 This will cause downtime on $DEMO_HEROKU_APP_NAME. Continue? [y/N] " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]; then
# Reset database through Heroku Postgres CLI
echo "⚙ 💻 ⚙ heroku pg:reset..."
heroku pg:reset DATABASE_URL -a $DEMO_HEROKU_APP_NAME --confirm $DEMO_HEROKU_APP_NAME
# Deploy to Somerville production app and migrate
echo "⚙ 💻 ⚙ rake db:schema:load..."
heroku run DISABLE_DATABASE_ENVIRONMENT_CHECK=1 rake db:migrate --app $DEMO_HEROKU_APP_NAME
echo "⚙ 💻 ⚙ rake db:seed..."
heroku run MORE_DEMO_STUDENTS=false rake db:seed --app $DEMO_HEROKU_APP_NAME
echo
# Deploy to demo app and migrate
echo "Done. Rebuilt ⬢ $DEMO_HEROKU_APP_NAME database."
else
echo "Aborted."
fi
| DEMO_HEROKU_APP_NAME=somerville-teacher-tool-demo
# Deploy to demo app and migrate
echo "🚨 🚨 🚨 DANGER: About to destroy and rebuild the ⬢ $DEMO_HEROKU_APP_NAME database."
echo
read -p "🚨 🚨 🚨 This will cause downtime on $DEMO_HEROKU_APP_NAME. Continue? [y/N] " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]; then
# Maintenance mode
heroku maintenance:on --app $DEMO_HEROKU_APP_NAME
# Reset database through Heroku Postgres CLI
echo "⚙ 💻 ⚙ heroku pg:reset..."
heroku pg:reset DATABASE_URL -a $DEMO_HEROKU_APP_NAME --confirm $DEMO_HEROKU_APP_NAME
# Deploy to Somerville production app and migrate
echo "⚙ 💻 ⚙ rake db:schema:load..."
heroku run DISABLE_DATABASE_ENVIRONMENT_CHECK=1 rake db:migrate --app $DEMO_HEROKU_APP_NAME
echo "⚙ 💻 ⚙ rake db:seed..."
heroku run MORE_DEMO_STUDENTS=true rake db:seed --app $DEMO_HEROKU_APP_NAME
echo
# Maintenance mode
heroku maintenance:off --app $DEMO_HEROKU_APP_NAME
# Deploy to demo app and migrate
echo "Done. Rebuilt ⬢ $DEMO_HEROKU_APP_NAME database."
else
echo "Aborted."
fi
| Update script to rebuild demo site | Update script to rebuild demo site
| Shell | mit | studentinsights/studentinsights,studentinsights/studentinsights,studentinsights/studentinsights,studentinsights/studentinsights | shell | ## Code Before:
DEMO_HEROKU_APP_NAME=somerville-teacher-tool-demo
# Deploy to demo app and migrate
echo "🚨 🚨 🚨 DANGER: About to destroy and rebuild the ⬢ $DEMO_HEROKU_APP_NAME database."
echo
read -p "🚨 🚨 🚨 This will cause downtime on $DEMO_HEROKU_APP_NAME. Continue? [y/N] " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]; then
# Reset database through Heroku Postgres CLI
echo "⚙ 💻 ⚙ heroku pg:reset..."
heroku pg:reset DATABASE_URL -a $DEMO_HEROKU_APP_NAME --confirm $DEMO_HEROKU_APP_NAME
# Deploy to Somerville production app and migrate
echo "⚙ 💻 ⚙ rake db:schema:load..."
heroku run DISABLE_DATABASE_ENVIRONMENT_CHECK=1 rake db:migrate --app $DEMO_HEROKU_APP_NAME
echo "⚙ 💻 ⚙ rake db:seed..."
heroku run MORE_DEMO_STUDENTS=false rake db:seed --app $DEMO_HEROKU_APP_NAME
echo
# Deploy to demo app and migrate
echo "Done. Rebuilt ⬢ $DEMO_HEROKU_APP_NAME database."
else
echo "Aborted."
fi
## Instruction:
Update script to rebuild demo site
## Code After:
DEMO_HEROKU_APP_NAME=somerville-teacher-tool-demo
# Deploy to demo app and migrate
echo "🚨 🚨 🚨 DANGER: About to destroy and rebuild the ⬢ $DEMO_HEROKU_APP_NAME database."
echo
read -p "🚨 🚨 🚨 This will cause downtime on $DEMO_HEROKU_APP_NAME. Continue? [y/N] " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]; then
# Maintenance mode
heroku maintenance:on --app $DEMO_HEROKU_APP_NAME
# Reset database through Heroku Postgres CLI
echo "⚙ 💻 ⚙ heroku pg:reset..."
heroku pg:reset DATABASE_URL -a $DEMO_HEROKU_APP_NAME --confirm $DEMO_HEROKU_APP_NAME
# Deploy to Somerville production app and migrate
echo "⚙ 💻 ⚙ rake db:schema:load..."
heroku run DISABLE_DATABASE_ENVIRONMENT_CHECK=1 rake db:migrate --app $DEMO_HEROKU_APP_NAME
echo "⚙ 💻 ⚙ rake db:seed..."
heroku run MORE_DEMO_STUDENTS=true rake db:seed --app $DEMO_HEROKU_APP_NAME
echo
# Maintenance mode
heroku maintenance:off --app $DEMO_HEROKU_APP_NAME
# Deploy to demo app and migrate
echo "Done. Rebuilt ⬢ $DEMO_HEROKU_APP_NAME database."
else
echo "Aborted."
fi
| DEMO_HEROKU_APP_NAME=somerville-teacher-tool-demo
# Deploy to demo app and migrate
echo "🚨 🚨 🚨 DANGER: About to destroy and rebuild the ⬢ $DEMO_HEROKU_APP_NAME database."
echo
read -p "🚨 🚨 🚨 This will cause downtime on $DEMO_HEROKU_APP_NAME. Continue? [y/N] " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]; then
+ # Maintenance mode
+ heroku maintenance:on --app $DEMO_HEROKU_APP_NAME
+
# Reset database through Heroku Postgres CLI
echo "⚙ 💻 ⚙ heroku pg:reset..."
heroku pg:reset DATABASE_URL -a $DEMO_HEROKU_APP_NAME --confirm $DEMO_HEROKU_APP_NAME
# Deploy to Somerville production app and migrate
echo "⚙ 💻 ⚙ rake db:schema:load..."
heroku run DISABLE_DATABASE_ENVIRONMENT_CHECK=1 rake db:migrate --app $DEMO_HEROKU_APP_NAME
echo "⚙ 💻 ⚙ rake db:seed..."
- heroku run MORE_DEMO_STUDENTS=false rake db:seed --app $DEMO_HEROKU_APP_NAME
? ^^^^
+ heroku run MORE_DEMO_STUDENTS=true rake db:seed --app $DEMO_HEROKU_APP_NAME
? ^^^
echo
+ # Maintenance mode
+ heroku maintenance:off --app $DEMO_HEROKU_APP_NAME
+
# Deploy to demo app and migrate
echo "Done. Rebuilt ⬢ $DEMO_HEROKU_APP_NAME database."
else
echo "Aborted."
fi | 8 | 0.307692 | 7 | 1 |
80d9a14e3c0f62131161a7b46da93ab47768aea7 | tests/TestCase.php | tests/TestCase.php | <?php
namespace Tests;
use Faker\Factory;
class TestCase extends \Laravel\Lumen\Testing\TestCase
{
/**
* Creates the application.
*
* @return \Laravel\Lumen\Application
*/
public function createApplication()
{
return require __DIR__.'/../bootstrap/app.php';
}
/**
* Ran before every test.
*/
public function setUp()
{
parent::setUp();
// Set these super globals since some packages rely on them
$_SERVER['HTTP_USER_AGENT'] = '';
$_SERVER['HTTP_HOST'] = '';
$_SERVER['REQUEST_URI'] = '';
// Force the top menu to be enabled for now since the tests are written specifically for this condition
config(['app.top_menu_enabled' => true]);
// Create a new faker that every test can use
$this->faker = (new Factory)->create();
}
}
| <?php
namespace Tests;
use Faker\Factory;
class TestCase extends \Laravel\Lumen\Testing\TestCase
{
/**
* Creates the application.
*
* @return \Laravel\Lumen\Application
*/
public function createApplication()
{
return require __DIR__.'/../bootstrap/app.php';
}
/**
* Ran before every test.
*/
public function setUp()
{
parent::setUp();
// Set these super globals since some packages rely on them
$_SERVER['HTTP_USER_AGENT'] = '';
$_SERVER['HTTP_HOST'] = '';
$_SERVER['REQUEST_URI'] = '';
// Force the top menu to be enabled for now since the tests are written specifically for this condition
config(['app.top_menu_enabled' => true]);
// Reset the WSU API key so we never make real connections to the API
config(['app.wsu_api_key' => '']);
// Create a new faker that every test can use
$this->faker = (new Factory)->create();
}
}
| Reset the WSU API key so the tests never make a real connection and will error if it tries to. | Reset the WSU API key so the tests never make a real connection and will error if it tries to.
| PHP | mit | waynestate/base-site,waynestate/base-site | php | ## Code Before:
<?php
namespace Tests;
use Faker\Factory;
class TestCase extends \Laravel\Lumen\Testing\TestCase
{
/**
* Creates the application.
*
* @return \Laravel\Lumen\Application
*/
public function createApplication()
{
return require __DIR__.'/../bootstrap/app.php';
}
/**
* Ran before every test.
*/
public function setUp()
{
parent::setUp();
// Set these super globals since some packages rely on them
$_SERVER['HTTP_USER_AGENT'] = '';
$_SERVER['HTTP_HOST'] = '';
$_SERVER['REQUEST_URI'] = '';
// Force the top menu to be enabled for now since the tests are written specifically for this condition
config(['app.top_menu_enabled' => true]);
// Create a new faker that every test can use
$this->faker = (new Factory)->create();
}
}
## Instruction:
Reset the WSU API key so the tests never make a real connection and will error if it tries to.
## Code After:
<?php
namespace Tests;
use Faker\Factory;
class TestCase extends \Laravel\Lumen\Testing\TestCase
{
/**
* Creates the application.
*
* @return \Laravel\Lumen\Application
*/
public function createApplication()
{
return require __DIR__.'/../bootstrap/app.php';
}
/**
* Ran before every test.
*/
public function setUp()
{
parent::setUp();
// Set these super globals since some packages rely on them
$_SERVER['HTTP_USER_AGENT'] = '';
$_SERVER['HTTP_HOST'] = '';
$_SERVER['REQUEST_URI'] = '';
// Force the top menu to be enabled for now since the tests are written specifically for this condition
config(['app.top_menu_enabled' => true]);
// Reset the WSU API key so we never make real connections to the API
config(['app.wsu_api_key' => '']);
// Create a new faker that every test can use
$this->faker = (new Factory)->create();
}
}
| <?php
namespace Tests;
use Faker\Factory;
class TestCase extends \Laravel\Lumen\Testing\TestCase
{
/**
* Creates the application.
*
* @return \Laravel\Lumen\Application
*/
public function createApplication()
{
return require __DIR__.'/../bootstrap/app.php';
}
/**
* Ran before every test.
*/
public function setUp()
{
parent::setUp();
// Set these super globals since some packages rely on them
$_SERVER['HTTP_USER_AGENT'] = '';
$_SERVER['HTTP_HOST'] = '';
$_SERVER['REQUEST_URI'] = '';
// Force the top menu to be enabled for now since the tests are written specifically for this condition
config(['app.top_menu_enabled' => true]);
+ // Reset the WSU API key so we never make real connections to the API
+ config(['app.wsu_api_key' => '']);
+
// Create a new faker that every test can use
$this->faker = (new Factory)->create();
}
} | 3 | 0.081081 | 3 | 0 |
556a9c67ff9a9ef6b48f821ba6ff78698ed75b1a | README.md | README.md | Pyramid Slide challenge (https://www.reddit.com/r/dailyprogrammer/comments/6vi9ro/170823_challenge_328_intermediate_pyramid_sliding/?st=j8oziieq&sh=700700bb)
| Pyramid Slide challenge (https://www.reddit.com/r/dailyprogrammer/comments/6vi9ro/170823_challenge_328_intermediate_pyramid_sliding/?st=j8oziieq&sh=700700bb)
## Group A (??)
* Sam
* Chris
* Tristian
### Pyramid
* https://github.com/Triztian/pyramid
```
$ go get github.com/Triztian/pyramid
```
| Update Readme; add our groups repo | Update Readme; add our groups repo | Markdown | mit | SDGophers/pyramid_slide | markdown | ## Code Before:
Pyramid Slide challenge (https://www.reddit.com/r/dailyprogrammer/comments/6vi9ro/170823_challenge_328_intermediate_pyramid_sliding/?st=j8oziieq&sh=700700bb)
## Instruction:
Update Readme; add our groups repo
## Code After:
Pyramid Slide challenge (https://www.reddit.com/r/dailyprogrammer/comments/6vi9ro/170823_challenge_328_intermediate_pyramid_sliding/?st=j8oziieq&sh=700700bb)
## Group A (??)
* Sam
* Chris
* Tristian
### Pyramid
* https://github.com/Triztian/pyramid
```
$ go get github.com/Triztian/pyramid
```
| Pyramid Slide challenge (https://www.reddit.com/r/dailyprogrammer/comments/6vi9ro/170823_challenge_328_intermediate_pyramid_sliding/?st=j8oziieq&sh=700700bb)
+
+ ## Group A (??)
+
+ * Sam
+ * Chris
+ * Tristian
+
+ ### Pyramid
+ * https://github.com/Triztian/pyramid
+
+ ```
+ $ go get github.com/Triztian/pyramid
+ ```
+
+
+ | 16 | 16 | 16 | 0 |
44ddb35699ed2a7cff9e5dc2f4b36823931a57b6 | app/process_request.js | app/process_request.js | var requirejs = require('requirejs');
var PageConfig = requirejs('page_config');
var get_dashboard_and_render = require('./server/mixins/get_dashboard_and_render');
var renderContent = function (req, res, model) {
model.set(PageConfig.commonConfig(req));
var ControllerClass = model.get('controller');
var controller = new ControllerClass({
model: model,
url: req.originalUrl
});
controller.once('ready', function () {
res.set('Cache-Control', 'public, max-age=600');
if (model.get('published') !== true) {
res.set('X-Robots-Tag', 'none');
}
req['spotlight-dashboard-slug'] = model.get('slug');
res.send(controller.html);
});
controller.render({ init: true });
return controller;
};
var setup = function (req, res) {
var client_instance = setup.get_dashboard_and_render(req, res, setup.renderContent);
//I have no idea what this does, can't find anything obvious in the docs or this app.
client_instance.set('script', true);
client_instance.setPath(req.url.replace('/performance', ''));
};
setup.renderContent = renderContent;
setup.get_dashboard_and_render = get_dashboard_and_render;
module.exports = setup;
| var requirejs = require('requirejs');
var PageConfig = requirejs('page_config');
var get_dashboard_and_render = require('./server/mixins/get_dashboard_and_render');
var renderContent = function (req, res, model) {
model.set(PageConfig.commonConfig(req));
var ControllerClass = model.get('controller');
var controller = new ControllerClass({
model: model,
url: req.originalUrl
});
controller.once('ready', function () {
res.set('Cache-Control', 'public, max-age=600');
if (model.get('published') !== true) {
res.set('X-Robots-Tag', 'none');
}
req['spotlight-dashboard-slug'] = model.get('slug');
res.send(controller.html);
});
controller.render({ init: true });
return controller;
};
var setup = function (req, res) {
var client_instance = setup.get_dashboard_and_render(req, res, setup.renderContent);
client_instance.set('script', true);
client_instance.setPath(req.url.replace('/performance', ''));
};
setup.renderContent = renderContent;
setup.get_dashboard_and_render = get_dashboard_and_render;
module.exports = setup;
| Remove comment about client_instance script | Remove comment about client_instance script
The script attribute of a client_instance is tested in the view,
for example in body-end.html:
<% if (model.get('script')) { %>
It can be set to false to disable including of our rather large
JavaScript assets.
| JavaScript | mit | alphagov/spotlight,tijmenb/spotlight,alphagov/spotlight,keithiopia/spotlight,alphagov/spotlight,tijmenb/spotlight,keithiopia/spotlight,keithiopia/spotlight,tijmenb/spotlight | javascript | ## Code Before:
var requirejs = require('requirejs');
var PageConfig = requirejs('page_config');
var get_dashboard_and_render = require('./server/mixins/get_dashboard_and_render');
var renderContent = function (req, res, model) {
model.set(PageConfig.commonConfig(req));
var ControllerClass = model.get('controller');
var controller = new ControllerClass({
model: model,
url: req.originalUrl
});
controller.once('ready', function () {
res.set('Cache-Control', 'public, max-age=600');
if (model.get('published') !== true) {
res.set('X-Robots-Tag', 'none');
}
req['spotlight-dashboard-slug'] = model.get('slug');
res.send(controller.html);
});
controller.render({ init: true });
return controller;
};
var setup = function (req, res) {
var client_instance = setup.get_dashboard_and_render(req, res, setup.renderContent);
//I have no idea what this does, can't find anything obvious in the docs or this app.
client_instance.set('script', true);
client_instance.setPath(req.url.replace('/performance', ''));
};
setup.renderContent = renderContent;
setup.get_dashboard_and_render = get_dashboard_and_render;
module.exports = setup;
## Instruction:
Remove comment about client_instance script
The script attribute of a client_instance is tested in the view,
for example in body-end.html:
<% if (model.get('script')) { %>
It can be set to false to disable including of our rather large
JavaScript assets.
## Code After:
var requirejs = require('requirejs');
var PageConfig = requirejs('page_config');
var get_dashboard_and_render = require('./server/mixins/get_dashboard_and_render');
var renderContent = function (req, res, model) {
model.set(PageConfig.commonConfig(req));
var ControllerClass = model.get('controller');
var controller = new ControllerClass({
model: model,
url: req.originalUrl
});
controller.once('ready', function () {
res.set('Cache-Control', 'public, max-age=600');
if (model.get('published') !== true) {
res.set('X-Robots-Tag', 'none');
}
req['spotlight-dashboard-slug'] = model.get('slug');
res.send(controller.html);
});
controller.render({ init: true });
return controller;
};
var setup = function (req, res) {
var client_instance = setup.get_dashboard_and_render(req, res, setup.renderContent);
client_instance.set('script', true);
client_instance.setPath(req.url.replace('/performance', ''));
};
setup.renderContent = renderContent;
setup.get_dashboard_and_render = get_dashboard_and_render;
module.exports = setup;
| var requirejs = require('requirejs');
var PageConfig = requirejs('page_config');
var get_dashboard_and_render = require('./server/mixins/get_dashboard_and_render');
var renderContent = function (req, res, model) {
model.set(PageConfig.commonConfig(req));
var ControllerClass = model.get('controller');
var controller = new ControllerClass({
model: model,
url: req.originalUrl
});
controller.once('ready', function () {
res.set('Cache-Control', 'public, max-age=600');
if (model.get('published') !== true) {
res.set('X-Robots-Tag', 'none');
}
req['spotlight-dashboard-slug'] = model.get('slug');
res.send(controller.html);
});
controller.render({ init: true });
return controller;
};
var setup = function (req, res) {
var client_instance = setup.get_dashboard_and_render(req, res, setup.renderContent);
- //I have no idea what this does, can't find anything obvious in the docs or this app.
client_instance.set('script', true);
client_instance.setPath(req.url.replace('/performance', ''));
};
setup.renderContent = renderContent;
setup.get_dashboard_and_render = get_dashboard_and_render;
module.exports = setup; | 1 | 0.025 | 0 | 1 |
bac9f7161886c0190ae48ac4ecffc1caa8cd0ff4 | app/views/snippets/form_inset_panel.html | app/views/snippets/form_inset_panel.html | <div class="form-group">
<div class="panel panel-border-narrow" id="contact-by-text">
<label class="form-label" for="contact-text-message">Mobile phone number</label>
<input class="form-control" name="contact-text-message" type="text" id="contact-text-message">
</div>
</div>
| <div class="form-group">
<div class="panel panel-border-narrow">
<label class="form-label" for="contact-text-message">Mobile phone number</label>
<input class="form-control" name="contact-text-message" type="text" id="contact-text-message">
</div>
</div>
| Remove duplicate ID - contact-by-text | Remove duplicate ID - contact-by-text
| HTML | mit | alphagov/govuk_elements,alphagov/govuk_elements,alphagov/govuk_elements,alphagov/govuk_elements | html | ## Code Before:
<div class="form-group">
<div class="panel panel-border-narrow" id="contact-by-text">
<label class="form-label" for="contact-text-message">Mobile phone number</label>
<input class="form-control" name="contact-text-message" type="text" id="contact-text-message">
</div>
</div>
## Instruction:
Remove duplicate ID - contact-by-text
## Code After:
<div class="form-group">
<div class="panel panel-border-narrow">
<label class="form-label" for="contact-text-message">Mobile phone number</label>
<input class="form-control" name="contact-text-message" type="text" id="contact-text-message">
</div>
</div>
| <div class="form-group">
- <div class="panel panel-border-narrow" id="contact-by-text">
? ---------------------
+ <div class="panel panel-border-narrow">
<label class="form-label" for="contact-text-message">Mobile phone number</label>
<input class="form-control" name="contact-text-message" type="text" id="contact-text-message">
</div>
</div> | 2 | 0.333333 | 1 | 1 |
ce612f803df08ab2c08bbc0ab2d0144875b88434 | README.md | README.md | 1rst Kaggle Competittion
|
**1rst Kaggle Competittion**
Read: *Property_Inspection_Prediction.org* | Add instruction to read the org file | Add instruction to read the org file
| Markdown | apache-2.0 | leandroohf/Public_Liberty_Mutual_Group_Property_Inspection_Prediction,leandroohf/Public_Liberty_Mutual_Group_Property_Inspection_Prediction,leandroohf/Public_Liberty_Mutual_Group_Property_Inspection_Prediction | markdown | ## Code Before:
1rst Kaggle Competittion
## Instruction:
Add instruction to read the org file
## Code After:
**1rst Kaggle Competittion**
Read: *Property_Inspection_Prediction.org* | +
- 1rst Kaggle Competittion
+ **1rst Kaggle Competittion**
? ++++ +++
+
+ Read: *Property_Inspection_Prediction.org* | 5 | 5 | 4 | 1 |
a00c50b7b9465b95770d6234d96e83c286cd33b0 | lib/MongoJob.php | lib/MongoJob.php | <?php
require_once('MongoQueue.php');
require_once('MongoFunctor.php');
abstract class MongoJob
{
public static function later($delay = 0, $batch = false)
{
return self::at(time() + $delay, $batch);
}
public static function at($time = null, $batch = false)
{
if (!$time) $time = time();
$className = get_called_class();
return new MongoFunctor($className, $time, $batch);
}
public static function batchLater($delay = 0)
{
return self::later($delay, true);
}
public static function batchAt($time = null)
{
return self::at($time, true);
}
public static function run()
{
return MongoQueue::run(get_called_class());
}
public static function count()
{
return MongoQueue::count(get_called_class());
}
}
| <?php
require_once('MongoQueue.php');
require_once('MongoFunctor.php');
abstract class MongoJob
{
public static function later($delay = 0, $batch = false)
{
return self::at(time() + $delay, $batch);
}
public static function at($time = null, $batch = false)
{
if ($time === null) $time = time();
$className = get_called_class();
return new MongoFunctor($className, $time, $batch);
}
public static function batchLater($delay = 0)
{
return self::later($delay, true);
}
public static function batchAt($time = null)
{
return self::at($time, true);
}
public static function run()
{
return MongoQueue::run(get_called_class());
}
public static function count()
{
return MongoQueue::count(get_called_class());
}
}
| Check specifically for null before overriding the timestamp | Check specifically for null before overriding the timestamp
| PHP | mit | lunaru/MongoQueue | php | ## Code Before:
<?php
require_once('MongoQueue.php');
require_once('MongoFunctor.php');
abstract class MongoJob
{
public static function later($delay = 0, $batch = false)
{
return self::at(time() + $delay, $batch);
}
public static function at($time = null, $batch = false)
{
if (!$time) $time = time();
$className = get_called_class();
return new MongoFunctor($className, $time, $batch);
}
public static function batchLater($delay = 0)
{
return self::later($delay, true);
}
public static function batchAt($time = null)
{
return self::at($time, true);
}
public static function run()
{
return MongoQueue::run(get_called_class());
}
public static function count()
{
return MongoQueue::count(get_called_class());
}
}
## Instruction:
Check specifically for null before overriding the timestamp
## Code After:
<?php
require_once('MongoQueue.php');
require_once('MongoFunctor.php');
abstract class MongoJob
{
public static function later($delay = 0, $batch = false)
{
return self::at(time() + $delay, $batch);
}
public static function at($time = null, $batch = false)
{
if ($time === null) $time = time();
$className = get_called_class();
return new MongoFunctor($className, $time, $batch);
}
public static function batchLater($delay = 0)
{
return self::later($delay, true);
}
public static function batchAt($time = null)
{
return self::at($time, true);
}
public static function run()
{
return MongoQueue::run(get_called_class());
}
public static function count()
{
return MongoQueue::count(get_called_class());
}
}
| <?php
require_once('MongoQueue.php');
require_once('MongoFunctor.php');
abstract class MongoJob
{
public static function later($delay = 0, $batch = false)
{
return self::at(time() + $delay, $batch);
}
public static function at($time = null, $batch = false)
{
- if (!$time) $time = time();
? -
+ if ($time === null) $time = time();
? +++++++++
$className = get_called_class();
return new MongoFunctor($className, $time, $batch);
}
public static function batchLater($delay = 0)
{
return self::later($delay, true);
}
public static function batchAt($time = null)
{
return self::at($time, true);
}
public static function run()
{
return MongoQueue::run(get_called_class());
}
public static function count()
{
return MongoQueue::count(get_called_class());
}
} | 2 | 0.051282 | 1 | 1 |
1657a56ec025e6670d72c20f1cf87f80ab9c7de6 | lib/throttle.js | lib/throttle.js | 'use strict';
module.exports = (api) => {
api.metasync.throttle = (
// Function throttling
timeout, // time interval
fn, // function to be executed once per timeout
args // arguments array for fn (optional)
) => {
let timer = null;
let wait = false;
return function throttled() {
if (!timer) {
timer = setTimeout(() => {
timer = null;
if (wait) throttled();
}, timeout);
if (args) fn(...args);
else fn();
wait = false;
} else {
wait = true;
}
};
};
api.metasync.debounce = (
timeout,
fn,
args = []
) => {
let timer;
return () => {
clearTimeout(timer);
timer = setTimeout(() => fn(...args), timeout);
};
};
api.metasync.timeout = (
// Set timeout for function execution
timeout, // time interval
asyncFunction, // async function to be executed
// done - callback function
done // callback function on done
) => {
let finished = false;
done = api.metasync.cb(done);
const timer = setTimeout(() => {
if (!finished) {
finished = true;
done();
}
}, timeout);
asyncFunction(() => {
if (!finished) {
clearTimeout(timer);
finished = true;
done();
}
});
};
};
| 'use strict';
module.exports = (api) => {
api.metasync.throttle = (
// Function throttling
timeout, // time interval
fn, // function to be executed once per timeout
args // arguments array for fn (optional)
) => {
let timer = null;
let wait = false;
return function throttled() {
if (!timer) {
timer = setTimeout(() => {
timer = null;
if (wait) throttled();
}, timeout);
if (args) fn(...args);
else fn();
wait = false;
} else {
wait = true;
}
};
};
api.metasync.debounce = (
timeout,
fn,
args = []
) => {
let timer;
return () => {
clearTimeout(timer);
timer = setTimeout(() => fn(...args), timeout);
};
};
api.metasync.timeout = (
// Set timeout for function execution
timeout, // time interval
asyncFunction, // async function to be executed
// done - callback function
done // callback function on done
) => {
let finished = false;
done = api.metasync.cb(done);
const timer = setTimeout(() => {
if (!finished) {
finished = true;
done(null);
}
}, timeout);
asyncFunction((err, data) => {
if (!finished) {
clearTimeout(timer);
finished = true;
done(err, data);
}
});
};
};
| Add null as error argument for callback | Add null as error argument for callback
Refs: #13
PR-URL: https://github.com/metarhia/metasync/pull/152
Reviewed-By: Timur Shemsedinov <6dc7cb6a9fcface2186172df883b5c9ab417ae33@gmail.com>
| JavaScript | mit | metarhia/MetaSync,DzyubSpirit/MetaSync,DzyubSpirit/MetaSync | javascript | ## Code Before:
'use strict';
module.exports = (api) => {
api.metasync.throttle = (
// Function throttling
timeout, // time interval
fn, // function to be executed once per timeout
args // arguments array for fn (optional)
) => {
let timer = null;
let wait = false;
return function throttled() {
if (!timer) {
timer = setTimeout(() => {
timer = null;
if (wait) throttled();
}, timeout);
if (args) fn(...args);
else fn();
wait = false;
} else {
wait = true;
}
};
};
api.metasync.debounce = (
timeout,
fn,
args = []
) => {
let timer;
return () => {
clearTimeout(timer);
timer = setTimeout(() => fn(...args), timeout);
};
};
api.metasync.timeout = (
// Set timeout for function execution
timeout, // time interval
asyncFunction, // async function to be executed
// done - callback function
done // callback function on done
) => {
let finished = false;
done = api.metasync.cb(done);
const timer = setTimeout(() => {
if (!finished) {
finished = true;
done();
}
}, timeout);
asyncFunction(() => {
if (!finished) {
clearTimeout(timer);
finished = true;
done();
}
});
};
};
## Instruction:
Add null as error argument for callback
Refs: #13
PR-URL: https://github.com/metarhia/metasync/pull/152
Reviewed-By: Timur Shemsedinov <6dc7cb6a9fcface2186172df883b5c9ab417ae33@gmail.com>
## Code After:
'use strict';
module.exports = (api) => {
api.metasync.throttle = (
// Function throttling
timeout, // time interval
fn, // function to be executed once per timeout
args // arguments array for fn (optional)
) => {
let timer = null;
let wait = false;
return function throttled() {
if (!timer) {
timer = setTimeout(() => {
timer = null;
if (wait) throttled();
}, timeout);
if (args) fn(...args);
else fn();
wait = false;
} else {
wait = true;
}
};
};
api.metasync.debounce = (
timeout,
fn,
args = []
) => {
let timer;
return () => {
clearTimeout(timer);
timer = setTimeout(() => fn(...args), timeout);
};
};
api.metasync.timeout = (
// Set timeout for function execution
timeout, // time interval
asyncFunction, // async function to be executed
// done - callback function
done // callback function on done
) => {
let finished = false;
done = api.metasync.cb(done);
const timer = setTimeout(() => {
if (!finished) {
finished = true;
done(null);
}
}, timeout);
asyncFunction((err, data) => {
if (!finished) {
clearTimeout(timer);
finished = true;
done(err, data);
}
});
};
};
| 'use strict';
module.exports = (api) => {
api.metasync.throttle = (
// Function throttling
timeout, // time interval
fn, // function to be executed once per timeout
args // arguments array for fn (optional)
) => {
let timer = null;
let wait = false;
return function throttled() {
if (!timer) {
timer = setTimeout(() => {
timer = null;
if (wait) throttled();
}, timeout);
if (args) fn(...args);
else fn();
wait = false;
} else {
wait = true;
}
};
};
api.metasync.debounce = (
timeout,
fn,
args = []
) => {
let timer;
return () => {
clearTimeout(timer);
timer = setTimeout(() => fn(...args), timeout);
};
};
api.metasync.timeout = (
// Set timeout for function execution
timeout, // time interval
asyncFunction, // async function to be executed
// done - callback function
done // callback function on done
) => {
let finished = false;
done = api.metasync.cb(done);
const timer = setTimeout(() => {
if (!finished) {
finished = true;
- done();
+ done(null);
? ++++
}
}, timeout);
- asyncFunction(() => {
+ asyncFunction((err, data) => {
? +++++++++
if (!finished) {
clearTimeout(timer);
finished = true;
- done();
+ done(err, data);
? +++++++++
}
});
};
}; | 6 | 0.090909 | 3 | 3 |
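The change in this record follows Node's error-first callback convention: a callback's first argument is the error, passed explicitly as `null` on success, with any result after it. A minimal sketch of that convention, written in Python purely for illustration (`fetch_user` and its callback are invented names, not part of metasync):

```python
# Error-first callback convention: every callback receives (err, result),
# and the error slot is explicitly None on success; this is the Python
# analogue of calling done(null) / done(err, data) in the record above.
def fetch_user(user_id, done):
    """Invoke done(err, result) exactly once."""
    if user_id < 0:
        done(ValueError("bad id"), None)   # failure: error comes first
    else:
        done(None, {"id": user_id})        # success: error slot is None

results = []
fetch_user(7, lambda err, data: results.append((err, data)))
fetch_user(-1, lambda err, data: results.append((err, data)))
```

The point of adding `done(null)` in the diff is exactly this: consumers can always destructure `(err, data)` without special-casing the success path.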
8a2b55898d785efb7344750a8ad3b4e255a00600 | mildly-cold/Office-map.json | mildly-cold/Office-map.json | {
"name": "Office",
"asset": "../../office/Office.gltf",
"players": 1,
"spawn": {
"positions": [
[0.0, 1.0, 0.0]
],
"choice": "random"
},
"collision_mesh": {
"vertices": "Collision_Vertices",
"indices": "Collision_Indices"
},
"physics": {
"gravity": [0.0, -9.81, 0.0]
}
}
| {
"name": "Office",
"asset": "../../office/Office.gltf",
"players": 1,
"spawn": {
"positions": [
[0.0, 1.0, 0.0]
],
"choice": "random"
},
"collision_mesh": {
"vertices": "Collision_Vertices",
"indices": "Collision_Indices"
},
"physics": {
"gravity": [0.0, -9.81, 0.0]
},
"physics_events": [
{
"event": "desk_lamp_toggle",
"shape": {
"type": "sphere",
"radius": 0.2
},
"pos": [-3.03, 0.73, 0.12],
"desc": "Desk lamp switch"
}
]
}
| Add one lamp "click" / physics event to office json | Add one lamp "click" / physics event to office json
| JSON | bsd-3-clause | RedCraneStudio/redcrane-engine,RedCraneStudio/redcrane-engine,RedCraneStudio/redcrane-engine,RedCraneStudio/redcrane-engine | json | ## Code Before:
{
"name": "Office",
"asset": "../../office/Office.gltf",
"players": 1,
"spawn": {
"positions": [
[0.0, 1.0, 0.0]
],
"choice": "random"
},
"collision_mesh": {
"vertices": "Collision_Vertices",
"indices": "Collision_Indices"
},
"physics": {
"gravity": [0.0, -9.81, 0.0]
}
}
## Instruction:
Add one lamp "click" / physics event to office json
## Code After:
{
"name": "Office",
"asset": "../../office/Office.gltf",
"players": 1,
"spawn": {
"positions": [
[0.0, 1.0, 0.0]
],
"choice": "random"
},
"collision_mesh": {
"vertices": "Collision_Vertices",
"indices": "Collision_Indices"
},
"physics": {
"gravity": [0.0, -9.81, 0.0]
},
"physics_events": [
{
"event": "desk_lamp_toggle",
"shape": {
"type": "sphere",
"radius": 0.2
},
"pos": [-3.03, 0.73, 0.12],
"desc": "Desk lamp switch"
}
]
}
| {
"name": "Office",
"asset": "../../office/Office.gltf",
"players": 1,
"spawn": {
"positions": [
[0.0, 1.0, 0.0]
],
"choice": "random"
},
"collision_mesh": {
"vertices": "Collision_Vertices",
"indices": "Collision_Indices"
},
"physics": {
"gravity": [0.0, -9.81, 0.0]
- }
+ },
? +
+ "physics_events": [
+ {
+ "event": "desk_lamp_toggle",
+ "shape": {
+ "type": "sphere",
+ "radius": 0.2
+ },
+ "pos": [-3.03, 0.73, 0.12],
+ "desc": "Desk lamp switch"
+ }
+ ]
} | 13 | 0.722222 | 12 | 1 |
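The `physics_events` entry added above defines a spherical trigger volume around the lamp switch. As a hedged illustration of how such an entry might be consumed, here is a point-in-sphere check in Python; the field names mirror the JSON, but the function itself is invented and not part of the engine:

```python
import json
import math

# A trimmed copy of the physics event added to the map file above.
event = json.loads("""{
  "event": "desk_lamp_toggle",
  "shape": {"type": "sphere", "radius": 0.2},
  "pos": [-3.03, 0.73, 0.12],
  "desc": "Desk lamp switch"
}""")

def point_in_trigger(evt, point):
    """True when `point` lies inside the event's sphere (only this shape is sketched)."""
    if evt["shape"]["type"] != "sphere":
        raise ValueError("only sphere triggers are handled in this sketch")
    dx, dy, dz = (p - c for p, c in zip(point, evt["pos"]))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= evt["shape"]["radius"]
```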
c631ca483da85d04ef13afcf9e9c0f78a74cfc61 | index.js | index.js | exports.register = function(plugin, options, next) {
// Wait 10 seconds for existing connections to close then exit.
var stop = function() {
plugin.servers[0].stop({
timeout: 10 * 1000
}, function() {
process.exit();
});
};
process.on('SIGTERM', stop);
process.on('SIGINT', stop);
next();
};
exports.register.attributes = {
pkg: require('./package.json')
};
| exports.register = function(server, options, next) {
// Wait 10 seconds for existing connections to close then exit.
var stop = function() {
server.connections[0].stop({
timeout: 10 * 1000
}, function() {
process.exit();
});
};
process.on('SIGTERM', stop);
process.on('SIGINT', stop);
next();
};
exports.register.attributes = {
pkg: require('./package.json')
};
 | Upgrade for compatibility with Hapi 8 | Upgrade for compatibility with Hapi 8
| JavaScript | mit | KyleAMathews/hapi-death | javascript | ## Code Before:
exports.register = function(plugin, options, next) {
// Wait 10 seconds for existing connections to close then exit.
var stop = function() {
plugin.servers[0].stop({
timeout: 10 * 1000
}, function() {
process.exit();
});
};
process.on('SIGTERM', stop);
process.on('SIGINT', stop);
next();
};
exports.register.attributes = {
pkg: require('./package.json')
};
## Instruction:
Upgrade for compatibility with Hapi 8
## Code After:
exports.register = function(server, options, next) {
// Wait 10 seconds for existing connections to close then exit.
var stop = function() {
server.connections[0].stop({
timeout: 10 * 1000
}, function() {
process.exit();
});
};
process.on('SIGTERM', stop);
process.on('SIGINT', stop);
next();
};
exports.register.attributes = {
pkg: require('./package.json')
};
| - exports.register = function(plugin, options, next) {
? ^^^^^^
+ exports.register = function(server, options, next) {
? ^^^^^^
// Wait 10 seconds for existing connections to close then exit.
var stop = function() {
- plugin.servers[0].stop({
+ server.connections[0].stop({
timeout: 10 * 1000
}, function() {
process.exit();
});
};
process.on('SIGTERM', stop);
process.on('SIGINT', stop);
next();
};
exports.register.attributes = {
pkg: require('./package.json')
}; | 4 | 0.210526 | 2 | 2 |
8db347eaae51ea5f0a591bcecd5ba38263379aae | seqio/__init__.py | seqio/__init__.py |
"""Import to top-level API."""
# pylint:disable=wildcard-import,g-bad-import-order
from seqio.dataset_providers import *
from seqio import evaluation
from seqio import experimental
from seqio.evaluation import Evaluator
from seqio.evaluation import TensorAndNumpyEncoder
from seqio.feature_converters import *
from seqio import preprocessors
import seqio.test_utils
from seqio.utils import *
from seqio.vocabularies import *
# Version number.
from seqio.version import __version__
|
"""Import to top-level API."""
# pylint:disable=wildcard-import,g-bad-import-order
from seqio.dataset_providers import *
from seqio import evaluation
from seqio import experimental
from seqio.evaluation import Evaluator
from seqio.evaluation import JSONLogger
from seqio.evaluation import Logger
from seqio.evaluation import TensorAndNumpyEncoder
from seqio.evaluation import TensorBoardLogger
from seqio.feature_converters import *
from seqio import preprocessors
import seqio.test_utils
from seqio.utils import *
from seqio.vocabularies import *
# Version number.
from seqio.version import __version__
| Make loggers part of the top-level SeqIO API. | Make loggers part of the top-level SeqIO API.
PiperOrigin-RevId: 400711636
| Python | apache-2.0 | google/seqio | python | ## Code Before:
"""Import to top-level API."""
# pylint:disable=wildcard-import,g-bad-import-order
from seqio.dataset_providers import *
from seqio import evaluation
from seqio import experimental
from seqio.evaluation import Evaluator
from seqio.evaluation import TensorAndNumpyEncoder
from seqio.feature_converters import *
from seqio import preprocessors
import seqio.test_utils
from seqio.utils import *
from seqio.vocabularies import *
# Version number.
from seqio.version import __version__
## Instruction:
Make loggers part of the top-level SeqIO API.
PiperOrigin-RevId: 400711636
## Code After:
"""Import to top-level API."""
# pylint:disable=wildcard-import,g-bad-import-order
from seqio.dataset_providers import *
from seqio import evaluation
from seqio import experimental
from seqio.evaluation import Evaluator
from seqio.evaluation import JSONLogger
from seqio.evaluation import Logger
from seqio.evaluation import TensorAndNumpyEncoder
from seqio.evaluation import TensorBoardLogger
from seqio.feature_converters import *
from seqio import preprocessors
import seqio.test_utils
from seqio.utils import *
from seqio.vocabularies import *
# Version number.
from seqio.version import __version__
|
"""Import to top-level API."""
# pylint:disable=wildcard-import,g-bad-import-order
from seqio.dataset_providers import *
from seqio import evaluation
from seqio import experimental
from seqio.evaluation import Evaluator
+ from seqio.evaluation import JSONLogger
+ from seqio.evaluation import Logger
from seqio.evaluation import TensorAndNumpyEncoder
+ from seqio.evaluation import TensorBoardLogger
from seqio.feature_converters import *
from seqio import preprocessors
import seqio.test_utils
from seqio.utils import *
from seqio.vocabularies import *
# Version number.
from seqio.version import __version__ | 3 | 0.176471 | 3 | 0 |
73a9e2759e0282b325371fac214812b177642f61 | src/org/camsrobotics/frc2016/Constants.java | src/org/camsrobotics/frc2016/Constants.java | package org.camsrobotics.frc2016;
/**
* Assorted Constants
*
* @author Wesley
*
*/
public class Constants {
/*
* Camera Constants
*/
public final static int cameraFrameHeight = 240;
public final static int cameraFrameWidth = 320;
public final static double cameraFOVAngle = 21.2505055;
/*
* Drive Constants
*/
public final static double kDriveRotationP = 0;
public final static double kDriveRotationI = 0;
public final static double kDriveRotationD = 0;
public final static double kDriveTranslationP = 0;
public final static double kDriveTranslationI = 0;
public final static double kDriveTranslationD = 0;
/*
* Shooter Constants
*/
public final static double kFlywheelP = 0;
public final static double kFlywheelI = 0;
public final static double kFlywheelD = 0;
public final static double kLiftP = 0;
public final static double kLiftI = 0;
public final static double kLiftD = 0;
}
| package org.camsrobotics.frc2016;
/**
* Assorted Constants
*
* @author Wesley
*
*/
public class Constants {
/*
* Camera Constants
*/
public final static int cameraFrameHeight = 240;
public final static int cameraFrameWidth = 320;
public final static double cameraFOVAngle = 21.2505055;
/*
* Drive Constants
*/
public final static double kDriveRotationP = 0;
public final static double kDriveRotationI = 0;
public final static double kDriveRotationD = 0;
public final static double kDriveTranslationP = 0;
public final static double kDriveTranslationI = 0;
public final static double kDriveTranslationD = 0;
/*
* Shooter Constants
*/
public final static double kFlywheelF = 0;
public final static double kFlywheelP = 0;
public final static double kFlywheelI = 0;
public final static double kFlywheelD = 0;
public final static double kLiftF = 0;
public final static double kLiftP = 0;
public final static double kLiftI = 0;
public final static double kLiftD = 0;
}
| Add the feed forward gain | Add the feed forward gain | Java | mit | nerdherd/Stronghold2016 | java | ## Code Before:
package org.camsrobotics.frc2016;
/**
* Assorted Constants
*
* @author Wesley
*
*/
public class Constants {
/*
* Camera Constants
*/
public final static int cameraFrameHeight = 240;
public final static int cameraFrameWidth = 320;
public final static double cameraFOVAngle = 21.2505055;
/*
* Drive Constants
*/
public final static double kDriveRotationP = 0;
public final static double kDriveRotationI = 0;
public final static double kDriveRotationD = 0;
public final static double kDriveTranslationP = 0;
public final static double kDriveTranslationI = 0;
public final static double kDriveTranslationD = 0;
/*
* Shooter Constants
*/
public final static double kFlywheelP = 0;
public final static double kFlywheelI = 0;
public final static double kFlywheelD = 0;
public final static double kLiftP = 0;
public final static double kLiftI = 0;
public final static double kLiftD = 0;
}
## Instruction:
Add the feed forward gain
## Code After:
package org.camsrobotics.frc2016;
/**
* Assorted Constants
*
* @author Wesley
*
*/
public class Constants {
/*
* Camera Constants
*/
public final static int cameraFrameHeight = 240;
public final static int cameraFrameWidth = 320;
public final static double cameraFOVAngle = 21.2505055;
/*
* Drive Constants
*/
public final static double kDriveRotationP = 0;
public final static double kDriveRotationI = 0;
public final static double kDriveRotationD = 0;
public final static double kDriveTranslationP = 0;
public final static double kDriveTranslationI = 0;
public final static double kDriveTranslationD = 0;
/*
* Shooter Constants
*/
public final static double kFlywheelF = 0;
public final static double kFlywheelP = 0;
public final static double kFlywheelI = 0;
public final static double kFlywheelD = 0;
public final static double kLiftF = 0;
public final static double kLiftP = 0;
public final static double kLiftI = 0;
public final static double kLiftD = 0;
}
| package org.camsrobotics.frc2016;
/**
* Assorted Constants
*
* @author Wesley
*
*/
public class Constants {
/*
* Camera Constants
*/
public final static int cameraFrameHeight = 240;
public final static int cameraFrameWidth = 320;
public final static double cameraFOVAngle = 21.2505055;
/*
* Drive Constants
*/
public final static double kDriveRotationP = 0;
public final static double kDriveRotationI = 0;
public final static double kDriveRotationD = 0;
public final static double kDriveTranslationP = 0;
public final static double kDriveTranslationI = 0;
public final static double kDriveTranslationD = 0;
/*
* Shooter Constants
*/
+ public final static double kFlywheelF = 0;
public final static double kFlywheelP = 0;
public final static double kFlywheelI = 0;
public final static double kFlywheelD = 0;
+ public final static double kLiftF = 0;
public final static double kLiftP = 0;
public final static double kLiftI = 0;
public final static double kLiftD = 0;
} | 2 | 0.054054 | 2 | 0 |
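The new `F` constants in this record are feed-forward gains. In a PIDF law, the feed-forward term multiplies the setpoint directly, so the controller supplies most of the demanded output even when the error is zero, and the P/I/D terms only correct the residual. A sketch in Python with made-up gains (chosen for easy arithmetic, not tuned values for this robot):

```python
def pidf_output(setpoint, measured, k_f, k_p, k_i=0.0, k_d=0.0,
                integral=0.0, derivative=0.0):
    """One evaluation of: output = F*setpoint + P*error + I*integral + D*derivative."""
    error = setpoint - measured
    return k_f * setpoint + k_p * error + k_i * integral + k_d * derivative

# Feed-forward alone holds the output at the setpoint; P adds on top while
# the mechanism is still spinning up.
at_speed    = pidf_output(setpoint=3000.0, measured=3000.0, k_f=0.25, k_p=0.5)
spinning_up = pidf_output(setpoint=3000.0, measured=2000.0, k_f=0.25, k_p=0.5)
```

With these illustrative gains, `at_speed` is 750.0 (pure feed-forward at zero error) and `spinning_up` is 1250.0 (feed-forward plus proportional correction).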
fd951edbef26dcab2a4b89036811520b22e77fcf | marry-fuck-kill/main.py | marry-fuck-kill/main.py | from google.appengine.ext import webapp
from google.appengine.ext.webapp import util
import html_handlers
import models
def main():
# TODO(mjkelly): Clean up these handlers.
application = webapp.WSGIApplication([
("/", html_handlers.MainPageHandler),
("/about", html_handlers.AboutHandler),
("/make", html_handlers.MakeHandler),
("/make.do", html_handlers.MakeSubmitHandler),
("/mymfks", html_handlers.MyMfksHandler),
("/vote/(.*)", html_handlers.VoteHandler),
("/vote.do", html_handlers.VoteSubmitHandler),
("/i/(.*)", html_handlers.EntityImageHandler),
("/.*", html_handlers.CatchAllHandler),
])
util.run_wsgi_app(application)
if __name__ == '__main__':
main()
| from google.appengine.ext import webapp
from google.appengine.ext.webapp import util
import html_handlers
import models
def main():
application = webapp.WSGIApplication([
("/", html_handlers.MainPageHandler),
("/about", html_handlers.AboutHandler),
("/make", html_handlers.MakeHandler),
("/make.do", html_handlers.MakeSubmitHandler),
("/mymfks", html_handlers.MyMfksHandler),
("/vote/(.*)", html_handlers.VoteHandler),
("/vote.do", html_handlers.VoteSubmitHandler),
("/i/(.*)", html_handlers.EntityImageHandler),
("/.*", html_handlers.CatchAllHandler),
])
util.run_wsgi_app(application)
if __name__ == '__main__':
main()
| Remove TODO -- handlers have been cleaned up. | Remove TODO -- handlers have been cleaned up.
| Python | apache-2.0 | hjfreyer/marry-fuck-kill,hjfreyer/marry-fuck-kill | python | ## Code Before:
from google.appengine.ext import webapp
from google.appengine.ext.webapp import util
import html_handlers
import models
def main():
# TODO(mjkelly): Clean up these handlers.
application = webapp.WSGIApplication([
("/", html_handlers.MainPageHandler),
("/about", html_handlers.AboutHandler),
("/make", html_handlers.MakeHandler),
("/make.do", html_handlers.MakeSubmitHandler),
("/mymfks", html_handlers.MyMfksHandler),
("/vote/(.*)", html_handlers.VoteHandler),
("/vote.do", html_handlers.VoteSubmitHandler),
("/i/(.*)", html_handlers.EntityImageHandler),
("/.*", html_handlers.CatchAllHandler),
])
util.run_wsgi_app(application)
if __name__ == '__main__':
main()
## Instruction:
Remove TODO -- handlers have been cleaned up.
## Code After:
from google.appengine.ext import webapp
from google.appengine.ext.webapp import util
import html_handlers
import models
def main():
application = webapp.WSGIApplication([
("/", html_handlers.MainPageHandler),
("/about", html_handlers.AboutHandler),
("/make", html_handlers.MakeHandler),
("/make.do", html_handlers.MakeSubmitHandler),
("/mymfks", html_handlers.MyMfksHandler),
("/vote/(.*)", html_handlers.VoteHandler),
("/vote.do", html_handlers.VoteSubmitHandler),
("/i/(.*)", html_handlers.EntityImageHandler),
("/.*", html_handlers.CatchAllHandler),
])
util.run_wsgi_app(application)
if __name__ == '__main__':
main()
| from google.appengine.ext import webapp
from google.appengine.ext.webapp import util
import html_handlers
import models
def main():
- # TODO(mjkelly): Clean up these handlers.
application = webapp.WSGIApplication([
("/", html_handlers.MainPageHandler),
("/about", html_handlers.AboutHandler),
("/make", html_handlers.MakeHandler),
("/make.do", html_handlers.MakeSubmitHandler),
("/mymfks", html_handlers.MyMfksHandler),
("/vote/(.*)", html_handlers.VoteHandler),
("/vote.do", html_handlers.VoteSubmitHandler),
("/i/(.*)", html_handlers.EntityImageHandler),
("/.*", html_handlers.CatchAllHandler),
])
util.run_wsgi_app(application)
if __name__ == '__main__':
main() | 1 | 0.043478 | 0 | 1 |
4b3ccc08827216e02e5082479ff23e90785fa48d | lib/ruby-recursive.rb | lib/ruby-recursive.rb | class Recursive
def initialize(&block)
@block = block
end
def call(*args)
step = RecursiveStep.new
returned = @block.call(step, *args)
while returned.kind_of?(RecursiveStep)
returned = @block.call(step, *returned.args)
end
returned
end
private
class RecursiveStep
attr_reader :args
def call(*args)
@args = args
self
end
end
end
module RecursiveExtension
def self.extended(mod)
mod.module_eval do
def recursive(&block)
Recursive.new(&block)
end
end
end
end
module Kernel
extend RecursiveExtension
end
| class Proc
def self.recursive(&block)
Proc.new do |args|
step = RecursiveStep.new
returned = block.call(step, *args)
while returned.kind_of?(RecursiveStep)
returned = block.call(step, *returned.args)
end
returned
end
end
private
class RecursiveStep
attr_reader :args
def call(*args)
@args = args
self
end
end
end
module RecursiveExtension
def self.extended(mod)
mod.module_eval do
def recursive(&block)
Proc.recursive(&block)
end
end
end
end
module Kernel
extend RecursiveExtension
end
| Refactor underlying implementation to use a Proc | Refactor underlying implementation to use a Proc
| Ruby | mit | seadowg/matilda-function | ruby | ## Code Before:
class Recursive
def initialize(&block)
@block = block
end
def call(*args)
step = RecursiveStep.new
returned = @block.call(step, *args)
while returned.kind_of?(RecursiveStep)
returned = @block.call(step, *returned.args)
end
returned
end
private
class RecursiveStep
attr_reader :args
def call(*args)
@args = args
self
end
end
end
module RecursiveExtension
def self.extended(mod)
mod.module_eval do
def recursive(&block)
Recursive.new(&block)
end
end
end
end
module Kernel
extend RecursiveExtension
end
## Instruction:
Refactor underlying implementation to use a Proc
## Code After:
class Proc
def self.recursive(&block)
Proc.new do |args|
step = RecursiveStep.new
returned = block.call(step, *args)
while returned.kind_of?(RecursiveStep)
returned = block.call(step, *returned.args)
end
returned
end
end
private
class RecursiveStep
attr_reader :args
def call(*args)
@args = args
self
end
end
end
module RecursiveExtension
def self.extended(mod)
mod.module_eval do
def recursive(&block)
Proc.recursive(&block)
end
end
end
end
module Kernel
extend RecursiveExtension
end
| - class Recursive
- def initialize(&block)
- @block = block
- end
+ class Proc
+ def self.recursive(&block)
+ Proc.new do |args|
+ step = RecursiveStep.new
+ returned = block.call(step, *args)
+ while returned.kind_of?(RecursiveStep)
- def call(*args)
- step = RecursiveStep.new
- returned = @block.call(step, *args)
? -
+ returned = block.call(step, *returned.args)
? ++++ +++++++++
+ end
+ returned
- while returned.kind_of?(RecursiveStep)
- returned = @block.call(step, *returned.args)
end
-
- returned
end
private
class RecursiveStep
attr_reader :args
def call(*args)
@args = args
self
end
end
end
module RecursiveExtension
def self.extended(mod)
mod.module_eval do
def recursive(&block)
- Recursive.new(&block)
? ^ ----
+ Proc.recursive(&block)
? ^^^^^^
end
end
end
end
module Kernel
extend RecursiveExtension
end | 22 | 0.536585 | 10 | 12 |
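The refactor in this record is a trampoline: instead of calling itself, the block returns a step object carrying the next arguments, and a driver loop re-invokes the block until something that is not a step comes back, so the native call stack stays flat. The same pattern sketched in Python, with invented names mirroring Ruby's `RecursiveStep`:

```python
class _Step:
    """Marker holding the arguments for the next bounce of the trampoline."""
    def __init__(self, *args):
        self.args = args

def recursive(block):
    """Wrap block(step, *args) in a loop that bounces until a real value returns."""
    def driver(*args):
        result = block(_Step, *args)
        while isinstance(result, _Step):
            result = block(_Step, *result.args)
        return result
    return driver

# Sums 1..n iteratively under the hood; a deep n would overflow real recursion.
sum_to = recursive(lambda step, n, acc=0: acc if n == 0 else step(n - 1, acc + n))
```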
8a5506776e6c553f9a03bc9b4e9bc38bf49f6f07 | lib/scraper.rb | lib/scraper.rb | require "scraper/version"
require "nokogiri"
require "watir-webdriver"
require "headless"
module Scraper
# https://www.hipmunk.com/flights#f=YVR;t=NYC;d=2015-03-06;r=2015-03-08
@base = "https://www.hipmunk.com/flights#"
def self.request(from_code, to_code, departing, returning, base_url=nil)
base_url = base_url || @base
url = "#{base_url}f=#{from_code};t=#{to_code};d=#{departing};r=#{returning}"
headless = Headless.new
headless.start
browser = Watir::Browser.start url
browser.div(:class => "flight-tab-group-info__price-bottom").wait_until_present(10)
text = browser.div(:class => "flight-tab-group-info__price-bottom").text
headless.destroy
text.delete("^0-9")
end
end
| require "scraper/version"
require "nokogiri"
require "watir-webdriver"
require "headless"
module Scraper
# https://www.hipmunk.com/flights#f=YVR;t=NYC;d=2015-03-06;r=2015-03-08
@base = "https://www.hipmunk.com/flights#"
def self.request(from_code, to_code, departing, returning, base_url=nil)
base_url = base_url || @base
url = "#{base_url}f=#{from_code};t=#{to_code};d=#{departing};r=#{returning}"
headless = Headless.new
headless.start
browser = Watir::Browser.start url
browser.div(:class => "flight-tab-group-info__price-bottom").wait_until_present(12)
text = browser.div(:class => "flight-tab-group-info__price-bottom").text
headless.destroy
price = text.delete("^0-9")
flight_data = {}
flight_data[:price] = price
flight_data[:deep_link] = url
flight_data
end
end
| Remove dollar sign from price | Remove dollar sign from price
| Ruby | mit | gabrieltaylor/scraper | ruby | ## Code Before:
require "scraper/version"
require "nokogiri"
require "watir-webdriver"
require "headless"
module Scraper
# https://www.hipmunk.com/flights#f=YVR;t=NYC;d=2015-03-06;r=2015-03-08
@base = "https://www.hipmunk.com/flights#"
def self.request(from_code, to_code, departing, returning, base_url=nil)
base_url = base_url || @base
url = "#{base_url}f=#{from_code};t=#{to_code};d=#{departing};r=#{returning}"
headless = Headless.new
headless.start
browser = Watir::Browser.start url
browser.div(:class => "flight-tab-group-info__price-bottom").wait_until_present(10)
text = browser.div(:class => "flight-tab-group-info__price-bottom").text
headless.destroy
text.delete("^0-9")
end
end
## Instruction:
Remove dollar sign from price
## Code After:
require "scraper/version"
require "nokogiri"
require "watir-webdriver"
require "headless"
module Scraper
# https://www.hipmunk.com/flights#f=YVR;t=NYC;d=2015-03-06;r=2015-03-08
@base = "https://www.hipmunk.com/flights#"
def self.request(from_code, to_code, departing, returning, base_url=nil)
base_url = base_url || @base
url = "#{base_url}f=#{from_code};t=#{to_code};d=#{departing};r=#{returning}"
headless = Headless.new
headless.start
browser = Watir::Browser.start url
browser.div(:class => "flight-tab-group-info__price-bottom").wait_until_present(12)
text = browser.div(:class => "flight-tab-group-info__price-bottom").text
headless.destroy
price = text.delete("^0-9")
flight_data = {}
flight_data[:price] = price
flight_data[:deep_link] = url
flight_data
end
end
| require "scraper/version"
require "nokogiri"
require "watir-webdriver"
require "headless"
module Scraper
# https://www.hipmunk.com/flights#f=YVR;t=NYC;d=2015-03-06;r=2015-03-08
@base = "https://www.hipmunk.com/flights#"
def self.request(from_code, to_code, departing, returning, base_url=nil)
base_url = base_url || @base
url = "#{base_url}f=#{from_code};t=#{to_code};d=#{departing};r=#{returning}"
headless = Headless.new
headless.start
browser = Watir::Browser.start url
- browser.div(:class => "flight-tab-group-info__price-bottom").wait_until_present(10)
? ^
+ browser.div(:class => "flight-tab-group-info__price-bottom").wait_until_present(12)
? ^
text = browser.div(:class => "flight-tab-group-info__price-bottom").text
headless.destroy
- text.delete("^0-9")
+ price = text.delete("^0-9")
? ++++++++
+ flight_data = {}
+ flight_data[:price] = price
+ flight_data[:deep_link] = url
+
+ flight_data
end
end | 9 | 0.375 | 7 | 2 |
edb10e7ae1f428dade04a9976c3b3f985065d458 | settings/__init__.py | settings/__init__.py | from __future__ import print_function
# Standard Library
import sys
if "test" in sys.argv:
print("\033[1;91mNo django tests.\033[0m")
print("Try: \033[1;33mpy.test\033[0m")
sys.exit(0)
from .common import * # noqa
try:
from .dev import * # noqa
from .prod import * # noqa
except ImportError:
pass
| from __future__ import print_function
# Standard Library
import sys
if "test" in sys.argv:
print("\033[1;91mNo django tests.\033[0m")
print("Try: \033[1;33mpy.test\033[0m")
sys.exit(0)
from .common import * # noqa
try:
from .dev import * # noqa
except ImportError:
pass
try:
from .prod import * # noqa
except ImportError:
pass
| Make sure prod.py is read in settings | Make sure prod.py is read in settings
| Python | mit | hTrap/junction,farhaanbukhsh/junction,ChillarAnand/junction,farhaanbukhsh/junction,akshayaurora/junction,NabeelValapra/junction,pythonindia/junction,shashisp/junction,hTrap/junction,ChillarAnand/junction,shashisp/junction,nava45/junction,NabeelValapra/junction,shashisp/junction,farhaanbukhsh/junction,akshayaurora/junction,shashisp/junction,nava45/junction,pythonindia/junction,Rahul91/junction,NabeelValapra/junction,hTrap/junction,Rahul91/junction,Rahul91/junction,praba230890/junction,Rahul91/junction,akshayaurora/junction,nava45/junction,nava45/junction,hTrap/junction,pythonindia/junction,ChillarAnand/junction,pythonindia/junction,akshayaurora/junction,praba230890/junction,NabeelValapra/junction,praba230890/junction,ChillarAnand/junction,farhaanbukhsh/junction,praba230890/junction | python | ## Code Before:
from __future__ import print_function
# Standard Library
import sys
if "test" in sys.argv:
print("\033[1;91mNo django tests.\033[0m")
print("Try: \033[1;33mpy.test\033[0m")
sys.exit(0)
from .common import * # noqa
try:
from .dev import * # noqa
from .prod import * # noqa
except ImportError:
pass
## Instruction:
Make sure prod.py is read in settings
## Code After:
from __future__ import print_function
# Standard Library
import sys
if "test" in sys.argv:
print("\033[1;91mNo django tests.\033[0m")
print("Try: \033[1;33mpy.test\033[0m")
sys.exit(0)
from .common import * # noqa
try:
from .dev import * # noqa
except ImportError:
pass
try:
from .prod import * # noqa
except ImportError:
pass
| from __future__ import print_function
# Standard Library
import sys
if "test" in sys.argv:
print("\033[1;91mNo django tests.\033[0m")
print("Try: \033[1;33mpy.test\033[0m")
sys.exit(0)
from .common import * # noqa
try:
from .dev import * # noqa
+ except ImportError:
+ pass
+
+ try:
from .prod import * # noqa
except ImportError:
pass | 4 | 0.235294 | 4 | 0 |
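The record above fixes a real ordering hazard: with both imports inside one `try`, a missing `dev.py` raises `ImportError` before `prod.py` is ever reached, so production settings silently never load. Splitting them makes each optional import independent. A stand-in demonstration using plain functions instead of real settings modules:

```python
loaded = []

def import_dev():
    raise ImportError("dev settings absent on this host")

def import_prod():
    loaded.append("prod")

# One shared try block: the failing dev import aborts before prod is imported.
loaded.clear()
try:
    import_dev()
    import_prod()
except ImportError:
    pass
single_block = list(loaded)

# Independent try blocks, as in the corrected settings/__init__.py above.
loaded.clear()
try:
    import_dev()
except ImportError:
    pass
try:
    import_prod()
except ImportError:
    pass
separate_blocks = list(loaded)
```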
119025b231b0f3b9077445334fc08d1ad076abfc | generic_links/migrations/0001_initial.py | generic_links/migrations/0001_initial.py | from __future__ import unicode_literals
from django.db import models, migrations
from django.conf import settings
class Migration(migrations.Migration):
dependencies = [
('contenttypes', '0002_remove_content_type_name'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='GenericLink',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('object_id', models.PositiveIntegerField(db_index=True)),
('url', models.URLField()),
('title', models.CharField(max_length=200)),
('description', models.TextField(max_length=1000, null=True, blank=True)),
('created_at', models.DateTimeField(auto_now_add=True, db_index=True)),
('is_external', models.BooleanField(default=True, db_index=True)),
('content_type', models.ForeignKey(to='contenttypes.ContentType')),
('user', models.ForeignKey(blank=True, to=settings.AUTH_USER_MODEL, null=True)),
],
options={
'ordering': ('-created_at',),
'verbose_name': 'Generic Link',
'verbose_name_plural': 'Generic Links',
},
),
]
| from __future__ import unicode_literals
from django.db import models, migrations
from django.conf import settings
class Migration(migrations.Migration):
dependencies = [
('contenttypes', '__first__'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='GenericLink',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('object_id', models.PositiveIntegerField(db_index=True)),
('url', models.URLField()),
('title', models.CharField(max_length=200)),
('description', models.TextField(max_length=1000, null=True, blank=True)),
('created_at', models.DateTimeField(auto_now_add=True, db_index=True)),
('is_external', models.BooleanField(default=True, db_index=True)),
('content_type', models.ForeignKey(to='contenttypes.ContentType')),
('user', models.ForeignKey(blank=True, to=settings.AUTH_USER_MODEL, null=True)),
],
options={
'ordering': ('-created_at',),
'verbose_name': 'Generic Link',
'verbose_name_plural': 'Generic Links',
},
),
]
| Remove Django 1.8 dependency in initial migration | Remove Django 1.8 dependency in initial migration
The ('contenttypes', '0002_remove_content_type_name') migration was part of Django 1.8; replacing it with '__first__' allows the use of Django 1.7 | Python | bsd-3-clause | matagus/django-generic-links,matagus/django-generic-links | python | ## Code Before:
from __future__ import unicode_literals
from django.db import models, migrations
from django.conf import settings
class Migration(migrations.Migration):
dependencies = [
('contenttypes', '0002_remove_content_type_name'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='GenericLink',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('object_id', models.PositiveIntegerField(db_index=True)),
('url', models.URLField()),
('title', models.CharField(max_length=200)),
('description', models.TextField(max_length=1000, null=True, blank=True)),
('created_at', models.DateTimeField(auto_now_add=True, db_index=True)),
('is_external', models.BooleanField(default=True, db_index=True)),
('content_type', models.ForeignKey(to='contenttypes.ContentType')),
('user', models.ForeignKey(blank=True, to=settings.AUTH_USER_MODEL, null=True)),
],
options={
'ordering': ('-created_at',),
'verbose_name': 'Generic Link',
'verbose_name_plural': 'Generic Links',
},
),
]
## Instruction:
Remove Django 1.8 dependency in initial migration
The ('contenttypes', '0002_remove_content_type_name') migration was part of Django 1.8, replacing it with '__first__' allows the use of Django 1.7
## Code After:
from __future__ import unicode_literals
from django.db import models, migrations
from django.conf import settings
class Migration(migrations.Migration):
dependencies = [
('contenttypes', '__first__'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='GenericLink',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('object_id', models.PositiveIntegerField(db_index=True)),
('url', models.URLField()),
('title', models.CharField(max_length=200)),
('description', models.TextField(max_length=1000, null=True, blank=True)),
('created_at', models.DateTimeField(auto_now_add=True, db_index=True)),
('is_external', models.BooleanField(default=True, db_index=True)),
('content_type', models.ForeignKey(to='contenttypes.ContentType')),
('user', models.ForeignKey(blank=True, to=settings.AUTH_USER_MODEL, null=True)),
],
options={
'ordering': ('-created_at',),
'verbose_name': 'Generic Link',
'verbose_name_plural': 'Generic Links',
},
),
]
| from __future__ import unicode_literals
from django.db import models, migrations
from django.conf import settings
class Migration(migrations.Migration):
dependencies = [
- ('contenttypes', '0002_remove_content_type_name'),
+ ('contenttypes', '__first__'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='GenericLink',
fields=[
('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
('object_id', models.PositiveIntegerField(db_index=True)),
('url', models.URLField()),
('title', models.CharField(max_length=200)),
('description', models.TextField(max_length=1000, null=True, blank=True)),
('created_at', models.DateTimeField(auto_now_add=True, db_index=True)),
('is_external', models.BooleanField(default=True, db_index=True)),
('content_type', models.ForeignKey(to='contenttypes.ContentType')),
('user', models.ForeignKey(blank=True, to=settings.AUTH_USER_MODEL, null=True)),
],
options={
'ordering': ('-created_at',),
'verbose_name': 'Generic Link',
'verbose_name_plural': 'Generic Links',
},
),
] | 2 | 0.058824 | 1 | 1 |
5821fab5d73637c3fcdb989a10c020df0f5dd92b | keystone/carrier/shipment.sls | keystone/carrier/shipment.sls | {% set openstack = pillar['openstack'] -%}
{% set keystone = openstack['keystone'] -%}
{% set auth = openstack['auth'] -%}
{% set authurl = auth.get('proto','http') + "://"+ auth['server'] -%}
{% set authpass = keystone['service']['password'] -%}
{% set staging = "/srv/container" %}
include:
- openstack.keystone.user
keystone-conf:
file.managed:
- name: {{staging}}/keystone/keystone.conf
- source: salt://openstack/keystone/conf/keystone.conf
- template: jinja
- user: keystone
- group: keystone
keystonerc:
file.managed:
- name: /root/openrc
- mode: 400
- user: root
- group: root
- contents: |
export OS_USERNAME=admin
export OS_TENANT_NAME="Default Tenant"
export OS_PASSWORD={{authpass}}
export OS_AUTH_URL="{{authurl}}:{{auth.get('public_port',5000)}}/v2.0/"
# export SERVICE_TOKEN={{authpass}}
# export SERVICE_ENDPOINT="{{authurl}}:{{auth.get('port',35357)}}/v2.0/"
| {% set openstack = pillar['openstack'] -%}
{% set keystone = openstack['keystone'] -%}
{% set auth = openstack['auth'] -%}
{% set authurl = auth.get('proto','http') + "://"+ auth['server'] -%}
{% set authpass = keystone['service']['password'] -%}
{% set staging = "/srv/container" %}
include:
- openstack.keystone.user
keystone-conf:
file.managed:
- name: {{staging}}/keystone/keystone.conf
- source: salt://openstack/keystone/conf/keystone.conf
- template: jinja
- user: keystone
- group: keystone
- require:
- user: keystone-user
keystonerc:
file.managed:
- name: {{staging}}/openrc
- mode: 400
- user: root
- group: root
- contents: |
export OS_USERNAME=admin
export OS_TENANT_NAME="Default Tenant"
export OS_PASSWORD={{authpass}}
export OS_AUTH_URL="{{authurl}}:{{auth.get('public_port',5000)}}/v2.0/"
# export SERVICE_TOKEN={{authpass}}
# export SERVICE_ENDPOINT="{{authurl}}:{{auth.get('port',35357)}}/v2.0/"
| Add requirement for keystone user | Add requirement for keystone user
| SaltStack | bsd-2-clause | jahkeup/salt-openstack,jahkeup/salt-openstack | saltstack | ## Code Before:
{% set openstack = pillar['openstack'] -%}
{% set keystone = openstack['keystone'] -%}
{% set auth = openstack['auth'] -%}
{% set authurl = auth.get('proto','http') + "://"+ auth['server'] -%}
{% set authpass = keystone['service']['password'] -%}
{% set staging = "/srv/container" %}
include:
- openstack.keystone.user
keystone-conf:
file.managed:
- name: {{staging}}/keystone/keystone.conf
- source: salt://openstack/keystone/conf/keystone.conf
- template: jinja
- user: keystone
- group: keystone
keystonerc:
file.managed:
- name: /root/openrc
- mode: 400
- user: root
- group: root
- contents: |
export OS_USERNAME=admin
export OS_TENANT_NAME="Default Tenant"
export OS_PASSWORD={{authpass}}
export OS_AUTH_URL="{{authurl}}:{{auth.get('public_port',5000)}}/v2.0/"
# export SERVICE_TOKEN={{authpass}}
# export SERVICE_ENDPOINT="{{authurl}}:{{auth.get('port',35357)}}/v2.0/"
## Instruction:
Add requirement for keystone user
## Code After:
{% set openstack = pillar['openstack'] -%}
{% set keystone = openstack['keystone'] -%}
{% set auth = openstack['auth'] -%}
{% set authurl = auth.get('proto','http') + "://"+ auth['server'] -%}
{% set authpass = keystone['service']['password'] -%}
{% set staging = "/srv/container" %}
include:
- openstack.keystone.user
keystone-conf:
file.managed:
- name: {{staging}}/keystone/keystone.conf
- source: salt://openstack/keystone/conf/keystone.conf
- template: jinja
- user: keystone
- group: keystone
- require:
- user: keystone-user
keystonerc:
file.managed:
- name: {{staging}}/openrc
- mode: 400
- user: root
- group: root
- contents: |
export OS_USERNAME=admin
export OS_TENANT_NAME="Default Tenant"
export OS_PASSWORD={{authpass}}
export OS_AUTH_URL="{{authurl}}:{{auth.get('public_port',5000)}}/v2.0/"
# export SERVICE_TOKEN={{authpass}}
# export SERVICE_ENDPOINT="{{authurl}}:{{auth.get('port',35357)}}/v2.0/"
| {% set openstack = pillar['openstack'] -%}
{% set keystone = openstack['keystone'] -%}
{% set auth = openstack['auth'] -%}
{% set authurl = auth.get('proto','http') + "://"+ auth['server'] -%}
{% set authpass = keystone['service']['password'] -%}
{% set staging = "/srv/container" %}
include:
- openstack.keystone.user
+
keystone-conf:
file.managed:
- name: {{staging}}/keystone/keystone.conf
- source: salt://openstack/keystone/conf/keystone.conf
- template: jinja
- user: keystone
- group: keystone
+ - require:
+ - user: keystone-user
+
keystonerc:
file.managed:
- - name: /root/openrc
+ - name: {{staging}}/openrc
- mode: 400
- user: root
- group: root
- contents: |
export OS_USERNAME=admin
export OS_TENANT_NAME="Default Tenant"
export OS_PASSWORD={{authpass}}
export OS_AUTH_URL="{{authurl}}:{{auth.get('public_port',5000)}}/v2.0/"
# export SERVICE_TOKEN={{authpass}}
# export SERVICE_ENDPOINT="{{authurl}}:{{auth.get('port',35357)}}/v2.0/"
| 6 | 0.2 | 5 | 1 |
7d532168a494e7fdd562ed3b4a2378c4e6b68db6 | Resources/views/FormAdmin/results.html.twig | Resources/views/FormAdmin/results.html.twig | {% extends admin.getTemplate('base_list_field') %}
{% block field%}
{% if object.storeResults %}
{% if object.results|length > 0 %}<a href="{{ path("admin_netvlies_netvliesform_form_results", { "id": object.id }) }}">{% endif %}
{% transchoice admin.datagrid.pager.nbresults with {'%count%': object.results|length} from 'SonataAdminBundle' %}list_results_count{% endtranschoice %}
{% if object.results|length > 0 %}</a>{% endif %}
{% endif %}
{% endblock field %}
| {% extends admin.getTemplate('base_list_field') %}
{% block field%}
{% if object.storeResults %}
{% if object.results|length > 0 %}<a href="{{ admin.generateUrl('results', { id: object.id }) }}">{% endif %}
{% transchoice admin.datagrid.pager.nbresults with {'%count%': object.results|length} from 'SonataAdminBundle' %}list_results_count{% endtranschoice %}
{% if object.results|length > 0 %}</a>{% endif %}
{% endif %}
{% endblock field %}
| Use sonata route generator to create results route | Use sonata route generator to create results route
| Twig | mit | netvlies/NetvliesFormBundle,netvlies/NetvliesFormBundle | twig | ## Code Before:
{% extends admin.getTemplate('base_list_field') %}
{% block field%}
{% if object.storeResults %}
{% if object.results|length > 0 %}<a href="{{ path("admin_netvlies_netvliesform_form_results", { "id": object.id }) }}">{% endif %}
{% transchoice admin.datagrid.pager.nbresults with {'%count%': object.results|length} from 'SonataAdminBundle' %}list_results_count{% endtranschoice %}
{% if object.results|length > 0 %}</a>{% endif %}
{% endif %}
{% endblock field %}
## Instruction:
Use sonata route generator to create results route
## Code After:
{% extends admin.getTemplate('base_list_field') %}
{% block field%}
{% if object.storeResults %}
{% if object.results|length > 0 %}<a href="{{ admin.generateUrl('results', { id: object.id }) }}">{% endif %}
{% transchoice admin.datagrid.pager.nbresults with {'%count%': object.results|length} from 'SonataAdminBundle' %}list_results_count{% endtranschoice %}
{% if object.results|length > 0 %}</a>{% endif %}
{% endif %}
{% endblock field %}
| {% extends admin.getTemplate('base_list_field') %}
{% block field%}
{% if object.storeResults %}
- {% if object.results|length > 0 %}<a href="{{ path("admin_netvlies_netvliesform_form_results", { "id": object.id }) }}">{% endif %}
? ------ ^ ^ ^^^^^^^^^^^^^^^^^^^^^^ ^ - -
+ {% if object.results|length > 0 %}<a href="{{ admin.generateUrl('results', { id: object.id }) }}">{% endif %}
? ^^^ ++ ^^^ ^^ ^
{% transchoice admin.datagrid.pager.nbresults with {'%count%': object.results|length} from 'SonataAdminBundle' %}list_results_count{% endtranschoice %}
{% if object.results|length > 0 %}</a>{% endif %}
{% endif %}
{% endblock field %} | 2 | 0.153846 | 1 | 1 |
3ed8df631a5b731050b036e38bcba6760136cb86 | tasks/write-config.js | tasks/write-config.js | var fs = require('fs');
var config = {
domain: process.env.DOMAIN,
username: process.env.USERNAME,
password: process.env.PASSWORD,
appToken: process.env.APPTOKEN,
};
var control = {
"action" : "Replace",
"csv" :
{
"useSocrataGeocoding": false,
"columns": null,
"skip": 0,
"fixedTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"floatingTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"timezone": "UTC",
"separator": ",",
"quote": "\"",
"encoding": "utf-8",
"emptyTextIsNull": true,
"trimWhitespace": true,
"trimServerWhitespace": true,
"overrides": {}
}
};
fs.writeFile('config.json', JSON.stringify(config), function(err) {
if(err) {
console.log('Failed to write config.json.');
} else {
console.log("config.json written!");
}
});
fs.writeFile('control.json', JSON.stringify(control), function(err) {
if(err) {
console.log('Failed to write control.json.');
} else {
console.log("control.json written!");
}
});
| var fs = require('fs');
var config = {
domain: process.env.DOMAIN,
username: process.env.USERNAME,
password: process.env.PASSWORD,
appToken: process.env.APPTOKEN,
adminEmail: "",
emailUponError: "false",
logDatasetID: "",
outgoingMailServer: "",
smtpPort: "",
sslPort: "",
smtpUsername: "",
smtpPassword: "",
filesizeChunkingCutoffMB: "10",
numRowsPerChunk: "10000"
};
var control = {
"action" : "Replace",
"csv" :
{
"useSocrataGeocoding": false,
"columns": null,
"skip": 0,
"fixedTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"floatingTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"timezone": "UTC",
"separator": ",",
"quote": "\"",
"encoding": "utf-8",
"emptyTextIsNull": true,
"trimWhitespace": true,
"trimServerWhitespace": true,
"overrides": {}
}
};
fs.writeFile('config.json', JSON.stringify(config), function(err) {
if(err) {
console.log('Failed to write config.json.');
} else {
console.log("config.json written!");
}
});
fs.writeFile('control.json', JSON.stringify(control), function(err) {
if(err) {
console.log('Failed to write control.json.');
} else {
console.log("control.json written!");
}
});
| Change config to match socrata docs, even though other fields aren't required? | Change config to match socrata docs, even though other fields aren't required?
| JavaScript | mit | seabre/bicycleparkingsynctest | javascript | ## Code Before:
var fs = require('fs');
var config = {
domain: process.env.DOMAIN,
username: process.env.USERNAME,
password: process.env.PASSWORD,
appToken: process.env.APPTOKEN,
};
var control = {
"action" : "Replace",
"csv" :
{
"useSocrataGeocoding": false,
"columns": null,
"skip": 0,
"fixedTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"floatingTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"timezone": "UTC",
"separator": ",",
"quote": "\"",
"encoding": "utf-8",
"emptyTextIsNull": true,
"trimWhitespace": true,
"trimServerWhitespace": true,
"overrides": {}
}
};
fs.writeFile('config.json', JSON.stringify(config), function(err) {
if(err) {
console.log('Failed to write config.json.');
} else {
console.log("config.json written!");
}
});
fs.writeFile('control.json', JSON.stringify(control), function(err) {
if(err) {
console.log('Failed to write control.json.');
} else {
console.log("control.json written!");
}
});
## Instruction:
Change config to match socrata docs, even though other fields aren't required?
## Code After:
var fs = require('fs');
var config = {
domain: process.env.DOMAIN,
username: process.env.USERNAME,
password: process.env.PASSWORD,
appToken: process.env.APPTOKEN,
adminEmail: "",
emailUponError: "false",
logDatasetID: "",
outgoingMailServer: "",
smtpPort: "",
sslPort: "",
smtpUsername: "",
smtpPassword: "",
filesizeChunkingCutoffMB: "10",
numRowsPerChunk: "10000"
};
var control = {
"action" : "Replace",
"csv" :
{
"useSocrataGeocoding": false,
"columns": null,
"skip": 0,
"fixedTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"floatingTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"timezone": "UTC",
"separator": ",",
"quote": "\"",
"encoding": "utf-8",
"emptyTextIsNull": true,
"trimWhitespace": true,
"trimServerWhitespace": true,
"overrides": {}
}
};
fs.writeFile('config.json', JSON.stringify(config), function(err) {
if(err) {
console.log('Failed to write config.json.');
} else {
console.log("config.json written!");
}
});
fs.writeFile('control.json', JSON.stringify(control), function(err) {
if(err) {
console.log('Failed to write control.json.');
} else {
console.log("control.json written!");
}
});
| var fs = require('fs');
var config = {
domain: process.env.DOMAIN,
username: process.env.USERNAME,
password: process.env.PASSWORD,
appToken: process.env.APPTOKEN,
+ adminEmail: "",
+ emailUponError: "false",
+ logDatasetID: "",
+ outgoingMailServer: "",
+ smtpPort: "",
+ sslPort: "",
+ smtpUsername: "",
+ smtpPassword: "",
+ filesizeChunkingCutoffMB: "10",
+ numRowsPerChunk: "10000"
};
var control = {
"action" : "Replace",
"csv" :
{
"useSocrataGeocoding": false,
"columns": null,
"skip": 0,
"fixedTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"floatingTimestampFormat": ["ISO8601","MM/dd/yy","MM/dd/yyyy","dd-MMM-yyyy"],
"timezone": "UTC",
"separator": ",",
"quote": "\"",
"encoding": "utf-8",
"emptyTextIsNull": true,
"trimWhitespace": true,
"trimServerWhitespace": true,
"overrides": {}
}
};
fs.writeFile('config.json', JSON.stringify(config), function(err) {
if(err) {
console.log('Failed to write config.json.');
} else {
console.log("config.json written!");
}
});
fs.writeFile('control.json', JSON.stringify(control), function(err) {
if(err) {
console.log('Failed to write control.json.');
} else {
console.log("control.json written!");
}
}); | 10 | 0.227273 | 10 | 0 |
8f7c24b71da3b51d845f1f0e40c2fafa88ec2b31 | examples/multi_curl_stop.php | examples/multi_curl_stop.php | <?php
require __DIR__ . '/../vendor/autoload.php';
use Curl\MultiCurl;
$multi_curl = new MultiCurl();
// Count the number of completed requests.
$request_count = 0;
$multi_curl->complete(function ($instance) use (&$request_count) {
$request_count += 1;
});
$multi_curl->success(function ($instance) use (&$request_count, $multi_curl) {
echo 'call to "' . $instance->url . '" was successful.' . "\n";
// Stop pending requests and attempt to stop active requests after the first
// successful request.
$multi_curl->stop();
});
$multi_curl->addGet('https://httpbin.org/delay/4');
$multi_curl->addGet('https://httpbin.org/delay/1');
$multi_curl->addGet('https://httpbin.org/delay/3');
$multi_curl->addGet('https://httpbin.org/delay/2');
$multi_curl->start();
assert($request_count === 1);
| <?php
require __DIR__ . '/../vendor/autoload.php';
use Curl\MultiCurl;
$multi_curl = new MultiCurl();
$multi_curl->beforeSend(function ($instance) {
echo 'about to make request ' . $instance->id . ': "' . $instance->url . '".' . "\n";
});
$multi_curl->success(function ($instance) use (&$request_count, $multi_curl) {
echo 'call to "' . $instance->url . '" was successful.' . "\n";
// Stop pending requests and attempt to stop active requests after the first
// successful request.
$multi_curl->stop();
});
$multi_curl->error(function ($instance) {
echo 'call to "' . $instance->url . '" was unsuccessful.' . "\n";
});
// Count the number of completed requests.
$request_count = 0;
$multi_curl->complete(function ($instance) use (&$request_count) {
echo 'call to "' . $instance->url . '" completed.' . "\n";
$request_count += 1;
});
$multi_curl->addGet('https://httpbin.org/delay/4');
$multi_curl->addGet('https://httpbin.org/delay/1');
$multi_curl->addGet('https://httpbin.org/delay/3');
$multi_curl->addGet('https://httpbin.org/delay/2');
$multi_curl->start();
assert($request_count === 1);
| Add additional logging to display progression | Add additional logging to display progression
| PHP | unlicense | zachborboa/php-curl-class,php-curl-class/php-curl-class,zachborboa/php-curl-class,php-curl-class/php-curl-class,php-curl-class/php-curl-class,zachborboa/php-curl-class | php | ## Code Before:
<?php
require __DIR__ . '/../vendor/autoload.php';
use Curl\MultiCurl;
$multi_curl = new MultiCurl();
// Count the number of completed requests.
$request_count = 0;
$multi_curl->complete(function ($instance) use (&$request_count) {
$request_count += 1;
});
$multi_curl->success(function ($instance) use (&$request_count, $multi_curl) {
echo 'call to "' . $instance->url . '" was successful.' . "\n";
// Stop pending requests and attempt to stop active requests after the first
// successful request.
$multi_curl->stop();
});
$multi_curl->addGet('https://httpbin.org/delay/4');
$multi_curl->addGet('https://httpbin.org/delay/1');
$multi_curl->addGet('https://httpbin.org/delay/3');
$multi_curl->addGet('https://httpbin.org/delay/2');
$multi_curl->start();
assert($request_count === 1);
## Instruction:
Add additional logging to display progression
## Code After:
<?php
require __DIR__ . '/../vendor/autoload.php';
use Curl\MultiCurl;
$multi_curl = new MultiCurl();
$multi_curl->beforeSend(function ($instance) {
echo 'about to make request ' . $instance->id . ': "' . $instance->url . '".' . "\n";
});
$multi_curl->success(function ($instance) use (&$request_count, $multi_curl) {
echo 'call to "' . $instance->url . '" was successful.' . "\n";
// Stop pending requests and attempt to stop active requests after the first
// successful request.
$multi_curl->stop();
});
$multi_curl->error(function ($instance) {
echo 'call to "' . $instance->url . '" was unsuccessful.' . "\n";
});
// Count the number of completed requests.
$request_count = 0;
$multi_curl->complete(function ($instance) use (&$request_count) {
echo 'call to "' . $instance->url . '" completed.' . "\n";
$request_count += 1;
});
$multi_curl->addGet('https://httpbin.org/delay/4');
$multi_curl->addGet('https://httpbin.org/delay/1');
$multi_curl->addGet('https://httpbin.org/delay/3');
$multi_curl->addGet('https://httpbin.org/delay/2');
$multi_curl->start();
assert($request_count === 1);
| <?php
require __DIR__ . '/../vendor/autoload.php';
use Curl\MultiCurl;
$multi_curl = new MultiCurl();
+ $multi_curl->beforeSend(function ($instance) {
+ echo 'about to make request ' . $instance->id . ': "' . $instance->url . '".' . "\n";
- // Count the number of completed requests.
- $request_count = 0;
- $multi_curl->complete(function ($instance) use (&$request_count) {
- $request_count += 1;
});
$multi_curl->success(function ($instance) use (&$request_count, $multi_curl) {
echo 'call to "' . $instance->url . '" was successful.' . "\n";
// Stop pending requests and attempt to stop active requests after the first
// successful request.
$multi_curl->stop();
});
+ $multi_curl->error(function ($instance) {
+ echo 'call to "' . $instance->url . '" was unsuccessful.' . "\n";
+ });
+
+ // Count the number of completed requests.
+ $request_count = 0;
+ $multi_curl->complete(function ($instance) use (&$request_count) {
+ echo 'call to "' . $instance->url . '" completed.' . "\n";
+ $request_count += 1;
+ });
+
$multi_curl->addGet('https://httpbin.org/delay/4');
$multi_curl->addGet('https://httpbin.org/delay/1');
$multi_curl->addGet('https://httpbin.org/delay/3');
$multi_curl->addGet('https://httpbin.org/delay/2');
$multi_curl->start();
assert($request_count === 1); | 17 | 0.586207 | 13 | 4 |
7d0c6ab38722b7a6e3d714b3a0ad577bd4ffa433 | recipes/plant_tribes_assembly_post_processor/0.2/meta.yaml | recipes/plant_tribes_assembly_post_processor/0.2/meta.yaml | package:
name: plant_tribes_assembly_post_processor
version: "0.2"
source:
fn: v0.2.tar.gz
url: https://github.com/dePamphilis/PlantTribes/archive/v0.2.tar.gz
md5: 06c7fdb7eaf091a8fe9f826866a1a206
build:
number: 0
# Requires genometools which is not supported on osx
skip: True # [osx]
requirements:
run:
- perl-estscan2 >=2,<3
- transdecoder >=2,<3
- genometools-genometools >=1,<2
test:
commands:
- AssemblyPostProcesser 2>&1 | grep Usage
about:
home: 'https://github.com/dePamphilis/PlantTribes'
summary: 'Transcriptome assembly post processing pipeline'
license: GNU General Public License v3 (GPLv3)
| package:
name: plant_tribes_assembly_post_processor
version: "0.2"
source:
fn: v0.2.tar.gz
url: https://github.com/dePamphilis/PlantTribes/archive/v0.2.tar.gz
md5: 06c7fdb7eaf091a8fe9f826866a1a206
build:
number: 1
# Requires genometools which is not supported on osx
skip: True # [osx]
requirements:
run:
- perl-estscan2 >=2,<3
- transdecoder >=2,<3
- genometools-genometools >=1,<2
test:
commands:
- AssemblyPostProcesser 2>&1 | grep Usage
- perl -e 'use ESTScan; my @mat = ESTScan::LoadMatrix("Hs.smat", "CODING", 4.0);'
- TransDecoder.Predict -h 2>&1 | grep USAGE
about:
home: 'https://github.com/dePamphilis/PlantTribes'
summary: 'Transcriptome assembly post processing pipeline'
license: GNU General Public License v3 (GPLv3)
| Add additional tests for PlantTribes AssemblyPostProcesser version 0.2 | Add additional tests for PlantTribes AssemblyPostProcesser version 0.2
YAML | mit | saketkc/bioconda-recipes,matthdsm/bioconda-recipes,xguse/bioconda-recipes,ostrokach/bioconda-recipes,joachimwolff/bioconda-recipes,joachimwolff/bioconda-recipes,lpantano/recipes,chapmanb/bioconda-recipes,ivirshup/bioconda-recipes,phac-nml/bioconda-recipes,chapmanb/bioconda-recipes,rob-p/bioconda-recipes,colinbrislawn/bioconda-recipes,jfallmann/bioconda-recipes,CGATOxford/bioconda-recipes,matthdsm/bioconda-recipes,cokelaer/bioconda-recipes,martin-mann/bioconda-recipes,shenwei356/bioconda-recipes,gvlproject/bioconda-recipes,daler/bioconda-recipes,jfallmann/bioconda-recipes,acaprez/recipes,mcornwell1957/bioconda-recipes,peterjc/bioconda-recipes,joachimwolff/bioconda-recipes,keuv-grvl/bioconda-recipes,omicsnut/bioconda-recipes,shenwei356/bioconda-recipes,daler/bioconda-recipes,bioconda/bioconda-recipes,bebatut/bioconda-recipes,colinbrislawn/bioconda-recipes,mcornwell1957/bioconda-recipes,jasper1918/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,keuv-grvl/bioconda-recipes,bioconda/bioconda-recipes,oena/bioconda-recipes,oena/bioconda-recipes,mdehollander/bioconda-recipes,guowei-he/bioconda-recipes,daler/bioconda-recipes,guowei-he/bioconda-recipes,ostrokach/bioconda-recipes,ostrokach/bioconda-recipes,zwanli/bioconda-recipes,ThomasWollmann/bioconda-recipes,matthdsm/bioconda-recipes,blankenberg/bioconda-recipes,hardingnj/bioconda-recipes,phac-nml/bioconda-recipes,matthdsm/bioconda-recipes,joachimwolff/bioconda-recipes,bioconda/recipes,saketkc/bioconda-recipes,keuv-grvl/bioconda-recipes,colinbrislawn/bioconda-recipes,zwanli/bioconda-recipes,HassanAmr/bioconda-recipes,acaprez/recipes,matthdsm/bioconda-recipes,omicsnut/bioconda-recipes,jfallmann/bioconda-recipes,gvlproject/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,npavlovikj/bioconda-recipes,phac-nml/bioconda-recipes,saketkc/bioconda-recipes,JenCabral/bioconda-recipes,abims-sbr/bioconda-recipes,zwanli/bioconda-recipes,omicsnut/bioconda-recipes,xguse/bioconda-recipes,rvalieris/bioconda-recipes,mcornwell1957/bioconda-recipes,HassanAmr/bioconda-recipes,phac-nml/bioconda-recipes,ivirshup/bioconda-recipes,daler/bioconda-recipes,pinguinkiste/bioconda-recipes,mdehollander/bioconda-recipes,mcornwell1957/bioconda-recipes,oena/bioconda-recipes,CGATOxford/bioconda-recipes,abims-sbr/bioconda-recipes,guowei-he/bioconda-recipes,dkoppstein/recipes,rvalieris/bioconda-recipes,bebatut/bioconda-recipes,guowei-he/bioconda-recipes,phac-nml/bioconda-recipes,cokelaer/bioconda-recipes,colinbrislawn/bioconda-recipes,xguse/bioconda-recipes,bebatut/bioconda-recipes,CGATOxford/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,rvalieris/bioconda-recipes,daler/bioconda-recipes,ivirshup/bioconda-recipes,CGATOxford/bioconda-recipes,dmaticzka/bioconda-recipes,bow/bioconda-recipes,keuv-grvl/bioconda-recipes,npavlovikj/bioconda-recipes,rob-p/bioconda-recipes,zachcp/bioconda-recipes,cokelaer/bioconda-recipes,martin-mann/bioconda-recipes,blankenberg/bioconda-recipes,mdehollander/bioconda-recipes,oena/bioconda-recipes,Luobiny/bioconda-recipes,bow/bioconda-recipes,dmaticzka/bioconda-recipes,gvlproject/bioconda-recipes,gvlproject/bioconda-recipes,dmaticzka/bioconda-recipes,blankenberg/bioconda-recipes,guowei-he/bioconda-recipes,zachcp/bioconda-recipes,abims-sbr/bioconda-recipes,roryk/recipes,JenCabral/bioconda-recipes,bioconda/bioconda-recipes,bioconda/recipes,roryk/recipes,ThomasWollmann/bioconda-recipes,oena/bioconda-recipes,acaprez/recipes,daler/bioconda-recipes,ThomasWollmann/bioconda-recipes,peterjc/bioconda-recipes,shenwei356/bioconda-recipes,omicsnut/bioconda-recipes,Luobiny/bioconda-recipes,rvalieris/bioconda-recipes,ivirshup/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,dkoppstein/recipes,bow/bioconda-recipes,saketkc/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,rob-p/bioconda-recipes,zachcp/bioconda-recipes,saketkc/bioconda-recipes,jasper1918/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,gregvonkuster/bioconda-recipes,gvlproject/bioconda-recipes,gregvonkuster/bioconda-recipes,pinguinkiste/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,mdehollander/bioconda-recipes,colinbrislawn/bioconda-recipes,peterjc/bioconda-recipes,CGATOxford/bioconda-recipes,abims-sbr/bioconda-recipes,HassanAmr/bioconda-recipes,dkoppstein/recipes,ostrokach/bioconda-recipes,bow/bioconda-recipes,peterjc/bioconda-recipes,ostrokach/bioconda-recipes,acaprez/recipes,rob-p/bioconda-recipes,npavlovikj/bioconda-recipes,bow/bioconda-recipes,abims-sbr/bioconda-recipes,bow/bioconda-recipes,zwanli/bioconda-recipes,hardingnj/bioconda-recipes,JenCabral/bioconda-recipes,peterjc/bioconda-recipes,hardingnj/bioconda-recipes,pinguinkiste/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,lpantano/recipes,mcornwell1957/bioconda-recipes,zwanli/bioconda-recipes,JenCabral/bioconda-recipes,peterjc/bioconda-recipes,blankenberg/bioconda-recipes,npavlovikj/bioconda-recipes,gregvonkuster/bioconda-recipes,jasper1918/bioconda-recipes,ivirshup/bioconda-recipes,gregvonkuster/bioconda-recipes,chapmanb/bioconda-recipes,bioconda/recipes,pinguinkiste/bioconda-recipes,CGATOxford/bioconda-recipes,colinbrislawn/bioconda-recipes,dmaticzka/bioconda-recipes,HassanAmr/bioconda-recipes,joachimwolff/bioconda-recipes,keuv-grvl/bioconda-recipes,Luobiny/bioconda-recipes,jasper1918/bioconda-recipes,chapmanb/bioconda-recipes,martin-mann/bioconda-recipes,dmaticzka/bioconda-recipes,bebatut/bioconda-recipes,martin-mann/bioconda-recipes,ostrokach/bioconda-recipes,omicsnut/bioconda-recipes,ThomasWollmann/bioconda-recipes,Luobiny/bioconda-recipes,mdehollander/bioconda-recipes,JenCabral/bioconda-recipes,lpantano/recipes,pinguinkiste/bioconda-recipes,martin-mann/bioconda-recipes,JenCabral/bioconda-recipes,mdehollander/bioconda-recipes,matthdsm/bioconda-recipes,chapmanb/bioconda-recipes,joachimwolff/bioconda-recipes,jasper1918/bioconda-recipes,lpantano/recipes,xguse/bioconda-recipes,bioconda/bioconda-recipes,zwanli/bioconda-recipes,zachcp/bioconda-recipes,pinguinkiste/bioconda-recipes,dmaticzka/bioconda-recipes,abims-sbr/bioconda-recipes,shenwei356/bioconda-recipes,roryk/recipes,ThomasWollmann/bioconda-recipes,rvalieris/bioconda-recipes,cokelaer/bioconda-recipes,gvlproject/bioconda-recipes,hardingnj/bioconda-recipes,jfallmann/bioconda-recipes,hardingnj/bioconda-recipes,ThomasWollmann/bioconda-recipes,saketkc/bioconda-recipes,HassanAmr/bioconda-recipes,ivirshup/bioconda-recipes,xguse/bioconda-recipes,HassanAmr/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,keuv-grvl/bioconda-recipes,rvalieris/bioconda-recipes | yaml | ## Code Before:
package:
name: plant_tribes_assembly_post_processor
version: "0.2"
source:
fn: v0.2.tar.gz
url: https://github.com/dePamphilis/PlantTribes/archive/v0.2.tar.gz
md5: 06c7fdb7eaf091a8fe9f826866a1a206
build:
number: 0
# Requires genometools which is not supported on osx
skip: True # [osx]
requirements:
run:
- perl-estscan2 >=2,<3
- transdecoder >=2,<3
- genometools-genometools >=1,<2
test:
commands:
- AssemblyPostProcesser 2>&1 | grep Usage
about:
home: 'https://github.com/dePamphilis/PlantTribes'
summary: 'Transcriptome assembly post processing pipeline'
license: GNU General Public License v3 (GPLv3)
## Instruction:
Add additional tests for PlantTribes AssemblyPostProcesser version 0.2
## Code After:
package:
name: plant_tribes_assembly_post_processor
version: "0.2"
source:
fn: v0.2.tar.gz
url: https://github.com/dePamphilis/PlantTribes/archive/v0.2.tar.gz
md5: 06c7fdb7eaf091a8fe9f826866a1a206
build:
number: 1
# Requires genometools which is not supported on osx
skip: True # [osx]
requirements:
run:
- perl-estscan2 >=2,<3
- transdecoder >=2,<3
- genometools-genometools >=1,<2
test:
commands:
- AssemblyPostProcesser 2>&1 | grep Usage
- perl -e 'use ESTScan; my @mat = ESTScan::LoadMatrix("Hs.smat", "CODING", 4.0);'
- TransDecoder.Predict -h 2>&1 | grep USAGE
about:
home: 'https://github.com/dePamphilis/PlantTribes'
summary: 'Transcriptome assembly post processing pipeline'
license: GNU General Public License v3 (GPLv3)
| package:
name: plant_tribes_assembly_post_processor
version: "0.2"
source:
fn: v0.2.tar.gz
url: https://github.com/dePamphilis/PlantTribes/archive/v0.2.tar.gz
md5: 06c7fdb7eaf091a8fe9f826866a1a206
build:
- number: 0
? ^
+ number: 1
? ^
# Requires genometools which is not supported on osx
skip: True # [osx]
requirements:
run:
- perl-estscan2 >=2,<3
- transdecoder >=2,<3
- genometools-genometools >=1,<2
test:
commands:
- AssemblyPostProcesser 2>&1 | grep Usage
+ - perl -e 'use ESTScan; my @mat = ESTScan::LoadMatrix("Hs.smat", "CODING", 4.0);'
+ - TransDecoder.Predict -h 2>&1 | grep USAGE
about:
home: 'https://github.com/dePamphilis/PlantTribes'
summary: 'Transcriptome assembly post processing pipeline'
license: GNU General Public License v3 (GPLv3) | 4 | 0.142857 | 3 | 1 |
05fc957280fecbc99c8f58897a06e23dcc4b9453 | elections/uk/forms.py | elections/uk/forms.py |
from __future__ import unicode_literals
from django import forms
from django.core.exceptions import ValidationError
from candidates.mapit import BaseMapItException
from popolo.models import Area
from compat import text_type
from .mapit import get_areas_from_postcode
class PostcodeForm(forms.Form):
q = forms.CharField(
label='Enter a candidate name or postcode',
max_length=200,
widget=forms.TextInput(attrs={'placeholder': 'Enter a name'})
)
def clean_postcode(self):
postcode = self.cleaned_data['postcode']
try:
# Go to MapIt to check if this postcode is valid and
# contained in a constituency. (If it's valid then the
# result is cached, so this doesn't cause a double lookup.)
get_areas_from_postcode(postcode)
except BaseMapItException as e:
raise ValidationError(text_type(e))
return postcode
|
from __future__ import unicode_literals
from django import forms
from django.core.exceptions import ValidationError
from candidates.mapit import BaseMapItException
from popolo.models import Area
from compat import text_type
from .mapit import get_areas_from_postcode
class PostcodeForm(forms.Form):
q = forms.CharField(
label='Enter a candidate name or postcode',
max_length=200,
widget=forms.TextInput(attrs={'placeholder': 'Enter a name'})
)
def clean_q(self):
postcode = self.cleaned_data['q']
try:
# Go to MapIt to check if this postcode is valid and
# contained in a constituency. (If it's valid then the
# result is cached, so this doesn't cause a double lookup.)
get_areas_from_postcode(postcode)
except BaseMapItException as e:
raise ValidationError(text_type(e))
return postcode
| Fix the postcode form so that it's actually validating the input | Fix the postcode form so that it's actually validating the input
| Python | agpl-3.0 | DemocracyClub/yournextrepresentative,DemocracyClub/yournextrepresentative,DemocracyClub/yournextrepresentative | python | ## Code Before:
from __future__ import unicode_literals
from django import forms
from django.core.exceptions import ValidationError
from candidates.mapit import BaseMapItException
from popolo.models import Area
from compat import text_type
from .mapit import get_areas_from_postcode
class PostcodeForm(forms.Form):
q = forms.CharField(
label='Enter a candidate name or postcode',
max_length=200,
widget=forms.TextInput(attrs={'placeholder': 'Enter a name'})
)
def clean_postcode(self):
postcode = self.cleaned_data['postcode']
try:
# Go to MapIt to check if this postcode is valid and
# contained in a constituency. (If it's valid then the
# result is cached, so this doesn't cause a double lookup.)
get_areas_from_postcode(postcode)
except BaseMapItException as e:
raise ValidationError(text_type(e))
return postcode
## Instruction:
Fix the postcode form so that it's actually validating the input
## Code After:
from __future__ import unicode_literals
from django import forms
from django.core.exceptions import ValidationError
from candidates.mapit import BaseMapItException
from popolo.models import Area
from compat import text_type
from .mapit import get_areas_from_postcode
class PostcodeForm(forms.Form):
q = forms.CharField(
label='Enter a candidate name or postcode',
max_length=200,
widget=forms.TextInput(attrs={'placeholder': 'Enter a name'})
)
def clean_q(self):
postcode = self.cleaned_data['q']
try:
# Go to MapIt to check if this postcode is valid and
# contained in a constituency. (If it's valid then the
# result is cached, so this doesn't cause a double lookup.)
get_areas_from_postcode(postcode)
except BaseMapItException as e:
raise ValidationError(text_type(e))
return postcode
|
from __future__ import unicode_literals
from django import forms
from django.core.exceptions import ValidationError
from candidates.mapit import BaseMapItException
from popolo.models import Area
from compat import text_type
from .mapit import get_areas_from_postcode
class PostcodeForm(forms.Form):
q = forms.CharField(
label='Enter a candidate name or postcode',
max_length=200,
widget=forms.TextInput(attrs={'placeholder': 'Enter a name'})
)
- def clean_postcode(self):
? ^^^^^^^^
+ def clean_q(self):
? ^
- postcode = self.cleaned_data['postcode']
? ^^^^^^^^
+ postcode = self.cleaned_data['q']
? ^
try:
# Go to MapIt to check if this postcode is valid and
# contained in a constituency. (If it's valid then the
# result is cached, so this doesn't cause a double lookup.)
get_areas_from_postcode(postcode)
except BaseMapItException as e:
raise ValidationError(text_type(e))
return postcode | 4 | 0.133333 | 2 | 2 |
0189ebb5875ca7c7b040edfc19261c6f9f9d4f1a | .travis.yml | .travis.yml | sudo: false
language: ruby
rvm:
- 2.6.1
- 2.5.3
gemfile:
- gemfiles/rails_5.1.gemfile
- gemfiles/rails_5.2.gemfile
before_install: gem install bundler -v 1.17.1
| sudo: false
language: ruby
rvm:
- 2.6.1
- 2.5.3
gemfile:
- gemfiles/rails_5.1.gemfile
- gemfiles/rails_5.2.gemfile
| Use bundled version of bundler | Use bundled version of bundler
| YAML | mit | nowlinuxing/kushojin,nowlinuxing/kushojin | yaml | ## Code Before:
sudo: false
language: ruby
rvm:
- 2.6.1
- 2.5.3
gemfile:
- gemfiles/rails_5.1.gemfile
- gemfiles/rails_5.2.gemfile
before_install: gem install bundler -v 1.17.1
## Instruction:
Use bundled version of bundler
## Code After:
sudo: false
language: ruby
rvm:
- 2.6.1
- 2.5.3
gemfile:
- gemfiles/rails_5.1.gemfile
- gemfiles/rails_5.2.gemfile
| sudo: false
language: ruby
rvm:
- 2.6.1
- 2.5.3
gemfile:
- gemfiles/rails_5.1.gemfile
- gemfiles/rails_5.2.gemfile
- before_install: gem install bundler -v 1.17.1 | 1 | 0.111111 | 0 | 1 |
2c035facdf26442873ee06506a579317dc67006a | _layouts/post.html | _layouts/post.html | ---
layout: default
---
<div class="post">
<h1 class="post-title">{{ page.title }}</h1>
<span class="post-date">{{ page.date | date_to_string }}</span>
<div id="renderTOC">
<table class="toc" id="toc"><tbody><tr><td>
<div id="toctitle"><strong>Contents</strong></div>
<ol data-toc="div.container" data-toc-headings="h2,h3"></ol>
</td></tr></tbody></table>
</div>
{{ content }}
</div>
| ---
layout: default
---
<div class="post">
<h1 class="post-title">{{ page.title }}</h1>
<span class="post-date">{{ page.date | date_to_string }}</span>
<div id="renderTOC">
<table class="toc" id="toc"><tbody><tr><td>
<div id="toctitle"><strong>Contents</strong></div>
<ol id='toccontainer' data-toc="div.container" data-toc-headings="h2,h3"></ol>
</td></tr></tbody></table>
<script>
$( document ).ready(function() {
if( $("#toccontainer li").length <= 0 ) {
$("#renderTOC").hide();
}
});
</script>
</div>
{{ content }}
</div>
| Hide the TOC when there are no elements. | Hide the TOC when there are no elements.
| HTML | mit | funvill/funvill.github.io,funvill/funvill.github.io,funvill/funvill.github.io | html | ## Code Before:
---
layout: default
---
<div class="post">
<h1 class="post-title">{{ page.title }}</h1>
<span class="post-date">{{ page.date | date_to_string }}</span>
<div id="renderTOC">
<table class="toc" id="toc"><tbody><tr><td>
<div id="toctitle"><strong>Contents</strong></div>
<ol data-toc="div.container" data-toc-headings="h2,h3"></ol>
</td></tr></tbody></table>
</div>
{{ content }}
</div>
## Instruction:
Hide the TOC when there are no elements.
## Code After:
---
layout: default
---
<div class="post">
<h1 class="post-title">{{ page.title }}</h1>
<span class="post-date">{{ page.date | date_to_string }}</span>
<div id="renderTOC">
<table class="toc" id="toc"><tbody><tr><td>
<div id="toctitle"><strong>Contents</strong></div>
<ol id='toccontainer' data-toc="div.container" data-toc-headings="h2,h3"></ol>
</td></tr></tbody></table>
<script>
$( document ).ready(function() {
if( $("#toccontainer li").length <= 0 ) {
$("#renderTOC").hide();
}
});
</script>
</div>
{{ content }}
</div>
| ---
layout: default
---
<div class="post">
<h1 class="post-title">{{ page.title }}</h1>
<span class="post-date">{{ page.date | date_to_string }}</span>
<div id="renderTOC">
<table class="toc" id="toc"><tbody><tr><td>
<div id="toctitle"><strong>Contents</strong></div>
- <ol data-toc="div.container" data-toc-headings="h2,h3"></ol>
+ <ol id='toccontainer' data-toc="div.container" data-toc-headings="h2,h3"></ol>
? ++++++++++++++++++
- </td></tr></tbody></table>
? ----
+ </td></tr></tbody></table>
+ <script>
+ $( document ).ready(function() {
+ if( $("#toccontainer li").length <= 0 ) {
+ $("#renderTOC").hide();
+ }
+ });
+ </script>
+
+
+
</div>
{{ content }}
</div> | 14 | 0.823529 | 12 | 2 |
7d02bd555d7519d485d00e02136d26a6e4e7096e | nova/db/sqlalchemy/migrate_repo/versions/034_change_instance_id_in_migrations.py | nova/db/sqlalchemy/migrate_repo/versions/034_change_instance_id_in_migrations.py |
from sqlalchemy import Column, Integer, String, MetaData, Table
meta = MetaData()
#
# Tables to alter
#
#
instance_id = Column('instance_id', Integer())
instance_uuid = Column('instance_uuid', String(255))
def upgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.create_column(instance_uuid)
migrations.c.instance_id.drop()
def downgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.c.instance_uuid.drop()
migrations.create_column(instance_id)
|
from sqlalchemy import Column, Integer, String, MetaData, Table
meta = MetaData()
#
# Tables to alter
#
#
instance_id = Column('instance_id', Integer())
instance_uuid = Column('instance_uuid', String(255))
def upgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.create_column(instance_uuid)
if migrate_engine.name == "mysql":
migrate_engine.execute("ALTER TABLE migrations DROP FOREIGN KEY " \
"`migrations_ibfk_1`;")
migrations.c.instance_id.drop()
def downgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.c.instance_uuid.drop()
migrations.create_column(instance_id)
| Drop FK before dropping instance_id column. | Drop FK before dropping instance_id column. | Python | apache-2.0 | sacharya/nova,jianghuaw/nova,leilihh/novaha,eneabio/nova,vladikr/nova_drafts,KarimAllah/nova,sileht/deb-openstack-nova,Stavitsky/nova,DirectXMan12/nova-hacking,akash1808/nova_test_latest,raildo/nova,gspilio/nova,tangfeixiong/nova,jianghuaw/nova,Juniper/nova,JioCloud/nova,zhimin711/nova,usc-isi/nova,orbitfp7/nova,JianyuWang/nova,vmturbo/nova,sebrandon1/nova,jeffrey4l/nova,Francis-Liu/animated-broccoli,psiwczak/openstack,MountainWei/nova,tianweizhang/nova,yrobla/nova,maelnor/nova,whitepages/nova,maoy/zknova,joker946/nova,russellb/nova,iuliat/nova,qwefi/nova,rahulunair/nova,berrange/nova,sileht/deb-openstack-nova,mahak/nova,fnordahl/nova,sridevikoushik31/openstack,Metaswitch/calico-nova,gooddata/openstack-nova,sebrandon1/nova,redhat-openstack/nova,eayunstack/nova,mandeepdhami/nova,tealover/nova,eharney/nova,yrobla/nova,CEG-FYP-OpenStack/scheduler,TieWei/nova,maelnor/nova,TwinkleChawla/nova,KarimAllah/nova,cloudbau/nova,isyippee/nova,mikalstill/nova,hanlind/nova,mgagne/nova,badock/nova,qwefi/nova,paulmathews/nova,kimjaejoong/nova,spring-week-topos/nova-week,plumgrid/plumgrid-nova,alaski/nova,petrutlucian94/nova,thomasem/nova,barnsnake351/nova,cernops/nova,akash1808/nova,Triv90/Nova,yrobla/nova,watonyweng/nova,akash1808/nova_test_latest,NoBodyCam/TftpPxeBootBareMetal,Tehsmash/nova,Juniper/nova,iuliat/nova,orbitfp7/nova,alexandrucoman/vbox-nova-driver,aristanetworks/arista-ovs-nova,fnordahl/nova,cernops/nova,zaina/nova,projectcalico/calico-nova,russellb/nova,apporc/nova,j-carpentier/nova,shahar-stratoscale/nova,DirectXMan12/nova-hacking,tealover/nova,vmturbo/nova,rahulunair/nova,JianyuWang/nova,varunarya10/nova_test_latest,imsplitbit/nova,klmitch/nova,silenceli/nova,NewpTone/stacklab-nova,apporc/nova,devendermishrajio/nova_test_latest,dawnpower/nova,alvarolopez/nova,felixma/nova,saleemjaveds/https-github.com-openstack-nova,adelina-t/nova,angdraug/nova,mikals
till/nova,akash1808/nova,Yuriy-Leonov/nova,CiscoSystems/nova,klmitch/nova,watonyweng/nova,devoid/nova,bgxavier/nova,citrix-openstack-build/nova,psiwczak/openstack,nikesh-mahalka/nova,sridevikoushik31/nova,CiscoSystems/nova,joker946/nova,JioCloud/nova,salv-orlando/MyRepo,rrader/nova-docker-plugin,kimjaejoong/nova,rickerc/nova_audit,savi-dev/nova,sridevikoushik31/nova,hanlind/nova,DirectXMan12/nova-hacking,blueboxgroup/nova,JioCloud/nova_test_latest,eonpatapon/nova,luogangyi/bcec-nova,belmiromoreira/nova,fajoy/nova,rickerc/nova_audit,double12gzh/nova,sileht/deb-openstack-nova,cloudbase/nova,eayunstack/nova,NeCTAR-RC/nova,aristanetworks/arista-ovs-nova,CCI-MOC/nova,sridevikoushik31/openstack,silenceli/nova,Brocade-OpenSource/OpenStack-DNRM-Nova,virtualopensystems/nova,Juniper/nova,devendermishrajio/nova,tudorvio/nova,edulramirez/nova,bgxavier/nova,cyx1231st/nova,shootstar/novatest,varunarya10/nova_test_latest,maheshp/novatest,cernops/nova,imsplitbit/nova,maheshp/novatest,russellb/nova,josephsuh/extra-specs,mahak/nova,mgagne/nova,plumgrid/plumgrid-nova,gspilio/nova,sridevikoushik31/nova,luogangyi/bcec-nova,NoBodyCam/TftpPxeBootBareMetal,alaski/nova,cloudbau/nova,CloudServer/nova,bigswitch/nova,houshengbo/nova_vmware_compute_driver,dawnpower/nova,rajalokan/nova,belmiromoreira/nova,virtualopensystems/nova,saleemjaveds/https-github.com-openstack-nova,bclau/nova,eonpatapon/nova,Juniper/nova,citrix-openstack-build/nova,j-carpentier/nova,sacharya/nova,zhimin711/nova,Yusuke1987/openstack_template,angdraug/nova,mmnelemane/nova,eneabio/nova,cloudbase/nova,klmitch/nova,vmturbo/nova,openstack/nova,zaina/nova,edulramirez/nova,eharney/nova,josephsuh/extra-specs,cloudbase/nova,shail2810/nova,jianghuaw/nova,Triv90/Nova,NeCTAR-RC/nova,viggates/nova,zzicewind/nova,LoHChina/nova,vmturbo/nova,spring-week-topos/nova-week,noironetworks/nova,rajalokan/nova,openstack/nova,berrange/nova,takeshineshiro/nova,eneabio/nova,cloudbase/nova-virtualbox,felixma/nova,fajoy/nova,whitepages/nova,usc-isi/e
xtra-specs,psiwczak/openstack,ruslanloman/nova,isyippee/nova,ruslanloman/nova,petrutlucian94/nova_dev,dstroppa/openstack-smartos-nova-grizzly,shahar-stratoscale/nova,bclau/nova,josephsuh/extra-specs,SUSE-Cloud/nova,vladikr/nova_drafts,noironetworks/nova,fajoy/nova,ntt-sic/nova,maoy/zknova,Francis-Liu/animated-broccoli,BeyondTheClouds/nova,blueboxgroup/nova,LoHChina/nova,cloudbase/nova-virtualbox,Triv90/Nova,jianghuaw/nova,SUSE-Cloud/nova,leilihh/novaha,devoid/nova,salv-orlando/MyRepo,Yuriy-Leonov/nova,jeffrey4l/nova,NewpTone/stacklab-nova,tangfeixiong/nova,zzicewind/nova,houshengbo/nova_vmware_compute_driver,yosshy/nova,BeyondTheClouds/nova,sridevikoushik31/openstack,aristanetworks/arista-ovs-nova,maheshp/novatest,OpenAcademy-OpenStack/nova-scheduler,mandeepdhami/nova,phenoxim/nova,paulmathews/nova,usc-isi/nova,TwinkleChawla/nova,mikalstill/nova,Metaswitch/calico-nova,ntt-sic/nova,KarimAllah/nova,houshengbo/nova_vmware_compute_driver,projectcalico/calico-nova,CloudServer/nova,savi-dev/nova,usc-isi/extra-specs,tanglei528/nova,yatinkumbhare/openstack-nova,tianweizhang/nova,Stavitsky/nova,gooddata/openstack-nova,redhat-openstack/nova,Yusuke1987/openstack_template,sridevikoushik31/nova,yatinkumbhare/openstack-nova,tanglei528/nova,leilihh/nova,dstroppa/openstack-smartos-nova-grizzly,klmitch/nova,mahak/nova,sebrandon1/nova,bigswitch/nova,rajalokan/nova,rrader/nova-docker-plugin,leilihh/nova,raildo/nova,mmnelemane/nova,ewindisch/nova,dstroppa/openstack-smartos-nova-grizzly,gooddata/openstack-nova,Tehsmash/nova,JioCloud/nova_test_latest,petrutlucian94/nova_dev,ted-gould/nova,rahulunair/nova,dims/nova,badock/nova,yosshy/nova,MountainWei/nova,scripnichenko/nova,double12gzh/nova,gspilio/nova,OpenAcademy-OpenStack/nova-scheduler,nikesh-mahalka/nova,shootstar/novatest,savi-dev/nova,scripnichenko/nova,gooddata/openstack-nova,devendermishrajio/nova_test_latest,usc-isi/extra-specs,CCI-MOC/nova,takeshineshiro/nova,usc-isi/nova,NewpTone/stacklab-nova,dims/nova,adelina-t/nova,phenoxim
/nova,shail2810/nova,cyx1231st/nova,alvarolopez/nova,alexandrucoman/vbox-nova-driver,barnsnake351/nova,openstack/nova,affo/nova,Brocade-OpenSource/OpenStack-DNRM-Nova,affo/nova,maoy/zknova,thomasem/nova,ted-gould/nova,petrutlucian94/nova,viggates/nova,CEG-FYP-OpenStack/scheduler,TieWei/nova,salv-orlando/MyRepo,hanlind/nova,rajalokan/nova,NoBodyCam/TftpPxeBootBareMetal,ewindisch/nova,BeyondTheClouds/nova,devendermishrajio/nova,paulmathews/nova,tudorvio/nova | python | ## Code Before:
from sqlalchemy import Column, Integer, String, MetaData, Table
meta = MetaData()
#
# Tables to alter
#
#
instance_id = Column('instance_id', Integer())
instance_uuid = Column('instance_uuid', String(255))
def upgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.create_column(instance_uuid)
migrations.c.instance_id.drop()
def downgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.c.instance_uuid.drop()
migrations.create_column(instance_id)
## Instruction:
Drop FK before dropping instance_id column.
## Code After:
from sqlalchemy import Column, Integer, String, MetaData, Table
meta = MetaData()
#
# Tables to alter
#
#
instance_id = Column('instance_id', Integer())
instance_uuid = Column('instance_uuid', String(255))
def upgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.create_column(instance_uuid)
if migrate_engine.name == "mysql":
migrate_engine.execute("ALTER TABLE migrations DROP FOREIGN KEY " \
"`migrations_ibfk_1`;")
migrations.c.instance_id.drop()
def downgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.c.instance_uuid.drop()
migrations.create_column(instance_id)
|
from sqlalchemy import Column, Integer, String, MetaData, Table
+
meta = MetaData()
#
# Tables to alter
#
#
instance_id = Column('instance_id', Integer())
instance_uuid = Column('instance_uuid', String(255))
def upgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.create_column(instance_uuid)
+
+ if migrate_engine.name == "mysql":
+ migrate_engine.execute("ALTER TABLE migrations DROP FOREIGN KEY " \
+ "`migrations_ibfk_1`;")
+
migrations.c.instance_id.drop()
def downgrade(migrate_engine):
meta.bind = migrate_engine
migrations = Table('migrations', meta, autoload=True)
migrations.c.instance_uuid.drop()
migrations.create_column(instance_id) | 6 | 0.222222 | 6 | 0 |
aeaafbdeaf6d4f13cc38d5ba9bc05b9508550bad | src/index.js | src/index.js | import '../vendor/gridforms.css'
import React, {cloneElement, Children, PropTypes} from 'react'
let classNames = (...args) => args.filter(cn => !!cn).join(' ')
export let GridForm = ({children, className, component: Component, ...props}) =>
<Component {...props} className={classNames('grid-form', className)}>
{children}
</Component>
GridForm.propTypes = {
component: PropTypes.any
}
GridForm.defaultProps = {
component: 'form'
}
export let Fieldset = ({children, legend, ...props}) =>
<fieldset {...props}>
{legend && <legend>{legend}</legend>}
{children}
</fieldset>
Fieldset.propTypes = {
name: PropTypes.string.isRequired
}
export let Row = ({children, className}) => {
let span = 0
Children.forEach(children, child => span += Number(child.props.span))
return <div data-row-span={span} className={className}>
{children}
</div>
}
export let Field = ({children, className, span}) =>
<div data-field-span={span} className={className}>
{children}
</div>
Field.propTypes = {
span: PropTypes.oneOfType([
PropTypes.number,
PropTypes.string
])
}
Field.defaultProps = {
span: '1'
}
| import '../vendor/gridforms.css'
import React, {cloneElement, Children, PropTypes} from 'react'
let classNames = (...args) => args.filter(cn => !!cn).join(' ')
export let GridForm = React.createClass({
getDefaultProps() {
return {
component: 'form'
}
},
render() {
let {children, className, component: Component, ...props} = this.props
return <Component {...props} className={classNames('grid-form', className)}>
{children}
</Component>
}
})
export let Fieldset = React.createClass({
render() {
let {children, legend, ...props} = this.props
return <fieldset {...props}>
{legend && <legend>{legend}</legend>}
{children}
</fieldset>
}
})
export let Row = React.createClass({
render() {
let {children, className} = this.props
let span = 0
Children.forEach(children, child => span += Number(child.props.span))
return <div data-row-span={span} className={className}>
{children}
</div>
}
})
export let Field = React.createClass({
propTypes: {
span: PropTypes.oneOfType([
PropTypes.number,
PropTypes.string
])
},
getDefaultProps() {
return {
span: 1
}
},
render() {
let {children, className, span} = this.props
return <div data-field-span={span} className={className}>
{children}
</div>
}
})
| Use a regular React component impl until hot reloading issues are resolved | Use a regular React component impl until hot reloading issues are resolved
Hot reloading a function component currently blows any state within it away
| JavaScript | mit | insin/react-gridforms | javascript | ## Code Before:
import '../vendor/gridforms.css'
import React, {cloneElement, Children, PropTypes} from 'react'
let classNames = (...args) => args.filter(cn => !!cn).join(' ')
export let GridForm = ({children, className, component: Component, ...props}) =>
<Component {...props} className={classNames('grid-form', className)}>
{children}
</Component>
GridForm.propTypes = {
component: PropTypes.any
}
GridForm.defaultProps = {
component: 'form'
}
export let Fieldset = ({children, legend, ...props}) =>
<fieldset {...props}>
{legend && <legend>{legend}</legend>}
{children}
</fieldset>
Fieldset.propTypes = {
name: PropTypes.string.isRequired
}
export let Row = ({children, className}) => {
let span = 0
Children.forEach(children, child => span += Number(child.props.span))
return <div data-row-span={span} className={className}>
{children}
</div>
}
export let Field = ({children, className, span}) =>
<div data-field-span={span} className={className}>
{children}
</div>
Field.propTypes = {
span: PropTypes.oneOfType([
PropTypes.number,
PropTypes.string
])
}
Field.defaultProps = {
span: '1'
}
## Instruction:
Use a regular React component impl until hot reloading issues are resolved
Hot reloading a function component currently blows any state within it away
## Code After:
import '../vendor/gridforms.css'
import React, {cloneElement, Children, PropTypes} from 'react'
let classNames = (...args) => args.filter(cn => !!cn).join(' ')
export let GridForm = React.createClass({
getDefaultProps() {
return {
component: 'form'
}
},
render() {
let {children, className, component: Component, ...props} = this.props
return <Component {...props} className={classNames('grid-form', className)}>
{children}
</Component>
}
})
export let Fieldset = React.createClass({
render() {
let {children, legend, ...props} = this.props
return <fieldset {...props}>
{legend && <legend>{legend}</legend>}
{children}
</fieldset>
}
})
export let Row = React.createClass({
render() {
let {children, className} = this.props
let span = 0
Children.forEach(children, child => span += Number(child.props.span))
return <div data-row-span={span} className={className}>
{children}
</div>
}
})
export let Field = React.createClass({
propTypes: {
span: PropTypes.oneOfType([
PropTypes.number,
PropTypes.string
])
},
getDefaultProps() {
return {
span: 1
}
},
render() {
let {children, className, span} = this.props
return <div data-field-span={span} className={className}>
{children}
</div>
}
})
| import '../vendor/gridforms.css'
import React, {cloneElement, Children, PropTypes} from 'react'
let classNames = (...args) => args.filter(cn => !!cn).join(' ')
+ export let GridForm = React.createClass({
+ getDefaultProps() {
+ return {
+ component: 'form'
+ }
+ },
+ render() {
- export let GridForm = ({children, className, component: Component, ...props}) =>
? ^^^^^^ ------------ - ^
+ let {children, className, component: Component, ...props} = this.props
? ^^^ ^^^^^^^^^^^
- <Component {...props} className={classNames('grid-form', className)}>
+ return <Component {...props} className={classNames('grid-form', className)}>
? +++++++++
- {children}
+ {children}
? ++
- </Component>
+ </Component>
? ++
+ }
+ })
- GridForm.propTypes = {
- component: PropTypes.any
- }
+ export let Fieldset = React.createClass({
+ render() {
+ let {children, legend, ...props} = this.props
+ return <fieldset {...props}>
+ {legend && <legend>{legend}</legend>}
+ {children}
+ </fieldset>
+ }
+ })
- GridForm.defaultProps = {
- component: 'form'
- }
+ export let Row = React.createClass({
+ render() {
+ let {children, className} = this.props
+ let span = 0
+ Children.forEach(children, child => span += Number(child.props.span))
+ return <div data-row-span={span} className={className}>
+ {children}
+ </div>
+ }
+ })
+ export let Field = React.createClass({
+ propTypes: {
- export let Fieldset = ({children, legend, ...props}) =>
- <fieldset {...props}>
- {legend && <legend>{legend}</legend>}
- {children}
- </fieldset>
-
- Fieldset.propTypes = {
- name: PropTypes.string.isRequired
- }
-
- export let Row = ({children, className}) => {
- let span = 0
- Children.forEach(children, child => span += Number(child.props.span))
- return <div data-row-span={span} className={className}>
- {children}
- </div>
- }
-
- export let Field = ({children, className, span}) =>
- <div data-field-span={span} className={className}>
- {children}
- </div>
-
- Field.propTypes = {
- span: PropTypes.oneOfType([
+ span: PropTypes.oneOfType([
? ++
- PropTypes.number,
+ PropTypes.number,
? ++
- PropTypes.string
+ PropTypes.string
? ++
- ])
+ ])
? ++
- }
-
- Field.defaultProps = {
+ },
+ getDefaultProps() {
+ return {
- span: '1'
? - -
+ span: 1
? ++++
- }
+ }
+ },
+ render() {
+ let {children, className, span} = this.props
+ return <div data-field-span={span} className={className}>
+ {children}
+ </div>
+ }
+ }) | 94 | 1.807692 | 51 | 43 |
2e64e971858a89db46eaf1550ccee0b67b9ea4dc | app.json | app.json | {
"expo": {
"name": "Desert Walk",
"description": "Solitaire card game.",
"slug": "desert-walk",
"privacy": "public",
"sdkVersion": "36.0.0",
"version": "1.0.6",
"orientation": "landscape",
"primaryColor": "#464",
"icon": "./assets/app-icon.png",
"splash": {
"image": "./assets/splash-screen.png",
"hideExponentText": false
},
"assetBundlePatterns": ["assets/**/*"],
"githubUrl": "https://github.com/janaagaard75/desert-walk",
"platforms": ["android", "ios"],
"ios": {
"bundleIdentifier": "com.janaagaard.desertwalk",
"supportsTablet": true,
"usesIcloudStorage": false,
"usesNonExemptEncryption": false
},
"android": {
"package": "com.janaagaard.desertwalk",
"versionCode": 4
}
}
}
| {
"expo": {
"name": "Desert Walk",
"description": "Solitaire card game.",
"slug": "desert-walk",
"privacy": "public",
"sdkVersion": "36.0.0",
"version": "1.0.6",
"orientation": "landscape",
"primaryColor": "#464",
"icon": "./assets/app-icon.png",
"splash": {
"image": "./assets/splash-screen.png",
"hideExponentText": false
},
"assetBundlePatterns": ["assets/**/*"],
"githubUrl": "https://github.com/janaagaard75/desert-walk",
"platforms": ["android", "ios"],
"ios": {
"buildNumber": 5,
"bundleIdentifier": "com.janaagaard.desertwalk",
"supportsTablet": true,
"usesIcloudStorage": false,
"usesNonExemptEncryption": false
},
"android": {
"package": "com.janaagaard.desertwalk",
"versionCode": 5
}
}
}
| Add buildNumber and increment versionCode | Add buildNumber and increment versionCode
| JSON | mit | janaagaard75/desert-walk,janaagaard75/desert-walk | json | ## Code Before:
{
"expo": {
"name": "Desert Walk",
"description": "Solitaire card game.",
"slug": "desert-walk",
"privacy": "public",
"sdkVersion": "36.0.0",
"version": "1.0.6",
"orientation": "landscape",
"primaryColor": "#464",
"icon": "./assets/app-icon.png",
"splash": {
"image": "./assets/splash-screen.png",
"hideExponentText": false
},
"assetBundlePatterns": ["assets/**/*"],
"githubUrl": "https://github.com/janaagaard75/desert-walk",
"platforms": ["android", "ios"],
"ios": {
"bundleIdentifier": "com.janaagaard.desertwalk",
"supportsTablet": true,
"usesIcloudStorage": false,
"usesNonExemptEncryption": false
},
"android": {
"package": "com.janaagaard.desertwalk",
"versionCode": 4
}
}
}
## Instruction:
Add buildNumber and increment versionCode
## Code After:
{
"expo": {
"name": "Desert Walk",
"description": "Solitaire card game.",
"slug": "desert-walk",
"privacy": "public",
"sdkVersion": "36.0.0",
"version": "1.0.6",
"orientation": "landscape",
"primaryColor": "#464",
"icon": "./assets/app-icon.png",
"splash": {
"image": "./assets/splash-screen.png",
"hideExponentText": false
},
"assetBundlePatterns": ["assets/**/*"],
"githubUrl": "https://github.com/janaagaard75/desert-walk",
"platforms": ["android", "ios"],
"ios": {
"buildNumber": 5,
"bundleIdentifier": "com.janaagaard.desertwalk",
"supportsTablet": true,
"usesIcloudStorage": false,
"usesNonExemptEncryption": false
},
"android": {
"package": "com.janaagaard.desertwalk",
"versionCode": 5
}
}
}
| {
"expo": {
"name": "Desert Walk",
"description": "Solitaire card game.",
"slug": "desert-walk",
"privacy": "public",
"sdkVersion": "36.0.0",
"version": "1.0.6",
"orientation": "landscape",
"primaryColor": "#464",
"icon": "./assets/app-icon.png",
"splash": {
"image": "./assets/splash-screen.png",
"hideExponentText": false
},
"assetBundlePatterns": ["assets/**/*"],
"githubUrl": "https://github.com/janaagaard75/desert-walk",
"platforms": ["android", "ios"],
"ios": {
+ "buildNumber": 5,
"bundleIdentifier": "com.janaagaard.desertwalk",
"supportsTablet": true,
"usesIcloudStorage": false,
"usesNonExemptEncryption": false
},
"android": {
"package": "com.janaagaard.desertwalk",
- "versionCode": 4
? ^
+ "versionCode": 5
? ^
}
}
} | 3 | 0.1 | 2 | 1 |
e39dc6dd8b70c3afa7e7efe7cfaaf557b7855ee6 | packages/in/intro-prelude.yaml | packages/in/intro-prelude.yaml | homepage: https://github.com/minad/intro-prelude#readme
changelog-type: ''
hash: 0efa4c65ccfa1af759e17f530a5f2aa691d4bbf1e9702b8b657bad02cc1b3e6e
test-bench-deps:
intro: -any
intro-prelude: -any
maintainer: Daniel Mendler <mail@daniel-mendler.de>
synopsis: Intro reexported as Prelude
changelog: ''
basic-deps:
intro: -any
all-versions:
- '0.1.0.0'
author: Daniel Mendler <mail@daniel-mendler.de>
latest: '0.1.0.0'
description-type: haddock
description: Intro reexported as Prelude. Replace base with base-noprelude and intro-prelude
in your build-depends.
license-name: MIT
| homepage: https://github.com/minad/intro-prelude#readme
changelog-type: ''
hash: a6ffadd4b02b26ea9170eae2f37ee3f5af32cb128a3c1d1099b34b86daec5de6
test-bench-deps:
intro: -any
intro-prelude: -any
maintainer: Daniel Mendler <mail@daniel-mendler.de>
synopsis: Intro reexported as Prelude
changelog: ''
basic-deps:
intro: <0.2
all-versions:
- '0.1.0.0'
author: Daniel Mendler <mail@daniel-mendler.de>
latest: '0.1.0.0'
description-type: haddock
description: Intro reexported as Prelude. Replace base with base-noprelude and intro-prelude
in your build-depends.
license-name: MIT
| Update from Hackage at 2017-01-10T15:37:35Z | Update from Hackage at 2017-01-10T15:37:35Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: https://github.com/minad/intro-prelude#readme
changelog-type: ''
hash: 0efa4c65ccfa1af759e17f530a5f2aa691d4bbf1e9702b8b657bad02cc1b3e6e
test-bench-deps:
intro: -any
intro-prelude: -any
maintainer: Daniel Mendler <mail@daniel-mendler.de>
synopsis: Intro reexported as Prelude
changelog: ''
basic-deps:
intro: -any
all-versions:
- '0.1.0.0'
author: Daniel Mendler <mail@daniel-mendler.de>
latest: '0.1.0.0'
description-type: haddock
description: Intro reexported as Prelude. Replace base with base-noprelude and intro-prelude
in your build-depends.
license-name: MIT
## Instruction:
Update from Hackage at 2017-01-10T15:37:35Z
## Code After:
homepage: https://github.com/minad/intro-prelude#readme
changelog-type: ''
hash: a6ffadd4b02b26ea9170eae2f37ee3f5af32cb128a3c1d1099b34b86daec5de6
test-bench-deps:
intro: -any
intro-prelude: -any
maintainer: Daniel Mendler <mail@daniel-mendler.de>
synopsis: Intro reexported as Prelude
changelog: ''
basic-deps:
intro: <0.2
all-versions:
- '0.1.0.0'
author: Daniel Mendler <mail@daniel-mendler.de>
latest: '0.1.0.0'
description-type: haddock
description: Intro reexported as Prelude. Replace base with base-noprelude and intro-prelude
in your build-depends.
license-name: MIT
| homepage: https://github.com/minad/intro-prelude#readme
changelog-type: ''
- hash: 0efa4c65ccfa1af759e17f530a5f2aa691d4bbf1e9702b8b657bad02cc1b3e6e
+ hash: a6ffadd4b02b26ea9170eae2f37ee3f5af32cb128a3c1d1099b34b86daec5de6
test-bench-deps:
intro: -any
intro-prelude: -any
maintainer: Daniel Mendler <mail@daniel-mendler.de>
synopsis: Intro reexported as Prelude
changelog: ''
basic-deps:
- intro: -any
+ intro: <0.2
all-versions:
- '0.1.0.0'
author: Daniel Mendler <mail@daniel-mendler.de>
latest: '0.1.0.0'
description-type: haddock
description: Intro reexported as Prelude. Replace base with base-noprelude and intro-prelude
in your build-depends.
license-name: MIT | 4 | 0.210526 | 2 | 2 |
4c5e90b7adb0683a7cbab360a66610ccfc35c510 | recipes-webos/smartkey-hun/smartkey-hun.bb | recipes-webos/smartkey-hun/smartkey-hun.bb |
DESCRIPTION = "SmartKey is the webOS service for spell checking."
LICENSE = "Apache-2.0"
LIC_FILES_CHKSUM = "file://${COMMON_LICENSE_DIR}/Apache-2.0;md5=89aea4e17d99a7cacdbeed46a0096b10"
SECTION = "webos/base"
DEPENDS = "libpbnjson cjson glib-2.0 luna-service2 icu hunspell"
PR = "r1"
inherit autotools
inherit webos_component
inherit webos_public_repo
inherit webos_enhanced_submissions
WEBOS_GIT_TAG = "submissions/${WEBOS_SUBMISSION}"
SRC_URI = "${OPENWEBOS_GIT_REPO}/${PN};tag=${WEBOS_GIT_TAG};protocol=git"
S = "${WORKDIR}/git"
EXTRA_OEMAKE = "PLATFORM=${TARGET_ARCH}"
do_install_prepend() {
export INSTALL_DIR="${D}"
}
do_install_append() {
chmod o-rwx ${D}${bindir}/com.palm.smartkey
}
|
DESCRIPTION = "Implementation of the Open webOS SmartKey spell checking service using hunspell"
SECTION = "webos/base"
LICENSE = "Apache-2.0"
LIC_FILES_CHKSUM = "file://${COMMON_LICENSE_DIR}/Apache-2.0;md5=89aea4e17d99a7cacdbeed46a0096b10"
DEPENDS = "libpbnjson cjson glib-2.0 luna-service2 icu hunspell"
PR = "r2"
#Uncomment once do_install() has been moved out of the recipe
#inherit webos_component
inherit webos_public_repo
inherit webos_enhanced_submissions
inherit webos_system_bus
inherit webos_daemon
WEBOS_GIT_TAG = "submissions/${WEBOS_SUBMISSION}"
SRC_URI = "${OPENWEBOS_GIT_REPO}/${PN};tag=${WEBOS_GIT_TAG};protocol=git"
S = "${WORKDIR}/git"
EXTRA_OEMAKE += "PLATFORM=${TARGET_ARCH}"
do_install() {
oe_runmake INSTALL_DIR=${D} install
oe_runmake INCLUDE_DIR=${D}${includedir} LIB_DIR=${D}${libdir} stage
}
| Add service files to package | Smartkey-hun: Add service files to package
* Change description
* Don't inherit autotools anymore
Open-webOS-DCO-1.0-Signed-off-by: Kimmo Leppala <kimmo.leppala@palm.com>
Change-Id: Ie6a4219083dda213887e803ac69dd70c40ad2784
| BitBake | mit | nizovn/meta-webos-ports,openwebos/meta-webos,Garfonso/meta-webos-ports,nizovn/meta-webos-ports,openwebos/meta-webos,openwebos/meta-webos,openwebos/meta-webos,Garfonso/meta-webos-ports | bitbake | ## Code Before:
DESCRIPTION = "SmartKey is the webOS service for spell checking."
LICENSE = "Apache-2.0"
LIC_FILES_CHKSUM = "file://${COMMON_LICENSE_DIR}/Apache-2.0;md5=89aea4e17d99a7cacdbeed46a0096b10"
SECTION = "webos/base"
DEPENDS = "libpbnjson cjson glib-2.0 luna-service2 icu hunspell"
PR = "r1"
inherit autotools
inherit webos_component
inherit webos_public_repo
inherit webos_enhanced_submissions
WEBOS_GIT_TAG = "submissions/${WEBOS_SUBMISSION}"
SRC_URI = "${OPENWEBOS_GIT_REPO}/${PN};tag=${WEBOS_GIT_TAG};protocol=git"
S = "${WORKDIR}/git"
EXTRA_OEMAKE = "PLATFORM=${TARGET_ARCH}"
do_install_prepend() {
export INSTALL_DIR="${D}"
}
do_install_append() {
chmod o-rwx ${D}${bindir}/com.palm.smartkey
}
## Instruction:
Smartkey-hun: Add service files to package
* Change description
* Don't inherit autotools anymore
Open-webOS-DCO-1.0-Signed-off-by: Kimmo Leppala <kimmo.leppala@palm.com>
Change-Id: Ie6a4219083dda213887e803ac69dd70c40ad2784
## Code After:
DESCRIPTION = "Implementation of the Open webOS SmartKey spell checking service using hunspell"
SECTION = "webos/base"
LICENSE = "Apache-2.0"
LIC_FILES_CHKSUM = "file://${COMMON_LICENSE_DIR}/Apache-2.0;md5=89aea4e17d99a7cacdbeed46a0096b10"
DEPENDS = "libpbnjson cjson glib-2.0 luna-service2 icu hunspell"
PR = "r2"
#Uncomment once do_install() has been moved out of the recipe
#inherit webos_component
inherit webos_public_repo
inherit webos_enhanced_submissions
inherit webos_system_bus
inherit webos_daemon
WEBOS_GIT_TAG = "submissions/${WEBOS_SUBMISSION}"
SRC_URI = "${OPENWEBOS_GIT_REPO}/${PN};tag=${WEBOS_GIT_TAG};protocol=git"
S = "${WORKDIR}/git"
EXTRA_OEMAKE += "PLATFORM=${TARGET_ARCH}"
do_install() {
oe_runmake INSTALL_DIR=${D} install
oe_runmake INCLUDE_DIR=${D}${includedir} LIB_DIR=${D}${libdir} stage
}
|
- DESCRIPTION = "SmartKey is the webOS service for spell checking."
+ DESCRIPTION = "Implementation of the Open webOS SmartKey spell checking service using hunspell"
+ SECTION = "webos/base"
LICENSE = "Apache-2.0"
LIC_FILES_CHKSUM = "file://${COMMON_LICENSE_DIR}/Apache-2.0;md5=89aea4e17d99a7cacdbeed46a0096b10"
- SECTION = "webos/base"
DEPENDS = "libpbnjson cjson glib-2.0 luna-service2 icu hunspell"
- PR = "r1"
? ^
+ PR = "r2"
? ^
- inherit autotools
+ #Uncomment once do_install() has been moved out of the recipe
- inherit webos_component
+ #inherit webos_component
? +
inherit webos_public_repo
inherit webos_enhanced_submissions
+ inherit webos_system_bus
+ inherit webos_daemon
WEBOS_GIT_TAG = "submissions/${WEBOS_SUBMISSION}"
SRC_URI = "${OPENWEBOS_GIT_REPO}/${PN};tag=${WEBOS_GIT_TAG};protocol=git"
S = "${WORKDIR}/git"
- EXTRA_OEMAKE = "PLATFORM=${TARGET_ARCH}"
+ EXTRA_OEMAKE += "PLATFORM=${TARGET_ARCH}"
? +
- do_install_prepend() {
? --------
+ do_install() {
- export INSTALL_DIR="${D}"
+ oe_runmake INSTALL_DIR=${D} install
+ oe_runmake INCLUDE_DIR=${D}${includedir} LIB_DIR=${D}${libdir} stage
}
-
- do_install_append() {
- chmod o-rwx ${D}${bindir}/com.palm.smartkey
- }
- | 24 | 0.827586 | 11 | 13 |
4cc5faafc36b0e85c10f50dff5c9e5d7364b18e5 | server/lib/getDriveLetters.js | server/lib/getDriveLetters.js | const childProcess = require('child_process')
const command = 'wmic logicaldisk get caption'
module.exports = function () {
return new Promise((resolve, reject) => {
childProcess.exec(command, (err, stdout) => {
if (err) {
return reject(err)
}
const rows = stdout.split(/\r?\n/)
resolve(rows)
})
})
}
| const childProcess = require('child_process')
const command = 'wmic logicaldisk get caption'
module.exports = function () {
return new Promise((resolve, reject) => {
childProcess.exec(command, (err, stdout) => {
if (err) {
return reject(err)
}
const rows = stdout.split(/\r?\n/).filter(row => row.trim().endsWith(':'))
resolve(rows.map(r => r.trim()))
})
})
}
| Clean up output from wmic | Clean up output from wmic
| JavaScript | isc | bhj/karaoke-forever,bhj/karaoke-forever | javascript | ## Code Before:
const childProcess = require('child_process')
const command = 'wmic logicaldisk get caption'
module.exports = function () {
return new Promise((resolve, reject) => {
childProcess.exec(command, (err, stdout) => {
if (err) {
return reject(err)
}
const rows = stdout.split(/\r?\n/)
resolve(rows)
})
})
}
## Instruction:
Clean up output from wmic
## Code After:
const childProcess = require('child_process')
const command = 'wmic logicaldisk get caption'
module.exports = function () {
return new Promise((resolve, reject) => {
childProcess.exec(command, (err, stdout) => {
if (err) {
return reject(err)
}
const rows = stdout.split(/\r?\n/).filter(row => row.trim().endsWith(':'))
resolve(rows.map(r => r.trim()))
})
})
}
| const childProcess = require('child_process')
const command = 'wmic logicaldisk get caption'
module.exports = function () {
return new Promise((resolve, reject) => {
childProcess.exec(command, (err, stdout) => {
if (err) {
return reject(err)
}
- const rows = stdout.split(/\r?\n/)
- resolve(rows)
+ const rows = stdout.split(/\r?\n/).filter(row => row.trim().endsWith(':'))
+ resolve(rows.map(r => r.trim()))
})
})
} | 4 | 0.266667 | 2 | 2 |
bf32fc69fdbff65c43ad59606ac52dc53c278bbb | jovo-integrations/jovo-platform-alexa/src/core/AlexaSpeechBuilder.ts | jovo-integrations/jovo-platform-alexa/src/core/AlexaSpeechBuilder.ts | import * as _ from 'lodash';
import {SpeechBuilder} from "jovo-core";
import {AlexaSkill} from "./AlexaSkill";
export class AlexaSpeechBuilder extends SpeechBuilder {
constructor(alexaSkill: AlexaSkill) {
super(alexaSkill);
}
/**
* Adds audio tag to speech
* @public
* @param {string} url secure url to audio
* @param {boolean} condition
* @param {number} probability
* @return {SpeechBuilder}
*/
addAudio(url: string | string[], condition?: boolean, probability?: number) {
if (_.isArray(url)) {
return this.addText('<audio src="' + _.sample(url) + '"/>', condition, probability);
}
return this.addText('<audio src="' + url + '"/>', condition, probability);
}
}
| import * as _ from 'lodash';
import {SpeechBuilder} from "jovo-core";
import {AlexaSkill} from "./AlexaSkill";
export class AlexaSpeechBuilder extends SpeechBuilder {
constructor(alexaSkill: AlexaSkill) {
super(alexaSkill);
}
/**
* Adds audio tag to speech
* @public
* @param {string} url secure url to audio
* @param {boolean} condition
* @param {number} probability
* @return {SpeechBuilder}
*/
addAudio(url: string | string[], condition?: boolean, probability?: number) {
if (_.isArray(url)) {
return this.addText('<audio src="' + _.sample(url) + '"/>', condition, probability);
}
return this.addText('<audio src="' + url + '"/>', condition, probability);
}
/**
* Adds text with language
* @param {string} language
* @param {string | string[]} text
* @param {boolean} condition
* @param {number} probability
* @returns {SpeechBuilder}
*/
addLangText(language: string, text: string | string[], condition?: boolean, probability?: number) {
if (_.isArray(text)) {
return this.addText(`<lang xml:lang="${language}">${text}</lang>`, condition, probability);
}
return this.addText(`<lang xml:lang="${language}">${text}</lang>`, condition, probability);
}
/**
* Adds text with polly
* @param {string} pollyName
* @param {string | string[]} text
* @param {boolean} condition
* @param {number} probability
* @returns {SpeechBuilder}
*/
addTextWithPolly(pollyName: string, text: string | string[], condition?: boolean, probability?: number) {
if (_.isArray(text)) {
return this.addText(`<voice name="${pollyName}">${text}</voice>`, condition, probability);
}
return this.addText(`<voice name="${pollyName}">${text}</voice>`, condition, probability);
}
}
| Add polly voices and language tag | :sparkles: Add polly voices and language tag
| TypeScript | apache-2.0 | jovotech/jovo-framework-nodejs,jovotech/jovo-framework-nodejs | typescript | ## Code Before:
import * as _ from 'lodash';
import {SpeechBuilder} from "jovo-core";
import {AlexaSkill} from "./AlexaSkill";
export class AlexaSpeechBuilder extends SpeechBuilder {
constructor(alexaSkill: AlexaSkill) {
super(alexaSkill);
}
/**
* Adds audio tag to speech
* @public
* @param {string} url secure url to audio
* @param {boolean} condition
* @param {number} probability
* @return {SpeechBuilder}
*/
addAudio(url: string | string[], condition?: boolean, probability?: number) {
if (_.isArray(url)) {
return this.addText('<audio src="' + _.sample(url) + '"/>', condition, probability);
}
return this.addText('<audio src="' + url + '"/>', condition, probability);
}
}
## Instruction:
:sparkles: Add polly voices and language tag
## Code After:
import * as _ from 'lodash';
import {SpeechBuilder} from "jovo-core";
import {AlexaSkill} from "./AlexaSkill";
export class AlexaSpeechBuilder extends SpeechBuilder {
constructor(alexaSkill: AlexaSkill) {
super(alexaSkill);
}
/**
* Adds audio tag to speech
* @public
* @param {string} url secure url to audio
* @param {boolean} condition
* @param {number} probability
* @return {SpeechBuilder}
*/
addAudio(url: string | string[], condition?: boolean, probability?: number) {
if (_.isArray(url)) {
return this.addText('<audio src="' + _.sample(url) + '"/>', condition, probability);
}
return this.addText('<audio src="' + url + '"/>', condition, probability);
}
/**
* Adds text with language
* @param {string} language
* @param {string | string[]} text
* @param {boolean} condition
* @param {number} probability
* @returns {SpeechBuilder}
*/
addLangText(language: string, text: string | string[], condition?: boolean, probability?: number) {
if (_.isArray(text)) {
return this.addText(`<lang xml:lang="${language}">${text}</lang>`, condition, probability);
}
return this.addText(`<lang xml:lang="${language}">${text}</lang>`, condition, probability);
}
/**
* Adds text with polly
* @param {string} pollyName
* @param {string | string[]} text
* @param {boolean} condition
* @param {number} probability
* @returns {SpeechBuilder}
*/
addTextWithPolly(pollyName: string, text: string | string[], condition?: boolean, probability?: number) {
if (_.isArray(text)) {
return this.addText(`<voice name="${pollyName}">${text}</voice>`, condition, probability);
}
return this.addText(`<voice name="${pollyName}">${text}</voice>`, condition, probability);
}
}
| import * as _ from 'lodash';
import {SpeechBuilder} from "jovo-core";
import {AlexaSkill} from "./AlexaSkill";
export class AlexaSpeechBuilder extends SpeechBuilder {
constructor(alexaSkill: AlexaSkill) {
super(alexaSkill);
}
/**
* Adds audio tag to speech
* @public
* @param {string} url secure url to audio
* @param {boolean} condition
* @param {number} probability
* @return {SpeechBuilder}
*/
addAudio(url: string | string[], condition?: boolean, probability?: number) {
if (_.isArray(url)) {
return this.addText('<audio src="' + _.sample(url) + '"/>', condition, probability);
}
return this.addText('<audio src="' + url + '"/>', condition, probability);
}
+
+ /**
+ * Adds text with language
+ * @param {string} language
+ * @param {string | string[]} text
+ * @param {boolean} condition
+ * @param {number} probability
+ * @returns {SpeechBuilder}
+ */
+ addLangText(language: string, text: string | string[], condition?: boolean, probability?: number) {
+ if (_.isArray(text)) {
+ return this.addText(`<lang xml:lang="${language}">${text}</lang>`, condition, probability);
+ }
+ return this.addText(`<lang xml:lang="${language}">${text}</lang>`, condition, probability);
+ }
+
+ /**
+ * Adds text with polly
+ * @param {string} pollyName
+ * @param {string | string[]} text
+ * @param {boolean} condition
+ * @param {number} probability
+ * @returns {SpeechBuilder}
+ */
+ addTextWithPolly(pollyName: string, text: string | string[], condition?: boolean, probability?: number) {
+ if (_.isArray(text)) {
+ return this.addText(`<voice name="${pollyName}">${text}</voice>`, condition, probability);
+ }
+ return this.addText(`<voice name="${pollyName}">${text}</voice>`, condition, probability);
+ }
} | 30 | 1.25 | 30 | 0 |
7310fdd86cebd1c8b28d960f0211e350fbf07a62 | handlers/login.py | handlers/login.py | from rorn.Box import LoginBox, ErrorBox, WarningBox, SuccessBox
from rorn.Session import delay
from User import User
from Button import Button
from utils import *
@get('login')
def login(handler, request):
handler.title('Login')
if handler.session['user']:
print WarningBox('Logged In', 'You are already logged in as %s' % handler.session['user'])
else:
print LoginBox()
@post('login')
def loginPost(handler, request, p_username, p_password):
handler.title('Login')
user = User.load(username = p_username, password = User.crypt(p_username, p_password))
if user:
handler.session['user'] = user
delay(handler, SuccessBox("Login Complete", "Logged in as %s" % user))
redirect('/')
else:
delay(handler, ErrorBox("Login Failed", "Invalid username/password combination"))
redirect('/')
@get('logout')
def logout(handler, request):
print "<form method=\"post\" action=\"/logout\">"
print Button('Logout', type = 'submit').negative()
print "</form>"
@post('logout')
def logoutPost(handler, request):
if handler.session['user']:
del handler.session['user']
redirect('/')
else:
print ErrorBox("Logout Failed", "You are not logged in")
| from rorn.Box import LoginBox, ErrorBox, WarningBox, SuccessBox
from rorn.Session import delay
from User import User
from Button import Button
from utils import *
@get('login')
def login(handler, request):
handler.title('Login')
if handler.session['user']:
print WarningBox('Logged In', 'You are already logged in as %s' % handler.session['user'])
else:
print LoginBox()
@post('login')
def loginPost(handler, request, p_username, p_password):
handler.title('Login')
user = User.load(username = p_username, password = User.crypt(p_username, p_password))
if user:
if user.resetkey:
user.resetkey = None
user.save()
handler.session['user'] = user
delay(handler, SuccessBox("Login Complete", "Logged in as %s" % user))
redirect('/')
else:
delay(handler, ErrorBox("Login Failed", "Invalid username/password combination"))
redirect('/')
@get('logout')
def logout(handler, request):
print "<form method=\"post\" action=\"/logout\">"
print Button('Logout', type = 'submit').negative()
print "</form>"
@post('logout')
def logoutPost(handler, request):
if handler.session['user']:
del handler.session['user']
redirect('/')
else:
print ErrorBox("Logout Failed", "You are not logged in")
| Clear reset key on user activity | Clear reset key on user activity
| Python | mit | mrozekma/Sprint,mrozekma/Sprint,mrozekma/Sprint | python | ## Code Before:
from rorn.Box import LoginBox, ErrorBox, WarningBox, SuccessBox
from rorn.Session import delay
from User import User
from Button import Button
from utils import *
@get('login')
def login(handler, request):
handler.title('Login')
if handler.session['user']:
print WarningBox('Logged In', 'You are already logged in as %s' % handler.session['user'])
else:
print LoginBox()
@post('login')
def loginPost(handler, request, p_username, p_password):
handler.title('Login')
user = User.load(username = p_username, password = User.crypt(p_username, p_password))
if user:
handler.session['user'] = user
delay(handler, SuccessBox("Login Complete", "Logged in as %s" % user))
redirect('/')
else:
delay(handler, ErrorBox("Login Failed", "Invalid username/password combination"))
redirect('/')
@get('logout')
def logout(handler, request):
print "<form method=\"post\" action=\"/logout\">"
print Button('Logout', type = 'submit').negative()
print "</form>"
@post('logout')
def logoutPost(handler, request):
if handler.session['user']:
del handler.session['user']
redirect('/')
else:
print ErrorBox("Logout Failed", "You are not logged in")
## Instruction:
Clear reset key on user activity
## Code After:
from rorn.Box import LoginBox, ErrorBox, WarningBox, SuccessBox
from rorn.Session import delay
from User import User
from Button import Button
from utils import *
@get('login')
def login(handler, request):
handler.title('Login')
if handler.session['user']:
print WarningBox('Logged In', 'You are already logged in as %s' % handler.session['user'])
else:
print LoginBox()
@post('login')
def loginPost(handler, request, p_username, p_password):
handler.title('Login')
user = User.load(username = p_username, password = User.crypt(p_username, p_password))
if user:
if user.resetkey:
user.resetkey = None
user.save()
handler.session['user'] = user
delay(handler, SuccessBox("Login Complete", "Logged in as %s" % user))
redirect('/')
else:
delay(handler, ErrorBox("Login Failed", "Invalid username/password combination"))
redirect('/')
@get('logout')
def logout(handler, request):
print "<form method=\"post\" action=\"/logout\">"
print Button('Logout', type = 'submit').negative()
print "</form>"
@post('logout')
def logoutPost(handler, request):
if handler.session['user']:
del handler.session['user']
redirect('/')
else:
print ErrorBox("Logout Failed", "You are not logged in")
| from rorn.Box import LoginBox, ErrorBox, WarningBox, SuccessBox
from rorn.Session import delay
from User import User
from Button import Button
from utils import *
@get('login')
def login(handler, request):
handler.title('Login')
if handler.session['user']:
print WarningBox('Logged In', 'You are already logged in as %s' % handler.session['user'])
else:
print LoginBox()
@post('login')
def loginPost(handler, request, p_username, p_password):
handler.title('Login')
user = User.load(username = p_username, password = User.crypt(p_username, p_password))
if user:
+ if user.resetkey:
+ user.resetkey = None
+ user.save()
+
handler.session['user'] = user
delay(handler, SuccessBox("Login Complete", "Logged in as %s" % user))
redirect('/')
else:
delay(handler, ErrorBox("Login Failed", "Invalid username/password combination"))
redirect('/')
@get('logout')
def logout(handler, request):
print "<form method=\"post\" action=\"/logout\">"
print Button('Logout', type = 'submit').negative()
print "</form>"
@post('logout')
def logoutPost(handler, request):
if handler.session['user']:
del handler.session['user']
redirect('/')
else:
print ErrorBox("Logout Failed", "You are not logged in") | 4 | 0.1 | 4 | 0 |
519b8cda139d4ffb2f680229b6c16ef87b1fa1ab | app/templates/jade/_html-header.jade | app/templates/jade/_html-header.jade | !!!
//if lte IE 9
html.ie
//[if gt IE 9]><!
html
//<![endif]
html
head
title Sumaato
meta(charset='UTF-8')
meta(content='', name='description')
meta(content='', name='author')
meta(content='3 days', name='revisit-after')
// build:css /styles/main.css
link(href='/styles/main.css', rel='stylesheet')
// endbuild
//if lt IE 9
script(src='//html5shiv.googlecode.com/svn/trunk/html5.js')
| !!!
//if lte IE 9
html.ie
//[if gt IE 9]><!
html
//<![endif]
html
head
title <%= _.slugify(projectName) %>
meta(charset='UTF-8')
meta(content='', name='description')
meta(content='', name='author')
meta(content='3 days', name='revisit-after')
// build:css /styles/main.css
link(href='/styles/main.css', rel='stylesheet')
// endbuild
//if lt IE 9
script(src='//html5shiv.googlecode.com/svn/trunk/html5.js')
| Change header title to projectName | Change header title to projectName
| Jade | mit | yfr/generator-jade,Poltergeist/generator-jade,yfr/generator-jade | jade | ## Code Before:
!!!
//if lte IE 9
html.ie
//[if gt IE 9]><!
html
//<![endif]
html
head
title Sumaato
meta(charset='UTF-8')
meta(content='', name='description')
meta(content='', name='author')
meta(content='3 days', name='revisit-after')
// build:css /styles/main.css
link(href='/styles/main.css', rel='stylesheet')
// endbuild
//if lt IE 9
script(src='//html5shiv.googlecode.com/svn/trunk/html5.js')
## Instruction:
Change header title to projectName
## Code After:
!!!
//if lte IE 9
html.ie
//[if gt IE 9]><!
html
//<![endif]
html
head
title <%= _.slugify(projectName) %>
meta(charset='UTF-8')
meta(content='', name='description')
meta(content='', name='author')
meta(content='3 days', name='revisit-after')
// build:css /styles/main.css
link(href='/styles/main.css', rel='stylesheet')
// endbuild
//if lt IE 9
script(src='//html5shiv.googlecode.com/svn/trunk/html5.js')
| !!!
//if lte IE 9
html.ie
//[if gt IE 9]><!
html
//<![endif]
html
head
- title Sumaato
+ title <%= _.slugify(projectName) %>
meta(charset='UTF-8')
meta(content='', name='description')
meta(content='', name='author')
meta(content='3 days', name='revisit-after')
// build:css /styles/main.css
link(href='/styles/main.css', rel='stylesheet')
// endbuild
//if lt IE 9
script(src='//html5shiv.googlecode.com/svn/trunk/html5.js') | 2 | 0.095238 | 1 | 1 |
18fa7b0a25c9ebf3780bbbd8cca22041a13d9d0e | spec/classes/bitbucket_upgrade_spec.rb | spec/classes/bitbucket_upgrade_spec.rb | require 'spec_helper.rb'
describe 'bitbucket' do
context 'supported operating systems' do
on_supported_os.each do |os, facts|
context "on #{os} #{facts}" do
let(:facts) do
facts
end
context 'prepare for upgrade of bitbucket' do
let(:params) do
{ :javahome => '/opt/java' }
end
let(:facts) do
facts.merge(:bitbucket_version => '3.1.0')
end
it 'should stop service and remove old config file' do
should contain_exec('service bitbucket stop && sleep 15')
should contain_exec('rm -f /home/bitbucket/bitbucket.properties')
.with(:command => 'rm -f /home/bitbucket/bitbucket.properties',)
should contain_notify('Attempting to upgrade bitbucket')
end
end
end
end
end
end
| require 'spec_helper.rb'
describe 'bitbucket' do
context 'supported operating systems' do
on_supported_os.each do |os, facts|
context "on #{os} #{facts}" do
let(:facts) do
facts
end
context 'prepare for upgrade of bitbucket' do
let(:params) do
{ :javahome => '/opt/java' }
end
let(:facts) do
facts.merge(:bitbucket_version => '3.1.0')
end
it 'should stop service and remove old config file' do
should contain_exec('service bitbucket stop && sleep 15')
should contain_exec('rm -f /home/bitbucket/stash-config.properties')
.with(:command => 'rm -f /home/bitbucket/stash-config.properties',)
should contain_notify('Attempting to upgrade bitbucket')
end
end
end
end
end
end
| Test case used incorrect property file check. | Test case used incorrect property file check.
| Ruby | mit | danofthewired/puppet-bitbucket,danofthewired/puppet-bitbucket,danofthewired/puppet-bitbucket | ruby | ## Code Before:
require 'spec_helper.rb'
describe 'bitbucket' do
context 'supported operating systems' do
on_supported_os.each do |os, facts|
context "on #{os} #{facts}" do
let(:facts) do
facts
end
context 'prepare for upgrade of bitbucket' do
let(:params) do
{ :javahome => '/opt/java' }
end
let(:facts) do
facts.merge(:bitbucket_version => '3.1.0')
end
it 'should stop service and remove old config file' do
should contain_exec('service bitbucket stop && sleep 15')
should contain_exec('rm -f /home/bitbucket/bitbucket.properties')
.with(:command => 'rm -f /home/bitbucket/bitbucket.properties',)
should contain_notify('Attempting to upgrade bitbucket')
end
end
end
end
end
end
## Instruction:
Test case used incorrect property file check.
## Code After:
require 'spec_helper.rb'
describe 'bitbucket' do
context 'supported operating systems' do
on_supported_os.each do |os, facts|
context "on #{os} #{facts}" do
let(:facts) do
facts
end
context 'prepare for upgrade of bitbucket' do
let(:params) do
{ :javahome => '/opt/java' }
end
let(:facts) do
facts.merge(:bitbucket_version => '3.1.0')
end
it 'should stop service and remove old config file' do
should contain_exec('service bitbucket stop && sleep 15')
should contain_exec('rm -f /home/bitbucket/stash-config.properties')
.with(:command => 'rm -f /home/bitbucket/stash-config.properties',)
should contain_notify('Attempting to upgrade bitbucket')
end
end
end
end
end
end
| require 'spec_helper.rb'
describe 'bitbucket' do
context 'supported operating systems' do
on_supported_os.each do |os, facts|
context "on #{os} #{facts}" do
let(:facts) do
facts
end
context 'prepare for upgrade of bitbucket' do
let(:params) do
{ :javahome => '/opt/java' }
end
let(:facts) do
facts.merge(:bitbucket_version => '3.1.0')
end
it 'should stop service and remove old config file' do
should contain_exec('service bitbucket stop && sleep 15')
- should contain_exec('rm -f /home/bitbucket/bitbucket.properties')
? ^ ^^^^^^^
+ should contain_exec('rm -f /home/bitbucket/stash-config.properties')
? ^^^^^^^^^^ ^
- .with(:command => 'rm -f /home/bitbucket/bitbucket.properties',)
? ^ ^^^^^^^
+ .with(:command => 'rm -f /home/bitbucket/stash-config.properties',)
? ^^^^^^^^^^ ^
should contain_notify('Attempting to upgrade bitbucket')
end
end
end
end
end
end | 4 | 0.148148 | 2 | 2 |
0a428f4034fbdd351e52320f5c4bae0ed8201590 | spec/spec_helper.rb | spec/spec_helper.rb | require 'simplecov'
SimpleCov.start
require 'apple_dep_client'
| require 'simplecov'
SimpleCov.start
require 'apple_dep_client'
RSpec.configure do |config|
config.before(:each) do
AppleDEPClient.configure do |client|
client.private_key = nil
client.consumer_key = nil
client.consumer_secret = nil
client.access_token = nil
client.access_secret = nil
client.access_token_expiry = nil
end
end
end
| Reset AppleDEPClient configuration for each test | Reset AppleDEPClient configuration for each test
| Ruby | agpl-3.0 | cellabus/apple_dep_client | ruby | ## Code Before:
require 'simplecov'
SimpleCov.start
require 'apple_dep_client'
## Instruction:
Reset AppleDEPClient configuration for each test
## Code After:
require 'simplecov'
SimpleCov.start
require 'apple_dep_client'
RSpec.configure do |config|
config.before(:each) do
AppleDEPClient.configure do |client|
client.private_key = nil
client.consumer_key = nil
client.consumer_secret = nil
client.access_token = nil
client.access_secret = nil
client.access_token_expiry = nil
end
end
end
| require 'simplecov'
SimpleCov.start
require 'apple_dep_client'
+
+ RSpec.configure do |config|
+ config.before(:each) do
+ AppleDEPClient.configure do |client|
+ client.private_key = nil
+ client.consumer_key = nil
+ client.consumer_secret = nil
+ client.access_token = nil
+ client.access_secret = nil
+ client.access_token_expiry = nil
+ end
+ end
+ end | 13 | 3.25 | 13 | 0 |
b2e309d10530f3b8b7ce33f53723493c0a433b06 | Casks/jumpshare.rb | Casks/jumpshare.rb | cask :v1 => 'jumpshare' do
version '1.0.25-3'
sha256 'a2a2d44d858616965c8271dcfc177ec299e9592fed3abb795ddb81b533f6d818'
url "https://jumpshare.com/desktop/mac/Jumpshare_#{version}.dmg"
name 'Jumpshare'
homepage 'https://jumpshare.com/'
license :gratis
app 'Jumpshare.app'
end
| cask :v1 => 'jumpshare' do
version '2.1.1'
sha256 '8181e5b35bd02dda770f81ba8f39b83e3c1cde0b934e2e563d962ce4ea16fced'
url "https://jumpshare.com/apps/mac/Jumpshare-#{version}.zip"
name 'Jumpshare'
homepage 'https://jumpshare.com/'
license :gratis
app 'Jumpshare.app'
end
| Update Jumpshare.app to version 2.1.1 | Update Jumpshare.app to version 2.1.1
Ruby | bsd-2-clause | artdevjs/homebrew-cask,ddm/homebrew-cask,0xadada/homebrew-cask,thii/homebrew-cask,MerelyAPseudonym/homebrew-cask,casidiablo/homebrew-cask,gmkey/homebrew-cask,johnjelinek/homebrew-cask,wickles/homebrew-cask,mathbunnyru/homebrew-cask,tedbundyjr/homebrew-cask,shonjir/homebrew-cask,paour/homebrew-cask,sscotth/homebrew-cask,Bombenleger/homebrew-cask,sscotth/homebrew-cask,winkelsdorf/homebrew-cask,FinalDes/homebrew-cask,thehunmonkgroup/homebrew-cask,greg5green/homebrew-cask,renard/homebrew-cask,markthetech/homebrew-cask,My2ndAngelic/homebrew-cask,josa42/homebrew-cask,ericbn/homebrew-cask,shorshe/homebrew-cask,andrewdisley/homebrew-cask,xyb/homebrew-cask,malford/homebrew-cask,kteru/homebrew-cask,johndbritton/homebrew-cask,CameronGarrett/homebrew-cask,rajiv/homebrew-cask,mikem/homebrew-cask,arronmabrey/homebrew-cask,albertico/homebrew-cask,hovancik/homebrew-cask,pkq/homebrew-cask,hristozov/homebrew-cask,tsparber/homebrew-cask,cfillion/homebrew-cask,chrisfinazzo/homebrew-cask,stevehedrick/homebrew-cask,scottsuch/homebrew-cask,paour/homebrew-cask,mjgardner/homebrew-cask,ldong/homebrew-cask,linc01n/homebrew-cask,doits/homebrew-cask,esebastian/homebrew-cask,yutarody/homebrew-cask,giannitm/homebrew-cask,timsutton/homebrew-cask,jgarber623/homebrew-cask,cedwardsmedia/homebrew-cask,blainesch/homebrew-cask,6uclz1/homebrew-cask,hovancik/homebrew-cask,pkq/homebrew-cask,hristozov/homebrew-cask,tsparber/homebrew-cask,cfillion/homebrew-cask,chrisfinazzo/homebrew-cask,stevehedrick/homebrew-cask,scottsuch/homebrew-cask,paour/homebrew-cask,mjgardner/homebrew-cask,ldong/homebrew-cask,linc01n/homebrew-cask,doits/homebrew-cask,esebastian/homebrew-cask,yutarody/homebrew-cask,giannitm/homebrew-cask,timsutton/homebrew-cask,jasmas/homebrew-cask,brianshumate/homebrew-cask,mhubig/homebrew-cask,cprecioso/homebrew-cask,feigaochn/homebrew-cask,markhuber/homebrew-cask,yutarody/homebrew-cask,mlocher/homebrew-cask,psibre/homebrew-cask,jacobbednarz/homebrew-cask,ldong/homebrew-cask,okket/homebrew-cask,elnappo/homebrew-cask,julionc/homebrew-cask,ayohrling/homebrew-cask,robertgzr/homebrew-cask,imgarylai/homebrew-cask,flaviocamilo/homebrew-cask,paour/homebrew-cask,samdoran/homebrew-cask,13k/homebrew-cask,muan/homebrew-cask,shoichiaizawa/homebrew-cask,antogg/homebrew-cask,syscrusher/homebrew-cask,devmynd/homebrew-cask,bosr/homebrew-cask,mhubig/homebrew-cask,JacopKane/homebrew-cask,lukeadams/homebrew-cask,cobyism/homebrew-cask,kpearson/homebrew-cask,thomanq/homebrew-cask,moimikey/homebrew-cask,axodys/homebrew-cask,0rax/homebrew-cask,nathancahill/homebrew-cask,Ephemera/homebrew-cask,Labutin/homebrew-cask,wmorin/homebrew-cask,gyndav/homebrew-cask,mjdescy/homebrew-cask,chuanxd/homebrew-cask,gmkey/homebrew-cask,ebraminio/homebrew-cask,stonehippo/homebrew-cask,aguynamedryan/homebrew-cask,dwihn0r/homebrew-cask,coeligena/homebrew-customized,hakamadare/homebrew-cask,inz/homebrew-cask,mathbunnyru/homebrew-cask,tjnycum/homebrew-cask,jasmas/homebrew-cask,cliffcotino/homebrew-cask,janlugt/homebrew-cask,flaviocamilo/homebrew-cask,shoichiaizawa/homebrew-cask,mlocher/homebrew-cask,jedahan/homebrew-cask,vigosan/homebrew-cask,kkdd/homebrew-cask,Amorymeltzer/homebrew-cask,casidiablo/homebrew-cask,haha1903/homebrew-cask,andyli/homebrew-cask,SentinelWarren/homebrew-cask,winkelsdorf/homebrew-cask,ebraminio/homebrew-cask,xakraz/homebrew-cask,kiliankoe/homebrew-cask,kassi/homebrew-cask,tan9/homebrew-cask,gurghet/homebrew-cask,moimikey/homebrew-cask,crzrcn/homebrew-cask,samshadwell/homebrew-cask,renaudguerin/homebrew-cask,leipert/homebrew-cask,JacopKane/homebrew-cask,Ibuprofen/homebrew-cask,mrmachine/homebrew-cask,forevergenin/homebrew-cask,mazehall/homebrew-cask,Ngrd/homebrew-cask,alexg0/homebrew-cask,michelegera/homebrew-cask,lumaxis/homebrew-cask,claui/homebrew-cask,ericbn/homebrew-cask,n8henrie/homebrew-cask,lcasey001/homebrew-cask,hellosky806/homebrew-cask,JosephViolago/homebrew-cask,guerrero/homebrew-cask,BenjaminHCCarr/homebrew-cask,KosherBacon/homebrew-cask,a1russell/homebrew-cask,shonjir/homebrew-cask,vitorgalvao/homebrew-cask,gerrypower/homebrew-cask,pacav69/homebrew-cask,lucasmezencio/homebrew-cask,colindunn/homebrew-cask,optikfluffel/homebrew-cask,wickedsp1d3r/homebrew-cask,mwean/homebrew-cask,slack4u/homebrew-cask,stephenwade/homebrew-cask,imgarylai/homebrew-cask,uetchy/homebrew-cask,dictcp/homebrew-cask,cobyism/homebrew-cask,FredLackeyOfficial/homebrew-cask,vin047/homebrew-cask,bric3/homebrew-cask,scribblemaniac/homebrew-cask,MircoT/homebrew-cask,janlugt/homebrew-cask,mjdescy/homebrew-cask,squid314/homebrew-cask,joshka/homebrew-cask,kpearson/homebrew-cask,adrianchia/homebrew-cask,6uclz1/homebrew-cask,hanxue/caskroom,victorpopkov/homebrew-cask,cblecker/homebrew-cask,mazehall/homebrew-cask,bdhess/homebrew-cask,a1russell/homebrew-cask,xtian/homebrew-cask,ptb/homebrew-cask,albertico/homebrew-cask,mrmachine/homebrew-cask,alexg0/homebrew-cask,mjgardner/homebrew-cask,jconley/homebrew-cask,moogar0880/homebrew-cask,gabrielizaias/homebrew-cask,yumitsu/homebrew-cask,bric3/homebrew-cask,kkdd/homebrew-cask,KosherBacon/homebrew-cask,adrianchia/homebrew-cask,mingzhi22/homebrew-cask,caskroom/homebrew-cask,sohtsuka/homebrew-cask,ksato9700/homebrew-cask,markhuber/homebrew-cask,dcondrey/homebrew-cask,JikkuJose/homebrew-cask,jonathanwiesel/homebrew-cask,reelsense/homebrew-cask,wKovacs64/homebrew-cask,larseggert/homebrew-cask,cprecioso/homebrew-cask,deiga/homebrew-cask,Labutin/homebrew-cask,miku/homebrew-cask,lumaxis/homebrew-cask,Ephemera/homebrew-cask,m3nu/homebrew-cask,ianyh/homebrew-cask,jbeagley52/homebrew-cask,MichaelPei/homebrew-cask,deanmorin/homebrew-cask,perfide/homebrew-cask,JikkuJose/homebrew-cask,jeanregisser/homebrew-cask,13k/homebrew-cask,Gasol/homebrew-cask,jangalinski/homebrew-cask,onlynone/homebrew-cask,mindriot101/homebrew-cask,brianshumate/homebrew-cask,hyuna917/homebrew-cask,dvdoliveira/homebrew-cask,riyad/homebrew-cask,nightscape/homebrew-cask,nightscape/homebrew-cask,Gasol/homebrew-cask,coeligena/homebrew-customized,stigkj/homebrew-caskroom-cask,imgarylai/homebrew-cask,schneidmaster/homebrew-cask,kamilboratynski/homebrew-cask,nathanielvarona/homebrew-cask,rajiv/homebrew-cask,corbt/homebrew-cask,shorshe/homebrew-cask,kingthorin/homebrew-cask,nrlquaker/homebrew-cask,mchlrmrz/h
omebrew-cask,singingwolfboy/homebrew-cask,ericbn/homebrew-cask,williamboman/homebrew-cask,dictcp/homebrew-cask,Keloran/homebrew-cask,seanorama/homebrew-cask,athrunsun/homebrew-cask,phpwutz/homebrew-cask,m3nu/homebrew-cask,kamilboratynski/homebrew-cask,blogabe/homebrew-cask,BenjaminHCCarr/homebrew-cask,jedahan/homebrew-cask,skatsuta/homebrew-cask,dvdoliveira/homebrew-cask,yurikoles/homebrew-cask,neverfox/homebrew-cask,jhowtan/homebrew-cask,kongslund/homebrew-cask,gilesdring/homebrew-cask,scribblemaniac/homebrew-cask,tangestani/homebrew-cask,lukeadams/homebrew-cask,m3nu/homebrew-cask,chadcatlett/caskroom-homebrew-cask,elyscape/homebrew-cask,mingzhi22/homebrew-cask,axodys/homebrew-cask,daften/homebrew-cask,schneidmaster/homebrew-cask,fanquake/homebrew-cask,theoriginalgri/homebrew-cask,a1russell/homebrew-cask,lvicentesanchez/homebrew-cask,jeanregisser/homebrew-cask,pacav69/homebrew-cask,zerrot/homebrew-cask,hanxue/caskroom,andyli/homebrew-cask,malob/homebrew-cask,kronicd/homebrew-cask,mikem/homebrew-cask,santoshsahoo/homebrew-cask,chuanxd/homebrew-cask,kteru/homebrew-cask,seanzxx/homebrew-cask,My2ndAngelic/homebrew-cask,julionc/homebrew-cask,sgnh/homebrew-cask,Ibuprofen/homebrew-cask,tmoreira2020/homebrew,joschi/homebrew-cask,arronmabrey/homebrew-cask,mgryszko/homebrew-cask,xight/homebrew-cask,scribblemaniac/homebrew-cask,hristozov/homebrew-cask,nshemonsky/homebrew-cask,miguelfrde/homebrew-cask,jaredsampson/homebrew-cask,lifepillar/homebrew-cask,danielbayley/homebrew-cask,fharbe/homebrew-cask,joschi/homebrew-cask,deiga/homebrew-cask,jellyfishcoder/homebrew-cask,sgnh/homebrew-cask,buo/homebrew-cask,amatos/homebrew-cask,malford/homebrew-cask,n0ts/homebrew-cask,guerrero/homebrew-cask,mishari/homebrew-cask,lvicentesanchez/homebrew-cask,malob/homebrew-cask,mahori/homebrew-cask,miguelfrde/homebrew-cask,codeurge/homebrew-cask,diogodamiani/homebrew-cask,tedbundyjr/homebrew-cask,FredLackeyOfficial/homebrew-cask,dwkns/homebrew-cask,Ketouem/homebrew-cask,jeroenseegers/homebrew-cas
k,deanmorin/homebrew-cask,tangestani/homebrew-cask,n0ts/homebrew-cask,nshemonsky/homebrew-cask,reelsense/homebrew-cask,okket/homebrew-cask,giannitm/homebrew-cask,stonehippo/homebrew-cask,anbotero/homebrew-cask,mgryszko/homebrew-cask,haha1903/homebrew-cask,jpmat296/homebrew-cask,MichaelPei/homebrew-cask,elyscape/homebrew-cask,julionc/homebrew-cask,artdevjs/homebrew-cask,antogg/homebrew-cask,lantrix/homebrew-cask,kingthorin/homebrew-cask,Amorymeltzer/homebrew-cask,morganestes/homebrew-cask,samshadwell/homebrew-cask,tolbkni/homebrew-cask,dwkns/homebrew-cask,ywfwj2008/homebrew-cask,santoshsahoo/homebrew-cask,yumitsu/homebrew-cask,y00rb/homebrew-cask,uetchy/homebrew-cask,dwihn0r/homebrew-cask,thehunmonkgroup/homebrew-cask,miccal/homebrew-cask,franklouwers/homebrew-cask,gurghet/homebrew-cask,alebcay/homebrew-cask,blainesch/homebrew-cask,stonehippo/homebrew-cask,asbachb/homebrew-cask,diguage/homebrew-cask,faun/homebrew-cask,johnjelinek/homebrew-cask,zerrot/homebrew-cask,corbt/homebrew-cask,tmoreira2020/homebrew,y00rb/homebrew-cask,doits/homebrew-cask,antogg/homebrew-cask,miccal/homebrew-cask,cfillion/homebrew-cask,josa42/homebrew-cask,robbiethegeek/homebrew-cask,jbeagley52/homebrew-cask,kesara/homebrew-cask,alebcay/homebrew-cask,mathbunnyru/homebrew-cask,seanorama/homebrew-cask,andrewdisley/homebrew-cask,alexg0/homebrew-cask,jeroenseegers/homebrew-cask,jangalinski/homebrew-cask,anbotero/homebrew-cask,neverfox/homebrew-cask,optikfluffel/homebrew-cask,stigkj/homebrew-caskroom-cask,mauricerkelly/homebrew-cask,Dremora/homebrew-cask,nathancahill/homebrew-cask,wKovacs64/homebrew-cask,andrewdisley/homebrew-cask,kTitan/homebrew-cask,miku/homebrew-cask,kiliankoe/homebrew-cask,chrisfinazzo/homebrew-cask,colindunn/homebrew-cask,williamboman/homebrew-cask,afh/homebrew-cask,dcondrey/homebrew-cask,bosr/homebrew-cask,renard/homebrew-cask,nrlquaker/homebrew-cask,morganestes/homebrew-cask,helloIAmPau/homebrew-cask,robbiethegeek/homebrew-cask,klane/homebrew-cask,victorpopkov/homebrew-cask,t
imsutton/homebrew-cask,malob/homebrew-cask,diguage/homebrew-cask,cblecker/homebrew-cask,devmynd/homebrew-cask,jmeridth/homebrew-cask,sscotth/homebrew-cask,vigosan/homebrew-cask,rogeriopradoj/homebrew-cask,toonetown/homebrew-cask,shoichiaizawa/homebrew-cask,onlynone/homebrew-cask,lukasbestle/homebrew-cask,bcomnes/homebrew-cask,kingthorin/homebrew-cask,colindean/homebrew-cask,gilesdring/homebrew-cask,syscrusher/homebrew-cask,singingwolfboy/homebrew-cask,FinalDes/homebrew-cask,leipert/homebrew-cask,tedski/homebrew-cask,ddm/homebrew-cask,mchlrmrz/homebrew-cask,chadcatlett/caskroom-homebrew-cask,ksylvan/homebrew-cask,mauricerkelly/homebrew-cask,Dremora/homebrew-cask,yutarody/homebrew-cask,aguynamedryan/homebrew-cask,tedski/homebrew-cask,robertgzr/homebrew-cask,FranklinChen/homebrew-cask,cblecker/homebrew-cask,singingwolfboy/homebrew-cask,xyb/homebrew-cask,joshka/homebrew-cask,sanchezm/homebrew-cask,samnung/homebrew-cask,hyuna917/homebrew-cask,napaxton/homebrew-cask,MoOx/homebrew-cask,patresi/homebrew-cask,hanxue/caskroom,jacobbednarz/homebrew-cask,perfide/homebrew-cask,Ephemera/homebrew-cask,faun/homebrew-cask,jmeridth/homebrew-cask,maxnordlund/homebrew-cask,mishari/homebrew-cask,theoriginalgri/homebrew-cask,AnastasiaSulyagina/homebrew-cask,boecko/homebrew-cask,jiashuw/homebrew-cask,usami-k/homebrew-cask,gibsjose/homebrew-cask,tangestani/homebrew-cask,sjackman/homebrew-cask,ianyh/homebrew-cask,xakraz/homebrew-cask,wickedsp1d3r/homebrew-cask,hellosky806/homebrew-cask,optikfluffel/homebrew-cask,JosephViolago/homebrew-cask,esebastian/homebrew-cask,reitermarkus/homebrew-cask,sjackman/homebrew-cask,dictcp/homebrew-cask,bdhess/homebrew-cask,farmerchris/homebrew-cask,bric3/homebrew-cask,coeligena/homebrew-customized,Amorymeltzer/homebrew-cask,Keloran/homebrew-cask,lifepillar/homebrew-cask,sebcode/homebrew-cask,joshka/homebrew-cask,neverfox/homebrew-cask,colindean/homebrew-cask,yurikoles/homebrew-cask,zmwangx/homebrew-cask,tsparber/homebrew-cask,zmwangx/homebrew-cask,otaran/home
brew-cask,franklouwers/homebrew-cask,koenrh/homebrew-cask,miccal/homebrew-cask,diogodamiani/homebrew-cask,exherb/homebrew-cask,nathanielvarona/homebrew-cask,gerrypower/homebrew-cask,asins/homebrew-cask,samnung/homebrew-cask,Ngrd/homebrew-cask,troyxmccall/homebrew-cask,howie/homebrew-cask,thii/homebrew-cask,forevergenin/homebrew-cask,Saklad5/homebrew-cask,cedwardsmedia/homebrew-cask,inta/homebrew-cask,timsutton/homebrew-cask,joschi/homebrew-cask,vin047/homebrew-cask,xight/homebrew-cask,Fedalto/homebrew-cask,nathansgreen/homebrew-cask,kassi/homebrew-cask,SentinelWarren/homebrew-cask,rickychilcott/homebrew-cask,rajiv/homebrew-cask,Ketouem/homebrew-cask,sanyer/homebrew-cask,fanquake/homebrew-cask,hovancik/homebrew-cask,mwean/homebrew-cask,xcezx/homebrew-cask,michelegera/homebrew-cask,nathanielvarona/homebrew-cask,crzrcn/homebrew-cask,retbrown/homebrew-cask,greg5green/homebrew-cask,kTitan/homebrew-cask,stephenwade/homebrew-cask,tarwich/homebrew-cask,sebcode/homebrew-cask,jhowtan/homebrew-cask,feigaochn/homebrew-cask,sanyer/homebrew-cask,gibsjose/homebrew-cask,amatos/homebrew-cask,asins/homebrew-cask,0xadada/homebrew-cask,gyndav/homebrew-cask,yurikoles/homebrew-cask,tjt263/homebrew-cask,fharbe/homebrew-cask,esebastian/homebrew-cask,Cottser/homebrew-cask,mattrobenolt/homebrew-cask,sanchezm/homebrew-cask,JosephViolago/homebrew-cask,Fedalto/homebrew-cask,patresi/homebrew-cask,gabrielizaias/homebrew-cask,nathansgreen/homebrew-cask,skatsuta/homebrew-cask,mattrobenolt/homebrew-cask,mindriot101/homebrew-cask,tyage/homebrew-cask,ywfwj2008/homebrew-cask,ptb/homebrew-cask,decrement/homebrew-cask,kesara/homebrew-cask,afh/homebrew-cask,Cottser/homebrew-cask,maxnordlund/homebrew-cask,ksylvan/homebrew-cask,jalaziz/homebrew-cask,Bombenleger/homebrew-cask,puffdad/homebrew-cask,tan9/homebrew-cask,Saklad5/homebrew-cask,tjnycum/homebrew-cask,josa42/homebrew-cask,jalaziz/homebrew-cask,retrography/homebrew-cask,boecko/homebrew-cask,ksato9700/homebrew-cask,sohtsuka/homebrew-cask,tyage/homebrew
-cask,wastrachan/homebrew-cask,lantrix/homebrew-cask,kesara/homebrew-cask,sosedoff/homebrew-cask,ninjahoahong/homebrew-cask,jconley/homebrew-cask,alebcay/homebrew-cask,puffdad/homebrew-cask,tjnycum/homebrew-cask,moogar0880/homebrew-cask,MerelyAPseudonym/homebrew-cask,caskroom/homebrew-cask,ninjahoahong/homebrew-cask,JacopKane/homebrew-cask,phpwutz/homebrew-cask,0rax/homebrew-cask,jgarber623/homebrew-cask,xyb/homebrew-cask,klane/homebrew-cask,howie/homebrew-cask,bcomnes/homebrew-cask,jaredsampson/homebrew-cask,sanyer/homebrew-cask,blogabe/homebrew-cask,jawshooah/homebrew-cask,vitorgalvao/homebrew-cask,stephenwade/homebrew-cask,jalaziz/homebrew-cask,tarwich/homebrew-cask,linc01n/homebrew-cask,claui/homebrew-cask,seanzxx/homebrew-cask,jeroenj/homebrew-cask,muan/homebrew-cask,codeurge/homebrew-cask,xcezx/homebrew-cask,inz/homebrew-cask,deiga/homebrew-cask,uetchy/homebrew-cask,mahori/homebrew-cask,RJHsiao/homebrew-cask,scottsuch/homebrew-cask,yuhki50/homebrew-cask,retrography/homebrew-cask,goxberry/homebrew-cask,wickles/homebrew-cask,tolbkni/homebrew-cask,claui/homebrew-cask,inta/homebrew-cask,gyndav/homebrew-cask,jawshooah/homebrew-cask,squid314/homebrew-cask,scottsuch/homebrew-cask,elnappo/homebrew-cask,asbachb/homebrew-cask,pkq/homebrew-cask,opsdev-ws/homebrew-cask,hakamadare/homebrew-cask,kronicd/homebrew-cask,koenrh/homebrew-cask,cobyism/homebrew-cask,cliffcotino/homebrew-cask,decrement/homebrew-cask,lcasey001/homebrew-cask,jppelteret/homebrew-cask,MircoT/homebrew-cask,troyxmccall/homebrew-cask,lucasmezencio/homebrew-cask,toonetown/homebrew-cask,danielbayley/homebrew-cask,FranklinChen/homebrew-cask,helloIAmPau/homebrew-cask,dustinblackman/homebrew-cask,goxberry/homebrew-cask,markthetech/homebrew-cask,wmorin/homebrew-cask,stevehedrick/homebrew-cask,CameronGarrett/homebrew-cask,athrunsun/homebrew-cask,pkq/homebrew-cask,moimikey/homebrew-cask,kongslund/homebrew-cask,danielbayley/homebrew-cask,slack4u/homebrew-cask,reitermarkus/homebrew-cask,jeroenj/homebrew-cask,mjgard
ner/homebrew-cask,thomanq/homebrew-cask,retbrown/homebrew-cask,AnastasiaSulyagina/homebrew-cask,opsdev-ws/homebrew-cask,ayohrling/homebrew-cask,otaran/homebrew-cask,adrianchia/homebrew-cask,winkelsdorf/homebrew-cask,shonjir/homebrew-cask,chrisfinazzo/homebrew-cask,n8henrie/homebrew-cask,mahori/homebrew-cask,tjt263/homebrew-cask,xight/homebrew-cask,psibre/homebrew-cask,jgarber623/homebrew-cask,MoOx/homebrew-cask,mchlrmrz/homebrew-cask,xtian/homebrew-cask,renaudguerin/homebrew-cask,samdoran/homebrew-cask,usami-k/homebrew-cask,reitermarkus/homebrew-cask,jppelteret/homebrew-cask,blogabe/homebrew-cask,wastrachan/homebrew-cask,rogeriopradoj/homebrew-cask,rickychilcott/homebrew-cask,mattrobenolt/homebrew-cask,rogeriopradoj/homebrew-cask,riyad/homebrew-cask,jiashuw/homebrew-cask,buo/homebrew-cask,napaxton/homebrew-cask,farmerchris/homebrew-cask,RJHsiao/homebrew-cask,johndbritton/homebrew-cask,yuhki50/homebrew-cask,BenjaminHCCarr/homebrew-cask,exherb/homebrew-cask,jellyfishcoder/homebrew-cask,nrlquaker/homebrew-cask,lukasbestle/homebrew-cask,jpmat296/homebrew-cask,daften/homebrew-cask | ruby | ## Code Before:
cask :v1 => 'jumpshare' do
version '1.0.25-3'
sha256 'a2a2d44d858616965c8271dcfc177ec299e9592fed3abb795ddb81b533f6d818'
url "https://jumpshare.com/desktop/mac/Jumpshare_#{version}.dmg"
name 'Jumpshare'
homepage 'https://jumpshare.com/'
license :gratis
app 'Jumpshare.app'
end
## Instruction:
Update Jumpshare.app to version 2.1.1
## Code After:
cask :v1 => 'jumpshare' do
version '2.1.1'
sha256 '8181e5b35bd02dda770f81ba8f39b83e3c1cde0b934e2e563d962ce4ea16fced'
url "https://jumpshare.com/apps/mac/Jumpshare-#{version}.zip"
name 'Jumpshare'
homepage 'https://jumpshare.com/'
license :gratis
app 'Jumpshare.app'
end
| cask :v1 => 'jumpshare' do
- version '1.0.25-3'
? ^^^^^^
+ version '2.1.1'
? ++ ^
- sha256 'a2a2d44d858616965c8271dcfc177ec299e9592fed3abb795ddb81b533f6d818'
+ sha256 '8181e5b35bd02dda770f81ba8f39b83e3c1cde0b934e2e563d962ce4ea16fced'
- url "https://jumpshare.com/desktop/mac/Jumpshare_#{version}.dmg"
? ^^ ---- ^ ^^^
+ url "https://jumpshare.com/apps/mac/Jumpshare-#{version}.zip"
? ^^^ ^ ^^^
name 'Jumpshare'
homepage 'https://jumpshare.com/'
license :gratis
app 'Jumpshare.app'
end | 6 | 0.545455 | 3 | 3 |
4e1c6019c8f0d3b00e32fd6cd546d2ef445a285d | app/controllers/namespaces_controller.rb | app/controllers/namespaces_controller.rb | class NamespacesController < ApplicationController
skip_before_action :authenticate_user!
def show
namespace = Namespace.find_by(path: params[:id])
if namespace
if namespace.is_a?(Group)
group = namespace
else
user = namespace.owner
end
end
if user
redirect_to user_path(user)
elsif group && can?(current_user, :read_group, group)
redirect_to group_path(group)
elsif current_user.nil?
authenticate_user!
else
render_404
end
end
end
| class NamespacesController < ApplicationController
skip_before_action :authenticate_user!
def show
namespace = Namespace.find_by(path: params[:id])
if namespace
if namespace.is_a?(Group)
group = namespace
else
user = namespace.owner
end
end
if user
redirect_to user_path(user)
elsif group
redirect_to group_path(group)
elsif current_user.nil?
authenticate_user!
else
render_404
end
end
end
| Allow access to group from root url | Allow access to group from root url
| Ruby | mit | szechyjs/gitlabhq,ksoichiro/gitlabhq,jrjang/gitlabhq,whluwit/gitlabhq,mmkassem/gitlabhq,copystudy/gitlabhq,OlegGirko/gitlab-ce,t-zuehlsdorff/gitlabhq,mmkassem/gitlabhq,sonalkr132/gitlabhq,martijnvermaat/gitlabhq,sue445/gitlabhq,openwide-java/gitlabhq,mrb/gitlabhq,ikappas/gitlabhq,ikappas/gitlabhq,allistera/gitlabhq,dwrensha/gitlabhq,fantasywind/gitlabhq,t-zuehlsdorff/gitlabhq,MauriceMohlek/gitlabhq,larryli/gitlabhq,copystudy/gitlabhq,Datacom/gitlabhq,aaronsnyder/gitlabhq,allistera/gitlabhq,salipro4ever/gitlabhq,dplarson/gitlabhq,screenpages/gitlabhq,fantasywind/gitlabhq,delkyd/gitlabhq,dreampet/gitlab,htve/GitlabForChinese,k4zzk/gitlabhq,chenrui2014/gitlabhq,daiyu/gitlab-zh,cui-liqiang/gitlab-ce,vjustov/gitlabhq,LUMC/gitlabhq,folpindo/gitlabhq,ttasanen/gitlabhq,yatish27/gitlabhq,LUMC/gitlabhq,icedwater/gitlabhq,iiet/iiet-git,gopeter/gitlabhq,Tyrael/gitlabhq,OtkurBiz/gitlabhq,wangcan2014/gitlabhq,allysonbarros/gitlabhq,SVArago/gitlabhq,sonalkr132/gitlabhq,axilleas/gitlabhq,jrjang/gitlabhq,pjknkda/gitlabhq,OlegGirko/gitlab-ce,jirutka/gitlabhq,vjustov/gitlabhq,pjknkda/gitlabhq,yatish27/gitlabhq,Tyrael/gitlabhq,yfaizal/gitlabhq,shinexiao/gitlabhq,axilleas/gitlabhq,allysonbarros/gitlabhq,Datacom/gitlabhq,wangcan2014/gitlabhq,louahola/gitlabhq,koreamic/gitlabhq,szechyjs/gitlabhq,aaronsnyder/gitlabhq,icedwater/gitlabhq,ttasanen/gitlabhq,htve/GitlabForChinese,yfaizal/gitlabhq,folpindo/gitlabhq,copystudy/gitlabhq,whluwit/gitlabhq,dwrensha/gitlabhq,mmkassem/gitlabhq,SVArago/gitlabhq,mmkassem/gitlabhq,mr-dxdy/gitlabhq,joalmeid/gitlabhq,NKMR6194/gitlabhq,yatish27/gitlabhq,screenpages/gitlabhq,Exeia/gitlabhq,dreampet/gitlab,delkyd/gitlabhq,Tyrael/gitlabhq,SVArago/gitlabhq,larryli/gitlabhq,fpgentil/gitlabhq,htve/GitlabForChinese,salipro4ever/gitlabhq,ksoichiro/gitlabhq,MauriceMohlek/gitlabhq,jrjang/gitlab-ce,daiyu/gitlab-zh,screenpages/gitlabhq,jirutka/gitlabhq,mr-dxdy/gitlabhq,stoplightio/gitlabhq,Devin001/gitlabhq,martijnvermaat/gitlabhq,Telekom-PD/gitlabhq,dudur
ibeiro/gitlabhq,fpgentil/gitlabhq,htve/GitlabForChinese,sue445/gitlabhq,screenpages/gitlabhq,jrjang/gitlab-ce,Telekom-PD/gitlabhq,k4zzk/gitlabhq,ferdinandrosario/gitlabhq,Datacom/gitlabhq,MauriceMohlek/gitlabhq,sue445/gitlabhq,stoplightio/gitlabhq,t-zuehlsdorff/gitlabhq,mr-dxdy/gitlabhq,icedwater/gitlabhq,ferdinandrosario/gitlabhq,LUMC/gitlabhq,Soullivaneuh/gitlabhq,mrb/gitlabhq,folpindo/gitlabhq,dreampet/gitlab,whluwit/gitlabhq,szechyjs/gitlabhq,joalmeid/gitlabhq,jirutka/gitlabhq,salipro4ever/gitlabhq,whluwit/gitlabhq,Soullivaneuh/gitlabhq,iiet/iiet-git,mrb/gitlabhq,vjustov/gitlabhq,Soullivaneuh/gitlabhq,koreamic/gitlabhq,szechyjs/gitlabhq,jrjang/gitlab-ce,darkrasid/gitlabhq,martijnvermaat/gitlabhq,allistera/gitlabhq,openwide-java/gitlabhq,fscherwi/gitlabhq,yatish27/gitlabhq,OlegGirko/gitlab-ce,ferdinandrosario/gitlabhq,shinexiao/gitlabhq,jrjang/gitlabhq,fpgentil/gitlabhq,Devin001/gitlabhq,koreamic/gitlabhq,ferdinandrosario/gitlabhq,gopeter/gitlabhq,NKMR6194/gitlabhq,shinexiao/gitlabhq,daiyu/gitlab-zh,fantasywind/gitlabhq,Datacom/gitlabhq,gopeter/gitlabhq,fscherwi/gitlabhq,copystudy/gitlabhq,openwide-java/gitlabhq,folpindo/gitlabhq,stoplightio/gitlabhq,dplarson/gitlabhq,yfaizal/gitlabhq,fpgentil/gitlabhq,iiet/iiet-git,martijnvermaat/gitlabhq,ikappas/gitlabhq,OtkurBiz/gitlabhq,sonalkr132/gitlabhq,koreamic/gitlabhq,delkyd/gitlabhq,gopeter/gitlabhq,stoplightio/gitlabhq,louahola/gitlabhq,delkyd/gitlabhq,OtkurBiz/gitlabhq,mr-dxdy/gitlabhq,Exeia/gitlabhq,joalmeid/gitlabhq,dplarson/gitlabhq,SVArago/gitlabhq,Telekom-PD/gitlabhq,axilleas/gitlabhq,ksoichiro/gitlabhq,Devin001/gitlabhq,MauriceMohlek/gitlabhq,dplarson/gitlabhq,vjustov/gitlabhq,Exeia/gitlabhq,LUMC/gitlabhq,allysonbarros/gitlabhq,jirutka/gitlabhq,dwrensha/gitlabhq,cui-liqiang/gitlab-ce,NKMR6194/gitlabhq,axilleas/gitlabhq,darkrasid/gitlabhq,k4zzk/gitlabhq,fantasywind/gitlabhq,darkrasid/gitlabhq,ttasanen/gitlabhq,darkrasid/gitlabhq,yfaizal/gitlabhq,jrjang/gitlabhq,salipro4ever/gitlabhq,duduribeiro/gitlabhq,ttasanen
/gitlabhq,chenrui2014/gitlabhq,aaronsnyder/gitlabhq,t-zuehlsdorff/gitlabhq,fscherwi/gitlabhq,daiyu/gitlab-zh,duduribeiro/gitlabhq,dreampet/gitlab,chenrui2014/gitlabhq,wangcan2014/gitlabhq,cui-liqiang/gitlab-ce,allysonbarros/gitlabhq,louahola/gitlabhq,duduribeiro/gitlabhq,sue445/gitlabhq,pjknkda/gitlabhq,ksoichiro/gitlabhq,Devin001/gitlabhq,Telekom-PD/gitlabhq,allistera/gitlabhq,ikappas/gitlabhq,shinexiao/gitlabhq,icedwater/gitlabhq,larryli/gitlabhq,mrb/gitlabhq,pjknkda/gitlabhq,openwide-java/gitlabhq,wangcan2014/gitlabhq,NKMR6194/gitlabhq,Tyrael/gitlabhq,cui-liqiang/gitlab-ce,fscherwi/gitlabhq,OlegGirko/gitlab-ce,Soullivaneuh/gitlabhq,larryli/gitlabhq,aaronsnyder/gitlabhq,k4zzk/gitlabhq,dwrensha/gitlabhq,jrjang/gitlab-ce,iiet/iiet-git,Exeia/gitlabhq,joalmeid/gitlabhq,chenrui2014/gitlabhq,louahola/gitlabhq,sonalkr132/gitlabhq,OtkurBiz/gitlabhq | ruby | ## Code Before:
class NamespacesController < ApplicationController
skip_before_action :authenticate_user!
def show
namespace = Namespace.find_by(path: params[:id])
if namespace
if namespace.is_a?(Group)
group = namespace
else
user = namespace.owner
end
end
if user
redirect_to user_path(user)
elsif group && can?(current_user, :read_group, group)
redirect_to group_path(group)
elsif current_user.nil?
authenticate_user!
else
render_404
end
end
end
## Instruction:
Allow access to group from root url
## Code After:
class NamespacesController < ApplicationController
skip_before_action :authenticate_user!
def show
namespace = Namespace.find_by(path: params[:id])
if namespace
if namespace.is_a?(Group)
group = namespace
else
user = namespace.owner
end
end
if user
redirect_to user_path(user)
elsif group
redirect_to group_path(group)
elsif current_user.nil?
authenticate_user!
else
render_404
end
end
end
| class NamespacesController < ApplicationController
skip_before_action :authenticate_user!
def show
namespace = Namespace.find_by(path: params[:id])
if namespace
if namespace.is_a?(Group)
group = namespace
else
user = namespace.owner
end
end
if user
redirect_to user_path(user)
- elsif group && can?(current_user, :read_group, group)
+ elsif group
redirect_to group_path(group)
elsif current_user.nil?
authenticate_user!
else
render_404
end
end
end | 2 | 0.08 | 1 | 1 |
4be5248dd82b11cf0dcb0a165b7307ef35e43346 | .travis.yml | .travis.yml | language: ruby
rvm:
- 1.9.3
script: "bundle exec rake coverage"
services:
- rabbitmq
| language: ruby
rvm:
- 1.9.3
- 2.1
- jruby-1.7.16
script: "bundle exec rake coverage"
services:
- rabbitmq
notifications:
email:
- support@travellink.com.au
flowdock: e69dcafad1fea15c6b8c76e9ced965af
| Add jruby, add Ruby 2.1, add notifications | Add jruby, add Ruby 2.1, add notifications
| YAML | mit | sealink/lagomorph | yaml | ## Code Before:
language: ruby
rvm:
- 1.9.3
script: "bundle exec rake coverage"
services:
- rabbitmq
## Instruction:
Add jruby, add Ruby 2.1, add notifications
## Code After:
language: ruby
rvm:
- 1.9.3
- 2.1
- jruby-1.7.16
script: "bundle exec rake coverage"
services:
- rabbitmq
notifications:
email:
- support@travellink.com.au
flowdock: e69dcafad1fea15c6b8c76e9ced965af
| language: ruby
rvm:
- 1.9.3
+ - 2.1
+ - jruby-1.7.16
script: "bundle exec rake coverage"
services:
- rabbitmq
+ notifications:
+ email:
+ - support@travellink.com.au
+ flowdock: e69dcafad1fea15c6b8c76e9ced965af | 6 | 1 | 6 | 0 |
dbd497a7aa0e67119c08777ace4a75508a7e4fd9 | .circleci/config.yml | .circleci/config.yml | version: 2
jobs:
build-nightly:
working-directory: ~/sonos_discovery
docker:
- image: liuchong/rustup:nightly
steps:
- checkout
- run:
name: Update nightly version
command: "rustup update nightly"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on nightly
command: "rustup run nightly cargo build --bin sonos_discovery"
build-nightly:
working-directory: ~/sonos_discovery
docker:
- image: liuchong/rustup:beta
steps:
- checkout
- run:
name: Update beta version
command: "rustup update beta"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on beta
command: "rustup run beta cargo build --bin sonos_discovery"
build-nightly:
working-directory: ~/sonos_discovery
docker:
- image: liuchong/rustup:stable
steps:
- checkout
- run:
name: Update stable version
command: "rustup stable nightly"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on stable
command: "rustup run stable cargo build --bin sonos_discovery"
| version: 2
jobs:
build-nightly:
working-directory: ~/sonos_discovery
docker:
- image: liuchong/rustup:nightly
steps:
- checkout
- run:
name: Update nightly version
command: "rustup update nightly"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on nightly
command: "rustup run nightly cargo build --bin sonos_discovery"
- run:
name: Update beta version
command: "rustup update beta"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on beta
command: "rustup run beta cargo build --bin sonos_discovery"
- run:
name: Update stable version
command: "rustup update stable"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on stable
command: "rustup run stable cargo build --bin sonos_discovery"
| Move different builds into build stage | Move different builds into build stage
| YAML | mit | Chabare/sonos_discovery | yaml | ## Code Before:
version: 2
jobs:
build-nightly:
working-directory: ~/sonos_discovery
docker:
- image: liuchong/rustup:nightly
steps:
- checkout
- run:
name: Update nightly version
command: "rustup update nightly"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on nightly
command: "rustup run nightly cargo build --bin sonos_discovery"
build-nightly:
working-directory: ~/sonos_discovery
docker:
- image: liuchong/rustup:beta
steps:
- checkout
- run:
name: Update beta version
command: "rustup update beta"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on beta
command: "rustup run beta cargo build --bin sonos_discovery"
build-nightly:
working-directory: ~/sonos_discovery
docker:
- image: liuchong/rustup:stable
steps:
- checkout
- run:
name: Update stable version
command: "rustup stable nightly"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on stable
command: "rustup run stable cargo build --bin sonos_discovery"
## Instruction:
Move different builds into build stage
## Code After:
version: 2
jobs:
build-nightly:
working-directory: ~/sonos_discovery
docker:
- image: liuchong/rustup:nightly
steps:
- checkout
- run:
name: Update nightly version
command: "rustup update nightly"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on nightly
command: "rustup run nightly cargo build --bin sonos_discovery"
- run:
name: Update beta version
command: "rustup update beta"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on beta
command: "rustup run beta cargo build --bin sonos_discovery"
- run:
name: Update stable version
command: "rustup update stable"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on stable
command: "rustup run stable cargo build --bin sonos_discovery"
| version: 2
jobs:
build-nightly:
working-directory: ~/sonos_discovery
docker:
- image: liuchong/rustup:nightly
steps:
- checkout
- run:
name: Update nightly version
command: "rustup update nightly"
- run:
name: Show version
command: "rustup show"
- run:
name: Run bin on nightly
command: "rustup run nightly cargo build --bin sonos_discovery"
- build-nightly:
- working-directory: ~/sonos_discovery
- docker:
- - image: liuchong/rustup:beta
- steps:
- - checkout
- - run:
? --
+ - run:
- name: Update beta version
? --
+ name: Update beta version
- command: "rustup update beta"
? --
+ command: "rustup update beta"
- - run:
? --
+ - run:
- name: Show version
? --
+ name: Show version
- command: "rustup show"
? --
+ command: "rustup show"
- - run:
? --
+ - run:
- name: Run bin on beta
? --
+ name: Run bin on beta
- command: "rustup run beta cargo build --bin sonos_discovery"
? --
+ command: "rustup run beta cargo build --bin sonos_discovery"
- build-nightly:
- working-directory: ~/sonos_discovery
- docker:
- - image: liuchong/rustup:stable
- steps:
- - checkout
- - run:
? --
+ - run:
- name: Update stable version
? --
+ name: Update stable version
- command: "rustup stable nightly"
? -- --------
+ command: "rustup update stable"
? +++++++
- - run:
? --
+ - run:
- name: Show version
? --
+ name: Show version
- command: "rustup show"
? --
+ command: "rustup show"
- - run:
? --
+ - run:
- name: Run bin on stable
? --
+ name: Run bin on stable
- command: "rustup run stable cargo build --bin sonos_discovery"
? --
+ command: "rustup run stable cargo build --bin sonos_discovery" | 48 | 1 | 18 | 30 |
0bdcb1c36432cfa0506c6dd667e4e1910edcd371 | ixprofile_client/management/commands/createsuperuser.py | ixprofile_client/management/commands/createsuperuser.py |
from django.contrib.auth.models import User
from django.core.management.base import BaseCommand, CommandError
from ixprofile_client.webservice import UserWebService
from optparse import make_option
class Command(BaseCommand):
"""
The command to create a superuser with a given email.
"""
option_list = BaseCommand.option_list + (
make_option('--email', default=None,
help='Specifies the email for the superuser.'),
make_option('--noinput',
action='store_false',
dest='interactive',
default=True,
help='Tells Django to NOT prompt the user for input of ' +
'any kind. You must use --email with --noinput.'),
)
def handle(self, *args, **options):
interactive = options.get('interactive')
email = options.get('email')
verbosity = int(options.get('verbosity', 1))
if interactive and not email:
email = raw_input("Email: ")
if not email:
raise CommandError("No email given.")
user = User()
user.email = email
user.set_password(None)
user.is_active = True
user.is_staff = True
user.is_superuser = True
user_ws = UserWebService()
user_ws.connect(user)
if verbosity >= 1:
self.stdout.write("Superuser created successfully.")
|
from django.contrib.auth.models import User
from django.core.management.base import BaseCommand, CommandError
from django.db import transaction
from ixprofile_client.webservice import UserWebService
from optparse import make_option
class Command(BaseCommand):
"""
The command to create a superuser with a given email.
"""
option_list = BaseCommand.option_list + (
make_option('--email', default=None,
help='Specifies the email for the superuser.'),
make_option('--noinput',
action='store_false',
dest='interactive',
default=True,
help='Tells Django to NOT prompt the user for input of ' +
'any kind. You must use --email with --noinput.'),
)
def handle(self, *args, **options):
interactive = options.get('interactive')
email = options.get('email')
verbosity = int(options.get('verbosity', 1))
if interactive and not email:
email = raw_input("Email: ")
if not email:
raise CommandError("No email given.")
with transaction.atomic():
user, created = User.objects.get_or_create(email=email)
user.set_password(None)
user.is_active = True
user.is_staff = True
user.is_superuser = True
user_ws = UserWebService()
user_ws.connect(user)
if verbosity >= 1:
if created:
self.stdout.write("Superuser created successfully.")
else:
self.stdout.write("Superuser flag added successfully.")
| Handle the case where the user may already exist in the database | Handle the case where the user may already exist in the database
| Python | mit | infoxchange/ixprofile-client,infoxchange/ixprofile-client | python | ## Code Before:
from django.contrib.auth.models import User
from django.core.management.base import BaseCommand, CommandError
from ixprofile_client.webservice import UserWebService
from optparse import make_option
class Command(BaseCommand):
"""
The command to create a superuser with a given email.
"""
option_list = BaseCommand.option_list + (
make_option('--email', default=None,
help='Specifies the email for the superuser.'),
make_option('--noinput',
action='store_false',
dest='interactive',
default=True,
help='Tells Django to NOT prompt the user for input of ' +
'any kind. You must use --email with --noinput.'),
)
def handle(self, *args, **options):
interactive = options.get('interactive')
email = options.get('email')
verbosity = int(options.get('verbosity', 1))
if interactive and not email:
email = raw_input("Email: ")
if not email:
raise CommandError("No email given.")
user = User()
user.email = email
user.set_password(None)
user.is_active = True
user.is_staff = True
user.is_superuser = True
user_ws = UserWebService()
user_ws.connect(user)
if verbosity >= 1:
self.stdout.write("Superuser created successfully.")
## Instruction:
Handle the case where the user may already exist in the database
## Code After:
from django.contrib.auth.models import User
from django.core.management.base import BaseCommand, CommandError
from django.db import transaction
from ixprofile_client.webservice import UserWebService
from optparse import make_option
class Command(BaseCommand):
"""
The command to create a superuser with a given email.
"""
option_list = BaseCommand.option_list + (
make_option('--email', default=None,
help='Specifies the email for the superuser.'),
make_option('--noinput',
action='store_false',
dest='interactive',
default=True,
help='Tells Django to NOT prompt the user for input of ' +
'any kind. You must use --email with --noinput.'),
)
def handle(self, *args, **options):
interactive = options.get('interactive')
email = options.get('email')
verbosity = int(options.get('verbosity', 1))
if interactive and not email:
email = raw_input("Email: ")
if not email:
raise CommandError("No email given.")
with transaction.atomic():
user, created = User.objects.get_or_create(email=email)
user.set_password(None)
user.is_active = True
user.is_staff = True
user.is_superuser = True
user_ws = UserWebService()
user_ws.connect(user)
if verbosity >= 1:
if created:
self.stdout.write("Superuser created successfully.")
else:
self.stdout.write("Superuser flag added successfully.")
|
from django.contrib.auth.models import User
from django.core.management.base import BaseCommand, CommandError
+ from django.db import transaction
from ixprofile_client.webservice import UserWebService
from optparse import make_option
class Command(BaseCommand):
"""
The command to create a superuser with a given email.
"""
option_list = BaseCommand.option_list + (
make_option('--email', default=None,
help='Specifies the email for the superuser.'),
make_option('--noinput',
action='store_false',
dest='interactive',
default=True,
help='Tells Django to NOT prompt the user for input of ' +
'any kind. You must use --email with --noinput.'),
)
def handle(self, *args, **options):
interactive = options.get('interactive')
email = options.get('email')
verbosity = int(options.get('verbosity', 1))
if interactive and not email:
email = raw_input("Email: ")
if not email:
raise CommandError("No email given.")
- user = User()
- user.email = email
+ with transaction.atomic():
+ user, created = User.objects.get_or_create(email=email)
- user.set_password(None)
+ user.set_password(None)
? ++++
- user.is_active = True
+ user.is_active = True
? ++++
- user.is_staff = True
+ user.is_staff = True
? ++++
- user.is_superuser = True
+ user.is_superuser = True
? ++++
- user_ws = UserWebService()
+ user_ws = UserWebService()
? ++++
- user_ws.connect(user)
+ user_ws.connect(user)
? ++++
- if verbosity >= 1:
+ if verbosity >= 1:
? ++++
+ if created:
- self.stdout.write("Superuser created successfully.")
+ self.stdout.write("Superuser created successfully.")
? ++++++++
+ else:
+ self.stdout.write("Superuser flag added successfully.") | 24 | 0.510638 | 14 | 10 |
e355a28cb0e9c46897149c8b4fec444634d2aa4c | src/layouts/index.css | src/layouts/index.css | h1 a,
h2 a,
h3 a,
h4 a,
h5 a,
h6 a {
background: none
}
| html {
font-size: 16px;
}
h1 a,
h2 a,
h3 a,
h4 a,
h5 a,
h6 a {
background: none
}
| Fix breakpoints with smaller font size | Fix breakpoints with smaller font size
rem in media queries is based on the initial 16px font size, and calc()
isn't widely enough supported in media queries to be usable.
| CSS | mit | jbhannah/jbhannah.net,jbhannah/jbhannah.net | css | ## Code Before:
h1 a,
h2 a,
h3 a,
h4 a,
h5 a,
h6 a {
background: none
}
## Instruction:
Fix breakpoints with smaller font size
rem in media queries is based on the initial 16px font size, and calc()
isn't widely enough supported in media queries to be usable.
## Code After:
html {
font-size: 16px;
}
h1 a,
h2 a,
h3 a,
h4 a,
h5 a,
h6 a {
background: none
}
| + html {
+ font-size: 16px;
+ }
+
h1 a,
h2 a,
h3 a,
h4 a,
h5 a,
h6 a {
background: none
} | 4 | 0.5 | 4 | 0 |
2375296e67cf813db2076877ba64c0d3dfae2b3f | lib/plugins/index.js | lib/plugins/index.js | var _ = require('lodash');
var Promise = require('../utils/promise');
var BookPlugin = require('./plugin');
/*
PluginsManager is an interface to work with multiple plugins at once:
- Extract assets from plugins
- Call hooks for all plugins, etc
*/
function PluginsManager(output) {
this.output = output;
this.book = output.book;
this.plugins = [];
}
// Returns a plugin by its name
PluginsManager.prototype.get = function(name) {
return _.find(this.plugins, {
id: name
});
};
// Load a plugin, or a list of plugins
PluginsManager.prototype.load = function(name) {
var that = this;
if (_.isArray(name)) {
return Promise.serie(name, function(_name) {
return that.load(_name);
});
}
return Promise()
// Initiate and load the plugin
.then(function() {
var plugin;
if (!_.isString(name)) plugin = name;
else plugin = new BookPlugin(that.book, name);
if (that.get(plugin.id)) {
throw new Error('Plugin "'+plugin.id+'" is already loaded');
}
if (plugin.isLoaded()) return plugin;
else return plugin.load()
.thenResolve(plugin);
})
.then(function(plugin) {
that.plugins.push(plugin);
});
};
module.exports = PluginsManager;
| var _ = require('lodash');
var Promise = require('../utils/promise');
var BookPlugin = require('./plugin');
/*
PluginsManager is an interface to work with multiple plugins at once:
- Extract assets from plugins
- Call hooks for all plugins, etc
*/
function PluginsManager(output) {
this.output = output;
this.book = output.book;
this.plugins = [];
}
// Returns a plugin by its name
PluginsManager.prototype.get = function(name) {
return _.find(this.plugins, {
id: name
});
};
// Load a plugin, or a list of plugins
PluginsManager.prototype.load = function(name) {
var that = this;
if (_.isArray(name)) {
return Promise.serie(name, function(_name) {
return that.load(_name);
});
}
return Promise()
// Initiate and load the plugin
.then(function() {
var plugin;
if (!_.isString(name)) plugin = name;
else plugin = new BookPlugin(that.book, name);
if (that.get(plugin.id)) {
throw new Error('Plugin "'+plugin.id+'" is already loaded');
}
if (plugin.isLoaded()) return plugin;
else return plugin.load()
.thenResolve(plugin);
})
// Setup the plugin
.then(this._setup);
};
// Setup a plugin
// Register its filter, blocks, etc
PluginsManager.prototype._setup = function(plugin) {
};
module.exports = PluginsManager;
| Add base _setup for pluginslist | Add base _setup for pluginslist
| JavaScript | apache-2.0 | GitbookIO/gitbook,tshoper/gitbook,gencer/gitbook,gencer/gitbook,ryanswanson/gitbook,tshoper/gitbook,strawluffy/gitbook | javascript | ## Code Before:
var _ = require('lodash');
var Promise = require('../utils/promise');
var BookPlugin = require('./plugin');
/*
PluginsManager is an interface to work with multiple plugins at once:
- Extract assets from plugins
- Call hooks for all plugins, etc
*/
function PluginsManager(output) {
this.output = output;
this.book = output.book;
this.plugins = [];
}
// Returns a plugin by its name
PluginsManager.prototype.get = function(name) {
return _.find(this.plugins, {
id: name
});
};
// Load a plugin, or a list of plugins
PluginsManager.prototype.load = function(name) {
var that = this;
if (_.isArray(name)) {
return Promise.serie(name, function(_name) {
return that.load(_name);
});
}
return Promise()
// Initiate and load the plugin
.then(function() {
var plugin;
if (!_.isString(name)) plugin = name;
else plugin = new BookPlugin(that.book, name);
if (that.get(plugin.id)) {
throw new Error('Plugin "'+plugin.id+'" is already loaded');
}
if (plugin.isLoaded()) return plugin;
else return plugin.load()
.thenResolve(plugin);
})
.then(function(plugin) {
that.plugins.push(plugin);
});
};
module.exports = PluginsManager;
## Instruction:
Add base _setup for pluginslist
## Code After:
var _ = require('lodash');
var Promise = require('../utils/promise');
var BookPlugin = require('./plugin');
/*
PluginsManager is an interface to work with multiple plugins at once:
- Extract assets from plugins
- Call hooks for all plugins, etc
*/
function PluginsManager(output) {
this.output = output;
this.book = output.book;
this.plugins = [];
}
// Returns a plugin by its name
PluginsManager.prototype.get = function(name) {
return _.find(this.plugins, {
id: name
});
};
// Load a plugin, or a list of plugins
PluginsManager.prototype.load = function(name) {
var that = this;
if (_.isArray(name)) {
return Promise.serie(name, function(_name) {
return that.load(_name);
});
}
return Promise()
// Initiate and load the plugin
.then(function() {
var plugin;
if (!_.isString(name)) plugin = name;
else plugin = new BookPlugin(that.book, name);
if (that.get(plugin.id)) {
throw new Error('Plugin "'+plugin.id+'" is already loaded');
}
if (plugin.isLoaded()) return plugin;
else return plugin.load()
.thenResolve(plugin);
})
// Setup the plugin
.then(this._setup);
};
// Setup a plugin
// Register its filter, blocks, etc
PluginsManager.prototype._setup = function(plugin) {
};
module.exports = PluginsManager;
| var _ = require('lodash');
var Promise = require('../utils/promise');
var BookPlugin = require('./plugin');
/*
PluginsManager is an interface to work with multiple plugins at once:
- Extract assets from plugins
- Call hooks for all plugins, etc
*/
function PluginsManager(output) {
this.output = output;
this.book = output.book;
this.plugins = [];
}
// Returns a plugin by its name
PluginsManager.prototype.get = function(name) {
return _.find(this.plugins, {
id: name
});
};
// Load a plugin, or a list of plugins
PluginsManager.prototype.load = function(name) {
var that = this;
if (_.isArray(name)) {
return Promise.serie(name, function(_name) {
return that.load(_name);
});
}
return Promise()
// Initiate and load the plugin
.then(function() {
var plugin;
if (!_.isString(name)) plugin = name;
else plugin = new BookPlugin(that.book, name);
if (that.get(plugin.id)) {
throw new Error('Plugin "'+plugin.id+'" is already loaded');
}
if (plugin.isLoaded()) return plugin;
else return plugin.load()
.thenResolve(plugin);
})
- .then(function(plugin) {
- that.plugins.push(plugin);
- });
+ // Setup the plugin
+ .then(this._setup);
+ };
+
+ // Setup a plugin
+ // Register its filter, blocks, etc
+ PluginsManager.prototype._setup = function(plugin) {
+
};
module.exports = PluginsManager; | 11 | 0.183333 | 8 | 3 |
aeefef1f80ba92c7900c95c436b61b019d8ffb6a | src/waldur_mastermind/marketplace_openstack/migrations/0011_limit_components.py | src/waldur_mastermind/marketplace_openstack/migrations/0011_limit_components.py | from django.db import migrations
TENANT_TYPE = 'Packages.Template'
LIMIT = 'limit'
def process_components(apps, schema_editor):
OfferingComponent = apps.get_model('marketplace', 'OfferingComponent')
OfferingComponent.objects.filter(offering__type=TENANT_TYPE).update(
billing_type=LIMIT
)
class Migration(migrations.Migration):
dependencies = [
('marketplace_openstack', '0010_split_invoice_items'),
]
operations = [migrations.RunPython(process_components)]
| from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('marketplace_openstack', '0010_split_invoice_items'),
]
| Remove invalid migration script: it has been superceded by 0052_limit_components | Remove invalid migration script: it has been superceded by 0052_limit_components
| Python | mit | opennode/waldur-mastermind,opennode/nodeconductor-assembly-waldur,opennode/waldur-mastermind,opennode/waldur-mastermind,opennode/waldur-mastermind,opennode/nodeconductor-assembly-waldur,opennode/nodeconductor-assembly-waldur | python | ## Code Before:
from django.db import migrations
TENANT_TYPE = 'Packages.Template'
LIMIT = 'limit'
def process_components(apps, schema_editor):
OfferingComponent = apps.get_model('marketplace', 'OfferingComponent')
OfferingComponent.objects.filter(offering__type=TENANT_TYPE).update(
billing_type=LIMIT
)
class Migration(migrations.Migration):
dependencies = [
('marketplace_openstack', '0010_split_invoice_items'),
]
operations = [migrations.RunPython(process_components)]
## Instruction:
Remove invalid migration script: it has been superceded by 0052_limit_components
## Code After:
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('marketplace_openstack', '0010_split_invoice_items'),
]
| from django.db import migrations
-
- TENANT_TYPE = 'Packages.Template'
- LIMIT = 'limit'
-
-
- def process_components(apps, schema_editor):
- OfferingComponent = apps.get_model('marketplace', 'OfferingComponent')
- OfferingComponent.objects.filter(offering__type=TENANT_TYPE).update(
- billing_type=LIMIT
- )
class Migration(migrations.Migration):
dependencies = [
('marketplace_openstack', '0010_split_invoice_items'),
]
-
- operations = [migrations.RunPython(process_components)] | 12 | 0.631579 | 0 | 12 |
140b8e7a41364d003efeba0babe4b57fee6f5ab3 | README.md | README.md | phabricator-to-zulip
========================
This daemon reads [Phabricator](http://phabricator.org/)'s story feed and sends
updates to the [Zulip](https://zulip.com/) chat system.
## Instructions
* Create a stream called "Phabricator" inside Zulip.
* Ensure that you and other users are subscribed to the stream.
* Choose a directory where you want to install the integration software. A
reasonable place would be the parent directory of the "phabricator" directory
of your Phabricator installation, but it's fairly arbitrary.
* `git clone https://github.com/showell/phabricator-to-zulip.git`
* `cd phabricator-to-zulip`
* Collect the following information:
* your Zulip API key
* the location of Phabricator's `__init_script__.php` file
* your Conduit certificate for Phabricator (typically in `~/.arcrc`)
* Edit `ZulipConfig.php` to suit your installation.
* This is very important!
* You will typically modify about eight configuration parameters.
* Run the listener program: `php ZulipFeedListener.php'
* You may need to locate the php binary in your system.
* Try triggering these Phabricator events while watching your "Phabricator" stream on Zulip:
* Edit a Maniphest task.
* Edit a Differential code review.
| phabricator-to-zulip
========================
This daemon reads [Phabricator](http://phabricator.org/)'s story feed and sends
updates to the [Zulip](https://zulip.com/) chat system.
## Instructions
1. Create a stream called "Phabricator" inside Zulip.
2. Ensure that you and other users are subscribed to the stream.
3. Choose a directory where you want to install the integration software. A
reasonable place would be the parent directory of the "phabricator" directory
of your Phabricator installation, but it's fairly arbitrary.
4. `git clone https://github.com/showell/phabricator-to-zulip.git`
5. `cd phabricator-to-zulip`
6. Collect the following information:
* your Zulip API key
* the location of Phabricator's `__init_script__.php` file
* your Conduit certificate for Phabricator (typically in `~/.arcrc`)
7. Edit `ZulipConfig.php` to suit your installation.
* This is very important!
* You will typically modify about eight configuration parameters.
8. Run the listener program: `php ZulipFeedListener.php'
* You may need to locate the php binary in your system.
9. Try triggering these Phabricator events while watching your "Phabricator" stream on Zulip:
* Edit a Maniphest task.
* Edit a Differential code review.
| Enumerate instructions instead of listing. | Enumerate instructions instead of listing.
| Markdown | apache-2.0 | zulip/phabricator-to-zulip,wweiradio/phabricator-to-zulip | markdown | ## Code Before:
phabricator-to-zulip
========================
This daemon reads [Phabricator](http://phabricator.org/)'s story feed and sends
updates to the [Zulip](https://zulip.com/) chat system.
## Instructions
* Create a stream called "Phabricator" inside Zulip.
* Ensure that you and other users are subscribed to the stream.
* Choose a directory where you want to install the integration software. A
reasonable place would be the parent directory of the "phabricator" directory
of your Phabricator installation, but it's fairly arbitrary.
* `git clone https://github.com/showell/phabricator-to-zulip.git`
* `cd phabricator-to-zulip`
* Collect the following information:
* your Zulip API key
* the location of Phabricator's `__init_script__.php` file
* your Conduit certificate for Phabricator (typically in `~/.arcrc`)
* Edit `ZulipConfig.php` to suit your installation.
* This is very important!
* You will typically modify about eight configuration parameters.
* Run the listener program: `php ZulipFeedListener.php'
* You may need to locate the php binary in your system.
* Try triggering these Phabricator events while watching your "Phabricator" stream on Zulip:
* Edit a Maniphest task.
* Edit a Differential code review.
## Instruction:
Enumerate instructions instead of listing.
## Code After:
phabricator-to-zulip
========================
This daemon reads [Phabricator](http://phabricator.org/)'s story feed and sends
updates to the [Zulip](https://zulip.com/) chat system.
## Instructions
1. Create a stream called "Phabricator" inside Zulip.
2. Ensure that you and other users are subscribed to the stream.
3. Choose a directory where you want to install the integration software. A
reasonable place would be the parent directory of the "phabricator" directory
of your Phabricator installation, but it's fairly arbitrary.
4. `git clone https://github.com/showell/phabricator-to-zulip.git`
5. `cd phabricator-to-zulip`
6. Collect the following information:
* your Zulip API key
* the location of Phabricator's `__init_script__.php` file
* your Conduit certificate for Phabricator (typically in `~/.arcrc`)
7. Edit `ZulipConfig.php` to suit your installation.
* This is very important!
* You will typically modify about eight configuration parameters.
8. Run the listener program: `php ZulipFeedListener.php'
* You may need to locate the php binary in your system.
9. Try triggering these Phabricator events while watching your "Phabricator" stream on Zulip:
* Edit a Maniphest task.
* Edit a Differential code review.
| phabricator-to-zulip
========================
This daemon reads [Phabricator](http://phabricator.org/)'s story feed and sends
updates to the [Zulip](https://zulip.com/) chat system.
## Instructions
- * Create a stream called "Phabricator" inside Zulip.
? ^
+ 1. Create a stream called "Phabricator" inside Zulip.
? ^^
- * Ensure that you and other users are subscribed to the stream.
? ^
+ 2. Ensure that you and other users are subscribed to the stream.
? ^^
- * Choose a directory where you want to install the integration software. A
? ^
+ 3. Choose a directory where you want to install the integration software. A
? ^^
reasonable place would be the parent directory of the "phabricator" directory
of your Phabricator installation, but it's fairly arbitrary.
- * `git clone https://github.com/showell/phabricator-to-zulip.git`
? ^
+ 4. `git clone https://github.com/showell/phabricator-to-zulip.git`
? ^^
- * `cd phabricator-to-zulip`
? ^
+ 5. `cd phabricator-to-zulip`
? ^^
- * Collect the following information:
? ^
+ 6. Collect the following information:
? ^^
* your Zulip API key
* the location of Phabricator's `__init_script__.php` file
* your Conduit certificate for Phabricator (typically in `~/.arcrc`)
- * Edit `ZulipConfig.php` to suit your installation.
? ^
+ 7. Edit `ZulipConfig.php` to suit your installation.
? ^^
* This is very important!
* You will typically modify about eight configuration parameters.
- * Run the listener program: `php ZulipFeedListener.php'
? ^
+ 8. Run the listener program: `php ZulipFeedListener.php'
? ^^
* You may need to locate the php binary in your system.
- * Try triggering these Phabricator events while watching your "Phabricator" stream on Zulip:
? ^
+ 9. Try triggering these Phabricator events while watching your "Phabricator" stream on Zulip:
? ^^
* Edit a Maniphest task.
* Edit a Differential code review. | 18 | 0.666667 | 9 | 9 |
322e38f52d16210d1a7561114cd9634822cad6ba | conf/dynomite.single.yml | conf/dynomite.single.yml | dyn_o_mite:
auto_eject_hosts: false
datacenter: DC1
distribution: single
dyn_listen: 127.0.0.1:8101
dyn_read_timeout: 200000
dyn_seed_provider: simple_provider
dyn_write_timeout: 200000
gos_interval: 10000
hash: murmur
listen: 127.0.0.1:8102
preconnect: true
server_retry_timeout: 200000
servers:
- 127.0.0.1:11211:1
timeout: 10000
tokens: 437425602,1122629340,1683099499,2400294391,2772631304,4271597971
| dyn_o_mite:
auto_eject_hosts: false
distribution: single
dyn_listen: 127.0.0.1:8101
dyn_read_timeout: 200000
dyn_seed_provider: simple_provider
dyn_write_timeout: 200000
gos_interval: 10000
hash: murmur
listen: 127.0.0.1:8102
preconnect: true
server_retry_timeout: 200000
servers:
- 127.0.0.1:11211:1
timeout: 10000
| Clean up this config file | Clean up this config file
| YAML | apache-2.0 | Netflix/dynomite,philipz/dynomite,jbfavre/dynomite,mcanthony/dynomite,ipapapa/dynomiteMongo,jbfavre/dynomite,fengshao0907/dynomite,Netflix/dynomite,fengshao0907/dynomite,ipapapa/dynomiteMongo,mcanthony/dynomite,jbfavre/dynomite,jbfavre/dynomite,jbfavre/dynomite,ipapapa/dynomiteMongo,philipz/dynomite,Netflix/dynomite,ipapapa/dynomiteIoannis,ipapapa/dynomiteIoannis,philipz/dynomite,mcanthony/dynomite,ipapapa/dynomiteMongo,fengshao0907/dynomite,ipapapa/dynomiteIoannis,ipapapa/dynomiteIoannis,Netflix/dynomite | yaml | ## Code Before:
dyn_o_mite:
auto_eject_hosts: false
datacenter: DC1
distribution: single
dyn_listen: 127.0.0.1:8101
dyn_read_timeout: 200000
dyn_seed_provider: simple_provider
dyn_write_timeout: 200000
gos_interval: 10000
hash: murmur
listen: 127.0.0.1:8102
preconnect: true
server_retry_timeout: 200000
servers:
- 127.0.0.1:11211:1
timeout: 10000
tokens: 437425602,1122629340,1683099499,2400294391,2772631304,4271597971
## Instruction:
Clean up this config file
## Code After:
dyn_o_mite:
auto_eject_hosts: false
distribution: single
dyn_listen: 127.0.0.1:8101
dyn_read_timeout: 200000
dyn_seed_provider: simple_provider
dyn_write_timeout: 200000
gos_interval: 10000
hash: murmur
listen: 127.0.0.1:8102
preconnect: true
server_retry_timeout: 200000
servers:
- 127.0.0.1:11211:1
timeout: 10000
| dyn_o_mite:
auto_eject_hosts: false
- datacenter: DC1
distribution: single
dyn_listen: 127.0.0.1:8101
dyn_read_timeout: 200000
dyn_seed_provider: simple_provider
dyn_write_timeout: 200000
gos_interval: 10000
hash: murmur
listen: 127.0.0.1:8102
preconnect: true
server_retry_timeout: 200000
servers:
- 127.0.0.1:11211:1
timeout: 10000
- tokens: 437425602,1122629340,1683099499,2400294391,2772631304,4271597971 | 2 | 0.117647 | 0 | 2 |