| commit (string, len 40-40) | old_file (string, len 4-184) | new_file (string, len 4-184) | old_contents (string, len 1-3.6k) | new_contents (string, len 5-3.38k) | subject (string, len 15-778) | message (string, len 16-6.74k) | lang (201 classes) | license (13 classes) | repos (string, len 6-116k) | config (201 classes) | content (string, len 137-7.24k) | diff (string, len 26-5.55k) | diff_length (int64, 1-123) | relative_diff_length (float64, 0.01-89) | n_lines_added (int64, 0-108) | n_lines_deleted (int64, 0-106) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
8fd126e2b87cbd9bdf3e238ecebf1fb79f275599 | README.rst | README.rst | ==================
flask-s3-resumable
==================
This is a sample project to demonstrate how to upload directly to Amazon S3 via CORS using Flask making use of the multipart upload REST API.
The uploaded file will be chunked and sent to S3 concurrently. Every failed Ajax request will be retried until successful.
Using
=====
- Make sure the permission and CORS configuration of your S3 bucket are set properly.
- Fill in these values in app.py:
* AWS_ACCESS_KEY_ID
* AWS_SECRET_KEY
* S3_BUCKET_NAME
- Run app.py as usual.
python app.py
Useful Docs
===========
`Amazon: Multipart Upload Overview <http://docs.aws.amazon.com/AmazonS3/latest/dev/mpuoverview.html>`_
`Amazon: API Support for Multipart Upload <http://docs.aws.amazon.com/AmazonS3/latest/dev/sdksupportformpu.html>`_
`Amazon: Signing and Authenticating REST Requests <http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html>`_
`Amazon: Enabling Cross-Origin Resource Sharing <http://docs.aws.amazon.com/AmazonS3/latest/dev/cors.html>`_
Useful References
=================
`mule-uploader <https://github.com/cinely/mule-uploader>`_
`s3-multipart-upload-browser <https://github.com/ienzam/s3-multipart-upload-browser>`_
| ==================
flask-s3-multipart
==================
This is a sample project to demonstrate how to upload directly to Amazon S3 via CORS using Flask making use of the multipart upload REST API.
The uploaded file will be chunked and sent to S3 concurrently. Every failed Ajax request will be retried until successful.
Using
=====
- Make sure the permission and CORS configuration of your S3 bucket are set properly.
- Fill in these values in app.py:
* AWS_ACCESS_KEY_ID
* AWS_SECRET_KEY
* S3_BUCKET_NAME
- Run app.py as usual.
python app.py
Useful Docs
===========
`Amazon: Multipart Upload Overview <http://docs.aws.amazon.com/AmazonS3/latest/dev/mpuoverview.html>`_
`Amazon: API Support for Multipart Upload <http://docs.aws.amazon.com/AmazonS3/latest/dev/sdksupportformpu.html>`_
`Amazon: Signing and Authenticating REST Requests <http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html>`_
`Amazon: Enabling Cross-Origin Resource Sharing <http://docs.aws.amazon.com/AmazonS3/latest/dev/cors.html>`_
Useful References
=================
`mule-uploader <https://github.com/cinely/mule-uploader>`_
`s3-multipart <https://github.com/maxgillett/s3_multipart>`_
| Fix wrong reference. More suitable title | Fix wrong reference. More suitable title
| reStructuredText | mit | heryandi/flask-s3-multipart,heryandi/flask-s3-multipart | restructuredtext | ## Code Before:
==================
flask-s3-resumable
==================
This is a sample project to demonstrate how to upload directly to Amazon S3 via CORS using Flask making use of the multipart upload REST API.
The uploaded file will be chunked and sent to S3 concurrently. Every failed Ajax request will be retried until successful.
Using
=====
- Make sure the permission and CORS configuration of your S3 bucket are set properly.
- Fill in these values in app.py:
* AWS_ACCESS_KEY_ID
* AWS_SECRET_KEY
* S3_BUCKET_NAME
- Run app.py as usual.
python app.py
Useful Docs
===========
`Amazon: Multipart Upload Overview <http://docs.aws.amazon.com/AmazonS3/latest/dev/mpuoverview.html>`_
`Amazon: API Support for Multipart Upload <http://docs.aws.amazon.com/AmazonS3/latest/dev/sdksupportformpu.html>`_
`Amazon: Signing and Authenticating REST Requests <http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html>`_
`Amazon: Enabling Cross-Origin Resource Sharing <http://docs.aws.amazon.com/AmazonS3/latest/dev/cors.html>`_
Useful References
=================
`mule-uploader <https://github.com/cinely/mule-uploader>`_
`s3-multipart-upload-browser <https://github.com/ienzam/s3-multipart-upload-browser>`_
## Instruction:
Fix wrong reference. More suitable title
## Code After:
==================
flask-s3-multipart
==================
This is a sample project to demonstrate how to upload directly to Amazon S3 via CORS using Flask making use of the multipart upload REST API.
The uploaded file will be chunked and sent to S3 concurrently. Every failed Ajax request will be retried until successful.
Using
=====
- Make sure the permission and CORS configuration of your S3 bucket are set properly.
- Fill in these values in app.py:
* AWS_ACCESS_KEY_ID
* AWS_SECRET_KEY
* S3_BUCKET_NAME
- Run app.py as usual.
python app.py
Useful Docs
===========
`Amazon: Multipart Upload Overview <http://docs.aws.amazon.com/AmazonS3/latest/dev/mpuoverview.html>`_
`Amazon: API Support for Multipart Upload <http://docs.aws.amazon.com/AmazonS3/latest/dev/sdksupportformpu.html>`_
`Amazon: Signing and Authenticating REST Requests <http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html>`_
`Amazon: Enabling Cross-Origin Resource Sharing <http://docs.aws.amazon.com/AmazonS3/latest/dev/cors.html>`_
Useful References
=================
`mule-uploader <https://github.com/cinely/mule-uploader>`_
`s3-multipart <https://github.com/maxgillett/s3_multipart>`_
| ==================
- flask-s3-resumable
+ flask-s3-multipart
==================
This is a sample project to demonstrate how to upload directly to Amazon S3 via CORS using Flask making use of the multipart upload REST API.
The uploaded file will be chunked and sent to S3 concurrently. Every failed Ajax request will be retried until successful.
Using
=====
- Make sure the permission and CORS configuration of your S3 bucket are set properly.
- Fill in these values in app.py:
* AWS_ACCESS_KEY_ID
* AWS_SECRET_KEY
* S3_BUCKET_NAME
- Run app.py as usual.
python app.py
Useful Docs
===========
`Amazon: Multipart Upload Overview <http://docs.aws.amazon.com/AmazonS3/latest/dev/mpuoverview.html>`_
`Amazon: API Support for Multipart Upload <http://docs.aws.amazon.com/AmazonS3/latest/dev/sdksupportformpu.html>`_
`Amazon: Signing and Authenticating REST Requests <http://docs.aws.amazon.com/AmazonS3/latest/dev/RESTAuthentication.html>`_
`Amazon: Enabling Cross-Origin Resource Sharing <http://docs.aws.amazon.com/AmazonS3/latest/dev/cors.html>`_
Useful References
=================
`mule-uploader <https://github.com/cinely/mule-uploader>`_
- `s3-multipart-upload-browser <https://github.com/ienzam/s3-multipart-upload-browser>`_
+ `s3-multipart <https://github.com/maxgillett/s3_multipart>`_ | 4 | 0.093023 | 2 | 2 |
476348665376530b4b10c249771a95ff1692dd71 | README.md | README.md |
For more info visit https://github.com/build-canaries/nevergreen
## How to use this repo
This repo uses the [Gradle Linux Packaging Plugin](https://github.com/nebula-plugins/gradle-ospackage-plugin) to build a
deployable Nevergreen package to make running your own instance on a Linux distro easier.
This repo contains some sensible defaults so may be used without changes. Some simple settings can be overridden by
creating a `gradle.properties` file, see the [Gradle docs](https://docs.gradle.org/current/userguide/build_environment.html)
for more details about placing this file in the correct place.
Alternatively if you need to make more involved changes you can fork this repo and just use it as a base.
### Standalone Jar
The jar to package is downloaded automatically and placed into `build/libs` unless a file named `nevergreen-standalone.jar`
already exists. This allows you to manually download or build your own jar locally.
### Overridable Properties
- `nevergreenVersion`
The version of Nevergreen to download, e.g. 3.0.0
## Docker Image
Nevergreen is also available on [Dockerhub](https://registry.hub.docker.com/u/buildcanariesteam/nevergreen/).
## License
Copyright © 2014 - 2018 Build Canaries
Distributed under the Eclipse Public License either version 1.0 or (at your option) any later version.
|
As everyone just seems to use Docker nowadays.
# Nevergreen OS Package
## What is Nevergreen?
For more info visit https://github.com/build-canaries/nevergreen
## How to use this repo
This repo uses the [Gradle Linux Packaging Plugin](https://github.com/nebula-plugins/gradle-ospackage-plugin) to build a
deployable Nevergreen package to make running your own instance on a Linux distro easier.
This repo contains some sensible defaults so may be used without changes. Some simple settings can be overridden by
creating a `gradle.properties` file, see the [Gradle docs](https://docs.gradle.org/current/userguide/build_environment.html)
for more details about placing this file in the correct place.
Alternatively if you need to make more involved changes you can fork this repo and just use it as a base.
### Standalone Jar
The jar to package is downloaded automatically and placed into `build/libs` unless a file named `nevergreen-standalone.jar`
already exists. This allows you to manually download or build your own jar locally.
### Overridable Properties
- `nevergreenVersion`
The version of Nevergreen to download, e.g. 3.0.0
## Docker Image
Nevergreen is also available on [Dockerhub](https://registry.hub.docker.com/u/buildcanariesteam/nevergreen/).
## License
Copyright © 2014 - 2018 Build Canaries
Distributed under the Eclipse Public License either version 1.0 or (at your option) any later version.
| Mark as no longer maintained | Mark as no longer maintained | Markdown | epl-1.0 | build-canaries/nevergreen-ospackage | markdown | ## Code Before:
For more info visit https://github.com/build-canaries/nevergreen
## How to use this repo
This repo uses the [Gradle Linux Packaging Plugin](https://github.com/nebula-plugins/gradle-ospackage-plugin) to build a
deployable Nevergreen package to make running your own instance on a Linux distro easier.
This repo contains some sensible defaults so may be used without changes. Some simple settings can be overridden by
creating a `gradle.properties` file, see the [Gradle docs](https://docs.gradle.org/current/userguide/build_environment.html)
for more details about placing this file in the correct place.
Alternatively if you need to make more involved changes you can fork this repo and just use it as a base.
### Standalone Jar
The jar to package is downloaded automatically and placed into `build/libs` unless a file named `nevergreen-standalone.jar`
already exists. This allows you to manually download or build your own jar locally.
### Overridable Properties
- `nevergreenVersion`
The version of Nevergreen to download, e.g. 3.0.0
## Docker Image
Nevergreen is also available on [Dockerhub](https://registry.hub.docker.com/u/buildcanariesteam/nevergreen/).
## License
Copyright © 2014 - 2018 Build Canaries
Distributed under the Eclipse Public License either version 1.0 or (at your option) any later version.
## Instruction:
Mark as no longer maintained
## Code After:
As everyone just seems to use Docker nowadays.
# Nevergreen OS Package
## What is Nevergreen?
For more info visit https://github.com/build-canaries/nevergreen
## How to use this repo
This repo uses the [Gradle Linux Packaging Plugin](https://github.com/nebula-plugins/gradle-ospackage-plugin) to build a
deployable Nevergreen package to make running your own instance on a Linux distro easier.
This repo contains some sensible defaults so may be used without changes. Some simple settings can be overridden by
creating a `gradle.properties` file, see the [Gradle docs](https://docs.gradle.org/current/userguide/build_environment.html)
for more details about placing this file in the correct place.
Alternatively if you need to make more involved changes you can fork this repo and just use it as a base.
### Standalone Jar
The jar to package is downloaded automatically and placed into `build/libs` unless a file named `nevergreen-standalone.jar`
already exists. This allows you to manually download or build your own jar locally.
### Overridable Properties
- `nevergreenVersion`
The version of Nevergreen to download, e.g. 3.0.0
## Docker Image
Nevergreen is also available on [Dockerhub](https://registry.hub.docker.com/u/buildcanariesteam/nevergreen/).
## License
Copyright © 2014 - 2018 Build Canaries
Distributed under the Eclipse Public License either version 1.0 or (at your option) any later version.
| +
+ As everyone just seems to use Docker nowadays.
+
+ # Nevergreen OS Package
+
+ ## What is Nevergreen?
For more info visit https://github.com/build-canaries/nevergreen
## How to use this repo
This repo uses the [Gradle Linux Packaging Plugin](https://github.com/nebula-plugins/gradle-ospackage-plugin) to build a
deployable Nevergreen package to make running your own instance on a Linux distro easier.
This repo contains some sensible defaults so may be used without changes. Some simple settings can be overridden by
creating a `gradle.properties` file, see the [Gradle docs](https://docs.gradle.org/current/userguide/build_environment.html)
for more details about placing this file in the correct place.
Alternatively if you need to make more involved changes you can fork this repo and just use it as a base.
### Standalone Jar
The jar to package is downloaded automatically and placed into `build/libs` unless a file named `nevergreen-standalone.jar`
already exists. This allows you to manually download or build your own jar locally.
### Overridable Properties
- `nevergreenVersion`
The version of Nevergreen to download, e.g. 3.0.0
## Docker Image
Nevergreen is also available on [Dockerhub](https://registry.hub.docker.com/u/buildcanariesteam/nevergreen/).
## License
Copyright © 2014 - 2018 Build Canaries
Distributed under the Eclipse Public License either version 1.0 or (at your option) any later version. | 6 | 0.181818 | 6 | 0 |
0f789ea82e785d9e06582e892c4ec4ea45f245c8 | httpserver.js | httpserver.js | var http = require('http');
var url = require('url');
var port = 8000;
var server = http.createServer(function (request, response){
try {
//go to http://127.0.0.1:8000/?req=Monty
console.log('Server Pinged'); //Ping the server (will show in the cmd prompt)
//The below will be written to the Web Browser
response.writeHead(200, {"Content-Type": "text/plain"});
var req = checkURL(request); //Passes request into the checkURL function and stores output into the req variable
var resp;
if(req === "Aaron"){
resp = "Gavendo";
}
else if(req === "Monty"){
resp = "python";
}
else if(req === "Hello"){
resp = "World";
}
else{
resp = "Please Enter: Aaron, Monty or Hello";
}
response.write(resp); //Write the response on the web page.
response.end();
}
catch (e) {
response.writeHead(500, { 'content-type': 'text/plain' });
response.write('ERROR:' + e);
response.end('\n');
}
});
server.listen(port);
console.log('The server has run'); //This is for the cmd prompt. Runs once at the start.
function checkURL(request){
var phrase = url.parse(request.url, true).query;
// this checks for a query with a property called 'req' and returns its value.
return phrase.req;
}
| var http = require('http');
var url = require('url');
var port = 8000;
var server = http.createServer(function (request, response){
try {
//go to http://127.0.0.1:8000/?req=Hello
console.log('Server Pinged');
response.writeHead(200, {"Content-Type": "text/plain"});
var req = checkURL(request);
var resp;
if(req === "Hello"){
resp = "World";
}
else{
resp = "Please Enter: Aaron, Monty or Hello";
}
response.write(resp);
response.end();
}
catch(e) {
response.writeHead(500, { 'content-type': 'text/plain' });
response.write('ERROR:' + e);
response.end('\n');
}
});
server.listen(port);
console.log('The server has run');
function checkURL(request){
var phrase = url.parse(request.url, true).query;
// this checks for a query with a property called 'req' and returns its value.
return phrase.req;
}
| Fix Space and comment issues. | Fix Space and comment issues.
| JavaScript | mit | delpillar/AvengersCC | javascript | ## Code Before:
var http = require('http');
var url = require('url');
var port = 8000;
var server = http.createServer(function (request, response){
try {
//go to http://127.0.0.1:8000/?req=Monty
console.log('Server Pinged'); //Ping the server (will show in the cmd prompt)
//The below will be written to the Web Browser
response.writeHead(200, {"Content-Type": "text/plain"});
var req = checkURL(request); //Passes request into the checkURL function and stores output into the req variable
var resp;
if(req === "Aaron"){
resp = "Gavendo";
}
else if(req === "Monty"){
resp = "python";
}
else if(req === "Hello"){
resp = "World";
}
else{
resp = "Please Enter: Aaron, Monty or Hello";
}
response.write(resp); //Write the response on the web page.
response.end();
}
catch (e) {
response.writeHead(500, { 'content-type': 'text/plain' });
response.write('ERROR:' + e);
response.end('\n');
}
});
server.listen(port);
console.log('The server has run'); //This is for the cmd prompt. Runs once at the start.
function checkURL(request){
var phrase = url.parse(request.url, true).query;
// this checks for a query with a property called 'req' and returns its value.
return phrase.req;
}
## Instruction:
Fix Space and comment issues.
## Code After:
var http = require('http');
var url = require('url');
var port = 8000;
var server = http.createServer(function (request, response){
try {
//go to http://127.0.0.1:8000/?req=Hello
console.log('Server Pinged');
response.writeHead(200, {"Content-Type": "text/plain"});
var req = checkURL(request);
var resp;
if(req === "Hello"){
resp = "World";
}
else{
resp = "Please Enter: Aaron, Monty or Hello";
}
response.write(resp);
response.end();
}
catch(e) {
response.writeHead(500, { 'content-type': 'text/plain' });
response.write('ERROR:' + e);
response.end('\n');
}
});
server.listen(port);
console.log('The server has run');
function checkURL(request){
var phrase = url.parse(request.url, true).query;
// this checks for a query with a property called 'req' and returns its value.
return phrase.req;
}
| var http = require('http');
var url = require('url');
var port = 8000;
var server = http.createServer(function (request, response){
-
- try {
? ---
+ try {
- //go to http://127.0.0.1:8000/?req=Monty
? ^^ ^ ---
+ //go to http://127.0.0.1:8000/?req=Hello
? ^^ ^^^^
+ console.log('Server Pinged');
- console.log('Server Pinged'); //Ping the server (will show in the cmd prompt)
- //The below will be written to the Web Browser
- response.writeHead(200, {"Content-Type": "text/plain"});
? ^^
+ response.writeHead(200, {"Content-Type": "text/plain"});
? ^^
- var req = checkURL(request); //Passes request into the checkURL function and stores output into the req variable
+ var req = checkURL(request);
- var resp;
? ^^
+ var resp;
? ^^
- if(req === "Aaron"){
- resp = "Gavendo";
- }
- else if(req === "Monty"){
- resp = "python";
- }
- else if(req === "Hello"){
? ^^^^^^ ---
+ if(req === "Hello"){
? ^
- resp = "World";
? ^^^ ----
+ resp = "World";
? ^^^
- }
- else{
+ }
+ else{
- resp = "Please Enter: Aaron, Monty or Hello";
? ^^^
+ resp = "Please Enter: Aaron, Monty or Hello";
? ^^^
- }
- response.write(resp); //Write the response on the web page.
+ }
+ response.write(resp);
- response.end();
? ^^
+ response.end();
? ^^
- }
+ }
- catch (e) {
? --- -
+ catch(e) {
- response.writeHead(500, { 'content-type': 'text/plain' });
? ------
+ response.writeHead(500, { 'content-type': 'text/plain' });
- response.write('ERROR:' + e);
? ------
+ response.write('ERROR:' + e);
- response.end('\n');
? ------
+ response.end('\n');
+ }
- }
-
});
server.listen(port);
- console.log('The server has run'); //This is for the cmd prompt. Runs once at the start.
+ console.log('The server has run');
function checkURL(request){
- var phrase = url.parse(request.url, true).query;
? ^
+ var phrase = url.parse(request.url, true).query;
? ^
- // this checks for a query with a property called 'req' and returns its value.
? ^
+ // this checks for a query with a property called 'req' and returns its value.
? ^
- return phrase.req;
? ^
+ return phrase.req;
? ^
} | 57 | 1.295455 | 24 | 33 |
4da300d7802840ed21c126263c6547ce1650c229 | spark/src/main/scala/org/elasticsearch/spark/serialization/ReflectionUtils.scala | spark/src/main/scala/org/elasticsearch/spark/serialization/ReflectionUtils.scala | package org.elasticsearch.spark.serialization
import java.beans.Introspector
import java.lang.reflect.Method
import scala.reflect.runtime.{ universe => ru }
import scala.reflect.runtime.universe._
import scala.reflect.ClassTag
private[spark] object ReflectionUtils {
def javaBeansInfo(clazz: Class[_]) = {
Introspector.getBeanInfo(clazz).getPropertyDescriptors().filterNot(_.getName == "class").map(pd => (pd.getName, pd.getReadMethod)).sortBy(_._1)
}
def javaBeansValues(target: AnyRef, info: Array[(String, Method)]) = {
info.map(in => (in._1, in._2.invoke(target))).toMap
}
def isCaseClass(clazz: Class[_]): Boolean = {
// reliable case class identifier only happens through class symbols...
runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).isCaseClass
}
def caseClassInfo(clazz: Class[_]): Iterable[String] = {
runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).toType.declarations.collect {
case m: MethodSymbol if m.isCaseAccessor => m.name.toString()
}
}
def caseClassValues(target: AnyRef, props: Iterable[String]) = {
val product = target.asInstanceOf[Product].productIterator
val tuples = for (y <- props) yield (y, product.next)
tuples.toMap
}
} | package org.elasticsearch.spark.serialization
import java.beans.Introspector
import java.lang.reflect.Method
import scala.reflect.runtime.{ universe => ru }
import scala.reflect.runtime.universe._
import scala.reflect.ClassTag
private[spark] object ReflectionUtils {
//SI-6240
protected[spark] object ReflectionLock
def javaBeansInfo(clazz: Class[_]) = {
Introspector.getBeanInfo(clazz).getPropertyDescriptors().filterNot(_.getName == "class").map(pd => (pd.getName, pd.getReadMethod)).sortBy(_._1)
}
def javaBeansValues(target: AnyRef, info: Array[(String, Method)]) = {
info.map(in => (in._1, in._2.invoke(target))).toMap
}
def isCaseClass(clazz: Class[_]): Boolean = {
ReflectionLock.synchronized {
// reliable case class identifier only happens through class symbols...
runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).isCaseClass
}
}
def caseClassInfo(clazz: Class[_]): Iterable[String] = {
ReflectionLock.synchronized {
runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).toType.declarations.collect {
case m: MethodSymbol if m.isCaseAccessor => m.name.toString()
}
}
}
def caseClassValues(target: AnyRef, props: Iterable[String]) = {
val product = target.asInstanceOf[Product].productIterator
val tuples = for (y <- props) yield (y, product.next)
tuples.toMap
}
} | Add synchronization around reflection usage | [SPARK] Add synchronization around reflection usage
relates #365
| Scala | apache-2.0 | samkohli/elasticsearch-hadoop,jasontedor/elasticsearch-hadoop,elastic/elasticsearch-hadoop,pranavraman/elasticsearch-hadoop,kai5263499/elasticsearch-hadoop,trifork/elasticsearch-hadoop,girirajsharma/elasticsearch-hadoop,takezoe/elasticsearch-hadoop,puneetjaiswal/elasticsearch-hadoop,huangll/elasticsearch-hadoop,costin/elasticsearch-hadoop,sarwarbhuiyan/elasticsearch-hadoop,Gavin-Yang/elasticsearch-hadoop,yonglehou/elasticsearch-hadoop,cgvarela/elasticsearch-hadoop,lgscofield/elasticsearch-hadoop,elastic/elasticsearch-hadoop,xjrk58/elasticsearch-hadoop,aie108/elasticsearch-hadoop,holdenk/elasticsearch-hadoop | scala | ## Code Before:
package org.elasticsearch.spark.serialization
import java.beans.Introspector
import java.lang.reflect.Method
import scala.reflect.runtime.{ universe => ru }
import scala.reflect.runtime.universe._
import scala.reflect.ClassTag
private[spark] object ReflectionUtils {
def javaBeansInfo(clazz: Class[_]) = {
Introspector.getBeanInfo(clazz).getPropertyDescriptors().filterNot(_.getName == "class").map(pd => (pd.getName, pd.getReadMethod)).sortBy(_._1)
}
def javaBeansValues(target: AnyRef, info: Array[(String, Method)]) = {
info.map(in => (in._1, in._2.invoke(target))).toMap
}
def isCaseClass(clazz: Class[_]): Boolean = {
// reliable case class identifier only happens through class symbols...
runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).isCaseClass
}
def caseClassInfo(clazz: Class[_]): Iterable[String] = {
runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).toType.declarations.collect {
case m: MethodSymbol if m.isCaseAccessor => m.name.toString()
}
}
def caseClassValues(target: AnyRef, props: Iterable[String]) = {
val product = target.asInstanceOf[Product].productIterator
val tuples = for (y <- props) yield (y, product.next)
tuples.toMap
}
}
## Instruction:
[SPARK] Add synchronization around reflection usage
relates #365
## Code After:
package org.elasticsearch.spark.serialization
import java.beans.Introspector
import java.lang.reflect.Method
import scala.reflect.runtime.{ universe => ru }
import scala.reflect.runtime.universe._
import scala.reflect.ClassTag
private[spark] object ReflectionUtils {
//SI-6240
protected[spark] object ReflectionLock
def javaBeansInfo(clazz: Class[_]) = {
Introspector.getBeanInfo(clazz).getPropertyDescriptors().filterNot(_.getName == "class").map(pd => (pd.getName, pd.getReadMethod)).sortBy(_._1)
}
def javaBeansValues(target: AnyRef, info: Array[(String, Method)]) = {
info.map(in => (in._1, in._2.invoke(target))).toMap
}
def isCaseClass(clazz: Class[_]): Boolean = {
ReflectionLock.synchronized {
// reliable case class identifier only happens through class symbols...
runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).isCaseClass
}
}
def caseClassInfo(clazz: Class[_]): Iterable[String] = {
ReflectionLock.synchronized {
runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).toType.declarations.collect {
case m: MethodSymbol if m.isCaseAccessor => m.name.toString()
}
}
}
def caseClassValues(target: AnyRef, props: Iterable[String]) = {
val product = target.asInstanceOf[Product].productIterator
val tuples = for (y <- props) yield (y, product.next)
tuples.toMap
}
} | package org.elasticsearch.spark.serialization
import java.beans.Introspector
import java.lang.reflect.Method
import scala.reflect.runtime.{ universe => ru }
import scala.reflect.runtime.universe._
import scala.reflect.ClassTag
private[spark] object ReflectionUtils {
+ //SI-6240
+ protected[spark] object ReflectionLock
+
def javaBeansInfo(clazz: Class[_]) = {
Introspector.getBeanInfo(clazz).getPropertyDescriptors().filterNot(_.getName == "class").map(pd => (pd.getName, pd.getReadMethod)).sortBy(_._1)
}
def javaBeansValues(target: AnyRef, info: Array[(String, Method)]) = {
info.map(in => (in._1, in._2.invoke(target))).toMap
}
def isCaseClass(clazz: Class[_]): Boolean = {
+ ReflectionLock.synchronized {
- // reliable case class identifier only happens through class symbols...
+ // reliable case class identifier only happens through class symbols...
? ++
- runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).isCaseClass
+ runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).isCaseClass
? ++
+ }
}
def caseClassInfo(clazz: Class[_]): Iterable[String] = {
+ ReflectionLock.synchronized {
- runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).toType.declarations.collect {
+ runtimeMirror(clazz.getClassLoader()).classSymbol(clazz).toType.declarations.collect {
? ++
- case m: MethodSymbol if m.isCaseAccessor => m.name.toString()
+ case m: MethodSymbol if m.isCaseAccessor => m.name.toString()
? ++
+ }
}
}
def caseClassValues(target: AnyRef, props: Iterable[String]) = {
val product = target.asInstanceOf[Product].productIterator
val tuples = for (y <- props) yield (y, product.next)
tuples.toMap
}
} | 15 | 0.428571 | 11 | 4 |
b5cf1bb48ae631fee2b4b95604d36cdb72c00758 | python/tox.ini | python/tox.ini |
[tox]
# In the future the plan is to support the following
#envlist = py27, py34, py35, py36
envlist = py{27,35}
[testenv]
commands = py.test tests
deps =
pytest
mock
[testenv:lint27]
basepython=python2.7
deps =
pylint
pytest
mock
commands=pylint -rn {envsitepackagesdir}/omnisharp {toxinidir}/tests
[testenv:lint35]
basepython=python3.5
deps =
pylint
pytest
mock
commands=pylint -rn {envsitepackagesdir}/omnisharp {toxinidir}/tests
|
[tox]
# In the future the plan is to support the following
#envlist = py27, py34, py35, py36
envlist = py{27,35}
[testenv]
commands = py.test tests
deps =
pylint
pytest
mock
[testenv:lint]
basepython=python3.5
deps =
pylint
pytest
mock
commands=pylint -rn {envsitepackagesdir}/omnisharp {toxinidir}/tests
| Include pylint in all environments | Include pylint in all environments
| INI | mit | OmniSharp/omnisharp-vim,OmniSharp/omnisharp-vim,OmniSharp/omnisharp-vim | ini | ## Code Before:
[tox]
# In the future the plan is to support the following
#envlist = py27, py34, py35, py36
envlist = py{27,35}
[testenv]
commands = py.test tests
deps =
pytest
mock
[testenv:lint27]
basepython=python2.7
deps =
pylint
pytest
mock
commands=pylint -rn {envsitepackagesdir}/omnisharp {toxinidir}/tests
[testenv:lint35]
basepython=python3.5
deps =
pylint
pytest
mock
commands=pylint -rn {envsitepackagesdir}/omnisharp {toxinidir}/tests
## Instruction:
Include pylint in all environments
## Code After:
[tox]
# In the future the plan is to support the following
#envlist = py27, py34, py35, py36
envlist = py{27,35}
[testenv]
commands = py.test tests
deps =
pylint
pytest
mock
[testenv:lint]
basepython=python3.5
deps =
pylint
pytest
mock
commands=pylint -rn {envsitepackagesdir}/omnisharp {toxinidir}/tests
|
[tox]
# In the future the plan is to support the following
#envlist = py27, py34, py35, py36
envlist = py{27,35}
[testenv]
commands = py.test tests
deps =
+ pylint
pytest
mock
- [testenv:lint27]
? --
+ [testenv:lint]
- basepython=python2.7
- deps =
- pylint
- pytest
- mock
- commands=pylint -rn {envsitepackagesdir}/omnisharp {toxinidir}/tests
-
- [testenv:lint35]
basepython=python3.5
deps =
pylint
pytest
mock
commands=pylint -rn {envsitepackagesdir}/omnisharp {toxinidir}/tests | 11 | 0.407407 | 2 | 9 |
49ac72835a1402b4019572374039ae4888101ad8 | provisioner/daemon/plugins/automators/chef_automator/chef_automator/cookbooks/loom_hosts/recipes/default.rb | provisioner/daemon/plugins/automators/chef_automator/chef_automator/cookbooks/loom_hosts/recipes/default.rb |
node['loom']['cluster']['nodes'].each do |n, v|
hostsfile_entry v.ipaddress do
hostname v.hostname
aliases [ v.hostname.gsub('.local','') ]
unique true
action :create
end
end
|
node['loom']['cluster']['nodes'].each do |n, v|
short_host = v.hostname.split('.').first
hostsfile_entry v.ipaddress do
hostname v.hostname
aliases [ short_host ]
unique true
action :create
end
end
| Fix aliases in /etc/hosts for machines using a DNS suffix other than '.local' | Fix aliases in /etc/hosts for machines using a DNS suffix other than '.local'
| Ruby | apache-2.0 | cdapio/coopr,cdapio/coopr,caskdata/coopr,cdapio/coopr,caskdata/coopr,quantiply-fork/coopr,quantiply-fork/coopr,quantiply-fork/coopr,cdapio/coopr,cdapio/coopr,quantiply-fork/coopr,caskdata/coopr,caskdata/coopr,caskdata/coopr | ruby | ## Code Before:
node['loom']['cluster']['nodes'].each do |n, v|
hostsfile_entry v.ipaddress do
hostname v.hostname
aliases [ v.hostname.gsub('.local','') ]
unique true
action :create
end
end
## Instruction:
Fix aliases in /etc/hosts for machines using a DNS suffix other than '.local'
## Code After:
node['loom']['cluster']['nodes'].each do |n, v|
short_host = v.hostname.split('.').first
hostsfile_entry v.ipaddress do
hostname v.hostname
aliases [ short_host ]
unique true
action :create
end
end
|
node['loom']['cluster']['nodes'].each do |n, v|
+ short_host = v.hostname.split('.').first
hostsfile_entry v.ipaddress do
hostname v.hostname
- aliases [ v.hostname.gsub('.local','') ]
+ aliases [ short_host ]
unique true
action :create
end
end | 3 | 0.333333 | 2 | 1 |
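The fix in the record above matters because `gsub('.local', '')` only strips a literal `.local` suffix, while `split('.').first` yields the short hostname for any DNS suffix. A standalone Ruby sketch of the two behaviours (no Chef required):

```ruby
# Compare the old and new alias derivation outside of Chef.
old_alias = ->(fqdn) { fqdn.gsub('.local', '') }  # old recipe logic
new_alias = ->(fqdn) { fqdn.split('.').first }    # new recipe logic

puts old_alias.call('node1.local')        # => "node1" (both agree for .local)
puts new_alias.call('node1.local')        # => "node1"

puts old_alias.call('node1.example.com')  # => "node1.example.com" (alias not shortened)
puts new_alias.call('node1.example.com')  # => "node1" (correct short name)
```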
26578d7f3352febd84b71c2d5dd681ed1c0dae86 | validation-test/compiler_crashers/24887-no-stack-trace.swift | validation-test/compiler_crashers/24887-no-stack-trace.swift | // RUN: not --crash %target-swift-frontend %s -c -o /dev/null
// Distributed under the terms of the MIT license
// Test case submitted to project by https://github.com/codafi (Robert Widmann)
struct X<T> {
let s : X<X>
}
| // RUN: not --crash %target-swift-frontend %s -emit-silgen
// Distributed under the terms of the MIT license
// Test case submitted to project by https://github.com/codafi (Robert Widmann)
struct X<T> {
let s : X<X>
}
| Use canonical "-emit-silgen" to trigger crash case. | Use canonical "-emit-silgen" to trigger crash case.
| Swift | apache-2.0 | khizkhiz/swift,khizkhiz/swift,khizkhiz/swift,khizkhiz/swift,khizkhiz/swift,khizkhiz/swift,khizkhiz/swift | swift | ## Code Before:
// RUN: not --crash %target-swift-frontend %s -c -o /dev/null
// Distributed under the terms of the MIT license
// Test case submitted to project by https://github.com/codafi (Robert Widmann)
struct X<T> {
let s : X<X>
}
## Instruction:
Use canonical "-emit-silgen" to trigger crash case.
## Code After:
// RUN: not --crash %target-swift-frontend %s -emit-silgen
// Distributed under the terms of the MIT license
// Test case submitted to project by https://github.com/codafi (Robert Widmann)
struct X<T> {
let s : X<X>
}
| - // RUN: not --crash %target-swift-frontend %s -c -o /dev/null
? ^^ ^^^^ -- ---
+ // RUN: not --crash %target-swift-frontend %s -emit-silgen
? ^^^^ ^^^^
// Distributed under the terms of the MIT license
// Test case submitted to project by https://github.com/codafi (Robert Widmann)
struct X<T> {
let s : X<X>
} | 2 | 0.285714 | 1 | 1 |
503f511c8a434358941a5d47af60702169152967 | src/scss/project/vars/_metrics.scss | src/scss/project/vars/_metrics.scss | /*------------------------------------*\
GENERIC
\*------------------------------------*/
$gutter: 20px;
$site-max-width: 1000px;
$site-min-width: 320px;
/*------------------------------------*\
BREAKPOINTS
\*------------------------------------*/
$breakpoint--palm: "(max-width: 683px)";
$breakpoint--portable: "(max-width: 999px)";
$breakpoint--lap: "(min-width: 684px) and (max-width: 999px)";
$breakpoint--desk: "(min-width: 1000px)";
$breakpoint--desk--wide: "(min-width: 1300px)";
$breakpoint--lap-and-up: "(min-width: 684px)";
/*------------------------------------*\
GRID SYSTEM
\*------------------------------------*/
$grid--responsive: true;
$grid--mobile-first: true;
$grid--gutter: $gutter;
$grid--use-silent-classes: true;
$grid--push: false;
$grid--pull: false; | /*------------------------------------*\
GENERIC
\*------------------------------------*/
$gutter: 20px;
$gutter--mini: ($gutter / 2);
$gutter--midi: ($gutter + ($gutter / 2));
$gutter--double: ($gutter * 2);
$gutter--treble: ($gutter * 3);
$site-max-width: 1000px;
$site-min-width: 320px;
/*------------------------------------*\
BREAKPOINTS
\*------------------------------------*/
$breakpoint--palm: "(max-width: 683px)";
$breakpoint--portable: "(max-width: 999px)";
$breakpoint--lap: "(min-width: 684px) and (max-width: 999px)";
$breakpoint--desk: "(min-width: 1000px)";
$breakpoint--desk--wide: "(min-width: 1300px)";
$breakpoint--lap-and-up: "(min-width: 684px)";
/*------------------------------------*\
GRID SYSTEM
\*------------------------------------*/
$grid--responsive: true;
$grid--mobile-first: true;
$grid--gutter: $gutter;
$grid--use-silent-classes: true;
$grid--push: false;
$grid--pull: false; | Add some useful gutter modifiers | Add some useful gutter modifiers
| SCSS | unlicense | 4ndeh/base-front-end-project,hankchizljaw/base-front-end-project,hankchizljaw/base-front-end-project,4ndeh/base-front-end-project | scss | ## Code Before:
/*------------------------------------*\
GENERIC
\*------------------------------------*/
$gutter: 20px;
$site-max-width: 1000px;
$site-min-width: 320px;
/*------------------------------------*\
BREAKPOINTS
\*------------------------------------*/
$breakpoint--palm: "(max-width: 683px)";
$breakpoint--portable: "(max-width: 999px)";
$breakpoint--lap: "(min-width: 684px) and (max-width: 999px)";
$breakpoint--desk: "(min-width: 1000px)";
$breakpoint--desk--wide: "(min-width: 1300px)";
$breakpoint--lap-and-up: "(min-width: 684px)";
/*------------------------------------*\
GRID SYSTEM
\*------------------------------------*/
$grid--responsive: true;
$grid--mobile-first: true;
$grid--gutter: $gutter;
$grid--use-silent-classes: true;
$grid--push: false;
$grid--pull: false;
## Instruction:
Add some useful gutter modifiers
## Code After:
/*------------------------------------*\
GENERIC
\*------------------------------------*/
$gutter: 20px;
$gutter--mini: ($gutter / 2);
$gutter--midi: ($gutter + ($gutter / 2));
$gutter--double: ($gutter * 2);
$gutter--treble: ($gutter * 3);
$site-max-width: 1000px;
$site-min-width: 320px;
/*------------------------------------*\
BREAKPOINTS
\*------------------------------------*/
$breakpoint--palm: "(max-width: 683px)";
$breakpoint--portable: "(max-width: 999px)";
$breakpoint--lap: "(min-width: 684px) and (max-width: 999px)";
$breakpoint--desk: "(min-width: 1000px)";
$breakpoint--desk--wide: "(min-width: 1300px)";
$breakpoint--lap-and-up: "(min-width: 684px)";
/*------------------------------------*\
GRID SYSTEM
\*------------------------------------*/
$grid--responsive: true;
$grid--mobile-first: true;
$grid--gutter: $gutter;
$grid--use-silent-classes: true;
$grid--push: false;
$grid--pull: false; | /*------------------------------------*\
GENERIC
\*------------------------------------*/
$gutter: 20px;
+ $gutter--mini: ($gutter / 2);
+ $gutter--midi: ($gutter + ($gutter / 2));
+ $gutter--double: ($gutter * 2);
+ $gutter--treble: ($gutter * 3);
$site-max-width: 1000px;
$site-min-width: 320px;
/*------------------------------------*\
BREAKPOINTS
\*------------------------------------*/
$breakpoint--palm: "(max-width: 683px)";
$breakpoint--portable: "(max-width: 999px)";
$breakpoint--lap: "(min-width: 684px) and (max-width: 999px)";
$breakpoint--desk: "(min-width: 1000px)";
$breakpoint--desk--wide: "(min-width: 1300px)";
$breakpoint--lap-and-up: "(min-width: 684px)";
/*------------------------------------*\
GRID SYSTEM
\*------------------------------------*/
$grid--responsive: true;
$grid--mobile-first: true;
$grid--gutter: $gutter;
$grid--use-silent-classes: true;
$grid--push: false;
$grid--pull: false; | 4 | 0.148148 | 4 | 0 |
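One caveat about the new modifiers (an observation, not part of the original commit): Dart Sass 1.33+ deprecates the bare `/` operator for division, so the `$gutter / 2` expressions above will eventually need `math.div`. A forward-compatible sketch:

```scss
@use "sass:math";

$gutter: 20px;
$gutter--mini: math.div($gutter, 2);            // 10px — replaces the deprecated `$gutter / 2`
$gutter--midi: $gutter + math.div($gutter, 2);  // 30px
$gutter--double: $gutter * 2;                   // 40px — multiplication is unaffected
$gutter--treble: $gutter * 3;                   // 60px
```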
d8cb6671bb07cf88119e97da25d5e6f002a812b5 | src/ffmanifest.json | src/ffmanifest.json | {
"manifest_version": 2,
"name": "__MSG_appName__",
"short_name": "CaretTab",
"description": "__MSG_appDesc__",
"version": "3.0.1",
"default_locale": "en",
"icons": { "16": "assets/img/icon16.png",
"48": "assets/img/icon48.png",
"128": "assets/img/icon128.png" },
"chrome_url_overrides" : {
"newtab": "index.html"
},
"background": {
"scripts": ["js/update.js"]
},
"permissions": [
"storage",
"bookmarks",
"chrome://favicon/",
"topSites"
],
"content_security_policy": "script-src 'self' 'unsafe-eval'; object-src 'self'",
"applications": {
"gecko": {
"id": "{1ab49626-2e16-405b-ad65-dcaab8decd7b}",
"strict_min_version": "57.0"
}
}
}
| {
"manifest_version": 2,
"name": "__MSG_appName__",
"short_name": "CaretTab",
"description": "__MSG_appDesc__",
"version": "3.0.1",
"default_locale": "en",
"icons": { "16": "assets/img/icon16.png",
"48": "assets/img/icon48.png",
"128": "assets/img/icon128.png" },
"chrome_url_overrides" : {
"newtab": "index.html"
},
"chrome_settings_overrides" : {
"homepage": "index.html"
},
"background": {
"scripts": ["js/update.js"]
},
"permissions": [
"storage",
"bookmarks",
"chrome://favicon/",
"topSites"
],
"content_security_policy": "script-src 'self' 'unsafe-eval'; object-src 'self'",
"applications": {
"gecko": {
"id": "{1ab49626-2e16-405b-ad65-dcaab8decd7b}",
"strict_min_version": "57.0"
}
}
}
| Replace firefox homepage as well | Replace firefox homepage as well
| JSON | mit | bluecaret/carettab,bluecaret/carettab,bluecaret/carettab | json | ## Code Before:
{
"manifest_version": 2,
"name": "__MSG_appName__",
"short_name": "CaretTab",
"description": "__MSG_appDesc__",
"version": "3.0.1",
"default_locale": "en",
"icons": { "16": "assets/img/icon16.png",
"48": "assets/img/icon48.png",
"128": "assets/img/icon128.png" },
"chrome_url_overrides" : {
"newtab": "index.html"
},
"background": {
"scripts": ["js/update.js"]
},
"permissions": [
"storage",
"bookmarks",
"chrome://favicon/",
"topSites"
],
"content_security_policy": "script-src 'self' 'unsafe-eval'; object-src 'self'",
"applications": {
"gecko": {
"id": "{1ab49626-2e16-405b-ad65-dcaab8decd7b}",
"strict_min_version": "57.0"
}
}
}
## Instruction:
Replace firefox homepage as well
## Code After:
{
"manifest_version": 2,
"name": "__MSG_appName__",
"short_name": "CaretTab",
"description": "__MSG_appDesc__",
"version": "3.0.1",
"default_locale": "en",
"icons": { "16": "assets/img/icon16.png",
"48": "assets/img/icon48.png",
"128": "assets/img/icon128.png" },
"chrome_url_overrides" : {
"newtab": "index.html"
},
"chrome_settings_overrides" : {
"homepage": "index.html"
},
"background": {
"scripts": ["js/update.js"]
},
"permissions": [
"storage",
"bookmarks",
"chrome://favicon/",
"topSites"
],
"content_security_policy": "script-src 'self' 'unsafe-eval'; object-src 'self'",
"applications": {
"gecko": {
"id": "{1ab49626-2e16-405b-ad65-dcaab8decd7b}",
"strict_min_version": "57.0"
}
}
}
| {
"manifest_version": 2,
"name": "__MSG_appName__",
"short_name": "CaretTab",
"description": "__MSG_appDesc__",
"version": "3.0.1",
"default_locale": "en",
"icons": { "16": "assets/img/icon16.png",
"48": "assets/img/icon48.png",
"128": "assets/img/icon128.png" },
"chrome_url_overrides" : {
"newtab": "index.html"
+ },
+
+ "chrome_settings_overrides" : {
+ "homepage": "index.html"
},
"background": {
"scripts": ["js/update.js"]
},
"permissions": [
"storage",
"bookmarks",
"chrome://favicon/",
"topSites"
],
"content_security_policy": "script-src 'self' 'unsafe-eval'; object-src 'self'",
"applications": {
"gecko": {
"id": "{1ab49626-2e16-405b-ad65-dcaab8decd7b}",
"strict_min_version": "57.0"
}
}
} | 4 | 0.105263 | 4 | 0 |
45e2619b81c0865482996efff56966aee5a0a35f | lib/wiselinks/rendering.rb | lib/wiselinks/rendering.rb | module Wiselinks
module Rendering
def self.included(base)
base.alias_method_chain :render, :wiselinks
end
protected
def render_with_wiselinks(*args, &block)
options = _normalize_args(*args)
if self.request.wiselinks?
self.headers['Cache-Control'] = 'no-cache, no-store, max-age=0, must-revalidate'
self.headers['Pragma'] = 'no-cache'
if self.request.wiselinks_partial?
Wiselinks.log("Processing partial request")
options[:partial] ||= action_name
else
Wiselinks.log("Processing template request")
if Wiselinks.options[:layout] != false
options[:layout] = self.wiselinks_layout
end
end
if Wiselinks.options[:assets_digest].present?
Wiselinks.log("Assets digest #{Wiselinks.options[:assets_digest]}")
self.headers['X-Wiselinks-Assets-Digest'] = Wiselinks.options[:assets_digest]
end
end
self.render_without_wiselinks(options, &block)
end
end
end
| module Wiselinks
module Rendering
def self.included(base)
base.alias_method_chain :render, :wiselinks
end
protected
def render_with_wiselinks(*args, &block)
options = _normalize_args(*args)
if self.request.wiselinks?
self.headers['Cache-Control'] = 'no-cache, no-store, max-age=0, must-revalidate'
self.headers['Pragma'] = 'no-cache'
if self.request.wiselinks_partial? || options[:partial].present?
Wiselinks.log("Processing partial request")
options[:partial] ||= action_name
else
Wiselinks.log("Processing template request")
if Wiselinks.options[:layout] != false
options[:layout] = self.wiselinks_layout
end
end
if Wiselinks.options[:assets_digest].present?
Wiselinks.log("Assets digest #{Wiselinks.options[:assets_digest]}")
self.headers['X-Wiselinks-Assets-Digest'] = Wiselinks.options[:assets_digest]
end
end
self.render_without_wiselinks(options, &block)
end
end
end
| Add more flexible render process. | Add more flexible render process. | Ruby | mit | phlegx/wiselinks,phlegx/wiselinks,phlegx/wiselinks | ruby | ## Code Before:
module Wiselinks
module Rendering
def self.included(base)
base.alias_method_chain :render, :wiselinks
end
protected
def render_with_wiselinks(*args, &block)
options = _normalize_args(*args)
if self.request.wiselinks?
self.headers['Cache-Control'] = 'no-cache, no-store, max-age=0, must-revalidate'
self.headers['Pragma'] = 'no-cache'
if self.request.wiselinks_partial?
Wiselinks.log("Processing partial request")
options[:partial] ||= action_name
else
Wiselinks.log("Processing template request")
if Wiselinks.options[:layout] != false
options[:layout] = self.wiselinks_layout
end
end
if Wiselinks.options[:assets_digest].present?
Wiselinks.log("Assets digest #{Wiselinks.options[:assets_digest]}")
self.headers['X-Wiselinks-Assets-Digest'] = Wiselinks.options[:assets_digest]
end
end
self.render_without_wiselinks(options, &block)
end
end
end
## Instruction:
Add more flexible render process.
## Code After:
module Wiselinks
module Rendering
def self.included(base)
base.alias_method_chain :render, :wiselinks
end
protected
def render_with_wiselinks(*args, &block)
options = _normalize_args(*args)
if self.request.wiselinks?
self.headers['Cache-Control'] = 'no-cache, no-store, max-age=0, must-revalidate'
self.headers['Pragma'] = 'no-cache'
if self.request.wiselinks_partial? || options[:partial].present?
Wiselinks.log("Processing partial request")
options[:partial] ||= action_name
else
Wiselinks.log("Processing template request")
if Wiselinks.options[:layout] != false
options[:layout] = self.wiselinks_layout
end
end
if Wiselinks.options[:assets_digest].present?
Wiselinks.log("Assets digest #{Wiselinks.options[:assets_digest]}")
self.headers['X-Wiselinks-Assets-Digest'] = Wiselinks.options[:assets_digest]
end
end
self.render_without_wiselinks(options, &block)
end
end
end
| module Wiselinks
module Rendering
def self.included(base)
base.alias_method_chain :render, :wiselinks
end
protected
def render_with_wiselinks(*args, &block)
options = _normalize_args(*args)
if self.request.wiselinks?
self.headers['Cache-Control'] = 'no-cache, no-store, max-age=0, must-revalidate'
self.headers['Pragma'] = 'no-cache'
- if self.request.wiselinks_partial?
+ if self.request.wiselinks_partial? || options[:partial].present?
Wiselinks.log("Processing partial request")
options[:partial] ||= action_name
else
Wiselinks.log("Processing template request")
if Wiselinks.options[:layout] != false
options[:layout] = self.wiselinks_layout
end
end
if Wiselinks.options[:assets_digest].present?
Wiselinks.log("Assets digest #{Wiselinks.options[:assets_digest]}")
self.headers['X-Wiselinks-Assets-Digest'] = Wiselinks.options[:assets_digest]
end
end
self.render_without_wiselinks(options, &block)
end
end
end | 2 | 0.054054 | 1 | 1 |
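The effect of the widened condition above is that an explicit `render partial:` call is now treated like a Wiselinks partial request, not only requests carrying the partial header. A plain-Ruby sketch of just that branch logic — ActiveSupport's `present?` and the Rails request object are approximated here, so the names are hypothetical:

```ruby
# Sketch of the widened branch condition from render_with_wiselinks.
def partial_request?(wiselinks_partial_header, options)
  present = ->(v) { !v.nil? && v != '' }  # stand-in for ActiveSupport's present?
  wiselinks_partial_header || present.call(options[:partial])
end

puts partial_request?(false, { partial: 'row' })  # => true  (new: explicit render partial:)
puts partial_request?(false, {})                  # => false (full template request)
puts partial_request?(true,  {})                  # => true  (old: partial header set)
```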
845478266066fe205560ae442bb7a38388884d9e | backend/helpers/groupByDuplicatesInArray.js | backend/helpers/groupByDuplicatesInArray.js | // Count duplicate keys within an array
// ------------------------------------------------------------
module.exports = function(array) {
if(array.length === 0) {
return null;
}
var counts = {};
array.forEach(function(x) {
counts[x] = (counts[x] || 0) + 1;
});
return counts;
};
| import sortObjByOnlyKey from "./sortObjByOnlyKey";
// Count duplicate keys within an array
// ------------------------------------------------------------
const groupByDuplicatesInArray = array => {
if(array.length === 0) {
return null;
}
var counts = {};
array.forEach(function(x) {
counts[x] = (counts[x] || 0) + 1;
});
return sortObjByOnlyKey(counts);
};
export default groupByDuplicatesInArray;
| Sort object by keys to avoid problems with 'addEmptyDays' util | :bug: Sort object by keys to avoid problems with 'addEmptyDays' util
| JavaScript | mit | dreamyguy/gitinsight,dreamyguy/gitinsight,dreamyguy/gitinsight | javascript | ## Code Before:
// Count duplicate keys within an array
// ------------------------------------------------------------
module.exports = function(array) {
if(array.length === 0) {
return null;
}
var counts = {};
array.forEach(function(x) {
counts[x] = (counts[x] || 0) + 1;
});
return counts;
};
## Instruction:
:bug: Sort object by keys to avoid problems with 'addEmptyDays' util
## Code After:
import sortObjByOnlyKey from "./sortObjByOnlyKey";
// Count duplicate keys within an array
// ------------------------------------------------------------
const groupByDuplicatesInArray = array => {
if(array.length === 0) {
return null;
}
var counts = {};
array.forEach(function(x) {
counts[x] = (counts[x] || 0) + 1;
});
return sortObjByOnlyKey(counts);
};
export default groupByDuplicatesInArray;
| + import sortObjByOnlyKey from "./sortObjByOnlyKey";
+
// Count duplicate keys within an array
// ------------------------------------------------------------
- module.exports = function(array) {
+ const groupByDuplicatesInArray = array => {
if(array.length === 0) {
return null;
}
var counts = {};
array.forEach(function(x) {
counts[x] = (counts[x] || 0) + 1;
});
- return counts;
+ return sortObjByOnlyKey(counts);
};
+
+ export default groupByDuplicatesInArray; | 8 | 0.666667 | 6 | 2 |
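The reason for the sort (per the commit message, downstream utils like `addEmptyDays` walk the keys in order) can be seen in a standalone sketch. `sortObjByOnlyKey` itself is not shown in the record, so a plain lexicographic key sort stands in for it here:

```javascript
// Standalone sketch of the fixed helper; sortObjByOnlyKey is approximated
// by rebuilding the object with its keys in sorted order (string keys keep
// insertion order in JS, so iteration then runs oldest-first).
const groupByDuplicatesInArray = array => {
  if (array.length === 0) {
    return null;
  }
  const counts = {};
  array.forEach(x => {
    counts[x] = (counts[x] || 0) + 1;
  });
  const sorted = {};
  Object.keys(counts).sort().forEach(k => {
    sorted[k] = counts[k];
  });
  return sorted;
};

const days = ['2014-01-03', '2014-01-01', '2014-01-03', '2014-01-02'];
console.log(Object.keys(groupByDuplicatesInArray(days)));
// → [ '2014-01-01', '2014-01-02', '2014-01-03' ]
```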
b3f38a02f4ef5c9883e153c8241409b6c5c34a51 | app/views/issues/show.html.slim | app/views/issues/show.html.slim | - title t('.title', issue: @issue)
.featurette.featurette-indigo
.container
h1= yield :title
.featurette
.container
h2= t('.details')
p
b> #{Issue.human_attribute_name(:stars)}:
= @issue.stars
p
b> #{Device.model_name.human}:
= @issue.device.name
p
b= Issue.human_attribute_name(:description)
p= simple_format(@issue.description)
hr
h2= t('.solution_found')
blockquote
p
b> #{Issue.human_attribute_name(:closed_by)}:
= @issue.closed_by
p
b> #{Issue.human_attribute_name(:closed_at)}:
= l(@issue.closed_at, format: :long)
- if @issue.obsolete?
p
span.label.label-default= t('.obsolete_html')
p
em.text-muted= @issue.reason
- elsif @issue.reason == 'contact customer support.'
p.lead
span.label.label-default= t('.wrong_forum_html')
p= simple_format(@issue.reason)
- else
p.lead
span.label.label-warning= t('.restricted_html')
p= simple_format(@issue.reason)
hr
= link_to t('.submit_new_issue'), new_issue_path
| - title t('.title', issue: @issue)
.featurette.featurette-indigo
.container
h1= yield :title
.featurette
.container
h2= t('.details')
p
b> #{Issue.human_attribute_name(:stars)}:
= @issue.stars
p
b> #{Device.model_name.human}:
= @issue.device.name
p
b= Issue.human_attribute_name(:description)
p= simple_format(@issue.description)
hr
h2= t('.solution_found')
blockquote
p
b> #{Issue.human_attribute_name(:closed_by)}:
= @issue.closed_by
p
b> #{Issue.human_attribute_name(:closed_at)}:
= l(@issue.closed_at, format: :long)
- if @issue.obsolete?
p
span.label.label-default= t('.obsolete_html')
p
em.text-muted= @issue.reason
- elsif @issue.reason == 'contact customer support.'
p.lead
span.label.label-default= t('.wrong_forum_html')
p= simple_format(@issue.reason)
- else
p.lead
span.label.label-warning= t('.restricted_html')
p= simple_format(@issue.reason)
hr
= link_to t('.submit_new_issue'), new_issue_path, class: 'btn btn-default btn-sm btn-block'
| Change “submit new issue” link style | Change “submit new issue” link style | Slim | bsd-2-clause | tagliala/android-bug-solver,tagliala/android-bug-solver | slim | ## Code Before:
- title t('.title', issue: @issue)
.featurette.featurette-indigo
.container
h1= yield :title
.featurette
.container
h2= t('.details')
p
b> #{Issue.human_attribute_name(:stars)}:
= @issue.stars
p
b> #{Device.model_name.human}:
= @issue.device.name
p
b= Issue.human_attribute_name(:description)
p= simple_format(@issue.description)
hr
h2= t('.solution_found')
blockquote
p
b> #{Issue.human_attribute_name(:closed_by)}:
= @issue.closed_by
p
b> #{Issue.human_attribute_name(:closed_at)}:
= l(@issue.closed_at, format: :long)
- if @issue.obsolete?
p
span.label.label-default= t('.obsolete_html')
p
em.text-muted= @issue.reason
- elsif @issue.reason == 'contact customer support.'
p.lead
span.label.label-default= t('.wrong_forum_html')
p= simple_format(@issue.reason)
- else
p.lead
span.label.label-warning= t('.restricted_html')
p= simple_format(@issue.reason)
hr
= link_to t('.submit_new_issue'), new_issue_path
## Instruction:
Change “submit new issue” link style
## Code After:
- title t('.title', issue: @issue)
.featurette.featurette-indigo
.container
h1= yield :title
.featurette
.container
h2= t('.details')
p
b> #{Issue.human_attribute_name(:stars)}:
= @issue.stars
p
b> #{Device.model_name.human}:
= @issue.device.name
p
b= Issue.human_attribute_name(:description)
p= simple_format(@issue.description)
hr
h2= t('.solution_found')
blockquote
p
b> #{Issue.human_attribute_name(:closed_by)}:
= @issue.closed_by
p
b> #{Issue.human_attribute_name(:closed_at)}:
= l(@issue.closed_at, format: :long)
- if @issue.obsolete?
p
span.label.label-default= t('.obsolete_html')
p
em.text-muted= @issue.reason
- elsif @issue.reason == 'contact customer support.'
p.lead
span.label.label-default= t('.wrong_forum_html')
p= simple_format(@issue.reason)
- else
p.lead
span.label.label-warning= t('.restricted_html')
p= simple_format(@issue.reason)
hr
= link_to t('.submit_new_issue'), new_issue_path, class: 'btn btn-default btn-sm btn-block'
| - title t('.title', issue: @issue)
.featurette.featurette-indigo
.container
h1= yield :title
.featurette
.container
h2= t('.details')
p
b> #{Issue.human_attribute_name(:stars)}:
= @issue.stars
p
b> #{Device.model_name.human}:
= @issue.device.name
p
b= Issue.human_attribute_name(:description)
p= simple_format(@issue.description)
hr
h2= t('.solution_found')
blockquote
p
b> #{Issue.human_attribute_name(:closed_by)}:
= @issue.closed_by
p
b> #{Issue.human_attribute_name(:closed_at)}:
= l(@issue.closed_at, format: :long)
- if @issue.obsolete?
p
span.label.label-default= t('.obsolete_html')
p
em.text-muted= @issue.reason
- elsif @issue.reason == 'contact customer support.'
p.lead
span.label.label-default= t('.wrong_forum_html')
p= simple_format(@issue.reason)
- else
p.lead
span.label.label-warning= t('.restricted_html')
p= simple_format(@issue.reason)
hr
- = link_to t('.submit_new_issue'), new_issue_path
+ = link_to t('.submit_new_issue'), new_issue_path, class: 'btn btn-default btn-sm btn-block' | 2 | 0.05 | 1 | 1 |
259a06431fe7dec8d5c88fa43ae2504ae4d30655 | README.md | README.md |
UnrarKit is here to enable Mac and iOS apps to easily work with RAR files for read-only operations.
There is a main project, with a static library target and a unit tests target, and an example project, which demonstrates how to use the library.
I'm always open to improvements, so please submit your pull requests.
# Installation
UnrarKit has been converted to a CocoaPods project. If you're not familiar with [CocoaPods](http://cocoapods.org), you can start with their [Getting Started guide](http://guides.cocoapods.org/using/getting-started.html).
I've included a sample [`podfile`](Example/Podfile) in the Example directory along with the sample project. Everything should install with the single command:
pod install
# Notes
Since this UnrarKit is cpp based library you will need to change the extension of classes that uses UnrarKit to .mm it will allow us to include libstdc++ on linking stage, otherwise you will need to add libstdc++ as linker flags in you application.
To open in Xcode, use the [UnrarKit.xcworkspace](UnrarKit.xcworkspace) file, which includes the other projects.
# Credits
* Rogerio Pereira Araujo (rogerio.araujo@gmail.com)
* Vicent Scott (vkan388@gmail.com)
* Dov Frankel (dov@abbey-code.com) |
UnrarKit is here to enable Mac and iOS apps to easily work with RAR files for read-only operations.
There is a main project, with a static library target and a unit tests target, and an example project, which demonstrates how to use the library.
I'm always open to improvements, so please submit your pull requests.
# Installation
UnrarKit has been converted to a CocoaPods project. If you're not familiar with [CocoaPods](http://cocoapods.org), you can start with their [Getting Started guide](http://guides.cocoapods.org/using/getting-started.html).
I've included a sample [`podfile`](Example/Podfile) in the Example directory along with the sample project. Everything should install with the single command:
pod install
# Notes
Since UnrarKit uses C++ libraries, you will need to change the extension of classes that use UnrarKit to .mm. This will include libstdc++ in the linking stage. If you would like to keep your extension .m (though I'm not sure what the advantage would be), you will need to add libstdc++ to the linker flags in your application.
To open in Xcode, use the [UnrarKit.xcworkspace](UnrarKit.xcworkspace) file, which includes the other projects.
# Credits
* Rogerio Pereira Araujo (rogerio.araujo@gmail.com)
* Vicent Scott (vkan388@gmail.com)
* Dov Frankel (dov@abbey-code.com) | Copy edited the readme file | Copy edited the readme file
| Markdown | bsd-2-clause | Eonil/UnrarKit,Acidburn0zzz/UnrarKit,Acidburn0zzz/UnrarKit,Acidburn0zzz/UnrarKit,lubaoyilang/UnrarKit,lubaoyilang/UnrarKit,lubaoyilang/UnrarKit,Eonil/UnrarKit,abbeycode/UnrarKit,abbeycode/UnrarKit,abbeycode/UnrarKit,lubaoyilang/UnrarKit,abbeycode/UnrarKit,Acidburn0zzz/UnrarKit,Eonil/UnrarKit,abbeycode/UnrarKit,Eonil/UnrarKit | markdown | ## Code Before:
UnrarKit is here to enable Mac and iOS apps to easily work with RAR files for read-only operations.
There is a main project, with a static library target and a unit tests target, and an example project, which demonstrates how to use the library.
I'm always open to improvements, so please submit your pull requests.
# Installation
UnrarKit has been converted to a CocoaPods project. If you're not familiar with [CocoaPods](http://cocoapods.org), you can start with their [Getting Started guide](http://guides.cocoapods.org/using/getting-started.html).
I've included a sample [`podfile`](Example/Podfile) in the Example directory along with the sample project. Everything should install with the single command:
pod install
# Notes
Since this UnrarKit is cpp based library you will need to change the extension of classes that uses UnrarKit to .mm it will allow us to include libstdc++ on linking stage, otherwise you will need to add libstdc++ as linker flags in you application.
To open in Xcode, use the [UnrarKit.xcworkspace](UnrarKit.xcworkspace) file, which includes the other projects.
# Credits
* Rogerio Pereira Araujo (rogerio.araujo@gmail.com)
* Vicent Scott (vkan388@gmail.com)
* Dov Frankel (dov@abbey-code.com)
## Instruction:
Copy edited the readme file
## Code After:
UnrarKit is here to enable Mac and iOS apps to easily work with RAR files for read-only operations.
There is a main project, with a static library target and a unit tests target, and an example project, which demonstrates how to use the library.
I'm always open to improvements, so please submit your pull requests.
# Installation
UnrarKit has been converted to a CocoaPods project. If you're not familiar with [CocoaPods](http://cocoapods.org), you can start with their [Getting Started guide](http://guides.cocoapods.org/using/getting-started.html).
I've included a sample [`podfile`](Example/Podfile) in the Example directory along with the sample project. Everything should install with the single command:
pod install
# Notes
Since UnrarKit uses C++ libraries, you will need to change the extension of classes that use UnrarKit to .mm. This will include libstdc++ in the linking stage. If you would like to keep your extension .m (though I'm not sure what the advantage would be), you will need to add libstdc++ to the linker flags in your application.
To open in Xcode, use the [UnrarKit.xcworkspace](UnrarKit.xcworkspace) file, which includes the other projects.
# Credits
* Rogerio Pereira Araujo (rogerio.araujo@gmail.com)
* Vicent Scott (vkan388@gmail.com)
* Dov Frankel (dov@abbey-code.com) |
UnrarKit is here to enable Mac and iOS apps to easily work with RAR files for read-only operations.
There is a main project, with a static library target and a unit tests target, and an example project, which demonstrates how to use the library.
I'm always open to improvements, so please submit your pull requests.
# Installation
UnrarKit has been converted to a CocoaPods project. If you're not familiar with [CocoaPods](http://cocoapods.org), you can start with their [Getting Started guide](http://guides.cocoapods.org/using/getting-started.html).
I've included a sample [`podfile`](Example/Podfile) in the Example directory along with the sample project. Everything should install with the single command:
pod install
# Notes
- Since this UnrarKit is cpp based library you will need to change the extension of classes that uses UnrarKit to .mm it will allow us to include libstdc++ on linking stage, otherwise you will need to add libstdc++ as linker flags in you application.
+ Since UnrarKit uses C++ libraries, you will need to change the extension of classes that use UnrarKit to .mm. This will include libstdc++ in the linking stage. If you would like to keep your extension .m (though I'm not sure what the advantage would be), you will need to add libstdc++ to the linker flags in your application.
To open in Xcode, use the [UnrarKit.xcworkspace](UnrarKit.xcworkspace) file, which includes the other projects.
# Credits
* Rogerio Pereira Araujo (rogerio.araujo@gmail.com)
* Vicent Scott (vkan388@gmail.com)
* Dov Frankel (dov@abbey-code.com) | 2 | 0.071429 | 1 | 1 |
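The Notes advice in the README above boils down to a one-line rename (or, alternatively, a linker flag). A minimal shell sketch, using a hypothetical file name:

```shell
# Rename an implementation file so Xcode compiles it as Objective-C++ (.mm),
# which pulls in the C++ standard library needed by UnrarKit at link time.
touch MyViewController.m            # stand-in file for the sketch
mv MyViewController.m MyViewController.mm
ls MyViewController.mm              # → MyViewController.mm
# Alternative per the README: keep .m and add libstdc++ to the target's
# "Other Linker Flags" build setting instead.
```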
11ec2067d683564efe69793ee2345e05d8264e94 | lib/flux_mixin.js | lib/flux_mixin.js | var React = require("react");
module.exports = {
propTypes: {
flux: React.PropTypes.object
},
childContextTypes: {
flux: React.PropTypes.object
},
contextTypes: {
flux: React.PropTypes.object
},
getChildContext: function() {
return {
flux: this.context.flux || this.props.flux
};
}
};
| var React = require("react");
module.exports = {
propTypes: {
flux: React.PropTypes.object.isRequired
},
childContextTypes: {
flux: React.PropTypes.object
},
getChildContext: function() {
return {
flux: this.props.flux
};
}
};
| Make Fluxbox.Mixin only for the top-level components | Make Fluxbox.Mixin only for the top-level components
| JavaScript | mit | alcedo/fluxxor,dantman/fluxxor,chimpinano/fluxxor,nagyistoce/fluxxor,VincentHoang/fluxxor,vsakaria/fluxxor,dantman/fluxxor,nagyistoce/fluxxor,davesag/fluxxor,hoanglamhuynh/fluxxor,demiazz/fluxxor,davesag/fluxxor,BinaryMuse/fluxxor,dantman/fluxxor,davesag/fluxxor,VincentHoang/fluxxor,hoanglamhuynh/fluxxor,STRML/fluxxor,chimpinano/fluxxor,webcoding/fluxxor,chimpinano/fluxxor,SqREL/fluxxor,hoanglamhuynh/fluxxor,nagyistoce/fluxxor,STRML/fluxxor,andrewslater/fluxxor,thomasboyt/fluxxor | javascript | ## Code Before:
var React = require("react");
module.exports = {
propTypes: {
flux: React.PropTypes.object
},
childContextTypes: {
flux: React.PropTypes.object
},
contextTypes: {
flux: React.PropTypes.object
},
getChildContext: function() {
return {
flux: this.context.flux || this.props.flux
};
}
};
## Instruction:
Make Fluxbox.Mixin only for the top-level components
## Code After:
var React = require("react");
module.exports = {
propTypes: {
flux: React.PropTypes.object.isRequired
},
childContextTypes: {
flux: React.PropTypes.object
},
getChildContext: function() {
return {
flux: this.props.flux
};
}
};
| var React = require("react");
module.exports = {
propTypes: {
- flux: React.PropTypes.object
+ flux: React.PropTypes.object.isRequired
? +++++++++++
},
childContextTypes: {
flux: React.PropTypes.object
},
- contextTypes: {
- flux: React.PropTypes.object
- },
-
getChildContext: function() {
return {
- flux: this.context.flux || this.props.flux
+ flux: this.props.flux
};
}
}; | 8 | 0.380952 | 2 | 6 |
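The behavioural effect of the change above — `flux` now flows down from props alone, so only a top-level component can supply it — can be sketched without React itself. `React.PropTypes` is stubbed and the component instance is faked, so everything here is illustrative only:

```javascript
// Stub just enough of React for the mixin definition to evaluate.
const React = { PropTypes: { object: { isRequired: 'required-object' } } };

const FluxMixin = {
  propTypes: { flux: React.PropTypes.object.isRequired },
  childContextTypes: { flux: React.PropTypes.object },
  getChildContext: function () {
    return { flux: this.props.flux };
  }
};

// A fake top-level component instance: flux arrives via props, not context.
const flux = { stores: {}, actions: {} };
const topLevel = Object.assign({ props: { flux: flux }, context: {} }, FluxMixin);

console.log(topLevel.getChildContext().flux === flux); // → true
```

Note that `contextTypes` is gone from the mixin, matching the diff: child components are expected to read `flux` from context rather than re-declare this mixin.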
5d4315bdb15f3619b3a13e8b8d5f4da6eba83c43 | src/FieldTypes/Markdown.php | src/FieldTypes/Markdown.php | <?php
namespace Axllent\Gfmarkdown\FieldTypes;
use Parsedown;
use SilverStripe\ORM\FieldType\DBText;
class Markdown extends DBText
{
public static $casting = array(
'AsHTML' => 'HTMLText',
'Markdown' => 'Text'
);
public static $escape_type = 'xml';
private static $options = [];
/**
* Render markdown as HTML using Parsedown
*
* @return string Markdown rendered as HTML
*/
public function AsHTML()
{
$parsedown = new Parsedown();
$options = $this->config()->get('options');
foreach ($options as $fn => $param) {
if (method_exists($parsedown, $fn)) {
$parsedown->{$fn}($param);
}
}
return $parsedown->text($this->value);
}
/**
* Renders the field used in the template
* @return string HTML to be used in the template
*/
public function forTemplate()
{
return $this->AsHTML();
}
}
| <?php
namespace Axllent\Gfmarkdown\FieldTypes;
use Parsedown;
use SilverStripe\ORM\FieldType\DBText;
class Markdown extends DBText
{
private static $casting = array(
'AsHTML' => 'HTMLText',
'Markdown' => 'Text'
);
private static $escape_type = 'xml';
/**
* Render markdown as HTML using Parsedown
*
* @return string Markdown rendered as HTML
*/
public function AsHTML()
{
$parsedown = new Parsedown();
$options = $this->config()->get('options');
foreach ($options as $fn => $param) {
if (method_exists($parsedown, $fn)) {
$parsedown->{$fn}($param);
}
}
return $parsedown->text($this->value);
}
/**
* Renders the field used in the template
* @return string HTML to be used in the template
*/
public function forTemplate()
{
return $this->AsHTML();
}
}
| Change static variables to private | Change static variables to private
| PHP | mit | axllent/silverstripe-gfmarkdown,axllent/silverstripe-gfmarkdown | php | ## Code Before:
<?php
namespace Axllent\Gfmarkdown\FieldTypes;
use Parsedown;
use SilverStripe\ORM\FieldType\DBText;
class Markdown extends DBText
{
public static $casting = array(
'AsHTML' => 'HTMLText',
'Markdown' => 'Text'
);
public static $escape_type = 'xml';
private static $options = [];
/**
* Render markdown as HTML using Parsedown
*
* @return string Markdown rendered as HTML
*/
public function AsHTML()
{
$parsedown = new Parsedown();
$options = $this->config()->get('options');
foreach ($options as $fn => $param) {
if (method_exists($parsedown, $fn)) {
$parsedown->{$fn}($param);
}
}
return $parsedown->text($this->value);
}
/**
* Renders the field used in the template
* @return string HTML to be used in the template
*/
public function forTemplate()
{
return $this->AsHTML();
}
}
## Instruction:
Change static variables to private
## Code After:
<?php
namespace Axllent\Gfmarkdown\FieldTypes;
use Parsedown;
use SilverStripe\ORM\FieldType\DBText;
class Markdown extends DBText
{
private static $casting = array(
'AsHTML' => 'HTMLText',
'Markdown' => 'Text'
);
private static $escape_type = 'xml';
/**
* Render markdown as HTML using Parsedown
*
* @return string Markdown rendered as HTML
*/
public function AsHTML()
{
$parsedown = new Parsedown();
$options = $this->config()->get('options');
foreach ($options as $fn => $param) {
if (method_exists($parsedown, $fn)) {
$parsedown->{$fn}($param);
}
}
return $parsedown->text($this->value);
}
/**
* Renders the field used in the template
* @return string HTML to be used in the template
*/
public function forTemplate()
{
return $this->AsHTML();
}
}
| <?php
namespace Axllent\Gfmarkdown\FieldTypes;
use Parsedown;
use SilverStripe\ORM\FieldType\DBText;
class Markdown extends DBText
{
- public static $casting = array(
? ^^^ ^
+ private static $casting = array(
? ^ ^^^^
'AsHTML' => 'HTMLText',
'Markdown' => 'Text'
);
- public static $escape_type = 'xml';
? ^^^ ^
+ private static $escape_type = 'xml';
? ^ ^^^^
-
- private static $options = [];
/**
* Render markdown as HTML using Parsedown
*
* @return string Markdown rendered as HTML
*/
public function AsHTML()
{
$parsedown = new Parsedown();
$options = $this->config()->get('options');
foreach ($options as $fn => $param) {
if (method_exists($parsedown, $fn)) {
$parsedown->{$fn}($param);
}
}
return $parsedown->text($this->value);
}
/**
* Renders the field used in the template
* @return string HTML to be used in the template
*/
public function forTemplate()
{
return $this->AsHTML();
}
} | 6 | 0.136364 | 2 | 4 |
fa62ad541d147e740bedd97f470390aafe882e8f | app/js/arethusa.core/directives/foreign_keys.js | app/js/arethusa.core/directives/foreign_keys.js | 'use strict';
angular.module('arethusa.core').directive('foreignKeys',[
'keyCapture',
'languageSettings',
function (keyCapture, languageSettings) {
return {
restrict: 'A',
scope: {
ngChange: '&',
ngModel: '@',
foreignKeys: '='
},
link: function (scope, element, attrs) {
var parent = scope.$parent;
function extractLanguage() {
return (languageSettings.getFor('treebank') || {}).lang;
}
function lang() {
return scope.foreignKeys || extractLanguage();
}
// This will not detect changes right now
function placeHolderText() {
var language = languageSettings.langNames[lang()];
return language ? language + ' input enabled!' : '';
}
element.attr('placeholder', placeHolderText);
element.on('keydown', function (event) {
var input = event.target.value;
if (lang) {
var fK = keyCapture.getForeignKey(event, lang);
if (fK === false) {
return false;
}
if (fK === undefined) {
return true;
} else {
event.target.value = input + fK;
scope.$apply(function() {
parent.$eval(scope.ngModel + ' = i + k', { i: input, k: fK });
scope.ngChange();
});
return false;
}
} else {
return true;
}
});
}
};
}
]);
| 'use strict';
angular.module('arethusa.core').directive('foreignKeys',[
'keyCapture',
'languageSettings',
function (keyCapture, languageSettings) {
return {
restrict: 'A',
scope: {
ngChange: '&',
ngModel: '@',
foreignKeys: '='
},
link: function (scope, element, attrs) {
var parent = scope.$parent;
function extractLanguage() {
return (languageSettings.getFor('treebank') || {}).lang;
}
function lang() {
return scope.foreignKeys || extractLanguage();
}
// This will not detect changes right now
function placeHolderText() {
var language = languageSettings.langNames[lang()];
return language ? language + ' input enabled!' : '';
}
element.attr('placeholder', placeHolderText);
element.on('keydown', function (event) {
var input = event.target.value;
var l = lang();
if (l) {
var fK = keyCapture.getForeignKey(event, l);
if (fK === false) {
return false;
}
if (fK === undefined) {
return true;
} else {
event.target.value = input + fK;
scope.$apply(function() {
parent.$eval(scope.ngModel + ' = i + k', { i: input, k: fK });
scope.ngChange();
});
return false;
}
} else {
return true;
}
});
}
};
}
]);
| Fix minor mistake in foreignKeys | Fix minor mistake in foreignKeys
| JavaScript | mit | latin-language-toolkit/arethusa,PonteIneptique/arethusa,fbaumgardt/arethusa,Masoumeh/arethusa,latin-language-toolkit/arethusa,alpheios-project/arethusa,fbaumgardt/arethusa,fbaumgardt/arethusa,alpheios-project/arethusa,alpheios-project/arethusa,Masoumeh/arethusa,PonteIneptique/arethusa | javascript | ## Code Before:
'use strict';
angular.module('arethusa.core').directive('foreignKeys',[
'keyCapture',
'languageSettings',
function (keyCapture, languageSettings) {
return {
restrict: 'A',
scope: {
ngChange: '&',
ngModel: '@',
foreignKeys: '='
},
link: function (scope, element, attrs) {
var parent = scope.$parent;
function extractLanguage() {
return (languageSettings.getFor('treebank') || {}).lang;
}
function lang() {
return scope.foreignKeys || extractLanguage();
}
// This will not detect changes right now
function placeHolderText() {
var language = languageSettings.langNames[lang()];
return language ? language + ' input enabled!' : '';
}
element.attr('placeholder', placeHolderText);
element.on('keydown', function (event) {
var input = event.target.value;
if (lang) {
var fK = keyCapture.getForeignKey(event, lang);
if (fK === false) {
return false;
}
if (fK === undefined) {
return true;
} else {
event.target.value = input + fK;
scope.$apply(function() {
parent.$eval(scope.ngModel + ' = i + k', { i: input, k: fK });
scope.ngChange();
});
return false;
}
} else {
return true;
}
});
}
};
}
]);
## Instruction:
Fix minor mistake in foreignKeys
## Code After:
'use strict';
angular.module('arethusa.core').directive('foreignKeys',[
'keyCapture',
'languageSettings',
function (keyCapture, languageSettings) {
return {
restrict: 'A',
scope: {
ngChange: '&',
ngModel: '@',
foreignKeys: '='
},
link: function (scope, element, attrs) {
var parent = scope.$parent;
function extractLanguage() {
return (languageSettings.getFor('treebank') || {}).lang;
}
function lang() {
return scope.foreignKeys || extractLanguage();
}
// This will not detect changes right now
function placeHolderText() {
var language = languageSettings.langNames[lang()];
return language ? language + ' input enabled!' : '';
}
element.attr('placeholder', placeHolderText);
element.on('keydown', function (event) {
var input = event.target.value;
var l = lang();
if (l) {
var fK = keyCapture.getForeignKey(event, l);
if (fK === false) {
return false;
}
if (fK === undefined) {
return true;
} else {
event.target.value = input + fK;
scope.$apply(function() {
parent.$eval(scope.ngModel + ' = i + k', { i: input, k: fK });
scope.ngChange();
});
return false;
}
} else {
return true;
}
});
}
};
}
]);
| 'use strict';
angular.module('arethusa.core').directive('foreignKeys',[
'keyCapture',
'languageSettings',
function (keyCapture, languageSettings) {
return {
restrict: 'A',
scope: {
ngChange: '&',
ngModel: '@',
foreignKeys: '='
},
link: function (scope, element, attrs) {
var parent = scope.$parent;
function extractLanguage() {
return (languageSettings.getFor('treebank') || {}).lang;
}
function lang() {
return scope.foreignKeys || extractLanguage();
}
// This will not detect changes right now
function placeHolderText() {
var language = languageSettings.langNames[lang()];
return language ? language + ' input enabled!' : '';
}
element.attr('placeholder', placeHolderText);
element.on('keydown', function (event) {
var input = event.target.value;
+ var l = lang();
- if (lang) {
? ---
+ if (l) {
- var fK = keyCapture.getForeignKey(event, lang);
? ---
+ var fK = keyCapture.getForeignKey(event, l);
if (fK === false) {
return false;
}
if (fK === undefined) {
return true;
} else {
event.target.value = input + fK;
scope.$apply(function() {
parent.$eval(scope.ngModel + ' = i + k', { i: input, k: fK });
scope.ngChange();
});
return false;
}
} else {
return true;
}
});
}
};
}
]); | 5 | 0.090909 | 3 | 2 |
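The "minor mistake" fixed above is a truthiness bug: `lang` without parentheses is a function object, which is always truthy, and it was also being passed to `getForeignKey` in place of the language code it returns. A standalone sketch of the difference (not the Arethusa API):

```javascript
// A getter like the directive's lang(): returns a code or undefined.
function makeLangGetter(settings) {
  return function lang() {
    return settings.treebank; // e.g. 'grc', or undefined if unset
  };
}

const lang = makeLangGetter({}); // no language configured

// Buggy check: a function reference is always truthy, so this
// branch would run even though no language is actually set.
const buggyBranchTaken = Boolean(lang);

// Fixed check: call the getter once and reuse the result,
// mirroring the commit's `var l = lang(); if (l) { ... }`.
const l = lang();
const fixedBranchTaken = Boolean(l);
```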
649890440076ed48d3dc027303912a1c9ad7350a | app/app.js | app/app.js | import React, { Component } from 'react';
import { addNavigationHelpers } from 'react-navigation';
import { bindActionCreators } from 'redux';
import { connect } from 'react-redux';
import { View } from 'react-native';
import * as actions from './actions';
import { AppContainerStyles } from './styles/containers';
import { AppNavigator } from './navigators';
class App extends Component {
handleNavigationStateChange({SWIPEOUT_TASK}) {
SWIPEOUT_TASK(null);
}
render() {
const
{ dispatch, navigation, tasks, ...actionProps } = this.props,
navigationHelpers = addNavigationHelpers({ state: navigation, dispatch });
return (
<View style={AppContainerStyles.container}>
<AppNavigator
screenProps={{...tasks, ...actionProps}}
navigation={navigationHelpers}
onNavigationStateChange={(prevState, currentState) => this.handleNavigationStateChange(actionProps) }
/>
</View>
);
}
}
const
mapStateToProps = state => ({
navigation: state.navigation,
tasks: state.tasks
});
mapDispatchToProps = dispatch => ({
...bindActionCreators(actions, dispatch),
dispatch
});
export default connect(mapStateToProps, mapDispatchToProps)(App);
| import React, { Component } from 'react';
import { addNavigationHelpers, NavigationActions } from 'react-navigation';
import { bindActionCreators } from 'redux';
import { connect } from 'react-redux';
import { BackAndroid, View } from 'react-native';
import * as actions from './actions';
import { AppContainerStyles } from './styles/containers';
import { AppNavigator } from './navigators';
class App extends Component {
componentDidMount() {
BackAndroid.addEventListener('backPress', () => this.handleBackPress(this.props));
}
componentWillUnmount() {
BackAndroid.removeEventListener('backPress');
}
handleBackPress({dispatch, navigation}) {
// Close the app if no route to go back to
if (navigation.index === 0) {
return false;
}
dispatch(NavigationActions.back());
return true;
}
handleNavigationStateChange({SWIPEOUT_TASK}) {
SWIPEOUT_TASK(null);
}
render() {
const
{ dispatch, navigation, tasks, ...actionProps } = this.props,
navigationHelpers = addNavigationHelpers({ state: navigation, dispatch });
return (
<View style={AppContainerStyles.container}>
<AppNavigator
screenProps={{...tasks, ...actionProps}}
navigation={navigationHelpers}
onNavigationStateChange={(prevState, currentState) => this.handleNavigationStateChange(actionProps) }
/>
</View>
);
}
}
const
mapStateToProps = state => ({
navigation: state.navigation,
tasks: state.tasks
});
mapDispatchToProps = dispatch => ({
...bindActionCreators(actions, dispatch),
dispatch
});
export default connect(mapStateToProps, mapDispatchToProps)(App);
| Add support for Android back button | Add support for Android back button
| JavaScript | mit | CarlaCrandall/DoItNow,CarlaCrandall/DoItNow,CarlaCrandall/DoItNow | javascript | ## Code Before:
import React, { Component } from 'react';
import { addNavigationHelpers } from 'react-navigation';
import { bindActionCreators } from 'redux';
import { connect } from 'react-redux';
import { View } from 'react-native';
import * as actions from './actions';
import { AppContainerStyles } from './styles/containers';
import { AppNavigator } from './navigators';
class App extends Component {
handleNavigationStateChange({SWIPEOUT_TASK}) {
SWIPEOUT_TASK(null);
}
render() {
const
{ dispatch, navigation, tasks, ...actionProps } = this.props,
navigationHelpers = addNavigationHelpers({ state: navigation, dispatch });
return (
<View style={AppContainerStyles.container}>
<AppNavigator
screenProps={{...tasks, ...actionProps}}
navigation={navigationHelpers}
onNavigationStateChange={(prevState, currentState) => this.handleNavigationStateChange(actionProps) }
/>
</View>
);
}
}
const
mapStateToProps = state => ({
navigation: state.navigation,
tasks: state.tasks
});
mapDispatchToProps = dispatch => ({
...bindActionCreators(actions, dispatch),
dispatch
});
export default connect(mapStateToProps, mapDispatchToProps)(App);
## Instruction:
Add support for Android back button
## Code After:
import React, { Component } from 'react';
import { addNavigationHelpers, NavigationActions } from 'react-navigation';
import { bindActionCreators } from 'redux';
import { connect } from 'react-redux';
import { BackAndroid, View } from 'react-native';
import * as actions from './actions';
import { AppContainerStyles } from './styles/containers';
import { AppNavigator } from './navigators';
class App extends Component {
componentDidMount() {
BackAndroid.addEventListener('backPress', () => this.handleBackPress(this.props));
}
componentWillUnmount() {
BackAndroid.removeEventListener('backPress');
}
handleBackPress({dispatch, navigation}) {
// Close the app if no route to go back to
if (navigation.index === 0) {
return false;
}
dispatch(NavigationActions.back());
return true;
}
handleNavigationStateChange({SWIPEOUT_TASK}) {
SWIPEOUT_TASK(null);
}
render() {
const
{ dispatch, navigation, tasks, ...actionProps } = this.props,
navigationHelpers = addNavigationHelpers({ state: navigation, dispatch });
return (
<View style={AppContainerStyles.container}>
<AppNavigator
screenProps={{...tasks, ...actionProps}}
navigation={navigationHelpers}
onNavigationStateChange={(prevState, currentState) => this.handleNavigationStateChange(actionProps) }
/>
</View>
);
}
}
const
mapStateToProps = state => ({
navigation: state.navigation,
tasks: state.tasks
});
mapDispatchToProps = dispatch => ({
...bindActionCreators(actions, dispatch),
dispatch
});
export default connect(mapStateToProps, mapDispatchToProps)(App);
| import React, { Component } from 'react';
- import { addNavigationHelpers } from 'react-navigation';
+ import { addNavigationHelpers, NavigationActions } from 'react-navigation';
? +++++++++++++++++++
import { bindActionCreators } from 'redux';
import { connect } from 'react-redux';
- import { View } from 'react-native';
+ import { BackAndroid, View } from 'react-native';
? +++++++++++++
import * as actions from './actions';
import { AppContainerStyles } from './styles/containers';
import { AppNavigator } from './navigators';
class App extends Component {
+ componentDidMount() {
+ BackAndroid.addEventListener('backPress', () => this.handleBackPress(this.props));
+ }
+
+ componentWillUnmount() {
+ BackAndroid.removeEventListener('backPress');
+ }
+
+ handleBackPress({dispatch, navigation}) {
+ // Close the app if no route to go back to
+ if (navigation.index === 0) {
+ return false;
+ }
+
+ dispatch(NavigationActions.back());
+ return true;
+ }
+
handleNavigationStateChange({SWIPEOUT_TASK}) {
SWIPEOUT_TASK(null);
}
render() {
const
{ dispatch, navigation, tasks, ...actionProps } = this.props,
navigationHelpers = addNavigationHelpers({ state: navigation, dispatch });
return (
<View style={AppContainerStyles.container}>
<AppNavigator
screenProps={{...tasks, ...actionProps}}
navigation={navigationHelpers}
onNavigationStateChange={(prevState, currentState) => this.handleNavigationStateChange(actionProps) }
/>
</View>
);
}
}
const
mapStateToProps = state => ({
navigation: state.navigation,
tasks: state.tasks
});
mapDispatchToProps = dispatch => ({
...bindActionCreators(actions, dispatch),
dispatch
});
export default connect(mapStateToProps, mapDispatchToProps)(App); | 22 | 0.52381 | 20 | 2 |
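The back-press handler introduced above follows the usual React Navigation contract: return `false` when there is no route to pop (letting Android close the app), otherwise dispatch a back action and return `true` to consume the event. Its decision logic can be exercised as a plain function with a stubbed `dispatch` (no React Native required; the action shape below is a stand-in, not the exact library object):

```javascript
// Stand-in for NavigationActions.back(): just a tagged action object.
const back = () => ({ type: 'Navigation/BACK' });

function handleBackPress({ dispatch, navigation }) {
  // Close the app if no route to go back to.
  if (navigation.index === 0) {
    return false;
  }
  dispatch(back());
  return true;
}

// Stub dispatch that records actions for inspection.
const dispatched = [];
const dispatch = (action) => dispatched.push(action);

const atRoot = handleBackPress({ dispatch, navigation: { index: 0 } }); // false, nothing dispatched
const nested = handleBackPress({ dispatch, navigation: { index: 2 } }); // true, one back action
```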
1dd9bdeaa2c89ba75427e9cc1d6fb675cb0ae8f3 | tdiagrams/test/EndToEnd.hs | tdiagrams/test/EndToEnd.hs | import Programs
import CCO.Component (Component, component, ioRun, printer)
import CCO.Diag (parser)
import CCO.Tree (ATerm, Tree (fromTree, toTree))
import CCO.TypeCheck (typeCheckATerm)
import CCO.Diag2Picture (diag2pictureATerm)
import CCO.Picture (Picture)
import Control.Arrow (Arrow (arr), (>>>))
import System.IO (hClose, hPutStrLn, openFile, Handle, IOMode(WriteMode))
main = do
handle <- openFile "inference rules/diagrams.tex" WriteMode
sequence $ map (run handle) correctPrograms
hClose handle
run :: Handle -> String -> IO ()
run handle input = do
hPutStrLn handle ""
hPutStrLn handle "\\bigskip"
hPutStrLn handle $ "\\noindent " ++ input
hPutStrLn handle "\\bigskip"
hPutStrLn handle ""
picture handle input
picture :: Handle -> String -> IO ()
picture handle input = do
parsed <- ioRun (parser >>> arr fromTree) input
typeChecked <- ioRun typeCheckATerm parsed
picture <- ioRun diag2pictureATerm typeChecked
latex <- ioRun (component toTree :: Component ATerm Picture) picture
output <- ioRun printer latex
hPutStrLn handle output
| import Programs
import CCO.Component (Component, component, ioRun, printer)
import CCO.Diag (parser)
import CCO.Tree (ATerm, Tree (fromTree, toTree))
import CCO.TypeCheck (typeCheckATerm)
import CCO.Diag2Picture (diag2pictureATerm)
import CCO.Picture (Picture)
import Control.Arrow (Arrow (arr), (>>>))
import System.IO (hClose, hPutStrLn, openFile, Handle, IOMode(WriteMode))
main = do
handle <- openFile "readme/diagrams.tex" WriteMode
sequence $ map (run handle) correctPrograms
hClose handle
run :: Handle -> String -> IO ()
run handle input = do
hPutStrLn handle ""
hPutStrLn handle "\\bigskip"
hPutStrLn handle $ "\\noindent " ++ input
hPutStrLn handle "\\bigskip"
hPutStrLn handle ""
picture handle input
picture :: Handle -> String -> IO ()
picture handle input = do
parsed <- ioRun (parser >>> arr fromTree) input
typeChecked <- ioRun typeCheckATerm parsed
picture <- ioRun diag2pictureATerm typeChecked
latex <- ioRun (component toTree :: Component ATerm Picture) picture
output <- ioRun printer latex
hPutStrLn handle output
| Fix end to end tests | Fix end to end tests
| Haskell | apache-2.0 | aochagavia/CompilerConstruction | haskell | ## Code Before:
import Programs
import CCO.Component (Component, component, ioRun, printer)
import CCO.Diag (parser)
import CCO.Tree (ATerm, Tree (fromTree, toTree))
import CCO.TypeCheck (typeCheckATerm)
import CCO.Diag2Picture (diag2pictureATerm)
import CCO.Picture (Picture)
import Control.Arrow (Arrow (arr), (>>>))
import System.IO (hClose, hPutStrLn, openFile, Handle, IOMode(WriteMode))
main = do
handle <- openFile "inference rules/diagrams.tex" WriteMode
sequence $ map (run handle) correctPrograms
hClose handle
run :: Handle -> String -> IO ()
run handle input = do
hPutStrLn handle ""
hPutStrLn handle "\\bigskip"
hPutStrLn handle $ "\\noindent " ++ input
hPutStrLn handle "\\bigskip"
hPutStrLn handle ""
picture handle input
picture :: Handle -> String -> IO ()
picture handle input = do
parsed <- ioRun (parser >>> arr fromTree) input
typeChecked <- ioRun typeCheckATerm parsed
picture <- ioRun diag2pictureATerm typeChecked
latex <- ioRun (component toTree :: Component ATerm Picture) picture
output <- ioRun printer latex
hPutStrLn handle output
## Instruction:
Fix end to end tests
## Code After:
import Programs
import CCO.Component (Component, component, ioRun, printer)
import CCO.Diag (parser)
import CCO.Tree (ATerm, Tree (fromTree, toTree))
import CCO.TypeCheck (typeCheckATerm)
import CCO.Diag2Picture (diag2pictureATerm)
import CCO.Picture (Picture)
import Control.Arrow (Arrow (arr), (>>>))
import System.IO (hClose, hPutStrLn, openFile, Handle, IOMode(WriteMode))
main = do
handle <- openFile "readme/diagrams.tex" WriteMode
sequence $ map (run handle) correctPrograms
hClose handle
run :: Handle -> String -> IO ()
run handle input = do
hPutStrLn handle ""
hPutStrLn handle "\\bigskip"
hPutStrLn handle $ "\\noindent " ++ input
hPutStrLn handle "\\bigskip"
hPutStrLn handle ""
picture handle input
picture :: Handle -> String -> IO ()
picture handle input = do
parsed <- ioRun (parser >>> arr fromTree) input
typeChecked <- ioRun typeCheckATerm parsed
picture <- ioRun diag2pictureATerm typeChecked
latex <- ioRun (component toTree :: Component ATerm Picture) picture
output <- ioRun printer latex
hPutStrLn handle output
| import Programs
import CCO.Component (Component, component, ioRun, printer)
import CCO.Diag (parser)
import CCO.Tree (ATerm, Tree (fromTree, toTree))
import CCO.TypeCheck (typeCheckATerm)
import CCO.Diag2Picture (diag2pictureATerm)
import CCO.Picture (Picture)
import Control.Arrow (Arrow (arr), (>>>))
import System.IO (hClose, hPutStrLn, openFile, Handle, IOMode(WriteMode))
main = do
- handle <- openFile "inference rules/diagrams.tex" WriteMode
? ---- ^^ ------
+ handle <- openFile "readme/diagrams.tex" WriteMode
? ^^^
sequence $ map (run handle) correctPrograms
hClose handle
run :: Handle -> String -> IO ()
run handle input = do
hPutStrLn handle ""
hPutStrLn handle "\\bigskip"
hPutStrLn handle $ "\\noindent " ++ input
hPutStrLn handle "\\bigskip"
hPutStrLn handle ""
picture handle input
picture :: Handle -> String -> IO ()
picture handle input = do
parsed <- ioRun (parser >>> arr fromTree) input
typeChecked <- ioRun typeCheckATerm parsed
picture <- ioRun diag2pictureATerm typeChecked
latex <- ioRun (component toTree :: Component ATerm Picture) picture
output <- ioRun printer latex
hPutStrLn handle output | 2 | 0.057143 | 1 | 1 |
f0635e5ce8484febf231c5456efe7cbed66b0ad9 | package.json | package.json | {
"name": "fontserver",
"version": "0.1.0",
"main": "index.js",
"private": true,
"dependencies": {
"node-pre-gyp": "0.5.x"
},
"bundledDependencies": ["node-pre-gyp"],
"devDependencies": {
"mocha": "1.17.x",
"llmr": "http://mapbox-npm.s3.amazonaws.com/tiles/llmr@v0.0.4.tar.gz",
"pbf": "0.0.1"
},
"scripts": {
"install": "node-pre-gyp install --fallback-to-build",
"test": "node_modules/.bin/mocha -R spec"
},
"binary": {
"module_name" : "fontserver",
"module_path" : "./lib/",
"host" : "https://mapbox-node-binary.s3.amazonaws.com",
"remote_path" : "./{name}/v{version}",
"package_name": "{node_abi}-{platform}-{arch}.tar.gz"
}
}
| {
"name": "fontserver",
"version": "0.1.0",
"main": "index.js",
"private": true,
"dependencies": {
"node-pre-gyp": "0.5.x"
},
"bundledDependencies": ["node-pre-gyp"],
"devDependencies": {
"mocha": "1.17.x",
"pbf": "0.0.1"
},
"scripts": {
"install": "node-pre-gyp install --fallback-to-build",
"test": "node_modules/.bin/mocha -R spec"
},
"binary": {
"module_name" : "fontserver",
"module_path" : "./lib/",
"host" : "https://mapbox-node-binary.s3.amazonaws.com",
"remote_path" : "./{name}/v{version}",
"package_name": "{node_abi}-{platform}-{arch}.tar.gz"
}
}
| Remove llmr dep for now. | Remove llmr dep for now.
| JSON | bsd-3-clause | mattdesl/node-fontnik,mapbox/node-fontnik,mattdesl/node-fontnik,mapbox/node-fontnik,mapbox/node-fontnik,mapbox/node-fontnik,mattdesl/node-fontnik,mattdesl/node-fontnik | json | ## Code Before:
{
"name": "fontserver",
"version": "0.1.0",
"main": "index.js",
"private": true,
"dependencies": {
"node-pre-gyp": "0.5.x"
},
"bundledDependencies": ["node-pre-gyp"],
"devDependencies": {
"mocha": "1.17.x",
"llmr": "http://mapbox-npm.s3.amazonaws.com/tiles/llmr@v0.0.4.tar.gz",
"pbf": "0.0.1"
},
"scripts": {
"install": "node-pre-gyp install --fallback-to-build",
"test": "node_modules/.bin/mocha -R spec"
},
"binary": {
"module_name" : "fontserver",
"module_path" : "./lib/",
"host" : "https://mapbox-node-binary.s3.amazonaws.com",
"remote_path" : "./{name}/v{version}",
"package_name": "{node_abi}-{platform}-{arch}.tar.gz"
}
}
## Instruction:
Remove llmr dep for now.
## Code After:
{
"name": "fontserver",
"version": "0.1.0",
"main": "index.js",
"private": true,
"dependencies": {
"node-pre-gyp": "0.5.x"
},
"bundledDependencies": ["node-pre-gyp"],
"devDependencies": {
"mocha": "1.17.x",
"pbf": "0.0.1"
},
"scripts": {
"install": "node-pre-gyp install --fallback-to-build",
"test": "node_modules/.bin/mocha -R spec"
},
"binary": {
"module_name" : "fontserver",
"module_path" : "./lib/",
"host" : "https://mapbox-node-binary.s3.amazonaws.com",
"remote_path" : "./{name}/v{version}",
"package_name": "{node_abi}-{platform}-{arch}.tar.gz"
}
}
| {
"name": "fontserver",
"version": "0.1.0",
"main": "index.js",
"private": true,
"dependencies": {
"node-pre-gyp": "0.5.x"
},
"bundledDependencies": ["node-pre-gyp"],
"devDependencies": {
"mocha": "1.17.x",
- "llmr": "http://mapbox-npm.s3.amazonaws.com/tiles/llmr@v0.0.4.tar.gz",
"pbf": "0.0.1"
},
"scripts": {
"install": "node-pre-gyp install --fallback-to-build",
"test": "node_modules/.bin/mocha -R spec"
},
"binary": {
"module_name" : "fontserver",
"module_path" : "./lib/",
"host" : "https://mapbox-node-binary.s3.amazonaws.com",
"remote_path" : "./{name}/v{version}",
"package_name": "{node_abi}-{platform}-{arch}.tar.gz"
}
} | 1 | 0.032258 | 0 | 1 |
540ce8f0858e027d89a42b18d12d7730bc63440d | _config.yml | _config.yml | title: jekyllDecent
subtitle: Example Blog
description: This is the jekyllTheme example blog. It only exists to demonstrate how the theme looks and feels.
keywords: demo theme example github jekyll features
language: "en" # default language of the blog. Language codes: http://www.w3schools.com/tags/ref_language_codes.asp
baseurl: "" # the subpath of your site, e.g. /blog
url: "" # the base hostname & protocol for your site. Have a look at robots.txt as well!
permalink: /blog/:categories/:title
cover: /assets/mountain-cover.jpg
disqus_shortname: "jekylldecent"
# Build settings
markdown: kramdown
kramdown:
syntax_highlighter_opts:
disable: true
include: ['_pages']
exclude: [vendor]
gems:
- jekyll-mentions
- jekyll-feed
- jekyll-sitemap
- jekyll-redirect-from
| title: jekyllDecent
subtitle: Example Blog
description: This is the jekyllTheme example blog. It only exists to demonstrate how the theme looks and feels.
keywords: demo theme example github jekyll features
language: "en" # default language of the blog. Language codes: http://www.w3schools.com/tags/ref_language_codes.asp
baseurl: "jekyllDecent" # the subpath of your site, e.g. /blog
url: "" # the base hostname & protocol for your site. Have a look at robots.txt as well!
permalink: /blog/:categories/:title
cover: /assets/mountain-cover.jpg
disqus_shortname: "jekylldecent"
# Build settings
markdown: kramdown
kramdown:
syntax_highlighter_opts:
disable: true
include: ['_pages']
exclude: [vendor]
gems:
- jekyll-mentions
- jekyll-feed
- jekyll-sitemap
- jekyll-redirect-from
| Set baseurl to fix redirect bug | Set baseurl to fix redirect bug
Close #35 | YAML | mit | jwillmer/jekyllDecent,jwillmer/jekyllDecent,jwillmer/jekyllDecent | yaml | ## Code Before:
title: jekyllDecent
subtitle: Example Blog
description: This is the jekyllTheme example blog. It only exists to demonstrate how the theme looks and feels.
keywords: demo theme example github jekyll features
language: "en" # default language of the blog. Language codes: http://www.w3schools.com/tags/ref_language_codes.asp
baseurl: "" # the subpath of your site, e.g. /blog
url: "" # the base hostname & protocol for your site. Have a look at robots.txt as well!
permalink: /blog/:categories/:title
cover: /assets/mountain-cover.jpg
disqus_shortname: "jekylldecent"
# Build settings
markdown: kramdown
kramdown:
syntax_highlighter_opts:
disable: true
include: ['_pages']
exclude: [vendor]
gems:
- jekyll-mentions
- jekyll-feed
- jekyll-sitemap
- jekyll-redirect-from
## Instruction:
Set baseurl to fix redirect bug
Close #35
## Code After:
title: jekyllDecent
subtitle: Example Blog
description: This is the jekyllTheme example blog. It only exists to demonstrate how the theme looks and feels.
keywords: demo theme example github jekyll features
language: "en" # default language of the blog. Language codes: http://www.w3schools.com/tags/ref_language_codes.asp
baseurl: "jekyllDecent" # the subpath of your site, e.g. /blog
url: "" # the base hostname & protocol for your site. Have a look at robots.txt as well!
permalink: /blog/:categories/:title
cover: /assets/mountain-cover.jpg
disqus_shortname: "jekylldecent"
# Build settings
markdown: kramdown
kramdown:
syntax_highlighter_opts:
disable: true
include: ['_pages']
exclude: [vendor]
gems:
- jekyll-mentions
- jekyll-feed
- jekyll-sitemap
- jekyll-redirect-from
| title: jekyllDecent
subtitle: Example Blog
description: This is the jekyllTheme example blog. It only exists to demonstrate how the theme looks and feels.
keywords: demo theme example github jekyll features
language: "en" # default language of the blog. Language codes: http://www.w3schools.com/tags/ref_language_codes.asp
- baseurl: "" # the subpath of your site, e.g. /blog
+ baseurl: "jekyllDecent" # the subpath of your site, e.g. /blog
? ++++++++++++
url: "" # the base hostname & protocol for your site. Have a look at robots.txt as well!
permalink: /blog/:categories/:title
cover: /assets/mountain-cover.jpg
disqus_shortname: "jekylldecent"
# Build settings
markdown: kramdown
kramdown:
syntax_highlighter_opts:
disable: true
include: ['_pages']
exclude: [vendor]
gems:
- jekyll-mentions
- jekyll-feed
- jekyll-sitemap
- jekyll-redirect-from | 2 | 0.083333 | 1 | 1 |
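The `baseurl` fix above matters because a GitHub project page is served from a subpath (`/jekyllDecent/...`), and Jekyll templates typically build internal links as `baseurl` plus the page path; with `baseurl: ""` those links resolved against the domain root, which is the kind of breakage the referenced redirect bug (#35) describes. A rough model of that joining behavior (simplified; Jekyll's real `relative_url` handling covers more cases):

```javascript
// Simplified model of prefixing a page path with the site baseurl.
function relativeUrl(baseurl, pagePath) {
  const prefix = baseurl ? '/' + baseurl.replace(/^\/|\/$/g, '') : '';
  return prefix + pagePath;
}

const before = relativeUrl('', '/blog/example');             // '/blog/example'
const after  = relativeUrl('jekyllDecent', '/blog/example'); // '/jekyllDecent/blog/example'
```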
4804f884015f1ecbe56c52d9da68eeea662501ed | source/manual/deploy-fixes-for-a-security-vulnerability.html.md | source/manual/deploy-fixes-for-a-security-vulnerability.html.md | ---
owner_slack: "#govuk-2ndline"
title: Deploy fixes for a security vulnerability
section: Deployment
layout: manual_layout
parent: "/manual.html"
last_reviewed_on: 2018-05-31
review_in: 3 months
---
When responding to a security incident, we should review the changes in private
before deploying them, so that we don't accidentally disclose the vulnerability.
To do this, push branches to the [GitLab backup](github-unavailable.html)
of the repository, rather than the normal repository on github.com.
This repository should be up to date as of the previous release, but will be
missing any unreleased commits that are on github.com master.
This is fine as you don't want to deploy these.
## Deploy process
1. Ensure nobody else deploys the app until you've confirmed the vulnerability
is fixed.
1. Review the pull request on GitLab
1. Create a release tag manually in git. This should follow the standard format
`release_X`. Tag the branch directly instead of merging it.
1. Don't use the release app. Go directly to the `deploy_app` Jenkins job, and
check "DEPLOY_FROM_GITLAB".
## After deploying
1. Ensure the vulnerability is fixed.
1. Push the branch and tag to github.com.
1. Merge the branch into master.
1. Record the missing deployment in the release app.
| ---
owner_slack: "#govuk-2ndline"
title: Deploy fixes for a security vulnerability
section: Deployment
layout: manual_layout
parent: "/manual.html"
last_reviewed_on: 2018-05-31
review_in: 3 months
---
When responding to a security incident, we should review the changes in private
before deploying them, so that we don't accidentally disclose the vulnerability.
To do this, push branches to the [AWS CodeCommit backup](github-unavailable.html)
of the repository, rather than the normal repository on github.com.
This repository should be up to date as of the previous release, but will be
missing any unreleased commits that are on github.com master.
This is fine as you don't want to deploy these.
## Deploy process
1. Ensure nobody else deploys the app until you've confirmed the vulnerability
is fixed.
1. Review the pull request on AWS CodeCommit
1. Create a release tag manually in git. This should follow the standard format
`release_X`. Tag the branch directly instead of merging it.
1. Don't use the release app. Go directly to the `deploy_app` Jenkins job, and
check "DEPLOY_FROM_AWS_CODECOMMIT".
## After deploying
1. Ensure the vulnerability is fixed.
1. Push the branch and tag to github.com.
1. Merge the branch into master.
1. Record the missing deployment in the release app.
| Update information on deploying security vulnerability fixes. | Update information on deploying security vulnerability fixes.
| Markdown | mit | alphagov/govuk-developer-docs,alphagov/govuk-developer-docs,alphagov/govuk-developer-docs,alphagov/govuk-developer-docs | markdown | ## Code Before:
---
owner_slack: "#govuk-2ndline"
title: Deploy fixes for a security vulnerability
section: Deployment
layout: manual_layout
parent: "/manual.html"
last_reviewed_on: 2018-05-31
review_in: 3 months
---
When responding to a security incident, we should review the changes in private
before deploying them, so that we don't accidentally disclose the vulnerability.
To do this, push branches to the [GitLab backup](github-unavailable.html)
of the repository, rather than the normal repository on github.com.
This repository should be up to date as of the previous release, but will be
missing any unreleased commits that are on github.com master.
This is fine as you don't want to deploy these.
## Deploy process
1. Ensure nobody else deploys the app until you've confirmed the vulnerability
is fixed.
1. Review the pull request on GitLab
1. Create a release tag manually in git. This should follow the standard format
`release_X`. Tag the branch directly instead of merging it.
1. Don't use the release app. Go directly to the `deploy_app` Jenkins job, and
check "DEPLOY_FROM_GITLAB".
## After deploying
1. Ensure the vulnerability is fixed.
1. Push the branch and tag to github.com.
1. Merge the branch into master.
1. Record the missing deployment in the release app.
## Instruction:
Update information on deploying security vulnerability fixes.
## Code After:
---
owner_slack: "#govuk-2ndline"
title: Deploy fixes for a security vulnerability
section: Deployment
layout: manual_layout
parent: "/manual.html"
last_reviewed_on: 2018-05-31
review_in: 3 months
---
When responding to a security incident, we should review the changes in private
before deploying them, so that we don't accidentally disclose the vulnerability.
To do this, push branches to the [AWS CodeCommit backup](github-unavailable.html)
of the repository, rather than the normal repository on github.com.
This repository should be up to date as of the previous release, but will be
missing any unreleased commits that are on github.com master.
This is fine as you don't want to deploy these.
## Deploy process
1. Ensure nobody else deploys the app until you've confirmed the vulnerability
is fixed.
1. Review the pull request on AWS CodeCommit
1. Create a release tag manually in git. This should follow the standard format
`release_X`. Tag the branch directly instead of merging it.
1. Don't use the release app. Go directly to the `deploy_app` Jenkins job, and
check "DEPLOY_FROM_AWS_CODECOMMIT".
## After deploying
1. Ensure the vulnerability is fixed.
1. Push the branch and tag to github.com.
1. Merge the branch into master.
1. Record the missing deployment in the release app.
| ---
owner_slack: "#govuk-2ndline"
title: Deploy fixes for a security vulnerability
section: Deployment
layout: manual_layout
parent: "/manual.html"
last_reviewed_on: 2018-05-31
review_in: 3 months
---
When responding to a security incident, we should review the changes in private
before deploying them, so that we don't accidentally disclose the vulnerability.
- To do this, push branches to the [GitLab backup](github-unavailable.html)
? ^ ---
+ To do this, push branches to the [AWS CodeCommit backup](github-unavailable.html)
? ^^^^^^^^^^^^
of the repository, rather than the normal repository on github.com.
This repository should be up to date as of the previous release, but will be
missing any unreleased commits that are on github.com master.
This is fine as you don't want to deploy these.
## Deploy process
1. Ensure nobody else deploys the app until you've confirmed the vulnerability
is fixed.
- 1. Review the pull request on GitLab
? ^ ---
+ 1. Review the pull request on AWS CodeCommit
? ^^^^^^^^^^^^
1. Create a release tag manually in git. This should follow the standard format
`release_X`. Tag the branch directly instead of merging it.
1. Don't use the release app. Go directly to the `deploy_app` Jenkins job, and
- check "DEPLOY_FROM_GITLAB".
? ^ ---
+ check "DEPLOY_FROM_AWS_CODECOMMIT".
? ^^^^^^^^^^^^
## After deploying
1. Ensure the vulnerability is fixed.
1. Push the branch and tag to github.com.
1. Merge the branch into master.
1. Record the missing deployment in the release app. | 6 | 0.142857 | 3 | 3 |
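Every record's `content` field repeats the same three headings — `## Code Before:`, `## Instruction:`, `## Code After:` — wrapping the record's `old_contents`, commit `message`, and `new_contents` columns. A hedged sketch of that assembly (the exact template string and the `build_content` name are assumptions for illustration, not the dataset's actual tooling):

```python
def build_content(old_contents: str, message: str, new_contents: str) -> str:
    """Assemble a '## Code Before / Instruction / After' sample from one record."""
    return (
        "## Code Before:\n" + old_contents + "\n"
        "## Instruction:\n" + message + "\n"
        "## Code After:\n" + new_contents + "\n"
    )

# Abbreviated field values from the record above, for illustration only.
sample = build_content(
    'check "DEPLOY_FROM_GITLAB".',
    "Update information on deploying security vulnerability fixes.",
    'check "DEPLOY_FROM_AWS_CODECOMMIT".',
)
print(sample)
```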
82c67c64bed3c4c05142632257c30d4769639df2 | api/app/controllers/spree/api/zones_controller.rb | api/app/controllers/spree/api/zones_controller.rb | module Spree
module Api
class ZonesController < Spree::Api::BaseController
def create
authorize! :create, Zone
@zone = Zone.new(map_nested_attributes_keys(Spree::Zone, params[:zone]))
if @zone.save
respond_with(@zone, :status => 201, :default_template => :show)
else
invalid_resource!(@zone)
end
end
def destroy
authorize! :destroy, zone
zone.destroy
respond_with(zone, :status => 204)
end
def index
@zones = Zone.accessible_by(current_ability, :read).order('name ASC').ransack(params[:q]).result.page(params[:page]).per(params[:per_page])
respond_with(@zones)
end
def show
respond_with(zone)
end
def update
authorize! :update, zone
if zone.update_attributes(map_nested_attributes_keys(Spree::Zone, params[:zone]))
respond_with(zone, :status => 200, :default_template => :show)
else
invalid_resource!(zone)
end
end
private
def zone
@zone ||= Spree::Zone.accessible_by(current_ability, :read).find(params[:id])
end
end
end
end
| module Spree
module Api
class ZonesController < Spree::Api::BaseController
def create
authorize! :create, Zone
@zone = Zone.new(map_nested_attributes_keys(Spree::Zone, zone_params))
if @zone.save
respond_with(@zone, :status => 201, :default_template => :show)
else
invalid_resource!(@zone)
end
end
def destroy
authorize! :destroy, zone
zone.destroy
respond_with(zone, :status => 204)
end
def index
@zones = Zone.accessible_by(current_ability, :read).order('name ASC').ransack(params[:q]).result.page(params[:page]).per(params[:per_page])
respond_with(@zones)
end
def show
respond_with(zone)
end
def update
authorize! :update, zone
if zone.update_attributes(map_nested_attributes_keys(Spree::Zone, zone_params))
respond_with(zone, :status => 200, :default_template => :show)
else
invalid_resource!(zone)
end
end
private
def zone_params
params.require(:zone).permit!
end
def zone
@zone ||= Spree::Zone.accessible_by(current_ability, :read).find(params[:id])
end
end
end
end
| Call permit! on zone params before passing to model | Call permit! on zone params before passing to model
Apparently as of rails 4.2 we must `permit` controller `params` before
passing them to the model. Previous rails version `params` probably
didn't respond to `permitted?`
see https://github.com/rails/rails/blob/v4.2.0/activemodel/lib/active_model/forbidden_attributes_protection.rb
| Ruby | bsd-3-clause | nooysters/spree,moneyspyder/spree,jeffboulet/spree,grzlus/spree,jparr/spree,reinaris/spree,vmatekole/spree,Boomkat/spree,groundctrl/spree,tomash/spree,FadliKun/spree,maybii/spree,welitonfreitas/spree,camelmasa/spree,orenf/spree,hoanghiep90/spree,moneyspyder/spree,alejandromangione/spree,jimblesm/spree,adaddeo/spree,groundctrl/spree,derekluo/spree,vulk/spree,firman/spree,piousbox/spree,radarseesradar/spree,softr8/spree,pervino/spree,gautamsawhney/spree,rbngzlv/spree,JuandGirald/spree,ahmetabdi/spree,quentinuys/spree,patdec/spree,dafontaine/spree,robodisco/spree,shekibobo/spree,keatonrow/spree,carlesjove/spree,beni55/spree,FadliKun/spree,delphsoft/spree-store-ballchair,carlesjove/spree,useiichi/spree,ramkumar-kr/spree,jaspreet21anand/spree,keatonrow/spree,joanblake/spree,volpejoaquin/spree,rakibulislam/spree,siddharth28/spree,sfcgeorge/spree,DarkoP/spree,odk211/spree,NerdsvilleCEO/spree,radarseesradar/spree,softr8/spree,pervino/spree,TimurTarasenko/spree,joanblake/spree,firman/spree,lyzxsc/spree,moneyspyder/spree,DarkoP/spree,priyank-gupta/spree,calvinl/spree,yiqing95/spree,mleglise/spree,gregoryrikson/spree-sample,hifly/spree,jaspreet21anand/spree,agient/agientstorefront,Hawaiideveloper/shoppingcart,reidblomquist/spree,pervino/spree,TrialGuides/spree,sfcgeorge/spree,ayb/spree,yushine/spree,archSeer/spree,tancnle/spree,pulkit21/spree,sideci-sample/sideci-sample-spree,jaspreet21anand/spree,wolfieorama/spree,TimurTarasenko/spree,AgilTec/spree,azranel/spree,dafontaine/spree,moneyspyder/spree,gregoryrikson/spree-sample,njerrywerry/spree,ahmetabdi/spree,Hawaiideveloper/shoppingcart,Lostmyname/spree,sunny2601/spree,jspizziri/spree,omarsar/spree,imella/spree,ayb/spree,kewaunited/spree,robodisco/spree,fahidnasir/spree,keatonrow/spree,imella/spree,njerrywerry/spree,Ropeney/spree,tomash/spree,firman/spree,Engeltj/spree,grzlus/spree,tomash/spree,softr8/spree,carlesjove/spree,omarsar/spree,delphsoft/spree-store-ballchair,Machpowersystems/spree_mach,sliaquat/spree,JDutil/spree,Engeltj/spree,ramkumar-kr/spree,derekluo/spree,rakibulislam/spree,progsri/spree,maybii/spree,TrialGuides/spree,reidblomquist/spree,jeffboulet/spree,miyazawatomoka/spree,calvinl/spree,zamiang/spree,CiscoCloud/spree,adaddeo/spree,SadTreeFriends/spree,priyank-gupta/spree,JDutil/spree,raow/spree,nooysters/spree,locomotivapro/spree,builtbybuffalo/spree,madetech/spree,hoanghiep90/spree,rbngzlv/spree,siddharth28/spree,dandanwei/spree,mindvolt/spree,progsri/spree,berkes/spree,jaspreet21anand/spree,FadliKun/spree,trigrass2/spree,robodisco/spree,AgilTec/spree,vmatekole/spree,lyzxsc/spree,raow/spree,patdec/spree,azclick/spree,ahmetabdi/spree,sunny2601/spree,karlitxo/spree,grzlus/spree,calvinl/spree,odk211/spree,vulk/spree,alejandromangione/spree,jparr/spree,reinaris/spree,orenf/spree,Lostmyname/spree,vcavallo/spree,brchristian/spree,edgward/spree,karlitxo/spree,agient/agientstorefront,sliaquat/spree,kewaunited/spree,dandanwei/spree,caiqinghua/spree,TimurTarasenko/spree,volpejoaquin/spree,raow/spree,CJMrozek/spree,Kagetsuki/spree,AgilTec/spree,welitonfreitas/spree,abhishekjain16/spree,azclick/spree,gregoryrikson/spree-sample,shaywood2/spree,sideci-sample/sideci-sample-spree,zamiang/spree,archSeer/spree,kewaunited/spree,karlitxo/spree,Hawaiideveloper/shoppingcart,omarsar/spree,derekluo/spree,maybii/spree,archSeer/spree,yiqing95/spree,caiqinghua/spree,nooysters/spree,agient/agientstorefront,KMikhaylovCTG/spree,builtbybuffalo/spree,beni55/spree,shekibobo/spree,alvinjean/spree,pulkit21/spree,imella/spree,Kagetsuki/spree,rajeevriitm/spree,quentinuys/spree,sliaquat/spree,Ropeney/spree,locomotivapro/spree,progsri/spree,pulkit21/spree,yushine/spree,priyank-gupta/spree,zamiang/spree,vcavallo/spree,DynamoMTL/spree,quentinuys/spree,ayb/spree,carlesjove/spree,TrialGuides/spree,sfcgeorge/spree,ayb/spree,DarkoP/spree,priyank-gupta/spree,KMikhaylovCTG/spree,adaddeo/spree,vulk/spree,vinayvinsol/spree,groundctrl/spree,zaeznet/spree,lyzxsc/spree,miyazawatomoka/spree,jspizziri/spree,pulkit21/spree,vmatekole/spree,pervino/spree,madetech/spree,orenf/spree,patdec/spree,abhishekjain16/spree,mindvolt/spree,yushine/spree,Lostmyname/spree,thogg4/spree,robodisco/spree,agient/agientstorefront,trigrass2/spree,jasonfb/spree,hifly/spree,joanblake/spree,rajeevriitm/spree,delphsoft/spree-store-ballchair,vinsol/spree,tesserakt/clean_spree,JDutil/spree,tesserakt/clean_spree,FadliKun/spree,SadTreeFriends/spree,edgward/spree,thogg4/spree,Engeltj/spree,gregoryrikson/spree-sample,piousbox/spree,alvinjean/spree,jparr/spree,jimblesm/spree,hifly/spree,builtbybuffalo/spree,brchristian/spree,edgward/spree,abhishekjain16/spree,tancnle/spree,reidblomquist/spree,Hawaiideveloper/shoppingcart,firman/spree,brchristian/spree,gautamsawhney/spree,wolfieorama/spree,NerdsvilleCEO/spree,KMikhaylovCTG/spree,rbngzlv/spree,KMikhaylovCTG/spree,CiscoCloud/spree,sliaquat/spree,softr8/spree,yiqing95/spree,Engeltj/spree,TimurTarasenko/spree,edgward/spree,quentinuys/spree,vinsol/spree,zaeznet/spree,cutefrank/spree,APohio/spree,AgilTec/spree,JuandGirald/spree,caiqinghua/spree,CJMrozek/spree,NerdsvilleCEO/spree,JuandGirald/spree,DynamoMTL/spree,dandanwei/spree,Machpowersystems/spree_mach,sfcgeorge/spree,rakibulislam/spree,zaeznet/spree,siddharth28/spree,lsirivong/spree,radarseesradar/spree,camelmasa/spree,jasonfb/spree,wolfieorama/spree,APohio/spree,sunny2601/spree,njerrywerry/spree,fahidnasir/spree,locomotivapro/spree,alejandromangione/spree,brchristian/spree,madetech/spree,dafontaine/spree,joanblake/spree,tesserakt/clean_spree,shekibobo/spree,azranel/spree,kewaunited/spree,shaywood2/spree,JuandGirald/spree,lsirivong/spree,lsirivong/spree,DynamoMTL/spree,tancnle/spree,cutefrank/spree,dandanwei/spree,APohio/spree,vinayvinsol/spree,alejandromangione/spree,vulk/spree,piousbox/spree,grzlus/spree,CJMrozek/spree,odk211/spree,useiichi/spree,dafontaine/spree,vcavallo/spree,azclick/spree,Boomkat/spree,tesserakt/clean_spree,CiscoCloud/spree,jspizziri/spree,yushine/spree,vcavallo/spree,vinsol/spree,caiqinghua/spree,welitonfreitas/spree,useiichi/spree,SadTreeFriends/spree,omarsar/spree,karlitxo/spree,njerrywerry/spree,sideci-sample/sideci-sample-spree,camelmasa/spree,Boomkat/spree,jeffboulet/spree,mleglise/spree,raow/spree,camelmasa/spree,miyazawatomoka/spree,odk211/spree,siddharth28/spree,jparr/spree,DynamoMTL/spree,alvinjean/spree,builtbybuffalo/spree,jimblesm/spree,fahidnasir/spree,gautamsawhney/spree,mleglise/spree,JDutil/spree,archSeer/spree,jasonfb/spree,jeffboulet/spree,berkes/spree,cutefrank/spree,vinayvinsol/spree,rajeevriitm/spree,adaddeo/spree,SadTreeFriends/spree,azclick/spree,jasonfb/spree,Lostmyname/spree,DarkoP/spree,vinsol/spree,delphsoft/spree-store-ballchair,gautamsawhney/spree,trigrass2/spree,Ropeney/spree,Ropeney/spree,trigrass2/spree,Kagetsuki/spree,sunny2601/spree,APohio/spree,groundctrl/spree,shaywood2/spree,welitonfreitas/spree,nooysters/spree,orenf/spree,berkes/spree,hoanghiep90/spree,yiqing95/spree,rbngzlv/spree,ahmetabdi/spree,reinaris/spree,jspizziri/spree,thogg4/spree,alvinjean/spree,ramkumar-kr/spree,useiichi/spree,berkes/spree,CJMrozek/spree,vinayvinsol/spree,pervino/spree,Machpowersystems/spree_mach,tomash/spree,shekibobo/spree,mindvolt/spree,keatonrow/spree,shaywood2/spree,TrialGuides/spree,sfcgeorge/spree,zamiang/spree,CiscoCloud/spree,vmatekole/spree,zaeznet/spree,beni55/spree,rakibulislam/spree,Boomkat/spree,maybii/spree,patdec/spree,mleglise/spree,cutefrank/spree,NerdsvilleCEO/spree,volpejoaquin/spree,miyazawatomoka/spree,lsirivong/spree | ruby | ## Code Before:
module Spree
module Api
class ZonesController < Spree::Api::BaseController
def create
authorize! :create, Zone
@zone = Zone.new(map_nested_attributes_keys(Spree::Zone, params[:zone]))
if @zone.save
respond_with(@zone, :status => 201, :default_template => :show)
else
invalid_resource!(@zone)
end
end
def destroy
authorize! :destroy, zone
zone.destroy
respond_with(zone, :status => 204)
end
def index
@zones = Zone.accessible_by(current_ability, :read).order('name ASC').ransack(params[:q]).result.page(params[:page]).per(params[:per_page])
respond_with(@zones)
end
def show
respond_with(zone)
end
def update
authorize! :update, zone
if zone.update_attributes(map_nested_attributes_keys(Spree::Zone, params[:zone]))
respond_with(zone, :status => 200, :default_template => :show)
else
invalid_resource!(zone)
end
end
private
def zone
@zone ||= Spree::Zone.accessible_by(current_ability, :read).find(params[:id])
end
end
end
end
## Instruction:
Call permit! on zone params before passing to model
Apparently as of rails 4.2 we must `permit` controller `params` before
passing them to the model. Previous rails version `params` probably
didn't respond to `permitted?`
see https://github.com/rails/rails/blob/v4.2.0/activemodel/lib/active_model/forbidden_attributes_protection.rb
## Code After:
module Spree
module Api
class ZonesController < Spree::Api::BaseController
def create
authorize! :create, Zone
@zone = Zone.new(map_nested_attributes_keys(Spree::Zone, zone_params))
if @zone.save
respond_with(@zone, :status => 201, :default_template => :show)
else
invalid_resource!(@zone)
end
end
def destroy
authorize! :destroy, zone
zone.destroy
respond_with(zone, :status => 204)
end
def index
@zones = Zone.accessible_by(current_ability, :read).order('name ASC').ransack(params[:q]).result.page(params[:page]).per(params[:per_page])
respond_with(@zones)
end
def show
respond_with(zone)
end
def update
authorize! :update, zone
if zone.update_attributes(map_nested_attributes_keys(Spree::Zone, zone_params))
respond_with(zone, :status => 200, :default_template => :show)
else
invalid_resource!(zone)
end
end
private
def zone_params
params.require(:zone).permit!
end
def zone
@zone ||= Spree::Zone.accessible_by(current_ability, :read).find(params[:id])
end
end
end
end
| module Spree
module Api
class ZonesController < Spree::Api::BaseController
def create
authorize! :create, Zone
- @zone = Zone.new(map_nested_attributes_keys(Spree::Zone, params[:zone]))
? -------
+ @zone = Zone.new(map_nested_attributes_keys(Spree::Zone, zone_params))
? +++++
if @zone.save
respond_with(@zone, :status => 201, :default_template => :show)
else
invalid_resource!(@zone)
end
end
def destroy
authorize! :destroy, zone
zone.destroy
respond_with(zone, :status => 204)
end
def index
@zones = Zone.accessible_by(current_ability, :read).order('name ASC').ransack(params[:q]).result.page(params[:page]).per(params[:per_page])
respond_with(@zones)
end
def show
respond_with(zone)
end
def update
authorize! :update, zone
- if zone.update_attributes(map_nested_attributes_keys(Spree::Zone, params[:zone]))
? -------
+ if zone.update_attributes(map_nested_attributes_keys(Spree::Zone, zone_params))
? +++++
respond_with(zone, :status => 200, :default_template => :show)
else
invalid_resource!(zone)
end
end
private
+ def zone_params
+ params.require(:zone).permit!
+ end
def zone
@zone ||= Spree::Zone.accessible_by(current_ability, :read).find(params[:id])
end
end
end
end | 7 | 0.152174 | 5 | 2 |
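The four numbers trailing each record (here `| 7 | 0.152174 | 5 | 2 |`) line up with the `diff_length`, `relative_diff_length`, `n_lines_added`, and `n_lines_deleted` columns declared in the header. In the records shown, `diff_length` equals added plus deleted lines, and `relative_diff_length` is consistent with dividing `diff_length` by the file's total line count (7/46 ≈ 0.152174 here; the 46-line denominator is an assumption, since blank lines appear stripped in this dump). A hedged sketch of that computation:

```python
def diff_stats(diff_lines, file_line_count):
    """Derive (diff_length, relative_diff_length, n_lines_added, n_lines_deleted)
    from ndiff-style marker lines; '?' guides and context lines are ignored."""
    added = sum(1 for line in diff_lines if line.startswith('+ '))
    deleted = sum(1 for line in diff_lines if line.startswith('- '))
    diff_length = added + deleted
    return diff_length, diff_length / file_line_count, added, deleted

# The seven marker lines of the Spree zones diff above (context omitted).
spree_markers = [
    '- @zone = Zone.new(map_nested_attributes_keys(Spree::Zone, params[:zone]))',
    '+ @zone = Zone.new(map_nested_attributes_keys(Spree::Zone, zone_params))',
    '- if zone.update_attributes(map_nested_attributes_keys(Spree::Zone, params[:zone]))',
    '+ if zone.update_attributes(map_nested_attributes_keys(Spree::Zone, zone_params))',
    '+ def zone_params',
    '+ params.require(:zone).permit!',
    '+ end',
]
print(diff_stats(spree_markers, 46))  # assuming a 46-line file, matching 0.152174
```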
ee724c42e981222efa342816755ea3d3b718be3f | bootstrap.php | bootstrap.php | <?php
/**
* @package s9e\mediaembed
* @copyright Copyright (c) 2015-2016 The s9e Authors
* @license http://www.opensource.org/licenses/mit-license.php The MIT License
*/
namespace s9e\Flarum\MediaEmbed;
use Flarum\Event\ConfigureFormatter;
use Illuminate\Events\Dispatcher;
use s9e\TextFormatter\Configurator\Bundles\MediaPack;
function subscribe(Dispatcher $events)
{
$events->listen(
ConfigureFormatter::class,
function (ConfigureFormatter $event)
{
(new MediaPack)->configure($event->configurator);
}
);
};
return __NAMESPACE__ . '\\subscribe'; | <?php
/**
* @package s9e\mediaembed
* @copyright Copyright (c) 2015-2016 The s9e Authors
* @license http://www.opensource.org/licenses/mit-license.php The MIT License
*/
namespace s9e\Flarum\MediaEmbed;
use Flarum\Formatter\Event\Configuring;
use Illuminate\Events\Dispatcher;
use s9e\TextFormatter\Configurator\Bundles\MediaPack;
function subscribe(Dispatcher $events)
{
$events->listen(
Configuring::class,
function (Configuring $event)
{
(new MediaPack)->configure($event->configurator);
}
);
};
return __NAMESPACE__ . '\\subscribe';
| Update namespaces for beta 8 | Update namespaces for beta 8
| PHP | mit | s9e/flarum-ext-mediaembed | php | ## Code Before:
<?php
/**
* @package s9e\mediaembed
* @copyright Copyright (c) 2015-2016 The s9e Authors
* @license http://www.opensource.org/licenses/mit-license.php The MIT License
*/
namespace s9e\Flarum\MediaEmbed;
use Flarum\Event\ConfigureFormatter;
use Illuminate\Events\Dispatcher;
use s9e\TextFormatter\Configurator\Bundles\MediaPack;
function subscribe(Dispatcher $events)
{
$events->listen(
ConfigureFormatter::class,
function (ConfigureFormatter $event)
{
(new MediaPack)->configure($event->configurator);
}
);
};
return __NAMESPACE__ . '\\subscribe';
## Instruction:
Update namespaces for beta 8
## Code After:
<?php
/**
* @package s9e\mediaembed
* @copyright Copyright (c) 2015-2016 The s9e Authors
* @license http://www.opensource.org/licenses/mit-license.php The MIT License
*/
namespace s9e\Flarum\MediaEmbed;
use Flarum\Formatter\Event\Configuring;
use Illuminate\Events\Dispatcher;
use s9e\TextFormatter\Configurator\Bundles\MediaPack;
function subscribe(Dispatcher $events)
{
$events->listen(
Configuring::class,
function (Configuring $event)
{
(new MediaPack)->configure($event->configurator);
}
);
};
return __NAMESPACE__ . '\\subscribe';
| <?php
/**
* @package s9e\mediaembed
* @copyright Copyright (c) 2015-2016 The s9e Authors
* @license http://www.opensource.org/licenses/mit-license.php The MIT License
*/
namespace s9e\Flarum\MediaEmbed;
- use Flarum\Event\ConfigureFormatter;
+ use Flarum\Formatter\Event\Configuring;
use Illuminate\Events\Dispatcher;
use s9e\TextFormatter\Configurator\Bundles\MediaPack;
function subscribe(Dispatcher $events)
{
$events->listen(
- ConfigureFormatter::class,
+ Configuring::class,
- function (ConfigureFormatter $event)
? ^^^^^^^^^^
+ function (Configuring $event)
? ^^^
{
(new MediaPack)->configure($event->configurator);
}
);
};
return __NAMESPACE__ . '\\subscribe'; | 6 | 0.24 | 3 | 3 |
bfd5dcf8c5a50c7ba66305cd9562de8361afc3b5 | .travis.yml | .travis.yml | language: java
jdk:
- oraclejdk7
- oraclejdk8
sudo: false
notifications:
email:
on_success: never
# there is no signatory on build server, so call 'gradle jar' rather than
# 'gradle assemble' to prevent 'signArchives' from being called
install:
- gradle jar
| language: java
jdk:
- oraclejdk7
- oraclejdk8
sudo: false
notifications:
email:
on_success: never
# there is no signatory on build server, so exclude the 'signArchives' task
install:
- ./gradlew assemble -x signArchives
| Update Travis configuration to use wrapper | ASN-100: Update Travis configuration to use wrapper
| YAML | mit | brightsparklabs/asanti | yaml | ## Code Before:
language: java
jdk:
- oraclejdk7
- oraclejdk8
sudo: false
notifications:
email:
on_success: never
# there is no signatory on build server, so call 'gradle jar' rather than
# 'gradle assemble' to prevent 'signArchives' from being called
install:
- gradle jar
## Instruction:
ASN-100: Update Travis configuration to use wrapper
## Code After:
language: java
jdk:
- oraclejdk7
- oraclejdk8
sudo: false
notifications:
email:
on_success: never
# there is no signatory on build server, so exclude the 'signArchives' task
install:
- ./gradlew assemble -x signArchives
| language: java
jdk:
- oraclejdk7
- oraclejdk8
sudo: false
notifications:
email:
on_success: never
- # there is no signatory on build server, so call 'gradle jar' rather than
? - ^^^^^^ - ------- - ^
+ # there is no signatory on build server, so exclude the 'signArchives' task
? ++ ^ +++++++ +++++++ ^^
- # 'gradle assemble' to prevent 'signArchives' from being called
install:
- - gradle jar
+ - ./gradlew assemble -x signArchives | 5 | 0.3125 | 2 | 3 |
251d9796314135317a038c6564412149bbcb4855 | ideasbox/templates/ideasbox/user_list.html | ideasbox/templates/ideasbox/user_list.html | {% extends 'two-third-third.html' %}
{% load i18n ideasbox_tags %}
{% block twothird %}
<h2>{% trans "users" %}</h2>
{% for user in user_list %}
<p><a href="{% url 'user_detail' pk=user.pk %}">{{ user }}</a></p>
{% endfor %}
{% endblock twothird %}
{% block third %}
<div class="card tinted admin">
{% fa 'user-plus' 'fa-fw' %} <a href="{% url 'user_create' %}">{% trans "Add user" %}</a>
</div>
{% endblock third %}
| {% extends 'two-third-third.html' %}
{% load i18n ideasbox_tags %}
{% block twothird %}
<h2>{% trans "Users" %} ({{ user_list.count }})</h2>
{% for user in user_list %}
<p><a href="{% url 'user_detail' pk=user.pk %}">{{ user }}</a></p>
{% endfor %}
{% endblock twothird %}
{% block third %}
<div class="card tinted admin">
{% fa 'user-plus' 'fa-fw' %} <a href="{% url 'user_create' %}">{% trans "Add user" %}</a>
</div>
{% endblock third %}
| Add users count in users list page | Add users count in users list page
| HTML | agpl-3.0 | ideascube/ideascube,Lcaracol/ideasbox.lan,Lcaracol/ideasbox.lan,Lcaracol/ideasbox.lan,ideascube/ideascube,ideascube/ideascube,ideascube/ideascube | html | ## Code Before:
{% extends 'two-third-third.html' %}
{% load i18n ideasbox_tags %}
{% block twothird %}
<h2>{% trans "users" %}</h2>
{% for user in user_list %}
<p><a href="{% url 'user_detail' pk=user.pk %}">{{ user }}</a></p>
{% endfor %}
{% endblock twothird %}
{% block third %}
<div class="card tinted admin">
{% fa 'user-plus' 'fa-fw' %} <a href="{% url 'user_create' %}">{% trans "Add user" %}</a>
</div>
{% endblock third %}
## Instruction:
Add users count in users list page
## Code After:
{% extends 'two-third-third.html' %}
{% load i18n ideasbox_tags %}
{% block twothird %}
<h2>{% trans "Users" %} ({{ user_list.count }})</h2>
{% for user in user_list %}
<p><a href="{% url 'user_detail' pk=user.pk %}">{{ user }}</a></p>
{% endfor %}
{% endblock twothird %}
{% block third %}
<div class="card tinted admin">
{% fa 'user-plus' 'fa-fw' %} <a href="{% url 'user_create' %}">{% trans "Add user" %}</a>
</div>
{% endblock third %}
| {% extends 'two-third-third.html' %}
{% load i18n ideasbox_tags %}
{% block twothird %}
- <h2>{% trans "users" %}</h2>
+ <h2>{% trans "Users" %} ({{ user_list.count }})</h2>
{% for user in user_list %}
<p><a href="{% url 'user_detail' pk=user.pk %}">{{ user }}</a></p>
{% endfor %}
{% endblock twothird %}
{% block third %}
<div class="card tinted admin">
{% fa 'user-plus' 'fa-fw' %} <a href="{% url 'user_create' %}">{% trans "Add user" %}</a>
</div>
{% endblock third %} | 2 | 0.142857 | 1 | 1 |
915a9850dcf2459ea2c02f9b08b07a45fbbbfd59 | README.md | README.md | Scripts to help you track changes to your /etc files
After changing a file in `/etc` for the first time, track it using
`gitetc-add`. After subsequent changes or system updates, run `gitetc-sync` to
check for changes tracked files.
`gitetc` maintains a `git` repository located by default (see
[below](#configuration)) in `~/.gitetc/`.
## Scripts
* `gitetc-add` track a new `/etc` file
* `gitetc-sync` checks `/etc` and updates all tracked files
* `gitetc-install` copies all tracked files to `/etc`
## Installing
If the target machine is running [Arch Linux](https://www.archlinux.org/),
an [AUR package](https://aur.archlinux.org/packages/gitetc/) is
available.
Otherwise, the scripts in the `bin` directory can be executed directly.
## Configuration
Change the location of the backing `git` repository via:
```sh
git config --global gitetc.path ~/projects/etcfiles
```
| Scripts to help you track changes to your /etc files
After changing a file in `/etc` for the first time, track it using
`gitetc-add`. After subsequent changes or system updates, run `gitetc-sync` to
check for changes tracked files.
`gitetc` maintains a `git` repository located by default (see
[below](#configuration)) in `~/.gitetc/`.
## Scripts
* `gitetc-add` track a new `/etc` file
([docs](doc/gitetc-add.md), [code](bin/gitetc-add))
* `gitetc-sync` checks `/etc` and updates all tracked files
* `gitetc-install` copies all tracked files to `/etc`
## Installing
If the target machine is running [Arch Linux](https://www.archlinux.org/),
an [AUR package](https://aur.archlinux.org/packages/gitetc/) is
available.
Otherwise, the scripts in the `bin` directory can be executed directly.
## Configuration
Change the location of the backing `git` repository via:
```sh
git config --global gitetc.path ~/projects/etcfiles
```
| Add links to docs and code to readme | Add links to docs and code to readme
| Markdown | mit | vinsonchuong/gitetc | markdown | ## Code Before:
Scripts to help you track changes to your /etc files
After changing a file in `/etc` for the first time, track it using
`gitetc-add`. After subsequent changes or system updates, run `gitetc-sync` to
check for changes tracked files.
`gitetc` maintains a `git` repository located by default (see
[below](#configuration)) in `~/.gitetc/`.
## Scripts
* `gitetc-add` track a new `/etc` file
* `gitetc-sync` checks `/etc` and updates all tracked files
* `gitetc-install` copies all tracked files to `/etc`
## Installing
If the target machine is running [Arch Linux](https://www.archlinux.org/),
an [AUR package](https://aur.archlinux.org/packages/gitetc/) is
available.
Otherwise, the scripts in the `bin` directory can be executed directly.
## Configuration
Change the location of the backing `git` repository via:
```sh
git config --global gitetc.path ~/projects/etcfiles
```
## Instruction:
Add links to docs and code to readme
## Code After:
Scripts to help you track changes to your /etc files
After changing a file in `/etc` for the first time, track it using
`gitetc-add`. After subsequent changes or system updates, run `gitetc-sync` to
check for changes tracked files.
`gitetc` maintains a `git` repository located by default (see
[below](#configuration)) in `~/.gitetc/`.
## Scripts
* `gitetc-add` track a new `/etc` file
([docs](doc/gitetc-add.md), [code](bin/gitetc-add))
* `gitetc-sync` checks `/etc` and updates all tracked files
* `gitetc-install` copies all tracked files to `/etc`
## Installing
If the target machine is running [Arch Linux](https://www.archlinux.org/),
an [AUR package](https://aur.archlinux.org/packages/gitetc/) is
available.
Otherwise, the scripts in the `bin` directory can be executed directly.
## Configuration
Change the location of the backing `git` repository via:
```sh
git config --global gitetc.path ~/projects/etcfiles
```
| Scripts to help you track changes to your /etc files
After changing a file in `/etc` for the first time, track it using
`gitetc-add`. After subsequent changes or system updates, run `gitetc-sync` to
check for changes tracked files.
`gitetc` maintains a `git` repository located by default (see
[below](#configuration)) in `~/.gitetc/`.
## Scripts
* `gitetc-add` track a new `/etc` file
+ ([docs](doc/gitetc-add.md), [code](bin/gitetc-add))
* `gitetc-sync` checks `/etc` and updates all tracked files
* `gitetc-install` copies all tracked files to `/etc`
## Installing
If the target machine is running [Arch Linux](https://www.archlinux.org/),
an [AUR package](https://aur.archlinux.org/packages/gitetc/) is
available.
Otherwise, the scripts in the `bin` directory can be executed directly.
## Configuration
Change the location of the backing `git` repository via:
```sh
git config --global gitetc.path ~/projects/etcfiles
``` | 1 | 0.038462 | 1 | 0 |
c6719476152bdf0760d13897d152dd609f50ad52 | test/BuildConfigurations/x64FreeBSDTarget.swift | test/BuildConfigurations/x64FreeBSDTarget.swift | // RUN: %swift -parse %s -verify -D FOO -D BAR -target x86_64-freebsd10 -disable-objc-interop -D FOO -parse-stdlib
// RUN: %swift-ide-test -test-input-complete -source-filename=%s -target x86_64-freebsd10
#if arch(x86_64) && os(FreeBSD) && _runtime(_Native)
class C {}
var x = C()
#endif
var y = x
| // RUN: %swift -parse %s -verify -D FOO -D BAR -target x86_64-unknown-freebsd10 -disable-objc-interop -D FOO -parse-stdlib
// RUN: %swift-ide-test -test-input-complete -source-filename=%s -target x86_64-unknown-freebsd10
#if arch(x86_64) && os(FreeBSD) && _runtime(_Native)
class C {}
var x = C()
#endif
var y = x
| Apply patch provided by gribozavr; swift -target requires a proper triple. | Apply patch provided by gribozavr; swift -target requires a proper triple.
 | Swift | apache-2.0 | djwbrown/swift,bitjammer/swift,calebd/swift,ben-ng/swift,huonw/swift,deyton/swift,allevato/swift,danielmartin/swift,cbrentharris/swift,return/swift,atrick/swift,ken0nek/swift,adrfer/swift,xwu/swift,SwiftAndroid/swift,tinysun212/swift-windows,jtbandes/swift,rudkx/swift,hughbe/swift,therealbnut/swift,harlanhaskins/swift,JaSpa/swift,practicalswift/swift,IngmarStein/swift,hooman/swift,xedin/swift,stephentyrone/swift,natecook1000/swift,aschwaighofer/swift,return/swift,atrick/swift,glessard/swift,apple/swift,kperryua/swift,ben-ng/swift,cbrentharris/swift,huonw/swift,MukeshKumarS/Swift,MukeshKumarS/Swift,austinzheng/swift,KrishMunot/swift,uasys/swift,aschwaighofer/swift,milseman/swift,tkremenek/swift,JaSpa/swift,arvedviehweger/swift,brentdax/swift,kperryua/swift,benlangmuir/swift,hughbe/swift,jtbandes/swift,rudkx/swift,tjw/swift,kusl/swift,jopamer/swift,russbishop/swift,felix91gr/swift,brentdax/swift,modocache/swift,kperryua/swift,modocache/swift,roambotics/swift,kstaring/swift,cbrentharris/swift,arvedviehweger/swift,gribozavr/swift,jtbandes/swift,huonw/swift,jmgc/swift,glessard/swift,huonw/swift,gregomni/swift,zisko/swift,jmgc/swift,hooman/swift,cbrentharris/swift,xedin/swift,rudkx/swift,tjw/swift,kperryua/swift,modocache/swift,kusl/swift,jopamer/swift,jckarter/swift,glessard/swift,bitjammer/swift,sschiau/swift,parkera/swift,aschwaighofer/swift,gottesmm/swift,austinzheng/swift,airspeedswift/swift,ken0nek/swift,jopamer/swift,felix91gr/swift,airspeedswift/swift,alblue/swift,danielmartin/swift,glessard/swift,harlanhaskins/swift,jmgc/swift,stephentyrone/swift,adrfer/swift,huonw/swift,KrishMunot/swift,kstaring/swift,kperryua/swift,alblue/swift,shajrawi/swift,arvedviehweger/swift,jckarter/swift,tjw/swift,calebd/swift,jmgc/swift,OscarSwanros/swift,swiftix/swift,devincoughlin/swift,kusl/swift,ben-ng/swift,nathawes/swift,harlanhaskins/swift,gregomni/swift,tinysun212/swift-windows,MukeshKumarS/Swift,practicalswift/swift,gmilos/swift,tardieu/swift,JaSpa/swift,benlangmuir/swift,rudkx/swift,alblue/swift,benlangmuir/swift,sschiau/swift,jtbandes/swift,swiftix/swift,SwiftAndroid/swift,kstaring/swift,danielmartin/swift,IngmarStein/swift,gmilos/swift,manavgabhawala/swift,devincoughlin/swift,nathawes/swift,hughbe/swift,gottesmm/swift,aschwaighofer/swift,natecook1000/swift,xedin/swift,dduan/swift,ahoppen/swift,zisko/swift,frootloops/swift,deyton/swift,russbishop/swift,devincoughlin/swift,manavgabhawala/swift,manavgabhawala/swift,milseman/swift,gribozavr/swift,roambotics/swift,rudkx/swift,bitjammer/swift,sschiau/swift,tjw/swift,shahmishal/swift,Jnosh/swift,modocache/swift,xwu/swift,OscarSwanros/swift,manavgabhawala/swift,jckarter/swift,lorentey/swift,codestergit/swift,jopamer/swift,jckarter/swift,KrishMunot/swift,frootloops/swift,amraboelela/swift,Jnosh/swift,tjw/swift,gottesmm/swift,nathawes/swift,shajrawi/swift,return/swift,atrick/swift,dreamsxin/swift,russbishop/swift,johnno1962d/swift,therealbnut/swift,stephentyrone/swift,johnno1962d/swift,stephentyrone/swift,gmilos/swift,ahoppen/swift,djwbrown/swift,gmilos/swift,kusl/swift,jopamer/swift,jckarter/swift,glessard/swift,bitjammer/swift,ahoppen/swift,milseman/swift,IngmarStein/swift,frootloops/swift,karwa/swift,return/swift,russbishop/swift,codestergit/swift,xedin/swift,benlangmuir/swift,tjw/swift,codestergit/swift,roambotics/swift,deyton/swift,gregomni/swift,slavapestov/swift,cbrentharris/swift,shajrawi/swift,tkremenek/swift,frootloops/swift,IngmarStein/swift,ahoppen/swift,SwiftAndroid/swift,xwu/swift,gottesmm/swift,alblue/swift,kperryua/swift,harlanhaskins/swift,SwiftAndroid/swift,tjw/swift,lorentey/swift,zisko/swift,OscarSwanros/swift,Jnosh/swift,tkremenek/swift,russbishop/swift,bitjammer/swift,tardieu/swift,JGiola/swift,MukeshKumarS/Swift,johnno1962d/swift,shahmishal/swift,CodaFi/swift,airspeedswift/swift,johnno1962d/swift,dduan/swift,nathawes/swift,lorentey/swift,austinzheng/swift,modocache/swift,shahmishal/swift,parkera/swift,modocache/swift,manavgabhawala/swift,tkremenek/swift,kstaring/swift,calebd/swift,xwu/swift,tkremenek/swift,sschiau/swift,return/swift,SwiftAndroid/swift,shahmishal/swift,gregomni/swift,slavapestov/swift,zisko/swift,austinzheng/swift,gregomni/swift,bitjammer/swift,benlangmuir/swift,kusl/swift,tardieu/swift,jckarter/swift,kstaring/swift,jckarter/swift,devincoughlin/swift,amraboelela/swift,uasys/swift,djwbrown/swift,harlanhaskins/swift,manavgabhawala/swift,ben-ng/swift,roambotics/swift,kstaring/swift,tardieu/swift,brentdax/swift,dduan/swift,johnno1962d/swift,natecook1000/swift,frootloops/swift,codestergit/swift,kstaring/swift,devincoughlin/swift,karwa/swift,kperryua/swift,JaSpa/swift,hooman/swift,practicalswift/swift,apple/swift,adrfer/swift,lorentey/swift,djwbrown/swift,allevato/swift,zisko/swift,hooman/swift,karwa/swift,lorentey/swift,glessard/swift,amraboelela/swift,allevato/swift,practicalswift/swift,gmilos/swift,slavapestov/swift,karwa/swift,milseman/swift,deyton/swift,JGiola/swift,johnno1962d/swift,glessard/swift,deyton/swift,aschwaighofer/swift,CodaFi/swift,allevato/swift,allevato/swift,lorentey/swift,KrishMunot/swift,felix91gr/swift,atrick/swift,tardieu/swift,shahmishal/swift,jmgc/swift,hughbe/swift,bitjammer/swift,kusl/swift,hooman/swift,ken0nek/swift,sschiau/swift,parkera/swift,roambotics/swift,hooman/swift,karwa/swift,hughbe/swift,modocache/swift,kusl/swift,JaSpa/swift,gottesmm/swift,MukeshKumarS/Swift,uasys/swift,dduan/swift,parkera/swift,OscarSwanros/swift,airspeedswift/swift,bitjammer/swift,djwbrown/swift,parkera/swift,Jnosh/swift,zisko/swift,natecook1000/swift,sschiau/swift,calebd/swift,milseman/swift,JaSpa/swift,amraboelela/swift,johnno1962d/swift,aschwaighofer/swift,jopamer/swift,CodaFi/swift,kperryua/swift,CodaFi/swift,tinysun212/swift-windows,brentdax/swift,therealbnut/swift,MukeshKumarS/Swift,brentdax/swift,kusl/swift,danielmartin/swift,gmilos/swift,ben-ng/swift,codestergit/swift,swiftix/swift,karwa/swift,gottesmm/swift,apple/swift,tkremenek/swift,JGiola/swift,gribozavr/swift,Jnosh/swift,manavgabhawala/swift,devincoughlin/swift,natecook1000/swift,stephentyrone/swift,OscarSwanros/swift,tinysun212/swift-windows,apple/swift,JaSpa/swift,frootloops/swift,ken0nek/swift,tinysun212/swift-windows,ben-ng/swift,sschiau/swift,jopamer/swift,return/swift,natecook1000/swift,jmgc/swift,arvedviehweger/swift,adrfer/swift,xwu/swift,swiftix/swift,atrick/swift,rudkx/swift,danielmartin/swift,therealbnut/swift,CodaFi/swift,airspeedswift/swift,uasys/swift,sschiau/swift,dduan/swift,harlanhaskins/swift,practicalswift/swift,uasys/swift,dreamsxin/swift,devincoughlin/swift,danielmartin/swift,IngmarStein/swift,deyton/swift,stephentyrone/swift,OscarSwanros/swift,IngmarStein/swift,aschwaighofer/swift,KrishMunot/swift,adrfer/swift,cbrentharris/swift,parkera/swift,milseman/swift,brentdax/swift,parkera/swift,karwa/swift,return/swift,allevato/swift,gribozavr/swift,CodaFi/swift,felix91gr/swift,austinzheng/swift,JGiola/swift,austinzheng/swift,SwiftAndroid/swift,xedin/swift,shahmishal/swift,jopamer/swift,tinysun212/swift-windows,gottesmm/swift,swiftix/swift,djwbrown/swift,gribozavr/swift,tkremenek/swift,shajrawi/swift,adrfer/swift,natecook1000/swift,xwu/swift,huonw/swift,felix91gr/swift,allevato/swift,shajrawi/swift,jtbandes/swift,gribozavr/swift,russbishop/swift,practicalswift/swift,return/swift,dduan/swift,gribozavr/swift,gregomni/swift,tardieu/swift,tinysun212/swift-windows,austinzheng/swift,calebd/swift,danielmartin/swift,airspeedswift/swift,jtbandes/swift,swiftix/swift,JGiola/swift,xwu/swift,therealbnut/swift,ken0nek/swift,adrfer/swift,hughbe/swift,alblue/swift,hooman/swift,gmilos/swift,tardieu/swift,xedin/swift,hughbe/swift,ken0nek/swift,amraboelela/swift,cbrentharris/swift,jtbandes/swift,KrishMunot/swift,airspeedswift/swift,MukeshKumarS/Swift,KrishMunot/swift,nathawes/swift,gribozavr/swift,arvedviehweger/swift,djwbrown/swift,kusl/swift,deyton/swift,Jnosh/swift,arvedviehweger/swift,calebd/swift,nathawes/swift,stephentyrone/swift,sschiau/swift,shajrawi/swift,slavapestov/swift,huonw/swift,ahoppen/swift,parkera/swift,shahmishal/swift,calebd/swift,frootloops/swift,therealbnut/swift | swift | ## Code Before:
// RUN: %swift -parse %s -verify -D FOO -D BAR -target x86_64-freebsd10 -disable-objc-interop -D FOO -parse-stdlib
// RUN: %swift-ide-test -test-input-complete -source-filename=%s -target x86_64-freebsd10
#if arch(x86_64) && os(FreeBSD) && _runtime(_Native)
class C {}
var x = C()
#endif
var y = x
## Instruction:
Apply patch provided by gribozavr; swift -target requires a proper triple.
## Code After:
// RUN: %swift -parse %s -verify -D FOO -D BAR -target x86_64-unknown-freebsd10 -disable-objc-interop -D FOO -parse-stdlib
// RUN: %swift-ide-test -test-input-complete -source-filename=%s -target x86_64-unknown-freebsd10
#if arch(x86_64) && os(FreeBSD) && _runtime(_Native)
class C {}
var x = C()
#endif
var y = x
| - // RUN: %swift -parse %s -verify -D FOO -D BAR -target x86_64-freebsd10 -disable-objc-interop -D FOO -parse-stdlib
+ // RUN: %swift -parse %s -verify -D FOO -D BAR -target x86_64-unknown-freebsd10 -disable-objc-interop -D FOO -parse-stdlib
? ++++++++
- // RUN: %swift-ide-test -test-input-complete -source-filename=%s -target x86_64-freebsd10
+ // RUN: %swift-ide-test -test-input-complete -source-filename=%s -target x86_64-unknown-freebsd10
? ++++++++
#if arch(x86_64) && os(FreeBSD) && _runtime(_Native)
class C {}
var x = C()
#endif
var y = x | 4 | 0.5 | 2 | 2 |
1c117e6f7498464e491201ab6a843bc8318d1bb8 | config.mk | config.mk | top = `{pwd}
root = ${top}/root
#arch = i486
arch = x86_64
libcroot = ${root}/opt/cross/${arch}-linux-musl/${arch}-linux-musl
nprocs = 4
version = 0.0
mirror = http://dl.2f30.org/morpheus-pkgs/${arch}/${version}
CC = ${arch}-linux-musl-gcc
optldflags = -s -Wl,--gc-sections -Wl,-z,relro,-z,now
optcflags = -fdata-sections -ffunction-sections -Os -g0 -fno-unwind-tables -fno-asynchronous-unwind-tables -Wa,--noexecstack
CFLAGS = -I${libcroot}/include -static ${optcflags}
LDFLAGS = -L${libcroot}/lib -static ${optldflags}
PATH = ${root}/opt/cross/${arch}-linux-musl/bin:${PATH}
| top = `{pwd}
root = ${top}/root
#arch = i486
arch = x86_64
libcroot = ${root}/opt/cross/${arch}-linux-musl/${arch}-linux-musl
nprocs = 4
version = 0.0
mirror = http://dl.2f30.org/morpheus-pkgs/${arch}/${version}
CC = ${arch}-linux-musl-gcc
# these don't work for all packages yet...
optldflags = -s -Wl,--gc-sections -Wl,-z,relro,-z,now
optcflags = -fdata-sections -ffunction-sections -Os -g0 -fno-unwind-tables -fno-asynchronous-unwind-tables -Wa,--noexecstack
CFLAGS = -I${libcroot}/include -static # ${optcflags}
LDFLAGS = -L${libcroot}/lib -static # ${optldflags}
PATH = ${root}/opt/cross/${arch}-linux-musl/bin:${PATH}
| Comment out optcflags and optldflags | Comment out optcflags and optldflags
These mostly work but not for all packages. ncurses and/or libevent
seem to have trouble with them.
| Makefile | mit | henrysher/morpheus,minos-org/morpheus-base | makefile | ## Code Before:
top = `{pwd}
root = ${top}/root
#arch = i486
arch = x86_64
libcroot = ${root}/opt/cross/${arch}-linux-musl/${arch}-linux-musl
nprocs = 4
version = 0.0
mirror = http://dl.2f30.org/morpheus-pkgs/${arch}/${version}
CC = ${arch}-linux-musl-gcc
optldflags = -s -Wl,--gc-sections -Wl,-z,relro,-z,now
optcflags = -fdata-sections -ffunction-sections -Os -g0 -fno-unwind-tables -fno-asynchronous-unwind-tables -Wa,--noexecstack
CFLAGS = -I${libcroot}/include -static ${optcflags}
LDFLAGS = -L${libcroot}/lib -static ${optldflags}
PATH = ${root}/opt/cross/${arch}-linux-musl/bin:${PATH}
## Instruction:
Comment out optcflags and optldflags
These mostly work but not for all packages. ncurses and/or libevent
seem to have trouble with them.
## Code After:
top = `{pwd}
root = ${top}/root
#arch = i486
arch = x86_64
libcroot = ${root}/opt/cross/${arch}-linux-musl/${arch}-linux-musl
nprocs = 4
version = 0.0
mirror = http://dl.2f30.org/morpheus-pkgs/${arch}/${version}
CC = ${arch}-linux-musl-gcc
# these don't work for all packages yet...
optldflags = -s -Wl,--gc-sections -Wl,-z,relro,-z,now
optcflags = -fdata-sections -ffunction-sections -Os -g0 -fno-unwind-tables -fno-asynchronous-unwind-tables -Wa,--noexecstack
CFLAGS = -I${libcroot}/include -static # ${optcflags}
LDFLAGS = -L${libcroot}/lib -static # ${optldflags}
PATH = ${root}/opt/cross/${arch}-linux-musl/bin:${PATH}
| top = `{pwd}
root = ${top}/root
#arch = i486
arch = x86_64
libcroot = ${root}/opt/cross/${arch}-linux-musl/${arch}-linux-musl
nprocs = 4
version = 0.0
mirror = http://dl.2f30.org/morpheus-pkgs/${arch}/${version}
CC = ${arch}-linux-musl-gcc
+ # these don't work for all packages yet...
optldflags = -s -Wl,--gc-sections -Wl,-z,relro,-z,now
optcflags = -fdata-sections -ffunction-sections -Os -g0 -fno-unwind-tables -fno-asynchronous-unwind-tables -Wa,--noexecstack
- CFLAGS = -I${libcroot}/include -static ${optcflags}
+ CFLAGS = -I${libcroot}/include -static # ${optcflags}
? ++
- LDFLAGS = -L${libcroot}/lib -static ${optldflags}
+ LDFLAGS = -L${libcroot}/lib -static # ${optldflags}
? ++
PATH = ${root}/opt/cross/${arch}-linux-musl/bin:${PATH} | 5 | 0.294118 | 3 | 2 |
7044dd578af99b812275c49690589f000473e77c | app/templates/components/completable-input.hbs | app/templates/components/completable-input.hbs | {{
input type='text' class=inputClassNames value=value
placeholder=placeholder enter='enterPressed'
}}
{{!
focus-in='focusIn' focus-out='focusOut'
}}
{{!#if showCompletions}}
<div {{bind-attr class=":completion-list showCompletions:show:hide" }}>
<!--<div class="completion-list">-->
{{#each candidate in potentialComplements}}
<div {{bind-attr class=":completion-candidate candidate.isActive:active"}}
{{action "selectComplement" candidate}}>
{{candidate.value}}
</div>
{{/each}}
</div>
{{!/if}}
| {{
input type='text' class=inputClassNames value=value
placeholder=placeholder enter='enterPressed' autocomplete="off"
}}
{{!
focus-in='focusIn' focus-out='focusOut'
}}
{{!#if showCompletions}}
<div {{bind-attr class=":completion-list showCompletions:show:hide" }}>
<!--<div class="completion-list">-->
{{#each candidate in potentialComplements}}
<div {{bind-attr class=":completion-candidate candidate.isActive:active"}}
{{action "selectComplement" candidate}}>
{{candidate.value}}
</div>
{{/each}}
</div>
{{!/if}}
| Disable form input auto complete. | Disable form input auto complete.
| Handlebars | mit | bitzik/ember-cli-completable-input,bitzik/ember-cli-completable-input | handlebars | ## Code Before:
{{
input type='text' class=inputClassNames value=value
placeholder=placeholder enter='enterPressed'
}}
{{!
focus-in='focusIn' focus-out='focusOut'
}}
{{!#if showCompletions}}
<div {{bind-attr class=":completion-list showCompletions:show:hide" }}>
<!--<div class="completion-list">-->
{{#each candidate in potentialComplements}}
<div {{bind-attr class=":completion-candidate candidate.isActive:active"}}
{{action "selectComplement" candidate}}>
{{candidate.value}}
</div>
{{/each}}
</div>
{{!/if}}
## Instruction:
Disable form input auto complete.
## Code After:
{{
input type='text' class=inputClassNames value=value
placeholder=placeholder enter='enterPressed' autocomplete="off"
}}
{{!
focus-in='focusIn' focus-out='focusOut'
}}
{{!#if showCompletions}}
<div {{bind-attr class=":completion-list showCompletions:show:hide" }}>
<!--<div class="completion-list">-->
{{#each candidate in potentialComplements}}
<div {{bind-attr class=":completion-candidate candidate.isActive:active"}}
{{action "selectComplement" candidate}}>
{{candidate.value}}
</div>
{{/each}}
</div>
{{!/if}}
| {{
input type='text' class=inputClassNames value=value
- placeholder=placeholder enter='enterPressed'
+ placeholder=placeholder enter='enterPressed' autocomplete="off"
? +++++++++++++++++++
}}
{{!
focus-in='focusIn' focus-out='focusOut'
}}
{{!#if showCompletions}}
<div {{bind-attr class=":completion-list showCompletions:show:hide" }}>
<!--<div class="completion-list">-->
{{#each candidate in potentialComplements}}
<div {{bind-attr class=":completion-candidate candidate.isActive:active"}}
{{action "selectComplement" candidate}}>
{{candidate.value}}
</div>
{{/each}}
</div>
{{!/if}} | 2 | 0.1 | 1 | 1 |
b271d15c32c239a4b829bb1722d53c677d2acf90 | AnvilEditor/mission_raw/anvil/spawn/fn_getRandomSpawnPosition.sqf | AnvilEditor/mission_raw/anvil/spawn/fn_getRandomSpawnPosition.sqf | /*
Author: Will Hart, based on Shuko's shkPos
Description:
Selects a random position within the objective
Parameter(s):
_this select 0: OBJECITVE, the objective to return a position for
Example:
_pos = [_obj] call AFW_fnc_getRandomSpawnPosition;
Returns:
A position at which to spawn the objective
*/
private ["_obj", "_pos", "_centre", "_rad"];
_obj = _THIS(0);
_centre = O_POS(_obj);
_rad = O_R(_obj);
_pos = [_centre, random 360, _rad, false, 1] call SHK_pos;
_pos | /*
Author: Will Hart
Description:
Finds a random spawn position suitable for the given object within the objective radius
Parameter(s):
_this select 0: ARRAY, The objective to find a spawn point for
_this select 1: STRING, The type of object to find a spawn point for
Example:
pos = [_obj, "B_Officer_F"] call AFW_fnc_getRandomSpawnPosition;
Returns:
A position to spawn the object at or the initial position if none can be found
*/
private ["_obj", "_safePos", "_radius", "_initial", "_temp", "_object", "_count"];
#include "defines.sqf"
_obj = _this select 0;
_object = _this select 1;
_radius = O_R(_obj);
_initial = O_POS(_obj);
_safePos = [];
_count = 0;
if (O_RANDOMISE(_obj)) then {
while {format ["%1", _safePos] == "[]" and _count < 10} do {
// find a random safe position
_temp = [_initial, random 360, _radius, false, 1] call SHK_pos;
_safePos = _temp findEmptyPosition [0, _radius, _object];
_count = _count + 1;
};
if (format ["%1", _safePos] == "[]") {
_safePos = _initial;
};
} else {
_safePos = _initial;
};
// return the crate handle
_safePos | Build single function for getting the respawn position | Build single function for getting the respawn position
| SQF | mit | will-hart/AnvilEditor,will-hart/AnvilEditor | sqf | ## Code Before:
/*
Author: Will Hart, based on Shuko's shkPos
Description:
Selects a random position within the objective
Parameter(s):
_this select 0: OBJECITVE, the objective to return a position for
Example:
_pos = [_obj] call AFW_fnc_getRandomSpawnPosition;
Returns:
A position at which to spawn the objective
*/
private ["_obj", "_pos", "_centre", "_rad"];
_obj = _THIS(0);
_centre = O_POS(_obj);
_rad = O_R(_obj);
_pos = [_centre, random 360, _rad, false, 1] call SHK_pos;
_pos
## Instruction:
Build single function for getting the respawn position
## Code After:
/*
Author: Will Hart
Description:
Finds a random spawn position suitable for the given object within the objective radius
Parameter(s):
_this select 0: ARRAY, The objective to find a spawn point for
_this select 1: STRING, The type of object to find a spawn point for
Example:
pos = [_obj, "B_Officer_F"] call AFW_fnc_getRandomSpawnPosition;
Returns:
A position to spawn the object at or the initial position if none can be found
*/
private ["_obj", "_safePos", "_radius", "_initial", "_temp", "_object", "_count"];
#include "defines.sqf"
_obj = _this select 0;
_object = _this select 1;
_radius = O_R(_obj);
_initial = O_POS(_obj);
_safePos = [];
_count = 0;
if (O_RANDOMISE(_obj)) then {
while {format ["%1", _safePos] == "[]" and _count < 10} do {
// find a random safe position
_temp = [_initial, random 360, _radius, false, 1] call SHK_pos;
_safePos = _temp findEmptyPosition [0, _radius, _object];
_count = _count + 1;
};
if (format ["%1", _safePos] == "[]") {
_safePos = _initial;
};
} else {
_safePos = _initial;
};
// return the crate handle
_safePos | /*
- Author: Will Hart, based on Shuko's shkPos
+ Author: Will Hart
Description:
- Selects a random position within the objective
+ Finds a random spawn position suitable for the given object within the objective radius
Parameter(s):
- _this select 0: OBJECITVE, the objective to return a position for
+ _this select 0: ARRAY, The objective to find a spawn point for
+ _this select 1: STRING, The type of object to find a spawn point for
Example:
- _pos = [_obj] call AFW_fnc_getRandomSpawnPosition;
? -
+ pos = [_obj, "B_Officer_F"] call AFW_fnc_getRandomSpawnPosition;
? +++++++++++++++
Returns:
- A position at which to spawn the objective
+ A position to spawn the object at or the initial position if none can be found
*/
- private ["_obj", "_pos", "_centre", "_rad"];
+ private ["_obj", "_safePos", "_radius", "_initial", "_temp", "_object", "_count"];
+ #include "defines.sqf"
- _obj = _THIS(0);
- _centre = O_POS(_obj);
- _rad = O_R(_obj);
+ _obj = _this select 0;
+ _object = _this select 1;
+
+ _radius = O_R(_obj);
+ _initial = O_POS(_obj);
+ _safePos = [];
+ _count = 0;
+
+ if (O_RANDOMISE(_obj)) then {
+
+ while {format ["%1", _safePos] == "[]" and _count < 10} do {
+ // find a random safe position
- _pos = [_centre, random 360, _rad, false, 1] call SHK_pos;
? -- ^^ ^^
+ _temp = [_initial, random 360, _radius, false, 1] call SHK_pos;
? ++++++++ +++ ^ + ^^^ +++
- _pos
+ _safePos = _temp findEmptyPosition [0, _radius, _object];
+ _count = _count + 1;
+ };
+
+ if (format ["%1", _safePos] == "[]") {
+ _safePos = _initial;
+ };
+ } else {
+ _safePos = _initial;
+ };
+
+ // return the crate handle
+ _safePos | 45 | 1.875 | 34 | 11 |
3bd5b9d46192369ff83926fbbac3adbf206db93f | lib/Driver/DriverOptions.cpp | lib/Driver/DriverOptions.cpp | //===--- DriverOptions.cpp - Driver Options Table -------------------------===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
#include "clang/Driver/Options.h"
#include "llvm/ADT/STLExtras.h"
#include "llvm/Option/OptTable.h"
#include "llvm/Option/Option.h"
using namespace clang::driver;
using namespace clang::driver::options;
using namespace llvm::opt;
#define PREFIX(NAME, VALUE) static const char *const NAME[] = VALUE;
#include "clang/Driver/Options.inc"
#undef PREFIX
static const OptTable::Info InfoTable[] = {
#define OPTION(PREFIX, NAME, ID, KIND, GROUP, ALIAS, ALIASARGS, FLAGS, PARAM, \
HELPTEXT, METAVAR) \
{ PREFIX, NAME, HELPTEXT, METAVAR, OPT_##ID, Option::KIND##Class, PARAM, \
FLAGS, OPT_##GROUP, OPT_##ALIAS, ALIASARGS },
#include "clang/Driver/Options.inc"
#undef OPTION
};
namespace {
class DriverOptTable : public OptTable {
public:
DriverOptTable()
: OptTable(InfoTable, llvm::array_lengthof(InfoTable)) {}
};
}
OptTable *clang::driver::createDriverOptTable() {
return new DriverOptTable();
}
| //===--- DriverOptions.cpp - Driver Options Table -------------------------===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
#include "clang/Driver/Options.h"
#include "llvm/ADT/STLExtras.h"
#include "llvm/Option/OptTable.h"
#include "llvm/Option/Option.h"
using namespace clang::driver;
using namespace clang::driver::options;
using namespace llvm::opt;
#define PREFIX(NAME, VALUE) static const char *const NAME[] = VALUE;
#include "clang/Driver/Options.inc"
#undef PREFIX
static const OptTable::Info InfoTable[] = {
#define OPTION(PREFIX, NAME, ID, KIND, GROUP, ALIAS, ALIASARGS, FLAGS, PARAM, \
HELPTEXT, METAVAR) \
{ PREFIX, NAME, HELPTEXT, METAVAR, OPT_##ID, Option::KIND##Class, PARAM, \
FLAGS, OPT_##GROUP, OPT_##ALIAS, ALIASARGS },
#include "clang/Driver/Options.inc"
#undef OPTION
};
namespace {
class DriverOptTable : public OptTable {
public:
DriverOptTable()
: OptTable(InfoTable) {}
};
}
OptTable *clang::driver::createDriverOptTable() {
return new DriverOptTable();
}
| Update clang to match llvm r250901. OptTable constructor now takes an ArrayRef. NFC | Update clang to match llvm r250901. OptTable constructor now takes an ArrayRef. NFC
git-svn-id: ffe668792ed300d6c2daa1f6eba2e0aa28d7ec6c@250903 91177308-0d34-0410-b5e6-96231b3b80d8
| C++ | apache-2.0 | apple/swift-clang,apple/swift-clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,llvm-mirror/clang,apple/swift-clang,apple/swift-clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,apple/swift-clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,llvm-mirror/clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang | c++ | ## Code Before:
//===--- DriverOptions.cpp - Driver Options Table -------------------------===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
#include "clang/Driver/Options.h"
#include "llvm/ADT/STLExtras.h"
#include "llvm/Option/OptTable.h"
#include "llvm/Option/Option.h"
using namespace clang::driver;
using namespace clang::driver::options;
using namespace llvm::opt;
#define PREFIX(NAME, VALUE) static const char *const NAME[] = VALUE;
#include "clang/Driver/Options.inc"
#undef PREFIX
static const OptTable::Info InfoTable[] = {
#define OPTION(PREFIX, NAME, ID, KIND, GROUP, ALIAS, ALIASARGS, FLAGS, PARAM, \
HELPTEXT, METAVAR) \
{ PREFIX, NAME, HELPTEXT, METAVAR, OPT_##ID, Option::KIND##Class, PARAM, \
FLAGS, OPT_##GROUP, OPT_##ALIAS, ALIASARGS },
#include "clang/Driver/Options.inc"
#undef OPTION
};
namespace {
class DriverOptTable : public OptTable {
public:
DriverOptTable()
: OptTable(InfoTable, llvm::array_lengthof(InfoTable)) {}
};
}
OptTable *clang::driver::createDriverOptTable() {
return new DriverOptTable();
}
## Instruction:
Update clang to match llvm r250901. OptTable constructor now takes an ArrayRef. NFC
git-svn-id: ffe668792ed300d6c2daa1f6eba2e0aa28d7ec6c@250903 91177308-0d34-0410-b5e6-96231b3b80d8
## Code After:
//===--- DriverOptions.cpp - Driver Options Table -------------------------===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
#include "clang/Driver/Options.h"
#include "llvm/ADT/STLExtras.h"
#include "llvm/Option/OptTable.h"
#include "llvm/Option/Option.h"
using namespace clang::driver;
using namespace clang::driver::options;
using namespace llvm::opt;
#define PREFIX(NAME, VALUE) static const char *const NAME[] = VALUE;
#include "clang/Driver/Options.inc"
#undef PREFIX
static const OptTable::Info InfoTable[] = {
#define OPTION(PREFIX, NAME, ID, KIND, GROUP, ALIAS, ALIASARGS, FLAGS, PARAM, \
HELPTEXT, METAVAR) \
{ PREFIX, NAME, HELPTEXT, METAVAR, OPT_##ID, Option::KIND##Class, PARAM, \
FLAGS, OPT_##GROUP, OPT_##ALIAS, ALIASARGS },
#include "clang/Driver/Options.inc"
#undef OPTION
};
namespace {
class DriverOptTable : public OptTable {
public:
DriverOptTable()
: OptTable(InfoTable) {}
};
}
OptTable *clang::driver::createDriverOptTable() {
return new DriverOptTable();
}
| //===--- DriverOptions.cpp - Driver Options Table -------------------------===//
//
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
//
//===----------------------------------------------------------------------===//
#include "clang/Driver/Options.h"
#include "llvm/ADT/STLExtras.h"
#include "llvm/Option/OptTable.h"
#include "llvm/Option/Option.h"
using namespace clang::driver;
using namespace clang::driver::options;
using namespace llvm::opt;
#define PREFIX(NAME, VALUE) static const char *const NAME[] = VALUE;
#include "clang/Driver/Options.inc"
#undef PREFIX
static const OptTable::Info InfoTable[] = {
#define OPTION(PREFIX, NAME, ID, KIND, GROUP, ALIAS, ALIASARGS, FLAGS, PARAM, \
HELPTEXT, METAVAR) \
{ PREFIX, NAME, HELPTEXT, METAVAR, OPT_##ID, Option::KIND##Class, PARAM, \
FLAGS, OPT_##GROUP, OPT_##ALIAS, ALIASARGS },
#include "clang/Driver/Options.inc"
#undef OPTION
};
namespace {
class DriverOptTable : public OptTable {
public:
DriverOptTable()
- : OptTable(InfoTable, llvm::array_lengthof(InfoTable)) {}
+ : OptTable(InfoTable) {}
};
}
OptTable *clang::driver::createDriverOptTable() {
return new DriverOptTable();
} | 2 | 0.045455 | 1 | 1 |
a43c6b2619605006b0555fc81b6711265d182693 | Formula/noweb.rb | Formula/noweb.rb | require 'formula'
class Noweb < Formula
desc "WEB-like literate-programming tool"
homepage 'http://www.cs.tufts.edu/~nr/noweb/'
url 'ftp://www.eecs.harvard.edu/pub/nr/noweb.tgz'
version '2.11b'
sha1 '3b391c42f46dcb8a002b863fb2e483560a7da51d'
depends_on 'icon'
def install
cd "src" do
system "bash", "awkname", "awk"
system "make LIBSRC=icon ICONC=icont CFLAGS='-U_POSIX_C_SOURCE -D_POSIX_C_SOURCE=1'"
if which 'kpsewhich'
ohai 'TeX installation found. Installing TeX support files there might fail if your user does not have permission'
texmf = Pathname.new(`kpsewhich -var-value=TEXMFLOCAL`.chomp)
else
ohai 'No TeX installation found. Installing TeX support files in the noweb Cellar.'
texmf = prefix
end
bin.mkpath
lib.mkpath
man.mkpath
(texmf/'tex/generic/noweb').mkpath
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
"TEXINPUTS=#{texmf}/tex/generic/noweb"
cd "icon" do
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
"TEXINPUTS=#{texmf}/tex/generic/noweb"
end
end
end
end
| require 'formula'
class Noweb < Formula
desc "WEB-like literate-programming tool"
homepage 'http://www.cs.tufts.edu/~nr/noweb/'
url 'ftp://www.eecs.harvard.edu/pub/nr/noweb.tgz'
version '2.11b'
sha256 'c913f26c1edb37e331c747619835b4cade000b54e459bb08f4d38899ab690d82'
depends_on 'icon'
def texpath
prefix/'tex/generic/noweb'
end
def install
cd "src" do
system "bash", "awkname", "awk"
system "make LIBSRC=icon ICONC=icont CFLAGS='-U_POSIX_C_SOURCE -D_POSIX_C_SOURCE=1'"
bin.mkpath
lib.mkpath
man.mkpath
texpath.mkpath
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
"TEXINPUTS=#{texpath}"
cd "icon" do
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
"TEXINPUTS=#{texpath}"
end
end
end
def caveats; <<-EOS.undent
TeX support files are installed in the directory:
#{texpath}
You may need to add the directory to TEXINPUTS to run noweb properly.
EOS
end
end
| Clean up TeX support installation, add caveat. | Clean up TeX support installation, add caveat.
Closes Homebrew/homebrew#42110.
Signed-off-by: Alex Dunn <46952c1342d5ce1a4f5689855eda3c9064bca8f6@gmail.com>
 | Ruby | bsd-2-clause | filcab/homebrew-core,jabenninghoff/homebrew-core,cblecker/homebrew-core,makigumo/homebrew-core,uyjulian/homebrew-core,gserra-olx/homebrew-core,phusion/homebrew-core,moderndeveloperllc/homebrew-core,mvitz/homebrew-core,j-bennet/homebrew-core,fvioz/homebrew-core,mfikes/homebrew-core,zyedidia/homebrew-core,kuahyeow/homebrew-core,chrisfinazzo/homebrew-core,bcg62/homebrew-core,mahori/homebrew-core,wolffaxn/homebrew-core,j-bennet/homebrew-core,jvehent/homebrew-core,jvillard/homebrew-core,ylluminarious/homebrew-core,smessmer/homebrew-core,Homebrew/homebrew-core,uyjulian/homebrew-core,reelsense/homebrew-core,mvitz/homebrew-core,jvillard/homebrew-core,mvbattista/homebrew-core,rwhogg/homebrew-core,grhawk/homebrew-core,zyedidia/homebrew-core,Moisan/homebrew-core,LinuxbrewTestBot/homebrew-core,forevergenin/homebrew-core,passerbyid/homebrew-core,spaam/homebrew-core,edporras/homebrew-core,passerbyid/homebrew-core,bigbes/homebrew-core,FinalDes/homebrew-core,battlemidget/homebrew-core,dsXLII/homebrew-core,ShivaHuang/homebrew-core,fvioz/homebrew-core,sjackman/homebrew-core,wolffaxn/homebrew-core,bfontaine/homebrew-core,bigbes/homebrew-core,aren55555/homebrew-core,kunickiaj/homebrew-core,adamliter/homebrew-core,lasote/homebrew-core,ilovezfs/homebrew-core,rwhogg/homebrew-core,kunickiaj/homebrew-core,lembacon/homebrew-core,nbari/homebrew-core,nandub/homebrew-core,jabenninghoff/homebrew-core,mvbattista/homebrew-core,LinuxbrewTestBot/homebrew-core,jvehent/homebrew-core,jvillard/homebrew-core,lemzwerg/homebrew-core,forevergenin/homebrew-core,bcg62/homebrew-core,tkoenig/homebrew-core,andyjeffries/homebrew-core,jdubois/homebrew-core,andyjeffries/homebrew-core,chrisfinazzo/homebrew-core,aren55555/homebrew-core,jvillard/homebrew-core,aabdnn/homebrew-core,robohack/homebrew-core,git-lfs/homebrew-core,Linuxbrew/homebrew-core,dsXLII/homebrew-core,spaam/homebrew-core,phusion/homebrew-core,straxhaber/homebrew-core,DomT4/homebrew-core,BrewTestBot/homebrew-core,nbari/homebrew-core,tkoenig/homebrew-core,adam-moss/homebrew-core,lasote/homebrew-core,smessmer/homebrew-core,stgraber/homebrew-core,git-lfs/homebrew-core,straxhaber/homebrew-core,jdubois/homebrew-core,battlemidget/homebrew-core,JCount/homebrew-core,Linuxbrew/homebrew-core,sdorra/homebrew-core,robohack/homebrew-core,sjackman/homebrew-core,filcab/homebrew-core,mbcoguno/homebrew-core,maxim-belkin/homebrew-core,aabdnn/homebrew-core,edporras/homebrew-core,lembacon/homebrew-core,mbcoguno/homebrew-core,bfontaine/homebrew-core,ylluminarious/homebrew-core,FinalDes/homebrew-core,Moisan/homebrew-core,reelsense/homebrew-core,sdorra/homebrew-core,mahori/homebrew-core,BrewTestBot/homebrew-core,moderndeveloperllc/homebrew-core,cblecker/homebrew-core,jdubois/homebrew-core,Homebrew/homebrew-core,nandub/homebrew-core,adam-moss/homebrew-core,grhawk/homebrew-core,gserra-olx/homebrew-core,DomT4/homebrew-core,makigumo/homebrew-core,maxim-belkin/homebrew-core,mfikes/homebrew-core,zmwangx/homebrew-core,wolffaxn/homebrew-core,ilovezfs/homebrew-core,kuahyeow/homebrew-core,lemzwerg/homebrew-core,adamliter/homebrew-core,mbcoguno/homebrew-core,lembacon/homebrew-core,ShivaHuang/homebrew-core,zmwangx/homebrew-core,stgraber/homebrew-core,bcg62/homebrew-core,JCount/homebrew-core | ruby | ## Code Before:
require 'formula'
class Noweb < Formula
desc "WEB-like literate-programming tool"
homepage 'http://www.cs.tufts.edu/~nr/noweb/'
url 'ftp://www.eecs.harvard.edu/pub/nr/noweb.tgz'
version '2.11b'
sha1 '3b391c42f46dcb8a002b863fb2e483560a7da51d'
depends_on 'icon'
def install
cd "src" do
system "bash", "awkname", "awk"
system "make LIBSRC=icon ICONC=icont CFLAGS='-U_POSIX_C_SOURCE -D_POSIX_C_SOURCE=1'"
if which 'kpsewhich'
ohai 'TeX installation found. Installing TeX support files there might fail if your user does not have permission'
texmf = Pathname.new(`kpsewhich -var-value=TEXMFLOCAL`.chomp)
else
ohai 'No TeX installation found. Installing TeX support files in the noweb Cellar.'
texmf = prefix
end
bin.mkpath
lib.mkpath
man.mkpath
(texmf/'tex/generic/noweb').mkpath
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
"TEXINPUTS=#{texmf}/tex/generic/noweb"
cd "icon" do
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
"TEXINPUTS=#{texmf}/tex/generic/noweb"
end
end
end
end
## Instruction:
Clean up TeX support installation, add caveat.
Closes Homebrew/homebrew#42110.
Signed-off-by: Alex Dunn <46952c1342d5ce1a4f5689855eda3c9064bca8f6@gmail.com>
## Code After:
require 'formula'
class Noweb < Formula
desc "WEB-like literate-programming tool"
homepage 'http://www.cs.tufts.edu/~nr/noweb/'
url 'ftp://www.eecs.harvard.edu/pub/nr/noweb.tgz'
version '2.11b'
sha256 'c913f26c1edb37e331c747619835b4cade000b54e459bb08f4d38899ab690d82'
depends_on 'icon'
def texpath
prefix/'tex/generic/noweb'
end
def install
cd "src" do
system "bash", "awkname", "awk"
system "make LIBSRC=icon ICONC=icont CFLAGS='-U_POSIX_C_SOURCE -D_POSIX_C_SOURCE=1'"
bin.mkpath
lib.mkpath
man.mkpath
texpath.mkpath
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
"TEXINPUTS=#{texpath}"
cd "icon" do
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
"TEXINPUTS=#{texpath}"
end
end
end
def caveats; <<-EOS.undent
TeX support files are installed in the directory:
#{texpath}
You may need to add the directory to TEXINPUTS to run noweb properly.
EOS
end
end
| require 'formula'
class Noweb < Formula
desc "WEB-like literate-programming tool"
homepage 'http://www.cs.tufts.edu/~nr/noweb/'
url 'ftp://www.eecs.harvard.edu/pub/nr/noweb.tgz'
version '2.11b'
- sha1 '3b391c42f46dcb8a002b863fb2e483560a7da51d'
+ sha256 'c913f26c1edb37e331c747619835b4cade000b54e459bb08f4d38899ab690d82'
depends_on 'icon'
+
+ def texpath
+ prefix/'tex/generic/noweb'
+ end
def install
cd "src" do
system "bash", "awkname", "awk"
system "make LIBSRC=icon ICONC=icont CFLAGS='-U_POSIX_C_SOURCE -D_POSIX_C_SOURCE=1'"
- if which 'kpsewhich'
- ohai 'TeX installation found. Installing TeX support files there might fail if your user does not have permission'
- texmf = Pathname.new(`kpsewhich -var-value=TEXMFLOCAL`.chomp)
- else
- ohai 'No TeX installation found. Installing TeX support files in the noweb Cellar.'
- texmf = prefix
- end
-
bin.mkpath
lib.mkpath
man.mkpath
- (texmf/'tex/generic/noweb').mkpath
+ texpath.mkpath
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
- "TEXINPUTS=#{texmf}/tex/generic/noweb"
? ^^ ------------------
+ "TEXINPUTS=#{texpath}"
? ^^^^
cd "icon" do
system "make", "install", "BIN=#{bin}",
"LIB=#{lib}",
"MAN=#{man}",
- "TEXINPUTS=#{texmf}/tex/generic/noweb"
? ^^ ------------------
+ "TEXINPUTS=#{texpath}"
? ^^^^
end
end
end
+
+ def caveats; <<-EOS.undent
+ TeX support files are installed in the directory:
+
+ #{texpath}
+
+ You may need to add the directory to TEXINPUTS to run noweb properly.
+ EOS
+ end
end | 29 | 0.690476 | 17 | 12 |
5c3871f15764ea7e8db3a02ae03b5ef5de03a44d | test/server_test.js | test/server_test.js | var chai = require('chai');
var chaihttp = require('chai-http');
chai.use(chaihttp);
var expect = chai.expect;
var fs = require('fs');
var server = require(__dirname + '/../index.js');
describe('our NLP sever', function() {
before(function(done) {
fs.readFile(__dirname + '/../public/index.html', function(err, data) {
if(err) throw err;
this.html = data.toString();
done();
}.bind(this));
});
it('should respond to a get request with our index page', function(done) {
chai.request('localhost:3000')
.get('/')
.end(function(err, res) {
expect(err).to.eql(null);
expect(res.status).to.eql(200);
expect(res.text).to.eql(this.html);
done();
}.bind(this));
});
});
| var chai = require('chai');
var chaihttp = require('chai-http');
chai.use(chaihttp);
var expect = chai.expect;
var fs = require('fs');
var server = require(__dirname + '/../index.js');
describe('our NLP sever', function() {
before(function(done) {
fs.readFile(__dirname + '/../public/index.html', function(err, data) {
if(err) throw err;
this.html = data.toString();
done();
}.bind(this));
});
it('should respond to a GET request with our index page', function(done) {
chai.request('localhost:3000')
.get('/')
.end(function(err, res) {
expect(err).to.eql(null);
expect(res.status).to.eql(200);
expect(res.text).to.eql(this.html);
done();
}.bind(this));
});
it('should respond to a POST request to the process URL', function(done){
chai.request('localhost:3000')
.post('/process')
.type('form')
.send('text=You talking to me?')
.end(function(err, res){
expect(err).to.eql(null);
expect(res.text).to.eql('{"you":1,"talking":1,"to":1,"me":1}')
done();
})
})
});
| Add test for POST request | Add test for POST request
| JavaScript | mit | timcmiller/nlp_project,timcmiller/nlp_project | javascript | ## Code Before:
var chai = require('chai');
var chaihttp = require('chai-http');
chai.use(chaihttp);
var expect = chai.expect;
var fs = require('fs');
var server = require(__dirname + '/../index.js');
describe('our NLP sever', function() {
before(function(done) {
fs.readFile(__dirname + '/../public/index.html', function(err, data) {
if(err) throw err;
this.html = data.toString();
done();
}.bind(this));
});
it('should respond to a get request with our index page', function(done) {
chai.request('localhost:3000')
.get('/')
.end(function(err, res) {
expect(err).to.eql(null);
expect(res.status).to.eql(200);
expect(res.text).to.eql(this.html);
done();
}.bind(this));
});
});
## Instruction:
Add test for POST request
## Code After:
var chai = require('chai');
var chaihttp = require('chai-http');
chai.use(chaihttp);
var expect = chai.expect;
var fs = require('fs');
var server = require(__dirname + '/../index.js');
describe('our NLP sever', function() {
before(function(done) {
fs.readFile(__dirname + '/../public/index.html', function(err, data) {
if(err) throw err;
this.html = data.toString();
done();
}.bind(this));
});
it('should respond to a GET request with our index page', function(done) {
chai.request('localhost:3000')
.get('/')
.end(function(err, res) {
expect(err).to.eql(null);
expect(res.status).to.eql(200);
expect(res.text).to.eql(this.html);
done();
}.bind(this));
});
it('should respond to a POST request to the process URL', function(done){
chai.request('localhost:3000')
.post('/process')
.type('form')
.send('text=You talking to me?')
.end(function(err, res){
expect(err).to.eql(null);
expect(res.text).to.eql('{"you":1,"talking":1,"to":1,"me":1}')
done();
})
})
});
| var chai = require('chai');
var chaihttp = require('chai-http');
chai.use(chaihttp);
var expect = chai.expect;
var fs = require('fs');
var server = require(__dirname + '/../index.js');
describe('our NLP sever', function() {
before(function(done) {
fs.readFile(__dirname + '/../public/index.html', function(err, data) {
if(err) throw err;
this.html = data.toString();
done();
}.bind(this));
});
- it('should respond to a get request with our index page', function(done) {
? ^^^
+ it('should respond to a GET request with our index page', function(done) {
? ^^^
chai.request('localhost:3000')
.get('/')
.end(function(err, res) {
expect(err).to.eql(null);
expect(res.status).to.eql(200);
expect(res.text).to.eql(this.html);
done();
}.bind(this));
});
+ it('should respond to a POST request to the process URL', function(done){
+ chai.request('localhost:3000')
+ .post('/process')
+ .type('form')
+ .send('text=You talking to me?')
+ .end(function(err, res){
+ expect(err).to.eql(null);
+ expect(res.text).to.eql('{"you":1,"talking":1,"to":1,"me":1}')
+ done();
+ })
+ })
}); | 13 | 0.419355 | 12 | 1 |
05f77f064e6fd900c8ae480dd417f6c2c8a770f0 | app/views/messages/index.html.erb | app/views/messages/index.html.erb | <div id="status">
</div>
<div id="chat-window">
<ul id="messages">
</ul>
</div>
<script type="text/javascript" charset="utf-8">
var pusher = new Pusher('545220afdca7480dd648',
{encrypted: true});
//note that 123456 below needs to be replaced with the requestid
var channel = pusher.subscribe('private-123');
channel.bind('new_message', function(data) {
$('#messages').append('<li>' + data.message + '</li>');
});
pusher.connection.bind('connecting', function() {
$('div#status').text("Connecting to chat system...");
});
pusher.connection.bind('connected', function() {
$('div#status').text("Connected to chat system!");
});
// NOTE: We should act functionality so that the div#status div slowly changes
// and is removed after a period of time (3 seconds?);
// NOTE: More more user-friendly message should appear here.
channel.bind('subscription_error', function() {
$('div#status').text("Not authorized to subscribe to channel.");
});
</script>
| <div id="status">
</div>
<div id="chat-window">
<ul id="messages">
<% @messages.each do |message| %>
<li data-id="<%= message.id %>"><%= message.content %></li>
<% end %>
</ul>
</div>
<div id="message-input">
<form action="/requests/1/messages" method="post">
<div>
<label for="message">Message: </label>
<input type="text" name="message[content]" id="message">
</div>
</form>
</div>
<script type="text/javascript" charset="utf-8">
var pusher = new Pusher('545220afdca7480dd648',
{encrypted: true});
//note that 123456 below needs to be replaced with the requestid
var channel = pusher.subscribe('private-<%= @request.id %>');
channel.bind('new_message', function(data) {
$('#messages').append('<li>' + data.content + '</li>');
});
pusher.connection.bind('connecting', function() {
$('div#status').text("Connecting to chat system...");
});
pusher.connection.bind('connected', function() {
$('div#status').text("Connected to chat system!");
});
// NOTE: We should act functionality so that the div#status div slowly changes
// and is removed after a period of time (3 seconds?);
// NOTE: More more user-friendly message should appear here.
channel.bind('subscription_error', function() {
$('div#status').text("Not authorized to subscribe to channel.");
});
</script>
| Add text area for user to input message. | Add text area for user to input message.
NOTE: The action for the form has been hard-coded. This is only for
demonstration purposes.
| HTML+ERB | mit | kimchanyoung/nghborly,nyc-mud-turtles-2015/nghborly,nyc-mud-turtles-2015/nghborly,kimchanyoung/nghborly,kimchanyoung/nghborly,nyc-mud-turtles-2015/nghborly | html+erb | ## Code Before:
<div id="status">
</div>
<div id="chat-window">
<ul id="messages">
</ul>
</div>
<script type="text/javascript" charset="utf-8">
var pusher = new Pusher('545220afdca7480dd648',
{encrypted: true});
//note that 123456 below needs to be replaced with the requestid
var channel = pusher.subscribe('private-123');
channel.bind('new_message', function(data) {
$('#messages').append('<li>' + data.message + '</li>');
});
pusher.connection.bind('connecting', function() {
$('div#status').text("Connecting to chat system...");
});
pusher.connection.bind('connected', function() {
$('div#status').text("Connected to chat system!");
});
// NOTE: We should act functionality so that the div#status div slowly changes
// and is removed after a period of time (3 seconds?);
// NOTE: More more user-friendly message should appear here.
channel.bind('subscription_error', function() {
$('div#status').text("Not authorized to subscribe to channel.");
});
</script>
## Instruction:
Add text area for user to input message.
NOTE: The action for the form has been hard-coded. This is only for
demonstration purposes.
## Code After:
<div id="status">
</div>
<div id="chat-window">
<ul id="messages">
<% @messages.each do |message| %>
<li data-id="<%= message.id %>"><%= message.content %></li>
<% end %>
</ul>
</div>
<div id="message-input">
<form action="/requests/1/messages" method="post">
<div>
<label for="message">Message: </label>
<input type="text" name="message[content]" id="message">
</div>
</form>
</div>
<script type="text/javascript" charset="utf-8">
var pusher = new Pusher('545220afdca7480dd648',
{encrypted: true});
//note that 123456 below needs to be replaced with the requestid
var channel = pusher.subscribe('private-<%= @request.id %>');
channel.bind('new_message', function(data) {
$('#messages').append('<li>' + data.content + '</li>');
});
pusher.connection.bind('connecting', function() {
$('div#status').text("Connecting to chat system...");
});
pusher.connection.bind('connected', function() {
$('div#status').text("Connected to chat system!");
});
// NOTE: We should act functionality so that the div#status div slowly changes
// and is removed after a period of time (3 seconds?);
// NOTE: More more user-friendly message should appear here.
channel.bind('subscription_error', function() {
$('div#status').text("Not authorized to subscribe to channel.");
});
</script>
| <div id="status">
</div>
<div id="chat-window">
<ul id="messages">
+ <% @messages.each do |message| %>
+ <li data-id="<%= message.id %>"><%= message.content %></li>
+ <% end %>
</ul>
+ </div>
+
+ <div id="message-input">
+ <form action="/requests/1/messages" method="post">
+ <div>
+ <label for="message">Message: </label>
+ <input type="text" name="message[content]" id="message">
+ </div>
+ </form>
</div>
<script type="text/javascript" charset="utf-8">
var pusher = new Pusher('545220afdca7480dd648',
{encrypted: true});
//note that 123456 below needs to be replaced with the requestid
- var channel = pusher.subscribe('private-123');
? ^^^
+ var channel = pusher.subscribe('private-<%= @request.id %>');
? ^^^^^^^^^^^^^^^^^^
channel.bind('new_message', function(data) {
- $('#messages').append('<li>' + data.message + '</li>');
? ^ ^^^^^
+ $('#messages').append('<li>' + data.content + '</li>');
? ^^^^ ^^
});
pusher.connection.bind('connecting', function() {
$('div#status').text("Connecting to chat system...");
});
pusher.connection.bind('connected', function() {
$('div#status').text("Connected to chat system!");
});
// NOTE: We should act functionality so that the div#status div slowly changes
// and is removed after a period of time (3 seconds?);
// NOTE: More more user-friendly message should appear here.
channel.bind('subscription_error', function() {
$('div#status').text("Not authorized to subscribe to channel.");
});
</script> | 16 | 0.444444 | 14 | 2 |
07f3440f42770474be32b642a2c2eb1aeb906827 | .travis.yml | .travis.yml |
language: node_js
# Node.js versions
node_js:
- '4.1'
- '0.11'
# Install dependencies
before_script:
# Install bower
- npm install -g bower
# Install project dependencies
- npm install
- bower install
# Perform build and tests
script: gulp build --production |
language: node_js
# Node.js versions
node_js:
- 4
# Install dependencies
before_script:
# Install bower
- npm install -g bower
# Install project dependencies
- npm install
- bower install
# Perform build and tests
script: gulp build --production
| Remove node 0.11 from Travis build matrix | Remove node 0.11 from Travis build matrix
| YAML | mit | DatanestIO/mkdocs-material,dockerin/mkdocs-material,mjames-upc/mkdocs-unidata,supremainc/sfmdev,mjames-upc/mkdocs-unidata,squidfunk/mkdocs-material,DatanestIO/mkdocs-material,supremainc/sfmdev,mjames-upc/mkdocs-unidata,squidfunk/mkdocs-material,dockerin/mkdocs-material,DatanestIO/mkdocs-material,supremainc/sfmdev,supremainc/sfmdev,DatanestIO/mkdocs-material,dockerin/mkdocs-material,mjames-upc/mkdocs-unidata,dockerin/mkdocs-material,squidfunk/mkdocs-material,squidfunk/mkdocs-material | yaml | ## Code Before:
language: node_js
# Node.js versions
node_js:
- '4.1'
- '0.11'
# Install dependencies
before_script:
# Install bower
- npm install -g bower
# Install project dependencies
- npm install
- bower install
# Perform build and tests
script: gulp build --production
## Instruction:
Remove node 0.11 from Travis build matrix
## Code After:
language: node_js
# Node.js versions
node_js:
- 4
# Install dependencies
before_script:
# Install bower
- npm install -g bower
# Install project dependencies
- npm install
- bower install
# Perform build and tests
script: gulp build --production
|
language: node_js
# Node.js versions
node_js:
+ - 4
- - '4.1'
- - '0.11'
# Install dependencies
before_script:
# Install bower
- npm install -g bower
# Install project dependencies
- npm install
- bower install
# Perform build and tests
script: gulp build --production | 3 | 0.15 | 1 | 2 |
8dd74597a0ae1947290619ff03e5fc9f9d33db33 | README.md | README.md | cpm-google-test
===============
CPM external for google test
| cpm-google-test
===============
[](https://travis-ci.org/iauns/cpm-google-test)
CPM external for google test.
Use
---
```c++
#include "gtest/gtest.h"
```
| Update readme with travis hook and use clause. | Update readme with travis hook and use clause.
| Markdown | mit | iauns/cpm-google-test,iauns/cpm-google-test | markdown | ## Code Before:
cpm-google-test
===============
CPM external for google test
## Instruction:
Update readme with travis hook and use clause.
## Code After:
cpm-google-test
===============
[](https://travis-ci.org/iauns/cpm-google-test)
CPM external for google test.
Use
---
```c++
#include "gtest/gtest.h"
```
| cpm-google-test
===============
+ [](https://travis-ci.org/iauns/cpm-google-test)
+
- CPM external for google test
+ CPM external for google test.
? +
+
+ Use
+ ---
+
+ ```c++
+ #include "gtest/gtest.h"
+ ``` | 11 | 2.75 | 10 | 1 |
09309cbfb321dbf8d5e5c4e4754259b2cb1619ac | startup.py | startup.py | import tables
import tables as tb
import fipy
import fipy as fp
import numpy
import numpy as np
import scipy
import scipy as sp
import pylab
import pylab as pl
| import tables
import tables as tb
import fipy
import fipy as fp
import numpy
import numpy as np
import scipy
import scipy as spy
import matplotlib.pylot as plt
| Use canonical matplotlib.pyplot as plt. | Use canonical matplotlib.pyplot as plt.
| Python | mit | wd15/env,wd15/env,wd15/env | python | ## Code Before:
import tables
import tables as tb
import fipy
import fipy as fp
import numpy
import numpy as np
import scipy
import scipy as sp
import pylab
import pylab as pl
## Instruction:
Use canonical matplotlib.pyplot as plt.
## Code After:
import tables
import tables as tb
import fipy
import fipy as fp
import numpy
import numpy as np
import scipy
import scipy as spy
import matplotlib.pylot as plt
| import tables
import tables as tb
import fipy
import fipy as fp
import numpy
import numpy as np
import scipy
- import scipy as sp
+ import scipy as spy
? +
+ import matplotlib.pylot as plt
- import pylab
- import pylab as pl
+ | 6 | 0.545455 | 3 | 3 |
31e28e7ac513d82fb6f6a208c8ae038ebcbd9beb | KomponentsExample/AppDelegate.swift | KomponentsExample/AppDelegate.swift | //
// AppDelegate.swift
// React
//
// Created by Sacha Durand Saint Omer on 29/03/2017.
// Copyright © 2017 Freshos. All rights reserved.
//
import UIKit
import Komponents
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
func application(_ application: UIApplication,
didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
window = UIWindow(frame: UIScreen.main.bounds)
let navVC = UINavigationController(rootViewController: NavigationVC())
navVC.navigationBar.isTranslucent = false
window?.rootViewController = navVC
window?.makeKeyAndVisible()
Komponents.logsEnabled = true
return true
}
}
| //
// AppDelegate.swift
// React
//
// Created by Sacha Durand Saint Omer on 29/03/2017.
// Copyright © 2017 Freshos. All rights reserved.
//
import UIKit
import Komponents
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
func application(_ application: UIApplication,
didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
Bundle(path: "/Applications/InjectionIII.app/Contents/Resources/iOSInjection10.bundle")?.load()
window = UIWindow(frame: UIScreen.main.bounds)
let navVC = UINavigationController(rootViewController: NavigationVC())
navVC.navigationBar.isTranslucent = false
window?.rootViewController = navVC
window?.makeKeyAndVisible()
Komponents.logsEnabled = true
return true
}
}
| Add injection for Xcode snippet | Add injection for Xcode snippet
| Swift | mit | freshOS/Komponents,freshOS/Komponents | swift | ## Code Before:
//
// AppDelegate.swift
// React
//
// Created by Sacha Durand Saint Omer on 29/03/2017.
// Copyright © 2017 Freshos. All rights reserved.
//
import UIKit
import Komponents
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
func application(_ application: UIApplication,
didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
window = UIWindow(frame: UIScreen.main.bounds)
let navVC = UINavigationController(rootViewController: NavigationVC())
navVC.navigationBar.isTranslucent = false
window?.rootViewController = navVC
window?.makeKeyAndVisible()
Komponents.logsEnabled = true
return true
}
}
## Instruction:
Add injection for Xcode snippet
## Code After:
//
// AppDelegate.swift
// React
//
// Created by Sacha Durand Saint Omer on 29/03/2017.
// Copyright © 2017 Freshos. All rights reserved.
//
import UIKit
import Komponents
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
func application(_ application: UIApplication,
didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
Bundle(path: "/Applications/InjectionIII.app/Contents/Resources/iOSInjection10.bundle")?.load()
window = UIWindow(frame: UIScreen.main.bounds)
let navVC = UINavigationController(rootViewController: NavigationVC())
navVC.navigationBar.isTranslucent = false
window?.rootViewController = navVC
window?.makeKeyAndVisible()
Komponents.logsEnabled = true
return true
}
}
| //
// AppDelegate.swift
// React
//
// Created by Sacha Durand Saint Omer on 29/03/2017.
// Copyright © 2017 Freshos. All rights reserved.
//
import UIKit
import Komponents
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
func application(_ application: UIApplication,
didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
+
+ Bundle(path: "/Applications/InjectionIII.app/Contents/Resources/iOSInjection10.bundle")?.load()
+
window = UIWindow(frame: UIScreen.main.bounds)
let navVC = UINavigationController(rootViewController: NavigationVC())
navVC.navigationBar.isTranslucent = false
window?.rootViewController = navVC
window?.makeKeyAndVisible()
Komponents.logsEnabled = true
return true
}
} | 3 | 0.115385 | 3 | 0 |
185c64a4784fc720cc3864f522f11329ba28d353 | src/app.component.ts | src/app.component.ts | import '../shims.js';
import { Component } from '@angular/core';
import { bootstrap } from '@angular/platform-browser-dynamic';
import { HTTP_PROVIDERS } from '@angular/http';
import { RESOURCE_PROVIDERS } from 'ng2-resource-rest';
import { CollectionsComponent } from './collections.component.ts';
@Component({
selector: 'qi-app',
template: '<qi-collections></qi-collections>',
directives: [CollectionsComponent]
})
class AppComponent { }
bootstrap(AppComponent, [HTTP_PROVIDERS, RESOURCE_PROVIDERS]);
| import '../shims.js';
import { Component } from '@angular/core';
import { bootstrap } from '@angular/platform-browser-dynamic';
import { HTTP_PROVIDERS } from '@angular/http';
import { RESOURCE_PROVIDERS } from 'ng2-resource-rest';
import { CollectionsComponent } from './collections/collections.component.ts';
@Component({
selector: 'qi-app',
template: '<qi-collections></qi-collections>',
directives: [CollectionsComponent]
})
class AppComponent { }
bootstrap(AppComponent, [HTTP_PROVIDERS, RESOURCE_PROVIDERS]);
| Move collection* to collections subdirectory. | Move collection* to collections subdirectory.
| TypeScript | bsd-2-clause | ohsu-qin/qiprofile,ohsu-qin/qiprofile,ohsu-qin/qiprofile,ohsu-qin/qiprofile | typescript | ## Code Before:
import '../shims.js';
import { Component } from '@angular/core';
import { bootstrap } from '@angular/platform-browser-dynamic';
import { HTTP_PROVIDERS } from '@angular/http';
import { RESOURCE_PROVIDERS } from 'ng2-resource-rest';
import { CollectionsComponent } from './collections.component.ts';
@Component({
selector: 'qi-app',
template: '<qi-collections></qi-collections>',
directives: [CollectionsComponent]
})
class AppComponent { }
bootstrap(AppComponent, [HTTP_PROVIDERS, RESOURCE_PROVIDERS]);
## Instruction:
Move collection* to collections subdirectory.
## Code After:
import '../shims.js';
import { Component } from '@angular/core';
import { bootstrap } from '@angular/platform-browser-dynamic';
import { HTTP_PROVIDERS } from '@angular/http';
import { RESOURCE_PROVIDERS } from 'ng2-resource-rest';
import { CollectionsComponent } from './collections/collections.component.ts';
@Component({
selector: 'qi-app',
template: '<qi-collections></qi-collections>',
directives: [CollectionsComponent]
})
class AppComponent { }
bootstrap(AppComponent, [HTTP_PROVIDERS, RESOURCE_PROVIDERS]);
| import '../shims.js';
import { Component } from '@angular/core';
import { bootstrap } from '@angular/platform-browser-dynamic';
import { HTTP_PROVIDERS } from '@angular/http';
import { RESOURCE_PROVIDERS } from 'ng2-resource-rest';
- import { CollectionsComponent } from './collections.component.ts';
+ import { CollectionsComponent } from './collections/collections.component.ts';
? ++++++++++++
@Component({
selector: 'qi-app',
template: '<qi-collections></qi-collections>',
directives: [CollectionsComponent]
})
class AppComponent { }
bootstrap(AppComponent, [HTTP_PROVIDERS, RESOURCE_PROVIDERS]); | 2 | 0.117647 | 1 | 1 |
9e797ebb55f357fd76a4daa098bbc85a212d7f94 | .travis.yml | .travis.yml | language: objective-c
env:
matrix:
- MONO_VERSION="3.4.0"
install:
- wget "http://download.mono-project.com/archive/3.4.0/macos-10-x86/MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg"
- sudo installer -pkg "MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg" -target /
script:
- ./build.sh All
| language: objective-c
env:
matrix:
- MONO_VERSION="3.4.0"
install:
- wget "http://download.mono-project.com/archive/${MONO_VERSION}/macos-10-x86/MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg"
- sudo installer -pkg "MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg" -target /
script:
- ./build.sh All
| Add missing reference to MONO_VERSION env variable | Add missing reference to MONO_VERSION env variable
| YAML | mit | enricosada/ProjectScaffold,bigjonroberts/Flurl.fs,terjetyl/ModulusCalculator2,QuickenLoans/MessageRouter,fsprojects/ProjectScaffold,mexx/Paket,Stift/Paket,halcwb/ProjectScaffold,LexxFedoroff/Paket,DamianReeves/Kollege,steego/Steego.Demo,0x53A/Paket,willryan/TimestampedJsonLogFilter,matthid/Paket,thinkbeforecoding/Paket,nhsevidence/ld-bnf,Eleven19/SSISClient,inosik/Paket,blazey/blazey.substituter,robertpi/Paket,enricosada/ProjectScaffold,simonhdickson/Paket,Thorium/Paket,MorganPersson/Paket,magicmonty/Paket,robertpi/Paket,Stift/Paket,simonhdickson/Paket,lexarchik/Paket,MorganPersson/Paket,codehutch/pg3d,ccozianu/XTestFSharp1,robertpi/caffe.clr.demo,baronfel/mongomigrator,magicmonty/Paket,magicmonty/Paket,dedale/Paket,vbfox/Paket,ctaggart/Paket,QuickenLoans/MessageRouter,bmuk/bmuk.io,YaccConstructor/ProjectScaffold,robertpi/Paket,dnauck/AspSQLProvider,theimowski/Paket,NaseUkolyCZ/ProjectScaffoldWithApps,matthid/Paket,jonathankarsh/Paket,cloudRoutine/Paket,fsprojects/ProjectScaffold,LexxFedoroff/Paket,fjoppe/FsYamlParser,mavnn/Paket,Stift/Paket,snowcrazed/Paket,Eleven19/SSISManagement,theimowski/Paket,matthid/Paket,jeremyabbott/doctoralsurvey,betgenius/Paket,jsparkes/ProjectScaffold,mausch/Paket,pblasucci/MessageRouter,ovatsus/Paket,isaacabraham/Paket,ed-ilyin/EdIlyin.FSharp.Elm.Core,Stift/Paket,modulexcite/ProjectScaffold,mrinaldi/Paket,cloudRoutine/Paket,Vidarls/ProjectScaffold,isaacabraham/Paket,kjellski/Paket,thinkbeforecoding/Paket,jeremyabbott/doctoralsurvey,ascjones/Paket,baronfel/mongomigrator,antymon4o/Paket,matthid/Paket,simonhdickson/Paket,MorganPersson/Paket,fsprojects/Paket,markstownsend/edna,isaacabraham/Paket,devboy/Paket,simonhdickson/Paket,hodzanassredin/TransAlt,antymon4o/Paket,neoeinstein/FSharp.RestApiProviders,willryan/Fakeish,hodzanassredin/NFastText,artur-s/Paket,thinkbeforecoding/Paket,Thorium/Paket,giacomociti/AntaniXml,inosik/Paket,kjellski/Paket,ctaggart/Paket,mrinaldi/Paket,codehutch/pg3d,jeremyabbott/doctoralsurvey,0x53A/Pake
t,sideeffffect/ProjectScaffold,jam40jeff/Paket,ascjones/Paket,NaseUkolyCZ/ProjectScaffoldWithApps,NaseUkolyCZ/ProjectScaffoldWithApps,nhsevidence/ld-bnf,mausch/Paket,jeremyabbott/doctoralsurvey,mastoj/FSourcing,matthid/ProjectScaffold,Thorium/Paket,lexarchik/Paket,isaacabraham/Paket,Eleven19/SSISClient,NatElkins/Paket,steego/Steego.Demo,drzoidb3rg/FSharpTemplate,fjoppe/Legivel,jam40jeff/Paket,steego/Steego.Demo,bradphelan/majk,eiriktsarpalis/progfsharp2015,DamianReeves/Incubation,konste/Paket,baronfel/Paket,baronfel/Paket,vasily-kirichenko/Paket,DamianReeves/Kollege,codehutch/pg3d,smkell/fsfbl,robertpi/Paket,devboy/Paket,NaseUkolyCZ/ProjectScaffoldWithApps,dedale/Paket,Eleven19/SSISClient,lexarchik/Paket,magicmonty/Paket,colinbull/Paket,baronfel/mongomigrator,NatElkins/Paket,sergey-tihon/Paket,giacomociti/AntaniXml,betgenius/Paket,fjoppe/Legivel,colinbull/Paket,fsprojects/Paket,blazey/blazey.substituter,NaseUkolyCZ/ProjectScaffoldWithApps,rneatherway/ProjectScaffold,cloudRoutine/Paket,mrinaldi/Paket,sideeffffect/ProjectScaffold,jonathankarsh/Paket,jeremyabbott/doctoralsurvey,lexarchik/Paket,0x53A/Paket,hodzanassredin/NFastText,NatElkins/Paket,Eleven19/SSISManagement,jeroldhaas/ProjectScaffold,ovatsus/Paket,artur-s/Paket,vbfox/Paket,Thorium/Paket,gsvgit/ProjectScaffold,ovatsus/FSharp.ProjectScaffold,snowcrazed/Paket,NatElkins/Paket,konste/Paket,cloudRoutine/Paket,vbfox/Paket,vasily-kirichenko/Paket,mexx/Paket,DamianReeves/Incubation,theimowski/FsPrimo,toburger/FSharp.Json,vbfox/Paket,mrinaldi/Paket,snowcrazed/Paket,MorganPersson/Paket,thinkbeforecoding/Paket,0x53A/Paket,sergey-tihon/Paket,pblasucci/MessageRouter,dnauck/AspSQLProvider,snowcrazed/Paket,mavnn/Paket | yaml | ## Code Before:
language: objective-c
env:
matrix:
- MONO_VERSION="3.4.0"
install:
- wget "http://download.mono-project.com/archive/3.4.0/macos-10-x86/MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg"
- sudo installer -pkg "MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg" -target /
script:
- ./build.sh All
## Instruction:
Add missing reference to MONO_VERSION env variable
## Code After:
language: objective-c
env:
matrix:
- MONO_VERSION="3.4.0"
install:
- wget "http://download.mono-project.com/archive/${MONO_VERSION}/macos-10-x86/MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg"
- sudo installer -pkg "MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg" -target /
script:
- ./build.sh All
| language: objective-c
env:
matrix:
- MONO_VERSION="3.4.0"
install:
- - wget "http://download.mono-project.com/archive/3.4.0/macos-10-x86/MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg"
? ^^^^^
+ - wget "http://download.mono-project.com/archive/${MONO_VERSION}/macos-10-x86/MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg"
? ^^^^^^^^^^^^^^^
- sudo installer -pkg "MonoFramework-MDK-${MONO_VERSION}.macos10.xamarin.x86.pkg" -target /
script:
- ./build.sh All | 2 | 0.166667 | 1 | 1 |
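The `diff` field in these records — and the trailing `diff_length` / `n_lines_added` / `n_lines_deleted` columns — follows the delta format of Python's `difflib.ndiff`, including the `? ^^^^^` intraline guide lines visible above. A minimal sketch of how such a diff and its line counts could be derived; the URL and filename here are invented for illustration, and this is not necessarily the pipeline that produced this dataset:

```python
import difflib

# Hypothetical before/after lines echoing the record above (URL is made up).
old = [
    'install:',
    '- wget "http://downloads.example/archive/3.4.0/mono.pkg"',
]
new = [
    'install:',
    '- wget "http://downloads.example/archive/${MONO_VERSION}/mono.pkg"',
]

diff = list(difflib.ndiff(old, new))

# "- "/"+ " prefixes mark deleted/added lines; "? " lines are the intraline
# guides that produce the ^^^^^ markers seen in the records. Unchanged lines
# get a two-space prefix.
n_deleted = sum(1 for line in diff if line.startswith('- '))
n_added = sum(1 for line in diff if line.startswith('+ '))
print(n_added, n_deleted)  # prints: 1 1
```

Counting the `+ ` and `- ` lines this way matches the `n_lines_added` and `n_lines_deleted` columns of the row above.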
480259bc2fbedb27a1c2cb7015bd550765d431d9 | lesscss/components/listing.less | lesscss/components/listing.less | .zebra-stripe > *:nth-child(2n-1) {
background-color: @zebra-white;
> span:hover {
background-color: @hover-white;
}
}
.listings li.c-listing--selected {
background-color: @selected-light;
}
.listings span:hover {
background-color: @hover-white;
}
.c-listing {
display: flex;
flex-flow: row;
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move,
.c-listing__counter {
display: none;
}
&:hover {
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move {
display: flex;
}
}
}
.c-folder {
.c-listing {
&:not(:first-child) {
margin-left: @medium-spacer;
}
}
}
.c-folder--open {
box-shadow: inset 10px -10px 5px -5px rgba(0, 0, 0, 0.25);
padding-bottom: @medium-spacer;
.c-folder--open {
margin-left: @medium-spacer;
}
}
| .zebra-stripe > *:nth-child(2n-1) {
background-color: @zebra-white;
> span:hover {
background-color: @hover-white;
}
}
.listings li.c-listing--selected {
background-color: @selected-light;
}
.listings span:hover {
background-color: @hover-white;
}
.c-listing {
display: flex;
flex-flow: row;
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move,
.c-listing__counter {
display: none;
}
&:hover {
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move {
display: flex;
}
}
}
.c-folder {
.c-listing {
&:not(:first-child) {
margin-left: @medium-spacer;
}
}
}
.c-folder--open {
box-shadow: inset 0px -10px 5px -5px rgba(0, 0, 0, 0.25);
border-left: 1px solid @primary-black;
padding-bottom: @medium-spacer;
.c-folder--open {
margin-left: @medium-spacer;
}
}
| Swap side box shadow for solid border | Swap side box shadow for solid border
| Less | mit | cynicaloptimist/improved-initiative,cynicaloptimist/improved-initiative,cynicaloptimist/improved-initiative | less | ## Code Before:
.zebra-stripe > *:nth-child(2n-1) {
background-color: @zebra-white;
> span:hover {
background-color: @hover-white;
}
}
.listings li.c-listing--selected {
background-color: @selected-light;
}
.listings span:hover {
background-color: @hover-white;
}
.c-listing {
display: flex;
flex-flow: row;
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move,
.c-listing__counter {
display: none;
}
&:hover {
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move {
display: flex;
}
}
}
.c-folder {
.c-listing {
&:not(:first-child) {
margin-left: @medium-spacer;
}
}
}
.c-folder--open {
box-shadow: inset 10px -10px 5px -5px rgba(0, 0, 0, 0.25);
padding-bottom: @medium-spacer;
.c-folder--open {
margin-left: @medium-spacer;
}
}
## Instruction:
Swap side box shadow for solid border
## Code After:
.zebra-stripe > *:nth-child(2n-1) {
background-color: @zebra-white;
> span:hover {
background-color: @hover-white;
}
}
.listings li.c-listing--selected {
background-color: @selected-light;
}
.listings span:hover {
background-color: @hover-white;
}
.c-listing {
display: flex;
flex-flow: row;
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move,
.c-listing__counter {
display: none;
}
&:hover {
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move {
display: flex;
}
}
}
.c-folder {
.c-listing {
&:not(:first-child) {
margin-left: @medium-spacer;
}
}
}
.c-folder--open {
box-shadow: inset 0px -10px 5px -5px rgba(0, 0, 0, 0.25);
border-left: 1px solid @primary-black;
padding-bottom: @medium-spacer;
.c-folder--open {
margin-left: @medium-spacer;
}
}
| .zebra-stripe > *:nth-child(2n-1) {
background-color: @zebra-white;
> span:hover {
background-color: @hover-white;
}
}
.listings li.c-listing--selected {
background-color: @selected-light;
}
.listings span:hover {
background-color: @hover-white;
}
.c-listing {
display: flex;
flex-flow: row;
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move,
.c-listing__counter {
display: none;
}
&:hover {
.c-listing-boss,
.c-listing-minion,
.c-listing-delete,
.c-listing-edit,
.c-listing-move {
display: flex;
}
}
}
.c-folder {
.c-listing {
&:not(:first-child) {
margin-left: @medium-spacer;
}
}
}
.c-folder--open {
- box-shadow: inset 10px -10px 5px -5px rgba(0, 0, 0, 0.25);
? -
+ box-shadow: inset 0px -10px 5px -5px rgba(0, 0, 0, 0.25);
+ border-left: 1px solid @primary-black;
padding-bottom: @medium-spacer;
.c-folder--open {
margin-left: @medium-spacer;
}
} | 3 | 0.055556 | 2 | 1 |
4133d2f14a644eadeedcc1ea21abbb6db8054c28 | app/views/layouts/boilerplate.html.erb | app/views/layouts/boilerplate.html.erb | <!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js lt-ie9 lt-ie8 lt-ie7"> <![endif]-->
<!--[if IE 7]> <html class="no-js lt-ie9 lt-ie8"> <![endif]-->
<!--[if IE 8]> <html class="no-js lt-ie9"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js"> <!--<![endif]-->
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<title><%= content_for?(:page_title) ? yield(:page_title) : default_page_title %></title>
<meta name="viewport" content="width=device-width">
<%= csrf_meta_tag %>
<%= stylesheet_link_tag 'application' %>
<%= javascript_include_tag 'application' %>
<%= render partial: 'shared/ga', formats: [:html] %>
</head>
<body>
<%= content_for?(:body) ? yield(:body) : yield %>
<% unless Rails.env.production? %>
<!-- Rendered through: <%= yield(:layout_name) %>boilerplate -->
<% end %>
<div id="ajax-modal" class="modal hide fade" tabindex="-1"></div>
</body>
</html>
| <!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js lt-ie9 lt-ie8 lt-ie7"> <![endif]-->
<!--[if IE 7]> <html class="no-js lt-ie9 lt-ie8"> <![endif]-->
<!--[if IE 8]> <html class="no-js lt-ie9"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js"> <!--<![endif]-->
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<title><%= content_for?(:page_title) ? yield(:page_title) : default_page_title %></title>
<meta name="viewport" content="width=device-width">
<%= csrf_meta_tag %>
<%= stylesheet_link_tag 'application' %>
<%= javascript_include_tag 'application' %>
<%= render partial: 'shared/ga', formats: [:html] %>
</head>
<body>
<%= content_for?(:body) ? yield(:body) : yield %>
<%= render 'shared/ajax_modal' %>
</body>
</html>
| Fix "more" facets modal window | Fix "more" facets modal window
| HTML+ERB | apache-2.0 | samvera/hyrax,samvera/hyrax,samvera/hyrax,samvera/hyrax | html+erb | ## Code Before:
<!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js lt-ie9 lt-ie8 lt-ie7"> <![endif]-->
<!--[if IE 7]> <html class="no-js lt-ie9 lt-ie8"> <![endif]-->
<!--[if IE 8]> <html class="no-js lt-ie9"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js"> <!--<![endif]-->
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<title><%= content_for?(:page_title) ? yield(:page_title) : default_page_title %></title>
<meta name="viewport" content="width=device-width">
<%= csrf_meta_tag %>
<%= stylesheet_link_tag 'application' %>
<%= javascript_include_tag 'application' %>
<%= render partial: 'shared/ga', formats: [:html] %>
</head>
<body>
<%= content_for?(:body) ? yield(:body) : yield %>
<% unless Rails.env.production? %>
<!-- Rendered through: <%= yield(:layout_name) %>boilerplate -->
<% end %>
<div id="ajax-modal" class="modal hide fade" tabindex="-1"></div>
</body>
</html>
## Instruction:
Fix "more" facets modal window
## Code After:
<!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js lt-ie9 lt-ie8 lt-ie7"> <![endif]-->
<!--[if IE 7]> <html class="no-js lt-ie9 lt-ie8"> <![endif]-->
<!--[if IE 8]> <html class="no-js lt-ie9"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js"> <!--<![endif]-->
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<title><%= content_for?(:page_title) ? yield(:page_title) : default_page_title %></title>
<meta name="viewport" content="width=device-width">
<%= csrf_meta_tag %>
<%= stylesheet_link_tag 'application' %>
<%= javascript_include_tag 'application' %>
<%= render partial: 'shared/ga', formats: [:html] %>
</head>
<body>
<%= content_for?(:body) ? yield(:body) : yield %>
<%= render 'shared/ajax_modal' %>
</body>
</html>
| <!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js lt-ie9 lt-ie8 lt-ie7"> <![endif]-->
<!--[if IE 7]> <html class="no-js lt-ie9 lt-ie8"> <![endif]-->
<!--[if IE 8]> <html class="no-js lt-ie9"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js"> <!--<![endif]-->
<head>
<meta charset="utf-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<title><%= content_for?(:page_title) ? yield(:page_title) : default_page_title %></title>
<meta name="viewport" content="width=device-width">
<%= csrf_meta_tag %>
-
<%= stylesheet_link_tag 'application' %>
<%= javascript_include_tag 'application' %>
<%= render partial: 'shared/ga', formats: [:html] %>
</head>
<body>
<%= content_for?(:body) ? yield(:body) : yield %>
+ <%= render 'shared/ajax_modal' %>
-
- <% unless Rails.env.production? %>
- <!-- Rendered through: <%= yield(:layout_name) %>boilerplate -->
- <% end %>
- <div id="ajax-modal" class="modal hide fade" tabindex="-1"></div>
</body>
</html> | 7 | 0.259259 | 1 | 6 |
864b120ef242c7bade6e9a1cb09160c3a97113cf | scripts/ci/azure/macos-setup.sh | scripts/ci/azure/macos-setup.sh |
echo "Setting up default XCode version"
case "$SYSTEM_JOBNAME" in
*xcode941*)
sudo xcode-select --switch /Applications/Xcode_9.4.1.app
;;
*xcode103*)
sudo xcode-select --switch /Applications/Xcode_10.3.app
;;
*xcode111*)
sudo xcode-select --switch /Applications/Xcode_11.1.app
;;
*)
echo " Unknown macOS image. Using defaults."
;;
esac
echo "Removing all existing brew package and update teh formule"
brew remove --force $(brew list)
brew update
echo "Installing Kitware CMake and Ninja"
brew install cmake ninja
echo "Installing GCC"
brew install gcc
echo "Installing blosc compression"
brew install c-blosc
echo "Installing python and friends"
brew install python numpy
if [[ "$SYSTEM_JOBNAME" =~ .*openmpi.* ]]
then
echo "Installing OpenMPI"
brew install openmpi
fi
|
echo "Setting up default XCode version"
case "$SYSTEM_JOBNAME" in
*xcode941*)
sudo xcode-select --switch /Applications/Xcode_9.4.1.app
;;
*xcode103*)
sudo xcode-select --switch /Applications/Xcode_10.3.app
;;
*xcode111*)
sudo xcode-select --switch /Applications/Xcode_11.1.app
;;
*)
echo " Unknown macOS image. Using defaults."
;;
esac
echo "Removing all existing brew package and update the formule"
brew remove --force $(brew list)
brew update
echo "Installing Kitware CMake and Ninja"
brew install cmake ninja
echo "Installing GCC"
brew install gcc
echo "Installing blosc compression"
brew install c-blosc
echo "Installing python and friends"
brew install python numpy
brew link --overwrite python
if [[ "$SYSTEM_JOBNAME" =~ .*openmpi.* ]]
then
echo "Installing OpenMPI"
brew install openmpi
fi
| Fix brew mac build python install problem | Fix brew mac build python install problem
| Shell | apache-2.0 | ornladios/ADIOS2,ornladios/ADIOS2,ornladios/ADIOS2,JasonRuonanWang/ADIOS2,ornladios/ADIOS2,JasonRuonanWang/ADIOS2,JasonRuonanWang/ADIOS2,JasonRuonanWang/ADIOS2,JasonRuonanWang/ADIOS2,ornladios/ADIOS2 | shell | ## Code Before:
echo "Setting up default XCode version"
case "$SYSTEM_JOBNAME" in
*xcode941*)
sudo xcode-select --switch /Applications/Xcode_9.4.1.app
;;
*xcode103*)
sudo xcode-select --switch /Applications/Xcode_10.3.app
;;
*xcode111*)
sudo xcode-select --switch /Applications/Xcode_11.1.app
;;
*)
echo " Unknown macOS image. Using defaults."
;;
esac
echo "Removing all existing brew package and update teh formule"
brew remove --force $(brew list)
brew update
echo "Installing Kitware CMake and Ninja"
brew install cmake ninja
echo "Installing GCC"
brew install gcc
echo "Installing blosc compression"
brew install c-blosc
echo "Installing python and friends"
brew install python numpy
if [[ "$SYSTEM_JOBNAME" =~ .*openmpi.* ]]
then
echo "Installing OpenMPI"
brew install openmpi
fi
## Instruction:
Fix brew mac build python install problem
## Code After:
echo "Setting up default XCode version"
case "$SYSTEM_JOBNAME" in
*xcode941*)
sudo xcode-select --switch /Applications/Xcode_9.4.1.app
;;
*xcode103*)
sudo xcode-select --switch /Applications/Xcode_10.3.app
;;
*xcode111*)
sudo xcode-select --switch /Applications/Xcode_11.1.app
;;
*)
echo " Unknown macOS image. Using defaults."
;;
esac
echo "Removing all existing brew package and update the formule"
brew remove --force $(brew list)
brew update
echo "Installing Kitware CMake and Ninja"
brew install cmake ninja
echo "Installing GCC"
brew install gcc
echo "Installing blosc compression"
brew install c-blosc
echo "Installing python and friends"
brew install python numpy
brew link --overwrite python
if [[ "$SYSTEM_JOBNAME" =~ .*openmpi.* ]]
then
echo "Installing OpenMPI"
brew install openmpi
fi
|
echo "Setting up default XCode version"
case "$SYSTEM_JOBNAME" in
*xcode941*)
sudo xcode-select --switch /Applications/Xcode_9.4.1.app
;;
*xcode103*)
sudo xcode-select --switch /Applications/Xcode_10.3.app
;;
*xcode111*)
sudo xcode-select --switch /Applications/Xcode_11.1.app
;;
*)
echo " Unknown macOS image. Using defaults."
;;
esac
- echo "Removing all existing brew package and update teh formule"
? -
+ echo "Removing all existing brew package and update the formule"
? +
brew remove --force $(brew list)
brew update
echo "Installing Kitware CMake and Ninja"
brew install cmake ninja
echo "Installing GCC"
brew install gcc
echo "Installing blosc compression"
brew install c-blosc
echo "Installing python and friends"
brew install python numpy
+ brew link --overwrite python
if [[ "$SYSTEM_JOBNAME" =~ .*openmpi.* ]]
then
echo "Installing OpenMPI"
brew install openmpi
fi | 3 | 0.078947 | 2 | 1 |
0ddefe4b99865d85ced35ebc28679afebe5d369e | db/migrate/20130213162738_update_travel_advice_slugs.rb | db/migrate/20130213162738_update_travel_advice_slugs.rb | class UpdateTravelAdviceSlugs < Mongoid::Migration
def self.up
Artefact.where(:slug => %r{\Atravel-advice/}).each do |artefact|
artefact.slug = "foreign-#{artefact.slug}"
artefact.save :validate => false # validate => false necessary because these will be live artefacts
end
end
def self.down
Artefact.where(:slug => %r{\Aforeign-travel-advice/(.*)\z}).each do |artefact|
artefact.slug = artefact.slug.sub(/\Aforeign-/, '')
artefact.save :validate => false # validate => false necessary because these will be live artefacts
end
end
end
| class UpdateTravelAdviceSlugs < Mongoid::Migration
def self.up
Artefact.where(:slug => %r{\Atravel-advice/}).each do |artefact|
if ! Rails.env.development? or ENV['UPDATE_SEARCH'].present?
Rummageable.delete("/#{artefact.slug}")
end
artefact.set(:slug, "foreign-#{artefact.slug}")
end
end
def self.down
Artefact.where(:slug => %r{\Aforeign-travel-advice/(.*)\z}).each do |artefact|
artefact.set(:slug, artefact.slug.sub(/\Aforeign-/, ''))
end
end
end
| Fix slug change migration to deal with rummager update | Fix slug change migration to deal with rummager update
Switch to using set methos to avoid callbacks. Will deal with re-adding
to search by re-registering everything from travel-advice-publisher
| Ruby | mit | theodi/panopticon,alphagov/panopticon,alphagov/panopticon,alphagov/panopticon,theodi/panopticon,alphagov/panopticon,theodi/panopticon,theodi/panopticon | ruby | ## Code Before:
class UpdateTravelAdviceSlugs < Mongoid::Migration
def self.up
Artefact.where(:slug => %r{\Atravel-advice/}).each do |artefact|
artefact.slug = "foreign-#{artefact.slug}"
artefact.save :validate => false # validate => false necessary because these will be live artefacts
end
end
def self.down
Artefact.where(:slug => %r{\Aforeign-travel-advice/(.*)\z}).each do |artefact|
artefact.slug = artefact.slug.sub(/\Aforeign-/, '')
artefact.save :validate => false # validate => false necessary because these will be live artefacts
end
end
end
## Instruction:
Fix slug change migration to deal with rummager update
Switch to using set methos to avoid callbacks. Will deal with re-adding
to search by re-registering everything from travel-advice-publisher
## Code After:
class UpdateTravelAdviceSlugs < Mongoid::Migration
def self.up
Artefact.where(:slug => %r{\Atravel-advice/}).each do |artefact|
if ! Rails.env.development? or ENV['UPDATE_SEARCH'].present?
Rummageable.delete("/#{artefact.slug}")
end
artefact.set(:slug, "foreign-#{artefact.slug}")
end
end
def self.down
Artefact.where(:slug => %r{\Aforeign-travel-advice/(.*)\z}).each do |artefact|
artefact.set(:slug, artefact.slug.sub(/\Aforeign-/, ''))
end
end
end
| class UpdateTravelAdviceSlugs < Mongoid::Migration
def self.up
Artefact.where(:slug => %r{\Atravel-advice/}).each do |artefact|
+ if ! Rails.env.development? or ENV['UPDATE_SEARCH'].present?
+ Rummageable.delete("/#{artefact.slug}")
+ end
- artefact.slug = "foreign-#{artefact.slug}"
? ^^
+ artefact.set(:slug, "foreign-#{artefact.slug}")
? +++++ ^ +
- artefact.save :validate => false # validate => false necessary because these will be live artefacts
end
end
def self.down
Artefact.where(:slug => %r{\Aforeign-travel-advice/(.*)\z}).each do |artefact|
- artefact.slug = artefact.slug.sub(/\Aforeign-/, '')
? ^^
+ artefact.set(:slug, artefact.slug.sub(/\Aforeign-/, ''))
? +++++ ^ +
- artefact.save :validate => false # validate => false necessary because these will be live artefacts
end
end
end | 9 | 0.6 | 5 | 4 |
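The migration record above swaps callback-triggering `save` calls for Mongoid's atomic `set`, precisely so that no callbacks or validations fire on live records. A toy Python sketch of that distinction — the class and method names are invented stand-ins for illustration, not Mongoid's real API:

```python
class ToyModel:
    """Illustrative stand-in for an ORM document -- NOT Mongoid's API."""

    def __init__(self, slug):
        self.slug = slug
        self.callbacks_fired = 0  # stands in for before/after-save hooks

    def save(self):
        # A full save would run validations and callbacks; here we just
        # count callback invocations to make the side effect visible.
        self.callbacks_fired += 1

    def set(self, field, value):
        # An atomic single-field write: no validations, no callbacks --
        # mirroring why the migration switched away from save.
        setattr(self, field, value)


artefact = ToyModel("travel-advice/spain")
artefact.set("slug", "foreign-" + artefact.slug)
print(artefact.slug)             # prints: foreign-travel-advice/spain
print(artefact.callbacks_fired)  # prints: 0
```

The same rename done through `save` would have incremented the callback counter, which is the behavior the commit set out to avoid.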
a6993e586d3af137de24f832ac7d477e67a3a501 | server.js | server.js | import express from 'express';
import { apolloServer } from 'graphql-tools';
import Schema from './data/schema';
import Mocks from './data/mocks';
const GRAPHQL_PORT = 8080;
const graphQLServer = express();
graphQLServer.use('/', apolloServer({
graphiql: true,
pretty: true,
schema: Schema,
mocks: Mocks,
}));
graphQLServer.listen(GRAPHQL_PORT, () => console.log(
`GraphQL Server is now running on http://localhost:${GRAPHQL_PORT}`
));
| import express from 'express';
import { apolloServer } from 'graphql-tools';
import Schema from './data/schema';
import Mocks from './data/mocks';
const GRAPHQL_PORT = 8080;
const graphQLServer = express();
graphQLServer.use('/graphql', apolloServer({
graphiql: true,
pretty: true,
schema: Schema,
mocks: Mocks,
}));
graphQLServer.listen(GRAPHQL_PORT, () => console.log(
`GraphQL Server is now running on http://localhost:${GRAPHQL_PORT}/graphql`
));
| Use /graphql instead of graphql | Use /graphql instead of graphql | JavaScript | mit | vincentfretin/assembl-graphql-js-server,misas-io/misas-server,misas-io/misas-server,a1528zhang/graphql-mock-server | javascript | ## Code Before:
import express from 'express';
import { apolloServer } from 'graphql-tools';
import Schema from './data/schema';
import Mocks from './data/mocks';
const GRAPHQL_PORT = 8080;
const graphQLServer = express();
graphQLServer.use('/', apolloServer({
graphiql: true,
pretty: true,
schema: Schema,
mocks: Mocks,
}));
graphQLServer.listen(GRAPHQL_PORT, () => console.log(
`GraphQL Server is now running on http://localhost:${GRAPHQL_PORT}`
));
## Instruction:
Use /graphql instead of graphql
## Code After:
import express from 'express';
import { apolloServer } from 'graphql-tools';
import Schema from './data/schema';
import Mocks from './data/mocks';
const GRAPHQL_PORT = 8080;
const graphQLServer = express();
graphQLServer.use('/graphql', apolloServer({
graphiql: true,
pretty: true,
schema: Schema,
mocks: Mocks,
}));
graphQLServer.listen(GRAPHQL_PORT, () => console.log(
`GraphQL Server is now running on http://localhost:${GRAPHQL_PORT}/graphql`
));
| import express from 'express';
import { apolloServer } from 'graphql-tools';
import Schema from './data/schema';
import Mocks from './data/mocks';
const GRAPHQL_PORT = 8080;
const graphQLServer = express();
- graphQLServer.use('/', apolloServer({
+ graphQLServer.use('/graphql', apolloServer({
? +++++++
graphiql: true,
pretty: true,
schema: Schema,
mocks: Mocks,
}));
graphQLServer.listen(GRAPHQL_PORT, () => console.log(
- `GraphQL Server is now running on http://localhost:${GRAPHQL_PORT}`
+ `GraphQL Server is now running on http://localhost:${GRAPHQL_PORT}/graphql`
? ++++++++
)); | 4 | 0.235294 | 2 | 2 |
1b4053446d6102544ac496a497eefab28928a7ee | plugins/api/item/LootTable.kt | plugins/api/item/LootTable.kt | package api.item
import api.predef.*
import io.luna.game.model.item.Item
/**
* A model representing a loot table. Loot tables are collections of items that can select items to be picked based
* on their rarity.
*
* @author lare96
*/
class LootTable(private val items: List<LootTableItem>) : Iterable<LootTableItem> {
override fun iterator(): Iterator<LootTableItem> = items.iterator()
/**
* Picks [amount] items from this loot table.
*/
fun pick(amount: Int): List<Item> {
val lootItems = ArrayList<Item>(amount)
for (loot in items) {
if (roll(loot)) {
lootItems += loot.getItem()
}
}
return lootItems
}
/**
* Picks one item from this loot table.
*/
fun pick(): Item = pickAll().random()
/**
* Picks all items from this loot table.
*/
fun pickAll(): List<Item> = pick(items.size)
/**
* Determines if [loot] will be picked based on its rarity.
*/
private fun roll(loot: LootTableItem): Boolean {
val chance = loot.chance
return when {
chance.numerator <= 0 -> false
chance.numerator >= chance.denominator -> true
rand(0, chance.denominator) <= chance.numerator -> true
else -> false
}
}
}
| package api.item
import api.predef.*
import io.luna.game.model.item.Item
/**
* A model representing a loot table. Loot tables are collections of items that can select items to be picked based
* on their rarity.
*
* @author lare96
*/
class LootTable(private val items: List<LootTableItem>) : Iterable<LootTableItem> {
override fun iterator(): Iterator<LootTableItem> = items.iterator()
/**
* Rolls on one item from this loot table.
*/
fun pick(): Item? {
val all = pickAll()
return when (all.isEmpty()) {
true -> null
false -> all.random()
}
}
/**
* Rolls on all items from this loot table.
*/
fun pickAll(): List<Item> {
val lootItems = ArrayList<Item>(items.size)
for (loot in items) {
if (roll(loot)) {
lootItems += loot.getItem()
}
}
return lootItems
}
/**
* Determines if [loot] will be picked based on its rarity.
*/
private fun roll(loot: LootTableItem): Boolean {
val chance = loot.chance
return when {
chance.numerator <= 0 -> false
chance.numerator >= chance.denominator -> true
rand(0, chance.denominator) <= chance.numerator -> true
else -> false
}
}
}
| Test and fix bugs with loot tables | Test and fix bugs with loot tables
| Kotlin | mit | lare96/luna | kotlin | ## Code Before:
package api.item
import api.predef.*
import io.luna.game.model.item.Item
/**
* A model representing a loot table. Loot tables are collections of items that can select items to be picked based
* on their rarity.
*
* @author lare96
*/
class LootTable(private val items: List<LootTableItem>) : Iterable<LootTableItem> {
override fun iterator(): Iterator<LootTableItem> = items.iterator()
/**
* Picks [amount] items from this loot table.
*/
fun pick(amount: Int): List<Item> {
val lootItems = ArrayList<Item>(amount)
for (loot in items) {
if (roll(loot)) {
lootItems += loot.getItem()
}
}
return lootItems
}
/**
* Picks one item from this loot table.
*/
fun pick(): Item = pickAll().random()
/**
* Picks all items from this loot table.
*/
fun pickAll(): List<Item> = pick(items.size)
/**
* Determines if [loot] will be picked based on its rarity.
*/
private fun roll(loot: LootTableItem): Boolean {
val chance = loot.chance
return when {
chance.numerator <= 0 -> false
chance.numerator >= chance.denominator -> true
rand(0, chance.denominator) <= chance.numerator -> true
else -> false
}
}
}
## Instruction:
Test and fix bugs with loot tables
## Code After:
package api.item
import api.predef.*
import io.luna.game.model.item.Item
/**
* A model representing a loot table. Loot tables are collections of items that can select items to be picked based
* on their rarity.
*
* @author lare96
*/
class LootTable(private val items: List<LootTableItem>) : Iterable<LootTableItem> {
override fun iterator(): Iterator<LootTableItem> = items.iterator()
/**
* Rolls on one item from this loot table.
*/
fun pick(): Item? {
val all = pickAll()
return when (all.isEmpty()) {
true -> null
false -> all.random()
}
}
/**
* Rolls on all items from this loot table.
*/
fun pickAll(): List<Item> {
val lootItems = ArrayList<Item>(items.size)
for (loot in items) {
if (roll(loot)) {
lootItems += loot.getItem()
}
}
return lootItems
}
/**
* Determines if [loot] will be picked based on its rarity.
*/
private fun roll(loot: LootTableItem): Boolean {
val chance = loot.chance
return when {
chance.numerator <= 0 -> false
chance.numerator >= chance.denominator -> true
rand(0, chance.denominator) <= chance.numerator -> true
else -> false
}
}
}
| package api.item
import api.predef.*
import io.luna.game.model.item.Item
/**
* A model representing a loot table. Loot tables are collections of items that can select items to be picked based
* on their rarity.
*
* @author lare96
*/
class LootTable(private val items: List<LootTableItem>) : Iterable<LootTableItem> {
override fun iterator(): Iterator<LootTableItem> = items.iterator()
/**
- * Picks [amount] items from this loot table.
? ^^^^ --- - ^^ -
+ * Rolls on one item from this loot table.
? ^^^^ ^^^^
*/
+ fun pick(): Item? {
+ val all = pickAll()
+ return when (all.isEmpty()) {
+ true -> null
+ false -> all.random()
+ }
+ }
+
+ /**
+ * Rolls on all items from this loot table.
+ */
- fun pick(amount: Int): List<Item> {
? -----------
+ fun pickAll(): List<Item> {
? +++
- val lootItems = ArrayList<Item>(amount)
? ^ ^^^^
+ val lootItems = ArrayList<Item>(items.size)
? ^^^ ^^^^^^
for (loot in items) {
if (roll(loot)) {
lootItems += loot.getItem()
}
}
return lootItems
}
-
- /**
- * Picks one item from this loot table.
- */
- fun pick(): Item = pickAll().random()
-
- /**
- * Picks all items from this loot table.
- */
- fun pickAll(): List<Item> = pick(items.size)
/**
* Determines if [loot] will be picked based on its rarity.
*/
private fun roll(loot: LootTableItem): Boolean {
val chance = loot.chance
return when {
chance.numerator <= 0 -> false
chance.numerator >= chance.denominator -> true
rand(0, chance.denominator) <= chance.numerator -> true
else -> false
}
}
} | 27 | 0.529412 | 14 | 13 |
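The `roll` logic in the Kotlin record above decides a drop by comparing a random value against a numerator/denominator chance, with short-circuits for impossible and guaranteed drops. A hedged Python rendering of those branches — an interpretation for illustration only, assuming the framework's `rand(0, n)` is inclusive on both ends:

```python
import random

def roll(numerator, denominator, rng=random):
    # Mirrors the three branches of the Kotlin roll(); the inclusive
    # rand(0, denominator) bound is an assumption about that framework.
    if numerator <= 0:
        return False           # impossible drop
    if numerator >= denominator:
        return True            # guaranteed drop
    return rng.randint(0, denominator) <= numerator

# The deterministic branches:
print(roll(0, 128))    # prints: False
print(roll(128, 128))  # prints: True
```

The middle branch is probabilistic; only the zero-chance and at-or-above-denominator cases are fixed, which is why the record's tests could pin down exactly those edges.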
d0a9b535057249980866bed66ae2ff0bcaee31e6 | .travis.yml | .travis.yml |
language: objective-c
osx_image: xcode10.1
before_install:
- gem install xcpretty
script:
set -o pipefail &&
travis_retry
xcodebuild test
-workspace CotEditor.xcworkspace
-scheme CotEditor
CODE_SIGN_IDENTITY=""
CODE_SIGNING_REQUIRED=NO
| xcpretty
|
language: objective-c
osx_image: xcode10.1
before_install:
- gem install xcpretty
script:
set -o pipefail &&
travis_retry
xcodebuild test
-workspace CotEditor.xcworkspace
-scheme CotEditor
CODE_SIGN_IDENTITY=""
CODE_SIGNING_REQUIRED=NO
ENABLE_HARDENED_RUNTIME=NO
| xcpretty
| Disable hardened runtime for Travis CI | Disable hardened runtime for Travis CI
| YAML | apache-2.0 | onevcat/CotEditor,onevcat/CotEditor,onevcat/CotEditor | yaml | ## Code Before:
language: objective-c
osx_image: xcode10.1
before_install:
- gem install xcpretty
script:
set -o pipefail &&
travis_retry
xcodebuild test
-workspace CotEditor.xcworkspace
-scheme CotEditor
CODE_SIGN_IDENTITY=""
CODE_SIGNING_REQUIRED=NO
| xcpretty
## Instruction:
Disable hardened runtime for Travis CI
## Code After:
language: objective-c
osx_image: xcode10.1
before_install:
- gem install xcpretty
script:
set -o pipefail &&
travis_retry
xcodebuild test
-workspace CotEditor.xcworkspace
-scheme CotEditor
CODE_SIGN_IDENTITY=""
CODE_SIGNING_REQUIRED=NO
ENABLE_HARDENED_RUNTIME=NO
| xcpretty
|
language: objective-c
osx_image: xcode10.1
before_install:
- gem install xcpretty
script:
set -o pipefail &&
travis_retry
xcodebuild test
-workspace CotEditor.xcworkspace
-scheme CotEditor
CODE_SIGN_IDENTITY=""
CODE_SIGNING_REQUIRED=NO
+ ENABLE_HARDENED_RUNTIME=NO
| xcpretty | 1 | 0.0625 | 1 | 0 |
d43684a867de21d74bdc1410e5ca7d62447fa164 | source/scss/base/html/_html.scss | source/scss/base/html/_html.scss | /*** HTML ***/ | /*** HTML ***/
* {
@include box-sizing(border-box);
}
html, body {
font-family: $text-body--regular;
}
body{
font-size: $base--typesize;
}
| Apply Global properties such as box-sizing, body font face and base font size | Apply Global properties such as box-sizing, body font face and base font size
| SCSS | mit | cloud23/ucsf-patternlab,cloud23/ucsf-patternlab,cloud23/ucsf-patternlab,cloud23/ucsf-patternlab,cloud23/ucsf-patternlab | scss | ## Code Before:
/*** HTML ***/
## Instruction:
Apply Global properties such as box-sizing, body font face and base font size
## Code After:
/*** HTML ***/
* {
@include box-sizing(border-box);
}
html, body {
font-family: $text-body--regular;
}
body{
font-size: $base--typesize;
}
| /*** HTML ***/
+ * {
+ @include box-sizing(border-box);
+ }
+
+ html, body {
+ font-family: $text-body--regular;
+ }
+
+ body{
+ font-size: $base--typesize;
+ } | 11 | 11 | 11 | 0 |
55dc9595a25dd135bcdeff96dee77dafcd4575b6 | bottleopener/shiftrConnector.cpp | bottleopener/shiftrConnector.cpp |
void ShiftrConnector::init(const char* deviceLogin, const char* pwd)
{
client.begin("broker.shiftr.io");
this->deviceLogin = deviceLogin;
this->pwd = pwd;
connect();
client.subscribe("/bottle-openner");
// client.unsubscribe("/bottle-openner");
}
void ShiftrConnector::connect() {
while (!client.connect("arduino", this->deviceLogin, this->pwd)) {
Serial.print(".");
delay(500);
}
Serial.println("\nconnected to Shiftr!");
}
void ShiftrConnector::loop()
{
client.loop();
if (!client.connected()) {
connect();
}
}
void ShiftrConnector::sendCounter(int counter)
{
char buf[5];
itoa(counter, buf, 10);
client.publish("/bottle-openner", buf);
}
|
void ShiftrConnector::init(const char* deviceLogin, const char* pwd)
{
client.begin("broker.shiftr.io");
this->deviceLogin = deviceLogin;
this->pwd = pwd;
connect();
client.subscribe("/bottle-openner");
// client.unsubscribe("/bottle-openner");
}
void ShiftrConnector::connect() {
while (!client.connect("arduino", this->deviceLogin, this->pwd)) {
Serial.print(".");
delay(500);
}
Serial.println("\nBottle-Opener is now connected to Shiftr!");
}
void ShiftrConnector::loop()
{
client.loop();
if (!client.connected()) {
connect();
}
}
void ShiftrConnector::sendCounter(int counter)
{
char buf[5];
itoa(counter, buf, 10);
client.publish("/bottle-openner", buf);
}
| Extend log when connected to the IoT platform | Extend log when connected to the IoT platform
| C++ | mit | Zenika/bottleopener_iot,Zenika/bottleopener_iot,Zenika/bottleopener_iot,Zenika/bottleopener_iot | c++ | ## Code Before:
void ShiftrConnector::init(const char* deviceLogin, const char* pwd)
{
client.begin("broker.shiftr.io");
this->deviceLogin = deviceLogin;
this->pwd = pwd;
connect();
client.subscribe("/bottle-openner");
// client.unsubscribe("/bottle-openner");
}
void ShiftrConnector::connect() {
while (!client.connect("arduino", this->deviceLogin, this->pwd)) {
Serial.print(".");
delay(500);
}
Serial.println("\nconnected to Shiftr!");
}
void ShiftrConnector::loop()
{
client.loop();
if (!client.connected()) {
connect();
}
}
void ShiftrConnector::sendCounter(int counter)
{
char buf[5];
itoa(counter, buf, 10);
client.publish("/bottle-openner", buf);
}
## Instruction:
Extend log when connected to the IoT platform
## Code After:
void ShiftrConnector::init(const char* deviceLogin, const char* pwd)
{
client.begin("broker.shiftr.io");
this->deviceLogin = deviceLogin;
this->pwd = pwd;
connect();
client.subscribe("/bottle-openner");
// client.unsubscribe("/bottle-openner");
}
void ShiftrConnector::connect() {
while (!client.connect("arduino", this->deviceLogin, this->pwd)) {
Serial.print(".");
delay(500);
}
Serial.println("\nBottle-Opener is now connected to Shiftr!");
}
void ShiftrConnector::loop()
{
client.loop();
if (!client.connected()) {
connect();
}
}
void ShiftrConnector::sendCounter(int counter)
{
char buf[5];
itoa(counter, buf, 10);
client.publish("/bottle-openner", buf);
}
|
void ShiftrConnector::init(const char* deviceLogin, const char* pwd)
{
client.begin("broker.shiftr.io");
this->deviceLogin = deviceLogin;
this->pwd = pwd;
connect();
client.subscribe("/bottle-openner");
// client.unsubscribe("/bottle-openner");
}
void ShiftrConnector::connect() {
while (!client.connect("arduino", this->deviceLogin, this->pwd)) {
Serial.print(".");
delay(500);
}
- Serial.println("\nconnected to Shiftr!");
+ Serial.println("\nBottle-Opener is now connected to Shiftr!");
? +++++++++++++++++++++
}
void ShiftrConnector::loop()
{
client.loop();
if (!client.connected()) {
connect();
}
}
void ShiftrConnector::sendCounter(int counter)
{
char buf[5];
itoa(counter, buf, 10);
client.publish("/bottle-openner", buf);
} | 2 | 0.05 | 1 | 1 |
fb196b4e046b6f5dbbaab4de3a04be578e43bb25 | lib/sshkit/all.rb | lib/sshkit/all.rb | require_relative '../core_ext/array'
require_relative '../core_ext/hash'
require_relative 'dsl'
require_relative 'host'
require_relative 'command'
require_relative 'configuration'
require_relative 'coordinator'
require_relative 'logger'
require_relative 'log_message'
require_relative 'formatters/abstract'
require_relative 'formatters/black_hole'
require_relative 'formatters/pretty'
require_relative 'formatters/dot'
require_relative 'runners/abstract'
require_relative 'runners/sequential'
require_relative 'runners/parallel'
require_relative 'runners/group'
require_relative 'backends/abstract'
require_relative 'backends/printer'
require_relative 'backends/netssh'
require_relative 'backends/local'
| require_relative '../core_ext/array'
require_relative '../core_ext/hash'
require_relative 'host'
require_relative 'command'
require_relative 'configuration'
require_relative 'coordinator'
require_relative 'logger'
require_relative 'log_message'
require_relative 'formatters/abstract'
require_relative 'formatters/black_hole'
require_relative 'formatters/pretty'
require_relative 'formatters/dot'
require_relative 'runners/abstract'
require_relative 'runners/sequential'
require_relative 'runners/parallel'
require_relative 'runners/group'
require_relative 'backends/abstract'
require_relative 'backends/printer'
require_relative 'backends/netssh'
require_relative 'backends/local'
| Remove sshkit/dsl from the default required files | Remove sshkit/dsl from the default required files
| Ruby | mit | mikz/sshkit,capistrano/sshkit,robd/sshkit | ruby | ## Code Before:
require_relative '../core_ext/array'
require_relative '../core_ext/hash'
require_relative 'dsl'
require_relative 'host'
require_relative 'command'
require_relative 'configuration'
require_relative 'coordinator'
require_relative 'logger'
require_relative 'log_message'
require_relative 'formatters/abstract'
require_relative 'formatters/black_hole'
require_relative 'formatters/pretty'
require_relative 'formatters/dot'
require_relative 'runners/abstract'
require_relative 'runners/sequential'
require_relative 'runners/parallel'
require_relative 'runners/group'
require_relative 'backends/abstract'
require_relative 'backends/printer'
require_relative 'backends/netssh'
require_relative 'backends/local'
## Instruction:
Remove sshkit/dsl from the default required files
## Code After:
require_relative '../core_ext/array'
require_relative '../core_ext/hash'
require_relative 'host'
require_relative 'command'
require_relative 'configuration'
require_relative 'coordinator'
require_relative 'logger'
require_relative 'log_message'
require_relative 'formatters/abstract'
require_relative 'formatters/black_hole'
require_relative 'formatters/pretty'
require_relative 'formatters/dot'
require_relative 'runners/abstract'
require_relative 'runners/sequential'
require_relative 'runners/parallel'
require_relative 'runners/group'
require_relative 'backends/abstract'
require_relative 'backends/printer'
require_relative 'backends/netssh'
require_relative 'backends/local'
| require_relative '../core_ext/array'
require_relative '../core_ext/hash'
- require_relative 'dsl'
require_relative 'host'
require_relative 'command'
require_relative 'configuration'
require_relative 'coordinator'
require_relative 'logger'
require_relative 'log_message'
require_relative 'formatters/abstract'
require_relative 'formatters/black_hole'
require_relative 'formatters/pretty'
require_relative 'formatters/dot'
require_relative 'runners/abstract'
require_relative 'runners/sequential'
require_relative 'runners/parallel'
require_relative 'runners/group'
require_relative 'backends/abstract'
require_relative 'backends/printer'
require_relative 'backends/netssh'
require_relative 'backends/local' | 1 | 0.037037 | 0 | 1 |
5124764b1672c4440c1d018245397f5c6c762afe | src/block-management/listen-for-new-blocks.js | src/block-management/listen-for-new-blocks.js | "use strict";
var startPollingForBlocks = require("./start-polling-for-blocks");
function listenForNewBlocks() {
return function (dispatch) {
dispatch(startPollingForBlocks());
};
}
module.exports = listenForNewBlocks;
| "use strict";
var startPollingForBlocks = require("./start-polling-for-blocks");
var subscribeToNewBlockNotifications = require("./subscribe-to-new-block-notifications");
var unsubscribeFromNewBlockNotifications = require("./unsubscribe-from-new-block-notifications");
var isMetaMask = require("../utils/is-meta-mask");
function listenForNewBlocks() {
return function (dispatch) {
if (isMetaMask()) { // MetaMask doesn't throw an error on eth_subscribe, but doesn't actually send block notifications, so we need to poll instead...
dispatch(startPollingForBlocks());
} else {
dispatch(subscribeToNewBlockNotifications(function (err) {
console.info("[ethrpc] eth_subscribe request failed, fall back to polling for blocks:", err.message);
dispatch(unsubscribeFromNewBlockNotifications());
dispatch(startPollingForBlocks());
}));
}
};
}
module.exports = listenForNewBlocks;
| Revert "just poll instead of sometimes trying subscribe" | Revert "just poll instead of sometimes trying subscribe"
This reverts commit e900fa6d3396dd38717b33e27ef6817434800647.
| JavaScript | mit | ethereumjs/ethrpc,ethereumjs/ethrpc | javascript | ## Code Before:
"use strict";
var startPollingForBlocks = require("./start-polling-for-blocks");
function listenForNewBlocks() {
return function (dispatch) {
dispatch(startPollingForBlocks());
};
}
module.exports = listenForNewBlocks;
## Instruction:
Revert "just poll instead of sometimes trying subscribe"
This reverts commit e900fa6d3396dd38717b33e27ef6817434800647.
## Code After:
"use strict";
var startPollingForBlocks = require("./start-polling-for-blocks");
var subscribeToNewBlockNotifications = require("./subscribe-to-new-block-notifications");
var unsubscribeFromNewBlockNotifications = require("./unsubscribe-from-new-block-notifications");
var isMetaMask = require("../utils/is-meta-mask");
function listenForNewBlocks() {
return function (dispatch) {
if (isMetaMask()) { // MetaMask doesn't throw an error on eth_subscribe, but doesn't actually send block notifications, so we need to poll instead...
dispatch(startPollingForBlocks());
} else {
dispatch(subscribeToNewBlockNotifications(function (err) {
console.info("[ethrpc] eth_subscribe request failed, fall back to polling for blocks:", err.message);
dispatch(unsubscribeFromNewBlockNotifications());
dispatch(startPollingForBlocks());
}));
}
};
}
module.exports = listenForNewBlocks;
| "use strict";
var startPollingForBlocks = require("./start-polling-for-blocks");
+ var subscribeToNewBlockNotifications = require("./subscribe-to-new-block-notifications");
+ var unsubscribeFromNewBlockNotifications = require("./unsubscribe-from-new-block-notifications");
+ var isMetaMask = require("../utils/is-meta-mask");
function listenForNewBlocks() {
return function (dispatch) {
+ if (isMetaMask()) { // MetaMask doesn't throw an error on eth_subscribe, but doesn't actually send block notifications, so we need to poll instead...
- dispatch(startPollingForBlocks());
+ dispatch(startPollingForBlocks());
? ++
+ } else {
+ dispatch(subscribeToNewBlockNotifications(function (err) {
+ console.info("[ethrpc] eth_subscribe request failed, fall back to polling for blocks:", err.message);
+ dispatch(unsubscribeFromNewBlockNotifications());
+ dispatch(startPollingForBlocks());
+ }));
+ }
};
}
module.exports = listenForNewBlocks; | 13 | 1.181818 | 12 | 1 |
07bf95e2e723f4c6b938eaeaed8b63f42790ad9b | docs/ja/source/_theme/asakusa/static/asakusa.css | docs/ja/source/_theme/asakusa/static/asakusa.css | @import url('sphinxdoc.css');
.section h1 {
font-size: 1.7em;
}
.section h2 {
font-size: 1.4em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-top: 0.2em;
padding-bottom: 0.3em;
padding-left: 0.3em;
border-bottom: thin solid gray;
border-left: thick solid gray;
}
.section h3 {
font-size: 1.3em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.2em;
border-bottom: thin solid gray;
}
.section h4 {
font-size: 1.2em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted gray;
}
.section h5 {
font-size: 1.1em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted lightgrey;
}
.section caption {
margin-top: 1em;
margin-bottom: 0.3em;
}
| @import url('sphinxdoc.css');
.section h1 {
font-size: 1.7em;
}
.section h2 {
font-size: 1.4em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-top: 0.2em;
padding-bottom: 0.3em;
padding-left: 0.3em;
border-bottom: thin solid gray;
border-left: thick solid gray;
}
.section h3 {
font-size: 1.3em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.2em;
border-bottom: thin solid gray;
}
.section h4 {
font-size: 1.2em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted gray;
}
.section h5 {
font-size: 1.1em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted lightgrey;
}
.section caption {
margin-top: 1em;
margin-bottom: 0.3em;
}
div.attention p.admonition-title {
background-color: #EE9816;
}
| Modify attention directive block color | Modify attention directive block color
| CSS | apache-2.0 | asakusafw/asakusafw-m3bp,ashigeru/asakusafw-m3bp,asakusafw/asakusafw-m3bp,ashigeru/asakusafw-m3bp,asakusafw/asakusafw-m3bp,ashigeru/asakusafw-m3bp | css | ## Code Before:
@import url('sphinxdoc.css');
.section h1 {
font-size: 1.7em;
}
.section h2 {
font-size: 1.4em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-top: 0.2em;
padding-bottom: 0.3em;
padding-left: 0.3em;
border-bottom: thin solid gray;
border-left: thick solid gray;
}
.section h3 {
font-size: 1.3em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.2em;
border-bottom: thin solid gray;
}
.section h4 {
font-size: 1.2em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted gray;
}
.section h5 {
font-size: 1.1em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted lightgrey;
}
.section caption {
margin-top: 1em;
margin-bottom: 0.3em;
}
## Instruction:
Modify attention directive block color
## Code After:
@import url('sphinxdoc.css');
.section h1 {
font-size: 1.7em;
}
.section h2 {
font-size: 1.4em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-top: 0.2em;
padding-bottom: 0.3em;
padding-left: 0.3em;
border-bottom: thin solid gray;
border-left: thick solid gray;
}
.section h3 {
font-size: 1.3em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.2em;
border-bottom: thin solid gray;
}
.section h4 {
font-size: 1.2em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted gray;
}
.section h5 {
font-size: 1.1em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted lightgrey;
}
.section caption {
margin-top: 1em;
margin-bottom: 0.3em;
}
div.attention p.admonition-title {
background-color: #EE9816;
}
| @import url('sphinxdoc.css');
.section h1 {
font-size: 1.7em;
}
.section h2 {
font-size: 1.4em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-top: 0.2em;
padding-bottom: 0.3em;
padding-left: 0.3em;
border-bottom: thin solid gray;
border-left: thick solid gray;
}
.section h3 {
font-size: 1.3em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.2em;
border-bottom: thin solid gray;
}
.section h4 {
font-size: 1.2em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted gray;
}
.section h5 {
font-size: 1.1em;
margin-top: 1.0em;
margin-bottom: 0.1em;
padding-bottom: 0.1em;
border-bottom: thin dotted lightgrey;
}
.section caption {
margin-top: 1em;
margin-bottom: 0.3em;
}
+ div.attention p.admonition-title {
+ background-color: #EE9816;
+ } | 3 | 0.065217 | 3 | 0 |
434c022343039c7c2ebc9262b192877149ab1ec8 | Server/sodium-pathway-93914/get-lesson.php | Server/sodium-pathway-93914/get-lesson.php | <?php
include 'checklogged.php';
$code = $_GET['code'];
//make code uppercase cause the user probably screwed up
$code = strtoupper($code);
//connect to the MySQL database
$$db = null;
if(isset($_SERVER['SERVER_SOFTWARE']) && strpos($_SERVER['SERVER_SOFTWARE'],'Google App Engine') !== false){
//connect to the MySQL database on app engine
$db = new pdo('mysql:unix_socket=/cloudsql/sodium-pathway-93914:users;dbname=lessons',
'root', // username
'xGQEsWRd39G3UrGU' // password
);
}
else{
$db = new pdo('mysql:host=127.0.0.1:3307;dbname=lessons',
'root', // username
'xGQEsWRd39G3UrGU' // password
);
}
//prevent emulated prepared statements to prevent against SQL injection
$db->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
//attempt to query the database for the user
$stmt = $db->prepare('SELECT JSON FROM Lessons WHERE Code=? LIMIT 1');
$stmt->execute(array($code));
if ($stmt->rowCount() > 0 ) {
$queryResult = $stmt->fetch();
echo $queryResult['JSON'];
}
else{
echo '{"status":"fail","message":"Code not found"}';
}
?> | <?php
include 'checklogged.php';
include 'GDS/GDS.php';
$code = $_GET['code'];
$obj_store = new GDS\Store('Lessons');
//make code uppercase cause the user probably screwed up
$code = strtoupper($code);
$result = $obj_store->fetchOne('SELECT * From Lessons WHERE Code=@code',['code'=>$code]);
if($result == null)
echo json_encode(['status'=>'fail','message'=>'Code not found']);
else
echo $result->JSON;
?> | Migrate lesson getter to cloud datastore | Migrate lesson getter to cloud datastore
| PHP | apache-2.0 | alex-taffe/MSJS,alex-taffe/MSJS,alex-taffe/MSJS | php | ## Code Before:
<?php
include 'checklogged.php';
$code = $_GET['code'];
//make code uppercase cause the user probably screwed up
$code = strtoupper($code);
//connect to the MySQL database
$$db = null;
if(isset($_SERVER['SERVER_SOFTWARE']) && strpos($_SERVER['SERVER_SOFTWARE'],'Google App Engine') !== false){
//connect to the MySQL database on app engine
$db = new pdo('mysql:unix_socket=/cloudsql/sodium-pathway-93914:users;dbname=lessons',
'root', // username
'xGQEsWRd39G3UrGU' // password
);
}
else{
$db = new pdo('mysql:host=127.0.0.1:3307;dbname=lessons',
'root', // username
'xGQEsWRd39G3UrGU' // password
);
}
//prevent emulated prepared statements to prevent against SQL injection
$db->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
//attempt to query the database for the user
$stmt = $db->prepare('SELECT JSON FROM Lessons WHERE Code=? LIMIT 1');
$stmt->execute(array($code));
if ($stmt->rowCount() > 0 ) {
$queryResult = $stmt->fetch();
echo $queryResult['JSON'];
}
else{
echo '{"status":"fail","message":"Code not found"}';
}
?>
## Instruction:
Migrate lesson getter to cloud datastore
## Code After:
<?php
include 'checklogged.php';
include 'GDS/GDS.php';
$code = $_GET['code'];
$obj_store = new GDS\Store('Lessons');
//make code uppercase cause the user probably screwed up
$code = strtoupper($code);
$result = $obj_store->fetchOne('SELECT * From Lessons WHERE Code=@code',['code'=>$code]);
if($result == null)
echo json_encode(['status'=>'fail','message'=>'Code not found']);
else
echo $result->JSON;
?> | <?php
include 'checklogged.php';
+ include 'GDS/GDS.php';
$code = $_GET['code'];
+ $obj_store = new GDS\Store('Lessons');
//make code uppercase cause the user probably screwed up
$code = strtoupper($code);
+
+ $result = $obj_store->fetchOne('SELECT * From Lessons WHERE Code=@code',['code'=>$code]);
+ if($result == null)
+ echo json_encode(['status'=>'fail','message'=>'Code not found']);
- //connect to the MySQL database
- $$db = null;
- if(isset($_SERVER['SERVER_SOFTWARE']) && strpos($_SERVER['SERVER_SOFTWARE'],'Google App Engine') !== false){
- //connect to the MySQL database on app engine
- $db = new pdo('mysql:unix_socket=/cloudsql/sodium-pathway-93914:users;dbname=lessons',
- 'root', // username
- 'xGQEsWRd39G3UrGU' // password
- );
- }
- else{
? -
+ else
- $db = new pdo('mysql:host=127.0.0.1:3307;dbname=lessons',
- 'root', // username
- 'xGQEsWRd39G3UrGU' // password
- );
- }
- //prevent emulated prepared statements to prevent against SQL injection
- $db->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
-
- //attempt to query the database for the user
- $stmt = $db->prepare('SELECT JSON FROM Lessons WHERE Code=? LIMIT 1');
- $stmt->execute(array($code));
- if ($stmt->rowCount() > 0 ) {
- $queryResult = $stmt->fetch();
- echo $queryResult['JSON'];
? --- -- ^^ --
+ echo $result->JSON;
? ^^
- }
- else{
- echo '{"status":"fail","message":"Code not found"}';
- }
?> | 36 | 0.972973 | 8 | 28 |
cc4cee1f879e18d9e192b31dd1bad8f556cb82ca | README.template.md | README.template.md | ios-sim
=======
The ios-sim tool is a command-line utility that launches an iOS application on
the iOS Simulator. This allows for niceties such as automated testing without
having to open XCode.
Features
--------
* Choose the device family to simulate, i.e. iPhone or iPad.
* Setup environment variables.
* Pass arguments to the application.
* See the stdout and stderr, or redirect them to files.
See the `--help` option for more info.
Installation
------------
Through homebrew:
$ brew install ios-sim
Download an archive:
$ curl -L https://github.com/Fingertips/ios-sim/zipball/{{VERSION}} -o ios-sim-{{VERSION}}.zip
$ unzip ios-sim-{{VERSION}}.zip
Or from a git clone:
$ git clone git://github.com/Fingertips/ios-sim.git
Then build and install from the source root:
$ rake install prefix=/usr/local/
License
-------
Original author: Landon Fuller <landonf@plausiblelabs.com>
Copyright (c) 2008-2011 Plausible Labs Cooperative, Inc.
All rights reserved.
This project is available under the MIT license. See [LICENSE][license].
[license]: https://github.com/Fingertips/ios-sim/blob/master/LICENSE
| ios-sim
=======
The ios-sim tool is a command-line utility that launches an iOS application on
the iOS Simulator. This allows for niceties such as automated testing without
having to open XCode.
Features
--------
* Choose the device family to simulate, i.e. iPhone or iPad.
* Setup environment variables.
* Pass arguments to the application.
* See the stdout and stderr, or redirect them to files.
See the `--help` option for more info.
Installation
------------
Through homebrew:
$ brew install ios-sim
Download an archive:
$ curl -L https://github.com/Fingertips/ios-sim/zipball/{{VERSION}} -o ios-sim-{{VERSION}}.zip
$ unzip ios-sim-{{VERSION}}.zip
Or from a git clone:
$ git clone git://github.com/Fingertips/ios-sim.git
Then build and install from the source root:
$ rake install prefix=/usr/local/
tmux
-----
To get ios-sim to launch correctly within tmux use the reattach-to-user-namespace wrapper.
```
reattach-to-user-namespace ios-sim launch ./build/MyTestApp.app
```
*source:* https://github.com/ChrisJohnsen/tmux-MacOSX-pasteboard
*brew:* ```brew install reattach-to-user-namespace```
License
-------
Original author: Landon Fuller <landonf@plausiblelabs.com>
Copyright (c) 2008-2011 Plausible Labs Cooperative, Inc.
All rights reserved.
This project is available under the MIT license. See [LICENSE][license].
[license]: https://github.com/Fingertips/ios-sim/blob/master/LICENSE
| Update README for tmux instructions | Update README for tmux instructions
thanks to @ecornell (https://github.com/phonegap/ios-sim/issues/21#issuecomment-18883636) | Markdown | mit | dpogue/ios-sim,cra5370/ios-sim,cra5370/ios-sim,dpogue/ios-sim,phonegap/ios-sim | markdown | ## Code Before:
ios-sim
=======
The ios-sim tool is a command-line utility that launches an iOS application on
the iOS Simulator. This allows for niceties such as automated testing without
having to open XCode.
Features
--------
* Choose the device family to simulate, i.e. iPhone or iPad.
* Setup environment variables.
* Pass arguments to the application.
* See the stdout and stderr, or redirect them to files.
See the `--help` option for more info.
Installation
------------
Through homebrew:
$ brew install ios-sim
Download an archive:
$ curl -L https://github.com/Fingertips/ios-sim/zipball/{{VERSION}} -o ios-sim-{{VERSION}}.zip
$ unzip ios-sim-{{VERSION}}.zip
Or from a git clone:
$ git clone git://github.com/Fingertips/ios-sim.git
Then build and install from the source root:
$ rake install prefix=/usr/local/
License
-------
Original author: Landon Fuller <landonf@plausiblelabs.com>
Copyright (c) 2008-2011 Plausible Labs Cooperative, Inc.
All rights reserved.
This project is available under the MIT license. See [LICENSE][license].
[license]: https://github.com/Fingertips/ios-sim/blob/master/LICENSE
## Instruction:
Update README for tmux instructions
thanks to @ecornell (https://github.com/phonegap/ios-sim/issues/21#issuecomment-18883636)
## Code After:
ios-sim
=======
The ios-sim tool is a command-line utility that launches an iOS application on
the iOS Simulator. This allows for niceties such as automated testing without
having to open XCode.
Features
--------
* Choose the device family to simulate, i.e. iPhone or iPad.
* Setup environment variables.
* Pass arguments to the application.
* See the stdout and stderr, or redirect them to files.
See the `--help` option for more info.
Installation
------------
Through homebrew:
$ brew install ios-sim
Download an archive:
$ curl -L https://github.com/Fingertips/ios-sim/zipball/{{VERSION}} -o ios-sim-{{VERSION}}.zip
$ unzip ios-sim-{{VERSION}}.zip
Or from a git clone:
$ git clone git://github.com/Fingertips/ios-sim.git
Then build and install from the source root:
$ rake install prefix=/usr/local/
tmux
-----
To get ios-sim to launch correctly within tmux use the reattach-to-user-namespace wrapper.
```
reattach-to-user-namespace ios-sim launch ./build/MyTestApp.app
```
*source:* https://github.com/ChrisJohnsen/tmux-MacOSX-pasteboard
*brew:* ```brew install reattach-to-user-namespace```
License
-------
Original author: Landon Fuller <landonf@plausiblelabs.com>
Copyright (c) 2008-2011 Plausible Labs Cooperative, Inc.
All rights reserved.
This project is available under the MIT license. See [LICENSE][license].
[license]: https://github.com/Fingertips/ios-sim/blob/master/LICENSE
| ios-sim
=======
The ios-sim tool is a command-line utility that launches an iOS application on
the iOS Simulator. This allows for niceties such as automated testing without
having to open XCode.
Features
--------
* Choose the device family to simulate, i.e. iPhone or iPad.
* Setup environment variables.
* Pass arguments to the application.
* See the stdout and stderr, or redirect them to files.
See the `--help` option for more info.
Installation
------------
Through homebrew:
$ brew install ios-sim
Download an archive:
$ curl -L https://github.com/Fingertips/ios-sim/zipball/{{VERSION}} -o ios-sim-{{VERSION}}.zip
$ unzip ios-sim-{{VERSION}}.zip
Or from a git clone:
$ git clone git://github.com/Fingertips/ios-sim.git
Then build and install from the source root:
$ rake install prefix=/usr/local/
+ tmux
+ -----
+
+ To get ios-sim to launch correctly within tmux use the reattach-to-user-namespace wrapper.
+
+ ```
+ reattach-to-user-namespace ios-sim launch ./build/MyTestApp.app
+ ```
+ *source:* https://github.com/ChrisJohnsen/tmux-MacOSX-pasteboard
+
+ *brew:* ```brew install reattach-to-user-namespace```
+
License
-------
Original author: Landon Fuller <landonf@plausiblelabs.com>
Copyright (c) 2008-2011 Plausible Labs Cooperative, Inc.
All rights reserved.
This project is available under the MIT license. See [LICENSE][license].
[license]: https://github.com/Fingertips/ios-sim/blob/master/LICENSE | 12 | 0.255319 | 12 | 0 |
b41f864ed9d2e231b42f0217f2e960e5a0ede9b6 | eval-requirements.txt | eval-requirements.txt | --upgrade pip
-e git+https://github.com/uw-it-aca/scout@qa#egg=scout
-e git+https://github.com/uw-it-aca/scout-manager@qa#egg=scout-manager
-e git+https://github.com/uw-it-aca/spotseeker_client@master#egg=spotseeker_restclient
-e git+https://github.com/charlon/django-hybridize@master#egg=hybridize
| -e git+https://github.com/uw-it-aca/scout@qa#egg=scout
-e git+https://github.com/uw-it-aca/scout-manager@qa#egg=scout-manager
-e git+https://github.com/uw-it-aca/spotseeker_client@master#egg=spotseeker_restclient
-e git+https://github.com/charlon/django-hybridize@master#egg=hybridize
| Revert "Attempt upgrading pip as part of the requirements install." | Revert "Attempt upgrading pip as part of the requirements install."
This reverts commit 6ed81bd92bca6fe32d6194ebf6ab13bd255f98ee.
| Text | apache-2.0 | uw-it-aca/scout-vagrant | text | ## Code Before:
--upgrade pip
-e git+https://github.com/uw-it-aca/scout@qa#egg=scout
-e git+https://github.com/uw-it-aca/scout-manager@qa#egg=scout-manager
-e git+https://github.com/uw-it-aca/spotseeker_client@master#egg=spotseeker_restclient
-e git+https://github.com/charlon/django-hybridize@master#egg=hybridize
## Instruction:
Revert "Attempt upgrading pip as part of the requirements install."
This reverts commit 6ed81bd92bca6fe32d6194ebf6ab13bd255f98ee.
## Code After:
-e git+https://github.com/uw-it-aca/scout@qa#egg=scout
-e git+https://github.com/uw-it-aca/scout-manager@qa#egg=scout-manager
-e git+https://github.com/uw-it-aca/spotseeker_client@master#egg=spotseeker_restclient
-e git+https://github.com/charlon/django-hybridize@master#egg=hybridize
| - --upgrade pip
-e git+https://github.com/uw-it-aca/scout@qa#egg=scout
-e git+https://github.com/uw-it-aca/scout-manager@qa#egg=scout-manager
-e git+https://github.com/uw-it-aca/spotseeker_client@master#egg=spotseeker_restclient
-e git+https://github.com/charlon/django-hybridize@master#egg=hybridize | 1 | 0.2 | 0 | 1 |
a6ef6f532f830af7f0bf8d5563b3f7aaffd7d301 | packages/te/text-render.yaml | packages/te/text-render.yaml | homepage: http://github.com/thinkpad20/text-render
changelog-type: ''
hash: 00a7c0a8935fa716b24d5f2dfdd20f84c5524d62a75422b441a1a47dde74849e
test-bench-deps: {}
maintainer: ithinkican@gmail.com
synopsis: A type class for rendering objects as text, pretty-printing, etc.
changelog: ''
basic-deps:
base: ! '>=4.8 && <4.9'
text: -any
parsec: -any
mtl: -any
classy-prelude: -any
all-versions:
- '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
author: Allen Nelson
latest: '0.1.0.2'
description-type: haddock
description: ''
license-name: MIT
| homepage: http://github.com/thinkpad20/text-render
changelog-type: ''
hash: ca6381f3968d4dadcec23184ff864bd43071afe35d3dbf302c013cde32f3b758
test-bench-deps: {}
maintainer: ithinkican@gmail.com
synopsis: A type class for rendering objects as text, pretty-printing, etc.
changelog: ''
basic-deps:
base: ! '>=4.8 && <5'
text: -any
parsec: -any
mtl: -any
classy-prelude: -any
all-versions:
- '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
- '0.1.0.3'
author: Allen Nelson
latest: '0.1.0.3'
description-type: haddock
description: ''
license-name: MIT
| Update from Hackage at 2017-09-01T19:00:37Z | Update from Hackage at 2017-09-01T19:00:37Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: http://github.com/thinkpad20/text-render
changelog-type: ''
hash: 00a7c0a8935fa716b24d5f2dfdd20f84c5524d62a75422b441a1a47dde74849e
test-bench-deps: {}
maintainer: ithinkican@gmail.com
synopsis: A type class for rendering objects as text, pretty-printing, etc.
changelog: ''
basic-deps:
base: ! '>=4.8 && <4.9'
text: -any
parsec: -any
mtl: -any
classy-prelude: -any
all-versions:
- '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
author: Allen Nelson
latest: '0.1.0.2'
description-type: haddock
description: ''
license-name: MIT
## Instruction:
Update from Hackage at 2017-09-01T19:00:37Z
## Code After:
homepage: http://github.com/thinkpad20/text-render
changelog-type: ''
hash: ca6381f3968d4dadcec23184ff864bd43071afe35d3dbf302c013cde32f3b758
test-bench-deps: {}
maintainer: ithinkican@gmail.com
synopsis: A type class for rendering objects as text, pretty-printing, etc.
changelog: ''
basic-deps:
base: ! '>=4.8 && <5'
text: -any
parsec: -any
mtl: -any
classy-prelude: -any
all-versions:
- '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
- '0.1.0.3'
author: Allen Nelson
latest: '0.1.0.3'
description-type: haddock
description: ''
license-name: MIT
| homepage: http://github.com/thinkpad20/text-render
changelog-type: ''
- hash: 00a7c0a8935fa716b24d5f2dfdd20f84c5524d62a75422b441a1a47dde74849e
+ hash: ca6381f3968d4dadcec23184ff864bd43071afe35d3dbf302c013cde32f3b758
test-bench-deps: {}
maintainer: ithinkican@gmail.com
synopsis: A type class for rendering objects as text, pretty-printing, etc.
changelog: ''
basic-deps:
- base: ! '>=4.8 && <4.9'
? ^^^
+ base: ! '>=4.8 && <5'
? ^
text: -any
parsec: -any
mtl: -any
classy-prelude: -any
all-versions:
- '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
+ - '0.1.0.3'
author: Allen Nelson
- latest: '0.1.0.2'
? ^
+ latest: '0.1.0.3'
? ^
description-type: haddock
description: ''
license-name: MIT | 7 | 0.318182 | 4 | 3 |
033d81597a1499a45b1c9e14a94f3faf0e5a0719 | lib/cli.js | lib/cli.js | 'use strict'
const fs = require('fs')
const argv = require('yargs')
.usage('Usage: $0 [<task to run> ...]')
.example('$0', 'Runs task "default" from file "tasks.js". Will try "tasks.json" if JS file does not exist.')
.example('$0 build test', 'Runs task "build" and then "test".')
.help()
.argv
const taskRunner = require('./index')
const loader = require('./loader')
const tasksToRun = argv._.length > 0 ? argv._ : [ 'default' ]
let taskFile
if (fileExists('tasks.js')) {
taskFile = 'tasks.js'
} else if (fileExists('tasks.json')) {
taskFile = 'tasks.json'
} else {
throw new Error('Neither tasks.js nor tasks.json file found in current working directory.')
}
const tasks = loader.loadTasksFromFile(taskFile)
const filteredTasks = {}
tasksToRun.forEach((taskName) => {
if (tasks[ taskName ]) {
filteredTasks[ taskName ] = tasks[ taskName ]
} else {
throw new Error(`Task is not defined: ${taskName}`)
}
})
taskRunner.runTasks(filteredTasks)
function fileExists (filePath) {
let stat
try {
stat = fs.statSync(filePath)
} catch (error) {
return false
}
return stat.isFile()
}
| 'use strict'
const fs = require('fs')
const argv = require('yargs')
.usage('Usage: $0 [<task to run> ...]')
.example('$0', 'Runs task "default" from file "tasks.js". Will try "tasks.json" if JS file does not exist.')
.example('$0 build test', 'Runs task "build" and then "test".')
.help()
.argv
const taskRunner = require('./index')
const loader = require('./loader')
const tasksToRun = argv._.length > 0 ? argv._ : [ 'default' ]
let taskFile
if (fileExists('tasks.js')) {
taskFile = 'tasks.js'
} else if (fileExists('tasks.json')) {
taskFile = 'tasks.json'
} else {
throw new Error('Neither tasks.js nor tasks.json file found in current working directory.')
}
const tasks = loader.loadTasksFromFile(taskFile)
const filteredTasks = {}
tasksToRun.forEach((taskName) => {
if (tasks[ taskName ]) {
filteredTasks[ taskName ] = tasks[ taskName ]
} else {
throw new Error(`Task is not defined: ${taskName}`)
}
})
taskRunner
.runTasks(filteredTasks)
.catch((error) => {
if (error instanceof Error) {
console.error(error.stack)
} else {
console.error(error)
}
})
function fileExists (filePath) {
let stat
try {
stat = fs.statSync(filePath)
} catch (error) {
return false
}
return stat.isFile()
}
| Implement basic in-task error handling | Implement basic in-task error handling
| JavaScript | mit | andywer/npm-launch | javascript | ## Code Before:
'use strict'
const fs = require('fs')
const argv = require('yargs')
.usage('Usage: $0 [<task to run> ...]')
.example('$0', 'Runs task "default" from file "tasks.js". Will try "tasks.json" if JS file does not exist.')
.example('$0 build test', 'Runs task "build" and then "test".')
.help()
.argv
const taskRunner = require('./index')
const loader = require('./loader')
const tasksToRun = argv._.length > 0 ? argv._ : [ 'default' ]
let taskFile
if (fileExists('tasks.js')) {
taskFile = 'tasks.js'
} else if (fileExists('tasks.json')) {
taskFile = 'tasks.json'
} else {
throw new Error('Neither tasks.js nor tasks.json file found in current working directory.')
}
const tasks = loader.loadTasksFromFile(taskFile)
const filteredTasks = {}
tasksToRun.forEach((taskName) => {
if (tasks[ taskName ]) {
filteredTasks[ taskName ] = tasks[ taskName ]
} else {
throw new Error(`Task is not defined: ${taskName}`)
}
})
taskRunner.runTasks(filteredTasks)
function fileExists (filePath) {
let stat
try {
stat = fs.statSync(filePath)
} catch (error) {
return false
}
return stat.isFile()
}
## Instruction:
Implement basic in-task error handling
## Code After:
'use strict'
const fs = require('fs')
const argv = require('yargs')
.usage('Usage: $0 [<task to run> ...]')
.example('$0', 'Runs task "default" from file "tasks.js". Will try "tasks.json" if JS file does not exist.')
.example('$0 build test', 'Runs task "build" and then "test".')
.help()
.argv
const taskRunner = require('./index')
const loader = require('./loader')
const tasksToRun = argv._.length > 0 ? argv._ : [ 'default' ]
let taskFile
if (fileExists('tasks.js')) {
taskFile = 'tasks.js'
} else if (fileExists('tasks.json')) {
taskFile = 'tasks.json'
} else {
throw new Error('Neither tasks.js nor tasks.json file found in current working directory.')
}
const tasks = loader.loadTasksFromFile(taskFile)
const filteredTasks = {}
tasksToRun.forEach((taskName) => {
if (tasks[ taskName ]) {
filteredTasks[ taskName ] = tasks[ taskName ]
} else {
throw new Error(`Task is not defined: ${taskName}`)
}
})
taskRunner
.runTasks(filteredTasks)
.catch((error) => {
if (error instanceof Error) {
console.error(error.stack)
} else {
console.error(error)
}
})
function fileExists (filePath) {
let stat
try {
stat = fs.statSync(filePath)
} catch (error) {
return false
}
return stat.isFile()
}
| 'use strict'
const fs = require('fs')
const argv = require('yargs')
.usage('Usage: $0 [<task to run> ...]')
.example('$0', 'Runs task "default" from file "tasks.js". Will try "tasks.json" if JS file does not exist.')
.example('$0 build test', 'Runs task "build" and then "test".')
.help()
.argv
const taskRunner = require('./index')
const loader = require('./loader')
const tasksToRun = argv._.length > 0 ? argv._ : [ 'default' ]
let taskFile
if (fileExists('tasks.js')) {
taskFile = 'tasks.js'
} else if (fileExists('tasks.json')) {
taskFile = 'tasks.json'
} else {
throw new Error('Neither tasks.js nor tasks.json file found in current working directory.')
}
const tasks = loader.loadTasksFromFile(taskFile)
const filteredTasks = {}
tasksToRun.forEach((taskName) => {
if (tasks[ taskName ]) {
filteredTasks[ taskName ] = tasks[ taskName ]
} else {
throw new Error(`Task is not defined: ${taskName}`)
}
})
+ taskRunner
- taskRunner.runTasks(filteredTasks)
? ^^^^^^^^^^
+ .runTasks(filteredTasks)
? ^^
+ .catch((error) => {
+ if (error instanceof Error) {
+ console.error(error.stack)
+ } else {
+ console.error(error)
+ }
+ })
function fileExists (filePath) {
let stat
try {
stat = fs.statSync(filePath)
} catch (error) {
return false
}
return stat.isFile()
} | 10 | 0.212766 | 9 | 1 |
607e0408a46dc3c786db665eaf3a93f8a0405be4 | spec/unit/cronenberg__config_spec.rb | spec/unit/cronenberg__config_spec.rb | require 'spec_helper'
require 'cronenberg/config'
require 'rspec/mocks/standalone'
describe Cronenberg::Config do
context "#initialize" do
let(:config_hash) do
{
host: 'vsphere.pizza.com',
user: 'pizzamaster5000',
password: 'pizzapass',
}
end
context "with no env variables or config file" do
it "should raise a RunTime error" do
expect{ Cronenberg::Config.new }.to raise_error(RuntimeError)
end
end
context "with a valid configuration hash" do
before do
allow_any_instance_of(Cronenberg::Config).to receive(:process_environment_variables).and_return(config_hash)
end
it "returns valid config items" do
config = Cronenberg::Config.new
expect(config.host).to eq(config_hash[:host])
expect(config.user).to eq(config_hash[:user])
expect(config.password).to eq(config_hash[:password])
end
end
end
end
| require 'spec_helper'
require 'cronenberg/config'
require 'rspec/mocks/standalone'
describe Cronenberg::Config do
context "#initialize" do
let(:config_hash) do
{
host: 'vsphere.pizza.com',
user: 'pizzamaster5000',
password: 'pizzapass',
}
end
let(:config_hash_partial) do
{
host: 'vsphere.pizza.com',
user: 'pizzamaster5000',
}
end
context "with no env variables or config file" do
it "should raise a RunTime error" do
expect{ Cronenberg::Config.new }.to raise_error(RuntimeError)
end
end
context "with partial env vars or config file" do
before do
allow_any_instance_of(Cronenberg::Config).to receive(:process_environment_variables).and_return(config_hash_partial)
end
it "should raise a RunTime error listing missing config items" do
expect { Cronenberg::Config.new }.to raise_error(RuntimeError, /missing settings/)
end
end
context "with a valid configuration hash" do
before do
allow_any_instance_of(Cronenberg::Config).to receive(:process_environment_variables).and_return(config_hash)
end
it "returns valid config items" do
config = Cronenberg::Config.new
expect(config.host).to eq(config_hash[:host])
expect(config.user).to eq(config_hash[:user])
expect(config.password).to eq(config_hash[:password])
end
end
end
end
| Test that Cronenberg::Config can list missing config items. | Test that Cronenberg::Config can list missing config items.
| Ruby | apache-2.0 | pizzaops/cronenberg,pizzaops/cronenberg | ruby | ## Code Before:
require 'spec_helper'
require 'cronenberg/config'
require 'rspec/mocks/standalone'
describe Cronenberg::Config do
context "#initialize" do
let(:config_hash) do
{
host: 'vsphere.pizza.com',
user: 'pizzamaster5000',
password: 'pizzapass',
}
end
context "with no env variables or config file" do
it "should raise a RunTime error" do
expect{ Cronenberg::Config.new }.to raise_error(RuntimeError)
end
end
context "with a valid configuration hash" do
before do
allow_any_instance_of(Cronenberg::Config).to receive(:process_environment_variables).and_return(config_hash)
end
it "returns valid config items" do
config = Cronenberg::Config.new
expect(config.host).to eq(config_hash[:host])
expect(config.user).to eq(config_hash[:user])
expect(config.password).to eq(config_hash[:password])
end
end
end
end
## Instruction:
Test that Cronenberg::Config can list missing config items.
## Code After:
require 'spec_helper'
require 'cronenberg/config'
require 'rspec/mocks/standalone'
describe Cronenberg::Config do
context "#initialize" do
let(:config_hash) do
{
host: 'vsphere.pizza.com',
user: 'pizzamaster5000',
password: 'pizzapass',
}
end
let(:config_hash_partial) do
{
host: 'vsphere.pizza.com',
user: 'pizzamaster5000',
}
end
context "with no env variables or config file" do
it "should raise a RunTime error" do
expect{ Cronenberg::Config.new }.to raise_error(RuntimeError)
end
end
context "with partial env vars or config file" do
before do
allow_any_instance_of(Cronenberg::Config).to receive(:process_environment_variables).and_return(config_hash_partial)
end
it "should raise a RunTime error listing missing config items" do
expect { Cronenberg::Config.new }.to raise_error(RuntimeError, /missing settings/)
end
end
context "with a valid configuration hash" do
before do
allow_any_instance_of(Cronenberg::Config).to receive(:process_environment_variables).and_return(config_hash)
end
it "returns valid config items" do
config = Cronenberg::Config.new
expect(config.host).to eq(config_hash[:host])
expect(config.user).to eq(config_hash[:user])
expect(config.password).to eq(config_hash[:password])
end
end
end
end
| require 'spec_helper'
require 'cronenberg/config'
require 'rspec/mocks/standalone'
describe Cronenberg::Config do
context "#initialize" do
let(:config_hash) do
{
host: 'vsphere.pizza.com',
user: 'pizzamaster5000',
password: 'pizzapass',
}
end
+ let(:config_hash_partial) do
+ {
+ host: 'vsphere.pizza.com',
+ user: 'pizzamaster5000',
+ }
+ end
+
context "with no env variables or config file" do
it "should raise a RunTime error" do
expect{ Cronenberg::Config.new }.to raise_error(RuntimeError)
+ end
+ end
+
+ context "with partial env vars or config file" do
+ before do
+ allow_any_instance_of(Cronenberg::Config).to receive(:process_environment_variables).and_return(config_hash_partial)
+ end
+
+ it "should raise a RunTime error listing missing config items" do
+ expect { Cronenberg::Config.new }.to raise_error(RuntimeError, /missing settings/)
end
end
context "with a valid configuration hash" do
before do
allow_any_instance_of(Cronenberg::Config).to receive(:process_environment_variables).and_return(config_hash)
end
it "returns valid config items" do
config = Cronenberg::Config.new
expect(config.host).to eq(config_hash[:host])
expect(config.user).to eq(config_hash[:user])
expect(config.password).to eq(config_hash[:password])
end
end
end
end
- | 18 | 0.529412 | 17 | 1 |
090a5373730db0b2ba915a6f656d6a9ee6bddbf6 | lib/Phortress/GlobalEnvironment.php | lib/Phortress/GlobalEnvironment.php | <?php
namespace Phortress;
/**
* The global environment for a program.
*
* @package Phortress
*/
class GlobalEnvironment extends NamespaceEnvironment {
/**
* The superglobals for this environment.
*
* @var array
*/
private $superglobals;
public function __construct() {
parent::__construct('Global', null);
$this->superglobals = array(
'GLOBALS' => &$this->variables,
'_SERVER' => array(),
'_GET' => array(),
'_POST' => array(),
'_FILES' => array(),
'_COOKIE' => array(),
'_SESSION' => array(),
'_REQUEST' => array(),
'_ENV' => array()
);
self::copyValueReferences($this->variables, $this->superglobals);
}
public function getGlobal() {
return $this;
}
/**
* Gets the superglobals in this global environment.
*
* @return array
*/
public function &getSuperglobals() {
return $this->superglobals;
}
}
| <?php
namespace Phortress;
use PhpParser\Node\Name;
use PhpParser\Node\Name\Relative;
/**
* The global environment for a program.
*
* @package Phortress
*/
class GlobalEnvironment extends NamespaceEnvironment {
/**
* The superglobals for this environment.
*
* @var array
*/
private $superglobals;
public function __construct() {
parent::__construct('Global', null);
$this->superglobals = array(
'GLOBALS' => &$this->variables,
'_SERVER' => array(),
'_GET' => array(),
'_POST' => array(),
'_FILES' => array(),
'_COOKIE' => array(),
'_SESSION' => array(),
'_REQUEST' => array(),
'_ENV' => array()
);
self::copyValueReferences($this->variables, $this->superglobals);
}
/**
* Gets the superglobals in this global environment.
*
* @return array
*/
public function &getSuperglobals() {
return $this->superglobals;
}
public function getGlobal() {
return $this;
}
public function resolveNamespace(Name $namespaceName = null) {
if (self::isAbsolutelyQualified($namespaceName)) {
$namespaceName = new Relative(
$namespaceName->parts,
$namespaceName->getAttributes());
}
return parent::resolveNamespace($namespaceName);
}
}
| Implement absolutely qualified namespace resolution. | Implement absolutely qualified namespace resolution.
| PHP | mit | lowjoel/phortress | php | ## Code Before:
<?php
namespace Phortress;
/**
* The global environment for a program.
*
* @package Phortress
*/
class GlobalEnvironment extends NamespaceEnvironment {
/**
* The superglobals for this environment.
*
* @var array
*/
private $superglobals;
public function __construct() {
parent::__construct('Global', null);
$this->superglobals = array(
'GLOBALS' => &$this->variables,
'_SERVER' => array(),
'_GET' => array(),
'_POST' => array(),
'_FILES' => array(),
'_COOKIE' => array(),
'_SESSION' => array(),
'_REQUEST' => array(),
'_ENV' => array()
);
self::copyValueReferences($this->variables, $this->superglobals);
}
public function getGlobal() {
return $this;
}
/**
* Gets the superglobals in this global environment.
*
* @return array
*/
public function &getSuperglobals() {
return $this->superglobals;
}
}
## Instruction:
Implement absolutely qualified namespace resolution.
## Code After:
<?php
namespace Phortress;
use PhpParser\Node\Name;
use PhpParser\Node\Name\Relative;
/**
* The global environment for a program.
*
* @package Phortress
*/
class GlobalEnvironment extends NamespaceEnvironment {
/**
* The superglobals for this environment.
*
* @var array
*/
private $superglobals;
public function __construct() {
parent::__construct('Global', null);
$this->superglobals = array(
'GLOBALS' => &$this->variables,
'_SERVER' => array(),
'_GET' => array(),
'_POST' => array(),
'_FILES' => array(),
'_COOKIE' => array(),
'_SESSION' => array(),
'_REQUEST' => array(),
'_ENV' => array()
);
self::copyValueReferences($this->variables, $this->superglobals);
}
/**
* Gets the superglobals in this global environment.
*
* @return array
*/
public function &getSuperglobals() {
return $this->superglobals;
}
public function getGlobal() {
return $this;
}
public function resolveNamespace(Name $namespaceName = null) {
if (self::isAbsolutelyQualified($namespaceName)) {
$namespaceName = new Relative(
$namespaceName->parts,
$namespaceName->getAttributes());
}
return parent::resolveNamespace($namespaceName);
}
}
| <?php
namespace Phortress;
+ use PhpParser\Node\Name;
+ use PhpParser\Node\Name\Relative;
/**
* The global environment for a program.
*
* @package Phortress
*/
class GlobalEnvironment extends NamespaceEnvironment {
/**
* The superglobals for this environment.
*
* @var array
*/
private $superglobals;
public function __construct() {
parent::__construct('Global', null);
$this->superglobals = array(
'GLOBALS' => &$this->variables,
'_SERVER' => array(),
'_GET' => array(),
'_POST' => array(),
'_FILES' => array(),
'_COOKIE' => array(),
'_SESSION' => array(),
'_REQUEST' => array(),
'_ENV' => array()
);
self::copyValueReferences($this->variables, $this->superglobals);
}
- public function getGlobal() {
- return $this;
- }
-
/**
* Gets the superglobals in this global environment.
*
* @return array
*/
public function &getSuperglobals() {
return $this->superglobals;
}
+
+ public function getGlobal() {
+ return $this;
+ }
+
+ public function resolveNamespace(Name $namespaceName = null) {
+ if (self::isAbsolutelyQualified($namespaceName)) {
+ $namespaceName = new Relative(
+ $namespaceName->parts,
+ $namespaceName->getAttributes());
+ }
+
+ return parent::resolveNamespace($namespaceName);
+ }
} | 20 | 0.425532 | 16 | 4 |
eced78384c7424e5cb2195aaf645f03959bc9bc2 | polyfills/console-minimal.js | polyfills/console-minimal.js | /*
* Minimal console.log() polyfill
*/
if (typeof console === 'undefined') {
Object.defineProperty(this, 'console', {
value: {}, writable: true, enumerable: false, configurable: true
});
}
if (typeof console.log === 'undefined') {
Object.defineProperty(this.console, 'log', {
value: function () {
print(Array.prototype.join.call(arguments, ' '));
}, writable: true, enumerable: false, configurable: true
});
}
| /*
* Minimal console.log() polyfill
*/
if (typeof console === 'undefined') {
Object.defineProperty(this, 'console', {
value: {}, writable: true, enumerable: false, configurable: true
});
}
if (typeof console.log === 'undefined') {
(function () {
var origPrint = print; // capture in closure in case changed later
Object.defineProperty(this.console, 'log', {
value: function () {
var strArgs = Array.prototype.map.call(arguments, function (v) { return String(v); });
origPrint(Array.prototype.join.call(strArgs, ' '));
}, writable: true, enumerable: false, configurable: true
});
})();
}
| Improve console polyfill String coercion | Improve console polyfill String coercion
| JavaScript | mit | tassmjau/duktape,reqshark/duktape,nivertech/duktape,eddieh/duktape,markand/duktape,svaarala/duktape,eddieh/duktape,skomski/duktape,kphillisjr/duktape,harold-b/duktape,chenyaqiuqiu/duktape,sloth4413/duktape,skomski/duktape,kphillisjr/duktape,harold-b/duktape,nivertech/duktape,svaarala/duktape,zeropool/duktape,eddieh/duktape,tassmjau/duktape,sloth4413/duktape,kphillisjr/duktape,harold-b/duktape,nivertech/duktape,svaarala/duktape,zeropool/duktape,thurday/duktape,tassmjau/duktape,thurday/duktape,sloth4413/duktape,haosu1987/duktape,jmptrader/duktape,thurday/duktape,markand/duktape,zeropool/duktape,eddieh/duktape,chenyaqiuqiu/duktape,kphillisjr/duktape,thurday/duktape,reqshark/duktape,reqshark/duktape,tassmjau/duktape,thurday/duktape,eddieh/duktape,thurday/duktape,harold-b/duktape,kphillisjr/duktape,nivertech/duktape,jmptrader/duktape,nivertech/duktape,reqshark/duktape,sloth4413/duktape,eddieh/duktape,chenyaqiuqiu/duktape,skomski/duktape,zeropool/duktape,eddieh/duktape,svaarala/duktape,chenyaqiuqiu/duktape,zeropool/duktape,harold-b/duktape,reqshark/duktape,kphillisjr/duktape,svaarala/duktape,harold-b/duktape,sloth4413/duktape,jmptrader/duktape,nivertech/duktape,eddieh/duktape,haosu1987/duktape,thurday/duktape,skomski/duktape,harold-b/duktape,haosu1987/duktape,haosu1987/duktape,kphillisjr/duktape,harold-b/duktape,reqshark/duktape,haosu1987/duktape,skomski/duktape,chenyaqiuqiu/duktape,svaarala/duktape,zeropool/duktape,tassmjau/duktape,markand/duktape,jmptrader/duktape,harold-b/duktape,sloth4413/duktape,thurday/duktape,chenyaqiuqiu/duktape,markand/duktape,skomski/duktape,kphillisjr/duktape,nivertech/duktape,nivertech/duktape,sloth4413/duktape,jmptrader/duktape,jmptrader/duktape,jmptrader/duktape,kphillisjr/duktape,haosu1987/duktape,reqshark/duktape,chenyaqiuqiu/duktape,thurday/duktape,eddieh/duktape,markand/duktape,reqshark/duktape,skomski/duktape,tassmjau/duktape,markand/duktape,haosu1987/duktape,tassmjau/duktape,chenyaqiuqiu/duktape,haosu1987/duktape,tassmjau/duktape,jmptrader/duktape,jmptrader/duktape,nivertech/duktape,markand/duktape,skomski/duktape,skomski/duktape,svaarala/duktape,reqshark/duktape,jmptrader/duktape,zeropool/duktape,sloth4413/duktape,sloth4413/duktape,haosu1987/duktape,chenyaqiuqiu/duktape,zeropool/duktape,markand/duktape,sloth4413/duktape | javascript | ## Code Before:
/*
* Minimal console.log() polyfill
*/
if (typeof console === 'undefined') {
Object.defineProperty(this, 'console', {
value: {}, writable: true, enumerable: false, configurable: true
});
}
if (typeof console.log === 'undefined') {
Object.defineProperty(this.console, 'log', {
value: function () {
print(Array.prototype.join.call(arguments, ' '));
}, writable: true, enumerable: false, configurable: true
});
}
## Instruction:
Improve console polyfill String coercion
## Code After:
/*
* Minimal console.log() polyfill
*/
if (typeof console === 'undefined') {
Object.defineProperty(this, 'console', {
value: {}, writable: true, enumerable: false, configurable: true
});
}
if (typeof console.log === 'undefined') {
(function () {
var origPrint = print; // capture in closure in case changed later
Object.defineProperty(this.console, 'log', {
value: function () {
var strArgs = Array.prototype.map.call(arguments, function (v) { return String(v); });
origPrint(Array.prototype.join.call(strArgs, ' '));
}, writable: true, enumerable: false, configurable: true
});
})();
}
| /*
* Minimal console.log() polyfill
*/
if (typeof console === 'undefined') {
Object.defineProperty(this, 'console', {
value: {}, writable: true, enumerable: false, configurable: true
});
}
if (typeof console.log === 'undefined') {
+ (function () {
+ var origPrint = print; // capture in closure in case changed later
- Object.defineProperty(this.console, 'log', {
+ Object.defineProperty(this.console, 'log', {
? ++++
- value: function () {
+ value: function () {
? ++++
+ var strArgs = Array.prototype.map.call(arguments, function (v) { return String(v); });
- print(Array.prototype.join.call(arguments, ' '));
? ^ ^ -----
+ origPrint(Array.prototype.join.call(strArgs, ' '));
? ^^^^^^^^^ ^^^^
- }, writable: true, enumerable: false, configurable: true
+ }, writable: true, enumerable: false, configurable: true
? ++++
+ });
- });
+ })();
? ++
} | 14 | 0.875 | 9 | 5 |
361413c253bcde4bef29509138aacb9626ba8304 | css/gallerybutton.css | css/gallerybutton.css | /* toggle for opening shared picture view as file list */
#openAsFileListButton {
float: right;
margin-top: 5px;
font-weight: normal;
background-color: transparent;
border: none;
}
#openAsFileListButton img {
vertical-align: text-top;
-ms-filter: "progid:DXImageTransform.Microsoft.Alpha(Opacity=30)";
filter: alpha(opacity=30);
opacity: .3;
}
| /* toggle for opening shared picture view as file list */
#openAsFileListButton {
float: right;
margin-right: 4px;
margin-top: 5px;
font-weight: normal;
}
#openAsFileListButton img {
vertical-align: text-top;
-ms-filter: "progid:DXImageTransform.Microsoft.Alpha(Opacity=50)";
filter: alpha(opacity=50);
opacity: .5;
}
| Revert "make gallery/filelist switcher button less present and distracting" | Revert "make gallery/filelist switcher button less present and distracting"
| CSS | agpl-3.0 | tahaalibra/gallery,setnes/gallery,desaintmartin/gallery,owncloud/galleryplus,IATkachenko/gallery,tahaalibra/gallery,setnes/gallery,setnes/galleryplus,viraj96/gallery,owncloud/gallery,setnes/gallery,viraj96/gallery,owncloud/gallery,viraj96/gallery,setnes/galleryplus,IATkachenko/gallery,imjalpreet/gallery,imjalpreet/gallery,tahaalibra/gallery,desaintmartin/gallery,desaintmartin/gallery,IATkachenko/gallery,imjalpreet/gallery,owncloud/galleryplus,interfasys/galleryplus,owncloud/gallery,setnes/galleryplus,owncloud/galleryplus,interfasys/galleryplus,interfasys/galleryplus | css | ## Code Before:
/* toggle for opening shared picture view as file list */
#openAsFileListButton {
float: right;
margin-top: 5px;
font-weight: normal;
background-color: transparent;
border: none;
}
#openAsFileListButton img {
vertical-align: text-top;
-ms-filter: "progid:DXImageTransform.Microsoft.Alpha(Opacity=30)";
filter: alpha(opacity=30);
opacity: .3;
}
## Instruction:
Revert "make gallery/filelist switcher button less present and distracting"
## Code After:
/* toggle for opening shared picture view as file list */
#openAsFileListButton {
float: right;
margin-right: 4px;
margin-top: 5px;
font-weight: normal;
}
#openAsFileListButton img {
vertical-align: text-top;
-ms-filter: "progid:DXImageTransform.Microsoft.Alpha(Opacity=50)";
filter: alpha(opacity=50);
opacity: .5;
}
| /* toggle for opening shared picture view as file list */
#openAsFileListButton {
float: right;
+ margin-right: 4px;
margin-top: 5px;
font-weight: normal;
- background-color: transparent;
- border: none;
}
#openAsFileListButton img {
vertical-align: text-top;
- -ms-filter: "progid:DXImageTransform.Microsoft.Alpha(Opacity=30)";
? ^
+ -ms-filter: "progid:DXImageTransform.Microsoft.Alpha(Opacity=50)";
? ^
- filter: alpha(opacity=30);
? ^
+ filter: alpha(opacity=50);
? ^
- opacity: .3;
? ^
+ opacity: .5;
? ^
} | 9 | 0.6 | 4 | 5 |
a9eecd901a965dbe7fc2f8614a54c868f2adf695 | .scrutinizer.yml | .scrutinizer.yml | checks:
python:
code_rating: true
duplicate_code: true
filter:
paths: ['kytos/*', 'tests/*']
excluded_paths:
- 'kytos/web-ui/*'
build:
environment:
python: 3.6.0
postgresql: false
redis: false
dependencies:
before:
- pip install coverage git+git://github.com/diraol/watchdog.git#egg=watchdog
tests:
override:
-
command: 'tox'
coverage:
file: '.coverage'
config_file: '.coveragerc'
format: 'py-cc'
| checks:
python:
code_rating: true
duplicate_code: true
filter:
paths: ['kytos/*', 'tests/*']
excluded_paths:
- 'kytos/web-ui/*'
build:
environment:
python: 3.6.0
postgresql: false
redis: false
dependencies:
before:
- pip install coverage git+git://github.com/diraol/watchdog.git#egg=watchdog
tests:
override:
-
command: 'tox'
coverage:
file: '.coverage'
config_file: '.coveragerc'
format: 'py-cc'
nodes:
analysis:
tests:
override:
- py-scrutinizer-run
| Enable Scrutinizer "Code Rating" Badge | Enable Scrutinizer "Code Rating" Badge | YAML | mit | kytos/kytos,kytos/kyco | yaml | ## Code Before:
checks:
python:
code_rating: true
duplicate_code: true
filter:
paths: ['kytos/*', 'tests/*']
excluded_paths:
- 'kytos/web-ui/*'
build:
environment:
python: 3.6.0
postgresql: false
redis: false
dependencies:
before:
- pip install coverage git+git://github.com/diraol/watchdog.git#egg=watchdog
tests:
override:
-
command: 'tox'
coverage:
file: '.coverage'
config_file: '.coveragerc'
format: 'py-cc'
## Instruction:
Enable Scrutinizer "Code Rating" Badge
## Code After:
checks:
python:
code_rating: true
duplicate_code: true
filter:
paths: ['kytos/*', 'tests/*']
excluded_paths:
- 'kytos/web-ui/*'
build:
environment:
python: 3.6.0
postgresql: false
redis: false
dependencies:
before:
- pip install coverage git+git://github.com/diraol/watchdog.git#egg=watchdog
tests:
override:
-
command: 'tox'
coverage:
file: '.coverage'
config_file: '.coveragerc'
format: 'py-cc'
nodes:
analysis:
tests:
override:
- py-scrutinizer-run
| checks:
python:
code_rating: true
duplicate_code: true
filter:
paths: ['kytos/*', 'tests/*']
excluded_paths:
- 'kytos/web-ui/*'
build:
environment:
python: 3.6.0
postgresql: false
redis: false
dependencies:
before:
- pip install coverage git+git://github.com/diraol/watchdog.git#egg=watchdog
tests:
override:
-
command: 'tox'
coverage:
file: '.coverage'
config_file: '.coveragerc'
format: 'py-cc'
+ nodes:
+ analysis:
+ tests:
+ override:
+ - py-scrutinizer-run | 5 | 0.208333 | 5 | 0 |
5c7e0067e58072e67a1a631bc2e4fbd080a52fe4 | requirements.txt | requirements.txt | Flask == 0.10.1
apispec == 0.9.0
cached-property
| Flask==0.11.1
apispec == 0.9.0
cached-property
| Update flask from 0.10.1 to 0.11.1 | Update flask from 0.10.1 to 0.11.1 | Text | mit | klen/flask-restler,klen/flask-restler | text | ## Code Before:
Flask == 0.10.1
apispec == 0.9.0
cached-property
## Instruction:
Update flask from 0.10.1 to 0.11.1
## Code After:
Flask==0.11.1
apispec == 0.9.0
cached-property
| - Flask == 0.10.1
+ Flask==0.11.1
apispec == 0.9.0
cached-property | 2 | 0.666667 | 1 | 1 |
82f16f33eca1e8008669af59a30d2106d8a680e7 | README.md | README.md |
"Deus Vult"
Text editor in Ruby.
## Instalation
Run `bundle` to install dependences.
## Usage
Run:
ruby cste.rb /path/to/file
|
"Deus Vult"
Text editor in Ruby.
This project is in early stage of development and is not usable yet.
## Instalation
Run `bundle` to install dependences.
## Usage
Run:
ruby cste.rb /path/to/file
| Add a warning about project usability | Add a warning about project usability
* README.md: Add a warning about project usability
| Markdown | mit | fredlinhares/crucis-sapphiri | markdown | ## Code Before:
"Deus Vult"
Text editor in Ruby.
## Instalation
Run `bundle` to install dependences.
## Usage
Run:
ruby cste.rb /path/to/file
## Instruction:
Add a warning about project usability
* README.md: Add a warning about project usability
## Code After:
"Deus Vult"
Text editor in Ruby.
This project is in early stage of development and is not usable yet.
## Instalation
Run `bundle` to install dependences.
## Usage
Run:
ruby cste.rb /path/to/file
|
"Deus Vult"
Text editor in Ruby.
+
+ This project is in early stage of development and is not usable yet.
## Instalation
Run `bundle` to install dependences.
## Usage
Run:
ruby cste.rb /path/to/file | 2 | 0.142857 | 2 | 0 |
07f9355b1b646a4b18572a1e5db575d7fa02467f | src/app/modules/source-converter/source-converter.js | src/app/modules/source-converter/source-converter.js | define([
'marked',
'to-markdown'
], function (
marked,
tomarkdown
) {
'use strict';
return (function() {
var toHTML = function(src) {
marked.setOptions({
breaks: true
});
return marked(src);
};
var toMarkdown = function(src) {
return tomarkdown(src);
};
return {
toHTML : toHTML,
toMarkdown : toMarkdown
};
})();
});
| define([
'marked',
'to-markdown'
], function (
marked,
tomarkdown
) {
'use strict';
return (function() {
var toHTML = function(src) {
marked.setOptions({
breaks: true,
table: false
});
return marked(src);
};
var toMarkdown = function(src) {
return tomarkdown(src);
};
return {
toHTML : toHTML,
toMarkdown : toMarkdown
};
})();
});
| Stop marked from converting Markdown tables into HTML | Stop marked from converting Markdown tables into HTML
Scribe will try to strip out tables
id:4641 state:In Progress
| JavaScript | mit | moneyadviceservice/cms-editor,moneyadviceservice/cms-editor | javascript | ## Code Before:
define([
'marked',
'to-markdown'
], function (
marked,
tomarkdown
) {
'use strict';
return (function() {
var toHTML = function(src) {
marked.setOptions({
breaks: true
});
return marked(src);
};
var toMarkdown = function(src) {
return tomarkdown(src);
};
return {
toHTML : toHTML,
toMarkdown : toMarkdown
};
})();
});
## Instruction:
Stop marked from converting Markdown tables into HTML
Scribe will try to strip out tables
id:4641 state:In Progress
## Code After:
define([
'marked',
'to-markdown'
], function (
marked,
tomarkdown
) {
'use strict';
return (function() {
var toHTML = function(src) {
marked.setOptions({
breaks: true,
table: false
});
return marked(src);
};
var toMarkdown = function(src) {
return tomarkdown(src);
};
return {
toHTML : toHTML,
toMarkdown : toMarkdown
};
})();
});
| define([
'marked',
'to-markdown'
], function (
marked,
tomarkdown
) {
'use strict';
return (function() {
var toHTML = function(src) {
marked.setOptions({
- breaks: true
+ breaks: true,
? +
+ table: false
});
return marked(src);
};
var toMarkdown = function(src) {
return tomarkdown(src);
};
return {
toHTML : toHTML,
toMarkdown : toMarkdown
};
})();
}); | 3 | 0.115385 | 2 | 1 |
ac706bd3fd30f5095835e99431db905a49ccf1d3 | azure-pipelines.yml | azure-pipelines.yml |
trigger:
- main
variables:
MKL_NUM_THREADS: 1
NUMEXPR_NUM_THREADS: 1
OMP_NUM_THREADS: 1
VML_NUM_THREADS: 1
OPENBLAS_NUM_THREADS: 1
PYTHONHASHSEED: 0 # Ensure tests are correctly gathered by xdist
SETUPTOOLS_USE_DISTUTILS: "stdlib"
USE_MATPLOTLIB: true
jobs:
- template: tools/ci/azure/azure_template_windows.yml
parameters:
name: Windows
vmImage: windows-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: macOS
vmImage: macOS-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: Linux
vmImage: ubuntu-latest
|
trigger:
- main
schedules:
- cron: "0 6 * * 1" # Each Monday at 06:00 UTC
displayName: Weekly scheduled run
branches:
include: [main, maintenance/0.13.x]
always: true
variables:
MKL_NUM_THREADS: 1
NUMEXPR_NUM_THREADS: 1
OMP_NUM_THREADS: 1
VML_NUM_THREADS: 1
OPENBLAS_NUM_THREADS: 1
PYTHONHASHSEED: 0 # Ensure tests are correctly gathered by xdist
SETUPTOOLS_USE_DISTUTILS: "stdlib"
USE_MATPLOTLIB: true
jobs:
- template: tools/ci/azure/azure_template_windows.yml
parameters:
name: Windows
vmImage: windows-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: macOS
vmImage: macOS-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: Linux
vmImage: ubuntu-latest
| Add a weekly scheduled run to the Azure pipelines | CI: Add a weekly scheduled run to the Azure pipelines
This commit adds a weekly scheduled run to the Azure pipelines CI. It runs each Monday at 06:00 UTC on the main and maintenance/0.13.x branches.
Periodically scheduled runs can detect errors and warnings caused by changes in upstream dependencies or the build environment. With a scheduled run, they can be traced back to a dependency or environment change more easily than if they popped up in a random PR.
trigger:
- main
variables:
MKL_NUM_THREADS: 1
NUMEXPR_NUM_THREADS: 1
OMP_NUM_THREADS: 1
VML_NUM_THREADS: 1
OPENBLAS_NUM_THREADS: 1
PYTHONHASHSEED: 0 # Ensure tests are correctly gathered by xdist
SETUPTOOLS_USE_DISTUTILS: "stdlib"
USE_MATPLOTLIB: true
jobs:
- template: tools/ci/azure/azure_template_windows.yml
parameters:
name: Windows
vmImage: windows-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: macOS
vmImage: macOS-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: Linux
vmImage: ubuntu-latest
## Instruction:
CI: Add a weekly scheduled run to the Azure pipelines
This commit adds a weekly scheduled run to the Azure pipelines CI. It runs each Monday at 06:00 UTC on the main and maintenance/0.13.x branches.
Periodically scheduled runs can detect errors and warnings caused by changes in upstream dependencies or the build environment. With a scheduled run, they can be traced back to a dependency or environment change more easily than if they popped up in a random PR.
## Code After:
trigger:
- main
schedules:
- cron: "0 6 * * 1" # Each Monday at 06:00 UTC
displayName: Weekly scheduled run
branches:
include: [main, maintenance/0.13.x]
always: true
variables:
MKL_NUM_THREADS: 1
NUMEXPR_NUM_THREADS: 1
OMP_NUM_THREADS: 1
VML_NUM_THREADS: 1
OPENBLAS_NUM_THREADS: 1
PYTHONHASHSEED: 0 # Ensure tests are correctly gathered by xdist
SETUPTOOLS_USE_DISTUTILS: "stdlib"
USE_MATPLOTLIB: true
jobs:
- template: tools/ci/azure/azure_template_windows.yml
parameters:
name: Windows
vmImage: windows-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: macOS
vmImage: macOS-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: Linux
vmImage: ubuntu-latest
|
trigger:
- main
+ schedules:
+ - cron: "0 6 * * 1" # Each Monday at 06:00 UTC
+ displayName: Weekly scheduled run
+ branches:
+ include: [main, maintenance/0.13.x]
+ always: true
variables:
MKL_NUM_THREADS: 1
NUMEXPR_NUM_THREADS: 1
OMP_NUM_THREADS: 1
VML_NUM_THREADS: 1
OPENBLAS_NUM_THREADS: 1
PYTHONHASHSEED: 0 # Ensure tests are correctly gathered by xdist
SETUPTOOLS_USE_DISTUTILS: "stdlib"
USE_MATPLOTLIB: true
jobs:
- template: tools/ci/azure/azure_template_windows.yml
parameters:
name: Windows
vmImage: windows-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: macOS
vmImage: macOS-latest
- template: tools/ci/azure/azure_template_posix.yml
parameters:
name: Linux
vmImage: ubuntu-latest | 6 | 0.2 | 6 | 0 |
1eb32c42ad37964461be25a8335e7051288f1e6a | lib/travis/logs.rb | lib/travis/logs.rb | require 'sidekiq/redis_connection'
require 'travis/logs/config'
if RUBY_PLATFORM =~ /^java/
require 'jrjackson'
else
require 'oj'
end
module Travis
def self.config
Travis::Logs.config
end
module Logs
class << self
attr_writer :config, :database_connection, :redis
def config
@config ||= Travis::Logs::Config.load
end
def database_connection
@database_connection ||= Travis::Logs::Helpers::Database.connect
end
def redis_pool
@redis_pool ||= ::Sidekiq::RedisConnection.create(
url: config.redis.url,
namespace: config.sidekiq.namespace,
size: config.sidekiq.pool_size
)
end
def version
@version ||=
`git rev-parse HEAD 2>/dev/null || echo ${SOURCE_VERSION:-fafafaf}`.strip
end
end
end
end
| require 'sidekiq/redis_connection'
require 'travis/logs/config'
if RUBY_PLATFORM =~ /^java/
require 'jrjackson'
else
require 'oj'
end
module Travis
def self.config
Travis::Logs.config
end
module Logs
class << self
attr_writer :config, :database_connection, :redis
def config
@config ||= Travis::Logs::Config.load
end
def database_connection
@database_connection ||= Travis::Logs::Helpers::Database.connect
end
def redis_pool
@redis_pool ||= ::Sidekiq::RedisConnection.create(
url: config.redis.url,
namespace: config.sidekiq.namespace,
size: config.sidekiq.pool_size
)
end
def version
@version ||=
`git rev-parse HEAD 2>/dev/null || echo ${HEROKU_SLUG_COMMIT:-fafafaf}`.strip
end
end
end
end
| Switch to env var with correct deployed version | Switch to env var with correct deployed version
| Ruby | mit | travis-ci/travis-logs,travis-ci/travis-logs | ruby | ## Code Before:
require 'sidekiq/redis_connection'
require 'travis/logs/config'
if RUBY_PLATFORM =~ /^java/
require 'jrjackson'
else
require 'oj'
end
module Travis
def self.config
Travis::Logs.config
end
module Logs
class << self
attr_writer :config, :database_connection, :redis
def config
@config ||= Travis::Logs::Config.load
end
def database_connection
@database_connection ||= Travis::Logs::Helpers::Database.connect
end
def redis_pool
@redis_pool ||= ::Sidekiq::RedisConnection.create(
url: config.redis.url,
namespace: config.sidekiq.namespace,
size: config.sidekiq.pool_size
)
end
def version
@version ||=
`git rev-parse HEAD 2>/dev/null || echo ${SOURCE_VERSION:-fafafaf}`.strip
end
end
end
end
## Instruction:
Switch to env var with correct deployed version
## Code After:
require 'sidekiq/redis_connection'
require 'travis/logs/config'
if RUBY_PLATFORM =~ /^java/
require 'jrjackson'
else
require 'oj'
end
module Travis
def self.config
Travis::Logs.config
end
module Logs
class << self
attr_writer :config, :database_connection, :redis
def config
@config ||= Travis::Logs::Config.load
end
def database_connection
@database_connection ||= Travis::Logs::Helpers::Database.connect
end
def redis_pool
@redis_pool ||= ::Sidekiq::RedisConnection.create(
url: config.redis.url,
namespace: config.sidekiq.namespace,
size: config.sidekiq.pool_size
)
end
def version
@version ||=
`git rev-parse HEAD 2>/dev/null || echo ${HEROKU_SLUG_COMMIT:-fafafaf}`.strip
end
end
end
end
| require 'sidekiq/redis_connection'
require 'travis/logs/config'
if RUBY_PLATFORM =~ /^java/
require 'jrjackson'
else
require 'oj'
end
module Travis
def self.config
Travis::Logs.config
end
module Logs
class << self
attr_writer :config, :database_connection, :redis
def config
@config ||= Travis::Logs::Config.load
end
def database_connection
@database_connection ||= Travis::Logs::Helpers::Database.connect
end
def redis_pool
@redis_pool ||= ::Sidekiq::RedisConnection.create(
url: config.redis.url,
namespace: config.sidekiq.namespace,
size: config.sidekiq.pool_size
)
end
def version
@version ||=
- `git rev-parse HEAD 2>/dev/null || echo ${SOURCE_VERSION:-fafafaf}`.strip
? ^^^^^^^^ ^^
+ `git rev-parse HEAD 2>/dev/null || echo ${HEROKU_SLUG_COMMIT:-fafafaf}`.strip
? ^ ++++ ++++++++ ^
end
end
end
end | 2 | 0.04878 | 1 | 1 |
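The record above swaps which environment variable feeds the shell fallback `${VAR:-fafafaf}` used for version detection. The same defaulting semantics — use the variable if it is set and non-empty, otherwise a sentinel — can be sketched in Python; the `DEPLOY_COMMIT` name and the `version_fallback` helper are illustrative, not taken from the record:

```python
import os

def version_fallback(env_var="DEPLOY_COMMIT", default="fafafaf"):
    """Mirror shell `${VAR:-default}`: return the env value if it is
    set and non-empty, otherwise fall back to the sentinel default."""
    value = os.environ.get(env_var, "")
    return value if value else default

os.environ.pop("DEPLOY_COMMIT", None)
print(version_fallback())  # unset -> fafafaf
os.environ["DEPLOY_COMMIT"] = "abc123"
print(version_fallback())  # set -> abc123
```

As with the shell `:-` form, an empty value also falls through to the default; shell `${VAR-word}` (no colon) would instead keep the empty string.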
bade616dff7c4742bbcf00ea083c88c0de07f002 | translate.rb | translate.rb | require 'rubygems'
require 'bundler/setup'
require 'sinatra'
require 'json'
require 'google_drive'
require 'rack/contrib/jsonp'
# Need "callback" query string parameter
use Rack::JSONP
session = GoogleDrive.login(ENV['GOOGLE_ID'], ENV['GOOGLE_PASSWD'])
ws = session.spreadsheet_by_key(ENV['SPREADSHEET_KEY']).worksheets[0]
get '/translate' do
content_type :json
start_time = Time.now
from = params[:from]
to = params[:to]
texts = params[:text]
result = {}
column = 1 + rand(200)
# Call cript from spreadsheet
texts.each_with_index do |text, index|
ws[index + 1, column] = "=GoogleTranslate(\"#{text}\", \"#{from}\", \"#{to}\")"
end
# Save and reload the worksheet to get my changes effect
ws.save
ws.reload
texts.each_with_index do |text, index|
result[text] = ws[index + 1, column]
end
result[:elapased_time] = Time.now - start_time
result.to_json
end
| require 'rubygems'
require 'bundler/setup'
require 'sinatra'
require 'json'
require 'google_drive'
require 'rack/contrib/jsonp'
# Need "callback" query string parameter
use Rack::JSONP
session = GoogleDrive.login(ENV['GOOGLE_ID'], ENV['GOOGLE_PASSWD'])
ws = session.spreadsheet_by_key(ENV['SPREADSHEET_KEY']).worksheets[0]
MAX_ROW_NUMBER = 1000
row = 0;
get '/translate' do
content_type :json
start_time = Time.now
from = params[:from]
to = params[:to]
texts = params[:text]
result = {}
if texts.is_a? String
texts = [texts]
end
row = (row + 1) % MAX_ROW_NUMBER
# Call script from spreadsheet
texts.each_with_index do |text, index|
ws[row, index + 1] = "=GoogleTranslate(\"#{text}\", \"#{from}\", \"#{to}\")"
end
# Save and reload the worksheet to get my changes effect
ws.save
ws.reload
texts.each_with_index do |text, index|
result[text] = ws[row, index + 1]
end
result[:elapased_time] = Time.now - start_time
result.to_json
end
| Extend max concurrent request count. | Extend max concurrent request count.
| Ruby | mit | Sangdol/free-and-slow-google-translate-api | ruby | ## Code Before:
require 'rubygems'
require 'bundler/setup'
require 'sinatra'
require 'json'
require 'google_drive'
require 'rack/contrib/jsonp'
# Need "callback" query string parameter
use Rack::JSONP
session = GoogleDrive.login(ENV['GOOGLE_ID'], ENV['GOOGLE_PASSWD'])
ws = session.spreadsheet_by_key(ENV['SPREADSHEET_KEY']).worksheets[0]
get '/translate' do
content_type :json
start_time = Time.now
from = params[:from]
to = params[:to]
texts = params[:text]
result = {}
column = 1 + rand(200)
# Call cript from spreadsheet
texts.each_with_index do |text, index|
ws[index + 1, column] = "=GoogleTranslate(\"#{text}\", \"#{from}\", \"#{to}\")"
end
# Save and reload the worksheet to get my changes effect
ws.save
ws.reload
texts.each_with_index do |text, index|
result[text] = ws[index + 1, column]
end
result[:elapased_time] = Time.now - start_time
result.to_json
end
## Instruction:
Extend max concurrent request count.
## Code After:
require 'rubygems'
require 'bundler/setup'
require 'sinatra'
require 'json'
require 'google_drive'
require 'rack/contrib/jsonp'
# Need "callback" query string parameter
use Rack::JSONP
session = GoogleDrive.login(ENV['GOOGLE_ID'], ENV['GOOGLE_PASSWD'])
ws = session.spreadsheet_by_key(ENV['SPREADSHEET_KEY']).worksheets[0]
MAX_ROW_NUMBER = 1000
row = 0;
get '/translate' do
content_type :json
start_time = Time.now
from = params[:from]
to = params[:to]
texts = params[:text]
result = {}
if texts.is_a? String
texts = [texts]
end
row = (row + 1) % MAX_ROW_NUMBER
# Call script from spreadsheet
texts.each_with_index do |text, index|
ws[row, index + 1] = "=GoogleTranslate(\"#{text}\", \"#{from}\", \"#{to}\")"
end
# Save and reload the worksheet to get my changes effect
ws.save
ws.reload
texts.each_with_index do |text, index|
result[text] = ws[row, index + 1]
end
result[:elapased_time] = Time.now - start_time
result.to_json
end
| require 'rubygems'
require 'bundler/setup'
require 'sinatra'
require 'json'
require 'google_drive'
require 'rack/contrib/jsonp'
# Need "callback" query string parameter
use Rack::JSONP
session = GoogleDrive.login(ENV['GOOGLE_ID'], ENV['GOOGLE_PASSWD'])
ws = session.spreadsheet_by_key(ENV['SPREADSHEET_KEY']).worksheets[0]
+ MAX_ROW_NUMBER = 1000
+ row = 0;
+
get '/translate' do
content_type :json
start_time = Time.now
from = params[:from]
to = params[:to]
texts = params[:text]
result = {}
- column = 1 + rand(200)
+ if texts.is_a? String
+ texts = [texts]
+ end
+ row = (row + 1) % MAX_ROW_NUMBER
+
- # Call cript from spreadsheet
+ # Call script from spreadsheet
? +
texts.each_with_index do |text, index|
- ws[index + 1, column] = "=GoogleTranslate(\"#{text}\", \"#{from}\", \"#{to}\")"
? --------
+ ws[row, index + 1] = "=GoogleTranslate(\"#{text}\", \"#{from}\", \"#{to}\")"
? +++++
end
# Save and reload the worksheet to get my changes effect
ws.save
ws.reload
texts.each_with_index do |text, index|
- result[text] = ws[index + 1, column]
? --------
+ result[text] = ws[row, index + 1]
? +++++
end
result[:elapased_time] = Time.now - start_time
result.to_json
end | 15 | 0.365854 | 11 | 4 |
b9cbaa73ae3dfd5830a1288d5edc7d45c78fc16c | README.md | README.md | Code to build an RPM for Apache Solr 5.x.x
Copyright © 2015 Jason Stafford
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
> [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0)
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
To use, first setup your development environment by running
./setup.sh
Once that is done you can build an RPM by running
./build.sh 5.3.0 1
The version '5.3.0' can be replaced with any released version of Solr in the
5.x.x series. The '1' is the version of the RPM itself.
| Code to build an RPM for Apache Solr 5.x.x
To use, first setup your development environment by running
./setup.sh
Once that is done you can build an RPM by running
./build.sh 5.3.0 1
The version '5.3.0' can be replaced with any released version of Solr in the
5.x.x series. The '1' is the version of the RPM itself.
Copyright © 2015 Jason Stafford
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
> [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0)
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
| Change the order of the readme around so the copyright is less obnoxious | Change the order of the readme around so the copyright is less obnoxious
| Markdown | apache-2.0 | sagevoice/solr-rpm,devghai/solr-rpm | markdown | ## Code Before:
Code to build an RPM for Apache Solr 5.x.x
Copyright © 2015 Jason Stafford
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
> [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0)
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
To use, first setup your development environment by running
./setup.sh
Once that is done you can build an RPM by running
./build.sh 5.3.0 1
The version '5.3.0' can be replaced with any released version of Solr in the
5.x.x series. The '1' is the version of the RPM itself.
## Instruction:
Change the order of the readme around so the copyright is less obnoxious
## Code After:
Code to build an RPM for Apache Solr 5.x.x
To use, first setup your development environment by running
./setup.sh
Once that is done you can build an RPM by running
./build.sh 5.3.0 1
The version '5.3.0' can be replaced with any released version of Solr in the
5.x.x series. The '1' is the version of the RPM itself.
Copyright © 2015 Jason Stafford
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
> [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0)
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
| Code to build an RPM for Apache Solr 5.x.x
+
+ To use, first setup your development environment by running
+
+ ./setup.sh
+
+ Once that is done you can build an RPM by running
+
+ ./build.sh 5.3.0 1
+
+ The version '5.3.0' can be replaced with any released version of Solr in the
+ 5.x.x series. The '1' is the version of the RPM itself.
Copyright © 2015 Jason Stafford
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
> [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0)
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-
- To use, first setup your development environment by running
-
- ./setup.sh
-
- Once that is done you can build an RPM by running
-
- ./build.sh 5.3.0 1
-
- The version '5.3.0' can be replaced with any released version of Solr in the
- 5.x.x series. The '1' is the version of the RPM itself. | 22 | 0.846154 | 11 | 11 |
fc5db4b3c9e5681890c09cf550798777438754d4 | .travis.yml | .travis.yml | env:
- ARCH=x86
language: python
sudo: false
python:
- "2.7"
install:
- "pip install -r client/requirements.txt"
- "pip install python-coveralls"
- "pip install coverage"
- "pip install flake8"
before_script:
- "flake8 jasper.py client tests"
script:
- "coverage run -m unittest discover"
after_success:
- "coverage report"
- "coveralls"
| env:
- ARCH=x86
language: python
sudo: false
python:
- "2.7"
cache:
directories:
- "$HOME/.pip-cache/"
- "/home/travis/virtualenv/python2.7"
install:
- "pip install -r client/requirements.txt --download-cache $HOME/.pip-cache"
- "pip install python-coveralls --download-cache $HOME/.pip-cache"
- "pip install coverage --download-cache $HOME/.pip-cache"
- "pip install flake8 --download-cache $HOME/.pip-cache"
before_script:
- "flake8 jasper.py client tests"
script:
- "coverage run -m unittest discover"
after_success:
- "coverage report"
- "coveralls"
| Use Travis-CI's caching for pip requirements | Use Travis-CI's caching for pip requirements
| YAML | mit | sanyaade-iot/jasper-client,brad999/nikita-client,Siretu/jasper-client,jasperproject/jasper-client,densic/HomeAutomation,densic/HomeAutomation,djeraseit/jasper-client,MaakbareWereld/storyteller,ajay-gandhi/jasper-client,Brandon32/jasper-client,assanee/jasper-client,djeraseit/jasper-client,brad999/nikita-client,rowhit/jasper-client,sunu/jasper-client,auhlig/jasper-client,auhlig/jasper-client,tsaitsai/jasper-client,fritz-fritz/jasper-client,rowhit/jasper-client,skylarker/jasper-client,zanbel/david,jasperproject/jasper-client,assanee/jasper-client,rahul1193/jasper-client,rahul1193/jasper-client,jskye/voicehud-jasper,joekinley/jasper-client,bdizen/jasper-client,sukhoi/jasper-client,sanyaade-iot/jasper-client,benhoff/jasper-client,brad999/nikita,jskye/voicehud-jasper,syardumi/jasper-client,DarrenRainey/jasper-client,benhoff/jasper-client,steppy345/jasper-client,markferry/jasper-client,aish9r/jasper-client,skylarker/jasper-client,rab206/self-proj-pi,sunu/jasper-client,rab206/self-proj-pi,tsaitsai/jasper-client,aish9r/jasper-client,brad999/nikita,zanbel/david,fritz-fritz/jasper-client,DarrenRainey/jasper-client,clumsical/hackthehouse-marty,syardumi/jasper-client,ajay-gandhi/jasper-client,clumsical/hackthehouse-marty,sukhoi/jasper-client,steppy345/jasper-client,Brandon32/jasper-client,bdizen/jasper-client,markferry/jasper-client,MaakbareWereld/storyteller,Siretu/jasper-client,joekinley/jasper-client | yaml | ## Code Before:
env:
- ARCH=x86
language: python
sudo: false
python:
- "2.7"
install:
- "pip install -r client/requirements.txt"
- "pip install python-coveralls"
- "pip install coverage"
- "pip install flake8"
before_script:
- "flake8 jasper.py client tests"
script:
- "coverage run -m unittest discover"
after_success:
- "coverage report"
- "coveralls"
## Instruction:
Use Travis-CI's caching for pip requirements
## Code After:
env:
- ARCH=x86
language: python
sudo: false
python:
- "2.7"
cache:
directories:
- "$HOME/.pip-cache/"
- "/home/travis/virtualenv/python2.7"
install:
- "pip install -r client/requirements.txt --download-cache $HOME/.pip-cache"
- "pip install python-coveralls --download-cache $HOME/.pip-cache"
- "pip install coverage --download-cache $HOME/.pip-cache"
- "pip install flake8 --download-cache $HOME/.pip-cache"
before_script:
- "flake8 jasper.py client tests"
script:
- "coverage run -m unittest discover"
after_success:
- "coverage report"
- "coveralls"
| env:
- ARCH=x86
language: python
sudo: false
python:
- "2.7"
+ cache:
+ directories:
+ - "$HOME/.pip-cache/"
+ - "/home/travis/virtualenv/python2.7"
install:
- - "pip install -r client/requirements.txt"
- - "pip install python-coveralls"
- - "pip install coverage"
- - "pip install flake8"
+ - "pip install -r client/requirements.txt --download-cache $HOME/.pip-cache"
+ - "pip install python-coveralls --download-cache $HOME/.pip-cache"
+ - "pip install coverage --download-cache $HOME/.pip-cache"
+ - "pip install flake8 --download-cache $HOME/.pip-cache"
before_script:
- "flake8 jasper.py client tests"
script:
- "coverage run -m unittest discover"
after_success:
- "coverage report"
- "coveralls" | 12 | 0.666667 | 8 | 4 |
4f5f77f8cd3a201c9b89d78cd9514a1be5f00758 | .travis.yml | .travis.yml | language: ruby
rvm:
- 2.2
- 2.1.1
install: bundle install
script: bundle exec jekyll build
env:
global: NOKOGIRI_USE_SYSTEM_LIBRARIES=true
| language: ruby
rvm:
- 2.2
- 2.1.1
install: bundle install
script: bundle exec jekyll build
branches:
only: /^.*$/
env:
global: NOKOGIRI_USE_SYSTEM_LIBRARIES=true
| Enable Travis CI builds for all branches | Enable Travis CI builds for all branches
From the Travis CI documentation:
> Note that the `gh-pages` branch will not be built unless you add it to the
> whitelist (`branches.only`).
Use regular expressions to whitelist all branches.
| YAML | mit | ahaasler/colos,ahaasler/colos,ahaasler/colos,ahaasler/colos | yaml | ## Code Before:
language: ruby
rvm:
- 2.2
- 2.1.1
install: bundle install
script: bundle exec jekyll build
env:
global: NOKOGIRI_USE_SYSTEM_LIBRARIES=true
## Instruction:
Enable Travis CI builds for all branches
From the Travis CI documentation:
> Note that the `gh-pages` branch will not be built unless you add it to the
> whitelist (`branches.only`).
Use regular expressions to whitelist all branches.
## Code After:
language: ruby
rvm:
- 2.2
- 2.1.1
install: bundle install
script: bundle exec jekyll build
branches:
only: /^.*$/
env:
global: NOKOGIRI_USE_SYSTEM_LIBRARIES=true
| language: ruby
rvm:
- 2.2
- 2.1.1
install: bundle install
script: bundle exec jekyll build
+ branches:
+ only: /^.*$/
env:
global: NOKOGIRI_USE_SYSTEM_LIBRARIES=true | 2 | 0.25 | 2 | 0 |
723e3e199e00799edb2ed60d87c18e2a4e42d15e | locales/tg/messages.properties | locales/tg/messages.properties | your_projects=Лоиҳаҳои шумо
sign_in=Дохилшавӣ
please_sign_in=Марҳамат дохил шавед.
or=оё
sign_up=Бақайдгирӣ
newsletter_sign_up=Бақайдгирӣ
hi_username=Салом, {name}
get_help=Ёрӣ додан
say_hello=Салом бигӯ
mozilla_clubs=Mozilla клуб
tools=Воситаҳо
tools=Воситаҳо
listen=Гӯш кардан
elements_of_mozilla_club_title=Элементҳо барои клуби Mozilla
where_are_clubs_title=Дар куҷои ҷаҳон маҳфилҳои Mozilla ҳастанд?
club_guides_title=Раҳнамои маҳфил
connect_twitter_title=Пайвастшавӣ ба Twitter
guide_category_starting_your_club=Оғози маҳфили худ
featured_report_title=Ҳисобот
featured_resource_title=Захираҳо
| your_projects=Лоиҳаҳои шумо
sign_in=Дохилшавӣ
please_sign_in=Марҳамат дохил шавед.
or=оё
sign_up=Бақайдгирӣ
newsletter_sign_up=Бақайдгирӣ
hi_username=Салом, {name}
get_help=Ёрӣ додан
say_hello=Салом бигӯ
mozilla_clubs=Mozilla клуб
tools=Воситаҳо
tools=Воситаҳо
listen=Гӯш кардан
elements_of_mozilla_club_title=Элементҳо барои клуби Mozilla
where_are_clubs_title=Дар куҷои ҷаҳон маҳфилҳои Mozilla ҳастанд?
club_guides_title=Раҳнамои маҳфил
connect_twitter_title=Пайвастшавӣ ба Twitter
guide_category_starting_your_club=Оғози маҳфили худ
guide_category_customizing_your_club=Танзимкунии маҳфили худ
featured_report_title=Ҳисобот
featured_resource_title=Захираҳо
| Update Tajik (tg) localization of Mozilla Learning Network | Pontoon: Update Tajik (tg) localization of Mozilla Learning Network
Localization authors:
- maftuna.dehqonova.1995 <maftuna.dehqonova.1995@mail.ru>
| INI | mpl-2.0 | mozilla/teach.webmaker.org,mozilla/teach.mozilla.org,mozilla/teach.mozilla.org,mozilla/learning.mozilla.org,mozilla/teach.webmaker.org | ini | ## Code Before:
your_projects=Лоиҳаҳои шумо
sign_in=Дохилшавӣ
please_sign_in=Марҳамат дохил шавед.
or=оё
sign_up=Бақайдгирӣ
newsletter_sign_up=Бақайдгирӣ
hi_username=Салом, {name}
get_help=Ёрӣ додан
say_hello=Салом бигӯ
mozilla_clubs=Mozilla клуб
tools=Воситаҳо
tools=Воситаҳо
listen=Гӯш кардан
elements_of_mozilla_club_title=Элементҳо барои клуби Mozilla
where_are_clubs_title=Дар куҷои ҷаҳон маҳфилҳои Mozilla ҳастанд?
club_guides_title=Раҳнамои маҳфил
connect_twitter_title=Пайвастшавӣ ба Twitter
guide_category_starting_your_club=Оғози маҳфили худ
featured_report_title=Ҳисобот
featured_resource_title=Захираҳо
## Instruction:
Pontoon: Update Tajik (tg) localization of Mozilla Learning Network
Localization authors:
- maftuna.dehqonova.1995 <maftuna.dehqonova.1995@mail.ru>
## Code After:
your_projects=Лоиҳаҳои шумо
sign_in=Дохилшавӣ
please_sign_in=Марҳамат дохил шавед.
or=оё
sign_up=Бақайдгирӣ
newsletter_sign_up=Бақайдгирӣ
hi_username=Салом, {name}
get_help=Ёрӣ додан
say_hello=Салом бигӯ
mozilla_clubs=Mozilla клуб
tools=Воситаҳо
tools=Воситаҳо
listen=Гӯш кардан
elements_of_mozilla_club_title=Элементҳо барои клуби Mozilla
where_are_clubs_title=Дар куҷои ҷаҳон маҳфилҳои Mozilla ҳастанд?
club_guides_title=Раҳнамои маҳфил
connect_twitter_title=Пайвастшавӣ ба Twitter
guide_category_starting_your_club=Оғози маҳфили худ
guide_category_customizing_your_club=Танзимкунии маҳфили худ
featured_report_title=Ҳисобот
featured_resource_title=Захираҳо
| your_projects=Лоиҳаҳои шумо
sign_in=Дохилшавӣ
please_sign_in=Марҳамат дохил шавед.
or=оё
sign_up=Бақайдгирӣ
newsletter_sign_up=Бақайдгирӣ
hi_username=Салом, {name}
get_help=Ёрӣ додан
say_hello=Салом бигӯ
mozilla_clubs=Mozilla клуб
tools=Воситаҳо
tools=Воситаҳо
listen=Гӯш кардан
elements_of_mozilla_club_title=Элементҳо барои клуби Mozilla
where_are_clubs_title=Дар куҷои ҷаҳон маҳфилҳои Mozilla ҳастанд?
club_guides_title=Раҳнамои маҳфил
connect_twitter_title=Пайвастшавӣ ба Twitter
guide_category_starting_your_club=Оғози маҳфили худ
+ guide_category_customizing_your_club=Танзимкунии маҳфили худ
featured_report_title=Ҳисобот
featured_resource_title=Захираҳо | 1 | 0.05 | 1 | 0 |
9e14d8442e1a153c9b7a317249f31c800e0a6aef | packages/hl/hls-graph.yaml | packages/hl/hls-graph.yaml | homepage: https://github.com/haskell/haskell-language-server#readme
changelog-type: ''
hash: f647e60df5319b94d0eb154c62d4afd754adffc9c09efd0ff241f1125a0d8f02
test-bench-deps: {}
maintainer: alan.zimm@gmail.com
synopsis: Haskell Language Server internal graph API
changelog: ''
basic-deps:
exceptions: -any
bytestring: -any
extra: -any
base: '>=4.12 && <5'
time: -any
unordered-containers: -any
filepath: -any
async: -any
js-dgtable: -any
containers: -any
js-jquery: -any
js-flot: -any
hashable: -any
transformers: -any
deepseq: -any
aeson: -any
primitive: -any
directory: -any
all-versions:
- 1.3.0.0
- 1.4.0.0
- 1.5.1.0
author: The Haskell IDE Team
latest: 1.5.1.0
description-type: haddock
description: Please see the README on GitHub at <https://github.com/haskell/haskell-language-server#readme>
license-name: Apache-2.0
| homepage: https://github.com/haskell/haskell-language-server#readme
changelog-type: ''
hash: 1704e44c1674a7f9c972d456b311bbea12ac158130b4f7883a806c8292dc6660
test-bench-deps: {}
maintainer: alan.zimm@gmail.com
synopsis: Haskell Language Server internal graph API
changelog: ''
basic-deps:
exceptions: -any
bytestring: -any
extra: -any
base: '>=4.12 && <5'
time: -any
unordered-containers: -any
filepath: -any
async: -any
js-dgtable: -any
containers: -any
js-jquery: -any
js-flot: -any
hashable: -any
transformers: -any
deepseq: -any
aeson: -any
primitive: -any
directory: -any
all-versions:
- 1.3.0.0
- 1.4.0.0
- 1.5.1.0
- 1.5.1.1
author: The Haskell IDE Team
latest: 1.5.1.1
description-type: haddock
description: Please see the README on GitHub at <https://github.com/haskell/haskell-language-server#readme>
license-name: Apache-2.0
| Update from Hackage at 2021-11-24T17:08:15Z | Update from Hackage at 2021-11-24T17:08:15Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: https://github.com/haskell/haskell-language-server#readme
changelog-type: ''
hash: f647e60df5319b94d0eb154c62d4afd754adffc9c09efd0ff241f1125a0d8f02
test-bench-deps: {}
maintainer: alan.zimm@gmail.com
synopsis: Haskell Language Server internal graph API
changelog: ''
basic-deps:
exceptions: -any
bytestring: -any
extra: -any
base: '>=4.12 && <5'
time: -any
unordered-containers: -any
filepath: -any
async: -any
js-dgtable: -any
containers: -any
js-jquery: -any
js-flot: -any
hashable: -any
transformers: -any
deepseq: -any
aeson: -any
primitive: -any
directory: -any
all-versions:
- 1.3.0.0
- 1.4.0.0
- 1.5.1.0
author: The Haskell IDE Team
latest: 1.5.1.0
description-type: haddock
description: Please see the README on GitHub at <https://github.com/haskell/haskell-language-server#readme>
license-name: Apache-2.0
## Instruction:
Update from Hackage at 2021-11-24T17:08:15Z
## Code After:
homepage: https://github.com/haskell/haskell-language-server#readme
changelog-type: ''
hash: 1704e44c1674a7f9c972d456b311bbea12ac158130b4f7883a806c8292dc6660
test-bench-deps: {}
maintainer: alan.zimm@gmail.com
synopsis: Haskell Language Server internal graph API
changelog: ''
basic-deps:
exceptions: -any
bytestring: -any
extra: -any
base: '>=4.12 && <5'
time: -any
unordered-containers: -any
filepath: -any
async: -any
js-dgtable: -any
containers: -any
js-jquery: -any
js-flot: -any
hashable: -any
transformers: -any
deepseq: -any
aeson: -any
primitive: -any
directory: -any
all-versions:
- 1.3.0.0
- 1.4.0.0
- 1.5.1.0
- 1.5.1.1
author: The Haskell IDE Team
latest: 1.5.1.1
description-type: haddock
description: Please see the README on GitHub at <https://github.com/haskell/haskell-language-server#readme>
license-name: Apache-2.0
| homepage: https://github.com/haskell/haskell-language-server#readme
changelog-type: ''
- hash: f647e60df5319b94d0eb154c62d4afd754adffc9c09efd0ff241f1125a0d8f02
+ hash: 1704e44c1674a7f9c972d456b311bbea12ac158130b4f7883a806c8292dc6660
test-bench-deps: {}
maintainer: alan.zimm@gmail.com
synopsis: Haskell Language Server internal graph API
changelog: ''
basic-deps:
exceptions: -any
bytestring: -any
extra: -any
base: '>=4.12 && <5'
time: -any
unordered-containers: -any
filepath: -any
async: -any
js-dgtable: -any
containers: -any
js-jquery: -any
js-flot: -any
hashable: -any
transformers: -any
deepseq: -any
aeson: -any
primitive: -any
directory: -any
all-versions:
- 1.3.0.0
- 1.4.0.0
- 1.5.1.0
+ - 1.5.1.1
author: The Haskell IDE Team
- latest: 1.5.1.0
? ^
+ latest: 1.5.1.1
? ^
description-type: haddock
description: Please see the README on GitHub at <https://github.com/haskell/haskell-language-server#readme>
license-name: Apache-2.0 | 5 | 0.142857 | 3 | 2 |
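The record above captures a routine Hackage refresh: the new release is appended to `all-versions`, `latest` is bumped, and the content `hash` is replaced. As a sketch of that update step in Python (the field names mirror the record; the helper itself is hypothetical and not part of all-cabal-metadata):

```python
def apply_release(meta: dict, version: str, new_hash: str) -> dict:
    """Return a copy of package metadata with one new release folded in."""
    updated = dict(meta)
    versions = list(meta.get("all-versions", []))
    if version not in versions:
        versions.append(version)      # older releases are kept; the new one is appended
    updated["all-versions"] = versions
    updated["latest"] = version       # `latest` tracks the newest release
    updated["hash"] = new_hash        # the content hash changes with every refresh
    return updated

meta = {"all-versions": ["1.3.0.0", "1.4.0.0", "1.5.1.0"], "latest": "1.5.1.0"}
meta = apply_release(meta, "1.5.1.1", "1704e44c")
# meta["latest"] == "1.5.1.1"; "1.5.1.1" is the last entry of meta["all-versions"]
```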
e0d092f575c8e2415c83c3b6f6d9b7812da0ef10 | readme.md | readme.md |

<br/>
Install from the Chrome Web Store: <br/>
https://chrome.google.com/webstore/detail/chrome-devtools-dark-them/bfcohnmjbpeilfcijkjkeggbdmehhnbk
<br/>
<br/>
After enabling DevTools Experiments in Chrome (http://islegend.com/development/how-to-enable-devtools-experiments-within-google-chrome/), go to DevTools Settings -> Experiments -> Enable "Allow Custom UI themes". Go to Preferences and select Theme: "Dark", which enables Chrome's own Dark Theme. This theme builds on top of that.
|

## Installation
Install from the Chrome Web Store: <br/>
https://chrome.google.com/webstore/detail/chrome-devtools-dark-them/bfcohnmjbpeilfcijkjkeggbdmehhnbk
<br/>
<br/>
After enabling DevTools Experiments in Chrome (http://islegend.com/development/how-to-enable-devtools-experiments-within-google-chrome/), go to DevTools Settings -> Experiments -> Enable "Allow Custom UI themes". Go to Preferences and select Theme: "Dark", which enables Chrome's own Dark Theme. This theme builds on top of that.
<br/>
## Build Instructions
To customize this theme you can edit the CSS (SASS) by yourself:
- run `npm install`. Make sure you have at least node.js v4 installed.
- Install Gulp globally if you haven't done that yet: `npm install --global gulp-cli`
- modify the .scss under src/
- compile the .scss to dist/theme.css by running `gulp generate.css`
- In dist/devtools.js, comment out Line 9: ` var theStyle = ':host-context(.platform-mac) .mo [...]`. Now the theme.css file gets loaded dynamically.
- Load the extension locally in Chrome: go to `chrome://extensions/`, enable 'Developer Mode', click 'Load unpacked extension' and load it
- To style the DevTools, you can actually inspect the DevTools window itself! Make sure the DevTools are undocked, i.e. they are in a separate window, then press Cmd+Option+I and another DevTools window opens.
<br/>
## Credits
The CSS for this theme is based on the DevTools Author themes:<br/>
https://github.com/micjamking/devtools-author
| Add build instructions and credits to Readme | Add build instructions and credits to Readme
| Markdown | mit | bernhardc/chrome-devtools-dark-theme | markdown | ## Code Before:

<br/>
Install from the Chrome Web Store: <br/>
https://chrome.google.com/webstore/detail/chrome-devtools-dark-them/bfcohnmjbpeilfcijkjkeggbdmehhnbk
<br/>
<br/>
After enabling DevTools Experiments in Chrome (http://islegend.com/development/how-to-enable-devtools-experiments-within-google-chrome/), go to DevTools Settings -> Experiments -> Enable "Allow Custom UI themes". Go to Preferences and select Theme: "Dark", which enables Chrome's own Dark Theme. This theme builds on top of that.
## Instruction:
Add build instructions and credits to Readme
## Code After:

## Installation
Install from the Chrome Web Store: <br/>
https://chrome.google.com/webstore/detail/chrome-devtools-dark-them/bfcohnmjbpeilfcijkjkeggbdmehhnbk
<br/>
<br/>
After enabling DevTools Experiments in Chrome (http://islegend.com/development/how-to-enable-devtools-experiments-within-google-chrome/), go to DevTools Settings -> Experiments -> Enable "Allow Custom UI themes". Go to Preferences and select Theme: "Dark", which enables Chrome's own Dark Theme. This theme builds on top of that.
<br/>
## Build Instructions
To customize this theme you can edit the CSS (SASS) by yourself:
- run `npm install`. Make sure you have at least node.js v4 installed.
- Install Gulp globally if you haven't done that yet: `npm install --global gulp-cli`
- modify the .scss under src/
- compile the .scss to dist/theme.css by running `gulp generate.css`
- In dist/devtools.js, comment out Line 9: ` var theStyle = ':host-context(.platform-mac) .mo [...]`. Now the theme.css file gets loaded dynamically.
- Load the extension locally in Chrome: go to `chrome://extensions/`, enable 'Developer Mode', click 'Load unpacked extension' and load it
- To style the DevTools, you can actually inspect the DevTools window itself! Make sure the DevTools are undocked, i.e. they are in a separate window, then press Cmd+Option+I and another DevTools window opens.
<br/>
## Credits
The CSS for this theme is based on the DevTools Author themes:<br/>
https://github.com/micjamking/devtools-author
|

- <br/>
+
+ ## Installation
Install from the Chrome Web Store: <br/>
https://chrome.google.com/webstore/detail/chrome-devtools-dark-them/bfcohnmjbpeilfcijkjkeggbdmehhnbk
<br/>
<br/>
After enabling DevTools Experiments in Chrome (http://islegend.com/development/how-to-enable-devtools-experiments-within-google-chrome/), go to DevTools Settings -> Experiments -> Enable "Allow Custom UI themes". Go to Preferences and select Theme: "Dark", which enables Chrome's own Dark Theme. This theme builds on top of that.
+ <br/>
+
+ ## Build Instructions
+ To customize this theme you can edit the CSS (SASS) by yourself:
+ - run `npm install`. Make sure you have at least node.js v4 installed.
+ - Install Gulp globally if you haven't done that yet: `npm install --global gulp-cli`
+ - modify the .scss under src/
+ - compile the .scss to dist/theme.css by running `gulp generate.css`
+ - In dist/devtools.js, comment out Line 9: ` var theStyle = ':host-context(.platform-mac) .mo [...]`. Now the theme.css file gets loaded dynamically.
+ - Load the extension locally in Chrome: go to `chrome://extensions/`, enable 'Developer Mode', click 'Load unpacked extension' and load it
+ - To style the DevTools, you can actually inspect the DevTools window itself! Make sure the DevTools are undocked, i.e. they are in a separate window, then press Cmd+Option+I and another DevTools window opens.
+
+ <br/>
+
+ ## Credits
+ The CSS for this theme is based on the DevTools Author themes:<br/>
+ https://github.com/micjamking/devtools-author
+ | 21 | 1.615385 | 20 | 1 |
b518694d27ba89381d21cbec596bced7755fd47b | jenkins/README.md | jenkins/README.md | ----
## Useful links:
[Jenkins plugins](https://plugins.jenkins.io/ansible)
[Official Jenkins Docker repo](https://github.com/jenkinsci/docker)
## Allowing your CI container to start other docker agents/containers.
* If you want your CI container to be able to start containers, then a simple way is
to expose /var/run/docker.sock to your CI container using the -v flag.
``` --volume /var/run/docker.sock:/var/run/docker.sock```.
You will also need to install the docker cli in your CI container.
* Now your CI container will be able to start containers, except that instead of
starting child containers, it will start sibling containers.
| ----
## Useful links:
[Jenkins plugins](https://plugins.jenkins.io/ansible)
[Official Jenkins Docker repo](https://github.com/jenkinsci/docker)
## Allowing your CI container to start other docker agents/containers.
* If you want your CI container to be able to start containers, then a simple way is
to expose /var/run/docker.sock to your CI container using the -v flag.
``` --volume /var/run/docker.sock:/var/run/docker.sock```.
You will also need to install the docker cli in your CI container.
* Now your CI container will be able to start containers, except that instead of
starting child containers, it will start sibling containers.
## To dump all credentials stored in jenkins, go to ```Script console``` under
```Manage Jenkins```, and execute this script:
```
import jenkins.model.*
import com.cloudbees.plugins.credentials.*
import com.cloudbees.plugins.credentials.impl.*
import com.cloudbees.plugins.credentials.domains.*
import com.cloudbees.jenkins.plugins.sshcredentials.impl.BasicSSHUserPrivateKey
import org.jenkinsci.plugins.plaincredentials.StringCredentials
def showRow = { credentialType, secretId, username = null, password = null, description = null ->
println("${credentialType} : ".padLeft(20) + secretId?.padRight(38)+" | " +username?.padRight(20)+" | " +password?.padRight(40) + " | " +description)
}
// set Credentials domain name (null means is it global)
domainName = null
credentialsStore = Jenkins.instance.getExtensionList('com.cloudbees.plugins.credentials.SystemCredentialsProvider')[0]?.getStore()
domain = new Domain(domainName, null, Collections.<DomainSpecification>emptyList())
credentialsStore?.getCredentials(domain).each{
if(it instanceof UsernamePasswordCredentialsImpl)
showRow("user/password", it.id, it.username, it.password?.getPlainText(),it.description)
else if(it instanceof BasicSSHUserPrivateKey)
showRow("ssh priv key", it.id, it.passphrase?.getPlainText(), it.privateKeySource?.getPrivateKey(), it.description )
else if(it instanceof StringCredentials)
showRow("secret text", it.id, it.secret?.getPlainText(), it.description, '' )
else
showRow("something else", it.id)
}
return
```
| Add script to display credentials | Add script to display credentials
| Markdown | apache-2.0 | bdastur/notes,bdastur/notes,bdastur/notes,bdastur/notes,bdastur/notes,bdastur/notes | markdown | ## Code Before:
----
## Useful links:
[Jenkins plugins](https://plugins.jenkins.io/ansible)
[Official Jenkins Docker repo](https://github.com/jenkinsci/docker)
## Allowing your CI container to start other docker agents/containers.
* If you want your CI container to be able to start containers, then a simple way is
to expose /var/run/docker.sock to your CI container using the -v flag.
``` --volume /var/run/docker.sock:/var/run/docker.sock```.
You will also need to install the docker cli in your CI container.
* Now your CI container will be able to start containers, except that instead of
starting child containers, it will start sibling containers.
## Instruction:
Add script to display credentials
## Code After:
----
## Useful links:
[Jenkins plugins](https://plugins.jenkins.io/ansible)
[Official Jenkins Docker repo](https://github.com/jenkinsci/docker)
## Allowing your CI container to start other docker agents/containers.
* If you want your CI container to be able to start containers, then a simple way is
to expose /var/run/docker.sock to your CI container using the -v flag.
``` --volume /var/run/docker.sock:/var/run/docker.sock```.
You will also need to install the docker cli in your CI container.
* Now your CI container will be able to start containers, except that instead of
starting child containers, it will start sibling containers.
## To dump all credentials stored in jenkins, go to ```Script console``` under
```Manage Jenkins```, and execute this script:
```
import jenkins.model.*
import com.cloudbees.plugins.credentials.*
import com.cloudbees.plugins.credentials.impl.*
import com.cloudbees.plugins.credentials.domains.*
import com.cloudbees.jenkins.plugins.sshcredentials.impl.BasicSSHUserPrivateKey
import org.jenkinsci.plugins.plaincredentials.StringCredentials
def showRow = { credentialType, secretId, username = null, password = null, description = null ->
println("${credentialType} : ".padLeft(20) + secretId?.padRight(38)+" | " +username?.padRight(20)+" | " +password?.padRight(40) + " | " +description)
}
// set Credentials domain name (null means is it global)
domainName = null
credentialsStore = Jenkins.instance.getExtensionList('com.cloudbees.plugins.credentials.SystemCredentialsProvider')[0]?.getStore()
domain = new Domain(domainName, null, Collections.<DomainSpecification>emptyList())
credentialsStore?.getCredentials(domain).each{
if(it instanceof UsernamePasswordCredentialsImpl)
showRow("user/password", it.id, it.username, it.password?.getPlainText(),it.description)
else if(it instanceof BasicSSHUserPrivateKey)
showRow("ssh priv key", it.id, it.passphrase?.getPlainText(), it.privateKeySource?.getPrivateKey(), it.description )
else if(it instanceof StringCredentials)
showRow("secret text", it.id, it.secret?.getPlainText(), it.description, '' )
else
showRow("something else", it.id)
}
return
```
| ----
## Useful links:
[Jenkins plugins](https://plugins.jenkins.io/ansible)
[Official Jenkins Docker repo](https://github.com/jenkinsci/docker)
## Allowing your CI container to start other docker agents/containers.
* If you want your CI container to be able to start containers, then a simple way is
to expose /var/run/docker.sock to your CI container using the -v flag.
``` --volume /var/run/docker.sock:/var/run/docker.sock```.
You will also need to install the docker cli in your CI container.
* Now your CI container will be able to start containers, except that instead of
starting child containers, it will start sibling containers.
+
+ ## To dump all credentials stored in jenkins, go to ```Script console``` under
+ ```Manage Jenkins```, and execute this script:
+
+ ```
+ import jenkins.model.*
+ import com.cloudbees.plugins.credentials.*
+ import com.cloudbees.plugins.credentials.impl.*
+ import com.cloudbees.plugins.credentials.domains.*
+ import com.cloudbees.jenkins.plugins.sshcredentials.impl.BasicSSHUserPrivateKey
+ import org.jenkinsci.plugins.plaincredentials.StringCredentials
+
+ def showRow = { credentialType, secretId, username = null, password = null, description = null ->
+ println("${credentialType} : ".padLeft(20) + secretId?.padRight(38)+" | " +username?.padRight(20)+" | " +password?.padRight(40) + " | " +description)
+ }
+
+ // set Credentials domain name (null means is it global)
+ domainName = null
+
+ credentialsStore = Jenkins.instance.getExtensionList('com.cloudbees.plugins.credentials.SystemCredentialsProvider')[0]?.getStore()
+ domain = new Domain(domainName, null, Collections.<DomainSpecification>emptyList())
+
+ credentialsStore?.getCredentials(domain).each{
+ if(it instanceof UsernamePasswordCredentialsImpl)
+ showRow("user/password", it.id, it.username, it.password?.getPlainText(),it.description)
+ else if(it instanceof BasicSSHUserPrivateKey)
+ showRow("ssh priv key", it.id, it.passphrase?.getPlainText(), it.privateKeySource?.getPrivateKey(), it.description )
+ else if(it instanceof StringCredentials)
+ showRow("secret text", it.id, it.secret?.getPlainText(), it.description, '' )
+ else
+ showRow("something else", it.id)
+ }
+
+ return
+ ```
+
+ | 37 | 2.3125 | 37 | 0 |
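The socket-mounting approach described in the record boils down to one extra `--volume` argument on `docker run`. A minimal Python sketch that assembles such a command line (it only builds the argument list and never invokes Docker; the helper is illustrative, not a real Jenkins or Docker API):

```python
DOCKER_SOCK = "/var/run/docker.sock"

def docker_run_args(image: str, mount_docker_sock: bool = True) -> list:
    """Build a `docker run` argv; passing the host socket through lets the
    container start *sibling* containers on the host's Docker daemon."""
    args = ["docker", "run", "--detach"]
    if mount_docker_sock:
        args += ["--volume", f"{DOCKER_SOCK}:{DOCKER_SOCK}"]
    args.append(image)
    return args

argv = docker_run_args("jenkins/jenkins:lts")
# argv == ["docker", "run", "--detach",
#          "--volume", "/var/run/docker.sock:/var/run/docker.sock",
#          "jenkins/jenkins:lts"]
```

Remember the caveat from the record: containers started this way are siblings, not children, so names, ports and volumes are resolved against the host daemon, and the CI image still needs the docker CLI installed.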
8b58031fb6c4e9a45090ce9a618b83775337ed22 | app/uploaders/avatar_uploader.rb | app/uploaders/avatar_uploader.rb |
class AvatarUploader < ApplicationUploader
include CarrierWave::MiniMagick
# Process files as they are uploaded:
process :resize_to_fill => [512, 512]
process :convert => 'png'
storage :file
def asset_host
nil
end
def store_dir
if Rails.env.production?
"/store/uploads/#{mounted_as}/#{model.avatar_namespace}/#{model.id}/"
else
"#{Rails.root}/public/uploads/#{mounted_as}/#{model.avatar_namespace}/"
end
end
def filename
"#{mounted_as}_#{model.id}.png" if original_filename.present?
end
def default_url
image_path("#{Cloudsdale.config['asset_url']}/assets/fallback/#{mounted_as}/" + [version_name, "#{model.avatar_namespace}.png"].compact.join('_'))
end
after :store, :purge_from_cdn
after :remove, :purge_from_cdn
protected
# Internal: Clears the CDN of any records associated to previous
# uploaded avatar for the model.
#
# Returns nothing of interest.
def purge_from_cdn(file=nil)
if Rails.env.production?
AvatarPurgeWorker.perform_async(model.id.to_s, model.class.to_s)
end
end
end
|
class AvatarUploader < ApplicationUploader
include CarrierWave::MiniMagick
# Process files as they are uploaded:
process :resize_to_fill => [512, 512]
process :convert => 'png'
storage :file
def asset_host
nil
end
def store_dir
if Rails.env.production?
"/store/uploads/#{mounted_as}/#{model.avatar_namespace}/#{model.id}/"
else
"#{Rails.root}/public/uploads/#{mounted_as}/#{model.avatar_namespace}/"
end
end
def filename
"#{mounted_as}_#{model.id}.png" if original_filename.present?
end
def default_url
image_path("#{Cloudsdale.config['asset_url']}/assets/fallback/#{mounted_as}/" + [version_name, "#{model.avatar_namespace}.png"].compact.join('_'))
end
after :store, :purge_from_cdn, :set_avatar_uploaded_at
after :remove, :purge_from_cdn
protected
# Internal: Clears the CDN of any records associated to previous
# uploaded avatar for the model.
#
# Returns nothing of interest.
def purge_from_cdn(file=nil)
if Rails.env.production?
AvatarPurgeWorker.perform_async(model.id.to_s, model.class.to_s)
end
end
# Private: Sets the avatar upload date on model.
# Returns the date the avatar was uploaded.
def set_avatar_uploaded_at
model.avatar_uploaded_at = DateTime.now if model.respond_to?(:avatar_uploaded_at)
end
end
| Manage avatar upload date from uploader. | Manage avatar upload date from uploader.
| Ruby | mit | cloudsdaleapp/cloudsdale-web,cloudsdaleapp/cloudsdale-web,cloudsdaleapp/cloudsdale-web,cloudsdaleapp/cloudsdale-web | ruby | ## Code Before:
class AvatarUploader < ApplicationUploader
include CarrierWave::MiniMagick
# Process files as they are uploaded:
process :resize_to_fill => [512, 512]
process :convert => 'png'
storage :file
def asset_host
nil
end
def store_dir
if Rails.env.production?
"/store/uploads/#{mounted_as}/#{model.avatar_namespace}/#{model.id}/"
else
"#{Rails.root}/public/uploads/#{mounted_as}/#{model.avatar_namespace}/"
end
end
def filename
"#{mounted_as}_#{model.id}.png" if original_filename.present?
end
def default_url
image_path("#{Cloudsdale.config['asset_url']}/assets/fallback/#{mounted_as}/" + [version_name, "#{model.avatar_namespace}.png"].compact.join('_'))
end
after :store, :purge_from_cdn
after :remove, :purge_from_cdn
protected
# Internal: Clears the CDN of any records associated to previous
# uploaded avatar for the model.
#
# Returns nothing of interest.
def purge_from_cdn(file=nil)
if Rails.env.production?
AvatarPurgeWorker.perform_async(model.id.to_s, model.class.to_s)
end
end
end
## Instruction:
Manage avatar upload date from uploader.
## Code After:
class AvatarUploader < ApplicationUploader
include CarrierWave::MiniMagick
# Process files as they are uploaded:
process :resize_to_fill => [512, 512]
process :convert => 'png'
storage :file
def asset_host
nil
end
def store_dir
if Rails.env.production?
"/store/uploads/#{mounted_as}/#{model.avatar_namespace}/#{model.id}/"
else
"#{Rails.root}/public/uploads/#{mounted_as}/#{model.avatar_namespace}/"
end
end
def filename
"#{mounted_as}_#{model.id}.png" if original_filename.present?
end
def default_url
image_path("#{Cloudsdale.config['asset_url']}/assets/fallback/#{mounted_as}/" + [version_name, "#{model.avatar_namespace}.png"].compact.join('_'))
end
after :store, :purge_from_cdn, :set_avatar_uploaded_at
after :remove, :purge_from_cdn
protected
# Internal: Clears the CDN of any records associated to previous
# uploaded avatar for the model.
#
# Returns nothing of interest.
def purge_from_cdn(file=nil)
if Rails.env.production?
AvatarPurgeWorker.perform_async(model.id.to_s, model.class.to_s)
end
end
# Private: Sets the avatar upload date on model.
# Returns the date the avatar was uploaded.
def set_avatar_uploaded_at
model.avatar_uploaded_at = DateTime.now if model.respond_to?(:avatar_uploaded_at)
end
end
|
class AvatarUploader < ApplicationUploader
include CarrierWave::MiniMagick
# Process files as they are uploaded:
process :resize_to_fill => [512, 512]
process :convert => 'png'
storage :file
def asset_host
nil
end
def store_dir
if Rails.env.production?
"/store/uploads/#{mounted_as}/#{model.avatar_namespace}/#{model.id}/"
else
"#{Rails.root}/public/uploads/#{mounted_as}/#{model.avatar_namespace}/"
end
end
def filename
"#{mounted_as}_#{model.id}.png" if original_filename.present?
end
def default_url
image_path("#{Cloudsdale.config['asset_url']}/assets/fallback/#{mounted_as}/" + [version_name, "#{model.avatar_namespace}.png"].compact.join('_'))
end
- after :store, :purge_from_cdn
+ after :store, :purge_from_cdn, :set_avatar_uploaded_at
after :remove, :purge_from_cdn
protected
# Internal: Clears the CDN of any records associated to previous
# uploaded avatar for the model.
#
# Returns nothing of interest.
def purge_from_cdn(file=nil)
if Rails.env.production?
AvatarPurgeWorker.perform_async(model.id.to_s, model.class.to_s)
end
end
+ # Private: Sets the avatar upload date on model.
+ # Returns the date the avatar was uploaded.
+ def set_avatar_uploaded_at
+ model.avatar_uploaded_at = DateTime.now if model.respond_to?(:avatar_uploaded_at)
+ end
+
end | 8 | 0.170213 | 7 | 1 |
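The Ruby callback above guards the timestamp with `respond_to?`, so models without an `avatar_uploaded_at` attribute pass through untouched. The same duck-typing guard reads naturally in Python with `hasattr`; a rough sketch of the pattern (the class names are invented for illustration):

```python
from datetime import datetime, timezone

def set_avatar_uploaded_at(model):
    """After-store hook: stamp the upload time only on models that support it."""
    if hasattr(model, "avatar_uploaded_at"):
        model.avatar_uploaded_at = datetime.now(timezone.utc)
    return model

class User:                  # tracks the avatar upload date
    avatar_uploaded_at = None

class AuditLog:              # has no such attribute; the hook is a no-op
    pass

user = set_avatar_uploaded_at(User())
log = set_avatar_uploaded_at(AuditLog())
# user.avatar_uploaded_at is now a datetime; log is left unchanged
```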
0ee20df4e9db1c1833a7e5b1553fbd9329019790 | src/config.json | src/config.json | {
"localAppStoreName":"127.0.0.1:8181",
"requiredHTTPSecrets": [
"arbiter",
"export-service",
"syslog"
],
"requiredArbiterSecrets": [
"container-manager",
"arbiter",
"export-service",
"syslog"
],
"localAppStoreSeedManifests": [
"driver-twitter",
"app-twitter-sentiment",
"store-json",
"driver-os-monitor"
]
}
| {
"localAppStoreName":"127.0.0.1:8181",
"requiredHTTPSecrets": [
"arbiter",
"export-service",
"syslog"
],
"requiredArbiterSecrets": [
"container-manager",
"arbiter",
"export-service",
"syslog"
],
"localAppStoreSeedManifests": [
"driver-twitter",
"driver-os-monitor",
"driver-sensingkit",
"driver-google-takeout",
"driver-phillips-hue",
"app-light-graph",
"app-twitter-sentiment",
"store-json"
]
}
| Add all the drivers and apps ;-) | Add all the drivers and apps ;-)
| JSON | mit | pooyadav/databox,me-box/databox,pooyadav/databox,me-box/databox | json | ## Code Before:
{
"localAppStoreName":"127.0.0.1:8181",
"requiredHTTPSecrets": [
"arbiter",
"export-service",
"syslog"
],
"requiredArbiterSecrets": [
"container-manager",
"arbiter",
"export-service",
"syslog"
],
"localAppStoreSeedManifests": [
"driver-twitter",
"app-twitter-sentiment",
"store-json",
"driver-os-monitor"
]
}
## Instruction:
Add all the drivers and apps ;-)
## Code After:
{
"localAppStoreName":"127.0.0.1:8181",
"requiredHTTPSecrets": [
"arbiter",
"export-service",
"syslog"
],
"requiredArbiterSecrets": [
"container-manager",
"arbiter",
"export-service",
"syslog"
],
"localAppStoreSeedManifests": [
"driver-twitter",
"driver-os-monitor",
"driver-sensingkit",
"driver-google-takeout",
"driver-phillips-hue",
"app-light-graph",
"app-twitter-sentiment",
"store-json"
]
}
| {
"localAppStoreName":"127.0.0.1:8181",
"requiredHTTPSecrets": [
"arbiter",
"export-service",
"syslog"
],
"requiredArbiterSecrets": [
"container-manager",
"arbiter",
"export-service",
"syslog"
],
"localAppStoreSeedManifests": [
"driver-twitter",
+ "driver-os-monitor",
+ "driver-sensingkit",
+ "driver-google-takeout",
+ "driver-phillips-hue",
+ "app-light-graph",
"app-twitter-sentiment",
- "store-json",
? -
+ "store-json"
- "driver-os-monitor"
]
} | 8 | 0.380952 | 6 | 2 |
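The seed list above behaves like an ordered, duplicate-free collection: new driver and app manifests are appended, and each name appears exactly once. A small Python sketch of maintaining such a list (illustrative only, not part of databox):

```python
def add_manifests(seeded: list, additions: list) -> list:
    """Append manifests that are not already seeded, preserving order."""
    seen = set(seeded)
    result = list(seeded)
    for name in additions:
        if name not in seen:
            result.append(name)
            seen.add(name)
    return result

seeds = add_manifests(["driver-twitter", "store-json"],
                      ["driver-os-monitor", "store-json", "app-light-graph"])
# seeds == ["driver-twitter", "store-json", "driver-os-monitor", "app-light-graph"]
```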
9e4b9a206e411d42bbe656d27a993b68f8c56fe7 | containers/postgresql-pgbouncer/conf/pgbouncer.ini | containers/postgresql-pgbouncer/conf/pgbouncer.ini | [databases]
* = host=mc_postgresql_server port=5432 user=mediacloud
[pgbouncer]
; Actual logging done to STDOUT
logfile = /dev/null
pidfile = /var/run/postgresql/pgbouncer.pid
listen_addr = 0.0.0.0
listen_port = 6432
unix_socket_dir = /var/run/postgresql
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = session
server_reset_query = DISCARD ALL
max_client_conn = 600
default_pool_size = 600
log_connections = 0
log_disconnections = 0
stats_period = 600
| [databases]
* = host=mc_postgresql_server port=5432 user=mediacloud
[pgbouncer]
; Actual logging done to STDOUT
logfile = /dev/null
pidfile = /var/run/postgresql/pgbouncer.pid
listen_addr = 0.0.0.0
listen_port = 6432
unix_socket_dir = /var/run/postgresql
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = session
server_reset_query = DISCARD ALL
max_client_conn = 600
default_pool_size = 600
log_connections = 0
log_disconnections = 0
stats_period = 600
server_login_retry = 1
| Make PgBouncer try to reconnect every 1 s | Make PgBouncer try to reconnect every 1 s
| INI | agpl-3.0 | berkmancenter/mediacloud,berkmancenter/mediacloud,berkmancenter/mediacloud,berkmancenter/mediacloud,berkmancenter/mediacloud | ini | ## Code Before:
[databases]
* = host=mc_postgresql_server port=5432 user=mediacloud
[pgbouncer]
; Actual logging done to STDOUT
logfile = /dev/null
pidfile = /var/run/postgresql/pgbouncer.pid
listen_addr = 0.0.0.0
listen_port = 6432
unix_socket_dir = /var/run/postgresql
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = session
server_reset_query = DISCARD ALL
max_client_conn = 600
default_pool_size = 600
log_connections = 0
log_disconnections = 0
stats_period = 600
## Instruction:
Make PgBouncer try to reconnect every 1 s
## Code After:
[databases]
* = host=mc_postgresql_server port=5432 user=mediacloud
[pgbouncer]
; Actual logging done to STDOUT
logfile = /dev/null
pidfile = /var/run/postgresql/pgbouncer.pid
listen_addr = 0.0.0.0
listen_port = 6432
unix_socket_dir = /var/run/postgresql
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = session
server_reset_query = DISCARD ALL
max_client_conn = 600
default_pool_size = 600
log_connections = 0
log_disconnections = 0
stats_period = 600
server_login_retry = 1
| [databases]
* = host=mc_postgresql_server port=5432 user=mediacloud
[pgbouncer]
; Actual logging done to STDOUT
logfile = /dev/null
pidfile = /var/run/postgresql/pgbouncer.pid
listen_addr = 0.0.0.0
listen_port = 6432
unix_socket_dir = /var/run/postgresql
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = session
server_reset_query = DISCARD ALL
max_client_conn = 600
default_pool_size = 600
log_connections = 0
log_disconnections = 0
stats_period = 600
+ server_login_retry = 1 | 1 | 0.043478 | 1 | 0 |
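`server_login_retry` is the pause, in seconds, between PgBouncer's attempts to log back in to a backend after a failed login (15 s by default, 1 s here). The behaviour it tunes is essentially a fixed-interval retry loop; a simplified Python sketch of the idea (not PgBouncer's actual implementation):

```python
import time

def connect_with_retry(connect, retry_interval=1, max_attempts=5, sleep=time.sleep):
    """Call `connect()` until it succeeds, sleeping `retry_interval`
    seconds after each failure; re-raise after `max_attempts` tries."""
    for attempt in range(1, max_attempts + 1):
        try:
            return connect()
        except ConnectionError:
            if attempt == max_attempts:
                raise
            sleep(retry_interval)   # the server_login_retry analogue
```

Dropping the interval from 15 s to 1 s makes a restarted backend usable again almost immediately, at the cost of slightly more login churn while it is genuinely down.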
dc695cc1779e534ee4e402e191c23b2f56ef556c | cloud/aws/courses/infrastructure.yaml | cloud/aws/courses/infrastructure.yaml | ---
student:
ami: "ami-c82bf6a8"
type: "m4.large"
key_name: "training"
security_group_ids:
- "sg-688b120f"
| ---
master:
ami: "ami-c82bf6a8"
type: "m4.large"
key_name: "training"
security_group_ids:
- "sg-688b120f"
student:
ami: "ami-c82bf6a8"
type: "m4.large"
key_name: "training"
security_group_ids:
- "sg-688b120f"
| Add an instructor VM and call it `master` | Add an instructor VM and call it `master`
The `vcmanager` tool expects one machine to be in the `master` role, for indexing purposes. In ID, we don't have that. Each student is standalone. So instead, we'll stand one machine up just like the others that the instructor can use for demos and exercises, but we'll call it the `master`. | YAML | apache-2.0 | carthik/puppetlabs-training-bootstrap,marrero984/education-builds,marrero984/education-builds,samuelson/puppetlabs-training-bootstrap,puppetlabs/puppetlabs-training-bootstrap,carthik/puppetlabs-training-bootstrap,puppetlabs/puppetlabs-training-bootstrap,kjhenner/puppetlabs-training-bootstrap,joshsamuelson/puppetlabs-training-bootstrap,joshsamuelson/puppetlabs-training-bootstrap,samuelson/education-builds,carthik/puppetlabs-training-bootstrap,samuelson/puppetlabs-training-bootstrap,kjhenner/puppetlabs-training-bootstrap,samuelson/education-builds,samuelson/puppetlabs-training-bootstrap,kjhenner/puppetlabs-training-bootstrap,marrero984/education-builds,joshsamuelson/puppetlabs-training-bootstrap,puppetlabs/puppetlabs-training-bootstrap,samuelson/education-builds | yaml | ## Code Before:
---
student:
ami: "ami-c82bf6a8"
type: "m4.large"
key_name: "training"
security_group_ids:
- "sg-688b120f"
## Instruction:
Add an instructor VM and call it `master`
The `vcmanager` tool expects one machine to be in the `master` role, for indexing purposes. In ID, we don't have that. Each student is standalone. So instead, we'll stand one machine up just like the others that the instructor can use for demos and exercises, but we'll call it the `master`.
## Code After:
---
master:
ami: "ami-c82bf6a8"
type: "m4.large"
key_name: "training"
security_group_ids:
- "sg-688b120f"
student:
ami: "ami-c82bf6a8"
type: "m4.large"
key_name: "training"
security_group_ids:
- "sg-688b120f"
| ---
+ master:
+ ami: "ami-c82bf6a8"
+ type: "m4.large"
+ key_name: "training"
+ security_group_ids:
+ - "sg-688b120f"
student:
ami: "ami-c82bf6a8"
type: "m4.large"
key_name: "training"
security_group_ids:
- "sg-688b120f" | 6 | 0.857143 | 6 | 0 |
a520673fd0b0c4e180dcbc2cd79f9173e955f71a | CHANGELOG.md | CHANGELOG.md | basicsetup CHANGELOG
====================
This file is used to list changes made in each version of the basicsetup cookbook.
1.4.0
-----
- dist-upgrade from 14.04 to 16.04
- Remove `mrs` and `biomaj`
This version was produced by running `dist-upgrade` and `do-release-upgrade` on 1.3.0.
1.3.0
-----
- bioconda support
- conda 3.14.1
- docker install 1.12.3
- Galaxy run as docker container
- Add 4 Genome RNA-seq Workflow (HISAT2, Tophat2)
- Add 2 Sailfish v0.9 Workflow
0.1.0
-----
- [Manabu ISHII] - Initial release of basicsetup
- - -
Check the [Markdown Syntax Guide](http://daringfireball.net/projects/markdown/syntax) for help with Markdown.
The [Github Flavored Markdown page](http://github.github.com/github-flavored-markdown/) describes the differences between markdown on github and standard markdown.
| basicsetup CHANGELOG
====================
This file is used to list changes made in each version of the basicsetup cookbook.
1.4.0
-----
- dist-upgrade from 14.04 to 16.04
- Remove `mrs` and `biomaj`
This version was produced by running `dist-upgrade` and `do-release-upgrade` on 1.3.0.
[Upgrade Instruction here](bayesvm_1.3.0_to_1.4.0.txt)
1.3.0
-----
- bioconda support
- conda 3.14.1
- docker install 1.12.3
- Galaxy run as docker container
- Add 4 Genome RNA-seq Workflow (HISAT2, Tophat2)
- Add 2 Sailfish v0.9 Workflow
0.1.0
-----
- [Manabu ISHII] - Initial release of basicsetup
- - -
Check the [Markdown Syntax Guide](http://daringfireball.net/projects/markdown/syntax) for help with Markdown.
The [Github Flavored Markdown page](http://github.github.com/github-flavored-markdown/) describes the differences between markdown on github and standard markdown.
| Create link instruction from 1.3.0 to 1.4.0 | Create link instruction from 1.3.0 to 1.4.0
| Markdown | apache-2.0 | BioDevOps/basicsetup,BioDevOps/basicsetup,BioDevOps/basicsetup,BioDevOps/basicsetup | markdown | ## Code Before:
basicsetup CHANGELOG
====================
This file is used to list changes made in each version of the basicsetup cookbook.
1.4.0
-----
- dist-upgrade from 14.04 to 16.04
- Remove `mrs` and `biomaj`
This version is executed `dist-upgrade` and `do-release-upgrade` on 1.3.0 .
1.3.0
-----
- bioconda support
- conda 3.14.1
- docker install 1.12.3
- Galaxy run as docker container
- Add 4 Genome RNA-seq Workflow (HISAT2, Tophat2)
- Add 2 Sailfish v0.9 Workflow
0.1.0
-----
- [Manabu ISHII] - Initial release of basicsetup
- - -
Check the [Markdown Syntax Guide](http://daringfireball.net/projects/markdown/syntax) for help with Markdown.
The [Github Flavored Markdown page](http://github.github.com/github-flavored-markdown/) describes the differences between markdown on github and standard markdown.
## Instruction:
Create link instruction from 1.3.0 to 1.4.0
## Code After:
basicsetup CHANGELOG
====================
This file is used to list changes made in each version of the basicsetup cookbook.
1.4.0
-----
- dist-upgrade from 14.04 to 16.04
- Remove `mrs` and `biomaj`
This version is executed `dist-upgrade` and `do-release-upgrade` on 1.3.0 .
[Upgrade Instruction here](bayesvm_1.3.0_to_1.4.0.txt)
1.3.0
-----
- bioconda support
- conda 3.14.1
- docker install 1.12.3
- Galaxy run as docker container
- Add 4 Genome RNA-seq Workflow (HISAT2, Tophat2)
- Add 2 Sailfish v0.9 Workflow
0.1.0
-----
- [Manabu ISHII] - Initial release of basicsetup
- - -
Check the [Markdown Syntax Guide](http://daringfireball.net/projects/markdown/syntax) for help with Markdown.
The [Github Flavored Markdown page](http://github.github.com/github-flavored-markdown/) describes the differences between markdown on github and standard markdown.
| basicsetup CHANGELOG
====================
This file is used to list changes made in each version of the basicsetup cookbook.
1.4.0
-----
- dist-upgrade from 14.04 to 16.04
- Remove `mrs` and `biomaj`
This version is executed `dist-upgrade` and `do-release-upgrade` on 1.3.0 .
+
+ [Upgrade Instruction here](bayesvm_1.3.0_to_1.4.0.txt)
1.3.0
-----
- bioconda support
- conda 3.14.1
- docker install 1.12.3
- Galaxy run as docker container
- Add 4 Genome RNA-seq Workflow (HISAT2, Tophat2)
- Add 2 Sailfish v0.9 Workflow
0.1.0
-----
- [Manabu ISHII] - Initial release of basicsetup
- - -
Check the [Markdown Syntax Guide](http://daringfireball.net/projects/markdown/syntax) for help with Markdown.
The [Github Flavored Markdown page](http://github.github.com/github-flavored-markdown/) describes the differences between markdown on github and standard markdown. | 2 | 0.068966 | 2 | 0 |
e11e9375accfd307b5bd339ab34956d0de7050db | spec/franklin/console_report_spec.cr | spec/franklin/console_report_spec.cr | require "../spec_helper"
module Franklin
class ConsoleReportLet
property subject : ConsoleReport
property result : String
property availability_description : AvailabilityDescription
property search_terms : String
property item : Item
def initialize(@search_terms = "crime and punishment")
@item = Item.random_fixture
availability = Availability.random_fixture
collated_results = {item => [availability]}
io = IO::Memory.new
@subject = ConsoleReport.new(search_terms, collated_results)
@result = subject.to_s(io).to_s
@availability_description = AvailabilityDescription.new(availability, item)
end
end
describe ConsoleReport do
describe ".to_s" do
it "includes search_term" do
let = ConsoleReportLet.new
let.result.should match(/#{let.search_terms}/)
end
it "includes item information" do
let = ConsoleReportLet.new
result = let.result
item = let.item
result.should match(/^#{item.title}/m)
result.should match(/^By #{item.author}/m)
result.should match(/^Format: #{item.format}/m)
end
it "includes availability information" do
let = ConsoleReportLet.new
result = let.result
availability_description = let.availability_description
result.should match(/^Availability:/m)
result.should match(/^ #{availability_description.to_s}/m)
result.should match(/^ #{availability_description.url}/m)
end
end
end
end
| require "../spec_helper"
module Franklin
describe ConsoleReport do
subject { ConsoleReport.new(search_terms, collated_results) }
let(:search_terms) { "crime and punishment" }
let(:collated_results) { {item => [availability]} }
let(:result) { subject.to_s(io).to_s }
let(:io) { IO::Memory.new }
let(:item) { Item.random_fixture }
let(:availability) { Availability.random_fixture }
let(:availability_description) { AvailabilityDescription.new(availability, item) }
describe ".to_s" do
it "includes search_term" do
expect(result).to match(/#{search_terms}/)
end
it "includes item information" do
expect(result).to match(/^#{item.title}/m)
expect(result).to match(/^By #{item.author}/m)
expect(result).to match(/^Format: #{item.format}/m)
end
it "includes availability information" do
expect(result).to match(/^Availability:/m)
expect(result).to match(/^ #{availability_description.to_s}/m)
expect(result).to match(/^ #{availability_description.url}/m)
end
end
end
end
| Revert "Expriment with a Let class to hold setup" | Revert "Expriment with a Let class to hold setup"
This reverts commit 4470fd705897344b0b2781de54cae53f844ea386.
| Crystal | mit | ylansegal/franklin.cr | crystal | ## Code Before:
require "../spec_helper"
module Franklin
class ConsoleReportLet
property subject : ConsoleReport
property result : String
property availability_description : AvailabilityDescription
property search_terms : String
property item : Item
def initialize(@search_terms = "crime and punishment")
@item = Item.random_fixture
availability = Availability.random_fixture
collated_results = {item => [availability]}
io = IO::Memory.new
@subject = ConsoleReport.new(search_terms, collated_results)
@result = subject.to_s(io).to_s
@availability_description = AvailabilityDescription.new(availability, item)
end
end
describe ConsoleReport do
describe ".to_s" do
it "includes search_term" do
let = ConsoleReportLet.new
let.result.should match(/#{let.search_terms}/)
end
it "includes item information" do
let = ConsoleReportLet.new
result = let.result
item = let.item
result.should match(/^#{item.title}/m)
result.should match(/^By #{item.author}/m)
result.should match(/^Format: #{item.format}/m)
end
it "includes availability information" do
let = ConsoleReportLet.new
result = let.result
availability_description = let.availability_description
result.should match(/^Availability:/m)
result.should match(/^ #{availability_description.to_s}/m)
result.should match(/^ #{availability_description.url}/m)
end
end
end
end
## Instruction:
Revert "Expriment with a Let class to hold setup"
This reverts commit 4470fd705897344b0b2781de54cae53f844ea386.
## Code After:
require "../spec_helper"
module Franklin
describe ConsoleReport do
subject { ConsoleReport.new(search_terms, collated_results) }
let(:search_terms) { "crime and punishment" }
let(:collated_results) { {item => [availability]} }
let(:result) { subject.to_s(io).to_s }
let(:io) { IO::Memory.new }
let(:item) { Item.random_fixture }
let(:availability) { Availability.random_fixture }
let(:availability_description) { AvailabilityDescription.new(availability, item) }
describe ".to_s" do
it "includes search_term" do
expect(result).to match(/#{search_terms}/)
end
it "includes item information" do
expect(result).to match(/^#{item.title}/m)
expect(result).to match(/^By #{item.author}/m)
expect(result).to match(/^Format: #{item.format}/m)
end
it "includes availability information" do
expect(result).to match(/^Availability:/m)
expect(result).to match(/^ #{availability_description.to_s}/m)
expect(result).to match(/^ #{availability_description.url}/m)
end
end
end
end
| require "../spec_helper"
module Franklin
- class ConsoleReportLet
- property subject : ConsoleReport
- property result : String
- property availability_description : AvailabilityDescription
- property search_terms : String
- property item : Item
+ describe ConsoleReport do
+ subject { ConsoleReport.new(search_terms, collated_results) }
+ let(:search_terms) { "crime and punishment" }
+ let(:collated_results) { {item => [availability]} }
+ let(:result) { subject.to_s(io).to_s }
+ let(:io) { IO::Memory.new }
+ let(:item) { Item.random_fixture }
+ let(:availability) { Availability.random_fixture }
+ let(:availability_description) { AvailabilityDescription.new(availability, item) }
- def initialize(@search_terms = "crime and punishment")
- @item = Item.random_fixture
- availability = Availability.random_fixture
- collated_results = {item => [availability]}
- io = IO::Memory.new
-
- @subject = ConsoleReport.new(search_terms, collated_results)
- @result = subject.to_s(io).to_s
- @availability_description = AvailabilityDescription.new(availability, item)
- end
- end
-
- describe ConsoleReport do
describe ".to_s" do
it "includes search_term" do
- let = ConsoleReportLet.new
- let.result.should match(/#{let.search_terms}/)
? - ^ ^^ --- ----
+ expect(result).to match(/#{search_terms}/)
? ++++ ^ + ^
end
it "includes item information" do
- let = ConsoleReportLet.new
- result = let.result
- item = let.item
-
- result.should match(/^#{item.title}/m)
? ^^ ---
+ expect(result).to match(/^#{item.title}/m)
? +++++++ + ^
- result.should match(/^By #{item.author}/m)
? ^^ ---
+ expect(result).to match(/^By #{item.author}/m)
? +++++++ + ^
- result.should match(/^Format: #{item.format}/m)
? ^^ ---
+ expect(result).to match(/^Format: #{item.format}/m)
? +++++++ + ^
end
it "includes availability information" do
- let = ConsoleReportLet.new
- result = let.result
- availability_description = let.availability_description
-
- result.should match(/^Availability:/m)
? ^^ ---
+ expect(result).to match(/^Availability:/m)
? +++++++ + ^
- result.should match(/^ #{availability_description.to_s}/m)
? ^^ ---
+ expect(result).to match(/^ #{availability_description.to_s}/m)
? +++++++ + ^
- result.should match(/^ #{availability_description.url}/m)
? ^^ ---
+ expect(result).to match(/^ #{availability_description.url}/m)
? +++++++ + ^
end
end
end
end | 51 | 1 | 16 | 35 |
b21c5a1b0f8d176cdd59c8131a316f142540d9ec | materials.py | materials.py | import color
def parse(mat_node):
materials = []
for node in mat_node:
materials.append(Material(node))
class Material:
''' it’s a material
'''
def __init__(self, node):
for c in node:
if c.tag == 'ambient':
self.ambient_color = color.parse(c[0])
self.ambient_color *= float(c.attrib['factor'].replace(',', '.'))
elif c.tag == 'diffuse':
self.diffuse_color = color.parse(c[0])
self.diffuse_color *= float(c.attrib['factor'].replace(',', '.'))
elif c.tag == 'specular':
self.specular_color = color.parse(c[0])
self.specular_color *= float(c.attrib['factor'].replace(',', '.'))
elif c.tag == 'reflection':
self.reflection_color = color.parse(c[0])
self.reflection_color *= float(c.attrib['factor'].replace(',', '.'))
| import color
def parse(mat_node):
materials = []
for node in mat_node:
materials.append(Material(node))
class Material:
''' it’s a material
'''
def __init__(self, node):
for c in node:
if c.tag == 'ambient':
self.ambient_color = color.parse(c[0])
self.ambient_color *= float(c.attrib['factor'])
elif c.tag == 'diffuse':
self.diffuse_color = color.parse(c[0])
self.diffuse_color *= float(c.attrib['factor'])
elif c.tag == 'specular':
self.specular_color = color.parse(c[0])
self.specular_color *= float(c.attrib['factor'])
elif c.tag == 'reflection':
self.reflection_color = color.parse(c[0])
self.reflection_color *= float(c.attrib['factor'])
| Remove replacement of commas by points | Remove replacement of commas by points
| Python | mit | cocreature/pytracer | python | ## Code Before:
import color
def parse(mat_node):
materials = []
for node in mat_node:
materials.append(Material(node))
class Material:
''' it’s a material
'''
def __init__(self, node):
for c in node:
if c.tag == 'ambient':
self.ambient_color = color.parse(c[0])
self.ambient_color *= float(c.attrib['factor'].replace(',', '.'))
elif c.tag == 'diffuse':
self.diffuse_color = color.parse(c[0])
self.diffuse_color *= float(c.attrib['factor'].replace(',', '.'))
elif c.tag == 'specular':
self.specular_color = color.parse(c[0])
self.specular_color *= float(c.attrib['factor'].replace(',', '.'))
elif c.tag == 'reflection':
self.reflection_color = color.parse(c[0])
self.reflection_color *= float(c.attrib['factor'].replace(',', '.'))
## Instruction:
Remove replacement of commas by points
## Code After:
import color
def parse(mat_node):
materials = []
for node in mat_node:
materials.append(Material(node))
class Material:
''' it’s a material
'''
def __init__(self, node):
for c in node:
if c.tag == 'ambient':
self.ambient_color = color.parse(c[0])
self.ambient_color *= float(c.attrib['factor'])
elif c.tag == 'diffuse':
self.diffuse_color = color.parse(c[0])
self.diffuse_color *= float(c.attrib['factor'])
elif c.tag == 'specular':
self.specular_color = color.parse(c[0])
self.specular_color *= float(c.attrib['factor'])
elif c.tag == 'reflection':
self.reflection_color = color.parse(c[0])
self.reflection_color *= float(c.attrib['factor'])
| import color
def parse(mat_node):
materials = []
for node in mat_node:
materials.append(Material(node))
class Material:
''' it’s a material
'''
def __init__(self, node):
for c in node:
if c.tag == 'ambient':
self.ambient_color = color.parse(c[0])
- self.ambient_color *= float(c.attrib['factor'].replace(',', '.'))
? ----------------- -
+ self.ambient_color *= float(c.attrib['factor'])
elif c.tag == 'diffuse':
self.diffuse_color = color.parse(c[0])
- self.diffuse_color *= float(c.attrib['factor'].replace(',', '.'))
? ----------------- -
+ self.diffuse_color *= float(c.attrib['factor'])
elif c.tag == 'specular':
self.specular_color = color.parse(c[0])
- self.specular_color *= float(c.attrib['factor'].replace(',', '.'))
? ----------------- -
+ self.specular_color *= float(c.attrib['factor'])
elif c.tag == 'reflection':
self.reflection_color = color.parse(c[0])
- self.reflection_color *= float(c.attrib['factor'].replace(',', '.'))
? ----------------- -
+ self.reflection_color *= float(c.attrib['factor']) | 8 | 0.307692 | 4 | 4 |
394b5d617d35b6f9f68dd5ef92ce59ef81f0a0a4 | README.md | README.md |
The official [JSON Schema](https://json-schema.org) for Buildkite’s [pipeline file format](https://buildkite.com/docs/pipelines/defining-steps).
## Testing
If you have [Node 10+](https://nodejs.org/en/) installed:
```shell
cd test
npm install && npm test
```
Or you can use [Docker Compose](https://docs.docker.com/compose/):
```shell
cd test
docker-compose run --rm tests
``` |
The official [JSON Schema](https://json-schema.org) for Buildkite’s [pipeline file format](https://buildkite.com/docs/pipelines/defining-steps).
* See [schema.json](schema.json)
## Testing
If you have [Node 10+](https://nodejs.org/en/) installed:
```shell
cd test
npm install && npm test
```
Or you can use [Docker Compose](https://docs.docker.com/compose/):
```shell
cd test
docker-compose run --rm tests
``` | Add link to the schema file | Add link to the schema file
| Markdown | mit | buildkite/pipeline-schema | markdown | ## Code Before:
The official [JSON Schema](https://json-schema.org) for Buildkite’s [pipeline file format](https://buildkite.com/docs/pipelines/defining-steps).
## Testing
If you have [Node 10+](https://nodejs.org/en/) installed:
```shell
cd test
npm install && npm test
```
Or you can use [Docker Compose](https://docs.docker.com/compose/):
```shell
cd test
docker-compose run --rm tests
```
## Instruction:
Add link to the schema file
## Code After:
The official [JSON Schema](https://json-schema.org) for Buildkite’s [pipeline file format](https://buildkite.com/docs/pipelines/defining-steps).
* See [schema.json](schema.json)
## Testing
If you have [Node 10+](https://nodejs.org/en/) installed:
```shell
cd test
npm install && npm test
```
Or you can use [Docker Compose](https://docs.docker.com/compose/):
```shell
cd test
docker-compose run --rm tests
``` |
The official [JSON Schema](https://json-schema.org) for Buildkite’s [pipeline file format](https://buildkite.com/docs/pipelines/defining-steps).
+
+ * See [schema.json](schema.json)
## Testing
If you have [Node 10+](https://nodejs.org/en/) installed:
```shell
cd test
npm install && npm test
```
Or you can use [Docker Compose](https://docs.docker.com/compose/):
```shell
cd test
docker-compose run --rm tests
``` | 2 | 0.111111 | 2 | 0 |
5826ce827c1a96c7fea8dd5a8aa76898802f1bb6 | test/test_helper.rb | test/test_helper.rb | require 'rubygems'
gem 'minitest'
require 'minitest/autorun'
require 'mocha_standalone'
class MiniTest::Unit::TestCase
include Mocha::API
def teardown
mocha_verify
mocha_teardown
end
end
| require 'rubygems'
gem 'minitest'
require 'minitest/autorun'
require 'mocha_standalone'
class MiniTest::Unit::TestCase
include Mocha::API
def setup
mocha_teardown
end
def teardown
mocha_verify
end
end
| Fix mocha teardown to not fail all following tests | Fix mocha teardown to not fail all following tests
| Ruby | mit | jasonroelofs/simple_aws | ruby | ## Code Before:
require 'rubygems'
gem 'minitest'
require 'minitest/autorun'
require 'mocha_standalone'
class MiniTest::Unit::TestCase
include Mocha::API
def teardown
mocha_verify
mocha_teardown
end
end
## Instruction:
Fix mocha teardown to not fail all following tests
## Code After:
require 'rubygems'
gem 'minitest'
require 'minitest/autorun'
require 'mocha_standalone'
class MiniTest::Unit::TestCase
include Mocha::API
def setup
mocha_teardown
end
def teardown
mocha_verify
end
end
| require 'rubygems'
gem 'minitest'
require 'minitest/autorun'
require 'mocha_standalone'
class MiniTest::Unit::TestCase
include Mocha::API
+ def setup
+ mocha_teardown
+ end
+
def teardown
mocha_verify
- mocha_teardown
end
end | 5 | 0.357143 | 4 | 1 |
56229bd0ab09e00acc8751308c67ae36d0a84187 | packages/re/readcsv.yaml | packages/re/readcsv.yaml | homepage: https://github.com/george-steel/readcsv
changelog-type: ''
hash: 5aaba5267372b2efd8c516914167dcbc6f18935ef13d7a7453ac4a3ebcdefa0c
test-bench-deps: {}
maintainer: george.steel@gmail.com
synopsis: Lightweight CSV parser/emitter based on ReadP
changelog: ''
basic-deps:
base: ! '>=4 && <5'
all-versions:
- '0.1.1'
author: George Steel
latest: '0.1.1'
description-type: haddock
description: ! 'Parses and emits tables of strings formatted according to RFC 4180,
/"The common Format and MIME Type for Comma-Separated Values (CSV) Files"/.
Has no dependencies on parsec or any libraries other than base.'
license-name: MIT
| homepage: https://github.com/george-steel/readcsv
changelog-type: ''
hash: ba0cb06393f608d7f7cc1de9797235202bc2e766478c09bd566a864d65bb8fd1
test-bench-deps: {}
maintainer: george.steel@gmail.com
synopsis: Lightweight CSV parser/emitter based on ReadP
changelog: ''
basic-deps:
base: ! '>=4 && <5'
all-versions:
- '0.1.1'
author: George Steel
latest: '0.1.1'
description-type: haddock
description: ! 'Parses and emits tables of strings formatted according to RFC 4180,
/"The common Format and MIME Type for Comma-Separated Values (CSV) Files"/.
Has no dependencies on parsec or any libraries other than base.'
license-name: MIT
| Update from Hackage at 2017-03-14T09:41:14Z | Update from Hackage at 2017-03-14T09:41:14Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: https://github.com/george-steel/readcsv
changelog-type: ''
hash: 5aaba5267372b2efd8c516914167dcbc6f18935ef13d7a7453ac4a3ebcdefa0c
test-bench-deps: {}
maintainer: george.steel@gmail.com
synopsis: Lightweight CSV parser/emitter based on ReadP
changelog: ''
basic-deps:
base: ! '>=4 && <5'
all-versions:
- '0.1.1'
author: George Steel
latest: '0.1.1'
description-type: haddock
description: ! 'Parses and emits tables of strings formatted according to RFC 4180,
/"The common Format and MIME Type for Comma-Separated Values (CSV) Files"/.
Has no dependencies on parsec or any libraries other than base.'
license-name: MIT
## Instruction:
Update from Hackage at 2017-03-14T09:41:14Z
## Code After:
homepage: https://github.com/george-steel/readcsv
changelog-type: ''
hash: ba0cb06393f608d7f7cc1de9797235202bc2e766478c09bd566a864d65bb8fd1
test-bench-deps: {}
maintainer: george.steel@gmail.com
synopsis: Lightweight CSV parser/emitter based on ReadP
changelog: ''
basic-deps:
base: ! '>=4 && <5'
all-versions:
- '0.1.1'
author: George Steel
latest: '0.1.1'
description-type: haddock
description: ! 'Parses and emits tables of strings formatted according to RFC 4180,
/"The common Format and MIME Type for Comma-Separated Values (CSV) Files"/.
Has no dependencies on parsec or any libraries other than base.'
license-name: MIT
| homepage: https://github.com/george-steel/readcsv
changelog-type: ''
- hash: 5aaba5267372b2efd8c516914167dcbc6f18935ef13d7a7453ac4a3ebcdefa0c
+ hash: ba0cb06393f608d7f7cc1de9797235202bc2e766478c09bd566a864d65bb8fd1
test-bench-deps: {}
maintainer: george.steel@gmail.com
synopsis: Lightweight CSV parser/emitter based on ReadP
changelog: ''
basic-deps:
base: ! '>=4 && <5'
all-versions:
- '0.1.1'
author: George Steel
latest: '0.1.1'
description-type: haddock
description: ! 'Parses and emits tables of strings formatted according to RFC 4180,
/"The common Format and MIME Type for Comma-Separated Values (CSV) Files"/.
Has no dependencies on parsec or any libraries other than base.'
license-name: MIT | 2 | 0.1 | 1 | 1 |
18e1eb2d9ad26fc515eab4bb5027e7ef90589910 | sdk/python-sdk/python-sdk.rst | sdk/python-sdk/python-sdk.rst | Python SDK
==========
Quick links
+++++++++++
* Download last version: `0.0.5 <https://github.com/angus-ai/angus-sdk-python/archive/0.0.5.tar.gz>`_
* `Source <https://github.com/angus-ai/angus-sdk-python>`_
Hello World
+++++++++++
.. code-block:: python
> import angus
> conn = angus.connect()
> dummy_service = conn.services.get_service('dummy', version=1)
> job = dummy_service.process({'echo': 'Hello world!'})
> print(job.result)
{
"url": "https://gate.angus.ai/services/dummy/1/jobs/db77e78e-0dd8-11e5-a743-19d95545b6ca",
"status": 201,
"echo": "Hello world!"
}
Installation
++++++++++++
Angus-sdk is listed in PyPI and can be installed with pip or easy_install.
.. parsed-literal::
$ pip install angus-sdk-python
Documentation
+++++++++++++
.. toctree::
:maxdepth: 2
guide
rest
cloud
| Python SDK API Reference
========================
.. toctree::
:maxdepth: 2
guide
rest
cloud
| Remove outdated links and duplicate info | Remove outdated links and duplicate info
| reStructuredText | apache-2.0 | angus-ai/angus-doc,angus-ai/angus-doc,angus-ai/angus-doc,lordlothar99/angus-doc,lordlothar99/angus-doc | restructuredtext | ## Code Before:
Python SDK
==========
Quick links
+++++++++++
* Download last version: `0.0.5 <https://github.com/angus-ai/angus-sdk-python/archive/0.0.5.tar.gz>`_
* `Source <https://github.com/angus-ai/angus-sdk-python>`_
Hello World
+++++++++++
.. code-block:: python
> import angus
> conn = angus.connect()
> dummy_service = conn.services.get_service('dummy', version=1)
> job = dummy_service.process({'echo': 'Hello world!'})
> print(job.result)
{
"url": "https://gate.angus.ai/services/dummy/1/jobs/db77e78e-0dd8-11e5-a743-19d95545b6ca",
"status": 201,
"echo": "Hello world!"
}
Installation
++++++++++++
Angus-sdk is listed in PyPI and can be installed with pip or easy_install.
.. parsed-literal::
$ pip install angus-sdk-python
Documentation
+++++++++++++
.. toctree::
:maxdepth: 2
guide
rest
cloud
## Instruction:
Remove outdated links and duplicate info
## Code After:
Python SDK API Reference
========================
.. toctree::
:maxdepth: 2
guide
rest
cloud
| + Python SDK API Reference
+ ========================
- Python SDK
- ==========
-
- Quick links
- +++++++++++
-
- * Download last version: `0.0.5 <https://github.com/angus-ai/angus-sdk-python/archive/0.0.5.tar.gz>`_
- * `Source <https://github.com/angus-ai/angus-sdk-python>`_
-
- Hello World
- +++++++++++
-
- .. code-block:: python
-
- > import angus
- > conn = angus.connect()
- > dummy_service = conn.services.get_service('dummy', version=1)
- > job = dummy_service.process({'echo': 'Hello world!'})
- > print(job.result)
- {
- "url": "https://gate.angus.ai/services/dummy/1/jobs/db77e78e-0dd8-11e5-a743-19d95545b6ca",
- "status": 201,
- "echo": "Hello world!"
- }
-
-
- Installation
- ++++++++++++
-
- Angus-sdk is listed in PyPI and can be installed with pip or easy_install.
-
- .. parsed-literal::
-
- $ pip install angus-sdk-python
-
- Documentation
- +++++++++++++
.. toctree::
:maxdepth: 2
guide
rest
cloud
| 39 | 0.866667 | 2 | 37 |
37c3d87a325ae7d06032cc8a146bb02a6295cb66 | README.md | README.md | <a href="http://insidethecpu.com"></a>
# PaySharp.NET
[](https://ci.appveyor.com/project/daishisystems/daishi-paysharp)
A PayPal SDK targeting the .NET Framework (4.5+), written in C#.
<p align="center">

</p> | <a href="http://insidethecpu.com"></a>
# PaySharp.NET
[](https://ci.appveyor.com/project/daishisystems/daishi-paysharp)
A PayPal SDK targeting the .NET Framework (4.5+), written in C#.
 | Revert "Adjusted project icon image alignment" | Revert "Adjusted project icon image alignment"
This reverts commit babfe49e8c50fe51669babf7593922a1e28d3b5f.
| Markdown | mit | daishisystems/Daishi.PaySharp,daishisystems/Daishi.PayPal | markdown | ## Code Before:
<a href="http://insidethecpu.com"></a>
# PaySharp.NET
[](https://ci.appveyor.com/project/daishisystems/daishi-paysharp)
A PayPal SDK targeting the .NET Framework (4.5+), written in C#.
<p align="center">

</p>
## Instruction:
Revert "Adjusted project icon image alignment"
This reverts commit babfe49e8c50fe51669babf7593922a1e28d3b5f.
## Code After:
<a href="http://insidethecpu.com"></a>
# PaySharp.NET
[](https://ci.appveyor.com/project/daishisystems/daishi-paysharp)
A PayPal SDK targeting the .NET Framework (4.5+), written in C#.
 | <a href="http://insidethecpu.com"></a>
# PaySharp.NET
[](https://ci.appveyor.com/project/daishisystems/daishi-paysharp)
A PayPal SDK targeting the .NET Framework (4.5+), written in C#.
- <p align="center">

- </p> | 2 | 0.25 | 0 | 2 |
d5b6e3a0919a32d8cd4c54e5331bfa8c0c2d9782 | README.md | README.md | ID3 Version 2 Audio Metadata parser
=====================================
A simple streaming parser for retrieving ID3 metadata from an mp3 file
### Install
npm install id3v2-parser
### Use
The parser is simply a stream in objectMode, so you can pipe and binary data into it and it will spit out tag objects
var ID3 = require('id3v2-parser')
, stream = require('fs').createReadStream('./my-audio.mp3')
var parser = stream.pipe(new ID3());
parser.on('data', function(tag){
console.log(tag.type) // => 'TPE1'
console.log(tag.value) // => 'Bastille'
})
use in conjunction with my ID3v1 parser and [mpeg frame parser](https://github.com/theporchrat/mpeg-frame-parser/) for more data.
| ID3 Version 2 Audio Metadata parser
=====================================
A simple streaming parser for retrieving ID3 metadata from an mp3 file
### Install
npm install id3v2-parser
### Use
The parser is simply a stream in objectMode, so you can pipe and binary data into it and it will spit out tag objects
var ID3 = require('id3v2-parser')
, stream = require('fs').createReadStream('./my-audio.mp3')
var parser = stream.pipe(new ID3());
parser.on('data', function(tag){
console.log(tag.type) // => 'TPE1'
console.log(tag.value) // => 'Bastille'
})
use in conjunction with my
[ID3v1 parser](https://github.com/jquense/ID3v1-info-parser) and
[mpeg frame parser](https://github.com/theporchrat/mpeg-frame-parser/) for more
data.
| Add link to ID3v1 parser | Add link to ID3v1 parser
So I don't have to go looking for it :P
| Markdown | mit | IbpTeam/node-ID3v2-info-parser,jquense/ID3v2-info-parser | markdown | ## Code Before:
ID3 Version 2 Audio Metadata parser
=====================================
A simple streaming parser for retrieving ID3 metadata from an mp3 file
### Install
npm install id3v2-parser
### Use
The parser is simply a stream in objectMode, so you can pipe and binary data into it and it will spit out tag objects
var ID3 = require('id3v2-parser')
, stream = require('fs').createReadStream('./my-audio.mp3')
var parser = stream.pipe(new ID3());
parser.on('data', function(tag){
console.log(tag.type) // => 'TPE1'
console.log(tag.value) // => 'Bastille'
})
use in conjunction with my ID3v1 parser and [mpeg frame parser](https://github.com/theporchrat/mpeg-frame-parser/) for more data.
## Instruction:
Add link to ID3v1 parser
So I don't have to go looking for it :P
## Code After:
ID3 Version 2 Audio Metadata parser
=====================================
A simple streaming parser for retrieving ID3 metadata from an mp3 file
### Install
npm install id3v2-parser
### Use
The parser is simply a stream in objectMode, so you can pipe and binary data into it and it will spit out tag objects
var ID3 = require('id3v2-parser')
, stream = require('fs').createReadStream('./my-audio.mp3')
var parser = stream.pipe(new ID3());
parser.on('data', function(tag){
console.log(tag.type) // => 'TPE1'
console.log(tag.value) // => 'Bastille'
})
use in conjunction with my
[ID3v1 parser](https://github.com/jquense/ID3v1-info-parser) and
[mpeg frame parser](https://github.com/theporchrat/mpeg-frame-parser/) for more
data.
| ID3 Version 2 Audio Metadata parser
=====================================
A simple streaming parser for retrieving ID3 metadata from an mp3 file
### Install
npm install id3v2-parser
### Use
The parser is simply a stream in objectMode, so you can pipe and binary data into it and it will spit out tag objects
var ID3 = require('id3v2-parser')
, stream = require('fs').createReadStream('./my-audio.mp3')
var parser = stream.pipe(new ID3());
parser.on('data', function(tag){
console.log(tag.type) // => 'TPE1'
console.log(tag.value) // => 'Bastille'
})
+ use in conjunction with my
+ [ID3v1 parser](https://github.com/jquense/ID3v1-info-parser) and
- use in conjunction with my ID3v1 parser and [mpeg frame parser](https://github.com/theporchrat/mpeg-frame-parser/) for more data.
? -------------------------------------------- ------
+ [mpeg frame parser](https://github.com/theporchrat/mpeg-frame-parser/) for more
+ data. | 5 | 0.2 | 4 | 1 |
b14d4a37082579270c565e53bfe330097442f0af | pages/projects.html | pages/projects.html | ---
layout: page
title: Projects
description: "Hobby, Fun, my Open Source projects, studies and practice."
permalink: /projects/
---
<div class="project-list">
<h1 class="heading">{{ page.title }}</h1>
<span>{{ page.description }}</span>
<article class="projects">
<ul>
{% assign proj = site.data.projects | group_by: 'year' %}
{% for year in proj reversed %}
<p class="year">{{ year.name }}</p>
{% for p in year.items %}
<li>
{% if p.github_id %}
<a title="https://github.com/{{ site.author.github }}/{{ p.github_id }}" href="https://github.com/{{ site.author.github }}/{{ p.github_id }}" target="_blank">{{ p.name }}</a>
{% endif %}
{% if p.bitbucket_id %}
<a title="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" href="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" target="_blank">{{ p.name }}</a>
{% endif %}
{% if p.site %}
<a title="{{ p.site }}" href="{{ p.site }}" target="_blank">{{ p.name }}</a>
{% endif %}
</li>
{% endfor %}
{% endfor %}
</ul>
</article>
</div>
| ---
layout: page
title: Projects
description: "Hobby, Fun, my Open Source projects, studies and practice."
permalink: /projects/
---
<div class="projects">
<h1 class="heading">{{ page.title }}</h1>
<span>{{ page.description}}</span>
{% assign proj = site.data.projects | group_by: 'year' %}
{% for year in proj reversed %}
<div class="project-header">
<h2 class="year">{{ year.name }}</h2>
</div>
<ul>
{% for p in year.items %}
<li>
{% if p.github_id %}
<a title="https://github.com/{{ site.author.github }}/{{ p.github_id }}" href="https://github.com/{{ site.author.github }}/{{ p.github_id }}" target="_blank">{{ p.name }}</a>
{% endif %}
{% if p.bitbucket_id %}
<a title="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" href="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" target="_blank">{{ p.name }}</a>
{% endif %}
{% if p.site %}
<a title="{{ p.site }}" href="{{ p.site }}" target="_blank">{{ p.name }}</a>
{% endif %}
</li>
{% endfor %}
</ul>
{% endfor %}
</div> | Add spacer and new color year | Add spacer and new color year
| HTML | mit | fcschmidt/fcschmidt.github.io | html | ## Code Before:
---
layout: page
title: Projects
description: "Hobby, Fun, my Open Source projects, studies and practice."
permalink: /projects/
---
<div class="project-list">
<h1 class="heading">{{ page.title }}</h1>
<span>{{ page.description }}</span>
<article class="projects">
<ul>
{% assign proj = site.data.projects | group_by: 'year' %}
{% for year in proj reversed %}
<p class="year">{{ year.name }}</p>
{% for p in year.items %}
<li>
{% if p.github_id %}
<a title="https://github.com/{{ site.author.github }}/{{ p.github_id }}" href="https://github.com/{{ site.author.github }}/{{ p.github_id }}" target="_blank">{{ p.name }}</a>
{% endif %}
{% if p.bitbucket_id %}
<a title="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" href="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" target="_blank">{{ p.name }}</a>
{% endif %}
{% if p.site %}
<a title="{{ p.site }}" href="{{ p.site }}" target="_blank">{{ p.name }}</a>
{% endif %}
</li>
{% endfor %}
{% endfor %}
</ul>
</article>
</div>
## Instruction:
Add spacer and new color year
## Code After:
---
layout: page
title: Projects
description: "Hobby, Fun, my Open Source projects, studies and practice."
permalink: /projects/
---
<div class="projects">
<h1 class="heading">{{ page.title }}</h1>
<span>{{ page.description}}</span>
{% assign proj = site.data.projects | group_by: 'year' %}
{% for year in proj reversed %}
<div class="project-header">
<h2 class="year">{{ year.name }}</h2>
</div>
<ul>
{% for p in year.items %}
<li>
{% if p.github_id %}
<a title="https://github.com/{{ site.author.github }}/{{ p.github_id }}" href="https://github.com/{{ site.author.github }}/{{ p.github_id }}" target="_blank">{{ p.name }}</a>
{% endif %}
{% if p.bitbucket_id %}
<a title="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" href="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" target="_blank">{{ p.name }}</a>
{% endif %}
{% if p.site %}
<a title="{{ p.site }}" href="{{ p.site }}" target="_blank">{{ p.name }}</a>
{% endif %}
</li>
{% endfor %}
</ul>
{% endfor %}
</div> | ---
layout: page
title: Projects
description: "Hobby, Fun, my Open Source projects, studies and practice."
permalink: /projects/
---
+
- <div class="project-list">
? --- -
+ <div class="projects">
<h1 class="heading">{{ page.title }}</h1>
- <span>{{ page.description }}</span>
? -
+ <span>{{ page.description}}</span>
- <article class="projects">
+ {% assign proj = site.data.projects | group_by: 'year' %}
+ {% for year in proj reversed %}
+ <div class="project-header">
+ <h2 class="year">{{ year.name }}</h2>
+ </div>
<ul>
- {% assign proj = site.data.projects | group_by: 'year' %}
- {% for year in proj reversed %}
- <p class="year">{{ year.name }}</p>
- {% for p in year.items %}
? ----
+ {% for p in year.items %}
- <li>
? ----
+ <li>
- {% if p.github_id %}
? ----
+ {% if p.github_id %}
- <a title="https://github.com/{{ site.author.github }}/{{ p.github_id }}" href="https://github.com/{{ site.author.github }}/{{ p.github_id }}" target="_blank">{{ p.name }}</a>
? ----
+ <a title="https://github.com/{{ site.author.github }}/{{ p.github_id }}" href="https://github.com/{{ site.author.github }}/{{ p.github_id }}" target="_blank">{{ p.name }}</a>
- {% endif %}
? ----
+ {% endif %}
- {% if p.bitbucket_id %}
? ----
+ {% if p.bitbucket_id %}
- <a title="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" href="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" target="_blank">{{ p.name }}</a>
? ----
+ <a title="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" href="https://bitbucket.org/{{ site.author.bitbucket }}/{{ p.bitbucket_id }}" target="_blank">{{ p.name }}</a>
- {% endif %}
? ----
+ {% endif %}
- {% if p.site %}
? ----
+ {% if p.site %}
- <a title="{{ p.site }}" href="{{ p.site }}" target="_blank">{{ p.name }}</a>
? ----
+ <a title="{{ p.site }}" href="{{ p.site }}" target="_blank">{{ p.name }}</a>
- {% endif %}
- </li>
- {% endfor %}
? --
+ {% endif %}
? +
+ </li>
{% endfor %}
</ul>
- </article>
+ {% endfor %}
</div>
- | 42 | 1.272727 | 21 | 21 |
082db7996fbf4ffd9519b830a059c2d1eff08283 | spec/throttled_queue.rb | spec/throttled_queue.rb | require 'em/throttled_queue'
require 'minitest/autorun'
require 'minitest/spec'
class << MiniTest::Spec
def it_enhanced(*args, &block)
it_original(*args) do
EM::run_block(&block)
end
end
alias_method :it_original, :it
alias_method :it, :it_enhanced
end
describe EM::ThrottledQueue do
it "should work properly" do
EM::reactor_running?.must_equal true
end
end | require 'em/throttled_queue'
require 'minitest/autorun'
require 'minitest/spec'
class << MiniTest::Spec
def it_enhanced(*args, &block)
it_original(*args) do
EM::run(&block)
end
end
alias_method :it_original, :it
alias_method :it, :it_enhanced
end
describe EM::ThrottledQueue do
it "should pop items in FIFO order" do
queue = EM::ThrottledQueue.new(1, 0.2)
pushed_items = [1, 2, 3, 4]
popped_items = []
queue.push(*pushed_items)
1.upto(4) do |i|
queue.pop do |j|
popped_items << j
if i == 4
popped_items.must_equal pushed_items
EM::stop
end
end
end
end
it "should pop items within the rate limit" do
ticks = 0
deqs = 0
queue = EM::ThrottledQueue.new(1, 0.2) # 5 deqs/s
queue.push(*(1..1000).to_a)
queue.size.must_equal 1000
EM::add_timer(2) do
(5..10).must_include deqs
(ticks > 100).must_equal true # need good margin
EM::stop
end
ticker = proc do |me|
ticks += 1
queue.pop { deqs += 1 }
EM::next_tick { me.call(me) }
end
ticker.call(ticker)
end
end | Add specs for FIFO order and rate-limit | Add specs for FIFO order and rate-limit
| Ruby | mit | Burgestrand/em-throttled_queue | ruby | ## Code Before:
require 'em/throttled_queue'
require 'minitest/autorun'
require 'minitest/spec'
class << MiniTest::Spec
def it_enhanced(*args, &block)
it_original(*args) do
EM::run_block(&block)
end
end
alias_method :it_original, :it
alias_method :it, :it_enhanced
end
describe EM::ThrottledQueue do
it "should work properly" do
EM::reactor_running?.must_equal true
end
end
## Instruction:
Add specs for FIFO order and rate-limit
## Code After:
require 'em/throttled_queue'
require 'minitest/autorun'
require 'minitest/spec'
class << MiniTest::Spec
def it_enhanced(*args, &block)
it_original(*args) do
EM::run(&block)
end
end
alias_method :it_original, :it
alias_method :it, :it_enhanced
end
describe EM::ThrottledQueue do
it "should pop items in FIFO order" do
queue = EM::ThrottledQueue.new(1, 0.2)
pushed_items = [1, 2, 3, 4]
popped_items = []
queue.push(*pushed_items)
1.upto(4) do |i|
queue.pop do |j|
popped_items << j
if i == 4
popped_items.must_equal pushed_items
EM::stop
end
end
end
end
it "should pop items within the rate limit" do
ticks = 0
deqs = 0
queue = EM::ThrottledQueue.new(1, 0.2) # 5 deqs/s
queue.push(*(1..1000).to_a)
queue.size.must_equal 1000
EM::add_timer(2) do
(5..10).must_include deqs
(ticks > 100).must_equal true # need good margin
EM::stop
end
ticker = proc do |me|
ticks += 1
queue.pop { deqs += 1 }
EM::next_tick { me.call(me) }
end
ticker.call(ticker)
end
end | require 'em/throttled_queue'
require 'minitest/autorun'
require 'minitest/spec'
class << MiniTest::Spec
def it_enhanced(*args, &block)
it_original(*args) do
- EM::run_block(&block)
? ------
+ EM::run(&block)
end
end
alias_method :it_original, :it
alias_method :it, :it_enhanced
end
describe EM::ThrottledQueue do
- it "should work properly" do
- EM::reactor_running?.must_equal true
+ it "should pop items in FIFO order" do
+ queue = EM::ThrottledQueue.new(1, 0.2)
+ pushed_items = [1, 2, 3, 4]
+ popped_items = []
+ queue.push(*pushed_items)
+
+ 1.upto(4) do |i|
+ queue.pop do |j|
+ popped_items << j
+
+ if i == 4
+ popped_items.must_equal pushed_items
+ EM::stop
+ end
+ end
+ end
+ end
+
+ it "should pop items within the rate limit" do
+ ticks = 0
+ deqs = 0
+
+ queue = EM::ThrottledQueue.new(1, 0.2) # 5 deqs/s
+ queue.push(*(1..1000).to_a)
+ queue.size.must_equal 1000
+
+ EM::add_timer(2) do
+ (5..10).must_include deqs
+ (ticks > 100).must_equal true # need good margin
+ EM::stop
+ end
+
+ ticker = proc do |me|
+ ticks += 1
+ queue.pop { deqs += 1 }
+ EM::next_tick { me.call(me) }
+ end
+
+ ticker.call(ticker)
end
end | 43 | 2.15 | 40 | 3 |
c99f857ab310e1ae50c60e47daf9793cfea1603d | core/app/controllers/spree/base_controller.rb | core/app/controllers/spree/base_controller.rb | require 'cancan'
class Spree::BaseController < ApplicationController
include Spree::Core::ControllerHelpers
include Spree::Core::RespondWith
# graceful error handling for cancan authorization exceptions
rescue_from CanCan::AccessDenied do |exception|
return unauthorized
end
private
def current_spree_user
if Spree.user_class && Spree.current_user_method
send(Spree.current_user_method)
else
Object.new
end
end
# Needs to be overriden so that we use Spree's Ability rather than anyone else's.
def current_ability
@current_ability ||= Spree::Ability.new(current_spree_user)
end
# Redirect as appropriate when an access request fails. The default action is to redirect to the login screen.
# Override this method in your controllers if you want to have special behavior in case the user is not authorized
# to access the requested action. For example, a popup window might simply close itself.
def unauthorized
respond_to do |format|
format.html do
if current_user
flash.now[:error] = t(:authorization_failure)
render 'spree/shared/unauthorized', :layout => '/spree/layouts/spree_application', :status => 401
else
store_location
redirect_to spree.login_path and return
end
end
format.xml do
request_http_basic_authentication 'Web Password'
end
format.json do
render :text => "Not Authorized \n", :status => 401
end
end
end
end
| require 'cancan'
class Spree::BaseController < ApplicationController
include Spree::Core::ControllerHelpers
include Spree::Core::RespondWith
# graceful error handling for cancan authorization exceptions
rescue_from CanCan::AccessDenied do |exception|
return unauthorized
end
private
def current_spree_user
if Spree.user_class && Spree.current_user_method
send(Spree.current_user_method)
else
Object.new
end
end
# Needs to be overriden so that we use Spree's Ability rather than anyone else's.
def current_ability
@current_ability ||= Spree::Ability.new(current_spree_user)
end
# Redirect as appropriate when an access request fails. The default action is to redirect to the login screen.
# Override this method in your controllers if you want to have special behavior in case the user is not authorized
# to access the requested action. For example, a popup window might simply close itself.
def unauthorized
respond_to do |format|
format.html do
if current_user
flash.now[:error] = t(:authorization_failure)
render 'spree/shared/unauthorized', :layout => '/spree/layouts/spree_application', :status => 401
else
store_location
redirect_to spree_login_path and return
end
end
format.xml do
request_http_basic_authentication 'Web Password'
end
format.json do
render :text => "Not Authorized \n", :status => 401
end
end
end
def spree_login_path
spree.login_path
end
end
| Define a spree_login_path method in BaseController that is overridable | Define a spree_login_path method in BaseController that is overridable
 | Ruby | bsd-3-clause | wolfieorama/spree,quentinuys/spree,alvinjean/spree,lsirivong/solidus,CJMrozek/spree,sideci-sample/sideci-sample-spree,Arpsara/solidus,hoanghiep90/spree,yiqing95/spree,jparr/spree,ahmetabdi/spree,ramkumar-kr/spree,APohio/spree,kitwalker12/spree,delphsoft/spree-store-ballchair,mindvolt/spree,Engeltj/spree,fahidnasir/spree,caiqinghua/spree,athal7/solidus,beni55/spree,calvinl/spree,builtbybuffalo/spree,Engeltj/spree,pervino/spree,Hawaiideveloper/shoppingcart,softr8/spree,TimurTarasenko/spree,delphsoft/spree-store-ballchair,brchristian/spree,jparr/spree,Hates/spree,LBRapid/spree,ayb/spree,maybii/spree,useiichi/spree,lyzxsc/spree,carlesjove/spree,pervino/solidus,Hates/spree,Boomkat/spree,sunny2601/spree,biagidp/spree,Machpowersystems/spree_mach,pervino/spree,HealthWave/spree,spree/spree,reidblomquist/spree,edgward/spree,sfcgeorge/spree,groundctrl/spree,lzcabrera/spree-1-3-stable,Boomkat/spree,sliaquat/spree,tailic/spree,moneyspyder/spree,knuepwebdev/FloatTubeRodHolders,watg/spree,agient/agientstorefront,PhoenixTeam/spree_phoenix,jeffboulet/spree,Senjai/solidus,alepore/spree,lyzxsc/spree,volpejoaquin/spree,scottcrawford03/solidus,Senjai/spree,ayb/spree,caiqinghua/spree,abhishekjain16/spree,jaspreet21anand/spree,dandanwei/spree,archSeer/spree,dandanwei/spree,njerrywerry/spree,sfcgeorge/spree,vcavallo/spree,pervino/spree,archSeer/spree,grzlus/spree,lsirivong/solidus,dandanwei/spree,assembledbrands/spree,JDutil/spree,miyazawatomoka/spree,woboinc/spree,judaro13/spree-fork,rbngzlv/spree,tancnle/spree,DynamoMTL/spree,lsirivong/spree,Ropeney/spree,ujai/spree,berkes/spree,PhoenixTeam/spree_phoenix,freerunningtech/spree,sfcgeorge/spree,Antdesk/karpal-spree,shekibobo/spree,orenf/spree,karlitxo/spree,DarkoP/spree,Engeltj/spree,adaddeo/spree,grzlus/solidus,dandanwei/spree,tancnle/spree,surfdome/spree,dafontaine/spree,jparr/spree,vulk/spree,tailic/spree,Machpowersystems/spree_mach,jsurdilla/solidus,madetech/spree,joanblake/spree,jspizziri/spree,Mayvenn/spree,Mayvenn/spree,Nevensoft/spree,Boomkat/spree,raow/spree,shioyama/spree,jhawthorn/spree,DynamoMTL/spree,carlesjove/spree,adaddeo/spree,CiscoCloud/spree,yiqing95/spree,richardnuno/solidus,tesserakt/clean_spree,ckk-scratch/solidus,robodisco/spree,volpejoaquin/spree,scottcrawford03/solidus,pervino/solidus,jordan-brough/solidus,athal7/solidus,azranel/spree,rakibulislam/spree,firman/spree,FadliKun/spree,adaddeo/spree,dafontaine/spree,jimblesm/spree,ramkumar-kr/spree,biagidp/spree,shekibobo/spree,assembledbrands/spree,sunny2601/spree,sliaquat/spree,piousbox/spree,odk211/spree,azclick/spree,yushine/spree,bjornlinder/Spree,carlesjove/spree,radarseesradar/spree,omarsar/spree,rajeevriitm/spree,CJMrozek/spree,patdec/spree,madetech/spree,azranel/spree,Lostmyname/spree,devilcoders/solidus,bjornlinder/Spree,lsirivong/solidus,patdec/spree,yiqing95/spree,jsurdilla/solidus,AgilTec/spree,zamiang/spree,welitonfreitas/spree,firman/spree,CJMrozek/spree,rajeevriitm/spree,xuewenfei/solidus,Migweld/spree,vinayvinsol/spree,maybii/spree,kewaunited/spree,surfdome/spree,sliaquat/spree,JuandGirald/spree,azranel/spree,vmatekole/spree,wolfieorama/spree,yushine/spree,jordan-brough/solidus,ramkumar-kr/spree,radarseesradar/spree,bonobos/solidus,alejandromangione/spree,abhishekjain16/spree,dotandbo/spree,dafontaine/spree,vulk/spree,Nevensoft/spree,brchristian/spree,camelmasa/spree,Kagetsuki/spree,joanblake/spree,fahidnasir/spree,CiscoCloud/spree,alejandromangione/spree,wolfieorama/spree,adaddeo/spree,nooysters/spree,dotandbo/spree,ahmetabdi/spree,jimblesm/spree,cutefrank/spree,Kagetsuki/spree,ayb/spree,yomishra/pce,rajeevriitm/spree,quentinuys/spree,camelmasa/spree,ujai/spree,abhishekjain16/spree,Mayvenn/spree,codesavvy/sandbox,athal7/solidus,edgward/spree,brchristian/spree,camelmasa/spree,tancnle/spree,keatonrow/spree,pervino/spree,TrialGuides/spree,athal7/solidus,njerrywerry/spree,RatioClothing/spree,robodisco/spree,kitwalker12/spree,Engeltj/spree,KMikhaylovCTG/spree,shioyama/spree,vcavallo/spree,Lostmyname/spree,Kagetsuki/spree,sliaquat/spree,useiichi/spree,calvinl/spree,sfcgeorge/spree,volpejoaquin/spree,Migweld/spree,reidblomquist/spree,miyazawatomoka/spree,CiscoCloud/spree,bonobos/solidus,camelmasa/spree,tancnle/spree,imella/spree,kewaunited/spree,dotandbo/spree,RatioClothing/spree,welitonfreitas/spree,calvinl/spree,welitonfreitas/spree,odk211/spree,delphsoft/spree-store-ballchair,gregoryrikson/spree-sample,LBRapid/spree,DynamoMTL/spree,radarseesradar/spree,net2b/spree,reidblomquist/spree,surfdome/spree,project-eutopia/spree,TrialGuides/spree,NerdsvilleCEO/spree,woboinc/spree,cutefrank/spree,fahidnasir/spree,JDutil/spree,DarkoP/spree,jparr/spree,project-eutopia/spree,shekibobo/spree,knuepwebdev/FloatTubeRodHolders,mindvolt/spree,alepore/spree,vmatekole/spree,omarsar/spree,jspizziri/spree,trigrass2/spree,pervino/spree,Hates/spree,StemboltHQ/spree,hifly/spree,orenf/spree,keatonrow/spree,Senjai/spree,pjmj777/spree,HealthWave/spree,firman/spree,Lostmyname/spree,dafontaine/spree,madetech/spree,beni55/spree,jasonfb/spree,agient/agientstorefront,judaro13/spree-fork,zamiang/spree,jordan-brough/solidus,rakibulislam/spree,vinsol/spree,tomash/spree,xuewenfei/solidus,alvinjean/spree,jimblesm/spree,forkata/solidus,pjmj777/spree,vinayvinsol/spree,jaspreet21anand/spree,bonobos/solidus,AgilTec/spree,vinsol/spree,bricesanchez/spree,alejandromangione/spree,hifly/spree,sunny2601/spree,kewaunited/spree,keatonrow/spree,gautamsawhney/spree,Migweld/spree,ckk-scratch/solidus,groundctrl/spree,jhawthorn/spree,TrialGuides/spree,CiscoCloud/spree,urimikhli/spree,net2b/spree,jordan-brough/spree,thogg4/spree,Senjai/solidus,Hates/spree,Nevensoft/spree,berkes/spree,rakibulislam/spree,hoanghiep90/spree,CJMrozek/spree,imella/spree,jordan-brough/solidus,Nevensoft/spree,odk211/spree,lsirivong/spree,derekluo/spree,AgilTec/spree,FadliKun/spree,pjmj777/spree,mleglise/spree,beni55/spree,richardnuno/solidus,lyzxsc/spree,jordan-brough/spree,Hawaiideveloper/shoppingcart,progsri/spree,forkata/solidus,KMikhaylovCTG/spree,locomotivapro/spree,shaywood2/spree,xuewenfei/solidus,azclick/spree,hoanghiep90/spree,reinaris/spree,calvinl/spree,madetech/spree,yiqing95/spree,watg/spree,codesavvy/sandbox,vinsol/spree,devilcoders/solidus,urimikhli/spree,forkata/solidus,StemboltHQ/spree,Migweld/spree,sideci-sample/sideci-sample-spree,lyzxsc/spree,patdec/spree,watg/spree,edgward/spree,project-eutopia/spree,archSeer/spree,pulkit21/spree,rbngzlv/spree,alejandromangione/spree,zaeznet/spree,priyank-gupta/spree,DarkoP/spree,piousbox/spree,locomotivapro/spree,TimurTarasenko/spree,Ropeney/spree,JuandGirald/spree,grzlus/solidus,tesserakt/clean_spree,Mayvenn/spree,vinayvinsol/spree,Machpowersystems/spree_mach,jasonfb/spree,mindvolt/spree,urimikhli/spree,knuepwebdev/FloatTubeRodHolders,imella/spree,useiichi/spree,derekluo/spree,tomash/spree,reinaris/spree,groundctrl/spree,softr8/spree,tesserakt/clean_spree,jhawthorn/spree,softr8/spree,softr8/spree,locomotivapro/spree,jaspreet21anand/spree,keatonrow/spree,siddharth28/spree,yushine/spree,trigrass2/spree,biagidp/spree,priyank-gupta/spree,woboinc/spree,hoanghiep90/spree,net2b/spree,quentinuys/spree,FadliKun/spree,Ropeney/spree,sunny2601/spree,rbngzlv/spree,reidblomquist/spree,jasonfb/spree,azclick/spree,gautamsawhney/spree,pulkit21/spree,Senjai/solidus,delphsoft/spree-store-ballchair,pulkit21/spree,SadTreeFriends/spree,thogg4/spree,shaywood2/spree,Kagetsuki/spree,dotandbo/spree,njerrywerry/spree,cutefrank/spree,Antdesk/karpal-spree,jeffboulet/spree,grzlus/solidus,Antdesk/karpal-spree,richardnuno/solidus,berkes/spree,berkes/spree,njerrywerry/spree,vmatekole/spree,rajeevriitm/spree,Boomkat/spree,gregoryrikson/spree-sample,grzlus/spree,alvinjean/spree,welitonfreitas/spree,degica/spree,raow/spree,joanblake/spree,alvinjean/spree,jsurdilla/solidus,xuewenfei/solidus,derekluo/spree,RatioClothing/spree,alepore/spree,odk211/spree,tomash/spree,karlitxo/spree,freerunningtech/spree,TimurTarasenko/spree,zaeznet/spree,net2b/spree,beni55/spree,vinayvinsol/spree,StemboltHQ/spree,pulkit21/spree,lsirivong/spree,DynamoMTL/spree,carlesjove/spree,moneyspyder/spree,piousbox/spree,ahmetabdi/spree,sfcgeorge/spree,lzcabrera/spree-1-3-stable,builtbybuffalo/spree,yomishra/pce,judaro13/spree-fork,reinaris/spree,shaywood2/spree,mleglise/spree,radarseesradar/spree,agient/agientstorefront,zamiang/spree,maybii/spree,vulk/spree,vulk/spree,thogg4/spree,jspizziri/spree,zaeznet/spree,nooysters/spree,firman/spree,HealthWave/spree,lsirivong/solidus,omarsar/spree,jeffboulet/spree,zamiang/spree,nooysters/spree,raow/spree,bjornlinder/Spree,robodisco/spree,edgward/spree,KMikhaylovCTG/spree,siddharth28/spree,gautamsawhney/spree,caiqinghua/spree,SadTreeFriends/spree,project-eutopia/spree,vcavallo/spree,freerunningtech/spree,codesavvy/sandbox,priyank-gupta/spree,shekibobo/spree,degica/spree,Hawaiideveloper/shoppingcart,KMikhaylovCTG/spree,kitwalker12/spree,assembledbrands/spree,maybii/spree,NerdsvilleCEO/spree,yomishra/pce,APohio/spree,pervino/solidus,PhoenixTeam/spree_phoenix,jspizziri/spree,LBRapid/spree,surfdome/spree,progsri/spree,moneyspyder/spree,AgilTec/spree,hifly/spree,quentinuys/spree,gregoryrikson/spree-sample,patdec/spree,azranel/spree,Arpsara/solidus,builtbybuffalo/spree,miyazawatomoka/spree,SadTreeFriends/spree,progsri/spree,siddharth28/spree,raow/spree,trigrass2/spree,ahmetabdi/spree,Hawaiideveloper/shoppingcart,volpejoaquin/spree,progsri/spree,miyazawatomoka/spree,vcavallo/spree,trigrass2/spree,cutefrank/spree,PhoenixTeam/spree_phoenix,karlitxo/spree,Lostmyname/spree,jparr/spree,TimurTarasenko/spree,Senjai/solidus,richardnuno/solidus,tesserakt/clean_spree,priyank-gupta/spree,moneyspyder/spree,grzlus/spree,mleglise/spree,degica/spree,vinsol/spree,useiichi/spree,lzcabrera/spree-1-3-stable,NerdsvilleCEO/spree,nooysters/spree,codesavvy/sandbox,Arpsara/solidus,jimblesm/spree,orenf/spree,jasonfb/spree,wolfieorama/spree,APohio/spree,zaeznet/spree,karlitxo/spree,archSeer/spree,grzlus/spree,kewaunited/spree,yushine/spree,JDutil/spree,Senjai/spree,Hates/spree | ruby | ## Code Before:
require 'cancan'
class Spree::BaseController < ApplicationController
include Spree::Core::ControllerHelpers
include Spree::Core::RespondWith
# graceful error handling for cancan authorization exceptions
rescue_from CanCan::AccessDenied do |exception|
return unauthorized
end
private
def current_spree_user
if Spree.user_class && Spree.current_user_method
send(Spree.current_user_method)
else
Object.new
end
end
# Needs to be overriden so that we use Spree's Ability rather than anyone else's.
def current_ability
@current_ability ||= Spree::Ability.new(current_spree_user)
end
# Redirect as appropriate when an access request fails. The default action is to redirect to the login screen.
# Override this method in your controllers if you want to have special behavior in case the user is not authorized
# to access the requested action. For example, a popup window might simply close itself.
def unauthorized
respond_to do |format|
format.html do
if current_user
flash.now[:error] = t(:authorization_failure)
render 'spree/shared/unauthorized', :layout => '/spree/layouts/spree_application', :status => 401
else
store_location
redirect_to spree.login_path and return
end
end
format.xml do
request_http_basic_authentication 'Web Password'
end
format.json do
render :text => "Not Authorized \n", :status => 401
end
end
end
end
## Instruction:
Define a spree_login_path method in BaseController that is overridable
## Code After:
require 'cancan'
class Spree::BaseController < ApplicationController
include Spree::Core::ControllerHelpers
include Spree::Core::RespondWith
# graceful error handling for cancan authorization exceptions
rescue_from CanCan::AccessDenied do |exception|
return unauthorized
end
private
def current_spree_user
if Spree.user_class && Spree.current_user_method
send(Spree.current_user_method)
else
Object.new
end
end
# Needs to be overriden so that we use Spree's Ability rather than anyone else's.
def current_ability
@current_ability ||= Spree::Ability.new(current_spree_user)
end
# Redirect as appropriate when an access request fails. The default action is to redirect to the login screen.
# Override this method in your controllers if you want to have special behavior in case the user is not authorized
# to access the requested action. For example, a popup window might simply close itself.
def unauthorized
respond_to do |format|
format.html do
if current_user
flash.now[:error] = t(:authorization_failure)
render 'spree/shared/unauthorized', :layout => '/spree/layouts/spree_application', :status => 401
else
store_location
redirect_to spree_login_path and return
end
end
format.xml do
request_http_basic_authentication 'Web Password'
end
format.json do
render :text => "Not Authorized \n", :status => 401
end
end
end
def spree_login_path
spree.login_path
end
end
| require 'cancan'
class Spree::BaseController < ApplicationController
include Spree::Core::ControllerHelpers
include Spree::Core::RespondWith
# graceful error handling for cancan authorization exceptions
rescue_from CanCan::AccessDenied do |exception|
return unauthorized
end
private
def current_spree_user
if Spree.user_class && Spree.current_user_method
send(Spree.current_user_method)
else
Object.new
end
end
# Needs to be overriden so that we use Spree's Ability rather than anyone else's.
def current_ability
@current_ability ||= Spree::Ability.new(current_spree_user)
end
# Redirect as appropriate when an access request fails. The default action is to redirect to the login screen.
# Override this method in your controllers if you want to have special behavior in case the user is not authorized
# to access the requested action. For example, a popup window might simply close itself.
def unauthorized
respond_to do |format|
format.html do
if current_user
flash.now[:error] = t(:authorization_failure)
render 'spree/shared/unauthorized', :layout => '/spree/layouts/spree_application', :status => 401
else
store_location
- redirect_to spree.login_path and return
? ^
+ redirect_to spree_login_path and return
? ^
end
end
format.xml do
request_http_basic_authentication 'Web Password'
end
format.json do
render :text => "Not Authorized \n", :status => 401
end
end
end
+
+ def spree_login_path
+ spree.login_path
+ end
end | 6 | 0.125 | 5 | 1 |
d3675b777dc95f296f26bdd9b8b05311ceac6ba5 | cyder/core/system/migrations/0006_rename_table_from_system_key_value_to_system_kv.py | cyder/core/system/migrations/0006_rename_table_from_system_key_value_to_system_kv.py | from south.db import db
from south.v2 import SchemaMigration
class Migration(SchemaMigration):
def forwards(self, orm):
db.rename_table('system_key_value', 'system_kv')
def backwards(self, orm):
db.rename_table('system_kv', 'system_key_value')
| import datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):
def forwards(self, orm):
db.rename_table('system_key_value', 'system_kv')
def backwards(self, orm):
db.rename_table('system_kv', 'system_key_value')
models = {
'system.system': {
'Meta': {'object_name': 'System', 'db_table': "'system'"},
'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '255'})
},
'system.systemkeyvalue': {
'Meta': {'unique_together': "(('key', 'value', 'system'),)", 'object_name': 'SystemKeyValue', 'db_table': "'system_kv'"},
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'is_quoted': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'key': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'system': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['system.System']"}),
'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
}
}
complete_apps = ['system']
| Add ORM freeze thing to SystemKeyValue migration | Add ORM freeze thing to SystemKeyValue migration
| Python | bsd-3-clause | akeym/cyder,murrown/cyder,zeeman/cyder,akeym/cyder,OSU-Net/cyder,murrown/cyder,OSU-Net/cyder,OSU-Net/cyder,zeeman/cyder,akeym/cyder,murrown/cyder,zeeman/cyder,drkitty/cyder,zeeman/cyder,drkitty/cyder,akeym/cyder,drkitty/cyder,murrown/cyder,drkitty/cyder,OSU-Net/cyder | python | ## Code Before:
from south.db import db
from south.v2 import SchemaMigration
class Migration(SchemaMigration):
def forwards(self, orm):
db.rename_table('system_key_value', 'system_kv')
def backwards(self, orm):
db.rename_table('system_kv', 'system_key_value')
## Instruction:
Add ORM freeze thing to SystemKeyValue migration
## Code After:
import datetime
from south.db import db
from south.v2 import SchemaMigration
from django.db import models
class Migration(SchemaMigration):
def forwards(self, orm):
db.rename_table('system_key_value', 'system_kv')
def backwards(self, orm):
db.rename_table('system_kv', 'system_key_value')
models = {
'system.system': {
'Meta': {'object_name': 'System', 'db_table': "'system'"},
'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
'name': ('django.db.models.fields.CharField', [], {'max_length': '255'})
},
'system.systemkeyvalue': {
'Meta': {'unique_together': "(('key', 'value', 'system'),)", 'object_name': 'SystemKeyValue', 'db_table': "'system_kv'"},
'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
'is_quoted': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
'key': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
'system': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['system.System']"}),
'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
}
}
complete_apps = ['system']
| + import datetime
from south.db import db
from south.v2 import SchemaMigration
+ from django.db import models
class Migration(SchemaMigration):
def forwards(self, orm):
db.rename_table('system_key_value', 'system_kv')
def backwards(self, orm):
db.rename_table('system_kv', 'system_key_value')
+
+ models = {
+ 'system.system': {
+ 'Meta': {'object_name': 'System', 'db_table': "'system'"},
+ 'created': ('django.db.models.fields.DateTimeField', [], {'auto_now_add': 'True', 'blank': 'True'}),
+ 'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'modified': ('django.db.models.fields.DateTimeField', [], {'auto_now': 'True', 'blank': 'True'}),
+ 'name': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ },
+ 'system.systemkeyvalue': {
+ 'Meta': {'unique_together': "(('key', 'value', 'system'),)", 'object_name': 'SystemKeyValue', 'db_table': "'system_kv'"},
+ 'id': ('django.db.models.fields.AutoField', [], {'primary_key': 'True'}),
+ 'is_quoted': ('django.db.models.fields.BooleanField', [], {'default': 'False'}),
+ 'key': ('django.db.models.fields.CharField', [], {'max_length': '255'}),
+ 'system': ('django.db.models.fields.related.ForeignKey', [], {'to': "orm['system.System']"}),
+ 'value': ('django.db.models.fields.CharField', [], {'max_length': '255'})
+ }
+ }
+
+ complete_apps = ['system'] | 22 | 2 | 22 | 0 |
edeb33d8ad27664ae7dae4bd089ce418ae337fe9 | .travis.yml | .travis.yml | language: ruby
before_install:
- sudo apt-get install libclamav-dev clamav-data
rvm:
- "1.9.3"
- "2.0.0"
env:
global:
- NOKOGIRI_USE_SYSTEM_LIBRARIES=true
notifications:
email:
recipients:
- "ul-dlt-applications@lists.psu.edu"
- "michael@psu.edu"
on_success: "change"
on_failure: "always"
irc:
channels:
- "irc.freenode.org#scholarsphere"
- "irc.freenode.org#projecthydra"
template:
- "%{repository}//%{branch}@%{commit} by %{author}: %{message} - %{build_url}"
services:
- redis-server
| language: ruby
before_install:
- sudo apt-get install libclamav-dev clamav-data clamav-daemon
- sudo freshclam -v
rvm:
- "2.0.0"
- "2.1.0-preview2"
env:
global:
- NOKOGIRI_USE_SYSTEM_LIBRARIES=true
notifications:
email:
recipients:
- "ul-dlt-hydra@lists.psu.edu"
on_success: "change"
on_failure: "always"
irc:
channels:
- "irc.freenode.org#scholarsphere"
- "irc.freenode.org#projecthydra"
template:
- "%{repository}//%{branch}@%{commit} by %{author}: %{message} - %{build_url}"
services:
- redis-server
| Update Travis config: run freshclam to avoid ClamAV error; remove Ruby 1.9.3 support; start testing Ruby 2.1; simplify email recipient list | Update Travis config: run freshclam to avoid ClamAV error; remove Ruby 1.9.3 support; start testing Ruby 2.1; simplify email recipient list
| YAML | apache-2.0 | samvera/hyrax,samvera/hyrax,samvera/hyrax,samvera/hyrax | yaml | ## Code Before:
language: ruby
before_install:
- sudo apt-get install libclamav-dev clamav-data
rvm:
- "1.9.3"
- "2.0.0"
env:
global:
- NOKOGIRI_USE_SYSTEM_LIBRARIES=true
notifications:
email:
recipients:
- "ul-dlt-applications@lists.psu.edu"
- "michael@psu.edu"
on_success: "change"
on_failure: "always"
irc:
channels:
- "irc.freenode.org#scholarsphere"
- "irc.freenode.org#projecthydra"
template:
- "%{repository}//%{branch}@%{commit} by %{author}: %{message} - %{build_url}"
services:
- redis-server
## Instruction:
Update Travis config: run freshclam to avoid ClamAV error; remove Ruby 1.9.3 support; start testing Ruby 2.1; simplify email recipient list
## Code After:
language: ruby
before_install:
- sudo apt-get install libclamav-dev clamav-data clamav-daemon
- sudo freshclam -v
rvm:
- "2.0.0"
- "2.1.0-preview2"
env:
global:
- NOKOGIRI_USE_SYSTEM_LIBRARIES=true
notifications:
email:
recipients:
- "ul-dlt-hydra@lists.psu.edu"
on_success: "change"
on_failure: "always"
irc:
channels:
- "irc.freenode.org#scholarsphere"
- "irc.freenode.org#projecthydra"
template:
- "%{repository}//%{branch}@%{commit} by %{author}: %{message} - %{build_url}"
services:
- redis-server
| language: ruby
before_install:
- - sudo apt-get install libclamav-dev clamav-data
+ - sudo apt-get install libclamav-dev clamav-data clamav-daemon
? ++++++++++++++
+ - sudo freshclam -v
rvm:
- - "1.9.3"
- "2.0.0"
+ - "2.1.0-preview2"
env:
global:
- NOKOGIRI_USE_SYSTEM_LIBRARIES=true
notifications:
email:
recipients:
- - "ul-dlt-applications@lists.psu.edu"
? -----------
+ - "ul-dlt-hydra@lists.psu.edu"
? ++++
- - "michael@psu.edu"
on_success: "change"
on_failure: "always"
irc:
channels:
- "irc.freenode.org#scholarsphere"
- "irc.freenode.org#projecthydra"
template:
- "%{repository}//%{branch}@%{commit} by %{author}: %{message} - %{build_url}"
services:
- redis-server | 8 | 0.333333 | 4 | 4 |
52d636ee606678b152a169b714530158da6d37e1 | backend/createGoTerms.sh.in | backend/createGoTerms.sh.in |
set -o pipefail -e
terms="http://geneontology.org/ontology/go-basic.obo"
wget -q "$terms" -O - | <<<CMD_AWK>>> '
BEGIN {
OFS = " "
}
/^\[.*\]$/ { # start of a record
type = $0
split("", record, ":")
next
}
!/^$/ { # a field in a record
key = $0; sub(":.*", "", key)
value = $0; sub("[^ ]*: ", "", value)
record[key] = value
}
/^$/ { # end of a record
if (id != 0 && type == "[Term]") {
sub("_", " ", record["namespace"])
print id, record["id"], record["namespace"], record["name"]
}
id++
type = ""
}'
|
set -o pipefail -e
terms="http://geneontology.org/ontology/go-basic.obo"
wget -q "$terms" -O - | <<<CMD_AWK>>> '
BEGIN {
OFS = " "
id=1
}
/^\[.*\]$/ { # start of a record
type = $0
alt_ctr = 0
split("", record, ":")
split("", ids, ":")
next
}
/^(alt_id|id).*$/ { # an id or alt_id field in a record
value = $0; sub("[^ ]*: ", "", value)
record["id"][alt_ctr] = value
alt_ctr++
next
}
!/^$/ { # a field in a record
key = $0; sub(":.*", "", key)
value = $0; sub("[^ ]*: ", "", value)
record[key] = value
}
/^$/ { # end of a record
if (type == "[Term]") {
sub("_", " ", record["namespace"])
for(i in record["id"]){
print id, record["id"][i], record["namespace"], record["name"]
id++
}
}
type = ""
}'
| Duplicate go_terms entries that have alternative IDs | Duplicate go_terms entries that have alternative IDs
| unknown | mit | unipept/unipept,bmesuere/uni-test,bmesuere/uni-test,bmesuere/uni-test,unipept/unipept,unipept/unipept,unipept/unipept,bmesuere/uni-test,unipept/unipept,bmesuere/uni-test | unknown | ## Code Before:
set -o pipefail -e
terms="http://geneontology.org/ontology/go-basic.obo"
wget -q "$terms" -O - | <<<CMD_AWK>>> '
BEGIN {
OFS = " "
}
/^\[.*\]$/ { # start of a record
type = $0
split("", record, ":")
next
}
!/^$/ { # a field in a record
key = $0; sub(":.*", "", key)
value = $0; sub("[^ ]*: ", "", value)
record[key] = value
}
/^$/ { # end of a record
if (id != 0 && type == "[Term]") {
sub("_", " ", record["namespace"])
print id, record["id"], record["namespace"], record["name"]
}
id++
type = ""
}'
## Instruction:
Duplicate go_terms entries that have alternative IDs
## Code After:
set -o pipefail -e
terms="http://geneontology.org/ontology/go-basic.obo"
wget -q "$terms" -O - | <<<CMD_AWK>>> '
BEGIN {
OFS = " "
id=1
}
/^\[.*\]$/ { # start of a record
type = $0
alt_ctr = 0
split("", record, ":")
split("", ids, ":")
next
}
/^(alt_id|id).*$/ { # an id or alt_id field in a record
value = $0; sub("[^ ]*: ", "", value)
record["id"][alt_ctr] = value
alt_ctr++
next
}
!/^$/ { # a field in a record
key = $0; sub(":.*", "", key)
value = $0; sub("[^ ]*: ", "", value)
record[key] = value
}
/^$/ { # end of a record
if (type == "[Term]") {
sub("_", " ", record["namespace"])
for(i in record["id"]){
print id, record["id"][i], record["namespace"], record["name"]
id++
}
}
type = ""
}'
|
set -o pipefail -e
terms="http://geneontology.org/ontology/go-basic.obo"
wget -q "$terms" -O - | <<<CMD_AWK>>> '
BEGIN {
OFS = " "
+ id=1
}
/^\[.*\]$/ { # start of a record
type = $0
+ alt_ctr = 0
split("", record, ":")
+ split("", ids, ":")
+ next
+ }
+ /^(alt_id|id).*$/ { # an id or alt_id field in a record
+ value = $0; sub("[^ ]*: ", "", value)
+ record["id"][alt_ctr] = value
+ alt_ctr++
next
}
!/^$/ { # a field in a record
key = $0; sub(":.*", "", key)
value = $0; sub("[^ ]*: ", "", value)
record[key] = value
}
/^$/ { # end of a record
- if (id != 0 && type == "[Term]") {
? -----------
+ if (type == "[Term]") {
sub("_", " ", record["namespace"])
+ for(i in record["id"]){
- print id, record["id"], record["namespace"], record["name"]
+ print id, record["id"][i], record["namespace"], record["name"]
? ++++ +++
+ id++
+ }
}
- id++
type = ""
}' | 17 | 0.62963 | 14 | 3 |
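The awk change in this record makes the OBO parser emit one output row per `id` *and* per `alt_id` of each `[Term]` stanza, while still skipping `[Typedef]` stanzas. The same duplication logic can be sketched in Python (the two-stanza sample below is illustrative; the real input is the full `go-basic.obo` download):

```python
def parse_obo_terms(text):
    """Yield (row_id, go_id, namespace, name) rows, one per id AND alt_id,
    mirroring the awk script above."""
    rows, row_id = [], 1
    for stanza in text.strip().split("\n\n"):
        lines = stanza.splitlines()
        if lines[0] != "[Term]":          # only [Term] records are printed
            continue
        ids, fields = [], {}
        for line in lines[1:]:
            key, _, value = line.partition(": ")
            if key in ("id", "alt_id"):   # collect primary and alternative ids
                ids.append(value)
            else:
                fields[key] = value
        namespace = fields.get("namespace", "").replace("_", " ")
        for go_id in ids:                 # one output row per collected id
            rows.append((row_id, go_id, namespace, fields.get("name", "")))
            row_id += 1
    return rows

sample = """\
[Term]
id: GO:0000003
alt_id: GO:0019952
namespace: biological_process
name: reproduction

[Typedef]
id: part_of
name: part of
"""
rows = parse_obo_terms(sample)
assert rows == [
    (1, "GO:0000003", "biological process", "reproduction"),
    (2, "GO:0019952", "biological process", "reproduction"),
]
```

As in the awk version, the sequential row counter advances for every emitted row, so alternative IDs get their own numbers rather than sharing the primary ID's.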
c3a606f18d3a727babc24a47c9a3d3ef08093390 | source/_posts/An-bash-curl-and-awk-example.md | source/_posts/An-bash-curl-and-awk-example.md | ---
title: 'An bash, curl and awk example'
date: 2018-08-03 09:00:06
tags:
---
This is an example how in a bash script, curl and awk work together:
<script src="https://gist.github.com/TerrenceMiao/2a89c5941503eb25fe15b86a333fdc49.js"></script>
``` Bash
#!/bin/bash
## Call Session API and get Authorization bearer token
token=$(curl -D - -X POST \
https://api.paradise.org/session/v1 \
-H "Accept: application/json" \
-H "Cache-Control: no-cache" \
-H "Content-Type: application/json" \
-d '{
"username": "master@paradise.org",
"password": "Password Go Go Go"
}' | awk 'BEGIN {FS=": "}/^Authorization/{print $2}')
## Migrate each member into new culture pot
for memberId in $(cat member-ids)
do
curl -D - -X POST \
https://api.paradise.org/cultrue/v2/migrate \
-H "Accept: application/json" \
-H "Authorization: $token" \
-H "Cache-Control: no-cache" \
-H "Content-Type: application/json" \
-H "organisation-id: $memberId" \
-d "{}"
echo ""
echo "---------------------------------------------------------------------------------------------------"
echo ""
done
``` | ---
title: 'An bash, curl and awk example'
date: 2018-08-03 09:00:06
tags:
---
This is an example how in a bash script, curl and awk work together, on gist.
<script src="https://gist.github.com/TerrenceMiao/2a89c5941503eb25fe15b86a333fdc49.js"></script> | Update post "An bash, curl and awk example" | Update post "An bash, curl and awk example"
| Markdown | mit | TerrenceMiao/blog | markdown | ## Code Before:
---
title: 'An bash, curl and awk example'
date: 2018-08-03 09:00:06
tags:
---
This is an example how in a bash script, curl and awk work together:
<script src="https://gist.github.com/TerrenceMiao/2a89c5941503eb25fe15b86a333fdc49.js"></script>
``` Bash
#!/bin/bash
## Call Session API and get Authorization bearer token
token=$(curl -D - -X POST \
https://api.paradise.org/session/v1 \
-H "Accept: application/json" \
-H "Cache-Control: no-cache" \
-H "Content-Type: application/json" \
-d '{
"username": "master@paradise.org",
"password": "Password Go Go Go"
}' | awk 'BEGIN {FS=": "}/^Authorization/{print $2}')
## Migrate each member into new culture pot
for memberId in $(cat member-ids)
do
curl -D - -X POST \
https://api.paradise.org/cultrue/v2/migrate \
-H "Accept: application/json" \
-H "Authorization: $token" \
-H "Cache-Control: no-cache" \
-H "Content-Type: application/json" \
-H "organisation-id: $memberId" \
-d "{}"
echo ""
echo "---------------------------------------------------------------------------------------------------"
echo ""
done
```
## Instruction:
Update post "An bash, curl and awk example"
## Code After:
---
title: 'An bash, curl and awk example'
date: 2018-08-03 09:00:06
tags:
---
This is an example how in a bash script, curl and awk work together, on gist.
<script src="https://gist.github.com/TerrenceMiao/2a89c5941503eb25fe15b86a333fdc49.js"></script> | ---
title: 'An bash, curl and awk example'
date: 2018-08-03 09:00:06
tags:
---
- This is an example how in a bash script, curl and awk work together:
? ^
+ This is an example how in a bash script, curl and awk work together, on gist.
? ^^^^^^^^^^
<script src="https://gist.github.com/TerrenceMiao/2a89c5941503eb25fe15b86a333fdc49.js"></script>
-
- ``` Bash
- #!/bin/bash
-
- ## Call Session API and get Authorization bearer token
- token=$(curl -D - -X POST \
- https://api.paradise.org/session/v1 \
- -H "Accept: application/json" \
- -H "Cache-Control: no-cache" \
- -H "Content-Type: application/json" \
- -d '{
- "username": "master@paradise.org",
- "password": "Password Go Go Go"
- }' | awk 'BEGIN {FS=": "}/^Authorization/{print $2}')
-
- ## Migrate each member into new culture pot
- for memberId in $(cat member-ids)
- do
- curl -D - -X POST \
- https://api.paradise.org/cultrue/v2/migrate \
- -H "Accept: application/json" \
- -H "Authorization: $token" \
- -H "Cache-Control: no-cache" \
- -H "Content-Type: application/json" \
- -H "organisation-id: $memberId" \
- -d "{}"
-
- echo ""
- echo "---------------------------------------------------------------------------------------------------"
- echo ""
- done
- ``` | 34 | 0.829268 | 1 | 33 |
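The bash script in this record pulls the `Authorization` header out of a dumped response with `awk 'BEGIN {FS=": "}/^Authorization/{print $2}'`. That header extraction can be sketched in Python as well; the response text and token below are made-up placeholders, not output from any real API:

```python
def header_value(response_text, name):
    """Pull one header value out of a dumped HTTP response (curl -D - style),
    like awk 'BEGIN {FS=": "} /^Authorization/ {print $2}'."""
    for line in response_text.splitlines():
        key, sep, value = line.partition(": ")
        # Skip the status line and the body: neither matches a "Name: value" header.
        if sep and key.lower() == name.lower():
            return value.strip()
    return None

dump = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: application/json\r\n"
    "Authorization: Bearer abc123\r\n"
    "\r\n"
    '{"ok": true}'
)
token = header_value(dump, "Authorization")
assert token == "Bearer abc123"
```

Unlike the line-anchored awk pattern, this version compares header names case-insensitively, which is closer to how HTTP headers are defined.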
4a4173918f0809c557b44e6b8eed041884ea4fe2 | README.md | README.md |
Collection of vulnerable and fixed C# synthetic test cases expressing specific flaws.
Written in Python 3
## Dependencies
- Jinja2 (depends on MarkupSafe)
Installation instructions [here](http://jinja.pocoo.org/docs/dev/intro/#installation)
We encourage you to use [pip](https://pip.pypa.io/en/stable/) to install these dependencies (choose one):
- [sudo] pip install -r requirements.txt (as root, system-wide)
- pip install --user -r requirements.txt (only for your user)
## Execute it
```sh
$ python3 main.py
```
## Discussion
For discussion please send me an email at: Bertrand 'dot' STIVALET 'at' gmail.com
|
Collection of vulnerable and fixed C# synthetic test cases expressing specific flaws.
Written in Python 3
## Dependencies
- Jinja2 (depends on MarkupSafe)
Installation instructions [here](http://jinja.pocoo.org/docs/dev/intro/#installation)
We encourage you to use [pip](https://pip.pypa.io/en/stable/) to install these dependencies (choose one):
- [sudo] pip3 install -r requirements.txt (as root, system-wide)
- pip3 install --user -r requirements.txt (only for your user)
You can also install this dependency with your package manager (if such a package exists in your distribution) :
- [sudo] aptitude install python3-jinja (for GNU/Linux Debian for example)
## Execute it
```sh
$ python3 main.py
```
## Discussion
For discussion please send me an email at: Bertrand 'dot' STIVALET 'at' gmail.com
| Add a different way to install the dependencies | Add a different way to install the dependencies
| Markdown | mit | stivalet/C-Sharp-Vuln-test-suite-gen,stivalet/C-Sharp-Vuln-test-suite-gen | markdown | ## Code Before:
Collection of vulnerable and fixed C# synthetic test cases expressing specific flaws.
Written in Python 3
## Dependencies
- Jinja2 (depends on MarkupSafe)
Installation instructions [here](http://jinja.pocoo.org/docs/dev/intro/#installation)
We encourage you to use [pip](https://pip.pypa.io/en/stable/) to install these dependencies (choose one):
- [sudo] pip install -r requirements.txt (as root, system-wide)
- pip install --user -r requirements.txt (only for your user)
## Execute it
```sh
$ python3 main.py
```
## Discussion
For discussion please send me an email at: Bertrand 'dot' STIVALET 'at' gmail.com
## Instruction:
Add a different way to install the dependencies
## Code After:
Collection of vulnerable and fixed C# synthetic test cases expressing specific flaws.
Written in Python 3
## Dependencies
- Jinja2 (depends on MarkupSafe)
Installation instructions [here](http://jinja.pocoo.org/docs/dev/intro/#installation)
We encourage you to use [pip](https://pip.pypa.io/en/stable/) to install these dependencies (choose one):
- [sudo] pip3 install -r requirements.txt (as root, system-wide)
- pip3 install --user -r requirements.txt (only for your user)
You can also install this dependency with your package manager (if such a package exists in your distribution) :
- [sudo] aptitude install python3-jinja (for GNU/Linux Debian for example)
## Execute it
```sh
$ python3 main.py
```
## Discussion
For discussion please send me an email at: Bertrand 'dot' STIVALET 'at' gmail.com
|
Collection of vulnerable and fixed C# synthetic test cases expressing specific flaws.
Written in Python 3
## Dependencies
- Jinja2 (depends on MarkupSafe)
Installation instructions [here](http://jinja.pocoo.org/docs/dev/intro/#installation)
We encourage you to use [pip](https://pip.pypa.io/en/stable/) to install these dependencies (choose one):
- - [sudo] pip install -r requirements.txt (as root, system-wide)
+ - [sudo] pip3 install -r requirements.txt (as root, system-wide)
? +
- - pip install --user -r requirements.txt (only for your user)
+ - pip3 install --user -r requirements.txt (only for your user)
? +
+
+ You can also install this dependency with your package manager (if such a package exists in your distribution) :
+
+ - [sudo] aptitude install python3-jinja (for GNU/Linux Debian for example)
## Execute it
```sh
$ python3 main.py
```
## Discussion
For discussion please send me an email at: Bertrand 'dot' STIVALET 'at' gmail.com | 8 | 0.32 | 6 | 2 |
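The README above names Jinja2 as the generator's dependency. To illustrate the template-driven test-case idea without pulling in Jinja2, here is a sketch using the standard library's `string.Template`; the CWE-89 query pair is a hypothetical example, not a case taken from the actual suite:

```python
from string import Template

# Hypothetical template in the spirit of the generator:
# render both a vulnerable and a fixed variant of the same construct.
case_template = Template(
    "// CWE-$cwe ($variant)\n"
    "string query = $query;\n"
)

variants = {
    "vulnerable": '"SELECT * FROM users WHERE name=\'" + input + "\'"',
    "fixed": '"SELECT * FROM users WHERE name=@name"',
}

cases = {
    variant: case_template.substitute(cwe=89, variant=variant, query=query)
    for variant, query in variants.items()
}

assert "CWE-89 (vulnerable)" in cases["vulnerable"]
assert "@name" in cases["fixed"]
```

Jinja2 adds loops, conditionals, and template inheritance on top of this plain substitution, which is why a generator producing many flaw/fix combinations depends on it.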
153f0e18df03d7ff7b691d2bd454468933ef50b0 | setup.py | setup.py | import os
from setuptools import setup
long_description = 'Please see our GitHub README'
if os.path.exists('README.txt'):
long_description = open('README.txt').read()
base_url = 'https://github.com/sendgrid/'
version = '3.1.0'
setup(
name='python_http_client',
version=version,
author='Elmer Thomas',
author_email='dx@sendgrid.com',
url='{}python-http-client'.format(base_url),
download_url='{}python-http-client/tarball/{}'.format(base_url, version),
packages=['python_http_client'],
license='MIT',
description='HTTP REST client, simplified for Python',
long_description=long_description,
keywords=[
'REST',
'HTTP',
'API'],
classifiers=[
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6'
]
)
| import os
from setuptools import setup
long_description = 'Please see our GitHub README'
if os.path.exists('README.txt'):
long_description = open('README.txt').read()
base_url = 'https://github.com/sendgrid/'
version = '3.1.0'
setup(
name='python_http_client',
version=version,
author='Elmer Thomas',
author_email='dx@sendgrid.com',
url='{}python-http-client'.format(base_url),
download_url='{}python-http-client/tarball/{}'.format(base_url, version),
packages=['python_http_client'],
license='MIT',
description='HTTP REST client, simplified for Python',
long_description=long_description,
keywords=[
'REST',
'HTTP',
'API'],
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
classifiers=[
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6'
]
)
| Add python_requires to help pip | Add python_requires to help pip
| Python | mit | sendgrid/python-http-client,sendgrid/python-http-client | python | ## Code Before:
import os
from setuptools import setup
long_description = 'Please see our GitHub README'
if os.path.exists('README.txt'):
long_description = open('README.txt').read()
base_url = 'https://github.com/sendgrid/'
version = '3.1.0'
setup(
name='python_http_client',
version=version,
author='Elmer Thomas',
author_email='dx@sendgrid.com',
url='{}python-http-client'.format(base_url),
download_url='{}python-http-client/tarball/{}'.format(base_url, version),
packages=['python_http_client'],
license='MIT',
description='HTTP REST client, simplified for Python',
long_description=long_description,
keywords=[
'REST',
'HTTP',
'API'],
classifiers=[
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6'
]
)
## Instruction:
Add python_requires to help pip
## Code After:
import os
from setuptools import setup
long_description = 'Please see our GitHub README'
if os.path.exists('README.txt'):
long_description = open('README.txt').read()
base_url = 'https://github.com/sendgrid/'
version = '3.1.0'
setup(
name='python_http_client',
version=version,
author='Elmer Thomas',
author_email='dx@sendgrid.com',
url='{}python-http-client'.format(base_url),
download_url='{}python-http-client/tarball/{}'.format(base_url, version),
packages=['python_http_client'],
license='MIT',
description='HTTP REST client, simplified for Python',
long_description=long_description,
keywords=[
'REST',
'HTTP',
'API'],
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
classifiers=[
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6'
]
)
| import os
from setuptools import setup
long_description = 'Please see our GitHub README'
if os.path.exists('README.txt'):
long_description = open('README.txt').read()
base_url = 'https://github.com/sendgrid/'
version = '3.1.0'
setup(
name='python_http_client',
version=version,
author='Elmer Thomas',
author_email='dx@sendgrid.com',
url='{}python-http-client'.format(base_url),
download_url='{}python-http-client/tarball/{}'.format(base_url, version),
packages=['python_http_client'],
license='MIT',
description='HTTP REST client, simplified for Python',
long_description=long_description,
keywords=[
'REST',
'HTTP',
'API'],
+ python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
classifiers=[
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6'
]
) | 1 | 0.028571 | 1 | 0 |
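The `python_requires` specifier added in this record (`>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*`) lets sufficiently recent pip versions skip releases that do not support the running interpreter. The predicate it encodes can be sketched as a plain function over `(major, minor)` version tuples:

```python
def supported(version):
    """Mirror '>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*' for (major, minor)."""
    major, minor = version
    if (major, minor) < (2, 7):        # >=2.7
        return False
    if major == 3 and minor in (0, 1, 2, 3):  # the four !=3.x.* exclusions
        return False
    return True

assert supported((2, 7))
assert not supported((2, 6))
assert not supported((3, 3))
assert supported((3, 6))
```

This matches the trove classifiers in the same `setup.py`: 2.7 and 3.4 through 3.6 are in, everything between 2.7 and 3.4 is out.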
cbb25e573975a16d0c99e8a7cc2a6a553e0b9c3a | README.md | README.md | iOS8-day-by-day-book
====================
The book which accompanies the iOS8: Day-by-Day blog series
| iOS8-day-by-day-book
====================
Apple delivered iOS 8 to the developer world at WWDC in June 2014, before
launching it at the wider world in September of the same year. The iOS 8
SDK was somewhat over-shadowed by the simultaneous announcement of a new
programming language in the form of Swift, however this didn't mean that
the core OS had been overlooked at all. Quite the opposite - a huge number
of new APIs had been introduced, powering tons of new functionality.
iOS8: Day-by-Day is a review of the most important of these. Busy developers
don't have time to trawl the WWDC videos and Apple documentation. Instead
they'd like to get a high-level summary of the new possibilities, alongside
some working sample code. This is exactly what iOS8: Day-by-Day provides.
It started out as a blog series, and these blog posts now form the basis
of the book.
## Book
This repo contains the source for the published book. You can get hold of
your copy from Leanpub at
[leanpub.com/ios8daybyday](https://leanpub.com/ios8daybyday).
sam
[iwantmyreal.name](http://iwantmyreal.name/)
[@iwantmyrealname](https://twitter.com/iwantmyrealname)
[@shinobicontrols](https://twitter.com/ShinobiControls)
| Update readme with details of book publication | Update readme with details of book publication
| Markdown | apache-2.0 | shinobicontrols/iOS8-day-by-day-book,sammyd/iOS8-day-by-day-book,sammyd/iOS8-day-by-day-book,shinobicontrols/iOS8-day-by-day-book | markdown | ## Code Before:
iOS8-day-by-day-book
====================
The book which accompanies the iOS8: Day-by-Day blog series
## Instruction:
Update readme with details of book publication
## Code After:
iOS8-day-by-day-book
====================
Apple delivered iOS 8 to the developer world at WWDC in June 2014, before
launching it at the wider world in September of the same year. The iOS 8
SDK was somewhat over-shadowed by the simultaneous announcement of a new
programming language in the form of Swift, however this didn't mean that
the core OS had been overlooked at all. Quite the opposite - a huge number
of new APIs had been introduced, powering tons of new functionality.
iOS8: Day-by-Day is a review of the most important of these. Busy developers
don't have time to trawl the WWDC videos and Apple documentation. Instead
they'd like to get a high-level summary of the new possibilities, alongside
some working sample code. This is exactly what iOS8: Day-by-Day provides.
It started out as a blog series, and these blog posts now form the basis
of the book.
## Book
This repo contains the source for the published book. You can get hold of
your copy from Leanpub at
[leanpub.com/ios8daybyday](https://leanpub.com/ios8daybyday).
sam
[iwantmyreal.name](http://iwantmyreal.name/)
[@iwantmyrealname](https://twitter.com/iwantmyrealname)
[@shinobicontrols](https://twitter.com/ShinobiControls)
| iOS8-day-by-day-book
====================
- The book which accompanies the iOS8: Day-by-Day blog series
+ Apple delivered iOS 8 to the developer world at WWDC in June 2014, before
+ launching it at the wider world in September of the same year. The iOS 8
+ SDK was somewhat over-shadowed by the simultaneous announcement of a new
+ programming language in the form of Swift, however this didn't mean that
+ the core OS had been overlooked at all. Quite the opposite - a huge number
+ of new APIs had been introduced, powering tons of new functionality.
+
+ iOS8: Day-by-Day is a review of the most important of these. Busy developers
+ don't have time to trawl the WWDC videos and Apple documentation. Instead
+ they'd like to get a high-level summary of the new possibilities, alongside
+ some working sample code. This is exactly what iOS8: Day-by-Day provides.
+ It started out as a blog series, and these blog posts now form the basis
+ of the book.
+
+
+ ## Book
+
+ This repo contains the source for the published book. You can get hold of
+ your copy from Leanpub at
+ [leanpub.com/ios8daybyday](https://leanpub.com/ios8daybyday).
+
+
+ sam
+
+ [iwantmyreal.name](http://iwantmyreal.name/)
+
+ [@iwantmyrealname](https://twitter.com/iwantmyrealname)
+ [@shinobicontrols](https://twitter.com/ShinobiControls) | 29 | 7.25 | 28 | 1 |
495aef0d2ce61bcc1a699164c4a794b759f921f2 | app/reader.rb | app/reader.rb | class Reader < Base
helpers Sinatra::Nedry
injected :clock, System::Clock
get "/" do
redirect to("/home")
end
get "/home" do
flagged! do
articles = service.fetch_all
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :home_active => true }
end
end
get "/about/?" do
render_page :about
end
get "/categories/:id/?" do
flagged! do
articles = service.fetch_all_from_category(params[:id])
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :active_category => params[:id] }
end
end
get "/articles/:id/?" do
flagged! do
article = service.fetch(params[:id].to_i)
if article
article_widget = Widget::Article.new(article)
comments = Widget::Comments.new(ENV["RACK_ENV"])
render_page :articles_show, :article => article_widget, :comments => comments
else
raise Sinatra::NotFound
end
end
end
private
def render_page(template, locals = {})
locals = locals.merge(:date => clock.now, :categories => Categories::ALL)
locals[:opts] ||= {}
erb(template, :locals => locals)
end
def service
article_service.published.page(0)
end
end
| class Reader < Base
helpers Sinatra::Nedry
injected :clock, System::Clock
get "/" do
redirect to("/home")
end
get "/home" do
flagged! do
articles = service.fetch_all
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :home_active => true }
end
end
get "/about/?" do
render_page :about
end
get "/categories/:id/?" do
flagged! do
articles = service.fetch_all_from_category(params[:id])
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :active_category => params[:id] }
end
end
get "/articles/:id/?" do
flagged! do
article = service.fetch(params[:id].to_i)
if article
article_widget = Widget::Article.new(article)
comments = Widget::Comments.new(ENV["RACK_ENV"])
render_page :articles_show, {
:article => article_widget,
:comments => comments,
:opts => { :active_category => article.category_id }
}
else
raise Sinatra::NotFound
end
end
end
private
def render_page(template, locals = {})
locals = locals.merge(:date => clock.now, :categories => Categories::ALL)
locals[:opts] ||= {}
erb(template, :locals => locals)
end
def service
article_service.published.page(0)
end
end
| Make category active for article view | Make category active for article view
| Ruby | mit | the-tangent/tangent,the-tangent/tangent | ruby | ## Code Before:
class Reader < Base
helpers Sinatra::Nedry
injected :clock, System::Clock
get "/" do
redirect to("/home")
end
get "/home" do
flagged! do
articles = service.fetch_all
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :home_active => true }
end
end
get "/about/?" do
render_page :about
end
get "/categories/:id/?" do
flagged! do
articles = service.fetch_all_from_category(params[:id])
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :active_category => params[:id] }
end
end
get "/articles/:id/?" do
flagged! do
article = service.fetch(params[:id].to_i)
if article
article_widget = Widget::Article.new(article)
comments = Widget::Comments.new(ENV["RACK_ENV"])
render_page :articles_show, :article => article_widget, :comments => comments
else
raise Sinatra::NotFound
end
end
end
private
def render_page(template, locals = {})
locals = locals.merge(:date => clock.now, :categories => Categories::ALL)
locals[:opts] ||= {}
erb(template, :locals => locals)
end
def service
article_service.published.page(0)
end
end
## Instruction:
Make category active for article view
## Code After:
class Reader < Base
helpers Sinatra::Nedry
injected :clock, System::Clock
get "/" do
redirect to("/home")
end
get "/home" do
flagged! do
articles = service.fetch_all
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :home_active => true }
end
end
get "/about/?" do
render_page :about
end
get "/categories/:id/?" do
flagged! do
articles = service.fetch_all_from_category(params[:id])
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :active_category => params[:id] }
end
end
get "/articles/:id/?" do
flagged! do
article = service.fetch(params[:id].to_i)
if article
article_widget = Widget::Article.new(article)
comments = Widget::Comments.new(ENV["RACK_ENV"])
render_page :articles_show, {
:article => article_widget,
:comments => comments,
:opts => { :active_category => article.category_id }
}
else
raise Sinatra::NotFound
end
end
end
private
def render_page(template, locals = {})
locals = locals.merge(:date => clock.now, :categories => Categories::ALL)
locals[:opts] ||= {}
erb(template, :locals => locals)
end
def service
article_service.published.page(0)
end
end
| class Reader < Base
helpers Sinatra::Nedry
injected :clock, System::Clock
get "/" do
redirect to("/home")
end
get "/home" do
flagged! do
articles = service.fetch_all
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :home_active => true }
end
end
get "/about/?" do
render_page :about
end
get "/categories/:id/?" do
flagged! do
articles = service.fetch_all_from_category(params[:id])
tiles = articles.map { |a| Widget::Tile.new(a) }
render_page :articles, :articles => tiles, :opts => { :active_category => params[:id] }
end
end
get "/articles/:id/?" do
flagged! do
article = service.fetch(params[:id].to_i)
if article
article_widget = Widget::Article.new(article)
comments = Widget::Comments.new(ENV["RACK_ENV"])
- render_page :articles_show, :article => article_widget, :comments => comments
+ render_page :articles_show, {
+ :article => article_widget,
+ :comments => comments,
+ :opts => { :active_category => article.category_id }
+ }
else
raise Sinatra::NotFound
end
end
end
private
def render_page(template, locals = {})
locals = locals.merge(:date => clock.now, :categories => Categories::ALL)
locals[:opts] ||= {}
erb(template, :locals => locals)
end
def service
article_service.published.page(0)
end
end | 6 | 0.103448 | 5 | 1 |
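The change in this record threads an `:opts` hash (now carrying `:active_category`) through `render_page`, which merges shared locals and then defaults `:opts` with `||=`. The same merge-and-default pattern, sketched here in Python with `dict.update` and `dict.setdefault` (the date and category values are placeholders standing in for `clock.now` and `Categories::ALL`):

```python
def build_locals(locals=None):
    """Python analogue of Reader#render_page's locals handling:
    merge shared values, then default the nested opts mapping."""
    locals = dict(locals or {})
    # Ruby: locals = locals.merge(:date => clock.now, :categories => Categories::ALL)
    locals.update(date="2014-01-01", categories=["news", "sport"])
    # Ruby: locals[:opts] ||= {}
    locals.setdefault("opts", {})
    return locals

with_category = build_locals({"article": "A1", "opts": {"active_category": 3}})
assert with_category["opts"]["active_category"] == 3   # caller's opts survive

defaulted = build_locals({"article": "A1"})
assert defaulted["opts"] == {}                         # opts defaulted when absent
assert defaulted["categories"] == ["news", "sport"]    # shared locals always present
```

Centralizing the default in one helper is what lets each route pass only the `:opts` keys it cares about, such as `:active_category` for the article view.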
ec974eb3a2c98882dbe0f13e89491e8b6da9281f | lib/nark/plugin/event.rb | lib/nark/plugin/event.rb | require_relative 'events'
module Nark
module Plugin
#
# Essentially this is a value object used to store a plugins event block
# which is then used by the middleware to trigger at the right time.
#
class Event
attr_reader :type
attr_reader :method_block
attr_reader :plugin
attr_reader :attributes
protected
attr_writer :type
attr_writer :method_block
attr_writer :plugin
attr_writer :attributes
public
def initialize params
@attributes = params
@type = params[:type]
@method_block = params[:method_block]
@plugin = params[:plugin]
end
def exists?
Nark::Plugin.events.find { |event| method_block == event.method_block }
end
def to_hash
attributes
end
end
end
end
| require_relative 'events'
module Nark
module Plugin
#
# An event based value object allowing use to easily deal with event based functionality.
#
# Used to keep track and interact with a plugin's triggers.
#
# Essentially this is a value object used to store a plugins event block which is then used by the middleware to
# trigger at the right time.
#
class Event
#
# Only allow the object to change the state of the event
#
private
#
      # Protect the object from being changed externally
#
attr_writer :type
attr_writer :method_block
attr_writer :plugin
#
# Used to simplify the way parameters are returned and stored
#
attr_accessor :attributes
#
# Stores the type of event
#
attr_reader :type
#
# Stores the event block to be triggered
#
attr_reader :method_block
#
# Stores the name of the plugin that the event belongs to
#
attr_reader :plugin
#
# Initialise the new Event
#
def initialize params
@attributes = params
@type = params[:type]
@method_block = params[:method_block]
@plugin = params[:plugin]
end
#
      # Checks whether the event exists in the events collection
#
def exists?
Nark::Plugin.events.find { |event| method_block == event.method_block }
end
#
      # Returns a hash of the event's attributes
#
def to_hash
attributes
end
end
end
end
| Improve documentation and layout of code | Improve documentation and layout of code
| Ruby | mit | baphled/nark,baphled/nark,baphled/nark | ruby | ## Code Before:
require_relative 'events'
module Nark
module Plugin
#
# Essentially this is a value object used to store a plugins event block
# which is then used by the middleware to trigger at the right time.
#
class Event
attr_reader :type
attr_reader :method_block
attr_reader :plugin
attr_reader :attributes
protected
attr_writer :type
attr_writer :method_block
attr_writer :plugin
attr_writer :attributes
public
def initialize params
@attributes = params
@type = params[:type]
@method_block = params[:method_block]
@plugin = params[:plugin]
end
def exists?
Nark::Plugin.events.find { |event| method_block == event.method_block }
end
def to_hash
attributes
end
end
end
end
## Instruction:
Improve documentation and layout of code
## Code After:
require_relative 'events'
module Nark
module Plugin
#
    # An event based value object allowing us to easily deal with event based functionality.
#
# Used to keep track and interact with a plugin's triggers.
#
# Essentially this is a value object used to store a plugins event block which is then used by the middleware to
# trigger at the right time.
#
class Event
#
# Only allow the object to change the state of the event
#
private
#
      # Protect the object from being changed externally
#
attr_writer :type
attr_writer :method_block
attr_writer :plugin
#
# Used to simplify the way parameters are returned and stored
#
attr_accessor :attributes
#
# Stores the type of event
#
attr_reader :type
#
# Stores the event block to be triggered
#
attr_reader :method_block
#
# Stores the name of the plugin that the event belongs to
#
attr_reader :plugin
#
# Initialise the new Event
#
def initialize params
@attributes = params
@type = params[:type]
@method_block = params[:method_block]
@plugin = params[:plugin]
end
#
      # Checks whether the event exists in the events collection
#
def exists?
Nark::Plugin.events.find { |event| method_block == event.method_block }
end
#
      # Returns a hash of the event's attributes
#
def to_hash
attributes
end
end
end
end
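The reworked `Event` class above is a plain value object: the constructor params are kept both as individual readers and as an `attributes` hash for `to_hash`. A minimal usage sketch; this is a trimmed re-declaration for illustration only (the real class lives under `Nark::Plugin`, and its `exists?` check depends on `Nark::Plugin.events`, which is omitted here):

```ruby
# Trimmed stand-in for Nark::Plugin::Event, for illustration only.
class Event
  attr_reader :type, :method_block, :plugin
  attr_accessor :attributes

  # Store the raw params hash alongside the individual attributes.
  def initialize(params)
    @attributes   = params
    @type         = params[:type]
    @method_block = params[:method_block]
    @plugin       = params[:plugin]
  end

  # Returns the original params hash untouched.
  def to_hash
    attributes
  end
end

block = proc { |env| env }
event = Event.new(:type => :before_call, :method_block => block, :plugin => :request_timer)

event.type    # => :before_call
event.to_hash # returns the original params hash, block included
```

Because `attributes` holds the very hash passed to `new`, `to_hash` round-trips the constructor input, which is what lets the middleware serialise and compare events cheaply.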
| require_relative 'events'
module Nark
module Plugin
#
+    # An event based value object allowing us to easily deal with event based functionality.
+ #
+ # Used to keep track and interact with a plugin's triggers.
+ #
- # Essentially this is a value object used to store a plugins event block
+ # Essentially this is a value object used to store a plugins event block which is then used by the middleware to
? ++++++++++++++++++++++++++++++++++++++++
- # which is then used by the middleware to trigger at the right time.
+ # trigger at the right time.
#
class Event
- attr_reader :type
- attr_reader :method_block
- attr_reader :plugin
- attr_reader :attributes
+ #
+ # Only allow the object to change the state of the event
+ #
+ private
- protected
-
+ #
+      # Protect the object from being changed externally
+ #
attr_writer :type
attr_writer :method_block
attr_writer :plugin
- attr_writer :attributes
- public
+ #
+ # Used to simplify the way parameters are returned and stored
+ #
+ attr_accessor :attributes
+ #
+ # Stores the type of event
+ #
+ attr_reader :type
+
+ #
+ # Stores the event block to be triggered
+ #
+ attr_reader :method_block
+
+ #
+ # Stores the name of the plugin that the event belongs to
+ #
+ attr_reader :plugin
+
+ #
+ # Initialise the new Event
+ #
def initialize params
@attributes = params
@type = params[:type]
@method_block = params[:method_block]
@plugin = params[:plugin]
end
+ #
+      # Checks whether the event exists in the events collection
+ #
def exists?
Nark::Plugin.events.find { |event| method_block == event.method_block }
end
+ #
+      # Returns a hash of the event's attributes
+ #
def to_hash
attributes
end
end
end
end | 51 | 1.275 | 41 | 10 |