| commit | old_file | new_file | old_contents | new_contents | subject | message | lang | license | repos | config | content | diff | diff_length | relative_diff_length | n_lines_added | n_lines_deleted |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
34965523bba8e24ab26a577205a65b8afbc1f96c | src/main.cpp | src/main.cpp | /*
* Copyright (c) 2017 Akil Darjean (adarjean@uh.edu)
* Distributed under the MIT License.
* See accompanying file LICENSE.md or copy at http://opensource.org/licenses/MIT
*/
#include "window.h"
#include <string>
int main(int argc, char const* argv[])
{
Window window;
if (argc > 1) {
std::string filename = argv[1];
window.create(filename);
} else {
window.create();
}
return 0;
}
| /*
* Copyright (c) 2017 Akil Darjean (adarjean@uh.edu)
* Distributed under the MIT License.
* See accompanying file LICENSE.md or copy at http://opensource.org/licenses/MIT
*/
#include "window.h"
#include <iostream>
#include <string>
int main(int argc, char const* argv[])
{
Window window;
if (argc > 1) {
std::string filename = argv[1];
window.create(filename);
} else {
std::cout << "SFML Image Viewer (c) 2017" << std::endl;
std::cout << std::endl;
std::cout << "Usage: sfiv FILE" << std::endl;
}
return 0;
}
| Make application open only if file is provided | Make application open only if file is provided
| C++ | mit | adarj/sfml-image-viewer | c++ | ## Code Before:
/*
* Copyright (c) 2017 Akil Darjean (adarjean@uh.edu)
* Distributed under the MIT License.
* See accompanying file LICENSE.md or copy at http://opensource.org/licenses/MIT
*/
#include "window.h"
#include <string>
int main(int argc, char const* argv[])
{
Window window;
if (argc > 1) {
std::string filename = argv[1];
window.create(filename);
} else {
window.create();
}
return 0;
}
## Instruction:
Make application open only if file is provided
## Code After:
/*
* Copyright (c) 2017 Akil Darjean (adarjean@uh.edu)
* Distributed under the MIT License.
* See accompanying file LICENSE.md or copy at http://opensource.org/licenses/MIT
*/
#include "window.h"
#include <iostream>
#include <string>
int main(int argc, char const* argv[])
{
Window window;
if (argc > 1) {
std::string filename = argv[1];
window.create(filename);
} else {
std::cout << "SFML Image Viewer (c) 2017" << std::endl;
std::cout << std::endl;
std::cout << "Usage: sfiv FILE" << std::endl;
}
return 0;
}
| /*
* Copyright (c) 2017 Akil Darjean (adarjean@uh.edu)
* Distributed under the MIT License.
* See accompanying file LICENSE.md or copy at http://opensource.org/licenses/MIT
*/
#include "window.h"
+ #include <iostream>
#include <string>
int main(int argc, char const* argv[])
{
Window window;
if (argc > 1) {
std::string filename = argv[1];
window.create(filename);
} else {
- window.create();
+ std::cout << "SFML Image Viewer (c) 2017" << std::endl;
+ std::cout << std::endl;
+ std::cout << "Usage: sfiv FILE" << std::endl;
}
return 0;
} | 5 | 0.227273 | 4 | 1 |
d1569b7df6ee45324d88acba79eb57c1b985e394 | inventi-wicket-autocomplete/pom.xml | inventi-wicket-autocomplete/pom.xml | <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<groupId>lt.inventi</groupId>
<artifactId>inventi-wicket</artifactId>
<version>0.0.3-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>inventi-wicket-autocomplete</artifactId>
<packaging>jar</packaging>
<name>Inventi :: UI :: Wicket :: Autocomplete</name>
<dependencies>
<dependency>
<groupId>org.odlabs.wiquery</groupId>
<artifactId>wiquery-jquery-ui</artifactId>
</dependency>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-js</artifactId>
</dependency>
<!-- Utils (Autocomplete) -->
<dependency>
<groupId>net.sf.json-lib</groupId>
<artifactId>json-lib</artifactId>
<classifier>jdk15</classifier>
</dependency>
<!-- TEST -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-test</artifactId>
</dependency>
</dependencies>
</project>
| <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<groupId>lt.inventi</groupId>
<artifactId>inventi-wicket</artifactId>
<version>0.0.3-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>inventi-wicket-autocomplete</artifactId>
<packaging>jar</packaging>
<name>Inventi :: UI :: Wicket :: Autocomplete</name>
<dependencies>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-js</artifactId>
</dependency>
<!-- Utils (Autocomplete) -->
<dependency>
<groupId>net.sf.json-lib</groupId>
<artifactId>json-lib</artifactId>
<classifier>jdk15</classifier>
</dependency>
<!-- TEST -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-test</artifactId>
</dependency>
</dependencies>
</project>
| Remove wiquery dependency from autocomplete | Remove wiquery dependency from autocomplete
| XML | apache-2.0 | inventiLT/inventi-wicket,inventiLT/inventi-wicket,inventiLT/inventi-wicket | xml | ## Code Before:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<groupId>lt.inventi</groupId>
<artifactId>inventi-wicket</artifactId>
<version>0.0.3-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>inventi-wicket-autocomplete</artifactId>
<packaging>jar</packaging>
<name>Inventi :: UI :: Wicket :: Autocomplete</name>
<dependencies>
<dependency>
<groupId>org.odlabs.wiquery</groupId>
<artifactId>wiquery-jquery-ui</artifactId>
</dependency>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-js</artifactId>
</dependency>
<!-- Utils (Autocomplete) -->
<dependency>
<groupId>net.sf.json-lib</groupId>
<artifactId>json-lib</artifactId>
<classifier>jdk15</classifier>
</dependency>
<!-- TEST -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-test</artifactId>
</dependency>
</dependencies>
</project>
## Instruction:
Remove wiquery dependency from autocomplete
## Code After:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<groupId>lt.inventi</groupId>
<artifactId>inventi-wicket</artifactId>
<version>0.0.3-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>inventi-wicket-autocomplete</artifactId>
<packaging>jar</packaging>
<name>Inventi :: UI :: Wicket :: Autocomplete</name>
<dependencies>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-js</artifactId>
</dependency>
<!-- Utils (Autocomplete) -->
<dependency>
<groupId>net.sf.json-lib</groupId>
<artifactId>json-lib</artifactId>
<classifier>jdk15</classifier>
</dependency>
<!-- TEST -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-test</artifactId>
</dependency>
</dependencies>
</project>
| <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<groupId>lt.inventi</groupId>
<artifactId>inventi-wicket</artifactId>
<version>0.0.3-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>inventi-wicket-autocomplete</artifactId>
<packaging>jar</packaging>
<name>Inventi :: UI :: Wicket :: Autocomplete</name>
<dependencies>
- <dependency>
- <groupId>org.odlabs.wiquery</groupId>
- <artifactId>wiquery-jquery-ui</artifactId>
- </dependency>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-js</artifactId>
</dependency>
<!-- Utils (Autocomplete) -->
<dependency>
<groupId>net.sf.json-lib</groupId>
<artifactId>json-lib</artifactId>
<classifier>jdk15</classifier>
</dependency>
<!-- TEST -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>inventi-wicket-test</artifactId>
</dependency>
</dependencies>
</project> | 4 | 0.108108 | 0 | 4 |
54d8abc7864371c365b49766ea1ba5331daacfd2 | tests/Helpers/DateTimeHelperTest.php | tests/Helpers/DateTimeHelperTest.php | <?php
namespace ZpgRtf\Tests\Helpers;
use PHPUnit\Framework\TestCase;
use ZpgRtf\Helpers\DateTimeHelper;
class DateTimeHelperTest extends TestCase
{
public function testCanInstantiate()
{
$this->assertInstanceOf(
DateTimeHelper::class,
new DateTimeHelper()
);
}
public function testCanEncodeCustomFormat()
{
$format = 'c';
$timestamp = date($format);
$dateTimeHelper = new DateTimeHelper($timestamp);
$dateTimeHelper->setHelperFormat($format);
$this->assertSame(
json_decode(json_encode($dateTimeHelper)),
$timestamp
);
}
}
| <?php
namespace ZpgRtf\Tests\Helpers;
use PHPUnit\Framework\TestCase;
use ZpgRtf\Helpers\DateTimeHelper;
class DateTimeHelperTest extends TestCase
{
public function testCanInstantiate()
{
$this->assertInstanceOf(
DateTimeHelper::class,
new DateTimeHelper()
);
}
public function testCanEncodeCustomFormat()
{
$format = 'c';
$timestamp = date($format);
$dateTimeHelper = new DateTimeHelper($timestamp);
$dateTimeHelper->setHelperFormat($format);
$this->assertSame(
json_decode(json_encode($dateTimeHelper)),
$timestamp
);
}
public function testCanJsonSerialize()
{
$this->assertJson(
json_encode(new DateTimeHelper())
);
$this->assertInstanceOf(
\JsonSerializable::class,
new DateTimeHelper()
);
}
}
| Add testCanJsonSerialize test to datetime helper | Add testCanJsonSerialize test to datetime helper
| PHP | mit | lukeoliff/zpg-rtf-php | php | ## Code Before:
<?php
namespace ZpgRtf\Tests\Helpers;
use PHPUnit\Framework\TestCase;
use ZpgRtf\Helpers\DateTimeHelper;
class DateTimeHelperTest extends TestCase
{
public function testCanInstantiate()
{
$this->assertInstanceOf(
DateTimeHelper::class,
new DateTimeHelper()
);
}
public function testCanEncodeCustomFormat()
{
$format = 'c';
$timestamp = date($format);
$dateTimeHelper = new DateTimeHelper($timestamp);
$dateTimeHelper->setHelperFormat($format);
$this->assertSame(
json_decode(json_encode($dateTimeHelper)),
$timestamp
);
}
}
## Instruction:
Add testCanJsonSerialize test to datetime helper
## Code After:
<?php
namespace ZpgRtf\Tests\Helpers;
use PHPUnit\Framework\TestCase;
use ZpgRtf\Helpers\DateTimeHelper;
class DateTimeHelperTest extends TestCase
{
public function testCanInstantiate()
{
$this->assertInstanceOf(
DateTimeHelper::class,
new DateTimeHelper()
);
}
public function testCanEncodeCustomFormat()
{
$format = 'c';
$timestamp = date($format);
$dateTimeHelper = new DateTimeHelper($timestamp);
$dateTimeHelper->setHelperFormat($format);
$this->assertSame(
json_decode(json_encode($dateTimeHelper)),
$timestamp
);
}
public function testCanJsonSerialize()
{
$this->assertJson(
json_encode(new DateTimeHelper())
);
$this->assertInstanceOf(
\JsonSerializable::class,
new DateTimeHelper()
);
}
}
| <?php
namespace ZpgRtf\Tests\Helpers;
use PHPUnit\Framework\TestCase;
use ZpgRtf\Helpers\DateTimeHelper;
class DateTimeHelperTest extends TestCase
{
public function testCanInstantiate()
{
$this->assertInstanceOf(
DateTimeHelper::class,
new DateTimeHelper()
);
}
public function testCanEncodeCustomFormat()
{
$format = 'c';
$timestamp = date($format);
$dateTimeHelper = new DateTimeHelper($timestamp);
$dateTimeHelper->setHelperFormat($format);
$this->assertSame(
json_decode(json_encode($dateTimeHelper)),
$timestamp
);
}
+
+ public function testCanJsonSerialize()
+ {
+ $this->assertJson(
+ json_encode(new DateTimeHelper())
+ );
+
+ $this->assertInstanceOf(
+ \JsonSerializable::class,
+ new DateTimeHelper()
+ );
+ }
} | 12 | 0.387097 | 12 | 0 |
13191a981c048fc9bf46fc8f4bc3ac675b6d9458 | README.md | README.md |
`RSpec::Support` provides common functionality to `RSpec::Core`,
`RSpec::Expectations` and `RSpec::Mocks`. It is considered
suitable for internal use only at this time.
## Installation / Usage
Install one or more of the `RSpec` gems.
Want to run against the `master` branch? You'll need to include the dependent
RSpec repos as well. Add the following to your `Gemfile`:
```ruby
%w[rspec-core rspec-expectations rspec-mocks rspec-support].each do |lib|
gem lib, :git => "git://github.com/rspec/#{lib}.git", :branch => 'master'
end
```
## Contributing
Once you've set up the environment, you'll need to cd into the working
directory of whichever repo you want to work in. From there you can run the
specs and cucumber features, and make patches.
NOTE: You do not need to use rspec-dev to work on a specific RSpec repo. You
can treat each RSpec repo as an independent project.
- [Build details](BUILD_DETAIL.md)
- [Code of Conduct](CODE_OF_CONDUCT.md)
- [Detailed contributing guide](CONTRIBUTING.md)
- [Development setup guide](DEVELOPMENT.md)
## Patches
Please submit a pull request or a github issue to one of the issue trackers
listed below. If you submit an issue, please include a link to either of:
* a gist (or equivalent) of the patch
* a branch or commit in your github fork of the repo
|
`RSpec::Support` provides common functionality to `RSpec::Core`,
`RSpec::Expectations` and `RSpec::Mocks`. It is considered
suitable for internal use only at this time.
## Installation / Usage
Install one or more of the `RSpec` gems.
Want to run against the `master` branch? You'll need to include the dependent
RSpec repos as well. Add the following to your `Gemfile`:
```ruby
%w[rspec-core rspec-expectations rspec-mocks rspec-support].each do |lib|
gem lib, :git => "git://github.com/rspec/#{lib}.git", :branch => 'master'
end
```
## Contributing
Once you've set up the environment, you'll need to cd into the working
directory of whichever repo you want to work in. From there you can run the
specs and cucumber features, and make patches.
NOTE: You do not need to use rspec-dev to work on a specific RSpec repo. You
can treat each RSpec repo as an independent project.
- [Build details](BUILD_DETAIL.md)
- [Code of Conduct](CODE_OF_CONDUCT.md)
- [Detailed contributing guide](CONTRIBUTING.md)
- [Development setup guide](DEVELOPMENT.md)
## Patches
Please submit a pull request or a github issue. If you submit an issue, please
include a link to either of:
* a gist (or equivalent) of the patch
* a branch or commit in your github fork of the repo
| Remove misleading reference to issue tracker | Remove misleading reference to issue tracker
Closes #253 | Markdown | mit | rspec/rspec-support,rspec/rspec-support | markdown | ## Code Before:
`RSpec::Support` provides common functionality to `RSpec::Core`,
`RSpec::Expectations` and `RSpec::Mocks`. It is considered
suitable for internal use only at this time.
## Installation / Usage
Install one or more of the `RSpec` gems.
Want to run against the `master` branch? You'll need to include the dependent
RSpec repos as well. Add the following to your `Gemfile`:
```ruby
%w[rspec-core rspec-expectations rspec-mocks rspec-support].each do |lib|
gem lib, :git => "git://github.com/rspec/#{lib}.git", :branch => 'master'
end
```
## Contributing
Once you've set up the environment, you'll need to cd into the working
directory of whichever repo you want to work in. From there you can run the
specs and cucumber features, and make patches.
NOTE: You do not need to use rspec-dev to work on a specific RSpec repo. You
can treat each RSpec repo as an independent project.
- [Build details](BUILD_DETAIL.md)
- [Code of Conduct](CODE_OF_CONDUCT.md)
- [Detailed contributing guide](CONTRIBUTING.md)
- [Development setup guide](DEVELOPMENT.md)
## Patches
Please submit a pull request or a github issue to one of the issue trackers
listed below. If you submit an issue, please include a link to either of:
* a gist (or equivalent) of the patch
* a branch or commit in your github fork of the repo
## Instruction:
Remove misleading reference to issue tracker
Closes #253
## Code After:
`RSpec::Support` provides common functionality to `RSpec::Core`,
`RSpec::Expectations` and `RSpec::Mocks`. It is considered
suitable for internal use only at this time.
## Installation / Usage
Install one or more of the `RSpec` gems.
Want to run against the `master` branch? You'll need to include the dependent
RSpec repos as well. Add the following to your `Gemfile`:
```ruby
%w[rspec-core rspec-expectations rspec-mocks rspec-support].each do |lib|
gem lib, :git => "git://github.com/rspec/#{lib}.git", :branch => 'master'
end
```
## Contributing
Once you've set up the environment, you'll need to cd into the working
directory of whichever repo you want to work in. From there you can run the
specs and cucumber features, and make patches.
NOTE: You do not need to use rspec-dev to work on a specific RSpec repo. You
can treat each RSpec repo as an independent project.
- [Build details](BUILD_DETAIL.md)
- [Code of Conduct](CODE_OF_CONDUCT.md)
- [Detailed contributing guide](CONTRIBUTING.md)
- [Development setup guide](DEVELOPMENT.md)
## Patches
Please submit a pull request or a github issue. If you submit an issue, please
include a link to either of:
* a gist (or equivalent) of the patch
* a branch or commit in your github fork of the repo
|
`RSpec::Support` provides common functionality to `RSpec::Core`,
`RSpec::Expectations` and `RSpec::Mocks`. It is considered
suitable for internal use only at this time.
## Installation / Usage
Install one or more of the `RSpec` gems.
Want to run against the `master` branch? You'll need to include the dependent
RSpec repos as well. Add the following to your `Gemfile`:
```ruby
%w[rspec-core rspec-expectations rspec-mocks rspec-support].each do |lib|
gem lib, :git => "git://github.com/rspec/#{lib}.git", :branch => 'master'
end
```
## Contributing
Once you've set up the environment, you'll need to cd into the working
directory of whichever repo you want to work in. From there you can run the
specs and cucumber features, and make patches.
NOTE: You do not need to use rspec-dev to work on a specific RSpec repo. You
can treat each RSpec repo as an independent project.
- [Build details](BUILD_DETAIL.md)
- [Code of Conduct](CODE_OF_CONDUCT.md)
- [Detailed contributing guide](CONTRIBUTING.md)
- [Development setup guide](DEVELOPMENT.md)
## Patches
- Please submit a pull request or a github issue to one of the issue trackers
? ^^^^^^^^ ^^ ^^ ^^ --
+ Please submit a pull request or a github issue. If you submit an issue, please
? + ^ +++++++++ ^^^ + ^^^ ^
- listed below. If you submit an issue, please include a link to either of:
+ include a link to either of:
* a gist (or equivalent) of the patch
* a branch or commit in your github fork of the repo | 4 | 0.102564 | 2 | 2 |
9a679fb9bed5c813c33886572b81bc1e4bcd2f6a | lib/search-tickets.js | lib/search-tickets.js | 'use strict';
function searchTickets (args, callback) {
var seneca = this;
var ticketsEntity = seneca.make$('cd/tickets');
var query = args.query || {};
if (!query.limit$) query.limit$ = 'NULL';
ticketsEntity.list$(query, callback);
}
module.exports = searchTickets;
| 'use strict';
var async = require('async');
function searchTickets (args, callback) {
var seneca = this;
var ticketsEntity = seneca.make$('cd/tickets');
var plugin = args.role;
async.waterfall([
loadTickets,
getNumberOfApplications,
getNumberOfApprovedApplications
], callback);
function loadTickets (done) {
var query = args.query || {};
if (!query.limit$) query.limit$ = 'NULL';
ticketsEntity.list$(query, done);
}
function getNumberOfApplications (tickets, done) {
async.map(tickets, function (ticket, cb) {
seneca.act({role: plugin, cmd: 'searchApplications', query: {ticketId: ticket.id, deleted: 0}}, function(err, applications) {
if (err) return cb(err);
ticket.totalApplications = applications.length;
cb(null,ticket);
});
}, done);
}
function getNumberOfApprovedApplications (tickets, done) {
async.map(tickets, function (ticket, cb) {
seneca.act({role: plugin, cmd: 'searchApplications', query: {ticketId: ticket.id, deleted: 0, status: "approved"}}, function(err, applications) {
if (err) return cb(err);
ticket.approvedApplications = applications.length;
cb(null,ticket);
});
}, done);
}
}
module.exports = searchTickets;
| Add approvedApplications and totalApplications to tickets | Add approvedApplications and totalApplications to tickets
This is groundwork for resolving CoderDojo/community-platform#778. It
allows us to see how many applications there are for a given ticket
type.
| JavaScript | mit | CoderDojo/cp-events-service,CoderDojo/cp-events-service | javascript | ## Code Before:
'use strict';
function searchTickets (args, callback) {
var seneca = this;
var ticketsEntity = seneca.make$('cd/tickets');
var query = args.query || {};
if (!query.limit$) query.limit$ = 'NULL';
ticketsEntity.list$(query, callback);
}
module.exports = searchTickets;
## Instruction:
Add approvedApplications and totalApplications to tickets
This is groundwork for resolving CoderDojo/community-platform#778. It
allows us to see how many applications there are for a given ticket
type.
## Code After:
'use strict';
var async = require('async');
function searchTickets (args, callback) {
var seneca = this;
var ticketsEntity = seneca.make$('cd/tickets');
var plugin = args.role;
async.waterfall([
loadTickets,
getNumberOfApplications,
getNumberOfApprovedApplications
], callback);
function loadTickets (done) {
var query = args.query || {};
if (!query.limit$) query.limit$ = 'NULL';
ticketsEntity.list$(query, done);
}
function getNumberOfApplications (tickets, done) {
async.map(tickets, function (ticket, cb) {
seneca.act({role: plugin, cmd: 'searchApplications', query: {ticketId: ticket.id, deleted: 0}}, function(err, applications) {
if (err) return cb(err);
ticket.totalApplications = applications.length;
cb(null,ticket);
});
}, done);
}
function getNumberOfApprovedApplications (tickets, done) {
async.map(tickets, function (ticket, cb) {
seneca.act({role: plugin, cmd: 'searchApplications', query: {ticketId: ticket.id, deleted: 0, status: "approved"}}, function(err, applications) {
if (err) return cb(err);
ticket.approvedApplications = applications.length;
cb(null,ticket);
});
}, done);
}
}
module.exports = searchTickets;
| 'use strict';
+
+ var async = require('async');
function searchTickets (args, callback) {
var seneca = this;
var ticketsEntity = seneca.make$('cd/tickets');
+ var plugin = args.role;
+
+ async.waterfall([
+ loadTickets,
+ getNumberOfApplications,
+ getNumberOfApprovedApplications
+ ], callback);
+
+ function loadTickets (done) {
- var query = args.query || {};
+ var query = args.query || {};
? ++
- if (!query.limit$) query.limit$ = 'NULL';
+ if (!query.limit$) query.limit$ = 'NULL';
? ++
- ticketsEntity.list$(query, callback);
? ^^^^^^^^
+ ticketsEntity.list$(query, done);
? ++ ^^^^
+ }
+
+ function getNumberOfApplications (tickets, done) {
+ async.map(tickets, function (ticket, cb) {
+ seneca.act({role: plugin, cmd: 'searchApplications', query: {ticketId: ticket.id, deleted: 0}}, function(err, applications) {
+ if (err) return cb(err);
+ ticket.totalApplications = applications.length;
+ cb(null,ticket);
+ });
+ }, done);
+ }
+
+ function getNumberOfApprovedApplications (tickets, done) {
+ async.map(tickets, function (ticket, cb) {
+ seneca.act({role: plugin, cmd: 'searchApplications', query: {ticketId: ticket.id, deleted: 0, status: "approved"}}, function(err, applications) {
+ if (err) return cb(err);
+ ticket.approvedApplications = applications.length;
+ cb(null,ticket);
+ });
+ }, done);
+ }
}
module.exports = searchTickets; | 38 | 3.454545 | 35 | 3 |
183a2ad79b78f5ea4760407a82f426e3fe4e7f8b | examples/aiohttp/README.md | examples/aiohttp/README.md | aiohttp Websocket Connection example
====================================
In this example you can control the state of some LEDs via WebSocket.
Installation
------------
You need to install aiohttp:
'''
pip install aiohttp'''
It is not needed for the general GPIO-lib so it is not part of the installation process.
Configuration
-------------
Make sure to switch the ip address and port number in the index.html-File to the one of your raspberry pi (line 19), so the Browser can connect to it using the WebSocket.
You can configure the GPIO-pins you like to use. See GPIO_LINES in the main.py.
Note that the fedora firewall prevents access on port 8080 by default, so allow that port to be accessed by
`firewall-cmd ...`
Additional notes
----------------
aiohttp still waits for a request when you press CTRL+c to stop it, so you need to refresh the page after pressing CTRL+c to stop the server. | aiohttp Websocket Connection example
====================================
In this example you can control the state of some LEDs via WebSocket.
Installation
------------
You need to install aiohttp:
```pip install aiohttp```
It is not needed for the general GPIO-lib so it is not part of the installation process.
Configuration
-------------
Make sure to switch the ip address and port number in the index.html-File to the one of your raspberry pi (line 19), so the Browser can connect to it using the WebSocket.
You can configure the GPIO-pins you like to use. See GPIO_LINES in the main.py.
Note that the fedora firewall prevents access on port 8080 by default, so allow that port to be accessed by other machines in your network e.g. `firewall-cmd --zone=dmz --add-port=8080/tcp`, take a look at the [documentation](https://docs-old.fedoraproject.org/en-US/Fedora/19/html/Security_Guide/sec-Open_Ports_in_the_firewall-CLI.html)
Additional notes
----------------
aiohttp still waits for a request when you press CTRL+c to stop it, so you need to refresh the page after pressing CTRL+c to stop the server.
| Update Documentation for aiohttp example | Update Documentation for aiohttp example | Markdown | apache-2.0 | bookwar/python-gpiodev,bookwar/python-gpiodev | markdown | ## Code Before:
aiohttp Websocket Connection example
====================================
In this example you can control the state of some LEDs via WebSocket.
Installation
------------
You need to install aiohttp:
'''
pip install aiohttp'''
It is not needed for the general GPIO-lib so it is not part of the installation process.
Configuration
-------------
Make sure to switch the ip address and port number in the index.html-File to the one of your raspberry pi (line 19), so the Browser can connect to it using the WebSocket.
You can configure the GPIO-pins you like to use. See GPIO_LINES in the main.py.
Note that the fedora firewall prevents access on port 8080 by default, so allow that port to be accessed by
`firewall-cmd ...`
Additional notes
----------------
aiohttp still waits for a request when you press CTRL+c to stop it, so you need to refresh the page after pressing CTRL+c to stop the server.
## Instruction:
Update Documentation for aiohttp example
## Code After:
aiohttp Websocket Connection example
====================================
In this example you can control the state of some LEDs via WebSocket.
Installation
------------
You need to install aiohttp:
```pip install aiohttp```
It is not needed for the general GPIO-lib so it is not part of the installation process.
Configuration
-------------
Make sure to switch the ip address and port number in the index.html-File to the one of your raspberry pi (line 19), so the Browser can connect to it using the WebSocket.
You can configure the GPIO-pins you like to use. See GPIO_LINES in the main.py.
Note that the fedora firewall prevents access on port 8080 by default, so allow that port to be accessed by other machines in your network e.g. `firewall-cmd --zone=dmz --add-port=8080/tcp`, take a look at the [documentation](https://docs-old.fedoraproject.org/en-US/Fedora/19/html/Security_Guide/sec-Open_Ports_in_the_firewall-CLI.html)
Additional notes
----------------
aiohttp still waits for a request when you press CTRL+c to stop it, so you need to refresh the page after pressing CTRL+c to stop the server.
| aiohttp Websocket Connection example
====================================
In this example you can control the state of some LEDs via WebSocket.
Installation
------------
You need to install aiohttp:
- '''
- pip install aiohttp'''
? ^^^
+ ```pip install aiohttp```
? +++ ^^^
It is not needed for the general GPIO-lib so it is not part of the installation process.
Configuration
-------------
Make sure to switch the ip address and port number in the index.html-File to the one of your raspberry pi (line 19), so the Browser can connect to it using the WebSocket.
You can configure the GPIO-pins you like to use. See GPIO_LINES in the main.py.
+ Note that the fedora firewall prevents access on port 8080 by default, so allow that port to be accessed by other machines in your network e.g. `firewall-cmd --zone=dmz --add-port=8080/tcp`, take a look at the [documentation](https://docs-old.fedoraproject.org/en-US/Fedora/19/html/Security_Guide/sec-Open_Ports_in_the_firewall-CLI.html)
- Note that the fedora firewall prevents access on port 8080 by default, so allow that port to be accessed by
- `firewall-cmd ...`
Additional notes
----------------
aiohttp still waits for a request when you press CTRL+c to stop it, so you need to refresh the page after pressing CTRL+c to stop the server. | 6 | 0.24 | 2 | 4 |
ee6cfbf3cb2e65199807066dc8fedcfb133e0132 | examples/storage_json.rb | examples/storage_json.rb |
require "bundler"
Bundler.require(:default, :development)
# Uncomment this if you want to make real requests to GCE (you _will_ be billed!)
WebMock.disable!
# This specific example needs google_storage_access_key_id: and google_storage_secret_access_key to be set in ~/.fog
# One can request those keys via Google Developers console in:
# Storage -> Storage -> Settings -> "Interoperability" tab -> "Create a new key"
def test
connection = Fog::Google::StorageJSON.new
puts "Put a bucket..."
puts "----------------"
connection.put_bucket("fog-smoke-test", options = { "x-goog-acl" => "publicReadWrite" })
puts "Get the bucket..."
puts "-----------------"
connection.get_bucket("fog-smoke-test")
puts "Put a test file..."
puts "---------------"
connection.put_object("fog-smoke-test", "my file", "THISISATESTFILE")
puts "Delete the test file..."
puts "---------------"
connection.delete_object("fog-smoke-test", "my file")
puts "Delete the bucket..."
puts "------------------"
connection.delete_bucket("fog-smoke-test")
end
test
|
require "bundler"
Bundler.require(:default, :development)
# Uncomment this if you want to make real requests to GCE (you _will_ be billed!)
WebMock.disable!
# This specific example needs google_storage_access_key_id: and google_storage_secret_access_key to be set in ~/.fog
# One can request those keys via Google Developers console in:
# Storage -> Storage -> Settings -> "Interoperability" tab -> "Create a new key"
def test
connection = Fog::Google::StorageJSON.new
puts "Put a bucket..."
puts "----------------"
connection.put_bucket("fog-smoke-test", options = { "predefinedAcl" => "publicReadWrite" })
puts "Get the bucket..."
puts "-----------------"
connection.get_bucket("fog-smoke-test")
puts "Put a test file..."
puts "---------------"
connection.put_object("fog-smoke-test", "my file", "THISISATESTFILE")
puts "Delete the test file..."
puts "---------------"
connection.delete_object("fog-smoke-test", "my file")
puts "Delete the bucket..."
puts "------------------"
connection.delete_bucket("fog-smoke-test")
end
test
| Update example to use predefinedAcl parameter | Update example to use predefinedAcl parameter
| Ruby | mit | Temikus/fog-google,plribeiro3000/fog-google,fog/fog-google,Temikus/fog-google | ruby | ## Code Before:
require "bundler"
Bundler.require(:default, :development)
# Uncomment this if you want to make real requests to GCE (you _will_ be billed!)
WebMock.disable!
# This specific example needs google_storage_access_key_id: and google_storage_secret_access_key to be set in ~/.fog
# One can request those keys via Google Developers console in:
# Storage -> Storage -> Settings -> "Interoperability" tab -> "Create a new key"
def test
connection = Fog::Google::StorageJSON.new
puts "Put a bucket..."
puts "----------------"
connection.put_bucket("fog-smoke-test", options = { "x-goog-acl" => "publicReadWrite" })
puts "Get the bucket..."
puts "-----------------"
connection.get_bucket("fog-smoke-test")
puts "Put a test file..."
puts "---------------"
connection.put_object("fog-smoke-test", "my file", "THISISATESTFILE")
puts "Delete the test file..."
puts "---------------"
connection.delete_object("fog-smoke-test", "my file")
puts "Delete the bucket..."
puts "------------------"
connection.delete_bucket("fog-smoke-test")
end
test
## Instruction:
Update example to use predefinedAcl parameter
## Code After:
require "bundler"
Bundler.require(:default, :development)
# Uncomment this if you want to make real requests to GCE (you _will_ be billed!)
WebMock.disable!
# This specific example needs google_storage_access_key_id: and google_storage_secret_access_key to be set in ~/.fog
# One can request those keys via Google Developers console in:
# Storage -> Storage -> Settings -> "Interoperability" tab -> "Create a new key"
def test
connection = Fog::Google::StorageJSON.new
puts "Put a bucket..."
puts "----------------"
connection.put_bucket("fog-smoke-test", options = { "predefinedAcl" => "publicReadWrite" })
puts "Get the bucket..."
puts "-----------------"
connection.get_bucket("fog-smoke-test")
puts "Put a test file..."
puts "---------------"
connection.put_object("fog-smoke-test", "my file", "THISISATESTFILE")
puts "Delete the test file..."
puts "---------------"
connection.delete_object("fog-smoke-test", "my file")
puts "Delete the bucket..."
puts "------------------"
connection.delete_bucket("fog-smoke-test")
end
test
|
require "bundler"
Bundler.require(:default, :development)
# Uncomment this if you want to make real requests to GCE (you _will_ be billed!)
WebMock.disable!
# This specific example needs google_storage_access_key_id: and google_storage_secret_access_key to be set in ~/.fog
# One can request those keys via Google Developers console in:
# Storage -> Storage -> Settings -> "Interoperability" tab -> "Create a new key"
def test
connection = Fog::Google::StorageJSON.new
puts "Put a bucket..."
puts "----------------"
- connection.put_bucket("fog-smoke-test", options = { "x-goog-acl" => "publicReadWrite" })
? ^^^^^^^^
+ connection.put_bucket("fog-smoke-test", options = { "predefinedAcl" => "publicReadWrite" })
? ^^^^^^^^^^^
puts "Get the bucket..."
puts "-----------------"
connection.get_bucket("fog-smoke-test")
puts "Put a test file..."
puts "---------------"
connection.put_object("fog-smoke-test", "my file", "THISISATESTFILE")
puts "Delete the test file..."
puts "---------------"
connection.delete_object("fog-smoke-test", "my file")
puts "Delete the bucket..."
puts "------------------"
connection.delete_bucket("fog-smoke-test")
end
test | 2 | 0.057143 | 1 | 1 |
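A side note on the call style both versions of this smoke test share: `put_bucket("fog-smoke-test", options = { ... })` performs an inline assignment inside the argument list. In Ruby that expression merely creates a local variable and evaluates to the hash, which is passed positionally — the name `options` never reaches the method. A stdlib-only sketch (the `put_bucket` below is a hypothetical stub, not fog's real method):

```ruby
# Hypothetical stub standing in for Fog::Google::StorageJSON#put_bucket;
# it simply echoes back what it received.
def put_bucket(bucket_name, opts = {})
  [bucket_name, opts]
end

# The inline `options = {...}` assigns a local *and* evaluates to the hash,
# so the hash travels to the method as an ordinary positional argument.
name, received_opts = put_bucket("fog-smoke-test",
                                 options = { "predefinedAcl" => "publicReadWrite" })

name                              # => "fog-smoke-test"
received_opts["predefinedAcl"]    # => "publicReadWrite"
options.equal?(received_opts)     # => true — the local and the argument are the same hash
```

The rename in this record ("x-goog-acl" → "predefinedAcl") only changes a key inside that hash; the calling convention itself is untouched.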
b3635b85e269a846e994f48745eb012c24e64165 | core/gc/count_spec.rb | core/gc/count_spec.rb | require_relative '../../spec_helper'
describe "GC.count" do
it "returns an integer" do
GC.count.should be_kind_of(Integer)
end
end
| require_relative '../../spec_helper'
describe "GC.count" do
it "returns an integer" do
GC.count.should be_kind_of(Integer)
end
it "increases as collections are run" do
count_before = GC.count
i = 0
while GC.count <= count_before and i < 10
GC.start
i += 1
end
GC.count.should > count_before
end
end
| Move GC.count spec to ruby/spec | Move GC.count spec to ruby/spec
| Ruby | mit | ruby/spec,ruby/spec,ruby/rubyspec,ruby/spec,nobu/rubyspec,ruby/rubyspec,eregon/rubyspec,nobu/rubyspec,eregon/rubyspec,nobu/rubyspec,eregon/rubyspec | ruby | ## Code Before:
require_relative '../../spec_helper'
describe "GC.count" do
it "returns an integer" do
GC.count.should be_kind_of(Integer)
end
end
## Instruction:
Move GC.count spec to ruby/spec
## Code After:
require_relative '../../spec_helper'
describe "GC.count" do
it "returns an integer" do
GC.count.should be_kind_of(Integer)
end
it "increases as collections are run" do
count_before = GC.count
i = 0
while GC.count <= count_before and i < 10
GC.start
i += 1
end
GC.count.should > count_before
end
end
| require_relative '../../spec_helper'
describe "GC.count" do
it "returns an integer" do
GC.count.should be_kind_of(Integer)
end
+
+ it "increases as collections are run" do
+ count_before = GC.count
+ i = 0
+ while GC.count <= count_before and i < 10
+ GC.start
+ i += 1
+ end
+ GC.count.should > count_before
+ end
end | 10 | 1.428571 | 10 | 0 |
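The spec added in this record leans on two CRuby behaviours: `GC.count` reports how many collections have run since the process started, and `GC.start` forces one (the `i < 10` bound is a safety margin in case an implementation defers the collection). The same invariant checked outside of RSpec — note this is CRuby behaviour; counts are implementation-specific:

```ruby
count_before = GC.count   # collections run so far in this process

attempts = 0
# Mirror the spec's retry loop; on CRuby a single GC.start normally suffices.
while GC.count <= count_before && attempts < 10
  GC.start                # request a full garbage collection
  attempts += 1
end

increased = GC.count > count_before   # expected to be true on CRuby
```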
51fd22bc1f2fd5d708e2265fdd93c318bc2bea0d | features/support/form_helpers.rb | features/support/form_helpers.rb | module FormHelpers
def fill_in_fields(names_and_values)
names_and_values.each do |field_name, value|
fill_in_field(field_name, value)
end
end
def fill_in_field(field_name, value)
label_text = field_name.to_s.humanize
if page.first(:select, label_text)
select value, from: label_text
else
fill_in label_text, with: value
end
end
def clear_datetime(label)
base_dom_id = find(:xpath, ".//label[contains(., '#{label}')]")["for"].gsub(/(_[1-5]i)$/, "")
find(:xpath, ".//select[@id='#{base_dom_id}_1i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_2i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_3i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_4i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_5i']").select("")
end
end
RSpec.configuration.include FormHelpers, type: :feature
| module FormHelpers
def fill_in_fields(names_and_values)
names_and_values.each do |field_name, value|
fill_in_field(field_name, value)
end
end
def fill_in_field(field_name, value)
label_text = field_name.to_s.humanize
if page.has_css?("select[text='#{label_text}']")
select value, from: label_text
else
fill_in label_text, with: value
end
end
def clear_datetime(label)
base_dom_id = find(:xpath, ".//label[contains(., '#{label}')]")["for"].gsub(/(_[1-5]i)$/, "")
find(:xpath, ".//select[@id='#{base_dom_id}_1i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_2i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_3i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_4i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_5i']").select("")
end
end
RSpec.configuration.include FormHelpers, type: :feature
| Use has_css? instead of first | Use has_css? instead of first
In older versions of Capybara, first would return nil but now it
raises an exception.
| Ruby | mit | alphagov/manuals-publisher,alphagov/manuals-publisher,alphagov/manuals-publisher | ruby | ## Code Before:
module FormHelpers
def fill_in_fields(names_and_values)
names_and_values.each do |field_name, value|
fill_in_field(field_name, value)
end
end
def fill_in_field(field_name, value)
label_text = field_name.to_s.humanize
if page.first(:select, label_text)
select value, from: label_text
else
fill_in label_text, with: value
end
end
def clear_datetime(label)
base_dom_id = find(:xpath, ".//label[contains(., '#{label}')]")["for"].gsub(/(_[1-5]i)$/, "")
find(:xpath, ".//select[@id='#{base_dom_id}_1i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_2i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_3i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_4i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_5i']").select("")
end
end
RSpec.configuration.include FormHelpers, type: :feature
## Instruction:
Use has_css? instead of first
In older versions of Capybara, first would return nil but now it
raises an exception.
## Code After:
module FormHelpers
def fill_in_fields(names_and_values)
names_and_values.each do |field_name, value|
fill_in_field(field_name, value)
end
end
def fill_in_field(field_name, value)
label_text = field_name.to_s.humanize
if page.has_css?("select[text='#{label_text}']")
select value, from: label_text
else
fill_in label_text, with: value
end
end
def clear_datetime(label)
base_dom_id = find(:xpath, ".//label[contains(., '#{label}')]")["for"].gsub(/(_[1-5]i)$/, "")
find(:xpath, ".//select[@id='#{base_dom_id}_1i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_2i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_3i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_4i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_5i']").select("")
end
end
RSpec.configuration.include FormHelpers, type: :feature
| module FormHelpers
def fill_in_fields(names_and_values)
names_and_values.each do |field_name, value|
fill_in_field(field_name, value)
end
end
def fill_in_field(field_name, value)
label_text = field_name.to_s.humanize
- if page.first(:select, label_text)
+ if page.has_css?("select[text='#{label_text}']")
select value, from: label_text
else
fill_in label_text, with: value
end
end
def clear_datetime(label)
base_dom_id = find(:xpath, ".//label[contains(., '#{label}')]")["for"].gsub(/(_[1-5]i)$/, "")
find(:xpath, ".//select[@id='#{base_dom_id}_1i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_2i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_3i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_4i']").select("")
find(:xpath, ".//select[@id='#{base_dom_id}_5i']").select("")
end
end
RSpec.configuration.include FormHelpers, type: :feature | 2 | 0.068966 | 1 | 1 |
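The commit message in this record explains the motivating API change: in older Capybara, `first` returned `nil` when nothing matched (so it worked as a guard), while newer versions raise — which is why the guard moves to the `has_css?` predicate. Capybara itself is not used below; these are hypothetical stand-ins illustrating only that control-flow difference:

```ruby
# Old behaviour: a finder that returns nil when nothing matches.
def first_or_nil(matches)
  matches.first
end

# New behaviour: a finder that raises when nothing matches.
def first_or_raise(matches)
  raise "element not found" if matches.empty?
  matches.first
end

# Predicate in the has_css? style: safe to use as a guard either way.
def present?(matches)
  !matches.empty?
end

no_selects = []   # a page with no matching <select>

# Guarding on the nil-returning finder works:
old_guard = first_or_nil(no_selects) ? :select : :fill_in    # => :fill_in

# Guarding on the raising finder does not:
new_guard = begin
  first_or_raise(no_selects) ? :select : :fill_in
rescue RuntimeError
  :raised
end                                                          # => :raised

# Hence the fix: guard on the predicate instead.
fixed_guard = present?(no_selects) ? :select : :fill_in      # => :fill_in
```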
2078188588dbf0383ccdbeb6a73f8f38532d93b1 | lib/radbeacon/le_scanner.rb | lib/radbeacon/le_scanner.rb | module Radbeacon
class LeScanner
attr_accessor :duration
def initialize(duration = 5)
@duration = duration
end
def passive_scan
devices = Array.new
scan_output = `sudo hcitool lescan & sleep #{@duration}; sudo kill -2 $!`
scan_output.each_line do |line|
result = line.scan(/^([A-F0-9:]{15}[A-F0-9]{2}) (.*)$/)
if !result.empty?
mac_address = result[0][0]
name = result[0][1]
if !devices.find {|s| s.mac_address == mac_address}
device = BluetoothLeDevice.new(mac_address, name)
devices << device
end
end
end
devices
end
def scan
devices = self.passive_scan
devices.each do |dev|
dev.fetch_characteristics
end
devices
end
end
end
| module Radbeacon
class LeScanner
attr_accessor :duration
def initialize(duration = 5)
@duration = duration
end
def scan_command
rout, wout = IO.pipe
scan_command_str = "sudo hcitool lescan"
pid = Process.spawn(scan_command_str, :out => wout)
begin
Timeout.timeout(@duration) do
Process.wait(pid)
end
rescue Timeout::Error
puts 'Scan process not finished in time, killing it'
Process.kill('INT', pid)
end
wout.close
scan_output = rout.readlines.join("")
rout.close
scan_output
end
def passive_scan
devices = Array.new
scan_output = self.scan_command
scan_output.each_line do |line|
result = line.scan(/^([A-F0-9:]{15}[A-F0-9]{2}) (.*)$/)
if !result.empty?
mac_address = result[0][0]
name = result[0][1]
if !devices.find {|s| s.mac_address == mac_address}
device = BluetoothLeDevice.new(mac_address, name)
devices << device
end
end
end
devices
end
def scan
devices = self.passive_scan
devices.each do |dev|
dev.fetch_characteristics
end
devices
end
end
end
| Monitor lescan process and kill it after timeout | Monitor lescan process and kill it after timeout
| Ruby | mit | RadiusNetworks/radbeacon-gem | ruby | ## Code Before:
module Radbeacon
class LeScanner
attr_accessor :duration
def initialize(duration = 5)
@duration = duration
end
def passive_scan
devices = Array.new
scan_output = `sudo hcitool lescan & sleep #{@duration}; sudo kill -2 $!`
scan_output.each_line do |line|
result = line.scan(/^([A-F0-9:]{15}[A-F0-9]{2}) (.*)$/)
if !result.empty?
mac_address = result[0][0]
name = result[0][1]
if !devices.find {|s| s.mac_address == mac_address}
device = BluetoothLeDevice.new(mac_address, name)
devices << device
end
end
end
devices
end
def scan
devices = self.passive_scan
devices.each do |dev|
dev.fetch_characteristics
end
devices
end
end
end
## Instruction:
Monitor lescan process and kill it after timeout
## Code After:
module Radbeacon
class LeScanner
attr_accessor :duration
def initialize(duration = 5)
@duration = duration
end
def scan_command
rout, wout = IO.pipe
scan_command_str = "sudo hcitool lescan"
pid = Process.spawn(scan_command_str, :out => wout)
begin
Timeout.timeout(@duration) do
Process.wait(pid)
end
rescue Timeout::Error
puts 'Scan process not finished in time, killing it'
Process.kill('INT', pid)
end
wout.close
scan_output = rout.readlines.join("")
rout.close
scan_output
end
def passive_scan
devices = Array.new
scan_output = self.scan_command
scan_output.each_line do |line|
result = line.scan(/^([A-F0-9:]{15}[A-F0-9]{2}) (.*)$/)
if !result.empty?
mac_address = result[0][0]
name = result[0][1]
if !devices.find {|s| s.mac_address == mac_address}
device = BluetoothLeDevice.new(mac_address, name)
devices << device
end
end
end
devices
end
def scan
devices = self.passive_scan
devices.each do |dev|
dev.fetch_characteristics
end
devices
end
end
end
| module Radbeacon
class LeScanner
attr_accessor :duration
def initialize(duration = 5)
@duration = duration
end
+ def scan_command
+ rout, wout = IO.pipe
+ scan_command_str = "sudo hcitool lescan"
+ pid = Process.spawn(scan_command_str, :out => wout)
+ begin
+ Timeout.timeout(@duration) do
+ Process.wait(pid)
+ end
+ rescue Timeout::Error
+ puts 'Scan process not finished in time, killing it'
+ Process.kill('INT', pid)
+ end
+ wout.close
+ scan_output = rout.readlines.join("")
+ rout.close
+ scan_output
+ end
+
def passive_scan
devices = Array.new
- scan_output = `sudo hcitool lescan & sleep #{@duration}; sudo kill -2 $!`
+ scan_output = self.scan_command
scan_output.each_line do |line|
result = line.scan(/^([A-F0-9:]{15}[A-F0-9]{2}) (.*)$/)
if !result.empty?
mac_address = result[0][0]
name = result[0][1]
if !devices.find {|s| s.mac_address == mac_address}
device = BluetoothLeDevice.new(mac_address, name)
devices << device
end
end
end
devices
end
def scan
devices = self.passive_scan
devices.each do |dev|
dev.fetch_characteristics
end
devices
end
end
end | 20 | 0.555556 | 19 | 1 |
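The rewritten scanner in this record supervises its child process explicitly: `Process.spawn` with stdout on a pipe, `Process.wait` inside `Timeout.timeout`, and a SIGINT when the deadline passes. The same pattern with a harmless stand-in child (a throwaway Ruby one-liner replaces `hcitool lescan`; the extra `Process.wait` after the kill — not in the record — reaps the child):

```ruby
require "timeout"
require "rbconfig"

rout, wout = IO.pipe
# Stand-in for `sudo hcitool lescan`: a child that emits one line and then
# blocks far longer than the scan budget.
pid = Process.spawn(RbConfig.ruby, "-e", "puts 'scanning'; STDOUT.flush; sleep 30",
                    out: wout, err: File::NULL)

killed = false
begin
  Timeout.timeout(2) do
    Process.wait(pid)
  end
rescue Timeout::Error
  Process.kill("INT", pid)   # scan not finished in time, interrupt it
  Process.wait(pid)          # reap the child so no zombie is left behind
  killed = true
end

wout.close
scan_output = rout.read      # whatever the child managed to emit
rout.close
```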
e1d71b8504737cba1f89daa8aa43f8afb109219d | src/assemble/manifest.common.json | src/assemble/manifest.common.json | {
"manifest_version": 2,
"name": "WAI-ARIA Landmark Keyboard Navigation",
"short_name": "Landmarks",
"version": "@version@",
"description": "Allows you to navigate a web page via WAI-ARIA landmarks, using the keyboard or a pop-up menu.",
"author": "David Todd, Matthew Tylee Atkinson",
"homepage_url": "http://matatk.agrip.org.uk/landmarks/",
"permissions": [
"activeTab", "storage", "webNavigation"
],
"background": {
"scripts": ["background.js"]
},
"content_scripts": [
{
"matches": ["*://*/*"],
"js": ["content.js"]
}
],
"commands": {
"next-landmark": {
"suggested_key": {
"default": "Alt+Shift+N"
},
"description": "Move to and highlight the next landmark"
},
"prev-landmark": {
"suggested_key": {
"default": "Alt+Shift+P"
},
"description": "Move to and highlight the previous landmark"
}
}
}
| {
"manifest_version": 2,
"name": "Landmark Navigation via Keyboard or Pop-up",
"short_name": "Landmarks",
"version": "@version@",
"description": "Allows you to navigate a web page via WAI-ARIA landmarks, using the keyboard or a pop-up menu.",
"author": "David Todd, Matthew Tylee Atkinson",
"homepage_url": "http://matatk.agrip.org.uk/landmarks/",
"permissions": [
"activeTab", "storage", "webNavigation"
],
"background": {
"scripts": ["background.js"]
},
"content_scripts": [
{
"matches": ["*://*/*"],
"js": ["content.js"]
}
],
"commands": {
"next-landmark": {
"suggested_key": {
"default": "Alt+Shift+N"
},
"description": "Move to and highlight the next landmark"
},
"prev-landmark": {
"suggested_key": {
"default": "Alt+Shift+P"
},
"description": "Move to and highlight the previous landmark"
}
}
}
| Change extension long name for clarity (I hope) | Change extension long name for clarity (I hope)
| JSON | mit | matatk/landmarks,matatk/landmarks,matatk/landmarks | json | ## Code Before:
{
"manifest_version": 2,
"name": "WAI-ARIA Landmark Keyboard Navigation",
"short_name": "Landmarks",
"version": "@version@",
"description": "Allows you to navigate a web page via WAI-ARIA landmarks, using the keyboard or a pop-up menu.",
"author": "David Todd, Matthew Tylee Atkinson",
"homepage_url": "http://matatk.agrip.org.uk/landmarks/",
"permissions": [
"activeTab", "storage", "webNavigation"
],
"background": {
"scripts": ["background.js"]
},
"content_scripts": [
{
"matches": ["*://*/*"],
"js": ["content.js"]
}
],
"commands": {
"next-landmark": {
"suggested_key": {
"default": "Alt+Shift+N"
},
"description": "Move to and highlight the next landmark"
},
"prev-landmark": {
"suggested_key": {
"default": "Alt+Shift+P"
},
"description": "Move to and highlight the previous landmark"
}
}
}
## Instruction:
Change extension long name for clarity (I hope)
## Code After:
{
"manifest_version": 2,
"name": "Landmark Navigation via Keyboard or Pop-up",
"short_name": "Landmarks",
"version": "@version@",
"description": "Allows you to navigate a web page via WAI-ARIA landmarks, using the keyboard or a pop-up menu.",
"author": "David Todd, Matthew Tylee Atkinson",
"homepage_url": "http://matatk.agrip.org.uk/landmarks/",
"permissions": [
"activeTab", "storage", "webNavigation"
],
"background": {
"scripts": ["background.js"]
},
"content_scripts": [
{
"matches": ["*://*/*"],
"js": ["content.js"]
}
],
"commands": {
"next-landmark": {
"suggested_key": {
"default": "Alt+Shift+N"
},
"description": "Move to and highlight the next landmark"
},
"prev-landmark": {
"suggested_key": {
"default": "Alt+Shift+P"
},
"description": "Move to and highlight the previous landmark"
}
}
}
| {
"manifest_version": 2,
- "name": "WAI-ARIA Landmark Keyboard Navigation",
+ "name": "Landmark Navigation via Keyboard or Pop-up",
"short_name": "Landmarks",
"version": "@version@",
"description": "Allows you to navigate a web page via WAI-ARIA landmarks, using the keyboard or a pop-up menu.",
"author": "David Todd, Matthew Tylee Atkinson",
"homepage_url": "http://matatk.agrip.org.uk/landmarks/",
"permissions": [
"activeTab", "storage", "webNavigation"
],
"background": {
"scripts": ["background.js"]
},
"content_scripts": [
{
"matches": ["*://*/*"],
"js": ["content.js"]
}
],
"commands": {
"next-landmark": {
"suggested_key": {
"default": "Alt+Shift+N"
},
"description": "Move to and highlight the next landmark"
},
"prev-landmark": {
"suggested_key": {
"default": "Alt+Shift+P"
},
"description": "Move to and highlight the previous landmark"
}
}
} | 2 | 0.051282 | 1 | 1 |
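The manifest in this record is plain JSON, so its structure — including the renamed `name` field and the keyboard `commands` block — can be inspected with Ruby's stdlib parser. Below is a trimmed copy of the record's data:

```ruby
require "json"

manifest = JSON.parse(<<~MANIFEST)
  {
    "manifest_version": 2,
    "name": "Landmark Navigation via Keyboard or Pop-up",
    "short_name": "Landmarks",
    "commands": {
      "next-landmark": { "suggested_key": { "default": "Alt+Shift+N" } },
      "prev-landmark": { "suggested_key": { "default": "Alt+Shift+P" } }
    }
  }
MANIFEST

manifest["name"]   # => "Landmark Navigation via Keyboard or Pop-up"

# Collect each command's default shortcut:
shortcuts = manifest["commands"].transform_values { |c| c.dig("suggested_key", "default") }
# => { "next-landmark" => "Alt+Shift+N", "prev-landmark" => "Alt+Shift+P" }
```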
e099d5edb511efadfd0ef3bd9d3cf08fad5e9d30 | .travis.yml | .travis.yml | language: java
jdk:
- oraclejdk8
sudo: false
services:
- postgresql
- mysql
# skip installation as this is also done during the script step
install: true
# allow gui testing on travis
before_script:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
- sleep 3 # give xvfb some time to start
- psql -c 'create database jabref;' -U postgres
script:
- ./gradlew check integrationTest
after_script:
# enable codecov report
- ./gradlew jacocoTestReport
- bash <(curl -s https://codecov.io/bash)
# cache gradle dependencies
# https://docs.travis-ci.com/user/languages/java#Caching
before_cache:
- rm -f $HOME/.gradle/caches/modules-2/modules-2.lock
cache:
directories:
- $HOME/.gradle/caches/
- $HOME/.gradle/wrapper/
| language: java
jdk:
- oraclejdk8
sudo: false
services:
- postgresql
- mysql
# skip installation as this is also done during the script step
install: true
# allow gui testing on travis
before_script:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
- sleep 3 # give xvfb some time to start
- psql -c 'create database jabref;' -U postgres
script:
- ./gradlew check
# Integration tests currently often fail with "Process 'Gradle Test Executor 1' finished with non-zero exit value 137"
# They should run, but the result is ignored
- ./gradlew integrationTest --info || exit 0
after_script:
# enable codecov report
- ./gradlew jacocoTestReport
- bash <(curl -s https://codecov.io/bash)
# cache gradle dependencies
# https://docs.travis-ci.com/user/languages/java#Caching
before_cache:
- rm -f $HOME/.gradle/caches/modules-2/modules-2.lock
cache:
directories:
- $HOME/.gradle/caches/
- $HOME/.gradle/wrapper/
| Disable build abort when integrationTest fails | Disable build abort when integrationTest fails
| YAML | mit | Siedlerchr/jabref,mredaelli/jabref,mredaelli/jabref,grimes2/jabref,ayanai1/jabref,mredaelli/jabref,Siedlerchr/jabref,tschechlovdev/jabref,obraliar/jabref,ayanai1/jabref,shitikanth/jabref,mairdl/jabref,motokito/jabref,oscargus/jabref,Braunch/jabref,ayanai1/jabref,tschechlovdev/jabref,mredaelli/jabref,Siedlerchr/jabref,Mr-DLib/jabref,jhshinn/jabref,obraliar/jabref,tschechlovdev/jabref,shitikanth/jabref,motokito/jabref,jhshinn/jabref,oscargus/jabref,Siedlerchr/jabref,zellerdev/jabref,sauliusg/jabref,bartsch-dev/jabref,motokito/jabref,mairdl/jabref,tschechlovdev/jabref,grimes2/jabref,Mr-DLib/jabref,mairdl/jabref,bartsch-dev/jabref,jhshinn/jabref,oscargus/jabref,JabRef/jabref,shitikanth/jabref,JabRef/jabref,mredaelli/jabref,JabRef/jabref,Mr-DLib/jabref,Braunch/jabref,shitikanth/jabref,grimes2/jabref,shitikanth/jabref,sauliusg/jabref,oscargus/jabref,bartsch-dev/jabref,sauliusg/jabref,zellerdev/jabref,tobiasdiez/jabref,tobiasdiez/jabref,zellerdev/jabref,bartsch-dev/jabref,ayanai1/jabref,sauliusg/jabref,grimes2/jabref,Braunch/jabref,jhshinn/jabref,obraliar/jabref,tschechlovdev/jabref,tobiasdiez/jabref,obraliar/jabref,zellerdev/jabref,motokito/jabref,mairdl/jabref,grimes2/jabref,tobiasdiez/jabref,JabRef/jabref,bartsch-dev/jabref,ayanai1/jabref,Braunch/jabref,jhshinn/jabref,mairdl/jabref,obraliar/jabref,oscargus/jabref,Braunch/jabref,motokito/jabref,Mr-DLib/jabref,zellerdev/jabref,Mr-DLib/jabref | yaml | ## Code Before:
language: java
jdk:
- oraclejdk8
sudo: false
services:
- postgresql
- mysql
# skip installation as this is also done during the script step
install: true
# allow gui testing on travis
before_script:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
- sleep 3 # give xvfb some time to start
- psql -c 'create database jabref;' -U postgres
script:
- ./gradlew check integrationTest
after_script:
# enable codecov report
- ./gradlew jacocoTestReport
- bash <(curl -s https://codecov.io/bash)
# cache gradle dependencies
# https://docs.travis-ci.com/user/languages/java#Caching
before_cache:
- rm -f $HOME/.gradle/caches/modules-2/modules-2.lock
cache:
directories:
- $HOME/.gradle/caches/
- $HOME/.gradle/wrapper/
## Instruction:
Disable build abort when integrationTest fails
## Code After:
language: java
jdk:
- oraclejdk8
sudo: false
services:
- postgresql
- mysql
# skip installation as this is also done during the script step
install: true
# allow gui testing on travis
before_script:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
- sleep 3 # give xvfb some time to start
- psql -c 'create database jabref;' -U postgres
script:
- ./gradlew check
# Integration tests currently often fail with "Process 'Gradle Test Executor 1' finished with non-zero exit value 137"
# They should run, but the result is ignored
- ./gradlew integrationTest --info || exit 0
after_script:
# enable codecov report
- ./gradlew jacocoTestReport
- bash <(curl -s https://codecov.io/bash)
# cache gradle dependencies
# https://docs.travis-ci.com/user/languages/java#Caching
before_cache:
- rm -f $HOME/.gradle/caches/modules-2/modules-2.lock
cache:
directories:
- $HOME/.gradle/caches/
- $HOME/.gradle/wrapper/
| language: java
jdk:
- oraclejdk8
sudo: false
services:
- postgresql
- mysql
# skip installation as this is also done during the script step
install: true
# allow gui testing on travis
before_script:
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start"
- sleep 3 # give xvfb some time to start
- psql -c 'create database jabref;' -U postgres
script:
- - ./gradlew check integrationTest
+ - ./gradlew check
+ # Integration tests currently often fail with "Process 'Gradle Test Executor 1' finished with non-zero exit value 137"
+ # They should run, but the result is ignored
+ - ./gradlew integrationTest --info || exit 0
after_script:
# enable codecov report
- ./gradlew jacocoTestReport
- bash <(curl -s https://codecov.io/bash)
# cache gradle dependencies
# https://docs.travis-ci.com/user/languages/java#Caching
before_cache:
- rm -f $HOME/.gradle/caches/modules-2/modules-2.lock
cache:
directories:
- $HOME/.gradle/caches/
- $HOME/.gradle/wrapper/ | 5 | 0.142857 | 4 | 1 |
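The key line in this record is `./gradlew integrationTest --info || exit 0`: in a shell, the right-hand side of `||` runs only when the left command exits non-zero, so the step always ends with status 0 and the build no longer aborts. The mechanics can be checked from Ruby via `sh` (using `false` and the exit code 137 mentioned in the record's comment as stand-ins for the flaky Gradle step):

```ruby
# A failing command on its own: system returns false and the status is preserved.
plain = system("sh", "-c", "exit 137")
plain_status = $?.exitstatus            # => 137

# The same kind of failure masked with `|| exit 0`: the fallback only fires
# on failure, and the line as a whole reports success.
masked = system("sh", "-c", "false || exit 0")
masked_status = $?.exitstatus           # => 0
```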
3a72e1af31dc9f1932747f922b59acff0dc57f10 | lib/spree/adyen/api_response.rb | lib/spree/adyen/api_response.rb | module Spree
module Adyen
class ApiResponse
attr_reader :gateway_response
def initialize gateway_response
@gateway_response = gateway_response
end
def success?
!error_response? && gateway_response.success?
end
def psp_reference
return nil if error_response?
@gateway_response[:psp_reference]
end
def attributes
if error_response?
{}
else
@gateway_response.attributes
end
end
def message
if success?
JSON.pretty_generate(@gateway_response.attributes)
else
error_message
end
end
private
def authorisation_response?
@gateway_response.is_a?(::Adyen::REST::AuthorisePayment::Response)
end
def modification_response?
@gateway_response.is_a?(::Adyen::REST::ModifyPayment::Response)
end
def error_response?
@gateway_response.is_a?(::Adyen::REST::ResponseError)
end
def error_message
if authorisation_response?
@gateway_response[:refusal_reason]
elsif modification_response?
@gateway_response[:response]
elsif error_response?
@gateway_response.message
else
I18n.t(:unknown_gateway_error, scope: "solidus-adyen")
end
end
end
end
end
| module Spree
module Adyen
class ApiResponse
attr_reader :gateway_response
def initialize gateway_response
@gateway_response = gateway_response
end
def success?
!error_response? && gateway_response.success?
end
def redirect?
@gateway_response.is_a?(::Adyen::REST::AuthorisePayment::Response) &&
@gateway_response.attributes["paymentResult.resultCode"] == "RedirectShopper"
end
def psp_reference
return nil if error_response?
@gateway_response[:psp_reference]
end
def attributes
if error_response?
{}
else
@gateway_response.attributes
end
end
def message
if success?
JSON.pretty_generate(@gateway_response.attributes)
else
error_message
end
end
private
def authorisation_response?
@gateway_response.is_a?(::Adyen::REST::AuthorisePayment::Response)
end
def modification_response?
@gateway_response.is_a?(::Adyen::REST::ModifyPayment::Response)
end
def error_response?
@gateway_response.is_a?(::Adyen::REST::ResponseError)
end
def error_message
if authorisation_response?
@gateway_response[:refusal_reason]
elsif modification_response?
@gateway_response[:response]
elsif error_response?
@gateway_response.message
else
I18n.t(:unknown_gateway_error, scope: "solidus-adyen")
end
end
end
end
end
| Add method for determining if adyen response is for a redirect | Add method for determining if adyen response is for a redirect
| Ruby | mit | StemboltHQ/solidus-adyen,freerunningtech/solidus-adyen,StemboltHQ/solidus-adyen,freerunningtech/solidus-adyen,freerunningtech/solidus-adyen,freerunningtech/solidus-adyen,StemboltHQ/solidus-adyen,StemboltHQ/solidus-adyen | ruby | ## Code Before:
module Spree
module Adyen
class ApiResponse
attr_reader :gateway_response
def initialize gateway_response
@gateway_response = gateway_response
end
def success?
!error_response? && gateway_response.success?
end
def psp_reference
return nil if error_response?
@gateway_response[:psp_reference]
end
def attributes
if error_response?
{}
else
@gateway_response.attributes
end
end
def message
if success?
JSON.pretty_generate(@gateway_response.attributes)
else
error_message
end
end
private
def authorisation_response?
@gateway_response.is_a?(::Adyen::REST::AuthorisePayment::Response)
end
def modification_response?
@gateway_response.is_a?(::Adyen::REST::ModifyPayment::Response)
end
def error_response?
@gateway_response.is_a?(::Adyen::REST::ResponseError)
end
def error_message
if authorisation_response?
@gateway_response[:refusal_reason]
elsif modification_response?
@gateway_response[:response]
elsif error_response?
@gateway_response.message
else
I18n.t(:unknown_gateway_error, scope: "solidus-adyen")
end
end
end
end
end
## Instruction:
Add method for determining if adyen response is for a redirect
## Code After:
module Spree
module Adyen
class ApiResponse
attr_reader :gateway_response
def initialize gateway_response
@gateway_response = gateway_response
end
def success?
!error_response? && gateway_response.success?
end
def redirect?
@gateway_response.is_a?(::Adyen::REST::AuthorisePayment::Response) &&
@gateway_response.attributes["paymentResult.resultCode"] == "RedirectShopper"
end
def psp_reference
return nil if error_response?
@gateway_response[:psp_reference]
end
def attributes
if error_response?
{}
else
@gateway_response.attributes
end
end
def message
if success?
JSON.pretty_generate(@gateway_response.attributes)
else
error_message
end
end
private
def authorisation_response?
@gateway_response.is_a?(::Adyen::REST::AuthorisePayment::Response)
end
def modification_response?
@gateway_response.is_a?(::Adyen::REST::ModifyPayment::Response)
end
def error_response?
@gateway_response.is_a?(::Adyen::REST::ResponseError)
end
def error_message
if authorisation_response?
@gateway_response[:refusal_reason]
elsif modification_response?
@gateway_response[:response]
elsif error_response?
@gateway_response.message
else
I18n.t(:unknown_gateway_error, scope: "solidus-adyen")
end
end
end
end
end
| module Spree
module Adyen
class ApiResponse
attr_reader :gateway_response
def initialize gateway_response
@gateway_response = gateway_response
end
def success?
!error_response? && gateway_response.success?
+ end
+
+ def redirect?
+ @gateway_response.is_a?(::Adyen::REST::AuthorisePayment::Response) &&
+ @gateway_response.attributes["paymentResult.resultCode"] == "RedirectShopper"
end
def psp_reference
return nil if error_response?
@gateway_response[:psp_reference]
end
def attributes
if error_response?
{}
else
@gateway_response.attributes
end
end
def message
if success?
JSON.pretty_generate(@gateway_response.attributes)
else
error_message
end
end
private
def authorisation_response?
@gateway_response.is_a?(::Adyen::REST::AuthorisePayment::Response)
end
def modification_response?
@gateway_response.is_a?(::Adyen::REST::ModifyPayment::Response)
end
def error_response?
@gateway_response.is_a?(::Adyen::REST::ResponseError)
end
def error_message
if authorisation_response?
@gateway_response[:refusal_reason]
elsif modification_response?
@gateway_response[:response]
elsif error_response?
@gateway_response.message
else
I18n.t(:unknown_gateway_error, scope: "solidus-adyen")
end
end
end
end
end | 5 | 0.080645 | 5 | 0 |
1881f54c215d54e47ade0d095e7495dec0435ee5 | python/src/com/jetbrains/python/psi/search/PyClassInheritorsSearchExecutor.java | python/src/com/jetbrains/python/psi/search/PyClassInheritorsSearchExecutor.java | package com.jetbrains.python.psi.search;
import com.intellij.openapi.project.Project;
import com.intellij.psi.search.ProjectScope;
import com.intellij.psi.stubs.StubIndex;
import com.intellij.util.Processor;
import com.intellij.util.QueryExecutor;
import com.jetbrains.python.psi.PyClass;
import com.jetbrains.python.psi.stubs.PySuperClassIndex;
import java.util.Collection;
/**
* @author yole
*/
public class PyClassInheritorsSearchExecutor implements QueryExecutor<PyClass, PyClassInheritorsSearch.SearchParameters> {
public boolean execute(final PyClassInheritorsSearch.SearchParameters queryParameters, final Processor<PyClass> consumer) {
PyClass superClass = queryParameters.getSuperClass();
Project project = superClass.getProject();
final Collection<PyClass> candidates = StubIndex.getInstance().get(PySuperClassIndex.KEY, superClass.getName(), project,
ProjectScope.getAllScope(project));
for(PyClass candidate: candidates) {
final PyClass[] classes = candidate.getSuperClasses();
if (classes != null) {
for(PyClass superClassCandidate: classes) {
if (superClassCandidate.isEquivalentTo(superClass)) {
if (!consumer.process(superClassCandidate)) return false;
}
}
}
}
return true;
}
}
| package com.jetbrains.python.psi.search;
import com.intellij.openapi.project.Project;
import com.intellij.psi.search.ProjectScope;
import com.intellij.psi.stubs.StubIndex;
import com.intellij.util.Processor;
import com.intellij.util.QueryExecutor;
import com.jetbrains.python.psi.PyClass;
import com.jetbrains.python.psi.stubs.PySuperClassIndex;
import java.util.Collection;
/**
* @author yole
*/
public class PyClassInheritorsSearchExecutor implements QueryExecutor<PyClass, PyClassInheritorsSearch.SearchParameters> {
public boolean execute(final PyClassInheritorsSearch.SearchParameters queryParameters, final Processor<PyClass> consumer) {
PyClass superClass = queryParameters.getSuperClass();
Project project = superClass.getProject();
final Collection<PyClass> candidates = StubIndex.getInstance().get(PySuperClassIndex.KEY, superClass.getName(), project,
ProjectScope.getAllScope(project));
for(PyClass candidate: candidates) {
final PyClass[] classes = candidate.getSuperClasses();
if (classes != null) {
for(PyClass superClassCandidate: classes) {
if (superClassCandidate.isEquivalentTo(superClass)) {
if (!consumer.process(candidate)) {
return false;
}
else {
break;
}
}
}
}
}
return true;
}
}
| Fix indices with multiple key hits in same file. | Fix indices with multiple key hits in same file.
| Java | apache-2.0 | robovm/robovm-studio,blademainer/intellij-community,holmes/intellij-community,semonte/intellij-community,gnuhub/intellij-community,xfournet/intellij-community,ahb0327/intellij-community,xfournet/intellij-community,ibinti/intellij-community,MER-GROUP/intellij-community,retomerz/intellij-community,petteyg/intellij-community,TangHao1987/intellij-community,ThiagoGarciaAlves/intellij-community,gnuhub/intellij-community,ol-loginov/intellij-community,MichaelNedzelsky/intellij-community,Distrotech/intellij-community,ivan-fedorov/intellij-community,michaelgallacher/intellij-community,da1z/intellij-community,pwoodworth/intellij-community,apixandru/intellij-community,MichaelNedzelsky/intellij-community,salguarnieri/intellij-community,jagguli/intellij-community,blademainer/intellij-community,SerCeMan/intellij-community,SerCeMan/intellij-community,caot/intellij-community,jagguli/intellij-community,vvv1559/intellij-community,ol-loginov/intellij-community,ahb0327/intellij-community,ftomassetti/intellij-community,ahb0327/intellij-community,apixandru/intellij-community,diorcety/intellij-community,retomerz/intellij-community,supersven/intellij-community,blademainer/intellij-community,kool79/intellij-community,wreckJ/intellij-community,adedayo/intellij-community,tmpgit/intellij-community,xfournet/intellij-community,pwoodworth/intellij-community,idea4bsd/idea4bsd,suncycheng/intellij-community,fnouama/intellij-community,caot/intellij-community,Lekanich/intellij-community,muntasirsyed/intellij-community,MER-GROUP/intellij-community,asedunov/intellij-community,ibinti/intellij-community,asedunov/intellij-community,petteyg/intellij-community,orekyuu/intellij-community,kdwink/intellij-community,salguarnieri/intellij-community,holmes/intellij-community,dslomov/intellij-community,semonte/intellij-community,MichaelNedzelsky/intellij-community,youdonghai/intellij-community,apixandru/intellij-community,jagguli/intellij-community,ivan-fedorov/intellij-community,da1z/intellij-
community,orekyuu/intellij-community,retomerz/intellij-community,MER-GROUP/intellij-community,adedayo/intellij-community,suncycheng/intellij-community,wreckJ/intellij-community,caot/intellij-community,holmes/intellij-community,samthor/intellij-community,lucafavatella/intellij-community,akosyakov/intellij-community,clumsy/intellij-community,ThiagoGarciaAlves/intellij-community,vvv1559/intellij-community,lucafavatella/intellij-community,hurricup/intellij-community,dslomov/intellij-community,ivan-fedorov/intellij-community,fitermay/intellij-community,Distrotech/intellij-community,samthor/intellij-community,vladmm/intellij-community,orekyuu/intellij-community,ryano144/intellij-community,apixandru/intellij-community,asedunov/intellij-community,semonte/intellij-community,retomerz/intellij-community,akosyakov/intellij-community,asedunov/intellij-community,robovm/robovm-studio,ibinti/intellij-community,izonder/intellij-community,mglukhikh/intellij-community,slisson/intellij-community,izonder/intellij-community,slisson/intellij-community,vladmm/intellij-community,samthor/intellij-community,kdwink/intellij-community,samthor/intellij-community,slisson/intellij-community,vvv1559/intellij-community,fengbaicanhe/intellij-community,youdonghai/intellij-community,ThiagoGarciaAlves/intellij-community,robovm/robovm-studio,diorcety/intellij-community,ryano144/intellij-community,asedunov/intellij-community,idea4bsd/idea4bsd,da1z/intellij-community,allotria/intellij-community,muntasirsyed/intellij-community,petteyg/intellij-community,allotria/intellij-community,clumsy/intellij-community,ryano144/intellij-community,salguarnieri/intellij-community,slisson/intellij-community,blademainer/intellij-community,diorcety/intellij-community,adedayo/intellij-community,ThiagoGarciaAlves/intellij-community,fnouama/intellij-community,suncycheng/intellij-community,retomerz/intellij-community,supersven/intellij-community,diorcety/intellij-community,dslomov/intellij-community,jagguli/intellij-community,ap
ixandru/intellij-community,kool79/intellij-community,ol-loginov/intellij-community,robovm/robovm-studio,MER-GROUP/intellij-community,lucafavatella/intellij-community,izonder/intellij-community,adedayo/intellij-community,tmpgit/intellij-community,hurricup/intellij-community,vladmm/intellij-community,TangHao1987/intellij-community,caot/intellij-community,blademainer/intellij-community,orekyuu/intellij-community,ibinti/intellij-community,izonder/intellij-community,vvv1559/intellij-community,izonder/intellij-community,muntasirsyed/intellij-community,gnuhub/intellij-community,caot/intellij-community,apixandru/intellij-community,petteyg/intellij-community,dslomov/intellij-community,ivan-fedorov/intellij-community,akosyakov/intellij-community,allotria/intellij-community,MER-GROUP/intellij-community,ryano144/intellij-community,xfournet/intellij-community,nicolargo/intellij-community,ahb0327/intellij-community,holmes/intellij-community,michaelgallacher/intellij-community,gnuhub/intellij-community,asedunov/intellij-community,fitermay/intellij-community,orekyuu/intellij-community,nicolargo/intellij-community,semonte/intellij-community,Distrotech/intellij-community,clumsy/intellij-community,allotria/intellij-community,Distrotech/intellij-community,lucafavatella/intellij-community,lucafavatella/intellij-community,youdonghai/intellij-community,ryano144/intellij-community,apixandru/intellij-community,semonte/intellij-community,alphafoobar/intellij-community,supersven/intellij-community,samthor/intellij-community,FHannes/intellij-community,alphafoobar/intellij-community,adedayo/intellij-community,Lekanich/intellij-community,SerCeMan/intellij-community,michaelgallacher/intellij-community,blademainer/intellij-community,kool79/intellij-community,izonder/intellij-community,youdonghai/intellij-community,da1z/intellij-community,clumsy/intellij-community,supersven/intellij-community,vvv1559/intellij-community,Distrotech/intellij-community,FHannes/intellij-community,ivan-fedorov/intellij-c
ommunity,ahb0327/intellij-community,alphafoobar/intellij-community,TangHao1987/intellij-community,mglukhikh/intellij-community,izonder/intellij-community,ol-loginov/intellij-community,MER-GROUP/intellij-community,ThiagoGarciaAlves/intellij-community,diorcety/intellij-community,da1z/intellij-community,fengbaicanhe/intellij-community,slisson/intellij-community,ryano144/intellij-community,izonder/intellij-community,ryano144/intellij-community,fitermay/intellij-community,clumsy/intellij-community,supersven/intellij-community,pwoodworth/intellij-community,suncycheng/intellij-community,youdonghai/intellij-community,TangHao1987/intellij-community,lucafavatella/intellij-community,alphafoobar/intellij-community,robovm/robovm-studio,TangHao1987/intellij-community,robovm/robovm-studio,xfournet/intellij-community,Lekanich/intellij-community,akosyakov/intellij-community,xfournet/intellij-community,caot/intellij-community,ol-loginov/intellij-community,akosyakov/intellij-community,nicolargo/intellij-community,suncycheng/intellij-community,amith01994/intellij-community,tmpgit/intellij-community,wreckJ/intellij-community,akosyakov/intellij-community,michaelgallacher/intellij-community,mglukhikh/intellij-community,wreckJ/intellij-community,vvv1559/intellij-community,kool79/intellij-community,dslomov/intellij-community,ibinti/intellij-community,clumsy/intellij-community,fengbaicanhe/intellij-community,pwoodworth/intellij-community,ahb0327/intellij-community,ivan-fedorov/intellij-community,kool79/intellij-community,ftomassetti/intellij-community,lucafavatella/intellij-community,ftomassetti/intellij-community,Lekanich/intellij-community,diorcety/intellij-community,fitermay/intellij-community,signed/intellij-community,signed/intellij-community,vladmm/intellij-community,idea4bsd/idea4bsd,ivan-fedorov/intellij-community,kool79/intellij-community,muntasirsyed/intellij-community,supersven/intellij-community,petteyg/intellij-community,signed/intellij-community,adedayo/intellij-community,fengb
aicanhe/intellij-community,ivan-fedorov/intellij-community,youdonghai/intellij-community,asedunov/intellij-community,samthor/intellij-community,akosyakov/intellij-community,slisson/intellij-community,semonte/intellij-community,tmpgit/intellij-community,vladmm/intellij-community,Distrotech/intellij-community,petteyg/intellij-community,apixandru/intellij-community,ryano144/intellij-community,tmpgit/intellij-community,Distrotech/intellij-community,blademainer/intellij-community,fnouama/intellij-community,wreckJ/intellij-community,salguarnieri/intellij-community,semonte/intellij-community,caot/intellij-community,ftomassetti/intellij-community,xfournet/intellij-community,slisson/intellij-community,FHannes/intellij-community,hurricup/intellij-community,supersven/intellij-community,holmes/intellij-community,diorcety/intellij-community,fitermay/intellij-community,lucafavatella/intellij-community,allotria/intellij-community,FHannes/intellij-community,alphafoobar/intellij-community,da1z/intellij-community,youdonghai/intellij-community,orekyuu/intellij-community,muntasirsyed/intellij-community,ol-loginov/intellij-community,nicolargo/intellij-community,pwoodworth/intellij-community,holmes/intellij-community,Distrotech/intellij-community,ibinti/intellij-community,hurricup/intellij-community,vladmm/intellij-community,TangHao1987/intellij-community,akosyakov/intellij-community,jagguli/intellij-community,gnuhub/intellij-community,amith01994/intellij-community,jagguli/intellij-community,michaelgallacher/intellij-community,supersven/intellij-community,diorcety/intellij-community,TangHao1987/intellij-community,hurricup/intellij-community,ahb0327/intellij-community,muntasirsyed/intellij-community,FHannes/intellij-community,mglukhikh/intellij-community,pwoodworth/intellij-community,signed/intellij-community,MichaelNedzelsky/intellij-community,ftomassetti/intellij-community,kdwink/intellij-community,FHannes/intellij-community,MichaelNedzelsky/intellij-community,idea4bsd/idea4bsd,mglukhik
h/intellij-community,allotria/intellij-community,kdwink/intellij-community,alphafoobar/intellij-community,suncycheng/intellij-community,orekyuu/intellij-community,muntasirsyed/intellij-community,blademainer/intellij-community,hurricup/intellij-community,TangHao1987/intellij-community,mglukhikh/intellij-community,muntasirsyed/intellij-community,adedayo/intellij-community,robovm/robovm-studio,kdwink/intellij-community,alphafoobar/intellij-community,fengbaicanhe/intellij-community,wreckJ/intellij-community,retomerz/intellij-community,MER-GROUP/intellij-community,salguarnieri/intellij-community,Distrotech/intellij-community,xfournet/intellij-community,izonder/intellij-community,ol-loginov/intellij-community,tmpgit/intellij-community,kdwink/intellij-community,signed/intellij-community,hurricup/intellij-community,ahb0327/intellij-community,Lekanich/intellij-community,signed/intellij-community,akosyakov/intellij-community,fitermay/intellij-community,pwoodworth/intellij-community,robovm/robovm-studio,blademainer/intellij-community,signed/intellij-community,idea4bsd/idea4bsd,dslomov/intellij-community,SerCeMan/intellij-community,youdonghai/intellij-community,dslomov/intellij-community,hurricup/intellij-community,youdonghai/intellij-community,retomerz/intellij-community,ftomassetti/intellij-community,akosyakov/intellij-community,ol-loginov/intellij-community,slisson/intellij-community,asedunov/intellij-community,vvv1559/intellij-community,Lekanich/intellij-community,ahb0327/intellij-community,blademainer/intellij-community,FHannes/intellij-community,idea4bsd/idea4bsd,amith01994/intellij-community,gnuhub/intellij-community,SerCeMan/intellij-community,apixandru/intellij-community,wreckJ/intellij-community,suncycheng/intellij-community,amith01994/intellij-community,apixandru/intellij-community,retomerz/intellij-community,da1z/intellij-community,clumsy/intellij-community,tmpgit/intellij-community,fitermay/intellij-community,petteyg/intellij-community,jagguli/intellij-community,bl
ademainer/intellij-community,ibinti/intellij-community,asedunov/intellij-community,Lekanich/intellij-community,tmpgit/intellij-community,gnuhub/intellij-community,ivan-fedorov/intellij-community,fitermay/intellij-community,diorcety/intellij-community,alphafoobar/intellij-community,amith01994/intellij-community,idea4bsd/idea4bsd,amith01994/intellij-community,kool79/intellij-community,MER-GROUP/intellij-community,orekyuu/intellij-community,petteyg/intellij-community,supersven/intellij-community,MER-GROUP/intellij-community,SerCeMan/intellij-community,petteyg/intellij-community,dslomov/intellij-community,mglukhikh/intellij-community,ThiagoGarciaAlves/intellij-community,jagguli/intellij-community,retomerz/intellij-community,MER-GROUP/intellij-community,muntasirsyed/intellij-community,apixandru/intellij-community,adedayo/intellij-community,idea4bsd/idea4bsd,ftomassetti/intellij-community,allotria/intellij-community,MichaelNedzelsky/intellij-community,slisson/intellij-community,akosyakov/intellij-community,amith01994/intellij-community,fnouama/intellij-community,samthor/intellij-community,salguarnieri/intellij-community,lucafavatella/intellij-community,vladmm/intellij-community,mglukhikh/intellij-community,da1z/intellij-community,supersven/intellij-community,fnouama/intellij-community,amith01994/intellij-community,hurricup/intellij-community,ol-loginov/intellij-community,caot/intellij-community,ivan-fedorov/intellij-community,ol-loginov/intellij-community,retomerz/intellij-community,dslomov/intellij-community,clumsy/intellij-community,ol-loginov/intellij-community,suncycheng/intellij-community,da1z/intellij-community,ThiagoGarciaAlves/intellij-community,muntasirsyed/intellij-community,samthor/intellij-community,alphafoobar/intellij-community,gnuhub/intellij-community,holmes/intellij-community,fnouama/intellij-community,ryano144/intellij-community,petteyg/intellij-community,caot/intellij-community,fengbaicanhe/intellij-community,SerCeMan/intellij-community,mglukhikh/intell
ij-community,semonte/intellij-community,fengbaicanhe/intellij-community,xfournet/intellij-community,jagguli/intellij-community,apixandru/intellij-community,amith01994/intellij-community,MER-GROUP/intellij-community,ThiagoGarciaAlves/intellij-community,salguarnieri/intellij-community,suncycheng/intellij-community,ibinti/intellij-community,SerCeMan/intellij-community,fitermay/intellij-community,vladmm/intellij-community,fengbaicanhe/intellij-community,holmes/intellij-community,Lekanich/intellij-community,ftomassetti/intellij-community,allotria/intellij-community,semonte/intellij-community,izonder/intellij-community,michaelgallacher/intellij-community,nicolargo/intellij-community,kdwink/intellij-community,semonte/intellij-community,Lekanich/intellij-community,wreckJ/intellij-community,kdwink/intellij-community,nicolargo/intellij-community,caot/intellij-community,ThiagoGarciaAlves/intellij-community,samthor/intellij-community,vvv1559/intellij-community,lucafavatella/intellij-community,asedunov/intellij-community,semonte/intellij-community,robovm/robovm-studio,vladmm/intellij-community,FHannes/intellij-community,MichaelNedzelsky/intellij-community,nicolargo/intellij-community,hurricup/intellij-community,orekyuu/intellij-community,fnouama/intellij-community,ryano144/intellij-community,vvv1559/intellij-community,youdonghai/intellij-community,amith01994/intellij-community,tmpgit/intellij-community,holmes/intellij-community,FHannes/intellij-community,adedayo/intellij-community,xfournet/intellij-community,gnuhub/intellij-community,gnuhub/intellij-community,salguarnieri/intellij-community,alphafoobar/intellij-community,diorcety/intellij-community,SerCeMan/intellij-community,fitermay/intellij-community,Distrotech/intellij-community,ibinti/intellij-community,signed/intellij-community,fengbaicanhe/intellij-community,apixandru/intellij-community,ivan-fedorov/intellij-community,SerCeMan/intellij-community,alphafoobar/intellij-community,ThiagoGarciaAlves/intellij-community,fitermay/
intellij-community,hurricup/intellij-community,MichaelNedzelsky/intellij-community,supersven/intellij-community,izonder/intellij-community,ftomassetti/intellij-community,da1z/intellij-community,jagguli/intellij-community,Distrotech/intellij-community,slisson/intellij-community,ahb0327/intellij-community,diorcety/intellij-community,salguarnieri/intellij-community,FHannes/intellij-community,mglukhikh/intellij-community,FHannes/intellij-community,asedunov/intellij-community,fitermay/intellij-community,Lekanich/intellij-community,mglukhikh/intellij-community,retomerz/intellij-community,akosyakov/intellij-community,holmes/intellij-community,alphafoobar/intellij-community,michaelgallacher/intellij-community,lucafavatella/intellij-community,salguarnieri/intellij-community,pwoodworth/intellij-community,fnouama/intellij-community,muntasirsyed/intellij-community,holmes/intellij-community,holmes/intellij-community,supersven/intellij-community,nicolargo/intellij-community,robovm/robovm-studio,ibinti/intellij-community,orekyuu/intellij-community,fnouama/intellij-community,nicolargo/intellij-community,kool79/intellij-community,Lekanich/intellij-community,FHannes/intellij-community,suncycheng/intellij-community,pwoodworth/intellij-community,allotria/intellij-community,gnuhub/intellij-community,dslomov/intellij-community,fnouama/intellij-community,petteyg/intellij-community,wreckJ/intellij-community,adedayo/intellij-community,kdwink/intellij-community,MichaelNedzelsky/intellij-community,slisson/intellij-community,dslomov/intellij-community,pwoodworth/intellij-community,semonte/intellij-community,ThiagoGarciaAlves/intellij-community,fnouama/intellij-community,signed/intellij-community,hurricup/intellij-community,lucafavatella/intellij-community,tmpgit/intellij-community,signed/intellij-community,michaelgallacher/intellij-community,ThiagoGarciaAlves/intellij-community,adedayo/intellij-community,allotria/intellij-community,samthor/intellij-community,TangHao1987/intellij-community,mglu
khikh/intellij-community,MichaelNedzelsky/intellij-community,fnouama/intellij-community,ibinti/intellij-community,caot/intellij-community,retomerz/intellij-community,jagguli/intellij-community,MER-GROUP/intellij-community,ftomassetti/intellij-community,michaelgallacher/intellij-community,kdwink/intellij-community,kool79/intellij-community,nicolargo/intellij-community,ryano144/intellij-community,nicolargo/intellij-community,asedunov/intellij-community,nicolargo/intellij-community,MichaelNedzelsky/intellij-community,FHannes/intellij-community,youdonghai/intellij-community,hurricup/intellij-community,caot/intellij-community,xfournet/intellij-community,clumsy/intellij-community,TangHao1987/intellij-community,suncycheng/intellij-community,ol-loginov/intellij-community,wreckJ/intellij-community,ahb0327/intellij-community,vladmm/intellij-community,mglukhikh/intellij-community,tmpgit/intellij-community,fengbaicanhe/intellij-community,xfournet/intellij-community,amith01994/intellij-community,gnuhub/intellij-community,TangHao1987/intellij-community,izonder/intellij-community,fengbaicanhe/intellij-community,clumsy/intellij-community,signed/intellij-community,vvv1559/intellij-community,vladmm/intellij-community,orekyuu/intellij-community,signed/intellij-community,samthor/intellij-community,ibinti/intellij-community,kool79/intellij-community,salguarnieri/intellij-community,MichaelNedzelsky/intellij-community,salguarnieri/intellij-community,wreckJ/intellij-community,Lekanich/intellij-community,idea4bsd/idea4bsd,amith01994/intellij-community,youdonghai/intellij-community,jagguli/intellij-community,asedunov/intellij-community,youdonghai/intellij-community,orekyuu/intellij-community,Distrotech/intellij-community,clumsy/intellij-community,allotria/intellij-community,vvv1559/intellij-community,allotria/intellij-community,da1z/intellij-community,blademainer/intellij-community,dslomov/intellij-community,idea4bsd/idea4bsd,vvv1559/intellij-community,tmpgit/intellij-community,vladmm/intell
ij-community,michaelgallacher/intellij-community,petteyg/intellij-community,TangHao1987/intellij-community,vvv1559/intellij-community,allotria/intellij-community,ftomassetti/intellij-community,samthor/intellij-community,da1z/intellij-community,pwoodworth/intellij-community,xfournet/intellij-community,ibinti/intellij-community,ryano144/intellij-community,fitermay/intellij-community,wreckJ/intellij-community,da1z/intellij-community,retomerz/intellij-community,michaelgallacher/intellij-community,robovm/robovm-studio,suncycheng/intellij-community,kool79/intellij-community,kdwink/intellij-community,SerCeMan/intellij-community,fengbaicanhe/intellij-community,idea4bsd/idea4bsd,ivan-fedorov/intellij-community,pwoodworth/intellij-community,semonte/intellij-community,robovm/robovm-studio,clumsy/intellij-community,michaelgallacher/intellij-community,apixandru/intellij-community,slisson/intellij-community,muntasirsyed/intellij-community,adedayo/intellij-community,lucafavatella/intellij-community,diorcety/intellij-community,ftomassetti/intellij-community,SerCeMan/intellij-community,signed/intellij-community,ahb0327/intellij-community,kool79/intellij-community,kdwink/intellij-community,idea4bsd/idea4bsd,idea4bsd/idea4bsd | java | ## Code Before:
package com.jetbrains.python.psi.search;
import com.intellij.openapi.project.Project;
import com.intellij.psi.search.ProjectScope;
import com.intellij.psi.stubs.StubIndex;
import com.intellij.util.Processor;
import com.intellij.util.QueryExecutor;
import com.jetbrains.python.psi.PyClass;
import com.jetbrains.python.psi.stubs.PySuperClassIndex;
import java.util.Collection;
/**
* @author yole
*/
public class PyClassInheritorsSearchExecutor implements QueryExecutor<PyClass, PyClassInheritorsSearch.SearchParameters> {
public boolean execute(final PyClassInheritorsSearch.SearchParameters queryParameters, final Processor<PyClass> consumer) {
PyClass superClass = queryParameters.getSuperClass();
Project project = superClass.getProject();
final Collection<PyClass> candidates = StubIndex.getInstance().get(PySuperClassIndex.KEY, superClass.getName(), project,
ProjectScope.getAllScope(project));
for(PyClass candidate: candidates) {
final PyClass[] classes = candidate.getSuperClasses();
if (classes != null) {
for(PyClass superClassCandidate: classes) {
if (superClassCandidate.isEquivalentTo(superClass)) {
if (!consumer.process(superClassCandidate)) return false;
}
}
}
}
return true;
}
}
## Instruction:
Fix indices with multiple key hits in same file.
## Code After:
package com.jetbrains.python.psi.search;
import com.intellij.openapi.project.Project;
import com.intellij.psi.search.ProjectScope;
import com.intellij.psi.stubs.StubIndex;
import com.intellij.util.Processor;
import com.intellij.util.QueryExecutor;
import com.jetbrains.python.psi.PyClass;
import com.jetbrains.python.psi.stubs.PySuperClassIndex;
import java.util.Collection;
/**
* @author yole
*/
public class PyClassInheritorsSearchExecutor implements QueryExecutor<PyClass, PyClassInheritorsSearch.SearchParameters> {
public boolean execute(final PyClassInheritorsSearch.SearchParameters queryParameters, final Processor<PyClass> consumer) {
PyClass superClass = queryParameters.getSuperClass();
Project project = superClass.getProject();
final Collection<PyClass> candidates = StubIndex.getInstance().get(PySuperClassIndex.KEY, superClass.getName(), project,
ProjectScope.getAllScope(project));
for(PyClass candidate: candidates) {
final PyClass[] classes = candidate.getSuperClasses();
if (classes != null) {
for(PyClass superClassCandidate: classes) {
if (superClassCandidate.isEquivalentTo(superClass)) {
if (!consumer.process(candidate)) {
return false;
}
else {
break;
}
}
}
}
}
return true;
}
}
| package com.jetbrains.python.psi.search;
import com.intellij.openapi.project.Project;
import com.intellij.psi.search.ProjectScope;
import com.intellij.psi.stubs.StubIndex;
import com.intellij.util.Processor;
import com.intellij.util.QueryExecutor;
import com.jetbrains.python.psi.PyClass;
import com.jetbrains.python.psi.stubs.PySuperClassIndex;
import java.util.Collection;
/**
* @author yole
*/
public class PyClassInheritorsSearchExecutor implements QueryExecutor<PyClass, PyClassInheritorsSearch.SearchParameters> {
public boolean execute(final PyClassInheritorsSearch.SearchParameters queryParameters, final Processor<PyClass> consumer) {
PyClass superClass = queryParameters.getSuperClass();
Project project = superClass.getProject();
final Collection<PyClass> candidates = StubIndex.getInstance().get(PySuperClassIndex.KEY, superClass.getName(), project,
ProjectScope.getAllScope(project));
for(PyClass candidate: candidates) {
final PyClass[] classes = candidate.getSuperClasses();
if (classes != null) {
for(PyClass superClassCandidate: classes) {
if (superClassCandidate.isEquivalentTo(superClass)) {
- if (!consumer.process(superClassCandidate)) return false;
+ if (!consumer.process(candidate)) {
+ return false;
+ }
+ else {
+ break;
+ }
}
}
}
}
return true;
}
} | 7 | 0.205882 | 6 | 1 |
383e0993a7ea355413a988fe172525067b2374d2 | tangent.rb | tangent.rb | require "sinatra/base"
require "sinatra/nedry"
require "sequel"
require "redcarpet"
require "pharrell"
require "./lib/all"
require "./config"
require "./app/base"
require "./app/editor"
require "./app/reader"
class Tangent < Sinatra::Base
use Reader
use Editor
end | require "sinatra/base"
require "sinatra/nedry"
require "sequel"
require "redcarpet"
require "pharrell"
require "./lib/all"
require "./config"
require "./app/base"
require "./app/editor"
require "./app/reader"
class Tangent < Sinatra::Base
use Editor
use Reader
end | Switch Reader/Editor loading so redirect works | Switch Reader/Editor loading so redirect works
| Ruby | mit | the-tangent/tangent,the-tangent/tangent | ruby | ## Code Before:
require "sinatra/base"
require "sinatra/nedry"
require "sequel"
require "redcarpet"
require "pharrell"
require "./lib/all"
require "./config"
require "./app/base"
require "./app/editor"
require "./app/reader"
class Tangent < Sinatra::Base
use Reader
use Editor
end
## Instruction:
Switch Reader/Editor loading so redirect works
## Code After:
require "sinatra/base"
require "sinatra/nedry"
require "sequel"
require "redcarpet"
require "pharrell"
require "./lib/all"
require "./config"
require "./app/base"
require "./app/editor"
require "./app/reader"
class Tangent < Sinatra::Base
use Editor
use Reader
end | require "sinatra/base"
require "sinatra/nedry"
require "sequel"
require "redcarpet"
require "pharrell"
require "./lib/all"
require "./config"
require "./app/base"
require "./app/editor"
require "./app/reader"
class Tangent < Sinatra::Base
+ use Editor
use Reader
- use Editor
end | 2 | 0.111111 | 1 | 1 |
69b25eb03a4f3d9e8f5ce4366c3b7944483799bf | client/homebrew/homebrew.less | client/homebrew/homebrew.less |
@import 'naturalcrit/styles/core.less';
.homebrew{
height : 100%;
.page{
display : flex;
height : 100%;
background-color : @steel;
flex-direction : column;
.content{
position : relative;
height : calc(~"100% - 29px"); //Navbar height
flex : auto;
}
}
} |
@import 'naturalcrit/styles/core.less';
.homebrew{
height : 100%;
.page{
display : flex;
height : 100%;
background-color : @steel;
flex-direction : column;
.content{
position : relative;
overflow-y : hidden;
height : calc(~"100% - 29px"); //Navbar height
flex : auto;
}
}
}
| Fix duplicate scrollbar in "share" page | Fix duplicate scrollbar in "share" page | Less | mit | calculuschild/homebrewery,calculuschild/homebrewery | less | ## Code Before:
@import 'naturalcrit/styles/core.less';
.homebrew{
height : 100%;
.page{
display : flex;
height : 100%;
background-color : @steel;
flex-direction : column;
.content{
position : relative;
height : calc(~"100% - 29px"); //Navbar height
flex : auto;
}
}
}
## Instruction:
Fix duplicate scrollbar in "share" page
## Code After:
@import 'naturalcrit/styles/core.less';
.homebrew{
height : 100%;
.page{
display : flex;
height : 100%;
background-color : @steel;
flex-direction : column;
.content{
position : relative;
overflow-y : hidden;
height : calc(~"100% - 29px"); //Navbar height
flex : auto;
}
}
}
|
@import 'naturalcrit/styles/core.less';
.homebrew{
height : 100%;
.page{
display : flex;
height : 100%;
background-color : @steel;
flex-direction : column;
.content{
- position : relative;
+ position : relative;
? ++
+ overflow-y : hidden;
- height : calc(~"100% - 29px"); //Navbar height
+ height : calc(~"100% - 29px"); //Navbar height
? ++
- flex : auto;
+ flex : auto;
? ++
}
}
} | 7 | 0.4375 | 4 | 3 |
3de0404b8885e021f8fe5a4cbfefa127265d9971 | README.md | README.md | REST API SDK for SMS
====================
This is the PHP SDK for the CLX Communications REST API (also called
XMS) for sending and receiving single or batch SMS messages. It also
supports scheduled sends, organizing your frequent recipients into
groups, and customizing your message for each recipient using
parameterization.
This library is compatible with PHP 5.6 and later.
Using
-----
The SDK is packaged using [Composer](https://getcomposer.org/) and is
available in the [Packagist](https://packagist.org/) repository under
the name `clxcommunications/sdk-xms`.
License
-------
This project is licensed under the Apache License Version 2.0. See the
LICENSE.txt file for the license text.
Release procedure
-----------------
The following steps are necessary to perform a release of the SDK:
1. Update to release version in `src/Version.php` and `CHANGELOG.md`.
3. Commit the changes and add a release tag.
4. Prepare `src/Version.php` and `CHANGELOG.md` for next development cycle.
5. Commit again.
6. Push it all to GitHub.
| REST API SDK for SMS
====================
This is the PHP SDK for the CLX Communications REST API (also called
XMS) for sending and receiving single or batch SMS messages. It also
supports scheduled sends, organizing your frequent recipients into
groups, and customizing your message for each recipient using
parameterization.
This library is compatible with PHP 5.6 and later.
Using
-----
The SDK is packaged using [Composer](https://getcomposer.org/) and is
available in the [Packagist](https://packagist.org/) repository under
the name `clxcommunications/sdk-xms`.
License
-------
This project is licensed under the Apache License Version 2.0. See the
LICENSE.txt file for the license text.
Release procedure
-----------------
The following steps are necessary to perform a release of the SDK:
1. Update to release version in `src/Version.php` and `CHANGELOG.md`.
3. Commit the changes and add a release tag.
4. Generate PHP docs and commit to `gh-pages` branch.
5. Prepare `src/Version.php` and `CHANGELOG.md` for next development cycle.
6. Commit again.
7. Push it all to GitHub.
| Add API docs to the release procedure | Add API docs to the release procedure
| Markdown | apache-2.0 | clxcommunications/sdk-xms-php | markdown | ## Code Before:
REST API SDK for SMS
====================
This is the PHP SDK for the CLX Communications REST API (also called
XMS) for sending and receiving single or batch SMS messages. It also
supports scheduled sends, organizing your frequent recipients into
groups, and customizing your message for each recipient using
parameterization.
This library is compatible with PHP 5.6 and later.
Using
-----
The SDK is packaged using [Composer](https://getcomposer.org/) and is
available in the [Packagist](https://packagist.org/) repository under
the name `clxcommunications/sdk-xms`.
License
-------
This project is licensed under the Apache License Version 2.0. See the
LICENSE.txt file for the license text.
Release procedure
-----------------
The following steps are necessary to perform a release of the SDK:
1. Update to release version in `src/Version.php` and `CHANGELOG.md`.
3. Commit the changes and add a release tag.
4. Prepare `src/Version.php` and `CHANGELOG.md` for next development cycle.
5. Commit again.
6. Push it all to GitHub.
## Instruction:
Add API docs to the release procedure
## Code After:
REST API SDK for SMS
====================
This is the PHP SDK for the CLX Communications REST API (also called
XMS) for sending and receiving single or batch SMS messages. It also
supports scheduled sends, organizing your frequent recipients into
groups, and customizing your message for each recipient using
parameterization.
This library is compatible with PHP 5.6 and later.
Using
-----
The SDK is packaged using [Composer](https://getcomposer.org/) and is
available in the [Packagist](https://packagist.org/) repository under
the name `clxcommunications/sdk-xms`.
License
-------
This project is licensed under the Apache License Version 2.0. See the
LICENSE.txt file for the license text.
Release procedure
-----------------
The following steps are necessary to perform a release of the SDK:
1. Update to release version in `src/Version.php` and `CHANGELOG.md`.
3. Commit the changes and add a release tag.
4. Generate PHP docs and commit to `gh-pages` branch.
5. Prepare `src/Version.php` and `CHANGELOG.md` for next development cycle.
6. Commit again.
7. Push it all to GitHub.
| REST API SDK for SMS
====================
This is the PHP SDK for the CLX Communications REST API (also called
XMS) for sending and receiving single or batch SMS messages. It also
supports scheduled sends, organizing your frequent recipients into
groups, and customizing your message for each recipient using
parameterization.
This library is compatible with PHP 5.6 and later.
Using
-----
The SDK is packaged using [Composer](https://getcomposer.org/) and is
available in the [Packagist](https://packagist.org/) repository under
the name `clxcommunications/sdk-xms`.
License
-------
This project is licensed under the Apache License Version 2.0. See the
LICENSE.txt file for the license text.
Release procedure
-----------------
The following steps are necessary to perform a release of the SDK:
1. Update to release version in `src/Version.php` and `CHANGELOG.md`.
3. Commit the changes and add a release tag.
- 4. Prepare `src/Version.php` and `CHANGELOG.md` for next development cycle.
+ 4. Generate PHP docs and commit to `gh-pages` branch.
- 5. Commit again.
+ 5. Prepare `src/Version.php` and `CHANGELOG.md` for next development cycle.
+ 6. Commit again.
+
- 6. Push it all to GitHub.
? ^
+ 7. Push it all to GitHub.
? ^
| 8 | 0.210526 | 5 | 3 |
2a3c94865246a133b6b3e79d9e36af1ce1a38546 | README.md | README.md |
The-Texting-Tree
================
A small Python application for controlling an internet-connected Christmas Tree. It receives an incoming SMS from Twilio, parses the color in the body of the message, and hits the Spark Core's API to deliver an RGB value. The Spark Core then changes the color of a strand of lights on a Christmas Tree.
A full tutorial for this project is here: [Placeholder]
| The Texting Tree
================
A small Python application for controlling an internet-connected Christmas Tree. It receives an incoming SMS from Twilio, parses the color in the body of the message, and hits the Spark Core's API to deliver an RGB value. The Spark Core then changes the color of a strand of lights on a Christmas Tree.
A full tutorial for this project is here: [The Texting Tree](http://willd.me/posts/the-texting-tree)
| Add tutorial URL to repo | Add tutorial URL to repo
| Markdown | mit | willdages/The-Texting-Tree | markdown | ## Code Before:
The-Texting-Tree
================
A small Python application for controlling an internet-connected Christmas Tree. It receives an incoming SMS from Twilio, parses the color in the body of the message, and hits the Spark Core's API to deliver an RGB value. The Spark Core then changes the color of a strand of lights on a Christmas Tree.
A full tutorial for this project is here: [Placeholder]
## Instruction:
Add tutorial URL to repo
## Code After:
The Texting Tree
================
A small Python application for controlling an internet-connected Christmas Tree. It receives an incoming SMS from Twilio, parses the color in the body of the message, and hits the Spark Core's API to deliver an RGB value. The Spark Core then changes the color of a strand of lights on a Christmas Tree.
A full tutorial for this project is here: [The Texting Tree](http://willd.me/posts/the-texting-tree)
| - The-Texting-Tree
? ^ ^
+ The Texting Tree
? ^ ^
================
A small Python application for controlling an internet-connected Christmas Tree. It receives an incoming SMS from Twilio, parses the color in the body of the message, and hits the Spark Core's API to deliver an RGB value. The Spark Core then changes the color of a strand of lights on a Christmas Tree.
- A full tutorial for this project is here: [Placeholder]
+ A full tutorial for this project is here: [The Texting Tree](http://willd.me/posts/the-texting-tree)
| 4 | 0.666667 | 2 | 2
b1c8ce6ac2658264a97983b185ebef31c0952b33 | depot/tests.py | depot/tests.py |
from django.test import TestCase
from .models import Depot
class DepotTestCase(TestCase):
def test_str(self):
depot = Depot(1, "My depot")
self.assertEqual(depot.__str__(), "Depot My depot")
| from django.test import TestCase
from .models import Depot, Item
class DepotTestCase(TestCase):
def test_str(self):
depot = Depot(1, "My depot")
self.assertEqual(depot.__str__(), "Depot My depot")
class ItemTestCase(TestCase):
def test_str(self):
depot = Depot(2, "My depot")
item = Item(1, "My item", 5, 2, depot, "My shelf")
self.assertEqual(item.__str__(), "5 unit(s) of My item (visib.: 2) in My shelf")
| Add test case for Item __str__ function | Add test case for Item __str__ function
| Python | agpl-3.0 | verleihtool/verleihtool,verleihtool/verleihtool,verleihtool/verleihtool,verleihtool/verleihtool | python | ## Code Before:
from django.test import TestCase
from .models import Depot
class DepotTestCase(TestCase):
def test_str(self):
depot = Depot(1, "My depot")
self.assertEqual(depot.__str__(), "Depot My depot")
## Instruction:
Add test case for Item __str__ function
## Code After:
from django.test import TestCase
from .models import Depot, Item
class DepotTestCase(TestCase):
def test_str(self):
depot = Depot(1, "My depot")
self.assertEqual(depot.__str__(), "Depot My depot")
class ItemTestCase(TestCase):
def test_str(self):
depot = Depot(2, "My depot")
item = Item(1, "My item", 5, 2, depot, "My shelf")
self.assertEqual(item.__str__(), "5 unit(s) of My item (visib.: 2) in My shelf")
| from django.test import TestCase
- from .models import Depot
+ from .models import Depot, Item
? ++++++
class DepotTestCase(TestCase):
def test_str(self):
depot = Depot(1, "My depot")
self.assertEqual(depot.__str__(), "Depot My depot")
+
+ class ItemTestCase(TestCase):
+
+ def test_str(self):
+ depot = Depot(2, "My depot")
+ item = Item(1, "My item", 5, 2, depot, "My shelf")
+ self.assertEqual(item.__str__(), "5 unit(s) of My item (visib.: 2) in My shelf")
| 9 | 1.125 | 8 | 1
e05587103c267cbdf0e71c255c8b176e95a1c1af | Filters/HyperTree/Testing/Cxx/CMakeLists.txt | Filters/HyperTree/Testing/Cxx/CMakeLists.txt |
create_test_sourcelist(Tests ${vtk-module}CxxTests.cxx
TestBinaryHyperTreeGrid.cxx
TestClipHyperOctree.cxx
TestHyperOctreeContourFilter.cxx
TestHyperOctreeCutter.cxx
TestHyperOctreeDual.cxx
TestHyperOctreeSurfaceFilter.cxx
TestHyperOctreeToUniformGrid.cxx
EXTRA_INCLUDE vtkTestDriver.h
)
vtk_module_test_executable(${vtk-module}CxxTests ${Tests})
set(TestsToRun ${Tests})
list(REMOVE_ITEM TestsToRun ${vtk-module}CxxTests.cxx)
# Add all the executables
foreach(test ${TestsToRun})
get_filename_component(TName ${test} NAME_WE)
if(VTK_DATA_ROOT)
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName}
-D ${VTK_DATA_ROOT}
-T ${VTK_TEST_OUTPUT_DIR}
-V Baseline/Graphics/${TName}.png)
else()
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName})
endif()
endforeach()
| create_test_sourcelist(Tests ${vtk-module}CxxTests.cxx
TestBinaryHyperTreeGrid.cxx
TestClipHyperOctree.cxx
TestHyperOctreeContourFilter.cxx
TestHyperOctreeCutter.cxx
TestHyperOctreeDual.cxx
TestHyperOctreeSurfaceFilter.cxx
TestHyperOctreeToUniformGrid.cxx
TestHyperTreeGrid.cxx
TestHyperTreeGridAxisCut.cxx
TestHyperTreeGridGeometry.cxx
EXTRA_INCLUDE vtkTestDriver.h
)
vtk_module_test_executable(${vtk-module}CxxTests ${Tests})
set(TestsToRun ${Tests})
list(REMOVE_ITEM TestsToRun ${vtk-module}CxxTests.cxx)
# Add all the executables
foreach(test ${TestsToRun})
get_filename_component(TName ${test} NAME_WE)
if(VTK_DATA_ROOT)
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName}
-D ${VTK_DATA_ROOT}
-T ${VTK_TEST_OUTPUT_DIR}
-V Baseline/Graphics/${TName}.png)
else()
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName})
endif()
endforeach()
| Include all hyper tree grid tests into test harness | Include all hyper tree grid tests into test harness
Change-Id: I3a4f7708734d3d6a97d20ec76dc98c80084d4981
| Text | bsd-3-clause | johnkit/vtk-dev,candy7393/VTK,keithroe/vtkoptix,demarle/VTK,gram526/VTK,jmerkow/VTK,mspark93/VTK,berendkleinhaneveld/VTK,candy7393/VTK,msmolens/VTK,sankhesh/VTK,biddisco/VTK,aashish24/VTK-old,jmerkow/VTK,mspark93/VTK,collects/VTK,berendkleinhaneveld/VTK,hendradarwin/VTK,SimVascular/VTK,candy7393/VTK,SimVascular/VTK,sumedhasingla/VTK,jmerkow/VTK,SimVascular/VTK,ashray/VTK-EVM,ashray/VTK-EVM,sankhesh/VTK,mspark93/VTK,berendkleinhaneveld/VTK,ashray/VTK-EVM,sumedhasingla/VTK,SimVascular/VTK,jmerkow/VTK,demarle/VTK,johnkit/vtk-dev,ashray/VTK-EVM,demarle/VTK,candy7393/VTK,biddisco/VTK,SimVascular/VTK,biddisco/VTK,keithroe/vtkoptix,sankhesh/VTK,demarle/VTK,SimVascular/VTK,candy7393/VTK,biddisco/VTK,mspark93/VTK,johnkit/vtk-dev,jmerkow/VTK,gram526/VTK,jmerkow/VTK,sumedhasingla/VTK,biddisco/VTK,johnkit/vtk-dev,demarle/VTK,candy7393/VTK,ashray/VTK-EVM,sumedhasingla/VTK,aashish24/VTK-old,collects/VTK,keithroe/vtkoptix,msmolens/VTK,jmerkow/VTK,gram526/VTK,gram526/VTK,sankhesh/VTK,hendradarwin/VTK,ashray/VTK-EVM,keithroe/vtkoptix,hendradarwin/VTK,keithroe/vtkoptix,msmolens/VTK,biddisco/VTK,gram526/VTK,aashish24/VTK-old,mspark93/VTK,berendkleinhaneveld/VTK,msmolens/VTK,candy7393/VTK,johnkit/vtk-dev,msmolens/VTK,hendradarwin/VTK,gram526/VTK,berendkleinhaneveld/VTK,sankhesh/VTK,sumedhasingla/VTK,candy7393/VTK,sumedhasingla/VTK,aashish24/VTK-old,hendradarwin/VTK,johnkit/vtk-dev,SimVascular/VTK,hendradarwin/VTK,msmolens/VTK,keithroe/vtkoptix,sumedhasingla/VTK,mspark93/VTK,mspark93/VTK,johnkit/vtk-dev,sumedhasingla/VTK,collects/VTK,gram526/VTK,berendkleinhaneveld/VTK,SimVascular/VTK,keithroe/vtkoptix,hendradarwin/VTK,sankhesh/VTK,biddisco/VTK,aashish24/VTK-old,ashray/VTK-EVM,msmolens/VTK,collects/VTK,demarle/VTK,sankhesh/VTK,gram526/VTK,msmolens/VTK,mspark93/VTK,sankhesh/VTK,jmerkow/VTK,collects/VTK,demarle/VTK,aashish24/VTK-old,collects/VTK,berendkleinhaneveld/VTK,ashray/VTK-EVM,demarle/VTK,keithroe/vtkoptix | text | ## Code Before:
create_test_sourcelist(Tests ${vtk-module}CxxTests.cxx
TestBinaryHyperTreeGrid.cxx
TestClipHyperOctree.cxx
TestHyperOctreeContourFilter.cxx
TestHyperOctreeCutter.cxx
TestHyperOctreeDual.cxx
TestHyperOctreeSurfaceFilter.cxx
TestHyperOctreeToUniformGrid.cxx
EXTRA_INCLUDE vtkTestDriver.h
)
vtk_module_test_executable(${vtk-module}CxxTests ${Tests})
set(TestsToRun ${Tests})
list(REMOVE_ITEM TestsToRun ${vtk-module}CxxTests.cxx)
# Add all the executables
foreach(test ${TestsToRun})
get_filename_component(TName ${test} NAME_WE)
if(VTK_DATA_ROOT)
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName}
-D ${VTK_DATA_ROOT}
-T ${VTK_TEST_OUTPUT_DIR}
-V Baseline/Graphics/${TName}.png)
else()
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName})
endif()
endforeach()
## Instruction:
Include all hyper tree grid tests into test harness
Change-Id: I3a4f7708734d3d6a97d20ec76dc98c80084d4981
## Code After:
create_test_sourcelist(Tests ${vtk-module}CxxTests.cxx
TestBinaryHyperTreeGrid.cxx
TestClipHyperOctree.cxx
TestHyperOctreeContourFilter.cxx
TestHyperOctreeCutter.cxx
TestHyperOctreeDual.cxx
TestHyperOctreeSurfaceFilter.cxx
TestHyperOctreeToUniformGrid.cxx
TestHyperTreeGrid.cxx
TestHyperTreeGridAxisCut.cxx
TestHyperTreeGridGeometry.cxx
EXTRA_INCLUDE vtkTestDriver.h
)
vtk_module_test_executable(${vtk-module}CxxTests ${Tests})
set(TestsToRun ${Tests})
list(REMOVE_ITEM TestsToRun ${vtk-module}CxxTests.cxx)
# Add all the executables
foreach(test ${TestsToRun})
get_filename_component(TName ${test} NAME_WE)
if(VTK_DATA_ROOT)
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName}
-D ${VTK_DATA_ROOT}
-T ${VTK_TEST_OUTPUT_DIR}
-V Baseline/Graphics/${TName}.png)
else()
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName})
endif()
endforeach()
| create_test_sourcelist(Tests ${vtk-module}CxxTests.cxx
TestBinaryHyperTreeGrid.cxx
TestClipHyperOctree.cxx
TestHyperOctreeContourFilter.cxx
TestHyperOctreeCutter.cxx
TestHyperOctreeDual.cxx
TestHyperOctreeSurfaceFilter.cxx
TestHyperOctreeToUniformGrid.cxx
+ TestHyperTreeGrid.cxx
+ TestHyperTreeGridAxisCut.cxx
+ TestHyperTreeGridGeometry.cxx
+
EXTRA_INCLUDE vtkTestDriver.h
)
vtk_module_test_executable(${vtk-module}CxxTests ${Tests})
set(TestsToRun ${Tests})
list(REMOVE_ITEM TestsToRun ${vtk-module}CxxTests.cxx)
# Add all the executables
foreach(test ${TestsToRun})
get_filename_component(TName ${test} NAME_WE)
if(VTK_DATA_ROOT)
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName}
-D ${VTK_DATA_ROOT}
-T ${VTK_TEST_OUTPUT_DIR}
-V Baseline/Graphics/${TName}.png)
else()
add_test(NAME ${vtk-module}Cxx-${TName}
COMMAND ${vtk-module}CxxTests ${TName})
endif()
endforeach()
| 4 | 0.129032 | 4 | 0
80aa07dd6df4ee1431212b899333b34f554bcdd8 | circle.yml | circle.yml |
dependencies:
override:
- mkdir -p $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME
- ln -fs $HOME/$CIRCLE_PROJECT_REPONAME $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME
- |
echo 'export GOPATH=$GOPATH:$HOME/.go_project' >> ~/.circlerc
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && go get -t -d -v ./...
test:
override:
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && make
| dependencies:
override:
- mkdir -p $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME
- ln -fs $HOME/$CIRCLE_PROJECT_REPONAME $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME
- |
echo 'export GOPATH=$GOPATH:$HOME/.go_project' >> ~/.circlerc
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && go get -t -d -v ./...
test:
pre:
- touch .env
override:
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && make
| Fix build failing in Circle CI | Fix build failing in Circle CI
| YAML | mit | MEDIGO/go-zendesk | yaml | ## Code Before:
dependencies:
override:
- mkdir -p $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME
- ln -fs $HOME/$CIRCLE_PROJECT_REPONAME $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME
- |
echo 'export GOPATH=$GOPATH:$HOME/.go_project' >> ~/.circlerc
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && go get -t -d -v ./...
test:
override:
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && make
## Instruction:
Fix build failing in Circle CI
## Code After:
dependencies:
override:
- mkdir -p $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME
- ln -fs $HOME/$CIRCLE_PROJECT_REPONAME $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME
- |
echo 'export GOPATH=$GOPATH:$HOME/.go_project' >> ~/.circlerc
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && go get -t -d -v ./...
test:
pre:
- touch .env
override:
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && make
| dependencies:
override:
- mkdir -p $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME
- ln -fs $HOME/$CIRCLE_PROJECT_REPONAME $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME
- |
echo 'export GOPATH=$GOPATH:$HOME/.go_project' >> ~/.circlerc
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && go get -t -d -v ./...
test:
+ pre:
+ - touch .env
override:
- cd $HOME/.go_project/src/github.com/$CIRCLE_PROJECT_USERNAME/$CIRCLE_PROJECT_REPONAME && make
| 2 | 0.181818 | 2 | 0
72a7d96f6141b98cfae3c72551ff35659ff27607 | scripts/plugins/jquery.popline.justify.js | scripts/plugins/jquery.popline.justify.js |
/*
jquery.popline.justify.js 0.1.0-dev
Version: 0.1.0-dev
Updated: Aug 11th, 2014
(c) 2014 by kenshin54
*/
;(function($) {
$.popline.addButton({
justify: {
iconClass: "fa fa-align-justify",
mode: "edit",
buttons: {
justifyLeft: {
iconClass: "fa fa-align-left",
action: function(event) {
document.execCommand("JustifyLeft");
}
},
justifyCenter: {
iconClass: "fa fa-align-center",
action: function(event) {
document.execCommand("JustifyCenter");
}
},
justifyRight: {
iconClass: "fa fa-align-right",
action: function(event) {
document.execCommand("JustifyRight");
}
},
indent: {
iconClass: "fa fa-indent",
action: function(event) {
document.execCommand("indent");
}
},
outdent: {
iconClass: "fa fa-dedent",
action: function(event) {
document.execCommand("outdent");
}
}
}
}
});
})(jQuery);
| /*
jquery.popline.justify.js 0.1.0-dev
Version: 0.1.0-dev
Updated: Aug 11th, 2014
(c) 2014 by kenshin54
*/
;(function($) {
var removeRedundantParagraphTag = function(popline, align) {
if ($.popline.utils.browser.ie) {
$paragraphs = popline.target.find("p[align=" + align + "]");
$paragraphs.each(function(i, obj){
if (obj.childNodes.length === 1 && obj.childNodes[0].nodeType === 3 && !/\S/.test(obj.childNodes[0].nodeValue)) {
$(obj).remove();
}
})
}
}
$.popline.addButton({
justify: {
iconClass: "fa fa-align-justify",
mode: "edit",
buttons: {
justifyLeft: {
iconClass: "fa fa-align-left",
action: function(event, popline) {
document.execCommand("JustifyLeft");
removeRedundantParagraphTag(popline, "left");
}
},
justifyCenter: {
iconClass: "fa fa-align-center",
action: function(event, popline) {
document.execCommand("JustifyCenter");
removeRedundantParagraphTag(popline, "center");
}
},
justifyRight: {
iconClass: "fa fa-align-right",
action: function(event, popline) {
document.execCommand("JustifyRight");
removeRedundantParagraphTag(popline, "right");
}
},
indent: {
iconClass: "fa fa-indent",
action: function(event) {
document.execCommand("indent");
}
},
outdent: {
iconClass: "fa fa-dedent",
action: function(event) {
document.execCommand("outdent");
}
}
}
}
});
})(jQuery);
| Remove redundant empty paragraph tag under IE family. | Remove redundant empty paragraph tag under IE family.
| JavaScript | mit | ztx1491/popline,ztx1491/popline,kenshin54/popline,roryok/popline,kenshin54/popline,roryok/popline | javascript | ## Code Before:
/*
jquery.popline.justify.js 0.1.0-dev
Version: 0.1.0-dev
Updated: Aug 11th, 2014
(c) 2014 by kenshin54
*/
;(function($) {
$.popline.addButton({
justify: {
iconClass: "fa fa-align-justify",
mode: "edit",
buttons: {
justifyLeft: {
iconClass: "fa fa-align-left",
action: function(event) {
document.execCommand("JustifyLeft");
}
},
justifyCenter: {
iconClass: "fa fa-align-center",
action: function(event) {
document.execCommand("JustifyCenter");
}
},
justifyRight: {
iconClass: "fa fa-align-right",
action: function(event) {
document.execCommand("JustifyRight");
}
},
indent: {
iconClass: "fa fa-indent",
action: function(event) {
document.execCommand("indent");
}
},
outdent: {
iconClass: "fa fa-dedent",
action: function(event) {
document.execCommand("outdent");
}
}
}
}
});
})(jQuery);
## Instruction:
Remove redundant empty paragraph tag under IE family.
## Code After:
/*
jquery.popline.justify.js 0.1.0-dev
Version: 0.1.0-dev
Updated: Aug 11th, 2014
(c) 2014 by kenshin54
*/
;(function($) {
var removeRedundantParagraphTag = function(popline, align) {
if ($.popline.utils.browser.ie) {
$paragraphs = popline.target.find("p[align=" + align + "]");
$paragraphs.each(function(i, obj){
if (obj.childNodes.length === 1 && obj.childNodes[0].nodeType === 3 && !/\S/.test(obj.childNodes[0].nodeValue)) {
$(obj).remove();
}
})
}
}
$.popline.addButton({
justify: {
iconClass: "fa fa-align-justify",
mode: "edit",
buttons: {
justifyLeft: {
iconClass: "fa fa-align-left",
action: function(event, popline) {
document.execCommand("JustifyLeft");
removeRedundantParagraphTag(popline, "left");
}
},
justifyCenter: {
iconClass: "fa fa-align-center",
action: function(event, popline) {
document.execCommand("JustifyCenter");
removeRedundantParagraphTag(popline, "center");
}
},
justifyRight: {
iconClass: "fa fa-align-right",
action: function(event, popline) {
document.execCommand("JustifyRight");
removeRedundantParagraphTag(popline, "right");
}
},
indent: {
iconClass: "fa fa-indent",
action: function(event) {
document.execCommand("indent");
}
},
outdent: {
iconClass: "fa fa-dedent",
action: function(event) {
document.execCommand("outdent");
}
}
}
}
});
})(jQuery);
| /*
jquery.popline.justify.js 0.1.0-dev
Version: 0.1.0-dev
Updated: Aug 11th, 2014
(c) 2014 by kenshin54
*/
;(function($) {
+
+ var removeRedundantParagraphTag = function(popline, align) {
+ if ($.popline.utils.browser.ie) {
+ $paragraphs = popline.target.find("p[align=" + align + "]");
+ $paragraphs.each(function(i, obj){
+ if (obj.childNodes.length === 1 && obj.childNodes[0].nodeType === 3 && !/\S/.test(obj.childNodes[0].nodeValue)) {
+ $(obj).remove();
+ }
+ })
+ }
+ }
+
$.popline.addButton({
justify: {
iconClass: "fa fa-align-justify",
mode: "edit",
buttons: {
justifyLeft: {
iconClass: "fa fa-align-left",
- action: function(event) {
+ action: function(event, popline) {
? +++++++++
document.execCommand("JustifyLeft");
+ removeRedundantParagraphTag(popline, "left");
}
},
justifyCenter: {
iconClass: "fa fa-align-center",
- action: function(event) {
+ action: function(event, popline) {
? +++++++++
document.execCommand("JustifyCenter");
+ removeRedundantParagraphTag(popline, "center");
}
},
justifyRight: {
iconClass: "fa fa-align-right",
- action: function(event) {
+ action: function(event, popline) {
? +++++++++
document.execCommand("JustifyRight");
+ removeRedundantParagraphTag(popline, "right");
}
},
indent: {
iconClass: "fa fa-indent",
action: function(event) {
document.execCommand("indent");
}
},
outdent: {
iconClass: "fa fa-dedent",
action: function(event) {
document.execCommand("outdent");
}
}
}
}
});
})(jQuery);
| 21 | 0.403846 | 18 | 3
9ab38ac73a09913848a07a322621968e1e36ccf5 | .travis.yml | .travis.yml |
language: ruby
sudo: false
cache: bundler
install: true
script:
- './script/ci'
rvm:
- 2.5
- 2.6
- ruby-head
matrix:
allow_failures:
- rvm: ruby-head
notifications:
webhooks:
urls:
- https://webhooks.gitter.im/e/fde2367248d53de4fe70
on_success: change # options: [always|never|change] default: always
on_failure: always # options: [always|never|change] default: always
on_start: never # options: [always|never|change] default: always
| language: ruby
sudo: false
cache: bundler
install: true
script:
- bundle exec rake
rvm:
- 2.6
- 2.7
- ruby-head
matrix:
allow_failures:
- rvm: ruby-head
notifications:
webhooks:
urls:
- https://webhooks.gitter.im/e/fde2367248d53de4fe70
on_success: change # options: [always|never|change] default: always
on_failure: always # options: [always|never|change] default: always
on_start: never # options: [always|never|change] default: always
| Update Travis CI rubies to support 2.6+ only | Update Travis CI rubies to support 2.6+ only
| YAML | mit | lotus/lotus,lotus/lotus,hanami/hanami,hanami/hanami,lotus/lotus | yaml | ## Code Before:
language: ruby
sudo: false
cache: bundler
install: true
script:
- './script/ci'
rvm:
- 2.5
- 2.6
- ruby-head
matrix:
allow_failures:
- rvm: ruby-head
notifications:
webhooks:
urls:
- https://webhooks.gitter.im/e/fde2367248d53de4fe70
on_success: change # options: [always|never|change] default: always
on_failure: always # options: [always|never|change] default: always
on_start: never # options: [always|never|change] default: always
## Instruction:
Update Travis CI rubies to support 2.6+ only
## Code After:
language: ruby
sudo: false
cache: bundler
install: true
script:
- bundle exec rake
rvm:
- 2.6
- 2.7
- ruby-head
matrix:
allow_failures:
- rvm: ruby-head
notifications:
webhooks:
urls:
- https://webhooks.gitter.im/e/fde2367248d53de4fe70
on_success: change # options: [always|never|change] default: always
on_failure: always # options: [always|never|change] default: always
on_start: never # options: [always|never|change] default: always
| language: ruby
sudo: false
cache: bundler
install: true
script:
- - './script/ci'
+ - bundle exec rake
rvm:
- - 2.5
- 2.6
+ - 2.7
- ruby-head
matrix:
allow_failures:
- rvm: ruby-head
notifications:
webhooks:
urls:
- https://webhooks.gitter.im/e/fde2367248d53de4fe70
on_success: change # options: [always|never|change] default: always
on_failure: always # options: [always|never|change] default: always
on_start: never # options: [always|never|change] default: always
| 4 | 0.181818 | 2 | 2
bbb5fa95cd4b9d7fac6ac45546eedcd8a5d14162 | s4v1.py | s4v1.py | from s3v3 import *
| from s3v3 import *
import csv
def write_to_file(filename, data_sample):
example = csv.write(open(filename, 'w', encoding='utf-8'), dialect='excel') # example is the variable of the new file that is open and which we can write to (using utf-8 encoding and an excel dialect).
example.writerows(data_sample) # write rows is going to take the rows in the data sample and write them to the example (i.e. the file name we passed in)
write_to_file("_data/s4-silk_ties.csv", silk_ties) # this is going to create a new csv located in the _data directory, named s4-silk_ties.csv and it is going to contain all of that data from the silk_ties list which we created in s3v2 (silk_ties = filter_col_by_string(data_from_csv, "material", "_silk"))
| Create write to file function | Create write to file function
| Python | mit | alexmilesyounger/ds_basics | python | ## Code Before:
from s3v3 import *
## Instruction:
Create write to file function
## Code After:
from s3v3 import *
import csv
def write_to_file(filename, data_sample):
example = csv.write(open(filename, 'w', encoding='utf-8'), dialect='excel') # example is the variable of the new file that is open and which we can write to (using utf-8 encoding and an excel dialect).
example.writerows(data_sample) # write rows is going to take the rows in the data sample and write them to the example (i.e. the file name we passed in)
write_to_file("_data/s4-silk_ties.csv", silk_ties) # this is going to create a new csv located in the _data directory, named s4-silk_ties.csv and it is going to contain all of that data from the silk_ties list which we created in s3v2 (silk_ties = filter_col_by_string(data_from_csv, "material", "_silk"))
| from s3v3 import *
+ import csv
+ def write_to_file(filename, data_sample):
+ example = csv.write(open(filename, 'w', encoding='utf-8'), dialect='excel') # example is the variable of the new file that is open and which we can write to (using utf-8 encoding and an excel dialect).
+ example.writerows(data_sample) # write rows is going to take the rows in the data sample and write them to the example (i.e. the file name we passed in)
+
+ write_to_file("_data/s4-silk_ties.csv", silk_ties) # this is going to create a new csv located in the _data directory, named s4-silk_ties.csv and it is going to contain all of that data from the silk_ties list which we created in s3v2 (silk_ties = filter_col_by_string(data_from_csv, "material", "_silk"))
+
| 7 | 3.5 | 7 | 0
1b87fd927de9de2d2650b1678bad009a6c828f1d | lib/cryptocompare/stats.rb | lib/cryptocompare/stats.rb | require 'faraday'
require 'json'
module Cryptocompare
module Stats
API_URL = 'https://min-api.cryptocompare.com/stats/rate/limit'.freeze
private_constant :API_URL
# Find out how many calls you have left in the current month, day, hour, minute and second
#
# ==== Returns
#
# [Hash] Returns a hash containing data about calls_made and calls_left
#
# ==== Examples
#
# Find out how calls you have made and have left in the current month, day, hour, minute and second
#
# Cryptocompare::Stats.rate_limit
#
# Sample response
#
# TODO
def self.rate_limit
api_resp = Faraday.get(API_URL)
JSON.parse(api_resp.body)
end
end
end
| require 'faraday'
require 'json'
module Cryptocompare
module Stats
API_URL = 'https://min-api.cryptocompare.com/stats/rate/limit'.freeze
private_constant :API_URL
# Find out how many calls you have left in the current month, day, hour, minute and second
#
# ==== Returns
#
# [Hash] Returns a hash containing data about calls_made and calls_left
#
# ==== Examples
#
# Find out how calls you have made and have left in the current month, day, hour, minute and second
#
# Cryptocompare::Stats.rate_limit
#
# Sample response
#
# {
# "Response" => "Success",
# "Message" => "",
# "HasWarning" => false,
# "Type" => 100,
# "RateLimit" => {},
# "Data" => {
# "calls_made" => {
# "second" => 1,
# "minute" => 2,
# "hour" => 2,
# "day" => 2,
# "month" => 33
# },
# "calls_left" => {
# "second" => 19,
# "minute" => 298,
# "hour" => 2998,
# "day" => 7498,
# "month" => 49967
# }
# }
# }
def self.rate_limit
api_resp = Faraday.get(API_URL)
JSON.parse(api_resp.body)
end
end
end
| Update documentation for Cryptocompare::Stats.rate_limit method | Update documentation for Cryptocompare::Stats.rate_limit method
| Ruby | mit | alexanderdavidpan/cryptocompare,alexanderdavidpan/cryptocompare | ruby | ## Code Before:
require 'faraday'
require 'json'
module Cryptocompare
module Stats
API_URL = 'https://min-api.cryptocompare.com/stats/rate/limit'.freeze
private_constant :API_URL
# Find out how many calls you have left in the current month, day, hour, minute and second
#
# ==== Returns
#
# [Hash] Returns a hash containing data about calls_made and calls_left
#
# ==== Examples
#
# Find out how calls you have made and have left in the current month, day, hour, minute and second
#
# Cryptocompare::Stats.rate_limit
#
# Sample response
#
# TODO
def self.rate_limit
api_resp = Faraday.get(API_URL)
JSON.parse(api_resp.body)
end
end
end
## Instruction:
Update documentation for Cryptocompare::Stats.rate_limit method
## Code After:
require 'faraday'
require 'json'
module Cryptocompare
module Stats
API_URL = 'https://min-api.cryptocompare.com/stats/rate/limit'.freeze
private_constant :API_URL
# Find out how many calls you have left in the current month, day, hour, minute and second
#
# ==== Returns
#
# [Hash] Returns a hash containing data about calls_made and calls_left
#
# ==== Examples
#
    # Find out how many calls you have made and have left in the current month, day, hour, minute and second
#
# Cryptocompare::Stats.rate_limit
#
# Sample response
#
# {
# "Response" => "Success",
# "Message" => "",
# "HasWarning" => false,
# "Type" => 100,
# "RateLimit" => {},
# "Data" => {
# "calls_made" => {
# "second" => 1,
# "minute" => 2,
# "hour" => 2,
# "day" => 2,
# "month" => 33
# },
# "calls_left" => {
# "second" => 19,
# "minute" => 298,
# "hour" => 2998,
# "day" => 7498,
# "month" => 49967
# }
# }
# }
def self.rate_limit
api_resp = Faraday.get(API_URL)
JSON.parse(api_resp.body)
end
end
end
| require 'faraday'
require 'json'
module Cryptocompare
module Stats
API_URL = 'https://min-api.cryptocompare.com/stats/rate/limit'.freeze
private_constant :API_URL
# Find out how many calls you have left in the current month, day, hour, minute and second
#
# ==== Returns
#
# [Hash] Returns a hash containing data about calls_made and calls_left
#
# ==== Examples
#
    # Find out how many calls you have made and have left in the current month, day, hour, minute and second
#
# Cryptocompare::Stats.rate_limit
#
# Sample response
#
- # TODO
+ # {
+ # "Response" => "Success",
+ # "Message" => "",
+ # "HasWarning" => false,
+ # "Type" => 100,
+ # "RateLimit" => {},
+ # "Data" => {
+ # "calls_made" => {
+ # "second" => 1,
+ # "minute" => 2,
+ # "hour" => 2,
+ # "day" => 2,
+ # "month" => 33
+ # },
+ # "calls_left" => {
+ # "second" => 19,
+ # "minute" => 298,
+ # "hour" => 2998,
+ # "day" => 7498,
+ # "month" => 49967
+ # }
+ # }
+ # }
def self.rate_limit
api_resp = Faraday.get(API_URL)
JSON.parse(api_resp.body)
end
end
end | 24 | 0.827586 | 23 | 1 |
11d19d1756f6227db894aabcf4bd02e327e292c7 | tests/test_basic.py | tests/test_basic.py | from hello_world import hello_world
from unittest import TestCase
class BasicTest(TestCase):
def test_basic_hello_world(self):
"""
Test basic hello world messaging
"""
False
| from hello_world import hello_world
from unittest import TestCase
class BasicTest(TestCase):
def test_basic_hello_world(self):
"""
Test basic hello world messaging
"""
self.assertTrue(callable(hello_world))
| Make things a little better | Make things a little better
| Python | mit | jeansaad/hello_world | python | ## Code Before:
from hello_world import hello_world
from unittest import TestCase
class BasicTest(TestCase):
def test_basic_hello_world(self):
"""
Test basic hello world messaging
"""
False
## Instruction:
Make things a little better
## Code After:
from hello_world import hello_world
from unittest import TestCase
class BasicTest(TestCase):
def test_basic_hello_world(self):
"""
Test basic hello world messaging
"""
self.assertTrue(callable(hello_world))
| from hello_world import hello_world
from unittest import TestCase
class BasicTest(TestCase):
def test_basic_hello_world(self):
"""
Test basic hello world messaging
"""
- False
+ self.assertTrue(callable(hello_world)) | 2 | 0.181818 | 1 | 1 |
ae08fc6cf60e3ef0935215a7094109421b112ac6 | scripts/fast_featurecounter.sh | scripts/fast_featurecounter.sh | export LC_ALL=C
infile=$1
# Explicitly set tmp directory to better manage disk needs
tmpdir=$2
blocksize=$3
outfile=$4
DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
mkdir -p $tmpdir
cat $infile |\
parallel --block $blocksize -j95% --pipe --files --tempdir $tmpdir \
awk '{print\ \$2\,\ \$3}' "|" sort "|" awk -f $DIR/mergecounted.awk >tmp1files.txt
# We've processed the files in a big batch, but in all likelihood, there's still too many
# of them to glob all together and sort. So, let's merge in batches of 30 and dedupe again
cat tmp1files.txt | parallel --files --tempdir $tmpdir -Xn30 -j95% \
sort -m {} "|" awk -f scripts/mergecounted.awk ";" rm {} |\
parallel -Xj1 sort -m {} "|" awk -f $DIR/mergecounted.awk ";" rm {} |\
sort -n -r -k2 | awk 'BEGIN {i=0}{i+=1;print i " " $1 " " $2}' >$outfile # Format for bw
rm tmp1files.txt
| export LC_ALL=C
infile=$1
# Explicitly set tmp directory to better manage disk needs
tmpdir=$2
blocksize=$3
outfile=$4
DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
tmpfile=tmp1-$RANDOM.txt
mkdir -p $tmpdir
cat $infile |\
parallel --block $blocksize -j95% --pipe --files --tempdir $tmpdir \
awk '{print\ \$2\,\ \$3}' "|" sort "|" awk -f $DIR/mergecounted.awk >$tmpfile
echo $tmpfile
# We've processed the files in a big batch, but in all likelihood, there's still too many
# of them to glob all together and sort. So, let's merge in batches of 30 and dedupe again
cat $tmpfile | parallel --files --tempdir $tmpdir -Xn30 -j95% \
sort -m {} "|" awk -f scripts/mergecounted.awk ";" rm {} |\
parallel -Xj1 sort -m {} "|" awk -f $DIR/mergecounted.awk ";" rm {} |\
sort -n -r -k2 | awk 'BEGIN {i=0}{i+=1;print i " " $1 " " $2}' >$outfile # Format for bw
rm $tmpfile
| Add random filename to tempfile list | Add random filename to tempfile list
So that multiple processes of the feature counter script can be run, the
tempfile list created is now randomly named.
| Shell | mit | Bookworm-project/BookwormDB,Bookworm-project/BookwormDB | shell | ## Code Before:
export LC_ALL=C
infile=$1
# Explicitly set tmp directory to better manage disk needs
tmpdir=$2
blocksize=$3
outfile=$4
DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
mkdir -p $tmpdir
cat $infile |\
parallel --block $blocksize -j95% --pipe --files --tempdir $tmpdir \
awk '{print\ \$2\,\ \$3}' "|" sort "|" awk -f $DIR/mergecounted.awk >tmp1files.txt
# We've processed the files in a big batch, but in all likelihood, there's still too many
# of them to glob all together and sort. So, let's merge in batches of 30 and dedupe again
cat tmp1files.txt | parallel --files --tempdir $tmpdir -Xn30 -j95% \
sort -m {} "|" awk -f scripts/mergecounted.awk ";" rm {} |\
parallel -Xj1 sort -m {} "|" awk -f $DIR/mergecounted.awk ";" rm {} |\
sort -n -r -k2 | awk 'BEGIN {i=0}{i+=1;print i " " $1 " " $2}' >$outfile # Format for bw
rm tmp1files.txt
## Instruction:
Add random filename to tempfile list
So that multiple processes of the feature counter script can be run, the
tempfile list created is now randomly named.
## Code After:
export LC_ALL=C
infile=$1
# Explicitly set tmp directory to better manage disk needs
tmpdir=$2
blocksize=$3
outfile=$4
DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
tmpfile=tmp1-$RANDOM.txt
mkdir -p $tmpdir
cat $infile |\
parallel --block $blocksize -j95% --pipe --files --tempdir $tmpdir \
awk '{print\ \$2\,\ \$3}' "|" sort "|" awk -f $DIR/mergecounted.awk >$tmpfile
echo $tmpfile
# We've processed the files in a big batch, but in all likelihood, there's still too many
# of them to glob all together and sort. So, let's merge in batches of 30 and dedupe again
cat $tmpfile | parallel --files --tempdir $tmpdir -Xn30 -j95% \
sort -m {} "|" awk -f scripts/mergecounted.awk ";" rm {} |\
parallel -Xj1 sort -m {} "|" awk -f $DIR/mergecounted.awk ";" rm {} |\
sort -n -r -k2 | awk 'BEGIN {i=0}{i+=1;print i " " $1 " " $2}' >$outfile # Format for bw
rm $tmpfile
| export LC_ALL=C
infile=$1
# Explicitly set tmp directory to better manage disk needs
tmpdir=$2
blocksize=$3
outfile=$4
DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
+ tmpfile=tmp1-$RANDOM.txt
mkdir -p $tmpdir
cat $infile |\
parallel --block $blocksize -j95% --pipe --files --tempdir $tmpdir \
- awk '{print\ \$2\,\ \$3}' "|" sort "|" awk -f $DIR/mergecounted.awk >tmp1files.txt
? - -----
+ awk '{print\ \$2\,\ \$3}' "|" sort "|" awk -f $DIR/mergecounted.awk >$tmpfile
? +
+
+ echo $tmpfile
# We've processed the files in a big batch, but in all likelihood, there's still too many
# of them to glob all together and sort. So, let's merge in batches of 30 and dedupe again
- cat tmp1files.txt | parallel --files --tempdir $tmpdir -Xn30 -j95% \
? - -----
+ cat $tmpfile | parallel --files --tempdir $tmpdir -Xn30 -j95% \
? +
sort -m {} "|" awk -f scripts/mergecounted.awk ";" rm {} |\
parallel -Xj1 sort -m {} "|" awk -f $DIR/mergecounted.awk ";" rm {} |\
sort -n -r -k2 | awk 'BEGIN {i=0}{i+=1;print i " " $1 " " $2}' >$outfile # Format for bw
- rm tmp1files.txt
+ rm $tmpfile | 9 | 0.428571 | 6 | 3 |
b1ad9e560224b73235234a9cb0ea4106a081c612 | WHATSNEW.md | WHATSNEW.md |
Please be aware that the Spark clusters are now **persisting
the home directory between launches**. Caution is advised with
regard to long running scripts that write to disk, especially
with regard to write speed and volume size.
#### Automatic page refresher removed
The automatic page refresher that was active on the dashboard
and the cluster and job detail pages has been removed to prevent
unneeded client side memory consumption when having many ATMO
URLs open in tabs at the same time. Just refresh the page manually
instead.
|
Good news everyone - files are now persisted between clusters!
In fact, make changes to your own dotfiles in your home dir -
they will live on between clusters as well.
We will be limiting your directory size to 20GB, so don't go
saving those big honkin' parquet files.
#### Automatic page refresher removed
The automatic page refresher that was active on the dashboard
and the cluster and job detail pages has been removed to prevent
unneeded client side memory consumption when having many ATMO
URLs open in tabs at the same time. Just refresh the page manually
instead.
| Update whats new wording for EFS | Update whats new wording for EFS
| Markdown | mpl-2.0 | mozilla/telemetry-analysis-service,mozilla/telemetry-analysis-service,mozilla/telemetry-analysis-service,mozilla/telemetry-analysis-service | markdown | ## Code Before:
Please be aware that the Spark clusters are now **persisting
the home directory between launches**. Caution is advised with
regard to long running scripts that write to disk, especially
with regard to write speed and volume size.
#### Automatic page refresher removed
The automatic page refresher that was active on the dashboard
and the cluster and job detail pages has been removed to prevent
unneeded client side memory consumption when having many ATMO
URLs open in tabs at the same time. Just refresh the page manually
instead.
## Instruction:
Update whats new wording for EFS
## Code After:
Good news everyone - files are now persisted between clusters!
In fact, make changes to your own dotfiles in your home dir -
they will live on between clusters as well.
We will be limiting your directory size to 20GB, so don't go
saving those big honkin' parquet files.
#### Automatic page refresher removed
The automatic page refresher that was active on the dashboard
and the cluster and job detail pages has been removed to prevent
unneeded client side memory consumption when having many ATMO
URLs open in tabs at the same time. Just refresh the page manually
instead.
|
- Please be aware that the Spark clusters are now **persisting
- the home directory between launches**. Caution is advised with
- regard to long running scripts that write to disk, especially
- with regard to write speed and volume size.
+ Good news everyone - files are now persisted between clusters!
+ In fact, make changes to your own dotfiles in your home dir -
+ they will live on between clusters as well.
+
+ We will be limiting your directory size to 20GB, so don't go
+ saving those big honkin' parquet files.
#### Automatic page refresher removed
The automatic page refresher that was active on the dashboard
and the cluster and job detail pages has been removed to prevent
unneeded client side memory consumption when having many ATMO
URLs open in tabs at the same time. Just refresh the page manually
instead. | 10 | 0.769231 | 6 | 4 |
afa42b0e4bcae9c2a0b5a93f1174041cbf86f27d | Formula/soprano.rb | Formula/soprano.rb | require 'brewkit'
class Soprano <Formula
@url='http://surfnet.dl.sourceforge.net/project/soprano/Soprano/2.3.1/soprano-2.3.1.tar.bz2'
@homepage='http://soprano.sourceforge.net/'
@md5='c9a2c008b80cd5d76599e9d48139dfe9'
depends_on 'qt'
depends_on 'clucene'
depends_on 'redland'
def install
ENV['CLUCENE_HOME'] = HOMEBREW_PREFIX
system "cmake . #{std_cmake_parameters}"
system "make install"
end
end
| require 'brewkit'
class Soprano <Formula
@url='http://surfnet.dl.sourceforge.net/project/soprano/Soprano/2.3.1/soprano-2.3.1.tar.bz2'
@homepage='http://soprano.sourceforge.net/'
@md5='c9a2c008b80cd5d76599e9d48139dfe9'
depends_on 'cmake'
depends_on 'qt'
depends_on 'clucene'
depends_on 'redland'
def install
ENV['CLUCENE_HOME'] = HOMEBREW_PREFIX
system "cmake . #{std_cmake_parameters}"
system "make install"
end
end
| Add cmake as a build dependency for Soprano | Add cmake as a build dependency for Soprano
| Ruby | bsd-2-clause | BrewTestBot/homebrew-core,passerbyid/homebrew-core,gserra-olx/homebrew-core,ilovezfs/homebrew-core,mbcoguno/homebrew-core,aren55555/homebrew-core,mahori/homebrew-core,mahori/homebrew-core,straxhaber/homebrew-core,nbari/homebrew-core,sjackman/homebrew-core,forevergenin/homebrew-core,bcg62/homebrew-core,mbcoguno/homebrew-core,uyjulian/homebrew-core,nandub/homebrew-core,andyjeffries/homebrew-core,wolffaxn/homebrew-core,robohack/homebrew-core,ylluminarious/homebrew-core,ShivaHuang/homebrew-core,jdubois/homebrew-core,jabenninghoff/homebrew-core,sjackman/homebrew-core,phusion/homebrew-core,lemzwerg/homebrew-core,DomT4/homebrew-core,adam-moss/homebrew-core,bigbes/homebrew-core,git-lfs/homebrew-core,phusion/homebrew-core,edporras/homebrew-core,forevergenin/homebrew-core,FinalDes/homebrew-core,j-bennet/homebrew-core,BrewTestBot/homebrew-core,zmwangx/homebrew-core,sdorra/homebrew-core,bigbes/homebrew-core,chrisfinazzo/homebrew-core,aabdnn/homebrew-core,aabdnn/homebrew-core,jdubois/homebrew-core,wolffaxn/homebrew-core,wolffaxn/homebrew-core,adam-moss/homebrew-core,lembacon/homebrew-core,bcg62/homebrew-core,andyjeffries/homebrew-core,zyedidia/homebrew-core,stgraber/homebrew-core,maxim-belkin/homebrew-core,rwhogg/homebrew-core,LinuxbrewTestBot/homebrew-core,mvitz/homebrew-core,uyjulian/homebrew-core,robohack/homebrew-core,jvillard/homebrew-core,JCount/homebrew-core,mvitz/homebrew-core,j-bennet/homebrew-core,bfontaine/homebrew-core,sdorra/homebrew-core,git-lfs/homebrew-core,mvitz/homebrew-core,mvbattista/homebrew-core,spaam/homebrew-core,kunickiaj/homebrew-core,JCount/homebrew-core,kuahyeow/homebrew-core,jvillard/homebrew-core,jvehent/homebrew-core,battlemidget/homebrew-core,bcg62/homebrew-core,moderndeveloperllc/homebrew-core,moderndeveloperllc/homebrew-core,mbcoguno/homebrew-core,reelsense/homebrew-core,lembacon/homebrew-core,mfikes/homebrew-core,ylluminarious/homebrew-core,adamliter/homebrew-core,zmwangx/homebrew-core,kuahyeow/homebrew-core,Moisan/homeb
rew-core,bfontaine/homebrew-core,chrisfinazzo/homebrew-core,filcab/homebrew-core,fvioz/homebrew-core,battlemidget/homebrew-core,maxim-belkin/homebrew-core,cblecker/homebrew-core,aren55555/homebrew-core,nbari/homebrew-core,passerbyid/homebrew-core,jdubois/homebrew-core,FinalDes/homebrew-core,adamliter/homebrew-core,Linuxbrew/homebrew-core,DomT4/homebrew-core,mvbattista/homebrew-core,lemzwerg/homebrew-core,LinuxbrewTestBot/homebrew-core,jabenninghoff/homebrew-core,cblecker/homebrew-core,smessmer/homebrew-core,lasote/homebrew-core,makigumo/homebrew-core,nandub/homebrew-core,tkoenig/homebrew-core,grhawk/homebrew-core,jvehent/homebrew-core,dsXLII/homebrew-core,kunickiaj/homebrew-core,makigumo/homebrew-core,ShivaHuang/homebrew-core,gserra-olx/homebrew-core,reelsense/homebrew-core,Homebrew/homebrew-core,spaam/homebrew-core,tkoenig/homebrew-core,straxhaber/homebrew-core,lasote/homebrew-core,zyedidia/homebrew-core,rwhogg/homebrew-core,ilovezfs/homebrew-core,dsXLII/homebrew-core,jvillard/homebrew-core,Linuxbrew/homebrew-core,lembacon/homebrew-core,Moisan/homebrew-core,Homebrew/homebrew-core,grhawk/homebrew-core,edporras/homebrew-core,smessmer/homebrew-core,fvioz/homebrew-core,stgraber/homebrew-core,mfikes/homebrew-core,filcab/homebrew-core | ruby | ## Code Before:
require 'brewkit'
class Soprano <Formula
@url='http://surfnet.dl.sourceforge.net/project/soprano/Soprano/2.3.1/soprano-2.3.1.tar.bz2'
@homepage='http://soprano.sourceforge.net/'
@md5='c9a2c008b80cd5d76599e9d48139dfe9'
depends_on 'qt'
depends_on 'clucene'
depends_on 'redland'
def install
ENV['CLUCENE_HOME'] = HOMEBREW_PREFIX
system "cmake . #{std_cmake_parameters}"
system "make install"
end
end
## Instruction:
Add cmake as a build dependency for Soprano
## Code After:
require 'brewkit'
class Soprano <Formula
@url='http://surfnet.dl.sourceforge.net/project/soprano/Soprano/2.3.1/soprano-2.3.1.tar.bz2'
@homepage='http://soprano.sourceforge.net/'
@md5='c9a2c008b80cd5d76599e9d48139dfe9'
depends_on 'cmake'
depends_on 'qt'
depends_on 'clucene'
depends_on 'redland'
def install
ENV['CLUCENE_HOME'] = HOMEBREW_PREFIX
system "cmake . #{std_cmake_parameters}"
system "make install"
end
end
| require 'brewkit'
class Soprano <Formula
@url='http://surfnet.dl.sourceforge.net/project/soprano/Soprano/2.3.1/soprano-2.3.1.tar.bz2'
@homepage='http://soprano.sourceforge.net/'
@md5='c9a2c008b80cd5d76599e9d48139dfe9'
+ depends_on 'cmake'
depends_on 'qt'
depends_on 'clucene'
depends_on 'redland'
def install
ENV['CLUCENE_HOME'] = HOMEBREW_PREFIX
system "cmake . #{std_cmake_parameters}"
system "make install"
end
end | 1 | 0.055556 | 1 | 0 |
3b4b108e6dc2dce46442d5f09fbfe59a61baa02a | api/admin/author.py | api/admin/author.py | from django.contrib.admin import ModelAdmin, BooleanFieldListFilter
class AuthorOptions(ModelAdmin):
list_display = ['id', 'user', 'github_username', 'host', 'bio', 'enabled','friends','following','requests','pending']
list_editable = ['user', 'github_username', 'host', 'bio', 'enabled']
list_filter = (
('enabled', BooleanFieldListFilter),
)
def approve_author(self, request, queryset):
try:
queryset.update(enabled=True)
self.message_user(request, "Account(s) successfully enabled")
except:
self.message_user(request, "Failed to enable account(s)")
approve_author.short_description = "enable account(s)"
def disable_author(self, request, queryset):
try:
queryset.update(enabled=False)
self.message_user(request, "Account(s) successfully disabled")
except:
self.message_user(request, "Failed to disable account(s)")
disable_author.short_description = "disable account(s)"
actions = [approve_author, disable_author]
class CachedAuthorOptions(ModelAdmin):
list_display = ['id', 'displayname', 'host', 'url']
list_editable = ['displayname', 'host']
# Deletion should occur only through Author models and friend/followers
def has_delete_permission(self, request, obj=None):
return False
| from django.contrib.admin import ModelAdmin, BooleanFieldListFilter
class AuthorOptions(ModelAdmin):
list_display = ['id', 'user', 'github_username', 'host', 'bio', 'enabled']
list_editable = ['user', 'github_username', 'host', 'bio', 'enabled']
list_filter = (
('enabled', BooleanFieldListFilter),
)
def approve_author(self, request, queryset):
try:
queryset.update(enabled=True)
self.message_user(request, "Account(s) successfully enabled")
except:
self.message_user(request, "Failed to enable account(s)")
approve_author.short_description = "enable account(s)"
def disable_author(self, request, queryset):
try:
queryset.update(enabled=False)
self.message_user(request, "Account(s) successfully disabled")
except:
self.message_user(request, "Failed to disable account(s)")
disable_author.short_description = "disable account(s)"
actions = [approve_author, disable_author]
class CachedAuthorOptions(ModelAdmin):
list_display = ['id', 'displayname', 'host', 'url']
list_editable = ['displayname', 'host']
# Deletion should occur only through Author models and friend/followers
def has_delete_permission(self, request, obj=None):
return False
| Revert "Making more fields viewable" | Revert "Making more fields viewable"
| Python | apache-2.0 | CMPUT404/socialdistribution,CMPUT404/socialdistribution,CMPUT404/socialdistribution | python | ## Code Before:
from django.contrib.admin import ModelAdmin, BooleanFieldListFilter
class AuthorOptions(ModelAdmin):
list_display = ['id', 'user', 'github_username', 'host', 'bio', 'enabled','friends','following','requests','pending']
list_editable = ['user', 'github_username', 'host', 'bio', 'enabled']
list_filter = (
('enabled', BooleanFieldListFilter),
)
def approve_author(self, request, queryset):
try:
queryset.update(enabled=True)
self.message_user(request, "Account(s) successfully enabled")
except:
self.message_user(request, "Failed to enable account(s)")
approve_author.short_description = "enable account(s)"
def disable_author(self, request, queryset):
try:
queryset.update(enabled=False)
self.message_user(request, "Account(s) successfully disabled")
except:
self.message_user(request, "Failed to disable account(s)")
disable_author.short_description = "disable account(s)"
actions = [approve_author, disable_author]
class CachedAuthorOptions(ModelAdmin):
list_display = ['id', 'displayname', 'host', 'url']
list_editable = ['displayname', 'host']
# Deletion should occur only through Author models and friend/followers
def has_delete_permission(self, request, obj=None):
return False
## Instruction:
Revert "Making more fields viewable"
## Code After:
from django.contrib.admin import ModelAdmin, BooleanFieldListFilter
class AuthorOptions(ModelAdmin):
list_display = ['id', 'user', 'github_username', 'host', 'bio', 'enabled']
list_editable = ['user', 'github_username', 'host', 'bio', 'enabled']
list_filter = (
('enabled', BooleanFieldListFilter),
)
def approve_author(self, request, queryset):
try:
queryset.update(enabled=True)
self.message_user(request, "Account(s) successfully enabled")
except:
self.message_user(request, "Failed to enable account(s)")
approve_author.short_description = "enable account(s)"
def disable_author(self, request, queryset):
try:
queryset.update(enabled=False)
self.message_user(request, "Account(s) successfully disabled")
except:
self.message_user(request, "Failed to disable account(s)")
disable_author.short_description = "disable account(s)"
actions = [approve_author, disable_author]
class CachedAuthorOptions(ModelAdmin):
list_display = ['id', 'displayname', 'host', 'url']
list_editable = ['displayname', 'host']
# Deletion should occur only through Author models and friend/followers
def has_delete_permission(self, request, obj=None):
return False
| from django.contrib.admin import ModelAdmin, BooleanFieldListFilter
class AuthorOptions(ModelAdmin):
- list_display = ['id', 'user', 'github_username', 'host', 'bio', 'enabled','friends','following','requests','pending']
? -------------------------------------------
+ list_display = ['id', 'user', 'github_username', 'host', 'bio', 'enabled']
list_editable = ['user', 'github_username', 'host', 'bio', 'enabled']
list_filter = (
('enabled', BooleanFieldListFilter),
)
def approve_author(self, request, queryset):
try:
queryset.update(enabled=True)
self.message_user(request, "Account(s) successfully enabled")
except:
self.message_user(request, "Failed to enable account(s)")
approve_author.short_description = "enable account(s)"
def disable_author(self, request, queryset):
try:
queryset.update(enabled=False)
self.message_user(request, "Account(s) successfully disabled")
except:
self.message_user(request, "Failed to disable account(s)")
disable_author.short_description = "disable account(s)"
actions = [approve_author, disable_author]
class CachedAuthorOptions(ModelAdmin):
list_display = ['id', 'displayname', 'host', 'url']
list_editable = ['displayname', 'host']
# Deletion should occur only through Author models and friend/followers
def has_delete_permission(self, request, obj=None):
return False | 2 | 0.055556 | 1 | 1 |
140ddff4a6ed8b1f81b4f18243d5da4c0d8dfbd2 | circle.yml | circle.yml | machine:
environment:
ENV: dev
SECRET_KEY: testing-on-circle-ci
POSTGRES_USER: ubuntu
POSTGRES_HOST: localhost
UNPP_LOGS_PATH: /tmp
dependencies:
pre:
- pip install -r backend/requirements/dev.pip
test:
override:
- cd backend && python manage.py test --noinput --parallel
- cd backend && flake8
| machine:
environment:
ENV: dev
SECRET_KEY: testing-on-circle-ci
POSTGRES_USER: ubuntu
POSTGRES_HOST: localhost
UNPP_LOGS_PATH: /home/ubuntu/logs
dependencies:
pre:
- mkdir -p /home/ubuntu/logs
- pip install -r backend/requirements/dev.pip
test:
override:
- cd backend && python manage.py test --noinput --parallel
- cd backend && flake8
| Use ubuntu dir for logs | Use ubuntu dir for logs
| YAML | apache-2.0 | unicef/un-partner-portal,unicef/un-partner-portal,unicef/un-partner-portal,unicef/un-partner-portal | yaml | ## Code Before:
machine:
environment:
ENV: dev
SECRET_KEY: testing-on-circle-ci
POSTGRES_USER: ubuntu
POSTGRES_HOST: localhost
UNPP_LOGS_PATH: /tmp
dependencies:
pre:
- pip install -r backend/requirements/dev.pip
test:
override:
- cd backend && python manage.py test --noinput --parallel
- cd backend && flake8
## Instruction:
Use ubuntu dir for logs
## Code After:
machine:
environment:
ENV: dev
SECRET_KEY: testing-on-circle-ci
POSTGRES_USER: ubuntu
POSTGRES_HOST: localhost
UNPP_LOGS_PATH: /home/ubuntu/logs
dependencies:
pre:
- mkdir -p /home/ubuntu/logs
- pip install -r backend/requirements/dev.pip
test:
override:
- cd backend && python manage.py test --noinput --parallel
- cd backend && flake8
| machine:
environment:
ENV: dev
SECRET_KEY: testing-on-circle-ci
POSTGRES_USER: ubuntu
POSTGRES_HOST: localhost
- UNPP_LOGS_PATH: /tmp
+ UNPP_LOGS_PATH: /home/ubuntu/logs
dependencies:
pre:
+ - mkdir -p /home/ubuntu/logs
- pip install -r backend/requirements/dev.pip
test:
override:
- cd backend && python manage.py test --noinput --parallel
- cd backend && flake8 | 3 | 0.1875 | 2 | 1 |
31e2102cd9b4cd882aabe664cbfab90eb3336ad3 | stack.yaml | stack.yaml | resolver: lts-8.2
packages:
- '.'
- location:
git: https://github.com/serokell/kademlia.git
commit: 278171b8ab104c78aa95bcdd9b63c8ced4fb1ed2
extra-dep: true
- location:
git: https://github.com/avieth/network-transport
commit: f2321a103f53f51d36c99383132e3ffa3ef1c401
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-tcp
commit: eb1ed2fe4d4419c860ce060cb1091f7c8959134f
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-inmemory
commit: 5d8ff2b07b9df35cf61329a3d975e2c8cf95c12a
extra-dep: true
nix:
enable: false
packages: []
extra-deps:
- serokell-util-0.1.3.5
- time-units-1.0.0
- log-warper-1.0.2
- universum-0.3
flags: {}
extra-package-dbs: []
| resolver: lts-8.2
packages:
- '.'
- location:
git: https://github.com/serokell/kademlia.git
commit: 278171b8ab104c78aa95bcdd9b63c8ced4fb1ed2
extra-dep: true
- location:
git: https://github.com/avieth/network-transport
commit: f2321a103f53f51d36c99383132e3ffa3ef1c401
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-tcp
commit: ca42a954a15792f5ea8dc203e56fac8175b99c33
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-inmemory
commit: 5d8ff2b07b9df35cf61329a3d975e2c8cf95c12a
extra-dep: true
nix:
enable: false
packages: []
extra-deps:
- serokell-util-0.1.3.5
- time-units-1.0.0
- log-warper-1.0.2
- universum-0.3
flags: {}
extra-package-dbs: []
| Use special nt-tcp revision with special case | Use special nt-tcp revision with special case
0.0.0.0, 127.0.0.1 connections treated as unaddressable. Backwards
compatbile last-minute fixup.
| YAML | apache-2.0 | input-output-hk/pos-haskell-prototype,input-output-hk/cardano-sl,input-output-hk/pos-haskell-prototype,input-output-hk/cardano-sl,input-output-hk/pos-haskell-prototype,input-output-hk/cardano-sl,serokell/tw-rework-sketch,input-output-hk/cardano-sl | yaml | ## Code Before:
resolver: lts-8.2
packages:
- '.'
- location:
git: https://github.com/serokell/kademlia.git
commit: 278171b8ab104c78aa95bcdd9b63c8ced4fb1ed2
extra-dep: true
- location:
git: https://github.com/avieth/network-transport
commit: f2321a103f53f51d36c99383132e3ffa3ef1c401
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-tcp
commit: eb1ed2fe4d4419c860ce060cb1091f7c8959134f
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-inmemory
commit: 5d8ff2b07b9df35cf61329a3d975e2c8cf95c12a
extra-dep: true
nix:
enable: false
packages: []
extra-deps:
- serokell-util-0.1.3.5
- time-units-1.0.0
- log-warper-1.0.2
- universum-0.3
flags: {}
extra-package-dbs: []
## Instruction:
Use special nt-tcp revision with special case
0.0.0.0, 127.0.0.1 connections treated as unaddressable. Backwards
compatible last-minute fixup.
## Code After:
resolver: lts-8.2
packages:
- '.'
- location:
git: https://github.com/serokell/kademlia.git
commit: 278171b8ab104c78aa95bcdd9b63c8ced4fb1ed2
extra-dep: true
- location:
git: https://github.com/avieth/network-transport
commit: f2321a103f53f51d36c99383132e3ffa3ef1c401
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-tcp
commit: ca42a954a15792f5ea8dc203e56fac8175b99c33
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-inmemory
commit: 5d8ff2b07b9df35cf61329a3d975e2c8cf95c12a
extra-dep: true
nix:
enable: false
packages: []
extra-deps:
- serokell-util-0.1.3.5
- time-units-1.0.0
- log-warper-1.0.2
- universum-0.3
flags: {}
extra-package-dbs: []
| resolver: lts-8.2
packages:
- '.'
- location:
git: https://github.com/serokell/kademlia.git
commit: 278171b8ab104c78aa95bcdd9b63c8ced4fb1ed2
extra-dep: true
- location:
git: https://github.com/avieth/network-transport
commit: f2321a103f53f51d36c99383132e3ffa3ef1c401
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-tcp
- commit: eb1ed2fe4d4419c860ce060cb1091f7c8959134f
+ commit: ca42a954a15792f5ea8dc203e56fac8175b99c33
extra-dep: true
- location:
git: https://github.com/avieth/network-transport-inmemory
commit: 5d8ff2b07b9df35cf61329a3d975e2c8cf95c12a
extra-dep: true
nix:
enable: false
packages: []
extra-deps:
- serokell-util-0.1.3.5
- time-units-1.0.0
- log-warper-1.0.2
- universum-0.3
flags: {}
extra-package-dbs: [] | 2 | 0.060606 | 1 | 1 |
9ed49cee1ce669547f6d0278af00c3ad246fec78 | migrations/versions/201608181200_11890f58b1df_add_tracks.py | migrations/versions/201608181200_11890f58b1df_add_tracks.py |
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = '11890f58b1df'
down_revision = '4d4b95748173'
def upgrade():
op.create_table(
'tracks',
sa.Column('id', sa.Integer(), nullable=False, index=True),
sa.Column('title', sa.String(), nullable=False),
sa.Column('event_id', sa.Integer(), nullable=False),
sa.Column('position', sa.Integer(), nullable=False),
sa.Column('description', sa.Text(), nullable=False),
sa.ForeignKeyConstraint(['event_id'], ['events.events.id']),
sa.PrimaryKeyConstraint('id'),
schema='events'
)
def downgrade():
op.drop_table('tracks', schema='events')
|
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = '11890f58b1df'
down_revision = '4d4b95748173'
def upgrade():
op.create_table(
'tracks',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('title', sa.String(), nullable=False),
sa.Column('event_id', sa.Integer(), nullable=False, index=True),
sa.Column('position', sa.Integer(), nullable=False),
sa.Column('description', sa.Text(), nullable=False),
sa.ForeignKeyConstraint(['event_id'], ['events.events.id']),
sa.PrimaryKeyConstraint('id'),
schema='events'
)
def downgrade():
op.drop_table('tracks', schema='events')
| Fix incorrect indexes in alembic revision | Fix incorrect indexes in alembic revision
| Python | mit | ThiefMaster/indico,ThiefMaster/indico,mic4ael/indico,pferreir/indico,pferreir/indico,mvidalgarcia/indico,indico/indico,ThiefMaster/indico,mvidalgarcia/indico,indico/indico,OmeGak/indico,mic4ael/indico,OmeGak/indico,mvidalgarcia/indico,DirkHoffmann/indico,mic4ael/indico,DirkHoffmann/indico,mvidalgarcia/indico,OmeGak/indico,DirkHoffmann/indico,pferreir/indico,mic4ael/indico,OmeGak/indico,indico/indico,ThiefMaster/indico,DirkHoffmann/indico,pferreir/indico,indico/indico | python | ## Code Before:
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = '11890f58b1df'
down_revision = '4d4b95748173'
def upgrade():
op.create_table(
'tracks',
sa.Column('id', sa.Integer(), nullable=False, index=True),
sa.Column('title', sa.String(), nullable=False),
sa.Column('event_id', sa.Integer(), nullable=False),
sa.Column('position', sa.Integer(), nullable=False),
sa.Column('description', sa.Text(), nullable=False),
sa.ForeignKeyConstraint(['event_id'], ['events.events.id']),
sa.PrimaryKeyConstraint('id'),
schema='events'
)
def downgrade():
op.drop_table('tracks', schema='events')
## Instruction:
Fix incorrect indexes in alembic revision
## Code After:
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = '11890f58b1df'
down_revision = '4d4b95748173'
def upgrade():
op.create_table(
'tracks',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('title', sa.String(), nullable=False),
sa.Column('event_id', sa.Integer(), nullable=False, index=True),
sa.Column('position', sa.Integer(), nullable=False),
sa.Column('description', sa.Text(), nullable=False),
sa.ForeignKeyConstraint(['event_id'], ['events.events.id']),
sa.PrimaryKeyConstraint('id'),
schema='events'
)
def downgrade():
op.drop_table('tracks', schema='events')
|
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = '11890f58b1df'
down_revision = '4d4b95748173'
def upgrade():
op.create_table(
'tracks',
- sa.Column('id', sa.Integer(), nullable=False, index=True),
? ------------
+ sa.Column('id', sa.Integer(), nullable=False),
sa.Column('title', sa.String(), nullable=False),
- sa.Column('event_id', sa.Integer(), nullable=False),
+ sa.Column('event_id', sa.Integer(), nullable=False, index=True),
? ++++++++++++
sa.Column('position', sa.Integer(), nullable=False),
sa.Column('description', sa.Text(), nullable=False),
sa.ForeignKeyConstraint(['event_id'], ['events.events.id']),
sa.PrimaryKeyConstraint('id'),
schema='events'
)
def downgrade():
op.drop_table('tracks', schema='events') | 4 | 0.153846 | 2 | 2 |
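The diff column in each record above uses the `- ` / `+ ` / `? ` marker style of Python's `difflib.ndiff`; a minimal sketch of producing that format from the alembic record's changed lines (using only the standard library):

```python
import difflib

old = [
    "    sa.Column('id', sa.Integer(), nullable=False, index=True),\n",
    "    sa.Column('event_id', sa.Integer(), nullable=False),\n",
]
new = [
    "    sa.Column('id', sa.Integer(), nullable=False),\n",
    "    sa.Column('event_id', sa.Integer(), nullable=False, index=True),\n",
]

# ndiff emits '- ' for removed lines, '+ ' for added lines, and '? ' guide
# lines whose dashes/pluses point at the changed characters -- the same
# three markers seen in the diff column of each record.
for line in difflib.ndiff(old, new):
    print(line, end="")
```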
c2ba8eb03ec505a104814505e2be00da529d50ea | libraries/instance_tar.rb | libraries/instance_tar.rb |
require 'chef/resource/directory'
module ChefJava
module Instance
class Tar
def initialize(path, options, action)
@path = path
@options = options
@action = action
end
def install
end
def remove
end
private
# TODO: Memoize this.
def manage_directory
dir = Chef::Directory.new(@path)
dir.owner 'root'
dir.group 'root'
dir.mode 00755
dir.run_action(:create)
end
#
def tar
end
def decompress
end
end
end
end |
require 'chef/resource/directory'
require_relative 'tar_helper'
module ChefJava
module Instance
class Tar
def initialize(options)
@options = options
end
def install
extract_tar
end
def remove
remove_extracted_tar
end
private
def archive
@options[:source]
end
def destination
@options[:destination]
end
#
def extract_tar
ChefJava::Helpers::Tar.new(archive, destination)
end
def remove_extracted_tar
FileUtils.rm_rf(destination)
end
end
end
end | Make our instance helper to dealing with tar files use our tar_helper. Remove the extracted mess if requested. | Make our instance helper to dealing with tar files use our tar_helper. Remove the extracted mess if requested.
| Ruby | apache-2.0 | SimpleFinance/java | ruby | ## Code Before:
require 'chef/resource/directory'
module ChefJava
module Instance
class Tar
def initialize(path, options, action)
@path = path
@options = options
@action = action
end
def install
end
def remove
end
private
# TODO: Memoize this.
def manage_directory
dir = Chef::Directory.new(@path)
dir.owner 'root'
dir.group 'root'
dir.mode 00755
dir.run_action(:create)
end
#
def tar
end
def decompress
end
end
end
end
## Instruction:
Make our instance helper to dealing with tar files use our tar_helper. Remove the extracted mess if requested.
## Code After:
require 'chef/resource/directory'
require_relative 'tar_helper'
module ChefJava
module Instance
class Tar
def initialize(options)
@options = options
end
def install
extract_tar
end
def remove
remove_extracted_tar
end
private
def archive
@options[:source]
end
def destination
@options[:destination]
end
#
def extract_tar
ChefJava::Helpers::Tar.new(archive, destination)
end
def remove_extracted_tar
FileUtils.rm_rf(destination)
end
end
end
end |
require 'chef/resource/directory'
+ require_relative 'tar_helper'
module ChefJava
module Instance
class Tar
- def initialize(path, options, action)
? ------ --------
+ def initialize(options)
- @path = path
@options = options
- @action = action
end
def install
+ extract_tar
end
def remove
+ remove_extracted_tar
end
private
+ def archive
+ @options[:source]
+ end
+
+ def destination
+ @options[:destination]
- # TODO: Memoize this.
- def manage_directory
- dir = Chef::Directory.new(@path)
- dir.owner 'root'
- dir.group 'root'
- dir.mode 00755
- dir.run_action(:create)
end
#
- def tar
+ def extract_tar
? ++++++++
+ ChefJava::Helpers::Tar.new(archive, destination)
end
- def decompress
+ def remove_extracted_tar
+ FileUtils.rm_rf(destination)
end
-
end
end
end | 27 | 0.675 | 14 | 13 |
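The Ruby helper above boils down to two operations: extract an archive into a destination directory on install, and `rm_rf` the destination on remove. A minimal Python sketch of the same pair using only the standard library (the file names here are illustrative stand-ins, not taken from the cookbook):

```python
import os
import shutil
import tarfile
import tempfile

workdir = tempfile.mkdtemp()
archive = os.path.join(workdir, "jdk.tar.gz")
destination = os.path.join(workdir, "extracted")

# Build a tiny archive to stand in for the real tarball.
payload = os.path.join(workdir, "release")
with open(payload, "w") as fh:
    fh.write("JAVA_VERSION=8\n")
with tarfile.open(archive, "w:gz") as tar:
    tar.add(payload, arcname="release")

# install: extract the archive into the destination directory.
with tarfile.open(archive, "r:gz") as tar:
    tar.extractall(destination)
extracted_ok = os.path.isfile(os.path.join(destination, "release"))

# remove: delete the extracted tree, like Ruby's FileUtils.rm_rf.
shutil.rmtree(destination)
```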
ee8f497ff6fc6480cd98b52d8b62ce13c91cfef5 | lib/silent-postgres.rb | lib/silent-postgres.rb | if Rails.env.development? || Rails.env.test?
require "silent-postgres/railtie"
require "silent-postgres/schema_plus"
module SilentPostgres
SILENCED_METHODS = %w(tables table_exists? indexes column_definitions pk_and_sequence_for last_insert_id)
def self.included(base)
SILENCED_METHODS.each do |m|
base.send :alias_method_chain, m, :silencer
end
end
SILENCED_METHODS.each do |m|
m1, m2 = if m =~ /^(.*)\?$/
[$1, '?']
else
[m, nil]
end
eval <<-METHOD
def #{m1}_with_silencer#{m2}(*args)
@logger.silence do
#{m1}_without_silencer#{m2}(*args)
end
end
METHOD
end
end
end
| if Rails.env.development? || Rails.env.test?
require "silent-postgres/railtie"
require "silent-postgres/schema_plus"
module SilentPostgres
SILENCED_METHODS = %w(tables table_exists? indexes column_definitions pk_and_sequence_for last_insert_id)
def self.included(base)
SILENCED_METHODS.each do |m|
base.send :alias_method, "#{m}_without_silencer", m
base.send :alias_method, m, "#{m}_with_silencer"
end
end
SILENCED_METHODS.each do |m|
m1, m2 = if m =~ /^(.*)\?$/
[$1, '?']
else
[m, nil]
end
eval <<-METHOD
def #{m1}_with_silencer#{m2}(*args)
@logger.silence do
#{m1}_without_silencer#{m2}(*args)
end
end
METHOD
end
end
end
| Use alias_method instead of deprecated alias_method_chain | Use alias_method instead of deprecated alias_method_chain
| Ruby | mit | dolzenko/silent-postgres | ruby | ## Code Before:
if Rails.env.development? || Rails.env.test?
require "silent-postgres/railtie"
require "silent-postgres/schema_plus"
module SilentPostgres
SILENCED_METHODS = %w(tables table_exists? indexes column_definitions pk_and_sequence_for last_insert_id)
def self.included(base)
SILENCED_METHODS.each do |m|
base.send :alias_method_chain, m, :silencer
end
end
SILENCED_METHODS.each do |m|
m1, m2 = if m =~ /^(.*)\?$/
[$1, '?']
else
[m, nil]
end
eval <<-METHOD
def #{m1}_with_silencer#{m2}(*args)
@logger.silence do
#{m1}_without_silencer#{m2}(*args)
end
end
METHOD
end
end
end
## Instruction:
Use alias_method instead of deprecated alias_method_chain
## Code After:
if Rails.env.development? || Rails.env.test?
require "silent-postgres/railtie"
require "silent-postgres/schema_plus"
module SilentPostgres
SILENCED_METHODS = %w(tables table_exists? indexes column_definitions pk_and_sequence_for last_insert_id)
def self.included(base)
SILENCED_METHODS.each do |m|
base.send :alias_method, "#{m}_without_silencer", m
base.send :alias_method, m, "#{m}_with_silencer"
end
end
SILENCED_METHODS.each do |m|
m1, m2 = if m =~ /^(.*)\?$/
[$1, '?']
else
[m, nil]
end
eval <<-METHOD
def #{m1}_with_silencer#{m2}(*args)
@logger.silence do
#{m1}_without_silencer#{m2}(*args)
end
end
METHOD
end
end
end
| if Rails.env.development? || Rails.env.test?
require "silent-postgres/railtie"
require "silent-postgres/schema_plus"
module SilentPostgres
SILENCED_METHODS = %w(tables table_exists? indexes column_definitions pk_and_sequence_for last_insert_id)
def self.included(base)
SILENCED_METHODS.each do |m|
+ base.send :alias_method, "#{m}_without_silencer", m
- base.send :alias_method_chain, m, :silencer
? ------ ^
+ base.send :alias_method, m, "#{m}_with_silencer"
? ^^^^^^^^^^^ +
end
end
SILENCED_METHODS.each do |m|
m1, m2 = if m =~ /^(.*)\?$/
[$1, '?']
else
[m, nil]
end
eval <<-METHOD
def #{m1}_with_silencer#{m2}(*args)
@logger.silence do
#{m1}_without_silencer#{m2}(*args)
end
end
METHOD
end
end
end
| 3 | 0.090909 | 2 | 1 |
f3a20bdc85358944a6c03e4701659f55d039c4b6 | coffee/cilantro/ui/concept/search.coffee | coffee/cilantro/ui/concept/search.coffee | define [
'../core'
'../search'
], (c, search)->
# Takes a collection of concepts to filter/get the concept model instances
# for rendering
class ConceptSearch extends search.Search
className: 'concept-search search'
events:
'input typeahead:selected': 'focusConcept'
options: ->
url = c.data.concepts.url()
return {
name: 'Concepts'
valueKey: 'name'
limit: 10
remote:
url: "#{ url }?query=%QUERY&brief=1"
filter: (resp) =>
datums = []
c._.each resp, (datum) =>
if @collection.get(datum.id)
datums.push(datum)
return datums
}
focusConcept: (event, datum) ->
c.publish c.CONCEPT_FOCUS, datum.id
{ ConceptSearch }
| define [
'../core'
'../search'
], (c, search)->
# Takes a collection of concepts to filter/get the concept model instances
# for rendering
class ConceptSearch extends search.Search
className: 'concept-search search'
events:
'typeahead:selected input': 'focusConcept'
'typeahead:autocompleted input': 'focusConcept'
options: ->
url = c.data.concepts.url()
return {
name: 'Concepts'
valueKey: 'name'
limit: 10
remote:
url: "#{ url }?query=%QUERY&brief=1"
filter: (resp) =>
datums = []
c._.each resp, (datum) =>
if @collection.get(datum.id)
datums.push(datum)
return datums
}
focusConcept: (event, datum) ->
c.publish c.CONCEPT_FOCUS, datum.id
{ ConceptSearch }
| Fix reversed events hash for ConceptSearch, add autocomplete handler | Fix reversed events hash for ConceptSearch, add autocomplete handler
| CoffeeScript | bsd-2-clause | chop-dbhi/cilantro,chop-dbhi/cilantro,chop-dbhi/cilantro | coffeescript | ## Code Before:
define [
'../core'
'../search'
], (c, search)->
# Takes a collection of concepts to filter/get the concept model instances
# for rendering
class ConceptSearch extends search.Search
className: 'concept-search search'
events:
'input typeahead:selected': 'focusConcept'
options: ->
url = c.data.concepts.url()
return {
name: 'Concepts'
valueKey: 'name'
limit: 10
remote:
url: "#{ url }?query=%QUERY&brief=1"
filter: (resp) =>
datums = []
c._.each resp, (datum) =>
if @collection.get(datum.id)
datums.push(datum)
return datums
}
focusConcept: (event, datum) ->
c.publish c.CONCEPT_FOCUS, datum.id
{ ConceptSearch }
## Instruction:
Fix reversed events hash for ConceptSearch, add autocomplete handler
## Code After:
define [
'../core'
'../search'
], (c, search)->
# Takes a collection of concepts to filter/get the concept model instances
# for rendering
class ConceptSearch extends search.Search
className: 'concept-search search'
events:
'typeahead:selected input': 'focusConcept'
'typeahead:autocompleted input': 'focusConcept'
options: ->
url = c.data.concepts.url()
return {
name: 'Concepts'
valueKey: 'name'
limit: 10
remote:
url: "#{ url }?query=%QUERY&brief=1"
filter: (resp) =>
datums = []
c._.each resp, (datum) =>
if @collection.get(datum.id)
datums.push(datum)
return datums
}
focusConcept: (event, datum) ->
c.publish c.CONCEPT_FOCUS, datum.id
{ ConceptSearch }
| define [
'../core'
'../search'
], (c, search)->
# Takes a collection of concepts to filter/get the concept model instances
# for rendering
class ConceptSearch extends search.Search
className: 'concept-search search'
events:
- 'input typeahead:selected': 'focusConcept'
? ------
+ 'typeahead:selected input': 'focusConcept'
? ++++++
+ 'typeahead:autocompleted input': 'focusConcept'
options: ->
url = c.data.concepts.url()
return {
name: 'Concepts'
valueKey: 'name'
limit: 10
remote:
url: "#{ url }?query=%QUERY&brief=1"
filter: (resp) =>
datums = []
c._.each resp, (datum) =>
if @collection.get(datum.id)
datums.push(datum)
return datums
}
focusConcept: (event, datum) ->
c.publish c.CONCEPT_FOCUS, datum.id
{ ConceptSearch } | 3 | 0.085714 | 2 | 1 |
41bde8c2fb193aaa8ebefe7d42b32fb0e12626e9 | test/CodeGen/code-coverage.c | test/CodeGen/code-coverage.c | // RUN: %clang -O0 -S -mno-red-zone -fprofile-arcs -ftest-coverage -emit-llvm %s -o - | FileCheck %s
// <rdar://problem/12843084>
int test1(int a) {
switch (a % 2) {
case 0:
++a;
case 1:
a /= 2;
}
return a;
}
// Check tha the `-mno-red-zone' flag is set here on the generated functions.
// CHECK: void @__llvm_gcov_indirect_counter_increment(i32* %{{.*}}, i64** %{{.*}}) unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_writeout() unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_init() unnamed_addr noinline noredzone
// CHECK: void @__gcov_flush() unnamed_addr noinline noredzone
| // RUN: %clang_cc1 -O0 -emit-llvm -disable-red-zone -femit-coverage-notes -femit-coverage-data %s -o - | FileCheck %s
// <rdar://problem/12843084>
int test1(int a) {
switch (a % 2) {
case 0:
++a;
case 1:
a /= 2;
}
return a;
}
// Check tha the `-mno-red-zone' flag is set here on the generated functions.
// CHECK: void @__llvm_gcov_indirect_counter_increment(i32* %{{.*}}, i64** %{{.*}}) unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_writeout() unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_init() unnamed_addr noinline noredzone
// CHECK: void @__gcov_flush() unnamed_addr noinline noredzone
| Use correct flags for this test. | Use correct flags for this test.
git-svn-id: ffe668792ed300d6c2daa1f6eba2e0aa28d7ec6c@169768 91177308-0d34-0410-b5e6-96231b3b80d8
| C | apache-2.0 | apple/swift-clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang | c | ## Code Before:
// RUN: %clang -O0 -S -mno-red-zone -fprofile-arcs -ftest-coverage -emit-llvm %s -o - | FileCheck %s
// <rdar://problem/12843084>
int test1(int a) {
switch (a % 2) {
case 0:
++a;
case 1:
a /= 2;
}
return a;
}
// Check tha the `-mno-red-zone' flag is set here on the generated functions.
// CHECK: void @__llvm_gcov_indirect_counter_increment(i32* %{{.*}}, i64** %{{.*}}) unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_writeout() unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_init() unnamed_addr noinline noredzone
// CHECK: void @__gcov_flush() unnamed_addr noinline noredzone
## Instruction:
Use correct flags for this test.
git-svn-id: ffe668792ed300d6c2daa1f6eba2e0aa28d7ec6c@169768 91177308-0d34-0410-b5e6-96231b3b80d8
## Code After:
// RUN: %clang_cc1 -O0 -emit-llvm -disable-red-zone -femit-coverage-notes -femit-coverage-data %s -o - | FileCheck %s
// <rdar://problem/12843084>
int test1(int a) {
switch (a % 2) {
case 0:
++a;
case 1:
a /= 2;
}
return a;
}
// Check tha the `-mno-red-zone' flag is set here on the generated functions.
// CHECK: void @__llvm_gcov_indirect_counter_increment(i32* %{{.*}}, i64** %{{.*}}) unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_writeout() unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_init() unnamed_addr noinline noredzone
// CHECK: void @__gcov_flush() unnamed_addr noinline noredzone
| - // RUN: %clang -O0 -S -mno-red-zone -fprofile-arcs -ftest-coverage -emit-llvm %s -o - | FileCheck %s
+ // RUN: %clang_cc1 -O0 -emit-llvm -disable-red-zone -femit-coverage-notes -femit-coverage-data %s -o - | FileCheck %s
+
// <rdar://problem/12843084>
int test1(int a) {
switch (a % 2) {
case 0:
++a;
case 1:
a /= 2;
}
return a;
}
// Check tha the `-mno-red-zone' flag is set here on the generated functions.
// CHECK: void @__llvm_gcov_indirect_counter_increment(i32* %{{.*}}, i64** %{{.*}}) unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_writeout() unnamed_addr noinline noredzone
// CHECK: void @__llvm_gcov_init() unnamed_addr noinline noredzone
// CHECK: void @__gcov_flush() unnamed_addr noinline noredzone | 3 | 0.157895 | 2 | 1 |
20ce89f34e42442367ace9c1b913a503aff53bad | config/genomes/mm10-resources.yaml | config/genomes/mm10-resources.yaml | version: 1
aliases:
snpeff: GRCm38
ensembl: mouse
rnaseq:
transcripts: ../rnaseq/ref-transcripts.gtf
transcripts_mask: ../rnaseq/ref-transcripts-mask.gtf
| version: 2
aliases:
snpeff: GRCm38
ensembl: mouse
variation:
dbsnp: ../variation/mm10-dbSNP-2013-09-12.vcf
rnaseq:
transcripts: ../rnaseq/ref-transcripts.gtf
transcripts_mask: ../rnaseq/ref-transcripts-mask.gtf
| Add new mm10 resource file with pointers to dbSNP resources, matching updated CloudBioLinux data. Bump mm10 resource version | Add new mm10 resource file with pointers to dbSNP resources, matching updated CloudBioLinux data. Bump mm10 resource version
| YAML | mit | biocyberman/bcbio-nextgen,lpantano/bcbio-nextgen,guillermo-carrasco/bcbio-nextgen,elkingtonmcb/bcbio-nextgen,verdurin/bcbio-nextgen,fw1121/bcbio-nextgen,Cyberbio-Lab/bcbio-nextgen,brainstorm/bcbio-nextgen,mjafin/bcbio-nextgen,verdurin/bcbio-nextgen,verdurin/bcbio-nextgen,gifford-lab/bcbio-nextgen,lbeltrame/bcbio-nextgen,SciLifeLab/bcbio-nextgen,guillermo-carrasco/bcbio-nextgen,SciLifeLab/bcbio-nextgen,guillermo-carrasco/bcbio-nextgen,chapmanb/bcbio-nextgen,hjanime/bcbio-nextgen,brainstorm/bcbio-nextgen,a113n/bcbio-nextgen,fw1121/bcbio-nextgen,SciLifeLab/bcbio-nextgen,Cyberbio-Lab/bcbio-nextgen,lbeltrame/bcbio-nextgen,gifford-lab/bcbio-nextgen,vladsaveliev/bcbio-nextgen,vladsaveliev/bcbio-nextgen,hjanime/bcbio-nextgen,lbeltrame/bcbio-nextgen,elkingtonmcb/bcbio-nextgen,lpantano/bcbio-nextgen,vladsaveliev/bcbio-nextgen,biocyberman/bcbio-nextgen,elkingtonmcb/bcbio-nextgen,biocyberman/bcbio-nextgen,Cyberbio-Lab/bcbio-nextgen,mjafin/bcbio-nextgen,chapmanb/bcbio-nextgen,brainstorm/bcbio-nextgen,chapmanb/bcbio-nextgen,lpantano/bcbio-nextgen,fw1121/bcbio-nextgen,gifford-lab/bcbio-nextgen,a113n/bcbio-nextgen,mjafin/bcbio-nextgen,a113n/bcbio-nextgen,hjanime/bcbio-nextgen | yaml | ## Code Before:
version: 1
aliases:
snpeff: GRCm38
ensembl: mouse
rnaseq:
transcripts: ../rnaseq/ref-transcripts.gtf
transcripts_mask: ../rnaseq/ref-transcripts-mask.gtf
## Instruction:
Add new mm10 resource file with pointers to dbSNP resources, matching updated CloudBioLinux data. Bump mm10 resource version
## Code After:
version: 2
aliases:
snpeff: GRCm38
ensembl: mouse
variation:
dbsnp: ../variation/mm10-dbSNP-2013-09-12.vcf
rnaseq:
transcripts: ../rnaseq/ref-transcripts.gtf
transcripts_mask: ../rnaseq/ref-transcripts-mask.gtf
| - version: 1
? ^
+ version: 2
? ^
aliases:
snpeff: GRCm38
ensembl: mouse
-
+
+ variation:
+ dbsnp: ../variation/mm10-dbSNP-2013-09-12.vcf
+
rnaseq:
transcripts: ../rnaseq/ref-transcripts.gtf
transcripts_mask: ../rnaseq/ref-transcripts-mask.gtf | 7 | 0.777778 | 5 | 2 |
e5c002c5135bcce5176bd82317833d45ecee2591 | docker-compose.yml | docker-compose.yml | version: "3"
services:
build:
image: node:alpine
working_dir: /home/node/app
environment:
- NODE_ENV=production
volumes:
- ./:/home/node/app
command: "yarn && yarn build:production"
| version: "3"
services:
build:
image: node:alpine
working_dir: /home/node/app
environment:
- NODE_ENV=production
volumes:
- ./:/home/node/app
command: "rm -rf /home/node/app/public && yarn && yarn build:production"
| Clean build directory when deploying. | Clean build directory when deploying.
| YAML | mit | jsonnull/jsonnull.com,jsonnull/jsonnull.com,jsonnull/jsonnull.com | yaml | ## Code Before:
version: "3"
services:
build:
image: node:alpine
working_dir: /home/node/app
environment:
- NODE_ENV=production
volumes:
- ./:/home/node/app
command: "yarn && yarn build:production"
## Instruction:
Clean build directory when deploying.
## Code After:
version: "3"
services:
build:
image: node:alpine
working_dir: /home/node/app
environment:
- NODE_ENV=production
volumes:
- ./:/home/node/app
command: "rm -rf /home/node/app/public && yarn && yarn build:production"
| version: "3"
services:
build:
image: node:alpine
working_dir: /home/node/app
environment:
- NODE_ENV=production
volumes:
- ./:/home/node/app
- command: "yarn && yarn build:production"
+ command: "rm -rf /home/node/app/public && yarn && yarn build:production" | 2 | 0.2 | 1 | 1 |
1f461d08913f66d625d927f5805f454648b3703a | README.md | README.md | <img src="logo.png" height="160">
> A simple chatbot for Team 1418's Discord server.
## Dependencies
* [Python 3](https://www.python.org/downloads)
* Python packages:
pip3 install -r requirements.txt
## Running
1. Rename `config.ini.example` to `config.ini` and write a valid bot token to `token.txt`.
2. While in `VictiBot` directory, run:
python3 bot.py
## Contributors
* [Erik Boesen](https://github.com/ErikBoesen)
* [Adrian Hall](https://github.com/aderhall)
## License
This software is protected under the [MIT License](LICENSE).
| <img src="logo.png" height="160">
> A simple chatbot for Team 1418's Discord server.
## Dependencies
* [Python 3](https://www.python.org/downloads)
* Python packages:
pip3 install -r requirements.txt
## Running
1. Rename `config.ini.example` to `config.ini` and write a valid bot token to that file.
2. While in `VictiBot` directory, run:
python3 bot.py
## Contributors
* [Erik Boesen](https://github.com/ErikBoesen)
* [Adrian Hall](https://github.com/aderhall)
## License
This software is protected under the [MIT License](LICENSE).
| Fix incorrect token file reference | Fix incorrect token file reference
| Markdown | mit | frc1418/victibot | markdown | ## Code Before:
<img src="logo.png" height="160">
> A simple chatbot for Team 1418's Discord server.
## Dependencies
* [Python 3](https://www.python.org/downloads)
* Python packages:
pip3 install -r requirements.txt
## Running
1. Rename `config.ini.example` to `config.ini` and write a valid bot token to `token.txt`.
2. While in `VictiBot` directory, run:
python3 bot.py
## Contributors
* [Erik Boesen](https://github.com/ErikBoesen)
* [Adrian Hall](https://github.com/aderhall)
## License
This software is protected under the [MIT License](LICENSE).
## Instruction:
Fix incorrect token file reference
## Code After:
<img src="logo.png" height="160">
> A simple chatbot for Team 1418's Discord server.
## Dependencies
* [Python 3](https://www.python.org/downloads)
* Python packages:
pip3 install -r requirements.txt
## Running
1. Rename `config.ini.example` to `config.ini` and write a valid bot token to that file.
2. While in `VictiBot` directory, run:
python3 bot.py
## Contributors
* [Erik Boesen](https://github.com/ErikBoesen)
* [Adrian Hall](https://github.com/aderhall)
## License
This software is protected under the [MIT License](LICENSE).
| <img src="logo.png" height="160">
> A simple chatbot for Team 1418's Discord server.
## Dependencies
* [Python 3](https://www.python.org/downloads)
* Python packages:
pip3 install -r requirements.txt
## Running
- 1. Rename `config.ini.example` to `config.ini` and write a valid bot token to `token.txt`.
? - ^^ - -----
+ 1. Rename `config.ini.example` to `config.ini` and write a valid bot token to that file.
? ^^^^^^^
2. While in `VictiBot` directory, run:
python3 bot.py
## Contributors
* [Erik Boesen](https://github.com/ErikBoesen)
* [Adrian Hall](https://github.com/aderhall)
## License
This software is protected under the [MIT License](LICENSE). | 2 | 0.090909 | 1 | 1 |
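The README change above moves the bot token out of `token.txt` and into `config.ini`. A minimal sketch of reading such a token with the stdlib `configparser` (the `[bot]` section and `token` key names are assumptions for illustration, not taken from the project):

```python
import configparser
import io

# Stand-in for config.ini; the real file's layout may differ.
example = """\
[bot]
token = abc123-example-token
"""

config = configparser.ConfigParser()
config.read_file(io.StringIO(example))

# Look up the token the way bot.py might after the rename.
token = config["bot"]["token"]
```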
6cae4a8a17849effcfb388a38616c023c488fe5c | index.html | index.html | <head></head>
<title></title>
<body>
<p></p>
</body>
| <head>Master Head</head>
<title></title>
<body>
<p></p>
</body>
| Add head title to master | Add head title to master
| HTML | mit | gerryderry/first-test-repo | html | ## Code Before:
<head></head>
<title></title>
<body>
<p></p>
</body>
## Instruction:
Add head title to master
## Code After:
<head>Master Head</head>
<title></title>
<body>
<p></p>
</body>
| - <head></head>
+ <head>Master Head</head>
<title></title>
<body>
<p></p>
</body> | 2 | 0.4 | 1 | 1 |
35a4f4586706136f23fd6764eb0031ed4d5922bd | CONTRIBUTING.md | CONTRIBUTING.md |
We'd love to accept your patches and contributions to this project. There are
a just a few small guidelines you need to follow.
## Contributor License Agreement ##
Contributions to any Google project must be accompanied by a Contributor
License Agreement. This is not a copyright **assignment**, it simply gives
Google permission to use and redistribute your contributions as part of the
project.
* If you are an individual writing original source code and you're sure you
own the intellectual property, then you'll need to sign an [individual
CLA][].
* If you work for a company that wants to allow you to contribute your work,
then you'll need to sign a [corporate CLA][].
You generally only need to submit a CLA once, so if you've already submitted
one (even if it was for a different project), you probably don't need to do it
again.
[individual CLA]: https://developers.google.com/open-source/cla/individual
[corporate CLA]: https://developers.google.com/open-source/cla/corporate
## Submitting a patch ##
1. It's generally best to start by opening a new issue describing the bug or
feature you're intending to fix. Even if you think it's relatively minor,
it's helpful to know what people are working on. Mention in the initial
issue that you are planning to work on that bug or feature so that it can
be assigned to you.
1. Follow the normal process of [forking][] the project, and setup a new
branch to work in. It's important that each group of changes be done in
separate branches in order to ensure that a pull request only includes the
commits related to that bug or feature.
1. Any significant changes should almost always be accompanied by tests. The
project already has good test coverage, so look at some of the existing
tests if you're unsure how to go about it.
1. All contributions must be licensed Apache 2.0 and all files must have
a copy of the boilerplate licence comment (can be copied from an existing
file. Files should be formatted according to Google's [java style guide][].
1. Do your best to have [well-formed commit messages][] for each change.
This provides consistency throughout the project, and ensures that commit
messages are able to be formatted properly by various git tools.
1. Finally, push the commits to your fork and submit a [pull request][].
[forking]: https://help.github.com/articles/fork-a-repo
[java style guide]: http://google-styleguide.googlecode.com/svn/trunk/javaguide.html
[well-formed commit messages]: http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html
[pull request]: https://help.github.com/articles/creating-a-pull-request
|
This is a patched build of https://github.com/google/guice/ which includes selected contributions that have not yet been merged into the upstream build.
While we will consider pull-requests (Apache 2.0 licensed) it is better to send contributions directly to the upstream project. You can then open an issue here if you would like specific upstream contributions added to sisu-guice and made available in a release outside of the current google/guice schedule.
| Update to clarify sisu-guice contribution model | Update to clarify sisu-guice contribution model
| Markdown | apache-2.0 | sonatype/sisu-guice,sonatype/sisu-guice | markdown | ## Code Before:
We'd love to accept your patches and contributions to this project. There are
a just a few small guidelines you need to follow.
## Contributor License Agreement ##
Contributions to any Google project must be accompanied by a Contributor
License Agreement. This is not a copyright **assignment**, it simply gives
Google permission to use and redistribute your contributions as part of the
project.
* If you are an individual writing original source code and you're sure you
own the intellectual property, then you'll need to sign an [individual
CLA][].
* If you work for a company that wants to allow you to contribute your work,
then you'll need to sign a [corporate CLA][].
You generally only need to submit a CLA once, so if you've already submitted
one (even if it was for a different project), you probably don't need to do it
again.
[individual CLA]: https://developers.google.com/open-source/cla/individual
[corporate CLA]: https://developers.google.com/open-source/cla/corporate
## Submitting a patch ##
1. It's generally best to start by opening a new issue describing the bug or
feature you're intending to fix. Even if you think it's relatively minor,
it's helpful to know what people are working on. Mention in the initial
issue that you are planning to work on that bug or feature so that it can
be assigned to you.
1. Follow the normal process of [forking][] the project, and setup a new
branch to work in. It's important that each group of changes be done in
separate branches in order to ensure that a pull request only includes the
commits related to that bug or feature.
1. Any significant changes should almost always be accompanied by tests. The
project already has good test coverage, so look at some of the existing
tests if you're unsure how to go about it.
1. All contributions must be licensed Apache 2.0 and all files must have
a copy of the boilerplate licence comment (can be copied from an existing
file. Files should be formatted according to Google's [java style guide][].
1. Do your best to have [well-formed commit messages][] for each change.
This provides consistency throughout the project, and ensures that commit
messages are able to be formatted properly by various git tools.
1. Finally, push the commits to your fork and submit a [pull request][].
[forking]: https://help.github.com/articles/fork-a-repo
[java style guide]: http://google-styleguide.googlecode.com/svn/trunk/javaguide.html
[well-formed commit messages]: http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html
[pull request]: https://help.github.com/articles/creating-a-pull-request
## Instruction:
Update to clarify sisu-guice contribution model
## Code After:
This is a patched build of https://github.com/google/guice/ which includes selected contributions that have not yet been merged into the upstream build.
While we will consider pull-requests (Apache 2.0 licensed) it is better to send contributions directly to the upstream project. You can then open an issue here if you would like specific upstream contributions added to sisu-guice and made available in a release outside of the current google/guice schedule.
|
+ This is a patched build of https://github.com/google/guice/ which includes selected contributions that have not yet been merged into the upstream build.
- We'd love to accept your patches and contributions to this project. There are
- a just a few small guidelines you need to follow.
+ While we will consider pull-requests (Apache 2.0 licensed) it is better to send contributions directly to the upstream project. You can then open an issue here if you would like specific upstream contributions added to sisu-guice and made available in a release outside of the current google/guice schedule.
- ## Contributor License Agreement ##
-
- Contributions to any Google project must be accompanied by a Contributor
- License Agreement. This is not a copyright **assignment**, it simply gives
- Google permission to use and redistribute your contributions as part of the
- project.
-
- * If you are an individual writing original source code and you're sure you
- own the intellectual property, then you'll need to sign an [individual
- CLA][].
-
- * If you work for a company that wants to allow you to contribute your work,
- then you'll need to sign a [corporate CLA][].
-
- You generally only need to submit a CLA once, so if you've already submitted
- one (even if it was for a different project), you probably don't need to do it
- again.
-
- [individual CLA]: https://developers.google.com/open-source/cla/individual
- [corporate CLA]: https://developers.google.com/open-source/cla/corporate
-
-
- ## Submitting a patch ##
-
- 1. It's generally best to start by opening a new issue describing the bug or
- feature you're intending to fix. Even if you think it's relatively minor,
- it's helpful to know what people are working on. Mention in the initial
- issue that you are planning to work on that bug or feature so that it can
- be assigned to you.
-
- 1. Follow the normal process of [forking][] the project, and setup a new
- branch to work in. It's important that each group of changes be done in
- separate branches in order to ensure that a pull request only includes the
- commits related to that bug or feature.
-
- 1. Any significant changes should almost always be accompanied by tests. The
- project already has good test coverage, so look at some of the existing
- tests if you're unsure how to go about it.
-
- 1. All contributions must be licensed Apache 2.0 and all files must have
- a copy of the boilerplate licence comment (can be copied from an existing
- file. Files should be formatted according to Google's [java style guide][].
-
- 1. Do your best to have [well-formed commit messages][] for each change.
- This provides consistency throughout the project, and ensures that commit
- messages are able to be formatted properly by various git tools.
-
- 1. Finally, push the commits to your fork and submit a [pull request][].
-
- [forking]: https://help.github.com/articles/fork-a-repo
- [java style guide]: http://google-styleguide.googlecode.com/svn/trunk/javaguide.html
- [well-formed commit messages]: http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html
- [pull request]: https://help.github.com/articles/creating-a-pull-request
-
- | 59 | 0.983333 | 2 | 57 |
d55389580160c4585c131537c04c4045a38ea134 | fluxghost/http_server_base.py | fluxghost/http_server_base.py |
from select import select
import logging
import socket
logger = logging.getLogger("HTTPServer")
from fluxghost.http_handlers.websocket_handler import WebSocketHandler
from fluxghost.http_handlers.file_handler import FileHandler
class HttpServerBase(object):
def __init__(self, assets_path, address, backlog=10):
self.assets_handler = FileHandler(assets_path)
self.ws_handler = WebSocketHandler()
self.sock = s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(address)
s.listen(backlog)
logger.info("Listen HTTP on %s:%s" % address)
def serve_forever(self):
self.running = True
args = ((self.sock, ), (), (), 30.)
while self.running:
try:
rl = select(*args)[0]
if rl:
self.on_accept()
except InterruptedError:
pass
|
from select import select
import logging
import socket
logger = logging.getLogger("HTTPServer")
from fluxghost.http_handlers.websocket_handler import WebSocketHandler
from fluxghost.http_handlers.file_handler import FileHandler
class HttpServerBase(object):
def __init__(self, assets_path, address, backlog=10):
self.assets_handler = FileHandler(assets_path)
self.ws_handler = WebSocketHandler()
self.sock = s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(address)
s.listen(backlog)
if address[1] == 0:
from sys import stdout
address = s.getsockname()
stdout.write("LISTEN ON %i\n" % address[1])
stdout.flush()
logger.info("Listen HTTP on %s:%s" % address)
def serve_forever(self):
self.running = True
args = ((self.sock, ), (), (), 30.)
while self.running:
try:
rl = select(*args)[0]
if rl:
self.on_accept()
except InterruptedError:
pass
| Add auto select port function | Add auto select port function
| Python | agpl-3.0 | flux3dp/fluxghost,flux3dp/fluxghost,flux3dp/fluxghost,flux3dp/fluxghost | python | ## Code Before:
from select import select
import logging
import socket
logger = logging.getLogger("HTTPServer")
from fluxghost.http_handlers.websocket_handler import WebSocketHandler
from fluxghost.http_handlers.file_handler import FileHandler
class HttpServerBase(object):
def __init__(self, assets_path, address, backlog=10):
self.assets_handler = FileHandler(assets_path)
self.ws_handler = WebSocketHandler()
self.sock = s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(address)
s.listen(backlog)
logger.info("Listen HTTP on %s:%s" % address)
def serve_forever(self):
self.running = True
args = ((self.sock, ), (), (), 30.)
while self.running:
try:
rl = select(*args)[0]
if rl:
self.on_accept()
except InterruptedError:
pass
## Instruction:
Add auto select port function
## Code After:
from select import select
import logging
import socket
logger = logging.getLogger("HTTPServer")
from fluxghost.http_handlers.websocket_handler import WebSocketHandler
from fluxghost.http_handlers.file_handler import FileHandler
class HttpServerBase(object):
def __init__(self, assets_path, address, backlog=10):
self.assets_handler = FileHandler(assets_path)
self.ws_handler = WebSocketHandler()
self.sock = s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(address)
s.listen(backlog)
if address[1] == 0:
from sys import stdout
address = s.getsockname()
stdout.write("LISTEN ON %i\n" % address[1])
stdout.flush()
logger.info("Listen HTTP on %s:%s" % address)
def serve_forever(self):
self.running = True
args = ((self.sock, ), (), (), 30.)
while self.running:
try:
rl = select(*args)[0]
if rl:
self.on_accept()
except InterruptedError:
pass
|
from select import select
import logging
import socket
logger = logging.getLogger("HTTPServer")
from fluxghost.http_handlers.websocket_handler import WebSocketHandler
from fluxghost.http_handlers.file_handler import FileHandler
class HttpServerBase(object):
def __init__(self, assets_path, address, backlog=10):
self.assets_handler = FileHandler(assets_path)
self.ws_handler = WebSocketHandler()
self.sock = s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(address)
s.listen(backlog)
+
+ if address[1] == 0:
+ from sys import stdout
+ address = s.getsockname()
+ stdout.write("LISTEN ON %i\n" % address[1])
+ stdout.flush()
+
logger.info("Listen HTTP on %s:%s" % address)
def serve_forever(self):
self.running = True
args = ((self.sock, ), (), (), 30.)
while self.running:
try:
rl = select(*args)[0]
if rl:
self.on_accept()
except InterruptedError:
pass | 7 | 0.205882 | 7 | 0 |
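
The patch in the row above binds the listening socket to port 0 so the OS assigns a free ephemeral port, then reads the real port back with `getsockname()` and reports it on stdout. A minimal self-contained sketch of that technique (standard library only; the function name is ours):

```python
import socket

def listen_on_free_port(host="127.0.0.1", backlog=10):
    # Port 0 asks the kernel to pick any free ephemeral port.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind((host, 0))
    s.listen(backlog)
    # getsockname() returns the address actually bound, including the real port.
    return s, s.getsockname()[1]

sock, port = listen_on_free_port()
print("LISTEN ON %i" % port)  # mirrors the stdout handshake added by the patch
sock.close()
```

A parent process can parse that stdout line to learn which port the child chose, which is what the `address[1] == 0` branch in the patch enables.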
ebd1f6589ef890f67ce8e6bd99297b524225d698 | test/regression/NullPointerTest.java | test/regression/NullPointerTest.java | /*
 * test that caught null pointer exceptions in finalizers work correctly
* and that local variables are accessible in null pointer exception handlers.
*/
import java.io.*;
public class NullPointerTest {
static String s;
public static void main(String[] args) {
System.out.println(tryfinally() + s);
}
public static String tryfinally() {
String yuck = null;
String local_s = null;
try {
return "This is ";
} finally {
try {
local_s = "Perfect";
/* trigger null pointer exception */
String x = yuck.toLowerCase();
} catch (Exception _) {
/*
* when the null pointer exception is caught, we must still
* be able to access local_s.
* Our return address for the finally clause must also still
* be intact.
*/
s = local_s;
}
}
}
}
/* Expected Output:
This is Perfect
*/
| /*
 * test that caught null pointer exceptions in finalizers work correctly
* and that local variables are accessible in null pointer exception handlers.
*/
import java.io.*;
public class NullPointerTest {
static String s;
public static void main(String[] args) {
System.out.println(tryfinally() + s);
try {
// Note: String.concat("") will not dereference 'this'
((String)null).concat("");
System.out.println("FAILED!");
} catch (NullPointerException e) {
System.out.println("This is good too");
}
}
public static String tryfinally() {
String yuck = null;
String local_s = null;
try {
return "This is ";
} finally {
try {
local_s = "Perfect";
/* trigger null pointer exception */
String x = yuck.toLowerCase();
} catch (Exception _) {
/*
* when the null pointer exception is caught, we must still
* be able to access local_s.
* Our return address for the finally clause must also still
* be intact.
*/
s = local_s;
}
}
}
}
/* Expected Output:
This is Perfect
This is good too
*/
| Check that kaffe doesn't allow invocation of a null pointer. | Check that kaffe doesn't allow invocation of a null pointer.
| Java | lgpl-2.1 | kaffe/kaffe,kaffe/kaffe,kaffe/kaffe,kaffe/kaffe,kaffe/kaffe | java | ## Code Before:
/*
 * test that caught null pointer exceptions in finalizers work correctly
* and that local variables are accessible in null pointer exception handlers.
*/
import java.io.*;
public class NullPointerTest {
static String s;
public static void main(String[] args) {
System.out.println(tryfinally() + s);
}
public static String tryfinally() {
String yuck = null;
String local_s = null;
try {
return "This is ";
} finally {
try {
local_s = "Perfect";
/* trigger null pointer exception */
String x = yuck.toLowerCase();
} catch (Exception _) {
/*
* when the null pointer exception is caught, we must still
* be able to access local_s.
* Our return address for the finally clause must also still
* be intact.
*/
s = local_s;
}
}
}
}
/* Expected Output:
This is Perfect
*/
## Instruction:
Check that kaffe doesn't allow invocation of a null pointer.
## Code After:
/*
 * test that caught null pointer exceptions in finalizers work correctly
* and that local variables are accessible in null pointer exception handlers.
*/
import java.io.*;
public class NullPointerTest {
static String s;
public static void main(String[] args) {
System.out.println(tryfinally() + s);
try {
// Note: String.concat("") will not dereference 'this'
((String)null).concat("");
System.out.println("FAILED!");
} catch (NullPointerException e) {
System.out.println("This is good too");
}
}
public static String tryfinally() {
String yuck = null;
String local_s = null;
try {
return "This is ";
} finally {
try {
local_s = "Perfect";
/* trigger null pointer exception */
String x = yuck.toLowerCase();
} catch (Exception _) {
/*
* when the null pointer exception is caught, we must still
* be able to access local_s.
* Our return address for the finally clause must also still
* be intact.
*/
s = local_s;
}
}
}
}
/* Expected Output:
This is Perfect
This is good too
*/
| /*
 * test that caught null pointer exceptions in finalizers work correctly
* and that local variables are accessible in null pointer exception handlers.
*/
import java.io.*;
public class NullPointerTest {
static String s;
public static void main(String[] args) {
System.out.println(tryfinally() + s);
+ try {
+ // Note: String.concat("") will not dereference 'this'
+ ((String)null).concat("");
+ System.out.println("FAILED!");
+ } catch (NullPointerException e) {
+ System.out.println("This is good too");
+ }
}
public static String tryfinally() {
String yuck = null;
String local_s = null;
try {
return "This is ";
} finally {
try {
local_s = "Perfect";
/* trigger null pointer exception */
String x = yuck.toLowerCase();
} catch (Exception _) {
/*
* when the null pointer exception is caught, we must still
* be able to access local_s.
* Our return address for the finally clause must also still
* be intact.
*/
s = local_s;
}
}
}
}
/* Expected Output:
This is Perfect
+ This is good too
*/ | 8 | 0.190476 | 8 | 0 |
512241b8c436304ce677cd6771f236590d2720aa | .github/workflows/make_windows_python.yml | .github/workflows/make_windows_python.yml | name: Make Windows Python
on: [push, pull_request]
jobs:
  # Building using the github runner environment directly.
make:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- uses: ilammy/msvc-dev-cmd@v1
- uses: actions/setup-python@v2
with:
python-version: '3.9'
- name: Install python3
run: python3 -m pip install --user mypy-protobuf absl-py
- name: Clean sh.exe
run: |
where sh
rm 'C:\Program Files\Git\usr\bin\sh.exe'
rm 'C:\Program Files\Git\bin\sh.exe'
shell: bash
- name: Check make
run: tools\make --version
- name: Check system
run: tools\make detect_port
- name: Check Python
run: tools\make detect_python
- name: Build third party
run: tools\make third_party -j4
- name: Build C++
run: tools\make cc -j4
- name: Build Python
run: tools\make python -j4
- name: Test Python
run: tools\make test_python -j4
- name: Create wheel package
run: tools\make package_python
| name: Make Windows Python
on: [push, pull_request]
jobs:
  # Building using the github runner environment directly.
make:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- uses: ilammy/msvc-dev-cmd@v1
- uses: actions/setup-python@v2
with:
python-version: '3.9'
- name: Install python3
run: python3 -m pip install --user mypy-protobuf absl-py setuptools wheel
- name: Clean sh.exe
run: |
where sh
rm 'C:\Program Files\Git\usr\bin\sh.exe'
rm 'C:\Program Files\Git\bin\sh.exe'
shell: bash
- name: Check make
run: tools\make --version
- name: Check system
run: tools\make detect_port
- name: Check Python
run: tools\make detect_python
- name: Build third party
run: tools\make third_party -j4
- name: Build C++
run: tools\make cc -j4
- name: Build Python
run: tools\make python -j4
- name: Test Python
run: tools\make test_python -j4
- name: Create wheel package
run: tools\make package_python
| Fix windows python missing deps | ci(make): Fix windows python missing deps
| YAML | apache-2.0 | or-tools/or-tools,google/or-tools,google/or-tools,or-tools/or-tools,or-tools/or-tools,or-tools/or-tools,or-tools/or-tools,google/or-tools,google/or-tools,or-tools/or-tools,google/or-tools,google/or-tools | yaml | ## Code Before:
name: Make Windows Python
on: [push, pull_request]
jobs:
  # Building using the github runner environment directly.
make:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- uses: ilammy/msvc-dev-cmd@v1
- uses: actions/setup-python@v2
with:
python-version: '3.9'
- name: Install python3
run: python3 -m pip install --user mypy-protobuf absl-py
- name: Clean sh.exe
run: |
where sh
rm 'C:\Program Files\Git\usr\bin\sh.exe'
rm 'C:\Program Files\Git\bin\sh.exe'
shell: bash
- name: Check make
run: tools\make --version
- name: Check system
run: tools\make detect_port
- name: Check Python
run: tools\make detect_python
- name: Build third party
run: tools\make third_party -j4
- name: Build C++
run: tools\make cc -j4
- name: Build Python
run: tools\make python -j4
- name: Test Python
run: tools\make test_python -j4
- name: Create wheel package
run: tools\make package_python
## Instruction:
ci(make): Fix windows python missing deps
## Code After:
name: Make Windows Python
on: [push, pull_request]
jobs:
  # Building using the github runner environment directly.
make:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- uses: ilammy/msvc-dev-cmd@v1
- uses: actions/setup-python@v2
with:
python-version: '3.9'
- name: Install python3
run: python3 -m pip install --user mypy-protobuf absl-py setuptools wheel
- name: Clean sh.exe
run: |
where sh
rm 'C:\Program Files\Git\usr\bin\sh.exe'
rm 'C:\Program Files\Git\bin\sh.exe'
shell: bash
- name: Check make
run: tools\make --version
- name: Check system
run: tools\make detect_port
- name: Check Python
run: tools\make detect_python
- name: Build third party
run: tools\make third_party -j4
- name: Build C++
run: tools\make cc -j4
- name: Build Python
run: tools\make python -j4
- name: Test Python
run: tools\make test_python -j4
- name: Create wheel package
run: tools\make package_python
| name: Make Windows Python
on: [push, pull_request]
jobs:
  # Building using the github runner environment directly.
make:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- uses: ilammy/msvc-dev-cmd@v1
- uses: actions/setup-python@v2
with:
python-version: '3.9'
- name: Install python3
- run: python3 -m pip install --user mypy-protobuf absl-py
+ run: python3 -m pip install --user mypy-protobuf absl-py setuptools wheel
? +++++++++++++++++
- name: Clean sh.exe
run: |
where sh
rm 'C:\Program Files\Git\usr\bin\sh.exe'
rm 'C:\Program Files\Git\bin\sh.exe'
shell: bash
- name: Check make
run: tools\make --version
- name: Check system
run: tools\make detect_port
- name: Check Python
run: tools\make detect_python
- name: Build third party
run: tools\make third_party -j4
- name: Build C++
run: tools\make cc -j4
- name: Build Python
run: tools\make python -j4
- name: Test Python
run: tools\make test_python -j4
- name: Create wheel package
run: tools\make package_python | 2 | 0.052632 | 1 | 1 |
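
The `? +++++++++++++++++` row inside the diff above looks like a guide line from Python's `difflib.ndiff`, which marks the characters added to the adjacent `+` line (one `+` per added character, here the tail ` setuptools wheel`). A short sketch that reproduces one, assuming difflib-style diffing was used:

```python
import difflib

old = "run: python3 -m pip install --user mypy-protobuf absl-py"
new = "run: python3 -m pip install --user mypy-protobuf absl-py setuptools wheel"

# ndiff emits '- old', '+ new', then a '? ' guide line with '+' markers
# under the characters that were added.
delta = list(difflib.ndiff([old], [new]))
for line in delta:
    # guide lines carry their own trailing newline, so strip before printing
    print(line.rstrip("\n"))
```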
f0c93a34ffeae4cfd8817b02a47925cc58085e60 | README.md | README.md | A bot to move comments from YouTube to a slack channel.
| A bot to move comments from YouTube to a slack channel.
# Project Goals
[ ] Get YouTube Comments from YouTube
[ ] Put those comments in Slack
[ ] Ensure they are linked correctly
[ ] Store Comments in a Database
[ ] Make sure to not duplicate messages
| Add Project Goals to Readme | Add Project Goals to Readme
| Markdown | mit | TheWeirdlings/youtube_slackbot | markdown | ## Code Before:
A bot to move comments from YouTube to a slack channel.
## Instruction:
Add Project Goals to Readme
## Code After:
A bot to move comments from YouTube to a slack channel.
# Project Goals
[ ] Get YouTube Comments from YouTube
[ ] Put those comments in Slack
[ ] Ensure they are linked correctly
[ ] Store Comments in a Database
[ ] Make sure to not duplicate messages
| A bot to move comments from YouTube to a slack channel.
+
+ # Project Goals
+ [ ] Get YouTube Comments from YouTube
+ [ ] Put those comments in Slack
+ [ ] Ensure they are linked correctly
+ [ ] Store Comments in a Database
+ [ ] Make sure to not duplicate messages
+ | 8 | 8 | 8 | 0 |
6de56afc847b92eeb9387999629aec8604db6746 | requirements.txt | requirements.txt | boto3
google-api-python-client
psycopg2cffi-compat
six
tablib
xlsxwriter
tabulate
| boto3
google-api-python-client
psycopg2cffi-compat
six
tablib
xlsxwriter
tabulate
oauth2client
| Add oauth2client to reqs since we need it for BQ | Add oauth2client to reqs since we need it for BQ
| Text | apache-2.0 | Parsely/parsely_raw_data,Parsely/parsely_raw_data | text | ## Code Before:
boto3
google-api-python-client
psycopg2cffi-compat
six
tablib
xlsxwriter
tabulate
## Instruction:
Add oauth2client to reqs since we need it for BQ
## Code After:
boto3
google-api-python-client
psycopg2cffi-compat
six
tablib
xlsxwriter
tabulate
oauth2client
| boto3
google-api-python-client
psycopg2cffi-compat
six
tablib
xlsxwriter
tabulate
+ oauth2client | 1 | 0.142857 | 1 | 0 |
717f0c4e6853708e1968c577c72f5fa89152b119 | 3DSView/src/main/java/eu/livotov/labs/android/d3s/D3SRegexUtils.java | 3DSView/src/main/java/eu/livotov/labs/android/d3s/D3SRegexUtils.java | package eu.livotov.labs.android.d3s;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import static java.util.regex.Pattern.CASE_INSENSITIVE;
import static java.util.regex.Pattern.DOTALL;
import static java.util.regex.Pattern.compile;
/**
* Utilities to find 3DS values in ACS webpages.
*/
final class D3SRegexUtils {
/**
* Pattern to find the value of an attribute named value from an html tag with an attribute named name and a value of PaRes.
*/
private static final Pattern paresFinder = compile("<input(?=.+?name=\"PaRes\")(?=.+?value=\"(\\S+?)\").+>", DOTALL | CASE_INSENSITIVE);
/**
* Finds the PaRes in an html page.
*
* @param html String representation of the html page to search within.
* @return PaRes or null if not found
*/
@Nullable
static String findPaRes(@NonNull String html) {
if (html.trim().isEmpty()) return null;
String paRes = null;
Matcher paresMatcher = paresFinder.matcher(html);
if (paresMatcher.find()) {
paRes = paresMatcher.group(1);
}
return paRes;
}
} | package eu.livotov.labs.android.d3s;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import static java.util.regex.Pattern.CASE_INSENSITIVE;
import static java.util.regex.Pattern.DOTALL;
import static java.util.regex.Pattern.compile;
/**
* Utilities to find 3DS values in ACS webpages.
*/
final class D3SRegexUtils {
/**
* Pattern to find the value of an attribute named value from an html tag with an attribute named name and a value of PaRes.
*/
private static final Pattern paresFinder = compile("<input(?=.+?name=\"PaRes\")(?=.+?value=\"(\\S+?)\").+>", DOTALL | CASE_INSENSITIVE);
/**
* Finds the PaRes in an html page.
* <p>
* Note: If more than one PaRes is found in a page only the first will be returned.
*
* @param html String representation of the html page to search within.
* @return PaRes or null if not found
*/
@Nullable
static String findPaRes(@NonNull String html) {
if (html.trim().isEmpty()) return null;
String paRes = null;
Matcher paresMatcher = paresFinder.matcher(html);
if (paresMatcher.find()) {
paRes = paresMatcher.group(1);
}
return paRes;
}
} | Add additional comment about only matching first value | Add additional comment about only matching first value
| Java | apache-2.0 | LivotovLabs/3DSView | java | ## Code Before:
package eu.livotov.labs.android.d3s;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import static java.util.regex.Pattern.CASE_INSENSITIVE;
import static java.util.regex.Pattern.DOTALL;
import static java.util.regex.Pattern.compile;
/**
* Utilities to find 3DS values in ACS webpages.
*/
final class D3SRegexUtils {
/**
* Pattern to find the value of an attribute named value from an html tag with an attribute named name and a value of PaRes.
*/
private static final Pattern paresFinder = compile("<input(?=.+?name=\"PaRes\")(?=.+?value=\"(\\S+?)\").+>", DOTALL | CASE_INSENSITIVE);
/**
* Finds the PaRes in an html page.
*
* @param html String representation of the html page to search within.
* @return PaRes or null if not found
*/
@Nullable
static String findPaRes(@NonNull String html) {
if (html.trim().isEmpty()) return null;
String paRes = null;
Matcher paresMatcher = paresFinder.matcher(html);
if (paresMatcher.find()) {
paRes = paresMatcher.group(1);
}
return paRes;
}
}
## Instruction:
Add additional comment about only matching first value
## Code After:
package eu.livotov.labs.android.d3s;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import static java.util.regex.Pattern.CASE_INSENSITIVE;
import static java.util.regex.Pattern.DOTALL;
import static java.util.regex.Pattern.compile;
/**
* Utilities to find 3DS values in ACS webpages.
*/
final class D3SRegexUtils {
/**
* Pattern to find the value of an attribute named value from an html tag with an attribute named name and a value of PaRes.
*/
private static final Pattern paresFinder = compile("<input(?=.+?name=\"PaRes\")(?=.+?value=\"(\\S+?)\").+>", DOTALL | CASE_INSENSITIVE);
/**
* Finds the PaRes in an html page.
* <p>
* Note: If more than one PaRes is found in a page only the first will be returned.
*
* @param html String representation of the html page to search within.
* @return PaRes or null if not found
*/
@Nullable
static String findPaRes(@NonNull String html) {
if (html.trim().isEmpty()) return null;
String paRes = null;
Matcher paresMatcher = paresFinder.matcher(html);
if (paresMatcher.find()) {
paRes = paresMatcher.group(1);
}
return paRes;
}
} | package eu.livotov.labs.android.d3s;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import static java.util.regex.Pattern.CASE_INSENSITIVE;
import static java.util.regex.Pattern.DOTALL;
import static java.util.regex.Pattern.compile;
/**
* Utilities to find 3DS values in ACS webpages.
*/
final class D3SRegexUtils {
/**
* Pattern to find the value of an attribute named value from an html tag with an attribute named name and a value of PaRes.
*/
private static final Pattern paresFinder = compile("<input(?=.+?name=\"PaRes\")(?=.+?value=\"(\\S+?)\").+>", DOTALL | CASE_INSENSITIVE);
/**
* Finds the PaRes in an html page.
+ * <p>
+ * Note: If more than one PaRes is found in a page only the first will be returned.
*
* @param html String representation of the html page to search within.
* @return PaRes or null if not found
*/
@Nullable
static String findPaRes(@NonNull String html) {
if (html.trim().isEmpty()) return null;
String paRes = null;
Matcher paresMatcher = paresFinder.matcher(html);
if (paresMatcher.find()) {
paRes = paresMatcher.group(1);
}
return paRes;
}
} | 2 | 0.04878 | 2 | 0 |
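
The double-lookahead pattern in the row above ports directly to Python's `re` module; the lookaheads let `name="PaRes"` and `value="..."` appear in either order inside the tag. A quick sketch (the sample HTML strings are invented for illustration):

```python
import re

# Same pattern as the Java source: one lookahead requires name="PaRes",
# the other captures the value attribute, so attribute order is irrelevant.
PARES = re.compile(r'<input(?=.+?name="PaRes")(?=.+?value="(\S+?)").+>',
                   re.DOTALL | re.IGNORECASE)

def find_pares(html):
    match = PARES.search(html)
    return match.group(1) if match else None

print(find_pares('<input type="hidden" name="PaRes" value="abc123"/>'))  # abc123
print(find_pares('<input value="xyz" name="PaRes">'))                    # xyz
print(find_pares("<p>no input element here</p>"))                        # None
```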
9cc540de8a4956dcf929f4fa8803f0c26ea39d53 | static/locales/ca/messages.properties | static/locales/ca/messages.properties | navigation-toggle=Commuta la navegació
navigation-title=Introducció al Pontoon
navigation-what=Què
navigation-how=Com
navigation-more=Més
navigation-developers=Desenvolupadors
# Header
upper-title=Pontoon, un projecte de Mozilla
headline-1=Traduïu el web.
# What
# How
# More
# Developers
# Footer
| navigation-toggle=Commuta la navegació
navigation-title=Introducció al Pontoon
navigation-what=Què
navigation-how=Com
navigation-more=Més
navigation-developers=Desenvolupadors
# Header
upper-title=Pontoon, un projecte de Mozilla
headline-1=Traduïu el web.
headline-2=«In situ».
call-to-action=Més informació
# What
what-title=Què és el Pontoon?
what-desc=El Pontoon us permet traduir el contingut web «in situ», de forma que podeu veure directament el context i les limitacions d'espai.
context=Enteneu el context
context-desc=Com que traduïu la pàgina web directament dins la pàgina web mateixa, ja no cal patir per saber si la paraula que esteu traduint és un verb o un nom.
space=Vegeu les limitacions d'espai
space-desc=Com que podeu veure l'espai disponible per a la traducció, evitareu els errors en la interfície d'usuari, cosa que és especialment útil en les aplicacions.
preview=Previsualització instantània
preview-desc=Immediatament després d'enviar-se, la traducció substitueix el text original en la pàgina web, per tal que pugueu ser el primer de revisar-la i comprovar-la.
# How
how-title=Com funciona?
how-desc=El Pontoon és una eina molt senzilla i intuïtiva que no requereix cap habilitat tècnica per començar a traduir.
hover=Passeu el ratolí
hover-sub=per sobre del contingut web
hover-desc=Moveu el ratolí sobre els títols, enllaços, paràgrafs o altres blocs de text d'aquesta pàgina. Apareixerà un rectangle discontinu al voltant de cadascun d'aquests blocs. Aquest rectangle indica les cadenes disponibles per a la traducció dins la mateixa pàgina.
select=Seleccioneu
select-sub=un bloc de text
# More
# Developers
# Footer
| Update Catalan (ca) localization of Pontoon Intro. | Pontoon: Update Catalan (ca) localization of Pontoon Intro.
| INI | bsd-3-clause | mathjazz/pontoon-intro,Osmose/pontoon-intro,jotes/pontoon-intro,mathjazz/pontoon-intro,m8ttyB/pontoon-intro,jotes/pontoon-intro,m8ttyB/pontoon-intro,mathjazz/pontoon-intro,Osmose/pontoon-intro,jotes/pontoon-intro,m8ttyB/pontoon-intro,Osmose/pontoon-intro | ini | ## Code Before:
navigation-toggle=Commuta la navegació
navigation-title=Introducció al Pontoon
navigation-what=Què
navigation-how=Com
navigation-more=Més
navigation-developers=Desenvolupadors
# Header
upper-title=Pontoon, un projecte de Mozilla
headline-1=Traduïu el web.
# What
# How
# More
# Developers
# Footer
## Instruction:
Pontoon: Update Catalan (ca) localization of Pontoon Intro.
## Code After:
navigation-toggle=Commuta la navegació
navigation-title=Introducció al Pontoon
navigation-what=Què
navigation-how=Com
navigation-more=Més
navigation-developers=Desenvolupadors
# Header
upper-title=Pontoon, un projecte de Mozilla
headline-1=Traduïu el web.
headline-2=«In situ».
call-to-action=Més informació
# What
what-title=Què és el Pontoon?
what-desc=El Pontoon us permet traduir el contingut web «in situ», de forma que podeu veure directament el context i les limitacions d'espai.
context=Enteneu el context
context-desc=Com que traduïu la pàgina web directament dins la pàgina web mateixa, ja no cal patir per saber si la paraula que esteu traduint és un verb o un nom.
space=Vegeu les limitacions d'espai
space-desc=Com que podeu veure l'espai disponible per a la traducció, evitareu els errors en la interfície d'usuari, cosa que és especialment útil en les aplicacions.
preview=Previsualització instantània
preview-desc=Immediatament després d'enviar-se, la traducció substitueix el text original en la pàgina web, per tal que pugueu ser el primer de revisar-la i comprovar-la.
# How
how-title=Com funciona?
how-desc=El Pontoon és una eina molt senzilla i intuïtiva que no requereix cap habilitat tècnica per començar a traduir.
hover=Passeu el ratolí
hover-sub=per sobre del contingut web
hover-desc=Moveu el ratolí sobre els títols, enllaços, paràgrafs o altres blocs de text d'aquesta pàgina. Apareixerà un rectangle discontinu al voltant de cadascun d'aquests blocs. Aquest rectangle indica les cadenes disponibles per a la traducció dins la mateixa pàgina.
select=Seleccioneu
select-sub=un bloc de text
# More
# Developers
# Footer
| navigation-toggle=Commuta la navegació
navigation-title=Introducció al Pontoon
navigation-what=Què
navigation-how=Com
navigation-more=Més
navigation-developers=Desenvolupadors
# Header
upper-title=Pontoon, un projecte de Mozilla
headline-1=Traduïu el web.
+ headline-2=«In situ».
+ call-to-action=Més informació
# What
+ what-title=Què és el Pontoon?
+ what-desc=El Pontoon us permet traduir el contingut web «in situ», de forma que podeu veure directament el context i les limitacions d'espai.
+ context=Enteneu el context
+ context-desc=Com que traduïu la pàgina web directament dins la pàgina web mateixa, ja no cal patir per saber si la paraula que esteu traduint és un verb o un nom.
+ space=Vegeu les limitacions d'espai
+ space-desc=Com que podeu veure l'espai disponible per a la traducció, evitareu els errors en la interfície d'usuari, cosa que és especialment útil en les aplicacions.
+ preview=Previsualització instantània
+ preview-desc=Immediatament després d'enviar-se, la traducció substitueix el text original en la pàgina web, per tal que pugueu ser el primer de revisar-la i comprovar-la.
# How
+ how-title=Com funciona?
+ how-desc=El Pontoon és una eina molt senzilla i intuïtiva que no requereix cap habilitat tècnica per començar a traduir.
+ hover=Passeu el ratolí
+ hover-sub=per sobre del contingut web
+ hover-desc=Moveu el ratolí sobre els títols, enllaços, paràgrafs o altres blocs de text d'aquesta pàgina. Apareixerà un rectangle discontinu al voltant de cadascun d'aquests blocs. Aquest rectangle indica les cadenes disponibles per a la traducció dins la mateixa pàgina.
+ select=Seleccioneu
+ select-sub=un bloc de text
# More
# Developers
# Footer
| 17 | 0.809524 | 17 | 0 |
8c7cdb9a573880296726c384ca0a63eea239df04 | .travis.yml | .travis.yml | language: node_js
node_js:
- "4.0"
- "5.0"
sudo: false
| language: node_js
node_js:
- "4.0"
- "5.0"
before_install: if [[ `npm -v` != 3* ]]; then npm i -g npm@3; fi
sudo: false
| Make sure NPM is at least v3.0 or higher. | Make sure NPM is at least v3.0 or higher.
| YAML | mit | neogeek/spire-of-babel,neogeek/spire-of-babel | yaml | ## Code Before:
language: node_js
node_js:
- "4.0"
- "5.0"
sudo: false
## Instruction:
Make sure NPM is at least v3.0 or higher.
## Code After:
language: node_js
node_js:
- "4.0"
- "5.0"
before_install: if [[ `npm -v` != 3* ]]; then npm i -g npm@3; fi
sudo: false
| language: node_js
node_js:
- "4.0"
- "5.0"
+ before_install: if [[ `npm -v` != 3* ]]; then npm i -g npm@3; fi
sudo: false | 1 | 0.2 | 1 | 0 |
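
The `before_install` line above is a shell glob test, not a numeric comparison: it reinstalls npm whenever `npm -v` does not start with `3`, so it would also fire on npm 4 and later. Mirrored in Python to make that explicit (function name is ours):

```python
def npm_reinstall_needed(reported_version):
    # Mirrors [[ `npm -v` != 3* ]]: true for any version string that does not
    # begin with "3", including majors above 3, not only versions below it.
    return not reported_version.startswith("3")

print(npm_reinstall_needed("2.15.11"))  # True  -> runs `npm i -g npm@3`
print(npm_reinstall_needed("3.10.9"))   # False -> keeps the preinstalled npm
print(npm_reinstall_needed("4.0.0"))    # True  -> the glob also rejects npm 4+
```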
56c9ee0611428ec8db267d293d93d63f11b8021a | src/helpers/find-root.js | src/helpers/find-root.js | 'use babel'
import Path from 'path'
import {getDir, findFile} from './common'
import {CONFIG_FILE_NAME} from '../defaults'
export function findRoot(path, options) {
if (options.root !== null) {
return options.root
}
const searchPath = getDir(path.indexOf('*') === -1 ? path : options.cwd)
const configFile = findFile(searchPath, CONFIG_FILE_NAME)
if (configFile === null) {
return searchPath
} else {
return Path.dirname(configFile)
}
}
| 'use babel'
import Path from 'path'
import {getDir, findFile} from './common'
import {CONFIG_FILE_NAME} from '../defaults'
import isGlob from 'is-glob'
export function findRoot(path, options) {
if (options.root !== null) {
return options.root
}
const searchPath = getDir(isGlob(path) ? options.cwd : path)
const configFile = findFile(searchPath, CONFIG_FILE_NAME)
if (configFile === null) {
return searchPath
} else {
return Path.dirname(configFile)
}
}
| Use is-glob when finding root | :new: Use is-glob when finding root
| JavaScript | mit | steelbrain/UCompiler | javascript | ## Code Before:
'use babel'
import Path from 'path'
import {getDir, findFile} from './common'
import {CONFIG_FILE_NAME} from '../defaults'
export function findRoot(path, options) {
if (options.root !== null) {
return options.root
}
const searchPath = getDir(path.indexOf('*') === -1 ? path : options.cwd)
const configFile = findFile(searchPath, CONFIG_FILE_NAME)
if (configFile === null) {
return searchPath
} else {
return Path.dirname(configFile)
}
}
## Instruction:
:new: Use is-glob when finding root
## Code After:
'use babel'
import Path from 'path'
import {getDir, findFile} from './common'
import {CONFIG_FILE_NAME} from '../defaults'
import isGlob from 'is-glob'
export function findRoot(path, options) {
if (options.root !== null) {
return options.root
}
const searchPath = getDir(isGlob(path) ? options.cwd : path)
const configFile = findFile(searchPath, CONFIG_FILE_NAME)
if (configFile === null) {
return searchPath
} else {
return Path.dirname(configFile)
}
}
| 'use babel'
import Path from 'path'
import {getDir, findFile} from './common'
import {CONFIG_FILE_NAME} from '../defaults'
+ import isGlob from 'is-glob'
export function findRoot(path, options) {
if (options.root !== null) {
return options.root
}
- const searchPath = getDir(path.indexOf('*') === -1 ? path : options.cwd)
+ const searchPath = getDir(isGlob(path) ? options.cwd : path)
const configFile = findFile(searchPath, CONFIG_FILE_NAME)
if (configFile === null) {
return searchPath
} else {
return Path.dirname(configFile)
}
} | 3 | 0.15 | 2 | 1 |
d0eef8faee9ea135a9f871031198b4fcd01abe2e | config/routes.rb | config/routes.rb | ActionController::Routing::Routes.draw do |map|
map.resource :user
map.resource :password, :only => [:new, :create, :edit, :update]
map.resource :activation, :only => [:new, :create]
map.resource :session
map.namespace :admin do |admin|
admin.resources :users, :member => { :activate => :put, :suspend => :put, :reset_password => :put },
:collection => { :inactive => :get }
end
map.login "/login", :controller => 'sessions', :action => 'new'
map.logout "/logout", :controller => 'sessions', :action => 'destroy'
map.signup "/signup", :controller => 'users', :action => 'new'
map.activate "/login/activate/:activation_code", :controller => "logins", :action => "activate", :activation_code => nil
end
| ActionController::Routing::Routes.draw do |map|
map.resource :user
map.resource :password, :only => [:new, :create, :edit, :update]
map.resource :activation, :only => [:new, :create]
map.resource :session
map.namespace :admin do |admin|
admin.resources :users, :member => { :activate => :put, :suspend => :put, :reset_password => :put, :administrator => :put },
:collection => { :inactive => :get }
end
map.login "/login", :controller => 'sessions', :action => 'new'
map.logout "/logout", :controller => 'sessions', :action => 'destroy'
map.signup "/signup", :controller => 'users', :action => 'new'
map.activate "/login/activate/:activation_code", :controller => "logins", :action => "activate", :activation_code => nil
end
| Add routing for the new action | Add routing for the new action
| Ruby | mit | rubaidh/authentication | ruby | ## Code Before:
ActionController::Routing::Routes.draw do |map|
map.resource :user
map.resource :password, :only => [:new, :create, :edit, :update]
map.resource :activation, :only => [:new, :create]
map.resource :session
map.namespace :admin do |admin|
admin.resources :users, :member => { :activate => :put, :suspend => :put, :reset_password => :put },
:collection => { :inactive => :get }
end
map.login "/login", :controller => 'sessions', :action => 'new'
map.logout "/logout", :controller => 'sessions', :action => 'destroy'
map.signup "/signup", :controller => 'users', :action => 'new'
map.activate "/login/activate/:activation_code", :controller => "logins", :action => "activate", :activation_code => nil
end
## Instruction:
Add routing for the new action
## Code After:
ActionController::Routing::Routes.draw do |map|
map.resource :user
map.resource :password, :only => [:new, :create, :edit, :update]
map.resource :activation, :only => [:new, :create]
map.resource :session
map.namespace :admin do |admin|
admin.resources :users, :member => { :activate => :put, :suspend => :put, :reset_password => :put, :administrator => :put },
:collection => { :inactive => :get }
end
map.login "/login", :controller => 'sessions', :action => 'new'
map.logout "/logout", :controller => 'sessions', :action => 'destroy'
map.signup "/signup", :controller => 'users', :action => 'new'
map.activate "/login/activate/:activation_code", :controller => "logins", :action => "activate", :activation_code => nil
end
| ActionController::Routing::Routes.draw do |map|
map.resource :user
map.resource :password, :only => [:new, :create, :edit, :update]
map.resource :activation, :only => [:new, :create]
map.resource :session
map.namespace :admin do |admin|
- admin.resources :users, :member => { :activate => :put, :suspend => :put, :reset_password => :put },
+ admin.resources :users, :member => { :activate => :put, :suspend => :put, :reset_password => :put, :administrator => :put },
? ++++++++++++++++++++++++
:collection => { :inactive => :get }
end
map.login "/login", :controller => 'sessions', :action => 'new'
map.logout "/logout", :controller => 'sessions', :action => 'destroy'
map.signup "/signup", :controller => 'users', :action => 'new'
map.activate "/login/activate/:activation_code", :controller => "logins", :action => "activate", :activation_code => nil
end | 2 | 0.105263 | 1 | 1 |
7398eefa679af494cd6647c7ef712cb3f46a6a13 | README.md | README.md | README
======
by BirMa
<https://github.com/BirMa/dotfiles_main>
This is my collection of dotfiles.
----------------------------------
* All dotfiles from this repo reside within a single folder on my filesystem.
* They are all symlinked to the respective location where they are expected from applications.
* You totally should not spend your time reading this.
| README
======
by BirMa
<https://github.com/BirMa/dotfiles_main>
This is a collection of most of my dotfiles.
----------------------------------
* All dotfiles from this repo reside within a single folder on the filesystem.
* They are all symlinked to the respective locations where they are expected by applications.
* You totally should not spend your time reading this.
| Improve some of the grammar. | Improve some of the grammar.
| Markdown | unlicense | BirMa/dotfiles_sys,BirMa/dotfiles_sys | markdown | ## Code Before:
README
======
by BirMa
<https://github.com/BirMa/dotfiles_main>
This is my collection of dotfiles.
----------------------------------
* All dotfiles from this repo reside within a single folder on my filesystem.
* They are all symlinked to the respective location where they are expected from applications.
* You totally should not spend your time reading this.
## Instruction:
Improve some of the grammar.
## Code After:
README
======
by BirMa
<https://github.com/BirMa/dotfiles_main>
This is a collection of most of my dotfiles.
----------------------------------
* All dotfiles from this repo reside within a single folder on the filesystem.
* They are all symlinked to the respective locations where they are expected by applications.
* You totally should not spend your time reading this.
| README
======
by BirMa
<https://github.com/BirMa/dotfiles_main>
- This is my collection of dotfiles.
? ^^
+ This is a collection of most of my dotfiles.
? ^ +++++++++++
----------------------------------
- * All dotfiles from this repo reside within a single folder on my filesystem.
? ^^
+ * All dotfiles from this repo reside within a single folder on the filesystem.
? ^^^
- * They are all symlinked to the respective location where they are expected from applications.
? ^^^^
+ * They are all symlinked to the respective locations where they are expected by applications.
? + ^^
* You totally should not spend your time reading this. | 6 | 0.461538 | 3 | 3 |
ef1e4f5492d08872d80e9388564e0bfdc51ae557 | lod_and_effect/index.html | lod_and_effect/index.html | <!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Many box plots</title>
<link rel=stylesheet type="text/css" href="lod_and_effect.css">
<script type="text/javascript" src="http://d3js.org/d3.v3.min.js"></script>
<script type="text/javascript" src="lod_and_effect.js"></script>
</head>
<body>
</body>
</html> | <!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>LOD curves and QTL effects</title>
<link rel=stylesheet type="text/css" href="lod_and_effect.css">
<script type="text/javascript" src="http://d3js.org/d3.v3.min.js"></script>
<script type="text/javascript" src="lod_and_effect.js"></script>
</head>
<body>
</body>
</html> | Change title in html file | lod_and_effect: Change title in html file
| HTML | mit | kbroman/JSbroman,kbroman/JSbroman,kbroman/JSbroman | html | ## Code Before:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Many box plots</title>
<link rel=stylesheet type="text/css" href="lod_and_effect.css">
<script type="text/javascript" src="http://d3js.org/d3.v3.min.js"></script>
<script type="text/javascript" src="lod_and_effect.js"></script>
</head>
<body>
</body>
</html>
## Instruction:
lod_and_effect: Change title in html file
## Code After:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>LOD curves and QTL effects</title>
<link rel=stylesheet type="text/css" href="lod_and_effect.css">
<script type="text/javascript" src="http://d3js.org/d3.v3.min.js"></script>
<script type="text/javascript" src="lod_and_effect.js"></script>
</head>
<body>
</body>
</html> | <!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
- <title>Many box plots</title>
+ <title>LOD curves and QTL effects</title>
<link rel=stylesheet type="text/css" href="lod_and_effect.css">
<script type="text/javascript" src="http://d3js.org/d3.v3.min.js"></script>
<script type="text/javascript" src="lod_and_effect.js"></script>
</head>
<body>
</body>
</html> | 2 | 0.153846 | 1 | 1 |
63a4fc79320266cabd811d4584009e61c5e67e6a | README.md | README.md | monitor-http-server
===================
Simple script to monitor if servers are responding (status 200) or not.
## Screenshot

## Dependencies
You need to have a **mail server** configured to be able to send notification.
* `mail` command to send notification.
* [stylerc](https://github.com/edouard-lopez/stylerc): bash output style ;
* [toolboxrc](https://github.com/edouard-lopez/toolboxrc): some stupid utilities ;
## Usage
First you need to edit the file `monitor-list.txt` and add some servers URLs (using their [FQDN](https://en.wikipedia.org/wiki/Fully_qualified_domain_name)).
Note that lines starting with a `#` (dash) are ignored.
# ignored hosts
# http://wont-be.tested.com/
# test
http://my.website.com/
Then you are good to go and run your first test
bash ./monitor-servers.sh me@host.com ./monitor-list-default.txt
| monitor-http-server
===================
Simple script to monitor if servers are responding (status 200) or not.
## Screenshot

## Dependencies
You need to have a **mail server** configured to be able to send notification.
* `mail` command to send notification.
* [stylerc](https://github.com/edouard-lopez/stylerc): bash output style ;
* [toolboxrc](https://github.com/edouard-lopez/toolboxrc): some stupid utilities ;
## Install
The project use 2 submodule, instructions below cover an *out of the box* installation:
```
git clone git@github.com:edouard-lopez/monitor-http-server.git
cd monitor-http-server
git submodule init && git submodule update # install the submodules
```
## Usage
First you need to edit the file `monitor-list.txt` and add some servers URLs (using their [FQDN](https://en.wikipedia.org/wiki/Fully_qualified_domain_name)).
Note that lines starting with a `#` (dash) are ignored.
# ignored hosts
# http://wont-be.tested.com/
# test
http://my.website.com/
Then you are good to go and run your first test
bash ./monitor-servers.sh me@host.com ./monitor-list-default.txt
| Add out of the box installation instructions | Add out of the box installation instructions
| Markdown | mit | edouard-lopez/monitor-http-server | markdown | ## Code Before:
monitor-http-server
===================
Simple script to monitor if servers are responding (status 200) or not.
## Screenshot

## Dependencies
You need to have a **mail server** configured to be able to send notification.
* `mail` command to send notification.
* [stylerc](https://github.com/edouard-lopez/stylerc): bash output style ;
* [toolboxrc](https://github.com/edouard-lopez/toolboxrc): some stupid utilities ;
## Usage
First you need to edit the file `monitor-list.txt` and add some servers URLs (using their [FQDN](https://en.wikipedia.org/wiki/Fully_qualified_domain_name)).
Note that lines starting with a `#` (dash) are ignored.
# ignored hosts
# http://wont-be.tested.com/
# test
http://my.website.com/
Then you are good to go and run your first test
bash ./monitor-servers.sh me@host.com ./monitor-list-default.txt
## Instruction:
Add out of the box installation instructions
## Code After:
monitor-http-server
===================
Simple script to monitor if servers are responding (status 200) or not.
## Screenshot

## Dependencies
You need to have a **mail server** configured to be able to send notification.
* `mail` command to send notification.
* [stylerc](https://github.com/edouard-lopez/stylerc): bash output style ;
* [toolboxrc](https://github.com/edouard-lopez/toolboxrc): some stupid utilities ;
## Install
The project use 2 submodule, instructions below cover an *out of the box* installation:
```
git clone git@github.com:edouard-lopez/monitor-http-server.git
cd monitor-http-server
git submodule init && git submodule update # install the submodules
```
## Usage
First you need to edit the file `monitor-list.txt` and add some servers URLs (using their [FQDN](https://en.wikipedia.org/wiki/Fully_qualified_domain_name)).
Note that lines starting with a `#` (dash) are ignored.
# ignored hosts
# http://wont-be.tested.com/
# test
http://my.website.com/
Then you are good to go and run your first test
bash ./monitor-servers.sh me@host.com ./monitor-list-default.txt
| monitor-http-server
===================
Simple script to monitor if servers are responding (status 200) or not.
## Screenshot

## Dependencies
You need to have a **mail server** configured to be able to send notification.
* `mail` command to send notification.
* [stylerc](https://github.com/edouard-lopez/stylerc): bash output style ;
* [toolboxrc](https://github.com/edouard-lopez/toolboxrc): some stupid utilities ;
+ ## Install
+
+ The project use 2 submodule, instructions below cover an *out of the box* installation:
+
+ ```
+ git clone git@github.com:edouard-lopez/monitor-http-server.git
+ cd monitor-http-server
+ git submodule init && git submodule update # install the submodules
+ ```
+
## Usage
First you need to edit the file `monitor-list.txt` and add some servers URLs (using their [FQDN](https://en.wikipedia.org/wiki/Fully_qualified_domain_name)).
Note that lines starting with a `#` (dash) are ignored.
# ignored hosts
# http://wont-be.tested.com/
# test
http://my.website.com/
Then you are good to go and run your first test
bash ./monitor-servers.sh me@host.com ./monitor-list-default.txt | 10 | 0.333333 | 10 | 0 |
47c9e7467019ae980b3c5695e1f6453ee25c0246 | README.md | README.md | A simple plugin for making textarea elements capable of being resized automatically based on the combined height of its content.
## Usage
$(function() {
$("textarea").autoResize()
})
Or if you prefer to have the height adjustment happen without animation:
$(function() {
$("textarea").autoResize({ animateOptions: null })
})
| A simple plugin for making textarea elements capable of being resized automatically based on the combined height of its content.
## Usage
$(function() {
$("textarea").autoResize()
})
| Remove unnecessary information about disabling animation. | Remove unnecessary information about disabling animation.
Until animation support is re-enabled, the removed information
serves no purpose other than confusion. Might remove animations
entirely... | Markdown | mit | thomasjo/jquery-autoresize | markdown | ## Code Before:
A simple plugin for making textarea elements capable of being resized automatically based on the combined height of its content.
## Usage
$(function() {
$("textarea").autoResize()
})
Or if you prefer to have the height adjustment happen without animation:
$(function() {
$("textarea").autoResize({ animateOptions: null })
})
## Instruction:
Remove unnecessary information about disabling animation.
Until animation support is re-enabled, the removed information
serves no purpose other than confusion. Might remove animations
entirely...
## Code After:
A simple plugin for making textarea elements capable of being resized automatically based on the combined height of its content.
## Usage
$(function() {
$("textarea").autoResize()
})
| A simple plugin for making textarea elements capable of being resized automatically based on the combined height of its content.
## Usage
$(function() {
$("textarea").autoResize()
})
-
- Or if you prefer to have the height adjustment happen without animation:
-
- $(function() {
- $("textarea").autoResize({ animateOptions: null })
- })
- | 7 | 0.466667 | 0 | 7 |
72e509be8415e628613b2341c136018ff5bb0f44 | openpathsampling/engines/openmm/__init__.py | openpathsampling/engines/openmm/__init__.py | def missing_openmm(*args, **kwargs):
raise RuntimeError("Install OpenMM to use this feature")
try:
import simtk.openmm
import simtk.openmm.app
except ImportError:
HAS_OPENMM = False
Engine = missing_openmm
empty_snapshot_from_openmm_topology = missing_openmm
snapshot_from_pdb = missing_openmm
snapshot_from_testsystem = missing_openmm
to_openmm_topology = missing_openmm
trajectory_from_mdtraj = missing_openmm
trajectory_to_mdtraj = missing_openmm
Snapshot = missing_openmm
MDSnapshot = missing_openmm
else:
from .engine import OpenMMEngine as Engine
from .tools import (
empty_snapshot_from_openmm_topology,
snapshot_from_pdb,
snapshot_from_testsystem,
to_openmm_topology,
trajectory_from_mdtraj,
trajectory_to_mdtraj
)
from . import features
from .snapshot import Snapshot, MDSnapshot
from openpathsampling.engines import NoEngine, SnapshotDescriptor
| def missing_openmm(*args, **kwargs):
raise RuntimeError("Install OpenMM to use this feature")
try:
import simtk.openmm
import simtk.openmm.app
except ImportError:
HAS_OPENMM = False
Engine = missing_openmm
empty_snapshot_from_openmm_topology = missing_openmm
snapshot_from_pdb = missing_openmm
snapshot_from_testsystem = missing_openmm
to_openmm_topology = missing_openmm
trajectory_from_mdtraj = missing_openmm
trajectory_to_mdtraj = missing_openmm
Snapshot = missing_openmm
MDSnapshot = missing_openmm
else:
from .engine import OpenMMEngine as Engine
from .tools import (
empty_snapshot_from_openmm_topology,
snapshot_from_pdb,
snapshot_from_testsystem,
to_openmm_topology,
trajectory_from_mdtraj,
trajectory_to_mdtraj
)
from . import features
from .snapshot import Snapshot, MDSnapshot
from . import topology
from openpathsampling.engines import NoEngine, SnapshotDescriptor
| Fix backward compatiblity for MDTrajTopology | Fix backward compatiblity for MDTrajTopology
| Python | mit | openpathsampling/openpathsampling,dwhswenson/openpathsampling,openpathsampling/openpathsampling,choderalab/openpathsampling,openpathsampling/openpathsampling,choderalab/openpathsampling,dwhswenson/openpathsampling,dwhswenson/openpathsampling,choderalab/openpathsampling,openpathsampling/openpathsampling,dwhswenson/openpathsampling | python | ## Code Before:
def missing_openmm(*args, **kwargs):
raise RuntimeError("Install OpenMM to use this feature")
try:
import simtk.openmm
import simtk.openmm.app
except ImportError:
HAS_OPENMM = False
Engine = missing_openmm
empty_snapshot_from_openmm_topology = missing_openmm
snapshot_from_pdb = missing_openmm
snapshot_from_testsystem = missing_openmm
to_openmm_topology = missing_openmm
trajectory_from_mdtraj = missing_openmm
trajectory_to_mdtraj = missing_openmm
Snapshot = missing_openmm
MDSnapshot = missing_openmm
else:
from .engine import OpenMMEngine as Engine
from .tools import (
empty_snapshot_from_openmm_topology,
snapshot_from_pdb,
snapshot_from_testsystem,
to_openmm_topology,
trajectory_from_mdtraj,
trajectory_to_mdtraj
)
from . import features
from .snapshot import Snapshot, MDSnapshot
from openpathsampling.engines import NoEngine, SnapshotDescriptor
## Instruction:
Fix backward compatiblity for MDTrajTopology
## Code After:
def missing_openmm(*args, **kwargs):
raise RuntimeError("Install OpenMM to use this feature")
try:
import simtk.openmm
import simtk.openmm.app
except ImportError:
HAS_OPENMM = False
Engine = missing_openmm
empty_snapshot_from_openmm_topology = missing_openmm
snapshot_from_pdb = missing_openmm
snapshot_from_testsystem = missing_openmm
to_openmm_topology = missing_openmm
trajectory_from_mdtraj = missing_openmm
trajectory_to_mdtraj = missing_openmm
Snapshot = missing_openmm
MDSnapshot = missing_openmm
else:
from .engine import OpenMMEngine as Engine
from .tools import (
empty_snapshot_from_openmm_topology,
snapshot_from_pdb,
snapshot_from_testsystem,
to_openmm_topology,
trajectory_from_mdtraj,
trajectory_to_mdtraj
)
from . import features
from .snapshot import Snapshot, MDSnapshot
from . import topology
from openpathsampling.engines import NoEngine, SnapshotDescriptor
| def missing_openmm(*args, **kwargs):
raise RuntimeError("Install OpenMM to use this feature")
try:
import simtk.openmm
import simtk.openmm.app
except ImportError:
HAS_OPENMM = False
Engine = missing_openmm
empty_snapshot_from_openmm_topology = missing_openmm
snapshot_from_pdb = missing_openmm
snapshot_from_testsystem = missing_openmm
to_openmm_topology = missing_openmm
trajectory_from_mdtraj = missing_openmm
trajectory_to_mdtraj = missing_openmm
Snapshot = missing_openmm
MDSnapshot = missing_openmm
else:
from .engine import OpenMMEngine as Engine
from .tools import (
empty_snapshot_from_openmm_topology,
snapshot_from_pdb,
snapshot_from_testsystem,
to_openmm_topology,
trajectory_from_mdtraj,
trajectory_to_mdtraj
)
from . import features
from .snapshot import Snapshot, MDSnapshot
+ from . import topology
from openpathsampling.engines import NoEngine, SnapshotDescriptor | 1 | 0.030303 | 1 | 0 |
b61c4d8b34dd4c90a4c8ab972a857bc3a8315d7d | src/test/java/net/openhft/chronicle/bytes/BytesTextMethodTesterTest.java | src/test/java/net/openhft/chronicle/bytes/BytesTextMethodTesterTest.java | package net.openhft.chronicle.bytes;
import org.junit.Test;
import java.io.IOException;
import static junit.framework.TestCase.assertEquals;
public class BytesTextMethodTesterTest {
@Test
public void run() throws IOException {
BytesTextMethodTester tester = new BytesTextMethodTester<>("btmtt/one-input.txt", IBMImpl::new, IBytesMethod.class, "btmtt/one-output.txt");
tester.run();
assertEquals(tester.expected(), tester.actual());
}
static class IBMImpl implements IBytesMethod {
final IBytesMethod out;
IBMImpl(IBytesMethod out) {
this.out = out;
}
@Override
public void myByteable(MyByteable byteable) {
out.myByteable(byteable);
}
@Override
public void myScalars(MyScalars scalars) {
out.myScalars(scalars);
}
@Override
public void myNested(MyNested nested) {
out.myNested(nested);
}
}
} | package net.openhft.chronicle.bytes;
import org.junit.Test;
import java.io.IOException;
import static junit.framework.TestCase.assertEquals;
public class BytesTextMethodTesterTest {
@Test
public void run() throws IOException {
BytesTextMethodTester tester = new BytesTextMethodTester<>(
"btmtt/one-input.txt",
IBMImpl::new,
IBytesMethod.class,
"btmtt/one-output.txt");
tester.run();
assertEquals(tester.expected(), tester.actual());
}
static class IBMImpl implements IBytesMethod {
final IBytesMethod out;
IBMImpl(IBytesMethod out) {
this.out = out;
}
@Override
public void myByteable(MyByteable byteable) {
out.myByteable(byteable);
}
@Override
public void myScalars(MyScalars scalars) {
out.myScalars(scalars);
}
@Override
public void myNested(MyNested nested) {
out.myNested(nested);
}
}
} | Add examples with multi-byte message ids. | Add examples with multi-byte message ids.
| Java | apache-2.0 | OpenHFT/Chronicle-Bytes | java | ## Code Before:
package net.openhft.chronicle.bytes;
import org.junit.Test;
import java.io.IOException;
import static junit.framework.TestCase.assertEquals;
public class BytesTextMethodTesterTest {
@Test
public void run() throws IOException {
BytesTextMethodTester tester = new BytesTextMethodTester<>("btmtt/one-input.txt", IBMImpl::new, IBytesMethod.class, "btmtt/one-output.txt");
tester.run();
assertEquals(tester.expected(), tester.actual());
}
static class IBMImpl implements IBytesMethod {
final IBytesMethod out;
IBMImpl(IBytesMethod out) {
this.out = out;
}
@Override
public void myByteable(MyByteable byteable) {
out.myByteable(byteable);
}
@Override
public void myScalars(MyScalars scalars) {
out.myScalars(scalars);
}
@Override
public void myNested(MyNested nested) {
out.myNested(nested);
}
}
}
## Instruction:
Add examples with multi-byte message ids.
## Code After:
package net.openhft.chronicle.bytes;
import org.junit.Test;
import java.io.IOException;
import static junit.framework.TestCase.assertEquals;
public class BytesTextMethodTesterTest {
@Test
public void run() throws IOException {
BytesTextMethodTester tester = new BytesTextMethodTester<>(
"btmtt/one-input.txt",
IBMImpl::new,
IBytesMethod.class,
"btmtt/one-output.txt");
tester.run();
assertEquals(tester.expected(), tester.actual());
}
static class IBMImpl implements IBytesMethod {
final IBytesMethod out;
IBMImpl(IBytesMethod out) {
this.out = out;
}
@Override
public void myByteable(MyByteable byteable) {
out.myByteable(byteable);
}
@Override
public void myScalars(MyScalars scalars) {
out.myScalars(scalars);
}
@Override
public void myNested(MyNested nested) {
out.myNested(nested);
}
}
} | package net.openhft.chronicle.bytes;
import org.junit.Test;
import java.io.IOException;
import static junit.framework.TestCase.assertEquals;
public class BytesTextMethodTesterTest {
@Test
public void run() throws IOException {
- BytesTextMethodTester tester = new BytesTextMethodTester<>("btmtt/one-input.txt", IBMImpl::new, IBytesMethod.class, "btmtt/one-output.txt");
+ BytesTextMethodTester tester = new BytesTextMethodTester<>(
+ "btmtt/one-input.txt",
+ IBMImpl::new,
+ IBytesMethod.class,
+ "btmtt/one-output.txt");
tester.run();
assertEquals(tester.expected(), tester.actual());
}
static class IBMImpl implements IBytesMethod {
final IBytesMethod out;
IBMImpl(IBytesMethod out) {
this.out = out;
}
@Override
public void myByteable(MyByteable byteable) {
out.myByteable(byteable);
}
@Override
public void myScalars(MyScalars scalars) {
out.myScalars(scalars);
}
@Override
public void myNested(MyNested nested) {
out.myNested(nested);
}
}
} | 6 | 0.153846 | 5 | 1 |
c9e7cfc02ce454ff678463a136d3110835478b2e | src/panel-element.coffee | src/panel-element.coffee | {CompositeDisposable} = require 'event-kit'
{callAttachHooks} = require './space-pen-extensions'
class PanelElement extends HTMLElement
createdCallback: ->
@subscriptions = new CompositeDisposable
initialize: (@model) ->
@appendChild(@getItemView())
@classList.add(@model.getClassName().split(' ')...) if @model.getClassName()?
@subscriptions.add @model.onDidChangeVisible(@visibleChanged.bind(this))
@subscriptions.add @model.onDidDestroy(@destroyed.bind(this))
this
getModel: -> @model
getItemView: ->
atom.views.getView(@model.getItem())
attachedCallback: ->
callAttachHooks(@getItemView()) # for backward compatibility with SpacePen views
@visibleChanged(@model.isVisible())
visibleChanged: (visible) ->
if visible
@style.display = null
else
@style.display = 'none'
destroyed: ->
@subscriptions.dispose()
@parentNode?.removeChild(this)
module.exports = PanelElement = document.registerElement 'atom-panel', prototype: PanelElement.prototype
| {CompositeDisposable} = require 'event-kit'
{callAttachHooks} = require './space-pen-extensions'
Panel = require './panel'
class PanelElement extends HTMLElement
createdCallback: ->
@subscriptions = new CompositeDisposable
initialize: (@model) ->
@appendChild(@getItemView())
@classList.add(@model.getClassName().split(' ')...) if @model.getClassName()?
@subscriptions.add @model.onDidChangeVisible(@visibleChanged.bind(this))
@subscriptions.add @model.onDidDestroy(@destroyed.bind(this))
this
getModel: ->
@model or= new Panel({})
getItemView: ->
atom.views.getView(@getModel().getItem())
attachedCallback: ->
callAttachHooks(@getItemView()) # for backward compatibility with SpacePen views
@visibleChanged(@getModel().isVisible())
visibleChanged: (visible) ->
if visible
@style.display = null
else
@style.display = 'none'
destroyed: ->
@subscriptions.dispose()
@parentNode?.removeChild(this)
module.exports = PanelElement = document.registerElement 'atom-panel', prototype: PanelElement.prototype
| Allow PanelElements to be instantiated with markup | Allow PanelElements to be instantiated with markup
| CoffeeScript | mit | ali/atom,fang-yufeng/atom,hakatashi/atom,Hasimir/atom,dkfiresky/atom,alfredxing/atom,jordanbtucker/atom,bryonwinger/atom,davideg/atom,Neron-X5/atom,jordanbtucker/atom,woss/atom,me6iaton/atom,kjav/atom,scippio/atom,rsvip/aTom,KENJU/atom,vjeux/atom,crazyquark/atom,SlimeQ/atom,yamhon/atom,hellendag/atom,deepfox/atom,yalexx/atom,G-Baby/atom,ashneo76/atom,kandros/atom,SlimeQ/atom,vcarrera/atom,vjeux/atom,gontadu/atom,fscherwi/atom,champagnez/atom,Hasimir/atom,RobinTec/atom,pkdevbox/atom,panuchart/atom,t9md/atom,oggy/atom,bcoe/atom,chengky/atom,001szymon/atom,Rychard/atom,yalexx/atom,DiogoXRP/atom,prembasumatary/atom,woss/atom,acontreras89/atom,Galactix/atom,originye/atom,champagnez/atom,phord/atom,vhutheesing/atom,Locke23rus/atom,matthewclendening/atom,tisu2tisu/atom,Ju2ender/atom,dsandstrom/atom,beni55/atom,gabrielPeart/atom,ezeoleaf/atom,liuderchi/atom,h0dgep0dge/atom,lpommers/atom,folpindo/atom,abcP9110/atom,john-kelly/atom,ali/atom,jacekkopecky/atom,NunoEdgarGub1/atom,kevinrenaers/atom,kdheepak89/atom,acontreras89/atom,Jdesk/atom,avdg/atom,wiggzz/atom,lisonma/atom,yomybaby/atom,bryonwinger/atom,jlord/atom,KENJU/atom,Jdesk/atom,AlbertoBarrago/atom,Abdillah/atom,deoxilix/atom,jacekkopecky/atom,charleswhchan/atom,Dennis1978/atom,Austen-G/BlockBuilder,palita01/atom,rsvip/aTom,sebmck/atom,batjko/atom,charleswhchan/atom,anuwat121/atom,basarat/atom,ObviouslyGreen/atom,yangchenghu/atom,yomybaby/atom,synaptek/atom,Austen-G/BlockBuilder,ivoadf/atom,gzzhanghao/atom,bolinfest/atom,KENJU/atom,originye/atom,KENJU/atom,devoncarew/atom,CraZySacX/atom,vinodpanicker/atom,ppamorim/atom,dsandstrom/atom,splodingsocks/atom,hpham04/atom,basarat/atom,Ju2ender/atom,qiujuer/atom,yangchenghu/atom,constanzaurzua/atom,hagb4rd/atom,brettle/atom,Mokolea/atom,helber/atom,fredericksilva/atom,amine7536/atom,dannyflax/atom,acontreras89/atom,qiujuer/atom,paulcbetts/atom,mrodalgaard/atom,Rodjana/atom,Abdillah/atom,001szymon/atom,rookie125/atom,liuxiong332/atom,vcarrera/atom,consta
nzaurzua/atom,seedtigo/atom,rsvip/aTom,0x73/atom,florianb/atom,charleswhchan/atom,gontadu/atom,ReddTea/atom,sxgao3001/atom,GHackAnonymous/atom,ralphtheninja/atom,ezeoleaf/atom,nvoron23/atom,RuiDGoncalves/atom,jlord/atom,tanin47/atom,yomybaby/atom,hagb4rd/atom,rmartin/atom,rlugojr/atom,fredericksilva/atom,Galactix/atom,YunchengLiao/atom,dannyflax/atom,gabrielPeart/atom,Jonekee/atom,NunoEdgarGub1/atom,Austen-G/BlockBuilder,liuderchi/atom,sebmck/atom,Arcanemagus/atom,Ju2ender/atom,mrodalgaard/atom,devoncarew/atom,vinodpanicker/atom,sotayamashita/atom,jjz/atom,harshdattani/atom,vinodpanicker/atom,dannyflax/atom,Andrey-Pavlov/atom,jeremyramin/atom,darwin/atom,synaptek/atom,chfritz/atom,Jandersolutions/atom,davideg/atom,matthewclendening/atom,Jandersolutions/atom,xream/atom,codex8/atom,fredericksilva/atom,ashneo76/atom,sotayamashita/atom,svanharmelen/atom,kc8wxm/atom,jacekkopecky/atom,hakatashi/atom,davideg/atom,isghe/atom,decaffeinate-examples/atom,codex8/atom,sxgao3001/atom,targeter21/atom,sekcheong/atom,scv119/atom,ralphtheninja/atom,russlescai/atom,Ju2ender/atom,DiogoXRP/atom,john-kelly/atom,Abdillah/atom,chengky/atom,amine7536/atom,john-kelly/atom,Jandersolutions/atom,CraZySacX/atom,matthewclendening/atom,transcranial/atom,decaffeinate-examples/atom,Andrey-Pavlov/atom,tanin47/atom,cyzn/atom,sillvan/atom,mertkahyaoglu/atom,oggy/atom,prembasumatary/atom,NunoEdgarGub1/atom,ykeisuke/atom,mertkahyaoglu/atom,AlisaKiatkongkumthon/atom,svanharmelen/atom,Ingramz/atom,hagb4rd/atom,abcP9110/atom,ironbox360/atom,bsmr-x-script/atom,kc8wxm/atom,tony612/atom,folpindo/atom,boomwaiza/atom,qskycolor/atom,mnquintana/atom,acontreras89/atom,johnhaley81/atom,splodingsocks/atom,seedtigo/atom,tony612/atom,elkingtonmcb/atom,andrewleverette/atom,codex8/atom,daxlab/atom,hharchani/atom,ali/atom,medovob/atom,n-riesco/atom,Neron-X5/atom,atom/atom,lovesnow/atom,prembasumatary/atom,john-kelly/atom,stuartquin/atom,sxgao3001/atom,Shekharrajak/atom,qskycolor/atom,G-Baby/atom,me-benni/atom,anuwat121/at
om,n-riesco/atom,jjz/atom,dsandstrom/atom,Sangaroonaom/atom,pombredanne/atom,bencolon/atom,lisonma/atom,dsandstrom/atom,Jandersoft/atom,liuderchi/atom,atom/atom,dkfiresky/atom,ezeoleaf/atom,dkfiresky/atom,h0dgep0dge/atom,ReddTea/atom,florianb/atom,brumm/atom,john-kelly/atom,Huaraz2/atom,andrewleverette/atom,GHackAnonymous/atom,hpham04/atom,atom/atom,Abdillah/atom,ObviouslyGreen/atom,nvoron23/atom,AdrianVovk/substance-ide,RuiDGoncalves/atom,sxgao3001/atom,mdumrauf/atom,jtrose2/atom,tanin47/atom,alexandergmann/atom,bolinfest/atom,Ingramz/atom,jeremyramin/atom,nrodriguez13/atom,niklabh/atom,gisenberg/atom,omarhuanca/atom,tony612/atom,ilovezy/atom,mostafaeweda/atom,ReddTea/atom,kdheepak89/atom,originye/atom,rmartin/atom,Ingramz/atom,amine7536/atom,dkfiresky/atom,chengky/atom,Jandersoft/atom,paulcbetts/atom,dkfiresky/atom,mnquintana/atom,palita01/atom,liuxiong332/atom,seedtigo/atom,ardeshirj/atom,basarat/atom,jtrose2/atom,Jdesk/atom,russlescai/atom,rxkit/atom,tjkr/atom,MjAbuz/atom,MjAbuz/atom,mostafaeweda/atom,GHackAnonymous/atom,batjko/atom,isghe/atom,ezeoleaf/atom,hellendag/atom,stinsonga/atom,Jonekee/atom,me-benni/atom,jjz/atom,panuchart/atom,chengky/atom,medovob/atom,johnhaley81/atom,bcoe/atom,batjko/atom,deepfox/atom,vhutheesing/atom,devmario/atom,tisu2tisu/atom,deepfox/atom,rookie125/atom,isghe/atom,ironbox360/atom,paulcbetts/atom,bsmr-x-script/atom,liuxiong332/atom,transcranial/atom,AlbertoBarrago/atom,abcP9110/atom,NunoEdgarGub1/atom,rsvip/aTom,chengky/atom,dsandstrom/atom,batjko/atom,prembasumatary/atom,AlexxNica/atom,BogusCurry/atom,MjAbuz/atom,Jandersolutions/atom,fedorov/atom,MjAbuz/atom,charleswhchan/atom,fang-yufeng/atom,hharchani/atom,NunoEdgarGub1/atom,Andrey-Pavlov/atom,panuchart/atom,ReddTea/atom,n-riesco/atom,devoncarew/atom,fang-yufeng/atom,lovesnow/atom,basarat/atom,AlexxNica/atom,chfritz/atom,mostafaeweda/atom,kittens/atom,gisenberg/atom,Hasimir/atom,AlbertoBarrago/atom,hpham04/atom,kittens/atom,Jandersoft/atom,g2p/atom,florianb/atom,Shekharrajak/at
om,0x73/atom,omarhuanca/atom,mdumrauf/atom,toqz/atom,vcarrera/atom,devmario/atom,pengshp/atom,pkdevbox/atom,hagb4rd/atom,ObviouslyGreen/atom,hagb4rd/atom,pombredanne/atom,palita01/atom,Rodjana/atom,rsvip/aTom,omarhuanca/atom,woss/atom,toqz/atom,ilovezy/atom,nvoron23/atom,jtrose2/atom,Rychard/atom,dannyflax/atom,me-benni/atom,basarat/atom,johnrizzo1/atom,YunchengLiao/atom,Sangaroonaom/atom,Austen-G/BlockBuilder,Shekharrajak/atom,sebmck/atom,Huaraz2/atom,davideg/atom,bolinfest/atom,RobinTec/atom,cyzn/atom,ironbox360/atom,yamhon/atom,Hasimir/atom,Austen-G/BlockBuilder,woss/atom,dannyflax/atom,qskycolor/atom,chfritz/atom,g2p/atom,alexandergmann/atom,beni55/atom,kdheepak89/atom,jlord/atom,cyzn/atom,Sangaroonaom/atom,jlord/atom,daxlab/atom,phord/atom,ivoadf/atom,0x73/atom,vinodpanicker/atom,deoxilix/atom,synaptek/atom,tmunro/atom,daxlab/atom,RobinTec/atom,charleswhchan/atom,abcP9110/atom,Hasimir/atom,targeter21/atom,mostafaeweda/atom,nvoron23/atom,FoldingText/atom,t9md/atom,vhutheesing/atom,folpindo/atom,burodepeper/atom,sillvan/atom,hakatashi/atom,nvoron23/atom,FoldingText/atom,kaicataldo/atom,jacekkopecky/atom,me6iaton/atom,harshdattani/atom,ilovezy/atom,yalexx/atom,mdumrauf/atom,liuderchi/atom,RobinTec/atom,devoncarew/atom,FIT-CSE2410-A-Bombs/atom,dijs/atom,johnrizzo1/atom,FoldingText/atom,russlescai/atom,Klozz/atom,h0dgep0dge/atom,svanharmelen/atom,Shekharrajak/atom,vjeux/atom,RobinTec/atom,sebmck/atom,elkingtonmcb/atom,alexandergmann/atom,hharchani/atom,Jdesk/atom,alfredxing/atom,wiggzz/atom,splodingsocks/atom,Neron-X5/atom,niklabh/atom,YunchengLiao/atom,nrodriguez13/atom,vcarrera/atom,einarmagnus/atom,omarhuanca/atom,fang-yufeng/atom,batjko/atom,Galactix/atom,Klozz/atom,devoncarew/atom,gzzhanghao/atom,fedorov/atom,constanzaurzua/atom,brumm/atom,YunchengLiao/atom,einarmagnus/atom,burodepeper/atom,crazyquark/atom,stinsonga/atom,RuiDGoncalves/atom,crazyquark/atom,nucked/atom,ralphtheninja/atom,qiujuer/atom,fscherwi/atom,FoldingText/atom,lpommers/atom,bryonwinger/atom,t
jkr/atom,YunchengLiao/atom,scippio/atom,Jdesk/atom,gzzhanghao/atom,phord/atom,Andrey-Pavlov/atom,SlimeQ/atom,bcoe/atom,rxkit/atom,kc8wxm/atom,jlord/atom,jeremyramin/atom,kdheepak89/atom,einarmagnus/atom,kandros/atom,Dennis1978/atom,fscherwi/atom,splodingsocks/atom,synaptek/atom,FoldingText/atom,ali/atom,einarmagnus/atom,isghe/atom,kevinrenaers/atom,niklabh/atom,AdrianVovk/substance-ide,alfredxing/atom,h0dgep0dge/atom,bryonwinger/atom,oggy/atom,medovob/atom,deepfox/atom,scv119/atom,oggy/atom,ivoadf/atom,targeter21/atom,tisu2tisu/atom,bcoe/atom,ReddTea/atom,bencolon/atom,Mokolea/atom,dijs/atom,kandros/atom,me6iaton/atom,me6iaton/atom,tmunro/atom,sekcheong/atom,ashneo76/atom,efatsi/atom,pombredanne/atom,russlescai/atom,bencolon/atom,lisonma/atom,deoxilix/atom,gabrielPeart/atom,FIT-CSE2410-A-Bombs/atom,pkdevbox/atom,rlugojr/atom,harshdattani/atom,jacekkopecky/atom,lpommers/atom,tony612/atom,qskycolor/atom,deepfox/atom,basarat/atom,Shekharrajak/atom,champagnez/atom,bj7/atom,Jandersoft/atom,GHackAnonymous/atom,boomwaiza/atom,fang-yufeng/atom,mnquintana/atom,woss/atom,Arcanemagus/atom,BogusCurry/atom,PKRoma/atom,decaffeinate-examples/atom,ppamorim/atom,toqz/atom,rxkit/atom,vinodpanicker/atom,devmario/atom,boomwaiza/atom,Klozz/atom,0x73/atom,SlimeQ/atom,rmartin/atom,targeter21/atom,dannyflax/atom,beni55/atom,kittens/atom,mertkahyaoglu/atom,oggy/atom,Jandersolutions/atom,prembasumatary/atom,decaffeinate-examples/atom,PKRoma/atom,G-Baby/atom,BogusCurry/atom,CraZySacX/atom,paulcbetts/atom,bj7/atom,t9md/atom,mertkahyaoglu/atom,rmartin/atom,stuartquin/atom,hellendag/atom,kdheepak89/atom,vcarrera/atom,einarmagnus/atom,jordanbtucker/atom,FoldingText/atom,Galactix/atom,kjav/atom,kevinrenaers/atom,darwin/atom,Neron-X5/atom,hharchani/atom,pengshp/atom,helber/atom,ppamorim/atom,lovesnow/atom,burodepeper/atom,pombredanne/atom,sebmck/atom,acontreras89/atom,Locke23rus/atom,pengshp/atom,KENJU/atom,jtrose2/atom,mnquintana/atom,kc8wxm/atom,sxgao3001/atom,ilovezy/atom,001szymon/atom,sotayama
shita/atom,dijs/atom,qiujuer/atom,stinsonga/atom,liuxiong332/atom,bsmr-x-script/atom,tony612/atom,fredericksilva/atom,kjav/atom,avdg/atom,ali/atom,qskycolor/atom,constanzaurzua/atom,ppamorim/atom,sekcheong/atom,scippio/atom,n-riesco/atom,AdrianVovk/substance-ide,Rodjana/atom,fredericksilva/atom,crazyquark/atom,lisonma/atom,DiogoXRP/atom,synaptek/atom,Dennis1978/atom,florianb/atom,AlisaKiatkongkumthon/atom,avdg/atom,toqz/atom,fedorov/atom,matthewclendening/atom,lovesnow/atom,abcP9110/atom,kc8wxm/atom,stuartquin/atom,lovesnow/atom,g2p/atom,tmunro/atom,yalexx/atom,Ju2ender/atom,amine7536/atom,kjav/atom,codex8/atom,fedorov/atom,bcoe/atom,Jandersoft/atom,n-riesco/atom,hakatashi/atom,johnrizzo1/atom,scv119/atom,tjkr/atom,toqz/atom,isghe/atom,constanzaurzua/atom,pombredanne/atom,anuwat121/atom,brettle/atom,Austen-G/BlockBuilder,SlimeQ/atom,florianb/atom,Arcanemagus/atom,PKRoma/atom,jacekkopecky/atom,kaicataldo/atom,darwin/atom,yomybaby/atom,rlugojr/atom,efatsi/atom,Neron-X5/atom,gontadu/atom,bj7/atom,efatsi/atom,ykeisuke/atom,yamhon/atom,andrewleverette/atom,rookie125/atom,devmario/atom,MjAbuz/atom,qiujuer/atom,brettle/atom,omarhuanca/atom,rmartin/atom,stinsonga/atom,matthewclendening/atom,gisenberg/atom,mertkahyaoglu/atom,mnquintana/atom,sillvan/atom,Galactix/atom,lisonma/atom,FIT-CSE2410-A-Bombs/atom,transcranial/atom,kjav/atom,Jonekee/atom,liuxiong332/atom,amine7536/atom,Rychard/atom,xream/atom,me6iaton/atom,yomybaby/atom,helber/atom,targeter21/atom,davideg/atom,fedorov/atom,wiggzz/atom,devmario/atom,AlisaKiatkongkumthon/atom,hpham04/atom,Abdillah/atom,ppamorim/atom,vjeux/atom,gisenberg/atom,sekcheong/atom,brumm/atom,Locke23rus/atom,Huaraz2/atom,crazyquark/atom,kittens/atom,hpham04/atom,russlescai/atom,Andrey-Pavlov/atom,mostafaeweda/atom,Mokolea/atom,nucked/atom,sekcheong/atom,kittens/atom,ardeshirj/atom,sillvan/atom,ardeshirj/atom,codex8/atom,scv119/atom,gisenberg/atom,ykeisuke/atom,xream/atom,GHackAnonymous/atom,kaicataldo/atom,yalexx/atom,johnhaley81/atom,hharchani/
atom,sillvan/atom,elkingtonmcb/atom,jjz/atom,yangchenghu/atom,jjz/atom,vjeux/atom,nucked/atom,jtrose2/atom,ilovezy/atom,mrodalgaard/atom,AlexxNica/atom,nrodriguez13/atom | coffeescript | ## Code Before:
{CompositeDisposable} = require 'event-kit'
{callAttachHooks} = require './space-pen-extensions'
class PanelElement extends HTMLElement
createdCallback: ->
@subscriptions = new CompositeDisposable
initialize: (@model) ->
@appendChild(@getItemView())
@classList.add(@model.getClassName().split(' ')...) if @model.getClassName()?
@subscriptions.add @model.onDidChangeVisible(@visibleChanged.bind(this))
@subscriptions.add @model.onDidDestroy(@destroyed.bind(this))
this
getModel: -> @model
getItemView: ->
atom.views.getView(@model.getItem())
attachedCallback: ->
callAttachHooks(@getItemView()) # for backward compatibility with SpacePen views
@visibleChanged(@model.isVisible())
visibleChanged: (visible) ->
if visible
@style.display = null
else
@style.display = 'none'
destroyed: ->
@subscriptions.dispose()
@parentNode?.removeChild(this)
module.exports = PanelElement = document.registerElement 'atom-panel', prototype: PanelElement.prototype
## Instruction:
Allow PanelElements to be instantiated with markup
## Code After:
{CompositeDisposable} = require 'event-kit'
{callAttachHooks} = require './space-pen-extensions'
Panel = require './panel'
class PanelElement extends HTMLElement
createdCallback: ->
@subscriptions = new CompositeDisposable
initialize: (@model) ->
@appendChild(@getItemView())
@classList.add(@model.getClassName().split(' ')...) if @model.getClassName()?
@subscriptions.add @model.onDidChangeVisible(@visibleChanged.bind(this))
@subscriptions.add @model.onDidDestroy(@destroyed.bind(this))
this
getModel: ->
@model or= new Panel({})
getItemView: ->
atom.views.getView(@getModel().getItem())
attachedCallback: ->
callAttachHooks(@getItemView()) # for backward compatibility with SpacePen views
@visibleChanged(@getModel().isVisible())
visibleChanged: (visible) ->
if visible
@style.display = null
else
@style.display = 'none'
destroyed: ->
@subscriptions.dispose()
@parentNode?.removeChild(this)
module.exports = PanelElement = document.registerElement 'atom-panel', prototype: PanelElement.prototype
| {CompositeDisposable} = require 'event-kit'
{callAttachHooks} = require './space-pen-extensions'
+ Panel = require './panel'
class PanelElement extends HTMLElement
createdCallback: ->
@subscriptions = new CompositeDisposable
initialize: (@model) ->
@appendChild(@getItemView())
@classList.add(@model.getClassName().split(' ')...) if @model.getClassName()?
@subscriptions.add @model.onDidChangeVisible(@visibleChanged.bind(this))
@subscriptions.add @model.onDidDestroy(@destroyed.bind(this))
this
- getModel: -> @model
? -------
+ getModel: ->
+ @model or= new Panel({})
getItemView: ->
- atom.views.getView(@model.getItem())
? ^
+ atom.views.getView(@getModel().getItem())
? ^^^^ ++
attachedCallback: ->
callAttachHooks(@getItemView()) # for backward compatibility with SpacePen views
- @visibleChanged(@model.isVisible())
? ^
+ @visibleChanged(@getModel().isVisible())
? ^^^^ ++
visibleChanged: (visible) ->
if visible
@style.display = null
else
@style.display = 'none'
destroyed: ->
@subscriptions.dispose()
@parentNode?.removeChild(this)
module.exports = PanelElement = document.registerElement 'atom-panel', prototype: PanelElement.prototype | 8 | 0.228571 | 5 | 3 |
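Each record above ends with four numeric columns — diff length, relative diff length, lines added, lines deleted — matching the schema in the header. A minimal sketch of how the added/deleted counts can be derived from the old and new contents with Python's standard `difflib` (an assumed reconstruction for illustration, not the dataset's actual tooling):

```python
import difflib

def diff_stats(old: str, new: str):
    # Unified diff between old and new file contents; the first two
    # entries are the "---"/"+++" header lines, so they are skipped.
    body = list(difflib.unified_diff(old.splitlines(), new.splitlines(),
                                     lineterm=""))[2:]
    added = sum(1 for line in body if line.startswith("+"))
    deleted = sum(1 for line in body if line.startswith("-"))
    return added, deleted

old = "getModel: -> @model\n"
new = "getModel: ->\n  @model or= new Panel({})\n"
print(diff_stats(old, new))  # (2, 1): two lines added, one deleted
```

Identical inputs yield `(0, 0)`, since `unified_diff` emits nothing when there is no change.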
f7b43e5bc14f7fc03d085d695dbad4d910a21453 | wordpop.py | wordpop.py | import os
import redis
import json
import random
redis_client = redis.StrictRedis(host='localhost', port=6379, db=0)
key = redis_client.randomkey()
json_str = redis_client.get(key)
word_data = json.loads(json_str)
lexical_entries_length = len(word_data['results'][0]['lexicalEntries'])
random_entry = random.randint(0, lexical_entries_length - 1)
lexical_category = word_data['results'][0]['lexicalEntries'][random_entry]['lexicalCategory']
definition = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['definitions'][0]
#Following try-except fetches an example sentence for the word
try:
example_sentence = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['examples'][0]['text']
except LookupError:
#You will arrive here if there is no "text" key
example_sentence = "None"
word = word_data['results'][0]['word']
cmd = '/usr/bin/notify-send "' + word + ' | ' + lexical_category + '" "' + definition + '\n<b>Ex:</b> ' + example_sentence + '"'
print cmd
os.system(cmd)
| import os
import redis
import json
import random
redis_client = redis.StrictRedis(host='localhost', port=6379, db=0)
#Oxford API usually returns a list of synonyms
#Here we are only returning the first one
def fetch_synonym(synonyms):
if synonyms != "none":
return synonyms.split(',')[0]
else:
return None
key = redis_client.randomkey()
json_str = redis_client.get(key)
word_data = json.loads(json_str)
lexical_entries_length = len(word_data['results'][0]['lexicalEntries'])
random_entry = random.randint(0, lexical_entries_length - 1)
lexical_category = word_data['results'][0]['lexicalEntries'][random_entry]['lexicalCategory']
definition = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['definitions'][0]
#Following try-except fetches an example sentence for the word
try:
example_sentence = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['examples'][0]['text']
except LookupError:
#You will arrive here if there is no "text" key
example_sentence = "None"
word = word_data['results'][0]['word']
synonym = fetch_synonym(word_data['synonyms'])
#This is done for the sake of formatting
if synonym:
synonym = ' | ' + synonym
cmd = '/usr/bin/notify-send "' + word + ' | ' + lexical_category + synonym + '" "' + definition + '\nEx: ' + example_sentence + '"'
print cmd
os.system(cmd)
| Append synonym of the word in notification's title | Append synonym of the word in notification's title
| Python | mit | sbmthakur/wordpop | python | ## Code Before:
import os
import redis
import json
import random
redis_client = redis.StrictRedis(host='localhost', port=6379, db=0)
key = redis_client.randomkey()
json_str = redis_client.get(key)
word_data = json.loads(json_str)
lexical_entries_length = len(word_data['results'][0]['lexicalEntries'])
random_entry = random.randint(0, lexical_entries_length - 1)
lexical_category = word_data['results'][0]['lexicalEntries'][random_entry]['lexicalCategory']
definition = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['definitions'][0]
#Following try-except fetches an example sentence for the word
try:
example_sentence = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['examples'][0]['text']
except LookupError:
#You will arrive here if there is no "text" key
example_sentence = "None"
word = word_data['results'][0]['word']
cmd = '/usr/bin/notify-send "' + word + ' | ' + lexical_category + '" "' + definition + '\n<b>Ex:</b> ' + example_sentence + '"'
print cmd
os.system(cmd)
## Instruction:
Append synonym of the word in notification's title
## Code After:
import os
import redis
import json
import random
redis_client = redis.StrictRedis(host='localhost', port=6379, db=0)
#Oxford API usually returns a list of synonyms
#Here we are only returning the first one
def fetch_synonym(synonyms):
if synonyms != "none":
return synonyms.split(',')[0]
else:
return None
key = redis_client.randomkey()
json_str = redis_client.get(key)
word_data = json.loads(json_str)
lexical_entries_length = len(word_data['results'][0]['lexicalEntries'])
random_entry = random.randint(0, lexical_entries_length - 1)
lexical_category = word_data['results'][0]['lexicalEntries'][random_entry]['lexicalCategory']
definition = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['definitions'][0]
#Following try-except fetches an example sentence for the word
try:
example_sentence = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['examples'][0]['text']
except LookupError:
#You will arrive here if there is no "text" key
example_sentence = "None"
word = word_data['results'][0]['word']
synonym = fetch_synonym(word_data['synonyms'])
#This is done for the sake of formatting
if synonym:
synonym = ' | ' + synonym
cmd = '/usr/bin/notify-send "' + word + ' | ' + lexical_category + synonym + '" "' + definition + '\nEx: ' + example_sentence + '"'
print cmd
os.system(cmd)
| import os
import redis
import json
import random
redis_client = redis.StrictRedis(host='localhost', port=6379, db=0)
+
+ #Oxford API usually returns a list of synonyms
+ #Here we are only returning the first one
+ def fetch_synonym(synonyms):
+ if synonyms != "none":
+ return synonyms.split(',')[0]
+ else:
+ return None
key = redis_client.randomkey()
json_str = redis_client.get(key)
word_data = json.loads(json_str)
lexical_entries_length = len(word_data['results'][0]['lexicalEntries'])
random_entry = random.randint(0, lexical_entries_length - 1)
lexical_category = word_data['results'][0]['lexicalEntries'][random_entry]['lexicalCategory']
definition = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['definitions'][0]
#Following try-except fetches an example sentence for the word
try:
example_sentence = word_data['results'][0]['lexicalEntries'][random_entry]['entries'][0]['senses'][0]['examples'][0]['text']
except LookupError:
#You will arrive here if there is no "text" key
example_sentence = "None"
word = word_data['results'][0]['word']
+ synonym = fetch_synonym(word_data['synonyms'])
+
+ #This is done for the sake of formatting
+ if synonym:
+ synonym = ' | ' + synonym
+
- cmd = '/usr/bin/notify-send "' + word + ' | ' + lexical_category + '" "' + definition + '\n<b>Ex:</b> ' + example_sentence + '"'
? --- ----
+ cmd = '/usr/bin/notify-send "' + word + ' | ' + lexical_category + synonym + '" "' + definition + '\nEx: ' + example_sentence + '"'
? ++++++++++
print cmd
os.system(cmd) | 16 | 0.571429 | 15 | 1 |
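The `fetch_synonym` helper introduced by the record above can be exercised on its own. Note the behaviour inferred from the record (not from any API documentation): the synonyms field is assumed to be a comma-separated string, with the literal string "none" signalling that no synonyms exist.

```python
def fetch_synonym(synonyms):
    # First entry of a comma-separated synonym list, or None when the
    # field holds the sentinel string "none".
    if synonyms != "none":
        return synonyms.split(',')[0]
    return None

print(fetch_synonym("big,large,sizable"))  # big
print(fetch_synonym("none"))               # None
```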
30aa7dce0561e1fd8beeec94098a5d6a6f447a65 | src/test.py | src/test.py |
from __future__ import division
import numpy as np
from matplotlib import pyplot as plt
def main():
koeffs = [.3, 1.2, .1, 7]
p = np.poly1d(koeffs)
x = np.linspace(-2, 2, 100)
y = p(x) + 2 * np.random.randn(100) - 1
# fit
fit = np.polyfit(x, y, 3)
p_fit = np.poly1d(fit)
# plot
plt.scatter(x, y)
plt.plot(x, p_fit(x))
plt.show()
if __name__ == '__main__':
main()
|
from __future__ import division
import numpy as np
from matplotlib import pyplot as plt
def main():
koeffs = [.3, 1.2, .1, 7]
p = np.poly1d(koeffs)
x = np.linspace(-2, 2, 100)
y = p(x) + 2 * np.random.randn(100) - 1
# fit
fit = np.polyfit(x, y, 3)
p_fit = np.poly1d(fit)
print "Real coefficients:", koeffs
print "Fitted coefficients:", fit
# plot
plt.scatter(x, y)
plt.plot(x, p_fit(x))
plt.show()
if __name__ == '__main__':
main()
| Print real and fitted coeffs | Print real and fitted coeffs
| Python | mit | bbci/playground | python | ## Code Before:
from __future__ import division
import numpy as np
from matplotlib import pyplot as plt
def main():
koeffs = [.3, 1.2, .1, 7]
p = np.poly1d(koeffs)
x = np.linspace(-2, 2, 100)
y = p(x) + 2 * np.random.randn(100) - 1
# fit
fit = np.polyfit(x, y, 3)
p_fit = np.poly1d(fit)
# plot
plt.scatter(x, y)
plt.plot(x, p_fit(x))
plt.show()
if __name__ == '__main__':
main()
## Instruction:
Print real and fitted coeffs
## Code After:
from __future__ import division
import numpy as np
from matplotlib import pyplot as plt
def main():
koeffs = [.3, 1.2, .1, 7]
p = np.poly1d(koeffs)
x = np.linspace(-2, 2, 100)
y = p(x) + 2 * np.random.randn(100) - 1
# fit
fit = np.polyfit(x, y, 3)
p_fit = np.poly1d(fit)
print "Real coefficients:", koeffs
print "Fitted coefficients:", fit
# plot
plt.scatter(x, y)
plt.plot(x, p_fit(x))
plt.show()
if __name__ == '__main__':
main()
|
from __future__ import division
import numpy as np
from matplotlib import pyplot as plt
def main():
koeffs = [.3, 1.2, .1, 7]
p = np.poly1d(koeffs)
x = np.linspace(-2, 2, 100)
y = p(x) + 2 * np.random.randn(100) - 1
# fit
fit = np.polyfit(x, y, 3)
p_fit = np.poly1d(fit)
+ print "Real coefficients:", koeffs
+ print "Fitted coefficients:", fit
# plot
plt.scatter(x, y)
plt.plot(x, p_fit(x))
plt.show()
if __name__ == '__main__':
main() | 2 | 0.086957 | 2 | 0 |
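The record above fits a cubic with `np.polyfit` and prints the recovered coefficients next to the true ones. For the degree-1 case the same least-squares fit has a simple closed form, sketched here with the standard library only (an illustrative aside, not part of the record):

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a*x + b: slope is the ratio of
    # covariance to variance, intercept follows from the means.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1
print(fit_line(xs, ys))     # (2.0, 1.0)
```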
3ff55bf56dd2a774748c8a045b37208f2f7d64fd | README.md | README.md |
Dr. Xiaoshan Liu, DI Michael Fladischer
Folien im Markdown-Format für die LV **Grundlagen WEB** an der [FH
CAMPUS02](https://www.campus02.at/).
## Markdown zu HTML
Um aus den Markdown-Dateien HTML-Foliensätze zu erzeugen, wird
[mdpress](http://egonschiele.github.io/mdpress/) benötigt.
Der Aufruf für `mdpress` ist für jede Markdown-Datei wie folgt:
mdpress -s campus02 <name>.md
This will create the HTML in the folder `<name>`.
|
Dr. Xiaoshan Liu, DI Michael Fladischer
Folien im reStructuredText-Format für die LV **Grundlagen WEB** an der [FH
CAMPUS02](https://www.campus02.at/).
## reStructuredText zu HTML
Um aus den reStructuredText-Dateien HTML-Foliensätze zu erzeugen, wird
[hovercraft](https://pypi.python.org/pypi/hovercraft) benötigt.
Der Aufruf für `mdpress` ist für jede Markdown-Datei wie folgt:
hovercraft <name>.rst <directory>
This will create the HTML in the folder `<directory>`.
To install `hovercraft` on Debian using `apt`:
sudo apt install hovercraft
| Update documentation for hovercraft usage. | Update documentation for hovercraft usage.
| Markdown | mit | fladi/campus02-web | markdown | ## Code Before:
Dr. Xiaoshan Liu, DI Michael Fladischer
Folien im Markdown-Format für die LV **Grundlagen WEB** an der [FH
CAMPUS02](https://www.campus02.at/).
## Markdown zu HTML
Um aus den Markdown-Dateien HTML-Foliensätze zu erzeugen, wird
[mdpress](http://egonschiele.github.io/mdpress/) benötigt.
Der Aufruf für `mdpress` ist für jede Markdown-Datei wie folgt:
mdpress -s campus02 <name>.md
This will create the HTML in the folder `<name>`.
## Instruction:
Update documentation for hovercraft usage.
## Code After:
Dr. Xiaoshan Liu, DI Michael Fladischer
Folien im reStructuredText-Format für die LV **Grundlagen WEB** an der [FH
CAMPUS02](https://www.campus02.at/).
## reStructuredText zu HTML
Um aus den reStructuredText-Dateien HTML-Foliensätze zu erzeugen, wird
[hovercraft](https://pypi.python.org/pypi/hovercraft) benötigt.
Der Aufruf für `mdpress` ist für jede Markdown-Datei wie folgt:
hovercraft <name>.rst <directory>
This will create the HTML in the folder `<directory>`.
To install `hovercraft` on Debian using `apt`:
sudo apt install hovercraft
|
Dr. Xiaoshan Liu, DI Michael Fladischer
- Folien im Markdown-Format für die LV **Grundlagen WEB** an der [FH
? -- ^ ^^^
+ Folien im reStructuredText-Format für die LV **Grundlagen WEB** an der [FH
? ^^^^^^^^^^ ^^^^
CAMPUS02](https://www.campus02.at/).
- ## Markdown zu HTML
+ ## reStructuredText zu HTML
- Um aus den Markdown-Dateien HTML-Foliensätze zu erzeugen, wird
? -- ^ ^^^
+ Um aus den reStructuredText-Dateien HTML-Foliensätze zu erzeugen, wird
? ^^^^^^^^^^ ^^^^
- [mdpress](http://egonschiele.github.io/mdpress/) benötigt.
+ [hovercraft](https://pypi.python.org/pypi/hovercraft) benötigt.
Der Aufruf für `mdpress` ist für jede Markdown-Datei wie folgt:
- mdpress -s campus02 <name>.md
+ hovercraft <name>.rst <directory>
- This will create the HTML in the folder `<name>`.
? ^^^
+ This will create the HTML in the folder `<directory>`.
? ^^^ +++++
+
+ To install `hovercraft` on Debian using `apt`:
+
+ sudo apt install hovercraft | 16 | 1 | 10 | 6 |
42dd64916e97dd52829f4ca2d3f8af39ed3b0b82 | .travis.yml | .travis.yml | language: python
python:
- "2.6"
- "2.7"
- "3.3"
install:
- "pip install -e ."
- "pip install coverage coveralls"
script:
- "coverage run `which django-admin.py` --settings django_prbac.mock_settings test"
after_success:
- coverage report
- coveralls
| language: python
python:
- "2.6"
- "2.7"
- "3.3"
install:
- "pip install -e ."
- "pip install coverage coveralls"
script:
- "coverage run `which django-admin.py` test django_prbac --settings django_prbac.mock_settings"
after_success:
- coverage report
- coveralls
| Fix arg order to django admin | Fix arg order to django admin
| YAML | bsd-3-clause | chillbear/django-prbac | yaml | ## Code Before:
language: python
python:
- "2.6"
- "2.7"
- "3.3"
install:
- "pip install -e ."
- "pip install coverage coveralls"
script:
- "coverage run `which django-admin.py` --settings django_prbac.mock_settings test"
after_success:
- coverage report
- coveralls
## Instruction:
Fix arg order to django admin
## Code After:
language: python
python:
- "2.6"
- "2.7"
- "3.3"
install:
- "pip install -e ."
- "pip install coverage coveralls"
script:
- "coverage run `which django-admin.py` test django_prbac --settings django_prbac.mock_settings"
after_success:
- coverage report
- coveralls
| language: python
python:
- "2.6"
- "2.7"
- "3.3"
install:
- "pip install -e ."
- "pip install coverage coveralls"
script:
- - "coverage run `which django-admin.py` --settings django_prbac.mock_settings test"
? -----
+ - "coverage run `which django-admin.py` test django_prbac --settings django_prbac.mock_settings"
? ++++++++++++++++++
after_success:
- coverage report
- coveralls
| 2 | 0.142857 | 1 | 1 |
52cf28429f2eb93225b507043ee67f10a275ab0f | wagtail/wagtailadmin/templates/wagtailadmin/chooser/_link_types.html | wagtail/wagtailadmin/templates/wagtailadmin/chooser/_link_types.html | {% load i18n %}
{% if allow_external_link or allow_email_link or current == 'external' or current == 'email' %}
<p class="link-types">
{% if current == 'internal' %}
<b>{% trans "Internal link" %}</b>
{% else %}
<a href="{% url 'wagtailadmin_choose_page' %}?{{ querystring }}">{% trans "Internal link" %}</a>
{% endif %}
{% if current == 'external' %}
| <b>{% trans "External link" %}</b>
{% elif allow_external_link %}
| <a href="{% url 'wagtailadmin_choose_page_external_link' %}?{{ querystring }}">{% trans "External link" %}</a>
{% endif %}
{% if current == 'email' %}
| <b>{% trans "Email link" %}</b>
{% elif allow_email_link %}
| <a href="{% url 'wagtailadmin_choose_page_email_link' %}?{{ querystring }}">{% trans "Email link" %}</a>
{% endif %}
</p>
{% endif %}
| {% load i18n wagtailadmin_tags %}
{% if allow_external_link or allow_email_link or current == 'external' or current == 'email' %}
<p class="link-types">
{% if current == 'internal' %}
<b>{% trans "Internal link" %}</b>
{% else %}
<a href="{% url 'wagtailadmin_choose_page' %}{% querystring p=None %}">{% trans "Internal link" %}</a>
{% endif %}
{% if current == 'external' %}
| <b>{% trans "External link" %}</b>
{% elif allow_external_link %}
| <a href="{% url 'wagtailadmin_choose_page_external_link' %}{% querystring p=None %}">{% trans "External link" %}</a>
{% endif %}
{% if current == 'email' %}
| <b>{% trans "Email link" %}</b>
{% elif allow_email_link %}
| <a href="{% url 'wagtailadmin_choose_page_email_link' %}{% querystring p=None %}">{% trans "Email link" %}</a>
{% endif %}
</p>
{% endif %}
| Use {% querystring %} tag to propagate querystring in the link chooser 'link type' nav links. | Use {% querystring %} tag to propagate querystring in the link chooser 'link type' nav links.
This is necessary now that we don't pass a 'querystring' context var any more. We set p (the page number param) to None because the tree browser's parent page ID is not preserved in this navigation (it's part of the URL path rather than the querystring), so there's no point preserving the page number.
| HTML | bsd-3-clause | takeflight/wagtail,FlipperPA/wagtail,jnns/wagtail,kaedroho/wagtail,rsalmaso/wagtail,serzans/wagtail,nilnvoid/wagtail,hanpama/wagtail,nutztherookie/wagtail,quru/wagtail,nealtodd/wagtail,jnns/wagtail,FlipperPA/wagtail,kurtrwall/wagtail,Tivix/wagtail,rsalmaso/wagtail,davecranwell/wagtail,wagtail/wagtail,nimasmi/wagtail,timorieber/wagtail,quru/wagtail,gogobook/wagtail,Toshakins/wagtail,thenewguy/wagtail,gogobook/wagtail,nealtodd/wagtail,rsalmaso/wagtail,mikedingjan/wagtail,mikedingjan/wagtail,gasman/wagtail,timorieber/wagtail,nimasmi/wagtail,JoshBarr/wagtail,takeflight/wagtail,hamsterbacke23/wagtail,nealtodd/wagtail,inonit/wagtail,wagtail/wagtail,mikedingjan/wagtail,thenewguy/wagtail,Tivix/wagtail,davecranwell/wagtail,mixxorz/wagtail,nutztherookie/wagtail,wagtail/wagtail,thenewguy/wagtail,FlipperPA/wagtail,kurtrwall/wagtail,zerolab/wagtail,hamsterbacke23/wagtail,kurtw/wagtail,torchbox/wagtail,inonit/wagtail,mixxorz/wagtail,torchbox/wagtail,timorieber/wagtail,chrxr/wagtail,davecranwell/wagtail,takeflight/wagtail,JoshBarr/wagtail,JoshBarr/wagtail,nilnvoid/wagtail,iansprice/wagtail,davecranwell/wagtail,kaedroho/wagtail,mixxorz/wagtail,mixxorz/wagtail,torchbox/wagtail,kaedroho/wagtail,gogobook/wagtail,gasman/wagtail,kaedroho/wagtail,zerolab/wagtail,Toshakins/wagtail,wagtail/wagtail,zerolab/wagtail,mixxorz/wagtail,kurtw/wagtail,gogobook/wagtail,rsalmaso/wagtail,thenewguy/wagtail,kurtrwall/wagtail,kurtw/wagtail,thenewguy/wagtail,hanpama/wagtail,hanpama/wagtail,jnns/wagtail,Tivix/wagtail,kurtrwall/wagtail,wagtail/wagtail,gasman/wagtail,iansprice/wagtail,JoshBarr/wagtail,Toshakins/wagtail,chrxr/wagtail,kurtw/wagtail,zerolab/wagtail,Toshakins/wagtail,nealtodd/wagtail,chrxr/wagtail,nilnvoid/wagtail,hamsterbacke23/wagtail,gasman/wagtail,chrxr/wagtail,rsalmaso/wagtail,nutztherookie/wagtail,nimasmi/wagtail,nilnvoid/wagtail,iansprice/wagtail,quru/wagtail,serzans/wagtail,takeflight/wagtail,inonit/wagtail,timorieber/wagtail,hamsterbacke23/wagtail,serzans/wagtail
,iansprice/wagtail,quru/wagtail,hanpama/wagtail,kaedroho/wagtail,mikedingjan/wagtail,inonit/wagtail,gasman/wagtail,nutztherookie/wagtail,jnns/wagtail,zerolab/wagtail,nimasmi/wagtail,Tivix/wagtail,FlipperPA/wagtail,torchbox/wagtail,serzans/wagtail | html | ## Code Before:
{% load i18n %}
{% if allow_external_link or allow_email_link or current == 'external' or current == 'email' %}
<p class="link-types">
{% if current == 'internal' %}
<b>{% trans "Internal link" %}</b>
{% else %}
<a href="{% url 'wagtailadmin_choose_page' %}?{{ querystring }}">{% trans "Internal link" %}</a>
{% endif %}
{% if current == 'external' %}
| <b>{% trans "External link" %}</b>
{% elif allow_external_link %}
| <a href="{% url 'wagtailadmin_choose_page_external_link' %}?{{ querystring }}">{% trans "External link" %}</a>
{% endif %}
{% if current == 'email' %}
| <b>{% trans "Email link" %}</b>
{% elif allow_email_link %}
| <a href="{% url 'wagtailadmin_choose_page_email_link' %}?{{ querystring }}">{% trans "Email link" %}</a>
{% endif %}
</p>
{% endif %}
## Instruction:
Use {% querystring %} tag to propagate querystring in the link chooser 'link type' nav links.
This is necessary now that we don't pass a 'querystring' context var any more. We set p (the page number param) to None because the tree browser's parent page ID is not preserved in this navigation (it's part of the URL path rather than the querystring), so there's no point preserving the page number.
## Code After:
{% load i18n wagtailadmin_tags %}
{% if allow_external_link or allow_email_link or current == 'external' or current == 'email' %}
<p class="link-types">
{% if current == 'internal' %}
<b>{% trans "Internal link" %}</b>
{% else %}
<a href="{% url 'wagtailadmin_choose_page' %}{% querystring p=None %}">{% trans "Internal link" %}</a>
{% endif %}
{% if current == 'external' %}
| <b>{% trans "External link" %}</b>
{% elif allow_external_link %}
| <a href="{% url 'wagtailadmin_choose_page_external_link' %}{% querystring p=None %}">{% trans "External link" %}</a>
{% endif %}
{% if current == 'email' %}
| <b>{% trans "Email link" %}</b>
{% elif allow_email_link %}
| <a href="{% url 'wagtailadmin_choose_page_email_link' %}{% querystring p=None %}">{% trans "Email link" %}</a>
{% endif %}
</p>
{% endif %}
| - {% load i18n %}
+ {% load i18n wagtailadmin_tags %}
{% if allow_external_link or allow_email_link or current == 'external' or current == 'email' %}
<p class="link-types">
{% if current == 'internal' %}
<b>{% trans "Internal link" %}</b>
{% else %}
- <a href="{% url 'wagtailadmin_choose_page' %}?{{ querystring }}">{% trans "Internal link" %}</a>
? - ^ ^
+ <a href="{% url 'wagtailadmin_choose_page' %}{% querystring p=None %}">{% trans "Internal link" %}</a>
? ^ ^^^^^^^^
{% endif %}
{% if current == 'external' %}
| <b>{% trans "External link" %}</b>
{% elif allow_external_link %}
- | <a href="{% url 'wagtailadmin_choose_page_external_link' %}?{{ querystring }}">{% trans "External link" %}</a>
? - ^ ^
+ | <a href="{% url 'wagtailadmin_choose_page_external_link' %}{% querystring p=None %}">{% trans "External link" %}</a>
? ^ ^^^^^^^^
{% endif %}
{% if current == 'email' %}
| <b>{% trans "Email link" %}</b>
{% elif allow_email_link %}
- | <a href="{% url 'wagtailadmin_choose_page_email_link' %}?{{ querystring }}">{% trans "Email link" %}</a>
? - ^ ^
+ | <a href="{% url 'wagtailadmin_choose_page_email_link' %}{% querystring p=None %}">{% trans "Email link" %}</a>
? ^ ^^^^^^^^
{% endif %}
</p>
{% endif %} | 8 | 0.363636 | 4 | 4 |
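An aside on the record above: the patch swaps a hand-built `?{{ querystring }}` suffix for a `{% querystring %}` template tag that merges the current query string and treats `p=None` as "drop the page-number param". That merge-with-removal behaviour can be sketched with Python's standard library (the function name and parameters here are illustrative, not Wagtail's actual implementation):

```python
from urllib.parse import parse_qsl, urlencode

def querystring(current_qs, **updates):
    """Re-encode current_qs with overrides applied; a value of None
    removes the key - mirroring `{% querystring p=None %}` above."""
    params = dict(parse_qsl(current_qs))
    for key, value in updates.items():
        if value is None:
            params.pop(key, None)   # drop the param entirely
        else:
            params[key] = str(value)
    return "?" + urlencode(params) if params else ""

# Page number dropped, everything else carried over:
print(querystring("p=3&allow_email_link=1", p=None))  # ?allow_email_link=1
# Overriding a param instead of dropping it:
print(querystring("p=3&allow_email_link=1", p=7))     # ?p=7&allow_email_link=1
```

Setting `p=None` rather than preserving it matches the commit message: the tree browser's parent page is in the URL path, so a stale page number is not worth keeping.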
d06faa8adbe6de6cb21ff6e37809ca7dbc7c473d | script/generate_and_translate_po_files.sh | script/generate_and_translate_po_files.sh |
set -u
set -o errexit
working_dir=`dirname $0`
cd $working_dir
cd ..
xgettext.pl -w -v -v -v -P perl=* -P tt2=* --output=lib/MediaWords/I18N/messages.pot --directory=root/zoe_website_template/ --directory=lib/
sed -i -e 's/Content-Type: text\/plain; charset=CHARSET/Content-Type: text\/plain; charset=UTF8/' lib/MediaWords/I18N/messages.pot
cp lib/MediaWords/I18N/messages.pot lib/MediaWords/I18N/en.po
autotranslate-po.pl -f en -t ru -i lib/MediaWords/I18N/messages.pot -o lib/MediaWords/I18N/ru.po
|
set -u
set -o errexit
working_dir=`dirname $0`
cd $working_dir
cd ..
./script/run_carton.sh exec -- xgettext.pl -w -v -v -v -P perl=* -P tt2=* --output=lib/MediaWords/I18N/messages.pot --directory=root/public_ui/ --directory=lib/
sed -i -e 's/Content-Type: text\/plain; charset=CHARSET/Content-Type: text\/plain; charset=UTF8/' lib/MediaWords/I18N/messages.pot
cp lib/MediaWords/I18N/messages.pot lib/MediaWords/I18N/en.po
echo "Starting autotranslate"
./script/run_carton.sh exec -- autotranslate-po.pl -f en -t ru -i lib/MediaWords/I18N/messages.pot -o lib/MediaWords/I18N/ru.po
| Fix script to use carton. The Russian translation is broken now that Google has stopped offering Google Translate but at least there are no compile errors.!!! | Fix script to use carton.
The Russian translation is broken now that Google has stopped offering Google Translate but at least there are no compile errors.!!!
| Shell | agpl-3.0 | berkmancenter/mediacloud,AchyuthIIIT/mediacloud,berkmancenter/mediacloud,AchyuthIIIT/mediacloud,AchyuthIIIT/mediacloud,AchyuthIIIT/mediacloud,berkmancenter/mediacloud,AchyuthIIIT/mediacloud,AchyuthIIIT/mediacloud,berkmancenter/mediacloud,AchyuthIIIT/mediacloud,AchyuthIIIT/mediacloud,berkmancenter/mediacloud,AchyuthIIIT/mediacloud | shell | ## Code Before:
set -u
set -o errexit
working_dir=`dirname $0`
cd $working_dir
cd ..
xgettext.pl -w -v -v -v -P perl=* -P tt2=* --output=lib/MediaWords/I18N/messages.pot --directory=root/zoe_website_template/ --directory=lib/
sed -i -e 's/Content-Type: text\/plain; charset=CHARSET/Content-Type: text\/plain; charset=UTF8/' lib/MediaWords/I18N/messages.pot
cp lib/MediaWords/I18N/messages.pot lib/MediaWords/I18N/en.po
autotranslate-po.pl -f en -t ru -i lib/MediaWords/I18N/messages.pot -o lib/MediaWords/I18N/ru.po
## Instruction:
Fix script to use carton.
The Russian translation is broken now that Google has stopped offering Google Translate but at least there are no compile errors.!!!
## Code After:
set -u
set -o errexit
working_dir=`dirname $0`
cd $working_dir
cd ..
./script/run_carton.sh exec -- xgettext.pl -w -v -v -v -P perl=* -P tt2=* --output=lib/MediaWords/I18N/messages.pot --directory=root/public_ui/ --directory=lib/
sed -i -e 's/Content-Type: text\/plain; charset=CHARSET/Content-Type: text\/plain; charset=UTF8/' lib/MediaWords/I18N/messages.pot
cp lib/MediaWords/I18N/messages.pot lib/MediaWords/I18N/en.po
echo "Starting autotranslate"
./script/run_carton.sh exec -- autotranslate-po.pl -f en -t ru -i lib/MediaWords/I18N/messages.pot -o lib/MediaWords/I18N/ru.po
|
set -u
set -o errexit
working_dir=`dirname $0`
cd $working_dir
cd ..
- xgettext.pl -w -v -v -v -P perl=* -P tt2=* --output=lib/MediaWords/I18N/messages.pot --directory=root/zoe_website_template/ --directory=lib/
? ^^^ ^^^^ -----------
+ ./script/run_carton.sh exec -- xgettext.pl -w -v -v -v -P perl=* -P tt2=* --output=lib/MediaWords/I18N/messages.pot --directory=root/public_ui/ --directory=lib/
? +++++++++++++++++++++++++++++++ ^^^^^^ ^
sed -i -e 's/Content-Type: text\/plain; charset=CHARSET/Content-Type: text\/plain; charset=UTF8/' lib/MediaWords/I18N/messages.pot
cp lib/MediaWords/I18N/messages.pot lib/MediaWords/I18N/en.po
+ echo "Starting autotranslate"
- autotranslate-po.pl -f en -t ru -i lib/MediaWords/I18N/messages.pot -o lib/MediaWords/I18N/ru.po
+ ./script/run_carton.sh exec -- autotranslate-po.pl -f en -t ru -i lib/MediaWords/I18N/messages.pot -o lib/MediaWords/I18N/ru.po
? +++++++++++++++++++++++++++++++
- | 6 | 0.4 | 3 | 3 |
8669b8fbefe61ceb51d8d861503fbd3d55058e1e | objc/MX3Snapshot.mm | objc/MX3Snapshot.mm | using mx3::ObjcAdapter;
@implementation MX3Snapshot {
std::unique_ptr<mx3::SqlSnapshot> __snapshot;
}
- (instancetype) initWithSnapshot:(std::unique_ptr<mx3::SqlSnapshot>)snapshot {
if(!(self = [super init])) {
return nil;
}
// when you assign one unique_ptr to another, you have to move it because there can only ever
// be one copy of a unique pointer at a time. If you need to be able to "copy" pointers, use shared_ptr
__snapshot = std::move(snapshot);
return self;
}
- (void) dealloc {
__snapshot = nullptr;
}
- (NSString *) rowAtIndex:(NSUInteger)index {
auto row = __snapshot->at(static_cast<size_t>(index));
return ObjcAdapter::convert(row);
}
- (NSUInteger) count {
return static_cast<NSUInteger>( __snapshot->size() );
}
@end
| using mx3::ObjcAdapter;
@implementation MX3Snapshot {
std::unique_ptr<mx3::SqlSnapshot> __snapshot;
}
- (instancetype) initWithSnapshot:(std::unique_ptr<mx3::SqlSnapshot>)snapshot {
self = [super init];
if (self) {
// when you assign one unique_ptr to another, you have to move it because there can only ever
// be one copy of a unique pointer at a time. If you need to be able to "copy" pointers, use shared_ptr
__snapshot = std::move(snapshot);
}
return self;
}
- (void) dealloc {
__snapshot = nullptr;
}
- (NSString *) rowAtIndex:(NSUInteger)index {
auto row = __snapshot->at(static_cast<size_t>(index));
return ObjcAdapter::convert(row);
}
- (NSUInteger) count {
return static_cast<NSUInteger>( __snapshot->size() );
}
@end
| Make init more objective-c like | Make init more objective-c like | Objective-C++ | mit | libmx3/mx3,duydb2/mx3,duydb2/mx3,wesselj1/mx3playground,libmx3/mx3,wesselj1/mx3playground,duydb2/mx3,libmx3/mx3,wesselj1/mx3playground | objective-c++ | ## Code Before:
using mx3::ObjcAdapter;
@implementation MX3Snapshot {
std::unique_ptr<mx3::SqlSnapshot> __snapshot;
}
- (instancetype) initWithSnapshot:(std::unique_ptr<mx3::SqlSnapshot>)snapshot {
if(!(self = [super init])) {
return nil;
}
// when you assign one unique_ptr to another, you have to move it because there can only ever
// be one copy of a unique pointer at a time. If you need to be able to "copy" pointers, use shared_ptr
__snapshot = std::move(snapshot);
return self;
}
- (void) dealloc {
__snapshot = nullptr;
}
- (NSString *) rowAtIndex:(NSUInteger)index {
auto row = __snapshot->at(static_cast<size_t>(index));
return ObjcAdapter::convert(row);
}
- (NSUInteger) count {
return static_cast<NSUInteger>( __snapshot->size() );
}
@end
## Instruction:
Make init more objective-c like
## Code After:
using mx3::ObjcAdapter;
@implementation MX3Snapshot {
std::unique_ptr<mx3::SqlSnapshot> __snapshot;
}
- (instancetype) initWithSnapshot:(std::unique_ptr<mx3::SqlSnapshot>)snapshot {
self = [super init];
if (self) {
// when you assign one unique_ptr to another, you have to move it because there can only ever
// be one copy of a unique pointer at a time. If you need to be able to "copy" pointers, use shared_ptr
__snapshot = std::move(snapshot);
}
return self;
}
- (void) dealloc {
__snapshot = nullptr;
}
- (NSString *) rowAtIndex:(NSUInteger)index {
auto row = __snapshot->at(static_cast<size_t>(index));
return ObjcAdapter::convert(row);
}
- (NSUInteger) count {
return static_cast<NSUInteger>( __snapshot->size() );
}
@end
| using mx3::ObjcAdapter;
@implementation MX3Snapshot {
std::unique_ptr<mx3::SqlSnapshot> __snapshot;
}
- (instancetype) initWithSnapshot:(std::unique_ptr<mx3::SqlSnapshot>)snapshot {
- if(!(self = [super init])) {
? ^^^^^ ^^^^
+ self = [super init];
? ^^ ^
+ if (self) {
- return nil;
- }
- // when you assign one unique_ptr to another, you have to move it because there can only ever
+ // when you assign one unique_ptr to another, you have to move it because there can only ever
? ++++++
- // be one copy of a unique pointer at a time. If you need to be able to "copy" pointers, use shared_ptr
+ // be one copy of a unique pointer at a time. If you need to be able to "copy" pointers, use shared_ptr
? ++++++
- __snapshot = std::move(snapshot);
+ __snapshot = std::move(snapshot);
? ++++++
+ }
+
return self;
}
- (void) dealloc {
__snapshot = nullptr;
}
- (NSString *) rowAtIndex:(NSUInteger)index {
auto row = __snapshot->at(static_cast<size_t>(index));
return ObjcAdapter::convert(row);
}
- (NSUInteger) count {
return static_cast<NSUInteger>( __snapshot->size() );
}
@end | 13 | 0.433333 | 7 | 6 |
16335ef9bb274ead8f8868ff5ef77150bbbc7ec3 | app/views/spots/show.html.erb | app/views/spots/show.html.erb | <%= link_to 'Back', spots_path %><br>
<h1><%= @spot.name %></h1>
<%= link_to 'Edit', edit_spot_path(@spot) %><br>
<%= @spot.address %><br>
<%= @spot.phone %><br>
<% if @spot.website %>
<%= @spot.website %><br>
<% end %>
<%#= show_price(@spot.price) %><br>
<%= show_photo(@spot.photo) if @spot.photo %>
<%= button_to "Leave Comment", { action: "new"}, form_class: "comment" %>
<% @spot.comments.each do |comment| %>
<%= comment.body %>
<% end %> | <%= link_to 'Back', spots_path %><br>
<h1><%= @spot.name %></h1>
<%= link_to 'Edit', edit_spot_path(@spot) %><br>
<%= @spot.address %><br>
<%= @spot.phone %><br>
<% if @spot.website %>
<%= @spot.website %><br>
<% end %>
<%#= show_price(@spot.price) %><br>
<%= show_photo(@spot.photo) if @spot.photo %>
<%= button_to "Leave Comment", new_spot_comment_path(@spot), method: :get %>
<% @spot.comments.each do |comment| %>
<%= comment.body %><br>
<% end %> | Fix logic for leaving a comment on spot | Fix logic for leaving a comment on spot
| HTML+ERB | mit | mud-turtles-2014/TheSpot,mud-turtles-2014/TheSpot,mud-turtles-2014/TheSpot | html+erb | ## Code Before:
<%= link_to 'Back', spots_path %><br>
<h1><%= @spot.name %></h1>
<%= link_to 'Edit', edit_spot_path(@spot) %><br>
<%= @spot.address %><br>
<%= @spot.phone %><br>
<% if @spot.website %>
<%= @spot.website %><br>
<% end %>
<%#= show_price(@spot.price) %><br>
<%= show_photo(@spot.photo) if @spot.photo %>
<%= button_to "Leave Comment", { action: "new"}, form_class: "comment" %>
<% @spot.comments.each do |comment| %>
<%= comment.body %>
<% end %>
## Instruction:
Fix logic for leaving a comment on spot
## Code After:
<%= link_to 'Back', spots_path %><br>
<h1><%= @spot.name %></h1>
<%= link_to 'Edit', edit_spot_path(@spot) %><br>
<%= @spot.address %><br>
<%= @spot.phone %><br>
<% if @spot.website %>
<%= @spot.website %><br>
<% end %>
<%#= show_price(@spot.price) %><br>
<%= show_photo(@spot.photo) if @spot.photo %>
<%= button_to "Leave Comment", new_spot_comment_path(@spot), method: :get %>
<% @spot.comments.each do |comment| %>
<%= comment.body %><br>
<% end %> | <%= link_to 'Back', spots_path %><br>
<h1><%= @spot.name %></h1>
<%= link_to 'Edit', edit_spot_path(@spot) %><br>
<%= @spot.address %><br>
<%= @spot.phone %><br>
<% if @spot.website %>
<%= @spot.website %><br>
<% end %>
<%#= show_price(@spot.price) %><br>
<%= show_photo(@spot.photo) if @spot.photo %>
- <%= button_to "Leave Comment", { action: "new"}, form_class: "comment" %>
+ <%= button_to "Leave Comment", new_spot_comment_path(@spot), method: :get %>
<% @spot.comments.each do |comment| %>
- <%= comment.body %>
+ <%= comment.body %><br>
? ++++
<% end %> | 4 | 0.266667 | 2 | 2 |
5b063a9117ba211460fa7e6b8356770080f97c3b | src/System.Private.ServiceModel/tools/scripts/BuildCertUtil.cmd | src/System.Private.ServiceModel/tools/scripts/BuildCertUtil.cmd | @echo off
setlocal
if not defined VisualStudioVersion (
if defined VS140COMNTOOLS (
call "%VS140COMNTOOLS%\VsDevCmd.bat"
goto :EnvSet
)
if defined VS120COMNTOOLS (
call "%VS120COMNTOOLS%\VsDevCmd.bat"
goto :EnvSet
)
echo Error: %~nx0 requires Visual Studio 2013 or 2015.
echo Please see https://github.com/dotnet/wcf/blob/master/Documentation/developer-guide.md for build instructions.
exit /b 1
)
:EnvSet
:: Log build command line
set _buildproj=%~dp0..\CertificateGenerator\CertificateGenerator.sln
set _buildlog=%~dp0..\..\..\..\msbuildCertificateGenerator.log
set _buildprefix=echo
set _buildpostfix=^> "%_buildlog%"
set cmd=msbuild /p:Configuration=Release /t:restore;build "%_buildproj%" /nologo /maxcpucount /verbosity:minimal /nodeReuse:false /fileloggerparameters:Verbosity=diag;LogFile="%_buildlog%";Append %*
echo %cmd%
%cmd%
set BUILDERRORLEVEL=%ERRORLEVEL%
:AfterBuild
echo.
:: Pull the build summary from the log file
findstr /ir /c:".*Warning(s)" /c:".*Error(s)" /c:"Time Elapsed.*" "%_buildlog%"
echo Build Exit Code = %BUILDERRORLEVEL%
exit /b %BUILDERRORLEVEL%
| @echo off
setlocal
if not defined VisualStudioVersion (
if defined VS160COMNTOOLS (
call "%VS160COMNTOOLS%\VsDevCmd.bat"
goto :EnvSet
)
echo Error: %~nx0 requires Visual Studio 2019 because the .NET Core 3.0 SDK is needed.
echo Please see https://github.com/dotnet/wcf/blob/master/Documentation/developer-guide.md for build instructions.
exit /b 1
)
:EnvSet
:: Log build command line
set _buildproj=%~dp0..\CertificateGenerator\CertificateGenerator.sln
set _buildlog=%~dp0..\..\..\..\msbuildCertificateGenerator.log
set _buildprefix=echo
set _buildpostfix=^> "%_buildlog%"
set cmd=msbuild /p:Configuration=Release /t:restore;build "%_buildproj%" /nologo /maxcpucount /verbosity:minimal /nodeReuse:false /fileloggerparameters:Verbosity=diag;LogFile="%_buildlog%";Append %*
echo %cmd%
%cmd%
set BUILDERRORLEVEL=%ERRORLEVEL%
:AfterBuild
echo.
:: Pull the build summary from the log file
findstr /ir /c:".*Warning(s)" /c:".*Error(s)" /c:"Time Elapsed.*" "%_buildlog%"
echo Build Exit Code = %BUILDERRORLEVEL%
exit /b %BUILDERRORLEVEL%
| Fix script to require VS 2019. | Fix script to require VS 2019.
* Since converting to .NET 3.0 SDK projects we now need that SDK in order to build.
| Batchfile | mit | StephenBonikowsky/wcf,imcarolwang/wcf,dotnet/wcf,mconnew/wcf,imcarolwang/wcf,mconnew/wcf,dotnet/wcf,StephenBonikowsky/wcf,mconnew/wcf,dotnet/wcf,imcarolwang/wcf | batchfile | ## Code Before:
@echo off
setlocal
if not defined VisualStudioVersion (
if defined VS140COMNTOOLS (
call "%VS140COMNTOOLS%\VsDevCmd.bat"
goto :EnvSet
)
if defined VS120COMNTOOLS (
call "%VS120COMNTOOLS%\VsDevCmd.bat"
goto :EnvSet
)
echo Error: %~nx0 requires Visual Studio 2013 or 2015.
echo Please see https://github.com/dotnet/wcf/blob/master/Documentation/developer-guide.md for build instructions.
exit /b 1
)
:EnvSet
:: Log build command line
set _buildproj=%~dp0..\CertificateGenerator\CertificateGenerator.sln
set _buildlog=%~dp0..\..\..\..\msbuildCertificateGenerator.log
set _buildprefix=echo
set _buildpostfix=^> "%_buildlog%"
set cmd=msbuild /p:Configuration=Release /t:restore;build "%_buildproj%" /nologo /maxcpucount /verbosity:minimal /nodeReuse:false /fileloggerparameters:Verbosity=diag;LogFile="%_buildlog%";Append %*
echo %cmd%
%cmd%
set BUILDERRORLEVEL=%ERRORLEVEL%
:AfterBuild
echo.
:: Pull the build summary from the log file
findstr /ir /c:".*Warning(s)" /c:".*Error(s)" /c:"Time Elapsed.*" "%_buildlog%"
echo Build Exit Code = %BUILDERRORLEVEL%
exit /b %BUILDERRORLEVEL%
## Instruction:
Fix script to require VS 2019.
* Since converting to .NET 3.0 SDK projects we now need that SDK in order to build.
## Code After:
@echo off
setlocal
if not defined VisualStudioVersion (
if defined VS160COMNTOOLS (
call "%VS160COMNTOOLS%\VsDevCmd.bat"
goto :EnvSet
)
echo Error: %~nx0 requires Visual Studio 2019 because the .NET Core 3.0 SDK is needed.
echo Please see https://github.com/dotnet/wcf/blob/master/Documentation/developer-guide.md for build instructions.
exit /b 1
)
:EnvSet
:: Log build command line
set _buildproj=%~dp0..\CertificateGenerator\CertificateGenerator.sln
set _buildlog=%~dp0..\..\..\..\msbuildCertificateGenerator.log
set _buildprefix=echo
set _buildpostfix=^> "%_buildlog%"
set cmd=msbuild /p:Configuration=Release /t:restore;build "%_buildproj%" /nologo /maxcpucount /verbosity:minimal /nodeReuse:false /fileloggerparameters:Verbosity=diag;LogFile="%_buildlog%";Append %*
echo %cmd%
%cmd%
set BUILDERRORLEVEL=%ERRORLEVEL%
:AfterBuild
echo.
:: Pull the build summary from the log file
findstr /ir /c:".*Warning(s)" /c:".*Error(s)" /c:"Time Elapsed.*" "%_buildlog%"
echo Build Exit Code = %BUILDERRORLEVEL%
exit /b %BUILDERRORLEVEL%
| @echo off
setlocal
if not defined VisualStudioVersion (
- if defined VS140COMNTOOLS (
? ^
+ if defined VS160COMNTOOLS (
? ^
- call "%VS140COMNTOOLS%\VsDevCmd.bat"
? ^
+ call "%VS160COMNTOOLS%\VsDevCmd.bat"
? ^
goto :EnvSet
)
+ echo Error: %~nx0 requires Visual Studio 2019 because the .NET Core 3.0 SDK is needed.
- if defined VS120COMNTOOLS (
- call "%VS120COMNTOOLS%\VsDevCmd.bat"
- goto :EnvSet
- )
-
- echo Error: %~nx0 requires Visual Studio 2013 or 2015.
echo Please see https://github.com/dotnet/wcf/blob/master/Documentation/developer-guide.md for build instructions.
exit /b 1
)
:EnvSet
:: Log build command line
set _buildproj=%~dp0..\CertificateGenerator\CertificateGenerator.sln
set _buildlog=%~dp0..\..\..\..\msbuildCertificateGenerator.log
set _buildprefix=echo
set _buildpostfix=^> "%_buildlog%"
set cmd=msbuild /p:Configuration=Release /t:restore;build "%_buildproj%" /nologo /maxcpucount /verbosity:minimal /nodeReuse:false /fileloggerparameters:Verbosity=diag;LogFile="%_buildlog%";Append %*
echo %cmd%
%cmd%
set BUILDERRORLEVEL=%ERRORLEVEL%
:AfterBuild
echo.
:: Pull the build summary from the log file
findstr /ir /c:".*Warning(s)" /c:".*Error(s)" /c:"Time Elapsed.*" "%_buildlog%"
echo Build Exit Code = %BUILDERRORLEVEL%
exit /b %BUILDERRORLEVEL% | 11 | 0.275 | 3 | 8 |
759482113028cc097cf3ea5af9786a1f1ab13433 | drivers/mgopw/mgo_test.go | drivers/mgopw/mgo_test.go | // Copyright 2015, Klaus Post, see LICENSE for details.
package mgopw
import (
"testing"
"time"
"github.com/klauspost/password/drivers"
"gopkg.in/mgo.v2"
)
// Test a Mongo database
func TestMongo(t *testing.T) {
session, err := mgo.DialWithTimeout("127.0.0.1:27017", time.Second)
if err != nil {
t.Skip("No database: ", err)
}
coll := session.DB("testdb").C("password-test")
_ = coll.DropCollection()
db := New(session, "testdb", "password-test")
err = drivers.TestImport(db)
if err != nil {
t.Fatal(err)
}
// Be sure data is flushed, probably not needed, but we like to be sure
err = session.Fsync(false)
if err != nil {
t.Log("Fsync returned", err, "(ignoring)")
}
err = drivers.TestData(db)
if err != nil {
t.Fatal(err)
}
err = coll.DropCollection()
if err != nil {
t.Log("Drop returned", err, "(ignoring)")
}
session.Close()
}
| // Copyright 2015, Klaus Post, see LICENSE for details.
package mgopw
import (
"testing"
"time"
"github.com/klauspost/password/drivers"
"gopkg.in/mgo.v2"
)
// Test a Mongo database
func TestMongo(t *testing.T) {
session, err := mgo.DialWithTimeout("127.0.0.1:27017", time.Second)
if err != nil {
t.Skip("No database: ", err)
}
coll := session.DB("testdb").C("password-test")
_ = coll.DropCollection()
// Set timeout, otherwise travis sometimes gets timeout.
session.SetSocketTimeout(time.Minute)
session.SetSyncTimeout(time.Minute)
db := New(session, "testdb", "password-test")
err = drivers.TestImport(db)
if err != nil {
t.Fatal(err)
}
// Be sure data is flushed, probably not needed, but we like to be sure
err = session.Fsync(false)
if err != nil {
t.Log("Fsync returned", err, "(ignoring)")
}
err = drivers.TestData(db)
if err != nil {
t.Fatal(err)
}
err = coll.DropCollection()
if err != nil {
t.Log("Drop returned", err, "(ignoring)")
}
session.Close()
}
| Extend timeouts on Mongo sessions to avoid random CI failures. | Extend timeouts on Mongo sessions to avoid random CI failures.
| Go | mit | klauspost/password | go | ## Code Before:
// Copyright 2015, Klaus Post, see LICENSE for details.
package mgopw
import (
"testing"
"time"
"github.com/klauspost/password/drivers"
"gopkg.in/mgo.v2"
)
// Test a Mongo database
func TestMongo(t *testing.T) {
session, err := mgo.DialWithTimeout("127.0.0.1:27017", time.Second)
if err != nil {
t.Skip("No database: ", err)
}
coll := session.DB("testdb").C("password-test")
_ = coll.DropCollection()
db := New(session, "testdb", "password-test")
err = drivers.TestImport(db)
if err != nil {
t.Fatal(err)
}
// Be sure data is flushed, probably not needed, but we like to be sure
err = session.Fsync(false)
if err != nil {
t.Log("Fsync returned", err, "(ignoring)")
}
err = drivers.TestData(db)
if err != nil {
t.Fatal(err)
}
err = coll.DropCollection()
if err != nil {
t.Log("Drop returned", err, "(ignoring)")
}
session.Close()
}
## Instruction:
Extend timeouts on Mongo sessions to avoid random CI failures.
## Code After:
// Copyright 2015, Klaus Post, see LICENSE for details.
package mgopw
import (
"testing"
"time"
"github.com/klauspost/password/drivers"
"gopkg.in/mgo.v2"
)
// Test a Mongo database
func TestMongo(t *testing.T) {
session, err := mgo.DialWithTimeout("127.0.0.1:27017", time.Second)
if err != nil {
t.Skip("No database: ", err)
}
coll := session.DB("testdb").C("password-test")
_ = coll.DropCollection()
// Set timeout, otherwise travis sometimes gets timeout.
session.SetSocketTimeout(time.Minute)
session.SetSyncTimeout(time.Minute)
db := New(session, "testdb", "password-test")
err = drivers.TestImport(db)
if err != nil {
t.Fatal(err)
}
// Be sure data is flushed, probably not needed, but we like to be sure
err = session.Fsync(false)
if err != nil {
t.Log("Fsync returned", err, "(ignoring)")
}
err = drivers.TestData(db)
if err != nil {
t.Fatal(err)
}
err = coll.DropCollection()
if err != nil {
t.Log("Drop returned", err, "(ignoring)")
}
session.Close()
}
| // Copyright 2015, Klaus Post, see LICENSE for details.
package mgopw
import (
"testing"
"time"
"github.com/klauspost/password/drivers"
"gopkg.in/mgo.v2"
)
// Test a Mongo database
func TestMongo(t *testing.T) {
session, err := mgo.DialWithTimeout("127.0.0.1:27017", time.Second)
if err != nil {
t.Skip("No database: ", err)
}
coll := session.DB("testdb").C("password-test")
_ = coll.DropCollection()
+
+ // Set timeout, otherwise travis sometimes gets timeout.
+ session.SetSocketTimeout(time.Minute)
+ session.SetSyncTimeout(time.Minute)
db := New(session, "testdb", "password-test")
err = drivers.TestImport(db)
if err != nil {
t.Fatal(err)
}
// Be sure data is flushed, probably not needed, but we like to be sure
err = session.Fsync(false)
if err != nil {
t.Log("Fsync returned", err, "(ignoring)")
}
err = drivers.TestData(db)
if err != nil {
t.Fatal(err)
}
err = coll.DropCollection()
if err != nil {
t.Log("Drop returned", err, "(ignoring)")
}
session.Close()
} | 4 | 0.090909 | 4 | 0 |
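An aside on the record above: the Go fix raises `SetSocketTimeout`/`SetSyncTimeout` on the mgo session so that slow CI machines do not trip the driver's per-operation deadline. The underlying mechanism is an ordinary socket read timeout; the sketch below exercises the same knob on a raw Python socket (the loopback server and 200 ms deadline are illustrative):

```python
import socket

# A loopback server that accepts a connection but never sends anything.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
srv.listen(1)

cli = socket.create_connection(srv.getsockname(), timeout=5)
conn, _ = srv.accept()

# Analogous to session.SetSocketTimeout(...) in the patch: every
# blocking operation on `cli` now has a 200 ms deadline.
cli.settimeout(0.2)
try:
    cli.recv(1)              # nothing was written, so this expires
    outcome = "data"
except socket.timeout:
    outcome = "timed out"
print(outcome)               # timed out

for s in (cli, conn, srv):
    s.close()
```

Raising the deadline (a minute in the patch) trades slower failure detection for fewer spurious timeouts — the right trade on an overloaded CI box.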
d36a48fce7bce09b4eadfaa9a3a99f6b822be736 | spec/nio/selector_spec.rb | spec/nio/selector_spec.rb | require 'spec_helper'
describe NIO::Selector do
it "monitors IO objects" do
pipe, _ = IO.pipe
monitor = subject.register(pipe, :r)
monitor.should be_a NIO::Monitor
end
it "selects objects for readiness" do
unready_pipe, _ = IO.pipe
ready_pipe, ready_writer = IO.pipe
# Give ready_pipe some data so it's ready
ready_writer << "hi there"
unready_monitor = subject.register(unready_pipe, :r)
ready_monitor = subject.register(ready_pipe, :r)
ready_monitors = subject.select
ready_monitors.should include ready_monitor
ready_monitors.should_not include unready_monitor
end
end
| require 'spec_helper'
describe NIO::Selector do
it "monitors IO objects" do
pipe, _ = IO.pipe
monitor = subject.register(pipe, :r)
monitor.should be_a NIO::Monitor
end
context "IO object support" do
context "pipes" do
it "selects for read readiness" do
unready_pipe, _ = IO.pipe
ready_pipe, ready_writer = IO.pipe
# Give ready_pipe some data so it's ready
ready_writer << "hi there"
unready_monitor = subject.register(unready_pipe, :r)
ready_monitor = subject.register(ready_pipe, :r)
ready_monitors = subject.select
ready_monitors.should include ready_monitor
ready_monitors.should_not include unready_monitor
end
end
context "TCPSockets" do
it "selects for read readiness" do
port = 12345
server = TCPServer.new("localhost", port)
ready_socket = TCPSocket.open("localhost", port)
ready_writer = server.accept
# Give ready_socket some data so it's ready
ready_writer << "hi there"
unready_socket = TCPSocket.open("localhost", port)
unready_monitor = subject.register(unready_socket, :r)
ready_monitor = subject.register(ready_socket, :r)
ready_monitors = subject.select
ready_monitors.should include ready_monitor
ready_monitors.should_not include unready_monitor
end
end
end
end
| Break apart specs for different types of IO objects | Break apart specs for different types of IO objects
| Ruby | mit | celluloid/nio4r,celluloid/nio4r,tpetchel/nio4r,marshall-lee/nio4r,marshall-lee/nio4r,frsyuki/nio4r,marshall-lee/nio4r,celluloid/nio4r,frsyuki/nio4r,frsyuki/nio4r,tpetchel/nio4r,tpetchel/nio4r | ruby | ## Code Before:
require 'spec_helper'
describe NIO::Selector do
it "monitors IO objects" do
pipe, _ = IO.pipe
monitor = subject.register(pipe, :r)
monitor.should be_a NIO::Monitor
end
it "selects objects for readiness" do
unready_pipe, _ = IO.pipe
ready_pipe, ready_writer = IO.pipe
# Give ready_pipe some data so it's ready
ready_writer << "hi there"
unready_monitor = subject.register(unready_pipe, :r)
ready_monitor = subject.register(ready_pipe, :r)
ready_monitors = subject.select
ready_monitors.should include ready_monitor
ready_monitors.should_not include unready_monitor
end
end
## Instruction:
Break apart specs for different types of IO objects
## Code After:
require 'spec_helper'
describe NIO::Selector do
it "monitors IO objects" do
pipe, _ = IO.pipe
monitor = subject.register(pipe, :r)
monitor.should be_a NIO::Monitor
end
context "IO object support" do
context "pipes" do
it "selects for read readiness" do
unready_pipe, _ = IO.pipe
ready_pipe, ready_writer = IO.pipe
# Give ready_pipe some data so it's ready
ready_writer << "hi there"
unready_monitor = subject.register(unready_pipe, :r)
ready_monitor = subject.register(ready_pipe, :r)
ready_monitors = subject.select
ready_monitors.should include ready_monitor
ready_monitors.should_not include unready_monitor
end
end
context "TCPSockets" do
it "selects for read readiness" do
port = 12345
server = TCPServer.new("localhost", port)
ready_socket = TCPSocket.open("localhost", port)
ready_writer = server.accept
# Give ready_socket some data so it's ready
ready_writer << "hi there"
unready_socket = TCPSocket.open("localhost", port)
unready_monitor = subject.register(unready_socket, :r)
ready_monitor = subject.register(ready_socket, :r)
ready_monitors = subject.select
ready_monitors.should include ready_monitor
ready_monitors.should_not include unready_monitor
end
end
end
end
| require 'spec_helper'
describe NIO::Selector do
it "monitors IO objects" do
pipe, _ = IO.pipe
monitor = subject.register(pipe, :r)
monitor.should be_a NIO::Monitor
end
+ context "IO object support" do
+ context "pipes" do
- it "selects objects for readiness" do
? --------
+ it "selects for read readiness" do
? ++++ +++++
- unready_pipe, _ = IO.pipe
+ unready_pipe, _ = IO.pipe
? ++++
- ready_pipe, ready_writer = IO.pipe
+ ready_pipe, ready_writer = IO.pipe
? ++++
- # Give ready_pipe some data so it's ready
+ # Give ready_pipe some data so it's ready
? ++++
- ready_writer << "hi there"
+ ready_writer << "hi there"
? ++++
- unready_monitor = subject.register(unready_pipe, :r)
+ unready_monitor = subject.register(unready_pipe, :r)
? ++++
- ready_monitor = subject.register(ready_pipe, :r)
+ ready_monitor = subject.register(ready_pipe, :r)
? ++++
- ready_monitors = subject.select
+ ready_monitors = subject.select
? ++++
- ready_monitors.should include ready_monitor
+ ready_monitors.should include ready_monitor
? ++++
- ready_monitors.should_not include unready_monitor
+ ready_monitors.should_not include unready_monitor
? ++++
+ end
+ end
+
+ context "TCPSockets" do
+ it "selects for read readiness" do
+ port = 12345
+ server = TCPServer.new("localhost", port)
+
+ ready_socket = TCPSocket.open("localhost", port)
+ ready_writer = server.accept
+
+ # Give ready_socket some data so it's ready
+ ready_writer << "hi there"
+
+ unready_socket = TCPSocket.open("localhost", port)
+
+ unready_monitor = subject.register(unready_socket, :r)
+ ready_monitor = subject.register(ready_socket, :r)
+
+ ready_monitors = subject.select
+ ready_monitors.should include ready_monitor
+ ready_monitors.should_not include unready_monitor
+ end
+ end
end
end | 46 | 1.84 | 36 | 10 |
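An aside on the record above: both new spec contexts (pipes and TCPSockets) pin down the same readiness contract — an endpoint with buffered input comes back from `select`, an idle one does not. Python's stdlib `selectors` module implements the same readiness model, so the contract can be checked directly (the `"ready"`/`"unready"` labels are illustrative):

```python
import selectors
import socket

ready_r, ready_w = socket.socketpair()      # will receive data -> selected
unready_r, unready_w = socket.socketpair()  # stays idle -> not selected

sel = selectors.DefaultSelector()
sel.register(ready_r, selectors.EVENT_READ, data="ready")
sel.register(unready_r, selectors.EVENT_READ, data="unready")

ready_w.sendall(b"hi there")                # give one side data, as the spec does

events = sel.select(timeout=1)
print(sorted(key.data for key, _ in events))   # ['ready']

for s in (ready_r, ready_w, unready_r, unready_w):
    s.close()
sel.close()
```

Using `socketpair` keeps the check self-contained; the nio4r spec instead opens a real `TCPServer` on a fixed port, which is closer to production use but can collide with other processes.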
ccd10fda4adf5d03a0903fd5f42f9b77af3680f6 | src/impl/java/io/core9/module/auth/standard/UserEntity.java | src/impl/java/io/core9/module/auth/standard/UserEntity.java | package io.core9.module.auth.standard;
import io.core9.plugin.database.repository.AbstractCrudEntity;
import io.core9.plugin.database.repository.Collection;
import io.core9.plugin.database.repository.CrudEntity;
import java.util.Set;
import org.apache.shiro.crypto.hash.Sha256Hash;
@Collection("core.users")
public class UserEntity extends AbstractCrudEntity implements CrudEntity {
private String username;
private String password;
private String salt;
private Set<String> roles;
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
public String getSalt() {
return salt;
}
public void setSalt(String salt) {
this.salt = salt;
}
public Set<String> getRoles() {
return roles;
}
public void setRoles(Set<String> roles) {
this.roles = roles;
}
public void hashPassword(String source) {
this.password = new Sha256Hash(source, salt).toString();
}
}
| package io.core9.module.auth.standard;
import io.core9.plugin.database.repository.AbstractCrudEntity;
import io.core9.plugin.database.repository.Collection;
import io.core9.plugin.database.repository.CrudEntity;
import java.math.BigInteger;
import java.security.SecureRandom;
import java.util.Set;
import org.apache.shiro.crypto.hash.Sha256Hash;
@Collection("core.users")
public class UserEntity extends AbstractCrudEntity implements CrudEntity {
private String username;
private String password;
private String salt;
private Set<String> roles;
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
public String getSalt() {
return salt;
}
public void setSalt(String salt) {
this.salt = salt;
}
public Set<String> getRoles() {
return roles;
}
public void setRoles(Set<String> roles) {
this.roles = roles;
}
public void hashPassword(String source) {
if(source == null) {
SecureRandom random = new SecureRandom();
source = new BigInteger(130, random).toString(32);
}
this.password = new Sha256Hash(source, salt).toString();
}
}
| Create dummy password on null | Create dummy password on null | Java | mit | core9/module-authentication-standard | java | ## Code Before:
package io.core9.module.auth.standard;
import io.core9.plugin.database.repository.AbstractCrudEntity;
import io.core9.plugin.database.repository.Collection;
import io.core9.plugin.database.repository.CrudEntity;
import java.util.Set;
import org.apache.shiro.crypto.hash.Sha256Hash;
@Collection("core.users")
public class UserEntity extends AbstractCrudEntity implements CrudEntity {
private String username;
private String password;
private String salt;
private Set<String> roles;
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
public String getSalt() {
return salt;
}
public void setSalt(String salt) {
this.salt = salt;
}
public Set<String> getRoles() {
return roles;
}
public void setRoles(Set<String> roles) {
this.roles = roles;
}
public void hashPassword(String source) {
this.password = new Sha256Hash(source, salt).toString();
}
}
## Instruction:
Create dummy password on null
## Code After:
package io.core9.module.auth.standard;
import io.core9.plugin.database.repository.AbstractCrudEntity;
import io.core9.plugin.database.repository.Collection;
import io.core9.plugin.database.repository.CrudEntity;
import java.math.BigInteger;
import java.security.SecureRandom;
import java.util.Set;
import org.apache.shiro.crypto.hash.Sha256Hash;
@Collection("core.users")
public class UserEntity extends AbstractCrudEntity implements CrudEntity {
private String username;
private String password;
private String salt;
private Set<String> roles;
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
public String getSalt() {
return salt;
}
public void setSalt(String salt) {
this.salt = salt;
}
public Set<String> getRoles() {
return roles;
}
public void setRoles(Set<String> roles) {
this.roles = roles;
}
public void hashPassword(String source) {
if(source == null) {
SecureRandom random = new SecureRandom();
source = new BigInteger(130, random).toString(32);
}
this.password = new Sha256Hash(source, salt).toString();
}
}
| package io.core9.module.auth.standard;
import io.core9.plugin.database.repository.AbstractCrudEntity;
import io.core9.plugin.database.repository.Collection;
import io.core9.plugin.database.repository.CrudEntity;
+ import java.math.BigInteger;
+ import java.security.SecureRandom;
import java.util.Set;
import org.apache.shiro.crypto.hash.Sha256Hash;
@Collection("core.users")
public class UserEntity extends AbstractCrudEntity implements CrudEntity {
private String username;
private String password;
private String salt;
private Set<String> roles;
public String getUsername() {
return username;
}
public void setUsername(String username) {
this.username = username;
}
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
public String getSalt() {
return salt;
}
public void setSalt(String salt) {
this.salt = salt;
}
public Set<String> getRoles() {
return roles;
}
public void setRoles(Set<String> roles) {
this.roles = roles;
}
public void hashPassword(String source) {
+ if(source == null) {
+ SecureRandom random = new SecureRandom();
+ source = new BigInteger(130, random).toString(32);
+ }
this.password = new Sha256Hash(source, salt).toString();
}
} | 6 | 0.109091 | 6 | 0 |
d413da5713470b41d471af24bfa3a9deca74cea1 | client/src/components/filters/filterList.less | client/src/components/filters/filterList.less | .filter-list {
label {
font-weight: normal;
}
.search-group {
display: flex;
}
.search-query {
margin-right: 20px;
}
.search-add .btn, .btn-search {
border-left: unset;
background: #f5f9fb;
}
.saved-searches button {
border-right: unset;
}
.search-filter {
border-bottom: 1px solid #f5f9fb;
padding: 10px;
}
.searches-save {
margin: 8px 5px 5px;
}
}
| .filter-list {
label {
font-weight: normal;
}
.search-group {
display: flex;
}
.search-query {
margin-right: 20px;
flex-basis: 100%;
}
.search-add .btn, .btn-search {
border-left: unset;
background: #f5f9fb;
}
.saved-searches button {
border-right: unset;
}
.search-filter {
border-bottom: 1px solid #f5f9fb;
padding: 10px;
}
.searches-save {
margin: 8px 5px 5px;
}
}
| Fix css issue with flex box in latest chrome | Fix css issue with flex box in latest chrome
| Less | apache-2.0 | polyaxon/polyaxon,polyaxon/polyaxon,polyaxon/polyaxon | less | ## Code Before:
.filter-list {
label {
font-weight: normal;
}
.search-group {
display: flex;
}
.search-query {
margin-right: 20px;
}
.search-add .btn, .btn-search {
border-left: unset;
background: #f5f9fb;
}
.saved-searches button {
border-right: unset;
}
.search-filter {
border-bottom: 1px solid #f5f9fb;
padding: 10px;
}
.searches-save {
margin: 8px 5px 5px;
}
}
## Instruction:
Fix css issue with flex box in latest chrome
## Code After:
.filter-list {
label {
font-weight: normal;
}
.search-group {
display: flex;
}
.search-query {
margin-right: 20px;
flex-basis: 100%;
}
.search-add .btn, .btn-search {
border-left: unset;
background: #f5f9fb;
}
.saved-searches button {
border-right: unset;
}
.search-filter {
border-bottom: 1px solid #f5f9fb;
padding: 10px;
}
.searches-save {
margin: 8px 5px 5px;
}
}
| .filter-list {
label {
font-weight: normal;
}
.search-group {
display: flex;
}
.search-query {
margin-right: 20px;
+ flex-basis: 100%;
}
.search-add .btn, .btn-search {
border-left: unset;
background: #f5f9fb;
}
.saved-searches button {
border-right: unset;
}
.search-filter {
border-bottom: 1px solid #f5f9fb;
padding: 10px;
}
.searches-save {
margin: 8px 5px 5px;
}
} | 1 | 0.04 | 1 | 0 |
4c8a5b51046c0f58f5a4c805794735bcd778eb8d | testing/tests/loops.rs | testing/tests/loops.rs | extern crate askama;
use askama::Template;
#[derive(Template)]
#[template(path = "for.html")]
struct ForTemplate<'a> {
strings: Vec<&'a str>,
}
#[test]
fn test_for() {
let s = ForTemplate {
strings: vec!["A", "alfa", "1"],
};
assert_eq!(s.render(), "0. A\n1. alfa\n2. 1\n");
}
#[derive(Template)]
#[template(path = "nested-for.html", print = "code")]
struct NestedForTemplate<'a> {
seqs: Vec<&'a [&'a str]>,
}
#[test]
fn test_nested_for() {
let alpha = vec!["a", "b", "c"];
let numbers = vec!["one", "two"];
let s = NestedForTemplate {
seqs: vec![&alpha, &numbers],
};
assert_eq!(s.render(), "1\n 0a1b2c2\n 0one1two");
}
| extern crate askama;
use askama::Template;
#[derive(Template)]
#[template(path = "for.html")]
struct ForTemplate<'a> {
strings: Vec<&'a str>,
}
#[test]
fn test_for() {
let s = ForTemplate {
strings: vec!["A", "alfa", "1"],
};
assert_eq!(s.render(), "0. A\n1. alfa\n2. 1\n");
}
#[derive(Template)]
#[template(path = "nested-for.html")]
struct NestedForTemplate<'a> {
seqs: Vec<&'a [&'a str]>,
}
#[test]
fn test_nested_for() {
let alpha = vec!["a", "b", "c"];
let numbers = vec!["one", "two"];
let s = NestedForTemplate {
seqs: vec![&alpha, &numbers],
};
assert_eq!(s.render(), "1\n 0a1b2c2\n 0one1two");
}
| Remove debugging cruft from test_nested_for | Remove debugging cruft from test_nested_for
| Rust | apache-2.0 | djc/askama,djc/askama,djc/askama | rust | ## Code Before:
extern crate askama;
use askama::Template;
#[derive(Template)]
#[template(path = "for.html")]
struct ForTemplate<'a> {
strings: Vec<&'a str>,
}
#[test]
fn test_for() {
let s = ForTemplate {
strings: vec!["A", "alfa", "1"],
};
assert_eq!(s.render(), "0. A\n1. alfa\n2. 1\n");
}
#[derive(Template)]
#[template(path = "nested-for.html", print = "code")]
struct NestedForTemplate<'a> {
seqs: Vec<&'a [&'a str]>,
}
#[test]
fn test_nested_for() {
let alpha = vec!["a", "b", "c"];
let numbers = vec!["one", "two"];
let s = NestedForTemplate {
seqs: vec![&alpha, &numbers],
};
assert_eq!(s.render(), "1\n 0a1b2c2\n 0one1two");
}
## Instruction:
Remove debugging cruft from test_nested_for
## Code After:
extern crate askama;
use askama::Template;
#[derive(Template)]
#[template(path = "for.html")]
struct ForTemplate<'a> {
strings: Vec<&'a str>,
}
#[test]
fn test_for() {
let s = ForTemplate {
strings: vec!["A", "alfa", "1"],
};
assert_eq!(s.render(), "0. A\n1. alfa\n2. 1\n");
}
#[derive(Template)]
#[template(path = "nested-for.html")]
struct NestedForTemplate<'a> {
seqs: Vec<&'a [&'a str]>,
}
#[test]
fn test_nested_for() {
let alpha = vec!["a", "b", "c"];
let numbers = vec!["one", "two"];
let s = NestedForTemplate {
seqs: vec![&alpha, &numbers],
};
assert_eq!(s.render(), "1\n 0a1b2c2\n 0one1two");
}
| extern crate askama;
use askama::Template;
#[derive(Template)]
#[template(path = "for.html")]
struct ForTemplate<'a> {
strings: Vec<&'a str>,
}
#[test]
fn test_for() {
let s = ForTemplate {
strings: vec!["A", "alfa", "1"],
};
assert_eq!(s.render(), "0. A\n1. alfa\n2. 1\n");
}
#[derive(Template)]
- #[template(path = "nested-for.html", print = "code")]
? ----------------
+ #[template(path = "nested-for.html")]
struct NestedForTemplate<'a> {
seqs: Vec<&'a [&'a str]>,
}
#[test]
fn test_nested_for() {
let alpha = vec!["a", "b", "c"];
let numbers = vec!["one", "two"];
let s = NestedForTemplate {
seqs: vec![&alpha, &numbers],
};
assert_eq!(s.render(), "1\n 0a1b2c2\n 0one1two");
} | 2 | 0.058824 | 1 | 1 |
056b4ae938ab1aacf5e3f48a1e17919a79ff29b7 | scripts/sbatch_cancel.py | scripts/sbatch_cancel.py | import subprocess
import sys
import getpass
### Kill a job and its chain of dependents (as created by sbatch_submit).
### Usage: python sbatch_cancel.py [Name of first running job in chain]
CURRENT_JOB = sys.argv[1]
USER = getpass.getuser()
lines = subprocess.check_output(['squeue', '-u', USER, '-o', '"%.8A %.20E"'])
lines = lines.split('\n')
lines.sort()
to_kill = [CURRENT_JOB]
for line in lines:
s = line.split()
if (len(s) > 0) and (to_kill[-1] in s[2]):
to_kill.append(s[1])
subprocess.call(['scancel'] + to_kill)
| import subprocess
import sys
import getpass
### Kill a job and its chain of dependents (as created by sbatch_submit).
### Usage: python sbatch_cancel.py [Name of first running job in chain] [Name of first running job in chain for a second chain] ...
USER = getpass.getuser()
lines = subprocess.check_output(['squeue', '-u', USER, '-o', '"%.8A %.20E"'])
lines = lines.split('\n')
lines.sort()
for current_job in sys.argv[1].split():
if len(current_job) < 5:
continue
to_kill = [current_job]
for line in lines:
s = line.split()
if (len(s) > 0) and (to_kill[-1] in s[2]):
to_kill.append(s[1])
subprocess.call(['scancel'] + to_kill)
| Kill multiple chains at once. | Kill multiple chains at once.
| Python | mit | nyu-mll/spinn,nyu-mll/spinn,nyu-mll/spinn | python | ## Code Before:
import subprocess
import sys
import getpass
### Kill a job and its chain of dependents (as created by sbatch_submit).
### Usage: python sbatch_cancel.py [Name of first running job in chain]
CURRENT_JOB = sys.argv[1]
USER = getpass.getuser()
lines = subprocess.check_output(['squeue', '-u', USER, '-o', '"%.8A %.20E"'])
lines = lines.split('\n')
lines.sort()
to_kill = [CURRENT_JOB]
for line in lines:
s = line.split()
if (len(s) > 0) and (to_kill[-1] in s[2]):
to_kill.append(s[1])
subprocess.call(['scancel'] + to_kill)
## Instruction:
Kill multiple chains at once.
## Code After:
import subprocess
import sys
import getpass
### Kill a job and its chain of dependents (as created by sbatch_submit).
### Usage: python sbatch_cancel.py [Name of first running job in chain] [Name of first running job in chain for a second chain] ...
USER = getpass.getuser()
lines = subprocess.check_output(['squeue', '-u', USER, '-o', '"%.8A %.20E"'])
lines = lines.split('\n')
lines.sort()
for current_job in sys.argv[1].split():
if len(current_job) < 5:
continue
to_kill = [current_job]
for line in lines:
s = line.split()
if (len(s) > 0) and (to_kill[-1] in s[2]):
to_kill.append(s[1])
subprocess.call(['scancel'] + to_kill)
| import subprocess
import sys
import getpass
### Kill a job and its chain of dependents (as created by sbatch_submit).
- ### Usage: python sbatch_cancel.py [Name of first running job in chain]
+ ### Usage: python sbatch_cancel.py [Name of first running job in chain] [Name of first running job in chain for a second chain] ...
- CURRENT_JOB = sys.argv[1]
USER = getpass.getuser()
lines = subprocess.check_output(['squeue', '-u', USER, '-o', '"%.8A %.20E"'])
lines = lines.split('\n')
lines.sort()
- to_kill = [CURRENT_JOB]
+ for current_job in sys.argv[1].split():
+ if len(current_job) < 5:
+ continue
+ to_kill = [current_job]
- for line in lines:
+ for line in lines:
? +
- s = line.split()
+ s = line.split()
? +
- if (len(s) > 0) and (to_kill[-1] in s[2]):
+ if (len(s) > 0) and (to_kill[-1] in s[2]):
? +
- to_kill.append(s[1])
+ to_kill.append(s[1])
? +
- subprocess.call(['scancel'] + to_kill)
+ subprocess.call(['scancel'] + to_kill)
? +
- | 19 | 0.826087 | 10 | 9 |
178a2206be25023180c27b07fb71f1f8be65ad43 | app/views/admin/landmarks/_landmark_form.html.haml | app/views/admin/landmarks/_landmark_form.html.haml | = f.input :name, required: true, :as => :string
= f.input :street_number, required: true, :as => :string
= f.input :route, required: true, :as => :string
= f.input :city, required: true, :as => :string
= f.input :state, required: true, :as => :string
= f.input :zip, required: true, :as => :string
= f.input :lat, required: true, :as => :string
= f.input :lng, required: true, :as => :string
| = f.input :name, required: true, :as => :string
= f.input :street_number, required: false, :as => :string
= f.input :route, required: false, :as => :string
= f.input :city, required: false, :as => :string
= f.input :state, required: false, :as => :string
= f.input :zip, required: false, :as => :string
= f.input :lat, required: true, :as => :string
= f.input :lng, required: true, :as => :string
| Change landmarks form to only require name, lat and lng | Change landmarks form to only require name, lat and lng
| Haml | mit | camsys/oneclick-core,camsys/oneclick-core,camsys/oneclick-core,camsys/oneclick-core | haml | ## Code Before:
= f.input :name, required: true, :as => :string
= f.input :street_number, required: true, :as => :string
= f.input :route, required: true, :as => :string
= f.input :city, required: true, :as => :string
= f.input :state, required: true, :as => :string
= f.input :zip, required: true, :as => :string
= f.input :lat, required: true, :as => :string
= f.input :lng, required: true, :as => :string
## Instruction:
Change landmarks form to only require name, lat and lng
## Code After:
= f.input :name, required: true, :as => :string
= f.input :street_number, required: false, :as => :string
= f.input :route, required: false, :as => :string
= f.input :city, required: false, :as => :string
= f.input :state, required: false, :as => :string
= f.input :zip, required: false, :as => :string
= f.input :lat, required: true, :as => :string
= f.input :lng, required: true, :as => :string
| = f.input :name, required: true, :as => :string
- = f.input :street_number, required: true, :as => :string
? ^^^
+ = f.input :street_number, required: false, :as => :string
? ^^^^
- = f.input :route, required: true, :as => :string
? ^^^
+ = f.input :route, required: false, :as => :string
? ^^^^
- = f.input :city, required: true, :as => :string
? ^^^
+ = f.input :city, required: false, :as => :string
? ^^^^
- = f.input :state, required: true, :as => :string
? ^^^
+ = f.input :state, required: false, :as => :string
? ^^^^
- = f.input :zip, required: true, :as => :string
? ^^^
+ = f.input :zip, required: false, :as => :string
? ^^^^
= f.input :lat, required: true, :as => :string
= f.input :lng, required: true, :as => :string | 10 | 1.25 | 5 | 5 |
66d2fc881f486eee20013bb5e1c606f53bc6dbfb | ruby-dtrace.gemspec | ruby-dtrace.gemspec | Gem::Specification.new do |s|
s.name = 'ruby-dtrace'
s.version = Dtrace::VERSION
s.platform = Gem::Platform::RUBY
s.summary = <<-DESC.strip.gsub(/\n\s+/, " ")
ruby-dtrace is Ruby bindings for Dtrace, which lets you write D-based
programs in Ruby, and add probes to your Ruby programs.
DESC
s.files = Dir.glob("{examples,ext,lib,plugin,test}/**/*") + %w(README.txt History.txt Manifest.txt Rakefile)
s.require_path = 'lib'
s.has_rdoc = true
s.author = "Chris Andrews"
s.email = "chris@nodnol.org"
s.homepage = "http://ruby-dtrace.rubyforge.org"
s.rubyforge_project = "ruby-dtrace"
end
| $:<<'./lib'
require 'dtrace'
Gem::Specification.new do |s|
s.name = 'ruby-dtrace'
s.version = Dtrace::VERSION
s.platform = Gem::Platform::RUBY
s.summary = <<-DESC.strip.gsub(/\n\s+/, " ")
ruby-dtrace is Ruby bindings for Dtrace, which lets you write D-based
programs in Ruby, and add probes to your Ruby programs.
DESC
s.files = Dir.glob("{examples,ext,lib,plugin,test}/**/*") + %w(README.txt History.txt Manifest.txt Rakefile)
s.require_path = 'lib'
s.has_rdoc = true
s.author = "Chris Andrews"
s.email = "chris@nodnol.org"
s.homepage = "http://ruby-dtrace.rubyforge.org"
s.rubyforge_project = "ruby-dtrace"
end
| Fix gemspec - must include dtrace.rb | Fix gemspec - must include dtrace.rb
| Ruby | mit | chrisa/ruby-dtrace,chrisa/ruby-dtrace | ruby | ## Code Before:
Gem::Specification.new do |s|
s.name = 'ruby-dtrace'
s.version = Dtrace::VERSION
s.platform = Gem::Platform::RUBY
s.summary = <<-DESC.strip.gsub(/\n\s+/, " ")
ruby-dtrace is Ruby bindings for Dtrace, which lets you write D-based
programs in Ruby, and add probes to your Ruby programs.
DESC
s.files = Dir.glob("{examples,ext,lib,plugin,test}/**/*") + %w(README.txt History.txt Manifest.txt Rakefile)
s.require_path = 'lib'
s.has_rdoc = true
s.author = "Chris Andrews"
s.email = "chris@nodnol.org"
s.homepage = "http://ruby-dtrace.rubyforge.org"
s.rubyforge_project = "ruby-dtrace"
end
## Instruction:
Fix gemspec - must include dtrace.rb
## Code After:
$:<<'./lib'
require 'dtrace'
Gem::Specification.new do |s|
s.name = 'ruby-dtrace'
s.version = Dtrace::VERSION
s.platform = Gem::Platform::RUBY
s.summary = <<-DESC.strip.gsub(/\n\s+/, " ")
ruby-dtrace is Ruby bindings for Dtrace, which lets you write D-based
programs in Ruby, and add probes to your Ruby programs.
DESC
s.files = Dir.glob("{examples,ext,lib,plugin,test}/**/*") + %w(README.txt History.txt Manifest.txt Rakefile)
s.require_path = 'lib'
s.has_rdoc = true
s.author = "Chris Andrews"
s.email = "chris@nodnol.org"
s.homepage = "http://ruby-dtrace.rubyforge.org"
s.rubyforge_project = "ruby-dtrace"
end
| + $:<<'./lib'
+ require 'dtrace'
+
Gem::Specification.new do |s|
s.name = 'ruby-dtrace'
s.version = Dtrace::VERSION
s.platform = Gem::Platform::RUBY
s.summary = <<-DESC.strip.gsub(/\n\s+/, " ")
ruby-dtrace is Ruby bindings for Dtrace, which lets you write D-based
programs in Ruby, and add probes to your Ruby programs.
DESC
s.files = Dir.glob("{examples,ext,lib,plugin,test}/**/*") + %w(README.txt History.txt Manifest.txt Rakefile)
s.require_path = 'lib'
s.has_rdoc = true
s.author = "Chris Andrews"
s.email = "chris@nodnol.org"
s.homepage = "http://ruby-dtrace.rubyforge.org"
s.rubyforge_project = "ruby-dtrace"
end | 3 | 0.15 | 3 | 0 |
27a6984242163c6767f09111accf5071dc3b45d5 | src/sct/templates/resources/setup_puppet_agent.sh | src/sct/templates/resources/setup_puppet_agent.sh | . /etc/profile
grep "puppet" /etc/hosts || echo "@puppetServer puppet" >> /etc/hosts
| . /etc/profile
grep "puppet" /etc/hosts || echo "@puppetServer puppet" >> /etc/hosts
echo -e "START=yes\nDAEMON_OPTS=\"\"\n" > /etc/default/puppet
/etc/init.d/puppet start
| Enable the puppet agent on worker | Enable the puppet agent on worker
| Shell | apache-2.0 | mneagul/scape-cloud-toolkit,mneagul/scape-cloud-toolkit,mneagul/scape-cloud-toolkit | shell | ## Code Before:
. /etc/profile
grep "puppet" /etc/hosts || echo "@puppetServer puppet" >> /etc/hosts
## Instruction:
Enable the puppet agent on worker
## Code After:
. /etc/profile
grep "puppet" /etc/hosts || echo "@puppetServer puppet" >> /etc/hosts
echo -e "START=yes\nDAEMON_OPTS=\"\"\n" > /etc/default/puppet
/etc/init.d/puppet start
| . /etc/profile
grep "puppet" /etc/hosts || echo "@puppetServer puppet" >> /etc/hosts
+ echo -e "START=yes\nDAEMON_OPTS=\"\"\n" > /etc/default/puppet
+ /etc/init.d/puppet start | 2 | 0.666667 | 2 | 0 |
536e89b0c80c6121c9e875ed320f8637019a82b0 | json/testparsestring.cpp | json/testparsestring.cpp |
using json = nlohmann::json;
bool parse_and_dump(const char* text, json& result)
{
// parse and serialize JSON
result = json::parse(text);
//std::cout << std::setw(4) << j_complete << "\n\n";
return true;
}
int test_parse_string()
{
// a JSON text
auto text = R"(
{
"Image": {
"Width": 800,
"Height": 600,
"Title": "View from 15th Floor",
"Thumbnail": {
"Url": "http://www.example.com/image/481989943",
"Height": 125,
"Width": 100
},
"Animated" : false,
"IDs": [116, 943, 234, 38793]
}
}
)";
json myobj;
if ( parse_and_dump(text, myobj) ) {
std::cout << std::setw(4) << myobj << "\n\n";
return true;
}
return false;
}
|
bool parse_and_dump(const char* text, nlohmann::json& result)
{
using json = nlohmann::json;
// parse and serialize JSON
result = json::parse(text);
//std::cout << std::setw(4) << j_complete << "\n\n";
return true;
}
int test_parse_string()
{
using namespace std;
using json = nlohmann::json;
cout << __func__ << " uses a RAW string to initialize a json object\n";
// a JSON text
auto text = R"(
{
"Image": {
"Width": 800,
"Height": 600,
"Title": "View from 15th Floor",
"Thumbnail": {
"Url": "http://www.example.com/image/481989943",
"Height": 125,
"Width": 100
},
"Animated" : false,
"IDs": [116, 943, 234, 38793]
}
}
)";
json myobj;
auto query = myobj.meta();
cout << "json.hpp version: " << query["version"]["string"] << endl;
if ( parse_and_dump(text, myobj) ) {
cout << std::setw(4) << myobj << "\n\n";
return true;
}
return false;
}
| UPDATE json, add some message for details | UPDATE json, add some message for details
| C++ | mit | ericosur/myqt,ericosur/myqt,ericosur/myqt,ericosur/myqt,ericosur/myqt,ericosur/myqt | c++ | ## Code Before:
using json = nlohmann::json;
bool parse_and_dump(const char* text, json& result)
{
// parse and serialize JSON
result = json::parse(text);
//std::cout << std::setw(4) << j_complete << "\n\n";
return true;
}
int test_parse_string()
{
// a JSON text
auto text = R"(
{
"Image": {
"Width": 800,
"Height": 600,
"Title": "View from 15th Floor",
"Thumbnail": {
"Url": "http://www.example.com/image/481989943",
"Height": 125,
"Width": 100
},
"Animated" : false,
"IDs": [116, 943, 234, 38793]
}
}
)";
json myobj;
if ( parse_and_dump(text, myobj) ) {
std::cout << std::setw(4) << myobj << "\n\n";
return true;
}
return false;
}
## Instruction:
UPDATE json, add some message for details
## Code After:
bool parse_and_dump(const char* text, nlohmann::json& result)
{
using json = nlohmann::json;
// parse and serialize JSON
result = json::parse(text);
//std::cout << std::setw(4) << j_complete << "\n\n";
return true;
}
int test_parse_string()
{
using namespace std;
using json = nlohmann::json;
cout << __func__ << " uses a RAW string to initialize a json object\n";
// a JSON text
auto text = R"(
{
"Image": {
"Width": 800,
"Height": 600,
"Title": "View from 15th Floor",
"Thumbnail": {
"Url": "http://www.example.com/image/481989943",
"Height": 125,
"Width": 100
},
"Animated" : false,
"IDs": [116, 943, 234, 38793]
}
}
)";
json myobj;
auto query = myobj.meta();
cout << "json.hpp version: " << query["version"]["string"] << endl;
if ( parse_and_dump(text, myobj) ) {
cout << std::setw(4) << myobj << "\n\n";
return true;
}
return false;
}
|
- using json = nlohmann::json;
- bool parse_and_dump(const char* text, json& result)
+ bool parse_and_dump(const char* text, nlohmann::json& result)
? ++++++++++
{
+ using json = nlohmann::json;
// parse and serialize JSON
result = json::parse(text);
//std::cout << std::setw(4) << j_complete << "\n\n";
return true;
}
int test_parse_string()
{
+ using namespace std;
+ using json = nlohmann::json;
+ cout << __func__ << " uses a RAW string to initialize a json object\n";
+
// a JSON text
auto text = R"(
{
"Image": {
"Width": 800,
"Height": 600,
"Title": "View from 15th Floor",
"Thumbnail": {
"Url": "http://www.example.com/image/481989943",
"Height": 125,
"Width": 100
},
"Animated" : false,
"IDs": [116, 943, 234, 38793]
}
}
)";
json myobj;
+ auto query = myobj.meta();
+ cout << "json.hpp version: " << query["version"]["string"] << endl;
if ( parse_and_dump(text, myobj) ) {
- std::cout << std::setw(4) << myobj << "\n\n";
? -----
+ cout << std::setw(4) << myobj << "\n\n";
return true;
}
return false;
}
- | 13 | 0.325 | 9 | 4 |
9ae30cb0feec3e02a2d23acc7cb2af9c3f216f8b | README.md | README.md | Visualize where new businesses are created in the city
* [business license data ](https://data.cityofchicago.org/Community-Economic-Development/Business-Licenses/r5kz-chrr)
## quickstart
1. Create a python virtualenv with virtualenvwrapper
```sh
mkvirtualenv chicago-new-business
workon chicago-new-business
```
2. Install python dependencies
```sh
pip install -r requirements/python
```
3. Run the data analysis pipeline using [flo]()
```sh
flo run
```
4. Enjoy the static figures.
```sh
open data/*.png
```
5. View the site.
```sh
cd web && python -m SimpleHTTPServer
# open http://localhost:8000 in your browser
```
| Visualize where new businesses are created in the city
* [business license data ](https://data.cityofchicago.org/Community-Economic-Development/Business-Licenses/r5kz-chrr)
## quickstart
1. Create a python virtualenv with virtualenvwrapper
```sh
mkvirtualenv chicago-new-business
workon chicago-new-business
```
2. Install python dependencies
```sh
pip install -r requirements/python
```
3. Run the data analysis pipeline using [flo]()
```sh
flo run
```
4. Enjoy the static figures.
```sh
open data/*.png
```
5. View the site.
```sh
cd web && python -m SimpleHTTPServer
# open http://localhost:8000 in your browser
```
## appendix
1. To convert shapefiles into topojsons used in viz, run following commands in /data/boundaries.
```sh
# The -t_srs crs:84 specifies a projection to use. If you leave this part off, you won't be dealing with degrees in your output document.
ogr2ogr -f "GeoJSON" -t_srs crs:84 neighborhoods.json Neighborhoods_2012b.shp
# Convert to TOPOJSON; specify ID and retain property with -p
topojson -o neighborhoods.topojson --id-property SEC_NEIGH -p PRI_NEIGH --neighborhoods.json
# Merge polygons for neighborhoods in the same SEC_NEIGH
topojson-merge -o merged_neighborhoods.topojson --in-object=neighborhoods --out-object=merged_neighborhoods -- 'neighborhoods.topojson'
```
| Add appendix for shapefile to topojson conversion | Add appendix for shapefile to topojson conversion | Markdown | unlicense | datascopeanalytics/chicago-new-business,datascopeanalytics/chicago-new-business | markdown | ## Code Before:
Visualize where new businesses are created in the city
* [business license data ](https://data.cityofchicago.org/Community-Economic-Development/Business-Licenses/r5kz-chrr)
## quickstart
1. Create a python virtualenv with virtualenvwrapper
```sh
mkvirtualenv chicago-new-business
workon chicago-new-business
```
2. Install python dependencies
```sh
pip install -r requirements/python
```
3. Run the data analysis pipeline using [flo]()
```sh
flo run
```
4. Enjoy the static figures.
```sh
open data/*.png
```
5. View the site.
```sh
cd web && python -m SimpleHTTPServer
# open http://localhost:8000 in your browser
```
## Instruction:
Add appendix for shapefile to topojson conversion
## Code After:
Visualize where new businesses are created in the city
* [business license data ](https://data.cityofchicago.org/Community-Economic-Development/Business-Licenses/r5kz-chrr)
## quickstart
1. Create a python virtualenv with virtualenvwrapper
```sh
mkvirtualenv chicago-new-business
workon chicago-new-business
```
2. Install python dependencies
```sh
pip install -r requirements/python
```
3. Run the data analysis pipeline using [flo]()
```sh
flo run
```
4. Enjoy the static figures.
```sh
open data/*.png
```
5. View the site.
```sh
cd web && python -m SimpleHTTPServer
# open http://localhost:8000 in your browser
```
## appendix
1. To convert shapefiles into topojsons used in viz, run following commands in /data/boundaries.
```sh
# The -t_srs crs:84 specifies a projection to use. If you leave this part off, you won't be dealing with degrees in your output document.
ogr2ogr -f "GeoJSON" -t_srs crs:84 neighborhoods.json Neighborhoods_2012b.shp
# Convert to TOPOJSON; specify ID and retain property with -p
topojson -o neighborhoods.topojson --id-property SEC_NEIGH -p PRI_NEIGH --neighborhoods.json
# Merge polygons for neighborhoods in the same SEC_NEIGH
topojson-merge -o merged_neighborhoods.topojson --in-object=neighborhoods --out-object=merged_neighborhoods -- 'neighborhoods.topojson'
```
| Visualize where new businesses are created in the city
* [business license data ](https://data.cityofchicago.org/Community-Economic-Development/Business-Licenses/r5kz-chrr)
## quickstart
1. Create a python virtualenv with virtualenvwrapper
```sh
mkvirtualenv chicago-new-business
workon chicago-new-business
```
2. Install python dependencies
```sh
pip install -r requirements/python
```
3. Run the data analysis pipeline using [flo]()
```sh
flo run
```
4. Enjoy the static figures.
```sh
open data/*.png
```
5. View the site.
```sh
cd web && python -m SimpleHTTPServer
# open http://localhost:8000 in your browser
```
+
+ ## appendix
+
+ 1. To convert shapefiles into topojsons used in viz, run following commands in /data/boundaries.
+
+ ```sh
+ # The -t_srs crs:84 specifies a projection to use. If you leave this part off, you won't be dealing with degrees in your output document.
+ ogr2ogr -f "GeoJSON" -t_srs crs:84 neighborhoods.json Neighborhoods_2012b.shp
+ # Convert to TOPOJSON; specify ID and retain property with -p
+ topojson -o neighborhoods.topojson --id-property SEC_NEIGH -p PRI_NEIGH --neighborhoods.json
+ # Merge polygons for neighborhoods in the same SEC_NEIGH
+ topojson-merge -o merged_neighborhoods.topojson --in-object=neighborhoods --out-object=merged_neighborhoods -- 'neighborhoods.topojson'
+ ``` | 13 | 0.40625 | 13 | 0 |
453b3e27939e4356c89cffe92c05f48d094ea511 | usecase/webapi-usecase-tests/steps/Vehicle/step.js | usecase/webapi-usecase-tests/steps/Vehicle/step.js | var step = '<font style="font-size:85%">'
+ '<p>Test Purpose: </p>'
+ '<p>Verifies the device supports tizen vehicle information access API.</p>'
+ '<p>Test Step: </p>'
+ '<p>'
+ '<ol>'
+ '<li>'
+ 'Click the "Get vehicle" button.'
+ '</li>'
+ '<li>'
+ 'Click the "Set vehicle door" button.'
+ '</li>'
+ '</ol>'
+ '</p>'
+ '<p>Expected Result: </p>'
+ '<p>Test passes if the detected direction reflects the actual vehicle information after step 1, and show "Set door lock success" after step 2.</p>'
+ '</font>'
| var step = '<font style="font-size:85%">'
+ '<p>Test Purpose: </p>'
+ '<p>Verifies the device supports tizen vehicle information access API.</p>'
+ '<p>Precondition: </p>'
+ '<p>'
+ '<ol>'
+ '<li>'
+ 'Execute commend: "zypper in ico-vic-amb-plugin" to install ico-vic-amb-plugin package on IVI.'
+ '</li>'
+ '<li>'
+ 'Reboot the system, then execute command: "ico_set_vehicleinfo speed=10" to set speed value.'
+ '</li>'
+ '</ol>'
+ '</p>'
+ '<p>Test Step: </p>'
+ '<p>'
+ '<ol>'
+ '<li>'
+ 'Click the "Get vehicle" button.'
+ '</li>'
+ '<li>'
+ 'Click the "Set vehicle door" button.'
+ '</li>'
+ '</ol>'
+ '</p>'
+ '<p>Expected Result: </p>'
+ '<p>Test passes if the detected direction reflects the actual vehicle information after step 1, and show "Set door lock success" after step 2.</p>'
+ '</font>'
| Add test precondition for vehicleinfo | [usecase] Add test precondition for vehicleinfo
- Add install ico-vic-amb-plugin package precondition.
| JavaScript | bsd-3-clause | jacky-young/crosswalk-test-suite,yugang/crosswalk-test-suite,pk-sam/crosswalk-test-suite,haoxli/crosswalk-test-suite,yugang/crosswalk-test-suite,kangxu/crosswalk-test-suite,XiaosongWei/crosswalk-test-suite,JianfengXu/crosswalk-test-suite,Shao-Feng/crosswalk-test-suite,zhuyongyong/crosswalk-test-suite,crosswalk-project/crosswalk-test-suite,YongseopKim/crosswalk-test-suite,Honry/crosswalk-test-suite,jiajiax/crosswalk-test-suite,kangxu/crosswalk-test-suite,yhe39/crosswalk-test-suite,wanghongjuan/crosswalk-test-suite,zqzhang/crosswalk-test-suite,haoxli/crosswalk-test-suite,ibelem/crosswalk-test-suite,Shao-Feng/crosswalk-test-suite,wanghongjuan/crosswalk-test-suite,qiuzhong/crosswalk-test-suite,yhe39/crosswalk-test-suite,chunywang/crosswalk-test-suite,BruceDai/crosswalk-test-suite,pk-sam/crosswalk-test-suite,yhe39/crosswalk-test-suite,zqzhang/crosswalk-test-suite,zqzhang/crosswalk-test-suite,zhuyongyong/crosswalk-test-suite,Honry/crosswalk-test-suite,YongseopKim/crosswalk-test-suite,crosswalk-project/crosswalk-test-suite,zqzhang/crosswalk-test-suite,jiajiax/crosswalk-test-suite,yhe39/crosswalk-test-suite,wanghongjuan/crosswalk-test-suite,haoyunfeix/crosswalk-test-suite,yunxliu/crosswalk-test-suite,yugang/crosswalk-test-suite,zhuyongyong/crosswalk-test-suite,yunxliu/crosswalk-test-suite,Honry/crosswalk-test-suite,Honry/crosswalk-test-suite,kaixinjxq/crosswalk-test-suite,haoxli/crosswalk-test-suite,wanghongjuan/crosswalk-test-suite,qiuzhong/crosswalk-test-suite,BruceDai/crosswalk-test-suite,haoyunfeix/crosswalk-test-suite,BruceDai/crosswalk-test-suite,kaixinjxq/crosswalk-test-suite,kangxu/crosswalk-test-suite,BruceDai/crosswalk-test-suite,haoyunfeix/crosswalk-test-suite,yunxliu/crosswalk-test-suite,yhe39/crosswalk-test-suite,zqzhang/crosswalk-test-suite,kaixinjxq/crosswalk-test-suite,crosswalk-project/crosswalk-test-suite,wanghongjuan/crosswalk-test-suite,BruceDai/crosswalk-test-suite,YongseopKim/crosswalk-test-suite,crosswalk-project/crosswal
k-test-suite,Honry/crosswalk-test-suite,YongseopKim/crosswalk-test-suite,JianfengXu/crosswalk-test-suite,ibelem/crosswalk-test-suite,Shao-Feng/crosswalk-test-suite,haoyunfeix/crosswalk-test-suite,yunxliu/crosswalk-test-suite,ibelem/crosswalk-test-suite,crosswalk-project/crosswalk-test-suite,jacky-young/crosswalk-test-suite,qiuzhong/crosswalk-test-suite,Shao-Feng/crosswalk-test-suite,BruceDai/crosswalk-test-suite,yunxliu/crosswalk-test-suite,YongseopKim/crosswalk-test-suite,kangxu/crosswalk-test-suite,kangxu/crosswalk-test-suite,ibelem/crosswalk-test-suite,ibelem/crosswalk-test-suite,XiaosongWei/crosswalk-test-suite,zhuyongyong/crosswalk-test-suite,jiajiax/crosswalk-test-suite,qiuzhong/crosswalk-test-suite,wanghongjuan/crosswalk-test-suite,kaixinjxq/crosswalk-test-suite,haoxli/crosswalk-test-suite,ibelem/crosswalk-test-suite,kangxu/crosswalk-test-suite,jiajiax/crosswalk-test-suite,pk-sam/crosswalk-test-suite,haoyunfeix/crosswalk-test-suite,yunxliu/crosswalk-test-suite,chunywang/crosswalk-test-suite,Shao-Feng/crosswalk-test-suite,ibelem/crosswalk-test-suite,kaixinjxq/crosswalk-test-suite,yugang/crosswalk-test-suite,haoxli/crosswalk-test-suite,XiaosongWei/crosswalk-test-suite,wanghongjuan/crosswalk-test-suite,yunxliu/crosswalk-test-suite,Honry/crosswalk-test-suite,JianfengXu/crosswalk-test-suite,zhuyongyong/crosswalk-test-suite,jacky-young/crosswalk-test-suite,chunywang/crosswalk-test-suite,YongseopKim/crosswalk-test-suite,chunywang/crosswalk-test-suite,JianfengXu/crosswalk-test-suite,chunywang/crosswalk-test-suite,haoyunfeix/crosswalk-test-suite,crosswalk-project/crosswalk-test-suite,jiajiax/crosswalk-test-suite,zhuyongyong/crosswalk-test-suite,chunywang/crosswalk-test-suite,JianfengXu/crosswalk-test-suite,JianfengXu/crosswalk-test-suite,kaixinjxq/crosswalk-test-suite,Honry/crosswalk-test-suite,haoxli/crosswalk-test-suite,jiajiax/crosswalk-test-suite,yugang/crosswalk-test-suite,yhe39/crosswalk-test-suite,pk-sam/crosswalk-test-suite,jacky-young/crosswalk-test-suite,Xia
osongWei/crosswalk-test-suite,jacky-young/crosswalk-test-suite,yhe39/crosswalk-test-suite,pk-sam/crosswalk-test-suite,yunxliu/crosswalk-test-suite,BruceDai/crosswalk-test-suite,wanghongjuan/crosswalk-test-suite,qiuzhong/crosswalk-test-suite,haoyunfeix/crosswalk-test-suite,yugang/crosswalk-test-suite,haoxli/crosswalk-test-suite,Honry/crosswalk-test-suite,kangxu/crosswalk-test-suite,crosswalk-project/crosswalk-test-suite,XiaosongWei/crosswalk-test-suite,yugang/crosswalk-test-suite,zhuyongyong/crosswalk-test-suite,yhe39/crosswalk-test-suite,kaixinjxq/crosswalk-test-suite,BruceDai/crosswalk-test-suite,qiuzhong/crosswalk-test-suite,pk-sam/crosswalk-test-suite,XiaosongWei/crosswalk-test-suite,zhuyongyong/crosswalk-test-suite,jacky-young/crosswalk-test-suite,zqzhang/crosswalk-test-suite,ibelem/crosswalk-test-suite,kangxu/crosswalk-test-suite,haoyunfeix/crosswalk-test-suite,zqzhang/crosswalk-test-suite,chunywang/crosswalk-test-suite,Shao-Feng/crosswalk-test-suite,XiaosongWei/crosswalk-test-suite,chunywang/crosswalk-test-suite,qiuzhong/crosswalk-test-suite,zqzhang/crosswalk-test-suite,Shao-Feng/crosswalk-test-suite,JianfengXu/crosswalk-test-suite,jiajiax/crosswalk-test-suite,YongseopKim/crosswalk-test-suite,pk-sam/crosswalk-test-suite,haoxli/crosswalk-test-suite,kaixinjxq/crosswalk-test-suite,crosswalk-project/crosswalk-test-suite | javascript | ## Code Before:
var step = '<font style="font-size:85%">'
+ '<p>Test Purpose: </p>'
+ '<p>Verifies the device supports tizen vehicle information access API.</p>'
+ '<p>Test Step: </p>'
+ '<p>'
+ '<ol>'
+ '<li>'
+ 'Click the "Get vehicle" button.'
+ '</li>'
+ '<li>'
+ 'Click the "Set vehicle door" button.'
+ '</li>'
+ '</ol>'
+ '</p>'
+ '<p>Expected Result: </p>'
+ '<p>Test passes if the detected direction reflects the actual vehicle information after step 1, and show "Set door lock success" after step 2.</p>'
+ '</font>'
## Instruction:
[usecase] Add test precondition for vehicleinfo
- Add install ico-vic-amb-plugin package precondition.
## Code After:
var step = '<font style="font-size:85%">'
+ '<p>Test Purpose: </p>'
+ '<p>Verifies the device supports tizen vehicle information access API.</p>'
+ '<p>Precondition: </p>'
+ '<p>'
+ '<ol>'
+ '<li>'
+ 'Execute commend: "zypper in ico-vic-amb-plugin" to install ico-vic-amb-plugin package on IVI.'
+ '</li>'
+ '<li>'
+ 'Reboot the system, then execute command: "ico_set_vehicleinfo speed=10" to set speed value.'
+ '</li>'
+ '</ol>'
+ '</p>'
+ '<p>Test Step: </p>'
+ '<p>'
+ '<ol>'
+ '<li>'
+ 'Click the "Get vehicle" button.'
+ '</li>'
+ '<li>'
+ 'Click the "Set vehicle door" button.'
+ '</li>'
+ '</ol>'
+ '</p>'
+ '<p>Expected Result: </p>'
+ '<p>Test passes if the detected direction reflects the actual vehicle information after step 1, and show "Set door lock success" after step 2.</p>'
+ '</font>'
| var step = '<font style="font-size:85%">'
+ '<p>Test Purpose: </p>'
+ '<p>Verifies the device supports tizen vehicle information access API.</p>'
+ + '<p>Precondition: </p>'
+ + '<p>'
+ + '<ol>'
+ + '<li>'
+ + 'Execute commend: "zypper in ico-vic-amb-plugin" to install ico-vic-amb-plugin package on IVI.'
+ + '</li>'
+ + '<li>'
+ + 'Reboot the system, then execute command: "ico_set_vehicleinfo speed=10" to set speed value.'
+ + '</li>'
+ + '</ol>'
+ + '</p>'
+ '<p>Test Step: </p>'
+ '<p>'
+ '<ol>'
+ '<li>'
+ 'Click the "Get vehicle" button.'
+ '</li>'
+ '<li>'
+ 'Click the "Set vehicle door" button.'
+ '</li>'
+ '</ol>'
+ '</p>'
+ '<p>Expected Result: </p>'
+ '<p>Test passes if the detected direction reflects the actual vehicle information after step 1, and show "Set door lock success" after step 2.</p>'
+ '</font>' | 11 | 0.647059 | 11 | 0 |
04ceb5cdae608a12fe768d7015fb0ee8db496e7c | docs/using.md | docs/using.md | ---
navhome: /docs/
sort: 1
title: Using
---
# Get Started
These short guides will get you installed, setup, and will walk you through the basic features of Urbit.
<list/>
| ---
navhome: /docs/
sort: 1
title: Using
---
# Using
These short guides will get you installed, setup, and will walk you through the basic features of Urbit.
<list/>
| Change "Get Started" title to "Using" like the path name | Change "Get Started" title to "Using" like the path name
| Markdown | mit | urbit/urbit.org | markdown | ## Code Before:
---
navhome: /docs/
sort: 1
title: Using
---
# Get Started
These short guides will get you installed, setup, and will walk you through the basic features of Urbit.
<list/>
## Instruction:
Change "Get Started" title to "Using" like the path name
## Code After:
---
navhome: /docs/
sort: 1
title: Using
---
# Using
These short guides will get you installed, setup, and will walk you through the basic features of Urbit.
<list/>
| ---
navhome: /docs/
sort: 1
title: Using
---
- # Get Started
+ # Using
These short guides will get you installed, setup, and will walk you through the basic features of Urbit.
<list/> | 2 | 0.181818 | 1 | 1 |
01b412488990132f31e1bffe0967c62a2f3449a6 | app/controllers/admin_controller.rb | app/controllers/admin_controller.rb | class AdminController < ApplicationController
layout "admin"
# Don't serve admin assets from the CDN
# Respond.js needs to run on the same domain to request stylesheets,
# parse them and render a non-mobile layout on <IE9
# https://github.com/scottjehl/Respond#cdnx-domain-setup
ActionController::Base.asset_host = nil
prepend_before_filter :authenticate_user!
before_filter :require_signin_permission!
before_filter :skip_slimmer
def info_for_paper_trail
{ user_name: current_user.name }
end
private
def load_parent_contact
@contact = Contact.find(params[:contact_id])
end
end
| class AdminController < ApplicationController
layout "admin"
prepend_before_filter :authenticate_user!
before_filter :require_signin_permission!
before_filter :skip_slimmer
def info_for_paper_trail
{ user_name: current_user.name }
end
private
def load_parent_contact
@contact = Contact.find(params[:contact_id])
end
end
| Revert "Don’t serve admin assets from the CDN" | Revert "Don’t serve admin assets from the CDN"
The intention was to only affect admin pages, but in production-like
environments it also broke assets on the frontend.
For now, we'll live with the bug for respond.js and IE<9, until we have switched
the frontend to a finder.
This reverts commit 4396cc8e0c6477e9ca794ce7de03cd0c608ce85c.
| Ruby | mit | alphagov/contacts-admin,alphagov/contacts-admin,alphagov/contacts-admin | ruby | ## Code Before:
class AdminController < ApplicationController
layout "admin"
# Don't serve admin assets from the CDN
# Respond.js needs to run on the same domain to request stylesheets,
# parse them and render a non-mobile layout on <IE9
# https://github.com/scottjehl/Respond#cdnx-domain-setup
ActionController::Base.asset_host = nil
prepend_before_filter :authenticate_user!
before_filter :require_signin_permission!
before_filter :skip_slimmer
def info_for_paper_trail
{ user_name: current_user.name }
end
private
def load_parent_contact
@contact = Contact.find(params[:contact_id])
end
end
## Instruction:
Revert "Don’t serve admin assets from the CDN"
The intention was to only affect admin pages, but in production-like
environments it also broke assets on the frontend.
For now, we'll live with the bug for respond.js and IE<9, until we have switched
the frontend to a finder.
This reverts commit 4396cc8e0c6477e9ca794ce7de03cd0c608ce85c.
## Code After:
class AdminController < ApplicationController
layout "admin"
prepend_before_filter :authenticate_user!
before_filter :require_signin_permission!
before_filter :skip_slimmer
def info_for_paper_trail
{ user_name: current_user.name }
end
private
def load_parent_contact
@contact = Contact.find(params[:contact_id])
end
end
| class AdminController < ApplicationController
layout "admin"
-
- # Don't serve admin assets from the CDN
- # Respond.js needs to run on the same domain to request stylesheets,
- # parse them and render a non-mobile layout on <IE9
- # https://github.com/scottjehl/Respond#cdnx-domain-setup
- ActionController::Base.asset_host = nil
prepend_before_filter :authenticate_user!
before_filter :require_signin_permission!
before_filter :skip_slimmer
def info_for_paper_trail
{ user_name: current_user.name }
end
private
def load_parent_contact
@contact = Contact.find(params[:contact_id])
end
end | 6 | 0.26087 | 0 | 6 |
082014ef3ab303cc130f95c7d7af2aa1b05148d7 | src/Oro/Bundle/DataGridBundle/Resources/views/macros.html.twig | src/Oro/Bundle/DataGridBundle/Resources/views/macros.html.twig | {#
Renders datagrid widget
parameters:
name: datagrid name
params: additional parameters for url
renderParams: parameters for grid UI
#}
{% macro renderGrid(name, params = {}, renderParams = {}) %}
{% set datagrid = oro_datagrid_build(name, params) %}
{% set metaData = oro_datagrid_metadata(datagrid, params) %}
{% set data = oro_datagrid_data(datagrid) %}
<div id="grid-{{ name }}" data-type="datagrid" data-data="{{ data|json_encode|raw|escape }}"
{% if renderParams.cssClass is defined %} class="{{ renderParams.cssClass }}" {% endif %}
data-metadata="{{ metaData|json_encode|raw|escape }}"></div>
<script type="text/javascript">
require(['jquery', 'orodatagrid/js/datagrid-builder'].concat({{ metaData.requireJSModules|json_encode|raw }}),
function ($, datagridBuilder) {
var builders = _.toArray(arguments).slice(2);
$(function () {
datagridBuilder(builders, '#grid-{{ name }}');
});
});
</script>
{% endmacro %}
| {#
Renders datagrid widget
parameters:
name: datagrid name
params: additional parameters for url
renderParams: parameters for grid UI
#}
{% macro renderGrid(name, params = {}, renderParams = {}) %}
{% set datagrid = oro_datagrid_build(name, params) %}
{% set metaData = oro_datagrid_metadata(datagrid, params) %}
{% set data = oro_datagrid_data(datagrid) %}
{% set gridId = 'grid-' ~ name %}
{% set gridSelector = gridId ~ '-' ~ random() %}
<div id="{{ gridId }}"
data-grid-selector="{{ gridSelector }}"
data-type="datagrid"
data-data="{{ data|json_encode|raw|escape }}"
{% if renderParams.cssClass is defined %} class="{{ renderParams.cssClass }}" {% endif %}
data-metadata="{{ metaData|json_encode|raw|escape }}"></div>
<script type="text/javascript">
require(['jquery', 'orodatagrid/js/datagrid-builder'].concat({{ metaData.requireJSModules|json_encode|raw }}),
function ($, datagridBuilder) {
var builders = _.toArray(arguments).slice(2);
$(function () {
datagridBuilder(builders, '[data-grid-selector="{{ gridSelector }}"]');
});
});
</script>
{% endmacro %}
| Merge Accounts from different channels. Broken Orders and Carts widgets | CRM-1143: Merge Accounts from different channels. Broken Orders and Carts widgets
| Twig | mit | hugeval/platform,trustify/oroplatform,northdakota/platform,geoffroycochard/platform,ramunasd/platform,hugeval/platform,2ndkauboy/platform,trustify/oroplatform,2ndkauboy/platform,mszajner/platform,trustify/oroplatform,orocrm/platform,hugeval/platform,northdakota/platform,ramunasd/platform,orocrm/platform,geoffroycochard/platform,morontt/platform,morontt/platform,Djamy/platform,mszajner/platform,geoffroycochard/platform,orocrm/platform,Djamy/platform,morontt/platform,ramunasd/platform,northdakota/platform,mszajner/platform,2ndkauboy/platform,Djamy/platform | twig | ## Code Before:
{#
Renders datagrid widget
parameters:
name: datagrid name
params: additional parameters for url
renderParams: parameters for grid UI
#}
{% macro renderGrid(name, params = {}, renderParams = {}) %}
{% set datagrid = oro_datagrid_build(name, params) %}
{% set metaData = oro_datagrid_metadata(datagrid, params) %}
{% set data = oro_datagrid_data(datagrid) %}
<div id="grid-{{ name }}" data-type="datagrid" data-data="{{ data|json_encode|raw|escape }}"
{% if renderParams.cssClass is defined %} class="{{ renderParams.cssClass }}" {% endif %}
data-metadata="{{ metaData|json_encode|raw|escape }}"></div>
<script type="text/javascript">
require(['jquery', 'orodatagrid/js/datagrid-builder'].concat({{ metaData.requireJSModules|json_encode|raw }}),
function ($, datagridBuilder) {
var builders = _.toArray(arguments).slice(2);
$(function () {
datagridBuilder(builders, '#grid-{{ name }}');
});
});
</script>
{% endmacro %}
## Instruction:
CRM-1143: Merge Accounts from different channels. Broken Orders and Carts widgets
## Code After:
{#
Renders datagrid widget
parameters:
name: datagrid name
params: additional parameters for url
renderParams: parameters for grid UI
#}
{% macro renderGrid(name, params = {}, renderParams = {}) %}
{% set datagrid = oro_datagrid_build(name, params) %}
{% set metaData = oro_datagrid_metadata(datagrid, params) %}
{% set data = oro_datagrid_data(datagrid) %}
{% set gridId = 'grid-' ~ name %}
{% set gridSelector = gridId ~ '-' ~ random() %}
<div id="{{ gridId }}"
data-grid-selector="{{ gridSelector }}"
data-type="datagrid"
data-data="{{ data|json_encode|raw|escape }}"
{% if renderParams.cssClass is defined %} class="{{ renderParams.cssClass }}" {% endif %}
data-metadata="{{ metaData|json_encode|raw|escape }}"></div>
<script type="text/javascript">
require(['jquery', 'orodatagrid/js/datagrid-builder'].concat({{ metaData.requireJSModules|json_encode|raw }}),
function ($, datagridBuilder) {
var builders = _.toArray(arguments).slice(2);
$(function () {
datagridBuilder(builders, '[data-grid-selector="{{ gridSelector }}"]');
});
});
</script>
{% endmacro %}
| {#
Renders datagrid widget
parameters:
name: datagrid name
params: additional parameters for url
renderParams: parameters for grid UI
#}
{% macro renderGrid(name, params = {}, renderParams = {}) %}
{% set datagrid = oro_datagrid_build(name, params) %}
{% set metaData = oro_datagrid_metadata(datagrid, params) %}
{% set data = oro_datagrid_data(datagrid) %}
+ {% set gridId = 'grid-' ~ name %}
+ {% set gridSelector = gridId ~ '-' ~ random() %}
- <div id="grid-{{ name }}" data-type="datagrid" data-data="{{ data|json_encode|raw|escape }}"
+ <div id="{{ gridId }}"
+ data-grid-selector="{{ gridSelector }}"
+ data-type="datagrid"
+ data-data="{{ data|json_encode|raw|escape }}"
{% if renderParams.cssClass is defined %} class="{{ renderParams.cssClass }}" {% endif %}
data-metadata="{{ metaData|json_encode|raw|escape }}"></div>
<script type="text/javascript">
require(['jquery', 'orodatagrid/js/datagrid-builder'].concat({{ metaData.requireJSModules|json_encode|raw }}),
function ($, datagridBuilder) {
var builders = _.toArray(arguments).slice(2);
$(function () {
- datagridBuilder(builders, '#grid-{{ name }}');
? ^ ^^^
+ datagridBuilder(builders, '[data-grid-selector="{{ gridSelector }}"]');
? ^^^^^^ ++++++++++ ^^^^^ ++++++ ++
});
});
</script>
{% endmacro %} | 9 | 0.346154 | 7 | 2 |
4ef9539dbb0ea399f6f8ee8408f7869f53d0ce43 | paper-typeahead-results.html | paper-typeahead-results.html | <link rel="import" href="bower_components/paper-styles/paper-styles.html">
<link rel="import" href="bower_components/paper-item/paper-item.html">
<link rel="import" href="bower_components/iron-menu-behavior/iron-menu-behavior.html">
<dom-module id="paper-typeahead-results">
<template>
<style>
:host {
display: block;
}
paper-item:hover {
background: var(--google-grey-100);
}
paper-item {
cursor: pointer;
position: relative;
border-bottom: solid 1px var(--google-grey-500);
}
</style>
<template is="dom-repeat" items="[[results]]">
<paper-item class='result-item' on-tap="onSelected" on-keypress="onKeyPress">
<paper-item-body>
<span>[[item]]</span>
</paper-item-body>
</paper-item>
</template>
</template>
</dom-module>
<script src="paper-typeahead-results.js"></script>
| <link rel="import" href="bower_components/paper-styles/paper-styles.html">
<link rel="import" href="bower_components/paper-item/paper-item.html">
<link rel="import" href="bower_components/iron-menu-behavior/iron-menu-behavior.html">
<dom-module id="paper-typeahead-results">
<template>
<style>
:host {
display: block;
}
paper-item:hover {
background: var(--google-grey-100);
@apply(--paper-typeahead-results-item-hover);
}
paper-item {
cursor: pointer;
position: relative;
border-bottom: solid 1px var(--google-grey-300);
@apply(--paper-typeahead-results-item);
--paper-item-min-height: var(--paper-typeahead-results-item-min-height, 30px);
}
</style>
<template is="dom-repeat" items="[[results]]">
<paper-item class='result-item' on-tap="onSelected" on-keypress="onKeyPress">
<paper-item-body>
<span>[[item]]</span>
</paper-item-body>
</paper-item>
</template>
</template>
</dom-module>
<script src="paper-typeahead-results.js"></script>
| Add var and mixins applied to paper-item | Add var and mixins applied to paper-item
Add mixins --paper-typeahead-results-item,
--paper-typeahead-results-item-hover, and var
--paper-typeahead-results-item-min-height with default to 30px. Also
lighten the default paper-item’s border-bottom-color.
| HTML | mit | Zecat/paper-typeahead,samccone/paper-typeahead,Zecat/paper-typeahead,samccone/paper-typeahead | html | ## Code Before:
<link rel="import" href="bower_components/paper-styles/paper-styles.html">
<link rel="import" href="bower_components/paper-item/paper-item.html">
<link rel="import" href="bower_components/iron-menu-behavior/iron-menu-behavior.html">
<dom-module id="paper-typeahead-results">
<template>
<style>
:host {
display: block;
}
paper-item:hover {
background: var(--google-grey-100);
}
paper-item {
cursor: pointer;
position: relative;
border-bottom: solid 1px var(--google-grey-500);
}
</style>
<template is="dom-repeat" items="[[results]]">
<paper-item class='result-item' on-tap="onSelected" on-keypress="onKeyPress">
<paper-item-body>
<span>[[item]]</span>
</paper-item-body>
</paper-item>
</template>
</template>
</dom-module>
<script src="paper-typeahead-results.js"></script>
## Instruction:
Add var and mixins applied to paper-item
Add mixins --paper-typeahead-results-item,
--paper-typeahead-results-item-hover, and var
--paper-typeahead-results-item-min-height with default to 30px. Also
lighten the default paper-item’s border-bottom-color.
## Code After:
<link rel="import" href="bower_components/paper-styles/paper-styles.html">
<link rel="import" href="bower_components/paper-item/paper-item.html">
<link rel="import" href="bower_components/iron-menu-behavior/iron-menu-behavior.html">
<dom-module id="paper-typeahead-results">
<template>
<style>
:host {
display: block;
}
paper-item:hover {
background: var(--google-grey-100);
@apply(--paper-typeahead-results-item-hover);
}
paper-item {
cursor: pointer;
position: relative;
border-bottom: solid 1px var(--google-grey-300);
@apply(--paper-typeahead-results-item);
--paper-item-min-height: var(--paper-typeahead-results-item-min-height, 30px);
}
</style>
<template is="dom-repeat" items="[[results]]">
<paper-item class='result-item' on-tap="onSelected" on-keypress="onKeyPress">
<paper-item-body>
<span>[[item]]</span>
</paper-item-body>
</paper-item>
</template>
</template>
</dom-module>
<script src="paper-typeahead-results.js"></script>
| <link rel="import" href="bower_components/paper-styles/paper-styles.html">
<link rel="import" href="bower_components/paper-item/paper-item.html">
<link rel="import" href="bower_components/iron-menu-behavior/iron-menu-behavior.html">
<dom-module id="paper-typeahead-results">
<template>
<style>
:host {
display: block;
}
paper-item:hover {
background: var(--google-grey-100);
+ @apply(--paper-typeahead-results-item-hover);
}
paper-item {
cursor: pointer;
position: relative;
- border-bottom: solid 1px var(--google-grey-500);
? ^
+ border-bottom: solid 1px var(--google-grey-300);
? ^
+ @apply(--paper-typeahead-results-item);
+ --paper-item-min-height: var(--paper-typeahead-results-item-min-height, 30px);
}
</style>
<template is="dom-repeat" items="[[results]]">
<paper-item class='result-item' on-tap="onSelected" on-keypress="onKeyPress">
<paper-item-body>
<span>[[item]]</span>
</paper-item-body>
</paper-item>
</template>
</template>
</dom-module>
<script src="paper-typeahead-results.js"></script> | 5 | 0.172414 | 4 | 1 |
577fb888436816ede551179e677b75547ecfff14 | app/views/spree/admin/shared/_new_resource_links.html.haml | app/views/spree/admin/shared/_new_resource_links.html.haml | .form-buttons.filter-actions.actions
= button t(:create), 'icon-ok'
%span.or= t(:or)
= button_link_to t(:cancel), collection_url, icon: 'icon-remove'
| .form-buttons.filter-actions.actions
= button t('actions.create'), 'icon-ok'
%span.or= t(:or)
= button_link_to t('actions.cancel'), collection_url, icon: 'icon-remove'
| Move Create and Cancel References to Actions Namespace | Move Create and Cancel References to Actions Namespace
| Haml | agpl-3.0 | Matt-Yorkley/openfoodnetwork,lin-d-hop/openfoodnetwork,mkllnk/openfoodnetwork,openfoodfoundation/openfoodnetwork,Matt-Yorkley/openfoodnetwork,lin-d-hop/openfoodnetwork,mkllnk/openfoodnetwork,openfoodfoundation/openfoodnetwork,Matt-Yorkley/openfoodnetwork,lin-d-hop/openfoodnetwork,lin-d-hop/openfoodnetwork,openfoodfoundation/openfoodnetwork,openfoodfoundation/openfoodnetwork,mkllnk/openfoodnetwork,Matt-Yorkley/openfoodnetwork,mkllnk/openfoodnetwork | haml | ## Code Before:
.form-buttons.filter-actions.actions
= button t(:create), 'icon-ok'
%span.or= t(:or)
= button_link_to t(:cancel), collection_url, icon: 'icon-remove'
## Instruction:
Move Create and Cancel References to Actions Namespace
## Code After:
.form-buttons.filter-actions.actions
= button t('actions.create'), 'icon-ok'
%span.or= t(:or)
= button_link_to t('actions.cancel'), collection_url, icon: 'icon-remove'
| .form-buttons.filter-actions.actions
- = button t(:create), 'icon-ok'
? ^
+ = button t('actions.create'), 'icon-ok'
? ^^^^^^^^^ +
%span.or= t(:or)
- = button_link_to t(:cancel), collection_url, icon: 'icon-remove'
? ^
+ = button_link_to t('actions.cancel'), collection_url, icon: 'icon-remove'
? ^^^^^^^^^ +
| 4 | 1 | 2 | 2 |
955a5bf0a9b1e6a71c9c055fc18aaea4df33cddd | Cargo.toml | Cargo.toml | [package]
name = "x86_64"
version = "0.1.0"
authors = [
"Gerd Zellweger <mail@gerdzellweger.com>",
"Eric Kidd <git@randomhacks.net>",
"Philipp Oppermann <dev@phil-opp.com>",
"Dan Schatzberg <schatzberg.dan@gmail.com>",
"John Ericson <John_Ericson@Yahoo.com>",
"Rex Lunae <rexlunae@gmail.com>"
]
description = "Library to program x86_64 hardware. Contains x86_64 specific data structure descriptions, data-tables, as well as convenience function to call assembly instructions. This is a fork of the `x86` crate, specialized for x86_64."
repository = "https://github.com/phil-opp/rust-x86_64"
documentation = "https://docs.rs/x86_64"
readme = "README.md"
keywords = ["amd64", "x86", "x86_64", "no_std"]
license = "MIT"
[[test]]
name = "no_std_build"
harness = false
[dependencies]
bit_field = "0.7.0"
| [package]
name = "x86_64"
version = "0.1.0"
authors = [
"Gerd Zellweger <mail@gerdzellweger.com>",
"Eric Kidd <git@randomhacks.net>",
"Philipp Oppermann <dev@phil-opp.com>",
"Dan Schatzberg <schatzberg.dan@gmail.com>",
"John Ericson <John_Ericson@Yahoo.com>",
"Rex Lunae <rexlunae@gmail.com>"
]
description = "Provides x86_64 specific functions and data structures, and access to various system registers."
repository = "https://github.com/phil-opp/x86_64"
documentation = "https://docs.rs/x86_64"
readme = "README.md"
keywords = ["amd64", "x86", "x86_64", "no_std"]
license = "MIT"
[[test]]
name = "no_std_build"
harness = false
[dependencies]
bit_field = "0.7.0"
| Update description and repo link | Update description and repo link
| TOML | mit | phil-opp/rust-x86_64 | toml | ## Code Before:
[package]
name = "x86_64"
version = "0.1.0"
authors = [
"Gerd Zellweger <mail@gerdzellweger.com>",
"Eric Kidd <git@randomhacks.net>",
"Philipp Oppermann <dev@phil-opp.com>",
"Dan Schatzberg <schatzberg.dan@gmail.com>",
"John Ericson <John_Ericson@Yahoo.com>",
"Rex Lunae <rexlunae@gmail.com>"
]
description = "Library to program x86_64 hardware. Contains x86_64 specific data structure descriptions, data-tables, as well as convenience function to call assembly instructions. This is a fork of the `x86` crate, specialized for x86_64."
repository = "https://github.com/phil-opp/rust-x86_64"
documentation = "https://docs.rs/x86_64"
readme = "README.md"
keywords = ["amd64", "x86", "x86_64", "no_std"]
license = "MIT"
[[test]]
name = "no_std_build"
harness = false
[dependencies]
bit_field = "0.7.0"
## Instruction:
Update description and repo link
## Code After:
[package]
name = "x86_64"
version = "0.1.0"
authors = [
"Gerd Zellweger <mail@gerdzellweger.com>",
"Eric Kidd <git@randomhacks.net>",
"Philipp Oppermann <dev@phil-opp.com>",
"Dan Schatzberg <schatzberg.dan@gmail.com>",
"John Ericson <John_Ericson@Yahoo.com>",
"Rex Lunae <rexlunae@gmail.com>"
]
description = "Provides x86_64 specific functions and data structures, and access to various system registers."
repository = "https://github.com/phil-opp/x86_64"
documentation = "https://docs.rs/x86_64"
readme = "README.md"
keywords = ["amd64", "x86", "x86_64", "no_std"]
license = "MIT"
[[test]]
name = "no_std_build"
harness = false
[dependencies]
bit_field = "0.7.0"
| [package]
name = "x86_64"
version = "0.1.0"
authors = [
"Gerd Zellweger <mail@gerdzellweger.com>",
"Eric Kidd <git@randomhacks.net>",
"Philipp Oppermann <dev@phil-opp.com>",
"Dan Schatzberg <schatzberg.dan@gmail.com>",
"John Ericson <John_Ericson@Yahoo.com>",
"Rex Lunae <rexlunae@gmail.com>"
]
- description = "Library to program x86_64 hardware. Contains x86_64 specific data structure descriptions, data-tables, as well as convenience function to call assembly instructions. This is a fork of the `x86` crate, specialized for x86_64."
+ description = "Provides x86_64 specific functions and data structures, and access to various system registers."
- repository = "https://github.com/phil-opp/rust-x86_64"
? -----
+ repository = "https://github.com/phil-opp/x86_64"
documentation = "https://docs.rs/x86_64"
readme = "README.md"
keywords = ["amd64", "x86", "x86_64", "no_std"]
license = "MIT"
[[test]]
name = "no_std_build"
harness = false
[dependencies]
bit_field = "0.7.0" | 4 | 0.148148 | 2 | 2 |
5d3dd6b8ad9ec696fe1bbd0cc8f1eea27e9bc92b | routes/recipes/index.js | routes/recipes/index.js | var express = require('express');
var router = express.Router();
var JsonDB = require('node-json-db');
var _ = require('lodash');
// Recipes listing
router.get('/', function (req, res, next) {
var db = new JsonDB('db', false, false);
var recipes = db.getData('/recipes');
// Expand requested resources if they exist
// The resource to expand is singular, e.g.
// to expand 'users' we provide _expand=user
var expand = req.query._expand;
if (expand) {
try {
var relation = db.getData('/' + expand + 's');
_(recipes)
.forEach(function (recipe) {
recipe.user = _(relation).findWhere({ id: recipe[expand + 'Id'] });
delete recipe[expand + 'Id'];
})
.value();
}
catch(err) {
console.log(err);
}
}
res.json(recipes);
});
module.exports = router;
| var express = require('express');
var router = express.Router();
var JsonDB = require('node-json-db');
var _ = require('lodash');
// Recipes listing
router.get('/', function (req, res, next) {
var db = new JsonDB('db', false, false);
var recipes = db.getData('/recipes');
// Expand requested resources if they exist
// The resource to expand is singular, e.g.
// to expand 'users' we provide _expand=user
var expand = req.query._expand;
if (expand) {
try {
var relation = db.getData('/' + expand + 's');
_(recipes)
.forEach(function (recipe) {
recipe[expand] = _(relation).find({ id: recipe[expand + 'Id'] });
delete recipe[expand + 'Id'];
})
.value();
}
catch(err) {
console.log(err);
}
}
res.json(recipes);
});
module.exports = router;
| Replace direct reference to user in the recipe list api | Replace direct reference to user in the recipe list api
| JavaScript | mit | adamsea/recipes-api | javascript | ## Code Before:
var express = require('express');
var router = express.Router();
var JsonDB = require('node-json-db');
var _ = require('lodash');
// Recipes listing
router.get('/', function (req, res, next) {
var db = new JsonDB('db', false, false);
var recipes = db.getData('/recipes');
// Expand requested resources if they exist
// The resource to expand is singular, e.g.
// to expand 'users' we provide _expand=user
var expand = req.query._expand;
if (expand) {
try {
var relation = db.getData('/' + expand + 's');
_(recipes)
.forEach(function (recipe) {
recipe.user = _(relation).findWhere({ id: recipe[expand + 'Id'] });
delete recipe[expand + 'Id'];
})
.value();
}
catch(err) {
console.log(err);
}
}
res.json(recipes);
});
module.exports = router;
## Instruction:
Replace direct reference to user in the recipe list api
## Code After:
var express = require('express');
var router = express.Router();
var JsonDB = require('node-json-db');
var _ = require('lodash');
// Recipes listing
router.get('/', function (req, res, next) {
var db = new JsonDB('db', false, false);
var recipes = db.getData('/recipes');
// Expand requested resources if they exist
// The resource to expand is singular, e.g.
// to expand 'users' we provide _expand=user
var expand = req.query._expand;
if (expand) {
try {
var relation = db.getData('/' + expand + 's');
_(recipes)
.forEach(function (recipe) {
recipe[expand] = _(relation).find({ id: recipe[expand + 'Id'] });
delete recipe[expand + 'Id'];
})
.value();
}
catch(err) {
console.log(err);
}
}
res.json(recipes);
});
module.exports = router;
| var express = require('express');
var router = express.Router();
var JsonDB = require('node-json-db');
var _ = require('lodash');
// Recipes listing
router.get('/', function (req, res, next) {
var db = new JsonDB('db', false, false);
var recipes = db.getData('/recipes');
// Expand requested resources if they exist
// The resource to expand is singular, e.g.
// to expand 'users' we provide _expand=user
var expand = req.query._expand;
if (expand) {
try {
var relation = db.getData('/' + expand + 's');
_(recipes)
.forEach(function (recipe) {
- recipe.user = _(relation).findWhere({ id: recipe[expand + 'Id'] });
? ^^^ ^ -----
+ recipe[expand] = _(relation).find({ id: recipe[expand + 'Id'] });
? ^ ^^^^^^
delete recipe[expand + 'Id'];
})
.value();
}
catch(err) {
console.log(err);
}
}
res.json(recipes);
});
module.exports = router; | 2 | 0.060606 | 1 | 1 |
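The change in this record swaps lodash's `findWhere` for `find`: `_.findWhere` existed in lodash 3 but was removed in lodash 4, where `_.find` accepts the same object-shorthand predicate. The dynamic-key expansion itself needs nothing lodash-specific — a rough sketch of the same `_expand` lookup using only built-in `Array.prototype.find`, with hypothetical sample data standing in for the demo's JSON "database":

```javascript
// Resolve a foreign key like `userId` into an embedded `user` object,
// mirroring the dynamic-key expansion in the route above.
function expandRelation(records, relation, expand) {
  records.forEach(function (record) {
    var foreignKey = record[expand + 'Id'];
    record[expand] = relation.find(function (item) {
      return item.id === foreignKey;
    });
    delete record[expand + 'Id'];
  });
  return records;
}

// Hypothetical data; field names are illustrative only.
var recipes = [{ title: 'Soup', userId: 1 }];
var users = [{ id: 1, name: 'Ada' }];
expandRelation(recipes, users, 'user');
// recipes[0] is now { title: 'Soup', user: { id: 1, name: 'Ada' } }
```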
e78cef9dd057d567b52ed273e902b51f5f42aafb | web/models/skill.ex | web/models/skill.ex | defmodule CodeCorps.Skill do
use CodeCorps.Web, :model
import CodeCorps.Validators.SlugValidator
alias Inflex
schema "skills" do
field :title, :string
field :description, :string
field :original_row, :integer
field :slug, :string
timestamps()
end
@doc """
Builds a changeset based on the `struct` and `params`.
"""
def changeset(struct, params \\ %{}) do
struct
|> cast(params, [:title, :description, :original_row, :slug])
|> update_slug()
|> validate_required([:title, :slug])
|> validate_slug(:slug)
|> unique_constraint(:slug)
end
def update_slug(changeset) do
case changeset do
%Ecto.Changeset{changes: %{title: title}} ->
slug = Inflex.parameterize(title)
put_change(changeset, :slug, slug)
_ ->
changeset
end
end
end
| defmodule CodeCorps.Skill do
use CodeCorps.Web, :model
import CodeCorps.Validators.SlugValidator
import CodeCorps.ModelHelpers
schema "skills" do
field :title, :string
field :description, :string
field :original_row, :integer
field :slug, :string
timestamps()
end
@doc """
Builds a changeset based on the `struct` and `params`.
"""
def changeset(struct, params \\ %{}) do
struct
|> cast(params, [:title, :description, :original_row, :slug])
|> validate_required(:title)
|> generate_slug(:title, :slug)
|> validate_required(:slug)
|> validate_slug(:slug)
|> unique_constraint(:slug)
end
end
| Update slug generation to use model helper | Update slug generation to use model helper
| Elixir | mit | code-corps/code-corps-api,crodriguez1a/code-corps-api,code-corps/code-corps-api,crodriguez1a/code-corps-api | elixir | ## Code Before:
defmodule CodeCorps.Skill do
use CodeCorps.Web, :model
import CodeCorps.Validators.SlugValidator
alias Inflex
schema "skills" do
field :title, :string
field :description, :string
field :original_row, :integer
field :slug, :string
timestamps()
end
@doc """
Builds a changeset based on the `struct` and `params`.
"""
def changeset(struct, params \\ %{}) do
struct
|> cast(params, [:title, :description, :original_row, :slug])
|> update_slug()
|> validate_required([:title, :slug])
|> validate_slug(:slug)
|> unique_constraint(:slug)
end
def update_slug(changeset) do
case changeset do
%Ecto.Changeset{changes: %{title: title}} ->
slug = Inflex.parameterize(title)
put_change(changeset, :slug, slug)
_ ->
changeset
end
end
end
## Instruction:
Update slug generation to use model helper
## Code After:
defmodule CodeCorps.Skill do
use CodeCorps.Web, :model
import CodeCorps.Validators.SlugValidator
import CodeCorps.ModelHelpers
schema "skills" do
field :title, :string
field :description, :string
field :original_row, :integer
field :slug, :string
timestamps()
end
@doc """
Builds a changeset based on the `struct` and `params`.
"""
def changeset(struct, params \\ %{}) do
struct
|> cast(params, [:title, :description, :original_row, :slug])
|> validate_required(:title)
|> generate_slug(:title, :slug)
|> validate_required(:slug)
|> validate_slug(:slug)
|> unique_constraint(:slug)
end
end
| defmodule CodeCorps.Skill do
use CodeCorps.Web, :model
import CodeCorps.Validators.SlugValidator
- alias Inflex
+ import CodeCorps.ModelHelpers
schema "skills" do
field :title, :string
field :description, :string
field :original_row, :integer
field :slug, :string
timestamps()
end
@doc """
Builds a changeset based on the `struct` and `params`.
"""
def changeset(struct, params \\ %{}) do
struct
|> cast(params, [:title, :description, :original_row, :slug])
- |> update_slug()
- |> validate_required([:title, :slug])
? - --------
+ |> validate_required(:title)
+ |> generate_slug(:title, :slug)
+ |> validate_required(:slug)
|> validate_slug(:slug)
|> unique_constraint(:slug)
end
-
- def update_slug(changeset) do
- case changeset do
- %Ecto.Changeset{changes: %{title: title}} ->
- slug = Inflex.parameterize(title)
- put_change(changeset, :slug, slug)
- _ ->
- changeset
- end
-
- end
end | 18 | 0.486486 | 4 | 14 |
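Both versions of this changeset derive the slug from the title; the old code called `Inflex.parameterize` inline, while the new code delegates to a shared `generate_slug` helper in `CodeCorps.ModelHelpers`. Roughly, parameterizing lower-cases the string and collapses runs of non-alphanumeric characters into single hyphens. A string-level sketch of that transformation (the exact edge-case handling of Inflex is an assumption):

```javascript
// Approximation of Inflex.parameterize/1: lower-case, turn runs of
// non-alphanumeric characters into single hyphens, trim the ends.
function parameterize(title) {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-') // collapse separators into hyphens
    .replace(/^-+|-+$/g, '');    // strip leading/trailing hyphens
}

var slug = parameterize('Ruby on Rails!'); // 'ruby-on-rails'
```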
155951c77d95a7dda5dd3e697d404ecfa36d16fe | .cirrus.yml | .cirrus.yml | task:
matrix:
- name: Test (JDK 8)
container:
image: gradle:jdk8
- name: Test (JDK 11)
container:
image: gradle:jdk11
gradle_cache:
folder: ~/.gradle/caches
check_script: gradle check --stacktrace
cleanup_before_cache_script:
- rm -rf ~/.gradle/caches/$GRADLE_VERSION/
- rm -f ~/.gradle/caches/user-id.txt
- rm -f ~/.gradle/caches/journal-1/file-access.bin
- find ~/.gradle/caches/ -name "*.lock" -type f -delete
on_failure:
junit_artifacts:
path: "**/test-results/**/*.xml"
format: junit
deploy_task:
only_if: $CIRRUS_TAG != ''
container:
image: gradle:jdk8
depends_on:
- Test (JDK 8)
- Test (JDK 11)
environment:
BINTRAY_API_KEY: ENCRYPTED[d86059880472037e15bb3ab28eb47726b84ff6d0dd5a0a8a40476dfc3aaa3e11ff34644a9df6c77c3036cca63d82ac54]
deploy_script: gradle bintrayUpload --info
| task:
matrix:
- name: Test (JDK 11)
container:
image: gradle:jdk11
gradle_cache:
folder: ~/.gradle/caches
check_script: gradle check --stacktrace
cleanup_before_cache_script:
- rm -rf ~/.gradle/caches/$GRADLE_VERSION/
- rm -f ~/.gradle/caches/user-id.txt
- rm -f ~/.gradle/caches/journal-1/file-access.bin
- find ~/.gradle/caches/ -name "*.lock" -type f -delete
on_failure:
junit_artifacts:
path: "**/test-results/**/*.xml"
format: junit
deploy_task:
only_if: $CIRRUS_TAG != ''
container:
image: gradle:jdk8
depends_on:
- Test (JDK 11)
environment:
BINTRAY_API_KEY: ENCRYPTED[d86059880472037e15bb3ab28eb47726b84ff6d0dd5a0a8a40476dfc3aaa3e11ff34644a9df6c77c3036cca63d82ac54]
deploy_script: gradle bintrayUpload --info
| Test only on Java 11 | Test only on Java 11
| YAML | mit | fkorotkov/k8s-kotlin-dsl | yaml | ## Code Before:
task:
matrix:
- name: Test (JDK 8)
container:
image: gradle:jdk8
- name: Test (JDK 11)
container:
image: gradle:jdk11
gradle_cache:
folder: ~/.gradle/caches
check_script: gradle check --stacktrace
cleanup_before_cache_script:
- rm -rf ~/.gradle/caches/$GRADLE_VERSION/
- rm -f ~/.gradle/caches/user-id.txt
- rm -f ~/.gradle/caches/journal-1/file-access.bin
- find ~/.gradle/caches/ -name "*.lock" -type f -delete
on_failure:
junit_artifacts:
path: "**/test-results/**/*.xml"
format: junit
deploy_task:
only_if: $CIRRUS_TAG != ''
container:
image: gradle:jdk8
depends_on:
- Test (JDK 8)
- Test (JDK 11)
environment:
BINTRAY_API_KEY: ENCRYPTED[d86059880472037e15bb3ab28eb47726b84ff6d0dd5a0a8a40476dfc3aaa3e11ff34644a9df6c77c3036cca63d82ac54]
deploy_script: gradle bintrayUpload --info
## Instruction:
Test only on Java 11
## Code After:
task:
matrix:
- name: Test (JDK 11)
container:
image: gradle:jdk11
gradle_cache:
folder: ~/.gradle/caches
check_script: gradle check --stacktrace
cleanup_before_cache_script:
- rm -rf ~/.gradle/caches/$GRADLE_VERSION/
- rm -f ~/.gradle/caches/user-id.txt
- rm -f ~/.gradle/caches/journal-1/file-access.bin
- find ~/.gradle/caches/ -name "*.lock" -type f -delete
on_failure:
junit_artifacts:
path: "**/test-results/**/*.xml"
format: junit
deploy_task:
only_if: $CIRRUS_TAG != ''
container:
image: gradle:jdk8
depends_on:
- Test (JDK 11)
environment:
BINTRAY_API_KEY: ENCRYPTED[d86059880472037e15bb3ab28eb47726b84ff6d0dd5a0a8a40476dfc3aaa3e11ff34644a9df6c77c3036cca63d82ac54]
deploy_script: gradle bintrayUpload --info
| task:
matrix:
- - name: Test (JDK 8)
- container:
- image: gradle:jdk8
- name: Test (JDK 11)
container:
image: gradle:jdk11
gradle_cache:
folder: ~/.gradle/caches
check_script: gradle check --stacktrace
cleanup_before_cache_script:
- rm -rf ~/.gradle/caches/$GRADLE_VERSION/
- rm -f ~/.gradle/caches/user-id.txt
- rm -f ~/.gradle/caches/journal-1/file-access.bin
- find ~/.gradle/caches/ -name "*.lock" -type f -delete
on_failure:
junit_artifacts:
path: "**/test-results/**/*.xml"
format: junit
deploy_task:
only_if: $CIRRUS_TAG != ''
container:
image: gradle:jdk8
depends_on:
- - Test (JDK 8)
- Test (JDK 11)
environment:
BINTRAY_API_KEY: ENCRYPTED[d86059880472037e15bb3ab28eb47726b84ff6d0dd5a0a8a40476dfc3aaa3e11ff34644a9df6c77c3036cca63d82ac54]
deploy_script: gradle bintrayUpload --info | 4 | 0.129032 | 0 | 4 |
058e9219e85819063d53d1fa0a1f0f0bc0fe7eb0 | .travis.yml | .travis.yml | language: java
jdk:
- openjdk-ea
- openjdk13
- openjdk12
- openjdk11
- openjdk8
script:
- ./mvnw clean verify
- ./mvnw clean verify -Dspring.version=5.1.12.RELEASE -Dspring-batch.version=4.1.3.RELEASE
after_success:
- chmod -R 777 ./travis/after_success.sh
- ./travis/after_success.sh
env:
global:
- secure: "dPX83x9q53WLJjNEsOJZTj2yjcusMp3Rg2SeF9xGUgxLa0NAdiWKM/ejSiJO\nzRM+5nmnQGb4SSigqb99N0ndB0S5YhMzbpACD8+SmCfpYRPdUQtE5dW22xpd\nPGPPUwPAGcqISwq5lcFCbPeve8k4g5Co/ZWGMMkoYy8DfjRss6g="
- secure: "X/+tqgmKqR6wIvSLzDyYAc3Q0NtzFjnQZT7b4yD9MI+/9S1bLFyZJ8mLhymK\nEakyrz8syFAl38ebl2pGox3yaJ9GRfcQXjJ2Qv0Pgb0r+RQTrtU2Fpvb+3Nr\nh5ymtRIkU0XOpOBCq5M9AxV1TESftw3p1IUD9dzPilRqQdXzTGs="
| language: java
jdk:
- openjdk-ea
- openjdk13
- openjdk11
- openjdk8
script:
- ./mvnw clean verify
- ./mvnw clean verify -Dspring.version=5.1.12.RELEASE -Dspring-batch.version=4.1.3.RELEASE
after_success:
- chmod -R 777 ./travis/after_success.sh
- ./travis/after_success.sh
env:
global:
- secure: "dPX83x9q53WLJjNEsOJZTj2yjcusMp3Rg2SeF9xGUgxLa0NAdiWKM/ejSiJO\nzRM+5nmnQGb4SSigqb99N0ndB0S5YhMzbpACD8+SmCfpYRPdUQtE5dW22xpd\nPGPPUwPAGcqISwq5lcFCbPeve8k4g5Co/ZWGMMkoYy8DfjRss6g="
- secure: "X/+tqgmKqR6wIvSLzDyYAc3Q0NtzFjnQZT7b4yD9MI+/9S1bLFyZJ8mLhymK\nEakyrz8syFAl38ebl2pGox3yaJ9GRfcQXjJ2Qv0Pgb0r+RQTrtU2Fpvb+3Nr\nh5ymtRIkU0XOpOBCq5M9AxV1TESftw3p1IUD9dzPilRqQdXzTGs="
| Drop openjdk12 on Travis CI Fixes gh-431 | Drop openjdk12 on Travis CI
Fixes gh-431
| YAML | apache-2.0 | hazendaz/spring,pboonphong/mybatis-spring,kazuki43zoo/spring,hazendaz/spring,pboonphong/mybatis-spring,kazuki43zoo/spring,mybatis/spring,mosoft521/spring,mosoft521/spring,mybatis/spring | yaml | ## Code Before:
language: java
jdk:
- openjdk-ea
- openjdk13
- openjdk12
- openjdk11
- openjdk8
script:
- ./mvnw clean verify
- ./mvnw clean verify -Dspring.version=5.1.12.RELEASE -Dspring-batch.version=4.1.3.RELEASE
after_success:
- chmod -R 777 ./travis/after_success.sh
- ./travis/after_success.sh
env:
global:
- secure: "dPX83x9q53WLJjNEsOJZTj2yjcusMp3Rg2SeF9xGUgxLa0NAdiWKM/ejSiJO\nzRM+5nmnQGb4SSigqb99N0ndB0S5YhMzbpACD8+SmCfpYRPdUQtE5dW22xpd\nPGPPUwPAGcqISwq5lcFCbPeve8k4g5Co/ZWGMMkoYy8DfjRss6g="
- secure: "X/+tqgmKqR6wIvSLzDyYAc3Q0NtzFjnQZT7b4yD9MI+/9S1bLFyZJ8mLhymK\nEakyrz8syFAl38ebl2pGox3yaJ9GRfcQXjJ2Qv0Pgb0r+RQTrtU2Fpvb+3Nr\nh5ymtRIkU0XOpOBCq5M9AxV1TESftw3p1IUD9dzPilRqQdXzTGs="
## Instruction:
Drop openjdk12 on Travis CI
Fixes gh-431
## Code After:
language: java
jdk:
- openjdk-ea
- openjdk13
- openjdk11
- openjdk8
script:
- ./mvnw clean verify
- ./mvnw clean verify -Dspring.version=5.1.12.RELEASE -Dspring-batch.version=4.1.3.RELEASE
after_success:
- chmod -R 777 ./travis/after_success.sh
- ./travis/after_success.sh
env:
global:
- secure: "dPX83x9q53WLJjNEsOJZTj2yjcusMp3Rg2SeF9xGUgxLa0NAdiWKM/ejSiJO\nzRM+5nmnQGb4SSigqb99N0ndB0S5YhMzbpACD8+SmCfpYRPdUQtE5dW22xpd\nPGPPUwPAGcqISwq5lcFCbPeve8k4g5Co/ZWGMMkoYy8DfjRss6g="
- secure: "X/+tqgmKqR6wIvSLzDyYAc3Q0NtzFjnQZT7b4yD9MI+/9S1bLFyZJ8mLhymK\nEakyrz8syFAl38ebl2pGox3yaJ9GRfcQXjJ2Qv0Pgb0r+RQTrtU2Fpvb+3Nr\nh5ymtRIkU0XOpOBCq5M9AxV1TESftw3p1IUD9dzPilRqQdXzTGs="
| language: java
jdk:
- openjdk-ea
- openjdk13
- - openjdk12
- openjdk11
- openjdk8
script:
- ./mvnw clean verify
- ./mvnw clean verify -Dspring.version=5.1.12.RELEASE -Dspring-batch.version=4.1.3.RELEASE
after_success:
- chmod -R 777 ./travis/after_success.sh
- ./travis/after_success.sh
env:
global:
- secure: "dPX83x9q53WLJjNEsOJZTj2yjcusMp3Rg2SeF9xGUgxLa0NAdiWKM/ejSiJO\nzRM+5nmnQGb4SSigqb99N0ndB0S5YhMzbpACD8+SmCfpYRPdUQtE5dW22xpd\nPGPPUwPAGcqISwq5lcFCbPeve8k4g5Co/ZWGMMkoYy8DfjRss6g="
- secure: "X/+tqgmKqR6wIvSLzDyYAc3Q0NtzFjnQZT7b4yD9MI+/9S1bLFyZJ8mLhymK\nEakyrz8syFAl38ebl2pGox3yaJ9GRfcQXjJ2Qv0Pgb0r+RQTrtU2Fpvb+3Nr\nh5ymtRIkU0XOpOBCq5M9AxV1TESftw3p1IUD9dzPilRqQdXzTGs=" | 1 | 0.047619 | 0 | 1 |
3178012eef4c29cce5dbcc2c82d7ad1eef989969 | app.js | app.js | /**
* Copyright 2014 IBM Corp. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
var express = require('express'),
app = express(),
bluemix = require('./config/bluemix'),
watson = require('watson-developer-cloud'),
// environmental variable points to demo's json config file
config = require(process.env.WATSON_OPTIONS_FILE),
extend = require('util')._extend;
// if bluemix credentials exists, then override local
var credentials = extend(config, bluemix.getServiceCreds('text_to_speech'));
// Create the service wrapper
var textToSpeech = new watson.text_to_speech(credentials);
// Configure express
require('./config/express')(app, textToSpeech);
var port = process.env.VCAP_APP_PORT || 3000;
app.listen(port);
console.log('listening at:', port); | /**
* Copyright 2014 IBM Corp. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
var express = require('express'),
app = express(),
bluemix = require('./config/bluemix'),
watson = require('watson-developer-cloud'),
// environmental variable points to demo's json config file
config = require(process.env.WATSON_CONFIG_FILE),
extend = require('util')._extend;
// if bluemix credentials exists, then override local
var credentials = extend(config, bluemix.getServiceCreds('text_to_speech'));
// Create the service wrapper
var textToSpeech = new watson.text_to_speech(config);
// Configure express
require('./config/express')(app, textToSpeech);
var port = process.env.VCAP_APP_PORT || 3000;
app.listen(port);
console.log('listening at:', port); | Update Express config environmental file name | Update Express config environmental file name
| JavaScript | apache-2.0 | hmagph/otcnov10demo,sktay/text-to-speech-nodejs,skaegi/text-to-speech-nodejs-with-saucelabs,hmagph/text-to-speech-nodejs-with-test,hmagph/text-to-speech-nodejs-with-test,skaegi/text-to-speech-nodejs-with-saucelabs,hmagph/text-to-speech-nodejs-with-saucelabs,watson-developer-cloud/text-to-speech-nodejs,dou800/text-to-speech-nodejs,hmagph/otcnov10demo,dou800/text-to-speech-nodejs,hmagph/text-to-speech-nodejs-with-saucelabs,sktay/text-to-speech-nodejs,colinsf/text-to-speech-nodejs,colinsf/text-to-speech-nodejs | javascript | ## Code Before:
/**
* Copyright 2014 IBM Corp. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
var express = require('express'),
app = express(),
bluemix = require('./config/bluemix'),
watson = require('watson-developer-cloud'),
// environmental variable points to demo's json config file
config = require(process.env.WATSON_OPTIONS_FILE),
extend = require('util')._extend;
// if bluemix credentials exists, then override local
var credentials = extend(config, bluemix.getServiceCreds('text_to_speech'));
// Create the service wrapper
var textToSpeech = new watson.text_to_speech(credentials);
// Configure express
require('./config/express')(app, textToSpeech);
var port = process.env.VCAP_APP_PORT || 3000;
app.listen(port);
console.log('listening at:', port);
## Instruction:
Update Express config environmental file name
## Code After:
/**
* Copyright 2014 IBM Corp. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
var express = require('express'),
app = express(),
bluemix = require('./config/bluemix'),
watson = require('watson-developer-cloud'),
// environmental variable points to demo's json config file
config = require(process.env.WATSON_CONFIG_FILE),
extend = require('util')._extend;
// if bluemix credentials exists, then override local
var credentials = extend(config, bluemix.getServiceCreds('text_to_speech'));
// Create the service wrapper
var textToSpeech = new watson.text_to_speech(config);
// Configure express
require('./config/express')(app, textToSpeech);
var port = process.env.VCAP_APP_PORT || 3000;
app.listen(port);
console.log('listening at:', port); | /**
* Copyright 2014 IBM Corp. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
var express = require('express'),
app = express(),
bluemix = require('./config/bluemix'),
watson = require('watson-developer-cloud'),
// environmental variable points to demo's json config file
- config = require(process.env.WATSON_OPTIONS_FILE),
? ^^^^ ^
+ config = require(process.env.WATSON_CONFIG_FILE),
? ^ ^^^
extend = require('util')._extend;
// if bluemix credentials exists, then override local
var credentials = extend(config, bluemix.getServiceCreds('text_to_speech'));
// Create the service wrapper
- var textToSpeech = new watson.text_to_speech(credentials);
? ^^^^ ^ ^^^
+ var textToSpeech = new watson.text_to_speech(config);
? ^ ^ ^
// Configure express
require('./config/express')(app, textToSpeech);
var port = process.env.VCAP_APP_PORT || 3000;
app.listen(port);
console.log('listening at:', port); | 4 | 0.105263 | 2 | 2 |
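Besides renaming the environment variable, the diff above also changes the constructor argument from `credentials` to `config`. With Node's `util._extend` (which behaves like `Object.assign`), the merge mutates and returns its first argument, so `credentials` and `config` end up referencing the same merged object and that second change is cosmetic. A small sketch of the aliasing, using `Object.assign` and made-up placeholder fields:

```javascript
// Object.assign(target, source) copies source's own properties onto
// target and returns target itself, just as util._extend did.
var config = { url: 'https://example.test', username: 'local-user' };
var bluemixCreds = { username: 'vcap-user', password: 'secret' };

// Equivalent of: var credentials = extend(config, bluemixCreds);
var credentials = Object.assign(config, bluemixCreds);

// Both names point at the same object, with the service-bound
// values overriding the local ones.
var sameObject = credentials === config;
```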
b91ca76f545f4140ffd6f7e1ef66dc716f49b20c | client/src/App.css | client/src/App.css | .App {
text-align: center;
}
.App-logo {
animation: App-logo-spin infinite 20s linear;
height: 80px;
}
.App-header {
background-color: #222;
height: 150px;
padding: 20px;
color: white;
}
.App-intro {
font-size: large;
}
@keyframes App-logo-spin {
from { transform: rotate(0deg); }
to { transform: rotate(360deg); }
}
| .App {
text-align: center;
}
.App-logo {
animation: App-logo-spin infinite 20s linear;
height: 80px;
}
.App-header {
background-color: #222;
/*height: 150px;
padding: 20px;*/
color: white;
}
.App-intro {
font-size: large;
}
.App nav {
background: #282828;
padding:8px;
color:white;
}
.App nav a {
color: #9D9B91;
text-decoration: none;
display: inline-block;
background: #303030;
padding:8px;
margin: 5px;
}
@keyframes App-logo-spin {
from { transform: rotate(0deg); }
to { transform: rotate(360deg); }
}
| Add some css love :/ | Add some css love :/
| CSS | mit | magbicaleman/go-neo4j-apollo,magbicaleman/go-neo4j-apollo,magbicaleman/go-neo4j-apollo | css | ## Code Before:
.App {
text-align: center;
}
.App-logo {
animation: App-logo-spin infinite 20s linear;
height: 80px;
}
.App-header {
background-color: #222;
height: 150px;
padding: 20px;
color: white;
}
.App-intro {
font-size: large;
}
@keyframes App-logo-spin {
from { transform: rotate(0deg); }
to { transform: rotate(360deg); }
}
## Instruction:
Add some css love :/
## Code After:
.App {
text-align: center;
}
.App-logo {
animation: App-logo-spin infinite 20s linear;
height: 80px;
}
.App-header {
background-color: #222;
/*height: 150px;
padding: 20px;*/
color: white;
}
.App-intro {
font-size: large;
}
.App nav {
background: #282828;
padding:8px;
color:white;
}
.App nav a {
color: #9D9B91;
text-decoration: none;
display: inline-block;
background: #303030;
padding:8px;
margin: 5px;
}
@keyframes App-logo-spin {
from { transform: rotate(0deg); }
to { transform: rotate(360deg); }
}
| .App {
text-align: center;
}
.App-logo {
animation: App-logo-spin infinite 20s linear;
height: 80px;
}
.App-header {
background-color: #222;
- height: 150px;
+ /*height: 150px;
? ++
- padding: 20px;
+ padding: 20px;*/
? ++
color: white;
}
.App-intro {
font-size: large;
}
+ .App nav {
+ background: #282828;
+ padding:8px;
+ color:white;
+ }
+
+ .App nav a {
+ color: #9D9B91;
+ text-decoration: none;
+ display: inline-block;
+ background: #303030;
+ padding:8px;
+ margin: 5px;
+ }
+
@keyframes App-logo-spin {
from { transform: rotate(0deg); }
to { transform: rotate(360deg); }
} | 19 | 0.791667 | 17 | 2 |
43572aaaa0b47311393be9ab0d8ab90552c92436 | tsan_suppressions.txt | tsan_suppressions.txt | race:crossbeam_epoch::internal::Global::collect
| race:crossbeam_epoch::internal::Global::collect
# Rayon relies on crossbeam stuff which uses raw fences, which are
# not detected by TSAN.
race:rayon
| Add rayon to TSAN suppressions file | Add rayon to TSAN suppressions file
| Text | apache-2.0 | spacejam/sled,spacejam/sled,spacejam/sled | text | ## Code Before:
race:crossbeam_epoch::internal::Global::collect
## Instruction:
Add rayon to TSAN suppressions file
## Code After:
race:crossbeam_epoch::internal::Global::collect
# Rayon relies on crossbeam stuff which uses raw fences, which are
# not detected by TSAN.
race:rayon
| race:crossbeam_epoch::internal::Global::collect
+
+ # Rayon relies on crossbeam stuff which uses raw fences, which are
+ # not detected by TSAN.
+ race:rayon | 4 | 4 | 4 | 0 |
00dd492bbc1488380445c0b75e8f2d05c60f9de1 | doc/filters/url_encode.rst | doc/filters/url_encode.rst | ``url_encode``
==============
The ``url_encode`` filter percent encodes a given string as URL segment
or an array as query string:
.. code-block:: twig
{{ "path-seg*ment"|url_encode }}
{# outputs "path-seg%2Ament" #}
{{ "string with spaces"|url_encode }}
{# outputs "string%20with%20spaces" #}
{{ {'param': 'value', 'foo': 'bar'}|url_encode }}
{# outputs "param=value&foo=bar" #}
.. note::
Internally, Twig uses the PHP ``rawurlencode``.
.. _`rawurlencode`: https://secure.php.net/rawurlencode
| ``url_encode``
==============
The ``url_encode`` filter percent encodes a given string as URL segment
or an array as query string:
.. code-block:: twig
{{ "path-seg*ment"|url_encode }}
{# outputs "path-seg%2Ament" #}
{{ "string with spaces"|url_encode }}
{# outputs "string%20with%20spaces" #}
{{ {'param': 'value', 'foo': 'bar'}|url_encode }}
{# outputs "param=value&foo=bar" #}
.. note::
Internally, Twig uses the PHP `rawurlencode`_ or the `http_build_query`_ function.
.. _`rawurlencode`: https://secure.php.net/rawurlencode
.. _`http_build_query`: https://secure.php.net/http_build_query
| Fix missing http_build_query function call | Fix missing http_build_query function call
| reStructuredText | bsd-3-clause | twigphp/Twig,twigphp/Twig,twigphp/Twig | restructuredtext | ## Code Before:
``url_encode``
==============
The ``url_encode`` filter percent encodes a given string as URL segment
or an array as query string:
.. code-block:: twig
{{ "path-seg*ment"|url_encode }}
{# outputs "path-seg%2Ament" #}
{{ "string with spaces"|url_encode }}
{# outputs "string%20with%20spaces" #}
{{ {'param': 'value', 'foo': 'bar'}|url_encode }}
{# outputs "param=value&foo=bar" #}
.. note::
Internally, Twig uses the PHP ``rawurlencode``.
.. _`rawurlencode`: https://secure.php.net/rawurlencode
## Instruction:
Fix missing http_build_query function call
## Code After:
``url_encode``
==============
The ``url_encode`` filter percent encodes a given string as URL segment
or an array as query string:
.. code-block:: twig
{{ "path-seg*ment"|url_encode }}
{# outputs "path-seg%2Ament" #}
{{ "string with spaces"|url_encode }}
{# outputs "string%20with%20spaces" #}
{{ {'param': 'value', 'foo': 'bar'}|url_encode }}
{# outputs "param=value&foo=bar" #}
.. note::
Internally, Twig uses the PHP `rawurlencode`_ or the `http_build_query`_ function.
.. _`rawurlencode`: https://secure.php.net/rawurlencode
.. _`http_build_query`: https://secure.php.net/http_build_query
| ``url_encode``
==============
The ``url_encode`` filter percent encodes a given string as URL segment
or an array as query string:
.. code-block:: twig
{{ "path-seg*ment"|url_encode }}
{# outputs "path-seg%2Ament" #}
{{ "string with spaces"|url_encode }}
{# outputs "string%20with%20spaces" #}
{{ {'param': 'value', 'foo': 'bar'}|url_encode }}
{# outputs "param=value&foo=bar" #}
.. note::
- Internally, Twig uses the PHP ``rawurlencode``.
+ Internally, Twig uses the PHP `rawurlencode`_ or the `http_build_query`_ function.
.. _`rawurlencode`: https://secure.php.net/rawurlencode
+ .. _`http_build_query`: https://secure.php.net/http_build_query | 3 | 0.136364 | 2 | 1 |
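The Twig note above distinguishes percent-encoding a single URL segment (PHP's `rawurlencode`) from serializing an array into a query string (PHP's `http_build_query`). JavaScript has the same split between `encodeURIComponent` and `URLSearchParams`, with one caveat: `encodeURIComponent` leaves `*` unescaped, whereas `rawurlencode` yields `%2A` for it. A quick sketch of the two analogues:

```javascript
// Encode one path segment, analogous to PHP's rawurlencode
// (note: unlike rawurlencode, '*' would pass through unescaped here).
var segment = encodeURIComponent('string with spaces');
// 'string%20with%20spaces'

// Serialize key/value pairs into a query string, analogous to
// PHP's http_build_query.
var query = new URLSearchParams({ param: 'value', foo: 'bar' }).toString();
// 'param=value&foo=bar'
```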
f2a424a85b53cc10bdce29e2bc085e2fea205e52 | src/charming.js | src/charming.js | /* globals define */
var charming = function() {
'use strict';
return function(elem, opts) {
opts = opts || {};
var tagName = opts.tagName || 'span',
classPrefix = typeof opts.classPrefix !== 'undefined' ? opts.classPrefix : 'char',
count = 1;
var traverse = function(elem) {
var childNodes = Array.prototype.slice.call(elem.childNodes); // static array of nodes
for (var i = 0, len = childNodes.length; i < len; i++) {
traverse(childNodes[i]);
}
if (elem.nodeType === Node.TEXT_NODE) {
inject(elem);
}
};
var inject = function(elem) {
var str = elem.nodeValue,
parentNode = elem.parentNode;
for (var i = 0, len = str.length; i < len; i++) {
var node = document.createElement(tagName);
if (classPrefix) {
node.className = classPrefix + count;
count++;
}
node.appendChild(document.createTextNode(str[i]));
parentNode.insertBefore(node, elem);
}
parentNode.removeChild(elem);
};
traverse(elem);
return elem;
};
};
(function(root, factory) {
if (typeof define === 'function' && define.amd) { // istanbul ignore
define(factory);
} else if (typeof exports === 'object') { // istanbul ignore
module.exports = factory;
} else {
root.charming = factory(root);
}
})(this, charming);
| (function(fn) {
/* istanbul ignore else */
if (typeof module === 'undefined') {
this.charming = fn;
} else {
module.exports = fn;
}
})(function(elem, opts) {
'use strict';
opts = opts || {};
var tagName = opts.tagName || 'span';
var classPrefix = opts.classPrefix != null ? opts.classPrefix : 'char';
var count = 1;
var inject = function(elem) {
var parentNode = elem.parentNode;
var str = elem.nodeValue;
var len = str.length;
var i = -1;
while (++i < len) {
var node = document.createElement(tagName);
if (classPrefix) {
node.className = classPrefix + count;
count++;
}
node.appendChild(document.createTextNode(str[i]));
parentNode.insertBefore(node, elem);
}
parentNode.removeChild(elem);
};
(function traverse(elem) {
var childNodes = [].slice.call(elem.childNodes); // static array of nodes
var len = childNodes.length;
var i = -1;
while (++i < len) {
traverse(childNodes[i]);
}
if (elem.nodeType === Node.TEXT_NODE) {
inject(elem);
}
})(elem);
return elem;
});
| Refactor + remove AMD from boilerplate | Refactor + remove AMD from boilerplate
| JavaScript | mit | yuanqing/charming | javascript | ## Code Before:
/* globals define */
var charming = function() {
'use strict';
return function(elem, opts) {
opts = opts || {};
var tagName = opts.tagName || 'span',
classPrefix = typeof opts.classPrefix !== 'undefined' ? opts.classPrefix : 'char',
count = 1;
var traverse = function(elem) {
var childNodes = Array.prototype.slice.call(elem.childNodes); // static array of nodes
for (var i = 0, len = childNodes.length; i < len; i++) {
traverse(childNodes[i]);
}
if (elem.nodeType === Node.TEXT_NODE) {
inject(elem);
}
};
var inject = function(elem) {
var str = elem.nodeValue,
parentNode = elem.parentNode;
for (var i = 0, len = str.length; i < len; i++) {
var node = document.createElement(tagName);
if (classPrefix) {
node.className = classPrefix + count;
count++;
}
node.appendChild(document.createTextNode(str[i]));
parentNode.insertBefore(node, elem);
}
parentNode.removeChild(elem);
};
traverse(elem);
return elem;
};
};
(function(root, factory) {
if (typeof define === 'function' && define.amd) { // istanbul ignore
define(factory);
} else if (typeof exports === 'object') { // istanbul ignore
module.exports = factory;
} else {
root.charming = factory(root);
}
})(this, charming);
## Instruction:
Refactor + remove AMD from boilerplate
## Code After:
(function(fn) {
/* istanbul ignore else */
if (typeof module === 'undefined') {
this.charming = fn;
} else {
module.exports = fn;
}
})(function(elem, opts) {
'use strict';
opts = opts || {};
var tagName = opts.tagName || 'span';
var classPrefix = opts.classPrefix != null ? opts.classPrefix : 'char';
var count = 1;
var inject = function(elem) {
var parentNode = elem.parentNode;
var str = elem.nodeValue;
var len = str.length;
var i = -1;
while (++i < len) {
var node = document.createElement(tagName);
if (classPrefix) {
node.className = classPrefix + count;
count++;
}
node.appendChild(document.createTextNode(str[i]));
parentNode.insertBefore(node, elem);
}
parentNode.removeChild(elem);
};
(function traverse(elem) {
var childNodes = [].slice.call(elem.childNodes); // static array of nodes
var len = childNodes.length;
var i = -1;
while (++i < len) {
traverse(childNodes[i]);
}
if (elem.nodeType === Node.TEXT_NODE) {
inject(elem);
}
})(elem);
return elem;
});
| - /* globals define */
- var charming = function() {
+ (function(fn) {
+ /* istanbul ignore else */
+ if (typeof module === 'undefined') {
+ this.charming = fn;
+ } else {
+ module.exports = fn;
+ }
+ })(function(elem, opts) {
'use strict';
- return function(elem, opts) {
+ opts = opts || {};
+ var tagName = opts.tagName || 'span';
+ var classPrefix = opts.classPrefix != null ? opts.classPrefix : 'char';
- opts = opts || {};
- var tagName = opts.tagName || 'span',
- classPrefix = typeof opts.classPrefix !== 'undefined' ? opts.classPrefix : 'char',
- count = 1;
? ^^^^^
+ var count = 1;
? ^^^
- var traverse = function(elem) {
? -- -------
+ var inject = function(elem) {
? +++++
- var childNodes = Array.prototype.slice.call(elem.childNodes); // static array of nodes
- for (var i = 0, len = childNodes.length; i < len; i++) {
- traverse(childNodes[i]);
+ var parentNode = elem.parentNode;
+ var str = elem.nodeValue;
+ var len = str.length;
+ var i = -1;
+ while (++i < len) {
+ var node = document.createElement(tagName);
+ if (classPrefix) {
+ node.className = classPrefix + count;
+ count++;
}
- if (elem.nodeType === Node.TEXT_NODE) {
- inject(elem);
- }
- };
-
- var inject = function(elem) {
- var str = elem.nodeValue,
- parentNode = elem.parentNode;
- for (var i = 0, len = str.length; i < len; i++) {
- var node = document.createElement(tagName);
- if (classPrefix) {
- node.className = classPrefix + count;
- count++;
- }
- node.appendChild(document.createTextNode(str[i]));
? --
+ node.appendChild(document.createTextNode(str[i]));
- parentNode.insertBefore(node, elem);
? --
+ parentNode.insertBefore(node, elem);
- }
? --
+ }
- parentNode.removeChild(elem);
? --
+ parentNode.removeChild(elem);
- };
-
- traverse(elem);
-
- return elem;
-
};
- };
+ (function traverse(elem) {
+ var childNodes = [].slice.call(elem.childNodes); // static array of nodes
+ var len = childNodes.length;
+ var i = -1;
+ while (++i < len) {
+ traverse(childNodes[i]);
+ }
+ if (elem.nodeType === Node.TEXT_NODE) {
+ inject(elem);
+ }
+ })(elem);
+ return elem;
+
+ });
- (function(root, factory) {
- if (typeof define === 'function' && define.amd) { // istanbul ignore
- define(factory);
- } else if (typeof exports === 'object') { // istanbul ignore
- module.exports = factory;
- } else {
- root.charming = factory(root);
- }
- })(this, charming); | 85 | 1.574074 | 40 | 45 |
ec5f0a6ded1d50773cda52d32f22238bcb019d56 | src/app/main/main.controller.js | src/app/main/main.controller.js | (function() {
'use strict';
angular
.module('annualTimeBlock')
.controller('MainController', MainController);
/** @ngInject */
function MainController($timeout, webDevTec, toastr) {
var vm = this;
vm.awesomeThings = [];
vm.classAnimation = '';
vm.creationDate = 1456885185987;
vm.showToastr = showToastr;
activate();
function activate() {
getWebDevTec();
$timeout(function() {
vm.classAnimation = 'rubberBand';
}, 4000);
}
function showToastr() {
toastr.info('Fork <a href="https://github.com/Swiip/generator-gulp-angular" target="_blank"><b>generator-gulp-angular</b></a>');
vm.classAnimation = '';
}
function getWebDevTec() {
vm.awesomeThings = webDevTec.getTec();
angular.forEach(vm.awesomeThings, function(awesomeThing) {
awesomeThing.rank = Math.random();
});
}
}
})();
| (function() {
'use strict';
angular
.module('annualTimeBlock')
.controller('MainController', MainController);
/** @ngInject */
function MainController() {
var vm = this;
}
})();
| Clear out unneeded MainController stuff | Clear out unneeded MainController stuff
| JavaScript | mit | EdwardHinkle/annualTimeBlock,EdwardHinkle/annualTimeBlock | javascript | ## Code Before:
(function() {
'use strict';
angular
.module('annualTimeBlock')
.controller('MainController', MainController);
/** @ngInject */
function MainController($timeout, webDevTec, toastr) {
var vm = this;
vm.awesomeThings = [];
vm.classAnimation = '';
vm.creationDate = 1456885185987;
vm.showToastr = showToastr;
activate();
function activate() {
getWebDevTec();
$timeout(function() {
vm.classAnimation = 'rubberBand';
}, 4000);
}
function showToastr() {
toastr.info('Fork <a href="https://github.com/Swiip/generator-gulp-angular" target="_blank"><b>generator-gulp-angular</b></a>');
vm.classAnimation = '';
}
function getWebDevTec() {
vm.awesomeThings = webDevTec.getTec();
angular.forEach(vm.awesomeThings, function(awesomeThing) {
awesomeThing.rank = Math.random();
});
}
}
})();
## Instruction:
Clear out unneeded MainController stuff
## Code After:
(function() {
'use strict';
angular
.module('annualTimeBlock')
.controller('MainController', MainController);
/** @ngInject */
function MainController() {
var vm = this;
}
})();
| (function() {
'use strict';
angular
.module('annualTimeBlock')
.controller('MainController', MainController);
/** @ngInject */
- function MainController($timeout, webDevTec, toastr) {
+ function MainController() {
var vm = this;
- vm.awesomeThings = [];
- vm.classAnimation = '';
- vm.creationDate = 1456885185987;
- vm.showToastr = showToastr;
- activate();
-
- function activate() {
- getWebDevTec();
- $timeout(function() {
- vm.classAnimation = 'rubberBand';
- }, 4000);
- }
-
- function showToastr() {
- toastr.info('Fork <a href="https://github.com/Swiip/generator-gulp-angular" target="_blank"><b>generator-gulp-angular</b></a>');
- vm.classAnimation = '';
- }
-
- function getWebDevTec() {
- vm.awesomeThings = webDevTec.getTec();
-
- angular.forEach(vm.awesomeThings, function(awesomeThing) {
- awesomeThing.rank = Math.random();
- });
- }
}
})(); | 27 | 0.692308 | 1 | 26 |
91a9bd2ebcf3f08e2c879e1b6804fd2a160f0547 | test/IncrementingTest.php | test/IncrementingTest.php | <?php
class IncrementingTest extends RememberTestCase
{
public function testIncrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->increment('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(1, $group->counter);
$this->assertEquals(1, $new->counter);
}
public function testDecrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->decrement('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(-1, $group->counter);
$this->assertEquals(-1, $new->counter);
}
}
| <?php
class IncrementingTest extends RememberTestCase
{
public function testIncrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->increment('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(1, $group->counter);
$this->assertEquals(1, $new->counter);
static::$sql = false;
// check we've properly cached here.
Group::find(static::ID);
}
public function testDecrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->decrement('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(-1, $group->counter);
$this->assertEquals(-1, $new->counter);
static::$sql = false;
// check we've properly cached here.
Group::find(static::ID);
}
}
| Make incrementing tests check the cache was actually used, too | Make incrementing tests check the cache was actually used, too
| PHP | mit | ameliaikeda/rememberable | php | ## Code Before:
<?php
class IncrementingTest extends RememberTestCase
{
public function testIncrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->increment('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(1, $group->counter);
$this->assertEquals(1, $new->counter);
}
public function testDecrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->decrement('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(-1, $group->counter);
$this->assertEquals(-1, $new->counter);
}
}
## Instruction:
Make incrementing tests check the cache was actually used, too
## Code After:
<?php
class IncrementingTest extends RememberTestCase
{
public function testIncrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->increment('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(1, $group->counter);
$this->assertEquals(1, $new->counter);
static::$sql = false;
// check we've properly cached here.
Group::find(static::ID);
}
public function testDecrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->decrement('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(-1, $group->counter);
$this->assertEquals(-1, $new->counter);
static::$sql = false;
// check we've properly cached here.
Group::find(static::ID);
}
}
| <?php
class IncrementingTest extends RememberTestCase
{
public function testIncrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->increment('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(1, $group->counter);
$this->assertEquals(1, $new->counter);
+
+ static::$sql = false;
+
+ // check we've properly cached here.
+ Group::find(static::ID);
}
public function testDecrementingModelsPopsCache()
{
$group = Group::create(['id' => static::ID, 'name' => 'counter test', 'counter' => 0]);
$cached = Group::find(static::ID);
$group->decrement('counter');
$new = Group::find(static::ID);
$this->assertNotEquals($cached, $new);
$this->assertEquals(-1, $group->counter);
$this->assertEquals(-1, $new->counter);
+
+ static::$sql = false;
+
+ // check we've properly cached here.
+ Group::find(static::ID);
}
} | 10 | 0.277778 | 10 | 0 |
6cb2e9f1aa00423058e0360f5ec3dc6317fff08e | .travis.yml | .travis.yml | ---
language: ruby
script: bundle exec rake
rvm:
- 1.9.3
env:
- PUPPET_VERSION="~> 3.0.0"
- PUPPET_VERSION="~> 3.1.0"
- PUPPET_VERSION="~> 3.2.0"
- PUPPET_VERSION="~> 3.4.0"
- PUPPET_VERSION="~> 3.7.3"
- PUPPET_VERSION=">= 0"
matrix:
allow_failures:
- env: PUPPET_VERSION=">= 0"
| ---
language: ruby
script: bundle exec rake
rvm:
- 1.9.3
- 2.1.6
env:
- PUPPET_VERSION="~> 3.0.0"
- PUPPET_VERSION="~> 3.1.0"
- PUPPET_VERSION="~> 3.2.0"
- PUPPET_VERSION="~> 3.4.0"
- PUPPET_VERSION="~> 3.7.3"
- PUPPET_VERSION="~> 4.2.0"
- PUPPET_VERSION=">= 0"
matrix:
exclude:
- rvm: 2.1.6
env: PUPPET_VERSION="~> 3.1.0"
- rvm: 2.1.6
env: PUPPET_VERSION="~> 3.0.0"
allow_failures:
- env: PUPPET_VERSION=">= 0"
| Add Puppet 4 to the testing matrix | Add Puppet 4 to the testing matrix
This commit adds Puppet 4 to the testing matrix as a required passing
test.
Exclusions have been added on Puppet 3.0 and 3.1 with Ruby 2 as these
versions of Puppet do not yet support Ruby 2.
| YAML | mit | gds-operations/puppet-syntax,gds-operations/puppet-syntax | yaml | ## Code Before:
---
language: ruby
script: bundle exec rake
rvm:
- 1.9.3
env:
- PUPPET_VERSION="~> 3.0.0"
- PUPPET_VERSION="~> 3.1.0"
- PUPPET_VERSION="~> 3.2.0"
- PUPPET_VERSION="~> 3.4.0"
- PUPPET_VERSION="~> 3.7.3"
- PUPPET_VERSION=">= 0"
matrix:
allow_failures:
- env: PUPPET_VERSION=">= 0"
## Instruction:
Add Puppet 4 to the testing matrix
This commit adds Puppet 4 to the testing matrix as a required passing
test.
Exclusions have been added on Puppet 3.0 and 3.1 with Ruby 2 as these
versions of Puppet do not yet support Ruby 2.
## Code After:
---
language: ruby
script: bundle exec rake
rvm:
- 1.9.3
- 2.1.6
env:
- PUPPET_VERSION="~> 3.0.0"
- PUPPET_VERSION="~> 3.1.0"
- PUPPET_VERSION="~> 3.2.0"
- PUPPET_VERSION="~> 3.4.0"
- PUPPET_VERSION="~> 3.7.3"
- PUPPET_VERSION="~> 4.2.0"
- PUPPET_VERSION=">= 0"
matrix:
exclude:
- rvm: 2.1.6
env: PUPPET_VERSION="~> 3.1.0"
- rvm: 2.1.6
env: PUPPET_VERSION="~> 3.0.0"
allow_failures:
- env: PUPPET_VERSION=">= 0"
| ---
language: ruby
script: bundle exec rake
rvm:
- 1.9.3
+ - 2.1.6
env:
- PUPPET_VERSION="~> 3.0.0"
- PUPPET_VERSION="~> 3.1.0"
- PUPPET_VERSION="~> 3.2.0"
- PUPPET_VERSION="~> 3.4.0"
- PUPPET_VERSION="~> 3.7.3"
+ - PUPPET_VERSION="~> 4.2.0"
- PUPPET_VERSION=">= 0"
matrix:
+ exclude:
+ - rvm: 2.1.6
+ env: PUPPET_VERSION="~> 3.1.0"
+ - rvm: 2.1.6
+ env: PUPPET_VERSION="~> 3.0.0"
allow_failures:
- env: PUPPET_VERSION=">= 0" | 7 | 0.466667 | 7 | 0 |
5bf64dfe413c0c3745198532d160ac8c174fcbf0 | dev-requirements.txt | dev-requirements.txt | .
#coverage==3.5.2
mock==1.0.1
pytest>=2.3.5
wheel==0.21.0
betamax>=0.5.0
betamax_matchers>=0.2.0
| .
#coverage==3.5.2
mock==1.0.1
pytest>=2.3.5
wheel==0.21.0
betamax>=0.5.0
betamax_matchers>=0.2.0
tox>=2.2.0
| Add tox as a dev-requirement in order to run 'make tests'. | Add tox as a dev-requirement in order to run 'make tests'.
| Text | bsd-3-clause | ueg1990/github3.py,christophelec/github3.py,itsmemattchung/github3.py,balloob/github3.py,degustaf/github3.py,sigmavirus24/github3.py | text | ## Code Before:
.
#coverage==3.5.2
mock==1.0.1
pytest>=2.3.5
wheel==0.21.0
betamax>=0.5.0
betamax_matchers>=0.2.0
## Instruction:
Add tox as a dev-requirement in order to run 'make tests'.
## Code After:
.
#coverage==3.5.2
mock==1.0.1
pytest>=2.3.5
wheel==0.21.0
betamax>=0.5.0
betamax_matchers>=0.2.0
tox>=2.2.0
| .
#coverage==3.5.2
mock==1.0.1
pytest>=2.3.5
wheel==0.21.0
betamax>=0.5.0
betamax_matchers>=0.2.0
+ tox>=2.2.0 | 1 | 0.142857 | 1 | 0 |
6728041c8e241468aaae83286c06f4c36c8ec767 | package.json | package.json | {
"name": "androjs",
"version": "0.1.0",
"dependencies": {
"jasmine-node": ""
}
} | {
"name": "androjs",
"version": "0.2.0",
"dependencies": {
"walkdir": "0.0.1",
"jasmine-node": ""
}
} | Add walkdir to list of test running dependencies. | Add walkdir to list of test running dependencies. | JSON | mit | maryrosecook/androjs | json | ## Code Before:
{
"name": "androjs",
"version": "0.1.0",
"dependencies": {
"jasmine-node": ""
}
}
## Instruction:
Add walkdir to list of test running dependencies.
## Code After:
{
"name": "androjs",
"version": "0.2.0",
"dependencies": {
"walkdir": "0.0.1",
"jasmine-node": ""
}
} | {
"name": "androjs",
- "version": "0.1.0",
? ^
+ "version": "0.2.0",
? ^
"dependencies": {
+ "walkdir": "0.0.1",
"jasmine-node": ""
}
} | 3 | 0.428571 | 2 | 1 |