| commit | old_file | new_file | old_contents | new_contents | subject | message | lang | license | repos | config | content |
|---|---|---|---|---|---|---|---|---|---|---|---|
c1b259e8fe3fd73ca27139cd681c5832f143b7a7 | routes/web.php | routes/web.php | <?php
/*
|--------------------------------------------------------------------------
| Web Routes
|--------------------------------------------------------------------------
|
| Here is where you can register web routes for your application. These
| routes are loaded by the RouteServiceProvider within a group which
| contains the "web" middleware group. Now create something great!
|
*/
Route::get('/', 'HomeController@getIndex');
Route::post('/', 'HomeController@getIndex');
Route::get('/earthquakes', 'HomeController@getEarthquakeHistory');
Route::post('/earthquakes', 'HomeController@getEarthquakeHistory');
Route::get('/earthquakes/{id}', 'HomeController@getEarthquakeDetails');
Route::get('/earthquake-heatmap', 'HomeController@getHeatmap');
Route::post('/earthquake-heatmap', 'HomeController@getHeatmap');
Route::get('/earthquake-graphs-charts', 'HomeController@getGraphCharts');
Route::post('/earthquake-graphs-charts', 'HomeController@getGraphCharts');
Route::get('/earthquake-101', 'HomeController@getEarthquake101');
Route::get('/earthquake-hotlines', 'HomeController@getHotlines');
Route::get('/about', 'HomeController@getAbout');
Route::get('/test', 'HomeController@getTest');
Route::prefix('amp')->group(function () {
Route::get('/', 'HomeController@getIndex')
->name('amp-homepage');
}); | <?php
/*
|--------------------------------------------------------------------------
| Web Routes
|--------------------------------------------------------------------------
|
| Here is where you can register web routes for your application. These
| routes are loaded by the RouteServiceProvider within a group which
| contains the "web" middleware group. Now create something great!
|
*/
Route::match(['get', 'post'], '/', 'HomeController@getIndex');
Route::match(['get', 'post'], '/earthquakes', 'HomeController@getEarthquakeHistory');
Route::get('/earthquakes/{id}', 'HomeController@getEarthquakeDetails');
Route::match(['get', 'post'], '/earthquake-heatmap', 'HomeController@getHeatmap');
Route::match(['get', 'post'], '/earthquake-graphs-charts', 'HomeController@getGraphCharts');
Route::get('/earthquake-101', 'HomeController@getEarthquake101');
Route::get('/earthquake-hotlines', 'HomeController@getHotlines');
Route::get('/about', 'HomeController@getAbout');
Route::get('/test', 'HomeController@getTest');
Route::prefix('amp')->group(function () {
Route::get('/', 'HomeController@getIndex')
->name('amp-homepage');
});
| Refactor routes: use ::match() so that multiple http verbs that point to the same controller use only one line | Refactor routes: use ::match() so that multiple http verbs that point to the same controller use only one line
| PHP | mit | ridvanbaluyos/earthquake,ridvanbaluyos/earthquake,ridvanbaluyos/earthquake | php | ## Code Before:
<?php
/*
|--------------------------------------------------------------------------
| Web Routes
|--------------------------------------------------------------------------
|
| Here is where you can register web routes for your application. These
| routes are loaded by the RouteServiceProvider within a group which
| contains the "web" middleware group. Now create something great!
|
*/
Route::get('/', 'HomeController@getIndex');
Route::post('/', 'HomeController@getIndex');
Route::get('/earthquakes', 'HomeController@getEarthquakeHistory');
Route::post('/earthquakes', 'HomeController@getEarthquakeHistory');
Route::get('/earthquakes/{id}', 'HomeController@getEarthquakeDetails');
Route::get('/earthquake-heatmap', 'HomeController@getHeatmap');
Route::post('/earthquake-heatmap', 'HomeController@getHeatmap');
Route::get('/earthquake-graphs-charts', 'HomeController@getGraphCharts');
Route::post('/earthquake-graphs-charts', 'HomeController@getGraphCharts');
Route::get('/earthquake-101', 'HomeController@getEarthquake101');
Route::get('/earthquake-hotlines', 'HomeController@getHotlines');
Route::get('/about', 'HomeController@getAbout');
Route::get('/test', 'HomeController@getTest');
Route::prefix('amp')->group(function () {
Route::get('/', 'HomeController@getIndex')
->name('amp-homepage');
});
## Instruction:
Refactor routes: use ::match() so that multiple http verbs that point to the same controller use only one line
## Code After:
<?php
/*
|--------------------------------------------------------------------------
| Web Routes
|--------------------------------------------------------------------------
|
| Here is where you can register web routes for your application. These
| routes are loaded by the RouteServiceProvider within a group which
| contains the "web" middleware group. Now create something great!
|
*/
Route::match(['get', 'post'], '/', 'HomeController@getIndex');
Route::match(['get', 'post'], '/earthquakes', 'HomeController@getEarthquakeHistory');
Route::get('/earthquakes/{id}', 'HomeController@getEarthquakeDetails');
Route::match(['get', 'post'], '/earthquake-heatmap', 'HomeController@getHeatmap');
Route::match(['get', 'post'], '/earthquake-graphs-charts', 'HomeController@getGraphCharts');
Route::get('/earthquake-101', 'HomeController@getEarthquake101');
Route::get('/earthquake-hotlines', 'HomeController@getHotlines');
Route::get('/about', 'HomeController@getAbout');
Route::get('/test', 'HomeController@getTest');
Route::prefix('amp')->group(function () {
Route::get('/', 'HomeController@getIndex')
->name('amp-homepage');
});
|
e0e7e71442fa747549da7f08ea5d38604a878ab1 | setup.cfg | setup.cfg | [bumpversion]
current_version = 0.5.0
commit = True
tag = True
[bumpversion:file:setup.py]
| [bumpversion]
current_version = 0.5.0
commit = True
tag = True
message = Update the Charlesbot version from {current_version} to {new_version}
[bumpversion:file:setup.py]
| Customize the bump version commit message | Customize the bump version commit message
| INI | mit | marvinpinto/charlesbot,marvinpinto/charlesbot | ini | ## Code Before:
[bumpversion]
current_version = 0.5.0
commit = True
tag = True
[bumpversion:file:setup.py]
## Instruction:
Customize the bump version commit message
## Code After:
[bumpversion]
current_version = 0.5.0
commit = True
tag = True
message = Update the Charlesbot version from {current_version} to {new_version}
[bumpversion:file:setup.py]
|
909d546a96e48a40b6b6e503be0451666ba305a3 | src/main/java/fr/aumgn/dac2/DAC.java | src/main/java/fr/aumgn/dac2/DAC.java | package fr.aumgn.dac2;
import fr.aumgn.bukkitutils.localization.PluginResourceBundles;
import fr.aumgn.bukkitutils.localization.bundle.PluginResourceBundle;
import fr.aumgn.dac2.config.DACConfig;
public class DAC {
private final DACPlugin plugin;
private DACConfig config;
private PluginResourceBundle cmdMessages;
private PluginResourceBundle messages;
public DAC(DACPlugin plugin) {
this.plugin = plugin;
reloadData();
}
public DACPlugin getPlugin() {
return plugin;
}
public DACConfig getConfig() {
return config;
}
public PluginResourceBundle getCmdMessages() {
return cmdMessages;
}
public PluginResourceBundle getMessages() {
return messages;
}
public void reloadData() {
config = plugin.reloadDACConfig();
PluginResourceBundles bundles = new PluginResourceBundles(plugin,
config.getLocale(), plugin.getDataFolder());
cmdMessages = bundles.get("commands");
messages = bundles.get("messages");
}
}
| package fr.aumgn.dac2;
import fr.aumgn.bukkitutils.localization.PluginResourceBundles;
import fr.aumgn.bukkitutils.localization.bundle.PluginResourceBundle;
import fr.aumgn.dac2.config.DACConfig;
public class DAC {
private final DACPlugin plugin;
private DACConfig config;
private PluginResourceBundle cmdMessages;
private PluginResourceBundle messages;
public DAC(DACPlugin plugin) {
this.plugin = plugin;
reloadData();
}
public DACPlugin getPlugin() {
return plugin;
}
public DACConfig getConfig() {
return config;
}
public PluginResourceBundle getCmdMessages() {
return cmdMessages;
}
public PluginResourceBundle getMessages() {
return messages;
}
public void reloadData() {
config = plugin.reloadDACConfig();
PluginResourceBundles.clearCache(plugin);
PluginResourceBundles bundles = new PluginResourceBundles(plugin,
config.getLocale(), plugin.getDataFolder());
cmdMessages = bundles.get("commands");
messages = bundles.get("messages");
}
}
| Clear ResourceBundle cache on reload | Clear ResourceBundle cache on reload
| Java | mit | aumgn/DACv2,Loscillo/DACv2-temp | java | ## Code Before:
package fr.aumgn.dac2;
import fr.aumgn.bukkitutils.localization.PluginResourceBundles;
import fr.aumgn.bukkitutils.localization.bundle.PluginResourceBundle;
import fr.aumgn.dac2.config.DACConfig;
public class DAC {
private final DACPlugin plugin;
private DACConfig config;
private PluginResourceBundle cmdMessages;
private PluginResourceBundle messages;
public DAC(DACPlugin plugin) {
this.plugin = plugin;
reloadData();
}
public DACPlugin getPlugin() {
return plugin;
}
public DACConfig getConfig() {
return config;
}
public PluginResourceBundle getCmdMessages() {
return cmdMessages;
}
public PluginResourceBundle getMessages() {
return messages;
}
public void reloadData() {
config = plugin.reloadDACConfig();
PluginResourceBundles bundles = new PluginResourceBundles(plugin,
config.getLocale(), plugin.getDataFolder());
cmdMessages = bundles.get("commands");
messages = bundles.get("messages");
}
}
## Instruction:
Clear ResourceBundle cache on reload
## Code After:
package fr.aumgn.dac2;
import fr.aumgn.bukkitutils.localization.PluginResourceBundles;
import fr.aumgn.bukkitutils.localization.bundle.PluginResourceBundle;
import fr.aumgn.dac2.config.DACConfig;
public class DAC {
private final DACPlugin plugin;
private DACConfig config;
private PluginResourceBundle cmdMessages;
private PluginResourceBundle messages;
public DAC(DACPlugin plugin) {
this.plugin = plugin;
reloadData();
}
public DACPlugin getPlugin() {
return plugin;
}
public DACConfig getConfig() {
return config;
}
public PluginResourceBundle getCmdMessages() {
return cmdMessages;
}
public PluginResourceBundle getMessages() {
return messages;
}
public void reloadData() {
config = plugin.reloadDACConfig();
PluginResourceBundles.clearCache(plugin);
PluginResourceBundles bundles = new PluginResourceBundles(plugin,
config.getLocale(), plugin.getDataFolder());
cmdMessages = bundles.get("commands");
messages = bundles.get("messages");
}
}
|
727694006cae6eb8b67c62448b89901eb279e29c | app/views/sessions/create.blade.php | app/views/sessions/create.blade.php | @extends('layout')
@section('content')
<h3>Sign In</h3>
{{ Form::open(['route'=> ['sessions.store']]) }}
{{ Form::label('email', 'Email') }}
{{ Form::error('email', $errors) }}
{{ Form::text('email') }}
{{ Form::label('password', 'Password') }}
{{ Form::error('password', $errors) }}
{{ Form::password('password') }}
{{ Form::submit('Sign In', ['class' => 'btn']) }}
{{ Form::close() }}
@stop
| @extends('layout')
@section('content')
<h3>Sign In</h3>
{{ Form::open(['route'=> ['sessions.store'], 'id' => 'sign_in_form']) }}
{{ Form::label('email', 'Email') }}
{{ Form::error('email', $errors) }}
{{ Form::text('email') }}
{{ Form::label('password', 'Password') }}
{{ Form::error('password', $errors) }}
{{ Form::password('password') }}
{{ Form::submit('Sign In', ['class' => 'btn']) }}
{{ Form::close() }}
@stop
| Add an ID to sign-in form for testing. | Add an ID to sign-in form for testing.
| PHP | mit | axton21/voting-app,axton21/voting-app,DoSomething/voting-app,axton21/voting-app,DoSomething/voting-app,DoSomething/voting-app,DoSomething/voting-app | php | ## Code Before:
@extends('layout')
@section('content')
<h3>Sign In</h3>
{{ Form::open(['route'=> ['sessions.store']]) }}
{{ Form::label('email', 'Email') }}
{{ Form::error('email', $errors) }}
{{ Form::text('email') }}
{{ Form::label('password', 'Password') }}
{{ Form::error('password', $errors) }}
{{ Form::password('password') }}
{{ Form::submit('Sign In', ['class' => 'btn']) }}
{{ Form::close() }}
@stop
## Instruction:
Add an ID to sign-in form for testing.
## Code After:
@extends('layout')
@section('content')
<h3>Sign In</h3>
{{ Form::open(['route'=> ['sessions.store'], 'id' => 'sign_in_form']) }}
{{ Form::label('email', 'Email') }}
{{ Form::error('email', $errors) }}
{{ Form::text('email') }}
{{ Form::label('password', 'Password') }}
{{ Form::error('password', $errors) }}
{{ Form::password('password') }}
{{ Form::submit('Sign In', ['class' => 'btn']) }}
{{ Form::close() }}
@stop
|
c4e5d85947d14cbcadf94f35c7fea1ef0ae4eb5f | vendor/buildr/buildr-osgi-runtime/lib/buildr/osgi/container.rb | vendor/buildr/buildr-osgi-runtime/lib/buildr/osgi/container.rb | module Buildr
module OSGi
class Container
attr_reader :runtime
def initialize(runtime)
@runtime = runtime
@parameters = {}
end
def []=(key, value)
@parameters[key] = value
end
def [](key)
@parameters[key]
end
def configuration_dir
File.dirname(configuration_file)
end
def bundle_dir
"lib"
end
def bundles
raise "bundles should be overidden"
end
def configuration_file
raise "configuration_file should be overidden"
end
def write_config(file)
raise "generate_config should be overidden"
end
end
end
end | module Buildr
module OSGi
class Container
attr_reader :runtime
def initialize(runtime)
@runtime = runtime
@parameters = OrderedHash.new
end
def []=(key, value)
@parameters[key] = value
end
def [](key)
@parameters[key]
end
def configuration_dir
File.dirname(configuration_file)
end
def bundle_dir
"lib"
end
def bundles
raise "bundles should be overidden"
end
def configuration_file
raise "configuration_file should be overidden"
end
def write_config(file)
raise "generate_config should be overidden"
end
end
end
end | Use an ordered hash for parameters | Use an ordered hash for parameters
| Ruby | apache-2.0 | realityforge/jml,realityforge/jml | ruby | ## Code Before:
module Buildr
module OSGi
class Container
attr_reader :runtime
def initialize(runtime)
@runtime = runtime
@parameters = {}
end
def []=(key, value)
@parameters[key] = value
end
def [](key)
@parameters[key]
end
def configuration_dir
File.dirname(configuration_file)
end
def bundle_dir
"lib"
end
def bundles
raise "bundles should be overidden"
end
def configuration_file
raise "configuration_file should be overidden"
end
def write_config(file)
raise "generate_config should be overidden"
end
end
end
end
## Instruction:
Use an ordered hash for parameters
## Code After:
module Buildr
module OSGi
class Container
attr_reader :runtime
def initialize(runtime)
@runtime = runtime
@parameters = OrderedHash.new
end
def []=(key, value)
@parameters[key] = value
end
def [](key)
@parameters[key]
end
def configuration_dir
File.dirname(configuration_file)
end
def bundle_dir
"lib"
end
def bundles
raise "bundles should be overidden"
end
def configuration_file
raise "configuration_file should be overidden"
end
def write_config(file)
raise "generate_config should be overidden"
end
end
end
end |
813e80dc4ae71efa7177783aaab03d46ebf6ea9e | view/quoteSuccess.mustache | view/quoteSuccess.mustache | <!DOCTYPE html>
<html>
<head>
{{#header}}
{{> meta.mustache}}
{{/header}}
</head>
<body>
{{#header}}
{{> header.mustache}}
{{/header}}
<section>
<h1>Transfer to Bank Account</h1>
{{#successfulQuote}}
<p>To complete your transaction, send {{quoteAmount}} CAD to <a href="https://ripple.com//send?name=Ripple%20Union&to=r3ADD8kXSUKHd6zTCKfnKT3zV9EZHjzp1S&dt={{quoteId}}&amount={{quoteAmount}}%2FCAD">r3ADD8kXSUKHd6zTCKfnKT3zV9EZHjzp1S</a> with destination tag {{quoteId}}.</p>
{{/successfulQuote}}
<p>Please allow up to 2 business days for processing.</p>
<a href="{{quoteHomeLink}}">« Back</a>
</section>
</body>
</html>
| <!DOCTYPE html>
<html>
<head>
{{#header}}
{{> meta.mustache}}
{{/header}}
</head>
<body>
{{#header}}
{{> header.mustache}}
{{/header}}
<section>
<h1>Transfer to Bank Account</h1>
{{#successfulQuote}}
<p>To complete your transaction, send {{quoteAmount}} CAD to <a class="root-ripple-address" href="https://ripple.com//send?name=Ripple%20Union&to=r3ADD8kXSUKHd6zTCKfnKT3zV9EZHjzp1S&dt={{quoteId}}&amount={{quoteAmount}}%2FCAD">r3ADD8kXSUKHd6zTCKfnKT3zV9EZHjzp1S</a> with destination tag {{quoteId}}.</p>
{{/successfulQuote}}
<p>Please allow up to 2 business days for processing.</p>
<a href="{{quoteHomeLink}}">« Back</a>
</section>
</body>
</html>
| Put sigil in next to address | Put sigil in next to address
| HTML+Django | isc | singpolyma/rippleunion-interac | html+django | ## Code Before:
<!DOCTYPE html>
<html>
<head>
{{#header}}
{{> meta.mustache}}
{{/header}}
</head>
<body>
{{#header}}
{{> header.mustache}}
{{/header}}
<section>
<h1>Transfer to Bank Account</h1>
{{#successfulQuote}}
<p>To complete your transaction, send {{quoteAmount}} CAD to <a href="https://ripple.com//send?name=Ripple%20Union&to=r3ADD8kXSUKHd6zTCKfnKT3zV9EZHjzp1S&dt={{quoteId}}&amount={{quoteAmount}}%2FCAD">r3ADD8kXSUKHd6zTCKfnKT3zV9EZHjzp1S</a> with destination tag {{quoteId}}.</p>
{{/successfulQuote}}
<p>Please allow up to 2 business days for processing.</p>
<a href="{{quoteHomeLink}}">« Back</a>
</section>
</body>
</html>
## Instruction:
Put sigil in next to address
## Code After:
<!DOCTYPE html>
<html>
<head>
{{#header}}
{{> meta.mustache}}
{{/header}}
</head>
<body>
{{#header}}
{{> header.mustache}}
{{/header}}
<section>
<h1>Transfer to Bank Account</h1>
{{#successfulQuote}}
<p>To complete your transaction, send {{quoteAmount}} CAD to <a class="root-ripple-address" href="https://ripple.com//send?name=Ripple%20Union&to=r3ADD8kXSUKHd6zTCKfnKT3zV9EZHjzp1S&dt={{quoteId}}&amount={{quoteAmount}}%2FCAD">r3ADD8kXSUKHd6zTCKfnKT3zV9EZHjzp1S</a> with destination tag {{quoteId}}.</p>
{{/successfulQuote}}
<p>Please allow up to 2 business days for processing.</p>
<a href="{{quoteHomeLink}}">« Back</a>
</section>
</body>
</html>
|
bfa796bb505a48b24a22c33cd6ef3a0b83cc5993 | package.json | package.json | {
"name": "time-elements",
"version": "0.6.1",
"main": "dist/time-elements-legacy.js",
"jsnext:main": "dist/time-elements.js",
"scripts": {
"clean": "rm -rf ./dist ./node_modules",
"lint": "eslint src/ test/",
"build": "rollup -c && rollup -c rollup.legacy-config.js",
"pretest": "npm run lint && npm run build",
"test": "karma start ./test/karma.config.js"
},
"devDependencies": {
"babel-plugin-transform-custom-element-classes": "^0.1.0",
"babel-plugin-transform-es2015-block-scoping": "^6.18.0",
"babel-plugin-transform-es2015-classes": "^6.18.0",
"chai": "^3.5.0",
"eslint": "^4.13.1",
"eslint-plugin-github": "^0.20.0",
"karma": "^1.7.1",
"karma-chai": "^0.1.0",
"karma-chrome-launcher": "^2.2.0",
"karma-mocha": "^1.3.0",
"mocha": "^3.5.3",
"rollup": "^0.36.4",
"rollup-plugin-babel": "^2.6.1",
"webcomponents.js": "^0.7.23"
}
}
| {
"name": "time-elements",
"version": "0.6.1",
"main": "dist/time-elements-legacy.js",
"jsnext:main": "dist/time-elements.js",
"license": "MIT",
"scripts": {
"clean": "rm -rf ./dist ./node_modules",
"lint": "eslint src/ test/",
"build": "rollup -c && rollup -c rollup.legacy-config.js",
"pretest": "npm run lint && npm run build",
"test": "karma start ./test/karma.config.js"
},
"repository": {
"type": "git",
"url": "git+https://github.com/github/time-elements.git"
},
"devDependencies": {
"babel-plugin-transform-custom-element-classes": "^0.1.0",
"babel-plugin-transform-es2015-block-scoping": "^6.18.0",
"babel-plugin-transform-es2015-classes": "^6.18.0",
"chai": "^3.5.0",
"eslint": "^4.13.1",
"eslint-plugin-github": "^0.20.0",
"karma": "^1.7.1",
"karma-chai": "^0.1.0",
"karma-chrome-launcher": "^2.2.0",
"karma-mocha": "^1.3.0",
"mocha": "^3.5.3",
"rollup": "^0.36.4",
"rollup-plugin-babel": "^2.6.1",
"webcomponents.js": "^0.7.23"
}
}
| Add repository and license fields | Add repository and license fields
| JSON | mit | github/time-elements,github/time-elements | json | ## Code Before:
{
"name": "time-elements",
"version": "0.6.1",
"main": "dist/time-elements-legacy.js",
"jsnext:main": "dist/time-elements.js",
"scripts": {
"clean": "rm -rf ./dist ./node_modules",
"lint": "eslint src/ test/",
"build": "rollup -c && rollup -c rollup.legacy-config.js",
"pretest": "npm run lint && npm run build",
"test": "karma start ./test/karma.config.js"
},
"devDependencies": {
"babel-plugin-transform-custom-element-classes": "^0.1.0",
"babel-plugin-transform-es2015-block-scoping": "^6.18.0",
"babel-plugin-transform-es2015-classes": "^6.18.0",
"chai": "^3.5.0",
"eslint": "^4.13.1",
"eslint-plugin-github": "^0.20.0",
"karma": "^1.7.1",
"karma-chai": "^0.1.0",
"karma-chrome-launcher": "^2.2.0",
"karma-mocha": "^1.3.0",
"mocha": "^3.5.3",
"rollup": "^0.36.4",
"rollup-plugin-babel": "^2.6.1",
"webcomponents.js": "^0.7.23"
}
}
## Instruction:
Add repository and license fields
## Code After:
{
"name": "time-elements",
"version": "0.6.1",
"main": "dist/time-elements-legacy.js",
"jsnext:main": "dist/time-elements.js",
"license": "MIT",
"scripts": {
"clean": "rm -rf ./dist ./node_modules",
"lint": "eslint src/ test/",
"build": "rollup -c && rollup -c rollup.legacy-config.js",
"pretest": "npm run lint && npm run build",
"test": "karma start ./test/karma.config.js"
},
"repository": {
"type": "git",
"url": "git+https://github.com/github/time-elements.git"
},
"devDependencies": {
"babel-plugin-transform-custom-element-classes": "^0.1.0",
"babel-plugin-transform-es2015-block-scoping": "^6.18.0",
"babel-plugin-transform-es2015-classes": "^6.18.0",
"chai": "^3.5.0",
"eslint": "^4.13.1",
"eslint-plugin-github": "^0.20.0",
"karma": "^1.7.1",
"karma-chai": "^0.1.0",
"karma-chrome-launcher": "^2.2.0",
"karma-mocha": "^1.3.0",
"mocha": "^3.5.3",
"rollup": "^0.36.4",
"rollup-plugin-babel": "^2.6.1",
"webcomponents.js": "^0.7.23"
}
}
|
33755cede3b5f57b25851652f86b50436e72cb37 | README.md | README.md |
This README outlines the details of collaborating on this Ember addon.
## Installation
* `git clone <repository-url>` this repository
* `cd ember-floating-mobile-buttons`
* `npm install`
* `bower install`
## Running
* `ember serve`
* Visit your app at [http://localhost:4200](http://localhost:4200).
## Running Tests
* `npm test` (Runs `ember try:each` to test your addon against multiple Ember versions)
* `ember test`
* `ember test --server`
## Building
* `ember build`
For more information on using ember-cli, visit [https://ember-cli.com/](https://ember-cli.com/).
|
Stylish and easy to use Ember floating buttons
##Installation
```
ember install ember-floating-mobile-buttons
```
##Usage
Define a simple floating button.
```hbs
{{#floating-mobile-buttons}}
<a href>{{fa-icon "pencil"}}</a>
{{/floating-mobile-buttons}}
```
You can additionally define child buttons which will be display grouped by the parent button.
```hbs
{{#floating-mobile-buttons position="bottom right"}}
{{#floating-mobile-child-buttons label="Add Item"}}
<a href>{{fa-icon "user"}}</a>
{{/floating-mobile-child-buttons}}
{{#floating-mobile-child-buttons label="Remove Item"}}
<a href>{{fa-icon "trash-o"}}</a>
{{/floating-mobile-child-buttons}}
{{#floating-mobile-child-buttons label="Edit Item"}}
<a href>{{fa-icon "pencil"}}</a>
{{/floating-mobile-child-buttons}}
{{/floating-mobile-buttons}}
```
As seen in the example above, you can combine it with you own icons.
##Properties
###Position
The fixed position of the floating button.
The available list por position:
* bottom right (default)
* bottom left
* top right
* top left
## Running
* `ember serve`
* Visit your app at [http://localhost:4200](http://localhost:4200).
## Running Tests
* `npm test` (Runs `ember try:each` to test your addon against multiple Ember versions)
* `ember test`
* `ember test --server`
## Building
* `ember build`
For more information on using ember-cli, visit [https://ember-cli.com/](https://ember-cli.com/).
| Update and complete the documentation. | Update and complete the documentation.
| Markdown | mit | yontxu/ember-floating-mobile-buttons,yontxu/ember-floating-mobile-buttons | markdown | ## Code Before:
This README outlines the details of collaborating on this Ember addon.
## Installation
* `git clone <repository-url>` this repository
* `cd ember-floating-mobile-buttons`
* `npm install`
* `bower install`
## Running
* `ember serve`
* Visit your app at [http://localhost:4200](http://localhost:4200).
## Running Tests
* `npm test` (Runs `ember try:each` to test your addon against multiple Ember versions)
* `ember test`
* `ember test --server`
## Building
* `ember build`
For more information on using ember-cli, visit [https://ember-cli.com/](https://ember-cli.com/).
## Instruction:
Update and complete the documentation.
## Code After:
Stylish and easy to use Ember floating buttons
##Installation
```
ember install ember-floating-mobile-buttons
```
##Usage
Define a simple floating button.
```hbs
{{#floating-mobile-buttons}}
<a href>{{fa-icon "pencil"}}</a>
{{/floating-mobile-buttons}}
```
You can additionally define child buttons which will be displayed grouped by the parent button.
```hbs
{{#floating-mobile-buttons position="bottom right"}}
{{#floating-mobile-child-buttons label="Add Item"}}
<a href>{{fa-icon "user"}}</a>
{{/floating-mobile-child-buttons}}
{{#floating-mobile-child-buttons label="Remove Item"}}
<a href>{{fa-icon "trash-o"}}</a>
{{/floating-mobile-child-buttons}}
{{#floating-mobile-child-buttons label="Edit Item"}}
<a href>{{fa-icon "pencil"}}</a>
{{/floating-mobile-child-buttons}}
{{/floating-mobile-buttons}}
```
As seen in the example above, you can combine it with your own icons.
##Properties
###Position
The fixed position of the floating button.
The available list of positions:
* bottom right (default)
* bottom left
* top right
* top left
## Running
* `ember serve`
* Visit your app at [http://localhost:4200](http://localhost:4200).
## Running Tests
* `npm test` (Runs `ember try:each` to test your addon against multiple Ember versions)
* `ember test`
* `ember test --server`
## Building
* `ember build`
For more information on using ember-cli, visit [https://ember-cli.com/](https://ember-cli.com/).
|
d06a15af459a630539686b62677cb0884120ef2d | spec/lib/api_canon_spec.rb | spec/lib/api_canon_spec.rb | require 'spec_helper'
describe ApiCanon do
let(:fake_controller) do
class FakeController < ActionController::Base
include ApiCanon
end
end
describe 'including' do
context :class_methods do
subject {fake_controller}
its(:methods) { should include(:document_method)}
end
context :instance_methods do
subject {fake_controller.new}
its(:methods) { should include(:api_canon_docs)}
its(:methods) { should include(:index)}
end
end
describe "document_method" do
let(:api_document) { mock :api_document }
context "without a current controller doc" do
it "creates and stores a new ApiCanon::Document and adds the documented action" do
ApiCanon::Document.should_receive(:new).with('fake', 'fake').and_return(api_document)
ApiCanon::DocumentationStore.instance.should_receive(:store).with(api_document)
api_document.should_receive :add_action
fake_controller.send(:document_method, :index, &(Proc.new {}))
end
end
context "with a current controller doc" do
before(:each) do
fake_controller.send(:document_controller, &(Proc.new {}))
end
it "adds a documented action to the current controller doc" do
expect {
fake_controller.send(:document_method, :index, &(Proc.new {}))
}.to change {
ApiCanon::DocumentationStore.fetch('fake').documented_actions.count
}.by(1)
end
end
end
end
| require 'spec_helper'
describe ApiCanon do
let(:fake_controller) do
class FakeController < ActionController::Base
include ApiCanon
end
end
describe 'including' do
it "adds class methods to the controller" do
expect(fake_controller.methods.map(&:to_s)).to include('document_method')
end
it "adds instance methods to the controller" do
expect(fake_controller.new.methods.map(&:to_s)).to include('api_canon_docs')
expect(fake_controller.new.methods.map(&:to_s)).to include('index')
end
end
describe "document_method" do
let(:api_document) { mock :api_document }
context "without a current controller doc" do
it "creates and stores a new ApiCanon::Document and adds the documented action" do
ApiCanon::Document.should_receive(:new).with('fake', 'fake').and_return(api_document)
ApiCanon::DocumentationStore.instance.should_receive(:store).with(api_document)
api_document.should_receive :add_action
fake_controller.send(:document_method, :index, &(Proc.new {}))
end
end
context "with a current controller doc" do
before(:each) do
fake_controller.send(:document_controller, &(Proc.new {}))
end
it "adds a documented action to the current controller doc" do
expect {
fake_controller.send(:document_method, :index, &(Proc.new {}))
}.to change {
ApiCanon::DocumentationStore.fetch('fake').documented_actions.count
}.by(1)
end
end
end
end
| Fix to allow specs to run in 1.8.7 | Fix to allow specs to run in 1.8.7
| Ruby | mit | cwalsh/api_canon | ruby | ## Code Before:
require 'spec_helper'
describe ApiCanon do
let(:fake_controller) do
class FakeController < ActionController::Base
include ApiCanon
end
end
describe 'including' do
context :class_methods do
subject {fake_controller}
its(:methods) { should include(:document_method)}
end
context :instance_methods do
subject {fake_controller.new}
its(:methods) { should include(:api_canon_docs)}
its(:methods) { should include(:index)}
end
end
describe "document_method" do
let(:api_document) { mock :api_document }
context "without a current controller doc" do
it "creates and stores a new ApiCanon::Document and adds the documented action" do
ApiCanon::Document.should_receive(:new).with('fake', 'fake').and_return(api_document)
ApiCanon::DocumentationStore.instance.should_receive(:store).with(api_document)
api_document.should_receive :add_action
fake_controller.send(:document_method, :index, &(Proc.new {}))
end
end
context "with a current controller doc" do
before(:each) do
fake_controller.send(:document_controller, &(Proc.new {}))
end
it "adds a documented action to the current controller doc" do
expect {
fake_controller.send(:document_method, :index, &(Proc.new {}))
}.to change {
ApiCanon::DocumentationStore.fetch('fake').documented_actions.count
}.by(1)
end
end
end
end
## Instruction:
Fix to allow specs to run in 1.8.7
## Code After:
require 'spec_helper'
describe ApiCanon do
let(:fake_controller) do
class FakeController < ActionController::Base
include ApiCanon
end
end
describe 'including' do
it "adds class methods to the controller" do
expect(fake_controller.methods.map(&:to_s)).to include('document_method')
end
it "adds instance methods to the controller" do
expect(fake_controller.new.methods.map(&:to_s)).to include('api_canon_docs')
expect(fake_controller.new.methods.map(&:to_s)).to include('index')
end
end
describe "document_method" do
let(:api_document) { mock :api_document }
context "without a current controller doc" do
it "creates and stores a new ApiCanon::Document and adds the documented action" do
ApiCanon::Document.should_receive(:new).with('fake', 'fake').and_return(api_document)
ApiCanon::DocumentationStore.instance.should_receive(:store).with(api_document)
api_document.should_receive :add_action
fake_controller.send(:document_method, :index, &(Proc.new {}))
end
end
context "with a current controller doc" do
before(:each) do
fake_controller.send(:document_controller, &(Proc.new {}))
end
it "adds a documented action to the current controller doc" do
expect {
fake_controller.send(:document_method, :index, &(Proc.new {}))
}.to change {
ApiCanon::DocumentationStore.fetch('fake').documented_actions.count
}.by(1)
end
end
end
end
|
89657efcce369e4bcfd20e627f913173f9cf0fde | _config.yml | _config.yml | title: ICFP Programming Contest 2017
description: The website for the 2017 ICFP Programming Contest
baseurl: "/icfpcontest2017-dev" # the subpath of your site, e.g. /blog
url: "/icfpcontest2017-dev"
highlighter: rouge
markdown: kramdown
permalink: /post/:title/
paginate: 4
gems: [jekyll-paginate, jekyll-twitter-plugin]
paginate_path: "/page:num"
rss_path: /feed.xml
navigation:
- text: Problem Specification
url: /problem/
- text: Archives
url: /archives/
- text: Tags
url: /tags/
| title: ICFP Programming Contest 2017
description: The website for the 2017 ICFP Programming Contest
baseurl: "/icfpcontest2017-dev" # the subpath of your site, e.g. /blog
url: "/icfpcontest2017-dev"
highlighter: rouge
markdown: kramdown
permalink: /post/:title/
paginate: 4
gems: [jekyll-paginate, jekyll-twitter-plugin, jekyll-last-modified-at]
paginate_path: "/page:num"
rss_path: /feed.xml
navigation:
- text: Problem Specification
url: /problem/
- text: Archives
url: /archives/
- text: Tags
url: /tags/
| Add last modified at gem | Add last modified at gem
| YAML | mit | icfpcontest2017/icfpcontest2017.github.io,icfpcontest2017/icfpcontest2017.github.io | yaml | ## Code Before:
title: ICFP Programming Contest 2017
description: The website for the 2017 ICFP Programming Contest
baseurl: "/icfpcontest2017-dev" # the subpath of your site, e.g. /blog
url: "/icfpcontest2017-dev"
highlighter: rouge
markdown: kramdown
permalink: /post/:title/
paginate: 4
gems: [jekyll-paginate, jekyll-twitter-plugin]
paginate_path: "/page:num"
rss_path: /feed.xml
navigation:
- text: Problem Specification
url: /problem/
- text: Archives
url: /archives/
- text: Tags
url: /tags/
## Instruction:
Add last modified at gem
## Code After:
title: ICFP Programming Contest 2017
description: The website for the 2017 ICFP Programming Contest
baseurl: "/icfpcontest2017-dev" # the subpath of your site, e.g. /blog
url: "/icfpcontest2017-dev"
highlighter: rouge
markdown: kramdown
permalink: /post/:title/
paginate: 4
gems: [jekyll-paginate, jekyll-twitter-plugin, jekyll-last-modified-at]
paginate_path: "/page:num"
rss_path: /feed.xml
navigation:
- text: Problem Specification
url: /problem/
- text: Archives
url: /archives/
- text: Tags
url: /tags/
|
73dd2ae69c4b915faa51e7f582bf93453e9f8419 | shared/api/instance_console.go | shared/api/instance_console.go | package api
// InstanceConsoleControl represents a message on the instance console "control" socket.
//
// API extension: instances
type InstanceConsoleControl struct {
Command string `json:"command" yaml:"command"`
Args map[string]string `json:"args" yaml:"args"`
}
// InstanceConsolePost represents a LXD instance console request.
//
// API extension: instances
type InstanceConsolePost struct {
Width int `json:"width" yaml:"width"`
Height int `json:"height" yaml:"height"`
}
| package api
// InstanceConsoleControl represents a message on the instance console "control" socket.
//
// API extension: instances
type InstanceConsoleControl struct {
Command string `json:"command" yaml:"command"`
Args map[string]string `json:"args" yaml:"args"`
}
// InstanceConsolePost represents a LXD instance console request.
//
// API extension: instances
type InstanceConsolePost struct {
Width int `json:"width" yaml:"width"`
Height int `json:"height" yaml:"height"`
// API extension: console_vga_type
Type string `json:"type" yaml:"type"`
}
| Add Type field to InstanceConsolePost | shared/api: Add Type field to InstanceConsolePost
Signed-off-by: Free Ekanayaka <04111f73b2d444cf053b50d877d79556bf34f55a@canonical.com>
| Go | apache-2.0 | hallyn/lxd,stgraber/lxd,atxwebs/lxd,lxc/lxd,stgraber/lxd,hallyn/lxd,lxc/lxd,atxwebs/lxd,lxc/lxd,Skarlso/lxd,stgraber/lxd,Skarlso/lxd,stgraber/lxd,atxwebs/lxd,stgraber/lxd,hallyn/lxd,stgraber/lxd,Skarlso/lxd,mjeanson/lxd,Skarlso/lxd,lxc/lxd,lxc/lxd,lxc/lxd,lxc/lxd,hallyn/lxd,atxwebs/lxd,hallyn/lxd,stgraber/lxd,mjeanson/lxd,hallyn/lxd,mjeanson/lxd | go | ## Code Before:
package api
// InstanceConsoleControl represents a message on the instance console "control" socket.
//
// API extension: instances
type InstanceConsoleControl struct {
Command string `json:"command" yaml:"command"`
Args map[string]string `json:"args" yaml:"args"`
}
// InstanceConsolePost represents a LXD instance console request.
//
// API extension: instances
type InstanceConsolePost struct {
Width int `json:"width" yaml:"width"`
Height int `json:"height" yaml:"height"`
}
## Instruction:
shared/api: Add Type field to InstanceConsolePost
Signed-off-by: Free Ekanayaka <04111f73b2d444cf053b50d877d79556bf34f55a@canonical.com>
## Code After:
package api
// InstanceConsoleControl represents a message on the instance console "control" socket.
//
// API extension: instances
type InstanceConsoleControl struct {
Command string `json:"command" yaml:"command"`
Args map[string]string `json:"args" yaml:"args"`
}
// InstanceConsolePost represents a LXD instance console request.
//
// API extension: instances
type InstanceConsolePost struct {
Width int `json:"width" yaml:"width"`
Height int `json:"height" yaml:"height"`
// API extension: console_vga_type
Type string `json:"type" yaml:"type"`
}
|
f225e8bb6aad637162af53b6213214337b8af106 | modules/cats/shared/src/main/scala/eu/timepit/refined/cats/package.scala | modules/cats/shared/src/main/scala/eu/timepit/refined/cats/package.scala | package eu.timepit.refined
import _root_.cats.implicits._
import _root_.cats.Show
import _root_.cats.kernel.Eq
import eu.timepit.refined.api.RefType
package object cats {
/**
* `Eq` instance for refined types that delegates to the `Eq`
* instance of the base type.
*/
implicit def refTypeEq[F[_, _], T: Eq, P](implicit rt: RefType[F]): Eq[F[T, P]] =
Eq[T].contramap(rt.unwrap)
/**
* `Show` instance for refined types that delegates to the `Show`
* instance of the base type.
*/
implicit def refTypeShow[F[_, _], T: Show, P](implicit rt: RefType[F]): Show[F[T, P]] =
Show[T].contramap(rt.unwrap)
}
| package eu.timepit.refined
import _root_.cats.Show
import _root_.cats.instances.eq._
import _root_.cats.kernel.Eq
import _root_.cats.syntax.contravariant._
import eu.timepit.refined.api.RefType
package object cats {
/**
* `Eq` instance for refined types that delegates to the `Eq`
* instance of the base type.
*/
implicit def refTypeEq[F[_, _], T: Eq, P](implicit rt: RefType[F]): Eq[F[T, P]] =
Eq[T].contramap(rt.unwrap)
/**
* `Show` instance for refined types that delegates to the `Show`
* instance of the base type.
*/
implicit def refTypeShow[F[_, _], T: Show, P](implicit rt: RefType[F]): Show[F[T, P]] =
Show[T].contramap(rt.unwrap)
}
| Use fine grained imports and fix style error | Use fine grained imports and fix style error | Scala | mit | sh0hei/refined,fthomas/refined | scala | ## Code Before:
package eu.timepit.refined
import _root_.cats.implicits._
import _root_.cats.Show
import _root_.cats.kernel.Eq
import eu.timepit.refined.api.RefType
package object cats {
/**
* `Eq` instance for refined types that delegates to the `Eq`
* instance of the base type.
*/
implicit def refTypeEq[F[_, _], T: Eq, P](implicit rt: RefType[F]): Eq[F[T, P]] =
Eq[T].contramap(rt.unwrap)
/**
* `Show` instance for refined types that delegates to the `Show`
* instance of the base type.
*/
implicit def refTypeShow[F[_, _], T: Show, P](implicit rt: RefType[F]): Show[F[T, P]] =
Show[T].contramap(rt.unwrap)
}
## Instruction:
Use fine grained imports and fix style error
## Code After:
package eu.timepit.refined
import _root_.cats.Show
import _root_.cats.instances.eq._
import _root_.cats.kernel.Eq
import _root_.cats.syntax.contravariant._
import eu.timepit.refined.api.RefType
package object cats {
/**
* `Eq` instance for refined types that delegates to the `Eq`
* instance of the base type.
*/
implicit def refTypeEq[F[_, _], T: Eq, P](implicit rt: RefType[F]): Eq[F[T, P]] =
Eq[T].contramap(rt.unwrap)
/**
* `Show` instance for refined types that delegates to the `Show`
* instance of the base type.
*/
implicit def refTypeShow[F[_, _], T: Show, P](implicit rt: RefType[F]): Show[F[T, P]] =
Show[T].contramap(rt.unwrap)
}
|
762147b8660a507ac5db8d0408162e8463b2fe8e | daiquiri/registry/serializers.py | daiquiri/registry/serializers.py | from rest_framework import serializers
from daiquiri.core.serializers import JSONListField, JSONDictField
class DublincoreSerializer(serializers.Serializer):
identifier = serializers.ReadOnlyField()
title = serializers.ReadOnlyField()
description = serializers.SerializerMethodField()
publisher = serializers.SerializerMethodField()
subjects = serializers.ReadOnlyField()
def get_description(self, obj):
return obj['content']['description']
def get_publisher(self, obj):
return obj['curation']['publisher']
class VoresourceSerializer(serializers.Serializer):
identifier = serializers.ReadOnlyField()
title = serializers.ReadOnlyField()
created = serializers.ReadOnlyField()
updated = serializers.ReadOnlyField()
voresource_status = serializers.ReadOnlyField(default='active')
curation = JSONDictField(default={})
content = JSONDictField(default={})
capabilities = JSONListField(default=[])
tableset = JSONListField(default=[])
managed_authority = serializers.ReadOnlyField(default=None)
managing_org = serializers.ReadOnlyField(default=None)
type = serializers.ReadOnlyField()
| from rest_framework import serializers
from daiquiri.core.serializers import JSONDictField, JSONListField
class DublincoreSerializer(serializers.Serializer):
identifier = serializers.ReadOnlyField()
title = serializers.ReadOnlyField()
description = serializers.SerializerMethodField()
publisher = serializers.SerializerMethodField()
subjects = serializers.ReadOnlyField()
def get_description(self, obj):
return obj['content']['description']
def get_publisher(self, obj):
return obj['curation']['publisher']
class VoresourceSerializer(serializers.Serializer):
identifier = serializers.ReadOnlyField()
title = serializers.ReadOnlyField()
created = serializers.ReadOnlyField()
updated = serializers.ReadOnlyField()
type = serializers.ReadOnlyField()
status = serializers.ReadOnlyField()
curation = JSONDictField(default={})
content = JSONDictField(default={})
capabilities = JSONListField(default=[])
tableset = JSONListField(default=[])
managed_authority = serializers.ReadOnlyField(default=None)
managing_org = serializers.ReadOnlyField(default=None)
| Fix status field in OAI-PMH | Fix status field in OAI-PMH
| Python | apache-2.0 | aipescience/django-daiquiri,aipescience/django-daiquiri,aipescience/django-daiquiri | python | ## Code Before:
from rest_framework import serializers
from daiquiri.core.serializers import JSONListField, JSONDictField
class DublincoreSerializer(serializers.Serializer):
identifier = serializers.ReadOnlyField()
title = serializers.ReadOnlyField()
description = serializers.SerializerMethodField()
publisher = serializers.SerializerMethodField()
subjects = serializers.ReadOnlyField()
def get_description(self, obj):
return obj['content']['description']
def get_publisher(self, obj):
return obj['curation']['publisher']
class VoresourceSerializer(serializers.Serializer):
identifier = serializers.ReadOnlyField()
title = serializers.ReadOnlyField()
created = serializers.ReadOnlyField()
updated = serializers.ReadOnlyField()
voresource_status = serializers.ReadOnlyField(default='active')
curation = JSONDictField(default={})
content = JSONDictField(default={})
capabilities = JSONListField(default=[])
tableset = JSONListField(default=[])
managed_authority = serializers.ReadOnlyField(default=None)
managing_org = serializers.ReadOnlyField(default=None)
type = serializers.ReadOnlyField()
## Instruction:
Fix status field in OAI-PMH
## Code After:
from rest_framework import serializers
from daiquiri.core.serializers import JSONDictField, JSONListField
class DublincoreSerializer(serializers.Serializer):
identifier = serializers.ReadOnlyField()
title = serializers.ReadOnlyField()
description = serializers.SerializerMethodField()
publisher = serializers.SerializerMethodField()
subjects = serializers.ReadOnlyField()
def get_description(self, obj):
return obj['content']['description']
def get_publisher(self, obj):
return obj['curation']['publisher']
class VoresourceSerializer(serializers.Serializer):
identifier = serializers.ReadOnlyField()
title = serializers.ReadOnlyField()
created = serializers.ReadOnlyField()
updated = serializers.ReadOnlyField()
type = serializers.ReadOnlyField()
status = serializers.ReadOnlyField()
curation = JSONDictField(default={})
content = JSONDictField(default={})
capabilities = JSONListField(default=[])
tableset = JSONListField(default=[])
managed_authority = serializers.ReadOnlyField(default=None)
managing_org = serializers.ReadOnlyField(default=None)
|
e6cd744ca8f07799f10ec9823c03553d09c888cc | src/client/app/app.routes.js | src/client/app/app.routes.js | "use strict";
(function (app) {
app.View1 = ng.core
.Component({
template: "<p>View {{ viewName }} loaded with the router</p>"
})
.Class({
constructor: function () {
this.viewName = "View1";
}
});
}(window.app || (window.app = {})));
(function (app) {
var component = ng.core
.Component({
selector: "my-routes",
template: "<h1>My routes</h1><router-outlet></router-outlet>"
})
.Class({
constructor: function () {}
});
var routing = ng.router.RouterModule.forRoot([
{ path: "", component: app.View1 }
]);
app.MyRoutesModule = ng.core.NgModule({
imports: [ng.platformBrowser.BrowserModule, routing],
declarations: [component, app.View1],
bootstrap: [component]
})
.Class({
constructor: function () {}
});
}(window.app || (window.app = {})));
| "use strict";
(function () {
// fix for a regression in rx 5.0.0-beta.11
var obsProto = window.Rx.Observable.prototype;
obsProto._catch = obsProto.catch;
}());
(function (app) {
app.View1 = ng.core
.Component({
template: "<p>View {{ viewName }} loaded with the router</p>"
})
.Class({
constructor: function () {
this.viewName = "View1";
}
});
}(window.app || (window.app = {})));
(function (app) {
var component = ng.core
.Component({
selector: "my-routes",
template: "<h1>My routes</h1><router-outlet></router-outlet>"
})
.Class({
constructor: function () {}
});
var routing = ng.router.RouterModule.forRoot([
{ path: "", component: app.View1 }
]);
app.MyRoutesModule = ng.core.NgModule({
imports: [ng.platformBrowser.BrowserModule, routing],
declarations: [component, app.View1],
bootstrap: [component]
})
.Class({
constructor: function () {}
});
}(window.app || (window.app = {})));
| Fix a bug in rxjs 5.0.0-beta.11. | Fix a bug in rxjs 5.0.0-beta.11.
| JavaScript | mit | albertosantini/angular2-es5-jumpstart,albertosantini/angular2-es5-jumpstart | javascript | ## Code Before:
"use strict";
(function (app) {
app.View1 = ng.core
.Component({
template: "<p>View {{ viewName }} loaded with the router</p>"
})
.Class({
constructor: function () {
this.viewName = "View1";
}
});
}(window.app || (window.app = {})));
(function (app) {
var component = ng.core
.Component({
selector: "my-routes",
template: "<h1>My routes</h1><router-outlet></router-outlet>"
})
.Class({
constructor: function () {}
});
var routing = ng.router.RouterModule.forRoot([
{ path: "", component: app.View1 }
]);
app.MyRoutesModule = ng.core.NgModule({
imports: [ng.platformBrowser.BrowserModule, routing],
declarations: [component, app.View1],
bootstrap: [component]
})
.Class({
constructor: function () {}
});
}(window.app || (window.app = {})));
## Instruction:
Fix a bug in rxjs 5.0.0-beta.11.
## Code After:
"use strict";
(function () {
// fix for a regression in rx 5.0.0-beta.11
var obsProto = window.Rx.Observable.prototype;
obsProto._catch = obsProto.catch;
}());
(function (app) {
app.View1 = ng.core
.Component({
template: "<p>View {{ viewName }} loaded with the router</p>"
})
.Class({
constructor: function () {
this.viewName = "View1";
}
});
}(window.app || (window.app = {})));
(function (app) {
var component = ng.core
.Component({
selector: "my-routes",
template: "<h1>My routes</h1><router-outlet></router-outlet>"
})
.Class({
constructor: function () {}
});
var routing = ng.router.RouterModule.forRoot([
{ path: "", component: app.View1 }
]);
app.MyRoutesModule = ng.core.NgModule({
imports: [ng.platformBrowser.BrowserModule, routing],
declarations: [component, app.View1],
bootstrap: [component]
})
.Class({
constructor: function () {}
});
}(window.app || (window.app = {})));
|
e55ba2f5cb1e2666c54b2ef6de6a2add33c4ac85 | readme.md | readme.md |
```
sudo sass --watch scss:css
```
# Kopiera saker till MAMP
```
../../Applications/MAMP/Library/bin/mysql -u root -p sittning < ~/Desktop/Websites/NILLEFRANZTEST/sittning/sqlQueries/init.sql
``` |
```
sudo sass --watch scss:css
```
# Kopiera saker till MAMP
```
../../Applications/MAMP/Library/bin/mysql -u root -p sittning < ~/Desktop/Websites/NILLEFRANZTEST/sittning/sqlQueries/init.sql
```
# FB inlogg guide
http://www.krizna.com/general/login-with-facebook-using-php/ | Add link to fb tutorial | Add link to fb tutorial
| Markdown | mit | cNille/lundasittning,cNille/lundasittning | markdown | ## Code Before:
```
sudo sass --watch scss:css
```
# Kopiera saker till MAMP
```
../../Applications/MAMP/Library/bin/mysql -u root -p sittning < ~/Desktop/Websites/NILLEFRANZTEST/sittning/sqlQueries/init.sql
```
## Instruction:
Add link to fb tutorial
## Code After:
```
sudo sass --watch scss:css
```
# Kopiera saker till MAMP
```
../../Applications/MAMP/Library/bin/mysql -u root -p sittning < ~/Desktop/Websites/NILLEFRANZTEST/sittning/sqlQueries/init.sql
```
# FB inlogg guide
http://www.krizna.com/general/login-with-facebook-using-php/ |
24e3c857b48872e2335045fa53267195ff718405 | app/models/spree/avatax_configuration.rb | app/models/spree/avatax_configuration.rb |
class Spree::AvataxConfiguration < Spree::Preferences::Configuration
preference :company_code, :string, default: ENV['AVATAX_COMPANY_CODE']
preference :account, :string, default: ENV['AVATAX_ACCOUNT']
preference :license_key, :string, default: ENV['AVATAX_LICENSE_KEY']
preference :environment, :string, default: -> { default_environment }
preference :log, :boolean, default: true
preference :log_to_stdout, :boolean, default: false
preference :address_validation, :boolean, default: true
preference :address_validation_enabled_countries, :array, default: ['United States', 'Canada']
preference :tax_calculation, :boolean, default: true
preference :document_commit, :boolean, default: true
preference :origin, :string, default: '{}'
preference :refuse_checkout_address_validation_error, :boolean, default: false
preference :customer_can_validate, :boolean, default: false
preference :raise_exceptions, :boolean, default: false
def self.boolean_preferences
%w(tax_calculation document_commit log log_to_stdout address_validation refuse_checkout_address_validation_error customer_can_validate raise_exceptions)
end
def self.storable_env_preferences
%w(company_code account license_key environment)
end
def default_environment
if ENV['AVATAX_ENVIRONMENT'].present?
ENV['AVATAX_ENVIRONMENT']
else
Rails.env.production? ? 'production' : 'sandbox'
end
end
end
|
class Spree::AvataxConfiguration < Spree::Preferences::Configuration
preference :company_code, :string, default: ENV['AVATAX_COMPANY_CODE']
preference :account, :string, default: ENV['AVATAX_ACCOUNT']
preference :license_key, :string, default: ENV['AVATAX_LICENSE_KEY']
preference :environment, :string, default: -> { Spree::AvataxConfiguration.default_environment }
preference :log, :boolean, default: true
preference :log_to_stdout, :boolean, default: false
preference :address_validation, :boolean, default: true
preference :address_validation_enabled_countries, :array, default: ['United States', 'Canada']
preference :tax_calculation, :boolean, default: true
preference :document_commit, :boolean, default: true
preference :origin, :string, default: '{}'
preference :refuse_checkout_address_validation_error, :boolean, default: false
preference :customer_can_validate, :boolean, default: false
preference :raise_exceptions, :boolean, default: false
def self.boolean_preferences
%w(tax_calculation document_commit log log_to_stdout address_validation refuse_checkout_address_validation_error customer_can_validate raise_exceptions)
end
def self.storable_env_preferences
%w(company_code account license_key environment)
end
def self.default_environment
if ENV['AVATAX_ENVIRONMENT'].present?
ENV['AVATAX_ENVIRONMENT']
else
Rails.env.production? ? 'production' : 'sandbox'
end
end
end
| Make default_environment a class method | Make default_environment a class method
As instance methods are deprecated since this PR[0].
[0] `https://github.com/solidusio/solidus/pull/4064`
| Ruby | bsd-3-clause | boomerdigital/solidus_avatax_certified,boomerdigital/solidus_avatax_certified,boomerdigital/solidus_avatax_certified | ruby | ## Code Before:
class Spree::AvataxConfiguration < Spree::Preferences::Configuration
preference :company_code, :string, default: ENV['AVATAX_COMPANY_CODE']
preference :account, :string, default: ENV['AVATAX_ACCOUNT']
preference :license_key, :string, default: ENV['AVATAX_LICENSE_KEY']
preference :environment, :string, default: -> { default_environment }
preference :log, :boolean, default: true
preference :log_to_stdout, :boolean, default: false
preference :address_validation, :boolean, default: true
preference :address_validation_enabled_countries, :array, default: ['United States', 'Canada']
preference :tax_calculation, :boolean, default: true
preference :document_commit, :boolean, default: true
preference :origin, :string, default: '{}'
preference :refuse_checkout_address_validation_error, :boolean, default: false
preference :customer_can_validate, :boolean, default: false
preference :raise_exceptions, :boolean, default: false
def self.boolean_preferences
%w(tax_calculation document_commit log log_to_stdout address_validation refuse_checkout_address_validation_error customer_can_validate raise_exceptions)
end
def self.storable_env_preferences
%w(company_code account license_key environment)
end
def default_environment
if ENV['AVATAX_ENVIRONMENT'].present?
ENV['AVATAX_ENVIRONMENT']
else
Rails.env.production? ? 'production' : 'sandbox'
end
end
end
## Instruction:
Make default_environment a class method
As instance methods are deprecated since this PR[0].
[0] `https://github.com/solidusio/solidus/pull/4064`
## Code After:
class Spree::AvataxConfiguration < Spree::Preferences::Configuration
preference :company_code, :string, default: ENV['AVATAX_COMPANY_CODE']
preference :account, :string, default: ENV['AVATAX_ACCOUNT']
preference :license_key, :string, default: ENV['AVATAX_LICENSE_KEY']
preference :environment, :string, default: -> { Spree::AvataxConfiguration.default_environment }
preference :log, :boolean, default: true
preference :log_to_stdout, :boolean, default: false
preference :address_validation, :boolean, default: true
preference :address_validation_enabled_countries, :array, default: ['United States', 'Canada']
preference :tax_calculation, :boolean, default: true
preference :document_commit, :boolean, default: true
preference :origin, :string, default: '{}'
preference :refuse_checkout_address_validation_error, :boolean, default: false
preference :customer_can_validate, :boolean, default: false
preference :raise_exceptions, :boolean, default: false
def self.boolean_preferences
%w(tax_calculation document_commit log log_to_stdout address_validation refuse_checkout_address_validation_error customer_can_validate raise_exceptions)
end
def self.storable_env_preferences
%w(company_code account license_key environment)
end
def self.default_environment
if ENV['AVATAX_ENVIRONMENT'].present?
ENV['AVATAX_ENVIRONMENT']
else
Rails.env.production? ? 'production' : 'sandbox'
end
end
end
|
2c03162a74c3e5afc9c4934d579a711e0c8f1e2a | src/Event/UserEvent.php | src/Event/UserEvent.php | <?php
namespace AWurth\SilexUser\Event;
use Symfony\Component\EventDispatcher\Event;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\Security\Core\User\UserInterface;
class UserEvent extends Event
{
/**
* @var null|Request
*/
protected $request;
/**
* @var UserInterface
*/
protected $user;
/**
* UserEvent constructor.
*
* @param UserInterface $user
* @param Request|null $request
*/
public function __construct(UserInterface $user, Request $request = null)
{
$this->user = $user;
$this->request = $request;
}
/**
* @return UserInterface
*/
public function getUser()
{
return $this->user;
}
/**
* @return Request
*/
public function getRequest()
{
return $this->request;
}
}
| <?php
namespace AWurth\SilexUser\Event;
use Symfony\Component\EventDispatcher\Event;
use Symfony\Component\HttpFoundation\Request;
use AWurth\SilexUser\Entity\UserInterface;
class UserEvent extends Event
{
/**
* @var null|Request
*/
protected $request;
/**
* @var UserInterface
*/
protected $user;
/**
* UserEvent constructor.
*
* @param UserInterface $user
* @param Request|null $request
*/
public function __construct(UserInterface $user, Request $request = null)
{
$this->user = $user;
$this->request = $request;
}
/**
* @return UserInterface
*/
public function getUser()
{
return $this->user;
}
/**
* @return Request
*/
public function getRequest()
{
return $this->request;
}
}
| Use SilexUser's UserInterface instead of the Symfony Security Component one | Use SilexUser's UserInterface instead of the Symfony Security Component one
| PHP | mit | awurth/silex-user,awurth/silex-user | php | ## Code Before:
<?php
namespace AWurth\SilexUser\Event;
use Symfony\Component\EventDispatcher\Event;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\Security\Core\User\UserInterface;
class UserEvent extends Event
{
/**
* @var null|Request
*/
protected $request;
/**
* @var UserInterface
*/
protected $user;
/**
* UserEvent constructor.
*
* @param UserInterface $user
* @param Request|null $request
*/
public function __construct(UserInterface $user, Request $request = null)
{
$this->user = $user;
$this->request = $request;
}
/**
* @return UserInterface
*/
public function getUser()
{
return $this->user;
}
/**
* @return Request
*/
public function getRequest()
{
return $this->request;
}
}
## Instruction:
Use SilexUser's UserInterface instead of the Symfony Security Component one
## Code After:
<?php
namespace AWurth\SilexUser\Event;
use Symfony\Component\EventDispatcher\Event;
use Symfony\Component\HttpFoundation\Request;
use AWurth\SilexUser\Entity\UserInterface;
class UserEvent extends Event
{
/**
* @var null|Request
*/
protected $request;
/**
* @var UserInterface
*/
protected $user;
/**
* UserEvent constructor.
*
* @param UserInterface $user
* @param Request|null $request
*/
public function __construct(UserInterface $user, Request $request = null)
{
$this->user = $user;
$this->request = $request;
}
/**
* @return UserInterface
*/
public function getUser()
{
return $this->user;
}
/**
* @return Request
*/
public function getRequest()
{
return $this->request;
}
}
|
693035b74f65fcc535c8e22babc8413b100de7af | libraries/tests/mbed/i2c_TMP102/main.cpp | libraries/tests/mbed/i2c_TMP102/main.cpp | TMP102 temperature(PTC9, PTC8, 0x90);
#elif defined(TARGET_LPC812)
TMP102 temperature(D10, D11, 0x90);
#elif defined(TARGET_LPC4088)
TMP102 temperature(p9, p10, 0x90);
#elif defined(TARGET_LPC2368)
TMP102 temperature(p28, p27, 0x90);
#else
TMP102 temperature(p28, p27, 0x90);
#endif
int main()
{
float t = temperature.read();
printf("TMP102: Temperature: %f\n\r", t);
// In our test environment (ARM office) we should get a temperature within
// the range ]15, 30[C
bool result = (t > 15.0) && (t < 30.0);
notify_completion(result);
}
| TMP102 temperature(PTC9, PTC8, 0x90);
#elif defined(TARGET_LPC812)
TMP102 temperature(D10, D11, 0x90);
#elif defined(TARGET_LPC4088)
TMP102 temperature(p9, p10, 0x90);
#elif defined(TARGET_LPC2368)
TMP102 temperature(p28, p27, 0x90);
#elif defined(TARGET_NUCLEO_F103RB) || \
defined(TARGET_NUCLEO_L152RE) || \
defined(TARGET_NUCLEO_F302R8) || \
defined(TARGET_NUCLEO_F030R8) || \
defined(TARGET_NUCLEO_F401RE) || \
defined(TARGET_NUCLEO_F411RE) || \
defined(TARGET_NUCLEO_F072RB) || \
defined(TARGET_NUCLEO_F334R8) || \
defined(TARGET_NUCLEO_L053R8)
TMP102 temperature(I2C_SDA, I2C_SCL, 0x90);
#else
TMP102 temperature(p28, p27, 0x90);
#endif
int main()
{
float t = temperature.read();
printf("TMP102: Temperature: %f\n\r", t);
// In our test environment (ARM office) we should get a temperature within
// the range ]15, 30[C
bool result = (t > 15.0) && (t < 30.0);
notify_completion(result);
}
| Add NUCLEOs targets in TMP102 test | Add NUCLEOs targets in TMP102 test
C++ | apache-2.0 | mikaleppanen/mbed-os,getopenmono/mbed,Tiryoh/mbed,kpurusho/mbed,jamesadevine/mbed,pbrook/mbed,screamerbg/mbed,bcostm/mbed-os,masaohamanaka/mbed,ryankurte/mbed-os,kjbracey-arm/mbed,al177/mbed,nabilbendafi/mbed,pedromes/mbed,mnlipp/mbed,kpurusho/mbed,jrjang/mbed,mbedmicro/mbed,Archcady/mbed-os,NitinBhaskar/mbed,netzimme/mbed-os,mikaleppanen/mbed-os,bremoran/mbed-drivers,jrjang/mbed,Archcady/mbed-os,betzw/mbed-os,nabilbendafi/mbed,pbrook/mbed,larks/mbed,adamgreen/mbed,DanKupiniak/mbed,rgrover/mbed,FranklyDev/mbed,al177/mbed,HeadsUpDisplayInc/mbed,andcor02/mbed-os,fpiot/mbed-ats,maximmbed/mbed,sam-geek/mbed,Marcomissyou/mbed,RonEld/mbed,andreaslarssonublox/mbed,jferreir/mbed,GustavWi/mbed,jpbrucker/mbed,bentwire/mbed,fanghuaqi/mbed,mmorenobarm/mbed-os,jrjang/mbed,YarivCol/mbed-os,kl-cruz/mbed-os,logost/mbed,JasonHow44/mbed,pi19404/mbed,bcostm/mbed-os,mbedmicro/mbed,radhika-raghavendran/mbed-os5.1-onsemi,adustm/mbed,Timmmm/mbed,hwfwgrp/mbed,wodji/mbed,xcrespo/mbed,struempelix/mbed,Willem23/mbed,CalSol/mbed,tung7970/mbed-os-1,netzimme/mbed-os,cvtsi2sd/mbed-os,CalSol/mbed,getopenmono/mbed,struempelix/mbed,adustm/mbed,cvtsi2sd/mbed-os,c1728p9/mbed-os,autopulated/mbed,adamgreen/mbed,NordicSemiconductor/mbed,xcrespo/mbed,alertby/mbed,nvlsianpu/mbed,jrjang/mbed,YarivCol/mbed-os,logost/mbed,fpiot/mbed-ats,svogl/mbed-os,jamesadevine/mbed,pedromes/mbed,betzw/mbed-os,Willem23/mbed,fvincenzo/mbed-os,DanKupiniak/mbed,fvincenzo/mbed-os,ARM-software/mbed-beetle,sam-geek/mbed,Shengliang/mbed,iriark01/mbed-drivers,geky/mbed,maximmbed/mbed,jpbrucker/mbed,larks/mbed,nabilbendafi/mbed,screamerbg/mbed,infinnovation/mbed-os,brstew/MBED-BUILD,JasonHow44/mbed,bikeNomad/mbed,svogl/mbed-os,Tiryoh/mbed,infinnovation/mbed-os,fpiot/mbed-ats,bentwire/mbed,HeadsUpDisplayInc/mbed,Timmmm/mbed,bulislaw/mbed-os,ryankurte/mbed-os,betzw/mbed-os,jeremybrodt/mbed,adamgreen/mbed,GustavWi/mbed,Archcady/mbed-os,radhika-raghavendran/mbed-os5.1-onsemi,Tiryoh/mbed,K4zuki/mbed,Sweet-Peas/mbed,masaohamanaka/mbed,svastm/mbed,radhika-raghavendran/mbed-os5.1-onsemi,fahhem/mbed-os,nRFMesh/mbed-os,ban4jp/mbed,catiedev/mbed-os,xcrespo/mbed,mazimkhan/mbed-os,naves-thiago/mbed-midi,fahhem/mbed-os,Willem23/mbed,iriark01/mbed-drivers,bikeNomad/mbed,wodji/mbed,Archcady/mbed-os,jferreir/mbed,rosterloh/mbed,EmuxEvans/mbed,cvtsi2sd/mbed-os,alertby/mbed,NXPmicro/mbed,andreaslarssonublox/mbed,c1728p9/mbed-os,fahhem/mbed-os,Marcomissyou/mbed,YarivCol/mbed-os,maximmbed/mbed,mbedmicro/mbed,pradeep-gr/mbed-os5-onsemi,svogl/mbed-os,karsev/mbed-os,NordicSemiconductor/mbed,pradeep-gr/mbed-os5-onsemi,fahhem/mbed-os,ryankurte/mbed-os,mikaleppanen/mbed-os,andreaslarssonublox/mbed,bcostm/mbed-os,GustavWi/mbed,fpiot/mbed-ats,Archcady/mbed-os,mnlipp/mbed,c1728p9/mbed-os,svastm/mbed,NXPmicro/mbed,mmorenobarm/mbed-os,mnlipp/mbed,dbestm/mbed,rosterloh/mbed,Sweet-Peas/mbed,pedromes/mbed,adustm/mbed,rgrover/mbed,fpiot/mbed-ats,nRFMesh/mbed-os,EmuxEvans/mbed,adustm/mbed,struempelix/mbed,maximmbed/mbed,monkiineko/mbed-os,hwfwgrp/mbed,tung7970/mbed-os,ryankurte/mbed-os,c1728p9/mbed-os,al177/mbed,Marcomissyou/mbed,kpurusho/mbed,devanlai/mbed,sam-geek/mbed,tung7970/mbed-os,hwfwgrp/mbed,Sweet-Peas/mbed,Archcady/mbed-os,pradeep-gr/mbed-os5-onsemi,geky/mbed,Sweet-Peas/mbed,arostm/mbed-os,tung7970/mbed-os-1,pradeep-gr/mbed-os5-onsemi,netzimme/mbed-os,RonEld/mbed,HeadsUpDisplayInc/mbed,geky/mbed,Shengliang/mbed,jrjang/mbed,mnlipp/mbed,mazimkhan/mbed-os,nvlsianpu/mbed,brstew/MBED-BUILD,adamgreen/mbed,adamgreen/mbed,dbestm/mbed,screamerbg/mbed,fvincenzo/mbed-os,mnlipp/mbed,jeremybrodt/mbed,screamerbg/mbed,arostm/mbed-os,naves-thiago/mbed-midi,autopulated/mbed,hwfwgrp/mbed,monkiineko/mbed-os,brstew/MBED-BUILD,bulislaw/mbed-os,j-greffe/mbed-os,FranklyDev/mbed,NordicSemiconductor/mbed,sam-geek/mbed,dbestm/mbed,logost/mbed,svastm/mbed,rgrover/mbed,andreaslarssonublox/mbed,Marcomissyou/mbed,cvtsi2sd/mbed-os,mazimkhan/mbed-os,fanghuaqi/mbed,bcostm/mbed-os,screamerbg/mbed,bentwire/mbed,getopenmono/mbed,mmorenobarm/mbed-os,RonEld/mbed,NitinBhaskar/mbed,betzw/mbed-os,Sweet-Peas/mbed,jpbrucker/mbed,theotherjimmy/mbed,nvlsianpu/mbed,pi19404/mbed,pbrook/mbed,Marcomissyou/mbed,adamgreen/mbed,bulislaw/mbed-os,svogl/mbed-os,pbrook/mbed,NXPmicro/mbed,pedromes/mbed,rosterloh/mbed,andcor02/mbed-os,GustavWi/mbed,rosterloh/mbed,nabilbendafi/mbed,andreaslarssonublox/mbed,mmorenobarm/mbed-os,catiedev/mbed-os,kpurusho/mbed,ARM-software/mbed-beetle,monkiineko/mbed-os,kl-cruz/mbed-os,Shengliang/mbed,NitinBhaskar/mbed,mbedmicro/mbed,Marcomissyou/mbed,YarivCol/mbed-os,nRFMesh/mbed-os,geky/mbed,monkiineko/mbed-os,mbedmicro/mbed,struempelix/mbed,arostm/mbed-os,Timmmm/mbed,fahhem/mbed-os,arostm/mbed-os,Sweet-Peas/mbed,ARM-software/mbed-beetle,bcostm/mbed-os,wodji/mbed,jeremybrodt/mbed,cvtsi2sd/mbed-os,getopenmono/mbed,betzw/mbed-os,larks/mbed,maximmbed/mbed,ryankurte/mbed-os,netzimme/mbed-os,nvlsianpu/mbed,kjbracey-arm/mbed
,kl-cruz/mbed-os,CalSol/mbed,bremoran/mbed-drivers,j-greffe/mbed-os,infinnovation/mbed-os,alertby/mbed,bulislaw/mbed-os,jamesadevine/mbed,jamesadevine/mbed,andcor02/mbed-os,theotherjimmy/mbed,mmorenobarm/mbed-os,kpurusho/mbed,xcrespo/mbed,fanghuaqi/mbed,jamesadevine/mbed,NXPmicro/mbed,mazimkhan/mbed-os,screamerbg/mbed,hwfwgrp/mbed,FranklyDev/mbed,K4zuki/mbed,autopulated/mbed,brstew/MBED-BUILD,jamesadevine/mbed,kl-cruz/mbed-os,jpbrucker/mbed,dbestm/mbed,struempelix/mbed,mnlipp/mbed,Willem23/mbed,EmuxEvans/mbed,al177/mbed,hwfwgrp/mbed,NitinBhaskar/mbed,infinnovation/mbed-os,tung7970/mbed-os,jpbrucker/mbed,jferreir/mbed,bcostm/mbed-os,nvlsianpu/mbed,svastm/mbed,alertby/mbed,autopulated/mbed,karsev/mbed-os,tung7970/mbed-os,ban4jp/mbed,devanlai/mbed,kl-cruz/mbed-os,dbestm/mbed,Shengliang/mbed,fahhem/mbed-os,alertby/mbed,Willem23/mbed,pbrook/mbed,nRFMesh/mbed-os,rosterloh/mbed,YarivCol/mbed-os,geky/mbed,dbestm/mbed,andcor02/mbed-os,pbrook/mbed,c1728p9/mbed-os,wodji/mbed,ARM-software/mbed-beetle,jeremybrodt/mbed,jferreir/mbed,maximmbed/mbed,logost/mbed,bikeNomad/mbed,pradeep-gr/mbed-os5-onsemi,tung7970/mbed-os,wodji/mbed,cvtsi2sd/mbed-os,pi19404/mbed,devanlai/mbed,xcrespo/mbed,Shengliang/mbed,Tiryoh/mbed,al177/mbed,K4zuki/mbed,logost/mbed,larks/mbed,netzimme/mbed-os,nRFMesh/mbed-os,jeremybrodt/mbed,HeadsUpDisplayInc/mbed,JasonHow44/mbed,pedromes/mbed,kjbracey-arm/mbed,jrjang/mbed,monkiineko/mbed-os,adustm/mbed,Timmmm/mbed,al177/mbed,catiedev/mbed-os,JasonHow44/mbed,logost/mbed,CalSol/mbed,JasonHow44/mbed,pradeep-gr/mbed-os5-onsemi,j-greffe/mbed-os,JasonHow44/mbed,YarivCol/mbed-os,pi19404/mbed,autopulated/mbed,jferreir/mbed,naves-thiago/mbed-midi,naves-thiago/mbed-midi,arostm/mbed-os,svogl/mbed-os,karsev/mbed-os,brstew/MBED-BUILD,nabilbendafi/mbed,EmuxEvans/mbed,sam-geek/mbed,sg-/mbed-drivers,masaohamanaka/mbed,bulislaw/mbed-os,larks/mbed,rosterloh/mbed,svastm/mbed,fanghuaqi/mbed,devanlai/mbed,K4zuki/mbed,mbedmicro/mbed,bentwire/mbed,mmorenobarm/mbed-os,getopenmono/mbed,bik
eNomad/mbed,fvincenzo/mbed-os,NitinBhaskar/mbed,arostm/mbed-os,bentwire/mbed,ban4jp/mbed,fvincenzo/mbed-os,kpurusho/mbed,kjbracey-arm/mbed,fanghuaqi/mbed,Timmmm/mbed,masaohamanaka/mbed,bikeNomad/mbed,CalSol/mbed,larks/mbed,radhika-raghavendran/mbed-os5.1-onsemi,infinnovation/mbed-os,j-greffe/mbed-os,jpbrucker/mbed,tung7970/mbed-os-1,RonEld/mbed,wodji/mbed,andcor02/mbed-os,theotherjimmy/mbed,FranklyDev/mbed,jrjang/mbed,DanKupiniak/mbed,theotherjimmy/mbed,radhika-raghavendran/mbed-os5.1-onsemi,andcor02/mbed-os,EmuxEvans/mbed,catiedev/mbed-os,j-greffe/mbed-os,catiedev/mbed-os,getopenmono/mbed,bulislaw/mbed-os,RonEld/mbed,j-greffe/mbed-os,ban4jp/mbed | c++ | ## Code Before:
TMP102 temperature(PTC9, PTC8, 0x90);
#elif defined(TARGET_LPC812)
TMP102 temperature(D10, D11, 0x90);
#elif defined(TARGET_LPC4088)
TMP102 temperature(p9, p10, 0x90);
#elif defined(TARGET_LPC2368)
TMP102 temperature(p28, p27, 0x90);
#else
TMP102 temperature(p28, p27, 0x90);
#endif
int main()
{
float t = temperature.read();
printf("TMP102: Temperature: %f\n\r", t);
// In our test environment (ARM office) we should get a temperature within
// the range ]15, 30[C
bool result = (t > 15.0) && (t < 30.0);
notify_completion(result);
}
## Instruction:
Add NUCLEOs targets in TMP102 test
## Code After:
TMP102 temperature(PTC9, PTC8, 0x90);
#elif defined(TARGET_LPC812)
TMP102 temperature(D10, D11, 0x90);
#elif defined(TARGET_LPC4088)
TMP102 temperature(p9, p10, 0x90);
#elif defined(TARGET_LPC2368)
TMP102 temperature(p28, p27, 0x90);
#elif defined(TARGET_NUCLEO_F103RB) || \
defined(TARGET_NUCLEO_L152RE) || \
defined(TARGET_NUCLEO_F302R8) || \
defined(TARGET_NUCLEO_F030R8) || \
defined(TARGET_NUCLEO_F401RE) || \
defined(TARGET_NUCLEO_F411RE) || \
defined(TARGET_NUCLEO_F072RB) || \
defined(TARGET_NUCLEO_F334R8) || \
defined(TARGET_NUCLEO_L053R8)
TMP102 temperature(I2C_SDA, I2C_SCL, 0x90);
#else
TMP102 temperature(p28, p27, 0x90);
#endif
int main()
{
float t = temperature.read();
printf("TMP102: Temperature: %f\n\r", t);
// In our test environment (ARM office) we should get a temperature within
// the range ]15, 30[C
bool result = (t > 15.0) && (t < 30.0);
notify_completion(result);
}
|
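The C++ record above selects I²C pins per build target with an `#if`/`#elif`/`#else` chain plus a default fallback, and the commit folds nine NUCLEO boards into one branch because they share the same named pins. The same select-or-default pattern can be sketched in Python with a mapping; target and pin names below are taken from the record for illustration only and this is not an mbed API.

```python
# Per-target pin selection with a default fallback, mirroring the
# preprocessor chain in the TMP102 test above. Target and pin names are
# illustrative, copied from the record; this is not a real mbed API.
DEFAULT_PINS = ("p28", "p27")

TARGET_PINS = {
    "LPC812": ("D10", "D11"),
    "LPC4088": ("p9", "p10"),
    "LPC2368": ("p28", "p27"),
    # The NUCLEO boards expose identically named I2C pins, which is why
    # the commit can collapse nine targets into a single #elif branch.
    "NUCLEO_F103RB": ("I2C_SDA", "I2C_SCL"),
    "NUCLEO_L152RE": ("I2C_SDA", "I2C_SCL"),
}

def pins_for(target):
    """Return the (sda, scl) pin pair for a build target, or the default."""
    return TARGET_PINS.get(target, DEFAULT_PINS)

print(pins_for("NUCLEO_F103RB"))     # ('I2C_SDA', 'I2C_SCL')
print(pins_for("SOME_OTHER_BOARD"))  # falls back to ('p28', 'p27')
```

A table like this is the runtime analogue of the compile-time chain: unknown targets hit the default instead of failing to build.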
d6b853e35f1a54cf6dd1b83eac410248d098eb5c | db/seeds.rb | db/seeds.rb | if Rails.env.development?
# Create test user
#
unless User.find_by_email("test@example.com")
u = User.new
u.email = "test@example.com"
u.name = "Test User"
u.permissions = ["signin"]
u.save
end
end
puts "To import the data from redirector, plus Hit data, run this: "
puts " bundle exec rake import:all"
| if Rails.env.development?
# Create test user
#
unless User.find_by_email("test@example.com")
u = User.new
u.email = "test@example.com"
u.name = "Test User"
u.permissions = ["signin", "admin"]
u.organisation_slug = 'cabinet-office'
u.save
end
end
puts "To import the data from redirector, plus Hit data, run this: "
puts " bundle exec rake import:all"
| Make the dev user an admin and a member of an org | Make the dev user an admin and a member of an org
This means that they will be able to edit anything and will see the "my
organisation" link in the banner header.
| Ruby | mit | alphagov/transition,alphagov/transition,alphagov/transition | ruby | ## Code Before:
if Rails.env.development?
# Create test user
#
unless User.find_by_email("test@example.com")
u = User.new
u.email = "test@example.com"
u.name = "Test User"
u.permissions = ["signin"]
u.save
end
end
puts "To import the data from redirector, plus Hit data, run this: "
puts " bundle exec rake import:all"
## Instruction:
Make the dev user an admin and a member of an org
This means that they will be able to edit anything and will see the "my
organisation" link in the banner header.
## Code After:
if Rails.env.development?
# Create test user
#
unless User.find_by_email("test@example.com")
u = User.new
u.email = "test@example.com"
u.name = "Test User"
u.permissions = ["signin", "admin"]
u.organisation_slug = 'cabinet-office'
u.save
end
end
puts "To import the data from redirector, plus Hit data, run this: "
puts " bundle exec rake import:all"
|
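The seeds record above guards user creation with `unless User.find_by_email(...)`, so re-running the seed task is a no-op rather than a duplicate insert. A minimal Python sketch of the same find-or-create idempotency, with a plain list standing in for the database:

```python
# Idempotent "find or create" seeding, as in the Rails seeds.rb above:
# running the seed twice must not create a second test user. A list of
# dicts stands in for the users table.
def seed_test_user(users):
    """Create the dev test user unless one with that email exists."""
    for user in users:
        if user["email"] == "test@example.com":
            return user  # already seeded; do nothing
    user = {
        "email": "test@example.com",
        "name": "Test User",
        "permissions": ["signin", "admin"],
        "organisation_slug": "cabinet-office",
    }
    users.append(user)
    return user

db = []
seed_test_user(db)
seed_test_user(db)  # second run finds the existing row
print(len(db))      # 1
```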
37b921448c1760610716ccf4ae81070e87fc29a2 | app/helpers/helpers.rb | app/helpers/helpers.rb | helpers do
def current_user
User.find_by(id: session[:user_id])
end
def logout_user
sessions[:user_id] = nil
end
def existing_user?
if User.find_by(email: current_user[:email])
true
else
false
end
end
end
| helpers do
def current_user
User.find_by(id: session[:user_id]) if session[:user_id]
end
def logout_user
sessions[:user_id] = nil
end
def existing_user?
if User.find_by(email: current_user[:email])
true
else
false
end
end
end
| Secure by checking for cookie | Secure by checking for cookie | Ruby | mit | chi-dragonflies-2015/stack-underflow,chi-dragonflies-2015/stack-underflow,chi-dragonflies-2015/stack-underflow | ruby | ## Code Before:
helpers do
def current_user
User.find_by(id: session[:user_id])
end
def logout_user
sessions[:user_id] = nil
end
def existing_user?
if User.find_by(email: current_user[:email])
true
else
false
end
end
end
## Instruction:
Secure by checking for cookie
## Code After:
helpers do
def current_user
User.find_by(id: session[:user_id]) if session[:user_id]
end
def logout_user
sessions[:user_id] = nil
end
def existing_user?
if User.find_by(email: current_user[:email])
true
else
false
end
end
end
|
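The fix in the record above makes `current_user` query the store only when the session actually carries a user id, so anonymous requests return nil instead of running `find_by(id: nil)`. (Note in passing that `logout_user` writes to `sessions`, plural, in both versions — that looks like a typo for `session`, which the commit leaves untouched.) A minimal Python sketch of the guarded lookup:

```python
# Guarded session lookup, as in the helpers.rb fix above: only hit the
# datastore when the session carries a user id; anonymous sessions get
# None without any lookup.
def current_user(session, users_by_id):
    """Return the signed-in user's record, or None for anonymous sessions."""
    user_id = session.get("user_id")
    if user_id is None:
        return None              # anonymous: skip the lookup entirely
    return users_by_id.get(user_id)

users = {1: {"id": 1, "email": "a@example.com"}}
print(current_user({"user_id": 1}, users)["email"])  # a@example.com
print(current_user({}, users))                       # None
```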
4d7066f5aa86b0b84847fbce23c0b02440552ae4 | openstack/octavia/requirements.yaml | openstack/octavia/requirements.yaml | dependencies:
- condition: mariadb.enabled
name: mariadb
repository: https://charts.eu-de-2.cloud.sap
version: 0.2.7
- name: memcached
repository: https://charts.eu-de-2.cloud.sap
version: 0.0.3
- condition: mariadb.enabled
name: mysql_metrics
repository: https://charts.eu-de-2.cloud.sap
version: 0.2.3
- name: rabbitmq
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.9
- alias: rabbitmq_notifications
condition: audit.enabled
name: rabbitmq
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.9
- name: region_check
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.2
- name: utils
repository: file://../../openstack/utils
version: 0.1.1
| dependencies:
- condition: mariadb.enabled
name: mariadb
repository: https://charts.eu-de-2.cloud.sap
version: 0.2.7
- name: memcached
repository: https://charts.eu-de-2.cloud.sap
version: 0.0.3
- condition: mariadb.enabled
name: mysql_metrics
repository: https://charts.eu-de-2.cloud.sap
version: 0.2.3
- name: rabbitmq
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.9
- alias: rabbitmq_notifications
condition: audit.enabled
name: rabbitmq
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.9
- name: region_check
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.2
- name: utils
repository: file://../../openstack/utils
version: 0.1.1
- name: jaeger-operator
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.0
condition: osprofiler.jaeger.enabled
import-values:
- child: jaeger
parent: osprofiler.jaeger
| Fix jaeger dependency: Missed some lines when pasting | [Octavia] Fix jaeger dependency: Missed some lines when pasting
| YAML | apache-2.0 | sapcc/helm-charts,sapcc/helm-charts,sapcc/helm-charts,sapcc/helm-charts | yaml | ## Code Before:
dependencies:
- condition: mariadb.enabled
name: mariadb
repository: https://charts.eu-de-2.cloud.sap
version: 0.2.7
- name: memcached
repository: https://charts.eu-de-2.cloud.sap
version: 0.0.3
- condition: mariadb.enabled
name: mysql_metrics
repository: https://charts.eu-de-2.cloud.sap
version: 0.2.3
- name: rabbitmq
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.9
- alias: rabbitmq_notifications
condition: audit.enabled
name: rabbitmq
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.9
- name: region_check
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.2
- name: utils
repository: file://../../openstack/utils
version: 0.1.1
## Instruction:
[Octavia] Fix jaeger dependency: Missed some lines when pasting
## Code After:
dependencies:
- condition: mariadb.enabled
name: mariadb
repository: https://charts.eu-de-2.cloud.sap
version: 0.2.7
- name: memcached
repository: https://charts.eu-de-2.cloud.sap
version: 0.0.3
- condition: mariadb.enabled
name: mysql_metrics
repository: https://charts.eu-de-2.cloud.sap
version: 0.2.3
- name: rabbitmq
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.9
- alias: rabbitmq_notifications
condition: audit.enabled
name: rabbitmq
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.9
- name: region_check
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.2
- name: utils
repository: file://../../openstack/utils
version: 0.1.1
- name: jaeger-operator
repository: https://charts.eu-de-2.cloud.sap
version: 0.1.0
condition: osprofiler.jaeger.enabled
import-values:
- child: jaeger
parent: osprofiler.jaeger
|
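In the Helm requirements file above, a dependency with a `condition` is enabled only when that dotted path in the chart's values (e.g. `osprofiler.jaeger.enabled`) resolves to a truthy value. The sketch below illustrates just that dotted-path check; Helm's real resolution (in the Helm codebase) has more rules, such as aliases and comma-separated condition paths, which are omitted here.

```python
# Condition-gated dependency selection, in the spirit of the
# requirements.yaml above. Illustrative only: Helm's actual logic also
# handles aliases, tags, and multiple condition paths.
def lookup(values, dotted_path):
    """Resolve 'a.b.c' inside nested dicts; None if any segment is missing."""
    node = values
    for part in dotted_path.split("."):
        if not isinstance(node, dict) or part not in node:
            return None
        node = node[part]
    return node

def enabled_dependencies(deps, values):
    return [d["name"] for d in deps
            if "condition" not in d or bool(lookup(values, d["condition"]))]

deps = [
    {"name": "memcached"},                                     # unconditional
    {"name": "mariadb", "condition": "mariadb.enabled"},
    {"name": "jaeger-operator", "condition": "osprofiler.jaeger.enabled"},
]
values = {"mariadb": {"enabled": True},
          "osprofiler": {"jaeger": {"enabled": False}}}
print(enabled_dependencies(deps, values))  # ['memcached', 'mariadb']
```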
43295b04a3bca72c2f06f29b7861fa82a75b5d17 | config/environments/pull_request.rb | config/environments/pull_request.rb | require_relative './staging'
Rails.application.configure do
# This should probably stay empty since
# staging + production should be the same
end | require_relative './staging'
Rails.application.configure do
# This should probably stay empty since
# staging + production should be the same
#
# see application.rb for feature_toggle documentation
config.x.feature_toggles.projects = true
end | Enable projects feature for pull request, i.e. review app environment | Enable projects feature for pull request, i.e. review app environment
| Ruby | mit | diegoaad/lale-help,diegoaad/lale-help,jprokay/lale-help,lale-help/lale-help,lale-help/lale-help,diegoaad/lale-help,lale-help/lale-help,jprokay/lale-help,jprokay/lale-help,lale-help/lale-help,jprokay/lale-help,diegoaad/lale-help | ruby | ## Code Before:
require_relative './staging'
Rails.application.configure do
# This should probably stay empty since
# staging + production should be the same
end
## Instruction:
Enable projects feature for pull request, i.e. review app environment
## Code After:
require_relative './staging'
Rails.application.configure do
# This should probably stay empty since
# staging + production should be the same
#
# see application.rb for feature_toggle documentation
config.x.feature_toggles.projects = true
end |
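The environment file above loads the staging config wholesale via `require_relative` and then overrides a single feature toggle, so review apps stay identical to staging except for explicitly flipped settings. A minimal Python sketch of that layered-config idea, using plain dicts (the keys below are illustrative):

```python
# Layered environment config, like pull_request.rb requiring the staging
# environment and then flipping one toggle on top of it. Keys are
# illustrative stand-ins for the Rails config object.
STAGING = {"log_level": "info", "feature_toggles": {"projects": False}}

def pull_request_config(base):
    """Copy the base environment, overriding only the projects toggle."""
    config = {**base, "feature_toggles": dict(base["feature_toggles"])}
    config["feature_toggles"]["projects"] = True   # the only override
    return config

cfg = pull_request_config(STAGING)
print(cfg["feature_toggles"]["projects"])      # True
print(STAGING["feature_toggles"]["projects"])  # base is untouched: False
```

Copying the nested dict before mutating it keeps the base environment pristine, which is the whole point of the inherit-then-override pattern.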
b88abd98834529f1342d69e2e91b79efd68e5e8d | backend/uclapi/dashboard/middleware/fake_shibboleth_middleware.py | backend/uclapi/dashboard/middleware/fake_shibboleth_middleware.py | from django.utils.deprecation import MiddlewareMixin
class FakeShibbolethMiddleWare(MiddlewareMixin):
def process_request(self, request):
if request.POST.get("convert-post-headers") == "1":
for key in request.POST:
request.META[key] = request.POST[key]
| from django.utils.deprecation import MiddlewareMixin
class FakeShibbolethMiddleWare(MiddlewareMixin):
def process_request(self, request):
if request.POST.get("convert-post-headers") == "1":
for key in request.POST:
request.META[key] = request.POST[key]
if request.GET.get("convert-get-headers") == "1":
for key in request.GET:
http_key = key.upper()
http_key.replace("-", "_")
http_key = "HTTP_" + http_key
request.META[http_key] = request.GET[key]
| Add get parameter parsing for fakeshibboleth auto mode | Add get parameter parsing for fakeshibboleth auto mode
| Python | mit | uclapi/uclapi,uclapi/uclapi,uclapi/uclapi,uclapi/uclapi | python | ## Code Before:
from django.utils.deprecation import MiddlewareMixin
class FakeShibbolethMiddleWare(MiddlewareMixin):
def process_request(self, request):
if request.POST.get("convert-post-headers") == "1":
for key in request.POST:
request.META[key] = request.POST[key]
## Instruction:
Add get parameter parsing for fakeshibboleth auto mode
## Code After:
from django.utils.deprecation import MiddlewareMixin
class FakeShibbolethMiddleWare(MiddlewareMixin):
def process_request(self, request):
if request.POST.get("convert-post-headers") == "1":
for key in request.POST:
request.META[key] = request.POST[key]
if request.GET.get("convert-get-headers") == "1":
for key in request.GET:
http_key = key.upper()
http_key.replace("-", "_")
http_key = "HTTP_" + http_key
request.META[http_key] = request.GET[key]
|
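One thing worth flagging in the record above: the added GET branch calls `http_key.replace("-", "_")` without using the return value. Python strings are immutable, so `str.replace` returns a new string and leaves the original unchanged — as written, hyphenated header names would keep their hyphens. The sketch below demonstrates the no-op and shows the assignment that the intent (CGI-style `HTTP_*` header names) appears to call for; this is my reading of the intent, not the upstream project's actual follow-up patch.

```python
# Python strings are immutable: str.replace returns a NEW string, so the
# record's bare `http_key.replace("-", "_")` line has no effect. The
# result must be assigned, as below.
def cgi_header_name(key):
    """Convert e.g. 'Accept-Language' to 'HTTP_ACCEPT_LANGUAGE'."""
    http_key = key.upper()
    http_key = http_key.replace("-", "_")  # the assignment is the fix
    return "HTTP_" + http_key

print(cgi_header_name("Accept-Language"))  # HTTP_ACCEPT_LANGUAGE

# The no-op in the original, demonstrated:
s = "X-KEY"
s.replace("-", "_")  # return value discarded
print(s)             # still 'X-KEY'
```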
9d68c0b4e405112cbd8622466f64c7fdd4038240 | .travis.yml | .travis.yml | language: cpp
matrix:
include:
- os: linux
dist: trusty
sudo: required
- os: osx
osx_image: xcode7.3
- os: osx
osx_image: xcode8
before_install:
- if [ $TRAVIS_OS_NAME == linux ]; then eval "$(curl -sL https://raw.githubusercontent.com/ryuichis/oclint-cpp-travis-ci-examples/master/oclint-ci-install.sh)"; fi
script:
- mkdir build
- cd build
- cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON ..
- cd ..
- cp build/compile_commands.json .
- oclint-json-compilation-database
| language: cpp
matrix:
include:
- os: linux
dist: trusty
sudo: required
- os: osx
osx_image: xcode7.3
- os: osx
osx_image: xcode8
allow_failures:
- os: osx
osx_image: xcode8
before_install:
- if [ $TRAVIS_OS_NAME == linux ]; then eval "$(curl -sL https://raw.githubusercontent.com/ryuichis/oclint-cpp-travis-ci-examples/master/oclint-ci-install.sh)"; fi
script:
- mkdir build
- cd build
- cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON ..
- cd ..
- cp build/compile_commands.json .
- oclint-json-compilation-database
| Allow xcode 8 to fail for now | Allow xcode 8 to fail for now
| YAML | bsd-3-clause | ryuichis/oclint-cpp-travis-ci-examples,ryuichis/oclint-cpp-travis-ci-examples,ryuichis/oclint-cpp-travis-ci-examples | yaml | ## Code Before:
language: cpp
matrix:
include:
- os: linux
dist: trusty
sudo: required
- os: osx
osx_image: xcode7.3
- os: osx
osx_image: xcode8
before_install:
- if [ $TRAVIS_OS_NAME == linux ]; then eval "$(curl -sL https://raw.githubusercontent.com/ryuichis/oclint-cpp-travis-ci-examples/master/oclint-ci-install.sh)"; fi
script:
- mkdir build
- cd build
- cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON ..
- cd ..
- cp build/compile_commands.json .
- oclint-json-compilation-database
## Instruction:
Allow xcode 8 to fail for now
## Code After:
language: cpp
matrix:
include:
- os: linux
dist: trusty
sudo: required
- os: osx
osx_image: xcode7.3
- os: osx
osx_image: xcode8
allow_failures:
- os: osx
osx_image: xcode8
before_install:
- if [ $TRAVIS_OS_NAME == linux ]; then eval "$(curl -sL https://raw.githubusercontent.com/ryuichis/oclint-cpp-travis-ci-examples/master/oclint-ci-install.sh)"; fi
script:
- mkdir build
- cd build
- cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON ..
- cd ..
- cp build/compile_commands.json .
- oclint-json-compilation-database
|
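The `allow_failures` block added above tells Travis CI to report the build green even when a listed job (here the xcode8 one) fails. The sketch below models those semantics — a failing job that matches an allowed entry no longer fails the build — and is only an illustration, not Travis CI's real implementation.

```python
# How an allow_failures list changes overall build status: a failing job
# that matches some allowed entry no longer fails the build. A sketch of
# the semantics only, not Travis CI's actual implementation.
def build_status(jobs, allow_failures):
    """jobs: list of (attrs_dict, passed_bool). Returns 'passed'/'failed'."""
    def allowed(attrs):
        # A job may fail if every key/value pair of some entry matches it.
        return any(all(attrs.get(k) == v for k, v in entry.items())
                   for entry in allow_failures)
    for attrs, passed in jobs:
        if not passed and not allowed(attrs):
            return "failed"
    return "passed"

jobs = [
    ({"os": "linux"}, True),
    ({"os": "osx", "osx_image": "xcode7.3"}, True),
    ({"os": "osx", "osx_image": "xcode8"}, False),  # the flaky job
]
allow = [{"os": "osx", "osx_image": "xcode8"}]
print(build_status(jobs, allow))  # passed
print(build_status(jobs, []))     # failed
```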
5f897b9c45a9b03d6a5ada2406a76f500b6efe1d | lib/tasks/data_loader.rb | lib/tasks/data_loader.rb | module DataLoader
def check_for_file taskname
unless ENV['FILE']
puts ''
puts "usage: This task requires FILE=filename"
puts ''
exit 0
end
end
def parse(model, parser_class, skip_invalid=true)
check_for_file model
puts "Loading #{model} from #{ENV['FILE']}..."
parser = parser_class.new
parser.send("parse_#{model}".to_sym, ENV['FILE']) do |model|
begin
model.save!
rescue ActiveRecord::RecordInvalid => validation_error
if skip_invalid
puts validation_error
puts model.inspect
puts 'Continuing....'
else
raise
end
end
end
end
end | module DataLoader
def check_for_file
unless ENV['FILE']
puts ''
puts "usage: This task requires FILE=filename"
puts ''
exit 0
end
end
def check_for_dir
unless ENV['DIR']
puts ''
puts "usage: This task requires DIR=dirname"
puts ''
exit 0
end
end
def parse(model, parser_class, skip_invalid=true)
check_for_file
puts "Loading #{model} from #{ENV['FILE']}..."
parser = parser_class.new
parser.send("parse_#{model}".to_sym, ENV['FILE']) do |model|
begin
model.save!
rescue ActiveRecord::RecordInvalid => validation_error
if skip_invalid
puts validation_error
puts model.inspect
puts 'Continuing....'
else
raise
end
end
end
end
end | Add function to check for dir | Add function to check for dir
| Ruby | agpl-3.0 | mysociety/fixmytransport,mysociety/fixmytransport,mysociety/fixmytransport,mysociety/fixmytransport,mysociety/fixmytransport,mysociety/fixmytransport | ruby | ## Code Before:
module DataLoader
def check_for_file taskname
unless ENV['FILE']
puts ''
puts "usage: This task requires FILE=filename"
puts ''
exit 0
end
end
def parse(model, parser_class, skip_invalid=true)
check_for_file model
puts "Loading #{model} from #{ENV['FILE']}..."
parser = parser_class.new
parser.send("parse_#{model}".to_sym, ENV['FILE']) do |model|
begin
model.save!
rescue ActiveRecord::RecordInvalid => validation_error
if skip_invalid
puts validation_error
puts model.inspect
puts 'Continuing....'
else
raise
end
end
end
end
end
## Instruction:
Add function to check for dir
## Code After:
module DataLoader
def check_for_file
unless ENV['FILE']
puts ''
puts "usage: This task requires FILE=filename"
puts ''
exit 0
end
end
def check_for_dir
unless ENV['DIR']
puts ''
puts "usage: This task requires DIR=dirname"
puts ''
exit 0
end
end
def parse(model, parser_class, skip_invalid=true)
check_for_file
puts "Loading #{model} from #{ENV['FILE']}..."
parser = parser_class.new
parser.send("parse_#{model}".to_sym, ENV['FILE']) do |model|
begin
model.save!
rescue ActiveRecord::RecordInvalid => validation_error
if skip_invalid
puts validation_error
puts model.inspect
puts 'Continuing....'
else
raise
end
end
end
end
end |
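The rake helpers above (`check_for_file`, `check_for_dir`) abort early with a usage message when a required environment variable is missing, instead of failing later mid-import. A minimal Python sketch of the same guard, assuming nothing beyond the standard library; the sample path is invented for the demonstration.

```python
import os
import sys

# Environment-variable guard in the spirit of check_for_file /
# check_for_dir above: bail out with a usage message when the variable
# is missing. (The Ruby original also exits with status 0.)
def require_env(name, hint):
    """Return the value of env var `name`, or print usage and exit."""
    value = os.environ.get(name)
    if not value:
        print(f"usage: This task requires {name}={hint}")
        sys.exit(0)
    return value

os.environ["FILE"] = "/tmp/example.csv"  # simulate `FILE=... rake ...`
print(require_env("FILE", "filename"))   # /tmp/example.csv
```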
43fb2c9e4b039018fbd230acea3e48f6c7c39137 | lib/van_helsing/default.rb | lib/van_helsing/default.rb | task :force_unlock do
queue %{echo "-----> Unlocking"}
queue %{rm -f "#{deploy_to}/deploy.lock"}
queue %{exit 234}
run!
end
namespace :deploy do
desc "Links paths set in :shared_paths."
task :link_shared_paths do
validate_set :shared_paths
dirs = shared_paths.map { |file| File.dirname("#{release_path}/#{file}") }.uniq
cmds = dirs.map do |dir|
%{mkdir -p "#{dir}"}
end
cmds += shared_paths.map do |file|
%{ln -s "#{shared_path}/#{file}" "#{release_path}/#{file}"}
end
queue %{
echo "-----> Symlinking shared paths"
#{cmds.join(" &&\n")}
}
end
end
| namespace :deploy do
desc "Forces a deploy unlock."
task :force_unlock do
queue %{echo "-----> Unlocking"}
queue %{rm -f "#{deploy_to}/deploy.lock"}
queue %{exit 234}
run!
end
desc "Links paths set in :shared_paths."
task :link_shared_paths do
validate_set :shared_paths
dirs = shared_paths.map { |file| File.dirname("#{release_path}/#{file}") }.uniq
cmds = dirs.map do |dir|
%{mkdir -p "#{dir}"}
end
cmds += shared_paths.map do |file|
%{ln -s "#{shared_path}/#{file}" "#{release_path}/#{file}"}
end
queue %{
echo "-----> Symlinking shared paths"
#{cmds.join(" &&\n")}
}
end
end
| Rename force_unlock to deploy:force_unlock. Document deploy:force_unlock. | Rename force_unlock to deploy:force_unlock. Document deploy:force_unlock.
| Ruby | mit | stereodenis/mina,estum/mina,mfinelli/mina,flowerett/mina,scalp42/mina,estum/mina,dcalixto/mina,Zhomart/mina,chip/mina,ianks/mina,cyberwolfru/mina,flowerett/mina,Zhomart/mina,ianks/mina,mfinelli/mina,cyberwolfru/mina,scalp42/mina,whhx/mina,dcalixto/mina,allantatter/mina,whhx/mina,allantatter/mina,openaustralia/mina,openaustralia/mina | ruby | ## Code Before:
task :force_unlock do
queue %{echo "-----> Unlocking"}
queue %{rm -f "#{deploy_to}/deploy.lock"}
queue %{exit 234}
run!
end
namespace :deploy do
desc "Links paths set in :shared_paths."
task :link_shared_paths do
validate_set :shared_paths
dirs = shared_paths.map { |file| File.dirname("#{release_path}/#{file}") }.uniq
cmds = dirs.map do |dir|
%{mkdir -p "#{dir}"}
end
cmds += shared_paths.map do |file|
%{ln -s "#{shared_path}/#{file}" "#{release_path}/#{file}"}
end
queue %{
echo "-----> Symlinking shared paths"
#{cmds.join(" &&\n")}
}
end
end
## Instruction:
Rename force_unlock to deploy:force_unlock. Document deploy:force_unlock.
## Code After:
namespace :deploy do
desc "Forces a deploy unlock."
task :force_unlock do
queue %{echo "-----> Unlocking"}
queue %{rm -f "#{deploy_to}/deploy.lock"}
queue %{exit 234}
run!
end
desc "Links paths set in :shared_paths."
task :link_shared_paths do
validate_set :shared_paths
dirs = shared_paths.map { |file| File.dirname("#{release_path}/#{file}") }.uniq
cmds = dirs.map do |dir|
%{mkdir -p "#{dir}"}
end
cmds += shared_paths.map do |file|
%{ln -s "#{shared_path}/#{file}" "#{release_path}/#{file}"}
end
queue %{
echo "-----> Symlinking shared paths"
#{cmds.join(" &&\n")}
}
end
end
|
dfc5c18ffe19aba192be6a9e163a56470c554fc6 | app/scripts/pocket.coffee | app/scripts/pocket.coffee | 'use strict'
class Pocket
constructor: ->
element = document.createElement 'div'
element.classList.add 'pocket'
element.innerText = 'Pocket'
document.body.appendChild element
pocket = new Pocket()
| 'use strict'
class Pocket
State:
OPEN: 'open'
CLOSE: 'close'
state: null
constructor: ->
@state = @State.CLOSE
element = document.createElement 'div'
element.classList.add 'pocket'
element.innerText = 'Pocket'
element.addEventListener 'mouseover', @onMouseOver.bind(@)
element.addEventListener 'mouseleave', @onMouseLeave.bind(@)
document.body.appendChild element
onMouseOver: ->
@state = @State.OPEN
onMouseLeave: ->
@state = @State.CLOSE
pocket = new Pocket()
| Add open and close states by mouse over and leave | Add open and close states by mouse over and leave
| CoffeeScript | apache-2.0 | eqot/pocket | coffeescript | ## Code Before:
'use strict'
class Pocket
constructor: ->
element = document.createElement 'div'
element.classList.add 'pocket'
element.innerText = 'Pocket'
document.body.appendChild element
pocket = new Pocket()
## Instruction:
Add open and close states by mouse over and leave
## Code After:
'use strict'
class Pocket
State:
OPEN: 'open'
CLOSE: 'close'
state: null
constructor: ->
@state = @State.CLOSE
element = document.createElement 'div'
element.classList.add 'pocket'
element.innerText = 'Pocket'
element.addEventListener 'mouseover', @onMouseOver.bind(@)
element.addEventListener 'mouseleave', @onMouseLeave.bind(@)
document.body.appendChild element
onMouseOver: ->
@state = @State.OPEN
onMouseLeave: ->
@state = @State.CLOSE
pocket = new Pocket()
|
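The CoffeeScript widget above tracks a two-value open/close state flipped by `mouseover` and `mouseleave` handlers. The same state machine can be sketched in Python, with a dict of handlers simulating the DOM event wiring:

```python
# Two-state hover widget mirroring the Pocket class above: mouseover
# opens, mouseleave closes. A handler dict simulates addEventListener.
class Pocket:
    OPEN, CLOSE = "open", "close"

    def __init__(self):
        self.state = self.CLOSE
        self.handlers = {
            "mouseover": self.on_mouse_over,
            "mouseleave": self.on_mouse_leave,
        }

    def dispatch(self, event):
        self.handlers[event]()

    def on_mouse_over(self):
        self.state = self.OPEN

    def on_mouse_leave(self):
        self.state = self.CLOSE

p = Pocket()
p.dispatch("mouseover")
print(p.state)   # open
p.dispatch("mouseleave")
print(p.state)   # close
```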
479cb0f3cd200263580a62e7f0268d19ce9fffd7 | src/characters.js | src/characters.js | export const characters = Object.assign({},
require('./characters/geiger.jsx'),
require('./characters/grave.jsx'),
require('./characters/gwen.jsx'),
require('./characters/jaina.jsx'),
require('./characters/midori.jsx'),
require('./characters/rook.jsx'),
require('./characters/setsuki.jsx'),
require('./characters/troq.jsx'),
require('./characters/valerie.jsx'),
require('./characters/argagarg.jsx'),
require('./characters/degrey.jsx'),
require('./characters/lum.jsx'),
require('./characters/menelker.jsx'),
require('./characters/quince.jsx'),
require('./characters/persephone.jsx'),
);
export default characters
| export const characters = Object.assign({},
require('./characters/geiger.jsx'),
require('./characters/grave.jsx'),
require('./characters/gwen.jsx'),
require('./characters/jaina.jsx'),
require('./characters/midori.jsx'),
require('./characters/rook.jsx'),
require('./characters/setsuki.jsx'),
require('./characters/troq.jsx'),
require('./characters/valerie.jsx'),
require('./characters/argagarg.jsx'),
require('./characters/degrey.jsx'),
require('./characters/lum.jsx'),
require('./characters/menelker.jsx'),
require('./characters/quince.jsx'),
require('./characters/persephone.jsx'),
require('./characters/onimaru.jsx'),
);
export default characters
| Add Onimaru to the character list | Add Onimaru to the character list | JavaScript | mit | cpennington/yomibase,cpennington/yomibase | javascript | ## Code Before:
export const characters = Object.assign({},
require('./characters/geiger.jsx'),
require('./characters/grave.jsx'),
require('./characters/gwen.jsx'),
require('./characters/jaina.jsx'),
require('./characters/midori.jsx'),
require('./characters/rook.jsx'),
require('./characters/setsuki.jsx'),
require('./characters/troq.jsx'),
require('./characters/valerie.jsx'),
require('./characters/argagarg.jsx'),
require('./characters/degrey.jsx'),
require('./characters/lum.jsx'),
require('./characters/menelker.jsx'),
require('./characters/quince.jsx'),
require('./characters/persephone.jsx'),
);
export default characters
## Instruction:
Add Onimaru to the character list
## Code After:
export const characters = Object.assign({},
require('./characters/geiger.jsx'),
require('./characters/grave.jsx'),
require('./characters/gwen.jsx'),
require('./characters/jaina.jsx'),
require('./characters/midori.jsx'),
require('./characters/rook.jsx'),
require('./characters/setsuki.jsx'),
require('./characters/troq.jsx'),
require('./characters/valerie.jsx'),
require('./characters/argagarg.jsx'),
require('./characters/degrey.jsx'),
require('./characters/lum.jsx'),
require('./characters/menelker.jsx'),
require('./characters/quince.jsx'),
require('./characters/persephone.jsx'),
require('./characters/onimaru.jsx'),
);
export default characters
|
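The characters.js change above grows a registry built with `Object.assign({}, ...)`, merging every module's exports into one object; on duplicate keys, later sources win. Python's `dict.update` has the same semantics, sketched below with dicts standing in for the `require(...)` results:

```python
# Merging per-module export tables into one registry, like the
# Object.assign({}, require(...), ...) call in characters.js. As with
# Object.assign, later sources overwrite earlier ones on key clashes.
geiger_module = {"geiger": "Geiger data"}    # stand-ins for require(...)
grave_module = {"grave": "Grave data"}
onimaru_module = {"onimaru": "Onimaru data"}  # the newly added module

def merge_exports(*modules):
    registry = {}
    for module in modules:
        registry.update(module)  # dict.update == assign semantics
    return registry

characters = merge_exports(geiger_module, grave_module, onimaru_module)
print(sorted(characters))  # ['geiger', 'grave', 'onimaru']
```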
b32b047656abd28dd794ee16dfab682337a753b1 | accounts/tests.py | accounts/tests.py | from django.test import TestCase
# Create your tests here.
| from django.test import TestCase
class WelcomePageTest(TestCase):
def test_uses_welcome_template(self):
response = self.client.get('/')
self.assertTemplateUsed(response, 'accounts/welcome.html')
| Add first unit test for welcome page | Add first unit test for welcome page
| Python | mit | randomic/aniauth-tdd,randomic/aniauth-tdd | python | ## Code Before:
from django.test import TestCase
# Create your tests here.
## Instruction:
Add first unit test for welcome page
## Code After:
from django.test import TestCase
class WelcomePageTest(TestCase):
def test_uses_welcome_template(self):
response = self.client.get('/')
self.assertTemplateUsed(response, 'accounts/welcome.html')
|
555a0afbb704b1650c0e3208e45f1c12697d9404 | README.md | README.md | Stringizer is a standalone String Utility Library
## Status
Under Development and Unstable.
| Stringizer is a standalone String Utility Library
[](https://travis-ci.org/jasonlam604/Stringizer)
## Status
Under Development and Unstable.
| Add Travis-CI Build Badge and trigger first build | Add Travis-CI Build Badge and trigger first build | Markdown | mit | jasonlam604/Stringizer | markdown | ## Code Before:
Stringizer is a standalone String Utility Library
## Status
Under Development and Unstable.
## Instruction:
Add Travis-CI Build Badge and trigger first build
## Code After:
Stringizer is a standalone String Utility Library
[](https://travis-ci.org/jasonlam604/Stringizer)
## Status
Under Development and Unstable.
|
158985411b5cad8ea97548dc4316b00a32c4d987 | main.yml | main.yml | ---
# Before execute this, setup github account (set a public key)
- hosts: localhost # hostname
become: no
gather_facts: no
- include: tasks/dotfiles.yml
- include: tasks/basic-packages.yml
- include: tasks/icloud-directories.yml
- include: tasks/homebrew-cask.yml
- include: tasks/php.yml
- include: tasks/ruby.yml
- include: tasks/python.yml
- include: tasks/node.yml
- include: tasks/vagrant.yml
- include: tasks/system-preferences.yml
| ---
# Before execute this, setup github account (set a public key)
- hosts: localhost # hostname
become: no
gather_facts: no
- include: tasks/dotfiles.yml
- include: tasks/basic-packages.yml
- include: tasks/icloud-directories.yml
- include: tasks/homebrew-cask.yml
- include: tasks/php.yml
- include: tasks/ruby.yml
- include: tasks/python.yml
- include: tasks/node.yml
- include: tasks/java.yml
- include: tasks/vagrant.yml
- include: tasks/system-preferences.yml
| Add java task to default | Add java task to default
| YAML | mit | ikuwow/mac-provision | yaml | ## Code Before:
---
# Before execute this, setup github account (set a public key)
- hosts: localhost # hostname
become: no
gather_facts: no
- include: tasks/dotfiles.yml
- include: tasks/basic-packages.yml
- include: tasks/icloud-directories.yml
- include: tasks/homebrew-cask.yml
- include: tasks/php.yml
- include: tasks/ruby.yml
- include: tasks/python.yml
- include: tasks/node.yml
- include: tasks/vagrant.yml
- include: tasks/system-preferences.yml
## Instruction:
Add java task to default
## Code After:
---
# Before execute this, setup github account (set a public key)
- hosts: localhost # hostname
become: no
gather_facts: no
- include: tasks/dotfiles.yml
- include: tasks/basic-packages.yml
- include: tasks/icloud-directories.yml
- include: tasks/homebrew-cask.yml
- include: tasks/php.yml
- include: tasks/ruby.yml
- include: tasks/python.yml
- include: tasks/node.yml
- include: tasks/java.yml
- include: tasks/vagrant.yml
- include: tasks/system-preferences.yml
|
40d75d1541fba868b94c8c378d7c27afc272a1ee | deployments/r/image/extras.d/ph-290.r | deployments/r/image/extras.d/ph-290.r | print("Installing packages for PH 290W")
source("/tmp/class-libs.R")
class_name = "PH 290W"
class_libs = c(
"plotly", "4.9.2.1",
"kableExtra", "1.3.1",
"ggthemes", "4.2.0",
"formattable", "0.2.0.1"
)
class_libs_install_version(class_name, class_libs)
| print("Installing packages for PH 290W")
source("/tmp/class-libs.R")
class_name = "PH 290W"
class_libs = c(
"kableExtra", "1.1.0",
"plotly", "4.9.2.1",
"ggthemes", "4.2.0",
"formattable", "0.2.0.1"
)
class_libs_install_version(class_name, class_libs)
| Use older version of kableExtra | Use older version of kableExtra
*So* I think what's happening is that rocker pins to a
particular CRAN snapshot, and kableExtra's 1.3 is too new to
be on that snapshot. 1.1.0 is, and so works!
| R | bsd-3-clause | berkeley-dsep-infra/datahub,ryanlovett/datahub,ryanlovett/datahub,berkeley-dsep-infra/datahub,ryanlovett/datahub,berkeley-dsep-infra/datahub | r | ## Code Before:
print("Installing packages for PH 290W")
source("/tmp/class-libs.R")
class_name = "PH 290W"
class_libs = c(
"plotly", "4.9.2.1",
"kableExtra", "1.3.1",
"ggthemes", "4.2.0",
"formattable", "0.2.0.1"
)
class_libs_install_version(class_name, class_libs)
## Instruction:
Use older version of kableExtra
*So* I think what's happening is that rocker pins to a
particular CRAN snapshot, and kableExtra's 1.3 is too new to
be on that snapshot. 1.1.0 is, and so works!
## Code After:
print("Installing packages for PH 290W")
source("/tmp/class-libs.R")
class_name = "PH 290W"
class_libs = c(
"kableExtra", "1.1.0",
"plotly", "4.9.2.1",
"ggthemes", "4.2.0",
"formattable", "0.2.0.1"
)
class_libs_install_version(class_name, class_libs)
|
5e5d5890509934188c0a453bd5af6b9204ba49d1 | sakura-kubernetes/README.md | sakura-kubernetes/README.md | sakura-kubernetes
=================
Kubernetes manifest files for sakura.
make <manifest_name>
| sakura-kubernetes
=================
Kubernetes manifest files for sakura.
printf <passphrase> >passphrase.secret
make <manifest_name>
| Add instruction to add secret file | Add instruction to add secret file
| Markdown | unlicense | 10sr/machine-setups,10sr/machine-setups,10sr/server-provisions,10sr/server-provisions,10sr/machine-setups,10sr/machine-setups | markdown | ## Code Before:
sakura-kubernetes
=================
Kubernetes manifest files for sakura.
make <manifest_name>
## Instruction:
Add instruction to add secret file
## Code After:
sakura-kubernetes
=================
Kubernetes manifest files for sakura.
printf <passphrase> >passphrase.secret
make <manifest_name>
|
f6c36bbb5b5afec1a029213557b722e50dd6aaaa | test/test_run_script.py | test/test_run_script.py | def test_dummy():
assert True
| import subprocess
import pytest
def test_filter(tmp_path):
unit_test = tmp_path.joinpath('some_unit_test.sv')
unit_test.write_text('''
module some_unit_test;
import svunit_pkg::*;
`include "svunit_defines.svh"
string name = "some_ut";
svunit_testcase svunit_ut;
function void build();
svunit_ut = new(name);
endfunction
task setup();
svunit_ut.setup();
endtask
task teardown();
svunit_ut.teardown();
endtask
`SVUNIT_TESTS_BEGIN
`SVTEST(some_failing_test)
`FAIL_IF(1)
`SVTEST_END
`SVTEST(some_passing_test)
`FAIL_IF(0)
`SVTEST_END
`SVUNIT_TESTS_END
endmodule
''')
log = tmp_path.joinpath('run.log')
print('Filtering only the passing test should block the fail')
subprocess.check_call(['runSVUnit', '--filter', 'some_ut.some_passing_test'], cwd=tmp_path)
assert 'FAILED' not in log.read_text()
print('No explicit filter should cause both tests to run, hence trigger the fail')
subprocess.check_call(['runSVUnit'], cwd=tmp_path)
assert 'FAILED' in log.read_text()
| Add test for '--filter' option | Add test for '--filter' option
The goal now is to make this test pass by implementing the necessary
production code.
| Python | apache-2.0 | svunit/svunit,svunit/svunit,svunit/svunit | python | ## Code Before:
def test_dummy():
assert True
## Instruction:
Add test for '--filter' option
The goal now is to make this test pass by implementing the necessary
production code.
## Code After:
import subprocess
import pytest
def test_filter(tmp_path):
unit_test = tmp_path.joinpath('some_unit_test.sv')
unit_test.write_text('''
module some_unit_test;
import svunit_pkg::*;
`include "svunit_defines.svh"
string name = "some_ut";
svunit_testcase svunit_ut;
function void build();
svunit_ut = new(name);
endfunction
task setup();
svunit_ut.setup();
endtask
task teardown();
svunit_ut.teardown();
endtask
`SVUNIT_TESTS_BEGIN
`SVTEST(some_failing_test)
`FAIL_IF(1)
`SVTEST_END
`SVTEST(some_passing_test)
`FAIL_IF(0)
`SVTEST_END
`SVUNIT_TESTS_END
endmodule
''')
log = tmp_path.joinpath('run.log')
print('Filtering only the passing test should block the fail')
subprocess.check_call(['runSVUnit', '--filter', 'some_ut.some_passing_test'], cwd=tmp_path)
assert 'FAILED' not in log.read_text()
print('No explicit filter should cause both tests to run, hence trigger the fail')
subprocess.check_call(['runSVUnit'], cwd=tmp_path)
assert 'FAILED' in log.read_text()
|
fac227ff1fdc9f9065660ae40fa4481a9e49ca8e | app/controllers/books_controller.rb | app/controllers/books_controller.rb | class BooksController < ApplicationController
before_action :authorize_user, only: [:new, :create, :edit, :update, :destroy]
def index
params[:author_id] ? set_author_books_view : set_books_view
binding.pry
end
def new
@book = Book.new
end
def create
@book = Book.new(book_params)
if @book.save
redirect_to @book
else
render 'new'
end
end
def show
@book = Book.find(params[:id])
end
def edit
@book = Book.find(params[:id])
end
def update
end
def destroy
end
private
def book_params
params.require(:book).permit(:title, :publication_year, :quantity, :author_id, :synopsis, :classification, author_attributes: [:name, :born, :died])
end
def set_books_view
@books = Book.title_search(params[:query]).order(:title)
end
def set_author_books_view
@author = Author.find(params[:author_id])
end
end
| class BooksController < ApplicationController
before_action :authorize_user, only: [:new, :create, :edit, :update, :destroy]
before_action :set_book, only: [:show, :edit]
def index
params[:author_id] ? set_author_books_view : set_books_view
end
def new
@book = Book.new
end
def create
@book = Book.new(book_params)
if @book.save
redirect_to @book
else
render 'new'
end
end
def show
end
def edit
end
def update
end
def destroy
end
private
def book_params
params.require(:book).permit(:title, :publication_year, :quantity, :author_id, :synopsis, :classification, author_attributes: [:name, :born, :died])
end
def set_book
@book = Book.find(params[:id])
end
def set_books_view
@books = Book.title_search(params[:query]).order(:title)
end
def set_author_books_view
@author = Author.find(params[:author_id])
end
end
| Add before_action and set_book method | Add before_action and set_book method
| Ruby | mit | kevinladkins/rails-library,kevinladkins/rails-library,kevinladkins/rails-library | ruby | ## Code Before:
class BooksController < ApplicationController
before_action :authorize_user, only: [:new, :create, :edit, :update, :destroy]
def index
params[:author_id] ? set_author_books_view : set_books_view
binding.pry
end
def new
@book = Book.new
end
def create
@book = Book.new(book_params)
if @book.save
redirect_to @book
else
render 'new'
end
end
def show
@book = Book.find(params[:id])
end
def edit
@book = Book.find(params[:id])
end
def update
end
def destroy
end
private
def book_params
params.require(:book).permit(:title, :publication_year, :quantity, :author_id, :synopsis, :classification, author_attributes: [:name, :born, :died])
end
def set_books_view
@books = Book.title_search(params[:query]).order(:title)
end
def set_author_books_view
@author = Author.find(params[:author_id])
end
end
## Instruction:
Add before_action and set_book method
## Code After:
class BooksController < ApplicationController
before_action :authorize_user, only: [:new, :create, :edit, :update, :destroy]
before_action :set_book, only: [:show, :edit]
def index
params[:author_id] ? set_author_books_view : set_books_view
end
def new
@book = Book.new
end
def create
@book = Book.new(book_params)
if @book.save
redirect_to @book
else
render 'new'
end
end
def show
end
def edit
end
def update
end
def destroy
end
private
def book_params
params.require(:book).permit(:title, :publication_year, :quantity, :author_id, :synopsis, :classification, author_attributes: [:name, :born, :died])
end
def set_book
@book = Book.find(params[:id])
end
def set_books_view
@books = Book.title_search(params[:query]).order(:title)
end
def set_author_books_view
@author = Author.find(params[:author_id])
end
end
|
a199a9732efcfcd50205aed38c64716199c3477a | app/helpers/blacklight_cornell_requests/request_helper.rb | app/helpers/blacklight_cornell_requests/request_helper.rb | module BlacklightCornellRequests
module RequestHelper
# Time estimates are now delivered from the model in ranges (e.g., [1, 3])
# instead of integers. This function converts the range into a string for display
def delivery_estimate_display time_estimate
if time_estimate[0] == time_estimate[1]
pluralize(time_estimate[0], 'working day')
else
"#{time_estimate[0]} to #{time_estimate[1]} working days"
end
end
end
end
| module BlacklightCornellRequests
module RequestHelper
# Time estimates are now delivered from the model in ranges (e.g., [1, 3])
# instead of integers. This function converts the range into a string for display
def delivery_estimate_display time_estimate
if time_estimate[0] == time_estimate[1]
pluralize(time_estimate[0], 'working day')
else
"#{time_estimate[0]} to #{time_estimate[1]} working days"
end
end
def parsed_special_delivery(params)
unless params['program'].present? && params['program']['location_id'] > 0
Rails.logger.warn("Special Delivery: unable to find delivery location code in #{params.inspect}")
return {}
end
program = params['program']
office_delivery = program['location_id'] == 224
formatted_label = office_delivery ? "Office Delivery" : "Special Program Delivery"
formatted_label += " (#{params['delivery_location']})"
{
fod: office_delivery,
code: params['program']['location_id'],
value: formatted_label
}
end
end
end
| Add helper to format FOD data | Add helper to format FOD data
| Ruby | mit | cul-it/blacklight-cornell-requests,cul-it/blacklight-cornell-requests,cul-it/blacklight-cornell-requests | ruby | ## Code Before:
module BlacklightCornellRequests
module RequestHelper
# Time estimates are now delivered from the model in ranges (e.g., [1, 3])
# instead of integers. This function converts the range into a string for display
def delivery_estimate_display time_estimate
if time_estimate[0] == time_estimate[1]
pluralize(time_estimate[0], 'working day')
else
"#{time_estimate[0]} to #{time_estimate[1]} working days"
end
end
end
end
## Instruction:
Add helper to format FOD data
## Code After:
module BlacklightCornellRequests
module RequestHelper
# Time estimates are now delivered from the model in ranges (e.g., [1, 3])
# instead of integers. This function converts the range into a string for display
def delivery_estimate_display time_estimate
if time_estimate[0] == time_estimate[1]
pluralize(time_estimate[0], 'working day')
else
"#{time_estimate[0]} to #{time_estimate[1]} working days"
end
end
def parsed_special_delivery(params)
unless params['program'].present? && params['program']['location_id'] > 0
Rails.logger.warn("Special Delivery: unable to find delivery location code in #{params.inspect}")
return {}
end
program = params['program']
office_delivery = program['location_id'] == 224
formatted_label = office_delivery ? "Office Delivery" : "Special Program Delivery"
formatted_label += " (#{params['delivery_location']})"
{
fod: office_delivery,
code: params['program']['location_id'],
value: formatted_label
}
end
end
end
|
ee0d06e386f6b3df218287cd60de0d29adf2d9e5 | .travis.yml | .travis.yml | language: ruby
rvm:
- 1.9.3
- 2.0.0
- 2.1
- 2.2
- ruby-head
matrix:
allow_failures:
- rvm: ruby-head
fast_finish: true
cache: bundler
before_install:
- mkdir -p $HOME/bin
- export PATH=$HOME/bin:$PATH
- curl -L https://www.dropbox.com/s/577u8h68gbpkpzz/texmath?dl=1 -o $HOME/bin/texmath
| language: ruby
rvm:
- 1.9.3
- 2.0.0
- 2.1
- 2.2
- ruby-head
matrix:
allow_failures:
- rvm: ruby-head
fast_finish: true
cache: bundler
before_install:
- mkdir -p $HOME/bin
- export PATH=$HOME/bin:$PATH
- curl -L https://www.dropbox.com/s/577u8h68gbpkpzz/texmath?dl=1 -o $HOME/bin/texmath
- chmod +x $HOME/bin/texmath
| Add +x permissions to texmath binary on Travis | Add +x permissions to texmath binary on Travis
| YAML | mit | hollingberry/texmath-ruby | yaml | ## Code Before:
language: ruby
rvm:
- 1.9.3
- 2.0.0
- 2.1
- 2.2
- ruby-head
matrix:
allow_failures:
- rvm: ruby-head
fast_finish: true
cache: bundler
before_install:
- mkdir -p $HOME/bin
- export PATH=$HOME/bin:$PATH
- curl -L https://www.dropbox.com/s/577u8h68gbpkpzz/texmath?dl=1 -o $HOME/bin/texmath
## Instruction:
Add +x permissions to texmath binary on Travis
## Code After:
language: ruby
rvm:
- 1.9.3
- 2.0.0
- 2.1
- 2.2
- ruby-head
matrix:
allow_failures:
- rvm: ruby-head
fast_finish: true
cache: bundler
before_install:
- mkdir -p $HOME/bin
- export PATH=$HOME/bin:$PATH
- curl -L https://www.dropbox.com/s/577u8h68gbpkpzz/texmath?dl=1 -o $HOME/bin/texmath
- chmod +x $HOME/bin/texmath
|
3662a2ea16cfa093bdeccfd5fc2e594b4b3541e4 | usr.bin/awk/Makefile | usr.bin/awk/Makefile |
AWKSRC= ${.CURDIR}/../../contrib/one-true-awk
.PATH: ${AWKSRC}
PROG= nawk
SRCS= b.c lex.c lib.c main.c parse.c proctab.c run.c tran.c ytab.c ytab.h
CFLAGS+= -I. -I${AWKSRC}
DPADD= ${LIBM}
LDADD= -lm
.if ${MACHINE_ARCH} == "sparc64"
LINKS+= ${BINDIR}/nawk ${BINDIR}/awk
MLINKS+= nawk.1 awk.1
.endif
CLEANFILES+= maketab proctab.c ytab.c ytab.h
.ORDER: ytab.c ytab.h proctab.c
ytab.c ytab.h: ${AWKSRC}/awkgram.y
@echo Expect 42 reduce/shift conflicts and 83 reduce/reduce conflicts
${YACC} -d ${AWKSRC}/awkgram.y
mv -f y.tab.c ytab.c
mv -f y.tab.h ytab.h
proctab.c: maketab
./maketab > proctab.c
build-tools: maketab
maketab: ytab.h ${AWKSRC}/maketab.c
${CC} ${CFLAGS} ${AWKSRC}/maketab.c -o maketab
CLEANFILES+= nawk.1
nawk.1: awk.1
ln -sf ${.ALLSRC} ${.TARGET}
.include <bsd.prog.mk>
|
AWKSRC= ${.CURDIR}/../../contrib/one-true-awk
.PATH: ${AWKSRC}
PROG= nawk
SRCS= awkgram.y b.c lex.c lib.c main.c parse.c proctab.c run.c tran.c
CFLAGS+= -I. -I${AWKSRC}
DPADD= ${LIBM}
LDADD= -lm
.if ${MACHINE_ARCH} == "sparc64"
LINKS+= ${BINDIR}/nawk ${BINDIR}/awk
MLINKS+= nawk.1 awk.1
.endif
CLEANFILES+= maketab proctab.c ytab.c ytab.h
ytab.h: awkgram.h
ln -sf ${.ALLSRC} ${.TARGET}
proctab.c: maketab
./maketab > proctab.c
build-tools: maketab
maketab: ytab.h ${AWKSRC}/maketab.c
${CC} ${CFLAGS} ${AWKSRC}/maketab.c -o maketab
CLEANFILES+= nawk.1
nawk.1: awk.1
ln -sf ${.ALLSRC} ${.TARGET}
.include <bsd.prog.mk>
| Use our auto-YACC'ing rules to do most of the work. | Use our auto-YACC'ing rules to do most of the work.
Submitted by: bde
| unknown | bsd-3-clause | jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase | unknown | ## Code Before:
AWKSRC= ${.CURDIR}/../../contrib/one-true-awk
.PATH: ${AWKSRC}
PROG= nawk
SRCS= b.c lex.c lib.c main.c parse.c proctab.c run.c tran.c ytab.c ytab.h
CFLAGS+= -I. -I${AWKSRC}
DPADD= ${LIBM}
LDADD= -lm
.if ${MACHINE_ARCH} == "sparc64"
LINKS+= ${BINDIR}/nawk ${BINDIR}/awk
MLINKS+= nawk.1 awk.1
.endif
CLEANFILES+= maketab proctab.c ytab.c ytab.h
.ORDER: ytab.c ytab.h proctab.c
ytab.c ytab.h: ${AWKSRC}/awkgram.y
@echo Expect 42 reduce/shift conflicts and 83 reduce/reduce conflicts
${YACC} -d ${AWKSRC}/awkgram.y
mv -f y.tab.c ytab.c
mv -f y.tab.h ytab.h
proctab.c: maketab
./maketab > proctab.c
build-tools: maketab
maketab: ytab.h ${AWKSRC}/maketab.c
${CC} ${CFLAGS} ${AWKSRC}/maketab.c -o maketab
CLEANFILES+= nawk.1
nawk.1: awk.1
ln -sf ${.ALLSRC} ${.TARGET}
.include <bsd.prog.mk>
## Instruction:
Use our auto-YACC'ing rules to do most of the work.
Submitted by: bde
## Code After:
AWKSRC= ${.CURDIR}/../../contrib/one-true-awk
.PATH: ${AWKSRC}
PROG= nawk
SRCS= awkgram.y b.c lex.c lib.c main.c parse.c proctab.c run.c tran.c
CFLAGS+= -I. -I${AWKSRC}
DPADD= ${LIBM}
LDADD= -lm
.if ${MACHINE_ARCH} == "sparc64"
LINKS+= ${BINDIR}/nawk ${BINDIR}/awk
MLINKS+= nawk.1 awk.1
.endif
CLEANFILES+= maketab proctab.c ytab.c ytab.h
ytab.h: awkgram.h
ln -sf ${.ALLSRC} ${.TARGET}
proctab.c: maketab
./maketab > proctab.c
build-tools: maketab
maketab: ytab.h ${AWKSRC}/maketab.c
${CC} ${CFLAGS} ${AWKSRC}/maketab.c -o maketab
CLEANFILES+= nawk.1
nawk.1: awk.1
ln -sf ${.ALLSRC} ${.TARGET}
.include <bsd.prog.mk>
|
9e7b0fcb34a360d6f0ff593139b91411b59bcf82 | src/styles/base.less | src/styles/base.less | @import './common.less';
@import './colors.less';
@import './fonts.less';
@import './modules/grid';
@import 'modules/base';
@import 'modules/headings';
@import 'modules/form';
@import 'modules/buttons';
@import 'modules/user';
| @import './common.less';
@import './colors.less';
@import './fonts.less';
@import './modules/grid';
// @import 'modules/base';
// @import 'modules/headings';
// @import 'modules/form';
// @import 'modules/buttons';
// @import 'modules/user';
| Comment out old modules code | Comment out old modules code
| Less | mit | ryanbaer/busy,busyorg/busy,busyorg/busy,ryanbaer/busy | less | ## Code Before:
@import './common.less';
@import './colors.less';
@import './fonts.less';
@import './modules/grid';
@import 'modules/base';
@import 'modules/headings';
@import 'modules/form';
@import 'modules/buttons';
@import 'modules/user';
## Instruction:
Comment out old modules code
## Code After:
@import './common.less';
@import './colors.less';
@import './fonts.less';
@import './modules/grid';
// @import 'modules/base';
// @import 'modules/headings';
// @import 'modules/form';
// @import 'modules/buttons';
// @import 'modules/user';
|
92dcb4e012576a790f07fc24698f1663a1c5a31e | scripts/client.js | scripts/client.js | elation.require(['engine.engine', 'engine.things.light', 'janusweb.janusweb'], function() {
elation.component.add('janusweb.client', function() {
this.initEngine = function() {
//this.enginecfg.systems.push('admin');
}
this.initWorld = function() {
var things = this.world.load({
name: 'janusweb',
type: 'janusweb',
properties: {
},
things: {
ambient: {
name: 'ambient',
type: 'light_ambient',
properties: {
color: 0x444444
}
},
sun: {
name: 'sun',
type: 'light_directional',
properties: {
position: [-20,50,25],
intensity: 0.2
}
},
point: {
name: 'point01',
type: 'light_point',
properties: {
position: [22,19,-15],
intensity: 0.2
}
},
player: {
name: 'player',
type: 'player',
properties: {
position: [0,0,0],
mass: 10
}
},
}
});
}
}, elation.engine.client);
});
| elation.require(['engine.engine', 'engine.things.player', 'engine.things.light', 'janusweb.janusweb'], function() {
elation.component.add('janusweb.client', function() {
this.initEngine = function() {
//this.enginecfg.systems.push('admin');
}
this.initWorld = function() {
var things = this.world.load({
name: 'janusweb',
type: 'janusweb',
properties: {
},
things: {
ambient: {
name: 'ambient',
type: 'light_ambient',
properties: {
color: 0x444444
}
},
sun: {
name: 'sun',
type: 'light_directional',
properties: {
position: [-20,50,25],
intensity: 0.1
}
},
point: {
name: 'point01',
type: 'light_point',
properties: {
position: [22,19,-15],
intensity: 0.1
}
},
player: {
name: 'player',
type: 'player',
properties: {
position: [0,0,0],
mass: 10
}
},
}
});
this.player = things.children.janusweb.children.player;
}
}, elation.engine.client);
});
| Store reference to player, dim lights | Store reference to player, dim lights
| JavaScript | mit | jbaicoianu/janusweb,jbaicoianu/janusweb,jbaicoianu/janusweb | javascript | ## Code Before:
elation.require(['engine.engine', 'engine.things.light', 'janusweb.janusweb'], function() {
elation.component.add('janusweb.client', function() {
this.initEngine = function() {
//this.enginecfg.systems.push('admin');
}
this.initWorld = function() {
var things = this.world.load({
name: 'janusweb',
type: 'janusweb',
properties: {
},
things: {
ambient: {
name: 'ambient',
type: 'light_ambient',
properties: {
color: 0x444444
}
},
sun: {
name: 'sun',
type: 'light_directional',
properties: {
position: [-20,50,25],
intensity: 0.2
}
},
point: {
name: 'point01',
type: 'light_point',
properties: {
position: [22,19,-15],
intensity: 0.2
}
},
player: {
name: 'player',
type: 'player',
properties: {
position: [0,0,0],
mass: 10
}
},
}
});
}
}, elation.engine.client);
});
## Instruction:
Store reference to player, dim lights
## Code After:
elation.require(['engine.engine', 'engine.things.player', 'engine.things.light', 'janusweb.janusweb'], function() {
elation.component.add('janusweb.client', function() {
this.initEngine = function() {
//this.enginecfg.systems.push('admin');
}
this.initWorld = function() {
var things = this.world.load({
name: 'janusweb',
type: 'janusweb',
properties: {
},
things: {
ambient: {
name: 'ambient',
type: 'light_ambient',
properties: {
color: 0x444444
}
},
sun: {
name: 'sun',
type: 'light_directional',
properties: {
position: [-20,50,25],
intensity: 0.1
}
},
point: {
name: 'point01',
type: 'light_point',
properties: {
position: [22,19,-15],
intensity: 0.1
}
},
player: {
name: 'player',
type: 'player',
properties: {
position: [0,0,0],
mass: 10
}
},
}
});
this.player = things.children.janusweb.children.player;
}
}, elation.engine.client);
});
|
1c8f999e7f7af25ada4d12c1c9770291ed70e528 | _includes/_i18n/i18n.html | _includes/_i18n/i18n.html | {% assign plang = page.lang | default: site.langs[0] | default: "en" %}
{% assign val = include.value %}
{% if val[plang] %}
{% assign v = val[plang] %}
{% else %}
{% assign v = val %}
{% endif %}
{% if include.escape %}
{{ v | escape | strip_newlines }}
{% else %}
{{ v | strip_newlines }}
{% endif %}
| {% capture text %}
{% assign plang = page.lang | default: site.langs[0] | default: "en" %}
{% assign val = include.value %}
{% if val[plang] %}
{% assign v = val[plang] %}
{% else %}
{% assign v = val %}
{% endif %}
{% if include.escape %}
{{ v | escape | strip_newlines }}
{% else %}
{{ v | strip_newlines }}
{% endif %}
{% endcapture %}
{{ text | strip_newlines }}
| Fix some of the whitespaces problem | Fix some of the whitespaces problem
| HTML | mit | gaalcaras/academic,lschuetze/lschuetze.github.io,lschuetze/lschuetze.github.io,gaalcaras/academic | html | ## Code Before:
{% assign plang = page.lang | default: site.langs[0] | default: "en" %}
{% assign val = include.value %}
{% if val[plang] %}
{% assign v = val[plang] %}
{% else %}
{% assign v = val %}
{% endif %}
{% if include.escape %}
{{ v | escape | strip_newlines }}
{% else %}
{{ v | strip_newlines }}
{% endif %}
## Instruction:
Fix some of the whitespaces problem
## Code After:
{% capture text %}
{% assign plang = page.lang | default: site.langs[0] | default: "en" %}
{% assign val = include.value %}
{% if val[plang] %}
{% assign v = val[plang] %}
{% else %}
{% assign v = val %}
{% endif %}
{% if include.escape %}
{{ v | escape | strip_newlines }}
{% else %}
{{ v | strip_newlines }}
{% endif %}
{% endcapture %}
{{ text | strip_newlines }}
|
e085c548828cf81e274e084be51f06ffd4335df7 | lib/paper_trail/config.rb | lib/paper_trail/config.rb |
require "singleton"
require "paper_trail/serializers/yaml"
module PaperTrail
# Global configuration affecting all threads. Some thread-specific
# configuration can be found in `paper_trail.rb`, others in `controller.rb`.
class Config
include Singleton
attr_accessor(
:association_reify_error_behaviour,
:classes_warned_about_sti_item_types,
:i_have_updated_my_existing_item_types,
:object_changes_adapter,
:serializer,
:version_limit
)
def initialize
# Variables which affect all threads, whose access is synchronized.
@mutex = Mutex.new
@enabled = true
# Variables which affect all threads, whose access is *not* synchronized.
@serializer = PaperTrail::Serializers::YAML
end
# Indicates whether PaperTrail is on or off. Default: true.
def enabled
@mutex.synchronize { !!@enabled }
end
def enabled=(enable)
@mutex.synchronize { @enabled = enable }
end
end
end
|
require "singleton"
require "paper_trail/serializers/yaml"
module PaperTrail
# Global configuration affecting all threads. Some thread-specific
# configuration can be found in `paper_trail.rb`, others in `controller.rb`.
class Config
include Singleton
attr_accessor(
:association_reify_error_behaviour,
:classes_warned_about_sti_item_types,
:i_have_updated_my_existing_item_types,
:object_changes_adapter,
:serializer,
:version_limit
)
def initialize
# Variables which affect all threads, whose access is synchronized.
@mutex = Mutex.new
@enabled = true
# Variables which affect all threads, whose access is *not* synchronized.
@serializer = PaperTrail::Serializers::YAML
end
# Indicates whether PaperTrail is on or off. Default: true.
def enabled
@mutex.synchronize { !!@enabled }
end
def enabled=(enable)
@mutex.synchronize { @enabled = enable }
end
def track_associations=(_)
raise AssociationTrackingRemovedError
end
def track_associations?
raise AssociationTrackingRemovedError
end
# Error for PT v10.x for when association tracking is attempted to be
# used without the paper_trail-association_tracking gem present
class AssociationTrackingRemovedError < RuntimeError
MESSAGE_FMT = "Association Tracking for PaperTrail has been extracted "\
"to a seperate gem. Please add "\
"`paper_trail-association_tracking` to your Gemfile."
def message
format(MESSAGE_FMT)
end
end
end
end
| Add association tracking removal exception | Add association tracking removal exception
| Ruby | mit | airblade/paper_trail,airblade/paper_trail | ruby | ## Code Before:
require "singleton"
require "paper_trail/serializers/yaml"
module PaperTrail
# Global configuration affecting all threads. Some thread-specific
# configuration can be found in `paper_trail.rb`, others in `controller.rb`.
class Config
include Singleton
attr_accessor(
:association_reify_error_behaviour,
:classes_warned_about_sti_item_types,
:i_have_updated_my_existing_item_types,
:object_changes_adapter,
:serializer,
:version_limit
)
def initialize
# Variables which affect all threads, whose access is synchronized.
@mutex = Mutex.new
@enabled = true
# Variables which affect all threads, whose access is *not* synchronized.
@serializer = PaperTrail::Serializers::YAML
end
# Indicates whether PaperTrail is on or off. Default: true.
def enabled
@mutex.synchronize { !!@enabled }
end
def enabled=(enable)
@mutex.synchronize { @enabled = enable }
end
end
end
## Instruction:
Add association tracking removal exception
## Code After:
require "singleton"
require "paper_trail/serializers/yaml"
module PaperTrail
# Global configuration affecting all threads. Some thread-specific
# configuration can be found in `paper_trail.rb`, others in `controller.rb`.
class Config
include Singleton
attr_accessor(
:association_reify_error_behaviour,
:classes_warned_about_sti_item_types,
:i_have_updated_my_existing_item_types,
:object_changes_adapter,
:serializer,
:version_limit
)
def initialize
# Variables which affect all threads, whose access is synchronized.
@mutex = Mutex.new
@enabled = true
# Variables which affect all threads, whose access is *not* synchronized.
@serializer = PaperTrail::Serializers::YAML
end
# Indicates whether PaperTrail is on or off. Default: true.
def enabled
@mutex.synchronize { !!@enabled }
end
def enabled=(enable)
@mutex.synchronize { @enabled = enable }
end
def track_associations=(_)
raise AssociationTrackingRemovedError
end
def track_associations?
raise AssociationTrackingRemovedError
end
# Error for PT v10.x for when association tracking is attempted to be
# used without the paper_trail-association_tracking gem present
class AssociationTrackingRemovedError < RuntimeError
MESSAGE_FMT = "Association Tracking for PaperTrail has been extracted "\
"to a seperate gem. Please add "\
"`paper_trail-association_tracking` to your Gemfile."
def message
format(MESSAGE_FMT)
end
end
end
end
|
868fe1989565dc2fb01de36e83fa91dc282e97d5 | config/announce.yml | config/announce.yml | app_name: feeder
subscribe:
story:
- create
- update
- delete
- publish
- unpublish
feed_entry:
- create
- update
- delete
| app_name: feeder
subscribe:
story:
- create
- update
- delete
- publish
- unpublish
series:
- create
- update
- delete
feed_entry:
- create
- update
- delete
| Add in config for cms series subscription | Add in config for cms series subscription
| YAML | agpl-3.0 | PRX/feeder.prx.org,PRX/feeder.prx.org,PRX/feeder.prx.org | yaml | ## Code Before:
app_name: feeder
subscribe:
story:
- create
- update
- delete
- publish
- unpublish
feed_entry:
- create
- update
- delete
## Instruction:
Add in config for cms series subscription
## Code After:
app_name: feeder
subscribe:
story:
- create
- update
- delete
- publish
- unpublish
series:
- create
- update
- delete
feed_entry:
- create
- update
- delete
|
9fe573614e2f3ca9a6e738afb7f1af84b541092c | invertedindex.py | invertedindex.py |
class InvertedIndex:
def __init__(self):
self.index = dict()
def add_mail(self, mail):
for key in ["simple_terms_body", "complexe_terms_body"]:
for terms in mail[key]:
if terms in self.index.keys():
self.index[terms].append((mail["name"], mail[key][terms]))
else:
self.index[terms] = list()
self.index[terms].append((mail["name"], mail[key][terms]))
|
class InvertedIndex:
def __init__(self):
self.index = dict()
def add_mail(self, mail):
for key in ["simple_terms_body", "complexe_terms_body"]:
for terms in mail[key]:
if terms in self.index.keys():
self.index[terms].append((mail["name"], mail[key][terms]))
else:
self.index[terms] = list()
self.index[terms].append((mail["name"], mail[key][terms]))
def terms(self):
for terms in self.index.keys():
yield terms
def get_terms(self):
return self.index.keys()
def file_counter(self, terms):
for val in self.index[terms]:
yield val
def get_file_counter(self, terms):
return self.index.values()
def file(self, terms):
for val in file_counter(terms):
yield val[0]
def counter(self, terms):
for val in file_counter(terms):
yield val[1]
| Add some access function to inverted index | Add some access function to inverted index
| Python | mit | Nedgang/adt_project | python | ## Code Before:
class InvertedIndex:
def __init__(self):
self.index = dict()
def add_mail(self, mail):
for key in ["simple_terms_body", "complexe_terms_body"]:
for terms in mail[key]:
if terms in self.index.keys():
self.index[terms].append((mail["name"], mail[key][terms]))
else:
self.index[terms] = list()
self.index[terms].append((mail["name"], mail[key][terms]))
## Instruction:
Add some access function to inverted index
## Code After:
class InvertedIndex:
def __init__(self):
self.index = dict()
def add_mail(self, mail):
for key in ["simple_terms_body", "complexe_terms_body"]:
for terms in mail[key]:
if terms in self.index.keys():
self.index[terms].append((mail["name"], mail[key][terms]))
else:
self.index[terms] = list()
self.index[terms].append((mail["name"], mail[key][terms]))
def terms(self):
for terms in self.index.keys():
yield terms
def get_terms(self):
return self.index.keys()
def file_counter(self, terms):
for val in self.index[terms]:
yield val
def get_file_counter(self, terms):
return self.index.values()
def file(self, terms):
for val in file_counter(terms):
yield val[0]
def counter(self, terms):
for val in file_counter(terms):
yield val[1]
|
078e78ff546604767479bae736173e852e3fc380 | README.md | README.md |
- [User Manual](tessdoc/)
## Tesseract Source Code Documentation
This documentation was built with Doxygen from the
[Tesseract source code](https://github.com/tesseract-ocr/tesseract).
- [3.05.02](tessapi/3.05.02/)
- [3.x](tessapi/3.x/)
- [4.0.0](tessapi/4.0.0/)
## Publications
- [Various documents related to Tesseract OCR](http://tesseract-ocr.github.io/docs/)
|
- [User Manual](tessdoc/)
## Tesseract Source Code Documentation
This documentation was built with Doxygen from the
[Tesseract source code](https://github.com/tesseract-ocr/tesseract).
- [3.05.02](tessapi/3.05.02/)
- [3.x](tessapi/3.x/)
- [4.0.0](tessapi/4.0.0/)
- [latest](tessapi/5.x/)
## Publications
- [Various documents related to Tesseract OCR](http://tesseract-ocr.github.io/docs/)
| Add link to 5.x documentation | Add link to 5.x documentation
Signed-off-by: Stefan Weil <8d4c780fcfdc41841e5070f4c43da8958ba6aec0@weilnetz.de>
| Markdown | apache-2.0 | tesseract-ocr/tesseract-ocr.github.io | markdown | ## Code Before:
- [User Manual](tessdoc/)
## Tesseract Source Code Documentation
This documentation was built with Doxygen from the
[Tesseract source code](https://github.com/tesseract-ocr/tesseract).
- [3.05.02](tessapi/3.05.02/)
- [3.x](tessapi/3.x/)
- [4.0.0](tessapi/4.0.0/)
## Publications
- [Various documents related to Tesseract OCR](http://tesseract-ocr.github.io/docs/)
## Instruction:
Add link to 5.x documentation
Signed-off-by: Stefan Weil <8d4c780fcfdc41841e5070f4c43da8958ba6aec0@weilnetz.de>
## Code After:
- [User Manual](tessdoc/)
## Tesseract Source Code Documentation
This documentation was built with Doxygen from the
[Tesseract source code](https://github.com/tesseract-ocr/tesseract).
- [3.05.02](tessapi/3.05.02/)
- [3.x](tessapi/3.x/)
- [4.0.0](tessapi/4.0.0/)
- [latest](tessapi/5.x/)
## Publications
- [Various documents related to Tesseract OCR](http://tesseract-ocr.github.io/docs/)
|
64c8fd3fa18dd6644a67cbd9e9aa5f20eb5e85a7 | var/spack/packages/mrnet/package.py | var/spack/packages/mrnet/package.py | from spack import *
class Mrnet(Package):
"""The MRNet Multi-Cast Reduction Network."""
homepage = "http://paradyn.org/mrnet"
url = "ftp://ftp.cs.wisc.edu/paradyn/mrnet/mrnet_4.0.0.tar.gz"
version('4.0.0', 'd00301c078cba57ef68613be32ceea2f')
version('4.1.0', '5a248298b395b329e2371bf25366115c')
parallel = False
depends_on("boost")
def install(self, spec, prefix):
configure("--prefix=%s" %prefix, "--enable-shared")
make()
make("install")
| from spack import *
class Mrnet(Package):
"""The MRNet Multi-Cast Reduction Network."""
homepage = "http://paradyn.org/mrnet"
url = "ftp://ftp.cs.wisc.edu/paradyn/mrnet/mrnet_4.0.0.tar.gz"
version('4.0.0', 'd00301c078cba57ef68613be32ceea2f')
version('4.1.0', '5a248298b395b329e2371bf25366115c')
variant('krelloptions', default=False, description="Also build the MRNet LW threadsafe libraries")
parallel = False
depends_on("boost")
def install(self, spec, prefix):
# Build the MRNet LW thread safe libraries when the krelloptions variant is present
if '+krelloptions' in spec:
configure("--prefix=%s" %prefix, "--enable-shared", "--enable-ltwt-threadsafe")
else:
configure("--prefix=%s" %prefix, "--enable-shared")
make()
make("install")
| Add krelloptions variant that is used to turn on a configuration option to build the thread safe lightweight libraries. | Add krelloptions variant that is used to turn on a configuration option to build the thread safe lightweight libraries.
| Python | lgpl-2.1 | EmreAtes/spack,krafczyk/spack,matthiasdiener/spack,matthiasdiener/spack,tmerrick1/spack,krafczyk/spack,LLNL/spack,matthiasdiener/spack,EmreAtes/spack,lgarren/spack,iulian787/spack,EmreAtes/spack,mfherbst/spack,krafczyk/spack,matthiasdiener/spack,skosukhin/spack,skosukhin/spack,mfherbst/spack,EmreAtes/spack,TheTimmy/spack,lgarren/spack,mfherbst/spack,mfherbst/spack,matthiasdiener/spack,LLNL/spack,lgarren/spack,krafczyk/spack,EmreAtes/spack,tmerrick1/spack,tmerrick1/spack,TheTimmy/spack,lgarren/spack,LLNL/spack,iulian787/spack,krafczyk/spack,skosukhin/spack,LLNL/spack,TheTimmy/spack,TheTimmy/spack,tmerrick1/spack,LLNL/spack,skosukhin/spack,iulian787/spack,TheTimmy/spack,mfherbst/spack,lgarren/spack,skosukhin/spack,tmerrick1/spack,iulian787/spack,iulian787/spack | python | ## Code Before:
from spack import *
class Mrnet(Package):
"""The MRNet Multi-Cast Reduction Network."""
homepage = "http://paradyn.org/mrnet"
url = "ftp://ftp.cs.wisc.edu/paradyn/mrnet/mrnet_4.0.0.tar.gz"
version('4.0.0', 'd00301c078cba57ef68613be32ceea2f')
version('4.1.0', '5a248298b395b329e2371bf25366115c')
parallel = False
depends_on("boost")
def install(self, spec, prefix):
configure("--prefix=%s" %prefix, "--enable-shared")
make()
make("install")
## Instruction:
Add krelloptions variant that is used to turn on a configuration option to build the thread safe lightweight libraries.
## Code After:
from spack import *
class Mrnet(Package):
"""The MRNet Multi-Cast Reduction Network."""
homepage = "http://paradyn.org/mrnet"
url = "ftp://ftp.cs.wisc.edu/paradyn/mrnet/mrnet_4.0.0.tar.gz"
version('4.0.0', 'd00301c078cba57ef68613be32ceea2f')
version('4.1.0', '5a248298b395b329e2371bf25366115c')
variant('krelloptions', default=False, description="Also build the MRNet LW threadsafe libraries")
parallel = False
depends_on("boost")
def install(self, spec, prefix):
# Build the MRNet LW thread safe libraries when the krelloptions variant is present
if '+krelloptions' in spec:
configure("--prefix=%s" %prefix, "--enable-shared", "--enable-ltwt-threadsafe")
else:
configure("--prefix=%s" %prefix, "--enable-shared")
make()
make("install")
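The branch added to `install()` can be modeled outside Spack. A minimal Python sketch of the flag selection (the variant and configure-flag names mirror the package above; the helper function itself is hypothetical):

```python
def mrnet_configure_args(prefix, variants):
    """Return configure arguments; the lightweight thread-safe flag is
    appended only when the '+krelloptions' variant is active."""
    args = ["--prefix=%s" % prefix, "--enable-shared"]
    if "+krelloptions" in variants:
        args.append("--enable-ltwt-threadsafe")
    return args


print(mrnet_configure_args("/opt/mrnet", "+krelloptions"))
# → ['--prefix=/opt/mrnet', '--enable-shared', '--enable-ltwt-threadsafe']
print(mrnet_configure_args("/opt/mrnet", ""))
# → ['--prefix=/opt/mrnet', '--enable-shared']
```

Keeping the common arguments in one list and appending the optional flag avoids duplicating the whole `configure(...)` call per branch.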
|
2bf161dcbd36c8fc0e3fa5f63b89f30cc7520b5e | simd/simd_amd64.s | simd/simd_amd64.s | // func haveSSSE3() bool
TEXT ·SSSE3(SB),NOSPLIT,$0
MOVQ $1, AX
CPUID
SHRQ $9, CX
ANDQ $1, CX
MOVB CX, ret+0(FP)
RET
TEXT ·CpuId(SB),$0-12
MOVL ax+8(FP), AX
CPUID
MOVQ info+0(FP), DI
MOVL AX, 0(DI)
MOVL BX, 4(DI)
MOVL CX, 8(DI)
MOVL DX, 12(DI)
RET
| // (copied from Nigel Tao's shiny library, https://github.com/golang/exp/blob/master/shiny/driver/internal/swizzle/swizzle_amd64.s)
// func haveSSSE3() bool
TEXT ·SSSE3(SB),NOSPLIT,$0
MOVQ $1, AX
CPUID
SHRQ $9, CX
ANDQ $1, CX
MOVB CX, ret+0(FP)
RET
// (copied from the Go dist tool, https://github.com/golang/go/blob/master/src/cmd/dist/cpuid_amd64.s)
TEXT ·CpuId(SB),$0-12
MOVL ax+8(FP), AX
CPUID
MOVQ info+0(FP), DI
MOVL AX, 0(DI)
MOVL BX, 4(DI)
MOVL CX, 8(DI)
MOVL DX, 12(DI)
RET
| Add attribution for Cpuid and SSSE3 | Add attribution for Cpuid and SSSE3
Cpuid copied from the Go dist tool, [cpuid_amd64.s](https://github.com/golang/go/blob/master/src/cmd/dist/cpuid_amd64.s)
SSSE3 detection copied from Nigel Tao's shiny library, SSSE3 appears
useful, [benchmark numbers](https://github.com/golang/exp/commit/8fada217b14b6cb1c1ee832f8503f995c0cec0ce).
| GAS | mit | bjwbell/gensimd,bjwbell/gensimd | gas | ## Code Before:
// func haveSSSE3() bool
TEXT ·SSSE3(SB),NOSPLIT,$0
MOVQ $1, AX
CPUID
SHRQ $9, CX
ANDQ $1, CX
MOVB CX, ret+0(FP)
RET
TEXT ·CpuId(SB),$0-12
MOVL ax+8(FP), AX
CPUID
MOVQ info+0(FP), DI
MOVL AX, 0(DI)
MOVL BX, 4(DI)
MOVL CX, 8(DI)
MOVL DX, 12(DI)
RET
## Instruction:
Add attribution for Cpuid and SSSE3
Cpuid copied from the Go dist tool, [cpuid_amd64.s](https://github.com/golang/go/blob/master/src/cmd/dist/cpuid_amd64.s)
SSSE3 detection copied from Nigel Tao's shiny library, SSSE3 appears
useful, [benchmark numbers](https://github.com/golang/exp/commit/8fada217b14b6cb1c1ee832f8503f995c0cec0ce).
## Code After:
// (copied from Nigel Tao's shiny library, https://github.com/golang/exp/blob/master/shiny/driver/internal/swizzle/swizzle_amd64.s)
// func haveSSSE3() bool
TEXT ·SSSE3(SB),NOSPLIT,$0
MOVQ $1, AX
CPUID
SHRQ $9, CX
ANDQ $1, CX
MOVB CX, ret+0(FP)
RET
// (copied from the Go dist tool, https://github.com/golang/go/blob/master/src/cmd/dist/cpuid_amd64.s)
TEXT ·CpuId(SB),$0-12
MOVL ax+8(FP), AX
CPUID
MOVQ info+0(FP), DI
MOVL AX, 0(DI)
MOVL BX, 4(DI)
MOVL CX, 8(DI)
MOVL DX, 12(DI)
RET
|
10184838d909de0786aa1a0638342646b3c235b3 | .eslintrc.json | .eslintrc.json | {
"env": {
"browser": true,
"commonjs": true,
"es6": true,
"node": true
},
"extends": "eslint:recommended",
"rules": {
"indent": [
"error",
4
],
"linebreak-style": [
"error",
"unix"
],
"quotes": [
"error",
"double"
],
"semi": [
"error",
"always"
]
}
} | {
"env": {
"browser": true,
"commonjs": true,
"es6": true,
"node": true
},
"extends": "eslint:recommended",
"rules": {
"arrow-body-style": [
"warn"
],
"arrow-spacing": [
"error"
],
"brace-style": [
"error",
"1tbs"
],
"comma-spacing": [
"error"
],
"comma-style": [
"error"
],
"func-call-spacing": [
"error"
],
"indent": [
"error",
4
],
"key-spacing": [
"error"
],
"keyword-spacing": [
"error"
],
"linebreak-style": [
"error",
"unix"
],
"new-cap": [
"warn"
],
"no-trailing-spaces": [
"warn"
],
"no-unneeded-ternary": [
"warn"
],
"no-useless-computed-key": [
"error"
],
"no-var": [
"error"
],
"no-whitespace-before-property": [
"warn"
],
"object-curly-spacing": [
"warn"
],
"object-shorthand": [
"warn"
],
"operator-linebreak": [
"error",
"after"
],
"prefer-arrow-callback": [
"error",
{"allowNamedFunctions": true}
],
"prefer-rest-params": [
"error"
],
"quotes": [
"error",
"double"
],
"rest-spread-spacing": [
"error"
],
"semi": [
"error",
"always"
],
"semi-spacing": [
"error"
],
"semi-style": [
"error"
],
"space-before-function-paren": [
"error",
{"anonymous": "always", "named": "never", "asyncArrow": "always"}
],
"space-in-parens": [
"error"
],
"space-infix-ops": [
"error"
],
"space-unary-ops": [
"error"
],
"spaced-comment": [
"warn"
],
"switch-colon-spacing": [
"error"
]
}
} | Expand code style linting rules | Expand code style linting rules
| JSON | mit | Amphiluke/potprox,Amphiluke/potprox,Amphiluke/potprox | json | ## Code Before:
{
"env": {
"browser": true,
"commonjs": true,
"es6": true,
"node": true
},
"extends": "eslint:recommended",
"rules": {
"indent": [
"error",
4
],
"linebreak-style": [
"error",
"unix"
],
"quotes": [
"error",
"double"
],
"semi": [
"error",
"always"
]
}
}
## Instruction:
Expand code style linting rules
## Code After:
{
"env": {
"browser": true,
"commonjs": true,
"es6": true,
"node": true
},
"extends": "eslint:recommended",
"rules": {
"arrow-body-style": [
"warn"
],
"arrow-spacing": [
"error"
],
"brace-style": [
"error",
"1tbs"
],
"comma-spacing": [
"error"
],
"comma-style": [
"error"
],
"func-call-spacing": [
"error"
],
"indent": [
"error",
4
],
"key-spacing": [
"error"
],
"keyword-spacing": [
"error"
],
"linebreak-style": [
"error",
"unix"
],
"new-cap": [
"warn"
],
"no-trailing-spaces": [
"warn"
],
"no-unneeded-ternary": [
"warn"
],
"no-useless-computed-key": [
"error"
],
"no-var": [
"error"
],
"no-whitespace-before-property": [
"warn"
],
"object-curly-spacing": [
"warn"
],
"object-shorthand": [
"warn"
],
"operator-linebreak": [
"error",
"after"
],
"prefer-arrow-callback": [
"error",
{"allowNamedFunctions": true}
],
"prefer-rest-params": [
"error"
],
"quotes": [
"error",
"double"
],
"rest-spread-spacing": [
"error"
],
"semi": [
"error",
"always"
],
"semi-spacing": [
"error"
],
"semi-style": [
"error"
],
"space-before-function-paren": [
"error",
{"anonymous": "always", "named": "never", "asyncArrow": "always"}
],
"space-in-parens": [
"error"
],
"space-infix-ops": [
"error"
],
"space-unary-ops": [
"error"
],
"spaced-comment": [
"warn"
],
"switch-colon-spacing": [
"error"
]
}
} |
34a476e0273ffcae759194ae684608e92273e4c5 | SearchPage.js | SearchPage.js | 'use strict';
import React from 'react';
import {
StyleSheet,
Text,
TextInput,
View,
Image,
TouchableHighlight,
ActivityIndicatorIOS
} from 'react-native';
const styles = StyleSheet.create({
description:{
marginBottom:20,
fontSize:18,
textAlign:'center',
color:'#656565'},
container:{
padding:30,
marginTop:65,
alignItems:'center'}
});
class SearchPage extends React.Component{
render(){
return (
<View
style = {styles.container}>
<Text style={styles.description}> Search for house to buy</Text>
<Text style={styles.description}> Search by place-name, postcode or search near your location.</Text>
</View>
);
}
}
export default SearchPage; | 'use strict';
import React from 'react';
import {
StyleSheet,
Text,
TextInput,
View,
Image,
TouchableHighlight,
ActivityIndicatorIOS
} from 'react-native';
const styles = StyleSheet.create({
description:{
marginBottom:20,
fontSize:18,
textAlign:'center',
color:'#656565'},
container:{
padding:30,
marginTop:65,
alignItems:'center'},
flowRight:{
flexDirection:'row',
alignItems:'center',
alignSelf:'stretch'},
buttonText:{
fontSize:18,
color:'white',
alignSelf:'center'},
button:{
height:36,
flex:1,
flexDirection:'row',
backgroundColor:'#48BBEC',
borderColor:'#48BBEC',
borderWidth:1,
borderRadius:8,
marginBottom:10,
alignSelf:'stretch',
justifyContent:'center'},
searchInput:{
height:36,
padding:4,
marginRight:5,
flex:4,
fontSize:18,
borderWidth:1,
borderColor:'#48BBEC',
borderRadius:8,
color:'#48BBEC'}
});
class SearchPage extends React.Component{
render(){
return (
<View
style = {styles.container}>
<Text style={styles.description}> Search for house to buy</Text>
<Text style={styles.description}> Search by place-name, postcode or search near your location.</Text>
<View style={styles.flowRight}>
<TextInput style={styles.searchInput} placeholder='Search via name or postcode'/>
<TouchableHighlight style={styles.button} underlayColor='#99d9f4'>
<Text style={styles.buttonText}>Go</Text>
</TouchableHighlight>
</View>
<TouchableHighlight style={styles.button}
underlayColor='#99d9f4'>
<Text style={styles.buttonText}>Location</Text>
</TouchableHighlight>
</View>
);
}
}
export default SearchPage; | Add search input and search button | Add search input and search button
| JavaScript | mit | wxmaudio/ReactNativePropertyFinder,wxmaudio/ReactNativePropertyFinder,wxmaudio/ReactNativePropertyFinder | javascript | ## Code Before:
'use strict';
import React from 'react';
import {
StyleSheet,
Text,
TextInput,
View,
Image,
TouchableHighlight,
ActivityIndicatorIOS
} from 'react-native';
const styles = StyleSheet.create({
description:{
marginBottom:20,
fontSize:18,
textAlign:'center',
color:'#656565'},
container:{
padding:30,
marginTop:65,
alignItems:'center'}
});
class SearchPage extends React.Component{
render(){
return (
<View
style = {styles.container}>
<Text style={styles.description}> Search for house to buy</Text>
<Text style={styles.description}> Search by place-name, postcode or search near your location.</Text>
</View>
);
}
}
export default SearchPage;
## Instruction:
Add search input and search button
## Code After:
'use strict';
import React from 'react';
import {
StyleSheet,
Text,
TextInput,
View,
Image,
TouchableHighlight,
ActivityIndicatorIOS
} from 'react-native';
const styles = StyleSheet.create({
description:{
marginBottom:20,
fontSize:18,
textAlign:'center',
color:'#656565'},
container:{
padding:30,
marginTop:65,
alignItems:'center'},
flowRight:{
flexDirection:'row',
alignItems:'center',
alignSelf:'stretch'},
buttonText:{
fontSize:18,
color:'white',
alignSelf:'center'},
button:{
height:36,
flex:1,
flexDirection:'row',
backgroundColor:'#48BBEC',
borderColor:'#48BBEC',
borderWidth:1,
borderRadius:8,
marginBottom:10,
alignSelf:'stretch',
justifyContent:'center'},
searchInput:{
height:36,
padding:4,
marginRight:5,
flex:4,
fontSize:18,
borderWidth:1,
borderColor:'#48BBEC',
borderRadius:8,
color:'#48BBEC'}
});
class SearchPage extends React.Component{
render(){
return (
<View
style = {styles.container}>
<Text style={styles.description}> Search for house to buy</Text>
<Text style={styles.description}> Search by place-name, postcode or search near your location.</Text>
<View style={styles.flowRight}>
<TextInput style={styles.searchInput} placeholder='Search via name or postcode'/>
<TouchableHighlight style={styles.button} underlayColor='#99d9f4'>
<Text style={styles.buttonText}>Go</Text>
</TouchableHighlight>
</View>
<TouchableHighlight style={styles.button}
underlayColor='#99d9f4'>
<Text style={styles.buttonText}>Location</Text>
</TouchableHighlight>
</View>
);
}
}
export default SearchPage; |
8479b188323d771303bbed6cfc074665db917ebf | README.org | README.org |
* Library Information
- Name :: Hybridizer
- Version :: 0.1.0
- License :: BSD
- URL :: https://github.com/janelia-experimental-technology/hybridizer
- Author :: Peter Polidoro
- Email :: peterpolidoro@gmail.com
** Description
|
* Library Information
- Name :: Hybridizer
- Version :: 0.1.0
- License :: BSD
- URL :: https://github.com/janelia-experimental-technology/hybridizer
- Author :: Peter Polidoro
- Email :: peterpolidoro@gmail.com
** Description
* Host Computer Control Software
[[https://github.com/janelia-pypi/hybridizer_python.git]]
| Add link to Python code | Add link to Python code
| Org | bsd-3-clause | peterpolidoro/hybridizer | org | ## Code Before:
* Library Information
- Name :: Hybridizer
- Version :: 0.1.0
- License :: BSD
- URL :: https://github.com/janelia-experimental-technology/hybridizer
- Author :: Peter Polidoro
- Email :: peterpolidoro@gmail.com
** Description
## Instruction:
Add link to Python code
## Code After:
* Library Information
- Name :: Hybridizer
- Version :: 0.1.0
- License :: BSD
- URL :: https://github.com/janelia-experimental-technology/hybridizer
- Author :: Peter Polidoro
- Email :: peterpolidoro@gmail.com
** Description
* Host Computer Control Software
[[https://github.com/janelia-pypi/hybridizer_python.git]]
|
983c1cd8dcf7abe8888b0c577b223f76efe60a98 | README.md | README.md | cardboard-creeper
=================
A simple clone of Minecraft to run on Android with Google Cardboard
| cardboard-creeper
=================
A simple clone of Minecraft to run on Android with Google Cardboard
This was inspired by a nice Minecraft-like demo written in Pyglet:
https://github.com/fogleman/Minecraft
I thought it would be nice to make it run with Google Cardboard. I had to rewrite everything in Java and OpenGL ES 2.0 to make that happen.
| Update readme with link to fogleman and why Java. | Update readme with link to fogleman and why Java. | Markdown | mit | skligys/cardboard-creeper | markdown | ## Code Before:
cardboard-creeper
=================
A simple clone of Minecraft to run on Android with Google Cardboard
## Instruction:
Update readme with link to fogleman and why Java.
## Code After:
cardboard-creeper
=================
A simple clone of Minecraft to run on Android with Google Cardboard
This was inspired by a nice Minecraft-like demo written in Pyglet:
https://github.com/fogleman/Minecraft
I thought it would be nice to make it run with Google Cardboard. I had to rewrite everything in Java and OpenGL ES 2.0 to make that happen.
|
011f09f55f1e4cd50d1f3586e8cbb4d9c977f34d | julia/parametric_types.jl | julia/parametric_types.jl | @assert typeof(1:10) == UnitRange{Int64}
@assert typeof(0x5 // 0x22) == Rational{UInt8}
@assert typeof(5 // 34) == Rational{Int64}
@assert typeof(8.75im) == Complex{Float64}
@assert typeof([5,3]) == Array{Int64,1}
@assert typeof([3, "abc"]) == Array{Any, 1}
@assert typeof([]) == Array{Any, 1}
@assert typeof([1 0; 0 1]) == Array{Int64, 2}
| @assert typeof(Set(4)) == Set{Int64}
@assert typeof(Set(['3', '$'])) == Set{Char}
@assert typeof(1:10) == UnitRange{Int64}
@assert typeof(0x5 // 0x22) == Rational{UInt8}
@assert typeof(5 // 34) == Rational{Int64}
@assert typeof(8.75im) == Complex{Float64}
@assert typeof(e) == Irrational{:e}
@assert typeof([5,3]) == Array{Int64,1}
@assert typeof([3, "abc"]) == Array{Any, 1}
@assert typeof([]) == Array{Any, 1}
@assert typeof([1 0; 0 1]) == Array{Int64, 2}
| Include sets and irrationals in the parametric types examples | Include sets and irrationals in the parametric types examples
| Julia | mit | rtoal/ple,rtoal/ple,rtoal/ple,rtoal/polyglot,rtoal/polyglot,rtoal/ple,rtoal/ple,rtoal/ple,rtoal/ple,rtoal/polyglot,rtoal/polyglot,rtoal/polyglot,rtoal/ple,rtoal/polyglot,rtoal/ple,rtoal/ple,rtoal/ple,rtoal/polyglot,rtoal/polyglot,rtoal/polyglot,rtoal/polyglot,rtoal/polyglot,rtoal/polyglot,rtoal/ple,rtoal/ple,rtoal/polyglot,rtoal/polyglot,rtoal/ple,rtoal/ple,rtoal/polyglot,rtoal/ple | julia | ## Code Before:
@assert typeof(1:10) == UnitRange{Int64}
@assert typeof(0x5 // 0x22) == Rational{UInt8}
@assert typeof(5 // 34) == Rational{Int64}
@assert typeof(8.75im) == Complex{Float64}
@assert typeof([5,3]) == Array{Int64,1}
@assert typeof([3, "abc"]) == Array{Any, 1}
@assert typeof([]) == Array{Any, 1}
@assert typeof([1 0; 0 1]) == Array{Int64, 2}
## Instruction:
Include sets and irrationals in the parametric types examples
## Code After:
@assert typeof(Set(4)) == Set{Int64}
@assert typeof(Set(['3', '$'])) == Set{Char}
@assert typeof(1:10) == UnitRange{Int64}
@assert typeof(0x5 // 0x22) == Rational{UInt8}
@assert typeof(5 // 34) == Rational{Int64}
@assert typeof(8.75im) == Complex{Float64}
@assert typeof(e) == Irrational{:e}
@assert typeof([5,3]) == Array{Int64,1}
@assert typeof([3, "abc"]) == Array{Any, 1}
@assert typeof([]) == Array{Any, 1}
@assert typeof([1 0; 0 1]) == Array{Int64, 2}
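For readers coming from Python, a rough analogue of these introspection checks (Python's built-in types are not parametric the way Julia's are, so only the spirit of the comparison carries over):

```python
from fractions import Fraction

# Python's type() plays the role of Julia's typeof()
assert type(range(1, 11)) is range        # ~ UnitRange{Int64}
assert type(Fraction(5, 34)) is Fraction  # ~ Rational{Int64}
assert type(8.75j) is complex             # ~ Complex{Float64}
assert type({'3', '$'}) is set            # ~ Set{Char}
assert type([3, "abc"]) is list           # ~ Array{Any, 1}
print("all analogues hold")
```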
|
df4f97e7e54cec719c37f0c23ebfd62f3b6e8359 | db/migrate/20120516094638_create_asset_groups.rb | db/migrate/20120516094638_create_asset_groups.rb | class CreateAssetGroups < ActiveRecord::Migration
def change
create_table :asset_groups do |t|
t.string :type, :null => false, :limit => 50
t.string :name, :null => false, :limit => 200, :index => :unique
# Nested set
t.integer :asset_group_id, :null => true, :on_delete => :cascade
t.integer :lft, :null => true
t.integer :rgt, :null => true
t.integer :depth, :null => false, :limit => 3, :default => 1
t.integer :children_count, :null => false, :limit => 3, :default => 0
t.integer :creator_id, :null => false, :references => :users
t.integer :updater_id, :null => false, :references => :users
t.timestamps
end
end
end
| class CreateAssetGroups < ActiveRecord::Migration
def change
create_table :asset_groups do |t|
t.string :type, :null => false, :limit => 50
t.string :name, :null => false, :limit => 200, :index => {:unique => true, :with => :asset_group_id}
# Nested set
t.integer :asset_group_id, :null => true, :on_delete => :cascade
t.integer :lft, :null => true
t.integer :rgt, :null => true
t.integer :depth, :null => false, :limit => 3, :default => 1
t.integer :children_count, :null => false, :limit => 3, :default => 0
t.integer :creator_id, :null => false, :references => :users
t.integer :updater_id, :null => false, :references => :users
t.timestamps
end
end
end
| Change the unique constraints on the asset group name to apply only within a leaf of the hierarchy. | Change the unique constraints on the asset group name to apply only within a leaf of the hierarchy.
| Ruby | mit | spookandpuff/islay,spookandpuff/islay,spookandpuff/islay | ruby | ## Code Before:
class CreateAssetGroups < ActiveRecord::Migration
def change
create_table :asset_groups do |t|
t.string :type, :null => false, :limit => 50
t.string :name, :null => false, :limit => 200, :index => :unique
# Nested set
t.integer :asset_group_id, :null => true, :on_delete => :cascade
t.integer :lft, :null => true
t.integer :rgt, :null => true
t.integer :depth, :null => false, :limit => 3, :default => 1
t.integer :children_count, :null => false, :limit => 3, :default => 0
t.integer :creator_id, :null => false, :references => :users
t.integer :updater_id, :null => false, :references => :users
t.timestamps
end
end
end
## Instruction:
Change the unique constraints on the asset group name to apply only within a leaf of the hierarchy.
## Code After:
class CreateAssetGroups < ActiveRecord::Migration
def change
create_table :asset_groups do |t|
t.string :type, :null => false, :limit => 50
t.string :name, :null => false, :limit => 200, :index => {:unique => true, :with => :asset_group_id}
# Nested set
t.integer :asset_group_id, :null => true, :on_delete => :cascade
t.integer :lft, :null => true
t.integer :rgt, :null => true
t.integer :depth, :null => false, :limit => 3, :default => 1
t.integer :children_count, :null => false, :limit => 3, :default => 0
t.integer :creator_id, :null => false, :references => :users
t.integer :updater_id, :null => false, :references => :users
t.timestamps
end
end
end
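The migration above narrows the unique constraint from the name alone to the (name, parent group) pair. A small Python model of that rule, purely illustrative, with column names taken from the migration:

```python
def violates_scoped_uniqueness(groups, name, asset_group_id):
    """True if another group under the same parent already uses this name.
    Mirrors :index => {:unique => true, :with => :asset_group_id}."""
    return any(g["name"] == name and g["asset_group_id"] == asset_group_id
               for g in groups)


groups = [{"name": "Logos", "asset_group_id": 1}]
print(violates_scoped_uniqueness(groups, "Logos", 1))  # → True
print(violates_scoped_uniqueness(groups, "Logos", 2))  # → False (different parent)
```

Scoping uniqueness to the parent lets two leaves of the hierarchy reuse the same group name without colliding.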
|
6d6ca5ea72f62a5671c9d07e58849e5b5aab8903 | search/static/search.js | search/static/search.js | $(document).ready(function(){
$('#form').submit(function(e){
e.preventDefault();
var query = $('#query').val();
var docs = [];
$.get('/search/query/', {query: query}, function(data){
var result = JSON.parse(data);
console.log(typeof result);
for(var key in result){
var obj = {};
obj['doc'] = key;
obj['score'] = result[key]
docs.push(obj)
}
docs.sort(function(a, b){
return b.score-a.score;
});
console.log(docs);
});
});
}); | $(document).ready(function(){
$('#form').submit(function(e){
e.preventDefault();
var query = $('#query').val();
var docs = [];
$.get('/search/query/', {query: query}, function(data){
var result = JSON.parse(data);
console.log(typeof result);
for(var key in result){
var obj = {};
obj['doc'] = key;
obj['score'] = result[key]
docs.push(obj)
}
docs.sort(function(a, b){
return b.score-a.score;
});
var length = docs.length;
for(var i = 0; i < length; i++){
var doc = docs[i];
var row = $('<tr></tr>');
var rank = $('<td></td>').text(i+1);
var docid = $('<td></td>').text(doc['doc']);
var score = $('<td></td>').text(doc['score']);
row.append(rank);
row.append(docid);
row.append(score);
$('#result-body').append(row);
}
$('#result-table').removeClass('disabled');
});
});
}); | Add results to result table Display table when results are loaded | Add results to result table
Display table when results are loaded
| JavaScript | mit | nh0815/PySearch,nh0815/PySearch | javascript | ## Code Before:
$(document).ready(function(){
$('#form').submit(function(e){
e.preventDefault();
var query = $('#query').val();
var docs = [];
$.get('/search/query/', {query: query}, function(data){
var result = JSON.parse(data);
console.log(typeof result);
for(var key in result){
var obj = {};
obj['doc'] = key;
obj['score'] = result[key]
docs.push(obj)
}
docs.sort(function(a, b){
return b.score-a.score;
});
console.log(docs);
});
});
});
## Instruction:
Add results to result table
Display table when results are loaded
## Code After:
$(document).ready(function(){
$('#form').submit(function(e){
e.preventDefault();
var query = $('#query').val();
var docs = [];
$.get('/search/query/', {query: query}, function(data){
var result = JSON.parse(data);
console.log(typeof result);
for(var key in result){
var obj = {};
obj['doc'] = key;
obj['score'] = result[key]
docs.push(obj)
}
docs.sort(function(a, b){
return b.score-a.score;
});
var length = docs.length;
for(var i = 0; i < length; i++){
var doc = docs[i];
var row = $('<tr></tr>');
var rank = $('<td></td>').text(i+1);
var docid = $('<td></td>').text(doc['doc']);
var score = $('<td></td>').text(doc['score']);
row.append(rank);
row.append(docid);
row.append(score);
$('#result-body').append(row);
}
$('#result-table').removeClass('disabled');
});
});
}); |
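The sort-and-rank logic in the success callback can be sketched in Python; the `{doc_id: score}` response shape is taken from the handler above, and the helper name is illustrative:

```python
def rank_results(result):
    """Order documents by score, highest first, and attach 1-based ranks,
    mirroring docs.sort(function(a, b){ return b.score - a.score; })."""
    ordered = sorted(result.items(), key=lambda item: item[1], reverse=True)
    return [(rank, doc, score)
            for rank, (doc, score) in enumerate(ordered, start=1)]


print(rank_results({"doc1": 0.2, "doc2": 0.9, "doc3": 0.5}))
# → [(1, 'doc2', 0.9), (2, 'doc3', 0.5), (3, 'doc1', 0.2)]
```

Sorting once with `reverse=True` and enumerating from 1 replaces the manual comparator plus per-row rank bookkeeping in the JavaScript version.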
49d07a7fc20830bd05222e7ec2a24623d2b14a2a | Modules/VideoCore/Common/add_video_test.cmake | Modules/VideoCore/Common/add_video_test.cmake | include(${Video-Core-Common_SOURCE_DIR}/MIDAS.cmake)
# Macro to add a midas test with the "Video" label
macro(add_video_test)
midas_add_test(${ARGV})
set_property(TEST ${ARGV1} PROPERTY LABELS Video)
endmacro(add_video_test)
| include(${Video-Core-Common_SOURCE_DIR}/MIDAS.cmake)
# Macro to add a midas test with the "Video" label
macro(add_video_test)
midas_add_test(${ARGV})
set_property(TEST ${ARGV1} PROPERTY LABELS Video)
set_property(TEST ${ARGV1}_fetchData PROPERTY LABELS Video)
endmacro(add_video_test)
| Add Video label for _fetchData tests | COMP: Add Video label for _fetchData tests
 | CMake | apache-2.0 | jmerkow/ITK,CapeDrew/DCMTK-ITK,BlueBrain/ITK,GEHC-Surgery/ITK,jcfr/ITK,zachary-williamson/ITK,LucasGandel/ITK,atsnyder/ITK,CapeDrew/DCMTK-ITK,InsightSoftwareConsortium/ITK,BlueBrain/ITK,hinerm/ITK,hinerm/ITK,malaterre/ITK,thewtex/ITK,atsnyder/ITK,CapeDrew/DCMTK-ITK,atsnyder/ITK,CapeDrew/DCMTK-ITK,InsightSoftwareConsortium/ITK,BlueBrain/ITK,hendradarwin/ITK,hjmjohnson/ITK,fedral/ITK,eile/ITK,hendradarwin/ITK,jmerkow/ITK,stnava/ITK,hendradarwin/ITK,jcfr/ITK,zachary-williamson/ITK,LucHermitte/ITK,vfonov/ITK,typeof/ITK | cmake | ## Code Before:
include(${Video-Core-Common_SOURCE_DIR}/MIDAS.cmake)
# Macro to add a midas test with the "Video" label
macro(add_video_test)
midas_add_test(${ARGV})
set_property(TEST ${ARGV1} PROPERTY LABELS Video)
endmacro(add_video_test)
## Instruction:
COMP: Add Video label for _fetchData tests
## Code After:
include(${Video-Core-Common_SOURCE_DIR}/MIDAS.cmake)
# Macro to add a midas test with the "Video" label
macro(add_video_test)
midas_add_test(${ARGV})
set_property(TEST ${ARGV1} PROPERTY LABELS Video)
set_property(TEST ${ARGV1}_fetchData PROPERTY LABELS Video)
endmacro(add_video_test)
|
01d22529936678cf3f39878722af32b28c682bd4 | Domains/Chemistry/module.cmake | Domains/Chemistry/module.cmake | if(VTK_RENDERING_BACKEND STREQUAL "OpenGL2")
set(extra_opengl_depend vtkDomainsChemistry${VTK_RENDERING_BACKEND})
endif()
vtk_module(vtkDomainsChemistry
GROUPS
StandAlone
DEPENDS
vtkCommonDataModel
vtkRenderingCore
vtksys
PRIVATE_DEPENDS
vtkIOXML
vtkFiltersSources
TEST_DEPENDS
vtkIOLegacy
vtkTestingCore
vtkTestingRendering
vtkInteractionStyle
vtkRendering${VTK_RENDERING_BACKEND}
${extra_opengl_depend}
)
| if(VTK_RENDERING_BACKEND STREQUAL "OpenGL2")
set(extra_opengl_depend vtkDomainsChemistry${VTK_RENDERING_BACKEND})
endif()
vtk_module(vtkDomainsChemistry
GROUPS
StandAlone
DEPENDS
vtkCommonDataModel
vtkRenderingCore
PRIVATE_DEPENDS
vtkIOXML
vtkFiltersSources
vtksys
TEST_DEPENDS
vtkIOLegacy
vtkTestingCore
vtkTestingRendering
vtkInteractionStyle
vtkRendering${VTK_RENDERING_BACKEND}
${extra_opengl_depend}
)
| Make internal dependency private for Domains/Chemistry -> vtksys. | Make internal dependency private for Domains/Chemistry -> vtksys.
| CMake | bsd-3-clause | keithroe/vtkoptix,keithroe/vtkoptix,keithroe/vtkoptix,keithroe/vtkoptix,keithroe/vtkoptix,keithroe/vtkoptix,keithroe/vtkoptix,keithroe/vtkoptix | cmake | ## Code Before:
if(VTK_RENDERING_BACKEND STREQUAL "OpenGL2")
set(extra_opengl_depend vtkDomainsChemistry${VTK_RENDERING_BACKEND})
endif()
vtk_module(vtkDomainsChemistry
GROUPS
StandAlone
DEPENDS
vtkCommonDataModel
vtkRenderingCore
vtksys
PRIVATE_DEPENDS
vtkIOXML
vtkFiltersSources
TEST_DEPENDS
vtkIOLegacy
vtkTestingCore
vtkTestingRendering
vtkInteractionStyle
vtkRendering${VTK_RENDERING_BACKEND}
${extra_opengl_depend}
)
## Instruction:
Make internal dependency private for Domains/Chemistry -> vtksys.
## Code After:
if(VTK_RENDERING_BACKEND STREQUAL "OpenGL2")
set(extra_opengl_depend vtkDomainsChemistry${VTK_RENDERING_BACKEND})
endif()
vtk_module(vtkDomainsChemistry
GROUPS
StandAlone
DEPENDS
vtkCommonDataModel
vtkRenderingCore
PRIVATE_DEPENDS
vtkIOXML
vtkFiltersSources
vtksys
TEST_DEPENDS
vtkIOLegacy
vtkTestingCore
vtkTestingRendering
vtkInteractionStyle
vtkRendering${VTK_RENDERING_BACKEND}
${extra_opengl_depend}
)
|
da4387570425bf55f873ff8ffacf1b8eda603ee1 | README.md | README.md |
Cobalt2 theme for [HyperTerm](https://hyperterm.org/), based on Wes Bos' [iTerm2 theme](https://github.com/wesbos/Cobalt2-iterm)
| **Theme no longer maintained. Use Wes' own theme [hyperterm-cobalt2-theme](https://github.com/wesbos/hyperterm-cobalt2-theme)**
~~Cobalt2 theme for [HyperTerm](https://hyperterm.org/), based on Wes Bos' [iTerm2 theme](https://github.com/wesbos/Cobalt2-iterm)~~
| Add deprecation message and link to official theme | Add deprecation message and link to official theme | Markdown | isc | martinAnsty/cobalt2-hyperterm-theme | markdown | ## Code Before:
Cobalt2 theme for [HyperTerm](https://hyperterm.org/), based on Wes Bos' [iTerm2 theme](https://github.com/wesbos/Cobalt2-iterm)
## Instruction:
Add deprecation message and link to official theme
## Code After:
**Theme no longer maintained. Use Wes' own theme [hyperterm-cobalt2-theme](https://github.com/wesbos/hyperterm-cobalt2-theme)**
~~Cobalt2 theme for [HyperTerm](https://hyperterm.org/), based on Wes Bos' [iTerm2 theme](https://github.com/wesbos/Cobalt2-iterm)~~
|
530823cdee612793ab51c17e0c2195bf03d4fa26 | Sodium/Bytes.swift | Sodium/Bytes.swift | import Foundation
public typealias Bytes = Array<UInt8>
extension Array where Element == UInt8 {
init (count bytes: Int) {
self.init(repeating: 0, count: bytes)
}
}
extension ArraySlice where Element == UInt8 {
var bytes: Bytes { return Bytes(self) }
}
public extension String {
var bytes: Bytes { return Bytes(self.utf8) }
}
| import Foundation
public typealias Bytes = Array<UInt8>
extension Array where Element == UInt8 {
init (count bytes: Int) {
self.init(repeating: 0, count: bytes)
}
public var utf8String: String? {
return String(data: Data(bytes: self), encoding: .utf8)
}
}
extension ArraySlice where Element == UInt8 {
var bytes: Bytes { return Bytes(self) }
}
public extension String {
var bytes: Bytes { return Bytes(self.utf8) }
}
| Add convenience property for converting bytes to a UTF-8 string | Add convenience property for converting bytes to a UTF-8 string
| Swift | isc | blochberger/swift-sodium,blochberger/swift-sodium,jedisct1/swift-sodium,blochberger/swift-sodium,ziogaschr/swift-sodium,jedisct1/swift-sodium,jedisct1/swift-sodium,ziogaschr/swift-sodium,blochberger/swift-sodium,ziogaschr/swift-sodium,ziogaschr/swift-sodium | swift | ## Code Before:
import Foundation
public typealias Bytes = Array<UInt8>
extension Array where Element == UInt8 {
init (count bytes: Int) {
self.init(repeating: 0, count: bytes)
}
}
extension ArraySlice where Element == UInt8 {
var bytes: Bytes { return Bytes(self) }
}
public extension String {
var bytes: Bytes { return Bytes(self.utf8) }
}
## Instruction:
Add convenience property for converting bytes to a UTF-8 string
## Code After:
import Foundation
public typealias Bytes = Array<UInt8>
extension Array where Element == UInt8 {
init (count bytes: Int) {
self.init(repeating: 0, count: bytes)
}
public var utf8String: String? {
return String(data: Data(bytes: self), encoding: .utf8)
}
}
extension ArraySlice where Element == UInt8 {
var bytes: Bytes { return Bytes(self) }
}
public extension String {
var bytes: Bytes { return Bytes(self.utf8) }
}
|
c2c61ff8be66dab683799e106c1d56fc3c1263ab | articles/video-series/intro.md | articles/video-series/intro.md | ---
video:
series: main
video: intro
series:
- main
- creating-your-first-app
---
| ---
public: false
video:
series: main
video: intro
series:
- main
- creating-your-first-app
---
| Set public: false for videos | Set public: false for videos | Markdown | mit | khorne3/docs,jeffreylees/docs,Annyv2/docs,miparnisari/docs,Catografix/docs,auth0/docs,sandrinodimattia/docs,yvonnewilson/docs,Catografix/docs,hamzawi06/docs,auth0/docs,Annyv2/docs,Annyv2/docs,jeffreylees/docs,sandrinodimattia/docs,auth0/docs,khorne3/docs,khorne3/docs,hamzawi06/docs,yvonnewilson/docs,sandrinodimattia/docs,Catografix/docs,hamzawi06/docs,yvonnewilson/docs,jeffreylees/docs | markdown | ## Code Before:
---
video:
series: main
video: intro
series:
- main
- creating-your-first-app
---
## Instruction:
Set public: false for videos
## Code After:
---
public: false
video:
series: main
video: intro
series:
- main
- creating-your-first-app
---
|
36e75858d0c6b10d11d998d2d205be7d237c7bb8 | src/index.js | src/index.js |
import {Video} from '@thiagopnts/kaleidoscope';
import {ContainerPlugin, Mediator, Events} from 'clappr';
export default class Video360 extends ContainerPlugin {
constructor(container) {
super(container);
Mediator.on(`${this.options.playerId}:${Events.PLAYER_RESIZE}`, this.updateSize, this);
let {height, width, autoplay} = container.options;
container.playback.el.setAttribute('crossorigin', 'anonymous');
container.el.style.touchAction = "none";
container.el.addEventListener("touchmove", function(event) {
event.preventDefault();
}, false);
this.viewer = new Video({height: isNaN(height) ? 300 : height, width: isNaN(width) ? 400 : width, container: this.container.el, source: this.container.playback.el});
this.viewer.render();
}
get name() {
return 'Video360';
}
updateSize() {
setTimeout(() =>
this.viewer.setSize({height: this.container.$el.height(), width: this.container.$el.width()})
, 250)
}
}
|
import {Video} from '@thiagopnts/kaleidoscope';
import {ContainerPlugin, Mediator, Events} from 'clappr';
export default class Video360 extends ContainerPlugin {
constructor(container) {
super(container);
Mediator.on(`${this.options.playerId}:${Events.PLAYER_RESIZE}`, this.updateSize, this);
let {height, width, autoplay} = container.options;
container.playback.el.setAttribute('crossorigin', 'anonymous');
container.playback.el.setAttribute('preload', 'metadata');
container.playback.el.setAttribute('playsinline', 'true');
container.el.style.touchAction = "none";
container.el.addEventListener("touchmove", function(event) {
event.preventDefault();
}, false);
this.viewer = new Video({height: isNaN(height) ? 300 : height, width: isNaN(width) ? 400 : width, container: this.container.el, source: this.container.playback.el});
this.viewer.render();
}
get name() {
return 'Video360';
}
updateSize() {
setTimeout(() =>
this.viewer.setSize({height: this.container.$el.height(), width: this.container.$el.width()})
, 250)
}
}
| Fix video jumping to fullscreen on iPhones | Fix video jumping to fullscreen on iPhones
| JavaScript | mit | thiagopnts/video-360,thiagopnts/video-360 | javascript | ## Code Before:
import {Video} from '@thiagopnts/kaleidoscope';
import {ContainerPlugin, Mediator, Events} from 'clappr';
export default class Video360 extends ContainerPlugin {
constructor(container) {
super(container);
Mediator.on(`${this.options.playerId}:${Events.PLAYER_RESIZE}`, this.updateSize, this);
let {height, width, autoplay} = container.options;
container.playback.el.setAttribute('crossorigin', 'anonymous');
container.el.style.touchAction = "none";
container.el.addEventListener("touchmove", function(event) {
event.preventDefault();
}, false);
this.viewer = new Video({height: isNaN(height) ? 300 : height, width: isNaN(width) ? 400 : width, container: this.container.el, source: this.container.playback.el});
this.viewer.render();
}
get name() {
return 'Video360';
}
updateSize() {
setTimeout(() =>
this.viewer.setSize({height: this.container.$el.height(), width: this.container.$el.width()})
, 250)
}
}
## Instruction:
Fix video jumping to fullscreen on iPhones
## Code After:
import {Video} from '@thiagopnts/kaleidoscope';
import {ContainerPlugin, Mediator, Events} from 'clappr';
export default class Video360 extends ContainerPlugin {
constructor(container) {
super(container);
Mediator.on(`${this.options.playerId}:${Events.PLAYER_RESIZE}`, this.updateSize, this);
let {height, width, autoplay} = container.options;
container.playback.el.setAttribute('crossorigin', 'anonymous');
container.playback.el.setAttribute('preload', 'metadata');
container.playback.el.setAttribute('playsinline', 'true');
container.el.style.touchAction = "none";
container.el.addEventListener("touchmove", function(event) {
event.preventDefault();
}, false);
this.viewer = new Video({height: isNaN(height) ? 300 : height, width: isNaN(width) ? 400 : width, container: this.container.el, source: this.container.playback.el});
this.viewer.render();
}
get name() {
return 'Video360';
}
updateSize() {
setTimeout(() =>
this.viewer.setSize({height: this.container.$el.height(), width: this.container.$el.width()})
, 250)
}
}
|
d2f295c8835f17e8053c0fbfd2d393ece2716637 | sql/game_code_lookup.sql | sql/game_code_lookup.sql | -- Shows codes for a game
SELECT `code_lookup`.`code`, `code_lookup`.`type`, `code_lookup`.`location_id`, `locations`.`internal_note`, `locations`.`is_start`, `locations`.`is_end` FROM `code_lookup` LEFT OUTER JOIN `locations` ON `code_lookup`.`game_id` = `locations`.`game_id` AND `code_lookup`.`location_id` = `locations`.`location_id` WHERE `code_lookup`.`game_id` = 1;
| -- Shows codes for a game
SELECT `code_lookup`.`code`, `code_lookup`.`type`, `code_lookup`.`location_id`, `locations`.`internal_note`, `locations`.`is_start`, `locations`.`is_end` FROM `code_lookup` LEFT OUTER JOIN `locations` ON `code_lookup`.`game_id` = `locations`.`game_id` AND `code_lookup`.`location_id` = `locations`.`location_id` WHERE `code_lookup`.`game_id` = 1;
-- Shows codes and clusters for locations
SELECT `locations`.`location_id`, `locations`.`internal_note`, `code_lookup`.`code`, `game_location_clusters`.`cluster_id`, `game_location_clusters`.`description` FROM `locations` LEFT OUTER JOIN `code_lookup` ON `locations`.`game_id` = `code_lookup`.`game_id` AND `locations`.`location_id` = `code_lookup`.`location_id` LEFT OUTER JOIN `game_location_clusters` ON `game_location_clusters`.`game_id` = `locations`.`game_id` AND `game_location_clusters`.`cluster_id` = `locations`.`cluster_id` WHERE `locations`.`game_id` = 1 ORDER BY `game_location_clusters`.`cluster_id` ASC, `locations`.`location_id` ASC
| Add new location and code query | Add new location and code query
| SQL | mit | CodeMOOC/TreasureHuntBot,CodeMOOC/TreasureHuntBot,CodeMOOC/TreasureHuntBot,CodeMOOC/TreasureHuntBot | sql | ## Code Before:
-- Shows codes for a game
SELECT `code_lookup`.`code`, `code_lookup`.`type`, `code_lookup`.`location_id`, `locations`.`internal_note`, `locations`.`is_start`, `locations`.`is_end` FROM `code_lookup` LEFT OUTER JOIN `locations` ON `code_lookup`.`game_id` = `locations`.`game_id` AND `code_lookup`.`location_id` = `locations`.`location_id` WHERE `code_lookup`.`game_id` = 1;
## Instruction:
Add new location and code query
## Code After:
-- Shows codes for a game
SELECT `code_lookup`.`code`, `code_lookup`.`type`, `code_lookup`.`location_id`, `locations`.`internal_note`, `locations`.`is_start`, `locations`.`is_end` FROM `code_lookup` LEFT OUTER JOIN `locations` ON `code_lookup`.`game_id` = `locations`.`game_id` AND `code_lookup`.`location_id` = `locations`.`location_id` WHERE `code_lookup`.`game_id` = 1;
-- Shows codes and clusters for locations
SELECT `locations`.`location_id`, `locations`.`internal_note`, `code_lookup`.`code`, `game_location_clusters`.`cluster_id`, `game_location_clusters`.`description` FROM `locations` LEFT OUTER JOIN `code_lookup` ON `locations`.`game_id` = `code_lookup`.`game_id` AND `locations`.`location_id` = `code_lookup`.`location_id` LEFT OUTER JOIN `game_location_clusters` ON `game_location_clusters`.`game_id` = `locations`.`game_id` AND `game_location_clusters`.`cluster_id` = `locations`.`cluster_id` WHERE `locations`.`game_id` = 1 ORDER BY `game_location_clusters`.`cluster_id` ASC, `locations`.`location_id` ASC
|
2fe15225615ecbfc777ea73329308e955ef8a09e | website-guts/assets/js/pages/opticon-sessions.js | website-guts/assets/js/pages/opticon-sessions.js | //Code below manages Wistia gating. Enter email address once, then rest are ungated
wistiaEmbeds.onFind(function(video) {
var email = Wistia.localStorage("golden-ticket");
if (email) {
console.log('email being set');
video.setEmail(email);
}
});
wistiaEmbeds.bind("conversion", function(video, type, data) {
if (/^(pre|mid|post)-roll-email$/.test(type)) {
console.log('about to save on local storage');
Wistia.localStorage("golden-ticket", data);
for (var i = 0; i < wistiaEmbeds.length; i++) {
wistiaEmbeds[i].setEmail(data);
}
}
}); | //Code below manages Wistia gating. Enter email address once, then rest are ungated
wistiaEmbeds.onFind(function(video) {
var email = Wistia.localStorage("golden-ticket");
if (email) {
video.setEmail(email);
}
});
wistiaEmbeds.bind("conversion", function(video, type, data) {
if (/^(pre|mid|post)-roll-email$/.test(type)) {
Wistia.localStorage("golden-ticket", data);
for (var i = 0; i < wistiaEmbeds.length; i++) {
wistiaEmbeds[i].setEmail(data);
}
}
});
//code below allows users to resume video from whence they left off
wistiaEmbeds.onFind(function(video) {
video.addPlugin("resumable", {
src: "//fast.wistia.com/labs/resumable/plugin.js"
});
});
| Add code to pickup where viewer left off | Add code to pickup where viewer left off
| JavaScript | mit | CilantroOrg/CilantroVegOrg,CilantroOrg/CilantroVegOrg,CilantroOrg/CilantroVegOrg | javascript | ## Code Before:
//Code below manages Wistia gating. Enter email address once, then rest are ungated
wistiaEmbeds.onFind(function(video) {
var email = Wistia.localStorage("golden-ticket");
if (email) {
console.log('email being set');
video.setEmail(email);
}
});
wistiaEmbeds.bind("conversion", function(video, type, data) {
if (/^(pre|mid|post)-roll-email$/.test(type)) {
console.log('about to save on local storage');
Wistia.localStorage("golden-ticket", data);
for (var i = 0; i < wistiaEmbeds.length; i++) {
wistiaEmbeds[i].setEmail(data);
}
}
});
## Instruction:
Add code to pickup where viewer left off
## Code After:
//Code below manages Wistia gating. Enter email address once, then rest are ungated
wistiaEmbeds.onFind(function(video) {
var email = Wistia.localStorage("golden-ticket");
if (email) {
video.setEmail(email);
}
});
wistiaEmbeds.bind("conversion", function(video, type, data) {
if (/^(pre|mid|post)-roll-email$/.test(type)) {
Wistia.localStorage("golden-ticket", data);
for (var i = 0; i < wistiaEmbeds.length; i++) {
wistiaEmbeds[i].setEmail(data);
}
}
});
//code below allows users to resume video from whence they left off
wistiaEmbeds.onFind(function(video) {
video.addPlugin("resumable", {
src: "//fast.wistia.com/labs/resumable/plugin.js"
});
});
|
0c5e05973f6bab793c49c64c8640dd21a81b8b92 | lib/themes/dosomething/paraneue_dosomething/templates/search/search-block-form.tpl.php | lib/themes/dosomething/paraneue_dosomething/templates/search/search-block-form.tpl.php | <?php
/**
* @file
* Displays the search form block.
*
* Available variables:
* - $search_form: The complete search form ready for print.
* - $search: Associative array of search elements. Can be used to print each
* form element separately.
*
* Default elements within $search:
* - $search['search_block_form']: Text input area wrapped in a div.
* - $search['actions']: Rendered form buttons.
* - $search['hidden']: Hidden form elements. Used to validate forms when
* submitted.
*
* Modules can add to the search form, so it is recommended to check for their
* existence before printing. The default keys will always exist. To check for
* a module-provided field, use code like this:
* @code
* <?php if (isset($search['extra_field'])): ?>
* <div class="extra-field">
* <?php print $search['extra_field']; ?>
* </div>
* <?php endif; ?>
* @endcode
*
* @see template_preprocess_search_block_form()
*/
?>
<div class="container-inline">
<?php if (empty($variables['form']['#block']->subject)): ?>
<?php endif; ?>
<?php print $search['search_block_form']; ?>
<?php print $search['hidden']; ?>
</div>
| <?php
/**
* @file
* Displays the search form block.
*
* Available variables:
* - $search_form: The complete search form ready for print.
* - $search: Associative array of search elements. Can be used to print each
* form element separately.
*
* Default elements within $search:
* - $search['search_block_form']: Text input area wrapped in a div.
* - $search['actions']: Rendered form buttons.
* - $search['hidden']: Hidden form elements. Used to validate forms when
* submitted.
*
* Modules can add to the search form, so it is recommended to check for their
* existence before printing. The default keys will always exist. To check for
* a module-provided field, use code like this:
* @code
* <?php if (isset($search['extra_field'])): ?>
* <div class="extra-field">
* <?php print $search['extra_field']; ?>
* </div>
* <?php endif; ?>
* @endcode
*
* @see template_preprocess_search_block_form()
*/
?>
<?php if (empty($variables['form']['#block']->subject)): ?>
<?php endif; ?>
<?php print $search['search_block_form']; ?>
<?php print $search['hidden']; ?>
| Remove unnecessary wrapping div from search block. | Remove unnecessary wrapping div from search block.
| PHP | mit | mshmsh5000/dosomething-1,sergii-tkachenko/phoenix,angaither/dosomething,deadlybutter/phoenix,sbsmith86/dosomething,chloealee/dosomething,DoSomething/dosomething,DoSomething/dosomething,mshmsh5000/dosomething-1,sbsmith86/dosomething,chloealee/dosomething,chloealee/dosomething,sergii-tkachenko/phoenix,DoSomething/phoenix,jonuy/dosomething,DoSomething/phoenix,deadlybutter/phoenix,sbsmith86/dosomething,DoSomething/dosomething,angaither/dosomething,sergii-tkachenko/phoenix,DoSomething/dosomething,sbsmith86/dosomething,sergii-tkachenko/phoenix,DoSomething/dosomething,jonuy/dosomething,chloealee/dosomething,mshmsh5000/dosomething-1,jonuy/dosomething,chloealee/dosomething,jonuy/dosomething,sbsmith86/dosomething,sbsmith86/dosomething,mshmsh5000/dosomething-1,jonuy/dosomething,DoSomething/phoenix,deadlybutter/phoenix,deadlybutter/phoenix,mshmsh5000/dosomething-1,angaither/dosomething,deadlybutter/phoenix,angaither/dosomething,angaither/dosomething,chloealee/dosomething,angaither/dosomething,DoSomething/phoenix,DoSomething/dosomething,jonuy/dosomething,sergii-tkachenko/phoenix,DoSomething/phoenix | php | ## Code Before:
<?php
/**
* @file
* Displays the search form block.
*
* Available variables:
* - $search_form: The complete search form ready for print.
* - $search: Associative array of search elements. Can be used to print each
* form element separately.
*
* Default elements within $search:
* - $search['search_block_form']: Text input area wrapped in a div.
* - $search['actions']: Rendered form buttons.
* - $search['hidden']: Hidden form elements. Used to validate forms when
* submitted.
*
* Modules can add to the search form, so it is recommended to check for their
* existence before printing. The default keys will always exist. To check for
* a module-provided field, use code like this:
* @code
* <?php if (isset($search['extra_field'])): ?>
* <div class="extra-field">
* <?php print $search['extra_field']; ?>
* </div>
* <?php endif; ?>
* @endcode
*
* @see template_preprocess_search_block_form()
*/
?>
<div class="container-inline">
<?php if (empty($variables['form']['#block']->subject)): ?>
<?php endif; ?>
<?php print $search['search_block_form']; ?>
<?php print $search['hidden']; ?>
</div>
## Instruction:
Remove unnecessary wrapping div from search block.
## Code After:
<?php
/**
* @file
* Displays the search form block.
*
* Available variables:
* - $search_form: The complete search form ready for print.
* - $search: Associative array of search elements. Can be used to print each
* form element separately.
*
* Default elements within $search:
* - $search['search_block_form']: Text input area wrapped in a div.
* - $search['actions']: Rendered form buttons.
* - $search['hidden']: Hidden form elements. Used to validate forms when
* submitted.
*
* Modules can add to the search form, so it is recommended to check for their
* existence before printing. The default keys will always exist. To check for
* a module-provided field, use code like this:
* @code
* <?php if (isset($search['extra_field'])): ?>
* <div class="extra-field">
* <?php print $search['extra_field']; ?>
* </div>
* <?php endif; ?>
* @endcode
*
* @see template_preprocess_search_block_form()
*/
?>
<?php if (empty($variables['form']['#block']->subject)): ?>
<?php endif; ?>
<?php print $search['search_block_form']; ?>
<?php print $search['hidden']; ?>
|
fd32084a0d98770e0cb4f36ed6112d01a3672c68 | _layouts/page.html | _layouts/page.html | ---
layout: index
---
<div id="page-inner-content">
<div id="page-nav">
<div id="page-tree"></div>
<script>
$("#page-tree").kendoTreeView({
dataSource: {
transport: {
read: {
url: "{{site.baseurl}}/{{page.category}}.json",
dataType: "json"
}
},
schema: {
model: {
id: "path",
children: "items",
hasChildren: "items"
}
}
},
select: preventParentSelection,
template: navigationTemplate("{{site.baseurl}}/"),
dataBound: expandNavigation("{{page.url | remove_first: '/' }}")
});
</script>
</div>
<div id="page-article">
<div id="page-top">
<div id="page-breadcrumb">
{% breadcrumb %}
</div>
<div id="page-search">
{% include search.html %}
</div>
</div>
{% if page.title %}
<h1>{{ page.title }}</h1>
{% endif %}
{% if page.category == "api" %}<div id="markdown-toc"></div>{% endif %}
<article>
{{ content }}
</article>
</div>
</div>
{% include footer.html %}
| ---
layout: index
---
<div id="page-inner-content">
<div id="page-nav">
<div id="page-tree"></div>
<script>
$("#page-tree").kendoTreeView({
dataSource: {
transport: {
read: {
url: "{{site.baseurl}}/{{page.category}}.json",
dataType: "json"
}
},
schema: {
model: {
id: "path",
children: "items",
hasChildren: "items"
}
}
},
select: preventParentSelection,
template: navigationTemplate("{{site.baseurl}}/"),
dataBound: expandNavigation("{{page.url | remove_first: '/' }}")
});
</script>
</div>
<div id="page-article">
<div id="page-top">
<div id="page-breadcrumb">
{% breadcrumb %}
</div>
<div id="page-search">
{% include search.html %}
</div>
</div>
<article>
{{ content }}
</article>
</div>
</div>
{% include footer.html %}
| Remove auto h1 and api TOC insertion. | Remove auto h1 and api TOC insertion.
| HTML | apache-2.0 | ighristov/report-server-docs,telerik/report-server-docs,ighristov/report-server-docs,ighristov/report-server-docs | html | ## Code Before:
---
layout: index
---
<div id="page-inner-content">
<div id="page-nav">
<div id="page-tree"></div>
<script>
$("#page-tree").kendoTreeView({
dataSource: {
transport: {
read: {
url: "{{site.baseurl}}/{{page.category}}.json",
dataType: "json"
}
},
schema: {
model: {
id: "path",
children: "items",
hasChildren: "items"
}
}
},
select: preventParentSelection,
template: navigationTemplate("{{site.baseurl}}/"),
dataBound: expandNavigation("{{page.url | remove_first: '/' }}")
});
</script>
</div>
<div id="page-article">
<div id="page-top">
<div id="page-breadcrumb">
{% breadcrumb %}
</div>
<div id="page-search">
{% include search.html %}
</div>
</div>
{% if page.title %}
<h1>{{ page.title }}</h1>
{% endif %}
{% if page.category == "api" %}<div id="markdown-toc"></div>{% endif %}
<article>
{{ content }}
</article>
</div>
</div>
{% include footer.html %}
## Instruction:
Remove auto h1 and api TOC insertion.
## Code After:
---
layout: index
---
<div id="page-inner-content">
<div id="page-nav">
<div id="page-tree"></div>
<script>
$("#page-tree").kendoTreeView({
dataSource: {
transport: {
read: {
url: "{{site.baseurl}}/{{page.category}}.json",
dataType: "json"
}
},
schema: {
model: {
id: "path",
children: "items",
hasChildren: "items"
}
}
},
select: preventParentSelection,
template: navigationTemplate("{{site.baseurl}}/"),
dataBound: expandNavigation("{{page.url | remove_first: '/' }}")
});
</script>
</div>
<div id="page-article">
<div id="page-top">
<div id="page-breadcrumb">
{% breadcrumb %}
</div>
<div id="page-search">
{% include search.html %}
</div>
</div>
<article>
{{ content }}
</article>
</div>
</div>
{% include footer.html %}
|
bcf9402397ab309637068d79575d256ced893b33 | .travis.yml | .travis.yml | dist: xenial
addons:
chrome: stable
apt:
packages:
- chromium-chromedriver
language: python
python:
- '3.7'
script:
- bash ./transcrypt/development/continuous_integration/run.sh
| dist: xenial
sudo: yes
addons:
chrome: stable
apt:
packages:
- chromium-chromedriver
language: python
python:
- '3.7'
script:
- bash ./transcrypt/development/continuous_integration/run.sh
| Revert "Travis CI: The sudo tag is now deprecated in Travis" | Revert "Travis CI: The sudo tag is now deprecated in Travis"
| YAML | apache-2.0 | QQuick/Transcrypt,QQuick/Transcrypt,QQuick/Transcrypt,QQuick/Transcrypt,QQuick/Transcrypt | yaml | ## Code Before:
dist: xenial
addons:
chrome: stable
apt:
packages:
- chromium-chromedriver
language: python
python:
- '3.7'
script:
- bash ./transcrypt/development/continuous_integration/run.sh
## Instruction:
Revert "Travis CI: The sudo tag is now deprecated in Travis"
## Code After:
dist: xenial
sudo: yes
addons:
chrome: stable
apt:
packages:
- chromium-chromedriver
language: python
python:
- '3.7'
script:
- bash ./transcrypt/development/continuous_integration/run.sh
|
5e037ad12cba733625ffebf356ef4ddcf38d3de0 | rb/lib/selenium/webdriver/common/driver_extensions/uploads_files.rb | rb/lib/selenium/webdriver/common/driver_extensions/uploads_files.rb | module Selenium
module WebDriver
#
# @api private
#
module DriverExtensions
module UploadsFiles
#
# Set the file detector to pass local files to a remote WebDriver.
#
# The detector is an object that responds to #call, and when called
# will determine if the given string represents a file. If it does,
# the path to the file on the local file system should be returned,
# otherwise nil or false.
#
# Example:
#
# driver = Selenium::WebDriver.for :remote
# driver.file_detector = lambda { |str| str if File.exist?(str) }
#
# driver.find_element(:id => "upload").send_keys "/path/to/file"
#
# By default, no file detection is performed.
#
def file_detector=(detector)
unless detector.nil? or detector.respond_to? :call
raise ArgumentError, "detector must respond to #call"
end
bridge.file_detector = detector
end
end # UploadsFiles
end # DriverExtensions
end # WebDriver
end # Selenium
| module Selenium
module WebDriver
#
# @api private
#
module DriverExtensions
module UploadsFiles
#
# Set the file detector to pass local files to a remote WebDriver.
#
# The detector is an object that responds to #call, and when called
# will determine if the given string represents a file. If it does,
# the path to the file on the local file system should be returned,
# otherwise nil or false.
#
# Example:
#
# driver = Selenium::WebDriver.for :remote
# driver.file_detector = lambda do |args|
# # args => ["/path/to/file"]
# str if File.exist?(args.first.to_s)
# end
#
# driver.find_element(:id => "upload").send_keys "/path/to/file"
#
# By default, no file detection is performed.
#
def file_detector=(detector)
unless detector.nil? or detector.respond_to? :call
raise ArgumentError, "detector must respond to #call"
end
bridge.file_detector = detector
end
end # UploadsFiles
end # DriverExtensions
end # WebDriver
end # Selenium
| Update file detector Ruby example. | JariBakken: Update file detector Ruby example.
git-svn-id: aa1aa1384423cb28c2b1e29129bb3a91de1d9196@14082 07704840-8298-11de-bf8c-fd130f914ac9
| Ruby | apache-2.0 | yumingjuan/selenium,jmt4/Selenium2,yumingjuan/selenium,yumingjuan/selenium,yumingjuan/selenium,yumingjuan/selenium,jmt4/Selenium2,yumingjuan/selenium,jmt4/Selenium2,jmt4/Selenium2,yumingjuan/selenium,jmt4/Selenium2,yumingjuan/selenium,yumingjuan/selenium,jmt4/Selenium2,jmt4/Selenium2,jmt4/Selenium2,jmt4/Selenium2 | ruby | ## Code Before:
module Selenium
module WebDriver
#
# @api private
#
module DriverExtensions
module UploadsFiles
#
# Set the file detector to pass local files to a remote WebDriver.
#
# The detector is an object that responds to #call, and when called
# will determine if the given string represents a file. If it does,
# the path to the file on the local file system should be returned,
# otherwise nil or false.
#
# Example:
#
# driver = Selenium::WebDriver.for :remote
# driver.file_detector = lambda { |str| str if File.exist?(str) }
#
# driver.find_element(:id => "upload").send_keys "/path/to/file"
#
# By default, no file detection is performed.
#
def file_detector=(detector)
unless detector.nil? or detector.respond_to? :call
raise ArgumentError, "detector must respond to #call"
end
bridge.file_detector = detector
end
end # UploadsFiles
end # DriverExtensions
end # WebDriver
end # Selenium
## Instruction:
JariBakken: Update file detector Ruby example.
git-svn-id: aa1aa1384423cb28c2b1e29129bb3a91de1d9196@14082 07704840-8298-11de-bf8c-fd130f914ac9
## Code After:
module Selenium
module WebDriver
#
# @api private
#
module DriverExtensions
module UploadsFiles
#
# Set the file detector to pass local files to a remote WebDriver.
#
# The detector is an object that responds to #call, and when called
# will determine if the given string represents a file. If it does,
# the path to the file on the local file system should be returned,
# otherwise nil or false.
#
# Example:
#
# driver = Selenium::WebDriver.for :remote
# driver.file_detector = lambda do |args|
# # args => ["/path/to/file"]
# str if File.exist?(args.first.to_s)
# end
#
# driver.find_element(:id => "upload").send_keys "/path/to/file"
#
# By default, no file detection is performed.
#
def file_detector=(detector)
unless detector.nil? or detector.respond_to? :call
raise ArgumentError, "detector must respond to #call"
end
bridge.file_detector = detector
end
end # UploadsFiles
end # DriverExtensions
end # WebDriver
end # Selenium
|
96ef8f3f2c81822df635a373496d2f638178f85a | backend/project_name/settings/test.py | backend/project_name/settings/test.py | from .base import * # noqa
SECRET_KEY = "test"
DATABASES = {
"default": {"ENGINE": "django.db.backends.sqlite3", "NAME": base_dir_join("db.sqlite3"),}
}
STATIC_ROOT = "staticfiles"
STATIC_URL = "/static/"
MEDIA_ROOT = "mediafiles"
MEDIA_URL = "/media/"
DEFAULT_FILE_STORAGE = "django.core.files.storage.FileSystemStorage"
STATICFILES_STORAGE = "django.contrib.staticfiles.storage.StaticFilesStorage"
# Speed up password hashing
PASSWORD_HASHERS = [
"django.contrib.auth.hashers.MD5PasswordHasher",
]
# Celery
CELERY_TASK_ALWAYS_EAGER = True
CELERY_TASK_EAGER_PROPAGATES = True
| from .base import * # noqa
SECRET_KEY = "test"
DATABASES = {
"default": {"ENGINE": "django.db.backends.sqlite3", "NAME": base_dir_join("db.sqlite3"),}
}
STATIC_ROOT = base_dir_join('staticfiles')
STATIC_URL = '/static/'
MEDIA_ROOT = base_dir_join('mediafiles')
MEDIA_URL = '/media/'
DEFAULT_FILE_STORAGE = "django.core.files.storage.FileSystemStorage"
STATICFILES_STORAGE = "django.contrib.staticfiles.storage.StaticFilesStorage"
# Speed up password hashing
PASSWORD_HASHERS = [
"django.contrib.auth.hashers.MD5PasswordHasher",
]
# Celery
CELERY_TASK_ALWAYS_EAGER = True
CELERY_TASK_EAGER_PROPAGATES = True
| Revert "Fix ci build by adjust static root and media root path" | Revert "Fix ci build by adjust static root and media root path"
This reverts commit e502c906
| Python | mit | vintasoftware/django-react-boilerplate,vintasoftware/django-react-boilerplate,vintasoftware/django-react-boilerplate,vintasoftware/django-react-boilerplate | python |
4c3769065b855fefd2f7720bbc3e117d2c6b3f30 | lib/hotspots.rb | lib/hotspots.rb | require "hotspots/version"
require "hotspots/exit"
require "hotspots/configuration"
require "hotspots/store"
require "hotspots/options_parser"
require "hotspots/repository"
class Hotspots
def initialize(configuration)
@configuration = configuration
@logger = @configuration.logger
@repository = @configuration.repository
@exit_strategy = @configuration.exit_strategy
@time = @configuration.time
@message_filters = @configuration.message_filters
@file_filter = @configuration.file_filter
@cutoff = @configuration.cutoff
end
def output
validate
assign
inside_repository do
run
end
end
private
def validate
@exit_strategy.perform
prepare_for_exit_if_git_status_invalid
@exit_strategy.perform
end
def assign
@driver = Hotspots::Repository::GitDriver.new(:logger => @logger)
@parser = Hotspots::Repository::GitParser.new(@driver, :time => @time, :message_filters => @message_filters)
@store = Hotspots::Store.new(@parser.files, :cutoff => @cutoff, :file_filter => @file_filter)
end
def inside_repository
yield Dir.chdir(@repository)
end
def run
puts @store.to_s
end
def prepare_for_exit_if_git_status_invalid
`git status 2>&1`
unless $? == 0
@exit_strategy = Hotspots::Exit::Error.new(:message => "'#{@repository}' doesn't seem to be a git repository!", :code => 10)
end
end
end
| require "hotspots/version"
require "hotspots/exit"
require "hotspots/configuration"
require "hotspots/store"
require "hotspots/options_parser"
require "hotspots/repository"
class Hotspots
attr_reader :configuration
def initialize(configuration)
@configuration = configuration
end
def output
validate
inside_repository do
run
end
end
private
def validate
configuration.exit_strategy.perform
ensure_git_repository
end
def inside_repository
yield Dir.chdir(repository)
end
def run
puts store.to_s
end
def ensure_git_repository
`git status 2>&1`
unless $? == 0
Hotspots::Exit::Error.new(:message => "'#{@repository}' doesn't seem to be a git repository!", :code => 10).perform
end
end
def repository
configuration.repository
end
def store
Hotspots::Store.new(parser.files, :cutoff => configuration.cutoff, :file_filter => configuration.file_filter)
end
def parser
Hotspots::Repository::GitParser.new(driver, :time => configuration.time, :message_filters => configuration.message_filters)
end
def driver
Hotspots::Repository::GitDriver.new(:logger => configuration.logger)
end
end
| Reduce the use of instance level variables | Reduce the use of instance level variables
| Ruby | mit | chiku/hotspots | ruby |
0bcf50f169878254999f7e04c09dc66854c5394d | app/services/modal-dialog.js | app/services/modal-dialog.js | import { computed } from '@ember/object';
import Service from '@ember/service';
import ENV from '../config/environment';
function computedFromConfig(prop) {
return computed(function(){
return ENV['ember-modal-dialog'] && ENV['ember-modal-dialog'][prop];
});
}
export default Service.extend({
hasEmberTether: computedFromConfig('hasEmberTether'),
hasLiquidWormhole: computedFromConfig('hasLiquidWormhole'),
hasLiquidTether: computedFromConfig('hasLiquidTether'),
destinationElementId: computed(function() {
/*
everywhere except test, this property will be overwritten
by the initializer that appends the modal container div
to the DOM. because initializers don't run in unit/integration
tests, this is a nice fallback.
*/
if (ENV.environment === 'test') {
return 'ember-testing';
}
})
});
| import Ember from 'ember';
import ENV from '../config/environment';
const { computed, Service } = Ember;
function computedFromConfig(prop) {
return computed(function(){
return ENV['ember-modal-dialog'] && ENV['ember-modal-dialog'][prop];
});
}
export default Service.extend({
hasEmberTether: computedFromConfig('hasEmberTether'),
hasLiquidWormhole: computedFromConfig('hasLiquidWormhole'),
hasLiquidTether: computedFromConfig('hasLiquidTether'),
destinationElementId: computed(function() {
/*
everywhere except test, this property will be overwritten
by the initializer that appends the modal container div
to the DOM. because initializers don't run in unit/integration
tests, this is a nice fallback.
*/
if (ENV.environment === 'test') {
return 'ember-testing';
}
})
});
| Revert modules update to file in app directory | Revert modules update to file in app directory | JavaScript | mit | oscarni/ember-modal-dialog,yapplabs/ember-modal-dialog,oscarni/ember-modal-dialog,yapplabs/ember-modal-dialog | javascript |
e5e91544c90dafa3ec1a4050b9bf3787d2568c51 | recipes/ocp/bld.bat | recipes/ocp/bld.bat | set CONDA_PREFIX=%PREFIX%
if errorlevel 1 exit 1
cmake -B build -S "%SRC_DIR%" ^
-G Ninja ^
-DCMAKE_BUILD_TYPE=Release ^
-DPython3_FIND_STRATEGY=LOCATION ^
-DPython3_ROOT_DIR=%CONDA_PREFIX% ^
-DCMAKE_LINKER=lld-link.exe ^
-DCMAKE_MODULE_LINKER_FLAGS="/FORCE:MULTIPLE"
if errorlevel 1 exit 1
cmake --build build -j %CPU_COUNT% -- -v -k 0
if errorlevel 1 exit 1
if not exist "%SP_DIR%" mkdir "%SP_DIR%"
if errorlevel 1 exit 1
copy build/OCP.cp*-*.* "%SP_DIR%"
if errorlevel 1 exit 1 | set CONDA_PREFIX=%PREFIX%
if errorlevel 1 exit 1
cmake -B build -S "%SRC_DIR%" ^
-G Ninja ^
-DCMAKE_BUILD_TYPE=Release ^
-DPython3_FIND_STRATEGY=LOCATION ^
-DPython3_ROOT_DIR=%CONDA_PREFIX% ^
-DCMAKE_LINKER=lld-link.exe ^
-DCMAKE_MODULE_LINKER_FLAGS="/machine:x64 /FORCE:MULTIPLE"
if errorlevel 1 exit 1
cmake --build build -j %CPU_COUNT% -- -v -k 0
if errorlevel 1 exit 1
if not exist "%SP_DIR%" mkdir "%SP_DIR%"
if errorlevel 1 exit 1
copy build/OCP.cp*-*.* "%SP_DIR%"
if errorlevel 1 exit 1 | Add /machine:x64, flag which was lost when setting /FORCE:MULTIPLE | Add /machine:x64, flag which was lost when setting /FORCE:MULTIPLE
| Batchfile | bsd-3-clause | conda-forge/staged-recipes,conda-forge/staged-recipes,goanpeca/staged-recipes,jakirkham/staged-recipes,ocefpaf/staged-recipes,mariusvniekerk/staged-recipes,mariusvniekerk/staged-recipes,kwilcox/staged-recipes,johanneskoester/staged-recipes,jakirkham/staged-recipes,ocefpaf/staged-recipes,goanpeca/staged-recipes,johanneskoester/staged-recipes,kwilcox/staged-recipes | batchfile |
976f11d258df9419a3832687c355506d810ec2c5 | README.md | README.md | Oblog - Markdown to HTML
========================
This is a simple Markdown + Twig => HTML + sitemap.xml script. It will read source markdown files from a directory, natural sort the files, pass them through Twig templates and write output into given directory.
The script will also generate `sitemap.xml` for easier crawling/submission to search engines.
Usage
-----
See the example site in `/example` directory. Sample has two public posts and one draft. To generate posts run the following
php gen.php
**Warning! All files with `.html` extension in output directory will be deleted**
First line of the source markdown file has several **magical** properties:
* it will be used as part of the html filename eg. "Laihduta regexillä" would be named "laihduta-regexilla.html".
* if it contains `+DRAFT+`, the post won't be added to link list nor `sitemap.xml` and is given a slightly obfuscated filename which is only shown during HTML generation.
via Composer
------------
{
"require": {
"ospii/oblog": "dev-master"
}
}
| Oblog - Markdown to HTML
========================
This is a simple Markdown + Twig => HTML + sitemap.xml script. It will read source markdown files from a directory, natural sort the files, pass them through Twig templates and write output into given directory.
The script will also generate a sitemap to `sitemap.xml` and atom feed to `atom.xml`.
Usage
-----
See the example site in `/example` directory. Sample has two public posts and one draft. To generate posts run the following
php gen.php
**Warning! All files with `.html` extension in output directory will be deleted**
First line of the source markdown file has several **magical** properties:
* it will be used as part of the html filename eg. "Laihduta regexillä" would be named "laihduta-regexilla.html".
* if it contains `+DRAFT+`, the post won't be added to link list nor `sitemap.xml` and is given a slightly obfuscated filename which is only shown during HTML generation.
via Composer
------------
{
"require": {
"ospii/oblog": "dev-master"
}
}
| Add note about atom feed to readme | Add note about atom feed to readme
| Markdown | mit | aforsell/oblog,aforsell/oblog,ospii/oblog,ospii/oblog | markdown |
3600cf9ca20f1d4d08765d5db7930bb63804c1f3 | test/integration/terminal_silhouette_flows_test.rb | test/integration/terminal_silhouette_flows_test.rb | require 'test_helper'
class TerminalSilhouetteFlowsTest < ActionDispatch::IntegrationTest
# projects/terminal-silhouettes
# REDIRECTS
# various terminal silhouette redirects
end
| require 'test_helper'
class TerminalSilhouetteFlowsTest < ActionDispatch::IntegrationTest
test "should get terminal silhouettes" do
get(terminal_silhouettes_path)
assert_response(:success)
end
# REDIRECTS
test "should redirect alternate terminal silhouette paths" do
alternate_paths = %w(terminal-silhouettes terminal_silhouettes terminalsilhouettes terminals)
alternate_paths.each do |alt_path|
get("/#{alt_path}")
assert_response(301)
assert_redirected_to(terminal_silhouettes_path)
end
end
end
| Write integration tests for terminal silhouettes | Write integration tests for terminal silhouettes
| Ruby | mit | bogardpd/portfolio,bogardpd/portfolio,bogardpd/portfolio | ruby |
62cec1353e9575a4b9887e882a134264885b4aec | mbp-logging-reports/mbp-logging-reports.php | mbp-logging-reports/mbp-logging-reports.php | <?php
/**
* mbp-logging-reports.php
*
* A producer to create reports based on logged data in mb-logging-api.
*/
use DoSomething\MBP_LoggingReports\MBP_LoggingReports_Users;
date_default_timezone_set('America/New_York');
define('CONFIG_PATH', __DIR__ . '/messagebroker-config');
// Load up the Composer autoload magic
require_once __DIR__ . '/vendor/autoload.php';
require_once __DIR__ . '/mbp-logging-reports.config.inc';
if (isset($_GET['source'])) {
$sources[0] = $_GET['source'];
}
elseif (isset($argv[1])) {
$sources[0] = $argv[1];
}
if ($sources[0] == 'all') {
$sources = [
'niche',
'afterschool'
];
}
echo '------- mbp-logging-reports START: ' . date('D M j G:i:s T Y') . ' -------', PHP_EOL;
try {
// Kick off
$mbpLoggingReport = new MBP_LoggingReports_Users();
// Gather digest message mailing list
$mbpLoggingReport->report('runningMonth', $sources);
}
catch(Exception $e) {
echo $e->getMessage(), PHP_EOL;
}
echo '------- mbp-logging-reports END: ' . date('D M j G:i:s T Y') . ' -------', PHP_EOL;
| <?php
/**
* mbp-logging-reports.php
*
* A producer to create reports based on logged data in mb-logging-api.
*/
use DoSomething\MBP_LoggingReports\MBP_LoggingReports_Users;
date_default_timezone_set('America/New_York');
define('CONFIG_PATH', __DIR__ . '/messagebroker-config');
// Load up the Composer autoload magic
require_once __DIR__ . '/vendor/autoload.php';
require_once __DIR__ . '/mbp-logging-reports.config.inc';
if (isset($_GET['source'])) {
$sources[0] = $_GET['source'];
}
elseif (isset($argv[1])) {
$sources[0] = $argv[1];
}
if ($sources[0] == 'all') {
$sources = [
'niche',
'afterschool'
];
}
if (isset($_GET['startDate'])) {
$startDate = $_GET['startDate'];
}
elseif (isset($argv[2])) {
$startDate = $argv[2];
}
echo '------- mbp-logging-reports START: ' . date('D M j G:i:s T Y') . ' -------', PHP_EOL;
try {
// Kick off
$mbpLoggingReport = new MBP_LoggingReports_Users();
// Gather digest message mailing list
$mbpLoggingReport->report('runningMonth', $sources, $startDate);
}
catch(Exception $e) {
echo $e->getMessage(), PHP_EOL;
}
echo '------- mbp-logging-reports END: ' . date('D M j G:i:s T Y') . ' -------', PHP_EOL;
| Gather startDate parameter and pass into controller | Gather startDate parameter and pass into controller
| PHP | mit | DoSomething/messagebroker-ds-PHP,DoSomething/messagebroker-ds-PHP,DoSomething/MessageBroker-PHP,DoSomething/MessageBroker-PHP | php |
2feef49584ed29c871d4905dced9a6cf24a1feec | index.js | index.js | var React = require('react')
const { Component, PropTypes } = React
class ActionCableProvider extends Component {
getChildContext () {
return {
cable: this.props.cable
}
}
render () {
return this.props.children
}
}
ActionCableProvider.propTypes = {
cable: PropTypes.object.isRequired,
children: PropTypes.any
}
ActionCableProvider.childContextTypes = {
cable: PropTypes.object.isRequired
}
module.exports = ActionCableProvider
| var React = require('react')
var ActionCable = require('actioncable')
const { Component, PropTypes } = React
class ActionCableProvider extends Component {
getChildContext () {
return {
cable: this.cable
}
}
componentWillMount () {
if (this.props.cable) {
this.cable = this.props.cable
} else {
this.cable = ActionCable.createConsumer(this.props.url)
}
}
componentWillUnmount () {
if (!this.props.cable && this.cable) {
this.cable.disconnect()
}
}
componentWillReceiveProps (nextProps) {
// Props not changed
if (this.props.cable === nextProps.cable &&
this.props.url === nextProps.url) {
return
}
// cable is created by self, disconnect it
this.componentWillUnmount()
// create or assign cable
this.componentWillMount()
}
render () {
return this.props.children
}
}
ActionCableProvider.propTypes = {
cable: PropTypes.object,
url: PropTypes.string,
children: PropTypes.any
}
ActionCableProvider.childContextTypes = {
cable: PropTypes.object.isRequired
}
module.exports = ActionCableProvider
| Add url props to simplify usage, let cable and url optional | Add url props to simplify usage, let cable and url optional
| JavaScript | mit | cpunion/react-actioncable-provider | javascript |
09b5774750aad5f2ed76382292949f572a7cebf2 | chrome/browser/resources/webstore_app/manifest.json | chrome/browser/resources/webstore_app/manifest.json | {
"key": "MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQCtl3tO0osjuzRsf6xtD2SKxPlTfuoy7AWoObysitBPvH5fE1NaAA1/2JkPWkVDhdLBWLaIBPYeXbzlHp3y4Vv/4XG+aN5qFE3z+1RU/NqkzVYHtIpVScf3DjTYtKVL66mzVGijSoAIwbFCC3LpGdaoe6Q1rSRDp76wR6jjFzsYwQIDAQAB",
"name": "Chrome Web Store",
"version": "0.1",
"description": "Web Store",
"icons": {
},
"app": {
"launch": {
"web_url": "https://chrome.google.com/extensions"
},
"urls": [
"https://chrome.google.com/extensions",
"https://clients2.google.com/service/update2",
"https://clients2.googleusercontent.com/crx"
]
},
"permissions": [
"webstorePrivate",
"experimental"
]
}
| {
"key": "MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQCtl3tO0osjuzRsf6xtD2SKxPlTfuoy7AWoObysitBPvH5fE1NaAA1/2JkPWkVDhdLBWLaIBPYeXbzlHp3y4Vv/4XG+aN5qFE3z+1RU/NqkzVYHtIpVScf3DjTYtKVL66mzVGijSoAIwbFCC3LpGdaoe6Q1rSRDp76wR6jjFzsYwQIDAQAB",
"name": "Chrome Web Store",
"version": "0.1",
"description": "Web Store",
"icons": {
},
"app": {
"launch": {
"web_url": "https://chrome.google.com/extensions"
},
"urls": [
"*://chrome.google.com/extensions",
"*://clients2.google.com/service/update2",
"*://clients2.googleusercontent.com/crx"
]
},
"permissions": [
"webstorePrivate",
"experimental"
]
}
| Include all schemes for webstore component app urls | Include all schemes for webstore component app urls
BUG=55002
TEST=type chrome.google.com/extensions into omnibox, open js console, verify that chrome.webstorePrivate object is defined
Review URL: http://codereview.chromium.org/3366018
git-svn-id: http://src.chromium.org/svn/trunk/src@58970 4ff67af0-8c30-449e-8e8b-ad334ec8d88c
Former-commit-id: b6ff67228cf2e315d6ec24f76ce1f8052ce5a00a | JSON | bsd-3-clause | meego-tablet-ux/meego-app-browser,meego-tablet-ux/meego-app-browser,meego-tablet-ux/meego-app-browser,meego-tablet-ux/meego-app-browser,meego-tablet-ux/meego-app-browser,meego-tablet-ux/meego-app-browser,meego-tablet-ux/meego-app-browser,meego-tablet-ux/meego-app-browser,meego-tablet-ux/meego-app-browser,meego-tablet-ux/meego-app-browser | json | ## Code Before:
{
"key": "MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQCtl3tO0osjuzRsf6xtD2SKxPlTfuoy7AWoObysitBPvH5fE1NaAA1/2JkPWkVDhdLBWLaIBPYeXbzlHp3y4Vv/4XG+aN5qFE3z+1RU/NqkzVYHtIpVScf3DjTYtKVL66mzVGijSoAIwbFCC3LpGdaoe6Q1rSRDp76wR6jjFzsYwQIDAQAB",
"name": "Chrome Web Store",
"version": "0.1",
"description": "Web Store",
"icons": {
},
"app": {
"launch": {
"web_url": "https://chrome.google.com/extensions"
},
"urls": [
"https://chrome.google.com/extensions",
"https://clients2.google.com/service/update2",
"https://clients2.googleusercontent.com/crx"
]
},
"permissions": [
"webstorePrivate",
"experimental"
]
}
## Instruction:
Include all schemes for webstore component app urls
BUG=55002
TEST=type chrome.google.com/extensions into omnibox, open js console, verify that chrome.webstorePrivate object is defined
Review URL: http://codereview.chromium.org/3366018
git-svn-id: http://src.chromium.org/svn/trunk/src@58970 4ff67af0-8c30-449e-8e8b-ad334ec8d88c
Former-commit-id: b6ff67228cf2e315d6ec24f76ce1f8052ce5a00a
## Code After:
{
"key": "MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQCtl3tO0osjuzRsf6xtD2SKxPlTfuoy7AWoObysitBPvH5fE1NaAA1/2JkPWkVDhdLBWLaIBPYeXbzlHp3y4Vv/4XG+aN5qFE3z+1RU/NqkzVYHtIpVScf3DjTYtKVL66mzVGijSoAIwbFCC3LpGdaoe6Q1rSRDp76wR6jjFzsYwQIDAQAB",
"name": "Chrome Web Store",
"version": "0.1",
"description": "Web Store",
"icons": {
},
"app": {
"launch": {
"web_url": "https://chrome.google.com/extensions"
},
"urls": [
"*://chrome.google.com/extensions",
"*://clients2.google.com/service/update2",
"*://clients2.googleusercontent.com/crx"
]
},
"permissions": [
"webstorePrivate",
"experimental"
]
}
|
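The change in this record swaps the explicit `https://` scheme in each `app.urls` entry for the `*://` wildcard, so both the `http` and `https` versions of the store pages match. A rough, hypothetical sketch of that kind of extent matching in Python (not Chromium's actual matcher, which also handles ports and subdomain wildcards):

```python
from urllib.parse import urlparse

def matches_extent(pattern, url):
    """Check a URL against a '<scheme>://<host>/<path-prefix>' extent pattern.

    A scheme of '*' matches both http and https, mirroring the manifest
    change from 'https://...' to '*://...'.
    """
    scheme_pat, rest = pattern.split("://", 1)
    host_pat, _, path_pat = rest.partition("/")
    parsed = urlparse(url)
    if scheme_pat == "*":
        if parsed.scheme not in ("http", "https"):
            return False
    elif parsed.scheme != scheme_pat:
        return False
    return parsed.netloc == host_pat and parsed.path.startswith("/" + path_pat)

# With the old 'https://' pattern only the secure URL matches;
# with '*://' both schemes do.
print(matches_extent("https://chrome.google.com/extensions",
                     "http://chrome.google.com/extensions"))   # False
print(matches_extent("*://chrome.google.com/extensions",
                     "http://chrome.google.com/extensions"))   # True
```

This only illustrates the scheme-wildcard idea behind the diff; the real extent-matching semantics live in Chromium.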
3d2af653be9bcc4f1c8b217edbec25d089aeba10 | packages/un/universe-instances-trans.yaml | packages/un/universe-instances-trans.yaml | homepage: https://github.com/dmwit/universe
changelog-type: ''
hash: 9f75eb7f8cda98e8ecb5e9f6ec079890a66265bdfd6707f4fa4cb8857327dfa4
test-bench-deps: {}
maintainer: me@dmwit.com
synopsis: Universe instances for types from the transformers and mtl packages
changelog: ''
basic-deps:
base: ! '>=4 && <5'
universe-base: ! '>=1.0 && <1.1'
universe-instances-base: ! '>=1.0 && <1.1'
mtl: ! '>=1.0 && <2.3'
transformers: ! '>=0.2 && <0.5'
all-versions:
- '1.0'
- '1.0.0.1'
author: Daniel Wagner
latest: '1.0.0.1'
description-type: haddock
description: ''
license-name: BSD3
| homepage: https://github.com/dmwit/universe
changelog-type: ''
hash: c96cbeb4bf0240bbe09476ca360e9d35cb07cb0af4324bfbfa5cce55df7a9c35
test-bench-deps: {}
maintainer: me@dmwit.com
synopsis: Universe instances for types from the transformers and mtl packages
changelog: ''
basic-deps:
base: ! '>=4 && <5'
universe-base: ! '>=1.0 && <1.1'
universe-instances-base: ! '>=1.0 && <1.1'
mtl: ! '>=1.0 && <2.3'
transformers: ! '>=0.2 && <0.6'
all-versions:
- '1.0'
- '1.0.0.1'
author: Daniel Wagner
latest: '1.0.0.1'
description-type: haddock
description: ''
license-name: BSD3
| Update from Hackage at 2016-01-18T06:35:15+0000 | Update from Hackage at 2016-01-18T06:35:15+0000
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: https://github.com/dmwit/universe
changelog-type: ''
hash: 9f75eb7f8cda98e8ecb5e9f6ec079890a66265bdfd6707f4fa4cb8857327dfa4
test-bench-deps: {}
maintainer: me@dmwit.com
synopsis: Universe instances for types from the transformers and mtl packages
changelog: ''
basic-deps:
base: ! '>=4 && <5'
universe-base: ! '>=1.0 && <1.1'
universe-instances-base: ! '>=1.0 && <1.1'
mtl: ! '>=1.0 && <2.3'
transformers: ! '>=0.2 && <0.5'
all-versions:
- '1.0'
- '1.0.0.1'
author: Daniel Wagner
latest: '1.0.0.1'
description-type: haddock
description: ''
license-name: BSD3
## Instruction:
Update from Hackage at 2016-01-18T06:35:15+0000
## Code After:
homepage: https://github.com/dmwit/universe
changelog-type: ''
hash: c96cbeb4bf0240bbe09476ca360e9d35cb07cb0af4324bfbfa5cce55df7a9c35
test-bench-deps: {}
maintainer: me@dmwit.com
synopsis: Universe instances for types from the transformers and mtl packages
changelog: ''
basic-deps:
base: ! '>=4 && <5'
universe-base: ! '>=1.0 && <1.1'
universe-instances-base: ! '>=1.0 && <1.1'
mtl: ! '>=1.0 && <2.3'
transformers: ! '>=0.2 && <0.6'
all-versions:
- '1.0'
- '1.0.0.1'
author: Daniel Wagner
latest: '1.0.0.1'
description-type: haddock
description: ''
license-name: BSD3
|
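The only functional change in this record is relaxing the `transformers` upper bound from `<0.5` to `<0.6`. A small illustrative check of such a Cabal-style range in Python (a hypothetical helper, not the Cabal solver, and ignoring tags, wildcards, and other range operators):

```python
def parse_version(s):
    """Split a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in s.split("."))

def in_range(version, lower, upper):
    """True when lower <= version < upper, the shape of '>=0.2 && <0.6'."""
    v = parse_version(version)
    return parse_version(lower) <= v < parse_version(upper)

# transformers-0.5.2.0 is rejected by the old '>=0.2 && <0.5' bound
# but accepted once the bound is widened to '<0.6'.
print(in_range("0.5.2.0", "0.2", "0.5"))  # False
print(in_range("0.5.2.0", "0.2", "0.6"))  # True
```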
6817cb577250d91c957ef951fd60e76a91645997 | tasks/casperjs.js | tasks/casperjs.js | /*
* grunt-casperjs
* https://github.com/ronaldlokers/grunt-casperjs
*
* Copyright (c) 2013 Ronald Lokers
* Licensed under the MIT license.
*/
'use strict';
module.exports = function(grunt) {
var casperjs = require('./lib/casperjs').init(grunt).casperjs;
grunt.registerMultiTask('casperjs', 'Run CasperJs tests.', function() {
// Merge task-specific and/or target-specific options with these defaults.
var done = this.async(),
filepaths = [],
options = this.options();
// Iterate over all specified file groups.
this.files.forEach(function(file) {
// Concat specified files.
file.src.filter(function(filepath) {
// Warn on and remove invalid source files (if nonull was set).
if (!grunt.file.exists(filepath)) {
grunt.log.warn('Source file "' + filepath + '" not found.');
return false;
} else {
filepaths.push(filepath);
return true;
}
});
});
grunt.util.async.forEachSeries(
filepaths, function(filepath, callback) {
casperjs(filepath, options, function(err) {
if (err) {
grunt.warn(err);
}
callback();
});
},
done);
});
};
| /*
* grunt-casperjs
* https://github.com/ronaldlokers/grunt-casperjs
*
* Copyright (c) 2013 Ronald Lokers
* Licensed under the MIT license.
*/
'use strict';
module.exports = function(grunt) {
var casperjs = require('./lib/casperjs').init(grunt).casperjs;
grunt.registerMultiTask('casperjs', 'Run CasperJs tests.', function() {
// Merge task-specific and/or target-specific options with these defaults.
var done = this.async(),
filepaths = [],
options = grunt.util._.defaults(this.options(), {async: {}}),
asyncLoop = options.async.parallel ? 'forEach' : 'forEachSeries';
// Get rid of the async options since they're unrelated to casper/phantom
delete options.async;
// Iterate over all specified file groups.
this.files.forEach(function(file) {
// Concat specified files.
file.src.filter(function(filepath) {
// Warn on and remove invalid source files (if nonull was set).
if (!grunt.file.exists(filepath)) {
grunt.log.warn('Source file "' + filepath + '" not found.');
return false;
} else {
filepaths.push(filepath);
return true;
}
});
});
grunt.util.async[asyncLoop](
filepaths, function(filepath, callback) {
casperjs(filepath, options, function(err) {
if (err) {
grunt.warn(err);
}
callback();
});
},
done);
});
};
| Add `async.parallel` option to run tests in parallel instead of in series | Add `async.parallel` option to run tests in parallel instead of in series
| JavaScript | mit | ronaldlokers/grunt-casperjs,ryanmcdermott/grunt-casperjs,codice/grunt-casperjs,davidosomething/grunt-casperjs,guardian/grunt-casperjs | javascript | ## Code Before:
/*
* grunt-casperjs
* https://github.com/ronaldlokers/grunt-casperjs
*
* Copyright (c) 2013 Ronald Lokers
* Licensed under the MIT license.
*/
'use strict';
module.exports = function(grunt) {
var casperjs = require('./lib/casperjs').init(grunt).casperjs;
grunt.registerMultiTask('casperjs', 'Run CasperJs tests.', function() {
// Merge task-specific and/or target-specific options with these defaults.
var done = this.async(),
filepaths = [],
options = this.options();
// Iterate over all specified file groups.
this.files.forEach(function(file) {
// Concat specified files.
file.src.filter(function(filepath) {
// Warn on and remove invalid source files (if nonull was set).
if (!grunt.file.exists(filepath)) {
grunt.log.warn('Source file "' + filepath + '" not found.');
return false;
} else {
filepaths.push(filepath);
return true;
}
});
});
grunt.util.async.forEachSeries(
filepaths, function(filepath, callback) {
casperjs(filepath, options, function(err) {
if (err) {
grunt.warn(err);
}
callback();
});
},
done);
});
};
## Instruction:
Add `async.parallel` option to run tests in parallel instead of in series
## Code After:
/*
* grunt-casperjs
* https://github.com/ronaldlokers/grunt-casperjs
*
* Copyright (c) 2013 Ronald Lokers
* Licensed under the MIT license.
*/
'use strict';
module.exports = function(grunt) {
var casperjs = require('./lib/casperjs').init(grunt).casperjs;
grunt.registerMultiTask('casperjs', 'Run CasperJs tests.', function() {
// Merge task-specific and/or target-specific options with these defaults.
var done = this.async(),
filepaths = [],
options = grunt.util._.defaults(this.options(), {async: {}}),
asyncLoop = options.async.parallel ? 'forEach' : 'forEachSeries';
// Get rid of the async options since they're unrelated to casper/phantom
delete options.async;
// Iterate over all specified file groups.
this.files.forEach(function(file) {
// Concat specified files.
file.src.filter(function(filepath) {
// Warn on and remove invalid source files (if nonull was set).
if (!grunt.file.exists(filepath)) {
grunt.log.warn('Source file "' + filepath + '" not found.');
return false;
} else {
filepaths.push(filepath);
return true;
}
});
});
grunt.util.async[asyncLoop](
filepaths, function(filepath, callback) {
casperjs(filepath, options, function(err) {
if (err) {
grunt.warn(err);
}
callback();
});
},
done);
});
};
|
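The behavioral change above hinges on two lines: defaulting `options.async` to `{}` (via `grunt.util._`, i.e. Lodash/Underscore `_.defaults`) and picking `forEach` (parallel) over `forEachSeries` (serial), then stripping `async` before the options reach casperjs. A standalone Python model of that selection logic, with names chosen for illustration:

```python
def choose_async_loop(raw_options):
    """Mirror grunt-casperjs's strategy selection: default `async` to {},
    pick the iteration function name, then drop `async` from the options
    handed to casperjs (it is not a casper/phantom flag)."""
    options = {"async": {}, **raw_options}  # stand-in for _.defaults
    async_loop = "forEach" if options["async"].get("parallel") else "forEachSeries"
    del options["async"]
    return async_loop, options

print(choose_async_loop({})[0])                             # forEachSeries
print(choose_async_loop({"async": {"parallel": True}})[0])  # forEach
```

In a Gruntfile this corresponds to setting `options: { async: { parallel: true } }` on the `casperjs` task to get the parallel run.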
4b418cee7bcf1f2d47674a94c5070f40771f54f5 | BayesClassification.py | BayesClassification.py |
import sys
#------------------------------------------------------------------------------#
# #
# CLASSES #
# #
#------------------------------------------------------------------------------#
#------------------------------------------------------------------------------#
# #
# UTILITIES FUNCTIONS #
# #
#------------------------------------------------------------------------------#
#------------------------------------------------------------------------------#
# #
# "MAIN" FUNCTION #
# #
#------------------------------------------------------------------------------#
# If this is the main module, run this
if __name__ == '__main__':
argsCount = len(sys.argv)
argsIndex = 1 |
import sys
#------------------------------------------------------------------------------#
# #
# CLASSES #
# #
#------------------------------------------------------------------------------#
class DataFile:
def __init__(self, fileLine, isGood):
"""
:rtype : object
"""
self.isGood = isGood
self.fileLine = fileLine
self.wordsCount = {}
self.words = fileLine.split()
for word in self.words:
try:
self.wordsCount[word] += 1
except KeyError:
self.wordsCount[word] = 1
self.sumWords = sum(self.wordsCount.values())
def __repr__(self):
print("input : "+self.fileLine)
for key, val in self.wordsCount.items():
print(str(key)+" "+str(val))
print(str(self.sumWords))
return ""
#------------------------------------------------------------------------------#
# #
# UTILITIES FUNCTIONS #
# #
#------------------------------------------------------------------------------#
#------------------------------------------------------------------------------#
# #
# "MAIN" FUNCTION #
# #
#------------------------------------------------------------------------------#
# If this is the main module, run this
if __name__ == '__main__':
argsCount = len(sys.argv)
argsIndex = 1
toto = DataFile("coucou je suis une grosse bite et je vous emmerde Monsieur le PD n'ha n'ha n'aire", True)
print(toto) | Add DataFile class to split words of a line and count it | Add DataFile class to split words of a line and count it
| Python | apache-2.0 | Chavjoh/BayesClassificationPython | python | ## Code Before:
import sys
#------------------------------------------------------------------------------#
# #
# CLASSES #
# #
#------------------------------------------------------------------------------#
#------------------------------------------------------------------------------#
# #
# UTILITIES FUNCTIONS #
# #
#------------------------------------------------------------------------------#
#------------------------------------------------------------------------------#
# #
# "MAIN" FUNCTION #
# #
#------------------------------------------------------------------------------#
# If this is the main module, run this
if __name__ == '__main__':
argsCount = len(sys.argv)
argsIndex = 1
## Instruction:
Add DataFile class to split words of a line and count it
## Code After:
import sys
#------------------------------------------------------------------------------#
# #
# CLASSES #
# #
#------------------------------------------------------------------------------#
class DataFile:
def __init__(self, fileLine, isGood):
"""
:rtype : object
"""
self.isGood = isGood
self.fileLine = fileLine
self.wordsCount = {}
self.words = fileLine.split()
for word in self.words:
try:
self.wordsCount[word] += 1
except KeyError:
self.wordsCount[word] = 1
self.sumWords = sum(self.wordsCount.values())
def __repr__(self):
print("input : "+self.fileLine)
for key, val in self.wordsCount.items():
print(str(key)+" "+str(val))
print(str(self.sumWords))
return ""
#------------------------------------------------------------------------------#
# #
# UTILITIES FUNCTIONS #
# #
#------------------------------------------------------------------------------#
#------------------------------------------------------------------------------#
# #
# "MAIN" FUNCTION #
# #
#------------------------------------------------------------------------------#
# If this is the main module, run this
if __name__ == '__main__':
argsCount = len(sys.argv)
argsIndex = 1
toto = DataFile("coucou je suis une grosse bite et je vous emmerde Monsieur le PD n'ha n'ha n'aire", True)
print(toto) |
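The try/except loop in `DataFile.__init__` is a hand-rolled word histogram; Python's `collections.Counter` produces the same totals. A quick standalone check of the expected counts (no class needed, sample sentence chosen for illustration):

```python
from collections import Counter

line = "to be or not to be"
words = line.split()
counts = Counter(words)           # same totals as DataFile.wordsCount
sum_words = sum(counts.values())  # same as DataFile.sumWords

print(counts["to"], counts["be"], counts["or"])  # 2 2 1
print(sum_words)                                 # 6
```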
5cf469c6d4e25d4215de91d43a7efc8cbae88704 | assets/terraform/s3_workingstorage.tf | assets/terraform/s3_workingstorage.tf | resource "aws_s3_bucket" "working_storage" {
bucket = "wellcomecollection-assets-workingstorage"
acl = "private"
lifecycle {
prevent_destroy = true
}
lifecycle_rule {
id = "transition_all_to_standard_ia"
transition {
days = 30
storage_class = "STANDARD_IA"
}
enabled = true
}
# To protect against accidental deletions, we enable versioning on this
# bucket and expire non-current versions after 30 days. This gives us an
# additional safety net against mistakes.
versioning {
enabled = true
}
lifecycle_rule {
enabled = true
noncurrent_version_expiration {
days = 30
}
}
}
| resource "aws_s3_bucket" "working_storage" {
bucket = "wellcomecollection-assets-workingstorage"
acl = "private"
lifecycle {
prevent_destroy = true
}
lifecycle_rule {
id = "transition_all_to_standard_ia"
transition {
days = 30
storage_class = "STANDARD_IA"
}
enabled = true
}
# To protect against accidental deletions, we enable versioning on this
# bucket and expire non-current versions after 30 days. This gives us an
# additional safety net against mistakes.
versioning {
enabled = true
}
lifecycle_rule {
id = "expire_noncurrent_versions"
noncurrent_version_expiration {
days = 30
}
enabled = true
}
}
| Add an ID to the "expire noncurrent versions" rule on the assets bucket | Add an ID to the "expire noncurrent versions" rule on the assets bucket
[skip ci]
 | HCL | mit | wellcometrust/platform-api | hcl | ## Code Before:
resource "aws_s3_bucket" "working_storage" {
bucket = "wellcomecollection-assets-workingstorage"
acl = "private"
lifecycle {
prevent_destroy = true
}
lifecycle_rule {
id = "transition_all_to_standard_ia"
transition {
days = 30
storage_class = "STANDARD_IA"
}
enabled = true
}
# To protect against accidental deletions, we enable versioning on this
# bucket and expire non-current versions after 30 days. This gives us an
# additional safety net against mistakes.
versioning {
enabled = true
}
lifecycle_rule {
enabled = true
noncurrent_version_expiration {
days = 30
}
}
}
## Instruction:
Add an ID to the "expire noncurrent versions" rule on the assets bucket
[skip ci]
## Code After:
resource "aws_s3_bucket" "working_storage" {
bucket = "wellcomecollection-assets-workingstorage"
acl = "private"
lifecycle {
prevent_destroy = true
}
lifecycle_rule {
id = "transition_all_to_standard_ia"
transition {
days = 30
storage_class = "STANDARD_IA"
}
enabled = true
}
# To protect against accidental deletions, we enable versioning on this
# bucket and expire non-current versions after 30 days. This gives us an
# additional safety net against mistakes.
versioning {
enabled = true
}
lifecycle_rule {
id = "expire_noncurrent_versions"
noncurrent_version_expiration {
days = 30
}
enabled = true
}
}
|
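The commit's point is that the second `lifecycle_rule` block previously had no `id`, while the first one did. A tiny illustrative validation in Python, assuming rules parsed into plain dicts (this is a sketch, not Terraform's own checking), that every rule carries an explicit, unique id:

```python
def check_rule_ids(rules):
    """Require an explicit, unique id on every lifecycle rule."""
    seen = set()
    for rule in rules:
        rule_id = rule.get("id")
        if not rule_id or rule_id in seen:
            return False
        seen.add(rule_id)
    return True

before = [{"id": "transition_all_to_standard_ia"}, {}]  # second rule unnamed
after = [{"id": "transition_all_to_standard_ia"},
         {"id": "expire_noncurrent_versions"}]

print(check_rule_ids(before))  # False
print(check_rule_ids(after))   # True
```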
db2c4c9d907f4ad5b75c33e48e4b6075442eebb7 | style/basic/js/boot_fonts.js | style/basic/js/boot_fonts.js | (function(window,document) {
_editor.loadFont = function(options) {
var dummy = document.createElement('div');
dummy.setAttribute('style','font: 400px fantasy;position:absolute;top:-9999px;left:-9999px');
dummy.innerHTML = 'Am-i#o';
document.body.appendChild(dummy);
var width = dummy.clientWidth;
var e = document.createElement('link');
e.rel = 'stylesheet';
e.type = 'text/css';
e.href = options.href;
this.inject(e);
if (width==0) { // TODO IE7 (we cannot test) so we play it safe
document.body.className += ' ' + options.cls;
return;
}
dummy.style.fontFamily = "'" + options.family + "',fantasy";
var tester;
var timeout = 0.01;
tester = function() {
timeout *= 1.5;
if (width != dummy.clientWidth) {
document.body.className += ' ' + options.cls;
dummy.parentNode.removeChild(dummy);
} else {
window.setTimeout(tester,timeout);
}
}
tester();
}
})(window,document); | (function(window,document,tool) {
tool.loadFont = function(options) {
var weights = options.weights || ['normal'];
weights = ['300','400','700'];
var count = weights.length;
var operation = function(weight) {
var dummy = tool._build('div',{style:'font:400px fantasy;position:absolute;top:-9999px;left:-9999px'})
dummy.innerHTML = 'Am-i#w^o';
document.body.appendChild(dummy);
var width = dummy.clientWidth;
//console.log('Checking: '+weight);
dummy.style.fontFamily = "'" + options.family + "',fantasy";
dummy.style.fontWeight = weight;
var tester;
var timeout = 0.01;
tester = function() {
timeout *= 1.5;
if (width==0 || width != dummy.clientWidth) {
count--;
//console.log('found: '+weight+','+width+'/'+dummy.clientWidth);
if (count==0) {
document.body.className += ' ' + options.cls;
}
dummy.parentNode.removeChild(dummy);
} else {
window.setTimeout(tester,timeout);
}
}
tester();
}
for (var i = 0; i < weights.length; i++) {
operation(weights[i]);
}
tool.inject(tool._build('link',{
rel : 'stylesheet',
type : 'text/css',
href : options.href
}));
}
})(window,document,_editor); | Support multiple weights in font loading | Support multiple weights in font loading
 | JavaScript | unlicense | Humanise/hui | javascript | ## Code Before:
(function(window,document) {
_editor.loadFont = function(options) {
var dummy = document.createElement('div');
dummy.setAttribute('style','font: 400px fantasy;position:absolute;top:-9999px;left:-9999px');
dummy.innerHTML = 'Am-i#o';
document.body.appendChild(dummy);
var width = dummy.clientWidth;
var e = document.createElement('link');
e.rel = 'stylesheet';
e.type = 'text/css';
e.href = options.href;
this.inject(e);
if (width==0) { // TODO IE7 (we cannot test) so we play it safe
document.body.className += ' ' + options.cls;
return;
}
dummy.style.fontFamily = "'" + options.family + "',fantasy";
var tester;
var timeout = 0.01;
tester = function() {
timeout *= 1.5;
if (width != dummy.clientWidth) {
document.body.className += ' ' + options.cls;
dummy.parentNode.removeChild(dummy);
} else {
window.setTimeout(tester,timeout);
}
}
tester();
}
})(window,document);
## Instruction:
Support multiple weights in font loading
## Code After:
(function(window,document,tool) {
tool.loadFont = function(options) {
var weights = options.weights || ['normal'];
weights = ['300','400','700'];
var count = weights.length;
var operation = function(weight) {
var dummy = tool._build('div',{style:'font:400px fantasy;position:absolute;top:-9999px;left:-9999px'})
dummy.innerHTML = 'Am-i#w^o';
document.body.appendChild(dummy);
var width = dummy.clientWidth;
//console.log('Checking: '+weight);
dummy.style.fontFamily = "'" + options.family + "',fantasy";
dummy.style.fontWeight = weight;
var tester;
var timeout = 0.01;
tester = function() {
timeout *= 1.5;
if (width==0 || width != dummy.clientWidth) {
count--;
//console.log('found: '+weight+','+width+'/'+dummy.clientWidth);
if (count==0) {
document.body.className += ' ' + options.cls;
}
dummy.parentNode.removeChild(dummy);
} else {
window.setTimeout(tester,timeout);
}
}
tester();
}
for (var i = 0; i < weights.length; i++) {
operation(weights[i]);
}
tool.inject(tool._build('link',{
rel : 'stylesheet',
type : 'text/css',
href : options.href
}));
}
})(window,document,_editor); |
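The multi-weight version above starts one width-polling probe per weight and only adds the CSS class when the last probe reports in. That countdown can be isolated and exercised without a DOM; here is a simplified Python model of the `count--` / `count == 0` logic:

```python
def make_weight_tracker(weights, on_all_loaded):
    """One callback per font weight; fires on_all_loaded when the last
    probe reports in, like the count-down in loadFont above."""
    state = {"remaining": len(weights)}
    def weight_loaded():
        state["remaining"] -= 1
        if state["remaining"] == 0:
            on_all_loaded()
    return weight_loaded

loaded = []
done = make_weight_tracker(["300", "400", "700"], lambda: loaded.append(True))
done(); done()
print(bool(loaded))  # False, one weight still pending
done()
print(bool(loaded))  # True
```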
24ac4983c682e28ed96929e07f48df9fc6c0ef71 | styles/fixes.less | styles/fixes.less | .tab-bar {
.active {
background-color: darken(@base, 5%) !important;
&:after {
border-bottom-color: transparent !important;
}
.title {
color: @text;
background-color: darken(@base, 5%) !important;
}
}
}
| .tab-bar {
.active {
background-color: darken(@base, 5%) !important;
&:after {
border-bottom-color: transparent !important;
}
.title {
color: @text;
background-color: darken(@base, 5%) !important;
}
}
}
.theme-atom-dark-ui .tab-bar .active .title {
background-color: transparent !important;
}
.theme-atom-light-ui .tab-bar .active .title {
background-color: #f0f0f0 !important;
}
| Fix active tab glitch on atom dark and light ui theme | Fix active tab glitch on atom dark and light ui theme
| Less | mit | burntime/atom-monokai | less | ## Code Before:
.tab-bar {
.active {
background-color: darken(@base, 5%) !important;
&:after {
border-bottom-color: transparent !important;
}
.title {
color: @text;
background-color: darken(@base, 5%) !important;
}
}
}
## Instruction:
Fix active tab glitch on atom dark and light ui theme
## Code After:
.tab-bar {
.active {
background-color: darken(@base, 5%) !important;
&:after {
border-bottom-color: transparent !important;
}
.title {
color: @text;
background-color: darken(@base, 5%) !important;
}
}
}
.theme-atom-dark-ui .tab-bar .active .title {
background-color: transparent !important;
}
.theme-atom-light-ui .tab-bar .active .title {
background-color: #f0f0f0 !important;
}
|
412f1cb7d180e15c640d1913d500d2489ae4bb0c | helloWindows/helloWindows.yml | helloWindows/helloWindows.yml | tosca_definitions_version: tosca_simple_yaml_1_0_0_wd03
description: Hello World for windows.
template_name: hello-windows-types
template_version: 1.0.0-SNAPSHOT
template_author: FastConnect
imports:
- "tosca-normative-types:1.0.0.wd06-SNAPSHOT"
node_types:
alien.nodes.HelloWindows:
derived_from: tosca.nodes.SoftwareComponent
properties:
world:
type: string
artifacts:
- powershell_script: scripts/start.ps1
type: tosca.artifacts.File
interfaces:
Standard:
create:
inputs:
WORLD_NAME: { get_property: [SELF, world] }
implementation: scripts/install.bat
start:
inputs:
HELLO_MESSAGE: { get_operation_output: [ SELF, Standard, create, HELLO_MESSAGE ] }
implementation: scripts/start.bat
| tosca_definitions_version: tosca_simple_yaml_1_0_0_wd03
description: Hello World for windows.
template_name: hello-windows-types
template_version: 1.0.0-SNAPSHOT
template_author: FastConnect
imports:
- "tosca-normative-types:1.0.0.wd06-SNAPSHOT"
- "alien-base-types:1.0-SNAPSHOT"
node_types:
alien.nodes.HelloWindows:
derived_from: tosca.nodes.SoftwareComponent
properties:
world:
type: string
artifacts:
- powershell_script: scripts/start.ps1
type: tosca.artifacts.File
interfaces:
Standard:
create:
inputs:
WORLD_NAME: { get_property: [SELF, world] }
implementation: scripts/install.bat
start:
inputs:
HELLO_MESSAGE: { get_operation_output: [ SELF, Standard, create, HELLO_MESSAGE ] }
implementation: scripts/start.bat
| Add windows hello world artifacts | Add windows hello world artifacts
 | YAML | apache-2.0 | alien4cloud/samples | yaml | ## Code Before:
tosca_definitions_version: tosca_simple_yaml_1_0_0_wd03
description: Hello World for windows.
template_name: hello-windows-types
template_version: 1.0.0-SNAPSHOT
template_author: FastConnect
imports:
- "tosca-normative-types:1.0.0.wd06-SNAPSHOT"
node_types:
alien.nodes.HelloWindows:
derived_from: tosca.nodes.SoftwareComponent
properties:
world:
type: string
artifacts:
- powershell_script: scripts/start.ps1
type: tosca.artifacts.File
interfaces:
Standard:
create:
inputs:
WORLD_NAME: { get_property: [SELF, world] }
implementation: scripts/install.bat
start:
inputs:
HELLO_MESSAGE: { get_operation_output: [ SELF, Standard, create, HELLO_MESSAGE ] }
implementation: scripts/start.bat
## Instruction:
Add windows hello world artifacts
## Code After:
tosca_definitions_version: tosca_simple_yaml_1_0_0_wd03
description: Hello World for windows.
template_name: hello-windows-types
template_version: 1.0.0-SNAPSHOT
template_author: FastConnect
imports:
- "tosca-normative-types:1.0.0.wd06-SNAPSHOT"
- "alien-base-types:1.0-SNAPSHOT"
node_types:
alien.nodes.HelloWindows:
derived_from: tosca.nodes.SoftwareComponent
properties:
world:
type: string
artifacts:
- powershell_script: scripts/start.ps1
type: tosca.artifacts.File
interfaces:
Standard:
create:
inputs:
WORLD_NAME: { get_property: [SELF, world] }
implementation: scripts/install.bat
start:
inputs:
HELLO_MESSAGE: { get_operation_output: [ SELF, Standard, create, HELLO_MESSAGE ] }
implementation: scripts/start.bat
|
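In this template the `create` operation's `WORLD_NAME` input is wired to the node's `world` property via the TOSCA `get_property` intrinsic. A minimal sketch of how such a reference could be resolved against a node's properties (illustrative only, not an actual TOSCA engine; only the `SELF` target is handled):

```python
def resolve_input(value, node_properties):
    """Resolve a TOSCA-style input: pass literals through, evaluate a
    {'get_property': ['SELF', <name>]} reference against the node."""
    if isinstance(value, dict) and "get_property" in value:
        target, prop = value["get_property"]
        if target == "SELF":
            return node_properties[prop]
        raise ValueError("only SELF is handled in this sketch")
    return value

props = {"world": "Earth"}  # hypothetical property value
inputs = {"WORLD_NAME": {"get_property": ["SELF", "world"]}}
resolved = {k: resolve_input(v, props) for k, v in inputs.items()}
print(resolved)  # {'WORLD_NAME': 'Earth'}
```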
6f815e5488f5af634ee752b7a349ebe3efb7a5ee | doc/user/project/integrations/custom_issue_tracker.md | doc/user/project/integrations/custom_issue_tracker.md |
To enable the Custom Issue Tracker integration in a project, navigate to the
[Integrations page](project_services.md#accessing-the-project-services), click
the **Custom Issue Tracker** service, and fill in the required details on the page as described
in the table below.
| Field | Description |
| ----- | ----------- |
| `title` | A title for the issue tracker (to differentiate between instances, for example) |
| `description` | A name for the issue tracker (to differentiate between instances, for example) |
| `project_url` | Currently unused. Will be changed in a future release. |
| `issues_url` | The URL to the issue in the issue tracker project that is linked to this GitLab project. Note that the `issues_url` requires `:id` in the URL. This ID is used by GitLab as a placeholder to replace the issue number. For example, `https://customissuetracker.com/project-name/:id`. |
| `new_issue_url` | Currently unused. Will be changed in a future release. |
Once you have configured and enabled the Custom Issue Tracker service, you'll see a link on the GitLab project pages that takes you to that custom issue tracker.
## Referencing issues
Issues are referenced with `#<ID>`, where `<ID>` is a number (example `#143`).
So with the example above, `#143` would refer to `https://customissuetracker.com/project-name/143`. |
To enable the Custom Issue Tracker integration in a project, navigate to the
[Integrations page](project_services.md#accessing-the-project-services), click
the **Custom Issue Tracker** service, and fill in the required details on the page as described
in the table below.
| Field | Description |
| ----- | ----------- |
| `title` | A title for the issue tracker (to differentiate between instances, for example) |
| `description` | A name for the issue tracker (to differentiate between instances, for example) |
| `project_url` | Currently unused. Will be changed in a future release. |
| `issues_url` | The URL to the issue in the issue tracker project that is linked to this GitLab project. Note that the `issues_url` requires `:id` in the URL. This ID is used by GitLab as a placeholder to replace the issue number. For example, `https://customissuetracker.com/project-name/:id`. |
| `new_issue_url` | Currently unused. Will be changed in a future release. |
Once you have configured and enabled the Custom Issue Tracker service, you'll see a link on the GitLab project pages that takes you to that custom issue tracker.
## Referencing issues
- Issues are referenced with `ANYTHING-<ID>`, where `ANYTHING` can be any string and `<ID>` is a number used in the target project of the custom integration (example `PROJECT-143`).
- `ANYTHING` is a placeholder to differentiate against GitLab issues, which are referenced with `#<ID>`. You can use a project name or project key to replace it for example.
- So with the example above, `PROJECT-143` would refer to `https://customissuetracker.com/project-name/143`. | Update custom issue tracker docs | Update custom issue tracker docs | Markdown | mit | iiet/iiet-git,axilleas/gitlabhq,stoplightio/gitlabhq,dreampet/gitlab,jirutka/gitlabhq,mmkassem/gitlabhq | markdown | ## Code Before:
To enable the Custom Issue Tracker integration in a project, navigate to the
[Integrations page](project_services.md#accessing-the-project-services), click
the **Custom Issue Tracker** service, and fill in the required details on the page as described
in the table below.
| Field | Description |
| ----- | ----------- |
| `title` | A title for the issue tracker (to differentiate between instances, for example) |
| `description` | A name for the issue tracker (to differentiate between instances, for example) |
| `project_url` | Currently unused. Will be changed in a future release. |
| `issues_url` | The URL to the issue in the issue tracker project that is linked to this GitLab project. Note that the `issues_url` requires `:id` in the URL. This ID is used by GitLab as a placeholder to replace the issue number. For example, `https://customissuetracker.com/project-name/:id`. |
| `new_issue_url` | Currently unused. Will be changed in a future release. |
Once you have configured and enabled the Custom Issue Tracker service, you'll see a link on the GitLab project pages that takes you to that custom issue tracker.
## Referencing issues
Issues are referenced with `#<ID>`, where `<ID>` is a number (example `#143`).
So with the example above, `#143` would refer to `https://customissuetracker.com/project-name/143`.
## Instruction:
Update custom issue tracker docs
## Code After:
To enable the Custom Issue Tracker integration in a project, navigate to the
[Integrations page](project_services.md#accessing-the-project-services), click
the **Custom Issue Tracker** service, and fill in the required details on the page as described
in the table below.
| Field | Description |
| ----- | ----------- |
| `title` | A title for the issue tracker (to differentiate between instances, for example) |
| `description` | A name for the issue tracker (to differentiate between instances, for example) |
| `project_url` | Currently unused. Will be changed in a future release. |
| `issues_url` | The URL to the issue in the issue tracker project that is linked to this GitLab project. Note that the `issues_url` requires `:id` in the URL. This ID is used by GitLab as a placeholder to replace the issue number. For example, `https://customissuetracker.com/project-name/:id`. |
| `new_issue_url` | Currently unused. Will be changed in a future release. |
Once you have configured and enabled the Custom Issue Tracker service, you'll see a link on the GitLab project pages that takes you to that custom issue tracker.
## Referencing issues
- Issues are referenced with `ANYTHING-<ID>`, where `ANYTHING` can be any string and `<ID>` is a number used in the target project of the custom integration (example `PROJECT-143`).
- `ANYTHING` is a placeholder to differentiate against GitLab issues, which are referenced with `#<ID>`. You can use a project name or project key to replace it for example.
- So with the example above, `PROJECT-143` would refer to `https://customissuetracker.com/project-name/143`. |
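Given the configuration table above, resolving a reference such as `PROJECT-143` amounts to extracting the trailing `<ID>` and substituting it for the `:id` placeholder in `issues_url`. A sketch (illustrative, not GitLab's implementation):

```python
import re

def issue_url(issues_url, reference):
    """Turn an 'ANYTHING-<ID>' reference into a concrete tracker URL by
    substituting the numeric ID for the ':id' placeholder."""
    match = re.fullmatch(r".+-(\d+)", reference)
    if not match:
        raise ValueError("expected a reference like PROJECT-143")
    return issues_url.replace(":id", match.group(1))

print(issue_url("https://customissuetracker.com/project-name/:id", "PROJECT-143"))
# https://customissuetracker.com/project-name/143
```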
431357bc5ced7c1ffd6616f681c195ce45c61462 | scriptobject.h | scriptobject.h |
class EmacsInstance;
class ScriptObject : public NPObject {
public:
static ScriptObject* create(NPP npp);
void invalidate();
bool hasMethod(NPIdentifier name);
bool invoke(NPIdentifier name,
const NPVariant *args,
uint32_t argCount,
NPVariant *result);
bool enumerate(NPIdentifier **identifiers, uint32_t *identifierCount);
static NPObject* allocateThunk(NPP npp, NPClass *aClass);
static void deallocateThunk(NPObject *npobj);
static void invalidateThunk(NPObject *npobj);
static bool hasMethodThunk(NPObject *npobj, NPIdentifier name);
static bool invokeThunk(NPObject *npobj, NPIdentifier name,
const NPVariant *args, uint32_t argCount,
NPVariant *result);
static bool enumerateThunk(NPObject *npobj, NPIdentifier **identifiers,
uint32_t *identifierCount);
private:
ScriptObject(NPP npp);
~ScriptObject();
EmacsInstance* emacsInstance();
NPP npp_;
};
#endif // INCLUDED_SCRIPT_OBJECT_H_
|
class EmacsInstance;
class ScriptObject : public NPObject {
public:
static ScriptObject* create(NPP npp);
void invalidate();
bool hasMethod(NPIdentifier name);
bool invoke(NPIdentifier name,
const NPVariant *args,
uint32_t argCount,
NPVariant *result);
bool enumerate(NPIdentifier **identifiers, uint32_t *identifierCount);
static NPObject* allocateThunk(NPP npp, NPClass *aClass);
static void deallocateThunk(NPObject *npobj);
static void invalidateThunk(NPObject *npobj);
static bool hasMethodThunk(NPObject *npobj, NPIdentifier name);
static bool invokeThunk(NPObject *npobj, NPIdentifier name,
const NPVariant *args, uint32_t argCount,
NPVariant *result);
static bool enumerateThunk(NPObject *npobj, NPIdentifier **identifiers,
uint32_t *identifierCount);
private:
ScriptObject(NPP npp);
~ScriptObject();
EmacsInstance* emacsInstance();
NPP npp_;
DISALLOW_COPY_AND_ASSIGN(ScriptObject);
};
#endif // INCLUDED_SCRIPT_OBJECT_H_
| Disable copy and assign on ScriptObject | Disable copy and assign on ScriptObject
| C | mit | davidben/embedded-emacs,davidben/embedded-emacs,davidben/embedded-emacs | c | ## Code Before:
class EmacsInstance;
class ScriptObject : public NPObject {
public:
static ScriptObject* create(NPP npp);
void invalidate();
bool hasMethod(NPIdentifier name);
bool invoke(NPIdentifier name,
const NPVariant *args,
uint32_t argCount,
NPVariant *result);
bool enumerate(NPIdentifier **identifiers, uint32_t *identifierCount);
static NPObject* allocateThunk(NPP npp, NPClass *aClass);
static void deallocateThunk(NPObject *npobj);
static void invalidateThunk(NPObject *npobj);
static bool hasMethodThunk(NPObject *npobj, NPIdentifier name);
static bool invokeThunk(NPObject *npobj, NPIdentifier name,
const NPVariant *args, uint32_t argCount,
NPVariant *result);
static bool enumerateThunk(NPObject *npobj, NPIdentifier **identifiers,
uint32_t *identifierCount);
private:
ScriptObject(NPP npp);
~ScriptObject();
EmacsInstance* emacsInstance();
NPP npp_;
};
#endif // INCLUDED_SCRIPT_OBJECT_H_
## Instruction:
Disable copy and assign on ScriptObject
## Code After:
class EmacsInstance;
class ScriptObject : public NPObject {
public:
static ScriptObject* create(NPP npp);
void invalidate();
bool hasMethod(NPIdentifier name);
bool invoke(NPIdentifier name,
const NPVariant *args,
uint32_t argCount,
NPVariant *result);
bool enumerate(NPIdentifier **identifiers, uint32_t *identifierCount);
static NPObject* allocateThunk(NPP npp, NPClass *aClass);
static void deallocateThunk(NPObject *npobj);
static void invalidateThunk(NPObject *npobj);
static bool hasMethodThunk(NPObject *npobj, NPIdentifier name);
static bool invokeThunk(NPObject *npobj, NPIdentifier name,
const NPVariant *args, uint32_t argCount,
NPVariant *result);
static bool enumerateThunk(NPObject *npobj, NPIdentifier **identifiers,
uint32_t *identifierCount);
private:
ScriptObject(NPP npp);
~ScriptObject();
EmacsInstance* emacsInstance();
NPP npp_;
DISALLOW_COPY_AND_ASSIGN(ScriptObject);
};
#endif // INCLUDED_SCRIPT_OBJECT_H_
|
92729026b1a172c62789e6bf3c8cb2ba74279b12 | project.clj | project.clj | (defproject clj-fst "0.1.0"
:description "Finite State Transducers (FST) for Clojure"
:url "https://github.com/structureddynamics/clj-fst"
:license {:name "Eclipse Public License"
:url "http://www.eclipse.org/legal/epl-v10.html"}
:dependencies [[org.clojure/clojure "1.6.0"]
[org.apache.lucene/lucene-core "4.10.2"]
[org.apache.lucene/lucene-misc "4.10.2"]
[lein-marginalia "0.8.1-SNAPSHOT"]]
:source-paths ["src/clojure"]
:java-source-paths ["src/java"]
:target-path "target/%s"
:marginalia {:exclude ["utils.clj"]})
| (defproject clj-fst "0.1.0"
:description "Finite State Transducers (FST) for Clojure"
:url "https://github.com/structureddynamics/clj-fst"
:license {:name "Eclipse Public License"
:url "http://www.eclipse.org/legal/epl-v10.html"}
:dependencies [[org.clojure/clojure "1.6.0"]
[org.apache.lucene/lucene-core "4.10.2"]
[org.apache.lucene/lucene-misc "4.10.2"]
[lein-marginalia "0.8.0"]]
:source-paths ["src/clojure"]
:java-source-paths ["src/java"]
:target-path "target/%s"
:marginalia {:exclude ["utils.clj"]})
| Remove dependency on the marginalia snapshot | Remove dependency on the marginalia snapshot
| Clojure | epl-1.0 | structureddynamics/clj-fst | clojure | ## Code Before:
(defproject clj-fst "0.1.0"
:description "Finite State Transducers (FST) for Clojure"
:url "https://github.com/structureddynamics/clj-fst"
:license {:name "Eclipse Public License"
:url "http://www.eclipse.org/legal/epl-v10.html"}
:dependencies [[org.clojure/clojure "1.6.0"]
[org.apache.lucene/lucene-core "4.10.2"]
[org.apache.lucene/lucene-misc "4.10.2"]
[lein-marginalia "0.8.1-SNAPSHOT"]]
:source-paths ["src/clojure"]
:java-source-paths ["src/java"]
:target-path "target/%s"
:marginalia {:exclude ["utils.clj"]})
## Instruction:
Remove dependency on the marginalia snapshot
## Code After:
(defproject clj-fst "0.1.0"
:description "Finite State Transducers (FST) for Clojure"
:url "https://github.com/structureddynamics/clj-fst"
:license {:name "Eclipse Public License"
:url "http://www.eclipse.org/legal/epl-v10.html"}
:dependencies [[org.clojure/clojure "1.6.0"]
[org.apache.lucene/lucene-core "4.10.2"]
[org.apache.lucene/lucene-misc "4.10.2"]
[lein-marginalia "0.8.0"]]
:source-paths ["src/clojure"]
:java-source-paths ["src/java"]
:target-path "target/%s"
:marginalia {:exclude ["utils.clj"]})
|
5b0139eb38fcc89a717b5f2550ffc2074f41cc93 | air.sass | air.sass | //
// Variables
//
$font-size-base: 16!default
$line-height-base: 1.4
$margin-base-vertical: $font-size-base/2!default
$margin-base-horizontal: $font-size-base/2!default
.mb-#{$margin-base-vertical}
margin-bottom: $margin-base-vertical+px
.mb-#{$margin-base-vertical*2}
margin-bottom: $margin-base-vertical*2+px
.mb-#{$margin-base-vertical*3}
margin-bottom: $margin-base-vertical*3+px
.mb-#{$margin-base-vertical*4}
margin-bottom: $margin-base-vertical*4+px
.mb-#{$margin-base-vertical*5}
margin-bottom: $margin-base-vertical*5+px
| //
// Variables
//
$font-size-base: 16!default
$line-height-base: 1.4
$margin-base-vertical: $font-size-base/2!default
$margin-base-horizontal: $font-size-base/2!default
@for $i from 1 through 5
.mb-#{$margin-base-vertical * $i}
margin-bottom: $margin-base-vertical*$i+px
| Use @for directive to create classes | Use @for directive to create classes
| Sass | unlicense | nathanjessen/Air | sass | ## Code Before:
//
// Variables
//
$font-size-base: 16!default
$line-height-base: 1.4
$margin-base-vertical: $font-size-base/2!default
$margin-base-horizontal: $font-size-base/2!default
.mb-#{$margin-base-vertical}
margin-bottom: $margin-base-vertical+px
.mb-#{$margin-base-vertical*2}
margin-bottom: $margin-base-vertical*2+px
.mb-#{$margin-base-vertical*3}
margin-bottom: $margin-base-vertical*3+px
.mb-#{$margin-base-vertical*4}
margin-bottom: $margin-base-vertical*4+px
.mb-#{$margin-base-vertical*5}
margin-bottom: $margin-base-vertical*5+px
## Instruction:
Use @for directive to create classes
## Code After:
//
// Variables
//
$font-size-base: 16!default
$line-height-base: 1.4
$margin-base-vertical: $font-size-base/2!default
$margin-base-horizontal: $font-size-base/2!default
@for $i from 1 through 5
.mb-#{$margin-base-vertical * $i}
margin-bottom: $margin-base-vertical*$i+px
|
08bdcfbbb82e1c6dd0fc6e41d17f6211a2adc1bf | app/components/browser/Browser.js | app/components/browser/Browser.js | import React from 'react'
import ErrorMessage from './ErrorMessage'
import Header from './Header'
import BrowserStack from './BrowserStack'
import BrowserTabs from './BrowserTabs'
import CorpusStatusWatcher from '../CorpusStatusWatcher'
class Browser extends React.Component {
render () {
return (
<CorpusStatusWatcher className="window browser-window">
<Header />
<BrowserStack />
<BrowserTabs />
<ErrorMessage />
</CorpusStatusWatcher>
)
}
}
export default Browser
| import React from 'react'
import { connect } from 'react-redux'
import ErrorMessage from './ErrorMessage'
import Header from './Header'
import BrowserStack from './BrowserStack'
import BrowserTabs from './BrowserTabs'
import CorpusStatusWatcher from '../CorpusStatusWatcher'
import Spinner from '../Spinner'
class Browser extends React.Component {
render () {
if (!this.props.corpus) {
// Corpus not yet selected
return <Spinner />
}
return (
<CorpusStatusWatcher className="window browser-window">
<Header />
<BrowserStack />
<BrowserTabs />
<ErrorMessage />
</CorpusStatusWatcher>
)
}
}
Browser.propTypes = {
corpus: React.PropTypes.object
}
const mapStateToProps = ({ corpora }) => ({
corpus: corpora.selected
})
const mapDispatchToProps = {
}
export default connect(mapStateToProps, mapDispatchToProps)(Browser)
| Fix proptypes error when rendering too soon after corpus is selected | fix(browser): Fix proptypes error when rendering too soon after corpus is selected
| JavaScript | agpl-3.0 | medialab/hyphe-browser,medialab/hyphe-browser | javascript | ## Code Before:
import React from 'react'
import ErrorMessage from './ErrorMessage'
import Header from './Header'
import BrowserStack from './BrowserStack'
import BrowserTabs from './BrowserTabs'
import CorpusStatusWatcher from '../CorpusStatusWatcher'
class Browser extends React.Component {
render () {
return (
<CorpusStatusWatcher className="window browser-window">
<Header />
<BrowserStack />
<BrowserTabs />
<ErrorMessage />
</CorpusStatusWatcher>
)
}
}
export default Browser
## Instruction:
fix(browser): Fix proptypes error when rendering too soon after corpus is selected
## Code After:
import React from 'react'
import { connect } from 'react-redux'
import ErrorMessage from './ErrorMessage'
import Header from './Header'
import BrowserStack from './BrowserStack'
import BrowserTabs from './BrowserTabs'
import CorpusStatusWatcher from '../CorpusStatusWatcher'
import Spinner from '../Spinner'
class Browser extends React.Component {
render () {
if (!this.props.corpus) {
// Corpus not yet selected
return <Spinner />
}
return (
<CorpusStatusWatcher className="window browser-window">
<Header />
<BrowserStack />
<BrowserTabs />
<ErrorMessage />
</CorpusStatusWatcher>
)
}
}
Browser.propTypes = {
corpus: React.PropTypes.object
}
const mapStateToProps = ({ corpora }) => ({
corpus: corpora.selected
})
const mapDispatchToProps = {
}
export default connect(mapStateToProps, mapDispatchToProps)(Browser)
|
d29bc9588203d434843d66dd0b3ddf9d2def4f9a | .travis.yml | .travis.yml | language: node_js
node_js:
- "10"
sudo: required
services:
- docker
cache:
directories:
- node_modules
env:
matrix:
- "MYSQL_VERSION=5.7"
matrix:
include:
- node_js: 10
env: "MYSQL_VERSION=5.7"
# - node_js: 10
# env: "MYSQL_VERSION=8"
before_install:
# Update Node.js modules
- "test ! -d node_modules || npm prune"
- "test ! -d node_modules || npm rebuild"
# Setup environment
- "export MYSQL_DATABASE=flora_mysql_testdb"
- "export MYSQL_HOST=localhost"
- "export MYSQL_PORT=$(node test/integration/tool/free-port.js)"
install:
- "docker run -d --name flora-mysql-testdb -e MYSQL_ALLOW_EMPTY_PASSWORD=yes -e MYSQL_DATABASE=$MYSQL_DATABASE -v \"$PWD/test/integration/fixtures\":/docker-entrypoint-initdb.d --tmpfs \"/var/lib/mysql\" -p $MYSQL_PORT:3306 mysql:$MYSQL_VERSION"
- "npm install"
- "test/integration/tool/wait-for-mysql.sh $MYSQL_HOST $MYSQL_PORT $MYSQL_DATABASE"
script:
- "npm run test"
| language: node_js
node_js:
- "10"
sudo: required
services:
- docker
cache:
directories:
- node_modules
env:
matrix:
- "MYSQL_VERSION=5.7"
matrix:
include:
- node_js: 10
env: "MYSQL_VERSION=5.7"
# - node_js: 10
# env: "MYSQL_VERSION=8"
before_install:
# Update Node.js modules
- "test ! -d node_modules || npm prune"
- "test ! -d node_modules || npm rebuild"
# Setup environment
- "export MYSQL_DATABASE=flora_mysql_testdb"
- "export MYSQL_HOST=localhost"
- "export MYSQL_PORT=$(node test/integration/tool/free-port.js)"
install:
- "docker run -d --name flora-mysql-testdb -e MYSQL_ALLOW_EMPTY_PASSWORD=yes -e MYSQL_DATABASE=$MYSQL_DATABASE -v \"$PWD/test/integration/fixtures\":/docker-entrypoint-initdb.d --tmpfs \"/var/lib/mysql\" -p $MYSQL_PORT:3306 mysql:$MYSQL_VERSION"
- "npm install"
- "test/integration/tool/wait-for-mysql.sh $MYSQL_HOST $MYSQL_PORT $MYSQL_DATABASE"
script:
- "npm run test-unit"
- "npm run test:ci"
| Split unit and ci tests to not start another docker container | Split unit and ci tests to not start another docker container
| YAML | mit | godmodelabs/flora-mysql,godmodelabs/flora-mysql | yaml | ## Code Before:
language: node_js
node_js:
- "10"
sudo: required
services:
- docker
cache:
directories:
- node_modules
env:
matrix:
- "MYSQL_VERSION=5.7"
matrix:
include:
- node_js: 10
env: "MYSQL_VERSION=5.7"
# - node_js: 10
# env: "MYSQL_VERSION=8"
before_install:
# Update Node.js modules
- "test ! -d node_modules || npm prune"
- "test ! -d node_modules || npm rebuild"
# Setup environment
- "export MYSQL_DATABASE=flora_mysql_testdb"
- "export MYSQL_HOST=localhost"
- "export MYSQL_PORT=$(node test/integration/tool/free-port.js)"
install:
- "docker run -d --name flora-mysql-testdb -e MYSQL_ALLOW_EMPTY_PASSWORD=yes -e MYSQL_DATABASE=$MYSQL_DATABASE -v \"$PWD/test/integration/fixtures\":/docker-entrypoint-initdb.d --tmpfs \"/var/lib/mysql\" -p $MYSQL_PORT:3306 mysql:$MYSQL_VERSION"
- "npm install"
- "test/integration/tool/wait-for-mysql.sh $MYSQL_HOST $MYSQL_PORT $MYSQL_DATABASE"
script:
- "npm run test"
## Instruction:
Split unit and ci tests to not start another docker container
## Code After:
language: node_js
node_js:
- "10"
sudo: required
services:
- docker
cache:
directories:
- node_modules
env:
matrix:
- "MYSQL_VERSION=5.7"
matrix:
include:
- node_js: 10
env: "MYSQL_VERSION=5.7"
# - node_js: 10
# env: "MYSQL_VERSION=8"
before_install:
# Update Node.js modules
- "test ! -d node_modules || npm prune"
- "test ! -d node_modules || npm rebuild"
# Setup environment
- "export MYSQL_DATABASE=flora_mysql_testdb"
- "export MYSQL_HOST=localhost"
- "export MYSQL_PORT=$(node test/integration/tool/free-port.js)"
install:
- "docker run -d --name flora-mysql-testdb -e MYSQL_ALLOW_EMPTY_PASSWORD=yes -e MYSQL_DATABASE=$MYSQL_DATABASE -v \"$PWD/test/integration/fixtures\":/docker-entrypoint-initdb.d --tmpfs \"/var/lib/mysql\" -p $MYSQL_PORT:3306 mysql:$MYSQL_VERSION"
- "npm install"
- "test/integration/tool/wait-for-mysql.sh $MYSQL_HOST $MYSQL_PORT $MYSQL_DATABASE"
script:
- "npm run test-unit"
- "npm run test:ci"
|
a20418ad43074a93bc21b97611c4967713ebc897 | plugins/providers/hyperv/scripts/list_hdds.ps1 | plugins/providers/hyperv/scripts/list_hdds.ps1 |
param(
[Parameter(Mandatory=$true)]
[string]$VmId
)
try {
$VM = Hyper-V\Get-VM -Id $VmId
Hyper-V\Get-VMHardDiskDrive -VMName $VM
} catch {
Write-ErrorMessage "Failed to retrieve all disk info from ${VM}: ${PSItem}"
exit 1
}
|
param(
[Parameter(Mandatory=$true)]
[string]$VmId
)
try {
$VM = Hyper-V\Get-VM -Id $VmId
$Disks = @(Hyper-V\Get-VMHardDiskDrive -VMName $VM)
} catch {
Write-ErrorMessage "Failed to retrieve all disk info from ${VM}: ${PSItem}"
exit 1
}
$result = ConvertTo-json $Disks
Write-OutputMessage $result
| Format output from json to return to hyper-v driver | Format output from json to return to hyper-v driver
| PowerShell | mit | mitchellh/vagrant,mitchellh/vagrant,chrisroberts/vagrant,sni/vagrant,marxarelli/vagrant,sni/vagrant,chrisroberts/vagrant,mitchellh/vagrant,marxarelli/vagrant,chrisroberts/vagrant,chrisroberts/vagrant,mitchellh/vagrant,marxarelli/vagrant,sni/vagrant,sni/vagrant,marxarelli/vagrant | powershell | ## Code Before:
param(
[Parameter(Mandatory=$true)]
[string]$VmId
)
try {
$VM = Hyper-V\Get-VM -Id $VmId
Hyper-V\Get-VMHardDiskDrive -VMName $VM
} catch {
Write-ErrorMessage "Failed to retrieve all disk info from ${VM}: ${PSItem}"
exit 1
}
## Instruction:
Format output from json to return to hyper-v driver
## Code After:
param(
[Parameter(Mandatory=$true)]
[string]$VmId
)
try {
$VM = Hyper-V\Get-VM -Id $VmId
$Disks = @(Hyper-V\Get-VMHardDiskDrive -VMName $VM)
} catch {
Write-ErrorMessage "Failed to retrieve all disk info from ${VM}: ${PSItem}"
exit 1
}
$result = ConvertTo-json $Disks
Write-OutputMessage $result
|
34c21504263b78745918742488b911ab38d75522 | group_vars/all/main.yml | group_vars/all/main.yml | composer_keep_updated: true
composer_global_packages:
- { name: hirak/prestissimo }
apt_cache_valid_time: 86400
default_timezone: Etc/UTC
www_root: /opt/proteusnet/www
ip_whitelist:
- "{{ lookup('pipe', 'curl -4 -s https://api.ipify.org') }}"
wordpress_env_defaults:
db_host: localhost
db_name: "{{ item.key | underscore }}_{{ env }}"
db_user: "{{ item.key | underscore }}"
disable_wp_cron: true
wp_env: "{{ env }}"
wp_home: "{{ item.value.ssl.enabled | default(false) | ternary('https', 'http') }}://{{ item.value.site_hosts[0] }}"
wp_siteurl: "${WP_HOME}"
site_env: "{{ wordpress_env_defaults | combine(item.value.env | default({}), vault_wordpress_sites[item.key].env) }}"
mariadb_mirror: ams2.mirrors.digitalocean.com
mariadb_version: "10.1"
| composer_keep_updated: true
composer_global_packages:
- { name: hirak/prestissimo }
apt_cache_valid_time: 86400
default_timezone: Etc/UTC
www_root: /opt/proteusnet/www
ip_whitelist:
- "{{ lookup('pipe', 'curl -4 -s https://api.ipify.org') }}"
wordpress_env_defaults:
db_host: localhost
db_name: "{{ item.key | underscore }}_{{ env }}"
db_user: "{{ item.key | underscore }}"
disable_wp_cron: true
wp_env: "{{ env }}"
wp_home: "{{ item.value.ssl.enabled | default(false) | ternary('https', 'http') }}://${HTTP_HOST}"
wp_siteurl: "${WP_HOME}"
site_env: "{{ wordpress_env_defaults | combine(item.value.env | default({}), vault_wordpress_sites[item.key].env) }}"
mariadb_mirror: ams2.mirrors.digitalocean.com
mariadb_version: "10.1"
| Revert "Changed the wp_home variable to use the first site_hosts item." | Revert "Changed the wp_home variable to use the first site_hosts item."
This reverts commit 659189566199842aa474d105153c023ef2ec7be5.
| YAML | mit | proteusthemes/pt-ops,proteusthemes/pt-ops,proteusthemes/pt-ops | yaml | ## Code Before:
composer_keep_updated: true
composer_global_packages:
- { name: hirak/prestissimo }
apt_cache_valid_time: 86400
default_timezone: Etc/UTC
www_root: /opt/proteusnet/www
ip_whitelist:
- "{{ lookup('pipe', 'curl -4 -s https://api.ipify.org') }}"
wordpress_env_defaults:
db_host: localhost
db_name: "{{ item.key | underscore }}_{{ env }}"
db_user: "{{ item.key | underscore }}"
disable_wp_cron: true
wp_env: "{{ env }}"
wp_home: "{{ item.value.ssl.enabled | default(false) | ternary('https', 'http') }}://{{ item.value.site_hosts[0] }}"
wp_siteurl: "${WP_HOME}"
site_env: "{{ wordpress_env_defaults | combine(item.value.env | default({}), vault_wordpress_sites[item.key].env) }}"
mariadb_mirror: ams2.mirrors.digitalocean.com
mariadb_version: "10.1"
## Instruction:
Revert "Changed the wp_home variable to use the first site_hosts item."
This reverts commit 659189566199842aa474d105153c023ef2ec7be5.
## Code After:
composer_keep_updated: true
composer_global_packages:
- { name: hirak/prestissimo }
apt_cache_valid_time: 86400
default_timezone: Etc/UTC
www_root: /opt/proteusnet/www
ip_whitelist:
- "{{ lookup('pipe', 'curl -4 -s https://api.ipify.org') }}"
wordpress_env_defaults:
db_host: localhost
db_name: "{{ item.key | underscore }}_{{ env }}"
db_user: "{{ item.key | underscore }}"
disable_wp_cron: true
wp_env: "{{ env }}"
wp_home: "{{ item.value.ssl.enabled | default(false) | ternary('https', 'http') }}://${HTTP_HOST}"
wp_siteurl: "${WP_HOME}"
site_env: "{{ wordpress_env_defaults | combine(item.value.env | default({}), vault_wordpress_sites[item.key].env) }}"
mariadb_mirror: ams2.mirrors.digitalocean.com
mariadb_version: "10.1"
|
2c38fea1434f8591957c2707359412151c4b6c43 | tests/test_timezones.py | tests/test_timezones.py | import unittest
import datetime
from garage.timezones import TimeZone
class TimeZoneTest(unittest.TestCase):
def test_time_zone(self):
utc = datetime.datetime(2000, 1, 2, 3, 4, 0, 0, TimeZone.UTC)
cst = utc.astimezone(TimeZone.CST)
print('xxx', utc, cst)
self.assertEqual(2000, cst.year)
self.assertEqual(1, cst.month)
self.assertEqual(2, cst.day)
self.assertEqual(11, cst.hour)
self.assertEqual(4, cst.minute)
self.assertEqual(0, cst.second)
self.assertEqual(0, cst.microsecond)
if __name__ == '__main__':
unittest.main()
| import unittest
import datetime
from garage.timezones import TimeZone
class TimeZoneTest(unittest.TestCase):
def test_time_zone(self):
utc = datetime.datetime(2000, 1, 2, 3, 4, 0, 0, TimeZone.UTC)
cst = utc.astimezone(TimeZone.CST)
self.assertEqual(2000, cst.year)
self.assertEqual(1, cst.month)
self.assertEqual(2, cst.day)
self.assertEqual(11, cst.hour)
self.assertEqual(4, cst.minute)
self.assertEqual(0, cst.second)
self.assertEqual(0, cst.microsecond)
if __name__ == '__main__':
unittest.main()
| Remove print in unit test | Remove print in unit test
| Python | mit | clchiou/garage,clchiou/garage,clchiou/garage,clchiou/garage | python | ## Code Before:
import unittest
import datetime
from garage.timezones import TimeZone
class TimeZoneTest(unittest.TestCase):
def test_time_zone(self):
utc = datetime.datetime(2000, 1, 2, 3, 4, 0, 0, TimeZone.UTC)
cst = utc.astimezone(TimeZone.CST)
print('xxx', utc, cst)
self.assertEqual(2000, cst.year)
self.assertEqual(1, cst.month)
self.assertEqual(2, cst.day)
self.assertEqual(11, cst.hour)
self.assertEqual(4, cst.minute)
self.assertEqual(0, cst.second)
self.assertEqual(0, cst.microsecond)
if __name__ == '__main__':
unittest.main()
## Instruction:
Remove print in unit test
## Code After:
import unittest
import datetime
from garage.timezones import TimeZone
class TimeZoneTest(unittest.TestCase):
def test_time_zone(self):
utc = datetime.datetime(2000, 1, 2, 3, 4, 0, 0, TimeZone.UTC)
cst = utc.astimezone(TimeZone.CST)
self.assertEqual(2000, cst.year)
self.assertEqual(1, cst.month)
self.assertEqual(2, cst.day)
self.assertEqual(11, cst.hour)
self.assertEqual(4, cst.minute)
self.assertEqual(0, cst.second)
self.assertEqual(0, cst.microsecond)
if __name__ == '__main__':
unittest.main()
|
05989b935e6da8e33340f1d61f92b0e0947bd931 | gerrit-docker/src/main/docker/gerrit/conf-and-run-gerrit.sh | gerrit-docker/src/main/docker/gerrit/conf-and-run-gerrit.sh | sed -i 's/__GITLAB_IP__/'${GITLAB_PORT_80_TCP_ADDR}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_USER__/'${GITLAB_USER}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_PASSWORD__/'${GITLAB_PASSWORD}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_PROJ_ROOT__/'${GITLAB_PROJ_ROOT}'/g' /home/gerrit/gerrit/etc/replication.config
# Configure Gerrit
sed -i 's/__AUTH_TYPE__/'${AUTH_TYPE}'/g' /home/gerrit/gerrit/etc/gerrit.config
service supervisor start | sed -i 's/__GITLAB_IP__/'${GITLAB_PORT_80_TCP_ADDR}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_USER__/'${GITLAB_USER}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_PASSWORD__/'${GITLAB_PASSWORD}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_PROJ_ROOT__/'${GITLAB_PROJ_ROOT}'/g' /home/gerrit/gerrit/etc/replication.config
# Configure Gerrit
sed -i 's/__AUTH_TYPE__/'${AUTH_TYPE}'/g' /home/gerrit/gerrit/etc/gerrit.config
# Add ssh key imported by Kubernetes to access gogs
RUN echo "Host Gogs" >> /etc/ssh/ssh_config
RUN echo "Hostname gogs-http-service.default.local" >> /etc/ssh/ssh_config
RUN echo "IdentityFile /etc/secret-volume/id-rsa" >> /etc/ssh/ssh_config
service supervisor start | Update script to include ssh_config. Maybe that should be the otherway around --> gogs | Update script to include ssh_config. Maybe that should be the otherway around --> gogs
| Shell | apache-2.0 | finiteloopme/cd-jboss-fuse,finiteloopme/cd-jboss-fuse,finiteloopme/cd-jboss-fuse | shell | ## Code Before:
sed -i 's/__GITLAB_IP__/'${GITLAB_PORT_80_TCP_ADDR}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_USER__/'${GITLAB_USER}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_PASSWORD__/'${GITLAB_PASSWORD}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_PROJ_ROOT__/'${GITLAB_PROJ_ROOT}'/g' /home/gerrit/gerrit/etc/replication.config
# Configure Gerrit
sed -i 's/__AUTH_TYPE__/'${AUTH_TYPE}'/g' /home/gerrit/gerrit/etc/gerrit.config
service supervisor start
## Instruction:
Update script to include ssh_config. Maybe that should be the otherway around --> gogs
## Code After:
sed -i 's/__GITLAB_IP__/'${GITLAB_PORT_80_TCP_ADDR}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_USER__/'${GITLAB_USER}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_PASSWORD__/'${GITLAB_PASSWORD}'/g' /home/gerrit/gerrit/etc/replication.config
sed -i 's/__GITLAB_PROJ_ROOT__/'${GITLAB_PROJ_ROOT}'/g' /home/gerrit/gerrit/etc/replication.config
# Configure Gerrit
sed -i 's/__AUTH_TYPE__/'${AUTH_TYPE}'/g' /home/gerrit/gerrit/etc/gerrit.config
# Add ssh key imported by Kubernetes to access gogs
RUN echo "Host Gogs" >> /etc/ssh/ssh_config
RUN echo "Hostname gogs-http-service.default.local" >> /etc/ssh/ssh_config
RUN echo "IdentityFile /etc/secret-volume/id-rsa" >> /etc/ssh/ssh_config
service supervisor start |
290d9eb3b238fdf18c947416d46968bfc6c25228 | karma.conf.js | karma.conf.js | // Reference: http://karma-runner.github.io/0.12/config/configuration-file.html
module.exports = function karmaConfig (config) {
config.set({
frameworks: [
// Reference: https://github.com/karma-runner/karma-mocha
// Set framework to mocha
'mocha'
],
reporters: [
// Reference: https://github.com/mlex/karma-spec-reporter
// Set reporter to print detailed results to console
'spec',
// Reference: https://github.com/karma-runner/karma-coverage
// Output code coverage files
'coverage'
],
files: [
// Reference: https://www.npmjs.com/package/phantomjs-polyfill
// Needed because React.js requires bind and phantomjs does not support it
'node_modules/phantomjs-polyfill/bind-polyfill.js',
// Grab all files in the app folder that contain .test.
'app/**/*.test.*'
],
preprocessors: {
// Reference: http://webpack.github.io/docs/testing.html
// Reference: https://github.com/webpack/karma-webpack
// Convert files with webpack and load sourcemaps
'app/**/*.test.*': ['webpack', 'sourcemap']
},
browsers: [
// Run tests using PhantomJS
'PhantomJS'
],
singleRun: true,
// Configure code coverage reporter
coverageReporter: {
dir: 'build/coverage/',
type: 'html'
},
// Test webpack config
webpack: require('./webpack.test')
})
}
| // Reference: http://karma-runner.github.io/0.12/config/configuration-file.html
module.exports = function karmaConfig (config) {
config.set({
frameworks: [
// Reference: https://github.com/karma-runner/karma-mocha
// Set framework to mocha
'mocha'
],
reporters: [
// Reference: https://github.com/mlex/karma-spec-reporter
// Set reporter to print detailed results to console
'spec',
// Reference: https://github.com/karma-runner/karma-coverage
// Output code coverage files
'coverage'
],
files: [
// Reference: https://www.npmjs.com/package/phantomjs-polyfill
// Needed because React.js requires bind and phantomjs does not support it
'node_modules/phantomjs-polyfill/bind-polyfill.js',
// Grab all files in the app folder that contain .test.
'app/**/*.test.*'
],
preprocessors: {
// Reference: http://webpack.github.io/docs/testing.html
// Reference: https://github.com/webpack/karma-webpack
// Convert files with webpack and load sourcemaps
'app/**/*.test.*': ['webpack', 'sourcemap']
},
browsers: [
// Run tests using PhantomJS
'PhantomJS'
],
singleRun: true,
// Configure code coverage reporter
coverageReporter: {
dir: 'build/coverage/',
type: 'html'
},
// Test webpack config
webpack: require('./webpack.test'),
// Hide webpack build information from output
webpackMiddleware: {
noInfo: true
}
})
}
| Hide webpack build information when running karma | Hide webpack build information when running karma
| JavaScript | mit | cesarandreu/web-app,Climb-social/climb-social-resources,Climb-social/climb-social-resources,100Shapes/react-webapp-starter,100Shapes/react-webapp-starter,awolf/react-by-example | javascript | ## Code Before:
// Reference: http://karma-runner.github.io/0.12/config/configuration-file.html
module.exports = function karmaConfig (config) {
config.set({
frameworks: [
// Reference: https://github.com/karma-runner/karma-mocha
// Set framework to mocha
'mocha'
],
reporters: [
// Reference: https://github.com/mlex/karma-spec-reporter
// Set reporter to print detailed results to console
'spec',
// Reference: https://github.com/karma-runner/karma-coverage
// Output code coverage files
'coverage'
],
files: [
// Reference: https://www.npmjs.com/package/phantomjs-polyfill
// Needed because React.js requires bind and phantomjs does not support it
'node_modules/phantomjs-polyfill/bind-polyfill.js',
// Grab all files in the app folder that contain .test.
'app/**/*.test.*'
],
preprocessors: {
// Reference: http://webpack.github.io/docs/testing.html
// Reference: https://github.com/webpack/karma-webpack
// Convert files with webpack and load sourcemaps
'app/**/*.test.*': ['webpack', 'sourcemap']
},
browsers: [
// Run tests using PhantomJS
'PhantomJS'
],
singleRun: true,
// Configure code coverage reporter
coverageReporter: {
dir: 'build/coverage/',
type: 'html'
},
// Test webpack config
webpack: require('./webpack.test')
})
}
## Instruction:
Hide webpack build information when running karma
## Code After:
// Reference: http://karma-runner.github.io/0.12/config/configuration-file.html
module.exports = function karmaConfig (config) {
config.set({
frameworks: [
// Reference: https://github.com/karma-runner/karma-mocha
// Set framework to mocha
'mocha'
],
reporters: [
// Reference: https://github.com/mlex/karma-spec-reporter
// Set reporter to print detailed results to console
'spec',
// Reference: https://github.com/karma-runner/karma-coverage
// Output code coverage files
'coverage'
],
files: [
// Reference: https://www.npmjs.com/package/phantomjs-polyfill
// Needed because React.js requires bind and phantomjs does not support it
'node_modules/phantomjs-polyfill/bind-polyfill.js',
// Grab all files in the app folder that contain .test.
'app/**/*.test.*'
],
preprocessors: {
// Reference: http://webpack.github.io/docs/testing.html
// Reference: https://github.com/webpack/karma-webpack
// Convert files with webpack and load sourcemaps
'app/**/*.test.*': ['webpack', 'sourcemap']
},
browsers: [
// Run tests using PhantomJS
'PhantomJS'
],
singleRun: true,
// Configure code coverage reporter
coverageReporter: {
dir: 'build/coverage/',
type: 'html'
},
// Test webpack config
webpack: require('./webpack.test'),
// Hide webpack build information from output
webpackMiddleware: {
noInfo: true
}
})
}
|
d9b877e605501923604abb2aad1f80a23337a3a1 | vscode/User/settings.json | vscode/User/settings.json | {
"editor.glyphMargin": true,
"editor.rulers": [
80,
100,
120,
200
],
"editor.tabSize": 2,
"editor.insertSpaces": true,
"editor.renderWhitespace": "boundary",
"editor.renderControlCharacters": true,
"editor.renderIndentGuides": true,
"files.eol": "\n",
"files.insertFinalNewline": true,
"files.trimTrailingWhitespace": true,
"telemetry.enableTelemetry": false,
"telemetry.enableCrashReporter": false
}
| {
"editor.glyphMargin": true,
"editor.rulers": [
80,
100,
120,
200
],
"editor.tabSize": 2,
"editor.insertSpaces": true,
"editor.renderWhitespace": "boundary",
"editor.renderControlCharacters": true,
"editor.renderIndentGuides": true,
"files.eol": "\n",
"files.insertFinalNewline": true,
"files.trimTrailingWhitespace": true,
"telemetry.enableTelemetry": false,
"telemetry.enableCrashReporter": false,
"workbench.welcome.enabled": false
}
| Disable VS Code welcome page | Disable VS Code welcome page
| JSON | mit | jmlntw/dotfiles,jmlntw/dotfiles | json | ## Code Before:
{
"editor.glyphMargin": true,
"editor.rulers": [
80,
100,
120,
200
],
"editor.tabSize": 2,
"editor.insertSpaces": true,
"editor.renderWhitespace": "boundary",
"editor.renderControlCharacters": true,
"editor.renderIndentGuides": true,
"files.eol": "\n",
"files.insertFinalNewline": true,
"files.trimTrailingWhitespace": true,
"telemetry.enableTelemetry": false,
"telemetry.enableCrashReporter": false
}
## Instruction:
Disable VS Code welcome page
## Code After:
{
"editor.glyphMargin": true,
"editor.rulers": [
80,
100,
120,
200
],
"editor.tabSize": 2,
"editor.insertSpaces": true,
"editor.renderWhitespace": "boundary",
"editor.renderControlCharacters": true,
"editor.renderIndentGuides": true,
"files.eol": "\n",
"files.insertFinalNewline": true,
"files.trimTrailingWhitespace": true,
"telemetry.enableTelemetry": false,
"telemetry.enableCrashReporter": false,
"workbench.welcome.enabled": false
}
|
a9158ccc9aceac55d715f76e5641eb22b459f3a4 | build.gradle.kts | build.gradle.kts |
/**
* Builds and runs the site locally.
*/
task<Exec>("runSite") {
commandLine("./_script/jekyll-serve")
}
task<Exec>("buildSite") {
commandLine("./_script/jekyll-build")
}
|
/**
* Builds and runs the site locally.
*/
task<Exec>("runSite") {
commandLine("./_script/jekyll-serve")
}
/**
* Builds the site without starting the server.
*/
task<Exec>("buildSite") {
commandLine("./_script/jekyll-build")
}
| Document the site build task | Document the site build task
| Kotlin | apache-2.0 | SpineEventEngine/SpineEventEngine.github.io,SpineEventEngine/SpineEventEngine.github.io,SpineEventEngine/SpineEventEngine.github.io,SpineEventEngine/SpineEventEngine.github.io,SpineEventEngine/SpineEventEngine.github.io,SpineEventEngine/SpineEventEngine.github.io | kotlin | ## Code Before:
/**
* Builds and runs the site locally.
*/
task<Exec>("runSite") {
commandLine("./_script/jekyll-serve")
}
task<Exec>("buildSite") {
commandLine("./_script/jekyll-build")
}
## Instruction:
Document the site build task
## Code After:
/**
* Builds and runs the site locally.
*/
task<Exec>("runSite") {
commandLine("./_script/jekyll-serve")
}
/**
* Builds the site without starting the server.
*/
task<Exec>("buildSite") {
commandLine("./_script/jekyll-build")
}
|
54f8bd5579a9cfec716a7b5ba05a9659ea8f9a4d | src/app/feedback/feedback.component.html | src/app/feedback/feedback.component.html | <app-header [title]="'Feedback'"
[description]="'We appreciate your visit to the beta-version of the Research Hub, our platform for research support.'"
[imageUrl]="'assets/cover-1.jpg'"></app-header>
<div class="component" ngClass.xs="component-xs">
<div style="max-width: 850px;">
<h2>Thank You!</h2>
<p>We are developing the Research Hub further and would be grateful for your feedback.
This will give us pointers for user-led refinement and further development of the interface content displayed
in the Research Hub.
</p>
<p>
You can also register your interest in helping our User Experience Specialist running through a usability
study.
</p>
<p>
Thank you,<br/>
The Development Team
</p>
</div>
<div fxLayout="column" fxLayout.gt-xs="row" fxLayoutGap="1em">
<app-feedback-form ngClass.gt-md="card-gt-md" [title]="'Feedback'" [imageUrl]="'https://static.planetminecraft.com/files/avatar/1939407_1.gif'"></app-feedback-form>
<app-feedback-form ngClass.gt-md="card-gt-md" [title]="'Join a User Study'" [imageUrl]="'https://static.planetminecraft.com/files/avatar/1939407_1.gif'"></app-feedback-form>
</div>
</div>
| <app-header [title]="'Feedback'"
[description]="'We appreciate your visit to the beta-version of the Research Hub, our platform for research support.'"
[imageUrl]="'assets/cover-1.jpg'"></app-header>
<div class="component" ngClass.xs="component-xs">
<div style="max-width: 850px; margin-bottom: 2em;">
<h2>Thank You!</h2>
<p>We are developing the Research Hub further and would be grateful for your feedback.
This will give us pointers for user-led refinement and further development of the interface content displayed
in the Research Hub.
</p>
<p>
You can also register your interest in helping our User Experience Specialist running through a usability
study.
</p>
<p>
Thank you,<br/>
The Development Team
</p>
</div>
<div fxLayout="column" fxLayout.gt-xs="row" fxLayoutGap="1em">
<app-feedback-form ngClass.gt-md="card-gt-md" [title]="'Feedback'" [imageUrl]="'https://static.planetminecraft.com/files/avatar/1939407_1.gif'"></app-feedback-form>
<app-feedback-form ngClass.gt-md="card-gt-md" [title]="'Join a User Study'" [imageUrl]="'https://static.planetminecraft.com/files/avatar/1939407_1.gif'"></app-feedback-form>
</div>
</div>
| Add margin between text and forms | Add margin between text and forms
| HTML | bsd-3-clause | UoA-eResearch/research-hub,UoA-eResearch/research-hub,UoA-eResearch/research-hub | html | ## Code Before:
<app-header [title]="'Feedback'"
[description]="'We appreciate your visit to the beta-version of the Research Hub, our platform for research support.'"
[imageUrl]="'assets/cover-1.jpg'"></app-header>
<div class="component" ngClass.xs="component-xs">
<div style="max-width: 850px;">
<h2>Thank You!</h2>
<p>We are developing the Research Hub further and would be grateful for your feedback.
This will give us pointers for user-led refinement and further development of the interface content displayed
in the Research Hub.
</p>
<p>
You can also register your interest in helping our User Experience Specialist running through a usability
study.
</p>
<p>
Thank you,<br/>
The Development Team
</p>
</div>
<div fxLayout="column" fxLayout.gt-xs="row" fxLayoutGap="1em">
<app-feedback-form ngClass.gt-md="card-gt-md" [title]="'Feedback'" [imageUrl]="'https://static.planetminecraft.com/files/avatar/1939407_1.gif'"></app-feedback-form>
<app-feedback-form ngClass.gt-md="card-gt-md" [title]="'Join a User Study'" [imageUrl]="'https://static.planetminecraft.com/files/avatar/1939407_1.gif'"></app-feedback-form>
</div>
</div>
## Instruction:
Add margin between text and forms
## Code After:
<app-header [title]="'Feedback'"
[description]="'We appreciate your visit to the beta-version of the Research Hub, our platform for research support.'"
[imageUrl]="'assets/cover-1.jpg'"></app-header>
<div class="component" ngClass.xs="component-xs">
<div style="max-width: 850px; margin-bottom: 2em;">
<h2>Thank You!</h2>
<p>We are developing the Research Hub further and would be grateful for your feedback.
This will give us pointers for user-led refinement and further development of the interface content displayed
in the Research Hub.
</p>
<p>
You can also register your interest in helping our User Experience Specialist running through a usability
study.
</p>
<p>
Thank you,<br/>
The Development Team
</p>
</div>
<div fxLayout="column" fxLayout.gt-xs="row" fxLayoutGap="1em">
<app-feedback-form ngClass.gt-md="card-gt-md" [title]="'Feedback'" [imageUrl]="'https://static.planetminecraft.com/files/avatar/1939407_1.gif'"></app-feedback-form>
<app-feedback-form ngClass.gt-md="card-gt-md" [title]="'Join a User Study'" [imageUrl]="'https://static.planetminecraft.com/files/avatar/1939407_1.gif'"></app-feedback-form>
</div>
</div>
|
73d22cc63a2a37bd3c99774bf098ca12c81d54ae | funnels.py | funnels.py | import pyglet
from levels import GameOver, IntroScreen, TheGame
from levels.levels import Levels
window = pyglet.window.Window()#fullscreen=True)
levels = Levels([IntroScreen(window), TheGame(window), GameOver(window)])
pyglet.clock.schedule(levels.clock)
@window.event
def on_key_press(symbol, modifiers):
levels.key(symbol, modifiers)
@window.event
def on_draw():
levels.draw()
pyglet.app.run()
| import pyglet
import argparse
from levels import GameOver, IntroScreen, TheGame
from levels.levels import Levels
def main(fullscreen):
window = pyglet.window.Window(fullscreen=fullscreen)
levels = Levels([IntroScreen(window), TheGame(window), GameOver(window)])
pyglet.clock.schedule(levels.clock)
@window.event
def on_key_press(symbol, modifiers):
levels.key(symbol, modifiers)
@window.event
def on_draw():
levels.draw()
pyglet.app.run()
if __name__ == '__main__':
parser = argparse.ArgumentParser(description="Arithemetic practice game.")
parser.add_argument('--fullscreen', action="store_true", help='Turn on fullscreen. Defaults to True')
parser.add_argument('--no-fullscreen', dest="fullscreen", action="store_false", help='Turn off fullscreen. Defaults to False')
parser.set_defaults(fullscreen=True)
results = parser.parse_args()
main(results.fullscreen)
| Add argparse to turn on/off fullscreen behavior | Add argparse to turn on/off fullscreen behavior
| Python | mit | simeonf/claire | python | ## Code Before:
import pyglet
from levels import GameOver, IntroScreen, TheGame
from levels.levels import Levels
window = pyglet.window.Window()#fullscreen=True)
levels = Levels([IntroScreen(window), TheGame(window), GameOver(window)])
pyglet.clock.schedule(levels.clock)
@window.event
def on_key_press(symbol, modifiers):
levels.key(symbol, modifiers)
@window.event
def on_draw():
levels.draw()
pyglet.app.run()
## Instruction:
Add argparse to turn on/off fullscreen behavior
## Code After:
import pyglet
import argparse
from levels import GameOver, IntroScreen, TheGame
from levels.levels import Levels
def main(fullscreen):
window = pyglet.window.Window(fullscreen=fullscreen)
levels = Levels([IntroScreen(window), TheGame(window), GameOver(window)])
pyglet.clock.schedule(levels.clock)
@window.event
def on_key_press(symbol, modifiers):
levels.key(symbol, modifiers)
@window.event
def on_draw():
levels.draw()
pyglet.app.run()
if __name__ == '__main__':
parser = argparse.ArgumentParser(description="Arithemetic practice game.")
parser.add_argument('--fullscreen', action="store_true", help='Turn on fullscreen. Defaults to True')
parser.add_argument('--no-fullscreen', dest="fullscreen", action="store_false", help='Turn off fullscreen. Defaults to False')
parser.set_defaults(fullscreen=True)
results = parser.parse_args()
main(results.fullscreen)
|
874ffb6da356496985feca8f6834ccd09783eada | README.md | README.md | A lightweight package with common helpers for [Handlebars](https://github.com/wycats/handlebars.js)
## Installation
```bash
$ npm install just-handlebars-helpers --save
```
## Usage
```html
<!-- Load Handlebars -->
<script type="text/javascript" src="/node_modules/handlebars/dist/handlebars.min.js"></script>
<!-- Load the package -->
<script type="text/javascript" src="/node_modules/just-handlebars-helpers/dist/h.min.js"></script>
<script type="text/javascript">
// Register helpers for Handlebars
H.registerHelpers(Handlebars);
</script>
```
## TODO
* Support using with CommonJS require i.e `var H = require('just-handlebars-helpers');`
* Add Helpers
* `include`
* `formatDate` (based on moment)
* `sprintf` (based on sprintf-js)
* `join`
* `split`
## Testing the helpers
```bash
# Install dependencies
$ npm install
# Compile everything
$ gulp
# Run all the tests
$ karma start
```
## Inspired by
* [Swag](https://github.com/elving/swag)
* [Dashbars](https://github.com/pismute/dashbars)
* [Assemble](https://github.com/assemble/handlebars-helpers)
| A lightweight package with common helpers for [Handlebars](https://github.com/wycats/handlebars.js)
## Installation
```bash
$ npm install just-handlebars-helpers --save
```
## Usage
```html
<!-- Load Handlebars -->
<script type="text/javascript" src="/node_modules/handlebars/dist/handlebars.min.js"></script>
<!-- Load the package -->
<script type="text/javascript" src="/node_modules/just-handlebars-helpers/dist/h.min.js"></script>
<script type="text/javascript">
// Register helpers for Handlebars
H.registerHelpers(Handlebars);
</script>
```
## Testing the helpers
```bash
# Install dependencies
$ npm install
# Compile everything
$ gulp
# Run all the tests
$ npm test
```
## Inspired by
* [Swag](https://github.com/elving/swag)
* [Dashbars](https://github.com/pismute/dashbars)
* [Assemble](https://github.com/assemble/handlebars-helpers)
| Change to npm test and remove TODO information | Change to npm test and remove TODO information
| Markdown | mit | sanjeevkpandit/just-handlebars-helpers,mesaugat/just-handlebars-helpers,leapfrogtechnology/just-handlebars-helpers,leapfrogtechnology/just-handlebars-helpers | markdown | ## Code Before:
A lightweight package with common helpers for [Handlebars](https://github.com/wycats/handlebars.js)
## Installation
```bash
$ npm install just-handlebars-helpers --save
```
## Usage
```html
<!-- Load Handlebars -->
<script type="text/javascript" src="/node_modules/handlebars/dist/handlebars.min.js"></script>
<!-- Load the package -->
<script type="text/javascript" src="/node_modules/just-handlebars-helpers/dist/h.min.js"></script>
<script type="text/javascript">
// Register helpers for Handlebars
H.registerHelpers(Handlebars);
</script>
```
## TODO
* Support using with CommonJS require i.e `var H = require('just-handlebars-helpers');`
* Add Helpers
* `include`
* `formatDate` (based on moment)
* `sprintf` (based on sprintf-js)
* `join`
* `split`
## Testing the helpers
```bash
# Install dependencies
$ npm install
# Compile everything
$ gulp
# Run all the tests
$ karma start
```
## Inspired by
* [Swag](https://github.com/elving/swag)
* [Dashbars](https://github.com/pismute/dashbars)
* [Assemble](https://github.com/assemble/handlebars-helpers)
## Instruction:
Change to npm test and remove TODO information
## Code After:
A lightweight package with common helpers for [Handlebars](https://github.com/wycats/handlebars.js)
## Installation
```bash
$ npm install just-handlebars-helpers --save
```
## Usage
```html
<!-- Load Handlebars -->
<script type="text/javascript" src="/node_modules/handlebars/dist/handlebars.min.js"></script>
<!-- Load the package -->
<script type="text/javascript" src="/node_modules/just-handlebars-helpers/dist/h.min.js"></script>
<script type="text/javascript">
// Register helpers for Handlebars
H.registerHelpers(Handlebars);
</script>
```
## Testing the helpers
```bash
# Install dependencies
$ npm install
# Compile everything
$ gulp
# Run all the tests
$ npm test
```
## Inspired by
* [Swag](https://github.com/elving/swag)
* [Dashbars](https://github.com/pismute/dashbars)
* [Assemble](https://github.com/assemble/handlebars-helpers)
|
241cf4d0fbf013a766015d53878a32fb7d1c31d0 | docker_push.sh | docker_push.sh |
export SCRIPT_DIR=$(dirname $(realpath $0))
source ${SCRIPT_DIR}/settings.sh
for line in $(docker images | grep $ORIGINAL_REGISTRY)
do
IMAGE=$(echo $line | awk '{print $1}')
VERS=$(echo $line | awk '{print $2}')
HASH=$(echo $line | awk '{print $3}')
NEW_TAG=$(echo $IMAGE | sed s/${ORIGINAL_REGISTRY}/${REGISTRY}/)
# tag new registry server
docker tag -f $HASH $NEW_TAG:$VERS
# push to registry, retry if any errors
RET=1
while [ $RET -ne 0 ]
do
docker push $NEW_TAG:$VERS
RET=$?
done
# finally, remove existing tag
docker rmi $IMAGE:$VERS
done
|
export SCRIPT_DIR=$(dirname $(realpath $0))
source ${SCRIPT_DIR}/settings.sh
for line in $(docker images | grep $OLD_REGISTRY)
do
IMAGE=$(echo $line | awk '{print $1}')
VERS=$(echo $line | awk '{print $2}')
HASH=$(echo $line | awk '{print $3}')
NEW_TAG=$(echo $IMAGE | sed s/${OLD_REGISTRY}/${REGISTRY}/)
# tag new registry server
docker tag -f $HASH $NEW_TAG:$VERS
# push to registry, retry if any errors
RET=1
while [ $RET -ne 0 ]
do
docker push $NEW_TAG:$VERS
RET=$?
done
# finally, remove existing tag
docker rmi $IMAGE:$VERS
done
| Include version info in tags | Include version info in tags | Shell | apache-2.0 | nsabine/ose_scripts,nsabine/ose_scripts | shell | ## Code Before:
export SCRIPT_DIR=$(dirname $(realpath $0))
source ${SCRIPT_DIR}/settings.sh
for line in $(docker images | grep $ORIGINAL_REGISTRY)
do
IMAGE=$(echo $line | awk '{print $1}')
VERS=$(echo $line | awk '{print $2}')
HASH=$(echo $line | awk '{print $3}')
NEW_TAG=$(echo $IMAGE | sed s/${ORIGINAL_REGISTRY}/${REGISTRY}/)
# tag new registry server
docker tag -f $HASH $NEW_TAG:$VERS
# push to registry, retry if any errors
RET=1
while [ $RET -ne 0 ]
do
docker push $NEW_TAG:$VERS
RET=$?
done
# finally, remove existing tag
docker rmi $IMAGE:$VERS
done
## Instruction:
Include version info in tags
## Code After:
export SCRIPT_DIR=$(dirname $(realpath $0))
source ${SCRIPT_DIR}/settings.sh
for line in $(docker images | grep $OLD_REGISTRY)
do
IMAGE=$(echo $line | awk '{print $1}')
VERS=$(echo $line | awk '{print $2}')
HASH=$(echo $line | awk '{print $3}')
NEW_TAG=$(echo $IMAGE | sed s/${OLD_REGISTRY}/${REGISTRY}/)
# tag new registry server
docker tag -f $HASH $NEW_TAG:$VERS
# push to registry, retry if any errors
RET=1
while [ $RET -ne 0 ]
do
docker push $NEW_TAG:$VERS
RET=$?
done
# finally, remove existing tag
docker rmi $IMAGE:$VERS
done
|
bf757ed1844435ef58d6c7f47e204eb33f5b5146 | Core/src/org/sleuthkit/autopsy/casemodule/docs/casemodule-toc.xml | Core/src/org/sleuthkit/autopsy/casemodule/docs/casemodule-toc.xml | <?xml version="1.0" encoding="UTF-8"?>
<!--
To change this template, choose Tools | Templates
and open the template in the editor.
-->
<!DOCTYPE toc PUBLIC "-//Sun Microsystems Inc.//DTD JavaHelp TOC Version 2.0//EN" "http://java.sun.com/products/javahelp/toc_2_0.dtd">
<toc version="2.0">
<tocitem text="Overview" target="org.sleuthkit.autopsy.casemodule.overview"/>
<tocitem text="Quick Start Guide" target="org.sleuthkit.autopsy.casemodule.quickstart"/>
<tocitem text="Case Management">
<tocitem text="Case">
<tocitem text="About Cases" target="org.sleuthkit.autopsy.casemodule.about"/>
<tocitem text="Creating a Case" target="org.sleuthkit.autopsy.casemodule.how-to-create-case"/>
</tocitem>
<tocitem text="Image">
<tocitem text="About Images" target="org.sleuthkit.autopsy.casemodule.image-about"/>
<tocitem text="Adding an Image" target="org.sleuthkit.autopsy.casemodule.add-image"/>
</tocitem>
<tocitem text="Case Properties Window" target="org.sleuthkit.autopsy.casemodule.caseproperties"/>
<tocitem text="Hash Database Management Window" target="org.sleuthkit.autopsy.casemodule.hashdbmgmt"/>
</tocitem>
</toc>
| <?xml version="1.0" encoding="UTF-8"?>
<!--
To change this template, choose Tools | Templates
and open the template in the editor.
-->
<!DOCTYPE toc PUBLIC "-//Sun Microsystems Inc.//DTD JavaHelp TOC Version 2.0//EN" "http://java.sun.com/products/javahelp/toc_2_0.dtd">
<toc version="2.0">
<tocitem text="Quick Start Guide" target="org.sleuthkit.autopsy.casemodule.quickstart"/>
<tocitem text="Overview" target="org.sleuthkit.autopsy.casemodule.overview"/>
<tocitem text="Case Management">
<tocitem text="Case">
<tocitem text="About Cases" target="org.sleuthkit.autopsy.casemodule.about"/>
<tocitem text="Creating a Case" target="org.sleuthkit.autopsy.casemodule.how-to-create-case"/>
</tocitem>
<tocitem text="Image">
<tocitem text="About Images" target="org.sleuthkit.autopsy.casemodule.image-about"/>
<tocitem text="Adding an Image" target="org.sleuthkit.autopsy.casemodule.add-image"/>
</tocitem>
<tocitem text="Case Properties Window" target="org.sleuthkit.autopsy.casemodule.caseproperties"/>
<tocitem text="Hash Database Management Window" target="org.sleuthkit.autopsy.casemodule.hashdbmgmt"/>
</tocitem>
</toc>
| Move guide up in toc | Move guide up in toc
| XML | apache-2.0 | sidheshenator/autopsy,raman-bt/autopsy,wschaeferB/autopsy,millmanorama/autopsy,esaunders/autopsy,rcordovano/autopsy,APriestman/autopsy,raman-bt/autopsy,maxrp/autopsy,APriestman/autopsy,mhmdfy/autopsy,narfindustries/autopsy,esaunders/autopsy,sidheshenator/autopsy,esaunders/autopsy,mhmdfy/autopsy,wschaeferB/autopsy,APriestman/autopsy,maxrp/autopsy,APriestman/autopsy,APriestman/autopsy,wschaeferB/autopsy,raman-bt/autopsy,dgrove727/autopsy,dgrove727/autopsy,sidheshenator/autopsy,APriestman/autopsy,wschaeferB/autopsy,esaunders/autopsy,esaunders/autopsy,mhmdfy/autopsy,rcordovano/autopsy,narfindustries/autopsy,mhmdfy/autopsy,sidheshenator/autopsy,millmanorama/autopsy,raman-bt/autopsy,karlmortensen/autopsy,raman-bt/autopsy,narfindustries/autopsy,karlmortensen/autopsy,rcordovano/autopsy,rcordovano/autopsy,APriestman/autopsy,maxrp/autopsy,millmanorama/autopsy,rcordovano/autopsy,wschaeferB/autopsy,eXcomm/autopsy,eXcomm/autopsy,raman-bt/autopsy,raman-bt/autopsy,millmanorama/autopsy,rcordovano/autopsy,maxrp/autopsy,eXcomm/autopsy,karlmortensen/autopsy,eXcomm/autopsy,dgrove727/autopsy,karlmortensen/autopsy | xml | ## Code Before:
<?xml version="1.0" encoding="UTF-8"?>
<!--
To change this template, choose Tools | Templates
and open the template in the editor.
-->
<!DOCTYPE toc PUBLIC "-//Sun Microsystems Inc.//DTD JavaHelp TOC Version 2.0//EN" "http://java.sun.com/products/javahelp/toc_2_0.dtd">
<toc version="2.0">
<tocitem text="Overview" target="org.sleuthkit.autopsy.casemodule.overview"/>
<tocitem text="Quick Start Guide" target="org.sleuthkit.autopsy.casemodule.quickstart"/>
<tocitem text="Case Management">
<tocitem text="Case">
<tocitem text="About Cases" target="org.sleuthkit.autopsy.casemodule.about"/>
<tocitem text="Creating a Case" target="org.sleuthkit.autopsy.casemodule.how-to-create-case"/>
</tocitem>
<tocitem text="Image">
<tocitem text="About Images" target="org.sleuthkit.autopsy.casemodule.image-about"/>
<tocitem text="Adding an Image" target="org.sleuthkit.autopsy.casemodule.add-image"/>
</tocitem>
<tocitem text="Case Properties Window" target="org.sleuthkit.autopsy.casemodule.caseproperties"/>
<tocitem text="Hash Database Management Window" target="org.sleuthkit.autopsy.casemodule.hashdbmgmt"/>
</tocitem>
</toc>
## Instruction:
Move guide up in toc
## Code After:
<?xml version="1.0" encoding="UTF-8"?>
<!--
To change this template, choose Tools | Templates
and open the template in the editor.
-->
<!DOCTYPE toc PUBLIC "-//Sun Microsystems Inc.//DTD JavaHelp TOC Version 2.0//EN" "http://java.sun.com/products/javahelp/toc_2_0.dtd">
<toc version="2.0">
<tocitem text="Quick Start Guide" target="org.sleuthkit.autopsy.casemodule.quickstart"/>
<tocitem text="Overview" target="org.sleuthkit.autopsy.casemodule.overview"/>
<tocitem text="Case Management">
<tocitem text="Case">
<tocitem text="About Cases" target="org.sleuthkit.autopsy.casemodule.about"/>
<tocitem text="Creating a Case" target="org.sleuthkit.autopsy.casemodule.how-to-create-case"/>
</tocitem>
<tocitem text="Image">
<tocitem text="About Images" target="org.sleuthkit.autopsy.casemodule.image-about"/>
<tocitem text="Adding an Image" target="org.sleuthkit.autopsy.casemodule.add-image"/>
</tocitem>
<tocitem text="Case Properties Window" target="org.sleuthkit.autopsy.casemodule.caseproperties"/>
<tocitem text="Hash Database Management Window" target="org.sleuthkit.autopsy.casemodule.hashdbmgmt"/>
</tocitem>
</toc>
|