| column | type | min | max |
| --- | --- | --- | --- |
| commit | stringlengths | 40 | 40 |
| old_file | stringlengths | 4 | 237 |
| new_file | stringlengths | 4 | 237 |
| old_contents | stringlengths | 1 | 4.24k |
| new_contents | stringlengths | 5 | 4.84k |
| subject | stringlengths | 15 | 778 |
| message | stringlengths | 16 | 6.86k |
| lang | stringlengths | 1 | 30 |
| license | stringclasses | 13 values | |
| repos | stringlengths | 5 | 116k |
| config | stringlengths | 1 | 30 |
| content | stringlengths | 105 | 8.72k |
c1f2ecce60517a23edd0f54a2df19c0d8370ef4f
locales/fr/copyright-social.properties
locales/fr/copyright-social.properties
copyright_social_link1=L’enseignement, les parodies, les photos et les remixes ne devraient pas être illégaux en ligne. Il est temps de libérer notre imagination dans l’UE. Signez la pétition de Mozilla : {link} copyright_social_link2_email_subject=Les mèmes ne devraient pas être illégaux copyright_social_link3_twitter=Les parodies et les remixes ne devraient pas être illégaux en ligne. Libérons notre imagination dans l’UE. Signez la pétition de @mozilla : {link} copyright_social_hashtag1=copyfail copyright_social_hashtag2=fixcopyright copyright_social_hashtag3=keepthewebweird copyright_social_hashtag4=copyreform
copyright_social_link1=L’enseignement, les parodies, les photos et les remixes ne devraient pas être illégaux en ligne. Il est temps de libérer notre imagination dans l’UE. Signez la pétition de Mozilla : {link} copyright_social_link2_email_subject=Les mèmes ne devraient pas être illégaux copyright_social_link2_email_body=L’enseignement, les parodies, les photos et les remixes ne devraient pas être illégaux en ligne. Il est temps de libérer notre imagination dans l’UE. Signez la pétition de Mozilla pour réformer la loi européenne obsolète relative au droit d’auteur : {link} copyright_social_link3_twitter=Les parodies et les remixes ne devraient pas être illégaux en ligne. Libérons notre imagination dans l’UE. Signez la pétition de @mozilla : {link} copyright_social_promo1=Vous souvenez-vous de la dernière fois que vous avez utilisé ceci ? La loi européenne relative au droit d’auteur a été rédigée avant et doit maintenant se mettre à l’heure de la technologie. copyright_social_promo2=Vous souvenez-vous de l’époque où c’était à la mode ? La loi européenne relative au droit d’auteur a été rédigée avant. Il est temps de faire table rase. 3 choses simples peuvent être faites copyright_social_hashtag1=copyfail copyright_social_hashtag2=fixcopyright copyright_social_hashtag3=keepthewebweird copyright_social_hashtag4=copyreform
Update French (fr) localization of Mozilla Advocacy
Pontoon: Update French (fr) localization of Mozilla Advocacy Localization authors: - Théo Chevalier <theo.chevalier11@gmail.com>
INI
mpl-2.0
mozilla/advocacy.mozilla.org
ini
## Code Before: copyright_social_link1=L’enseignement, les parodies, les photos et les remixes ne devraient pas être illégaux en ligne. Il est temps de libérer notre imagination dans l’UE. Signez la pétition de Mozilla : {link} copyright_social_link2_email_subject=Les mèmes ne devraient pas être illégaux copyright_social_link3_twitter=Les parodies et les remixes ne devraient pas être illégaux en ligne. Libérons notre imagination dans l’UE. Signez la pétition de @mozilla : {link} copyright_social_hashtag1=copyfail copyright_social_hashtag2=fixcopyright copyright_social_hashtag3=keepthewebweird copyright_social_hashtag4=copyreform ## Instruction: Pontoon: Update French (fr) localization of Mozilla Advocacy Localization authors: - Théo Chevalier <theo.chevalier11@gmail.com> ## Code After: copyright_social_link1=L’enseignement, les parodies, les photos et les remixes ne devraient pas être illégaux en ligne. Il est temps de libérer notre imagination dans l’UE. Signez la pétition de Mozilla : {link} copyright_social_link2_email_subject=Les mèmes ne devraient pas être illégaux copyright_social_link2_email_body=L’enseignement, les parodies, les photos et les remixes ne devraient pas être illégaux en ligne. Il est temps de libérer notre imagination dans l’UE. Signez la pétition de Mozilla pour réformer la loi européenne obsolète relative au droit d’auteur : {link} copyright_social_link3_twitter=Les parodies et les remixes ne devraient pas être illégaux en ligne. Libérons notre imagination dans l’UE. Signez la pétition de @mozilla : {link} copyright_social_promo1=Vous souvenez-vous de la dernière fois que vous avez utilisé ceci ? La loi européenne relative au droit d’auteur a été rédigée avant et doit maintenant se mettre à l’heure de la technologie. copyright_social_promo2=Vous souvenez-vous de l’époque où c’était à la mode ? La loi européenne relative au droit d’auteur a été rédigée avant. Il est temps de faire table rase. 3 choses simples peuvent être faites copyright_social_hashtag1=copyfail copyright_social_hashtag2=fixcopyright copyright_social_hashtag3=keepthewebweird copyright_social_hashtag4=copyreform
79af182822c2022a0473a76c9674e9f3b326876c
package.json
package.json
{ "name": "fnndsc.babymri.org", "repository": "https://github.com/FNNDSC/fnndsc.babymri.org", "license": "Apache-2.0", "devDependencies": { "eslint": "^3.12.0", "eslint-config-google": "^0.7.1", "eslint-plugin-html": "^1.7.0", "gh-pages": "latest" }, "scripts": { "lint": "eslint . --ext js,html --ignore-path .gitignore", "test": "npm run lint && polymer test", "deploy": "polymer build && gh-pages -d build/bundled" } }
{ "name": "fnndsc.babymri.org", "repository": "https://github.com/FNNDSC/fnndsc.babymri.org", "license": "Apache-2.0", "devDependencies": { "eslint": "^3.12.0", "eslint-config-google": "^0.7.1", "eslint-plugin-html": "^1.7.0", "gh-pages": "latest" }, "scripts": { "lint": "eslint . --ext js,html --ignore-path .gitignore", "test": "npm run lint && polymer test", "deploy": "polymer build && timestamp=$(date \"+%c\") && gh-pages -d build/bundled -m \"Deployed on $timestamp\"" } }
Add timestamp in deploy commit message
Add timestamp in deploy commit message
JSON
bsd-3-clause
FNNDSC/fnndsc.babymri.org,FNNDSC/fnndsc.babymri.org,FNNDSC/fnndsc.babymri.org
json
## Code Before: { "name": "fnndsc.babymri.org", "repository": "https://github.com/FNNDSC/fnndsc.babymri.org", "license": "Apache-2.0", "devDependencies": { "eslint": "^3.12.0", "eslint-config-google": "^0.7.1", "eslint-plugin-html": "^1.7.0", "gh-pages": "latest" }, "scripts": { "lint": "eslint . --ext js,html --ignore-path .gitignore", "test": "npm run lint && polymer test", "deploy": "polymer build && gh-pages -d build/bundled" } } ## Instruction: Add timestamp in deploy commit message ## Code After: { "name": "fnndsc.babymri.org", "repository": "https://github.com/FNNDSC/fnndsc.babymri.org", "license": "Apache-2.0", "devDependencies": { "eslint": "^3.12.0", "eslint-config-google": "^0.7.1", "eslint-plugin-html": "^1.7.0", "gh-pages": "latest" }, "scripts": { "lint": "eslint . --ext js,html --ignore-path .gitignore", "test": "npm run lint && polymer test", "deploy": "polymer build && timestamp=$(date \"+%c\") && gh-pages -d build/bundled -m \"Deployed on $timestamp\"" } }
9e4b9a206e411d42bbe656d27a993b68f8c56fe7
containers/postgresql-pgbouncer/conf/pgbouncer.ini
containers/postgresql-pgbouncer/conf/pgbouncer.ini
[databases] * = host=mc_postgresql_server port=5432 user=mediacloud [pgbouncer] ; Actual logging done to STDOUT logfile = /dev/null pidfile = /var/run/postgresql/pgbouncer.pid listen_addr = 0.0.0.0 listen_port = 6432 unix_socket_dir = /var/run/postgresql auth_type = md5 auth_file = /etc/pgbouncer/userlist.txt pool_mode = session server_reset_query = DISCARD ALL max_client_conn = 600 default_pool_size = 600 log_connections = 0 log_disconnections = 0 stats_period = 600
[databases] * = host=mc_postgresql_server port=5432 user=mediacloud [pgbouncer] ; Actual logging done to STDOUT logfile = /dev/null pidfile = /var/run/postgresql/pgbouncer.pid listen_addr = 0.0.0.0 listen_port = 6432 unix_socket_dir = /var/run/postgresql auth_type = md5 auth_file = /etc/pgbouncer/userlist.txt pool_mode = session server_reset_query = DISCARD ALL max_client_conn = 600 default_pool_size = 600 log_connections = 0 log_disconnections = 0 stats_period = 600 server_login_retry = 1
Make PgBouncer try to reconnect every 1 s
Make PgBouncer try to reconnect every 1 s
INI
agpl-3.0
berkmancenter/mediacloud,berkmancenter/mediacloud,berkmancenter/mediacloud,berkmancenter/mediacloud,berkmancenter/mediacloud
ini
## Code Before: [databases] * = host=mc_postgresql_server port=5432 user=mediacloud [pgbouncer] ; Actual logging done to STDOUT logfile = /dev/null pidfile = /var/run/postgresql/pgbouncer.pid listen_addr = 0.0.0.0 listen_port = 6432 unix_socket_dir = /var/run/postgresql auth_type = md5 auth_file = /etc/pgbouncer/userlist.txt pool_mode = session server_reset_query = DISCARD ALL max_client_conn = 600 default_pool_size = 600 log_connections = 0 log_disconnections = 0 stats_period = 600 ## Instruction: Make PgBouncer try to reconnect every 1 s ## Code After: [databases] * = host=mc_postgresql_server port=5432 user=mediacloud [pgbouncer] ; Actual logging done to STDOUT logfile = /dev/null pidfile = /var/run/postgresql/pgbouncer.pid listen_addr = 0.0.0.0 listen_port = 6432 unix_socket_dir = /var/run/postgresql auth_type = md5 auth_file = /etc/pgbouncer/userlist.txt pool_mode = session server_reset_query = DISCARD ALL max_client_conn = 600 default_pool_size = 600 log_connections = 0 log_disconnections = 0 stats_period = 600 server_login_retry = 1
19c7d6a7c725f3a8847e6ed88d8add874ee9ae0b
docs/index.html
docs/index.html
--- layout: home title: github-version-statistics --- <div class="docs-header"> {% comment %} {% include masthead.html %} {% endcomment %} <div class="docs-header-content"> <p class="docs-subtitle">Which versions of a Java library are used on Github.</p> <!--a data-ignore="push" href="https://github.com/centic9/github-version-statistics" class="btn btn-primary" onClick="_gaq.push(['_trackEvent', 'Downloads', 'V1.0']);">Download</a> <p class="version">Currently v1.0</p--> <a href="resultsCurrent.html">Current Results</a><br> <a href="results.html">Changes over time</a><br> </div> {% comment %} <div class="docs-header-bottom"> {% include ad.html %} {% include footer.html %} </div> {% endcomment %} </div>
--- layout: home title: github-version-statistics --- <div class="docs-header"> {% comment %} {% include masthead.html %} {% endcomment %} <div class="docs-header-content"> <p class="docs-subtitle">Which versions of a Java library are used on Github.</p> <!--a data-ignore="push" href="https://github.com/centic9/github-version-statistics" class="btn btn-primary" onClick="_gaq.push(['_trackEvent', 'Downloads', 'V1.0']);">Download</a> <p class="version">Currently v1.0</p--> <a href="resultsCurrent.html">Current Results</a><br> <a href="results.html">Changes over time</a><br> <br> <a href="https://github.com/centic9/github-version-statistics/">Sources</a><br> </div> {% comment %} <div class="docs-header-bottom"> {% include ad.html %} {% include footer.html %} </div> {% endcomment %} </div>
Add link to the github project
Add link to the github project
HTML
apache-2.0
centic9/github-version-statistics,centic9/github-version-statistics
html
## Code Before: --- layout: home title: github-version-statistics --- <div class="docs-header"> {% comment %} {% include masthead.html %} {% endcomment %} <div class="docs-header-content"> <p class="docs-subtitle">Which versions of a Java library are used on Github.</p> <!--a data-ignore="push" href="https://github.com/centic9/github-version-statistics" class="btn btn-primary" onClick="_gaq.push(['_trackEvent', 'Downloads', 'V1.0']);">Download</a> <p class="version">Currently v1.0</p--> <a href="resultsCurrent.html">Current Results</a><br> <a href="results.html">Changes over time</a><br> </div> {% comment %} <div class="docs-header-bottom"> {% include ad.html %} {% include footer.html %} </div> {% endcomment %} </div> ## Instruction: Add link to the github project ## Code After: --- layout: home title: github-version-statistics --- <div class="docs-header"> {% comment %} {% include masthead.html %} {% endcomment %} <div class="docs-header-content"> <p class="docs-subtitle">Which versions of a Java library are used on Github.</p> <!--a data-ignore="push" href="https://github.com/centic9/github-version-statistics" class="btn btn-primary" onClick="_gaq.push(['_trackEvent', 'Downloads', 'V1.0']);">Download</a> <p class="version">Currently v1.0</p--> <a href="resultsCurrent.html">Current Results</a><br> <a href="results.html">Changes over time</a><br> <br> <a href="https://github.com/centic9/github-version-statistics/">Sources</a><br> </div> {% comment %} <div class="docs-header-bottom"> {% include ad.html %} {% include footer.html %} </div> {% endcomment %} </div>
ac435a55b8a6491a4223610ac38d0b1a83b6ff9d
dokan/dokan_test.go
dokan/dokan_test.go
// Copyright 2016 Keybase Inc. All rights reserved. // Use of this source code is governed by a BSD // license that can be found in the LICENSE file. // +build windows package dokan import ( "testing" "time" ) func TestTimePacking(t *testing.T) { t0 := time.Now() if !t0.Equal(unpackTime(packTime(t0))) { t.Fatal("Time unpack+pack not equal with original!") } } func TestCtxAlloc(t *testing.T) { ctx := allocCtx(0) ctx.Free() }
// Copyright 2016 Keybase Inc. All rights reserved. // Use of this source code is governed by a BSD // license that can be found in the LICENSE file. // +build windows package dokan import ( "testing" "time" ) func testTimePacking(t *testing.T, t0 time.Time) { t1 := unpackTime(packTime(t0)) if !t0.Equal(t1) { t.Fatal("Time pack+unpack not equal with original: %v => %v", t0, t1) } } func TestTimePacking(t *testing.T) { testTimePacking(t, time.Time{}) testTimePacking(t, time.Now()) testTimePacking(t, time.Unix(0, 0)) } func TestCtxAlloc(t *testing.T) { ctx := allocCtx(0) ctx.Free() }
Test time type conversions more
dokan: Test time type conversions more
Go
bsd-3-clause
keybase/client,keybase/client,keybase/client,keybase/client,keybase/client,keybase/kbfs,keybase/kbfs,keybase/client,keybase/kbfs,keybase/client,keybase/client,keybase/client,keybase/client,keybase/client,keybase/client,keybase/kbfs
go
## Code Before: // Copyright 2016 Keybase Inc. All rights reserved. // Use of this source code is governed by a BSD // license that can be found in the LICENSE file. // +build windows package dokan import ( "testing" "time" ) func TestTimePacking(t *testing.T) { t0 := time.Now() if !t0.Equal(unpackTime(packTime(t0))) { t.Fatal("Time unpack+pack not equal with original!") } } func TestCtxAlloc(t *testing.T) { ctx := allocCtx(0) ctx.Free() } ## Instruction: dokan: Test time type conversions more ## Code After: // Copyright 2016 Keybase Inc. All rights reserved. // Use of this source code is governed by a BSD // license that can be found in the LICENSE file. // +build windows package dokan import ( "testing" "time" ) func testTimePacking(t *testing.T, t0 time.Time) { t1 := unpackTime(packTime(t0)) if !t0.Equal(t1) { t.Fatal("Time pack+unpack not equal with original: %v => %v", t0, t1) } } func TestTimePacking(t *testing.T) { testTimePacking(t, time.Time{}) testTimePacking(t, time.Now()) testTimePacking(t, time.Unix(0, 0)) } func TestCtxAlloc(t *testing.T) { ctx := allocCtx(0) ctx.Free() }
1e513d901dfef9135a62c8f99633b10d3900ecb8
orator/schema/mysql_builder.py
orator/schema/mysql_builder.py
from .builder import SchemaBuilder class MySQLSchemaBuilder(SchemaBuilder): def has_table(self, table): """ Determine if the given table exists. :param table: The table :type table: str :rtype: bool """ sql = self._grammar.compile_table_exists() database = self._connection.get_database_name() table = self._connection.get_table_prefix() + table return len(self._connection.select(sql, [database, table])) > 0 def get_column_listing(self, table): """ Get the column listing for a given table. :param table: The table :type table: str :rtype: list """ sql = self._grammar.compile_column_exists() database = self._connection.get_database_name() table = self._connection.get_table_prefix() + table results = self._connection.select(sql, [database, table]) return self._connection.get_post_processor().process_column_listing(results)
from .builder import SchemaBuilder class MySQLSchemaBuilder(SchemaBuilder): def has_table(self, table): """ Determine if the given table exists. :param table: The table :type table: str :rtype: bool """ sql = self._grammar.compile_table_exists() database = self._connection.get_database_name() table = self._connection.get_table_prefix() + table return len(self._connection.select(sql, [database, table])) > 0 def get_column_listing(self, table): """ Get the column listing for a given table. :param table: The table :type table: str :rtype: list """ sql = self._grammar.compile_column_exists() database = self._connection.get_database_name() table = self._connection.get_table_prefix() + table results = [] for result in self._connection.select(sql, [database, table]): new_result = {} for key, value in result.items(): new_result[key.lower()] = value results.append(new_result) return self._connection.get_post_processor().process_column_listing(results)
Fix case when processing column names for MySQL
Fix case when processing column names for MySQL
Python
mit
sdispater/orator
python
## Code Before: from .builder import SchemaBuilder class MySQLSchemaBuilder(SchemaBuilder): def has_table(self, table): """ Determine if the given table exists. :param table: The table :type table: str :rtype: bool """ sql = self._grammar.compile_table_exists() database = self._connection.get_database_name() table = self._connection.get_table_prefix() + table return len(self._connection.select(sql, [database, table])) > 0 def get_column_listing(self, table): """ Get the column listing for a given table. :param table: The table :type table: str :rtype: list """ sql = self._grammar.compile_column_exists() database = self._connection.get_database_name() table = self._connection.get_table_prefix() + table results = self._connection.select(sql, [database, table]) return self._connection.get_post_processor().process_column_listing(results) ## Instruction: Fix case when processing column names for MySQL ## Code After: from .builder import SchemaBuilder class MySQLSchemaBuilder(SchemaBuilder): def has_table(self, table): """ Determine if the given table exists. :param table: The table :type table: str :rtype: bool """ sql = self._grammar.compile_table_exists() database = self._connection.get_database_name() table = self._connection.get_table_prefix() + table return len(self._connection.select(sql, [database, table])) > 0 def get_column_listing(self, table): """ Get the column listing for a given table. :param table: The table :type table: str :rtype: list """ sql = self._grammar.compile_column_exists() database = self._connection.get_database_name() table = self._connection.get_table_prefix() + table results = [] for result in self._connection.select(sql, [database, table]): new_result = {} for key, value in result.items(): new_result[key.lower()] = value results.append(new_result) return self._connection.get_post_processor().process_column_listing(results)
78e5c62a7495333f1435e26141118a0e572e47d6
app/javascript/app/pages/sectors-agriculture/sectors-agriculture-styles.scss
app/javascript/app/pages/sectors-agriculture/sectors-agriculture-styles.scss
@import '~styles/layout.scss'; .anchorNav { @include row('shrink', $wrap: false); } .headerContent { margin-bottom: 55px; } .sectionComponent { margin-bottom: 100px; }
@import '~styles/layout.scss'; .anchorNav { @include row('shrink', $wrap: false); } .headerContent { margin-bottom: 55px; } .sectionComponent { margin-bottom: 100px; position: relative; } .sectionHash { position: absolute; top: -130px; @media #{$tablet-landscape} { top: -79px; } }
Fix hash section navigation in sectors/agriculture
Fix hash section navigation in sectors/agriculture
SCSS
mit
Vizzuality/climate-watch,Vizzuality/climate-watch,Vizzuality/climate-watch,Vizzuality/climate-watch
scss
## Code Before: @import '~styles/layout.scss'; .anchorNav { @include row('shrink', $wrap: false); } .headerContent { margin-bottom: 55px; } .sectionComponent { margin-bottom: 100px; } ## Instruction: Fix hash section navigation in sectors/agriculture ## Code After: @import '~styles/layout.scss'; .anchorNav { @include row('shrink', $wrap: false); } .headerContent { margin-bottom: 55px; } .sectionComponent { margin-bottom: 100px; position: relative; } .sectionHash { position: absolute; top: -130px; @media #{$tablet-landscape} { top: -79px; } }
ad576da955df4de52b095eb0d3642c6605fcd173
.travis.yml
.travis.yml
language: python python: - "2.6" - "2.7" env: - DJANGO="django>=1.5,<1.6 --use-mirrors" - DJANGO="django>=1.4,<1.5 --use-mirrors" install: - pip install $DJANGO script: python setup.py test notifications: email: - jklaiho@iki.fi
language: python python: - "2.6" - "2.7" env: - DJANGO="django>=1.5,<1.6 --use-mirrors" - DJANGO="django>=1.4,<1.5 --use-mirrors" - DJANGO="django>=1.3,<1.4 --use-mirrors" install: - pip install $DJANGO script: python setup.py test notifications: email: - jklaiho@iki.fi
Add Django 1.3 to the Travis configuration
Add Django 1.3 to the Travis configuration
YAML
bsd-3-clause
jklaiho/django-class-fixtures,jklaiho/django-class-fixtures
yaml
## Code Before: language: python python: - "2.6" - "2.7" env: - DJANGO="django>=1.5,<1.6 --use-mirrors" - DJANGO="django>=1.4,<1.5 --use-mirrors" install: - pip install $DJANGO script: python setup.py test notifications: email: - jklaiho@iki.fi ## Instruction: Add Django 1.3 to the Travis configuration ## Code After: language: python python: - "2.6" - "2.7" env: - DJANGO="django>=1.5,<1.6 --use-mirrors" - DJANGO="django>=1.4,<1.5 --use-mirrors" - DJANGO="django>=1.3,<1.4 --use-mirrors" install: - pip install $DJANGO script: python setup.py test notifications: email: - jklaiho@iki.fi
905f5ef35e97fb5b22af29deb52df1b438aa7b71
src/encoded/tests/data/inserts/cpg_correlation_quality_metric.json
src/encoded/tests/data/inserts/cpg_correlation_quality_metric.json
[ { "lab": "/labs/robert-waterston/", "award": "/awards/U41HG007355/", "status": "released", "step_run": "7c81414f-a9c0-44ed-87fb-2e2207733d3b", "uuid": "23d36ff2-8219-4bf2-b163-2a922b1d8054", "assay_term_name": "whole-genome shotgun bisulfite sequencing", "assay_term_id": "OBI:0001863", "quality_metric_of": [ "46e82a90-49e5-4c33-afab-9ec90d65faf3", "582d6675-0b2e-4886-b43d-f6efa9033b37" ], "attachment": "generic_from_star_final_out.txt", "CpG pairs": 43730532, "CpG pairs with atleast 10 reads each": 31107909, "Pearson Correlation Coefficient": 0.8993 } ]
[ { "lab": "/labs/robert-waterston/", "award": "/awards/U41HG007355/", "status": "released", "step_run": "7c81414f-a9c0-44ed-87fb-2e2207733d3b", "uuid": "23d36ff2-8219-4bf2-b163-2a922b1d8054", "assay_term_name": "whole-genome shotgun bisulfite sequencing", "assay_term_id": "OBI:0001863", "quality_metric_of": [ "46e82a90-49e5-4c33-afab-9ec90d65faf3", "582d6675-0b2e-4886-b43d-f6efa9033b37" ], "attachment": "generic_from_star_final_out.txt", "CpG pairs": 43730532, "CpG pairs with atleast 10 reads each": 31107909, "Pearson correlation": 0.8993 } ]
Upgrade CpG correlation QC insert
Upgrade CpG correlation QC insert
JSON
mit
ENCODE-DCC/encoded,T2DREAM/t2dream-portal,T2DREAM/t2dream-portal,T2DREAM/t2dream-portal,ENCODE-DCC/encoded,ENCODE-DCC/encoded,T2DREAM/t2dream-portal,ENCODE-DCC/encoded
json
## Code Before: [ { "lab": "/labs/robert-waterston/", "award": "/awards/U41HG007355/", "status": "released", "step_run": "7c81414f-a9c0-44ed-87fb-2e2207733d3b", "uuid": "23d36ff2-8219-4bf2-b163-2a922b1d8054", "assay_term_name": "whole-genome shotgun bisulfite sequencing", "assay_term_id": "OBI:0001863", "quality_metric_of": [ "46e82a90-49e5-4c33-afab-9ec90d65faf3", "582d6675-0b2e-4886-b43d-f6efa9033b37" ], "attachment": "generic_from_star_final_out.txt", "CpG pairs": 43730532, "CpG pairs with atleast 10 reads each": 31107909, "Pearson Correlation Coefficient": 0.8993 } ] ## Instruction: Upgrade CpG correlation QC insert ## Code After: [ { "lab": "/labs/robert-waterston/", "award": "/awards/U41HG007355/", "status": "released", "step_run": "7c81414f-a9c0-44ed-87fb-2e2207733d3b", "uuid": "23d36ff2-8219-4bf2-b163-2a922b1d8054", "assay_term_name": "whole-genome shotgun bisulfite sequencing", "assay_term_id": "OBI:0001863", "quality_metric_of": [ "46e82a90-49e5-4c33-afab-9ec90d65faf3", "582d6675-0b2e-4886-b43d-f6efa9033b37" ], "attachment": "generic_from_star_final_out.txt", "CpG pairs": 43730532, "CpG pairs with atleast 10 reads each": 31107909, "Pearson correlation": 0.8993 } ]
f78eb091950bcd8a8b29d6619537faab95c4dcfc
ckanext/requestdata/templates/requestdata/snippets/section_item_archive.html
ckanext/requestdata/templates/requestdata/snippets/section_item_archive.html
{# Creates single item in a section. item - The request that needs to be shown. Example usage: {% snippet 'requestdata/snippets/section_item_new.html', item=item %} #} {% resource 'requestdata/modal-form.js' %} {# {% set sender_profile = h.url_for(controller='user', action='read', id=item.sender_user_id) %} #} {% set package_url = h.url_for(controller='package', action='read', id=item.package_id) %} <div class="my-requested-data-container__content-item"> <p>{{item}}</p> </div>
{# Creates single item in a section. item - The request that needs to be shown. Example usage: {% snippet 'requestdata/snippets/section_item_archive.html', item=item %} #} {% set sender_profile = h.url_for(controller='user', action='read', id=item.sender_user_id) %} {% set package_url = h.url_for(controller='package', action='read', id=item.package_id) %} <div class="my-requested-data-container__content-item"> <p class="my-requested-data-container__content-item-info">Data request from <a href="{{ sender_profile }}" title="{{ _('View profile') }}">{{ item.sender_name }}</a> for dataset <a href="{{ package_url }}" title="{{ _('View dataset') }}">{{ h.requestdata_get_package_title(item.package_id) }}</a>.</p> <p class="my-requested-data-container__date" title="{{ h.render_datetime(item.modified_at, with_hours=True) }}">{{ h.requestdata_time_ago_from_datetime(item.modified_at) }}</p> <div class="my-requested-data-container__status"> {% if item.rejected %} <i class="icon-remove-circle"></i> <p class="my-requested-data-container__status-text">Rejected</p> {% elif item.data_shared %} <i class="icon-thumbs-up"></i> <p class="my-requested-data-container__status-text">Shared</p> {% elif not item.data_shared %} <i class="icon-thumbs-down"></i> <p class="my-requested-data-container__status-text">Not shared</p> {% endif %} </div> </div>
Add additional info for archived request
Add additional info for archived request
HTML
agpl-3.0
ViderumGlobal/ckanext-requestdata,ViderumGlobal/ckanext-requestdata,ViderumGlobal/ckanext-requestdata,ViderumGlobal/ckanext-requestdata
html
## Code Before: {# Creates single item in a section. item - The request that needs to be shown. Example usage: {% snippet 'requestdata/snippets/section_item_new.html', item=item %} #} {% resource 'requestdata/modal-form.js' %} {# {% set sender_profile = h.url_for(controller='user', action='read', id=item.sender_user_id) %} #} {% set package_url = h.url_for(controller='package', action='read', id=item.package_id) %} <div class="my-requested-data-container__content-item"> <p>{{item}}</p> </div> ## Instruction: Add additional info for archived request ## Code After: {# Creates single item in a section. item - The request that needs to be shown. Example usage: {% snippet 'requestdata/snippets/section_item_archive.html', item=item %} #} {% set sender_profile = h.url_for(controller='user', action='read', id=item.sender_user_id) %} {% set package_url = h.url_for(controller='package', action='read', id=item.package_id) %} <div class="my-requested-data-container__content-item"> <p class="my-requested-data-container__content-item-info">Data request from <a href="{{ sender_profile }}" title="{{ _('View profile') }}">{{ item.sender_name }}</a> for dataset <a href="{{ package_url }}" title="{{ _('View dataset') }}">{{ h.requestdata_get_package_title(item.package_id) }}</a>.</p> <p class="my-requested-data-container__date" title="{{ h.render_datetime(item.modified_at, with_hours=True) }}">{{ h.requestdata_time_ago_from_datetime(item.modified_at) }}</p> <div class="my-requested-data-container__status"> {% if item.rejected %} <i class="icon-remove-circle"></i> <p class="my-requested-data-container__status-text">Rejected</p> {% elif item.data_shared %} <i class="icon-thumbs-up"></i> <p class="my-requested-data-container__status-text">Shared</p> {% elif not item.data_shared %} <i class="icon-thumbs-down"></i> <p class="my-requested-data-container__status-text">Not shared</p> {% endif %} </div> </div>
1b0a3de28f12c7555857b15777287ff906d58eb5
README.md
README.md
Code used for demos in class This code should in no way be considered production ready, but rather as quick demos of various concepts
Code used for demos in class This code should in no way be considered production ready, but rather as quick demos of various concepts ## Links to repositories from rails examples: 11/06/2017 - https://github.com/CCSU-CS416F17/toy-app-11-06
Add link to rails repos 11-06-2017
Add link to rails repos 11-06-2017
Markdown
mit
CCSU-CS416F17/CS416F17CourseInfo,CCSU-CS416F17/CS416F17CourseInfo,CCSU-CS416F17/CS416F17CourseInfo,CCSU-CS416F17/CS416F17CourseInfo
markdown
## Code Before: Code used for demos in class This code should in no way be considered production ready, but rather as quick demos of various concepts ## Instruction: Add link to rails repos 11-06-2017 ## Code After: Code used for demos in class This code should in no way be considered production ready, but rather as quick demos of various concepts ## Links to repositories from rails examples: 11/06/2017 - https://github.com/CCSU-CS416F17/toy-app-11-06
9b8cbfcf33ba644670a42490db7de4249e5ff080
invocations/docs.py
invocations/docs.py
import os from invoke.tasks import task from invoke.runner import run docs_dir = 'docs' build = os.path.join(docs_dir, '_build') @task def clean_docs(): run("rm -rf %s" % build) @task def browse_docs(): run("open %s" % os.path.join(build, 'index.html')) @task def docs(clean=False, browse=False): if clean: clean_docs.body() run("sphinx-build %s %s" % (docs_dir, build), pty=True) if browse: browse_docs.body()
import os from invoke.tasks import task from invoke.runner import run docs_dir = 'docs' build = os.path.join(docs_dir, '_build') @task def clean_docs(): run("rm -rf %s" % build) @task def browse_docs(): run("open %s" % os.path.join(build, 'index.html')) @task def docs(clean=False, browse=False): if clean: clean_docs() run("sphinx-build %s %s" % (docs_dir, build), pty=True) if browse: browse_docs()
Leverage __call__ on task downstream
Leverage __call__ on task downstream
Python
bsd-2-clause
mrjmad/invocations,alex/invocations,pyinvoke/invocations,singingwolfboy/invocations
python
## Code Before: import os from invoke.tasks import task from invoke.runner import run docs_dir = 'docs' build = os.path.join(docs_dir, '_build') @task def clean_docs(): run("rm -rf %s" % build) @task def browse_docs(): run("open %s" % os.path.join(build, 'index.html')) @task def docs(clean=False, browse=False): if clean: clean_docs.body() run("sphinx-build %s %s" % (docs_dir, build), pty=True) if browse: browse_docs.body() ## Instruction: Leverage __call__ on task downstream ## Code After: import os from invoke.tasks import task from invoke.runner import run docs_dir = 'docs' build = os.path.join(docs_dir, '_build') @task def clean_docs(): run("rm -rf %s" % build) @task def browse_docs(): run("open %s" % os.path.join(build, 'index.html')) @task def docs(clean=False, browse=False): if clean: clean_docs() run("sphinx-build %s %s" % (docs_dir, build), pty=True) if browse: browse_docs()
f541db600dd4d2891b9d5878f0538ed6dfd7a05b
plugin/src/main/groovy/com/github/ksoichiro/web/resource/node/TriremeNodeRunner.groovy
plugin/src/main/groovy/com/github/ksoichiro/web/resource/node/TriremeNodeRunner.groovy
package com.github.ksoichiro.web.resource.node

import io.apigee.trireme.core.NodeEnvironment
import io.apigee.trireme.core.NodeScript
import io.apigee.trireme.core.ScriptStatus

/**
 * Wrapper class to run Node.js with Trireme/Rhino.
 */
class TriremeNodeRunner {
    public static final String NODE_VERSION = "0.12"

    File workingDir
    String scriptName
    String scriptPath
    String[] args
    ScriptStatus status

    public void exec() {
        NodeEnvironment env = new NodeEnvironment()
        File path = scriptPath ? new File(scriptPath) : new File(workingDir, scriptName)
        NodeScript script = env.createScript(scriptName, path, args)
        script.setWorkingDirectory(workingDir.absolutePath)
        script.setNodeVersion(NODE_VERSION)
        status = script.execute().get()
        env.close()
    }
}
package com.github.ksoichiro.web.resource.node

import io.apigee.trireme.core.NodeEnvironment
import io.apigee.trireme.core.NodeScript
import io.apigee.trireme.core.ScriptStatus
import org.gradle.api.GradleException

/**
 * Wrapper class to run Node.js with Trireme/Rhino.
 */
class TriremeNodeRunner {
    public static final String NODE_VERSION = "0.12"

    File workingDir
    String scriptName
    String scriptPath
    String[] args
    ScriptStatus status

    public void exec() {
        NodeEnvironment env = new NodeEnvironment()
        File path = scriptPath ? new File(scriptPath) : new File(workingDir, scriptName)
        NodeScript script = env.createScript(scriptName, path, args)
        script.setWorkingDirectory(workingDir.absolutePath)
        script.setNodeVersion(NODE_VERSION)
        status = script.execute().get()
        env.close()
        if (!successfullyFinished()) {
            throw new GradleException("Error occurred while processing JavaScript. exitCode: ${status?.exitCode}, cause: ${status?.cause?.message}", status?.cause)
        }
    }

    public boolean successfullyFinished() {
        status != null && status.isOk()
    }
}
Throw exception for Trireme errors
Throw exception for Trireme errors TriremeNodeRunner has status field but it has not been handled. Related to #3, sometimes it actually fails. Therefore it should be reported to the caller and treat as build failure.
Groovy
apache-2.0
ksoichiro/gradle-web-resource-plugin
groovy
## Code Before: package com.github.ksoichiro.web.resource.node import io.apigee.trireme.core.NodeEnvironment import io.apigee.trireme.core.NodeScript import io.apigee.trireme.core.ScriptStatus /** * Wrapper class to run Node.js with Trireme/Rhino. */ class TriremeNodeRunner { public static final String NODE_VERSION = "0.12" File workingDir String scriptName String scriptPath String[] args ScriptStatus status public void exec() { NodeEnvironment env = new NodeEnvironment() File path = scriptPath ? new File(scriptPath) : new File(workingDir, scriptName) NodeScript script = env.createScript(scriptName, path, args) script.setWorkingDirectory(workingDir.absolutePath) script.setNodeVersion(NODE_VERSION) status = script.execute().get() env.close() } } ## Instruction: Throw exception for Trireme errors TriremeNodeRunner has status field but it has not been handled. Related to #3, sometimes it actually fails. Therefore it should be reported to the caller and treat as build failure. ## Code After: package com.github.ksoichiro.web.resource.node import io.apigee.trireme.core.NodeEnvironment import io.apigee.trireme.core.NodeScript import io.apigee.trireme.core.ScriptStatus import org.gradle.api.GradleException /** * Wrapper class to run Node.js with Trireme/Rhino. */ class TriremeNodeRunner { public static final String NODE_VERSION = "0.12" File workingDir String scriptName String scriptPath String[] args ScriptStatus status public void exec() { NodeEnvironment env = new NodeEnvironment() File path = scriptPath ? new File(scriptPath) : new File(workingDir, scriptName) NodeScript script = env.createScript(scriptName, path, args) script.setWorkingDirectory(workingDir.absolutePath) script.setNodeVersion(NODE_VERSION) status = script.execute().get() env.close() if (!successfullyFinished()) { throw new GradleException("Error occurred while processing JavaScript. exitCode: ${status?.exitCode}, cause: ${status?.cause?.message}", status?.cause) } } public boolean successfullyFinished() { status != null && status.isOk() } }
d04410b79a697e8a740ae1a1339cee2573c1ee1c
ci_build_test_publish.sh
ci_build_test_publish.sh
set -ex

DOCKER_COMPOSE="docker-compose -f docker-compose.yml -f docker-compose.infrastructure.yml"

$DOCKER_COMPOSE build --force-rm --build-arg $BUILD_NUMBER
$DOCKER_COMPOSE up -d
$DOCKER_COMPOSE run nimbus ./build.sh --Target="Test"
$DOCKER_COMPOSE run -e REDIS_TEST_CONNECTION=redis nimbus ./build.sh --Target="IntegrationTest"
$DOCKER_COMPOSE run -e NUGET_API_KEY=$NUGET_API_KEY nimbus ./build.sh --Target="Publish"
$DOCKER_COMPOSE down
set -ex

DOCKER_COMPOSE="docker-compose -f docker-compose.yml -f docker-compose.infrastructure.yml"

$DOCKER_COMPOSE build --force-rm --build-arg $BUILD_NUMBER
$DOCKER_COMPOSE up -d
$DOCKER_COMPOSE run -rm nimbus ./build.sh --Target="Test"
$DOCKER_COMPOSE run -rm -e REDIS_TEST_CONNECTION=redis nimbus ./build.sh --Target="IntegrationTest"
$DOCKER_COMPOSE run -rm -e NUGET_API_KEY=$NUGET_API_KEY nimbus ./build.sh --Target="Publish"
$DOCKER_COMPOSE down
Clean up images after running commands
Clean up images after running commands
Shell
mit
NimbusAPI/Nimbus,NimbusAPI/Nimbus
shell
## Code Before: set -ex DOCKER_COMPOSE="docker-compose -f docker-compose.yml -f docker-compose.infrastructure.yml" $DOCKER_COMPOSE build --force-rm --build-arg $BUILD_NUMBER $DOCKER_COMPOSE up -d $DOCKER_COMPOSE run nimbus ./build.sh --Target="Test" $DOCKER_COMPOSE run -e REDIS_TEST_CONNECTION=redis nimbus ./build.sh --Target="IntegrationTest" $DOCKER_COMPOSE run -e NUGET_API_KEY=$NUGET_API_KEY nimbus ./build.sh --Target="Publish" $DOCKER_COMPOSE down ## Instruction: Clean up images after running commands ## Code After: set -ex DOCKER_COMPOSE="docker-compose -f docker-compose.yml -f docker-compose.infrastructure.yml" $DOCKER_COMPOSE build --force-rm --build-arg $BUILD_NUMBER $DOCKER_COMPOSE up -d $DOCKER_COMPOSE run -rm nimbus ./build.sh --Target="Test" $DOCKER_COMPOSE run -rm -e REDIS_TEST_CONNECTION=redis nimbus ./build.sh --Target="IntegrationTest" $DOCKER_COMPOSE run -rm -e NUGET_API_KEY=$NUGET_API_KEY nimbus ./build.sh --Target="Publish" $DOCKER_COMPOSE down
42822a9dc636feda4719533a8d1e4fd480348766
Build/Grunt-Tasks/Test.js
Build/Grunt-Tasks/Test.js
/**
 * Test task.
 * Replaces all t3b_template strings and other meta-data with the data
 * specified inside the 'package.json'. (Should be run after downloading the extension).
 */
module.exports = function(grunt) {
    "use strict";

    grunt.registerTask("test", function() {
        // Replace general text-strings and paths.
        grunt.task.run("connect:karma", "karma:test");
    });
};
/**
 * Test task.
 * Run JS unit tests via Jasmine & Karma, and create coverage reports for all browsers.
 */
module.exports = function(grunt) {
    "use strict";

    grunt.registerTask("test", function() {
        grunt.task.run("connect:karma", "karma:test");
    });
};
Add a proper file description for the 'test' grunt task
[DOCS] Add a proper file description for the 'test' grunt task
JavaScript
mit
t3b/t3b_template,t3b/t3b_template,t3b/t3b_template,t3b/t3b_template,t3b/t3b_template
javascript
## Code Before: /** * Test task. * Replaces all t3b_template strings and other meta-data with the data * specified inside the 'package.json'. (Should be run after downloading the extension). */ module.exports = function(grunt) { "use strict"; grunt.registerTask("test", function() { // Replace general text-strings and paths. grunt.task.run("connect:karma", "karma:test"); }); }; ## Instruction: [DOCS] Add a proper file description for the 'test' grunt task ## Code After: /** * Test task. * Run JS unit tests via Jasmine & Karma, and create coverage reports for all browsers. */ module.exports = function(grunt) { "use strict"; grunt.registerTask("test", function() { grunt.task.run("connect:karma", "karma:test"); }); };
3fd49dfa0e75e4f2cc80edd811706edd1dd5b40a
app/assets/stylesheets/components/shared/c-dashboard-chart-view.scss
app/assets/stylesheets/components/shared/c-dashboard-chart-view.scss
.c-dashboard-chart-view {
  min-height: 400px;

  .c-loading-spinner {
    min-height: 400px;

    &::before {
      background-color: transparent;
    }

    &::after {
      border-top-color: $color-1;
      border-right-color: $color-1;
    }
  }

  .c-we-chart {
    height: 400px;

    .chart {
      height: 100%;
    }
  }
}
.c-dashboard-chart-view {
  min-height: 400px;

  .c-loading-spinner {
    min-height: 400px;

    &::before {
      background-color: transparent;
    }

    &::after {
      border-top-color: $color-1;
      border-right-color: $color-1;
    }
  }

  .c-we-chart {
    position: relative; // For the legend
    height: 400px;

    .chart {
      height: 100%;
    }
  }
}
Fix bug where legend of pie chart could be mispositioned in dashboard
Fix bug where legend of pie chart could be mispositioned in dashboard
SCSS
mit
Vizzuality/forest-atlas-landscape-cms,Vizzuality/forest-atlas-landscape-cms,Vizzuality/forest-atlas-landscape-cms,Vizzuality/forest-atlas-landscape-cms
scss
## Code Before: .c-dashboard-chart-view { min-height: 400px; .c-loading-spinner { min-height: 400px; &::before { background-color: transparent; } &::after { border-top-color: $color-1; border-right-color: $color-1; } } .c-we-chart { height: 400px; .chart { height: 100%; } } } ## Instruction: Fix bug where legend of pie chart could be mispositioned in dashboard ## Code After: .c-dashboard-chart-view { min-height: 400px; .c-loading-spinner { min-height: 400px; &::before { background-color: transparent; } &::after { border-top-color: $color-1; border-right-color: $color-1; } } .c-we-chart { position: relative; // For the legend height: 400px; .chart { height: 100%; } } }
4ec431669134f8ac01fe5ef1d883bc4dc31fd6d7
number_to_words_test.py
number_to_words_test.py
import unittest

from number_to_words import NumberToWords


class TestNumberToWords(unittest.TestCase):

    def setUp(self):
        self.n2w = NumberToWords()

    def tearDown(self):
        self.n2w = None

    def test_zero_and_single_digits(self):
        NUMBERS = {
            0: 'zero',
            1: 'one',
            2: 'two',
            3: 'three',
            4: 'four',
            5: 'five',
            6: 'six',
            7: 'seven',
            8: 'eight',
            9: 'nine'
        }
        self.assert_numbers_equal_to_strings(NUMBERS)

    def assert_numbers_equal_to_strings(self, numbers):
        for number, string in numbers.iteritems():
            self.assertEqual(string, self.n2w.convert(number))


if __name__ == '__main__':
    unittest.main()
import unittest

from number_to_words import NumberToWords


class TestNumberToWords(unittest.TestCase):

    def setUp(self):
        self.n2w = NumberToWords()

    def tearDown(self):
        self.n2w = None

    def test_zero_and_single_digits(self):
        NUMBERS = {
            0: 'zero',
            1: 'one',
            2: 'two',
            3: 'three',
            4: 'four',
            5: 'five',
            6: 'six',
            7: 'seven',
            8: 'eight',
            9: 'nine'
        }
        self.assert_numbers_equal_to_strings(NUMBERS)

    def test_eleven_to_nineteen(self):
        NUMBERS = {
            11: 'eleven',
            12: 'twelve',
            13: 'thirteen',
            14: 'fourteen',
            15: 'fifteen',
            16: 'sixteen',
            17: 'seventeen',
            18: 'eighteen',
            19: 'nineteen'
        }
        self.assert_numbers_equal_to_strings(NUMBERS)

    def assert_numbers_equal_to_strings(self, numbers):
        for number, string in numbers.iteritems():
            self.assertEqual(string, self.n2w.convert(number))


if __name__ == '__main__':
    unittest.main()
Add tests for numbers 11 to 19
Add tests for numbers 11 to 19
Python
mit
ianfieldhouse/number_to_words
python
## Code Before: import unittest from number_to_words import NumberToWords class TestNumberToWords(unittest.TestCase): def setUp(self): self.n2w = NumberToWords() def tearDown(self): self.n2w = None def test_zero_and_single_digits(self): NUMBERS = { 0: 'zero', 1: 'one', 2: 'two', 3: 'three', 4: 'four', 5: 'five', 6: 'six', 7: 'seven', 8: 'eight', 9: 'nine' } self.assert_numbers_equal_to_strings(NUMBERS) def assert_numbers_equal_to_strings(self, numbers): for number, string in numbers.iteritems(): self.assertEqual(string, self.n2w.convert(number)) if __name__ == '__main__': unittest.main() ## Instruction: Add tests for numbers 11 to 19 ## Code After: import unittest from number_to_words import NumberToWords class TestNumberToWords(unittest.TestCase): def setUp(self): self.n2w = NumberToWords() def tearDown(self): self.n2w = None def test_zero_and_single_digits(self): NUMBERS = { 0: 'zero', 1: 'one', 2: 'two', 3: 'three', 4: 'four', 5: 'five', 6: 'six', 7: 'seven', 8: 'eight', 9: 'nine' } self.assert_numbers_equal_to_strings(NUMBERS) def test_eleven_to_nineteen(self): NUMBERS = { 11: 'eleven', 12: 'twelve', 13: 'thirteen', 14: 'fourteen', 15: 'fifteen', 16: 'sixteen', 17: 'seventeen', 18: 'eighteen', 19: 'nineteen' } self.assert_numbers_equal_to_strings(NUMBERS) def assert_numbers_equal_to_strings(self, numbers): for number, string in numbers.iteritems(): self.assertEqual(string, self.n2w.convert(number)) if __name__ == '__main__': unittest.main()
e77b227b1bb30d78a3a8ddb1a2e88c72524b02ba
.travis.yml
.travis.yml
language: cpp

compiler:
  - gcc

sudo: false

addons:
  apt:
    packages:
      - cmake

install:
  - mkdir cmake
  - cd cmake
  - wget -qO- http://www.cmake.org/files/v3.1/cmake-3.1.0-Linux-x86_64.tar.gz | tar xvz
  - cd ..

script:
  - mkdir build
  - cd build
  - ../cmake/cmake-3.1.0-Linux-x86_64/bin/cmake ../
  - make -j4
  - ./HPCsim/HPCsim -s examples/Pi/libPi.so
  - cd ../
  - mkdir build_pilot
  - cd build_pilot
  - ../cmake/cmake-3.1.0-Linux-x86_64/bin/cmake -DUSE_PILOT_THREAD=1 ../
  - make -j4
  - ./HPCsim/HPCsim -s examples/Pi/libPi.so
  - ./examples/Pi/ComparePi ../build/HPCsim.out ./HPCsim.out
language: cpp

compiler:
  - gcc

sudo: false

addons:
  apt:
    packages:
      - cmake

install:
  - mkdir cmake
  - cd cmake
  - wget -qO- http://www.cmake.org/files/v3.1/cmake-3.1.0-Linux-x86_64.tar.gz | tar xvz
  - cd ..

script:
  - mkdir build
  - cd build
  - ../cmake/cmake-3.1.0-Linux-x86_64/bin/cmake ../
  - make -j4
  - ./HPCsim/HPCsim -e 100 -s examples/Pi/libPi.so
  - cd ../
  - mkdir build_pilot
  - cd build_pilot
  - ../cmake/cmake-3.1.0-Linux-x86_64/bin/cmake -DUSE_PILOT_THREAD=1 ../
  - make -j4
  - ./HPCsim/HPCsim -e 100 -s examples/Pi/libPi.so
  - ./examples/Pi/ComparePi ../build/HPCsim.out ./HPCsim.out
  - cd ../
  - cd build
  - rm -f ./HPCsim.out
  - ./HPCsim/HPCsim -e 50 -s examples/Pi/libPi.so
  - ./HPCsim/HPCsim -c -e 100 -s examples/Pi/libPi.so
  - ./examples/Pi/ComparePi ./HPCsim.out ../build_pilot/HPCsim.out
Extend the Travis CI test script for checkpoint mode
Extend the Travis CI test script for checkpoint mode
YAML
mit
HeisSpiter/HPCsim,HeisSpiter/HPCsim
yaml
## Code Before: language: cpp compiler: - gcc sudo: false addons: apt: packages: - cmake install: - mkdir cmake - cd cmake - wget -qO- http://www.cmake.org/files/v3.1/cmake-3.1.0-Linux-x86_64.tar.gz | tar xvz - cd .. script: - mkdir build - cd build - ../cmake/cmake-3.1.0-Linux-x86_64/bin/cmake ../ - make -j4 - ./HPCsim/HPCsim -s examples/Pi/libPi.so - cd ../ - mkdir build_pilot - cd build_pilot - ../cmake/cmake-3.1.0-Linux-x86_64/bin/cmake -DUSE_PILOT_THREAD=1 ../ - make -j4 - ./HPCsim/HPCsim -s examples/Pi/libPi.so - ./examples/Pi/ComparePi ../build/HPCsim.out ./HPCsim.out ## Instruction: Extend the Travis CI test script for checkpoint mode ## Code After: language: cpp compiler: - gcc sudo: false addons: apt: packages: - cmake install: - mkdir cmake - cd cmake - wget -qO- http://www.cmake.org/files/v3.1/cmake-3.1.0-Linux-x86_64.tar.gz | tar xvz - cd .. script: - mkdir build - cd build - ../cmake/cmake-3.1.0-Linux-x86_64/bin/cmake ../ - make -j4 - ./HPCsim/HPCsim -e 100 -s examples/Pi/libPi.so - cd ../ - mkdir build_pilot - cd build_pilot - ../cmake/cmake-3.1.0-Linux-x86_64/bin/cmake -DUSE_PILOT_THREAD=1 ../ - make -j4 - ./HPCsim/HPCsim -e 100 -s examples/Pi/libPi.so - ./examples/Pi/ComparePi ../build/HPCsim.out ./HPCsim.out - cd ../ - cd build - rm -f ./HPCsim.out - ./HPCsim/HPCsim -e 50 -s examples/Pi/libPi.so - ./HPCsim/HPCsim -c -e 100 -s examples/Pi/libPi.so - ./examples/Pi/ComparePi ./HPCsim.out ../build_pilot/HPCsim.out
2b038cf159ae69ef9f654c432e7206f717456b65
build.sh
build.sh
rm -f unicensd
mkdir -p out
cd out
cmake ..
make
mv unicensd ..
mv xml2struct ..
rm -f unicensd
mkdir -p out
cd out
cmake ..
make
mv unicensd ..
mv xml2struct ..
mv unicensc ..
Copy UNICENS client to working dir
Copy UNICENS client to working dir
Shell
bsd-3-clause
MicrochipTech/unicens-linux-daemon,MicrochipTech/unicens-linux-daemon
shell
## Code Before: rm -f unicensd mkdir -p out cd out cmake .. make mv unicensd .. mv xml2struct .. ## Instruction: Copy UNICENS client to working dir ## Code After: rm -f unicensd mkdir -p out cd out cmake .. make mv unicensd .. mv xml2struct .. mv unicensc ..
b5d0e3a9f18a7ad5100455544195c49bd1e90aae
roles/mysql-server/tasks/main.yml
roles/mysql-server/tasks/main.yml
---
- name: start mysql container
  docker_container:
    name: mysql-db
    image: "mysql:5.7"
    restart_policy: always
    env:
      MYSQL_ROOT_PASSWORD: "{{ mysql_root_password }}"
    volumes:
      - "/var/lib/mysql:/var/lib/mysql"
    network_mode: host
---
- name: start mysql container
  docker_container:
    name: mysql-db
    image: "mysql:5.7"
    restart_policy: always
    command: "--lower_case_table_names=1"
    env:
      MYSQL_ROOT_PASSWORD: "{{ mysql_root_password }}"
    volumes:
      - "/var/lib/mysql:/var/lib/mysql"
    network_mode: host
Set lower_case_table_names variable to 1
Set lower_case_table_names variable to 1
YAML
apache-2.0
osu-mist/ansible-roles
yaml
## Code Before: --- - name: start mysql container docker_container: name: mysql-db image: "mysql:5.7" restart_policy: always env: MYSQL_ROOT_PASSWORD: "{{ mysql_root_password }}" volumes: - "/var/lib/mysql:/var/lib/mysql" network_mode: host ## Instruction: Set lower_case_table_names variable to 1 ## Code After: --- - name: start mysql container docker_container: name: mysql-db image: "mysql:5.7" restart_policy: always command: "--lower_case_table_names=1" env: MYSQL_ROOT_PASSWORD: "{{ mysql_root_password }}" volumes: - "/var/lib/mysql:/var/lib/mysql" network_mode: host
df66a3bbe9cb5236fa4a9b920691d7b84eae3fe2
bin/rununittestsce.bat
bin/rununittestsce.bat
set QT=%1
set CETEST=%QT%\bin\cetest.exe
set CETEST_ARGS=-cache %QT%\.qmake.cache -libpath \Windows -f

%CETEST% %CETEST_ARGS% tests\auto\qvaluespace\qvaluespace.pro
%CETEST% %CETEST_ARGS% tests\auto\qvaluespaceprovider\qvaluespaceprovider.pro
%CETEST% %CETEST_ARGS% tests\auto\qvaluespacesubscriber\tst_qvaluespacesubscriber\tst_qvaluespacesubscriber.pro
%CETEST% %CETEST_ARGS% tests\auto\qvaluespacesubscriber\tst_qvaluespacesubscriber_oop\tst_qvaluespacesubscriber_oop.pro
%CETEST% %CETEST_ARGS% tests\auto\qsystemreadwritelock\qsystemreadwritelock\test\test.pro
%CETEST% %CETEST_ARGS% tests\auto\qsystemreadwritelock_oop\qsystemreadwritelock_oop\test\test.pro
%CETEST% %CETEST_ARGS% tests\auto\qcrmlparser\qcrmlparser.pro
set QT=%1
set CETEST=%QT%\bin\cetest.exe
set CETEST_ARGS=-cache %QT%\.qmake.cache -libpath \Windows -f

%CETEST% %CETEST_ARGS% tests\auto\qvaluespace\qvaluespace.pro
%CETEST% %CETEST_ARGS% tests\auto\qvaluespaceprovider\qvaluespaceprovider.pro
%CETEST% %CETEST_ARGS% tests\auto\qvaluespacesubscriber\tst_qvaluespacesubscriber\tst_qvaluespacesubscriber.pro
%CETEST% %CETEST_ARGS% tests\auto\qvaluespacesubscriber\tst_qvaluespacesubscriber_oop\tst_qvaluespacesubscriber_oop.pro
Remove unbuilt tests on Windows CE.
Remove unbuilt tests on Windows CE.
Batchfile
lgpl-2.1
kaltsi/qt-mobility,enthought/qt-mobility,tmcguire/qt-mobility,qtproject/qt-mobility,kaltsi/qt-mobility,enthought/qt-mobility,qtproject/qt-mobility,qtproject/qt-mobility,kaltsi/qt-mobility,tmcguire/qt-mobility,tmcguire/qt-mobility,tmcguire/qt-mobility,kaltsi/qt-mobility,kaltsi/qt-mobility,enthought/qt-mobility,KDE/android-qt-mobility,KDE/android-qt-mobility,enthought/qt-mobility,enthought/qt-mobility,qtproject/qt-mobility,kaltsi/qt-mobility,tmcguire/qt-mobility,qtproject/qt-mobility,KDE/android-qt-mobility,KDE/android-qt-mobility,qtproject/qt-mobility,enthought/qt-mobility
batchfile
## Code Before: set QT=%1 set CETEST=%QT%\bin\cetest.exe set CETEST_ARGS=-cache %QT%\.qmake.cache -libpath \Windows -f %CETEST% %CETEST_ARGS% tests\auto\qvaluespace\qvaluespace.pro %CETEST% %CETEST_ARGS% tests\auto\qvaluespaceprovider\qvaluespaceprovider.pro %CETEST% %CETEST_ARGS% tests\auto\qvaluespacesubscriber\tst_qvaluespacesubscriber\tst_qvaluespacesubscriber.pro %CETEST% %CETEST_ARGS% tests\auto\qvaluespacesubscriber\tst_qvaluespacesubscriber_oop\tst_qvaluespacesubscriber_oop.pro %CETEST% %CETEST_ARGS% tests\auto\qsystemreadwritelock\qsystemreadwritelock\test\test.pro %CETEST% %CETEST_ARGS% tests\auto\qsystemreadwritelock_oop\qsystemreadwritelock_oop\test\test.pro %CETEST% %CETEST_ARGS% tests\auto\qcrmlparser\qcrmlparser.pro ## Instruction: Remove unbuilt tests on Windows CE. ## Code After: set QT=%1 set CETEST=%QT%\bin\cetest.exe set CETEST_ARGS=-cache %QT%\.qmake.cache -libpath \Windows -f %CETEST% %CETEST_ARGS% tests\auto\qvaluespace\qvaluespace.pro %CETEST% %CETEST_ARGS% tests\auto\qvaluespaceprovider\qvaluespaceprovider.pro %CETEST% %CETEST_ARGS% tests\auto\qvaluespacesubscriber\tst_qvaluespacesubscriber\tst_qvaluespacesubscriber.pro %CETEST% %CETEST_ARGS% tests\auto\qvaluespacesubscriber\tst_qvaluespacesubscriber_oop\tst_qvaluespacesubscriber_oop.pro
7c3946b8a56a9f8a206a6a0ee57e5fc819556ec6
lib/active_model/one_time_password.rb
lib/active_model/one_time_password.rb
module ActiveModel
  module OneTimePassword
    extend ActiveSupport::Concern

    module ClassMethods
      def has_one_time_password(options = {})
        cattr_accessor :otp_column_name
        self.otp_column_name = (options[:column_name] || "otp_secret_key").to_s

        include InstanceMethodsOnActivation

        before_create { self.otp_column = ROTP::Base32.random_base32 }

        if respond_to?(:attributes_protected_by_default)
          def self.attributes_protected_by_default #:nodoc:
            super + [self.otp_column_name]
          end
        end
      end
    end

    module InstanceMethodsOnActivation
      def authenticate_otp(code, options = {})
        totp = ROTP::TOTP.new(self.otp_column)
        if drift = options[:drift]
          totp.verify_with_drift(code, drift)
        else
          totp.verify(code)
        end
      end

      def otp_code(time = Time.now)
        ROTP::TOTP.new(self.otp_column).at(time, true)
      end

      def provisioning_uri(account = nil)
        account ||= self.email if self.respond_to?(:email)
        ROTP::TOTP.new(self.otp_column).provisioning_uri(account)
      end

      def otp_column
        self.send(self.class.otp_column_name)
      end

      def otp_column=(attr)
        self.send("#{self.class.otp_column_name}=", attr)
      end
    end
  end
end
module ActiveModel
  module OneTimePassword
    extend ActiveSupport::Concern

    module ClassMethods
      def has_one_time_password(options = {})
        cattr_accessor :otp_column_name
        self.otp_column_name = (options[:column_name] || "otp_secret_key").to_s

        include InstanceMethodsOnActivation

        before_create { self.otp_column = ROTP::Base32.random_base32 }

        if respond_to?(:attributes_protected_by_default)
          def self.attributes_protected_by_default #:nodoc:
            super + [self.otp_column_name]
          end
        end
      end
    end

    module InstanceMethodsOnActivation
      def authenticate_otp(code, options = {})
        totp = ROTP::TOTP.new(self.otp_column)
        if drift = options[:drift]
          totp.verify_with_drift(code, drift)
        else
          totp.verify(code)
        end
      end

      def otp_code(time = Time.now, padded = true)
        ROTP::TOTP.new(self.otp_column).at(time, padded)
      end

      def provisioning_uri(account = nil)
        account ||= self.email if self.respond_to?(:email)
        ROTP::TOTP.new(self.otp_column).provisioning_uri(account)
      end

      def otp_column
        self.send(self.class.otp_column_name)
      end

      def otp_column=(attr)
        self.send("#{self.class.otp_column_name}=", attr)
      end
    end
  end
end
Make OTP code zero-padding configurable
Make OTP code zero-padding configurable
Ruby
mit
masarakki/active_model_otp,heapsource/active_model_otp
ruby
## Code Before: module ActiveModel module OneTimePassword extend ActiveSupport::Concern module ClassMethods def has_one_time_password(options = {}) cattr_accessor :otp_column_name self.otp_column_name = (options[:column_name] || "otp_secret_key").to_s include InstanceMethodsOnActivation before_create { self.otp_column = ROTP::Base32.random_base32 } if respond_to?(:attributes_protected_by_default) def self.attributes_protected_by_default #:nodoc: super + [self.otp_column_name] end end end end module InstanceMethodsOnActivation def authenticate_otp(code, options = {}) totp = ROTP::TOTP.new(self.otp_column) if drift = options[:drift] totp.verify_with_drift(code, drift) else totp.verify(code) end end def otp_code(time = Time.now) ROTP::TOTP.new(self.otp_column).at(time, true) end def provisioning_uri(account = nil) account ||= self.email if self.respond_to?(:email) ROTP::TOTP.new(self.otp_column).provisioning_uri(account) end def otp_column self.send(self.class.otp_column_name) end def otp_column=(attr) self.send("#{self.class.otp_column_name}=", attr) end end end end ## Instruction: Make OTP code zero-padding configurable ## Code After: module ActiveModel module OneTimePassword extend ActiveSupport::Concern module ClassMethods def has_one_time_password(options = {}) cattr_accessor :otp_column_name self.otp_column_name = (options[:column_name] || "otp_secret_key").to_s include InstanceMethodsOnActivation before_create { self.otp_column = ROTP::Base32.random_base32 } if respond_to?(:attributes_protected_by_default) def self.attributes_protected_by_default #:nodoc: super + [self.otp_column_name] end end end end module InstanceMethodsOnActivation def authenticate_otp(code, options = {}) totp = ROTP::TOTP.new(self.otp_column) if drift = options[:drift] totp.verify_with_drift(code, drift) else totp.verify(code) end end def otp_code(time = Time.now, padded = true) ROTP::TOTP.new(self.otp_column).at(time, padded) end def provisioning_uri(account = nil) account ||= self.email if self.respond_to?(:email) ROTP::TOTP.new(self.otp_column).provisioning_uri(account) end def otp_column self.send(self.class.otp_column_name) end def otp_column=(attr) self.send("#{self.class.otp_column_name}=", attr) end end end end
3d0af3a74d5f5f2eb7c3dd9e479e6186455a49fe
troposphere/static/js/public_site/main.js
troposphere/static/js/public_site/main.js
import bootstrapper from 'public_site/bootstrapper.react';
import Raven from 'raven-js';

let sentryDSN = 'https://27643f06676048be96ad6df686c17da3@app.getsentry.com/73366';
Raven.config(sentryDSN, {release: '6a34f86c188811e698eb0242ac130001'}).install();

bootstrapper.run();
import bootstrapper from 'public_site/bootstrapper.react';
import Raven from 'raven-js';

let sentryDSN = 'https://27643f06676048be96ad6df686c17da3@app.getsentry.com/73366';
Raven.config(sentryDSN, {release: 'dfdfd4c7765c8aa5e2e4e34ac3baea92431ca511'}).install();

bootstrapper.run();
Make Sentry release for public-site consistent
Make Sentry release for public-site consistent
JavaScript
apache-2.0
CCI-MOC/GUI-Frontend,CCI-MOC/GUI-Frontend,CCI-MOC/GUI-Frontend,CCI-MOC/GUI-Frontend,CCI-MOC/GUI-Frontend
javascript
## Code Before: import bootstrapper from 'public_site/bootstrapper.react'; import Raven from 'raven-js'; let sentryDSN = 'https://27643f06676048be96ad6df686c17da3@app.getsentry.com/73366'; Raven.config(sentryDSN, {release: '6a34f86c188811e698eb0242ac130001'}).install(); bootstrapper.run(); ## Instruction: Make Sentry release for public-site consistent ## Code After: import bootstrapper from 'public_site/bootstrapper.react'; import Raven from 'raven-js'; let sentryDSN = 'https://27643f06676048be96ad6df686c17da3@app.getsentry.com/73366'; Raven.config(sentryDSN, {release: 'dfdfd4c7765c8aa5e2e4e34ac3baea92431ca511'}).install(); bootstrapper.run();
8aa37d06b22ce151b1b6b631c7feb37285c3a93a
spec/serializers/payment_depot_serializer_spec.rb
spec/serializers/payment_depot_serializer_spec.rb
require "spec_helper" module Bitsy describe PaymentDepotSerializer do let(:payment_depot) do build_stubbed(:payment_depot, address: "89s8x8", min_payment: 2.0) end let(:serializer) { described_class.new(payment_depot) } subject(:json) do JSON.parse(serializer.to_json).with_indifferent_access[:payment_depot] end before do expect(payment_depot).to receive(:total_received_amount) { 19.0 } expect(payment_depot).to receive(:min_payment_received?) { true } end its([:id]) { should eq payment_depot.id } its([:total_received_amount]) { should eq 19.0 } its([:min_payment_received]) { should be_true } its([:address]) { should eq "89s8x8" } its([:min_payment]) { should eq "2.0" } end end
require "spec_helper" module Bitsy describe PaymentDepotSerializer do let(:payment_depot) do build_stubbed(:payment_depot, { address: "89s8x8", min_payment: 2.0, total_received_amount_cache: 19.0, }) end let(:serializer) { described_class.new(payment_depot) } subject(:json) do JSON.parse(serializer.to_json).with_indifferent_access[:payment_depot] end before do expect(payment_depot).to receive(:min_payment_received?) { true } end its([:id]) { should eq payment_depot.id } its([:total_received_amount]) { should eq "19.0" } its([:min_payment_received]) { should be_true } its([:address]) { should eq "89s8x8" } its([:min_payment]) { should eq "2.0" } end end
Set field to test that a string is returned
Set field to test that a string is returned
Ruby
mit
ramontayag/bitsy,dheeraj510/bitsy,ramontayag/bitsy,dheeraj510/bitsy,dheeraj510/bitsy,ramontayag/bitsy
ruby
## Code Before: require "spec_helper" module Bitsy describe PaymentDepotSerializer do let(:payment_depot) do build_stubbed(:payment_depot, address: "89s8x8", min_payment: 2.0) end let(:serializer) { described_class.new(payment_depot) } subject(:json) do JSON.parse(serializer.to_json).with_indifferent_access[:payment_depot] end before do expect(payment_depot).to receive(:total_received_amount) { 19.0 } expect(payment_depot).to receive(:min_payment_received?) { true } end its([:id]) { should eq payment_depot.id } its([:total_received_amount]) { should eq 19.0 } its([:min_payment_received]) { should be_true } its([:address]) { should eq "89s8x8" } its([:min_payment]) { should eq "2.0" } end end ## Instruction: Set field to test that a string is returned ## Code After: require "spec_helper" module Bitsy describe PaymentDepotSerializer do let(:payment_depot) do build_stubbed(:payment_depot, { address: "89s8x8", min_payment: 2.0, total_received_amount_cache: 19.0, }) end let(:serializer) { described_class.new(payment_depot) } subject(:json) do JSON.parse(serializer.to_json).with_indifferent_access[:payment_depot] end before do expect(payment_depot).to receive(:min_payment_received?) { true } end its([:id]) { should eq payment_depot.id } its([:total_received_amount]) { should eq "19.0" } its([:min_payment_received]) { should be_true } its([:address]) { should eq "89s8x8" } its([:min_payment]) { should eq "2.0" } end end
9d564a9efd736192955040ca6e9314c6de692a28
project/plugins.sbt
project/plugins.sbt
libraryDependencies <+= sbtVersion(v => "com.github.siasia" %% "xsbt-web-plugin" % (v+"-0.2.11"))

addSbtPlugin("me.lessis" % "coffeescripted-sbt" % "0.2.1")
libraryDependencies <+= sbtVersion(v => "com.github.siasia" %% "xsbt-web-plugin" % (v+"-0.2.11"))

resolvers += Resolver.url("sbt-plugin-releases", new URL("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases/"))(Resolver.ivyStylePatterns)

addSbtPlugin("me.lessis" % "coffeescripted-sbt" % "0.2.2")
Update des coffeescripted-sbt-Plugins auf die Version 0.2.2
Update des coffeescripted-sbt-Plugins auf die Version 0.2.2
Scala
apache-2.0
abertram/AntScout,abertram/AntScout
scala
## Code Before: libraryDependencies <+= sbtVersion(v => "com.github.siasia" %% "xsbt-web-plugin" % (v+"-0.2.11")) addSbtPlugin("me.lessis" % "coffeescripted-sbt" % "0.2.1") ## Instruction: Update des coffeescripted-sbt-Plugins auf die Version 0.2.2 ## Code After: libraryDependencies <+= sbtVersion(v => "com.github.siasia" %% "xsbt-web-plugin" % (v+"-0.2.11")) resolvers += Resolver.url("sbt-plugin-releases", new URL("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases/"))(Resolver.ivyStylePatterns) addSbtPlugin("me.lessis" % "coffeescripted-sbt" % "0.2.2")
485f1e8443bee8dd5f7116112f98717edbc49a08
examples/ably.js
examples/ably.js
'use strict';

require.config({
    paths: {
        Ably: '/ably.min'
    }
});

define(['Ably'], function(Ably) {
    var ably = new Ably();
    return ably;
});
'use strict';

require.config({
    paths: {
        Ably: '/ably.min'
    }
});

/* global define */
define(['Ably'], function(Ably) {
    var ably = new Ably();
    return ably;
});
Allow "define" in examples (no-undef)
Allow "define" in examples (no-undef)
JavaScript
mit
vgno/ably
javascript
## Code Before: 'use strict'; require.config({ paths: { Ably: '/ably.min' } }); define(['Ably'], function(Ably) { var ably = new Ably(); return ably; }); ## Instruction: Allow "define" in examples (no-undef) ## Code After: 'use strict'; require.config({ paths: { Ably: '/ably.min' } }); /* global define */ define(['Ably'], function(Ably) { var ably = new Ably(); return ably; });
8b15af14528717bf68f109b21b89592effd14079
lib/metriks_log_webhook/app.rb
lib/metriks_log_webhook/app.rb
require 'sinatra' require 'yajl' require 'active_support/core_ext/hash' require 'dalli' require 'metriks_log_webhook/librato_metrics' require 'metriks_log_webhook/metrik_log_parser' require 'metriks_log_webhook/metric_list' module MetriksLogWebhook class App < Sinatra::Base configure do set :root, ENV['RACK_ROOT'] || RACK_ROOT set :app_file, __FILE__ set :metrik_prefix, ENV['METRIK_PREFIX'] || 'metriks:' set :metric_interval, 60 set :cache, lambda { Dalli::Client.new } set :metrics_client, LibratoMetrics.new(ENV['METRICS_EMAIL'], ENV['METRICS_TOKEN']) end get '/' do 'hello' end post '/submit' do payload = HashWithIndifferentAccess.new(Yajl::Parser.parse(params[:payload])) parser = MetrikLogParser.new(settings.metrik_prefix) metric_list = MetricList.new(settings.cache, settings.metric_interval) payload[:events].each do |event| if data = parser.parse(event[:message]) data[:source] ||= event[:hostname] metric_list.add(data) end end metric_list.save body = metric_list.to_hash settings.metrics_client.submit(body) end end end
require 'sinatra' require 'yajl' require 'active_support/core_ext/hash' require 'dalli' require 'metriks_log_webhook/librato_metrics' require 'metriks_log_webhook/metrik_log_parser' require 'metriks_log_webhook/metric_list' module MetriksLogWebhook class App < Sinatra::Base configure do set :root, ENV['RACK_ROOT'] || RACK_ROOT set :app_file, __FILE__ set :metrik_prefix, ENV['METRIK_PREFIX'] || 'metriks:' set :metric_interval, 60 set :cache, lambda { Dalli::Client.new } set :metrics_client, LibratoMetrics.new(ENV['METRICS_EMAIL'], ENV['METRICS_TOKEN']) end get '/' do 'hello' end post '/submit' do payload = HashWithIndifferentAccess.new(Yajl::Parser.parse(params[:payload])) parser = MetrikLogParser.new(settings.metrik_prefix) metric_list = MetricList.new(settings.cache, settings.metric_interval) payload[:events].each do |event| if data = parser.parse(event[:message]) data[:source] ||= event[:hostname] metric_list.add(data) end end metric_list.save body = metric_list.to_hash settings.metrics_client.submit(body) 'ok' end end end
Return 'ok' in the webhook
Return 'ok' in the webhook
Ruby
mit
eric/metriks_log_webhook
ruby
## Code Before: require 'sinatra' require 'yajl' require 'active_support/core_ext/hash' require 'dalli' require 'metriks_log_webhook/librato_metrics' require 'metriks_log_webhook/metrik_log_parser' require 'metriks_log_webhook/metric_list' module MetriksLogWebhook class App < Sinatra::Base configure do set :root, ENV['RACK_ROOT'] || RACK_ROOT set :app_file, __FILE__ set :metrik_prefix, ENV['METRIK_PREFIX'] || 'metriks:' set :metric_interval, 60 set :cache, lambda { Dalli::Client.new } set :metrics_client, LibratoMetrics.new(ENV['METRICS_EMAIL'], ENV['METRICS_TOKEN']) end get '/' do 'hello' end post '/submit' do payload = HashWithIndifferentAccess.new(Yajl::Parser.parse(params[:payload])) parser = MetrikLogParser.new(settings.metrik_prefix) metric_list = MetricList.new(settings.cache, settings.metric_interval) payload[:events].each do |event| if data = parser.parse(event[:message]) data[:source] ||= event[:hostname] metric_list.add(data) end end metric_list.save body = metric_list.to_hash settings.metrics_client.submit(body) end end end ## Instruction: Return 'ok' in the webhook ## Code After: require 'sinatra' require 'yajl' require 'active_support/core_ext/hash' require 'dalli' require 'metriks_log_webhook/librato_metrics' require 'metriks_log_webhook/metrik_log_parser' require 'metriks_log_webhook/metric_list' module MetriksLogWebhook class App < Sinatra::Base configure do set :root, ENV['RACK_ROOT'] || RACK_ROOT set :app_file, __FILE__ set :metrik_prefix, ENV['METRIK_PREFIX'] || 'metriks:' set :metric_interval, 60 set :cache, lambda { Dalli::Client.new } set :metrics_client, LibratoMetrics.new(ENV['METRICS_EMAIL'], ENV['METRICS_TOKEN']) end get '/' do 'hello' end post '/submit' do payload = HashWithIndifferentAccess.new(Yajl::Parser.parse(params[:payload])) parser = MetrikLogParser.new(settings.metrik_prefix) metric_list = MetricList.new(settings.cache, settings.metric_interval) payload[:events].each do |event| if data = parser.parse(event[:message]) data[:source] ||= event[:hostname] metric_list.add(data) end end metric_list.save body = metric_list.to_hash settings.metrics_client.submit(body) 'ok' end end end
131db4b8dfa26eab9185fba3b58cee6c8b0cfbff
src/jquery.turbolinks.coffee
src/jquery.turbolinks.coffee
jquery.turbolinks.js ~ v1.0.0-rc1 ~ https://github.com/kossnocorp/jquery.turbolinks jQuery plugin for drop-in fix binded events problem caused by Turbolinks The MIT License Copyright (c) 2012 Sasha Koss ### $ = require?('jquery') || window.jQuery # List for store callbacks passed to `$` or `$.ready` callbacks = [] # Call each callback in list ready = -> callback($) for callback in callbacks # Turbolinks ready event turbolinksReady = -> $.isReady = true ready() # Fetch event handler fetch = -> $(document).off(undefined, '**') $.isReady = false # Bind `ready` to DOM ready event $(ready) # Store callbacks in list on `$` and `$.ready` $.fn.ready = (callback) -> callbacks.push(callback) callback($) if $.isReady # Bind ready to passed event $.setReadyEvent = (event) -> $(document) .off('.turbolinks-ready') .on(event + '.turbolinks-ready', turbolinksReady) # Bind fetch event $.setFetchEvent = (event) -> $(document) .off('.turbolinks-fetch') .on(event + '.turbolinks-fetch', fetch) # Bind `ready` to Tubolinks page:load $.setReadyEvent('page:load') # Bind fetch to Turbolinks page:fetch $.setFetchEvent('page:fetch')
jquery.turbolinks.js ~ v1.0.0-rc1 ~ https://github.com/kossnocorp/jquery.turbolinks jQuery plugin for drop-in fix binded events problem caused by Turbolinks The MIT License Copyright (c) 2012 Sasha Koss ### $ = window.jQuery or require?('jquery') # List for store callbacks passed to `$` or `$.ready` callbacks = [] # Call each callback in list ready = -> callback($) for callback in callbacks # Turbolinks ready event turbolinksReady = -> $.isReady = true ready() # Fetch event handler fetch = -> $(document).off(undefined, '**') $.isReady = false # Bind `ready` to DOM ready event $(ready) # Store callbacks in list on `$` and `$.ready` $.fn.ready = (callback) -> callbacks.push(callback) callback($) if $.isReady # Bind ready to passed event $.setReadyEvent = (event) -> $(document) .off('.turbolinks-ready') .on(event + '.turbolinks-ready', turbolinksReady) # Bind fetch event $.setFetchEvent = (event) -> $(document) .off('.turbolinks-fetch') .on(event + '.turbolinks-fetch', fetch) # Bind `ready` to Tubolinks page:load $.setReadyEvent('page:load') # Bind fetch to Turbolinks page:fetch $.setFetchEvent('page:fetch')
Load jQuery from global namespace first.
Load jQuery from global namespace first. This fixes the cases wherein you may be using an AMD system, but load jQuery outside AMD.
CoffeeScript
mit
echobobby/jquery.turbolinks,kossnocorp/jquery.turbolinks
coffeescript
## Code Before: jquery.turbolinks.js ~ v1.0.0-rc1 ~ https://github.com/kossnocorp/jquery.turbolinks jQuery plugin for drop-in fix binded events problem caused by Turbolinks The MIT License Copyright (c) 2012 Sasha Koss ### $ = require?('jquery') || window.jQuery # List for store callbacks passed to `$` or `$.ready` callbacks = [] # Call each callback in list ready = -> callback($) for callback in callbacks # Turbolinks ready event turbolinksReady = -> $.isReady = true ready() # Fetch event handler fetch = -> $(document).off(undefined, '**') $.isReady = false # Bind `ready` to DOM ready event $(ready) # Store callbacks in list on `$` and `$.ready` $.fn.ready = (callback) -> callbacks.push(callback) callback($) if $.isReady # Bind ready to passed event $.setReadyEvent = (event) -> $(document) .off('.turbolinks-ready') .on(event + '.turbolinks-ready', turbolinksReady) # Bind fetch event $.setFetchEvent = (event) -> $(document) .off('.turbolinks-fetch') .on(event + '.turbolinks-fetch', fetch) # Bind `ready` to Tubolinks page:load $.setReadyEvent('page:load') # Bind fetch to Turbolinks page:fetch $.setFetchEvent('page:fetch') ## Instruction: Load jQuery from global namespace first. This fixes the cases wherein you may be using an AMD system, but load jQuery outside AMD. ## Code After: jquery.turbolinks.js ~ v1.0.0-rc1 ~ https://github.com/kossnocorp/jquery.turbolinks jQuery plugin for drop-in fix binded events problem caused by Turbolinks The MIT License Copyright (c) 2012 Sasha Koss ### $ = window.jQuery or require?('jquery') # List for store callbacks passed to `$` or `$.ready` callbacks = [] # Call each callback in list ready = -> callback($) for callback in callbacks # Turbolinks ready event turbolinksReady = -> $.isReady = true ready() # Fetch event handler fetch = -> $(document).off(undefined, '**') $.isReady = false # Bind `ready` to DOM ready event $(ready) # Store callbacks in list on `$` and `$.ready` $.fn.ready = (callback) -> callbacks.push(callback) callback($) if $.isReady # Bind ready to passed event $.setReadyEvent = (event) -> $(document) .off('.turbolinks-ready') .on(event + '.turbolinks-ready', turbolinksReady) # Bind fetch event $.setFetchEvent = (event) -> $(document) .off('.turbolinks-fetch') .on(event + '.turbolinks-fetch', fetch) # Bind `ready` to Tubolinks page:load $.setReadyEvent('page:load') # Bind fetch to Turbolinks page:fetch $.setFetchEvent('page:fetch')
5e9c9933d450c740b3deeb08b216ad0f6b442653
libraries/WebSocket.elm
libraries/WebSocket.elm
-- A library for low latency HTTP communication. See the HTTP library for -- standard requests like GET, POST, etc. module WebSocket where import Signal (Signal) import Basics (String) import Native.WebSocket -- Create a web-socket. The first argument is the URL of the desired -- web-socket server. The input signal holds the outgoing messages, -- and the resulting signal contains the incoming ones. connect : String -> Signal String -> Signal String connect = Native.WebSocket.connect -- data Action = Open String | Close String | Send String String -- connections : Signal Action -> Signal String
module WebSocket where {-| A library for low latency HTTP communication. See the HTTP library for standard requests like GET, POST, etc. -} import Signal (Signal) import Basics (String) import Native.WebSocket {-| Create a web-socket. The first argument is the URL of the desired web-socket server. The input signal holds the outgoing messages, and the resulting signal contains the incoming ones. -} connect : String -> Signal String -> Signal String connect = Native.WebSocket.connect -- data Action = Open String | Close String | Send String String -- connections : Signal Action -> Signal String
Update to new doc format
Update to new doc format
Elm
bsd-3-clause
5outh/Elm,JoeyEremondi/elm-pattern-effects,pairyo/elm-compiler,Axure/elm-compiler,deadfoxygrandpa/Elm,mgold/Elm,laszlopandy/elm-compiler
elm
## Code Before: -- A library for low latency HTTP communication. See the HTTP library for -- standard requests like GET, POST, etc. module WebSocket where import Signal (Signal) import Basics (String) import Native.WebSocket -- Create a web-socket. The first argument is the URL of the desired -- web-socket server. The input signal holds the outgoing messages, -- and the resulting signal contains the incoming ones. connect : String -> Signal String -> Signal String connect = Native.WebSocket.connect -- data Action = Open String | Close String | Send String String -- connections : Signal Action -> Signal String ## Instruction: Update to new doc format ## Code After: module WebSocket where {-| A library for low latency HTTP communication. See the HTTP library for standard requests like GET, POST, etc. -} import Signal (Signal) import Basics (String) import Native.WebSocket {-| Create a web-socket. The first argument is the URL of the desired web-socket server. The input signal holds the outgoing messages, and the resulting signal contains the incoming ones. -} connect : String -> Signal String -> Signal String connect = Native.WebSocket.connect -- data Action = Open String | Close String | Send String String -- connections : Signal Action -> Signal String
5a0cff75f7def577d33595441c1af97296f7e85a
spec/lib/raml_models/method_spec.rb
spec/lib/raml_models/method_spec.rb
RSpec.describe Rambo::RamlModels::Method do let(:raml_file) { File.expand_path("../../../support/foo.raml", __FILE__) } let(:raml) { Raml::Parser.parse_file(raml_file) } let(:method) { raml.resources.first.methods.first } subject { described_class.new(method) } describe "#to_s" do it "returns the method name" do expect(subject.method).to eql method.method end end describe "#description" do it "returns the description" do expect(subject.description).to eql method.description end end describe "#request_body" do it "returns a request body" do expect(subject.request_body).to be_a Rambo::RamlModels::Body end end describe "#responses" do it "returns an array of Response objects" do all_are_responses = subject.responses.all? {|resp| resp.is_a?(Rambo::RamlModels::Response) } expect(all_are_responses).to be true end end end
RSpec.describe Rambo::RamlModels::Method do let(:raml_file) { File.expand_path("../../../support/foo.raml", __FILE__) } let(:raml) { Raml::Parser.parse_file(raml_file) } let(:method) { raml.resources.first.methods.first } subject { described_class.new(method) } describe "#to_s" do it "returns the method name" do expect(subject.method).to eql method.method end end describe "#description" do it "returns the description" do expect(subject.description).to eql method.description end end describe "#responses" do it "returns an array of Response objects" do all_are_responses = subject.responses.all? {|resp| resp.is_a?(Rambo::RamlModels::Response) } expect(all_are_responses).to be true end end end
Delete that new test because it fucked shit up
Delete that new test because it fucked shit up
Ruby
mit
danascheider/rambo,danascheider/rambo
ruby
## Code Before: RSpec.describe Rambo::RamlModels::Method do let(:raml_file) { File.expand_path("../../../support/foo.raml", __FILE__) } let(:raml) { Raml::Parser.parse_file(raml_file) } let(:method) { raml.resources.first.methods.first } subject { described_class.new(method) } describe "#to_s" do it "returns the method name" do expect(subject.method).to eql method.method end end describe "#description" do it "returns the description" do expect(subject.description).to eql method.description end end describe "#request_body" do it "returns a request body" do expect(subject.request_body).to be_a Rambo::RamlModels::Body end end describe "#responses" do it "returns an array of Response objects" do all_are_responses = subject.responses.all? {|resp| resp.is_a?(Rambo::RamlModels::Response) } expect(all_are_responses).to be true end end end ## Instruction: Delete that new test because it fucked shit up ## Code After: RSpec.describe Rambo::RamlModels::Method do let(:raml_file) { File.expand_path("../../../support/foo.raml", __FILE__) } let(:raml) { Raml::Parser.parse_file(raml_file) } let(:method) { raml.resources.first.methods.first } subject { described_class.new(method) } describe "#to_s" do it "returns the method name" do expect(subject.method).to eql method.method end end describe "#description" do it "returns the description" do expect(subject.description).to eql method.description end end describe "#responses" do it "returns an array of Response objects" do all_are_responses = subject.responses.all? {|resp| resp.is_a?(Rambo::RamlModels::Response) } expect(all_are_responses).to be true end end end
d2c390bb5a49bbd6b1ea472886fd812f793481fe
app/controllers/api/braintree_controller.rb
app/controllers/api/braintree_controller.rb
class Api::BraintreeController < ApplicationController def token render json: { token: ::Braintree::ClientToken.generate } end def transaction result = braintree::Transaction.make_transaction(transaction_options) if result.success? render json: { success: true, transaction_id: result.transaction.id } else render json: { success: false, errors: result.errors } end end def subscription result = braintree::Subscription.make_subscription(subscription_options) if result.success? render json: { success: true, subscription_id: result.subscription.id } else render json: { success: false, errors: result.errors.for(:subscription) } end end private def transaction_options { nonce: params[:payment_method_nonce], user: params[:user], amount: params[:amount].to_f, store: Payment } end def subscription_options { amount: params[:amount].to_f, plan_id: '35wm', payment_method_token: default_payment_method_token } end def braintree PaymentProcessor::Clients::Braintree end def default_payment_method_token @token ||= ::Payment.customer(params[:email]).try(:card_vault_token) end end
class Api::BraintreeController < ApplicationController def token render json: { token: ::Braintree::ClientToken.generate } end def transaction result = braintree::Transaction.make_transaction(transaction_options) if result.success? render json: { success: true, transaction_id: result.transaction.id } else render json: { success: false, errors: result.errors } end end def subscription result = braintree::Subscription.make_subscription(subscription_options) if result.success? render json: { success: true, subscription_id: result.subscription.id } else # Subscription-specific errors can be accessed with results.errors.for(:subscription), # but that would not include errors due to invalid parameters. # Subscription-specific errors are raised e.g. if there's an attempt to update a deleted subscription. render json: { success: false, errors: result.errors } end end private def transaction_options { nonce: params[:payment_method_nonce], user: params[:user], amount: params[:amount].to_f, store: Payment } end def subscription_options { amount: params[:amount].to_f, plan_id: '35wm', payment_method_token: default_payment_method_token } end def braintree PaymentProcessor::Clients::Braintree end def default_payment_method_token @token ||= ::Payment.customer(params[:email]).try(:card_vault_token) end end
Return all errors for subscriptions, instead of only subscription-specific ones
Return all errors for subscriptions, instead of only subscription-specific ones
Ruby
mit
SumOfUs/Champaign,SumOfUs/Champaign,SumOfUs/Champaign,SumOfUs/Champaign,SumOfUs/Champaign
ruby
## Code Before: class Api::BraintreeController < ApplicationController def token render json: { token: ::Braintree::ClientToken.generate } end def transaction result = braintree::Transaction.make_transaction(transaction_options) if result.success? render json: { success: true, transaction_id: result.transaction.id } else render json: { success: false, errors: result.errors } end end def subscription result = braintree::Subscription.make_subscription(subscription_options) if result.success? render json: { success: true, subscription_id: result.subscription.id } else render json: { success: false, errors: result.errors.for(:subscription) } end end private def transaction_options { nonce: params[:payment_method_nonce], user: params[:user], amount: params[:amount].to_f, store: Payment } end def subscription_options { amount: params[:amount].to_f, plan_id: '35wm', payment_method_token: default_payment_method_token } end def braintree PaymentProcessor::Clients::Braintree end def default_payment_method_token @token ||= ::Payment.customer(params[:email]).try(:card_vault_token) end end ## Instruction: Return all errors for subscriptions, instead of only subscription-specific ones ## Code After: class Api::BraintreeController < ApplicationController def token render json: { token: ::Braintree::ClientToken.generate } end def transaction result = braintree::Transaction.make_transaction(transaction_options) if result.success? render json: { success: true, transaction_id: result.transaction.id } else render json: { success: false, errors: result.errors } end end def subscription result = braintree::Subscription.make_subscription(subscription_options) if result.success? render json: { success: true, subscription_id: result.subscription.id } else # Subscription-specific errors can be accessed with results.errors.for(:subscription), # but that would not include errors due to invalid parameters. # Subscription-specific errors are raised e.g. if there's an attempt to update a deleted subscription. render json: { success: false, errors: result.errors } end end private def transaction_options { nonce: params[:payment_method_nonce], user: params[:user], amount: params[:amount].to_f, store: Payment } end def subscription_options { amount: params[:amount].to_f, plan_id: '35wm', payment_method_token: default_payment_method_token } end def braintree PaymentProcessor::Clients::Braintree end def default_payment_method_token @token ||= ::Payment.customer(params[:email]).try(:card_vault_token) end end
b763284733492cbd81cf0266407501dc83c71086
.travis.yml
.travis.yml
language: python cache: pip python: - 2.7 - 3.3 - 3.4 - 3.5 - 3.6 - nightly sudo: false addons: apt: packages: - oracle-java8-set-default before_install: - pip install --upgrade setuptools install: - python bootstrap.py - sed -ir "s/SQLAlchemy.*/SQLAlchemy = ${SA_VERSION}/g" versions.cfg - bin/buildout -c base.cfg env: - SA_VERSION=1.0.16 - SA_VERSION=1.1.4 matrix: allow_failures: - python: nightly before_script: - bin/flake8 --ignore=E,C901,F401,F821 --count src script: - bin/coverage run bin/test -vv1 after_success: - pip install coveralls - coveralls notifications: email: false
language: python cache: pip python: - 2.7 - 3.3 - 3.4 - 3.5 - 3.6 - nightly sudo: false addons: apt: packages: - oracle-java8-set-default before_install: # clean existing dependencies to avoid clashes - pip freeze | xargs pip uninstall -y - pip install --upgrade setuptools install: - python bootstrap.py - sed -ir "s/SQLAlchemy.*/SQLAlchemy = ${SA_VERSION}/g" versions.cfg - bin/buildout -c base.cfg env: - SA_VERSION=1.0.16 - SA_VERSION=1.1.4 matrix: allow_failures: - python: nightly before_script: - bin/flake8 --ignore=E,C901,F401,F821 --count src script: - bin/coverage run bin/test -vv1 after_success: - pip install coveralls - coveralls notifications: email: false
Clean Travis pre-setup dependencies from virtualenv
Clean Travis pre-setup dependencies from virtualenv
YAML
apache-2.0
crate/crate-python,crate/crate-python
yaml
## Code Before: language: python cache: pip python: - 2.7 - 3.3 - 3.4 - 3.5 - 3.6 - nightly sudo: false addons: apt: packages: - oracle-java8-set-default before_install: - pip install --upgrade setuptools install: - python bootstrap.py - sed -ir "s/SQLAlchemy.*/SQLAlchemy = ${SA_VERSION}/g" versions.cfg - bin/buildout -c base.cfg env: - SA_VERSION=1.0.16 - SA_VERSION=1.1.4 matrix: allow_failures: - python: nightly before_script: - bin/flake8 --ignore=E,C901,F401,F821 --count src script: - bin/coverage run bin/test -vv1 after_success: - pip install coveralls - coveralls notifications: email: false ## Instruction: Clean Travis pre-setup dependencies from virtualenv ## Code After: language: python cache: pip python: - 2.7 - 3.3 - 3.4 - 3.5 - 3.6 - nightly sudo: false addons: apt: packages: - oracle-java8-set-default before_install: # clean existing dependencies to avoid clashes - pip freeze | xargs pip uninstall -y - pip install --upgrade setuptools install: - python bootstrap.py - sed -ir "s/SQLAlchemy.*/SQLAlchemy = ${SA_VERSION}/g" versions.cfg - bin/buildout -c base.cfg env: - SA_VERSION=1.0.16 - SA_VERSION=1.1.4 matrix: allow_failures: - python: nightly before_script: - bin/flake8 --ignore=E,C901,F401,F821 --count src script: - bin/coverage run bin/test -vv1 after_success: - pip install coveralls - coveralls notifications: email: false
4f7cbbc503f034531adedaecee5f41c109fd0ccd
leagues/templates/leagues/list_available_leagues.html
leagues/templates/leagues/list_available_leagues.html
{% extends "base.html" %} {% block content %} {% if leagues %} {% for league in leagues %} <strong>{{ league.0.league_name }} - {{ league.1 }}</strong><a href="/leagues/join_league/{{ league.id }}/">Join</a> {% endfor %} {% else %} <p>No available leagues.</p> {% endif %} {% endblock %}
{% extends "base.html" %} {% block content %} {% if leagues %} <table id="standings_table" class="table table-striped table-bordered"> <thead> <tr> <th>League name</th> <th>Spots left</th> <th></th> </tr> </thead> <tbody> {% for league in leagues %} <tr> <td>{{ league.0.league_name }}</td> <td>{{ league.1 }}</td> <td><a href="/leagues/join_league/{{ league.id }}/">Join</a></td> </tr> {% endfor %} </tbody> </table> {% else %} <p>No available leagues.</p> {% endif %} {% endblock %}
Add table around available leagues
Add table around available leagues
HTML
mit
leventebakos/football-ech,leventebakos/football-ech
html
## Code Before: {% extends "base.html" %} {% block content %} {% if leagues %} {% for league in leagues %} <strong>{{ league.0.league_name }} - {{ league.1 }}</strong><a href="/leagues/join_league/{{ league.id }}/">Join</a> {% endfor %} {% else %} <p>No available leagues.</p> {% endif %} {% endblock %} ## Instruction: Add table around available leagues ## Code After: {% extends "base.html" %} {% block content %} {% if leagues %} <table id="standings_table" class="table table-striped table-bordered"> <thead> <tr> <th>League name</th> <th>Spots left</th> <th></th> </tr> </thead> <tbody> {% for league in leagues %} <tr> <td>{{ league.0.league_name }}</td> <td>{{ league.1 }}</td> <td><a href="/leagues/join_league/{{ league.id }}/">Join</a></td> </tr> {% endfor %} </tbody> </table> {% else %} <p>No available leagues.</p> {% endif %} {% endblock %}
1785e4cf9e0036dab95b751e3302e61b861a9e82
spec/rest/verix/verix_pdf_documents_spec.rb
spec/rest/verix/verix_pdf_documents_spec.rb
require 'spec_helpers/client' require 'rest/api_request' RSpec.describe FinApps::REST::VerixPdfDocuments do include SpecHelpers::Client let(:api_client) { client } let(:document) { described_class.new(api_client) } describe '#show' do context 'when missing parameters' do subject { document.show(:record_id, nil) } it 'raises an error when missing record id' do expect { subject }.to raise_error(FinAppsCore::MissingArgumentsError) end it 'raises an error when missing provider id' do expect { subject }.to raise_error(FinAppsCore::MissingArgumentsError) end end subject(:show) do document.show( :record_id, :provider_id ) end it_behaves_like 'an API request' it_behaves_like 'a successful request' end end
require 'spec_helpers/client' require 'rest/api_request' RSpec.describe FinApps::REST::VerixPdfDocuments do include SpecHelpers::Client let(:api_client) { client } let(:document) { described_class.new(api_client) } describe '#show' do subject { document.show(:record_id, :provider_id) } it_behaves_like 'an API request' it_behaves_like 'a successful request' context 'when missing record_id' do subject(:show) { document.show(nil, :provider_id) } it { expect { show }.to raise_error(FinAppsCore::MissingArgumentsError) } end context 'when missing provider_id' do subject(:show) { document.show(:record_id, nil) } it { expect { show }.to raise_error(FinAppsCore::MissingArgumentsError) } end end end
Fix issue created by Rubocop?
Fix issue created by Rubocop?
Ruby
mit
finapps/ruby-client
ruby
## Code Before: require 'spec_helpers/client' require 'rest/api_request' RSpec.describe FinApps::REST::VerixPdfDocuments do include SpecHelpers::Client let(:api_client) { client } let(:document) { described_class.new(api_client) } describe '#show' do context 'when missing parameters' do subject { document.show(:record_id, nil) } it 'raises an error when missing record id' do expect { subject }.to raise_error(FinAppsCore::MissingArgumentsError) end it 'raises an error when missing provider id' do expect { subject }.to raise_error(FinAppsCore::MissingArgumentsError) end end subject(:show) do document.show( :record_id, :provider_id ) end it_behaves_like 'an API request' it_behaves_like 'a successful request' end end ## Instruction: Fix issue created by Rubocop? ## Code After: require 'spec_helpers/client' require 'rest/api_request' RSpec.describe FinApps::REST::VerixPdfDocuments do include SpecHelpers::Client let(:api_client) { client } let(:document) { described_class.new(api_client) } describe '#show' do subject { document.show(:record_id, :provider_id) } it_behaves_like 'an API request' it_behaves_like 'a successful request' context 'when missing record_id' do subject(:show) { document.show(nil, :provider_id) } it { expect { show }.to raise_error(FinAppsCore::MissingArgumentsError) } end context 'when missing provider_id' do subject(:show) { document.show(:record_id, nil) } it { expect { show }.to raise_error(FinAppsCore::MissingArgumentsError) } end end end
a8321824ba18e1655167b216d3c03748673cb1e9
app/routes/dashboard.js
app/routes/dashboard.js
import Ember from 'ember'; export default Ember.Route.extend({ renderTemplate() { this.render( 'admin', { controller: 'dashboard' }); } });
import Ember from 'ember'; export default Ember.Route.extend({ activate() { $( 'body' ).removeClass( 'login' ) }, renderTemplate() { this.render( 'admin', { controller: 'dashboard' }); } });
Remove "login" class after transition
Remove "login" class after transition
JavaScript
mit
kunni80/server,muffin/server
javascript
## Code Before: import Ember from 'ember'; export default Ember.Route.extend({ renderTemplate() { this.render( 'admin', { controller: 'dashboard' }); } }); ## Instruction: Remove "login" class after transition ## Code After: import Ember from 'ember'; export default Ember.Route.extend({ activate() { $( 'body' ).removeClass( 'login' ) }, renderTemplate() { this.render( 'admin', { controller: 'dashboard' }); } });
9631ebfc45981e9b6ca1328d57e81c170c10e6fa
data/vesselinfo/vesselinfo-bulkloader.yaml
data/vesselinfo/vesselinfo-bulkloader.yaml
python_preamble: - import: base64 - import: re - import: google.appengine.ext.bulkload.transform - import: google.appengine.ext.bulkload.bulkloader_wizard - import: google.appengine.ext.db - import: google.appengine.api.datastore - import: google.appengine.api.users transformers: - kind: VesselInfo connector: csv connector_options: encoding: utf-8 property_map: - property: __key__ external_name: series export_transform: datastore.Key.name - property: mmsi external_name: mmsi - property: source external_name: source - property: sourceid external_name: sourceid - property: datetime external_name: datetime import_transform: transform.import_date_time('%Y-%m-%d') - property: callsign external_name: callsign - property: vesselclass external_name: vesselclass - property: vesselname external_name: vesselname - property: flagstate external_name: flagstate - property: imo external_name: imo
python_preamble: - import: base64 - import: re - import: google.appengine.ext.bulkload.transform - import: google.appengine.ext.bulkload.bulkloader_wizard - import: google.appengine.ext.db - import: google.appengine.api.datastore - import: google.appengine.api.users transformers: - kind: VesselInfo connector: csv connector_options: encoding: utf-8 property_map: - property: __key__ external_name: series import_transform: int export_transform: datastore.Key.id export_transform: datastore.Key.id - property: mmsi external_name: mmsi - property: source external_name: source - property: sourceid external_name: sourceid - property: datetime external_name: datetime import_transform: transform.import_date_time('%Y-%m-%d') - property: callsign external_name: callsign - property: vesselclass external_name: vesselclass - property: vesselname external_name: vesselname - property: flagstate external_name: flagstate - property: imo external_name: imo
Change VesselInfo key from string to int
Change VesselInfo key from string to int
YAML
mit
SkyTruth/pelagos-data,SkyTruth/pelagos-data
yaml
## Code Before:
python_preamble:
- import: base64
- import: re
- import: google.appengine.ext.bulkload.transform
- import: google.appengine.ext.bulkload.bulkloader_wizard
- import: google.appengine.ext.db
- import: google.appengine.api.datastore
- import: google.appengine.api.users

transformers:
- kind: VesselInfo
  connector: csv
  connector_options:
    encoding: utf-8
  property_map:
    - property: __key__
      external_name: series
      export_transform: datastore.Key.name
    - property: mmsi
      external_name: mmsi
    - property: source
      external_name: source
    - property: sourceid
      external_name: sourceid
    - property: datetime
      external_name: datetime
      import_transform: transform.import_date_time('%Y-%m-%d')
    - property: callsign
      external_name: callsign
    - property: vesselclass
      external_name: vesselclass
    - property: vesselname
      external_name: vesselname
    - property: flagstate
      external_name: flagstate
    - property: imo
      external_name: imo

## Instruction:
Change VesselInfo key from string to int

## Code After:
python_preamble:
- import: base64
- import: re
- import: google.appengine.ext.bulkload.transform
- import: google.appengine.ext.bulkload.bulkloader_wizard
- import: google.appengine.ext.db
- import: google.appengine.api.datastore
- import: google.appengine.api.users

transformers:
- kind: VesselInfo
  connector: csv
  connector_options:
    encoding: utf-8
  property_map:
    - property: __key__
      external_name: series
      import_transform: int
      export_transform: datastore.Key.id
      export_transform: datastore.Key.id
    - property: mmsi
      external_name: mmsi
    - property: source
      external_name: source
    - property: sourceid
      external_name: sourceid
    - property: datetime
      external_name: datetime
      import_transform: transform.import_date_time('%Y-%m-%d')
    - property: callsign
      external_name: callsign
    - property: vesselclass
      external_name: vesselclass
    - property: vesselname
      external_name: vesselname
    - property: flagstate
      external_name: flagstate
    - property: imo
      external_name: imo
53e065675009bae7ce186ef19f1bf96ce4ef0725
app/components/Content/Content.css
app/components/Content/Content.css
.cover {
  composes: contentContainer from 'app/styles/utilities.css';
  max-height: 300px;
  padding: 0;
  border-bottom-right-radius: 0;
  border-bottom-left-radius: 0;
  overflow: hidden;
  position: relative;

  img {
    object-fit: cover;
  }
}

.content {
  composes: contentContainer from 'app/styles/utilities.css';
  width: 100%;
  padding-top: 20px;
}

.contentWithBanner {
  border-top-right-radius: 0;
  border-top-left-radius: 0;
}
.cover {
  composes: contentContainer from 'app/styles/utilities.css';
  max-height: 400px;
  padding: 0;
  border-bottom-right-radius: 0;
  border-bottom-left-radius: 0;
  overflow: hidden;
  position: relative;

  img {
    object-fit: cover;
    display: block;
  }
}

.content {
  composes: contentContainer from 'app/styles/utilities.css';
  width: 100%;
  padding-top: 20px;
}

.contentWithBanner {
  border-top-right-radius: 0;
  border-top-left-radius: 0;
}
Fix content cover image height
Fix content cover image height
CSS
mit
webkom/lego-webapp,webkom/lego-webapp,webkom/lego-webapp
css
## Code Before:
.cover {
  composes: contentContainer from 'app/styles/utilities.css';
  max-height: 300px;
  padding: 0;
  border-bottom-right-radius: 0;
  border-bottom-left-radius: 0;
  overflow: hidden;
  position: relative;

  img {
    object-fit: cover;
  }
}

.content {
  composes: contentContainer from 'app/styles/utilities.css';
  width: 100%;
  padding-top: 20px;
}

.contentWithBanner {
  border-top-right-radius: 0;
  border-top-left-radius: 0;
}

## Instruction:
Fix content cover image height

## Code After:
.cover {
  composes: contentContainer from 'app/styles/utilities.css';
  max-height: 400px;
  padding: 0;
  border-bottom-right-radius: 0;
  border-bottom-left-radius: 0;
  overflow: hidden;
  position: relative;

  img {
    object-fit: cover;
    display: block;
  }
}

.content {
  composes: contentContainer from 'app/styles/utilities.css';
  width: 100%;
  padding-top: 20px;
}

.contentWithBanner {
  border-top-right-radius: 0;
  border-top-left-radius: 0;
}
6277a6180e21ac32f822adfde087e423a129b145
lib/active_interaction.rb
lib/active_interaction.rb
require 'active_model'

require 'active_interaction/version'
require 'active_interaction/errors'
require 'active_interaction/overload_hash'
require 'active_interaction/filter'
require 'active_interaction/filter_method'
require 'active_interaction/filter_methods'
require 'active_interaction/filters/abstract_date_time_filter'
require 'active_interaction/filters/array_filter'
require 'active_interaction/filters/boolean_filter'
require 'active_interaction/filters/date_filter'
require 'active_interaction/filters/date_time_filter'
require 'active_interaction/filters/file_filter'
require 'active_interaction/filters/float_filter'
require 'active_interaction/filters/hash_filter'
require 'active_interaction/filters/integer_filter'
require 'active_interaction/filters/model_filter'
require 'active_interaction/filters/string_filter'
require 'active_interaction/filters/time_filter'
require 'active_interaction/base'

I18n.backend.load_translations(Dir['lib/active_interaction/locale/*.yml'])

# @since 0.1.0
module ActiveInteraction
end
require 'active_model'

require 'active_interaction/version'
require 'active_interaction/errors'
require 'active_interaction/overload_hash'
require 'active_interaction/filter'
require 'active_interaction/filter_method'
require 'active_interaction/filter_methods'
require 'active_interaction/filters/abstract_date_time_filter'
require 'active_interaction/filters/array_filter'
require 'active_interaction/filters/boolean_filter'
require 'active_interaction/filters/date_filter'
require 'active_interaction/filters/date_time_filter'
require 'active_interaction/filters/file_filter'
require 'active_interaction/filters/float_filter'
require 'active_interaction/filters/hash_filter'
require 'active_interaction/filters/integer_filter'
require 'active_interaction/filters/model_filter'
require 'active_interaction/filters/string_filter'
require 'active_interaction/filters/time_filter'
require 'active_interaction/base'

I18n.backend.load_translations(
  Dir.glob(File.join(*%w(lib active_interaction locale *.yml))))

# @since 0.1.0
module ActiveInteraction
end
Use more robust and explicit file path
Use more robust and explicit file path
Ruby
mit
frbl/active_interaction,AaronLasseigne/active_interaction,JasOXIII/active_interaction,antoinefinkelstein/active_interaction,antoinefinkelstein/active_interaction,AaronLasseigne/active_interaction,orgsync/active_interaction,frbl/active_interaction,JasOXIII/active_interaction,orgsync/active_interaction
ruby
## Code Before:
require 'active_model'

require 'active_interaction/version'
require 'active_interaction/errors'
require 'active_interaction/overload_hash'
require 'active_interaction/filter'
require 'active_interaction/filter_method'
require 'active_interaction/filter_methods'
require 'active_interaction/filters/abstract_date_time_filter'
require 'active_interaction/filters/array_filter'
require 'active_interaction/filters/boolean_filter'
require 'active_interaction/filters/date_filter'
require 'active_interaction/filters/date_time_filter'
require 'active_interaction/filters/file_filter'
require 'active_interaction/filters/float_filter'
require 'active_interaction/filters/hash_filter'
require 'active_interaction/filters/integer_filter'
require 'active_interaction/filters/model_filter'
require 'active_interaction/filters/string_filter'
require 'active_interaction/filters/time_filter'
require 'active_interaction/base'

I18n.backend.load_translations(Dir['lib/active_interaction/locale/*.yml'])

# @since 0.1.0
module ActiveInteraction
end

## Instruction:
Use more robust and explicit file path

## Code After:
require 'active_model'

require 'active_interaction/version'
require 'active_interaction/errors'
require 'active_interaction/overload_hash'
require 'active_interaction/filter'
require 'active_interaction/filter_method'
require 'active_interaction/filter_methods'
require 'active_interaction/filters/abstract_date_time_filter'
require 'active_interaction/filters/array_filter'
require 'active_interaction/filters/boolean_filter'
require 'active_interaction/filters/date_filter'
require 'active_interaction/filters/date_time_filter'
require 'active_interaction/filters/file_filter'
require 'active_interaction/filters/float_filter'
require 'active_interaction/filters/hash_filter'
require 'active_interaction/filters/integer_filter'
require 'active_interaction/filters/model_filter'
require 'active_interaction/filters/string_filter'
require 'active_interaction/filters/time_filter'
require 'active_interaction/base'

I18n.backend.load_translations(
  Dir.glob(File.join(*%w(lib active_interaction locale *.yml))))

# @since 0.1.0
module ActiveInteraction
end
6eae08c7f03a65fa8ad6994b63e490a281a9a356
apps/ello_v2/web/views/user_meta_attributes_view.ex
apps/ello_v2/web/views/user_meta_attributes_view.ex
defmodule Ello.V2.UserMetaAttributesView do
  use Ello.V2.Web, :view
  import Ello.V2.ImageView, only: [image_url: 2]

  def render("user.json", %{user: user}) do
    %{
      title: title(user),
      robots: robots(user),
      image: image(user),
      description: description(user),
    }
  end

  defp title(%{name: nil, username: username}), do: "@#{username} | Ello"
  defp title(user), do: "#{user.name} (@#{user.username}) | Ello"

  defp robots(%{bad_for_seo: true}), do: "noindex, follow"
  defp robots(_), do: "index, follow"

  defp image(user) do
    case Enum.find(user.cover_image_struct.versions, &(&1.name == "optimized")) do
      nil -> nil
      version -> image_url(user.cover_image_struct.path, version.filename)
    end
  end

  defp description(%{formatted_short_bio: nil} = user), do: default_description(user)
  defp description(user) do
    user.formatted_short_bio
    |> Curtail.truncate(length: 160)
    |> HtmlSanitizeEx.strip_tags
    |> String.trim
    |> case do
      "" -> default_description(user)
      desc -> desc
    end
  end

  defp default_description(%{name: nil, username: username}), do: "See @#{username}'s work on Ello"
  defp default_description(%{name: name}), do: "See #{name}'s work on Ello"
end
defmodule Ello.V2.UserMetaAttributesView do
  use Ello.V2.Web, :view
  import Ello.V2.ImageView, only: [image_url: 2]

  def render("user.json", %{user: user}) do
    %{
      title: title(user),
      robots: robots(user),
      image: image(user),
      description: description(user),
    }
  end

  defp title(%{name: nil, username: username}), do: "@#{username} | Ello"
  defp title(user), do: "#{user.name} (@#{user.username}) | Ello"

  defp robots(%{bad_for_seo: true}), do: "noindex, follow"
  defp robots(_), do: "index, follow"

  defp image(user) do
    version = Enum.find(user.cover_image_struct.versions, &(&1.name == "optimized"))
    image_url(user.cover_image_struct.path, version.filename)
  end

  defp description(%{formatted_short_bio: nil} = user), do: default_description(user)
  defp description(user) do
    user.formatted_short_bio
    |> Curtail.truncate(length: 160)
    |> HtmlSanitizeEx.strip_tags
    |> String.trim
    |> case do
      "" -> default_description(user)
      desc -> desc
    end
  end

  defp default_description(%{name: nil, username: username}), do: "See @#{username}'s work on Ello"
  defp default_description(%{name: name}), do: "See #{name}'s work on Ello"
end
Revert "Don't 500 if missing optimized image."
Revert "Don't 500 if missing optimized image."

This reverts commit bf3c7e850207d5de2745cc0d70bc013710a5d8b5.
Elixir
mit
ello/apex,ello/apex,ello/apex
elixir
## Code Before:
defmodule Ello.V2.UserMetaAttributesView do
  use Ello.V2.Web, :view
  import Ello.V2.ImageView, only: [image_url: 2]

  def render("user.json", %{user: user}) do
    %{
      title: title(user),
      robots: robots(user),
      image: image(user),
      description: description(user),
    }
  end

  defp title(%{name: nil, username: username}), do: "@#{username} | Ello"
  defp title(user), do: "#{user.name} (@#{user.username}) | Ello"

  defp robots(%{bad_for_seo: true}), do: "noindex, follow"
  defp robots(_), do: "index, follow"

  defp image(user) do
    case Enum.find(user.cover_image_struct.versions, &(&1.name == "optimized")) do
      nil -> nil
      version -> image_url(user.cover_image_struct.path, version.filename)
    end
  end

  defp description(%{formatted_short_bio: nil} = user), do: default_description(user)
  defp description(user) do
    user.formatted_short_bio
    |> Curtail.truncate(length: 160)
    |> HtmlSanitizeEx.strip_tags
    |> String.trim
    |> case do
      "" -> default_description(user)
      desc -> desc
    end
  end

  defp default_description(%{name: nil, username: username}), do: "See @#{username}'s work on Ello"
  defp default_description(%{name: name}), do: "See #{name}'s work on Ello"
end

## Instruction:
Revert "Don't 500 if missing optimized image."

This reverts commit bf3c7e850207d5de2745cc0d70bc013710a5d8b5.

## Code After:
defmodule Ello.V2.UserMetaAttributesView do
  use Ello.V2.Web, :view
  import Ello.V2.ImageView, only: [image_url: 2]

  def render("user.json", %{user: user}) do
    %{
      title: title(user),
      robots: robots(user),
      image: image(user),
      description: description(user),
    }
  end

  defp title(%{name: nil, username: username}), do: "@#{username} | Ello"
  defp title(user), do: "#{user.name} (@#{user.username}) | Ello"

  defp robots(%{bad_for_seo: true}), do: "noindex, follow"
  defp robots(_), do: "index, follow"

  defp image(user) do
    version = Enum.find(user.cover_image_struct.versions, &(&1.name == "optimized"))
    image_url(user.cover_image_struct.path, version.filename)
  end

  defp description(%{formatted_short_bio: nil} = user), do: default_description(user)
  defp description(user) do
    user.formatted_short_bio
    |> Curtail.truncate(length: 160)
    |> HtmlSanitizeEx.strip_tags
    |> String.trim
    |> case do
      "" -> default_description(user)
      desc -> desc
    end
  end

  defp default_description(%{name: nil, username: username}), do: "See @#{username}'s work on Ello"
  defp default_description(%{name: name}), do: "See #{name}'s work on Ello"
end
66f2c4979df04274e9f0f909ba325b122d0de6e9
README.md
README.md
prove -e perl6 -r t/
    use v6;
    use Algorithm::Soundex;

    my Algorithm::Soundex $s .= new();
    my $soundex = $s.soundex("Leto");
    say "The soundex of Leto is $soundex";

## Running Tests

    prove -e perl6 -r t/
Add a code example to the readme
Add a code example to the readme
Markdown
artistic-2.0
leto/perl6-Algorithm-Soundex,paultcochrane/perl6-Algorithm-Soundex
markdown
## Code Before:
    prove -e perl6 -r t/

## Instruction:
Add a code example to the readme

## Code After:
    use v6;
    use Algorithm::Soundex;

    my Algorithm::Soundex $s .= new();
    my $soundex = $s.soundex("Leto");
    say "The soundex of Leto is $soundex";

## Running Tests

    prove -e perl6 -r t/
39fb7a57fb045b6278193712d1d890bb52a44523
site/bible-studies.md
site/bible-studies.md
---
layout: map
title: Bible Studies
permalink: /bible-studies/
map: true
---

Bible studies run at the following times and locations:

- Monday
    - 12-1
    - 1-2
- Tuesday
    - 10-11
    - 1-2
- Wednesday
    - 12-1
    - 1-2
    - 7-9 @ Res Common Room
- Thursday
    - 12-1
    - 1-2 @ Med PBL 1
- Friday
    - 12-1 @ Med PBL 1

Location of Bible Studies is the Connect Store Cafe unless stated otherwise (Across from Library)
---
layout: map
title: Bible Studies
permalink: /bible-studies/
map: true
---

Bible studies run at the following times and locations:

- Monday
    - 12-1
    - 1-2
- Tuesday
    - 10-11
    - 1-2
- Wednesday
    - 12-1
    - 1-2
    - 7-9 @ Res Common Room
- Thursday
    - 12-1
    - 1-2 @ Med PBL 1 Building 30
- Friday
    - 12-1 @ Med PBL 1 Building 30

Location of Bible Studies is the Connect Store Cafe unless stated otherwise (Across from Library)
Add more detail to Med building location
Add more detail to Med building location
Markdown
apache-2.0
daviewales/CBM-Campbelltown-Website
markdown
## Code Before:
---
layout: map
title: Bible Studies
permalink: /bible-studies/
map: true
---

Bible studies run at the following times and locations:

- Monday
    - 12-1
    - 1-2
- Tuesday
    - 10-11
    - 1-2
- Wednesday
    - 12-1
    - 1-2
    - 7-9 @ Res Common Room
- Thursday
    - 12-1
    - 1-2 @ Med PBL 1
- Friday
    - 12-1 @ Med PBL 1

Location of Bible Studies is the Connect Store Cafe unless stated otherwise (Across from Library)

## Instruction:
Add more detail to Med building location

## Code After:
---
layout: map
title: Bible Studies
permalink: /bible-studies/
map: true
---

Bible studies run at the following times and locations:

- Monday
    - 12-1
    - 1-2
- Tuesday
    - 10-11
    - 1-2
- Wednesday
    - 12-1
    - 1-2
    - 7-9 @ Res Common Room
- Thursday
    - 12-1
    - 1-2 @ Med PBL 1 Building 30
- Friday
    - 12-1 @ Med PBL 1 Building 30

Location of Bible Studies is the Connect Store Cafe unless stated otherwise (Across from Library)
a0d67c1c18da281c8531ba020e114c2a1ed117e5
docs/sphinx/tools.rst
docs/sphinx/tools.rst
Tools
=====

Lorem cupidatat sit quis sunt labore cillum irure nulla quis ipsum incididunt do velit id. Duis et commodo magna velit cupidatat exercitation quis consectetur. Irure labore anim ea eiusmod consectetur voluptate sunt irure ut pariatur mollit ex irure occaecat aliqua deserunt voluptate. Id eu dolor sit labore proident culpa proident excepteur dolore eu cillum culpa id ut cupidatat qui.

:doc:`query`

:doc:`results`
Tools
=====

Marvin-tools is an importable python package that provides convenience classes and functions for searching, accessing, interacting with, and visualzing MaNGA data. Since these capabilities are useful in both an exploratory environment, such as in Marvin-web, and in science-grade analysis code, we factored out the common elements into Marvin-tools.

Marvin-tools includes classes that represent hierarchical levels of MaNGA data organization: :ref:`marvin-tools-spectrum`, :ref:`marvin-tools-spaxel`, :ref:`marvin-tools-rss`, :ref:`marvin-tools-cube`, and :ref:`marvin-tools-plate`. These classes have methods to retrieve the appropriate data from a locally stored file, over the internet via Marvin-API, or by downloading FITS files.

One of the most powerful aspects of the Marvin ecosystem is the ability to use Marvin-API through Marvin-tools to query the MaNGA databases from within a python script or terminal. With the Marvin-tools class :doc:`query` you can build and execute a query. The results of your query are returned as an instance of the :doc:`results` class, which has built-in methods for navigating and presenting those results.
Add Tools general description page
Add Tools general description page
reStructuredText
bsd-3-clause
albireox/marvin,sdss/marvin,bretthandrews/marvin,sdss/marvin,bretthandrews/marvin,albireox/marvin,sdss/marvin,sdss/marvin,bretthandrews/marvin,bretthandrews/marvin,albireox/marvin,albireox/marvin
restructuredtext
## Code Before:
Tools
=====

Lorem cupidatat sit quis sunt labore cillum irure nulla quis ipsum incididunt do velit id. Duis et commodo magna velit cupidatat exercitation quis consectetur. Irure labore anim ea eiusmod consectetur voluptate sunt irure ut pariatur mollit ex irure occaecat aliqua deserunt voluptate. Id eu dolor sit labore proident culpa proident excepteur dolore eu cillum culpa id ut cupidatat qui.

:doc:`query`

:doc:`results`

## Instruction:
Add Tools general description page

## Code After:
Tools
=====

Marvin-tools is an importable python package that provides convenience classes and functions for searching, accessing, interacting with, and visualzing MaNGA data. Since these capabilities are useful in both an exploratory environment, such as in Marvin-web, and in science-grade analysis code, we factored out the common elements into Marvin-tools.

Marvin-tools includes classes that represent hierarchical levels of MaNGA data organization: :ref:`marvin-tools-spectrum`, :ref:`marvin-tools-spaxel`, :ref:`marvin-tools-rss`, :ref:`marvin-tools-cube`, and :ref:`marvin-tools-plate`. These classes have methods to retrieve the appropriate data from a locally stored file, over the internet via Marvin-API, or by downloading FITS files.

One of the most powerful aspects of the Marvin ecosystem is the ability to use Marvin-API through Marvin-tools to query the MaNGA databases from within a python script or terminal. With the Marvin-tools class :doc:`query` you can build and execute a query. The results of your query are returned as an instance of the :doc:`results` class, which has built-in methods for navigating and presenting those results.
0087ca4409438778da9d5c184a7cbb57e7af43b3
README.md
README.md
[![Greenkeeper badge](https://badges.greenkeeper.io/ChristianMurphy/grunt-remark.svg)](https://greenkeeper.io/)

[![Build Status](https://travis-ci.org/ChristianMurphy/grunt-remark.svg?branch=master)](https://travis-ci.org/ChristianMurphy/grunt-remark)
[![dependencies Status](https://david-dm.org/ChristianMurphy/grunt-remark/status.svg)](https://david-dm.org/ChristianMurphy/grunt-remark)
[![devDependencies Status](https://david-dm.org/ChristianMurphy/grunt-remark/dev-status.svg)](https://david-dm.org/ChristianMurphy/grunt-remark?type=dev)

> Process markdown with [remark](http://remark.js.org/)

## example

``` js
'use strict';

module.exports = grunt => {
  grunt.loadNpmTasks('grunt-remark');
  grunt.loadTasks('tasks');
  grunt.initConfig({
    remark: {
      src: ['*.md']
    }
  });
  grunt.registerTask('default', ['remark']);
};
```

## options

See full list of options [here](https://github.com/wooorm/unified-engine-gulp#options).
[![Build Status](https://travis-ci.org/ChristianMurphy/grunt-remark.svg?branch=master)](https://travis-ci.org/ChristianMurphy/grunt-remark)
[![Greenkeeper badge](https://badges.greenkeeper.io/ChristianMurphy/grunt-remark.svg)](https://greenkeeper.io/)
[![dependencies Status](https://david-dm.org/ChristianMurphy/grunt-remark/status.svg)](https://david-dm.org/ChristianMurphy/grunt-remark)
[![devDependencies Status](https://david-dm.org/ChristianMurphy/grunt-remark/dev-status.svg)](https://david-dm.org/ChristianMurphy/grunt-remark?type=dev)

> Process markdown with [remark](http://remark.js.org/)

## example

``` js
'use strict';

module.exports = grunt => {
  grunt.loadNpmTasks('grunt-remark');
  grunt.loadTasks('tasks');
  grunt.initConfig({
    remark: {
      src: ['*.md']
    }
  });
  grunt.registerTask('default', ['remark']);
};
```

## options

See full list of options [here](https://github.com/wooorm/unified-engine-gulp#options).
Move Greenkeeper badge next to other badges
Move Greenkeeper badge next to other badges
Markdown
mit
ChristianMurphy/grunt-remark
markdown
## Code Before:
[![Greenkeeper badge](https://badges.greenkeeper.io/ChristianMurphy/grunt-remark.svg)](https://greenkeeper.io/)

[![Build Status](https://travis-ci.org/ChristianMurphy/grunt-remark.svg?branch=master)](https://travis-ci.org/ChristianMurphy/grunt-remark)
[![dependencies Status](https://david-dm.org/ChristianMurphy/grunt-remark/status.svg)](https://david-dm.org/ChristianMurphy/grunt-remark)
[![devDependencies Status](https://david-dm.org/ChristianMurphy/grunt-remark/dev-status.svg)](https://david-dm.org/ChristianMurphy/grunt-remark?type=dev)

> Process markdown with [remark](http://remark.js.org/)

## example

``` js
'use strict';

module.exports = grunt => {
  grunt.loadNpmTasks('grunt-remark');
  grunt.loadTasks('tasks');
  grunt.initConfig({
    remark: {
      src: ['*.md']
    }
  });
  grunt.registerTask('default', ['remark']);
};
```

## options

See full list of options [here](https://github.com/wooorm/unified-engine-gulp#options).

## Instruction:
Move Greenkeeper badge next to other badges

## Code After:
[![Build Status](https://travis-ci.org/ChristianMurphy/grunt-remark.svg?branch=master)](https://travis-ci.org/ChristianMurphy/grunt-remark)
[![Greenkeeper badge](https://badges.greenkeeper.io/ChristianMurphy/grunt-remark.svg)](https://greenkeeper.io/)
[![dependencies Status](https://david-dm.org/ChristianMurphy/grunt-remark/status.svg)](https://david-dm.org/ChristianMurphy/grunt-remark)
[![devDependencies Status](https://david-dm.org/ChristianMurphy/grunt-remark/dev-status.svg)](https://david-dm.org/ChristianMurphy/grunt-remark?type=dev)

> Process markdown with [remark](http://remark.js.org/)

## example

``` js
'use strict';

module.exports = grunt => {
  grunt.loadNpmTasks('grunt-remark');
  grunt.loadTasks('tasks');
  grunt.initConfig({
    remark: {
      src: ['*.md']
    }
  });
  grunt.registerTask('default', ['remark']);
};
```

## options

See full list of options [here](https://github.com/wooorm/unified-engine-gulp#options).
9f95715cc7260d02d88781c208f6a6a167496015
aiohttp_json_api/jsonpointer/__init__.py
aiohttp_json_api/jsonpointer/__init__.py
import typing

from jsonpointer import JsonPointer as BaseJsonPointer


class JSONPointer(BaseJsonPointer):
    def __init__(self, pointer):
        super(JSONPointer, self).__init__(pointer)

    def __truediv__(self, path: typing.Union['JSONPointer', str]) -> 'JSONPointer':
        parts = self.parts.copy()
        if isinstance(path, str):
            if not path.startswith('/'):
                path = f'/{path}'
            new_parts = JSONPointer(path).parts.pop(0)
            parts.append(new_parts)
        else:
            new_parts = path.parts
            parts.extend(new_parts)
        return JSONPointer.from_parts(parts)
import typing

from jsonpointer import JsonPointer as BaseJsonPointer


class JSONPointer(BaseJsonPointer):
    def __init__(self, pointer):
        super(JSONPointer, self).__init__(pointer)

    def __truediv__(self, path: typing.Union['JSONPointer', str]) -> 'JSONPointer':
        parts = self.parts.copy()
        if isinstance(path, int):
            path = str(path)
        if isinstance(path, str):
            if not path.startswith('/'):
                path = f'/{path}'
            new_parts = JSONPointer(path).parts.pop(0)
            parts.append(new_parts)
        else:
            new_parts = path.parts
            parts.extend(new_parts)
        return JSONPointer.from_parts(parts)
Fix bug with JSONPointer if part passed via __truediv__ is integer
Fix bug with JSONPointer if part passed via __truediv__ is integer
Python
mit
vovanbo/aiohttp_json_api
python
## Code Before:
import typing

from jsonpointer import JsonPointer as BaseJsonPointer


class JSONPointer(BaseJsonPointer):
    def __init__(self, pointer):
        super(JSONPointer, self).__init__(pointer)

    def __truediv__(self, path: typing.Union['JSONPointer', str]) -> 'JSONPointer':
        parts = self.parts.copy()
        if isinstance(path, str):
            if not path.startswith('/'):
                path = f'/{path}'
            new_parts = JSONPointer(path).parts.pop(0)
            parts.append(new_parts)
        else:
            new_parts = path.parts
            parts.extend(new_parts)
        return JSONPointer.from_parts(parts)

## Instruction:
Fix bug with JSONPointer if part passed via __truediv__ is integer

## Code After:
import typing

from jsonpointer import JsonPointer as BaseJsonPointer


class JSONPointer(BaseJsonPointer):
    def __init__(self, pointer):
        super(JSONPointer, self).__init__(pointer)

    def __truediv__(self, path: typing.Union['JSONPointer', str]) -> 'JSONPointer':
        parts = self.parts.copy()
        if isinstance(path, int):
            path = str(path)
        if isinstance(path, str):
            if not path.startswith('/'):
                path = f'/{path}'
            new_parts = JSONPointer(path).parts.pop(0)
            parts.append(new_parts)
        else:
            new_parts = path.parts
            parts.extend(new_parts)
        return JSONPointer.from_parts(parts)
d1db59b4fa389e5f73ae9de7c4f75802102d1756
fmpz_poly_factor/clear.c
fmpz_poly_factor/clear.c
/*
    Copyright (C) 2011 Sebastian Pancratz

    This file is part of FLINT.

    FLINT is free software: you can redistribute it and/or modify it under
    the terms of the GNU Lesser General Public License (LGPL) as published
    by the Free Software Foundation; either version 2.1 of the License, or
    (at your option) any later version.  See <http://www.gnu.org/licenses/>.
*/

#include <gmp.h>
#include <stdlib.h>
#include "flint.h"
#include "fmpz.h"
#include "fmpz_poly.h"

void fmpz_poly_factor_clear(fmpz_poly_factor_t fac)
{
    if (fac->alloc)
    {
        slong i;

        for (i = 0; i < fac->alloc; i++)
        {
            fmpz_poly_clear(fac->p + i);
        }

        fmpz_clear(&(fac->c));

        flint_free(fac->p);
        flint_free(fac->exp);

        fac->p = NULL;
        fac->exp = NULL;
    }
}
/*
    Copyright (C) 2011 Sebastian Pancratz

    This file is part of FLINT.

    FLINT is free software: you can redistribute it and/or modify it under
    the terms of the GNU Lesser General Public License (LGPL) as published
    by the Free Software Foundation; either version 2.1 of the License, or
    (at your option) any later version.  See <http://www.gnu.org/licenses/>.
*/

#include <gmp.h>
#include <stdlib.h>
#include "flint.h"
#include "fmpz.h"
#include "fmpz_poly.h"

void fmpz_poly_factor_clear(fmpz_poly_factor_t fac)
{
    if (fac->alloc)
    {
        slong i;

        for (i = 0; i < fac->alloc; i++)
        {
            fmpz_poly_clear(fac->p + i);
        }

        flint_free(fac->p);
        flint_free(fac->exp);

        fac->p = NULL;
        fac->exp = NULL;
    }

    fmpz_clear(&(fac->c));
}
Fix a memory leak in t-factor.
Fix a memory leak in t-factor.
C
lgpl-2.1
fredrik-johansson/flint2,wbhart/flint2,wbhart/flint2,fredrik-johansson/flint2,fredrik-johansson/flint2,wbhart/flint2
c
## Code Before:
/*
    Copyright (C) 2011 Sebastian Pancratz

    This file is part of FLINT.

    FLINT is free software: you can redistribute it and/or modify it under
    the terms of the GNU Lesser General Public License (LGPL) as published
    by the Free Software Foundation; either version 2.1 of the License, or
    (at your option) any later version.  See <http://www.gnu.org/licenses/>.
*/

#include <gmp.h>
#include <stdlib.h>
#include "flint.h"
#include "fmpz.h"
#include "fmpz_poly.h"

void fmpz_poly_factor_clear(fmpz_poly_factor_t fac)
{
    if (fac->alloc)
    {
        slong i;

        for (i = 0; i < fac->alloc; i++)
        {
            fmpz_poly_clear(fac->p + i);
        }

        fmpz_clear(&(fac->c));

        flint_free(fac->p);
        flint_free(fac->exp);

        fac->p = NULL;
        fac->exp = NULL;
    }
}

## Instruction:
Fix a memory leak in t-factor.

## Code After:
/*
    Copyright (C) 2011 Sebastian Pancratz

    This file is part of FLINT.

    FLINT is free software: you can redistribute it and/or modify it under
    the terms of the GNU Lesser General Public License (LGPL) as published
    by the Free Software Foundation; either version 2.1 of the License, or
    (at your option) any later version.  See <http://www.gnu.org/licenses/>.
*/

#include <gmp.h>
#include <stdlib.h>
#include "flint.h"
#include "fmpz.h"
#include "fmpz_poly.h"

void fmpz_poly_factor_clear(fmpz_poly_factor_t fac)
{
    if (fac->alloc)
    {
        slong i;

        for (i = 0; i < fac->alloc; i++)
        {
            fmpz_poly_clear(fac->p + i);
        }

        flint_free(fac->p);
        flint_free(fac->exp);

        fac->p = NULL;
        fac->exp = NULL;
    }

    fmpz_clear(&(fac->c));
}
2b4b4ac3ec238a039717feff727316217c13d294
test/test_cronquot.py
test/test_cronquot.py
import unittest

from cronquot.cronquot import has_directory


class CronquotTest(unittest.TestCase):

    def test_has_directory(self):
        self.assertTrue(has_directory('/tmp'))


if __name__ == '__main__':
    unittest.test()
import unittest
import os

from cronquot.cronquot import has_directory


class CronquotTest(unittest.TestCase):

    def test_has_directory(self):
        sample_dir = os.path.join(
            os.path.dirname(__file__), 'crontab')
        self.assertTrue(has_directory(sample_dir))


if __name__ == '__main__':
    unittest.test()
Fix to test crontab dir
Fix to test crontab dir
Python
mit
pyohei/cronquot,pyohei/cronquot
python
## Code Before:
import unittest

from cronquot.cronquot import has_directory


class CronquotTest(unittest.TestCase):

    def test_has_directory(self):
        self.assertTrue(has_directory('/tmp'))


if __name__ == '__main__':
    unittest.test()

## Instruction:
Fix to test crontab dir

## Code After:
import unittest
import os

from cronquot.cronquot import has_directory


class CronquotTest(unittest.TestCase):

    def test_has_directory(self):
        sample_dir = os.path.join(
            os.path.dirname(__file__), 'crontab')
        self.assertTrue(has_directory(sample_dir))


if __name__ == '__main__':
    unittest.test()
708e0cd9668ca5fbc23a69258bd97ce81ca91b8b
.travis.yml
.travis.yml
language: python
python:
  - "2.7"
virtualenv:
  system_site_packages: true
script:
  - ./setup_dev.sh -u
language: python
python:
  - "2.7"
virtualenv:
  system_site_packages: true
# For more information about the nasty /dev/random hack, please see:
# https://github.com/travis-ci/travis-ci/issues/1913#issuecomment-33891474
before_install:
  - sudo apt-get update -qq
  - sudo apt-get install --yes dpkg-dev fakeroot lintian python-apt rng-tools
  - sudo rm -f /dev/random
  - sudo mknod -m 0666 /dev/random c 1 9
  - echo HRNGDEVICE=/dev/urandom | sudo tee /etc/default/rng-tools
  - sudo /etc/init.d/rng-tools restart
script:
  - ./setup_dev.sh -u
Implement rng-tools hack for low entropy causing tests to time out on Travis
Implement rng-tools hack for low entropy causing tests to time out on Travis
YAML
agpl-3.0
mark-in/securedrop-prov-upstream,mark-in/securedrop-prov-upstream,mark-in/securedrop-prov-upstream,mark-in/securedrop-prov-upstream
yaml
c04cf942262fb1615abaf496a422a6665bf94142
generators/app/templates/theme/search/forms/search-header.twig
generators/app/templates/theme/search/forms/search-header.twig
{% apply spaceless %}
<div class="SearchForm">
    {{ form.openTag }}
    <div class="Arrange Arrange--middle">
        <div class="Arrange-sizeFill">
            {% set form.fields.searchTerm.class = 'SearchForm-input' %}
            {% set form.fields.searchTerm.id = '' %}
            {% set form.fields.searchTerm.placeholder = 'Search...' %}
            {{ form.fields.searchTerm.tag }}
        </div>
        <div class="Arrange-sizeFit">
            <button class="SearchForm-btn" type="submit"><i class="Icon-search"></i></button>
        </div>
    </div>
    {{ form.closeTag }}
</div>
{% endapply %}
{% apply spaceless %}
<div class="SearchForm">
    {{ form.openTag }}
    <div class="Arrange Arrange--middle">
        <div class="Arrange-sizeFill">
            {% set form.fields.searchTerm.class = 'SearchForm-input' %}
            {% set form.fields.searchTerm.placeholder = 'Search...' %}
            {{ form.fields.searchTerm.tag }}
        </div>
        <div class="Arrange-sizeFit">
            <button class="SearchForm-btn" type="submit"><i class="Icon-search"></i></button>
        </div>
    </div>
    {{ form.closeTag }}
</div>
{% endapply %}
Remove line in header search template
Remove line in header search template
Twig
mit
aptuitiv/generator-cacao-branchcms
twig
9b9cba0939471d8c147f28914c1bb2b781e186e1
app/models/chargeback/consumption_with_rollups.rb
app/models/chargeback/consumption_with_rollups.rb
class Chargeback
  class ConsumptionWithRollups < Consumption
    delegate :timestamp,
             :resource,
             :resource_id,
             :resource_name,
             :resource_type,
             :parent_ems,
             :hash_features_affecting_rate,
             :tag_list_with_prefix,
             :parents_determining_rate,
             :to => :first_metric_rollup_record

    def initialize(metric_rollup_records, start_time, end_time)
      super(start_time, end_time)
      @rollups = metric_rollup_records
    end

    def tag_names
      first_metric_rollup_record.tag_names.split('|')
    end

    def max(metric)
      values(metric).max
    end

    def avg(metric)
      metric_sum = values(metric).sum
      metric_sum / consumed_hours_in_interval
    end

    def none?(metric)
      values(metric).empty?
    end

    def chargeback_fields_present
      @chargeback_fields_present ||= @rollups.count(&:chargeback_fields_present?)
    end

    private

    def born_at
      # metrics can be older than resource (first capture may go few days back)
      [super, first_metric_rollup_record.timestamp].compact.min
    end

    def values(metric)
      @values ||= {}
      @values[metric] ||= @rollups.collect(&metric.to_sym).compact
    end

    def first_metric_rollup_record
      @fmrr ||= @rollups.first
    end
  end
end
class Chargeback
  class ConsumptionWithRollups < Consumption
    delegate :timestamp,
             :resource,
             :resource_id,
             :resource_name,
             :resource_type,
             :parent_ems,
             :hash_features_affecting_rate,
             :parents_determining_rate,
             :to => :first_metric_rollup_record

    def initialize(metric_rollup_records, start_time, end_time)
      super(start_time, end_time)
      @rollups = metric_rollup_records
    end

    def tag_names
      first_metric_rollup_record.tag_names.split('|')
    end

    def tag_list_with_prefix
      @tag_list_with_prefix ||= @rollups.map(&:tag_list_with_prefix).flatten.uniq
    end

    def max(metric)
      values(metric).max
    end

    def avg(metric)
      metric_sum = values(metric).sum
      metric_sum / consumed_hours_in_interval
    end

    def none?(metric)
      values(metric).empty?
    end

    def chargeback_fields_present
      @chargeback_fields_present ||= @rollups.count(&:chargeback_fields_present?)
    end

    private

    def born_at
      # metrics can be older than resource (first capture may go few days back)
      [super, first_metric_rollup_record.timestamp].compact.min
    end

    def values(metric)
      @values ||= {}
      @values[metric] ||= @rollups.collect(&metric.to_sym).compact
    end

    def first_metric_rollup_record
      @fmrr ||= @rollups.first
    end
  end
end
Create tag list from all metric rollups of consumption period
Create tag list from all metric rollups of consumption period Previously, tag list was determined from first metric rollup of the period but it is not enough to use tag list only from first rollup and tag list can be changed during period. Now we are using tag list from metric rollup records of consumption period. Source: https://github.com/ManageIQ/manageiq/pull/15857
Ruby
apache-2.0
ManageIQ/manageiq,jntullo/manageiq,juliancheal/manageiq,pkomanek/manageiq,djberg96/manageiq,chessbyte/manageiq,d-m-u/manageiq,borod108/manageiq,mzazrivec/manageiq,jvlcek/manageiq,gerikis/manageiq,andyvesel/manageiq,kbrock/manageiq,tzumainn/manageiq,billfitzgerald0120/manageiq,djberg96/manageiq,tinaafitz/manageiq,mkanoor/manageiq,NickLaMuro/manageiq,jntullo/manageiq,romanblanco/manageiq,mzazrivec/manageiq,gerikis/manageiq,pkomanek/manageiq,d-m-u/manageiq,syncrou/manageiq,mresti/manageiq,gerikis/manageiq,josejulio/manageiq,tinaafitz/manageiq,hstastna/manageiq,hstastna/manageiq,aufi/manageiq,yaacov/manageiq,kbrock/manageiq,jrafanie/manageiq,djberg96/manageiq,agrare/manageiq,israel-hdez/manageiq,gerikis/manageiq,andyvesel/manageiq,jrafanie/manageiq,israel-hdez/manageiq,syncrou/manageiq,aufi/manageiq,skateman/manageiq,branic/manageiq,ManageIQ/manageiq,romanblanco/manageiq,jrafanie/manageiq,yaacov/manageiq,juliancheal/manageiq,gmcculloug/manageiq,jvlcek/manageiq,chessbyte/manageiq,jrafanie/manageiq,juliancheal/manageiq,israel-hdez/manageiq,tzumainn/manageiq,kbrock/manageiq,durandom/manageiq,ailisp/manageiq,lpichler/manageiq,tinaafitz/manageiq,skateman/manageiq,mzazrivec/manageiq,jvlcek/manageiq,mresti/manageiq,josejulio/manageiq,jameswnl/manageiq,tzumainn/manageiq,lpichler/manageiq,borod108/manageiq,billfitzgerald0120/manageiq,jameswnl/manageiq,pkomanek/manageiq,d-m-u/manageiq,NickLaMuro/manageiq,jameswnl/manageiq,mzazrivec/manageiq,mresti/manageiq,ManageIQ/manageiq,billfitzgerald0120/manageiq,kbrock/manageiq,juliancheal/manageiq,hstastna/manageiq,tinaafitz/manageiq,pkomanek/manageiq,durandom/manageiq,gmcculloug/manageiq,durandom/manageiq,ailisp/manageiq,jameswnl/manageiq,billfitzgerald0120/manageiq,israel-hdez/manageiq,syncrou/manageiq,d-m-u/manageiq,jntullo/manageiq,romanblanco/manageiq,tzumainn/manageiq,branic/manageiq,andyvesel/manageiq,djberg96/manageiq,mkanoor/manageiq,branic/manageiq,borod108/manageiq,josejulio/manageiq,aufi/manageiq,skateman/manageiq,agrare/manageiq,andyvesel/manageiq,mkanoor/manageiq,aufi/manageiq,gmcculloug/manageiq,lpichler/manageiq,mkanoor/manageiq,agrare/manageiq,chessbyte/manageiq,jvlcek/manageiq,ailisp/manageiq,durandom/manageiq,josejulio/manageiq,lpichler/manageiq,NickLaMuro/manageiq,NickLaMuro/manageiq,yaacov/manageiq,hstastna/manageiq,mresti/manageiq,yaacov/manageiq,agrare/manageiq,branic/manageiq,chessbyte/manageiq,ManageIQ/manageiq,gmcculloug/manageiq,skateman/manageiq,borod108/manageiq,romanblanco/manageiq,ailisp/manageiq,syncrou/manageiq,jntullo/manageiq
ruby
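The Ruby change above replaces a delegate to the first rollup with `@rollups.map(&:tag_list_with_prefix).flatten.uniq`, so tags from every rollup in the period are merged. The same flatten-and-deduplicate aggregation, sketched in Python with hypothetical per-rollup tag lists:

```python
from itertools import chain


def tag_list_with_prefix(rollup_tag_lists):
    # Merge the tag lists of every rollup in the period, dropping
    # duplicates while keeping first-seen order (Ruby's Array#uniq
    # is order-preserving too).
    return list(dict.fromkeys(chain.from_iterable(rollup_tag_lists)))


# e.g. tag_list_with_prefix([["vm/env/prod"], ["vm/env/prod", "vm/env/dev"]])
# -> ["vm/env/prod", "vm/env/dev"]
```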
b2fcda1c15fa5c0ad33d29417c142e80fcf39eb6
release.sh
release.sh
set -e

echo "Enter release version: "
read VERSION

read -p "Releasing $VERSION - are you sure? (y/n)" -n 1 -r
echo # (optional) move to a new line
if [[ $REPLY =~ ^[Yy]$ ]]
then
  echo "Releasing $VERSION ..."

  # lint and test
  npm run lint 2>/dev/null
  npm test 2>/dev/null

  # build
  npm run build

  # commit
  git add -A
  git commit -m "Build for $VERSION"
  npm version $VERSION -m "Upgrade to $VERSION"

  # publish
  npm publish --tag next

  echo "Publish $VERSION successfully!"

  read -p "Upgrade GitHub Pages? (y/n)" -n 1 -r
  echo # (optional) move to a new line
  if [[ $REPLY =~ ^[Yy]$ ]]
  then
    echo "Upgrade GitHub Pages ..."

    # upgrade GitHub Pages
    git checkout gh-pages
    npm install vue-infinite-loading@$VERSION --save
    VERSION=$VERSION npm run build
    git add -A
    git commit -m "Upgrade GitHub Pages to $VERSION"
    git checkout master
  fi

  # push
  git push

  echo "Done!"
fi
set -e

echo "Enter release version: "
read VERSION

read -p "Releasing $VERSION - are you sure? (y/n)" -n 1 -r
echo # (optional) move to a new line
if [[ $REPLY =~ ^[Yy]$ ]]
then
  echo "Releasing $VERSION ..."

  # lint and test
  npm run lint 2>/dev/null
  npm test 2>/dev/null

  # build
  npm run build

  # commit
  git add -A
  git commit -m "Build for $VERSION"
  npm version $VERSION -m "Upgrade to $VERSION"

  # publish
  npm publish

  echo "Publish $VERSION successfully!"

  read -p "Upgrade GitHub Pages? (y/n)" -n 1 -r
  echo # (optional) move to a new line
  if [[ $REPLY =~ ^[Yy]$ ]]
  then
    echo "Upgrade GitHub Pages ..."

    # upgrade GitHub Pages
    git checkout gh-pages
    npm install vue-infinite-loading@$VERSION --save
    VERSION=$VERSION npm run build
    git add -A
    git commit -m "Upgrade GitHub Pages to $VERSION"
    git checkout master
  fi

  # push
  git push

  echo "Done!"
fi
Remove the next tag when publish to npm
Remove the next tag when publish to npm
Shell
mit
PeachScript/vue-infinite-loading,PeachScript/vue-infinite-loading
shell
b0f8a9f7dea1565df420908b15780d0f28a6245f
network_roles.yaml
network_roles.yaml
- id: 'detach_keystone_vip'
  default_mapping: 'management'
  properties:
    subnet: true
    gateway: false
    vip:
      - name: 'service_endpoint'
        namespace: 'haproxy'
- id: 'detach_keystone_vip'
  default_mapping: 'management'
  node_roles:
    - standalone-keystone
    - primary-standalone-keystone
  properties:
    subnet: true
    gateway: false
    vip:
      - name: 'service_endpoint'
        namespace: 'haproxy'
- id: 'detach_keystone_public_vip'
  default_mapping: 'public'
  node_roles:
    - standalone-keystone
    - primary-standalone-keystone
  properties:
    subnet: true
    gateway: true
    vip:
      - name: 'public_service_endpoint'
        namespace: 'haproxy'
Add public vip and node_roles to vips
Add public vip and node_roles to vips
YAML
apache-2.0
Barthalion/detach-keystone,mattymo/detach-keystone
yaml
5463ae4f75aec81fc666eb0ae1d70d4abd2fec08
README.md
README.md
zanata-wildfly contains files which help Zanata (3.5 or later) to run on WildFly.

Author: Sean Flanigan <sflaniga@redhat.com>

The modules contain repackaged copies of Mojarra 2.1 and Hibernate 4.2. The zip files are meant to be extracted into a WildFly installation, which will add some Mojarra 2.1 modules and replace the 'main' Hibernate module with Hibernate 4.2.

The 'main' Hibernate module is used by default, but the Mojarra modules need to be activated by a line like this in `standalone.xml`:

    <subsystem xmlns="urn:jboss:domain:jsf:1.0" default-jsf-impl-slot="mojarra-2.1.28"/>

(NB: The slot name needs to match the version of Mojarra.)

The `standalone` directory (in the source tree) contains configuration for WildFly which might help you to run Zanata. Note: these files may or may not be kept up to date, and might be removed in future.

The build scripts are licensed under LGPL 2.1, but Mojarra and Hibernate retain their original licences.

The source code for zanata-wildfly lives here: https://github.com/zanata/zanata-wildfly

Hibernate: http://hibernate.org/
Mojarra: https://javaserverfaces.java.net/
WildFly: http://wildfly.org/
Zanata: http://zanata.org/
zanata-wildfly contains files which help Zanata (3.5 or later) to run on WildFly.

Author: Sean Flanigan <sflaniga@redhat.com>

The modules contain repackaged copies of Mojarra 2.1 and Hibernate 4.2. The zip files are meant to be extracted into a WildFly installation, which will add some Mojarra 2.1 modules and replace the 'main' Hibernate module with Hibernate 4.2.

The 'main' Hibernate module is used by default, but the Mojarra modules need to be activated by a line like this in `standalone.xml`:

    <subsystem xmlns="urn:jboss:domain:jsf:1.0" default-jsf-impl-slot="mojarra-2.1.28"/>

(NB: The slot name needs to match the version of Mojarra.)

The `standalone` directory (in the source tree) contains configuration for WildFly which might help you to run Zanata. Note: these files may or may not be kept up to date, and might be removed in future.

The build scripts are licensed under LGPL 2.1, but Mojarra and Hibernate retain their original licences.

The source code for zanata-wildfly lives here: https://github.com/zanata/zanata-wildfly

Module binaries: https://sourceforge.net/projects/zanata/files/wildfly/

Hibernate: http://hibernate.org/
Mojarra: https://javaserverfaces.java.net/
WildFly: http://wildfly.org/
Zanata: http://zanata.org/
Add link to module binaries
Add link to module binaries
Markdown
lgpl-2.1
zanata/zanata-wildfly
markdown
43e48b5ad15e1077698d2ceca56024e8ee08eede
app/controllers/stations.php
app/controllers/stations.php
<?php if ( ! defined('BASEPATH')) exit('No direct script access allowed');

class Stations extends CI_Controller {

    public function __construct()
    {
        parent::__construct();
        $this->load->library('stationsFetcher');
    }

    public function _remap($method)
    {
        $trainNo = intval($method);
        if ($trainNo > 0) {
            $this->_stations($trainNo);
        } else {
            show_error('400 Bad Request', 400);
        }
    }

    private function _stations($trainNo)
    {
        $data = $this->stationsfetcher->getStations($trainNo);
        $data['generated_time'] = date('H:i');
        $json = json_encode($data);

        $this->output->set_header('Cache-Control: no-cache, must-revalidate');
        $this->output->set_header('Pragma: no-cache');

        if ($this->input->get('callback')) {
            $this->output->set_content_type('application/javascript');
            $this->output->set_output("{$this->input->get('callback')}($json)");
        } else {
            $this->output->set_content_type('application/json');
            $this->output->set_output($json);
        }
    }
}
<?php if ( ! defined('BASEPATH')) exit('No direct script access allowed');

class Stations extends CI_Controller {

    public function __construct()
    {
        parent::__construct();
        $this->load->library('stationsFetcher');
        date_default_timezone_set('Europe/Zagreb');
    }

    public function _remap($method)
    {
        $trainNo = intval($method);
        if ($trainNo > 0) {
            $this->_stations($trainNo);
        } else {
            show_error('400 Bad Request', 400);
        }
    }

    private function _stations($trainNo)
    {
        $data = $this->stationsfetcher->getStations($trainNo);
        $data['generated_time'] = date('H:i');
        $json = json_encode($data);

        $this->output->set_header('Cache-Control: no-cache, must-revalidate');
        $this->output->set_header('Pragma: no-cache');

        if ($this->input->get('callback')) {
            $this->output->set_content_type('application/javascript');
            $this->output->set_output("{$this->input->get('callback')}($json)");
        } else {
            $this->output->set_content_type('application/json');
            $this->output->set_output($json);
        }
    }
}
Declare default timezone in controllers which use date function
Declare default timezone in controllers which use date function
PHP
apache-2.0
traintracker/traintracker,traintracker/traintracker,traintracker/traintracker
php
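The PHP fix pins the process-wide timezone before formatting with `date('H:i')`. For comparison, the analogous explicit-timezone formatting in Python — the Zagreb zone name is taken from the record above; this helper is an illustration, not part of the TrainTracker codebase:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo


def generated_time(now_utc):
    # Render an HH:MM wall-clock string in Europe/Zagreb, mirroring
    # date_default_timezone_set('Europe/Zagreb') followed by date('H:i').
    return now_utc.astimezone(ZoneInfo("Europe/Zagreb")).strftime("%H:%M")


# Zagreb is UTC+1 in winter (CET) and UTC+2 in summer (CEST).
```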
b68ed5d594d37fa0c76e1592805f64e3030b1ca4
api/client-server/definitions/room_event_filter.yaml
api/client-server/definitions/room_event_filter.yaml
allOf:
  - $ref: event_filter.yaml
  - type: object
    title: RoomEventFilter
    properties:
      not_rooms:
        description: A list of room IDs to exclude. If this list is absent then no rooms are excluded. A matching room will be excluded even if it is listed in the ``'rooms'`` filter.
        items:
          type: string
        type: array
      rooms:
        description: A list of room IDs to include. If this list is absent then all rooms are included.
        items:
          type: string
        type: array
      contains_url:
        type: boolean
        description: If ``true``, includes only events with a url key in their content. If ``false``, excludes those events.
allOf:
  - $ref: event_filter.yaml
  - type: object
    title: RoomEventFilter
    properties:
      not_rooms:
        description: A list of room IDs to exclude. If this list is absent then no rooms are excluded. A matching room will be excluded even if it is listed in the ``'rooms'`` filter.
        items:
          type: string
        type: array
      rooms:
        description: A list of room IDs to include. If this list is absent then all rooms are included.
        items:
          type: string
        type: array
      contains_url:
        type: boolean
        description: If ``true``, includes only events with a ``url`` key in their content. If ``false``, excludes those events. Defaults to ``false``.
Define the default for the contains_url filter param
Define the default for the contains_url filter param Fixes https://github.com/matrix-org/matrix-doc/issues/1553
YAML
apache-2.0
matrix-org/matrix-doc,matrix-org/matrix-doc,matrix-org/matrix-doc,matrix-org/matrix-doc
yaml
bb22a4aa338d591ec53ed560d9ffbc2e538d8e86
README.md
README.md
A google chrome extension that add apathetic ascii art comments to reddit.

Creators:
Paul Etscheit
Tal Schwartz
Mohamed Shibl
Catie Kennedy
Jupiter Baudot
Jake Cross
A google chrome extension that add apathetic ascii art comments to reddit.

## Where to get Appathy
Appathy is available on the Google Chrome Store at [this](https://chrome.google.com/webstore/search/appathy?hl=en-US) link.

## How does it work?
1. Once you've installed the app by downloading it on the Chrome Store, go to [reddit](reddit.com).
2. Sign in to reddit or, if you don't have an account, sign up.
3. Click on the comment section of any reddit thread.
4. Give the page a bit to load and you'll see 2 buttons ('whatever' & 'so what') appear next to the save button under the blank comment box intended for the user to fill out.
   (a) Click the 'whatever' button and you will see a random assortment of single line ASCII images appear in the comment box(mostly middle fingers).
   (b) Click the 'so what' button and a list of emotions will drop down. Select one of these and an ASCII image will appear in the comment box.
5. When you've selected the desired image, click save. You should then see the ASCII art displayed on the comment thread.

--------------------------------------------------------------

Creators: Jupiter Baudot, Paul Etscheit, Tal Schwartz, Mohamed Shibl, Catie Kennedy, Jake Cross
Update ReadMe with instructions on how to use Appathy'
Update ReadMe with instructions on how to use Appathy'
Markdown
mit
JupiterLikeThePlanet/Appathy,JupiterLikeThePlanet/Appathy
markdown
884c4d30321b91cf98f98a39005a30adef8a66a0
_python/main/templates/admin/main/user/change_form.html
_python/main/templates/admin/main/user/change_form.html
{% extends "admin/change_form.html" %}
{% load admin_urls %}

{% block after_related_objects %}
    {{ block.super }}
    <div class="module aligned">
        <h2>Casebooks</h2>
        <div class="form-row">
            <a href="{% url 'admin:main_casebook_changelist' %}?collaborator-id={{ original.id }}">
                View {{ original.attribution }}'s Casebooks
            </a>
        </div>
    </div>
{% endblock %}
{% extends "admin/change_form.html" %} {% load admin_urls %} {% block object-tools %} {{ block.super }} <div class="module"> <form id="send_email" method="POST" action="{% url 'password_reset' %}" onsubmit="event.preventDefault(); fetch(this.action, {method: 'POST', body: new FormData(this)}).then(data => document.getElementById('email_result').textContent = (data.ok ? 'Success!' : 'Error!'));"> {% csrf_token %} <input type="hidden" name="email" value="{{ original.email_address }}"> <button class="button" type="submit" value="Upload file and update statuses" style="padding: 7px 15px;">Send {{ original.verified_email|yesno:"Password Reset, Activation"}} Email</button> <div id="email_result" class="help"></p> </form> </div> {% endblock %} {% block after_related_objects %} {{ block.super }} <div class="module aligned"> <h2>Casebooks</h2> <div class="form-row"> <a href="{% url 'admin:main_casebook_changelist' %}?collaborator-id={{ original.id }}"> View {{ original.attribution }}'s Casebooks </a> </div> </div> {% endblock %}
Add a button to send the email from the Django admin.
Add a button to send the email from the Django admin.
HTML
agpl-3.0
harvard-lil/h2o,harvard-lil/h2o,harvard-lil/h2o,harvard-lil/h2o
html
## Code Before: {% extends "admin/change_form.html" %} {% load admin_urls %} {% block after_related_objects %} {{ block.super }} <div class="module aligned"> <h2>Casebooks</h2> <div class="form-row"> <a href="{% url 'admin:main_casebook_changelist' %}?collaborator-id={{ original.id }}"> View {{ original.attribution }}'s Casebooks </a> </div> </div> {% endblock %} ## Instruction: Add a button to send the email from the Django admin. ## Code After: {% extends "admin/change_form.html" %} {% load admin_urls %} {% block object-tools %} {{ block.super }} <div class="module"> <form id="send_email" method="POST" action="{% url 'password_reset' %}" onsubmit="event.preventDefault(); fetch(this.action, {method: 'POST', body: new FormData(this)}).then(data => document.getElementById('email_result').textContent = (data.ok ? 'Success!' : 'Error!'));"> {% csrf_token %} <input type="hidden" name="email" value="{{ original.email_address }}"> <button class="button" type="submit" value="Upload file and update statuses" style="padding: 7px 15px;">Send {{ original.verified_email|yesno:"Password Reset, Activation"}} Email</button> <div id="email_result" class="help"></p> </form> </div> {% endblock %} {% block after_related_objects %} {{ block.super }} <div class="module aligned"> <h2>Casebooks</h2> <div class="form-row"> <a href="{% url 'admin:main_casebook_changelist' %}?collaborator-id={{ original.id }}"> View {{ original.attribution }}'s Casebooks </a> </div> </div> {% endblock %}
94d9c84cae1df1392f656a013807a29fe37d1491
category/templates/category/category_detail.html
category/templates/category/category_detail.html
{% extends "base.html" %} {% load i18n wagtailimages_tags %} {% block meta_title %}{{ category.name }}{% if filter %} - {{ filter }}{% endif %}{% endblock %} {% block title %}{{ category.name }}{% endblock %} {% block extra_js %} <script type="text/javascript" src="{{ STATIC_URL }}article/js/category.js"></script> {% endblock %} {% block extra_css %} <link type="text/css" rel="stylesheet" href="{{ STATIC_URL }}album/css/magnific-popup.css"> <link type="text/less" rel="stylesheet" href="{{ STATIC_URL }}css/category.less"> {% endblock %} {% block breadcrumb_menu %} <li class="active">{% trans "{{ category.name }}" %}</li> {% endblock %} {% block main %} <div class="jumbotron category-header"> <div class="row"> <div class="col-lg-9 col-md-9 col-sm-9"> <div class="category-title"> {{ category.name }} </div> <div class="category-description"> {{ category.description }} </div> </div> </div> </div> <div class="filter-list-container" data-title="{{ category.name }}" data-filter-endpoint="category_article_filter" data-filter-required-args-category="{{ category.id }}"> {% include "article/includes/article_list.html" with articles=articles category=category %} </div> {% endblock %}
{% extends "base.html" %}{% load i18n %} {% load i18n wagtailimages_tags %} {% block meta_title %}{{ category.name }}{% if filter %} - {{ filter }}{% endif %} - {% trans "Page" %} {{ articles.number }}{% endblock %} {% block title %}{{ category.name }}{% endblock %} {% block extra_js %} <script type="text/javascript" src="{{ STATIC_URL }}article/js/category.js"></script> {% endblock %} {% block extra_css %} <link type="text/css" rel="stylesheet" href="{{ STATIC_URL }}album/css/magnific-popup.css"> <link type="text/less" rel="stylesheet" href="{{ STATIC_URL }}css/category.less"> {% endblock %} {% block breadcrumb_menu %} <li class="active">{% trans "{{ category.name }}" %}</li> {% endblock %} {% block main %} <div class="jumbotron category-header"> <div class="row"> <div class="col-lg-9 col-md-9 col-sm-9"> <div class="category-title"> {{ category.name }} </div> <div class="category-description"> {{ category.description }} </div> </div> </div> </div> <div class="filter-list-container" data-title="{{ category.name }}" data-filter-endpoint="category_article_filter" data-filter-required-args-category="{{ category.id }}"> {% include "article/includes/article_list.html" with articles=articles category=category %} </div> {% endblock %}
Add page number to reduce page title duplication.
Add page number to reduce page title duplication.
HTML
bsd-3-clause
PARINetwork/pari,PARINetwork/pari,PARINetwork/pari,PARINetwork/pari
html
## Code Before: {% extends "base.html" %} {% load i18n wagtailimages_tags %} {% block meta_title %}{{ category.name }}{% if filter %} - {{ filter }}{% endif %}{% endblock %} {% block title %}{{ category.name }}{% endblock %} {% block extra_js %} <script type="text/javascript" src="{{ STATIC_URL }}article/js/category.js"></script> {% endblock %} {% block extra_css %} <link type="text/css" rel="stylesheet" href="{{ STATIC_URL }}album/css/magnific-popup.css"> <link type="text/less" rel="stylesheet" href="{{ STATIC_URL }}css/category.less"> {% endblock %} {% block breadcrumb_menu %} <li class="active">{% trans "{{ category.name }}" %}</li> {% endblock %} {% block main %} <div class="jumbotron category-header"> <div class="row"> <div class="col-lg-9 col-md-9 col-sm-9"> <div class="category-title"> {{ category.name }} </div> <div class="category-description"> {{ category.description }} </div> </div> </div> </div> <div class="filter-list-container" data-title="{{ category.name }}" data-filter-endpoint="category_article_filter" data-filter-required-args-category="{{ category.id }}"> {% include "article/includes/article_list.html" with articles=articles category=category %} </div> {% endblock %} ## Instruction: Add page number to reduce page title duplication. ## Code After: {% extends "base.html" %}{% load i18n %} {% load i18n wagtailimages_tags %} {% block meta_title %}{{ category.name }}{% if filter %} - {{ filter }}{% endif %} - {% trans "Page" %} {{ articles.number }}{% endblock %} {% block title %}{{ category.name }}{% endblock %} {% block extra_js %} <script type="text/javascript" src="{{ STATIC_URL }}article/js/category.js"></script> {% endblock %} {% block extra_css %} <link type="text/css" rel="stylesheet" href="{{ STATIC_URL }}album/css/magnific-popup.css"> <link type="text/less" rel="stylesheet" href="{{ STATIC_URL }}css/category.less"> {% endblock %} {% block breadcrumb_menu %} <li class="active">{% trans "{{ category.name }}" %}</li> {% endblock %} {% block main %} <div class="jumbotron category-header"> <div class="row"> <div class="col-lg-9 col-md-9 col-sm-9"> <div class="category-title"> {{ category.name }} </div> <div class="category-description"> {{ category.description }} </div> </div> </div> </div> <div class="filter-list-container" data-title="{{ category.name }}" data-filter-endpoint="category_article_filter" data-filter-required-args-category="{{ category.id }}"> {% include "article/includes/article_list.html" with articles=articles category=category %} </div> {% endblock %}
1e9fb6337d94f80fd9ce1c024651013df9dd26eb
rpi/roles/fde/files/initramfs-rebuild.sh
rpi/roles/fde/files/initramfs-rebuild.sh
if grep -q '\bsplash\b' /boot/cmdline.txt; then sed -i 's/ \?splash \?/ /' /boot/cmdline.txt fi # Exit if not building kernel for this Raspberry Pi's hardware version. version="$1" current_version="$(uname -r)" case "${current_version}" in *-v7+) case "${version}" in *-v7+) ;; *) exit 0 esac ;; *+) case "${version}" in *-v7+) exit 0 ;; esac ;; esac # Exit if rebuild cannot be performed or not needed. [ -x /usr/sbin/mkinitramfs ] || exit 0 [ -f /boot/initramfs.gz ] || exit 0 lsinitramfs /boot/initramfs.gz |grep -q "/$version$" && exit 0 # Already in initramfs. # Rebuild. mkinitramfs -o /boot/initramfs.gz "$version"
if grep -q '\bsplash\b' /boot/cmdline.txt; then sed -i 's/ \?splash \?/ /' /boot/cmdline.txt fi # Exit if not building kernel for this Raspberry Pi's hardware version. version="$1" current_version="$(uname -r)" case "${current_version}" in *-v7l+) case "${version}" in *-v7l+) ;; *) exit 0 ;; esac ;; *-v7+) case "${version}" in *-v7+) ;; *) exit 0 ;; esac ;; *+) case "${version}" in *-v7l+) exit 0 ;; *-v7+) exit 0 ;; esac ;; esac # Exit if rebuild cannot be performed or not needed. [ -x /usr/sbin/mkinitramfs ] || exit 0 [ -f /boot/initramfs.gz ] || exit 0 lsinitramfs /boot/initramfs.gz |grep -q "/$version$" && exit 0 # Already in initramfs. # Rebuild. mkinitramfs -o /boot/initramfs.gz "$version"
Add support for raspberry pi 4 kernel
Add support for raspberry pi 4 kernel Running `uname -r` on my raspberry pi 4 model b results in *-v7l+ so we need to add explicit support for it to the initramfs rebuild.
Shell
mpl-2.0
mfinelli/arch-install
shell
## Code Before: if grep -q '\bsplash\b' /boot/cmdline.txt; then sed -i 's/ \?splash \?/ /' /boot/cmdline.txt fi # Exit if not building kernel for this Raspberry Pi's hardware version. version="$1" current_version="$(uname -r)" case "${current_version}" in *-v7+) case "${version}" in *-v7+) ;; *) exit 0 esac ;; *+) case "${version}" in *-v7+) exit 0 ;; esac ;; esac # Exit if rebuild cannot be performed or not needed. [ -x /usr/sbin/mkinitramfs ] || exit 0 [ -f /boot/initramfs.gz ] || exit 0 lsinitramfs /boot/initramfs.gz |grep -q "/$version$" && exit 0 # Already in initramfs. # Rebuild. mkinitramfs -o /boot/initramfs.gz "$version" ## Instruction: Add support for raspberry pi 4 kernel Running `uname -r` on my raspberry pi 4 model b results in *-v7l+ so we need to add explicit support for it to the initramfs rebuild. ## Code After: if grep -q '\bsplash\b' /boot/cmdline.txt; then sed -i 's/ \?splash \?/ /' /boot/cmdline.txt fi # Exit if not building kernel for this Raspberry Pi's hardware version. version="$1" current_version="$(uname -r)" case "${current_version}" in *-v7l+) case "${version}" in *-v7l+) ;; *) exit 0 ;; esac ;; *-v7+) case "${version}" in *-v7+) ;; *) exit 0 ;; esac ;; *+) case "${version}" in *-v7l+) exit 0 ;; *-v7+) exit 0 ;; esac ;; esac # Exit if rebuild cannot be performed or not needed. [ -x /usr/sbin/mkinitramfs ] || exit 0 [ -f /boot/initramfs.gz ] || exit 0 lsinitramfs /boot/initramfs.gz |grep -q "/$version$" && exit 0 # Already in initramfs. # Rebuild. mkinitramfs -o /boot/initramfs.gz "$version"
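The shell `case` logic in this record keys rebuilds off the kernel-name suffix (`-v7l+` for Pi 4, `-v7+` for Pi 2/3, bare `+` otherwise). A minimal Python sketch of that family-matching rule (the function name and suffix table are my own, for illustration only, not part of the repo):

```python
def same_pi_family(current_version, target_version):
    # Mirrors the case-statement in the script: a rebuild only makes
    # sense when the target kernel's suffix matches the running
    # kernel's hardware family.
    def family(version):
        # Check the longer suffixes first so '-v7l+' is not
        # mistaken for a plain '+' kernel.
        for suffix in ('-v7l+', '-v7+'):
            if version.endswith(suffix):
                return suffix
        return '+'
    return family(current_version) == family(target_version)
```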
b268db731238959ae592f6a0554352b5d9652808
related_posts/Readme.rst
related_posts/Readme.rst
Related posts ------------- This plugin adds the ``related_posts`` variable to the article's context. By default, up to 5 articles are listed. You can customize this value by defining ``RELATED_POSTS_MAX`` in your settings file:: RELATED_POSTS_MAX = 10 You can then use the ``article.related_posts`` variable in your templates. For example:: {% if article.related_posts %} <ul> {% for related_post in article.related_posts %} <li><a href="{{ SITEURL }}/{{ related_post.url }}">{{ related_post.title }}</a></li> {% endfor %} </ul> {% endif %} You can also override related posts by using it as part of your post's meta data 'related_posts:'. The 'related_posts:' meta data works together with your existing slugs: related_posts: slug1,slug2,slug3...slugN N represents the RELATED_POSTS_MAX
Related posts ------------- This plugin adds the ``related_posts`` variable to the article's context. By default, up to 5 articles are listed. You can customize this value by defining ``RELATED_POSTS_MAX`` in your settings file:: RELATED_POSTS_MAX = 10 You can then use the ``article.related_posts`` variable in your templates. For example:: {% if article.related_posts %} <ul> {% for related_post in article.related_posts %} <li><a href="{{ SITEURL }}/{{ related_post.url }}">{{ related_post.title }}</a></li> {% endfor %} </ul> {% endif %} Your related posts should share a common tag. You can also use ``related_posts:`` in your post's meta data. The 'related_posts:' meta data works together with your existing slugs: related_posts: slug1,slug2,slug3...slugN N represents the RELATED_POSTS_MAX
Clarify documentation of related_posts plugin
Clarify documentation of related_posts plugin It is not clear how the plugin determines which posts are related.
reStructuredText
agpl-3.0
barrysteyn/pelican-plugins,wilsonfreitas/pelican-plugins,talha131/pelican-plugins,andreas-h/pelican-plugins,talha131/pelican-plugins,frickp/pelican-plugins,xsteadfastx/pelican-plugins,ziaa/pelican-plugins,barrysteyn/pelican-plugins,talha131/pelican-plugins,doctorwidget/pelican-plugins,makefu/pelican-plugins,Samael500/pelican-plugins,xsteadfastx/pelican-plugins,ingwinlu/pelican-plugins,makefu/pelican-plugins,makefu/pelican-plugins,FuzzJunket/pelican-plugins,mitchins/pelican-plugins,lindzey/pelican-plugins,proteansec/pelican-plugins,UHBiocomputation/pelican-plugins,barrysteyn/pelican-plugins,karya0/pelican-plugins,prisae/pelican-plugins,clokep/pelican-plugins,proteansec/pelican-plugins,lazycoder-ru/pelican-plugins,ziaa/pelican-plugins,publicus/pelican-plugins,lele1122/pelican-plugins,goerz/pelican-plugins,pelson/pelican-plugins,wilsonfreitas/pelican-plugins,lindzey/pelican-plugins,xsteadfastx/pelican-plugins,UHBiocomputation/pelican-plugins,samueljohn/pelican-plugins,cmacmackin/pelican-plugins,MarkusH/pelican-plugins,florianjacob/pelican-plugins,cmacmackin/pelican-plugins,samueljohn/pelican-plugins,pestrickland/pelican-plugins,frickp/pelican-plugins,Xion/pelican-plugins,mortada/pelican-plugins,clokep/pelican-plugins,pestrickland/pelican-plugins,wilsonfreitas/pelican-plugins,pelson/pelican-plugins,shireenrao/pelican-plugins,mortada/pelican-plugins,lazycoder-ru/pelican-plugins,prisae/pelican-plugins,amitsaha/pelican-plugins,danmackinlay/pelican-plugins,andreas-h/pelican-plugins,M157q/pelican-plugins,ziaa/pelican-plugins,jakevdp/pelican-plugins,Neurita/pelican-plugins,joachimneu/pelican-plugins,howthebodyworks/pelican-plugins,shireenrao/pelican-plugins,lele1122/pelican-plugins,mwcz/pelican-plugins,lindzey/pelican-plugins,mikitex70/pelican-plugins,MarkusH/pelican-plugins,florianjacob/pelican-plugins,yuanboshe/pelican-plugins,yuanboshe/pelican-plugins,phrawzty/pelican-plugins,talha131/pelican-plugins,farseerfc/pelican-plugins,UHBiocomputation/pelican-plugins,mwcz/pelican-plugins,rlaboiss/pelican-plugins,danmackinlay/pelican-plugins,if1live/pelican-plugins,olgabot/pelican-plugins,cmacmackin/pelican-plugins,danmackinlay/pelican-plugins,amitsaha/pelican-plugins,doctorwidget/pelican-plugins,jfosorio/pelican-plugins,phrawzty/pelican-plugins,jfosorio/pelican-plugins,jcdubacq/pelican-plugins,if1live/pelican-plugins,shireenrao/pelican-plugins,Neurita/pelican-plugins,if1live/pelican-plugins,mortada/pelican-plugins,prisae/pelican-plugins,xsteadfastx/pelican-plugins,kdheepak89/pelican-plugins,gjreda/pelican-plugins,gjreda/pelican-plugins,goerz/pelican-plugins,Neurita/pelican-plugins,lindzey/pelican-plugins,proteansec/pelican-plugins,Xion/pelican-plugins,barrysteyn/pelican-plugins,jfosorio/pelican-plugins,FuzzJunket/pelican-plugins,mwcz/pelican-plugins,jakevdp/pelican-plugins,farseerfc/pelican-plugins,benjaminabel/pelican-plugins,Samael500/pelican-plugins,lele1122/pelican-plugins,publicus/pelican-plugins,clokep/pelican-plugins,ingwinlu/pelican-plugins,Xion/pelican-plugins,andreas-h/pelican-plugins,gw0/pelican-plugins,seandavi/pelican-plugins,seandavi/pelican-plugins,clokep/pelican-plugins,prisae/pelican-plugins,howthebodyworks/pelican-plugins,yuanboshe/pelican-plugins,doctorwidget/pelican-plugins,davidmarquis/pelican-plugins,talha131/pelican-plugins,ingwinlu/pelican-plugins,olgabot/pelican-plugins,mitchins/pelican-plugins,jcdubacq/pelican-plugins,makefu/pelican-plugins,lazycoder-ru/pelican-plugins,olgabot/pelican-plugins,andreas-h/pelican-plugins,pelson/pelican-plugins,jfosorio/pelican-plugins,mikitex70/pelican-plugins,seandavi/pelican-plugins,kdheepak89/pelican-plugins,M157q/pelican-plugins,pxquim/pelican-plugins,jakevdp/pelican-plugins,pelson/pelican-plugins,phrawzty/pelican-plugins,davidmarquis/pelican-plugins,cmacmackin/pelican-plugins,amitsaha/pelican-plugins,samueljohn/pelican-plugins,doctorwidget/pelican-plugins,mwcz/pelican-plugins,jantman/pelican-plugins,rlaboiss/pelican-plugins,MarkusH/pelican-plugins,jprine/pelican-plugins,pestrickland/pelican-plugins,lele1122/pelican-plugins,Samael500/pelican-plugins,frickp/pelican-plugins,kdheepak89/pelican-plugins,jakevdp/pelican-plugins,joachimneu/pelican-plugins,goerz/pelican-plugins,Xion/pelican-plugins,farseerfc/pelican-plugins,znegva/pelican-plugins,mitchins/pelican-plugins,M157q/pelican-plugins,jantman/pelican-plugins,farseerfc/pelican-plugins,Neurita/pelican-plugins,gjreda/pelican-plugins,mitchins/pelican-plugins,FuzzJunket/pelican-plugins,cctags/pelican-plugins,publicus/pelican-plugins,rlaboiss/pelican-plugins,Samael500/pelican-plugins,olgabot/pelican-plugins,joachimneu/pelican-plugins,cctags/pelican-plugins,if1live/pelican-plugins,MarkusH/pelican-plugins,M157q/pelican-plugins,cctags/pelican-plugins,ziaa/pelican-plugins,karya0/pelican-plugins,howthebodyworks/pelican-plugins,danmackinlay/pelican-plugins,jprine/pelican-plugins,shireenrao/pelican-plugins,mikitex70/pelican-plugins,yuanboshe/pelican-plugins,joachimneu/pelican-plugins,znegva/pelican-plugins,frickp/pelican-plugins,jantman/pelican-plugins,mortada/pelican-plugins,davidmarquis/pelican-plugins,pxquim/pelican-plugins,karya0/pelican-plugins,davidmarquis/pelican-plugins,samueljohn/pelican-plugins,howthebodyworks/pelican-plugins,MarkusH/pelican-plugins,florianjacob/pelican-plugins,publicus/pelican-plugins,cctags/pelican-plugins,pxquim/pelican-plugins,amitsaha/pelican-plugins,lazycoder-ru/pelican-plugins,seandavi/pelican-plugins,FuzzJunket/pelican-plugins,florianjacob/pelican-plugins,gw0/pelican-plugins,karya0/pelican-plugins,UHBiocomputation/pelican-plugins,pestrickland/pelican-plugins,kdheepak89/pelican-plugins,proteansec/pelican-plugins,gjreda/pelican-plugins,benjaminabel/pelican-plugins,znegva/pelican-plugins,goerz/pelican-plugins,jantman/pelican-plugins,farseerfc/pelican-plugins,mikitex70/pelican-plugins,ingwinlu/pelican-plugins,wilsonfreitas/pelican-plugins,mortada/pelican-plugins,benjaminabel/pelican-plugins,pxquim/pelican-plugins,benjaminabel/pelican-plugins,phrawzty/pelican-plugins,rlaboiss/pelican-plugins
restructuredtext
## Code Before: Related posts ------------- This plugin adds the ``related_posts`` variable to the article's context. By default, up to 5 articles are listed. You can customize this value by defining ``RELATED_POSTS_MAX`` in your settings file:: RELATED_POSTS_MAX = 10 You can then use the ``article.related_posts`` variable in your templates. For example:: {% if article.related_posts %} <ul> {% for related_post in article.related_posts %} <li><a href="{{ SITEURL }}/{{ related_post.url }}">{{ related_post.title }}</a></li> {% endfor %} </ul> {% endif %} You can also override related posts by using it as part of your post's meta data 'related_posts:'. The 'related_posts:' meta data works together with your existing slugs: related_posts: slug1,slug2,slug3...slugN N represents the RELATED_POSTS_MAX ## Instruction: Clarify documentation of related_posts plugin It is not clear how the plugin determines which posts are related. ## Code After: Related posts ------------- This plugin adds the ``related_posts`` variable to the article's context. By default, up to 5 articles are listed. You can customize this value by defining ``RELATED_POSTS_MAX`` in your settings file:: RELATED_POSTS_MAX = 10 You can then use the ``article.related_posts`` variable in your templates. For example:: {% if article.related_posts %} <ul> {% for related_post in article.related_posts %} <li><a href="{{ SITEURL }}/{{ related_post.url }}">{{ related_post.title }}</a></li> {% endfor %} </ul> {% endif %} Your related posts should share a common tag. You can also use ``related_posts:`` in your post's meta data. The 'related_posts:' meta data works together with your existing slugs: related_posts: slug1,slug2,slug3...slugN N represents the RELATED_POSTS_MAX
82a0c81945d3ea930468d5e48b71f6f422a48906
lib/node_modules/@stdlib/stats/base/dists/bernoulli/pmf/docs/repl.txt
lib/node_modules/@stdlib/stats/base/dists/bernoulli/pmf/docs/repl.txt
{{alias}}( x, p ) Evaluates the probability mass function (PMF) for a Bernoulli distribution with success probability `p` at a value `x`. If provided `NaN` as any argument, the function returns `NaN`. If `p < 0` or `p > 1`, the function returns `NaN`. Parameters ---------- x: number Input value. p: number Success probability. Returns ------- out: number Evaluated PMF. Examples -------- > var y = {{alias}}( 4.0, 0.3 ) ~0.072 > y = {{alias}}( 2.0, 0.7 ) ~0.063 > y = {{alias}}( -1.0, 0.5 ) 0.0 > y = {{alias}}( 0.0, NaN ) NaN > y = {{alias}}( NaN, 0.5 ) NaN // Invalid success probability: > y = {{alias}}( 2.0, 1.5 ) NaN {{alias}}.factory( p ) Returns a function for evaluating the probability mass function (PMF) of a Bernoulli distribution with success probability `p`. Parameters ---------- p: number Success probability. Returns ------- pmf: Function Probability mass function (PMF). Examples -------- > var mypmf = {{alias}}.factory( 0.5 ); > var y = mypmf( 3.0 ) 0.0625 > y = mypmf( 1.0 ) 0.25 See Also --------
{{alias}}( x, p ) Evaluates the probability mass function (PMF) for a Bernoulli distribution with success probability `p` at a value `x`. If provided `NaN` as any argument, the function returns `NaN`. If `p < 0` or `p > 1`, the function returns `NaN`. Parameters ---------- x: number Input value. p: number Success probability. Returns ------- out: number Evaluated PMF. Examples -------- > var y = {{alias}}( 1.0, 0.3 ) 0.3 > y = {{alias}}( 0.0, 0.7 ) 0.3 > y = {{alias}}( -1.0, 0.5 ) 0.0 > y = {{alias}}( 0.0, NaN ) NaN > y = {{alias}}( NaN, 0.5 ) NaN // Invalid success probability: > y = {{alias}}( 0.0, 1.5 ) NaN {{alias}}.factory( p ) Returns a function for evaluating the probability mass function (PMF) of a Bernoulli distribution with success probability `p`. Parameters ---------- p: number Success probability. Returns ------- pmf: Function Probability mass function (PMF). Examples -------- > var mypmf = {{alias}}.factory( 0.5 ); > var y = mypmf( 1.0 ) 0.5 > y = mypmf( 0.0 ) 0.5 See Also --------
Fix arguments and update return values
Fix arguments and update return values
Text
apache-2.0
stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib
text
## Code Before: {{alias}}( x, p ) Evaluates the probability mass function (PMF) for a Bernoulli distribution with success probability `p` at a value `x`. If provided `NaN` as any argument, the function returns `NaN`. If `p < 0` or `p > 1`, the function returns `NaN`. Parameters ---------- x: number Input value. p: number Success probability. Returns ------- out: number Evaluated PMF. Examples -------- > var y = {{alias}}( 4.0, 0.3 ) ~0.072 > y = {{alias}}( 2.0, 0.7 ) ~0.063 > y = {{alias}}( -1.0, 0.5 ) 0.0 > y = {{alias}}( 0.0, NaN ) NaN > y = {{alias}}( NaN, 0.5 ) NaN // Invalid success probability: > y = {{alias}}( 2.0, 1.5 ) NaN {{alias}}.factory( p ) Returns a function for evaluating the probability mass function (PMF) of a Bernoulli distribution with success probability `p`. Parameters ---------- p: number Success probability. Returns ------- pmf: Function Probability mass function (PMF). Examples -------- > var mypmf = {{alias}}.factory( 0.5 ); > var y = mypmf( 3.0 ) 0.0625 > y = mypmf( 1.0 ) 0.25 See Also -------- ## Instruction: Fix arguments and update return values ## Code After: {{alias}}( x, p ) Evaluates the probability mass function (PMF) for a Bernoulli distribution with success probability `p` at a value `x`. If provided `NaN` as any argument, the function returns `NaN`. If `p < 0` or `p > 1`, the function returns `NaN`. Parameters ---------- x: number Input value. p: number Success probability. Returns ------- out: number Evaluated PMF. Examples -------- > var y = {{alias}}( 1.0, 0.3 ) 0.3 > y = {{alias}}( 0.0, 0.7 ) 0.3 > y = {{alias}}( -1.0, 0.5 ) 0.0 > y = {{alias}}( 0.0, NaN ) NaN > y = {{alias}}( NaN, 0.5 ) NaN // Invalid success probability: > y = {{alias}}( 0.0, 1.5 ) NaN {{alias}}.factory( p ) Returns a function for evaluating the probability mass function (PMF) of a Bernoulli distribution with success probability `p`. Parameters ---------- p: number Success probability. Returns ------- pmf: Function Probability mass function (PMF). Examples -------- > var mypmf = {{alias}}.factory( 0.5 ); > var y = mypmf( 1.0 ) 0.5 > y = mypmf( 0.0 ) 0.5 See Also --------
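The corrected examples in this record follow directly from the definition of the Bernoulli PMF: `p` at `x = 1`, `1 - p` at `x = 0`, and `0` elsewhere. A minimal Python sketch of the documented behavior (the function name is hypothetical, not part of stdlib):

```python
def bernoulli_pmf(x, p):
    # NaN in, NaN out (x != x is a NaN test without extra imports).
    if x != x or p != p:
        return float('nan')
    # An invalid success probability also yields NaN.
    if p < 0.0 or p > 1.0:
        return float('nan')
    if x == 1.0:
        return p
    if x == 0.0:
        return 1.0 - p
    # Off the support {0, 1} the mass is zero.
    return 0.0
```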
8256e571f3ab22988341de26d26ea69f7db67920
index.html
index.html
--- layout: default title: PDX Git --- <div class="content"> Coming Soon </div>
--- layout: default title: PDX Git --- <div class="content"> Our <a href="http://calagator.org/events/1250461898">first meeting</a> is coming soon on Feb 1st! </div>
Add a link to calagator
Add a link to calagator
HTML
mit
GrumpyGits/grumpygits.github.io,GrumpyGits/grumpygits.github.io
html
## Code Before: --- layout: default title: PDX Git --- <div class="content"> Coming Soon </div> ## Instruction: Add a link to calagator ## Code After: --- layout: default title: PDX Git --- <div class="content"> Our <a href="http://calagator.org/events/1250461898">first meeting</a> is coming soon on Feb 1st! </div>
896b7e7aead414933c15506e2b9d970d0e492d48
addon/components/resize-detector.js
addon/components/resize-detector.js
import Ember from 'ember'; import layout from '../templates/components/resize-detector'; const { inject: { service }, run: { scheduleOnce, bind } } = Ember; export default Ember.Component.extend({ layout, tagName: '', resizeDetector: service(), didInsertElement() { this._super(...arguments); scheduleOnce('afterRender', this, this.setup); }, setup() { this.callback = bind(this, this.onResize); this.get('resizeDetector').setup(this.get('selector'), this.callback); }, teardown() { this.get('resizeDetector').teardown(this.get('selector'), this.callback); }, onResize(element) { let $el = Ember.$(element); this.sendAction('on-resize', { width: $el.width(), height: $el.height() }, element); }, willDestroyElement() { this.teardown(); this._super(...arguments); } }).reopenClass({ positionalParams: ['selector'] });
import Ember from 'ember'; import layout from '../templates/components/resize-detector'; const { inject: { service }, run: { scheduleOnce, bind } } = Ember; export default Ember.Component.extend({ layout, tagName: '', resizeDetector: service(), didInsertElement() { this._super(...arguments); scheduleOnce('afterRender', this, this.setup); }, setup() { this.callback = bind(this, this.onResize); this.get('resizeDetector').setup(this.get('selector'), this.callback); }, teardown() { this.get('resizeDetector').teardown(this.get('selector'), this.callback); }, onResize(element) { if (this.get('isDestroyed') || this.get('isDestroying')) { return; } let $el = Ember.$(element); this.sendAction('on-resize', { width: $el.width(), height: $el.height() }, element); }, willDestroyElement() { this.teardown(); this._super(...arguments); } }).reopenClass({ positionalParams: ['selector'] });
Fix calling sendAction on destroyed object error
Fix calling sendAction on destroyed object error
JavaScript
mit
EmberSherpa/ember-element-resize-detector,EmberSherpa/ember-element-resize-detector
javascript
## Code Before: import Ember from 'ember'; import layout from '../templates/components/resize-detector'; const { inject: { service }, run: { scheduleOnce, bind } } = Ember; export default Ember.Component.extend({ layout, tagName: '', resizeDetector: service(), didInsertElement() { this._super(...arguments); scheduleOnce('afterRender', this, this.setup); }, setup() { this.callback = bind(this, this.onResize); this.get('resizeDetector').setup(this.get('selector'), this.callback); }, teardown() { this.get('resizeDetector').teardown(this.get('selector'), this.callback); }, onResize(element) { let $el = Ember.$(element); this.sendAction('on-resize', { width: $el.width(), height: $el.height() }, element); }, willDestroyElement() { this.teardown(); this._super(...arguments); } }).reopenClass({ positionalParams: ['selector'] }); ## Instruction: Fix calling sendAction on destroyed object error ## Code After: import Ember from 'ember'; import layout from '../templates/components/resize-detector'; const { inject: { service }, run: { scheduleOnce, bind } } = Ember; export default Ember.Component.extend({ layout, tagName: '', resizeDetector: service(), didInsertElement() { this._super(...arguments); scheduleOnce('afterRender', this, this.setup); }, setup() { this.callback = bind(this, this.onResize); this.get('resizeDetector').setup(this.get('selector'), this.callback); }, teardown() { this.get('resizeDetector').teardown(this.get('selector'), this.callback); }, onResize(element) { if (this.get('isDestroyed') || this.get('isDestroying')) { return; } let $el = Ember.$(element); this.sendAction('on-resize', { width: $el.width(), height: $el.height() }, element); }, willDestroyElement() { this.teardown(); this._super(...arguments); } }).reopenClass({ positionalParams: ['selector'] });
943b02c0b25a89f4be21963e679cabbe2c4b98ec
.travis.yml
.travis.yml
language: php sudo: false matrix: include: - php: 7.1 env: SYMFONY_VERSION=3.3.* - php: 7.2 env: SYMFONY_VERSION=3.3.* allow_failures: - php: 7.2 env: global: - SYMFONY_VERSION="" cache: directories: - $HOME/.composer/cache before_install: - composer self-update - if [ "$SYMFONY_VERSION" != "" ]; then composer require --no-update symfony/security:$SYMFONY_VERSION symfony/config:$SYMFONY_VERSION symfony/http-foundation:$SYMFONY_VERSION symfony/dependency-injection:$SYMFONY_VERSION symfony/http-kernel:$SYMFONY_VERSION; fi install: - composer install script: - bin/phpspec run -fpretty --verbose
language: php sudo: false matrix: include: - php: 7.1 env: SYMFONY_VERSION=3.3.* - php: 7.2 env: SYMFONY_VERSION=3.3.* env: global: - SYMFONY_VERSION="" cache: directories: - $HOME/.composer/cache before_install: - composer self-update - if [ "$SYMFONY_VERSION" != "" ]; then composer require --no-update symfony/security:$SYMFONY_VERSION symfony/config:$SYMFONY_VERSION symfony/http-foundation:$SYMFONY_VERSION symfony/dependency-injection:$SYMFONY_VERSION symfony/http-kernel:$SYMFONY_VERSION; fi install: - composer install script: - bin/phpspec run -fpretty --verbose
Remove "allow_failures" for PHP 7.2
Remove "allow_failures" for PHP 7.2
YAML
mit
Hexanet/LogBundle,Hexanet/MonologExtraBundle
yaml
## Code Before: language: php sudo: false matrix: include: - php: 7.1 env: SYMFONY_VERSION=3.3.* - php: 7.2 env: SYMFONY_VERSION=3.3.* allow_failures: - php: 7.2 env: global: - SYMFONY_VERSION="" cache: directories: - $HOME/.composer/cache before_install: - composer self-update - if [ "$SYMFONY_VERSION" != "" ]; then composer require --no-update symfony/security:$SYMFONY_VERSION symfony/config:$SYMFONY_VERSION symfony/http-foundation:$SYMFONY_VERSION symfony/dependency-injection:$SYMFONY_VERSION symfony/http-kernel:$SYMFONY_VERSION; fi install: - composer install script: - bin/phpspec run -fpretty --verbose ## Instruction: Remove "allow_failures" for PHP 7.2 ## Code After: language: php sudo: false matrix: include: - php: 7.1 env: SYMFONY_VERSION=3.3.* - php: 7.2 env: SYMFONY_VERSION=3.3.* env: global: - SYMFONY_VERSION="" cache: directories: - $HOME/.composer/cache before_install: - composer self-update - if [ "$SYMFONY_VERSION" != "" ]; then composer require --no-update symfony/security:$SYMFONY_VERSION symfony/config:$SYMFONY_VERSION symfony/http-foundation:$SYMFONY_VERSION symfony/dependency-injection:$SYMFONY_VERSION symfony/http-kernel:$SYMFONY_VERSION; fi install: - composer install script: - bin/phpspec run -fpretty --verbose
ad5cb8f92569b02b2094b6a5a1d9496e932132cc
public/visifile_drivers/services/postgres_server.js
public/visifile_drivers/services/postgres_server.js
async function postgres_sql(args) { /* description("Postgres function") base_component_id("postgres_server") load_once_from_file(true) only_run_on_server(true) */ var config = { user: args.user, database: args.database, password: args.password, host: args.host, port: args.port }; console.log("postgres_server: " + JSON.stringify(args,null,2)); var promise = new Promise(async function(returnFn) { var dbconnection = new postgresdb.Client(config); dbconnection.connect(function (err) { if (err) { console.log({error: '' + err}); returnFn({failed: err}) } else { dbconnection.query(args.sql, [], function (err, result) { if (err) { console.log({failed: '' + err}); } else { console.log("row count: " + result.rows.length); // outputs: { name: 'brianc' } if (args.limit) { result.rows = result.rows.slice(0, args.limit); } returnFn({value: result.rows}) }; }) } }); }) var ret = await promise return ret }
async function postgres_sql(args) { /* description("Postgres function") base_component_id("postgres_server") load_once_from_file(true) only_run_on_server(true) */ var config = { user: args.user, database: args.database, password: args.password, host: args.host, port: args.port }; console.log("postgres_server: " + JSON.stringify(args,null,2)); var promise = new Promise(async function(returnFn) { var dbconnection = new postgresdb.Client(config); dbconnection.connect(function (err) { if (err) { console.log({error: '' + err}); returnFn({failed: err}) } else { var useSql = args.sql if (args.get_tables) { useSql = "SELECT tablename as name FROM pg_catalog.pg_tables where schemaname = 'public';" } dbconnection.query(useSql, [], function (err, result) { if (err) { console.log({failed: '' + err}); } else { console.log("row count: " + result.rows.length); // outputs: { name: 'brianc' } if (args.limit) { result.rows = result.rows.slice(0, args.limit); } returnFn({value: result.rows}) }; }) } }); }) var ret = await promise return ret }
Allow Postgres DB driver to return the list of tables
Allow Postgres DB driver to return the list of tables
JavaScript
mit
zubairq/gosharedata,zubairq/yazz,zubairq/yazz,zubairq/gosharedata
javascript
## Code Before: async function postgres_sql(args) { /* description("Postgres function") base_component_id("postgres_server") load_once_from_file(true) only_run_on_server(true) */ var config = { user: args.user, database: args.database, password: args.password, host: args.host, port: args.port }; console.log("postgres_server: " + JSON.stringify(args,null,2)); var promise = new Promise(async function(returnFn) { var dbconnection = new postgresdb.Client(config); dbconnection.connect(function (err) { if (err) { console.log({error: '' + err}); returnFn({failed: err}) } else { dbconnection.query(args.sql, [], function (err, result) { if (err) { console.log({failed: '' + err}); } else { console.log("row count: " + result.rows.length); // outputs: { name: 'brianc' } if (args.limit) { result.rows = result.rows.slice(0, args.limit); } returnFn({value: result.rows}) }; }) } }); }) var ret = await promise return ret } ## Instruction: Allow Postgres DB driver to return the list of tables ## Code After: async function postgres_sql(args) { /* description("Postgres function") base_component_id("postgres_server") load_once_from_file(true) only_run_on_server(true) */ var config = { user: args.user, database: args.database, password: args.password, host: args.host, port: args.port }; console.log("postgres_server: " + JSON.stringify(args,null,2)); var promise = new Promise(async function(returnFn) { var dbconnection = new postgresdb.Client(config); dbconnection.connect(function (err) { if (err) { console.log({error: '' + err}); returnFn({failed: err}) } else { var useSql = args.sql if (args.get_tables) { useSql = "SELECT tablename as name FROM pg_catalog.pg_tables where schemaname = 'public';" } dbconnection.query(useSql, [], function (err, result) { if (err) { console.log({failed: '' + err}); } else { console.log("row count: " + result.rows.length); // outputs: { name: 'brianc' } if (args.limit) { result.rows = result.rows.slice(0, args.limit); } returnFn({value: result.rows}) }; }) } }); }) var ret = await promise return ret }
00907ccaa904c5b45abebaecbc740dded36625fe
build_skeletons.bat
build_skeletons.bat
pushd bin JSILc "Microsoft.Xna.Framework.Game, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553" "Microsoft.Xna.Framework.Xact, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553" "C:\Users\Kevin\Documents\Projects\JSIL\Skeletons\skeletons.jsilconfig" popd
pushd bin JSILc "Microsoft.Xna.Framework.Game, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553" "Microsoft.Xna.Framework.Xact, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553" "..\Skeletons\skeletons.jsilconfig" popd
Build Skeletons batch file: use relative path.
Build Skeletons batch file: use relative path.
Batchfile
mit
sander-git/JSIL,sq/JSIL,volkd/JSIL,antiufo/JSIL,x335/JSIL,FUSEEProjectTeam/JSIL,mispy/JSIL,dmirmilshteyn/JSIL,acourtney2015/JSIL,iskiselev/JSIL,volkd/JSIL,sander-git/JSIL,iskiselev/JSIL,FUSEEProjectTeam/JSIL,Trattpingvin/JSIL,dmirmilshteyn/JSIL,sq/JSIL,mispy/JSIL,iskiselev/JSIL,x335/JSIL,Trattpingvin/JSIL,TukekeSoft/JSIL,sq/JSIL,hach-que/JSIL,volkd/JSIL,iskiselev/JSIL,TukekeSoft/JSIL,volkd/JSIL,mispy/JSIL,antiufo/JSIL,FUSEEProjectTeam/JSIL,mispy/JSIL,antiufo/JSIL,hach-que/JSIL,acourtney2015/JSIL,TukekeSoft/JSIL,sander-git/JSIL,iskiselev/JSIL,hach-que/JSIL,Trattpingvin/JSIL,volkd/JSIL,FUSEEProjectTeam/JSIL,x335/JSIL,sq/JSIL,sq/JSIL,sander-git/JSIL,sander-git/JSIL,Trattpingvin/JSIL,FUSEEProjectTeam/JSIL,TukekeSoft/JSIL,acourtney2015/JSIL,acourtney2015/JSIL,antiufo/JSIL,dmirmilshteyn/JSIL,hach-que/JSIL,x335/JSIL,acourtney2015/JSIL,hach-que/JSIL,dmirmilshteyn/JSIL
batchfile
## Code Before: pushd bin JSILc "Microsoft.Xna.Framework.Game, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553" "Microsoft.Xna.Framework.Xact, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553" "C:\Users\Kevin\Documents\Projects\JSIL\Skeletons\skeletons.jsilconfig" popd ## Instruction: Build Skeletons batch file: use relative path. ## Code After: pushd bin JSILc "Microsoft.Xna.Framework.Game, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553" "Microsoft.Xna.Framework.Xact, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553" "..\Skeletons\skeletons.jsilconfig" popd
fc2b4e350275e4029c6b698e8f31479707a4710d
core/lib/generators/spree/mailers_preview/templates/mailers/previews/order_preview.rb
core/lib/generators/spree/mailers_preview/templates/mailers/previews/order_preview.rb
class OrderPreview < ActionMailer::Preview def confirm_email Spree::OrderMailer.confirm_email(Spree::Order.complete.first) end def cancel_email Spree::OrderMailer.cancel_email(Spree::Order.complete.first) end end
class OrderPreview < ActionMailer::Preview def confirm_email Spree::OrderMailer.confirm_email(Spree::Order.complete.first) end def cancel_email Spree::OrderMailer.cancel_email(Spree::Order.complete.first) end def store_owner_notification_email Spree::OrderMailer.store_owner_notification_email(Spree::Order.complete.first) end end
Add email preview for order store_owner_notification mail
Add email preview for order store_owner_notification mail
Ruby
bsd-3-clause
imella/spree,imella/spree,imella/spree
ruby
## Code Before: class OrderPreview < ActionMailer::Preview def confirm_email Spree::OrderMailer.confirm_email(Spree::Order.complete.first) end def cancel_email Spree::OrderMailer.cancel_email(Spree::Order.complete.first) end end ## Instruction: Add email preview for order store_owner_notification mail ## Code After: class OrderPreview < ActionMailer::Preview def confirm_email Spree::OrderMailer.confirm_email(Spree::Order.complete.first) end def cancel_email Spree::OrderMailer.cancel_email(Spree::Order.complete.first) end def store_owner_notification_email Spree::OrderMailer.store_owner_notification_email(Spree::Order.complete.first) end end
895552a5783f502341ac02b0552bb09c8650199d
core/app/backbone/contrib/default_click_handler.coffee
core/app/backbone/contrib/default_click_handler.coffee
defaultClickHandler = (e) -> if Backbone.History.started # Capitalization in Backbone.[H]istory is intentional $link = $(e.target).closest("a") url = $link.attr("href") target = $link.attr("target") if e.metaKey or e.ctrlKey or e.altKey target = "_blank" if target? window.open url, target else navigateTo url, $link.attr("target") false # prevent default else true navigateTo = (url, target='_self') -> if "/" + Backbone.history.fragment is url # Allow reloads by clicking links without polluting the history Backbone.history.fragment = null Backbone.history.navigate url, trigger: true replace: true else if Backbone.history.loadUrl(url) # Try if there is a Backbone route available Backbone.history.fragment = null # Force creating a state in the history Backbone.history.navigate url, false else window.open url, target window.focus() # HACK: this is needed because internal events did not seem to work $(document).on "click", ":not(body.client) a[rel=backbone]", defaultClickHandler
defaultClickHandler = (e) -> if Backbone.History.started # Capitalization in Backbone.[H]istory is intentional $link = $(e.target).closest("a") url = $link.attr("href") if e.metaKey or e.ctrlKey or e.altKey target = "_blank" else target = $link.attr("target") if target? window.open url, target else navigateTo url false # prevent default else true navigateTo = (url) -> if "/" + Backbone.history.fragment is url # Allow reloads by clicking links without polluting the history Backbone.history.fragment = null Backbone.history.navigate url, trigger: true replace: true else if Backbone.history.loadUrl(url) # Try if there is a Backbone route available Backbone.history.fragment = null # Force creating a state in the history Backbone.history.navigate url, false else window.open url window.focus() # HACK: this is needed because internal events did not seem to work $(document).on "click", ":not(body.client) a[rel=backbone]", defaultClickHandler
Put stuff in else and cleaned up navigateTo
Put stuff in else and cleaned up navigateTo
CoffeeScript
mit
daukantas/factlink-core,Factlink/factlink-core,Factlink/factlink-core,Factlink/factlink-core,daukantas/factlink-core,daukantas/factlink-core,daukantas/factlink-core,Factlink/factlink-core
coffeescript
## Code Before: defaultClickHandler = (e) -> if Backbone.History.started # Capitalization in Backbone.[H]istory is intentional $link = $(e.target).closest("a") url = $link.attr("href") target = $link.attr("target") if e.metaKey or e.ctrlKey or e.altKey target = "_blank" if target? window.open url, target else navigateTo url, $link.attr("target") false # prevent default else true navigateTo = (url, target='_self') -> if "/" + Backbone.history.fragment is url # Allow reloads by clicking links without polluting the history Backbone.history.fragment = null Backbone.history.navigate url, trigger: true replace: true else if Backbone.history.loadUrl(url) # Try if there is a Backbone route available Backbone.history.fragment = null # Force creating a state in the history Backbone.history.navigate url, false else window.open url, target window.focus() # HACK: this is needed because internal events did not seem to work $(document).on "click", ":not(body.client) a[rel=backbone]", defaultClickHandler ## Instruction: Put stuff in else and cleaned up navigateTo ## Code After: defaultClickHandler = (e) -> if Backbone.History.started # Capitalization in Backbone.[H]istory is intentional $link = $(e.target).closest("a") url = $link.attr("href") if e.metaKey or e.ctrlKey or e.altKey target = "_blank" else target = $link.attr("target") if target? window.open url, target else navigateTo url false # prevent default else true navigateTo = (url) -> if "/" + Backbone.history.fragment is url # Allow reloads by clicking links without polluting the history Backbone.history.fragment = null Backbone.history.navigate url, trigger: true replace: true else if Backbone.history.loadUrl(url) # Try if there is a Backbone route available Backbone.history.fragment = null # Force creating a state in the history Backbone.history.navigate url, false else window.open url window.focus() # HACK: this is needed because internal events did not seem to work $(document).on "click", ":not(body.client) a[rel=backbone]", defaultClickHandler
e251337edeb1c3dd59870951a747abc0e51e3eed
shell/config/ssh-agent.sh
shell/config/ssh-agent.sh
if [ -f "/run/user/1000/ssh-agent.socket" ]; then export SSH_AUTH_SOCK=/run/user/1000/ssh-agent.socket fi
if [ -e "/run/user/$(id -u)/ssh-agent.socket" ]; then export SSH_AUTH_SOCK=/run/user/$(id -u)/ssh-agent.socket fi
Fix shell setup for setting SSH agent socket location
Fix shell setup for setting SSH agent socket location Bugs: - Check if file exists, but not a regular file - Don't hardcode user id
Shell
mit
steakunderscore/dotfiles,steakunderscore/dotfiles,steakunderscore/dotfiles
shell
## Code Before: if [ -f "/run/user/1000/ssh-agent.socket" ]; then export SSH_AUTH_SOCK=/run/user/1000/ssh-agent.socket fi ## Instruction: Fix shell setup for setting SSH agent socket location Bugs: - Check if file exists, but not a regular file - Don't hardcode user id ## Code After: if [ -e "/run/user/$(id -u)/ssh-agent.socket" ]; then export SSH_AUTH_SOCK=/run/user/$(id -u)/ssh-agent.socket fi
0de213c88dcee2db8f8cd416ff928e6018329e68
passwd_change.py
passwd_change.py
import sys _args = sys.argv if __name__ == "__main__": if len(_args) == 5: keys_file = _args[1] target_file = _args[2] result_file = _args[3] log_file = _args[4] try: with open(keys_file, 'r') as k: keys = k.readlines() keys = [key.strip().split('@')[0] for key in keys] keys = [key for key in keys if key != ''] with open(target_file, 'r') as t: target_lines = t.readlines() log = open(log_file, 'w') with open(result_file, 'w') as r: for line in target_lines: if line.split(':')[0] in keys or \ line.split(':')[3] != '12': r.write(line) else: log.write(line) log.close() except Exception as e: print(str(e)) sys.exit() else: print('==================================================') print('python passwd_change.py keys passwd passwd_new log') print('==================================================')
import sys _args = sys.argv if __name__ == "__main__": if len(_args) == 8: keys_file = _args[1] passwd_orig = _args[2] passwd_new = _args[3] passwd_log = _args[4] shadow_orig = _args[5] shadow_new = _args[6] shadow_log = _args[7] try: with open(keys_file, 'r') as k: keys = k.readlines() keys = [key.strip().split('@')[0] for key in keys] keys = [key for key in keys if key != ''] with open(passwd_orig, 'r') as po: passwd_lines = po.readlines() passwd_log = open(passwd_log, 'w') passwd_new_keys = [] with open(passwd_new, 'w') as pn: for line in passwd_lines: if line.split(':')[0] in keys or \ line.split(':')[3] != '12': pn.write(line) passwd_new_keys.append(line.split(':')[0]) else: passwd_log.write(line) passwd_log.close() with open(shadow_orig, 'r') as so: shadow_lines = so.readlines() shadow_log = open(shadow_log, 'w') with open(shadow_new, 'w') as sn: for line in shadow_lines: if line.split(':')[0] in passwd_new_keys: sn.write(line) else: shadow_log.write(line) shadow_log.close() except Exception as e: print(str(e)) sys.exit() else: print('==================================================') print('python passwd_change.py keys passwd passwd_new passwd_log' + ' shadow shadow_new shadow_log') print('==================================================')
Add shadow changing support according to our new passwd.
Add shadow changing support according to our new passwd.
Python
mit
maxsocl/oldmailer
python
## Code Before: import sys _args = sys.argv if __name__ == "__main__": if len(_args) == 5: keys_file = _args[1] target_file = _args[2] result_file = _args[3] log_file = _args[4] try: with open(keys_file, 'r') as k: keys = k.readlines() keys = [key.strip().split('@')[0] for key in keys] keys = [key for key in keys if key != ''] with open(target_file, 'r') as t: target_lines = t.readlines() log = open(log_file, 'w') with open(result_file, 'w') as r: for line in target_lines: if line.split(':')[0] in keys or \ line.split(':')[3] != '12': r.write(line) else: log.write(line) log.close() except Exception as e: print(str(e)) sys.exit() else: print('==================================================') print('python passwd_change.py keys passwd passwd_new log') print('==================================================') ## Instruction: Add shadow changing support according to our new passwd. ## Code After: import sys _args = sys.argv if __name__ == "__main__": if len(_args) == 8: keys_file = _args[1] passwd_orig = _args[2] passwd_new = _args[3] passwd_log = _args[4] shadow_orig = _args[5] shadow_new = _args[6] shadow_log = _args[7] try: with open(keys_file, 'r') as k: keys = k.readlines() keys = [key.strip().split('@')[0] for key in keys] keys = [key for key in keys if key != ''] with open(passwd_orig, 'r') as po: passwd_lines = po.readlines() passwd_log = open(passwd_log, 'w') passwd_new_keys = [] with open(passwd_new, 'w') as pn: for line in passwd_lines: if line.split(':')[0] in keys or \ line.split(':')[3] != '12': pn.write(line) passwd_new_keys.append(line.split(':')[0]) else: passwd_log.write(line) passwd_log.close() with open(shadow_orig, 'r') as so: shadow_lines = so.readlines() shadow_log = open(shadow_log, 'w') with open(shadow_new, 'w') as sn: for line in shadow_lines: if line.split(':')[0] in passwd_new_keys: sn.write(line) else: shadow_log.write(line) shadow_log.close() except Exception as e: print(str(e)) sys.exit() else: print('==================================================') print('python passwd_change.py keys passwd passwd_new passwd_log' + ' shadow shadow_new shadow_log') print('==================================================')
c883559dca4d5ea0a14d98ad240b93296d17f6a6
couscous.yml
couscous.yml
template: directory: theme index: introduction.md exclude: - %gitignore% cname: laratools.com # Theme Specific github: user: laratools repo: laratools menu: sections: main: items: introduction: text: Introduction path: introduction.html installation: text: Installation path: installation.html
template: directory: theme index: introduction.md exclude: - %gitignore% cname: laratools.com # Theme Specific github: user: laratools repo: laratools menu: sections: main: items: introduction: text: Introduction installation: text: Installation path: installation.html
Fix link to introduction page
Fix link to introduction page
YAML
mit
laratools/docs
yaml
## Code Before: template: directory: theme index: introduction.md exclude: - %gitignore% cname: laratools.com # Theme Specific github: user: laratools repo: laratools menu: sections: main: items: introduction: text: Introduction path: introduction.html installation: text: Installation path: installation.html ## Instruction: Fix link to introduction page ## Code After: template: directory: theme index: introduction.md exclude: - %gitignore% cname: laratools.com # Theme Specific github: user: laratools repo: laratools menu: sections: main: items: introduction: text: Introduction installation: text: Installation path: installation.html
dbb50397e82e56551907390be174089b21a3339c
server.rb
server.rb
require 'sinatra' require 'json' require './filter' SLACK_TOKEN = ENV['SLACK_TOKEN'] puts "TOKEN: #{SLACK_TOKEN}" before '/hooks' do unless token_valid? puts 'Token invalid' halt 403, "Token is invalid\n" end if comment_by_myself? puts 'Comment by myself' halt 200 end end def token_valid? params['token'] == SLACK_TOKEN end def comment_by_myself? params['user_name'] == 'slackbot' end get '/' do 'Hello world!' end post '/hooks' do response = filter.update(params).compact.first return 200 unless response j = response.to_json puts j j end def filter @filter ||= Filter::Pipe.new end
require 'sinatra' require 'json' require './filter' SLACK_TOKENS = ENV['SLACK_TOKENS'].split(/\,/) puts "TOKEN: #{SLACK_TOKENS}" before '/hooks' do unless token_valid? puts 'Token invalid' halt 403, "Token is invalid\n" end if comment_by_myself? puts 'Comment by myself' halt 200 end end def token_valid? SLACK_TOKENS.include?(params['token']) end def comment_by_myself? params['user_name'] == 'slackbot' end get '/' do 'Hello world!' end post '/hooks' do response = filter.update(params).compact.first return 200 unless response response.merge!({link_names: 1, parse: "full"}) j = response.to_json puts j j end def filter @filter ||= Filter::Pipe.new end
Add multi-token options and enable expand urls
Add multi-token options and enable expand urls
Ruby
mit
cnosuke/slack_bot
ruby
## Code Before: require 'sinatra' require 'json' require './filter' SLACK_TOKEN = ENV['SLACK_TOKEN'] puts "TOKEN: #{SLACK_TOKEN}" before '/hooks' do unless token_valid? puts 'Token invalid' halt 403, "Token is invalid\n" end if comment_by_myself? puts 'Comment by myself' halt 200 end end def token_valid? params['token'] == SLACK_TOKEN end def comment_by_myself? params['user_name'] == 'slackbot' end get '/' do 'Hello world!' end post '/hooks' do response = filter.update(params).compact.first return 200 unless response j = response.to_json puts j j end def filter @filter ||= Filter::Pipe.new end ## Instruction: Add multi-token options and enable expand urls ## Code After: require 'sinatra' require 'json' require './filter' SLACK_TOKENS = ENV['SLACK_TOKENS'].split(/\,/) puts "TOKEN: #{SLACK_TOKENS}" before '/hooks' do unless token_valid? puts 'Token invalid' halt 403, "Token is invalid\n" end if comment_by_myself? puts 'Comment by myself' halt 200 end end def token_valid? SLACK_TOKENS.include?(params['token']) end def comment_by_myself? params['user_name'] == 'slackbot' end get '/' do 'Hello world!' end post '/hooks' do response = filter.update(params).compact.first return 200 unless response response.merge!({link_names: 1, parse: "full"}) j = response.to_json puts j j end def filter @filter ||= Filter::Pipe.new end
16b705020c91b938660284734afd4218a826e96b
interp.h
interp.h
Lorito_Ctx * lorito_ctx_init(Lorito_Ctx *next_ctx, Lorito_Codeseg *codeseg); Lorito_Interp * lorito_init(); int lorito_run(Lorito_Interp *interp); #endif /* LORITO_INTERP_H_GUARD */
Lorito_Ctx * lorito_ctx_init(Lorito_Ctx *next_ctx, Lorito_Codeseg *codeseg); Lorito_PMC * lorito_pmc_init(Lorito_Interp *interp, int size); Lorito_Interp * lorito_init(); int lorito_run(Lorito_Interp *interp); #endif /* LORITO_INTERP_H_GUARD */
Add lorito_pmc_init to the proper header.
Add lorito_pmc_init to the proper header.
C
artistic-2.0
atrodo/lorito,atrodo/lorito
c
## Code Before: Lorito_Ctx * lorito_ctx_init(Lorito_Ctx *next_ctx, Lorito_Codeseg *codeseg); Lorito_Interp * lorito_init(); int lorito_run(Lorito_Interp *interp); #endif /* LORITO_INTERP_H_GUARD */ ## Instruction: Add lorito_pmc_init to the proper header. ## Code After: Lorito_Ctx * lorito_ctx_init(Lorito_Ctx *next_ctx, Lorito_Codeseg *codeseg); Lorito_PMC * lorito_pmc_init(Lorito_Interp *interp, int size); Lorito_Interp * lorito_init(); int lorito_run(Lorito_Interp *interp); #endif /* LORITO_INTERP_H_GUARD */
530467c4840fcd528cd836d1e25fa3174c6b3b9e
extension/keysocket-yandex-music.js
extension/keysocket-yandex-music.js
var playTarget = '.b-jambox__play'; var nextTarget = '.b-jambox__next'; var prevTarget = '.b-jambox__prev'; function onKeyPress(key) { if (key === PREV) { simulateClick(document.querySelector(prevTarget)); } else if (key === NEXT) { simulateClick(document.querySelector(nextTarget)); } else if (key === PLAY) { simulateClick(document.querySelector(playTarget)); } }
var playTarget = '.player-controls__btn_play'; var nextTarget = '.player-controls__btn_next'; var prevTarget = '.player-controls__btn_prev'; function onKeyPress(key) { if (key === PREV) { simulateClick(document.querySelector(prevTarget)); } else if (key === NEXT) { simulateClick(document.querySelector(nextTarget)); } else if (key === PLAY) { simulateClick(document.querySelector(playTarget)); } }
Update locators for redesigned Yandex Music
Update locators for redesigned Yandex Music
JavaScript
apache-2.0
borismus/keysocket,iver56/keysocket,feedbee/keysocket,vinyldarkscratch/keysocket,feedbee/keysocket,borismus/keysocket,noelmansour/keysocket,kristianj/keysocket,ALiangLiang/keysocket,kristianj/keysocket,chrisdeely/keysocket,legionaryu/keysocket,vladikoff/keysocket,vinyldarkscratch/keysocket,Whoaa512/keysocket
javascript
## Code Before: var playTarget = '.b-jambox__play'; var nextTarget = '.b-jambox__next'; var prevTarget = '.b-jambox__prev'; function onKeyPress(key) { if (key === PREV) { simulateClick(document.querySelector(prevTarget)); } else if (key === NEXT) { simulateClick(document.querySelector(nextTarget)); } else if (key === PLAY) { simulateClick(document.querySelector(playTarget)); } } ## Instruction: Update locators for redesigned Yandex Music ## Code After: var playTarget = '.player-controls__btn_play'; var nextTarget = '.player-controls__btn_next'; var prevTarget = '.player-controls__btn_prev'; function onKeyPress(key) { if (key === PREV) { simulateClick(document.querySelector(prevTarget)); } else if (key === NEXT) { simulateClick(document.querySelector(nextTarget)); } else if (key === PLAY) { simulateClick(document.querySelector(playTarget)); } }
aaaaa3a143c370f387edf42ebd6b22c924845afa
falcom/luhn/check_digit_number.py
falcom/luhn/check_digit_number.py
class CheckDigitNumber: def __init__ (self, number = None): self.__set_number(number) def get_check_digit (self): if self: return self.generate_from_int(self.number) else: return None def has_valid_check_digit (self): if self: digit = self.number % 10 static = self.number // 10 return digit == self.generate_from_int(static) else: return False def __bool__ (self): return self.number is not None def __repr__ (self): return "<{} {}>".format(self.__class__.__name__, repr(self.number)) def __set_number (self, number): if isinstance(number, int): self.number = number elif isinstance(number, str): self.__try_to_extract_number_from_str(number) else: self.number = None def __try_to_extract_number_from_str (self, number): try: self.number = int(number) except ValueError: self.number = None
class CheckDigitNumber: def __init__ (self, number = None): self.__set_number(number) def generate_from_int (self, n): raise NotImplementedError def get_check_digit (self): if self: return self.generate_from_int(self.number) else: return None def has_valid_check_digit (self): if self: digit = self.number % 10 static = self.number // 10 return digit == self.generate_from_int(static) else: return False def __bool__ (self): return self.number is not None def __repr__ (self): return "<{} {}>".format(self.__class__.__name__, repr(self.number)) def __set_number (self, number): if isinstance(number, int): self.number = number elif isinstance(number, str): self.__try_to_extract_number_from_str(number) else: self.number = None def __try_to_extract_number_from_str (self, number): try: self.number = int(number) except ValueError: self.number = None
Make it clear that the user must implement generate_from_int
Make it clear that the user must implement generate_from_int
Python
bsd-3-clause
mlibrary/image-conversion-and-validation,mlibrary/image-conversion-and-validation
python
## Code Before: class CheckDigitNumber: def __init__ (self, number = None): self.__set_number(number) def get_check_digit (self): if self: return self.generate_from_int(self.number) else: return None def has_valid_check_digit (self): if self: digit = self.number % 10 static = self.number // 10 return digit == self.generate_from_int(static) else: return False def __bool__ (self): return self.number is not None def __repr__ (self): return "<{} {}>".format(self.__class__.__name__, repr(self.number)) def __set_number (self, number): if isinstance(number, int): self.number = number elif isinstance(number, str): self.__try_to_extract_number_from_str(number) else: self.number = None def __try_to_extract_number_from_str (self, number): try: self.number = int(number) except ValueError: self.number = None ## Instruction: Make it clear that the user must implement generate_from_int ## Code After: class CheckDigitNumber: def __init__ (self, number = None): self.__set_number(number) def generate_from_int (self, n): raise NotImplementedError def get_check_digit (self): if self: return self.generate_from_int(self.number) else: return None def has_valid_check_digit (self): if self: digit = self.number % 10 static = self.number // 10 return digit == self.generate_from_int(static) else: return False def __bool__ (self): return self.number is not None def __repr__ (self): return "<{} {}>".format(self.__class__.__name__, repr(self.number)) def __set_number (self, number): if isinstance(number, int): self.number = number elif isinstance(number, str): self.__try_to_extract_number_from_str(number) else: self.number = None def __try_to_extract_number_from_str (self, number): try: self.number = int(number) except ValueError: self.number = None
2a2ff436a031a9a2f95714fb9a7c437693d26e71
lib/node_modules/@stdlib/assert/is-web-worker/README.md
lib/node_modules/@stdlib/assert/is-web-worker/README.md
> Check if the runtime is a [web worker][mdn-web-workers-api]. <section class="usage"> ## Usage ```javascript var IS_WEB_WORKER = require( '@stdlib/assert/is-web-worker' ); ``` #### IS_WEB_WORKER `Boolean` indicating if the runtime is a [web worker][mdn-web-workers-api]. ```javascript var bool = IS_WEB_WORKER; // returns <boolean> ``` </section> <!-- /.usage --> <section class="examples"> ## Examples ```javascript var IS_WEB_WORKER = require( '@stdlib/assert/is-web-worker' ); console.log( IS_WEB_WORKER ); // => <boolean> ``` </section> <!-- /.examples --> <section class="links"> [mdn-web-workers-api]: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API </section> <!-- /.links -->
> Check if the runtime is a [web worker][mdn-web-workers-api]. <section class="usage"> ## Usage ```javascript var IS_WEB_WORKER = require( '@stdlib/assert/is-web-worker' ); ``` #### IS_WEB_WORKER `Boolean` indicating if the runtime is a [web worker][mdn-web-workers-api]. ```javascript var bool = IS_WEB_WORKER; // returns <boolean> ``` </section> <!-- /.usage --> <section class="notes"> ## Notes - In order to determine whether the runtime is a [web worker][mdn-web-workers-api], the implementation must resolve the global scope, thus requiring function generation. The use of function generation may be problematic in browser contexts enforcing a strict [content security policy][mdn-csp] (CSP). </section> <!-- /.notes --> <section class="examples"> ## Examples ```javascript var IS_WEB_WORKER = require( '@stdlib/assert/is-web-worker' ); console.log( IS_WEB_WORKER ); // => <boolean> ``` </section> <!-- /.examples --> <section class="links"> [mdn-web-workers-api]: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API [mdn-csp]: https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP </section> <!-- /.links -->
Add note regarding use of function generation
Add note regarding use of function generation
Markdown
apache-2.0
stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib
markdown
## Code Before: > Check if the runtime is a [web worker][mdn-web-workers-api]. <section class="usage"> ## Usage ```javascript var IS_WEB_WORKER = require( '@stdlib/assert/is-web-worker' ); ``` #### IS_WEB_WORKER `Boolean` indicating if the runtime is a [web worker][mdn-web-workers-api]. ```javascript var bool = IS_WEB_WORKER; // returns <boolean> ``` </section> <!-- /.usage --> <section class="examples"> ## Examples ```javascript var IS_WEB_WORKER = require( '@stdlib/assert/is-web-worker' ); console.log( IS_WEB_WORKER ); // => <boolean> ``` </section> <!-- /.examples --> <section class="links"> [mdn-web-workers-api]: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API </section> <!-- /.links --> ## Instruction: Add note regarding use of function generation ## Code After: > Check if the runtime is a [web worker][mdn-web-workers-api]. <section class="usage"> ## Usage ```javascript var IS_WEB_WORKER = require( '@stdlib/assert/is-web-worker' ); ``` #### IS_WEB_WORKER `Boolean` indicating if the runtime is a [web worker][mdn-web-workers-api]. ```javascript var bool = IS_WEB_WORKER; // returns <boolean> ``` </section> <!-- /.usage --> <section class="notes"> ## Notes - In order to determine whether the runtime is a [web worker][mdn-web-workers-api], the implementation must resolve the global scope, thus requiring function generation. The use of function generation may be problematic in browser contexts enforcing a strict [content security policy][mdn-csp] (CSP). </section> <!-- /.notes --> <section class="examples"> ## Examples ```javascript var IS_WEB_WORKER = require( '@stdlib/assert/is-web-worker' ); console.log( IS_WEB_WORKER ); // => <boolean> ``` </section> <!-- /.examples --> <section class="links"> [mdn-web-workers-api]: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API [mdn-csp]: https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP </section> <!-- /.links -->
94352ff890da1dc1a83651b316eba519d9101243
optimize/index.js
optimize/index.js
var eng = require('./node/engine'); module.exports = { minimize: function(func, options, callback) { eng.runPython('minimize', func, options, callback); }, nonNegLeastSquares: function(A, b, callback) { eng.runPython('nnls', A, b, callback); } }
var eng = require('./node/engine'); module.exports = { localMinimize: function(func, options, callback) { eng.runPython('local', func, options, callback); }, globalMinimize: function(func, options, callback) { eng.runPython('global', func, options, callback); }, nonNegLeastSquares: function(A, b, callback) { eng.runPython('nnls', A, b, callback); } }
Add localMin and globalMin to API
Add localMin and globalMin to API
JavaScript
mit
acjones617/scipy-node,acjones617/scipy-node
javascript
## Code Before: var eng = require('./node/engine'); module.exports = { minimize: function(func, options, callback) { eng.runPython('minimize', func, options, callback); }, nonNegLeastSquares: function(A, b, callback) { eng.runPython('nnls', A, b, callback); } } ## Instruction: Add localMin and globalMin to API ## Code After: var eng = require('./node/engine'); module.exports = { localMinimize: function(func, options, callback) { eng.runPython('local', func, options, callback); }, globalMinimize: function(func, options, callback) { eng.runPython('global', func, options, callback); }, nonNegLeastSquares: function(A, b, callback) { eng.runPython('nnls', A, b, callback); } }
a6f631b485e3266d299fdf2a2902cec88062c55e
px.gemspec
px.gemspec
Gem::Specification.new do |s| s.name = "px" s.version = "0.0.1" s.summary = "PX library" s.description = "PX library" s.authors = ["Cyril David"] s.email = ["cyx@cyx.is"] s.homepage = "http://cyx.is" s.files = Dir[ "LICENSE", "README*", "Makefile", "lib/**/*.rb", "*.gemspec", "test/*.*", ] s.license = "MIT" s.add_dependency "requests" s.add_dependency "xml-simple" s.add_dependency "mote" s.add_dependency "hache" s.add_development_dependency "cutest" end
Gem::Specification.new do |s| s.name = "px" s.version = "0.0.1" s.summary = "PX library" s.description = "PaymentExpress 1.0 integration library" s.authors = ["Cyril David"] s.email = ["cyx@cyx.is"] s.homepage = "http://cyx.is" s.files = Dir[ "LICENSE", "README*", "Makefile", "lib/*.rb", "lib/xml/*.xml", "*.gemspec", "test/*.*", ] s.license = "MIT" s.add_dependency "requests", "~> 1.0" s.add_dependency "xml-simple", "~> 1.1" s.add_dependency "mote", "~> 1.1" s.add_dependency "hache", "~> 1.1" s.add_development_dependency "cutest", "~> 1.2" end
Fix the gemspec in prep for 0.0.1 release
Fix the gemspec in prep for 0.0.1 release
Ruby
mit
cyx/px
ruby
## Code Before: Gem::Specification.new do |s| s.name = "px" s.version = "0.0.1" s.summary = "PX library" s.description = "PX library" s.authors = ["Cyril David"] s.email = ["cyx@cyx.is"] s.homepage = "http://cyx.is" s.files = Dir[ "LICENSE", "README*", "Makefile", "lib/**/*.rb", "*.gemspec", "test/*.*", ] s.license = "MIT" s.add_dependency "requests" s.add_dependency "xml-simple" s.add_dependency "mote" s.add_dependency "hache" s.add_development_dependency "cutest" end ## Instruction: Fix the gemspec in prep for 0.0.1 release ## Code After: Gem::Specification.new do |s| s.name = "px" s.version = "0.0.1" s.summary = "PX library" s.description = "PaymentExpress 1.0 integration library" s.authors = ["Cyril David"] s.email = ["cyx@cyx.is"] s.homepage = "http://cyx.is" s.files = Dir[ "LICENSE", "README*", "Makefile", "lib/*.rb", "lib/xml/*.xml", "*.gemspec", "test/*.*", ] s.license = "MIT" s.add_dependency "requests", "~> 1.0" s.add_dependency "xml-simple", "~> 1.1" s.add_dependency "mote", "~> 1.1" s.add_dependency "hache", "~> 1.1" s.add_development_dependency "cutest", "~> 1.2" end
cf9c62850ac9ffadfebacfd9b138f35dcd382bf1
app/scripts/services/paymentchannel.js
app/scripts/services/paymentchannel.js
angular.module('scratchApp') .factory('PaymentChannel', [ function() { /* BitcoinJS is availabel through the 'Bitcoin' public variable */ return { // Export necessary functions here }; }]);
angular.module('scratchApp') .factory('PaymentChannel', [ function() { /* BitcoinJS is availabel through the 'Bitcoin' public variable */ var paychans = { // Export necessary functions here /** Create Multisignature transaction using public keys of two parties * public_key_1: bitcoin.ECPubKey (example Bitcoin.ECKey.makeRandom().pub) * public_key_2: bitcoin.ECPubKey */ create_multisignature_address : function(public_key_1, public_key_2) { var pubKeys = [ public_key_1, public_key_2 ]; var redeemScript = Bitcoin.scripts.multisigOutput(2, pubKeys); // 2 of 2 return redeemScript; }, /** Calculate Multisignature public address from redeem script * redeemScript : multisigOutput */ get_multisig_address_from_redeem_script : function(redeemScript) { var scriptPubKey = Bitcoin.scripts.scriptHashOutput(redeemScript.getHash()); var multisig_address = Bitcoin.Address.fromOutputScript(scriptPubKey).toString(); return multisig_address; }, /** Create funding transaction * private_key_1: private key (of person funding transaction) in format returned by Bitcoin.ECKey.makeRandom() * input_transaction_id: id of the input Bitcoin transaction (example Bitcoin.ECKey.makeRandom().pub) * redeemScript: multisigOutput * satoshis: amount of Satoshis to send (example 1000000) */ create_funding_transaction : function(private_key_1, input_transaction_id, redeemScript, satoshis) { var multisig_address = paychans.get_multisig_address_from_redeem_script(redeemScript); var fund_tx_builder = new Bitcoin.TransactionBuilder(); // Add the input with *hash* form previous transaction hash // and index of the output to use (this is an existing transaction in the blockchain) fund_tx_builder.addInput(input_transaction_id, 0); //fund_tx_builder.addOutput("1Gokm82v6DmtwKEB8AiVhm82hyFSsEvBDK", 15000); // Output address plus amount in satoshis fund_tx_builder.addOutput(multisig_address, satoshis); // Output address plus amount in satoshis // Sing and broadcast only after refund has been signed by both fund_tx_builder.sign(0, private_key_1); // Sign transaction var fund_tx = fund_tx_builder.build(); return { transaction_id: fund_tx.getId(), raw_tx_builder_with_signature: fund_tx_builder }; } }; return paychans; }]);
Add functions to create a multisignature transaction and a funding transaction to the multisignature address
Add functions to create a multisignature transaction and a funding transaction to the multisignature address
JavaScript
mit
baleato/bitcoin-hackathon
javascript
## Code Before: angular.module('scratchApp') .factory('PaymentChannel', [ function() { /* BitcoinJS is availabel through the 'Bitcoin' public variable */ return { // Export necessary functions here }; }]); ## Instruction: Add functions to create a multisignature transaction and a funding transaction to the multisignature address ## Code After: angular.module('scratchApp') .factory('PaymentChannel', [ function() { /* BitcoinJS is availabel through the 'Bitcoin' public variable */ var paychans = { // Export necessary functions here /** Create Multisignature transaction using public keys of two parties * public_key_1: bitcoin.ECPubKey (example Bitcoin.ECKey.makeRandom().pub) * public_key_2: bitcoin.ECPubKey */ create_multisignature_address : function(public_key_1, public_key_2) { var pubKeys = [ public_key_1, public_key_2 ]; var redeemScript = Bitcoin.scripts.multisigOutput(2, pubKeys); // 2 of 2 return redeemScript; }, /** Calculate Multisignature public address from redeem script * redeemScript : multisigOutput */ get_multisig_address_from_redeem_script : function(redeemScript) { var scriptPubKey = Bitcoin.scripts.scriptHashOutput(redeemScript.getHash()); var multisig_address = Bitcoin.Address.fromOutputScript(scriptPubKey).toString(); return multisig_address; }, /** Create funding transaction * private_key_1: private key (of person funding transaction) in format returned by Bitcoin.ECKey.makeRandom() * input_transaction_id: id of the input Bitcoin transaction (example Bitcoin.ECKey.makeRandom().pub) * redeemScript: multisigOutput * satoshis: amount of Satoshis to send (example 1000000) */ create_funding_transaction : function(private_key_1, input_transaction_id, redeemScript, satoshis) { var multisig_address = paychans.get_multisig_address_from_redeem_script(redeemScript); var fund_tx_builder = new Bitcoin.TransactionBuilder(); // Add the input with *hash* form previous transaction hash // and index of the output to use (this is an existing transaction in the blockchain) fund_tx_builder.addInput(input_transaction_id, 0); //fund_tx_builder.addOutput("1Gokm82v6DmtwKEB8AiVhm82hyFSsEvBDK", 15000); // Output address plus amount in satoshis fund_tx_builder.addOutput(multisig_address, satoshis); // Output address plus amount in satoshis // Sing and broadcast only after refund has been signed by both fund_tx_builder.sign(0, private_key_1); // Sign transaction var fund_tx = fund_tx_builder.build(); return { transaction_id: fund_tx.getId(), raw_tx_builder_with_signature: fund_tx_builder }; } }; return paychans; }]);
76c581ae56acf046c3b68e1aa30632c893c20931
lib/roqua/healthy/errors.rb
lib/roqua/healthy/errors.rb
module Roqua module Healthy class UnknownFailure < StandardError; end class IllegalMirthResponse < StandardError; end class PatientIdNotInRemote < StandardError; end class Timeout < StandardError; end class HostUnreachable < StandardError; end class ConnectionRefused < StandardError; end class PatientNotFound < StandardError; end module MirthErrors class WrongPatient < StandardError; end end end end
module Roqua module Healthy class Error < StandardError; end class ConnectionError < Error; end class IllegalMirthResponse < ConnectionError; end class Timeout < ConnectionError; end class HostUnreachable < ConnectionError; end class ConnectionRefused < ConnectionError; end class UnknownFailure < Error; end class PatientIdNotInRemote < Error; end class PatientNotFound < Error; end module MirthErrors class WrongPatient < Error; end end end end
Introduce base classes for exceptions
Introduce base classes for exceptions
Ruby
mit
roqua/healthy,roqua/healthy
ruby
## Code Before: module Roqua module Healthy class UnknownFailure < StandardError; end class IllegalMirthResponse < StandardError; end class PatientIdNotInRemote < StandardError; end class Timeout < StandardError; end class HostUnreachable < StandardError; end class ConnectionRefused < StandardError; end class PatientNotFound < StandardError; end module MirthErrors class WrongPatient < StandardError; end end end end ## Instruction: Introduce base classes for exceptions ## Code After: module Roqua module Healthy class Error < StandardError; end class ConnectionError < Error; end class IllegalMirthResponse < ConnectionError; end class Timeout < ConnectionError; end class HostUnreachable < ConnectionError; end class ConnectionRefused < ConnectionError; end class UnknownFailure < Error; end class PatientIdNotInRemote < Error; end class PatientNotFound < Error; end module MirthErrors class WrongPatient < Error; end end end end
76886e39f98b12b754a8ac33f6ae4d933d0b93de
src/file_saver.rs
src/file_saver.rs
use std::error::Error; use std::fs::File; use std::io::{Result, Write}; use std::path::Path; use chunk::*; pub trait SaveToFile { fn save_to_file<P: AsRef<Path>>(&mut self, path: P) -> Result<()>; } impl<I: Iterator<Item=Chunk>> SaveToFile for I { fn save_to_file<P: AsRef<Path>>(&mut self, path: P) -> Result<()> { //TODO: add backup let mut file = match File::create(&path) { Err(why) => panic!("couldn't create {:?}: {}", path.as_ref(), Error::description(&why)), Ok(file) => file, }; for ch in self { match file.write_all(&ch.to_string().into_bytes()) { Err(why) => panic!("couldn't write to {:?}: {}", path.as_ref(), Error::description(&why)), Ok(_) => (), } } Ok(()) } }
use std::error::Error; use std::fs; use std::fs::File; use std::io::{Result, Write}; use std::path::Path; use chunk::*; pub trait SaveToFile { fn save_to_file<P: AsRef<Path>>(&mut self, path: P) -> Result<()>; } impl<I: Iterator<Item=Chunk>> SaveToFile for I { fn save_to_file<P: AsRef<Path>>(&mut self, path: P) -> Result<()> { match create_backup(&path) { Err(why) => panic!("couldn't create backup for {:?}: {}", path.as_ref(), Error::description(&why)), Ok(_) => (), } let mut file = match File::create(&path) { Err(why) => panic!("couldn't create {:?}: {}", path.as_ref(), Error::description(&why)), Ok(file) => file, }; for ch in self { match file.write_all(&ch.to_string().into_bytes()) { Err(why) => panic!("couldn't write to {:?}: {}", path.as_ref(), Error::description(&why)), Ok(_) => (), } } Ok(()) } } /// Create .bak file, only if one not present pub fn create_backup<P: AsRef<Path>>(path: P) -> Result<bool> { match fs::metadata(&path) { Err(_) => return Ok(false), Ok(_) => (), } let new_path = path.as_ref().with_extension("bak"); match fs::metadata(&new_path) { Err(_) => (), Ok(_) => return Ok(false), } fs::rename(path, new_path).map(|_| true) }
Create backup file, if none
Create backup file, if none
Rust
mit
insanebits/gir,GuillaumeGomez/gir,GuillaumeGomez/gir,gtk-rs/gir,GuillaumeGomez/gir,EPashkin/gir,gtk-rs/gir,insanebits/gir,gtk-rs/gir,gkoz/gir,gkoz/gir,EPashkin/gir
rust
## Code Before: use std::error::Error; use std::fs::File; use std::io::{Result, Write}; use std::path::Path; use chunk::*; pub trait SaveToFile { fn save_to_file<P: AsRef<Path>>(&mut self, path: P) -> Result<()>; } impl<I: Iterator<Item=Chunk>> SaveToFile for I { fn save_to_file<P: AsRef<Path>>(&mut self, path: P) -> Result<()> { //TODO: add backup let mut file = match File::create(&path) { Err(why) => panic!("couldn't create {:?}: {}", path.as_ref(), Error::description(&why)), Ok(file) => file, }; for ch in self { match file.write_all(&ch.to_string().into_bytes()) { Err(why) => panic!("couldn't write to {:?}: {}", path.as_ref(), Error::description(&why)), Ok(_) => (), } } Ok(()) } } ## Instruction: Create backup file, if none ## Code After: use std::error::Error; use std::fs; use std::fs::File; use std::io::{Result, Write}; use std::path::Path; use chunk::*; pub trait SaveToFile { fn save_to_file<P: AsRef<Path>>(&mut self, path: P) -> Result<()>; } impl<I: Iterator<Item=Chunk>> SaveToFile for I { fn save_to_file<P: AsRef<Path>>(&mut self, path: P) -> Result<()> { match create_backup(&path) { Err(why) => panic!("couldn't create backup for {:?}: {}", path.as_ref(), Error::description(&why)), Ok(_) => (), } let mut file = match File::create(&path) { Err(why) => panic!("couldn't create {:?}: {}", path.as_ref(), Error::description(&why)), Ok(file) => file, }; for ch in self { match file.write_all(&ch.to_string().into_bytes()) { Err(why) => panic!("couldn't write to {:?}: {}", path.as_ref(), Error::description(&why)), Ok(_) => (), } } Ok(()) } } /// Create .bak file, only if one not present pub fn create_backup<P: AsRef<Path>>(path: P) -> Result<bool> { match fs::metadata(&path) { Err(_) => return Ok(false), Ok(_) => (), } let new_path = path.as_ref().with_extension("bak"); match fs::metadata(&new_path) { Err(_) => (), Ok(_) => return Ok(false), } fs::rename(path, new_path).map(|_| true) }
1f101411ccd0daf2400629e9f279d1cc9e229c23
addon-test-support/@ember/test-helpers/index.js
addon-test-support/@ember/test-helpers/index.js
export { setResolver, getResolver } from './resolver'; export { setApplication } from './application'; export { default as setupContext, getContext, setContext, unsetContext, pauseTest, resumeTest, } from './setup-context'; export { default as teardownContext } from './teardown-context'; export { default as setupRenderingContext, render, clearRender } from './setup-rendering-context'; export { default as teardownRenderingContext } from './teardown-rendering-context'; export { default as settled, isSettled, getState as getSettledState } from './settled'; export { default as waitUntil } from './wait-until'; export { default as validateErrorHandler } from './validate-error-handler'; // DOM Helpers export { default as click } from './dom/click'; export { default as focus } from './dom/focus'; export { default as blur } from './dom/blur'; export { default as triggerEvent } from './dom/blur'; export { default as triggerKeyEvent } from './dom/blur';
export { setResolver, getResolver } from './resolver'; export { setApplication } from './application'; export { default as setupContext, getContext, setContext, unsetContext, pauseTest, resumeTest, } from './setup-context'; export { default as teardownContext } from './teardown-context'; export { default as setupRenderingContext, render, clearRender } from './setup-rendering-context'; export { default as teardownRenderingContext } from './teardown-rendering-context'; export { default as settled, isSettled, getState as getSettledState } from './settled'; export { default as waitUntil } from './wait-until'; export { default as validateErrorHandler } from './validate-error-handler'; // DOM Helpers export { default as click } from './dom/click'; export { default as focus } from './dom/focus'; export { default as blur } from './dom/blur'; export { default as triggerEvent } from './dom/trigger-event'; export { default as triggerKeyEvent } from './dom/trigger-key-event';
Fix stupid mistake in reexports.
Fix stupid mistake in reexports.
JavaScript
apache-2.0
switchfly/ember-test-helpers,switchfly/ember-test-helpers
javascript
## Code Before: export { setResolver, getResolver } from './resolver'; export { setApplication } from './application'; export { default as setupContext, getContext, setContext, unsetContext, pauseTest, resumeTest, } from './setup-context'; export { default as teardownContext } from './teardown-context'; export { default as setupRenderingContext, render, clearRender } from './setup-rendering-context'; export { default as teardownRenderingContext } from './teardown-rendering-context'; export { default as settled, isSettled, getState as getSettledState } from './settled'; export { default as waitUntil } from './wait-until'; export { default as validateErrorHandler } from './validate-error-handler'; // DOM Helpers export { default as click } from './dom/click'; export { default as focus } from './dom/focus'; export { default as blur } from './dom/blur'; export { default as triggerEvent } from './dom/blur'; export { default as triggerKeyEvent } from './dom/blur'; ## Instruction: Fix stupid mistake in reexports. ## Code After: export { setResolver, getResolver } from './resolver'; export { setApplication } from './application'; export { default as setupContext, getContext, setContext, unsetContext, pauseTest, resumeTest, } from './setup-context'; export { default as teardownContext } from './teardown-context'; export { default as setupRenderingContext, render, clearRender } from './setup-rendering-context'; export { default as teardownRenderingContext } from './teardown-rendering-context'; export { default as settled, isSettled, getState as getSettledState } from './settled'; export { default as waitUntil } from './wait-until'; export { default as validateErrorHandler } from './validate-error-handler'; // DOM Helpers export { default as click } from './dom/click'; export { default as focus } from './dom/focus'; export { default as blur } from './dom/blur'; export { default as triggerEvent } from './dom/trigger-event'; export { default as triggerKeyEvent } from './dom/trigger-key-event';
ceeb64c9e46a74f95178be88566fba3d7f080fa1
mica/stats/tests/test_acq_stats.py
mica/stats/tests/test_acq_stats.py
from .. import acq_stats def test_calc_stats(): acq_stats.calc_stats(17210)
import tempfile import os from .. import acq_stats def test_calc_stats(): acq_stats.calc_stats(17210) def test_make_acq_stats(): """ Save the acq stats for one obsid into a newly-created table """ # Get a temporary file, but then delete it, because _save_acq_stats will only # make a new table if the supplied file doesn't exist fh, fn = tempfile.mkstemp(suffix='.h5') os.unlink(fn) acq_stats.table_file = fn obsid = 20001 obsid_info, acq, star_info, catalog, temp = acq_stats.calc_stats(obsid) t = acq_stats.table_acq_stats(obsid_info, acq, star_info, catalog, temp) acq_stats._save_acq_stats(t) os.unlink(fn)
Add a test that makes a new acq stats database
Add a test that makes a new acq stats database
Python
bsd-3-clause
sot/mica,sot/mica
python
## Code Before: from .. import acq_stats def test_calc_stats(): acq_stats.calc_stats(17210) ## Instruction: Add a test that makes a new acq stats database ## Code After: import tempfile import os from .. import acq_stats def test_calc_stats(): acq_stats.calc_stats(17210) def test_make_acq_stats(): """ Save the acq stats for one obsid into a newly-created table """ # Get a temporary file, but then delete it, because _save_acq_stats will only # make a new table if the supplied file doesn't exist fh, fn = tempfile.mkstemp(suffix='.h5') os.unlink(fn) acq_stats.table_file = fn obsid = 20001 obsid_info, acq, star_info, catalog, temp = acq_stats.calc_stats(obsid) t = acq_stats.table_acq_stats(obsid_info, acq, star_info, catalog, temp) acq_stats._save_acq_stats(t) os.unlink(fn)
e614efcf400f30e44dd79f827a5e101dc24dd996
README.md
README.md
Parse A-share market data feeds provided by [TDX](http://www.tdx.com.cn/) for historical prices of shares trading on Shanghai Stock Exchange and the Shenzhen Stock Exchange. Daily market data can be downloaded from: http://www.tdx.com.cn/download/sysj
Parse A-share market data feeds provided by [TDX](http://www.tdx.com.cn/) for historical prices of shares trading on Shanghai Stock Exchange and the Shenzhen Stock Exchange. Daily market data can be downloaded from: http://www.tdx.com.cn/list_66_69.html
Fix dead link for market data download
Fix dead link for market data download
Markdown
mit
ericyan/tdx
markdown
## Code Before: Parse A-share market data feeds provided by [TDX](http://www.tdx.com.cn/) for historical prices of shares trading on Shanghai Stock Exchange and the Shenzhen Stock Exchange. Daily market data can be downloaded from: http://www.tdx.com.cn/download/sysj ## Instruction: Fix dead link for market data download ## Code After: Parse A-share market data feeds provided by [TDX](http://www.tdx.com.cn/) for historical prices of shares trading on Shanghai Stock Exchange and the Shenzhen Stock Exchange. Daily market data can be downloaded from: http://www.tdx.com.cn/list_66_69.html
a0784f30728216795dcd10b9d731f6edee7376d1
psa/psas.json
psa/psas.json
[ "2019-10-08", "2019-10-11", "2019-10-14", "2019-10-19", "2019-10-22", "2019-12-04" ]
[ "2019-10-08" ,"2019-10-11" ,"2019-10-14" ,"2019-10-19" ,"2019-10-22" ,"2019-12-04" ]
Move comma to the next line (awkwardly) so the git diff is more sensible
Move comma to the next line (awkwardly) so the git diff is more sensible
JSON
mit
Orbiit/gunn-web-app,Orbiit/gunn-web-app
json
## Code Before: [ "2019-10-08", "2019-10-11", "2019-10-14", "2019-10-19", "2019-10-22", "2019-12-04" ] ## Instruction: Move comma to the next line (awkwardly) so the git diff is more sensible ## Code After: [ "2019-10-08" ,"2019-10-11" ,"2019-10-14" ,"2019-10-19" ,"2019-10-22" ,"2019-12-04" ]
14b5543bd80387fd8e4c8255e5d9b73d01781b15
generatorTests/test/gulpSpec.js
generatorTests/test/gulpSpec.js
var path = require('path'); var spawn = require('child_process').spawn; var fs = require('fs'); var path = require('path'); var chai = require('chai'); chai.use(require('chai-fs')); chai.should(); const ROOT_DIR = path.join(process.cwd(), '..'); describe('As a dev', function() { this.timeout(10000); describe('When running `gulp release --prod`', function() { before(function(done) { process.chdir(ROOT_DIR); var gulp = spawn('gulp', ['release', '--prod']) gulp.on('close', function() { done(); }); }); it('the release folder should exist', function() { var pathToTest = path.join(ROOT_DIR, 'release'); pathToTest.should.be.a.directory().and.not.empty; }); it('the release folder should contain atleast one zip file', function() { var pathToTest = path.join(ROOT_DIR, 'release/'); var directoryFileName = fs.readdirSync(pathToTest); var actualFileExtName = path.extname(directoryFileName); var expectedFileExtName = '.zip'; actualFileExtName.should.equal(expectedFileExtName); }); }); });
var path = require('path'); var spawn = require('child_process').spawn; var fs = require('fs'); var path = require('path'); var chai = require('chai'); chai.use(require('chai-fs')); chai.should(); const ROOT_DIR = path.join(process.cwd(), '..'); describe('As a dev', function() { this.timeout(10000); describe('When running `gulp release --prod`', function() { before(function(done) { process.chdir(ROOT_DIR); var gulp = spawn('gulp', ['release', '--prod']) gulp.on('close', function() { done(); }); }); it('the release folder should exist', function() { var pathToTest = path.join(ROOT_DIR, 'release'); pathToTest.should.be.a.directory().and.not.empty; }); it('the release folder should contain atleast one zip file', function() { var pathToTest = path.join(ROOT_DIR, 'release/'); var directoryFileName = fs.readdirSync(pathToTest); var actualFileExtName = path.extname(directoryFileName); var expectedFileExtName = '.zip'; actualFileExtName.should.equal(expectedFileExtName); }); }); describe('When running `gulp build --prod`', function() { before(function(done) { process.chdir(ROOT_DIR); var gulp = spawn('gulp', ['build', '--prod']) gulp.on('close', function() { done(); }); }); it('the build folder should exist', function() { var pathToTest = path.join(ROOT_DIR, 'build'); pathToTest.should.be.a.directory().and.not.empty; }); }); });
Add test for gulp build to check if build folder is created
test: Add test for gulp build to check if build folder is created
JavaScript
mit
code-computerlove/slate,cartridge/cartridge-cli,code-computerlove/slate,CodeRichardJohnston/slate-demo,cartridge/cartridge,code-computerlove/quarry,code-computerlove/slate-cli,CodeRichardJohnston/slate-demo
javascript
## Code Before: var path = require('path'); var spawn = require('child_process').spawn; var fs = require('fs'); var path = require('path'); var chai = require('chai'); chai.use(require('chai-fs')); chai.should(); const ROOT_DIR = path.join(process.cwd(), '..'); describe('As a dev', function() { this.timeout(10000); describe('When running `gulp release --prod`', function() { before(function(done) { process.chdir(ROOT_DIR); var gulp = spawn('gulp', ['release', '--prod']) gulp.on('close', function() { done(); }); }); it('the release folder should exist', function() { var pathToTest = path.join(ROOT_DIR, 'release'); pathToTest.should.be.a.directory().and.not.empty; }); it('the release folder should contain atleast one zip file', function() { var pathToTest = path.join(ROOT_DIR, 'release/'); var directoryFileName = fs.readdirSync(pathToTest); var actualFileExtName = path.extname(directoryFileName); var expectedFileExtName = '.zip'; actualFileExtName.should.equal(expectedFileExtName); }); }); }); ## Instruction: test: Add test for gulp build to check if build folder is created ## Code After: var path = require('path'); var spawn = require('child_process').spawn; var fs = require('fs'); var path = require('path'); var chai = require('chai'); chai.use(require('chai-fs')); chai.should(); const ROOT_DIR = path.join(process.cwd(), '..'); describe('As a dev', function() { this.timeout(10000); describe('When running `gulp release --prod`', function() { before(function(done) { process.chdir(ROOT_DIR); var gulp = spawn('gulp', ['release', '--prod']) gulp.on('close', function() { done(); }); }); it('the release folder should exist', function() { var pathToTest = path.join(ROOT_DIR, 'release'); pathToTest.should.be.a.directory().and.not.empty; }); it('the release folder should contain atleast one zip file', function() { var pathToTest = path.join(ROOT_DIR, 'release/'); var directoryFileName = fs.readdirSync(pathToTest); var actualFileExtName = path.extname(directoryFileName); var expectedFileExtName = '.zip'; actualFileExtName.should.equal(expectedFileExtName); }); }); describe('When running `gulp build --prod`', function() { before(function(done) { process.chdir(ROOT_DIR); var gulp = spawn('gulp', ['build', '--prod']) gulp.on('close', function() { done(); }); }); it('the build folder should exist', function() { var pathToTest = path.join(ROOT_DIR, 'build'); pathToTest.should.be.a.directory().and.not.empty; }); }); });
875a6743106326c271dd097ce077d884c1a1b94f
lib/route/integration_api.js
lib/route/integration_api.js
'use strict'; const express = require('express'), router = express.Router(); module.exports = passport => { router.get( '/', passport.authenticate('bearer', { session: false }), (req, res) => res.json({ok : true}) ); router.get( '/report/allowance', passport.authenticate('bearer', { session: false }), (req, res) => { return res.json({foo : 'bar'}); } ); return router; }
'use strict'; const express = require('express'), router = express.Router(); module.exports = passport => { router.all( /.*/, passport.authenticate('bearer', { session: false }), (req, res, next) => { if ( req.isAuthenticated() ) { return next(); } return res.status(401).json({ ok : false}); }); router.get( '/', (req, res) => res.json({ok : true}) ); router.get( '/report/allowance', (req, res) => { return res.json({foo : 'bar'}); } ); return router; }
Use global auth check per Integration API handler
Use global auth check per Integration API handler
JavaScript
mit
timeoff-management/application,timeoff-management/application
javascript
## Code Before: 'use strict'; const express = require('express'), router = express.Router(); module.exports = passport => { router.get( '/', passport.authenticate('bearer', { session: false }), (req, res) => res.json({ok : true}) ); router.get( '/report/allowance', passport.authenticate('bearer', { session: false }), (req, res) => { return res.json({foo : 'bar'}); } ); return router; } ## Instruction: Use global auth check per Integration API handler ## Code After: 'use strict'; const express = require('express'), router = express.Router(); module.exports = passport => { router.all( /.*/, passport.authenticate('bearer', { session: false }), (req, res, next) => { if ( req.isAuthenticated() ) { return next(); } return res.status(401).json({ ok : false}); }); router.get( '/', (req, res) => res.json({ok : true}) ); router.get( '/report/allowance', (req, res) => { return res.json({foo : 'bar'}); } ); return router; }
2b6b2ab4a912e4a93c760f5206b139c8f1bc1926
app/controllers/test_emails_controller.rb
app/controllers/test_emails_controller.rb
class TestEmailsController < ApplicationController # We're using the simple_format helper below. Ugly but quick by bringing it into the controller include ActionView::Helpers::TextHelper def new @from = "contact@openaustraliafoundation.org.au" @to = "Matthew Landauer <matthew@openaustralia.org>" @subject = "This is a test email from Cuttlefish" @text = <<-EOF Hello folks. Hopefully this should have worked and you should be reading this. So, all is good. Love, The Awesome Cuttlefish <a href="http://cuttlefish.io">http://cuttlefish.io</a> EOF end # Send a test email def create app = App.find(params[:app_id]) authorize app, :show? TestMailer.test_email(app, from: params[:from], to: params[:to], cc: params[:cc], subject: params[:subject], text: params[:text]).deliver flash[:notice] = "Test email sent" redirect_to deliveries_url end end
class TestEmailsController < ApplicationController # We're using the simple_format helper below. Ugly but quick by bringing it into the controller include ActionView::Helpers::TextHelper def new @from = "contact@oaf.org.au" @to = "Matthew Landauer <matthew@oaf.org.au>" @subject = "This is a test email from Cuttlefish" @text = <<-EOF Hello folks. Hopefully this should have worked and you should be reading this. So, all is good. Love, The Awesome Cuttlefish <a href="http://cuttlefish.io">http://cuttlefish.io</a> EOF end # Send a test email def create app = App.find(params[:app_id]) authorize app, :show? TestMailer.test_email(app, from: params[:from], to: params[:to], cc: params[:cc], subject: params[:subject], text: params[:text]).deliver flash[:notice] = "Test email sent" redirect_to deliveries_url end end
Change default email addresses of test email
Change default email addresses of test email
Ruby
agpl-3.0
idlweb/cuttlefish,idlweb/cuttlefish,idlweb/cuttlefish,pratyushmittal/cuttlefish,pratyushmittal/cuttlefish,pratyushmittal/cuttlefish,pratyushmittal/cuttlefish,idlweb/cuttlefish
ruby
## Code Before: class TestEmailsController < ApplicationController # We're using the simple_format helper below. Ugly but quick by bringing it into the controller include ActionView::Helpers::TextHelper def new @from = "contact@openaustraliafoundation.org.au" @to = "Matthew Landauer <matthew@openaustralia.org>" @subject = "This is a test email from Cuttlefish" @text = <<-EOF Hello folks. Hopefully this should have worked and you should be reading this. So, all is good. Love, The Awesome Cuttlefish <a href="http://cuttlefish.io">http://cuttlefish.io</a> EOF end # Send a test email def create app = App.find(params[:app_id]) authorize app, :show? TestMailer.test_email(app, from: params[:from], to: params[:to], cc: params[:cc], subject: params[:subject], text: params[:text]).deliver flash[:notice] = "Test email sent" redirect_to deliveries_url end end ## Instruction: Change default email addresses of test email ## Code After: class TestEmailsController < ApplicationController # We're using the simple_format helper below. Ugly but quick by bringing it into the controller include ActionView::Helpers::TextHelper def new @from = "contact@oaf.org.au" @to = "Matthew Landauer <matthew@oaf.org.au>" @subject = "This is a test email from Cuttlefish" @text = <<-EOF Hello folks. Hopefully this should have worked and you should be reading this. So, all is good. Love, The Awesome Cuttlefish <a href="http://cuttlefish.io">http://cuttlefish.io</a> EOF end # Send a test email def create app = App.find(params[:app_id]) authorize app, :show? TestMailer.test_email(app, from: params[:from], to: params[:to], cc: params[:cc], subject: params[:subject], text: params[:text]).deliver flash[:notice] = "Test email sent" redirect_to deliveries_url end end
195fce77418f0f5ec39be9f60202a25035de63b0
local-volume/examples/local-reader.yaml
local-volume/examples/local-reader.yaml
apiVersion: extensions/v1beta1 kind: Deployment metadata: name: local-test-reader spec: replicas: 1 template: metadata: labels: app: local-test-reader spec: terminationGracePeriodSeconds: 10 containers: - name: reader image: gcr.io/google_containers/busybox:1.24 command: - "/bin/sh" args: - "-c" - "tail -f /usr/test-pod/test_file" volumeMounts: - name: local-vol mountPath: /usr/test-pod volumes: - name: local-vol persistentVolumeClaim: claimName: "example-local-claim"
apiVersion: extensions/v1beta1 kind: Deployment metadata: name: local-test-reader spec: replicas: 1 template: metadata: labels: app: local-test-reader spec: terminationGracePeriodSeconds: 10 containers: - name: reader image: gcr.io/google_containers/busybox:1.24 command: - "/bin/sh" args: - "-c" - "tail -f /usr/test-pod/test_file" volumeMounts: - name: local-vol mountPath: /usr/test-pod volumes: - name: local-vol persistentVolumeClaim: claimName: "local-vol-local-test-0"
Revert "change claim name so it works out of the box"
Revert "change claim name so it works out of the box" This reverts commit da46204f0af16504709902a5c10070739f800a46.
YAML
apache-2.0
kubernetes-incubator/external-storage,childsb/external-storage,dhirajh/external-storage,humblec/external-storage,dhirajh/external-storage,dhirajh/external-storage,dhirajh/external-storage,humblec/external-storage,childsb/external-storage,humblec/external-storage,kubernetes-incubator/external-storage,childsb/external-storage,kubernetes-incubator/external-storage
yaml
## Code Before: apiVersion: extensions/v1beta1 kind: Deployment metadata: name: local-test-reader spec: replicas: 1 template: metadata: labels: app: local-test-reader spec: terminationGracePeriodSeconds: 10 containers: - name: reader image: gcr.io/google_containers/busybox:1.24 command: - "/bin/sh" args: - "-c" - "tail -f /usr/test-pod/test_file" volumeMounts: - name: local-vol mountPath: /usr/test-pod volumes: - name: local-vol persistentVolumeClaim: claimName: "example-local-claim" ## Instruction: Revert "change claim name so it works out of the box" This reverts commit da46204f0af16504709902a5c10070739f800a46. ## Code After: apiVersion: extensions/v1beta1 kind: Deployment metadata: name: local-test-reader spec: replicas: 1 template: metadata: labels: app: local-test-reader spec: terminationGracePeriodSeconds: 10 containers: - name: reader image: gcr.io/google_containers/busybox:1.24 command: - "/bin/sh" args: - "-c" - "tail -f /usr/test-pod/test_file" volumeMounts: - name: local-vol mountPath: /usr/test-pod volumes: - name: local-vol persistentVolumeClaim: claimName: "local-vol-local-test-0"
6c1afe99ed4455336e7231c75d7dcf10b3995896
Cargo.toml
Cargo.toml
[package] name = "lightning" version = "1.0.0" authors = ["Chris Krycho <hello@chriskrycho.com>"] edition = "2018" [dependencies] chrono = { version = "0.4", features = ["serde"] } clap = "3.0.0-beta.2" glob = "0.3" json5 = "0.3" lazy_static = "1.4" lx-json-feed = { path = "./crates/json-feed" } pulldown-cmark = { version = "0.8", default-features = false } rayon = "1.5.0" regex = "1.4" serde = "1.0" serde_derive = "1.0" serde_yaml = "0.8" slug = "0.1" syntect = "4.5" tera = "1" uuid = {version = "0.8", features = ["serde", "v5"]} yaml-rust = "0.4"
[package] name = "lightning" version = "1.0.0" authors = ["Chris Krycho <hello@chriskrycho.com>"] edition = "2018" [dependencies] chrono = { version = "0.4", features = ["serde"] } clap = "3.0.0-beta.2" glob = "0.3" json5 = "0.3" lazy_static = "1.4" lx-json-feed = { path = "./crates/json-feed" } pulldown-cmark = { version = "0.8", default-features = false, features = ["simd"] } rayon = "1.5.0" regex = "1.4" serde = "1.0" serde_derive = "1.0" serde_yaml = "0.8" slug = "0.1" syntect = "4.5" tera = "1" uuid = {version = "0.8", features = ["serde", "v5"]} yaml-rust = "0.4"
Use SIMD feature from pulldown-cmark
Use SIMD feature from pulldown-cmark BECAUSE I CAN
TOML
mit
chriskrycho/lightning-rs,chriskrycho/lightning-rs
toml
## Code Before: [package] name = "lightning" version = "1.0.0" authors = ["Chris Krycho <hello@chriskrycho.com>"] edition = "2018" [dependencies] chrono = { version = "0.4", features = ["serde"] } clap = "3.0.0-beta.2" glob = "0.3" json5 = "0.3" lazy_static = "1.4" lx-json-feed = { path = "./crates/json-feed" } pulldown-cmark = { version = "0.8", default-features = false } rayon = "1.5.0" regex = "1.4" serde = "1.0" serde_derive = "1.0" serde_yaml = "0.8" slug = "0.1" syntect = "4.5" tera = "1" uuid = {version = "0.8", features = ["serde", "v5"]} yaml-rust = "0.4" ## Instruction: Use SIMD feature from pulldown-cmark BECAUSE I CAN ## Code After: [package] name = "lightning" version = "1.0.0" authors = ["Chris Krycho <hello@chriskrycho.com>"] edition = "2018" [dependencies] chrono = { version = "0.4", features = ["serde"] } clap = "3.0.0-beta.2" glob = "0.3" json5 = "0.3" lazy_static = "1.4" lx-json-feed = { path = "./crates/json-feed" } pulldown-cmark = { version = "0.8", default-features = false, features = ["simd"] } rayon = "1.5.0" regex = "1.4" serde = "1.0" serde_derive = "1.0" serde_yaml = "0.8" slug = "0.1" syntect = "4.5" tera = "1" uuid = {version = "0.8", features = ["serde", "v5"]} yaml-rust = "0.4"
d51d8cdfd9d0d3f2893f04108188d5be1b90e826
spec/support/example1-after.js
spec/support/example1-after.js
/*************************************************************************** * * * EXAMPLE1 * * * * * CONTENTS LINE * * object ........................................................... 17 * * Core Properties .................................................. 22 * * Special Properties ............................................... 28 * * * /***************************************************************************/ /*************************************************************************** * BEGIN MODULE * ****************************************************************************/ var object = { /* Core Properties /*************************************************************************/ foo: 'bar', bar: 'baz', /* Special Properties /*************************************************************************/ baz: 'qux' };
/*************************************************************************** * * * EXAMPLE1 * * * * CONTENTS LINE * * object ........................................................... 17 * * Core Properties .................................................. 22 * * Special Properties ............................................... 28 * * * /***************************************************************************/ /*************************************************************************** * BEGIN MODULE * ****************************************************************************/ var object = { /* Core Properties /*************************************************************************/ foo: 'bar', bar: 'baz', /* Special Properties /*************************************************************************/ baz: 'qux' };
Fix error in expected TOC output
Fix error in expected TOC output
JavaScript
mit
danascheider/toc,danascheider/toc
javascript
## Code Before: /*************************************************************************** * * * EXAMPLE1 * * * * * CONTENTS LINE * * object ........................................................... 17 * * Core Properties .................................................. 22 * * Special Properties ............................................... 28 * * * /***************************************************************************/ /*************************************************************************** * BEGIN MODULE * ****************************************************************************/ var object = { /* Core Properties /*************************************************************************/ foo: 'bar', bar: 'baz', /* Special Properties /*************************************************************************/ baz: 'qux' }; ## Instruction: Fix error in expected TOC output ## Code After: /*************************************************************************** * * * EXAMPLE1 * * * * CONTENTS LINE * * object ........................................................... 17 * * Core Properties .................................................. 22 * * Special Properties ............................................... 28 * * * /***************************************************************************/ /*************************************************************************** * BEGIN MODULE * ****************************************************************************/ var object = { /* Core Properties /*************************************************************************/ foo: 'bar', bar: 'baz', /* Special Properties /*************************************************************************/ baz: 'qux' };
1020d6119c7f70c57ef48e62ed85576792c0f97e
docs/api.rst
docs/api.rst
The Python API ============== The following classes/modules are included **tag**'s Python API, which is under `semantic versioning <http://semver.org>`_. Range ----- .. automodule:: tag.range :members: Comment ------- .. automodule:: tag.comment :members: Directive --------- .. automodule:: tag.directive :members: Sequence -------- .. automodule:: tag.sequence :members: Feature ------- .. automodule:: tag.feature :members: Index ----- .. automodule:: tag.index :members: Readers ------- Currently the :code:`readers` module contains only a single class, GFF3Reader, but may include others in the future. .. automodule:: tag.reader :members: Writers ------- Currently the :code:`writers` module contains only a single class, GFF3Writer, but may include others in the future. .. automodule:: tag.writer :members: Selectors --------- .. automodule:: tag.select :members:
The Python API ============== The following classes/modules are included **tag**'s Python API, which is under `semantic versioning <http://semver.org>`_. Range ----- .. automodule:: tag.range :members: Comment ------- .. automodule:: tag.comment :members: Directive --------- .. automodule:: tag.directive :members: Sequence -------- .. automodule:: tag.sequence :members: Feature ------- .. automodule:: tag.feature :members: Index ----- .. automodule:: tag.index :members: Readers ------- Currently the :code:`readers` module contains only a single class, GFF3Reader, but may include others in the future. .. automodule:: tag.reader :members: Writers ------- Currently the :code:`writers` module contains only a single class, GFF3Writer, but may include others in the future. .. automodule:: tag.writer :members: Transcript ---------- .. automodule:: tag.transcript :members: Selectors --------- .. automodule:: tag.select :members:
Bring docs up to date
Bring docs up to date
reStructuredText
bsd-3-clause
standage/aeneas,standage/tag
restructuredtext
## Code Before: The Python API ============== The following classes/modules are included **tag**'s Python API, which is under `semantic versioning <http://semver.org>`_. Range ----- .. automodule:: tag.range :members: Comment ------- .. automodule:: tag.comment :members: Directive --------- .. automodule:: tag.directive :members: Sequence -------- .. automodule:: tag.sequence :members: Feature ------- .. automodule:: tag.feature :members: Index ----- .. automodule:: tag.index :members: Readers ------- Currently the :code:`readers` module contains only a single class, GFF3Reader, but may include others in the future. .. automodule:: tag.reader :members: Writers ------- Currently the :code:`writers` module contains only a single class, GFF3Writer, but may include others in the future. .. automodule:: tag.writer :members: Selectors --------- .. automodule:: tag.select :members: ## Instruction: Bring docs up to date ## Code After: The Python API ============== The following classes/modules are included **tag**'s Python API, which is under `semantic versioning <http://semver.org>`_. Range ----- .. automodule:: tag.range :members: Comment ------- .. automodule:: tag.comment :members: Directive --------- .. automodule:: tag.directive :members: Sequence -------- .. automodule:: tag.sequence :members: Feature ------- .. automodule:: tag.feature :members: Index ----- .. automodule:: tag.index :members: Readers ------- Currently the :code:`readers` module contains only a single class, GFF3Reader, but may include others in the future. .. automodule:: tag.reader :members: Writers ------- Currently the :code:`writers` module contains only a single class, GFF3Writer, but may include others in the future. .. automodule:: tag.writer :members: Transcript ---------- .. automodule:: tag.transcript :members: Selectors --------- .. automodule:: tag.select :members:
eb7e540d54f4c163af187d72ff55448dc901f0b9
generators/database/generateMongodbDatabase.js
generators/database/generateMongodbDatabase.js
import { join } from 'path'; import { replaceCode, appendFile, addNpmPackage } from '../utils'; async function generateMongodbDatabase(params) { switch (params.framework) { case 'express': const app = join(__base, 'build', params.uuid, 'app.js'); const mongooseRequire = join(__base, 'modules', 'database', 'mongodb', 'mongoose-require.js'); const mongooseConnect = join(__base, 'modules', 'database', 'mongodb', 'mongoose-connect.js'); await replaceCode(app, 'DATABASE_REQUIRE', mongooseRequire); await replaceCode(app, 'DATABASE_CONNECTION', mongooseConnect, { leadingBlankLine: true }); await appendFile(join(__base, 'build', params.uuid, '.env'), 'MONGODB=mongodb://localhost/test'); await addNpmPackage('mongoose', params); break; case 'hapi': break; case 'meteor': break; default: } } export default generateMongodbDatabase;
import { join } from 'path'; import { replaceCode, mkdirs, copy, appendFile, addNpmPackage } from '../utils'; async function generateMongodbDatabase(params) { const build = join(__base, 'build', params.uuid); switch (params.framework) { case 'express': const app = join(build, 'app.js'); const mongooseRequire = join(__base, 'modules', 'database', 'mongodb', 'mongoose-require.js'); const mongooseConnect = join(__base, 'modules', 'database', 'mongodb', 'mongoose-connect.js'); const mongooseUserModel = join(__base, 'modules', 'database', 'mongodb', 'mongoose-model.js'); await replaceCode(app, 'DATABASE_REQUIRE', mongooseRequire); await replaceCode(app, 'DATABASE_CONNECTION', mongooseConnect, { leadingBlankLine: true }); await appendFile(join(build, '.env'), 'MONGODB=mongodb://localhost/test'); await addNpmPackage('mongoose', params); // Create models dir await mkdirs(join(__base, 'build', params.uuid, 'models')); await copy(mongooseUserModel, join(build, 'models', 'user.js')); break; case 'hapi': break; case 'meteor': break; default: } } export default generateMongodbDatabase;
Create models dir and copy user model for mongodb generator
Create models dir and copy user model for mongodb generator
JavaScript
mit
sahat/boilerplate,sahat/boilerplate,sahat/megaboilerplate,sahat/megaboilerplate,sahat/megaboilerplate
javascript
## Code Before:
import { join } from 'path';
import { replaceCode, appendFile, addNpmPackage } from '../utils';

async function generateMongodbDatabase(params) {
  switch (params.framework) {
    case 'express':
      const app = join(__base, 'build', params.uuid, 'app.js');
      const mongooseRequire = join(__base, 'modules', 'database', 'mongodb', 'mongoose-require.js');
      const mongooseConnect = join(__base, 'modules', 'database', 'mongodb', 'mongoose-connect.js');

      await replaceCode(app, 'DATABASE_REQUIRE', mongooseRequire);
      await replaceCode(app, 'DATABASE_CONNECTION', mongooseConnect, { leadingBlankLine: true });

      await appendFile(join(__base, 'build', params.uuid, '.env'), 'MONGODB=mongodb://localhost/test');

      await addNpmPackage('mongoose', params);
      break;

    case 'hapi':
      break;

    case 'meteor':
      break;

    default:
  }
}

export default generateMongodbDatabase;

## Instruction:
Create models dir and copy user model for mongodb generator

## Code After:
import { join } from 'path';
import { replaceCode, mkdirs, copy, appendFile, addNpmPackage } from '../utils';

async function generateMongodbDatabase(params) {
  const build = join(__base, 'build', params.uuid);

  switch (params.framework) {
    case 'express':
      const app = join(build, 'app.js');
      const mongooseRequire = join(__base, 'modules', 'database', 'mongodb', 'mongoose-require.js');
      const mongooseConnect = join(__base, 'modules', 'database', 'mongodb', 'mongoose-connect.js');
      const mongooseUserModel = join(__base, 'modules', 'database', 'mongodb', 'mongoose-model.js');

      await replaceCode(app, 'DATABASE_REQUIRE', mongooseRequire);
      await replaceCode(app, 'DATABASE_CONNECTION', mongooseConnect, { leadingBlankLine: true });

      await appendFile(join(build, '.env'), 'MONGODB=mongodb://localhost/test');

      await addNpmPackage('mongoose', params);

      // Create models dir
      await mkdirs(join(__base, 'build', params.uuid, 'models'));
      await copy(mongooseUserModel, join(build, 'models', 'user.js'));
      break;

    case 'hapi':
      break;

    case 'meteor':
      break;

    default:
  }
}

export default generateMongodbDatabase;
965718e27a2adda88909efca386930d653fb9b49
.travis.yml
.travis.yml
env: global: - CC_TEST_REPORTER_ID=1561686e6399317b53a92aaad0550f8b91fe5af312e1f2852ab803ff6fcb6fa9 language: python python: - "3.6" install: - pip install -r requirements.txt - pip install pytest pytest-cov codeclimate-test-reporter script: - py.test -v --cov=. tests/ after_success: - codeclimate-test-reporter notifications: slack: secure: QPq14naOvUQjnVZJrSRo6K5oRIBOHiZC7bfEKJxGEYa4Xi0x3o9maNLYtcgw5ZLgkv15e31yb/MeLF1F0DiNTBc9j4mn7a0PYALhn5kaeXel+J+xgrm2FGx+uF7BK0zQeWGTZAdX1WO8S00Ri7PjqMsSFyMAlHqoqWHONrb2fuU=
env: global: - CC_TEST_REPORTER_ID=1561686e6399317b53a92aaad0550f8b91fe5af312e1f2852ab803ff6fcb6fa9 language: python python: - "3.6" install: - pip install -r requirements.txt - pip install pytest before_script: - curl -L https://codeclimate.com/downloads/test-reporter/test-reporter-latest-linux-amd64 > ./cc-test-reporter - chmod +x ./cc-test-reporter script: - py.test -v tests/ after_success: - ./cc-test-reporter after-build --exit-code $TRAVIS_TEST_RESULT notifications: slack: secure: QPq14naOvUQjnVZJrSRo6K5oRIBOHiZC7bfEKJxGEYa4Xi0x3o9maNLYtcgw5ZLgkv15e31yb/MeLF1F0DiNTBc9j4mn7a0PYALhn5kaeXel+J+xgrm2FGx+uF7BK0zQeWGTZAdX1WO8S00Ri7PjqMsSFyMAlHqoqWHONrb2fuU=
Apply new coverage report settings at code climate
:hammer: Apply new coverage report settings at code climate
YAML
mit
tadashi-aikawa/gemini
yaml
## Code Before: env: global: - CC_TEST_REPORTER_ID=1561686e6399317b53a92aaad0550f8b91fe5af312e1f2852ab803ff6fcb6fa9 language: python python: - "3.6" install: - pip install -r requirements.txt - pip install pytest pytest-cov codeclimate-test-reporter script: - py.test -v --cov=. tests/ after_success: - codeclimate-test-reporter notifications: slack: secure: QPq14naOvUQjnVZJrSRo6K5oRIBOHiZC7bfEKJxGEYa4Xi0x3o9maNLYtcgw5ZLgkv15e31yb/MeLF1F0DiNTBc9j4mn7a0PYALhn5kaeXel+J+xgrm2FGx+uF7BK0zQeWGTZAdX1WO8S00Ri7PjqMsSFyMAlHqoqWHONrb2fuU= ## Instruction: :hammer: Apply new coverage report settings at code climate ## Code After: env: global: - CC_TEST_REPORTER_ID=1561686e6399317b53a92aaad0550f8b91fe5af312e1f2852ab803ff6fcb6fa9 language: python python: - "3.6" install: - pip install -r requirements.txt - pip install pytest before_script: - curl -L https://codeclimate.com/downloads/test-reporter/test-reporter-latest-linux-amd64 > ./cc-test-reporter - chmod +x ./cc-test-reporter script: - py.test -v tests/ after_success: - ./cc-test-reporter after-build --exit-code $TRAVIS_TEST_RESULT notifications: slack: secure: QPq14naOvUQjnVZJrSRo6K5oRIBOHiZC7bfEKJxGEYa4Xi0x3o9maNLYtcgw5ZLgkv15e31yb/MeLF1F0DiNTBc9j4mn7a0PYALhn5kaeXel+J+xgrm2FGx+uF7BK0zQeWGTZAdX1WO8S00Ri7PjqMsSFyMAlHqoqWHONrb2fuU=
8bacc2115e4d941d75990362ace0a7af716dff1e
lib/business/data/padca.yml
lib/business/data/padca.yml
working_days: - monday - tuesday - wednesday - thursday - friday holidays: - January 1st, 2018 - January 2nd, 2018 - February 12th, 2018 - February 19th, 2018 - March 30th, 2018 - April 2nd, 2018 - May 21st, 2018 - June 21st, 2018 - June 25th, 2018 - July 2nd, 2018 - July 9th, 2018 - August 6th, 2018 - August 20th, 2018 - September 3rd, 2018 - October 8th, 2018 - November 12th, 2018 - December 25th, 2018 - December 26th, 2018 - January 1st, 2019 - January 2nd, 2019 - February 11th, 2019 - February 18th, 2019 - April 19th, 2019 - April 22nd, 2019 - May 20th, 2019 - June 21st, 2019 - June 24th, 2019 - July 1st, 2019 - July 9th, 2019 - August 5th, 2019 - August 19th, 2019 - September 2nd, 2019 - October 14th, 2019 - November 11th, 2019 - December 25th, 2019 - December 26th, 2019
working_days: - monday - tuesday - wednesday - thursday - friday holidays: - January 1st, 2018 - March 30th, 2018 - May 21st, 2018 - July 2nd, 2018 - September 3rd, 2018 - October 8th, 2018 - November 12th, 2018 - December 25th, 2018 - December 26th, 2018 - January 1st, 2019 - April 19th, 2019 - May 20th, 2019 - July 1st, 2019 - September 2nd, 2019 - October 14th, 2019 - November 11th, 2019 - December 25th, 2019 - December 26th, 2019 - January 1st, 2020
Remove regional holiday days. Add 2020 calculated days
Remove regional holiday days. Add 2020 calculated days
YAML
mit
gocardless/business
yaml
## Code Before: working_days: - monday - tuesday - wednesday - thursday - friday holidays: - January 1st, 2018 - January 2nd, 2018 - February 12th, 2018 - February 19th, 2018 - March 30th, 2018 - April 2nd, 2018 - May 21st, 2018 - June 21st, 2018 - June 25th, 2018 - July 2nd, 2018 - July 9th, 2018 - August 6th, 2018 - August 20th, 2018 - September 3rd, 2018 - October 8th, 2018 - November 12th, 2018 - December 25th, 2018 - December 26th, 2018 - January 1st, 2019 - January 2nd, 2019 - February 11th, 2019 - February 18th, 2019 - April 19th, 2019 - April 22nd, 2019 - May 20th, 2019 - June 21st, 2019 - June 24th, 2019 - July 1st, 2019 - July 9th, 2019 - August 5th, 2019 - August 19th, 2019 - September 2nd, 2019 - October 14th, 2019 - November 11th, 2019 - December 25th, 2019 - December 26th, 2019 ## Instruction: Remove regional holiday days. Add 2020 calculated days ## Code After: working_days: - monday - tuesday - wednesday - thursday - friday holidays: - January 1st, 2018 - March 30th, 2018 - May 21st, 2018 - July 2nd, 2018 - September 3rd, 2018 - October 8th, 2018 - November 12th, 2018 - December 25th, 2018 - December 26th, 2018 - January 1st, 2019 - April 19th, 2019 - May 20th, 2019 - July 1st, 2019 - September 2nd, 2019 - October 14th, 2019 - November 11th, 2019 - December 25th, 2019 - December 26th, 2019 - January 1st, 2020
0ce9b2ebad0771b1dcdd9abd3c57489aa7ab6969
README.md
README.md
This is the training material for the Apache Flink workshop at JCConf2016.
This is the training material for the [Apache Flink](https://flink.apache.org/) workshop at JCConf2016. The tutorial code is still work in progress! However, workshop attendees can still download this project for some pre-workshop preparation by following the instructions below. ## Preparation Instructions for JCConf2016 Attendees The project contains a relatively large-sized sample streaming dataset (at `src/main/resources/nycTaxiRides.gz`), and scripts to download some required binaries for the tutorial environment. So, you may want to download these before the workshop. First, clone the project: ``` $ git clone https://github.com/flink-taiwan/jcconf2016-workshop.git ``` Afterwards, install all the required software dependencies using [Maven](https://maven.apache.org/download.cgi): ``` cd jcconf2016-workshop mvn clean install ``` The software dependencies that will be used by this tutorial will be installed in your local Maven repository. Next, download software binaries for systems that we will be deploying for our tutorial environment: ``` cd jcconf2016-workshop bin/prepare-all-deployment.sh ``` The above script will download and prepare all the required binaries. This might take a few minutes, as it needs to download binaries of Apache Flink, Apache Kafka, InfluxDB, and Grafana. After it finishes, you can checkout the downloaded binaries under the directory `jcconf2016-workshop/deploy/`. # Disclaimer Apache®, Apache Flink™, Flink™, and the Apache feather logo are trademarks of [The Apache Software Foundation](http://apache.org/).
Add instructions for pre-downloading resources
Add instructions for pre-downloading resources
Markdown
apache-2.0
flink-taiwan/jcconf2016-workshop,flink-taiwan/jcconf2016-workshop
markdown
## Code Before: This is the training material for the Apache Flink workshop at JCConf2016. ## Instruction: Add instructions for pre-downloading resources ## Code After: This is the training material for the [Apache Flink](https://flink.apache.org/) workshop at JCConf2016. The tutorial code is still work in progress! However, workshop attendees can still download this project for some pre-workshop preparation by following the instructions below. ## Preparation Instructions for JCConf2016 Attendees The project contains a relatively large-sized sample streaming dataset (at `src/main/resources/nycTaxiRides.gz`), and scripts to download some required binaries for the tutorial environment. So, you may want to download these before the workshop. First, clone the project: ``` $ git clone https://github.com/flink-taiwan/jcconf2016-workshop.git ``` Afterwards, install all the required software dependencies using [Maven](https://maven.apache.org/download.cgi): ``` cd jcconf2016-workshop mvn clean install ``` The software dependencies that will be used by this tutorial will be installed in your local Maven repository. Next, download software binaries for systems that we will be deploying for our tutorial environment: ``` cd jcconf2016-workshop bin/prepare-all-deployment.sh ``` The above script will download and prepare all the required binaries. This might take a few minutes, as it needs to download binaries of Apache Flink, Apache Kafka, InfluxDB, and Grafana. After it finishes, you can checkout the downloaded binaries under the directory `jcconf2016-workshop/deploy/`. # Disclaimer Apache®, Apache Flink™, Flink™, and the Apache feather logo are trademarks of [The Apache Software Foundation](http://apache.org/).
56aefa797493832fcc2e74d1d44cb1e8366d58c2
Resources/config/doctrine/metadata/mongodb/Bundle.FOS.UserBundle.Document.Group.dcm.xml
Resources/config/doctrine/metadata/mongodb/Bundle.FOS.UserBundle.Document.Group.dcm.xml
<?xml version="1.0" encoding="UTF-8"?> <doctrine-mongo-mapping xmlns="http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping.xsd"> <document name="Bundle\FOS\UserBundle\Document\Group" collection="fos_user_group"> <field name="name" fieldName="name" type="string" /> <field name="roles" fieldName="roles" type="hash" /> <indexes> <index unique="true" dropDups="true"> <key name="name" order="asc" /> <option name="safe" value="true" /> </index> </indexes> </document> </doctrine-mongo-mapping>
<?xml version="1.0" encoding="UTF-8"?> <doctrine-mongo-mapping xmlns="http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping.xsd"> <document name="Bundle\FOS\UserBundle\Document\Group" collection="fos_user_group"> <field name="id" fieldName="id" id="true" /> <field name="name" fieldName="name" type="string" /> <field name="roles" fieldName="roles" type="hash" /> <indexes> <index unique="true" dropDups="true"> <key name="name" order="asc" /> <option name="safe" value="true" /> </index> </indexes> </document> </doctrine-mongo-mapping>
Add back ODM mapping for Group class' ID property
Add back ODM mapping for Group class' ID property See: c8ba7b6396fffad4ed755d650c06518693577c4c
XML
mit
XWB/FOSUserBundle,XWB/FOSUserBundle,FriendsOfSymfony/FOSUserBundle
xml
## Code Before: <?xml version="1.0" encoding="UTF-8"?> <doctrine-mongo-mapping xmlns="http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping.xsd"> <document name="Bundle\FOS\UserBundle\Document\Group" collection="fos_user_group"> <field name="name" fieldName="name" type="string" /> <field name="roles" fieldName="roles" type="hash" /> <indexes> <index unique="true" dropDups="true"> <key name="name" order="asc" /> <option name="safe" value="true" /> </index> </indexes> </document> </doctrine-mongo-mapping> ## Instruction: Add back ODM mapping for Group class' ID property See: c8ba7b6396fffad4ed755d650c06518693577c4c ## Code After: <?xml version="1.0" encoding="UTF-8"?> <doctrine-mongo-mapping xmlns="http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping http://doctrine-project.org/schemas/odm/doctrine-mongo-mapping.xsd"> <document name="Bundle\FOS\UserBundle\Document\Group" collection="fos_user_group"> <field name="id" fieldName="id" id="true" /> <field name="name" fieldName="name" type="string" /> <field name="roles" fieldName="roles" type="hash" /> <indexes> <index unique="true" dropDups="true"> <key name="name" order="asc" /> <option name="safe" value="true" /> </index> </indexes> </document> </doctrine-mongo-mapping>
1c4ebcfd5eea227faff1eb4445ee41bef3fdbd1c
README.md
README.md
Some fun ascii art for a fun place to work.
Some fun ascii art for a fun place to work. ## How was it made? A combination of the following tools were used to create the logo - vim - http://vimawesome.com/plugin/drawit - https://monodraw.helftone.com/ - http://asciiflow.com/
Add a little section about how it was made.
Add a little section about how it was made.
Markdown
mit
blakedietz/tsheets-ascii-art
markdown
## Code Before:
```markdown
Some fun ascii art for a fun place to work.
```

## Instruction:
Add a little section about how it was made.

## Code After:
```markdown
Some fun ascii art for a fun place to work.

## How was it made?

A combination of the following tools were used to create the logo

- vim
- http://vimawesome.com/plugin/drawit
- https://monodraw.helftone.com/
- http://asciiflow.com/
```
2c2604527cfe0ceb3dbf052bbcaf9e2e660b9e47
app.py
app.py
from flask import Flask, request, render_template, redirect, url_for

db = {}
app = Flask(__name__)

@app.route('/')
def get():
    ip = request.remote_addr
    return render_template("index.html", text = db.get(ip, ''))

@app.route('/', methods=['POST'])
def post():
    ip, content = request.remote_addr, request.form.get('text')
    if len(content) == 0:
        del db[ip]
    else:
        db[ip] = content
    return redirect(url_for('get'))

if __name__ == "__main__":
    app.run(host="0.0.0.0")
from flask import Flask, request, render_template, redirect, url_for

db = {}
app = Flask(__name__)

def get_client_ip(request):
    # PythonAnywhere.com calls our service through a load balancer;
    # remote_addr is therefore the IP of the load balancer, so
    # PythonAnywhere stores the client IP in the X-Real-IP header
    if request.headers.get('X-Real-IP'):
        return request.headers['X-Real-IP']
    return request.remote_addr

@app.route('/')
def get():
    ip = get_client_ip(request)
    return render_template("index.html", text = db.get(ip, ''))

@app.route('/', methods=['POST'])
def post():
    ip, content = get_client_ip(request), request.form.get('text')
    if len(content) == 0:
        del db[ip]
    else:
        db[ip] = content
    return redirect(url_for('get'))

if __name__ == "__main__":
    app.run(host="0.0.0.0")
Fix for PythonAnywhere LoadBalancer IP
Fix for PythonAnywhere LoadBalancer IP
Python
mit
st0le/ephemeral,st0le/ephemeral
python
## Code Before:
```python
from flask import Flask, request, render_template, redirect, url_for

db = {}
app = Flask(__name__)

@app.route('/')
def get():
    ip = request.remote_addr
    return render_template("index.html", text = db.get(ip, ''))

@app.route('/', methods=['POST'])
def post():
    ip, content = request.remote_addr, request.form.get('text')
    if len(content) == 0:
        del db[ip]
    else:
        db[ip] = content
    return redirect(url_for('get'))

if __name__ == "__main__":
    app.run(host="0.0.0.0")
```

## Instruction:
Fix for PythonAnywhere LoadBalancer IP

## Code After:
```python
from flask import Flask, request, render_template, redirect, url_for

db = {}
app = Flask(__name__)

def get_client_ip(request):
    # PythonAnywhere.com calls our service through a load balancer;
    # remote_addr is therefore the IP of the load balancer, so
    # PythonAnywhere stores the client IP in the X-Real-IP header
    if request.headers.get('X-Real-IP'):
        return request.headers['X-Real-IP']
    return request.remote_addr

@app.route('/')
def get():
    ip = get_client_ip(request)
    return render_template("index.html", text = db.get(ip, ''))

@app.route('/', methods=['POST'])
def post():
    ip, content = get_client_ip(request), request.form.get('text')
    if len(content) == 0:
        del db[ip]
    else:
        db[ip] = content
    return redirect(url_for('get'))

if __name__ == "__main__":
    app.run(host="0.0.0.0")
```
a7ec0ef269874b3cf84aea74ed66bc0f64fdb703
test/tests/PlacedSymbol.js
test/tests/PlacedSymbol.js
module('Placed Symbol');

test('placedSymbol bounds', function() {
    var doc = new Document();
    var path = new Path.Circle([50, 50], 50);
    var symbol = new Symbol(path);
    var placedSymbol = new PlacedSymbol(symbol);
    // These tests currently fail because we haven't implemented
    // Item#strokeBounds yet.
    compareRectangles(placedSymbol.bounds,
        new Rectangle(-50.5, -50.5, 101, 101),
        'PlacedSymbol initial bounds.');

    placedSymbol.scale(0.5);
    compareRectangles(placedSymbol.bounds,
        { x: -25.5, y: -25.5, width: 51, height: 51 },
        'Bounds after scale.');

    placedSymbol.rotate(40);
    compareRectangles(placedSymbol.bounds,
        { x: -25.50049, y: -25.50049, width: 51.00098, height: 51.00098 },
        'Bounds after rotation.');
});
module('Placed Symbol');

test('placedSymbol bounds', function() {
    var doc = new Document();
    var path = new Path.Circle([50, 50], 50);
    path.strokeWidth = 1;
    path.strokeCap = 'round';
    path.strokeJoin = 'round';
    compareRectangles(path.strokeBounds,
        new Rectangle(-0.5, -0.5, 101, 101),
        'Path initial bounds.');
    var symbol = new Symbol(path);
    var placedSymbol = new PlacedSymbol(symbol);
    // These tests currently fail because we haven't implemented
    // Item#strokeBounds yet.
    compareRectangles(placedSymbol.bounds,
        new Rectangle(-50.5, -50.5, 101, 101),
        'PlacedSymbol initial bounds.');

    placedSymbol.scale(0.5);
    compareRectangles(placedSymbol.bounds,
        { x: -25.5, y: -25.5, width: 51, height: 51 },
        'Bounds after scale.');

    placedSymbol.rotate(40);
    compareRectangles(placedSymbol.bounds,
        { x: -25.50049, y: -25.50049, width: 51.00098, height: 51.00098 },
        'Bounds after rotation.');
});
Make sure Placed Symbol test uses a strokeWidth on Paper too.
Make sure Placed Symbol test uses a strokeWidth on Paper too.
JavaScript
mit
luisbrito/paper.js,fredoche/paper.js,lehni/paper.js,ClaireRutkoske/paper.js,Olegas/paper.js,proofme/paper.js,legendvijay/paper.js,ClaireRutkoske/paper.js,NHQ/paper,legendvijay/paper.js,byte-foundry/paper.js,proofme/paper.js,baiyanghese/paper.js,iconexperience/paper.js,baiyanghese/paper.js,Olegas/paper.js,superjudge/paper.js,superjudge/paper.js,fredoche/paper.js,mcanthony/paper.js,Olegas/paper.js,legendvijay/paper.js,byte-foundry/paper.js,0/paper.js,mcanthony/paper.js,rgordeev/paper.js,EskenderDev/paper.js,lehni/paper.js,byte-foundry/paper.js,EskenderDev/paper.js,EskenderDev/paper.js,li0t/paper.js,rgordeev/paper.js,luisbrito/paper.js,superjudge/paper.js,mcanthony/paper.js,chad-autry/paper.js,li0t/paper.js,proofme/paper.js,JunaidPaul/paper.js,nancymark/paper.js,nancymark/paper.js,nancymark/paper.js,iconexperience/paper.js,li0t/paper.js,iconexperience/paper.js,lehni/paper.js,NHQ/paper,rgordeev/paper.js,baiyanghese/paper.js,JunaidPaul/paper.js,luisbrito/paper.js,JunaidPaul/paper.js,ClaireRutkoske/paper.js,fredoche/paper.js,0/paper.js
javascript
## Code Before:
```javascript
module('Placed Symbol');

test('placedSymbol bounds', function() {
    var doc = new Document();
    var path = new Path.Circle([50, 50], 50);
    var symbol = new Symbol(path);
    var placedSymbol = new PlacedSymbol(symbol);
    // These tests currently fail because we haven't implemented
    // Item#strokeBounds yet.
    compareRectangles(placedSymbol.bounds,
        new Rectangle(-50.5, -50.5, 101, 101),
        'PlacedSymbol initial bounds.');

    placedSymbol.scale(0.5);
    compareRectangles(placedSymbol.bounds,
        { x: -25.5, y: -25.5, width: 51, height: 51 },
        'Bounds after scale.');

    placedSymbol.rotate(40);
    compareRectangles(placedSymbol.bounds,
        { x: -25.50049, y: -25.50049, width: 51.00098, height: 51.00098 },
        'Bounds after rotation.');
});
```

## Instruction:
Make sure Placed Symbol test uses a strokeWidth on Paper too.

## Code After:
```javascript
module('Placed Symbol');

test('placedSymbol bounds', function() {
    var doc = new Document();
    var path = new Path.Circle([50, 50], 50);
    path.strokeWidth = 1;
    path.strokeCap = 'round';
    path.strokeJoin = 'round';
    compareRectangles(path.strokeBounds,
        new Rectangle(-0.5, -0.5, 101, 101),
        'Path initial bounds.');
    var symbol = new Symbol(path);
    var placedSymbol = new PlacedSymbol(symbol);
    // These tests currently fail because we haven't implemented
    // Item#strokeBounds yet.
    compareRectangles(placedSymbol.bounds,
        new Rectangle(-50.5, -50.5, 101, 101),
        'PlacedSymbol initial bounds.');

    placedSymbol.scale(0.5);
    compareRectangles(placedSymbol.bounds,
        { x: -25.5, y: -25.5, width: 51, height: 51 },
        'Bounds after scale.');

    placedSymbol.rotate(40);
    compareRectangles(placedSymbol.bounds,
        { x: -25.50049, y: -25.50049, width: 51.00098, height: 51.00098 },
        'Bounds after rotation.');
});
```
7c732427a0d09d94bfc5b77f778d290750686ae9
spec/countries/CAN-spec.coffee
spec/countries/CAN-spec.coffee
expect = require('chai').expect
Phone = require('../../src/script/Phone')
canada = require('../../src/script/countries/CAN')

describe 'Canada', ->
  describe 'Should get', ->
    number = ''

    afterEach ->
      # Act
      result = Phone.getPhoneInternational(number)

      # Assert
      expect(result.valid).to.be.true
      expect(result.countryNameAbbr).to.equal('CAN')

    it 'a number', ->
      # Arrange
      number = "+1 204 9898656"

  describe 'Should validate a', ->
    number = ''

    afterEach ->
      # Act
      result = Phone.validate(number)

      # Assert
      expect(result).to.be.true

    it 'number', ->
      # Arrange
      number = "+1 204 9898656"

  describe 'Utility method', ->
    it 'should get the country code given the abbr', ->
      countryCode = Phone.getCountryCodeByNameAbbr('CAN')
      expect(countryCode).to.equal('1')
expect = require('chai').expect
Phone = require('../../src/script/Phone')
canada = require('../../src/script/countries/CAN')

describe 'Canada', ->
  describe 'Should get', ->
    number = ''

    it 'a number', ->
      # Arrange
      number = "+1 204 9898656"

      # Act
      result = Phone.getPhoneInternational(number)

      # Assert
      expect(result.valid).to.be.true
      expect(result.countryNameAbbr).to.equal('CAN')

    it 'a national number', ->
      # Arrange
      number = "(204) 123 1234"

      # Act
      result = Phone.getPhoneNational(number, '1')

      # Assert
      expect(result.valid).to.be.true
      expect(result.countryNameAbbr).to.equal('CAN')

  describe 'Should validate a', ->
    number = ''

    afterEach ->
      # Act
      result = Phone.validate(number)

      # Assert
      expect(result).to.be.true

    it 'number', ->
      # Arrange
      number = "+1 204 9898656"

  describe 'Utility method', ->
    it 'should get the country code given the abbr', ->
      countryCode = Phone.getCountryCodeByNameAbbr('CAN')
      expect(countryCode).to.equal('1')
Add more tests to Canadian phones
Add more tests to Canadian phones
CoffeeScript
mit
vtex/front.phone
coffeescript
## Code Before:
```coffeescript
expect = require('chai').expect
Phone = require('../../src/script/Phone')
canada = require('../../src/script/countries/CAN')

describe 'Canada', ->
  describe 'Should get', ->
    number = ''

    afterEach ->
      # Act
      result = Phone.getPhoneInternational(number)

      # Assert
      expect(result.valid).to.be.true
      expect(result.countryNameAbbr).to.equal('CAN')

    it 'a number', ->
      # Arrange
      number = "+1 204 9898656"

  describe 'Should validate a', ->
    number = ''

    afterEach ->
      # Act
      result = Phone.validate(number)

      # Assert
      expect(result).to.be.true

    it 'number', ->
      # Arrange
      number = "+1 204 9898656"

  describe 'Utility method', ->
    it 'should get the country code given the abbr', ->
      countryCode = Phone.getCountryCodeByNameAbbr('CAN')
      expect(countryCode).to.equal('1')
```

## Instruction:
Add more tests to Canadian phones

## Code After:
```coffeescript
expect = require('chai').expect
Phone = require('../../src/script/Phone')
canada = require('../../src/script/countries/CAN')

describe 'Canada', ->
  describe 'Should get', ->
    number = ''

    it 'a number', ->
      # Arrange
      number = "+1 204 9898656"

      # Act
      result = Phone.getPhoneInternational(number)

      # Assert
      expect(result.valid).to.be.true
      expect(result.countryNameAbbr).to.equal('CAN')

    it 'a national number', ->
      # Arrange
      number = "(204) 123 1234"

      # Act
      result = Phone.getPhoneNational(number, '1')

      # Assert
      expect(result.valid).to.be.true
      expect(result.countryNameAbbr).to.equal('CAN')

  describe 'Should validate a', ->
    number = ''

    afterEach ->
      # Act
      result = Phone.validate(number)

      # Assert
      expect(result).to.be.true

    it 'number', ->
      # Arrange
      number = "+1 204 9898656"

  describe 'Utility method', ->
    it 'should get the country code given the abbr', ->
      countryCode = Phone.getCountryCodeByNameAbbr('CAN')
      expect(countryCode).to.equal('1')
```