commit | old_file | new_file | old_contents | new_contents | subject | message | lang | license | repos | config | content | diff | diff_length | relative_diff_length | n_lines_added | n_lines_deleted |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
5db2ee20fbe8ecdf1d432fd23c21702233f1bba7 | recharges/tasks.py | recharges/tasks.py | from celery.task import Task
from celery.utils.log import get_task_logger
import requests
logger = get_task_logger(__name__)
class Hotsocket_Login(Task):
"""
Task to get the username and password varified then produce a token
"""
name = "gopherairtime.recharges.tasks.hotsocket_login"
def run(self, **kwargs):
"""
Returns the token
"""
l = self.get_logger(**kwargs)
l.info("Logging into hotsocket")
auth = {'username': 'trial_acc_1212', 'password': 'tr14l_l1k3m00n',
'as_json': True}
r = requests.post("http://api.hotsocket.co.za:8080/test/login",data=auth)
result = r.json()
return result["response"]["token"]
hotsocket_login = Hotsocket_Login()
| import requests
from django.conf import settings
from celery.task import Task
from celery.utils.log import get_task_logger
from .models import Account
logger = get_task_logger(__name__)
class Hotsocket_Login(Task):
"""
Task to get the username and password varified then produce a token
"""
name = "gopherairtime.recharges.tasks.hotsocket_login"
def run(self, **kwargs):
"""
Returns the token
"""
l = self.get_logger(**kwargs)
l.info("Logging into hotsocket")
auth = {'username': 'trial_acc_1212', 'password': 'tr14l_l1k3m00n',
'as_json': True}
r = requests.post("%s/login" % settings.HOTSOCKET_API_ENDPOINT,
data=auth)
result = r.json()
token = result["response"]["token"]
account = Account()
account.token = token
account.save()
return True
hotsocket_login = Hotsocket_Login()
| Refactor the task for getting token to store it in DB | Refactor the task for getting token to store it in DB
| Python | bsd-3-clause | westerncapelabs/gopherairtime,westerncapelabs/gopherairtime | python | ## Code Before:
from celery.task import Task
from celery.utils.log import get_task_logger
import requests
logger = get_task_logger(__name__)
class Hotsocket_Login(Task):
"""
Task to get the username and password varified then produce a token
"""
name = "gopherairtime.recharges.tasks.hotsocket_login"
def run(self, **kwargs):
"""
Returns the token
"""
l = self.get_logger(**kwargs)
l.info("Logging into hotsocket")
auth = {'username': 'trial_acc_1212', 'password': 'tr14l_l1k3m00n',
'as_json': True}
r = requests.post("http://api.hotsocket.co.za:8080/test/login",data=auth)
result = r.json()
return result["response"]["token"]
hotsocket_login = Hotsocket_Login()
## Instruction:
Refactor the task for getting token to store it in DB
## Code After:
import requests
from django.conf import settings
from celery.task import Task
from celery.utils.log import get_task_logger
from .models import Account
logger = get_task_logger(__name__)
class Hotsocket_Login(Task):
"""
Task to get the username and password varified then produce a token
"""
name = "gopherairtime.recharges.tasks.hotsocket_login"
def run(self, **kwargs):
"""
Returns the token
"""
l = self.get_logger(**kwargs)
l.info("Logging into hotsocket")
auth = {'username': 'trial_acc_1212', 'password': 'tr14l_l1k3m00n',
'as_json': True}
r = requests.post("%s/login" % settings.HOTSOCKET_API_ENDPOINT,
data=auth)
result = r.json()
token = result["response"]["token"]
account = Account()
account.token = token
account.save()
return True
hotsocket_login = Hotsocket_Login()
| + import requests
+
+ from django.conf import settings
from celery.task import Task
from celery.utils.log import get_task_logger
- import requests
+ from .models import Account
logger = get_task_logger(__name__)
class Hotsocket_Login(Task):
"""
Task to get the username and password varified then produce a token
"""
name = "gopherairtime.recharges.tasks.hotsocket_login"
def run(self, **kwargs):
"""
Returns the token
"""
l = self.get_logger(**kwargs)
l.info("Logging into hotsocket")
auth = {'username': 'trial_acc_1212', 'password': 'tr14l_l1k3m00n',
'as_json': True}
- r = requests.post("http://api.hotsocket.co.za:8080/test/login",data=auth)
+ r = requests.post("%s/login" % settings.HOTSOCKET_API_ENDPOINT,
+ data=auth)
result = r.json()
- return result["response"]["token"]
? ^ ---
+ token = result["response"]["token"]
? ^^^ ++
+ account = Account()
+ account.token = token
+ account.save()
+ return True
hotsocket_login = Hotsocket_Login() | 14 | 0.5 | 11 | 3 |
818b6b21e2adf814347434031907edcd4f7f800b | README.md | README.md | This is a simple proof-of-concept project for learning React, Redux and Node.js in a limited timeframe.
## Status
A simple, non-persisted WebSocket-based chat. Has no sessions, no channels, just one page showing the messages. Also practically no tests, no error handling.
## Next steps
* Implement channels/chatrooms
* Implement user sessions (so as not to lose chat history upon reload at the very least)
*
Thunks look nice and I would have wanted to work with Promises. Alas, I have so far only implemented the simplest WebSocket-based chat with no real async requests flying around.
| This is a simple proof-of-concept project for learning React, Redux and Node.js in a limited timeframe.
## Status
* A simple, non-persisted WebSocket-based chat. Has no sessions, no channels, just one page showing the messages.
* Practically no tests.
* No error handling.
* No security concerns addressed.
* Has to be started manually.
* Only tested in dev mode.
## Next steps
* Finish build and start scripts to make everything go from one command
* Try on top of Heroku
* Re-evaluate application of immutable.js through the app code
* Enforce Redux best practices
* Implement channels/chatrooms
* Implement user sessions (so as not to lose chat history upon reload at the very least)
* Thunks/Promises look nice...
| Update Readme to reflect current status | Update Readme to reflect current status
| Markdown | mit | kheldysh/discursor,kheldysh/discursor | markdown | ## Code Before:
This is a simple proof-of-concept project for learning React, Redux and Node.js in a limited timeframe.
## Status
A simple, non-persisted WebSocket-based chat. Has no sessions, no channels, just one page showing the messages. Also practically no tests, no error handling.
## Next steps
* Implement channels/chatrooms
* Implement user sessions (so as not to lose chat history upon reload at the very least)
*
Thunks look nice and I would have wanted to work with Promises. Alas, I have so far only implemented the simplest WebSocket-based chat with no real async requests flying around.
## Instruction:
Update Readme to reflect current status
## Code After:
This is a simple proof-of-concept project for learning React, Redux and Node.js in a limited timeframe.
## Status
* A simple, non-persisted WebSocket-based chat. Has no sessions, no channels, just one page showing the messages.
* Practically no tests.
* No error handling.
* No security concerns addressed.
* Has to be started manually.
* Only tested in dev mode.
## Next steps
* Finish build and start scripts to make everything go from one command
* Try on top of Heroku
* Re-evaluate application of immutable.js through the app code
* Enforce Redux best practices
* Implement channels/chatrooms
* Implement user sessions (so as not to lose chat history upon reload at the very least)
* Thunks/Promises look nice...
| This is a simple proof-of-concept project for learning React, Redux and Node.js in a limited timeframe.
## Status
- A simple, non-persisted WebSocket-based chat. Has no sessions, no channels, just one page showing the messages. Also practically no tests, no error handling.
? ----------------------------------------------
+ * A simple, non-persisted WebSocket-based chat. Has no sessions, no channels, just one page showing the messages.
? ++
+ * Practically no tests.
+ * No error handling.
+ * No security concerns addressed.
+ * Has to be started manually.
+ * Only tested in dev mode.
## Next steps
+ * Finish build and start scripts to make everything go from one command
+ * Try on top of Heroku
+ * Re-evaluate application of immutable.js through the app code
+ * Enforce Redux best practices
+
* Implement channels/chatrooms
* Implement user sessions (so as not to lose chat history upon reload at the very least)
+ * Thunks/Promises look nice...
- *
-
- Thunks look nice and I would have wanted to work with Promises. Alas, I have so far only implemented the simplest WebSocket-based chat with no real async requests flying around. | 16 | 1.230769 | 12 | 4 |
6ac975503be20010c9dc3118a8f057d9e803e020 | sum-of-multiples.md | sum-of-multiples.md | If we list all the natural numbers up to and including 15 that are
multiples of either 3 or 5, we get 3, 5, 6 and 9, 10, 12, and 15.
The sum of these multiples is 60.
Write a program that can find the sum of the multiples of a given set of
numbers.
| If we list all the natural numbers up to but not including 15 that are
multiples of either 3 or 5, we get 3, 5, 6 and 9, 10, and 12.
The sum of these multiples is 45.
Write a program that can find the sum of the multiples of a given set of
numbers.
| Correct inconsistency in sum of multiples readme | Correct inconsistency in sum of multiples readme
The exercise states that the multiples should not include the higher bound, but then the
example immediately after that includes the upper bound.
Fixes exercism/exercism.io#2579
| Markdown | mit | exercism/x-common,Vankog/problem-specifications,rpottsoh/x-common,petertseng/x-common,kgengler/x-common,wobh/x-common,ErikSchierboom/x-common,petertseng/x-common,Vankog/problem-specifications,rpottsoh/x-common,jmluy/x-common,ErikSchierboom/x-common,exercism/x-common,kgengler/x-common,jmluy/x-common,jmluy/x-common | markdown | ## Code Before:
If we list all the natural numbers up to and including 15 that are
multiples of either 3 or 5, we get 3, 5, 6 and 9, 10, 12, and 15.
The sum of these multiples is 60.
Write a program that can find the sum of the multiples of a given set of
numbers.
## Instruction:
Correct inconsistency in sum of multiples readme
The exercise states that the multiples should not include the higher bound, but then the
example immediately after that includes the upper bound.
Fixes exercism/exercism.io#2579
## Code After:
If we list all the natural numbers up to but not including 15 that are
multiples of either 3 or 5, we get 3, 5, 6 and 9, 10, and 12.
The sum of these multiples is 45.
Write a program that can find the sum of the multiples of a given set of
numbers.
| - If we list all the natural numbers up to and including 15 that are
? ^ ^
+ If we list all the natural numbers up to but not including 15 that are
? ^^^^ ^^
- multiples of either 3 or 5, we get 3, 5, 6 and 9, 10, 12, and 15.
? ---- ^
+ multiples of either 3 or 5, we get 3, 5, 6 and 9, 10, and 12.
? ^
- The sum of these multiples is 60.
? ^^
+ The sum of these multiples is 45.
? ^^
Write a program that can find the sum of the multiples of a given set of
numbers. | 6 | 0.857143 | 3 | 3 |
34dbdd3913a855ee3c066f6ce115f4e11fc8e481 | config/kubernetes/sig-scalability/teams.yaml | config/kubernetes/sig-scalability/teams.yaml | teams:
sig-scalability:
description: SIG Scalability
members:
- gmarek
- jkaniuk
- jprzychodzen
- krzysied
- mborsz
- mm4tt
- oxddr
- shyamjvs
- tosi3k
- wojtek-t
privacy: closed
teams:
sig-scalability-leads:
description: Chairs and Technical Leads for SIG Scalability
members:
- mm4tt
- shyamjvs
- wojtek-t
privacy: closed
sig-scalability-pr-reviews:
description: PR reviews for SIG Scalability
members:
- jkaniuk
- jprzychodzen
- mborsz
- mm4tt
- shyamjvs
- tosi3k
- wojtek-t
privacy: closed
| teams:
sig-scalability:
description: SIG Scalability
members:
- alejandrox1 # SIG Release Technical Lead
- gmarek
- hasheddan # 1.19 CI Signal Lead / Release Manager
- jkaniuk
- jprzychodzen
- justaugustus # SIG Release Chair / Release Manager
- krzysied
- mborsz
- mm4tt
- onlydole # 1.19 Release Team Lead
- oxddr
- saschagrunert # SIG Release Technical Lead / Release Manager
- shyamjvs
- tosi3k
- tpepper # SIG Release Chair / Release Manager
- wojtek-t
privacy: closed
teams:
sig-scalability-leads:
description: Chairs and Technical Leads for SIG Scalability
members:
- mm4tt
- shyamjvs
- wojtek-t
privacy: closed
sig-scalability-pr-reviews:
description: PR reviews for SIG Scalability
members:
- jkaniuk
- jprzychodzen
- mborsz
- mm4tt
- shyamjvs
- tosi3k
- wojtek-t
privacy: closed
| Add key SIG Release personnel to main GH team | sig-scalability: Add key SIG Release personnel to main GH team
Signed-off-by: Stephen Augustus <f96984fef9637085f0559f1484dfd4e99545cd6e@vmware.com>
| YAML | apache-2.0 | kubernetes/org,kubernetes/org,kubernetes/org | yaml | ## Code Before:
teams:
sig-scalability:
description: SIG Scalability
members:
- gmarek
- jkaniuk
- jprzychodzen
- krzysied
- mborsz
- mm4tt
- oxddr
- shyamjvs
- tosi3k
- wojtek-t
privacy: closed
teams:
sig-scalability-leads:
description: Chairs and Technical Leads for SIG Scalability
members:
- mm4tt
- shyamjvs
- wojtek-t
privacy: closed
sig-scalability-pr-reviews:
description: PR reviews for SIG Scalability
members:
- jkaniuk
- jprzychodzen
- mborsz
- mm4tt
- shyamjvs
- tosi3k
- wojtek-t
privacy: closed
## Instruction:
sig-scalability: Add key SIG Release personnel to main GH team
Signed-off-by: Stephen Augustus <f96984fef9637085f0559f1484dfd4e99545cd6e@vmware.com>
## Code After:
teams:
sig-scalability:
description: SIG Scalability
members:
- alejandrox1 # SIG Release Technical Lead
- gmarek
- hasheddan # 1.19 CI Signal Lead / Release Manager
- jkaniuk
- jprzychodzen
- justaugustus # SIG Release Chair / Release Manager
- krzysied
- mborsz
- mm4tt
- onlydole # 1.19 Release Team Lead
- oxddr
- saschagrunert # SIG Release Technical Lead / Release Manager
- shyamjvs
- tosi3k
- tpepper # SIG Release Chair / Release Manager
- wojtek-t
privacy: closed
teams:
sig-scalability-leads:
description: Chairs and Technical Leads for SIG Scalability
members:
- mm4tt
- shyamjvs
- wojtek-t
privacy: closed
sig-scalability-pr-reviews:
description: PR reviews for SIG Scalability
members:
- jkaniuk
- jprzychodzen
- mborsz
- mm4tt
- shyamjvs
- tosi3k
- wojtek-t
privacy: closed
| teams:
sig-scalability:
description: SIG Scalability
members:
+ - alejandrox1 # SIG Release Technical Lead
- gmarek
+ - hasheddan # 1.19 CI Signal Lead / Release Manager
- jkaniuk
- jprzychodzen
+ - justaugustus # SIG Release Chair / Release Manager
- krzysied
- mborsz
- mm4tt
+ - onlydole # 1.19 Release Team Lead
- oxddr
+ - saschagrunert # SIG Release Technical Lead / Release Manager
- shyamjvs
- tosi3k
+ - tpepper # SIG Release Chair / Release Manager
- wojtek-t
privacy: closed
teams:
sig-scalability-leads:
description: Chairs and Technical Leads for SIG Scalability
members:
- mm4tt
- shyamjvs
- wojtek-t
privacy: closed
sig-scalability-pr-reviews:
description: PR reviews for SIG Scalability
members:
- jkaniuk
- jprzychodzen
- mborsz
- mm4tt
- shyamjvs
- tosi3k
- wojtek-t
privacy: closed | 6 | 0.176471 | 6 | 0 |
1d6b4647030b44a0eeb7aa396d83822583fe558e | doc/index.md | doc/index.md |
- [Intro](#intro)
- [Objects](#objects)
- [The prototype](#prototype)
- [`hasOwnProperty`](#hasownproperty)
- [Functions](#functions)
- [How `this` works](#this)
- [Closures and references](#closures)
- [Function arguments](#arguments)
- [Scopes and namespaces](#scopes)
- [Constructors](#constructors)
- [Equality and comparisons](#equality)
- [Arrays](#arrays)
- [The `Array` constructor](#arrayctor)
- [The `for in` loop](#forinloop)
- [The `typeof` operator](#typeof)
- [The `instanceof` operator](#instanceof)
- [Type casting](#casting)
- [`undefined` and `null`](#undefined)
- [The evil `eval`](#eval)
- [`setTimeout` and `setInterval`](#timeouts)
- [Automatic semicolon insertion](#semicolon)
|
- [Intro](#intro)
- [Objects](#objects)
- [The prototype](#prototype)
- [`hasOwnProperty`](#hasownproperty)
- [Functions](#functions)
- [How `this` works](#this)
- [Closures and references](#closures)
- [The `arguments` object](#arguments)
- [Scopes and namespaces](#scopes)
- [Constructors](#constructors)
- [Equality and comparisons](#equality)
- [Arrays](#arrays)
- [The `Array` constructor](#arrayctor)
- [The `for in` loop](#forinloop)
- [The `typeof` operator](#typeof)
- [The `instanceof` operator](#instanceof)
- [Type casting](#casting)
- [`undefined` and `null`](#undefined)
- [The evil `eval`](#eval)
- [`setTimeout` and `setInterval`](#timeouts)
- [Automatic semicolon insertion](#semicolon)
| Change the title of the arguments section | Change the title of the arguments section
| Markdown | mit | xis19/JavaScript-Garden,manekinekko/JavaScript-Garden,manekinekko/JavaScript-Garden,devil-tamachan/BZDoc,tavriaforever/JavaScript-Garden,mosoft521/JavaScript-Garden,tavriaforever/JavaScript-Garden,xis19/JavaScript-Garden,itsimoshka/JavaScript-Garden,dongfeng2008/JavaScript-Garden,julionc/JavaScript-Garden,TangTaigang/JavaScript-Garden,kevingo/JavaScript-Garden,108adams/JavaScript-Garden,it-na-move/JavaScript-Garden,hehuabing/JavaScript-Garden,tavriaforever/JavaScript-Garden,anityche/JavaScript-Garden,mosoft521/JavaScript-Garden,Krasnyanskiy/JavaScript-Garden,hehuabing/JavaScript-Garden,shamansir/JavaScript-Garden,mosoft521/JavaScript-Garden,kevingo/JavaScript-Garden,BonsaiDen/JavaScript-Garden,julionc/JavaScript-Garden,kamilmielnik/JavaScript-Garden,dongfeng2008/JavaScript-Garden,anityche/JavaScript-Garden,TangTaigang/JavaScript-Garden,cesarmarinhorj/JavaScript-Garden,dongfeng2008/JavaScript-Garden,julionc/JavaScript-Garden,cesarmarinhorj/JavaScript-Garden,rdnz/JavaScript-Garden,Krasnyanskiy/JavaScript-Garden,108adams/JavaScript-Garden,hehuabing/JavaScript-Garden,kamilmielnik/JavaScript-Garden,it-na-move/JavaScript-Garden,BonsaiDen/JavaScript-Garden,Krasnyanskiy/JavaScript-Garden,cesarmarinhorj/JavaScript-Garden,shamansir/JavaScript-Garden,xis19/JavaScript-Garden,TangTaigang/JavaScript-Garden,kevingo/JavaScript-Garden,scriptin/JavaScript-Garden,shamansir/JavaScript-Garden,itsimoshka/JavaScript-Garden,anityche/JavaScript-Garden,it-na-move/JavaScript-Garden,devil-tamachan/BZDoc,rdnz/JavaScript-Garden,kamilmielnik/JavaScript-Garden,itsimoshka/JavaScript-Garden,BonsaiDen/JavaScript-Garden,manekinekko/JavaScript-Garden,108adams/JavaScript-Garden | markdown | ## Code Before:
- [Intro](#intro)
- [Objects](#objects)
- [The prototype](#prototype)
- [`hasOwnProperty`](#hasownproperty)
- [Functions](#functions)
- [How `this` works](#this)
- [Closures and references](#closures)
- [Function arguments](#arguments)
- [Scopes and namespaces](#scopes)
- [Constructors](#constructors)
- [Equality and comparisons](#equality)
- [Arrays](#arrays)
- [The `Array` constructor](#arrayctor)
- [The `for in` loop](#forinloop)
- [The `typeof` operator](#typeof)
- [The `instanceof` operator](#instanceof)
- [Type casting](#casting)
- [`undefined` and `null`](#undefined)
- [The evil `eval`](#eval)
- [`setTimeout` and `setInterval`](#timeouts)
- [Automatic semicolon insertion](#semicolon)
## Instruction:
Change the title of the arguments section
## Code After:
- [Intro](#intro)
- [Objects](#objects)
- [The prototype](#prototype)
- [`hasOwnProperty`](#hasownproperty)
- [Functions](#functions)
- [How `this` works](#this)
- [Closures and references](#closures)
- [The `arguments` object](#arguments)
- [Scopes and namespaces](#scopes)
- [Constructors](#constructors)
- [Equality and comparisons](#equality)
- [Arrays](#arrays)
- [The `Array` constructor](#arrayctor)
- [The `for in` loop](#forinloop)
- [The `typeof` operator](#typeof)
- [The `instanceof` operator](#instanceof)
- [Type casting](#casting)
- [`undefined` and `null`](#undefined)
- [The evil `eval`](#eval)
- [`setTimeout` and `setInterval`](#timeouts)
- [Automatic semicolon insertion](#semicolon)
|
- [Intro](#intro)
- [Objects](#objects)
- [The prototype](#prototype)
- [`hasOwnProperty`](#hasownproperty)
- [Functions](#functions)
- [How `this` works](#this)
- [Closures and references](#closures)
- - [Function arguments](#arguments)
+ - [The `arguments` object](#arguments)
- [Scopes and namespaces](#scopes)
- [Constructors](#constructors)
- [Equality and comparisons](#equality)
- [Arrays](#arrays)
- [The `Array` constructor](#arrayctor)
- [The `for in` loop](#forinloop)
- [The `typeof` operator](#typeof)
- [The `instanceof` operator](#instanceof)
- [Type casting](#casting)
- [`undefined` and `null`](#undefined)
- [The evil `eval`](#eval)
- [`setTimeout` and `setInterval`](#timeouts)
- [Automatic semicolon insertion](#semicolon)
| 2 | 0.086957 | 1 | 1 |
a5fe6f2876231b28e08f8a22e18b49aa264351e1 | Resources/views/breadcrumb.html.twig | Resources/views/breadcrumb.html.twig | {% block breadcrumb %}
{% if breadcrumb is defined and breadcrumb|length %}
<ul class="breadcrumb">
{% block breadcrumb_items %}
{% for item in breadcrumb %}
<li>
{% if item.url|default('')|length and item.url != app.request.requestUri %}
<a href="{{ item.url }}">{{ item.text|trans(item.parameters|default([]), domain|default(null), locale|default(null))|capitalize }}</a>
{% else %}
{{ item.text|trans(item.parameters|default([]), domain|default(null), locale|default(null))|capitalize }}
{% endif %}
{% if not loop.last %}<span class='divider'>{{ separator|default('/') }}</span>{% endif %}
</li>
{% endfor %}
{% endblock breadcrumb_items %}
</ul>
{% endif %}
{% endblock breadcrumb %} | {% block breadcrumb %}
{% if breadcrumb is defined and breadcrumb|length %}
<ul class="breadcrumb">
{% block breadcrumb_items %}
{% for item in breadcrumb %}
{% set islink = item.url|default('')|length and item.url != app.request.requestUri %}
<li>
{% if item.icon|default('')|length %}
<i class="{{ item.icon }}"></i>
{% endif %}
{% if islink %}
<a href="{{ item.url }}">
{% endif %}
{{ item.text|trans(item.parameters|default([]), domain|default(null), locale|default(null))|capitalize }}
{% if islink %}
</a>
{% endif %}
{% if not loop.last %}<span class='divider'>{{ separator|default('/') }}</span>{% endif %}
</li>
{% endfor %}
{% endblock breadcrumb_items %}
</ul>
{% endif %}
{% endblock breadcrumb %} | Add icon in breadcrumb items | Add icon in breadcrumb items
| Twig | mit | nmariani/bootstrapp-bundle,nmariani/bootstrapp-bundle,nmariani/bootstrapp-bundle | twig | ## Code Before:
{% block breadcrumb %}
{% if breadcrumb is defined and breadcrumb|length %}
<ul class="breadcrumb">
{% block breadcrumb_items %}
{% for item in breadcrumb %}
<li>
{% if item.url|default('')|length and item.url != app.request.requestUri %}
<a href="{{ item.url }}">{{ item.text|trans(item.parameters|default([]), domain|default(null), locale|default(null))|capitalize }}</a>
{% else %}
{{ item.text|trans(item.parameters|default([]), domain|default(null), locale|default(null))|capitalize }}
{% endif %}
{% if not loop.last %}<span class='divider'>{{ separator|default('/') }}</span>{% endif %}
</li>
{% endfor %}
{% endblock breadcrumb_items %}
</ul>
{% endif %}
{% endblock breadcrumb %}
## Instruction:
Add icon in breadcrumb items
## Code After:
{% block breadcrumb %}
{% if breadcrumb is defined and breadcrumb|length %}
<ul class="breadcrumb">
{% block breadcrumb_items %}
{% for item in breadcrumb %}
{% set islink = item.url|default('')|length and item.url != app.request.requestUri %}
<li>
{% if item.icon|default('')|length %}
<i class="{{ item.icon }}"></i>
{% endif %}
{% if islink %}
<a href="{{ item.url }}">
{% endif %}
{{ item.text|trans(item.parameters|default([]), domain|default(null), locale|default(null))|capitalize }}
{% if islink %}
</a>
{% endif %}
{% if not loop.last %}<span class='divider'>{{ separator|default('/') }}</span>{% endif %}
</li>
{% endfor %}
{% endblock breadcrumb_items %}
</ul>
{% endif %}
{% endblock breadcrumb %} | {% block breadcrumb %}
{% if breadcrumb is defined and breadcrumb|length %}
<ul class="breadcrumb">
{% block breadcrumb_items %}
{% for item in breadcrumb %}
+ {% set islink = item.url|default('')|length and item.url != app.request.requestUri %}
<li>
- {% if item.url|default('')|length and item.url != app.request.requestUri %}
- <a href="{{ item.url }}">{{ item.text|trans(item.parameters|default([]), domain|default(null), locale|default(null))|capitalize }}</a>
+ {% if item.icon|default('')|length %}
+ <i class="{{ item.icon }}"></i>
- {% else %}
? ^^^
+ {% endif %}
? ^^^^
+ {% if islink %}
+ <a href="{{ item.url }}">
+ {% endif %}
- {{ item.text|trans(item.parameters|default([]), domain|default(null), locale|default(null))|capitalize }}
? ----
+ {{ item.text|trans(item.parameters|default([]), domain|default(null), locale|default(null))|capitalize }}
+ {% if islink %}
+ </a>
{% endif %}
{% if not loop.last %}<span class='divider'>{{ separator|default('/') }}</span>{% endif %}
</li>
{% endfor %}
{% endblock breadcrumb_items %}
</ul>
{% endif %}
{% endblock breadcrumb %} | 14 | 0.777778 | 10 | 4 |
03275eccfb4a3d4dc60efa87ab8e1d74065927fb | k8s-publishing-bot.md | k8s-publishing-bot.md |
The publishing-bot for the Kubernetes project is running on a CNCF sponsored
GKE cluster `development2` in the `kubernetes-public` project.
The bot is responsible for updating `go.mod`/`Godeps` and `vendor` for target repos.
To support both godeps for releases <= v1.14 and go modules for the master branch
and releases >= v1.14, we run two instances of the bot today.
The instance of the bot responsible for syncing releases <= v1.14 runs in the
`k8s-publishing-bot-godeps` namespace and the instance of the bot responsible
for syncing the master branch and releases >= v1.14 runs in the ` k8s-publishing-bot`
namespace.
The code for the former can be found in the [godeps branch] and the latter in the master
branch of the publishing-bot repo.
## Permissions
The cluster can be accessed by members of the [k8s-infra-cluster-admins]
google group. Members can be added to the group by updating [groups.yaml].
## Running the bot
Make sure you are at the root of the publishing-bot repo before running these commands.
### Populating repos
This script needs to be run whenever a new staging repo is added.
```sh
hack/fetch-all-latest-and-push.sh kubernetes
```
### Deploying the bot
```sh
make validate build-image push-image deploy CONFIG=configs/kubernetes
```
[godeps branch]: https://github.com/kubernetes/publishing-bot/tree/godeps
[k8s-infra-cluster-admins]: https://groups.google.com/forum/#!forum/k8s-infra-cluster-admins
[groups.yaml]: https://git.k8s.io/k8s.io/groups/groups.yaml
|
The publishing-bot for the Kubernetes project is running in the
`k8s-publishing-bot` namespace on a CNCF sponsored GKE cluster
`development2` in the `kubernetes-public` project.
The bot is responsible for updating `go.mod`/`Godeps` for target repos.
## Permissions
The cluster can be accessed by members of the [k8s-infra-cluster-admins]
google group. Members can be added to the group by updating [groups.yaml].
## Running the bot
Make sure you are at the root of the publishing-bot repo before running these commands.
### Populating repos
This script needs to be run whenever a new staging repo is added.
```sh
hack/fetch-all-latest-and-push.sh kubernetes
```
### Deploying the bot
```sh
make validate build-image push-image deploy CONFIG=configs/kubernetes
```
[k8s-infra-cluster-admins]: https://groups.google.com/forum/#!forum/k8s-infra-cluster-admins
[groups.yaml]: https://git.k8s.io/k8s.io/groups/groups.yaml
| Remove note about godeps instance of the bot | Remove note about godeps instance of the bot
The godeps instance of the bot has now been removed. This instance was
only used for Kubernetes versions <=1.14, which are not
supported anymore.
The final patch release for 1.14 was in early December, so we can now be
sure that we don't need this instance anymore.
| Markdown | apache-2.0 | kubernetes/publishing-bot,kubernetes/publishing-bot | markdown | ## Code Before:
The publishing-bot for the Kubernetes project is running on a CNCF sponsored
GKE cluster `development2` in the `kubernetes-public` project.
The bot is responsible for updating `go.mod`/`Godeps` and `vendor` for target repos.
To support both godeps for releases <= v1.14 and go modules for the master branch
and releases >= v1.14, we run two instances of the bot today.
The instance of the bot responsible for syncing releases <= v1.14 runs in the
`k8s-publishing-bot-godeps` namespace and the instance of the bot responsible
for syncing the master branch and releases >= v1.14 runs in the ` k8s-publishing-bot`
namespace.
The code for the former can be found in the [godeps branch] and the latter in the master
branch of the publishing-bot repo.
## Permissions
The cluster can be accessed by members of the [k8s-infra-cluster-admins]
google group. Members can be added to the group by updating [groups.yaml].
## Running the bot
Make sure you are at the root of the publishing-bot repo before running these commands.
### Populating repos
This script needs to be run whenever a new staging repo is added.
```sh
hack/fetch-all-latest-and-push.sh kubernetes
```
### Deploying the bot
```sh
make validate build-image push-image deploy CONFIG=configs/kubernetes
```
[godeps branch]: https://github.com/kubernetes/publishing-bot/tree/godeps
[k8s-infra-cluster-admins]: https://groups.google.com/forum/#!forum/k8s-infra-cluster-admins
[groups.yaml]: https://git.k8s.io/k8s.io/groups/groups.yaml
## Instruction:
Remove note about godeps instance of the bot
The godeps instance of the bot has now been removed. This instance was
only used for Kubernetes versions <=1.14, which are not
supported anymore.
The final patch release for 1.14 was in early December, so we can now be
sure that we don't need this instance anymore.
## Code After:
The publishing-bot for the Kubernetes project is running in the
`k8s-publishing-bot` namespace on a CNCF sponsored GKE cluster
`development2` in the `kubernetes-public` project.
The bot is responsible for updating `go.mod`/`Godeps` for target repos.
## Permissions
The cluster can be accessed by members of the [k8s-infra-cluster-admins]
google group. Members can be added to the group by updating [groups.yaml].
## Running the bot
Make sure you are at the root of the publishing-bot repo before running these commands.
### Populating repos
This script needs to be run whenever a new staging repo is added.
```sh
hack/fetch-all-latest-and-push.sh kubernetes
```
### Deploying the bot
```sh
make validate build-image push-image deploy CONFIG=configs/kubernetes
```
[k8s-infra-cluster-admins]: https://groups.google.com/forum/#!forum/k8s-infra-cluster-admins
[groups.yaml]: https://git.k8s.io/k8s.io/groups/groups.yaml
|
- The publishing-bot for the Kubernetes project is running on a CNCF sponsored
? ^ ^^^^^^^^^^^^^^ -
+ The publishing-bot for the Kubernetes project is running in the
? ^ ^^
+ `k8s-publishing-bot` namespace on a CNCF sponsored GKE cluster
- GKE cluster `development2` in the `kubernetes-public` project.
? ------------
+ `development2` in the `kubernetes-public` project.
- The bot is responsible for updating `go.mod`/`Godeps` and `vendor` for target repos.
? -------------
+ The bot is responsible for updating `go.mod`/`Godeps` for target repos.
- To support both godeps for releases <= v1.14 and go modules for the master branch
- and releases >= v1.14, we run two instances of the bot today.
-
- The instance of the bot responsible for syncing releases <= v1.14 runs in the
- `k8s-publishing-bot-godeps` namespace and the instance of the bot responsible
- for syncing the master branch and releases >= v1.14 runs in the ` k8s-publishing-bot`
- namespace.
-
- The code for the former can be found in the [godeps branch] and the latter in the master
- branch of the publishing-bot repo.
## Permissions
The cluster can be accessed by members of the [k8s-infra-cluster-admins]
google group. Members can be added to the group by updating [groups.yaml].
## Running the bot
Make sure you are at the root of the publishing-bot repo before running these commands.
### Populating repos
This script needs to be run whenever a new staging repo is added.
```sh
hack/fetch-all-latest-and-push.sh kubernetes
```
### Deploying the bot
```sh
make validate build-image push-image deploy CONFIG=configs/kubernetes
```
- [godeps branch]: https://github.com/kubernetes/publishing-bot/tree/godeps
[k8s-infra-cluster-admins]: https://groups.google.com/forum/#!forum/k8s-infra-cluster-admins
[groups.yaml]: https://git.k8s.io/k8s.io/groups/groups.yaml | 18 | 0.428571 | 4 | 14 |
6b6d2c8615cba40352fe630976b849548566a15a | README.md | README.md | coderit.github.io
=================
The official website of [codeRIT](https://coderit.org).
Jekyll theme taken from: [here] (https://github.com/camporez/Thinny)
Code is licensed under the MIT license! Check out the LICENSE file for more information about this.
|
The official website of [codeRIT](https://coderit.org).
Jekyll theme taken from: [here](https://github.com/camporez/Thinny)
## Local Dev Setup
1. [Install Ruby](https://www.ruby-lang.org/en/documentation/installation/)
2. [Install Jekyll](https://jekyllrb.com/docs/installation/): `gem install jekyll`
3. Install redcarpet: `gem install redcarpet`
4. Start up the local dev server: `jekyll serve`
Code is licensed under the MIT license! Check out the LICENSE file for more information about this.
| Add instructions for local dev setup | Add instructions for local dev setup
Fixes #32
| Markdown | mit | codeRIT/coderit.github.io | markdown | ## Code Before:
coderit.github.io
=================
The official website of [codeRIT](https://coderit.org).
Jekyll theme taken from: [here] (https://github.com/camporez/Thinny)
Code is licensed under the MIT license! Check out the LICENSE file for more information about this.
## Instruction:
Add instructions for local dev setup
Fixes #32
## Code After:
The official website of [codeRIT](https://coderit.org).
Jekyll theme taken from: [here](https://github.com/camporez/Thinny)
## Local Dev Setup
1. [Install Ruby](https://www.ruby-lang.org/en/documentation/installation/)
2. [Install Jekyll](https://jekyllrb.com/docs/installation/): `gem install jekyll`
3. Install redcarpet: `gem install redcarpet`
4. Start up the local dev server: `jekyll serve`
Code is licensed under the MIT license! Check out the LICENSE file for more information about this.
| - coderit.github.io
- =================
The official website of [codeRIT](https://coderit.org).
- Jekyll theme taken from: [here] (https://github.com/camporez/Thinny)
? -
+ Jekyll theme taken from: [here](https://github.com/camporez/Thinny)
+
+ ## Local Dev Setup
+
+ 1. [Install Ruby](https://www.ruby-lang.org/en/documentation/installation/)
+ 2. [Install Jekyll](https://jekyllrb.com/docs/installation/): `gem install jekyll`
+ 3. Install redcarpet: `gem install redcarpet`
+ 4. Start up the local dev server: `jekyll serve`
Code is licensed under the MIT license! Check out the LICENSE file for more information about this. | 11 | 1.375 | 8 | 3 |
07d5888b53f20feb39117bb3d632ef6406c3ff15 | requirements.txt | requirements.txt | pathlib
requests
BeautifulSoup4
youtube_dl
pafy
spotipy
git+https://github.com/nicfit/eyeD3.git
| pathlib
requests
BeautifulSoup4
youtube_dl
pafy
spotipy
eyed3
| Change to updated pypi version of eyed3 | Change to updated pypi version of eyed3 | Text | mit | Ritiek/Spotify-Downloader | text | ## Code Before:
pathlib
requests
BeautifulSoup4
youtube_dl
pafy
spotipy
git+https://github.com/nicfit/eyeD3.git
## Instruction:
Change to updated pypi version of eyed3
## Code After:
pathlib
requests
BeautifulSoup4
youtube_dl
pafy
spotipy
eyed3
| pathlib
requests
BeautifulSoup4
youtube_dl
pafy
spotipy
- git+https://github.com/nicfit/eyeD3.git
+ eyed3 | 2 | 0.285714 | 1 | 1 |
9b0f683e66d7d4c8d98b20ffc3e7f76940a86918 | roles/fidimag/tasks/make_fidimag.yml | roles/fidimag/tasks/make_fidimag.yml | ---
# This Ansible playbook build Fidimag from scripts in the Fidimag
# repository. This include installing local dependencies including FFTW and
# SUNDIALS.
# Install local dependencies.
- name: Install FFTW and SUNDIALS from the Fidimag installation script.
command: bash install.sh
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}/bin"
sudo: yes
# Build Fidimag. Makefile module?
- name: Build Fidimag.
command: make
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}"
sudo: yes
| ---
# This Ansible playbook build Fidimag from scripts in the Fidimag
# repository. This include installing local dependencies including FFTW and
# SUNDIALS.
# Install local dependencies.
- name: Install FFTW and SUNDIALS from the Fidimag installation script.
command: bash ubuntu_install_script.sh
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}/bin"
sudo: yes
# Build Fidimag. Makefile module?
- name: Build Fidimag.
command: make
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}"
sudo: yes
| Change name of fidimag installation script to match the change in the repository. | Change name of fidimag installation script to match the change in the repository.
| YAML | bsd-3-clause | computationalmodelling/virtualmicromagnetics,fangohr/virtualmicromagnetics,computationalmodelling/virtualmicromagnetics,fangohr/virtualmicromagnetics | yaml | ## Code Before:
---
# This Ansible playbook build Fidimag from scripts in the Fidimag
# repository. This include installing local dependencies including FFTW and
# SUNDIALS.
# Install local dependencies.
- name: Install FFTW and SUNDIALS from the Fidimag installation script.
command: bash install.sh
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}/bin"
sudo: yes
# Build Fidimag. Makefile module?
- name: Build Fidimag.
command: make
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}"
sudo: yes
## Instruction:
Change name of fidimag installation script to match the change in the repository.
## Code After:
---
# This Ansible playbook build Fidimag from scripts in the Fidimag
# repository. This include installing local dependencies including FFTW and
# SUNDIALS.
# Install local dependencies.
- name: Install FFTW and SUNDIALS from the Fidimag installation script.
command: bash ubuntu_install_script.sh
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}/bin"
sudo: yes
# Build Fidimag. Makefile module?
- name: Build Fidimag.
command: make
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}"
sudo: yes
| ---
# This Ansible playbook build Fidimag from scripts in the Fidimag
# repository. This include installing local dependencies including FFTW and
# SUNDIALS.
# Install local dependencies.
- name: Install FFTW and SUNDIALS from the Fidimag installation script.
- command: bash install.sh
+ command: bash ubuntu_install_script.sh
? +++++++ +++++++
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}/bin"
sudo: yes
# Build Fidimag. Makefile module?
- name: Build Fidimag.
command: make
args:
chdir: "{{ FIDIMAG_INSTALL_PATH }}"
sudo: yes | 2 | 0.111111 | 1 | 1 |
d881a2b5a31dac5df4e17094b8876a14fb6158a0 | app/views/rewards/_form.html.slim | app/views/rewards/_form.html.slim | = simple_form_for [parent, resource], html: { class: 'remote-form' } do |f|
.pull-right
a.cancel.button.secondary href="#" = t('words.cancel')
.clearfix
- if resource.new_record? || can?(:update, resource, :title)
= f.input :title
- if resource.new_record? || can?(:update, resource, :minimum_value)
= f.input :minimum_value, as: :integer, input_html: {pattern: "^[\\d+.]+"}
- if resource.new_record? || can?(:update, resource, :days_to_delivery)
= f.input :days_to_delivery
- if resource.new_record? || can?(:update, resource, :description)
= f.input :description
- if resource.new_record? || can?(:update, resource, :maximum_contributions)
= f.input :maximum_contributions
= f.input :soon, as: :boolean
= f.input :show_contact_button, as: :boolean
= f.button :submit, data: { disable_with: t('words.sending') }
- if resource.persisted? && can?(:destroy, resource)
span.or = t('words.or')
= link_to [parent, resource], class: 'button alert', method: :delete, confirm: t('words.confirm') do
i.icon-et-trash.icon-white
= t('words.delete')
| = simple_form_for [parent, resource], html: { class: 'remote-form' } do |f|
.pull-right
a.cancel.button.secondary href="#" = t('words.cancel')
.clearfix
- if resource.new_record? || can?(:update, resource, :title)
= f.input :title
- if resource.new_record? || can?(:update, resource, :minimum_value)
= f.input :minimum_value, as: :integer, input_html: {pattern: "^[\\d+.]+"}
- if resource.new_record? || can?(:update, resource, :days_to_delivery)
= f.input :days_to_delivery
- if resource.new_record? || can?(:update, resource, :description)
= f.input :description
- if resource.new_record? || can?(:update, resource, :maximum_contributions)
= f.input :maximum_contributions
= f.input :soon, as: :boolean
= f.button :submit, data: { disable_with: t('words.sending') }
- if resource.persisted? && can?(:destroy, resource)
span.or = t('words.or')
= link_to [parent, resource], class: 'button alert', method: :delete, confirm: t('words.confirm') do
i.icon-et-trash.icon-white
= t('words.delete')
| Remove reward contact from reward form | Remove reward contact from reward form
| Slim | mit | raksonibs/raimcrowd,jinutm/silveralms.com,gustavoguichard/neighborly,jinutm/silverpro,jinutm/silverme,jinutm/silverme,jinutm/silverpro,MicroPasts/micropasts-crowdfunding,gustavoguichard/neighborly,jinutm/silverme,jinutm/silverpro,MicroPasts/micropasts-crowdfunding,gustavoguichard/neighborly,MicroPasts/micropasts-crowdfunding,jinutm/silverprod,MicroPasts/micropasts-crowdfunding,raksonibs/raimcrowd,jinutm/silverprod,jinutm/silverprod,raksonibs/raimcrowd,jinutm/silveralms.com,raksonibs/raimcrowd,jinutm/silveralms.com | slim | ## Code Before:
= simple_form_for [parent, resource], html: { class: 'remote-form' } do |f|
.pull-right
a.cancel.button.secondary href="#" = t('words.cancel')
.clearfix
- if resource.new_record? || can?(:update, resource, :title)
= f.input :title
- if resource.new_record? || can?(:update, resource, :minimum_value)
= f.input :minimum_value, as: :integer, input_html: {pattern: "^[\\d+.]+"}
- if resource.new_record? || can?(:update, resource, :days_to_delivery)
= f.input :days_to_delivery
- if resource.new_record? || can?(:update, resource, :description)
= f.input :description
- if resource.new_record? || can?(:update, resource, :maximum_contributions)
= f.input :maximum_contributions
= f.input :soon, as: :boolean
= f.input :show_contact_button, as: :boolean
= f.button :submit, data: { disable_with: t('words.sending') }
- if resource.persisted? && can?(:destroy, resource)
span.or = t('words.or')
= link_to [parent, resource], class: 'button alert', method: :delete, confirm: t('words.confirm') do
i.icon-et-trash.icon-white
= t('words.delete')
## Instruction:
Remove reward contact from reward form
## Code After:
= simple_form_for [parent, resource], html: { class: 'remote-form' } do |f|
.pull-right
a.cancel.button.secondary href="#" = t('words.cancel')
.clearfix
- if resource.new_record? || can?(:update, resource, :title)
= f.input :title
- if resource.new_record? || can?(:update, resource, :minimum_value)
= f.input :minimum_value, as: :integer, input_html: {pattern: "^[\\d+.]+"}
- if resource.new_record? || can?(:update, resource, :days_to_delivery)
= f.input :days_to_delivery
- if resource.new_record? || can?(:update, resource, :description)
= f.input :description
- if resource.new_record? || can?(:update, resource, :maximum_contributions)
= f.input :maximum_contributions
= f.input :soon, as: :boolean
= f.button :submit, data: { disable_with: t('words.sending') }
- if resource.persisted? && can?(:destroy, resource)
span.or = t('words.or')
= link_to [parent, resource], class: 'button alert', method: :delete, confirm: t('words.confirm') do
i.icon-et-trash.icon-white
= t('words.delete')
| = simple_form_for [parent, resource], html: { class: 'remote-form' } do |f|
.pull-right
a.cancel.button.secondary href="#" = t('words.cancel')
.clearfix
- if resource.new_record? || can?(:update, resource, :title)
= f.input :title
- if resource.new_record? || can?(:update, resource, :minimum_value)
= f.input :minimum_value, as: :integer, input_html: {pattern: "^[\\d+.]+"}
- if resource.new_record? || can?(:update, resource, :days_to_delivery)
= f.input :days_to_delivery
- if resource.new_record? || can?(:update, resource, :description)
= f.input :description
- if resource.new_record? || can?(:update, resource, :maximum_contributions)
= f.input :maximum_contributions
= f.input :soon, as: :boolean
- = f.input :show_contact_button, as: :boolean
= f.button :submit, data: { disable_with: t('words.sending') }
- if resource.persisted? && can?(:destroy, resource)
span.or = t('words.or')
= link_to [parent, resource], class: 'button alert', method: :delete, confirm: t('words.confirm') do
i.icon-et-trash.icon-white
= t('words.delete') | 1 | 0.034483 | 0 | 1 |
432b962ed6eed8fa198cf89bf3fd4770ed2550a3 | t/nqp/78-shell.t | t/nqp/78-shell.t | plan(4);
my $a := nqp::getenvhash();
$a<foo> := 123;
my $b := nqp::getenvhash();
$b<foo> := 456;
ok(nqp::ishash($a),'nqp::getenvhash() returns a hash');
ok($a<foo> == 123,'nqp::getenvhash() is a fresh hash');
ok($b<foo> == 456,'nqp::getenvhash() is a fresh hash');
my $tmp_file := "tmp";
nqp::shell("echo Hello > $tmp_file",nqp::cwd(),nqp::getenvhash());
my $output := slurp($tmp_file);
ok($output ~~ /^Hello/,'nqp::shell works with the echo shell command');
| plan(5);
my $a := nqp::getenvhash();
$a<foo> := 123;
my $b := nqp::getenvhash();
$b<foo> := 456;
ok(nqp::ishash($a),'nqp::getenvhash() returns a hash');
ok($a<foo> == 123,'nqp::getenvhash() is a fresh hash');
ok($b<foo> == 456,'nqp::getenvhash() is a fresh hash');
my $tmp_file := "tmp";
nqp::shell("echo Hello > $tmp_file",nqp::cwd(),nqp::getenvhash());
my $output := slurp($tmp_file);
ok($output ~~ /^Hello/,'nqp::shell works with the echo shell command');
my $env := nqp::getenvhash();
$env<NQP_SHELL_TEST_ENV_VAR> := "123foo";
nqp::shell("echo %NQP_SHELL_TEST_ENV_VAR% > $tmp_file",nqp::cwd(),$env);
$output := slurp($tmp_file);
if $output eq "%NQP_SHELL_TEST_ENV_VAR%\n" {
nqp::shell("echo \$NQP_SHELL_TEST_ENV_VAR > $tmp_file",nqp::cwd(),$env);
my $output := slurp($tmp_file);
ok($output eq "123foo\n","passing env variables to child processes works linux");
} else {
ok($output ~~ /^123foo/,"passing env variables to child processes works on windows");
}
| Test that passing env variable to child processes works correctly. | Test that passing env variable to child processes works correctly.
| Perl | artistic-2.0 | cygx/nqp,cygx/nqp,cygx/nqp,cygx/nqp,cygx/nqp,cygx/nqp,cygx/nqp,cygx/nqp | perl | ## Code Before:
plan(4);
my $a := nqp::getenvhash();
$a<foo> := 123;
my $b := nqp::getenvhash();
$b<foo> := 456;
ok(nqp::ishash($a),'nqp::getenvhash() returns a hash');
ok($a<foo> == 123,'nqp::getenvhash() is a fresh hash');
ok($b<foo> == 456,'nqp::getenvhash() is a fresh hash');
my $tmp_file := "tmp";
nqp::shell("echo Hello > $tmp_file",nqp::cwd(),nqp::getenvhash());
my $output := slurp($tmp_file);
ok($output ~~ /^Hello/,'nqp::shell works with the echo shell command');
## Instruction:
Test that passing env variable to child processes works correctly.
## Code After:
plan(5);
my $a := nqp::getenvhash();
$a<foo> := 123;
my $b := nqp::getenvhash();
$b<foo> := 456;
ok(nqp::ishash($a),'nqp::getenvhash() returns a hash');
ok($a<foo> == 123,'nqp::getenvhash() is a fresh hash');
ok($b<foo> == 456,'nqp::getenvhash() is a fresh hash');
my $tmp_file := "tmp";
nqp::shell("echo Hello > $tmp_file",nqp::cwd(),nqp::getenvhash());
my $output := slurp($tmp_file);
ok($output ~~ /^Hello/,'nqp::shell works with the echo shell command');
my $env := nqp::getenvhash();
$env<NQP_SHELL_TEST_ENV_VAR> := "123foo";
nqp::shell("echo %NQP_SHELL_TEST_ENV_VAR% > $tmp_file",nqp::cwd(),$env);
$output := slurp($tmp_file);
if $output eq "%NQP_SHELL_TEST_ENV_VAR%\n" {
nqp::shell("echo \$NQP_SHELL_TEST_ENV_VAR > $tmp_file",nqp::cwd(),$env);
my $output := slurp($tmp_file);
ok($output eq "123foo\n","passing env variables to child processes works linux");
} else {
ok($output ~~ /^123foo/,"passing env variables to child processes works on windows");
}
| - plan(4);
? ^
+ plan(5);
? ^
my $a := nqp::getenvhash();
$a<foo> := 123;
my $b := nqp::getenvhash();
$b<foo> := 456;
ok(nqp::ishash($a),'nqp::getenvhash() returns a hash');
ok($a<foo> == 123,'nqp::getenvhash() is a fresh hash');
ok($b<foo> == 456,'nqp::getenvhash() is a fresh hash');
my $tmp_file := "tmp";
nqp::shell("echo Hello > $tmp_file",nqp::cwd(),nqp::getenvhash());
my $output := slurp($tmp_file);
ok($output ~~ /^Hello/,'nqp::shell works with the echo shell command');
+
+ my $env := nqp::getenvhash();
+ $env<NQP_SHELL_TEST_ENV_VAR> := "123foo";
+
+ nqp::shell("echo %NQP_SHELL_TEST_ENV_VAR% > $tmp_file",nqp::cwd(),$env);
+ $output := slurp($tmp_file);
+ if $output eq "%NQP_SHELL_TEST_ENV_VAR%\n" {
+ nqp::shell("echo \$NQP_SHELL_TEST_ENV_VAR > $tmp_file",nqp::cwd(),$env);
+ my $output := slurp($tmp_file);
+ ok($output eq "123foo\n","passing env variables to child processes works linux");
+ } else {
+ ok($output ~~ /^123foo/,"passing env variables to child processes works on windows");
+ } | 15 | 1 | 14 | 1 |
9949a1853f1e721e62304cf7af288d95253655b9 | add_support_for_new_angular_version.txt | add_support_for_new_angular_version.txt | Notes from when I added Angular 11 support
- in uirouter/angular:
- update peer deps for angular packages, bumping each range by one
- update angular packages manually to the lower range supported (for angular 11, set to ^10.0.0)
- update ng-packagr manually to the lower range supported (for angular 11, set to ^10.0.0)
- update typescript to the version required by @angular/compiler-cli
- npx check-peer-dependencies
- gh pr create
- update other libs
- npx check-peer-dependencies
- gh pr create
- npm run release
- in sample-app-angular:
- git checkout -b update-to-latest-angular
- npx ng update @angular/core @angular/cli
- git commit -m "chore: update to Angular 11"
- yarn && yarn test
- gh pr create
| Notes from when I added Angular 11 support
- in uirouter/angular:
- update peer deps for angular packages, bumping each range by one
- update angular packages manually to the lower range supported (for angular 11, set to ^10.0.0)
- update ng-packagr manually to the lower range supported (for angular 11, set to ^10.0.0)
- update typescript to the version required by @angular/compiler-cli
- npx check-peer-dependencies
- gh pr create
- update other libs
- npx check-peer-dependencies
- gh pr create
- npm run release
- in sample-app-angular:
- git checkout -b update-to-latest-angular
- npx ng update @angular/core @angular/cli
- git commit -m "chore: update to Angular 11"
- yarn && yarn test
- gh pr create
- in sample-app-angular-hybrid
- npx ng update @angular/core @angular/cli
- yarn upgrade-interactive --latest (update uirouter libs)
- push to a branch 'upgrade-to-angular-11'
- in uirouter/angular-hybrid
- target the sample-app branch in downstream_test.json:
- "https://github.com/ui-router/sample-app-angular-hybrid.git@update-to-angular-11"
- update peer deps for angular packages, bumping each range by one
- update angular packages manually to the lower range supported (for angular 11, set to ^10.0.0)
- update ng-packagr manually to the lower range supported (for angular 11, set to ^10.0.0)
- update typescript to the version required by @angular/compiler-cli
- npx check-peer-dependencies
- in example
- npx ng update @angular/core @angular/cli
- yarn upgrade-interactive --latest (update uirouter libs)
- gh pr create
- npm run release
- revert downstream_projects.json and push
- in sample-app-angular-hybrid after merging
- yarn upgrade-interactive --latest (update uirouter libs)
- push and merge
| Add more notes on adding support for new Angular releases | Add more notes on adding support for new Angular releases
| Text | mit | ui-router/angular,ui-router/angular,ui-router/angular,ui-router/angular | text | ## Code Before:
Notes from when I added Angular 11 support
- in uirouter/angular:
- update peer deps for angular packages, bumping each range by one
- update angular packages manually to the lower range supported (for angular 11, set to ^10.0.0)
- update ng-packagr manually to the lower range supported (for angular 11, set to ^10.0.0)
- update typescript to the version required by @angular/compiler-cli
- npx check-peer-dependencies
- gh pr create
- update other libs
- npx check-peer-dependencies
- gh pr create
- npm run release
- in sample-app-angular:
- git checkout -b update-to-latest-angular
- npx ng update @angular/core @angular/cli
- git commit -m "chore: update to Angular 11"
- yarn && yarn test
- gh pr create
## Instruction:
Add more notes on adding support for new Angular releases
## Code After:
Notes from when I added Angular 11 support
- in uirouter/angular:
- update peer deps for angular packages, bumping each range by one
- update angular packages manually to the lower range supported (for angular 11, set to ^10.0.0)
- update ng-packagr manually to the lower range supported (for angular 11, set to ^10.0.0)
- update typescript to the version required by @angular/compiler-cli
- npx check-peer-dependencies
- gh pr create
- update other libs
- npx check-peer-dependencies
- gh pr create
- npm run release
- in sample-app-angular:
- git checkout -b update-to-latest-angular
- npx ng update @angular/core @angular/cli
- git commit -m "chore: update to Angular 11"
- yarn && yarn test
- gh pr create
- in sample-app-angular-hybrid
- npx ng update @angular/core @angular/cli
- yarn upgrade-interactive --latest (update uirouter libs)
- push to a branch 'upgrade-to-angular-11'
- in uirouter/angular-hybrid
- target the sample-app branch in downstream_test.json:
- "https://github.com/ui-router/sample-app-angular-hybrid.git@update-to-angular-11"
- update peer deps for angular packages, bumping each range by one
- update angular packages manually to the lower range supported (for angular 11, set to ^10.0.0)
- update ng-packagr manually to the lower range supported (for angular 11, set to ^10.0.0)
- update typescript to the version required by @angular/compiler-cli
- npx check-peer-dependencies
- in example
- npx ng update @angular/core @angular/cli
- yarn upgrade-interactive --latest (update uirouter libs)
- gh pr create
- npm run release
- revert downstream_projects.json and push
- in sample-app-angular-hybrid after merging
- yarn upgrade-interactive --latest (update uirouter libs)
- push and merge
| Notes from when I added Angular 11 support
- in uirouter/angular:
- update peer deps for angular packages, bumping each range by one
- update angular packages manually to the lower range supported (for angular 11, set to ^10.0.0)
- update ng-packagr manually to the lower range supported (for angular 11, set to ^10.0.0)
- update typescript to the version required by @angular/compiler-cli
- npx check-peer-dependencies
- gh pr create
- update other libs
- npx check-peer-dependencies
- gh pr create
- npm run release
- in sample-app-angular:
- git checkout -b update-to-latest-angular
- npx ng update @angular/core @angular/cli
- git commit -m "chore: update to Angular 11"
- yarn && yarn test
- gh pr create
+ - in sample-app-angular-hybrid
+ - npx ng update @angular/core @angular/cli
+ - yarn upgrade-interactive --latest (update uirouter libs)
+ - push to a branch 'upgrade-to-angular-11'
+
+ - in uirouter/angular-hybrid
+ - target the sample-app branch in downstream_test.json:
+ - "https://github.com/ui-router/sample-app-angular-hybrid.git@update-to-angular-11"
+ - update peer deps for angular packages, bumping each range by one
+ - update angular packages manually to the lower range supported (for angular 11, set to ^10.0.0)
+ - update ng-packagr manually to the lower range supported (for angular 11, set to ^10.0.0)
+ - update typescript to the version required by @angular/compiler-cli
+ - npx check-peer-dependencies
+ - in example
+ - npx ng update @angular/core @angular/cli
+ - yarn upgrade-interactive --latest (update uirouter libs)
+ - gh pr create
+ - npm run release
+ - revert downstream_projects.json and push
+
+ - in sample-app-angular-hybrid after merging
+ - yarn upgrade-interactive --latest (update uirouter libs)
+ - push and merge | 23 | 1.095238 | 23 | 0 |
5a7167bb05cf709045a6dca80f18bb964a9ee877 | mac/resources/open_FREF.js | mac/resources/open_FREF.js | define(['mac/roman'], function(macintoshRoman) {
'use strict';
return function(resource) {
if (resource.data.length !== 7) {
console.error('FREF resource expected to be 7 bytes, got ' + resource.data.length);
return;
}
resource.dataObject = {
fileType: macintoshRoman(resource.data, 0, 4),
iconListID: (resource.data[4] << 8) | resource.data[5],
};
};
});
| define(['mac/roman'], function(macintoshRoman) {
'use strict';
return function(item) {
return item.getBytes().then(function(bytes) {
if (bytes.length !== 7) {
return Promise.reject('FREF resource expected to be 7 bytes, got ' + bytes.length);
}
item.setDataObject({
fileType: macintoshRoman(resource.data, 0, 4),
iconListID: (resource.data[4] << 8) | resource.data[5],
});
});
};
});
| Change FREF to new loader format | Change FREF to new loader format | JavaScript | mit | radishengine/drowsy,radishengine/drowsy | javascript | ## Code Before:
define(['mac/roman'], function(macintoshRoman) {
'use strict';
return function(resource) {
if (resource.data.length !== 7) {
console.error('FREF resource expected to be 7 bytes, got ' + resource.data.length);
return;
}
resource.dataObject = {
fileType: macintoshRoman(resource.data, 0, 4),
iconListID: (resource.data[4] << 8) | resource.data[5],
};
};
});
## Instruction:
Change FREF to new loader format
## Code After:
define(['mac/roman'], function(macintoshRoman) {
'use strict';
return function(item) {
return item.getBytes().then(function(bytes) {
if (bytes.length !== 7) {
return Promise.reject('FREF resource expected to be 7 bytes, got ' + bytes.length);
}
item.setDataObject({
fileType: macintoshRoman(resource.data, 0, 4),
iconListID: (resource.data[4] << 8) | resource.data[5],
});
});
};
});
| define(['mac/roman'], function(macintoshRoman) {
'use strict';
- return function(resource) {
? ^ ^^^^^^
+ return function(item) {
? ^^ ^
+ return item.getBytes().then(function(bytes) {
- if (resource.data.length !== 7) {
? ^ ----------
+ if (bytes.length !== 7) {
? ++ ^^^
- console.error('FREF resource expected to be 7 bytes, got ' + resource.data.length);
? ^ ^ -- ^^^^ ^ ----------
+ return Promise.reject('FREF resource expected to be 7 bytes, got ' + bytes.length);
? ^^^^^^^^^^^ ^^ + ^^^^ ^^^
- return;
- }
+ }
? ++
- resource.dataObject = {
+ item.setDataObject({
- fileType: macintoshRoman(resource.data, 0, 4),
+ fileType: macintoshRoman(resource.data, 0, 4),
? ++
- iconListID: (resource.data[4] << 8) | resource.data[5],
+ iconListID: (resource.data[4] << 8) | resource.data[5],
? ++
+ });
- };
+ });
? +
};
}); | 19 | 1.1875 | 10 | 9 |
5c12b0c04b25e414b1bd04250cde0c3b1f869104 | hr_emergency_contact/models/hr_employee.py | hr_emergency_contact/models/hr_employee.py |
from openerp import models, fields
class HrEmployee(models.Model):
_name = 'hr.employee'
_inherit = 'hr.employee'
emergency_contact_ids = fields.Many2many(
string='Emergency Contacts',
comodel_name='res.partner',
relation='rel_employee_emergency_contact',
column1='employee_id',
column2='partner_id',
domain=[
('is_company', '=', False),
('parent_id', '=', False),
]
)
|
from openerp import models, fields
class HrEmployee(models.Model):
_inherit = 'hr.employee'
emergency_contact_ids = fields.Many2many(
string='Emergency Contacts',
comodel_name='res.partner',
relation='rel_employee_emergency_contact',
column1='employee_id',
column2='partner_id',
domain=[
('is_company', '=', False),
('parent_id', '=', False),
]
)
| Remove _name attribute on hr.employee | Remove _name attribute on hr.employee
| Python | agpl-3.0 | VitalPet/hr,thinkopensolutions/hr,VitalPet/hr,xpansa/hr,Eficent/hr,Eficent/hr,feketemihai/hr,feketemihai/hr,acsone/hr,open-synergy/hr,open-synergy/hr,xpansa/hr,acsone/hr,thinkopensolutions/hr | python | ## Code Before:
from openerp import models, fields
class HrEmployee(models.Model):
_name = 'hr.employee'
_inherit = 'hr.employee'
emergency_contact_ids = fields.Many2many(
string='Emergency Contacts',
comodel_name='res.partner',
relation='rel_employee_emergency_contact',
column1='employee_id',
column2='partner_id',
domain=[
('is_company', '=', False),
('parent_id', '=', False),
]
)
## Instruction:
Remove _name attribute on hr.employee
## Code After:
from openerp import models, fields
class HrEmployee(models.Model):
_inherit = 'hr.employee'
emergency_contact_ids = fields.Many2many(
string='Emergency Contacts',
comodel_name='res.partner',
relation='rel_employee_emergency_contact',
column1='employee_id',
column2='partner_id',
domain=[
('is_company', '=', False),
('parent_id', '=', False),
]
)
|
from openerp import models, fields
class HrEmployee(models.Model):
- _name = 'hr.employee'
_inherit = 'hr.employee'
emergency_contact_ids = fields.Many2many(
string='Emergency Contacts',
comodel_name='res.partner',
relation='rel_employee_emergency_contact',
column1='employee_id',
column2='partner_id',
domain=[
('is_company', '=', False),
('parent_id', '=', False),
]
) | 1 | 0.05 | 0 | 1 |
3738e536348cd516f98f8ba24022c10fd04afffc | data/travels.csv | data/travels.csv | 34.052235,-118.243683,1,1
40.792240,-73.138260,2,1
48.858093,2.294694,3,1
35.685360,139.753372,4,1
-27.470125,153.021072,5,1
| 34.052235,-118.243683,1,1,Los Angeles
40.792240,-73.138260,2,1,New York
48.858093,2.294694,3,1,Paris
35.685360,139.753372,4,1,Tokyo
-27.470125,153.021072,5,1,Brisbane
| Add place names to dummy data. | Add place names to dummy data.
| CSV | apache-2.0 | sorohan/whereivebeen,sorohan/whereivebeen | csv | ## Code Before:
34.052235,-118.243683,1,1
40.792240,-73.138260,2,1
48.858093,2.294694,3,1
35.685360,139.753372,4,1
-27.470125,153.021072,5,1
## Instruction:
Add place names to dummy data.
## Code After:
34.052235,-118.243683,1,1,Los Angeles
40.792240,-73.138260,2,1,New York
48.858093,2.294694,3,1,Paris
35.685360,139.753372,4,1,Tokyo
-27.470125,153.021072,5,1,Brisbane
| - 34.052235,-118.243683,1,1
+ 34.052235,-118.243683,1,1,Los Angeles
? ++++++++++++
- 40.792240,-73.138260,2,1
+ 40.792240,-73.138260,2,1,New York
? +++++++++
- 48.858093,2.294694,3,1
+ 48.858093,2.294694,3,1,Paris
? ++++++
- 35.685360,139.753372,4,1
+ 35.685360,139.753372,4,1,Tokyo
? ++++++
- -27.470125,153.021072,5,1
+ -27.470125,153.021072,5,1,Brisbane
? +++++++++
| 10 | 2 | 5 | 5 |
389bbd7ddfe3e5fb7462587f6431e090898c4ed0 | release/sites/Sankaku/chan.sankakucomplex.com/tag-types.txt | release/sites/Sankaku/chan.sankakucomplex.com/tag-types.txt | 0,general
1,idol
3,copyright
4,character
5,photoset
8,medium
9,meta | 0,general
1,artist
2,studio
3,copyright
4,character
8,medium
9,meta | Fix wrongly tagged artists in Sankaku JSON API | Fix wrongly tagged artists in Sankaku JSON API
| Text | apache-2.0 | Bionus/imgbrd-grabber,Bionus/imgbrd-grabber,Bionus/imgbrd-grabber,Bionus/imgbrd-grabber,Bionus/imgbrd-grabber,Bionus/imgbrd-grabber | text | ## Code Before:
0,general
1,idol
3,copyright
4,character
5,photoset
8,medium
9,meta
## Instruction:
Fix wrongly tagged artists in Sankaku JSON API
## Code After:
0,general
1,artist
2,studio
3,copyright
4,character
8,medium
9,meta | 0,general
- 1,idol
+ 1,artist
+ 2,studio
3,copyright
4,character
- 5,photoset
8,medium
9,meta | 4 | 0.571429 | 2 | 2 |
2d65dea76a3605683798ee6c91cd5252543a9f62 | doc/examples/ecdsa.cpp | doc/examples/ecdsa.cpp |
using namespace Botan;
int main()
{
try
{
std::auto_ptr<RandomNumberGenerator> rng(
RandomNumberGenerator::make_rng());
EC_Domain_Params params = get_EC_Dom_Pars_by_oid("1.3.132.0.8");
std::cout << params.get_curve().get_p() << "\n";
std::cout << params.get_order() << "\n";
ECDSA_PrivateKey ecdsa(*rng, params);
}
catch(std::exception& e)
{
std::cout << e.what() << "\n";
}
}
|
using namespace Botan;
int main()
{
try
{
std::auto_ptr<RandomNumberGenerator> rng(
RandomNumberGenerator::make_rng());
EC_Domain_Params params = get_EC_Dom_Pars_by_oid("1.3.132.0.8");
std::cout << params.get_curve().get_p() << "\n";
std::cout << params.get_order() << "\n";
ECDSA_PrivateKey ecdsa(*rng, params);
std::cout << X509::PEM_encode(ecdsa);
}
catch(std::exception& e)
{
std::cout << e.what() << "\n";
}
}
| Print generated public key in ECDSA example | Print generated public key in ECDSA example
| C++ | bsd-2-clause | Rohde-Schwarz-Cybersecurity/botan,randombit/botan,randombit/botan,webmaster128/botan,webmaster128/botan,randombit/botan,Rohde-Schwarz-Cybersecurity/botan,Rohde-Schwarz-Cybersecurity/botan,randombit/botan,Rohde-Schwarz-Cybersecurity/botan,webmaster128/botan,Rohde-Schwarz-Cybersecurity/botan,randombit/botan,webmaster128/botan,Rohde-Schwarz-Cybersecurity/botan,webmaster128/botan | c++ | ## Code Before:
using namespace Botan;
int main()
{
try
{
std::auto_ptr<RandomNumberGenerator> rng(
RandomNumberGenerator::make_rng());
EC_Domain_Params params = get_EC_Dom_Pars_by_oid("1.3.132.0.8");
std::cout << params.get_curve().get_p() << "\n";
std::cout << params.get_order() << "\n";
ECDSA_PrivateKey ecdsa(*rng, params);
}
catch(std::exception& e)
{
std::cout << e.what() << "\n";
}
}
## Instruction:
Print generated public key in ECDSA example
## Code After:
using namespace Botan;
int main()
{
try
{
std::auto_ptr<RandomNumberGenerator> rng(
RandomNumberGenerator::make_rng());
EC_Domain_Params params = get_EC_Dom_Pars_by_oid("1.3.132.0.8");
std::cout << params.get_curve().get_p() << "\n";
std::cout << params.get_order() << "\n";
ECDSA_PrivateKey ecdsa(*rng, params);
std::cout << X509::PEM_encode(ecdsa);
}
catch(std::exception& e)
{
std::cout << e.what() << "\n";
}
}
|
using namespace Botan;
int main()
{
try
{
std::auto_ptr<RandomNumberGenerator> rng(
RandomNumberGenerator::make_rng());
EC_Domain_Params params = get_EC_Dom_Pars_by_oid("1.3.132.0.8");
std::cout << params.get_curve().get_p() << "\n";
std::cout << params.get_order() << "\n";
ECDSA_PrivateKey ecdsa(*rng, params);
+
+ std::cout << X509::PEM_encode(ecdsa);
}
catch(std::exception& e)
{
std::cout << e.what() << "\n";
}
} | 2 | 0.090909 | 2 | 0 |
8dbc9df18c218e79640c8d161b5d0eb17c6ec785 | src/Calendar/Controller/LeapYearController.php | src/Calendar/Controller/LeapYearController.php | <?php
namespace Calendar\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Scarlett\Controller\Controller;
use Calendar\Model\LeapYear;
class LeapYearController extends Controller
{
public function indexAction(Request $request, $year)
{
$leapyear = new LeapYear();
if ($leapyear->isLeapYear($year)) {
$message = 'Yep, this is a leap year!';
}else{
$message = 'Nope, this is not a leap year!';
}
return $this->render('leapYear', array('message' => $message));
}
public function primeAction(Request $request, $number)
{
if($number%$number == 0 && $number%1 == 0){
$message = 'Prime number';
}else{
$message = 'Not prime';
}
return $this->render('primeNumber', array('message' => $message));
}
} | <?php
namespace Calendar\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Scarlett\Controller\Controller;
use Calendar\Model\LeapYear;
class LeapYearController extends Controller
{
public function indexAction(Request $request, $year)
{
$leapyear = new LeapYear();
if ($leapyear->isLeapYear($year)) {
$message = 'Yep, this is a leap year!';
}else{
$message = 'Nope, this is not a leap year!';
}
return $this->render('leapYear', array('message' => $message));
}
public function primeAction(Request $request, $number)
{
$message = "It's prime!";
for($i=$number-1;$i>=2;$i--) {
if($number%$i == 0) {
$message =" It's not prime!";
break;
}
}
return $this->render('primeNumber', array('message' => $message));
}
} | Add a real prime number test | Add a real prime number test
| PHP | mit | mangelsnc/Scarlett | php | ## Code Before:
<?php
namespace Calendar\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Scarlett\Controller\Controller;
use Calendar\Model\LeapYear;
class LeapYearController extends Controller
{
public function indexAction(Request $request, $year)
{
$leapyear = new LeapYear();
if ($leapyear->isLeapYear($year)) {
$message = 'Yep, this is a leap year!';
}else{
$message = 'Nope, this is not a leap year!';
}
return $this->render('leapYear', array('message' => $message));
}
public function primeAction(Request $request, $number)
{
if($number%$number == 0 && $number%1 == 0){
$message = 'Prime number';
}else{
$message = 'Not prime';
}
return $this->render('primeNumber', array('message' => $message));
}
}
## Instruction:
Add a real prime number test
## Code After:
<?php
namespace Calendar\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Scarlett\Controller\Controller;
use Calendar\Model\LeapYear;
class LeapYearController extends Controller
{
public function indexAction(Request $request, $year)
{
$leapyear = new LeapYear();
if ($leapyear->isLeapYear($year)) {
$message = 'Yep, this is a leap year!';
}else{
$message = 'Nope, this is not a leap year!';
}
return $this->render('leapYear', array('message' => $message));
}
public function primeAction(Request $request, $number)
{
$message = "It's prime!";
for($i=$number-1;$i>=2;$i--) {
if($number%$i == 0) {
$message =" It's not prime!";
break;
}
}
return $this->render('primeNumber', array('message' => $message));
}
} | <?php
namespace Calendar\Controller;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Scarlett\Controller\Controller;
use Calendar\Model\LeapYear;
class LeapYearController extends Controller
{
public function indexAction(Request $request, $year)
{
$leapyear = new LeapYear();
if ($leapyear->isLeapYear($year)) {
$message = 'Yep, this is a leap year!';
}else{
$message = 'Nope, this is not a leap year!';
}
return $this->render('leapYear', array('message' => $message));
}
public function primeAction(Request $request, $number)
{
- if($number%$number == 0 && $number%1 == 0){
- $message = 'Prime number';
- }else{
+ $message = "It's prime!";
+ for($i=$number-1;$i>=2;$i--) {
+ if($number%$i == 0) {
- $message = 'Not prime';
? ^ ^
+ $message =" It's not prime!";
? ++++ + ++ ^^^ ^^
+ break;
+ }
}
return $this->render('primeNumber', array('message' => $message));
}
} | 10 | 0.294118 | 6 | 4 |
672a75bd6cc54c62503d8fed3c059cde66ffa9c2 | bower.json | bower.json | {
"name": "angular-mixpanel",
"version": "1.0.0",
"main": "./src/angular-mixpanel.js",
"dependencies": {
"angular": "1.2.x",
"angular-mocks": "~1.2.18",
"mixpanel": "~2.2.0"
}
}
| {
"name": "angular-mixpanel",
"version": "1.0.0",
"main": "./src/angular-mixpanel.js",
"dependencies": {
"angular": "1.2.x",
"mixpanel": "~2.2.0"
},
"devDependencies": {
"angular-mocks": "~1.2.18"
}
}
| Move angular-mocks to dev dependencies | Move angular-mocks to dev dependencies
| JSON | mit | Aloomaio/angular-mixpanel,fotoflo/angular-mixpanel,andsafety/angular-mixpanel,kuhnza/angular-mixpanel | json | ## Code Before:
{
"name": "angular-mixpanel",
"version": "1.0.0",
"main": "./src/angular-mixpanel.js",
"dependencies": {
"angular": "1.2.x",
"angular-mocks": "~1.2.18",
"mixpanel": "~2.2.0"
}
}
## Instruction:
Move angular-mocks to dev dependencies
## Code After:
{
"name": "angular-mixpanel",
"version": "1.0.0",
"main": "./src/angular-mixpanel.js",
"dependencies": {
"angular": "1.2.x",
"mixpanel": "~2.2.0"
},
"devDependencies": {
"angular-mocks": "~1.2.18"
}
}
| {
"name": "angular-mixpanel",
"version": "1.0.0",
"main": "./src/angular-mixpanel.js",
"dependencies": {
"angular": "1.2.x",
- "angular-mocks": "~1.2.18",
"mixpanel": "~2.2.0"
+ },
+ "devDependencies": {
+ "angular-mocks": "~1.2.18"
}
} | 4 | 0.4 | 3 | 1 |
b7cd542f34e9987b5c8b15be1489f95e07dfd701 | .travis.yml | .travis.yml | notifications:
slack: wtsi-cgpit:ptUMR1tkNyZJYd9TpGoss8WR
email: false
sudo: false
services:
- docker
install:
- docker build -t dockstore-cgpmap .
- virtualenv -p python3 venv
- source venv/bin/activate
- pip install html5lib cwltool
script:
- docker images | grep -c dockstore-cgpmap
- cwltool --validate Dockstore.cwl
- cwltool --validate cwls/cgpmap-bamOut.cwl
- cwltool --validate cwls/cgpmap-cramOut.cwl
| notifications:
slack: wtsi-cgpit:ptUMR1tkNyZJYd9TpGoss8WR
email: false
sudo: false
env:
- BUILD_ELEMENT='docker'
- BUILD_ELEMENT='validate'
services:
- docker
install:
- docker build -t dockstore-cgpmap .
- virtualenv -p python3 venv
- source venv/bin/activate
- pip install html5lib cwltool
script:
- 'set -ue
if [ "$BUILD_ELEMENT" == "docker" ]; then
docker images | grep -c dockstore-cgpmap
elif [ "$BUILD_ELEMENT" == "validate" ]; then
cwltool --validate Dockstore.cwl
cwltool --validate cwls/cgpmap-bamOut.cwl
cwltool --validate cwls/cgpmap-cramOut.cwl
fi'
| Switch to a matrix for docker vs validation | Switch to a matrix for docker vs validation
| YAML | agpl-3.0 | cancerit/dockstore-cgpmap,cancerit/dockstore-cgpmap | yaml | ## Code Before:
notifications:
slack: wtsi-cgpit:ptUMR1tkNyZJYd9TpGoss8WR
email: false
sudo: false
services:
- docker
install:
- docker build -t dockstore-cgpmap .
- virtualenv -p python3 venv
- source venv/bin/activate
- pip install html5lib cwltool
script:
- docker images | grep -c dockstore-cgpmap
- cwltool --validate Dockstore.cwl
- cwltool --validate cwls/cgpmap-bamOut.cwl
- cwltool --validate cwls/cgpmap-cramOut.cwl
## Instruction:
Switch to a matrix for docker vs validation
## Code After:
notifications:
slack: wtsi-cgpit:ptUMR1tkNyZJYd9TpGoss8WR
email: false
sudo: false
env:
- BUILD_ELEMENT='docker'
- BUILD_ELEMENT='validate'
services:
- docker
install:
- docker build -t dockstore-cgpmap .
- virtualenv -p python3 venv
- source venv/bin/activate
- pip install html5lib cwltool
script:
- 'set -ue
if [ "$BUILD_ELEMENT" == "docker" ]; then
docker images | grep -c dockstore-cgpmap
elif [ "$BUILD_ELEMENT" == "validate" ]; then
cwltool --validate Dockstore.cwl
cwltool --validate cwls/cgpmap-bamOut.cwl
cwltool --validate cwls/cgpmap-cramOut.cwl
fi'
| notifications:
slack: wtsi-cgpit:ptUMR1tkNyZJYd9TpGoss8WR
email: false
sudo: false
+
+ env:
+ - BUILD_ELEMENT='docker'
+ - BUILD_ELEMENT='validate'
services:
- docker
install:
- docker build -t dockstore-cgpmap .
- virtualenv -p python3 venv
- source venv/bin/activate
- pip install html5lib cwltool
script:
+ - 'set -ue
+ if [ "$BUILD_ELEMENT" == "docker" ]; then
- - docker images | grep -c dockstore-cgpmap
? ^
+ docker images | grep -c dockstore-cgpmap
? ^^^^
+ elif [ "$BUILD_ELEMENT" == "validate" ]; then
- - cwltool --validate Dockstore.cwl
? ^
+ cwltool --validate Dockstore.cwl
? ^^^^
- - cwltool --validate cwls/cgpmap-bamOut.cwl
? ^
+ cwltool --validate cwls/cgpmap-bamOut.cwl
? ^^^^
- - cwltool --validate cwls/cgpmap-cramOut.cwl
? ^
+ cwltool --validate cwls/cgpmap-cramOut.cwl
? ^^^^
+ fi' | 16 | 0.8 | 12 | 4 |
3252fc6b980c78e1e5976fe8c4e0add36107341e | vendor/assets/javascripts/tinymce/plugins/uploadimage/editor_plugin.js | vendor/assets/javascripts/tinymce/plugins/uploadimage/editor_plugin.js | (function() {
tinymce.PluginManager.requireLangPack('uploadimage');
tinymce.create('tinymce.plugins.UploadImagePlugin', {
init: function(ed, url) {
ed.addCommand('mceUploadImage', function() {
return ed.windowManager.open({
file: url + '/dialog.html',
width: 350 + parseInt(ed.getLang('uploadimage.delta_width', 0)),
height: 220 + parseInt(ed.getLang('uploadimage.delta_height', 0)),
inline: 1
}, {
plugin_url: url
});
});
ed.addButton('uploadimage', {
title: 'uploadimage.desc',
cmd: 'mceUploadImage',
image: url + '/img/uploadimage.png'
});
return ed.onNodeChange.add(function(ed, cm, n) {
return cm.setActive('uploadimage', n.nodeName === 'IMG');
});
},
createControl: function(n, cm) {
return null;
},
getInfo: function() {
return {
longname: 'UploadImage plugin',
author: 'Per Christian B. Viken (borrows heavily from work done by Peter Shoukry of 77effects.com)',
authorurl: 'eastblue.org/oss',
infourl: 'eastblue.org/oss',
version: '1.0'
};
}
});
return tinymce.PluginManager.add('uploadimage', tinymce.plugins.UploadImagePlugin);
})(); | (function() {
tinymce.PluginManager.requireLangPack('uploadimage');
var currentOrigin = function(url){
var parts = url.split(/\/\/?/);
parts[1] = location.origin;
return parts.slice(1).join("/");
}
tinymce.create('tinymce.plugins.UploadImagePlugin', {
init: function(ed, url) {
ed.addCommand('mceUploadImage', function() {
return ed.windowManager.open({
file: currentOrigin(url) + '/dialog.html',
width: 350 + parseInt(ed.getLang('uploadimage.delta_width', 0)),
height: 220 + parseInt(ed.getLang('uploadimage.delta_height', 0)),
inline: 1
}, {
plugin_url: currentOrigin(url)
});
});
ed.addButton('uploadimage', {
title: 'uploadimage.desc',
cmd: 'mceUploadImage',
image: url + '/img/uploadimage.png'
});
return ed.onNodeChange.add(function(ed, cm, n) {
return cm.setActive('uploadimage', n.nodeName === 'IMG');
});
},
createControl: function(n, cm) {
return null;
},
getInfo: function() {
return {
longname: 'UploadImage plugin',
author: 'Per Christian B. Viken (borrows heavily from work done by Peter Shoukry of 77effects.com)',
authorurl: 'eastblue.org/oss',
infourl: 'eastblue.org/oss',
version: '1.0'
};
}
});
return tinymce.PluginManager.add('uploadimage', tinymce.plugins.UploadImagePlugin);
})();
| Add origin function to vendor | Add origin function to vendor
| JavaScript | mit | Bargains4Business/tinymce-rails-imageupload | javascript | ## Code Before:
(function() {
tinymce.PluginManager.requireLangPack('uploadimage');
tinymce.create('tinymce.plugins.UploadImagePlugin', {
init: function(ed, url) {
ed.addCommand('mceUploadImage', function() {
return ed.windowManager.open({
file: url + '/dialog.html',
width: 350 + parseInt(ed.getLang('uploadimage.delta_width', 0)),
height: 220 + parseInt(ed.getLang('uploadimage.delta_height', 0)),
inline: 1
}, {
plugin_url: url
});
});
ed.addButton('uploadimage', {
title: 'uploadimage.desc',
cmd: 'mceUploadImage',
image: url + '/img/uploadimage.png'
});
return ed.onNodeChange.add(function(ed, cm, n) {
return cm.setActive('uploadimage', n.nodeName === 'IMG');
});
},
createControl: function(n, cm) {
return null;
},
getInfo: function() {
return {
longname: 'UploadImage plugin',
author: 'Per Christian B. Viken (borrows heavily from work done by Peter Shoukry of 77effects.com)',
authorurl: 'eastblue.org/oss',
infourl: 'eastblue.org/oss',
version: '1.0'
};
}
});
return tinymce.PluginManager.add('uploadimage', tinymce.plugins.UploadImagePlugin);
})();
## Instruction:
Add origin function to vendor
## Code After:
(function() {
tinymce.PluginManager.requireLangPack('uploadimage');
var currentOrigin = function(url){
var parts = url.split(/\/\/?/);
parts[1] = location.origin;
return parts.slice(1).join("/");
}
tinymce.create('tinymce.plugins.UploadImagePlugin', {
init: function(ed, url) {
ed.addCommand('mceUploadImage', function() {
return ed.windowManager.open({
file: currentOrigin(url) + '/dialog.html',
width: 350 + parseInt(ed.getLang('uploadimage.delta_width', 0)),
height: 220 + parseInt(ed.getLang('uploadimage.delta_height', 0)),
inline: 1
}, {
plugin_url: currentOrigin(url)
});
});
ed.addButton('uploadimage', {
title: 'uploadimage.desc',
cmd: 'mceUploadImage',
image: url + '/img/uploadimage.png'
});
return ed.onNodeChange.add(function(ed, cm, n) {
return cm.setActive('uploadimage', n.nodeName === 'IMG');
});
},
createControl: function(n, cm) {
return null;
},
getInfo: function() {
return {
longname: 'UploadImage plugin',
author: 'Per Christian B. Viken (borrows heavily from work done by Peter Shoukry of 77effects.com)',
authorurl: 'eastblue.org/oss',
infourl: 'eastblue.org/oss',
version: '1.0'
};
}
});
return tinymce.PluginManager.add('uploadimage', tinymce.plugins.UploadImagePlugin);
})();
| (function() {
tinymce.PluginManager.requireLangPack('uploadimage');
+ var currentOrigin = function(url){
+ var parts = url.split(/\/\/?/);
+ parts[1] = location.origin;
+ return parts.slice(1).join("/");
+ }
tinymce.create('tinymce.plugins.UploadImagePlugin', {
init: function(ed, url) {
ed.addCommand('mceUploadImage', function() {
return ed.windowManager.open({
- file: url + '/dialog.html',
+ file: currentOrigin(url) + '/dialog.html',
? ++++++++++++++ +
width: 350 + parseInt(ed.getLang('uploadimage.delta_width', 0)),
height: 220 + parseInt(ed.getLang('uploadimage.delta_height', 0)),
inline: 1
}, {
- plugin_url: url
+ plugin_url: currentOrigin(url)
? ++++++++++++++ +
});
});
ed.addButton('uploadimage', {
title: 'uploadimage.desc',
cmd: 'mceUploadImage',
image: url + '/img/uploadimage.png'
});
return ed.onNodeChange.add(function(ed, cm, n) {
return cm.setActive('uploadimage', n.nodeName === 'IMG');
});
},
createControl: function(n, cm) {
return null;
},
getInfo: function() {
return {
longname: 'UploadImage plugin',
author: 'Per Christian B. Viken (borrows heavily from work done by Peter Shoukry of 77effects.com)',
authorurl: 'eastblue.org/oss',
infourl: 'eastblue.org/oss',
version: '1.0'
};
}
});
return tinymce.PluginManager.add('uploadimage', tinymce.plugins.UploadImagePlugin);
})(); | 9 | 0.236842 | 7 | 2 |
7d1fdfa854e8af86bed0721ddf0cd1dee3adcfc6 | now.json | now.json | {
"version": 2,
"name": "constantinescu.io",
"builds": [{ "src": "next.config.js", "use": "@now/next" }],
"build": {
"env": {
"GRAPHQL_ENDPOINT": "@main-gql-endpoint"
}
},
"env": {
"GRAPHQL_ENDPOINT": "@main-gql-endpoint"
}
}
| {
"version": 2,
"name": "constantinescu.io",
"builds": [{ "src": "next.config.js", "use": "@now/next" }],
"build": {
"env": {
"GRAPHQL_ENDPOINT": "@main-gql-endpoint"
}
},
"routes": [
{
"src": "/_next/static/(?:[^/]+/pages|chunks|runtime)/.+",
"headers": { "cache-control": "immutable" }
}
]
}
| Add immutable cache for the pages. | Add immutable cache for the pages.
| JSON | mit | andreiconstantinescu/constantinescu.io | json | ## Code Before:
{
"version": 2,
"name": "constantinescu.io",
"builds": [{ "src": "next.config.js", "use": "@now/next" }],
"build": {
"env": {
"GRAPHQL_ENDPOINT": "@main-gql-endpoint"
}
},
"env": {
"GRAPHQL_ENDPOINT": "@main-gql-endpoint"
}
}
## Instruction:
Add immutable cache for the pages.
## Code After:
{
"version": 2,
"name": "constantinescu.io",
"builds": [{ "src": "next.config.js", "use": "@now/next" }],
"build": {
"env": {
"GRAPHQL_ENDPOINT": "@main-gql-endpoint"
}
},
"routes": [
{
"src": "/_next/static/(?:[^/]+/pages|chunks|runtime)/.+",
"headers": { "cache-control": "immutable" }
}
]
}
| {
"version": 2,
"name": "constantinescu.io",
"builds": [{ "src": "next.config.js", "use": "@now/next" }],
"build": {
"env": {
"GRAPHQL_ENDPOINT": "@main-gql-endpoint"
}
},
- "env": {
- "GRAPHQL_ENDPOINT": "@main-gql-endpoint"
+ "routes": [
+ {
+ "src": "/_next/static/(?:[^/]+/pages|chunks|runtime)/.+",
+ "headers": { "cache-control": "immutable" }
- }
+ }
? ++
+ ]
} | 9 | 0.692308 | 6 | 3 |
2601308d7aecb1cfc9599fdc5e6feddcc16c62f1 | admin/html/view.html | admin/html/view.html | <div id='content'>
<?php echo $notification; ?>
<h2><?php echo $experiment->getName(); ?></h2>
<div style='text-align: center;'>
<button id='live_view' class='green'>Live view</button>
<button id='edit_experiment' class='blue'>Edit experiment</button>
<?php echo $change_status_button; ?>
<?php echo $delete_button; ?>
<button id='reload' class='grey'>Reload</button>
</div>
<?php echo $schedule; ?>
</div>
<div id='menu'>
<?php echo generateMenu($user->getName(), $page); ?>
</div>
| <div id='content'>
<?php echo $notification; ?>
<h2><?php echo $experiment->getName(); ?></h2>
<div style='text-align: center;'>
<button id='live_view' class='green'>Live view</button>
<button id='edit_experiment' class='blue'>Edit experiment</button>
<?php echo $change_status_button; ?>
<?php echo $delete_button; ?>
<button id='reload' class='grey'>Reload</button>
</div>
<?php echo $schedule; ?>
</div>
<div id='menu'>
<?php echo generateMenu($user->getName(), $page); ?>
</div>
<?php include('js/view.js'); ?>
| Include javascript file directly from html file | Include javascript file directly from html file
| HTML | mit | jwcarr/SimpleSignUp,jwcarr/SimpleSignUp,jwcarr/SimpleSignUp | html | ## Code Before:
<div id='content'>
<?php echo $notification; ?>
<h2><?php echo $experiment->getName(); ?></h2>
<div style='text-align: center;'>
<button id='live_view' class='green'>Live view</button>
<button id='edit_experiment' class='blue'>Edit experiment</button>
<?php echo $change_status_button; ?>
<?php echo $delete_button; ?>
<button id='reload' class='grey'>Reload</button>
</div>
<?php echo $schedule; ?>
</div>
<div id='menu'>
<?php echo generateMenu($user->getName(), $page); ?>
</div>
## Instruction:
Include javascript file directly from html file
## Code After:
<div id='content'>
<?php echo $notification; ?>
<h2><?php echo $experiment->getName(); ?></h2>
<div style='text-align: center;'>
<button id='live_view' class='green'>Live view</button>
<button id='edit_experiment' class='blue'>Edit experiment</button>
<?php echo $change_status_button; ?>
<?php echo $delete_button; ?>
<button id='reload' class='grey'>Reload</button>
</div>
<?php echo $schedule; ?>
</div>
<div id='menu'>
<?php echo generateMenu($user->getName(), $page); ?>
</div>
<?php include('js/view.js'); ?>
| <div id='content'>
<?php echo $notification; ?>
<h2><?php echo $experiment->getName(); ?></h2>
<div style='text-align: center;'>
<button id='live_view' class='green'>Live view</button>
<button id='edit_experiment' class='blue'>Edit experiment</button>
<?php echo $change_status_button; ?>
<?php echo $delete_button; ?>
<button id='reload' class='grey'>Reload</button>
</div>
<?php echo $schedule; ?>
</div>
<div id='menu'>
<?php echo generateMenu($user->getName(), $page); ?>
</div>
+
+ <?php include('js/view.js'); ?> | 2 | 0.068966 | 2 | 0 |
0c584c7afc4bf623028c84b71ab6379d9de3f759 | src/main/resources/db/migration/V1_2__modify_varchar_limits.sql | src/main/resources/db/migration/V1_2__modify_varchar_limits.sql | ALTER TABLE users MODIFY COLUMN name varchar(64), MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE tokens MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE roles_users MODIFY COLUMN username varchar(64); | ALTER TABLE users MODIFY COLUMN name varchar(64), MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE tokens MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE roles_users MODIFY COLUMN username varchar(64);
ALTER TABLE permissions MODIFY COLUMN name varchar(128); | Increase varchar lenght in DB. | Increase varchar lenght in DB.
| SQL | mit | granpanda/autheo,granpanda/autheo | sql | ## Code Before:
ALTER TABLE users MODIFY COLUMN name varchar(64), MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE tokens MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE roles_users MODIFY COLUMN username varchar(64);
## Instruction:
Increase varchar lenght in DB.
## Code After:
ALTER TABLE users MODIFY COLUMN name varchar(64), MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE tokens MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE roles_users MODIFY COLUMN username varchar(64);
ALTER TABLE permissions MODIFY COLUMN name varchar(128); | ALTER TABLE users MODIFY COLUMN name varchar(64), MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE tokens MODIFY COLUMN username varchar(64), MODIFY COLUMN organization_id varchar(64);
ALTER TABLE roles_users MODIFY COLUMN username varchar(64);
+ ALTER TABLE permissions MODIFY COLUMN name varchar(128); | 1 | 0.333333 | 1 | 0 |
1f9327e9b89dfb812d63f68d0f6a47a89e1c665f | README.md | README.md | Glas - Rails on Ruby - Jekyll Theme
Preview: http://www.blog.lucasgatsas.ch
Clone to Desktop
Delete the cname file.
<code>
cd Desktop
cd Glas
$ jekyll serve
</code>
| Glas - Rails on Ruby - Jekyll Theme
Preview: http://www.blog.lucasgatsas.ch
Clone to Desktop
Delete the cname file.
<code>
cd Desktop <br>
cd Glas <br>
$ jekyll serve
</code>
| Update Glas Theme - Mo.21.March 2016 | Update Glas Theme - Mo.21.March 2016 | Markdown | mit | mijuhan/mijuhan.github.io,SpaceG/iceandfire,mijuhan/mijuhan.github.io,SpaceG/iceandfire | markdown | ## Code Before:
Glas - Rails on Ruby - Jekyll Theme
Preview: http://www.blog.lucasgatsas.ch
Clone to Desktop
Delete the cname file.
<code>
cd Desktop
cd Glas
$ jekyll serve
</code>
## Instruction:
Update Glas Theme - Mo.21.March 2016
## Code After:
Glas - Rails on Ruby - Jekyll Theme
Preview: http://www.blog.lucasgatsas.ch
Clone to Desktop
Delete the cname file.
<code>
cd Desktop <br>
cd Glas <br>
$ jekyll serve
</code>
| Glas - Rails on Ruby - Jekyll Theme
Preview: http://www.blog.lucasgatsas.ch
Clone to Desktop
Delete the cname file.
<code>
- cd Desktop
+ cd Desktop <br>
? +++++
- cd Glas
+ cd Glas <br>
$ jekyll serve
</code>
| 4 | 0.285714 | 2 | 2 |
d72bf615df736b1cddd6bb3901e63488acbb5ea4 | test/test_sync.ml | test/test_sync.ml | open Webtest.Suite
open Webtest.Utils
let test_bracket_succeed () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
assert_equal !state `test_end;
state := `torn_down
in
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
state := `test_end)
teardown ();
assert_equal !state `torn_down
let test_bracket_fail () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
assert_equal 5 6)
teardown ();
with TestFailure "not equal" ->
assert_equal !state `torn_down
let test_bracket_error () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
failwith "error")
teardown ();
with Failure "error" ->
assert_equal !state `torn_down
let suite =
"sync" >::: [
"test_bracket_succeed" >:: test_bracket_succeed;
"test_bracket_fail" >:: test_bracket_fail;
"test_bracket_error" >:: test_bracket_error;
]
| open Webtest.Suite
let test_bracket_succeed () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
assert_equal !state `test_end;
state := `torn_down
in
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
state := `test_end)
teardown ();
assert_equal !state `torn_down
let test_bracket_fail () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
assert_equal 5 6)
teardown ();
with TestFailure "not equal" ->
assert_equal !state `torn_down
let test_bracket_error () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
failwith "error")
teardown ();
with Failure "error" ->
assert_equal !state `torn_down
let suite =
"sync" >::: [
"test_bracket_succeed" >:: test_bracket_succeed;
"test_bracket_fail" >:: test_bracket_fail;
"test_bracket_error" >:: test_bracket_error;
]
| Remove an unused module open | Remove an unused module open
Signed-off-by: John Else <37b74501daa612e48fea94f9846ed4ddeb6a8f07@gmail.com>
| OCaml | mit | johnelse/ocaml-webtest,johnelse/ocaml-webtest,johnelse/ocaml-webtest | ocaml | ## Code Before:
open Webtest.Suite
open Webtest.Utils
let test_bracket_succeed () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
assert_equal !state `test_end;
state := `torn_down
in
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
state := `test_end)
teardown ();
assert_equal !state `torn_down
let test_bracket_fail () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
assert_equal 5 6)
teardown ();
with TestFailure "not equal" ->
assert_equal !state `torn_down
let test_bracket_error () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
failwith "error")
teardown ();
with Failure "error" ->
assert_equal !state `torn_down
let suite =
"sync" >::: [
"test_bracket_succeed" >:: test_bracket_succeed;
"test_bracket_fail" >:: test_bracket_fail;
"test_bracket_error" >:: test_bracket_error;
]
## Instruction:
Remove an unused module open
Signed-off-by: John Else <37b74501daa612e48fea94f9846ed4ddeb6a8f07@gmail.com>
## Code After:
open Webtest.Suite
let test_bracket_succeed () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
assert_equal !state `test_end;
state := `torn_down
in
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
state := `test_end)
teardown ();
assert_equal !state `torn_down
let test_bracket_fail () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
assert_equal 5 6)
teardown ();
with TestFailure "not equal" ->
assert_equal !state `torn_down
let test_bracket_error () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
failwith "error")
teardown ();
with Failure "error" ->
assert_equal !state `torn_down
let suite =
"sync" >::: [
"test_bracket_succeed" >:: test_bracket_succeed;
"test_bracket_fail" >:: test_bracket_fail;
"test_bracket_error" >:: test_bracket_error;
]
| open Webtest.Suite
- open Webtest.Utils
let test_bracket_succeed () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
assert_equal !state `test_end;
state := `torn_down
in
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
state := `test_end)
teardown ();
assert_equal !state `torn_down
let test_bracket_fail () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
assert_equal 5 6)
teardown ();
with TestFailure "not equal" ->
assert_equal !state `torn_down
let test_bracket_error () =
let state = ref `uninitialised in
let setup () = state := `test_start; state in
let teardown state =
state := `torn_down
in
try
Sync.bracket
setup
(fun state ->
assert_equal !state `test_start;
failwith "error")
teardown ();
with Failure "error" ->
assert_equal !state `torn_down
let suite =
"sync" >::: [
"test_bracket_succeed" >:: test_bracket_succeed;
"test_bracket_fail" >:: test_bracket_fail;
"test_bracket_error" >:: test_bracket_error;
] | 1 | 0.017857 | 0 | 1 |
94e3cea4fc7b4f6b89171fd5280fdc6632485378 | knife-cloud.gemspec | knife-cloud.gemspec | $LOAD_PATH.push File.expand_path("lib", __dir__)
require "knife-cloud/version"
Gem::Specification.new do |s|
s.name = "knife-cloud"
s.version = Knife::Cloud::VERSION
s.authors = ["Kaustubh Deorukhkar", "Ameya Varade"]
s.email = ["dev@chef.io"]
s.homepage = "https://github.com/chef/knife-cloud"
s.summary = "knife-cloud plugin"
s.description = s.summary
s.files = %w{LICENSE} + Dir.glob("lib/**/*")
s.test_files = `git ls-files -- {test,spec,features}/*`.split("\n")
s.require_paths = %w{lib}
s.required_ruby_version = ">= 2.6"
s.add_dependency "chef", ">= 15.11"
s.add_dependency "mixlib-shellout"
s.add_dependency "excon", ">= 0.50" # excon 0.50 renamed the errors class and required updating rescues
end
| $LOAD_PATH.push File.expand_path("lib", __dir__)
require "knife-cloud/version"
Gem::Specification.new do |s|
s.name = "knife-cloud"
s.version = Knife::Cloud::VERSION
s.authors = ["Kaustubh Deorukhkar", "Ameya Varade"]
s.email = ["dev@chef.io"]
s.license = "Apache-2.0"
s.homepage = "https://github.com/chef/knife-cloud"
s.summary = "knife-cloud plugin"
s.description = s.summary
s.files = %w{LICENSE} + Dir.glob("lib/**/*")
s.test_files = `git ls-files -- {test,spec,features}/*`.split("\n")
s.require_paths = %w{lib}
s.required_ruby_version = ">= 2.6"
s.add_dependency "chef", ">= 15.11"
s.add_dependency "mixlib-shellout"
s.add_dependency "excon", ">= 0.50" # excon 0.50 renamed the errors class and required updating rescues
end
| Add license to the gemspec | Add license to the gemspec
Signed-off-by: Tim Smith <764ef62106582a09ed09dfa0b6bff7c05fd7d1e4@chef.io>
| Ruby | apache-2.0 | chef/knife-cloud,chef/knife-cloud,chef/knife-cloud | ruby | ## Code Before:
$LOAD_PATH.push File.expand_path("lib", __dir__)
require "knife-cloud/version"
Gem::Specification.new do |s|
s.name = "knife-cloud"
s.version = Knife::Cloud::VERSION
s.authors = ["Kaustubh Deorukhkar", "Ameya Varade"]
s.email = ["dev@chef.io"]
s.homepage = "https://github.com/chef/knife-cloud"
s.summary = "knife-cloud plugin"
s.description = s.summary
s.files = %w{LICENSE} + Dir.glob("lib/**/*")
s.test_files = `git ls-files -- {test,spec,features}/*`.split("\n")
s.require_paths = %w{lib}
s.required_ruby_version = ">= 2.6"
s.add_dependency "chef", ">= 15.11"
s.add_dependency "mixlib-shellout"
s.add_dependency "excon", ">= 0.50" # excon 0.50 renamed the errors class and required updating rescues
end
## Instruction:
Add license to the gemspec
Signed-off-by: Tim Smith <764ef62106582a09ed09dfa0b6bff7c05fd7d1e4@chef.io>
## Code After:
$LOAD_PATH.push File.expand_path("lib", __dir__)
require "knife-cloud/version"
Gem::Specification.new do |s|
s.name = "knife-cloud"
s.version = Knife::Cloud::VERSION
s.authors = ["Kaustubh Deorukhkar", "Ameya Varade"]
s.email = ["dev@chef.io"]
s.license = "Apache-2.0"
s.homepage = "https://github.com/chef/knife-cloud"
s.summary = "knife-cloud plugin"
s.description = s.summary
s.files = %w{LICENSE} + Dir.glob("lib/**/*")
s.test_files = `git ls-files -- {test,spec,features}/*`.split("\n")
s.require_paths = %w{lib}
s.required_ruby_version = ">= 2.6"
s.add_dependency "chef", ">= 15.11"
s.add_dependency "mixlib-shellout"
s.add_dependency "excon", ">= 0.50" # excon 0.50 renamed the errors class and required updating rescues
end
| $LOAD_PATH.push File.expand_path("lib", __dir__)
require "knife-cloud/version"
Gem::Specification.new do |s|
s.name = "knife-cloud"
s.version = Knife::Cloud::VERSION
s.authors = ["Kaustubh Deorukhkar", "Ameya Varade"]
s.email = ["dev@chef.io"]
+ s.license = "Apache-2.0"
s.homepage = "https://github.com/chef/knife-cloud"
s.summary = "knife-cloud plugin"
s.description = s.summary
s.files = %w{LICENSE} + Dir.glob("lib/**/*")
s.test_files = `git ls-files -- {test,spec,features}/*`.split("\n")
s.require_paths = %w{lib}
s.required_ruby_version = ">= 2.6"
s.add_dependency "chef", ">= 15.11"
s.add_dependency "mixlib-shellout"
s.add_dependency "excon", ">= 0.50" # excon 0.50 renamed the errors class and required updating rescues
end | 1 | 0.047619 | 1 | 0 |
e706fbb810ba9293122873a7e41df38c0fc98680 | src/app/global/modals/keybindings/keybindings-modal.component.css | src/app/global/modals/keybindings/keybindings-modal.component.css |
.keybindings-modal {
width: 100%;
height: 100%;
display: grid;
grid-template-areas: "title close"
"message message"
"keybindings .";
grid-template-columns: 1fr 2em;
grid-template-rows: 3em 2em 1fr;
}
.keybindings-modal .title {
grid-area: title;
}
.keybindings-modal .close {
grid-area: close;
}
.keybindings-modal .message {
grid-area: message;
margin-left: 1em;
}
.keybindings-modal .keybindings {
grid-area: keybindings;
margin: 1em;
}
.keybindings-modal .keybindings .keybinding {
margin-bottom: 1em;
}
.keybindings-modal .keybindings .action {
display: inline-block;
width: 20em;
margin-right: 1em;
}
.keybindings-modal .keybindings .key {
display: inline-block;
width: 10em;
}
|
.keybindings-modal {
width: 100%;
height: 100%;
display: grid;
grid-template-areas: "title close"
"message message"
"keybindings .";
grid-template-columns: 1fr 2em;
grid-template-rows: 3em 2em 1fr;
}
.keybindings-modal .title {
grid-area: title;
}
.keybindings-modal .close {
grid-area: close;
}
.keybindings-modal .message {
grid-area: message;
margin-left: 1em;
}
.keybindings-modal .keybindings {
grid-area: keybindings;
margin: 1em;
}
.keybindings-modal .keybindings .keybinding {
margin-bottom: 1em;
border-width: 2px;
border-color: white;
border-style: solid;
}
.keybindings-modal .keybindings .action {
display: inline-block;
width: 20em;
margin-right: 1em;
}
.keybindings-modal .keybindings .key {
display: inline-block;
width: 10em;
}
| Add borders to keybindings modal | feat: Add borders to keybindings modal
| CSS | mit | nb48/chart-hero,nb48/chart-hero,nb48/chart-hero | css | ## Code Before:
.keybindings-modal {
width: 100%;
height: 100%;
display: grid;
grid-template-areas: "title close"
"message message"
"keybindings .";
grid-template-columns: 1fr 2em;
grid-template-rows: 3em 2em 1fr;
}
.keybindings-modal .title {
grid-area: title;
}
.keybindings-modal .close {
grid-area: close;
}
.keybindings-modal .message {
grid-area: message;
margin-left: 1em;
}
.keybindings-modal .keybindings {
grid-area: keybindings;
margin: 1em;
}
.keybindings-modal .keybindings .keybinding {
margin-bottom: 1em;
}
.keybindings-modal .keybindings .action {
display: inline-block;
width: 20em;
margin-right: 1em;
}
.keybindings-modal .keybindings .key {
display: inline-block;
width: 10em;
}
## Instruction:
feat: Add borders to keybindings modal
## Code After:
.keybindings-modal {
width: 100%;
height: 100%;
display: grid;
grid-template-areas: "title close"
"message message"
"keybindings .";
grid-template-columns: 1fr 2em;
grid-template-rows: 3em 2em 1fr;
}
.keybindings-modal .title {
grid-area: title;
}
.keybindings-modal .close {
grid-area: close;
}
.keybindings-modal .message {
grid-area: message;
margin-left: 1em;
}
.keybindings-modal .keybindings {
grid-area: keybindings;
margin: 1em;
}
.keybindings-modal .keybindings .keybinding {
margin-bottom: 1em;
border-width: 2px;
border-color: white;
border-style: solid;
}
.keybindings-modal .keybindings .action {
display: inline-block;
width: 20em;
margin-right: 1em;
}
.keybindings-modal .keybindings .key {
display: inline-block;
width: 10em;
}
|
.keybindings-modal {
width: 100%;
height: 100%;
display: grid;
grid-template-areas: "title close"
"message message"
"keybindings .";
grid-template-columns: 1fr 2em;
grid-template-rows: 3em 2em 1fr;
}
.keybindings-modal .title {
grid-area: title;
}
.keybindings-modal .close {
grid-area: close;
}
.keybindings-modal .message {
grid-area: message;
margin-left: 1em;
}
.keybindings-modal .keybindings {
grid-area: keybindings;
margin: 1em;
}
.keybindings-modal .keybindings .keybinding {
margin-bottom: 1em;
+ border-width: 2px;
+ border-color: white;
+ border-style: solid;
}
.keybindings-modal .keybindings .action {
display: inline-block;
width: 20em;
margin-right: 1em;
}
.keybindings-modal .keybindings .key {
display: inline-block;
width: 10em;
} | 3 | 0.068182 | 3 | 0 |
a48f01fea04dc4dad1d78b7c110beddea32b63cf | config/jobs/kubernetes/cloud-provider-gcp/cloud-provider-gcp-presubmits.yaml | config/jobs/kubernetes/cloud-provider-gcp/cloud-provider-gcp-presubmits.yaml | presubmits:
kubernetes/cloud-provider-gcp:
- name: cloud-provider-gcp-tests
max_concurrency: 5
always_run: true
decorate: true
path_alias: k8s.io/cloud-provider-gcp
annotations:
testgrid-dashboards: provider-gcp-presubmits
description: Build and unit test for kubernetes/cloud-provider-gcp.
testgrid-num-columns-recent: '30'
extra_refs:
- org: kubernetes
repo: test-infra
base_ref: master
path_alias: k8s.io/test-infra
spec:
containers:
- image: gcr.io/cloud-builders/bazel
command:
- ../test-infra/hack/bazel.sh
args:
- test
- --test_output=errors
- --
- //...
- -//vendor/...
| presubmits:
kubernetes/cloud-provider-gcp:
- name: cloud-provider-gcp-tests
always_run: true
decorate: true
path_alias: k8s.io/cloud-provider-gcp
annotations:
testgrid-dashboards: provider-gcp-presubmits
description: Build and unit test for kubernetes/cloud-provider-gcp.
testgrid-num-columns-recent: '30'
extra_refs:
- org: kubernetes
repo: test-infra
base_ref: master
path_alias: k8s.io/test-infra
spec:
containers:
- image: gcr.io/cloud-builders/bazel
command:
- ../test-infra/hack/bazel.sh
args:
- test
- --test_output=errors
- --
- //...
- -//vendor/...
- name: cloud-provider-gcp-e2e-create
decorate: true
path_alias: k8s.io/cloud-provider-gcp
labels:
preset-service-account: "true"
preset-k8s-ssh: "true"
annotations:
testgrid-dashboards: provider-gcp-presubmits
description: End to end cluster creation test using cluster/kube-up based on kubernetes/cloud-provider-gcp.
testgrid-num-columns-recent: '30'
spec:
containers:
- image: gcr.io/k8s-testimages/kubekins-e2e:v20200420-e830a3a-master
command:
- runner.sh
args:
- test/e2e.sh
| Add optional e2e create job for k/cp-gcp. | Add optional e2e create job for k/cp-gcp.
| YAML | apache-2.0 | ixdy/kubernetes-test-infra,dims/test-infra,monopole/test-infra,jessfraz/test-infra,kubernetes/test-infra,monopole/test-infra,dims/test-infra,cjwagner/test-infra,BenTheElder/test-infra,brahmaroutu/test-infra,monopole/test-infra,kubernetes/test-infra,cjwagner/test-infra,michelle192837/test-infra,pwittrock/test-infra,cjwagner/test-infra,fejta/test-infra,brahmaroutu/test-infra,ixdy/kubernetes-test-infra,monopole/test-infra,dims/test-infra,kubernetes/test-infra,fejta/test-infra,dims/test-infra,jessfraz/test-infra,cblecker/test-infra,cblecker/test-infra,BenTheElder/test-infra,dims/test-infra,pwittrock/test-infra,ixdy/kubernetes-test-infra,fejta/test-infra,cblecker/test-infra,kubernetes/test-infra,fejta/test-infra,BenTheElder/test-infra,ixdy/kubernetes-test-infra,pwittrock/test-infra,fejta/test-infra,cblecker/test-infra,cblecker/test-infra,kubernetes/test-infra,BenTheElder/test-infra,monopole/test-infra,dims/test-infra,cblecker/test-infra,ixdy/kubernetes-test-infra,cjwagner/test-infra,jessfraz/test-infra,jessfraz/test-infra,brahmaroutu/test-infra,BenTheElder/test-infra,jessfraz/test-infra,brahmaroutu/test-infra,pwittrock/test-infra,michelle192837/test-infra,jessfraz/test-infra,cjwagner/test-infra,cjwagner/test-infra,michelle192837/test-infra,pwittrock/test-infra,brahmaroutu/test-infra,michelle192837/test-infra,fejta/test-infra,brahmaroutu/test-infra,michelle192837/test-infra,BenTheElder/test-infra,monopole/test-infra,kubernetes/test-infra,michelle192837/test-infra | yaml | ## Code Before:
presubmits:
kubernetes/cloud-provider-gcp:
- name: cloud-provider-gcp-tests
max_concurrency: 5
always_run: true
decorate: true
path_alias: k8s.io/cloud-provider-gcp
annotations:
testgrid-dashboards: provider-gcp-presubmits
description: Build and unit test for kubernetes/cloud-provider-gcp.
testgrid-num-columns-recent: '30'
extra_refs:
- org: kubernetes
repo: test-infra
base_ref: master
path_alias: k8s.io/test-infra
spec:
containers:
- image: gcr.io/cloud-builders/bazel
command:
- ../test-infra/hack/bazel.sh
args:
- test
- --test_output=errors
- --
- //...
- -//vendor/...
## Instruction:
Add optional e2e create job for k/cp-gcp.
## Code After:
presubmits:
kubernetes/cloud-provider-gcp:
- name: cloud-provider-gcp-tests
always_run: true
decorate: true
path_alias: k8s.io/cloud-provider-gcp
annotations:
testgrid-dashboards: provider-gcp-presubmits
description: Build and unit test for kubernetes/cloud-provider-gcp.
testgrid-num-columns-recent: '30'
extra_refs:
- org: kubernetes
repo: test-infra
base_ref: master
path_alias: k8s.io/test-infra
spec:
containers:
- image: gcr.io/cloud-builders/bazel
command:
- ../test-infra/hack/bazel.sh
args:
- test
- --test_output=errors
- --
- //...
- -//vendor/...
- name: cloud-provider-gcp-e2e-create
decorate: true
path_alias: k8s.io/cloud-provider-gcp
labels:
preset-service-account: "true"
preset-k8s-ssh: "true"
annotations:
testgrid-dashboards: provider-gcp-presubmits
description: End to end cluster creation test using cluster/kube-up based on kubernetes/cloud-provider-gcp.
testgrid-num-columns-recent: '30'
spec:
containers:
- image: gcr.io/k8s-testimages/kubekins-e2e:v20200420-e830a3a-master
command:
- runner.sh
args:
- test/e2e.sh
| presubmits:
kubernetes/cloud-provider-gcp:
- name: cloud-provider-gcp-tests
- max_concurrency: 5
always_run: true
decorate: true
path_alias: k8s.io/cloud-provider-gcp
annotations:
testgrid-dashboards: provider-gcp-presubmits
description: Build and unit test for kubernetes/cloud-provider-gcp.
testgrid-num-columns-recent: '30'
extra_refs:
- org: kubernetes
repo: test-infra
base_ref: master
path_alias: k8s.io/test-infra
spec:
containers:
- image: gcr.io/cloud-builders/bazel
command:
- ../test-infra/hack/bazel.sh
args:
- test
- --test_output=errors
- --
- //...
- -//vendor/...
+ - name: cloud-provider-gcp-e2e-create
+ decorate: true
+ path_alias: k8s.io/cloud-provider-gcp
+ labels:
+ preset-service-account: "true"
+ preset-k8s-ssh: "true"
+ annotations:
+ testgrid-dashboards: provider-gcp-presubmits
+ description: End to end cluster creation test using cluster/kube-up based on kubernetes/cloud-provider-gcp.
+ testgrid-num-columns-recent: '30'
+ spec:
+ containers:
+ - image: gcr.io/k8s-testimages/kubekins-e2e:v20200420-e830a3a-master
+ command:
+ - runner.sh
+ args:
+ - test/e2e.sh | 18 | 0.666667 | 17 | 1 |
696a9a471abd0036881bec42d7ab9c2b8769c356 | lib/r_fast_r_furious.rb | lib/r_fast_r_furious.rb | require "r_fast_r_furious/version"
require "v8"
require "net/http"
require "uri"
module RFastRFurious
SOURCE = "https://raw.github.com/alunny/r_fast_r_furious/master/fast.js"
def check(string, url = SOURCE)
uri = URI.parse(SOURCE)
content = Net::HTTP.get(uri)
cxt = V8::Context.new
cxt.eval(content, "fast.js")
cxt["r_fast_r_furious"].call(string)
end
module_function :check
end
| require "r_fast_r_furious/version"
require "v8"
require "net/http"
require "uri"
module RFastRFurious
SOURCE = "https://raw.github.com/alunny/r_fast_r_furious/master/fast.js"
def check(string, url = SOURCE)
uri = URI.parse(SOURCE)
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_NONE
request = Net::HTTP::Get.new(uri.request_uri)
response = http.request(request)
content = response.body
cxt = V8::Context.new
cxt.eval(content, "fast.js")
cxt["r_fast_r_furious"].call(string)
end
module_function :check
end
| Make specs pass on Ruby 1.9 | Make specs pass on Ruby 1.9
| Ruby | unlicense | halorgium/r_fast_r_furious,halorgium/r_fast_r_furious | ruby | ## Code Before:
require "r_fast_r_furious/version"
require "v8"
require "net/http"
require "uri"
module RFastRFurious
SOURCE = "https://raw.github.com/alunny/r_fast_r_furious/master/fast.js"
def check(string, url = SOURCE)
uri = URI.parse(SOURCE)
content = Net::HTTP.get(uri)
cxt = V8::Context.new
cxt.eval(content, "fast.js")
cxt["r_fast_r_furious"].call(string)
end
module_function :check
end
## Instruction:
Make specs pass on Ruby 1.9
## Code After:
require "r_fast_r_furious/version"
require "v8"
require "net/http"
require "uri"
module RFastRFurious
SOURCE = "https://raw.github.com/alunny/r_fast_r_furious/master/fast.js"
def check(string, url = SOURCE)
uri = URI.parse(SOURCE)
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_NONE
request = Net::HTTP::Get.new(uri.request_uri)
response = http.request(request)
content = response.body
cxt = V8::Context.new
cxt.eval(content, "fast.js")
cxt["r_fast_r_furious"].call(string)
end
module_function :check
end
| require "r_fast_r_furious/version"
require "v8"
require "net/http"
require "uri"
module RFastRFurious
SOURCE = "https://raw.github.com/alunny/r_fast_r_furious/master/fast.js"
def check(string, url = SOURCE)
uri = URI.parse(SOURCE)
- content = Net::HTTP.get(uri)
+ http = Net::HTTP.new(uri.host, uri.port)
+ http.use_ssl = true
+ http.verify_mode = OpenSSL::SSL::VERIFY_NONE
+ request = Net::HTTP::Get.new(uri.request_uri)
+ response = http.request(request)
+ content = response.body
cxt = V8::Context.new
cxt.eval(content, "fast.js")
cxt["r_fast_r_furious"].call(string)
end
module_function :check
end | 7 | 0.4375 | 6 | 1 |
ab034cd60acfeb3ac28d03aca42e976ab33e2905 | .travis.yml | .travis.yml | laugnage: python
sudo: false
addons:
apt:
packages:
- python-pip
- python-matplotlib
- python-tk
script:
- ./build_doc.sh
| laugnage: python
sudo: false
addons:
apt:
packages:
- python-pip
- python-matplotlib
- python-tk
script:
- export PATH=$PATH:$PWD/bin
- ./build_doc.sh
| Add bin to PATH in testing | Add bin to PATH in testing
| YAML | bsd-2-clause | garaemon/pltcli,garaemon/pltcli | yaml | ## Code Before:
laugnage: python
sudo: false
addons:
apt:
packages:
- python-pip
- python-matplotlib
- python-tk
script:
- ./build_doc.sh
## Instruction:
Add bin to PATH in testing
## Code After:
laugnage: python
sudo: false
addons:
apt:
packages:
- python-pip
- python-matplotlib
- python-tk
script:
- export PATH=$PATH:$PWD/bin
- ./build_doc.sh
| laugnage: python
sudo: false
addons:
apt:
packages:
- python-pip
- python-matplotlib
- python-tk
script:
+ - export PATH=$PATH:$PWD/bin
- ./build_doc.sh | 1 | 0.1 | 1 | 0 |
4c1a49c3b8f13aa72987d984f0688a9e2d6b9cf0 | test/unit/protocols/local.js | test/unit/protocols/local.js | 'use strict';
describe('Unit :: Protocols.Local', function () {
var Local = require('../../../src/protocols/list/Local.js');
it('constructor()', function() {
var local = new Local();
assert(typeof local == 'object', 'No found protocol.local : '+local);
assert(typeof local.getName == 'function', 'protocol.local.getName function error : '+local.getName);
});
}); | 'use strict';
describe('Unit :: Protocols.Local', function () {
var Local = require('../../../src/protocols/list/Local.js');
var local = new Local();
it('Test methods', function() {
assert(typeof local == 'object', 'No found protocol.local : '+local);
assert(typeof local.getName == 'function', 'protocol.local.getName function error : '+local.getName);
assert(typeof local._accountIsLocked == 'function', 'protocol.local._accountIsLocked function error : '+local._accountIsLocked);
});
}); | Add simple test for travis | Add simple test for travis
| JavaScript | mit | Dhumez-Sebastien/trap.js,Dhumez-Sebastien/trap.js | javascript | ## Code Before:
'use strict';
describe('Unit :: Protocols.Local', function () {
var Local = require('../../../src/protocols/list/Local.js');
it('constructor()', function() {
var local = new Local();
assert(typeof local == 'object', 'No found protocol.local : '+local);
assert(typeof local.getName == 'function', 'protocol.local.getName function error : '+local.getName);
});
});
## Instruction:
Add simple test for travis
## Code After:
'use strict';
describe('Unit :: Protocols.Local', function () {
var Local = require('../../../src/protocols/list/Local.js');
var local = new Local();
it('Test methods', function() {
assert(typeof local == 'object', 'No found protocol.local : '+local);
assert(typeof local.getName == 'function', 'protocol.local.getName function error : '+local.getName);
assert(typeof local._accountIsLocked == 'function', 'protocol.local._accountIsLocked function error : '+local._accountIsLocked);
});
}); | 'use strict';
describe('Unit :: Protocols.Local', function () {
var Local = require('../../../src/protocols/list/Local.js');
+ var local = new Local();
- it('constructor()', function() {
? ^^^ ^^^ ^^^
+ it('Test methods', function() {
? ^^ ^^^ + ^^
-
- var local = new Local();
-
assert(typeof local == 'object', 'No found protocol.local : '+local);
assert(typeof local.getName == 'function', 'protocol.local.getName function error : '+local.getName);
+ assert(typeof local._accountIsLocked == 'function', 'protocol.local._accountIsLocked function error : '+local._accountIsLocked);
});
}); | 7 | 0.5 | 3 | 4 |
b53df6cfbb199a01c76f09750e60266409610d41 | src/Views/Json.php | src/Views/Json.php | <?php
namespace Bolt\Views;
use \Bolt\Base;
use \Bolt\Interfaces\View;
use \Bolt\Arrays;
class Json extends Base implements View
{
public function render($content)
{
header("Content-Type: application/json; charset=UTF-8");
if ($content !== null)
{
echo($this->handleObject($content));
}
return true;
}
private function handleObject($content)
{
$type = Arrays::type($content);
if ($type == "numeric")
{
$results = array();
foreach ($content as $tmp)
{
if (Arrays::type($tmp) !== false || get_class($tmp) == "stdClass") // error here with arrays of string?
{
$results = $content;
}
else
{
$results[] = json_decode($tmp->toJson());
}
}
return json_encode($results);
}
elseif ($type != "assoc" && (is_object($content) && get_class($content) != "stdClass"))
{
return $content->toJson();
}
else
{
return json_encode($content);
}
}
}
?>
| <?php
namespace Bolt\Views;
use \Bolt\Base;
use \Bolt\Interfaces\View;
use \Bolt\Arrays;
class Json extends Base implements View
{
public function render($content)
{
header("Content-Type: application/json; charset=UTF-8");
if ($content !== null)
{
echo($this->handleObject($content));
}
return true;
}
private function handleObject($content)
{
$type = Arrays::type($content);
if ($type == "numeric")
{
$results = array();
foreach ($content as $tmp)
{
if (is_scalar($tmp) || Arrays::type($tmp) !== false || get_class($tmp) == "stdClass")
{
$results = $content;
}
else
{
$results[] = json_decode($tmp->toJson());
}
}
return json_encode($results);
}
elseif ($type != "assoc" && (is_object($content) && get_class($content) != "stdClass"))
{
return $content->toJson();
}
else
{
return json_encode($content);
}
}
}
?>
| Fix issue when returning arrays of scalar values to the JSON View | Fix issue when returning arrays of scalar values to the JSON View
| PHP | apache-2.0 | irwtdvoys/bolt-mvc | php | ## Code Before:
<?php
namespace Bolt\Views;
use \Bolt\Base;
use \Bolt\Interfaces\View;
use \Bolt\Arrays;
class Json extends Base implements View
{
public function render($content)
{
header("Content-Type: application/json; charset=UTF-8");
if ($content !== null)
{
echo($this->handleObject($content));
}
return true;
}
private function handleObject($content)
{
$type = Arrays::type($content);
if ($type == "numeric")
{
$results = array();
foreach ($content as $tmp)
{
if (Arrays::type($tmp) !== false || get_class($tmp) == "stdClass") // error here with arrays of string?
{
$results = $content;
}
else
{
$results[] = json_decode($tmp->toJson());
}
}
return json_encode($results);
}
elseif ($type != "assoc" && (is_object($content) && get_class($content) != "stdClass"))
{
return $content->toJson();
}
else
{
return json_encode($content);
}
}
}
?>
## Instruction:
Fix issue when returning arrays of scalar values to the JSON View
## Code After:
<?php
namespace Bolt\Views;
use \Bolt\Base;
use \Bolt\Interfaces\View;
use \Bolt\Arrays;
class Json extends Base implements View
{
public function render($content)
{
header("Content-Type: application/json; charset=UTF-8");
if ($content !== null)
{
echo($this->handleObject($content));
}
return true;
}
private function handleObject($content)
{
$type = Arrays::type($content);
if ($type == "numeric")
{
$results = array();
foreach ($content as $tmp)
{
if (is_scalar($tmp) || Arrays::type($tmp) !== false || get_class($tmp) == "stdClass")
{
$results = $content;
}
else
{
$results[] = json_decode($tmp->toJson());
}
}
return json_encode($results);
}
elseif ($type != "assoc" && (is_object($content) && get_class($content) != "stdClass"))
{
return $content->toJson();
}
else
{
return json_encode($content);
}
}
}
?>
| <?php
namespace Bolt\Views;
use \Bolt\Base;
use \Bolt\Interfaces\View;
use \Bolt\Arrays;
class Json extends Base implements View
{
public function render($content)
{
header("Content-Type: application/json; charset=UTF-8");
if ($content !== null)
{
echo($this->handleObject($content));
}
return true;
}
private function handleObject($content)
{
$type = Arrays::type($content);
if ($type == "numeric")
{
$results = array();
foreach ($content as $tmp)
{
- if (Arrays::type($tmp) !== false || get_class($tmp) == "stdClass") // error here with arrays of string?
+ if (is_scalar($tmp) || Arrays::type($tmp) !== false || get_class($tmp) == "stdClass")
{
$results = $content;
}
else
{
$results[] = json_decode($tmp->toJson());
}
}
return json_encode($results);
}
elseif ($type != "assoc" && (is_object($content) && get_class($content) != "stdClass"))
{
return $content->toJson();
}
else
{
return json_encode($content);
}
}
}
?> | 2 | 0.037037 | 1 | 1 |
98592787520a8f97079406d61f3a8176e1e08a77 | lib/uri/query_params/query_params.rb | lib/uri/query_params/query_params.rb | require 'uri'
module URI
module QueryParams
#
# Parses a URI query string.
#
# @param [String] query_string
# The URI query string.
#
# @return [Hash{String => String}]
# The parsed query parameters.
#
def QueryParams.parse(query_string)
query_params = {}
if query_string
query_string.split('&').each do |param|
name, value = param.split('=',2)
if value
query_params[name] = URI.decode(value)
else
query_params[name] = ''
end
end
end
return query_params
end
#
# Dumps the URI query params.
#
# @param [Hash{String => String}] query_params
# The query params.
#
# @return [String]
# The dumped URI query string.
#
# @since 0.5.0
#
def QueryParams.dump(query_params)
query = []
query_params.each do |name,value|
param = if value == true
"#{name}=active"
elsif value
if value.kind_of?(Array)
"#{name}=#{CGI.escape(value.join(' '))}"
else
"#{name}=#{CGI.escape(value.to_s)}"
end
else
"#{name}="
end
query << param
end
return query.join('&')
end
end
end
| require 'uri'
module URI
module QueryParams
#
# Parses a URI query string.
#
# @param [String] query_string
# The URI query string.
#
# @return [Hash{String => String}]
# The parsed query parameters.
#
def QueryParams.parse(query_string)
query_params = {}
if query_string
query_string.split('&').each do |param|
name, value = param.split('=',2)
if value
query_params[name] = URI.decode(value)
else
query_params[name] = ''
end
end
end
return query_params
end
#
# Dumps the URI query params.
#
# @param [Hash{String => String}] query_params
# The query params.
#
# @return [String]
# The dumped URI query string.
#
# @since 0.5.0
#
def QueryParams.dump(query_params)
query = []
query_params.each do |name,value|
param = case value
when Array
"#{name}=#{CGI.escape(value.join(' '))}"
when true
"#{name}=active"
when false, nil
"#{name}="
else
"#{name}=#{CGI.escape(value.to_s)}"
end
query << param
end
return query.join('&')
end
end
end
| Use a case/when statement in QueryParams.dump. | Use a case/when statement in QueryParams.dump.
| Ruby | mit | postmodern/uri-query_params | ruby | ## Code Before:
require 'uri'
module URI
module QueryParams
#
# Parses a URI query string.
#
# @param [String] query_string
# The URI query string.
#
# @return [Hash{String => String}]
# The parsed query parameters.
#
def QueryParams.parse(query_string)
query_params = {}
if query_string
query_string.split('&').each do |param|
name, value = param.split('=',2)
if value
query_params[name] = URI.decode(value)
else
query_params[name] = ''
end
end
end
return query_params
end
#
# Dumps the URI query params.
#
# @param [Hash{String => String}] query_params
# The query params.
#
# @return [String]
# The dumped URI query string.
#
# @since 0.5.0
#
def QueryParams.dump(query_params)
query = []
query_params.each do |name,value|
param = if value == true
"#{name}=active"
elsif value
if value.kind_of?(Array)
"#{name}=#{CGI.escape(value.join(' '))}"
else
"#{name}=#{CGI.escape(value.to_s)}"
end
else
"#{name}="
end
query << param
end
return query.join('&')
end
end
end
## Instruction:
Use a case/when statement in QueryParams.dump.
## Code After:
require 'uri'
module URI
module QueryParams
#
# Parses a URI query string.
#
# @param [String] query_string
# The URI query string.
#
# @return [Hash{String => String}]
# The parsed query parameters.
#
def QueryParams.parse(query_string)
query_params = {}
if query_string
query_string.split('&').each do |param|
name, value = param.split('=',2)
if value
query_params[name] = URI.decode(value)
else
query_params[name] = ''
end
end
end
return query_params
end
#
# Dumps the URI query params.
#
# @param [Hash{String => String}] query_params
# The query params.
#
# @return [String]
# The dumped URI query string.
#
# @since 0.5.0
#
def QueryParams.dump(query_params)
query = []
query_params.each do |name,value|
param = case value
when Array
"#{name}=#{CGI.escape(value.join(' '))}"
when true
"#{name}=active"
when false, nil
"#{name}="
else
"#{name}=#{CGI.escape(value.to_s)}"
end
query << param
end
return query.join('&')
end
end
end
| require 'uri'
module URI
module QueryParams
#
# Parses a URI query string.
#
# @param [String] query_string
# The URI query string.
#
# @return [Hash{String => String}]
# The parsed query parameters.
#
def QueryParams.parse(query_string)
query_params = {}
if query_string
query_string.split('&').each do |param|
name, value = param.split('=',2)
if value
query_params[name] = URI.decode(value)
else
query_params[name] = ''
end
end
end
return query_params
end
#
# Dumps the URI query params.
#
# @param [Hash{String => String}] query_params
# The query params.
#
# @return [String]
# The dumped URI query string.
#
# @since 0.5.0
#
def QueryParams.dump(query_params)
query = []
query_params.each do |name,value|
- param = if value == true
? ^^ --------
+ param = case value
? ^^^^
+ when Array
+ "#{name}=#{CGI.escape(value.join(' '))}"
+ when true
"#{name}=active"
- elsif value
- if value.kind_of?(Array)
- "#{name}=#{CGI.escape(value.join(' '))}"
- else
? ^^
+ when false, nil
? ++++ ^^ +++++
- "#{name}=#{CGI.escape(value.to_s)}"
- end
? ^^
+ "#{name}="
? ++++++ ^^^
else
- "#{name}="
+ "#{name}=#{CGI.escape(value.to_s)}"
end
query << param
end
return query.join('&')
end
end
end | 15 | 0.227273 | 7 | 8 |
24555462f8370ef6c10e9a9ab07542f64eb9ca82 | README.md | README.md | Shape Shapes
============
These are the files that make up [shapeshapes.com](http://shapeshapes.com)!
Development
===========
```
npm install
bundle
bower update
gulp vendor
bundle exec jekyll serve
```
| Shape Shapes
============
These are the files that make up [shapeshapes.com](http://shapeshapes.com)!
Development
===========
```
npm install
bundle
bower update
gulp vendor
bundle exec jekyll server
```
To view draft exhibitions, run `jekyll server` with the `--drafts` flag:
```
bundle exec jekyll server --drafts
```
| Add description of drafts flag | Add description of drafts flag
| Markdown | mit | khoonster/shapeshapes,khoonster/shapeshapes,khoonster/shapeshapes | markdown | ## Code Before:
Shape Shapes
============
These are the files that make up [shapeshapes.com](http://shapeshapes.com)!
Development
===========
```
npm install
bundle
bower update
gulp vendor
bundle exec jekyll serve
```
## Instruction:
Add description of drafts flag
## Code After:
Shape Shapes
============
These are the files that make up [shapeshapes.com](http://shapeshapes.com)!
Development
===========
```
npm install
bundle
bower update
gulp vendor
bundle exec jekyll server
```
To view draft exhibitions, run `jekyll server` with the `--drafts` flag:
```
bundle exec jekyll server --drafts
```
| Shape Shapes
============
These are the files that make up [shapeshapes.com](http://shapeshapes.com)!
Development
===========
```
npm install
bundle
bower update
gulp vendor
- bundle exec jekyll serve
+ bundle exec jekyll server
? +
```
+
+ To view draft exhibitions, run `jekyll server` with the `--drafts` flag:
+
+ ```
+ bundle exec jekyll server --drafts
+ ``` | 8 | 0.533333 | 7 | 1 |
cda0e2b7ecbe8a2c4ff937fb4dceccfee684f216 | pkgs/development/python-modules/azure-mgmt-nspkg/default.nix | pkgs/development/python-modules/azure-mgmt-nspkg/default.nix | { pkgs
, buildPythonPackage
, fetchPypi
, azure-nspkg
}:
buildPythonPackage rec {
version = "3.0.2";
pname = "azure-mgmt-nspkg";
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "8b2287f671529505b296005e6de9150b074344c2c7d1c805b3f053d081d58c52";
};
propagatedBuildInputs = [ azure-nspkg ];
meta = with pkgs.lib; {
description = "Microsoft Azure SDK for Python";
homepage = "https://azure.microsoft.com/en-us/develop/python/";
license = licenses.asl20;
maintainers = with maintainers; [ olcai ];
};
}
| { pkgs
, buildPythonPackage
, fetchPypi
, azure-nspkg
}:
buildPythonPackage rec {
version = "1.0.0";
pname = "azure-mgmt-nspkg";
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "1rq92fj3kvnqkk18596dybw0kvhgscvc6cd8hp1dhy3wrkqnhwmq";
};
propagatedBuildInputs = [ azure-nspkg ];
meta = with pkgs.lib; {
description = "Microsoft Azure SDK for Python";
homepage = "https://azure.microsoft.com/en-us/develop/python/";
license = licenses.asl20;
maintainers = with maintainers; [ olcai ];
};
}
| Revert "python: azure-mgmt-nspkg: 1.0.0 -> 3.0.2" | Revert "python: azure-mgmt-nspkg: 1.0.0 -> 3.0.2"
This reverts commit 8d58469a61b7df300e77818a2e1dab7bb4902ba5.
https://github.com/NixOS/nixpkgs/issues/52547
| Nix | mit | NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,NixOS/nixpkgs,SymbiFlow/nixpkgs,NixOS/nixpkgs | nix | ## Code Before:
{ pkgs
, buildPythonPackage
, fetchPypi
, azure-nspkg
}:
buildPythonPackage rec {
version = "3.0.2";
pname = "azure-mgmt-nspkg";
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "8b2287f671529505b296005e6de9150b074344c2c7d1c805b3f053d081d58c52";
};
propagatedBuildInputs = [ azure-nspkg ];
meta = with pkgs.lib; {
description = "Microsoft Azure SDK for Python";
homepage = "https://azure.microsoft.com/en-us/develop/python/";
license = licenses.asl20;
maintainers = with maintainers; [ olcai ];
};
}
## Instruction:
Revert "python: azure-mgmt-nspkg: 1.0.0 -> 3.0.2"
This reverts commit 8d58469a61b7df300e77818a2e1dab7bb4902ba5.
https://github.com/NixOS/nixpkgs/issues/52547
## Code After:
{ pkgs
, buildPythonPackage
, fetchPypi
, azure-nspkg
}:
buildPythonPackage rec {
version = "1.0.0";
pname = "azure-mgmt-nspkg";
src = fetchPypi {
inherit pname version;
extension = "zip";
sha256 = "1rq92fj3kvnqkk18596dybw0kvhgscvc6cd8hp1dhy3wrkqnhwmq";
};
propagatedBuildInputs = [ azure-nspkg ];
meta = with pkgs.lib; {
description = "Microsoft Azure SDK for Python";
homepage = "https://azure.microsoft.com/en-us/develop/python/";
license = licenses.asl20;
maintainers = with maintainers; [ olcai ];
};
}
| { pkgs
, buildPythonPackage
, fetchPypi
, azure-nspkg
}:
buildPythonPackage rec {
- version = "3.0.2";
? ^ ^
+ version = "1.0.0";
? ^ ^
pname = "azure-mgmt-nspkg";
src = fetchPypi {
inherit pname version;
extension = "zip";
- sha256 = "8b2287f671529505b296005e6de9150b074344c2c7d1c805b3f053d081d58c52";
+ sha256 = "1rq92fj3kvnqkk18596dybw0kvhgscvc6cd8hp1dhy3wrkqnhwmq";
};
propagatedBuildInputs = [ azure-nspkg ];
meta = with pkgs.lib; {
description = "Microsoft Azure SDK for Python";
homepage = "https://azure.microsoft.com/en-us/develop/python/";
license = licenses.asl20;
maintainers = with maintainers; [ olcai ];
};
} | 4 | 0.16 | 2 | 2 |
7c7a490c33d2f95e8d70b2204233762495d8c5c1 | spec/spec_helper.rb | spec/spec_helper.rb | require 'perpetuity'
Perpetuity.data_source :mongodb, 'perpetuity_gem_test'
| require 'perpetuity'
if ENV['PERPETUITY_ADAPTER'] == 'postgres'
Perpetuity.data_source :postgres, 'perpetuity_gem_test', user: ENV['USER'], password: nil
else
Perpetuity.data_source :mongodb, 'perpetuity_gem_test'
end
| Allow specs to be run with Postgres adapter | Allow specs to be run with Postgres adapter
| Ruby | mit | jgaskins/perpetuity | ruby | ## Code Before:
require 'perpetuity'
Perpetuity.data_source :mongodb, 'perpetuity_gem_test'
## Instruction:
Allow specs to be run with Postgres adapter
## Code After:
require 'perpetuity'
if ENV['PERPETUITY_ADAPTER'] == 'postgres'
Perpetuity.data_source :postgres, 'perpetuity_gem_test', user: ENV['USER'], password: nil
else
Perpetuity.data_source :mongodb, 'perpetuity_gem_test'
end
| require 'perpetuity'
+ if ENV['PERPETUITY_ADAPTER'] == 'postgres'
+ Perpetuity.data_source :postgres, 'perpetuity_gem_test', user: ENV['USER'], password: nil
+ else
- Perpetuity.data_source :mongodb, 'perpetuity_gem_test'
+ Perpetuity.data_source :mongodb, 'perpetuity_gem_test'
? ++
+ end | 6 | 2 | 5 | 1 |
5647a87df12726c2b04bf5a09a89471f91a6c1b8 | cmd/atlas/test.go | cmd/atlas/test.go | /*
This package is just a collection of test cases
*/
package main
import (
"fmt"
"os"
"ripe-atlas"
)
func main() {
p, err := atlas.GetProbe(14037)
if err != nil {
fmt.Printf("err: %v", err)
os.Exit(1)
}
fmt.Printf("p: %#v\n", p)
q, err := atlas.GetProbes()
if err != nil {
fmt.Printf("err: %v", err)
os.Exit(1)
}
fmt.Printf("q: %#v\n", q)
}
| /*
This package is just a collection of test cases
*/
package main
import (
"fmt"
"github.com/codegangsta/cli"
"os"
"ripe-atlas"
"strconv"
)
// set args for examples sake
func main() {
app := cli.NewApp()
app.Name = "atlas"
app.Commands = []cli.Command{
{
Name: "probes",
Aliases: []string{"p"},
Usage: "use it to see a description",
Description: "This is how we describe hello the function",
Subcommands: []cli.Command{
{
Name: "list",
Aliases: []string{"ls"},
Usage: "lists all probes",
Description: "greets someone in english",
Action: func(c *cli.Context) error {
q, err := atlas.GetProbes()
if err != nil {
fmt.Printf("err: %v", err)
os.Exit(1)
}
fmt.Printf("q: %#v\n", q)
return nil
},
},
{
Name: "info",
Usage: "info for one probe",
Description: "gives info for one probe",
Flags: []cli.Flag{
cli.IntFlag{
Name: "id",
Value: 0,
Usage: "id of the probe",
},
},
Action: func(c *cli.Context) error {
args := c.Args()
id, _ := strconv.ParseInt(args[0], 10, 32)
p, err := atlas.GetProbe(int(id))
if err != nil {
fmt.Printf("err: %v", err)
os.Exit(1)
}
fmt.Printf("p: %#v\n", p)
return nil
},
},
},
},
}
app.Run(os.Args)
}
| Move to cli to manage flags/cmd/subcmd. | Move to cli to manage flags/cmd/subcmd.
| Go | mit | keltia/ripe-atlas | go | ## Code Before:
/*
This package is just a collection of test cases
*/
package main
import (
"fmt"
"os"
"ripe-atlas"
)
func main() {
p, err := atlas.GetProbe(14037)
if err != nil {
fmt.Printf("err: %v", err)
os.Exit(1)
}
fmt.Printf("p: %#v\n", p)
q, err := atlas.GetProbes()
if err != nil {
fmt.Printf("err: %v", err)
os.Exit(1)
}
fmt.Printf("q: %#v\n", q)
}
## Instruction:
Move to cli to manage flags/cmd/subcmd.
## Code After:
/*
This package is just a collection of test cases
*/
package main
import (
"fmt"
"github.com/codegangsta/cli"
"os"
"ripe-atlas"
"strconv"
)
// set args for examples sake
func main() {
app := cli.NewApp()
app.Name = "atlas"
app.Commands = []cli.Command{
{
Name: "probes",
Aliases: []string{"p"},
Usage: "use it to see a description",
Description: "This is how we describe hello the function",
Subcommands: []cli.Command{
{
Name: "list",
Aliases: []string{"ls"},
Usage: "lists all probes",
Description: "greets someone in english",
Action: func(c *cli.Context) error {
q, err := atlas.GetProbes()
if err != nil {
fmt.Printf("err: %v", err)
os.Exit(1)
}
fmt.Printf("q: %#v\n", q)
return nil
},
},
{
Name: "info",
Usage: "info for one probe",
Description: "gives info for one probe",
Flags: []cli.Flag{
cli.IntFlag{
Name: "id",
Value: 0,
Usage: "id of the probe",
},
},
Action: func(c *cli.Context) error {
args := c.Args()
id, _ := strconv.ParseInt(args[0], 10, 32)
p, err := atlas.GetProbe(int(id))
if err != nil {
fmt.Printf("err: %v", err)
os.Exit(1)
}
fmt.Printf("p: %#v\n", p)
return nil
},
},
},
},
}
app.Run(os.Args)
}
| /*
This package is just a collection of test cases
- */
? -
+ */
package main
import (
- "fmt"
- "os"
+ "fmt"
+ "github.com/codegangsta/cli"
+ "os"
- "ripe-atlas"
? ^^^^
+ "ripe-atlas"
? ^
+ "strconv"
)
+ // set args for examples sake
+
func main() {
+ app := cli.NewApp()
+ app.Name = "atlas"
+ app.Commands = []cli.Command{
+ {
+ Name: "probes",
+ Aliases: []string{"p"},
+ Usage: "use it to see a description",
+ Description: "This is how we describe hello the function",
+ Subcommands: []cli.Command{
+ {
+ Name: "list",
+ Aliases: []string{"ls"},
+ Usage: "lists all probes",
+ Description: "greets someone in english",
+ Action: func(c *cli.Context) error {
- p, err := atlas.GetProbe(14037)
? ^ -----
+ q, err := atlas.GetProbes()
? ^^^^^^ +
- if err != nil {
+ if err != nil {
? +++++
- fmt.Printf("err: %v", err)
+ fmt.Printf("err: %v", err)
? +++++
- os.Exit(1)
+ os.Exit(1)
? +++++
+ }
+ fmt.Printf("q: %#v\n", q)
+
+ return nil
+ },
+ },
+ {
+ Name: "info",
+ Usage: "info for one probe",
+ Description: "gives info for one probe",
+ Flags: []cli.Flag{
+ cli.IntFlag{
+ Name: "id",
+ Value: 0,
+ Usage: "id of the probe",
+ },
+ },
+ Action: func(c *cli.Context) error {
+ args := c.Args()
+ id, _ := strconv.ParseInt(args[0], 10, 32)
+
+ p, err := atlas.GetProbe(int(id))
+ if err != nil {
+ fmt.Printf("err: %v", err)
+ os.Exit(1)
+ }
+ fmt.Printf("p: %#v\n", p)
+
+ return nil
+ },
+ },
+ },
+ },
}
+ app.Run(os.Args)
- fmt.Printf("p: %#v\n", p)
-
- q, err := atlas.GetProbes()
- if err != nil {
- fmt.Printf("err: %v", err)
- os.Exit(1)
- }
- fmt.Printf("q: %#v\n", q)
} | 77 | 2.851852 | 61 | 16 |
778865190ab5fecbbc8ac44f126df2792baae39a | lib/grabble/vertex.rb | lib/grabble/vertex.rb | module Grabble
class Vertex
def initialize(data)
@data = data
end
def data
@data
end
end
end
| module Grabble
class Vertex
attr_reader :data
def initialize(data)
@data = data
end
end
end
| Use attr_reader instead of manual getter method | Use attr_reader instead of manual getter method
| Ruby | mit | matt-clement/grabble | ruby | ## Code Before:
module Grabble
class Vertex
def initialize(data)
@data = data
end
def data
@data
end
end
end
## Instruction:
Use attr_reader instead of manual getter method
## Code After:
module Grabble
class Vertex
attr_reader :data
def initialize(data)
@data = data
end
end
end
| module Grabble
class Vertex
+ attr_reader :data
+
def initialize(data)
@data = data
end
-
- def data
- @data
- end
-
end
end | 7 | 0.583333 | 2 | 5 |
82f5350df468e55979087f67d317f237dc306cfc | .codeclimate.yml | .codeclimate.yml | engines:
duplication:
enabled: true
config:
languages:
- javascript
- php
markdownlint:
enabled: true
checks:
MD009:
enabled: false
MD013:
enabled: false
MD024:
enabled: false
phpcodesniffer:
enabled: true
config:
file_extensions: "php"
phpmd:
enabled: true
config:
file_extensions: "php"
ratings:
paths:
- "**.js"
- "**.md"
- "**.php"
| engines:
duplication:
enabled: true
config:
languages:
- javascript
- php
markdownlint:
enabled: true
checks:
MD002:
enabled: false
MD009:
enabled: false
MD013:
enabled: false
MD024:
enabled: false
phpcodesniffer:
enabled: true
config:
file_extensions: "php"
phpmd:
enabled: true
config:
file_extensions: "php"
ratings:
paths:
- "**.js"
- "**.md"
- "**.php"
| Disable top level header check | Disable top level header check
| YAML | mit | milescellar/flarum-ext-french,milescellar/flarum-ext-french,milescellar/flarum-ext-french | yaml | ## Code Before:
engines:
duplication:
enabled: true
config:
languages:
- javascript
- php
markdownlint:
enabled: true
checks:
MD009:
enabled: false
MD013:
enabled: false
MD024:
enabled: false
phpcodesniffer:
enabled: true
config:
file_extensions: "php"
phpmd:
enabled: true
config:
file_extensions: "php"
ratings:
paths:
- "**.js"
- "**.md"
- "**.php"
## Instruction:
Disable top level header check
## Code After:
engines:
duplication:
enabled: true
config:
languages:
- javascript
- php
markdownlint:
enabled: true
checks:
MD002:
enabled: false
MD009:
enabled: false
MD013:
enabled: false
MD024:
enabled: false
phpcodesniffer:
enabled: true
config:
file_extensions: "php"
phpmd:
enabled: true
config:
file_extensions: "php"
ratings:
paths:
- "**.js"
- "**.md"
- "**.php"
| engines:
duplication:
enabled: true
config:
languages:
- javascript
- php
markdownlint:
enabled: true
checks:
+ MD002:
+ enabled: false
MD009:
enabled: false
MD013:
enabled: false
MD024:
enabled: false
phpcodesniffer:
enabled: true
config:
file_extensions: "php"
phpmd:
enabled: true
config:
file_extensions: "php"
ratings:
paths:
- "**.js"
- "**.md"
- "**.php" | 2 | 0.066667 | 2 | 0 |
5060ef938f4f0cb880f288235391dc0c08be56c6 | src/main/kotlin/com/github/shiraji/findpullrequest/FindPullRequestAction.kt | src/main/kotlin/com/github/shiraji/findpullrequest/FindPullRequestAction.kt | package com.github.shiraji.findpullrequest
import com.intellij.notification.Notification
import com.intellij.notification.NotificationType
import com.intellij.notification.Notifications
import com.intellij.openapi.actionSystem.AnAction
import com.intellij.openapi.actionSystem.AnActionEvent
import com.intellij.openapi.actionSystem.CommonDataKeys
import com.intellij.openapi.fileEditor.FileDocumentManager
import com.intellij.openapi.project.Project
import git4idea.repo.GitRepository
import org.jetbrains.plugins.github.util.GithubUtil
class FindPullRequestAction : AnAction() {
override fun actionPerformed(e: AnActionEvent) {
val eventData = calcData(e)
Notifications.Bus.notify(Notification("Plugin Importer+Exporter",
"Plugin Importer+Exporter",
"EventData Repo br: " + eventData?.repository?.branches
+ " " + eventData?.repository?.remotes,
NotificationType.INFORMATION))
}
private fun calcData(e : AnActionEvent): EventData? {
val project = e.getData(CommonDataKeys.PROJECT)
project ?: return null
val virtualFile = e.getData(CommonDataKeys.VIRTUAL_FILE)
virtualFile ?: return null
val document = FileDocumentManager.getInstance().getDocument(virtualFile)
document ?: return null
val repository = GithubUtil.getGitRepository(project, virtualFile)
repository ?: return null
return EventData(project, repository)
}
private data class EventData(val project: Project, val repository: GitRepository) {
}
}
| package com.github.shiraji.findpullrequest
import com.intellij.notification.Notification
import com.intellij.notification.NotificationType
import com.intellij.notification.Notifications
import com.intellij.openapi.actionSystem.AnAction
import com.intellij.openapi.actionSystem.AnActionEvent
import com.intellij.openapi.actionSystem.CommonDataKeys
import com.intellij.openapi.fileEditor.FileDocumentManager
import com.intellij.openapi.project.Project
import git4idea.repo.GitRepository
import org.jetbrains.plugins.github.util.GithubUtil
class FindPullRequestAction : AnAction() {
override fun actionPerformed(e: AnActionEvent) {
val eventData = calcData(e)
val foo = eventData?.repository?.remotes?.joinToString {
it.pushUrls.toString() + "\n"
}
Notifications.Bus.notify(Notification("Plugin Importer+Exporter",
"Plugin Importer+Exporter",
"EventData: " + foo,
NotificationType.INFORMATION))
}
private fun calcData(e : AnActionEvent): EventData? {
val project = e.getData(CommonDataKeys.PROJECT)
project ?: return null
val virtualFile = e.getData(CommonDataKeys.VIRTUAL_FILE)
virtualFile ?: return null
val document = FileDocumentManager.getInstance().getDocument(virtualFile)
document ?: return null
val repository = GithubUtil.getGitRepository(project, virtualFile)
repository ?: return null
return EventData(project, repository)
}
private data class EventData(val project: Project, val repository: GitRepository) {
}
}
| Add debug info. It may not work since there is no runtime dependancy | Add debug info. It may not work since there is no runtime dependancy
| Kotlin | apache-2.0 | shiraji/find-pull-request | kotlin | ## Code Before:
package com.github.shiraji.findpullrequest
import com.intellij.notification.Notification
import com.intellij.notification.NotificationType
import com.intellij.notification.Notifications
import com.intellij.openapi.actionSystem.AnAction
import com.intellij.openapi.actionSystem.AnActionEvent
import com.intellij.openapi.actionSystem.CommonDataKeys
import com.intellij.openapi.fileEditor.FileDocumentManager
import com.intellij.openapi.project.Project
import git4idea.repo.GitRepository
import org.jetbrains.plugins.github.util.GithubUtil
class FindPullRequestAction : AnAction() {
override fun actionPerformed(e: AnActionEvent) {
val eventData = calcData(e)
Notifications.Bus.notify(Notification("Plugin Importer+Exporter",
"Plugin Importer+Exporter",
"EventData Repo br: " + eventData?.repository?.branches
+ " " + eventData?.repository?.remotes,
NotificationType.INFORMATION))
}
private fun calcData(e : AnActionEvent): EventData? {
val project = e.getData(CommonDataKeys.PROJECT)
project ?: return null
val virtualFile = e.getData(CommonDataKeys.VIRTUAL_FILE)
virtualFile ?: return null
val document = FileDocumentManager.getInstance().getDocument(virtualFile)
document ?: return null
val repository = GithubUtil.getGitRepository(project, virtualFile)
repository ?: return null
return EventData(project, repository)
}
private data class EventData(val project: Project, val repository: GitRepository) {
}
}
## Instruction:
Add debug info. It may not work since there is no runtime dependency
## Code After:
package com.github.shiraji.findpullrequest
import com.intellij.notification.Notification
import com.intellij.notification.NotificationType
import com.intellij.notification.Notifications
import com.intellij.openapi.actionSystem.AnAction
import com.intellij.openapi.actionSystem.AnActionEvent
import com.intellij.openapi.actionSystem.CommonDataKeys
import com.intellij.openapi.fileEditor.FileDocumentManager
import com.intellij.openapi.project.Project
import git4idea.repo.GitRepository
import org.jetbrains.plugins.github.util.GithubUtil
class FindPullRequestAction : AnAction() {
override fun actionPerformed(e: AnActionEvent) {
val eventData = calcData(e)
val foo = eventData?.repository?.remotes?.joinToString {
it.pushUrls.toString() + "\n"
}
Notifications.Bus.notify(Notification("Plugin Importer+Exporter",
"Plugin Importer+Exporter",
"EventData: " + foo,
NotificationType.INFORMATION))
}
private fun calcData(e : AnActionEvent): EventData? {
val project = e.getData(CommonDataKeys.PROJECT)
project ?: return null
val virtualFile = e.getData(CommonDataKeys.VIRTUAL_FILE)
virtualFile ?: return null
val document = FileDocumentManager.getInstance().getDocument(virtualFile)
document ?: return null
val repository = GithubUtil.getGitRepository(project, virtualFile)
repository ?: return null
return EventData(project, repository)
}
private data class EventData(val project: Project, val repository: GitRepository) {
}
}
| package com.github.shiraji.findpullrequest
import com.intellij.notification.Notification
import com.intellij.notification.NotificationType
import com.intellij.notification.Notifications
import com.intellij.openapi.actionSystem.AnAction
import com.intellij.openapi.actionSystem.AnActionEvent
import com.intellij.openapi.actionSystem.CommonDataKeys
import com.intellij.openapi.fileEditor.FileDocumentManager
import com.intellij.openapi.project.Project
import git4idea.repo.GitRepository
import org.jetbrains.plugins.github.util.GithubUtil
class FindPullRequestAction : AnAction() {
override fun actionPerformed(e: AnActionEvent) {
val eventData = calcData(e)
+ val foo = eventData?.repository?.remotes?.joinToString {
+ it.pushUrls.toString() + "\n"
+ }
+
Notifications.Bus.notify(Notification("Plugin Importer+Exporter",
"Plugin Importer+Exporter",
+ "EventData: " + foo,
- "EventData Repo br: " + eventData?.repository?.branches
- + " " + eventData?.repository?.remotes,
NotificationType.INFORMATION))
}
private fun calcData(e : AnActionEvent): EventData? {
val project = e.getData(CommonDataKeys.PROJECT)
project ?: return null
val virtualFile = e.getData(CommonDataKeys.VIRTUAL_FILE)
virtualFile ?: return null
val document = FileDocumentManager.getInstance().getDocument(virtualFile)
document ?: return null
val repository = GithubUtil.getGitRepository(project, virtualFile)
repository ?: return null
return EventData(project, repository)
}
private data class EventData(val project: Project, val repository: GitRepository) {
}
} | 7 | 0.159091 | 5 | 2 |
d3d8ac87288ff872eebc7982321af25a62e853a4 | docs/index.rst | docs/index.rst | .. Vumi documentation master file, created by
sphinx-quickstart on Thu Mar 17 20:51:03 2011.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to Vumi's documentation!
================================
Contents:
.. toctree::
:maxdepth: 1
overview.rst
first-smpp-bind.rst
applications/index.rst
transports/index.rst
dispatchers/index.rst
middleware/index.rst
metrics.rst
roadmap.rst
release-notes.rst
.. note::
Looking for documentation for writing Javascript applications for the
hosted Vumi Go environment? Visit http://vumi-go.readthedocs.org for
documentation on the hosted platform and
http://vumi-jssandbox-toolkit.readthedocs.org for documentation on
the Javascript sandbox.
Getting Started:
* :doc:`installation`
* :doc:`getting-started`
* :doc:`Writing your first Vumi app - Part 1</intro/tutorial01>` | :doc:`Part 2</intro/tutorial02>`
* :doc:`ScaleConf workshop instructions</intro/scaleconf01>`
For developers:
.. toctree::
:maxdepth: 1
routing-naming-conventions.rst
how-we-do-releases.rst
coding-guidelines.rst
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
| .. Vumi documentation master file, created by
sphinx-quickstart on Thu Mar 17 20:51:03 2011.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to Vumi's documentation!
================================
Contents:
.. toctree::
:maxdepth: 1
overview.rst
intro/index.rst
first-smpp-bind.rst
applications/index.rst
transports/index.rst
dispatchers/index.rst
middleware/index.rst
metrics.rst
roadmap.rst
release-notes.rst
.. note::
Looking for documentation for writing Javascript applications for the
hosted Vumi Go environment? Visit http://vumi-go.readthedocs.org for
documentation on the hosted platform and
http://vumi-jssandbox-toolkit.readthedocs.org for documentation on
the Javascript sandbox.
Getting Started:
* :doc:`installation`
* :doc:`getting-started`
* :doc:`Writing your first Vumi app - Part 1</intro/tutorial01>` | :doc:`Part 2</intro/tutorial02>`
* :doc:`ScaleConf workshop instructions</intro/scaleconf01>`
For developers:
.. toctree::
:maxdepth: 1
routing-naming-conventions.rst
how-we-do-releases.rst
coding-guidelines.rst
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
| Add intro docs back to contents (@jerith). | Add intro docs back to contents (@jerith).
| reStructuredText | bsd-3-clause | harrissoerja/vumi,harrissoerja/vumi,harrissoerja/vumi,TouK/vumi,vishwaprakashmishra/xmatrix,TouK/vumi,TouK/vumi,vishwaprakashmishra/xmatrix,vishwaprakashmishra/xmatrix | restructuredtext | ## Code Before:
.. Vumi documentation master file, created by
sphinx-quickstart on Thu Mar 17 20:51:03 2011.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to Vumi's documentation!
================================
Contents:
.. toctree::
:maxdepth: 1
overview.rst
first-smpp-bind.rst
applications/index.rst
transports/index.rst
dispatchers/index.rst
middleware/index.rst
metrics.rst
roadmap.rst
release-notes.rst
.. note::
Looking for documentation for writing Javascript applications for the
hosted Vumi Go environment? Visit http://vumi-go.readthedocs.org for
documentation on the hosted platform and
http://vumi-jssandbox-toolkit.readthedocs.org for documentation on
the Javascript sandbox.
Getting Started:
* :doc:`installation`
* :doc:`getting-started`
* :doc:`Writing your first Vumi app - Part 1</intro/tutorial01>` | :doc:`Part 2</intro/tutorial02>`
* :doc:`ScaleConf workshop instructions</intro/scaleconf01>`
For developers:
.. toctree::
:maxdepth: 1
routing-naming-conventions.rst
how-we-do-releases.rst
coding-guidelines.rst
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
## Instruction:
Add intro docs back to contents (@jerith).
## Code After:
.. Vumi documentation master file, created by
sphinx-quickstart on Thu Mar 17 20:51:03 2011.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to Vumi's documentation!
================================
Contents:
.. toctree::
:maxdepth: 1
overview.rst
intro/index.rst
first-smpp-bind.rst
applications/index.rst
transports/index.rst
dispatchers/index.rst
middleware/index.rst
metrics.rst
roadmap.rst
release-notes.rst
.. note::
Looking for documentation for writing Javascript applications for the
hosted Vumi Go environment? Visit http://vumi-go.readthedocs.org for
documentation on the hosted platform and
http://vumi-jssandbox-toolkit.readthedocs.org for documentation on
the Javascript sandbox.
Getting Started:
* :doc:`installation`
* :doc:`getting-started`
* :doc:`Writing your first Vumi app - Part 1</intro/tutorial01>` | :doc:`Part 2</intro/tutorial02>`
* :doc:`ScaleConf workshop instructions</intro/scaleconf01>`
For developers:
.. toctree::
:maxdepth: 1
routing-naming-conventions.rst
how-we-do-releases.rst
coding-guidelines.rst
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
| .. Vumi documentation master file, created by
sphinx-quickstart on Thu Mar 17 20:51:03 2011.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to Vumi's documentation!
================================
Contents:
.. toctree::
:maxdepth: 1
overview.rst
+ intro/index.rst
first-smpp-bind.rst
applications/index.rst
transports/index.rst
dispatchers/index.rst
middleware/index.rst
metrics.rst
roadmap.rst
release-notes.rst
.. note::
Looking for documentation for writing Javascript applications for the
hosted Vumi Go environment? Visit http://vumi-go.readthedocs.org for
documentation on the hosted platform and
http://vumi-jssandbox-toolkit.readthedocs.org for documentation on
the Javascript sandbox.
Getting Started:
* :doc:`installation`
* :doc:`getting-started`
* :doc:`Writing your first Vumi app - Part 1</intro/tutorial01>` | :doc:`Part 2</intro/tutorial02>`
* :doc:`ScaleConf workshop instructions</intro/scaleconf01>`
For developers:
.. toctree::
:maxdepth: 1
routing-naming-conventions.rst
how-we-do-releases.rst
coding-guidelines.rst
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
| 1 | 0.018519 | 1 | 0 |
e64bee050a2295a1abc1e5c1edcdc1884a8b2269 | sample/tasks.js | sample/tasks.js | /**
* Run per command line as `task-runner demo.js`
*/
const taskRunner = require('../lib/index')
export async function npmList () {
return await taskRunner.shell('npm list')
}
export function wasteSomeTime () {
return new Promise((resolve) => {
//console.log('It is now: ', new Date())
setTimeout(() => {
//console.log('After waiting for 1.5s it is now: ', new Date())
resolve()
}, 1500)
})
}
export default async function () {
npmList()
await wasteSomeTime()
}
| /**
* Run per command line as `task-runner demo.js`
*/
const shell = require('../lib/index').shell
export async function npmList () {
return await shell('npm list')
}
export function wasteSomeTime () {
return new Promise((resolve) => {
//console.log('It is now: ', new Date())
setTimeout(() => {
//console.log('After waiting for 1.5s it is now: ', new Date())
resolve()
}, 1500)
})
}
export default async function () {
await npmList()
await wasteSomeTime()
}
export function willfail () {
return shell('echo "Some error" >&2 && exit 1')
}
| Fix a bug in the sample file & add 'willfail' task | Fix a bug in the sample file & add 'willfail' task
| JavaScript | mit | andywer/npm-launch | javascript | ## Code Before:
/**
* Run per command line as `task-runner demo.js`
*/
const taskRunner = require('../lib/index')
export async function npmList () {
return await taskRunner.shell('npm list')
}
export function wasteSomeTime () {
return new Promise((resolve) => {
//console.log('It is now: ', new Date())
setTimeout(() => {
//console.log('After waiting for 1.5s it is now: ', new Date())
resolve()
}, 1500)
})
}
export default async function () {
npmList()
await wasteSomeTime()
}
## Instruction:
Fix a bug in the sample file & add 'willfail' task
## Code After:
/**
* Run per command line as `task-runner demo.js`
*/
const shell = require('../lib/index').shell
export async function npmList () {
return await shell('npm list')
}
export function wasteSomeTime () {
return new Promise((resolve) => {
//console.log('It is now: ', new Date())
setTimeout(() => {
//console.log('After waiting for 1.5s it is now: ', new Date())
resolve()
}, 1500)
})
}
export default async function () {
await npmList()
await wasteSomeTime()
}
export function willfail () {
return shell('echo "Some error" >&2 && exit 1')
}
| /**
* Run per command line as `task-runner demo.js`
*/
- const taskRunner = require('../lib/index')
? -- ^^^^^ ^
+ const shell = require('../lib/index').shell
? ^ ^^ ++++++
export async function npmList () {
- return await taskRunner.shell('npm list')
? -----------
+ return await shell('npm list')
}
export function wasteSomeTime () {
return new Promise((resolve) => {
//console.log('It is now: ', new Date())
setTimeout(() => {
//console.log('After waiting for 1.5s it is now: ', new Date())
resolve()
}, 1500)
})
}
export default async function () {
- npmList()
+ await npmList()
? ++++++
await wasteSomeTime()
}
+
+ export function willfail () {
+ return shell('echo "Some error" >&2 && exit 1')
+ } | 10 | 0.4 | 7 | 3 |
256bb8ee68e95813d6595981aae37d063bf71fd0 | app/assets/javascripts/surveys.coffee | app/assets/javascripts/surveys.coffee | $(document).ready ->
$('#index-table').on 'change', '#select-all', ->
if $(this).is(':checked')
$('#index-table').find('.print-survey').prop 'checked', true
$('#submit').prop 'disabled', false
else
$('#index-table').find('.print-survey').prop 'checked', false
$('#submit').prop 'disabled', true
return
$('#index-table').on 'change', '.print-survey', ->
if $('#index-table').find('.print-survey:checked').length == 0
$('#submit').prop 'disabled', true
else
$('#submit').prop 'disabled', false
return
return | $(document).ready ->
$('#index-table').on 'change', '#select-all', ->
if $(this).is(':checked')
$('#index-table').find('.print-survey').prop 'checked', true
$('#submit').prop 'disabled', false
else
$('#index-table').find('.print-survey').prop 'checked', false
$('#submit').prop 'disabled', true
return
$('#index-table').on 'change', '.print-survey', ->
if $('#index-table').find('.print-survey:checked').length == 0
$('#submit').prop 'disabled', true
else
$('#submit').prop 'disabled', false
if $('.print-survey:checked').length == $('#index-table').find('tbody > tr').length
$('#select-all').prop 'checked', true
else
$('#select-all').prop 'checked', false
return
return | Update to coffeescript for select all and individual select interactivity | Update to coffeescript for select all and individual select interactivity
| CoffeeScript | mit | AkivaGreen/microservices-ntdsurveys,umts/microservices-ntdsurveys,AkivaGreen/microservices-ntdsurveys,umts/microservices-ntdsurveys,umts/microservices-ntdsurveys,AkivaGreen/microservices-ntdsurveys | coffeescript | ## Code Before:
$(document).ready ->
$('#index-table').on 'change', '#select-all', ->
if $(this).is(':checked')
$('#index-table').find('.print-survey').prop 'checked', true
$('#submit').prop 'disabled', false
else
$('#index-table').find('.print-survey').prop 'checked', false
$('#submit').prop 'disabled', true
return
$('#index-table').on 'change', '.print-survey', ->
if $('#index-table').find('.print-survey:checked').length == 0
$('#submit').prop 'disabled', true
else
$('#submit').prop 'disabled', false
return
return
## Instruction:
Update to coffeescript for select all and individual select interactivity
## Code After:
$(document).ready ->
$('#index-table').on 'change', '#select-all', ->
if $(this).is(':checked')
$('#index-table').find('.print-survey').prop 'checked', true
$('#submit').prop 'disabled', false
else
$('#index-table').find('.print-survey').prop 'checked', false
$('#submit').prop 'disabled', true
return
$('#index-table').on 'change', '.print-survey', ->
if $('#index-table').find('.print-survey:checked').length == 0
$('#submit').prop 'disabled', true
else
$('#submit').prop 'disabled', false
if $('.print-survey:checked').length == $('#index-table').find('tbody > tr').length
$('#select-all').prop 'checked', true
else
$('#select-all').prop 'checked', false
return
return | $(document).ready ->
$('#index-table').on 'change', '#select-all', ->
if $(this).is(':checked')
$('#index-table').find('.print-survey').prop 'checked', true
$('#submit').prop 'disabled', false
else
$('#index-table').find('.print-survey').prop 'checked', false
$('#submit').prop 'disabled', true
return
$('#index-table').on 'change', '.print-survey', ->
if $('#index-table').find('.print-survey:checked').length == 0
$('#submit').prop 'disabled', true
else
$('#submit').prop 'disabled', false
+ if $('.print-survey:checked').length == $('#index-table').find('tbody > tr').length
+ $('#select-all').prop 'checked', true
+ else
+ $('#select-all').prop 'checked', false
return
return | 4 | 0.25 | 4 | 0 |
ab6da0ff3d06f6cbc052d52a9d293fca35d25da9 | src/fhofherr/clj_db_util/migrations.clj | src/fhofherr/clj_db_util/migrations.clj | (ns fhofherr.clj-db-util.migrations
(:require [clojure.tools.logging :as log]
[fhofherr.clj-db-util.db :as db]
[fhofherr.clj-db-util.migrations.flyway :as fw]))
(defn migrate
"Migrate the database identified by `db`.
Return the `db` representing the migrated database.
*Parameters*:
- `db` the database to connect to."
[db & options]
(log/info "Preparing to migrate database.")
(fw/migrate (db/dialect db) (db/db-spec db) options))
(defn clean
"Clean the database identified by`db`.
Return a `db` representing the cleaned database.
**Warning**: This removes all database objects! Do not call this function for
production databases!
*Parameters*:
- `db` the database to clean."
[db & options]
(fw/clean (db/dialect db) (db/db-spec db) options))
| (ns fhofherr.clj-db-util.migrations
(:require [clojure.tools.logging :as log]
[fhofherr.clj-db-util.db :as db]
[fhofherr.clj-db-util.migrations.flyway :as fw]))
(defn migrate
"Migrate the database identified by `db`.
Return the `db` representing the migrated database.
*Parameters*:
- `db` the database to connect to."
[db & options]
(log/info "Preparing to migrate database.")
(fw/migrate (db/dialect db) (db/db-spec db) options)
db)
(defn clean
"Clean the database identified by`db`.
Return a `db` representing the cleaned database.
**Warning**: This removes all database objects! Do not call this function for
production databases!
*Parameters*:
- `db` the database to clean."
[db & options]
(fw/clean (db/dialect db) (db/db-spec db) options)
db)
| Fix return values of migrate and clean. | Fix return values of migrate and clean.
| Clojure | mit | fhofherr/clj-db-util | clojure | ## Code Before:
(ns fhofherr.clj-db-util.migrations
(:require [clojure.tools.logging :as log]
[fhofherr.clj-db-util.db :as db]
[fhofherr.clj-db-util.migrations.flyway :as fw]))
(defn migrate
"Migrate the database identified by `db`.
Return the `db` representing the migrated database.
*Parameters*:
- `db` the database to connect to."
[db & options]
(log/info "Preparing to migrate database.")
(fw/migrate (db/dialect db) (db/db-spec db) options))
(defn clean
"Clean the database identified by`db`.
Return a `db` representing the cleaned database.
**Warning**: This removes all database objects! Do not call this function for
production databases!
*Parameters*:
- `db` the database to clean."
[db & options]
(fw/clean (db/dialect db) (db/db-spec db) options))
## Instruction:
Fix return values of migrate and clean.
## Code After:
(ns fhofherr.clj-db-util.migrations
(:require [clojure.tools.logging :as log]
[fhofherr.clj-db-util.db :as db]
[fhofherr.clj-db-util.migrations.flyway :as fw]))
(defn migrate
"Migrate the database identified by `db`.
Return the `db` representing the migrated database.
*Parameters*:
- `db` the database to connect to."
[db & options]
(log/info "Preparing to migrate database.")
(fw/migrate (db/dialect db) (db/db-spec db) options)
db)
(defn clean
"Clean the database identified by`db`.
Return a `db` representing the cleaned database.
**Warning**: This removes all database objects! Do not call this function for
production databases!
*Parameters*:
- `db` the database to clean."
[db & options]
(fw/clean (db/dialect db) (db/db-spec db) options)
db)
| (ns fhofherr.clj-db-util.migrations
(:require [clojure.tools.logging :as log]
[fhofherr.clj-db-util.db :as db]
[fhofherr.clj-db-util.migrations.flyway :as fw]))
(defn migrate
"Migrate the database identified by `db`.
Return the `db` representing the migrated database.
*Parameters*:
- `db` the database to connect to."
[db & options]
(log/info "Preparing to migrate database.")
- (fw/migrate (db/dialect db) (db/db-spec db) options))
? -
+ (fw/migrate (db/dialect db) (db/db-spec db) options)
+ db)
(defn clean
"Clean the database identified by`db`.
Return a `db` representing the cleaned database.
**Warning**: This removes all database objects! Do not call this function for
production databases!
*Parameters*:
- `db` the database to clean."
[db & options]
- (fw/clean (db/dialect db) (db/db-spec db) options))
? -
+ (fw/clean (db/dialect db) (db/db-spec db) options)
+ db) | 6 | 0.214286 | 4 | 2 |
9507e3acaf5a7d51df06cba13a8b45f91d0a3a02 | Sources/Sugar/Helpers/String.swift | Sources/Sugar/Helpers/String.swift | import Crypto
public extension String {
public static func randomAlphaNumericString(_ length: Int = 64) -> String {
func makeRandom(min: Int, max: Int) -> Int {
let top = max - min + 1
#if os(Linux)
// will always be initialized
guard randomInitialized else { fatalError() }
return Int(COperatingSystem.random() % top) + min
#else
return Int(arc4random_uniform(UInt32(top))) + min
#endif
}
let letters: String = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
let len = letters.count
var randomString = ""
for _ in 0 ..< length {
let rand = makeRandom(min: 0, max: Int(len - 1))
let char = letters[letters.index(letters.startIndex, offsetBy: rand)]
randomString += String(char)
}
return randomString
}
}
| import Crypto
public extension String {
public static func randomAlphaNumericString(_ length: Int = 64) -> String {
func makeRandom(min: Int, max: Int) -> Int {
let top = max - min + 1
#if os(Linux)
// will always be initialized
guard randomInitialized else { fatalError() }
return Int(COperatingSystem.random() % top) + min
#else
return Int(arc4random_uniform(UInt32(top))) + min
#endif
}
let letters: String = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
let len = letters.count
var randomString = ""
for _ in 0 ..< length {
let rand = makeRandom(min: 0, max: Int(len - 1))
let char = letters[letters.index(letters.startIndex, offsetBy: rand)]
randomString += String(char)
}
return randomString
}
}
#if os(Linux)
/// Generates a random number between (and inclusive of)
/// the given minimum and maximum.
private let randomInitialized: Bool = {
/// This stylized initializer is used to work around dispatch_once
/// not existing and still guarantee thread safety
let current = Date().timeIntervalSinceReferenceDate
let salt = current.truncatingRemainder(dividingBy: 1) * 100000000
COperatingSystem.srand(UInt32(current + salt))
return true
}()
#endif
| Make it build on Linux | Make it build on Linux
| Swift | mit | nodes-vapor/sugar | swift | ## Code Before:
import Crypto
public extension String {
public static func randomAlphaNumericString(_ length: Int = 64) -> String {
func makeRandom(min: Int, max: Int) -> Int {
let top = max - min + 1
#if os(Linux)
// will always be initialized
guard randomInitialized else { fatalError() }
return Int(COperatingSystem.random() % top) + min
#else
return Int(arc4random_uniform(UInt32(top))) + min
#endif
}
let letters: String = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
let len = letters.count
var randomString = ""
for _ in 0 ..< length {
let rand = makeRandom(min: 0, max: Int(len - 1))
let char = letters[letters.index(letters.startIndex, offsetBy: rand)]
randomString += String(char)
}
return randomString
}
}
## Instruction:
Make it build on Linux
## Code After:
import Crypto
public extension String {
public static func randomAlphaNumericString(_ length: Int = 64) -> String {
func makeRandom(min: Int, max: Int) -> Int {
let top = max - min + 1
#if os(Linux)
// will always be initialized
guard randomInitialized else { fatalError() }
return Int(COperatingSystem.random() % top) + min
#else
return Int(arc4random_uniform(UInt32(top))) + min
#endif
}
let letters: String = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
let len = letters.count
var randomString = ""
for _ in 0 ..< length {
let rand = makeRandom(min: 0, max: Int(len - 1))
let char = letters[letters.index(letters.startIndex, offsetBy: rand)]
randomString += String(char)
}
return randomString
}
}
#if os(Linux)
/// Generates a random number between (and inclusive of)
/// the given minimum and maximum.
private let randomInitialized: Bool = {
/// This stylized initializer is used to work around dispatch_once
/// not existing and still guarantee thread safety
let current = Date().timeIntervalSinceReferenceDate
let salt = current.truncatingRemainder(dividingBy: 1) * 100000000
COperatingSystem.srand(UInt32(current + salt))
return true
}()
#endif
| import Crypto
public extension String {
public static func randomAlphaNumericString(_ length: Int = 64) -> String {
func makeRandom(min: Int, max: Int) -> Int {
let top = max - min + 1
#if os(Linux)
// will always be initialized
guard randomInitialized else { fatalError() }
return Int(COperatingSystem.random() % top) + min
#else
return Int(arc4random_uniform(UInt32(top))) + min
#endif
}
let letters: String = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
let len = letters.count
var randomString = ""
for _ in 0 ..< length {
let rand = makeRandom(min: 0, max: Int(len - 1))
let char = letters[letters.index(letters.startIndex, offsetBy: rand)]
randomString += String(char)
}
return randomString
}
}
+
+ #if os(Linux)
+ /// Generates a random number between (and inclusive of)
+ /// the given minimum and maximum.
+ private let randomInitialized: Bool = {
+ /// This stylized initializer is used to work around dispatch_once
+ /// not existing and still guarantee thread safety
+ let current = Date().timeIntervalSinceReferenceDate
+ let salt = current.truncatingRemainder(dividingBy: 1) * 100000000
+ COperatingSystem.srand(UInt32(current + salt))
+ return true
+ }()
+ #endif | 13 | 0.464286 | 13 | 0 |
057cdbdb0cd3edb18201ca090f57908681512c76 | openupgradelib/__init__.py | openupgradelib/__init__.py | import sys
__author__ = 'Odoo Community Association (OCA)'
__email__ = 'support@odoo-community.org'
__doc__ = """A library with support functions to be called from Odoo \
migration scripts."""
__license__ = "AGPL-3"
if sys.version_info >= (3, 8):
from importlib.metadata import version, PackageNotFoundError
else:
from importlib_metadata import version, PackageNotFoundError
try:
__version__ = version("openupgradelib")
except PackageNotFoundError:
# package is not installed
pass
| import sys
__author__ = 'Odoo Community Association (OCA)'
__email__ = 'support@odoo-community.org'
__doc__ = """A library with support functions to be called from Odoo \
migration scripts."""
__license__ = "AGPL-3"
try:
if sys.version_info >= (3, 8):
from importlib.metadata import version, PackageNotFoundError
else:
from importlib_metadata import version, PackageNotFoundError
except ImportError:
# this happens when setup.py imports openupgradelib
pass
else:
try:
__version__ = version("openupgradelib")
except PackageNotFoundError:
# package is not installed
pass
| Fix issue when running setup.py on python<3.8 | Fix issue when running setup.py on python<3.8
| Python | agpl-3.0 | OCA/openupgradelib | python | ## Code Before:
import sys
__author__ = 'Odoo Community Association (OCA)'
__email__ = 'support@odoo-community.org'
__doc__ = """A library with support functions to be called from Odoo \
migration scripts."""
__license__ = "AGPL-3"
if sys.version_info >= (3, 8):
from importlib.metadata import version, PackageNotFoundError
else:
from importlib_metadata import version, PackageNotFoundError
try:
__version__ = version("openupgradelib")
except PackageNotFoundError:
# package is not installed
pass
## Instruction:
Fix issue when running setup.py on python<3.8
## Code After:
import sys
__author__ = 'Odoo Community Association (OCA)'
__email__ = 'support@odoo-community.org'
__doc__ = """A library with support functions to be called from Odoo \
migration scripts."""
__license__ = "AGPL-3"
try:
if sys.version_info >= (3, 8):
from importlib.metadata import version, PackageNotFoundError
else:
from importlib_metadata import version, PackageNotFoundError
except ImportError:
# this happens when setup.py imports openupgradelib
pass
else:
try:
__version__ = version("openupgradelib")
except PackageNotFoundError:
# package is not installed
pass
| import sys
__author__ = 'Odoo Community Association (OCA)'
__email__ = 'support@odoo-community.org'
__doc__ = """A library with support functions to be called from Odoo \
migration scripts."""
__license__ = "AGPL-3"
+ try:
- if sys.version_info >= (3, 8):
+ if sys.version_info >= (3, 8):
? ++++
- from importlib.metadata import version, PackageNotFoundError
+ from importlib.metadata import version, PackageNotFoundError
? ++++
+ else:
+ from importlib_metadata import version, PackageNotFoundError
+ except ImportError:
+ # this happens when setup.py imports openupgradelib
+ pass
else:
+ try:
- from importlib_metadata import version, PackageNotFoundError
-
- try:
- __version__ = version("openupgradelib")
+ __version__ = version("openupgradelib")
? ++++
- except PackageNotFoundError:
+ except PackageNotFoundError:
? ++++
- # package is not installed
+ # package is not installed
? ++++
- pass
+ pass
? ++++
| 22 | 1.157895 | 13 | 9 |
53560397cae47554969981b5dc99da07631d3f75 | deploy.sh | deploy.sh | set -e
# Base directory for this entire project
BASEDIR=$(cd $(dirname $0) && pwd)
# Destination directory for built code
ASSETSDIR="$BASEDIR/assets"
# run the build script
./npm install
./gulp
# Create a new Git repo in assets folder
cd "$ASSETSDIR"
git init
# Set user details
git config user.name "iAyeBot"
git config user.email "iayebot@gmail.com"
# First commit, .. horray!
git add .
git commit -m "Deploy to gh-pages"
# Force push ...
git push --force --quiet "https://${GH_TOKEN}@${GH_REF}" master:gh-pages > /dev/null 2>&1
| set -e
# Base directory for this entire project
BASEDIR=$(cd $(dirname $0) && pwd)
# Destination directory for built code
ASSETSDIR="$BASEDIR/assets"
# install npm and run gulp
./npm install
./gulp
# Create a new Git repo in assets folder
cd "$ASSETSDIR"
git init
# Set user details
git config user.name "iAyeBot"
git config user.email "iayebot@gmail.com"
# First commit, .. horray!
git add .
git commit -m "Deploy to gh-pages"
# Force push ...
git push --force --quiet "https://${GH_TOKEN}@${GH_REF}" master:gh-pages > /dev/null 2>&1
| Update readme, kick off Travis .. | Update readme, kick off Travis ..
| Shell | mit | pyrocms/branding,pyrocms/branding,pyrocms/branding | shell | ## Code Before:
set -e
# Base directory for this entire project
BASEDIR=$(cd $(dirname $0) && pwd)
# Destination directory for built code
ASSETSDIR="$BASEDIR/assets"
# run the build script
./npm install
./gulp
# Create a new Git repo in assets folder
cd "$ASSETSDIR"
git init
# Set user details
git config user.name "iAyeBot"
git config user.email "iayebot@gmail.com"
# First commit, .. horray!
git add .
git commit -m "Deploy to gh-pages"
# Force push ...
git push --force --quiet "https://${GH_TOKEN}@${GH_REF}" master:gh-pages > /dev/null 2>&1
## Instruction:
Update readme, kick off Travis ..
## Code After:
set -e
# Base directory for this entire project
BASEDIR=$(cd $(dirname $0) && pwd)
# Destination directory for built code
ASSETSDIR="$BASEDIR/assets"
# install npm and run gulp
./npm install
./gulp
# Create a new Git repo in assets folder
cd "$ASSETSDIR"
git init
# Set user details
git config user.name "iAyeBot"
git config user.email "iayebot@gmail.com"
# First commit, .. horray!
git add .
git commit -m "Deploy to gh-pages"
# Force push ...
git push --force --quiet "https://${GH_TOKEN}@${GH_REF}" master:gh-pages > /dev/null 2>&1
| set -e
# Base directory for this entire project
BASEDIR=$(cd $(dirname $0) && pwd)
# Destination directory for built code
ASSETSDIR="$BASEDIR/assets"
- # run the build script
+ # install npm and run gulp
./npm install
./gulp
# Create a new Git repo in assets folder
cd "$ASSETSDIR"
git init
# Set user details
git config user.name "iAyeBot"
git config user.email "iayebot@gmail.com"
# First commit, .. horray!
git add .
git commit -m "Deploy to gh-pages"
# Force push ...
git push --force --quiet "https://${GH_TOKEN}@${GH_REF}" master:gh-pages > /dev/null 2>&1 | 2 | 0.076923 | 1 | 1 |
a2015a13fbaa23a2a2097b8573a36e617d1eb980 | baseutils/ls/tests/functional_test.c | baseutils/ls/tests/functional_test.c | /*
* $FreeBSD$
*
* Smoke test for `ls` utility
*/
#include <getopt.h>
#include <stdio.h>
#include <stdlib.h>
#include "functional_test.h"
int
main(int argc, char *argv[])
{
while ((opt = getopt_long(argc, argv, short_options,
long_options, &option_index)) != -1) {
switch(opt) {
case 0: /* valid long option */
/* generate the valid command for execution */
snprintf(command, sizeof(command),
"/bin/ls --%s > /dev/null", long_options[option_index].name);
ret = system(command);
if (ret == -1) {
fprintf(stderr, "Failed to create child process\n");
exit(EXIT_FAILURE);
}
if (!WIFEXITED(ret)) {
fprintf(stderr, "Child process failed to terminate normally\n");
exit(EXIT_FAILURE);
}
if (WEXITSTATUS(ret))
fprintf(stderr, "\nValid option '--%s' failed to execute\n",
long_options[option_index].name);
else
printf("Successful: '--%s'\n", long_options[option_index].name);
break;
case '?': /* invalid long option */
break;
default:
printf("getopt_long returned character code %o\n", opt);
}
}
exit(EXIT_SUCCESS);
}
| /*
* $FreeBSD$
*
* Smoke test for `ls` utility
*/
#include <getopt.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include "functional_test.h"
int
main(int argc, char *argv[])
{
while ((opt = getopt_long(argc, argv, short_options,
long_options, &option_index)) != -1) {
switch(opt) {
case 0: /* valid long option */
/* generate the valid command for execution */
snprintf(command, sizeof(command),
"/bin/ls --%s > /dev/null", long_options[option_index].name);
ret = system(command);
if (ret == -1) {
fprintf(stderr, "Failed to create child process\n");
exit(EXIT_FAILURE);
}
if (!WIFEXITED(ret)) {
fprintf(stderr, "Child process failed to terminate normally\n");
exit(EXIT_FAILURE);
}
if (WEXITSTATUS(ret))
fprintf(stderr, "\nValid option '--%s' failed to execute\n",
long_options[option_index].name);
else
printf("Successful: '--%s'\n", long_options[option_index].name);
break;
case '?': /* invalid long option */
break;
default:
printf("getopt_long returned character code %o\n", opt);
}
}
exit(EXIT_SUCCESS);
}
| Fix compilation error on FreeBSD | Fix compilation error on FreeBSD
Error log -
```
undefined reference to `WIFEXITED'
undefined reference to `WEXITSTATUS'
```
Apparently the header file <sys/wait.h> has to be included in FreeBSD
unlike on Linux (Ubuntu 16.10) since it is included by <stdlib.h>.
| C | bsd-2-clause | shivansh/smoketestsuite,shivrai/smoketestsuite,shivrai/smoketestsuite,shivrai/smoketestsuite,shivansh/smoketestsuite,shivansh/smoketestsuite | c | ## Code Before:
/*
* $FreeBSD$
*
* Smoke test for `ls` utility
*/
#include <getopt.h>
#include <stdio.h>
#include <stdlib.h>
#include "functional_test.h"
int
main(int argc, char *argv[])
{
while ((opt = getopt_long(argc, argv, short_options,
long_options, &option_index)) != -1) {
switch(opt) {
case 0: /* valid long option */
/* generate the valid command for execution */
snprintf(command, sizeof(command),
"/bin/ls --%s > /dev/null", long_options[option_index].name);
ret = system(command);
if (ret == -1) {
fprintf(stderr, "Failed to create child process\n");
exit(EXIT_FAILURE);
}
if (!WIFEXITED(ret)) {
fprintf(stderr, "Child process failed to terminate normally\n");
exit(EXIT_FAILURE);
}
if (WEXITSTATUS(ret))
fprintf(stderr, "\nValid option '--%s' failed to execute\n",
long_options[option_index].name);
else
printf("Successful: '--%s'\n", long_options[option_index].name);
break;
case '?': /* invalid long option */
break;
default:
printf("getopt_long returned character code %o\n", opt);
}
}
exit(EXIT_SUCCESS);
}
## Instruction:
Fix compilation error on FreeBSD
Error log -
```
undefined reference to `WIFEXITED'
undefined reference to `WEXITSTATUS'
```
Apparently the header file <sys/wait.h> has to be included in FreeBSD
unlike on Linux (Ubuntu 16.10) since it is included by <stdlib.h>.
## Code After:
/*
* $FreeBSD$
*
* Smoke test for `ls` utility
*/
#include <getopt.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include "functional_test.h"
int
main(int argc, char *argv[])
{
while ((opt = getopt_long(argc, argv, short_options,
long_options, &option_index)) != -1) {
switch(opt) {
case 0: /* valid long option */
/* generate the valid command for execution */
snprintf(command, sizeof(command),
"/bin/ls --%s > /dev/null", long_options[option_index].name);
ret = system(command);
if (ret == -1) {
fprintf(stderr, "Failed to create child process\n");
exit(EXIT_FAILURE);
}
if (!WIFEXITED(ret)) {
fprintf(stderr, "Child process failed to terminate normally\n");
exit(EXIT_FAILURE);
}
if (WEXITSTATUS(ret))
fprintf(stderr, "\nValid option '--%s' failed to execute\n",
long_options[option_index].name);
else
printf("Successful: '--%s'\n", long_options[option_index].name);
break;
case '?': /* invalid long option */
break;
default:
printf("getopt_long returned character code %o\n", opt);
}
}
exit(EXIT_SUCCESS);
}
| /*
* $FreeBSD$
*
* Smoke test for `ls` utility
*/
#include <getopt.h>
#include <stdio.h>
#include <stdlib.h>
+ #include <sys/wait.h>
#include "functional_test.h"
int
main(int argc, char *argv[])
{
while ((opt = getopt_long(argc, argv, short_options,
long_options, &option_index)) != -1) {
switch(opt) {
case 0: /* valid long option */
/* generate the valid command for execution */
snprintf(command, sizeof(command),
"/bin/ls --%s > /dev/null", long_options[option_index].name);
ret = system(command);
if (ret == -1) {
fprintf(stderr, "Failed to create child process\n");
exit(EXIT_FAILURE);
}
if (!WIFEXITED(ret)) {
fprintf(stderr, "Child process failed to terminate normally\n");
exit(EXIT_FAILURE);
}
if (WEXITSTATUS(ret))
fprintf(stderr, "\nValid option '--%s' failed to execute\n",
long_options[option_index].name);
else
printf("Successful: '--%s'\n", long_options[option_index].name);
break;
case '?': /* invalid long option */
break;
default:
printf("getopt_long returned character code %o\n", opt);
}
}
exit(EXIT_SUCCESS);
} | 1 | 0.019608 | 1 | 0 |
5dc8c4ecfe8b95ffb9ef90395760f7f8309496af | volunteers/templates/userena/profile_list.html | volunteers/templates/userena/profile_list.html | {% extends 'userena/base_userena.html' %}
{% load i18n %}
{% load url from future %}
{% block content_title %}<h2 class="content-title">{% trans 'Profiles' %}</h2>{% endblock %}
{% block content %}
<ul id="account_list">
{% for profile in profile_list %}
<li>
<a href="{% url 'userena_profile_detail' profile.user.username %}"><img src="{{ profile.get_mugshot_url }}" /></a>
<a href="{% url 'userena_profile_detail' profile.user.username %}">{{ profile.user.username }}</a>
</li>
{% endfor %}
</ul>
{% if is_paginated %}
<div class="pagination">
<span class="step-links">
{% if page_obj.has_previous %}
<a href="{% url 'userena_profile_list_paginated' page_obj.previous_page_number %}">{% trans 'previous' %}</a>
{% endif %}
<span class="current">
{% trans 'Page' %} {{ page_obj.number }} {% trans 'of' %} {{ page_obj.paginator.num_pages }}.
</span>
{% if page_obj.has_next %}
<a href="{% url 'userena_profile_list_paginated' page_obj.next_page_number %}">{% trans 'next' %}</a>
{% endif %}
</span>
</div>
{% endif %}
{% endblock %}
| {% extends 'userena/base_userena.html' %}
{% load i18n %}
{% load url from future %}
{% block content_title %}<h2 class="content-title">{% trans 'Profiles' %}</h2>{% endblock %}
{% block content %}
<ul id="account_list">
{% for profile in profile_list %}
<li>
<a href="{% url 'userena_profile_detail' profile.user.username %}"><img src="{{ profile.get_mugshot_url }}" /></a>
<a href="{% url 'userena_profile_detail' profile.user.username %}">{{ profile.user.first_name }} {{ profile.user.last_name }} ( {{ profile.user.username }})</a>
</li>
{% endfor %}
</ul>
{% if is_paginated %}
<div class="pagination">
<span class="step-links">
{% if page_obj.has_previous %}
<a href="{% url 'userena_profile_list_paginated' page_obj.previous_page_number %}">{% trans 'previous' %}</a>
{% endif %}
<span class="current">
{% trans 'Page' %} {{ page_obj.number }} {% trans 'of' %} {{ page_obj.paginator.num_pages }}.
</span>
{% if page_obj.has_next %}
<a href="{% url 'userena_profile_list_paginated' page_obj.next_page_number %}">{% trans 'next' %}</a>
{% endif %}
</span>
</div>
{% endif %}
{% endblock %}
| Add fullname to profile list | [IMP] Add fullname to profile list
| HTML | agpl-3.0 | cateee/fosdem-volunteers,jrial/fosdem-volunteers,jrial/fosdem-volunteers,FOSDEM/volunteers,cateee/fosdem-volunteers,jrial/fosdem-volunteers,jrial/fosdem-volunteers,cateee/fosdem-volunteers,FOSDEM/volunteers,FOSDEM/volunteers,FOSDEM/volunteers,cateee/fosdem-volunteers | html | ## Code Before:
{% extends 'userena/base_userena.html' %}
{% load i18n %}
{% load url from future %}
{% block content_title %}<h2 class="content-title">{% trans 'Profiles' %}</h2>{% endblock %}
{% block content %}
<ul id="account_list">
{% for profile in profile_list %}
<li>
<a href="{% url 'userena_profile_detail' profile.user.username %}"><img src="{{ profile.get_mugshot_url }}" /></a>
<a href="{% url 'userena_profile_detail' profile.user.username %}">{{ profile.user.username }}</a>
</li>
{% endfor %}
</ul>
{% if is_paginated %}
<div class="pagination">
<span class="step-links">
{% if page_obj.has_previous %}
<a href="{% url 'userena_profile_list_paginated' page_obj.previous_page_number %}">{% trans 'previous' %}</a>
{% endif %}
<span class="current">
{% trans 'Page' %} {{ page_obj.number }} {% trans 'of' %} {{ page_obj.paginator.num_pages }}.
</span>
{% if page_obj.has_next %}
<a href="{% url 'userena_profile_list_paginated' page_obj.next_page_number %}">{% trans 'next' %}</a>
{% endif %}
</span>
</div>
{% endif %}
{% endblock %}
## Instruction:
[IMP] Add fullname to profile list
## Code After:
{% extends 'userena/base_userena.html' %}
{% load i18n %}
{% load url from future %}
{% block content_title %}<h2 class="content-title">{% trans 'Profiles' %}</h2>{% endblock %}
{% block content %}
<ul id="account_list">
{% for profile in profile_list %}
<li>
<a href="{% url 'userena_profile_detail' profile.user.username %}"><img src="{{ profile.get_mugshot_url }}" /></a>
<a href="{% url 'userena_profile_detail' profile.user.username %}">{{ profile.user.first_name }} {{ profile.user.last_name }} ( {{ profile.user.username }})</a>
</li>
{% endfor %}
</ul>
{% if is_paginated %}
<div class="pagination">
<span class="step-links">
{% if page_obj.has_previous %}
<a href="{% url 'userena_profile_list_paginated' page_obj.previous_page_number %}">{% trans 'previous' %}</a>
{% endif %}
<span class="current">
{% trans 'Page' %} {{ page_obj.number }} {% trans 'of' %} {{ page_obj.paginator.num_pages }}.
</span>
{% if page_obj.has_next %}
<a href="{% url 'userena_profile_list_paginated' page_obj.next_page_number %}">{% trans 'next' %}</a>
{% endif %}
</span>
</div>
{% endif %}
{% endblock %}
| {% extends 'userena/base_userena.html' %}
{% load i18n %}
{% load url from future %}
{% block content_title %}<h2 class="content-title">{% trans 'Profiles' %}</h2>{% endblock %}
{% block content %}
<ul id="account_list">
{% for profile in profile_list %}
<li>
<a href="{% url 'userena_profile_detail' profile.user.username %}"><img src="{{ profile.get_mugshot_url }}" /></a>
- <a href="{% url 'userena_profile_detail' profile.user.username %}">{{ profile.user.username }}</a>
+ <a href="{% url 'userena_profile_detail' profile.user.username %}">{{ profile.user.first_name }} {{ profile.user.last_name }} ( {{ profile.user.username }})</a>
? ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ +
</li>
{% endfor %}
</ul>
{% if is_paginated %}
<div class="pagination">
<span class="step-links">
{% if page_obj.has_previous %}
<a href="{% url 'userena_profile_list_paginated' page_obj.previous_page_number %}">{% trans 'previous' %}</a>
{% endif %}
<span class="current">
{% trans 'Page' %} {{ page_obj.number }} {% trans 'of' %} {{ page_obj.paginator.num_pages }}.
</span>
{% if page_obj.has_next %}
<a href="{% url 'userena_profile_list_paginated' page_obj.next_page_number %}">{% trans 'next' %}</a>
{% endif %}
</span>
</div>
{% endif %}
{% endblock %} | 2 | 0.058824 | 1 | 1 |
f10037b9ff2ac0e449a2e6ffb7da1cb1b98b5736 | test/CodeGen/X86/pr12360.ll | test/CodeGen/X86/pr12360.ll | ; RUN: llc < %s -march=x86-64 | FileCheck %s
define zeroext i1 @f1(i8* %x) {
entry:
%0 = load i8* %x, align 1, !range !0
%tobool = trunc i8 %0 to i1
ret i1 %tobool
}
; CHECK: f1:
; CHECK: movb (%rdi), %al
; CHECK-NEXT: ret
!0 = metadata !{i8 0, i8 2}
| ; RUN: llc < %s -mtriple=x86_64-apple-darwin | FileCheck %s
define zeroext i1 @f1(i8* %x) {
entry:
%0 = load i8* %x, align 1, !range !0
%tobool = trunc i8 %0 to i1
ret i1 %tobool
}
; CHECK: f1:
; CHECK: movb (%rdi), %al
; CHECK-NEXT: ret
!0 = metadata !{i8 0, i8 2}
| Add a triple to the test. | Add a triple to the test.
git-svn-id: 0ff597fd157e6f4fc38580e8d64ab130330d2411@153818 91177308-0d34-0410-b5e6-96231b3b80d8
| LLVM | apache-2.0 | llvm-mirror/llvm,GPUOpen-Drivers/llvm,llvm-mirror/llvm,GPUOpen-Drivers/llvm,dslab-epfl/asap,apple/swift-llvm,GPUOpen-Drivers/llvm,dslab-epfl/asap,dslab-epfl/asap,apple/swift-llvm,chubbymaggie/asap,dslab-epfl/asap,GPUOpen-Drivers/llvm,chubbymaggie/asap,apple/swift-llvm,apple/swift-llvm,apple/swift-llvm,llvm-mirror/llvm,llvm-mirror/llvm,apple/swift-llvm,dslab-epfl/asap,chubbymaggie/asap,apple/swift-llvm,llvm-mirror/llvm,GPUOpen-Drivers/llvm,GPUOpen-Drivers/llvm,apple/swift-llvm,chubbymaggie/asap,GPUOpen-Drivers/llvm,llvm-mirror/llvm,dslab-epfl/asap,llvm-mirror/llvm,chubbymaggie/asap,chubbymaggie/asap,dslab-epfl/asap,GPUOpen-Drivers/llvm,llvm-mirror/llvm,llvm-mirror/llvm | llvm | ## Code Before:
; RUN: llc < %s -march=x86-64 | FileCheck %s
define zeroext i1 @f1(i8* %x) {
entry:
%0 = load i8* %x, align 1, !range !0
%tobool = trunc i8 %0 to i1
ret i1 %tobool
}
; CHECK: f1:
; CHECK: movb (%rdi), %al
; CHECK-NEXT: ret
!0 = metadata !{i8 0, i8 2}
## Instruction:
Add a triple to the test.
git-svn-id: 0ff597fd157e6f4fc38580e8d64ab130330d2411@153818 91177308-0d34-0410-b5e6-96231b3b80d8
## Code After:
; RUN: llc < %s -mtriple=x86_64-apple-darwin | FileCheck %s
define zeroext i1 @f1(i8* %x) {
entry:
%0 = load i8* %x, align 1, !range !0
%tobool = trunc i8 %0 to i1
ret i1 %tobool
}
; CHECK: f1:
; CHECK: movb (%rdi), %al
; CHECK-NEXT: ret
!0 = metadata !{i8 0, i8 2}
| - ; RUN: llc < %s -march=x86-64 | FileCheck %s
? ^ ^^ ^
+ ; RUN: llc < %s -mtriple=x86_64-apple-darwin | FileCheck %s
? ^ ^^^^ ^ +++++++++++++
define zeroext i1 @f1(i8* %x) {
entry:
%0 = load i8* %x, align 1, !range !0
%tobool = trunc i8 %0 to i1
ret i1 %tobool
}
; CHECK: f1:
; CHECK: movb (%rdi), %al
; CHECK-NEXT: ret
!0 = metadata !{i8 0, i8 2} | 2 | 0.142857 | 1 | 1 |
e0a19e87a328c41adfd51c7e2cdcd29d0412e916 | src/js/post-911-gib-status/containers/PrintPage.jsx | src/js/post-911-gib-status/containers/PrintPage.jsx | import React from 'react';
import { connect } from 'react-redux';
import UserInfoSection from '../components/UserInfoSection';
import { formatDateLong } from '../../common/utils/helpers';
class PrintPage extends React.Component {
render() {
const enrollmentData = this.props.enrollmentData || {};
const todayFormatted = formatDateLong(new Date());
return (
<div className="usa-width-two-thirds medium-8 columns gib-info">
<div className="print-status">
<div className="print-screen">
<img src="/img/design/logo/va-logo.png" alt="VA logo" width="300"/>
<h1 className="section-header">Post-9/11 GI Bill<sup>®</sup> Statement of Benefits</h1>
<p>
The information in this letter is the Post-9/11 GI Bill
Statement of Benefits for the beneficiary listed below as of
{todayFormatted}. Any pending or recent changes to enrollment may
affect remaining entitlement.
</p>
<UserInfoSection enrollmentData={enrollmentData}/>
</div>
</div>
</div>
);
}
}
function mapStateToProps(state) {
return {
enrollmentData: state.post911GIBStatus.enrollmentData
};
}
export default connect(mapStateToProps)(PrintPage);
| import React from 'react';
import { connect } from 'react-redux';
import UserInfoSection from '../components/UserInfoSection';
import { formatDateLong } from '../../common/utils/helpers';
class PrintPage extends React.Component {
render() {
const enrollmentData = this.props.enrollmentData || {};
const todayFormatted = formatDateLong(new Date());
return (
<div className="usa-width-two-thirds medium-8 columns gib-info">
<div className="print-status">
<div className="print-screen">
<img src="/img/design/logo/va-logo.png" alt="VA logo" width="300"/>
<h1 className="section-header">Post-9/11 GI Bill<sup>®</sup> Statement of Benefits</h1>
<p>
The information in this letter is the Post-9/11 GI Bill Statement of Benefits for the beneficiary listed below as of {todayFormatted}. Any pending or recent changes to enrollment may affect remaining entitlement.
</p>
<UserInfoSection enrollmentData={enrollmentData}/>
</div>
</div>
</div>
);
}
}
function mapStateToProps(state) {
return {
enrollmentData: state.post911GIBStatus.enrollmentData
};
}
export default connect(mapStateToProps)(PrintPage);
| Add trademark symbol to print page | Add trademark symbol to print page
| JSX | cc0-1.0 | department-of-veterans-affairs/vets-website,department-of-veterans-affairs/vets-website | jsx | ## Code Before:
import React from 'react';
import { connect } from 'react-redux';
import UserInfoSection from '../components/UserInfoSection';
import { formatDateLong } from '../../common/utils/helpers';
class PrintPage extends React.Component {
render() {
const enrollmentData = this.props.enrollmentData || {};
const todayFormatted = formatDateLong(new Date());
return (
<div className="usa-width-two-thirds medium-8 columns gib-info">
<div className="print-status">
<div className="print-screen">
<img src="/img/design/logo/va-logo.png" alt="VA logo" width="300"/>
<h1 className="section-header">Post-9/11 GI Bill<sup>®</sup> Statement of Benefits</h1>
<p>
The information in this letter is the Post-9/11 GI Bill
Statement of Benefits for the beneficiary listed below as of
{todayFormatted}. Any pending or recent changes to enrollment may
affect remaining entitlement.
</p>
<UserInfoSection enrollmentData={enrollmentData}/>
</div>
</div>
</div>
);
}
}
function mapStateToProps(state) {
return {
enrollmentData: state.post911GIBStatus.enrollmentData
};
}
export default connect(mapStateToProps)(PrintPage);
## Instruction:
Add trademark symbol to print page
## Code After:
import React from 'react';
import { connect } from 'react-redux';
import UserInfoSection from '../components/UserInfoSection';
import { formatDateLong } from '../../common/utils/helpers';
class PrintPage extends React.Component {
render() {
const enrollmentData = this.props.enrollmentData || {};
const todayFormatted = formatDateLong(new Date());
return (
<div className="usa-width-two-thirds medium-8 columns gib-info">
<div className="print-status">
<div className="print-screen">
<img src="/img/design/logo/va-logo.png" alt="VA logo" width="300"/>
<h1 className="section-header">Post-9/11 GI Bill<sup>®</sup> Statement of Benefits</h1>
<p>
The information in this letter is the Post-9/11 GI Bill Statement of Benefits for the beneficiary listed below as of {todayFormatted}. Any pending or recent changes to enrollment may affect remaining entitlement.
</p>
<UserInfoSection enrollmentData={enrollmentData}/>
</div>
</div>
</div>
);
}
}
function mapStateToProps(state) {
return {
enrollmentData: state.post911GIBStatus.enrollmentData
};
}
export default connect(mapStateToProps)(PrintPage);
| import React from 'react';
import { connect } from 'react-redux';
import UserInfoSection from '../components/UserInfoSection';
import { formatDateLong } from '../../common/utils/helpers';
class PrintPage extends React.Component {
render() {
const enrollmentData = this.props.enrollmentData || {};
const todayFormatted = formatDateLong(new Date());
return (
- <div className="usa-width-two-thirds medium-8 columns gib-info">
+ <div className="usa-width-two-thirds medium-8 columns gib-info">
? ++
- <div className="print-status">
+ <div className="print-status">
? ++++
- <div className="print-screen">
+ <div className="print-screen">
? ++++++
- <img src="/img/design/logo/va-logo.png" alt="VA logo" width="300"/>
+ <img src="/img/design/logo/va-logo.png" alt="VA logo" width="300"/>
? ++++++++
- <h1 className="section-header">Post-9/11 GI Bill<sup>®</sup> Statement of Benefits</h1>
+ <h1 className="section-header">Post-9/11 GI Bill<sup>®</sup> Statement of Benefits</h1>
? ++++++++
- <p>
+ <p>
? ++++++++
+ The information in this letter is the Post-9/11 GI Bill Statement of Benefits for the beneficiary listed below as of {todayFormatted}. Any pending or recent changes to enrollment may affect remaining entitlement.
- The information in this letter is the Post-9/11 GI Bill
- Statement of Benefits for the beneficiary listed below as of
- {todayFormatted}. Any pending or recent changes to enrollment may
- affect remaining entitlement.
- </p>
+ </p>
? ++++++++
- <UserInfoSection enrollmentData={enrollmentData}/>
+ <UserInfoSection enrollmentData={enrollmentData}/>
? ++++++++
+ </div>
- </div>
+ </div>
? ++
</div>
- </div>
);
}
}
function mapStateToProps(state) {
return {
enrollmentData: state.post911GIBStatus.enrollmentData
};
}
export default connect(mapStateToProps)(PrintPage); | 25 | 0.625 | 11 | 14 |
ada4cd886e09e5b97de5cca6083ef166661172fe | app/components/resources/map/list/thumbnail.scss | app/components/resources/map/list/thumbnail.scss | /**
* @author Thomas Kleinke
*/
#thumbnail-container {
width: $resources-listing-item-slim-width;
height: 150px;
img {
max-width: 100%;
max-height: 100%;
object-fit: scale-down;
user-select: none;
-webkit-user-drag: none;
}
} | /**
* @author Thomas Kleinke
* @author Sebastian Cuy
*/
#thumbnail-container {
width: $resources-listing-item-slim-width;
height: 172px;
img {
max-width: 100%;
max-height: 100%;
object-fit: scale-down;
user-select: none;
-webkit-user-drag: none;
&:hover {
filter: brightness(95%);
}
}
}
| Add on hover effect for img in popover | Add on hover effect for img in popover
| SCSS | apache-2.0 | codarchlab/idai-field-client,codarchlab/idai-field-client,codarchlab/idai-field-client,codarchlab/idai-field-client | scss | ## Code Before:
/**
* @author Thomas Kleinke
*/
#thumbnail-container {
width: $resources-listing-item-slim-width;
height: 150px;
img {
max-width: 100%;
max-height: 100%;
object-fit: scale-down;
user-select: none;
-webkit-user-drag: none;
}
}
## Instruction:
Add on hover effect for img in popover
## Code After:
/**
* @author Thomas Kleinke
* @author Sebastian Cuy
*/
#thumbnail-container {
width: $resources-listing-item-slim-width;
height: 172px;
img {
max-width: 100%;
max-height: 100%;
object-fit: scale-down;
user-select: none;
-webkit-user-drag: none;
&:hover {
filter: brightness(95%);
}
}
}
| /**
* @author Thomas Kleinke
+ * @author Sebastian Cuy
*/
#thumbnail-container {
width: $resources-listing-item-slim-width;
- height: 150px;
? ^^
+ height: 172px;
? ^^
img {
max-width: 100%;
max-height: 100%;
object-fit: scale-down;
user-select: none;
-webkit-user-drag: none;
+
+ &:hover {
+ filter: brightness(95%);
+ }
}
} | 7 | 0.4375 | 6 | 1 |
833bf307b6117cb7f1707225a8406d0cffb9c406 | app/services/persister/post_persister.rb | app/services/persister/post_persister.rb | module Persister
class PostPersister
attr_accessor :post
def initialize(post)
@post = post
end
def save
::ActiveRecord::Base.transaction do
post.save!
create_save_event!
post
end
rescue ActiveRecord::RecordInvalid => _exception
false
end
def update_attributes(params)
post.update_attributes(params)
end
private
def create_save_event!
::Event.create! action: :created, post: post
end
end
end
| module Persister
class PostPersister
attr_accessor :post
def initialize(post)
@post = post
end
def save
::ActiveRecord::Base.transaction do
post.save!
create_save_event!
post
end
rescue ActiveRecord::RecordInvalid => _exception
false
end
def update_attributes(params)
::ActiveRecord::Base.transaction do
post.update_attributes!(params)
create_update_event!
post
end
rescue ActiveRecord::RecordInvalid => _exception
false
end
private
def create_save_event!
::Event.create! action: :created, post: post
end
def create_update_event!
::Event.create! action: :updated, post: post
end
end
end
| Create :updated event on Post update | Create :updated event on Post update
| Ruby | agpl-3.0 | coopdevs/timeoverflow,coopdevs/timeoverflow,coopdevs/timeoverflow,coopdevs/timeoverflow | ruby | ## Code Before:
module Persister
class PostPersister
attr_accessor :post
def initialize(post)
@post = post
end
def save
::ActiveRecord::Base.transaction do
post.save!
create_save_event!
post
end
rescue ActiveRecord::RecordInvalid => _exception
false
end
def update_attributes(params)
post.update_attributes(params)
end
private
def create_save_event!
::Event.create! action: :created, post: post
end
end
end
## Instruction:
Create :updated event on Post update
## Code After:
module Persister
class PostPersister
attr_accessor :post
def initialize(post)
@post = post
end
def save
::ActiveRecord::Base.transaction do
post.save!
create_save_event!
post
end
rescue ActiveRecord::RecordInvalid => _exception
false
end
def update_attributes(params)
::ActiveRecord::Base.transaction do
post.update_attributes!(params)
create_update_event!
post
end
rescue ActiveRecord::RecordInvalid => _exception
false
end
private
def create_save_event!
::Event.create! action: :created, post: post
end
def create_update_event!
::Event.create! action: :updated, post: post
end
end
end
| module Persister
class PostPersister
attr_accessor :post
def initialize(post)
@post = post
end
def save
::ActiveRecord::Base.transaction do
post.save!
create_save_event!
post
end
rescue ActiveRecord::RecordInvalid => _exception
false
end
def update_attributes(params)
+ ::ActiveRecord::Base.transaction do
- post.update_attributes(params)
+ post.update_attributes!(params)
? ++ +
+ create_update_event!
+ post
+ end
+ rescue ActiveRecord::RecordInvalid => _exception
+ false
end
private
def create_save_event!
::Event.create! action: :created, post: post
end
+
+ def create_update_event!
+ ::Event.create! action: :updated, post: post
+ end
end
end | 12 | 0.413793 | 11 | 1 |
299d6f800d1c08d04245ecdf7c6b2b4ef0803de6 | t/metadata.t | t/metadata.t | use v5.010;
use lib 't/lib';
use autodie;
use Test::Most;
use Test::NetKAN qw(netkan_files read_netkan licenses);
use Data::Dumper;
my %licenses = licenses();
my %files = netkan_files;
foreach my $shortname (sort keys %files) {
my $metadata = read_netkan($files{$shortname});
is(
$metadata->{identifier},
$shortname,
"$shortname.netkan identifer should match filename"
);
my $mod_license = $metadata->{license} // "(none)";
ok(
$metadata->{x_netkan_license_ok} || $licenses{$mod_license},
"$shortname license ($mod_license) should match spec. Set `x_netkan_license_ok` to supress."
);
ok(
$metadata->{'$kref'} || $metadata->{'$vref'},
"$shortname has no \$kref/\$vref field. It belongs in CKAN-meta"
);
}
done_testing;
| use v5.010;
use lib 't/lib';
use autodie;
use Test::Most;
use Test::NetKAN qw(netkan_files read_netkan licenses);
use Data::Dumper;
my %licenses = licenses();
my %files = netkan_files;
foreach my $shortname (sort keys %files) {
my $metadata = read_netkan($files{$shortname});
is(
$metadata->{identifier},
$shortname,
"$shortname.netkan identifer should match filename"
);
like(
$metadata->{identifier},
qr{^[A-Za-z][A-Z-a-z0-9-]*$},
"CKAN identifers must consist only of letters, numbers, and dashes, and must start with a letter."
);
my $mod_license = $metadata->{license} // "(none)";
ok(
$metadata->{x_netkan_license_ok} || $licenses{$mod_license},
"$shortname license ($mod_license) should match spec. Set `x_netkan_license_ok` to supress."
);
ok(
$metadata->{'$kref'} || $metadata->{'$vref'},
"$shortname has no \$kref/\$vref field. It belongs in CKAN-meta"
);
}
done_testing;
| Enforce tests on CKAN identifiers. | Enforce tests on CKAN identifiers.
CKAN identifiers have a strict definition of letters, numbers, and
dashes; and must start with a letter. This adds a test to ensure that
incorrect identifiers cannot be submitted.
With our current master, this always causes a failiure, as we have
invalid data in our repo.
| Perl | cc0-1.0 | TiktaalikDreaming/NetKAN,aw1621107/NetKAN,jenyaza01/NetKAN,jkoritzinsky/NetKAN,bld/NetKAN,SuicidalInsanity/NetKAN,jkoritzinsky/NetKAN,Kerbas-ad-astra/NetKAN,sumghai/NetKAN,magico13/NetKAN,civilwargeeky/NetKAN,DMagic1/NetKAN,sswelm/NetKAN,NathanKell/NetKAN,mgsdk/NetKAN,phardy/NetKAN,mhoram-kerbin/NetKAN,distantcam/NetKAN,BobPalmer/NetKAN,Space-Duck/NetKAN,Starstrider42/NetKAN,plague006/NetKAN,TeddyDD/NetKAN,puetzk/NetKAN,sswelm/NetKAN,Shaymes/NetKAN,godarklight/NetKAN,ItMustBeACamel/NetKAN,techman83/NetKAN,jrossignol/NetKAN,iPeer/NetKAN,ItMustBeACamel/NetKAN,NecroBones/NetKAN,toadicus/NetKAN,BobPalmer/NetKAN,godarklight/NetKAN,politas/NetKAN,KerbalStuffBot/NetKAN,severedsolo/NetKAN,plague006/NetKAN,Dazpoet/NetKAN,iPeer/NetKAN,Niemand303/NetKAN,rkagerer/NetKAN,toadicus/NetKAN,blowfishpro/NetKAN,Shaymes/NetKAN,alephnaughty/NetKAN,henrybauer/NetKAN,NecroBones/NetKAN,Kerbas-ad-astra/NetKAN,Raptor831/NetKAN,DaMichel/NetKAN,magico13/NetKAN,civilwargeeky/NetKAN,Cphoenix/NetKAN,alephnaughty/NetKAN,Niemand303/NetKAN,raidernick/NetKAN,mgsdk/NetKAN,BasharMilesTeg/NetKAN,distantcam/NetKAN,Dazpoet/NetKAN,bld/NetKAN,Raptor831/NetKAN | perl | ## Code Before:
use v5.010;
use lib 't/lib';
use autodie;
use Test::Most;
use Test::NetKAN qw(netkan_files read_netkan licenses);
use Data::Dumper;
my %licenses = licenses();
my %files = netkan_files;
foreach my $shortname (sort keys %files) {
my $metadata = read_netkan($files{$shortname});
is(
$metadata->{identifier},
$shortname,
"$shortname.netkan identifer should match filename"
);
my $mod_license = $metadata->{license} // "(none)";
ok(
$metadata->{x_netkan_license_ok} || $licenses{$mod_license},
"$shortname license ($mod_license) should match spec. Set `x_netkan_license_ok` to supress."
);
ok(
$metadata->{'$kref'} || $metadata->{'$vref'},
"$shortname has no \$kref/\$vref field. It belongs in CKAN-meta"
);
}
done_testing;
## Instruction:
Enforce tests on CKAN identifiers.
CKAN identifiers have a strict definition of letters, numbers, and
dashes; and must start with a letter. This adds a test to ensure that
incorrect identifiers cannot be submitted.
With our current master, this always causes a failiure, as we have
invalid data in our repo.
## Code After:
use v5.010;
use lib 't/lib';
use autodie;
use Test::Most;
use Test::NetKAN qw(netkan_files read_netkan licenses);
use Data::Dumper;
my %licenses = licenses();
my %files = netkan_files;
foreach my $shortname (sort keys %files) {
my $metadata = read_netkan($files{$shortname});
is(
$metadata->{identifier},
$shortname,
"$shortname.netkan identifer should match filename"
);
like(
$metadata->{identifier},
qr{^[A-Za-z][A-Z-a-z0-9-]*$},
"CKAN identifers must consist only of letters, numbers, and dashes, and must start with a letter."
);
my $mod_license = $metadata->{license} // "(none)";
ok(
$metadata->{x_netkan_license_ok} || $licenses{$mod_license},
"$shortname license ($mod_license) should match spec. Set `x_netkan_license_ok` to supress."
);
ok(
$metadata->{'$kref'} || $metadata->{'$vref'},
"$shortname has no \$kref/\$vref field. It belongs in CKAN-meta"
);
}
done_testing;
| use v5.010;
use lib 't/lib';
use autodie;
use Test::Most;
use Test::NetKAN qw(netkan_files read_netkan licenses);
use Data::Dumper;
my %licenses = licenses();
my %files = netkan_files;
foreach my $shortname (sort keys %files) {
my $metadata = read_netkan($files{$shortname});
is(
$metadata->{identifier},
$shortname,
"$shortname.netkan identifer should match filename"
);
+ like(
+ $metadata->{identifier},
+ qr{^[A-Za-z][A-Z-a-z0-9-]*$},
+ "CKAN identifers must consist only of letters, numbers, and dashes, and must start with a letter."
+ );
+
my $mod_license = $metadata->{license} // "(none)";
ok(
$metadata->{x_netkan_license_ok} || $licenses{$mod_license},
"$shortname license ($mod_license) should match spec. Set `x_netkan_license_ok` to supress."
);
ok(
$metadata->{'$kref'} || $metadata->{'$vref'},
"$shortname has no \$kref/\$vref field. It belongs in CKAN-meta"
);
}
done_testing; | 6 | 0.171429 | 6 | 0 |
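
The identifier rule spelled out in the commit message above (letters, numbers, and dashes, starting with a letter) can be restated as a standalone check. The sketch below is illustrative only — the function name and sample identifiers are assumptions, and it uses the intended `[A-Za-z0-9-]` class rather than the Perl test's literal `[A-Z-a-z0-9-]`:

```python
import re

# Re-statement of the CKAN identifier rule from the entry above:
# only letters, numbers, and dashes, and the first character must be a letter.
# (CKAN_ID_RE and the sample names below are assumptions for this sketch.)
CKAN_ID_RE = re.compile(r"^[A-Za-z][A-Za-z0-9-]*$")

def is_valid_identifier(identifier: str) -> bool:
    """Return True when the identifier matches the CKAN naming rule."""
    return CKAN_ID_RE.fullmatch(identifier) is not None

print(is_valid_identifier("ModuleManager"))  # True
print(is_valid_identifier("B9-Aerospace"))   # True
print(is_valid_identifier("2cool"))          # False: must start with a letter
print(is_valid_identifier("my_mod"))         # False: underscore not allowed
```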
2d58e4a041bba27e69c8254e2d505cfaa31b01ac | README.md | README.md |
Go library for reading Dymo USB postal scales.
You can buy them relatively cheap from Ebay.
Tested with a Dymo M5, but will probably work with other models too.
## Developing
### darwin/amd64
To test and build on Darwin. Although it seems that libusb reports an
`access denied` error:
```
brew install libusb
make darwin
```
### linux/amd64
To test, build, and run on Linux in Docker:
```
make docker
make
./dymodump/dymodump
```
### linux/arm
To test and cross-compile to ARM, which can run on a Raspberry Pi:
```
make docker
make test arm
```
|
[](http://godoc.org/github.com/dcarley/dymoscale) [](https://travis-ci.org/dcarley/dymoscale)
Go library for reading Dymo USB postal scales.
You can buy them relatively cheap from Ebay.
Tested with a Dymo M5, but will probably work with other models too.
## Developing
### darwin/amd64
To test and build on Darwin. Although it seems that libusb reports an
`access denied` error:
```
brew install libusb
make darwin
```
### linux/amd64
To test, build, and run on Linux in Docker:
```
make docker
make
./dymodump/dymodump
```
### linux/arm
To test and cross-compile to ARM, which can run on a Raspberry Pi:
```
make docker
make test arm
```
| Add godoc and Travis CI badges. | Add godoc and Travis CI badges.
| Markdown | mit | dcarley/dymoscale | markdown | ## Code Before:
Go library for reading Dymo USB postal scales.
You can buy them relatively cheap from Ebay.
Tested with a Dymo M5, but will probably work with other models too.
## Developing
### darwin/amd64
To test and build on Darwin. Although it seems that libusb reports an
`access denied` error:
```
brew install libusb
make darwin
```
### linux/amd64
To test, build, and run on Linux in Docker:
```
make docker
make
./dymodump/dymodump
```
### linux/arm
To test and cross-compile to ARM, which can run on a Raspberry Pi:
```
make docker
make test arm
```
## Instruction:
Add godoc and Travis CI badges.
## Code After:
[](http://godoc.org/github.com/dcarley/dymoscale) [](https://travis-ci.org/dcarley/dymoscale)
Go library for reading Dymo USB postal scales.
You can buy them relatively cheap from Ebay.
Tested with a Dymo M5, but will probably work with other models too.
## Developing
### darwin/amd64
To test and build on Darwin. Although it seems that libusb reports an
`access denied` error:
```
brew install libusb
make darwin
```
### linux/amd64
To test, build, and run on Linux in Docker:
```
make docker
make
./dymodump/dymodump
```
### linux/arm
To test and cross-compile to ARM, which can run on a Raspberry Pi:
```
make docker
make test arm
```
| +
+ [](http://godoc.org/github.com/dcarley/dymoscale) [](https://travis-ci.org/dcarley/dymoscale)
Go library for reading Dymo USB postal scales.
You can buy them relatively cheap from Ebay.
Tested with a Dymo M5, but will probably work with other models too.
## Developing
### darwin/amd64
To test and build on Darwin. Although it seems that libusb reports an
`access denied` error:
```
brew install libusb
make darwin
```
### linux/amd64
To test, build, and run on Linux in Docker:
```
make docker
make
./dymodump/dymodump
```
### linux/arm
To test and cross-compile to ARM, which can run on a Raspberry Pi:
```
make docker
make test arm
``` | 2 | 0.058824 | 2 | 0 |
c66f4ec0e28ec59ff07a50fc2433b96f56be31f3 | src/thread_checker.cpp | src/thread_checker.cpp |
namespace dux {
#ifdef _WIN32
ThreadChecker::ThreadChecker() : thread_id_(GetCurrentThreadId()) {}
bool ThreadChecker::IsCreationThreadCurrent() const {
return GetCurrentThreadId() == thread_id_;
}
#endif
} // namespace dux
|
namespace dux {
ThreadChecker::ThreadChecker() : thread_id_(GetCurrentThreadId()) {}
bool ThreadChecker::IsCreationThreadCurrent() const {
return GetCurrentThreadId() == thread_id_;
}
} // namespace dux
#endif
| Fix build on non-windows platforms. | Fix build on non-windows platforms.
| C++ | mit | jyaif/dux_base,jyaif/dux_base,jyaif/dux_base | c++ | ## Code Before:
namespace dux {
#ifdef _WIN32
ThreadChecker::ThreadChecker() : thread_id_(GetCurrentThreadId()) {}
bool ThreadChecker::IsCreationThreadCurrent() const {
return GetCurrentThreadId() == thread_id_;
}
#endif
} // namespace dux
## Instruction:
Fix build on non-windows platforms.
## Code After:
namespace dux {
ThreadChecker::ThreadChecker() : thread_id_(GetCurrentThreadId()) {}
bool ThreadChecker::IsCreationThreadCurrent() const {
return GetCurrentThreadId() == thread_id_;
}
} // namespace dux
#endif
|
namespace dux {
-
- #ifdef _WIN32
ThreadChecker::ThreadChecker() : thread_id_(GetCurrentThreadId()) {}
bool ThreadChecker::IsCreationThreadCurrent() const {
return GetCurrentThreadId() == thread_id_;
}
- #endif
} // namespace dux
+
+ #endif | 5 | 0.357143 | 2 | 3 |
94e0a1badf329f760bf1d3b2a3a6cce931cc0dc5 | src/keytar_posix.cc | src/keytar_posix.cc |
namespace keytar {
bool AddPassword(const std::string& service,
const std::string& account,
const std::string& password) {
return false;
}
bool GetPassword(const std::string& service,
const std::string& account,
std::string* password) {
return false;
}
bool DeletePassword(const std::string& service, const std::string& account) {
return false;
}
bool FindPassword(const std::string& service, std::string* password) {
return false;
}
} // namespace keytar
|
namespace keytar {
namespace {
const GnomeKeyringPasswordSchema kGnomeSchema = {
GNOME_KEYRING_ITEM_GENERIC_SECRET, {
{ "service", GNOME_KEYRING_ATTRIBUTE_TYPE_STRING },
{ "account", GNOME_KEYRING_ATTRIBUTE_TYPE_STRING },
{ NULL }
}
};
} // namespace
bool AddPassword(const std::string& service,
const std::string& account,
const std::string& password) {
return gnome_keyring_store_password_sync(
&kGnomeSchema,
NULL, // Default keyring.
(service + "/" + account).c_str(), // Display name.
password.c_str(),
"service", service.c_str(),
"account", account.c_str(),
NULL) == GNOME_KEYRING_RESULT_OK;
}
bool GetPassword(const std::string& service,
const std::string& account,
std::string* password) {
gchar* raw_passwords;
GnomeKeyringResult result = gnome_keyring_find_password_sync(
&kGnomeSchema,
&raw_passwords,
"service", service.c_str(),
"account", account.c_str(),
NULL);
if (result != GNOME_KEYRING_RESULT_OK)
return false;
if (raw_passwords != NULL)
*password = raw_passwords;
gnome_keyring_free_password(raw_passwords);
return true;
}
bool DeletePassword(const std::string& service, const std::string& account) {
return gnome_keyring_delete_password_sync(
&kGnomeSchema,
"service", service.c_str(),
"account", account.c_str(),
NULL) == GNOME_KEYRING_RESULT_OK;
}
bool FindPassword(const std::string& service, std::string* password) {
gchar* raw_passwords;
GnomeKeyringResult result = gnome_keyring_find_password_sync(
&kGnomeSchema,
&raw_passwords,
"service", service.c_str(),
NULL);
if (result != GNOME_KEYRING_RESULT_OK)
return false;
if (raw_passwords != NULL)
*password = raw_passwords;
gnome_keyring_free_password(raw_passwords);
return true;
}
} // namespace keytar
| Implement with gnome-keyring on Linux. | Implement with gnome-keyring on Linux.
| C++ | mit | havoc-io/keytar,havoc-io/keytar,thesbros/node-keytar,atom/node-keytar,ehealthafrica-ci/node-keytar,adelcambre/node-keytar,thesbros/node-keytar,atom/node-keytar,eHealthAfrica/node-keytar,atom/node-keytar,havoc-io/keytar,xeno-io/keytar,atom/node-keytar,eHealthAfrica/node-keytar,adelcambre/node-keytar,ehealthafrica-ci/node-keytar,xeno-io/keytar,adelcambre/node-keytar,xeno-io/keytar,adelcambre/node-keytar,eHealthAfrica/node-keytar,thesbros/node-keytar,ehealthafrica-ci/node-keytar | c++ | ## Code Before:
namespace keytar {
bool AddPassword(const std::string& service,
const std::string& account,
const std::string& password) {
return false;
}
bool GetPassword(const std::string& service,
const std::string& account,
std::string* password) {
return false;
}
bool DeletePassword(const std::string& service, const std::string& account) {
return false;
}
bool FindPassword(const std::string& service, std::string* password) {
return false;
}
} // namespace keytar
## Instruction:
Implement with gnome-keyring on Linux.
## Code After:
namespace keytar {
namespace {
const GnomeKeyringPasswordSchema kGnomeSchema = {
GNOME_KEYRING_ITEM_GENERIC_SECRET, {
{ "service", GNOME_KEYRING_ATTRIBUTE_TYPE_STRING },
{ "account", GNOME_KEYRING_ATTRIBUTE_TYPE_STRING },
{ NULL }
}
};
} // namespace
bool AddPassword(const std::string& service,
const std::string& account,
const std::string& password) {
return gnome_keyring_store_password_sync(
&kGnomeSchema,
NULL, // Default keyring.
(service + "/" + account).c_str(), // Display name.
password.c_str(),
"service", service.c_str(),
"account", account.c_str(),
NULL) == GNOME_KEYRING_RESULT_OK;
}
bool GetPassword(const std::string& service,
const std::string& account,
std::string* password) {
gchar* raw_passwords;
GnomeKeyringResult result = gnome_keyring_find_password_sync(
&kGnomeSchema,
&raw_passwords,
"service", service.c_str(),
"account", account.c_str(),
NULL);
if (result != GNOME_KEYRING_RESULT_OK)
return false;
if (raw_passwords != NULL)
*password = raw_passwords;
gnome_keyring_free_password(raw_passwords);
return true;
}
bool DeletePassword(const std::string& service, const std::string& account) {
return gnome_keyring_delete_password_sync(
&kGnomeSchema,
"service", service.c_str(),
"account", account.c_str(),
NULL) == GNOME_KEYRING_RESULT_OK;
}
bool FindPassword(const std::string& service, std::string* password) {
gchar* raw_passwords;
GnomeKeyringResult result = gnome_keyring_find_password_sync(
&kGnomeSchema,
&raw_passwords,
"service", service.c_str(),
NULL);
if (result != GNOME_KEYRING_RESULT_OK)
return false;
if (raw_passwords != NULL)
*password = raw_passwords;
gnome_keyring_free_password(raw_passwords);
return true;
}
} // namespace keytar
|
namespace keytar {
+
+ namespace {
+
+ const GnomeKeyringPasswordSchema kGnomeSchema = {
+ GNOME_KEYRING_ITEM_GENERIC_SECRET, {
+ { "service", GNOME_KEYRING_ATTRIBUTE_TYPE_STRING },
+ { "account", GNOME_KEYRING_ATTRIBUTE_TYPE_STRING },
+ { NULL }
+ }
+ };
+
+ } // namespace
bool AddPassword(const std::string& service,
const std::string& account,
const std::string& password) {
- return false;
+ return gnome_keyring_store_password_sync(
+ &kGnomeSchema,
+ NULL, // Default keyring.
+ (service + "/" + account).c_str(), // Display name.
+ password.c_str(),
+ "service", service.c_str(),
+ "account", account.c_str(),
+ NULL) == GNOME_KEYRING_RESULT_OK;
}
bool GetPassword(const std::string& service,
const std::string& account,
std::string* password) {
+ gchar* raw_passwords;
+ GnomeKeyringResult result = gnome_keyring_find_password_sync(
+ &kGnomeSchema,
+ &raw_passwords,
+ "service", service.c_str(),
+ "account", account.c_str(),
+ NULL);
+ if (result != GNOME_KEYRING_RESULT_OK)
- return false;
+ return false;
? ++
+
+ if (raw_passwords != NULL)
+ *password = raw_passwords;
+ gnome_keyring_free_password(raw_passwords);
+ return true;
}
bool DeletePassword(const std::string& service, const std::string& account) {
- return false;
+ return gnome_keyring_delete_password_sync(
+ &kGnomeSchema,
+ "service", service.c_str(),
+ "account", account.c_str(),
+ NULL) == GNOME_KEYRING_RESULT_OK;
}
bool FindPassword(const std::string& service, std::string* password) {
+ gchar* raw_passwords;
+ GnomeKeyringResult result = gnome_keyring_find_password_sync(
+ &kGnomeSchema,
+ &raw_passwords,
+ "service", service.c_str(),
+ NULL);
+ if (result != GNOME_KEYRING_RESULT_OK)
- return false;
+ return false;
? ++
+
+ if (raw_passwords != NULL)
+ *password = raw_passwords;
+ gnome_keyring_free_password(raw_passwords);
+ return true;
}
} // namespace keytar | 56 | 2.333333 | 52 | 4 |
2bcb91b9b938d287bcf0d2f67763c10673fa08a1 | views/includes/service_manual_header.html | views/includes/service_manual_header.html | <header class="page-header group">
<div class="hgroup">
<h1>GOV.UK elements</h1>
<h2 class="subtitle">The building blocks from which all pages are made</h2>
</div>
</header> | <header class="page-header group">
<div class="hgroup">
<h1>GOV.UK elements</h1>
<h2 class="subtitle">Layout, typography, colour, images, icons, forms, buttons and data.</h2>
</div>
</header>
| Update subtitle to match service manual header | Update subtitle to match service manual header
| HTML | mit | alphagov/govuk_elements,gavboulton/govuk_elements,gavboulton/govuk_elements,gavboulton/govuk_elements,TFOH/govuk_elements,alphagov/govuk_elements,alphagov/govuk_elements,gavboulton/govuk_elements,alphagov/govuk_elements,TFOH/govuk_elements,joelanman/govuk_elements,joelanman/govuk_elements | html | ## Code Before:
<header class="page-header group">
<div class="hgroup">
<h1>GOV.UK elements</h1>
<h2 class="subtitle">The building blocks from which all pages are made</h2>
</div>
</header>
## Instruction:
Update subtitle to match service manual header
## Code After:
<header class="page-header group">
<div class="hgroup">
<h1>GOV.UK elements</h1>
<h2 class="subtitle">Layout, typography, colour, images, icons, forms, buttons and data.</h2>
</div>
</header>
| <header class="page-header group">
<div class="hgroup">
<h1>GOV.UK elements</h1>
- <h2 class="subtitle">The building blocks from which all pages are made</h2>
+ <h2 class="subtitle">Layout, typography, colour, images, icons, forms, buttons and data.</h2>
</div>
</header> | 2 | 0.333333 | 1 | 1 |
a1f62c7944c239e6490c24bf4a2fe235d80830b3 | lib/ruby_wedding/engine.rb | lib/ruby_wedding/engine.rb | module RubyWedding
class Engine < ::Rails::Engine
isolate_namespace RubyWedding
config.generators do |g|
g.test_framework :rspec, fixture: false
g.fixture_replacement :factory_girl, dir: 'spec/factories'
g.assets false
g.helper false
end
end
end
| module RubyWedding
class Engine < ::Rails::Engine
isolate_namespace RubyWedding
config.generators do |g|
g.test_framework :rspec, fixture: false
g.fixture_replacement :factory_girl, dir: 'spec/factories'
g.assets false
g.helper false
end
config.to_prepare do
Dir.glob(Rails.root + "app/decorators/**/**/*_decorator*.rb").each do |c|
require_dependency(c)
end
end
end
end
| Allow decorators in the parent app. | Allow decorators in the parent app.
| Ruby | mit | skipchris/ruby_wedding,skipchris/ruby_wedding | ruby | ## Code Before:
module RubyWedding
class Engine < ::Rails::Engine
isolate_namespace RubyWedding
config.generators do |g|
g.test_framework :rspec, fixture: false
g.fixture_replacement :factory_girl, dir: 'spec/factories'
g.assets false
g.helper false
end
end
end
## Instruction:
Allow decorators in the parent app.
## Code After:
module RubyWedding
class Engine < ::Rails::Engine
isolate_namespace RubyWedding
config.generators do |g|
g.test_framework :rspec, fixture: false
g.fixture_replacement :factory_girl, dir: 'spec/factories'
g.assets false
g.helper false
end
config.to_prepare do
Dir.glob(Rails.root + "app/decorators/**/**/*_decorator*.rb").each do |c|
require_dependency(c)
end
end
end
end
| module RubyWedding
class Engine < ::Rails::Engine
isolate_namespace RubyWedding
config.generators do |g|
g.test_framework :rspec, fixture: false
g.fixture_replacement :factory_girl, dir: 'spec/factories'
g.assets false
g.helper false
end
+ config.to_prepare do
+ Dir.glob(Rails.root + "app/decorators/**/**/*_decorator*.rb").each do |c|
+ require_dependency(c)
+ end
+ end
end
end | 5 | 0.384615 | 5 | 0 |
c488bae8b97a3a9207d1aa0bbea424c92d493df0 | lib/handlers/providers/index.js | lib/handlers/providers/index.js | "use strict";
var Anyfetch = require('anyfetch');
module.exports.get = function get(req, res, next) {
var anyfetchClient = new Anyfetch(req.accessToken.token);
anyfetchClient.getDocuments({
fields: 'facets.providers'
}, function updated(err, anyfetchRes) {
if(!err) {
var providers = anyfetchRes.body.facets.providers;
res.send({
'count': providers.length
});
}
next(err);
});
};
| "use strict";
var Anyfetch = require('anyfetch');
module.exports.get = function get(req, res, next) {
var anyfetchClient = new Anyfetch(req.accessToken.token);
anyfetchClient.getDocuments({
fields: 'facets.providers'
}, function updated(err, anyfetchRes) {
if(!err) {
var providers = anyfetchRes.body.facets.providers;
res.send({
'count': providers ? providers.length || 0
});
}
next(err);
});
};
| Fix for missing providers length | Fix for missing providers length
| JavaScript | mit | AnyFetch/companion-server | javascript | ## Code Before:
"use strict";
var Anyfetch = require('anyfetch');
module.exports.get = function get(req, res, next) {
var anyfetchClient = new Anyfetch(req.accessToken.token);
anyfetchClient.getDocuments({
fields: 'facets.providers'
}, function updated(err, anyfetchRes) {
if(!err) {
var providers = anyfetchRes.body.facets.providers;
res.send({
'count': providers.length
});
}
next(err);
});
};
## Instruction:
Fix for missing providers length
## Code After:
"use strict";
var Anyfetch = require('anyfetch');
module.exports.get = function get(req, res, next) {
var anyfetchClient = new Anyfetch(req.accessToken.token);
anyfetchClient.getDocuments({
fields: 'facets.providers'
}, function updated(err, anyfetchRes) {
if(!err) {
var providers = anyfetchRes.body.facets.providers;
res.send({
'count': providers ? providers.length || 0
});
}
next(err);
});
};
| "use strict";
var Anyfetch = require('anyfetch');
module.exports.get = function get(req, res, next) {
var anyfetchClient = new Anyfetch(req.accessToken.token);
anyfetchClient.getDocuments({
fields: 'facets.providers'
}, function updated(err, anyfetchRes) {
if(!err) {
var providers = anyfetchRes.body.facets.providers;
res.send({
- 'count': providers.length
+ 'count': providers ? providers.length || 0
? ++++++++++++ +++++
});
}
next(err);
});
}; | 2 | 0.117647 | 1 | 1 |
27ba4b4968d5e01b1551312d64364cbb87d1ea87 | src/templates/base.html | src/templates/base.html | <!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width; initial-scale=1.0" />
<title>{% block title %}{% endblock %} - 12Bcon</title>
<link href="//netdna.bootstrapcdn.com/bootswatch/3.0.0/cyborg/bootstrap.min.css" rel="stylesheet">
<link href="css/styles.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-default" role="navigation">
<ul class="nav navbar-nav navbar-right">
<li {% if page == "home" %} class="active" {% endif %}><a href="/">Home</a></li>
<li {% if page == "cfp" %} class="active" {% endif %}><a href="cfp">Call for Papers</a></li>
</ul>
</nav>
<a href="#" class="logo"><img src="images/12Bcon-blue-small.png"></a>
{% if flash.error %}
<div class="alert alert-error">{{ flash.error }}</div>
{% endif %}
<div class="container">
{% block content %}{% endblock %}
</div>
<div class="footer text-center">
<small>Pesented by the 12-B Quad at the home office.<br>©2013 Dominion Enterprises | All rights reserved.<br><br></small>
</div>
</body>
</html>
| <!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width; initial-scale=1.0" />
<title>{% block title %}{% endblock %} - 12Bcon</title>
<link href="//netdna.bootstrapcdn.com/bootswatch/3.0.0/cyborg/bootstrap.min.css" rel="stylesheet">
<link href="css/styles.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-default" role="navigation">
<ul class="nav navbar-nav navbar-right">
<li {% if page == "home" %} class="active" {% endif %}><a href="/">Home</a></li>
<li {% if page == "cfp" %} class="active" {% endif %}><a href="cfp">Call for Papers</a></li>
</ul>
</nav>
<div class="container">
{% if flash.error %}
<div class="alert alert-danger">{{ flash.error }}</div>
{% endif %}
{% if flash.success %}
<div class="alert alert-success">{{ flash.success }}</div>
{% endif %}
</div>
<a href="#" class="logo"><img src="images/12Bcon-blue-small.png"></a>
<div class="container">
{% block content %}{% endblock %}
</div>
<div class="footer text-center">
<small>Pesented by the 12-B Quad at the home office.<br>©2013 Dominion Enterprises | All rights reserved.<br><br></small>
</div>
</body>
</html>
| Fix location of flash messages and add success flash. | Fix location of flash messages and add success flash.
| HTML | mit | guywithnose/de-devcon-website,chadicus/12bcon,guywithnose/de-devcon-website,j-mateo/website,j-mateo/website,DE-DevCon/website,DE-DevCon/website,chadicus/12bcon | html | ## Code Before:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width; initial-scale=1.0" />
<title>{% block title %}{% endblock %} - 12Bcon</title>
<link href="//netdna.bootstrapcdn.com/bootswatch/3.0.0/cyborg/bootstrap.min.css" rel="stylesheet">
<link href="css/styles.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-default" role="navigation">
<ul class="nav navbar-nav navbar-right">
<li {% if page == "home" %} class="active" {% endif %}><a href="/">Home</a></li>
<li {% if page == "cfp" %} class="active" {% endif %}><a href="cfp">Call for Papers</a></li>
</ul>
</nav>
<a href="#" class="logo"><img src="images/12Bcon-blue-small.png"></a>
{% if flash.error %}
<div class="alert alert-error">{{ flash.error }}</div>
{% endif %}
<div class="container">
{% block content %}{% endblock %}
</div>
<div class="footer text-center">
<small>Pesented by the 12-B Quad at the home office.<br>©2013 Dominion Enterprises | All rights reserved.<br><br></small>
</div>
</body>
</html>
## Instruction:
Fix location of flash messages and add success flash.
## Code After:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width; initial-scale=1.0" />
<title>{% block title %}{% endblock %} - 12Bcon</title>
<link href="//netdna.bootstrapcdn.com/bootswatch/3.0.0/cyborg/bootstrap.min.css" rel="stylesheet">
<link href="css/styles.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-default" role="navigation">
<ul class="nav navbar-nav navbar-right">
<li {% if page == "home" %} class="active" {% endif %}><a href="/">Home</a></li>
<li {% if page == "cfp" %} class="active" {% endif %}><a href="cfp">Call for Papers</a></li>
</ul>
</nav>
<div class="container">
{% if flash.error %}
<div class="alert alert-danger">{{ flash.error }}</div>
{% endif %}
{% if flash.success %}
<div class="alert alert-success">{{ flash.success }}</div>
{% endif %}
</div>
<a href="#" class="logo"><img src="images/12Bcon-blue-small.png"></a>
<div class="container">
{% block content %}{% endblock %}
</div>
<div class="footer text-center">
<small>Pesented by the 12-B Quad at the home office.<br>©2013 Dominion Enterprises | All rights reserved.<br><br></small>
</div>
</body>
</html>
| <!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width; initial-scale=1.0" />
<title>{% block title %}{% endblock %} - 12Bcon</title>
<link href="//netdna.bootstrapcdn.com/bootswatch/3.0.0/cyborg/bootstrap.min.css" rel="stylesheet">
<link href="css/styles.css" rel="stylesheet">
</head>
<body>
<nav class="navbar navbar-default" role="navigation">
<ul class="nav navbar-nav navbar-right">
<li {% if page == "home" %} class="active" {% endif %}><a href="/">Home</a></li>
<li {% if page == "cfp" %} class="active" {% endif %}><a href="cfp">Call for Papers</a></li>
</ul>
</nav>
+ <div class="container">
+ {% if flash.error %}
+ <div class="alert alert-danger">{{ flash.error }}</div>
+ {% endif %}
+ {% if flash.success %}
+ <div class="alert alert-success">{{ flash.success }}</div>
+ {% endif %}
+ </div>
<a href="#" class="logo"><img src="images/12Bcon-blue-small.png"></a>
- {% if flash.error %}
- <div class="alert alert-error">{{ flash.error }}</div>
- {% endif %}
<div class="container">
{% block content %}{% endblock %}
</div>
<div class="footer text-center">
<small>Pesented by the 12-B Quad at the home office.<br>©2013 Dominion Enterprises | All rights reserved.<br><br></small>
</div>
</body>
</html> | 11 | 0.392857 | 8 | 3 |
759ceb094582838d4a5c2fc60282c1f408837dc4 | src/main/java/engine/objects/tower/TowerBehaviorDecorator.java | src/main/java/engine/objects/tower/TowerBehaviorDecorator.java | package main.java.engine.objects.tower;
import main.java.engine.EnvironmentKnowledge;
abstract class TowerBehaviorDecorator implements ITower {
/**
* The base tower will have behaviors added to it ("decorations")
*/
protected ITower baseTower;
public TowerBehaviorDecorator (ITower baseTower) {
this.baseTower = baseTower;
}
@Override
public boolean atInterval (int intervalFrequency) {
return baseTower.atInterval(intervalFrequency);
}
@Override
public double getXCoordinate () {
return baseTower.getXCoordinate();
}
@Override
public double getYCoordinate () {
return baseTower.getYCoordinate();
}
@Override
public double getCost () {
return baseTower.getCost();
}
@Override
public void remove () {
baseTower.remove();
}
@Override
public boolean callTowerActions (EnvironmentKnowledge environ) {
if (baseTower.callTowerActions(environ)) {
// in addition to base tower's behavior, also do additional behavior
doDecoratedBehavior(environ);
}
return true;
}
abstract void doDecoratedBehavior (EnvironmentKnowledge environ);
}
| package main.java.engine.objects.tower;
import main.java.engine.EnvironmentKnowledge;
abstract class TowerBehaviorDecorator implements ITower {
/**
* The base tower will have behaviors added to it ("decorations")
*/
protected ITower baseTower;
public TowerBehaviorDecorator (ITower baseTower) {
this.baseTower = baseTower;
}
@Override
public boolean atInterval (int intervalFrequency) {
return baseTower.atInterval(intervalFrequency);
}
@Override
public double getXCoordinate () {
return baseTower.getXCoordinate();
}
@Override
public double getYCoordinate () {
return baseTower.getYCoordinate();
}
@Override
public double getCost () {
return baseTower.getCost();
}
@Override
public void remove () {
baseTower.remove();
}
@Override
public boolean callTowerActions (EnvironmentKnowledge environ) {
if (baseTower.callTowerActions(environ)) {
// in addition to base tower's behavior, also do additional behavior
doDecoratedBehavior(environ);
return true;
}
return false;
}
abstract void doDecoratedBehavior (EnvironmentKnowledge environ);
}
| Make decorators obey base tower's build up time | Make decorators obey base tower's build up time
| Java | mit | dennis-park/OOGALoompas,jordanly/oogasalad_OOGALoompas,garysheng/tower-defense-game-engine,kevinkdo/oogasalad_OOGALoompas,codylieu/oogasalad_OOGALoompas,dianwen/oogasalad_TowerDefense,sdh31/tower_defense | java | ## Code Before:
package main.java.engine.objects.tower;
import main.java.engine.EnvironmentKnowledge;
abstract class TowerBehaviorDecorator implements ITower {
/**
* The base tower will have behaviors added to it ("decorations")
*/
protected ITower baseTower;
public TowerBehaviorDecorator (ITower baseTower) {
this.baseTower = baseTower;
}
@Override
public boolean atInterval (int intervalFrequency) {
return baseTower.atInterval(intervalFrequency);
}
@Override
public double getXCoordinate () {
return baseTower.getXCoordinate();
}
@Override
public double getYCoordinate () {
return baseTower.getYCoordinate();
}
@Override
public double getCost () {
return baseTower.getCost();
}
@Override
public void remove () {
baseTower.remove();
}
@Override
public boolean callTowerActions (EnvironmentKnowledge environ) {
if (baseTower.callTowerActions(environ)) {
// in addition to base tower's behavior, also do additional behavior
doDecoratedBehavior(environ);
}
return true;
}
abstract void doDecoratedBehavior (EnvironmentKnowledge environ);
}
## Instruction:
Make decorators obey base tower's build up time
## Code After:
package main.java.engine.objects.tower;
import main.java.engine.EnvironmentKnowledge;
abstract class TowerBehaviorDecorator implements ITower {
/**
* The base tower will have behaviors added to it ("decorations")
*/
protected ITower baseTower;
public TowerBehaviorDecorator (ITower baseTower) {
this.baseTower = baseTower;
}
@Override
public boolean atInterval (int intervalFrequency) {
return baseTower.atInterval(intervalFrequency);
}
@Override
public double getXCoordinate () {
return baseTower.getXCoordinate();
}
@Override
public double getYCoordinate () {
return baseTower.getYCoordinate();
}
@Override
public double getCost () {
return baseTower.getCost();
}
@Override
public void remove () {
baseTower.remove();
}
@Override
public boolean callTowerActions (EnvironmentKnowledge environ) {
if (baseTower.callTowerActions(environ)) {
// in addition to base tower's behavior, also do additional behavior
doDecoratedBehavior(environ);
return true;
}
return false;
}
abstract void doDecoratedBehavior (EnvironmentKnowledge environ);
}
| package main.java.engine.objects.tower;
import main.java.engine.EnvironmentKnowledge;
abstract class TowerBehaviorDecorator implements ITower {
/**
* The base tower will have behaviors added to it ("decorations")
*/
protected ITower baseTower;
public TowerBehaviorDecorator (ITower baseTower) {
this.baseTower = baseTower;
}
@Override
public boolean atInterval (int intervalFrequency) {
return baseTower.atInterval(intervalFrequency);
}
@Override
public double getXCoordinate () {
return baseTower.getXCoordinate();
}
@Override
public double getYCoordinate () {
return baseTower.getYCoordinate();
}
@Override
public double getCost () {
return baseTower.getCost();
}
@Override
public void remove () {
baseTower.remove();
}
@Override
public boolean callTowerActions (EnvironmentKnowledge environ) {
if (baseTower.callTowerActions(environ)) {
// in addition to base tower's behavior, also do additional behavior
doDecoratedBehavior(environ);
+ return true;
}
- return true;
? ^^^
+ return false;
? ^^^^
}
abstract void doDecoratedBehavior (EnvironmentKnowledge environ);
} | 3 | 0.058824 | 2 | 1 |
21e5ee2ad250c313ea0eb2c67c3d3cc32661d24d | examples/benchmarking/server.py | examples/benchmarking/server.py | import asyncore,time,signal,sys
from secure_smtpd import SMTPServer, FakeCredentialValidator
class SecureSMTPServer(SMTPServer):
def __init__(self):
pass
def process_message(self, peer, mailfrom, rcpttos, message_data):
pass
def start(self):
SMTPServer.__init__(
self,
('0.0.0.0', 25),
None
)
asyncore.loop()
server = SecureSMTPServer()
server.start()
# normal termination of this process will kill worker children in
# process pool so this process (the parent) needs to idle here waiting
# for termination signal. If you don't have a signal handler, then
# Python multiprocess cleanup stuff doesn't happen, and children won't
# get killed by sending SIGTERM to parent.
def sig_handler(signal,frame):
print "Got signal %s, shutting down." % signal
sys.exit(0)
signal.signal(signal.SIGTERM, sig_handler)
while 1:
time.sleep(1)
| from secure_smtpd import SMTPServer
class SecureSMTPServer(SMTPServer):
def process_message(self, peer, mailfrom, rcpttos, message_data):
pass
server = SecureSMTPServer(('0.0.0.0', 1025), None)
server.run()
| Switch to non-privledged port to make testing easier. | Switch to non-privledged port to make testing easier.
User the new server.run() method.
| Python | isc | bcoe/secure-smtpd | python | ## Code Before:
import asyncore,time,signal,sys
from secure_smtpd import SMTPServer, FakeCredentialValidator
class SecureSMTPServer(SMTPServer):
def __init__(self):
pass
def process_message(self, peer, mailfrom, rcpttos, message_data):
pass
def start(self):
SMTPServer.__init__(
self,
('0.0.0.0', 25),
None
)
asyncore.loop()
server = SecureSMTPServer()
server.start()
# normal termination of this process will kill worker children in
# process pool so this process (the parent) needs to idle here waiting
# for termination signal. If you don't have a signal handler, then
# Python multiprocess cleanup stuff doesn't happen, and children won't
# get killed by sending SIGTERM to parent.
def sig_handler(signal,frame):
print "Got signal %s, shutting down." % signal
sys.exit(0)
signal.signal(signal.SIGTERM, sig_handler)
while 1:
time.sleep(1)
## Instruction:
Switch to non-privledged port to make testing easier.
User the new server.run() method.
## Code After:
from secure_smtpd import SMTPServer
class SecureSMTPServer(SMTPServer):
def process_message(self, peer, mailfrom, rcpttos, message_data):
pass
server = SecureSMTPServer(('0.0.0.0', 1025), None)
server.run()
| + from secure_smtpd import SMTPServer
- import asyncore,time,signal,sys
- from secure_smtpd import SMTPServer, FakeCredentialValidator
class SecureSMTPServer(SMTPServer):
-
- def __init__(self):
- pass
-
def process_message(self, peer, mailfrom, rcpttos, message_data):
pass
+ server = SecureSMTPServer(('0.0.0.0', 1025), None)
- def start(self):
- SMTPServer.__init__(
- self,
- ('0.0.0.0', 25),
- None
- )
- asyncore.loop()
-
- server = SecureSMTPServer()
- server.start()
? --- ^
+ server.run()
? ^^
-
-
- # normal termination of this process will kill worker children in
- # process pool so this process (the parent) needs to idle here waiting
- # for termination signal. If you don't have a signal handler, then
- # Python multiprocess cleanup stuff doesn't happen, and children won't
- # get killed by sending SIGTERM to parent.
-
- def sig_handler(signal,frame):
- print "Got signal %s, shutting down." % signal
- sys.exit(0)
-
- signal.signal(signal.SIGTERM, sig_handler)
-
- while 1:
- time.sleep(1) | 35 | 0.945946 | 3 | 32 |
dca2e6c7b178fb84852e42637eab3856a4153124 | spec/engine_spec.rb | spec/engine_spec.rb | require 'spec_helper'
describe Petroglyph::Engine do
specify "#to_hash" do
engine = Petroglyph::Engine.new('node :whatever => thing')
engine.to_hash({:thing => 'stuff'}).should eq({:whatever => 'stuff'})
end
it "takes a template" do
engine = Petroglyph::Engine.new('node :whatever => "nevermind"')
engine.render.should eq({:whatever => "nevermind"}.to_json)
end
it "takes a block" do
Petroglyph::Engine.new.render do
node :whatever => "nevermind"
end.should eq({:whatever => "nevermind"}.to_json)
end
it "passes the locals" do
engine = Petroglyph::Engine.new('node :whatever => thing')
engine.render(nil, {:thing => 'stuff'}).should eq({:whatever => 'stuff'}.to_json)
end
end
| require 'spec_helper'
describe Petroglyph::Engine do
specify "#to_hash" do
engine = Petroglyph::Engine.new('node :beverage => drink')
engine.to_hash({:drink => 'espresso'}).should eq({:beverage => 'espresso'})
end
it "takes a template" do
engine = Petroglyph::Engine.new('node :beverage => "no, thanks"')
engine.render.should eq({:beverage => "no, thanks"}.to_json)
end
it "takes a block" do
Petroglyph::Engine.new.render do
node :beverage => "hot chocolate"
end.should eq({:beverage => "hot chocolate"}.to_json)
end
it "passes the locals" do
engine = Petroglyph::Engine.new('node :beverage => drink')
engine.render(nil, {:drink => 'bai mu cha'}).should eq({:beverage => 'bai mu cha'}.to_json)
end
end
| Use different test data in each engine test. | Use different test data in each engine test.
| Ruby | mit | kytrinyx/petroglyph | ruby | ## Code Before:
require 'spec_helper'
describe Petroglyph::Engine do
specify "#to_hash" do
engine = Petroglyph::Engine.new('node :whatever => thing')
engine.to_hash({:thing => 'stuff'}).should eq({:whatever => 'stuff'})
end
it "takes a template" do
engine = Petroglyph::Engine.new('node :whatever => "nevermind"')
engine.render.should eq({:whatever => "nevermind"}.to_json)
end
it "takes a block" do
Petroglyph::Engine.new.render do
node :whatever => "nevermind"
end.should eq({:whatever => "nevermind"}.to_json)
end
it "passes the locals" do
engine = Petroglyph::Engine.new('node :whatever => thing')
engine.render(nil, {:thing => 'stuff'}).should eq({:whatever => 'stuff'}.to_json)
end
end
## Instruction:
Use different test data in each engine test.
## Code After:
require 'spec_helper'
describe Petroglyph::Engine do
specify "#to_hash" do
engine = Petroglyph::Engine.new('node :beverage => drink')
engine.to_hash({:drink => 'espresso'}).should eq({:beverage => 'espresso'})
end
it "takes a template" do
engine = Petroglyph::Engine.new('node :beverage => "no, thanks"')
engine.render.should eq({:beverage => "no, thanks"}.to_json)
end
it "takes a block" do
Petroglyph::Engine.new.render do
node :beverage => "hot chocolate"
end.should eq({:beverage => "hot chocolate"}.to_json)
end
it "passes the locals" do
engine = Petroglyph::Engine.new('node :beverage => drink')
engine.render(nil, {:drink => 'bai mu cha'}).should eq({:beverage => 'bai mu cha'}.to_json)
end
end
| require 'spec_helper'
describe Petroglyph::Engine do
specify "#to_hash" do
- engine = Petroglyph::Engine.new('node :whatever => thing')
? ^^^^ ^^ ^
+ engine = Petroglyph::Engine.new('node :beverage => drink')
? ^ +++ ^^ ^
- engine.to_hash({:thing => 'stuff'}).should eq({:whatever => 'stuff'})
? ^^ ^ ^^^^ ^^^^ ^^^^
+ engine.to_hash({:drink => 'espresso'}).should eq({:beverage => 'espresso'})
? ^^ ^ + ^^^^^^ ^ +++ + ^^^^^^
end
it "takes a template" do
- engine = Petroglyph::Engine.new('node :whatever => "nevermind"')
? ^^^^ ^^^^^^ ^
+ engine = Petroglyph::Engine.new('node :beverage => "no, thanks"')
? ^ +++ ^^^^^^ ^^
- engine.render.should eq({:whatever => "nevermind"}.to_json)
? ^^^^ ^^^^^^ ^
+ engine.render.should eq({:beverage => "no, thanks"}.to_json)
? ^ +++ ^^^^^^ ^^
end
it "takes a block" do
Petroglyph::Engine.new.render do
- node :whatever => "nevermind"
- end.should eq({:whatever => "nevermind"}.to_json)
+ node :beverage => "hot chocolate"
+ end.should eq({:beverage => "hot chocolate"}.to_json)
end
it "passes the locals" do
- engine = Petroglyph::Engine.new('node :whatever => thing')
? ^^^^ ^^ ^
+ engine = Petroglyph::Engine.new('node :beverage => drink')
? ^ +++ ^^ ^
- engine.render(nil, {:thing => 'stuff'}).should eq({:whatever => 'stuff'}.to_json)
? ^^ ^ ^^ ^^ ^^^^ ^^ ^^
+ engine.render(nil, {:drink => 'bai mu cha'}).should eq({:beverage => 'bai mu cha'}.to_json)
? ^^ ^ ^^^^^ ^^^^ ^ +++ ^^^^^ ^^^^
end
end | 16 | 0.666667 | 8 | 8 |
674da1a70d5af88aa0da7d5725680fce022022c6 | requirements.txt | requirements.txt | Flask==0.10.1
Flask-Bootstrap==2.3.2.1
Jinja2==2.7
MarkupSafe==0.18
Werkzeug==0.9.1
argparse==1.2.1
gunicorn==18.0
icalendar==3.5
itsdangerous==0.22
python-dateutil==2.1
pytz==2013b
pyxdg==0.25
six==1.3.0
wsgiref==0.1.2
setuptools==3.6 | Flask==0.10.1
Flask-Bootstrap==2.3.2.1
Jinja2==2.7
MarkupSafe==0.18
Werkzeug==0.9.1
argparse==1.2.1
gunicorn==18.0
icalendar==3.5
itsdangerous==0.22
python-dateutil==2.1
pytz==2013b
pyxdg==0.25
six==1.3.0
wsgiref==0.1.2 | Remove setuptools requirement to avoid breaking Heroku | Remove setuptools requirement to avoid breaking Heroku
| Text | agpl-3.0 | palfrey/beeminder-calendar,palfrey/beeminder-calendar | text | ## Code Before:
Flask==0.10.1
Flask-Bootstrap==2.3.2.1
Jinja2==2.7
MarkupSafe==0.18
Werkzeug==0.9.1
argparse==1.2.1
gunicorn==18.0
icalendar==3.5
itsdangerous==0.22
python-dateutil==2.1
pytz==2013b
pyxdg==0.25
six==1.3.0
wsgiref==0.1.2
setuptools==3.6
## Instruction:
Remove setuptools requirement to avoid breaking Heroku
## Code After:
Flask==0.10.1
Flask-Bootstrap==2.3.2.1
Jinja2==2.7
MarkupSafe==0.18
Werkzeug==0.9.1
argparse==1.2.1
gunicorn==18.0
icalendar==3.5
itsdangerous==0.22
python-dateutil==2.1
pytz==2013b
pyxdg==0.25
six==1.3.0
wsgiref==0.1.2 | Flask==0.10.1
Flask-Bootstrap==2.3.2.1
Jinja2==2.7
MarkupSafe==0.18
Werkzeug==0.9.1
argparse==1.2.1
gunicorn==18.0
icalendar==3.5
itsdangerous==0.22
python-dateutil==2.1
pytz==2013b
pyxdg==0.25
six==1.3.0
wsgiref==0.1.2
- setuptools==3.6 | 1 | 0.066667 | 0 | 1 |
ab7b00f852ac9ba423f58f96ee5d3865ea1aca7b | package.json | package.json | {
"name": "dndlib.js",
"version": "1.0.0",
"description": "-",
"main": "gulpfile.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "developer@s2:dndlib.js"
},
"author": "",
"license": "MIT",
"devDependencies": {
"babel-plugin-transform-es2015-modules-umd": "^6.22.0",
"babel-preset-env": "^1.1.8",
"babel-polyfill": "6.23.0",
"gulp": "^3.9.1",
"gulp-autoprefixer": "^3.1.1",
"gulp-add-src": "0.2.0",
"gulp-babel": "^6.1.2",
"gulp-concat": "^2.6.1",
"gulp-concat-sourcemap": "^1.3.1",
"gulp-cssmin": "^0.1.7",
"gulp-jsmin": "^0.1.5",
"gulp-rename": "^1.2.2",
"gulp-sass": "^3.1.0",
"gulp-sourcemaps": "^2.4.0",
"gulp-notify": "3.0.0"
}
}
| {
"name": "dndlib.js",
"version": "1.0.0",
"description": "-",
"main": "gulpfile.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "developer@s2:dndlib.js"
},
"author": "",
"license": "MIT",
"devDependencies": {
"babel-plugin-transform-es2015-modules-umd": "^6.22.0",
"babel-preset-env": "^1.1.8",
"babel-polyfill": "6.23.0",
"gulp": "^3.9.1",
"gulp-autoprefixer": "^3.1.1",
"gulp-add-src": "0.2.0",
"gulp-babel": "^6.1.2",
"gulp-concat": "^2.6.1",
"gulp-concat-sourcemap": "^1.3.1",
"gulp-jsmin": "^0.1.5",
"gulp-rename": "^1.2.2",
"gulp-sourcemaps": "^2.4.0",
"gulp-notify": "3.0.0"
}
}
| Delete sass and other css dependeces | Delete sass and other css dependeces
| JSON | apache-2.0 | Norbytus/PSXhr,Norbytus/PSXhr | json | ## Code Before:
{
"name": "dndlib.js",
"version": "1.0.0",
"description": "-",
"main": "gulpfile.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "developer@s2:dndlib.js"
},
"author": "",
"license": "MIT",
"devDependencies": {
"babel-plugin-transform-es2015-modules-umd": "^6.22.0",
"babel-preset-env": "^1.1.8",
"babel-polyfill": "6.23.0",
"gulp": "^3.9.1",
"gulp-autoprefixer": "^3.1.1",
"gulp-add-src": "0.2.0",
"gulp-babel": "^6.1.2",
"gulp-concat": "^2.6.1",
"gulp-concat-sourcemap": "^1.3.1",
"gulp-cssmin": "^0.1.7",
"gulp-jsmin": "^0.1.5",
"gulp-rename": "^1.2.2",
"gulp-sass": "^3.1.0",
"gulp-sourcemaps": "^2.4.0",
"gulp-notify": "3.0.0"
}
}
## Instruction:
Delete sass and other css dependeces
## Code After:
{
"name": "dndlib.js",
"version": "1.0.0",
"description": "-",
"main": "gulpfile.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "developer@s2:dndlib.js"
},
"author": "",
"license": "MIT",
"devDependencies": {
"babel-plugin-transform-es2015-modules-umd": "^6.22.0",
"babel-preset-env": "^1.1.8",
"babel-polyfill": "6.23.0",
"gulp": "^3.9.1",
"gulp-autoprefixer": "^3.1.1",
"gulp-add-src": "0.2.0",
"gulp-babel": "^6.1.2",
"gulp-concat": "^2.6.1",
"gulp-concat-sourcemap": "^1.3.1",
"gulp-jsmin": "^0.1.5",
"gulp-rename": "^1.2.2",
"gulp-sourcemaps": "^2.4.0",
"gulp-notify": "3.0.0"
}
}
| {
"name": "dndlib.js",
"version": "1.0.0",
"description": "-",
"main": "gulpfile.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "developer@s2:dndlib.js"
},
"author": "",
"license": "MIT",
"devDependencies": {
"babel-plugin-transform-es2015-modules-umd": "^6.22.0",
"babel-preset-env": "^1.1.8",
"babel-polyfill": "6.23.0",
"gulp": "^3.9.1",
"gulp-autoprefixer": "^3.1.1",
"gulp-add-src": "0.2.0",
"gulp-babel": "^6.1.2",
"gulp-concat": "^2.6.1",
"gulp-concat-sourcemap": "^1.3.1",
- "gulp-cssmin": "^0.1.7",
"gulp-jsmin": "^0.1.5",
"gulp-rename": "^1.2.2",
- "gulp-sass": "^3.1.0",
"gulp-sourcemaps": "^2.4.0",
"gulp-notify": "3.0.0"
}
} | 2 | 0.0625 | 0 | 2 |
759ce94c61a17b308a1a5e29f8e20a48a4c129f7 | copy-docs.sh | copy-docs.sh |
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl
sbt stage
cp -R dynaml-core/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core/
cp -R dynaml-pipes/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes/
cp -R dynaml-examples/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples/
cp -R target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl/ |
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl
sbt stage
cp -R dynaml-core/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core/
cp -R dynaml-pipes/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes/
cp -R dynaml-examples/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples/
cp -R target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl/ | Change to doc generation script | Change to doc generation script
| Shell | apache-2.0 | transcendent-ai-labs/DynaML,mandar2812/DynaML,mandar2812/DynaML,transcendent-ai-labs/DynaML,transcendent-ai-labs/DynaML,transcendent-ai-labs/DynaML,transcendent-ai-labs/DynaML | shell | ## Code Before:
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples
mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl
sbt stage
cp -R dynaml-core/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core/
cp -R dynaml-pipes/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes/
cp -R dynaml-examples/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples/
cp -R target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl/
## Instruction:
Change to doc generation script
## Code After:
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples
mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl
sbt stage
cp -R dynaml-core/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core/
cp -R dynaml-pipes/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes/
cp -R dynaml-examples/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples/
cp -R target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl/ |
- mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1
+ mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1
? +++
- mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core
+ mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core
? +++
- mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes
+ mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes
? +++
- mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples
+ mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples
? +++
- mkdir ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl
+ mkdir -p ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl
? +++
sbt stage
cp -R dynaml-core/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-core/
cp -R dynaml-pipes/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-pipes/
cp -R dynaml-examples/target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-examples/
cp -R target/scala-2.11/api/* ../transcendent-ai-labs.github.io/api_docs/DynaML/$1/dynaml-repl/ | 10 | 0.769231 | 5 | 5 |
47148ea851567f1ba5e9b1bec7c60e2ef4096ecb | source/_partials/analytics.blade.php | source/_partials/analytics.blade.php | <script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-100960136-1', 'auto');
ga('send', 'pageview');
</script>
| <!-- Fathom - simple website analytics - https://github.com/usefathom/fathom -->
<script>
(function(f, a, t, h, o, m){
a[h]=a[h]||function(){
(a[h].q=a[h].q||[]).push(arguments)
};
o=f.createElement('script'),
m=f.getElementsByTagName('script')[0];
o.async=1; o.src=t; o.id='fathom-script';
m.parentNode.insertBefore(o,m)
})(document, window, '//fathom.bakerkretzmar.ca/tracker.js', 'fathom');
fathom('set', 'siteId', 'RIUHN');
fathom('trackPageview');
</script>
| Replace GA with Fathom 🙏 | Replace GA with Fathom 🙏
| PHP | mit | bakerkretzmar/bakerkretzmar.ca,bakerkretzmar/bakerkretzmar.ca | php | ## Code Before:
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-100960136-1', 'auto');
ga('send', 'pageview');
</script>
## Instruction:
Replace GA with Fathom 🙏
## Code After:
<!-- Fathom - simple website analytics - https://github.com/usefathom/fathom -->
<script>
(function(f, a, t, h, o, m){
a[h]=a[h]||function(){
(a[h].q=a[h].q||[]).push(arguments)
};
o=f.createElement('script'),
m=f.getElementsByTagName('script')[0];
o.async=1; o.src=t; o.id='fathom-script';
m.parentNode.insertBefore(o,m)
})(document, window, '//fathom.bakerkretzmar.ca/tracker.js', 'fathom');
fathom('set', 'siteId', 'RIUHN');
fathom('trackPageview');
</script>
| + <!-- Fathom - simple website analytics - https://github.com/usefathom/fathom -->
<script>
- (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
- (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
- m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
- })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
-
- ga('create', 'UA-100960136-1', 'auto');
- ga('send', 'pageview');
+ (function(f, a, t, h, o, m){
+ a[h]=a[h]||function(){
+ (a[h].q=a[h].q||[]).push(arguments)
+ };
+ o=f.createElement('script'),
+ m=f.getElementsByTagName('script')[0];
+ o.async=1; o.src=t; o.id='fathom-script';
+ m.parentNode.insertBefore(o,m)
+ })(document, window, '//fathom.bakerkretzmar.ca/tracker.js', 'fathom');
+ fathom('set', 'siteId', 'RIUHN');
+ fathom('trackPageview');
</script> | 19 | 2.111111 | 12 | 7 |
73ef007be2dd6dfe68eea1ca0e5d8c0d02f5eb53 | scripts/Python/modules/CMakeLists.txt | scripts/Python/modules/CMakeLists.txt | if (CMAKE_SYSTEM_NAME MATCHES "Linux" AND NOT __ANDROID_NDK__)
add_subdirectory(readline)
endif()
| check_cxx_compiler_flag("-Wno-macro-redefined"
CXX_SUPPORTS_NO_MACRO_REDEFINED)
if (CXX_SUPPORTS_NO_MACRO_REDEFINED)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wno-macro-redefined")
endif ()
# build the Python readline suppression module only on Linux
if (CMAKE_SYSTEM_NAME MATCHES "Linux" AND NOT __ANDROID_NDK__)
add_subdirectory(readline)
endif()
| Disable a warning for the python modules as the python C API headers trigger this warning. With this, 'ninja' succeeds without warnings for me on Linux. | Disable a warning for the python modules as the python C API headers
trigger this warning. With this, 'ninja' succeeds without warnings for
me on Linux.
git-svn-id: 4c4cc70b1ef44ba2b7963015e681894188cea27e@229096 91177308-0d34-0410-b5e6-96231b3b80d8
| Text | apache-2.0 | apple/swift-lldb,apple/swift-lldb,llvm-mirror/lldb,apple/swift-lldb,llvm-mirror/lldb,llvm-mirror/lldb,apple/swift-lldb,llvm-mirror/lldb,apple/swift-lldb,apple/swift-lldb,llvm-mirror/lldb | text | ## Code Before:
if (CMAKE_SYSTEM_NAME MATCHES "Linux" AND NOT __ANDROID_NDK__)
add_subdirectory(readline)
endif()
## Instruction:
Disable a warning for the python modules as the python C API headers
trigger this warning. With this, 'ninja' succeeds without warnings for
me on Linux.
git-svn-id: 4c4cc70b1ef44ba2b7963015e681894188cea27e@229096 91177308-0d34-0410-b5e6-96231b3b80d8
## Code After:
check_cxx_compiler_flag("-Wno-macro-redefined"
CXX_SUPPORTS_NO_MACRO_REDEFINED)
if (CXX_SUPPORTS_NO_MACRO_REDEFINED)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wno-macro-redefined")
endif ()
# build the Python readline suppression module only on Linux
if (CMAKE_SYSTEM_NAME MATCHES "Linux" AND NOT __ANDROID_NDK__)
add_subdirectory(readline)
endif()
| + check_cxx_compiler_flag("-Wno-macro-redefined"
+ CXX_SUPPORTS_NO_MACRO_REDEFINED)
+ if (CXX_SUPPORTS_NO_MACRO_REDEFINED)
+ set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wno-macro-redefined")
+ endif ()
+
+ # build the Python readline suppression module only on Linux
if (CMAKE_SYSTEM_NAME MATCHES "Linux" AND NOT __ANDROID_NDK__)
add_subdirectory(readline)
endif() | 7 | 2.333333 | 7 | 0 |
dd2ceb00d63447db534899b6ab381c9e92f7f731 | deployments/math124/config/common.yaml | deployments/math124/config/common.yaml | jupyterhub:
auth:
type: google
admin:
users:
# infrastructure
- rylo
- yuvipanda
# instructors
- adhikari
- culler
- daw
- denero
- vinitra
- ramesh_s
# dsep staff
- bacon
- ericvd
- ryanedw
# f18 prob140 gsis
- dcroce
- hsubbaraj
- zhang.j
singleuser:
memory:
guarantee: 1G
limit: 1G
image:
name: gcr.io/ucb-datahub-2018/prob140-user-image
storage:
type: hostPath
| jupyterhub:
auth:
type: google
admin:
users:
# infrastructure
- rylo
- yuvipanda
singleuser:
memory:
guarantee: 256Mi
limit: 1G
image:
name: gcr.io/ucb-datahub-2018/math124-user-image
storage:
type: hostPath
| Set user image path for math124 correctly | Set user image path for math124 correctly
- Change admin ids
- Change request / guaranteed resources
| YAML | bsd-3-clause | berkeley-dsep-infra/datahub,berkeley-dsep-infra/datahub,ryanlovett/datahub,ryanlovett/datahub,berkeley-dsep-infra/datahub,ryanlovett/datahub | yaml | ## Code Before:
jupyterhub:
auth:
type: google
admin:
users:
# infrastructure
- rylo
- yuvipanda
# instructors
- adhikari
- culler
- daw
- denero
- vinitra
- ramesh_s
# dsep staff
- bacon
- ericvd
- ryanedw
# f18 prob140 gsis
- dcroce
- hsubbaraj
- zhang.j
singleuser:
memory:
guarantee: 1G
limit: 1G
image:
name: gcr.io/ucb-datahub-2018/prob140-user-image
storage:
type: hostPath
## Instruction:
Set user image path for math124 correctly
- Change admin ids
- Change request / guaranteed resources
## Code After:
jupyterhub:
auth:
type: google
admin:
users:
# infrastructure
- rylo
- yuvipanda
singleuser:
memory:
guarantee: 256Mi
limit: 1G
image:
name: gcr.io/ucb-datahub-2018/math124-user-image
storage:
type: hostPath
| jupyterhub:
auth:
type: google
admin:
users:
# infrastructure
- rylo
- yuvipanda
- # instructors
- - adhikari
- - culler
- - daw
- - denero
- - vinitra
- - ramesh_s
- # dsep staff
- - bacon
- - ericvd
- - ryanedw
- # f18 prob140 gsis
- - dcroce
- - hsubbaraj
- - zhang.j
singleuser:
memory:
- guarantee: 1G
? ^^
+ guarantee: 256Mi
? ^^^^^
limit: 1G
image:
- name: gcr.io/ucb-datahub-2018/prob140-user-image
? ^^^^ -
+ name: gcr.io/ucb-datahub-2018/math124-user-image
? ^^^^ +
storage:
type: hostPath | 19 | 0.59375 | 2 | 17 |
4502f549f626c9657fdc4144ff461af3e51454b0 | Casks/thunderbird-beta.rb | Casks/thunderbird-beta.rb | cask :v1 => 'thunderbird-beta' do
version '38.0b4'
sha256 '07e8d88b0a1c37b9770f24b35e513fd27eefb7d34dca20236499692573739503'
url "https://download-installer.cdn.mozilla.net/pub/thunderbird/releases/#{version}/mac/en-US/Thunderbird%20#{version}.dmg"
homepage 'https://www.mozilla.org/en-US/thunderbird/all-beta.html'
license :mpl
app 'Thunderbird.app'
end
| cask :v1 => 'thunderbird-beta' do
version '38.0b4'
sha256 '07e8d88b0a1c37b9770f24b35e513fd27eefb7d34dca20236499692573739503'
url "https://download.mozilla.org/?product=thunderbird-#{version}&os=osx&lang=en-US"
homepage 'https://www.mozilla.org/en-US/thunderbird/all-beta.html'
license :mpl
tags :vendor => 'Mozilla'
app 'Thunderbird.app'
end
| Add tag stanza to Thunderbird-beta | Add tag stanza to Thunderbird-beta
Update URL
| Ruby | bsd-2-clause | kstarsinic/homebrew-versions | ruby | ## Code Before:
cask :v1 => 'thunderbird-beta' do
version '38.0b4'
sha256 '07e8d88b0a1c37b9770f24b35e513fd27eefb7d34dca20236499692573739503'
url "https://download-installer.cdn.mozilla.net/pub/thunderbird/releases/#{version}/mac/en-US/Thunderbird%20#{version}.dmg"
homepage 'https://www.mozilla.org/en-US/thunderbird/all-beta.html'
license :mpl
app 'Thunderbird.app'
end
## Instruction:
Add tag stanza to Thunderbird-beta
Update URL
## Code After:
cask :v1 => 'thunderbird-beta' do
version '38.0b4'
sha256 '07e8d88b0a1c37b9770f24b35e513fd27eefb7d34dca20236499692573739503'
url "https://download.mozilla.org/?product=thunderbird-#{version}&os=osx&lang=en-US"
homepage 'https://www.mozilla.org/en-US/thunderbird/all-beta.html'
license :mpl
tags :vendor => 'Mozilla'
app 'Thunderbird.app'
end
| cask :v1 => 'thunderbird-beta' do
version '38.0b4'
sha256 '07e8d88b0a1c37b9770f24b35e513fd27eefb7d34dca20236499692573739503'
- url "https://download-installer.cdn.mozilla.net/pub/thunderbird/releases/#{version}/mac/en-US/Thunderbird%20#{version}.dmg"
+ url "https://download.mozilla.org/?product=thunderbird-#{version}&os=osx&lang=en-US"
homepage 'https://www.mozilla.org/en-US/thunderbird/all-beta.html'
license :mpl
+ tags :vendor => 'Mozilla'
app 'Thunderbird.app'
end | 3 | 0.3 | 2 | 1 |
23aa772cfeb34f0475d9bca01880aaf9af666799 | logstream.gemspec | logstream.gemspec | libpath = File.join(File.dirname(__FILE__), 'lib')
$LOAD_PATH.unshift(libpath) unless $LOAD_PATH.include?(libpath)
Gem::Specification.new do |s|
s.name = "logstream"
s.version = "0.0.5"
s.date = Time.now.strftime("%Y-%m-%d")
s.author = "Barry Jaspan"
s.email = "barry.jaspan@acquia.com"
s.homepage = "https://github.com/acquia/logstream"
s.summary = "Acquia Logstream tools and library"
s.description = "Logstream is an Acquia service for streaming logs from Acquia Cloud."
s.files = Dir["[A-Z]*", "{bin,etc,lib,test}/**/*"]
s.bindir = "bin"
s.executables = Dir["bin/*"].map { |f| File.basename(f) }.select { |f| f =~ /^[\w\-]+$/ }
s.test_files = Dir["test/**/*"]
s.has_rdoc = false
s.add_runtime_dependency('faye-websocket', ['~> 0.8.0'])
s.add_runtime_dependency('json', ['~> 1.7.7'])
s.add_runtime_dependency('thor', ['~> 0.19.1'])
end
| libpath = File.join(File.dirname(__FILE__), 'lib')
$LOAD_PATH.unshift(libpath) unless $LOAD_PATH.include?(libpath)
Gem::Specification.new do |s|
s.name = "logstream"
s.version = "0.0.6"
s.date = Time.now.strftime("%Y-%m-%d")
s.author = "Barry Jaspan"
s.email = "barry.jaspan@acquia.com"
s.homepage = "https://github.com/acquia/logstream"
s.licenses = ['MIT']
s.summary = "Acquia Logstream tools and library"
s.description = "Logstream is an Acquia service for streaming logs from Acquia Cloud."
s.files = Dir["[A-Z]*", "{bin,etc,lib,test}/**/*"]
s.bindir = "bin"
s.executables = Dir["bin/*"].map { |f| File.basename(f) }.select { |f| f =~ /^[\w\-]+$/ }
s.test_files = Dir["test/**/*"]
s.has_rdoc = false
s.add_runtime_dependency('faye-websocket', ['~> 0.8.0'])
s.add_runtime_dependency('json', ['~> 1.7.7'])
s.add_runtime_dependency('thor', ['~> 0.19.1'])
s.required_ruby_version = '>= 1.9.3'
end
| Set minimum ruby version as well as license in gemspec | Set minimum ruby version as well as license in gemspec
| Ruby | mit | jacobbednarz/logstream,acquia/logstream | ruby | ## Code Before:
libpath = File.join(File.dirname(__FILE__), 'lib')
$LOAD_PATH.unshift(libpath) unless $LOAD_PATH.include?(libpath)
Gem::Specification.new do |s|
s.name = "logstream"
s.version = "0.0.5"
s.date = Time.now.strftime("%Y-%m-%d")
s.author = "Barry Jaspan"
s.email = "barry.jaspan@acquia.com"
s.homepage = "https://github.com/acquia/logstream"
s.summary = "Acquia Logstream tools and library"
s.description = "Logstream is an Acquia service for streaming logs from Acquia Cloud."
s.files = Dir["[A-Z]*", "{bin,etc,lib,test}/**/*"]
s.bindir = "bin"
s.executables = Dir["bin/*"].map { |f| File.basename(f) }.select { |f| f =~ /^[\w\-]+$/ }
s.test_files = Dir["test/**/*"]
s.has_rdoc = false
s.add_runtime_dependency('faye-websocket', ['~> 0.8.0'])
s.add_runtime_dependency('json', ['~> 1.7.7'])
s.add_runtime_dependency('thor', ['~> 0.19.1'])
end
## Instruction:
Set minimum ruby version as well as license in gemspec
## Code After:
libpath = File.join(File.dirname(__FILE__), 'lib')
$LOAD_PATH.unshift(libpath) unless $LOAD_PATH.include?(libpath)
Gem::Specification.new do |s|
s.name = "logstream"
s.version = "0.0.6"
s.date = Time.now.strftime("%Y-%m-%d")
s.author = "Barry Jaspan"
s.email = "barry.jaspan@acquia.com"
s.homepage = "https://github.com/acquia/logstream"
s.licenses = ['MIT']
s.summary = "Acquia Logstream tools and library"
s.description = "Logstream is an Acquia service for streaming logs from Acquia Cloud."
s.files = Dir["[A-Z]*", "{bin,etc,lib,test}/**/*"]
s.bindir = "bin"
s.executables = Dir["bin/*"].map { |f| File.basename(f) }.select { |f| f =~ /^[\w\-]+$/ }
s.test_files = Dir["test/**/*"]
s.has_rdoc = false
s.add_runtime_dependency('faye-websocket', ['~> 0.8.0'])
s.add_runtime_dependency('json', ['~> 1.7.7'])
s.add_runtime_dependency('thor', ['~> 0.19.1'])
s.required_ruby_version = '>= 1.9.3'
end
| libpath = File.join(File.dirname(__FILE__), 'lib')
$LOAD_PATH.unshift(libpath) unless $LOAD_PATH.include?(libpath)
Gem::Specification.new do |s|
s.name = "logstream"
- s.version = "0.0.5"
? ^
+ s.version = "0.0.6"
? ^
s.date = Time.now.strftime("%Y-%m-%d")
s.author = "Barry Jaspan"
s.email = "barry.jaspan@acquia.com"
s.homepage = "https://github.com/acquia/logstream"
+
+ s.licenses = ['MIT']
s.summary = "Acquia Logstream tools and library"
s.description = "Logstream is an Acquia service for streaming logs from Acquia Cloud."
s.files = Dir["[A-Z]*", "{bin,etc,lib,test}/**/*"]
s.bindir = "bin"
s.executables = Dir["bin/*"].map { |f| File.basename(f) }.select { |f| f =~ /^[\w\-]+$/ }
s.test_files = Dir["test/**/*"]
s.has_rdoc = false
s.add_runtime_dependency('faye-websocket', ['~> 0.8.0'])
s.add_runtime_dependency('json', ['~> 1.7.7'])
s.add_runtime_dependency('thor', ['~> 0.19.1'])
+
+ s.required_ruby_version = '>= 1.9.3'
end | 6 | 0.24 | 5 | 1 |
cd530d627bb7f675098bfc5512a02fda34c03d64 | puppet/zulip/templates/uwsgi.ini.template.erb | puppet/zulip/templates/uwsgi.ini.template.erb | [uwsgi]
socket=/home/zulip/deployments/uwsgi-socket
module=zproject.wsgi:application
chdir=/home/zulip/deployments/current/
master=true
chmod-socket=700
chown-socket=zulip:zulip
processes=<%= @uwsgi_processes %>
harakiri=20
buffer-size=<%= @uwsgi_buffer_size %>
listen=<%= @uwsgi_listen_backlog_limit %>
post-buffering=4096
env= LANG=en_US.UTF-8
uid=zulip
gid=zulip
| [uwsgi]
socket=/home/zulip/deployments/uwsgi-socket
module=zproject.wsgi:application
chdir=/home/zulip/deployments/current/
master=true
chmod-socket=700
chown-socket=zulip:zulip
processes=<%= @uwsgi_processes %>
harakiri=20
buffer-size=<%= @uwsgi_buffer_size %>
listen=<%= @uwsgi_listen_backlog_limit %>
post-buffering=4096
env= LANG=en_US.UTF-8
uid=zulip
gid=zulip
ignore-sigpipe = true
ignore-write-errors = true
disable-write-exception = true
| Stop generating IOError and SIGPIPE on client close. | uwsgi: Stop generating IOError and SIGPIPE on client close.
Clients that close their socket to nginx suddenly also cause nginx to close
its connection to uwsgi. When uwsgi finishes computing the response,
it thus tries to write to a closed socket, and generates either
IOError or SIGPIPE failures.
Since these are caused by the _client_ closing the connection
suddenly, they are not actionable by the server. At particularly high
volumes, this could represent some sort of server-side failure;
however, this is better detected by examining status codes at the
loadbalancer. nginx uses the error code 499 for this occurrence:
https://httpstatuses.com/499
Stop uwsgi from generating this family of exception entirely, using
configuration for uwsgi[1]; it documents these errors as "(annoying),"
hinting at their general utility."
[1] https://uwsgi-docs.readthedocs.io/en/latest/Options.html#ignore-sigpipe
| HTML+ERB | apache-2.0 | showell/zulip,eeshangarg/zulip,eeshangarg/zulip,rht/zulip,punchagan/zulip,hackerkid/zulip,andersk/zulip,showell/zulip,hackerkid/zulip,brainwane/zulip,zulip/zulip,zulip/zulip,punchagan/zulip,kou/zulip,zulip/zulip,andersk/zulip,hackerkid/zulip,punchagan/zulip,eeshangarg/zulip,kou/zulip,andersk/zulip,brainwane/zulip,eeshangarg/zulip,rht/zulip,eeshangarg/zulip,punchagan/zulip,showell/zulip,kou/zulip,brainwane/zulip,kou/zulip,hackerkid/zulip,andersk/zulip,kou/zulip,rht/zulip,rht/zulip,hackerkid/zulip,andersk/zulip,zulip/zulip,eeshangarg/zulip,kou/zulip,andersk/zulip,brainwane/zulip,showell/zulip,punchagan/zulip,zulip/zulip,zulip/zulip,brainwane/zulip,punchagan/zulip,showell/zulip,kou/zulip,zulip/zulip,showell/zulip,rht/zulip,brainwane/zulip,andersk/zulip,eeshangarg/zulip,rht/zulip,rht/zulip,punchagan/zulip,brainwane/zulip,showell/zulip,hackerkid/zulip,hackerkid/zulip | html+erb | ## Code Before:
[uwsgi]
socket=/home/zulip/deployments/uwsgi-socket
module=zproject.wsgi:application
chdir=/home/zulip/deployments/current/
master=true
chmod-socket=700
chown-socket=zulip:zulip
processes=<%= @uwsgi_processes %>
harakiri=20
buffer-size=<%= @uwsgi_buffer_size %>
listen=<%= @uwsgi_listen_backlog_limit %>
post-buffering=4096
env= LANG=en_US.UTF-8
uid=zulip
gid=zulip
## Instruction:
uwsgi: Stop generating IOError and SIGPIPE on client close.
Clients that close their socket to nginx suddenly also cause nginx to close
its connection to uwsgi. When uwsgi finishes computing the response,
it thus tries to write to a closed socket, and generates either
IOError or SIGPIPE failures.
Since these are caused by the _client_ closing the connection
suddenly, they are not actionable by the server. At particularly high
volumes, this could represent some sort of server-side failure;
however, this is better detected by examining status codes at the
loadbalancer. nginx uses the error code 499 for this occurrence:
https://httpstatuses.com/499
Stop uwsgi from generating this family of exception entirely, using
configuration for uwsgi[1]; it documents these errors as "(annoying),"
hinting at their general utility."
[1] https://uwsgi-docs.readthedocs.io/en/latest/Options.html#ignore-sigpipe
## Code After:
[uwsgi]
socket=/home/zulip/deployments/uwsgi-socket
module=zproject.wsgi:application
chdir=/home/zulip/deployments/current/
master=true
chmod-socket=700
chown-socket=zulip:zulip
processes=<%= @uwsgi_processes %>
harakiri=20
buffer-size=<%= @uwsgi_buffer_size %>
listen=<%= @uwsgi_listen_backlog_limit %>
post-buffering=4096
env= LANG=en_US.UTF-8
uid=zulip
gid=zulip
ignore-sigpipe = true
ignore-write-errors = true
disable-write-exception = true
| [uwsgi]
socket=/home/zulip/deployments/uwsgi-socket
module=zproject.wsgi:application
chdir=/home/zulip/deployments/current/
master=true
chmod-socket=700
chown-socket=zulip:zulip
processes=<%= @uwsgi_processes %>
harakiri=20
buffer-size=<%= @uwsgi_buffer_size %>
listen=<%= @uwsgi_listen_backlog_limit %>
post-buffering=4096
env= LANG=en_US.UTF-8
uid=zulip
gid=zulip
+
+ ignore-sigpipe = true
+ ignore-write-errors = true
+ disable-write-exception = true | 4 | 0.266667 | 4 | 0 |
82480d181f24bf8aff9972d43b938ff815df7b94 | .travis.yml | .travis.yml | language: cpp
os: linux
sudo: required
dist: trusty
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- lcov
matrix:
include:
- compiler: gcc
env: BUILD_TYPE=codecov
- compiler: clang
env: BUILD_TYPE=normal
before_script:
- util/travis_before_script.sh
script:
- make -C build
after_success:
- util/travis_after_success.sh
| language: cpp
os: linux
sudo: required
dist: trusty
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- lcov
matrix:
include:
- compiler: gcc
env: BUILD_TYPE=codecov
- compiler: clang
env: BUILD_TYPE=normal
before_script:
- util/travis_before_script.sh
script:
- make -C build
after_success:
- util/travis_after_success.sh
notifications:
irc:
channels: "chat.freenode.net#banditcpp"
skip_join: true
use_notice: true
template: "%{repository} %{branch} (%{commit} %{commit_subject} -- %{author}): %{message} See %{compare_url}"
| Add IRC notifications for Travis CI | Add IRC notifications for Travis CI
Use one-line notices from outside the #banditcpp channel.
| YAML | mit | joakimkarlsson/bandit,joakimkarlsson/bandit,joakimkarlsson/bandit | yaml | ## Code Before:
language: cpp
os: linux
sudo: required
dist: trusty
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- lcov
matrix:
include:
- compiler: gcc
env: BUILD_TYPE=codecov
- compiler: clang
env: BUILD_TYPE=normal
before_script:
- util/travis_before_script.sh
script:
- make -C build
after_success:
- util/travis_after_success.sh
## Instruction:
Add IRC notifications for Travis CI
Use one-line notices from outside the #banditcpp channel.
## Code After:
language: cpp
os: linux
sudo: required
dist: trusty
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- lcov
matrix:
include:
- compiler: gcc
env: BUILD_TYPE=codecov
- compiler: clang
env: BUILD_TYPE=normal
before_script:
- util/travis_before_script.sh
script:
- make -C build
after_success:
- util/travis_after_success.sh
notifications:
irc:
channels: "chat.freenode.net#banditcpp"
skip_join: true
use_notice: true
template: "%{repository} %{branch} (%{commit} %{commit_subject} -- %{author}): %{message} See %{compare_url}"
| language: cpp
os: linux
sudo: required
dist: trusty
addons:
apt:
sources:
- ubuntu-toolchain-r-test
packages:
- lcov
matrix:
include:
- compiler: gcc
env: BUILD_TYPE=codecov
- compiler: clang
env: BUILD_TYPE=normal
before_script:
- util/travis_before_script.sh
script:
- make -C build
after_success:
- util/travis_after_success.sh
+
+ notifications:
+ irc:
+ channels: "chat.freenode.net#banditcpp"
+ skip_join: true
+ use_notice: true
+ template: "%{repository} %{branch} (%{commit} %{commit_subject} -- %{author}): %{message} See %{compare_url}" | 7 | 0.269231 | 7 | 0 |
2d2e22b8038b893c2bad5ed194cf618b2d27a11c | basic/scripts/strings.js | basic/scripts/strings.js | /**
* No Description
*
* @return LocalStrings
* @author Erik E. Lorenz <erik@tuvero.de>
* @license MIT License
* @see LICENSE
*/
define(function() {
return {
teamsicon: 'teams',
variant: 'Basic',
teamtext: 'Team',
teamstext: 'Teams'
};
});
| /**
* No Description
*
* @return LocalStrings
* @author Erik E. Lorenz <erik@tuvero.de>
* @license MIT License
* @see LICENSE
*/
define(function () {
return {
teamsicon: 'teams',
variant: 'Basic',
teamtext: 'Team',
teamstext: 'Teams',
ranking_points: 'Spielpunkte',
ranking_lostpoints: 'Gegnerpunkte',
ranking_saldo: 'Punktedifferenz',
ranking_short_points: 'P',
ranking_short_lostpoints: 'GP',
ranking_short_saldo: 'PD',
ranking_medium_points: 'Punkte',
ranking_medium_lostpoints: 'Gegnerpunkte',
ranking_medium_saldo: 'Differenz'
};
});
| Use comprehensible names for ranking points | Use comprehensible names for ranking points
| JavaScript | mit | elor/tuvero | javascript | ## Code Before:
/**
* No Description
*
* @return LocalStrings
* @author Erik E. Lorenz <erik@tuvero.de>
* @license MIT License
* @see LICENSE
*/
define(function() {
return {
teamsicon: 'teams',
variant: 'Basic',
teamtext: 'Team',
teamstext: 'Teams'
};
});
## Instruction:
Use comprehensible names for ranking points
## Code After:
/**
* No Description
*
* @return LocalStrings
* @author Erik E. Lorenz <erik@tuvero.de>
* @license MIT License
* @see LICENSE
*/
define(function () {
return {
teamsicon: 'teams',
variant: 'Basic',
teamtext: 'Team',
teamstext: 'Teams',
ranking_points: 'Spielpunkte',
ranking_lostpoints: 'Gegnerpunkte',
ranking_saldo: 'Punktedifferenz',
ranking_short_points: 'P',
ranking_short_lostpoints: 'GP',
ranking_short_saldo: 'PD',
ranking_medium_points: 'Punkte',
ranking_medium_lostpoints: 'Gegnerpunkte',
ranking_medium_saldo: 'Differenz'
};
});
| /**
* No Description
*
* @return LocalStrings
* @author Erik E. Lorenz <erik@tuvero.de>
* @license MIT License
* @see LICENSE
*/
- define(function() {
+ define(function () {
? +
return {
teamsicon: 'teams',
variant: 'Basic',
teamtext: 'Team',
- teamstext: 'Teams'
+ teamstext: 'Teams',
? +
+ ranking_points: 'Spielpunkte',
+ ranking_lostpoints: 'Gegnerpunkte',
+ ranking_saldo: 'Punktedifferenz',
+ ranking_short_points: 'P',
+ ranking_short_lostpoints: 'GP',
+ ranking_short_saldo: 'PD',
+ ranking_medium_points: 'Punkte',
+ ranking_medium_lostpoints: 'Gegnerpunkte',
+ ranking_medium_saldo: 'Differenz'
};
}); | 13 | 0.8125 | 11 | 2 |
80957f319fce73af54ee2473f418407ceff7600f | Casks/dbeaver-enterprise.rb | Casks/dbeaver-enterprise.rb | cask :v1 => 'dbeaver-enterprise' do
version '3.1.1'
if Hardware::CPU.is_32_bit?
sha256 '692bf7ab300f0876ab3dac996e61f0b7c367db307c8190754fbf798dbb38daeb'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86.zip"
else
sha256 '27564968a4fd27dcaca5a9639cd787f63f006cab844017c7d094f922d506922d'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86_64.zip"
end
homepage 'http://dbeaver.jkiss.org/'
license :oss
app 'dbeaver/dbeaver.app'
end
| cask :v1 => 'dbeaver-enterprise' do
version '3.1.2'
if Hardware::CPU.is_32_bit?
sha256 '69e3fbb77f6dfa4039f2535451849ae3811e0430764bd507ecc5df3cd7eb28ba'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86.zip"
else
sha256 'f868445eecae115245b5d0882917e548c1a799c3ea8e6e47c551557d62a857ff'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86_64.zip"
end
homepage 'http://dbeaver.jkiss.org/'
license :oss
app 'dbeaver/dbeaver.app'
end
| Update DBeaver Enterprise to 3.1.2 | Update DBeaver Enterprise to 3.1.2 | Ruby | bsd-2-clause | bric3/homebrew-cask,atsuyim/homebrew-cask,yurrriq/homebrew-cask,yutarody/homebrew-cask,shanonvl/homebrew-cask,skyyuan/homebrew-cask,0rax/homebrew-cask,JikkuJose/homebrew-cask,tangestani/homebrew-cask,diguage/homebrew-cask,daften/homebrew-cask,usami-k/homebrew-cask,moonboots/homebrew-cask,j13k/homebrew-cask,thomanq/homebrew-cask,kostasdizas/homebrew-cask,dieterdemeyer/homebrew-cask,flada-auxv/homebrew-cask,aki77/homebrew-cask,royalwang/homebrew-cask,6uclz1/homebrew-cask,sirodoht/homebrew-cask,danielbayley/homebrew-cask,williamboman/homebrew-cask,xalep/homebrew-cask,klane/homebrew-cask,retrography/homebrew-cask,jgarber623/homebrew-cask,lantrix/homebrew-cask,gyugyu/homebrew-cask,kei-yamazaki/homebrew-cask,stephenwade/homebrew-cask,xight/homebrew-cask,sanchezm/homebrew-cask,kuno/homebrew-cask,JacopKane/homebrew-cask,mgryszko/homebrew-cask,jeroenj/homebrew-cask,mlocher/homebrew-cask,jpmat296/homebrew-cask,kassi/homebrew-cask,tjt263/homebrew-cask,santoshsahoo/homebrew-cask,julienlavergne/homebrew-cask,lvicentesanchez/homebrew-cask,janlugt/homebrew-cask,JoelLarson/homebrew-cask,vin047/homebrew-cask,JikkuJose/homebrew-cask,spruceb/homebrew-cask,xyb/homebrew-cask,kirikiriyamama/homebrew-cask,wayou/homebrew-cask,reelsense/homebrew-cask,0xadada/homebrew-cask,feigaochn/homebrew-cask,christophermanning/homebrew-cask,hovancik/homebrew-cask,markhuber/homebrew-cask,jpodlech/homebrew-cask,vmrob/homebrew-cask,ayohrling/homebrew-cask,mwilmer/homebrew-cask,zchee/homebrew-cask,squid314/homebrew-cask,huanzhang/homebrew-cask,alebcay/homebrew-cask,ywfwj2008/homebrew-cask,ctrevino/homebrew-cask,dustinblackman/homebrew-cask,ptb/homebrew-cask,zerrot/homebrew-cask,bchatard/homebrew-cask,m3nu/homebrew-cask,giannitm/homebrew-cask,andyli/homebrew-cask,jeroenj/homebrew-cask,reitermarkus/homebrew-cask,mjgardner/homebrew-cask,johan/homebrew-cask,coeligena/homebrew-customized,bosr/homebrew-cask,singingwolfboy/homebrew-cask,a-x-/homebrew-cask,nivanchikov/homebrew-cask,shonjir/homebrew-cask,hristozov/homebrew-cask,lieuwex/homebrew-cask,albertico/homebrew-cask,kolomiichenko/homebrew-cask,tangestani/homebrew-cask,Dremora/homebrew-cask,Whoaa512/homebrew-cask,wolflee/homebrew-cask,exherb/homebrew-cask,epardee/homebrew-cask,AndreTheHunter/homebrew-cask,kpearson/homebrew-cask,thii/homebrew-cask,fanquake/homebrew-cask,adelinofaria/homebrew-cask,johnjelinek/homebrew-cask,norio-nomura/homebrew-cask,axodys/homebrew-cask,frapposelli/homebrew-cask,supriyantomaftuh/homebrew-cask,mazehall/homebrew-cask,jtriley/homebrew-cask,gguillotte/homebrew-cask,josa42/homebrew-cask,reitermarkus/homebrew-cask,blainesch/homebrew-cask,garborg/homebrew-cask,esebastian/homebrew-cask,wastrachan/homebrew-cask,remko/homebrew-cask,andrewdisley/homebrew-cask,gabrielizaias/homebrew-cask,fharbe/homebrew-cask,flada-auxv/homebrew-cask,zerrot/homebrew-cask,andrewschleifer/homebrew-cask,tranc99/homebrew-cask,nickpellant/homebrew-cask,ldong/homebrew-cask,winkelsdorf/homebrew-cask,winkelsdorf/homebrew-cask,sjackman/homebrew-cask,andrewschleifer/homebrew-cask,MerelyAPseudonym/homebrew-cask,shoichiaizawa/homebrew-cask,jhowtan/homebrew-cask,shorshe/homebrew-cask,johnste/homebrew-cask,cliffcotino/homebrew-cask,athrunsun/homebrew-cask,enriclluelles/homebrew-cask,lolgear/homebrew-cask,julienlavergne/homebrew-cask,ianyh/homebrew-cask,gmkey/homebrew-cask,bosr/homebrew-cask,blogabe/homebrew-cask,My2ndAngelic/homebrew-cask,norio-nomura/homebrew-cask,adrianchia/homebrew-cask,tolbkni/homebrew-cask,samdoran/homebrew-cask,MisumiRize/homebrew-cask,fazo96/homebrew-cask,claui/homebrew-cask,seanorama/homebrew-cask,n0ts/homebrew-cask,patresi/homebrew-cask,freeslugs/homebrew-cask,af/homebrew-cask,JosephViolago/homebrew-cask,michelegera/homebrew-cask,dspeckhard/homebrew-cask,pacav69/homebrew-cask,mindriot101/homebrew-cask,lvicentesanchez/homebrew-cask,shonjir/homebrew-cask,antogg/homebrew-cask,kostasdizas/homebrew-cask,jacobbednarz/homebrew-cask,rickychilcott/homebrew-cask,scw/homebrew-cask,MerelyAPseudonym/homebrew-cask,nshemonsky/homebrew-cask,scribblemaniac/homebrew-cask,imgarylai/homebrew-cask,sohtsuka/homebrew-cask,jtriley/homebrew-cask,elseym/homebrew-cask,katoquro/homebrew-cask,lucasmezencio/homebrew-cask,nathanielvarona/homebrew-cask,amatos/homebrew-cask,gibsjose/homebrew-cask,renard/homebrew-cask,hellosky806/homebrew-cask,gord1anknot/homebrew-cask,chadcatlett/caskroom-homebrew-cask,adelinofaria/homebrew-cask,Ngrd/homebrew-cask,jeanregisser/homebrew-cask,unasuke/homebrew-cask,otaran/homebrew-cask,doits/homebrew-cask,sanchezm/homebrew-cask,dustinblackman/homebrew-cask,akiomik/homebrew-cask,lalyos/homebrew-cask,wmorin/homebrew-cask,d/homebrew-cask,vmrob/homebrew-cask,samshadwell/homebrew-cask,qnm/homebrew-cask,shanonvl/homebrew-cask,lumaxis/homebrew-cask,guerrero/homebrew-cask,elseym/homebrew-cask,napaxton/homebrew-cask,jacobdam/homebrew-cask,mathbunnyru/homebrew-cask,timsutton/homebrew-cask,antogg/homebrew-cask,dvdoliveira/homebrew-cask,askl56/homebrew-cask,FranklinChen/homebrew-cask,jeanregisser/homebrew-cask,akiomik/homebrew-cask,nrlquaker/homebrew-cask,howie/homebrew-cask,kuno/homebrew-cask,Saklad5/homebrew-cask,helloIAmPau/homebrew-cask,mikem/homebrew-cask,a-x-/homebrew-cask,jeroenseegers/homebrew-cask,RJHsiao/homebrew-cask,crzrcn/homebrew-cask,sgnh/homebrew-cask,arronmabrey/homebrew-cask,dieterdemeyer/homebrew-cask,SamiHiltunen/homebrew-cask,andersonba/homebrew-cask,RogerThiede/homebrew-cask,miccal/homebrew-cask,janlugt/homebrew-cask,casidiablo/homebrew-cask,mwek/homebrew-cask,guylabs/homebrew-cask,psibre/homebrew-cask,lantrix/homebrew-cask,rcuza/homebrew-cask,lalyos/homebrew-cask,MatzFan/homebrew-cask,miccal/homebrew-cask,jmeridth/homebrew-cask,amatos/homebrew-cask,sparrc/homebrew-cask,jamesmlees/homebrew-cask,nathancahill/homebrew-cask,drostron/homebrew-cask,klane/homebrew-cask,miku/homebrew-cask,pacav69/homebrew-cask,cprecioso/homebrew-cask,franklouwers/homebrew-cask,chino/homebrew-cask,ksylvan/homebrew-cask,colindean/homebrew-cask,psibre/homebrew-cask,kronicd/homebrew-cask,jbeagley52/homebrew-cask,csmith-palantir/homebrew-cask,fkrone/homebrew-cask,shorshe/homebrew-cask,bdhess/homebrew-cask,hovancik/homebrew-cask,stephenwade/homebrew-cask,danielbayley/homebrew-cask,paulombcosta/homebrew-cask,af/homebrew-cask,moimikey/homebrew-cask,6uclz1/homebrew-cask,seanorama/homebrew-cask,dictcp/homebrew-cask,codeurge/homebrew-cask,nathanielvarona/homebrew-cask,zmwangx/homebrew-cask,syscrusher/homebrew-cask,gerrypower/homebrew-cask,mrmachine/homebrew-cask,zeusdeux/homebrew-cask,singingwolfboy/homebrew-cask,SentinelWarren/homebrew-cask,bdhess/homebrew-cask,rogeriopradoj/homebrew-cask,blainesch/homebrew-cask,nickpellant/homebrew-cask,Labutin/homebrew-cask,ldong/homebrew-cask,retbrown/homebrew-cask,rogeriopradoj/homebrew-cask,Keloran/homebrew-cask,shoichiaizawa/homebrew-cask,alexg0/homebrew-cask,zorosteven/homebrew-cask,cobyism/homebrew-cask,SamiHiltunen/homebrew-cask,elyscape/homebrew-cask,alebcay/homebrew-cask,mjgardner/homebrew-cask,Labutin/homebrew-cask,julionc/homebrew-cask,mchlrmrz/homebrew-cask,3van/homebrew-cask,kkdd/homebrew-cask,13k/homebrew-cask,ptb/homebrew-cask,mishari/homebrew-cask,Ephemera/homebrew-cask,crzrcn/homebrew-cask,claui/homebrew-cask,uetchy/homebrew-cask,kesara/homebrew-cask,paulbreslin/homebrew-cask,Gasol/homebrew-cask,zhuzihhhh/homebrew-cask,rhendric/homebrew-cask,jrwesolo/homebrew-cask,seanzxx/homebrew-cask,rajiv/homebrew-cask,stonehippo/homebrew-cask,patresi/homebrew-cask,freeslugs/homebrew-cask,otaran/homebrew-cask,squid314/homebrew-cask,kievechua/homebrew-cask,paulbreslin/homebrew-cask,johntrandall/homebrew-cask,fharbe/homebrew-cask,samshadwell/homebrew-cask,cobyism/homebrew-cask,ajbw/homebrew-cas
k,diogodamiani/homebrew-cask,wolflee/homebrew-cask,casidiablo/homebrew-cask,jaredsampson/homebrew-cask,xyb/homebrew-cask,epmatsw/homebrew-cask,kei-yamazaki/homebrew-cask,FredLackeyOfficial/homebrew-cask,dwkns/homebrew-cask,ctrevino/homebrew-cask,usami-k/homebrew-cask,sanyer/homebrew-cask,carlmod/homebrew-cask,xight/homebrew-cask,farmerchris/homebrew-cask,albertico/homebrew-cask,tedski/homebrew-cask,kesara/homebrew-cask,englishm/homebrew-cask,barravi/homebrew-cask,ericbn/homebrew-cask,tsparber/homebrew-cask,mattrobenolt/homebrew-cask,m3nu/homebrew-cask,gerrymiller/homebrew-cask,chuanxd/homebrew-cask,stevenmaguire/homebrew-cask,nathansgreen/homebrew-cask,jen20/homebrew-cask,johndbritton/homebrew-cask,miguelfrde/homebrew-cask,chadcatlett/caskroom-homebrew-cask,joaocc/homebrew-cask,alexg0/homebrew-cask,haha1903/homebrew-cask,Philosoft/homebrew-cask,optikfluffel/homebrew-cask,sysbot/homebrew-cask,mchlrmrz/homebrew-cask,bcaceiro/homebrew-cask,rajiv/homebrew-cask,gyndav/homebrew-cask,underyx/homebrew-cask,mingzhi22/homebrew-cask,pgr0ss/homebrew-cask,caskroom/homebrew-cask,nicolas-brousse/homebrew-cask,larseggert/homebrew-cask,barravi/homebrew-cask,MicTech/homebrew-cask,mattrobenolt/homebrew-cask,renaudguerin/homebrew-cask,tsparber/homebrew-cask,ywfwj2008/homebrew-cask,astorije/homebrew-cask,spruceb/homebrew-cask,jaredsampson/homebrew-cask,codeurge/homebrew-cask,malob/homebrew-cask,jasmas/homebrew-cask,zorosteven/homebrew-cask,kevyau/homebrew-cask,ponychicken/homebrew-customcask,lieuwex/homebrew-cask,yumitsu/homebrew-cask,supriyantomaftuh/homebrew-cask,uetchy/homebrew-cask,vin047/homebrew-cask,ch3n2k/homebrew-cask,slack4u/homebrew-cask,dunn/homebrew-cask,genewoo/homebrew-cask,sebcode/homebrew-cask,franklouwers/homebrew-cask,fazo96/homebrew-cask,nshemonsky/homebrew-cask,Ephemera/homebrew-cask,joaoponceleao/homebrew-cask,kryhear/homebrew-cask,yurrriq/homebrew-cask,dwihn0r/homebrew-cask,Ketouem/homebrew-cask,joaocc/homebrew-cask,inz/homebrew-cask,robertgzr/homebrew-cask,feigao
chn/homebrew-cask,neverfox/homebrew-cask,bsiddiqui/homebrew-cask,colindean/homebrew-cask,joshka/homebrew-cask,hvisage/homebrew-cask,mhubig/homebrew-cask,andrewdisley/homebrew-cask,theoriginalgri/homebrew-cask,yuhki50/homebrew-cask,howie/homebrew-cask,crmne/homebrew-cask,dwkns/homebrew-cask,MicTech/homebrew-cask,cprecioso/homebrew-cask,sirodoht/homebrew-cask,boecko/homebrew-cask,ericbn/homebrew-cask,imgarylai/homebrew-cask,ninjahoahong/homebrew-cask,nrlquaker/homebrew-cask,ksylvan/homebrew-cask,christophermanning/homebrew-cask,jmeridth/homebrew-cask,bendoerr/homebrew-cask,ddm/homebrew-cask,wesen/homebrew-cask,doits/homebrew-cask,githubutilities/homebrew-cask,Amorymeltzer/homebrew-cask,wKovacs64/homebrew-cask,axodys/homebrew-cask,colindunn/homebrew-cask,kpearson/homebrew-cask,jgarber623/homebrew-cask,malob/homebrew-cask,mariusbutuc/homebrew-cask,zhuzihhhh/homebrew-cask,gguillotte/homebrew-cask,cclauss/homebrew-cask,jpodlech/homebrew-cask,moogar0880/homebrew-cask,fkrone/homebrew-cask,renard/homebrew-cask,MichaelPei/homebrew-cask,wKovacs64/homebrew-cask,victorpopkov/homebrew-cask,deiga/homebrew-cask,helloIAmPau/homebrew-cask,zmwangx/homebrew-cask,RickWong/homebrew-cask,larseggert/homebrew-cask,MircoT/homebrew-cask,wickles/homebrew-cask,ericbn/homebrew-cask,coneman/homebrew-cask,Cottser/homebrew-cask,cblecker/homebrew-cask,tmoreira2020/homebrew,mikem/homebrew-cask,imgarylai/homebrew-cask,shoichiaizawa/homebrew-cask,yurikoles/homebrew-cask,tedbundyjr/homebrew-cask,alloy/homebrew-cask,goxberry/homebrew-cask,crmne/homebrew-cask,MoOx/homebrew-cask,astorije/homebrew-cask,kiliankoe/homebrew-cask,xalep/homebrew-cask,dcondrey/homebrew-cask,iAmGhost/homebrew-cask,MisumiRize/homebrew-cask,hackhandslabs/homebrew-cask,aktau/homebrew-cask,flaviocamilo/homebrew-cask,optikfluffel/homebrew-cask,afdnlw/homebrew-cask,enriclluelles/homebrew-cask,remko/homebrew-cask,mauricerkelly/homebrew-cask,danielgomezrico/homebrew-cask,gyndav/homebrew-cask,shonjir/homebrew-cask,Ketouem/homebrew-cask,nig
htscape/homebrew-cask,chrisRidgers/homebrew-cask,scottsuch/homebrew-cask,Fedalto/homebrew-cask,AnastasiaSulyagina/homebrew-cask,aguynamedryan/homebrew-cask,jawshooah/homebrew-cask,kronicd/homebrew-cask,bkono/homebrew-cask,nathansgreen/homebrew-cask,delphinus35/homebrew-cask,vuquoctuan/homebrew-cask,ch3n2k/homebrew-cask,joschi/homebrew-cask,yumitsu/homebrew-cask,pkq/homebrew-cask,LaurentFough/homebrew-cask,boecko/homebrew-cask,chrisfinazzo/homebrew-cask,stevehedrick/homebrew-cask,timsutton/homebrew-cask,Ibuprofen/homebrew-cask,jellyfishcoder/homebrew-cask,hakamadare/homebrew-cask,yurikoles/homebrew-cask,aki77/homebrew-cask,samdoran/homebrew-cask,wmorin/homebrew-cask,napaxton/homebrew-cask,inta/homebrew-cask,caskroom/homebrew-cask,wastrachan/homebrew-cask,robertgzr/homebrew-cask,MoOx/homebrew-cask,opsdev-ws/homebrew-cask,sjackman/homebrew-cask,paour/homebrew-cask,chrisfinazzo/homebrew-cask,sscotth/homebrew-cask,vuquoctuan/homebrew-cask,xakraz/homebrew-cask,renaudguerin/homebrew-cask,drostron/homebrew-cask,lcasey001/homebrew-cask,segiddins/homebrew-cask,hvisage/homebrew-cask,daften/homebrew-cask,a1russell/homebrew-cask,mchlrmrz/homebrew-cask,epmatsw/homebrew-cask,nivanchikov/homebrew-cask,mattfelsen/homebrew-cask,pinut/homebrew-cask,adriweb/homebrew-cask,jalaziz/homebrew-cask,skatsuta/homebrew-cask,Hywan/homebrew-cask,samnung/homebrew-cask,zeusdeux/homebrew-cask,joschi/homebrew-cask,devmynd/homebrew-cask,afh/homebrew-cask,jconley/homebrew-cask,scottsuch/homebrew-cask,johan/homebrew-cask,askl56/homebrew-cask,mjdescy/homebrew-cask,mwean/homebrew-cask,joschi/homebrew-cask,atsuyim/homebrew-cask,tarwich/homebrew-cask,tdsmith/homebrew-cask,robbiethegeek/homebrew-cask,bric3/homebrew-cask,neil-ca-moore/homebrew-cask,mfpierre/homebrew-cask,mwek/homebrew-cask,lukasbestle/homebrew-cask,kingthorin/homebrew-cask,jacobdam/homebrew-cask,dlovitch/homebrew-cask,afh/homebrew-cask,j13k/homebrew-cask,retbrown/homebrew-cask,n8henrie/homebrew-cask,hyuna917/homebrew-cask,mahori/homebrew-cask
,pablote/homebrew-cask,goxberry/homebrew-cask,moimikey/homebrew-cask,miccal/homebrew-cask,santoshsahoo/homebrew-cask,jen20/homebrew-cask,cohei/homebrew-cask,leonmachadowilcox/homebrew-cask,phpwutz/homebrew-cask,diogodamiani/homebrew-cask,moimikey/homebrew-cask,kolomiichenko/homebrew-cask,gord1anknot/homebrew-cask,reitermarkus/homebrew-cask,pkq/homebrew-cask,tjnycum/homebrew-cask,tan9/homebrew-cask,claui/homebrew-cask,qbmiller/homebrew-cask,johnjelinek/homebrew-cask,kryhear/homebrew-cask,aktau/homebrew-cask,farmerchris/homebrew-cask,MatzFan/homebrew-cask,sosedoff/homebrew-cask,gerrymiller/homebrew-cask,cblecker/homebrew-cask,christer155/homebrew-cask,tranc99/homebrew-cask,BahtiyarB/homebrew-cask,mkozjak/homebrew-cask,moonboots/homebrew-cask,kievechua/homebrew-cask,gmkey/homebrew-cask,kirikiriyamama/homebrew-cask,dwihn0r/homebrew-cask,danielbayley/homebrew-cask,mathbunnyru/homebrew-cask,fly19890211/homebrew-cask,kingthorin/homebrew-cask,stigkj/homebrew-caskroom-cask,hackhandslabs/homebrew-cask,lucasmezencio/homebrew-cask,miguelfrde/homebrew-cask,rubenerd/homebrew-cask,FranklinChen/homebrew-cask,hyuna917/homebrew-cask,okket/homebrew-cask,ajbw/homebrew-cask,AndreTheHunter/homebrew-cask,miku/homebrew-cask,mauricerkelly/homebrew-cask,Saklad5/homebrew-cask,MichaelPei/homebrew-cask,adrianchia/homebrew-cask,kteru/homebrew-cask,schneidmaster/homebrew-cask,mariusbutuc/homebrew-cask,segiddins/homebrew-cask,gerrypower/homebrew-cask,yurikoles/homebrew-cask,mattrobenolt/homebrew-cask,Amorymeltzer/homebrew-cask,Bombenleger/homebrew-cask,riyad/homebrew-cask,gwaldo/homebrew-cask,unasuke/homebrew-cask,RogerThiede/homebrew-cask,adrianchia/homebrew-cask,mahori/homebrew-cask,aguynamedryan/homebrew-cask,mazehall/homebrew-cask,lolgear/homebrew-cask,williamboman/homebrew-cask,mfpierre/homebrew-cask,0xadada/homebrew-cask,morganestes/homebrew-cask,coneman/homebrew-cask,lifepillar/homebrew-cask,julionc/homebrew-cask,xtian/homebrew-cask,gilesdring/homebrew-cask,xiongchiamiov/homebrew-cask,Josep
hViolago/homebrew-cask,kongslund/homebrew-cask,esebastian/homebrew-cask,mingzhi22/homebrew-cask,sscotth/homebrew-cask,jangalinski/homebrew-cask,kassi/homebrew-cask,forevergenin/homebrew-cask,ksato9700/homebrew-cask,d/homebrew-cask,kTitan/homebrew-cask,genewoo/homebrew-cask,corbt/homebrew-cask,sysbot/homebrew-cask,arranubels/homebrew-cask,faun/homebrew-cask,mokagio/homebrew-cask,sohtsuka/homebrew-cask,bgandon/homebrew-cask,retrography/homebrew-cask,coeligena/homebrew-customized,exherb/homebrew-cask,RickWong/homebrew-cask,flaviocamilo/homebrew-cask,Nitecon/homebrew-cask,tjnycum/homebrew-cask,Ngrd/homebrew-cask,perfide/homebrew-cask,jppelteret/homebrew-cask,toonetown/homebrew-cask,nrlquaker/homebrew-cask,afdnlw/homebrew-cask,elnappo/homebrew-cask,elnappo/homebrew-cask,frapposelli/homebrew-cask,jawshooah/homebrew-cask,xyb/homebrew-cask,csmith-palantir/homebrew-cask,jppelteret/homebrew-cask,maxnordlund/homebrew-cask,scottsuch/homebrew-cask,fwiesel/homebrew-cask,nathanielvarona/homebrew-cask,decrement/homebrew-cask,qnm/homebrew-cask,cfillion/homebrew-cask,Keloran/homebrew-cask,markthetech/homebrew-cask,wayou/homebrew-cask,delphinus35/homebrew-cask,devmynd/homebrew-cask,linc01n/homebrew-cask,tdsmith/homebrew-cask,BenjaminHCCarr/homebrew-cask,rajiv/homebrew-cask,Gasol/homebrew-cask,JosephViolago/homebrew-cask,joshka/homebrew-cask,lukeadams/homebrew-cask,nysthee/homebrew-cask,neil-ca-moore/homebrew-cask,nightscape/homebrew-cask,johnste/homebrew-cask,robbiethegeek/homebrew-cask,iamso/homebrew-cask,wesen/homebrew-cask,puffdad/homebrew-cask,athrunsun/homebrew-cask,KosherBacon/homebrew-cask,ebraminio/homebrew-cask,asbachb/homebrew-cask,stephenwade/homebrew-cask,lcasey001/homebrew-cask,timsutton/homebrew-cask,jayshao/homebrew-cask,pgr0ss/homebrew-cask,lauantai/homebrew-cask,catap/homebrew-cask,stonehippo/homebrew-cask,kTitan/homebrew-cask,malford/homebrew-cask,xight/homebrew-cask,Ibuprofen/homebrew-cask,englishm/homebrew-cask,chino/homebrew-cask,bcomnes/homebrew-cask,brianshumate
/homebrew-cask,troyxmccall/homebrew-cask,a1russell/homebrew-cask,boydj/homebrew-cask,jconley/homebrew-cask,BenjaminHCCarr/homebrew-cask,arranubels/homebrew-cask,mathbunnyru/homebrew-cask,haha1903/homebrew-cask,gurghet/homebrew-cask,jacobbednarz/homebrew-cask,Ephemera/homebrew-cask,Nitecon/homebrew-cask,tolbkni/homebrew-cask,FinalDes/homebrew-cask,jonathanwiesel/homebrew-cask,underyx/homebrew-cask,riyad/homebrew-cask,deiga/homebrew-cask,markhuber/homebrew-cask,forevergenin/homebrew-cask,hanxue/caskroom,gilesdring/homebrew-cask,inta/homebrew-cask,nysthee/homebrew-cask,johntrandall/homebrew-cask,Fedalto/homebrew-cask,yutarody/homebrew-cask,sgnh/homebrew-cask,nathancahill/homebrew-cask,tjnycum/homebrew-cask,slack4u/homebrew-cask,pkq/homebrew-cask,kteru/homebrew-cask,bric3/homebrew-cask,wizonesolutions/homebrew-cask,wuman/homebrew-cask,neverfox/homebrew-cask,BahtiyarB/homebrew-cask,cedwardsmedia/homebrew-cask,chrisfinazzo/homebrew-cask,pinut/homebrew-cask,mlocher/homebrew-cask,mahori/homebrew-cask,hakamadare/homebrew-cask,arronmabrey/homebrew-cask,shishi/homebrew-cask,sscotth/homebrew-cask,asbachb/homebrew-cask,decrement/homebrew-cask,dcondrey/homebrew-cask,reelsense/homebrew-cask,taherio/homebrew-cask,joshka/homebrew-cask,huanzhang/homebrew-cask,a1russell/homebrew-cask,blogabe/homebrew-cask,qbmiller/homebrew-cask,lifepillar/homebrew-cask,Dremora/homebrew-cask,scw/homebrew-cask,n8henrie/homebrew-cask,13k/homebrew-cask,FinalDes/homebrew-cask,mrmachine/homebrew-cask,buo/homebrew-cask,scribblemaniac/homebrew-cask,Whoaa512/homebrew-cask,cclauss/homebrew-cask,vigosan/homebrew-cask,koenrh/homebrew-cask,taherio/homebrew-cask,guerrero/homebrew-cask,jalaziz/homebrew-cask,malob/homebrew-cask,illusionfield/homebrew-cask,mwean/homebrew-cask,dspeckhard/homebrew-cask,inz/homebrew-cask,stonehippo/homebrew-cask,jbeagley52/homebrew-cask,sebcode/homebrew-cask,kiliankoe/homebrew-cask,rubenerd/homebrew-cask,tyage/homebrew-cask,kongslund/homebrew-cask,jiashuw/homebrew-cask,kkdd/homebrew-cask
,kamilboratynski/homebrew-cask,stevenmaguire/homebrew-cask,bgandon/homebrew-cask,ohammersmith/homebrew-cask,gyndav/homebrew-cask,ponychicken/homebrew-customcask,iamso/homebrew-cask,pablote/homebrew-cask,jhowtan/homebrew-cask,jellyfishcoder/homebrew-cask,mjgardner/homebrew-cask,lumaxis/homebrew-cask,lauantai/homebrew-cask,ahundt/homebrew-cask,yuhki50/homebrew-cask,bchatard/homebrew-cask,carlmod/homebrew-cask,gibsjose/homebrew-cask,greg5green/homebrew-cask,leipert/homebrew-cask,asins/homebrew-cask,uetchy/homebrew-cask,rickychilcott/homebrew-cask,artdevjs/homebrew-cask,wickles/homebrew-cask,Hywan/homebrew-cask,onlynone/homebrew-cask,hristozov/homebrew-cask,rogeriopradoj/homebrew-cask,tarwich/homebrew-cask,gurghet/homebrew-cask,Philosoft/homebrew-cask,tedski/homebrew-cask,AnastasiaSulyagina/homebrew-cask,bkono/homebrew-cask,mwilmer/homebrew-cask,victorpopkov/homebrew-cask,alexg0/homebrew-cask,kesara/homebrew-cask,greg5green/homebrew-cask,leipert/homebrew-cask,colindunn/homebrew-cask,jrwesolo/homebrew-cask,JacopKane/homebrew-cask,vigosan/homebrew-cask,paour/homebrew-cask,seanzxx/homebrew-cask,andyli/homebrew-cask,cedwardsmedia/homebrew-cask,ksato9700/homebrew-cask,chrisRidgers/homebrew-cask,adriweb/homebrew-cask,guylabs/homebrew-cask,dlovitch/homebrew-cask,markthetech/homebrew-cask,gyugyu/homebrew-cask,skatsuta/homebrew-cask,ahundt/homebrew-cask,Bombenleger/homebrew-cask,kevyau/homebrew-cask,winkelsdorf/homebrew-cask,3van/homebrew-cask,hanxue/caskroom,muan/homebrew-cask,muan/homebrew-cask,boydj/homebrew-cask,CameronGarrett/homebrew-cask,sanyer/homebrew-cask,sanyer/homebrew-cask,maxnordlund/homebrew-cask,alloy/homebrew-cask,chuanxd/homebrew-cask,brianshumate/homebrew-cask,LaurentFough/homebrew-cask,tyage/homebrew-cask,mjdescy/homebrew-cask,n0ts/homebrew-cask,jamesmlees/homebrew-cask,cblecker/homebrew-cask,neverfox/homebrew-cask,optikfluffel/homebrew-cask,andrewdisley/homebrew-cask,ohammersmith/homebrew-cask,phpwutz/homebrew-cask,wizonesolutions/homebrew-cask,samnung/homeb
rew-cask,julionc/homebrew-cask,My2ndAngelic/homebrew-cask,troyxmccall/homebrew-cask,fanquake/homebrew-cask,buo/homebrew-cask,puffdad/homebrew-cask,sparrc/homebrew-cask,CameronGarrett/homebrew-cask,iAmGhost/homebrew-cask,gabrielizaias/homebrew-cask,antogg/homebrew-cask,paulombcosta/homebrew-cask,tangestani/homebrew-cask,mattfelsen/homebrew-cask,okket/homebrew-cask,KosherBacon/homebrew-cask,esebastian/homebrew-cask,wuman/homebrew-cask,fwiesel/homebrew-cask,lukasbestle/homebrew-cask,bsiddiqui/homebrew-cask,cohei/homebrew-cask,blogabe/homebrew-cask,RJHsiao/homebrew-cask,rkJun/homebrew-cask,linc01n/homebrew-cask,JacopKane/homebrew-cask,tmoreira2020/homebrew,bendoerr/homebrew-cask,ahvigil/homebrew-cask,wmorin/homebrew-cask,corbt/homebrew-cask,mgryszko/homebrew-cask,ahvigil/homebrew-cask,thomanq/homebrew-cask,githubutilities/homebrew-cask,wickedsp1d3r/homebrew-cask,opsdev-ws/homebrew-cask,ianyh/homebrew-cask,dvdoliveira/homebrew-cask,tan9/homebrew-cask,faun/homebrew-cask,feniix/homebrew-cask,ftiff/homebrew-cask,hanxue/caskroom,ayohrling/homebrew-cask,onlynone/homebrew-cask,MircoT/homebrew-cask,deiga/homebrew-cask,christer155/homebrew-cask,kamilboratynski/homebrew-cask,moogar0880/homebrew-cask,rkJun/homebrew-cask,dictcp/homebrew-cask,mishari/homebrew-cask,andersonba/homebrew-cask,cliffcotino/homebrew-cask,0rax/homebrew-cask,fly19890211/homebrew-cask,slnovak/homebrew-cask,deanmorin/homebrew-cask,Amorymeltzer/homebrew-cask,cobyism/homebrew-cask,feniix/homebrew-cask,JoelLarson/homebrew-cask,xakraz/homebrew-cask,illusionfield/homebrew-cask,sosedoff/homebrew-cask,leonmachadowilcox/homebrew-cask,scribblemaniac/homebrew-cask,y00rb/homebrew-cask,syscrusher/homebrew-cask,hellosky806/homebrew-cask,Cottser/homebrew-cask,zchee/homebrew-cask,jeroenseegers/homebrew-cask,shishi/homebrew-cask,morganestes/homebrew-cask,FredLackeyOfficial/homebrew-cask,stigkj/homebrew-caskroom-cask,theoriginalgri/homebrew-cask,kingthorin/homebrew-cask,toonetown/homebrew-cask,anbotero/homebrew-cask,diguage/ho
mebrew-cask,tedbundyjr/homebrew-cask,thehunmonkgroup/homebrew-cask,joaoponceleao/homebrew-cask,dunn/homebrew-cask,danielgomezrico/homebrew-cask,coeligena/homebrew-customized,katoquro/homebrew-cask,jiashuw/homebrew-cask,anbotero/homebrew-cask,stevehedrick/homebrew-cask,nicolas-brousse/homebrew-cask,jedahan/homebrew-cask,malford/homebrew-cask,vitorgalvao/homebrew-cask,xiongchiamiov/homebrew-cask,wickedsp1d3r/homebrew-cask,elyscape/homebrew-cask,xcezx/homebrew-cask,asins/homebrew-cask,xtian/homebrew-cask,perfide/homebrew-cask,epardee/homebrew-cask,skyyuan/homebrew-cask,yutarody/homebrew-cask,royalwang/homebrew-cask,SentinelWarren/homebrew-cask,mhubig/homebrew-cask,jgarber623/homebrew-cask,vitorgalvao/homebrew-cask,koenrh/homebrew-cask,mindriot101/homebrew-cask,dictcp/homebrew-cask,jayshao/homebrew-cask,ninjahoahong/homebrew-cask,jedahan/homebrew-cask,mokagio/homebrew-cask,rhendric/homebrew-cask,bcomnes/homebrew-cask,slnovak/homebrew-cask,alebcay/homebrew-cask,ddm/homebrew-cask,tjt263/homebrew-cask,rcuza/homebrew-cask,djakarta-trap/homebrew-myCask,josa42/homebrew-cask,singingwolfboy/homebrew-cask,m3nu/homebrew-cask,xcezx/homebrew-cask,catap/homebrew-cask,thii/homebrew-cask,garborg/homebrew-cask,deanmorin/homebrew-cask,jangalinski/homebrew-cask,giannitm/homebrew-cask,lukeadams/homebrew-cask,paour/homebrew-cask,schneidmaster/homebrew-cask,gwaldo/homebrew-cask,thehunmonkgroup/homebrew-cask,jalaziz/homebrew-cask,jpmat296/homebrew-cask,jonathanwiesel/homebrew-cask,BenjaminHCCarr/homebrew-cask,johndbritton/homebrew-cask,jasmas/homebrew-cask | ruby | ## Code Before:
cask :v1 => 'dbeaver-enterprise' do
version '3.1.1'
if Hardware::CPU.is_32_bit?
sha256 '692bf7ab300f0876ab3dac996e61f0b7c367db307c8190754fbf798dbb38daeb'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86.zip"
else
sha256 '27564968a4fd27dcaca5a9639cd787f63f006cab844017c7d094f922d506922d'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86_64.zip"
end
homepage 'http://dbeaver.jkiss.org/'
license :oss
app 'dbeaver/dbeaver.app'
end
## Instruction:
Update DBeaver Enterprise to 3.1.2
## Code After:
cask :v1 => 'dbeaver-enterprise' do
version '3.1.2'
if Hardware::CPU.is_32_bit?
sha256 '69e3fbb77f6dfa4039f2535451849ae3811e0430764bd507ecc5df3cd7eb28ba'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86.zip"
else
sha256 'f868445eecae115245b5d0882917e548c1a799c3ea8e6e47c551557d62a857ff'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86_64.zip"
end
homepage 'http://dbeaver.jkiss.org/'
license :oss
app 'dbeaver/dbeaver.app'
end
| cask :v1 => 'dbeaver-enterprise' do
- version '3.1.1'
? ^
+ version '3.1.2'
? ^
if Hardware::CPU.is_32_bit?
- sha256 '692bf7ab300f0876ab3dac996e61f0b7c367db307c8190754fbf798dbb38daeb'
+ sha256 '69e3fbb77f6dfa4039f2535451849ae3811e0430764bd507ecc5df3cd7eb28ba'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86.zip"
else
- sha256 '27564968a4fd27dcaca5a9639cd787f63f006cab844017c7d094f922d506922d'
+ sha256 'f868445eecae115245b5d0882917e548c1a799c3ea8e6e47c551557d62a857ff'
url "http://dbeaver.jkiss.org/files/dbeaver-#{version}-ee-macosx.cocoa.x86_64.zip"
end
homepage 'http://dbeaver.jkiss.org/'
license :oss
app 'dbeaver/dbeaver.app'
end | 6 | 0.375 | 3 | 3 |
07ed6089f607dc5fea9c03f15ca3a03d4491f7e9 | .travis.yml | .travis.yml | language: node_js
node_js:
- "lts/*"
cache: yarn
script: yarn test
# We build PRs, but don't trigger separate builds for the PR from the branch.
branches:
only:
- master | language: node_js
node_js:
- "lts/*"
cache: yarn
script:
- yarn test
- if [[ $TRAVIS_PULL_REQUEST == 'false' ]]; then yarn run doc; fi
# We build PRs, but don't trigger separate builds for the PR from the branch.
branches:
only:
- master
| Build docs on push/merge to master. | Build docs on push/merge to master.
| YAML | mit | chriskrycho/true-myth,chriskrycho/true-myth,chriskrycho/true-myth | yaml | ## Code Before:
language: node_js
node_js:
- "lts/*"
cache: yarn
script: yarn test
# We build PRs, but don't trigger separate builds for the PR from the branch.
branches:
only:
- master
## Instruction:
Build docs on push/merge to master.
## Code After:
language: node_js
node_js:
- "lts/*"
cache: yarn
script:
- yarn test
- if [[ $TRAVIS_PULL_REQUEST == 'false' ]]; then yarn run doc; fi
# We build PRs, but don't trigger separate builds for the PR from the branch.
branches:
only:
- master
| language: node_js
node_js:
- "lts/*"
cache: yarn
- script: yarn test
+ script:
+ - yarn test
+ - if [[ $TRAVIS_PULL_REQUEST == 'false' ]]; then yarn run doc; fi
# We build PRs, but don't trigger separate builds for the PR from the branch.
branches:
only:
- master | 4 | 0.363636 | 3 | 1 |
ecbe3381d7deea6ff71afaef02312fa12848ce69 | apps/ide/src/plugins/webida.ide.project-management.run/run-configuration.html | apps/ide/src/plugins/webida.ide.project-management.run/run-configuration.html | <div class="run-configuration-dialog-container">
<div class="run-configuration-dialog">
<div class="run-configuration-dialog-header">
<button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-create-button">Add New Configuration</button>
<button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-delete-button">Delete Configuration</button>
</div>
<div id="run-configuration-content" class="run-configuration-dialog-content" data-dojo-type="dijit.layout.BorderContainer">
<div id="run-configuration-list-contentpane" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'center'">
</div>
<div id="run-configuration-list" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'left', splitter: true">
<ul id="run-configuration-list-tree" class="run-configuration-list-tree"></ul>
</div>
</div>
</div>
</div> <!-- whole pane close -->
| <div class="run-configuration-dialog-container">
<div class="run-configuration-dialog">
<div class="run-configuration-dialog-header">
<button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-create-button" title='Add New Configuration'>Add New Configuration</button>
<button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-delete-button" title='Delete Configuration'>Delete Configuration</button>
</div>
<div id="run-configuration-content" class="run-configuration-dialog-content" data-dojo-type="dijit.layout.BorderContainer">
<div id="run-configuration-list-contentpane" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'center'">
</div>
<div id="run-configuration-list" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'left', splitter: true">
<ul id="run-configuration-list-tree" class="run-configuration-list-tree"></ul>
</div>
</div>
</div>
</div> <!-- whole pane close -->
| Add tooltip [DESC] Add tooltip | [BUGFIX] Add tooltip
[DESC] Add tooltip
| HTML | apache-2.0 | webida/webida-client,gseok/webida-client,JunsikShim/webida-client,gotchazipc/webida-client,minsung-jin/webida-client,wcho/webida-client,changgyun-woo/webida-client,wing9405/webida-client,changgyun/webida-client,gseok/webida-client,changgyun/webida-client,yoonhyung/webida-client,JunsikShim/webida-client,minsung-jin/webida-client,kyungmi/webida-client,webida/webida-client,yoonhyung/webida-client,cimfalab/webida-client,wing9405/webida-client,yoonhyung/webida-client,leechwin/webida-client,webida/webida-client,leechwin/webida-client,changgyun/webida-client,gotchazipc/webida-client,happibum/webida-client,gseok/webida-client,minsung-jin/webida-client,wing9405/webida-client,webida/webida-client,yoonhyung/webida-client,kyungmi/webida-client,minsung-jin/webida-client,webida/webida-client,changgyun-woo/webida-client,changgyun/webida-client,wing9405/webida-client,happibum/webida-client,changgyun-woo/webida-client,hyukmin-kwon/webida-client,JunsikShim/webida-client,cimfalab/webida-client,minsung-jin/webida-client,cimfalab/webida-client,happibum/webida-client,kyungmi/webida-client,JunsikShim/webida-client,changgyun-woo/webida-client,hyukmin-kwon/webida-client,changgyun-woo/webida-client,yoonhyung/webida-client,hyukmin-kwon/webida-client,gseok/webida-client,gotchazipc/webida-client,gotchazipc/webida-client,yoonhyung/webida-client,leechwin/webida-client,hyukmin-kwon/webida-client,wcho/webida-client,leechwin/webida-client,wcho/webida-client,leechwin/webida-client,wcho/webida-client,kyungmi/webida-client,kyungmi/webida-client,gotchazipc/webida-client,minsung-jin/webida-client,wing9405/webida-client,webida/webida-client,gseok/webida-client,wing9405/webida-client,changgyun/webida-client,wcho/webida-client,wcho/webida-client,JunsikShim/webida-client,kyungmi/webida-client,gotchazipc/webida-client,leechwin/webida-client,happibum/webida-client,changgyun/webida-client,hyukmin-kwon/webida-client,cimfalab/webida-client,JunsikShim/webida-client,happibum/webida-client,gseo
k/webida-client,cimfalab/webida-client,changgyun-woo/webida-client,hyukmin-kwon/webida-client,happibum/webida-client,cimfalab/webida-client | html | ## Code Before:
<div class="run-configuration-dialog-container">
<div class="run-configuration-dialog">
<div class="run-configuration-dialog-header">
<button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-create-button">Add New Configuration</button>
<button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-delete-button">Delete Configuration</button>
</div>
<div id="run-configuration-content" class="run-configuration-dialog-content" data-dojo-type="dijit.layout.BorderContainer">
<div id="run-configuration-list-contentpane" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'center'">
</div>
<div id="run-configuration-list" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'left', splitter: true">
<ul id="run-configuration-list-tree" class="run-configuration-list-tree"></ul>
</div>
</div>
</div>
</div> <!-- whole pane close -->
## Instruction:
[BUGFIX] Add tooltip
[DESC] Add tooltip
## Code After:
<div class="run-configuration-dialog-container">
<div class="run-configuration-dialog">
<div class="run-configuration-dialog-header">
<button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-create-button" title='Add New Configuration'>Add New Configuration</button>
<button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-delete-button" title='Delete Configuration'>Delete Configuration</button>
</div>
<div id="run-configuration-content" class="run-configuration-dialog-content" data-dojo-type="dijit.layout.BorderContainer">
<div id="run-configuration-list-contentpane" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'center'">
</div>
<div id="run-configuration-list" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'left', splitter: true">
<ul id="run-configuration-list-tree" class="run-configuration-list-tree"></ul>
</div>
</div>
</div>
</div> <!-- whole pane close -->
| <div class="run-configuration-dialog-container">
<div class="run-configuration-dialog">
<div class="run-configuration-dialog-header">
- <button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-create-button">Add New Configuration</button>
+ <button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-create-button" title='Add New Configuration'>Add New Configuration</button>
? ++++++++++++++++++++++++++++++
- <button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-delete-button">Delete Configuration</button>
+ <button data-dojo-type="dijit/form/Button" type="button" id="run-configuration-delete-button" title='Delete Configuration'>Delete Configuration</button>
? +++++++++++++++++++++++++++++
</div>
<div id="run-configuration-content" class="run-configuration-dialog-content" data-dojo-type="dijit.layout.BorderContainer">
<div id="run-configuration-list-contentpane" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'center'">
</div>
<div id="run-configuration-list" data-dojo-type="dijit/layout/ContentPane" data-dojo-props="region: 'left', splitter: true">
<ul id="run-configuration-list-tree" class="run-configuration-list-tree"></ul>
</div>
</div>
</div>
</div> <!-- whole pane close --> | 4 | 0.266667 | 2 | 2 |
a9fd3a98dfdc6e30aceddbc2af59fbdca53cf37a | packaging/linux/make_deb.sh | packaging/linux/make_deb.sh | license=custom
project="notes-0.9.0"
authorEmail="awesomeness.notes@gmail.com"
# Remove old build
if [ -d "$project" ]; then
printf '%s\n' "Removing old build ($project)"
rm -rf "$project"
fi
# Generate folders
mkdir deb_build
cd deb_build
mkdir $project
cd $project
# Build binary
mkdir build
cd build
qmake -qt5 ../../../../../src/Notes.pro
make -j4
mv notes ../
cd ..
rm -r build
#
# Copy icon & desktop file
cp ../../common/LICENSE license.txt
cp ../../common/notes.png notes.png
cp ../../common/notes.desktop notes.desktop
# Copy debian config to build directory
cp -avr ../../debian debian
# Generate source build & debian package"
dh_make -s -c $license --copyrightfile license.txt -e $authorEmail --createorig
dpkg-buildpackage
| license=custom
project="notes-0.9.0"
authorEmail="awesomeness.notes@gmail.com"
# Remove old build
if [ -d "$project" ]; then
printf '%s\n' "Removing old build ($project)"
rm -rf "$project"
fi
# Ensure that submodules are initialized
git submodule update --init
# Generate folders
mkdir deb_build
cd deb_build
mkdir $project
cd $project
# Build binary
mkdir build
cd build
qmake -qt5 ../../../../../src/Notes.pro
make -j4
mv notes ../
cd ..
rm -r build
#
# Copy icon & desktop file
cp ../../common/LICENSE license.txt
cp ../../common/notes.png notes.png
cp ../../common/notes.desktop notes.desktop
# Copy debian config to build directory
cp -avr ../../debian debian
# Generate source build & debian package"
dh_make -s -c $license --copyrightfile license.txt -e $authorEmail --createorig
dpkg-buildpackage
| Modify script to initialize submodules | Modify script to initialize submodules
| Shell | mpl-2.0 | alex-spataru/notes,dplanella/notes,theshadowx/notes,alex-spataru/notes,alex-spataru/notes,dplanella/notes,geoffschoeman/notes,theshadowx/notes,nuttyartist/notes,alex-spataru/notes,nuttyartist/notes,geoffschoeman/notes | shell | ## Code Before:
license=custom
project="notes-0.9.0"
authorEmail="awesomeness.notes@gmail.com"
# Remove old build
if [ -d "$project" ]; then
printf '%s\n' "Removing old build ($project)"
rm -rf "$project"
fi
# Generate folders
mkdir deb_build
cd deb_build
mkdir $project
cd $project
# Build binary
mkdir build
cd build
qmake -qt5 ../../../../../src/Notes.pro
make -j4
mv notes ../
cd ..
rm -r build
#
# Copy icon & desktop file
cp ../../common/LICENSE license.txt
cp ../../common/notes.png notes.png
cp ../../common/notes.desktop notes.desktop
# Copy debian config to build directory
cp -avr ../../debian debian
# Generate source build & debian package"
dh_make -s -c $license --copyrightfile license.txt -e $authorEmail --createorig
dpkg-buildpackage
## Instruction:
Modify script to initialize submodules
## Code After:
license=custom
project="notes-0.9.0"
authorEmail="awesomeness.notes@gmail.com"
# Remove old build
if [ -d "$project" ]; then
printf '%s\n' "Removing old build ($project)"
rm -rf "$project"
fi
# Ensure that submodules are initialized
git submodule update --init
# Generate folders
mkdir deb_build
cd deb_build
mkdir $project
cd $project
# Build binary
mkdir build
cd build
qmake -qt5 ../../../../../src/Notes.pro
make -j4
mv notes ../
cd ..
rm -r build
#
# Copy icon & desktop file
cp ../../common/LICENSE license.txt
cp ../../common/notes.png notes.png
cp ../../common/notes.desktop notes.desktop
# Copy debian config to build directory
cp -avr ../../debian debian
# Generate source build & debian package"
dh_make -s -c $license --copyrightfile license.txt -e $authorEmail --createorig
dpkg-buildpackage
| license=custom
project="notes-0.9.0"
authorEmail="awesomeness.notes@gmail.com"
# Remove old build
if [ -d "$project" ]; then
printf '%s\n' "Removing old build ($project)"
rm -rf "$project"
fi
+
+ # Ensure that submodules are initialized
+ git submodule update --init
# Generate folders
mkdir deb_build
cd deb_build
mkdir $project
cd $project
# Build binary
mkdir build
cd build
qmake -qt5 ../../../../../src/Notes.pro
make -j4
mv notes ../
cd ..
rm -r build
#
# Copy icon & desktop file
cp ../../common/LICENSE license.txt
cp ../../common/notes.png notes.png
cp ../../common/notes.desktop notes.desktop
# Copy debian config to build directory
cp -avr ../../debian debian
# Generate source build & debian package"
dh_make -s -c $license --copyrightfile license.txt -e $authorEmail --createorig
dpkg-buildpackage | 3 | 0.078947 | 3 | 0 |
18b2433a99590a23c71a2f1c702b0b6c9dec5acf | src/browser/squirrel-update.coffee | src/browser/squirrel-update.coffee | app = require 'app'
ChildProcess = require 'child_process'
fs = require 'fs'
path = require 'path'
updateDotExe = path.resolve(path.dirname(process.execPath), '..', 'Update.exe')
exeName = path.basename(process.execPath)
# Spawn the Update.exe with the given arguments and invoke the callback when
# the command completes.
exports.spawn = (args, callback) ->
updateProcess = ChildProcess.spawn(updateDotExe, args)
stdout = ''
updateProcess.stdout.on 'data', (data) -> stdout += data
error = null
updateProcess.on 'error', (processError) -> error ?= processError
updateProcess.on 'close', (code, signal) ->
error ?= new Error("Command failed: #{signal}") if code isnt 0
error?.code ?= code
error?.stdout ?= stdout
callback(error, stdout)
undefined
# Is the Update.exe installed with Atom?
exports.existsSync = ->
fs.existsSync(updateDotExe)
# Handle squirrel events denoted by --squirrel-* command line arguments.
exports.handleStartupEvent = ->
switch process.argv[1]
when '--squirrel-install', '--squirrel-updated'
exports.spawn ['--createShortcut', exeName], -> app.quit()
spawnUpdateAndQuit('')
true
when '--squirrel-uninstall'
exports.spawn ['--removeShortcut', exeName], -> app.quit()
true
when '--squirrel-obsolete'
app.quit()
true
else
false
| app = require 'app'
ChildProcess = require 'child_process'
fs = require 'fs'
path = require 'path'
updateDotExe = path.resolve(path.dirname(process.execPath), '..', 'Update.exe')
exeName = path.basename(process.execPath)
# Spawn the Update.exe with the given arguments and invoke the callback when
# the command completes.
exports.spawn = (args, callback) ->
updateProcess = ChildProcess.spawn(updateDotExe, args)
stdout = ''
updateProcess.stdout.on 'data', (data) -> stdout += data
error = null
updateProcess.on 'error', (processError) -> error ?= processError
updateProcess.on 'close', (code, signal) ->
error ?= new Error("Command failed: #{signal}") if code isnt 0
error?.code ?= code
error?.stdout ?= stdout
callback(error, stdout)
undefined
# Is the Update.exe installed with Atom?
exports.existsSync = ->
fs.existsSync(updateDotExe)
# Handle squirrel events denoted by --squirrel-* command line arguments.
exports.handleStartupEvent = ->
switch process.argv[1]
when '--squirrel-install', '--squirrel-updated'
exports.spawn ['--createShortcut', exeName], -> app.quit()
true
when '--squirrel-uninstall'
exports.spawn ['--removeShortcut', exeName], -> app.quit()
true
when '--squirrel-obsolete'
app.quit()
true
else
false
| Remove call to deleted function | Remove call to deleted function
| CoffeeScript | mit | pombredanne/atom,Jdesk/atom,FoldingText/atom,vcarrera/atom,kandros/atom,nrodriguez13/atom,Jandersolutions/atom,russlescai/atom,medovob/atom,niklabh/atom,deepfox/atom,dsandstrom/atom,nvoron23/atom,dkfiresky/atom,Ju2ender/atom,gontadu/atom,dsandstrom/atom,hagb4rd/atom,me6iaton/atom,Shekharrajak/atom,chengky/atom,MjAbuz/atom,niklabh/atom,yomybaby/atom,0x73/atom,hakatashi/atom,woss/atom,constanzaurzua/atom,kittens/atom,Ju2ender/atom,gontadu/atom,dkfiresky/atom,Jonekee/atom,bj7/atom,NunoEdgarGub1/atom,dsandstrom/atom,nrodriguez13/atom,beni55/atom,me6iaton/atom,gisenberg/atom,fredericksilva/atom,tmunro/atom,nvoron23/atom,dijs/atom,atom/atom,Jandersolutions/atom,yalexx/atom,rxkit/atom,decaffeinate-examples/atom,Ingramz/atom,vjeux/atom,KENJU/atom,fredericksilva/atom,hharchani/atom,deoxilix/atom,kjav/atom,hakatashi/atom,FoldingText/atom,lisonma/atom,vjeux/atom,Dennis1978/atom,me6iaton/atom,hpham04/atom,PKRoma/atom,fredericksilva/atom,RuiDGoncalves/atom,alexandergmann/atom,vjeux/atom,AdrianVovk/substance-ide,gisenberg/atom,G-Baby/atom,sxgao3001/atom,burodepeper/atom,bencolon/atom,FoldingText/atom,lovesnow/atom,rsvip/aTom,acontreras89/atom,omarhuanca/atom,jlord/atom,ali/atom,sillvan/atom,hagb4rd/atom,BogusCurry/atom,charleswhchan/atom,elkingtonmcb/atom,bsmr-x-script/atom,pombredanne/atom,Jonekee/atom,AlbertoBarrago/atom,Andrey-Pavlov/atom,constanzaurzua/atom,scv119/atom,qiujuer/atom,phord/atom,ironbox360/atom,liuderchi/atom,sekcheong/atom,yalexx/atom,omarhuanca/atom,fedorov/atom,SlimeQ/atom,ali/atom,pombredanne/atom,rxkit/atom,ashneo76/atom,tony612/atom,ashneo76/atom,batjko/atom,jtrose2/atom,mertkahyaoglu/atom,ali/atom,h0dgep0dge/atom,originye/atom,AlisaKiatkongkumthon/atom,crazyquark/atom,targeter21/atom,wiggzz/atom,transcranial/atom,crazyquark/atom,hakatashi/atom,kjav/atom,johnrizzo1/atom,charleswhchan/atom,mrodalgaard/atom,bencolon/atom,Abdillah/atom,harshdattani/atom,me-benni/atom,bencolon/atom,gzzhanghao/atom,stuartquin/atom,john-kelly/atom,Hasimir/atom,gzzhanghao/atom,tony612/atom,pkdevbox/atom,bryonwinger/atom,brumm/atom,gzzhanghao/atom,abcP9110/atom,fang-yufeng/atom,YunchengLiao/atom,ykeisuke/atom,fscherwi/atom,ali/atom,sotayamashita/atom,nvoron23/atom,dannyflax/atom,dijs/atom,ilovezy/atom,acontreras89/atom,vhutheesing/atom,Galactix/atom,matthewclendening/atom,woss/atom,Ingramz/atom,jacekkopecky/atom,harshdattani/atom,stinsonga/atom,GHackAnonymous/atom,Galactix/atom,jeremyramin/atom,splodingsocks/atom,toqz/atom,RuiDGoncalves/atom,Jandersoft/atom,liuxiong332/atom,me6iaton/atom,Neron-X5/atom,dkfiresky/atom,Jandersolutions/atom,alfredxing/atom,brumm/atom,kittens/atom,rjattrill/atom,brumm/atom,john-kelly/atom,devmario/atom,KENJU/atom,Hasimir/atom,gisenberg/atom,nvoron23/atom,DiogoXRP/atom,splodingsocks/atom,johnhaley81/atom,fang-yufeng/atom,sillvan/atom,seedtigo/atom,YunchengLiao/atom,burodepeper/atom,fang-yufeng/atom,dannyflax/atom,Austen-G/BlockBuilder,charleswhchan/atom,alexandergmann/atom,gabrielPeart/atom,rookie125/atom,folpindo/atom,kandros/atom,vcarrera/atom,t9md/atom,lpommers/atom,mostafaeweda/atom,daxlab/atom,ardeshirj/atom,omarhuanca/atom,Jandersolutions/atom,rlugojr/atom,dannyflax/atom,Jdesk/atom,qskycolor/atom,jjz/atom,chfritz/atom,liuderchi/atom,Dennis1978/atom,GHackAnonymous/atom,devmario/atom,ykeisuke/atom,lisonma/atom,jacekkopecky/atom,crazyquark/atom,rmartin/atom,atom/atom,targeter21/atom,vjeux/atom,liuderchi/atom,RobinTec/atom,BogusCurry/atom,ilovezy/atom,g2p/atom,tony612/atom,h0dgep0dge/atom,Shekharrajak/atom,einarmagnus/atom,isghe/atom,ivoadf/atom,deepfox/atom,jlord/atom,Neron-X5/atom,g2p/atom,chengky/atom,russlescai/atom,pengshp/atom,hpham04/atom,bj7/atom,ppamorim/atom,Jandersoft/atom,russlescai/atom,mnquintana/atom,ObviouslyGreen/atom,mrodalgaard/atom,me6iaton/atom,bcoe/atom,n-riesco/atom,lovesnow/atom,isghe/atom,Andrey-Pavlov/atom,scv119/atom,PKRoma/atom,tony612/atom,Sangaroonaom/atom,hagb4rd/atom,isghe/atom,sebmck/atom,davideg/atom,Andrey-Pavlov/atom,SlimeQ/atom,lisonma/atom,jtrose2/atom,amine7536/atom,john-kelly/atom,ReddTea/atom,hharchani/atom,0x73/atom,fscherwi/atom,0x73/atom,rsvip/aTom,Jdesk/atom,dannyflax/atom,001szymon/atom,isghe/atom,acontreras89/atom,ironbox360/atom,MjAbuz/atom,SlimeQ/atom,folpindo/atom,Jdesk/atom,seedtigo/atom,qiujuer/atom,rmartin/atom,t9md/atom,mertkahyaoglu/atom,AlexxNica/atom,Klozz/atom,FIT-CSE2410-A-Bombs/atom,jeremyramin/atom,ezeoleaf/atom,CraZySacX/atom,Galactix/atom,RobinTec/atom,davideg/atom,andrewleverette/atom,Abdillah/atom,toqz/atom,devoncarew/atom,phord/atom,rlugojr/atom,scippio/atom,KENJU/atom,hellendag/atom,devmario/atom,Jandersolutions/atom,nucked/atom,KENJU/atom,davideg/atom,hharchani/atom,boomwaiza/atom,dannyflax/atom,bj7/atom,Jonekee/atom,me-benni/atom,medovob/atom,batjko/atom,sillvan/atom,n-riesco/atom,RobinTec/atom,andrewleverette/atom,fedorov/atom,Jandersolutions/atom,jlord/atom,avdg/atom,FoldingText/atom,ObviouslyGreen/atom,acontreras89/atom,sekcheong/atom,darwin/atom,qskycolor/atom,boomwaiza/atom,medovob/atom,rmartin/atom,sekcheong/atom,fredericksilva/atom,FoldingText/atom,AlisaKiatkongkumthon/atom,jeremyramin/atom,elkingtonmcb/atom,basarat/atom,lovesnow/atom,Austen-G/BlockBuilder,yamhon/atom,codex8/atom,hagb4rd/atom,tanin47/atom,nucked/atom,rjattrill/atom,YunchengLiao/atom,kittens/atom,kevinrenaers/atom,gabrielPeart/atom,chfritz/atom,kdheepak89/atom,Klozz/atom,ivoadf/atom,jjz/atom,lpommers/atom,vjeux/atom,omarhuanca/atom,sillvan/atom,jordanbtucker/atom,yomybaby/atom,tmunro/atom,Galactix/atom,GHackAnonymous/atom,yalexx/atom,einarmagnus/atom,nvoron23/atom,hellendag/atom,ReddTea/atom,BogusCurry/atom,scippio/atom,champagnez/atom,dkfiresky/atom,devmario/atom,Abdillah/atom,oggy/atom,florianb/atom,ralphtheninja/atom,ilovezy/atom,Jandersoft/atom,Arcanemagus/atom,toqz/atom,Huaraz2/atom,yalexx/atom,dsandstrom/atom,fedorov/atom,kaicataldo/atom,pengshp/atom,vinodpanicker/atom,Dennis1978/atom,beni55/atom,liuderchi/atom,hakatashi/atom,rookie125/atom,Shekharrajak/atom,pkdevbox/atom,tisu2tisu/atom,targeter21/atom,Locke23rus/atom,yangchenghu/atom,Austen-G/BlockBuilder,tjkr/atom,nucked/atom,YunchengLiao/atom,wiggzz/atom,synaptek/atom,basarat/atom,fang-yufeng/atom,boomwaiza/atom,gisenberg/atom,ppamorim/atom,Hasimir/atom,sxgao3001/atom,basarat/atom,batjko/atom,florianb/atom,sekcheong/atom,svanharmelen/atom,matthewclendening/atom,ezeoleaf/atom,FIT-CSE2410-A-Bombs/atom,yamhon/atom,G-Baby/atom,FIT-CSE2410-A-Bombs/atom,sebmck/atom,johnhaley81/atom,scippio/atom,andrewleverette/atom,bryonwinger/atom,stinsonga/atom,kevinrenaers/atom,tisu2tisu/atom,MjAbuz/atom,kandros/atom,ObviouslyGreen/atom,vinodpanicker/atom,Ju2ender/atom,gabrielPeart/atom,alfredxing/atom,sxgao3001/atom,prembasumatary/atom,matthewclendening/atom,Jandersoft/atom,h0dgep0dge/atom,qiujuer/atom,jacekkopecky/atom,hagb4rd/atom,basarat/atom,vcarrera/atom,mostafaeweda/atom,Rychard/atom,synaptek/atom,yangchenghu/atom,florianb/atom,ardeshirj/atom,oggy/atom,charleswhchan/atom,MjAbuz/atom,YunchengLiao/atom,palita01/atom,Rodjana/atom,batjko/atom,ppamorim/atom,devoncarew/atom,Ingramz/atom,jjz/atom,alfredxing/atom,isghe/atom,Sangaroonaom/atom,mnquintana/atom,synaptek/atom,mertkahyaoglu/atom,amine7536/atom,scv119/atom,helber/atom,bcoe/atom,mertkahyaoglu/atom,codex8/atom,vcarrera/atom,burodepeper/atom,oggy/atom,johnrizzo1/atom,ali/atom,ivoadf/atom,t9md/atom,Abdillah/atom,ralphtheninja/atom,AlexxNica/atom,omarhuanca/atom,phord/atom,kaicataldo/atom,ReddTea/atom,vinodpanicker/atom,pkdevbox/atom,rmartin/atom,chfritz/atom,ppamorim/atom,sxgao3001/atom,001szymon/atom,qiujuer/atom,champagnez/atom,liuxiong332/atom,Austen-G/BlockBuilder,tmunro/atom,yomybaby/atom,vhutheesing/atom,jacekkopecky/atom,Jdesk/atom,NunoEdgarGub1/atom,ppamorim/atom,sotayamashita/atom,oggy/atom,decaffeinate-examples/atom,kdheepak89/atom,deepfox/atom,FoldingText/atom,efatsi/atom,Andrey-Pavlov/atom,panuchart/atom,0x73/atom,hharchani/atom,Neron-X5/atom,toqz/atom,n-riesco/atom,bolinfest/atom,Shekharrajak/atom,abcP9110/atom,kdheepak89/atom,basarat/atom,kittens/atom,elkingtonmcb/atom,rsvip/aTom,transcranial/atom,daxlab/atom,kaicataldo/atom,daxlab/atom,sxgao3001/atom,paulcbetts/atom,vcarrera/atom,yomybaby/atom,Hasimir/atom,florianb/atom,champagnez/atom,tanin47/atom,Sangaroonaom/atom,jtrose2/atom,DiogoXRP/atom,devoncarew/atom,dkfiresky/atom,RobinTec/atom,001szymon/atom,devoncarew/atom,tjkr/atom,amine7536/atom,florianb/atom,rlugojr/atom,jlord/atom,jordanbtucker/atom,ezeoleaf/atom,qskycolor/atom,panuchart/atom,alexandergmann/atom,bsmr-x-script/atom,qskycolor/atom,davideg/atom,Arcanemagus/atom,splodingsocks/atom,SlimeQ/atom,prembasumatary/atom,anuwat121/atom,vhutheesing/atom,kjav/atom,mostafaeweda/atom,johnhaley81/atom,deepfox/atom,Ju2ender/atom,dijs/atom,yalexx/atom,Jandersoft/atom,n-riesco/atom,Rychard/atom,codex8/atom,avdg/atom,brettle/atom,atom/atom,mnquintana/atom,helber/atom,niklabh/atom,Locke23rus/atom,mostafaeweda/atom,lisonma/atom,kc8wxm/atom,NunoEdgarGub1/atom,john-kelly/atom,chengky/atom,wiggzz/atom,einarmagnus/atom,CraZySacX/atom,cyzn/atom,rmartin/atom,toqz/atom,hpham04/atom,acontreras89/atom,gontadu/atom,kc8wxm/atom,matthewclendening/atom,lovesnow/atom,russlescai/atom,fredericksilva/atom,n-riesco/atom,russlescai/atom,xream/atom,sebmck/atom,codex8/atom,bryonwinger/atom,batjko/atom | coffeescript | ## Code Before:
app = require 'app'
ChildProcess = require 'child_process'
fs = require 'fs'
path = require 'path'
updateDotExe = path.resolve(path.dirname(process.execPath), '..', 'Update.exe')
exeName = path.basename(process.execPath)
# Spawn the Update.exe with the given arguments and invoke the callback when
# the command completes.
exports.spawn = (args, callback) ->
updateProcess = ChildProcess.spawn(updateDotExe, args)
stdout = ''
updateProcess.stdout.on 'data', (data) -> stdout += data
error = null
updateProcess.on 'error', (processError) -> error ?= processError
updateProcess.on 'close', (code, signal) ->
error ?= new Error("Command failed: #{signal}") if code isnt 0
error?.code ?= code
error?.stdout ?= stdout
callback(error, stdout)
undefined
# Is the Update.exe installed with Atom?
exports.existsSync = ->
fs.existsSync(updateDotExe)
# Handle squirrel events denoted by --squirrel-* command line arguments.
exports.handleStartupEvent = ->
switch process.argv[1]
when '--squirrel-install', '--squirrel-updated'
exports.spawn ['--createShortcut', exeName], -> app.quit()
spawnUpdateAndQuit('')
true
when '--squirrel-uninstall'
exports.spawn ['--removeShortcut', exeName], -> app.quit()
true
when '--squirrel-obsolete'
app.quit()
true
else
false
## Instruction:
Remove call to deleted function
## Code After:
app = require 'app'
ChildProcess = require 'child_process'
fs = require 'fs'
path = require 'path'
updateDotExe = path.resolve(path.dirname(process.execPath), '..', 'Update.exe')
exeName = path.basename(process.execPath)
# Spawn the Update.exe with the given arguments and invoke the callback when
# the command completes.
exports.spawn = (args, callback) ->
updateProcess = ChildProcess.spawn(updateDotExe, args)
stdout = ''
updateProcess.stdout.on 'data', (data) -> stdout += data
error = null
updateProcess.on 'error', (processError) -> error ?= processError
updateProcess.on 'close', (code, signal) ->
error ?= new Error("Command failed: #{signal}") if code isnt 0
error?.code ?= code
error?.stdout ?= stdout
callback(error, stdout)
undefined
# Is the Update.exe installed with Atom?
exports.existsSync = ->
fs.existsSync(updateDotExe)
# Handle squirrel events denoted by --squirrel-* command line arguments.
exports.handleStartupEvent = ->
switch process.argv[1]
when '--squirrel-install', '--squirrel-updated'
exports.spawn ['--createShortcut', exeName], -> app.quit()
true
when '--squirrel-uninstall'
exports.spawn ['--removeShortcut', exeName], -> app.quit()
true
when '--squirrel-obsolete'
app.quit()
true
else
false
| app = require 'app'
ChildProcess = require 'child_process'
fs = require 'fs'
path = require 'path'
updateDotExe = path.resolve(path.dirname(process.execPath), '..', 'Update.exe')
exeName = path.basename(process.execPath)
# Spawn the Update.exe with the given arguments and invoke the callback when
# the command completes.
exports.spawn = (args, callback) ->
updateProcess = ChildProcess.spawn(updateDotExe, args)
stdout = ''
updateProcess.stdout.on 'data', (data) -> stdout += data
error = null
updateProcess.on 'error', (processError) -> error ?= processError
updateProcess.on 'close', (code, signal) ->
error ?= new Error("Command failed: #{signal}") if code isnt 0
error?.code ?= code
error?.stdout ?= stdout
callback(error, stdout)
undefined
# Is the Update.exe installed with Atom?
exports.existsSync = ->
fs.existsSync(updateDotExe)
# Handle squirrel events denoted by --squirrel-* command line arguments.
exports.handleStartupEvent = ->
switch process.argv[1]
when '--squirrel-install', '--squirrel-updated'
exports.spawn ['--createShortcut', exeName], -> app.quit()
- spawnUpdateAndQuit('')
true
when '--squirrel-uninstall'
exports.spawn ['--removeShortcut', exeName], -> app.quit()
true
when '--squirrel-obsolete'
app.quit()
true
else
false | 1 | 0.022222 | 0 | 1 |
48aef728aa9975f7ac97a4d7d2cd4dbe36f3c610 | lib/opticon.rb | lib/opticon.rb | class Opticon
attr_reader :docs, :records
def initialize input
docs = records = []
doc = record = []
while line = input.gets
l = line.chomp.split(',')
record = {
:key => l[0],
:volume => l[1],
:path => l[2],
:break => l[3],
:empty1 => l[4],
:empty2 => l[5],
:pages => l[6],
}
if record[:break]
docs.push doc if doc != []
doc = []
end
doc.push record
records.push record
end
doc.push record
docs.push doc if doc != []
@records = records
@docs = docs
end
def beg_end
b = @docs[0][0][:key]
e = @docs[-1][-1][:key]
[b, e]
end
def doc_count
@docs.length
end
def image_count
@records.length
end
def tiff_count
@records.select{|r| r[:path] =~ /\.tif$/}.length
end
def jpeg_count
@records.select{|r| r[:path] =~ /\.jpg$/}.length
end
def volumes
@records.map{|r| r[:volume]}.uniq
end
end
| class Opticon
attr_reader :docs, :records
def initialize input
docs = []
doc = []
records = []
record = {}
while line = input.gets
l = line.chomp.split(',')
record = {
:key => l[0],
:volume => l[1],
:path => l[2],
:break => l[3],
:empty1 => l[4],
:empty2 => l[5],
:pages => l[6],
}
if record[:break]
docs.push doc if doc != []
doc = []
end
doc.push record
records.push record
end
doc.push record
docs.push doc if doc != []
@records = records
@docs = docs
end
def beg_end
b = @docs[0][0][:key]
e = @docs[-1][-1][:key]
[b, e]
end
def doc_count
@docs.length
end
def image_count
@records.length
end
def tiff_count
@records.select{|r| r[:path] =~ /\.tif$/}.length
end
def jpeg_count
@records.select{|r| r[:path] =~ /\.jpg$/}.length
end
def volumes
@records.map{|r| r[:volume]}.uniq
end
end
| Revert "remove extraneous Hash.new call per record" | Revert "remove extraneous Hash.new call per record"
This reverts commit 9a3142f92c567f54b3b8a5b9d1a93dd007b9bdaa.
| Ruby | mit | nbirnel/loadfile,nbirnel/loadfile | ruby | ## Code Before:
class Opticon
attr_reader :docs, :records
def initialize input
docs = records = []
doc = record = []
while line = input.gets
l = line.chomp.split(',')
record = {
:key => l[0],
:volume => l[1],
:path => l[2],
:break => l[3],
:empty1 => l[4],
:empty2 => l[5],
:pages => l[6],
}
if record[:break]
docs.push doc if doc != []
doc = []
end
doc.push record
records.push record
end
doc.push record
docs.push doc if doc != []
@records = records
@docs = docs
end
def beg_end
b = @docs[0][0][:key]
e = @docs[-1][-1][:key]
[b, e]
end
def doc_count
@docs.length
end
def image_count
@records.length
end
def tiff_count
@records.select{|r| r[:path] =~ /\.tif$/}.length
end
def jpeg_count
@records.select{|r| r[:path] =~ /\.jpg$/}.length
end
def volumes
@records.map{|r| r[:volume]}.uniq
end
end
## Instruction:
Revert "remove extraneous Hash.new call per record"
This reverts commit 9a3142f92c567f54b3b8a5b9d1a93dd007b9bdaa.
## Code After:
class Opticon
attr_reader :docs, :records
def initialize input
docs = []
doc = []
records = []
record = {}
while line = input.gets
l = line.chomp.split(',')
record = {
:key => l[0],
:volume => l[1],
:path => l[2],
:break => l[3],
:empty1 => l[4],
:empty2 => l[5],
:pages => l[6],
}
if record[:break]
docs.push doc if doc != []
doc = []
end
doc.push record
records.push record
end
doc.push record
docs.push doc if doc != []
@records = records
@docs = docs
end
def beg_end
b = @docs[0][0][:key]
e = @docs[-1][-1][:key]
[b, e]
end
def doc_count
@docs.length
end
def image_count
@records.length
end
def tiff_count
@records.select{|r| r[:path] =~ /\.tif$/}.length
end
def jpeg_count
@records.select{|r| r[:path] =~ /\.jpg$/}.length
end
def volumes
@records.map{|r| r[:volume]}.uniq
end
end
| class Opticon
attr_reader :docs, :records
def initialize input
+ docs = []
+ doc = []
- docs = records = []
? -------
+ records = []
- doc = record = []
+ record = {}
while line = input.gets
l = line.chomp.split(',')
record = {
:key => l[0],
:volume => l[1],
:path => l[2],
:break => l[3],
:empty1 => l[4],
:empty2 => l[5],
:pages => l[6],
}
if record[:break]
docs.push doc if doc != []
doc = []
end
doc.push record
records.push record
end
doc.push record
docs.push doc if doc != []
@records = records
@docs = docs
end
def beg_end
b = @docs[0][0][:key]
e = @docs[-1][-1][:key]
[b, e]
end
def doc_count
@docs.length
end
def image_count
@records.length
end
def tiff_count
@records.select{|r| r[:path] =~ /\.tif$/}.length
end
def jpeg_count
@records.select{|r| r[:path] =~ /\.jpg$/}.length
end
def volumes
@records.map{|r| r[:volume]}.uniq
end
end | 6 | 0.105263 | 4 | 2 |
2e57823a0e18aa3f532b2c161ade5b0b25433ee3 | README.md | README.md | [](https://circleci.com/gh/geb/workflows/geb/tree/master)
[](https://maven-badges.herokuapp.com/maven-central/org.gebish/geb-core)
Geb (pronounced “jeb”) is a browser automation solution. It brings together the power of WebDriver, the elegance of jQuery content selection, the robustness of Page Object modelling and the expressiveness of the Groovy language.
For more information about the project, see the [http://www.gebish.org](http://www.gebish.org/).
## How to contribute
Please see [CONTRIBUTING.md](https://github.com/geb/geb/blob/master/CONTRIBUTING.md) for contribution guidelines.
## Submitting issues
If you'd like to submit an issue against Geb then please use [the issue tracker for geb/issues Github project](https://github.com/geb/issues/issues).
Please avoid submitting usage questions as issues and use [the mailing lists](http://www.gebish.org/lists) instead. | [](https://circleci.com/gh/geb/workflows/geb/tree/master)
[](https://maven-badges.herokuapp.com/maven-central/org.gebish/geb-core)
Geb (pronounced “jeb”) is a browser automation solution. It brings together the power of WebDriver, the elegance of jQuery content selection, the robustness of Page Object modelling and the expressiveness of the Groovy language.
For more information about the project, see the [http://www.gebish.org](http://www.gebish.org/).
## How to contribute
Please see [CONTRIBUTING.md](https://github.com/geb/geb/blob/master/CONTRIBUTING.md) for contribution guidelines.
## Submitting issues
If you'd like to submit an issue against Geb then please use [the issue tracker for geb/issues Github project](https://github.com/geb/issues/issues).
Please avoid submitting usage questions as issues and use [the user mailing list](https://groups.google.com/forum/#!forum/geb-user) instead. | Fix link to the mailing list. | Fix link to the mailing list.
Closes geb/issues#547.
| Markdown | apache-2.0 | geb/geb,geb/geb | markdown | ## Code Before:
[](https://circleci.com/gh/geb/workflows/geb/tree/master)
[](https://maven-badges.herokuapp.com/maven-central/org.gebish/geb-core)
Geb (pronounced “jeb”) is a browser automation solution. It brings together the power of WebDriver, the elegance of jQuery content selection, the robustness of Page Object modelling and the expressiveness of the Groovy language.
For more information about the project, see the [http://www.gebish.org](http://www.gebish.org/).
## How to contribute
Please see [CONTRIBUTING.md](https://github.com/geb/geb/blob/master/CONTRIBUTING.md) for contribution guidelines.
## Submitting issues
If you'd like to submit an issue against Geb then please use [the issue tracker for geb/issues Github project](https://github.com/geb/issues/issues).
Please avoid submitting usage questions as issues and use [the mailing lists](http://www.gebish.org/lists) instead.
## Instruction:
Fix link to the mailing list.
Closes geb/issues#547.
## Code After:
[](https://circleci.com/gh/geb/workflows/geb/tree/master)
[](https://maven-badges.herokuapp.com/maven-central/org.gebish/geb-core)
Geb (pronounced “jeb”) is a browser automation solution. It brings together the power of WebDriver, the elegance of jQuery content selection, the robustness of Page Object modelling and the expressiveness of the Groovy language.
For more information about the project, see the [http://www.gebish.org](http://www.gebish.org/).
## How to contribute
Please see [CONTRIBUTING.md](https://github.com/geb/geb/blob/master/CONTRIBUTING.md) for contribution guidelines.
## Submitting issues
If you'd like to submit an issue against Geb then please use [the issue tracker for geb/issues Github project](https://github.com/geb/issues/issues).
Please avoid submitting usage questions as issues and use [the user mailing list](https://groups.google.com/forum/#!forum/geb-user) instead. | [](https://circleci.com/gh/geb/workflows/geb/tree/master)
[](https://maven-badges.herokuapp.com/maven-central/org.gebish/geb-core)
Geb (pronounced “jeb”) is a browser automation solution. It brings together the power of WebDriver, the elegance of jQuery content selection, the robustness of Page Object modelling and the expressiveness of the Groovy language.
For more information about the project, see the [http://www.gebish.org](http://www.gebish.org/).
## How to contribute
Please see [CONTRIBUTING.md](https://github.com/geb/geb/blob/master/CONTRIBUTING.md) for contribution guidelines.
## Submitting issues
If you'd like to submit an issue against Geb then please use [the issue tracker for geb/issues Github project](https://github.com/geb/issues/issues).
- Please avoid submitting usage questions as issues and use [the mailing lists](http://www.gebish.org/lists) instead.
? - ^^^ ^ ^^^ -------
+ Please avoid submitting usage questions as issues and use [the user mailing list](https://groups.google.com/forum/#!forum/geb-user) instead.
? +++++ + ^^^^^^ +++++++++++++++++++++++++ ^^ ^
| 2 | 0.133333 | 1 | 1 |
798bb6f898a3864868be1c85755dc3556d602677 | app/scripts/components/experts/extensions/workspace-selector.html | app/scripts/components/experts/extensions/workspace-selector.html | <i
class="fa fa-vcard"
ng-if="organization.is_expert_provider"
uib-tooltip="{{ 'This organization is an expert provider.' | translate }}"
visible-if="'experts'">
</i>
| <i
class="fa fa-vcard"
ng-if="organization.is_expert_provider"
uib-tooltip="{{ 'This organization is an expert provider.' | translate }}"
tooltip-placement="bottom"
visible-if="'experts'">
</i>
| Improve tooltip placement for expert provider indicator in the workspace selector toggle. | Improve tooltip placement for expert provider indicator in the workspace selector toggle.
| HTML | mit | opennode/waldur-homeport,opennode/waldur-homeport,opennode/waldur-homeport,opennode/waldur-homeport | html | ## Code Before:
<i
class="fa fa-vcard"
ng-if="organization.is_expert_provider"
uib-tooltip="{{ 'This organization is an expert provider.' | translate }}"
visible-if="'experts'">
</i>
## Instruction:
Improve tooltip placement for expert provider indicator in the workspace selector toggle.
## Code After:
<i
class="fa fa-vcard"
ng-if="organization.is_expert_provider"
uib-tooltip="{{ 'This organization is an expert provider.' | translate }}"
tooltip-placement="bottom"
visible-if="'experts'">
</i>
| <i
class="fa fa-vcard"
ng-if="organization.is_expert_provider"
uib-tooltip="{{ 'This organization is an expert provider.' | translate }}"
+ tooltip-placement="bottom"
visible-if="'experts'">
</i> | 1 | 0.166667 | 1 | 0 |
d0a1ba04e21b7f936965fb99948048545a79ba62 | moor/tool/run_tests.dart | moor/tool/run_tests.dart | import 'dart:io';
/// Runs all test files individually.
///
/// For some reason, coverage collection times out when running them all at once
/// with null safety enabled, so this is what we have to do now.
Future<void> main() async {
final directory = Directory('test');
var hadFailures = false;
await for (final entity in directory.list(recursive: true)) {
if (entity is! File || !entity.path.endsWith('_test.dart')) continue;
final process = await Process.start(
'/home/simon/bin/dart-sdk/beta/bin/dart', [
'--no-sound-null-safety',
'run',
'test',
'--coverage=coverage',
entity.path
]);
await Future.wait(
[stdout.addStream(process.stdout), stderr.addStream(process.stderr)]);
final exitCode = await process.exitCode;
if (exitCode != 0) {
hadFailures = true;
}
}
exit(hadFailures ? 1 : 0);
}
| import 'dart:io';
/// Runs all test files individually.
///
/// For some reason, coverage collection times out when running them all at once
/// with null safety enabled, so this is what we have to do now.
Future<void> main() async {
final directory = Directory('test');
var hadFailures = false;
await for (final entity in directory.list(recursive: true)) {
if (entity is! File || !entity.path.endsWith('_test.dart')) continue;
final process = await Process.start('dart', [
'--no-sound-null-safety',
'run',
'test',
'--coverage=coverage',
entity.path
]);
await Future.wait(
[stdout.addStream(process.stdout), stderr.addStream(process.stderr)]);
final exitCode = await process.exitCode;
if (exitCode != 0) {
hadFailures = true;
}
}
exit(hadFailures ? 1 : 0);
}
| Use right Dart binary for tests, ugh | Use right Dart binary for tests, ugh
| Dart | mit | simolus3/drift,simolus3/moor,simolus3/drift,simolus3/moor,simolus3/moor,simolus3/drift,simolus3/drift,simolus3/drift,simolus3/moor,simolus3/moor,simolus3/moor,simolus3/drift,simolus3/moor,simolus3/drift | dart | ## Code Before:
import 'dart:io';
/// Runs all test files individually.
///
/// For some reason, coverage collection times out when running them all at once
/// with null safety enabled, so this is what we have to do now.
Future<void> main() async {
final directory = Directory('test');
var hadFailures = false;
await for (final entity in directory.list(recursive: true)) {
if (entity is! File || !entity.path.endsWith('_test.dart')) continue;
final process = await Process.start(
'/home/simon/bin/dart-sdk/beta/bin/dart', [
'--no-sound-null-safety',
'run',
'test',
'--coverage=coverage',
entity.path
]);
await Future.wait(
[stdout.addStream(process.stdout), stderr.addStream(process.stderr)]);
final exitCode = await process.exitCode;
if (exitCode != 0) {
hadFailures = true;
}
}
exit(hadFailures ? 1 : 0);
}
## Instruction:
Use right Dart binary for tests, ugh
## Code After:
import 'dart:io';
/// Runs all test files individually.
///
/// For some reason, coverage collection times out when running them all at once
/// with null safety enabled, so this is what we have to do now.
Future<void> main() async {
final directory = Directory('test');
var hadFailures = false;
await for (final entity in directory.list(recursive: true)) {
if (entity is! File || !entity.path.endsWith('_test.dart')) continue;
final process = await Process.start('dart', [
'--no-sound-null-safety',
'run',
'test',
'--coverage=coverage',
entity.path
]);
await Future.wait(
[stdout.addStream(process.stdout), stderr.addStream(process.stderr)]);
final exitCode = await process.exitCode;
if (exitCode != 0) {
hadFailures = true;
}
}
exit(hadFailures ? 1 : 0);
}
| import 'dart:io';
/// Runs all test files individually.
///
/// For some reason, coverage collection times out when running them all at once
/// with null safety enabled, so this is what we have to do now.
Future<void> main() async {
final directory = Directory('test');
var hadFailures = false;
await for (final entity in directory.list(recursive: true)) {
if (entity is! File || !entity.path.endsWith('_test.dart')) continue;
- final process = await Process.start(
+ final process = await Process.start('dart', [
? +++++++++
- '/home/simon/bin/dart-sdk/beta/bin/dart', [
'--no-sound-null-safety',
'run',
'test',
'--coverage=coverage',
entity.path
]);
await Future.wait(
[stdout.addStream(process.stdout), stderr.addStream(process.stderr)]);
final exitCode = await process.exitCode;
if (exitCode != 0) {
hadFailures = true;
}
}
exit(hadFailures ? 1 : 0);
} | 3 | 0.09375 | 1 | 2 |
87ba6c423118c18c4e85ea9d32ed5a006f00a793 | packages.txt | packages.txt | chromium
cmake
compton
cowsay
dmenu
dunst
feh
fish
git
gnome-keyring
gnome-themes-standard
gst-libav
gst-plugins-bad
gst-plugins-base
gst-plugins-good
gst-plugins-ugly
htop
i3-wm
i3lock
i3session-git
i3status
infinality-bundle
librsvg
neovim
network-manager-applet
networkmanager
nodejs
npm
pa-applet-git
parcellite
playerctl
pulseaudio
python-gobject
python-neovim
python-xdg
python2-neovim
redshift
stow
termite
the_silver_searcher
thunar
tmux
ttf-dejavu
xautolock
xclip
xcwd-git
yaourt
| chromium
cmake
cmus
compton
cowsay
dmenu
dunst
fish
git
gnome-keyring
gnome-themes-standard
gpicview
gst-libav
gst-plugins-bad
gst-plugins-base
gst-plugins-good
gst-plugins-ugly
htop
i3-wm
i3lock
i3session-git
i3status
infinality-bundle
librsvg
neovim
network-manager-applet
networkmanager
nodejs
npm
pa-applet-git
parcellite
playerctl
pulseaudio
python-gobject
python-neovim
python-xdg
python2-neovim
ranger
redshift
smplayer
stow
termite
the_silver_searcher
tmux
ttf-dejavu
xautolock
xclip
xcwd-git
yaourt
| Add a few light applications, remove unused | Add a few light applications, remove unused
| Text | unlicense | Wolfy87/dotfiles,Wolfy87/dotfiles,Olical/dotfiles | text | ## Code Before:
chromium
cmake
compton
cowsay
dmenu
dunst
feh
fish
git
gnome-keyring
gnome-themes-standard
gst-libav
gst-plugins-bad
gst-plugins-base
gst-plugins-good
gst-plugins-ugly
htop
i3-wm
i3lock
i3session-git
i3status
infinality-bundle
librsvg
neovim
network-manager-applet
networkmanager
nodejs
npm
pa-applet-git
parcellite
playerctl
pulseaudio
python-gobject
python-neovim
python-xdg
python2-neovim
redshift
stow
termite
the_silver_searcher
thunar
tmux
ttf-dejavu
xautolock
xclip
xcwd-git
yaourt
## Instruction:
Add a few light applications, remove unused
## Code After:
chromium
cmake
cmus
compton
cowsay
dmenu
dunst
fish
git
gnome-keyring
gnome-themes-standard
gpicview
gst-libav
gst-plugins-bad
gst-plugins-base
gst-plugins-good
gst-plugins-ugly
htop
i3-wm
i3lock
i3session-git
i3status
infinality-bundle
librsvg
neovim
network-manager-applet
networkmanager
nodejs
npm
pa-applet-git
parcellite
playerctl
pulseaudio
python-gobject
python-neovim
python-xdg
python2-neovim
ranger
redshift
smplayer
stow
termite
the_silver_searcher
tmux
ttf-dejavu
xautolock
xclip
xcwd-git
yaourt
| chromium
cmake
+ cmus
compton
cowsay
dmenu
dunst
- feh
fish
git
gnome-keyring
gnome-themes-standard
+ gpicview
gst-libav
gst-plugins-bad
gst-plugins-base
gst-plugins-good
gst-plugins-ugly
htop
i3-wm
i3lock
i3session-git
i3status
infinality-bundle
librsvg
neovim
network-manager-applet
networkmanager
nodejs
npm
pa-applet-git
parcellite
playerctl
pulseaudio
python-gobject
python-neovim
python-xdg
python2-neovim
+ ranger
redshift
+ smplayer
stow
termite
the_silver_searcher
- thunar
tmux
ttf-dejavu
xautolock
xclip
xcwd-git
yaourt | 6 | 0.12766 | 4 | 2 |
f358ee2b196274164b8829d7d5ea6429b1a6b678 | official/r1/README.md | official/r1/README.md |
The R1 folder contains legacy model implmentation and models that will not
update to TensorFlow 2.x. They do not have solid performance tracking.
## Legacy model implmentation
Transformer and MNIST implementation uses pure TF 1.x TF-Estimator.
Users should follow the corresponding TF 2.x implmentation inside the
official model garden.
## Models that will not update to TensorFlow 2.x
* [boosted_trees](boosted_trees): A Gradient Boosted Trees model to
classify higgs boson process from HIGGS Data Set.
* [wide_deep](wide_deep): A model that combines a wide model and deep
network to classify census income data.
|
The R1 folder contains legacy model implmentation and models that will not
update to TensorFlow 2.x. They do not have solid performance tracking.
**Note: models will be removed from the master branch by 2020/06.**
After removal, you can still access to these legacy models in the previous
released tags, e.g. [v2.1.0](https://github.com/tensorflow/models/releases/tag/v2.1.0).
## Legacy model implmentation
Transformer and MNIST implementation uses pure TF 1.x TF-Estimator.
Users should follow the corresponding TF 2.x implmentation inside the
official model garden.
## Models that will not update to TensorFlow 2.x
* [boosted_trees](boosted_trees): A Gradient Boosted Trees model to
classify higgs boson process from HIGGS Data Set.
* [wide_deep](wide_deep): A model that combines a wide model and deep
network to classify census income data.
| Add a note for the plan to delete R1 folder. | Add a note for the plan to delete R1 folder.
PiperOrigin-RevId: 300590317
| Markdown | apache-2.0 | tombstone/models,tombstone/models,tombstone/models,tombstone/models,tombstone/models,tombstone/models | markdown | ## Code Before:
The R1 folder contains legacy model implmentation and models that will not
update to TensorFlow 2.x. They do not have solid performance tracking.
## Legacy model implmentation
Transformer and MNIST implementation uses pure TF 1.x TF-Estimator.
Users should follow the corresponding TF 2.x implmentation inside the
official model garden.
## Models that will not update to TensorFlow 2.x
* [boosted_trees](boosted_trees): A Gradient Boosted Trees model to
classify higgs boson process from HIGGS Data Set.
* [wide_deep](wide_deep): A model that combines a wide model and deep
network to classify census income data.
## Instruction:
Add a note for the plan to delete R1 folder.
PiperOrigin-RevId: 300590317
## Code After:
The R1 folder contains legacy model implmentation and models that will not
update to TensorFlow 2.x. They do not have solid performance tracking.
**Note: models will be removed from the master branch by 2020/06.**
After removal, you can still access to these legacy models in the previous
released tags, e.g. [v2.1.0](https://github.com/tensorflow/models/releases/tag/v2.1.0).
## Legacy model implmentation
Transformer and MNIST implementation uses pure TF 1.x TF-Estimator.
Users should follow the corresponding TF 2.x implmentation inside the
official model garden.
## Models that will not update to TensorFlow 2.x
* [boosted_trees](boosted_trees): A Gradient Boosted Trees model to
classify higgs boson process from HIGGS Data Set.
* [wide_deep](wide_deep): A model that combines a wide model and deep
network to classify census income data.
|
The R1 folder contains legacy model implmentation and models that will not
update to TensorFlow 2.x. They do not have solid performance tracking.
+
+ **Note: models will be removed from the master branch by 2020/06.**
+
+ After removal, you can still access to these legacy models in the previous
+ released tags, e.g. [v2.1.0](https://github.com/tensorflow/models/releases/tag/v2.1.0).
+
## Legacy model implmentation
Transformer and MNIST implementation uses pure TF 1.x TF-Estimator.
Users should follow the corresponding TF 2.x implmentation inside the
official model garden.
## Models that will not update to TensorFlow 2.x
* [boosted_trees](boosted_trees): A Gradient Boosted Trees model to
classify higgs boson process from HIGGS Data Set.
* [wide_deep](wide_deep): A model that combines a wide model and deep
network to classify census income data. | 6 | 0.375 | 6 | 0 |
422868008d1262a7af11505055e74fa490c19f7c | src/Addon/Rate.php | src/Addon/Rate.php | <?php
namespace Nocks\SDK\Addon;
use GuzzleHttp\Client;
/**
* Class Rate
* @package Nocks\SDK\Addon
*/
class Rate
{
public function __construct()
{
$this->client = new Client();
}
function getCurrentRate($currencyCode)
{
$rate = 0;
if(in_array($currencyCode, array('USD', 'GBP', 'EUR')))
{
// Powered by CoinDesk
$response = $this->client->get('https://api.coindesk.com/v1/bpi/currentprice.json');
$response = json_decode($response->getBody()->getContents(), true);
if(isset($response['bpi'][$currencyCode]['rate']))
{
$rate = $response['bpi'][$currencyCode]['rate'];
}
}
elseif(in_array($currencyCode, array('BTC', 'NLG')))
{
// Powered by Bittrex
$response = $this->client->get('https://bittrex.com/api/v1.1/public/getticker?market=BTC-NLG');
$response = json_decode($response->getBody()->getContents(), true);
if(isset($response['result']['Last']))
{
$rate = $response['result']['Last'];
}
}
return str_replace(',', '', $rate);
}
} | <?php
namespace Nocks\SDK\Addon;
use GuzzleHttp\Client;
/**
* Class Rate
* @package Nocks\SDK\Addon
*/
class Rate
{
public function __construct()
{
$this->client = new Client();
}
function getCurrentRate($currencyCode)
{
$rate = 0;
if(in_array($currencyCode, array('USD', 'GBP', 'EUR')))
{
// Powered by CoinDesk
$response = $this->client->get('https://api.coindesk.com/v1/bpi/currentprice.json');
$response = json_decode($response->getBody()->getContents(), true);
if(isset($response['bpi'][$currencyCode]['rate']))
{
$rate = $response['bpi'][$currencyCode]['rate'];
}
}
elseif(in_array($currencyCode, array('BTC', 'NLG')))
{
$btc_eur = $this->getCurrentRate('EUR');
$response = $this->client->get('http://nocks.co/api/market?call=nlg', [
'headers' => [
'Accept' => '*/*'
]
]);
$response = json_decode($response->getBody()->getContents(), true);
if(isset($response['last']))
{
$rate = number_format($response['last'] / $btc_eur, 8);
}
}
return str_replace(',', '', $rate);
}
} | Call Nocks API instead of Bittrex directly | Call Nocks API instead of Bittrex directly
| PHP | mit | nocksapp/nocks-sdk-php,patrickkivits/nocks-sdk-php | php | ## Code Before:
<?php
namespace Nocks\SDK\Addon;
use GuzzleHttp\Client;
/**
* Class Rate
* @package Nocks\SDK\Addon
*/
class Rate
{
public function __construct()
{
$this->client = new Client();
}
function getCurrentRate($currencyCode)
{
$rate = 0;
if(in_array($currencyCode, array('USD', 'GBP', 'EUR')))
{
// Powered by CoinDesk
$response = $this->client->get('https://api.coindesk.com/v1/bpi/currentprice.json');
$response = json_decode($response->getBody()->getContents(), true);
if(isset($response['bpi'][$currencyCode]['rate']))
{
$rate = $response['bpi'][$currencyCode]['rate'];
}
}
elseif(in_array($currencyCode, array('BTC', 'NLG')))
{
// Powered by Bittrex
$response = $this->client->get('https://bittrex.com/api/v1.1/public/getticker?market=BTC-NLG');
$response = json_decode($response->getBody()->getContents(), true);
if(isset($response['result']['Last']))
{
$rate = $response['result']['Last'];
}
}
return str_replace(',', '', $rate);
}
}
## Instruction:
Call Nocks API instead of Bittrex directly
## Code After:
<?php
namespace Nocks\SDK\Addon;
use GuzzleHttp\Client;
/**
* Class Rate
* @package Nocks\SDK\Addon
*/
class Rate
{
public function __construct()
{
$this->client = new Client();
}
function getCurrentRate($currencyCode)
{
$rate = 0;
if(in_array($currencyCode, array('USD', 'GBP', 'EUR')))
{
// Powered by CoinDesk
$response = $this->client->get('https://api.coindesk.com/v1/bpi/currentprice.json');
$response = json_decode($response->getBody()->getContents(), true);
if(isset($response['bpi'][$currencyCode]['rate']))
{
$rate = $response['bpi'][$currencyCode]['rate'];
}
}
elseif(in_array($currencyCode, array('BTC', 'NLG')))
{
$btc_eur = $this->getCurrentRate('EUR');
$response = $this->client->get('http://nocks.co/api/market?call=nlg', [
'headers' => [
'Accept' => '*/*'
]
]);
$response = json_decode($response->getBody()->getContents(), true);
if(isset($response['last']))
{
$rate = number_format($response['last'] / $btc_eur, 8);
}
}
return str_replace(',', '', $rate);
}
} | <?php
namespace Nocks\SDK\Addon;
use GuzzleHttp\Client;
/**
* Class Rate
* @package Nocks\SDK\Addon
*/
class Rate
{
public function __construct()
{
$this->client = new Client();
}
function getCurrentRate($currencyCode)
{
$rate = 0;
if(in_array($currencyCode, array('USD', 'GBP', 'EUR')))
{
// Powered by CoinDesk
$response = $this->client->get('https://api.coindesk.com/v1/bpi/currentprice.json');
$response = json_decode($response->getBody()->getContents(), true);
if(isset($response['bpi'][$currencyCode]['rate']))
{
$rate = $response['bpi'][$currencyCode]['rate'];
}
}
elseif(in_array($currencyCode, array('BTC', 'NLG')))
{
- // Powered by Bittrex
- $response = $this->client->get('https://bittrex.com/api/v1.1/public/getticker?market=BTC-NLG');
+ $btc_eur = $this->getCurrentRate('EUR');
+
+ $response = $this->client->get('http://nocks.co/api/market?call=nlg', [
+ 'headers' => [
+ 'Accept' => '*/*'
+ ]
+ ]);
$response = json_decode($response->getBody()->getContents(), true);
- if(isset($response['result']['Last']))
? ---- ------
+ if(isset($response['last']))
{
- $rate = $response['result']['Last'];
+ $rate = number_format($response['last'] / $btc_eur, 8);
}
}
return str_replace(',', '', $rate);
}
} | 13 | 0.276596 | 9 | 4 |
89224750cdf5ececb025251eccdf8fdd9eb37cce | metadata/com.woefe.shoppinglist.txt | metadata/com.woefe.shoppinglist.txt | Categories:Writing
License:GPL-3.0-or-later
Author Name:Wolfgang Popp
Web Site:https://woefe.github.io/ShoppingList/
Source Code:https://github.com/woefe/ShoppingList
Issue Tracker:https://github.com/woefe/ShoppingList/issues
Auto Name:Shopping List
Summary:Manage (grocery) lists
Description:
Manage your shopping lists and other types of lists. The lists are stored as
simple text files and use a simple, human-readable syntax. This allows you to
synchronize them with other devices by using applications such as
[https://syncthing.net/ Syncthing] and [https://nextcloud.com/ Nextcloud].
.
Repo Type:git
Repo:https://github.com/woefe/ShoppingList
Build:0.8.0,8
commit=v0.8.0
subdir=app
gradle=yes
Build:0.9.0,9
commit=v0.9.0
subdir=app
gradle=yes
Build:0.10.0,10
commit=v0.10.0
subdir=app
gradle=yes
Build:0.10.1,11
commit=v0.10.1
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:0.10.1
Current Version Code:11
| Categories:Writing
License:GPL-3.0-or-later
Author Name:Wolfgang Popp
Web Site:https://woefe.github.io/ShoppingList/
Source Code:https://github.com/woefe/ShoppingList
Issue Tracker:https://github.com/woefe/ShoppingList/issues
Auto Name:Shopping List
Summary:Manage (grocery) lists
Description:
Manage your shopping lists and other types of lists. The lists are stored as
simple text files and use a simple, human-readable syntax. This allows you to
synchronize them with other devices by using applications such as
[https://syncthing.net/ Syncthing] and [https://nextcloud.com/ Nextcloud].
.
Repo Type:git
Repo:https://github.com/woefe/ShoppingList
Build:0.8.0,8
commit=v0.8.0
subdir=app
gradle=yes
Build:0.9.0,9
commit=v0.9.0
subdir=app
gradle=yes
Build:0.10.0,10
commit=v0.10.0
subdir=app
gradle=yes
Build:0.10.1,11
commit=v0.10.1
subdir=app
gradle=yes
Build:0.11.0,12
commit=v0.11.0
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:0.11.0
Current Version Code:12
| Update Shopping List to 0.11.0 (12) | Update Shopping List to 0.11.0 (12)
| Text | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroiddata,f-droid/fdroid-data | text | ## Code Before:
Categories:Writing
License:GPL-3.0-or-later
Author Name:Wolfgang Popp
Web Site:https://woefe.github.io/ShoppingList/
Source Code:https://github.com/woefe/ShoppingList
Issue Tracker:https://github.com/woefe/ShoppingList/issues
Auto Name:Shopping List
Summary:Manage (grocery) lists
Description:
Manage your shopping lists and other types of lists. The lists are stored as
simple text files and use a simple, human-readable syntax. This allows you to
synchronize them with other devices by using applications such as
[https://syncthing.net/ Syncthing] and [https://nextcloud.com/ Nextcloud].
.
Repo Type:git
Repo:https://github.com/woefe/ShoppingList
Build:0.8.0,8
commit=v0.8.0
subdir=app
gradle=yes
Build:0.9.0,9
commit=v0.9.0
subdir=app
gradle=yes
Build:0.10.0,10
commit=v0.10.0
subdir=app
gradle=yes
Build:0.10.1,11
commit=v0.10.1
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:0.10.1
Current Version Code:11
## Instruction:
Update Shopping List to 0.11.0 (12)
## Code After:
Categories:Writing
License:GPL-3.0-or-later
Author Name:Wolfgang Popp
Web Site:https://woefe.github.io/ShoppingList/
Source Code:https://github.com/woefe/ShoppingList
Issue Tracker:https://github.com/woefe/ShoppingList/issues
Auto Name:Shopping List
Summary:Manage (grocery) lists
Description:
Manage your shopping lists and other types of lists. The lists are stored as
simple text files and use a simple, human-readable syntax. This allows you to
synchronize them with other devices by using applications such as
[https://syncthing.net/ Syncthing] and [https://nextcloud.com/ Nextcloud].
.
Repo Type:git
Repo:https://github.com/woefe/ShoppingList
Build:0.8.0,8
commit=v0.8.0
subdir=app
gradle=yes
Build:0.9.0,9
commit=v0.9.0
subdir=app
gradle=yes
Build:0.10.0,10
commit=v0.10.0
subdir=app
gradle=yes
Build:0.10.1,11
commit=v0.10.1
subdir=app
gradle=yes
Build:0.11.0,12
commit=v0.11.0
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:0.11.0
Current Version Code:12
| Categories:Writing
License:GPL-3.0-or-later
Author Name:Wolfgang Popp
Web Site:https://woefe.github.io/ShoppingList/
Source Code:https://github.com/woefe/ShoppingList
Issue Tracker:https://github.com/woefe/ShoppingList/issues
Auto Name:Shopping List
Summary:Manage (grocery) lists
Description:
Manage your shopping lists and other types of lists. The lists are stored as
simple text files and use a simple, human-readable syntax. This allows you to
synchronize them with other devices by using applications such as
[https://syncthing.net/ Syncthing] and [https://nextcloud.com/ Nextcloud].
.
Repo Type:git
Repo:https://github.com/woefe/ShoppingList
Build:0.8.0,8
commit=v0.8.0
subdir=app
gradle=yes
Build:0.9.0,9
commit=v0.9.0
subdir=app
gradle=yes
Build:0.10.0,10
commit=v0.10.0
subdir=app
gradle=yes
Build:0.10.1,11
commit=v0.10.1
subdir=app
gradle=yes
+ Build:0.11.0,12
+ commit=v0.11.0
+ subdir=app
+ gradle=yes
+
Auto Update Mode:Version v%v
Update Check Mode:Tags
- Current Version:0.10.1
? --
+ Current Version:0.11.0
? ++
- Current Version Code:11
? ^
+ Current Version Code:12
? ^
| 9 | 0.209302 | 7 | 2 |
22516c233339f8b8be801ee6194367b260de3331 | .github/workflows/ci.yml | .github/workflows/ci.yml | name: .NET package CI
on: [workflow_dispatch]
defaults:
run:
working-directory: BrowserStackLocal
jobs:
build:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- name: Add msbuild to PATH
uses: microsoft/setup-msbuild@v1.0.2
- name: Build BrowserStackLocal
run: |
msbuild BrowserStackLocal -t:restore -p:Configuration=Release
msbuild BrowserStackLocal -t:build -p:Configuration=Release
- name: Build Test project
run: |
msbuild BrowserStackLocalIntegrationTests -t:restore -p:Configuration=Release
msbuild BrowserStackLocalIntegrationTests -t:build -p:Configuration=Release
- name: Setup .NET Core
uses: actions/setup-dotnet@v1
with:
dotnet-version: 3.1.301
- name: Run Integration Tests
env:
BROWSERSTACK_USERNAME: ${{ secrets.BROWSERSTACK_USERNAME }}
BROWSERSTACK_ACCESS_KEY: ${{ secrets.BROWSERSTACK_ACCESS_KEY }}
run: dotnet test BrowserStackLocalIntegrationTests --no-build -p:Configuration=Release
| name: .NET package CI
on: [workflow_dispatch]
defaults:
run:
working-directory: BrowserStackLocal
jobs:
build:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- name: Add msbuild to PATH
uses: microsoft/setup-msbuild@v1.0.2
- name: Build BrowserStackLocal
run: |
msbuild BrowserStackLocal -t:restore -p:Configuration=Release
msbuild BrowserStackLocal -t:build -p:Configuration=Release
- name: Build Test project
run: |
msbuild BrowserStackLocalIntegrationTests -t:restore -p:Configuration=Release
msbuild BrowserStackLocalIntegrationTests -t:build -p:Configuration=Release
- name: Setup .NET Core
uses: actions/setup-dotnet@v1
with:
dotnet-version: 3.1.301
- name: Run Integration Tests
env:
BROWSERSTACK_USERNAME: ${{ secrets.BROWSERSTACK_USERNAME }}
BROWSERSTACK_ACCESS_KEY: ${{ secrets.BROWSERSTACK_ACCESS_KEY }}
run: dotnet test BrowserStackLocalIntegrationTests --no-build -p:Configuration=Release
- name: Pack NuGet Package
run: msbuild BrowserStackLocal -t:pack -p:Configuration=Release
- name: Save artifact
uses: actions/upload-artifact@v2
with:
path: BrowserStackLocal/bin/Release/*.nupkg
| Add step to pack nuget package in pipeline | Add step to pack nuget package in pipeline
| YAML | mit | browserstack/browserstack-local-csharp | yaml | ## Code Before:
name: .NET package CI
on: [workflow_dispatch]
defaults:
run:
working-directory: BrowserStackLocal
jobs:
build:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- name: Add msbuild to PATH
uses: microsoft/setup-msbuild@v1.0.2
- name: Build BrowserStackLocal
run: |
msbuild BrowserStackLocal -t:restore -p:Configuration=Release
msbuild BrowserStackLocal -t:build -p:Configuration=Release
- name: Build Test project
run: |
msbuild BrowserStackLocalIntegrationTests -t:restore -p:Configuration=Release
msbuild BrowserStackLocalIntegrationTests -t:build -p:Configuration=Release
- name: Setup .NET Core
uses: actions/setup-dotnet@v1
with:
dotnet-version: 3.1.301
- name: Run Integration Tests
env:
BROWSERSTACK_USERNAME: ${{ secrets.BROWSERSTACK_USERNAME }}
BROWSERSTACK_ACCESS_KEY: ${{ secrets.BROWSERSTACK_ACCESS_KEY }}
run: dotnet test BrowserStackLocalIntegrationTests --no-build -p:Configuration=Release
## Instruction:
Add step to pack nuget package in pipeline
## Code After:
name: .NET package CI
on: [workflow_dispatch]
defaults:
run:
working-directory: BrowserStackLocal
jobs:
build:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- name: Add msbuild to PATH
uses: microsoft/setup-msbuild@v1.0.2
- name: Build BrowserStackLocal
run: |
msbuild BrowserStackLocal -t:restore -p:Configuration=Release
msbuild BrowserStackLocal -t:build -p:Configuration=Release
- name: Build Test project
run: |
msbuild BrowserStackLocalIntegrationTests -t:restore -p:Configuration=Release
msbuild BrowserStackLocalIntegrationTests -t:build -p:Configuration=Release
- name: Setup .NET Core
uses: actions/setup-dotnet@v1
with:
dotnet-version: 3.1.301
- name: Run Integration Tests
env:
BROWSERSTACK_USERNAME: ${{ secrets.BROWSERSTACK_USERNAME }}
BROWSERSTACK_ACCESS_KEY: ${{ secrets.BROWSERSTACK_ACCESS_KEY }}
run: dotnet test BrowserStackLocalIntegrationTests --no-build -p:Configuration=Release
- name: Pack NuGet Package
run: msbuild BrowserStackLocal -t:pack -p:Configuration=Release
- name: Save artifact
uses: actions/upload-artifact@v2
with:
path: BrowserStackLocal/bin/Release/*.nupkg
| name: .NET package CI
on: [workflow_dispatch]
defaults:
run:
working-directory: BrowserStackLocal
jobs:
build:
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- name: Add msbuild to PATH
uses: microsoft/setup-msbuild@v1.0.2
- name: Build BrowserStackLocal
run: |
msbuild BrowserStackLocal -t:restore -p:Configuration=Release
msbuild BrowserStackLocal -t:build -p:Configuration=Release
- name: Build Test project
run: |
msbuild BrowserStackLocalIntegrationTests -t:restore -p:Configuration=Release
msbuild BrowserStackLocalIntegrationTests -t:build -p:Configuration=Release
- name: Setup .NET Core
uses: actions/setup-dotnet@v1
with:
dotnet-version: 3.1.301
- name: Run Integration Tests
env:
BROWSERSTACK_USERNAME: ${{ secrets.BROWSERSTACK_USERNAME }}
BROWSERSTACK_ACCESS_KEY: ${{ secrets.BROWSERSTACK_ACCESS_KEY }}
run: dotnet test BrowserStackLocalIntegrationTests --no-build -p:Configuration=Release
+ - name: Pack NuGet Package
+ run: msbuild BrowserStackLocal -t:pack -p:Configuration=Release
+ - name: Save artifact
+ uses: actions/upload-artifact@v2
+ with:
+ path: BrowserStackLocal/bin/Release/*.nupkg | 6 | 0.181818 | 6 | 0 |
026a8c0f001800c7e4304105af14cf1c1f56e933 | index.js | index.js | 'use strict';
var through = require('through')
var write = function write(chunk) {
var self = this
//read each line
var data = chunk.toString('utf8')
var lines = data.split('\n')
lines.forEach(function (line) {
if (!line) return
if (line.match(/^\s*#.*$/)) return
self.emit('data', line)
})
}
var end = function end () {
this.emit('end')
}
module.exports = through(write, end)
| 'use strict';
var through = require('through')
var write = function write(chunk) {
var self = this
//read each line
var data = chunk.toString('utf8')
var lines = data.split('\n')
lines.forEach(function (line) {
//generic catch all
if (!line) return
//skip empty lines
if (line.match(/^\s*$/)) return
//skip lines starting with comments
if (line.match(/^\s*#.*$/)) return
//skip lines that start with =
if (line.match(/^\s*=.*$/)) return
//skip lines that dont have an =
if (!line.match(/^.*=.*$/)) return
//trim spaces
line = line.replace(/\s*=\s*/, '=')
line = line.replace(/^\s*(.*)\s*=\s*(.*)$/, '$1=$2')
line = line.replace(/\s$/, '')
self.emit('data', line)
})
}
var end = function end () {
this.emit('end')
}
module.exports = function () {
return through(write, end)
}
| Update module to match tests | Update module to match tests
| JavaScript | mit | digitalsadhu/env-reader | javascript | ## Code Before:
'use strict';
var through = require('through')
var write = function write(chunk) {
var self = this
//read each line
var data = chunk.toString('utf8')
var lines = data.split('\n')
lines.forEach(function (line) {
if (!line) return
if (line.match(/^\s*#.*$/)) return
self.emit('data', line)
})
}
var end = function end () {
this.emit('end')
}
module.exports = through(write, end)
## Instruction:
Update module to match tests
## Code After:
'use strict';
var through = require('through')
var write = function write(chunk) {
var self = this
//read each line
var data = chunk.toString('utf8')
var lines = data.split('\n')
lines.forEach(function (line) {
//generic catch all
if (!line) return
//skip empty lines
if (line.match(/^\s*$/)) return
//skip lines starting with comments
if (line.match(/^\s*#.*$/)) return
//skip lines that start with =
if (line.match(/^\s*=.*$/)) return
//skip lines that dont have an =
if (!line.match(/^.*=.*$/)) return
//trim spaces
line = line.replace(/\s*=\s*/, '=')
line = line.replace(/^\s*(.*)\s*=\s*(.*)$/, '$1=$2')
line = line.replace(/\s$/, '')
self.emit('data', line)
})
}
var end = function end () {
this.emit('end')
}
module.exports = function () {
return through(write, end)
}
| 'use strict';
var through = require('through')
var write = function write(chunk) {
var self = this
//read each line
var data = chunk.toString('utf8')
var lines = data.split('\n')
lines.forEach(function (line) {
+
+ //generic catch all
if (!line) return
+
+ //skip empty lines
+ if (line.match(/^\s*$/)) return
+
+ //skip lines starting with comments
if (line.match(/^\s*#.*$/)) return
+
+ //skip lines that start with =
+ if (line.match(/^\s*=.*$/)) return
+
+ //skip lines that dont have an =
+ if (!line.match(/^.*=.*$/)) return
+
+ //trim spaces
+ line = line.replace(/\s*=\s*/, '=')
+ line = line.replace(/^\s*(.*)\s*=\s*(.*)$/, '$1=$2')
+ line = line.replace(/\s$/, '')
+
self.emit('data', line)
})
}
var end = function end () {
this.emit('end')
}
- module.exports = through(write, end)
+ module.exports = function () {
+ return through(write, end)
+ }
| 23 | 0.884615 | 22 | 1 |
e9f72a4b76225516595111aa5a3aab7984f71e47 | .hound.yml | .hound.yml | ruby:
config_file: .rubocop.yml
erb_lint:
enabled: true
scss:
enabled: false
javascript:
enabled: false
| ruby:
config_file: .rubocop.yml
erb_lint:
enabled: true
scss:
enabled: false
eslint:
enabled: true
config_file: .eslintrc.json
| Enable JS linting for Hound | Enable JS linting for Hound
| YAML | bsd-3-clause | pervino/solidus,pervino/solidus,pervino/solidus,pervino/solidus | yaml | ## Code Before:
ruby:
config_file: .rubocop.yml
erb_lint:
enabled: true
scss:
enabled: false
javascript:
enabled: false
## Instruction:
Enable JS linting for Hound
## Code After:
ruby:
config_file: .rubocop.yml
erb_lint:
enabled: true
scss:
enabled: false
eslint:
enabled: true
config_file: .eslintrc.json
| ruby:
config_file: .rubocop.yml
erb_lint:
enabled: true
scss:
enabled: false
- javascript:
+ eslint:
- enabled: false
? ^^^^
+ enabled: true
? ^^^
+ config_file: .eslintrc.json | 5 | 0.625 | 3 | 2 |
c7693c3f0c8252a9014fa8af2bbe142e26740911 | spec/tomcat_manager/api/version7_spec.rb | spec/tomcat_manager/api/version7_spec.rb | require 'spec_helper'
describe Tomcat::Manager::Api::Version7 do
context "url response methods" do
describe "connect_path" do
it "returns the connect_path for Tomcat Manager" do
FactoryGirl.build(:api, :version => "7").connect_path.should be_instance_of(String)
end
end
end
context "processing methods" do
describe "connect_response_valid?" do
let(:api) { FactoryGirl.build :api, :version => "7" }
it "returns true for a valid response HTTP code" do
api.connect_response_valid?("200").should == true
end
it "returns false for an non-OK response HTTP code" do
api.connect_response_valid?("abc").should == false
end
end
end
end
| require 'spec_helper'
describe Tomcat::Manager::Api::Version7 do
context "url response methods" do
describe "connect_path" do
it "returns the connect_path for Tomcat Manager" do
FactoryGirl.build(:api, :version => "7").connect_path.should be_instance_of(String)
end
end
end
context "processing methods" do
describe "connect_response_valid?" do
let(:api) { FactoryGirl.build :api, :version => "7" }
it "returns true for a valid response HTTP code" do
api.connect_response_valid?("200").should == true
end
it "returns false for an non-OK response HTTP code" do
api.connect_response_valid?("abc").should == false
end
end
describe "deployed_applications" do
let(:api) { FactoryGirl.build :api, :version => "7" }
let(:application_list) { File.open(File.dirname(__FILE__) + "/../../fixtures/application_list.txt", "r").read }
it "returns a hash of applications" do
api.deployed_applications(application_list).should ==
{ "/" => { :status => "running", :sessions => 0, :name => "ROOT" },
"/manager" => { :status => "running", :sessions => 2, :name => "manager" },
"/docs" => { :status => "running", :sessions => 0, :name => "docs" },
"/test-app-1.0.0" => { :status => "running", :sessions => 2, :name => "test-app-1.0.0" },
"/test-app-2.0.1" => { :status => "running", :sessions => 0, :name => "test-app-2.0.1" },
"/other-test-app-1.0.1" => { :status => "stopped", :sessions => 0, :name => "other-test-app-1.0.1" },
"/examples" => { :status => "running", :sessions => 1, :name => "examples" },
"/host-manager" => { :status => "running", :sessions => 0, :name => "host-manager" } }
end
end
end
end
| Add Spec for Version7 deployed_applications Method | Add Spec for Version7 deployed_applications Method
Add a spec for the Version7 class deployed_applications method.
| Ruby | mit | jekhokie/tomcat-manager | ruby | ## Code Before:
require 'spec_helper'
describe Tomcat::Manager::Api::Version7 do
context "url response methods" do
describe "connect_path" do
it "returns the connect_path for Tomcat Manager" do
FactoryGirl.build(:api, :version => "7").connect_path.should be_instance_of(String)
end
end
end
context "processing methods" do
describe "connect_response_valid?" do
let(:api) { FactoryGirl.build :api, :version => "7" }
it "returns true for a valid response HTTP code" do
api.connect_response_valid?("200").should == true
end
it "returns false for an non-OK response HTTP code" do
api.connect_response_valid?("abc").should == false
end
end
end
end
## Instruction:
Add Spec for Version7 deployed_applications Method
Add a spec for the Version7 class deployed_applications method.
## Code After:
require 'spec_helper'
describe Tomcat::Manager::Api::Version7 do
context "url response methods" do
describe "connect_path" do
it "returns the connect_path for Tomcat Manager" do
FactoryGirl.build(:api, :version => "7").connect_path.should be_instance_of(String)
end
end
end
context "processing methods" do
describe "connect_response_valid?" do
let(:api) { FactoryGirl.build :api, :version => "7" }
it "returns true for a valid response HTTP code" do
api.connect_response_valid?("200").should == true
end
it "returns false for an non-OK response HTTP code" do
api.connect_response_valid?("abc").should == false
end
end
describe "deployed_applications" do
let(:api) { FactoryGirl.build :api, :version => "7" }
let(:application_list) { File.open(File.dirname(__FILE__) + "/../../fixtures/application_list.txt", "r").read }
it "returns a hash of applications" do
api.deployed_applications(application_list).should ==
{ "/" => { :status => "running", :sessions => 0, :name => "ROOT" },
"/manager" => { :status => "running", :sessions => 2, :name => "manager" },
"/docs" => { :status => "running", :sessions => 0, :name => "docs" },
"/test-app-1.0.0" => { :status => "running", :sessions => 2, :name => "test-app-1.0.0" },
"/test-app-2.0.1" => { :status => "running", :sessions => 0, :name => "test-app-2.0.1" },
"/other-test-app-1.0.1" => { :status => "stopped", :sessions => 0, :name => "other-test-app-1.0.1" },
"/examples" => { :status => "running", :sessions => 1, :name => "examples" },
"/host-manager" => { :status => "running", :sessions => 0, :name => "host-manager" } }
end
end
end
end
| require 'spec_helper'
describe Tomcat::Manager::Api::Version7 do
context "url response methods" do
describe "connect_path" do
it "returns the connect_path for Tomcat Manager" do
FactoryGirl.build(:api, :version => "7").connect_path.should be_instance_of(String)
end
end
end
context "processing methods" do
describe "connect_response_valid?" do
let(:api) { FactoryGirl.build :api, :version => "7" }
it "returns true for a valid response HTTP code" do
api.connect_response_valid?("200").should == true
end
it "returns false for an non-OK response HTTP code" do
api.connect_response_valid?("abc").should == false
end
end
+
+ describe "deployed_applications" do
+ let(:api) { FactoryGirl.build :api, :version => "7" }
+ let(:application_list) { File.open(File.dirname(__FILE__) + "/../../fixtures/application_list.txt", "r").read }
+
+ it "returns a hash of applications" do
+ api.deployed_applications(application_list).should ==
+ { "/" => { :status => "running", :sessions => 0, :name => "ROOT" },
+ "/manager" => { :status => "running", :sessions => 2, :name => "manager" },
+ "/docs" => { :status => "running", :sessions => 0, :name => "docs" },
+ "/test-app-1.0.0" => { :status => "running", :sessions => 2, :name => "test-app-1.0.0" },
+ "/test-app-2.0.1" => { :status => "running", :sessions => 0, :name => "test-app-2.0.1" },
+ "/other-test-app-1.0.1" => { :status => "stopped", :sessions => 0, :name => "other-test-app-1.0.1" },
+ "/examples" => { :status => "running", :sessions => 1, :name => "examples" },
+ "/host-manager" => { :status => "running", :sessions => 0, :name => "host-manager" } }
+ end
+ end
end
end | 17 | 0.68 | 17 | 0 |
0003c67e80b8fee3209298d8cea26ae4f65cf843 | _layouts/default.html | _layouts/default.html | ---
layout: base
---
<main>
<section class='posts-container'>
<ul>
<li class='post latest'>
<a href="{{ site.posts[0].url }}">
<div>
<span class='date'>{{ site.posts[0].date | date: "%b %-d, %Y" }}</span>
<h3>{{ site.posts[0].title }}</h3>
<p>{{ site.posts[0].excerpt }}</p>
</div>
</a>
</li>
{% for post in site.posts %}
<li class='post'>
<a href="{{ post.url }}">
<div>
<span class='date'>{{ post.date | date: "%b %-d, %Y" }}</span>
<h3>{{ post.title }}</h3>
<p>{{ post.excerpt }}</p>
</div>
</a>
</li>
{% endfor %}
</ul>
</section>
</main>
| ---
layout: base
---
<main>
<section class='posts-container'>
<ul>
<li class='post latest'>
<a href="{{ site.posts[0].url }}">
<div>
<h2>Latest post</h2>
<span class='date'>{{ site.posts[0].date | date: "%b %-d, %Y" }}</span>
<hr>
<h3>{{ site.posts[0].title }}</h3>
<p>{{ site.posts[0].excerpt }}</p>
</div>
</a>
</li>
{% for post in site.posts %}
<li class='post'>
<a href="{{ post.url }}">
<div>
<span class='date'>{{ post.date | date: "%b %-d, %Y" }}</span>
<h3>{{ post.title }}</h3>
<p>{{ post.excerpt }}</p>
</div>
</a>
</li>
{% endfor %}
</ul>
</section>
</main>
| Add missing title from home layout | Add missing title from home layout
| HTML | mit | elmtown/elmtown.github.io,elmtown/elmtown.github.io | html | ## Code Before:
---
layout: base
---
<main>
<section class='posts-container'>
<ul>
<li class='post latest'>
<a href="{{ site.posts[0].url }}">
<div>
<span class='date'>{{ site.posts[0].date | date: "%b %-d, %Y" }}</span>
<h3>{{ site.posts[0].title }}</h3>
<p>{{ site.posts[0].excerpt }}</p>
</div>
</a>
</li>
{% for post in site.posts %}
<li class='post'>
<a href="{{ post.url }}">
<div>
<span class='date'>{{ post.date | date: "%b %-d, %Y" }}</span>
<h3>{{ post.title }}</h3>
<p>{{ post.excerpt }}</p>
</div>
</a>
</li>
{% endfor %}
</ul>
</section>
</main>
## Instruction:
Add missing title from home layout
## Code After:
---
layout: base
---
<main>
<section class='posts-container'>
<ul>
<li class='post latest'>
<a href="{{ site.posts[0].url }}">
<div>
<h2>Latest post</h2>
<span class='date'>{{ site.posts[0].date | date: "%b %-d, %Y" }}</span>
<hr>
<h3>{{ site.posts[0].title }}</h3>
<p>{{ site.posts[0].excerpt }}</p>
</div>
</a>
</li>
{% for post in site.posts %}
<li class='post'>
<a href="{{ post.url }}">
<div>
<span class='date'>{{ post.date | date: "%b %-d, %Y" }}</span>
<h3>{{ post.title }}</h3>
<p>{{ post.excerpt }}</p>
</div>
</a>
</li>
{% endfor %}
</ul>
</section>
</main>
| ---
layout: base
---
<main>
<section class='posts-container'>
<ul>
<li class='post latest'>
<a href="{{ site.posts[0].url }}">
<div>
+ <h2>Latest post</h2>
<span class='date'>{{ site.posts[0].date | date: "%b %-d, %Y" }}</span>
+ <hr>
<h3>{{ site.posts[0].title }}</h3>
<p>{{ site.posts[0].excerpt }}</p>
</div>
</a>
</li>
{% for post in site.posts %}
<li class='post'>
<a href="{{ post.url }}">
<div>
<span class='date'>{{ post.date | date: "%b %-d, %Y" }}</span>
<h3>{{ post.title }}</h3>
<p>{{ post.excerpt }}</p>
</div>
</a>
</li>
{% endfor %}
</ul>
</section>
</main> | 2 | 0.066667 | 2 | 0 |
ca723e912e5dde2058d6a72bc4ee93cb338c17e7 | core/spec/support/refinery.rb | core/spec/support/refinery.rb | require 'refinerycms-testing'
RSpec.configure do |config|
config.extend Refinery::Testing::ControllerMacros::Authentication, :type => :controller
config.include Refinery::Testing::ControllerMacros::Methods, :type => :controller
config.extend Refinery::Testing::FeatureMacros::Authentication, :type => :feature
config.include Warden::Test::Helpers
# set some config values so that image and resource factories don't fail to create
config.before do
Refinery::Images.max_image_size = 5242880
Refinery::Resources.max_file_size = 52428800
end
config.after do
Warden.test_reset!
end
end
| require 'refinerycms-testing'
RSpec.configure do |config|
config.extend Refinery::Testing::ControllerMacros::Authentication, :type => :controller
config.include Refinery::Testing::ControllerMacros::Methods, :type => :controller
config.extend Refinery::Testing::FeatureMacros::Authentication, :type => :feature
config.include Warden::Test::Helpers
# set some config values so that image and resource factories don't fail to create
config.before do
Refinery::Images.max_image_size = 5242880 if defined?(Refinery::Images)
Refinery::Resources.max_file_size = 52428800 if defined?(Refinery::Resources)
end
config.after do
Warden.test_reset!
end
end
| Fix for when we don't have constants defined. | Fix for when we don't have constants defined.
| Ruby | mit | LytayTOUCH/refinerycms,sideci-sample/sideci-sample-refinerycms,sideci-sample/sideci-sample-refinerycms,gwagener/refinerycms,anitagraham/refinerycms,aguzubiaga/refinerycms,kappiah/refinerycms,stefanspicer/refinerycms,kappiah/refinerycms,mabras/refinerycms,Eric-Guo/refinerycms,simi/refinerycms,KingLemuel/refinerycms,KingLemuel/refinerycms,simi/refinerycms,KingLemuel/refinerycms,hoopla-software/refinerycms,Eric-Guo/refinerycms,simi/refinerycms,Eric-Guo/refinerycms,stefanspicer/refinerycms,Retimont/refinerycms,mkaplan9/refinerycms,chrise86/refinerycms,johanb/refinerycms,hoopla-software/refinerycms,louim/refinerycms,LytayTOUCH/refinerycms,anitagraham/refinerycms,mabras/refinerycms,chrise86/refinerycms,trevornez/refinerycms,hoopla-software/refinerycms,louim/refinerycms,Retimont/refinerycms,Retimont/refinerycms,trevornez/refinerycms,aguzubiaga/refinerycms,bricesanchez/refinerycms,mkaplan9/refinerycms,trevornez/refinerycms,bricesanchez/refinerycms,gwagener/refinerycms,mabras/refinerycms,kappiah/refinerycms,johanb/refinerycms,LytayTOUCH/refinerycms,johanb/refinerycms,stefanspicer/refinerycms,chrise86/refinerycms,gwagener/refinerycms,mlinfoot/refinerycms,mlinfoot/refinerycms,mkaplan9/refinerycms,refinery/refinerycms,refinery/refinerycms,simi/refinerycms,aguzubiaga/refinerycms,refinery/refinerycms,anitagraham/refinerycms,mlinfoot/refinerycms | ruby | ## Code Before:
require 'refinerycms-testing'
RSpec.configure do |config|
config.extend Refinery::Testing::ControllerMacros::Authentication, :type => :controller
config.include Refinery::Testing::ControllerMacros::Methods, :type => :controller
config.extend Refinery::Testing::FeatureMacros::Authentication, :type => :feature
config.include Warden::Test::Helpers
# set some config values so that image and resource factories don't fail to create
config.before do
Refinery::Images.max_image_size = 5242880
Refinery::Resources.max_file_size = 52428800
end
config.after do
Warden.test_reset!
end
end
## Instruction:
Fix for when we don't have constants defined.
## Code After:
require 'refinerycms-testing'
RSpec.configure do |config|
config.extend Refinery::Testing::ControllerMacros::Authentication, :type => :controller
config.include Refinery::Testing::ControllerMacros::Methods, :type => :controller
config.extend Refinery::Testing::FeatureMacros::Authentication, :type => :feature
config.include Warden::Test::Helpers
# set some config values so that image and resource factories don't fail to create
config.before do
Refinery::Images.max_image_size = 5242880 if defined?(Refinery::Images)
Refinery::Resources.max_file_size = 52428800 if defined?(Refinery::Resources)
end
config.after do
Warden.test_reset!
end
end
| require 'refinerycms-testing'
RSpec.configure do |config|
config.extend Refinery::Testing::ControllerMacros::Authentication, :type => :controller
config.include Refinery::Testing::ControllerMacros::Methods, :type => :controller
config.extend Refinery::Testing::FeatureMacros::Authentication, :type => :feature
config.include Warden::Test::Helpers
# set some config values so that image and resource factories don't fail to create
config.before do
- Refinery::Images.max_image_size = 5242880
+ Refinery::Images.max_image_size = 5242880 if defined?(Refinery::Images)
? ++++++++++++++++++++++++++++++
- Refinery::Resources.max_file_size = 52428800
+ Refinery::Resources.max_file_size = 52428800 if defined?(Refinery::Resources)
end
config.after do
Warden.test_reset!
end
end | 4 | 0.222222 | 2 | 2 |
695d21ace25cf9ddb6eabd82bbe39d35b34ae72f | mutt/urlhandler.sh | mutt/urlhandler.sh | frontmost() {
osascript <<EOF
tell application "System Events"
set frontApp to name of first application process whose frontmost is true
end tell
EOF
}
before=$(frontmost)
open -g "$1"
sleep 0.1
after=$(frontmost)
if [ "$before" != "$after" ]; then
open -a "$before"
fi
|
open -g "$1"
| Revert "Fuck you google chrome" | Revert "Fuck you google chrome"
This reverts commit fea352cfbf274921bfaf18eeda9b5c40db7ef747.
This bug seems to have been fixed in Chrome
| Shell | mit | keith/dotfiles,keith/dotfiles,keith/dotfiles,keith/dotfiles,keith/dotfiles,keith/dotfiles | shell | ## Code Before:
frontmost() {
osascript <<EOF
tell application "System Events"
set frontApp to name of first application process whose frontmost is true
end tell
EOF
}
before=$(frontmost)
open -g "$1"
sleep 0.1
after=$(frontmost)
if [ "$before" != "$after" ]; then
open -a "$before"
fi
## Instruction:
Revert "Fuck you google chrome"
This reverts commit fea352cfbf274921bfaf18eeda9b5c40db7ef747.
This bug seems to have been fixed in Chrome
## Code After:
open -g "$1"
| - frontmost() {
- osascript <<EOF
- tell application "System Events"
- set frontApp to name of first application process whose frontmost is true
- end tell
- EOF
- }
- before=$(frontmost)
open -g "$1"
- sleep 0.1
- after=$(frontmost)
-
- if [ "$before" != "$after" ]; then
- open -a "$before"
- fi | 14 | 0.875 | 0 | 14 |
bbd25b6367ae8d3ee873f9491be384a169990d91 | build-yocto-genivi/run-gocd-agent.sh | build-yocto-genivi/run-gocd-agent.sh | set -e
# Sanity checks
if [ $(whoami) != 'go' ]; then
echo "Please specify docker run --user go"
exit 1
fi
if [ $(pwd) != '/var/lib/go-agent' ]; then
echo "Please specify docker run --workdir /var/lib/go-agent"
exit 1
fi
git config --global user.name "easy-build"
git config --global user.email "$(whoami)@$(hostname)"
cp /etc/default/go-agent go-agent.ORIG
sed 's/^GO_SERVER=.*$/GO_SERVER=go.genivi.org/g' go-agent.ORIG >/etc/default/go-agent
service go-agent start
# EOF
| set -e
# Sanity checks
if [ $(whoami) != 'go' ]; then
echo "Please specify docker run --user go"
exit 1
fi
if [ $(pwd) != '/var/lib/go-agent' ]; then
echo "Please specify docker run --workdir /var/lib/go-agent"
exit 1
fi
if [ "${GO_SERVER}" == "" ]; then
export GO_SERVER=go.genivi.org
echo "GO_SERVER was not defined - setting to ${GO_SERVER}"
fi
git config --global user.name "easy-build"
git config --global user.email "$(whoami)@$(hostname)"
cp /etc/default/go-agent go-agent.ORIG
sed "s/^GO_SERVER=.*$/GO_SERVER=${GO_SERVER}/g" go-agent.ORIG >/etc/default/go-agent
service go-agent start
# EOF
| Enable GO_SERVER configuration via envvars | Enable GO_SERVER configuration via envvars
Signed-off-by: Gianpaolo Macario <d50c09b51e61901a1c5054dce86f92776c198faa@mentor.com>
| Shell | mpl-2.0 | gmacario/easy-build | shell | ## Code Before:
set -e
# Sanity checks
if [ $(whoami) != 'go' ]; then
echo "Please specify docker run --user go"
exit 1
fi
if [ $(pwd) != '/var/lib/go-agent' ]; then
echo "Please specify docker run --workdir /var/lib/go-agent"
exit 1
fi
git config --global user.name "easy-build"
git config --global user.email "$(whoami)@$(hostname)"
cp /etc/default/go-agent go-agent.ORIG
sed 's/^GO_SERVER=.*$/GO_SERVER=go.genivi.org/g' go-agent.ORIG >/etc/default/go-agent
service go-agent start
# EOF
## Instruction:
Enable GO_SERVER configuration via envvars
Signed-off-by: Gianpaolo Macario <d50c09b51e61901a1c5054dce86f92776c198faa@mentor.com>
## Code After:
set -e
# Sanity checks
if [ $(whoami) != 'go' ]; then
echo "Please specify docker run --user go"
exit 1
fi
if [ $(pwd) != '/var/lib/go-agent' ]; then
echo "Please specify docker run --workdir /var/lib/go-agent"
exit 1
fi
if [ "${GO_SERVER}" == "" ]; then
export GO_SERVER=go.genivi.org
echo "GO_SERVER was not defined - setting to ${GO_SERVER}"
fi
git config --global user.name "easy-build"
git config --global user.email "$(whoami)@$(hostname)"
cp /etc/default/go-agent go-agent.ORIG
sed "s/^GO_SERVER=.*$/GO_SERVER=${GO_SERVER}/g" go-agent.ORIG >/etc/default/go-agent
service go-agent start
# EOF
| set -e
# Sanity checks
if [ $(whoami) != 'go' ]; then
echo "Please specify docker run --user go"
exit 1
fi
if [ $(pwd) != '/var/lib/go-agent' ]; then
echo "Please specify docker run --workdir /var/lib/go-agent"
exit 1
fi
+ if [ "${GO_SERVER}" == "" ]; then
+ export GO_SERVER=go.genivi.org
+ echo "GO_SERVER was not defined - setting to ${GO_SERVER}"
+ fi
git config --global user.name "easy-build"
git config --global user.email "$(whoami)@$(hostname)"
cp /etc/default/go-agent go-agent.ORIG
- sed 's/^GO_SERVER=.*$/GO_SERVER=go.genivi.org/g' go-agent.ORIG >/etc/default/go-agent
? ^ ^^^^^^^^^^^^^ ^
+ sed "s/^GO_SERVER=.*$/GO_SERVER=${GO_SERVER}/g" go-agent.ORIG >/etc/default/go-agent
? ^ ^^^^^^^^^^^^ ^
service go-agent start
# EOF | 6 | 0.285714 | 5 | 1 |
8ed67dc7f803f371e739a766eed1d35f1a25ad48 | .travis.yml | .travis.yml | sudo: required
services:
- docker
language: bash
addons:
apt:
sources:
- debian-sid
packages:
- shellcheck
env:
matrix:
- CROSSARCH_COMMON=1
- BUILD=bind
install:
- set -eo pipefail
- chmod +x ./docker_crossarch_common.sh
- chmod +x ./**/get_version.sh
script:
- set -eo pipefail
- |
if [[ -z "${CROSSARCH_COMMON+x}" ]]; then
source ./docker_crossarch_common.sh
if [[ -f "./${BUILD}/entrypoint.sh" ]]; then
crossarch_common_build "./${BUILD}/Dockerfile" "./${BUILD}/entrypoint.sh"
else
crossarch_common_build "./${BUILD}/Dockerfile"
fi
source "./${BUILD}/get_version.sh"
local build_version
build_version=$(crossarch_build_get_version)
crossarch_common_deploy "${DOCKER_USERNAME}" "${DOCKER_PASSWORD}" "${BUILD}" "${build_version}"
else
shellcheck ./docker_crossarch_common.sh
fi
| sudo: required
services:
- docker
before_install:
- curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
- sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
- sudo apt-get update
- sudo apt-get -y install docker-ce
language: bash
addons:
apt:
sources:
- debian-sid
packages:
- shellcheck
env:
matrix:
- CROSSARCH_COMMON=1
- BUILD=bind
install:
- set -eo pipefail
- chmod +x ./docker_crossarch_common.sh
- chmod +x ./**/get_version.sh
script:
- set -eo pipefail
- |
if [[ -z "${CROSSARCH_COMMON+x}" ]]; then
source ./docker_crossarch_common.sh
if [[ -f "./${BUILD}/entrypoint.sh" ]]; then
crossarch_common_build "./${BUILD}/Dockerfile" "./${BUILD}/entrypoint.sh"
else
crossarch_common_build "./${BUILD}/Dockerfile"
fi
source "./${BUILD}/get_version.sh"
local build_version
build_version=$(crossarch_build_get_version)
crossarch_common_deploy "${DOCKER_USERNAME}" "${DOCKER_PASSWORD}" "${BUILD}" "${build_version}"
else
shellcheck ./docker_crossarch_common.sh
fi
| Use last docker ce for build | Use last docker ce for build
| YAML | mit | GaetanCambier/docker,GaetanCambier/docker | yaml | ## Code Before:
sudo: required
services:
- docker
language: bash
addons:
apt:
sources:
- debian-sid
packages:
- shellcheck
env:
matrix:
- CROSSARCH_COMMON=1
- BUILD=bind
install:
- set -eo pipefail
- chmod +x ./docker_crossarch_common.sh
- chmod +x ./**/get_version.sh
script:
- set -eo pipefail
- |
if [[ -z "${CROSSARCH_COMMON+x}" ]]; then
source ./docker_crossarch_common.sh
if [[ -f "./${BUILD}/entrypoint.sh" ]]; then
crossarch_common_build "./${BUILD}/Dockerfile" "./${BUILD}/entrypoint.sh"
else
crossarch_common_build "./${BUILD}/Dockerfile"
fi
source "./${BUILD}/get_version.sh"
local build_version
build_version=$(crossarch_build_get_version)
crossarch_common_deploy "${DOCKER_USERNAME}" "${DOCKER_PASSWORD}" "${BUILD}" "${build_version}"
else
shellcheck ./docker_crossarch_common.sh
fi
## Instruction:
Use last docker ce for build
## Code After:
sudo: required
services:
- docker
before_install:
- curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
- sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
- sudo apt-get update
- sudo apt-get -y install docker-ce
language: bash
addons:
apt:
sources:
- debian-sid
packages:
- shellcheck
env:
matrix:
- CROSSARCH_COMMON=1
- BUILD=bind
install:
- set -eo pipefail
- chmod +x ./docker_crossarch_common.sh
- chmod +x ./**/get_version.sh
script:
- set -eo pipefail
- |
if [[ -z "${CROSSARCH_COMMON+x}" ]]; then
source ./docker_crossarch_common.sh
if [[ -f "./${BUILD}/entrypoint.sh" ]]; then
crossarch_common_build "./${BUILD}/Dockerfile" "./${BUILD}/entrypoint.sh"
else
crossarch_common_build "./${BUILD}/Dockerfile"
fi
source "./${BUILD}/get_version.sh"
local build_version
build_version=$(crossarch_build_get_version)
crossarch_common_deploy "${DOCKER_USERNAME}" "${DOCKER_PASSWORD}" "${BUILD}" "${build_version}"
else
shellcheck ./docker_crossarch_common.sh
fi
| sudo: required
+
services:
- docker
+
+ before_install:
+ - curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
+ - sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
+ - sudo apt-get update
+ - sudo apt-get -y install docker-ce
+
language: bash
+
addons:
apt:
sources:
- debian-sid
packages:
- shellcheck
+
env:
matrix:
- CROSSARCH_COMMON=1
- BUILD=bind
+
install:
- set -eo pipefail
- chmod +x ./docker_crossarch_common.sh
- chmod +x ./**/get_version.sh
+
script:
- set -eo pipefail
- |
if [[ -z "${CROSSARCH_COMMON+x}" ]]; then
source ./docker_crossarch_common.sh
if [[ -f "./${BUILD}/entrypoint.sh" ]]; then
crossarch_common_build "./${BUILD}/Dockerfile" "./${BUILD}/entrypoint.sh"
else
crossarch_common_build "./${BUILD}/Dockerfile"
fi
source "./${BUILD}/get_version.sh"
local build_version
build_version=$(crossarch_build_get_version)
crossarch_common_deploy "${DOCKER_USERNAME}" "${DOCKER_PASSWORD}" "${BUILD}" "${build_version}"
else
shellcheck ./docker_crossarch_common.sh
fi
+ | 13 | 0.371429 | 13 | 0 |
734a9b6e574b0ab106197ed087bd749829ebeae3 | .travis.yml | .travis.yml | language: clojure
jdk:
- oraclejdk8
node_js:
- "4"
install:
- wget -O boot https://github.com/boot-clj/boot-bin/releases/download/latest/boot.sh
- chmod 755 boot
- npm install karma karma-cljs-test karma-cli
- export PATH="$PWD:$PWD/node_modules/.bin;$PATH"
- export BOOT_HOME=.travis.boot
before_script:
- java -version
- phantomjs -v
- karma -v
- firefox -v
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start" # start xvfb for firefox
- sleep 5 # give xvfb some time to start
- boot show --deps
script: boot testc
| language: clojure
jdk:
- oraclejdk8
node_js:
- "4"
install:
- wget -O boot https://github.com/boot-clj/boot-bin/releases/download/latest/boot.sh
- chmod 755 boot
- npm install karma karma-cljs-test karma-cli
- export PATH="$PWD:$PWD/node_modules/.bin;$PATH"
- export BOOT_HOME=.travis.boot
before_script:
- java -version
- boot --version
- phantomjs -v
- karma --version
- firefox -v
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start" # start xvfb for firefox
- sleep 5 # give xvfb some time to start
- boot show --deps
script: boot testc
| Fix karma --version for Travis | Fix karma --version for Travis
Signed-off-by: Tom Marble <b248307e886b80bf9cb4014595572a835b049bf4@info9.net>
| YAML | mit | tmarble/avenir | yaml | ## Code Before:
language: clojure
jdk:
- oraclejdk8
node_js:
- "4"
install:
- wget -O boot https://github.com/boot-clj/boot-bin/releases/download/latest/boot.sh
- chmod 755 boot
- npm install karma karma-cljs-test karma-cli
- export PATH="$PWD:$PWD/node_modules/.bin;$PATH"
- export BOOT_HOME=.travis.boot
before_script:
- java -version
- phantomjs -v
- karma -v
- firefox -v
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start" # start xvfb for firefox
- sleep 5 # give xvfb some time to start
- boot show --deps
script: boot testc
## Instruction:
Fix karma --version for Travis
Signed-off-by: Tom Marble <b248307e886b80bf9cb4014595572a835b049bf4@info9.net>
## Code After:
language: clojure
jdk:
- oraclejdk8
node_js:
- "4"
install:
- wget -O boot https://github.com/boot-clj/boot-bin/releases/download/latest/boot.sh
- chmod 755 boot
- npm install karma karma-cljs-test karma-cli
- export PATH="$PWD:$PWD/node_modules/.bin;$PATH"
- export BOOT_HOME=.travis.boot
before_script:
- java -version
- boot --version
- phantomjs -v
- karma --version
- firefox -v
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start" # start xvfb for firefox
- sleep 5 # give xvfb some time to start
- boot show --deps
script: boot testc
| language: clojure
jdk:
- oraclejdk8
node_js:
- "4"
install:
- wget -O boot https://github.com/boot-clj/boot-bin/releases/download/latest/boot.sh
- chmod 755 boot
- npm install karma karma-cljs-test karma-cli
- export PATH="$PWD:$PWD/node_modules/.bin;$PATH"
- export BOOT_HOME=.travis.boot
before_script:
- java -version
+ - boot --version
- phantomjs -v
- - karma -v
+ - karma --version
- firefox -v
- "export DISPLAY=:99.0"
- "sh -e /etc/init.d/xvfb start" # start xvfb for firefox
- sleep 5 # give xvfb some time to start
- boot show --deps
script: boot testc | 3 | 0.142857 | 2 | 1 |