| prompt (string, 77-6.9k chars) | completion (string, 1-3.38k chars) | text (string, 163-9.03k chars) | commit (string, 40 chars) | old_file (string, 4-264 chars) | new_file (string, 4-264 chars) | lang (277 classes) |
|---|---|---|---|---|---|---|
<|file_sep|>original/examples/triggers/ceilometer_alarm_threshold.yaml
version: 1.0
time_constraints:
- name: a_time_constraint
start: '* * * * * *'
duration: 3600
description: a time constraint
timezone: 'Asia/Shanghai'
rule:
meter_name: cpu_util
comparison_operator: lt
threshold: 15
period: 60
evaluation_periods: 1
statistic: avg
query:
- field: resource_metadata.cluster
op: eq
type: ''
value: cluster1
<|file_sep|>current/examples/triggers/ceilometer_alarm_threshold.yaml
version: 1.0
time_constraints:
- name: a_time_constraint
start: '* * * * * *'
duration: 3600
description: a time constraint
timezone: 'Asia/Shanghai'
rule:
meter_name: cpu_util
comparison_operator: lt
threshold: 15
period: 60
evaluation_periods: 1
statistic: avg
query:
- field: resource_metadata.cluster
op: eq
type: ''
value: cluster1
<|file_sep|>updated/examples/triggers/ceilometer_alarm_threshold.yaml
type: CeilometerThresholdAlarm
version: 1.0
time_constraints:
- name: a_time_constraint
start: '* * * * * *'
duration: 3600
description: a time constraint
timezone: 'Asia/Shanghai'
rule:
meter_name: cpu_util
comparison_operator: lt
threshold: 15
period: 60
evaluation_periods: 1
statistic: avg
query:
- field: resource_metadata.cluster
op: eq
value: cluster1
222f1a0b52f21ce0c61ee7828834b62b727c936e | examples/triggers/ceilometer_alarm_threshold.yaml | examples/triggers/ceilometer_alarm_threshold.yaml | YAML
<|file_sep|>resources/views/mail/travel/docusignenvelopereceived.blade.php.diff
original:
updated:
<|file_sep|>original/resources/views/mail/travel/docusignenvelopereceived.blade.php
Hi {{ $envelope->signedBy->preferred_first_name }},
We received your Travel Authority Request for {{ $envelope->signable->travel->name }}. Once everyone has submitted their documents, we'll forward all of them to Georgia Tech for review and approval.
You can view your completed form within DocuSign at {{ $envelope->sender_view_url }}.
@if(!$envelope->signable->is_paid)
You still need to make a ${{ intval($envelope->signable->travel->fee_amount) }} payment for the travel fee.
You can pay online with a credit or debit card at {{ route('pay.travel') }}. Note that we add an additional ${{ number_format(\App\Models\Payment::calculateSurcharge($envelope->signable->travel->fee_amount * 100) / 100, 2) }} surcharge for online payments.
If you would prefer to pay by cash or check, make arrangements with {{ $envelope->signable->travel->primaryContact->full_name }}.
Write checks to Georgia Tech, with RoboJackets on the memo line. Don't forget to sign it!
@endif
For more information, visit {{ route('travel.index') }}. If you have any questions, please contact {{ $envelope->signable->travel->primaryContact->full_name }}.
----
To stop receiving emails from {{ config('app.name') }}, visit @{{{ pm:unsubscribe }}}.
<|file_sep|>current/resources/views/mail/travel/docusignenvelopereceived.blade.php
We received your Travel Authority Request for {{ $envelope->signable->travel->name }}. Once everyone has submitted their documents, we'll forward all of them to Georgia Tech for review and approval.
You can view your completed form within DocuSign at {{ $envelope->sender_view_url }}.
@if(!$envelope->signable->is_paid)
You still need to make a ${{ intval($envelope->signable->travel->fee_amount) }} payment for the travel fee.
You can pay online with a credit or debit card at {{ route('pay.travel') }}. Note that we add an additional ${{ number_format(\App\Models\Payment::calculateSurcharge($envelope->signable->travel->fee_amount * 100) / 100, 2) }} surcharge for online payments.
If you would prefer to pay by cash or check, make arrangements with {{ $envelope->signable->travel->primaryContact->full_name }}.
Write checks to Georgia Tech, with RoboJackets on the memo line. Don't forget to sign it!
@endif
For more information, visit {{ route('travel.index') }}. If you have any questions, please contact {{ $envelope->signable->travel->primaryContact->full_name }}.
----
To stop receiving emails from {{ config('app.name') }}, visit @{{{ pm:unsubscribe }}}.
<|file_sep|>updated/resources/views/mail/travel/docusignenvelopereceived.blade.php
Hi {{ $envelope->signedBy->preferred_first_name }},
We received your Travel Authority Request for {{ $envelope->signable->travel->name }}. Once everyone has submitted their documents, we'll forward all of them to Georgia Tech for review and approval.
You can view your completed form within DocuSign at {{ $envelope->sender_view_url }}.
@if(!$envelope->signable->is_paid)
You still need to make a ${{ intval($envelope->signable->travel->fee_amount) }} payment for the travel fee.
You can pay online with a credit or debit card at {{ route('pay.travel') }}. Note that we add an additional ${{ number_format(\App\Models\Payment::calculateSurcharge($envelope->signable->travel->fee_amount * 100) / 100, 2) }} surcharge for online payments.
If you would prefer to pay by cash or check, make arrangements with {{ $envelope->signable->travel->primaryContact->full_name }}.
Write checks to Georgia Tech, with RoboJackets on the memo line. Don't forget to sign it!
@endif
For more information, visit {{ route('travel.index') }}. If you have any questions, please contact {{ $envelope->signable->travel->primaryContact->full_name }}.
----
To stop receiving emails from {{ config('app.name') }}, visit @{{{ pm:unsubscribe }}}.
e5a38af9d7eea51107f2aa7397372b546ef89342 | resources/views/mail/travel/docusignenvelopereceived.blade.php | resources/views/mail/travel/docusignenvelopereceived.blade.php | PHP
<|file_sep|>original/dbt/clients/agate_helper.py
import agate
DEFAULT_TYPE_TESTER = agate.TypeTester(types=[
agate.data_types.Number(),
agate.data_types.Date(),
agate.data_types.DateTime(),
agate.data_types.Boolean(),
agate.data_types.Text()
])
def table_from_data(data, column_names):
"Convert list of dictionaries into an Agate table"
# The agate table is generated from a list of dicts, so the column order
# from `data` is not preserved. We can use `select` to reorder the columns
#
# If there is no data, create an empty table with the specified columns
if len(data) == 0:
<|file_sep|>current/dbt/clients/agate_helper.py
import agate
DEFAULT_TYPE_TESTER = agate.TypeTester(types=[
agate.data_types.Number(),
agate.data_types.Date(),
agate.data_types.DateTime(),
agate.data_types.Boolean(),
agate.data_types.Text()
])
def table_from_data(data, column_names):
"Convert list of dictionaries into an Agate table"
# The agate table is generated from a list of dicts, so the column order
# from `data` is not preserved. We can use `select` to reorder the columns
#
# If there is no data, create an empty table with the specified columns
if len(data) == 0:
<|file_sep|>updated/dbt/clients/agate_helper.py
import agate
DEFAULT_TYPE_TESTER = agate.TypeTester(types=[
agate.data_types.Boolean(true_values=('true',),
false_values=('false',),
null_values=('null',)),
agate.data_types.Number(null_values=('null',)),
agate.data_types.TimeDelta(null_values=('null',)),
agate.data_types.Date(null_values=('null',)),
agate.data_types.DateTime(null_values=('null',)),
agate.data_types.Text(null_values=('null',))
])
def table_from_data(data, column_names):
"Convert list of dictionaries into an Agate table"
# The agate table is generated from a list of dicts, so the column order
# from `data` is not preserved. We can use `select` to reorder the columns
#
30f03692eff862f1456b9c376c21fe8e57de7eaa | dbt/clients/agate_helper.py | dbt/clients/agate_helper.py | Python
<|file_sep|>original/package.json
{
"name": "rework-calc",
"version": "0.2.2",
"description": "calc() support for Rework",
"dependencies": {
"balanced-match": "^0.1.0",
"rework-visit": "^1.0.0"
},
"devDependencies": {
"mocha": "~1.15.1",
"rework": "^1.0.0"
},
"files": [
"index.js"
],
"scripts": {
"test": "mocha --no-colors",
"watch": "mocha --slow 30 --reporter spec --watch"
},
"repository": {
"type": "git",
<|file_sep|>current/package.json
{
"name": "rework-calc",
"version": "0.2.2",
"description": "calc() support for Rework",
"dependencies": {
"balanced-match": "^0.1.0",
"rework-visit": "^1.0.0"
},
"devDependencies": {
"mocha": "~1.15.1",
"rework": "^1.0.0"
},
"files": [
"index.js"
],
"scripts": {
"test": "mocha --no-colors",
"watch": "mocha --slow 30 --reporter spec --watch"
},
"repository": {
"type": "git",
<|file_sep|>updated/package.json
{
"name": "rework-calc",
"version": "0.2.2",
"description": "calc() support for Rework",
"dependencies": {
"balanced-match": "~0.1.0",
"rework-visit": "^1.0.0"
},
"devDependencies": {
"mocha": "~1.15.1",
"rework": "^1.0.0"
},
"files": [
"index.js"
],
"scripts": {
"test": "mocha --no-colors",
"watch": "mocha --slow 30 --reporter spec --watch"
},
"repository": {
"type": "git",
24a22d6efef2c65a7ff397f464ed5ad7a56692b1 | package.json | package.json | JSON
<|file_sep|>original/src/txkube/_memory.py
@implementer(IKubernetes)
class _MemoryKubernetes(object):
"""
``_MemoryKubernetes`` maintains state in-memory which approximates
the state of a real Kubernetes deployment sufficiently to expose a
subset of the external Kubernetes API.
"""
def __init__(self):
base_url = URL.fromText(u"https://localhost/")
self._resource = _kubernetes_resource()
self._kubernetes = network_kubernetes(
base_url=base_url,
credentials=None,
agent=RequestTraversalAgent(self._resource),
)
def client(self, *args, **kwargs):
"""
:return IKubernetesClient: A new client which interacts with this
<|file_sep|>current/src/txkube/_memory.py
@implementer(IKubernetes)
class _MemoryKubernetes(object):
"""
``_MemoryKubernetes`` maintains state in-memory which approximates
the state of a real Kubernetes deployment sufficiently to expose a
subset of the external Kubernetes API.
"""
def __init__(self):
base_url = URL.fromText(u"https://localhost/")
self._resource = _kubernetes_resource()
self._kubernetes = network_kubernetes(
base_url=base_url,
credentials=None,
agent=RequestTraversalAgent(self._resource),
)
def client(self, *args, **kwargs):
"""
:return IKubernetesClient: A new client which interacts with this
<|file_sep|>updated/src/txkube/_memory.py
@implementer(IKubernetes)
class _MemoryKubernetes(object):
"""
``_MemoryKubernetes`` maintains state in-memory which approximates
the state of a real Kubernetes deployment sufficiently to expose a
subset of the external Kubernetes API.
"""
def __init__(self):
base_url = URL.fromText(u"https://kubernetes.example.invalid./")
self._resource = _kubernetes_resource()
self._kubernetes = network_kubernetes(
base_url=base_url,
credentials=None,
agent=RequestTraversalAgent(self._resource),
)
def client(self, *args, **kwargs):
"""
:return IKubernetesClient: A new client which interacts with this
da85cfae848df4cee5ccf2cbbbc370848ea172a7 | src/txkube/_memory.py | src/txkube/_memory.py | Python
<|file_sep|>scanifc-sys/Cargo.toml.diff
original:
updated:
description = "Bingen bindings to Riegl's scanifc library, part of RiVLib"
<|file_sep|>original/scanifc-sys/Cargo.toml
[package]
name = "scanifc-sys"
version = "0.1.0"
authors = ["Pete Gadomski <pete.gadomski@gmail.com>"]
links = "scanifc"
build = "build.rs"
[build-dependencies]
bindgen = "0.33"
<|file_sep|>current/scanifc-sys/Cargo.toml
[package]
name = "scanifc-sys"
version = "0.1.0"
authors = ["Pete Gadomski <pete.gadomski@gmail.com>"]
description = "Bingen bindings to Riegl's scanifc library, part of RiVLib"
links = "scanifc"
build = "build.rs"
[build-dependencies]
bindgen = "0.33"
<|file_sep|>updated/scanifc-sys/Cargo.toml
[package]
name = "scanifc-sys"
version = "0.1.0"
authors = ["Pete Gadomski <pete.gadomski@gmail.com>"]
description = "Bingen bindings to Riegl's scanifc library, part of RiVLib"
links = "scanifc"
build = "build.rs"
keywords = ["riegl", "lidar"]
license = "MIT/ASL2"
[build-dependencies]
bindgen = "0.33"
0fc0bd491dab1dac2eef44f69531ab2eb91177be | scanifc-sys/Cargo.toml | scanifc-sys/Cargo.toml | TOML
<|file_sep|>original/packages/ng/ngx-export-tools.yaml
<|file_sep|>current/packages/ng/ngx-export-tools.yaml
<|file_sep|>updated/packages/ng/ngx-export-tools.yaml
homepage: http://github.com/lyokha/nginx-haskell-module
changelog-type: markdown
hash: b36a28d7cbd1152692247002d6bfc9361f02d415aac98d9831fc6e461bb353d0
test-bench-deps: {}
maintainer: Alexey Radkov <alexey.radkov@gmail.com>
synopsis: Extra tools for Nginx haskell module
changelog: ! '### 0.1.0.0
- Initial version.
'
basic-deps:
bytestring: ! '>=0.10.0.0'
base: ! '>=4.8 && <5'
ngx-export: ! '>=1.4.1'
aeson: ! '>=1.0'
template-haskell: ! '>=2.11.0.0'
safe: -any
all-versions:
c176483848e06d02fd65e5cd078bf2d0e9242eb4 | packages/ng/ngx-export-tools.yaml | packages/ng/ngx-export-tools.yaml | YAML
<|file_sep|>original/.circleci/config.yml
name: Calculate cache key for Maven dependencies
command: |
{
md5sum gradle/wrapper/gradle-wrapper.properties
md5sum settings.gradle
md5sum $(find . -name 'build.gradle')
} > ~/cache-key-source-gradle
- restore_cache:
keys:
- v1-dependencies-{{ checksum "~/cache-key-source-gradle" }}
# fallback to using the latest cache if no exact match is found
- v1-dependencies-
- run: gradle build
- save_cache:
paths:
- ~/.gradle
- ~/.m2
key: v1-dependencies-{{ checksum "~/cache-key-source-gradle" }}
<|file_sep|>current/.circleci/config.yml
name: Calculate cache key for Maven dependencies
command: |
{
md5sum gradle/wrapper/gradle-wrapper.properties
md5sum settings.gradle
md5sum $(find . -name 'build.gradle')
} > ~/cache-key-source-gradle
- restore_cache:
keys:
- v1-dependencies-{{ checksum "~/cache-key-source-gradle" }}
# fallback to using the latest cache if no exact match is found
- v1-dependencies-
- run: gradle build
- save_cache:
paths:
- ~/.gradle
- ~/.m2
key: v1-dependencies-{{ checksum "~/cache-key-source-gradle" }}
<|file_sep|>updated/.circleci/config.yml
{
md5sum gradle/wrapper/gradle-wrapper.properties
md5sum settings.gradle
md5sum $(find . -name 'build.gradle')
} > ~/cache-key-source-gradle
- restore_cache:
keys:
- v1-dependencies-{{ checksum "~/cache-key-source-gradle" }}
# fallback to using the latest cache if no exact match is found
- v1-dependencies-
- run: gradle build
- store_test_results:
path: core/build/test-results
- save_cache:
paths:
- ~/.gradle
- ~/.m2
key: v1-dependencies-{{ checksum "~/cache-key-source-gradle" }}
5c93fa8efa69e7ddba5610910023e452e7ca60e4 | .circleci/config.yml | .circleci/config.yml | YAML
<|file_sep|>original/examples/moses-pln-synergy/scm/pln-inference.scm
<|file_sep|>current/examples/moses-pln-synergy/scm/pln-inference.scm
<|file_sep|>updated/examples/moses-pln-synergy/scm/pln-inference.scm
;; PLN inference carried over the background knowledge to overwrite
;; the truth value of the MOSES model based solely on the dataset.
;; Load MOSES model
(load "moses-model.scm")
;; Load the background knowledge
(load "background-knowledge.scm")
;; Load the PLN configuration for this demo
(load "pln-config.scm")
;; Apply the inference rules using the pattern matcher. This only
;; contains the executions, see the README.md for the explanations and
;; the expected results.
;; (1)
(for-each (lambda (i) (cog-bind implication-partial-instantiation-rule))
(iota 2))
;; (2)
be1260c3417570394dcf62cf73f382cc3f46616e | examples/moses-pln-synergy/scm/pln-inference.scm | examples/moses-pln-synergy/scm/pln-inference.scm | Scheme
<|file_sep|>original/.github/workflows/dotnet.yml
branches: [ Integration ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Publish NuGet
# You may pin to the exact commit or the version.
# uses: brandedoutcast/publish-nuget@c12b8546b67672ee38ac87bea491ac94a587f7cc
uses: brandedoutcast/publish-nuget@v2.5.5
with:
# Filepath of the project to be packaged, relative to root of repository
PROJECT_FILE_PATH: "PushbulletSharp"
# NuGet package id, used for version detection & defaults to project name
PACKAGE_NAME: "PushbulletSharp"
# API key to authenticate with NuGet server
NUGET_KEY: "17ded98493732c649f2474e78890ff567767b26b"
# NuGet server uri hosting the packages, defaults to https://api.nuget.org
NUGET_SOURCE: "https://nuget.pkg.github.com/kekonn/index.json"
<|file_sep|>current/.github/workflows/dotnet.yml
branches: [ Integration ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Publish NuGet
# You may pin to the exact commit or the version.
# uses: brandedoutcast/publish-nuget@c12b8546b67672ee38ac87bea491ac94a587f7cc
uses: brandedoutcast/publish-nuget@v2.5.5
with:
# Filepath of the project to be packaged, relative to root of repository
PROJECT_FILE_PATH: "PushbulletSharp"
# NuGet package id, used for version detection & defaults to project name
PACKAGE_NAME: "PushbulletSharp"
# API key to authenticate with NuGet server
NUGET_KEY: "17ded98493732c649f2474e78890ff567767b26b"
# NuGet server uri hosting the packages, defaults to https://api.nuget.org
NUGET_SOURCE: "https://nuget.pkg.github.com/kekonn/index.json"
<|file_sep|>updated/.github/workflows/dotnet.yml
branches: [ Integration ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Publish NuGet
# You may pin to the exact commit or the version.
# uses: brandedoutcast/publish-nuget@c12b8546b67672ee38ac87bea491ac94a587f7cc
uses: brandedoutcast/publish-nuget@v2.5.5
with:
# Filepath of the project to be packaged, relative to root of repository
PROJECT_FILE_PATH: "PushbulletSharp/PushbulletSharp.csproj"
# NuGet package id, used for version detection & defaults to project name
PACKAGE_NAME: "PushbulletSharp"
# API key to authenticate with NuGet server
NUGET_KEY: "17ded98493732c649f2474e78890ff567767b26b"
# NuGet server uri hosting the packages, defaults to https://api.nuget.org
NUGET_SOURCE: "https://nuget.pkg.github.com/kekonn/index.json" | 8cbc8ceb65f5f22defdad1ff55f94bc4236f9783 | .github/workflows/dotnet.yml | .github/workflows/dotnet.yml | YAML
<|file_sep|>original/bin/restart_job.rb
<|file_sep|>current/bin/restart_job.rb
<|file_sep|>updated/bin/restart_job.rb | #!/usr/bin/env ruby
#
# (c) 2014 -- onwards Georgios Gousios <gousiosg@gmail.com>
#
# Distributed under the 2-close BSD license, see top level directory.
require 'db_stuff'
require 'queue_stuff'
require 'ghtorrent'
require 'json'
require 'yaml'
include QueueStuff
include DBStuff
include GHTorrent::Logging
job_id = ARGV[0].to_i
def settings
YAML.load_file(ARGV[1])
end | 0c10c92d19ed85da22597fae5f56f2115d05af62 | bin/restart_job.rb | bin/restart_job.rb | Ruby
<|file_sep|>original/fake_email_validator.gemspec
spec.name = 'fake_email_validator'
spec.version = '0.0.5'
spec.authors = ['Maxim Dobryakov']
spec.email = ['maxim.dobryakov@gmail.com']
spec.summary = %q{E-Mail validator for Rails to block fake emails}
spec.description = %q{E-Mail validator for Rails to block services like mailinator.com}
spec.homepage = 'https://github.com/maxd/fake_email_validator'
spec.license = 'MIT'
spec.files = `git ls-files`.split($/)
spec.executables = spec.files.grep(%r{^bin/}) { |f| File.basename(f) }
spec.test_files = spec.files.grep(%r{^(test|spec|features)/})
spec.require_paths = ['lib']
spec.add_development_dependency 'bundler', '~> 1.5'
spec.add_development_dependency 'rake', '~> 10'
spec.add_development_dependency 'minitest', '~> 4'
spec.add_runtime_dependency 'activemodel', '~> 4.0', '> 4.0.0'
spec.add_runtime_dependency 'mail', '~> 2.5', '> 2.5.3'
end
<|file_sep|>current/fake_email_validator.gemspec
spec.name = 'fake_email_validator'
spec.version = '0.0.5'
spec.authors = ['Maxim Dobryakov']
spec.email = ['maxim.dobryakov@gmail.com']
spec.summary = %q{E-Mail validator for Rails to block fake emails}
spec.description = %q{E-Mail validator for Rails to block services like mailinator.com}
spec.homepage = 'https://github.com/maxd/fake_email_validator'
spec.license = 'MIT'
spec.files = `git ls-files`.split($/)
spec.executables = spec.files.grep(%r{^bin/}) { |f| File.basename(f) }
spec.test_files = spec.files.grep(%r{^(test|spec|features)/})
spec.require_paths = ['lib']
spec.add_development_dependency 'bundler', '~> 1.5'
spec.add_development_dependency 'rake', '~> 10'
spec.add_development_dependency 'minitest', '~> 4'
spec.add_runtime_dependency 'activemodel', '~> 4.0', '> 4.0.0'
spec.add_runtime_dependency 'mail', '~> 2.5', '> 2.5.3'
end
<|file_sep|>updated/fake_email_validator.gemspec | spec.name = 'fake_email_validator'
spec.version = '0.0.5'
spec.authors = ['Maxim Dobryakov']
spec.email = ['maxim.dobryakov@gmail.com']
spec.summary = %q{E-Mail validator for Rails to block fake emails}
spec.description = %q{E-Mail validator for Rails to block services like mailinator.com}
spec.homepage = 'https://github.com/maxd/fake_email_validator'
spec.license = 'MIT'
spec.files = `git ls-files`.split($/)
spec.executables = spec.files.grep(%r{^bin/}) { |f| File.basename(f) }
spec.test_files = spec.files.grep(%r{^(test|spec|features)/})
spec.require_paths = ['lib']
spec.add_development_dependency 'bundler', '~> 1.5'
spec.add_development_dependency 'rake', '~> 10'
spec.add_development_dependency 'minitest', '~> 4'
spec.add_runtime_dependency 'activemodel'
spec.add_runtime_dependency 'mail'
end | 163d9a21ec2a0c3f28c34f847f2114e9cdd54f98 | fake_email_validator.gemspec | fake_email_validator.gemspec | Ruby
<|file_sep|>original/package.json
"gatsby-link": "^1.6.40",
"gatsby-plugin-react-helmet": "^2.0.10",
"gatsby-source-filesystem": "^1.5.36",
"gatsby-transformer-remark": "^1.7.41",
"react-helmet": "^5.2.0"
},
"keywords": [
"gatsby"
],
"license": "MIT",
"scripts": {
"build": "gatsby build",
"develop": "gatsby develop",
"format": "prettier --write 'src/**/*.js'",
"test": "echo \"Error: no test specified\" && exit 1",
"deploy": "gatsby build --prefix-paths && gh-pages -d public"
},
"devDependencies": {
"prettier": "^1.12.0"
}
}
<|file_sep|>current/package.json
"gatsby-link": "^1.6.40",
"gatsby-plugin-react-helmet": "^2.0.10",
"gatsby-source-filesystem": "^1.5.36",
"gatsby-transformer-remark": "^1.7.41",
"react-helmet": "^5.2.0"
},
"keywords": [
"gatsby"
],
"license": "MIT",
"scripts": {
"build": "gatsby build",
"develop": "gatsby develop",
"format": "prettier --write 'src/**/*.js'",
"test": "echo \"Error: no test specified\" && exit 1",
"deploy": "gatsby build --prefix-paths && gh-pages -d public"
},
"devDependencies": {
"prettier": "^1.12.0"
}
}
<|file_sep|>updated/package.json | "gatsby-link": "^1.6.40",
"gatsby-plugin-react-helmet": "^2.0.10",
"gatsby-source-filesystem": "^1.5.36",
"gatsby-transformer-remark": "^1.7.41",
"react-helmet": "^5.2.0"
},
"keywords": [
"gatsby"
],
"license": "MIT",
"scripts": {
"build": "gatsby build",
"develop": "gatsby develop",
"format": "prettier --write 'src/**/*.js'",
"test": "echo \"Error: no test specified\" && exit 1",
"deploy": "gatsby build --prefix-paths && gh-pages -d public -b master"
},
"devDependencies": {
"prettier": "^1.12.0"
}
} | 657f9827defd9362b81df2764041d912bc339582 | package.json | package.json | JSON
<|file_sep|>original/axelrod/random_.py
<|file_sep|>current/axelrod/random_.py
<|file_sep|>updated/axelrod/random_.py | import random
def random_choice(p=0.5):
"""
Return 'C' with probability `p`, else return 'D'
Emulates Python's random.choice(['C', 'D']) since it is not consistent
across Python 2.7 to Python 3.4"""
r = random.random()
if r < p:
return 'C'
return 'D' | 2a579b7df30546e642d87b70417ecf8a1a9590e0 | axelrod/random_.py | axelrod/random_.py | Python
<|file_sep|>original/docs/Examples.md
# Examples with code
The easiest way to see how react-day-picker works is to check the [examples with code](http://react-day-picker.js.org/examples).
## Running the examples locally
If you want to play with the examples, setup the repo and the app:
1. Setup the repo and the app (remember to run `npm install` from the root:
```
git clone https://github.com/gpbl/react-day-picker.git
cd react-day-picker
npm install
npm run build
cd examples
npm install
```
<|file_sep|>current/docs/Examples.md
# Examples with code
The easiest way to see how react-day-picker works is to check the [examples with code](http://react-day-picker.js.org/examples).
## Running the examples locally
If you want to play with the examples, setup the repo and the app:
1. Setup the repo and the app (remember to run `npm install` from the root:
```
git clone https://github.com/gpbl/react-day-picker.git
cd react-day-picker
npm install
npm run build
cd examples
npm install
```
<|file_sep|>updated/docs/Examples.md | # Examples with code
The easiest way to see how react-day-picker works is to check the [examples with code](http://react-day-picker.js.org/examples).
## Running the examples locally
If you want to play with the examples, setup the repo and the app:
1. Setup the repo and the app (remember to run `npm run build` from the root):
```
git clone https://github.com/gpbl/react-day-picker.git
cd react-day-picker
npm install
npm run build
cd examples
npm install
```
| <|file_sep|>original/docs/Examples.md
# Examples with code
The easiest way to see how react-day-picker works is to check the [examples with code](http://react-day-picker.js.org/examples).
## Running the examples locally
If you want to play with the examples, setup the repo and the app:
1. Setup the repo and the app (remember to run `npm install` from the root:
```
git clone https://github.com/gpbl/react-day-picker.git
cd react-day-picker
npm install
npm run build
cd examples
npm install
```
<|file_sep|>current/docs/Examples.md
# Examples with code
The easiest way to see how react-day-picker works is to check the [examples with code](http://react-day-picker.js.org/examples).
## Running the examples locally
If you want to play with the examples, setup the repo and the app:
1. Setup the repo and the app (remember to run `npm install` from the root:
```
git clone https://github.com/gpbl/react-day-picker.git
cd react-day-picker
npm install
npm run build
cd examples
npm install
```
<|file_sep|>updated/docs/Examples.md
# Examples with code
The easiest way to see how react-day-picker works is to check the [examples with code](http://react-day-picker.js.org/examples).
## Running the examples locally
If you want to play with the examples, setup the repo and the app:
1. Setup the repo and the app (remember to run `npm run build` from the root):
```
git clone https://github.com/gpbl/react-day-picker.git
cd react-day-picker
npm install
npm run build
cd examples
npm install
```
| 11ef86b7c1974db35b0225d86b0f945f3e70d7d3 | docs/Examples.md | docs/Examples.md | Markdown |
<|file_sep|>original/examples/listdevs.py
#!/usr/bin/env python
import usb1
def main():
context = usb1.USBContext()
for device in context.getDeviceList(skip_on_error=True):
print str(device)
if __name__ == '__main__':
main()
<|file_sep|>current/examples/listdevs.py
#!/usr/bin/env python
import usb1
def main():
context = usb1.USBContext()
for device in context.getDeviceList(skip_on_error=True):
print str(device)
if __name__ == '__main__':
main()
<|file_sep|>updated/examples/listdevs.py | #!/usr/bin/env python
import usb1
def main():
context = usb1.USBContext()
for device in context.getDeviceList(skip_on_error=True):
print 'ID %04x:%04x' % (device.getVendorID(), device.getProductID()), '->'.join(str(x) for x in ['Bus %03i' % (device.getBusNumber(), )] + device.getPortNumberList()), 'Device', device.getDeviceAddress()
if __name__ == '__main__':
main() | 245628bf53bf7255ccd5aa15d21ff8c1f5751ef8 | examples/listdevs.py | examples/listdevs.py | Python
<|file_sep|>original/altair/examples/world_map.py
<|file_sep|>current/altair/examples/world_map.py
<|file_sep|>updated/altair/examples/world_map.py | """
World Map
---------
This example shows how to create a world map using data generators for
different background layers.
"""
# category: maps
import altair as alt
from vega_datasets import data
# Data generators for the background
sphere = alt.sphere()
graticule = alt.graticule()
# Source of land data
source = alt.topo_feature(data.world_110m.url, 'countries')
# Layering and configuring the components
alt.layer( | a4d5e88973a25464be26488d17ecc663cce776d7 | altair/examples/world_map.py | altair/examples/world_map.py | Python
<|file_sep|>README.md.diff
original:
Modlib is a lightweight modbus library.<br>
updated:
Modlib is a very lightweight modbus library.<br>
<|file_sep|>original/README.md
# Modlib
Modlib is a lightweight modbus library.<br>
Library is targeted to run on AVR devices, but it should also work on PC.
## Work is in a progress and library doesn't have its functionality yet!
<|file_sep|>current/README.md
# Modlib
Modlib is a very lightweight modbus library.<br>
Library is targeted to run on AVR devices, but it should also work on PC.
## Work is in a progress and library doesn't have its functionality yet!
<|file_sep|>updated/README.md | # Modlib
Modlib is a very lightweight modbus library.<br>
Library is targeted to run on AVR devices, but it should also work on PC.
**Note: Library doesn't have its functionality yet, although everything is planned, and is going to be coded as soon as possible.**
## Features
- [ ] Parsing frames using unions - that makes processing really fast
- [ ] Support for all functions according data registers
- [ ] Support for reading and writing discrete input registers and coils
- [ ] Library can be used on master side as well as on slave side
- [ ] Code is written in pure C, and features things like dynamic memory allocation for minimized resources used
- [ ] Exceptions handling
## Communication process
- **Master** device sends a **frame** to **slave** device - this is known as `request`
- **Slave** processes received **frame** and responds to **Master** with another frame - this part is known as `response`
That means functions containing `request` in their names either format request on master side or interpret it on slave side. It works pretty much the same with `response`. | fbfad7e40662cd41f32973d364bc3f32d44da56f | README.md | README.md | Markdown
<|file_sep|>original/package.json
{
"name": "dicey",
"description": "Pen-and-paper roleplaying game aid - dice roller.",
"version": "2.0.1",
"scripts": {
"start": "npm prune && npm install && npm run server",
"server": "./node_modules/.bin/ws --key ws.key --cert ws.crt --verbose -d public/",
"watch": "node ./scripts/browserify.js"
},
"devDependencies": {
"babel-preset-react": "^6.5.0",
"babelify": "^7.2.0",
"browserify": "^13.0.0",
"droll": "^0.1.2",
"local-web-server": "^1.2.2",
"react": "*",
"react-dom": "^15.0.1",
"watchify": "^3.7.0"
}
}
<|file_sep|>current/package.json
{
"name": "dicey",
"description": "Pen-and-paper roleplaying game aid - dice roller.",
"version": "2.0.1",
"scripts": {
"start": "npm prune && npm install && npm run server",
"server": "./node_modules/.bin/ws --key ws.key --cert ws.crt --verbose -d public/",
"watch": "node ./scripts/browserify.js"
},
"devDependencies": {
"babel-preset-react": "^6.5.0",
"babelify": "^7.2.0",
"browserify": "^13.0.0",
"droll": "^0.1.2",
"local-web-server": "^1.2.2",
"react": "*",
"react-dom": "^15.0.1",
"watchify": "^3.7.0"
}
}
<|file_sep|>updated/package.json | {
"name": "dicey",
"description": "Pen-and-paper roleplaying game aid - dice roller.",
"version": "2.0.1",
"scripts": {
"start": "npm prune && npm install && npm run server",
"server": "./node_modules/.bin/ws --key ws.key --cert ws.crt --verbose --compress -d public/",
"watch": "node ./scripts/browserify.js"
},
"devDependencies": {
"babel-preset-react": "^6.5.0",
"babelify": "^7.2.0",
"browserify": "^13.0.0",
"droll": "^0.1.2",
"local-web-server": "^1.2.2",
"react": "*",
"react-dom": "^15.0.1",
"watchify": "^3.7.0"
}
} | aa9f06bf46716cf8e258bb385beb6e740f008cdb | package.json | package.json | JSON
<|file_sep|>original/README.md
# DataFurnace
[](https://travis-ci.org/minden/data-furnace)
<|file_sep|>current/README.md
# DataFurnace
[](https://travis-ci.org/minden/data-furnace)
<|file_sep|>updated/README.md | # DataFurnace
[](https://travis-ci.org/minden/data-furnace)
[](https://codeclimate.com/github/minden/data-furnace)
Continuously deployed to [46.101.183.252](http://46.101.183.252) | 5818575c71a21ec0551d2397c5585ed606466500 | README.md | README.md | Markdown
<|file_sep|>original/.travis.yml
language: objective-c
osx_image: xcode8.2
# Travis is currently experiencing issues with Xcode builds: https://github.com/travis-ci/travis-ci/issues/6675#issuecomment-257964767
# When this is resolved, remove the entire `before_install` block, as well as the `travis_retry` command below.
before_install:
- export IOS_SIMULATOR_UDID=`instruments -s devices | grep "iPhone 7 (10.1" | awk -F '[ ]' '{print $4}' | awk -F '[\[]' '{print $2}' | sed 's/.$//'`
- echo $IOS_SIMULATOR_UDID
- open -a "simulator" --args -CurrentDeviceUDID $IOS_SIMULATOR_UDID
install:
- gem install xcpretty
script:
- set -o pipefail && travis_retry xcodebuild -project Bond.xcodeproj -scheme Bond-iOS -sdk iphonesimulator -destination 'platform=iOS Simulator,name=iPhone 7,OS=10.1' test | xcpretty
after_success:
- bash <(curl -s https://codecov.io/bash)
<|file_sep|>current/.travis.yml
language: objective-c
osx_image: xcode8.2
# Travis is currently experiencing issues with Xcode builds: https://github.com/travis-ci/travis-ci/issues/6675#issuecomment-257964767
# When this is resolved, remove the entire `before_install` block, as well as the `travis_retry` command below.
before_install:
- export IOS_SIMULATOR_UDID=`instruments -s devices | grep "iPhone 7 (10.1" | awk -F '[ ]' '{print $4}' | awk -F '[\[]' '{print $2}' | sed 's/.$//'`
- echo $IOS_SIMULATOR_UDID
- open -a "simulator" --args -CurrentDeviceUDID $IOS_SIMULATOR_UDID
install:
- gem install xcpretty
script:
- set -o pipefail && travis_retry xcodebuild -project Bond.xcodeproj -scheme Bond-iOS -sdk iphonesimulator -destination 'platform=iOS Simulator,name=iPhone 7,OS=10.1' test | xcpretty
after_success:
- bash <(curl -s https://codecov.io/bash)
<|file_sep|>updated/.travis.yml | language: objective-c
osx_image: xcode8.2
# Travis is currently experiencing issues with Xcode builds: https://github.com/travis-ci/travis-ci/issues/6675#issuecomment-257964767
# When this is resolved, remove the entire `before_install` block, as well as the `travis_retry` command below.
before_install:
- export IOS_SIMULATOR_UDID=`instruments -s devices | grep "iPhone 7 (10.1" | awk -F '[ ]' '{print $4}' | awk -F '[\[]' '{print $2}' | sed 's/.$//'`
- echo $IOS_SIMULATOR_UDID
- open -a "simulator" --args -CurrentDeviceUDID $IOS_SIMULATOR_UDID
install:
- gem install xcpretty
script:
- set -o pipefail && travis_retry xcodebuild -project Bond.xcodeproj -scheme Bond-iOS -sdk iphonesimulator -destination 'platform=iOS Simulator,name=iPhone 7,OS=10.1' test | xcpretty
- set -o pipefail && travis_retry swift build
- set -o pipefail && travis_retry swift test
after_success:
- bash <(curl -s https://codecov.io/bash) | fcf1bbe7503cf40fbd789ae1febbb63de2e17bc9 | .travis.yml | .travis.yml | YAML
<|file_sep|>slyd/requirements.txt.diff
original:
splash==1.6
updated:
<|file_sep|>original/slyd/requirements.txt
# Slyd requirements
twisted==15.2.1
service_identity==14.0.0
requests==2.7.0
autobahn==0.10.4
six==1.5.2
splash==1.6
chardet==2.3.0
monotonic==0.3
# Dulwich's MySQL backend
--allow-external=mysql-connector-python
https://github.com/scrapinghub/dulwich/tarball/420a056#egg=dulwich
<|file_sep|>current/slyd/requirements.txt
# Slyd requirements
twisted==15.2.1
service_identity==14.0.0
requests==2.7.0
autobahn==0.10.4
six==1.5.2
chardet==2.3.0
monotonic==0.3
# Dulwich's MySQL backend
--allow-external=mysql-connector-python
https://github.com/scrapinghub/dulwich/tarball/420a056#egg=dulwich
<|file_sep|>updated/slyd/requirements.txt | # Slyd requirements
twisted==15.2.1
service_identity==14.0.0
requests==2.7.0
autobahn==0.10.4
six==1.5.2
chardet==2.3.0
monotonic==0.3
# Dulwich's MySQL backend
--allow-external=mysql-connector-python
https://github.com/scrapinghub/dulwich/tarball/420a056#egg=dulwich
# Splash dev
--allow-external=splash
https://github.com/scrapinghub/splash/archive/3acde4ef0d389e5067d51dcc9b4ac6d260658757.tar.gz | <|file_sep|>slyd/requirements.txt.diff
original:
splash==1.6
updated:
<|file_sep|>original/slyd/requirements.txt
# Slyd requirements
twisted==15.2.1
service_identity==14.0.0
requests==2.7.0
autobahn==0.10.4
six==1.5.2
splash==1.6
chardet==2.3.0
monotonic==0.3
# Dulwich's MySQL backend
--allow-external=mysql-connector-python
https://github.com/scrapinghub/dulwich/tarball/420a056#egg=dulwich
<|file_sep|>current/slyd/requirements.txt
# Slyd requirements
twisted==15.2.1
service_identity==14.0.0
requests==2.7.0
autobahn==0.10.4
six==1.5.2
chardet==2.3.0
monotonic==0.3
# Dulwich's MySQL backend
--allow-external=mysql-connector-python
https://github.com/scrapinghub/dulwich/tarball/420a056#egg=dulwich
<|file_sep|>updated/slyd/requirements.txt
# Slyd requirements
twisted==15.2.1
service_identity==14.0.0
requests==2.7.0
autobahn==0.10.4
six==1.5.2
chardet==2.3.0
monotonic==0.3
# Dulwich's MySQL backend
--allow-external=mysql-connector-python
https://github.com/scrapinghub/dulwich/tarball/420a056#egg=dulwich
# Splash dev
--allow-external=splash
https://github.com/scrapinghub/splash/archive/3acde4ef0d389e5067d51dcc9b4ac6d260658757.tar.gz | cd6aa1cebe319d1405a86b14be65404a2cf3af11 | slyd/requirements.txt | slyd/requirements.txt | Text |
<|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The user is idle and waiting for something to happen (or watching the match but not participating).
/// </summary>
<|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The user has marked themselves as ready to participate and should be considered for the next game start.
/// </summary>
<|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The server is waiting for this user to finish loading. This is a reserved state, and is set by the server.
/// </summary>
/// <remarks>
    /// All users in <see cref="Ready"/> state when the game starts will be transitioned to this state.
/// All users in this state need to transition to <see cref="Loaded"/> before the game can start.
/// </remarks>
<|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The user's client has marked itself as loaded and ready to begin gameplay.
/// </summary>
<|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The user is currently playing in a game. This is a reserved state, and is set by the server.
/// </summary>
/// <remarks>
/// Once there are no remaining <see cref="WaitingForLoad"/> users, all users in <see cref="Loaded"/> state will be transitioned to this state.
/// At this point the game will start for all users.
/// </remarks>
<|file_sep|>original/osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs
// Copyright (c) ppy Pty Ltd <contact@ppy.sh>. Licensed under the MIT Licence.
// See the LICENCE file in the repository root for full licence text.
namespace osu.Game.Online.RealtimeMultiplayer
{
public enum MultiplayerUserState
{
Idle,
Ready,
WaitingForLoad,
Loaded,
Playing,
Results,
}
}
<|file_sep|>current/osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs
    /// All users in <see cref="Ready"/> state when the game starts will be transitioned to this state.
/// All users in this state need to transition to <see cref="Loaded"/> before the game can start.
/// </remarks>
WaitingForLoad,
/// <summary>
/// The user's client has marked itself as loaded and ready to begin gameplay.
/// </summary>
Loaded,
/// <summary>
/// The user is currently playing in a game. This is a reserved state, and is set by the server.
/// </summary>
/// <remarks>
/// Once there are no remaining <see cref="WaitingForLoad"/> users, all users in <see cref="Loaded"/> state will be transitioned to this state.
/// At this point the game will start for all users.
/// </remarks>
Playing,
Results,
}
}
<|file_sep|>updated/osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs | Loaded,
/// <summary>
/// The user is currently playing in a game. This is a reserved state, and is set by the server.
/// </summary>
/// <remarks>
/// Once there are no remaining <see cref="WaitingForLoad"/> users, all users in <see cref="Loaded"/> state will be transitioned to this state.
/// At this point the game will start for all users.
/// </remarks>
Playing,
/// <summary>
/// The user has finished playing and is ready to view results.
/// </summary>
/// <remarks>
/// Once all users transition from <see cref="Playing"/> to this state, the game will end and results will be distributed.
/// All users will be transitioned to the <see cref="Results"/> state.
/// </remarks>
FinishedPlay,
/// <summary> | <|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The user is idle and waiting for something to happen (or watching the match but not participating).
/// </summary>
<|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The user has marked themselves as ready to participate and should be considered for the next game start.
/// </summary>
<|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The server is waiting for this user to finish loading. This is a reserved state, and is set by the server.
/// </summary>
/// <remarks>
    /// All users in <see cref="Ready"/> state when the game starts will be transitioned to this state.
/// All users in this state need to transition to <see cref="Loaded"/> before the game can start.
/// </remarks>
<|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The user's client has marked itself as loaded and ready to begin gameplay.
/// </summary>
<|file_sep|>osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs.diff
original:
updated:
/// <summary>
/// The user is currently playing in a game. This is a reserved state, and is set by the server.
/// </summary>
/// <remarks>
/// Once there are no remaining <see cref="WaitingForLoad"/> users, all users in <see cref="Loaded"/> state will be transitioned to this state.
/// At this point the game will start for all users.
/// </remarks>
<|file_sep|>original/osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs
// Copyright (c) ppy Pty Ltd <contact@ppy.sh>. Licensed under the MIT Licence.
// See the LICENCE file in the repository root for full licence text.
namespace osu.Game.Online.RealtimeMultiplayer
{
public enum MultiplayerUserState
{
Idle,
Ready,
WaitingForLoad,
Loaded,
Playing,
Results,
}
}
<|file_sep|>current/osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs
    /// All users in <see cref="Ready"/> state when the game starts will be transitioned to this state.
/// All users in this state need to transition to <see cref="Loaded"/> before the game can start.
/// </remarks>
WaitingForLoad,
/// <summary>
/// The user's client has marked itself as loaded and ready to begin gameplay.
/// </summary>
Loaded,
/// <summary>
/// The user is currently playing in a game. This is a reserved state, and is set by the server.
/// </summary>
/// <remarks>
/// Once there are no remaining <see cref="WaitingForLoad"/> users, all users in <see cref="Loaded"/> state will be transitioned to this state.
/// At this point the game will start for all users.
/// </remarks>
Playing,
Results,
}
}
<|file_sep|>updated/osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs
Loaded,
/// <summary>
/// The user is currently playing in a game. This is a reserved state, and is set by the server.
/// </summary>
/// <remarks>
/// Once there are no remaining <see cref="WaitingForLoad"/> users, all users in <see cref="Loaded"/> state will be transitioned to this state.
/// At this point the game will start for all users.
/// </remarks>
Playing,
/// <summary>
/// The user has finished playing and is ready to view results.
/// </summary>
/// <remarks>
/// Once all users transition from <see cref="Playing"/> to this state, the game will end and results will be distributed.
/// All users will be transitioned to the <see cref="Results"/> state.
/// </remarks>
FinishedPlay,
/// <summary> | 60550b73f77d0ec625c070fcd6eee395e8397149 | osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs | osu.Game/Online/RealtimeMultiplayer/MultiplayerUserState.cs | C# |
<|file_sep|>original/core/src/core/composer.json
}
},
"require": {
"zendframework/zend-diactoros": "~1.0",
"guzzlehttp/guzzle": "~5",
"guzzlehttp/command": "~0.7",
"guzzlehttp/guzzle-services": "~0.5",
"gimler/guzzle-description-loader" : "*",
"commerceguys/guzzle-oauth2-plugin": "~2.0",
"nikic/fast-route":"~1.0",
"symfony/console":"3.3.9",
"davegardnerisme/nsqphp": "dev-master",
"sabre/dav":"1.8.10",
"aws/aws-sdk-php": "^3.19.4",
"meenie/javascript-packer":"1.1",
"dapphp/securimage":"3.6.4",
"phpseclib/phpseclib":"2.0.3",
"pear/mail_mime-decode":"1.5.5.2"
}
}
<|file_sep|>current/core/src/core/composer.json
}
},
"require": {
"zendframework/zend-diactoros": "~1.0",
"guzzlehttp/guzzle": "~5",
"guzzlehttp/command": "~0.7",
"guzzlehttp/guzzle-services": "~0.5",
"gimler/guzzle-description-loader" : "*",
"commerceguys/guzzle-oauth2-plugin": "~2.0",
"nikic/fast-route":"~1.0",
"symfony/console":"3.3.9",
"davegardnerisme/nsqphp": "dev-master",
"sabre/dav":"1.8.10",
"aws/aws-sdk-php": "^3.19.4",
"meenie/javascript-packer":"1.1",
"dapphp/securimage":"3.6.4",
"phpseclib/phpseclib":"2.0.3",
"pear/mail_mime-decode":"1.5.5.2"
}
}
<|file_sep|>updated/core/src/core/composer.json | }
},
"require": {
"zendframework/zend-diactoros": "~1.0",
"guzzlehttp/guzzle": "~5",
"guzzlehttp/command": "~0.7",
"guzzlehttp/guzzle-services": "~0.5",
"gimler/guzzle-description-loader" : "*",
"commerceguys/guzzle-oauth2-plugin": "~2.0",
"nikic/fast-route":"~1.0",
"symfony/console":"3.3.9",
"davegardnerisme/nsqphp": "dev-master",
"sabre/dav":"1.8.10",
"aws/aws-sdk-php": "^3.19.4",
"meenie/javascript-packer":"1.1",
"dapphp/securimage":"3.6.4",
"phpseclib/phpseclib":"2.0.11",
"pear/mail_mime-decode":"1.5.5.2"
}
} | <|file_sep|>original/core/src/core/composer.json
}
},
"require": {
"zendframework/zend-diactoros": "~1.0",
"guzzlehttp/guzzle": "~5",
"guzzlehttp/command": "~0.7",
"guzzlehttp/guzzle-services": "~0.5",
"gimler/guzzle-description-loader" : "*",
"commerceguys/guzzle-oauth2-plugin": "~2.0",
"nikic/fast-route":"~1.0",
"symfony/console":"3.3.9",
"davegardnerisme/nsqphp": "dev-master",
"sabre/dav":"1.8.10",
"aws/aws-sdk-php": "^3.19.4",
"meenie/javascript-packer":"1.1",
"dapphp/securimage":"3.6.4",
"phpseclib/phpseclib":"2.0.3",
"pear/mail_mime-decode":"1.5.5.2"
}
}
<|file_sep|>current/core/src/core/composer.json
}
},
"require": {
"zendframework/zend-diactoros": "~1.0",
"guzzlehttp/guzzle": "~5",
"guzzlehttp/command": "~0.7",
"guzzlehttp/guzzle-services": "~0.5",
"gimler/guzzle-description-loader" : "*",
"commerceguys/guzzle-oauth2-plugin": "~2.0",
"nikic/fast-route":"~1.0",
"symfony/console":"3.3.9",
"davegardnerisme/nsqphp": "dev-master",
"sabre/dav":"1.8.10",
"aws/aws-sdk-php": "^3.19.4",
"meenie/javascript-packer":"1.1",
"dapphp/securimage":"3.6.4",
"phpseclib/phpseclib":"2.0.3",
"pear/mail_mime-decode":"1.5.5.2"
}
}
<|file_sep|>updated/core/src/core/composer.json
}
},
"require": {
"zendframework/zend-diactoros": "~1.0",
"guzzlehttp/guzzle": "~5",
"guzzlehttp/command": "~0.7",
"guzzlehttp/guzzle-services": "~0.5",
"gimler/guzzle-description-loader" : "*",
"commerceguys/guzzle-oauth2-plugin": "~2.0",
"nikic/fast-route":"~1.0",
"symfony/console":"3.3.9",
"davegardnerisme/nsqphp": "dev-master",
"sabre/dav":"1.8.10",
"aws/aws-sdk-php": "^3.19.4",
"meenie/javascript-packer":"1.1",
"dapphp/securimage":"3.6.4",
"phpseclib/phpseclib":"2.0.11",
"pear/mail_mime-decode":"1.5.5.2"
}
} | 5131d0f62a55409faf9ad06b321c165a90f7ddb7 | core/src/core/composer.json | core/src/core/composer.json | JSON |
<|file_sep|>step-01/js/main.js.diff
original:
const localMediaStreamError = function(error) {
updated:
const handleLocalMediaStreamError = function(error) {
<|file_sep|>original/step-01/js/main.js
// On this codelab, you will be streaming only video (video: true).
const mediaStreamConstraints = {
video: true,
};
// Video element where stream will be placed.
const localVideo = document.querySelector('video');
// Handles success by adding the MediaStream to the video element.
const gotLocalMediaStream = function(mediaStream) {
localVideo.srcObject = mediaStream;
};
// Handles error by logging a message to the console with the error message.
const localMediaStreamError = function(error) {
console.log('navigator.getUserMedia error: ', error);
};
// Initializes media stream.
navigator.mediaDevices.getUserMedia(mediaStreamConstraints)
.then(gotLocalMediaStream).catch(localMediaStreamError);
<|file_sep|>current/step-01/js/main.js
// On this codelab, you will be streaming only video (video: true).
const mediaStreamConstraints = {
video: true,
};
// Video element where stream will be placed.
const localVideo = document.querySelector('video');
// Handles success by adding the MediaStream to the video element.
const gotLocalMediaStream = function(mediaStream) {
localVideo.srcObject = mediaStream;
};
// Handles error by logging a message to the console with the error message.
const handleLocalMediaStreamError = function(error) {
console.log('navigator.getUserMedia error: ', error);
};
// Initializes media stream.
navigator.mediaDevices.getUserMedia(mediaStreamConstraints)
.then(gotLocalMediaStream).catch(localMediaStreamError);
<|file_sep|>updated/step-01/js/main.js | // On this codelab, you will be streaming only video (video: true).
const mediaStreamConstraints = {
video: true,
};
// Video element where stream will be placed.
const localVideo = document.querySelector('video');
// Handles success by adding the MediaStream to the video element.
const gotLocalMediaStream = function(mediaStream) {
localVideo.srcObject = mediaStream;
};
// Handles error by logging a message to the console with the error message.
const handleLocalMediaStreamError = function(error) {
console.log('navigator.getUserMedia error: ', error);
};
// Initializes media stream.
navigator.mediaDevices.getUserMedia(mediaStreamConstraints)
.then(gotLocalMediaStream).catch(handleLocalMediaStreamError); | <|file_sep|>step-01/js/main.js.diff
original:
const localMediaStreamError = function(error) {
updated:
const handleLocalMediaStreamError = function(error) {
<|file_sep|>original/step-01/js/main.js
// On this codelab, you will be streaming only video (video: true).
const mediaStreamConstraints = {
video: true,
};
// Video element where stream will be placed.
const localVideo = document.querySelector('video');
// Handles success by adding the MediaStream to the video element.
const gotLocalMediaStream = function(mediaStream) {
localVideo.srcObject = mediaStream;
};
// Handles error by logging a message to the console with the error message.
const localMediaStreamError = function(error) {
console.log('navigator.getUserMedia error: ', error);
};
// Initializes media stream.
navigator.mediaDevices.getUserMedia(mediaStreamConstraints)
.then(gotLocalMediaStream).catch(localMediaStreamError);
<|file_sep|>current/step-01/js/main.js
// On this codelab, you will be streaming only video (video: true).
const mediaStreamConstraints = {
video: true,
};
// Video element where stream will be placed.
const localVideo = document.querySelector('video');
// Handles success by adding the MediaStream to the video element.
const gotLocalMediaStream = function(mediaStream) {
localVideo.srcObject = mediaStream;
};
// Handles error by logging a message to the console with the error message.
const handleLocalMediaStreamError = function(error) {
console.log('navigator.getUserMedia error: ', error);
};
// Initializes media stream.
navigator.mediaDevices.getUserMedia(mediaStreamConstraints)
.then(gotLocalMediaStream).catch(localMediaStreamError);
<|file_sep|>updated/step-01/js/main.js
// On this codelab, you will be streaming only video (video: true).
const mediaStreamConstraints = {
video: true,
};
// Video element where stream will be placed.
const localVideo = document.querySelector('video');
// Handles success by adding the MediaStream to the video element.
const gotLocalMediaStream = function(mediaStream) {
localVideo.srcObject = mediaStream;
};
// Handles error by logging a message to the console with the error message.
const handleLocalMediaStreamError = function(error) {
console.log('navigator.getUserMedia error: ', error);
};
// Initializes media stream.
navigator.mediaDevices.getUserMedia(mediaStreamConstraints)
.then(gotLocalMediaStream).catch(handleLocalMediaStreamError); | 3b2ca730cf63799cc014e0713f1c89e552e9d774 | step-01/js/main.js | step-01/js/main.js | JavaScript |
<|file_sep|>original/CONTRIBUTING.md
<|file_sep|>current/CONTRIBUTING.md
<|file_sep|>updated/CONTRIBUTING.md | # How to Contribute
We'd love to accept your patches and contributions to this project. There are
just a few small guidelines you need to follow.
## Contributor License Agreement
Contributions to this project must be accompanied by a Contributor License
Agreement. You (or your employer) retain the copyright to your contribution;
this simply gives us permission to use and redistribute your contributions as
part of the project. Head over to <https://cla.developers.google.com/> to see
your current agreements on file or to sign a new one.
You generally only need to submit a CLA once, so if you've already submitted one
(even if it was for a different project), you probably don't need to do it
again.
## Code reviews
All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose. Consult | <|file_sep|>original/CONTRIBUTING.md
<|file_sep|>current/CONTRIBUTING.md
<|file_sep|>updated/CONTRIBUTING.md
# How to Contribute
We'd love to accept your patches and contributions to this project. There are
just a few small guidelines you need to follow.
## Contributor License Agreement
Contributions to this project must be accompanied by a Contributor License
Agreement. You (or your employer) retain the copyright to your contribution;
this simply gives us permission to use and redistribute your contributions as
part of the project. Head over to <https://cla.developers.google.com/> to see
your current agreements on file or to sign a new one.
You generally only need to submit a CLA once, so if you've already submitted one
(even if it was for a different project), you probably don't need to do it
again.
## Code reviews
All submissions, including submissions by project members, require review. We
use GitHub pull requests for this purpose. Consult | b6c715c53c7cf0ddc7f6b98214d6e1e8d9d95982 | CONTRIBUTING.md | CONTRIBUTING.md | Markdown |
<|file_sep|>original/.travis.yml
language: python
cache: pip
python:
- "3.4"
- "2.7"
sudo: false
env:
- REQ="Django>=1.8,<1.9 FeinCMS>=1.11,<1.12"
- REQ="Django>=1.9,<1.10 https://github.com/feincms/feincms/archive/next.zip"
- REQ="Django>=1.10,<1.11 https://github.com/feincms/feincms/archive/next.zip"
install:
- pip install $REQ django-mptt==0.8.6 factory_boy pytz flake8 Pillow
- python setup.py install
script: "cd tests && ./manage.py test testapp && cd .. && flake8 ."
<|file_sep|>current/.travis.yml
language: python
cache: pip
python:
- "3.4"
- "2.7"
sudo: false
env:
- REQ="Django>=1.8,<1.9 FeinCMS>=1.11,<1.12"
- REQ="Django>=1.9,<1.10 https://github.com/feincms/feincms/archive/next.zip"
- REQ="Django>=1.10,<1.11 https://github.com/feincms/feincms/archive/next.zip"
install:
- pip install $REQ django-mptt==0.8.6 factory_boy pytz flake8 Pillow
- python setup.py install
script: "cd tests && ./manage.py test testapp && cd .. && flake8 ."
<|file_sep|>updated/.travis.yml | language: python
cache: pip
python:
- "3.4"
- "2.7"
sudo: false
env:
- REQ="Django>=1.8,<1.9 FeinCMS>=1.11,<1.12"
- REQ="Django>=1.9,<1.10 FeinCMS>=1.13,<2"
- REQ="Django>=1.10,<1.11 FeinCMS>=1.13,<2"
install:
- pip install $REQ django-mptt==0.8.6 factory_boy pytz flake8 Pillow
- python setup.py install
script: "cd tests && ./manage.py test testapp && cd .. && flake8 ." | <|file_sep|>original/.travis.yml
language: python
cache: pip
python:
- "3.4"
- "2.7"
sudo: false
env:
- REQ="Django>=1.8,<1.9 FeinCMS>=1.11,<1.12"
- REQ="Django>=1.9,<1.10 https://github.com/feincms/feincms/archive/next.zip"
- REQ="Django>=1.10,<1.11 https://github.com/feincms/feincms/archive/next.zip"
install:
- pip install $REQ django-mptt==0.8.6 factory_boy pytz flake8 Pillow
- python setup.py install
script: "cd tests && ./manage.py test testapp && cd .. && flake8 ."
<|file_sep|>current/.travis.yml
language: python
cache: pip
python:
- "3.4"
- "2.7"
sudo: false
env:
- REQ="Django>=1.8,<1.9 FeinCMS>=1.11,<1.12"
- REQ="Django>=1.9,<1.10 https://github.com/feincms/feincms/archive/next.zip"
- REQ="Django>=1.10,<1.11 https://github.com/feincms/feincms/archive/next.zip"
install:
- pip install $REQ django-mptt==0.8.6 factory_boy pytz flake8 Pillow
- python setup.py install
script: "cd tests && ./manage.py test testapp && cd .. && flake8 ."
<|file_sep|>updated/.travis.yml
language: python
cache: pip
python:
- "3.4"
- "2.7"
sudo: false
env:
- REQ="Django>=1.8,<1.9 FeinCMS>=1.11,<1.12"
- REQ="Django>=1.9,<1.10 FeinCMS>=1.13,<2"
- REQ="Django>=1.10,<1.11 FeinCMS>=1.13,<2"
install:
- pip install $REQ django-mptt==0.8.6 factory_boy pytz flake8 Pillow
- python setup.py install
script: "cd tests && ./manage.py test testapp && cd .. && flake8 ." | 05c1ab7b7b2ba6aa980b43c1e08b41af1fe801ae | .travis.yml | .travis.yml | YAML |
<|file_sep|>blitzes/playbook-reboot.yml.diff
original:
- hosts: personal_workstations:personal_servers
updated:
- hosts: node
<|file_sep|>blitzes/playbook-reboot.yml.diff
original:
command: "kubectl drain {{ inventory_hostname.split('.') | first }}"
updated:
command: "kubectl drain {{ inventory_hostname.split('.') | first }} --ignore-daemonsets"
<|file_sep|>original/blitzes/playbook-reboot.yml
gather_facts: no
serial:
- 1
max_fail_percentage: '0%'
vars:
confirm: true
tasks:
- name: Pause for confirm
pause:
prompt: "Confirm reboot of {{ inventory_hostname }}"
when: confirm|bool
- name: Drain node
command: "kubectl drain {{ inventory_hostname.split('.') | first }}"
delegate_to: localhost
when: "inventory_hostname in groups['k3s_cluster']"
ignore_errors: true
- name: "Reboot {{ inventory_hostname }}"
reboot:
become: yes
<|file_sep|>current/blitzes/playbook-reboot.yml
gather_facts: no
serial:
- 1
max_fail_percentage: '0%'
vars:
confirm: true
tasks:
- name: Pause for confirm
pause:
prompt: "Confirm reboot of {{ inventory_hostname }}"
when: confirm|bool
- name: Drain node
command: "kubectl drain {{ inventory_hostname.split('.') | first }} --ignore-daemonsets"
delegate_to: localhost
when: "inventory_hostname in groups['k3s_cluster']"
ignore_errors: true
- name: "Reboot {{ inventory_hostname }}"
reboot:
become: yes
<|file_sep|>updated/blitzes/playbook-reboot.yml |
- name: Drain node
command: "kubectl drain {{ inventory_hostname.split('.') | first }} --ignore-daemonsets"
delegate_to: localhost
when: "inventory_hostname in groups['k3s_cluster']"
ignore_errors: true
- name: "Reboot {{ inventory_hostname }}"
reboot:
become: yes
- name: Mark node as ready
command: "kubectl uncordon {{ inventory_hostname.split('.') | first }}"
delegate_to: localhost
when: "inventory_hostname in groups['k3s_cluster']"
- hosts: master
gather_facts: no
serial:
- 1
max_fail_percentage: '0%' | <|file_sep|>blitzes/playbook-reboot.yml.diff
original:
- hosts: personal_workstations:personal_servers
updated:
- hosts: node
<|file_sep|>blitzes/playbook-reboot.yml.diff
original:
command: "kubectl drain {{ inventory_hostname.split('.') | first }}"
updated:
command: "kubectl drain {{ inventory_hostname.split('.') | first }} --ignore-daemonsets"
<|file_sep|>original/blitzes/playbook-reboot.yml
gather_facts: no
serial:
- 1
max_fail_percentage: '0%'
vars:
confirm: true
tasks:
- name: Pause for confirm
pause:
prompt: "Confirm reboot of {{ inventory_hostname }}"
when: confirm|bool
- name: Drain node
command: "kubectl drain {{ inventory_hostname.split('.') | first }}"
delegate_to: localhost
when: "inventory_hostname in groups['k3s_cluster']"
ignore_errors: true
- name: "Reboot {{ inventory_hostname }}"
reboot:
become: yes
<|file_sep|>current/blitzes/playbook-reboot.yml
gather_facts: no
serial:
- 1
max_fail_percentage: '0%'
vars:
confirm: true
tasks:
- name: Pause for confirm
pause:
prompt: "Confirm reboot of {{ inventory_hostname }}"
when: confirm|bool
- name: Drain node
command: "kubectl drain {{ inventory_hostname.split('.') | first }} --ignore-daemonsets"
delegate_to: localhost
when: "inventory_hostname in groups['k3s_cluster']"
ignore_errors: true
- name: "Reboot {{ inventory_hostname }}"
reboot:
become: yes
<|file_sep|>updated/blitzes/playbook-reboot.yml
- name: Drain node
command: "kubectl drain {{ inventory_hostname.split('.') | first }} --ignore-daemonsets"
delegate_to: localhost
when: "inventory_hostname in groups['k3s_cluster']"
ignore_errors: true
- name: "Reboot {{ inventory_hostname }}"
reboot:
become: yes
- name: Mark node as ready
command: "kubectl uncordon {{ inventory_hostname.split('.') | first }}"
delegate_to: localhost
when: "inventory_hostname in groups['k3s_cluster']"
- hosts: master
gather_facts: no
serial:
- 1
max_fail_percentage: '0%' | 6b51a7b1f3fe2db92f729ba9e25cf3be9e9fd4bf | blitzes/playbook-reboot.yml | blitzes/playbook-reboot.yml | YAML |
<|file_sep|>package.json.diff
original:
"version": "0.0.1",
updated:
"version": "0.0.2",
<|file_sep|>original/package.json
"description": "A react component using pdf.js to display pdf documents inline",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"example": "browserify -t reactify example/index.js > example/bundle.js"
},
"keywords": [
"pdf",
"react"
],
"author": "Niklas Närhinen <niklas@narhinen.net>",
"license": "MIT",
"peerDependencies": {
"react": "^0.11.2"
},
"devDependencies": {
"browserify": "^5.12.1",
"react": "^0.11.2",
"reactify": "^0.14.0"
}
}
<|file_sep|>current/package.json
"description": "A react component using pdf.js to display pdf documents inline",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"example": "browserify -t reactify example/index.js > example/bundle.js"
},
"keywords": [
"pdf",
"react"
],
"author": "Niklas Närhinen <niklas@narhinen.net>",
"license": "MIT",
"peerDependencies": {
"react": "^0.11.2"
},
"devDependencies": {
"browserify": "^5.12.1",
"react": "^0.11.2",
"reactify": "^0.14.0"
}
}
<|file_sep|>updated/package.json | "test": "echo \"Error: no test specified\" && exit 1",
"example": "browserify -t reactify example/index.js > example/bundle.js"
},
"keywords": [
"pdf",
"react"
],
"author": "Niklas Närhinen <niklas@narhinen.net>",
"license": "MIT",
"peerDependencies": {
"react": "^0.11.2"
},
"devDependencies": {
"browserify": "^5.12.1",
"react": "^0.11.2",
"reactify": "^0.14.0"
},
"browserify": {
"transform": ["reactify"]
}
} | <|file_sep|>package.json.diff
original:
"version": "0.0.1",
updated:
"version": "0.0.2",
<|file_sep|>original/package.json
"description": "A react component using pdf.js to display pdf documents inline",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"example": "browserify -t reactify example/index.js > example/bundle.js"
},
"keywords": [
"pdf",
"react"
],
"author": "Niklas Närhinen <niklas@narhinen.net>",
"license": "MIT",
"peerDependencies": {
"react": "^0.11.2"
},
"devDependencies": {
"browserify": "^5.12.1",
"react": "^0.11.2",
"reactify": "^0.14.0"
}
}
<|file_sep|>current/package.json
"description": "A react component using pdf.js to display pdf documents inline",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"example": "browserify -t reactify example/index.js > example/bundle.js"
},
"keywords": [
"pdf",
"react"
],
"author": "Niklas Närhinen <niklas@narhinen.net>",
"license": "MIT",
"peerDependencies": {
"react": "^0.11.2"
},
"devDependencies": {
"browserify": "^5.12.1",
"react": "^0.11.2",
"reactify": "^0.14.0"
}
}
<|file_sep|>updated/package.json
"test": "echo \"Error: no test specified\" && exit 1",
"example": "browserify -t reactify example/index.js > example/bundle.js"
},
"keywords": [
"pdf",
"react"
],
"author": "Niklas Närhinen <niklas@narhinen.net>",
"license": "MIT",
"peerDependencies": {
"react": "^0.11.2"
},
"devDependencies": {
"browserify": "^5.12.1",
"react": "^0.11.2",
"reactify": "^0.14.0"
},
"browserify": {
"transform": ["reactify"]
}
} | a076f1c8c2e4016a8ce73a377d52773d06f4e9ea | package.json | package.json | JSON |
<|file_sep|>original/man/git-delete-branch.md
git-delete-branch(1) -- Delete branches
=======================================
## SYNOPSIS
`git-delete-branch` <branchname>
## DESCRIPTION
Deletes local and remote branch named <branchname>.
## OPTIONS
<branchname>
The name of the branch to delete.
## EXAMPLES
$ git delete-branch integration
<|file_sep|>current/man/git-delete-branch.md
git-delete-branch(1) -- Delete branches
=======================================
## SYNOPSIS
`git-delete-branch` <branchname>
## DESCRIPTION
Deletes local and remote branch named <branchname>.
## OPTIONS
<branchname>
The name of the branch to delete.
## EXAMPLES
$ git delete-branch integration
<|file_sep|>updated/man/git-delete-branch.md | git-delete-branch(1) -- Delete branches
=======================================
## SYNOPSIS
`git-delete-branch` <branchname>
## DESCRIPTION
Deletes local and remote branch named <branchname>.
Note that local deletion fails if the branch is checked out.
## OPTIONS
<branchname>
The name of the branch to delete.
## EXAMPLES
    $ git delete-branch integration | 082f556969e774bb724382374802c50bbb0ccbc8 | man/git-delete-branch.md | man/git-delete-branch.md | Markdown |
<|file_sep|>original/eigenclass.gemspec
require File.expand_path('../lib/eigenclass/version', __FILE__)
Gem::Specification.new do |s|
s.author = 'Sean Huber'
s.description = 'Eigenclasses (aka metaclasses or singleton classes) in ruby'
s.email = 'github@shuber.io'
s.extra_rdoc_files = %w(LICENSE)
s.files = `git ls-files`.split("\n")
s.homepage = 'https://github.com/shuber/eigenclass'
s.license = 'MIT'
s.name = 'eigenclass'
s.rdoc_options = %w(--charset=UTF-8 --inline-source --line-numbers --main README.md)
s.require_paths = %w(lib)
s.summary = 'Eigenclasses in ruby'
s.test_files = `git ls-files -- test/*`.split("\n")
s.version = Eigenclass::VERSION
s.add_development_dependency 'codeclimate-test-reporter'
s.add_development_dependency 'rspec'
s.add_development_dependency 'shoulda'
end
<|file_sep|>current/eigenclass.gemspec
require File.expand_path('../lib/eigenclass/version', __FILE__)
Gem::Specification.new do |s|
s.author = 'Sean Huber'
s.description = 'Eigenclasses (aka metaclasses or singleton classes) in ruby'
s.email = 'github@shuber.io'
s.extra_rdoc_files = %w(LICENSE)
s.files = `git ls-files`.split("\n")
s.homepage = 'https://github.com/shuber/eigenclass'
s.license = 'MIT'
s.name = 'eigenclass'
s.rdoc_options = %w(--charset=UTF-8 --inline-source --line-numbers --main README.md)
s.require_paths = %w(lib)
s.summary = 'Eigenclasses in ruby'
s.test_files = `git ls-files -- test/*`.split("\n")
s.version = Eigenclass::VERSION
s.add_development_dependency 'codeclimate-test-reporter'
s.add_development_dependency 'rspec'
s.add_development_dependency 'shoulda'
end
<|file_sep|>updated/eigenclass.gemspec | require File.expand_path('../lib/eigenclass/version', __FILE__)
Gem::Specification.new do |s|
s.author = 'Sean Huber'
s.description = 'Eigenclasses (aka metaclasses or singleton classes) in ruby'
s.email = 'github@shuber.io'
s.extra_rdoc_files = %w(LICENSE)
s.files = `git ls-files`.split("\n")
s.homepage = 'https://github.com/shuber/eigenclass'
s.license = 'MIT'
s.name = 'eigenclass'
s.rdoc_options = %w(--charset=UTF-8 --inline-source --line-numbers --main README.md)
s.require_paths = %w(lib)
s.summary = 'Eigenclasses in ruby'
s.test_files = `git ls-files -- spec/*`.split("\n")
s.version = Eigenclass::VERSION
s.add_development_dependency 'codeclimate-test-reporter'
s.add_development_dependency 'rspec'
s.add_development_dependency 'shoulda'
end | 9a5afe74d6c1d19d3c894a7f07ec99a77cd45b5a | eigenclass.gemspec | eigenclass.gemspec | Ruby |
<|file_sep|>original/README.md
# clang-format
node.js module which wraps the native clang-format executable.
## From the command-line:
$ npm install -g clang-format
$ clang-format -help
If your platform isn't yet supported, you can create the native binary from
the latest upstream clang sources, make sure it is stripped and optimized
(should be about 1.4MB as of mid-2015) and send a pull request to add it.
## Compiling clang-format
For Linux, compile a statically linked MinSizeRel build:
cmake -G Ninja -DCMAKE_BUILD_TYPE=MinSizeRel -DLLVM_BUILD_STATIC=true ..
ninja clang-format
For Mac OS X, static linking is not required.
<|file_sep|>current/README.md
# clang-format
node.js module which wraps the native clang-format executable.
## From the command-line:
$ npm install -g clang-format
$ clang-format -help
If your platform isn't yet supported, you can create the native binary from
the latest upstream clang sources, make sure it is stripped and optimized
(should be about 1.4MB as of mid-2015) and send a pull request to add it.
## Compiling clang-format
For Linux, compile a statically linked MinSizeRel build:
cmake -G Ninja -DCMAKE_BUILD_TYPE=MinSizeRel -DLLVM_BUILD_STATIC=true ..
ninja clang-format
For Mac OS X, static linking is not required.
<|file_sep|>updated/README.md |
$ npm install -g clang-format
$ clang-format -help
If your platform isn't yet supported, you can create the native binary from
the latest upstream clang sources, make sure it is stripped and optimized
(should be about 1.4MB as of mid-2015) and send a pull request to add it.
## Compiling clang-format
For Linux, compile a statically linked MinSizeRel build:
cmake -G Ninja -DCMAKE_BUILD_TYPE=MinSizeRel -DLLVM_BUILD_STATIC=true ..
ninja clang-format
For Mac OS X, static linking is not required.
## Windows
Windows snapshot builds to include in the release can be found at the
[LLVM website](http://llvm.org/builds/). | 6b77f41bbb7de95fbb6be6cf1fda66965d2b52f4 | README.md | README.md | Markdown |
<|file_sep|>lib/create.js.diff
original:
updated:
CloseServer(this);
<|file_sep|>lib/create.js.diff
original:
/* Older versions of Node.js throw hard
// errors when .closing an error'd server.
*/
try{
this.close();
}
catch(error){
/* Ignore the error, simply try and close
// the server if for whatever reason it
// is somehow listening.
*/
}
updated:
<|file_sep|>lib/create.js.diff
original:
this.close(function CreateServerClose(){
delete PORTS[port.toString()];
callback(null,true,port);
});
updated:
CloseServer(this);
delete PORTS[port.toString()];
callback(null,true,port);
<|file_sep|>original/lib/create.js
/* Older versions of Node.js throw hard
// errors when .closing an error'd server.
*/
try{
this.close();
}
catch(error){
/* Ignore the error, simply try and close
// the server if for whatever reason it
// is somehow listening.
*/
}
if(error.code === 'EADDRINUSE') callback(null,false,port);
else callback(true,error,port);
})
.on('listening',function CreateServerListening(){
this.close(function CreateServerClose(){
<|file_sep|>current/lib/create.js
delete PORTS[port.toString()];
if(error.code === 'EADDRINUSE') callback(null,false,port);
else callback(true,error,port);
})
.on('listening',function CreateServerListening(){
CloseServer(this);
delete PORTS[port.toString()];
callback(null,true,port);
})
.listen(port);
}
else if(typeof callback === 'function') callback(null,false);
}
module.exports = CreateServer;
<|file_sep|>updated/lib/create.js |
delete PORTS[port.toString()];
callback(null,true,port);
})
.listen(port);
}
else if(typeof callback === 'function') callback(null,false);
}
/* Older versions of Node.js throw hard
// errors when closing an error'd server.
// This function is to allow 'CreateServer'
// to be optimized by moving out the 'try'.
// Because this is just to try and close the
// server, if it's somehow listening after
// erroring, we won't handle any errors that
// come out of this `CloseServer`.
*/
function CloseServer(
  server | a3b61232879b9722511f1bcc6a4b02d06478cefe | lib/create.js | lib/create.js | JavaScript |
<|file_sep|>original/Resources/ui/Alerts.js
<|file_sep|>current/Resources/ui/Alerts.js
<|file_sep|>updated/Resources/ui/Alerts.js | function Alerts() {
var self = Ti.UI.createWindow({
title : 'Alerts',
});
alert('say hello');
var alertButton = Ti.UI.createButton({
title : 'Click Me for Alert',
borderColor : 'Red',
top : '35%'
});
alertButton.addEventListener('click', function(e) {
// if you don't do this, you will set off multiple alerts
e.cancelBubble = true;
alert(this.getTitle());
}); self.add(alertButton);
/**
*
 * @param {Integer} min | f882c083bfbcd1d7e1d5033cc62588f292fb5571 | Resources/ui/Alerts.js | Resources/ui/Alerts.js | JavaScript |
<|file_sep|>original/bower.json
"name": "mw-uikit",
"description": "A toolbox to build portals with AngularJS 1.5.x that have a lot of forms and list views for example admin portals to manage data.",
"author": "Alexander Zarges <a.zarges@mwaysolutions.com> (http://relution.io)",
"devDependencies": {
"angular": "1.5.7",
"angular-sanitize": "1.5.7",
"angular-markdown-directive": "0.3.1",
"showdown": "1.2.3",
"jquery": "3.1.0",
"jquery-ui": "1.12.1",
"blueimp-file-upload": "9.17.0",
"bootstrap-sass-datepicker": "1.3.7"
},
"repository": {
"type": "git",
"url": "https://github.com/mwaylabs/uikit.git"
},
"bugs": "https://github.com/mwaylabs/uikit/issues",
"resolutions": {
"showdown": "1.2.3"
}
<|file_sep|>current/bower.json
"name": "mw-uikit",
"description": "A toolbox to build portals with AngularJS 1.5.x that have a lot of forms and list views for example admin portals to manage data.",
"author": "Alexander Zarges <a.zarges@mwaysolutions.com> (http://relution.io)",
"devDependencies": {
"angular": "1.5.7",
"angular-sanitize": "1.5.7",
"angular-markdown-directive": "0.3.1",
"showdown": "1.2.3",
"jquery": "3.1.0",
"jquery-ui": "1.12.1",
"blueimp-file-upload": "9.17.0",
"bootstrap-sass-datepicker": "1.3.7"
},
"repository": {
"type": "git",
"url": "https://github.com/mwaylabs/uikit.git"
},
"bugs": "https://github.com/mwaylabs/uikit/issues",
"resolutions": {
"showdown": "1.2.3"
}
<|file_sep|>updated/bower.json | "name": "mw-uikit",
"description": "A toolbox to build portals with AngularJS 1.5.x that have a lot of forms and list views for example admin portals to manage data.",
"author": "Alexander Zarges <a.zarges@mwaysolutions.com> (http://relution.io)",
"devDependencies": {
"angular": "1.5.7",
"angular-sanitize": "1.5.7",
"angular-markdown-directive": "0.3.1",
"showdown": "1.2.3",
"jquery": "3.1.0",
"jquery-ui": "1.12.1",
"blueimp-file-upload": "9.22.0",
"bootstrap-sass-datepicker": "1.3.7"
},
"repository": {
"type": "git",
"url": "https://github.com/mwaylabs/uikit.git"
},
"bugs": "https://github.com/mwaylabs/uikit/issues",
"resolutions": {
"showdown": "1.2.3"
} | 10d2b0101a2817ec77a638ccad7535f2a96ad52b | bower.json | bower.json | JSON |
<|file_sep|>original/config/initializers/session_store.rb
# Be sure to restart your server when you modify this file.
Rooms::Application.config.session_store :cookie_store, key: '_room_reservation_session', domain: :all
# Use the database for sessions instead of the cookie-based default,
# which shouldn't be used to store highly confidential information
# (create the session table with "rails generate session_migration")
# Rooms::Application.config.session_store :active_record_store
<|file_sep|>current/config/initializers/session_store.rb
# Be sure to restart your server when you modify this file.
Rooms::Application.config.session_store :cookie_store, key: '_room_reservation_session', domain: :all
# Use the database for sessions instead of the cookie-based default,
# which shouldn't be used to store highly confidential information
# (create the session table with "rails generate session_migration")
# Rooms::Application.config.session_store :active_record_store
<|file_sep|>updated/config/initializers/session_store.rb | # Be sure to restart your server when you modify this file.
Rooms::Application.config.session_store :cookie_store, key: '_room_reservation_session', domain: ENV['LOGIN_COOKIE_DOMAIN'] unless Rails.env.test?
# Use the database for sessions instead of the cookie-based default,
# which shouldn't be used to store highly confidential information
# (create the session table with "rails generate session_migration")
# Rooms::Application.config.session_store :active_record_store | b2cc173a31ca8584c593ddcc1cf67ac5cdb72bda | config/initializers/session_store.rb | config/initializers/session_store.rb | Ruby |
<|file_sep|>original/tools/valgrind/gtest_exclude/base_unittests.gtest_mac.txt
# Fails on Valgrind/Mac, see http://crbug.com/43972
ConditionVariableTest.LargeFastTaskTest
# Fails on Valgrind/Mac due to missing syscall wrapper
# for the symlink() syscall. See http://crbug.com/44001
FileUtilTest.NormalizeFilePathSymlinks
# Fails on Valgrind/Mac, see http://crbug.com/53196
CancellationFlagTest.SetOnDifferentThreadDeathTest
# Fails on Valgrind/Mac, see http://crbug.com/93722
ProcessMemoryTest.MacTerminateOnHeapCorruption
# Fails on Valgrind/Mac, see http://crbug.com/122080
ProcessMemoryTest.MacMallocFailureDoesNotTerminate
# Times out on Valgrind/Mac, see http://crbug.com/172044
MessageLoopTestTypeUI.RecursivePosts
MessageLoopTestTypeIO.RecursivePosts
MessageLoopTestTypeDefault.RecursivePosts
<|file_sep|>current/tools/valgrind/gtest_exclude/base_unittests.gtest_mac.txt
# Fails on Valgrind/Mac, see http://crbug.com/43972
ConditionVariableTest.LargeFastTaskTest
# Fails on Valgrind/Mac due to missing syscall wrapper
# for the symlink() syscall. See http://crbug.com/44001
FileUtilTest.NormalizeFilePathSymlinks
# Fails on Valgrind/Mac, see http://crbug.com/53196
CancellationFlagTest.SetOnDifferentThreadDeathTest
# Fails on Valgrind/Mac, see http://crbug.com/93722
ProcessMemoryTest.MacTerminateOnHeapCorruption
# Fails on Valgrind/Mac, see http://crbug.com/122080
ProcessMemoryTest.MacMallocFailureDoesNotTerminate
# Times out on Valgrind/Mac, see http://crbug.com/172044
MessageLoopTestTypeUI.RecursivePosts
MessageLoopTestTypeIO.RecursivePosts
MessageLoopTestTypeDefault.RecursivePosts
<|file_sep|>updated/tools/valgrind/gtest_exclude/base_unittests.gtest_mac.txt |
# Fails on Valgrind/Mac due to missing syscall wrapper
# for the symlink() syscall. See http://crbug.com/44001
FileUtilTest.NormalizeFilePathSymlinks
# Fails on Valgrind/Mac, see http://crbug.com/53196
CancellationFlagTest.SetOnDifferentThreadDeathTest
# Fails on Valgrind/Mac, see http://crbug.com/93722
ProcessMemoryTest.MacTerminateOnHeapCorruption
# Fails on Valgrind/Mac, see http://crbug.com/122080
ProcessMemoryTest.MacMallocFailureDoesNotTerminate
# Times out on Valgrind/Mac, see http://crbug.com/172044
MessageLoopTestTypeUI.RecursivePosts
MessageLoopTestTypeIO.RecursivePosts
MessageLoopTestTypeDefault.RecursivePosts
# Check-fails on Valgrind/Mac, see https://crbug.com/481290
MessagePumpLibeventTest.QuitWatcher | 159cce5dde14b5a04befb6f504a9e51b60f0d944 | tools/valgrind/gtest_exclude/base_unittests.gtest_mac.txt | tools/valgrind/gtest_exclude/base_unittests.gtest_mac.txt | Text |
<|file_sep|>original/requirements/base.txt
# Media files
#boto==2.47.0
#django_unique_upload==0.2.1
Pillow==4.2.1
# Image manipulation
#django-versatileimagefield==1.7.0
# For asynchronous queuing
#django-rq==0.9.5
# Time zones support
pytz==2017.2
# Parsing Secondlife api's
backoff==1.4.3
requests==2.18.2
beautifulsoup4==4.6.0
lxml==3.8.0
djproxy==2.3.4
<|file_sep|>current/requirements/base.txt
# Media files
#boto==2.47.0
#django_unique_upload==0.2.1
Pillow==4.2.1
# Image manipulation
#django-versatileimagefield==1.7.0
# For asynchronous queuing
#django-rq==0.9.5
# Time zones support
pytz==2017.2
# Parsing Secondlife api's
backoff==1.4.3
requests==2.18.2
beautifulsoup4==4.6.0
lxml==3.8.0
djproxy==2.3.4
<|file_sep|>updated/requirements/base.txt |
# Media files
#boto==2.47.0
#django_unique_upload==0.2.1
Pillow==4.2.1
# Image manipulation
#django-versatileimagefield==1.7.0
# For asynchronous queuing
#django-rq==0.9.5
# Time zones support
pytz==2017.2
# Parsing Secondlife api's
backoff==1.4.3
requests==2.18.4
beautifulsoup4==4.6.0
lxml==3.8.0
djproxy==2.3.4 | 9206ddee7a9bf68ab988853e4cbc463eedd16c05 | requirements/base.txt | requirements/base.txt | Text |
<|file_sep|>original/requirements.txt
Flask-SQLAlchemy==2.3.2
Flask==1.0.2
click-datetime==0.2
eventlet==0.23.0
gunicorn==19.7.1
iso8601==0.1.12
jsonschema==2.6.0
marshmallow-sqlalchemy==0.14.0
marshmallow==2.15.3
psycopg2-binary==2.7.4
PyJWT==1.6.4
SQLAlchemy==1.2.7
notifications-python-client==4.8.2
# PaaS
awscli-cwlogs>=1.4,<1.5
git+https://github.com/alphagov/notifications-utils.git@27.0.0#egg=notifications-utils==27.0.0
git+https://github.com/alphagov/boto.git@2.43.0-patch3#egg=boto==2.43.0-patch3
<|file_sep|>current/requirements.txt
Flask-SQLAlchemy==2.3.2
Flask==1.0.2
click-datetime==0.2
eventlet==0.23.0
gunicorn==19.7.1
iso8601==0.1.12
jsonschema==2.6.0
marshmallow-sqlalchemy==0.14.0
marshmallow==2.15.3
psycopg2-binary==2.7.4
PyJWT==1.6.4
SQLAlchemy==1.2.7
notifications-python-client==4.8.2
# PaaS
awscli-cwlogs>=1.4,<1.5
git+https://github.com/alphagov/notifications-utils.git@27.0.0#egg=notifications-utils==27.0.0
git+https://github.com/alphagov/boto.git@2.43.0-patch3#egg=boto==2.43.0-patch3
<|file_sep|>updated/requirements.txt | Flask-SQLAlchemy==2.3.2
Flask==1.0.2
click-datetime==0.2
eventlet==0.23.0
gunicorn==19.7.1
iso8601==0.1.12
jsonschema==2.6.0
marshmallow-sqlalchemy==0.14.0
marshmallow==2.15.3
psycopg2-binary==2.7.4
PyJWT==1.6.4
SQLAlchemy==1.2.8
notifications-python-client==4.8.2
# PaaS
awscli-cwlogs>=1.4,<1.5
git+https://github.com/alphagov/notifications-utils.git@27.0.0#egg=notifications-utils==27.0.0
git+https://github.com/alphagov/boto.git@2.43.0-patch3#egg=boto==2.43.0-patch3 | 3e3d71937a094a6db46d2a8c6bc93e1a209cce77 | requirements.txt | requirements.txt | Text |
<|file_sep|>original/Packages/viewerbase/lib/getImageId.js
* Obtain an imageId for Cornerstone from an image instance
*
* @param instance
* @returns {string} The imageId to be used by Cornerstone
*/
getImageId = function(instance, frame) {
if (!instance) {
return;
}
if (instance.wadouri) {
var imageId = 'dicomweb:' + instance.wadouri; // WADO-URI;
if (frame !== undefined) {
imageId += '&frame=' + frame;
}
return imageId;
} else {
        // TODO: Check multiframe image support with WADO-RS
return getWADORSImageId(instance); // WADO-RS Retrieve Frame
}
<|file_sep|>current/Packages/viewerbase/lib/getImageId.js
* Obtain an imageId for Cornerstone from an image instance
*
* @param instance
* @returns {string} The imageId to be used by Cornerstone
*/
getImageId = function(instance, frame) {
if (!instance) {
return;
}
if (instance.wadouri) {
var imageId = 'dicomweb:' + instance.wadouri; // WADO-URI;
if (frame !== undefined) {
imageId += '&frame=' + frame;
}
return imageId;
} else {
        // TODO: Check multiframe image support with WADO-RS
return getWADORSImageId(instance); // WADO-RS Retrieve Frame
}
<|file_sep|>updated/Packages/viewerbase/lib/getImageId.js | * Obtain an imageId for Cornerstone from an image instance
*
* @param instance
* @returns {string} The imageId to be used by Cornerstone
*/
getImageId = function(instance, frame) {
if (!instance) {
return;
}
if (instance.url) {
return instance.url;
}
if (instance.wadouri) {
var imageId = 'dicomweb:' + instance.wadouri; // WADO-URI;
if (frame !== undefined) {
imageId += '&frame=' + frame;
}
return imageId; | <|file_sep|>original/Packages/viewerbase/lib/getImageId.js
* Obtain an imageId for Cornerstone from an image instance
*
* @param instance
* @returns {string} The imageId to be used by Cornerstone
*/
getImageId = function(instance, frame) {
if (!instance) {
return;
}
if (instance.wadouri) {
var imageId = 'dicomweb:' + instance.wadouri; // WADO-URI;
if (frame !== undefined) {
imageId += '&frame=' + frame;
}
return imageId;
} else {
        // TODO: Check multiframe image support with WADO-RS
return getWADORSImageId(instance); // WADO-RS Retrieve Frame
}
<|file_sep|>current/Packages/viewerbase/lib/getImageId.js
* Obtain an imageId for Cornerstone from an image instance
*
* @param instance
* @returns {string} The imageId to be used by Cornerstone
*/
getImageId = function(instance, frame) {
if (!instance) {
return;
}
if (instance.wadouri) {
var imageId = 'dicomweb:' + instance.wadouri; // WADO-URI;
if (frame !== undefined) {
imageId += '&frame=' + frame;
}
return imageId;
} else {
        // TODO: Check multiframe image support with WADO-RS
return getWADORSImageId(instance); // WADO-RS Retrieve Frame
}
<|file_sep|>updated/Packages/viewerbase/lib/getImageId.js
* Obtain an imageId for Cornerstone from an image instance
*
* @param instance
* @returns {string} The imageId to be used by Cornerstone
*/
getImageId = function(instance, frame) {
if (!instance) {
return;
}
if (instance.url) {
return instance.url;
}
if (instance.wadouri) {
var imageId = 'dicomweb:' + instance.wadouri; // WADO-URI;
if (frame !== undefined) {
imageId += '&frame=' + frame;
}
return imageId; | cbaf770fba59f8543b72755634610c7a8f0e1ee0 | Packages/viewerbase/lib/getImageId.js | Packages/viewerbase/lib/getImageId.js | JavaScript |
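The imageId construction above (prefer a direct URL, else build a `dicomweb:` WADO-URI with an optional `&frame=N` suffix) can be sketched as a small Python port. This is purely illustrative — the function name and dict-shaped `instance` are assumptions, not part of the viewer codebase:

```python
def get_image_id(instance, frame=None):
    """Build a Cornerstone-style imageId from an image instance mapping.

    Mirrors the JavaScript logic above: a direct URL wins, then a
    WADO-URI ('dicomweb:' scheme with an optional '&frame=N' suffix).
    """
    if not instance:
        return None
    if instance.get("url"):           # direct URL takes precedence
        return instance["url"]
    if instance.get("wadouri"):       # WADO-URI retrieval
        image_id = "dicomweb:" + instance["wadouri"]
        if frame is not None:
            image_id += "&frame=" + str(frame)
        return image_id
    return None                       # WADO-RS branch not modeled here
```

The WADO-RS fallback (`getWADORSImageId`) is deliberately left out, since its implementation is not shown in the snippet.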
<|file_sep|>Main.hs.diff
original:
let word = "heart"
r <- rhymebrainResults word
updated:
let originalWord = "heart"
r <- rhymebrainResults originalWord
<|file_sep|>original/Main.hs
rhymebrainOptions word = defaults &
param "function" .~ ["getRhymes"] &
param "maxResults" .~ ["0"] &
param "lang" .~ ["en"] &
param "word" .~ [word]
rhymebrainHost :: String
rhymebrainHost = "http://rhymebrain.com/talk"
rhymebrainResults :: T.Text -> IO (Response [RhymebrainResult])
rhymebrainResults word = asJSON =<< getWith (rhymebrainOptions word) rhymebrainHost
resultsWithScore :: Int -> [RhymebrainResult] -> [RhymebrainResult]
resultsWithScore s = filter (\result -> score result == s)
main = do
let word = "heart"
r <- rhymebrainResults word
let results = r ^. responseBody
let highestScoring = resultsWithScore (score $ maximum results) results
print $ highestScoring
<|file_sep|>current/Main.hs
rhymebrainOptions word = defaults &
param "function" .~ ["getRhymes"] &
param "maxResults" .~ ["0"] &
param "lang" .~ ["en"] &
param "word" .~ [word]
rhymebrainHost :: String
rhymebrainHost = "http://rhymebrain.com/talk"
rhymebrainResults :: T.Text -> IO (Response [RhymebrainResult])
rhymebrainResults word = asJSON =<< getWith (rhymebrainOptions word) rhymebrainHost
resultsWithScore :: Int -> [RhymebrainResult] -> [RhymebrainResult]
resultsWithScore s = filter (\result -> score result == s)
main = do
let originalWord = "heart"
r <- rhymebrainResults originalWord
let results = r ^. responseBody
let highestScoring = resultsWithScore (score $ maximum results) results
print $ highestScoring
<|file_sep|>updated/Main.hs | rhymebrainOptions word = defaults &
param "function" .~ ["getRhymes"] &
param "maxResults" .~ ["0"] &
param "lang" .~ ["en"] &
param "word" .~ [word]
rhymebrainHost :: String
rhymebrainHost = "http://rhymebrain.com/talk"
rhymebrainResults :: T.Text -> IO (Response [RhymebrainResult])
rhymebrainResults word = asJSON =<< getWith (rhymebrainOptions word) rhymebrainHost
resultsWithScore :: Int -> [RhymebrainResult] -> [RhymebrainResult]
resultsWithScore s = filter (\result -> score result == s)
main = do
let originalWord = "heart"
r <- rhymebrainResults originalWord
let results = r ^. responseBody
let highestScoring = resultsWithScore (score $ maximum results) results
print $ map word highestScoring | <|file_sep|>Main.hs.diff
original:
let word = "heart"
r <- rhymebrainResults word
updated:
let originalWord = "heart"
r <- rhymebrainResults originalWord
<|file_sep|>original/Main.hs
rhymebrainOptions word = defaults &
param "function" .~ ["getRhymes"] &
param "maxResults" .~ ["0"] &
param "lang" .~ ["en"] &
param "word" .~ [word]
rhymebrainHost :: String
rhymebrainHost = "http://rhymebrain.com/talk"
rhymebrainResults :: T.Text -> IO (Response [RhymebrainResult])
rhymebrainResults word = asJSON =<< getWith (rhymebrainOptions word) rhymebrainHost
resultsWithScore :: Int -> [RhymebrainResult] -> [RhymebrainResult]
resultsWithScore s = filter (\result -> score result == s)
main = do
let word = "heart"
r <- rhymebrainResults word
let results = r ^. responseBody
let highestScoring = resultsWithScore (score $ maximum results) results
print $ highestScoring
<|file_sep|>current/Main.hs
rhymebrainOptions word = defaults &
param "function" .~ ["getRhymes"] &
param "maxResults" .~ ["0"] &
param "lang" .~ ["en"] &
param "word" .~ [word]
rhymebrainHost :: String
rhymebrainHost = "http://rhymebrain.com/talk"
rhymebrainResults :: T.Text -> IO (Response [RhymebrainResult])
rhymebrainResults word = asJSON =<< getWith (rhymebrainOptions word) rhymebrainHost
resultsWithScore :: Int -> [RhymebrainResult] -> [RhymebrainResult]
resultsWithScore s = filter (\result -> score result == s)
main = do
let originalWord = "heart"
r <- rhymebrainResults originalWord
let results = r ^. responseBody
let highestScoring = resultsWithScore (score $ maximum results) results
print $ highestScoring
<|file_sep|>updated/Main.hs
rhymebrainOptions word = defaults &
param "function" .~ ["getRhymes"] &
param "maxResults" .~ ["0"] &
param "lang" .~ ["en"] &
param "word" .~ [word]
rhymebrainHost :: String
rhymebrainHost = "http://rhymebrain.com/talk"
rhymebrainResults :: T.Text -> IO (Response [RhymebrainResult])
rhymebrainResults word = asJSON =<< getWith (rhymebrainOptions word) rhymebrainHost
resultsWithScore :: Int -> [RhymebrainResult] -> [RhymebrainResult]
resultsWithScore s = filter (\result -> score result == s)
main = do
let originalWord = "heart"
r <- rhymebrainResults originalWord
let results = r ^. responseBody
let highestScoring = resultsWithScore (score $ maximum results) results
print $ map word highestScoring | ca7a20861f491d639252454dbdbd06ca74097841 | Main.hs | Main.hs | Haskell |
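The wreq options in the Haskell code above amount to a plain query-parameter mapping plus a score filter. A minimal Python sketch of the same idea (helper names are hypothetical; no network request is made here):

```python
RHYMEBRAIN_HOST = "http://rhymebrain.com/talk"

def rhymebrain_params(word):
    """Query parameters matching the rhymebrainOptions above."""
    return {
        "function": "getRhymes",
        "maxResults": "0",
        "lang": "en",
        "word": word,
    }

def results_with_score(score, results):
    """Keep only results whose score equals the given value,
    mirroring resultsWithScore."""
    return [r for r in results if r["score"] == score]
```

Actually fetching would look like `requests.get(RHYMEBRAIN_HOST, params=rhymebrain_params("heart"))`, assuming the third-party `requests` library is available.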
<|file_sep|>original/tasks/development.js
},
livereload: true
}
}
},
watch: {
sources: {
options: {
livereload: true
},
files: ["*.css", "app.js", "lib/**/*.js", "*.html"],
tasks: ["dev"]
},
config: {
options: {
reload: true
},
files: ["Gruntfile.js", "tasks/*.js"],
tasks: []
}
}
<|file_sep|>current/tasks/development.js
},
livereload: true
}
}
},
watch: {
sources: {
options: {
livereload: true
},
files: ["*.css", "app.js", "lib/**/*.js", "*.html"],
tasks: ["dev"]
},
config: {
options: {
reload: true
},
files: ["Gruntfile.js", "tasks/*.js"],
tasks: []
}
}
<|file_sep|>updated/tasks/development.js | },
livereload: true
}
}
},
watch: {
sources: {
options: {
livereload: true
},
files: ["*.css", "app.js", "helper.js", "lib/**/*.js", "*.html"],
tasks: ["dev"]
},
config: {
options: {
reload: true
},
files: ["Gruntfile.js", "tasks/*.js"],
tasks: []
}
} | <|file_sep|>original/tasks/development.js
},
livereload: true
}
}
},
watch: {
sources: {
options: {
livereload: true
},
files: ["*.css", "app.js", "lib/**/*.js", "*.html"],
tasks: ["dev"]
},
config: {
options: {
reload: true
},
files: ["Gruntfile.js", "tasks/*.js"],
tasks: []
}
}
<|file_sep|>current/tasks/development.js
},
livereload: true
}
}
},
watch: {
sources: {
options: {
livereload: true
},
files: ["*.css", "app.js", "lib/**/*.js", "*.html"],
tasks: ["dev"]
},
config: {
options: {
reload: true
},
files: ["Gruntfile.js", "tasks/*.js"],
tasks: []
}
}
<|file_sep|>updated/tasks/development.js
},
livereload: true
}
}
},
watch: {
sources: {
options: {
livereload: true
},
files: ["*.css", "app.js", "helper.js", "lib/**/*.js", "*.html"],
tasks: ["dev"]
},
config: {
options: {
reload: true
},
files: ["Gruntfile.js", "tasks/*.js"],
tasks: []
}
} | 109101b6d6fc7131a41f9f4ffd24305c1d5d1c09 | tasks/development.js | tasks/development.js | JavaScript |
<|file_sep|>package.json.diff
original:
"version": "0.0.13",
updated:
"version": "0.1.0",
<|file_sep|>package.json.diff
original:
"Chang Wang (https://github.com/cheapsteak)"
updated:
"Chang Wang (https://github.com/cheapsteak)",
"Dominik Menke (https://github.com/dmke)"
<|file_sep|>package.json.diff
original:
"coffee-script": "*",
"gulp-util": "*",
updated:
"coffee-script": "^1.9.1",
"gulp-util": "^3.0.4",
<|file_sep|>package.json.diff
original:
"through2": "*"
updated:
"through2": "^0.6.3"
<|file_sep|>package.json.diff
original:
"should": "*",
"mocha": "*",
"gulp": "*",
"gulp-coffee": "*"
updated:
"gulp": "^3.8.11",
"gulp-coffee": "^2.3.1",
"mocha": "^2.2.1",
"should": "^5.2.0"
<|file_sep|>original/package.json
"spawn-cmd": "0.0.2",
"through2": "*"
},
"devDependencies": {
"should": "*",
"mocha": "*",
"gulp": "*",
"gulp-coffee": "*"
},
"scripts": {
"test": "node_modules/.bin/mocha --compilers coffee:coffee-script/register -R spec"
},
"engines": {
"node": ">= 0.4.0"
},
"licenses": [
{
"type": "MIT"
}
]
}
<|file_sep|>current/package.json
"spawn-cmd": "0.0.2",
"through2": "^0.6.3"
},
"devDependencies": {
"gulp": "^3.8.11",
"gulp-coffee": "^2.3.1",
"mocha": "^2.2.1",
"should": "^5.2.0"
},
"scripts": {
"test": "node_modules/.bin/mocha --compilers coffee:coffee-script/register -R spec"
},
"engines": {
"node": ">= 0.4.0"
},
"licenses": [
{
"type": "MIT"
}
]
}
<|file_sep|>updated/package.json | "spawn-cmd": "0.0.2",
"through2": "^0.6.3"
},
"devDependencies": {
"gulp": "^3.8.11",
"gulp-coffee": "^2.3.1",
"mocha": "^2.2.1",
"should": "^5.2.0"
},
"scripts": {
"test": "node_modules/.bin/mocha --compilers coffee:coffee-script/register -R spec"
},
"engines": {
"node": ">= 0.10.0"
},
"licenses": [
{
"type": "MIT"
}
]
} | <|file_sep|>package.json.diff
original:
"version": "0.0.13",
updated:
"version": "0.1.0",
<|file_sep|>package.json.diff
original:
"Chang Wang (https://github.com/cheapsteak)"
updated:
"Chang Wang (https://github.com/cheapsteak)",
"Dominik Menke (https://github.com/dmke)"
<|file_sep|>package.json.diff
original:
"coffee-script": "*",
"gulp-util": "*",
updated:
"coffee-script": "^1.9.1",
"gulp-util": "^3.0.4",
<|file_sep|>package.json.diff
original:
"through2": "*"
updated:
"through2": "^0.6.3"
<|file_sep|>package.json.diff
original:
"should": "*",
"mocha": "*",
"gulp": "*",
"gulp-coffee": "*"
updated:
"gulp": "^3.8.11",
"gulp-coffee": "^2.3.1",
"mocha": "^2.2.1",
"should": "^5.2.0"
<|file_sep|>original/package.json
"spawn-cmd": "0.0.2",
"through2": "*"
},
"devDependencies": {
"should": "*",
"mocha": "*",
"gulp": "*",
"gulp-coffee": "*"
},
"scripts": {
"test": "node_modules/.bin/mocha --compilers coffee:coffee-script/register -R spec"
},
"engines": {
"node": ">= 0.4.0"
},
"licenses": [
{
"type": "MIT"
}
]
}
<|file_sep|>current/package.json
"spawn-cmd": "0.0.2",
"through2": "^0.6.3"
},
"devDependencies": {
"gulp": "^3.8.11",
"gulp-coffee": "^2.3.1",
"mocha": "^2.2.1",
"should": "^5.2.0"
},
"scripts": {
"test": "node_modules/.bin/mocha --compilers coffee:coffee-script/register -R spec"
},
"engines": {
"node": ">= 0.4.0"
},
"licenses": [
{
"type": "MIT"
}
]
}
<|file_sep|>updated/package.json
"spawn-cmd": "0.0.2",
"through2": "^0.6.3"
},
"devDependencies": {
"gulp": "^3.8.11",
"gulp-coffee": "^2.3.1",
"mocha": "^2.2.1",
"should": "^5.2.0"
},
"scripts": {
"test": "node_modules/.bin/mocha --compilers coffee:coffee-script/register -R spec"
},
"engines": {
"node": ">= 0.10.0"
},
"licenses": [
{
"type": "MIT"
}
]
} | 5daaabf1f328e3736e4b4f75e4fda84e3e3c1d5b | package.json | package.json | JSON |
<|file_sep|>original/.travis.yml
language: android
android:
components:
- build-tools-21.1.1
- extra-android-m2repository
- extra-google-m2repository
- android-21
licenses:
- android-sdk-license-5be876d5
- android-sdk-preview-license-52d11cd2
branches:
except:
- gh-pages
notifications:
email: false
before_install:
- ./check-missing-assertions.sh
<|file_sep|>current/.travis.yml
language: android
android:
components:
- build-tools-21.1.1
- extra-android-m2repository
- extra-google-m2repository
- android-21
licenses:
- android-sdk-license-5be876d5
- android-sdk-preview-license-52d11cd2
branches:
except:
- gh-pages
notifications:
email: false
before_install:
- ./check-missing-assertions.sh
<|file_sep|>updated/.travis.yml | language: android
android:
components:
- build-tools-21.1.1
- extra-android-m2repository
- extra-google-m2repository
- android-21
licenses:
- android-sdk-license-5be876d5
branches:
except:
- gh-pages
notifications:
email: false
before_install:
- ./check-missing-assertions.sh
| <|file_sep|>original/.travis.yml
language: android
android:
components:
- build-tools-21.1.1
- extra-android-m2repository
- extra-google-m2repository
- android-21
licenses:
- android-sdk-license-5be876d5
- android-sdk-preview-license-52d11cd2
branches:
except:
- gh-pages
notifications:
email: false
before_install:
- ./check-missing-assertions.sh
<|file_sep|>current/.travis.yml
language: android
android:
components:
- build-tools-21.1.1
- extra-android-m2repository
- extra-google-m2repository
- android-21
licenses:
- android-sdk-license-5be876d5
- android-sdk-preview-license-52d11cd2
branches:
except:
- gh-pages
notifications:
email: false
before_install:
- ./check-missing-assertions.sh
<|file_sep|>updated/.travis.yml
language: android
android:
components:
- build-tools-21.1.1
- extra-android-m2repository
- extra-google-m2repository
- android-21
licenses:
- android-sdk-license-5be876d5
branches:
except:
- gh-pages
notifications:
email: false
before_install:
- ./check-missing-assertions.sh
| 2907745a6c8bda1b8cfb6c387290594887584e1c | .travis.yml | .travis.yml | YAML |
<|file_sep|>doc/module.rst.diff
original:
The security namespace, used by kernel security modules.
updated:
The security namespace, used by kernel security modules to store
(for example) capabilities information.
<|file_sep|>original/doc/module.rst
Used as flags value, the target attribute
will be replaced, giving an error if it doesn't exist.
.. data:: NS_SECURITY
The security namespace, used by kernel security modules.
.. data:: NS_SYSTEM
The system namespace, used by the kernel to store things such as
ACLs and capabilities.
.. data:: NS_TRUSTED
   The trusted namespace, visible and accessible only to trusted
processes, used to implement mechanisms in user space.
.. data:: NS_USER
The user namespace; this is the namespace accessible to
non-privileged processes.
<|file_sep|>current/doc/module.rst
Used as flags value, the target attribute
will be replaced, giving an error if it doesn't exist.
.. data:: NS_SECURITY
The security namespace, used by kernel security modules to store
(for example) capabilities information.
.. data:: NS_SYSTEM
The system namespace, used by the kernel to store things such as
ACLs and capabilities.
.. data:: NS_TRUSTED
   The trusted namespace, visible and accessible only to trusted
processes, used to implement mechanisms in user space.
.. data:: NS_USER
The user namespace; this is the namespace accessible to
<|file_sep|>updated/doc/module.rst | Used as flags value, the target attribute
will be replaced, giving an error if it doesn't exist.
.. data:: NS_SECURITY
The security namespace, used by kernel security modules to store
(for example) capabilities information.
.. data:: NS_SYSTEM
The system namespace, used by the kernel to store (for example)
ACLs.
.. data:: NS_TRUSTED
   The trusted namespace, visible and accessible only to trusted
processes, used to implement mechanisms in user space.
.. data:: NS_USER
The user namespace; this is the namespace accessible to | <|file_sep|>doc/module.rst.diff
original:
The security namespace, used by kernel security modules.
updated:
The security namespace, used by kernel security modules to store
(for example) capabilities information.
<|file_sep|>original/doc/module.rst
Used as flags value, the target attribute
will be replaced, giving an error if it doesn't exist.
.. data:: NS_SECURITY
The security namespace, used by kernel security modules.
.. data:: NS_SYSTEM
The system namespace, used by the kernel to store things such as
ACLs and capabilities.
.. data:: NS_TRUSTED
   The trusted namespace, visible and accessible only to trusted
processes, used to implement mechanisms in user space.
.. data:: NS_USER
The user namespace; this is the namespace accessible to
non-privileged processes.
<|file_sep|>current/doc/module.rst
Used as flags value, the target attribute
will be replaced, giving an error if it doesn't exist.
.. data:: NS_SECURITY
The security namespace, used by kernel security modules to store
(for example) capabilities information.
.. data:: NS_SYSTEM
The system namespace, used by the kernel to store things such as
ACLs and capabilities.
.. data:: NS_TRUSTED
   The trusted namespace, visible and accessible only to trusted
processes, used to implement mechanisms in user space.
.. data:: NS_USER
The user namespace; this is the namespace accessible to
<|file_sep|>updated/doc/module.rst
Used as flags value, the target attribute
will be replaced, giving an error if it doesn't exist.
.. data:: NS_SECURITY
The security namespace, used by kernel security modules to store
(for example) capabilities information.
.. data:: NS_SYSTEM
The system namespace, used by the kernel to store (for example)
ACLs.
.. data:: NS_TRUSTED
   The trusted namespace, visible and accessible only to trusted
processes, used to implement mechanisms in user space.
.. data:: NS_USER
The user namespace; this is the namespace accessible to | 48ddd87a0fde84f4eb652a2bb494d94bb390a51a | doc/module.rst | doc/module.rst | reStructuredText |
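The four namespaces documented above are conventionally encoded as dotted prefixes on extended-attribute names (e.g. `user.comment`, `system.posix_acl_access`). A short Python sketch splitting an attribute name into its namespace and remainder — the helper is an illustration for this discussion, not part of the xattr module's API:

```python
# Namespace prefixes corresponding to NS_SECURITY, NS_SYSTEM,
# NS_TRUSTED and NS_USER in the documentation above.
NAMESPACES = ("security", "system", "trusted", "user")

def split_xattr_name(attr):
    """Split an extended-attribute name such as 'user.comment' into
    (namespace, name), validating the namespace prefix."""
    ns, _, name = attr.partition(".")
    if ns not in NAMESPACES or not name:
        raise ValueError("unknown xattr namespace: %r" % attr)
    return ns, name
```

On Linux, the standard-library calls `os.setxattr`/`os.getxattr` likewise take the full dotted name (e.g. `os.setxattr(path, "user.comment", b"hi")`), with non-`user` namespaces typically requiring elevated privileges.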
<|file_sep|>original/milton-caldav-demo/src/main/java/scratch/PrincipalsReport.java
*
* @author brad
*/
public class PrincipalsReport {
/**
* @param args the command line arguments
*/
public static void main(String[] args) throws IOException {
HttpClient client = new HttpClient();
ReportMethod rm = new ReportMethod( "http://localhost:9080/webdav-caldav/principals");
rm.setRequestHeader( "depth", "0");
rm.setRequestEntity( new StringRequestEntity( "<x0:principal-search-property-set xmlns:x0=\"DAV:\"/>", "text/xml", "UTF-8"));
int result = client.executeMethod( rm );
System.out.println( "result: " + result );
String resp = rm.getResponseBodyAsString();
System.out.println( "response--" );
System.out.println( resp );
}
}
<|file_sep|>current/milton-caldav-demo/src/main/java/scratch/PrincipalsReport.java
*
* @author brad
*/
public class PrincipalsReport {
/**
* @param args the command line arguments
*/
public static void main(String[] args) throws IOException {
HttpClient client = new HttpClient();
ReportMethod rm = new ReportMethod( "http://localhost:9080/webdav-caldav/principals");
rm.setRequestHeader( "depth", "0");
rm.setRequestEntity( new StringRequestEntity( "<x0:principal-search-property-set xmlns:x0=\"DAV:\"/>", "text/xml", "UTF-8"));
int result = client.executeMethod( rm );
System.out.println( "result: " + result );
String resp = rm.getResponseBodyAsString();
System.out.println( "response--" );
System.out.println( resp );
}
}
<|file_sep|>updated/milton-caldav-demo/src/main/java/scratch/PrincipalsReport.java | *
* @author brad
*/
public class PrincipalsReport {
/**
* @param args the command line arguments
*/
public static void main(String[] args) throws IOException {
HttpClient client = new HttpClient();
ReportMethod rm = new ReportMethod( "http://localhost:9080/principals");
rm.setRequestHeader( "depth", "0");
rm.setRequestEntity( new StringRequestEntity( "<x0:principal-search-property-set xmlns:x0=\"DAV:\"/>", "text/xml", "UTF-8"));
int result = client.executeMethod( rm );
System.out.println( "result: " + result );
String resp = rm.getResponseBodyAsString();
System.out.println( "response--" );
System.out.println( resp );
}
} | <|file_sep|>original/milton-caldav-demo/src/main/java/scratch/PrincipalsReport.java
*
* @author brad
*/
public class PrincipalsReport {
/**
* @param args the command line arguments
*/
public static void main(String[] args) throws IOException {
HttpClient client = new HttpClient();
ReportMethod rm = new ReportMethod( "http://localhost:9080/webdav-caldav/principals");
rm.setRequestHeader( "depth", "0");
rm.setRequestEntity( new StringRequestEntity( "<x0:principal-search-property-set xmlns:x0=\"DAV:\"/>", "text/xml", "UTF-8"));
int result = client.executeMethod( rm );
System.out.println( "result: " + result );
String resp = rm.getResponseBodyAsString();
System.out.println( "response--" );
System.out.println( resp );
}
}
<|file_sep|>current/milton-caldav-demo/src/main/java/scratch/PrincipalsReport.java
*
* @author brad
*/
public class PrincipalsReport {
/**
* @param args the command line arguments
*/
public static void main(String[] args) throws IOException {
HttpClient client = new HttpClient();
ReportMethod rm = new ReportMethod( "http://localhost:9080/webdav-caldav/principals");
rm.setRequestHeader( "depth", "0");
rm.setRequestEntity( new StringRequestEntity( "<x0:principal-search-property-set xmlns:x0=\"DAV:\"/>", "text/xml", "UTF-8"));
int result = client.executeMethod( rm );
System.out.println( "result: " + result );
String resp = rm.getResponseBodyAsString();
System.out.println( "response--" );
System.out.println( resp );
}
}
<|file_sep|>updated/milton-caldav-demo/src/main/java/scratch/PrincipalsReport.java
*
* @author brad
*/
public class PrincipalsReport {
/**
* @param args the command line arguments
*/
public static void main(String[] args) throws IOException {
HttpClient client = new HttpClient();
ReportMethod rm = new ReportMethod( "http://localhost:9080/principals");
rm.setRequestHeader( "depth", "0");
rm.setRequestEntity( new StringRequestEntity( "<x0:principal-search-property-set xmlns:x0=\"DAV:\"/>", "text/xml", "UTF-8"));
int result = client.executeMethod( rm );
System.out.println( "result: " + result );
String resp = rm.getResponseBodyAsString();
System.out.println( "response--" );
System.out.println( resp );
}
} | 6e8ed853c5263f194789bb452f84bb874e69a254 | milton-caldav-demo/src/main/java/scratch/PrincipalsReport.java | milton-caldav-demo/src/main/java/scratch/PrincipalsReport.java | Java |
<|file_sep|>original/manager/manager/themes.py
# Generated by generate.js. Commit this file, but do not edit it.
from manager.helpers import EnumChoice
# The version of Thema to use
version = "2.20.0"
class Themes(EnumChoice):
"""The list of Thema themes."""
bootstrap = "bootstrap"
elife = "elife"
f1000 = "f1000"
galleria = "galleria"
giga = "giga"
latex = "latex"
nature = "nature"
plos = "plos"
rpng = "rpng"
skeleton = "skeleton"
<|file_sep|>current/manager/manager/themes.py
# Generated by generate.js. Commit this file, but do not edit it.
from manager.helpers import EnumChoice
# The version of Thema to use
version = "2.20.0"
class Themes(EnumChoice):
"""The list of Thema themes."""
bootstrap = "bootstrap"
elife = "elife"
f1000 = "f1000"
galleria = "galleria"
giga = "giga"
latex = "latex"
nature = "nature"
plos = "plos"
rpng = "rpng"
skeleton = "skeleton"
<|file_sep|>updated/manager/manager/themes.py | # Generated by generate.js. Commit this file, but do not edit it.
from manager.helpers import EnumChoice
# The version of Thema to use
version = "2.20.3"
class Themes(EnumChoice):
"""The list of Thema themes."""
bootstrap = "bootstrap"
elife = "elife"
f1000 = "f1000"
galleria = "galleria"
giga = "giga"
latex = "latex"
nature = "nature"
plos = "plos"
rpng = "rpng"
skeleton = "skeleton" | <|file_sep|>original/manager/manager/themes.py
# Generated by generate.js. Commit this file, but do not edit it.
from manager.helpers import EnumChoice
# The version of Thema to use
version = "2.20.0"
class Themes(EnumChoice):
"""The list of Thema themes."""
bootstrap = "bootstrap"
elife = "elife"
f1000 = "f1000"
galleria = "galleria"
giga = "giga"
latex = "latex"
nature = "nature"
plos = "plos"
rpng = "rpng"
skeleton = "skeleton"
<|file_sep|>current/manager/manager/themes.py
# Generated by generate.js. Commit this file, but do not edit it.
from manager.helpers import EnumChoice
# The version of Thema to use
version = "2.20.0"
class Themes(EnumChoice):
"""The list of Thema themes."""
bootstrap = "bootstrap"
elife = "elife"
f1000 = "f1000"
galleria = "galleria"
giga = "giga"
latex = "latex"
nature = "nature"
plos = "plos"
rpng = "rpng"
skeleton = "skeleton"
<|file_sep|>updated/manager/manager/themes.py
# Generated by generate.js. Commit this file, but do not edit it.
from manager.helpers import EnumChoice
# The version of Thema to use
version = "2.20.3"
class Themes(EnumChoice):
"""The list of Thema themes."""
bootstrap = "bootstrap"
elife = "elife"
f1000 = "f1000"
galleria = "galleria"
giga = "giga"
latex = "latex"
nature = "nature"
plos = "plos"
rpng = "rpng"
skeleton = "skeleton" | 7d988a0d622496065e3b1acbb9f4a32fa3678c9e | manager/manager/themes.py | manager/manager/themes.py | Python |
<|file_sep|>original/package.json
"type": "git",
"url": "bigeasy/turnstile"
},
"dependencies":
{
"avenue": "^0.4.0",
"extant": "^1.0.20",
"interrupt": "^10.1.0",
"nop": "1.0.0"
},
"devDependencies":
{
"destructible": "7.0.0-alpha.7",
"proof": "^9.0.2"
},
"main": "turnstile",
"scripts":
{
"test": "proof test/*.test.js"
}
}
<|file_sep|>current/package.json
"type": "git",
"url": "bigeasy/turnstile"
},
"dependencies":
{
"avenue": "^0.4.0",
"extant": "^1.0.20",
"interrupt": "^10.1.0",
"nop": "1.0.0"
},
"devDependencies":
{
"destructible": "7.0.0-alpha.7",
"proof": "^9.0.2"
},
"main": "turnstile",
"scripts":
{
"test": "proof test/*.test.js"
}
}
<|file_sep|>updated/package.json | "type": "git",
"url": "bigeasy/turnstile"
},
"dependencies":
{
"avenue": "^0.4.0",
"extant": "^1.0.20",
"interrupt": "^10.1.0",
"nop": "1.0.0"
},
"devDependencies":
{
"destructible": "7.0.0-alpha.7",
"proof": "^9.0.2"
},
"main": "turnstile",
"scripts":
{
"test": "proof test/*.t.js"
}
} | <|file_sep|>original/package.json
"type": "git",
"url": "bigeasy/turnstile"
},
"dependencies":
{
"avenue": "^0.4.0",
"extant": "^1.0.20",
"interrupt": "^10.1.0",
"nop": "1.0.0"
},
"devDependencies":
{
"destructible": "7.0.0-alpha.7",
"proof": "^9.0.2"
},
"main": "turnstile",
"scripts":
{
"test": "proof test/*.test.js"
}
}
<|file_sep|>current/package.json
"type": "git",
"url": "bigeasy/turnstile"
},
"dependencies":
{
"avenue": "^0.4.0",
"extant": "^1.0.20",
"interrupt": "^10.1.0",
"nop": "1.0.0"
},
"devDependencies":
{
"destructible": "7.0.0-alpha.7",
"proof": "^9.0.2"
},
"main": "turnstile",
"scripts":
{
"test": "proof test/*.test.js"
}
}
<|file_sep|>updated/package.json
"type": "git",
"url": "bigeasy/turnstile"
},
"dependencies":
{
"avenue": "^0.4.0",
"extant": "^1.0.20",
"interrupt": "^10.1.0",
"nop": "1.0.0"
},
"devDependencies":
{
"destructible": "7.0.0-alpha.7",
"proof": "^9.0.2"
},
"main": "turnstile",
"scripts":
{
"test": "proof test/*.t.js"
}
} | a870df1e94353472086f6c73408d820778c8d257 | package.json | package.json | JSON |
<|file_sep|>original/travis/cpp-prelude.sh
#!/bin/bash
# Copyright 2017 Yahoo Holdings. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
set -e
export CCACHE_MAXSIZE="1250M"
export CCACHE_COMPRESS=1
ccache --print-config
<|file_sep|>current/travis/cpp-prelude.sh
#!/bin/bash
# Copyright 2017 Yahoo Holdings. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
set -e
export CCACHE_MAXSIZE="1250M"
export CCACHE_COMPRESS=1
ccache --print-config
<|file_sep|>updated/travis/cpp-prelude.sh | #!/bin/bash
# Copyright 2017 Yahoo Holdings. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
set -e
ccache --max-size=1250M
ccache --set-config=compression=true
ccache --print-config | <|file_sep|>original/travis/cpp-prelude.sh
#!/bin/bash
# Copyright 2017 Yahoo Holdings. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
set -e
export CCACHE_MAXSIZE="1250M"
export CCACHE_COMPRESS=1
ccache --print-config
<|file_sep|>current/travis/cpp-prelude.sh
#!/bin/bash
# Copyright 2017 Yahoo Holdings. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
set -e
export CCACHE_MAXSIZE="1250M"
export CCACHE_COMPRESS=1
ccache --print-config
<|file_sep|>updated/travis/cpp-prelude.sh
#!/bin/bash
# Copyright 2017 Yahoo Holdings. Licensed under the terms of the Apache 2.0 license. See LICENSE in the project root.
set -e
ccache --max-size=1250M
ccache --set-config=compression=true
ccache --print-config | d7d34517898e037e2fc229e37b285576545685b9 | travis/cpp-prelude.sh | travis/cpp-prelude.sh | Shell |
<|file_sep|>original/tsconfig.json
{
"compilerOptions": {
"outDir": "lib",
"target": "es6",
"jsx": "react",
"module": "es2015",
"moduleResolution": "node",
"experimentalDecorators": true,
"strictNullChecks": true,
"types": ["node", "commander"]
},
"exclude": [
"node_modules",
"lib"
],
"formatCodeOptions": {
"indentSize": 2,
"tabSize": 2
},
"compileOnSave": false,
"buildOnSave": false
<|file_sep|>current/tsconfig.json
{
"compilerOptions": {
"outDir": "lib",
"target": "es6",
"jsx": "react",
"module": "es2015",
"moduleResolution": "node",
"experimentalDecorators": true,
"strictNullChecks": true,
"types": ["node", "commander"]
},
"exclude": [
"node_modules",
"lib"
],
"formatCodeOptions": {
"indentSize": 2,
"tabSize": 2
},
"compileOnSave": false,
"buildOnSave": false
<|file_sep|>updated/tsconfig.json | {
"compilerOptions": {
"outDir": "lib",
"target": "es6",
"jsx": "react",
"module": "es2015",
"moduleResolution": "node",
"experimentalDecorators": true,
"strictNullChecks": true,
"types": ["node", "commander", "webpack"]
},
"exclude": [
"node_modules",
"lib"
],
"formatCodeOptions": {
"indentSize": 2,
"tabSize": 2
},
"compileOnSave": false,
"buildOnSave": false | <|file_sep|>original/tsconfig.json
{
"compilerOptions": {
"outDir": "lib",
"target": "es6",
"jsx": "react",
"module": "es2015",
"moduleResolution": "node",
"experimentalDecorators": true,
"strictNullChecks": true,
"types": ["node", "commander"]
},
"exclude": [
"node_modules",
"lib"
],
"formatCodeOptions": {
"indentSize": 2,
"tabSize": 2
},
"compileOnSave": false,
"buildOnSave": false
<|file_sep|>current/tsconfig.json
{
"compilerOptions": {
"outDir": "lib",
"target": "es6",
"jsx": "react",
"module": "es2015",
"moduleResolution": "node",
"experimentalDecorators": true,
"strictNullChecks": true,
"types": ["node", "commander"]
},
"exclude": [
"node_modules",
"lib"
],
"formatCodeOptions": {
"indentSize": 2,
"tabSize": 2
},
"compileOnSave": false,
"buildOnSave": false
<|file_sep|>updated/tsconfig.json
{
"compilerOptions": {
"outDir": "lib",
"target": "es6",
"jsx": "react",
"module": "es2015",
"moduleResolution": "node",
"experimentalDecorators": true,
"strictNullChecks": true,
"types": ["node", "commander", "webpack"]
},
"exclude": [
"node_modules",
"lib"
],
"formatCodeOptions": {
"indentSize": 2,
"tabSize": 2
},
"compileOnSave": false,
"buildOnSave": false | 645d74cc4291c81789d371eef1224ccda69db144 | tsconfig.json | tsconfig.json | JSON |
<|file_sep|>original/osu.Game.Tests/Visual/Multiplayer/TestSceneOverlinedPlaylist.cs
{
protected override bool UseOnlineAPI => true;
public TestSceneOverlinedPlaylist()
{
Add(new DrawableRoomPlaylist(false, false)
{
Anchor = Anchor.Centre,
Origin = Anchor.Centre,
Size = new Vector2(500),
});
}
[SetUp]
public new void Setup() => Schedule(() =>
{
Room.RoomID.Value = 7;
for (int i = 0; i < 10; i++)
{
Room.Playlist.Add(new PlaylistItem
<|file_sep|>current/osu.Game.Tests/Visual/Multiplayer/TestSceneOverlinedPlaylist.cs
{
protected override bool UseOnlineAPI => true;
public TestSceneOverlinedPlaylist()
{
Add(new DrawableRoomPlaylist(false, false)
{
Anchor = Anchor.Centre,
Origin = Anchor.Centre,
Size = new Vector2(500),
});
}
[SetUp]
public new void Setup() => Schedule(() =>
{
Room.RoomID.Value = 7;
for (int i = 0; i < 10; i++)
{
Room.Playlist.Add(new PlaylistItem
<|file_sep|>updated/osu.Game.Tests/Visual/Multiplayer/TestSceneOverlinedPlaylist.cs | {
protected override bool UseOnlineAPI => true;
public TestSceneOverlinedPlaylist()
{
Add(new DrawableRoomPlaylist(false, false)
{
Anchor = Anchor.Centre,
Origin = Anchor.Centre,
Size = new Vector2(500),
Items = { BindTarget = Room.Playlist }
});
}
[SetUp]
public new void Setup() => Schedule(() =>
{
Room.RoomID.Value = 7;
for (int i = 0; i < 10; i++)
{ | <|file_sep|>original/osu.Game.Tests/Visual/Multiplayer/TestSceneOverlinedPlaylist.cs
{
protected override bool UseOnlineAPI => true;
public TestSceneOverlinedPlaylist()
{
Add(new DrawableRoomPlaylist(false, false)
{
Anchor = Anchor.Centre,
Origin = Anchor.Centre,
Size = new Vector2(500),
});
}
[SetUp]
public new void Setup() => Schedule(() =>
{
Room.RoomID.Value = 7;
for (int i = 0; i < 10; i++)
{
Room.Playlist.Add(new PlaylistItem
<|file_sep|>current/osu.Game.Tests/Visual/Multiplayer/TestSceneOverlinedPlaylist.cs
{
protected override bool UseOnlineAPI => true;
public TestSceneOverlinedPlaylist()
{
Add(new DrawableRoomPlaylist(false, false)
{
Anchor = Anchor.Centre,
Origin = Anchor.Centre,
Size = new Vector2(500),
});
}
[SetUp]
public new void Setup() => Schedule(() =>
{
Room.RoomID.Value = 7;
for (int i = 0; i < 10; i++)
{
Room.Playlist.Add(new PlaylistItem
<|file_sep|>updated/osu.Game.Tests/Visual/Multiplayer/TestSceneOverlinedPlaylist.cs
{
protected override bool UseOnlineAPI => true;
public TestSceneOverlinedPlaylist()
{
Add(new DrawableRoomPlaylist(false, false)
{
Anchor = Anchor.Centre,
Origin = Anchor.Centre,
Size = new Vector2(500),
Items = { BindTarget = Room.Playlist }
});
}
[SetUp]
public new void Setup() => Schedule(() =>
{
Room.RoomID.Value = 7;
for (int i = 0; i < 10; i++)
{ | f0e91ba43188c7cc3f7a0e660c13477551082f71 | osu.Game.Tests/Visual/Multiplayer/TestSceneOverlinedPlaylist.cs | osu.Game.Tests/Visual/Multiplayer/TestSceneOverlinedPlaylist.cs | C# |
<|file_sep|>original/src/components/finish-order/index.js
import { Component } from 'preact';
import style from './style';
export default class FinishOrder extends Component {
constructor(props) {
super(props);
}
render(props, state) {
return (
<div class="container">
<p class="">
Děkujeme, za Vaši objednávku. Budeme Vás kontaktovat. Detaily objednávky jsme Vám zaslali na email.
</p>
<a class="button" href="http://3dtovarna.cz/" class={`button two columns offset-by-four ${style['button']}`}>
OK
</a>
</div>
);
}
}
<|file_sep|>current/src/components/finish-order/index.js
import { Component } from 'preact';
import style from './style';
export default class FinishOrder extends Component {
constructor(props) {
super(props);
}
render(props, state) {
return (
<div class="container">
<p class="">
Děkujeme, za Vaši objednávku. Budeme Vás kontaktovat. Detaily objednávky jsme Vám zaslali na email.
</p>
<a class="button" href="http://3dtovarna.cz/" class={`button two columns offset-by-four ${style['button']}`}>
OK
</a>
</div>
);
}
}
<|file_sep|>updated/src/components/finish-order/index.js | import { Component } from 'preact';
import style from './style';
export default class FinishOrder extends Component {
constructor(props) {
super(props);
}
render(props, state) {
return (
<div class="container">
<p class="">
Děkujeme, za Vaši objednávku. Budeme Vás kontaktovat. Detaily objednávky jsme Vám zaslali na email.
</p>
{/*<a class="button" href="http://3dtovarna.cz/" class={`button two columns offset-by-four ${style['button']}`}>*/}
{/*OK*/}
{/*</a>*/}
</div>
);
}
} | <|file_sep|>original/src/components/finish-order/index.js
import { Component } from 'preact';
import style from './style';
export default class FinishOrder extends Component {
constructor(props) {
super(props);
}
render(props, state) {
return (
<div class="container">
<p class="">
Děkujeme, za Vaši objednávku. Budeme Vás kontaktovat. Detaily objednávky jsme Vám zaslali na email.
</p>
<a class="button" href="http://3dtovarna.cz/" class={`button two columns offset-by-four ${style['button']}`}>
OK
</a>
</div>
);
}
}
<|file_sep|>current/src/components/finish-order/index.js
import { Component } from 'preact';
import style from './style';
export default class FinishOrder extends Component {
constructor(props) {
super(props);
}
render(props, state) {
return (
<div class="container">
<p class="">
Děkujeme, za Vaši objednávku. Budeme Vás kontaktovat. Detaily objednávky jsme Vám zaslali na email.
</p>
<a class="button" href="http://3dtovarna.cz/" class={`button two columns offset-by-four ${style['button']}`}>
OK
</a>
</div>
);
}
}
<|file_sep|>updated/src/components/finish-order/index.js
import { Component } from 'preact';
import style from './style';
export default class FinishOrder extends Component {
constructor(props) {
super(props);
}
render(props, state) {
return (
<div class="container">
<p class="">
Děkujeme, za Vaši objednávku. Budeme Vás kontaktovat. Detaily objednávky jsme Vám zaslali na email.
</p>
{/*<a class="button" href="http://3dtovarna.cz/" class={`button two columns offset-by-four ${style['button']}`}>*/}
{/*OK*/}
{/*</a>*/}
</div>
);
}
} | f6b0174625f669cef151cf8ef702d38b509fe95a | src/components/finish-order/index.js | src/components/finish-order/index.js | JavaScript |
<|file_sep|>Command/AssetsCacheCommand.php.diff
original:
updated:
public static $defaultName = "becklyn:assets:reset";
<|file_sep|>original/Command/AssetsCacheCommand.php
/**
* @inheritdoc
*/
protected function configure ()
{
$this
->setDescription("Resets (clears and warms) the assets cache")
->addOption(
"no-warmup",
null,
InputOption::VALUE_OPTIONAL,
"Should be cache be automatically warmed up?",
false
);
}
/**
* @inheritdoc
*/
<|file_sep|>current/Command/AssetsCacheCommand.php
parent::__construct();
}
/**
* @inheritdoc
*/
protected function configure ()
{
$this
->setDescription("Resets (clears and warms) the assets cache")
->addOption(
"no-warmup",
null,
InputOption::VALUE_OPTIONAL,
"Should be cache be automatically warmed up?",
false
);
}
<|file_sep|>updated/Command/AssetsCacheCommand.php | parent::__construct();
}
/**
* @inheritdoc
*/
protected function configure ()
{
$this
->setName(self::$defaultName)
->setDescription("Resets (clears and warms) the assets cache")
->addOption(
"no-warmup",
null,
InputOption::VALUE_OPTIONAL,
"Should be cache be automatically warmed up?",
false
);
}
| <|file_sep|>Command/AssetsCacheCommand.php.diff
original:
updated:
public static $defaultName = "becklyn:assets:reset";
<|file_sep|>original/Command/AssetsCacheCommand.php
/**
* @inheritdoc
*/
protected function configure ()
{
$this
->setDescription("Resets (clears and warms) the assets cache")
->addOption(
"no-warmup",
null,
InputOption::VALUE_OPTIONAL,
"Should be cache be automatically warmed up?",
false
);
}
/**
* @inheritdoc
*/
<|file_sep|>current/Command/AssetsCacheCommand.php
parent::__construct();
}
/**
* @inheritdoc
*/
protected function configure ()
{
$this
->setDescription("Resets (clears and warms) the assets cache")
->addOption(
"no-warmup",
null,
InputOption::VALUE_OPTIONAL,
"Should be cache be automatically warmed up?",
false
);
}
<|file_sep|>updated/Command/AssetsCacheCommand.php
parent::__construct();
}
/**
* @inheritdoc
*/
protected function configure ()
{
$this
->setName(self::$defaultName)
->setDescription("Resets (clears and warms) the assets cache")
->addOption(
"no-warmup",
null,
InputOption::VALUE_OPTIONAL,
"Should be cache be automatically warmed up?",
false
);
}
| 4b0152c417a8d29daab9a1f2ea2ce07c027b5ff9 | Command/AssetsCacheCommand.php | Command/AssetsCacheCommand.php | PHP |
<|file_sep|>cpp/README.md.diff
original:
* go >= 1.6.0
* qmake (comes with Qt) to generate the Makefile
updated:
* [go](https://golang.org/dl/) >= 1.6.0
* [qmake](https://www.qt.io/download-open-source/) (comes with Qt) or [waf](https://waf.io/)
<|file_sep|>cpp/README.md.diff
original:
You don't have to have qmake if you want to compile it with your own Makefile :-)
updated:
You don't have to have qmake or waf if you want to compile it with your own Makefile :-)
<|file_sep|>cpp/README.md.diff
original:
### Building
updated:
### Building with qmake
<|file_sep|>cpp/README.md.diff
original:
### Testing
updated:
### Testing with qmake
<|file_sep|>original/cpp/README.md
You don't have to have qmake if you want to compile it with your own Makefile :-)
### Runtime Dependencies
None
### Building
```shell
$ cd <path/to/gofileseq>/cpp
$ qmake
$ make
```
### Testing
```shell
$ cd <path/to/gofileseq>/cpp/test
$ qmake
$ make
```
<|file_sep|>current/cpp/README.md
You don't have to have qmake or waf if you want to compile it with your own Makefile :-)
### Runtime Dependencies
None
### Building with qmake
```shell
$ cd <path/to/gofileseq>/cpp
$ qmake
$ make
```
### Testing with qmake
```shell
$ cd <path/to/gofileseq>/cpp/test
$ qmake
$ make
```
<|file_sep|>updated/cpp/README.md | $ qmake
$ make
```
### Testing with qmake
```shell
$ cd <path/to/gofileseq>/cpp/test
$ qmake
$ make
```
### Building with waf
```shell
$ cd <path/to/gofileseq>/cpp
$ waf configure --prefix=/path/to/install/to
$ waf build
$ waf test
$ waf install
``` | <|file_sep|>cpp/README.md.diff
original:
* go >= 1.6.0
* qmake (comes with Qt) to generate the Makefile
updated:
* [go](https://golang.org/dl/) >= 1.6.0
* [qmake](https://www.qt.io/download-open-source/) (comes with Qt) or [waf](https://waf.io/)
<|file_sep|>cpp/README.md.diff
original:
You don't have to have qmake if you want to compile it with your own Makefile :-)
updated:
You don't have to have qmake or waf if you want to compile it with your own Makefile :-)
<|file_sep|>cpp/README.md.diff
original:
### Building
updated:
### Building with qmake
<|file_sep|>cpp/README.md.diff
original:
### Testing
updated:
### Testing with qmake
<|file_sep|>original/cpp/README.md
You don't have to have qmake if you want to compile it with your own Makefile :-)
### Runtime Dependencies
None
### Building
```shell
$ cd <path/to/gofileseq>/cpp
$ qmake
$ make
```
### Testing
```shell
$ cd <path/to/gofileseq>/cpp/test
$ qmake
$ make
```
<|file_sep|>current/cpp/README.md
You don't have to have qmake or waf if you want to compile it with your own Makefile :-)
### Runtime Dependencies
None
### Building with qmake
```shell
$ cd <path/to/gofileseq>/cpp
$ qmake
$ make
```
### Testing with qmake
```shell
$ cd <path/to/gofileseq>/cpp/test
$ qmake
$ make
```
<|file_sep|>updated/cpp/README.md
$ qmake
$ make
```
### Testing with qmake
```shell
$ cd <path/to/gofileseq>/cpp/test
$ qmake
$ make
```
### Building with waf
```shell
$ cd <path/to/gofileseq>/cpp
$ waf configure --prefix=/path/to/install/to
$ waf build
$ waf test
$ waf install
``` | 8cb3e5b3dde11451fc3eacf4ca3f1b4adc0ae5b2 | cpp/README.md | cpp/README.md | Markdown |
<|file_sep|>original/Resources/config/routing.yaml
darvin_admin_import_image:
resource: '@DarvinImageBundle/Resources/config/routing.yaml'
prefix: /image
darvin_admin_import_search:
resource: '@DarvinAdminBundle/Resources/config/routing/search.yaml'
prefix: /search
darvin_admin_import_fm_elfinder:
resource: '@FMElfinderBundle/Resources/config/routing.yml'
darvin_admin_import_loader:
resource: .
type: darvin_admin
darvin_admin_import_page:
resource: '@DarvinAdminBundle/Resources/config/routing/page.yaml'
darvin_admin_import_security:
resource: '@DarvinAdminBundle/Resources/config/routing/security.yaml'
<|file_sep|>current/Resources/config/routing.yaml
darvin_admin_import_image:
resource: '@DarvinImageBundle/Resources/config/routing.yaml'
prefix: /image
darvin_admin_import_search:
resource: '@DarvinAdminBundle/Resources/config/routing/search.yaml'
prefix: /search
darvin_admin_import_fm_elfinder:
resource: '@FMElfinderBundle/Resources/config/routing.yml'
darvin_admin_import_loader:
resource: .
type: darvin_admin
darvin_admin_import_page:
resource: '@DarvinAdminBundle/Resources/config/routing/page.yaml'
darvin_admin_import_security:
resource: '@DarvinAdminBundle/Resources/config/routing/security.yaml'
<|file_sep|>updated/Resources/config/routing.yaml |
darvin_admin_import_image:
resource: '@DarvinImageBundle/Resources/config/routing.yaml'
prefix: /image
darvin_admin_import_search:
resource: '@DarvinAdminBundle/Resources/config/routing/search.yaml'
prefix: /search
darvin_admin_import_fm_elfinder:
resource: '@FMElfinderBundle/Resources/config/routing.yaml'
darvin_admin_import_loader:
resource: .
type: darvin_admin
darvin_admin_import_page:
resource: '@DarvinAdminBundle/Resources/config/routing/page.yaml'
darvin_admin_import_security:
resource: '@DarvinAdminBundle/Resources/config/routing/security.yaml' | <|file_sep|>original/Resources/config/routing.yaml
darvin_admin_import_image:
resource: '@DarvinImageBundle/Resources/config/routing.yaml'
prefix: /image
darvin_admin_import_search:
resource: '@DarvinAdminBundle/Resources/config/routing/search.yaml'
prefix: /search
darvin_admin_import_fm_elfinder:
resource: '@FMElfinderBundle/Resources/config/routing.yml'
darvin_admin_import_loader:
resource: .
type: darvin_admin
darvin_admin_import_page:
resource: '@DarvinAdminBundle/Resources/config/routing/page.yaml'
darvin_admin_import_security:
resource: '@DarvinAdminBundle/Resources/config/routing/security.yaml'
<|file_sep|>current/Resources/config/routing.yaml
darvin_admin_import_image:
resource: '@DarvinImageBundle/Resources/config/routing.yaml'
prefix: /image
darvin_admin_import_search:
resource: '@DarvinAdminBundle/Resources/config/routing/search.yaml'
prefix: /search
darvin_admin_import_fm_elfinder:
resource: '@FMElfinderBundle/Resources/config/routing.yml'
darvin_admin_import_loader:
resource: .
type: darvin_admin
darvin_admin_import_page:
resource: '@DarvinAdminBundle/Resources/config/routing/page.yaml'
darvin_admin_import_security:
resource: '@DarvinAdminBundle/Resources/config/routing/security.yaml'
<|file_sep|>updated/Resources/config/routing.yaml
darvin_admin_import_image:
resource: '@DarvinImageBundle/Resources/config/routing.yaml'
prefix: /image
darvin_admin_import_search:
resource: '@DarvinAdminBundle/Resources/config/routing/search.yaml'
prefix: /search
darvin_admin_import_fm_elfinder:
resource: '@FMElfinderBundle/Resources/config/routing.yaml'
darvin_admin_import_loader:
resource: .
type: darvin_admin
darvin_admin_import_page:
resource: '@DarvinAdminBundle/Resources/config/routing/page.yaml'
darvin_admin_import_security:
resource: '@DarvinAdminBundle/Resources/config/routing/security.yaml' | 519b774298de36a4e8e920d525f6f6bdfc7a5d81 | Resources/config/routing.yaml | Resources/config/routing.yaml | YAML |
<|file_sep|>original/roles/haskell/tasks/main.yml
---
- name: Enable GCC
homebrew: name=gcc state=linked
- name: Install Haskell Platform
homebrew: name=haskell-platform state=installed
- name: Update cabal
shell: 'cabal update'
tags: update
- name: Install packages
shell: 'cabal install {{ dotfiles.haskell.packages | join(" ")}}'
- name: Link dotfiles
file: src={{ item }} dest=~/.{{ item | basename }} state=link
with_fileglob:
- ./*
<|file_sep|>current/roles/haskell/tasks/main.yml
---
- name: Enable GCC
homebrew: name=gcc state=linked
- name: Install Haskell Platform
homebrew: name=haskell-platform state=installed
- name: Update cabal
shell: 'cabal update'
tags: update
- name: Install packages
shell: 'cabal install {{ dotfiles.haskell.packages | join(" ")}}'
- name: Link dotfiles
file: src={{ item }} dest=~/.{{ item | basename }} state=link
with_fileglob:
- ./*
<|file_sep|>updated/roles/haskell/tasks/main.yml | ---
- name: Install Haskell packages
homebrew: name={{ item }} state=latest
with_items:
- cabal-install
- gcc
- ghc
- name: Update cabal
shell: 'cabal update'
tags: update
- name: Install packages
shell: 'cabal install {{ dotfiles.haskell.packages | join(" ")}}'
- name: Link dotfiles
file: src={{ item }} dest=~/.{{ item | basename }} state=link
with_fileglob:
- ./* | <|file_sep|>original/roles/haskell/tasks/main.yml
---
- name: Enable GCC
homebrew: name=gcc state=linked
- name: Install Haskell Platform
homebrew: name=haskell-platform state=installed
- name: Update cabal
shell: 'cabal update'
tags: update
- name: Install packages
shell: 'cabal install {{ dotfiles.haskell.packages | join(" ")}}'
- name: Link dotfiles
file: src={{ item }} dest=~/.{{ item | basename }} state=link
with_fileglob:
- ./*
<|file_sep|>current/roles/haskell/tasks/main.yml
---
- name: Enable GCC
homebrew: name=gcc state=linked
- name: Install Haskell Platform
homebrew: name=haskell-platform state=installed
- name: Update cabal
shell: 'cabal update'
tags: update
- name: Install packages
shell: 'cabal install {{ dotfiles.haskell.packages | join(" ")}}'
- name: Link dotfiles
file: src={{ item }} dest=~/.{{ item | basename }} state=link
with_fileglob:
- ./*
<|file_sep|>updated/roles/haskell/tasks/main.yml
---
- name: Install Haskell packages
homebrew: name={{ item }} state=latest
with_items:
- cabal-install
- gcc
- ghc
- name: Update cabal
shell: 'cabal update'
tags: update
- name: Install packages
shell: 'cabal install {{ dotfiles.haskell.packages | join(" ")}}'
- name: Link dotfiles
file: src={{ item }} dest=~/.{{ item | basename }} state=link
with_fileglob:
- ./* | b2f60b739ae730db502f7a33af185c58a6b7580e | roles/haskell/tasks/main.yml | roles/haskell/tasks/main.yml | YAML |
<|file_sep|>original/HACKING.rst
=======================
- Step 1: Read the OpenStack Style Commandments
http://docs.openstack.org/developer/hacking/
- Step 2: Read on
Neutron Specific Commandments
--------------------------
- [N320] Validate that LOG messages, except debug ones, have translations
Creating Unit Tests
-------------------
For every new feature, unit tests should be created that both test and
(implicitly) document the usage of said feature. If submitting a patch for a
bug that had no unit test, a new passing unit test should be added. If a
submitted bug fix does have a unit test, be sure to add a new one that fails
without the patch and passes with the patch.
All unittest classes must ultimately inherit from testtools.TestCase. In the
Neutron test suite, this should be done by inheriting from
<|file_sep|>current/HACKING.rst
=======================
- Step 1: Read the OpenStack Style Commandments
http://docs.openstack.org/developer/hacking/
- Step 2: Read on
Neutron Specific Commandments
--------------------------
- [N320] Validate that LOG messages, except debug ones, have translations
Creating Unit Tests
-------------------
For every new feature, unit tests should be created that both test and
(implicitly) document the usage of said feature. If submitting a patch for a
bug that had no unit test, a new passing unit test should be added. If a
submitted bug fix does have a unit test, be sure to add a new one that fails
without the patch and passes with the patch.
All unittest classes must ultimately inherit from testtools.TestCase. In the
Neutron test suite, this should be done by inheriting from
<|file_sep|>updated/HACKING.rst | =======================
- Step 1: Read the OpenStack Style Commandments
http://docs.openstack.org/developer/hacking/
- Step 2: Read on
Neutron Specific Commandments
--------------------------
- [N320] Validate that LOG messages, except debug ones, have translations
- [N321] Validate that jsonutils module is used instead of json
Creating Unit Tests
-------------------
For every new feature, unit tests should be created that both test and
(implicitly) document the usage of said feature. If submitting a patch for a
bug that had no unit test, a new passing unit test should be added. If a
submitted bug fix does have a unit test, be sure to add a new one that fails
without the patch and passes with the patch.
All unittest classes must ultimately inherit from testtools.TestCase. In the | <|file_sep|>original/HACKING.rst
=======================
- Step 1: Read the OpenStack Style Commandments
http://docs.openstack.org/developer/hacking/
- Step 2: Read on
Neutron Specific Commandments
--------------------------
- [N320] Validate that LOG messages, except debug ones, have translations
Creating Unit Tests
-------------------
For every new feature, unit tests should be created that both test and
(implicitly) document the usage of said feature. If submitting a patch for a
bug that had no unit test, a new passing unit test should be added. If a
submitted bug fix does have a unit test, be sure to add a new one that fails
without the patch and passes with the patch.
All unittest classes must ultimately inherit from testtools.TestCase. In the
Neutron test suite, this should be done by inheriting from
<|file_sep|>current/HACKING.rst
=======================
- Step 1: Read the OpenStack Style Commandments
http://docs.openstack.org/developer/hacking/
- Step 2: Read on
Neutron Specific Commandments
--------------------------
- [N320] Validate that LOG messages, except debug ones, have translations
Creating Unit Tests
-------------------
For every new feature, unit tests should be created that both test and
(implicitly) document the usage of said feature. If submitting a patch for a
bug that had no unit test, a new passing unit test should be added. If a
submitted bug fix does have a unit test, be sure to add a new one that fails
without the patch and passes with the patch.
All unittest classes must ultimately inherit from testtools.TestCase. In the
Neutron test suite, this should be done by inheriting from
<|file_sep|>updated/HACKING.rst
=======================
- Step 1: Read the OpenStack Style Commandments
http://docs.openstack.org/developer/hacking/
- Step 2: Read on
Neutron Specific Commandments
--------------------------
- [N320] Validate that LOG messages, except debug ones, have translations
- [N321] Validate that jsonutils module is used instead of json
Creating Unit Tests
-------------------
For every new feature, unit tests should be created that both test and
(implicitly) document the usage of said feature. If submitting a patch for a
bug that had no unit test, a new passing unit test should be added. If a
submitted bug fix does have a unit test, be sure to add a new one that fails
without the patch and passes with the patch.
All unittest classes must ultimately inherit from testtools.TestCase. In the | 2217946e424eefd6108aa0716e12d0fac6395341 | HACKING.rst | HACKING.rst | reStructuredText |
<|file_sep|>original/spec/integration/carrier_wave_spec.rb
<|file_sep|>current/spec/integration/carrier_wave_spec.rb
<|file_sep|>updated/spec/integration/carrier_wave_spec.rb | class TestUploader < CoverPhotoUploader
def uploader_options
{
:ignore_processing_errors => false,
:ignore_integrity_errors => false
}
end
def model
@m ||= Object.new
class << @m
def id; 1; end;
end
@m
end
end
RSpec.describe "CarrierWave Integration" do
it "should process images with MiniMagick without raising CarrierWave::ProcessingError" do
uploader = TestUploader.new
| <|file_sep|>original/spec/integration/carrier_wave_spec.rb
<|file_sep|>current/spec/integration/carrier_wave_spec.rb
<|file_sep|>updated/spec/integration/carrier_wave_spec.rb
class TestUploader < CoverPhotoUploader
def uploader_options
{
:ignore_processing_errors => false,
:ignore_integrity_errors => false
}
end
def model
@m ||= Object.new
class << @m
def id; 1; end;
end
@m
end
end
RSpec.describe "CarrierWave Integration" do
it "should process images with MiniMagick without raising CarrierWave::ProcessingError" do
uploader = TestUploader.new
| 02d386063329e52bf4fd5a3450e5eeab486bf344 | spec/integration/carrier_wave_spec.rb | spec/integration/carrier_wave_spec.rb | Ruby |
<|file_sep|>original/utils/checks.py
from discord.ext import commands
def is_owner_or(**perms):
async def predicate(ctx):
owner = ctx.bot.is_owner(ctx.author)
permissions = ctx.channel.permissions_for(ctx.author)
return all(getattr(permissions, perm, None) == value
for perm, value in perms.items()) or owner
return commands.check(predicate)
<|file_sep|>current/utils/checks.py
from discord.ext import commands
def is_owner_or(**perms):
async def predicate(ctx):
owner = ctx.bot.is_owner(ctx.author)
permissions = ctx.channel.permissions_for(ctx.author)
return all(getattr(permissions, perm, None) == value
for perm, value in perms.items()) or owner
return commands.check(predicate)
<|file_sep|>updated/utils/checks.py | from discord.ext import commands
def is_owner_or(**perms):
async def predicate(ctx):
owner = await ctx.bot.is_owner(ctx.author)
permissions = ctx.channel.permissions_for(ctx.author)
return all(getattr(permissions, perm, None) == value
for perm, value in perms.items()) or owner
return commands.check(predicate) | <|file_sep|>original/utils/checks.py
from discord.ext import commands
def is_owner_or(**perms):
async def predicate(ctx):
owner = ctx.bot.is_owner(ctx.author)
permissions = ctx.channel.permissions_for(ctx.author)
return all(getattr(permissions, perm, None) == value
for perm, value in perms.items()) or owner
return commands.check(predicate)
<|file_sep|>current/utils/checks.py
from discord.ext import commands
def is_owner_or(**perms):
async def predicate(ctx):
owner = ctx.bot.is_owner(ctx.author)
permissions = ctx.channel.permissions_for(ctx.author)
return all(getattr(permissions, perm, None) == value
for perm, value in perms.items()) or owner
return commands.check(predicate)
<|file_sep|>updated/utils/checks.py
from discord.ext import commands
def is_owner_or(**perms):
async def predicate(ctx):
owner = await ctx.bot.is_owner(ctx.author)
permissions = ctx.channel.permissions_for(ctx.author)
return all(getattr(permissions, perm, None) == value
for perm, value in perms.items()) or owner
return commands.check(predicate) | b457ab3abc0dbcfd94b26f7107144af67b8a85f4 | utils/checks.py | utils/checks.py | Python |
<|file_sep|>original/app/models/widgets/ticker_widget.js
Dashboard.TickerWidget = Dashboard.Widget.extend({
sourceData: "",
templateName: 'ticker_widget',
classNames: ['widget', 'widget-ticker'],
widgetView: function() {
var widget = this;
return this._super().reopen({
didInsertElement: function() {
var scaleFactor = 0.7;
var widgetHeight = this.$().height();
var fontSize = widgetHeight * scaleFactor;
var widgetUnitWidth = (DashboardConfig.grid.width - DashboardConfig.widgetMargins) /
DashboardConfig.dim[0] - DashboardConfig.widgetMargins;
var widgetWidth = widgetUnitWidth * widget.get('sizex') +
DashboardConfig.widgetMargins * (widget.get('sizex') - 1) - 5;
<|file_sep|>current/app/models/widgets/ticker_widget.js
Dashboard.TickerWidget = Dashboard.Widget.extend({
sourceData: "",
templateName: 'ticker_widget',
classNames: ['widget', 'widget-ticker'],
widgetView: function() {
var widget = this;
return this._super().reopen({
didInsertElement: function() {
var scaleFactor = 0.7;
var widgetHeight = this.$().height();
var fontSize = widgetHeight * scaleFactor;
var widgetUnitWidth = (DashboardConfig.grid.width - DashboardConfig.widgetMargins) /
DashboardConfig.dim[0] - DashboardConfig.widgetMargins;
var widgetWidth = widgetUnitWidth * widget.get('sizex') +
DashboardConfig.widgetMargins * (widget.get('sizex') - 1) - 5;
<|file_sep|>updated/app/models/widgets/ticker_widget.js | /**
* Shows arbitrary text in a news ticker-like fashion.
*
* The widget is designed to have a very small height in comparison to its width. This can be
* achieved by increasing the granularity of the widget sizes in the config, e.g. by setting the
* `dim[1]` of the grid to a high value and scaling accordingly the heights of the existing
* widgets.
*
* Expects data from sources as simple strings.
*/
Dashboard.TickerWidget = Dashboard.Widget.extend({
sourceData: "",
templateName: 'ticker_widget',
classNames: ['widget', 'widget-ticker'],
widgetView: function() {
var widget = this;
return this._super().reopen({
didInsertElement: function() { | <|file_sep|>original/app/models/widgets/ticker_widget.js
Dashboard.TickerWidget = Dashboard.Widget.extend({
sourceData: "",
templateName: 'ticker_widget',
classNames: ['widget', 'widget-ticker'],
widgetView: function() {
var widget = this;
return this._super().reopen({
didInsertElement: function() {
var scaleFactor = 0.7;
var widgetHeight = this.$().height();
var fontSize = widgetHeight * scaleFactor;
var widgetUnitWidth = (DashboardConfig.grid.width - DashboardConfig.widgetMargins) /
DashboardConfig.dim[0] - DashboardConfig.widgetMargins;
var widgetWidth = widgetUnitWidth * widget.get('sizex') +
DashboardConfig.widgetMargins * (widget.get('sizex') - 1) - 5;
<|file_sep|>current/app/models/widgets/ticker_widget.js
Dashboard.TickerWidget = Dashboard.Widget.extend({
sourceData: "",
templateName: 'ticker_widget',
classNames: ['widget', 'widget-ticker'],
widgetView: function() {
var widget = this;
return this._super().reopen({
didInsertElement: function() {
var scaleFactor = 0.7;
var widgetHeight = this.$().height();
var fontSize = widgetHeight * scaleFactor;
var widgetUnitWidth = (DashboardConfig.grid.width - DashboardConfig.widgetMargins) /
DashboardConfig.dim[0] - DashboardConfig.widgetMargins;
var widgetWidth = widgetUnitWidth * widget.get('sizex') +
DashboardConfig.widgetMargins * (widget.get('sizex') - 1) - 5;
<|file_sep|>updated/app/models/widgets/ticker_widget.js
/**
* Shows arbitrary text in a news ticker-like fashion.
*
* The widget is designed to have a very small height in comparison to its width. This can be
* achieved by increasing the granularity of the widget sizes in the config, e.g. by setting the
* `dim[1]` of the grid to a high value and scaling accordingly the heights of the existing
* widgets.
*
* Expects data from sources as simple strings.
*/
Dashboard.TickerWidget = Dashboard.Widget.extend({
sourceData: "",
templateName: 'ticker_widget',
classNames: ['widget', 'widget-ticker'],
widgetView: function() {
var widget = this;
return this._super().reopen({
didInsertElement: function() { | 647ef6e18230a1012785d44ccf04680dea26b64d | app/models/widgets/ticker_widget.js | app/models/widgets/ticker_widget.js | JavaScript |
<|file_sep|>week-05.md.diff
original:
updated:
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
<|file_sep|>original/week-05.md
- title: "Gridifier cheat sheet"
url: gridifier-cheat-sheet
- title: "Typografier cheat sheet"
url: typografier-cheat-sheet
group_activities:
- title: "Sketching websites"
type: pencil
pair: true
tasks:
- title: "Type Trasher"
type: activity
- type: blank
- title: "Refined grid-ception"
url: "https://github.com/acgd-webdev-3/refined-grid-ception"
- title: "Lists ’n’ grids"
url: "https://github.com/acgd-webdev-3/lists-n-grids"
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
---
<|file_sep|>current/week-05.md
- title: "Typografier cheat sheet"
url: typografier-cheat-sheet
group_activities:
- title: "Sketching websites"
type: pencil
pair: true
tasks:
- title: "Type Trasher"
type: activity
- type: blank
- title: "Refined grid-ception"
url: "https://github.com/acgd-webdev-3/refined-grid-ception"
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
- title: "Lists ’n’ grids"
url: "https://github.com/acgd-webdev-3/lists-n-grids"
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
---
<|file_sep|>updated/week-05.md | - title: "Gridifier cheat sheet"
url: gridifier-cheat-sheet
- title: "Typografier cheat sheet"
url: typografier-cheat-sheet
group_activities:
- title: "Sketching websites"
type: pencil
pair: true
tasks:
- title: "Type Trasher"
type: activity
- type: blank
- title: "Refined grid-ception"
url: "https://github.com/acgd-webdev-3/refined-grid-ception"
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
- title: "Lists ’n’ grids"
url: "https://github.com/acgd-webdev-3/lists-n-grids"
--- | <|file_sep|>week-05.md.diff
original:
updated:
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
<|file_sep|>original/week-05.md
- title: "Gridifier cheat sheet"
url: gridifier-cheat-sheet
- title: "Typografier cheat sheet"
url: typografier-cheat-sheet
group_activities:
- title: "Sketching websites"
type: pencil
pair: true
tasks:
- title: "Type Trasher"
type: activity
- type: blank
- title: "Refined grid-ception"
url: "https://github.com/acgd-webdev-3/refined-grid-ception"
- title: "Lists ’n’ grids"
url: "https://github.com/acgd-webdev-3/lists-n-grids"
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
---
<|file_sep|>current/week-05.md
- title: "Typografier cheat sheet"
url: typografier-cheat-sheet
group_activities:
- title: "Sketching websites"
type: pencil
pair: true
tasks:
- title: "Type Trasher"
type: activity
- type: blank
- title: "Refined grid-ception"
url: "https://github.com/acgd-webdev-3/refined-grid-ception"
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
- title: "Lists ’n’ grids"
url: "https://github.com/acgd-webdev-3/lists-n-grids"
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
---
<|file_sep|>updated/week-05.md
- title: "Gridifier cheat sheet"
url: gridifier-cheat-sheet
- title: "Typografier cheat sheet"
url: typografier-cheat-sheet
group_activities:
- title: "Sketching websites"
type: pencil
pair: true
tasks:
- title: "Type Trasher"
type: activity
- type: blank
- title: "Refined grid-ception"
url: "https://github.com/acgd-webdev-3/refined-grid-ception"
- title: "Modular nav"
url: "https://github.com/acgd-webdev-3/modular-nav"
- title: "Lists ’n’ grids"
url: "https://github.com/acgd-webdev-3/lists-n-grids"
--- | 7d6f49b7fe8c8dde690ea58532c8366184d22cbb | week-05.md | week-05.md | Markdown |
<|file_sep|>original/src/views/withNavigation.js
/* @flow */
import React from 'react';
import hoistStatics from 'hoist-non-react-statics';
import { ContextWithNavigation } from '../TypeDefinition';
export default function withNavigation(Component: ReactClass<T>) {
const componentWithNavigation = (props: T, { navigation }: ContextWithNavigation) => (
<Component {...props} navigation={navigation} />
);
componentWithNavigation.displayName = `withNavigation(${Component.displayName || Component.name})`;
componentWithNavigation.contextTypes = {
navigation: React.PropTypes.object.isRequired,
};
return hoistStatics(componentWithNavigation, Component);
}
<|file_sep|>current/src/views/withNavigation.js
/* @flow */
import React from 'react';
import hoistStatics from 'hoist-non-react-statics';
import { ContextWithNavigation } from '../TypeDefinition';
export default function withNavigation(Component: ReactClass<T>) {
const componentWithNavigation = (props: T, { navigation }: ContextWithNavigation) => (
<Component {...props} navigation={navigation} />
);
componentWithNavigation.displayName = `withNavigation(${Component.displayName || Component.name})`;
componentWithNavigation.contextTypes = {
navigation: React.PropTypes.object.isRequired,
};
return hoistStatics(componentWithNavigation, Component);
}
<|file_sep|>updated/src/views/withNavigation.js | /* @flow */
import React from 'react';
import hoistStatics from 'hoist-non-react-statics';
import type { ContextWithNavigation } from '../TypeDefinition';
export default function withNavigation(Component: ReactClass<T>) {
const componentWithNavigation = (props: T, { navigation }: ContextWithNavigation) => (
<Component {...props} navigation={navigation} />
);
componentWithNavigation.displayName = `withNavigation(${Component.displayName || Component.name})`;
componentWithNavigation.contextTypes = {
navigation: React.PropTypes.object.isRequired,
};
return hoistStatics(componentWithNavigation, Component);
} | <|file_sep|>original/src/views/withNavigation.js
/* @flow */
import React from 'react';
import hoistStatics from 'hoist-non-react-statics';
import { ContextWithNavigation } from '../TypeDefinition';
export default function withNavigation(Component: ReactClass<T>) {
const componentWithNavigation = (props: T, { navigation }: ContextWithNavigation) => (
<Component {...props} navigation={navigation} />
);
componentWithNavigation.displayName = `withNavigation(${Component.displayName || Component.name})`;
componentWithNavigation.contextTypes = {
navigation: React.PropTypes.object.isRequired,
};
return hoistStatics(componentWithNavigation, Component);
}
<|file_sep|>current/src/views/withNavigation.js
/* @flow */
import React from 'react';
import hoistStatics from 'hoist-non-react-statics';
import { ContextWithNavigation } from '../TypeDefinition';
export default function withNavigation(Component: ReactClass<T>) {
const componentWithNavigation = (props: T, { navigation }: ContextWithNavigation) => (
<Component {...props} navigation={navigation} />
);
componentWithNavigation.displayName = `withNavigation(${Component.displayName || Component.name})`;
componentWithNavigation.contextTypes = {
navigation: React.PropTypes.object.isRequired,
};
return hoistStatics(componentWithNavigation, Component);
}
<|file_sep|>updated/src/views/withNavigation.js
/* @flow */
import React from 'react';
import hoistStatics from 'hoist-non-react-statics';
import type { ContextWithNavigation } from '../TypeDefinition';
export default function withNavigation(Component: ReactClass<T>) {
const componentWithNavigation = (props: T, { navigation }: ContextWithNavigation) => (
<Component {...props} navigation={navigation} />
);
componentWithNavigation.displayName = `withNavigation(${Component.displayName || Component.name})`;
componentWithNavigation.contextTypes = {
navigation: React.PropTypes.object.isRequired,
};
return hoistStatics(componentWithNavigation, Component);
} | a7002f215111cfe3adff7e6fd8d327376cb1ac88 | src/views/withNavigation.js | src/views/withNavigation.js | JavaScript |
<|file_sep|>src/Element/Input/Url.php.diff
original:
updated:
use Brick\Validation\Validator\UrlValidator;
<|file_sep|>original/src/Element/Input/Url.php
use PlaceholderAttribute;
use ReadOnlyAttribute;
use RequiredAttribute;
use SizeAttribute;
use ValueAttribute;
/**
* {@inheritdoc}
*/
protected function getType()
{
return 'url';
}
/**
* {@inheritdoc}
*/
protected function doPopulate($value)
{
$this->setValue($value);
}
<|file_sep|>current/src/Element/Input/Url.php
use PatternAttribute;
use PlaceholderAttribute;
use ReadOnlyAttribute;
use RequiredAttribute;
use SizeAttribute;
use ValueAttribute;
/**
* {@inheritdoc}
*/
protected function getType()
{
return 'url';
}
/**
* {@inheritdoc}
*/
protected function doPopulate($value)
{
$this->setValue($value);
<|file_sep|>updated/src/Element/Input/Url.php | use PatternAttribute;
use PlaceholderAttribute;
use ReadOnlyAttribute;
use RequiredAttribute;
use SizeAttribute;
use ValueAttribute;
/**
* {@inheritdoc}
*/
protected function init()
{
$this->addValidator(new UrlValidator());
}
/**
* {@inheritdoc}
*/
protected function getType()
{
return 'url'; | <|file_sep|>src/Element/Input/Url.php.diff
original:
updated:
use Brick\Validation\Validator\UrlValidator;
<|file_sep|>original/src/Element/Input/Url.php
use PlaceholderAttribute;
use ReadOnlyAttribute;
use RequiredAttribute;
use SizeAttribute;
use ValueAttribute;
/**
* {@inheritdoc}
*/
protected function getType()
{
return 'url';
}
/**
* {@inheritdoc}
*/
protected function doPopulate($value)
{
$this->setValue($value);
}
<|file_sep|>current/src/Element/Input/Url.php
use PatternAttribute;
use PlaceholderAttribute;
use ReadOnlyAttribute;
use RequiredAttribute;
use SizeAttribute;
use ValueAttribute;
/**
* {@inheritdoc}
*/
protected function getType()
{
return 'url';
}
/**
* {@inheritdoc}
*/
protected function doPopulate($value)
{
$this->setValue($value);
<|file_sep|>updated/src/Element/Input/Url.php
use PatternAttribute;
use PlaceholderAttribute;
use ReadOnlyAttribute;
use RequiredAttribute;
use SizeAttribute;
use ValueAttribute;
/**
* {@inheritdoc}
*/
protected function init()
{
$this->addValidator(new UrlValidator());
}
/**
* {@inheritdoc}
*/
protected function getType()
{
return 'url'; | 77daf66b7db5cbf8e3fc9c4c5ba3d8a9075a6705 | src/Element/Input/Url.php | src/Element/Input/Url.php | PHP |
<|file_sep|>original/README.md
# dash.vim
This Vim plugin will search for terms using the excellent [Dash][Dash]
app, making API lookups simple.
It provides a new :Dash command and (recommended) mappings.
Read the (TODO :) [help][vim-doc] to know more.
## Installation
If you don't have a preferred installation method, I recommend
installing [pathogen.vim][pathogen], and then simply copy and paste:
```bash
cd ~/.vim/bundle
git clone https://github.com/zehrizzatti/dash.vim.git
```
## License
<|file_sep|>current/README.md
# dash.vim
This Vim plugin will search for terms using the excellent [Dash][Dash]
app, making API lookups simple.
It provides a new :Dash command and (recommended) mappings.
Read the (TODO :) [help][vim-doc] to know more.
## Installation
If you don't have a preferred installation method, I recommend
installing [pathogen.vim][pathogen], and then simply copy and paste:
```bash
cd ~/.vim/bundle
git clone https://github.com/zehrizzatti/dash.vim.git
```
## License
<|file_sep|>updated/README.md | # dash.vim
This Vim plugin will search for terms using the excellent [Dash][Dash]
app, making API lookups simple.
It provides a new :Dash command and (recommended) mappings.
Read the (TODO :) [help][vim-doc] to know more.
## Commands
This plugin has one command: `Dash`.
```
:Dash String
```
Will search for 'String' in Dash, considering the current filetype in Vim.
```
:Dash! func | <|file_sep|>original/README.md
# dash.vim
This Vim plugin will search for terms using the excellent [Dash][Dash]
app, making API lookups simple.
It provides a new :Dash command and (recommended) mappings.
Read the (TODO :) [help][vim-doc] to know more.
## Installation
If you don't have a preferred installation method, I recommend
installing [pathogen.vim][pathogen], and then simply copy and paste:
```bash
cd ~/.vim/bundle
git clone https://github.com/zehrizzatti/dash.vim.git
```
## License
<|file_sep|>current/README.md
# dash.vim
This Vim plugin will search for terms using the excellent [Dash][Dash]
app, making API lookups simple.
It provides a new :Dash command and (recommended) mappings.
Read the (TODO :) [help][vim-doc] to know more.
## Installation
If you don't have a preferred installation method, I recommend
installing [pathogen.vim][pathogen], and then simply copy and paste:
```bash
cd ~/.vim/bundle
git clone https://github.com/zehrizzatti/dash.vim.git
```
## License
<|file_sep|>updated/README.md
# dash.vim
This Vim plugin will search for terms using the excellent [Dash][Dash]
app, making API lookups simple.
It provides a new :Dash command and (recommended) mappings.
Read the (TODO :) [help][vim-doc] to know more.
## Commands
This plugin has one command: `Dash`.
```
:Dash String
```
Will search for 'String' in Dash, considering the current filetype in Vim.
```
:Dash! func | 8efe9c3471c696532af67e48119658c78262383c | README.md | README.md | Markdown |
<|file_sep|>original/lib/style/style.rb
end
def self.formatting(fg_color = nil, bg_color = nil)
formatting_changed = false
formatting = "#["
if fg_color && !fg_color.empty?
formatting << "fg="+fg_color
formatting_changed = true
end
if bg_color && !bg_color.empty?
formatting << "," if formatting_changed
formatting << "bg="+bg_color
end
formatting << "]"
formatting_changed ? formatting : ""
end
end
<|file_sep|>current/lib/style/style.rb
end
def self.formatting(fg_color = nil, bg_color = nil)
formatting_changed = false
formatting = "#["
if fg_color && !fg_color.empty?
formatting << "fg="+fg_color
formatting_changed = true
end
if bg_color && !bg_color.empty?
formatting << "," if formatting_changed
formatting << "bg="+bg_color
end
formatting << "]"
formatting_changed ? formatting : ""
end
end
<|file_sep|>updated/lib/style/style.rb |
def self.formatting(fg_color = nil, bg_color = nil)
formatting_changed = false
formatting = "#["
if fg_color && !fg_color.empty?
formatting << "fg="+fg_color
formatting_changed = true
end
if bg_color && !bg_color.empty?
formatting << "," if formatting_changed
formatting << "bg="+bg_color
formatting_changed = true
end
formatting << "]"
formatting_changed ? formatting : ""
end
end | <|file_sep|>original/lib/style/style.rb
end
def self.formatting(fg_color = nil, bg_color = nil)
formatting_changed = false
formatting = "#["
if fg_color && !fg_color.empty?
formatting << "fg="+fg_color
formatting_changed = true
end
if bg_color && !bg_color.empty?
formatting << "," if formatting_changed
formatting << "bg="+bg_color
end
formatting << "]"
formatting_changed ? formatting : ""
end
end
<|file_sep|>current/lib/style/style.rb
end
def self.formatting(fg_color = nil, bg_color = nil)
formatting_changed = false
formatting = "#["
if fg_color && !fg_color.empty?
formatting << "fg="+fg_color
formatting_changed = true
end
if bg_color && !bg_color.empty?
formatting << "," if formatting_changed
formatting << "bg="+bg_color
end
formatting << "]"
formatting_changed ? formatting : ""
end
end
<|file_sep|>updated/lib/style/style.rb
def self.formatting(fg_color = nil, bg_color = nil)
formatting_changed = false
formatting = "#["
if fg_color && !fg_color.empty?
formatting << "fg="+fg_color
formatting_changed = true
end
if bg_color && !bg_color.empty?
formatting << "," if formatting_changed
formatting << "bg="+bg_color
formatting_changed = true
end
formatting << "]"
formatting_changed ? formatting : ""
end
end | cff5df77fc8ef0f77e04534ed5f050c7c7b8d97a | lib/style/style.rb | lib/style/style.rb | Ruby |
<|file_sep|>Library/Formula/subversion.rb.diff
original:
@url='http://subversion.tigris.org/downloads/subversion-deps-1.6.3.tar.bz2'
@md5='22d3687ae93648fcecf945c045931272'
updated:
@url='http://subversion.tigris.org/downloads/subversion-deps-1.6.5.tar.bz2'
@md5='8272316e1670d4d2bea451411e438bde'
<|file_sep|>Library/Formula/subversion.rb.diff
original:
@url='http://subversion.tigris.org/downloads/subversion-1.6.3.tar.bz2'
updated:
@url='http://subversion.tigris.org/downloads/subversion-1.6.5.tar.bz2'
<|file_sep|>Library/Formula/subversion.rb.diff
original:
@md5='8bf7637ac99368db0890e3f085fa690d'
updated:
@md5='1a53a0e72bee0bf814f4da83a9b6a636'
<|file_sep|>Library/Formula/subversion.rb.diff
original:
# Use existing system zlib
# Use dep-provided other libraries
updated:
# Use existing system zlib, dep-provided other libraries
<|file_sep|>Library/Formula/subversion.rb.diff
original:
updated:
"--with-ssl",
<|file_sep|>original/Library/Formula/subversion.rb
@md5='8bf7637ac99368db0890e3f085fa690d'
def install
# Slot dependencies into place
d=Pathname.getwd
SubversionDeps.new.brew do
d.install Dir['*']
end
# Use existing system zlib
# Use dep-provided other libraries
# Don't mess with Apache modules (since we're not sudo)
system "./configure", "--disable-debug",
"--prefix=#{prefix}",
"--with-zlib=/usr/lib",
"--disable-mod-activation",
"--without-apache-libexecdir"
system "make"
system "make install"
end
end
<|file_sep|>current/Library/Formula/subversion.rb
@md5='1a53a0e72bee0bf814f4da83a9b6a636'
def install
# Slot dependencies into place
d=Pathname.getwd
SubversionDeps.new.brew do
d.install Dir['*']
end
# Use existing system zlib, dep-provided other libraries
# Don't mess with Apache modules (since we're not sudo)
system "./configure", "--disable-debug",
"--prefix=#{prefix}",
"--with-ssl",
"--with-zlib=/usr/lib",
"--disable-mod-activation",
"--without-apache-libexecdir"
system "make"
system "make install"
end
end
<|file_sep|>updated/Library/Formula/subversion.rb |
def install
# Slot dependencies into place
d=Pathname.getwd
SubversionDeps.new.brew do
d.install Dir['*']
end
# Use existing system zlib, dep-provided other libraries
# Don't mess with Apache modules (since we're not sudo)
system "./configure", "--disable-debug",
"--prefix=#{prefix}",
"--with-ssl",
"--with-zlib=/usr/lib",
"--disable-mod-activation",
"--without-apache-libexecdir",
"--without-berkeley-db"
system "make"
system "make install"
end
end | <|file_sep|>Library/Formula/subversion.rb.diff
original:
@url='http://subversion.tigris.org/downloads/subversion-deps-1.6.3.tar.bz2'
@md5='22d3687ae93648fcecf945c045931272'
updated:
@url='http://subversion.tigris.org/downloads/subversion-deps-1.6.5.tar.bz2'
@md5='8272316e1670d4d2bea451411e438bde'
<|file_sep|>Library/Formula/subversion.rb.diff
original:
@url='http://subversion.tigris.org/downloads/subversion-1.6.3.tar.bz2'
updated:
@url='http://subversion.tigris.org/downloads/subversion-1.6.5.tar.bz2'
<|file_sep|>Library/Formula/subversion.rb.diff
original:
@md5='8bf7637ac99368db0890e3f085fa690d'
updated:
@md5='1a53a0e72bee0bf814f4da83a9b6a636'
<|file_sep|>Library/Formula/subversion.rb.diff
original:
# Use existing system zlib
# Use dep-provided other libraries
updated:
# Use existing system zlib, dep-provided other libraries
<|file_sep|>Library/Formula/subversion.rb.diff
original:
updated:
"--with-ssl",
<|file_sep|>original/Library/Formula/subversion.rb
@md5='8bf7637ac99368db0890e3f085fa690d'
def install
# Slot dependencies into place
d=Pathname.getwd
SubversionDeps.new.brew do
d.install Dir['*']
end
# Use existing system zlib
# Use dep-provided other libraries
# Don't mess with Apache modules (since we're not sudo)
system "./configure", "--disable-debug",
"--prefix=#{prefix}",
"--with-zlib=/usr/lib",
"--disable-mod-activation",
"--without-apache-libexecdir"
system "make"
system "make install"
end
end
<|file_sep|>current/Library/Formula/subversion.rb
@md5='1a53a0e72bee0bf814f4da83a9b6a636'
def install
# Slot dependencies into place
d=Pathname.getwd
SubversionDeps.new.brew do
d.install Dir['*']
end
# Use existing system zlib, dep-provided other libraries
# Don't mess with Apache modules (since we're not sudo)
system "./configure", "--disable-debug",
"--prefix=#{prefix}",
"--with-ssl",
"--with-zlib=/usr/lib",
"--disable-mod-activation",
"--without-apache-libexecdir"
system "make"
system "make install"
end
end
<|file_sep|>updated/Library/Formula/subversion.rb
def install
# Slot dependencies into place
d=Pathname.getwd
SubversionDeps.new.brew do
d.install Dir['*']
end
# Use existing system zlib, dep-provided other libraries
# Don't mess with Apache modules (since we're not sudo)
system "./configure", "--disable-debug",
"--prefix=#{prefix}",
"--with-ssl",
"--with-zlib=/usr/lib",
"--disable-mod-activation",
"--without-apache-libexecdir",
"--without-berkeley-db"
system "make"
system "make install"
end
end | 5607a70e6c7f0247a7fbcce5ce3a457973c792cb | Library/Formula/subversion.rb | Library/Formula/subversion.rb | Ruby |
<|file_sep|>original/locales/es-AR/addon.properties
<|file_sep|>current/locales/es-AR/addon.properties
<|file_sep|>updated/locales/es-AR/addon.properties | experiment_list_enabled = Habilitado
experiment_list_new_experiment = Experimento nuevo
experiment_list_view_all = Ver todos los experimentos
experiment_eol_tomorrow_message = Termina mañana
experiment_eol_soon_message = Terminará en breve
experiment_eol_complete_message = Experimento completo
installed_message = <strong> Para que lo sepas:</strong> Pusimos un botón en tu barra de herramientas <br> para que siempre sepas dónde está Test Pilot.
# LOCALIZER NOTE: Placeholder is experiment title
# LOCALIZER NOTE: Placeholder is experiment title
survey_rating_survey_button = Respondé una breve encuesta
# LOCALIZER NOTE: Placeholder is experiment title
# LOCALIZER NOTE: Placeholder is site host (e.g. testpilot.firefox.com)
| <|file_sep|>original/locales/es-AR/addon.properties
<|file_sep|>current/locales/es-AR/addon.properties
<|file_sep|>updated/locales/es-AR/addon.properties
experiment_list_enabled = Habilitado
experiment_list_new_experiment = Experimento nuevo
experiment_list_view_all = Ver todos los experimentos
experiment_eol_tomorrow_message = Termina mañana
experiment_eol_soon_message = Terminará en breve
experiment_eol_complete_message = Experimento completo
installed_message = <strong> Para que lo sepas:</strong> Pusimos un botón en tu barra de herramientas <br> para que siempre sepas dónde está Test Pilot.
# LOCALIZER NOTE: Placeholder is experiment title
# LOCALIZER NOTE: Placeholder is experiment title
survey_rating_survey_button = Respondé una breve encuesta
# LOCALIZER NOTE: Placeholder is experiment title
# LOCALIZER NOTE: Placeholder is site host (e.g. testpilot.firefox.com)
| 57af09d03b560141270d73510763bee04af817f9 | locales/es-AR/addon.properties | locales/es-AR/addon.properties | INI |
<|file_sep|>original/README.md
# fs-promise
[](http://travis-ci.org/kevinbeaty/fs-promise)
Proxies all async `fs` methods exposing them as Promises/A+ compatible promises (when, Q, etc).
Passes all sync methods through as values.
Also exposes to `graceful-fs` and/or `fs-extra` methods if they are installed.
<|file_sep|>current/README.md
# fs-promise
[](http://travis-ci.org/kevinbeaty/fs-promise)
Proxies all async `fs` methods exposing them as Promises/A+ compatible promises (when, Q, etc).
Passes all sync methods through as values.
Also exposes to `graceful-fs` and/or `fs-extra` methods if they are installed.
<|file_sep|>updated/README.md | # fs-promise
[](http://travis-ci.org/kevinbeaty/fs-promise)
Proxies all async `fs` methods exposing them as Promises/A+ compatible promises (when, Q, etc).
Passes all sync methods through as values.
Also exposes to `graceful-fs` and/or `fs-extra` methods if they are installed. | <|file_sep|>original/README.md
# fs-promise
[](http://travis-ci.org/kevinbeaty/fs-promise)
Proxies all async `fs` methods exposing them as Promises/A+ compatible promises (when, Q, etc).
Passes all sync methods through as values.
Also exposes to `graceful-fs` and/or `fs-extra` methods if they are installed.
<|file_sep|>current/README.md
# fs-promise
[](http://travis-ci.org/kevinbeaty/fs-promise)
Proxies all async `fs` methods exposing them as Promises/A+ compatible promises (when, Q, etc).
Passes all sync methods through as values.
Also exposes to `graceful-fs` and/or `fs-extra` methods if they are installed.
<|file_sep|>updated/README.md
# fs-promise
[](http://travis-ci.org/kevinbeaty/fs-promise)
Proxies all async `fs` methods exposing them as Promises/A+ compatible promises (when, Q, etc).
Passes all sync methods through as values.
Also exposes to `graceful-fs` and/or `fs-extra` methods if they are installed. | 0c04097b38c5f232159426527fb48fa284f53312 | README.md | README.md | Markdown |
<|file_sep|>test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java.diff
original:
import java.util.Map;
updated:
import java.util.Properties;
<|file_sep|>test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java.diff
original:
private static Map<String, String> env = System.getenv();
updated:
private static Properties env = System.getProperties();
<|file_sep|>test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java.diff
original:
port = Integer.parseInt(env.get("BEANSTALKD_PORT"));
updated:
port = Integer.parseInt((String)env.get("BEANSTALKD_PORT"));
<|file_sep|>original/test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java
port = Integer.parseInt(env.get("BEANSTALKD_PORT"));
return port;
}
private static String getHost() throws Exception
{
if (host != null) {
return host;
}
if (!env.containsKey("BEANSTALKD_HOST")) {
throw new Exception("BEANSTALKD_HOST not set");
}
host = env.get("BEANSTALKD_HOST");
return host;
}
}
<|file_sep|>current/test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java
port = Integer.parseInt((String)env.get("BEANSTALKD_PORT"));
return port;
}
private static String getHost() throws Exception
{
if (host != null) {
return host;
}
if (!env.containsKey("BEANSTALKD_HOST")) {
throw new Exception("BEANSTALKD_HOST not set");
}
host = env.get("BEANSTALKD_HOST");
return host;
}
}
<|file_sep|>updated/test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java |
port = Integer.parseInt((String)env.get("BEANSTALKD_PORT"));
return port;
}
private static String getHost() throws Exception
{
if (host != null) {
return host;
}
if (!env.containsKey("BEANSTALKD_HOST")) {
throw new Exception("BEANSTALKD_HOST not set");
}
host = (String)env.get("BEANSTALKD_HOST");
return host;
}
} | <|file_sep|>test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java.diff
original:
import java.util.Map;
updated:
import java.util.Properties;
<|file_sep|>test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java.diff
original:
private static Map<String, String> env = System.getenv();
updated:
private static Properties env = System.getProperties();
<|file_sep|>test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java.diff
original:
port = Integer.parseInt(env.get("BEANSTALKD_PORT"));
updated:
port = Integer.parseInt((String)env.get("BEANSTALKD_PORT"));
<|file_sep|>original/test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java
port = Integer.parseInt(env.get("BEANSTALKD_PORT"));
return port;
}
private static String getHost() throws Exception
{
if (host != null) {
return host;
}
if (!env.containsKey("BEANSTALKD_HOST")) {
throw new Exception("BEANSTALKD_HOST not set");
}
host = env.get("BEANSTALKD_HOST");
return host;
}
}
<|file_sep|>current/test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java
port = Integer.parseInt((String)env.get("BEANSTALKD_PORT"));
return port;
}
private static String getHost() throws Exception
{
if (host != null) {
return host;
}
if (!env.containsKey("BEANSTALKD_HOST")) {
throw new Exception("BEANSTALKD_HOST not set");
}
host = env.get("BEANSTALKD_HOST");
return host;
}
}
<|file_sep|>updated/test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java
port = Integer.parseInt((String)env.get("BEANSTALKD_PORT"));
return port;
}
private static String getHost() throws Exception
{
if (host != null) {
return host;
}
if (!env.containsKey("BEANSTALKD_HOST")) {
throw new Exception("BEANSTALKD_HOST not set");
}
host = (String)env.get("BEANSTALKD_HOST");
return host;
}
} | 407e84dc22bdb05e0dda1b8121582906ef4c1313 | test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java | test/integration/org/chrisguitarguy/beanstalkc/ConnectionHelper.java | Java |
<|file_sep|>original/src/pages/index.css
ol {
@apply list-decimal;
}
ul,
ol {
@apply mb-4 ml-8;
}
li {
@apply mb-4;
}
/* Code */
pre {
@apply mb-4;
}
@import 'tailwindcss/components';
@import 'tailwindcss/utilities';
<|file_sep|>current/src/pages/index.css
ol {
@apply list-decimal;
}
ul,
ol {
@apply mb-4 ml-8;
}
li {
@apply mb-4;
}
/* Code */
pre {
@apply mb-4;
}
@import 'tailwindcss/components';
@import 'tailwindcss/utilities';
<|file_sep|>updated/src/pages/index.css | ul,
ol {
@apply mb-4 ml-8;
}
li {
@apply mb-4;
}
/* Code */
pre {
@apply mb-4;
}
.custom-card em {
@apply bg-blue-200 p-1 not-italic rounded;
}
@import 'tailwindcss/components';
@import 'tailwindcss/utilities'; | <|file_sep|>original/src/pages/index.css
ol {
@apply list-decimal;
}
ul,
ol {
@apply mb-4 ml-8;
}
li {
@apply mb-4;
}
/* Code */
pre {
@apply mb-4;
}
@import 'tailwindcss/components';
@import 'tailwindcss/utilities';
<|file_sep|>current/src/pages/index.css
ol {
@apply list-decimal;
}
ul,
ol {
@apply mb-4 ml-8;
}
li {
@apply mb-4;
}
/* Code */
pre {
@apply mb-4;
}
@import 'tailwindcss/components';
@import 'tailwindcss/utilities';
<|file_sep|>updated/src/pages/index.css
ul,
ol {
@apply mb-4 ml-8;
}
li {
@apply mb-4;
}
/* Code */
pre {
@apply mb-4;
}
.custom-card em {
@apply bg-blue-200 p-1 not-italic rounded;
}
@import 'tailwindcss/components';
@import 'tailwindcss/utilities'; | cad706e9bd000dd778632b0dd652faf78ef77eb0 | src/pages/index.css | src/pages/index.css | CSS |
<|file_sep|>original/package.json
],
"author": "Dominik Schmidt",
"license": "MIT",
"bin": {
"tomahawkcc": "index.js"
},
"bugs": {
"url": "https://github.com/dschmidt/tomahawkcc/issues"
},
"homepage": "https://github.com/dschmidt/tomahawkcc",
"dependencies": {
"almond": "^0.3.1",
"babel": "^5.8.23",
"babel-concat": "^1.0.3",
"babel-core": "^5.8.25",
"fs-promise": "^0.3.1",
"lodash": "^3.10.1",
"rsvp": "^3.1.0",
"rsvp-party": "github:dschmidt/rsvp-party#fixup"
}
}
<|file_sep|>current/package.json
],
"author": "Dominik Schmidt",
"license": "MIT",
"bin": {
"tomahawkcc": "index.js"
},
"bugs": {
"url": "https://github.com/dschmidt/tomahawkcc/issues"
},
"homepage": "https://github.com/dschmidt/tomahawkcc",
"dependencies": {
"almond": "^0.3.1",
"babel": "^5.8.23",
"babel-concat": "^1.0.3",
"babel-core": "^5.8.25",
"fs-promise": "^0.3.1",
"lodash": "^3.10.1",
"rsvp": "^3.1.0",
"rsvp-party": "github:dschmidt/rsvp-party#fixup"
}
}
<|file_sep|>updated/package.json | ],
"author": "Dominik Schmidt",
"license": "MIT",
"bin": {
"tomahawkcc": "index.js"
},
"bugs": {
"url": "https://github.com/dschmidt/tomahawkcc/issues"
},
"homepage": "https://github.com/dschmidt/tomahawkcc",
"dependencies": {
"almond": "^0.3.1",
"babel": "^5.8.23",
"babel-concat": "^1.0.3",
"babel-core": "^5.8.25",
"fs-promise": "^0.3.1",
"lodash": "^3.10.1",
"rsvp": "^3.1.0",
"rsvp-party": "dschmidt/rsvp-party#fixup"
}
} | <|file_sep|>original/package.json
],
"author": "Dominik Schmidt",
"license": "MIT",
"bin": {
"tomahawkcc": "index.js"
},
"bugs": {
"url": "https://github.com/dschmidt/tomahawkcc/issues"
},
"homepage": "https://github.com/dschmidt/tomahawkcc",
"dependencies": {
"almond": "^0.3.1",
"babel": "^5.8.23",
"babel-concat": "^1.0.3",
"babel-core": "^5.8.25",
"fs-promise": "^0.3.1",
"lodash": "^3.10.1",
"rsvp": "^3.1.0",
"rsvp-party": "github:dschmidt/rsvp-party#fixup"
}
}
<|file_sep|>current/package.json
],
"author": "Dominik Schmidt",
"license": "MIT",
"bin": {
"tomahawkcc": "index.js"
},
"bugs": {
"url": "https://github.com/dschmidt/tomahawkcc/issues"
},
"homepage": "https://github.com/dschmidt/tomahawkcc",
"dependencies": {
"almond": "^0.3.1",
"babel": "^5.8.23",
"babel-concat": "^1.0.3",
"babel-core": "^5.8.25",
"fs-promise": "^0.3.1",
"lodash": "^3.10.1",
"rsvp": "^3.1.0",
"rsvp-party": "github:dschmidt/rsvp-party#fixup"
}
}
<|file_sep|>updated/package.json
],
"author": "Dominik Schmidt",
"license": "MIT",
"bin": {
"tomahawkcc": "index.js"
},
"bugs": {
"url": "https://github.com/dschmidt/tomahawkcc/issues"
},
"homepage": "https://github.com/dschmidt/tomahawkcc",
"dependencies": {
"almond": "^0.3.1",
"babel": "^5.8.23",
"babel-concat": "^1.0.3",
"babel-core": "^5.8.25",
"fs-promise": "^0.3.1",
"lodash": "^3.10.1",
"rsvp": "^3.1.0",
"rsvp-party": "dschmidt/rsvp-party#fixup"
}
} | 4c4899017885e8ba0dd1af397c4707f8aa41b4a5 | package.json | package.json | JSON |
<|file_sep|>original/meta-oe/recipes-gnome/gtk+/gtkmm3_3.24.5.bb
SECTION = "libs"
LICENSE = "LGPL-2.1-only & GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4fbd65380cdd255951079008b364516c \
file://COPYING.tools;md5=751419260aa954499f7abaabaa882bbe"
DEPENDS = "glib-2.0-native atkmm pangomm glibmm gtk+3 cairomm gdk-pixbuf-native"
BPN = "gtkmm"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase features_check
REQUIRED_DISTRO_FEATURES = "x11"
SRC_URI[archive.sha256sum] = "856333de86689f6a81c123f2db15d85db9addc438bc3574c36f15736aeae22e6"
EXTRA_OEMESON = "-Dbuild-demos=false"
FILES:${PN}-dev += "${libdir}/*/include ${libdir}/*/proc/m4"
<|file_sep|>current/meta-oe/recipes-gnome/gtk+/gtkmm3_3.24.5.bb
SECTION = "libs"
LICENSE = "LGPL-2.1-only & GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4fbd65380cdd255951079008b364516c \
file://COPYING.tools;md5=751419260aa954499f7abaabaa882bbe"
DEPENDS = "glib-2.0-native atkmm pangomm glibmm gtk+3 cairomm gdk-pixbuf-native"
BPN = "gtkmm"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase features_check
REQUIRED_DISTRO_FEATURES = "x11"
SRC_URI[archive.sha256sum] = "856333de86689f6a81c123f2db15d85db9addc438bc3574c36f15736aeae22e6"
EXTRA_OEMESON = "-Dbuild-demos=false"
FILES:${PN}-dev += "${libdir}/*/include ${libdir}/*/proc/m4"
<|file_sep|>updated/meta-oe/recipes-gnome/gtk+/gtkmm3_3.24.5.bb | SECTION = "libs"
LICENSE = "LGPL-2.1-only & GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4fbd65380cdd255951079008b364516c \
file://COPYING.tools;md5=751419260aa954499f7abaabaa882bbe"
DEPENDS = "glib-2.0-native atkmm pangomm glibmm gtk+3 cairomm gdk-pixbuf-native"
BPN = "gtkmm"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase features_check
ANY_OF_DISTRO_FEATURES = "${GTK3DISTROFEATURES}"
SRC_URI[archive.sha256sum] = "856333de86689f6a81c123f2db15d85db9addc438bc3574c36f15736aeae22e6"
EXTRA_OEMESON = "-Dbuild-demos=false"
FILES:${PN}-dev += "${libdir}/*/include ${libdir}/*/proc/m4" | <|file_sep|>original/meta-oe/recipes-gnome/gtk+/gtkmm3_3.24.5.bb
SECTION = "libs"
LICENSE = "LGPL-2.1-only & GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4fbd65380cdd255951079008b364516c \
file://COPYING.tools;md5=751419260aa954499f7abaabaa882bbe"
DEPENDS = "glib-2.0-native atkmm pangomm glibmm gtk+3 cairomm gdk-pixbuf-native"
BPN = "gtkmm"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase features_check
REQUIRED_DISTRO_FEATURES = "x11"
SRC_URI[archive.sha256sum] = "856333de86689f6a81c123f2db15d85db9addc438bc3574c36f15736aeae22e6"
EXTRA_OEMESON = "-Dbuild-demos=false"
FILES:${PN}-dev += "${libdir}/*/include ${libdir}/*/proc/m4"
<|file_sep|>current/meta-oe/recipes-gnome/gtk+/gtkmm3_3.24.5.bb
SECTION = "libs"
LICENSE = "LGPL-2.1-only & GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4fbd65380cdd255951079008b364516c \
file://COPYING.tools;md5=751419260aa954499f7abaabaa882bbe"
DEPENDS = "glib-2.0-native atkmm pangomm glibmm gtk+3 cairomm gdk-pixbuf-native"
BPN = "gtkmm"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase features_check
REQUIRED_DISTRO_FEATURES = "x11"
SRC_URI[archive.sha256sum] = "856333de86689f6a81c123f2db15d85db9addc438bc3574c36f15736aeae22e6"
EXTRA_OEMESON = "-Dbuild-demos=false"
FILES:${PN}-dev += "${libdir}/*/include ${libdir}/*/proc/m4"
<|file_sep|>updated/meta-oe/recipes-gnome/gtk+/gtkmm3_3.24.5.bb
SECTION = "libs"
LICENSE = "LGPL-2.1-only & GPL-2.0-only"
LIC_FILES_CHKSUM = "file://COPYING;md5=4fbd65380cdd255951079008b364516c \
file://COPYING.tools;md5=751419260aa954499f7abaabaa882bbe"
DEPENDS = "glib-2.0-native atkmm pangomm glibmm gtk+3 cairomm gdk-pixbuf-native"
BPN = "gtkmm"
GNOMEBASEBUILDCLASS = "meson"
inherit gnomebase features_check
ANY_OF_DISTRO_FEATURES = "${GTK3DISTROFEATURES}"
SRC_URI[archive.sha256sum] = "856333de86689f6a81c123f2db15d85db9addc438bc3574c36f15736aeae22e6"
EXTRA_OEMESON = "-Dbuild-demos=false"
FILES:${PN}-dev += "${libdir}/*/include ${libdir}/*/proc/m4" | 3b311b6c1ec36dcdaf199f51b719bcdb9e0e50b1 | meta-oe/recipes-gnome/gtk+/gtkmm3_3.24.5.bb | meta-oe/recipes-gnome/gtk+/gtkmm3_3.24.5.bb | BitBake |
<|file_sep|>original/validation-test/compiler_crashers/28310-swift-typechecker-checkgenericparamlist.swift
<|file_sep|>current/validation-test/compiler_crashers/28310-swift-typechecker-checkgenericparamlist.swift
<|file_sep|>updated/validation-test/compiler_crashers/28310-swift-typechecker-checkgenericparamlist.swift | // This source file is part of the Swift.org open source project
// Copyright (c) 2014 - 2016 Apple Inc. and the Swift project authors
// Licensed under Apache License v2.0 with Runtime Library Exception
//
// See http://swift.org/LICENSE.txt for license information
// See http://swift.org/CONTRIBUTORS.txt for the list of Swift project authors
// RUN: not --crash %target-swift-frontend %s -parse
yObject, x in 0
class a{
enum b {
class A {
class A {
let i: c
struct D{
enum b {
struct A : a {
func f: b
if true {
class A {
func d.j where g: j.j: j: Int = { | <|file_sep|>original/validation-test/compiler_crashers/28310-swift-typechecker-checkgenericparamlist.swift
<|file_sep|>current/validation-test/compiler_crashers/28310-swift-typechecker-checkgenericparamlist.swift
<|file_sep|>updated/validation-test/compiler_crashers/28310-swift-typechecker-checkgenericparamlist.swift
// This source file is part of the Swift.org open source project
// Copyright (c) 2014 - 2016 Apple Inc. and the Swift project authors
// Licensed under Apache License v2.0 with Runtime Library Exception
//
// See http://swift.org/LICENSE.txt for license information
// See http://swift.org/CONTRIBUTORS.txt for the list of Swift project authors
// RUN: not --crash %target-swift-frontend %s -parse
yObject, x in 0
class a{
enum b {
class A {
class A {
let i: c
struct D{
enum b {
struct A : a {
func f: b
if true {
class A {
func d.j where g: j.j: j: Int = { | 1fa929deee2dea84791e1868950e297b7d720b64 | validation-test/compiler_crashers/28310-swift-typechecker-checkgenericparamlist.swift | validation-test/compiler_crashers/28310-swift-typechecker-checkgenericparamlist.swift | Swift |
<|file_sep|>original/releasenotes/notes/pike_prelude-fedf9f27775d135f.yaml
<|file_sep|>current/releasenotes/notes/pike_prelude-fedf9f27775d135f.yaml
<|file_sep|>updated/releasenotes/notes/pike_prelude-fedf9f27775d135f.yaml | ---
prelude: |
The 16.0.0 release includes many new features and bug fixes. It is
difficult to cover all the changes that have been introduced. Please at
least read the upgrade section which describes the required actions to
upgrade your cloud from 15.0.0 (Ocata) to 16.0.0 (Pike).
That said, a few major changes are worth mentioning. This is not an
exhaustive list:
- The latest Compute API microversion supported for Ocata is v2.53. Details
on REST API microversions added since the 15.0.0 Ocata release can be
found in the `REST API Version History`_ page.
- The FilterScheduler driver now provides allocations to the Placement API,
which helps concurrent schedulers to verify resource consumptions
directly without waiting for compute services to ask for a reschedule in
case of a race condition. That is an important performance improvement
that includes allowing one to use more than one scheduler worker if
there are capacity concerns. For more details, see the
`Pike Upgrade Notes for Placement`_.
- Nova now supports a Cells v2 multi-cell deployment. The default | <|file_sep|>original/releasenotes/notes/pike_prelude-fedf9f27775d135f.yaml
<|file_sep|>current/releasenotes/notes/pike_prelude-fedf9f27775d135f.yaml
<|file_sep|>updated/releasenotes/notes/pike_prelude-fedf9f27775d135f.yaml
---
prelude: |
The 16.0.0 release includes many new features and bug fixes. It is
difficult to cover all the changes that have been introduced. Please at
least read the upgrade section which describes the required actions to
upgrade your cloud from 15.0.0 (Ocata) to 16.0.0 (Pike).
That said, a few major changes are worth mentioning. This is not an
exhaustive list:
- The latest Compute API microversion supported for Ocata is v2.53. Details
on REST API microversions added since the 15.0.0 Ocata release can be
found in the `REST API Version History`_ page.
- The FilterScheduler driver now provides allocations to the Placement API,
which helps concurrent schedulers to verify resource consumptions
directly without waiting for compute services to ask for a reschedule in
case of a race condition. That is an important performance improvement
that includes allowing one to use more than one scheduler worker if
there are capacity concerns. For more details, see the
`Pike Upgrade Notes for Placement`_.
- Nova now supports a Cells v2 multi-cell deployment. The default | 216ccb5b92ba81a33650b952502b610a77119fac | releasenotes/notes/pike_prelude-fedf9f27775d135f.yaml | releasenotes/notes/pike_prelude-fedf9f27775d135f.yaml | YAML |
<|file_sep|>original/util/show_config.pl
<|file_sep|>current/util/show_config.pl
<|file_sep|>updated/util/show_config.pl | #!/usr/bin/env perl
use strict;
use warnings;
use autodie;
use FindBin qw($Bin);
use lib "$Bin/../lib";
use RUM::Config;
use Data::Dumper;
my $config = eval {
RUM::Config->new->parse_command_line(load_default => 1,
options => [RUM::Config->property_names]);
};
if ($@) {
print Dumper($@);
}
print "[job]\n"; | <|file_sep|>original/util/show_config.pl
<|file_sep|>current/util/show_config.pl
<|file_sep|>updated/util/show_config.pl
#!/usr/bin/env perl
use strict;
use warnings;
use autodie;
use FindBin qw($Bin);
use lib "$Bin/../lib";
use RUM::Config;
use Data::Dumper;
my $config = eval {
RUM::Config->new->parse_command_line(load_default => 1,
options => [RUM::Config->property_names]);
};
if ($@) {
print Dumper($@);
}
print "[job]\n"; | 3c050d4080865fd3328b52cccc9a8478de7517fb | util/show_config.pl | util/show_config.pl | Perl |
<|file_sep|>original/build.sh
fi
# Work off travis
if [[ -v TRAVIS_PULL_REQUEST ]]; then
echo "TRAVIS_PULL_REQUEST: $TRAVIS_PULL_REQUEST"
else
echo "TRAVIS_PULL_REQUEST: unset, setting to false"
TRAVIS_PULL_REQUEST=false
fi
# Build
if [ $TRAVIS_PULL_REQUEST = false ] ; then
# For a merged commit, build all configurations.
GRADLE_OPTS=$OPTS ./gradlew clean build
else
# On a pull request, just build debug which is much faster and catches
# obvious errors.
GRADLE_OPTS=$OPTS ./gradlew clean :app:assembleDebug
fi
<|file_sep|>current/build.sh
fi
# Work off travis
if [[ -v TRAVIS_PULL_REQUEST ]]; then
echo "TRAVIS_PULL_REQUEST: $TRAVIS_PULL_REQUEST"
else
echo "TRAVIS_PULL_REQUEST: unset, setting to false"
TRAVIS_PULL_REQUEST=false
fi
# Build
if [ $TRAVIS_PULL_REQUEST = false ] ; then
# For a merged commit, build all configurations.
GRADLE_OPTS=$OPTS ./gradlew clean build
else
# On a pull request, just build debug which is much faster and catches
# obvious errors.
GRADLE_OPTS=$OPTS ./gradlew clean :app:assembleDebug
fi
<|file_sep|>updated/build.sh | fi
# Work off travis
if [[ -v TRAVIS_PULL_REQUEST ]]; then
echo "TRAVIS_PULL_REQUEST: $TRAVIS_PULL_REQUEST"
else
echo "TRAVIS_PULL_REQUEST: unset, setting to false"
TRAVIS_PULL_REQUEST=false
fi
# Build
if [ $TRAVIS_PULL_REQUEST = false ] ; then
# For a merged commit, build all configurations.
GRADLE_OPTS=$OPTS ./gradlew clean build
else
# On a pull request, just build debug which is much faster and catches
# obvious errors.
GRADLE_OPTS=$OPTS ./gradlew clean :mobile:assembleDebug
fi
| <|file_sep|>original/build.sh
fi
# Work off travis
if [[ -v TRAVIS_PULL_REQUEST ]]; then
echo "TRAVIS_PULL_REQUEST: $TRAVIS_PULL_REQUEST"
else
echo "TRAVIS_PULL_REQUEST: unset, setting to false"
TRAVIS_PULL_REQUEST=false
fi
# Build
if [ $TRAVIS_PULL_REQUEST = false ] ; then
# For a merged commit, build all configurations.
GRADLE_OPTS=$OPTS ./gradlew clean build
else
# On a pull request, just build debug which is much faster and catches
# obvious errors.
GRADLE_OPTS=$OPTS ./gradlew clean :app:assembleDebug
fi
<|file_sep|>current/build.sh
fi
# Work off travis
if [[ -v TRAVIS_PULL_REQUEST ]]; then
echo "TRAVIS_PULL_REQUEST: $TRAVIS_PULL_REQUEST"
else
echo "TRAVIS_PULL_REQUEST: unset, setting to false"
TRAVIS_PULL_REQUEST=false
fi
# Build
if [ $TRAVIS_PULL_REQUEST = false ] ; then
# For a merged commit, build all configurations.
GRADLE_OPTS=$OPTS ./gradlew clean build
else
# On a pull request, just build debug which is much faster and catches
# obvious errors.
GRADLE_OPTS=$OPTS ./gradlew clean :app:assembleDebug
fi
<|file_sep|>updated/build.sh
fi
# Work off travis
if [[ -v TRAVIS_PULL_REQUEST ]]; then
echo "TRAVIS_PULL_REQUEST: $TRAVIS_PULL_REQUEST"
else
echo "TRAVIS_PULL_REQUEST: unset, setting to false"
TRAVIS_PULL_REQUEST=false
fi
# Build
if [ $TRAVIS_PULL_REQUEST = false ] ; then
# For a merged commit, build all configurations.
GRADLE_OPTS=$OPTS ./gradlew clean build
else
# On a pull request, just build debug which is much faster and catches
# obvious errors.
GRADLE_OPTS=$OPTS ./gradlew clean :mobile:assembleDebug
fi
| 0997f06943c08ec2a82b303fae9954b1b1d2c4c2 | build.sh | build.sh | Shell |
<|file_sep|>sashimi-webapp/src/components/file-manager/File.vue.diff
original:
<p class="inline-block">Your first document</p>
updated:
<p class="inline-block">{{fileName}}</p>
<|file_sep|>original/sashimi-webapp/src/components/file-manager/File.vue
id="123">
<button class="file">
<img src="../../assets/images/icons/icon-file.svg" alt="file">
<p class="inline-block">Your first document</p>
</button>
</div>
</template>
<script>
export default {
methods: {
openFile() {
const fileId = this.$el.id;
this.$router.push({ path: 'content', query: { id: fileId } });
},
}
};
</script>
<style scoped lang="scss">
</style>
<|file_sep|>current/sashimi-webapp/src/components/file-manager/File.vue
id="123">
<button class="file">
<img src="../../assets/images/icons/icon-file.svg" alt="file">
<p class="inline-block">{{fileName}}</p>
</button>
</div>
</template>
<script>
export default {
methods: {
openFile() {
const fileId = this.$el.id;
this.$router.push({ path: 'content', query: { id: fileId } });
},
}
};
</script>
<style scoped lang="scss">
</style>
<|file_sep|>updated/sashimi-webapp/src/components/file-manager/File.vue | id="123">
<button class="file">
<img src="../../assets/images/icons/icon-file.svg" alt="file">
<p class="inline-block">{{fileName}}</p>
</button>
</div>
</template>
<script>
export default {
props: ['fileName'],
data() {
},
methods: {
openFile() {
const fileId = this.$el.id;
this.$router.push({ path: 'content', query: { id: fileId } });
},
}
};
</script> | <|file_sep|>sashimi-webapp/src/components/file-manager/File.vue.diff
original:
<p class="inline-block">Your first document</p>
updated:
<p class="inline-block">{{fileName}}</p>
<|file_sep|>original/sashimi-webapp/src/components/file-manager/File.vue
id="123">
<button class="file">
<img src="../../assets/images/icons/icon-file.svg" alt="file">
<p class="inline-block">Your first document</p>
</button>
</div>
</template>
<script>
export default {
methods: {
openFile() {
const fileId = this.$el.id;
this.$router.push({ path: 'content', query: { id: fileId } });
},
}
};
</script>
<style scoped lang="scss">
</style>
<|file_sep|>current/sashimi-webapp/src/components/file-manager/File.vue
id="123">
<button class="file">
<img src="../../assets/images/icons/icon-file.svg" alt="file">
<p class="inline-block">{{fileName}}</p>
</button>
</div>
</template>
<script>
export default {
methods: {
openFile() {
const fileId = this.$el.id;
this.$router.push({ path: 'content', query: { id: fileId } });
},
}
};
</script>
<style scoped lang="scss">
</style>
<|file_sep|>updated/sashimi-webapp/src/components/file-manager/File.vue
id="123">
<button class="file">
<img src="../../assets/images/icons/icon-file.svg" alt="file">
<p class="inline-block">{{fileName}}</p>
</button>
</div>
</template>
<script>
export default {
props: ['fileName'],
data() {
},
methods: {
openFile() {
const fileId = this.$el.id;
this.$router.push({ path: 'content', query: { id: fileId } });
},
}
};
</script> | 4c1a2e68db746faecb0310aea1490b256e84abea | sashimi-webapp/src/components/file-manager/File.vue | sashimi-webapp/src/components/file-manager/File.vue | Vue |
<|file_sep|>src/_scss/_aside.scss.diff
original:
updated:
@media print {
background: none;
border-bottom: 2px solid $primary-color;
}
<|file_sep|>src/_scss/_aside.scss.diff
original:
updated:
@media print {
color: $primary-color;
}
<|file_sep|>original/src/_scss/_aside.scss
&, a, a:visited {
color: white;
}
a:hover {
text-decoration: none;
}
#brand {
margin-bottom: -5px;
a {
text-decoration: none;
}
}
#aside_links ul {
padding-top: 0px;
padding-bottom: 0px;
margin-bottom: 10px;
}
}
}
<|file_sep|>current/src/_scss/_aside.scss
#aside_inner {
@include central_element();
&, a, a:visited {
color: white;
@media print {
color: $primary-color;
}
}
a:hover {
text-decoration: none;
}
#brand {
margin-bottom: -5px;
a {
text-decoration: none;
}
}
<|file_sep|>updated/src/_scss/_aside.scss |
#aside_inner {
@include central_element();
&, a, a:visited {
color: white;
@media print {
color: $primary-color;
}
}
a:hover {
text-decoration: none;
}
#brand {
margin-bottom: -5px;
a {
text-decoration: none;
}
} | <|file_sep|>src/_scss/_aside.scss.diff
original:
updated:
@media print {
background: none;
border-bottom: 2px solid $primary-color;
}
<|file_sep|>src/_scss/_aside.scss.diff
original:
updated:
@media print {
color: $primary-color;
}
<|file_sep|>original/src/_scss/_aside.scss
&, a, a:visited {
color: white;
}
a:hover {
text-decoration: none;
}
#brand {
margin-bottom: -5px;
a {
text-decoration: none;
}
}
#aside_links ul {
padding-top: 0px;
padding-bottom: 0px;
margin-bottom: 10px;
}
}
}
<|file_sep|>current/src/_scss/_aside.scss
#aside_inner {
@include central_element();
&, a, a:visited {
color: white;
@media print {
color: $primary-color;
}
}
a:hover {
text-decoration: none;
}
#brand {
margin-bottom: -5px;
a {
text-decoration: none;
}
}
<|file_sep|>updated/src/_scss/_aside.scss
#aside_inner {
@include central_element();
&, a, a:visited {
color: white;
@media print {
color: $primary-color;
}
}
a:hover {
text-decoration: none;
}
#brand {
margin-bottom: -5px;
a {
text-decoration: none;
}
} | 52e8b8a2a15fe333bc58271cae16783f36f20ef8 | src/_scss/_aside.scss | src/_scss/_aside.scss | SCSS |
<|file_sep|>original/requirements/base.txt
dj-database-url==0.5.0
django-apiblueprint-view==2.0.0
django-basicauth==0.5.1
django-extensions==2.1.4
django-localflavor==2.1
django-markdown-deux==1.0.5
django==1.11.18 # pyup: >=1.11,<2.0
djangorestframework==3.9.0
djangorestframework-gis==0.14
django-cors-headers==2.4.0
fastkml==0.11
lxml==4.3.0
psycopg2-binary==2.7.6.1
pyshp==2.0.1
requests==2.21.0
boto==2.49.0
uk-geo-utils==0.6.0
git+git://github.com/DemocracyClub/dc_base_theme.git@0.3.7
git+https://github.com/DemocracyClub/dc_signup_form.git@2.0.1
<|file_sep|>current/requirements/base.txt
dj-database-url==0.5.0
django-apiblueprint-view==2.0.0
django-basicauth==0.5.1
django-extensions==2.1.4
django-localflavor==2.1
django-markdown-deux==1.0.5
django==1.11.18 # pyup: >=1.11,<2.0
djangorestframework==3.9.0
djangorestframework-gis==0.14
django-cors-headers==2.4.0
fastkml==0.11
lxml==4.3.0
psycopg2-binary==2.7.6.1
pyshp==2.0.1
requests==2.21.0
boto==2.49.0
uk-geo-utils==0.6.0
git+git://github.com/DemocracyClub/dc_base_theme.git@0.3.7
git+https://github.com/DemocracyClub/dc_signup_form.git@2.0.1
<|file_sep|>updated/requirements/base.txt | dj-database-url==0.5.0
django-apiblueprint-view==2.0.0
django-basicauth==0.5.1
django-extensions==2.1.4
django-localflavor==2.1
django-markdown-deux==1.0.5
django==1.11.18 # pyup: >=1.11,<2.0
djangorestframework==3.9.1
djangorestframework-gis==0.14
django-cors-headers==2.4.0
fastkml==0.11
lxml==4.3.0
psycopg2-binary==2.7.6.1
pyshp==2.0.1
requests==2.21.0
boto==2.49.0
uk-geo-utils==0.6.0
git+git://github.com/DemocracyClub/dc_base_theme.git@0.3.7
git+https://github.com/DemocracyClub/dc_signup_form.git@2.0.1 | <|file_sep|>original/requirements/base.txt
dj-database-url==0.5.0
django-apiblueprint-view==2.0.0
django-basicauth==0.5.1
django-extensions==2.1.4
django-localflavor==2.1
django-markdown-deux==1.0.5
django==1.11.18 # pyup: >=1.11,<2.0
djangorestframework==3.9.0
djangorestframework-gis==0.14
django-cors-headers==2.4.0
fastkml==0.11
lxml==4.3.0
psycopg2-binary==2.7.6.1
pyshp==2.0.1
requests==2.21.0
boto==2.49.0
uk-geo-utils==0.6.0
git+git://github.com/DemocracyClub/dc_base_theme.git@0.3.7
git+https://github.com/DemocracyClub/dc_signup_form.git@2.0.1
<|file_sep|>current/requirements/base.txt
dj-database-url==0.5.0
django-apiblueprint-view==2.0.0
django-basicauth==0.5.1
django-extensions==2.1.4
django-localflavor==2.1
django-markdown-deux==1.0.5
django==1.11.18 # pyup: >=1.11,<2.0
djangorestframework==3.9.0
djangorestframework-gis==0.14
django-cors-headers==2.4.0
fastkml==0.11
lxml==4.3.0
psycopg2-binary==2.7.6.1
pyshp==2.0.1
requests==2.21.0
boto==2.49.0
uk-geo-utils==0.6.0
git+git://github.com/DemocracyClub/dc_base_theme.git@0.3.7
git+https://github.com/DemocracyClub/dc_signup_form.git@2.0.1
<|file_sep|>updated/requirements/base.txt
dj-database-url==0.5.0
django-apiblueprint-view==2.0.0
django-basicauth==0.5.1
django-extensions==2.1.4
django-localflavor==2.1
django-markdown-deux==1.0.5
django==1.11.18 # pyup: >=1.11,<2.0
djangorestframework==3.9.1
djangorestframework-gis==0.14
django-cors-headers==2.4.0
fastkml==0.11
lxml==4.3.0
psycopg2-binary==2.7.6.1
pyshp==2.0.1
requests==2.21.0
boto==2.49.0
uk-geo-utils==0.6.0
git+git://github.com/DemocracyClub/dc_base_theme.git@0.3.7
git+https://github.com/DemocracyClub/dc_signup_form.git@2.0.1 | c2e7ad33cf5377e844b0f84c6a2e0eee237414f9 | requirements/base.txt | requirements/base.txt | Text |
<|file_sep|>test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp.diff
original:
updated:
// Non-POD classes cannot be passed into a function by component, because their
// dtors must be run. Instead, pass them in by reference. The C++ front-end
// was mistakenly "thinking" that 'foo' took a structure by component.
<|file_sep|>test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp.diff
original:
void operator^(C b) const { }
updated:
<|file_sep|>test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp.diff
original:
updated:
void foo(C b);
<|file_sep|>original/test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp
struct C {
int A, B;
~C() {}
void operator^(C b) const { }
};
void test(C *P) {
*P ^ *P;
}
<|file_sep|>current/test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp
// Non-POD classes cannot be passed into a function by component, because their
// dtors must be run. Instead, pass them in by reference. The C++ front-end
// was mistakenly "thinking" that 'foo' took a structure by component.
struct C {
int A, B;
~C() {}
};
void foo(C b);
void test(C *P) {
*P ^ *P;
}
<|file_sep|>updated/test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp | // Non-POD classes cannot be passed into a function by component, because their
// dtors must be run. Instead, pass them in by reference. The C++ front-end
// was mistakenly "thinking" that 'foo' took a structure by component.
struct C {
int A, B;
~C() {}
};
void foo(C b);
void test(C *P) {
foo(*P);
}
| <|file_sep|>test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp.diff
original:
updated:
// Non-POD classes cannot be passed into a function by component, because their
// dtors must be run. Instead, pass them in by reference. The C++ front-end
// was mistakenly "thinking" that 'foo' took a structure by component.
<|file_sep|>test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp.diff
original:
void operator^(C b) const { }
updated:
<|file_sep|>test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp.diff
original:
updated:
void foo(C b);
<|file_sep|>original/test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp
struct C {
int A, B;
~C() {}
void operator^(C b) const { }
};
void test(C *P) {
*P ^ *P;
}
<|file_sep|>current/test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp
// Non-POD classes cannot be passed into a function by component, because their
// dtors must be run. Instead, pass them in by reference. The C++ front-end
// was mistakenly "thinking" that 'foo' took a structure by component.
struct C {
int A, B;
~C() {}
};
void foo(C b);
void test(C *P) {
*P ^ *P;
}
<|file_sep|>updated/test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp
// Non-POD classes cannot be passed into a function by component, because their
// dtors must be run. Instead, pass them in by reference. The C++ front-end
// was mistakenly "thinking" that 'foo' took a structure by component.
struct C {
int A, B;
~C() {}
};
void foo(C b);
void test(C *P) {
foo(*P);
}
| 4e44c0ea31d142d1412e4fa7001a1654ff6afd2e | test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp | test/C++Frontend/2003-09-29-ArgumentNumberMismatch.cpp | C++ |
<|file_sep|>django/santropolFeast/member/tests.py.diff
original:
from member.models import Member
updated:
from member.models import Member, Client
<|file_sep|>django/santropolFeast/member/tests.py.diff
original:
Member.objects.create(firstname='Katrina', birthdate=date(1980, 4, 19))
updated:
Member.objects.create(firstname='Katrina', lastname='Heide', birthdate=date(1980, 4, 19))
<|file_sep|>original/django/santropolFeast/member/tests.py
from django.test import TestCase
from member.models import Member
from datetime import date
class MemberTestCase(TestCase):
def setUp(self):
Member.objects.create(firstname='Katrina', birthdate=date(1980, 4, 19))
def test_age_on_date(self):
"""The age on given date is properly computed"""
katrina = Member.objects.get(firstname='Katrina')
self.assertEqual(katrina.age_on_date(date(2016, 4, 19)), 36)
self.assertEqual(katrina.age_on_date(date(1950, 4, 19)), 0)
self.assertEqual(katrina.age_on_date(katrina.birthdate), 0)
<|file_sep|>current/django/santropolFeast/member/tests.py
from django.test import TestCase
from member.models import Member, Client
from datetime import date
class MemberTestCase(TestCase):
def setUp(self):
Member.objects.create(firstname='Katrina', lastname='Heide', birthdate=date(1980, 4, 19))
def test_age_on_date(self):
"""The age on given date is properly computed"""
katrina = Member.objects.get(firstname='Katrina')
self.assertEqual(katrina.age_on_date(date(2016, 4, 19)), 36)
self.assertEqual(katrina.age_on_date(date(1950, 4, 19)), 0)
self.assertEqual(katrina.age_on_date(katrina.birthdate), 0)
<|file_sep|>updated/django/santropolFeast/member/tests.py | from django.test import TestCase
from member.models import Member, Client
from datetime import date
class MemberTestCase(TestCase):
def setUp(self):
Member.objects.create(firstname='Katrina', lastname='Heide', birthdate=date(1980, 4, 19))
def test_age_on_date(self):
"""The age on given date is properly computed"""
katrina = Member.objects.get(firstname='Katrina')
self.assertEqual(katrina.age_on_date(date(2016, 4, 19)), 36)
self.assertEqual(katrina.age_on_date(date(1950, 4, 19)), 0)
self.assertEqual(katrina.age_on_date(katrina.birthdate), 0)
def test_str_is_fullname(self):
"""The member model must be rendered using the firstname and the lastname"""
member = Member.objects.get(firstname='Katrina')
self.assertEqual(str(member), 'Katrina Heide') | <|file_sep|>django/santropolFeast/member/tests.py.diff
original:
from member.models import Member
updated:
from member.models import Member, Client
<|file_sep|>django/santropolFeast/member/tests.py.diff
original:
Member.objects.create(firstname='Katrina', birthdate=date(1980, 4, 19))
updated:
Member.objects.create(firstname='Katrina', lastname='Heide', birthdate=date(1980, 4, 19))
<|file_sep|>original/django/santropolFeast/member/tests.py
from django.test import TestCase
from member.models import Member
from datetime import date
class MemberTestCase(TestCase):
def setUp(self):
Member.objects.create(firstname='Katrina', birthdate=date(1980, 4, 19))
def test_age_on_date(self):
"""The age on given date is properly computed"""
katrina = Member.objects.get(firstname='Katrina')
self.assertEqual(katrina.age_on_date(date(2016, 4, 19)), 36)
self.assertEqual(katrina.age_on_date(date(1950, 4, 19)), 0)
self.assertEqual(katrina.age_on_date(katrina.birthdate), 0)
<|file_sep|>current/django/santropolFeast/member/tests.py
from django.test import TestCase
from member.models import Member, Client
from datetime import date
class MemberTestCase(TestCase):
def setUp(self):
Member.objects.create(firstname='Katrina', lastname='Heide', birthdate=date(1980, 4, 19))
def test_age_on_date(self):
"""The age on given date is properly computed"""
katrina = Member.objects.get(firstname='Katrina')
self.assertEqual(katrina.age_on_date(date(2016, 4, 19)), 36)
self.assertEqual(katrina.age_on_date(date(1950, 4, 19)), 0)
self.assertEqual(katrina.age_on_date(katrina.birthdate), 0)
<|file_sep|>updated/django/santropolFeast/member/tests.py
from django.test import TestCase
from member.models import Member, Client
from datetime import date
class MemberTestCase(TestCase):
def setUp(self):
Member.objects.create(firstname='Katrina', lastname='Heide', birthdate=date(1980, 4, 19))
def test_age_on_date(self):
"""The age on given date is properly computed"""
katrina = Member.objects.get(firstname='Katrina')
self.assertEqual(katrina.age_on_date(date(2016, 4, 19)), 36)
self.assertEqual(katrina.age_on_date(date(1950, 4, 19)), 0)
self.assertEqual(katrina.age_on_date(katrina.birthdate), 0)
def test_str_is_fullname(self):
"""The member model must be rendered using the firstname and the lastname"""
member = Member.objects.get(firstname='Katrina')
self.assertEqual(str(member), 'Katrina Heide') | 5b9a76dc525f480a08ccbbedcbb3866faa5a50f3 | django/santropolFeast/member/tests.py | django/santropolFeast/member/tests.py | Python |
<|file_sep|>original/lib/sirius_api/umapi_client.rb
unless operator.to_s.in? %(all any none)
fail ArgumentError, "Operator must be 'all', 'any', or 'none'."
end
user_uri = "#{@client.site}/#{username}/roles?#{operator}=#{roles.to_a.join(',')}"
resp = token.request(:head, user_uri)
case resp.status
when 200 then true
when 404 then false
else raise "Invalid response for #{@client.site}/#{user_uri} with status #{resp.status}."
end
end
private
def token
if @token.nil? || @token.expired?
@token = @client.client_credentials.get_token
end
@token
<|file_sep|>current/lib/sirius_api/umapi_client.rb
unless operator.to_s.in? %(all any none)
fail ArgumentError, "Operator must be 'all', 'any', or 'none'."
end
user_uri = "#{@client.site}/#{username}/roles?#{operator}=#{roles.to_a.join(',')}"
resp = token.request(:head, user_uri)
case resp.status
when 200 then true
when 404 then false
else raise "Invalid response for #{@client.site}/#{user_uri} with status #{resp.status}."
end
end
private
def token
if @token.nil? || @token.expired?
@token = @client.client_credentials.get_token
end
@token
<|file_sep|>updated/lib/sirius_api/umapi_client.rb | unless operator.to_s.in? %(all any none)
fail ArgumentError, "Operator must be 'all', 'any', or 'none'."
end
user_uri = "#{@client.site}/#{username}/roles?#{operator}=#{roles.to_a.join(',')}"
resp = token.request(:head, user_uri)
case resp.status
when 200 then true
when 404 then false
else raise "Invalid response for #{user_uri} with status #{resp.status}."
end
end
private
def token
if @token.nil? || @token.expired?
@token = @client.client_credentials.get_token
end
@token | <|file_sep|>original/lib/sirius_api/umapi_client.rb
unless operator.to_s.in? %(all any none)
fail ArgumentError, "Operator must be 'all', 'any', or 'none'."
end
user_uri = "#{@client.site}/#{username}/roles?#{operator}=#{roles.to_a.join(',')}"
resp = token.request(:head, user_uri)
case resp.status
when 200 then true
when 404 then false
else raise "Invalid response for #{@client.site}/#{user_uri} with status #{resp.status}."
end
end
private
def token
if @token.nil? || @token.expired?
@token = @client.client_credentials.get_token
end
@token
<|file_sep|>current/lib/sirius_api/umapi_client.rb
unless operator.to_s.in? %(all any none)
fail ArgumentError, "Operator must be 'all', 'any', or 'none'."
end
user_uri = "#{@client.site}/#{username}/roles?#{operator}=#{roles.to_a.join(',')}"
resp = token.request(:head, user_uri)
case resp.status
when 200 then true
when 404 then false
else raise "Invalid response for #{@client.site}/#{user_uri} with status #{resp.status}."
end
end
private
def token
if @token.nil? || @token.expired?
@token = @client.client_credentials.get_token
end
@token
<|file_sep|>updated/lib/sirius_api/umapi_client.rb
unless operator.to_s.in? %(all any none)
fail ArgumentError, "Operator must be 'all', 'any', or 'none'."
end
user_uri = "#{@client.site}/#{username}/roles?#{operator}=#{roles.to_a.join(',')}"
resp = token.request(:head, user_uri)
case resp.status
when 200 then true
when 404 then false
else raise "Invalid response for #{user_uri} with status #{resp.status}."
end
end
private
def token
if @token.nil? || @token.expired?
@token = @client.client_credentials.get_token
end
@token | c482cd2759137cabf0c8fe251f4613f742934299 | lib/sirius_api/umapi_client.rb | lib/sirius_api/umapi_client.rb | Ruby |
<|file_sep|>original/presentation.md
% Vaibhav Sagar
# Intro
- Collection of `git` curiousities
- Real or quasi-real examples
- Less puns
- Assumes prior `git` experience, unlike my last talk.
# Git Add
# Git Rebase
- **git show** - Shows one or more objects (blobs, trees, tags, and commits)
- **git commit --amend** - Replace the tip of the current branch by creating a new commit
- **git rebase** - Forward-port local commits to the updated upstream head (what?)
- **git cherry-pick** - Apply the changes introduced by some existing commits
# Git Reflog
- **git reflog** - Manage reflog information (uhhh...)
<|file_sep|>current/presentation.md
% Vaibhav Sagar
# Intro
- Collection of `git` curiousities
- Real or quasi-real examples
- Less puns
- Assumes prior `git` experience, unlike my last talk.
# Git Add
# Git Rebase
- **git show** - Shows one or more objects (blobs, trees, tags, and commits)
- **git commit --amend** - Replace the tip of the current branch by creating a new commit
- **git rebase** - Forward-port local commits to the updated upstream head (what?)
- **git cherry-pick** - Apply the changes introduced by some existing commits
# Git Reflog
- **git reflog** - Manage reflog information (uhhh...)
<|file_sep|>updated/presentation.md | % Vaibhav Sagar
# Intro
- Collection of `git` curiousities
- Real or quasi-real examples
- Less puns
- Assumes prior `git` experience, unlike my last talk.
# Git Add
- **git add --patch** - Interactively choose hunks of patch between the index and the work tree and add them to the index. This gives the user a chance to review the difference before adding modified contents to the index.
# Git Rebase
- **git show** - Shows one or more objects (blobs, trees, tags, and commits)
- **git commit --amend** - Replace the tip of the current branch by creating a new commit
- **git rebase** - Forward-port local commits to the updated upstream head (what?)
- **git cherry-pick** - Apply the changes introduced by some existing commits
# Git Reflog | <|file_sep|>original/presentation.md
% Vaibhav Sagar
# Intro
- Collection of `git` curiousities
- Real or quasi-real examples
- Less puns
- Assumes prior `git` experience, unlike my last talk.
# Git Add
# Git Rebase
- **git show** - Shows one or more objects (blobs, trees, tags, and commits)
- **git commit --amend** - Replace the tip of the current branch by creating a new commit
- **git rebase** - Forward-port local commits to the updated upstream head (what?)
- **git cherry-pick** - Apply the changes introduced by some existing commits
# Git Reflog
- **git reflog** - Manage reflog information (uhhh...)
<|file_sep|>current/presentation.md
% Vaibhav Sagar
# Intro
- Collection of `git` curiousities
- Real or quasi-real examples
- Less puns
- Assumes prior `git` experience, unlike my last talk.
# Git Add
# Git Rebase
- **git show** - Shows one or more objects (blobs, trees, tags, and commits)
- **git commit --amend** - Replace the tip of the current branch by creating a new commit
- **git rebase** - Forward-port local commits to the updated upstream head (what?)
- **git cherry-pick** - Apply the changes introduced by some existing commits
# Git Reflog
- **git reflog** - Manage reflog information (uhhh...)
<|file_sep|>updated/presentation.md
% Vaibhav Sagar
# Intro
- Collection of `git` curiousities
- Real or quasi-real examples
- Less puns
- Assumes prior `git` experience, unlike my last talk.
# Git Add
- **git add --patch** - Interactively choose hunks of patch between the index and the work tree and add them to the index. This gives the user a chance to review the difference before adding modified contents to the index.
# Git Rebase
- **git show** - Shows one or more objects (blobs, trees, tags, and commits)
- **git commit --amend** - Replace the tip of the current branch by creating a new commit
- **git rebase** - Forward-port local commits to the updated upstream head (what?)
- **git cherry-pick** - Apply the changes introduced by some existing commits
# Git Reflog | f3a1f32f0e400f7c4f4101f4999d345e61c15ed1 | presentation.md | presentation.md | Markdown |
<|file_sep|>original/importers/src/main/webapp/WEB-INF/web.xml
</servlet>
<servlet>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<servlet-class>io.prometheus.client.exporter.MetricsServlet</servlet-class>
<load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
<servlet-name>VCopyImporterServlet</servlet-name>
<url-pattern>/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<url-pattern>/metrics</url-pattern>
</servlet-mapping>
<welcome-file-list>
<welcome-file>index.xhtml</welcome-file>
</welcome-file-list>
<|file_sep|>current/importers/src/main/webapp/WEB-INF/web.xml
</servlet>
<servlet>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<servlet-class>io.prometheus.client.exporter.MetricsServlet</servlet-class>
<load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
<servlet-name>VCopyImporterServlet</servlet-name>
<url-pattern>/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<url-pattern>/metrics</url-pattern>
</servlet-mapping>
<welcome-file-list>
<welcome-file>index.xhtml</welcome-file>
</welcome-file-list>
<|file_sep|>updated/importers/src/main/webapp/WEB-INF/web.xml | </servlet>
<servlet>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<servlet-class>io.prometheus.client.exporter.MetricsServlet</servlet-class>
<load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
<servlet-name>VCopyImporterServlet</servlet-name>
<url-pattern>/vcopyimporter</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<url-pattern>/metrics</url-pattern>
</servlet-mapping>
<welcome-file-list>
<welcome-file>index.xhtml</welcome-file>
</welcome-file-list> | <|file_sep|>original/importers/src/main/webapp/WEB-INF/web.xml
</servlet>
<servlet>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<servlet-class>io.prometheus.client.exporter.MetricsServlet</servlet-class>
<load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
<servlet-name>VCopyImporterServlet</servlet-name>
<url-pattern>/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<url-pattern>/metrics</url-pattern>
</servlet-mapping>
<welcome-file-list>
<welcome-file>index.xhtml</welcome-file>
</welcome-file-list>
<|file_sep|>current/importers/src/main/webapp/WEB-INF/web.xml
</servlet>
<servlet>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<servlet-class>io.prometheus.client.exporter.MetricsServlet</servlet-class>
<load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
<servlet-name>VCopyImporterServlet</servlet-name>
<url-pattern>/*</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<url-pattern>/metrics</url-pattern>
</servlet-mapping>
<welcome-file-list>
<welcome-file>index.xhtml</welcome-file>
</welcome-file-list>
<|file_sep|>updated/importers/src/main/webapp/WEB-INF/web.xml
</servlet>
<servlet>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<servlet-class>io.prometheus.client.exporter.MetricsServlet</servlet-class>
<load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
<servlet-name>VCopyImporterServlet</servlet-name>
<url-pattern>/vcopyimporter</url-pattern>
</servlet-mapping>
<servlet-mapping>
<servlet-name>PrometheusSimpleClientServlet</servlet-name>
<url-pattern>/metrics</url-pattern>
</servlet-mapping>
<welcome-file-list>
<welcome-file>index.xhtml</welcome-file>
</welcome-file-list> | 0fe0b95685e2a97e1572e401bd06f46b1df2de6e | importers/src/main/webapp/WEB-INF/web.xml | importers/src/main/webapp/WEB-INF/web.xml | XML |
<|file_sep|>original/deploytools/publish.sh
<|file_sep|>current/deploytools/publish.sh
<|file_sep|>updated/deploytools/publish.sh | pushd ..
git branch
sbt compile && sbt assembly
aws lambda update-function-code --region us-east-1 --function-name "AlexaBulba" --zip-file fileb://target/scala-2.11/Alexa-Bulbapedia-Scala-assembly-1.0.jar
clear
echo "----"
echo "Done! Test the skill now." | <|file_sep|>original/deploytools/publish.sh
<|file_sep|>current/deploytools/publish.sh
<|file_sep|>updated/deploytools/publish.sh
pushd ..
git branch
sbt compile && sbt assembly
aws lambda update-function-code --region us-east-1 --function-name "AlexaBulba" --zip-file fileb://target/scala-2.11/Alexa-Bulbapedia-Scala-assembly-1.0.jar
clear
echo "----"
echo "Done! Test the skill now." | 6f565cc2b340127d70dbbba55134a019f051ff3a | deploytools/publish.sh | deploytools/publish.sh | Shell |
<|file_sep|>original/lib/parse-response.js
var textPattern = /^text\/*/;
module.exports = function parseResponse(response) {
var key, contentType;
for (key in response) {
if (Buffer.isBuffer(response[key])) {
if (key === 'value') {
if (response.content_type) {
contentType = getContentType(response);
if (contentType === 'application/json') {
response[key] = JSON.parse(response[key]);
} else if (textPattern.test(contentType)) {
response[key] = response[key].toString();
}
} else {
response[key] = response[key].toString();
}
} else if (key !== 'vclock') {
response[key] = response[key].toString();
}
<|file_sep|>current/lib/parse-response.js
var textPattern = /^text\/*/;
module.exports = function parseResponse(response) {
var key, contentType;
for (key in response) {
if (Buffer.isBuffer(response[key])) {
if (key === 'value') {
if (response.content_type) {
contentType = getContentType(response);
if (contentType === 'application/json') {
response[key] = JSON.parse(response[key]);
} else if (textPattern.test(contentType)) {
response[key] = response[key].toString();
}
} else {
response[key] = response[key].toString();
}
} else if (key !== 'vclock') {
response[key] = response[key].toString();
}
<|file_sep|>updated/lib/parse-response.js | var textPattern = /^text\/*/;
module.exports = function parseResponse(response) {
var key, contentType;
for (key in response) {
if (Buffer.isBuffer(response[key])) {
if (key === 'value') {
if (response.content_type && response[key].length !== 0) {
contentType = getContentType(response);
if (contentType === 'application/json') {
response[key] = JSON.parse(response[key]);
} else if (textPattern.test(contentType)) {
response[key] = response[key].toString();
}
} else {
response[key] = response[key].toString();
}
} else if (key !== 'vclock') {
response[key] = response[key].toString();
} | <|file_sep|>original/lib/parse-response.js
var textPattern = /^text\/*/;
module.exports = function parseResponse(response) {
var key, contentType;
for (key in response) {
if (Buffer.isBuffer(response[key])) {
if (key === 'value') {
if (response.content_type) {
contentType = getContentType(response);
if (contentType === 'application/json') {
response[key] = JSON.parse(response[key]);
} else if (textPattern.test(contentType)) {
response[key] = response[key].toString();
}
} else {
response[key] = response[key].toString();
}
} else if (key !== 'vclock') {
response[key] = response[key].toString();
}
<|file_sep|>current/lib/parse-response.js
var textPattern = /^text\/*/;
module.exports = function parseResponse(response) {
var key, contentType;
for (key in response) {
if (Buffer.isBuffer(response[key])) {
if (key === 'value') {
if (response.content_type) {
contentType = getContentType(response);
if (contentType === 'application/json') {
response[key] = JSON.parse(response[key]);
} else if (textPattern.test(contentType)) {
response[key] = response[key].toString();
}
} else {
response[key] = response[key].toString();
}
} else if (key !== 'vclock') {
response[key] = response[key].toString();
}
<|file_sep|>updated/lib/parse-response.js
var textPattern = /^text\/*/;
module.exports = function parseResponse(response) {
var key, contentType;
for (key in response) {
if (Buffer.isBuffer(response[key])) {
if (key === 'value') {
if (response.content_type && response[key].length !== 0) {
contentType = getContentType(response);
if (contentType === 'application/json') {
response[key] = JSON.parse(response[key]);
} else if (textPattern.test(contentType)) {
response[key] = response[key].toString();
}
} else {
response[key] = response[key].toString();
}
} else if (key !== 'vclock') {
response[key] = response[key].toString();
} | 36d38aba445916ba95979fbadbc86551d809af00 | lib/parse-response.js | lib/parse-response.js | JavaScript |
<|file_sep|>original/plugins/pass_the_butter.py
<|file_sep|>current/plugins/pass_the_butter.py
<|file_sep|>updated/plugins/pass_the_butter.py | from espresso.main import robot
@robot.respond(r"(?i)pass the butter")
def pass_the_butter(res):
res.reply(res.msg.user, "What is my purpose in life?")
@robot.respond(r"(?i)you pass butter")
def you_pass_butter(res):
res.send("Oh my god.") | <|file_sep|>original/plugins/pass_the_butter.py
<|file_sep|>current/plugins/pass_the_butter.py
<|file_sep|>updated/plugins/pass_the_butter.py
from espresso.main import robot
@robot.respond(r"(?i)pass the butter")
def pass_the_butter(res):
res.reply(res.msg.user, "What is my purpose in life?")
@robot.respond(r"(?i)you pass butter")
def you_pass_butter(res):
res.send("Oh my god.") | 052dbe05c0e1d3e2821857a035e469be2a1055ae | plugins/pass_the_butter.py | plugins/pass_the_butter.py | Python |
<|file_sep|>original/src/Listabierta/Bundle/MunicipalesBundle/Resources/views/Candidacy/recover_password_success.html.twig
<|file_sep|>current/src/Listabierta/Bundle/MunicipalesBundle/Resources/views/Candidacy/recover_password_success.html.twig
<|file_sep|>updated/src/Listabierta/Bundle/MunicipalesBundle/Resources/views/Candidacy/recover_password_success.html.twig | {% extends 'MunicipalesBundle:theme:layout.html.twig' %}
{% block title %}{{ 'candidacy.step_verify.html_title'|trans }}{% endblock %}
{% form_theme form 'MunicipalesBundle:theme:forms.html.twig' %}
{% block appjsinline %}
{% endblock %}
{% block content %}
<section class="section-2">
<div class="container marketing">
<div class="headline">
<h2>{{ 'candidacy.recover_password.header'|trans }}</h2>
</div><h3 class="featurette-headin"> <i>{{ 'candidacy.recover_password.h1_title'|trans }}</i>
</h3>
<!-- START THE FEATURETTES -->
| <|file_sep|>original/src/Listabierta/Bundle/MunicipalesBundle/Resources/views/Candidacy/recover_password_success.html.twig
<|file_sep|>current/src/Listabierta/Bundle/MunicipalesBundle/Resources/views/Candidacy/recover_password_success.html.twig
<|file_sep|>updated/src/Listabierta/Bundle/MunicipalesBundle/Resources/views/Candidacy/recover_password_success.html.twig
{% extends 'MunicipalesBundle:theme:layout.html.twig' %}
{% block title %}{{ 'candidacy.step_verify.html_title'|trans }}{% endblock %}
{% form_theme form 'MunicipalesBundle:theme:forms.html.twig' %}
{% block appjsinline %}
{% endblock %}
{% block content %}
<section class="section-2">
<div class="container marketing">
<div class="headline">
<h2>{{ 'candidacy.recover_password.header'|trans }}</h2>
</div><h3 class="featurette-headin"> <i>{{ 'candidacy.recover_password.h1_title'|trans }}</i>
</h3>
<!-- START THE FEATURETTES -->
| e81f600d8baff9de3198c0992acfe7e2e1098cb9 | src/Listabierta/Bundle/MunicipalesBundle/Resources/views/Candidacy/recover_password_success.html.twig | src/Listabierta/Bundle/MunicipalesBundle/Resources/views/Candidacy/recover_password_success.html.twig | Twig |
<|file_sep|>metadata/de.baumann.hhsmoodle.txt.diff
original:
updated:
Build:1.0.2,5
commit=v1.0.2
subdir=app
gradle=yes
<|file_sep|>original/metadata/de.baumann.hhsmoodle.txt
Auto Name:HHS Moodle
Summary:Interact with HHS moodle instance
Description:
Interact with the Moodle instance of the [http://huebsch.karlsruhe.de/
Heinrich-Huebsch-Schule] in Karlsruhe, Germany.
[https://github.com/scoute-dich/HHSMoodle/blob/HEAD/SCREENSHOTS.md Screenshots]
.
Repo Type:git
Repo:https://github.com/scoute-dich/HHSMoodle
Build:1.0,2
commit=v1.0
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:1.0
Current Version Code:2
<|file_sep|>current/metadata/de.baumann.hhsmoodle.txt
[https://github.com/scoute-dich/HHSMoodle/blob/HEAD/SCREENSHOTS.md Screenshots]
.
Repo Type:git
Repo:https://github.com/scoute-dich/HHSMoodle
Build:1.0,2
commit=v1.0
subdir=app
gradle=yes
Build:1.0.2,5
commit=v1.0.2
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:1.0
Current Version Code:2
<|file_sep|>updated/metadata/de.baumann.hhsmoodle.txt |
[https://github.com/scoute-dich/HHSMoodle/blob/HEAD/SCREENSHOTS.md Screenshots]
.
Repo Type:git
Repo:https://github.com/scoute-dich/HHSMoodle
Build:1.0,2
commit=v1.0
subdir=app
gradle=yes
Build:1.0.2,5
commit=v1.0.2
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:1.0.2
Current Version Code:5 | <|file_sep|>metadata/de.baumann.hhsmoodle.txt.diff
original:
updated:
Build:1.0.2,5
commit=v1.0.2
subdir=app
gradle=yes
<|file_sep|>original/metadata/de.baumann.hhsmoodle.txt
Auto Name:HHS Moodle
Summary:Interact with HHS moodle instance
Description:
Interact with the Moodle instance of the [http://huebsch.karlsruhe.de/
Heinrich-Huebsch-Schule] in Karlsruhe, Germany.
[https://github.com/scoute-dich/HHSMoodle/blob/HEAD/SCREENSHOTS.md Screenshots]
.
Repo Type:git
Repo:https://github.com/scoute-dich/HHSMoodle
Build:1.0,2
commit=v1.0
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:1.0
Current Version Code:2
<|file_sep|>current/metadata/de.baumann.hhsmoodle.txt
[https://github.com/scoute-dich/HHSMoodle/blob/HEAD/SCREENSHOTS.md Screenshots]
.
Repo Type:git
Repo:https://github.com/scoute-dich/HHSMoodle
Build:1.0,2
commit=v1.0
subdir=app
gradle=yes
Build:1.0.2,5
commit=v1.0.2
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:1.0
Current Version Code:2
<|file_sep|>updated/metadata/de.baumann.hhsmoodle.txt
[https://github.com/scoute-dich/HHSMoodle/blob/HEAD/SCREENSHOTS.md Screenshots]
.
Repo Type:git
Repo:https://github.com/scoute-dich/HHSMoodle
Build:1.0,2
commit=v1.0
subdir=app
gradle=yes
Build:1.0.2,5
commit=v1.0.2
subdir=app
gradle=yes
Auto Update Mode:Version v%v
Update Check Mode:Tags
Current Version:1.0.2
Current Version Code:5 | 30defcc56c74fd45a070f822c5dfcbea71f60a6b | metadata/de.baumann.hhsmoodle.txt | metadata/de.baumann.hhsmoodle.txt | Text |
<|file_sep|>original/.github/workflows/macos_cmake.yml
name: MacOS Cmake
on:
workflow_dispatch:
inputs:
extra_resolve_options:
description: "Extra Resolve Options"
required: false
schedule:
- cron: "0 1 * * *" # 3 AM CET
push:
pull_request:
env:
EXTRA_RESOLVE_OPTIONS: ${{ github.event.inputs.extra_resolve_options }}
jobs:
MacOS-cmake:
name: MacOS Cmake
uses: steinwurf/macos-cmake-action/.github/workflows/action.yml@1.0.0
with:
extra_resolve_options: $EXTRA_RESOLVE_OPTIONS
<|file_sep|>current/.github/workflows/macos_cmake.yml
name: MacOS Cmake
on:
workflow_dispatch:
inputs:
extra_resolve_options:
description: "Extra Resolve Options"
required: false
schedule:
- cron: "0 1 * * *" # 3 AM CET
push:
pull_request:
env:
EXTRA_RESOLVE_OPTIONS: ${{ github.event.inputs.extra_resolve_options }}
jobs:
MacOS-cmake:
name: MacOS Cmake
uses: steinwurf/macos-cmake-action/.github/workflows/action.yml@1.0.0
with:
extra_resolve_options: $EXTRA_RESOLVE_OPTIONS
<|file_sep|>updated/.github/workflows/macos_cmake.yml | name: MacOS Cmake
on:
workflow_dispatch:
inputs:
extra_resolve_options:
description: "Extra Resolve Options"
required: false
schedule:
- cron: "0 1 * * *" # 3 AM CET
push:
pull_request:
env:
EXTRA_RESOLVE_OPTIONS: ${{ github.event.inputs.extra_resolve_options }}
jobs:
MacOS-cmake:
name: MacOS Cmake
uses: steinwurf/macos-cmake-action/.github/workflows/action.yml@2.0.0
with:
extra_resolve_options: $EXTRA_RESOLVE_OPTIONS | <|file_sep|>original/.github/workflows/macos_cmake.yml
name: MacOS Cmake
on:
workflow_dispatch:
inputs:
extra_resolve_options:
description: "Extra Resolve Options"
required: false
schedule:
- cron: "0 1 * * *" # 3 AM CET
push:
pull_request:
env:
EXTRA_RESOLVE_OPTIONS: ${{ github.event.inputs.extra_resolve_options }}
jobs:
MacOS-cmake:
name: MacOS Cmake
uses: steinwurf/macos-cmake-action/.github/workflows/action.yml@1.0.0
with:
extra_resolve_options: $EXTRA_RESOLVE_OPTIONS
<|file_sep|>current/.github/workflows/macos_cmake.yml
name: MacOS Cmake
on:
workflow_dispatch:
inputs:
extra_resolve_options:
description: "Extra Resolve Options"
required: false
schedule:
- cron: "0 1 * * *" # 3 AM CET
push:
pull_request:
env:
EXTRA_RESOLVE_OPTIONS: ${{ github.event.inputs.extra_resolve_options }}
jobs:
MacOS-cmake:
name: MacOS Cmake
uses: steinwurf/macos-cmake-action/.github/workflows/action.yml@1.0.0
with:
extra_resolve_options: $EXTRA_RESOLVE_OPTIONS
<|file_sep|>updated/.github/workflows/macos_cmake.yml
name: MacOS Cmake
on:
workflow_dispatch:
inputs:
extra_resolve_options:
description: "Extra Resolve Options"
required: false
schedule:
- cron: "0 1 * * *" # 3 AM CET
push:
pull_request:
env:
EXTRA_RESOLVE_OPTIONS: ${{ github.event.inputs.extra_resolve_options }}
jobs:
MacOS-cmake:
name: MacOS Cmake
uses: steinwurf/macos-cmake-action/.github/workflows/action.yml@2.0.0
with:
extra_resolve_options: $EXTRA_RESOLVE_OPTIONS | b8048a4621f37b9598222463b6cfa1b3fc5a9976 | .github/workflows/macos_cmake.yml | .github/workflows/macos_cmake.yml | YAML |
<|file_sep|>src/lib.rs.diff
original:
updated:
#![feature(step_by)]
/// A direction of the Fourier transform.
<|file_sep|>src/lib.rs.diff
original:
updated:
/// From the time domain to the frequency domain.
<|file_sep|>src/lib.rs.diff
original:
Backward,
updated:
/// From the frequency domain to the time domain.
Inverse,
<|file_sep|>original/src/lib.rs
pub enum Direction {
Forward,
Backward,
}
pub fn transform(data: &mut [f64], _: Direction) {
let n = data.len();
if n < 2 || n & (n - 1) != 0 {
panic!("the data size should be a power of two");
}
}
<|file_sep|>current/src/lib.rs
#![feature(step_by)]
/// A direction of the Fourier transform.
pub enum Direction {
/// From the time domain to the frequency domain.
Forward,
/// From the frequency domain to the time domain.
Inverse,
}
pub fn transform(data: &mut [f64], _: Direction) {
let n = data.len();
if n < 2 || n & (n - 1) != 0 {
panic!("the data size should be a power of two");
}
}
<|file_sep|>updated/src/lib.rs |
/// A direction of the Fourier transform.
pub enum Direction {
/// From the time domain to the frequency domain.
Forward,
/// From the frequency domain to the time domain.
Inverse,
}
/// Perform the Fourier transform.
pub fn transform(data: &mut [f64], direction: Direction) {
use std::f64::consts::PI;
let n = data.len() / 2;
let nn = n << 1;
let isign = match direction {
Direction::Forward => 1.0,
Direction::Inverse => -1.0,
};
| <|file_sep|>src/lib.rs.diff
original:
updated:
#![feature(step_by)]
/// A direction of the Fourier transform.
<|file_sep|>src/lib.rs.diff
original:
updated:
/// From the time domain to the frequency domain.
<|file_sep|>src/lib.rs.diff
original:
Backward,
updated:
/// From the frequency domain to the time domain.
Inverse,
<|file_sep|>original/src/lib.rs
pub enum Direction {
Forward,
Backward,
}
pub fn transform(data: &mut [f64], _: Direction) {
let n = data.len();
if n < 2 || n & (n - 1) != 0 {
panic!("the data size should be a power of two");
}
}
<|file_sep|>current/src/lib.rs
#![feature(step_by)]
/// A direction of the Fourier transform.
pub enum Direction {
/// From the time domain to the frequency domain.
Forward,
/// From the frequency domain to the time domain.
Inverse,
}
pub fn transform(data: &mut [f64], _: Direction) {
let n = data.len();
if n < 2 || n & (n - 1) != 0 {
panic!("the data size should be a power of two");
}
}
<|file_sep|>updated/src/lib.rs
/// A direction of the Fourier transform.
pub enum Direction {
/// From the time domain to the frequency domain.
Forward,
/// From the frequency domain to the time domain.
Inverse,
}
/// Perform the Fourier transform.
pub fn transform(data: &mut [f64], direction: Direction) {
use std::f64::consts::PI;
let n = data.len() / 2;
let nn = n << 1;
let isign = match direction {
Direction::Forward => 1.0,
Direction::Inverse => -1.0,
};
| c47196e3e91382faa62200487f07c4014daa7065 | src/lib.rs | src/lib.rs | Rust |
<|file_sep|>original/plata/product/producer/models.py
from datetime import datetime
from django.db import models
from django.db.models import Sum, signals
from django.utils.translation import ugettext_lazy as _
from plata.product.modules.options.models import Product
class ProducerManager(models.Manager):
def active(self):
return self.filter(is_active=True)
class Producer(models.Model):
"""
Optional producer field for every product
"""
is_active = models.BooleanField(_('is active'), default=True)
name = models.CharField(_('name'), max_length=100)
<|file_sep|>current/plata/product/producer/models.py
from datetime import datetime
from django.db import models
from django.db.models import Sum, signals
from django.utils.translation import ugettext_lazy as _
from plata.product.modules.options.models import Product
class ProducerManager(models.Manager):
def active(self):
return self.filter(is_active=True)
class Producer(models.Model):
"""
Optional producer field for every product
"""
is_active = models.BooleanField(_('is active'), default=True)
name = models.CharField(_('name'), max_length=100)
<|file_sep|>updated/plata/product/producer/models.py | from datetime import datetime
from django.db import models
from django.db.models import Sum, signals
from django.utils.translation import ugettext_lazy as _
from plata.product.modules.options.models import Product # FIXME
class ProducerManager(models.Manager):
def active(self):
return self.filter(is_active=True)
class Producer(models.Model):
"""
Optional producer field for every product
"""
is_active = models.BooleanField(_('is active'), default=True)
name = models.CharField(_('name'), max_length=100) | <|file_sep|>original/plata/product/producer/models.py
from datetime import datetime
from django.db import models
from django.db.models import Sum, signals
from django.utils.translation import ugettext_lazy as _
from plata.product.modules.options.models import Product
class ProducerManager(models.Manager):
def active(self):
return self.filter(is_active=True)
class Producer(models.Model):
"""
Optional producer field for every product
"""
is_active = models.BooleanField(_('is active'), default=True)
name = models.CharField(_('name'), max_length=100)
<|file_sep|>current/plata/product/producer/models.py
from datetime import datetime
from django.db import models
from django.db.models import Sum, signals
from django.utils.translation import ugettext_lazy as _
from plata.product.modules.options.models import Product
class ProducerManager(models.Manager):
def active(self):
return self.filter(is_active=True)
class Producer(models.Model):
"""
Optional producer field for every product
"""
is_active = models.BooleanField(_('is active'), default=True)
name = models.CharField(_('name'), max_length=100)
<|file_sep|>updated/plata/product/producer/models.py
from datetime import datetime
from django.db import models
from django.db.models import Sum, signals
from django.utils.translation import ugettext_lazy as _
from plata.product.modules.options.models import Product # FIXME
class ProducerManager(models.Manager):
def active(self):
return self.filter(is_active=True)
class Producer(models.Model):
"""
Optional producer field for every product
"""
is_active = models.BooleanField(_('is active'), default=True)
name = models.CharField(_('name'), max_length=100) | a4b20f88eeafa3667cc47c92736241069f278e9e | plata/product/producer/models.py | plata/product/producer/models.py | Python |
<|file_sep|>original/docker-dev.sh
#!/bin/bash
dockerd-entrypoint.sh --log-level "${DOCKERD_LOG_LEVEL:-warn}" --storage-driver "${DOCKERD_STORAGE_DRIVER}" &
DOCKERD_PID="$!"
while ! docker version 1>/dev/null; do
kill -0 "$!" || exit
sleep .5
done
bash "$@"
exit_code="$?"
kill %1
exit "$exit_code"
<|file_sep|>current/docker-dev.sh
#!/bin/bash
dockerd-entrypoint.sh --log-level "${DOCKERD_LOG_LEVEL:-warn}" --storage-driver "${DOCKERD_STORAGE_DRIVER}" &
DOCKERD_PID="$!"
while ! docker version 1>/dev/null; do
kill -0 "$!" || exit
sleep .5
done
bash "$@"
exit_code="$?"
kill %1
exit "$exit_code"
<|file_sep|>updated/docker-dev.sh | #!/bin/bash
dockerd-entrypoint.sh --log-level "${DOCKERD_LOG_LEVEL:-warn}" --storage-driver "${DOCKERD_STORAGE_DRIVER:-vfs}" &
DOCKERD_PID="$!"
while ! docker version 1>/dev/null; do
kill -0 "$!" || exit
sleep .5
done
bash "$@"
exit_code="$?"
kill %1
exit "$exit_code" | <|file_sep|>original/docker-dev.sh
#!/bin/bash
dockerd-entrypoint.sh --log-level "${DOCKERD_LOG_LEVEL:-warn}" --storage-driver "${DOCKERD_STORAGE_DRIVER}" &
DOCKERD_PID="$!"
while ! docker version 1>/dev/null; do
kill -0 "$!" || exit
sleep .5
done
bash "$@"
exit_code="$?"
kill %1
exit "$exit_code"
<|file_sep|>current/docker-dev.sh
#!/bin/bash
dockerd-entrypoint.sh --log-level "${DOCKERD_LOG_LEVEL:-warn}" --storage-driver "${DOCKERD_STORAGE_DRIVER}" &
DOCKERD_PID="$!"
while ! docker version 1>/dev/null; do
kill -0 "$!" || exit
sleep .5
done
bash "$@"
exit_code="$?"
kill %1
exit "$exit_code"
<|file_sep|>updated/docker-dev.sh
#!/bin/bash
dockerd-entrypoint.sh --log-level "${DOCKERD_LOG_LEVEL:-warn}" --storage-driver "${DOCKERD_STORAGE_DRIVER:-vfs}" &
DOCKERD_PID="$!"
while ! docker version 1>/dev/null; do
kill -0 "$!" || exit
sleep .5
done
bash "$@"
exit_code="$?"
kill %1
exit "$exit_code" | 16b3e0e7ffd0b0e22e9a72de7fbf6db03a0577a3 | docker-dev.sh | docker-dev.sh | Shell |
<|file_sep|>original/test/bedecked-test.js
it('should not generate an error', function() {
expect(err).to.not.be.ok;
});
it('should split into slides', function() {
expect($('.slides > section').length).to.equal(4);
});
it('should add styles and scripts to the page', function() {
/**
* @todo
*/
expect(false).to.be.true;
});
});
describe('api custom opts', function() {
/**
* @todo
*/
});
<|file_sep|>current/test/bedecked-test.js
it('should not generate an error', function() {
expect(err).to.not.be.ok;
});
it('should split into slides', function() {
expect($('.slides > section').length).to.equal(4);
});
it('should add styles and scripts to the page', function() {
/**
* @todo
*/
expect(false).to.be.true;
});
});
describe('api custom opts', function() {
/**
* @todo
*/
});
<|file_sep|>updated/test/bedecked-test.js | it('should not generate an error', function() {
expect(err).to.not.be.ok;
});
it('should split into slides', function() {
expect($('.slides > section').length).to.equal(4);
});
it('should add styles and scripts to the page', function() {
expect($('link#reveal-core').length).to.equal(1);
expect($('link#reveal-theme').length).to.equal(1);
expect($('script#reveal-core').length).to.equal(1);
expect($('script#reveal-init').length).to.equal(1);
});
});
describe('api custom opts', function() {
/**
* @todo
*/
}); | <|file_sep|>original/test/bedecked-test.js
it('should not generate an error', function() {
expect(err).to.not.be.ok;
});
it('should split into slides', function() {
expect($('.slides > section').length).to.equal(4);
});
it('should add styles and scripts to the page', function() {
/**
* @todo
*/
expect(false).to.be.true;
});
});
describe('api custom opts', function() {
/**
* @todo
*/
});
<|file_sep|>current/test/bedecked-test.js
it('should not generate an error', function() {
expect(err).to.not.be.ok;
});
it('should split into slides', function() {
expect($('.slides > section').length).to.equal(4);
});
it('should add styles and scripts to the page', function() {
/**
* @todo
*/
expect(false).to.be.true;
});
});
describe('api custom opts', function() {
/**
* @todo
*/
});
<|file_sep|>updated/test/bedecked-test.js
it('should not generate an error', function() {
expect(err).to.not.be.ok;
});
it('should split into slides', function() {
expect($('.slides > section').length).to.equal(4);
});
it('should add styles and scripts to the page', function() {
expect($('link#reveal-core').length).to.equal(1);
expect($('link#reveal-theme').length).to.equal(1);
expect($('script#reveal-core').length).to.equal(1);
expect($('script#reveal-init').length).to.equal(1);
});
});
describe('api custom opts', function() {
/**
* @todo
*/
});
| 3c414d17c0e74165d2bef3e43d389773caa3080c | test/bedecked-test.js | test/bedecked-test.js | JavaScript |
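The JavaScript row above replaces `expect(false).to.be.true` placeholders with concrete assertions on injected `<link>` and `<script>` tags. A stdlib-Python sketch of the same move, asserting on parsed markup instead of a placeholder; the snippet and helper names are invented, and only the element ids mirror the mocha test:

```python
from html.parser import HTMLParser

class IdCollector(HTMLParser):
    """Record (tag, id) pairs seen in the markup so tests can assert on them."""
    def __init__(self):
        super().__init__()
        self.seen = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "id" in attrs:
            self.seen.append((tag, attrs["id"]))

def ids_in(markup):
    parser = IdCollector()
    parser.feed(markup)
    return parser.seen

# Markup standing in for the page the bedecked test renders:
page = ('<link id="reveal-core"><link id="reveal-theme">'
        '<script id="reveal-core"></script><script id="reveal-init"></script>')

# Concrete assertions, like the updated mocha test; not expect(false).
assert ids_in(page).count(("link", "reveal-core")) == 1
assert ids_in(page).count(("script", "reveal-init")) == 1
```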
<|file_sep|>original/sideloader/templates/accounts_profile.html
{% include "fragments/head.html" with nav_active="home" %}
<div class="row-fluid">
<div class="span9">
<h4>Account settings: {{user.username}}</h4>
<p>
<form method="POST" class="form-horizontal">
{% csrf_token %}
{{form}}
<div class="control-group form-actions">
<div class="controls">
<input type="submit" class="btn btn-success" value="Save">
</div>
</div>
</form>
</p>
</div>
</div>
{% include "fragments/foot.html" %}
<|file_sep|>current/sideloader/templates/accounts_profile.html
{% include "fragments/head.html" with nav_active="home" %}
<div class="row-fluid">
<div class="span9">
<h4>Account settings: {{user.username}}</h4>
<p>
<form method="POST" class="form-horizontal">
{% csrf_token %}
{{form}}
<div class="control-group form-actions">
<div class="controls">
<input type="submit" class="btn btn-success" value="Save">
</div>
</div>
</form>
</p>
</div>
</div>
{% include "fragments/foot.html" %}
<|file_sep|>updated/sideloader/templates/accounts_profile.html
{% include "fragments/head.html" %}
<div class="row-fluid">
<div class="span9">
<h4>Account settings: {{user.username}}</h4>
<p>
<form method="POST" class="form-horizontal">
{% csrf_token %}
{{form}}
<div class="control-group form-actions">
<div class="controls">
<input type="submit" class="btn btn-success" value="Save">
</div>
</div>
</form>
</p>
</div>
</div>
{% include "fragments/foot.html" %} | <|file_sep|>original/sideloader/templates/accounts_profile.html
{% include "fragments/head.html" with nav_active="home" %}
<div class="row-fluid">
<div class="span9">
<h4>Account settings: {{user.username}}</h4>
<p>
<form method="POST" class="form-horizontal">
{% csrf_token %}
{{form}}
<div class="control-group form-actions">
<div class="controls">
<input type="submit" class="btn btn-success" value="Save">
</div>
</div>
</form>
</p>
</div>
</div>
{% include "fragments/foot.html" %}
<|file_sep|>current/sideloader/templates/accounts_profile.html
{% include "fragments/head.html" with nav_active="home" %}
<div class="row-fluid">
<div class="span9">
<h4>Account settings: {{user.username}}</h4>
<p>
<form method="POST" class="form-horizontal">
{% csrf_token %}
{{form}}
<div class="control-group form-actions">
<div class="controls">
<input type="submit" class="btn btn-success" value="Save">
</div>
</div>
</form>
</p>
</div>
</div>
{% include "fragments/foot.html" %}
<|file_sep|>updated/sideloader/templates/accounts_profile.html
{% include "fragments/head.html" %}
<div class="row-fluid">
<div class="span9">
<h4>Account settings: {{user.username}}</h4>
<p>
<form method="POST" class="form-horizontal">
{% csrf_token %}
{{form}}
<div class="control-group form-actions">
<div class="controls">
<input type="submit" class="btn btn-success" value="Save">
</div>
</div>
</form>
</p>
</div>
</div>
{% include "fragments/foot.html" %} | 8e656703194aea4dbb7722a3394ae81c0b7223d6 | sideloader/templates/accounts_profile.html | sideloader/templates/accounts_profile.html | HTML |
<|file_sep|>original/pages/overview/overview_api_provide.md
sidebar: overview_sidebar
permalink: overview_api_provide.html
summary: Providing an RESTful API for the first time is a journey. This page explains a starting point of the work involved in providing an API and also the part be concentrated on by Care Connect
---
{% include important.html content="All phases outlined below are indicative and subject to on-going review." %}
# How to provide an API #
The following diagram explains the elements of APIs allowing a the development of APIs:
{% include custom/provide_api.svg %}
NHS Digital is contributing to progressing the profile development as described below. Invitations are open for the INTEROPen community to get involved and progress the wider developer ecosystem as defined above.
Please see the explanation of the complete development roadmap.
# Approach to developing an API #
The current site focuses on a typical API Developer's Journey as highlighted by the green boxes below in the developer journey:
<|file_sep|>current/pages/overview/overview_api_provide.md
sidebar: overview_sidebar
permalink: overview_api_provide.html
summary: Providing an RESTful API for the first time is a journey. This page explains a starting point of the work involved in providing an API and also the part be concentrated on by Care Connect
---
{% include important.html content="All phases outlined below are indicative and subject to on-going review." %}
# How to provide an API #
The following diagram explains the elements of APIs allowing a the development of APIs:
{% include custom/provide_api.svg %}
NHS Digital is contributing to progressing the profile development as described below. Invitations are open for the INTEROPen community to get involved and progress the wider developer ecosystem as defined above.
Please see the explanation of the complete development roadmap.
# Approach to developing an API #
The current site focuses on a typical API Developer's Journey as highlighted by the green boxes below in the developer journey:
<|file_sep|>updated/pages/overview/overview_api_provide.md
sidebar: overview_sidebar
permalink: overview_api_provide.html
summary: Providing an RESTful API for the first time is a journey. This page explains a starting point of the work involved in providing an API and also the part be concentrated on by Care Connect
---
{% include important.html content="All phases outlined below are indicative and subject to on-going review." %}
# How to provide an API #
The diagram below explains the parts involded in providing APIs. This implementation guide provides :
- the starting point of providing APIs (in white);
- descriptions and pointers of elements to getting working APIs within various infrastructures (in green); and
- considerations when managing live API services (in yellow)
{% include custom/provide_api.svg %}
NHS Digital is contributing to progressing the profile development as described below. Invitations are open for the INTEROPen community to get involved and progress the wider developer ecosystem as defined above.
Please see the explanation of the complete development roadmap.
<|file_sep|>original/pages/overview/overview_api_provide.md
sidebar: overview_sidebar
permalink: overview_api_provide.html
summary: Providing an RESTful API for the first time is a journey. This page explains a starting point of the work involved in providing an API and also the part be concentrated on by Care Connect
---
{% include important.html content="All phases outlined below are indicative and subject to on-going review." %}
# How to provide an API #
The following diagram explains the elements of APIs allowing a the development of APIs:
{% include custom/provide_api.svg %}
NHS Digital is contributing to progressing the profile development as described below. Invitations are open for the INTEROPen community to get involved and progress the wider developer ecosystem as defined above.
Please see the explanation of the complete development roadmap.
# Approach to developing an API #
The current site focuses on a typical API Developer's Journey as highlighted by the green boxes below in the developer journey:
<|file_sep|>current/pages/overview/overview_api_provide.md
sidebar: overview_sidebar
permalink: overview_api_provide.html
summary: Providing an RESTful API for the first time is a journey. This page explains a starting point of the work involved in providing an API and also the part be concentrated on by Care Connect
---
{% include important.html content="All phases outlined below are indicative and subject to on-going review." %}
# How to provide an API #
The following diagram explains the elements of APIs allowing a the development of APIs:
{% include custom/provide_api.svg %}
NHS Digital is contributing to progressing the profile development as described below. Invitations are open for the INTEROPen community to get involved and progress the wider developer ecosystem as defined above.
Please see the explanation of the complete development roadmap.
# Approach to developing an API #
The current site focuses on a typical API Developer's Journey as highlighted by the green boxes below in the developer journey:
<|file_sep|>updated/pages/overview/overview_api_provide.md
sidebar: overview_sidebar
permalink: overview_api_provide.html
summary: Providing an RESTful API for the first time is a journey. This page explains a starting point of the work involved in providing an API and also the part be concentrated on by Care Connect
---
{% include important.html content="All phases outlined below are indicative and subject to on-going review." %}
# How to provide an API #
The diagram below explains the parts involded in providing APIs. This implementation guide provides :
- the starting point of providing APIs (in white);
- descriptions and pointers of elements to getting working APIs within various infrastructures (in green); and
- considerations when managing live API services (in yellow)
{% include custom/provide_api.svg %}
NHS Digital is contributing to progressing the profile development as described below. Invitations are open for the INTEROPen community to get involved and progress the wider developer ecosystem as defined above.
Please see the explanation of the complete development roadmap.
| 44e986afe2772e2c4caf4b91a6d2f27673b50398 | pages/overview/overview_api_provide.md | pages/overview/overview_api_provide.md | Markdown |
<|file_sep|>original/.travis.yml
language: java
jdk:
- oraclejdk7
- openjdk7
before_install: 'mvn -version'
install: 'mvn install -Pfull -DskipTests'
<|file_sep|>current/.travis.yml
language: java
jdk:
- oraclejdk7
- openjdk7
before_install: 'mvn -version'
install: 'mvn install -Pfull -DskipTests'
<|file_sep|>updated/.travis.yml
language: java
jdk:
- oraclejdk7
- openjdk7
before_install: 'mvn -version'
install: 'mvn clean install -Pfull -DskipTests'
<|file_sep|>original/.travis.yml
language: java
jdk:
- oraclejdk7
- openjdk7
before_install: 'mvn -version'
install: 'mvn install -Pfull -DskipTests'
<|file_sep|>current/.travis.yml
language: java
jdk:
- oraclejdk7
- openjdk7
before_install: 'mvn -version'
install: 'mvn install -Pfull -DskipTests'
<|file_sep|>updated/.travis.yml
language: java
jdk:
- oraclejdk7
- openjdk7
before_install: 'mvn -version'
install: 'mvn clean install -Pfull -DskipTests'
| 5bcde105576193a4b225fe4f1b167d2e4ab44e4e | .travis.yml | .travis.yml | YAML |
<|file_sep|>original/requirements.txt
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
# Make sure that the package versions in minimum-constraints.txt are also
# the minimum versions required in requirements.txt and dev-requirements.txt.
# Direct dependencies (except pip, setuptools, wheel, pbr):
# TODO: Switch to zhmcclient>=0.18.0 once released.
git+https://github.com/zhmcclient/python-zhmcclient.git@andy/cli-split#egg=zhmcclient
# zhmcclient>=0.18.0 # Apache
click>=6.6 # BSD
click-repl>=0.1.0 # MIT
click-spinner>=0.1.6 # MIT
progressbar2>=3.12.0 # BSD
six>=1.10.0 # MIT
tabulate>=0.7.7 # MIT
pyreadline>=2.1; sys_platform == "win32" # BSD
<|file_sep|>current/requirements.txt
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
# Make sure that the package versions in minimum-constraints.txt are also
# the minimum versions required in requirements.txt and dev-requirements.txt.
# Direct dependencies (except pip, setuptools, wheel, pbr):
# TODO: Switch to zhmcclient>=0.18.0 once released.
git+https://github.com/zhmcclient/python-zhmcclient.git@andy/cli-split#egg=zhmcclient
# zhmcclient>=0.18.0 # Apache
click>=6.6 # BSD
click-repl>=0.1.0 # MIT
click-spinner>=0.1.6 # MIT
progressbar2>=3.12.0 # BSD
six>=1.10.0 # MIT
tabulate>=0.7.7 # MIT
pyreadline>=2.1; sys_platform == "win32" # BSD
<|file_sep|>updated/requirements.txt
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
# Make sure that the package versions in minimum-constraints.txt are also
# the minimum versions required in requirements.txt and dev-requirements.txt.
# Direct dependencies (except pip, setuptools, wheel, pbr):
# TODO: Switch to zhmcclient>=0.18.0 once released.
git+https://github.com/zhmcclient/python-zhmcclient.git@master#egg=zhmcclient
# zhmcclient>=0.18.0 # Apache
click>=6.6 # BSD
click-repl>=0.1.0 # MIT
click-spinner>=0.1.6 # MIT
progressbar2>=3.12.0 # BSD
six>=1.10.0 # MIT
tabulate>=0.7.7 # MIT
pyreadline>=2.1; sys_platform == "win32" # BSD
<|file_sep|>original/requirements.txt
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
# Make sure that the package versions in minimum-constraints.txt are also
# the minimum versions required in requirements.txt and dev-requirements.txt.
# Direct dependencies (except pip, setuptools, wheel, pbr):
# TODO: Switch to zhmcclient>=0.18.0 once released.
git+https://github.com/zhmcclient/python-zhmcclient.git@andy/cli-split#egg=zhmcclient
# zhmcclient>=0.18.0 # Apache
click>=6.6 # BSD
click-repl>=0.1.0 # MIT
click-spinner>=0.1.6 # MIT
progressbar2>=3.12.0 # BSD
six>=1.10.0 # MIT
tabulate>=0.7.7 # MIT
pyreadline>=2.1; sys_platform == "win32" # BSD
<|file_sep|>current/requirements.txt
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
# Make sure that the package versions in minimum-constraints.txt are also
# the minimum versions required in requirements.txt and dev-requirements.txt.
# Direct dependencies (except pip, setuptools, wheel, pbr):
# TODO: Switch to zhmcclient>=0.18.0 once released.
git+https://github.com/zhmcclient/python-zhmcclient.git@andy/cli-split#egg=zhmcclient
# zhmcclient>=0.18.0 # Apache
click>=6.6 # BSD
click-repl>=0.1.0 # MIT
click-spinner>=0.1.6 # MIT
progressbar2>=3.12.0 # BSD
six>=1.10.0 # MIT
tabulate>=0.7.7 # MIT
pyreadline>=2.1; sys_platform == "win32" # BSD
<|file_sep|>updated/requirements.txt
# of appearance. Changing the order has an impact on the overall integration
# process, which may cause wedges in the gate later.
# Make sure that the package versions in minimum-constraints.txt are also
# the minimum versions required in requirements.txt and dev-requirements.txt.
# Direct dependencies (except pip, setuptools, wheel, pbr):
# TODO: Switch to zhmcclient>=0.18.0 once released.
git+https://github.com/zhmcclient/python-zhmcclient.git@master#egg=zhmcclient
# zhmcclient>=0.18.0 # Apache
click>=6.6 # BSD
click-repl>=0.1.0 # MIT
click-spinner>=0.1.6 # MIT
progressbar2>=3.12.0 # BSD
six>=1.10.0 # MIT
tabulate>=0.7.7 # MIT
pyreadline>=2.1; sys_platform == "win32" # BSD
| 198814141d7236551efab0d0dde2eb86f95c074c | requirements.txt | requirements.txt | Text |
<|file_sep|>original/stylesheets/default.css
border-radius: 0 !important;
}
/*Module Rules*/
nav {
margin-top: 30px;
border-radius: 0 !important;
}
ul .nav-info:first-child {
margin-left: 30%;
}
.nav-info {
font-size: 130%;
margin-right: 5%;
}
.nav-info a {
color: #fff !important;
}
<|file_sep|>current/stylesheets/default.css
border-radius: 0 !important;
}
/*Module Rules*/
nav {
margin-top: 30px;
border-radius: 0 !important;
}
ul .nav-info:first-child {
margin-left: 30%;
}
.nav-info {
font-size: 130%;
margin-right: 5%;
}
.nav-info a {
color: #fff !important;
}
<|file_sep|>updated/stylesheets/default.css
border-radius: 0 !important;
}
/*Module Rules*/
nav {
margin-top: 30px;
border-radius: 0 !important;
}
ul .nav-info:first-child {
margin-left: 37%;
}
.nav-info {
font-size: 130%;
margin-right: 5%;
}
.nav-info a {
color: #fff !important;
}
<|file_sep|>original/stylesheets/default.css
border-radius: 0 !important;
}
/*Module Rules*/
nav {
margin-top: 30px;
border-radius: 0 !important;
}
ul .nav-info:first-child {
margin-left: 30%;
}
.nav-info {
font-size: 130%;
margin-right: 5%;
}
.nav-info a {
color: #fff !important;
}
<|file_sep|>current/stylesheets/default.css
border-radius: 0 !important;
}
/*Module Rules*/
nav {
margin-top: 30px;
border-radius: 0 !important;
}
ul .nav-info:first-child {
margin-left: 30%;
}
.nav-info {
font-size: 130%;
margin-right: 5%;
}
.nav-info a {
color: #fff !important;
}
<|file_sep|>updated/stylesheets/default.css
border-radius: 0 !important;
}
/*Module Rules*/
nav {
margin-top: 30px;
border-radius: 0 !important;
}
ul .nav-info:first-child {
margin-left: 37%;
}
.nav-info {
font-size: 130%;
margin-right: 5%;
}
.nav-info a {
color: #fff !important;
}
| f837b692d687ad1dcf90495c6d644c25d7429a2a | stylesheets/default.css | stylesheets/default.css | CSS |
<|file_sep|>js/jquery.tableCopy.js.diff
original:
jQuery.fn.tableCopy = function() {
updated:
jQuery.fn.tableCopy = function(filter) {
<|file_sep|>original/js/jquery.tableCopy.js
/* ============================================================
tableCopy plugin
Copy the selected table to the clipboard
Returns a promise, which resolves true if the copy
succeeds, false if it is cancelled.
To Do: clean up hidden table cells, links, etc.
*/
;(function() {
"use strict";
jQuery.fn.tableCopy = function() {
var deferred = new jQuery.Deferred();
var clone = this.clone();
var wrapper = $('<div>');
wrapper.append(clone);
var html = wrapper.html();
return qcode.copy(html);
}
})();
<|file_sep|>current/js/jquery.tableCopy.js
/* ============================================================
tableCopy plugin
Copy the selected table to the clipboard
Returns a promise, which resolves true if the copy
succeeds, false if it is cancelled.
To Do: clean up hidden table cells, links, etc.
*/
;(function() {
"use strict";
jQuery.fn.tableCopy = function(filter) {
var deferred = new jQuery.Deferred();
var clone = this.clone();
var wrapper = $('<div>');
wrapper.append(clone);
var html = wrapper.html();
return qcode.copy(html);
}
})();
<|file_sep|>updated/js/jquery.tableCopy.js
tableCopy plugin
Copy the selected table to the clipboard
Returns a promise, which resolves true if the copy
succeeds, false if it is cancelled.
To Do: clean up hidden table cells, links, etc.
*/
;(function() {
"use strict";
jQuery.fn.tableCopy = function(filter) {
var deferred = new jQuery.Deferred();
var clone = this.clone();
if ( typeof filter === "function" ) {
clone = filter(clone);
}
var wrapper = $('<div>');
wrapper.append(clone);
var html = wrapper.html();
return qcode.copy(html);
}
})();
<|file_sep|>js/jquery.tableCopy.js.diff
original:
jQuery.fn.tableCopy = function() {
updated:
jQuery.fn.tableCopy = function(filter) {
<|file_sep|>original/js/jquery.tableCopy.js
/* ============================================================
tableCopy plugin
Copy the selected table to the clipboard
Returns a promise, which resolves true if the copy
succeeds, false if it is cancelled.
To Do: clean up hidden table cells, links, etc.
*/
;(function() {
"use strict";
jQuery.fn.tableCopy = function() {
var deferred = new jQuery.Deferred();
var clone = this.clone();
var wrapper = $('<div>');
wrapper.append(clone);
var html = wrapper.html();
return qcode.copy(html);
}
})();
<|file_sep|>current/js/jquery.tableCopy.js
/* ============================================================
tableCopy plugin
Copy the selected table to the clipboard
Returns a promise, which resolves true if the copy
succeeds, false if it is cancelled.
To Do: clean up hidden table cells, links, etc.
*/
;(function() {
"use strict";
jQuery.fn.tableCopy = function(filter) {
var deferred = new jQuery.Deferred();
var clone = this.clone();
var wrapper = $('<div>');
wrapper.append(clone);
var html = wrapper.html();
return qcode.copy(html);
}
})();
<|file_sep|>updated/js/jquery.tableCopy.js
tableCopy plugin
Copy the selected table to the clipboard
Returns a promise, which resolves true if the copy
succeeds, false if it is cancelled.
To Do: clean up hidden table cells, links, etc.
*/
;(function() {
"use strict";
jQuery.fn.tableCopy = function(filter) {
var deferred = new jQuery.Deferred();
var clone = this.clone();
if ( typeof filter === "function" ) {
clone = filter(clone);
}
var wrapper = $('<div>');
wrapper.append(clone);
var html = wrapper.html();
return qcode.copy(html);
}
})();
| bf9afa04a4256f87d404f5f761493ad55cfbef90 | js/jquery.tableCopy.js | js/jquery.tableCopy.js | JavaScript |
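The final row threads an optional `filter` callback through `tableCopy`, applied to the clone before it is serialized for the clipboard. A minimal Python sketch of that clone, optional filter, then serialize pattern; the function name and the tab-separated output format are invented for illustration:

```python
import copy

def table_copy(rows, filter=None):
    """Clone the table, optionally pass the clone through a caller-supplied
    filter, then serialize it, mirroring the jQuery plugin's new behaviour.
    The tab/newline serialization stands in for the HTML string the plugin
    hands to the clipboard."""
    clone = copy.deepcopy(rows)    # like this.clone(): never mutate the original
    if callable(filter):
        clone = filter(clone)      # e.g. strip hidden cells before copying
    return "\n".join("\t".join(str(cell) for cell in row) for row in clone)

# Default behaviour, and a filter that keeps only the first column:
plain = table_copy([[1, 2], [3, 4]])
first_col = table_copy([[1, 2], [3, 4]], lambda rows: [row[:1] for row in rows])
```

Because the filter receives (and returns) the clone, callers can prune or rewrite content freely without touching the original table, which is the point of cloning before filtering in the jQuery version as well.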