commit stringlengths 40 40 | old_file stringlengths 4 184 | new_file stringlengths 4 184 | old_contents stringlengths 1 3.6k | new_contents stringlengths 5 3.38k | subject stringlengths 15 778 | message stringlengths 16 6.74k | lang stringclasses 201 values | license stringclasses 13 values | repos stringlengths 6 116k | config stringclasses 201 values | content stringlengths 137 7.24k | diff stringlengths 26 5.55k | diff_length int64 1 123 | relative_diff_length float64 0.01 89 | n_lines_added int64 0 108 | n_lines_deleted int64 0 106 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
d994bd7fc01018bf4e0587d48e11d56085484293 | src/views/pager.jade | src/views/pager.jade | div
ul.pagination.ng-cloak
li(ng-class="{'disabled': !page.active}", ng-repeat='page in pages', ng-switch='page.type')
a(ng-switch-when='prev', ng-click='goToPage(page.number)', href='') «
a(ng-switch-when='first', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='page', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='more', ng-click='goToPage(page.number)', href='') …
a(ng-switch-when='last', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='next', ng-click='goToPage(page.number)', href='') »
.btn-group.pull-right(ng-show="params.counts.length")
button.btn.btn-mini(ng-repeat="count in params.counts", type='button', ng-class="{'active':params.count == count}", ng-click='changeCount(count)') {{count}}
| div.pagintaion.ng-cloak
ul
li(ng-class="{'disabled': !page.active}", ng-repeat='page in pages', ng-switch='page.type')
a(ng-switch-when='prev', ng-click='goToPage(page.number)', href='') «
a(ng-switch-when='first', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='page', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='more', ng-click='goToPage(page.number)', href='') …
a(ng-switch-when='last', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='next', ng-click='goToPage(page.number)', href='') »
.btn-group.pull-right(ng-show="params.counts.length")
button.btn.btn-mini(ng-repeat="count in params.counts", type='button', ng-class="{'active':params.count == count}", ng-click='changeCount(count)') {{count}}
| Fix for pagination classes on Twitter Bootstrap | Fix for pagination classes on Twitter Bootstrap
| Jade | bsd-3-clause | astik/ng-table,mihnsen/ng-table,SoftwareKing/ng-table,esvit/ng-table,elvis-macak/ng-table,joomlavui/ng-table,seawenzhu/ng-table,brunoosilva/ng-table,tb/ng-table,xwolf12/ng-table,SoftwareKing/ng-table,amir-s/ng-table,xwolf12/ng-table,esvit/ng-table,jespinoza711/ng-table,tb/ng-table,pcmoen/ng-table,jespinoza711/ng-table,jwohlfert23/ng-table-expanded,modulexcite/ng-table,bnicart/ng-table,raimohanska/ng-table,faisalferoz/ng-table,jihgao/ng-table,christianacca/ng-table,projectbid7/ng-table,280455936/ng-table,tony-cui/ng-table,pcmoen/ng-table,russell/ng-table,mihnsen/ng-table,realrunner/ng-table,Equalteam/test-220157d9-851d-457e-a789-f2b10c4cf97d-5starsmedia,Zazulei/ng-table,projectbid7/ng-table,Equalteam/test-220157d9-851d-457e-a789-f2b10c4cf97d-5starsmedia,kostakoida/ng-table,aslubsky/ng-table,raimohanska/ng-table,witcxc/ng-table,seawenzhu/ng-table,aslubsky/ng-table,anthony-legible/ng-table,esvit/ng-table,russell/ng-table,enkodellc/ng-table,enkodellc/ng-table,anthony-legible/ng-table,astik/ng-table,tpsumeta/ng-table,tpsumeta/ng-table,modulexcite/ng-table,christianacca/ng-table,faisalferoz/ng-table,tony-cui/ng-table,elvis-macak/ng-table,kostakoida/ng-table,amir-s/ng-table,brunoosilva/ng-table,juracy/ng-table,bnicart/ng-table,280455936/ng-table,joomlavui/ng-table,jwohlfert23/ng-table-expanded,jihgao/ng-table,juracy/ng-table,realrunner/ng-table,Zazulei/ng-table,PeterDaveHello/ng-table | jade | ## Code Before:
div
ul.pagination.ng-cloak
li(ng-class="{'disabled': !page.active}", ng-repeat='page in pages', ng-switch='page.type')
a(ng-switch-when='prev', ng-click='goToPage(page.number)', href='') «
a(ng-switch-when='first', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='page', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='more', ng-click='goToPage(page.number)', href='') …
a(ng-switch-when='last', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='next', ng-click='goToPage(page.number)', href='') »
.btn-group.pull-right(ng-show="params.counts.length")
button.btn.btn-mini(ng-repeat="count in params.counts", type='button', ng-class="{'active':params.count == count}", ng-click='changeCount(count)') {{count}}
## Instruction:
Fix for pagination classes on Twitter Bootstrap
## Code After:
div.pagintaion.ng-cloak
ul
li(ng-class="{'disabled': !page.active}", ng-repeat='page in pages', ng-switch='page.type')
a(ng-switch-when='prev', ng-click='goToPage(page.number)', href='') «
a(ng-switch-when='first', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='page', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='more', ng-click='goToPage(page.number)', href='') …
a(ng-switch-when='last', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='next', ng-click='goToPage(page.number)', href='') »
.btn-group.pull-right(ng-show="params.counts.length")
button.btn.btn-mini(ng-repeat="count in params.counts", type='button', ng-class="{'active':params.count == count}", ng-click='changeCount(count)') {{count}}
| - div
- ul.pagination.ng-cloak
? ^^^^ -
+ div.pagintaion.ng-cloak
? ^^^ +
+ ul
li(ng-class="{'disabled': !page.active}", ng-repeat='page in pages', ng-switch='page.type')
a(ng-switch-when='prev', ng-click='goToPage(page.number)', href='') «
a(ng-switch-when='first', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='page', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='more', ng-click='goToPage(page.number)', href='') …
a(ng-switch-when='last', ng-click='goToPage(page.number)', href='') {{page.number}}
a(ng-switch-when='next', ng-click='goToPage(page.number)', href='') »
.btn-group.pull-right(ng-show="params.counts.length")
button.btn.btn-mini(ng-repeat="count in params.counts", type='button', ng-class="{'active':params.count == count}", ng-click='changeCount(count)') {{count}} | 4 | 0.363636 | 2 | 2 |
59f46dbd8e986f40f4d595bfb93e094ddc660658 | compiler/src/test/java/com/rejasupotaro/android/kvs/internal/StringUtilsTest.java | compiler/src/test/java/com/rejasupotaro/android/kvs/internal/StringUtilsTest.java | package com.rejasupotaro.android.kvs.internal;
import org.junit.Test;
import static org.assertj.core.api.Assertions.assertThat;
public class StringUtilsTest {
@Test
public void testCapitalize() {
assertThat(StringUtils.capitalize("username")).isEqualTo("Username");
}
}
| package com.rejasupotaro.android.kvs.internal;
import org.junit.Test;
import static org.assertj.core.api.Assertions.assertThat;
public class StringUtilsTest {
@Test
public void testCapitalize() {
assertThat(StringUtils.capitalize("username")).isEqualTo("Username");
}
@Test
public void testIsEmpty() {
assertThat(StringUtils.isEmpty(null)).isTrue();
assertThat(StringUtils.isEmpty("")).isTrue();
assertThat(StringUtils.isEmpty("hoge")).isFalse();
}
}
| Add a test for isEmpty. | Add a test for isEmpty.
| Java | mit | rejasupotaro/kvs-schema,hotchemi/kvs-schema,sys1yagi/kvs-schema,rejasupotaro/kvs-schema | java | ## Code Before:
package com.rejasupotaro.android.kvs.internal;
import org.junit.Test;
import static org.assertj.core.api.Assertions.assertThat;
public class StringUtilsTest {
@Test
public void testCapitalize() {
assertThat(StringUtils.capitalize("username")).isEqualTo("Username");
}
}
## Instruction:
Add a test for isEmpty.
## Code After:
package com.rejasupotaro.android.kvs.internal;
import org.junit.Test;
import static org.assertj.core.api.Assertions.assertThat;
public class StringUtilsTest {
@Test
public void testCapitalize() {
assertThat(StringUtils.capitalize("username")).isEqualTo("Username");
}
@Test
public void testIsEmpty() {
assertThat(StringUtils.isEmpty(null)).isTrue();
assertThat(StringUtils.isEmpty("")).isTrue();
assertThat(StringUtils.isEmpty("hoge")).isFalse();
}
}
| package com.rejasupotaro.android.kvs.internal;
import org.junit.Test;
import static org.assertj.core.api.Assertions.assertThat;
public class StringUtilsTest {
@Test
public void testCapitalize() {
assertThat(StringUtils.capitalize("username")).isEqualTo("Username");
}
+
+ @Test
+ public void testIsEmpty() {
+ assertThat(StringUtils.isEmpty(null)).isTrue();
+ assertThat(StringUtils.isEmpty("")).isTrue();
+ assertThat(StringUtils.isEmpty("hoge")).isFalse();
+ }
+
} | 8 | 0.666667 | 8 | 0 |
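The test added in this row pins down three cases for `StringUtils.isEmpty`: null and the empty string count as empty, any other value does not. A hypothetical Python analogue of that contract (the real helper is Java and its body is not shown in this row, so `is_empty` here is an assumption about its behavior inferred from the assertions):

```python
def is_empty(s):
    """True when s is None or an empty string, matching the three
    assertions in the new testIsEmpty test."""
    return s is None or len(s) == 0

# Mirrors the three Java assertions:
assert is_empty(None) is True
assert is_empty("") is True
assert is_empty("hoge") is False
```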
bee9373dcf852e7af9f0f1a78dcc17a0922f96fe | anchorhub/tests/test_main.py | anchorhub/tests/test_main.py | from nose.tools import *
import anchorhub.main as main
def test_one():
"""
main.py: Test defaults with local directory as input.
"""
main.main(['.'])
| from nose.tools import *
import anchorhub.main as main
from anchorhub.util.getanchorhubpath import get_anchorhub_path
from anchorhub.compatibility import get_path_separator
def test_one():
"""
main.py: Test defaults with local directory as input.
"""
main.main([get_anchorhub_path() + get_path_separator() +
'../sample/multi-file'])
| Modify main.py tests to use get_anchorhub_path() | Modify main.py tests to use get_anchorhub_path()
| Python | apache-2.0 | samjabrahams/anchorhub | python | ## Code Before:
from nose.tools import *
import anchorhub.main as main
def test_one():
"""
main.py: Test defaults with local directory as input.
"""
main.main(['.'])
## Instruction:
Modify main.py tests to use get_anchorhub_path()
## Code After:
from nose.tools import *
import anchorhub.main as main
from anchorhub.util.getanchorhubpath import get_anchorhub_path
from anchorhub.compatibility import get_path_separator
def test_one():
"""
main.py: Test defaults with local directory as input.
"""
main.main([get_anchorhub_path() + get_path_separator() +
'../sample/multi-file'])
| from nose.tools import *
import anchorhub.main as main
+ from anchorhub.util.getanchorhubpath import get_anchorhub_path
+ from anchorhub.compatibility import get_path_separator
def test_one():
"""
main.py: Test defaults with local directory as input.
"""
- main.main(['.'])
+ main.main([get_anchorhub_path() + get_path_separator() +
+ '../sample/multi-file']) | 5 | 0.5 | 4 | 1 |
70ba00fed8edda0b58040853cf1e4cdbf25b336f | providers/common.rb | providers/common.rb | action :install do
if new_resource.create_user
group new_resource.group do
action :create
end
user new_resource.owner do
action :create
gid new_resource.group
end
end
# Update the code.
git new_resource.path do
action :sync
repository new_resource.repository
checkout_branch new_resource.revision
destination new_resource.path
user new_resource.owner
group new_resource.group
end
# If a config file template has been specified, create it.
template new_resource.config_template do
only_if { !new_resource.config_template.nil }
action :create
source new_resource.config_template
path new_resource.config_destination
variables new_resource.config_vars
owner new_resource.owner
group new_resource.group
end
# Install the application requirements.
# If a requirements file has been specified, use pip.
# otherwise use the setup.py
if new_resource.requirements_file
execute 'pip install' do
action :run
cwd new_resource.path
command "pip install -r #{new_resource.requirements_file}"
end
else
execute 'python setup.py install' do
action :run
cwd new_resource.path
end
end
new_resource.updated_by_last_action(true)
end
| action :install do
if new_resource.create_user
group new_resource.group do
action :create
end
user new_resource.owner do
action :create
gid new_resource.group
end
end
directory new_resource.path do
action :create
owner new_resource.owner
group new_resource.group
end
# Update the code.
git new_resource.path do
action :sync
repository new_resource.repository
checkout_branch new_resource.revision
destination new_resource.path
user new_resource.owner
group new_resource.group
end
# If a config file template has been specified, create it.
template new_resource.config_template do
only_if { !new_resource.config_template.nil }
action :create
source new_resource.config_template
path new_resource.config_destination
variables new_resource.config_vars
owner new_resource.owner
group new_resource.group
end
# Install the application requirements.
# If a requirements file has been specified, use pip.
# otherwise use the setup.py
if new_resource.requirements_file
execute 'pip install' do
action :run
cwd new_resource.path
command "pip install -r #{new_resource.requirements_file}"
end
else
execute 'python setup.py install' do
action :run
cwd new_resource.path
end
end
new_resource.updated_by_last_action(true)
end
| Set up directory for checkout with proper perms | Set up directory for checkout with proper perms
| Ruby | apache-2.0 | osuosl-cookbooks/python-webapp | ruby | ## Code Before:
action :install do
if new_resource.create_user
group new_resource.group do
action :create
end
user new_resource.owner do
action :create
gid new_resource.group
end
end
# Update the code.
git new_resource.path do
action :sync
repository new_resource.repository
checkout_branch new_resource.revision
destination new_resource.path
user new_resource.owner
group new_resource.group
end
# If a config file template has been specified, create it.
template new_resource.config_template do
only_if { !new_resource.config_template.nil }
action :create
source new_resource.config_template
path new_resource.config_destination
variables new_resource.config_vars
owner new_resource.owner
group new_resource.group
end
# Install the application requirements.
# If a requirements file has been specified, use pip.
# otherwise use the setup.py
if new_resource.requirements_file
execute 'pip install' do
action :run
cwd new_resource.path
command "pip install -r #{new_resource.requirements_file}"
end
else
execute 'python setup.py install' do
action :run
cwd new_resource.path
end
end
new_resource.updated_by_last_action(true)
end
## Instruction:
Set up directory for checkout with proper perms
## Code After:
action :install do
if new_resource.create_user
group new_resource.group do
action :create
end
user new_resource.owner do
action :create
gid new_resource.group
end
end
directory new_resource.path do
action :create
owner new_resource.owner
group new_resource.group
end
# Update the code.
git new_resource.path do
action :sync
repository new_resource.repository
checkout_branch new_resource.revision
destination new_resource.path
user new_resource.owner
group new_resource.group
end
# If a config file template has been specified, create it.
template new_resource.config_template do
only_if { !new_resource.config_template.nil }
action :create
source new_resource.config_template
path new_resource.config_destination
variables new_resource.config_vars
owner new_resource.owner
group new_resource.group
end
# Install the application requirements.
# If a requirements file has been specified, use pip.
# otherwise use the setup.py
if new_resource.requirements_file
execute 'pip install' do
action :run
cwd new_resource.path
command "pip install -r #{new_resource.requirements_file}"
end
else
execute 'python setup.py install' do
action :run
cwd new_resource.path
end
end
new_resource.updated_by_last_action(true)
end
| action :install do
if new_resource.create_user
group new_resource.group do
action :create
end
user new_resource.owner do
action :create
gid new_resource.group
end
+ end
+
+ directory new_resource.path do
+ action :create
+ owner new_resource.owner
+ group new_resource.group
end
# Update the code.
git new_resource.path do
action :sync
repository new_resource.repository
checkout_branch new_resource.revision
destination new_resource.path
user new_resource.owner
group new_resource.group
end
# If a config file template has been specified, create it.
template new_resource.config_template do
only_if { !new_resource.config_template.nil }
action :create
source new_resource.config_template
path new_resource.config_destination
variables new_resource.config_vars
owner new_resource.owner
group new_resource.group
end
# Install the application requirements.
# If a requirements file has been specified, use pip.
# otherwise use the setup.py
if new_resource.requirements_file
execute 'pip install' do
action :run
cwd new_resource.path
command "pip install -r #{new_resource.requirements_file}"
end
else
execute 'python setup.py install' do
action :run
cwd new_resource.path
end
end
new_resource.updated_by_last_action(true)
end | 6 | 0.12 | 6 | 0 |
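The change in this row makes sure the checkout directory exists, owned by the right user and group, before `git` syncs into it. A rough Python sketch of the "create the directory idempotently first" half of that step (the ownership half is omitted, since `chown` generally needs elevated privileges; `ensure_directory` is an illustrative name, not part of any library):

```python
import os
import tempfile

def ensure_directory(path):
    """Create path and any missing parents; do nothing if it already
    exists. This mirrors the idempotent behavior of Chef's directory
    resource used in the recipe above."""
    os.makedirs(path, exist_ok=True)
    return path

base = tempfile.mkdtemp()
checkout = os.path.join(base, "app", "checkout")
ensure_directory(checkout)  # first run creates it
ensure_directory(checkout)  # second run is a no-op rather than an error
assert os.path.isdir(checkout)
```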
e72b6272469c382f14a6732514777aacbd457322 | rest_framework_json_api/exceptions.py | rest_framework_json_api/exceptions.py | from django.utils import encoding
from django.utils.translation import ugettext_lazy as _
from rest_framework import status
from rest_framework.exceptions import APIException
from rest_framework.views import exception_handler as drf_exception_handler
from rest_framework_json_api.utils import format_value
def exception_handler(exc, context):
response = drf_exception_handler(exc, context)
errors = []
# handle generic errors. ValidationError('test') in a view for example
if isinstance(response.data, list):
for message in response.data:
errors.append({
'detail': message,
'source': {
'pointer': '/data',
},
'status': encoding.force_text(response.status_code),
})
# handle all errors thrown from serializers
else:
for field, error in response.data.items():
field = format_value(field)
pointer = '/data/attributes/{}'.format(field)
# see if they passed a dictionary to ValidationError manually
if isinstance(error, dict):
errors.append(error)
else:
for message in error:
errors.append({
'detail': message,
'source': {
'pointer': pointer,
},
'status': encoding.force_text(response.status_code),
})
context['view'].resource_name = 'errors'
response.data = errors
return response
class Conflict(APIException):
status_code = status.HTTP_409_CONFLICT
default_detail = _('Conflict.')
| from django.utils import encoding
from django.utils.translation import ugettext_lazy as _
from rest_framework import status
from rest_framework.exceptions import APIException
from rest_framework.views import exception_handler as drf_exception_handler
from rest_framework_json_api.utils import format_value
def exception_handler(exc, context):
response = drf_exception_handler(exc, context)
errors = []
# handle generic errors. ValidationError('test') in a view for example
if isinstance(response.data, list):
for message in response.data:
errors.append({
'detail': message,
'source': {
'pointer': '/data',
},
'status': encoding.force_text(response.status_code),
})
# handle all errors thrown from serializers
else:
for field, error in response.data.items():
field = format_value(field)
pointer = '/data/attributes/{}'.format(field)
# see if they passed a dictionary to ValidationError manually
if isinstance(error, dict):
errors.append(error)
else:
if isinstance(error, list):
for message in error:
errors.append({
'detail': message,
'source': {
'pointer': pointer,
},
'status': encoding.force_text(response.status_code),
})
else:
errors.append({
'detail': message,
'source': {
'pointer': pointer,
},
'status': encoding.force_text(response.status_code),
})
context['view'].resource_name = 'errors'
response.data = errors
return response
class Conflict(APIException):
status_code = status.HTTP_409_CONFLICT
default_detail = _('Conflict.')
| Fix for some error messages that were split into several messages | Fix for some error messages that were split into several messages
The exception handler expects the error to be a list on line 33. In my
case they were a string, which lead to the split of the string into
multiple errors containing one character
| Python | bsd-2-clause | django-json-api/rest_framework_ember,Instawork/django-rest-framework-json-api,leifurhauks/django-rest-framework-json-api,hnakamur/django-rest-framework-json-api,martinmaillard/django-rest-framework-json-api,pombredanne/django-rest-framework-json-api,lukaslundgren/django-rest-framework-json-api,leo-naeka/rest_framework_ember,schtibe/django-rest-framework-json-api,scottfisk/django-rest-framework-json-api,django-json-api/django-rest-framework-json-api,django-json-api/django-rest-framework-json-api,kaldras/django-rest-framework-json-api,leo-naeka/django-rest-framework-json-api,abdulhaq-e/django-rest-framework-json-api,grapo/django-rest-framework-json-api | python | ## Code Before:
from django.utils import encoding
from django.utils.translation import ugettext_lazy as _
from rest_framework import status
from rest_framework.exceptions import APIException
from rest_framework.views import exception_handler as drf_exception_handler
from rest_framework_json_api.utils import format_value
def exception_handler(exc, context):
response = drf_exception_handler(exc, context)
errors = []
# handle generic errors. ValidationError('test') in a view for example
if isinstance(response.data, list):
for message in response.data:
errors.append({
'detail': message,
'source': {
'pointer': '/data',
},
'status': encoding.force_text(response.status_code),
})
# handle all errors thrown from serializers
else:
for field, error in response.data.items():
field = format_value(field)
pointer = '/data/attributes/{}'.format(field)
# see if they passed a dictionary to ValidationError manually
if isinstance(error, dict):
errors.append(error)
else:
for message in error:
errors.append({
'detail': message,
'source': {
'pointer': pointer,
},
'status': encoding.force_text(response.status_code),
})
context['view'].resource_name = 'errors'
response.data = errors
return response
class Conflict(APIException):
status_code = status.HTTP_409_CONFLICT
default_detail = _('Conflict.')
## Instruction:
Fix for some error messages that were split into several messages
The exception handler expects the error to be a list on line 33. In my
case they were a string, which led to the split of the string into
multiple errors containing one character
## Code After:
from django.utils import encoding
from django.utils.translation import ugettext_lazy as _
from rest_framework import status
from rest_framework.exceptions import APIException
from rest_framework.views import exception_handler as drf_exception_handler
from rest_framework_json_api.utils import format_value
def exception_handler(exc, context):
response = drf_exception_handler(exc, context)
errors = []
# handle generic errors. ValidationError('test') in a view for example
if isinstance(response.data, list):
for message in response.data:
errors.append({
'detail': message,
'source': {
'pointer': '/data',
},
'status': encoding.force_text(response.status_code),
})
# handle all errors thrown from serializers
else:
for field, error in response.data.items():
field = format_value(field)
pointer = '/data/attributes/{}'.format(field)
# see if they passed a dictionary to ValidationError manually
if isinstance(error, dict):
errors.append(error)
else:
if isinstance(error, list):
for message in error:
errors.append({
'detail': message,
'source': {
'pointer': pointer,
},
'status': encoding.force_text(response.status_code),
})
else:
errors.append({
'detail': message,
'source': {
'pointer': pointer,
},
'status': encoding.force_text(response.status_code),
})
context['view'].resource_name = 'errors'
response.data = errors
return response
class Conflict(APIException):
status_code = status.HTTP_409_CONFLICT
default_detail = _('Conflict.')
| from django.utils import encoding
from django.utils.translation import ugettext_lazy as _
from rest_framework import status
from rest_framework.exceptions import APIException
from rest_framework.views import exception_handler as drf_exception_handler
from rest_framework_json_api.utils import format_value
def exception_handler(exc, context):
response = drf_exception_handler(exc, context)
errors = []
# handle generic errors. ValidationError('test') in a view for example
if isinstance(response.data, list):
for message in response.data:
errors.append({
'detail': message,
'source': {
'pointer': '/data',
},
'status': encoding.force_text(response.status_code),
})
# handle all errors thrown from serializers
else:
for field, error in response.data.items():
field = format_value(field)
pointer = '/data/attributes/{}'.format(field)
# see if they passed a dictionary to ValidationError manually
if isinstance(error, dict):
errors.append(error)
else:
+ if isinstance(error, list):
- for message in error:
+ for message in error:
? ++++
+ errors.append({
+ 'detail': message,
+ 'source': {
+ 'pointer': pointer,
+ },
+ 'status': encoding.force_text(response.status_code),
+ })
+ else:
errors.append({
'detail': message,
'source': {
'pointer': pointer,
},
'status': encoding.force_text(response.status_code),
})
+
context['view'].resource_name = 'errors'
response.data = errors
return response
class Conflict(APIException):
status_code = status.HTTP_409_CONFLICT
default_detail = _('Conflict.') | 12 | 0.244898 | 11 | 1 |
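The commit message in this row traces the bug to iterating an error value that is sometimes a plain string: iterating a string in Python yields one character at a time, which is exactly how a single message turned into many one-character errors. A standalone sketch of that failure mode and of the `isinstance(error, list)` guard the fix introduces (`messages_from` is an illustrative helper, not part of the package; note also that the committed fallback branch still appends `message` rather than `error`, so this sketch uses the intended value):

```python
def messages_from(error):
    """Normalize a serializer error value into a list of whole messages."""
    if isinstance(error, list):
        return list(error)
    # Without the guard, a bare string would be iterated character by
    # character, producing one "error" per character.
    return [error]

# The unguarded iteration splits the string:
assert [m for m in "bad"] == ["b", "a", "d"]
# The guarded version keeps messages intact:
assert messages_from("bad") == ["bad"]
assert messages_from(["bad", "worse"]) == ["bad", "worse"]
```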
5fb005a9dd2ec6e70f715415337ad92e64064a81 | src/Orchestra/Support/Ftp/RuntimeException.php | src/Orchestra/Support/Ftp/RuntimeException.php | <?php namespace Orchestra\Support\Ftp;
class RuntimeException extends \RuntimeException {
/**
* Parameters.
*
* @var array
*/
protected $parameters = array();
/**
* Construct a new exception.
*
* @param string $exception
* @param array $parameters
* @return void
*/
public function __construct($exception, array $parameters = array())
{
$this->parameters = $parameters;
parent::__construct($exception);
}
}
| <?php namespace Orchestra\Support\Ftp;
class RuntimeException extends \RuntimeException {
/**
* Parameters.
*
* @var array
*/
protected $parameters = array();
/**
* Construct a new exception.
*
* @param string $exception
* @param array $parameters
* @return void
*/
public function __construct($exception, array $parameters = array())
{
$this->parameters = $parameters;
parent::__construct($exception);
}
/**
* Get exceptions parameters.
*
* @return array
*/
public function getParameters()
{
return $this->parameters;
}
}
| Allow to get exceptions parameters. | Allow to get exceptions parameters.
Signed-off-by: crynobone <e1a543840a942eb68427510a8a483282a7bfeddf@gmail.com>
| PHP | mit | orchestral/support | php | ## Code Before:
<?php namespace Orchestra\Support\Ftp;
class RuntimeException extends \RuntimeException {
/**
* Parameters.
*
* @var array
*/
protected $parameters = array();
/**
* Construct a new exception.
*
* @param string $exception
* @param array $parameters
* @return void
*/
public function __construct($exception, array $parameters = array())
{
$this->parameters = $parameters;
parent::__construct($exception);
}
}
## Instruction:
Allow to get exceptions parameters.
Signed-off-by: crynobone <e1a543840a942eb68427510a8a483282a7bfeddf@gmail.com>
## Code After:
<?php namespace Orchestra\Support\Ftp;
class RuntimeException extends \RuntimeException {
/**
* Parameters.
*
* @var array
*/
protected $parameters = array();
/**
* Construct a new exception.
*
* @param string $exception
* @param array $parameters
* @return void
*/
public function __construct($exception, array $parameters = array())
{
$this->parameters = $parameters;
parent::__construct($exception);
}
/**
* Get exceptions parameters.
*
* @return array
*/
public function getParameters()
{
return $this->parameters;
}
}
| <?php namespace Orchestra\Support\Ftp;
class RuntimeException extends \RuntimeException {
/**
* Parameters.
*
* @var array
*/
protected $parameters = array();
/**
* Construct a new exception.
*
* @param string $exception
* @param array $parameters
* @return void
*/
public function __construct($exception, array $parameters = array())
{
$this->parameters = $parameters;
parent::__construct($exception);
}
+
+ /**
+ * Get exceptions parameters.
+ *
+ * @return array
+ */
+ public function getParameters()
+ {
+ return $this->parameters;
+ }
} | 10 | 0.416667 | 10 | 0 |
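The PHP change in this row simply exposes the parameters stored by the constructor through an accessor. The same "exception that carries structured context" pattern can be sketched in Python (`FtpRuntimeError` and `get_parameters` are illustrative names chosen for this sketch, not an existing API):

```python
class FtpRuntimeError(RuntimeError):
    """RuntimeError that also carries a dict of contextual parameters,
    mirroring the accessor added to the PHP RuntimeException above."""

    def __init__(self, message, parameters=None):
        super().__init__(message)
        self._parameters = dict(parameters or {})

    def get_parameters(self):
        return self._parameters

try:
    raise FtpRuntimeError("Unable to connect", {"host": "example.com"})
except FtpRuntimeError as exc:
    assert exc.get_parameters() == {"host": "example.com"}
    assert str(exc) == "Unable to connect"
```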
32c9a7f80d9e46f8236c20af9eda0d7516181085 | src/main/java/reborncore/common/BaseTileBlock.java | src/main/java/reborncore/common/BaseTileBlock.java | package reborncore.common;
import net.minecraft.block.Block;
import net.minecraft.block.ITileEntityProvider;
import net.minecraft.block.material.Material;
import net.minecraft.block.state.IBlockState;
import net.minecraft.util.math.BlockPos;
import net.minecraft.world.World;
import reborncore.RebornCore;
public abstract class BaseTileBlock extends Block implements ITileEntityProvider
{
protected BaseTileBlock(Material materialIn)
{
super(materialIn);
RebornCore.jsonDestroyer.registerObject(this);
}
public int getRenderType()
{
return 3;
}
@Override
public void breakBlock(World worldIn, BlockPos pos, IBlockState state)
{
super.breakBlock(worldIn, pos, state);
worldIn.removeTileEntity(pos);
}
}
| package reborncore.common;
import net.minecraft.block.Block;
import net.minecraft.block.ITileEntityProvider;
import net.minecraft.block.material.Material;
import net.minecraft.block.state.IBlockState;
import net.minecraft.util.math.BlockPos;
import net.minecraft.world.World;
import reborncore.RebornCore;
public abstract class BaseTileBlock extends Block implements ITileEntityProvider
{
protected BaseTileBlock(Material materialIn)
{
super(materialIn);
RebornCore.jsonDestroyer.registerObject(this);
setHardness(1F);
}
public int getRenderType()
{
return 3;
}
@Override
public void breakBlock(World worldIn, BlockPos pos, IBlockState state)
{
super.breakBlock(worldIn, pos, state);
worldIn.removeTileEntity(pos);
}
}
| Add a hardness to the base block | Add a hardness to the base block
| Java | mit | TechReborn/RebornCore | java | ## Code Before:
package reborncore.common;
import net.minecraft.block.Block;
import net.minecraft.block.ITileEntityProvider;
import net.minecraft.block.material.Material;
import net.minecraft.block.state.IBlockState;
import net.minecraft.util.math.BlockPos;
import net.minecraft.world.World;
import reborncore.RebornCore;
public abstract class BaseTileBlock extends Block implements ITileEntityProvider
{
protected BaseTileBlock(Material materialIn)
{
super(materialIn);
RebornCore.jsonDestroyer.registerObject(this);
}
public int getRenderType()
{
return 3;
}
@Override
public void breakBlock(World worldIn, BlockPos pos, IBlockState state)
{
super.breakBlock(worldIn, pos, state);
worldIn.removeTileEntity(pos);
}
}
## Instruction:
Add a hardness to the base block
## Code After:
package reborncore.common;
import net.minecraft.block.Block;
import net.minecraft.block.ITileEntityProvider;
import net.minecraft.block.material.Material;
import net.minecraft.block.state.IBlockState;
import net.minecraft.util.math.BlockPos;
import net.minecraft.world.World;
import reborncore.RebornCore;
public abstract class BaseTileBlock extends Block implements ITileEntityProvider
{
protected BaseTileBlock(Material materialIn)
{
super(materialIn);
RebornCore.jsonDestroyer.registerObject(this);
setHardness(1F);
}
public int getRenderType()
{
return 3;
}
@Override
public void breakBlock(World worldIn, BlockPos pos, IBlockState state)
{
super.breakBlock(worldIn, pos, state);
worldIn.removeTileEntity(pos);
}
}
| package reborncore.common;
import net.minecraft.block.Block;
import net.minecraft.block.ITileEntityProvider;
import net.minecraft.block.material.Material;
import net.minecraft.block.state.IBlockState;
import net.minecraft.util.math.BlockPos;
import net.minecraft.world.World;
import reborncore.RebornCore;
public abstract class BaseTileBlock extends Block implements ITileEntityProvider
{
protected BaseTileBlock(Material materialIn)
{
super(materialIn);
RebornCore.jsonDestroyer.registerObject(this);
+ setHardness(1F);
}
public int getRenderType()
{
return 3;
}
@Override
public void breakBlock(World worldIn, BlockPos pos, IBlockState state)
{
super.breakBlock(worldIn, pos, state);
worldIn.removeTileEntity(pos);
}
} | 1 | 0.033333 | 1 | 0 |
be8344c2f796ecab60669630f4729c4ffa41c83b | web/impact/impact/v1/views/utils.py | web/impact/impact/v1/views/utils.py | def merge_data_by_id(data):
result = {}
for datum in data:
id = datum["id"]
item = result.get(id, {})
item.update(datum)
result[id] = item
return result.values()
def map_data(klass, query, order, data_keys, output_keys):
result = klass.objects.filter(query).order_by(order)
data = result.values_list(*data_keys)
return [dict(zip(output_keys, values))
for values in data]
| def coalesce_dictionaries(data, merge_field="id"):
    """Takes a sequence of dictionaries, merges those that share the
    same merge_field, and returns a list of resulting dictionaries"""
result = {}
for datum in data:
merge_id = datum[merge_field]
item = result.get(merge_id, {})
item.update(datum)
result[merge_id] = item
return result.values()
def map_data(klass, query, order, data_keys, output_keys):
result = klass.objects.filter(query).order_by(order)
data = result.values_list(*data_keys)
return [dict(zip(output_keys, values))
for values in data]
| Rename merge_data_by_id, add doc-string, get rid of id as a local | [AC-4835] Rename merge_data_by_id, add doc-string, get rid of id as a local
| Python | mit | masschallenge/impact-api,masschallenge/impact-api,masschallenge/impact-api,masschallenge/impact-api | python | ## Code Before:
def merge_data_by_id(data):
result = {}
for datum in data:
id = datum["id"]
item = result.get(id, {})
item.update(datum)
result[id] = item
return result.values()
def map_data(klass, query, order, data_keys, output_keys):
result = klass.objects.filter(query).order_by(order)
data = result.values_list(*data_keys)
return [dict(zip(output_keys, values))
for values in data]
## Instruction:
[AC-4835] Rename merge_data_by_id, add doc-string, get rid of id as a local
## Code After:
def coalesce_dictionaries(data, merge_field="id"):
    """Takes a sequence of dictionaries, merges those that share the
    same merge_field, and returns a list of resulting dictionaries"""
result = {}
for datum in data:
merge_id = datum[merge_field]
item = result.get(merge_id, {})
item.update(datum)
result[merge_id] = item
return result.values()
def map_data(klass, query, order, data_keys, output_keys):
result = klass.objects.filter(query).order_by(order)
data = result.values_list(*data_keys)
return [dict(zip(output_keys, values))
for values in data]
| - def merge_data_by_id(data):
+ def coalesce_dictionaries(data, merge_field="id"):
+ "Takes a sequence of dictionaries, merges those that share the
+ same merge_field, and returns a list of resulting dictionaries"
result = {}
for datum in data:
- id = datum["id"]
+ merge_id = datum[merge_field]
- item = result.get(id, {})
+ item = result.get(merge_id, {})
? ++++++
item.update(datum)
- result[id] = item
+ result[merge_id] = item
? ++++++
return result.values()
def map_data(klass, query, order, data_keys, output_keys):
result = klass.objects.filter(query).order_by(order)
data = result.values_list(*data_keys)
return [dict(zip(output_keys, values))
for values in data] | 10 | 0.666667 | 6 | 4 |
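The `coalesce_dictionaries` helper introduced in the record above is self-contained and can be exercised directly. The function body below is copied from the record (with the docstring condensed to one line); the sample rows are made-up illustration data, not part of the impact-api codebase:

```python
def coalesce_dictionaries(data, merge_field="id"):
    """Merge dictionaries that share the same merge_field value."""
    result = {}
    for datum in data:
        merge_id = datum[merge_field]
        item = result.get(merge_id, {})
        item.update(datum)
        result[merge_id] = item
    return result.values()

# Made-up sample rows: the two id=1 entries collapse into one dictionary.
rows = [
    {"id": 1, "name": "Acme"},
    {"id": 1, "city": "Boston"},
    {"id": 2, "name": "Widgets"},
]
merged = sorted(coalesce_dictionaries(rows), key=lambda d: d["id"])
print(merged)
# [{'id': 1, 'name': 'Acme', 'city': 'Boston'}, {'id': 2, 'name': 'Widgets'}]
```

Note that `result.values()` is a view object in Python 3, which is why the caller materializes it (here with `sorted`) before comparing or indexing.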
6117e9c5e04be0f3e9c5606870561dc1a15e37bc | app/templates/components/sl-date-range-picker.hbs | app/templates/components/sl-date-range-picker.hbs | {{#if label}}
<label {{bind-attr for=inputElementId}}>{{label}}</label>
{{/if}}
<div class="row">
{{sl-date-picker
change="startDateChange"
class="sl-daterange-start-date col-md-6"
endDate=latestStartDate
placeholder=startDatePlaceholder
startDate=minDate
value=startDateValue
inputElementIdBinding="inputElementId"
}}
{{sl-date-picker
change="endDateChange"
class="sl-daterange-end-date col-md-6"
endDate=maxDate
placeholder=endDatePlaceholder
startDate=earliestEndDate
value=endDateValue
}}
</div>
{{#if helpText}}
<p class="help-block">{{helpText}}</p>
{{/if}} | {{#if label}}
<label {{bind-attr for=inputElementId}}>{{label}}</label>
{{/if}}
<div class="row">
{{sl-date-picker
class="sl-daterange-start-date col-md-6"
endDate=latestStartDate
placeholder=startDatePlaceholder
startDate=minDate
value=startDateValue
inputElementIdBinding="inputElementId"
}}
{{sl-date-picker
class="sl-daterange-end-date col-md-6"
endDate=maxDate
placeholder=endDatePlaceholder
startDate=earliestEndDate
value=endDateValue
}}
</div>
{{#if helpText}}
<p class="help-block">{{helpText}}</p>
{{/if}}
| Remove change bindings for startDateChange and endDateChange | Remove change bindings for startDateChange and endDateChange
| Handlebars | mit | notmessenger/sl-ember-components,erangeles/sl-ember-components,theoshu/sl-ember-components,theoshu/sl-ember-components,Suven/sl-ember-components,SpikedKira/sl-ember-components,azizpunjani/sl-ember-components,softlayer/sl-ember-components,Suven/sl-ember-components,alxyuu/sl-ember-components,softlayer/sl-ember-components,juwara0/sl-ember-components,erangeles/sl-ember-components,SpikedKira/sl-ember-components,alxyuu/sl-ember-components,juwara0/sl-ember-components,notmessenger/sl-ember-components,azizpunjani/sl-ember-components | handlebars | ## Code Before:
{{#if label}}
<label {{bind-attr for=inputElementId}}>{{label}}</label>
{{/if}}
<div class="row">
{{sl-date-picker
change="startDateChange"
class="sl-daterange-start-date col-md-6"
endDate=latestStartDate
placeholder=startDatePlaceholder
startDate=minDate
value=startDateValue
inputElementIdBinding="inputElementId"
}}
{{sl-date-picker
change="endDateChange"
class="sl-daterange-end-date col-md-6"
endDate=maxDate
placeholder=endDatePlaceholder
startDate=earliestEndDate
value=endDateValue
}}
</div>
{{#if helpText}}
<p class="help-block">{{helpText}}</p>
{{/if}}
## Instruction:
Remove change bindings for startDateChange and endDateChange
## Code After:
{{#if label}}
<label {{bind-attr for=inputElementId}}>{{label}}</label>
{{/if}}
<div class="row">
{{sl-date-picker
class="sl-daterange-start-date col-md-6"
endDate=latestStartDate
placeholder=startDatePlaceholder
startDate=minDate
value=startDateValue
inputElementIdBinding="inputElementId"
}}
{{sl-date-picker
class="sl-daterange-end-date col-md-6"
endDate=maxDate
placeholder=endDatePlaceholder
startDate=earliestEndDate
value=endDateValue
}}
</div>
{{#if helpText}}
<p class="help-block">{{helpText}}</p>
{{/if}}
| {{#if label}}
<label {{bind-attr for=inputElementId}}>{{label}}</label>
{{/if}}
<div class="row">
{{sl-date-picker
- change="startDateChange"
class="sl-daterange-start-date col-md-6"
endDate=latestStartDate
placeholder=startDatePlaceholder
startDate=minDate
value=startDateValue
inputElementIdBinding="inputElementId"
}}
{{sl-date-picker
- change="endDateChange"
class="sl-daterange-end-date col-md-6"
endDate=maxDate
placeholder=endDatePlaceholder
startDate=earliestEndDate
value=endDateValue
}}
</div>
{{#if helpText}}
<p class="help-block">{{helpText}}</p>
{{/if}} | 2 | 0.071429 | 0 | 2 |
6fe8e5dc9cf564b908b9eeb8abab6902dcb87a30 | src/development.sig | src/development.sig | signature DEVELOPMENT =
sig
type term
type pattern
type judgement
type evidence
type conv = term -> term
type tactic
structure Telescope : TELESCOPE
type label = Telescope.label
structure Object :
sig
type t
val toString : label * t -> string
end
type t
type object = Object.t
val out : t -> object Telescope.telescope
(* the empty development *)
val empty : t
(* extend a development with a theorem *)
val prove : t -> label * judgement * tactic -> t
exception RemainingSubgoals of judgement list
(* extend a development with a custom tactic *)
val defineTactic : t -> label * tactic -> t
(* extend a development with a new operator *)
val declareOperator : t -> label * Arity.t -> t
val defineOperator : t -> {definiendum : pattern, definiens : term} -> t
(* lookup the definiens *)
val lookupDefinition : t -> label -> conv
(* lookup the statement & evidence of a theorem *)
val lookupTheorem : t -> label -> {statement : judgement, evidence : evidence Susp.susp}
(* lookup a custom tactic *)
val lookupTactic : t -> label -> tactic
(* lookup a custom operator *)
val lookupOperator : t -> label -> Arity.t
end
| (* DEVELOPMENT models the aspect of Brouwer's Creating Subject which realizes
* constructions over time. A value of type DEVELOPMENT.t is a stage in time at
* which the knowledge of the creating subject may be queried, or to which new
* knowledge may be added, resulting in another stage in time. In this sense,
* the possible values of type DEVELOPMENT.t form a spread, whose law is
* governed by the admissibility of new knowledge. *)
signature DEVELOPMENT =
sig
type term
type pattern
type judgement
type evidence
type conv = term -> term
type tactic
structure Telescope : TELESCOPE
type label = Telescope.label
structure Object :
sig
type t
val toString : label * t -> string
end
type t
type object = Object.t
val out : t -> object Telescope.telescope
(* the empty development *)
val empty : t
(* extend a development with a theorem *)
val prove : t -> label * judgement * tactic -> t
exception RemainingSubgoals of judgement list
(* extend a development with a custom tactic *)
val defineTactic : t -> label * tactic -> t
(* extend a development with a new operator *)
val declareOperator : t -> label * Arity.t -> t
val defineOperator : t -> {definiendum : pattern, definiens : term} -> t
(* lookup the definiens *)
val lookupDefinition : t -> label -> conv
(* lookup the statement & evidence of a theorem *)
val lookupTheorem : t -> label -> {statement : judgement, evidence : evidence Susp.susp}
(* lookup a custom tactic *)
val lookupTactic : t -> label -> tactic
(* lookup a custom operator *)
val lookupOperator : t -> label -> Arity.t
end
| Add note to DEVELOPMENT sig | Add note to DEVELOPMENT sig
| Standard ML | mit | freebroccolo/JonPRL,wilcoxjay/JonPRL,jonsterling/JonPRL,david-christiansen/JonPRL,jozefg/JonPRL | standard-ml | ## Code Before:
signature DEVELOPMENT =
sig
type term
type pattern
type judgement
type evidence
type conv = term -> term
type tactic
structure Telescope : TELESCOPE
type label = Telescope.label
structure Object :
sig
type t
val toString : label * t -> string
end
type t
type object = Object.t
val out : t -> object Telescope.telescope
(* the empty development *)
val empty : t
(* extend a development with a theorem *)
val prove : t -> label * judgement * tactic -> t
exception RemainingSubgoals of judgement list
(* extend a development with a custom tactic *)
val defineTactic : t -> label * tactic -> t
(* extend a development with a new operator *)
val declareOperator : t -> label * Arity.t -> t
val defineOperator : t -> {definiendum : pattern, definiens : term} -> t
(* lookup the definiens *)
val lookupDefinition : t -> label -> conv
(* lookup the statement & evidence of a theorem *)
val lookupTheorem : t -> label -> {statement : judgement, evidence : evidence Susp.susp}
(* lookup a custom tactic *)
val lookupTactic : t -> label -> tactic
(* lookup a custom operator *)
val lookupOperator : t -> label -> Arity.t
end
## Instruction:
Add note to DEVELOPMENT sig
## Code After:
(* DEVELOPMENT models the aspect of Brouwer's Creating Subject which realizes
* constructions over time. A value of type DEVELOPMENT.t is a stage in time at
* which the knowledge of the creating subject may be queried, or to which new
* knowledge may be added, resulting in another stage in time. In this sense,
* the possible values of type DEVELOPMENT.t form a spread, whose law is
* governed by the admissibility of new knowledge. *)
signature DEVELOPMENT =
sig
type term
type pattern
type judgement
type evidence
type conv = term -> term
type tactic
structure Telescope : TELESCOPE
type label = Telescope.label
structure Object :
sig
type t
val toString : label * t -> string
end
type t
type object = Object.t
val out : t -> object Telescope.telescope
(* the empty development *)
val empty : t
(* extend a development with a theorem *)
val prove : t -> label * judgement * tactic -> t
exception RemainingSubgoals of judgement list
(* extend a development with a custom tactic *)
val defineTactic : t -> label * tactic -> t
(* extend a development with a new operator *)
val declareOperator : t -> label * Arity.t -> t
val defineOperator : t -> {definiendum : pattern, definiens : term} -> t
(* lookup the definiens *)
val lookupDefinition : t -> label -> conv
(* lookup the statement & evidence of a theorem *)
val lookupTheorem : t -> label -> {statement : judgement, evidence : evidence Susp.susp}
(* lookup a custom tactic *)
val lookupTactic : t -> label -> tactic
(* lookup a custom operator *)
val lookupOperator : t -> label -> Arity.t
end
| + (* DEVELOPMENT models the aspect of Brouwer's Creating Subject which realizes
+ * constructions over time. A value of type DEVELOPMENT.t is a stage in time at
+ * which the knowledge of the creating subject may be queried, or to which new
+ * knowledge may be added, resulting in another stage in time. In this sense,
+ * the possible values of type DEVELOPMENT.t form a spread, whose law is
+ * governed by the admissibility of new knowledge. *)
+
signature DEVELOPMENT =
sig
type term
type pattern
type judgement
type evidence
type conv = term -> term
type tactic
structure Telescope : TELESCOPE
type label = Telescope.label
structure Object :
sig
type t
val toString : label * t -> string
end
type t
type object = Object.t
val out : t -> object Telescope.telescope
(* the empty development *)
val empty : t
(* extend a development with a theorem *)
val prove : t -> label * judgement * tactic -> t
exception RemainingSubgoals of judgement list
(* extend a development with a custom tactic *)
val defineTactic : t -> label * tactic -> t
(* extend a development with a new operator *)
val declareOperator : t -> label * Arity.t -> t
val defineOperator : t -> {definiendum : pattern, definiens : term} -> t
(* lookup the definiens *)
val lookupDefinition : t -> label -> conv
(* lookup the statement & evidence of a theorem *)
val lookupTheorem : t -> label -> {statement : judgement, evidence : evidence Susp.susp}
(* lookup a custom tactic *)
val lookupTactic : t -> label -> tactic
(* lookup a custom operator *)
val lookupOperator : t -> label -> Arity.t
end | 7 | 0.14 | 7 | 0 |
3a56a8a1b23fd75d5fab60b7e705ec1f26a2721b | scripts/v19_to_v21_params_changes/migrate_vsphere_install_pcf_params.sh | scripts/v19_to_v21_params_changes/migrate_vsphere_install_pcf_params.sh |
function main() {
local file_to_migrate=${1}
if [[ ! -f "${file_to_migrate}" ]]; then
echo "Usage: ${0} <path-to-vsphere-param.yml>"
echo "Migrates an vsphere install-pcf yaml param file that is compatible with pcf-pipeline v0.19.2."
exit 1
fi
local params="$(cat ${file_to_migrate})"
params="$(migrate "${params}" "om_ssh_pwd" "opsman_ssh_password")"
params="$(migrate "${params}" "vcenter_data_center" "vcenter_datacenter")"
params="$(migrate "${params}" "om_data_store" "vcenter_datastore")"
echo "${params}"
}
function migrate() {
local params="${1}"
local old_param="${2}"
local new_param="${3}"
if [[ -z $(grep "^${old_param}:" <<< "${params}") ]]; then
>&2 echo "\"${old_param}\" param not found. Make sure this param file is for the vSphere install-pcf pipeline and is compatible with pcf-pipelines v0.19.2."
exit 1
fi
sed -e "s/^${old_param}:/${new_param}:/g" <<< "${params}"
}
main ${@}
|
function main() {
local file_to_migrate=${1}
if [[ ! -f "${file_to_migrate}" ]]; then
echo "Usage: ${0} <path-to-vsphere-param.yml>"
echo "Migrates an vsphere install-pcf yaml param file that is compatible with pcf-pipeline v0.19.2."
exit 1
fi
local params="$(cat ${file_to_migrate})"
params="$(migrate "${params}" "om_ssh_pwd" "opsman_ssh_password")"
params="$(migrate "${params}" "vcenter_data_center" "vcenter_datacenter")"
params="$(migrate "${params}" "om_data_store" "vcenter_datastore")"
params="$(append_param "${params}" "vcenter_ca_cert")"
params="$(append_param "${params}" "vcenter_insecure")"
echo "${params}"
}
function migrate() {
local params="${1}"
local old_param="${2}"
local new_param="${3}"
if [[ -z $(grep "^${old_param}:" <<< "${params}") ]]; then
>&2 echo "\"${old_param}\" param not found. Make sure this param file is for the vSphere install-pcf pipeline and is compatible with pcf-pipelines v0.19.2."
exit 1
fi
sed -e "s/^${old_param}:/${new_param}:/g" <<< "${params}"
}
function append_param() {
local params="${1}"
local new_param="${2}"
echo "${params}"
if [[ -z $(grep "^${new_param}:" <<< "${params}") ]]; then
echo "${new_param}: "
fi
}
main ${@}
| Append newly introduced install-pcf/vSphere params | Append newly introduced install-pcf/vSphere params
[#152910372]
| Shell | apache-2.0 | pivotal-cf/pcf-pipelines,rahulkj/pcf-pipelines,pvsone/pcf-pipelines,sandyg1/pcf-pipelines,sandyg1/pcf-pipelines,cah-josephgeorge/pcf-pipelines,pvsone/pcf-pipelines,sandyg1/pcf-pipelines,rahulkj/pcf-pipelines,sandyg1/pcf-pipelines,pivotal-cf/pcf-pipelines,rahulkj/pcf-pipelines,pivotal-cf/pcf-pipelines,pvsone/pcf-pipelines,pivotal-cf/pcf-pipelines,cah-josephgeorge/pcf-pipelines,rahulkj/pcf-pipelines,cah-josephgeorge/pcf-pipelines | shell | ## Code Before:
function main() {
local file_to_migrate=${1}
if [[ ! -f "${file_to_migrate}" ]]; then
echo "Usage: ${0} <path-to-vsphere-param.yml>"
echo "Migrates an vsphere install-pcf yaml param file that is compatible with pcf-pipeline v0.19.2."
exit 1
fi
local params="$(cat ${file_to_migrate})"
params="$(migrate "${params}" "om_ssh_pwd" "opsman_ssh_password")"
params="$(migrate "${params}" "vcenter_data_center" "vcenter_datacenter")"
params="$(migrate "${params}" "om_data_store" "vcenter_datastore")"
echo "${params}"
}
function migrate() {
local params="${1}"
local old_param="${2}"
local new_param="${3}"
if [[ -z $(grep "^${old_param}:" <<< "${params}") ]]; then
>&2 echo "\"${old_param}\" param not found. Make sure this param file is for the vSphere install-pcf pipeline and is compatible with pcf-pipelines v0.19.2."
exit 1
fi
sed -e "s/^${old_param}:/${new_param}:/g" <<< "${params}"
}
main ${@}
## Instruction:
Append newly introduced install-pcf/vSphere params
[#152910372]
## Code After:
function main() {
local file_to_migrate=${1}
if [[ ! -f "${file_to_migrate}" ]]; then
echo "Usage: ${0} <path-to-vsphere-param.yml>"
echo "Migrates an vsphere install-pcf yaml param file that is compatible with pcf-pipeline v0.19.2."
exit 1
fi
local params="$(cat ${file_to_migrate})"
params="$(migrate "${params}" "om_ssh_pwd" "opsman_ssh_password")"
params="$(migrate "${params}" "vcenter_data_center" "vcenter_datacenter")"
params="$(migrate "${params}" "om_data_store" "vcenter_datastore")"
params="$(append_param "${params}" "vcenter_ca_cert")"
params="$(append_param "${params}" "vcenter_insecure")"
echo "${params}"
}
function migrate() {
local params="${1}"
local old_param="${2}"
local new_param="${3}"
if [[ -z $(grep "^${old_param}:" <<< "${params}") ]]; then
>&2 echo "\"${old_param}\" param not found. Make sure this param file is for the vSphere install-pcf pipeline and is compatible with pcf-pipelines v0.19.2."
exit 1
fi
sed -e "s/^${old_param}:/${new_param}:/g" <<< "${params}"
}
function append_param() {
local params="${1}"
local new_param="${2}"
echo "${params}"
if [[ -z $(grep "^${new_param}:" <<< "${params}") ]]; then
echo "${new_param}: "
fi
}
main ${@}
|
function main() {
local file_to_migrate=${1}
if [[ ! -f "${file_to_migrate}" ]]; then
echo "Usage: ${0} <path-to-vsphere-param.yml>"
echo "Migrates an vsphere install-pcf yaml param file that is compatible with pcf-pipeline v0.19.2."
exit 1
fi
local params="$(cat ${file_to_migrate})"
params="$(migrate "${params}" "om_ssh_pwd" "opsman_ssh_password")"
params="$(migrate "${params}" "vcenter_data_center" "vcenter_datacenter")"
params="$(migrate "${params}" "om_data_store" "vcenter_datastore")"
+ params="$(append_param "${params}" "vcenter_ca_cert")"
+ params="$(append_param "${params}" "vcenter_insecure")"
echo "${params}"
}
function migrate() {
local params="${1}"
local old_param="${2}"
local new_param="${3}"
if [[ -z $(grep "^${old_param}:" <<< "${params}") ]]; then
>&2 echo "\"${old_param}\" param not found. Make sure this param file is for the vSphere install-pcf pipeline and is compatible with pcf-pipelines v0.19.2."
exit 1
fi
sed -e "s/^${old_param}:/${new_param}:/g" <<< "${params}"
}
+ function append_param() {
+ local params="${1}"
+ local new_param="${2}"
+ echo "${params}"
+ if [[ -z $(grep "^${new_param}:" <<< "${params}") ]]; then
+ echo "${new_param}: "
+ fi
+ }
+
main ${@} | 11 | 0.34375 | 11 | 0 |
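The shell script above does two things to the param file: rename top-level keys in place, and append a key only when it is absent, so re-running the migration is safe. Below is a Python sketch of those two operations; the function names mirror the shell functions, but the sketch and its sample document are illustrative only, not part of pcf-pipelines:

```python
def migrate(params: str, old_param: str, new_param: str) -> str:
    """Rename a top-level "old_param:" key (mirrors the script's sed call)."""
    lines = params.splitlines()
    if not any(line.startswith(old_param + ":") for line in lines):
        raise ValueError(f'"{old_param}" param not found')
    return "\n".join(
        new_param + line[len(old_param):] if line.startswith(old_param + ":") else line
        for line in lines
    )

def append_param(params: str, new_param: str) -> str:
    """Append an empty "new_param:" entry unless one already exists."""
    if any(line.startswith(new_param + ":") for line in params.splitlines()):
        return params  # already present -> running the migration twice is a no-op
    return params + "\n" + new_param + ": "

doc = "om_ssh_pwd: secret\nvcenter_data_center: dc1"
doc = migrate(doc, "om_ssh_pwd", "opsman_ssh_password")
doc = append_param(doc, "vcenter_ca_cert")
doc = append_param(doc, "vcenter_ca_cert")  # idempotent: still only one entry
print(doc)
```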
9182c89a14ef27b8d69897c2c9af44d2dc48b462 | vendor/sources.json | vendor/sources.json | [
{
"name": "git-for-windows",
"version": "v2.12.1.windows.1",
"url": "https://github.com/git-for-windows/git/releases/download/v2.12.1.windows.1/PortableGit-2.12.1-32-bit.7z.exe"
},
{
"name": "clink",
"version": "0.4.8",
"url": "https://github.com/mridgers/clink/releases/download/0.4.8/clink_0.4.8.zip"
},
{
"name": "conemu-maximus5",
"version": "161206",
"url": "https://github.com/Maximus5/ConEmu/releases/download/v16.12.06/ConEmuPack.161206.7z"
},
{
"name": "clink-completions",
"version": "0.3.2",
"url": "https://github.com/vladimir-kotikov/clink-completions/archive/0.3.2.zip"
}
]
| [
{
"name": "git-for-windows",
"version": "v2.12.1.windows.1",
"url": "https://github.com/git-for-windows/git/releases/download/v2.12.1.windows.1/PortableGit-2.12.1-32-bit.7z.exe"
},
{
"name": "clink",
"version": "0.4.8",
"url": "https://github.com/mridgers/clink/releases/download/0.4.8/clink_0.4.8.zip"
},
{
"name": "conemu-maximus5",
"version": "170316",
"url": "https://github.com/Maximus5/ConEmu/releases/download/v17.03.16/ConEmuPack.170316.7z"
},
{
"name": "clink-completions",
"version": "0.3.2",
"url": "https://github.com/vladimir-kotikov/clink-completions/archive/0.3.2.zip"
}
]
| Update ConEmu to 170316 (preview) | :arrow_up: Update ConEmu to 170316 (preview)
Changelog: https://conemu.github.io/blog/2017/03/16/Build-170316.html
| JSON | mit | cmderdev/cmder,cmderdev/cmder | json | ## Code Before:
[
{
"name": "git-for-windows",
"version": "v2.12.1.windows.1",
"url": "https://github.com/git-for-windows/git/releases/download/v2.12.1.windows.1/PortableGit-2.12.1-32-bit.7z.exe"
},
{
"name": "clink",
"version": "0.4.8",
"url": "https://github.com/mridgers/clink/releases/download/0.4.8/clink_0.4.8.zip"
},
{
"name": "conemu-maximus5",
"version": "161206",
"url": "https://github.com/Maximus5/ConEmu/releases/download/v16.12.06/ConEmuPack.161206.7z"
},
{
"name": "clink-completions",
"version": "0.3.2",
"url": "https://github.com/vladimir-kotikov/clink-completions/archive/0.3.2.zip"
}
]
## Instruction:
:arrow_up: Update ConEmu to 170316 (preview)
Changelog: https://conemu.github.io/blog/2017/03/16/Build-170316.html
## Code After:
[
{
"name": "git-for-windows",
"version": "v2.12.1.windows.1",
"url": "https://github.com/git-for-windows/git/releases/download/v2.12.1.windows.1/PortableGit-2.12.1-32-bit.7z.exe"
},
{
"name": "clink",
"version": "0.4.8",
"url": "https://github.com/mridgers/clink/releases/download/0.4.8/clink_0.4.8.zip"
},
{
"name": "conemu-maximus5",
"version": "170316",
"url": "https://github.com/Maximus5/ConEmu/releases/download/v17.03.16/ConEmuPack.170316.7z"
},
{
"name": "clink-completions",
"version": "0.3.2",
"url": "https://github.com/vladimir-kotikov/clink-completions/archive/0.3.2.zip"
}
]
| [
{
"name": "git-for-windows",
"version": "v2.12.1.windows.1",
"url": "https://github.com/git-for-windows/git/releases/download/v2.12.1.windows.1/PortableGit-2.12.1-32-bit.7z.exe"
},
{
"name": "clink",
"version": "0.4.8",
"url": "https://github.com/mridgers/clink/releases/download/0.4.8/clink_0.4.8.zip"
},
{
"name": "conemu-maximus5",
- "version": "161206",
? ^ --
+ "version": "170316",
? ^^^
- "url": "https://github.com/Maximus5/ConEmu/releases/download/v16.12.06/ConEmuPack.161206.7z"
? ^ --- ^ --
+ "url": "https://github.com/Maximus5/ConEmu/releases/download/v17.03.16/ConEmuPack.170316.7z"
? ^^^^ ^^^
},
{
"name": "clink-completions",
"version": "0.3.2",
"url": "https://github.com/vladimir-kotikov/clink-completions/archive/0.3.2.zip"
}
] | 4 | 0.181818 | 2 | 2 |
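A version bump like the one above has to touch two fields at once (`version` and the matching `url`). A small consistency check in that spirit is sketched below; the JSON snippet is trimmed to two entries and the `lstrip`-based heuristic is an assumption of mine, not something the cmder repo ships:

```python
import json

SOURCES = """[
  {"name": "conemu-maximus5", "version": "170316",
   "url": "https://github.com/Maximus5/ConEmu/releases/download/v17.03.16/ConEmuPack.170316.7z"},
  {"name": "clink", "version": "0.4.8",
   "url": "https://github.com/mridgers/clink/releases/download/0.4.8/clink_0.4.8.zip"}
]"""

def stale_entries(sources_json: str) -> list:
    """Names of entries whose URL does not contain their own version string."""
    return [entry["name"]
            for entry in json.loads(sources_json)
            if entry["version"].lstrip("v") not in entry["url"]]

print(stale_entries(SOURCES))  # -> [] : both entries were bumped consistently
```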
13e72c4b5b4b5f57ef517baefbc28f019e00d026 | src/Native/Global.js | src/Native/Global.js | const _elm_node$core$Native_Global = function () {
const Err = _elm_lang$core$Result$Err
const Ok = _elm_lang$core$Result$Ok
// parseInt : Int -> String -> Result Decode.Value Int
const parseInt = F2((radix, string) => {
try {
// radix values integer from 2-36
const value = global.parseInt(string, radix)
// TODO check for value === NaN and Err in that case
return Ok(value)
} catch (error) { return Err(error) }
})
const exports =
{ parseInt
}
return exports
}()
| const _elm_node$core$Native_Global = function () {
const Err = _elm_lang$core$Result$Err
const Ok = _elm_lang$core$Result$Ok
// parseInt : Int -> String -> Result Decode.Value Int
const parseInt = F2((radix, string) => {
try {
// NOTE radix can be any integer from 2-36
const value = global.parseInt(string, radix)
if (isNaN(value)) return Err(new Error(`String cannot be converted to an integer: ${string}`))
return Ok(value)
} catch (error) { return Err(error) }
})
const exports =
{ parseInt
}
return exports
}()
| Add check for NaN in parseInt native code. | Add check for NaN in parseInt native code.
| JavaScript | unlicense | elm-node/core | javascript | ## Code Before:
const _elm_node$core$Native_Global = function () {
const Err = _elm_lang$core$Result$Err
const Ok = _elm_lang$core$Result$Ok
// parseInt : Int -> String -> Result Decode.Value Int
const parseInt = F2((radix, string) => {
try {
// radix values integer from 2-36
const value = global.parseInt(string, radix)
// TODO check for value === NaN and Err in that case
return Ok(value)
} catch (error) { return Err(error) }
})
const exports =
{ parseInt
}
return exports
}()
## Instruction:
Add check for NaN in parseInt native code.
## Code After:
const _elm_node$core$Native_Global = function () {
const Err = _elm_lang$core$Result$Err
const Ok = _elm_lang$core$Result$Ok
// parseInt : Int -> String -> Result Decode.Value Int
const parseInt = F2((radix, string) => {
try {
// NOTE radix can be any integer from 2-36
const value = global.parseInt(string, radix)
if (isNaN(value)) return Err(new Error(`String cannot be converted to an integer: ${string}`))
return Ok(value)
} catch (error) { return Err(error) }
})
const exports =
{ parseInt
}
return exports
}()
| const _elm_node$core$Native_Global = function () {
const Err = _elm_lang$core$Result$Err
const Ok = _elm_lang$core$Result$Ok
// parseInt : Int -> String -> Result Decode.Value Int
const parseInt = F2((radix, string) => {
try {
- // radix values integer from 2-36
? ^ ^^ ^
+ // NOTE radix can be any integer from 2-36
? +++++ ^ ^^^ ^^^^
const value = global.parseInt(string, radix)
- // TODO check for value === NaN and Err in that case
+ if (isNaN(value)) return Err(new Error(`String cannot be converted to an integer: ${string}`))
return Ok(value)
} catch (error) { return Err(error) }
})
const exports =
{ parseInt
}
return exports
}() | 4 | 0.190476 | 2 | 2 |
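The `isNaN` check added above is needed because JavaScript's `parseInt` signals failure by returning `NaN` rather than by throwing, so the surrounding try/catch alone never fires for bad input. Below is a Python sketch of the same "return a Result instead of raising" shape; the tuple-based encoding is an illustrative stand-in for Elm's `Result` type:

```python
def parse_int(radix: int, string: str):
    """Return ("ok", value) or ("err", message) instead of raising.

    Python's int() raises ValueError on bad input, so try/except is enough
    here; the JavaScript version also needed isNaN() because parseInt
    returns NaN rather than throwing.
    """
    try:
        return ("ok", int(string, radix))
    except ValueError:
        return ("err", f"String cannot be converted to an integer: {string}")

print(parse_int(16, "ff"))   # ('ok', 255)
print(parse_int(10, "abc"))  # ('err', 'String cannot be converted to an integer: abc')
```

One behavioral difference worth noting: JavaScript's `parseInt` tolerates trailing junk (`parseInt("12px", 10)` is `12`), whereas `int("12px", 10)` is rejected here.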
b0e3ff388c661fc87b199017a91fc7287ab348d6 | src/components/light/index.js | src/components/light/index.js | import React from 'react';
import Radium from 'radium';
import { lightActive, lightInactive } from 'theme/variables';
const size = 18;
const innerPadding = 4;
@Radium
class Light extends React.Component {
render() {
const { active } = this.props;
const baseInnerStyle = {
position: 'absolute',
left: innerPadding,
right: innerPadding,
top: innerPadding,
bottom: innerPadding,
borderRadius: '50%',
};
const styles = {
outer: {
position: 'relative',
backgroundColor: 'rgba(0,0,0,0.4)',
width: size, height: size,
borderRadius: '50%',
pointerEvents: 'none'
},
innerInactive: {
...baseInnerStyle,
backgroundColor: lightInactive
},
innerActive: {
...baseInnerStyle,
backgroundColor: lightActive,
opacity: active ? 1 : 0
}
};
return (
<div style={styles.outer}>
<div style={styles.innerInactive}></div>
<div style={styles.innerActive}></div>
</div>
);
}
}
Light.propTypes = {
active: React.PropTypes.bool.isRequired
};
export default Light; | import React from 'react';
import Radium from 'radium';
import { lightActive, lightInactive } from 'theme/variables';
const size = 18;
const innerPadding = 4;
@Radium
class Light extends React.Component {
render() {
const { active } = this.props;
const baseInnerStyle = {
position: 'absolute',
left: innerPadding,
right: innerPadding,
top: innerPadding,
bottom: innerPadding,
borderRadius: '50%',
};
const styles = {
outer: {
position: 'relative',
backgroundColor: 'rgba(0,0,0,0.4)',
width: size, height: size,
borderRadius: '50%',
pointerEvents: 'none'
},
innerInactive: {
...baseInnerStyle,
backgroundColor: lightInactive
},
innerActive: {
...baseInnerStyle,
backgroundColor: lightActive,
transition: 'opacity 0.1s',
opacity: active ? 1 : 0
}
};
return (
<div style={styles.outer}>
<div style={styles.innerInactive}></div>
<div style={styles.innerActive}></div>
</div>
);
}
}
Light.propTypes = {
active: React.PropTypes.bool.isRequired
};
export default Light; | Add short transition time for the light component | Add short transition time for the light component
| JavaScript | mit | vincentriemer/io-808,vincentriemer/io-808,vincentriemer/io-808 | javascript | ## Code Before:
import React from 'react';
import Radium from 'radium';
import { lightActive, lightInactive } from 'theme/variables';
const size = 18;
const innerPadding = 4;
@Radium
class Light extends React.Component {
render() {
const { active } = this.props;
const baseInnerStyle = {
position: 'absolute',
left: innerPadding,
right: innerPadding,
top: innerPadding,
bottom: innerPadding,
borderRadius: '50%',
};
const styles = {
outer: {
position: 'relative',
backgroundColor: 'rgba(0,0,0,0.4)',
width: size, height: size,
borderRadius: '50%',
pointerEvents: 'none'
},
innerInactive: {
...baseInnerStyle,
backgroundColor: lightInactive
},
innerActive: {
...baseInnerStyle,
backgroundColor: lightActive,
opacity: active ? 1 : 0
}
};
return (
<div style={styles.outer}>
<div style={styles.innerInactive}></div>
<div style={styles.innerActive}></div>
</div>
);
}
}
Light.propTypes = {
active: React.PropTypes.bool.isRequired
};
export default Light;
## Instruction:
Add short transition time for the light component
## Code After:
import React from 'react';
import Radium from 'radium';
import { lightActive, lightInactive } from 'theme/variables';
const size = 18;
const innerPadding = 4;
@Radium
class Light extends React.Component {
render() {
const { active } = this.props;
const baseInnerStyle = {
position: 'absolute',
left: innerPadding,
right: innerPadding,
top: innerPadding,
bottom: innerPadding,
borderRadius: '50%',
};
const styles = {
outer: {
position: 'relative',
backgroundColor: 'rgba(0,0,0,0.4)',
width: size, height: size,
borderRadius: '50%',
pointerEvents: 'none'
},
innerInactive: {
...baseInnerStyle,
backgroundColor: lightInactive
},
innerActive: {
...baseInnerStyle,
backgroundColor: lightActive,
transition: 'opacity 0.1s',
opacity: active ? 1 : 0
}
};
return (
<div style={styles.outer}>
<div style={styles.innerInactive}></div>
<div style={styles.innerActive}></div>
</div>
);
}
}
Light.propTypes = {
active: React.PropTypes.bool.isRequired
};
export default Light; | import React from 'react';
import Radium from 'radium';
import { lightActive, lightInactive } from 'theme/variables';
const size = 18;
const innerPadding = 4;
@Radium
class Light extends React.Component {
render() {
const { active } = this.props;
const baseInnerStyle = {
position: 'absolute',
left: innerPadding,
right: innerPadding,
top: innerPadding,
bottom: innerPadding,
borderRadius: '50%',
};
const styles = {
outer: {
position: 'relative',
backgroundColor: 'rgba(0,0,0,0.4)',
width: size, height: size,
borderRadius: '50%',
pointerEvents: 'none'
},
innerInactive: {
...baseInnerStyle,
backgroundColor: lightInactive
},
innerActive: {
...baseInnerStyle,
backgroundColor: lightActive,
+ transition: 'opacity 0.1s',
opacity: active ? 1 : 0
}
};
return (
<div style={styles.outer}>
<div style={styles.innerInactive}></div>
<div style={styles.innerActive}></div>
</div>
);
}
}
Light.propTypes = {
active: React.PropTypes.bool.isRequired
};
export default Light; | 1 | 0.018182 | 1 | 0 |
1201c6806bac6db17dde4bf5e98e55a7cb81c087 | manifests/bits-service-webdav-certs.yml | manifests/bits-service-webdav-certs.yml | properties:
bits-service:
buildpacks:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
droplets:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
packages:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
app_stash:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
blobstore:
tls:
cert: (( file "standalone-blobstore-certs/server.crt" ))
private_key: (( file "standalone-blobstore-certs/server.key" ))
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
| - type: replace
path: /properties?/bits-service/buildpacks/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/bits-service/droplets/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/bits-service/packages/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/bits-service/app_stash/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/blobstore/tls
value:
cert: ((blobstore_ssl.certificate))
private_key: ((blobstore_ssl.private_key))
ca_cert: ((blobstore_ssl.ca))
- type: replace
path: /variables?/name=blobstore_ssl
value:
name: blobstore_ssl
type: certificate
options:
common_name: not_configured
alternative_names:
- blobstore.10.175.96.245.xip.io
- '*.10.175.96.245.xip.io'
- 10.175.96.245.xip.io
ca: default_ca
- type: replace
path: /variables?/name=default_ca
value:
name: default_ca
options:
common_name: ca
is_ca: true
type: certificate
| Fix certificates for bits-service standalone deployment | Fix certificates for bits-service standalone deployment
We never noticed that these certificates were broken, because the Ruby implementation does not validate them. Bitsgo does.
| YAML | apache-2.0 | cloudfoundry-incubator/bits-service-ci,cloudfoundry-incubator/bits-service-ci,cloudfoundry-incubator/bits-service-ci | yaml | ## Code Before:
properties:
bits-service:
buildpacks:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
droplets:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
packages:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
app_stash:
webdav_config:
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
blobstore:
tls:
cert: (( file "standalone-blobstore-certs/server.crt" ))
private_key: (( file "standalone-blobstore-certs/server.key" ))
ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
## Instruction:
Fix certificates for bits-service standalone deployment
We never noticed that these certificates were broken, because the Ruby implementation does not validate them. Bitsgo does.
## Code After:
- type: replace
path: /properties?/bits-service/buildpacks/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/bits-service/droplets/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/bits-service/packages/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/bits-service/app_stash/webdav_config/ca_cert
value: ((blobstore_ssl.ca))
- type: replace
path: /properties?/blobstore/tls
value:
cert: ((blobstore_ssl.certificate))
private_key: ((blobstore_ssl.private_key))
ca_cert: ((blobstore_ssl.ca))
- type: replace
path: /variables?/name=blobstore_ssl
value:
name: blobstore_ssl
type: certificate
options:
common_name: not_configured
alternative_names:
- blobstore.10.175.96.245.xip.io
- '*.10.175.96.245.xip.io'
- 10.175.96.245.xip.io
ca: default_ca
- type: replace
path: /variables?/name=default_ca
value:
name: default_ca
options:
common_name: ca
is_ca: true
type: certificate
| - properties:
- bits-service:
- buildpacks:
- webdav_config:
- ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
- droplets:
- webdav_config:
- ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
- packages:
- webdav_config:
- ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
- app_stash:
- webdav_config:
- ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
- blobstore:
- tls:
- cert: (( file "standalone-blobstore-certs/server.crt" ))
- private_key: (( file "standalone-blobstore-certs/server.key" ))
- ca_cert: (( file "standalone-blobstore-certs/server-ca.crt" ))
+ - type: replace
+ path: /properties?/bits-service/buildpacks/webdav_config/ca_cert
+ value: ((blobstore_ssl.ca))
+ - type: replace
+ path: /properties?/bits-service/droplets/webdav_config/ca_cert
+ value: ((blobstore_ssl.ca))
+ - type: replace
+ path: /properties?/bits-service/packages/webdav_config/ca_cert
+ value: ((blobstore_ssl.ca))
+ - type: replace
+ path: /properties?/bits-service/app_stash/webdav_config/ca_cert
+ value: ((blobstore_ssl.ca))
+ - type: replace
+ path: /properties?/blobstore/tls
+ value:
+ cert: ((blobstore_ssl.certificate))
+ private_key: ((blobstore_ssl.private_key))
+ ca_cert: ((blobstore_ssl.ca))
+
+ - type: replace
+ path: /variables?/name=blobstore_ssl
+ value:
+ name: blobstore_ssl
+ type: certificate
+ options:
+ common_name: not_configured
+ alternative_names:
+ - blobstore.10.175.96.245.xip.io
+ - '*.10.175.96.245.xip.io'
+ - 10.175.96.245.xip.io
+ ca: default_ca
+
+ - type: replace
+ path: /variables?/name=default_ca
+ value:
+ name: default_ca
+ options:
+ common_name: ca
+ is_ca: true
+ type: certificate | 59 | 3.105263 | 40 | 19 |
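The change above rewrites a literal manifest into BOSH ops-file operations: each `type: replace` names a path into the YAML document and the value (here BOSH `((variable))` placeholders backed by generated certificates) to put there. As a rough illustration only — a hypothetical helper, not the real `bosh-cli` implementation, and ignoring its array-matching rules — a `replace` op over map keys behaves roughly like this:

```javascript
// Illustrative sketch of a BOSH ops-file "type: replace" over map paths.
// A trailing '?' on a path segment marks the key as optional
// (create it if missing). Real bosh-cli semantics are richer.
function applyReplace(doc, path, value) {
  const keys = path
    .split('/')
    .filter(Boolean)
    .map((k) => k.replace(/\?$/, ''));
  let node = doc;
  for (const key of keys.slice(0, -1)) {
    if (typeof node[key] !== 'object' || node[key] === null) node[key] = {};
    node = node[key];
  }
  node[keys[keys.length - 1]] = value;
  return doc;
}

// One op from the record: fill the webdav CA cert from a variable.
const manifest = {};
applyReplace(
  manifest,
  '/properties?/bits-service/buildpacks/webdav_config/ca_cert',
  '((blobstore_ssl.ca))'
);
console.log(manifest.properties['bits-service'].buildpacks.webdav_config.ca_cert);
// ((blobstore_ssl.ca))
```

The `variables` entries at the end of the ops file are what make the placeholders resolvable: BOSH generates the `blobstore_ssl` certificate (signed by `default_ca`) at deploy time instead of reading broken files from disk.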
d69f00b245afc65495abaf39102d3bae89f3d91e | bin/mason.js | bin/mason.js |
'use strict'
import Mason from '../lib/index'
import Arguments from '../lib/cli/Arguments'
try {
let input = new Arguments();
Mason.run(input.command(), input.all());
} catch(e) {
console.error('Error: ' + e.message);
console.log(e.stack);
}
Mason.finally(() => {
console.log(' ');
}); |
'use strict'
var Mason = require('../lib/index');
var Arguments = require('../lib/cli/Arguments');
try {
var input = new Arguments();
Mason.run(input.command(), input.all());
} catch(e) {
console.error('Error: ' + e.message);
console.log(e.stack);
}
Mason.finally(() => {
console.log(' ');
}); | Use standard node for bin | Use standard node for bin
| JavaScript | mit | mason-cli/mason,aewing/mason,mason-cli/mason | javascript | ## Code Before:
'use strict'
import Mason from '../lib/index'
import Arguments from '../lib/cli/Arguments'
try {
let input = new Arguments();
Mason.run(input.command(), input.all());
} catch(e) {
console.error('Error: ' + e.message);
console.log(e.stack);
}
Mason.finally(() => {
console.log(' ');
});
## Instruction:
Use standard node for bin
## Code After:
'use strict'
var Mason = require('../lib/index');
var Arguments = require('../lib/cli/Arguments');
try {
var input = new Arguments();
Mason.run(input.command(), input.all());
} catch(e) {
console.error('Error: ' + e.message);
console.log(e.stack);
}
Mason.finally(() => {
console.log(' ');
}); |
'use strict'
- import Mason from '../lib/index'
+ var Mason = require('../lib/index');
- import Arguments from '../lib/cli/Arguments'
? ^^^^ - ^ ^^^
+ var Arguments = require('../lib/cli/Arguments');
? ^^ ^^ ^^^^^^^ ++
try {
- let input = new Arguments();
? ^^^
+ var input = new Arguments();
? ^^^
Mason.run(input.command(), input.all());
} catch(e) {
console.error('Error: ' + e.message);
console.log(e.stack);
}
Mason.finally(() => {
console.log(' ');
}); | 6 | 0.352941 | 3 | 3 |
6906b07eb6bf6caa93eac3834734127ce34a419a | app/routes/v1.js | app/routes/v1.js | var router = require('express').Router();
var Incident = require('../incidents');
router.get('/incidents/latest', (req, res) => {
Incident.find().sort('-date').limit(100).exec((err, incidents) => {
if (err) {
throw err;
}
res.json(incidents);
});
});
module.exports = router;
| var router = require('express').Router();
var Incident = require('../incidents');
var RESPONSE_LIMIT = 1000;
// Helper function for building a mongo query.
function buildQuery(params) {
let query = {};
let limit = 100;
if (params.area) {
query.area = params.area.toUpperCase();
}
if (params.type) {
query.type = params.type.toUpperCase();
}
if (params.from) {
let date = Date.parse(params.from);
if (!isNaN(date)) {
query.date = query.date || {};
query.date.$gte = new Date(date);
}
}
if (params.limit) {
limit = Math.min(parseInt(params.limit), RESPONSE_LIMIT);
}
if (params.to) {
let date = Date.parse(params.to);
if (!isNaN(date)) {
query.date = query.date || {};
query.date.$lt = new Date(date);
}
}
return [query, limit];
}
router.get('/areas', (req, res) => {
Incident.find().distinct('area').exec((err, areas) => {
if (err) {
throw err;
}
// There's one area that's all whitespace :(
let whitespaceRegex = /^\s+$/;
res.json(areas.filter((area) => !whitespaceRegex.test(area)));
});
});
router.get('/types', (req, res) => {
Incident.find().distinct('type').exec((err, types) => {
if (err) {
throw err;
}
res.json(types);
});
});
router.get('/incidents', (req, res) => {
// Build params.
let [query, limit] = buildQuery(req.query);
Incident.find(query).select('-_id -__v -geocode_response').sort('-date').limit(limit).exec((err, incidents) => {
if (err) {
throw err;
}
res.json(incidents);
});
});
module.exports = router;
| Update routes from deployed heroku version. | Update routes from deployed heroku version.
| JavaScript | mit | keokilee/hitraffic-api,hitraffic/api-server | javascript | ## Code Before:
var router = require('express').Router();
var Incident = require('../incidents');
router.get('/incidents/latest', (req, res) => {
Incident.find().sort('-date').limit(100).exec((err, incidents) => {
if (err) {
throw err;
}
res.json(incidents);
});
});
module.exports = router;
## Instruction:
Update routes from deployed heroku version.
## Code After:
var router = require('express').Router();
var Incident = require('../incidents');
var RESPONSE_LIMIT = 1000;
// Helper function for building a mongo query.
function buildQuery(params) {
let query = {};
let limit = 100;
if (params.area) {
query.area = params.area.toUpperCase();
}
if (params.type) {
query.type = params.type.toUpperCase();
}
if (params.from) {
let date = Date.parse(params.from);
if (!isNaN(date)) {
query.date = query.date || {};
query.date.$gte = new Date(date);
}
}
if (params.limit) {
limit = Math.min(parseInt(params.limit), RESPONSE_LIMIT);
}
if (params.to) {
let date = Date.parse(params.to);
if (!isNaN(date)) {
query.date = query.date || {};
query.date.$lt = new Date(date);
}
}
return [query, limit];
}
router.get('/areas', (req, res) => {
Incident.find().distinct('area').exec((err, areas) => {
if (err) {
throw err;
}
// There's one area that's all whitespace :(
let whitespaceRegex = /^\s+$/;
res.json(areas.filter((area) => !whitespaceRegex.test(area)));
});
});
router.get('/types', (req, res) => {
Incident.find().distinct('type').exec((err, types) => {
if (err) {
throw err;
}
res.json(types);
});
});
router.get('/incidents', (req, res) => {
// Build params.
let [query, limit] = buildQuery(req.query);
Incident.find(query).select('-_id -__v -geocode_response').sort('-date').limit(limit).exec((err, incidents) => {
if (err) {
throw err;
}
res.json(incidents);
});
});
module.exports = router;
| var router = require('express').Router();
var Incident = require('../incidents');
+ var RESPONSE_LIMIT = 1000;
+
+ // Helper function for building a mongo query.
+ function buildQuery(params) {
+ let query = {};
+ let limit = 100;
+
+ if (params.area) {
+ query.area = params.area.toUpperCase();
+ }
+ if (params.type) {
+ query.type = params.type.toUpperCase();
+ }
+ if (params.from) {
+ let date = Date.parse(params.from);
+ if (!isNaN(date)) {
+ query.date = query.date || {};
+ query.date.$gte = new Date(date);
+ }
+ }
+ if (params.limit) {
+ limit = Math.min(parseInt(params.limit), RESPONSE_LIMIT);
+ }
+ if (params.to) {
+ let date = Date.parse(params.to);
+ if (!isNaN(date)) {
+ query.date = query.date || {};
+ query.date.$lt = new Date(date);
+ }
+ }
+
+ return [query, limit];
+ }
+
+ router.get('/areas', (req, res) => {
+ Incident.find().distinct('area').exec((err, areas) => {
+ if (err) {
+ throw err;
+ }
+
+ // There's one area that's all whitespace :(
+ let whitespaceRegex = /^\s+$/;
+ res.json(areas.filter((area) => !whitespaceRegex.test(area)));
+ });
+ });
+
+ router.get('/types', (req, res) => {
+ Incident.find().distinct('type').exec((err, types) => {
+ if (err) {
+ throw err;
+ }
+
+ res.json(types);
+ });
+ });
+
- router.get('/incidents/latest', (req, res) => {
? -------
+ router.get('/incidents', (req, res) => {
- Incident.find().sort('-date').limit(100).exec((err, incidents) => {
+ // Build params.
+ let [query, limit] = buildQuery(req.query);
+
+ Incident.find(query).select('-_id -__v -geocode_response').sort('-date').limit(limit).exec((err, incidents) => {
if (err) {
throw err;
}
res.json(incidents);
});
});
module.exports = router; | 63 | 4.5 | 61 | 2 |
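The interesting guard in the new `buildQuery` helper is the limit handling: user input is parsed and capped at `RESPONSE_LIMIT` so a client cannot request an unbounded page. Restated from the record in isolation (the `resolveLimit`/`DEFAULT_LIMIT` names are introduced here for clarity, not part of the original API):

```javascript
const RESPONSE_LIMIT = 1000;
const DEFAULT_LIMIT = 100;

// Restates the limit handling from buildQuery above:
// default when absent, parse-and-clamp when present.
function resolveLimit(params) {
  if (!params.limit) return DEFAULT_LIMIT;
  return Math.min(parseInt(params.limit, 10), RESPONSE_LIMIT);
}

console.log(resolveLimit({}));                // 100
console.log(resolveLimit({ limit: '250' }));  // 250
console.log(resolveLimit({ limit: '5000' })); // 1000 (clamped)
```

As in the record's version, a non-numeric `limit` falls through as `NaN` (`Math.min(NaN, 1000)` is `NaN`), so a production variant would also want an `isNaN` guard, as the date parsing in the same helper already does.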
fd01ce6f5bd70bb7d3bbabc6f4d6e1bc609c71c4 | lib/web/scripts/App.coffee | lib/web/scripts/App.coffee |
Backbone = require('backbone')
[Control, Controls] = require('./Control.coffee')
Router = require('./Router.coffee')
AppView = require('./AppView.coffee')
module.exports = class App extends Backbone.Model
initialize: ->
@log "Jarvis web client here"
Backbone.$ = window.$
@router = new Router(app: this)
@controls = new Controls()
@currentPath = null
$.when(@controls.fetch()).done =>
@log "Startup tasks are finished"
@view = new AppView(el: $('body'), model: this)
@view.render()
Backbone.history.start()
@setupSocket()
setupSocket: ->
@socket = io()
@socket.on 'control:change', (controlJSON) =>
control = @controls.get(controlJSON.id)
unless control?
@log 'warn', "Received update about unknown control #{controlJSON.id}"
control.set controlJSON
log: (one, two) ->
# XXX Implement levels
if two?
console.log two
else
console.log one
|
Backbone = require('backbone')
[Control, Controls] = require('./Control.coffee')
Router = require('./Router.coffee')
AppView = require('./AppView.coffee')
module.exports = class App extends Backbone.Model
initialize: ->
@log "Jarvis web client here"
Backbone.$ = window.$
@router = new Router(app: this)
@controls = new Controls()
@currentPath = null
$.when(@controls.fetch()).done =>
@log "Startup tasks are finished"
@view = new AppView(el: $('body'), model: this)
@view.render()
Backbone.history.start()
@setupSocket()
setupSocket: ->
@socket = io()
@socket.on 'connect', => @log 'Socket connected'
@socket.on 'disconnect', => @log 'Socket disconnected!'
@socket.on 'reconnect', => @log 'Socket reconnected'
@socket.on 'reconnect_failed', => @log 'Socket gave up reconnecting!!'
@socket.on 'control:change', (controlJSON) =>
control = @controls.get(controlJSON.id)
unless control?
@log 'warn', "Received update about unknown control #{controlJSON.id}"
control.set controlJSON
log: (one, two) ->
# XXX Implement levels
if two?
console.log two
else
console.log one
| Add some debug logging for socket connections | Add some debug logging for socket connections | CoffeeScript | mit | monitron/jarvis-ha,monitron/jarvis-ha | coffeescript | ## Code Before:
Backbone = require('backbone')
[Control, Controls] = require('./Control.coffee')
Router = require('./Router.coffee')
AppView = require('./AppView.coffee')
module.exports = class App extends Backbone.Model
initialize: ->
@log "Jarvis web client here"
Backbone.$ = window.$
@router = new Router(app: this)
@controls = new Controls()
@currentPath = null
$.when(@controls.fetch()).done =>
@log "Startup tasks are finished"
@view = new AppView(el: $('body'), model: this)
@view.render()
Backbone.history.start()
@setupSocket()
setupSocket: ->
@socket = io()
@socket.on 'control:change', (controlJSON) =>
control = @controls.get(controlJSON.id)
unless control?
@log 'warn', "Received update about unknown control #{controlJSON.id}"
control.set controlJSON
log: (one, two) ->
# XXX Implement levels
if two?
console.log two
else
console.log one
## Instruction:
Add some debug logging for socket connections
## Code After:
Backbone = require('backbone')
[Control, Controls] = require('./Control.coffee')
Router = require('./Router.coffee')
AppView = require('./AppView.coffee')
module.exports = class App extends Backbone.Model
initialize: ->
@log "Jarvis web client here"
Backbone.$ = window.$
@router = new Router(app: this)
@controls = new Controls()
@currentPath = null
$.when(@controls.fetch()).done =>
@log "Startup tasks are finished"
@view = new AppView(el: $('body'), model: this)
@view.render()
Backbone.history.start()
@setupSocket()
setupSocket: ->
@socket = io()
@socket.on 'connect', => @log 'Socket connected'
@socket.on 'disconnect', => @log 'Socket disconnected!'
@socket.on 'reconnect', => @log 'Socket reconnected'
@socket.on 'reconnect_failed', => @log 'Socket gave up reconnecting!!'
@socket.on 'control:change', (controlJSON) =>
control = @controls.get(controlJSON.id)
unless control?
@log 'warn', "Received update about unknown control #{controlJSON.id}"
control.set controlJSON
log: (one, two) ->
# XXX Implement levels
if two?
console.log two
else
console.log one
|
Backbone = require('backbone')
[Control, Controls] = require('./Control.coffee')
Router = require('./Router.coffee')
AppView = require('./AppView.coffee')
module.exports = class App extends Backbone.Model
initialize: ->
@log "Jarvis web client here"
Backbone.$ = window.$
@router = new Router(app: this)
@controls = new Controls()
@currentPath = null
$.when(@controls.fetch()).done =>
@log "Startup tasks are finished"
@view = new AppView(el: $('body'), model: this)
@view.render()
Backbone.history.start()
@setupSocket()
setupSocket: ->
@socket = io()
+ @socket.on 'connect', => @log 'Socket connected'
+ @socket.on 'disconnect', => @log 'Socket disconnected!'
+ @socket.on 'reconnect', => @log 'Socket reconnected'
+ @socket.on 'reconnect_failed', => @log 'Socket gave up reconnecting!!'
@socket.on 'control:change', (controlJSON) =>
control = @controls.get(controlJSON.id)
unless control?
@log 'warn', "Received update about unknown control #{controlJSON.id}"
control.set controlJSON
log: (one, two) ->
# XXX Implement levels
if two?
console.log two
else
console.log one | 4 | 0.117647 | 4 | 0 |
6ae8c280dd3ddd5f6f14fae016bd7e63e50d4fbf | index.js | index.js | 'use strict'
const LiteralProvider = require('./lib/providers/literal-provider');
const CliProvider = require('./lib/providers/cli-provider');
const FileProvider = require('./lib/providers/file-provider');
const EnvProvider = require('./lib/providers/env-provider');
const Configurator = require('./lib/configurator');
function main() {
let configurator = new Configurator();
// Register all of the builtin providers
configurator.register('literal', LiteralProvider);
configurator.register('cli', CliProvider);
configurator.register('file', FileProvider);
configurator.register('env', EnvProvider);
return configurator;
}
module.exports = main();
| 'use strict'
const LiteralProvider = require('./lib/providers/literal-provider');
const CliProvider = require('./lib/providers/cli-provider');
const FileProvider = require('./lib/providers/file-provider');
const EnvProvider = require('./lib/providers/env-provider');
const Configurator = require('./lib/configurator');
const merge = require('./lib/utils/merge');
function main() {
let configurator = new Configurator();
// Register all of the builtin providers
configurator.register('literal', LiteralProvider);
configurator.register('cli', CliProvider);
configurator.register('file', FileProvider);
configurator.register('env', EnvProvider);
configurator.utils = { merge };
return configurator;
}
module.exports = main();
| Add merge to global package exports | Add merge to global package exports
| JavaScript | mit | rpm0618/nconfigurator | javascript | ## Code Before:
'use strict'
const LiteralProvider = require('./lib/providers/literal-provider');
const CliProvider = require('./lib/providers/cli-provider');
const FileProvider = require('./lib/providers/file-provider');
const EnvProvider = require('./lib/providers/env-provider');
const Configurator = require('./lib/configurator');
function main() {
let configurator = new Configurator();
// Register all of the builtin providers
configurator.register('literal', LiteralProvider);
configurator.register('cli', CliProvider);
configurator.register('file', FileProvider);
configurator.register('env', EnvProvider);
return configurator;
}
module.exports = main();
## Instruction:
Add merge to global package exports
## Code After:
'use strict'
const LiteralProvider = require('./lib/providers/literal-provider');
const CliProvider = require('./lib/providers/cli-provider');
const FileProvider = require('./lib/providers/file-provider');
const EnvProvider = require('./lib/providers/env-provider');
const Configurator = require('./lib/configurator');
const merge = require('./lib/utils/merge');
function main() {
let configurator = new Configurator();
// Register all of the builtin providers
configurator.register('literal', LiteralProvider);
configurator.register('cli', CliProvider);
configurator.register('file', FileProvider);
configurator.register('env', EnvProvider);
configurator.utils = { merge };
return configurator;
}
module.exports = main();
| 'use strict'
const LiteralProvider = require('./lib/providers/literal-provider');
const CliProvider = require('./lib/providers/cli-provider');
const FileProvider = require('./lib/providers/file-provider');
const EnvProvider = require('./lib/providers/env-provider');
const Configurator = require('./lib/configurator');
+
+ const merge = require('./lib/utils/merge');
function main() {
let configurator = new Configurator();
// Register all of the builtin providers
configurator.register('literal', LiteralProvider);
configurator.register('cli', CliProvider);
configurator.register('file', FileProvider);
configurator.register('env', EnvProvider);
+ configurator.utils = { merge };
+
return configurator;
}
module.exports = main(); | 4 | 0.190476 | 4 | 0 |
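The pattern in this record — configure one instance inside `main()`, hang shared helpers off it as `utils`, then export the instance itself — exposes utilities alongside a preconfigured singleton. A stripped-down sketch with stand-in classes (not the real nconfigurator internals):

```javascript
// Stand-ins for the package's real pieces.
class Configurator {
  constructor() { this.providers = {}; }
  register(name, Provider) { this.providers[name] = Provider; }
}
const merge = (a, b) => Object.assign({}, a, b); // toy merge util

function main() {
  const configurator = new Configurator();
  configurator.register('literal', class LiteralProvider {});
  configurator.utils = { merge }; // the same move the diff adds
  return configurator;
}

const config = main(); // a real module would do: module.exports = main();
console.log(Object.keys(config.providers));          // [ 'literal' ]
console.log(config.utils.merge({ a: 1 }, { b: 2 })); // { a: 1, b: 2 }
```

Consumers then reach the helper as `require('nconfigurator').utils.merge` without a separate deep import into `lib/utils`.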
ed4cf22f9d95847c173926d7ee7bd1e7e7b9523d | app/src/ui/banners/banner.tsx | app/src/ui/banners/banner.tsx | import * as React from 'react'
import { Octicon, OcticonSymbol } from '../octicons'
interface IBannerProps {
readonly id?: string
readonly timeout?: number
readonly dismissable?: boolean
readonly onDismissed: () => void
}
export class Banner extends React.Component<IBannerProps, {}> {
private timeoutId: NodeJS.Timer | null = null
public render() {
return (
<div id={this.props.id} className="banner">
<div className="contents">{this.props.children}</div>
{this.renderCloseButton()}
</div>
)
}
private renderCloseButton() {
const { dismissable } = this.props
if (dismissable === undefined || dismissable === false) {
return null
}
return (
<div className="close">
<a onClick={this.props.onDismissed}>
<Octicon symbol={OcticonSymbol.x} />
</a>
</div>
)
}
public componentDidMount = () => {
if (this.props.timeout !== undefined) {
this.timeoutId = setTimeout(() => {
this.props.onDismissed()
}, this.props.timeout)
}
}
public componentWillUnmount = () => {
if (this.props.timeout !== undefined && this.timeoutId !== null) {
clearTimeout(this.timeoutId)
}
}
}
| import * as React from 'react'
import { Octicon, OcticonSymbol } from '../octicons'
interface IBannerProps {
readonly id?: string
readonly timeout?: number
readonly dismissable?: boolean
readonly onDismissed: () => void
}
export class Banner extends React.Component<IBannerProps, {}> {
private timeoutId: number | null = null
public render() {
return (
<div id={this.props.id} className="banner">
<div className="contents">{this.props.children}</div>
{this.renderCloseButton()}
</div>
)
}
private renderCloseButton() {
const { dismissable } = this.props
if (dismissable === undefined || dismissable === false) {
return null
}
return (
<div className="close">
<a onClick={this.props.onDismissed}>
<Octicon symbol={OcticonSymbol.x} />
</a>
</div>
)
}
public componentDidMount = () => {
if (this.props.timeout !== undefined) {
this.timeoutId = window.setTimeout(() => {
this.props.onDismissed()
}, this.props.timeout)
}
}
public componentWillUnmount = () => {
if (this.props.timeout !== undefined && this.timeoutId !== null) {
window.clearTimeout(this.timeoutId)
}
}
}
| Use the Window setTimeout instead of the Node setTimeout | Use the Window setTimeout instead of the Node setTimeout
Co-Authored-By: Rafael Oleza <2cf5b502deae2e387c60721eb8244c243cb5c4e1@users.noreply.github.com>
| TypeScript | mit | say25/desktop,desktop/desktop,say25/desktop,kactus-io/kactus,desktop/desktop,artivilla/desktop,kactus-io/kactus,j-f1/forked-desktop,say25/desktop,shiftkey/desktop,desktop/desktop,shiftkey/desktop,shiftkey/desktop,kactus-io/kactus,artivilla/desktop,j-f1/forked-desktop,desktop/desktop,say25/desktop,artivilla/desktop,shiftkey/desktop,j-f1/forked-desktop,kactus-io/kactus,j-f1/forked-desktop,artivilla/desktop | typescript | ## Code Before:
import * as React from 'react'
import { Octicon, OcticonSymbol } from '../octicons'
interface IBannerProps {
readonly id?: string
readonly timeout?: number
readonly dismissable?: boolean
readonly onDismissed: () => void
}
export class Banner extends React.Component<IBannerProps, {}> {
private timeoutId: NodeJS.Timer | null = null
public render() {
return (
<div id={this.props.id} className="banner">
<div className="contents">{this.props.children}</div>
{this.renderCloseButton()}
</div>
)
}
private renderCloseButton() {
const { dismissable } = this.props
if (dismissable === undefined || dismissable === false) {
return null
}
return (
<div className="close">
<a onClick={this.props.onDismissed}>
<Octicon symbol={OcticonSymbol.x} />
</a>
</div>
)
}
public componentDidMount = () => {
if (this.props.timeout !== undefined) {
this.timeoutId = setTimeout(() => {
this.props.onDismissed()
}, this.props.timeout)
}
}
public componentWillUnmount = () => {
if (this.props.timeout !== undefined && this.timeoutId !== null) {
clearTimeout(this.timeoutId)
}
}
}
## Instruction:
Use the Window setTimeout instead of the Node setTimeout
Co-Authored-By: Rafael Oleza <2cf5b502deae2e387c60721eb8244c243cb5c4e1@users.noreply.github.com>
## Code After:
import * as React from 'react'
import { Octicon, OcticonSymbol } from '../octicons'
interface IBannerProps {
readonly id?: string
readonly timeout?: number
readonly dismissable?: boolean
readonly onDismissed: () => void
}
export class Banner extends React.Component<IBannerProps, {}> {
private timeoutId: number | null = null
public render() {
return (
<div id={this.props.id} className="banner">
<div className="contents">{this.props.children}</div>
{this.renderCloseButton()}
</div>
)
}
private renderCloseButton() {
const { dismissable } = this.props
if (dismissable === undefined || dismissable === false) {
return null
}
return (
<div className="close">
<a onClick={this.props.onDismissed}>
<Octicon symbol={OcticonSymbol.x} />
</a>
</div>
)
}
public componentDidMount = () => {
if (this.props.timeout !== undefined) {
this.timeoutId = window.setTimeout(() => {
this.props.onDismissed()
}, this.props.timeout)
}
}
public componentWillUnmount = () => {
if (this.props.timeout !== undefined && this.timeoutId !== null) {
window.clearTimeout(this.timeoutId)
}
}
}
| import * as React from 'react'
import { Octicon, OcticonSymbol } from '../octicons'
interface IBannerProps {
readonly id?: string
readonly timeout?: number
readonly dismissable?: boolean
readonly onDismissed: () => void
}
export class Banner extends React.Component<IBannerProps, {}> {
- private timeoutId: NodeJS.Timer | null = null
? ^^^^^^^^^
+ private timeoutId: number | null = null
? ^^ +
public render() {
return (
<div id={this.props.id} className="banner">
<div className="contents">{this.props.children}</div>
{this.renderCloseButton()}
</div>
)
}
private renderCloseButton() {
const { dismissable } = this.props
if (dismissable === undefined || dismissable === false) {
return null
}
return (
<div className="close">
<a onClick={this.props.onDismissed}>
<Octicon symbol={OcticonSymbol.x} />
</a>
</div>
)
}
public componentDidMount = () => {
if (this.props.timeout !== undefined) {
- this.timeoutId = setTimeout(() => {
+ this.timeoutId = window.setTimeout(() => {
? +++++++
this.props.onDismissed()
}, this.props.timeout)
}
}
public componentWillUnmount = () => {
if (this.props.timeout !== undefined && this.timeoutId !== null) {
- clearTimeout(this.timeoutId)
+ window.clearTimeout(this.timeoutId)
? +++++++
}
}
} | 6 | 0.117647 | 3 | 3 |
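The banner fix is purely about typings: the record's original field was `NodeJS.Timer` because Node's `setTimeout` returns a `Timeout` object, while the DOM's `window.setTimeout` returns a plain `number` — the right type for a browser-rendered component. The runtime difference behind those types is easy to observe when run under Node:

```javascript
// Under Node, the global setTimeout returns a Timeout object (it even
// has unref()); in a browser, window.setTimeout returns a number.
const id = setTimeout(() => {}, 0);
console.log(typeof id); // 'object' under Node ('number' in a browser)
clearTimeout(id);       // each environment accepts its own handle back
```

Calling `window.setTimeout` explicitly, as the diff does, pins the code to the DOM overload so TypeScript infers `number` even with `@types/node` in scope.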
8f437dbcccb0ff64b2073c4425e0634c90443a5f | S05-metasyntax/changed.t | S05-metasyntax/changed.t | use v6;
use Test;
plan 12;
# L<S05/Changed metacharacters>
{
# A dot . now matches any character including newline.
my $str = "abc\ndef";
ok($str ~~ /./, '. matches something');
ok($str ~~ /c.d/, '. matches \n');
# ^ and $ now always match the start/end of a string, like the old \A and \z.
ok($str ~~ /^abc/, '^ matches beginning of string');
ok(!($str ~~ /^de/), '^ does not match \n');
ok($str ~~ /def$/, '$ matches end of string');
ok(!($str ~~ /bc$/), '$ does not match \n');
# (The /m modifier is gone.)
eval-dies-ok('$str ~~ m:m/bc$/', '/m modifier (as :m) is gone');
}
# A $ no longer matches an optional preceding \n
{
my $str = "abc\ndef\n";
ok($str ~~ /def\n$/, '\n$ matches as expected');
ok(!($str ~~ /def$/), '$ does not match \n at end of string');
}
# The \A, \Z, and \z metacharacters are gone.
{
throws-like '/\A/', X::Obsolete, '\\A is gone';
throws-like '/\Z/', X::Obsolete, '\\Z is gone';
throws-like '/\z/', X::Obsolete, '\\z is gone';
}
# vim: ft=perl6
| use v6;
use Test;
plan 12;
# L<S05/Changed metacharacters>
{
# A dot . now matches any character including newline.
my $str = "abc\ndef";
ok($str ~~ /./, '. matches something');
ok($str ~~ /c.d/, '. matches \n');
# ^ and $ now always match the start/end of a string, like the old \A and \z.
ok($str ~~ /^abc/, '^ matches beginning of string');
ok(!($str ~~ /^de/), '^ does not match \n');
ok($str ~~ /def$/, '$ matches end of string');
ok(!($str ~~ /bc$/), '$ does not match \n');
# (The /m modifier is gone.)
throws-like '$str ~~ m/bc$/m', X::Obsolete, '/m modifier is gone';
}
# A $ no longer matches an optional preceding \n
{
my $str = "abc\ndef\n";
ok($str ~~ /def\n$/, '\n$ matches as expected');
ok(!($str ~~ /def$/), '$ does not match \n at end of string');
}
# The \A, \Z, and \z metacharacters are gone.
{
throws-like '/\A/', X::Obsolete, '\\A is gone';
throws-like '/\Z/', X::Obsolete, '\\Z is gone';
throws-like '/\z/', X::Obsolete, '\\z is gone';
}
# vim: ft=perl6
| Fix test for obsolete /m modifier | Fix test for obsolete /m modifier
test died for the wrong reason, m:m is very much alive
(see S05-modifier/ignoremark.t)
| Perl | artistic-2.0 | cygx/roast,skids/roast,dankogai/roast,zostay/roast,dankogai/roast,bitrauser/roast,skids/roast,dogbert17/roast,b2gills/roast,b2gills/roast,bitrauser/roast,zostay/roast,b2gills/roast,dankogai/roast,perl6/roast,cygx/roast,dogbert17/roast,skids/roast,cygx/roast,zostay/roast | perl | ## Code Before:
use v6;
use Test;
plan 12;
# L<S05/Changed metacharacters>
{
# A dot . now matches any character including newline.
my $str = "abc\ndef";
ok($str ~~ /./, '. matches something');
ok($str ~~ /c.d/, '. matches \n');
# ^ and $ now always match the start/end of a string, like the old \A and \z.
ok($str ~~ /^abc/, '^ matches beginning of string');
ok(!($str ~~ /^de/), '^ does not match \n');
ok($str ~~ /def$/, '$ matches end of string');
ok(!($str ~~ /bc$/), '$ does not match \n');
# (The /m modifier is gone.)
eval-dies-ok('$str ~~ m:m/bc$/', '/m modifier (as :m) is gone');
}
# A $ no longer matches an optional preceding \n
{
my $str = "abc\ndef\n";
ok($str ~~ /def\n$/, '\n$ matches as expected');
ok(!($str ~~ /def$/), '$ does not match \n at end of string');
}
# The \A, \Z, and \z metacharacters are gone.
{
throws-like '/\A/', X::Obsolete, '\\A is gone';
throws-like '/\Z/', X::Obsolete, '\\Z is gone';
throws-like '/\z/', X::Obsolete, '\\z is gone';
}
# vim: ft=perl6
## Instruction:
Fix test for obsolete /m modifier
test died for the wrong reason, m:m is very much alive
(see S05-modifier/ignoremark.t)
## Code After:
use v6;
use Test;
plan 12;
# L<S05/Changed metacharacters>
{
# A dot . now matches any character including newline.
my $str = "abc\ndef";
ok($str ~~ /./, '. matches something');
ok($str ~~ /c.d/, '. matches \n');
# ^ and $ now always match the start/end of a string, like the old \A and \z.
ok($str ~~ /^abc/, '^ matches beginning of string');
ok(!($str ~~ /^de/), '^ does not match \n');
ok($str ~~ /def$/, '$ matches end of string');
ok(!($str ~~ /bc$/), '$ does not match \n');
# (The /m modifier is gone.)
throws-like '$str ~~ m/bc$/m', X::Obsolete, '/m modifier is gone';
}
# A $ no longer matches an optional preceding \n
{
my $str = "abc\ndef\n";
ok($str ~~ /def\n$/, '\n$ matches as expected');
ok(!($str ~~ /def$/), '$ does not match \n at end of string');
}
# The \A, \Z, and \z metacharacters are gone.
{
throws-like '/\A/', X::Obsolete, '\\A is gone';
throws-like '/\Z/', X::Obsolete, '\\Z is gone';
throws-like '/\z/', X::Obsolete, '\\z is gone';
}
# vim: ft=perl6
| use v6;
use Test;
plan 12;
# L<S05/Changed metacharacters>
{
# A dot . now matches any character including newline.
my $str = "abc\ndef";
ok($str ~~ /./, '. matches something');
ok($str ~~ /c.d/, '. matches \n');
# ^ and $ now always match the start/end of a string, like the old \A and \z.
ok($str ~~ /^abc/, '^ matches beginning of string');
ok(!($str ~~ /^de/), '^ does not match \n');
ok($str ~~ /def$/, '$ matches end of string');
ok(!($str ~~ /bc$/), '$ does not match \n');
# (The /m modifier is gone.)
- eval-dies-ok('$str ~~ m:m/bc$/', '/m modifier (as :m) is gone');
+ throws-like '$str ~~ m/bc$/m', X::Obsolete, '/m modifier is gone';
}
# A $ no longer matches an optional preceding \n
{
my $str = "abc\ndef\n";
ok($str ~~ /def\n$/, '\n$ matches as expected');
ok(!($str ~~ /def$/), '$ does not match \n at end of string');
}
# The \A, \Z, and \z metacharacters are gone.
{
throws-like '/\A/', X::Obsolete, '\\A is gone';
throws-like '/\Z/', X::Obsolete, '\\Z is gone';
throws-like '/\z/', X::Obsolete, '\\z is gone';
}
# vim: ft=perl6 | 2 | 0.052632 | 1 | 1 |
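The fix in the record above swaps `eval-dies-ok` (which only checks that code dies) for `throws-like`, which also asserts the exception *type* — the test had been "passing" for the wrong reason. A minimal Python sketch of the same testing idea, catching one specific error class instead of accepting any failure; the stdlib `re` module and the sample patterns are illustrative stand-ins, not part of the record:

```python
import re

def raises_regex_error(pattern: str) -> bool:
    """True only when compiling `pattern` raises re.error specifically.

    Any other exception propagates, so a test built on this helper
    cannot pass "for the wrong reason".
    """
    try:
        re.compile(pattern)
    except re.error:
        return True
    return False

# An unterminated group is a syntax error in Python's re dialect...
assert raises_regex_error("(unclosed")
# ...while a plain literal compiles fine.
assert not raises_regex_error("abc")
```

The point mirrors the roast fix: a test that asserts only "it fails" can be satisfied by an unrelated error; asserting the error class pins down *why* it fails.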
50a5a664601c5bf59484999097ff9827ac93a8cf | app/controllers/admin/users_controller.rb | app/controllers/admin/users_controller.rb | class Admin::UsersController < ApplicationController
ssl_required :oauth, :api_key, :regenerate_api_key
before_filter :login_required
before_filter :get_user, only: [:show, :update, :destroy]
def show
not_authorized unless current_user.organization.present? && current_user.organization_owner
end
def create
not_authorized unless current_user.organization.present? && current_user.organization_owner
end
def update
not_authorized unless current_user.organization.present? && current_user.organization_owner
end
def destroy
not_authorized unless current_user.organization.present? && current_user.organization_owner
@user.destroy
end
private
def get_user
@user = current_user.organization.users_dataset.where(username: params[:id]).first
raise RecordNotFound unless @user
end
end
| class Admin::UsersController < ApplicationController
ssl_required :oauth, :api_key, :regenerate_api_key
before_filter :login_required, :check_permissions
before_filter :get_user, only: [:show, :update, :destroy]
def show
end
def create
@user = User.new
@user.set_fields(params[:user], [:username, :email, :password])
@user.organization = current_user.organization
@user.save
respond_with @user
end
def update
@user.set_fields(params[:user], [:quota_in_bytes, :email])
if attributes[:password].present?
@user.password = attributes[:password]
end
@user.save
respond_with @user
end
def destroy
@user.destroy
end
private
def get_user
@user = current_user.organization.users_dataset.where(username: params[:id]).first
raise RecordNotFound unless @user
end
def check_permissions
not_authorized unless current_user.organization.present? && current_user.organization_owner
end
end
| Implement create and update on organization users controller | Implement create and update on organization users controller
| Ruby | bsd-3-clause | future-analytics/cartodb,raquel-ucl/cartodb,raquel-ucl/cartodb,splashblot/dronedb,DigitalCoder/cartodb,nyimbi/cartodb,future-analytics/cartodb,dbirchak/cartodb,dbirchak/cartodb,DigitalCoder/cartodb,thorncp/cartodb,bloomberg/cartodb,CartoDB/cartodb,raquel-ucl/cartodb,thorncp/cartodb,UCL-ShippingGroup/cartodb-1,nuxcode/cartodb,nyimbi/cartodb,UCL-ShippingGroup/cartodb-1,splashblot/dronedb,thorncp/cartodb,thorncp/cartodb,nuxcode/cartodb,bloomberg/cartodb,CartoDB/cartodb,codeandtheory/cartodb,splashblot/dronedb,CartoDB/cartodb,DigitalCoder/cartodb,codeandtheory/cartodb,CartoDB/cartodb,codeandtheory/cartodb,UCL-ShippingGroup/cartodb-1,UCL-ShippingGroup/cartodb-1,raquel-ucl/cartodb,nuxcode/cartodb,codeandtheory/cartodb,nuxcode/cartodb,DigitalCoder/cartodb,nuxcode/cartodb,bloomberg/cartodb,CartoDB/cartodb,dbirchak/cartodb,future-analytics/cartodb,splashblot/dronedb,bloomberg/cartodb,DigitalCoder/cartodb,nyimbi/cartodb,UCL-ShippingGroup/cartodb-1,splashblot/dronedb,thorncp/cartodb,raquel-ucl/cartodb,bloomberg/cartodb,future-analytics/cartodb,dbirchak/cartodb,nyimbi/cartodb,future-analytics/cartodb,codeandtheory/cartodb,nyimbi/cartodb,dbirchak/cartodb | ruby | ## Code Before:
class Admin::UsersController < ApplicationController
ssl_required :oauth, :api_key, :regenerate_api_key
before_filter :login_required
before_filter :get_user, only: [:show, :update, :destroy]
def show
not_authorized unless current_user.organization.present? && current_user.organization_owner
end
def create
not_authorized unless current_user.organization.present? && current_user.organization_owner
end
def update
not_authorized unless current_user.organization.present? && current_user.organization_owner
end
def destroy
not_authorized unless current_user.organization.present? && current_user.organization_owner
@user.destroy
end
private
def get_user
@user = current_user.organization.users_dataset.where(username: params[:id]).first
raise RecordNotFound unless @user
end
end
## Instruction:
Implement create and update on organization users controller
## Code After:
class Admin::UsersController < ApplicationController
ssl_required :oauth, :api_key, :regenerate_api_key
before_filter :login_required, :check_permissions
before_filter :get_user, only: [:show, :update, :destroy]
def show
end
def create
@user = User.new
@user.set_fields(params[:user], [:username, :email, :password])
@user.organization = current_user.organization
@user.save
respond_with @user
end
def update
@user.set_fields(params[:user], [:quota_in_bytes, :email])
if attributes[:password].present?
@user.password = attributes[:password]
end
@user.save
respond_with @user
end
def destroy
@user.destroy
end
private
def get_user
@user = current_user.organization.users_dataset.where(username: params[:id]).first
raise RecordNotFound unless @user
end
def check_permissions
not_authorized unless current_user.organization.present? && current_user.organization_owner
end
end
| class Admin::UsersController < ApplicationController
ssl_required :oauth, :api_key, :regenerate_api_key
+
- before_filter :login_required
+ before_filter :login_required, :check_permissions
? ++++++++++++++++++++
before_filter :get_user, only: [:show, :update, :destroy]
def show
- not_authorized unless current_user.organization.present? && current_user.organization_owner
end
def create
- not_authorized unless current_user.organization.present? && current_user.organization_owner
+ @user = User.new
+ @user.set_fields(params[:user], [:username, :email, :password])
+ @user.organization = current_user.organization
+ @user.save
+ respond_with @user
end
def update
- not_authorized unless current_user.organization.present? && current_user.organization_owner
+ @user.set_fields(params[:user], [:quota_in_bytes, :email])
+ if attributes[:password].present?
+ @user.password = attributes[:password]
+ end
+
+ @user.save
+ respond_with @user
end
def destroy
- not_authorized unless current_user.organization.present? && current_user.organization_owner
@user.destroy
end
private
def get_user
@user = current_user.organization.users_dataset.where(username: params[:id]).first
raise RecordNotFound unless @user
end
+
+ def check_permissions
+ not_authorized unless current_user.organization.present? && current_user.organization_owner
+ end
end | 23 | 0.793103 | 18 | 5 |
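The refactor above hoists four identical `not_authorized unless …` lines into a single `check_permissions` run as a `before_filter`. The same DRY move can be sketched in Python with a decorator; the `user` dict shape and the `NotAuthorized` class here are assumptions for illustration, not CartoDB's API:

```python
from functools import wraps

class NotAuthorized(Exception):
    pass

def require_org_owner(action):
    """Run the permission check once, before every wrapped action."""
    @wraps(action)
    def wrapper(user, *args, **kwargs):
        # Same predicate as the Ruby filter: must belong to an
        # organization AND be its owner.
        if not (user.get("organization") and user.get("org_owner")):
            raise NotAuthorized(action.__name__)
        return action(user, *args, **kwargs)
    return wrapper

@require_org_owner
def destroy(user, username):
    return f"destroyed {username}"
```

As in the Ruby version, adding a new action no longer requires remembering to repeat the check inside its body.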
f6c2f9b09700eafae66cd45dae0f6bf1dca5068e | prime.clj/src/prime/clj.clj | prime.clj/src/prime/clj.clj | (ns prime.clj
(:gen-class))
(require '[clojure.math.numeric-tower :as math])
(defn prime? [x]
(cond
(integer? (math/sqrt x)) false
:else (every?
(fn [y] (not= (mod x y) 0))
(range 2 (math/ceil (math/sqrt x))))))
(defn -main
[& args]
(def v [])
(def i 0)
(while (< (count v) (Integer/parseInt (first args)))
(cond (prime? i) (def v (conj v i)))
(def i (inc i)))
(println v))
| (ns prime.clj
(:gen-class))
(require '[clojure.math.numeric-tower :as math])
(defn prime? [x]
(if
(integer? (math/sqrt x)) false
(every?
(fn [y] (not= (mod x y) 0))
(range 2 (math/ceil (math/sqrt x))))))
(defn -main
[& args]
(def v [])
(def i 0)
(while (< (count v) (Integer/parseInt (first args)))
(cond (prime? i) (def v (conj v i)))
(def i (inc i)))
(println v))
| Replace cond with if for shorter, clearer code. | Replace cond with if for shorter, clearer code.
| Clojure | bsd-3-clause | noamknispel/ThePrimeBenchMark,noamknispel/ThePrimeBenchMark,noamknispel/ThePrimeBenchMark,noamknispel/ThePrimeBenchMark,noamknispel/ThePrimeBenchMark | clojure | ## Code Before:
(ns prime.clj
(:gen-class))
(require '[clojure.math.numeric-tower :as math])
(defn prime? [x]
(cond
(integer? (math/sqrt x)) false
:else (every?
(fn [y] (not= (mod x y) 0))
(range 2 (math/ceil (math/sqrt x))))))
(defn -main
[& args]
(def v [])
(def i 0)
(while (< (count v) (Integer/parseInt (first args)))
(cond (prime? i) (def v (conj v i)))
(def i (inc i)))
(println v))
## Instruction:
Replace cond with if for shorter, clearer code.
## Code After:
(ns prime.clj
(:gen-class))
(require '[clojure.math.numeric-tower :as math])
(defn prime? [x]
(if
(integer? (math/sqrt x)) false
(every?
(fn [y] (not= (mod x y) 0))
(range 2 (math/ceil (math/sqrt x))))))
(defn -main
[& args]
(def v [])
(def i 0)
(while (< (count v) (Integer/parseInt (first args)))
(cond (prime? i) (def v (conj v i)))
(def i (inc i)))
(println v))
| (ns prime.clj
(:gen-class))
(require '[clojure.math.numeric-tower :as math])
(defn prime? [x]
- (cond
+ (if
(integer? (math/sqrt x)) false
- :else (every?
? ------
+ (every?
- (fn [y] (not= (mod x y) 0))
? -----
+ (fn [y] (not= (mod x y) 0))
- (range 2 (math/ceil (math/sqrt x))))))
? -----
+ (range 2 (math/ceil (math/sqrt x))))))
(defn -main
[& args]
(def v [])
(def i 0)
(while (< (count v) (Integer/parseInt (first args)))
(cond (prime? i) (def v (conj v i)))
(def i (inc i)))
(println v)) | 8 | 0.380952 | 4 | 4 |
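The logic being reshaped above has two branches, which is exactly why `if` reads better than `cond`: perfect squares short-circuit to composite, and everything else is trial-divided up to (but excluding) the rounded-up square root — the exclusive upper bound would miss a divisor equal to the exact root, and the perfect-square branch is what covers that case. A line-for-line sketch in Python (the `x < 2` guard is an addition the Clojure original leaves implicit):

```python
import math

def is_prime(x: int) -> bool:
    if x < 2:                        # guard: 0 and 1 are not prime
        return False
    if math.isqrt(x) ** 2 == x:      # perfect square -> composite (x >= 4 here)
        return False
    # Trial division stops just below ceil(sqrt(x)), as in the Clojure range.
    return all(x % y for y in range(2, math.ceil(math.sqrt(x))))

def first_primes(n: int):
    """Mirror of -main: collect the first n primes by counting up from zero."""
    found, i = [], 0
    while len(found) < n:
        if is_prime(i):
            found.append(i)
        i += 1
    return found
```

Note how 49 is caught only by the square branch: `range(2, 7)` never tests the divisor 7 itself.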
3c145a47a20a55787b8ce8a4f00ff3449306ccbc | blazar/cmd/manager.py | blazar/cmd/manager.py |
import eventlet
eventlet.monkey_patch()
import gettext
import sys
from oslo_config import cfg
from oslo_service import service
gettext.install('blazar')
from blazar.db import api as db_api
from blazar.manager import service as manager_service
from blazar.notification import notifier
from blazar.utils import service as service_utils
def main():
cfg.CONF(project='blazar', prog='blazar-manager')
service_utils.prepare_service(sys.argv)
db_api.setup_db()
notifier.init()
service.launch(
cfg.CONF,
manager_service.ManagerService()
).wait()
if __name__ == '__main__':
main()
|
import eventlet
eventlet.monkey_patch()
import gettext
import sys
from oslo_config import cfg
from oslo_service import service
gettext.install('blazar')
from blazar.db import api as db_api
from blazar.manager import service as manager_service
from blazar.notification import notifier
from blazar.utils import service as service_utils
def main():
cfg.CONF(project='blazar', prog='blazar-manager')
service_utils.prepare_service(sys.argv)
db_api.setup_db()
notifier.init()
service.launch(
cfg.CONF,
manager_service.ManagerService(),
restart_method='mutate'
).wait()
if __name__ == '__main__':
main()
| Enable mutable config in blazar | Enable mutable config in blazar
New releases of oslo.config support a 'mutable' parameter in Opts.
oslo.service provides an option [1] which allows services to tell it
they want mutate_config_files to be called by passing a parameter.
This commit is to use the same approach. This allows Blazar to benefit
from [2], where the 'debug' option (owned by oslo.log) is made mutable.
We should be able to turn debug logging on and off by changing the
config and sending a SIGHUP signal to blazar-manager.
However, please note that blazar-manager currently doesn't work
correctly after receiving a SIGHUP. As a result the mutable config is
not yet usable. Operators should continue restarting blazar-manager
after changing blazar.conf.
TC goal: https://governance.openstack.org/tc/goals/rocky/enable-mutable-configuration.html
[1] https://review.openstack.org/263312/
[2] https://review.openstack.org/254821/
Change-Id: Ieea9043b6f3a28dc92717680585614a68227120e
| Python | apache-2.0 | stackforge/blazar,stackforge/blazar,openstack/blazar,ChameleonCloud/blazar,ChameleonCloud/blazar,openstack/blazar | python | ## Code Before:
import eventlet
eventlet.monkey_patch()
import gettext
import sys
from oslo_config import cfg
from oslo_service import service
gettext.install('blazar')
from blazar.db import api as db_api
from blazar.manager import service as manager_service
from blazar.notification import notifier
from blazar.utils import service as service_utils
def main():
cfg.CONF(project='blazar', prog='blazar-manager')
service_utils.prepare_service(sys.argv)
db_api.setup_db()
notifier.init()
service.launch(
cfg.CONF,
manager_service.ManagerService()
).wait()
if __name__ == '__main__':
main()
## Instruction:
Enable mutable config in blazar
New releases of oslo.config support a 'mutable' parameter in Opts.
oslo.service provides an option [1] which allows services to tell it
they want mutate_config_files to be called by passing a parameter.
This commit is to use the same approach. This allows Blazar to benefit
from [2], where the 'debug' option (owned by oslo.log) is made mutable.
We should be able to turn debug logging on and off by changing the
config and sending a SIGHUP signal to blazar-manager.
However, please note that blazar-manager currently doesn't work
correctly after receiving a SIGHUP. As a result the mutable config is
not yet usable. Operators should continue restarting blazar-manager
after changing blazar.conf.
TC goal: https://governance.openstack.org/tc/goals/rocky/enable-mutable-configuration.html
[1] https://review.openstack.org/263312/
[2] https://review.openstack.org/254821/
Change-Id: Ieea9043b6f3a28dc92717680585614a68227120e
## Code After:
import eventlet
eventlet.monkey_patch()
import gettext
import sys
from oslo_config import cfg
from oslo_service import service
gettext.install('blazar')
from blazar.db import api as db_api
from blazar.manager import service as manager_service
from blazar.notification import notifier
from blazar.utils import service as service_utils
def main():
cfg.CONF(project='blazar', prog='blazar-manager')
service_utils.prepare_service(sys.argv)
db_api.setup_db()
notifier.init()
service.launch(
cfg.CONF,
manager_service.ManagerService(),
restart_method='mutate'
).wait()
if __name__ == '__main__':
main()
|
import eventlet
eventlet.monkey_patch()
import gettext
import sys
from oslo_config import cfg
from oslo_service import service
gettext.install('blazar')
from blazar.db import api as db_api
from blazar.manager import service as manager_service
from blazar.notification import notifier
from blazar.utils import service as service_utils
def main():
cfg.CONF(project='blazar', prog='blazar-manager')
service_utils.prepare_service(sys.argv)
db_api.setup_db()
notifier.init()
service.launch(
cfg.CONF,
- manager_service.ManagerService()
+ manager_service.ManagerService(),
? +
+ restart_method='mutate'
).wait()
if __name__ == '__main__':
main() | 3 | 0.096774 | 2 | 1 |
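The commit message above describes the general mechanism: options marked mutable can be changed on disk and applied by sending SIGHUP, which makes the service re-read configuration instead of restarting. A minimal stdlib sketch of that idea — the `CONFIG` dict, the mutable-key set, and the handler body are illustrative assumptions; oslo.config/oslo.service implement this via `mutate_config_files` internally:

```python
import signal

CONFIG = {"debug": False, "host": "0.0.0.0"}
MUTABLE_KEYS = {"debug"}             # only options marked mutable may change

def mutate_config(new_values: dict) -> dict:
    """Apply changed values, but only for mutable options."""
    for key, value in new_values.items():
        if key in MUTABLE_KEYS:
            CONFIG[key] = value
    return CONFIG

def _on_sighup(signum, frame):
    # A real service would re-read its config file here.
    mutate_config({"debug": True})

if hasattr(signal, "SIGHUP"):        # SIGHUP does not exist on Windows
    signal.signal(signal.SIGHUP, _on_sighup)
```

Note the caveat the message itself makes: registering a handler is the easy part — the process must also survive the signal cleanly, which blazar-manager did not yet do at the time.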
2eb4fcb2d75e6f93f33b1ae41098919f6d4ebc92 | neovim/__init__.py | neovim/__init__.py | from client import Client
from uv_stream import UvStream
__all__ = ['Client', 'UvStream', 'c']
| from client import Client
from uv_stream import UvStream
__all__ = ['connect']
def connect(address, port=None):
client = Client(UvStream(address, port))
client.discover_api()
return client.vim
| Add helper function for connecting with Neovim | Add helper function for connecting with Neovim
| Python | apache-2.0 | fwalch/python-client,Shougo/python-client,bfredl/python-client,brcolow/python-client,starcraftman/python-client,traverseda/python-client,justinmk/python-client,neovim/python-client,meitham/python-client,meitham/python-client,0x90sled/python-client,justinmk/python-client,zchee/python-client,traverseda/python-client,bfredl/python-client,neovim/python-client,starcraftman/python-client,0x90sled/python-client,fwalch/python-client,Shougo/python-client,zchee/python-client,timeyyy/python-client,brcolow/python-client,timeyyy/python-client | python | ## Code Before:
from client import Client
from uv_stream import UvStream
__all__ = ['Client', 'UvStream', 'c']
## Instruction:
Add helper function for connecting with Neovim
## Code After:
from client import Client
from uv_stream import UvStream
__all__ = ['connect']
def connect(address, port=None):
client = Client(UvStream(address, port))
client.discover_api()
return client.vim
| from client import Client
from uv_stream import UvStream
- __all__ = ['Client', 'UvStream', 'c']
+ __all__ = ['connect']
+
+ def connect(address, port=None):
+ client = Client(UvStream(address, port))
+ client.discover_api()
+ return client.vim | 7 | 1.75 | 6 | 1 |
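The helper added in that record is a small factory: construct the transport, wrap it in a client, run API discovery, and hand back the ready-to-use handle — one call instead of three steps at every call site. A sketch of that shape with stand-in classes (the real `Client` and `UvStream` live in the neovim package and involve more setup):

```python
class UvStreamStub:
    """Stand-in for the msgpack transport."""
    def __init__(self, address, port=None):
        self.address, self.port = address, port

class ClientStub:
    """Stand-in for the RPC client."""
    def __init__(self, stream):
        self.stream = stream
        self.vim = None

    def discover_api(self):
        # Placeholder for real API discovery over the stream.
        self.vim = f"vim@{self.stream.address}"

def connect(address, port=None):
    """Build the stream, wire the client, discover the API, return the handle."""
    client = ClientStub(UvStreamStub(address, port))
    client.discover_api()
    return client.vim
```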
27fe9d6531a2e76affd9388db53c0433062a9cfa | photonix/photos/management/commands/create_library.py | photonix/photos/management/commands/create_library.py | import os
from pathlib import Path
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand
from django.db.utils import IntegrityError
from photonix.photos.models import Library, LibraryPath, LibraryUser
from photonix.photos.utils.db import record_photo
from photonix.photos.utils.fs import determine_destination, download_file
User = get_user_model()
class Command(BaseCommand):
help = 'Create a library for a user'
def create_library(self, username, library_name):
# Get user
user = User.objects.get(username=username)
# Create Library
library, _ = Library.objects.get_or_create(
name=library_name,
)
library_path, _ = LibraryPath.objects.get_or_create(
library=library,
type='St',
backend_type='Lo',
path='/data/photos/',
url='/photos/',
)
library_user, _ = LibraryUser.objects.get_or_create(
library=library,
user=user,
owner=True,
)
print(f'Library "{library_name}" created successfully for user "{username}"')
def add_arguments(self, parser):
# Positional arguments
parser.add_argument('username', nargs='+', type=str)
parser.add_argument('library_name', nargs='+', type=str)
def handle(self, *args, **options):
self.create_library(options['username'][0], options['library_name'][0])
| import os
from pathlib import Path
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand
from django.db.utils import IntegrityError
from photonix.photos.models import Library, LibraryPath, LibraryUser
from photonix.photos.utils.db import record_photo
from photonix.photos.utils.fs import determine_destination, download_file
User = get_user_model()
class Command(BaseCommand):
help = 'Create a library for a user'
def create_library(self, username, library_name, path):
# Get user
user = User.objects.get(username=username)
# Create Library
library, _ = Library.objects.get_or_create(
name=library_name,
)
library_path, _ = LibraryPath.objects.get_or_create(
library=library,
type='St',
backend_type='Lo',
path=path,
)
library_user, _ = LibraryUser.objects.get_or_create(
library=library,
user=user,
owner=True,
)
print(f'Library "{library_name}" with path "{path}" created successfully for user "{username}"')
def add_arguments(self, parser):
# Positional arguments
parser.add_argument('username', type=str)
parser.add_argument('library_name', type=str)
parser.add_argument('--path', type=str, default='/data/photos')
def handle(self, *args, **options):
self.create_library(options['username'], options['library_name'], options['path'])
| Fix path cannot be set when creating new library | Fix path cannot be set when creating new library | Python | agpl-3.0 | damianmoore/photo-manager,damianmoore/photo-manager,damianmoore/photo-manager,damianmoore/photo-manager | python | ## Code Before:
import os
from pathlib import Path
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand
from django.db.utils import IntegrityError
from photonix.photos.models import Library, LibraryPath, LibraryUser
from photonix.photos.utils.db import record_photo
from photonix.photos.utils.fs import determine_destination, download_file
User = get_user_model()
class Command(BaseCommand):
help = 'Create a library for a user'
def create_library(self, username, library_name):
# Get user
user = User.objects.get(username=username)
# Create Library
library, _ = Library.objects.get_or_create(
name=library_name,
)
library_path, _ = LibraryPath.objects.get_or_create(
library=library,
type='St',
backend_type='Lo',
path='/data/photos/',
url='/photos/',
)
library_user, _ = LibraryUser.objects.get_or_create(
library=library,
user=user,
owner=True,
)
print(f'Library "{library_name}" created successfully for user "{username}"')
def add_arguments(self, parser):
# Positional arguments
parser.add_argument('username', nargs='+', type=str)
parser.add_argument('library_name', nargs='+', type=str)
def handle(self, *args, **options):
self.create_library(options['username'][0], options['library_name'][0])
## Instruction:
Fix path cannot be set when creating new library
## Code After:
import os
from pathlib import Path
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand
from django.db.utils import IntegrityError
from photonix.photos.models import Library, LibraryPath, LibraryUser
from photonix.photos.utils.db import record_photo
from photonix.photos.utils.fs import determine_destination, download_file
User = get_user_model()
class Command(BaseCommand):
help = 'Create a library for a user'
def create_library(self, username, library_name, path):
# Get user
user = User.objects.get(username=username)
# Create Library
library, _ = Library.objects.get_or_create(
name=library_name,
)
library_path, _ = LibraryPath.objects.get_or_create(
library=library,
type='St',
backend_type='Lo',
path=path,
)
library_user, _ = LibraryUser.objects.get_or_create(
library=library,
user=user,
owner=True,
)
print(f'Library "{library_name}" with path "{path}" created successfully for user "{username}"')
def add_arguments(self, parser):
# Positional arguments
parser.add_argument('username', type=str)
parser.add_argument('library_name', type=str)
parser.add_argument('--path', type=str, default='/data/photos')
def handle(self, *args, **options):
self.create_library(options['username'], options['library_name'], options['path'])
| import os
from pathlib import Path
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand
from django.db.utils import IntegrityError
from photonix.photos.models import Library, LibraryPath, LibraryUser
from photonix.photos.utils.db import record_photo
from photonix.photos.utils.fs import determine_destination, download_file
User = get_user_model()
class Command(BaseCommand):
help = 'Create a library for a user'
- def create_library(self, username, library_name):
+ def create_library(self, username, library_name, path):
? ++++++
# Get user
user = User.objects.get(username=username)
# Create Library
library, _ = Library.objects.get_or_create(
name=library_name,
)
library_path, _ = LibraryPath.objects.get_or_create(
library=library,
type='St',
backend_type='Lo',
- path='/data/photos/',
? ^^^ --- ------
+ path=path,
? ^
- url='/photos/',
)
library_user, _ = LibraryUser.objects.get_or_create(
library=library,
user=user,
owner=True,
)
- print(f'Library "{library_name}" created successfully for user "{username}"')
+ print(f'Library "{library_name}" with path "{path}" created successfully for user "{username}"')
? +++++++++++++++++++
def add_arguments(self, parser):
# Positional arguments
- parser.add_argument('username', nargs='+', type=str)
? -----------
+ parser.add_argument('username', type=str)
- parser.add_argument('library_name', nargs='+', type=str)
? -----------
+ parser.add_argument('library_name', type=str)
+ parser.add_argument('--path', type=str, default='/data/photos')
def handle(self, *args, **options):
- self.create_library(options['username'][0], options['library_name'][0])
? --- ^
+ self.create_library(options['username'], options['library_name'], options['path'])
? +++++++++ ^^^^^^
| 14 | 0.297872 | 7 | 7 |
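The bug fixed in that record stems from `nargs='+'`, which makes argparse return a *list* even for a single value — hence the old `options['username'][0]` indexing — plus the missing `--path` option. Django management commands delegate `add_arguments` to the stdlib `argparse`, so the corrected behavior can be sketched directly (argument names follow the record; the sample values are illustrative):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="create_library")
    parser.add_argument("username")       # plain positional -> scalar string
    parser.add_argument("library_name")   # with nargs='+' this would be a list
    parser.add_argument("--path", default="/data/photos")
    return parser

args = build_parser().parse_args(["alice", "Holidays", "--path", "/mnt/photos"])
```

With scalars, the handler can read `options['username']` directly, and `--path` falls back to its default when omitted.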
393d40151564bb4354747b173c10d89375c0f262 | app/models/spina/blog/post.rb | app/models/spina/blog/post.rb |
module Spina
module Blog
# Spina::Blog::Post
class Post < ApplicationRecord
extend FriendlyId
friendly_id :title, use: :slugged
belongs_to :image, optional: true
belongs_to :user
belongs_to :category, inverse_of: :posts
validates :title, :content, presence: true
before_save :set_published_at
# Create a 301 redirect if the slug changed
after_save :rewrite_rule, if: -> { saved_change_to_slug? }
scope :available, -> { where('published_at <= ?', Time.zone.now) }
scope :future, -> { where('published_at >= ?', Time.zone.now) }
scope :draft, -> { where(draft: true) }
scope :live, -> { where(draft: false) }
private
def set_published_at
self.published_at = Time.now if !draft? && published_at.blank?
end
def should_generate_new_friendly_id?
slug.blank? || draft_changed? || super
end
def rewrite_rule
RewriteRule.create(old_path: "/blog/posts/#{slug_before_last_save}",
new_path: "/blog/posts/#{slug}")
end
end
end
end
|
module Spina
module Blog
# Spina::Blog::Post
class Post < ApplicationRecord
extend FriendlyId
friendly_id :title, use: :slugged
belongs_to :image, optional: true
belongs_to :user
belongs_to :category, inverse_of: :posts
validates :title, :content, presence: true
before_save :set_published_at
# Create a 301 redirect if the slug changed
after_save :rewrite_rule, if: -> { saved_change_to_slug? }, on: :update
scope :available, -> { where('published_at <= ?', Time.zone.now) }
scope :future, -> { where('published_at >= ?', Time.zone.now) }
scope :draft, -> { where(draft: true) }
scope :live, -> { where(draft: false) }
private
def set_published_at
self.published_at = Time.now if !draft? && published_at.blank?
end
def should_generate_new_friendly_id?
slug.blank? || draft_changed? || super
end
def rewrite_rule
::Spina::RewriteRule.create(
old_path: "/blog/posts/#{slug_before_last_save}",
new_path: "/blog/posts/#{slug}"
)
end
end
end
end
| Use top level namespace to ensure reloading works properly | Use top level namespace to ensure reloading works properly
| Ruby | mit | initforthe/spina-blog,initforthe/spina-blog,initforthe/spina-blog | ruby | ## Code Before:
module Spina
module Blog
# Spina::Blog::Post
class Post < ApplicationRecord
extend FriendlyId
friendly_id :title, use: :slugged
belongs_to :image, optional: true
belongs_to :user
belongs_to :category, inverse_of: :posts
validates :title, :content, presence: true
before_save :set_published_at
# Create a 301 redirect if the slug changed
after_save :rewrite_rule, if: -> { saved_change_to_slug? }
scope :available, -> { where('published_at <= ?', Time.zone.now) }
scope :future, -> { where('published_at >= ?', Time.zone.now) }
scope :draft, -> { where(draft: true) }
scope :live, -> { where(draft: false) }
private
def set_published_at
self.published_at = Time.now if !draft? && published_at.blank?
end
def should_generate_new_friendly_id?
slug.blank? || draft_changed? || super
end
def rewrite_rule
RewriteRule.create(old_path: "/blog/posts/#{slug_before_last_save}",
new_path: "/blog/posts/#{slug}")
end
end
end
end
## Instruction:
Use top level namespace to ensure reloading works properly
## Code After:
module Spina
module Blog
# Spina::Blog::Post
class Post < ApplicationRecord
extend FriendlyId
friendly_id :title, use: :slugged
belongs_to :image, optional: true
belongs_to :user
belongs_to :category, inverse_of: :posts
validates :title, :content, presence: true
before_save :set_published_at
# Create a 301 redirect if the slug changed
after_save :rewrite_rule, if: -> { saved_change_to_slug? }, on: :update
scope :available, -> { where('published_at <= ?', Time.zone.now) }
scope :future, -> { where('published_at >= ?', Time.zone.now) }
scope :draft, -> { where(draft: true) }
scope :live, -> { where(draft: false) }
private
def set_published_at
self.published_at = Time.now if !draft? && published_at.blank?
end
def should_generate_new_friendly_id?
slug.blank? || draft_changed? || super
end
def rewrite_rule
::Spina::RewriteRule.create(
old_path: "/blog/posts/#{slug_before_last_save}",
new_path: "/blog/posts/#{slug}"
)
end
end
end
end
|
module Spina
module Blog
# Spina::Blog::Post
class Post < ApplicationRecord
extend FriendlyId
friendly_id :title, use: :slugged
belongs_to :image, optional: true
belongs_to :user
belongs_to :category, inverse_of: :posts
validates :title, :content, presence: true
before_save :set_published_at
# Create a 301 redirect if the slug changed
- after_save :rewrite_rule, if: -> { saved_change_to_slug? }
+ after_save :rewrite_rule, if: -> { saved_change_to_slug? }, on: :update
? +++++++++++++
scope :available, -> { where('published_at <= ?', Time.zone.now) }
scope :future, -> { where('published_at >= ?', Time.zone.now) }
scope :draft, -> { where(draft: true) }
scope :live, -> { where(draft: false) }
private
def set_published_at
self.published_at = Time.now if !draft? && published_at.blank?
end
def should_generate_new_friendly_id?
slug.blank? || draft_changed? || super
end
def rewrite_rule
+ ::Spina::RewriteRule.create(
- RewriteRule.create(old_path: "/blog/posts/#{slug_before_last_save}",
? ^^^^^^^^^^^^^^^^^^^
+ old_path: "/blog/posts/#{slug_before_last_save}",
? ^^
- new_path: "/blog/posts/#{slug}")
? ----------------- -
+ new_path: "/blog/posts/#{slug}"
+ )
end
end
end
end | 8 | 0.186047 | 5 | 3 |
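The callback in that record fires only when the slug actually changed (`saved_change_to_slug?`), recording a 301 from the old URL to the new one. The invariant is easy to state language-neutrally; here is a sketch in Python, where the paths follow the record's `/blog/posts/<slug>` scheme and an in-memory list stands in for the `RewriteRule` table:

```python
class Post:
    def __init__(self, slug: str):
        self.slug = slug
        self.rewrite_rules = []            # (old_path, new_path) pairs

    def update_slug(self, new_slug: str) -> None:
        """Record a redirect only when the slug really changes."""
        if new_slug == self.slug:
            return                         # mirrors the saved_change_to_slug? guard
        old_slug, self.slug = self.slug, new_slug
        self.rewrite_rules.append(
            (f"/blog/posts/{old_slug}", f"/blog/posts/{new_slug}")
        )
```

Without the guard, every save would mint a redundant self-redirect — the condition, not the rule creation, is what keeps old links working cheaply.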
589f5c779d346ca57cce57e07304975d535c67f9 | .circleci/config.yml | .circleci/config.yml | version: 2
jobs:
build:
docker:
- image: debian:stable
working_directory: /src/cl-png
steps:
- checkout
- run: apt-get update -y && apt-get install -y sbcl wget gcc libpng-dev
- run: wget https://beta.quicklisp.org/quicklisp.lisp
- run: sbcl --load quicklisp.lisp --eval "(quicklisp-quickstart:install)" --eval '(ql:quickload "cffi")' --quit
- run: ln -s /src/cl-png /root/quicklisp/local-projects/png
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:png-test)" --eval "(lisp-unit:run-all-tests :png-test)" --quit
| version: 2
jobs:
build:
docker:
- image: debian:stable
working_directory: /src/cl-png
steps:
- checkout
- run: apt-get update -y && apt-get install -y sbcl wget gcc libpng-dev
- run: wget https://beta.quicklisp.org/quicklisp.lisp
- run: sbcl --load quicklisp.lisp --eval "(quicklisp-quickstart:install)" --eval '(ql:quickload "cffi")' --quit
- run: ln -s /src/cl-png /root/quicklisp/local-projects/png
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:png-test)" --eval "(lisp-unit:run-all-tests :png-test)" --quit
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:image-test)" --eval "(lisp-unit:run-all-tests :image-test)" --quit
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:bmp-test)" --eval "(lisp-unit:run-all-tests :bmp-test)" --quit
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:ops-test)" --eval "(lisp-unit:run-all-tests :ops-test)" --quit
| Add other tests to CircleCI | Add other tests to CircleCI
| YAML | lgpl-2.1 | ljosa/cl-png,ljosa/cl-png,ljosa/cl-png | yaml | ## Code Before:
version: 2
jobs:
build:
docker:
- image: debian:stable
working_directory: /src/cl-png
steps:
- checkout
- run: apt-get update -y && apt-get install -y sbcl wget gcc libpng-dev
- run: wget https://beta.quicklisp.org/quicklisp.lisp
- run: sbcl --load quicklisp.lisp --eval "(quicklisp-quickstart:install)" --eval '(ql:quickload "cffi")' --quit
- run: ln -s /src/cl-png /root/quicklisp/local-projects/png
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:png-test)" --eval "(lisp-unit:run-all-tests :png-test)" --quit
## Instruction:
Add other tests to CircleCI
## Code After:
version: 2
jobs:
build:
docker:
- image: debian:stable
working_directory: /src/cl-png
steps:
- checkout
- run: apt-get update -y && apt-get install -y sbcl wget gcc libpng-dev
- run: wget https://beta.quicklisp.org/quicklisp.lisp
- run: sbcl --load quicklisp.lisp --eval "(quicklisp-quickstart:install)" --eval '(ql:quickload "cffi")' --quit
- run: ln -s /src/cl-png /root/quicklisp/local-projects/png
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:png-test)" --eval "(lisp-unit:run-all-tests :png-test)" --quit
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:image-test)" --eval "(lisp-unit:run-all-tests :image-test)" --quit
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:bmp-test)" --eval "(lisp-unit:run-all-tests :bmp-test)" --quit
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:ops-test)" --eval "(lisp-unit:run-all-tests :ops-test)" --quit
| version: 2
jobs:
build:
docker:
- image: debian:stable
working_directory: /src/cl-png
steps:
- checkout
- run: apt-get update -y && apt-get install -y sbcl wget gcc libpng-dev
- run: wget https://beta.quicklisp.org/quicklisp.lisp
- run: sbcl --load quicklisp.lisp --eval "(quicklisp-quickstart:install)" --eval '(ql:quickload "cffi")' --quit
- run: ln -s /src/cl-png /root/quicklisp/local-projects/png
- run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:png-test)" --eval "(lisp-unit:run-all-tests :png-test)" --quit
+
+ - run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:image-test)" --eval "(lisp-unit:run-all-tests :image-test)" --quit
+
+ - run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:bmp-test)" --eval "(lisp-unit:run-all-tests :bmp-test)" --quit
+
+ - run: sbcl --load /root/quicklisp/setup.lisp --eval '(ql:quickload "png")' --eval "(asdf:oos 'asdf:load-op '#:ops-test)" --eval "(lisp-unit:run-all-tests :ops-test)" --quit | 6 | 0.3 | 6 | 0 |
3b99435da98ca8ad6335340d9745fdd30dde82f7 | metadata/com.ultrafunk.network_info.txt | metadata/com.ultrafunk.network_info.txt | Categories:System
License:Apache2
Web Site:https://github.com/ultrafunk/netinfo/blob/HEAD/README.md
Source Code:https://github.com/ultrafunk/netinfo
Issue Tracker:https://github.com/ultrafunk/netinfo/issues
Auto Name:NetInfo Widget
Summary:Widget for network connectivity
Description:
Easy to use single or dual widget with a clean design that shows the current
mobile data and/or WiFi connection details, with quick shortcuts to change
the settings for each of them and easy access to turn any of them ON or OFF.
.
Repo Type:git
Repo:https://github.com/ultrafunk/netinfo
Build:1.5,11
commit=da899a089e19ae483045b9bfa7c18fb09706a72b
subdir=NetInfo_Widget
gradle=yes
Build:1.5.1,12
commit=2eaa496f69c39ca422afe422fa1abfad90141937
subdir=NetInfo_Widget
gradle=yes
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:1.5.2
Current Version Code:13
| Categories:System
License:Apache2
Web Site:https://github.com/ultrafunk/netinfo/blob/HEAD/README.md
Source Code:https://github.com/ultrafunk/netinfo
Issue Tracker:https://github.com/ultrafunk/netinfo/issues
Auto Name:NetInfo Widget
Summary:Widget for network connectivity
Description:
Easy to use single or dual widget with a clean design that shows the current
mobile data and/or WiFi connection details, with quick shortcuts to change
the settings for each of them and easy access to turn any of them ON or OFF.
.
Repo Type:git
Repo:https://github.com/ultrafunk/netinfo
Build:1.5,11
commit=da899a089e19ae483045b9bfa7c18fb09706a72b
subdir=NetInfo_Widget
gradle=yes
Build:1.5.1,12
commit=2eaa496f69c39ca422afe422fa1abfad90141937
subdir=NetInfo_Widget
gradle=yes
Build:1.5.2,13
commit=2ddcd8a0aed2d712f3df50d9bd4345106bb14d8a
subdir=NetInfo_Widget
gradle=yes
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:1.5.2
Current Version Code:13
| Update NetInfo Widget to 1.5.2 (13) | Update NetInfo Widget to 1.5.2 (13)
| Text | agpl-3.0 | f-droid/fdroid-data,f-droid/fdroiddata,f-droid/fdroiddata | text | ## Code Before:
Categories:System
License:Apache2
Web Site:https://github.com/ultrafunk/netinfo/blob/HEAD/README.md
Source Code:https://github.com/ultrafunk/netinfo
Issue Tracker:https://github.com/ultrafunk/netinfo/issues
Auto Name:NetInfo Widget
Summary:Widget for network connectivity
Description:
Easy to use single or dual widget with a clean design that shows the current
mobile data and/or WiFi connection details, with quick shortcuts to change
the settings for each of them and easy access to turn any of them ON or OFF.
.
Repo Type:git
Repo:https://github.com/ultrafunk/netinfo
Build:1.5,11
commit=da899a089e19ae483045b9bfa7c18fb09706a72b
subdir=NetInfo_Widget
gradle=yes
Build:1.5.1,12
commit=2eaa496f69c39ca422afe422fa1abfad90141937
subdir=NetInfo_Widget
gradle=yes
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:1.5.2
Current Version Code:13
## Instruction:
Update NetInfo Widget to 1.5.2 (13)
## Code After:
Categories:System
License:Apache2
Web Site:https://github.com/ultrafunk/netinfo/blob/HEAD/README.md
Source Code:https://github.com/ultrafunk/netinfo
Issue Tracker:https://github.com/ultrafunk/netinfo/issues
Auto Name:NetInfo Widget
Summary:Widget for network connectivity
Description:
Easy to use single or dual widget with a clean design that shows the current
mobile data and/or WiFi connection details, with quick shortcuts to change
the settings for each of them and easy access to turn any of them ON or OFF.
.
Repo Type:git
Repo:https://github.com/ultrafunk/netinfo
Build:1.5,11
commit=da899a089e19ae483045b9bfa7c18fb09706a72b
subdir=NetInfo_Widget
gradle=yes
Build:1.5.1,12
commit=2eaa496f69c39ca422afe422fa1abfad90141937
subdir=NetInfo_Widget
gradle=yes
Build:1.5.2,13
commit=2ddcd8a0aed2d712f3df50d9bd4345106bb14d8a
subdir=NetInfo_Widget
gradle=yes
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:1.5.2
Current Version Code:13
| Categories:System
License:Apache2
Web Site:https://github.com/ultrafunk/netinfo/blob/HEAD/README.md
Source Code:https://github.com/ultrafunk/netinfo
Issue Tracker:https://github.com/ultrafunk/netinfo/issues
Auto Name:NetInfo Widget
Summary:Widget for network connectivity
Description:
Easy to use single or dual widget with a clean design that shows the current
mobile data and/or WiFi connection details, with quick shortcuts to change
the settings for each of them and easy access to turn any of them ON or OFF.
.
Repo Type:git
Repo:https://github.com/ultrafunk/netinfo
Build:1.5,11
commit=da899a089e19ae483045b9bfa7c18fb09706a72b
subdir=NetInfo_Widget
gradle=yes
Build:1.5.1,12
commit=2eaa496f69c39ca422afe422fa1abfad90141937
subdir=NetInfo_Widget
gradle=yes
+ Build:1.5.2,13
+ commit=2ddcd8a0aed2d712f3df50d9bd4345106bb14d8a
+ subdir=NetInfo_Widget
+ gradle=yes
+
Auto Update Mode:None
Update Check Mode:RepoManifest
Current Version:1.5.2
Current Version Code:13
| 5 | 0.15625 | 5 | 0 |
4016662c5df80787593198363008f784d5e73ed6 | package.json | package.json | {
"name": "sqlpad-project",
"version": "4.2.0",
"private": true,
"devDependencies": {
"husky": "^4.2.3",
"lint-staged": "^10.0.9",
"prettier": "^1.19.1"
},
"prettier": {
"singleQuote": true
},
"husky": {
"hooks": {
"pre-commit": "npx lint-staged"
}
},
"lint-staged": {
"*.{js,json,css}": [
"prettier --write",
"git add"
]
},
"scripts": {
"fixlint": "npm run fixlint --prefix client && npm run fixlint --prefix server && prettier --write \"**/*.js\"",
"lint": "npm run lint --prefix client && npm run lint --prefix server && prettier --check \"**/*.js\""
}
}
| {
"name": "sqlpad-project",
"version": "4.2.0",
"private": true,
"devDependencies": {
"husky": "^4.2.3",
"lint-staged": "^10.0.9",
"prettier": "^1.19.1"
},
"prettier": {
"singleQuote": true
},
"husky": {
"hooks": {
"pre-commit": "npx lint-staged"
}
},
"lint-staged": {
"*.{js,json,css}": [
"prettier --write"
]
},
"scripts": {
"fixlint": "npm run fixlint --prefix client && npm run fixlint --prefix server && prettier --write \"**/*.js\"",
"lint": "npm run lint --prefix client && npm run lint --prefix server && prettier --check \"**/*.js\""
}
}
| Remove git add from husky task | Remove git add from husky task
| JSON | mit | rickbergfalk/sqlpad,rickbergfalk/sqlpad,rickbergfalk/sqlpad | json | ## Code Before:
{
"name": "sqlpad-project",
"version": "4.2.0",
"private": true,
"devDependencies": {
"husky": "^4.2.3",
"lint-staged": "^10.0.9",
"prettier": "^1.19.1"
},
"prettier": {
"singleQuote": true
},
"husky": {
"hooks": {
"pre-commit": "npx lint-staged"
}
},
"lint-staged": {
"*.{js,json,css}": [
"prettier --write",
"git add"
]
},
"scripts": {
"fixlint": "npm run fixlint --prefix client && npm run fixlint --prefix server && prettier --write \"**/*.js\"",
"lint": "npm run lint --prefix client && npm run lint --prefix server && prettier --check \"**/*.js\""
}
}
## Instruction:
Remove git add from husky task
## Code After:
{
"name": "sqlpad-project",
"version": "4.2.0",
"private": true,
"devDependencies": {
"husky": "^4.2.3",
"lint-staged": "^10.0.9",
"prettier": "^1.19.1"
},
"prettier": {
"singleQuote": true
},
"husky": {
"hooks": {
"pre-commit": "npx lint-staged"
}
},
"lint-staged": {
"*.{js,json,css}": [
"prettier --write"
]
},
"scripts": {
"fixlint": "npm run fixlint --prefix client && npm run fixlint --prefix server && prettier --write \"**/*.js\"",
"lint": "npm run lint --prefix client && npm run lint --prefix server && prettier --check \"**/*.js\""
}
}
| {
"name": "sqlpad-project",
"version": "4.2.0",
"private": true,
"devDependencies": {
"husky": "^4.2.3",
"lint-staged": "^10.0.9",
"prettier": "^1.19.1"
},
"prettier": {
"singleQuote": true
},
"husky": {
"hooks": {
"pre-commit": "npx lint-staged"
}
},
"lint-staged": {
"*.{js,json,css}": [
- "prettier --write",
? -
+ "prettier --write"
- "git add"
]
},
"scripts": {
"fixlint": "npm run fixlint --prefix client && npm run fixlint --prefix server && prettier --write \"**/*.js\"",
"lint": "npm run lint --prefix client && npm run lint --prefix server && prettier --check \"**/*.js\""
}
} | 3 | 0.107143 | 1 | 2 |
0dd05f28829e201bf68863a3381ba9cc208b9801 | wiki/articles/build_react.yml | wiki/articles/build_react.yml |
name: Build React App
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Use Node.js 14.x
uses: actions/setup-node@v1
with:
node-version: 14.x
- name: Cache node modules
uses: actions/cache@v1
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Download dependencies with npm
run: npm install
- run: npm ci
- run: npm run lint && npm run build --if-present
- name: Run the tests and generate coverage report
run: npm test -- --coverage
- name: Adding coverage badge
uses: demyanets/angular-coverage-badges-action@v1
with:
coverage-summary-path: ./coverage/coverage-summary.json
github_token: ${{ secrets.GITHUB_TOKEN }}
|
name: Build React App
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Use Node.js 14.x
uses: actions/setup-node@v1
with:
node-version: 14.x
- name: Cache node modules
uses: actions/cache@v1
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Download dependencies with npm
run: npm install
- run: npm ci
- run: npm run lint && npm run build --if-present
- name: Run the tests and generate coverage report
run: npm test -- --coverage
- name: Adding coverage badge
uses: demyanets/angular-coverage-badges-action@v1
with:
coverage-summary-path: ./coverage/coverage-summary.json
github_token: ${{ secrets.GITHUB_TOKEN }}
| Fix React GH build script | Fix React GH build script
| YAML | lgpl-2.1 | uqbar-project/wiki,uqbar-project/wiki,uqbar-project/wiki,uqbar-project/wiki,uqbar-project/wiki | yaml | ## Code Before:
name: Build React App
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Use Node.js 14.x
uses: actions/setup-node@v1
with:
node-version: 14.x
- name: Cache node modules
uses: actions/cache@v1
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Download dependencies with npm
run: npm install
- run: npm ci
- run: npm run lint && npm run build --if-present
- name: Run the tests and generate coverage report
run: npm test -- --coverage
- name: Adding coverage badge
uses: demyanets/angular-coverage-badges-action@v1
with:
coverage-summary-path: ./coverage/coverage-summary.json
github_token: ${{ secrets.GITHUB_TOKEN }}
## Instruction:
Fix React GH build script
## Code After:
name: Build React App
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Use Node.js 14.x
uses: actions/setup-node@v1
with:
node-version: 14.x
- name: Cache node modules
uses: actions/cache@v1
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Download dependencies with npm
run: npm install
- run: npm ci
- run: npm run lint && npm run build --if-present
- name: Run the tests and generate coverage report
run: npm test -- --coverage
- name: Adding coverage badge
uses: demyanets/angular-coverage-badges-action@v1
with:
coverage-summary-path: ./coverage/coverage-summary.json
github_token: ${{ secrets.GITHUB_TOKEN }}
|
name: Build React App
+ on: [push, pull_request]
- on:
- push:
- branches: [ master ]
- pull_request:
- branches: [ master ]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Use Node.js 14.x
uses: actions/setup-node@v1
with:
node-version: 14.x
- name: Cache node modules
uses: actions/cache@v1
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Download dependencies with npm
run: npm install
- run: npm ci
- run: npm run lint && npm run build --if-present
- name: Run the tests and generate coverage report
run: npm test -- --coverage
- name: Adding coverage badge
uses: demyanets/angular-coverage-badges-action@v1
with:
coverage-summary-path: ./coverage/coverage-summary.json
github_token: ${{ secrets.GITHUB_TOKEN }} | 6 | 0.166667 | 1 | 5 |
608d4df2277570032e5fd5202ab7757acc7c83db | app/templates/client/index.html | app/templates/client/index.html | <!DOCTYPE html>
<html ng-app="<%= appname %>">
<head>
<base href="/">
<title><%= appname %></title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width">
<link rel="icon" href="assets/favicon.ico" type="image/x-icon">
<!-- build:css app.css -->
<!-- bower:css -->
<!-- endinject -->
<link rel="stylesheet" href="styles/css/app.css">
<!-- endbuild -->
</head>
<body>
<strong><%= appname %></strong><% if (filters.auth) { %>
<nav-bar></nav-bar><% } %>
<div ui-view="app"></div><% if (filters.sockets) { %>
<script src="socket.io/socket.io.js"></script><% } %>
<!-- build:js app.js -->
<!-- bower:js -->
<!-- endinject -->
<!-- inject:js -->
<!-- endinject -->
<!-- endbuild --><% if (filters.reload === 'livereload') { %>
<script src="http://localhost:35729/livereload.js?snipver=1"></script><% } %>
</body>
</html>
| <!DOCTYPE html>
<html ng-app="<%= appname %>">
<head>
<script type="text/javascript">
// Detect Node (for NodeWebkit and Electron)
var isNode = (typeof process === 'object');
if (!isNode)
document.write('<base href="/">');
</script>
<title><%= appname %></title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width">
<link rel="icon" href="assets/favicon.ico" type="image/x-icon">
<!-- build:css app.css -->
<!-- bower:css -->
<!-- endinject -->
<link rel="stylesheet" href="styles/css/app.css">
<!-- endbuild -->
</head>
<body>
<strong><%= appname %></strong><% if (filters.auth) { %>
<nav-bar></nav-bar><% } %>
<div ui-view="app"></div><% if (filters.sockets) { %>
<script src="socket.io/socket.io.js"></script><% } %>
<!-- build:js app.js -->
<!-- bower:js -->
<!-- endinject -->
<!-- inject:js -->
<!-- endinject -->
<!-- endbuild --><% if (filters.reload === 'livereload') { %>
<script src="http://localhost:35729/livereload.js?snipver=1"></script><% } %>
</body>
</html>
| Add isNode variable and base-href depending on it | Client: Add isNode variable and base-href depending on it
| HTML | bsd-3-clause | NovaeWorkshop/Nova,NovaeWorkshop/Nova,NovaeWorkshop/Nova | html | ## Code Before:
<!DOCTYPE html>
<html ng-app="<%= appname %>">
<head>
<base href="/">
<title><%= appname %></title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width">
<link rel="icon" href="assets/favicon.ico" type="image/x-icon">
<!-- build:css app.css -->
<!-- bower:css -->
<!-- endinject -->
<link rel="stylesheet" href="styles/css/app.css">
<!-- endbuild -->
</head>
<body>
<strong><%= appname %></strong><% if (filters.auth) { %>
<nav-bar></nav-bar><% } %>
<div ui-view="app"></div><% if (filters.sockets) { %>
<script src="socket.io/socket.io.js"></script><% } %>
<!-- build:js app.js -->
<!-- bower:js -->
<!-- endinject -->
<!-- inject:js -->
<!-- endinject -->
<!-- endbuild --><% if (filters.reload === 'livereload') { %>
<script src="http://localhost:35729/livereload.js?snipver=1"></script><% } %>
</body>
</html>
## Instruction:
Client: Add isNode variable and base-href depending on it
## Code After:
<!DOCTYPE html>
<html ng-app="<%= appname %>">
<head>
<script type="text/javascript">
// Detect Node (for NodeWebkit and Electron)
var isNode = (typeof process === 'object');
if (!isNode)
document.write('<base href="/">');
</script>
<title><%= appname %></title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width">
<link rel="icon" href="assets/favicon.ico" type="image/x-icon">
<!-- build:css app.css -->
<!-- bower:css -->
<!-- endinject -->
<link rel="stylesheet" href="styles/css/app.css">
<!-- endbuild -->
</head>
<body>
<strong><%= appname %></strong><% if (filters.auth) { %>
<nav-bar></nav-bar><% } %>
<div ui-view="app"></div><% if (filters.sockets) { %>
<script src="socket.io/socket.io.js"></script><% } %>
<!-- build:js app.js -->
<!-- bower:js -->
<!-- endinject -->
<!-- inject:js -->
<!-- endinject -->
<!-- endbuild --><% if (filters.reload === 'livereload') { %>
<script src="http://localhost:35729/livereload.js?snipver=1"></script><% } %>
</body>
</html>
| <!DOCTYPE html>
<html ng-app="<%= appname %>">
<head>
- <base href="/">
+ <script type="text/javascript">
+ // Detect Node (for NodeWebkit and Electron)
+ var isNode = (typeof process === 'object');
+ if (!isNode)
+ document.write('<base href="/">');
+ </script>
<title><%= appname %></title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width">
<link rel="icon" href="assets/favicon.ico" type="image/x-icon">
<!-- build:css app.css -->
<!-- bower:css -->
<!-- endinject -->
<link rel="stylesheet" href="styles/css/app.css">
<!-- endbuild -->
</head>
<body>
<strong><%= appname %></strong><% if (filters.auth) { %>
<nav-bar></nav-bar><% } %>
<div ui-view="app"></div><% if (filters.sockets) { %>
<script src="socket.io/socket.io.js"></script><% } %>
<!-- build:js app.js -->
<!-- bower:js -->
<!-- endinject -->
<!-- inject:js -->
<!-- endinject -->
<!-- endbuild --><% if (filters.reload === 'livereload') { %>
<script src="http://localhost:35729/livereload.js?snipver=1"></script><% } %>
</body>
</html> | 7 | 0.148936 | 6 | 1 |
a7da36d8376de9f2eccac11f9e3db35618ecf44a | source/index.js | source/index.js | import normalizeEvents from './tools/normalizeEvents';
import normalizeListener from './tools/normalizeListener';
export default function stereo () {
let listeners = {};
let emitter = {
on(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
listener = normalizeListener(listener);
// Register the listener.
for (let event of events) {
let register = listeners[event];
if (!register) listeners[event] = [listener];
else if (!register.includes(listener)) {
register.push(listener);
}
}
},
once(events, listener) {
events = normalizeEvents(events);
listener = normalizeListener(listener);
function listenerWrapper (...args) {
emitter.off(events, listenerWrapper);
listener(...args);
}
emitter.on(events, listenerWrapper);
},
off(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
// If no listener is specified, unregister all listeners.
if (listener == null) for (let event of events) {
delete listeners[event];
}
// Otherwise unregister the given listener.
else {
listener = normalizeListener(listener);
for (let event of events) {
let register = listeners[event];
let index = register.indexOf(listener);
if (index !== -1) register.splice(index, 1);
}
}
},
emit(events, ...args) {
// Normalize arguments.
events = normalizeEvents(events);
// Dispatch listeners.
for (let event of events) {
let register = listeners[event];
if (register) for (let listener of register) {
listener(...args);
}
}
}
};
return emitter;
}
| import normalizeEvents from './tools/normalizeEvents';
import normalizeListener from './tools/normalizeListener';
export default function stereo () {
let listeners = {};
let emitter = {
on(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
listener = normalizeListener(listener);
// Register the listener.
for (let event of events) {
let register = listeners[event];
if (!register) listeners[event] = new Set([listener]);
else if (!register.has(listener)) register.add(listener);
}
},
once(events, listener) {
events = normalizeEvents(events);
listener = normalizeListener(listener);
function listenerWrapper (...args) {
emitter.off(events, listenerWrapper);
listener(...args);
}
emitter.on(events, listenerWrapper);
},
off(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
// If no listener is specified, unregister all listeners.
if (listener == null) for (let event of events) {
delete listeners[event];
}
// Otherwise unregister the given listener.
else {
listener = normalizeListener(listener);
for (let event of events) {
listeners[event].delete(listener);
}
}
},
emit(events, ...args) {
// Normalize arguments.
events = normalizeEvents(events);
// Dispatch listeners.
for (let event of events) {
let register = listeners[event];
if (register) for (let listener of register) {
listener(...args);
}
}
}
};
return emitter;
}
| Switch from arrays to Sets | Switch from arrays to Sets
| JavaScript | mit | stoeffel/stereo,tomekwi/stereo | javascript | ## Code Before:
import normalizeEvents from './tools/normalizeEvents';
import normalizeListener from './tools/normalizeListener';
export default function stereo () {
let listeners = {};
let emitter = {
on(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
listener = normalizeListener(listener);
// Register the listener.
for (let event of events) {
let register = listeners[event];
if (!register) listeners[event] = [listener];
else if (!register.includes(listener)) {
register.push(listener);
}
}
},
once(events, listener) {
events = normalizeEvents(events);
listener = normalizeListener(listener);
function listenerWrapper (...args) {
emitter.off(events, listenerWrapper);
listener(...args);
}
emitter.on(events, listenerWrapper);
},
off(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
// If no listener is specified, unregister all listeners.
if (listener == null) for (let event of events) {
delete listeners[event];
}
// Otherwise unregister the given listener.
else {
listener = normalizeListener(listener);
for (let event of events) {
let register = listeners[event];
let index = register.indexOf(listener);
if (index !== -1) register.splice(index, 1);
}
}
},
emit(events, ...args) {
// Normalize arguments.
events = normalizeEvents(events);
// Dispatch listeners.
for (let event of events) {
let register = listeners[event];
if (register) for (let listener of register) {
listener(...args);
}
}
}
};
return emitter;
}
## Instruction:
Switch from arrays to Sets
## Code After:
import normalizeEvents from './tools/normalizeEvents';
import normalizeListener from './tools/normalizeListener';
export default function stereo () {
let listeners = {};
let emitter = {
on(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
listener = normalizeListener(listener);
// Register the listener.
for (let event of events) {
let register = listeners[event];
if (!register) listeners[event] = new Set([listener]);
else if (!register.has(listener)) register.add(listener);
}
},
once(events, listener) {
events = normalizeEvents(events);
listener = normalizeListener(listener);
function listenerWrapper (...args) {
emitter.off(events, listenerWrapper);
listener(...args);
}
emitter.on(events, listenerWrapper);
},
off(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
// If no listener is specified, unregister all listeners.
if (listener == null) for (let event of events) {
delete listeners[event];
}
// Otherwise unregister the given listener.
else {
listener = normalizeListener(listener);
for (let event of events) {
listeners[event].delete(listener);
}
}
},
emit(events, ...args) {
// Normalize arguments.
events = normalizeEvents(events);
// Dispatch listeners.
for (let event of events) {
let register = listeners[event];
if (register) for (let listener of register) {
listener(...args);
}
}
}
};
return emitter;
}
| import normalizeEvents from './tools/normalizeEvents';
import normalizeListener from './tools/normalizeListener';
export default function stereo () {
let listeners = {};
let emitter = {
on(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
listener = normalizeListener(listener);
// Register the listener.
for (let event of events) {
let register = listeners[event];
- if (!register) listeners[event] = [listener];
+ if (!register) listeners[event] = new Set([listener]);
? ++++++++ +
+ else if (!register.has(listener)) register.add(listener);
- else if (!register.includes(listener)) {
- register.push(listener);
- }
}
},
once(events, listener) {
events = normalizeEvents(events);
listener = normalizeListener(listener);
function listenerWrapper (...args) {
emitter.off(events, listenerWrapper);
listener(...args);
}
emitter.on(events, listenerWrapper);
},
off(events, listener) {
// Normalize arguments.
events = normalizeEvents(events);
// If no listener is specified, unregister all listeners.
if (listener == null) for (let event of events) {
delete listeners[event];
}
// Otherwise unregister the given listener.
else {
listener = normalizeListener(listener);
-
for (let event of events) {
+ listeners[event].delete(listener);
- let register = listeners[event];
- let index = register.indexOf(listener);
- if (index !== -1) register.splice(index, 1);
}
}
},
emit(events, ...args) {
// Normalize arguments.
events = normalizeEvents(events);
// Dispatch listeners.
for (let event of events) {
let register = listeners[event];
if (register) for (let listener of register) {
listener(...args);
}
}
}
};
return emitter;
} | 11 | 0.15493 | 3 | 8 |
d5f71d02bf7d5e4813fdfb17b1ee0b8d8c7773ca | docker-compose.yml | docker-compose.yml | db:
image: postgres
environment:
- PGHOST=db
- PGDATABASE=postgres
- PGUSER=postgres
api:
build: .
entrypoint: "npm start"
ports:
- "9000:9000"
links:
- db
environment:
- PG_URL=postgres://postgres:postgres@db/postgres
- NODE_ENV=development
| db:
image: postgres
environment:
- PGHOST=db
- PGUSER=postgres
- PGPASSWORD=postgres
api:
image: studentmediene/radiorevolt-api-dev:latest
entrypoint: "npm start"
ports:
- "9000:9000"
links:
- db
environment:
- PG_URL=postgres://postgres:postgres@db/postgres
- NODE_ENV=development
| Use Docker Hub for images | Use Docker Hub for images
| YAML | mit | Studentmediene/RadioRevolt-API | yaml | ## Code Before:
db:
image: postgres
environment:
- PGHOST=db
- PGDATABASE=postgres
- PGUSER=postgres
api:
build: .
entrypoint: "npm start"
ports:
- "9000:9000"
links:
- db
environment:
- PG_URL=postgres://postgres:postgres@db/postgres
- NODE_ENV=development
## Instruction:
Use Docker Hub for images
## Code After:
db:
image: postgres
environment:
- PGHOST=db
- PGUSER=postgres
- PGPASSWORD=postgres
api:
image: studentmediene/radiorevolt-api-dev:latest
entrypoint: "npm start"
ports:
- "9000:9000"
links:
- db
environment:
- PG_URL=postgres://postgres:postgres@db/postgres
- NODE_ENV=development
| db:
image: postgres
environment:
- PGHOST=db
- - PGDATABASE=postgres
- PGUSER=postgres
+ - PGPASSWORD=postgres
api:
- build: .
+ image: studentmediene/radiorevolt-api-dev:latest
entrypoint: "npm start"
ports:
- "9000:9000"
links:
- db
environment:
- PG_URL=postgres://postgres:postgres@db/postgres
- NODE_ENV=development | 4 | 0.25 | 2 | 2 |
4283dcadb072045fc271601f07d1c9f50361f4e6 | roles/macbook_setup/tasks/nvidia.yml | roles/macbook_setup/tasks/nvidia.yml | ---
- name: 'Add the nvidia drivers repository'
become: true
apt_repository:
repo: 'ppa:graphics-drivers/ppa'
state: present
update_cache: yes
filename: 'ppa_nvidia_graphics_drivers'
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700"
- name: 'Install the updated nvidia graphics drivers'
become: true
apt:
name: 'nvidia-387'
state: latest
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700"
| ---
- name: 'Add the nvidia drivers repository'
become: true
apt_repository:
repo: 'ppa:graphics-drivers/ppa'
state: present
update_cache: yes
filename: 'ppa_nvidia_graphics_drivers'
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700"
- name: 'Install the supported nvidia graphics drivers [NVIDIA Corporation GF108GL [Quadro 600] (rev a1)]'
become: true
apt:
name: 'nvidia-340'
state: latest
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700"
| Install the correct version of the Nvidia graphics driver | Install the correct version of the Nvidia graphics driver
| YAML | mit | marvinpinto/laptop,marvinpinto/laptop,marvinpinto/laptop | yaml | ## Code Before:
---
- name: 'Add the nvidia drivers repository'
become: true
apt_repository:
repo: 'ppa:graphics-drivers/ppa'
state: present
update_cache: yes
filename: 'ppa_nvidia_graphics_drivers'
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700"
- name: 'Install the updated nvidia graphics drivers'
become: true
apt:
name: 'nvidia-387'
state: latest
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700"
## Instruction:
Install the correct version of the Nvidia graphics driver
## Code After:
---
- name: 'Add the nvidia drivers repository'
become: true
apt_repository:
repo: 'ppa:graphics-drivers/ppa'
state: present
update_cache: yes
filename: 'ppa_nvidia_graphics_drivers'
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700"
- name: 'Install the supported nvidia graphics drivers [NVIDIA Corporation GF108GL [Quadro 600] (rev a1)]'
become: true
apt:
name: 'nvidia-340'
state: latest
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700"
| ---
- name: 'Add the nvidia drivers repository'
become: true
apt_repository:
repo: 'ppa:graphics-drivers/ppa'
state: present
update_cache: yes
filename: 'ppa_nvidia_graphics_drivers'
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700"
- - name: 'Install the updated nvidia graphics drivers'
+ - name: 'Install the supported nvidia graphics drivers [NVIDIA Corporation GF108GL [Quadro 600] (rev a1)]'
become: true
apt:
- name: 'nvidia-387'
? ^^
+ name: 'nvidia-340'
? ^^
state: latest
when:
- ansible_system_vendor == "Dell Inc."
- ansible_product_name == "Precision T1700" | 4 | 0.2 | 2 | 2 |
15a5ecb9cc0a3837bd863e2181b1a81db9f6db79 | README.md | README.md | Infection
=========
[](https://travis-ci.com/nickfrostatx/infection)
[](https://raw.githubusercontent.com/nickfrostatx/infection/master/LICENSE)
> A take-home interview question for Khan Academy.
You can find the code in [``infection.py``](infection.py).
Writeup
-------
Take-home interview, you say? Alright, we can do this.
So looking at the spec, we've got a graph of coaching relationships. Since any
user can coach any other user, there's no concept of someone being always a
"coach" and someone else only a "student", so unfortunately this isn't a nice
bipartite graph.
Total Infection sounds like a quick graph traversal. Shouldn't be too bad.
> build passing
Sweet.
| Infection
=========
[](https://travis-ci.com/nickfrostatx/infection)
[](https://raw.githubusercontent.com/nickfrostatx/infection/master/LICENSE)
> A take-home interview question for Khan Academy.
You can find the code in [``infection.py``](infection.py).
Writeup
-------
Take-home interview, you say? Alright, we can do this.
So looking at the spec, we've got a graph of coaching relationships. Since any
user can coach any other user, there's no concept of someone being always a
"coach" and someone else only a "student", so unfortunately this isn't a nice
bipartite graph.
Total Infection sounds like a quick graph traversal. Shouldn't be too bad.
> build passing
Sweet.
Now let's get into the meat of the problem. I need to infect a subset of the
graph, optimizing for two factors: the number of nodes infected (aiming for a
specified target), and the minimal number of edges between infected and
uninfected nodes.
The spec suggests we can get exactly n nodes. I like a challenge, so let's do
that! (And this actually makes things "simpler", since we're only optimizing
one variable now.)
This sounds like graph coloring, but it isn't. In fact, a valid colored graph
would be the least optimal solution to this problem.
| Write a little about graphs | Write a little about graphs
| Markdown | mit | nickfrostatx/infection,nickfrostatx/infection | markdown | ## Code Before:
Infection
=========
[](https://travis-ci.com/nickfrostatx/infection)
[](https://raw.githubusercontent.com/nickfrostatx/infection/master/LICENSE)
> A take-home interview question for Khan Academy.
You can find the code in [``infection.py``](infection.py).
Writeup
-------
Take-home interview, you say? Alright, we can do this.
So looking at the spec, we've got a graph of coaching relationships. Since any
user can coach any other user, there's no concept of someone being always a
"coach" and someone else only a "student", so unfortunately this isn't a nice
bipartite graph.
Total Infection sounds like a quick graph traversal. Shouldn't be too bad.
> build passing
Sweet.
## Instruction:
Write a little about graphs
## Code After:
Infection
=========
[](https://travis-ci.com/nickfrostatx/infection)
[](https://raw.githubusercontent.com/nickfrostatx/infection/master/LICENSE)
> A take-home interview question for Khan Academy.
You can find the code in [``infection.py``](infection.py).
Writeup
-------
Take-home interview, you say? Alright, we can do this.
So looking at the spec, we've got a graph of coaching relationships. Since any
user can coach any other user, there's no concept of someone being always a
"coach" and someone else only a "student", so unfortunately this isn't a nice
bipartite graph.
Total Infection sounds like a quick graph traversal. Shouldn't be too bad.
> build passing
Sweet.
Now let's get into the meat of the problem. I need to infect a subset of the
graph, optimizing for two factors: the number of nodes infected (aiming for a
specified target), and the minimal number of edges between infected and
uninfected nodes.
The spec suggests we can get exactly n nodes. I like a challenge, so let's do
that! (And this actually makes things "simpler", since we're only optimizing
one variable now.)
This sounds like graph coloring, but it isn't. In fact, a valid colored graph
would be the least optimal solution to this problem.
| Infection
=========
[](https://travis-ci.com/nickfrostatx/infection)
[](https://raw.githubusercontent.com/nickfrostatx/infection/master/LICENSE)
> A take-home interview question for Khan Academy.
You can find the code in [``infection.py``](infection.py).
Writeup
-------
Take-home interview, you say? Alright, we can do this.
So looking at the spec, we've got a graph of coaching relationships. Since any
user can coach any other user, there's no concept of someone being always a
"coach" and someone else only a "student", so unfortunately this isn't a nice
bipartite graph.
Total Infection sounds like a quick graph traversal. Shouldn't be too bad.
> build passing
Sweet.
+
+ Now let's get into the meat of the problem. I need to infect a subset of the
+ graph, optimizing for two factors: the number of nodes infected (aiming for a
+ specified target), and the minimal number of edges between infected and
+ uninfected nodes.
+
+ The spec suggests we can get exactly n nodes. I like a challenge, so let's do
+ that! (And this actually makes things "simpler", since we're only optimizing
+ one variable now.)
+
+ This sounds like graph coloring, but it isn't. In fact, a valid colored graph
+ would be the least optimal solution to this problem. | 12 | 0.48 | 12 | 0 |
a84e970fe57e5c88082b52da17747090e3e987e6 | tests/neg-custom-args/isInstanceOf/enum-approx2.scala | tests/neg-custom-args/isInstanceOf/enum-approx2.scala | sealed trait Exp[T]
case class Fun[A, B](f: Exp[A => B]) extends Exp[A => B]
class Test {
def eval(e: Fun[Int, Int]) = e match {
case Fun(x: Fun[Int, Double]) => ??? // error
case Fun(x: Exp[Int => String]) => ??? // error
}
} | sealed trait Exp[T]
case class Fun[A, B](f: Exp[A => B]) extends Exp[A => B]
class Test {
def eval(e: Fun[Int, Int]) = e match {
case Fun(x: Fun[Int, Double]) => ??? // error
case Fun(x: Exp[Int => String]) => ??? // error
case _ =>
}
} | Fix num of warnings in test | Fix num of warnings in test
| Scala | apache-2.0 | lampepfl/dotty,dotty-staging/dotty,sjrd/dotty,dotty-staging/dotty,sjrd/dotty,sjrd/dotty,lampepfl/dotty,sjrd/dotty,dotty-staging/dotty,lampepfl/dotty,dotty-staging/dotty,lampepfl/dotty,dotty-staging/dotty,lampepfl/dotty,sjrd/dotty | scala | ## Code Before:
sealed trait Exp[T]
case class Fun[A, B](f: Exp[A => B]) extends Exp[A => B]
class Test {
def eval(e: Fun[Int, Int]) = e match {
case Fun(x: Fun[Int, Double]) => ??? // error
case Fun(x: Exp[Int => String]) => ??? // error
}
}
## Instruction:
Fix num of warnings in test
## Code After:
sealed trait Exp[T]
case class Fun[A, B](f: Exp[A => B]) extends Exp[A => B]
class Test {
def eval(e: Fun[Int, Int]) = e match {
case Fun(x: Fun[Int, Double]) => ??? // error
case Fun(x: Exp[Int => String]) => ??? // error
case _ =>
}
} | sealed trait Exp[T]
case class Fun[A, B](f: Exp[A => B]) extends Exp[A => B]
class Test {
def eval(e: Fun[Int, Int]) = e match {
case Fun(x: Fun[Int, Double]) => ??? // error
case Fun(x: Exp[Int => String]) => ??? // error
+ case _ =>
}
} | 1 | 0.111111 | 1 | 0 |
471fdd3d9ea2208619c78cfe32ed10ba3691bec4 | lib/pushbots/all.rb | lib/pushbots/all.rb | module Pushbots
# All class
class All < Push
attr_accessor :schedule, :tags, :sound, :badge, :except_tags, :alias,
:except_alias, :payload
def initialize(platforms, message, schedule, options = {})
super(platforms, message, :all)
self.schedule = schedule
self.tags = options[:tags]
self.sound = options[:sound]
self.badge = options[:badge]
self.except_tags = options[:except_tags]
self.alias = options[:alias]
self.except_alias = options[:except_alias]
self.payload = options[:payload]
end
def send
self.response = Request.send(:all, body)
self.status =
response.failed? ? STATUS[:failed] : STATUS[:delivered]
end
def body
data = {
platform: @platform,
msg: message,
schedule: schedule
}
data[:tags] if tags
data[:badge] if badge
data[:payload] if payload
data
end
end
end
| module Pushbots
# All class
class All < Push
attr_accessor :schedule, :tags, :sound, :badge, :except_tags, :alias,
:except_alias, :payload
def initialize(platforms, message, schedule, options = {})
super(platforms, message, :all)
self.schedule = schedule
self.tags = options[:tags]
self.sound = options[:sound]
self.badge = options[:badge]
self.except_tags = options[:except_tags]
self.device_alias = options[:alias]
self.except_alias = options[:except_alias]
self.payload = options[:payload]
end
def send
self.response = Request.send(:all, body)
self.status =
response.failed? ? STATUS[:failed] : STATUS[:delivered]
end
def body
data = {
platform: @platform,
msg: message,
schedule: schedule
}
data[:tags] if tags
data[:badge] if badge
data[:alias] if device_alias
data[:except_tags] if except_tags
data[:payload] if payload
data
end
end
end
| Add alias and except tags to the body payload | Add alias and except tags to the body payload
| Ruby | mit | Kandiie/pushbots,Kandiie/pushbots | ruby | ## Code Before:
module Pushbots
# All class
class All < Push
attr_accessor :schedule, :tags, :sound, :badge, :except_tags, :alias,
:except_alias, :payload
def initialize(platforms, message, schedule, options = {})
super(platforms, message, :all)
self.schedule = schedule
self.tags = options[:tags]
self.sound = options[:sound]
self.badge = options[:badge]
self.except_tags = options[:except_tags]
self.alias = options[:alias]
self.except_alias = options[:except_alias]
self.payload = options[:payload]
end
def send
self.response = Request.send(:all, body)
self.status =
response.failed? ? STATUS[:failed] : STATUS[:delivered]
end
def body
data = {
platform: @platform,
msg: message,
schedule: schedule
}
data[:tags] if tags
data[:badge] if badge
data[:payload] if payload
data
end
end
end
## Instruction:
Add alias and except tags to the body payload
## Code After:
module Pushbots
# All class
class All < Push
attr_accessor :schedule, :tags, :sound, :badge, :except_tags, :alias,
:except_alias, :payload
def initialize(platforms, message, schedule, options = {})
super(platforms, message, :all)
self.schedule = schedule
self.tags = options[:tags]
self.sound = options[:sound]
self.badge = options[:badge]
self.except_tags = options[:except_tags]
self.device_alias = options[:alias]
self.except_alias = options[:except_alias]
self.payload = options[:payload]
end
def send
self.response = Request.send(:all, body)
self.status =
response.failed? ? STATUS[:failed] : STATUS[:delivered]
end
def body
data = {
platform: @platform,
msg: message,
schedule: schedule
}
data[:tags] if tags
data[:badge] if badge
data[:alias] if device_alias
data[:except_tags] if except_tags
data[:payload] if payload
data
end
end
end
| module Pushbots
# All class
class All < Push
attr_accessor :schedule, :tags, :sound, :badge, :except_tags, :alias,
:except_alias, :payload
def initialize(platforms, message, schedule, options = {})
super(platforms, message, :all)
self.schedule = schedule
self.tags = options[:tags]
self.sound = options[:sound]
self.badge = options[:badge]
self.except_tags = options[:except_tags]
- self.alias = options[:alias]
+ self.device_alias = options[:alias]
? +++++++
self.except_alias = options[:except_alias]
self.payload = options[:payload]
end
def send
self.response = Request.send(:all, body)
self.status =
response.failed? ? STATUS[:failed] : STATUS[:delivered]
end
def body
data = {
platform: @platform,
msg: message,
schedule: schedule
}
data[:tags] if tags
data[:badge] if badge
+ data[:alias] if device_alias
+ data[:except_tags] if except_tags
data[:payload] if payload
data
end
end
end | 4 | 0.108108 | 3 | 1 |
582fb412560ce068e2ac516a64ab1979beaccdb2 | keystonemiddleware_echo/app.py | keystonemiddleware_echo/app.py |
import json
import webob.dec
@webob.dec.wsgify
def echo_app(request):
"""A WSGI application that echoes the CGI environment to the user."""
return webob.Response(content_type='application/json',
body=json.dumps(request.environ, indent=4))
def echo_app_factory(global_conf, **local_conf):
import ipdb; ipdb.set_trace()
return echo_app
|
import pprint
import webob.dec
@webob.dec.wsgify
def echo_app(request):
"""A WSGI application that echoes the CGI environment to the user."""
return webob.Response(content_type='application/json',
body=pprint.pformat(request.environ, indent=4))
def echo_app_factory(global_conf, **local_conf):
return echo_app
| Use pprint instead of json for formatting | Use pprint instead of json for formatting
json.dumps fails to print anything for an object. I would prefer to show
a representation of the object rather than filter it out so rely on
pprint instead.
| Python | apache-2.0 | jamielennox/keystonemiddleware-echo | python | ## Code Before:
import json
import webob.dec
@webob.dec.wsgify
def echo_app(request):
"""A WSGI application that echoes the CGI environment to the user."""
return webob.Response(content_type='application/json',
body=json.dumps(request.environ, indent=4))
def echo_app_factory(global_conf, **local_conf):
import ipdb; ipdb.set_trace()
return echo_app
## Instruction:
Use pprint instead of json for formatting
json.dumps fails to print anything for an object. I would prefer to show
a representation of the object rather than filter it out so rely on
pprint instead.
## Code After:
import pprint
import webob.dec
@webob.dec.wsgify
def echo_app(request):
"""A WSGI application that echoes the CGI environment to the user."""
return webob.Response(content_type='application/json',
body=pprint.pformat(request.environ, indent=4))
def echo_app_factory(global_conf, **local_conf):
return echo_app
|
- import json
+ import pprint
import webob.dec
@webob.dec.wsgify
def echo_app(request):
"""A WSGI application that echoes the CGI environment to the user."""
return webob.Response(content_type='application/json',
- body=json.dumps(request.environ, indent=4))
? ^^ ^^^^ ^^
+ body=pprint.pformat(request.environ, indent=4))
? ^^^^^^^^^ ^ ^^
def echo_app_factory(global_conf, **local_conf):
- import ipdb; ipdb.set_trace()
return echo_app | 5 | 0.333333 | 2 | 3 |
1c1e0fd2e7db03adea8c1c1e91728a9448487dde | travis_scripts/build_script.sh | travis_scripts/build_script.sh |
set -e
START_DIR=`pwd`
export DIR=~/patlms_bindir
export GTEST_INSTALL_DIR=${DIR}/gtest
export GMOCK_INSTALL_DIR=${DIR}/gmock
export PATLMS_INSTALL_DIR=${DIR}/libpatlms
export AGENT_INSTALL_DIR=${DIR}/agent
export SERVER_INSTALL_DIR=${DIR}/server
# Google Test & Google Mock flags
export LDFLAGS="$LDFLAGS -L${GTEST_INSTALL_DIR}/lib -L${GMOCK_INSTALL_DIR}/lib"
export CPPFLAGS="$CPPFLAGS -I${GTEST_INSTALL_DIR}/include -I${GMOCK_INSTALL_DIR}/include"
# build libpatlms
cd libpatlms
./configure --prefix=${PATLMS_INSTALL_DIR}
make
make install
make check
# libpatlms flags
export CPPFLAGS="$CPPFLAGS -I${PATLMS_INSTALL_DIR}/include"
export LDFLAGS="$LDFLAGS -L${PATLMS_INSTALL_DIR}/lib"
export LD_LIBRARY_PATH="${PATLMS_INSTALL_DIR}/lib"
# build server
cd ${START_DIR}/server
./configure --prefix=${SERVER_INSTALL_DIR}
make
make install
make check
# build agent
cd ${START_DIR}/agent
./configure --prefix=${AGENT_INSTALL_DIR}
make
make install
make check
|
set -e
START_DIR=`pwd`
export DIR=~/patlms_bindir
export GTEST_INSTALL_DIR=${DIR}/gtest
export GMOCK_INSTALL_DIR=${DIR}/gmock
export PATLMS_INSTALL_DIR=${DIR}/libpatlms
export AGENT_INSTALL_DIR=${DIR}/agent
export SERVER_INSTALL_DIR=${DIR}/server
# Google Test & Google Mock flags
export LDFLAGS="$LDFLAGS -L${GTEST_INSTALL_DIR}/lib -L${GMOCK_INSTALL_DIR}/lib"
export CPPFLAGS="$CPPFLAGS -I${GTEST_INSTALL_DIR}/include -I${GMOCK_INSTALL_DIR}/include"
# build libpatlms
cd libpatlms
./autogen.sh
./configure --prefix=${PATLMS_INSTALL_DIR}
make
make install
make check
# libpatlms flags
export CPPFLAGS="$CPPFLAGS -I${PATLMS_INSTALL_DIR}/include"
export LDFLAGS="$LDFLAGS -L${PATLMS_INSTALL_DIR}/lib"
export LD_LIBRARY_PATH="${PATLMS_INSTALL_DIR}/lib"
# build server
cd ${START_DIR}/server
./autogen.sh
./configure --prefix=${SERVER_INSTALL_DIR}
make
make install
make check
# build agent
cd ${START_DIR}/agent
./autogen.sh
./configure --prefix=${AGENT_INSTALL_DIR}
make
make install
make check
| Add missing autogen call in travis config file | Add missing autogen call in travis config file
| Shell | mit | chyla/slas,chyla/slas,chyla/pat-lms,chyla/pat-lms,chyla/pat-lms,chyla/pat-lms,chyla/slas,chyla/pat-lms,chyla/slas,chyla/slas,chyla/pat-lms,chyla/pat-lms,chyla/slas,chyla/slas | shell | ## Code Before:
set -e
START_DIR=`pwd`
export DIR=~/patlms_bindir
export GTEST_INSTALL_DIR=${DIR}/gtest
export GMOCK_INSTALL_DIR=${DIR}/gmock
export PATLMS_INSTALL_DIR=${DIR}/libpatlms
export AGENT_INSTALL_DIR=${DIR}/agent
export SERVER_INSTALL_DIR=${DIR}/server
# Google Test & Google Mock flags
export LDFLAGS="$LDFLAGS -L${GTEST_INSTALL_DIR}/lib -L${GMOCK_INSTALL_DIR}/lib"
export CPPFLAGS="$CPPFLAGS -I${GTEST_INSTALL_DIR}/include -I${GMOCK_INSTALL_DIR}/include"
# build libpatlms
cd libpatlms
./configure --prefix=${PATLMS_INSTALL_DIR}
make
make install
make check
# libpatlms flags
export CPPFLAGS="$CPPFLAGS -I${PATLMS_INSTALL_DIR}/include"
export LDFLAGS="$LDFLAGS -L${PATLMS_INSTALL_DIR}/lib"
export LD_LIBRARY_PATH="${PATLMS_INSTALL_DIR}/lib"
# build server
cd ${START_DIR}/server
./configure --prefix=${SERVER_INSTALL_DIR}
make
make install
make check
# build agent
cd ${START_DIR}/agent
./configure --prefix=${AGENT_INSTALL_DIR}
make
make install
make check
## Instruction:
Add missing autogen call in travis config file
## Code After:
set -e
START_DIR=`pwd`
export DIR=~/patlms_bindir
export GTEST_INSTALL_DIR=${DIR}/gtest
export GMOCK_INSTALL_DIR=${DIR}/gmock
export PATLMS_INSTALL_DIR=${DIR}/libpatlms
export AGENT_INSTALL_DIR=${DIR}/agent
export SERVER_INSTALL_DIR=${DIR}/server
# Google Test & Google Mock flags
export LDFLAGS="$LDFLAGS -L${GTEST_INSTALL_DIR}/lib -L${GMOCK_INSTALL_DIR}/lib"
export CPPFLAGS="$CPPFLAGS -I${GTEST_INSTALL_DIR}/include -I${GMOCK_INSTALL_DIR}/include"
# build libpatlms
cd libpatlms
./autogen.sh
./configure --prefix=${PATLMS_INSTALL_DIR}
make
make install
make check
# libpatlms flags
export CPPFLAGS="$CPPFLAGS -I${PATLMS_INSTALL_DIR}/include"
export LDFLAGS="$LDFLAGS -L${PATLMS_INSTALL_DIR}/lib"
export LD_LIBRARY_PATH="${PATLMS_INSTALL_DIR}/lib"
# build server
cd ${START_DIR}/server
./autogen.sh
./configure --prefix=${SERVER_INSTALL_DIR}
make
make install
make check
# build agent
cd ${START_DIR}/agent
./autogen.sh
./configure --prefix=${AGENT_INSTALL_DIR}
make
make install
make check
|
set -e
START_DIR=`pwd`
export DIR=~/patlms_bindir
export GTEST_INSTALL_DIR=${DIR}/gtest
export GMOCK_INSTALL_DIR=${DIR}/gmock
export PATLMS_INSTALL_DIR=${DIR}/libpatlms
export AGENT_INSTALL_DIR=${DIR}/agent
export SERVER_INSTALL_DIR=${DIR}/server
# Google Test & Google Mock flags
export LDFLAGS="$LDFLAGS -L${GTEST_INSTALL_DIR}/lib -L${GMOCK_INSTALL_DIR}/lib"
export CPPFLAGS="$CPPFLAGS -I${GTEST_INSTALL_DIR}/include -I${GMOCK_INSTALL_DIR}/include"
# build libpatlms
cd libpatlms
+ ./autogen.sh
./configure --prefix=${PATLMS_INSTALL_DIR}
make
make install
make check
# libpatlms flags
export CPPFLAGS="$CPPFLAGS -I${PATLMS_INSTALL_DIR}/include"
export LDFLAGS="$LDFLAGS -L${PATLMS_INSTALL_DIR}/lib"
export LD_LIBRARY_PATH="${PATLMS_INSTALL_DIR}/lib"
# build server
cd ${START_DIR}/server
+ ./autogen.sh
./configure --prefix=${SERVER_INSTALL_DIR}
make
make install
make check
# build agent
cd ${START_DIR}/agent
+ ./autogen.sh
./configure --prefix=${AGENT_INSTALL_DIR}
make
make install
make check
| 3 | 0.069767 | 3 | 0 |
8709f4c305854a8e23e6c0fc868b7c9a65827707 | src/String.h | src/String.h | //=============================================================================
// String.h
//=============================================================================
#ifndef SRC_STRING_H_
#define SRC_STRING_H_
#include "AutoReleasePool.h"
#include <string.h>
//-----------------------------------------------------------------------------
STRUCT String {
const string str;
const unsigned long size;
} String;
//-----------------------------------------------------------------------------
String StringOf(const string);
string newstring(AutoReleasePool*, unsigned long size);
String newString(AutoReleasePool*, unsigned long size);
string copystring(AutoReleasePool*, const string, unsigned long size);
String copyString(AutoReleasePool*, const String, unsigned long size);
String appendChar(AutoReleasePool*, const String, const CHAR, unsigned int increaseBy);
string joinstrings(AutoReleasePool*, const string, const string);
String joinStrings(AutoReleasePool*, const String, const String);
String trimStringToSize(AutoReleasePool*, const String);
unsigned int getAllocationCount();
//-----------------------------------------------------------------------------
#endif // SRC_STRING_H_
//----------------------------------------------------------------------------- | //=============================================================================
// String.h
//=============================================================================
#ifndef SRC_STRING_H_
#define SRC_STRING_H_
#include "AutoReleasePool.h"
#include <string.h>
//-----------------------------------------------------------------------------
STRUCT String {
string str;
unsigned long size;
} String;
//-----------------------------------------------------------------------------
String StringOf(const string);
string newstring(AutoReleasePool*, unsigned long size);
String newString(AutoReleasePool*, unsigned long size);
string copystring(AutoReleasePool*, const string, unsigned long size);
String copyString(AutoReleasePool*, const String, unsigned long size);
String appendChar(AutoReleasePool*, const String, const CHAR, unsigned int increaseBy);
string joinstrings(AutoReleasePool*, const string, const string);
String joinStrings(AutoReleasePool*, const String, const String);
String trimStringToSize(AutoReleasePool*, const String);
unsigned int getAllocationCount();
//-----------------------------------------------------------------------------
#endif // SRC_STRING_H_
//----------------------------------------------------------------------------- | Refactor - fix error in gcc compile | Refactor - fix error in gcc compile | C | apache-2.0 | jasonxhill/okavangoc,jasonxhill/okavangoc | c | ## Code Before:
//=============================================================================
// String.h
//=============================================================================
#ifndef SRC_STRING_H_
#define SRC_STRING_H_
#include "AutoReleasePool.h"
#include <string.h>
//-----------------------------------------------------------------------------
STRUCT String {
const string str;
const unsigned long size;
} String;
//-----------------------------------------------------------------------------
String StringOf(const string);
string newstring(AutoReleasePool*, unsigned long size);
String newString(AutoReleasePool*, unsigned long size);
string copystring(AutoReleasePool*, const string, unsigned long size);
String copyString(AutoReleasePool*, const String, unsigned long size);
String appendChar(AutoReleasePool*, const String, const CHAR, unsigned int increaseBy);
string joinstrings(AutoReleasePool*, const string, const string);
String joinStrings(AutoReleasePool*, const String, const String);
String trimStringToSize(AutoReleasePool*, const String);
unsigned int getAllocationCount();
//-----------------------------------------------------------------------------
#endif // SRC_STRING_H_
//-----------------------------------------------------------------------------
## Instruction:
Refactor - fix error in gcc compile
## Code After:
//=============================================================================
// String.h
//=============================================================================
#ifndef SRC_STRING_H_
#define SRC_STRING_H_
#include "AutoReleasePool.h"
#include <string.h>
//-----------------------------------------------------------------------------
STRUCT String {
string str;
unsigned long size;
} String;
//-----------------------------------------------------------------------------
String StringOf(const string);
string newstring(AutoReleasePool*, unsigned long size);
String newString(AutoReleasePool*, unsigned long size);
string copystring(AutoReleasePool*, const string, unsigned long size);
String copyString(AutoReleasePool*, const String, unsigned long size);
String appendChar(AutoReleasePool*, const String, const CHAR, unsigned int increaseBy);
string joinstrings(AutoReleasePool*, const string, const string);
String joinStrings(AutoReleasePool*, const String, const String);
String trimStringToSize(AutoReleasePool*, const String);
unsigned int getAllocationCount();
//-----------------------------------------------------------------------------
#endif // SRC_STRING_H_
//----------------------------------------------------------------------------- | //=============================================================================
// String.h
//=============================================================================
#ifndef SRC_STRING_H_
#define SRC_STRING_H_
#include "AutoReleasePool.h"
#include <string.h>
//-----------------------------------------------------------------------------
STRUCT String {
- const string str;
? ------
+ string str;
- const unsigned long size;
? ------
+ unsigned long size;
} String;
//-----------------------------------------------------------------------------
String StringOf(const string);
string newstring(AutoReleasePool*, unsigned long size);
String newString(AutoReleasePool*, unsigned long size);
string copystring(AutoReleasePool*, const string, unsigned long size);
String copyString(AutoReleasePool*, const String, unsigned long size);
String appendChar(AutoReleasePool*, const String, const CHAR, unsigned int increaseBy);
string joinstrings(AutoReleasePool*, const string, const string);
String joinStrings(AutoReleasePool*, const String, const String);
String trimStringToSize(AutoReleasePool*, const String);
unsigned int getAllocationCount();
//-----------------------------------------------------------------------------
#endif // SRC_STRING_H_
//----------------------------------------------------------------------------- | 4 | 0.148148 | 2 | 2 |
5aa475d563a1fbc59b39e44267da59198689a906 | zsh/aliases.zsh | zsh/aliases.zsh | alias tmux="TERM=screen-256color-bce tmux"
# Unix
alias l='ls'
alias ll='ls -al'
alias e="$EDITOR"
# General
alias zshconfig="e ~/.zshrc"
alias myzshconfig="e $ZSH_CUSTOM/rap.zsh"
# Bundler
alias b="bundle"
# Shiftit
alias start="bundle exec foreman start -f Procfile.dev"
alias tail_log="tail -f log/development.log"
# Vagrant
alias v="vagrant"
alias vs="vagrant status"
alias vh="vagrant halt"
alias vu="vagrant up"
| alias tmux="TERM=screen-256color-bce tmux"
# Unix
alias l='ls'
alias ll='ls -al'
alias e="$EDITOR"
# General
alias zshconfig="e ~/.zshrc"
alias myzshconfig="e $ZSH_CUSTOM/rap.zsh"
# Bundler
alias b="bundle"
# Shiftit
alias start="bundle exec foreman start -f Procfile.dev"
alias migrate="foreman run rake db:migrate"
alias tail_log="tail -f log/development.log"
# Vagrant
alias v="vagrant"
alias vs="vagrant status"
alias vh="vagrant halt"
alias vu="vagrant up"
| Add alias to run rails migrations | Add alias to run rails migrations
| Shell | mit | ruiafonsopereira/dotfiles,ruiafonsopereira/dotfiles | shell | ## Code Before:
alias tmux="TERM=screen-256color-bce tmux"
# Unix
alias l='ls'
alias ll='ls -al'
alias e="$EDITOR"
# General
alias zshconfig="e ~/.zshrc"
alias myzshconfig="e $ZSH_CUSTOM/rap.zsh"
# Bundler
alias b="bundle"
# Shiftit
alias start="bundle exec foreman start -f Procfile.dev"
alias tail_log="tail -f log/development.log"
# Vagrant
alias v="vagrant"
alias vs="vagrant status"
alias vh="vagrant halt"
alias vu="vagrant up"
## Instruction:
Add alias to run rails migrations
## Code After:
alias tmux="TERM=screen-256color-bce tmux"
# Unix
alias l='ls'
alias ll='ls -al'
alias e="$EDITOR"
# General
alias zshconfig="e ~/.zshrc"
alias myzshconfig="e $ZSH_CUSTOM/rap.zsh"
# Bundler
alias b="bundle"
# Shiftit
alias start="bundle exec foreman start -f Procfile.dev"
alias migrate="foreman run rake db:migrate"
alias tail_log="tail -f log/development.log"
# Vagrant
alias v="vagrant"
alias vs="vagrant status"
alias vh="vagrant halt"
alias vu="vagrant up"
| alias tmux="TERM=screen-256color-bce tmux"
# Unix
alias l='ls'
alias ll='ls -al'
alias e="$EDITOR"
# General
alias zshconfig="e ~/.zshrc"
alias myzshconfig="e $ZSH_CUSTOM/rap.zsh"
# Bundler
alias b="bundle"
# Shiftit
alias start="bundle exec foreman start -f Procfile.dev"
+ alias migrate="foreman run rake db:migrate"
alias tail_log="tail -f log/development.log"
# Vagrant
alias v="vagrant"
alias vs="vagrant status"
alias vh="vagrant halt"
alias vu="vagrant up" | 1 | 0.043478 | 1 | 0 |
c6413572a64c67f6fbb6adb9fe7acad855a238a9 | roles/nginx/tasks/main.yml | roles/nginx/tasks/main.yml | ---
- name: Recognize Phusion APT key
apt_key: keyserver=keyserver.ubuntu.com
id=561F9B9CAC40B2F7
- name: Use Phusion APT repo
apt_repository: repo='deb https://oss-binaries.phusionpassenger.com/apt/passenger precise main'
- name: Install Passenger + Nginx
apt: pkg=nginx-extras
state=present
notify: Restart Nginx
- name: Set up Nginx vhost directories
file: path=/etc/nginx/{{ item }}
state=directory
with_items:
- sites-available
- sites-enabled
- name: Set up log rotation for Nginx
copy: src=logrotate
dest=/etc/logrotate.d/nginx
| ---
- name: Recognize Phusion APT key
apt_key: keyserver=keyserver.ubuntu.com
id=561F9B9CAC40B2F7
- name: Use Phusion APT repo
apt_repository: repo='deb https://oss-binaries.phusionpassenger.com/apt/passenger precise main'
- name: Install Passenger + Nginx
apt: pkg=nginx-extras
state=present
update_cache=yes
notify: Restart Nginx
- name: Set up Nginx vhost directories
file: path=/etc/nginx/{{ item }}
state=directory
with_items:
- sites-available
- sites-enabled
- name: Set up log rotation for Nginx
copy: src=logrotate
dest=/etc/logrotate.d/nginx
| Update cache when installing nginx | Update cache when installing nginx
| YAML | mit | iangreenleaf/ansible-playbooks,iangreenleaf/ansible-playbooks,iangreenleaf/ansible-playbooks | yaml | ## Code Before:
---
- name: Recognize Phusion APT key
apt_key: keyserver=keyserver.ubuntu.com
id=561F9B9CAC40B2F7
- name: Use Phusion APT repo
apt_repository: repo='deb https://oss-binaries.phusionpassenger.com/apt/passenger precise main'
- name: Install Passenger + Nginx
apt: pkg=nginx-extras
state=present
notify: Restart Nginx
- name: Set up Nginx vhost directories
file: path=/etc/nginx/{{ item }}
state=directory
with_items:
- sites-available
- sites-enabled
- name: Set up log rotation for Nginx
copy: src=logrotate
dest=/etc/logrotate.d/nginx
## Instruction:
Update cache when installing nginx
## Code After:
---
- name: Recognize Phusion APT key
apt_key: keyserver=keyserver.ubuntu.com
id=561F9B9CAC40B2F7
- name: Use Phusion APT repo
apt_repository: repo='deb https://oss-binaries.phusionpassenger.com/apt/passenger precise main'
- name: Install Passenger + Nginx
apt: pkg=nginx-extras
state=present
update_cache=yes
notify: Restart Nginx
- name: Set up Nginx vhost directories
file: path=/etc/nginx/{{ item }}
state=directory
with_items:
- sites-available
- sites-enabled
- name: Set up log rotation for Nginx
copy: src=logrotate
dest=/etc/logrotate.d/nginx
| ---
- name: Recognize Phusion APT key
apt_key: keyserver=keyserver.ubuntu.com
id=561F9B9CAC40B2F7
- name: Use Phusion APT repo
apt_repository: repo='deb https://oss-binaries.phusionpassenger.com/apt/passenger precise main'
- name: Install Passenger + Nginx
apt: pkg=nginx-extras
state=present
+ update_cache=yes
notify: Restart Nginx
- name: Set up Nginx vhost directories
file: path=/etc/nginx/{{ item }}
state=directory
with_items:
- sites-available
- sites-enabled
- name: Set up log rotation for Nginx
copy: src=logrotate
dest=/etc/logrotate.d/nginx | 1 | 0.043478 | 1 | 0 |
d96825e2a6438da0766552bc9a814b2dea50c7c3 | app/controllers/ideas_controller.rb | app/controllers/ideas_controller.rb | class IdeasController < ApplicationController
expose(:idea, attributes: :idea_params)
expose(:ideas) { campaign.user_accessible_ideas(current_user) }
expose(:campaign)
expose(:campaigns)
expose(:user)
before_action :authenticate_user!
def toggle_interesting
authorize idea
idea.interesting = !idea.interesting
if idea.save
redirect_to :back
else
redirect_to :back
end
end
def index
end
def show
authorize idea
end
def edit
authorize idea
end
def new
authorize idea
end
def update
authorize idea
idea.update(idea_params)
redirect_to campaign_ideas_path(campaign)
end
def destroy
authorize idea
idea.destroy
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been destroyed'
end
def create
authorize idea
idea.user = current_user
idea.save
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been succesfully created!'
end
private
def idea_params
params.require(:idea).permit(:title, :description, :id, :campaign, :user)
end
end
| class IdeasController < ApplicationController
expose(:idea, attributes: :idea_params)
expose(:ideas) { campaign.user_accessible_ideas(current_user).order(interesting: :desc) }
expose(:campaign)
expose(:campaigns)
expose(:user)
before_action :authenticate_user!
def toggle_interesting
authorize idea
idea.interesting = !idea.interesting
if idea.save
redirect_to :back
else
redirect_to :back
end
end
def index
end
def show
authorize idea
end
def edit
authorize idea
end
def new
authorize idea
end
def update
authorize idea
idea.update(idea_params)
redirect_to campaign_ideas_path(campaign)
end
def destroy
authorize idea
idea.destroy
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been destroyed'
end
def create
authorize idea
idea.user = current_user
idea.save
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been succesfully created!'
end
private
def idea_params
params.require(:idea).permit(:title, :description, :id, :campaign, :user)
end
end
| Order ideas- interesting ideas moves on top of index | Order ideas- interesting ideas moves on top of index
| Ruby | mit | fantasygame/campaigns,fantasygame/campaigns,fantasygame/campaigns | ruby | ## Code Before:
class IdeasController < ApplicationController
expose(:idea, attributes: :idea_params)
expose(:ideas) { campaign.user_accessible_ideas(current_user) }
expose(:campaign)
expose(:campaigns)
expose(:user)
before_action :authenticate_user!
def toggle_interesting
authorize idea
idea.interesting = !idea.interesting
if idea.save
redirect_to :back
else
redirect_to :back
end
end
def index
end
def show
authorize idea
end
def edit
authorize idea
end
def new
authorize idea
end
def update
authorize idea
idea.update(idea_params)
redirect_to campaign_ideas_path(campaign)
end
def destroy
authorize idea
idea.destroy
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been destroyed'
end
def create
authorize idea
idea.user = current_user
idea.save
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been succesfully created!'
end
private
def idea_params
params.require(:idea).permit(:title, :description, :id, :campaign, :user)
end
end
## Instruction:
Order ideas- interesting ideas moves on top of index
## Code After:
class IdeasController < ApplicationController
expose(:idea, attributes: :idea_params)
expose(:ideas) { campaign.user_accessible_ideas(current_user).order(interesting: :desc) }
expose(:campaign)
expose(:campaigns)
expose(:user)
before_action :authenticate_user!
def toggle_interesting
authorize idea
idea.interesting = !idea.interesting
if idea.save
redirect_to :back
else
redirect_to :back
end
end
def index
end
def show
authorize idea
end
def edit
authorize idea
end
def new
authorize idea
end
def update
authorize idea
idea.update(idea_params)
redirect_to campaign_ideas_path(campaign)
end
def destroy
authorize idea
idea.destroy
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been destroyed'
end
def create
authorize idea
idea.user = current_user
idea.save
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been succesfully created!'
end
private
def idea_params
params.require(:idea).permit(:title, :description, :id, :campaign, :user)
end
end
| class IdeasController < ApplicationController
expose(:idea, attributes: :idea_params)
- expose(:ideas) { campaign.user_accessible_ideas(current_user) }
+ expose(:ideas) { campaign.user_accessible_ideas(current_user).order(interesting: :desc) }
? ++++++++++++++++++++++++++
expose(:campaign)
expose(:campaigns)
expose(:user)
before_action :authenticate_user!
def toggle_interesting
authorize idea
idea.interesting = !idea.interesting
if idea.save
redirect_to :back
else
redirect_to :back
end
end
def index
end
def show
authorize idea
end
def edit
authorize idea
end
def new
authorize idea
end
def update
authorize idea
idea.update(idea_params)
redirect_to campaign_ideas_path(campaign)
end
def destroy
authorize idea
idea.destroy
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been destroyed'
end
def create
authorize idea
idea.user = current_user
idea.save
redirect_to campaign_ideas_path(campaign), notice: 'Idea has been succesfully created!'
end
private
def idea_params
params.require(:idea).permit(:title, :description, :id, :campaign, :user)
end
end | 2 | 0.034483 | 1 | 1 |
cf70698dc149ccd3b3d33ac712fc334bee6226e6 | w.sh | w.sh |
set -e
while true; do
go run main.go -w -dev
if [ $? != 0 ] ; then
exit
fi
echo restarting
done |
set -e
while true; do
time go install
mog -w -dev
if [ $? != 0 ] ; then
exit
fi
echo restarting
done | Use go install instead of go run | Use go install instead of go run
Appears to be faster.
| Shell | isc | shazow/mog,mjibson/mog,mjibson/mog,mjibson/moggio,mjibson/moggio,shazow/mog,shazow/mog,shazow/mog,mjibson/moggio,mjibson/moggio,mjibson/mog,mjibson/mog | shell | ## Code Before:
set -e
while true; do
go run main.go -w -dev
if [ $? != 0 ] ; then
exit
fi
echo restarting
done
## Instruction:
Use go install instead of go run
Appears to be faster.
## Code After:
set -e
while true; do
time go install
mog -w -dev
if [ $? != 0 ] ; then
exit
fi
echo restarting
done |
set -e
while true; do
- go run main.go -w -dev
+ time go install
+ mog -w -dev
if [ $? != 0 ] ; then
exit
fi
echo restarting
done | 3 | 0.3 | 2 | 1 |
d3cd15cf200179a62f589aaae9f81900ce6854db | sg-model-repo/src/main/scala/com/lorandszakacs/sg/model/SGModelAssembly.scala | sg-model-repo/src/main/scala/com/lorandszakacs/sg/model/SGModelAssembly.scala | package com.lorandszakacs.sg.model
import com.lorandszakacs.sg.model.impl.{HopefulsDao, IndexDao, SGModelRepositoryImpl, SuicideGirlsDao}
import com.lorandszakacs.util.mongodb.Database
import com.lorandszakacs.util.future._
/**
*
* @author Lorand Szakacs, lsz@lorandszakacs.com
* @since 04 Jul 2016
*
*/
trait SGModelAssembly {
def db: Database
implicit def executionContext: ExecutionContext
def sgModelRepository: SGModelRepository = _sgModelRepository
private[model] def nameIndexDao: IndexDao = new IndexDao(db)
private[model] def suicideGirlsDao: SuicideGirlsDao = new SuicideGirlsDao(db)
private[model] def hopefulsDao: HopefulsDao = new HopefulsDao(db)
private[model] lazy val _sgModelRepository = new SGModelRepositoryImpl(
indexDao = nameIndexDao,
suicideGirlsDao = suicideGirlsDao,
hopefulsDao = hopefulsDao
)
}
| package com.lorandszakacs.sg.model
import com.lorandszakacs.sg.model.impl.{HopefulsDao, IndexDao, SGModelRepositoryImpl, SuicideGirlsDao}
import com.lorandszakacs.util.mongodb.Database
import com.lorandszakacs.util.future._
/**
*
* @author Lorand Szakacs, lsz@lorandszakacs.com
* @since 04 Jul 2016
*
*/
trait SGModelAssembly {
def db: Database
implicit def executionContext: ExecutionContext
def sgModelRepository: SGModelRepository = _sgModelRepository
private[model] lazy val _sgModelRepository = new SGModelRepositoryImpl(
db
)
}
| Remove garbage from DAO Assembly | Remove garbage from DAO Assembly
| Scala | apache-2.0 | lorandszakacs/sg-downloader,lorandszakacs/sg-downloader,lorandszakacs/sg-downloader | scala | ## Code Before:
package com.lorandszakacs.sg.model
import com.lorandszakacs.sg.model.impl.{HopefulsDao, IndexDao, SGModelRepositoryImpl, SuicideGirlsDao}
import com.lorandszakacs.util.mongodb.Database
import com.lorandszakacs.util.future._
/**
*
* @author Lorand Szakacs, lsz@lorandszakacs.com
* @since 04 Jul 2016
*
*/
trait SGModelAssembly {
def db: Database
implicit def executionContext: ExecutionContext
def sgModelRepository: SGModelRepository = _sgModelRepository
private[model] def nameIndexDao: IndexDao = new IndexDao(db)
private[model] def suicideGirlsDao: SuicideGirlsDao = new SuicideGirlsDao(db)
private[model] def hopefulsDao: HopefulsDao = new HopefulsDao(db)
private[model] lazy val _sgModelRepository = new SGModelRepositoryImpl(
indexDao = nameIndexDao,
suicideGirlsDao = suicideGirlsDao,
hopefulsDao = hopefulsDao
)
}
## Instruction:
Remove garbage from DAO Assembly
## Code After:
package com.lorandszakacs.sg.model
import com.lorandszakacs.sg.model.impl.{HopefulsDao, IndexDao, SGModelRepositoryImpl, SuicideGirlsDao}
import com.lorandszakacs.util.mongodb.Database
import com.lorandszakacs.util.future._
/**
*
* @author Lorand Szakacs, lsz@lorandszakacs.com
* @since 04 Jul 2016
*
*/
trait SGModelAssembly {
def db: Database
implicit def executionContext: ExecutionContext
def sgModelRepository: SGModelRepository = _sgModelRepository
private[model] lazy val _sgModelRepository = new SGModelRepositoryImpl(
db
)
}
| package com.lorandszakacs.sg.model
import com.lorandszakacs.sg.model.impl.{HopefulsDao, IndexDao, SGModelRepositoryImpl, SuicideGirlsDao}
import com.lorandszakacs.util.mongodb.Database
import com.lorandszakacs.util.future._
/**
*
* @author Lorand Szakacs, lsz@lorandszakacs.com
* @since 04 Jul 2016
*
*/
trait SGModelAssembly {
def db: Database
implicit def executionContext: ExecutionContext
def sgModelRepository: SGModelRepository = _sgModelRepository
- private[model] def nameIndexDao: IndexDao = new IndexDao(db)
-
- private[model] def suicideGirlsDao: SuicideGirlsDao = new SuicideGirlsDao(db)
-
- private[model] def hopefulsDao: HopefulsDao = new HopefulsDao(db)
-
private[model] lazy val _sgModelRepository = new SGModelRepositoryImpl(
+ db
- indexDao = nameIndexDao,
- suicideGirlsDao = suicideGirlsDao,
- hopefulsDao = hopefulsDao
)
} | 10 | 0.294118 | 1 | 9 |
7d61008cb5f45e75b648d1ecea7ed8e84da53932 | database/migration/20140228155432_change_enum_of_action.php | database/migration/20140228155432_change_enum_of_action.php | <?php
require_once 'migrate.php';
migrate(
[
'ALTER TABLE
roundcreatures
CHANGE
`action` `action` ENUM("NONE","MOVE","ATTACK") COLLATE utf8_unicode_ci NOT NULL'
]
);
?>
| <?php
require_once 'migrate.php';
Migration::alterField( 'roundcreatures', 'action', 'action', 'ENUM("NONE","MOVE","ATTACK") COLLATE utf8_unicode_ci NOT NULL' );
?>
| Change last migration to new mig format | Change last migration to new mig format
| PHP | mit | dionyziz/endofcodes,VitSalis/endofcodes,dionyziz/endofcodes,VitSalis/endofcodes,VitSalis/endofcodes,VitSalis/endofcodes,dionyziz/endofcodes | php | ## Code Before:
<?php
require_once 'migrate.php';
migrate(
[
'ALTER TABLE
roundcreatures
CHANGE
`action` `action` ENUM("NONE","MOVE","ATTACK") COLLATE utf8_unicode_ci NOT NULL'
]
);
?>
## Instruction:
Change last migration to new mig format
## Code After:
<?php
require_once 'migrate.php';
Migration::alterField( 'roundcreatures', 'action', 'action', 'ENUM("NONE","MOVE","ATTACK") COLLATE utf8_unicode_ci NOT NULL' );
?>
| <?php
require_once 'migrate.php';
+ Migration::alterField( 'roundcreatures', 'action', 'action', 'ENUM("NONE","MOVE","ATTACK") COLLATE utf8_unicode_ci NOT NULL' );
- migrate(
- [
- 'ALTER TABLE
- roundcreatures
- CHANGE
- `action` `action` ENUM("NONE","MOVE","ATTACK") COLLATE utf8_unicode_ci NOT NULL'
- ]
- );
?> | 9 | 0.75 | 1 | 8 |
47304b51b68a6822149401c68d23888b3a31b425 | docker-compose.yml | docker-compose.yml | ---
version: '2'
services:
broker:
image: rabbitmq:3-management
ports:
- "15672:15672"
networks:
- back-tier
web:
build: .
command:
- gunicorn
- -c
- /code/settings.py
- lintreview.web:app
environment: &lintreview_env
LINTREVIEW_GUNICORN_BIND: '0.0.0.0:5000'
LINTREVIEW_GUNICORN_LOG_ACCESS: '-'
LINTREVIEW_GUNICORN_LOG_ERROR: '-'
ports:
- "5000:5000"
links:
- broker
networks:
- front-tier
- back-tier
worker:
build: .
command:
- celery
- -A
- lintreview.tasks
- worker
- -l
- info
environment:
<<: *lintreview_env
C_FORCE_ROOT: "true"
links:
- broker
networks:
- back-tier
networks:
front-tier:
driver: bridge
back-tier:
driver: bridge
| ---
version: '2'
services:
broker:
image: rabbitmq:3-management
ports:
- "15672:15672"
networks:
- back-tier
web:
image: markstory/lint-review
command:
- gunicorn
- -c
- /code/settings.py
- lintreview.web:app
environment: &lintreview_env
LINTREVIEW_GUNICORN_BIND: '0.0.0.0:5000'
LINTREVIEW_GUNICORN_LOG_ACCESS: '-'
LINTREVIEW_GUNICORN_LOG_ERROR: '-'
ports:
- "5000:5000"
links:
- broker
networks:
- front-tier
- back-tier
worker:
image: markstory/lint-review
command:
- celery
- -A
- lintreview.tasks
- worker
- -l
- info
environment:
<<: *lintreview_env
C_FORCE_ROOT: "true"
links:
- broker
networks:
- back-tier
networks:
front-tier:
driver: bridge
back-tier:
driver: bridge
| Use image instead of building | Use image instead of building
Since docker hub now builds it for us, let's use that
| YAML | mit | zoidbergwill/lint-review,markstory/lint-review,markstory/lint-review,adrianmoisey/lint-review,zoidbergwill/lint-review,adrianmoisey/lint-review,markstory/lint-review,zoidbergwill/lint-review | yaml | ## Code Before:
---
version: '2'
services:
broker:
image: rabbitmq:3-management
ports:
- "15672:15672"
networks:
- back-tier
web:
build: .
command:
- gunicorn
- -c
- /code/settings.py
- lintreview.web:app
environment: &lintreview_env
LINTREVIEW_GUNICORN_BIND: '0.0.0.0:5000'
LINTREVIEW_GUNICORN_LOG_ACCESS: '-'
LINTREVIEW_GUNICORN_LOG_ERROR: '-'
ports:
- "5000:5000"
links:
- broker
networks:
- front-tier
- back-tier
worker:
build: .
command:
- celery
- -A
- lintreview.tasks
- worker
- -l
- info
environment:
<<: *lintreview_env
C_FORCE_ROOT: "true"
links:
- broker
networks:
- back-tier
networks:
front-tier:
driver: bridge
back-tier:
driver: bridge
## Instruction:
Use image instead of building
Since docker hub now builds it for us, let's use that
## Code After:
---
version: '2'
services:
broker:
image: rabbitmq:3-management
ports:
- "15672:15672"
networks:
- back-tier
web:
image: markstory/lint-review
command:
- gunicorn
- -c
- /code/settings.py
- lintreview.web:app
environment: &lintreview_env
LINTREVIEW_GUNICORN_BIND: '0.0.0.0:5000'
LINTREVIEW_GUNICORN_LOG_ACCESS: '-'
LINTREVIEW_GUNICORN_LOG_ERROR: '-'
ports:
- "5000:5000"
links:
- broker
networks:
- front-tier
- back-tier
worker:
image: markstory/lint-review
command:
- celery
- -A
- lintreview.tasks
- worker
- -l
- info
environment:
<<: *lintreview_env
C_FORCE_ROOT: "true"
links:
- broker
networks:
- back-tier
networks:
front-tier:
driver: bridge
back-tier:
driver: bridge
| ---
version: '2'
services:
broker:
image: rabbitmq:3-management
ports:
- "15672:15672"
networks:
- back-tier
web:
- build: .
+ image: markstory/lint-review
command:
- gunicorn
- -c
- /code/settings.py
- lintreview.web:app
environment: &lintreview_env
LINTREVIEW_GUNICORN_BIND: '0.0.0.0:5000'
LINTREVIEW_GUNICORN_LOG_ACCESS: '-'
LINTREVIEW_GUNICORN_LOG_ERROR: '-'
ports:
- "5000:5000"
links:
- broker
networks:
- front-tier
- back-tier
worker:
- build: .
+ image: markstory/lint-review
command:
- celery
- -A
- lintreview.tasks
- worker
- -l
- info
environment:
<<: *lintreview_env
C_FORCE_ROOT: "true"
links:
- broker
networks:
- back-tier
networks:
front-tier:
driver: bridge
back-tier:
driver: bridge | 4 | 0.081633 | 2 | 2 |
2e8eb3ca73d90b190aa5c30087928498bdaebda1 | appveyor.yml | appveyor.yml | image: Visual Studio 2019
build_script:
- ps: .\build.ps1 -Target "Appveyor" -Configuration "Release"
# disable built-in tests.
test: off
artifacts:
- path: package\*.nupkg
- path: package\*.msi
- path: package\*.zip
deploy:
- provider: NuGet
server: https://www.myget.org/F/nunit/api/v2
api_key:
secure: wtAvJDVl2tfwiVcyLExFHLvZVfUWiQRHsfdHBFCNEATeCHo1Nd8JP642PfY8xhji
skip_symbols: true
on:
branch: master
APPVEYOR_REPO_NAME: nunit/nunit-console
| image: Visual Studio 2019
build_script:
- ps: .\build.ps1 -Target "Appveyor" -Configuration "Release"
# disable built-in tests.
test: off
artifacts:
- path: package\*.nupkg
- path: package\*.msi
- path: package\*.zip
deploy:
- provider: NuGet
server: https://www.myget.org/F/nunit/api/v3/index.json
api_key:
secure: wtAvJDVl2tfwiVcyLExFHLvZVfUWiQRHsfdHBFCNEATeCHo1Nd8JP642PfY8xhji
skip_symbols: true
on:
branch: master
APPVEYOR_REPO_NAME: nunit/nunit-console
| Use v3 feed to push to myget so source package is pushed as well | Use v3 feed to push to myget so source package is pushed as well
| YAML | mit | nunit/nunit-console,nunit/nunit-console,nunit/nunit-console | yaml | ## Code Before:
image: Visual Studio 2019
build_script:
- ps: .\build.ps1 -Target "Appveyor" -Configuration "Release"
# disable built-in tests.
test: off
artifacts:
- path: package\*.nupkg
- path: package\*.msi
- path: package\*.zip
deploy:
- provider: NuGet
server: https://www.myget.org/F/nunit/api/v2
api_key:
secure: wtAvJDVl2tfwiVcyLExFHLvZVfUWiQRHsfdHBFCNEATeCHo1Nd8JP642PfY8xhji
skip_symbols: true
on:
branch: master
APPVEYOR_REPO_NAME: nunit/nunit-console
## Instruction:
Use v3 feed to push to myget so source package is pushed as well
## Code After:
image: Visual Studio 2019
build_script:
- ps: .\build.ps1 -Target "Appveyor" -Configuration "Release"
# disable built-in tests.
test: off
artifacts:
- path: package\*.nupkg
- path: package\*.msi
- path: package\*.zip
deploy:
- provider: NuGet
server: https://www.myget.org/F/nunit/api/v3/index.json
api_key:
secure: wtAvJDVl2tfwiVcyLExFHLvZVfUWiQRHsfdHBFCNEATeCHo1Nd8JP642PfY8xhji
skip_symbols: true
on:
branch: master
APPVEYOR_REPO_NAME: nunit/nunit-console
| image: Visual Studio 2019
build_script:
- ps: .\build.ps1 -Target "Appveyor" -Configuration "Release"
# disable built-in tests.
test: off
artifacts:
- path: package\*.nupkg
- path: package\*.msi
- path: package\*.zip
deploy:
- provider: NuGet
- server: https://www.myget.org/F/nunit/api/v2
? ^
+ server: https://www.myget.org/F/nunit/api/v3/index.json
? ^^^^^^^^^^^^
api_key:
secure: wtAvJDVl2tfwiVcyLExFHLvZVfUWiQRHsfdHBFCNEATeCHo1Nd8JP642PfY8xhji
skip_symbols: true
on:
branch: master
APPVEYOR_REPO_NAME: nunit/nunit-console | 2 | 0.090909 | 1 | 1 |
017cee6db90b7a4d7e891158dc1018da88d9e7fd | app/views/documents/_metadata.html.erb | app/views/documents/_metadata.html.erb | <%
policies ||= []
topics ||= []
%>
<aside class="meta">
<div class="inner-heading">
<dl>
<% if document.organisations.any? %>
<dt><%= t('document.headings.organisations', count: document.organisations.length) %>:</dt>
<dd>
<%= render partial: 'organisations/organisations_name_list',
locals: { organisations: document.sorted_organisations,
lead_organisations: document.lead_organisations } %>
</dd>
<% end %>
<%= render partial: 'document_extra_metadata',
locals: { document: document } %>
<%= render partial: 'documents/change_notes',
locals: { document: document } %>
<% document_metadata(document, policies, topics).each do |metadata| %>
<dt><%= metadata[:title] %>:</dt>
<dd class="js-hide-other-links <%= metadata.fetch(:classes, []).join(' ') %>">
<%= metadata[:data].to_sentence.html_safe %>
</dd>
<% end %>
<% if document.location && document.location.present? %>
<dt><%= t('document.headings.location') %>:</dt>
<dd class="location"><%= document.location %></dd>
<% end %>
</dl>
</div>
</aside>
| <%
policies ||= []
topics ||= []
%>
<aside class="meta">
<div class="inner-heading">
<dl>
<% if document.organisations.any? %>
<dt><%= t('document.headings.organisations', count: document.organisations.length) %>:</dt>
<dd>
<%= render partial: 'organisations/organisations_name_list',
locals: { organisations: document.sorted_organisations,
lead_organisations: document.lead_organisations } %>
</dd>
<% end %>
<%= render partial: 'document_extra_metadata',
locals: { document: document } %>
<%= render partial: 'documents/change_notes',
locals: { document: document } %>
<% document_metadata(document, policies, topics).each do |metadata| %>
<dt><%= metadata[:title] %>:</dt>
<dd class="js-hide-other-links <%= metadata.fetch(:classes, []).join(' ') %>">
<%= metadata[:data].to_sentence.html_safe %>
</dd>
<% end %>
<% if document.primary_mainstream_category && document.primary_mainstream_category.present? %>
<dt>Primary category:</dt>
<dd>
<%= link_to document.primary_mainstream_category.title,
Whitehall.url_maker.mainstream_category_path(document.primary_mainstream_category) %>
</dd>
<% end %>
<% if document.location && document.location.present? %>
<dt><%= t('document.headings.location') %>:</dt>
<dd class="location"><%= document.location %></dd>
<% end %>
</dl>
</div>
</aside>
| Add parent category document metadata | Add parent category document metadata
To replace functionality previously provided by the breadcrumb
| HTML+ERB | mit | YOTOV-LIMITED/whitehall,alphagov/whitehall,alphagov/whitehall,askl56/whitehall,robinwhittleton/whitehall,robinwhittleton/whitehall,hotvulcan/whitehall,YOTOV-LIMITED/whitehall,ggoral/whitehall,YOTOV-LIMITED/whitehall,askl56/whitehall,hotvulcan/whitehall,ggoral/whitehall,YOTOV-LIMITED/whitehall,robinwhittleton/whitehall,askl56/whitehall,hotvulcan/whitehall,alphagov/whitehall,alphagov/whitehall,askl56/whitehall,robinwhittleton/whitehall,ggoral/whitehall,ggoral/whitehall,hotvulcan/whitehall | html+erb | ## Code Before:
<%
policies ||= []
topics ||= []
%>
<aside class="meta">
<div class="inner-heading">
<dl>
<% if document.organisations.any? %>
<dt><%= t('document.headings.organisations', count: document.organisations.length) %>:</dt>
<dd>
<%= render partial: 'organisations/organisations_name_list',
locals: { organisations: document.sorted_organisations,
lead_organisations: document.lead_organisations } %>
</dd>
<% end %>
<%= render partial: 'document_extra_metadata',
locals: { document: document } %>
<%= render partial: 'documents/change_notes',
locals: { document: document } %>
<% document_metadata(document, policies, topics).each do |metadata| %>
<dt><%= metadata[:title] %>:</dt>
<dd class="js-hide-other-links <%= metadata.fetch(:classes, []).join(' ') %>">
<%= metadata[:data].to_sentence.html_safe %>
</dd>
<% end %>
<% if document.location && document.location.present? %>
<dt><%= t('document.headings.location') %>:</dt>
<dd class="location"><%= document.location %></dd>
<% end %>
</dl>
</div>
</aside>
## Instruction:
Add parent category document metadata
To replace functionality previously provided by the breadcrumb
## Code After:
<%
policies ||= []
topics ||= []
%>
<aside class="meta">
<div class="inner-heading">
<dl>
<% if document.organisations.any? %>
<dt><%= t('document.headings.organisations', count: document.organisations.length) %>:</dt>
<dd>
<%= render partial: 'organisations/organisations_name_list',
locals: { organisations: document.sorted_organisations,
lead_organisations: document.lead_organisations } %>
</dd>
<% end %>
<%= render partial: 'document_extra_metadata',
locals: { document: document } %>
<%= render partial: 'documents/change_notes',
locals: { document: document } %>
<% document_metadata(document, policies, topics).each do |metadata| %>
<dt><%= metadata[:title] %>:</dt>
<dd class="js-hide-other-links <%= metadata.fetch(:classes, []).join(' ') %>">
<%= metadata[:data].to_sentence.html_safe %>
</dd>
<% end %>
<% if document.primary_mainstream_category && document.primary_mainstream_category.present? %>
<dt>Primary category:</dt>
<dd>
<%= link_to document.primary_mainstream_category.title,
Whitehall.url_maker.mainstream_category_path(document.primary_mainstream_category) %>
</dd>
<% end %>
<% if document.location && document.location.present? %>
<dt><%= t('document.headings.location') %>:</dt>
<dd class="location"><%= document.location %></dd>
<% end %>
</dl>
</div>
</aside>
| <%
policies ||= []
topics ||= []
%>
<aside class="meta">
<div class="inner-heading">
<dl>
<% if document.organisations.any? %>
<dt><%= t('document.headings.organisations', count: document.organisations.length) %>:</dt>
<dd>
<%= render partial: 'organisations/organisations_name_list',
locals: { organisations: document.sorted_organisations,
lead_organisations: document.lead_organisations } %>
</dd>
<% end %>
<%= render partial: 'document_extra_metadata',
locals: { document: document } %>
<%= render partial: 'documents/change_notes',
locals: { document: document } %>
<% document_metadata(document, policies, topics).each do |metadata| %>
<dt><%= metadata[:title] %>:</dt>
<dd class="js-hide-other-links <%= metadata.fetch(:classes, []).join(' ') %>">
<%= metadata[:data].to_sentence.html_safe %>
</dd>
<% end %>
+ <% if document.primary_mainstream_category && document.primary_mainstream_category.present? %>
+ <dt>Primary category:</dt>
+ <dd>
+ <%= link_to document.primary_mainstream_category.title,
+ Whitehall.url_maker.mainstream_category_path(document.primary_mainstream_category) %>
+ </dd>
+ <% end %>
+
<% if document.location && document.location.present? %>
<dt><%= t('document.headings.location') %>:</dt>
<dd class="location"><%= document.location %></dd>
<% end %>
</dl>
</div>
</aside> | 8 | 0.228571 | 8 | 0 |
dc98123f29c4ba9a506fe1501aa423a5f326949c | operations/enable-routing-integrity.yml | operations/enable-routing-integrity.yml | ---
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/enabled
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/require_and_verify_client_certificates
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/trusted_ca_certificates
value:
- ((service_cf_internal_ca.certificate))
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/verify_subject_alt_name
value:
- gorouter.service.cf.internal
| ---
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/enabled
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/require_and_verify_client_certificates
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/trusted_ca_certificates
value:
- ((service_cf_internal_ca.certificate))
- type: replace
path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/enabled
value: true
- type: replace
path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/ca_certificates
value:
- ((application_ca.certificate))
- type: replace
path: /variables/-
value:
name: ssh_proxy_backends_tls
type: certificate
options:
ca: service_cf_internal_ca
extended_key_usage:
- client_auth
common_name: ssh_proxy_backends_tls
alternative_names:
- ssh-proxy.service.cf.internal
- type: replace
path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/client_certificate
value: ((ssh_proxy_backends_tls.certificate))
- type: replace
path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/client_private_key
value: ((ssh_proxy_backends_tls.private_key))
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/verify_subject_alt_name
value:
- gorouter.service.cf.internal
- ssh-proxy.service.cf.internal
| Enable ssh over tls when routing integrity is enabled | Enable ssh over tls when routing integrity is enabled
- the ssh_proxy trusts the application ca cert
- ssh_proxy is configured with certs generated from service cf internal
CA
- envoy trusts ssh-proxy SAN as well as gorouter
- ssh_proxy will fall back to non-tls if handshake fails
- can now use route integrity + removal of non proxied container ports
and still have cf ssh
[#161421181](https://www.pivotaltracker.com/story/show/161421181)
Signed-off-by: Nick Wei <397a225b6adb51a440bd3821475520403c3e9564@pivotal.io>
| YAML | apache-2.0 | cloudfoundry/cf-deployment,cloudfoundry/cf-deployment | yaml | ## Code Before:
---
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/enabled
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/require_and_verify_client_certificates
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/trusted_ca_certificates
value:
- ((service_cf_internal_ca.certificate))
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/verify_subject_alt_name
value:
- gorouter.service.cf.internal
## Instruction:
Enable ssh over tls when routing integrity is enabled
- the ssh_proxy trusts the application ca cert
- ssh_proxy is configured with certs generated from service cf internal
CA
- envoy trusts ssh-proxy SAN as well as gorouter
- ssh_proxy will fall back to non-tls if handshake fails
- can now use route integrity + removal of non proxied container ports
and still have cf ssh
[#161421181](https://www.pivotaltracker.com/story/show/161421181)
Signed-off-by: Nick Wei <397a225b6adb51a440bd3821475520403c3e9564@pivotal.io>
## Code After:
---
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/enabled
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/require_and_verify_client_certificates
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/trusted_ca_certificates
value:
- ((service_cf_internal_ca.certificate))
- type: replace
path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/enabled
value: true
- type: replace
path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/ca_certificates
value:
- ((application_ca.certificate))
- type: replace
path: /variables/-
value:
name: ssh_proxy_backends_tls
type: certificate
options:
ca: service_cf_internal_ca
extended_key_usage:
- client_auth
common_name: ssh_proxy_backends_tls
alternative_names:
- ssh-proxy.service.cf.internal
- type: replace
path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/client_certificate
value: ((ssh_proxy_backends_tls.certificate))
- type: replace
path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/client_private_key
value: ((ssh_proxy_backends_tls.private_key))
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/verify_subject_alt_name
value:
- gorouter.service.cf.internal
- ssh-proxy.service.cf.internal
| ---
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/enabled
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/require_and_verify_client_certificates
value: true
- type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/trusted_ca_certificates
value:
- ((service_cf_internal_ca.certificate))
- type: replace
+ path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/enabled
+ value: true
+
+ - type: replace
+ path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/ca_certificates
+ value:
+ - ((application_ca.certificate))
+
+ - type: replace
+ path: /variables/-
+ value:
+ name: ssh_proxy_backends_tls
+ type: certificate
+ options:
+ ca: service_cf_internal_ca
+ extended_key_usage:
+ - client_auth
+ common_name: ssh_proxy_backends_tls
+ alternative_names:
+ - ssh-proxy.service.cf.internal
+
+ - type: replace
+ path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/client_certificate
+ value: ((ssh_proxy_backends_tls.certificate))
+
+ - type: replace
+ path: /instance_groups/name=scheduler/jobs/name=ssh_proxy/properties/backends?/tls/client_private_key
+ value: ((ssh_proxy_backends_tls.private_key))
+
+ - type: replace
path: /instance_groups/name=diego-cell/jobs/name=rep/properties/containers?/proxy/verify_subject_alt_name
value:
- gorouter.service.cf.internal
+ - ssh-proxy.service.cf.internal | 31 | 1.722222 | 31 | 0 |
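The `type: replace` entries in the record above use BOSH ops-file path syntax: `/`-separated segments, `name=<value>` segments that pick an element out of an array by its `name` field, and a `?` suffix marking a segment as create-if-missing (with descendants implicitly optional). A minimal, illustrative sketch of applying one such operation to plain dicts and lists — an assumption-laden toy, not the real `bosh interpolate` implementation — could look like:

```python
def apply_replace(doc, path, value):
    # Walk the '/'-separated path. A "name=x" segment selects the list
    # entry whose "name" field equals x; a trailing "?" (stripped here)
    # means create-if-missing, which setdefault approximates.
    # Simplified: assumes well-formed input and does no error reporting.
    keys = [k for k in path.split("/") if k]
    node = doc
    for key in keys[:-1]:
        key = key.rstrip("?")
        if key.startswith("name="):
            node = next(e for e in node if e.get("name") == key[5:])
        else:
            node = node.setdefault(key, {})
    node[keys[-1].rstrip("?")] = value
    return doc
```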
368e7248ab9d2d8eaaceeaf60534fa864ec06a92 | docs/apps/zero_downtime_deployment_for_your_ninefold_rails_app.md | docs/apps/zero_downtime_deployment_for_your_ninefold_rails_app.md | page_title: Zero Downtime Deployment For Your Rails App
page_author: Brittany Martin
page_description: Knowledge base article to instruct users on how they can implement zero downtime deployment
page_keywords: zero downtime deployment redeployment lab
## Zero downtime deployment
#### What is ZDD?
In traditional deployment environments, when switching an app server in the cloud from the current version to a new version, there is a window of time when the node is unusable in terms of serving traffic. During that window, the app server is taken out of traffic, and after the switch it is brought back into traffic.
With Zero Downtime Deployment (aka Rolling Deployments) on Ninefold, app servers are removed from the load balancer and upgraded one at a time when upgrading your code. This minimizes downtime for your users as you deploy changes to your app.
#### How to enable
Check that your app is deployed on at least 2 web servers.
__To check:__
* Click on your App name
* Navigate to the __Infrastructure__ tab
* Click the __Show Details__ button
* Verify that you have at least 2 web servers
__To enable:__
If you have two deployed web servers, navigate to __Account__, __Edit Details__.
Click on the Labs tab and check off the __Zero Downtime Deployment__ option and __Update__. You’ll be all set!
| page_title: Zero Downtime Deployment For Your Rails App
page_author: Brittany Martin
page_description: Knowledge base article to instruct users on how they can implement zero downtime deployment
page_keywords: zero downtime deployment redeployment configuration
## Zero downtime deployment
#### What is ZDD?
In traditional deployment environments, when switching an app server in the cloud from the current version to a new version, there is a window of time when the node is unusable in terms of serving traffic. During that window, the app server is taken out of traffic, and after the switch it is brought back into traffic.
With Zero Downtime Deployment (aka Rolling Deployments) on Ninefold, app servers are removed from the load balancer and upgraded one at a time when upgrading your code. This minimizes downtime for your users as you deploy changes to your app.
#### How to enable
Check that your app is deployed on at least 2 web servers.
__To check:__
* Click on your App name
* Navigate to the __Infrastructure__ tab
* Click the __Show Details__ button
* Verify that you have at least 2 web servers
__To enable (if you have two deployed web servers):__
* Click on your App name
* Navigate to the __Configuration__ tab
* Click __Enable__ next to the __Zero Downtime Deployment__ option.
You'll be all set for the very next time you deploy.
| Fix references to ninefold labs. This thing is live for everyone now! | Fix references to ninefold labs. This thing is live for everyone now!
| Markdown | mit | webhost/docs-2,webhost/docs-2 | markdown | ## Code Before:
page_title: Zero Downtime Deployment For Your Rails App
page_author: Brittany Martin
page_description: Knowledge base article to instruct users on how they can implement zero downtime deployment
page_keywords: zero downtime deployment redeployment lab
## Zero downtime deployment
#### What is ZDD?
In traditional deployment environments, when switching an app server in the cloud from the current version to a new version, there is a window of time when the node is unusable in terms of serving traffic. During that window, the app server is taken out of traffic, and after the switch it is brought back into traffic.
With Zero Downtime Deployment (aka Rolling Deployments) on Ninefold, app servers are removed from the load balancer and upgraded one at a time when upgrading your code. This minimizes downtime for your users as you deploy changes to your app.
#### How to enable
Check that your app is deployed on at least 2 web servers.
__To check:__
* Click on your App name
* Navigate to the __Infrastructure__ tab
* Click the __Show Details__ button
* Verify that you have at least 2 web servers
__To enable:__
If you have two deployed web servers, navigate to __Account__, __Edit Details__.
Click on the Labs tab and check off the __Zero Downtime Deployment__ option and __Update__. You’ll be all set!
## Instruction:
Fix references to ninefold labs. This thing is live for everyone now!
## Code After:
page_title: Zero Downtime Deployment For Your Rails App
page_author: Brittany Martin
page_description: Knowledge base article to instruct users on how they can implement zero downtime deployment
page_keywords: zero downtime deployment redeployment configuration
## Zero downtime deployment
#### What is ZDD?
In traditional deployment environments, when switching an app server in the cloud from the current version to a new version, there is a window of time when the node is unusable in terms of serving traffic. During that window, the app server is taken out of traffic, and after the switch it is brought back into traffic.
With Zero Downtime Deployment (aka Rolling Deployments) on Ninefold, app servers are removed from the load balancer and upgraded one at a time when upgrading your code. This minimizes downtime for your users as you deploy changes to your app.
#### How to enable
Check that your app is deployed on at least 2 web servers.
__To check:__
* Click on your App name
* Navigate to the __Infrastructure__ tab
* Click the __Show Details__ button
* Verify that you have at least 2 web servers
__To enable (if you have two deployed web servers):__
* Click on your App name
* Navigate to the __Configuration__ tab
* Click __Enable__ next to the __Zero Downtime Deployment__ option.
You'll be all set for the very next time you deploy.
| page_title: Zero Downtime Deployment For Your Rails App
page_author: Brittany Martin
page_description: Knowledge base article to instruct users on how they can implement zero downtime deployment
- page_keywords: zero downtime deployment redeployment lab
? ^ ^^
+ page_keywords: zero downtime deployment redeployment configuration
? ^^^^^^^^ ^^^^
## Zero downtime deployment
#### What is ZDD?
In traditional deployment environments, when switching an app server in the cloud from the current version to a new version, there is a window of time when the node is unusable in terms of serving traffic. During that window, the app server is taken out of traffic, and after the switch it is brought back into traffic.
With Zero Downtime Deployment (aka Rolling Deployments) on Ninefold, app servers are removed from the load balancer and upgraded one at a time when upgrading your code. This minimizes downtime for your users as you deploy changes to your app.
#### How to enable
Check that your app is deployed on at least 2 web servers.
__To check:__
* Click on your App name
* Navigate to the __Infrastructure__ tab
* Click the __Show Details__ button
* Verify that you have at least 2 web servers
- __To enable:__
+ __To enable (if you have two deployed web servers):__
- If you have two deployed web servers, navigate to __Account__, __Edit Details__.
+ * Click on your App name
+ * Navigate to the __Configuration__ tab
+ * Click __Enable__ next to the __Zero Downtime Deployment__ option.
- Click on the Labs tab and check off the __Zero Downtime Deployment__ option and __Update__. You’ll be all set!
+ You'll be all set for the very next time you deploy. | 10 | 0.344828 | 6 | 4 |
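The rolling upgrade described in this record — drain one node from the load balancer, upgrade it, health-check it, put it back — reduces to a short loop. The function and load-balancer interface below are hypothetical illustrations of the mechanism, not Ninefold's actual API:

```python
def rolling_deploy(servers, load_balancer, deploy, healthy):
    """Upgrade servers one at a time so the rest keep serving traffic."""
    for server in servers:
        load_balancer.remove(server)   # drain: stop routing requests here
        deploy(server)                 # upgrade the app on this node only
        if not healthy(server):        # never rejoin a broken node
            raise RuntimeError(f"{server} failed health check; halting rollout")
        load_balancer.add(server)      # back into rotation before the next one
```

With at least two web servers, some node stays in rotation at every step — which is why the article insists on a two-server minimum.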
28b481287f2dbc2ba2547189b57dd47f1f16d497 | app/controllers/letter_bomb/mailers_controller.rb | app/controllers/letter_bomb/mailers_controller.rb | module LetterBomb
class MailersController < ApplicationController
def index
@mailers = LetterBomb::Preview.previews
end
def show
klass = params[:mailer_class]
@action = params[:mailer_action]
@mail = klass.constantize.preview_action(@action)
params[:format] ||= @mail.multipart? ? "html" : "text"
respond_to do |format|
format.html
format.text { render formats: [:html], content_type: 'text/html' }
end
end
private
def body_part
return @mail unless @mail.multipart?
content_type = Rack::Mime.mime_type(".#{params[:format]}")
if @mail.respond_to?(:all_parts)
@mail.all_parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
else
@mail.parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
end
end
helper_method :body_part
def html_template?
params[:format] == 'html'
end
helper_method :html_template?
end
end
| module LetterBomb
class MailersController < ApplicationController
def index
@mailers = LetterBomb::Preview.previews
end
def show
klass = params[:mailer_class]
@action = params[:mailer_action]
@mail = klass.constantize.preview_action(@action)
params[:format] ||= content_type_html? ? "html" : "text"
respond_to do |format|
format.html
format.text { render formats: [:html], content_type: 'text/html' }
end
end
private
def content_type_html?
@mail.content_type.match("text/html")
end
def body_part
return @mail unless @mail.multipart?
content_type = Rack::Mime.mime_type(".#{params[:format]}")
if @mail.respond_to?(:all_parts)
@mail.all_parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
else
@mail.parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
end
end
helper_method :body_part
def html_template?
params[:format] == 'html'
end
helper_method :html_template?
end
end
| Check message content type to determine default rendered format. | Check message content type to determine default rendered format.
| Ruby | mit | ags/letter_bomb,ags/letter_bomb,ags/letter_bomb | ruby | ## Code Before:
module LetterBomb
class MailersController < ApplicationController
def index
@mailers = LetterBomb::Preview.previews
end
def show
klass = params[:mailer_class]
@action = params[:mailer_action]
@mail = klass.constantize.preview_action(@action)
params[:format] ||= @mail.multipart? ? "html" : "text"
respond_to do |format|
format.html
format.text { render formats: [:html], content_type: 'text/html' }
end
end
private
def body_part
return @mail unless @mail.multipart?
content_type = Rack::Mime.mime_type(".#{params[:format]}")
if @mail.respond_to?(:all_parts)
@mail.all_parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
else
@mail.parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
end
end
helper_method :body_part
def html_template?
params[:format] == 'html'
end
helper_method :html_template?
end
end
## Instruction:
Check message content type to determine default rendered format.
## Code After:
module LetterBomb
class MailersController < ApplicationController
def index
@mailers = LetterBomb::Preview.previews
end
def show
klass = params[:mailer_class]
@action = params[:mailer_action]
@mail = klass.constantize.preview_action(@action)
params[:format] ||= content_type_html? ? "html" : "text"
respond_to do |format|
format.html
format.text { render formats: [:html], content_type: 'text/html' }
end
end
private
def content_type_html?
@mail.content_type.match("text/html")
end
def body_part
return @mail unless @mail.multipart?
content_type = Rack::Mime.mime_type(".#{params[:format]}")
if @mail.respond_to?(:all_parts)
@mail.all_parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
else
@mail.parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
end
end
helper_method :body_part
def html_template?
params[:format] == 'html'
end
helper_method :html_template?
end
end
| module LetterBomb
class MailersController < ApplicationController
def index
@mailers = LetterBomb::Preview.previews
end
def show
klass = params[:mailer_class]
@action = params[:mailer_action]
@mail = klass.constantize.preview_action(@action)
- params[:format] ||= @mail.multipart? ? "html" : "text"
? ^ -- ----------
+ params[:format] ||= content_type_html? ? "html" : "text"
? ^^^^^^^^^^^^^^^
respond_to do |format|
format.html
format.text { render formats: [:html], content_type: 'text/html' }
end
end
private
+
+ def content_type_html?
+ @mail.content_type.match("text/html")
+ end
def body_part
return @mail unless @mail.multipart?
content_type = Rack::Mime.mime_type(".#{params[:format]}")
if @mail.respond_to?(:all_parts)
@mail.all_parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
else
@mail.parts.find { |part| part.content_type.match(content_type) } || @mail.parts.first
end
end
helper_method :body_part
def html_template?
params[:format] == 'html'
end
helper_method :html_template?
end
end | 6 | 0.146341 | 5 | 1 |
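The fix in this record replaces a multipart check with a Content-Type check when picking the default render format. Transcribed into Python (illustrative only, not part of the gem), the decision is a single substring match — note that a `multipart/alternative` message now defaults to `text`, since its top-level Content-Type does not mention `text/html`:

```python
def default_format(content_type):
    # Mirrors content_type_html? above: the format defaults to "html"
    # only when the message's top-level Content-Type string contains
    # "text/html"; everything else falls back to "text".
    return "html" if "text/html" in content_type else "text"
```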
afdf07a331ed5f7dad218f9f23e7466ffcf75597 | scripts/components/Header.js | scripts/components/Header.js | import React from 'react';
import MuiThemeProvider from 'material-ui/styles/MuiThemeProvider';
import AppBar from 'material-ui/AppBar';
import IconButton from 'material-ui/IconButton';
import IconMenu from 'material-ui/IconMenu';
import MenuItem from 'material-ui/MenuItem';
import MoreVertIcon from 'material-ui/svg-icons/navigation/more-vert';
/*
Header
Top of the full App
*/
var Header = React.createClass({
handleTitleTouch : function() {
this.props.history.push('/');
},
render : function() {
return (
<MuiThemeProvider>
<AppBar
title={<span style={{'cursor':'pointer'}}>MetaSeek</span>}
onTitleTouchTap={this.handleTitleTouch}
iconElementLeft={<div></div>}
iconElementRight={
<IconMenu
iconButtonElement={
<IconButton><MoreVertIcon /></IconButton>
}
targetOrigin={{horizontal: 'right', vertical: 'top'}}
anchorOrigin={{horizontal: 'right', vertical: 'top'}}
>
<MenuItem primaryText="Help" />
<MenuItem primaryText="Sign out" />
</IconMenu>
}
/>
</MuiThemeProvider>
)
}
});
export default Header;
| import React from 'react';
import { Link } from 'react-router';
import MuiThemeProvider from 'material-ui/styles/MuiThemeProvider';
import AppBar from 'material-ui/AppBar';
import IconButton from 'material-ui/IconButton';
import IconMenu from 'material-ui/IconMenu';
import MenuItem from 'material-ui/MenuItem';
import MoreVertIcon from 'material-ui/svg-icons/navigation/more-vert';
/*
Header
Top of the full App
*/
var Header = React.createClass({
handleTitleTouch : function() {
this.props.history.push('/');
},
render : function() {
return (
<MuiThemeProvider>
<AppBar
title={<span style={{'cursor':'pointer'}}>MetaSeek</span>}
onTitleTouchTap={this.handleTitleTouch}
iconElementLeft={<div></div>}
iconElementRight={
<IconMenu
iconButtonElement={
<IconButton><MoreVertIcon /></IconButton>
}
targetOrigin={{horizontal: 'right', vertical: 'top'}}
anchorOrigin={{horizontal: 'right', vertical: 'top'}}
>
<Link style={{'text-decoration':'none'}} to='/myaccount'>
<MenuItem primaryText="My Account" />
</Link>
</IconMenu>
}
/>
</MuiThemeProvider>
)
}
});
export default Header;
| Add my account to header | Add my account to header
| JavaScript | mit | ahoarfrost/metaseek,ahoarfrost/metaseek,ahoarfrost/metaseek | javascript | ## Code Before:
import React from 'react';
import MuiThemeProvider from 'material-ui/styles/MuiThemeProvider';
import AppBar from 'material-ui/AppBar';
import IconButton from 'material-ui/IconButton';
import IconMenu from 'material-ui/IconMenu';
import MenuItem from 'material-ui/MenuItem';
import MoreVertIcon from 'material-ui/svg-icons/navigation/more-vert';
/*
Header
Top of the full App
*/
var Header = React.createClass({
handleTitleTouch : function() {
this.props.history.push('/');
},
render : function() {
return (
<MuiThemeProvider>
<AppBar
title={<span style={{'cursor':'pointer'}}>MetaSeek</span>}
onTitleTouchTap={this.handleTitleTouch}
iconElementLeft={<div></div>}
iconElementRight={
<IconMenu
iconButtonElement={
<IconButton><MoreVertIcon /></IconButton>
}
targetOrigin={{horizontal: 'right', vertical: 'top'}}
anchorOrigin={{horizontal: 'right', vertical: 'top'}}
>
<MenuItem primaryText="Help" />
<MenuItem primaryText="Sign out" />
</IconMenu>
}
/>
</MuiThemeProvider>
)
}
});
export default Header;
## Instruction:
Add my account to header
## Code After:
import React from 'react';
import { Link } from 'react-router';
import MuiThemeProvider from 'material-ui/styles/MuiThemeProvider';
import AppBar from 'material-ui/AppBar';
import IconButton from 'material-ui/IconButton';
import IconMenu from 'material-ui/IconMenu';
import MenuItem from 'material-ui/MenuItem';
import MoreVertIcon from 'material-ui/svg-icons/navigation/more-vert';
/*
Header
Top of the full App
*/
var Header = React.createClass({
handleTitleTouch : function() {
this.props.history.push('/');
},
render : function() {
return (
<MuiThemeProvider>
<AppBar
title={<span style={{'cursor':'pointer'}}>MetaSeek</span>}
onTitleTouchTap={this.handleTitleTouch}
iconElementLeft={<div></div>}
iconElementRight={
<IconMenu
iconButtonElement={
<IconButton><MoreVertIcon /></IconButton>
}
targetOrigin={{horizontal: 'right', vertical: 'top'}}
anchorOrigin={{horizontal: 'right', vertical: 'top'}}
>
<Link style={{'text-decoration':'none'}} to='/myaccount'>
<MenuItem primaryText="My Account" />
</Link>
</IconMenu>
}
/>
</MuiThemeProvider>
)
}
});
export default Header;
| import React from 'react';
+
+ import { Link } from 'react-router';
+
import MuiThemeProvider from 'material-ui/styles/MuiThemeProvider';
import AppBar from 'material-ui/AppBar';
import IconButton from 'material-ui/IconButton';
import IconMenu from 'material-ui/IconMenu';
import MenuItem from 'material-ui/MenuItem';
import MoreVertIcon from 'material-ui/svg-icons/navigation/more-vert';
/*
Header
Top of the full App
*/
var Header = React.createClass({
handleTitleTouch : function() {
this.props.history.push('/');
},
render : function() {
return (
<MuiThemeProvider>
<AppBar
title={<span style={{'cursor':'pointer'}}>MetaSeek</span>}
onTitleTouchTap={this.handleTitleTouch}
iconElementLeft={<div></div>}
iconElementRight={
<IconMenu
iconButtonElement={
<IconButton><MoreVertIcon /></IconButton>
}
targetOrigin={{horizontal: 'right', vertical: 'top'}}
anchorOrigin={{horizontal: 'right', vertical: 'top'}}
>
- <MenuItem primaryText="Help" />
+ <Link style={{'text-decoration':'none'}} to='/myaccount'>
- <MenuItem primaryText="Sign out" />
? ^^^^
+ <MenuItem primaryText="My Account" />
? ++ ^^ +++ +
+ </Link>
</IconMenu>
}
/>
</MuiThemeProvider>
)
}
});
export default Header; | 8 | 0.186047 | 6 | 2 |
dcecd75cae428bb27ec8759a21e52267a55f149a | django_comments/signals.py | django_comments/signals.py | from django.dispatch import Signal
# Sent just before a comment will be posted (after it's been approved and
# moderated; this can be used to modify the comment (in place) with posting
# details or other such actions. If any receiver returns False the comment will be
# discarded and a 400 response. This signal is sent at more or less
# the same time (just before, actually) as the Comment object's pre-save signal,
# except that the HTTP request is sent along with this signal.
comment_will_be_posted = Signal(providing_args=["comment", "request"])
# Sent just after a comment was posted. See above for how this differs
# from the Comment object's post-save signal.
comment_was_posted = Signal(providing_args=["comment", "request"])
# Sent after a comment was "flagged" in some way. Check the flag to see if this
# was a user requesting removal of a comment, a moderator approving/removing a
# comment, or some other custom user flag.
comment_was_flagged = Signal(providing_args=["comment", "flag", "created", "request"])
| from django.dispatch import Signal
# Sent just before a comment will be posted (after it's been approved and
# moderated; this can be used to modify the comment (in place) with posting
# details or other such actions. If any receiver returns False the comment will be
# discarded and a 400 response. This signal is sent at more or less
# the same time (just before, actually) as the Comment object's pre-save signal,
# except that the HTTP request is sent along with this signal.
# Arguments: "comment", "request"
comment_will_be_posted = Signal()
# Sent just after a comment was posted. See above for how this differs
# from the Comment object's post-save signal.
# Arguments: "comment", "request"
comment_was_posted = Signal()
# Sent after a comment was "flagged" in some way. Check the flag to see if this
# was a user requesting removal of a comment, a moderator approving/removing a
# comment, or some other custom user flag.
# Arguments: "comment", "flag", "created", "request"
comment_was_flagged = Signal()
| Remove Signal(providing_args) argument b/c it is deprecated | Remove Signal(providing_args) argument b/c it is deprecated
RemovedInDjango40Warning: The providing_args argument is deprecated.
As it is purely documentational, it has no replacement. If you rely
on this argument as documentation, you can move the text to a code
comment or docstring.
| Python | bsd-3-clause | django/django-contrib-comments,django/django-contrib-comments | python | ## Code Before:
from django.dispatch import Signal
# Sent just before a comment will be posted (after it's been approved and
# moderated; this can be used to modify the comment (in place) with posting
# details or other such actions. If any receiver returns False the comment will be
# discarded and a 400 response. This signal is sent at more or less
# the same time (just before, actually) as the Comment object's pre-save signal,
# except that the HTTP request is sent along with this signal.
comment_will_be_posted = Signal(providing_args=["comment", "request"])
# Sent just after a comment was posted. See above for how this differs
# from the Comment object's post-save signal.
comment_was_posted = Signal(providing_args=["comment", "request"])
# Sent after a comment was "flagged" in some way. Check the flag to see if this
# was a user requesting removal of a comment, a moderator approving/removing a
# comment, or some other custom user flag.
comment_was_flagged = Signal(providing_args=["comment", "flag", "created", "request"])
## Instruction:
Remove Signal(providing_args) argument b/c it is deprecated
RemovedInDjango40Warning: The providing_args argument is deprecated.
As it is purely documentational, it has no replacement. If you rely
on this argument as documentation, you can move the text to a code
comment or docstring.
## Code After:
from django.dispatch import Signal
# Sent just before a comment will be posted (after it's been approved and
# moderated; this can be used to modify the comment (in place) with posting
# details or other such actions. If any receiver returns False the comment will be
# discarded and a 400 response. This signal is sent at more or less
# the same time (just before, actually) as the Comment object's pre-save signal,
# except that the HTTP request is sent along with this signal.
# Arguments: "comment", "request"
comment_will_be_posted = Signal()
# Sent just after a comment was posted. See above for how this differs
# from the Comment object's post-save signal.
# Arguments: "comment", "request"
comment_was_posted = Signal()
# Sent after a comment was "flagged" in some way. Check the flag to see if this
# was a user requesting removal of a comment, a moderator approving/removing a
# comment, or some other custom user flag.
# Arguments: "comment", "flag", "created", "request"
comment_was_flagged = Signal()
| from django.dispatch import Signal
# Sent just before a comment will be posted (after it's been approved and
# moderated; this can be used to modify the comment (in place) with posting
# details or other such actions. If any receiver returns False the comment will be
# discarded and a 400 response. This signal is sent at more or less
# the same time (just before, actually) as the Comment object's pre-save signal,
# except that the HTTP request is sent along with this signal.
- comment_will_be_posted = Signal(providing_args=["comment", "request"])
+ # Arguments: "comment", "request"
+ comment_will_be_posted = Signal()
# Sent just after a comment was posted. See above for how this differs
# from the Comment object's post-save signal.
- comment_was_posted = Signal(providing_args=["comment", "request"])
+ # Arguments: "comment", "request"
+ comment_was_posted = Signal()
# Sent after a comment was "flagged" in some way. Check the flag to see if this
# was a user requesting removal of a comment, a moderator approving/removing a
# comment, or some other custom user flag.
- comment_was_flagged = Signal(providing_args=["comment", "flag", "created", "request"])
+ # Arguments: "comment", "flag", "created", "request"
+ comment_was_flagged = Signal() | 9 | 0.5 | 6 | 3 |
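The commit message above notes that `providing_args` was purely documentational. A minimal stand-in for `django.dispatch.Signal` (not Django's real implementation) shows why removing it changes nothing for senders or receivers — argument names are a convention carried by `**kwargs`, never validated:

```python
class Signal:
    """Minimal stand-in: receivers get whatever keyword args send() passes."""
    def __init__(self):
        self.receivers = []

    def connect(self, receiver):
        self.receivers.append(receiver)

    def send(self, sender, **named):
        # No schema check against any declared argument list -- which is
        # why dropping providing_args is purely a documentation change.
        return [(r, r(sender=sender, signal=self, **named)) for r in self.receivers]


# Arguments: "comment", "request" (the convention now lives in a comment)
comment_will_be_posted = Signal()

def reject_spam(sender, comment, request, **kwargs):
    return False if "spam" in comment else None

comment_will_be_posted.connect(reject_spam)
responses = comment_will_be_posted.send(sender=None, comment="spam offer", request=None)
```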
6d7d7360d652708cbfa91e044c9688ccdce5fbe0 | byceps/blueprints/common/authentication/password/templates/common/authentication/password/request_reset_form.html | byceps/blueprints/common/authentication/password/templates/common/authentication/password/request_reset_form.html | {% extends 'layout/base_auto.html' %}
{% from 'macros/forms.html' import form_buttons, form_field, form_fieldset %}
{% set current_page = 'authentication.request_password_reset' %}
{% set title = 'Zurücksetzen deines Passworts anfordern' %}
{% if g.app_mode.is_admin() %}
{% set layout_sidebar_hide = True %}
{% endif %}
{% block body %}
<h1>{{ title }}</h1>
<form action="{{ url_for('.request_reset') }}" method="post">
{%- call form_fieldset() %}
{{ form_field(form.screen_name, maxlength=40, autofocus='autofocus') }}
{%- endcall %}
{{ form_buttons(_('Request')) }}
</form>
{%- endblock %}
| {% extends 'layout/base_auto.html' %}
{% from 'macros/forms.html' import form_buttons, form_field, form_fieldset %}
{% set current_page = 'authentication_password_request_reset' %}
{% set title = 'Zurücksetzen deines Passworts anfordern' %}
{% if g.app_mode.is_admin() %}
{% set layout_sidebar_hide = True %}
{% endif %}
{% block body %}
<h1>{{ title }}</h1>
<form action="{{ url_for('.request_reset') }}" method="post">
{%- call form_fieldset() %}
{{ form_field(form.screen_name, maxlength=40, autofocus='autofocus') }}
{%- endcall %}
{{ form_buttons(_('Request')) }}
</form>
{%- endblock %}
| Align current page value for password reset form | Align current page value for password reset form
| HTML | bsd-3-clause | homeworkprod/byceps,homeworkprod/byceps,homeworkprod/byceps | html | ## Code Before:
{% extends 'layout/base_auto.html' %}
{% from 'macros/forms.html' import form_buttons, form_field, form_fieldset %}
{% set current_page = 'authentication.request_password_reset' %}
{% set title = 'Zurücksetzen deines Passworts anfordern' %}
{% if g.app_mode.is_admin() %}
{% set layout_sidebar_hide = True %}
{% endif %}
{% block body %}
<h1>{{ title }}</h1>
<form action="{{ url_for('.request_reset') }}" method="post">
{%- call form_fieldset() %}
{{ form_field(form.screen_name, maxlength=40, autofocus='autofocus') }}
{%- endcall %}
{{ form_buttons(_('Request')) }}
</form>
{%- endblock %}
## Instruction:
Align current page value for password reset form
## Code After:
{% extends 'layout/base_auto.html' %}
{% from 'macros/forms.html' import form_buttons, form_field, form_fieldset %}
{% set current_page = 'authentication_password_request_reset' %}
{% set title = 'Zurücksetzen deines Passworts anfordern' %}
{% if g.app_mode.is_admin() %}
{% set layout_sidebar_hide = True %}
{% endif %}
{% block body %}
<h1>{{ title }}</h1>
<form action="{{ url_for('.request_reset') }}" method="post">
{%- call form_fieldset() %}
{{ form_field(form.screen_name, maxlength=40, autofocus='autofocus') }}
{%- endcall %}
{{ form_buttons(_('Request')) }}
</form>
{%- endblock %}
| {% extends 'layout/base_auto.html' %}
{% from 'macros/forms.html' import form_buttons, form_field, form_fieldset %}
- {% set current_page = 'authentication.request_password_reset' %}
? --------
+ {% set current_page = 'authentication_password_request_reset' %}
? ++++++++
{% set title = 'Zurücksetzen deines Passworts anfordern' %}
{% if g.app_mode.is_admin() %}
{% set layout_sidebar_hide = True %}
{% endif %}
{% block body %}
<h1>{{ title }}</h1>
<form action="{{ url_for('.request_reset') }}" method="post">
{%- call form_fieldset() %}
{{ form_field(form.screen_name, maxlength=40, autofocus='autofocus') }}
{%- endcall %}
{{ form_buttons(_('Request')) }}
</form>
{%- endblock %} | 2 | 0.095238 | 1 | 1 |
e7fae9c368829013857e03316af06ed511bbfbe8 | extendStorage.js | extendStorage.js | /*
Wonder how this works?
Storage is the Prototype of both localStorage and sessionStorage.
Got it?
*/
(function() {
'use strict';
Storage.prototype.set = function(key, obj) {
var t = typeof obj;
if (t==='undefined' || obj===null ) this.removeItem(key);
this.setItem(key, (t==='object')?JSON.stringify(obj):obj);
};
Storage.prototype.get = function(key) {
var obj = this.getItem(key);
try {
var j = JSON.parse(obj);
if (j && typeof j === "object") return j;
} catch (e) { }
return obj;
};
Storage.prototype.has = window.hasOwnProperty;
Storage.prototype.remove = window.removeItem;
Storage.prototype.keys = function(){
return Object.keys(this.valueOf());
};
})();
| /*
Wonder how this works?
Storage is the Prototype of both localStorage and sessionStorage.
Got it?
*/
(function() {
'use strict';
function extend(){
for(var i=1; i<arguments.length; i++)
for(var key in arguments[i])
if(arguments[i].hasOwnProperty(key)) {
if (typeof arguments[0][key] === 'object'
&& typeof arguments[i][key] === 'object')
extend(arguments[0][key], arguments[i][key]);
else
arguments[0][key] = arguments[i][key];
}
return arguments[0];
}
Storage.prototype.set = function(key, obj) {
var t = typeof obj;
if (t==='undefined' || obj===null ) this.removeItem(key);
this.setItem(key, (t==='object')?JSON.stringify(obj):obj);
};
Storage.prototype.get = function(key) {
var obj = this.getItem(key);
try {
var j = JSON.parse(obj);
if (j && typeof j === "object") return j;
} catch (e) { }
return obj;
};
Storage.prototype.extend = function(key, obj_merge) {
    this.set(key,extend(this.get(key),obj_merge));
};
Storage.prototype.has = window.hasOwnProperty;
Storage.prototype.remove = window.removeItem;
Storage.prototype.keys = function(){
return Object.keys(this.valueOf());
};
})();
| Add localStorage.extend with recursive support (but no typechecking) | Add localStorage.extend with recursive support (but no typechecking) | JavaScript | mit | zevero/simpleWebstorage | javascript | ## Code Before:
/*
Wonder how this works?
Storage is the Prototype of both localStorage and sessionStorage.
Got it?
*/
(function() {
'use strict';
Storage.prototype.set = function(key, obj) {
var t = typeof obj;
if (t==='undefined' || obj===null ) this.removeItem(key);
this.setItem(key, (t==='object')?JSON.stringify(obj):obj);
};
Storage.prototype.get = function(key) {
var obj = this.getItem(key);
try {
var j = JSON.parse(obj);
if (j && typeof j === "object") return j;
} catch (e) { }
return obj;
};
Storage.prototype.has = window.hasOwnProperty;
Storage.prototype.remove = window.removeItem;
Storage.prototype.keys = function(){
return Object.keys(this.valueOf());
};
})();
## Instruction:
Add localStorage.extend with recursive support (but no typechecking)
## Code After:
/*
Wonder how this works?
Storage is the Prototype of both localStorage and sessionStorage.
Got it?
*/
(function() {
'use strict';
function extend(){
for(var i=1; i<arguments.length; i++)
for(var key in arguments[i])
if(arguments[i].hasOwnProperty(key)) {
if (typeof arguments[0][key] === 'object'
&& typeof arguments[i][key] === 'object')
extend(arguments[0][key], arguments[i][key]);
else
arguments[0][key] = arguments[i][key];
}
return arguments[0];
}
Storage.prototype.set = function(key, obj) {
var t = typeof obj;
if (t==='undefined' || obj===null ) this.removeItem(key);
this.setItem(key, (t==='object')?JSON.stringify(obj):obj);
};
Storage.prototype.get = function(key) {
var obj = this.getItem(key);
try {
var j = JSON.parse(obj);
if (j && typeof j === "object") return j;
} catch (e) { }
return obj;
};
Storage.prototype.extend = function(key, obj_merge) {
    this.set(key,extend(this.get(key),obj_merge));
};
Storage.prototype.has = window.hasOwnProperty;
Storage.prototype.remove = window.removeItem;
Storage.prototype.keys = function(){
return Object.keys(this.valueOf());
};
})();
| /*
Wonder how this works?
Storage is the Prototype of both localStorage and sessionStorage.
Got it?
*/
(function() {
'use strict';
+ function extend(){
+ for(var i=1; i<arguments.length; i++)
+ for(var key in arguments[i])
+ if(arguments[i].hasOwnProperty(key)) {
+ if (typeof arguments[0][key] === 'object'
+ && typeof arguments[i][key] === 'object')
+ extend(arguments[0][key], arguments[i][key]);
+ else
+ arguments[0][key] = arguments[i][key];
+ }
+ return arguments[0];
+ }
+
Storage.prototype.set = function(key, obj) {
var t = typeof obj;
if (t==='undefined' || obj===null ) this.removeItem(key);
this.setItem(key, (t==='object')?JSON.stringify(obj):obj);
};
Storage.prototype.get = function(key) {
var obj = this.getItem(key);
try {
var j = JSON.parse(obj);
if (j && typeof j === "object") return j;
} catch (e) { }
return obj;
};
+ Storage.prototype.extend = function(key, obj_merge) {
+     this.set(key,extend(this.get(key),obj_merge));
+ };
+
Storage.prototype.has = window.hasOwnProperty;
Storage.prototype.remove = window.removeItem;
Storage.prototype.keys = function(){
return Object.keys(this.valueOf());
};
})(); | 17 | 0.566667 | 17 | 0 |
7dff86fb1027899c4251133b8fc0f962c268c8fe | platformio.ini | platformio.ini |
[env:nucleo_f446re]
platform = ststm32
framework = mbed
board = nucleo_f446re
# targets = upload
upload_protocol = stlink
build_flags = -g -Isrc/stm32f4
src_filter = +<*> -<.git/> -<.hg/> -<examples/> -<arduino/>
|
[env:nucleo_f446re]
platform = ststm32
framework = mbed
board = nucleo_f446re
# targets = upload
upload_protocol = stlink
build_flags = -g -Isrc/stm32f4
src_filter = +<*> -<.git/> -<.hg/> -<examples/> -<arduino/>
[env:nucleo_f411re]
platform = ststm32
framework = mbed
board = nucleo_f411re
# targets = upload
upload_protocol = stlink
build_flags = -g -Isrc/stm32f4
src_filter = +<*> -<.git/> -<.hg/> -<examples/> -<arduino/>
| Add build target for NUCLEO_F411RE | Add build target for NUCLEO_F411RE
| INI | mit | NewJapanRadio/NJU9103_eva_mcu,NewJapanRadio/NJU9103_eva_mcu | ini | ## Code Before:
[env:nucleo_f446re]
platform = ststm32
framework = mbed
board = nucleo_f446re
# targets = upload
upload_protocol = stlink
build_flags = -g -Isrc/stm32f4
src_filter = +<*> -<.git/> -<.hg/> -<examples/> -<arduino/>
## Instruction:
Add build target for NUCLEO_F411RE
## Code After:
[env:nucleo_f446re]
platform = ststm32
framework = mbed
board = nucleo_f446re
# targets = upload
upload_protocol = stlink
build_flags = -g -Isrc/stm32f4
src_filter = +<*> -<.git/> -<.hg/> -<examples/> -<arduino/>
[env:nucleo_f411re]
platform = ststm32
framework = mbed
board = nucleo_f411re
# targets = upload
upload_protocol = stlink
build_flags = -g -Isrc/stm32f4
src_filter = +<*> -<.git/> -<.hg/> -<examples/> -<arduino/>
|
[env:nucleo_f446re]
platform = ststm32
framework = mbed
board = nucleo_f446re
# targets = upload
upload_protocol = stlink
build_flags = -g -Isrc/stm32f4
src_filter = +<*> -<.git/> -<.hg/> -<examples/> -<arduino/>
+
+ [env:nucleo_f411re]
+ platform = ststm32
+ framework = mbed
+ board = nucleo_f411re
+ # targets = upload
+ upload_protocol = stlink
+ build_flags = -g -Isrc/stm32f4
+ src_filter = +<*> -<.git/> -<.hg/> -<examples/> -<arduino/> | 9 | 1 | 9 | 0 |
66e88b82c780da64317007a8b0d5c85b56575f69 | README.md | README.md | cd ~
mv .vim .vim.bak
### Clone Repository
git clone https://github.com/Verron/remote-vim .vim
### Install Plugins
cd .vim
git submodule init
git submodule update
### Add new Plugin
Change Pathogen install method of
cd ~/.vim/bundle
git clone [plugin repository] [repository name]
to
cd ~/.vim
git submodule add [plugin repository] bundle/[repository name]
# Dependency
## CTags
sudo apt-get install ctags
## Powerline Fonts
Reference: http://askubuntu.com/questions/283908/how-can-i-install-and-use-powerline-plugin
wget https://github.com/Lokaltog/powerline/raw/develop/font/PowerlineSymbols.otf https://github.com/Lokaltog/powerline/raw/develop/font/10-powerline-symbols.conf
sudo mv PowerlineSymbols.otf /usr/share/fonts/
sudo fc-cache -vf
sudo mv 10-powerline-symbols.conf /etc/fonts/conf.d/
## Terminator Solarized Scheme
See https://github.com/ghuntley/terminator-solarized for more detail
| cd ~
mv .vim .vim.bak
### Don't forget symbolic link
ln -s .vim/.vimrc .vimrc
### Clone Repository
git clone https://github.com/Verron/remote-vim .vim
### Install Plugins
cd .vim
git submodule init
git submodule update
### Add new Plugin
Change Pathogen install method of
cd ~/.vim/bundle
git clone [plugin repository] [repository name]
to
cd ~/.vim
git submodule add [plugin repository] bundle/[repository name]
# Dependency
## CTags
sudo apt-get install ctags
## Powerline Fonts
Reference: http://askubuntu.com/questions/283908/how-can-i-install-and-use-powerline-plugin
wget https://github.com/Lokaltog/powerline/raw/develop/font/PowerlineSymbols.otf https://github.com/Lokaltog/powerline/raw/develop/font/10-powerline-symbols.conf
sudo mv PowerlineSymbols.otf /usr/share/fonts/
sudo fc-cache -vf
sudo mv 10-powerline-symbols.conf /etc/fonts/conf.d/
## Terminator Solarized Scheme
See https://github.com/ghuntley/terminator-solarized for more detail
| Add instruction to create symbolic link | Add instruction to create symbolic link
| Markdown | mit | Verron/remote-vim,Verron/remote-vim | markdown | ## Code Before:
cd ~
mv .vim .vim.bak
### Clone Repository
git clone https://github.com/Verron/remote-vim .vim
### Install Plugins
cd .vim
git submodule init
git submodule update
### Add new Plugin
Change Pathogen install method of
cd ~/.vim/bundle
git clone [plugin repository] [repository name]
to
cd ~/.vim
git submodule add [plugin repository] bundle/[repository name]
# Dependency
## CTags
sudo apt-get install ctags
## Powerline Fonts
Reference: http://askubuntu.com/questions/283908/how-can-i-install-and-use-powerline-plugin
wget https://github.com/Lokaltog/powerline/raw/develop/font/PowerlineSymbols.otf https://github.com/Lokaltog/powerline/raw/develop/font/10-powerline-symbols.conf
sudo mv PowerlineSymbols.otf /usr/share/fonts/
sudo fc-cache -vf
sudo mv 10-powerline-symbols.conf /etc/fonts/conf.d/
## Terminator Solarized Scheme
See https://github.com/ghuntley/terminator-solarized for more detail
## Instruction:
Add instruction to create symbolic link
## Code After:
cd ~
mv .vim .vim.bak
### Don't forget symbolic link
ln -s .vim/.vimrc .vimrc
### Clone Repository
git clone https://github.com/Verron/remote-vim .vim
### Install Plugins
cd .vim
git submodule init
git submodule update
### Add new Plugin
Change Pathogen install method of
cd ~/.vim/bundle
git clone [plugin repository] [repository name]
to
cd ~/.vim
git submodule add [plugin repository] bundle/[repository name]
# Dependency
## CTags
sudo apt-get install ctags
## Powerline Fonts
Reference: http://askubuntu.com/questions/283908/how-can-i-install-and-use-powerline-plugin
wget https://github.com/Lokaltog/powerline/raw/develop/font/PowerlineSymbols.otf https://github.com/Lokaltog/powerline/raw/develop/font/10-powerline-symbols.conf
sudo mv PowerlineSymbols.otf /usr/share/fonts/
sudo fc-cache -vf
sudo mv 10-powerline-symbols.conf /etc/fonts/conf.d/
## Terminator Solarized Scheme
See https://github.com/ghuntley/terminator-solarized for more detail
| cd ~
mv .vim .vim.bak
+
+ ### Don't forget symbolic link
+
+ ln -s .vim/.vimrc .vimrc
### Clone Repository
git clone https://github.com/Verron/remote-vim .vim
### Install Plugins
cd .vim
git submodule init
git submodule update
### Add new Plugin
Change Pathogen install method of
cd ~/.vim/bundle
git clone [plugin repository] [repository name]
to
cd ~/.vim
git submodule add [plugin repository] bundle/[repository name]
# Dependency
## CTags
sudo apt-get install ctags
## Powerline Fonts
Reference: http://askubuntu.com/questions/283908/how-can-i-install-and-use-powerline-plugin
wget https://github.com/Lokaltog/powerline/raw/develop/font/PowerlineSymbols.otf https://github.com/Lokaltog/powerline/raw/develop/font/10-powerline-symbols.conf
sudo mv PowerlineSymbols.otf /usr/share/fonts/
sudo fc-cache -vf
sudo mv 10-powerline-symbols.conf /etc/fonts/conf.d/
## Terminator Solarized Scheme
See https://github.com/ghuntley/terminator-solarized for more detail | 4 | 0.111111 | 4 | 0 |
819376cc6483667ce7698e6ac3088bdbbd429ac5 | doc/reference/libtelepathy-logger/docs.xml | doc/reference/libtelepathy-logger/docs.xml | <?xml version="1.0"?>
<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.1.2//EN"
"http://www.oasis-open.org/docbook/xml/4.1.2/docbookx.dtd">
<book id="index" xmlns:xi="http://www.w3.org/2003/XInclude">
<bookinfo>
<title>telepathy-logger Reference Manual</title>
</bookinfo>
<chapter id="ch-public">
<title>libtelepathy-logger API</title>
<xi:include href="xml/annotation-glossary.xml"/>
<xi:include href="xml/log-manager.xml"/>
<xi:include href="xml/log-walker.xml"/>
<xi:include href="xml/event.xml"/>
<xi:include href="xml/text-event.xml"/>
<xi:include href="xml/call-event.xml"/>
<xi:include href="xml/entity.xml"/>
</chapter>
</book>
<!-- set sts=2 sw=2 et -->
| <?xml version="1.0"?>
<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.1.2//EN"
"http://www.oasis-open.org/docbook/xml/4.1.2/docbookx.dtd">
<book id="index" xmlns:xi="http://www.w3.org/2003/XInclude">
<bookinfo>
<title>telepathy-logger Reference Manual</title>
</bookinfo>
<chapter id="ch-public">
<title>libtelepathy-logger API</title>
<xi:include href="xml/annotation-glossary.xml"><xi:fallback /></xi:include>
<xi:include href="xml/log-manager.xml"/>
<xi:include href="xml/log-walker.xml"/>
<xi:include href="xml/event.xml"/>
<xi:include href="xml/text-event.xml"/>
<xi:include href="xml/call-event.xml"/>
<xi:include href="xml/entity.xml"/>
</chapter>
</book>
<!-- set sts=2 sw=2 et -->
| Use xi:fallback when including the annotation glossary | doc: Use xi:fallback when including the annotation glossary
| XML | lgpl-2.1 | freedesktop-unofficial-mirror/telepathy__telepathy-logger,freedesktop-unofficial-mirror/telepathy__telepathy-logger,freedesktop-unofficial-mirror/telepathy__telepathy-logger | xml | ## Code Before:
<?xml version="1.0"?>
<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.1.2//EN"
"http://www.oasis-open.org/docbook/xml/4.1.2/docbookx.dtd">
<book id="index" xmlns:xi="http://www.w3.org/2003/XInclude">
<bookinfo>
<title>telepathy-logger Reference Manual</title>
</bookinfo>
<chapter id="ch-public">
<title>libtelepathy-logger API</title>
<xi:include href="xml/annotation-glossary.xml"/>
<xi:include href="xml/log-manager.xml"/>
<xi:include href="xml/log-walker.xml"/>
<xi:include href="xml/event.xml"/>
<xi:include href="xml/text-event.xml"/>
<xi:include href="xml/call-event.xml"/>
<xi:include href="xml/entity.xml"/>
</chapter>
</book>
<!-- set sts=2 sw=2 et -->
## Instruction:
doc: Use xi:fallback when including the annotation glossary
## Code After:
<?xml version="1.0"?>
<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.1.2//EN"
"http://www.oasis-open.org/docbook/xml/4.1.2/docbookx.dtd">
<book id="index" xmlns:xi="http://www.w3.org/2003/XInclude">
<bookinfo>
<title>telepathy-logger Reference Manual</title>
</bookinfo>
<chapter id="ch-public">
<title>libtelepathy-logger API</title>
<xi:include href="xml/annotation-glossary.xml"><xi:fallback /></xi:include>
<xi:include href="xml/log-manager.xml"/>
<xi:include href="xml/log-walker.xml"/>
<xi:include href="xml/event.xml"/>
<xi:include href="xml/text-event.xml"/>
<xi:include href="xml/call-event.xml"/>
<xi:include href="xml/entity.xml"/>
</chapter>
</book>
<!-- set sts=2 sw=2 et -->
| <?xml version="1.0"?>
<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.1.2//EN"
"http://www.oasis-open.org/docbook/xml/4.1.2/docbookx.dtd">
<book id="index" xmlns:xi="http://www.w3.org/2003/XInclude">
<bookinfo>
<title>telepathy-logger Reference Manual</title>
</bookinfo>
<chapter id="ch-public">
<title>libtelepathy-logger API</title>
- <xi:include href="xml/annotation-glossary.xml"/>
+ <xi:include href="xml/annotation-glossary.xml"><xi:fallback /></xi:include>
? ++++++++++++++ +++++++++++++
<xi:include href="xml/log-manager.xml"/>
<xi:include href="xml/log-walker.xml"/>
<xi:include href="xml/event.xml"/>
<xi:include href="xml/text-event.xml"/>
<xi:include href="xml/call-event.xml"/>
<xi:include href="xml/entity.xml"/>
</chapter>
</book>
<!-- set sts=2 sw=2 et --> | 2 | 0.090909 | 1 | 1 |
2bbb93a44b76949e34bce3a696a0ad3e3222ad9c | jsonsempai.py | jsonsempai.py | import imp
import json
import os
import sys
class Dot(dict):
def __init__(self, d):
super(dict, self).__init__()
for k, v in d.iteritems():
if isinstance(v, dict):
self[k] = Dot(v)
else:
self[k] = v
def __getattr__(self, attr):
return self.get(attr)
__setattr__ = dict.__setitem__
__delattr__ = dict.__delitem__
class SempaiLoader(object):
def find_module(self, name, path=None):
for d in sys.path:
self.json_path = os.path.join(d, '{}.json'.format(name))
if os.path.isfile(self.json_path):
print self.json_path
return self
return None
def load_module(self, name):
mod = imp.new_module(name)
mod.__file__ = self.json_path
mod.__loader__ = self
try:
with open(self.json_path) as f:
d = json.load(f)
except ValueError:
raise ImportError(
'"{}" does not contain valid json.'.format(self.json_path))
except:
raise ImportError(
'Could not open "{}".'.format(self.json_path))
mod.__dict__.update(d)
for k, i in mod.__dict__.items():
if isinstance(i, dict):
mod.__dict__[k] = Dot(i)
return mod
sys.meta_path.append(SempaiLoader())
| import imp
import json
import os
import sys
class Dot(dict):
def __init__(self, d):
super(dict, self).__init__()
for k, v in d.iteritems():
if isinstance(v, dict):
self[k] = Dot(v)
else:
self[k] = v
def __getattr__(self, attr):
try:
return self[attr]
except KeyError:
raise AttributeError("'{}'".format(attr))
__setattr__ = dict.__setitem__
__delattr__ = dict.__delitem__
class SempaiLoader(object):
def find_module(self, name, path=None):
for d in sys.path:
self.json_path = os.path.join(d, '{}.json'.format(name))
if os.path.isfile(self.json_path):
print self.json_path
return self
return None
def load_module(self, name):
mod = imp.new_module(name)
mod.__file__ = self.json_path
mod.__loader__ = self
try:
with open(self.json_path) as f:
d = json.load(f)
except ValueError:
raise ImportError(
'"{}" does not contain valid json.'.format(self.json_path))
except:
raise ImportError(
'Could not open "{}".'.format(self.json_path))
mod.__dict__.update(d)
for k, i in mod.__dict__.items():
if isinstance(i, dict):
mod.__dict__[k] = Dot(i)
return mod
sys.meta_path.append(SempaiLoader())
| Raise AttributeError instead of None | Raise AttributeError instead of None
| Python | mit | kragniz/json-sempai | python | ## Code Before:
import imp
import json
import os
import sys
class Dot(dict):
def __init__(self, d):
super(dict, self).__init__()
for k, v in d.iteritems():
if isinstance(v, dict):
self[k] = Dot(v)
else:
self[k] = v
def __getattr__(self, attr):
return self.get(attr)
__setattr__ = dict.__setitem__
__delattr__ = dict.__delitem__
class SempaiLoader(object):
def find_module(self, name, path=None):
for d in sys.path:
self.json_path = os.path.join(d, '{}.json'.format(name))
if os.path.isfile(self.json_path):
print self.json_path
return self
return None
def load_module(self, name):
mod = imp.new_module(name)
mod.__file__ = self.json_path
mod.__loader__ = self
try:
with open(self.json_path) as f:
d = json.load(f)
except ValueError:
raise ImportError(
'"{}" does not contain valid json.'.format(self.json_path))
except:
raise ImportError(
'Could not open "{}".'.format(self.json_path))
mod.__dict__.update(d)
for k, i in mod.__dict__.items():
if isinstance(i, dict):
mod.__dict__[k] = Dot(i)
return mod
sys.meta_path.append(SempaiLoader())
## Instruction:
Raise AttributeError instead of None
## Code After:
import imp
import json
import os
import sys
class Dot(dict):
def __init__(self, d):
super(dict, self).__init__()
for k, v in d.iteritems():
if isinstance(v, dict):
self[k] = Dot(v)
else:
self[k] = v
def __getattr__(self, attr):
try:
return self[attr]
except KeyError:
raise AttributeError("'{}'".format(attr))
__setattr__ = dict.__setitem__
__delattr__ = dict.__delitem__
class SempaiLoader(object):
def find_module(self, name, path=None):
for d in sys.path:
self.json_path = os.path.join(d, '{}.json'.format(name))
if os.path.isfile(self.json_path):
print self.json_path
return self
return None
def load_module(self, name):
mod = imp.new_module(name)
mod.__file__ = self.json_path
mod.__loader__ = self
try:
with open(self.json_path) as f:
d = json.load(f)
except ValueError:
raise ImportError(
'"{}" does not contain valid json.'.format(self.json_path))
except:
raise ImportError(
'Could not open "{}".'.format(self.json_path))
mod.__dict__.update(d)
for k, i in mod.__dict__.items():
if isinstance(i, dict):
mod.__dict__[k] = Dot(i)
return mod
sys.meta_path.append(SempaiLoader())
| import imp
import json
import os
import sys
class Dot(dict):
def __init__(self, d):
super(dict, self).__init__()
for k, v in d.iteritems():
if isinstance(v, dict):
self[k] = Dot(v)
else:
self[k] = v
def __getattr__(self, attr):
+ try:
- return self.get(attr)
? ^^^^^ ^
+ return self[attr]
? ++++ ^ ^
+ except KeyError:
+ raise AttributeError("'{}'".format(attr))
__setattr__ = dict.__setitem__
__delattr__ = dict.__delitem__
class SempaiLoader(object):
def find_module(self, name, path=None):
for d in sys.path:
self.json_path = os.path.join(d, '{}.json'.format(name))
if os.path.isfile(self.json_path):
print self.json_path
return self
return None
def load_module(self, name):
mod = imp.new_module(name)
mod.__file__ = self.json_path
mod.__loader__ = self
try:
with open(self.json_path) as f:
d = json.load(f)
except ValueError:
raise ImportError(
'"{}" does not contain valid json.'.format(self.json_path))
except:
raise ImportError(
'Could not open "{}".'.format(self.json_path))
mod.__dict__.update(d)
for k, i in mod.__dict__.items():
if isinstance(i, dict):
mod.__dict__[k] = Dot(i)
return mod
sys.meta_path.append(SempaiLoader()) | 5 | 0.087719 | 4 | 1 |
3befad5d3f89a9a2258a3e1033e846cb9cff0134 | deploy-runner-agent/src/main/java/jetbrains/buildServer/deployer/agent/ssh/SSHProcessAdapterOptions.java | deploy-runner-agent/src/main/java/jetbrains/buildServer/deployer/agent/ssh/SSHProcessAdapterOptions.java | package jetbrains.buildServer.deployer.agent.ssh;
class SSHProcessAdapterOptions {
private boolean myEnableSshAgentForwarding;
private boolean myFailBuildOnExitCode;
SSHProcessAdapterOptions(boolean myEnableSshAgentForwarding, boolean myFailBuildOnExitCode) {
this.myEnableSshAgentForwarding = myEnableSshAgentForwarding;
this.myFailBuildOnExitCode = myFailBuildOnExitCode;
}
boolean shouldFailBuildOnExitCode() {
return myFailBuildOnExitCode;
}
boolean enableSshAgentForwarding() {
return myEnableSshAgentForwarding;
}
}
| package jetbrains.buildServer.deployer.agent.ssh;
class SSHProcessAdapterOptions {
private boolean myFailBuildOnExitCode;
private boolean myEnableSshAgentForwarding;
SSHProcessAdapterOptions(boolean myFailBuildOnExitCode, boolean myEnableSshAgentForwarding) {
this.myFailBuildOnExitCode = myFailBuildOnExitCode;
this.myEnableSshAgentForwarding = myEnableSshAgentForwarding;
}
boolean shouldFailBuildOnExitCode() {
return myFailBuildOnExitCode;
}
boolean enableSshAgentForwarding() {
return myEnableSshAgentForwarding;
}
}
| Fix order of parameters in SSH Process Adapter Options | Fix order of parameters in SSH Process Adapter Options
| Java | apache-2.0 | JetBrains/teamcity-deployer-plugin,JetBrains/teamcity-deployer-plugin | java | ## Code Before:
package jetbrains.buildServer.deployer.agent.ssh;
class SSHProcessAdapterOptions {
private boolean myEnableSshAgentForwarding;
private boolean myFailBuildOnExitCode;
SSHProcessAdapterOptions(boolean myEnableSshAgentForwarding, boolean myFailBuildOnExitCode) {
this.myEnableSshAgentForwarding = myEnableSshAgentForwarding;
this.myFailBuildOnExitCode = myFailBuildOnExitCode;
}
boolean shouldFailBuildOnExitCode() {
return myFailBuildOnExitCode;
}
boolean enableSshAgentForwarding() {
return myEnableSshAgentForwarding;
}
}
## Instruction:
Fix order of parameters in SSH Process Adapter Options
## Code After:
package jetbrains.buildServer.deployer.agent.ssh;
class SSHProcessAdapterOptions {
private boolean myFailBuildOnExitCode;
private boolean myEnableSshAgentForwarding;
SSHProcessAdapterOptions(boolean myFailBuildOnExitCode, boolean myEnableSshAgentForwarding) {
this.myFailBuildOnExitCode = myFailBuildOnExitCode;
this.myEnableSshAgentForwarding = myEnableSshAgentForwarding;
}
boolean shouldFailBuildOnExitCode() {
return myFailBuildOnExitCode;
}
boolean enableSshAgentForwarding() {
return myEnableSshAgentForwarding;
}
}
| package jetbrains.buildServer.deployer.agent.ssh;
class SSHProcessAdapterOptions {
+ private boolean myFailBuildOnExitCode;
private boolean myEnableSshAgentForwarding;
- private boolean myFailBuildOnExitCode;
- SSHProcessAdapterOptions(boolean myEnableSshAgentForwarding, boolean myFailBuildOnExitCode) {
+ SSHProcessAdapterOptions(boolean myFailBuildOnExitCode, boolean myEnableSshAgentForwarding) {
+ this.myFailBuildOnExitCode = myFailBuildOnExitCode;
this.myEnableSshAgentForwarding = myEnableSshAgentForwarding;
- this.myFailBuildOnExitCode = myFailBuildOnExitCode;
}
boolean shouldFailBuildOnExitCode() {
return myFailBuildOnExitCode;
}
boolean enableSshAgentForwarding() {
return myEnableSshAgentForwarding;
}
} | 6 | 0.315789 | 3 | 3 |
38099f7a41345ceff6342a10f0d0eb8e80165b89 | app/assets/javascripts/pageflow/slideshow/scroll_indicator_widget.js | app/assets/javascripts/pageflow/slideshow/scroll_indicator_widget.js | (function($) {
$.widget('pageflow.scrollIndicator', {
_create: function() {
var parent = this.options.parent,
that = this;
parent.on('pageactivate', function(event) {
that.element.toggleClass('invert', $(event.target).hasClass('invert'));
});
parent.on('scrollerhintdown', function() {
that.element.addClass('animate');
setTimeout(function() {
that.element.removeClass('animate');
}, 500);
});
parent.on('scrollernearbottom', function(event) {
var page = $(event.target).parents('section');
if (page.hasClass('active')) {
that.element.toggleClass('visible', parent.nextPageExists());
}
});
parent.on('scrollernotnearbottom slideshowchangepage', function() {
that.element.removeClass('visible');
});
}
});
}(jQuery)); | (function($) {
$.widget('pageflow.scrollIndicator', {
_create: function() {
var parent = this.options.parent,
that = this;
parent.on('pageactivate', function(event) {
var page = $(event.target);
var invertIndicator = page.data('invertIndicator');
if (typeof invertIndicator === 'undefined') {
invertIndicator = page.hasClass('invert');
}
that.element.toggleClass('invert', invertIndicator);
});
parent.on('scrollerhintdown', function() {
that.element.addClass('animate');
setTimeout(function() {
that.element.removeClass('animate');
}, 500);
});
parent.on('scrollernearbottom', function(event) {
var page = $(event.target).parents('section');
if (page.hasClass('active')) {
that.element.toggleClass('visible', parent.nextPageExists());
}
});
parent.on('scrollernotnearbottom slideshowchangepage', function() {
that.element.removeClass('visible');
});
}
});
}(jQuery)); | Allow Inverting Scrolling Indicator on Page Basis | Allow Inverting Scrolling Indicator on Page Basis
Allow page types to set an `invertIndicator` attribute on the page
element to control whether the scrolling indicator should be
inverted. Determined by presence of `invert` css class on page element
by default.
| JavaScript | mit | schoetty/pageflow,luatdolphin/pageflow,codevise/pageflow,schoetty/pageflow,Modularfield/pageflow,tilsammans/pageflow,tilsammans/pageflow,tf/pageflow,Modularfield/pageflow,BenHeubl/pageflow,codevise/pageflow,Modularfield/pageflow,BenHeubl/pageflow,YoussefChafai/pageflow,grgr/pageflow,YoussefChafai/pageflow,upsworld/pageflow,tilsammans/pageflow,luatdolphin/pageflow,grgr/pageflow,tf/pageflow-dependabot-test,codevise/pageflow,grgr/pageflow,tf/pageflow-dependabot-test,codevise/pageflow,tf/pageflow,Modularfield/pageflow,tilsammans/pageflow,tf/pageflow,upsworld/pageflow,luatdolphin/pageflow,upsworld/pageflow,tf/pageflow-dependabot-test,BenHeubl/pageflow,schoetty/pageflow,tf/pageflow-dependabot-test,tf/pageflow,YoussefChafai/pageflow | javascript | ## Code Before:
(function($) {
$.widget('pageflow.scrollIndicator', {
_create: function() {
var parent = this.options.parent,
that = this;
parent.on('pageactivate', function(event) {
that.element.toggleClass('invert', $(event.target).hasClass('invert'));
});
parent.on('scrollerhintdown', function() {
that.element.addClass('animate');
setTimeout(function() {
that.element.removeClass('animate');
}, 500);
});
parent.on('scrollernearbottom', function(event) {
var page = $(event.target).parents('section');
if (page.hasClass('active')) {
that.element.toggleClass('visible', parent.nextPageExists());
}
});
parent.on('scrollernotnearbottom slideshowchangepage', function() {
that.element.removeClass('visible');
});
}
});
}(jQuery));
## Instruction:
Allow Inverting Scrolling Indicator on Page Basis
Allow page types to set an `invertIndicator` attribute on the page
element to control whether the scrolling indicator should be
inverted. Determined by presence of `invert` css class on page element
by default.
## Code After:
(function($) {
$.widget('pageflow.scrollIndicator', {
_create: function() {
var parent = this.options.parent,
that = this;
parent.on('pageactivate', function(event) {
var page = $(event.target);
var invertIndicator = page.data('invertIndicator');
if (typeof invertIndicator === 'undefined') {
invertIndicator = page.hasClass('invert');
}
that.element.toggleClass('invert', invertIndicator);
});
parent.on('scrollerhintdown', function() {
that.element.addClass('animate');
setTimeout(function() {
that.element.removeClass('animate');
}, 500);
});
parent.on('scrollernearbottom', function(event) {
var page = $(event.target).parents('section');
if (page.hasClass('active')) {
that.element.toggleClass('visible', parent.nextPageExists());
}
});
parent.on('scrollernotnearbottom slideshowchangepage', function() {
that.element.removeClass('visible');
});
}
});
}(jQuery)); | (function($) {
$.widget('pageflow.scrollIndicator', {
_create: function() {
var parent = this.options.parent,
that = this;
parent.on('pageactivate', function(event) {
- that.element.toggleClass('invert', $(event.target).hasClass('invert'));
+ var page = $(event.target);
+ var invertIndicator = page.data('invertIndicator');
+
+ if (typeof invertIndicator === 'undefined') {
+ invertIndicator = page.hasClass('invert');
+ }
+
+ that.element.toggleClass('invert', invertIndicator);
});
parent.on('scrollerhintdown', function() {
that.element.addClass('animate');
setTimeout(function() {
that.element.removeClass('animate');
}, 500);
});
parent.on('scrollernearbottom', function(event) {
var page = $(event.target).parents('section');
if (page.hasClass('active')) {
that.element.toggleClass('visible', parent.nextPageExists());
}
});
parent.on('scrollernotnearbottom slideshowchangepage', function() {
that.element.removeClass('visible');
});
}
});
}(jQuery)); | 9 | 0.290323 | 8 | 1 |
eebebcbea425efa3fc0c38528e5bf862a6dcfc7f | README.md | README.md | An online image dataset sharing and annotating platform.
## Setup
Clone this repository:
git clone https://github.com/ikafire/OpenObjectMarker
cd OpenObjectMarker
Install node package dependency:
sudo npm install -g # install the cmd tool (e.g., gulp and bower)
sudo npm install
Install bower components:
bower install
Build the project:
gulp
Open a new terminal and start the server:
cd OpenObjectMarker
npm start
Open your browser and access http://localhost:3000/
| An online image dataset sharing and annotating platform.
## Setup
Install node.js and npm:
# Mac OSX:
go to https://nodejs.org/en/ and download the .pkg.
# Ubuntu:
just install it by apt-get
Clone this repository:
git clone https://github.com/ikafire/OpenObjectMarker
cd OpenObjectMarker
Install node package dependency:
sudo npm install -g # install the cmd tool (e.g., gulp and bower)
sudo npm install
Install bower components:
sudo npm install bower -g
bower install
Build the project:
sudo npm install gulp -g
gulp
Open a new terminal and start the server:
cd OpenObjectMarker
npm start
Open your browser and access http://localhost:3000/
| Add some command for installing building tool | [Readme] Add some command for installing building tool
| Markdown | bsd-3-clause | stitux/OpenObjectMarker,stitux/OpenObjectMarker,ikafire/OpenObjectMarker,colin8930/OpenObjectMarker,colin8930/OpenObjectMarker,ikafire/OpenObjectMarker | markdown | ## Code Before:
An online image dataset sharing and annotating platform.
## Setup
Clone this repository:
git clone https://github.com/ikafire/OpenObjectMarker
cd OpenObjectMarker
Install node package dependency:
sudo npm install -g # install the cmd tool (e.g., gulp and bower)
sudo npm install
Install bower components:
bower install
Build the project:
gulp
Open a new terminal and start the server:
cd OpenObjectMarker
npm start
Open your browser and access http://localhost:3000/
## Instruction:
[Readme] Add some command for installing building tool
## Code After:
An online image dataset sharing and annotating platform.
## Setup
Install node.js and npm:
# Mac OSX:
go to https://nodejs.org/en/ and download the .pkg.
# Ubuntu:
just install it by apt-get
Clone this repository:
git clone https://github.com/ikafire/OpenObjectMarker
cd OpenObjectMarker
Install node package dependency:
sudo npm install -g # install the cmd tool (e.g., gulp and bower)
sudo npm install
Install bower components:
sudo npm install bower -g
bower install
Build the project:
sudo npm install gulp -g
gulp
Open a new terminal and start the server:
cd OpenObjectMarker
npm start
Open your browser and access http://localhost:3000/
| An online image dataset sharing and annotating platform.
## Setup
+
+ Install node.js and npm:
+
+ # Mac OSX:
+ go to https://nodejs.org/en/ and download the .pkg.
+
+ # Ubuntu:
+ just install it by apt-get
Clone this repository:
git clone https://github.com/ikafire/OpenObjectMarker
cd OpenObjectMarker
Install node package dependency:
sudo npm install -g # install the cmd tool (e.g., gulp and bower)
sudo npm install
Install bower components:
+ sudo npm install bower -g
bower install
Build the project:
+ sudo npm install gulp -g
gulp
Open a new terminal and start the server:
cd OpenObjectMarker
npm start
Open your browser and access http://localhost:3000/ | 10 | 0.357143 | 10 | 0 |
fde6dfe73e2f099fa2a17f603f446e0cd0ece301 | lib/mina/unicorn/tasks.rb | lib/mina/unicorn/tasks.rb | require 'mina/bundler'
require 'mina/rails'
require 'mina/unicorn/utility'
namespace :unicorn do
include Mina::Unicorn::Utility
set_default :unicorn_env, -> { fetch(:rails_env, 'production') }
set_default :unicorn_config, -> { "#{deploy_to}/#{current_path}/config/unicorn.rb" }
set_default :unicorn_pid, -> { "#{deploy_to}/#{current_path}/tmp/pids/unicorn.pid" }
set_default :unicorn_cmd, -> { "#{bundle_prefix} unicorn" }
set_default :unicorn_restart_sleep_time, -> { 2 }
set_default :bundle_gemfile, -> { "#{deploy_to}/#{current_path}/Gemfile" }
desc "Start Unicorn master process"
task start: :environment do
queue start_unicorn
end
desc "Stop Unicorn"
task stop: :environment do
queue kill_unicorn('QUIT')
end
desc "Immediately shutdown Unicorn"
task shutdown: :environment do
queue kill_unicorn('TERM')
end
desc "Restart unicorn service"
task restart: :environment do
queue restart_unicorn
end
end
| require 'mina/bundler'
require 'mina/rails'
require 'mina/unicorn/utility'
namespace :unicorn do
include Mina::Unicorn::Utility
# Following recommendations from http://unicorn.bogomips.org/unicorn_1.html#rack-environment
set_default :unicorn_env, -> { fetch(:rails_env) == 'development' ? 'development' : 'deployment' }
set_default :unicorn_config, -> { "#{deploy_to}/#{current_path}/config/unicorn.rb" }
set_default :unicorn_pid, -> { "#{deploy_to}/#{current_path}/tmp/pids/unicorn.pid" }
set_default :unicorn_cmd, -> { "#{bundle_prefix} unicorn" }
set_default :unicorn_restart_sleep_time, -> { 2 }
set_default :bundle_gemfile, -> { "#{deploy_to}/#{current_path}/Gemfile" }
desc "Start Unicorn master process"
task start: :environment do
queue start_unicorn
end
desc "Stop Unicorn"
task stop: :environment do
queue kill_unicorn('QUIT')
end
desc "Immediately shutdown Unicorn"
task shutdown: :environment do
queue kill_unicorn('TERM')
end
desc "Restart unicorn service"
task restart: :environment do
queue restart_unicorn
end
end
| Change default settings for unicorn_env | Change default settings for unicorn_env
Follow recommendations from official unicorn documentation.
see: http://unicorn.bogomips.org/unicorn_1.html#rack-environment
| Ruby | mit | scarfacedeb/mina-unicorn,Rodrigora/mina-unicorn | ruby | ## Code Before:
require 'mina/bundler'
require 'mina/rails'
require 'mina/unicorn/utility'
namespace :unicorn do
include Mina::Unicorn::Utility
set_default :unicorn_env, -> { fetch(:rails_env, 'production') }
set_default :unicorn_config, -> { "#{deploy_to}/#{current_path}/config/unicorn.rb" }
set_default :unicorn_pid, -> { "#{deploy_to}/#{current_path}/tmp/pids/unicorn.pid" }
set_default :unicorn_cmd, -> { "#{bundle_prefix} unicorn" }
set_default :unicorn_restart_sleep_time, -> { 2 }
set_default :bundle_gemfile, -> { "#{deploy_to}/#{current_path}/Gemfile" }
desc "Start Unicorn master process"
task start: :environment do
queue start_unicorn
end
desc "Stop Unicorn"
task stop: :environment do
queue kill_unicorn('QUIT')
end
desc "Immediately shutdown Unicorn"
task shutdown: :environment do
queue kill_unicorn('TERM')
end
desc "Restart unicorn service"
task restart: :environment do
queue restart_unicorn
end
end
## Instruction:
Change default settings for unicorn_env
Follow recommendations from official unicorn documentation.
see: http://unicorn.bogomips.org/unicorn_1.html#rack-environment
## Code After:
require 'mina/bundler'
require 'mina/rails'
require 'mina/unicorn/utility'
namespace :unicorn do
include Mina::Unicorn::Utility
# Following recommendations from http://unicorn.bogomips.org/unicorn_1.html#rack-environment
set_default :unicorn_env, -> { fetch(:rails_env) == 'development' ? 'development' : 'deployment' }
set_default :unicorn_config, -> { "#{deploy_to}/#{current_path}/config/unicorn.rb" }
set_default :unicorn_pid, -> { "#{deploy_to}/#{current_path}/tmp/pids/unicorn.pid" }
set_default :unicorn_cmd, -> { "#{bundle_prefix} unicorn" }
set_default :unicorn_restart_sleep_time, -> { 2 }
set_default :bundle_gemfile, -> { "#{deploy_to}/#{current_path}/Gemfile" }
desc "Start Unicorn master process"
task start: :environment do
queue start_unicorn
end
desc "Stop Unicorn"
task stop: :environment do
queue kill_unicorn('QUIT')
end
desc "Immediately shutdown Unicorn"
task shutdown: :environment do
queue kill_unicorn('TERM')
end
desc "Restart unicorn service"
task restart: :environment do
queue restart_unicorn
end
end
| require 'mina/bundler'
require 'mina/rails'
require 'mina/unicorn/utility'
namespace :unicorn do
include Mina::Unicorn::Utility
- set_default :unicorn_env, -> { fetch(:rails_env, 'production') }
+ # Following recommendations from http://unicorn.bogomips.org/unicorn_1.html#rack-environment
+ set_default :unicorn_env, -> { fetch(:rails_env) == 'development' ? 'development' : 'deployment' }
set_default :unicorn_config, -> { "#{deploy_to}/#{current_path}/config/unicorn.rb" }
set_default :unicorn_pid, -> { "#{deploy_to}/#{current_path}/tmp/pids/unicorn.pid" }
set_default :unicorn_cmd, -> { "#{bundle_prefix} unicorn" }
set_default :unicorn_restart_sleep_time, -> { 2 }
set_default :bundle_gemfile, -> { "#{deploy_to}/#{current_path}/Gemfile" }
desc "Start Unicorn master process"
task start: :environment do
queue start_unicorn
end
desc "Stop Unicorn"
task stop: :environment do
queue kill_unicorn('QUIT')
end
desc "Immediately shutdown Unicorn"
task shutdown: :environment do
queue kill_unicorn('TERM')
end
desc "Restart unicorn service"
task restart: :environment do
queue restart_unicorn
end
end
| 3 | 0.085714 | 2 | 1 |
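The commit above replaces a blanket `production` default for `unicorn_env` with the rule the unicorn documentation recommends: `development` passes through, and every other Rails environment maps to `deployment`. The Ruby ternary in the diff can be restated as a tiny Python function for illustration:

```python
def unicorn_env(rails_env):
    """Restate the record's Ruby default:
        fetch(:rails_env) == 'development' ? 'development' : 'deployment'

    Per the unicorn docs linked in the commit message, only a true
    development environment should run in 'development' mode; production,
    staging, or an unset value all become 'deployment'.
    """
    return "development" if rails_env == "development" else "deployment"
```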
ed7ad70b91adbbcde814be7e7a3207ba31be617d | build.sbt | build.sbt | name := "Apart"
version := "1.0"
scalaSource in Compile <<= baseDirectory(_ / "src")
scalaSource in Test <<= baseDirectory(_ / "src/junit")
javaSource in Compile <<= baseDirectory(_ / "src")
javaSource in Test <<= baseDirectory(_ / "src/junit")
libraryDependencies += "junit" % "junit" % "4.11"
libraryDependencies += "com.twitter" %% "util-eval" % "6.12.1"
libraryDependencies += "com.novocode" % "junit-interface" % "0.11" % "test"
libraryDependencies += "org.clapper" %% "argot" % "1.0.3"
ScoverageSbtPlugin.ScoverageKeys.coverageExcludedPackages := "<empty>;benchmarks.*;.*Test.*;junit.*;.*interop.*"
fork := true
// To run tests from the command line, uncomment and replace with the proper path form your system
//javaOptions in Test += "-Djava.library.path=/path-to-the-skelcl-folder/build/executor"
| name := "Apart"
version := "1.0"
scalaSource in Compile <<= baseDirectory(_ / "src")
scalaSource in Test <<= baseDirectory(_ / "src/junit")
javaSource in Compile <<= baseDirectory(_ / "src")
javaSource in Test <<= baseDirectory(_ / "src/junit")
libraryDependencies += "junit" % "junit" % "4.11"
libraryDependencies += "com.twitter" %% "util-eval" % "6.12.1"
libraryDependencies += "com.novocode" % "junit-interface" % "0.11" % "test"
libraryDependencies += "org.clapper" %% "argot" % "1.0.3"
scalacOptions in (Compile,doc) := Seq("-implicits", "-diagrams")
ScoverageSbtPlugin.ScoverageKeys.coverageExcludedPackages := "<empty>;benchmarks.*;.*Test.*;junit.*;.*interop.*"
fork := true
// To run tests from the command line, uncomment and replace with the proper path form your system
//javaOptions in Test += "-Djava.library.path=/path-to-the-skelcl-folder/build/executor"
| Set scaladoc to also generate diagrams | Set scaladoc to also generate diagrams
| Scala | mit | lift-project/lift,lift-project/lift,lift-project/lift,lift-project/lift,lift-project/lift | scala | ## Code Before:
name := "Apart"
version := "1.0"
scalaSource in Compile <<= baseDirectory(_ / "src")
scalaSource in Test <<= baseDirectory(_ / "src/junit")
javaSource in Compile <<= baseDirectory(_ / "src")
javaSource in Test <<= baseDirectory(_ / "src/junit")
libraryDependencies += "junit" % "junit" % "4.11"
libraryDependencies += "com.twitter" %% "util-eval" % "6.12.1"
libraryDependencies += "com.novocode" % "junit-interface" % "0.11" % "test"
libraryDependencies += "org.clapper" %% "argot" % "1.0.3"
ScoverageSbtPlugin.ScoverageKeys.coverageExcludedPackages := "<empty>;benchmarks.*;.*Test.*;junit.*;.*interop.*"
fork := true
// To run tests from the command line, uncomment and replace with the proper path form your system
//javaOptions in Test += "-Djava.library.path=/path-to-the-skelcl-folder/build/executor"
## Instruction:
Set scaladoc to also generate diagrams
## Code After:
name := "Apart"
version := "1.0"
scalaSource in Compile <<= baseDirectory(_ / "src")
scalaSource in Test <<= baseDirectory(_ / "src/junit")
javaSource in Compile <<= baseDirectory(_ / "src")
javaSource in Test <<= baseDirectory(_ / "src/junit")
libraryDependencies += "junit" % "junit" % "4.11"
libraryDependencies += "com.twitter" %% "util-eval" % "6.12.1"
libraryDependencies += "com.novocode" % "junit-interface" % "0.11" % "test"
libraryDependencies += "org.clapper" %% "argot" % "1.0.3"
scalacOptions in (Compile,doc) := Seq("-implicits", "-diagrams")
ScoverageSbtPlugin.ScoverageKeys.coverageExcludedPackages := "<empty>;benchmarks.*;.*Test.*;junit.*;.*interop.*"
fork := true
// To run tests from the command line, uncomment and replace with the proper path form your system
//javaOptions in Test += "-Djava.library.path=/path-to-the-skelcl-folder/build/executor"
| name := "Apart"
version := "1.0"
scalaSource in Compile <<= baseDirectory(_ / "src")
scalaSource in Test <<= baseDirectory(_ / "src/junit")
javaSource in Compile <<= baseDirectory(_ / "src")
javaSource in Test <<= baseDirectory(_ / "src/junit")
libraryDependencies += "junit" % "junit" % "4.11"
libraryDependencies += "com.twitter" %% "util-eval" % "6.12.1"
libraryDependencies += "com.novocode" % "junit-interface" % "0.11" % "test"
libraryDependencies += "org.clapper" %% "argot" % "1.0.3"
+ scalacOptions in (Compile,doc) := Seq("-implicits", "-diagrams")
+
ScoverageSbtPlugin.ScoverageKeys.coverageExcludedPackages := "<empty>;benchmarks.*;.*Test.*;junit.*;.*interop.*"
fork := true
// To run tests from the command line, uncomment and replace with the proper path form your system
//javaOptions in Test += "-Djava.library.path=/path-to-the-skelcl-folder/build/executor" | 2 | 0.076923 | 2 | 0 |
f25082ad5fefdfc1e5f53bbefd026cd62c29136b | mods/snow/aliases.lua | mods/snow/aliases.lua | minetest.register_alias("default:pine_needles", "snow:needles")
minetest.register_alias("default:pine_needles", "snow:leaves")
| minetest.register_alias("default:pine_needles", "snow:needles")
minetest.register_alias("default:pine_needles", "snow:leaves")
minetest.register_alias("default:pine_sapling", "snow:sapling_pine")
| Implement alias between snow and default pine sapling | [snow] Implement alias between snow and default pine sapling
- Maybe solve #425
| Lua | unlicense | sys4-fr/server-minetestforfun,Gael-de-Sailly/minetest-minetestforfun-server,Coethium/server-minetestforfun,Coethium/server-minetestforfun,crabman77/minetest-minetestforfun-server,sys4-fr/server-minetestforfun,LeMagnesium/minetest-minetestforfun-server,Gael-de-Sailly/minetest-minetestforfun-server,MinetestForFun/minetest-minetestforfun-server,LeMagnesium/minetest-minetestforfun-server,Ombridride/minetest-minetestforfun-server,crabman77/minetest-minetestforfun-server,MinetestForFun/minetest-minetestforfun-server,sys4-fr/server-minetestforfun,Coethium/server-minetestforfun,MinetestForFun/server-minetestforfun,Ombridride/minetest-minetestforfun-server,MinetestForFun/server-minetestforfun,MinetestForFun/server-minetestforfun,LeMagnesium/minetest-minetestforfun-server,Gael-de-Sailly/minetest-minetestforfun-server,MinetestForFun/minetest-minetestforfun-server,crabman77/minetest-minetestforfun-server,Ombridride/minetest-minetestforfun-server | lua | ## Code Before:
minetest.register_alias("default:pine_needles", "snow:needles")
minetest.register_alias("default:pine_needles", "snow:leaves")
## Instruction:
[snow] Implement alias between snow and default pine sapling
- Maybe solve #425
## Code After:
minetest.register_alias("default:pine_needles", "snow:needles")
minetest.register_alias("default:pine_needles", "snow:leaves")
minetest.register_alias("default:pine_sapling", "snow:sapling_pine")
| minetest.register_alias("default:pine_needles", "snow:needles")
minetest.register_alias("default:pine_needles", "snow:leaves")
+ minetest.register_alias("default:pine_sapling", "snow:sapling_pine") | 1 | 0.5 | 1 | 0 |
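`minetest.register_alias` makes a legacy node name resolve to its replacement, which is how the commit above maps the default pine sapling onto the snow mod's sapling. The mechanism can be sketched in Python (the real API is Lua and lives inside the engine; the chain-following behaviour here is this sketch's own addition):

```python
ALIASES = {}

def register_alias(old_name, new_name):
    """Record that `old_name` should resolve to `new_name`."""
    ALIASES[old_name] = new_name

def resolve(name):
    """Follow aliases to a canonical node name, guarding against cycles."""
    seen = set()
    while name in ALIASES and name not in seen:
        seen.add(name)
        name = ALIASES[name]
    return name

# Aliases taken from the record above:
register_alias("default:pine_needles", "snow:needles")
register_alias("default:pine_sapling", "snow:sapling_pine")
```

Note that the record registers `default:pine_needles` twice; in this dict-based sketch a second registration would simply overwrite the first.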
d7790236a87155bfdbf2cda8eafa7f4d8114a8f5 | ansible/main.yml | ansible/main.yml | ---
- hosts: all
vars:
cf_cli_version: "6.25.0"
maven_version: "3.5.0"
maven_checksum: "516923b3955b6035ba6b0a5b031fbd8b"
sbt_version: "0.13.13"
intellij_version: "2017.1"
sts_version: "3.8.4"
eclipse_version: "4.6.3"
docker_compose_version: "1.12.0"
atom_version: "1.15.0"
gradle_version: "3.4.1"
go_version: "1.8"
compliance_masonry_version: "1.1.2"
gather_facts: yes
become: yes
roles:
- disable_auto_update
- developer_packages
- atom
- docker_service
- docker_compose
- cf_cli
- maven
- sbt
- gradle
- intellij
- sts
- unity_desktop
- go
- postman
- gitbook
- compliance_masonry
| ---
- hosts: all
vars:
cf_cli_version: "6.25.0"
maven_version: "3.5.0"
maven_checksum: "35c39251d2af99b6624d40d801f6ff02"
sbt_version: "0.13.13"
intellij_version: "2017.1"
sts_version: "3.8.4"
eclipse_version: "4.6.3"
docker_compose_version: "1.12.0"
atom_version: "1.15.0"
gradle_version: "3.4.1"
go_version: "1.8"
compliance_masonry_version: "1.1.2"
gather_facts: yes
become: yes
roles:
- disable_auto_update
- developer_packages
- atom
- docker_service
- docker_compose
- cf_cli
- maven
- sbt
- gradle
- intellij
- sts
- unity_desktop
- go
- postman
- gitbook
- compliance_masonry
| Use correct md5 for maven | Use correct md5 for maven
| YAML | apache-2.0 | cagiti/developer-environment | yaml | ## Code Before:
---
- hosts: all
vars:
cf_cli_version: "6.25.0"
maven_version: "3.5.0"
maven_checksum: "516923b3955b6035ba6b0a5b031fbd8b"
sbt_version: "0.13.13"
intellij_version: "2017.1"
sts_version: "3.8.4"
eclipse_version: "4.6.3"
docker_compose_version: "1.12.0"
atom_version: "1.15.0"
gradle_version: "3.4.1"
go_version: "1.8"
compliance_masonry_version: "1.1.2"
gather_facts: yes
become: yes
roles:
- disable_auto_update
- developer_packages
- atom
- docker_service
- docker_compose
- cf_cli
- maven
- sbt
- gradle
- intellij
- sts
- unity_desktop
- go
- postman
- gitbook
- compliance_masonry
## Instruction:
Use correct md5 for maven
## Code After:
---
- hosts: all
vars:
cf_cli_version: "6.25.0"
maven_version: "3.5.0"
maven_checksum: "35c39251d2af99b6624d40d801f6ff02"
sbt_version: "0.13.13"
intellij_version: "2017.1"
sts_version: "3.8.4"
eclipse_version: "4.6.3"
docker_compose_version: "1.12.0"
atom_version: "1.15.0"
gradle_version: "3.4.1"
go_version: "1.8"
compliance_masonry_version: "1.1.2"
gather_facts: yes
become: yes
roles:
- disable_auto_update
- developer_packages
- atom
- docker_service
- docker_compose
- cf_cli
- maven
- sbt
- gradle
- intellij
- sts
- unity_desktop
- go
- postman
- gitbook
- compliance_masonry
| ---
- hosts: all
vars:
cf_cli_version: "6.25.0"
maven_version: "3.5.0"
- maven_checksum: "516923b3955b6035ba6b0a5b031fbd8b"
+ maven_checksum: "35c39251d2af99b6624d40d801f6ff02"
sbt_version: "0.13.13"
intellij_version: "2017.1"
sts_version: "3.8.4"
eclipse_version: "4.6.3"
docker_compose_version: "1.12.0"
atom_version: "1.15.0"
gradle_version: "3.4.1"
go_version: "1.8"
compliance_masonry_version: "1.1.2"
gather_facts: yes
become: yes
roles:
- disable_auto_update
- developer_packages
- atom
- docker_service
- docker_compose
- cf_cli
- maven
- sbt
- gradle
- intellij
- sts
- unity_desktop
- go
- postman
- gitbook
- compliance_masonry | 2 | 0.058824 | 1 | 1 |
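The one-line fix above matters because a checksum-verified download is aborted when the recorded digest disagrees with the fetched file, so a wrong MD5 breaks the Maven install outright. The verification idea reduces to a few lines of `hashlib` (illustrative only, not Ansible's actual implementation):

```python
import hashlib

def md5_matches(data, expected_hex):
    """Return True when the MD5 digest of `data` equals `expected_hex`.

    This is the core of any checksum-gated download: hash the bytes
    you fetched and refuse to use them unless the digest matches the
    one recorded alongside the version number.
    """
    return hashlib.md5(data).hexdigest() == expected_hex.lower()
```

MD5 is fine for spotting a corrupted or mismatched tarball, though for tamper resistance a SHA-256 digest would be the stronger choice.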
1269cae52fa70296ce922b658d2b87f531a0a27c | README.md | README.md | Convert ZX Spectrum tape recordings to the .PZX format
# Overview
The Sinclair ZX Spectrum and related computers primarily stored their software on audio cassette tapes. While most emulators for these machines support the TAP and TZX file formats to efficiently store that software, they are either limited in what software is supported (TAP) or overly complex to support (TZX).
[The PZX format](http://zxds.raxoft.cz/pzx.html) is a more recent attempt to produce a flexible format that can support all software produced for the ZX Spectrum and related machines but is relatively simple to encode and support.
This program translates a recording of a ZX Spectrum tape in WAV format (preferably a mono 8-bit file with maximum volume without clipping the samples) into a PZX file, recognising the standard Spectrum files saved by the ROM routines and preserving any data it doesn't directly support.
It currently expects to be run from the command line and has the following arguments:
$ wav2pzx.jar <infile.wav> <outfile.pzx>
| Convert ZX Spectrum tape recordings to the .PZX format
# Overview
The Sinclair ZX Spectrum and related computers primarily stored their software on audio cassette tapes. While most emulators for these machines support the TAP and TZX file formats to efficiently store that software, they are either limited in what software is supported (TAP) or overly complex to support (TZX).
[The PZX format](http://zxds.raxoft.cz/pzx.html) is a more recent attempt to produce a flexible format that can support all software produced for the ZX Spectrum and related machines but is relatively simple to encode and support.
This program translates a recording of a ZX Spectrum tape in WAV format (preferably a mono 8-bit file with maximum volume without clipping the samples) into a PZX file, recognising the standard Spectrum files saved by the ROM routines and preserving any data it doesn't directly support.
It currently expects to be run from the command line and has the following arguments:
$ java -jar wav2pzx-1.0.jar <infile.wav> <outfile.pzx>
| Correct command required to run application. | Correct command required to run application.
| Markdown | bsd-2-clause | fmeunier/wav2pzx | markdown | ## Code Before:
Convert ZX Spectrum tape recordings to the .PZX format
# Overview
The Sinclair ZX Spectrum and related computers primarily stored their software on audio cassette tapes. While most emulators for these machines support the TAP and TZX file formats to efficiently store that software, they are either limited in what software is supported (TAP) or overly complex to support (TZX).
[The PZX format](http://zxds.raxoft.cz/pzx.html) is a more recent attempt to produce a flexible format that can support all software produced for the ZX Spectrum and related machines but is relatively simple to encode and support.
This program translates a recording of a ZX Spectrum tape in WAV format (preferably a mono 8-bit file with maximum volume without clipping the samples) into a PZX file, recognising the standard Spectrum files saved by the ROM routines and preserving any data it doesn't directly support.
It currently expects to be run from the command line and has the following arguments:
$ wav2pzx.jar <infile.wav> <outfile.pzx>
## Instruction:
Correct command required to run application.
## Code After:
Convert ZX Spectrum tape recordings to the .PZX format
# Overview
The Sinclair ZX Spectrum and related computers primarily stored their software on audio cassette tapes. While most emulators for these machines support the TAP and TZX file formats to efficiently store that software, they are either limited in what software is supported (TAP) or overly complex to support (TZX).
[The PZX format](http://zxds.raxoft.cz/pzx.html) is a more recent attempt to produce a flexible format that can support all software produced for the ZX Spectrum and related machines but is relatively simple to encode and support.
This program translates a recording of a ZX Spectrum tape in WAV format (preferably a mono 8-bit file with maximum volume without clipping the samples) into a PZX file, recognising the standard Spectrum files saved by the ROM routines and preserving any data it doesn't directly support.
It currently expects to be run from the command line and has the following arguments:
$ java -jar wav2pzx-1.0.jar <infile.wav> <outfile.pzx>
| Convert ZX Spectrum tape recordings to the .PZX format
# Overview
The Sinclair ZX Spectrum and related computers primarily stored their software on audio cassette tapes. While most emulators for these machines support the TAP and TZX file formats to efficiently store that software, they are either limited in what software is supported (TAP) or overly complex to support (TZX).
[The PZX format](http://zxds.raxoft.cz/pzx.html) is a more recent attempt to produce a flexible format that can support all software produced for the ZX Spectrum and related machines but is relatively simple to encode and support.
This program translates a recording of a ZX Spectrum tape in WAV format (preferably a mono 8-bit file with maximum volume without clipping the samples) into a PZX file, recognising the standard Spectrum files saved by the ROM routines and preserving any data it doesn't directly support.
It currently expects to be run from the command line and has the following arguments:
- $ wav2pzx.jar <infile.wav> <outfile.pzx>
+ $ java -jar wav2pzx-1.0.jar <infile.wav> <outfile.pzx>
? ++++++++++ ++++
| 2 | 0.166667 | 1 | 1 |
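The README correction above works because `java -jar` is the portable way to launch an executable jar; invoking `wav2pzx.jar` directly only succeeds on systems that associate `.jar` files with a launcher. As a small, assumed illustration (not code from the wav2pzx project), the corrected invocation can be assembled as an argument list:

```python
def wav2pzx_command(version, infile, outfile):
    """Build the `java -jar` argv shown in the corrected README.

    Returning a list rather than one shell string sidesteps quoting
    problems for file names with spaces, and the result can be handed
    directly to subprocess.run().
    """
    return ["java", "-jar", f"wav2pzx-{version}.jar", infile, outfile]
```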
5797fa2f9b1aabd0eee4fc2e62c74980965831f9 | personal_website/app/assets/stylesheets/blogs-large-screen.scss | personal_website/app/assets/stylesheets/blogs-large-screen.scss | .blogs{
margin-top: 43px;
background-color: #fff;
h3{
text-align: center;
}
ul{
list-style: none;
a{
font-family: verdana;
color:#A6430A ;
display: block;
margin: 15px;
padding: 9px;
border: 1px solid #A6430A;
-moz-border-radius: 12px;
-webkit-border-radius: 12px;
-border-radius: 12px;
text-decoration: none;
}
a:hover{
color: #fff;
background-color: rgba(81,86,115,.15);
}
.title{
float: left;
}
.date{
float: right;
}
}
}
/* small screen posiiton*/
@media only screen and (min-width: 50px) and (max-width: 500px){
.blogs{
position: static;
margin-top: 0px;
}
} | .blogs{
padding-top: 10px;
margin-top: 43px;
background-color: #fff;
h3{
text-align: center;
}
ul{
list-style: none;
a{
font-family: verdana;
color:#A6430A ;
display: block;
margin: 15px;
padding: 9px;
border: 1px solid #A6430A;
-moz-border-radius: 12px;
-webkit-border-radius: 12px;
-border-radius: 12px;
text-decoration: none;
}
a:hover{
color: #fff;
background-color: rgba(81,86,115,.15);
}
.title{
float: left;
}
.date{
float: right;
}
}
}
/* small screen posiiton*/
@media only screen and (min-width: 50px) and (max-width: 500px){
.blogs{
position: static;
margin-top: 0px;
}
} | Add top padding to blog h1 tag | Add top padding to blog h1 tag
| SCSS | mit | Bigless27/Personal-Website-2.0,Bigless27/Personal-Website-2.0,Bigless27/Personal-Website-2.0 | scss | ## Code Before:
.blogs{
margin-top: 43px;
background-color: #fff;
h3{
text-align: center;
}
ul{
list-style: none;
a{
font-family: verdana;
color:#A6430A ;
display: block;
margin: 15px;
padding: 9px;
border: 1px solid #A6430A;
-moz-border-radius: 12px;
-webkit-border-radius: 12px;
-border-radius: 12px;
text-decoration: none;
}
a:hover{
color: #fff;
background-color: rgba(81,86,115,.15);
}
.title{
float: left;
}
.date{
float: right;
}
}
}
/* small screen posiiton*/
@media only screen and (min-width: 50px) and (max-width: 500px){
.blogs{
position: static;
margin-top: 0px;
}
}
## Instruction:
Add top padding to blog h1 tag
## Code After:
.blogs{
padding-top: 10px;
margin-top: 43px;
background-color: #fff;
h3{
text-align: center;
}
ul{
list-style: none;
a{
font-family: verdana;
color:#A6430A ;
display: block;
margin: 15px;
padding: 9px;
border: 1px solid #A6430A;
-moz-border-radius: 12px;
-webkit-border-radius: 12px;
-border-radius: 12px;
text-decoration: none;
}
a:hover{
color: #fff;
background-color: rgba(81,86,115,.15);
}
.title{
float: left;
}
.date{
float: right;
}
}
}
/* small screen posiiton*/
@media only screen and (min-width: 50px) and (max-width: 500px){
.blogs{
position: static;
margin-top: 0px;
}
} | .blogs{
+ padding-top: 10px;
margin-top: 43px;
background-color: #fff;
h3{
text-align: center;
}
ul{
list-style: none;
a{
font-family: verdana;
color:#A6430A ;
display: block;
margin: 15px;
padding: 9px;
border: 1px solid #A6430A;
-moz-border-radius: 12px;
-webkit-border-radius: 12px;
-border-radius: 12px;
text-decoration: none;
}
a:hover{
color: #fff;
background-color: rgba(81,86,115,.15);
}
.title{
float: left;
}
.date{
float: right;
}
}
}
/* small screen posiiton*/
@media only screen and (min-width: 50px) and (max-width: 500px){
.blogs{
position: static;
margin-top: 0px;
}
} | 1 | 0.019231 | 1 | 0 |
30221d74d78b8f1af65621686e89513e650064be | web/locales/kpv/messages.ftl | web/locales/kpv/messages.ftl |
en = Англи кыв
et = Эст кыв
# [/]
## Layout
languages = Кывъяс
help = Отсӧг
## Home Page
show-wall-of-text = Лыддьы унджык
vote-yes = Да
vote-no = Абу
## Speak & Listen Shortcuts
## Listen Shortcuts
## Speak Shortcuts
## ProjectStatus
## ProfileForm
## FAQ
## Profile
## NotFound
## Data
## Record Page
## Download Modal
## Contact Modal
## Request Language Modal
## Languages Overview
## New Contribution
|
as = Ассам кыв
az = Азербайджан кыв
bn = Бенгал кыв
br = Брезон кыв
ca = Каталан кыв
cs = Чех кыв
cv = Чуваш кыв
cy = Кӧмри кыв
da = Дан кыв
de = Немеч кыв
en = Англи кыв
et = Эст кыв
tr = Турок кыв
tt = Тотара кыв
uk = Украин кыв
uz = Узбек кыв
# [/]
## Layout
languages = Кывъяс
help = Отсӧг
## Home Page
show-wall-of-text = Лыддьы унджык
vote-yes = Да
vote-no = Абу
## Speak & Listen Shortcuts
## Listen Shortcuts
## Speak Shortcuts
## ProjectStatus
## ProfileForm
## FAQ
## Profile
## NotFound
## Data
## Record Page
## Download Modal
## Contact Modal
## Request Language Modal
## Languages Overview
## New Contribution
| Update Komi-Zyrian (kpv) localization of Common Voice | Pontoon: Update Komi-Zyrian (kpv) localization of Common Voice
Localization authors:
- nikotapiopartanen <nikotapiopartanen@gmail.com>
| FreeMarker | mpl-2.0 | gozer/voice-web,common-voice/common-voice,gozer/voice-web,common-voice/common-voice,gozer/voice-web,common-voice/common-voice,common-voice/common-voice,gozer/voice-web,gozer/voice-web,gozer/voice-web,gozer/voice-web | freemarker | ## Code Before:
en = Англи кыв
et = Эст кыв
# [/]
## Layout
languages = Кывъяс
help = Отсӧг
## Home Page
show-wall-of-text = Лыддьы унджык
vote-yes = Да
vote-no = Абу
## Speak & Listen Shortcuts
## Listen Shortcuts
## Speak Shortcuts
## ProjectStatus
## ProfileForm
## FAQ
## Profile
## NotFound
## Data
## Record Page
## Download Modal
## Contact Modal
## Request Language Modal
## Languages Overview
## New Contribution
## Instruction:
Pontoon: Update Komi-Zyrian (kpv) localization of Common Voice
Localization authors:
- nikotapiopartanen <nikotapiopartanen@gmail.com>
## Code After:
as = Ассам кыв
az = Азербайджан кыв
bn = Бенгал кыв
br = Брезон кыв
ca = Каталан кыв
cs = Чех кыв
cv = Чуваш кыв
cy = Кӧмри кыв
da = Дан кыв
de = Немеч кыв
en = Англи кыв
et = Эст кыв
tr = Турок кыв
tt = Тотара кыв
uk = Украин кыв
uz = Узбек кыв
# [/]
## Layout
languages = Кывъяс
help = Отсӧг
## Home Page
show-wall-of-text = Лыддьы унджык
vote-yes = Да
vote-no = Абу
## Speak & Listen Shortcuts
## Listen Shortcuts
## Speak Shortcuts
## ProjectStatus
## ProfileForm
## FAQ
## Profile
## NotFound
## Data
## Record Page
## Download Modal
## Contact Modal
## Request Language Modal
## Languages Overview
## New Contribution
|
+ as = Ассам кыв
+ az = Азербайджан кыв
+ bn = Бенгал кыв
+ br = Брезон кыв
+ ca = Каталан кыв
+ cs = Чех кыв
+ cv = Чуваш кыв
+ cy = Кӧмри кыв
+ da = Дан кыв
+ de = Немеч кыв
en = Англи кыв
et = Эст кыв
+ tr = Турок кыв
+ tt = Тотара кыв
+ uk = Украин кыв
+ uz = Узбек кыв
# [/]
## Layout
languages = Кывъяс
help = Отсӧг
## Home Page
show-wall-of-text = Лыддьы унджык
vote-yes = Да
vote-no = Абу
## Speak & Listen Shortcuts
## Listen Shortcuts
## Speak Shortcuts
## ProjectStatus
## ProfileForm
## FAQ
## Profile
## NotFound
## Data
## Record Page
## Download Modal
## Contact Modal
## Request Language Modal
## Languages Overview
## New Contribution
| 14 | 0.225806 | 14 | 0 |
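Despite the row's `lang` column reading FreeMarker (an extension-based guess, since both formats use `.ftl`), Common Voice's `messages.ftl` is a Project Fluent localization file: one `identifier = value` message per line, with `#`/`##` lines as comments and section headers. A minimal parser for this flat subset is easy to sketch (real Fluent also has multiline values, attributes, and selectors that this deliberately ignores):

```python
def parse_flat_ftl(text):
    """Parse the simple `identifier = value` subset of Fluent.

    Blank lines and comment lines (starting with '#') are skipped;
    anything fancier than a single-line message is out of scope here.
    """
    messages = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:  # keep only well-formed `key = value` lines
            messages[key.strip()] = value.strip()
    return messages

# A few lines lifted from the record above:
sample = """\
## Layout
languages = Кывъяс
help = Отсӧг
vote-yes = Да
"""
```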
f54e0c11f231422f8ab885f4a88ebaf30b2e0849 | .travis.yml | .travis.yml | language: go
go:
- stable
sudo: required
services:
- docker
script:
- make deps
- make
- make test-integration
| language: go
go:
- stable
#sudo: required
#services:
# - docker
script:
- make deps
- make
- make test
| Remove integration tests from ci | Remove integration tests from ci
| YAML | mit | Jeffail/benthos,Jeffail/benthos,Jeffail/benthos,Jeffail/benthos | yaml | ## Code Before:
language: go
go:
- stable
sudo: required
services:
- docker
script:
- make deps
- make
- make test-integration
## Instruction:
Remove integration tests from ci
## Code After:
language: go
go:
- stable
#sudo: required
#services:
# - docker
script:
- make deps
- make
- make test
| language: go
go:
- stable
- sudo: required
+ #sudo: required
? +
- services:
+ #services:
? +
- - docker
+ # - docker
? +
script:
- make deps
- make
- - make test-integration
+ - make test | 8 | 0.8 | 4 | 4 |
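The commit above simply deletes the Docker service and the `test-integration` step from CI, leaving unit tests only. An alternative that keeps both paths is to gate the integration suite on whether Docker is actually available; the command names below come from the record's `.travis.yml`, while the gating itself is this sketch's suggestion rather than anything the commit does:

```python
def ci_script(docker_available):
    """Pick the CI commands to run.

    Dependency fetch, build, and unit tests always run; the
    integration target is attempted only when a Docker daemon is
    there to host its external dependencies.
    """
    commands = ["make deps", "make", "make test"]
    if docker_available:
        commands.append("make test-integration")
    return commands
```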
0e971e0ca013781a91d917eb4460a6fbfafc9f62 | docs/coordinates/separations.rst | docs/coordinates/separations.rst | Separations
-----------
The on-sky separation is easily computed with the `separation` method, which
computes the great-circle distance (*not* the small-angle approximation)::
>>> from astropy.coordinates import ICRSCoordinates
>>> c1 = ICRSCoordinates('5h23m34.5s -69d45m22s')
>>> c2 = ICRSCoordinates('0h52m44.8s -72d49m43s')
>>> sep = c1.separation(c2)
>>> sep
<AngularSeparation 0.36209rad>
The `~astropy.coordinates.angles.AngularSeparation` object is a subclass of
`~astropy.coordinates.angles.Angle`, so it can be accessed in the same ways,
along with a few additions::
>>> sep.radian
0.36208807374669766
>>> sep.hour
1.383074562513832
>>> sep.arcminute
1244.7671062624488
>>> sep.arcsecond
74686.02637574692
| Separations
-----------
The on-sky separation is easily computed with the `separation` method, which
computes the great-circle distance (*not* the small-angle approximation)::
>>> from astropy.coordinates import ICRSCoordinates
>>> c1 = ICRSCoordinates('5h23m34.5s -69d45m22s')
>>> c2 = ICRSCoordinates('0h52m44.8s -72d49m43s')
>>> sep = c1.separation(c2)
>>> sep
<Angle 20d44m46.02638s>
The `~astropy.coordinates.angles.AngularSeparation` object is a subclass of
`~astropy.coordinates.angles.Angle`, so it can be accessed in the same ways,
along with a few additions::
>>> sep.radian
0.36208807374669766
>>> sep.hour
1.383074562513832
>>> sep.arcminute
1244.7671062624488
>>> sep.arcsecond
74686.02637574692
| Update doc example for Angle output | Update doc example for Angle output
| reStructuredText | bsd-3-clause | dhomeier/astropy,bsipocz/astropy,joergdietrich/astropy,tbabej/astropy,pllim/astropy,aleksandr-bakanov/astropy,AustereCuriosity/astropy,kelle/astropy,mhvk/astropy,DougBurke/astropy,saimn/astropy,funbaker/astropy,lpsinger/astropy,MSeifert04/astropy,StuartLittlefair/astropy,MSeifert04/astropy,aleksandr-bakanov/astropy,stargaser/astropy,dhomeier/astropy,DougBurke/astropy,stargaser/astropy,larrybradley/astropy,kelle/astropy,larrybradley/astropy,kelle/astropy,dhomeier/astropy,lpsinger/astropy,bsipocz/astropy,lpsinger/astropy,aleksandr-bakanov/astropy,MSeifert04/astropy,mhvk/astropy,StuartLittlefair/astropy,larrybradley/astropy,larrybradley/astropy,tbabej/astropy,funbaker/astropy,mhvk/astropy,bsipocz/astropy,DougBurke/astropy,dhomeier/astropy,saimn/astropy,joergdietrich/astropy,AustereCuriosity/astropy,stargaser/astropy,saimn/astropy,pllim/astropy,astropy/astropy,pllim/astropy,pllim/astropy,tbabej/astropy,stargaser/astropy,AustereCuriosity/astropy,lpsinger/astropy,astropy/astropy,lpsinger/astropy,mhvk/astropy,funbaker/astropy,saimn/astropy,astropy/astropy,aleksandr-bakanov/astropy,StuartLittlefair/astropy,AustereCuriosity/astropy,AustereCuriosity/astropy,pllim/astropy,joergdietrich/astropy,tbabej/astropy,joergdietrich/astropy,kelle/astropy,saimn/astropy,joergdietrich/astropy,larrybradley/astropy,dhomeier/astropy,kelle/astropy,DougBurke/astropy,funbaker/astropy,mhvk/astropy,StuartLittlefair/astropy,tbabej/astropy,MSeifert04/astropy,bsipocz/astropy,StuartLittlefair/astropy,astropy/astropy,astropy/astropy | restructuredtext | ## Code Before:
Separations
-----------
The on-sky separation is easily computed with the `separation` method, which
computes the great-circle distance (*not* the small-angle approximation)::
>>> from astropy.coordinates import ICRSCoordinates
>>> c1 = ICRSCoordinates('5h23m34.5s -69d45m22s')
>>> c2 = ICRSCoordinates('0h52m44.8s -72d49m43s')
>>> sep = c1.separation(c2)
>>> sep
<AngularSeparation 0.36209rad>
The `~astropy.coordinates.angles.AngularSeparation` object is a subclass of
`~astropy.coordinates.angles.Angle`, so it can be accessed in the same ways,
along with a few additions::
>>> sep.radian
0.36208807374669766
>>> sep.hour
1.383074562513832
>>> sep.arcminute
1244.7671062624488
>>> sep.arcsecond
74686.02637574692
## Instruction:
Update doc example for Angle output
## Code After:
Separations
-----------
The on-sky separation is easily computed with the `separation` method, which
computes the great-circle distance (*not* the small-angle approximation)::
>>> from astropy.coordinates import ICRSCoordinates
>>> c1 = ICRSCoordinates('5h23m34.5s -69d45m22s')
>>> c2 = ICRSCoordinates('0h52m44.8s -72d49m43s')
>>> sep = c1.separation(c2)
>>> sep
<Angle 20d44m46.02638s>
The `~astropy.coordinates.angles.AngularSeparation` object is a subclass of
`~astropy.coordinates.angles.Angle`, so it can be accessed in the same ways,
along with a few additions::
>>> sep.radian
0.36208807374669766
>>> sep.hour
1.383074562513832
>>> sep.arcminute
1244.7671062624488
>>> sep.arcsecond
74686.02637574692
| Separations
-----------
The on-sky separation is easily computed with the `separation` method, which
computes the great-circle distance (*not* the small-angle approximation)::
>>> from astropy.coordinates import ICRSCoordinates
>>> c1 = ICRSCoordinates('5h23m34.5s -69d45m22s')
>>> c2 = ICRSCoordinates('0h52m44.8s -72d49m43s')
>>> sep = c1.separation(c2)
>>> sep
- <AngularSeparation 0.36209rad>
+ <Angle 20d44m46.02638s>
The `~astropy.coordinates.angles.AngularSeparation` object is a subclass of
`~astropy.coordinates.angles.Angle`, so it can be accessed in the same ways,
along with a few additions::
>>> sep.radian
0.36208807374669766
>>> sep.hour
1.383074562513832
>>> sep.arcminute
1244.7671062624488
>>> sep.arcsecond
74686.02637574692 | 2 | 0.076923 | 1 | 1 |
a3e650fffaafffaf0792a8166cbd7947b9ef841e | src/app/app.search.service.ts | src/app/app.search.service.ts | import { Injectable } from '@angular/core';
import {Subject} from "rxjs/Rx";
@Injectable()
export class SearchService {
private searchTerm: string;
private subcategories: any;
private productType: string;
searchChange: Subject<any> = new Subject<any>();
uiChange: Subject<any> = new Subject<any>();
getSearch()
{
return {
"productType": this.productType,
"searchTerm": this.searchTerm,
"subcategories": this.subcategories,
"termIds": this.getTermIds()
};
}
getTermIds()
{
let termIds = [];
for (var key in this.subcategories) {
if (this.subcategories.hasOwnProperty(key)) {
let value = this.subcategories[key];
if(value && value != "null")
termIds.push(value)
}
}
return termIds;
}
findAll()
{
this.searchChange.next(this.getSearch());
}
updateSearchParameters(productType, subcategories, updateUI=false)
{
this.productType = productType;
this.subcategories = subcategories;
this.searchChange.next(this.getSearch());
if(updateUI)
this.uiChange.next(this.getSearch())
}
setSearchTerm(searchTerm) {
this.searchTerm = searchTerm;
this.searchChange.next(this.getSearch());
}
}
| import { Injectable } from '@angular/core';
import {Subject} from "rxjs/Rx";
@Injectable()
export class SearchService {
private searchTerm: string;
private subcategories: any;
private productType: string;
searchChange: Subject<any> = new Subject<any>();
uiChange: Subject<any> = new Subject<any>();
getSearch()
{
return {
"productType": this.productType,
"searchTerm": this.searchTerm,
"subcategories": this.subcategories,
"termIds": this.getTermIds()
};
}
getTermIds()
{
let termIds = [];
console.log('subcats: ', this.subcategories);
for (var key in this.subcategories) {
if (this.subcategories.hasOwnProperty(key)) {
let value = this.subcategories[key];
console.log(value);
if(value && value != "null" && value != "undefined")
termIds.push(value)
}
}
return termIds;
}
findAll()
{
this.searchChange.next(this.getSearch());
}
updateSearchParameters(productType, subcategories, updateUI=false)
{
this.productType = productType;
this.subcategories = subcategories;
this.searchChange.next(this.getSearch());
if(updateUI)
this.uiChange.next(this.getSearch())
}
setSearchTerm(searchTerm) {
this.searchTerm = searchTerm;
this.searchChange.next(this.getSearch());
}
}
| Stop undefined terms being returned. | Stop undefined terms being returned.
| TypeScript | bsd-3-clause | UoA-eResearch/research-hub,UoA-eResearch/research-hub,UoA-eResearch/research-hub | typescript | ## Code Before:
import { Injectable } from '@angular/core';
import {Subject} from "rxjs/Rx";
@Injectable()
export class SearchService {
private searchTerm: string;
private subcategories: any;
private productType: string;
searchChange: Subject<any> = new Subject<any>();
uiChange: Subject<any> = new Subject<any>();
getSearch()
{
return {
"productType": this.productType,
"searchTerm": this.searchTerm,
"subcategories": this.subcategories,
"termIds": this.getTermIds()
};
}
getTermIds()
{
let termIds = [];
for (var key in this.subcategories) {
if (this.subcategories.hasOwnProperty(key)) {
let value = this.subcategories[key];
if(value && value != "null")
termIds.push(value)
}
}
return termIds;
}
findAll()
{
this.searchChange.next(this.getSearch());
}
updateSearchParameters(productType, subcategories, updateUI=false)
{
this.productType = productType;
this.subcategories = subcategories;
this.searchChange.next(this.getSearch());
if(updateUI)
this.uiChange.next(this.getSearch())
}
setSearchTerm(searchTerm) {
this.searchTerm = searchTerm;
this.searchChange.next(this.getSearch());
}
}
## Instruction:
Stop undefined terms being returned.
## Code After:
import { Injectable } from '@angular/core';
import {Subject} from "rxjs/Rx";
@Injectable()
export class SearchService {
private searchTerm: string;
private subcategories: any;
private productType: string;
searchChange: Subject<any> = new Subject<any>();
uiChange: Subject<any> = new Subject<any>();
getSearch()
{
return {
"productType": this.productType,
"searchTerm": this.searchTerm,
"subcategories": this.subcategories,
"termIds": this.getTermIds()
};
}
getTermIds()
{
let termIds = [];
console.log('subcats: ', this.subcategories);
for (var key in this.subcategories) {
if (this.subcategories.hasOwnProperty(key)) {
let value = this.subcategories[key];
console.log(value);
if(value && value != "null" && value != "undefined")
termIds.push(value)
}
}
return termIds;
}
findAll()
{
this.searchChange.next(this.getSearch());
}
updateSearchParameters(productType, subcategories, updateUI=false)
{
this.productType = productType;
this.subcategories = subcategories;
this.searchChange.next(this.getSearch());
if(updateUI)
this.uiChange.next(this.getSearch())
}
setSearchTerm(searchTerm) {
this.searchTerm = searchTerm;
this.searchChange.next(this.getSearch());
}
}
| import { Injectable } from '@angular/core';
import {Subject} from "rxjs/Rx";
@Injectable()
export class SearchService {
private searchTerm: string;
private subcategories: any;
private productType: string;
searchChange: Subject<any> = new Subject<any>();
uiChange: Subject<any> = new Subject<any>();
getSearch()
{
return {
"productType": this.productType,
"searchTerm": this.searchTerm,
"subcategories": this.subcategories,
"termIds": this.getTermIds()
};
}
getTermIds()
{
let termIds = [];
+ console.log('subcats: ', this.subcategories);
for (var key in this.subcategories) {
if (this.subcategories.hasOwnProperty(key)) {
let value = this.subcategories[key];
+ console.log(value);
- if(value && value != "null")
+ if(value && value != "null" && value != "undefined")
? ++++++++++++++++++++++++
termIds.push(value)
}
}
return termIds;
}
findAll()
{
this.searchChange.next(this.getSearch());
}
updateSearchParameters(productType, subcategories, updateUI=false)
{
this.productType = productType;
this.subcategories = subcategories;
this.searchChange.next(this.getSearch());
if(updateUI)
this.uiChange.next(this.getSearch())
}
setSearchTerm(searchTerm) {
this.searchTerm = searchTerm;
this.searchChange.next(this.getSearch());
}
} | 4 | 0.074074 | 3 | 1 |
92c7282527fc57dde12f9445cd0809501796f3f7 | install.sh | install.sh |
DOT_DIR="$HOME/.kmkm-dotfiles"
main() {
echo "run install.sh ..."
get_repo
cd ${DOT_DIR}
config_git
do_install
}
config_git() {
git config user.name komukomo
git config user.email komukomo@users.noreply.github.com
}
get_repo() {
if [ ! -e ${DOT_DIR} ]; then
git clone https://github.com/komukomo/dotfiles.git ${DOT_DIR}
else
echo "${DOT_DIR} already exists."
fi
}
do_install() {
DIRPATH=$(cd `dirname $0`; pwd)
cd ${DIRPATH}
TARGETS=$(git ls-files ".*")
for file in ${TARGETS[@]}
do
ln -s ${DIRPATH}/${file} ${HOME}/${file}
if [ $? -eq 0 ]; then
echo "success to create sym link: ${file}"
fi
done
}
main
|
DOT_DIR="$HOME/.kmkm-dotfiles"
main() {
echo "run install.sh ..."
get_repo
cd ${DOT_DIR}
config_git
do_install
}
config_git() {
git config user.name komukomo
git config user.email komukomo@users.noreply.github.com
}
get_repo() {
if [ ! -e ${DOT_DIR} ]; then
git clone https://github.com/komukomo/dotfiles.git ${DOT_DIR}
else
echo "${DOT_DIR} already exists."
fi
}
do_install() {
DIRPATH=$(cd `dirname $0`; pwd)
cd ${DIRPATH}
TARGETS=$(git ls-files ".*" | awk -F'/' '{print $1}' | uniq)
for file in ${TARGETS[@]}
do
ln -s ${DIRPATH}/${file} ${HOME}/${file}
if [ $? -eq 0 ]; then
echo "success to create sym link: ${file}"
fi
done
}
main
| Enable creating symlinks of dot directories | Enable creating symlinks of dot directories
| Shell | mit | komukomo/dotfiles,komukomo/dotfiles | shell | ## Code Before:
DOT_DIR="$HOME/.kmkm-dotfiles"
main() {
echo "run install.sh ..."
get_repo
cd ${DOT_DIR}
config_git
do_install
}
config_git() {
git config user.name komukomo
git config user.email komukomo@users.noreply.github.com
}
get_repo() {
if [ ! -e ${DOT_DIR} ]; then
git clone https://github.com/komukomo/dotfiles.git ${DOT_DIR}
else
echo "${DOT_DIR} already exists."
fi
}
do_install() {
DIRPATH=$(cd `dirname $0`; pwd)
cd ${DIRPATH}
TARGETS=$(git ls-files ".*")
for file in ${TARGETS[@]}
do
ln -s ${DIRPATH}/${file} ${HOME}/${file}
if [ $? -eq 0 ]; then
echo "success to create sym link: ${file}"
fi
done
}
main
## Instruction:
Enable creating symlinks of dot directories
## Code After:
DOT_DIR="$HOME/.kmkm-dotfiles"
main() {
echo "run install.sh ..."
get_repo
cd ${DOT_DIR}
config_git
do_install
}
config_git() {
git config user.name komukomo
git config user.email komukomo@users.noreply.github.com
}
get_repo() {
if [ ! -e ${DOT_DIR} ]; then
git clone https://github.com/komukomo/dotfiles.git ${DOT_DIR}
else
echo "${DOT_DIR} already exists."
fi
}
do_install() {
DIRPATH=$(cd `dirname $0`; pwd)
cd ${DIRPATH}
TARGETS=$(git ls-files ".*" | awk -F'/' '{print $1}' | uniq)
for file in ${TARGETS[@]}
do
ln -s ${DIRPATH}/${file} ${HOME}/${file}
if [ $? -eq 0 ]; then
echo "success to create sym link: ${file}"
fi
done
}
main
|
DOT_DIR="$HOME/.kmkm-dotfiles"
main() {
echo "run install.sh ..."
get_repo
cd ${DOT_DIR}
config_git
do_install
}
config_git() {
git config user.name komukomo
git config user.email komukomo@users.noreply.github.com
}
get_repo() {
if [ ! -e ${DOT_DIR} ]; then
git clone https://github.com/komukomo/dotfiles.git ${DOT_DIR}
else
echo "${DOT_DIR} already exists."
fi
}
do_install() {
DIRPATH=$(cd `dirname $0`; pwd)
cd ${DIRPATH}
- TARGETS=$(git ls-files ".*")
+ TARGETS=$(git ls-files ".*" | awk -F'/' '{print $1}' | uniq)
for file in ${TARGETS[@]}
do
ln -s ${DIRPATH}/${file} ${HOME}/${file}
if [ $? -eq 0 ]; then
echo "success to create sym link: ${file}"
fi
done
}
main | 2 | 0.05 | 1 | 1 |
72a5070bec3a04a9f7155d538b6cfc5ea1f67191 | README.md | README.md |
A web application to translate color schemes between dark- and light-background using the [CIECAM02](https://en.wikipedia.org/wiki/CIECAM02) color appearance model.
Python 3, [Flask](http://flask.pocoo.org), [uWSGI](https://uwsgi-docs.readthedocs.io/en/latest/), and special thanks to the authors of the [colour-science](http://colour-science.org) Python module.
## Running the application
When `app.sh` is invoked for the first time, it will create a Python virtual environment in `venv` and install dependencies into it. (If `venv` is later removed, it will be recreated.) The included sample uWSGI web server configuration (`uwsgi_example.ini`) will serve HTTP on port 8000 by default; for deployment, use of the `socket` or `http-socket` directives and a reverse proxy are preferred. (See the [Flask deployment guide for uWSGI](http://flask.pocoo.org/docs/0.12/deploying/uwsgi/).)
|
A web application to translate color schemes between dark- and light-background using the [CIECAM02](https://en.wikipedia.org/wiki/CIECAM02) color appearance model. An instance of it is running [here](https://kath.io/color_schemer/).
Python 3, [Flask](http://flask.pocoo.org), [uWSGI](https://uwsgi-docs.readthedocs.io/en/latest/), and special thanks to the authors of the [colour-science](http://colour-science.org) Python module.
## Running the application
When `app.sh` is invoked for the first time, it will create a Python virtual environment in `venv` and install dependencies into it. (If `venv` is later removed, it will be recreated.) The included sample uWSGI web server configuration (`uwsgi_example.ini`) will serve HTTP on port 8000 by default; for deployment, use of the `socket` or `http-socket` directives and a reverse proxy are preferred. (See the [Flask deployment guide for uWSGI](http://flask.pocoo.org/docs/0.12/deploying/uwsgi/).)
| Add link to instance on kath.io | Add link to instance on kath.io
| Markdown | mit | crowsonkb/color_schemer,crowsonkb/color_schemer,crowsonkb/color_schemer | markdown | ## Code Before:
A web application to translate color schemes between dark- and light-background using the [CIECAM02](https://en.wikipedia.org/wiki/CIECAM02) color appearance model.
Python 3, [Flask](http://flask.pocoo.org), [uWSGI](https://uwsgi-docs.readthedocs.io/en/latest/), and special thanks to the authors of the [colour-science](http://colour-science.org) Python module.
## Running the application
When `app.sh` is invoked for the first time, it will create a Python virtual environment in `venv` and install dependencies into it. (If `venv` is later removed, it will be recreated.) The included sample uWSGI web server configuration (`uwsgi_example.ini`) will serve HTTP on port 8000 by default; for deployment, use of the `socket` or `http-socket` directives and a reverse proxy are preferred. (See the [Flask deployment guide for uWSGI](http://flask.pocoo.org/docs/0.12/deploying/uwsgi/).)
## Instruction:
Add link to instance on kath.io
## Code After:
A web application to translate color schemes between dark- and light-background using the [CIECAM02](https://en.wikipedia.org/wiki/CIECAM02) color appearance model. An instance of it is running [here](https://kath.io/color_schemer/).
Python 3, [Flask](http://flask.pocoo.org), [uWSGI](https://uwsgi-docs.readthedocs.io/en/latest/), and special thanks to the authors of the [colour-science](http://colour-science.org) Python module.
## Running the application
When `app.sh` is invoked for the first time, it will create a Python virtual environment in `venv` and install dependencies into it. (If `venv` is later removed, it will be recreated.) The included sample uWSGI web server configuration (`uwsgi_example.ini`) will serve HTTP on port 8000 by default; for deployment, use of the `socket` or `http-socket` directives and a reverse proxy are preferred. (See the [Flask deployment guide for uWSGI](http://flask.pocoo.org/docs/0.12/deploying/uwsgi/).)
|
- A web application to translate color schemes between dark- and light-background using the [CIECAM02](https://en.wikipedia.org/wiki/CIECAM02) color appearance model.
+ A web application to translate color schemes between dark- and light-background using the [CIECAM02](https://en.wikipedia.org/wiki/CIECAM02) color appearance model. An instance of it is running [here](https://kath.io/color_schemer/).
? +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Python 3, [Flask](http://flask.pocoo.org), [uWSGI](https://uwsgi-docs.readthedocs.io/en/latest/), and special thanks to the authors of the [colour-science](http://colour-science.org) Python module.
## Running the application
When `app.sh` is invoked for the first time, it will create a Python virtual environment in `venv` and install dependencies into it. (If `venv` is later removed, it will be recreated.) The included sample uWSGI web server configuration (`uwsgi_example.ini`) will serve HTTP on port 8000 by default; for deployment, use of the `socket` or `http-socket` directives and a reverse proxy are preferred. (See the [Flask deployment guide for uWSGI](http://flask.pocoo.org/docs/0.12/deploying/uwsgi/).) | 2 | 0.25 | 1 | 1 |
9b556e5df90b76a67360ef556fb3e87460d54361 | lib/public/js/wharf.js | lib/public/js/wharf.js | 'use strict';
var angular
, DashboardController
, ChartController;
angular.module('wharf', ['ui.router', 'wharf.controllers']);
angular.module('wharf').config(function($stateProvider, $urlRouterProvider) {
$stateProvider.state('dashboard', {
abstract: true,
url: '/dashboard',
template: '<ui-view/>'
}).state('dashboard.details', {
url: '/details',
views: {
'': {
templateUrl: '/js/templates/dashboard.html',
controller: 'DashboardController as dashboardController'
},
'filters@dashboard.details': {
templateUrl: '/js/templates/dashboard/_filters.html',
controller: 'FiltersController as filtersController'
},
'chart@dashboard.details': {
templateUrl: '/js/templates/dashboard/_chart.html',
controller: 'ChartController as chartController'
}
}
});
$urlRouterProvider.otherwise('/');
});
angular.module('wharf.controllers', []);
angular.module('wharf.controllers').controller('DashboardController', DashboardController);
angular.module('wharf.controllers').controller('FiltersController', FiltersController);
angular.module('wharf.controllers').controller('ChartController', ChartController);
| 'use strict';
var angular
, DashboardController
, FiltersController
, ChartController;
angular.module('wharf', ['ui.router', 'wharf.controllers']);
angular.module('wharf').config(function($stateProvider, $urlRouterProvider) {
$stateProvider.state('dashboard', {
abstract: true,
url: '/dashboard',
template: '<ui-view/>'
}).state('dashboard.details', {
url: '/details',
views: {
'': {
templateUrl: '/js/templates/dashboard.html',
controller: 'DashboardController as dashboardController'
},
'filters@dashboard.details': {
templateUrl: '/js/templates/dashboard/_filters.html',
controller: 'FiltersController as filtersController'
},
'chart@dashboard.details': {
templateUrl: '/js/templates/dashboard/_chart.html',
controller: 'ChartController as chartController'
}
}
});
$urlRouterProvider.otherwise('/');
});
angular.module('wharf.controllers', []);
angular.module('wharf.controllers').controller('DashboardController', DashboardController);
angular.module('wharf.controllers').controller('FiltersController', FiltersController);
angular.module('wharf.controllers').controller('ChartController', ChartController);
| Add filters controller missing definition | Add filters controller missing definition
| JavaScript | mit | davidcunha/wharf,davidcunha/wharf | javascript | ## Code Before:
'use strict';
var angular
, DashboardController
, ChartController;
angular.module('wharf', ['ui.router', 'wharf.controllers']);
angular.module('wharf').config(function($stateProvider, $urlRouterProvider) {
$stateProvider.state('dashboard', {
abstract: true,
url: '/dashboard',
template: '<ui-view/>'
}).state('dashboard.details', {
url: '/details',
views: {
'': {
templateUrl: '/js/templates/dashboard.html',
controller: 'DashboardController as dashboardController'
},
'filters@dashboard.details': {
templateUrl: '/js/templates/dashboard/_filters.html',
controller: 'FiltersController as filtersController'
},
'chart@dashboard.details': {
templateUrl: '/js/templates/dashboard/_chart.html',
controller: 'ChartController as chartController'
}
}
});
$urlRouterProvider.otherwise('/');
});
angular.module('wharf.controllers', []);
angular.module('wharf.controllers').controller('DashboardController', DashboardController);
angular.module('wharf.controllers').controller('FiltersController', FiltersController);
angular.module('wharf.controllers').controller('ChartController', ChartController);
## Instruction:
Add filters controller missing definition
## Code After:
'use strict';
var angular
, DashboardController
, FiltersController
, ChartController;
angular.module('wharf', ['ui.router', 'wharf.controllers']);
angular.module('wharf').config(function($stateProvider, $urlRouterProvider) {
$stateProvider.state('dashboard', {
abstract: true,
url: '/dashboard',
template: '<ui-view/>'
}).state('dashboard.details', {
url: '/details',
views: {
'': {
templateUrl: '/js/templates/dashboard.html',
controller: 'DashboardController as dashboardController'
},
'filters@dashboard.details': {
templateUrl: '/js/templates/dashboard/_filters.html',
controller: 'FiltersController as filtersController'
},
'chart@dashboard.details': {
templateUrl: '/js/templates/dashboard/_chart.html',
controller: 'ChartController as chartController'
}
}
});
$urlRouterProvider.otherwise('/');
});
angular.module('wharf.controllers', []);
angular.module('wharf.controllers').controller('DashboardController', DashboardController);
angular.module('wharf.controllers').controller('FiltersController', FiltersController);
angular.module('wharf.controllers').controller('ChartController', ChartController);
| 'use strict';
var angular
, DashboardController
+ , FiltersController
, ChartController;
angular.module('wharf', ['ui.router', 'wharf.controllers']);
angular.module('wharf').config(function($stateProvider, $urlRouterProvider) {
$stateProvider.state('dashboard', {
abstract: true,
url: '/dashboard',
template: '<ui-view/>'
}).state('dashboard.details', {
url: '/details',
views: {
'': {
templateUrl: '/js/templates/dashboard.html',
controller: 'DashboardController as dashboardController'
},
'filters@dashboard.details': {
templateUrl: '/js/templates/dashboard/_filters.html',
controller: 'FiltersController as filtersController'
},
'chart@dashboard.details': {
templateUrl: '/js/templates/dashboard/_chart.html',
controller: 'ChartController as chartController'
}
}
});
$urlRouterProvider.otherwise('/');
});
angular.module('wharf.controllers', []);
angular.module('wharf.controllers').controller('DashboardController', DashboardController);
angular.module('wharf.controllers').controller('FiltersController', FiltersController);
angular.module('wharf.controllers').controller('ChartController', ChartController); | 1 | 0.026316 | 1 | 0 |
e8eaa1149d3b4d98e61a454d1cd9e06f2a458f5c | models/vote.js | models/vote.js | module.exports = function (sequelize, DataTypes) {
const Vote = sequelize.define('Vote', {
id: {
primaryKey: true,
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4
},
UserId: {
type: DataTypes.UUID,
allowNull: false
},
BikeshedId: {
type: DataTypes.UUID,
allowNull: false
}
}, {
classMethods: {
associate (models) {
Vote.belongsTo(models.Bikeshed)
Vote.belongsTo(models.User)
Vote.hasMany(models.Rating)
}
}
})
return Vote
}
| const createError = require('http-errors')
module.exports = function (sequelize, DataTypes) {
const Vote = sequelize.define('Vote', {
id: {
primaryKey: true,
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4
},
UserId: {
type: DataTypes.UUID,
allowNull: false
},
BikeshedId: {
type: DataTypes.UUID,
allowNull: false
}
}, {
hooks: {
async beforeCreate (vote, opts) {
// Confirm vote count is zero
await confirmVoteCount(vote, opts, 0)
},
async afterCreate (vote, opts) {
// Confirm vote count is one
await confirmVoteCount(vote, opts, 1)
}
},
classMethods: {
associate (models) {
Vote.belongsTo(models.Bikeshed)
Vote.belongsTo(models.User)
Vote.hasMany(models.Rating)
}
}
})
return Vote
/**
* Confirm Vote.count matches expected value
* @param {Instance} vote Model instance
* @param {Object} opts Model options
* @param {Number} value Expected count
* @returns {Promise}
*/
async function confirmVoteCount (vote, opts, value) {
const voteCount = await Vote.count({
transaction: opts.transaction,
where: {
BikeshedId: vote.BikeshedId,
UserId: vote.UserId
}
})
if (voteCount !== value)
throw createError(409, 'Can only vote once per Bikeshed', {expose: true})
}
}
| Add before and after create hooks to Vote | Add before and after create hooks to Vote
| JavaScript | apache-2.0 | cesarandreu/bshed,cesarandreu/bshed | javascript | ## Code Before:
module.exports = function (sequelize, DataTypes) {
const Vote = sequelize.define('Vote', {
id: {
primaryKey: true,
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4
},
UserId: {
type: DataTypes.UUID,
allowNull: false
},
BikeshedId: {
type: DataTypes.UUID,
allowNull: false
}
}, {
classMethods: {
associate (models) {
Vote.belongsTo(models.Bikeshed)
Vote.belongsTo(models.User)
Vote.hasMany(models.Rating)
}
}
})
return Vote
}
## Instruction:
Add before and after create hooks to Vote
## Code After:
const createError = require('http-errors')
module.exports = function (sequelize, DataTypes) {
const Vote = sequelize.define('Vote', {
id: {
primaryKey: true,
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4
},
UserId: {
type: DataTypes.UUID,
allowNull: false
},
BikeshedId: {
type: DataTypes.UUID,
allowNull: false
}
}, {
hooks: {
async beforeCreate (vote, opts) {
// Confirm vote count is zero
await confirmVoteCount(vote, opts, 0)
},
async afterCreate (vote, opts) {
// Confirm vote count is one
await confirmVoteCount(vote, opts, 1)
}
},
classMethods: {
associate (models) {
Vote.belongsTo(models.Bikeshed)
Vote.belongsTo(models.User)
Vote.hasMany(models.Rating)
}
}
})
return Vote
/**
* Confirm Vote.count matches expected value
* @param {Instance} vote Model instance
* @param {Object} opts Model options
* @param {Number} value Expected count
* @returns {Promise}
*/
async function confirmVoteCount (vote, opts, value) {
const voteCount = await Vote.count({
transaction: opts.transaction,
where: {
BikeshedId: vote.BikeshedId,
UserId: vote.UserId
}
})
if (voteCount !== value)
throw createError(409, 'Can only vote once per Bikeshed', {expose: true})
}
}
| + const createError = require('http-errors')
+
module.exports = function (sequelize, DataTypes) {
const Vote = sequelize.define('Vote', {
id: {
primaryKey: true,
type: DataTypes.UUID,
defaultValue: DataTypes.UUIDV4
},
UserId: {
type: DataTypes.UUID,
allowNull: false
},
BikeshedId: {
type: DataTypes.UUID,
allowNull: false
}
}, {
+ hooks: {
+ async beforeCreate (vote, opts) {
+ // Confirm vote count is zero
+ await confirmVoteCount(vote, opts, 0)
+ },
+
+ async afterCreate (vote, opts) {
+ // Confirm vote count is one
+ await confirmVoteCount(vote, opts, 1)
+ }
+ },
classMethods: {
associate (models) {
Vote.belongsTo(models.Bikeshed)
Vote.belongsTo(models.User)
Vote.hasMany(models.Rating)
}
}
})
return Vote
+
+ /**
+ * Confirm Vote.count matches expected value
+ * @param {Instance} vote Model instance
+ * @param {Object} opts Model options
+ * @param {Number} value Expected count
+ * @returns {Promise}
+ */
+ async function confirmVoteCount (vote, opts, value) {
+ const voteCount = await Vote.count({
+ transaction: opts.transaction,
+ where: {
+ BikeshedId: vote.BikeshedId,
+ UserId: vote.UserId
+ }
+ })
+ if (voteCount !== value)
+ throw createError(409, 'Can only vote once per Bikeshed', {expose: true})
+ }
} | 32 | 1.185185 | 32 | 0 |
6797803785f740756e27f0d7f894e0959d0518a0 | Magic/src/main/resources/examples/potter/wands/potter_appearances.yml | Magic/src/main/resources/examples/potter/wands/potter_appearances.yml | skinny_upgrade:
name: Skinny Wand Appearance
upgrade: true
icon: wand_icon:24
upgrade_icon: wand_icon:24
brown_upgrade:
name: Brown Wand Appearance
upgrade: true
icon: wand_icon:26
upgrade_icon: wand_icon:26
| skinny_upgrade:
name: Skinny Wand Appearance
upgrade: true
icon: wand_icon:24
upgrade_icon: wand_icon:24
black_upgrade:
name: Black Wand Appearance
upgrade: true
icon: wand_icon:25
upgrade_icon: wand_icon:25
brown_upgrade:
name: Brown Wand Appearance
upgrade: true
icon: wand_icon:26
upgrade_icon: wand_icon:26
| Add back black wand appearance | Add back black wand appearance
| YAML | mit | elBukkit/MagicPlugin,elBukkit/MagicPlugin,elBukkit/MagicPlugin | yaml | ## Code Before:
skinny_upgrade:
name: Skinny Wand Appearance
upgrade: true
icon: wand_icon:24
upgrade_icon: wand_icon:24
brown_upgrade:
name: Brown Wand Appearance
upgrade: true
icon: wand_icon:26
upgrade_icon: wand_icon:26
## Instruction:
Add back black wand appearance
## Code After:
skinny_upgrade:
name: Skinny Wand Appearance
upgrade: true
icon: wand_icon:24
upgrade_icon: wand_icon:24
black_upgrade:
name: Black Wand Appearance
upgrade: true
icon: wand_icon:25
upgrade_icon: wand_icon:25
brown_upgrade:
name: Brown Wand Appearance
upgrade: true
icon: wand_icon:26
upgrade_icon: wand_icon:26
| skinny_upgrade:
name: Skinny Wand Appearance
upgrade: true
icon: wand_icon:24
upgrade_icon: wand_icon:24
+
+ black_upgrade:
+ name: Black Wand Appearance
+ upgrade: true
+ icon: wand_icon:25
+ upgrade_icon: wand_icon:25
brown_upgrade:
name: Brown Wand Appearance
upgrade: true
icon: wand_icon:26
upgrade_icon: wand_icon:26
| 6 | 0.5 | 6 | 0 |
438791bb9936c6c7b198f9e2b5f878e0cf36b171 | sources/us/nc/guilford.json | sources/us/nc/guilford.json | {
"coverage": {
"US Census": {
"geoid": "37081",
"name": "Guilford County",
"state": "North Carolina"
},
"country": "us",
"state": "nc",
"county": "Guilford"
},
"data": "http://gis.co.guilford.nc.us/datadownload/DataDownload.aspx",
"type": "http",
"compression": "zip",
"website": "http://countyweb.co.guilford.nc.us/gis-home",
"note": "address.shp is linked on the page, however they require ie in order to access it; doing so in ie doesn't provide a url, however clicking on it will prompt a download for DataDownload.zip which provides the address points in .shp"
} | {
"coverage": {
"US Census": {
"geoid": "37081",
"name": "Guilford County",
"state": "North Carolina"
},
"country": "us",
"state": "nc",
"county": "Guilford"
},
"data": "http://gis.co.guilford.nc.us/arcgis/rest/services/Data_Download/DataDownload12/MapServer/17",
"type": "ESRI",
"website": "http://countyweb.co.guilford.nc.us/gis-home",
"conform": {
"type": "geojson",
"number": "PHYADDR_STR_NUM",
"street": [
"PHYADDR_DIR_PFX",
"PHYADDR_STR",
"PHYADDR_STR_TYPE",
"PHYADDR_STR_SFX"
],
"city": "PHYADDR_CITY",
"region": "PHYADDR_STATE",
"postcode": "PHYADDR_ZIP"
}
}
| Add conform to Guilford County, NC | Add conform to Guilford County, NC
| JSON | bsd-3-clause | davidchiles/openaddresses,slibby/openaddresses,tyrasd/openaddresses,orangejulius/openaddresses,sabas/openaddresses,sabas/openaddresses,tyrasd/openaddresses,mmdolbow/openaddresses,mmdolbow/openaddresses,binaek89/openaddresses,binaek89/openaddresses,davidchiles/openaddresses,sergiyprotsiv/openaddresses,mmdolbow/openaddresses,sergiyprotsiv/openaddresses,orangejulius/openaddresses,openaddresses/openaddresses,astoff/openaddresses,binaek89/openaddresses,sabas/openaddresses,openaddresses/openaddresses,tyrasd/openaddresses,openaddresses/openaddresses,astoff/openaddresses,slibby/openaddresses,slibby/openaddresses,orangejulius/openaddresses,sergiyprotsiv/openaddresses,astoff/openaddresses,davidchiles/openaddresses | json | ## Code Before:
{
"coverage": {
"US Census": {
"geoid": "37081",
"name": "Guilford County",
"state": "North Carolina"
},
"country": "us",
"state": "nc",
"county": "Guilford"
},
"data": "http://gis.co.guilford.nc.us/datadownload/DataDownload.aspx",
"type": "http",
"compression": "zip",
"website": "http://countyweb.co.guilford.nc.us/gis-home",
"note": "address.shp is linked on the page, however they require ie in order to access it; doing so in ie doesn't provide a url, however clicking on it will prompt a download for DataDownload.zip which provides the address points in .shp"
}
## Instruction:
Add conform to Guilford County, NC
## Code After:
{
"coverage": {
"US Census": {
"geoid": "37081",
"name": "Guilford County",
"state": "North Carolina"
},
"country": "us",
"state": "nc",
"county": "Guilford"
},
"data": "http://gis.co.guilford.nc.us/arcgis/rest/services/Data_Download/DataDownload12/MapServer/17",
"type": "ESRI",
"website": "http://countyweb.co.guilford.nc.us/gis-home",
"conform": {
"type": "geojson",
"number": "PHYADDR_STR_NUM",
"street": [
"PHYADDR_DIR_PFX",
"PHYADDR_STR",
"PHYADDR_STR_TYPE",
"PHYADDR_STR_SFX"
],
"city": "PHYADDR_CITY",
"region": "PHYADDR_STATE",
"postcode": "PHYADDR_ZIP"
}
}
| {
"coverage": {
"US Census": {
"geoid": "37081",
"name": "Guilford County",
"state": "North Carolina"
},
"country": "us",
"state": "nc",
"county": "Guilford"
},
- "data": "http://gis.co.guilford.nc.us/datadownload/DataDownload.aspx",
? ^ ^ ^ - ^
+ "data": "http://gis.co.guilford.nc.us/arcgis/rest/services/Data_Download/DataDownload12/MapServer/17",
? ^^^^^^^^^^^^^^^^^^^^^^ ^^ ^^^^ ^^^^^^^^^
- "type": "http",
? ^^^^
+ "type": "ESRI",
? ^^^^
- "compression": "zip",
"website": "http://countyweb.co.guilford.nc.us/gis-home",
- "note": "address.shp is linked on the page, however they require ie in order to access it; doing so in ie doesn't provide a url, however clicking on it will prompt a download for DataDownload.zip which provides the address points in .shp"
+ "conform": {
+ "type": "geojson",
+ "number": "PHYADDR_STR_NUM",
+ "street": [
+ "PHYADDR_DIR_PFX",
+ "PHYADDR_STR",
+ "PHYADDR_STR_TYPE",
+ "PHYADDR_STR_SFX"
+ ],
+ "city": "PHYADDR_CITY",
+ "region": "PHYADDR_STATE",
+ "postcode": "PHYADDR_ZIP"
+ }
} | 19 | 1.117647 | 15 | 4 |
aad51c0ad67e9c8c6874ebc1908d29936fd600c4 | .codeclimate.yml | .codeclimate.yml | version: '2'
checks:
similar-code:
config:
count_threshold: 3
threshold: 50
identical-code:
config:
count_threshold: 3
threshold: 50
| version: '2'
checks:
similar-code:
config:
threshold: 50
identical-code:
config:
threshold: 50
plugins:
duplication:
enabled: true
config:
count_threshold: 3
| Move count_threshold to plugin config for code climate. | Move count_threshold to plugin config for code climate.
| YAML | mit | Squishymedia/BIDS-Validator,Squishymedia/bids-validator,nellh/bids-validator,nellh/bids-validator,nellh/bids-validator | yaml | ## Code Before:
version: '2'
checks:
similar-code:
config:
count_threshold: 3
threshold: 50
identical-code:
config:
count_threshold: 3
threshold: 50
## Instruction:
Move count_threshold to plugin config for code climate.
## Code After:
version: '2'
checks:
similar-code:
config:
threshold: 50
identical-code:
config:
threshold: 50
plugins:
duplication:
enabled: true
config:
count_threshold: 3
| version: '2'
checks:
similar-code:
config:
- count_threshold: 3
threshold: 50
identical-code:
config:
+ threshold: 50
+ plugins:
+ duplication:
+ enabled: true
+ config:
count_threshold: 3
- threshold: 50 | 7 | 0.7 | 5 | 2 |
b865b2bc102fa0086bf3aaf6bdab5d1e3f5f8b3b | app/Transformers/Outbound/Collections/Place.php | app/Transformers/Outbound/Collections/Place.php | <?php
namespace App\Transformers\Outbound\Collections;
use App\Transformers\Outbound\CollectionsTransformer as BaseTransformer;
class Place extends BaseTransformer
{
protected function getFields()
{
return [
'type' => [
'doc' => 'Type always takes one of the following values: AIC Gallery, AIC Storage, No location',
'type' => 'string',
'elasticsearch' => 'keyword',
],
];
}
public function getLicenseText()
{
return 'The data in this response is licensed under a Creative Commons Attribution 4.0 Generic License (CC-By) and the Terms and Conditions of artic.edu. Contains information from the J. Paul Getty Trust, Getty Research Institute, the Getty Thesaurus of Geographic Names, which is made available under the ODC Attribution License.';
}
public function getLicenseLinks()
{
return [
'https://www.artic.edu/terms',
'https://creativecommons.org/licenses/by/4.0/',
];
}
public function getLicensePriority()
{
return 50;
}
}
| <?php
namespace App\Transformers\Outbound\Collections;
use App\Transformers\Outbound\CollectionsTransformer as BaseTransformer;
class Place extends BaseTransformer
{
protected function getFields()
{
return [
'latitude' => [
'doc' => 'Latitude coordinate of the center of the room',
'type' => 'number',
'elasticsearch' => 'float',
'is_restricted' => true,
],
'longitude' => [
'doc' => 'Longitude coordinate of the center of the room',
'type' => 'number',
'elasticsearch' => 'float',
'is_restricted' => true,
],
'type' => [
'doc' => 'Type always takes one of the following values: AIC Gallery, AIC Storage, No location',
'type' => 'string',
'elasticsearch' => 'keyword',
],
];
}
public function getLicenseText()
{
return 'The data in this response is licensed under a Creative Commons Attribution 4.0 Generic License (CC-By) and the Terms and Conditions of artic.edu. Contains information from the J. Paul Getty Trust, Getty Research Institute, the Getty Thesaurus of Geographic Names, which is made available under the ODC Attribution License.';
}
public function getLicenseLinks()
{
return [
'https://www.artic.edu/terms',
'https://creativecommons.org/licenses/by/4.0/',
];
}
public function getLicensePriority()
{
return 50;
}
}
| Add restricted `latitude` and `longitude` fields to `places` | Add restricted `latitude` and `longitude` fields to `places` [API-12]
| PHP | agpl-3.0 | art-institute-of-chicago/data-aggregator,art-institute-of-chicago/data-aggregator,art-institute-of-chicago/data-aggregator | php | ## Code Before:
<?php
namespace App\Transformers\Outbound\Collections;
use App\Transformers\Outbound\CollectionsTransformer as BaseTransformer;
class Place extends BaseTransformer
{
protected function getFields()
{
return [
'type' => [
'doc' => 'Type always takes one of the following values: AIC Gallery, AIC Storage, No location',
'type' => 'string',
'elasticsearch' => 'keyword',
],
];
}
public function getLicenseText()
{
return 'The data in this response is licensed under a Creative Commons Attribution 4.0 Generic License (CC-By) and the Terms and Conditions of artic.edu. Contains information from the J. Paul Getty Trust, Getty Research Institute, the Getty Thesaurus of Geographic Names, which is made available under the ODC Attribution License.';
}
public function getLicenseLinks()
{
return [
'https://www.artic.edu/terms',
'https://creativecommons.org/licenses/by/4.0/',
];
}
public function getLicensePriority()
{
return 50;
}
}
## Instruction:
Add restricted `latitude` and `longitude` fields to `places` [API-12]
## Code After:
<?php
namespace App\Transformers\Outbound\Collections;
use App\Transformers\Outbound\CollectionsTransformer as BaseTransformer;
class Place extends BaseTransformer
{
protected function getFields()
{
return [
'latitude' => [
'doc' => 'Latitude coordinate of the center of the room',
'type' => 'number',
'elasticsearch' => 'float',
'is_restricted' => true,
],
'longitude' => [
'doc' => 'Longitude coordinate of the center of the room',
'type' => 'number',
'elasticsearch' => 'float',
'is_restricted' => true,
],
'type' => [
'doc' => 'Type always takes one of the following values: AIC Gallery, AIC Storage, No location',
'type' => 'string',
'elasticsearch' => 'keyword',
],
];
}
public function getLicenseText()
{
return 'The data in this response is licensed under a Creative Commons Attribution 4.0 Generic License (CC-By) and the Terms and Conditions of artic.edu. Contains information from the J. Paul Getty Trust, Getty Research Institute, the Getty Thesaurus of Geographic Names, which is made available under the ODC Attribution License.';
}
public function getLicenseLinks()
{
return [
'https://www.artic.edu/terms',
'https://creativecommons.org/licenses/by/4.0/',
];
}
public function getLicensePriority()
{
return 50;
}
}
| <?php
namespace App\Transformers\Outbound\Collections;
use App\Transformers\Outbound\CollectionsTransformer as BaseTransformer;
class Place extends BaseTransformer
{
protected function getFields()
{
return [
+ 'latitude' => [
+ 'doc' => 'Latitude coordinate of the center of the room',
+ 'type' => 'number',
+ 'elasticsearch' => 'float',
+ 'is_restricted' => true,
+ ],
+ 'longitude' => [
+ 'doc' => 'Longitude coordinate of the center of the room',
+ 'type' => 'number',
+ 'elasticsearch' => 'float',
+ 'is_restricted' => true,
+ ],
'type' => [
'doc' => 'Type always takes one of the following values: AIC Gallery, AIC Storage, No location',
'type' => 'string',
'elasticsearch' => 'keyword',
],
];
}
public function getLicenseText()
{
return 'The data in this response is licensed under a Creative Commons Attribution 4.0 Generic License (CC-By) and the Terms and Conditions of artic.edu. Contains information from the J. Paul Getty Trust, Getty Research Institute, the Getty Thesaurus of Geographic Names, which is made available under the ODC Attribution License.';
}
public function getLicenseLinks()
{
return [
'https://www.artic.edu/terms',
'https://creativecommons.org/licenses/by/4.0/',
];
}
public function getLicensePriority()
{
return 50;
}
} | 12 | 0.315789 | 12 | 0 |
dd529bc0b08e653034673045674668eca37d3505 | app/views/sessions/_new.html.haml | app/views/sessions/_new.html.haml | - field_set_tag "Log in" do
- form_tag session_path do
.aligned
%ul
%li
= label_tag 'login', 'Username', :class=>"right"
= text_field_tag 'login', @login
%li
= label_tag :password, "Password", :class=>"right"
= password_field_tag
%li
= label_tag 'submit', '', :class=>"right"
= submit_tag 'Login'
%li
= label_tag 'remember_me', 'Remember me', :class=>"right"
= check_box_tag 'remember_me', '1'
%ul.menu_h_right
%li= link_to 'Forgot my password', forgot_password_path
| - field_set_tag "Log in" do
- form_tag session_path do
.aligned
%ul
%li
= label_tag 'login', 'Username', :class=>"right"
= text_field_tag 'login', @login
%li
= label_tag :password, "Password", :class=>"right"
= password_field_tag
%li
= label_tag 'submit', '', :class=>"right"
= submit_tag 'Login'
%li
= label_tag 'remember_me', 'Stay logged in', :class=>"right"
= check_box_tag 'remember_me', '1'
%ul.menu_h_right
%li= link_to 'Forgot my password', forgot_password_path
| Change naming of remember me to stay logged in | Change naming of remember me to stay logged in
This was originally an itsi su request:
http://www.pivotaltracker.com/story/show/2049248
But the argument was quite compelling, so I pushed
to default theme. | Haml | mit | concord-consortium/rigse,concord-consortium/rigse,concord-consortium/rigse,concord-consortium/rigse,concord-consortium/rigse,concord-consortium/rigse | haml | ## Code Before:
- field_set_tag "Log in" do
- form_tag session_path do
.aligned
%ul
%li
= label_tag 'login', 'Username', :class=>"right"
= text_field_tag 'login', @login
%li
= label_tag :password, "Password", :class=>"right"
= password_field_tag
%li
= label_tag 'submit', '', :class=>"right"
= submit_tag 'Login'
%li
= label_tag 'remember_me', 'Remember me', :class=>"right"
= check_box_tag 'remember_me', '1'
%ul.menu_h_right
%li= link_to 'Forgot my password', forgot_password_path
## Instruction:
Change naming of remember me to stay logged in
This was originally an itsi su request:
http://www.pivotaltracker.com/story/show/2049248
But the argument was quite compelling, so I pushed
to default theme.
## Code After:
- field_set_tag "Log in" do
- form_tag session_path do
.aligned
%ul
%li
= label_tag 'login', 'Username', :class=>"right"
= text_field_tag 'login', @login
%li
= label_tag :password, "Password", :class=>"right"
= password_field_tag
%li
= label_tag 'submit', '', :class=>"right"
= submit_tag 'Login'
%li
= label_tag 'remember_me', 'Stay logged in', :class=>"right"
= check_box_tag 'remember_me', '1'
%ul.menu_h_right
%li= link_to 'Forgot my password', forgot_password_path
| - field_set_tag "Log in" do
- form_tag session_path do
.aligned
%ul
%li
= label_tag 'login', 'Username', :class=>"right"
= text_field_tag 'login', @login
%li
= label_tag :password, "Password", :class=>"right"
= password_field_tag
%li
= label_tag 'submit', '', :class=>"right"
= submit_tag 'Login'
%li
- = label_tag 'remember_me', 'Remember me', :class=>"right"
? ^ ^^^^^^ ^^
+ = label_tag 'remember_me', 'Stay logged in', :class=>"right"
? ^^^^^^^^^ ^ ^^
= check_box_tag 'remember_me', '1'
%ul.menu_h_right
%li= link_to 'Forgot my password', forgot_password_path
| 2 | 0.1 | 1 | 1 |
81af16a8959f72acc386e90bfbc010957bf6e2d4 | package.json | package.json | {
"author": "Einar Otto Stangvik <einaros@gmail.com> (http://2x.io)",
"name": "ws",
"description": "simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455",
"version": "0.4.32",
"licenses": [
{
"type": "MIT",
"url": "https://raw.githubusercontent.com/einaros/ws/master/README.md"
}
],
"keywords": [
"Hixie",
"HyBi",
"Push",
"RFC-6455",
"WebSocket",
"WebSockets",
"real-time"
],
"repository": {
"type": "git",
"url": "git://github.com/einaros/ws.git"
},
"scripts": {
"test": "make test",
"install": "(node-gyp rebuild 2> builderror.log) || (exit 0)"
},
"dependencies": {
"nan": "~1.0.0",
"options": ">=0.0.5"
},
"devDependencies": {
"mocha": "1.12.0",
"should": "1.2.x",
"expect.js": "0.2.x",
"benchmark": "0.3.x",
"ansi": "latest"
},
"browser": "./lib/browser.js",
"component": {
"scripts": {
"ws/index.js": "./lib/browser.js"
}
},
"gypfile": true
}
| {
"author": "Einar Otto Stangvik <einaros@gmail.com> (http://2x.io)",
"name": "ws",
"description": "simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455",
"version": "0.4.32",
"license": "MIT",
"keywords": [
"Hixie",
"HyBi",
"Push",
"RFC-6455",
"WebSocket",
"WebSockets",
"real-time"
],
"repository": {
"type": "git",
"url": "git://github.com/einaros/ws.git"
},
"scripts": {
"test": "make test",
"install": "(node-gyp rebuild 2> builderror.log) || (exit 0)"
},
"dependencies": {
"nan": "~1.0.0",
"options": ">=0.0.5"
},
"devDependencies": {
"mocha": "1.12.0",
"should": "1.2.x",
"expect.js": "0.2.x",
"benchmark": "0.3.x",
"ansi": "latest"
},
"browser": "./lib/browser.js",
"component": {
"scripts": {
"ws/index.js": "./lib/browser.js"
}
},
"gypfile": true
}
 | Remove the licenses property in favor of license as we have no license file | [fix] Remove the licenses property in favor of license as we have no license file
| JSON | mit | websockets/ws,guymguym/ws,karatheodory/ws | json | ## Code Before:
{
"author": "Einar Otto Stangvik <einaros@gmail.com> (http://2x.io)",
"name": "ws",
"description": "simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455",
"version": "0.4.32",
"licenses": [
{
"type": "MIT",
"url": "https://raw.githubusercontent.com/einaros/ws/master/README.md"
}
],
"keywords": [
"Hixie",
"HyBi",
"Push",
"RFC-6455",
"WebSocket",
"WebSockets",
"real-time"
],
"repository": {
"type": "git",
"url": "git://github.com/einaros/ws.git"
},
"scripts": {
"test": "make test",
"install": "(node-gyp rebuild 2> builderror.log) || (exit 0)"
},
"dependencies": {
"nan": "~1.0.0",
"options": ">=0.0.5"
},
"devDependencies": {
"mocha": "1.12.0",
"should": "1.2.x",
"expect.js": "0.2.x",
"benchmark": "0.3.x",
"ansi": "latest"
},
"browser": "./lib/browser.js",
"component": {
"scripts": {
"ws/index.js": "./lib/browser.js"
}
},
"gypfile": true
}
## Instruction:
[fix] Remove the licenses property in favor of license as we have no license file
## Code After:
{
"author": "Einar Otto Stangvik <einaros@gmail.com> (http://2x.io)",
"name": "ws",
"description": "simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455",
"version": "0.4.32",
"license": "MIT",
"keywords": [
"Hixie",
"HyBi",
"Push",
"RFC-6455",
"WebSocket",
"WebSockets",
"real-time"
],
"repository": {
"type": "git",
"url": "git://github.com/einaros/ws.git"
},
"scripts": {
"test": "make test",
"install": "(node-gyp rebuild 2> builderror.log) || (exit 0)"
},
"dependencies": {
"nan": "~1.0.0",
"options": ">=0.0.5"
},
"devDependencies": {
"mocha": "1.12.0",
"should": "1.2.x",
"expect.js": "0.2.x",
"benchmark": "0.3.x",
"ansi": "latest"
},
"browser": "./lib/browser.js",
"component": {
"scripts": {
"ws/index.js": "./lib/browser.js"
}
},
"gypfile": true
}
| {
"author": "Einar Otto Stangvik <einaros@gmail.com> (http://2x.io)",
"name": "ws",
"description": "simple to use, blazing fast and thoroughly tested websocket client, server and console for node.js, up-to-date against RFC-6455",
"version": "0.4.32",
- "licenses": [
? - ^
+ "license": "MIT",
? ^^^^^^
- {
- "type": "MIT",
- "url": "https://raw.githubusercontent.com/einaros/ws/master/README.md"
- }
- ],
"keywords": [
"Hixie",
"HyBi",
"Push",
"RFC-6455",
"WebSocket",
"WebSockets",
"real-time"
],
"repository": {
"type": "git",
"url": "git://github.com/einaros/ws.git"
},
"scripts": {
"test": "make test",
"install": "(node-gyp rebuild 2> builderror.log) || (exit 0)"
},
"dependencies": {
"nan": "~1.0.0",
"options": ">=0.0.5"
},
"devDependencies": {
"mocha": "1.12.0",
"should": "1.2.x",
"expect.js": "0.2.x",
"benchmark": "0.3.x",
"ansi": "latest"
},
"browser": "./lib/browser.js",
"component": {
"scripts": {
"ws/index.js": "./lib/browser.js"
}
},
"gypfile": true
} | 7 | 0.148936 | 1 | 6 |
6f43c9c253ee5f09434a321e21d610b7e2b543eb | lib.rs | lib.rs | // Copyright 2013 The Servo Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
#[crate_id = "github.com/mozilla-servo/rust-geom#geom:0.1"];
#[crate_type = "lib"];
#[crate_type = "dylib"];
#[crate_type = "rlib"];
#[feature(phase)];
#[phase(syntax, link)] extern crate log;
extern crate serialize;
extern crate std;
pub use matrix::Matrix4;
pub use matrix2d::Matrix2D;
pub use point::Point2D;
pub use rect::Rect;
pub use side_offsets::SideOffsets2D;
pub use size::Size2D;
pub mod approxeq;
pub mod matrix;
pub mod matrix2d;
pub mod point;
pub mod rect;
pub mod side_offsets;
pub mod size;
| // Copyright 2013 The Servo Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
#![crate_id = "github.com/mozilla-servo/rust-geom#geom:0.1"]
#![crate_type = "lib"]
#![crate_type = "dylib"]
#![crate_type = "rlib"]
#![feature(phase)]
#[phase(syntax, link)]
extern crate log;
extern crate serialize;
extern crate std;
pub use matrix::Matrix4;
pub use matrix2d::Matrix2D;
pub use point::Point2D;
pub use rect::Rect;
pub use side_offsets::SideOffsets2D;
pub use size::Size2D;
pub mod approxeq;
pub mod matrix;
pub mod matrix2d;
pub mod point;
pub mod rect;
pub mod side_offsets;
pub mod size;
| Update to current rust: use the new attributes syntax. | Update to current rust: use the new attributes syntax.
| Rust | apache-2.0 | mbrubeck/rust-geom,ruud-v-a/rust-geom,servo/rust-geom | rust | ## Code Before:
// Copyright 2013 The Servo Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
#[crate_id = "github.com/mozilla-servo/rust-geom#geom:0.1"];
#[crate_type = "lib"];
#[crate_type = "dylib"];
#[crate_type = "rlib"];
#[feature(phase)];
#[phase(syntax, link)] extern crate log;
extern crate serialize;
extern crate std;
pub use matrix::Matrix4;
pub use matrix2d::Matrix2D;
pub use point::Point2D;
pub use rect::Rect;
pub use side_offsets::SideOffsets2D;
pub use size::Size2D;
pub mod approxeq;
pub mod matrix;
pub mod matrix2d;
pub mod point;
pub mod rect;
pub mod side_offsets;
pub mod size;
## Instruction:
Update to current rust: use the new attributes syntax.
## Code After:
// Copyright 2013 The Servo Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
#![crate_id = "github.com/mozilla-servo/rust-geom#geom:0.1"]
#![crate_type = "lib"]
#![crate_type = "dylib"]
#![crate_type = "rlib"]
#![feature(phase)]
#[phase(syntax, link)]
extern crate log;
extern crate serialize;
extern crate std;
pub use matrix::Matrix4;
pub use matrix2d::Matrix2D;
pub use point::Point2D;
pub use rect::Rect;
pub use side_offsets::SideOffsets2D;
pub use size::Size2D;
pub mod approxeq;
pub mod matrix;
pub mod matrix2d;
pub mod point;
pub mod rect;
pub mod side_offsets;
pub mod size;
| // Copyright 2013 The Servo Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
- #[crate_id = "github.com/mozilla-servo/rust-geom#geom:0.1"];
? -
+ #![crate_id = "github.com/mozilla-servo/rust-geom#geom:0.1"]
? +
- #[crate_type = "lib"];
? -
+ #![crate_type = "lib"]
? +
- #[crate_type = "dylib"];
? -
+ #![crate_type = "dylib"]
? +
- #[crate_type = "rlib"];
? -
+ #![crate_type = "rlib"]
? +
- #[feature(phase)];
? -
+ #![feature(phase)]
? +
- #[phase(syntax, link)] extern crate log;
+ #[phase(syntax, link)]
+ extern crate log;
extern crate serialize;
extern crate std;
pub use matrix::Matrix4;
pub use matrix2d::Matrix2D;
pub use point::Point2D;
pub use rect::Rect;
pub use side_offsets::SideOffsets2D;
pub use size::Size2D;
pub mod approxeq;
pub mod matrix;
pub mod matrix2d;
pub mod point;
pub mod rect;
pub mod side_offsets;
pub mod size;
| 13 | 0.371429 | 7 | 6 |
70822d2879cb105fc79fac130390237b7dc38efa | src/tmpl/domain_suggestion_panel.html | src/tmpl/domain_suggestion_panel.html | Are you looking for <span class="search-term">{{search}}</span> in...<br>
<span class="domain-links">{{#each domains}}<a class="alert-link" data-link-type="domain" data-domain="{{this}}">{{this}}</a> {{/each}}{{#if more}}<a class="alert-link" data-link-type="more">(more)</a>{{/if}}</span>
| Looking for <span class="search-term">{{search}}</span> in these projects?<br>
<span class="domain-links">
{{~#each domains~}}
<a class="alert-link" data-link-type="domain" data-domain="{{this}}">{{this}}</a>
{{/each~}}
{{~#if more~}}
<a class="alert-link" data-link-type="more">(more)</a>
{{~/if~}}</span>
| Use whitespace control in domain suggestion template | Use whitespace control in domain suggestion template
| HTML | mit | searchthedocs/searchthedocs | html | ## Code Before:
Are you looking for <span class="search-term">{{search}}</span> in...<br>
<span class="domain-links">{{#each domains}}<a class="alert-link" data-link-type="domain" data-domain="{{this}}">{{this}}</a> {{/each}}{{#if more}}<a class="alert-link" data-link-type="more">(more)</a>{{/if}}</span>
## Instruction:
Use whitespace control in domain suggestion template
## Code After:
Looking for <span class="search-term">{{search}}</span> in these projects?<br>
<span class="domain-links">
{{~#each domains~}}
<a class="alert-link" data-link-type="domain" data-domain="{{this}}">{{this}}</a>
{{/each~}}
{{~#if more~}}
<a class="alert-link" data-link-type="more">(more)</a>
{{~/if~}}</span>
| - Are you looking for <span class="search-term">{{search}}</span> in...<br>
? ^^^^^^^^^ ^^^
+ Looking for <span class="search-term">{{search}}</span> in these projects?<br>
? ^ ^^^^^^^^^^^^^^^^
- <span class="domain-links">{{#each domains}}<a class="alert-link" data-link-type="domain" data-domain="{{this}}">{{this}}</a> {{/each}}{{#if more}}<a class="alert-link" data-link-type="more">(more)</a>{{/if}}</span>
+ <span class="domain-links">
+ {{~#each domains~}}
+ <a class="alert-link" data-link-type="domain" data-domain="{{this}}">{{this}}</a>
+ {{/each~}}
+ {{~#if more~}}
+ <a class="alert-link" data-link-type="more">(more)</a>
+ {{~/if~}}</span> | 10 | 5 | 8 | 2 |
0140c925a09889d397386925f62c98acac8c9b57 | packages/sp/spatial-rotations.yaml | packages/sp/spatial-rotations.yaml | homepage: https://github.com/leftaroundabout/rotations
changelog-type: markdown
hash: 1ee6dcb1209c27b36fe55f24fe53ffb4c639b1e6e26dedca310c990c5e9f56e3
test-bench-deps:
base: ! '>=4 && <5'
vector-space: -any
spatial-rotations: -any
pragmatic-show: -any
manifolds: -any
containers: -any
tasty-quickcheck: -any
tasty-hunit: -any
tasty: ! '>=0.7'
maintainer: (@) jsagemue $ uni-koeln.de
synopsis: Rotate about any suitable axis
changelog: ! '# Revision history for spatial-rotations
## 0.1.0.0 -- YYYY-mm-dd
* First version. Released on an unsuspecting world.
'
basic-deps:
manifolds-core: ! '>=0.5 && <0.6'
base: ! '>=4.8 && <4.11'
vector-space: ! '>=0.12 && <0.13'
linear: -any
all-versions:
- '0.1.0.0'
author: Justus Sagemüller
latest: '0.1.0.0'
description-type: haddock
description: ''
license-name: GPL-3
| homepage: https://github.com/leftaroundabout/rotations
changelog-type: markdown
hash: a591a567f6c64dc2be9c8af651ed0b85cdc407dbd00dcd8987f00e378d2508ea
test-bench-deps:
base: ! '>=4 && <5'
vector-space: -any
spatial-rotations: -any
pragmatic-show: -any
manifolds: -any
containers: -any
tasty-quickcheck: -any
tasty-hunit: -any
tasty: ! '>=0.7'
maintainer: (@) jsagemue $ uni-koeln.de
synopsis: Rotate about any suitable axis
changelog: ! '# Revision history for spatial-rotations
## 0.1.0.0 -- YYYY-mm-dd
* First version. Released on an unsuspecting world.
'
basic-deps:
manifolds-core: ! '>=0.5 && <0.6'
base: ! '>=4.8 && <6'
vector-space: ! '>=0.12 && <0.13'
linear: -any
all-versions:
- '0.1.0.0'
- '0.1.0.1'
author: Justus Sagemüller
latest: '0.1.0.1'
description-type: haddock
description: ''
license-name: GPL-3
| Update from Hackage at 2018-09-26T18:44:00Z | Update from Hackage at 2018-09-26T18:44:00Z
| YAML | mit | commercialhaskell/all-cabal-metadata | yaml | ## Code Before:
homepage: https://github.com/leftaroundabout/rotations
changelog-type: markdown
hash: 1ee6dcb1209c27b36fe55f24fe53ffb4c639b1e6e26dedca310c990c5e9f56e3
test-bench-deps:
base: ! '>=4 && <5'
vector-space: -any
spatial-rotations: -any
pragmatic-show: -any
manifolds: -any
containers: -any
tasty-quickcheck: -any
tasty-hunit: -any
tasty: ! '>=0.7'
maintainer: (@) jsagemue $ uni-koeln.de
synopsis: Rotate about any suitable axis
changelog: ! '# Revision history for spatial-rotations
## 0.1.0.0 -- YYYY-mm-dd
* First version. Released on an unsuspecting world.
'
basic-deps:
manifolds-core: ! '>=0.5 && <0.6'
base: ! '>=4.8 && <4.11'
vector-space: ! '>=0.12 && <0.13'
linear: -any
all-versions:
- '0.1.0.0'
author: Justus Sagemüller
latest: '0.1.0.0'
description-type: haddock
description: ''
license-name: GPL-3
## Instruction:
Update from Hackage at 2018-09-26T18:44:00Z
## Code After:
homepage: https://github.com/leftaroundabout/rotations
changelog-type: markdown
hash: a591a567f6c64dc2be9c8af651ed0b85cdc407dbd00dcd8987f00e378d2508ea
test-bench-deps:
base: ! '>=4 && <5'
vector-space: -any
spatial-rotations: -any
pragmatic-show: -any
manifolds: -any
containers: -any
tasty-quickcheck: -any
tasty-hunit: -any
tasty: ! '>=0.7'
maintainer: (@) jsagemue $ uni-koeln.de
synopsis: Rotate about any suitable axis
changelog: ! '# Revision history for spatial-rotations
## 0.1.0.0 -- YYYY-mm-dd
* First version. Released on an unsuspecting world.
'
basic-deps:
manifolds-core: ! '>=0.5 && <0.6'
base: ! '>=4.8 && <6'
vector-space: ! '>=0.12 && <0.13'
linear: -any
all-versions:
- '0.1.0.0'
- '0.1.0.1'
author: Justus Sagemüller
latest: '0.1.0.1'
description-type: haddock
description: ''
license-name: GPL-3
| homepage: https://github.com/leftaroundabout/rotations
changelog-type: markdown
- hash: 1ee6dcb1209c27b36fe55f24fe53ffb4c639b1e6e26dedca310c990c5e9f56e3
+ hash: a591a567f6c64dc2be9c8af651ed0b85cdc407dbd00dcd8987f00e378d2508ea
test-bench-deps:
base: ! '>=4 && <5'
vector-space: -any
spatial-rotations: -any
pragmatic-show: -any
manifolds: -any
containers: -any
tasty-quickcheck: -any
tasty-hunit: -any
tasty: ! '>=0.7'
maintainer: (@) jsagemue $ uni-koeln.de
synopsis: Rotate about any suitable axis
changelog: ! '# Revision history for spatial-rotations
## 0.1.0.0 -- YYYY-mm-dd
* First version. Released on an unsuspecting world.
'
basic-deps:
manifolds-core: ! '>=0.5 && <0.6'
- base: ! '>=4.8 && <4.11'
? ^^^^
+ base: ! '>=4.8 && <6'
? ^
vector-space: ! '>=0.12 && <0.13'
linear: -any
all-versions:
- '0.1.0.0'
+ - '0.1.0.1'
author: Justus Sagemüller
- latest: '0.1.0.0'
? ^
+ latest: '0.1.0.1'
? ^
description-type: haddock
description: ''
license-name: GPL-3 | 7 | 0.194444 | 4 | 3 |
437aa12cddf80465c06596547f160cb3acc71eef | app/assets/stylesheets/modules/footer.css.scss | app/assets/stylesheets/modules/footer.css.scss | footer {
padding: 20px 0;
margin-top: 20px;
font-size: 12px;
text-align: center;
p {
color: $footer-text;
}
}
| footer {
padding: 20px 0;
margin-top: 20px;
text-align: center;
p {
color: $footer-text;
font-size: 14px;
}
}
| Set font-size on paragraph instead of footer | Set font-size on paragraph instead of footer
| SCSS | agpl-3.0 | asm-helpful/helpful-web,asm-helpful/helpful-web,asm-helpful/helpful-web,asm-helpful/helpful-web | scss | ## Code Before:
footer {
padding: 20px 0;
margin-top: 20px;
font-size: 12px;
text-align: center;
p {
color: $footer-text;
}
}
## Instruction:
Set font-size on paragraph instead of footer
## Code After:
footer {
padding: 20px 0;
margin-top: 20px;
text-align: center;
p {
color: $footer-text;
font-size: 14px;
}
}
| footer {
padding: 20px 0;
margin-top: 20px;
- font-size: 12px;
text-align: center;
p {
color: $footer-text;
+ font-size: 14px;
}
} | 2 | 0.2 | 1 | 1 |
1c9ef8d19927fb863f1fe27fb7af0d3ca9d28b20 | src/components/Notification.vue | src/components/Notification.vue | <template>
<transition name="fade">
<div v-if="isShown" class="notification" v-bind:class="notificationClass">
<button class="delete" @click="deleteNotification"></button>
{{ notification.message }}
</div>
</transition>
</template>
<script>
export default {
name: 'notification',
props: ['notification'],
data() {
return {
isShown: true
}
},
computed: {
notificationClass(){
return this.notification.type ? `is-${this.notification.type}` : ''
}
},
methods: {
deleteNotification() {
this.isShown = false;
setInterval(() => {
this.$store.commit('REMOVE_FROM_NOTIFICATIONS', this.notification.id)
}, 2500);
}
},
mounted() {
setInterval(this.deleteNotification, 2500);
}
}
</script>
<style>
.fade-enter-active,
.fade-leave-active {
transition: opacity .5s
}
.fade-enter,
.fade-leave-to {
opacity: 0
}
.notification {
position: relative;
}
</style> | <template>
<transition name="fade">
<div v-if="isShown" class="notification" v-bind:class="notificationClass">
<button class="delete" @click="deleteNotification"></button>
{{ notification.message }}
</div>
</transition>
</template>
<script>
export default {
name: 'notification',
props: ['notification'],
data() {
return {
isShown: true
}
},
computed: {
notificationClass(){
return this.notification.type ? `is-${this.notification.type}` : ''
}
},
methods: {
deleteNotification() {
this.isShown = false;
this.$store.commit('REMOVE_FROM_NOTIFICATIONS', this.notification.id);
}
},
mounted() {
setTimeout(this.deleteNotification, 2500);
}
}
</script>
<style>
.fade-enter-active,
.fade-leave-active {
transition: opacity .5s
}
.fade-enter,
.fade-leave-to {
opacity: 0
}
.notification {
position: relative;
}
</style> | Fix notification 'bug' - used setInterval instead of setTimeout! | Fix notification 'bug' - used setInterval instead of setTimeout!
| Vue | apache-2.0 | eamonnbell/sortzzi,eamonnbell/sortzzi | vue | ## Code Before:
<template>
<transition name="fade">
<div v-if="isShown" class="notification" v-bind:class="notificationClass">
<button class="delete" @click="deleteNotification"></button>
{{ notification.message }}
</div>
</transition>
</template>
<script>
export default {
name: 'notification',
props: ['notification'],
data() {
return {
isShown: true
}
},
computed: {
notificationClass(){
return this.notification.type ? `is-${this.notification.type}` : ''
}
},
methods: {
deleteNotification() {
this.isShown = false;
setInterval(() => {
this.$store.commit('REMOVE_FROM_NOTIFICATIONS', this.notification.id)
}, 2500);
}
},
mounted() {
setInterval(this.deleteNotification, 2500);
}
}
</script>
<style>
.fade-enter-active,
.fade-leave-active {
transition: opacity .5s
}
.fade-enter,
.fade-leave-to {
opacity: 0
}
.notification {
position: relative;
}
</style>
## Instruction:
Fix notification 'bug' - used setInterval instead of setTimeout!
## Code After:
<template>
<transition name="fade">
<div v-if="isShown" class="notification" v-bind:class="notificationClass">
<button class="delete" @click="deleteNotification"></button>
{{ notification.message }}
</div>
</transition>
</template>
<script>
export default {
name: 'notification',
props: ['notification'],
data() {
return {
isShown: true
}
},
computed: {
notificationClass(){
return this.notification.type ? `is-${this.notification.type}` : ''
}
},
methods: {
deleteNotification() {
this.isShown = false;
this.$store.commit('REMOVE_FROM_NOTIFICATIONS', this.notification.id);
}
},
mounted() {
setTimeout(this.deleteNotification, 2500);
}
}
</script>
<style>
.fade-enter-active,
.fade-leave-active {
transition: opacity .5s
}
.fade-enter,
.fade-leave-to {
opacity: 0
}
.notification {
position: relative;
}
</style> | <template>
<transition name="fade">
<div v-if="isShown" class="notification" v-bind:class="notificationClass">
<button class="delete" @click="deleteNotification"></button>
{{ notification.message }}
</div>
</transition>
</template>
<script>
export default {
name: 'notification',
props: ['notification'],
data() {
return {
isShown: true
}
},
computed: {
notificationClass(){
return this.notification.type ? `is-${this.notification.type}` : ''
}
},
methods: {
deleteNotification() {
this.isShown = false;
- setInterval(() => {
- this.$store.commit('REMOVE_FROM_NOTIFICATIONS', this.notification.id)
? ----
+ this.$store.commit('REMOVE_FROM_NOTIFICATIONS', this.notification.id);
? +
- }, 2500);
}
},
mounted() {
- setInterval(this.deleteNotification, 2500);
? ^^ -----
+ setTimeout(this.deleteNotification, 2500);
? ^^^^^^
}
}
</script>
<style>
.fade-enter-active,
.fade-leave-active {
transition: opacity .5s
}
.fade-enter,
.fade-leave-to {
opacity: 0
}
.notification {
position: relative;
}
</style> | 6 | 0.115385 | 2 | 4 |
1f3e00404d4442f228b40e5a633f575ba0d0622d | CHANGELOG.md | CHANGELOG.md |
In this release we finaly are able to build and distribute our projects!
- [#3](https://github.com/pirelenito/sagui/issues/3) Implement build action to generate the compiled assets
- [#4](https://github.com/pirelenito/sagui/issues/4) Implement dist action to generate optimized assets ready for distribution
|
Allow extensibility and customization of webpack and karma configurations.
#### Deprecated
Project specific configuration has moved from `package.json` to `sagui.config.js`.
#### Major changes:
- [#16](https://github.com/pirelenito/sagui/issues/16) Allow extensibility of webpack configuration
- [#23](https://github.com/pirelenito/sagui/pull/23) Add support for CSS Modules #23
### v2.6.0 (2015-12-12)
In this release we finaly are able to build and distribute our projects!
- [#3](https://github.com/pirelenito/sagui/issues/3) Implement build action to generate the compiled assets
- [#4](https://github.com/pirelenito/sagui/issues/4) Implement dist action to generate optimized assets ready for distribution
| Add change log for 3.0.0 release | Add change log for 3.0.0 release | Markdown | mit | saguijs/sagui,saguijs/sagui | markdown | ## Code Before:
In this release we finaly are able to build and distribute our projects!
- [#3](https://github.com/pirelenito/sagui/issues/3) Implement build action to generate the compiled assets
- [#4](https://github.com/pirelenito/sagui/issues/4) Implement dist action to generate optimized assets ready for distribution
## Instruction:
Add change log for 3.0.0 release
## Code After:
Allow extensibility and customization of webpack and karma configurations.
#### Deprecated
Project specific configuration has moved from `package.json` to `sagui.config.js`.
#### Major changes:
- [#16](https://github.com/pirelenito/sagui/issues/16) Allow extensibility of webpack configuration
- [#23](https://github.com/pirelenito/sagui/pull/23) Add support for CSS Modules #23
### v2.6.0 (2015-12-12)
In this release we finaly are able to build and distribute our projects!
- [#3](https://github.com/pirelenito/sagui/issues/3) Implement build action to generate the compiled assets
- [#4](https://github.com/pirelenito/sagui/issues/4) Implement dist action to generate optimized assets ready for distribution
| +
+ Allow extensibility and customization of webpack and karma configurations.
+
+ #### Deprecated
+
+ Project specific configuration has moved from `package.json` to `sagui.config.js`.
+
+ #### Major changes:
+
+ - [#16](https://github.com/pirelenito/sagui/issues/16) Allow extensibility of webpack configuration
+ - [#23](https://github.com/pirelenito/sagui/pull/23) Add support for CSS Modules #23
+
+ ### v2.6.0 (2015-12-12)
In this release we finaly are able to build and distribute our projects!
- [#3](https://github.com/pirelenito/sagui/issues/3) Implement build action to generate the compiled assets
- [#4](https://github.com/pirelenito/sagui/issues/4) Implement dist action to generate optimized assets ready for distribution | 13 | 2.6 | 13 | 0 |
9f88b0825d3e161b26bbc4f077b5d14103e10cb3 | tests/test_gfsplit_gfcombine.sh | tests/test_gfsplit_gfcombine.sh |
HERE=$(pwd)
mkdir TOOL-TEST
cd TOOL-TEST
cleanup ()
{
cd $HERE
rm -rf TOOL-TEST
}
trap cleanup 0
cp ../libtool plaintext
../gfsplit -n 3 -m 5 plaintext cyphertext
SHARES=$(ls cyphertext.* | wc -l)
if [ "$SHARES" != 5 ]; then
echo "Share count created was not five"
exit 1
fi
SHARES=$(ls cyphertext.* | xargs)
to_test ()
{
RESULT=$1
SUBSHARES=$(echo $SHARES | cut -d\ -f$2)
WHOWHAT="$3"
../gfcombine $SUBSHARES
cmp -s plaintext cyphertext
if [ "$?" != "$RESULT" ]; then
echo $WHOWHAT
exit 1
fi
}
to_test 1 1-2 "Two shares didn't fail"
to_test 0 1-3 "Three shares didn't succeed"
to_test 0 2-4 "Three shares didn't succeed"
to_test 0 3-5 "Three shares didn't succeed"
exit 0
|
HERE=$(pwd)
mkdir TOOL-TEST
cd TOOL-TEST
cleanup ()
{
cd $HERE
rm -rf TOOL-TEST
}
trap cleanup 0
cp ../libtool plaintext
../gfsplit -n 3 -m 5 plaintext cyphertext
SHARES=$(ls cyphertext.* | wc -l | tr -d ' ')
if [ "$SHARES" != 5 ]; then
echo "Share count created was not five"
exit 1
fi
SHARES=$(ls cyphertext.* | xargs)
to_test ()
{
RESULT=$1
SUBSHARES=$(echo $SHARES | cut -d\ -f$2)
WHOWHAT="$3"
../gfcombine $SUBSHARES
cmp -s plaintext cyphertext
if [ "$?" != "$RESULT" ]; then
echo $WHOWHAT
exit 1
fi
}
to_test 1 1-2 "Two shares didn't fail"
to_test 0 1-3 "Three shares didn't succeed"
to_test 0 2-4 "Three shares didn't succeed"
to_test 0 3-5 "Three shares didn't succeed"
exit 0
| Remove spaces from the output of wc -l. | Remove spaces from the output of wc -l.
This fixes a bug where the test suite would fail on Mac OS X.
Thanks to Filippo Giunchedi for the patch.
| Shell | mit | periodic1236/libgfshare,jcushman/libgfshare,djpohly/libgfshare,periodic1236/libgfshare,jcushman/libgfshare,djpohly/libgfshare | shell | ## Code Before:
HERE=$(pwd)
mkdir TOOL-TEST
cd TOOL-TEST
cleanup ()
{
cd $HERE
rm -rf TOOL-TEST
}
trap cleanup 0
cp ../libtool plaintext
../gfsplit -n 3 -m 5 plaintext cyphertext
SHARES=$(ls cyphertext.* | wc -l)
if [ "$SHARES" != 5 ]; then
echo "Share count created was not five"
exit 1
fi
SHARES=$(ls cyphertext.* | xargs)
to_test ()
{
RESULT=$1
SUBSHARES=$(echo $SHARES | cut -d\ -f$2)
WHOWHAT="$3"
../gfcombine $SUBSHARES
cmp -s plaintext cyphertext
if [ "$?" != "$RESULT" ]; then
echo $WHOWHAT
exit 1
fi
}
to_test 1 1-2 "Two shares didn't fail"
to_test 0 1-3 "Three shares didn't succeed"
to_test 0 2-4 "Three shares didn't succeed"
to_test 0 3-5 "Three shares didn't succeed"
exit 0
## Instruction:
Remove spaces from the output of wc -l.
This fixes a bug where the test suite would fail on Mac OS X.
Thanks to Filippo Giunchedi for the patch.
## Code After:
HERE=$(pwd)
mkdir TOOL-TEST
cd TOOL-TEST
cleanup ()
{
cd $HERE
rm -rf TOOL-TEST
}
trap cleanup 0
cp ../libtool plaintext
../gfsplit -n 3 -m 5 plaintext cyphertext
SHARES=$(ls cyphertext.* | wc -l | tr -d ' ')
if [ "$SHARES" != 5 ]; then
echo "Share count created was not five"
exit 1
fi
SHARES=$(ls cyphertext.* | xargs)
to_test ()
{
RESULT=$1
SUBSHARES=$(echo $SHARES | cut -d\ -f$2)
WHOWHAT="$3"
../gfcombine $SUBSHARES
cmp -s plaintext cyphertext
if [ "$?" != "$RESULT" ]; then
echo $WHOWHAT
exit 1
fi
}
to_test 1 1-2 "Two shares didn't fail"
to_test 0 1-3 "Three shares didn't succeed"
to_test 0 2-4 "Three shares didn't succeed"
to_test 0 3-5 "Three shares didn't succeed"
exit 0
|
HERE=$(pwd)
mkdir TOOL-TEST
cd TOOL-TEST
cleanup ()
{
cd $HERE
rm -rf TOOL-TEST
}
trap cleanup 0
cp ../libtool plaintext
../gfsplit -n 3 -m 5 plaintext cyphertext
- SHARES=$(ls cyphertext.* | wc -l)
+ SHARES=$(ls cyphertext.* | wc -l | tr -d ' ')
? ++++++++++++
if [ "$SHARES" != 5 ]; then
echo "Share count created was not five"
exit 1
fi
SHARES=$(ls cyphertext.* | xargs)
to_test ()
{
RESULT=$1
SUBSHARES=$(echo $SHARES | cut -d\ -f$2)
WHOWHAT="$3"
../gfcombine $SUBSHARES
cmp -s plaintext cyphertext
if [ "$?" != "$RESULT" ]; then
echo $WHOWHAT
exit 1
fi
}
to_test 1 1-2 "Two shares didn't fail"
to_test 0 1-3 "Three shares didn't succeed"
to_test 0 2-4 "Three shares didn't succeed"
to_test 0 3-5 "Three shares didn't succeed"
exit 0
| 2 | 0.044444 | 1 | 1 |
05f151a026bcdc5f671af23025687dd40098e644 | app/controllers/concerns/webhook_validations.rb | app/controllers/concerns/webhook_validations.rb | module WebhookValidations
extend ActiveSupport::Concern
def verify_incoming_webhook_address!
if valid_incoming_webhook_address?
true
else
render :status => 404, :json => "{}"
end
end
def valid_incoming_webhook_address?
if Octokit.api_endpoint == "https://api.github.com/"
GithubSourceValidator.new(request.ip).valid?
else
true
end
end
end
| module WebhookValidations
extend ActiveSupport::Concern
def verify_incoming_webhook_address!
if valid_incoming_webhook_address?
true
else
render :json => {}, :status => :not_found
end
end
def valid_incoming_webhook_address?
if Octokit.api_endpoint == "https://api.github.com/"
GithubSourceValidator.new(request.ip).valid?
else
true
end
end
end
| Use :not_found instead of 404 | Use :not_found instead of 404
| Ruby | mit | ResultadosDigitais/newrelic_notifier,flowdock/heaven,ngpestelos/heaven,cloudy9101/heaven,kidaa/heaven,digideskio/heaven,cloudy9101/heaven,flowdock/heaven,waysact/heaven,digideskio/heaven,digideskio/heaven,waysact/heaven,eLobato/heaven,ackimwilliams/heaven,flowdock/heaven,LoveMondays/heaven,dLobatog/heaven,TailorDev/heaven,atmos/heaven,rnaveiras/heaven,pulibrary/heaven,shift/heaven,markpundsack/heaven,atmos/heaven,rothsa/heaven,waysact/heaven,sharetribe/heaven,maletor/heaven,markpundsack/heaven,n3tr/heaven,LoveMondays/heaven,sharetribe/heaven,kidaa/heaven,n3tr/heaven,shift/heaven,travis-ci/heaven,jaisonerick/heaven,LoveMondays/heaven,ResultadosDigitais/newrelic_notifier,eLobato/heaven,dLobatog/heaven,ResultadosDigitais/heaven,rnaveiras/heaven,ackimwilliams/heaven,ngpestelos/heaven,rothsa/heaven,ngpestelos/heaven,rnaveiras/heaven,jaisonerick/heaven,TailorDev/heaven,shift/heaven,ackimwilliams/heaven,pulibrary/heaven,travis-ci/heaven,atmos/heaven,maletor/heaven,jaisonerick/heaven,TailorDev/heaven,travis-ci/heaven,maletor/heaven,pulibrary/heaven,n3tr/heaven,ResultadosDigitais/heaven,eLobato/heaven,kidaa/heaven,sharetribe/heaven,cloudy9101/heaven,markpundsack/heaven,ResultadosDigitais/heaven,dLobatog/heaven,ResultadosDigitais/newrelic_notifier,rothsa/heaven | ruby | ## Code Before:
module WebhookValidations
extend ActiveSupport::Concern
def verify_incoming_webhook_address!
if valid_incoming_webhook_address?
true
else
render :status => 404, :json => "{}"
end
end
def valid_incoming_webhook_address?
if Octokit.api_endpoint == "https://api.github.com/"
GithubSourceValidator.new(request.ip).valid?
else
true
end
end
end
## Instruction:
Use :not_found instead of 404
## Code After:
module WebhookValidations
extend ActiveSupport::Concern
def verify_incoming_webhook_address!
if valid_incoming_webhook_address?
true
else
render :json => {}, :status => :not_found
end
end
def valid_incoming_webhook_address?
if Octokit.api_endpoint == "https://api.github.com/"
GithubSourceValidator.new(request.ip).valid?
else
true
end
end
end
| module WebhookValidations
extend ActiveSupport::Concern
def verify_incoming_webhook_address!
if valid_incoming_webhook_address?
true
else
- render :status => 404, :json => "{}"
+ render :json => {}, :status => :not_found
end
end
def valid_incoming_webhook_address?
if Octokit.api_endpoint == "https://api.github.com/"
GithubSourceValidator.new(request.ip).valid?
else
true
end
end
end | 2 | 0.105263 | 1 | 1 |
196c268ec7913ce0aba8afb6ec0eb27bc3abd325 | src/promise-array.js | src/promise-array.js | import {Observable} from 'rx';
export function asyncMap(array, selector, maxConcurrency=4) {
return Observable.from(array)
.map((k) =>
Observable.defer(() =>
Observable.fromPromise(selector(k))
.map((v) => ({ k, v }))))
.merge(maxConcurrency)
.reduce((acc, kvp) => {
acc[kvp.k] = kvp.v;
return acc;
}, {})
.toPromise();
}
| import {Observable} from 'rx';
const spawnOg = require('child_process').spawn;
export function asyncMap(array, selector, maxConcurrency=4) {
return Observable.from(array)
.map((k) =>
Observable.defer(() =>
Observable.fromPromise(selector(k))
.map((v) => ({ k, v }))))
.merge(maxConcurrency)
.reduce((acc, kvp) => {
acc[kvp.k] = kvp.v;
return acc;
}, {})
.toPromise();
}
export function delay(ms) {
return new Promise((resolve) => {
setTimeout(resolve, ms);
});
}
// Public: Maps a process's output into an {Observable}
//
// exe - The program to execute
// params - Arguments passed to the process
// opts - Options that will be passed to child_process.spawn
//
// Returns a {Promise} with a single value, that is the output of the
// spawned process
export function spawn(exe, params, opts=null) {
let spawnObs = Observable.create((subj) => {
let proc = null;
if (!opts) {
proc = spawnOg(exe, params);
} else {
proc = spawnOg(exe, params, opts);
}
let stdout = '';
let bufHandler = (b) => {
let chunk = b.toString();
stdout += chunk;
subj.onNext(chunk);
};
proc.stdout.on('data', bufHandler);
proc.stderr.on('data', bufHandler);
proc.on('error', (e) => subj.onError(e));
proc.on('close', (code) => {
if (code === 0) {
subj.onCompleted();
} else {
subj.onError(new Error(`Failed with exit code: ${code}\nOutput:\n${stdout}`));
}
});
});
return spawnObs.reduce((acc, x) => acc += x, '').toPromise();
}
| Copy over a promise version of spawn and delay | Copy over a promise version of spawn and delay
| JavaScript | mit | surf-build/surf,surf-build/surf,surf-build/surf,surf-build/surf | javascript | ## Code Before:
import {Observable} from 'rx';
export function asyncMap(array, selector, maxConcurrency=4) {
return Observable.from(array)
.map((k) =>
Observable.defer(() =>
Observable.fromPromise(selector(k))
.map((v) => ({ k, v }))))
.merge(maxConcurrency)
.reduce((acc, kvp) => {
acc[kvp.k] = kvp.v;
return acc;
}, {})
.toPromise();
}
## Instruction:
Copy over a promise version of spawn and delay
## Code After:
import {Observable} from 'rx';
const spawnOg = require('child_process').spawn;
export function asyncMap(array, selector, maxConcurrency=4) {
return Observable.from(array)
.map((k) =>
Observable.defer(() =>
Observable.fromPromise(selector(k))
.map((v) => ({ k, v }))))
.merge(maxConcurrency)
.reduce((acc, kvp) => {
acc[kvp.k] = kvp.v;
return acc;
}, {})
.toPromise();
}
export function delay(ms) {
return new Promise((resolve) => {
setTimeout(resolve, ms);
});
}
// Public: Maps a process's output into an {Observable}
//
// exe - The program to execute
// params - Arguments passed to the process
// opts - Options that will be passed to child_process.spawn
//
// Returns a {Promise} with a single value, that is the output of the
// spawned process
export function spawn(exe, params, opts=null) {
let spawnObs = Observable.create((subj) => {
let proc = null;
if (!opts) {
proc = spawnOg(exe, params);
} else {
proc = spawnOg(exe, params, opts);
}
let stdout = '';
let bufHandler = (b) => {
let chunk = b.toString();
stdout += chunk;
subj.onNext(chunk);
};
proc.stdout.on('data', bufHandler);
proc.stderr.on('data', bufHandler);
proc.on('error', (e) => subj.onError(e));
proc.on('close', (code) => {
if (code === 0) {
subj.onCompleted();
} else {
subj.onError(new Error(`Failed with exit code: ${code}\nOutput:\n${stdout}`));
}
});
});
return spawnObs.reduce((acc, x) => acc += x, '').toPromise();
}
| import {Observable} from 'rx';
+ const spawnOg = require('child_process').spawn;
export function asyncMap(array, selector, maxConcurrency=4) {
return Observable.from(array)
.map((k) =>
Observable.defer(() =>
Observable.fromPromise(selector(k))
.map((v) => ({ k, v }))))
.merge(maxConcurrency)
.reduce((acc, kvp) => {
acc[kvp.k] = kvp.v;
return acc;
}, {})
.toPromise();
}
+
+ export function delay(ms) {
+ return new Promise((resolve) => {
+ setTimeout(resolve, ms);
+ });
+ }
+
+ // Public: Maps a process's output into an {Observable}
+ //
+ // exe - The program to execute
+ // params - Arguments passed to the process
+ // opts - Options that will be passed to child_process.spawn
+ //
+ // Returns a {Promise} with a single value, that is the output of the
+ // spawned process
+ export function spawn(exe, params, opts=null) {
+ let spawnObs = Observable.create((subj) => {
+ let proc = null;
+
+ if (!opts) {
+ proc = spawnOg(exe, params);
+ } else {
+ proc = spawnOg(exe, params, opts);
+ }
+
+ let stdout = '';
+ let bufHandler = (b) => {
+ let chunk = b.toString();
+
+ stdout += chunk;
+ subj.onNext(chunk);
+ };
+
+ proc.stdout.on('data', bufHandler);
+ proc.stderr.on('data', bufHandler);
+ proc.on('error', (e) => subj.onError(e));
+
+ proc.on('close', (code) => {
+ if (code === 0) {
+ subj.onCompleted();
+ } else {
+ subj.onError(new Error(`Failed with exit code: ${code}\nOutput:\n${stdout}`));
+ }
+ });
+ });
+
+ return spawnObs.reduce((acc, x) => acc += x, '').toPromise();
+ } | 49 | 3.266667 | 49 | 0 |
2e1516ee89fb0af69733bab259cbb38a3f8a614c | latexbuild/latex_parse.py | latexbuild/latex_parse.py |
import re
######################################################################
# Latex escape regex constants
######################################################################
ESCAPE_CHARS = [r'\&', '%', r'\$', '#', '_', r'\{', r'\}', '~', r'\^', ]
ESCAPE_CHARS_OR = '[{}]'.format('|'.join(ESCAPE_CHARS))
REGEX_ESCAPE_CHARS = [
(re.compile(r"(?=[^\\])" + i), r"\\" + i.replace('\\', ''))
for i in ESCAPE_CHARS
]
REGEX_BACKSLASH = re.compile(r'\\(?!{})'.format(ESCAPE_CHARS_OR))
######################################################################
# Declare module functions
######################################################################
def escape_latex_str(string_text):
'''Escape a latex string'''
for regex, replace_text in REGEX_ESCAPE_CHARS:
string_text = re.sub(regex, replace_text, string_text)
string_text = re.sub(REGEX_BACKSLASH, r'\\\\', string_text)
return string_text
|
import re
######################################################################
# Latex escape regex constants
######################################################################
# Organize all latex escape characters in one list
# (EXCEPT FOR ( "\" ), which is handled separately)
# escaping those which are special characters in
# PERL regular expressions
ESCAPE_CHARS = [r'\&', '%', r'\$', '#', '_', r'\{', r'\}', '~', r'\^', ]
# For each latex escape character, create a regular expression
# that matches all of the following criteria
# 1) two characters
# 2) the first character is NOT a backslash ( "\" )
# 3) the second character is one of the latex escape characters
REGEX_ESCAPE_CHARS = [
(re.compile(r"(?=[^\\])" + i), r"\\" + i.replace('\\', ''))
for i in ESCAPE_CHARS
]
# Place escape characters in [] for "match any character" regex
ESCAPE_CHARS_OR = r'[{}\\]'.format(ESCAPE_CHARS)
# For the back slash, create a regular expression
# that matches all of the following criteria
# 1) three characters
# 2) the first character is not a backslash
# 3) the second characters is a backslash
# 4) the third character is none of the ESCAPE_CHARS,
# and is also not a backslash
REGEX_BACKSLASH = re.compile(r'(?!\\)\\(?!{})'.format(ESCAPE_CHARS_OR))
######################################################################
# Declare module functions
######################################################################
def escape_latex_str(string_text):
'''Escape a latex string'''
for regex, replace_text in REGEX_ESCAPE_CHARS:
string_text = re.sub(regex, replace_text, string_text)
string_text = re.sub(REGEX_BACKSLASH, r'\\\\', string_text)
return string_text
| Update latex parser to better-handle edge cases | Update latex parser to better-handle edge cases
| Python | mit | pappasam/latexbuild | python | ## Code Before:
import re
######################################################################
# Latex escape regex constants
######################################################################
ESCAPE_CHARS = [r'\&', '%', r'\$', '#', '_', r'\{', r'\}', '~', r'\^', ]
ESCAPE_CHARS_OR = '[{}]'.format('|'.join(ESCAPE_CHARS))
REGEX_ESCAPE_CHARS = [
(re.compile(r"(?=[^\\])" + i), r"\\" + i.replace('\\', ''))
for i in ESCAPE_CHARS
]
REGEX_BACKSLASH = re.compile(r'\\(?!{})'.format(ESCAPE_CHARS_OR))
######################################################################
# Declare module functions
######################################################################
def escape_latex_str(string_text):
'''Escape a latex string'''
for regex, replace_text in REGEX_ESCAPE_CHARS:
string_text = re.sub(regex, replace_text, string_text)
string_text = re.sub(REGEX_BACKSLASH, r'\\\\', string_text)
return string_text
## Instruction:
Update latex parser to better-handle edge cases
## Code After:
import re
######################################################################
# Latex escape regex constants
######################################################################
# Organize all latex escape characters in one list
# (EXCEPT FOR ( "\" ), which is handled separately)
# escaping those which are special characters in
# PERL regular expressions
ESCAPE_CHARS = [r'\&', '%', r'\$', '#', '_', r'\{', r'\}', '~', r'\^', ]
# For each latex escape character, create a regular expression
# that matches all of the following criteria
# 1) two characters
# 2) the first character is NOT a backslash ( "\" )
# 3) the second character is one of the latex escape characters
REGEX_ESCAPE_CHARS = [
(re.compile(r"(?=[^\\])" + i), r"\\" + i.replace('\\', ''))
for i in ESCAPE_CHARS
]
# Place escape characters in [] for "match any character" regex
ESCAPE_CHARS_OR = r'[{}\\]'.format(ESCAPE_CHARS)
# For the back slash, create a regular expression
# that matches all of the following criteria
# 1) three characters
# 2) the first character is not a backslash
# 3) the second characters is a backslash
# 4) the third character is none of the ESCAPE_CHARS,
# and is also not a backslash
REGEX_BACKSLASH = re.compile(r'(?!\\)\\(?!{})'.format(ESCAPE_CHARS_OR))
######################################################################
# Declare module functions
######################################################################
def escape_latex_str(string_text):
'''Escape a latex string'''
for regex, replace_text in REGEX_ESCAPE_CHARS:
string_text = re.sub(regex, replace_text, string_text)
string_text = re.sub(REGEX_BACKSLASH, r'\\\\', string_text)
return string_text
|
import re
######################################################################
# Latex escape regex constants
######################################################################
+
+ # Organize all latex escape characters in one list
+ # (EXCEPT FOR ( "\" ), which is handled separately)
+ # escaping those which are special characters in
+ # PERL regular expressions
ESCAPE_CHARS = [r'\&', '%', r'\$', '#', '_', r'\{', r'\}', '~', r'\^', ]
- ESCAPE_CHARS_OR = '[{}]'.format('|'.join(ESCAPE_CHARS))
+
+ # For each latex escape character, create a regular expression
+ # that matches all of the following criteria
+ # 1) two characters
+ # 2) the first character is NOT a backslash ( "\" )
+ # 3) the second character is one of the latex escape characters
REGEX_ESCAPE_CHARS = [
(re.compile(r"(?=[^\\])" + i), r"\\" + i.replace('\\', ''))
for i in ESCAPE_CHARS
]
+
+ # Place escape characters in [] for "match any character" regex
+ ESCAPE_CHARS_OR = r'[{}\\]'.format(ESCAPE_CHARS)
+
+ # For the back slash, create a regular expression
+ # that matches all of the following criteria
+ # 1) three characters
+ # 2) the first character is not a backslash
+ # 3) the second characters is a backslash
+ # 4) the third character is none of the ESCAPE_CHARS,
+ # and is also not a backslash
- REGEX_BACKSLASH = re.compile(r'\\(?!{})'.format(ESCAPE_CHARS_OR))
+ REGEX_BACKSLASH = re.compile(r'(?!\\)\\(?!{})'.format(ESCAPE_CHARS_OR))
? ++++++
######################################################################
# Declare module functions
######################################################################
def escape_latex_str(string_text):
'''Escape a latex string'''
for regex, replace_text in REGEX_ESCAPE_CHARS:
string_text = re.sub(regex, replace_text, string_text)
string_text = re.sub(REGEX_BACKSLASH, r'\\\\', string_text)
return string_text | 25 | 1.086957 | 23 | 2 |
0d777cd08cb3b1cbc3e25614ee78b1760fbf5339 | _config.yml | _config.yml | title: Potteries Hackspace
description: A clean minimal Theme for Jekyll
email: admin@potterieshackspace.org
baseurl: /
url: "http://potterieshackspace.github.io"
#favicon
favicon: /favicon.png
#feed
feedburner_username: maangalabs
#twitter
twitter_username: pranavrajs
github_username: pranavrajs
#linkedin
linkedin_username: pranavrajs
#google analytics
google_analytics_id : UA-53386152-2
#gplus
gplus_username: 101893920934936418705
#fb_username
fb_username: pranav.tayberrycreative
#permalink /blog/2014/11/11/this-is-a-sample-post
permalink: /blog/:year/:month/:day/:title
# Build settings
markdown: kramdown
#Disqus configuration
disqus_short_name: maangalabscom
disqus_show_comment_count: true
#Can be used for multiple author blogs ! Add name to post For more visit the sample
authors:
pranavrajs:
display_name: Pranav Raj S
email: pranavrajs@gmail.com
fb: pranav.tayberrycreative
linkedin: pranavrajs
twitter: pranavrajs
github: pranavrajs
| title: Potteries Hackspace
description: The Potteries Hackspace website.
email: admin@potterieshackspace.org
baseurl: /
url: "http://potterieshackspace.github.io"
favicon: /favicon.png
feedburner_username:
twitter_username: potterieshacks
github_username: PotteriesHackspace
google_analytics_id:
gplus_username:
fb_username:
permalink: /blog/:year/:month/:day/:title
markdown: kramdown
disqus_short_name: potterieshackspace
disqus_show_comment_count: true
authors:
jpswade:
display_name: James Wade
email: jpswade@gmail.com
fb: jpswade
linkedin: jpswade
twitter: jpswade
github: jpswade
| Update config to reflect hackspace | Update config to reflect hackspace
| YAML | mit | PotteriesHackspace/potterieshackspace.github.com,PotteriesHackspace/potterieshackspace.github.com,PotteriesHackspace/potterieshackspace.github.com,PotteriesHackspace/potterieshackspace.github.com,PotteriesHackspace/potterieshackspace.github.com | yaml | ## Code Before:
title: Potteries Hackspace
description: A clean minimal Theme for Jekyll
email: admin@potterieshackspace.org
baseurl: /
url: "http://potterieshackspace.github.io"
#favicon
favicon: /favicon.png
#feed
feedburner_username: maangalabs
#twitter
twitter_username: pranavrajs
github_username: pranavrajs
#linkedin
linkedin_username: pranavrajs
#google analytics
google_analytics_id : UA-53386152-2
#gplus
gplus_username: 101893920934936418705
#fb_username
fb_username: pranav.tayberrycreative
#permalink /blog/2014/11/11/this-is-a-sample-post
permalink: /blog/:year/:month/:day/:title
# Build settings
markdown: kramdown
#Disqus configuration
disqus_short_name: maangalabscom
disqus_show_comment_count: true
#Can be used for multiple author blogs ! Add name to post For more visit the sample
authors:
pranavrajs:
display_name: Pranav Raj S
email: pranavrajs@gmail.com
fb: pranav.tayberrycreative
linkedin: pranavrajs
twitter: pranavrajs
github: pranavrajs
## Instruction:
Update config to reflect hackspace
## Code After:
title: Potteries Hackspace
description: The Potteries Hackspace website.
email: admin@potterieshackspace.org
baseurl: /
url: "http://potterieshackspace.github.io"
favicon: /favicon.png
feedburner_username:
twitter_username: potterieshacks
github_username: PotteriesHackspace
google_analytics_id:
gplus_username:
fb_username:
permalink: /blog/:year/:month/:day/:title
markdown: kramdown
disqus_short_name: potterieshackspace
disqus_show_comment_count: true
authors:
jpswade:
display_name: James Wade
email: jpswade@gmail.com
fb: jpswade
linkedin: jpswade
twitter: jpswade
github: jpswade
| title: Potteries Hackspace
- description: A clean minimal Theme for Jekyll
+ description: The Potteries Hackspace website.
email: admin@potterieshackspace.org
baseurl: /
url: "http://potterieshackspace.github.io"
- #favicon
favicon: /favicon.png
-
- #feed
- feedburner_username: maangalabs
? -----------
+ feedburner_username:
+ twitter_username: potterieshacks
+ github_username: PotteriesHackspace
- #twitter
- twitter_username: pranavrajs
- github_username: pranavrajs
-
- #linkedin
- linkedin_username: pranavrajs
-
- #google analytics
? - ^
+ google_analytics_id:
? ^ ++++
+ gplus_username:
- google_analytics_id : UA-53386152-2
- #gplus
- gplus_username: 101893920934936418705
-
- #fb_username
? -
+ fb_username:
? +
- fb_username: pranav.tayberrycreative
-
- #permalink /blog/2014/11/11/this-is-a-sample-post
permalink: /blog/:year/:month/:day/:title
- # Build settings
markdown: kramdown
+ disqus_short_name: potterieshackspace
-
- #Disqus configuration
- disqus_short_name: maangalabscom
disqus_show_comment_count: true
-
- #Can be used for multiple author blogs ! Add name to post For more visit the sample
authors:
- pranavrajs:
- display_name: Pranav Raj S
+ jpswade:
+ display_name: James Wade
- email: pranavrajs@gmail.com
? ^ ^^^^^^^
+ email: jpswade@gmail.com
? + ^^ ^^
- fb: pranav.tayberrycreative
- linkedin: pranavrajs
- twitter: pranavrajs
- github: pranavrajs
+ fb: jpswade
+ linkedin: jpswade
+ twitter: jpswade
+ github: jpswade | 49 | 1.139535 | 15 | 34 |
34a2dffa04f89bc7b270c744b7d59db339aec56a | docs/_templates/layout.html | docs/_templates/layout.html | {% extends "!layout.html" %}
{% block rootrellink %}
<li><a href="http://www.satchmoproject.com">Satchmo home </a> | </li>
<li><a href="{{ pathto('index') }}">{{ project }} v{{ release }} documentation </a> »</li>
{% endblock %}
{% block beforerelbar %}
<div class="logo">
<img src="{{ pathto("_static/satchmo-front.jpg", 1) }}">
</div>
{% endblock %}
| {% extends "!layout.html" %}
{% block rootrellink %}
<li><a href="http://www.satchmoproject.com">Satchmo home </a> | </li>
<li><a href="{{ pathto('index') }}">{{ project }} v{{ release }} documentation </a> »</li>
{% endblock %}
{% block beforerelbar %}
<div class="logo">
<img src="{{ pathto("_static/satchmo-front.jpg", 1) }}">
</div>
{% endblock %}
{%- block afterfooter %}
<script src='http://www.google-analytics.com/ga.js' type='text/javascript'></script>
<script type="text/javascript">
var pageTracker = _gat._getTracker("UA-1711397-1");
pageTracker._initData();
pageTracker._trackPageview();
</script>
{% endblock %}
| Update docs to include google tracker. | Update docs to include google tracker.
| HTML | bsd-3-clause | grengojbo/satchmo,grengojbo/satchmo | html | ## Code Before:
{% extends "!layout.html" %}
{% block rootrellink %}
<li><a href="http://www.satchmoproject.com">Satchmo home </a> | </li>
<li><a href="{{ pathto('index') }}">{{ project }} v{{ release }} documentation </a> »</li>
{% endblock %}
{% block beforerelbar %}
<div class="logo">
<img src="{{ pathto("_static/satchmo-front.jpg", 1) }}">
</div>
{% endblock %}
## Instruction:
Update docs to include google tracker.
## Code After:
{% extends "!layout.html" %}
{% block rootrellink %}
<li><a href="http://www.satchmoproject.com">Satchmo home </a> | </li>
<li><a href="{{ pathto('index') }}">{{ project }} v{{ release }} documentation </a> »</li>
{% endblock %}
{% block beforerelbar %}
<div class="logo">
<img src="{{ pathto("_static/satchmo-front.jpg", 1) }}">
</div>
{% endblock %}
{%- block afterfooter %}
<script src='http://www.google-analytics.com/ga.js' type='text/javascript'></script>
<script type="text/javascript">
var pageTracker = _gat._getTracker("UA-1711397-1");
pageTracker._initData();
pageTracker._trackPageview();
</script>
{% endblock %}
| {% extends "!layout.html" %}
{% block rootrellink %}
<li><a href="http://www.satchmoproject.com">Satchmo home </a> | </li>
<li><a href="{{ pathto('index') }}">{{ project }} v{{ release }} documentation </a> »</li>
{% endblock %}
{% block beforerelbar %}
<div class="logo">
<img src="{{ pathto("_static/satchmo-front.jpg", 1) }}">
</div>
{% endblock %}
-
+ {%- block afterfooter %}
+ <script src='http://www.google-analytics.com/ga.js' type='text/javascript'></script>
+ <script type="text/javascript">
+ var pageTracker = _gat._getTracker("UA-1711397-1");
+ pageTracker._initData();
+ pageTracker._trackPageview();
+ </script>
+ {% endblock %} | 9 | 0.642857 | 8 | 1 |
bb2c338baf4dacdca91841a4995a81b21786f80b | d03/d03s02/src/main/java/net/safedata/springboot/training/d03/s01/handler/PostLogoutHandler.java | d03/d03s02/src/main/java/net/safedata/springboot/training/d03/s01/handler/PostLogoutHandler.java | package net.safedata.springboot.training.d03.s01.handler;
import org.springframework.security.core.Authentication;
import org.springframework.security.web.authentication.logout.LogoutHandler;
import org.springframework.stereotype.Component;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
@Component
public class PostLogoutHandler implements LogoutHandler {
@Override
public void logout(final HttpServletRequest httpServletRequest, final HttpServletResponse httpServletResponse,
final Authentication authentication) {
// perform any post-logout operations
}
}
| package net.safedata.springboot.training.d03.s01.handler;
import org.springframework.security.core.Authentication;
import org.springframework.security.web.authentication.logout.LogoutHandler;
import org.springframework.stereotype.Component;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
@Component
public class PostLogoutHandler implements LogoutHandler {
@Override
public void logout(final HttpServletRequest httpServletRequest,
final HttpServletResponse httpServletResponse,
final Authentication authentication) {
// perform any post-logout operations
}
}
| Split the parameter on the next line | [improve] Split the parameter on the next line
| Java | apache-2.0 | bogdansolga/spring-boot-training,bogdansolga/spring-boot-training | java | ## Code Before:
package net.safedata.springboot.training.d03.s01.handler;
import org.springframework.security.core.Authentication;
import org.springframework.security.web.authentication.logout.LogoutHandler;
import org.springframework.stereotype.Component;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
@Component
public class PostLogoutHandler implements LogoutHandler {
@Override
public void logout(final HttpServletRequest httpServletRequest, final HttpServletResponse httpServletResponse,
final Authentication authentication) {
// perform any post-logout operations
}
}
## Instruction:
[improve] Split the parameter on the next line
## Code After:
package net.safedata.springboot.training.d03.s01.handler;
import org.springframework.security.core.Authentication;
import org.springframework.security.web.authentication.logout.LogoutHandler;
import org.springframework.stereotype.Component;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
@Component
public class PostLogoutHandler implements LogoutHandler {
@Override
public void logout(final HttpServletRequest httpServletRequest,
final HttpServletResponse httpServletResponse,
final Authentication authentication) {
// perform any post-logout operations
}
}
| package net.safedata.springboot.training.d03.s01.handler;
import org.springframework.security.core.Authentication;
import org.springframework.security.web.authentication.logout.LogoutHandler;
import org.springframework.stereotype.Component;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
@Component
public class PostLogoutHandler implements LogoutHandler {
@Override
- public void logout(final HttpServletRequest httpServletRequest, final HttpServletResponse httpServletResponse,
+ public void logout(final HttpServletRequest httpServletRequest,
+ final HttpServletResponse httpServletResponse,
final Authentication authentication) {
// perform any post-logout operations
}
} | 3 | 0.166667 | 2 | 1 |
00516264a7b6a5a132ab4f0f7bed9dd671f12553 | README.md | README.md |
A Clojure library designed to ... well, that part is up to you.
## Usage
FIXME
## License
Copyright © 2015 FIXME
Distributed under the Eclipse Public License either version 1.0 or (at
your option) any later version.
|
A port of the Om Next Auto-Completion Widget to React Native.
## Usage
This is a Natal based app, so that stuff applies. But, in short you should be able to build the ClojureScript and then Run the Xcode app.
## Demo
<center style="overflow:hidden;"><br/><a href='https://www.youtube.com/watch?v=jCAbgiHd9T4'><img src="http://img.youtube.com/vi/jCAbgiHd9T4/0.jpg" style="width: 350px; max-width: 95%; margin:-56px 0px -57px 0px;"/></a><br/><br/></center> | Update to include text and demo | Update to include text and demo
| Markdown | epl-1.0 | mfikes/auto-completion,mfikes/auto-completion,mfikes/auto-completion | markdown | ## Code Before:
A Clojure library designed to ... well, that part is up to you.
## Usage
FIXME
## License
Copyright © 2015 FIXME
Distributed under the Eclipse Public License either version 1.0 or (at
your option) any later version.
## Instruction:
Update to include text and demo
## Code After:
A port of the Om Next Auto-Completion Widget to React Native.
## Usage
This is a Natal based app, so that stuff applies. But, in short you should be able to build the ClojureScript and then Run the Xcode app.
## Demo
<center style="overflow:hidden;"><br/><a href='https://www.youtube.com/watch?v=jCAbgiHd9T4'><img src="http://img.youtube.com/vi/jCAbgiHd9T4/0.jpg" style="width: 350px; max-width: 95%; margin:-56px 0px -57px 0px;"/></a><br/><br/></center> |
- A Clojure library designed to ... well, that part is up to you.
+ A port of the Om Next Auto-Completion Widget to React Native.
## Usage
- FIXME
+ This is a Natal based app, so that stuff applies. But, in short you should be able to build the ClojureScript and then Run the Xcode app.
- ## License
+ ## Demo
+ <center style="overflow:hidden;"><br/><a href='https://www.youtube.com/watch?v=jCAbgiHd9T4'><img src="http://img.youtube.com/vi/jCAbgiHd9T4/0.jpg" style="width: 350px; max-width: 95%; margin:-56px 0px -57px 0px;"/></a><br/><br/></center>
- Copyright © 2015 FIXME
-
- Distributed under the Eclipse Public License either version 1.0 or (at
- your option) any later version. | 11 | 0.846154 | 4 | 7 |
f9190b50facf283faeb25fb77798ceb60b4b5ae8 | src/renderers/shaders/ShaderChunk/lights_physical_fragment.glsl | src/renderers/shaders/ShaderChunk/lights_physical_fragment.glsl | PhysicalMaterial material;
#ifdef STANDARD_SG
material.diffuseColor = diffuseColor.rgb;
material.specularRoughness = clamp( 1.0 - glossinessFactor, 0.04, 1.0 );
material.specularColor = specular2Factor.rgb;
#else
material.diffuseColor = diffuseColor.rgb * ( 1.0 - metalnessFactor );
material.specularRoughness = clamp( roughnessFactor, 0.04, 1.0 );
#ifdef STANDARD
material.specularColor = mix( vec3( DEFAULT_SPECULAR_COEFFICIENT ), diffuseColor.rgb, metalnessFactor );
#else
material.specularColor = mix( vec3( MAXIMUM_SPECULAR_COEFFICIENT * pow2( reflectivity ) ), diffuseColor.rgb, metalnessFactor );
material.clearCoat = saturate( clearCoat ); // Burley clearcoat model
material.clearCoatRoughness = clamp( clearCoatRoughness, 0.04, 1.0 );
#endif
#endif
| PhysicalMaterial material;
#ifdef STANDARD_SG
material.diffuseColor = diffuseColor.rgb * ( 1.0 - max( max( specular2Factor.r, specular2Factor.g ), specular2Factor.b ) );
material.specularRoughness = clamp( 1.0 - glossinessFactor, 0.04, 1.0 );
material.specularColor = specular2Factor.rgb;
#else
material.diffuseColor = diffuseColor.rgb * ( 1.0 - metalnessFactor );
material.specularRoughness = clamp( roughnessFactor, 0.04, 1.0 );
#ifdef STANDARD
material.specularColor = mix( vec3( DEFAULT_SPECULAR_COEFFICIENT ), diffuseColor.rgb, metalnessFactor );
#else
material.specularColor = mix( vec3( MAXIMUM_SPECULAR_COEFFICIENT * pow2( reflectivity ) ), diffuseColor.rgb, metalnessFactor );
material.clearCoat = saturate( clearCoat ); // Burley clearcoat model
material.clearCoatRoughness = clamp( clearCoatRoughness, 0.04, 1.0 );
#endif
#endif
| Revert "Fix diffuse shader code of Specular-Glossiness" | Revert "Fix diffuse shader code of Specular-Glossiness"
This reverts commit 8360b532643264f817e9d648b72e43c20f111ba5.
| GLSL | mit | Mugen87/three.js,SpinVR/three.js,sherousee/three.js,cadenasgmbh/three.js,matgr1/three.js,spite/three.js,Samsy/three.js,framelab/three.js,Hectate/three.js,ndebeiss/three.js,jpweeks/three.js,opensim-org/three.js,ndebeiss/three.js,WestLangley/three.js,rfm1201/rfm.three.js,brianchirls/three.js,carlosanunes/three.js,sttz/three.js,jee7/three.js,mese79/three.js,colombod/three.js,nhalloran/three.js,mhoangvslev/data-visualizer,fyoudine/three.js,sttz/three.js,ondys/three.js,jeffgoku/three.js,mese79/three.js,matgr1/three.js,Samsung-Blue/three.js,Astrak/three.js,RemusMar/three.js,Samsung-Blue/three.js,fanzhanggoogle/three.js,aardgoose/three.js,satori99/three.js,shinate/three.js,kaisalmen/three.js,Itee/three.js,Amritesh/three.js,QingchaoHu/three.js,framelab/three.js,Amritesh/three.js,ondys/three.js,rfm1201/rfm.three.js,nhalloran/three.js,satori99/three.js,pailhead/three.js,julapy/three.js,06wj/three.js,fernandojsg/three.js,mhoangvslev/data-visualizer,amakaroff82/three.js,mhoangvslev/data-visualizer,nhalloran/three.js,TristanVALCKE/three.js,colombod/three.js,RemusMar/three.js,colombod/three.js,mhoangvslev/data-visualizer,greggman/three.js,daoshengmu/three.js,colombod/three.js,jostschmithals/three.js,jee7/three.js,nhalloran/three.js,Samsy/three.js,jee7/three.js,cadenasgmbh/three.js,ngokevin/three-dev,zhoushijie163/three.js,jostschmithals/three.js,satori99/three.js,shanmugamss/three.js-modified,amakaroff82/three.js,shinate/three.js,mrdoob/three.js,googlecreativelab/three.js,arodic/three.js,Astrak/three.js,ngokevin/three-dev,ngokevin/three-dev,spite/three.js,framelab/three.js,zhoushijie163/three.js,Aldrien-/three.js,daoshengmu/three.js,QingchaoHu/three.js,mese79/three.js,spite/three.js,daoshengmu/three.js,ngokevin/three-dev,amakaroff82/three.js,stanford-gfx/three.js,looeee/three.js,Samsung-Blue/three.js,carlosanunes/three.js,ValtoLibraries/ThreeJS,ondys/three.js,mese79/three.js,googlecreativelab/three.js,Astrak/three.js,Samsy/three.js,colombod/three.js,ondys/three.js,
jostschmithals/three.js,fanzhanggoogle/three.js,nhalloran/three.js,amakaroff82/three.js,Mugen87/three.js,looeee/three.js,ndebeiss/three.js,fraguada/three.js,brianchirls/three.js,ndebeiss/three.js,fraguada/three.js,WestLangley/three.js,shanmugamss/three.js-modified,satori99/three.js,arodic/three.js,jostschmithals/three.js,shanmugamss/three.js-modified,sherousee/three.js,RemusMar/three.js,matgr1/three.js,colombod/three.js,amakaroff82/three.js,SpinVR/three.js,fraguada/three.js,googlecreativelab/three.js,arodic/three.js,carlosanunes/three.js,ValtoLibraries/ThreeJS,daoshengmu/three.js,TristanVALCKE/three.js,ValtoLibraries/ThreeJS,06wj/three.js,googlecreativelab/three.js,Ludobaka/three.js,nhalloran/three.js,shinate/three.js,pailhead/three.js,fanzhanggoogle/three.js,mhoangvslev/data-visualizer,fernandojsg/three.js,Ludobaka/three.js,shanmugamss/three.js-modified,gero3/three.js,Samsy/three.js,Hectate/three.js,googlecreativelab/three.js,donmccurdy/three.js,sasha240100/three.js,jeffgoku/three.js,Ludobaka/three.js,brianchirls/three.js,jeffgoku/three.js,fernandojsg/three.js,mese79/three.js,hsimpson/three.js,brianchirls/three.js,Samsy/three.js,Astrak/three.js,fraguada/three.js,fyoudine/three.js,makc/three.js.fork,ngokevin/three.js,gero3/three.js,daoshengmu/three.js,sasha240100/three.js,Samsy/three.js,greggman/three.js,ValtoLibraries/ThreeJS,ngokevin/three.js,jostschmithals/three.js,Aldrien-/three.js,spite/three.js,RemusMar/three.js,Liuer/three.js,satori99/three.js,ngokevin/three.js,sasha240100/three.js,spite/three.js,Astrak/three.js,TristanVALCKE/three.js,jeffgoku/three.js,shanmugamss/three.js-modified,stanford-gfx/three.js,julapy/three.js,arodic/three.js,Ludobaka/three.js,mrdoob/three.js,ndebeiss/three.js,Samsung-Blue/three.js,daoshengmu/three.js,sasha240100/three.js,TristanVALCKE/three.js,sasha240100/three.js,fraguada/three.js,hsimpson/three.js,carlosanunes/three.js,sherousee/three.js,TristanVALCKE/three.js,Astrak/three.js,kaisalmen/three.js,ngokevin/three-dev,brianchirls/three
.js,matgr1/three.js,Itee/three.js,Mugen87/three.js,amakaroff82/three.js,ngokevin/three.js,pailhead/three.js,rfm1201/rfm.three.js,carlosanunes/three.js,Hectate/three.js,shinate/three.js,carlosanunes/three.js,ValtoLibraries/ThreeJS,opensim-org/three.js,shinate/three.js,Aldrien-/three.js,shanmugamss/three.js-modified,Hectate/three.js,Liuer/three.js,fraguada/three.js,Amritesh/three.js,mhoangvslev/data-visualizer,TristanVALCKE/three.js,matgr1/three.js,Samsung-Blue/three.js,sasha240100/three.js,jostschmithals/three.js,RemusMar/three.js,satori99/three.js,jeffgoku/three.js,makc/three.js.fork,mese79/three.js,Aldrien-/three.js,hsimpson/three.js,arodic/three.js,ondys/three.js,fanzhanggoogle/three.js,shinate/three.js,Ludobaka/three.js,RemusMar/three.js,Hectate/three.js,ngokevin/three-dev,fanzhanggoogle/three.js,matgr1/three.js,Ludobaka/three.js,donmccurdy/three.js,Aldrien-/three.js,ngokevin/three.js,brianchirls/three.js,ndebeiss/three.js,spite/three.js,jeffgoku/three.js,ngokevin/three.js,ValtoLibraries/ThreeJS,arodic/three.js,julapy/three.js,googlecreativelab/three.js,fanzhanggoogle/three.js,Hectate/three.js,Samsung-Blue/three.js,aardgoose/three.js,ondys/three.js,Aldrien-/three.js,jpweeks/three.js | glsl | ## Code Before:
PhysicalMaterial material;
#ifdef STANDARD_SG
material.diffuseColor = diffuseColor.rgb;
material.specularRoughness = clamp( 1.0 - glossinessFactor, 0.04, 1.0 );
material.specularColor = specular2Factor.rgb;
#else
material.diffuseColor = diffuseColor.rgb * ( 1.0 - metalnessFactor );
material.specularRoughness = clamp( roughnessFactor, 0.04, 1.0 );
#ifdef STANDARD
material.specularColor = mix( vec3( DEFAULT_SPECULAR_COEFFICIENT ), diffuseColor.rgb, metalnessFactor );
#else
material.specularColor = mix( vec3( MAXIMUM_SPECULAR_COEFFICIENT * pow2( reflectivity ) ), diffuseColor.rgb, metalnessFactor );
material.clearCoat = saturate( clearCoat ); // Burley clearcoat model
material.clearCoatRoughness = clamp( clearCoatRoughness, 0.04, 1.0 );
#endif
#endif
## Instruction:
Revert "Fix diffuse shader code of Specular-Glossiness"
This reverts commit 8360b532643264f817e9d648b72e43c20f111ba5.
## Code After:
PhysicalMaterial material;
#ifdef STANDARD_SG
material.diffuseColor = diffuseColor.rgb * ( 1.0 - max( max( specular2Factor.r, specular2Factor.g ), specular2Factor.b ) );
material.specularRoughness = clamp( 1.0 - glossinessFactor, 0.04, 1.0 );
material.specularColor = specular2Factor.rgb;
#else
material.diffuseColor = diffuseColor.rgb * ( 1.0 - metalnessFactor );
material.specularRoughness = clamp( roughnessFactor, 0.04, 1.0 );
#ifdef STANDARD
material.specularColor = mix( vec3( DEFAULT_SPECULAR_COEFFICIENT ), diffuseColor.rgb, metalnessFactor );
#else
material.specularColor = mix( vec3( MAXIMUM_SPECULAR_COEFFICIENT * pow2( reflectivity ) ), diffuseColor.rgb, metalnessFactor );
material.clearCoat = saturate( clearCoat ); // Burley clearcoat model
material.clearCoatRoughness = clamp( clearCoatRoughness, 0.04, 1.0 );
#endif
#endif
| PhysicalMaterial material;
#ifdef STANDARD_SG
- material.diffuseColor = diffuseColor.rgb;
+ material.diffuseColor = diffuseColor.rgb * ( 1.0 - max( max( specular2Factor.r, specular2Factor.g ), specular2Factor.b ) );
material.specularRoughness = clamp( 1.0 - glossinessFactor, 0.04, 1.0 );
material.specularColor = specular2Factor.rgb;
#else
material.diffuseColor = diffuseColor.rgb * ( 1.0 - metalnessFactor );
material.specularRoughness = clamp( roughnessFactor, 0.04, 1.0 );
#ifdef STANDARD
material.specularColor = mix( vec3( DEFAULT_SPECULAR_COEFFICIENT ), diffuseColor.rgb, metalnessFactor );
#else
material.specularColor = mix( vec3( MAXIMUM_SPECULAR_COEFFICIENT * pow2( reflectivity ) ), diffuseColor.rgb, metalnessFactor );
material.clearCoat = saturate( clearCoat ); // Burley clearcoat model
material.clearCoatRoughness = clamp( clearCoatRoughness, 0.04, 1.0 );
#endif
#endif | 2 | 0.117647 | 1 | 1 |
48138f451a9dd8b6f87bfba1a22c1636c4edbaa7 | README.md | README.md |
A [Slim](http://slim-lang.com) template parser in Elixir.
__Under very active development.__
```slim
#id.class
p Hello World
```
```html
<div id="id" class="class">
<p>Hello World</p>
</div>
```
## Using
```elixir
html = SlimFast.render(slim)
```
## Todo
+ [ ] EEx support
+ [ ] String interpolation
+ [x] Attributes
+ [ ] Javascript tag support
+ [ ] HTML escaping
## License
Please see [LICENSE](https://github.com/doomspork/slim_fast/blob/master/LICENSE) for licensing details.
|
A [Slim](http://slim-lang.com) template parser in Elixir.
__Under very active development.__
```slim
#id.class
p Hello World
```
```html
<div id="id" class="class">
<p>Hello World</p>
</div>
```
## Using
```elixir
html = SlimFast.render(slim)
```
## License
Please see [LICENSE](https://github.com/doomspork/slim_fast/blob/master/LICENSE) for licensing details.
| Remove TODO list in favor of GH issues | Remove TODO list in favor of GH issues
| Markdown | mit | Rakoth/slime,slime-lang/slime | markdown | ## Code Before:
A [Slim](http://slim-lang.com) template parser in Elixir.
__Under very active development.__
```slim
#id.class
p Hello World
```
```html
<div id="id" class="class">
<p>Hello World</p>
</div>
```
## Using
```elixir
html = SlimFast.render(slim)
```
## Todo
+ [ ] EEx support
+ [ ] String interpolation
+ [x] Attributes
+ [ ] Javascript tag support
+ [ ] HTML escaping
## License
Please see [LICENSE](https://github.com/doomspork/slim_fast/blob/master/LICENSE) for licensing details.
## Instruction:
Remove TODO list in favor of GH issues
## Code After:
A [Slim](http://slim-lang.com) template parser in Elixir.
__Under very active development.__
```slim
#id.class
p Hello World
```
```html
<div id="id" class="class">
<p>Hello World</p>
</div>
```
## Using
```elixir
html = SlimFast.render(slim)
```
## License
Please see [LICENSE](https://github.com/doomspork/slim_fast/blob/master/LICENSE) for licensing details.
|
A [Slim](http://slim-lang.com) template parser in Elixir.
__Under very active development.__
```slim
#id.class
p Hello World
```
```html
<div id="id" class="class">
<p>Hello World</p>
</div>
```
## Using
```elixir
html = SlimFast.render(slim)
```
- ## Todo
-
- + [ ] EEx support
- + [ ] String interpolation
- + [x] Attributes
- + [ ] Javascript tag support
- + [ ] HTML escaping
-
## License
Please see [LICENSE](https://github.com/doomspork/slim_fast/blob/master/LICENSE) for licensing details. | 8 | 0.242424 | 0 | 8 |