commit: stringlengths 40–40
old_file: stringlengths 4–184
new_file: stringlengths 4–184
old_contents: stringlengths 1–3.6k
new_contents: stringlengths 5–3.38k
subject: stringlengths 15–778
message: stringlengths 16–6.74k
lang: stringclasses (201 values)
license: stringclasses (13 values)
repos: stringlengths 6–116k
config: stringclasses (201 values)
content: stringlengths 137–7.24k
diff: stringlengths 26–5.55k
diff_length: int64 1–123
relative_diff_length: float64 0.01–89
n_lines_added: int64 0–108
n_lines_deleted: int64 0–106
9bc1231e06e0a3a4099bf0643996da0c71d430a3
README.md
README.md
BatchBundle =========== Batch architecture bundle inspired by Spring Batch [![Scrutinizer Quality Score](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/quality-score.png?s=c191e29ba7ff6838205b395eeae3f2d9a027b8d7)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/) [![Code Coverage](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/coverage.png?s=6011e77b1d402f3deadf3334408bea971dcefbb1)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/)
BatchBundle =========== Batch architecture bundle inspired by Spring Batch. See http://docs.spring.io/spring-batch/reference/html/domain.html for an explanation of the Domain Language objects you will find in this BatchBundle. [![Scrutinizer Quality Score](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/quality-score.png?s=c191e29ba7ff6838205b395eeae3f2d9a027b8d7)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/) [![Code Coverage](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/coverage.png?s=6011e77b1d402f3deadf3334408bea971dcefbb1)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/)
Add a note on Spring Batch inspiration
Add a note on Spring Batch inspiration
Markdown
mit
rseliberty/BatchBundle,trustify/BatchBundle,rseliberty/BatchBundle,trustify/BatchBundle
markdown
## Code Before: BatchBundle =========== Batch architecture bundle inspired by Spring Batch [![Scrutinizer Quality Score](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/quality-score.png?s=c191e29ba7ff6838205b395eeae3f2d9a027b8d7)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/) [![Code Coverage](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/coverage.png?s=6011e77b1d402f3deadf3334408bea971dcefbb1)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/) ## Instruction: Add a note on Spring Batch inspiration ## Code After: BatchBundle =========== Batch architecture bundle inspired by Spring Batch. See http://docs.spring.io/spring-batch/reference/html/domain.html for an explanation of the Domain Language objects you will find in this BatchBundle. [![Scrutinizer Quality Score](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/quality-score.png?s=c191e29ba7ff6838205b395eeae3f2d9a027b8d7)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/) [![Code Coverage](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/coverage.png?s=6011e77b1d402f3deadf3334408bea971dcefbb1)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/)
BatchBundle =========== - Batch architecture bundle inspired by Spring Batch + Batch architecture bundle inspired by Spring Batch. ? + - [![Scrutinizer Quality Score](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/quality-score.png?s=c191e29ba7ff6838205b395eeae3f2d9a027b8d7)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/) [![Code Coverage](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/coverage.png?s=6011e77b1d402f3deadf3334408bea971dcefbb1)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/) + See http://docs.spring.io/spring-batch/reference/html/domain.html for an explanation of the Domain Language objects you will find in this BatchBundle. + + [![Scrutinizer Quality Score](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/quality-score.png?s=c191e29ba7ff6838205b395eeae3f2d9a027b8d7)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/) + [![Code Coverage](https://scrutinizer-ci.com/g/akeneo/BatchBundle/badges/coverage.png?s=6011e77b1d402f3deadf3334408bea971dcefbb1)](https://scrutinizer-ci.com/g/akeneo/BatchBundle/)
7
1.166667
5
2
104ba7a089eeb303499ea5739ab3789c2b8080b5
docs/source/developing.rst
docs/source/developing.rst
Developing kardboard ===================== Quickstart ------------ To get a local version of kardboard up and running suitable for developing against, you can follow this quickstart guide. .. code-block:: bash # Install python, virtualenv and mongodb using your favorite system package manager here. # aptitude install gcc python2.6 python2.6-dev python-virtualenv redis-server mongodb # These may differ on your local environment. In particular, a different python version # may be present locally. Use of the python and python-dev metapackages on debian is encouraged. # If using OSX, you should at least install mongo and redis. We recommend using Homebrew_. .. _Homebrew: https://github.com/mxcl/homebrew # brew install mongodb redis # Get the source, using your own fork most likely git clone git@github.com:cmheisel/kardboard.git # Make a virtualenv cd kardboard virtualenv .kve # Turn it on source ./.kve/bin/activate # Install the requirements pip install -r requirements.txt # Start mongo and drop it into the background mkdir var mongod --fork --logpath=./var/mongo.log --dbpath=./var/ # Start redis (only if you're running celery) redis-server /usr/local/etc/redis.conf # Start the celery process python kardboard/manage.py celeryd -B # Start the server python kardboard/runserver.py
Developing kardboard ===================== Quickstart ------------ To get a local version of kardboard up and running suitable for developing against, you can follow this quickstart guide. .. code-block:: bash # Install python, virtualenv and mongodb using your favorite system package manager here. # aptitude install gcc python2.6 python2.6-dev python-virtualenv redis-server mongodb # These may differ on your local environment. In particular, a different python version # may be present locally. Use of the python and python-dev metapackages on debian is encouraged. # If using OSX, you should at least install mongo and redis. We recommend using Homebrew_. .. _Homebrew: https://github.com/mxcl/homebrew # brew install mongodb redis # Get the source, using your own fork most likely git clone git@github.com:cmheisel/kardboard.git # Make a virtualenv cd kardboard virtualenv .kve # Turn it on source ./.kve/bin/activate # Install the requirements pip install -r requirements.txt # Start mongo and drop it into the background mkdir var mongod --fork --logpath=./var/mongo.log --dbpath=./var/ # Start redis (only if you're running celery) redis-server /usr/local/etc/redis.conf # Ensure the module is on sys.path # Failed to import module kardboard errors occur otherwise. python setup.py develop # Start the celery process python kardboard/manage.py celeryd -B # Start the server python kardboard/runserver.py
Update getting started docs re: module path setup.
Update getting started docs re: module path setup.
reStructuredText
mit
cmheisel/kardboard,cmheisel/kardboard,cmheisel/kardboard,cmheisel/kardboard
restructuredtext
## Code Before: Developing kardboard ===================== Quickstart ------------ To get a local version of kardboard up and running suitable for developing against, you can follow this quickstart guide. .. code-block:: bash # Install python, virtualenv and mongodb using your favorite system package manager here. # aptitude install gcc python2.6 python2.6-dev python-virtualenv redis-server mongodb # These may differ on your local environment. In particular, a different python version # may be present locally. Use of the python and python-dev metapackages on debian is encouraged. # If using OSX, you should at least install mongo and redis. We recommend using Homebrew_. .. _Homebrew: https://github.com/mxcl/homebrew # brew install mongodb redis # Get the source, using your own fork most likely git clone git@github.com:cmheisel/kardboard.git # Make a virtualenv cd kardboard virtualenv .kve # Turn it on source ./.kve/bin/activate # Install the requirements pip install -r requirements.txt # Start mongo and drop it into the background mkdir var mongod --fork --logpath=./var/mongo.log --dbpath=./var/ # Start redis (only if you're running celery) redis-server /usr/local/etc/redis.conf # Start the celery process python kardboard/manage.py celeryd -B # Start the server python kardboard/runserver.py ## Instruction: Update getting started docs re: module path setup. ## Code After: Developing kardboard ===================== Quickstart ------------ To get a local version of kardboard up and running suitable for developing against, you can follow this quickstart guide. .. code-block:: bash # Install python, virtualenv and mongodb using your favorite system package manager here. # aptitude install gcc python2.6 python2.6-dev python-virtualenv redis-server mongodb # These may differ on your local environment. In particular, a different python version # may be present locally. Use of the python and python-dev metapackages on debian is encouraged. # If using OSX, you should at least install mongo and redis. We recommend using Homebrew_. .. _Homebrew: https://github.com/mxcl/homebrew # brew install mongodb redis # Get the source, using your own fork most likely git clone git@github.com:cmheisel/kardboard.git # Make a virtualenv cd kardboard virtualenv .kve # Turn it on source ./.kve/bin/activate # Install the requirements pip install -r requirements.txt # Start mongo and drop it into the background mkdir var mongod --fork --logpath=./var/mongo.log --dbpath=./var/ # Start redis (only if you're running celery) redis-server /usr/local/etc/redis.conf # Ensure the module is on sys.path # Failed to import module kardboard errors occur otherwise. python setup.py develop # Start the celery process python kardboard/manage.py celeryd -B # Start the server python kardboard/runserver.py
Developing kardboard ===================== Quickstart ------------ To get a local version of kardboard up and running suitable for developing against, you can follow this quickstart guide. .. code-block:: bash # Install python, virtualenv and mongodb using your favorite system package manager here. # aptitude install gcc python2.6 python2.6-dev python-virtualenv redis-server mongodb # These may differ on your local environment. In particular, a different python version # may be present locally. Use of the python and python-dev metapackages on debian is encouraged. # If using OSX, you should at least install mongo and redis. We recommend using Homebrew_. .. _Homebrew: https://github.com/mxcl/homebrew # brew install mongodb redis # Get the source, using your own fork most likely git clone git@github.com:cmheisel/kardboard.git # Make a virtualenv cd kardboard virtualenv .kve # Turn it on source ./.kve/bin/activate # Install the requirements pip install -r requirements.txt # Start mongo and drop it into the background mkdir var mongod --fork --logpath=./var/mongo.log --dbpath=./var/ # Start redis (only if you're running celery) redis-server /usr/local/etc/redis.conf + # Ensure the module is on sys.path + # Failed to import module kardboard errors occur otherwise. + python setup.py develop + # Start the celery process python kardboard/manage.py celeryd -B # Start the server python kardboard/runserver.py
4
0.090909
4
0
ddec6067054cc4408ac174e3ea4ffeca2a962201
regulations/views/notice_home.py
regulations/views/notice_home.py
from __future__ import unicode_literals from operator import itemgetter import logging from django.http import Http404 from django.template.response import TemplateResponse from django.views.generic.base import View from regulations.generator.api_reader import ApiReader from regulations.views.preamble import ( notice_data, CommentState) logger = logging.getLogger(__name__) class NoticeHomeView(View): """ Basic view that provides a list of regulations and notices to the context. """ template_name = None # We should probably have a default notice template. def get(self, request, *args, **kwargs): notices = ApiReader().notices().get("results", []) context = {} notices_meta = [] for notice in notices: try: if notice.get("document_number"): _, meta, _ = notice_data(notice["document_number"]) notices_meta.append(meta) except Http404: pass notices_meta = sorted(notices_meta, key=itemgetter("publication_date"), reverse=True) context["notices"] = notices_meta # Django templates won't show contents of CommentState as an Enum, so: context["comment_state"] = {state.name: state.value for state in CommentState} assert self.template_name template = self.template_name return TemplateResponse(request=request, template=template, context=context)
from __future__ import unicode_literals from operator import itemgetter import logging from django.http import Http404 from django.template.response import TemplateResponse from django.views.generic.base import View from regulations.generator.api_reader import ApiReader from regulations.views.preamble import ( notice_data, CommentState) logger = logging.getLogger(__name__) class NoticeHomeView(View): """ Basic view that provides a list of regulations and notices to the context. """ template_name = None # We should probably have a default notice template. def get(self, request, *args, **kwargs): notices = ApiReader().notices().get("results", []) context = {} notices_meta = [] for notice in notices: try: if notice.get("document_number"): _, meta, _ = notice_data(notice["document_number"]) notices_meta.append(meta) except Http404: pass notices_meta = sorted(notices_meta, key=itemgetter("publication_date"), reverse=True) context["notices"] = notices_meta # Django templates won't show contents of CommentState as an Enum, so: context["comment_state"] = {state.name: state.value for state in CommentState} template = self.template_name return TemplateResponse(request=request, template=template, context=context)
Remove unnecessary assert from view for Notice home.
Remove unnecessary assert from view for Notice home.
Python
cc0-1.0
18F/regulations-site,18F/regulations-site,eregs/regulations-site,tadhg-ohiggins/regulations-site,tadhg-ohiggins/regulations-site,tadhg-ohiggins/regulations-site,eregs/regulations-site,eregs/regulations-site,eregs/regulations-site,tadhg-ohiggins/regulations-site,18F/regulations-site,18F/regulations-site
python
## Code Before: from __future__ import unicode_literals from operator import itemgetter import logging from django.http import Http404 from django.template.response import TemplateResponse from django.views.generic.base import View from regulations.generator.api_reader import ApiReader from regulations.views.preamble import ( notice_data, CommentState) logger = logging.getLogger(__name__) class NoticeHomeView(View): """ Basic view that provides a list of regulations and notices to the context. """ template_name = None # We should probably have a default notice template. def get(self, request, *args, **kwargs): notices = ApiReader().notices().get("results", []) context = {} notices_meta = [] for notice in notices: try: if notice.get("document_number"): _, meta, _ = notice_data(notice["document_number"]) notices_meta.append(meta) except Http404: pass notices_meta = sorted(notices_meta, key=itemgetter("publication_date"), reverse=True) context["notices"] = notices_meta # Django templates won't show contents of CommentState as an Enum, so: context["comment_state"] = {state.name: state.value for state in CommentState} assert self.template_name template = self.template_name return TemplateResponse(request=request, template=template, context=context) ## Instruction: Remove unnecessary assert from view for Notice home. ## Code After: from __future__ import unicode_literals from operator import itemgetter import logging from django.http import Http404 from django.template.response import TemplateResponse from django.views.generic.base import View from regulations.generator.api_reader import ApiReader from regulations.views.preamble import ( notice_data, CommentState) logger = logging.getLogger(__name__) class NoticeHomeView(View): """ Basic view that provides a list of regulations and notices to the context. """ template_name = None # We should probably have a default notice template. def get(self, request, *args, **kwargs): notices = ApiReader().notices().get("results", []) context = {} notices_meta = [] for notice in notices: try: if notice.get("document_number"): _, meta, _ = notice_data(notice["document_number"]) notices_meta.append(meta) except Http404: pass notices_meta = sorted(notices_meta, key=itemgetter("publication_date"), reverse=True) context["notices"] = notices_meta # Django templates won't show contents of CommentState as an Enum, so: context["comment_state"] = {state.name: state.value for state in CommentState} template = self.template_name return TemplateResponse(request=request, template=template, context=context)
from __future__ import unicode_literals from operator import itemgetter import logging from django.http import Http404 from django.template.response import TemplateResponse from django.views.generic.base import View from regulations.generator.api_reader import ApiReader from regulations.views.preamble import ( notice_data, CommentState) logger = logging.getLogger(__name__) class NoticeHomeView(View): """ Basic view that provides a list of regulations and notices to the context. """ template_name = None # We should probably have a default notice template. def get(self, request, *args, **kwargs): notices = ApiReader().notices().get("results", []) context = {} notices_meta = [] for notice in notices: try: if notice.get("document_number"): _, meta, _ = notice_data(notice["document_number"]) notices_meta.append(meta) except Http404: pass notices_meta = sorted(notices_meta, key=itemgetter("publication_date"), reverse=True) context["notices"] = notices_meta # Django templates won't show contents of CommentState as an Enum, so: context["comment_state"] = {state.name: state.value for state in CommentState} - assert self.template_name template = self.template_name return TemplateResponse(request=request, template=template, context=context)
1
0.02
0
1
9004e0e41a58b5af376d169bf91d320ce6705b36
adapters/Vungle/Public/Headers/VungleAdNetworkExtras.h
adapters/Vungle/Public/Headers/VungleAdNetworkExtras.h
// Copyright 2019 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. #import <Foundation/Foundation.h> #import <GoogleMobileAds/GADAdNetworkExtras.h> @interface VungleAdNetworkExtras : NSObject<GADAdNetworkExtras> /*! * @brief NSString with user identifier that will be passed if the ad is incentivized. * @discussion Optional. The value passed as 'user' in the an incentivized server-to-server call. */ @property(nonatomic, copy) NSString *_Nullable userId; /*! * @brief Controls whether presented ads will start in a muted state or not. */ @property(nonatomic, assign) BOOL muted; @property(nonatomic, assign) NSUInteger ordinal; @property(nonatomic, assign) NSTimeInterval flexViewAutoDismissSeconds; @property(nonatomic, copy) NSArray<NSString *> *_Nullable allPlacements; @property(nonatomic, copy) NSString *_Nullable playingPlacement; @end
// Copyright 2019 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. #import <Foundation/Foundation.h> #import <GoogleMobileAds/GADAdNetworkExtras.h> @interface VungleAdNetworkExtras : NSObject<GADAdNetworkExtras> /*! * @brief NSString with user identifier that will be passed if the ad is incentivized. * @discussion Optional. The value passed as 'user' in the an incentivized server-to-server call. */ @property(nonatomic, copy) NSString *_Nullable userId; /*! * @brief Controls whether presented ads will start in a muted state or not. */ @property(nonatomic, assign) BOOL muted; @property(nonatomic, assign) NSUInteger ordinal; @property(nonatomic, assign) NSTimeInterval flexViewAutoDismissSeconds; @property(nonatomic, copy) NSArray<NSString *> *_Nullable allPlacements; @property(nonatomic, copy) NSString *_Nullable playingPlacement; @property(nonatomic, copy) NSNumber *_Nullable orientations; @end
Add video orientation to play ad option
Add video orientation to play ad option
C
apache-2.0
googleads/googleads-mobile-ios-mediation,googleads/googleads-mobile-ios-mediation,googleads/googleads-mobile-ios-mediation,googleads/googleads-mobile-ios-mediation
c
## Code Before: // Copyright 2019 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. #import <Foundation/Foundation.h> #import <GoogleMobileAds/GADAdNetworkExtras.h> @interface VungleAdNetworkExtras : NSObject<GADAdNetworkExtras> /*! * @brief NSString with user identifier that will be passed if the ad is incentivized. * @discussion Optional. The value passed as 'user' in the an incentivized server-to-server call. */ @property(nonatomic, copy) NSString *_Nullable userId; /*! * @brief Controls whether presented ads will start in a muted state or not. */ @property(nonatomic, assign) BOOL muted; @property(nonatomic, assign) NSUInteger ordinal; @property(nonatomic, assign) NSTimeInterval flexViewAutoDismissSeconds; @property(nonatomic, copy) NSArray<NSString *> *_Nullable allPlacements; @property(nonatomic, copy) NSString *_Nullable playingPlacement; @end ## Instruction: Add video orientation to play ad option ## Code After: // Copyright 2019 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. #import <Foundation/Foundation.h> #import <GoogleMobileAds/GADAdNetworkExtras.h> @interface VungleAdNetworkExtras : NSObject<GADAdNetworkExtras> /*! * @brief NSString with user identifier that will be passed if the ad is incentivized. * @discussion Optional. The value passed as 'user' in the an incentivized server-to-server call. */ @property(nonatomic, copy) NSString *_Nullable userId; /*! * @brief Controls whether presented ads will start in a muted state or not. */ @property(nonatomic, assign) BOOL muted; @property(nonatomic, assign) NSUInteger ordinal; @property(nonatomic, assign) NSTimeInterval flexViewAutoDismissSeconds; @property(nonatomic, copy) NSArray<NSString *> *_Nullable allPlacements; @property(nonatomic, copy) NSString *_Nullable playingPlacement; @property(nonatomic, copy) NSNumber *_Nullable orientations; @end
// Copyright 2019 Google LLC // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. #import <Foundation/Foundation.h> #import <GoogleMobileAds/GADAdNetworkExtras.h> @interface VungleAdNetworkExtras : NSObject<GADAdNetworkExtras> /*! * @brief NSString with user identifier that will be passed if the ad is incentivized. * @discussion Optional. The value passed as 'user' in the an incentivized server-to-server call. */ @property(nonatomic, copy) NSString *_Nullable userId; /*! * @brief Controls whether presented ads will start in a muted state or not. */ @property(nonatomic, assign) BOOL muted; @property(nonatomic, assign) NSUInteger ordinal; @property(nonatomic, assign) NSTimeInterval flexViewAutoDismissSeconds; @property(nonatomic, copy) NSArray<NSString *> *_Nullable allPlacements; @property(nonatomic, copy) NSString *_Nullable playingPlacement; + @property(nonatomic, copy) NSNumber *_Nullable orientations; + @end
2
0.05
2
0
7a8092db0d6eef16756666779fcf3691f3f0185d
troposphere/static/js/components/projects/detail/tableData/volume/Status.react.js
troposphere/static/js/components/projects/detail/tableData/volume/Status.react.js
/** @jsx React.DOM */ define( [ 'react', 'backbone' ], function (React, Backbone) { return React.createClass({ propTypes: { volume: React.PropTypes.instanceOf(Backbone.Model).isRequired, instances: React.PropTypes.instanceOf(Backbone.Collection).isRequired }, render: function () { var status = this.props.volume.get('status'), placeholderMessage = status; if(status === "available"){ placeholderMessage = "Unattached"; }else if(status === "in-use"){ var attachData = this.props.volume.get('attach_data'); var instance = this.props.instances.get(attachData.instance_id); placeholderMessage = "Attached to " + instance.get('name') + " as device " + attachData.device; } return ( <span> {placeholderMessage} </span> ); } }); });
/** @jsx React.DOM */ define( [ 'react', 'backbone' ], function (React, Backbone) { return React.createClass({ propTypes: { volume: React.PropTypes.instanceOf(Backbone.Model).isRequired, instances: React.PropTypes.instanceOf(Backbone.Collection).isRequired }, render: function () { var status = this.props.volume.get('status'), placeholderMessage = status; if(status === "available"){ placeholderMessage = "Unattached"; }else if(status === "in-use"){ var attachData = this.props.volume.get('attach_data'); var instance = this.props.instances.get(attachData.instance_id); if(instance) { placeholderMessage = "Attached to " + instance.get('name') + " as device " + attachData.device; }else{ placeholderMessage = "Attached to instance outside project (" + attachData.instance_id + ")"; } } return ( <span> {placeholderMessage} </span> ); } }); });
Add message when volume attached to instance outside project
Add message when volume attached to instance outside project
JavaScript
apache-2.0
CCI-MOC/GUI-Frontend,CCI-MOC/GUI-Frontend,CCI-MOC/GUI-Frontend,CCI-MOC/GUI-Frontend,CCI-MOC/GUI-Frontend
javascript
## Code Before: /** @jsx React.DOM */ define( [ 'react', 'backbone' ], function (React, Backbone) { return React.createClass({ propTypes: { volume: React.PropTypes.instanceOf(Backbone.Model).isRequired, instances: React.PropTypes.instanceOf(Backbone.Collection).isRequired }, render: function () { var status = this.props.volume.get('status'), placeholderMessage = status; if(status === "available"){ placeholderMessage = "Unattached"; }else if(status === "in-use"){ var attachData = this.props.volume.get('attach_data'); var instance = this.props.instances.get(attachData.instance_id); placeholderMessage = "Attached to " + instance.get('name') + " as device " + attachData.device; } return ( <span> {placeholderMessage} </span> ); } }); }); ## Instruction: Add message when volume attached to instance outside project ## Code After: /** @jsx React.DOM */ define( [ 'react', 'backbone' ], function (React, Backbone) { return React.createClass({ propTypes: { volume: React.PropTypes.instanceOf(Backbone.Model).isRequired, instances: React.PropTypes.instanceOf(Backbone.Collection).isRequired }, render: function () { var status = this.props.volume.get('status'), placeholderMessage = status; if(status === "available"){ placeholderMessage = "Unattached"; }else if(status === "in-use"){ var attachData = this.props.volume.get('attach_data'); var instance = this.props.instances.get(attachData.instance_id); if(instance) { placeholderMessage = "Attached to " + instance.get('name') + " as device " + attachData.device; }else{ placeholderMessage = "Attached to instance outside project (" + attachData.instance_id + ")"; } } return ( <span> {placeholderMessage} </span> ); } }); });
/** @jsx React.DOM */ define( [ 'react', 'backbone' ], function (React, Backbone) { return React.createClass({ propTypes: { volume: React.PropTypes.instanceOf(Backbone.Model).isRequired, instances: React.PropTypes.instanceOf(Backbone.Collection).isRequired }, render: function () { var status = this.props.volume.get('status'), placeholderMessage = status; if(status === "available"){ placeholderMessage = "Unattached"; }else if(status === "in-use"){ var attachData = this.props.volume.get('attach_data'); var instance = this.props.instances.get(attachData.instance_id); + if(instance) { - placeholderMessage = "Attached to " + instance.get('name') + " as device " + attachData.device; + placeholderMessage = "Attached to " + instance.get('name') + " as device " + attachData.device; ? ++ + }else{ + placeholderMessage = "Attached to instance outside project (" + attachData.instance_id + ")"; + } } return ( <span> {placeholderMessage} </span> ); } }); });
6
0.157895
5
1
d8afb3509f893f10cf6570d17573cd991e07a1f1
spec/payload/message_parser_spec.rb
spec/payload/message_parser_spec.rb
require 'spec_helper' describe Magnum::Payload::MessageParser do let(:klass) { class Klass ; include Magnum::Payload::MessageParser ; end } let(:subject) { klass.new } describe '#skip_message?' do it 'returns true when message contains "ci-skip"' do subject.stub(:message).and_return('Commit message [ci-skip]') subject.skip_message?.should eq true end it 'returns true when message contains "ci skip"' do subject.stub(:message).and_return('Commit message [ci skip]') subject.skip_message?.should eq true end it 'returns true when message contains "skip ci"' do subject.stub(:message).and_return('Commit message [skip ci]') subject.skip_message?.should eq true end it 'returns true when message contains "skip-ci"' do subject.stub(:message).and_return('Commit message [skip-ci]') subject.skip_message?.should eq true end it 'returns false if no skip points found' do subject.stub(:message).and_return('Commit message') subject.skip_message?.should eq false end context 'with multi-line message' do it 'returns true' do subject.stub(:message).and_return("Commit message [skip-ci]\nCommit comments") subject.skip_message?.should eq true end it 'returns false' do subject.stub(:message).and_return("Commit message\nLets skip [ci-skip]") subject.skip_message?.should eq false end end end end
require "spec_helper" describe Magnum::Payload::MessageParser do let(:klass) { class Klass ; include Magnum::Payload::MessageParser ; end } let(:subject) { klass.new } describe "#skip_message?" do it "returns true when message contains "ci-skip"" do subject.stub(:message).and_return("Commit message [ci-skip]") subject.skip_message?.should eq true end it "returns true when message contains "ci skip"" do subject.stub(:message).and_return("Commit message [ci skip]") subject.skip_message?.should eq true end it "returns true when message contains "skip ci"" do subject.stub(:message).and_return("Commit message [skip ci]") subject.skip_message?.should eq true end it "returns true when message contains "skip-ci"" do subject.stub(:message).and_return("Commit message [skip-ci]") subject.skip_message?.should eq true end it "returns false if no skip points found" do subject.stub(:message).and_return("Commit message") subject.skip_message?.should eq false end context "with multi-line message" do it "returns true" do subject.stub(:message).and_return("Commit message [skip-ci]\nCommit comments") subject.skip_message?.should eq true end it "returns false" do subject.stub(:message).and_return("Commit message\nLets skip [ci-skip]") subject.skip_message?.should eq false end end end end
Use double quotes in message spec
Use double quotes in message spec
Ruby
mit
magnumci/magnum-payload
ruby
## Code Before:
require 'spec_helper'

describe Magnum::Payload::MessageParser do
  let(:klass) { class Klass ; include Magnum::Payload::MessageParser ; end }
  let(:subject) { klass.new }

  describe '#skip_message?' do
    it 'returns true when message contains "ci-skip"' do
      subject.stub(:message).and_return('Commit message [ci-skip]')
      subject.skip_message?.should eq true
    end

    it 'returns true when message contains "ci skip"' do
      subject.stub(:message).and_return('Commit message [ci skip]')
      subject.skip_message?.should eq true
    end

    it 'returns true when message contains "skip ci"' do
      subject.stub(:message).and_return('Commit message [skip ci]')
      subject.skip_message?.should eq true
    end

    it 'returns true when message contains "skip-ci"' do
      subject.stub(:message).and_return('Commit message [skip-ci]')
      subject.skip_message?.should eq true
    end

    it 'returns false if no skip points found' do
      subject.stub(:message).and_return('Commit message')
      subject.skip_message?.should eq false
    end

    context 'with multi-line message' do
      it 'returns true' do
        subject.stub(:message).and_return("Commit message [skip-ci]\nCommit comments")
        subject.skip_message?.should eq true
      end

      it 'returns false' do
        subject.stub(:message).and_return("Commit message\nLets skip [ci-skip]")
        subject.skip_message?.should eq false
      end
    end
  end
end

## Instruction:
Use double quotes in message spec

## Code After:
require "spec_helper"

describe Magnum::Payload::MessageParser do
  let(:klass) { class Klass ; include Magnum::Payload::MessageParser ; end }
  let(:subject) { klass.new }

  describe "#skip_message?" do
    it "returns true when message contains "ci-skip"" do
      subject.stub(:message).and_return("Commit message [ci-skip]")
      subject.skip_message?.should eq true
    end

    it "returns true when message contains "ci skip"" do
      subject.stub(:message).and_return("Commit message [ci skip]")
      subject.skip_message?.should eq true
    end

    it "returns true when message contains "skip ci"" do
      subject.stub(:message).and_return("Commit message [skip ci]")
      subject.skip_message?.should eq true
    end

    it "returns true when message contains "skip-ci"" do
      subject.stub(:message).and_return("Commit message [skip-ci]")
      subject.skip_message?.should eq true
    end

    it "returns false if no skip points found" do
      subject.stub(:message).and_return("Commit message")
      subject.skip_message?.should eq false
    end

    context "with multi-line message" do
      it "returns true" do
        subject.stub(:message).and_return("Commit message [skip-ci]\nCommit comments")
        subject.skip_message?.should eq true
      end

      it "returns false" do
        subject.stub(:message).and_return("Commit message\nLets skip [ci-skip]")
        subject.skip_message?.should eq false
      end
    end
  end
end
- require 'spec_helper'
?         ^           ^
+ require "spec_helper"
?         ^           ^

  describe Magnum::Payload::MessageParser do
    let(:klass) { class Klass ; include Magnum::Payload::MessageParser ; end }
    let(:subject) { klass.new }

-   describe '#skip_message?' do
?            ^               ^
+   describe "#skip_message?" do
?            ^               ^
-     it 'returns true when message contains "ci-skip"' do
?        ^                                            ^
+     it "returns true when message contains "ci-skip"" do
?        ^                                            ^
-       subject.stub(:message).and_return('Commit message [ci-skip]')
?                                         ^                        ^
+       subject.stub(:message).and_return("Commit message [ci-skip]")
?                                         ^                        ^
        subject.skip_message?.should eq true
      end

-     it 'returns true when message contains "ci skip"' do
?        ^                                            ^
+     it "returns true when message contains "ci skip"" do
?        ^                                            ^
-       subject.stub(:message).and_return('Commit message [ci skip]')
?                                         ^                        ^
+       subject.stub(:message).and_return("Commit message [ci skip]")
?                                         ^                        ^
        subject.skip_message?.should eq true
      end

-     it 'returns true when message contains "skip ci"' do
?        ^                                            ^
+     it "returns true when message contains "skip ci"" do
?        ^                                            ^
-       subject.stub(:message).and_return('Commit message [skip ci]')
?                                         ^                        ^
+       subject.stub(:message).and_return("Commit message [skip ci]")
?                                         ^                        ^
        subject.skip_message?.should eq true
      end

-     it 'returns true when message contains "skip-ci"' do
?        ^                                            ^
+     it "returns true when message contains "skip-ci"" do
?        ^                                            ^
-       subject.stub(:message).and_return('Commit message [skip-ci]')
?                                         ^                        ^
+       subject.stub(:message).and_return("Commit message [skip-ci]")
?                                         ^                        ^
        subject.skip_message?.should eq true
      end

-     it 'returns false if no skip points found' do
?        ^                                     ^
+     it "returns false if no skip points found" do
?        ^                                     ^
-       subject.stub(:message).and_return('Commit message')
?                                         ^              ^
+       subject.stub(:message).and_return("Commit message")
?                                         ^              ^
        subject.skip_message?.should eq false
      end

-     context 'with multi-line message' do
?             ^                       ^
+     context "with multi-line message" do
?             ^                       ^
-       it 'returns true' do
?          ^            ^
+       it "returns true" do
?          ^            ^
          subject.stub(:message).and_return("Commit message [skip-ci]\nCommit comments")
          subject.skip_message?.should eq true
        end

-       it 'returns false' do
?          ^             ^
+       it "returns false" do
?          ^             ^
          subject.stub(:message).and_return("Commit message\nLets skip [ci-skip]")
          subject.skip_message?.should eq false
        end
      end
    end
  end
30
0.666667
15
15
aa16f98ba5fb48c21d9ecef4b6e9045a47e7cdcc
data/transition-sites/ea_publications.yml
data/transition-sites/ea_publications.yml
---
site: ea_publications
whitehall_slug: environment-agency
title: Environment Agency
redirection_date: 1st April 2014
homepage: https://www.gov.uk/government/organisations/environment-agency
tna_timestamp: 20130903163836
host: publications.environment-agency.gov.uk
furl: www.gov.uk/environment-agency
options: --query-string id
---
site: ea_publications
whitehall_slug: environment-agency
title: Environment Agency
redirection_date: 1st April 2014
homepage: https://www.gov.uk/government/organisations/environment-agency
tna_timestamp: 20130903163836
host: publications.environment-agency.gov.uk
furl: www.gov.uk/environment-agency
options: --query-string id
aliases:
- brand.environment-agency.gov.uk
Add alias to EA publications library config
Add alias to EA publications library config
YAML
mit
alphagov/transition-config,alphagov/transition-config
yaml
## Code Before:
---
site: ea_publications
whitehall_slug: environment-agency
title: Environment Agency
redirection_date: 1st April 2014
homepage: https://www.gov.uk/government/organisations/environment-agency
tna_timestamp: 20130903163836
host: publications.environment-agency.gov.uk
furl: www.gov.uk/environment-agency
options: --query-string id

## Instruction:
Add alias to EA publications library config

## Code After:
---
site: ea_publications
whitehall_slug: environment-agency
title: Environment Agency
redirection_date: 1st April 2014
homepage: https://www.gov.uk/government/organisations/environment-agency
tna_timestamp: 20130903163836
host: publications.environment-agency.gov.uk
furl: www.gov.uk/environment-agency
options: --query-string id
aliases:
- brand.environment-agency.gov.uk
  ---
  site: ea_publications
  whitehall_slug: environment-agency
  title: Environment Agency
  redirection_date: 1st April 2014
  homepage: https://www.gov.uk/government/organisations/environment-agency
  tna_timestamp: 20130903163836
  host: publications.environment-agency.gov.uk
  furl: www.gov.uk/environment-agency
  options: --query-string id
+ aliases:
+ - brand.environment-agency.gov.uk
2
0.2
2
0
ae8a053e6a532d2f4f456462653e827f0f46a25f
server/server.js
server/server.js
Meteor.startup(function () {

  // Reset Database only in environment dev
  if (process.env.NODE_ENV === 'development') {
    KSessions.remove({});
    Books.remove({});
    datas = Fixtures; // Specific datas for dev environment
  }
  else if (process.env.NODE_ENV === 'production') {
    datas = Datas; // Specific datas for prod environment
  }

  // Specific insert for Books
  datas.books.forEach(function(record) {
    if (Books.findOne(record._id)) {
      Books.update(record._id, record);
    }
    else {
      Books.insert(record);
    }
  });

  Meteor.publish("all_books", function() {
    return Books.find();
  });

  // Specific insert for KSessions
  datas.ksessions.forEach(function(record) {
    if (KSessions.findOne(record._id)) {
      KSessions.update(record._id, record);
    }
    else {
      KSessions.insert(record);
    }
  });

  Meteor.publish("all_ksessions", function() {
    return KSessions.find();
  });

  // Specific insert for users
  datas.users.forEach(function(user) {
    if (! Meteor.users.findOne({username: user.username})) {
      Accounts.createUser(user);
    }
  });
});
Meteor.startup(function () {

  // Reset Database only in environment dev
  if (process.env.NODE_ENV === 'development') {
    KSessions.remove({});
    Books.remove({});

    // Specific insert for Books
    Fixtures.books.forEach(function(rec) {
      if (Books.findOne(rec._id)) {
        Books.update(rec._id, rec);
      }
      else {
        Books.insert(rec);
      }
    });

    Meteor.publish("all_books", function() {
      return Books.find();
    });

    // Specific insert for KSessions
    Fixtures.ksessions.forEach(function(rec) {
      if (KSessions.findOne(rec._id)) {
        KSessions.update(rec._id, rec);
      }
      else {
        KSessions.insert(rec);
      }
    });

    Meteor.publish("all_ksessions", function() {
      return KSessions.find();
    });

    // Specific insert for users
    Fixtures.users.forEach(function(user) {
      if (! Meteor.users.findOne({username: user.username})) {
        Accounts.createUser(user);
      }

    });
  }
});
Disable data change in prod and pre-prod environment
Disable data change in prod and pre-prod environment
JavaScript
mit
claudeaubry/klub,claudeaubry/klub,claudeaubry/klub
javascript
## Code Before:
Meteor.startup(function () {

  // Reset Database only in environment dev
  if (process.env.NODE_ENV === 'development') {
    KSessions.remove({});
    Books.remove({});
    datas = Fixtures; // Specific datas for dev environment
  }
  else if (process.env.NODE_ENV === 'production') {
    datas = Datas; // Specific datas for prod environment
  }

  // Specific insert for Books
  datas.books.forEach(function(record) {
    if (Books.findOne(record._id)) {
      Books.update(record._id, record);
    }
    else {
      Books.insert(record);
    }
  });

  Meteor.publish("all_books", function() {
    return Books.find();
  });

  // Specific insert for KSessions
  datas.ksessions.forEach(function(record) {
    if (KSessions.findOne(record._id)) {
      KSessions.update(record._id, record);
    }
    else {
      KSessions.insert(record);
    }
  });

  Meteor.publish("all_ksessions", function() {
    return KSessions.find();
  });

  // Specific insert for users
  datas.users.forEach(function(user) {
    if (! Meteor.users.findOne({username: user.username})) {
      Accounts.createUser(user);
    }
  });
});

## Instruction:
Disable data change in prod and pre-prod environment

## Code After:
Meteor.startup(function () {

  // Reset Database only in environment dev
  if (process.env.NODE_ENV === 'development') {
    KSessions.remove({});
    Books.remove({});

    // Specific insert for Books
    Fixtures.books.forEach(function(rec) {
      if (Books.findOne(rec._id)) {
        Books.update(rec._id, rec);
      }
      else {
        Books.insert(rec);
      }
    });

    Meteor.publish("all_books", function() {
      return Books.find();
    });

    // Specific insert for KSessions
    Fixtures.ksessions.forEach(function(rec) {
      if (KSessions.findOne(rec._id)) {
        KSessions.update(rec._id, rec);
      }
      else {
        KSessions.insert(rec);
      }
    });

    Meteor.publish("all_ksessions", function() {
      return KSessions.find();
    });

    // Specific insert for users
    Fixtures.users.forEach(function(user) {
      if (! Meteor.users.findOne({username: user.username})) {
        Accounts.createUser(user);
      }

    });
  }
});
  Meteor.startup(function () {

    // Reset Database only in environment dev
    if (process.env.NODE_ENV === 'development') {
      KSessions.remove({});
      Books.remove({});
-     datas = Fixtures; // Specific datas for dev environment
+
+     // Specific insert for Books
+     Fixtures.books.forEach(function(rec) {
+       if (Books.findOne(rec._id)) {
+         Books.update(rec._id, rec);
+       }
+       else {
+         Books.insert(rec);
+       }
+     });
+
+     Meteor.publish("all_books", function() {
+       return Books.find();
+     });
+
+     // Specific insert for KSessions
+     Fixtures.ksessions.forEach(function(rec) {
+       if (KSessions.findOne(rec._id)) {
+         KSessions.update(rec._id, rec);
+       }
+       else {
+         KSessions.insert(rec);
+       }
+     });
+
+     Meteor.publish("all_ksessions", function() {
+       return KSessions.find();
+     });
+
+     // Specific insert for users
+     Fixtures.users.forEach(function(user) {
+       if (! Meteor.users.findOne({username: user.username})) {
+         Accounts.createUser(user);
+       }
+
+     });
    }
-   else if (process.env.NODE_ENV === 'production') {
-     datas = Datas; // Specific datas for prod environment
-   }
-
-   // Specific insert for Books
-   datas.books.forEach(function(record) {
-     if (Books.findOne(record._id)) {
-       Books.update(record._id, record);
-     }
-     else {
-       Books.insert(record);
-     }
-   });
-
-   Meteor.publish("all_books", function() {
-     return Books.find();
-   });
-
-   // Specific insert for KSessions
-   datas.ksessions.forEach(function(record) {
-     if (KSessions.findOne(record._id)) {
-       KSessions.update(record._id, record);
-     }
-     else {
-       KSessions.insert(record);
-     }
-   });
-
-   Meteor.publish("all_ksessions", function() {
-     return KSessions.find();
-   });
-
-   // Specific insert for users
-   datas.users.forEach(function(user) {
-     if (! Meteor.users.findOne({username: user.username})) {
-       Accounts.createUser(user);
-     }
-   });
  });
75
1.630435
36
39
7214d765437ffc7b980ec2d6682aaa0e705a5b60
app/serializers/effort_serializer.rb
app/serializers/effort_serializer.rb
class EffortSerializer < BaseSerializer
  attributes :id, :event_id, :person_id, :bib_number, :first_name, :last_name, :full_name, :gender,
             :age, :city, :state_code, :country_code, :beacon_url, :photo_url, :report_url, :start_offset
  link(:self) { api_v1_effort_path(object) }

  has_many :split_times, if: :split_times_loaded?

  def split_times_loaded?
    object.split_times.loaded?
  end
end
class EffortSerializer < BaseSerializer
  attributes :id, :event_id, :person_id, :participant_id, :bib_number, :first_name, :last_name, :full_name, :gender,
             :age, :city, :state_code, :country_code, :beacon_url, :photo_url, :report_url, :start_offset
  link(:self) { api_v1_effort_path(object) }

  has_many :split_times, if: :split_times_loaded?

  def split_times_loaded?
    object.split_times.loaded?
  end
end
Include participant_id (along with person_id) in EffortSerializer to avoid problems during transition. The participant_id attribute is now merely an alias for person_id.
Include participant_id (along with person_id) in EffortSerializer to avoid problems during transition. The participant_id attribute is now merely an alias for person_id.
Ruby
mit
SplitTime/OpenSplitTime,SplitTime/OpenSplitTime,SplitTime/OpenSplitTime
ruby
## Code Before:
class EffortSerializer < BaseSerializer
  attributes :id, :event_id, :person_id, :bib_number, :first_name, :last_name, :full_name, :gender,
             :age, :city, :state_code, :country_code, :beacon_url, :photo_url, :report_url, :start_offset
  link(:self) { api_v1_effort_path(object) }

  has_many :split_times, if: :split_times_loaded?

  def split_times_loaded?
    object.split_times.loaded?
  end
end

## Instruction:
Include participant_id (along with person_id) in EffortSerializer to avoid problems during transition. The participant_id attribute is now merely an alias for person_id.

## Code After:
class EffortSerializer < BaseSerializer
  attributes :id, :event_id, :person_id, :participant_id, :bib_number, :first_name, :last_name, :full_name, :gender,
             :age, :city, :state_code, :country_code, :beacon_url, :photo_url, :report_url, :start_offset
  link(:self) { api_v1_effort_path(object) }

  has_many :split_times, if: :split_times_loaded?

  def split_times_loaded?
    object.split_times.loaded?
  end
end
  class EffortSerializer < BaseSerializer
-   attributes :id, :event_id, :person_id, :bib_number, :first_name, :last_name, :full_name, :gender,
+   attributes :id, :event_id, :person_id, :participant_id, :bib_number, :first_name, :last_name, :full_name, :gender,
?                                          +++++++++++++++++
               :age, :city, :state_code, :country_code, :beacon_url, :photo_url, :report_url, :start_offset
    link(:self) { api_v1_effort_path(object) }

    has_many :split_times, if: :split_times_loaded?

    def split_times_loaded?
      object.split_times.loaded?
    end
  end
2
0.181818
1
1
a2f8ec576e97f2118e4d72713fd956913ff5428a
src/test/scala/org/scalatest/tools/SbtCommandParserSpec.scala
src/test/scala/org/scalatest/tools/SbtCommandParserSpec.scala
package org.scalatest.tools

import org.scalatest.Spec
import org.scalatest.matchers.ShouldMatchers

class SbtCommandParserSpec extends Spec with ShouldMatchers {

  describe("the cmd terminal?") {

    it("should parse 'st'") {

      val parser = new SbtCommandParser
      val result = parser.parseResult("""st""")
      result match {
        case parser.Success(result, _) => println("success: " + result)
        case ns: parser.NoSuccess => fail(ns.toString)
      }
    }
    it("should parse 'st --'") {

      val parser = new SbtCommandParser
      val result = parser.parseResult("""st --""")
      result match {
        case parser.Success(result, _) => println("success: " + result)
        case ns: parser.NoSuccess => fail(ns.toString)
      }
    }
  }
}
package org.scalatest.tools

import org.scalatest.Spec
import org.scalatest.matchers.ShouldMatchers

class SbtCommandParserSpec extends Spec with ShouldMatchers {

  val parser = new SbtCommandParser

  def canParsePhrase(s: String) {
    val result = parser.parseResult(s)
    result match {
      case ns: parser.NoSuccess => fail(ns.toString)
      case _ =>
    }
  }

  def cannotParsePhrase(s: String) {
    val result = parser.parseResult(s)
    result match {
      case parser.Success(result, _) => fail("wasn't supposed to, but parsed: " + result)
      case _ =>
    }
  }

  describe("the cmd terminal?") {

    it("should parse 'st'") {
      canParsePhrase("""st""")
      canParsePhrase("""st --""")
    }
  }
}
Reduce code dup in test.
Reduce code dup in test. git-svn-id: a9cb476116504f49bf91a09a9abbac08fcf3bbcc@3076 fd1274fd-080f-5963-8033-43ee40fcd1de
Scala
apache-2.0
svn2github/scalatest,svn2github/scalatest,svn2github/scalatest
scala
## Code Before:
package org.scalatest.tools

import org.scalatest.Spec
import org.scalatest.matchers.ShouldMatchers

class SbtCommandParserSpec extends Spec with ShouldMatchers {

  describe("the cmd terminal?") {

    it("should parse 'st'") {

      val parser = new SbtCommandParser
      val result = parser.parseResult("""st""")
      result match {
        case parser.Success(result, _) => println("success: " + result)
        case ns: parser.NoSuccess => fail(ns.toString)
      }
    }
    it("should parse 'st --'") {

      val parser = new SbtCommandParser
      val result = parser.parseResult("""st --""")
      result match {
        case parser.Success(result, _) => println("success: " + result)
        case ns: parser.NoSuccess => fail(ns.toString)
      }
    }
  }
}

## Instruction:
Reduce code dup in test.

git-svn-id: a9cb476116504f49bf91a09a9abbac08fcf3bbcc@3076 fd1274fd-080f-5963-8033-43ee40fcd1de

## Code After:
package org.scalatest.tools

import org.scalatest.Spec
import org.scalatest.matchers.ShouldMatchers

class SbtCommandParserSpec extends Spec with ShouldMatchers {

  val parser = new SbtCommandParser

  def canParsePhrase(s: String) {
    val result = parser.parseResult(s)
    result match {
      case ns: parser.NoSuccess => fail(ns.toString)
      case _ =>
    }
  }

  def cannotParsePhrase(s: String) {
    val result = parser.parseResult(s)
    result match {
      case parser.Success(result, _) => fail("wasn't supposed to, but parsed: " + result)
      case _ =>
    }
  }

  describe("the cmd terminal?") {

    it("should parse 'st'") {
      canParsePhrase("""st""")
      canParsePhrase("""st --""")
    }
  }
}
  package org.scalatest.tools

  import org.scalatest.Spec
  import org.scalatest.matchers.ShouldMatchers

  class SbtCommandParserSpec extends Spec with ShouldMatchers {

+   val parser = new SbtCommandParser
+
+   def canParsePhrase(s: String) {
+     val result = parser.parseResult(s)
+     result match {
+       case ns: parser.NoSuccess => fail(ns.toString)
+       case _ =>
+     }
+   }
+
+   def cannotParsePhrase(s: String) {
+     val result = parser.parseResult(s)
+     result match {
+       case parser.Success(result, _) => fail("wasn't supposed to, but parsed: " + result)
+       case _ =>
+     }
+   }
+
    describe("the cmd terminal?") {

      it("should parse 'st'") {
+       canParsePhrase("""st""")
+       canParsePhrase("""st --""")
-
-       val parser = new SbtCommandParser
-       val result = parser.parseResult("""st""")
-       result match {
-         case parser.Success(result, _) => println("success: " + result)
-         case ns: parser.NoSuccess => fail(ns.toString)
-       }
-     }
-     it("should parse 'st --'") {
-
-       val parser = new SbtCommandParser
-       val result = parser.parseResult("""st --""")
-       result match {
-         case parser.Success(result, _) => println("success: " + result)
-         case ns: parser.NoSuccess => fail(ns.toString)
-       }
      }
    }
  }
36
1.2
20
16
acb9c44e21e9f24610190afaacd87aee4570dff5
articles/integrations/aws-api-gateway-2/part-3.md
articles/integrations/aws-api-gateway-2/part-3.md
---
description: How to set your API methods to use your custom authorizer
---

# AWS Part 3: Secure the API Using Custom Authorizers

In [part 1](/integrations/aws-api-gateway-2/part-1), you configured an API using API Gateway, and in [part 2](/integrations/aws-api-gateway-2/part-2), you created the custom authorizer that can be used to retrieve the appropriate policies when your API receives an access request.

In this part of the tutorial, we will show you how to use the custom authorizer to secure your API's endpoints.
---
description: How to set your API methods to use your custom authorizer
---

# AWS Part 3: Secure the API Using Custom Authorizers

In [part 1](/integrations/aws-api-gateway-2/part-1), you configured an API using API Gateway, and in [part 2](/integrations/aws-api-gateway-2/part-2), you created the custom authorizer that can be used to retrieve the appropriate policies when your API receives an access request.

In this part of the tutorial, we will show you how to use the custom authorizer to secure your API's endpoints.

## Configure API Gateway Resources to use the Custom Authorizer

Log in to AWS and navigate to the [API Gateway Console](http://console.aws.amazon.com/apigateway).

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-1.png)

::: note
Custom authorizers are set on a method by method basis; if you want to secure multiple methods using a single authorizer, you'll need to repeat the following instructions for each method.
:::

Open the **PetStore** API we created in [part 1](/integrations/aws-api-gateway-2/part-1) of this tutorial. Under the **Resource** tree in the center pane, select the **GET** method under the `/pets` resource.

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-2.png)

Select **Method Request**.

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-3.png)

Under **Settings**, click the **pencil** icon to the right **Authorization** and choose the `jwt-rsa-custom-authorizer` custom authorizer you created in [part 2](/integrations/aws-api-gateway-2/part-2).

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-4.png)

Click the **check mark** icon to save your choice of custom authorizer. Make sure the **API Key Required** field is set to `false`.

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-5.png)
Add second part of part 3
Add second part of part 3
Markdown
mit
auth0/docs,auth0/docs,jeffreylees/docs,yvonnewilson/docs,yvonnewilson/docs,yvonnewilson/docs,auth0/docs,jeffreylees/docs,jeffreylees/docs
markdown
## Code Before:
---
description: How to set your API methods to use your custom authorizer
---

# AWS Part 3: Secure the API Using Custom Authorizers

In [part 1](/integrations/aws-api-gateway-2/part-1), you configured an API using API Gateway, and in [part 2](/integrations/aws-api-gateway-2/part-2), you created the custom authorizer that can be used to retrieve the appropriate policies when your API receives an access request.

In this part of the tutorial, we will show you how to use the custom authorizer to secure your API's endpoints.

## Instruction:
Add second part of part 3

## Code After:
---
description: How to set your API methods to use your custom authorizer
---

# AWS Part 3: Secure the API Using Custom Authorizers

In [part 1](/integrations/aws-api-gateway-2/part-1), you configured an API using API Gateway, and in [part 2](/integrations/aws-api-gateway-2/part-2), you created the custom authorizer that can be used to retrieve the appropriate policies when your API receives an access request.

In this part of the tutorial, we will show you how to use the custom authorizer to secure your API's endpoints.

## Configure API Gateway Resources to use the Custom Authorizer

Log in to AWS and navigate to the [API Gateway Console](http://console.aws.amazon.com/apigateway).

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-1.png)

::: note
Custom authorizers are set on a method by method basis; if you want to secure multiple methods using a single authorizer, you'll need to repeat the following instructions for each method.
:::

Open the **PetStore** API we created in [part 1](/integrations/aws-api-gateway-2/part-1) of this tutorial. Under the **Resource** tree in the center pane, select the **GET** method under the `/pets` resource.

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-2.png)

Select **Method Request**.

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-3.png)

Under **Settings**, click the **pencil** icon to the right **Authorization** and choose the `jwt-rsa-custom-authorizer` custom authorizer you created in [part 2](/integrations/aws-api-gateway-2/part-2).

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-4.png)

Click the **check mark** icon to save your choice of custom authorizer. Make sure the **API Key Required** field is set to `false`.

![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-5.png)
  ---
  description: How to set your API methods to use your custom authorizer
  ---

  # AWS Part 3: Secure the API Using Custom Authorizers

  In [part 1](/integrations/aws-api-gateway-2/part-1), you configured an API using API Gateway, and in [part 2](/integrations/aws-api-gateway-2/part-2), you created the custom authorizer that can be used to retrieve the appropriate policies when your API receives an access request.

  In this part of the tutorial, we will show you how to use the custom authorizer to secure your API's endpoints.
+
+ ## Configure API Gateway Resources to use the Custom Authorizer
+
+ Log in to AWS and navigate to the [API Gateway Console](http://console.aws.amazon.com/apigateway).
+
+ ![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-1.png)
+
+ ::: note
+ Custom authorizers are set on a method by method basis; if you want to secure multiple methods using a single authorizer, you'll need to repeat the following instructions for each method.
+ :::
+
+ Open the **PetStore** API we created in [part 1](/integrations/aws-api-gateway-2/part-1) of this tutorial. Under the **Resource** tree in the center pane, select the **GET** method under the `/pets` resource.
+
+ ![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-2.png)
+
+ Select **Method Request**.
+
+ ![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-3.png)
+
+ Under **Settings**, click the **pencil** icon to the right **Authorization** and choose the `jwt-rsa-custom-authorizer` custom authorizer you created in [part 2](/integrations/aws-api-gateway-2/part-2).
+
+ ![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-4.png)
+
+ Click the **check mark** icon to save your choice of custom authorizer. Make sure the **API Key Required** field is set to `false`.
+
+ ![](/media/articles/integrations/aws-api-gateway-2/part-3/pt3-5.png)
26
3.714286
26
0
e0285116d673b3098822727e1b394006f9468c96
composer.json
composer.json
{
    "name": "pubsubhubbub/publisher",
    "type": "library",
    "description": "pubsubhubbub implementation of publisher.",
    "keywords": ["pubsubhubbub", "publishers", "data", "feeds"],
    "homepage": "https://github.com/pubsubhubbub/php-publisher",
    "license": "Apache-2.0",
    "authors": [
        {
            "name": "pubsubhubbub publisher Contributors",
            "homepage": "https://github.com/pubsubhubbub/php-publisher/contributors"
        }
    ],
    "require": {
        "php": "~5.4 || ~7.0",
        "ext-curl": "*"
    },
    "autoload": {
        "psr-4": {
            "pubsubhubbub\\publisher\\": "library/"
        }
    }
}
{
    "name": "pubsubhubbub/publisher",
    "type": "library",
    "description": "pubsubhubbub implementation of publisher.",
    "keywords": ["pubsubhubbub", "publishers", "data", "feeds", "websub"],
    "homepage": "https://github.com/pubsubhubbub/php-publisher",
    "license": "Apache-2.0",
    "authors": [
        {
            "name": "pubsubhubbub publisher Contributors",
            "homepage": "https://github.com/pubsubhubbub/php-publisher/contributors"
        }
    ],
    "require": {
        "php": "~5.4 || ~7.0",
        "ext-curl": "*"
    },
    "autoload": {
        "psr-4": {
            "pubsubhubbub\\publisher\\": "library/"
        }
    }
}
Add websub as a keyword
Add websub as a keyword This will let the library show up on Packagist when users search for websub
JSON
apache-2.0
pubsubhubbub/php-publisher
json
## Code Before:
{
    "name": "pubsubhubbub/publisher",
    "type": "library",
    "description": "pubsubhubbub implementation of publisher.",
    "keywords": ["pubsubhubbub", "publishers", "data", "feeds"],
    "homepage": "https://github.com/pubsubhubbub/php-publisher",
    "license": "Apache-2.0",
    "authors": [
        {
            "name": "pubsubhubbub publisher Contributors",
            "homepage": "https://github.com/pubsubhubbub/php-publisher/contributors"
        }
    ],
    "require": {
        "php": "~5.4 || ~7.0",
        "ext-curl": "*"
    },
    "autoload": {
        "psr-4": {
            "pubsubhubbub\\publisher\\": "library/"
        }
    }
}

## Instruction:
Add websub as a keyword

This will let the library show up on Packagist when users search for websub

## Code After:
{
    "name": "pubsubhubbub/publisher",
    "type": "library",
    "description": "pubsubhubbub implementation of publisher.",
    "keywords": ["pubsubhubbub", "publishers", "data", "feeds", "websub"],
    "homepage": "https://github.com/pubsubhubbub/php-publisher",
    "license": "Apache-2.0",
    "authors": [
        {
            "name": "pubsubhubbub publisher Contributors",
            "homepage": "https://github.com/pubsubhubbub/php-publisher/contributors"
        }
    ],
    "require": {
        "php": "~5.4 || ~7.0",
        "ext-curl": "*"
    },
    "autoload": {
        "psr-4": {
            "pubsubhubbub\\publisher\\": "library/"
        }
    }
}
  {
      "name": "pubsubhubbub/publisher",
      "type": "library",
      "description": "pubsubhubbub implementation of publisher.",
-     "keywords": ["pubsubhubbub", "publishers", "data", "feeds"],
+     "keywords": ["pubsubhubbub", "publishers", "data", "feeds", "websub"],
?                                                               ++++++++++
      "homepage": "https://github.com/pubsubhubbub/php-publisher",
      "license": "Apache-2.0",
      "authors": [
          {
              "name": "pubsubhubbub publisher Contributors",
              "homepage": "https://github.com/pubsubhubbub/php-publisher/contributors"
          }
      ],
      "require": {
          "php": "~5.4 || ~7.0",
          "ext-curl": "*"
      },
      "autoload": {
          "psr-4": {
              "pubsubhubbub\\publisher\\": "library/"
          }
      }
  }
2
0.095238
1
1
a8d4aa1ba0ba609650613a29dfb1dcc99257496c
spec/calculate_spec.rb
spec/calculate_spec.rb
require 'spec_helper'

describe 'Calculating the ETSource datasets' do
  datasets = Atlas::Dataset.all.select do |dataset|
    dataset.enabled && dataset.enabled[:etengine]
  end

  before(:all) do
    @graph = Atlas::GraphBuilder.build
  end

  datasets.each do |dataset|
    it "calculates the #{ dataset.key } dataset" do
      Atlas::Runner.new(dataset, @graph).calculate
    end
  end
end # Calculating the ETSource dataset
require 'spec_helper'

describe 'Calculating the ETSource datasets' do
  datasets = Atlas::Dataset.all.select do |dataset|
    dataset.enabled && dataset.enabled[:etengine]
  end

  datasets.each do |dataset|
    it "calculates the #{ dataset.key } dataset" do
      Atlas::Runner.new(dataset).calculate
    end
  end
end # Calculating the ETSource dataset
Remove unused graph param from calculation specs
Remove unused graph param from calculation specs Adjustment for new Atlas version
Ruby
mit
quintel/etsource
ruby
## Code Before:
require 'spec_helper'

describe 'Calculating the ETSource datasets' do
  datasets = Atlas::Dataset.all.select do |dataset|
    dataset.enabled && dataset.enabled[:etengine]
  end

  before(:all) do
    @graph = Atlas::GraphBuilder.build
  end

  datasets.each do |dataset|
    it "calculates the #{ dataset.key } dataset" do
      Atlas::Runner.new(dataset, @graph).calculate
    end
  end
end # Calculating the ETSource dataset

## Instruction:
Remove unused graph param from calculation specs

Adjustment for new Atlas version

## Code After:
require 'spec_helper'

describe 'Calculating the ETSource datasets' do
  datasets = Atlas::Dataset.all.select do |dataset|
    dataset.enabled && dataset.enabled[:etengine]
  end

  datasets.each do |dataset|
    it "calculates the #{ dataset.key } dataset" do
      Atlas::Runner.new(dataset).calculate
    end
  end
end # Calculating the ETSource dataset
require 'spec_helper' describe 'Calculating the ETSource datasets' do datasets = Atlas::Dataset.all.select do |dataset| dataset.enabled && dataset.enabled[:etengine] end - before(:all) do - @graph = Atlas::GraphBuilder.build - end - datasets.each do |dataset| it "calculates the #{ dataset.key } dataset" do - Atlas::Runner.new(dataset, @graph).calculate ? -------- + Atlas::Runner.new(dataset).calculate end end end # Calculating the ETSource dataset
6
0.352941
1
5
b7e6f8e2eb93ce1c57c49208fc5c006c51807c8b
.travis.yml
.travis.yml
language: node_js
node_js:
- "0.10"
before_install:
- npm install -g gulp bower
- bower install -f
language: node_js
node_js:
- "6"
before_install:
- npm install -g yarn
install:
- yarn
script: yarn test
Update Travis for Yarn and Node 6
Update Travis for Yarn and Node 6
YAML
mit
Snugug/eq.js,Snugug/eq.js
yaml
## Code Before:
language: node_js
node_js:
- "0.10"
before_install:
- npm install -g gulp bower
- bower install -f

## Instruction:
Update Travis for Yarn and Node 6

## Code After:
language: node_js
node_js:
- "6"
before_install:
- npm install -g yarn
install:
- yarn
script: yarn test
language: node_js
node_js:
- - "0.10"
+ - "6"
before_install:
- - npm install -g gulp bower
? ^^^^^^^^^
+ - npm install -g yarn
? ^^ +
- - bower install -f
+ install:
+ - yarn
+ script: yarn test
8
1.333333
5
3
ec92b47c48be93182a63c5f900f68638be603ffd
tests/BaseTestCase.php
tests/BaseTestCase.php
<?php declare(strict_types = 1);

namespace Churn\Tests;

use PHPUnit\Framework\TestCase;

abstract class BaseTestCase extends TestCase
{

}
<?php declare(strict_types = 1);

namespace Churn\Tests;

use Mockery as m;
use PHPUnit\Framework\TestCase;

abstract class BaseTestCase extends TestCase
{
    /**
     * @see https://github.com/phpspec/prophecy/issues/366#issuecomment-359587114
     */
    protected function tearDown()
    {
        $this->addToAssertionCount(m::getContainer()->mockery_getExpectationCount());
    }
}
Add to assertion count from Mockery
Fix: Add to assertion count from Mockery
PHP
mit
bmitch/churn-php
php
## Code Before:
<?php declare(strict_types = 1);

namespace Churn\Tests;

use PHPUnit\Framework\TestCase;

abstract class BaseTestCase extends TestCase
{

}

## Instruction:
Fix: Add to assertion count from Mockery

## Code After:
<?php declare(strict_types = 1);

namespace Churn\Tests;

use Mockery as m;
use PHPUnit\Framework\TestCase;

abstract class BaseTestCase extends TestCase
{
    /**
     * @see https://github.com/phpspec/prophecy/issues/366#issuecomment-359587114
     */
    protected function tearDown()
    {
        $this->addToAssertionCount(m::getContainer()->mockery_getExpectationCount());
    }
}
<?php declare(strict_types = 1);

namespace Churn\Tests;

+ use Mockery as m;
use PHPUnit\Framework\TestCase;

abstract class BaseTestCase extends TestCase
{
-
+     /**
+      * @see https://github.com/phpspec/prophecy/issues/366#issuecomment-359587114
+      */
+     protected function tearDown()
+     {
+         $this->addToAssertionCount(m::getContainer()->mockery_getExpectationCount());
+     }
}
9
0.9
8
1
c6da745652b89f8d68242f30a834917429a970fd
ci_environment/kerl/attributes/source.rb
ci_environment/kerl/attributes/source.rb
default[:kerl][:releases] = %w(R14B02 R14B03 R14B04 R15B R15B01 R15B02 R15B03 R16B R16B01 R16B02 R16B03 R16B03-1 17.0-rc1 17.0-rc2)
default[:kerl][:path] = "/usr/local/bin/kerl"
default[:kerl][:releases] = %w(R14B02 R14B03 R14B04 R15B R15B01 R15B02 R15B03 R16B R16B01 R16B02 R16B03 R16B03-1 17.0)
default[:kerl][:path] = "/usr/local/bin/kerl"
Support Erlang 17.0 with kerl
Support Erlang 17.0 with kerl

Also drop support for the previous Erlang 17.0 release candidates.
Ruby
mit
ljharb/travis-cookbooks,spurti-chopra/travis-cookbooks,tianon/travis-cookbooks,tianon/travis-cookbooks,Acidburn0zzz/travis-cookbooks,bd808/travis-cookbooks,vinaykaradia/travis-cookbook-cloned,Zarthus/travis-cookbooks,dracos/travis-cookbooks,ljharb/travis-cookbooks,tianon/travis-cookbooks,Zarthus/travis-cookbooks,vinaykaradia/travis-cookbook-cloned,Distelli/travis-cookbooks,Zarthus/travis-cookbooks,gavioto/travis-cookbooks,gavioto/travis-cookbooks,Acidburn0zzz/travis-cookbooks,johanneswuerbach/travis-cookbooks,spurti-chopra/travis-cookbooks,travis-ci/travis-cookbooks,ardock/travis-cookbooks,ardock/travis-cookbooks,vinaykaradia/travis-cookbook-cloned,DanielG/travis-cookbooks,spurti-chopra/travis-cookbooks,dracos/travis-cookbooks,Acidburn0zzz/travis-cookbooks,johanneswuerbach/travis-cookbooks,ardock/travis-cookbooks,ardock/travis-cookbooks,Acidburn0zzz/travis-cookbooks,DanielG/travis-cookbooks,vinaykaradia/travis-cookbook-cloned,johanneswuerbach/travis-cookbooks,DanielG/travis-cookbooks,Distelli/travis-cookbooks,ljharb/travis-cookbooks,travis-ci/travis-cookbooks,Distelli/travis-cookbooks,dracos/travis-cookbooks,bd808/travis-cookbooks,tianon/travis-cookbooks,Zarthus/travis-cookbooks,vinaykaradia/travis-cookbook-cloned,tianon/travis-cookbooks,travis-ci/travis-cookbooks,DanielG/travis-cookbooks,Distelli/travis-cookbooks,dracos/travis-cookbooks,gavioto/travis-cookbooks,gavioto/travis-cookbooks,bd808/travis-cookbooks
ruby
## Code Before:
default[:kerl][:releases] = %w(R14B02 R14B03 R14B04 R15B R15B01 R15B02 R15B03 R16B R16B01 R16B02 R16B03 R16B03-1 17.0-rc1 17.0-rc2)
default[:kerl][:path] = "/usr/local/bin/kerl"

## Instruction:
Support Erlang 17.0 with kerl

Also drop support for the previous Erlang 17.0 release candidates.

## Code After:
default[:kerl][:releases] = %w(R14B02 R14B03 R14B04 R15B R15B01 R15B02 R15B03 R16B R16B01 R16B02 R16B03 R16B03-1 17.0)
default[:kerl][:path] = "/usr/local/bin/kerl"
- default[:kerl][:releases] = %w(R14B02 R14B03 R14B04 R15B R15B01 R15B02 R15B03 R16B R16B01 R16B02 R16B03 R16B03-1 17.0-rc1 17.0-rc2)
? -------------
+ default[:kerl][:releases] = %w(R14B02 R14B03 R14B04 R15B R15B01 R15B02 R15B03 R16B R16B01 R16B02 R16B03 R16B03-1 17.0)
default[:kerl][:path] = "/usr/local/bin/kerl"
2
1
1
1
6e984b602e4f4c6f13a8a956eac03f214b6ccef1
README.md
README.md
python api for SeetaFaceEngine(https://github.com/seetaface/SeetaFaceEngine.git)

# installation
1. Download pyseeta(https://github.com/TuXiaokang/pyseeta.git)
2. `git submodule update --init --recursive`
3. Build `SeetaFaceEngine` dynamic library.

   on unix
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake ..
   make
   ```
   on windows
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake -G "Visual Studio 14 2015 Win64" ..
   cmake --build . --config Release
   ```
4. Add the dynamic lib path to system environment variables.
   + on linux & macOS, the default is `SeetaFaceEngine/library`
   + on windows, the default is `SeetaFaceEngine/library/Release`
5. run test
   ```bash
   python setup.py install
   python test.py
   ```

# tips
If you want to use function of faceidentification, you need decompress the `seeta_fr_v1.0.part.1.rar` which located in `SeetaFaceEngine/model`
python api for SeetaFaceEngine(https://github.com/seetaface/SeetaFaceEngine.git)

# installation
1. Download pyseeta(https://github.com/TuXiaokang/pyseeta.git)
2. `git submodule update --init --recursive`
3. Build `SeetaFaceEngine` dynamic library.

   on unix
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake ..
   make
   ```
   on windows
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake -G "Visual Studio 14 2015 Win64" ..
   cmake --build . --config Release
   ```
4. Add the dynamic lib path to system environment variables.
   + on linux & macOS, the default is `SeetaFaceEngine/library`
   + on windows, the default is `SeetaFaceEngine/library/Release`
5. run test

   on ubuntu or unix
   ```bash
   sudo python setup.py install
   python test.py
   ```
   on windows
   ```bash
   python setup.py install
   python test.py
   ```

# tips
If you want to use function of faceidentification, you need decompress the `seeta_fr_v1.0.part.1.rar` which located in `SeetaFaceEngine/model`
Modify install command on linux
Modify install command on linux
Markdown
mit
TuXiaokang/pyseeta
markdown
## Code Before:
python api for SeetaFaceEngine(https://github.com/seetaface/SeetaFaceEngine.git)

# installation
1. Download pyseeta(https://github.com/TuXiaokang/pyseeta.git)
2. `git submodule update --init --recursive`
3. Build `SeetaFaceEngine` dynamic library.

   on unix
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake ..
   make
   ```
   on windows
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake -G "Visual Studio 14 2015 Win64" ..
   cmake --build . --config Release
   ```
4. Add the dynamic lib path to system environment variables.
   + on linux & macOS, the default is `SeetaFaceEngine/library`
   + on windows, the default is `SeetaFaceEngine/library/Release`
5. run test
   ```bash
   python setup.py install
   python test.py
   ```

# tips
If you want to use function of faceidentification, you need decompress the `seeta_fr_v1.0.part.1.rar` which located in `SeetaFaceEngine/model`

## Instruction:
Modify install command on linux

## Code After:
python api for SeetaFaceEngine(https://github.com/seetaface/SeetaFaceEngine.git)

# installation
1. Download pyseeta(https://github.com/TuXiaokang/pyseeta.git)
2. `git submodule update --init --recursive`
3. Build `SeetaFaceEngine` dynamic library.

   on unix
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake ..
   make
   ```
   on windows
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake -G "Visual Studio 14 2015 Win64" ..
   cmake --build . --config Release
   ```
4. Add the dynamic lib path to system environment variables.
   + on linux & macOS, the default is `SeetaFaceEngine/library`
   + on windows, the default is `SeetaFaceEngine/library/Release`
5. run test

   on ubuntu or unix
   ```bash
   sudo python setup.py install
   python test.py
   ```
   on windows
   ```bash
   python setup.py install
   python test.py
   ```

# tips
If you want to use function of faceidentification, you need decompress the `seeta_fr_v1.0.part.1.rar` which located in `SeetaFaceEngine/model`
python api for SeetaFaceEngine(https://github.com/seetaface/SeetaFaceEngine.git)

# installation
1. Download pyseeta(https://github.com/TuXiaokang/pyseeta.git)
2. `git submodule update --init --recursive`
3. Build `SeetaFaceEngine` dynamic library.

   on unix
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake ..
   make
   ```
   on windows
   ```bash
   cd SeetaFaceEngine/
   mkdir Release; cd Release
   cmake -G "Visual Studio 14 2015 Win64" ..
   cmake --build . --config Release
   ```
4. Add the dynamic lib path to system environment variables.
   + on linux & macOS, the default is `SeetaFaceEngine/library`
   + on windows, the default is `SeetaFaceEngine/library/Release`
5. run test
-    ```bash
+    on ubuntu or unix
+    ```bash
+    sudo python setup.py install
+    python test.py
+    ```
+    on windows
+    ```bash
-    python setup.py install
? ^^^^
+    python setup.py install
? ^
-    python test.py
? ^^^^
+    python test.py
? ^
-    ```
+    ```

# tips
If you want to use function of faceidentification, you need decompress the `seeta_fr_v1.0.part.1.rar` which located in `SeetaFaceEngine/model`
14
0.451613
10
4
c37468f14ca65b16874f4fa135b3b764724159f3
openstack/kubernikus/templates/seed.yaml
openstack/kubernikus/templates/seed.yaml
apiVersion: openstack.stable.sap.cc/v1
kind: OpenstackSeed
metadata:
  name: kubernikus-seed
spec:
  requires:
  - monsoon3/keystone-seed
  - monsoon3/domain-ccadmin-seed
  - monsoon3/domain-default-seed

  domains:
  - name: Default
    users:
    - name: kubernikus-terraform
      description: "Kubernikus Terraform User"
      password: {{ required "missing .kubernikus_terraform_password" .Values.kubernikus_terraform_password | quote }}
      role_assignments:
      - project: cloud_admin@ccadmin
        role: admin
      - project: cloud_admin@ccadmin
        role: cloud_network_admin
      - project: cloud_admin@ccadmin
        role: cloud_resource_admin
      - project: master@ccadmin
        role: cloud_dns_admin
      - project: master@ccadmin
        role: swiftoperator
apiVersion: openstack.stable.sap.cc/v1
kind: OpenstackSeed
metadata:
  name: kubernikus-seed
spec:
  requires:
  - monsoon3/keystone-seed
  - monsoon3/domain-ccadmin-seed
  - monsoon3/domain-default-seed

  domains:
  - name: Default
    users:
    - name: kubernikus-terraform
      description: "Kubernikus Terraform User"
      password: {{ required "missing .kubernikus_terraform_password" .Values.kubernikus_terraform_password | quote }}
      role_assignments:
      - project: cloud_admin@ccadmin
        role: admin
      - project: cloud_admin@ccadmin
        role: cloud_network_admin
      - project: cloud_admin@ccadmin
        role: cloud_resource_admin
      - project: master@ccadmin
        role: cloud_dns_admin
      - project: master@ccadmin
        role: swiftoperator
      - domain: cp
        role: admin
Allow kubernikus-terraform to have the domain admin role
Allow kubernikus-terraform to have the domain admin role
YAML
apache-2.0
sapcc/helm-charts,sapcc/helm-charts,sapcc/helm-charts,sapcc/helm-charts
yaml
## Code Before:
apiVersion: openstack.stable.sap.cc/v1
kind: OpenstackSeed
metadata:
  name: kubernikus-seed
spec:
  requires:
  - monsoon3/keystone-seed
  - monsoon3/domain-ccadmin-seed
  - monsoon3/domain-default-seed

  domains:
  - name: Default
    users:
    - name: kubernikus-terraform
      description: "Kubernikus Terraform User"
      password: {{ required "missing .kubernikus_terraform_password" .Values.kubernikus_terraform_password | quote }}
      role_assignments:
      - project: cloud_admin@ccadmin
        role: admin
      - project: cloud_admin@ccadmin
        role: cloud_network_admin
      - project: cloud_admin@ccadmin
        role: cloud_resource_admin
      - project: master@ccadmin
        role: cloud_dns_admin
      - project: master@ccadmin
        role: swiftoperator

## Instruction:
Allow kubernikus-terraform to have the domain admin role

## Code After:
apiVersion: openstack.stable.sap.cc/v1
kind: OpenstackSeed
metadata:
  name: kubernikus-seed
spec:
  requires:
  - monsoon3/keystone-seed
  - monsoon3/domain-ccadmin-seed
  - monsoon3/domain-default-seed

  domains:
  - name: Default
    users:
    - name: kubernikus-terraform
      description: "Kubernikus Terraform User"
      password: {{ required "missing .kubernikus_terraform_password" .Values.kubernikus_terraform_password | quote }}
      role_assignments:
      - project: cloud_admin@ccadmin
        role: admin
      - project: cloud_admin@ccadmin
        role: cloud_network_admin
      - project: cloud_admin@ccadmin
        role: cloud_resource_admin
      - project: master@ccadmin
        role: cloud_dns_admin
      - project: master@ccadmin
        role: swiftoperator
      - domain: cp
        role: admin
apiVersion: openstack.stable.sap.cc/v1
kind: OpenstackSeed
metadata:
  name: kubernikus-seed
spec:
  requires:
  - monsoon3/keystone-seed
  - monsoon3/domain-ccadmin-seed
  - monsoon3/domain-default-seed

  domains:
  - name: Default
    users:
    - name: kubernikus-terraform
      description: "Kubernikus Terraform User"
      password: {{ required "missing .kubernikus_terraform_password" .Values.kubernikus_terraform_password | quote }}
      role_assignments:
      - project: cloud_admin@ccadmin
        role: admin
      - project: cloud_admin@ccadmin
        role: cloud_network_admin
      - project: cloud_admin@ccadmin
        role: cloud_resource_admin
      - project: master@ccadmin
        role: cloud_dns_admin
      - project: master@ccadmin
        role: swiftoperator
+       - domain: cp
+         role: admin
2
0.071429
2
0
08fab995cccffd35a7f6618c539394a38c3eb0d8
test/specs/scenarios.js
test/specs/scenarios.js
const expect = require('expect');
const { run, createProject } = require('../cli');

describe('Child Entries', () => {
    beforeEach(createProject);

    it('should be able to delete a child entry with a new configuration', function* () {
        const firstConfig = {
            parent: { entry: 'entryToDelete' },
        };

        yield run(`echo '${JSON.stringify(firstConfig)}' > firstConfig.json`);
        yield run('comfy setall development firstConfig.json');

        const { stdout: firstConfigOutput } = yield run('comfy get development');
        expect(JSON.parse(firstConfigOutput)).toEqual(firstConfig);

        const secondConfig = {
            parent: { child: { entry: 'entry' } },
            child: { entry: 'entry' },
        };

        yield run(`echo '${JSON.stringify(secondConfig)}' > secondConfig.json`);
        yield run('comfy setall development secondConfig.json');

        const { stdout: secondConfigOutput } = yield run('comfy get development');
        expect(JSON.parse(secondConfigOutput)).toEqual(secondConfig);
    });
});
const expect = require('expect');
const { run, createProject } = require('../cli');

describe('Scenarios', () => {
    beforeEach(createProject);

    describe('Child Entries', () => {
        it('should be able to delete a child entry with a new configuration', function* () {
            const firstConfig = {
                parent: { entry: 'entryToDelete' },
            };

            yield run(`echo '${JSON.stringify(firstConfig)}' > firstConfig.json`);
            yield run('comfy setall development firstConfig.json');

            const { stdout: firstConfigOutput } = yield run('comfy get development');
            expect(JSON.parse(firstConfigOutput)).toEqual(firstConfig);

            const secondConfig = {
                parent: { child: { entry: 'entry' } },
                child: { entry: 'entry' },
            };

            yield run(`echo '${JSON.stringify(secondConfig)}' > secondConfig.json`);
            yield run('comfy setall development secondConfig.json');

            const { stdout: secondConfigOutput } = yield run('comfy get development');
            expect(JSON.parse(secondConfigOutput)).toEqual(secondConfig);
        });
    });

    describe('Boolean serialization', () => {
        it('should not transform a `false` bool to a `"false"` string', function* () {
            const config = { myEntry: false };

            yield run(`echo '${JSON.stringify(config)}' > config.json`);
            yield run('comfy setall development config.json');

            const { stdout } = yield run('comfy get development');
            expect(JSON.parse(stdout)).toEqual(config);
        });
    });
});
Make the issue appear in the tests
Make the issue appear in the tests
JavaScript
mit
marmelab/comfygure
javascript
## Code Before:
const expect = require('expect');
const { run, createProject } = require('../cli');

describe('Child Entries', () => {
    beforeEach(createProject);

    it('should be able to delete a child entry with a new configuration', function* () {
        const firstConfig = {
            parent: { entry: 'entryToDelete' },
        };

        yield run(`echo '${JSON.stringify(firstConfig)}' > firstConfig.json`);
        yield run('comfy setall development firstConfig.json');

        const { stdout: firstConfigOutput } = yield run('comfy get development');
        expect(JSON.parse(firstConfigOutput)).toEqual(firstConfig);

        const secondConfig = {
            parent: { child: { entry: 'entry' } },
            child: { entry: 'entry' },
        };

        yield run(`echo '${JSON.stringify(secondConfig)}' > secondConfig.json`);
        yield run('comfy setall development secondConfig.json');

        const { stdout: secondConfigOutput } = yield run('comfy get development');
        expect(JSON.parse(secondConfigOutput)).toEqual(secondConfig);
    });
});

## Instruction:
Make the issue appear in the tests

## Code After:
const expect = require('expect');
const { run, createProject } = require('../cli');

describe('Scenarios', () => {
    beforeEach(createProject);

    describe('Child Entries', () => {
        it('should be able to delete a child entry with a new configuration', function* () {
            const firstConfig = {
                parent: { entry: 'entryToDelete' },
            };

            yield run(`echo '${JSON.stringify(firstConfig)}' > firstConfig.json`);
            yield run('comfy setall development firstConfig.json');

            const { stdout: firstConfigOutput } = yield run('comfy get development');
            expect(JSON.parse(firstConfigOutput)).toEqual(firstConfig);

            const secondConfig = {
                parent: { child: { entry: 'entry' } },
                child: { entry: 'entry' },
            };

            yield run(`echo '${JSON.stringify(secondConfig)}' > secondConfig.json`);
            yield run('comfy setall development secondConfig.json');

            const { stdout: secondConfigOutput } = yield run('comfy get development');
            expect(JSON.parse(secondConfigOutput)).toEqual(secondConfig);
        });
    });

    describe('Boolean serialization', () => {
        it('should not transform a `false` bool to a `"false"` string', function* () {
            const config = { myEntry: false };

            yield run(`echo '${JSON.stringify(config)}' > config.json`);
            yield run('comfy setall development config.json');

            const { stdout } = yield run('comfy get development');
            expect(JSON.parse(stdout)).toEqual(config);
        });
    });
});
const expect = require('expect');
const { run, createProject } = require('../cli');

- describe('Child Entries', () => {
? ^^^^^^^ ^ ^
+ describe('Scenarios', () => {
? ^^^ ^ ^
    beforeEach(createProject);

+     describe('Child Entries', () => {
-     it('should be able to delete a child entry with a new configuration', function* () {
+         it('should be able to delete a child entry with a new configuration', function* () {
? ++++
-         const firstConfig = {
+             const firstConfig = {
? ++++
-             parent: { entry: 'entryToDelete' },
+                 parent: { entry: 'entryToDelete' },
? ++++
-         };
+             };
? ++++

-         yield run(`echo '${JSON.stringify(firstConfig)}' > firstConfig.json`);
+             yield run(`echo '${JSON.stringify(firstConfig)}' > firstConfig.json`);
? ++++
-         yield run('comfy setall development firstConfig.json');
+             yield run('comfy setall development firstConfig.json');
? ++++

-         const { stdout: firstConfigOutput } = yield run('comfy get development');
+             const { stdout: firstConfigOutput } = yield run('comfy get development');
? ++++
-         expect(JSON.parse(firstConfigOutput)).toEqual(firstConfig);
+             expect(JSON.parse(firstConfigOutput)).toEqual(firstConfig);
? ++++

-         const secondConfig = {
+             const secondConfig = {
? ++++
-             parent: { child: { entry: 'entry' } },
+                 parent: { child: { entry: 'entry' } },
? ++++
-             child: { entry: 'entry' },
+                 child: { entry: 'entry' },
? ++++
-         };
+             };
? ++++

-         yield run(`echo '${JSON.stringify(secondConfig)}' > secondConfig.json`);
+             yield run(`echo '${JSON.stringify(secondConfig)}' > secondConfig.json`);
? ++++
-         yield run('comfy setall development secondConfig.json');
+             yield run('comfy setall development secondConfig.json');
? ++++

-         const { stdout: secondConfigOutput } = yield run('comfy get development');
+             const { stdout: secondConfigOutput } = yield run('comfy get development');
? ++++
-         expect(JSON.parse(secondConfigOutput)).toEqual(secondConfig);
+             expect(JSON.parse(secondConfigOutput)).toEqual(secondConfig);
? ++++
+         });
+     });
+
+     describe('Boolean serialization', () => {
+         it('should not transform a `false` bool to a `"false"` string', function* () {
+             const config = { myEntry: false };
+
+             yield run(`echo '${JSON.stringify(config)}' > config.json`);
+             yield run('comfy setall development config.json');
+
+             const { stdout } = yield run('comfy get development');
+             expect(JSON.parse(stdout)).toEqual(config);
+         });
    });
});
48
1.655172
31
17
3a1615238d4500f0fa7b9eea9ee2bfe460bc21f9
cax/tasks/purity.py
cax/tasks/purity.py
from sympy.parsing.sympy_parser import parse_expr

from pax import units

from cax import config
from cax.task import Task


class AddElectronLifetime(Task):
    "Add electron lifetime to dataset"

    def __init__(self):
        self.collection_purity = config.mongo_collection('purity')
        Task.__init__(self)

    def each_run(self):
        if 'processor' in self.run_doc:
            return

        # Fetch the latest electron lifetime fit
        doc = self.collection_purity.find_one(sort=(('calculation_time', -1),))
        function = parse_expr(doc['electron_lifetime_function'])

        # Compute value from this function on this dataset
        lifetime = function.evalf(subs={"t" : self.run_doc['start'].timestamp()})

        run_number = self.run_doc['number']
        self.log.info("Run %d: calculated lifetime of %d us" % (run_number, lifetime))

        if not config.DATABASE_LOG:
            return

        # Update run database
        key = 'processor.DEFAULT.electron_lifetime_liquid'
        self.collection.find_and_modify({'_id': self.run_doc['_id']}, {'$set': {key: lifetime * units.us}})
from sympy.parsing.sympy_parser import parse_expr

from pax import units

from cax import config
from cax.task import Task


class AddElectronLifetime(Task):
    "Add electron lifetime to dataset"

    def __init__(self):
        self.collection_purity = config.mongo_collection('purity')
        Task.__init__(self)

    def each_run(self):
        if 'processor' in self.run_doc:
            return

        # Fetch the latest electron lifetime fit
        doc = self.collection_purity.find_one(sort=(('calculation_time', -1),))
        function = parse_expr(doc['electron_lifetime_function'])

        # Compute value from this function on this dataset
        lifetime = function.evalf(subs={"t" : self.run_doc['start'].timestamp()})
        lifetime = float(lifetime) # Convert away from Sympy type.

        run_number = self.run_doc['number']
        self.log.info("Run %d: calculated lifetime of %d us" % (run_number, lifetime))

        if not config.DATABASE_LOG:
            return

        # Update run database
        key = 'processor.DEFAULT.electron_lifetime_liquid'
        self.collection.find_and_modify({'_id': self.run_doc['_id']}, {'$set': {key: lifetime * units.us}})
Convert lifetime to float instead of sympy type.
Convert lifetime to float instead of sympy type.
Python
isc
XENON1T/cax,XENON1T/cax
python
## Code Before:
from sympy.parsing.sympy_parser import parse_expr

from pax import units

from cax import config
from cax.task import Task


class AddElectronLifetime(Task):
    "Add electron lifetime to dataset"

    def __init__(self):
        self.collection_purity = config.mongo_collection('purity')
        Task.__init__(self)

    def each_run(self):
        if 'processor' in self.run_doc:
            return

        # Fetch the latest electron lifetime fit
        doc = self.collection_purity.find_one(sort=(('calculation_time', -1),))
        function = parse_expr(doc['electron_lifetime_function'])

        # Compute value from this function on this dataset
        lifetime = function.evalf(subs={"t" : self.run_doc['start'].timestamp()})

        run_number = self.run_doc['number']
        self.log.info("Run %d: calculated lifetime of %d us" % (run_number, lifetime))

        if not config.DATABASE_LOG:
            return

        # Update run database
        key = 'processor.DEFAULT.electron_lifetime_liquid'
        self.collection.find_and_modify({'_id': self.run_doc['_id']}, {'$set': {key: lifetime * units.us}})

## Instruction:
Convert lifetime to float instead of sympy type.

## Code After:
from sympy.parsing.sympy_parser import parse_expr

from pax import units

from cax import config
from cax.task import Task


class AddElectronLifetime(Task):
    "Add electron lifetime to dataset"

    def __init__(self):
        self.collection_purity = config.mongo_collection('purity')
        Task.__init__(self)

    def each_run(self):
        if 'processor' in self.run_doc:
            return

        # Fetch the latest electron lifetime fit
        doc = self.collection_purity.find_one(sort=(('calculation_time', -1),))
        function = parse_expr(doc['electron_lifetime_function'])

        # Compute value from this function on this dataset
        lifetime = function.evalf(subs={"t" : self.run_doc['start'].timestamp()})
        lifetime = float(lifetime) # Convert away from Sympy type.

        run_number = self.run_doc['number']
        self.log.info("Run %d: calculated lifetime of %d us" % (run_number, lifetime))

        if not config.DATABASE_LOG:
            return

        # Update run database
        key = 'processor.DEFAULT.electron_lifetime_liquid'
        self.collection.find_and_modify({'_id': self.run_doc['_id']}, {'$set': {key: lifetime * units.us}})
from sympy.parsing.sympy_parser import parse_expr

from pax import units

from cax import config
from cax.task import Task


class AddElectronLifetime(Task):
    "Add electron lifetime to dataset"

    def __init__(self):
        self.collection_purity = config.mongo_collection('purity')
        Task.__init__(self)

    def each_run(self):
        if 'processor' in self.run_doc:
            return

        # Fetch the latest electron lifetime fit
        doc = self.collection_purity.find_one(sort=(('calculation_time', -1),))
        function = parse_expr(doc['electron_lifetime_function'])

        # Compute value from this function on this dataset
        lifetime = function.evalf(subs={"t" : self.run_doc['start'].timestamp()})
+         lifetime = float(lifetime) # Convert away from Sympy type.

        run_number = self.run_doc['number']
        self.log.info("Run %d: calculated lifetime of %d us" % (run_number, lifetime))

        if not config.DATABASE_LOG:
            return

        # Update run database
        key = 'processor.DEFAULT.electron_lifetime_liquid'
        self.collection.find_and_modify({'_id': self.run_doc['_id']}, {'$set': {key: lifetime * units.us}})
1
0.025
1
0
fc136b02dff09c56297a694fd25a68137f4c74f6
README.md
README.md
[![Build Status](https://travis-ci.org/venmo/feature_ramp.svg?branch=master)](https://travis-ci.org/venmo/feature_ramp)

Supports toggling and ramping features via a lightweight Redis backend.
[![Build Status](https://travis-ci.org/venmo/feature_ramp.svg?branch=master)](https://travis-ci.org/venmo/feature_ramp)

Supports [feature toggling](http://martinfowler.com/bliki/FeatureToggle.html) and ramping features via a lightweight Redis backend.

Installation
------------------
Feature ramp requires requires a running Redis server. If your application is written in Python, we recommend using [redis-py](https://github.com/andymccurdy/redis-py) for a convenient way to interface with Redis. To install Redis, follow the [Redis Quick Start](http://redis.io/topics/quickstart).

Once you have redis-py and a Redis server running, you're ready to start using Feature Ramp.

Getting Started
-----------------
``` python
>>> from feature_ramp.Feature import Feature
>>> feature_a = Feature('feature_a')
>>> feature_a.activate()
>>> feature_a.is_active
True
>>> feature_a.deactivate()
>>> feature_a.is_active
False
```
Add installationa and get started sections.
Add installationa and get started sections.
Markdown
mit
venmo/feature_ramp
markdown
## Code Before:
[![Build Status](https://travis-ci.org/venmo/feature_ramp.svg?branch=master)](https://travis-ci.org/venmo/feature_ramp)

Supports toggling and ramping features via a lightweight Redis backend.

## Instruction:
Add installationa and get started sections.

## Code After:
[![Build Status](https://travis-ci.org/venmo/feature_ramp.svg?branch=master)](https://travis-ci.org/venmo/feature_ramp)

Supports [feature toggling](http://martinfowler.com/bliki/FeatureToggle.html) and ramping features via a lightweight Redis backend.

Installation
------------------
Feature ramp requires requires a running Redis server. If your application is written in Python, we recommend using [redis-py](https://github.com/andymccurdy/redis-py) for a convenient way to interface with Redis. To install Redis, follow the [Redis Quick Start](http://redis.io/topics/quickstart).

Once you have redis-py and a Redis server running, you're ready to start using Feature Ramp.

Getting Started
-----------------
``` python
>>> from feature_ramp.Feature import Feature
>>> feature_a = Feature('feature_a')
>>> feature_a.activate()
>>> feature_a.is_active
True
>>> feature_a.deactivate()
>>> feature_a.is_active
False
```
[![Build Status](https://travis-ci.org/venmo/feature_ramp.svg?branch=master)](https://travis-ci.org/venmo/feature_ramp)

- Supports toggling and ramping features via a lightweight Redis backend.
+ Supports [feature toggling](http://martinfowler.com/bliki/FeatureToggle.html) and ramping features via a lightweight Redis backend.
+
+ Installation
+ ------------------
+ Feature ramp requires requires a running Redis server. If your application is written in Python, we recommend using [redis-py](https://github.com/andymccurdy/redis-py) for a convenient way to interface with Redis. To install Redis, follow the [Redis Quick Start](http://redis.io/topics/quickstart).
+
+ Once you have redis-py and a Redis server running, you're ready to start using Feature Ramp.
+
+ Getting Started
+ -----------------
+ ``` python
+ >>> from feature_ramp.Feature import Feature
+ >>> feature_a = Feature('feature_a')
+ >>> feature_a.activate()
+ >>> feature_a.is_active
+ True
+ >>> feature_a.deactivate()
+ >>> feature_a.is_active
+ False
+ ```
21
7
20
1
ef73f2500caa1d0f193d951446aa938705978afc
CMakeLists.txt
CMakeLists.txt
cmake_minimum_required(VERSION 2.6)
project (pstack)
add_executable(pstack
    dead.cc
    dump.cc
    dwarf.cc
    elf.cc
    live.cc
    process.cc
    proc_service.cc
    pstack.cc
    reader.cc
    )
add_executable(t t.cc)

if(NOT ELF_BITS)
    if(CMAKE_SIZEOF_VOID_P EQUAL 8)
        SET(ELF_BITS 64)
    endif(CMAKE_SIZEOF_VOID_P EQUAL 8)
    if(CMAKE_SIZEOF_VOID_P EQUAL 4)
        SET(ELF_BITS 32)
    endif(CMAKE_SIZEOF_VOID_P EQUAL 4)
endif(NOT ELF_BITS)

if (PROFILE)
    add_definitions("-pg")
    set_target_properties(pstack PROPERTIES LINK_FLAGS -pg)
endif(PROFILE)

if(CMAKE_COMPILER_IS_GNUCXX)
    add_definitions("-std=c++0x")
endif(CMAKE_COMPILER_IS_GNUCXX)

add_definitions("-DELF_BITS=${ELF_BITS}")

if("${CMAKE_SYSTEM}" MATCHES "Linux")
    target_link_libraries(pstack "-lthread_db")
    target_link_libraries(t "-lpthread")
endif("${CMAKE_SYSTEM}" MATCHES "Linux")
cmake_minimum_required(VERSION 2.6)
project (pstack)
add_executable(pstack
    dead.cc
    dump.cc
    dwarf.cc
    elf.cc
    live.cc
    process.cc
    proc_service.cc
    pstack.cc
    reader.cc
    )
add_executable(t t.cc)

if(NOT ELF_BITS)
    if(CMAKE_SIZEOF_VOID_P EQUAL 8)
        SET(ELF_BITS 64)
    endif(CMAKE_SIZEOF_VOID_P EQUAL 8)
    if(CMAKE_SIZEOF_VOID_P EQUAL 4)
        SET(ELF_BITS 32)
    endif(CMAKE_SIZEOF_VOID_P EQUAL 4)
endif(NOT ELF_BITS)

if (PROFILE)
    add_definitions("-pg")
    set_target_properties(pstack PROPERTIES LINK_FLAGS -pg)
endif(PROFILE)

if(CMAKE_COMPILER_IS_GNUCXX)
    add_definitions("-std=c++0x")
    if (ELF_BITS EQUAL 32)
        add_definitions("-m32")
        set_target_properties(pstack PROPERTIES LINK_FLAGS -m32)
        set_target_properties(t PROPERTIES LINK_FLAGS -m32)
    endif()
endif(CMAKE_COMPILER_IS_GNUCXX)

add_definitions("-DELF_BITS=${ELF_BITS}")

if("${CMAKE_SYSTEM}" MATCHES "Linux")
    add_definitions("-D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE")
    target_link_libraries(pstack "-lthread_db")
    target_link_libraries(t "-lpthread")
endif("${CMAKE_SYSTEM}" MATCHES "Linux")
Add flags for 32-bit builds.
Add flags for 32-bit builds.
Text
bsd-2-clause
peadar/pstack,peadar/pstack,peadar/pstack,peadar/pstack
text
## Code Before: cmake_minimum_required(VERSION 2.6) project (pstack) add_executable(pstack dead.cc dump.cc dwarf.cc elf.cc live.cc process.cc proc_service.cc pstack.cc reader.cc ) add_executable(t t.cc) if(NOT ELF_BITS) if(CMAKE_SIZEOF_VOID_P EQUAL 8) SET(ELF_BITS 64) endif(CMAKE_SIZEOF_VOID_P EQUAL 8) if(CMAKE_SIZEOF_VOID_P EQUAL 4) SET(ELF_BITS 32) endif(CMAKE_SIZEOF_VOID_P EQUAL 4) endif(NOT ELF_BITS) if (PROFILE) add_definitions("-pg") set_target_properties(pstack PROPERTIES LINK_FLAGS -pg) endif(PROFILE) if(CMAKE_COMPILER_IS_GNUCXX) add_definitions("-std=c++0x") endif(CMAKE_COMPILER_IS_GNUCXX) add_definitions("-DELF_BITS=${ELF_BITS}") if("${CMAKE_SYSTEM}" MATCHES "Linux") target_link_libraries(pstack "-lthread_db") target_link_libraries(t "-lpthread") endif("${CMAKE_SYSTEM}" MATCHES "Linux") ## Instruction: Add flags for 32-bit builds. ## Code After: cmake_minimum_required(VERSION 2.6) project (pstack) add_executable(pstack dead.cc dump.cc dwarf.cc elf.cc live.cc process.cc proc_service.cc pstack.cc reader.cc ) add_executable(t t.cc) if(NOT ELF_BITS) if(CMAKE_SIZEOF_VOID_P EQUAL 8) SET(ELF_BITS 64) endif(CMAKE_SIZEOF_VOID_P EQUAL 8) if(CMAKE_SIZEOF_VOID_P EQUAL 4) SET(ELF_BITS 32) endif(CMAKE_SIZEOF_VOID_P EQUAL 4) endif(NOT ELF_BITS) if (PROFILE) add_definitions("-pg") set_target_properties(pstack PROPERTIES LINK_FLAGS -pg) endif(PROFILE) if(CMAKE_COMPILER_IS_GNUCXX) add_definitions("-std=c++0x") if (ELF_BITS EQUAL 32) add_definitions("-m32") set_target_properties(pstack PROPERTIES LINK_FLAGS -m32) set_target_properties(t PROPERTIES LINK_FLAGS -m32) endif() endif(CMAKE_COMPILER_IS_GNUCXX) add_definitions("-DELF_BITS=${ELF_BITS}") if("${CMAKE_SYSTEM}" MATCHES "Linux") add_definitions("-D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE") target_link_libraries(pstack "-lthread_db") target_link_libraries(t "-lpthread") endif("${CMAKE_SYSTEM}" MATCHES "Linux")
cmake_minimum_required(VERSION 2.6) project (pstack) add_executable(pstack dead.cc dump.cc dwarf.cc elf.cc live.cc process.cc proc_service.cc pstack.cc reader.cc ) add_executable(t t.cc) if(NOT ELF_BITS) if(CMAKE_SIZEOF_VOID_P EQUAL 8) SET(ELF_BITS 64) endif(CMAKE_SIZEOF_VOID_P EQUAL 8) if(CMAKE_SIZEOF_VOID_P EQUAL 4) SET(ELF_BITS 32) endif(CMAKE_SIZEOF_VOID_P EQUAL 4) endif(NOT ELF_BITS) if (PROFILE) add_definitions("-pg") set_target_properties(pstack PROPERTIES LINK_FLAGS -pg) endif(PROFILE) if(CMAKE_COMPILER_IS_GNUCXX) add_definitions("-std=c++0x") + if (ELF_BITS EQUAL 32) + add_definitions("-m32") + set_target_properties(pstack PROPERTIES LINK_FLAGS -m32) + set_target_properties(t PROPERTIES LINK_FLAGS -m32) + endif() endif(CMAKE_COMPILER_IS_GNUCXX) add_definitions("-DELF_BITS=${ELF_BITS}") if("${CMAKE_SYSTEM}" MATCHES "Linux") + add_definitions("-D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE") target_link_libraries(pstack "-lthread_db") target_link_libraries(t "-lpthread") endif("${CMAKE_SYSTEM}" MATCHES "Linux")
6
0.206897
6
0
75b1fdc9f290c85b4d469cdce5e5d1154aed4881
indra/tests/test_util.py
indra/tests/test_util.py
from __future__ import absolute_import, print_function, unicode_literals from builtins import dict, str import xml.etree.ElementTree as ET from indra.util import UnicodeXMLTreeBuilder as UTB from indra.util import unicode_strs from io import BytesIO def test_unicode_tree_builder(): xml = u'<html><bar>asdf</bar></html>'.encode('utf-8') xml_io = BytesIO(xml) tree = ET.parse(xml_io, parser=UTB()) bar = tree.find('.//bar') assert unicode_strs(bar)
from __future__ import absolute_import, print_function, unicode_literals from builtins import dict, str import json import xml.etree.ElementTree as ET from indra.util import UnicodeXMLTreeBuilder as UTB, kappy_json_to_graph from indra.util import unicode_strs from io import BytesIO def test_unicode_tree_builder(): xml = u'<html><bar>asdf</bar></html>'.encode('utf-8') xml_io = BytesIO(xml) tree = ET.parse(xml_io, parser=UTB()) bar = tree.find('.//bar') assert unicode_strs(bar) def test_kappy_influence_json_to_graph(): with open('kappy_influence.json', 'r') as f: imap = json.load(f) graph = kappy_json_to_graph(imap) assert graph is not None, 'No graph produced.' n_nodes = len(graph.nodes) n_edges = len(graph.edges) assert n_nodes == 4, \ 'Wrong number (%d vs. %d) of nodes on the graph.' % (n_nodes, 4) assert n_edges == 6, \ "Wrong number (%d vs. %d) of edges on graph." % (n_edges, 4)
Implement test for basic graph.
Implement test for basic graph.
Python
bsd-2-clause
pvtodorov/indra,johnbachman/indra,sorgerlab/belpy,pvtodorov/indra,sorgerlab/indra,sorgerlab/indra,pvtodorov/indra,bgyori/indra,sorgerlab/indra,sorgerlab/belpy,johnbachman/indra,pvtodorov/indra,bgyori/indra,johnbachman/belpy,johnbachman/belpy,sorgerlab/belpy,johnbachman/belpy,bgyori/indra,johnbachman/indra
python
## Code Before: from __future__ import absolute_import, print_function, unicode_literals from builtins import dict, str import xml.etree.ElementTree as ET from indra.util import UnicodeXMLTreeBuilder as UTB from indra.util import unicode_strs from io import BytesIO def test_unicode_tree_builder(): xml = u'<html><bar>asdf</bar></html>'.encode('utf-8') xml_io = BytesIO(xml) tree = ET.parse(xml_io, parser=UTB()) bar = tree.find('.//bar') assert unicode_strs(bar) ## Instruction: Implement test for basic graph. ## Code After: from __future__ import absolute_import, print_function, unicode_literals from builtins import dict, str import json import xml.etree.ElementTree as ET from indra.util import UnicodeXMLTreeBuilder as UTB, kappy_json_to_graph from indra.util import unicode_strs from io import BytesIO def test_unicode_tree_builder(): xml = u'<html><bar>asdf</bar></html>'.encode('utf-8') xml_io = BytesIO(xml) tree = ET.parse(xml_io, parser=UTB()) bar = tree.find('.//bar') assert unicode_strs(bar) def test_kappy_influence_json_to_graph(): with open('kappy_influence.json', 'r') as f: imap = json.load(f) graph = kappy_json_to_graph(imap) assert graph is not None, 'No graph produced.' n_nodes = len(graph.nodes) n_edges = len(graph.edges) assert n_nodes == 4, \ 'Wrong number (%d vs. %d) of nodes on the graph.' % (n_nodes, 4) assert n_edges == 6, \ "Wrong number (%d vs. %d) of edges on graph." % (n_edges, 4)
from __future__ import absolute_import, print_function, unicode_literals from builtins import dict, str + + import json import xml.etree.ElementTree as ET - from indra.util import UnicodeXMLTreeBuilder as UTB + from indra.util import UnicodeXMLTreeBuilder as UTB, kappy_json_to_graph ? +++++++++++++++++++++ from indra.util import unicode_strs from io import BytesIO + def test_unicode_tree_builder(): xml = u'<html><bar>asdf</bar></html>'.encode('utf-8') xml_io = BytesIO(xml) tree = ET.parse(xml_io, parser=UTB()) bar = tree.find('.//bar') assert unicode_strs(bar) + + def test_kappy_influence_json_to_graph(): + with open('kappy_influence.json', 'r') as f: + imap = json.load(f) + graph = kappy_json_to_graph(imap) + assert graph is not None, 'No graph produced.' + n_nodes = len(graph.nodes) + n_edges = len(graph.edges) + assert n_nodes == 4, \ + 'Wrong number (%d vs. %d) of nodes on the graph.' % (n_nodes, 4) + assert n_edges == 6, \ + "Wrong number (%d vs. %d) of edges on graph." % (n_edges, 4) +
18
1.285714
17
1
947d14955244ddd134c339143a3a0c8fc70b56bf
lib/hooloo/show.rb
lib/hooloo/show.rb
class Hooloo::Show < Hooloo::MozartHash def self.popular_today(args={limit: 10, position: 0}) args.merge!({sort: 'popular_today'}) Hooloo.request('shows', args)['data'].map { |x| new x['show'] } end def initialize(id) super if id.is_a? Fixnum @obj = Hooloo.request("shows/#{id}")['data'][0]['show'] elsif id.is_a? Hash @obj = id end end def videos(season=1, args={}) Hooloo.paginated_request("shows/#{id}/episodes", { season_number: season }.merge(args)) { |g, x| g << Hooloo::Video.new(x['video']) } end bool :embed_permitted, :has_captions date :cache_time cast Hooloo::Company, :company cast Hooloo::Rollup, {rollups: :show_rollups}, {map: true} end
class Hooloo::Show < Hooloo::MozartHash def self.popular_today(args={}) Hooloo.paginated_request('shows', { sort: 'popular_today' }.merge(args)) { |g, x| g << Hooloo::Show.new(x['show']) } end def initialize(id) super if id.is_a? Fixnum @obj = Hooloo.request("shows/#{id}")['data'][0]['show'] elsif id.is_a? Hash @obj = id end end def videos(season=1, args={}) Hooloo.paginated_request("shows/#{id}/episodes", { season_number: season }.merge(args)) { |g, x| g << Hooloo::Video.new(x['video']) } end bool :embed_permitted, :has_captions date :cache_time cast Hooloo::Company, :company cast Hooloo::Rollup, {rollups: :show_rollups}, {map: true} end
Switch Hooloo::Show.popular_today to return enumerator
Switch Hooloo::Show.popular_today to return enumerator
Ruby
mit
NuckChorris/hooloo
ruby
## Code Before: class Hooloo::Show < Hooloo::MozartHash def self.popular_today(args={limit: 10, position: 0}) args.merge!({sort: 'popular_today'}) Hooloo.request('shows', args)['data'].map { |x| new x['show'] } end def initialize(id) super if id.is_a? Fixnum @obj = Hooloo.request("shows/#{id}")['data'][0]['show'] elsif id.is_a? Hash @obj = id end end def videos(season=1, args={}) Hooloo.paginated_request("shows/#{id}/episodes", { season_number: season }.merge(args)) { |g, x| g << Hooloo::Video.new(x['video']) } end bool :embed_permitted, :has_captions date :cache_time cast Hooloo::Company, :company cast Hooloo::Rollup, {rollups: :show_rollups}, {map: true} end ## Instruction: Switch Hooloo::Show.popular_today to return enumerator ## Code After: class Hooloo::Show < Hooloo::MozartHash def self.popular_today(args={}) Hooloo.paginated_request('shows', { sort: 'popular_today' }.merge(args)) { |g, x| g << Hooloo::Show.new(x['show']) } end def initialize(id) super if id.is_a? Fixnum @obj = Hooloo.request("shows/#{id}")['data'][0]['show'] elsif id.is_a? Hash @obj = id end end def videos(season=1, args={}) Hooloo.paginated_request("shows/#{id}/episodes", { season_number: season }.merge(args)) { |g, x| g << Hooloo::Video.new(x['video']) } end bool :embed_permitted, :has_captions date :cache_time cast Hooloo::Company, :company cast Hooloo::Rollup, {rollups: :show_rollups}, {map: true} end
class Hooloo::Show < Hooloo::MozartHash - def self.popular_today(args={limit: 10, position: 0}) ? ---------------------- + def self.popular_today(args={}) - args.merge!({sort: 'popular_today'}) - Hooloo.request('shows', args)['data'].map { |x| new x['show'] } + Hooloo.paginated_request('shows', { + sort: 'popular_today' + }.merge(args)) { |g, x| g << Hooloo::Show.new(x['show']) } end def initialize(id) super if id.is_a? Fixnum @obj = Hooloo.request("shows/#{id}")['data'][0]['show'] elsif id.is_a? Hash @obj = id end end def videos(season=1, args={}) Hooloo.paginated_request("shows/#{id}/episodes", { season_number: season }.merge(args)) { |g, x| g << Hooloo::Video.new(x['video']) } end bool :embed_permitted, :has_captions date :cache_time cast Hooloo::Company, :company cast Hooloo::Rollup, {rollups: :show_rollups}, {map: true} end
7
0.304348
4
3
e3878267851061f30a3692f80d51d152bac89db7
lib/elasticsearch/bulk_index_worker.rb
lib/elasticsearch/bulk_index_worker.rb
require "sidekiq" require "config" require "sidekiq_json_encoding_patch" require "failed_job_worker" module Elasticsearch class BulkIndexWorker include Sidekiq::Worker sidekiq_options :retry => 5 # Logger is defined on the class for use inthe `sidekiq_retries_exhausted` # block, and as an instance method for use the rest of the time def self.logger Logging.logger[self] end def logger self.class.logger end sidekiq_options :queue => :bulk def perform(index_name, document_hashes) noun = document_hashes.size > 1 ? "documents" : "document" logger.info "Indexing #{document_hashes.size} queued #{noun} into #{index_name}" index(index_name).bulk_index(document_hashes) end sidekiq_retries_exhausted do |msg| logger.warn "Job '#{msg["jid"]}' failed; forwarding to failure queue" FailedJobWorker.perform_async(msg) end private def index(index_name) settings.search_config.search_server.index(index_name) end end end
require "sidekiq" require "sidekiq_json_encoding_patch" require "failed_job_worker" module Elasticsearch # This class requires the `config.rb` file to be loaded, since it requires # access to the `search_config` setting, but including it here can cause a # circular require dependency, from: # # SearchConfig -> SearchServer -> IndexGroup -> Index -> DocumentQueue -> # BulkIndexWorker -> SearchConfig class BulkIndexWorker include Sidekiq::Worker sidekiq_options :retry => 5 # Logger is defined on the class for use inthe `sidekiq_retries_exhausted` # block, and as an instance method for use the rest of the time def self.logger Logging.logger[self] end def logger self.class.logger end sidekiq_options :queue => :bulk def perform(index_name, document_hashes) noun = document_hashes.size > 1 ? "documents" : "document" logger.info "Indexing #{document_hashes.size} queued #{noun} into #{index_name}" index(index_name).bulk_index(document_hashes) end sidekiq_retries_exhausted do |msg| logger.warn "Job '#{msg["jid"]}' failed; forwarding to failure queue" FailedJobWorker.perform_async(msg) end private def index(index_name) settings.search_config.search_server.index(index_name) end end end
Resolve a circular dependency in indexing worker.
Resolve a circular dependency in indexing worker.
Ruby
mit
alphagov/rummager,theodi/rummager,alphagov/rummager,theodi/rummager
ruby
## Code Before: require "sidekiq" require "config" require "sidekiq_json_encoding_patch" require "failed_job_worker" module Elasticsearch class BulkIndexWorker include Sidekiq::Worker sidekiq_options :retry => 5 # Logger is defined on the class for use inthe `sidekiq_retries_exhausted` # block, and as an instance method for use the rest of the time def self.logger Logging.logger[self] end def logger self.class.logger end sidekiq_options :queue => :bulk def perform(index_name, document_hashes) noun = document_hashes.size > 1 ? "documents" : "document" logger.info "Indexing #{document_hashes.size} queued #{noun} into #{index_name}" index(index_name).bulk_index(document_hashes) end sidekiq_retries_exhausted do |msg| logger.warn "Job '#{msg["jid"]}' failed; forwarding to failure queue" FailedJobWorker.perform_async(msg) end private def index(index_name) settings.search_config.search_server.index(index_name) end end end ## Instruction: Resolve a circular dependency in indexing worker. ## Code After: require "sidekiq" require "sidekiq_json_encoding_patch" require "failed_job_worker" module Elasticsearch # This class requires the `config.rb` file to be loaded, since it requires # access to the `search_config` setting, but including it here can cause a # circular require dependency, from: # # SearchConfig -> SearchServer -> IndexGroup -> Index -> DocumentQueue -> # BulkIndexWorker -> SearchConfig class BulkIndexWorker include Sidekiq::Worker sidekiq_options :retry => 5 # Logger is defined on the class for use inthe `sidekiq_retries_exhausted` # block, and as an instance method for use the rest of the time def self.logger Logging.logger[self] end def logger self.class.logger end sidekiq_options :queue => :bulk def perform(index_name, document_hashes) noun = document_hashes.size > 1 ? "documents" : "document" logger.info "Indexing #{document_hashes.size} queued #{noun} into #{index_name}" index(index_name).bulk_index(document_hashes) end sidekiq_retries_exhausted do |msg| logger.warn "Job '#{msg["jid"]}' failed; forwarding to failure queue" FailedJobWorker.perform_async(msg) end private def index(index_name) settings.search_config.search_server.index(index_name) end end end
require "sidekiq" - require "config" require "sidekiq_json_encoding_patch" require "failed_job_worker" module Elasticsearch + # This class requires the `config.rb` file to be loaded, since it requires + # access to the `search_config` setting, but including it here can cause a + # circular require dependency, from: + # + # SearchConfig -> SearchServer -> IndexGroup -> Index -> DocumentQueue -> + # BulkIndexWorker -> SearchConfig class BulkIndexWorker include Sidekiq::Worker sidekiq_options :retry => 5 # Logger is defined on the class for use inthe `sidekiq_retries_exhausted` # block, and as an instance method for use the rest of the time def self.logger Logging.logger[self] end def logger self.class.logger end sidekiq_options :queue => :bulk def perform(index_name, document_hashes) noun = document_hashes.size > 1 ? "documents" : "document" logger.info "Indexing #{document_hashes.size} queued #{noun} into #{index_name}" index(index_name).bulk_index(document_hashes) end sidekiq_retries_exhausted do |msg| logger.warn "Job '#{msg["jid"]}' failed; forwarding to failure queue" FailedJobWorker.perform_async(msg) end private def index(index_name) settings.search_config.search_server.index(index_name) end end end
7
0.175
6
1
6b1ce991282be24d4a112cd5a20a0ed476a834c1
stats/stats.go
stats/stats.go
package stats type SingleStat struct { TotalGB float32 UsedGB float32 AvailableGB float32 UsagePercentage float32 } type MultipleStat struct { AverageUsagePercentage float32 UsagePercentagePerCore []float32 }
package stats type SingleStat struct { TotalGB float32 `json:"totalGB"` UsedGB float32 `json:"usedGB"` AvailableGB float32 `json:"availableGB"` UsagePercentage float32 `json:"usagePercentage"` } type MultipleStat struct { AverageUsagePercentage float32 `json:"averageUsagePercentage"` UsagePercentagePerCore []float32 `json:"usagePercentagePerCore"` } type AllStat struct { CPU MultipleStat `json:"cpu"` RAM SingleStat `json:"ram"` Disk SingleStat `json:"disk"` }
Define JSON data structure serialization.
Define JSON data structure serialization.
Go
mit
julianespinel/mbu
go
## Code Before: package stats type SingleStat struct { TotalGB float32 UsedGB float32 AvailableGB float32 UsagePercentage float32 } type MultipleStat struct { AverageUsagePercentage float32 UsagePercentagePerCore []float32 } ## Instruction: Define JSON data structure serialization. ## Code After: package stats type SingleStat struct { TotalGB float32 `json:"totalGB"` UsedGB float32 `json:"usedGB"` AvailableGB float32 `json:"availableGB"` UsagePercentage float32 `json:"usagePercentage"` } type MultipleStat struct { AverageUsagePercentage float32 `json:"averageUsagePercentage"` UsagePercentagePerCore []float32 `json:"usagePercentagePerCore"` } type AllStat struct { CPU MultipleStat `json:"cpu"` RAM SingleStat `json:"ram"` Disk SingleStat `json:"disk"` }
package stats type SingleStat struct { - TotalGB float32 - UsedGB float32 - AvailableGB float32 - UsagePercentage float32 + TotalGB float32 `json:"totalGB"` + UsedGB float32 `json:"usedGB"` + AvailableGB float32 `json:"availableGB"` + UsagePercentage float32 `json:"usagePercentage"` } type MultipleStat struct { - AverageUsagePercentage float32 - UsagePercentagePerCore []float32 + AverageUsagePercentage float32 `json:"averageUsagePercentage"` + UsagePercentagePerCore []float32 `json:"usagePercentagePerCore"` } + + type AllStat struct { + CPU MultipleStat `json:"cpu"` + RAM SingleStat `json:"ram"` + Disk SingleStat `json:"disk"` + }
18
1.384615
12
6
a0adfab8c2fdac2f2b8bad8b1f036bd35b27a06c
setup.py
setup.py
import re import setuptools import sys if not sys.version_info >= (3, 5): exit("Sorry, Python must be later than 3.5.") setuptools.setup( name="tensorflow-qnd", version=re.search(r'__version__ *= *"([0-9]+\.[0-9]+\.[0-9]+)" *\n', open("qnd/__init__.py").read()).group(1), description="Quick and Dirty TensorFlow command framework", long_description=open("README.md").read(), license="Public Domain", author="Yota Toyama", author_email="raviqqe@gmail.com", url="https://github.com/raviqqe/tensorflow-qnd/", packages=["qnd"], install_requires=["gargparse"], classifiers=[ "Development Status :: 3 - Alpha", "Intended Audience :: Developers", "License :: Public Domain", "Programming Language :: Python :: 3.5", "Topic :: Scientific/Engineering :: Artificial Intelligence", "Topic :: System :: Networking", ], )
import re import setuptools import sys if not sys.version_info >= (3, 5): exit("Sorry, Python must be later than 3.5.") setuptools.setup( name="tensorflow-qnd", version=re.search(r'__version__ *= *"([0-9]+\.[0-9]+\.[0-9]+)" *\n', open("qnd/__init__.py").read()).group(1), description="Quick and Dirty TensorFlow command framework", long_description=open("README.md").read(), license="Public Domain", author="Yota Toyama", author_email="raviqqe@gmail.com", url="https://github.com/raviqqe/tensorflow-qnd/", packages=["qnd"], install_requires=["gargparse"], classifiers=[ "Development Status :: 3 - Alpha", "Intended Audience :: Developers", "License :: Public Domain", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Topic :: Scientific/Engineering :: Artificial Intelligence", "Topic :: System :: Networking", ], )
Add Python 3.6 to topics
Add Python 3.6 to topics
Python
unlicense
raviqqe/tensorflow-qnd,raviqqe/tensorflow-qnd
python
## Code Before: import re import setuptools import sys if not sys.version_info >= (3, 5): exit("Sorry, Python must be later than 3.5.") setuptools.setup( name="tensorflow-qnd", version=re.search(r'__version__ *= *"([0-9]+\.[0-9]+\.[0-9]+)" *\n', open("qnd/__init__.py").read()).group(1), description="Quick and Dirty TensorFlow command framework", long_description=open("README.md").read(), license="Public Domain", author="Yota Toyama", author_email="raviqqe@gmail.com", url="https://github.com/raviqqe/tensorflow-qnd/", packages=["qnd"], install_requires=["gargparse"], classifiers=[ "Development Status :: 3 - Alpha", "Intended Audience :: Developers", "License :: Public Domain", "Programming Language :: Python :: 3.5", "Topic :: Scientific/Engineering :: Artificial Intelligence", "Topic :: System :: Networking", ], ) ## Instruction: Add Python 3.6 to topics ## Code After: import re import setuptools import sys if not sys.version_info >= (3, 5): exit("Sorry, Python must be later than 3.5.") setuptools.setup( name="tensorflow-qnd", version=re.search(r'__version__ *= *"([0-9]+\.[0-9]+\.[0-9]+)" *\n', open("qnd/__init__.py").read()).group(1), description="Quick and Dirty TensorFlow command framework", long_description=open("README.md").read(), license="Public Domain", author="Yota Toyama", author_email="raviqqe@gmail.com", url="https://github.com/raviqqe/tensorflow-qnd/", packages=["qnd"], install_requires=["gargparse"], classifiers=[ "Development Status :: 3 - Alpha", "Intended Audience :: Developers", "License :: Public Domain", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Topic :: Scientific/Engineering :: Artificial Intelligence", "Topic :: System :: Networking", ], )
import re import setuptools import sys if not sys.version_info >= (3, 5): exit("Sorry, Python must be later than 3.5.") setuptools.setup( name="tensorflow-qnd", version=re.search(r'__version__ *= *"([0-9]+\.[0-9]+\.[0-9]+)" *\n', open("qnd/__init__.py").read()).group(1), description="Quick and Dirty TensorFlow command framework", long_description=open("README.md").read(), license="Public Domain", author="Yota Toyama", author_email="raviqqe@gmail.com", url="https://github.com/raviqqe/tensorflow-qnd/", packages=["qnd"], install_requires=["gargparse"], classifiers=[ "Development Status :: 3 - Alpha", "Intended Audience :: Developers", "License :: Public Domain", "Programming Language :: Python :: 3.5", + "Programming Language :: Python :: 3.6", "Topic :: Scientific/Engineering :: Artificial Intelligence", "Topic :: System :: Networking", ], )
1
0.033333
1
0
73ee6d3d5804a8c037b7f8959177e68227bcf071
tests/RegressionAlgorithm/LinearLeastSquaresTest.php
tests/RegressionAlgorithm/LinearLeastSquaresTest.php
<?php use mcordingley\Regression\RegressionAlgorithm\LinearLeastSquares; class LinearLeastSquaresTest extends PHPUnit_Framework_TestCase { protected $strategy; public function __construct($name = null, array $data = array(), $dataName = '') { parent::__construct($name, $data, $dataName); $this->strategy = new LinearLeastSquares; } public function testFatMatrix() { // TODO } public function testSkinnyMatrix() { $coefficients = $this->strategy->regress([1, 2, 3, 4, 5], [ [1, 1], [1, 2], [1, 1.3], [1, 3.75], [1, 2.25], ]); $this->assertEquals(1.095497063, round($coefficients[0], 9)); $this->assertEquals(0.924515989, round($coefficients[1], 9)); } public function testSquareMatrix() { $coefficients = $this->strategy->regress([2, 4, 6, 8, 10], [ [1, 3, 5, 7, 2], [1, 3, 2, 1, 5], [1, 1, 2, 3, 4], [1, 1, 3, 4, 7], [1, 19, 17, 15, 14], ]); $this->assertEquals(-2.667, round($coefficients[0], 3)); $this->assertEquals(3.333, round($coefficients[1], 3)); $this->assertEquals(-9.333, round($coefficients[2], 3)); $this->assertEquals(5.333, round($coefficients[3], 3)); $this->assertEquals(2, round($coefficients[4], 3)); } }
<?php use mcordingley\Regression\RegressionAlgorithm\LinearLeastSquares; class LinearLeastSquaresTest extends PHPUnit_Framework_TestCase { protected $strategy; public function __construct($name = null, array $data = array(), $dataName = '') { parent::__construct($name, $data, $dataName); $this->strategy = new LinearLeastSquares; } public function testRegress() { $coefficients = $this->strategy->regress([1, 2, 3, 4, 5], [ [1, 1], [1, 2], [1, 1.3], [1, 3.75], [1, 2.25], ]); $this->assertEquals(1.095497063, round($coefficients[0], 9)); $this->assertEquals(0.924515989, round($coefficients[1], 9)); } }
Remove tests that targeted removed code paths in LinearLeastSquares.
Remove tests that targeted removed code paths in LinearLeastSquares.
PHP
mit
dbirchak/Regression,mcordingley/Regression
php
## Code Before: <?php use mcordingley\Regression\RegressionAlgorithm\LinearLeastSquares; class LinearLeastSquaresTest extends PHPUnit_Framework_TestCase { protected $strategy; public function __construct($name = null, array $data = array(), $dataName = '') { parent::__construct($name, $data, $dataName); $this->strategy = new LinearLeastSquares; } public function testFatMatrix() { // TODO } public function testSkinnyMatrix() { $coefficients = $this->strategy->regress([1, 2, 3, 4, 5], [ [1, 1], [1, 2], [1, 1.3], [1, 3.75], [1, 2.25], ]); $this->assertEquals(1.095497063, round($coefficients[0], 9)); $this->assertEquals(0.924515989, round($coefficients[1], 9)); } public function testSquareMatrix() { $coefficients = $this->strategy->regress([2, 4, 6, 8, 10], [ [1, 3, 5, 7, 2], [1, 3, 2, 1, 5], [1, 1, 2, 3, 4], [1, 1, 3, 4, 7], [1, 19, 17, 15, 14], ]); $this->assertEquals(-2.667, round($coefficients[0], 3)); $this->assertEquals(3.333, round($coefficients[1], 3)); $this->assertEquals(-9.333, round($coefficients[2], 3)); $this->assertEquals(5.333, round($coefficients[3], 3)); $this->assertEquals(2, round($coefficients[4], 3)); } } ## Instruction: Remove tests that targeted removed code paths in LinearLeastSquares. ## Code After: <?php use mcordingley\Regression\RegressionAlgorithm\LinearLeastSquares; class LinearLeastSquaresTest extends PHPUnit_Framework_TestCase { protected $strategy; public function __construct($name = null, array $data = array(), $dataName = '') { parent::__construct($name, $data, $dataName); $this->strategy = new LinearLeastSquares; } public function testRegress() { $coefficients = $this->strategy->regress([1, 2, 3, 4, 5], [ [1, 1], [1, 2], [1, 1.3], [1, 3.75], [1, 2.25], ]); $this->assertEquals(1.095497063, round($coefficients[0], 9)); $this->assertEquals(0.924515989, round($coefficients[1], 9)); } }
<?php use mcordingley\Regression\RegressionAlgorithm\LinearLeastSquares; class LinearLeastSquaresTest extends PHPUnit_Framework_TestCase { protected $strategy; public function __construct($name = null, array $data = array(), $dataName = '') { parent::__construct($name, $data, $dataName); $this->strategy = new LinearLeastSquares; } - public function testFatMatrix() ? ^^^^^^ ^^ + public function testRegress() ? ^^^ ^^^ - { - // TODO - } - - public function testSkinnyMatrix() { $coefficients = $this->strategy->regress([1, 2, 3, 4, 5], [ [1, 1], [1, 2], [1, 1.3], [1, 3.75], [1, 2.25], ]); $this->assertEquals(1.095497063, round($coefficients[0], 9)); $this->assertEquals(0.924515989, round($coefficients[1], 9)); } - - public function testSquareMatrix() - { - $coefficients = $this->strategy->regress([2, 4, 6, 8, 10], [ - [1, 3, 5, 7, 2], - [1, 3, 2, 1, 5], - [1, 1, 2, 3, 4], - [1, 1, 3, 4, 7], - [1, 19, 17, 15, 14], - ]); - - $this->assertEquals(-2.667, round($coefficients[0], 3)); - $this->assertEquals(3.333, round($coefficients[1], 3)); - $this->assertEquals(-9.333, round($coefficients[2], 3)); - $this->assertEquals(5.333, round($coefficients[3], 3)); - $this->assertEquals(2, round($coefficients[4], 3)); - } }
24
0.461538
1
23
1cf5a0fe163a7769a83abc3cf5b14b91089a5f28
lib/s3.js
lib/s3.js
var assert = require('assert') var AWS = require('aws-sdk') /** * Create s3 client * @param {Object} config AWS configuration * @returns {Object} s3 client and helpers */ module.exports = function s3Loader (config) { assert(config, 's3Loader requires config') var client = new AWS.S3(config) client.endpoint = config.endpoint return { client, uploadFilePromise, deleteObjectsPromise } function uploadFilePromise (uploadConfig) { return new Promise((resolve, reject) => { client.upload(uploadConfig, (err, data) => err ? reject(err) : resolve(data)) }) } function deleteObjectsPromise (deleteConfig) { return new Promise((resolve, reject) => { client.deleteObject(deleteConfig, (err, data) => err ? reject(err) : resolve(data)) }) } }
const assert = require('assert') const AWS = require('aws-sdk') /** * Create s3 client * @param {Object} config AWS configuration * @returns {Object} s3 client and helpers */ module.exports = function s3Loader (config) { assert(config, 's3Loader requires config') const client = new AWS.S3(config) return { client, uploadFilePromise, deleteObjectsPromise } function uploadFilePromise (uploadConfig) { return new Promise((resolve, reject) => { client.upload(uploadConfig, (err, data) => err ? reject(err) : resolve(data)) }) } function deleteObjectsPromise (deleteConfig) { return new Promise((resolve, reject) => { client.deleteObject(deleteConfig, (err, data) => err ? reject(err) : resolve(data)) }) } }
Use const and remove useless lines
Use const and remove useless lines
JavaScript
apache-2.0
cesarandreu/bshed,cesarandreu/bshed
javascript
## Code Before: var assert = require('assert') var AWS = require('aws-sdk') /** * Create s3 client * @param {Object} config AWS configuration * @returns {Object} s3 client and helpers */ module.exports = function s3Loader (config) { assert(config, 's3Loader requires config') var client = new AWS.S3(config) client.endpoint = config.endpoint return { client, uploadFilePromise, deleteObjectsPromise } function uploadFilePromise (uploadConfig) { return new Promise((resolve, reject) => { client.upload(uploadConfig, (err, data) => err ? reject(err) : resolve(data)) }) } function deleteObjectsPromise (deleteConfig) { return new Promise((resolve, reject) => { client.deleteObject(deleteConfig, (err, data) => err ? reject(err) : resolve(data)) }) } } ## Instruction: Use const and remove useless lines ## Code After: const assert = require('assert') const AWS = require('aws-sdk') /** * Create s3 client * @param {Object} config AWS configuration * @returns {Object} s3 client and helpers */ module.exports = function s3Loader (config) { assert(config, 's3Loader requires config') const client = new AWS.S3(config) return { client, uploadFilePromise, deleteObjectsPromise } function uploadFilePromise (uploadConfig) { return new Promise((resolve, reject) => { client.upload(uploadConfig, (err, data) => err ? reject(err) : resolve(data)) }) } function deleteObjectsPromise (deleteConfig) { return new Promise((resolve, reject) => { client.deleteObject(deleteConfig, (err, data) => err ? reject(err) : resolve(data)) }) } }
- var assert = require('assert') ? ^^^ + const assert = require('assert') ? ^^^^^ - var AWS = require('aws-sdk') ? ^^^ + const AWS = require('aws-sdk') ? ^^^^^ /** * Create s3 client * @param {Object} config AWS configuration * @returns {Object} s3 client and helpers */ module.exports = function s3Loader (config) { assert(config, 's3Loader requires config') - var client = new AWS.S3(config) ? ^^^ + const client = new AWS.S3(config) ? ^^^^^ - - client.endpoint = config.endpoint return { client, uploadFilePromise, deleteObjectsPromise } function uploadFilePromise (uploadConfig) { return new Promise((resolve, reject) => { client.upload(uploadConfig, (err, data) => err ? reject(err) : resolve(data)) }) } function deleteObjectsPromise (deleteConfig) { return new Promise((resolve, reject) => { client.deleteObject(deleteConfig, (err, data) => err ? reject(err) : resolve(data)) }) } }
8
0.25
3
5
6a8c8bc0e407327e5c0e4cae3d4d6ace179a6940
webserver/codemanagement/serializers.py
webserver/codemanagement/serializers.py
from rest_framework import serializers from greta.models import Repository from competition.models import Team from .models import TeamClient, TeamSubmission class TeamSerializer(serializers.ModelSerializer): class Meta: model = Team fields = ('id', 'name', 'slug') class RepoSerializer(serializers.ModelSerializer): class Meta: model = Repository fields = ('name', 'description', 'forked_from', 'path', 'is_ready') forked_from = serializers.RelatedField() path = serializers.SerializerMethodField('get_path') is_ready = serializers.SerializerMethodField('get_is_ready') def get_path(self, repo): return repo.path def get_is_ready(self, repo): return repo.is_ready() class TeamSubmissionSerializer(serializers.ModelSerializer): class Meta: model = TeamSubmission fields = ('name', 'commit') class TeamClientSerializer(serializers.ModelSerializer): class Meta: model = TeamClient fields = ('team', 'repository', 'tag', 'language') team = TeamSerializer() repository = RepoSerializer() tag = serializers.SerializerMethodField('get_tag') language = serializers.SerializerMethodField('get_language') def get_tag(self, teamclient): try: latest_sub= teamclient.submissions.latest() return TeamSubmissionSerializer(latest_sub).data except TeamSubmission.DoesNotExist: return None def get_language(self, teamclient): return teamclient.base.language
from rest_framework import serializers from greta.models import Repository from competition.models import Team from .models import TeamClient, TeamSubmission class TeamSerializer(serializers.ModelSerializer): class Meta: model = Team fields = ('id', 'name', 'slug', 'eligible_to_win') class RepoSerializer(serializers.ModelSerializer): class Meta: model = Repository fields = ('name', 'description', 'forked_from', 'path', 'is_ready') forked_from = serializers.RelatedField() path = serializers.SerializerMethodField('get_path') is_ready = serializers.SerializerMethodField('get_is_ready') def get_path(self, repo): return repo.path def get_is_ready(self, repo): return repo.is_ready() class TeamSubmissionSerializer(serializers.ModelSerializer): class Meta: model = TeamSubmission fields = ('name', 'commit') class TeamClientSerializer(serializers.ModelSerializer): class Meta: model = TeamClient fields = ('team', 'repository', 'tag', 'language') team = TeamSerializer() repository = RepoSerializer() tag = serializers.SerializerMethodField('get_tag') language = serializers.SerializerMethodField('get_language') def get_tag(self, teamclient): try: latest_sub= teamclient.submissions.latest() return TeamSubmissionSerializer(latest_sub).data except TeamSubmission.DoesNotExist: return None def get_language(self, teamclient): return teamclient.base.language
Add team eligibility to API
Add team eligibility to API
Python
bsd-3-clause
siggame/webserver,siggame/webserver,siggame/webserver
python
## Code Before: from rest_framework import serializers from greta.models import Repository from competition.models import Team from .models import TeamClient, TeamSubmission class TeamSerializer(serializers.ModelSerializer): class Meta: model = Team fields = ('id', 'name', 'slug') class RepoSerializer(serializers.ModelSerializer): class Meta: model = Repository fields = ('name', 'description', 'forked_from', 'path', 'is_ready') forked_from = serializers.RelatedField() path = serializers.SerializerMethodField('get_path') is_ready = serializers.SerializerMethodField('get_is_ready') def get_path(self, repo): return repo.path def get_is_ready(self, repo): return repo.is_ready() class TeamSubmissionSerializer(serializers.ModelSerializer): class Meta: model = TeamSubmission fields = ('name', 'commit') class TeamClientSerializer(serializers.ModelSerializer): class Meta: model = TeamClient fields = ('team', 'repository', 'tag', 'language') team = TeamSerializer() repository = RepoSerializer() tag = serializers.SerializerMethodField('get_tag') language = serializers.SerializerMethodField('get_language') def get_tag(self, teamclient): try: latest_sub= teamclient.submissions.latest() return TeamSubmissionSerializer(latest_sub).data except TeamSubmission.DoesNotExist: return None def get_language(self, teamclient): return teamclient.base.language ## Instruction: Add team eligibility to API ## Code After: from rest_framework import serializers from greta.models import Repository from competition.models import Team from .models import TeamClient, TeamSubmission class TeamSerializer(serializers.ModelSerializer): class Meta: model = Team fields = ('id', 'name', 'slug', 'eligible_to_win') class RepoSerializer(serializers.ModelSerializer): class Meta: model = Repository fields = ('name', 'description', 'forked_from', 'path', 'is_ready') forked_from = serializers.RelatedField() path = serializers.SerializerMethodField('get_path') is_ready = serializers.SerializerMethodField('get_is_ready') def get_path(self, repo): return repo.path def get_is_ready(self, repo): return repo.is_ready() class TeamSubmissionSerializer(serializers.ModelSerializer): class Meta: model = TeamSubmission fields = ('name', 'commit') class TeamClientSerializer(serializers.ModelSerializer): class Meta: model = TeamClient fields = ('team', 'repository', 'tag', 'language') team = TeamSerializer() repository = RepoSerializer() tag = serializers.SerializerMethodField('get_tag') language = serializers.SerializerMethodField('get_language') def get_tag(self, teamclient): try: latest_sub= teamclient.submissions.latest() return TeamSubmissionSerializer(latest_sub).data except TeamSubmission.DoesNotExist: return None def get_language(self, teamclient): return teamclient.base.language
from rest_framework import serializers from greta.models import Repository from competition.models import Team from .models import TeamClient, TeamSubmission class TeamSerializer(serializers.ModelSerializer): class Meta: model = Team - fields = ('id', 'name', 'slug') + fields = ('id', 'name', 'slug', 'eligible_to_win') ? +++++++++++++++++++ class RepoSerializer(serializers.ModelSerializer): class Meta: model = Repository fields = ('name', 'description', 'forked_from', 'path', 'is_ready') forked_from = serializers.RelatedField() path = serializers.SerializerMethodField('get_path') is_ready = serializers.SerializerMethodField('get_is_ready') def get_path(self, repo): return repo.path def get_is_ready(self, repo): return repo.is_ready() class TeamSubmissionSerializer(serializers.ModelSerializer): class Meta: model = TeamSubmission fields = ('name', 'commit') class TeamClientSerializer(serializers.ModelSerializer): class Meta: model = TeamClient fields = ('team', 'repository', 'tag', 'language') team = TeamSerializer() repository = RepoSerializer() tag = serializers.SerializerMethodField('get_tag') language = serializers.SerializerMethodField('get_language') def get_tag(self, teamclient): try: latest_sub= teamclient.submissions.latest() return TeamSubmissionSerializer(latest_sub).data except TeamSubmission.DoesNotExist: return None def get_language(self, teamclient): return teamclient.base.language
2
0.033898
1
1
b8ae4f90424eca35575f23ecd013d4e6abb4c649
app/views/outpost/resource/edit.html.erb
app/views/outpost/resource/edit.html.erb
<% add_to_page_title "Editing: #{@record.to_title}" %> <%= simple_form_for [:outpost, @record], html: { class: "form-horizontal" } do |f| %> <%= render 'errors', f: f %> <%= render 'form_fields', record: @record, f: f %> <%= render 'extra_fields', f: f %> <%= render "/outpost/shared/submit_row", record: @record %> <% end %> <% content_for :sidebar do %> <ul class="story-status unstyled"> <li> Last Updated: <strong><%= format_date @record.updated_at, time: true %></strong> </li> </ul> <hr /> <div id="fixed-sidebar" data-spy="affix" data-offset-top="140"> <!-- Be sure to change the data-offset-top attribute on this element if you move it vertically --> <%= render "/outpost/shared/form_nav" %> </div> <% end %> <% content_for :footer do %> <script> preview = new outpost.Preview({baseUrl: '<%= @record.admin_show_path %>'}); </script> <% end %>
<% add_to_page_title "Editing: #{@record.to_title}" %> <%= simple_form_for [:outpost, @record], html: { class: "form-horizontal" } do |f| %> <%= render 'errors', f: f %> <%= render 'form_fields', record: @record, f: f %> <%= render 'extra_fields', f: f %> <%= render "/outpost/shared/submit_row", record: @record %> <% end %> <% content_for :sidebar do %> <ul class="story-status unstyled"> <li> Last Updated: <strong><%= format_date @record.updated_at, time: true %></strong> </li> <li> <% if @record.respond_to? :status_text %> <span class="<%=status_bootstrap_map[@record.status]%>"><%= @record.status_text %></span> <% end %> <% if @record.respond_to?(:published_at) && @record.respond_to?(:published?) && @record.published? %> on <strong><%= format_date @record.published_at, time: true %></strong> <% end %> <% if @record.respond_to?(:publish_alarm) && @record.publish_alarm.present? && @record.publish_alarm.fire_at.present? %> Alarm - <strong><%= format_date @record.publish_alarm.fire_at, time: true %></strong> <% end %> </li> </ul> <hr /> <div id="fixed-sidebar" data-spy="affix" data-offset-top="140"> <!-- Be sure to change the data-offset-top attribute on this element if you move it vertically --> <%= render "/outpost/shared/form_nav" %> </div> <% end %> <% content_for :footer do %> <script> preview = new outpost.Preview({baseUrl: '<%= @record.admin_show_path %>'}); </script> <% end %>
Add some more stuff to the sidebar by default
Add some more stuff to the sidebar by default
HTML+ERB
mit
SCPR/outpost,Ravenstine/outpost,Ravenstine/outpost,SCPR/outpost,SCPR/outpost,Ravenstine/outpost,bricker/outpost,bricker/outpost
html+erb
## Code Before: <% add_to_page_title "Editing: #{@record.to_title}" %> <%= simple_form_for [:outpost, @record], html: { class: "form-horizontal" } do |f| %> <%= render 'errors', f: f %> <%= render 'form_fields', record: @record, f: f %> <%= render 'extra_fields', f: f %> <%= render "/outpost/shared/submit_row", record: @record %> <% end %> <% content_for :sidebar do %> <ul class="story-status unstyled"> <li> Last Updated: <strong><%= format_date @record.updated_at, time: true %></strong> </li> </ul> <hr /> <div id="fixed-sidebar" data-spy="affix" data-offset-top="140"> <!-- Be sure to change the data-offset-top attribute on this element if you move it vertically --> <%= render "/outpost/shared/form_nav" %> </div> <% end %> <% content_for :footer do %> <script> preview = new outpost.Preview({baseUrl: '<%= @record.admin_show_path %>'}); </script> <% end %> ## Instruction: Add some more stuff to the sidebar by default ## Code After: <% add_to_page_title "Editing: #{@record.to_title}" %> <%= simple_form_for [:outpost, @record], html: { class: "form-horizontal" } do |f| %> <%= render 'errors', f: f %> <%= render 'form_fields', record: @record, f: f %> <%= render 'extra_fields', f: f %> <%= render "/outpost/shared/submit_row", record: @record %> <% end %> <% content_for :sidebar do %> <ul class="story-status unstyled"> <li> Last Updated: <strong><%= format_date @record.updated_at, time: true %></strong> </li> <li> <% if @record.respond_to? :status_text %> <span class="<%=status_bootstrap_map[@record.status]%>"><%= @record.status_text %></span> <% end %> <% if @record.respond_to?(:published_at) && @record.respond_to?(:published?) && @record.published? %> on <strong><%= format_date @record.published_at, time: true %></strong> <% end %> <% if @record.respond_to?(:publish_alarm) && @record.publish_alarm.present? && @record.publish_alarm.fire_at.present? %> Alarm - <strong><%= format_date @record.publish_alarm.fire_at, time: true %></strong> <% end %> </li> </ul> <hr /> <div id="fixed-sidebar" data-spy="affix" data-offset-top="140"> <!-- Be sure to change the data-offset-top attribute on this element if you move it vertically --> <%= render "/outpost/shared/form_nav" %> </div> <% end %> <% content_for :footer do %> <script> preview = new outpost.Preview({baseUrl: '<%= @record.admin_show_path %>'}); </script> <% end %>
<% add_to_page_title "Editing: #{@record.to_title}" %> <%= simple_form_for [:outpost, @record], html: { class: "form-horizontal" } do |f| %> <%= render 'errors', f: f %> <%= render 'form_fields', record: @record, f: f %> <%= render 'extra_fields', f: f %> <%= render "/outpost/shared/submit_row", record: @record %> <% end %> <% content_for :sidebar do %> <ul class="story-status unstyled"> <li> Last Updated: <strong><%= format_date @record.updated_at, time: true %></strong> + </li> + + <li> + <% if @record.respond_to? :status_text %> + <span class="<%=status_bootstrap_map[@record.status]%>"><%= @record.status_text %></span> + <% end %> + + <% if @record.respond_to?(:published_at) && @record.respond_to?(:published?) && @record.published? %> + on <strong><%= format_date @record.published_at, time: true %></strong> + <% end %> + + <% if @record.respond_to?(:publish_alarm) && @record.publish_alarm.present? && @record.publish_alarm.fire_at.present? %> + Alarm - <strong><%= format_date @record.publish_alarm.fire_at, time: true %></strong> + <% end %> </li> </ul> <hr /> <div id="fixed-sidebar" data-spy="affix" data-offset-top="140"> <!-- Be sure to change the data-offset-top attribute on this element if you move it vertically --> <%= render "/outpost/shared/form_nav" %> </div> <% end %> <% content_for :footer do %> <script> preview = new outpost.Preview({baseUrl: '<%= @record.admin_show_path %>'}); </script> <% end %>
14
0.466667
14
0
43c48cdbc5cf5793ad6f0f46cde5ca91ad2b8756
core/metautils/src/vectorLinkdef.h
core/metautils/src/vectorLinkdef.h
using namespace std; #endif #pragma create TClass vector<bool>; #pragma create TClass vector<char>; #pragma create TClass vector<short>; #pragma create TClass vector<long>; #pragma create TClass vector<unsigned char>; #pragma create TClass vector<unsigned short>; #pragma create TClass vector<unsigned int>; #pragma create TClass vector<unsigned long>; #pragma create TClass vector<float>; #pragma create TClass vector<double>; #pragma create TClass vector<char*>; #pragma create TClass vector<const char*>; #pragma create TClass vector<string>; #if (!(G__GNUC==3 && G__GNUC_MINOR==1)) && !defined(G__KCC) && (!defined(G__VISUAL) || G__MSC_VER<1300) // gcc3.1,3.2 has a problem with iterator<void*,...,void&> #pragma create TClass vector<void*>; #endif
using namespace std; #endif #pragma create TClass vector<bool>; #pragma create TClass vector<char>; #pragma create TClass vector<short>; #pragma create TClass vector<long>; #pragma create TClass vector<unsigned char>; #pragma create TClass vector<unsigned short>; #pragma create TClass vector<unsigned int>; #pragma create TClass vector<unsigned long>; #pragma create TClass vector<float>; #pragma create TClass vector<double>; #pragma create TClass vector<char*>; #pragma create TClass vector<const char*>; #pragma create TClass vector<string>; #pragma create TClass vector<Long64_t>; #pragma create TClass vector<ULong64_t>; #if (!(G__GNUC==3 && G__GNUC_MINOR==1)) && !defined(G__KCC) && (!defined(G__VISUAL) || G__MSC_VER<1300) // gcc3.1,3.2 has a problem with iterator<void*,...,void&> #pragma create TClass vector<void*>; #endif
Add missing TClass creation for vector<Long64_t> and vector<ULong64_t>
Add missing TClass creation for vector<Long64_t> and vector<ULong64_t> git-svn-id: ecbadac9c76e8cf640a0bca86f6bd796c98521e3@38659 27541ba8-7e3a-0410-8455-c3a389f83636
C
lgpl-2.1
bbannier/ROOT,bbannier/ROOT,bbannier/ROOT,bbannier/ROOT,bbannier/ROOT,bbannier/ROOT,bbannier/ROOT
c
## Code Before: using namespace std; #endif #pragma create TClass vector<bool>; #pragma create TClass vector<char>; #pragma create TClass vector<short>; #pragma create TClass vector<long>; #pragma create TClass vector<unsigned char>; #pragma create TClass vector<unsigned short>; #pragma create TClass vector<unsigned int>; #pragma create TClass vector<unsigned long>; #pragma create TClass vector<float>; #pragma create TClass vector<double>; #pragma create TClass vector<char*>; #pragma create TClass vector<const char*>; #pragma create TClass vector<string>; #if (!(G__GNUC==3 && G__GNUC_MINOR==1)) && !defined(G__KCC) && (!defined(G__VISUAL) || G__MSC_VER<1300) // gcc3.1,3.2 has a problem with iterator<void*,...,void&> #pragma create TClass vector<void*>; #endif ## Instruction: Add missing TClass creation for vector<Long64_t> and vector<ULong64_t> git-svn-id: ecbadac9c76e8cf640a0bca86f6bd796c98521e3@38659 27541ba8-7e3a-0410-8455-c3a389f83636 ## Code After: using namespace std; #endif #pragma create TClass vector<bool>; #pragma create TClass vector<char>; #pragma create TClass vector<short>; #pragma create TClass vector<long>; #pragma create TClass vector<unsigned char>; #pragma create TClass vector<unsigned short>; #pragma create TClass vector<unsigned int>; #pragma create TClass vector<unsigned long>; #pragma create TClass vector<float>; #pragma create TClass vector<double>; #pragma create TClass vector<char*>; #pragma create TClass vector<const char*>; #pragma create TClass vector<string>; #pragma create TClass vector<Long64_t>; #pragma create TClass vector<ULong64_t>; #if (!(G__GNUC==3 && G__GNUC_MINOR==1)) && !defined(G__KCC) && (!defined(G__VISUAL) || G__MSC_VER<1300) // gcc3.1,3.2 has a problem with iterator<void*,...,void&> #pragma create TClass vector<void*>; #endif
using namespace std; #endif #pragma create TClass vector<bool>; #pragma create TClass vector<char>; #pragma create TClass vector<short>; #pragma create TClass vector<long>; #pragma create TClass vector<unsigned char>; #pragma create TClass vector<unsigned short>; #pragma create TClass vector<unsigned int>; #pragma create TClass vector<unsigned long>; #pragma create TClass vector<float>; #pragma create TClass vector<double>; #pragma create TClass vector<char*>; #pragma create TClass vector<const char*>; #pragma create TClass vector<string>; + #pragma create TClass vector<Long64_t>; + #pragma create TClass vector<ULong64_t>; #if (!(G__GNUC==3 && G__GNUC_MINOR==1)) && !defined(G__KCC) && (!defined(G__VISUAL) || G__MSC_VER<1300) // gcc3.1,3.2 has a problem with iterator<void*,...,void&> #pragma create TClass vector<void*>; #endif
2
0.095238
2
0
ae5f24564b2fb8c155332db8d19d7171f592e50b
test-requirements.txt
test-requirements.txt
hacking>=0.5.6,<0.8 # Optional backend: SQL pysqlite # Optional backend: Memcache python-memcached # Optional backend: LDAP # authenticate against an existing LDAP server python-ldap==2.3.13 # Testing # computes code coverage percentages coverage>=3.6 # fixture stubbing fixtures>=0.3.14 # mock object framework mock>=1.0 mox>=0.5.3 # required to build documentation sphinx>=1.1.2 # test wsgi apps without starting an http server WebTest>=2.0 extras discover python-subunit testrepository>=0.0.17 testtools>=0.9.32 # for python-keystoneclient # keystoneclient <0.2.1 httplib2 # replaces httplib2 in keystoneclient >=0.2.1 requests>=1.1 keyring>=1.6.1,<2.0 netifaces>=0.5 # For documentation oslo.sphinx
hacking>=0.5.6,<0.8 # Optional backend: SQL pysqlite # Optional backend: Memcache python-memcached # Optional backend: LDAP # authenticate against an existing LDAP server python-ldap==2.3.13 # Testing # computes code coverage percentages coverage>=3.6 # fixture stubbing fixtures>=0.3.14 # mock object framework mock>=1.0 mox>=0.5.3 # Optional KVS feature encryption testing pycrypto>=2.6 # required to build documentation sphinx>=1.1.2 # test wsgi apps without starting an http server WebTest>=2.0 extras discover python-subunit testrepository>=0.0.17 testtools>=0.9.32 # for python-keystoneclient # keystoneclient <0.2.1 httplib2 # replaces httplib2 in keystoneclient >=0.2.1 requests>=1.1 keyring>=1.6.1,<2.0 netifaces>=0.5 # For documentation oslo.sphinx
Add pycrypto as a test-requirement
Add pycrypto as a test-requirement Adding pycrypto as a test-requirement to allow for testing the optional KVS encryption support (ENCRYPT and HMAC validation of values). bp: dogpile-kvs-backends Change-Id: I2e9184dc66393cf805b18abd3822668a6f3792fa
Text
apache-2.0
himanshu-setia/keystone,jumpstarter-io/keystone,jonnary/keystone,jonnary/keystone,openstack/keystone,derekchiang/keystone,mahak/keystone,UTSA-ICS/keystone-kerberos,openstack/keystone,rodrigods/keystone,ging/keystone,mahak/keystone,klmitch/keystone,rajalokan/keystone,dstanek/keystone,derekchiang/keystone,promptworks/keystone,dsiddharth/access-keys,nuxeh/keystone,idjaw/keystone,maestro-hybrid-cloud/keystone,ging/keystone,vivekdhayaal/keystone,dsiddharth/access-keys,vivekdhayaal/keystone,klmitch/keystone,ilay09/keystone,cernops/keystone,idjaw/keystone,cernops/keystone,jamielennox/keystone,ilay09/keystone,maestro-hybrid-cloud/keystone,MaheshIBM/keystone,ajayaa/keystone,dims/keystone,promptworks/keystone,ilay09/keystone,reeshupatel/demo,UTSA-ICS/keystone-kerberos,rajalokan/keystone,roopali8/keystone,JioCloud/keystone,dims/keystone,rushiagr/keystone,dstanek/keystone,rushiagr/keystone,MaheshIBM/keystone,rajalokan/keystone,nuxeh/keystone,dstanek/keystone,roopali8/keystone,jumpstarter-io/keystone,dsiddharth/access-keys,ajayaa/keystone,promptworks/keystone,himanshu-setia/keystone,reeshupatel/demo,takeshineshiro/keystone,takeshineshiro/keystone,reeshupatel/demo,blueboxgroup/keystone,jumpstarter-io/keystone,vivekdhayaal/keystone,nuxeh/keystone,rushiagr/keystone,blueboxgroup/keystone,derekchiang/keystone,jamielennox/keystone,rodrigods/keystone,JioCloud/keystone,openstack/keystone,mahak/keystone
text
## Code Before: hacking>=0.5.6,<0.8 # Optional backend: SQL pysqlite # Optional backend: Memcache python-memcached # Optional backend: LDAP # authenticate against an existing LDAP server python-ldap==2.3.13 # Testing # computes code coverage percentages coverage>=3.6 # fixture stubbing fixtures>=0.3.14 # mock object framework mock>=1.0 mox>=0.5.3 # required to build documentation sphinx>=1.1.2 # test wsgi apps without starting an http server WebTest>=2.0 extras discover python-subunit testrepository>=0.0.17 testtools>=0.9.32 # for python-keystoneclient # keystoneclient <0.2.1 httplib2 # replaces httplib2 in keystoneclient >=0.2.1 requests>=1.1 keyring>=1.6.1,<2.0 netifaces>=0.5 # For documentation oslo.sphinx ## Instruction: Add pycrypto as a test-requirement Adding pycrypto as a test-requirement to allow for testing the optional KVS encryption support (ENCRYPT and HMAC validation of values). bp: dogpile-kvs-backends Change-Id: I2e9184dc66393cf805b18abd3822668a6f3792fa ## Code After: hacking>=0.5.6,<0.8 # Optional backend: SQL pysqlite # Optional backend: Memcache python-memcached # Optional backend: LDAP # authenticate against an existing LDAP server python-ldap==2.3.13 # Testing # computes code coverage percentages coverage>=3.6 # fixture stubbing fixtures>=0.3.14 # mock object framework mock>=1.0 mox>=0.5.3 # Optional KVS feature encryption testing pycrypto>=2.6 # required to build documentation sphinx>=1.1.2 # test wsgi apps without starting an http server WebTest>=2.0 extras discover python-subunit testrepository>=0.0.17 testtools>=0.9.32 # for python-keystoneclient # keystoneclient <0.2.1 httplib2 # replaces httplib2 in keystoneclient >=0.2.1 requests>=1.1 keyring>=1.6.1,<2.0 netifaces>=0.5 # For documentation oslo.sphinx
hacking>=0.5.6,<0.8 # Optional backend: SQL pysqlite # Optional backend: Memcache python-memcached # Optional backend: LDAP # authenticate against an existing LDAP server python-ldap==2.3.13 # Testing # computes code coverage percentages coverage>=3.6 # fixture stubbing fixtures>=0.3.14 # mock object framework mock>=1.0 mox>=0.5.3 + # Optional KVS feature encryption testing + pycrypto>=2.6 # required to build documentation sphinx>=1.1.2 # test wsgi apps without starting an http server WebTest>=2.0 extras discover python-subunit testrepository>=0.0.17 testtools>=0.9.32 # for python-keystoneclient # keystoneclient <0.2.1 httplib2 # replaces httplib2 in keystoneclient >=0.2.1 requests>=1.1 keyring>=1.6.1,<2.0 netifaces>=0.5 # For documentation oslo.sphinx
2
0.047619
2
0
44884df8170fff0ad367df479b937ea6d3e588df
deploy.sh
deploy.sh
git update-ref HEAD master git checkout master mvn release:prepare -B mvn release:perform --settings deploy-settings.xml
git update-ref HEAD refs/heads/master git checkout master mvn release:prepare -B mvn release:perform --settings deploy-settings.xml
Update to update-ref to fix maven release plugin
Update to update-ref to fix maven release plugin
Shell
mit
modernmaster/katas,modernmaster/katas,modernmaster/katas,modernmaster/katas
shell
## Code Before: git update-ref HEAD master git checkout master mvn release:prepare -B mvn release:perform --settings deploy-settings.xml ## Instruction: Update to update-ref to fix maven release plugin ## Code After: git update-ref HEAD refs/heads/master git checkout master mvn release:prepare -B mvn release:perform --settings deploy-settings.xml
- git update-ref HEAD master + git update-ref HEAD refs/heads/master ? +++++++++++ git checkout master mvn release:prepare -B mvn release:perform --settings deploy-settings.xml
2
0.5
1
1
45dcdaf4eb7fa26413345f5920999a5f317afdeb
lib/refills/import_generator.rb
lib/refills/import_generator.rb
require 'rails/generators' module Refills class ImportGenerator < Rails::Generators::Base desc 'Copy refills' source_root File.expand_path("../../../source", __FILE__) argument :snippet, type: :string, required: true def copy_html copy_file "_#{snippet}.html.erb", "app/views/refills/_#{snippet}.html.erb" end def copy_styles copy_file "stylesheets/refills/_#{snippet}.scss", "app/assets/stylesheets/refills/_#{snippet}.scss" end def copy_javascripts copy_file "javascripts/refills/_#{snippet}.js", "app/assets/javascripts/refills/_#{snippet}.js" end end end
require 'rails/generators' module Refills class ImportGenerator < Rails::Generators::Base desc 'Copy refills' source_root File.expand_path("../../../source", __FILE__) argument :snippet, type: :string, required: true def copy_html copy_file view_name(snippet), view_destination end def copy_styles copy_file stylesheet_template, stylesheet_destination end def copy_javascripts copy_file "javascripts/refills/_#{snippet}.js", "app/assets/javascripts/refills/_#{snippet}.js" end private def stylesheet_destination File.join('app', 'assets', 'stylesheets', 'refills', stylesheet_name(snippet_name)) end def view_destination File.join('app', 'views', 'refills', view_name(snippet_name)) end def stylesheet_template File.join('stylesheets', 'refills', stylesheet_name(snippet)) end def view_name(name) "_#{name}.html.erb" end def stylesheet_name(name) "_#{name}.scss" end def snippet_name snippet.underscore end end end
Use Rails naming conventions for partials
Use Rails naming conventions for partials
Ruby
mit
dydx/refills,smithdamen/refills,mehdroid/refills,greyhwndz/refills,DropsOfSerenity/refills,thoughtbot/refills,mehdroid/refills,thoughtbot/refills,cllns/refills,hlogmans/refills,smithdamen/refills,greyhwndz/refills,tranc99/refills,tranc99/refills,cllns/refills,dydx/refills,mehdroid/refills,thoughtbot/refills,thoughtbot/refills,hlogmans/refills,hlogmans/refills,greyhwndz/refills,dydx/refills,DropsOfSerenity/refills,cllns/refills,DropsOfSerenity/refills,smithdamen/refills,smithdamen/refills,greyhwndz/refills,mehdroid/refills,tranc99/refills,DropsOfSerenity/refills,cllns/refills,tranc99/refills,hlogmans/refills,dydx/refills
ruby
## Code Before: require 'rails/generators' module Refills class ImportGenerator < Rails::Generators::Base desc 'Copy refills' source_root File.expand_path("../../../source", __FILE__) argument :snippet, type: :string, required: true def copy_html copy_file "_#{snippet}.html.erb", "app/views/refills/_#{snippet}.html.erb" end def copy_styles copy_file "stylesheets/refills/_#{snippet}.scss", "app/assets/stylesheets/refills/_#{snippet}.scss" end def copy_javascripts copy_file "javascripts/refills/_#{snippet}.js", "app/assets/javascripts/refills/_#{snippet}.js" end end end ## Instruction: Use Rails naming conventions for partials ## Code After: require 'rails/generators' module Refills class ImportGenerator < Rails::Generators::Base desc 'Copy refills' source_root File.expand_path("../../../source", __FILE__) argument :snippet, type: :string, required: true def copy_html copy_file view_name(snippet), view_destination end def copy_styles copy_file stylesheet_template, stylesheet_destination end def copy_javascripts copy_file "javascripts/refills/_#{snippet}.js", "app/assets/javascripts/refills/_#{snippet}.js" end private def stylesheet_destination File.join('app', 'assets', 'stylesheets', 'refills', stylesheet_name(snippet_name)) end def view_destination File.join('app', 'views', 'refills', view_name(snippet_name)) end def stylesheet_template File.join('stylesheets', 'refills', stylesheet_name(snippet)) end def view_name(name) "_#{name}.html.erb" end def stylesheet_name(name) "_#{name}.scss" end def snippet_name snippet.underscore end end end
require 'rails/generators' module Refills class ImportGenerator < Rails::Generators::Base desc 'Copy refills' source_root File.expand_path("../../../source", __FILE__) argument :snippet, type: :string, required: true def copy_html - copy_file "_#{snippet}.html.erb", "app/views/refills/_#{snippet}.html.erb" + copy_file view_name(snippet), view_destination end def copy_styles - copy_file "stylesheets/refills/_#{snippet}.scss", "app/assets/stylesheets/refills/_#{snippet}.scss" + copy_file stylesheet_template, stylesheet_destination end def copy_javascripts copy_file "javascripts/refills/_#{snippet}.js", "app/assets/javascripts/refills/_#{snippet}.js" end + + private + + def stylesheet_destination + File.join('app', 'assets', 'stylesheets', 'refills', stylesheet_name(snippet_name)) + end + + def view_destination + File.join('app', 'views', 'refills', view_name(snippet_name)) + end + + def stylesheet_template + File.join('stylesheets', 'refills', stylesheet_name(snippet)) + end + + def view_name(name) + "_#{name}.html.erb" + end + + def stylesheet_name(name) + "_#{name}.scss" + end + + def snippet_name + snippet.underscore + end end end
30
1.428571
28
2
fa1a08aed5bc6659304097d5ad7e653c553c1b11
cactus/utils/file.py
cactus/utils/file.py
import os import cStringIO import gzip import hashlib from cactus.utils.helpers import checksum class FakeTime: """ Monkey-patch gzip.time to avoid changing files every time we deploy them. """ def time(self): return 1111111111.111 def compressString(s): """Gzip a given string.""" gzip.time = FakeTime() zbuf = cStringIO.StringIO() zfile = gzip.GzipFile(mode='wb', compresslevel=9, fileobj=zbuf) zfile.write(s) zfile.close() return zbuf.getvalue() def fileSize(num): for x in ['b', 'kb', 'mb', 'gb', 'tb']: if num < 1024.0: return "%.0f%s" % (num, x) num /= 1024.0 def calculate_file_checksum(path): """ Calculate the MD5 sum for a file (needs to fit in memory) """ with open(path, 'rb') as f: return checksum(f.read()) def file_changed_hash(path): info = os.stat(path) hashKey = str(info.st_mtime) + str(info.st_size) return checksum(hashKey)
import os import cStringIO import gzip import hashlib import subprocess from cactus.utils.helpers import checksum class FakeTime: """ Monkey-patch gzip.time to avoid changing files every time we deploy them. """ def time(self): return 1111111111.111 def compressString(s): """Gzip a given string.""" gzip.time = FakeTime() zbuf = cStringIO.StringIO() zfile = gzip.GzipFile(mode='wb', compresslevel=9, fileobj=zbuf) zfile.write(s) zfile.close() return zbuf.getvalue() def fileSize(num): for x in ['b', 'kb', 'mb', 'gb', 'tb']: if num < 1024.0: return "%.0f%s" % (num, x) num /= 1024.0 def calculate_file_checksum(path): """ Calculate the MD5 sum for a file (needs to fit in memory) """ # with open(path, 'rb') as f: # return checksum(f.read()) output = subprocess.check_output(["md5", path]) md5 = output.split(" = ")[1].strip() return md5 def file_changed_hash(path): info = os.stat(path) hashKey = str(info.st_mtime) + str(info.st_size) return checksum(hashKey)
Use terminal md5 for perf
Use terminal md5 for perf
Python
bsd-3-clause
koenbok/Cactus,danielmorosan/Cactus,juvham/Cactus,dreadatour/Cactus,Bluetide/Cactus,chaudum/Cactus,koobs/Cactus,chaudum/Cactus,PegasusWang/Cactus,juvham/Cactus,eudicots/Cactus,Knownly/Cactus,danielmorosan/Cactus,page-io/Cactus,juvham/Cactus,fjxhkj/Cactus,koobs/Cactus,ibarria0/Cactus,PegasusWang/Cactus,danielmorosan/Cactus,Bluetide/Cactus,fjxhkj/Cactus,page-io/Cactus,andyzsf/Cactus-,fjxhkj/Cactus,eudicots/Cactus,chaudum/Cactus,PegasusWang/Cactus,koenbok/Cactus,dreadatour/Cactus,gone/Cactus,Bluetide/Cactus,ibarria0/Cactus,dreadatour/Cactus,gone/Cactus,koobs/Cactus,ibarria0/Cactus,koenbok/Cactus,page-io/Cactus,andyzsf/Cactus-,eudicots/Cactus,Knownly/Cactus,andyzsf/Cactus-,Knownly/Cactus,gone/Cactus
python
## Code Before:
import os
import cStringIO
import gzip
import hashlib

from cactus.utils.helpers import checksum


class FakeTime:
    """
    Monkey-patch gzip.time to avoid changing files every time we deploy them.
    """
    def time(self):
        return 1111111111.111


def compressString(s):
    """Gzip a given string."""
    gzip.time = FakeTime()

    zbuf = cStringIO.StringIO()
    zfile = gzip.GzipFile(mode='wb', compresslevel=9, fileobj=zbuf)
    zfile.write(s)
    zfile.close()

    return zbuf.getvalue()


def fileSize(num):
    for x in ['b', 'kb', 'mb', 'gb', 'tb']:
        if num < 1024.0:
            return "%.0f%s" % (num, x)
        num /= 1024.0


def calculate_file_checksum(path):
    """
    Calculate the MD5 sum for a file (needs to fit in memory)
    """
    with open(path, 'rb') as f:
        return checksum(f.read())


def file_changed_hash(path):
    info = os.stat(path)
    hashKey = str(info.st_mtime) + str(info.st_size)
    return checksum(hashKey)

## Instruction:
Use terminal md5 for perf

## Code After:
import os
import cStringIO
import gzip
import hashlib
import subprocess

from cactus.utils.helpers import checksum


class FakeTime:
    """
    Monkey-patch gzip.time to avoid changing files every time we deploy them.
    """
    def time(self):
        return 1111111111.111


def compressString(s):
    """Gzip a given string."""
    gzip.time = FakeTime()

    zbuf = cStringIO.StringIO()
    zfile = gzip.GzipFile(mode='wb', compresslevel=9, fileobj=zbuf)
    zfile.write(s)
    zfile.close()

    return zbuf.getvalue()


def fileSize(num):
    for x in ['b', 'kb', 'mb', 'gb', 'tb']:
        if num < 1024.0:
            return "%.0f%s" % (num, x)
        num /= 1024.0


def calculate_file_checksum(path):
    """
    Calculate the MD5 sum for a file (needs to fit in memory)
    """
    # with open(path, 'rb') as f:
    #     return checksum(f.read())
    output = subprocess.check_output(["md5", path])
    md5 = output.split(" = ")[1].strip()
    return md5


def file_changed_hash(path):
    info = os.stat(path)
    hashKey = str(info.st_mtime) + str(info.st_size)
    return checksum(hashKey)
import os
import cStringIO
import gzip
import hashlib
+ import subprocess

from cactus.utils.helpers import checksum


class FakeTime:
    """
    Monkey-patch gzip.time to avoid changing files every time we deploy them.
    """
    def time(self):
        return 1111111111.111


def compressString(s):
    """Gzip a given string."""
    gzip.time = FakeTime()

    zbuf = cStringIO.StringIO()
    zfile = gzip.GzipFile(mode='wb', compresslevel=9, fileobj=zbuf)
    zfile.write(s)
    zfile.close()

    return zbuf.getvalue()


def fileSize(num):
    for x in ['b', 'kb', 'mb', 'gb', 'tb']:
        if num < 1024.0:
            return "%.0f%s" % (num, x)
        num /= 1024.0


def calculate_file_checksum(path):
    """
    Calculate the MD5 sum for a file (needs to fit in memory)
    """
-     with open(path, 'rb') as f:
+     # with open(path, 'rb') as f:
?     ++
-         return checksum(f.read())
+     #     return checksum(f.read())
?     ++
+     output = subprocess.check_output(["md5", path])
+     md5 = output.split(" = ")[1].strip()
+     return md5


def file_changed_hash(path):
    info = os.stat(path)
    hashKey = str(info.st_mtime) + str(info.st_size)
    return checksum(hashKey)
8
0.173913
6
2
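An editorial aside, not a row of the dataset: the record above replaces an in-memory `hashlib`-based checksum with a shell-out to the platform `md5` binary (macOS only). A portable sketch of the same idea streams the file through `hashlib.md5` in chunks, so large files never need to fit in memory — the function name and chunk size below are illustrative, not taken from the repository:

```python
import hashlib
import os
import tempfile


def md5_of_file(path, chunk_size=1 << 20):
    """Stream `path` through MD5 one chunk at a time.

    Unlike hashing f.read() in one shot, memory use stays bounded by
    chunk_size; unlike shelling out to an `md5` binary, this runs on
    any platform with Python.
    """
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    # Self-check against a one-shot in-memory hash of the same bytes.
    data = b"hello world" * 1000
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(data)
        path = tmp.name
    try:
        assert md5_of_file(path, chunk_size=17) == hashlib.md5(data).hexdigest()
        print("ok")
    finally:
        os.unlink(path)
```

A tiny chunk size in the self-check deliberately forces many `update` calls, exercising the streaming path.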
b9b6985905c2146d3612104a1e21272dc77d8133
app/views/pages/show.blade.php
app/views/pages/show.blade.php
@extends('layout')

@section('content')
  <div class="wrapper">
    <h1 class="highlighted">{{{ $page->title }}}</h1>
    {{ $page->content_html }}
  </div>
@stop

@section('actions')
  <li>{{ link_to_route('pages.edit', 'Edit Page', [$page->slug], ['class' => 'btn secondary']) }}</li>
@stop
@extends('layout')

@section('content')
  <div class="wrapper">
    <h1>{{{ $page->title }}}</h1>
    {{ $page->content_html }}
  </div>
@stop

@section('actions')
  <li>{{ link_to_route('pages.edit', 'Edit Page', [$page->slug], ['class' => 'btn secondary']) }}</li>
@stop
Remove highlighted title on page so it doesn't conflict with nav.
Remove highlighted title on page so it doesn't conflict with nav.
PHP
mit
DoSomething/voting-app,axton21/voting-app,DoSomething/voting-app,axton21/voting-app,DoSomething/voting-app,axton21/voting-app,DoSomething/voting-app
php
## Code Before:
@extends('layout')

@section('content')
  <div class="wrapper">
    <h1 class="highlighted">{{{ $page->title }}}</h1>
    {{ $page->content_html }}
  </div>
@stop

@section('actions')
  <li>{{ link_to_route('pages.edit', 'Edit Page', [$page->slug], ['class' => 'btn secondary']) }}</li>
@stop

## Instruction:
Remove highlighted title on page so it doesn't conflict with nav.

## Code After:
@extends('layout')

@section('content')
  <div class="wrapper">
    <h1>{{{ $page->title }}}</h1>
    {{ $page->content_html }}
  </div>
@stop

@section('actions')
  <li>{{ link_to_route('pages.edit', 'Edit Page', [$page->slug], ['class' => 'btn secondary']) }}</li>
@stop
@extends('layout')

@section('content')
  <div class="wrapper">
-     <h1 class="highlighted">{{{ $page->title }}}</h1>
?        --------------------
+     <h1>{{{ $page->title }}}</h1>
    {{ $page->content_html }}
  </div>
@stop

@section('actions')
  <li>{{ link_to_route('pages.edit', 'Edit Page', [$page->slug], ['class' => 'btn secondary']) }}</li>
@stop
2
0.153846
1
1
a3d4149b667f00bc44401292b7ed95f705e1b2e2
source/class/test/Mammalian.js
source/class/test/Mammalian.js
/**
 * This is a generic Interface for Mammalian Animals
 *
 * Those class of Animals have different things in
 * common - compared to other animal classes like
 * {api.test.Fish}.
 */
core.Interface('api.test.Mammalian', {

  properties: {

    /**
     * All Mammalians have a fur!
     */
    fur: {
      type: 'Object',
      fire: 'changeFur',
      apply: function(value, old) {
      }
    },

    /**
     * All Mammalians have teeth!
     */
    teeth: {
      type: 'Number',
      fire: 'looseTeeth',
      apply: function(value, old) {
      }
    },

    /**
     * All Mammalians have bones!
     */
    bones: {
      type: 'Number',
      fire: 'breakBones',
      apply: function(value, old) {
      }
    }

  }

});
/**
 * This is a generic Interface for Mammalian Animals
 *
 * Those class of Animals have different things in
 * common - compared to other animal classes like
 * {api.test.Fish}.
 */
core.Interface('api.test.Mammalian', {

  properties: {

    /**
     * All Mammalians have a fur!
     */
    fur: {
      type: 'Object',
      fire: 'changeFur'
    },

    /**
     * All Mammalians have teeth!
     */
    teeth: {
      type: 'Number',
      fire: 'looseTeeth'
    },

    /**
     * All Mammalians have bones!
     */
    bones: {
      type: 'Number',
      fire: 'breakBones'
    }

  }

});
Make no sense to have apply routines in interface as it is not visible from the outside
Make no sense to have apply routines in interface as it is not visible from the outside
JavaScript
mit
zynga/apibrowser,zynga/apibrowser
javascript
## Code Before:
/**
 * This is a generic Interface for Mammalian Animals
 *
 * Those class of Animals have different things in
 * common - compared to other animal classes like
 * {api.test.Fish}.
 */
core.Interface('api.test.Mammalian', {

  properties: {

    /**
     * All Mammalians have a fur!
     */
    fur: {
      type: 'Object',
      fire: 'changeFur',
      apply: function(value, old) {
      }
    },

    /**
     * All Mammalians have teeth!
     */
    teeth: {
      type: 'Number',
      fire: 'looseTeeth',
      apply: function(value, old) {
      }
    },

    /**
     * All Mammalians have bones!
     */
    bones: {
      type: 'Number',
      fire: 'breakBones',
      apply: function(value, old) {
      }
    }

  }

});

## Instruction:
Make no sense to have apply routines in interface as it is not visible from the outside

## Code After:
/**
 * This is a generic Interface for Mammalian Animals
 *
 * Those class of Animals have different things in
 * common - compared to other animal classes like
 * {api.test.Fish}.
 */
core.Interface('api.test.Mammalian', {

  properties: {

    /**
     * All Mammalians have a fur!
     */
    fur: {
      type: 'Object',
      fire: 'changeFur'
    },

    /**
     * All Mammalians have teeth!
     */
    teeth: {
      type: 'Number',
      fire: 'looseTeeth'
    },

    /**
     * All Mammalians have bones!
     */
    bones: {
      type: 'Number',
      fire: 'breakBones'
    }

  }

});
/**
 * This is a generic Interface for Mammalian Animals
 *
 * Those class of Animals have different things in
 * common - compared to other animal classes like
 * {api.test.Fish}.
 */
core.Interface('api.test.Mammalian', {

  properties: {

    /**
     * All Mammalians have a fur!
     */
    fur: {
      type: 'Object',
-       fire: 'changeFur',
?                        -
+       fire: 'changeFur'
-       apply: function(value, old) {
-       }
    },

    /**
     * All Mammalians have teeth!
     */
    teeth: {
      type: 'Number',
-       fire: 'looseTeeth',
?                         -
+       fire: 'looseTeeth'
-       apply: function(value, old) {
-       }
    },

    /**
     * All Mammalians have bones!
     */
    bones: {
      type: 'Number',
-       fire: 'breakBones',
?                         -
+       fire: 'breakBones'
-       apply: function(value, old) {
-       }
    }

  }

});
12
0.266667
3
9
baddcb2cf40946ddf1eb5c11f2bffdb7136da926
README.md
README.md
Jekyll support for Capistrano 3.x.

## Installation

Add this line to your application's Gemfile:

```ruby
gem 'capistrano-jekyll'
```

And then execute:

    $ bundle

Or install it yourself as:

    $ gem install capistrano-jekyll

## Usage

Require in *Capfile* to use the default task:

```ruby
require 'capistrano/jekyll'
```

**jekyll:build** task will run after **deploy:published** as part of Capistrano's default deploy, or can be run in isolation with `bundle exec cap production jekyll:build`

## Contributing

1. Fork it ( https://github.com/ne1ro/capistrano-jekyll/fork )
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request
Jekyll support for Capistrano 3.x.

## Installation

Add this line to your application's Gemfile:

```ruby
gem 'capistrano-jekyll'
```

And then execute:

    $ bundle

Or install it yourself as:

    $ gem install capistrano-jekyll

## Usage

Require in *Capfile* to use the default task:

```ruby
require 'capistrano/jekyll'
```

**jekyll:build** task will run after **deploy:published** as part of Capistrano's default deploy, or can be run in isolation with `bundle exec cap production jekyll:build`

### List of tasks
* `cap jekyll:build # Build your website`
* `cap jekyll:doctor # Search site and print specific deprecation warnings`
* `cap jekyll:new # Creates a new Jekyll site scaffold in PATH`

## Contributing

1. Fork it ( https://github.com/ne1ro/capistrano-jekyll/fork )
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request
Add list of tasks to readme
Add list of tasks to readme
Markdown
mit
ne1ro/capistrano-jekyll
markdown
## Code Before:
Jekyll support for Capistrano 3.x.

## Installation

Add this line to your application's Gemfile:

```ruby
gem 'capistrano-jekyll'
```

And then execute:

    $ bundle

Or install it yourself as:

    $ gem install capistrano-jekyll

## Usage

Require in *Capfile* to use the default task:

```ruby
require 'capistrano/jekyll'
```

**jekyll:build** task will run after **deploy:published** as part of Capistrano's default deploy, or can be run in isolation with `bundle exec cap production jekyll:build`

## Contributing

1. Fork it ( https://github.com/ne1ro/capistrano-jekyll/fork )
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request

## Instruction:
Add list of tasks to readme

## Code After:
Jekyll support for Capistrano 3.x.

## Installation

Add this line to your application's Gemfile:

```ruby
gem 'capistrano-jekyll'
```

And then execute:

    $ bundle

Or install it yourself as:

    $ gem install capistrano-jekyll

## Usage

Require in *Capfile* to use the default task:

```ruby
require 'capistrano/jekyll'
```

**jekyll:build** task will run after **deploy:published** as part of Capistrano's default deploy, or can be run in isolation with `bundle exec cap production jekyll:build`

### List of tasks
* `cap jekyll:build # Build your website`
* `cap jekyll:doctor # Search site and print specific deprecation warnings`
* `cap jekyll:new # Creates a new Jekyll site scaffold in PATH`

## Contributing

1. Fork it ( https://github.com/ne1ro/capistrano-jekyll/fork )
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request
Jekyll support for Capistrano 3.x.

## Installation

Add this line to your application's Gemfile:

```ruby
gem 'capistrano-jekyll'
```

And then execute:

    $ bundle

Or install it yourself as:

    $ gem install capistrano-jekyll

## Usage

Require in *Capfile* to use the default task:

```ruby
require 'capistrano/jekyll'
```

**jekyll:build** task will run after **deploy:published** as part of Capistrano's default deploy, or can be run in isolation with `bundle exec cap production jekyll:build`

+ ### List of tasks
+ * `cap jekyll:build # Build your website`
+ * `cap jekyll:doctor # Search site and print specific deprecation warnings`
+ * `cap jekyll:new # Creates a new Jekyll site scaffold in PATH`
+
## Contributing

1. Fork it ( https://github.com/ne1ro/capistrano-jekyll/fork )
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request
5
0.142857
5
0
312f4ceebb9019dbd9b8ead27d9d386324b89064
README.md
README.md
[![Build status](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate.svg?branch=master)](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate)

Migration file requirements

* initial_<ver1>
* <ver1>_<ver2>_<whatever>

##How to run the tests

Execute:

1. `dev/setup_venv.sh`
2. `unit-tests.sh`
[![Build status](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate.svg?branch=master)](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate) ![Python versions supported](https://img.shields.io/badge/supports%20python-2.7%20%7C%203.4%20%7C%203.5%20%7C%203.6%20%7C%203.7-blue.svg)

Migration file requirements

* initial_<ver1>
* <ver1>_<ver2>_<whatever>

##How to run the tests

Execute:

1. `dev/setup_venv.sh`
2. `unit-tests.sh`
Add badge with python versions supported
[DOC] Add badge with python versions supported
Markdown
mit
aleasoluciones/simpledatamigrate,aleasoluciones/simpledatamigrate
markdown
## Code Before:
[![Build status](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate.svg?branch=master)](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate)

Migration file requirements

* initial_<ver1>
* <ver1>_<ver2>_<whatever>

##How to run the tests

Execute:

1. `dev/setup_venv.sh`
2. `unit-tests.sh`

## Instruction:
[DOC] Add badge with python versions supported

## Code After:
[![Build status](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate.svg?branch=master)](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate) ![Python versions supported](https://img.shields.io/badge/supports%20python-2.7%20%7C%203.4%20%7C%203.5%20%7C%203.6%20%7C%203.7-blue.svg)

Migration file requirements

* initial_<ver1>
* <ver1>_<ver2>_<whatever>

##How to run the tests

Execute:

1. `dev/setup_venv.sh`
2. `unit-tests.sh`
- [![Build status](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate.svg?branch=master)](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate)
+ [![Build status](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate.svg?branch=master)](https://secure.travis-ci.org/aleasoluciones/simpledatamigrate) ![Python versions supported](https://img.shields.io/badge/supports%20python-2.7%20%7C%203.4%20%7C%203.5%20%7C%203.6%20%7C%203.7-blue.svg)

Migration file requirements

* initial_<ver1>
* <ver1>_<ver2>_<whatever>

##How to run the tests

Execute:

1. `dev/setup_venv.sh`
2. `unit-tests.sh`
2
0.153846
1
1
2fede26d3b65642f9fa39fd682414ed55a03ef31
mybower-test.js
mybower-test.js
var bower = require('bower');
var mybowerConfig = require('./mybower.json');

for (var module in mybowerConfig) {
  if (mybowerConfig[module].local === true) continue;

  bower.commands.install([mybowerConfig[module].url], {}, {"directory": mybowerConfig[module].directory})
}

//console.log(bower.commands.install(['jquery'], {}, {"directory": "test"}))
var bower = require('bower');
var mybowerConfig = require('./mybower.json');

for (var module in mybowerConfig) {
  if (mybowerConfig[module].local === true) continue;

  bower.commands.install([module + "=" + mybowerConfig[module].url], {}, {"directory": mybowerConfig[module].directory})
}
Install into directory that has the module name.
Install into directory that has the module name.
JavaScript
mit
anisdjer/mybower-test,anisdjer/mybower
javascript
## Code Before:
var bower = require('bower');
var mybowerConfig = require('./mybower.json');

for (var module in mybowerConfig) {
  if (mybowerConfig[module].local === true) continue;

  bower.commands.install([mybowerConfig[module].url], {}, {"directory": mybowerConfig[module].directory})
}

//console.log(bower.commands.install(['jquery'], {}, {"directory": "test"}))

## Instruction:
Install into directory that has the module name.

## Code After:
var bower = require('bower');
var mybowerConfig = require('./mybower.json');

for (var module in mybowerConfig) {
  if (mybowerConfig[module].local === true) continue;

  bower.commands.install([module + "=" + mybowerConfig[module].url], {}, {"directory": mybowerConfig[module].directory})
}
var bower = require('bower');
var mybowerConfig = require('./mybower.json');

for (var module in mybowerConfig) {
  if (mybowerConfig[module].local === true) continue;

-   bower.commands.install([mybowerConfig[module].url], {}, {"directory": mybowerConfig[module].directory})
+   bower.commands.install([module + "=" + mybowerConfig[module].url], {}, {"directory": mybowerConfig[module].directory})
?                          +++++++++++++++
}

- //console.log(bower.commands.install(['jquery'], {}, {"directory": "test"}))
3
0.375
1
2
03142c00a6c116206c38509ea1b1990a6a2c0b27
bin/sync-npm.sh
bin/sync-npm.sh
set -e
cd "`dirname $0`/.."


# Install dependencies.

pip install -e .
pip install ijson==2.3.0

curl https://cmake.org/files/v3.6/cmake-3.6.2-Linux-x86_64.tar.gz > cmake.tgz
echo '5df4b69d9e85093ae78b1070d5cb9f824ce0bdd02528948c3f6a740e240083e5 cmake.tgz' \
    | sha256sum -c /dev/stdin --status
tar zxf cmake.tgz
PATH=/app/cmake-3.6.2-Linux-x86_64/bin:$PATH

git clone https://github.com/lloyd/yajl.git
cd yajl
git checkout 2.1.0
./configure
sudo make install
cd ..

URL=https://registry.npmjs.com/-/all
URL=https://gist.githubusercontent.com/whit537/fec53fb1f0618b3d5757f0ab687b7476/raw/25de82f6197df49b47d180db0d62b4e8c6f7f9f8/one
curl $URL | sync-npm serialize /dev/stdin | sync-npm upsert /dev/stdin
set -e
cd "`dirname $0`/.."


# Install dependencies.

pip install -e .
pip install ijson==2.3.0

curl https://cmake.org/files/v3.6/cmake-3.6.2-Linux-x86_64.tar.gz > cmake.tgz
echo '5df4b69d9e85093ae78b1070d5cb9f824ce0bdd02528948c3f6a740e240083e5 cmake.tgz' \
    | sha256sum -c /dev/stdin --status
tar zxf cmake.tgz
PATH=/app/cmake-3.6.2-Linux-x86_64/bin:$PATH

git clone https://github.com/lloyd/yajl.git
cd yajl
git checkout 2.1.0
./configure -p /app/.heroku/python
make install
cd ..

URL=https://registry.npmjs.com/-/all
URL=https://gist.githubusercontent.com/whit537/fec53fb1f0618b3d5757f0ab687b7476/raw/25de82f6197df49b47d180db0d62b4e8c6f7f9f8/one
curl $URL | sync-npm serialize /dev/stdin | sync-npm upsert /dev/stdin
Put libyajl where we can find it
Put libyajl where we can find it This location is already in LD_LIBRARY_PATH.
Shell
mit
gratipay/gratipay.com,gratipay/gratipay.com,gratipay/gratipay.com,gratipay/gratipay.com
shell
## Code Before:
set -e
cd "`dirname $0`/.."


# Install dependencies.

pip install -e .
pip install ijson==2.3.0

curl https://cmake.org/files/v3.6/cmake-3.6.2-Linux-x86_64.tar.gz > cmake.tgz
echo '5df4b69d9e85093ae78b1070d5cb9f824ce0bdd02528948c3f6a740e240083e5 cmake.tgz' \
    | sha256sum -c /dev/stdin --status
tar zxf cmake.tgz
PATH=/app/cmake-3.6.2-Linux-x86_64/bin:$PATH

git clone https://github.com/lloyd/yajl.git
cd yajl
git checkout 2.1.0
./configure
sudo make install
cd ..

URL=https://registry.npmjs.com/-/all
URL=https://gist.githubusercontent.com/whit537/fec53fb1f0618b3d5757f0ab687b7476/raw/25de82f6197df49b47d180db0d62b4e8c6f7f9f8/one
curl $URL | sync-npm serialize /dev/stdin | sync-npm upsert /dev/stdin

## Instruction:
Put libyajl where we can find it

This location is already in LD_LIBRARY_PATH.

## Code After:
set -e
cd "`dirname $0`/.."


# Install dependencies.

pip install -e .
pip install ijson==2.3.0

curl https://cmake.org/files/v3.6/cmake-3.6.2-Linux-x86_64.tar.gz > cmake.tgz
echo '5df4b69d9e85093ae78b1070d5cb9f824ce0bdd02528948c3f6a740e240083e5 cmake.tgz' \
    | sha256sum -c /dev/stdin --status
tar zxf cmake.tgz
PATH=/app/cmake-3.6.2-Linux-x86_64/bin:$PATH

git clone https://github.com/lloyd/yajl.git
cd yajl
git checkout 2.1.0
./configure -p /app/.heroku/python
make install
cd ..

URL=https://registry.npmjs.com/-/all
URL=https://gist.githubusercontent.com/whit537/fec53fb1f0618b3d5757f0ab687b7476/raw/25de82f6197df49b47d180db0d62b4e8c6f7f9f8/one
curl $URL | sync-npm serialize /dev/stdin | sync-npm upsert /dev/stdin
set -e
cd "`dirname $0`/.."


# Install dependencies.

pip install -e .
pip install ijson==2.3.0

curl https://cmake.org/files/v3.6/cmake-3.6.2-Linux-x86_64.tar.gz > cmake.tgz
echo '5df4b69d9e85093ae78b1070d5cb9f824ce0bdd02528948c3f6a740e240083e5 cmake.tgz' \
    | sha256sum -c /dev/stdin --status
tar zxf cmake.tgz
PATH=/app/cmake-3.6.2-Linux-x86_64/bin:$PATH

git clone https://github.com/lloyd/yajl.git
cd yajl
git checkout 2.1.0
- ./configure
+ ./configure -p /app/.heroku/python
- sudo make install
? -----
+ make install
cd ..

URL=https://registry.npmjs.com/-/all
URL=https://gist.githubusercontent.com/whit537/fec53fb1f0618b3d5757f0ab687b7476/raw/25de82f6197df49b47d180db0d62b4e8c6f7f9f8/one
curl $URL | sync-npm serialize /dev/stdin | sync-npm upsert /dev/stdin
4
0.16
2
2
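An editorial aside, not part of the dataset: the script in the record above guards a cmake download with `sha256sum -c` before extracting it. The same integrity check can be sketched in Python — the function name and chunk size are illustrative, not anything from the repository:

```python
import hashlib
import os
import tempfile


def verify_sha256(path, expected_hex, chunk_size=1 << 20):
    """Stream `path` through SHA-256 and compare against `expected_hex`.

    Equivalent in spirit to `echo '<hash> <file>' | sha256sum -c`:
    returns True when the digests match, False otherwise.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex


if __name__ == "__main__":
    # Self-check with a throwaway file standing in for the tarball.
    payload = b"cmake tarball stand-in"
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(payload)
        path = tmp.name
    try:
        good = hashlib.sha256(payload).hexdigest()
        assert verify_sha256(path, good)
        assert not verify_sha256(path, "0" * 64)
        print("ok")
    finally:
        os.unlink(path)
```

Failing closed (a plain boolean that callers must check, or raising on mismatch) keeps a corrupted or tampered download from ever being extracted.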
35aa107f343546cbd0435de4d8cf5f5e841f094d
src/index.html.jade
src/index.html.jade
doctype html
html(lang='en-us')
  head
    meta(http-equiv='Content-Type', content='text/html; charset=utf-8')
    meta(http-equiv='X-UA-Compatible', content='IE=edge,chrome=1')
    meta(name='viewport', content='width=device-width, initial-scale=1.0')
    title Drink Machine | Try your luck for a morning beverage!
    link(rel='stylesheet', href='css/app.css')
    script(src='js/app.min.js', type='text/javascript')
  body
    h1 Try your luck for a morning beverage!
    div
      ul
        li
        li
        li
      button Press to Play
doctype html
html(lang='en-us')
  head
    meta(http-equiv='Content-Type', content='text/html; charset=utf-8')
    meta(http-equiv='X-UA-Compatible', content='IE=edge,chrome=1')
    meta(name='viewport', content='width=device-width, initial-scale=1.0')
    title Drink Machine | Try your luck for a morning beverage!
    link(rel='stylesheet', href='css/app.css')
    script(src='js/app.min.js', type='text/javascript')
  body
    h1 Try your luck for a morning beverage!
    div
      ul.Reels
        li.Reel.Reel_method
        li.Reel.Reel_equipment
        li.Reel.Reel_flavoring
      button Press to Play
Add Reel modules classes to markup
Add Reel modules classes to markup
Jade
isc
bitfyre/drinkmachine
jade
## Code Before:
doctype html
html(lang='en-us')
  head
    meta(http-equiv='Content-Type', content='text/html; charset=utf-8')
    meta(http-equiv='X-UA-Compatible', content='IE=edge,chrome=1')
    meta(name='viewport', content='width=device-width, initial-scale=1.0')
    title Drink Machine | Try your luck for a morning beverage!
    link(rel='stylesheet', href='css/app.css')
    script(src='js/app.min.js', type='text/javascript')
  body
    h1 Try your luck for a morning beverage!
    div
      ul
        li
        li
        li
      button Press to Play

## Instruction:
Add Reel modules classes to markup

## Code After:
doctype html
html(lang='en-us')
  head
    meta(http-equiv='Content-Type', content='text/html; charset=utf-8')
    meta(http-equiv='X-UA-Compatible', content='IE=edge,chrome=1')
    meta(name='viewport', content='width=device-width, initial-scale=1.0')
    title Drink Machine | Try your luck for a morning beverage!
    link(rel='stylesheet', href='css/app.css')
    script(src='js/app.min.js', type='text/javascript')
  body
    h1 Try your luck for a morning beverage!
    div
      ul.Reels
        li.Reel.Reel_method
        li.Reel.Reel_equipment
        li.Reel.Reel_flavoring
      button Press to Play
doctype html
html(lang='en-us')
  head
    meta(http-equiv='Content-Type', content='text/html; charset=utf-8')
    meta(http-equiv='X-UA-Compatible', content='IE=edge,chrome=1')
    meta(name='viewport', content='width=device-width, initial-scale=1.0')
    title Drink Machine | Try your luck for a morning beverage!
    link(rel='stylesheet', href='css/app.css')
    script(src='js/app.min.js', type='text/javascript')
  body
    h1 Try your luck for a morning beverage!
    div
-       ul
-         li
-         li
-         li
+       ul.Reels
+         li.Reel.Reel_method
+         li.Reel.Reel_equipment
+         li.Reel.Reel_flavoring
      button Press to Play
8
0.470588
4
4
7d0480aa31d393bb026f2ff51a83625a1d8f792f
spec/lib/mr_darcy_spec.rb
spec/lib/mr_darcy_spec.rb
require 'spec_helper'

describe MrDarcy do
  it { should be }
  it { should respond_to :driver }
  it { should respond_to :driver= }

  describe '#promise' do
    When 'no driver is specified' do
      subject { MrDarcy.promise {} }
      it 'uses whichever driver is the default' do
        expect(MrDarcy).to receive(:driver).and_return(:thread)
        subject
      end
    end
  end
end
require 'spec_helper'

describe MrDarcy do
  it { should be }
  it { should respond_to :driver }
  it { should respond_to :driver= }

  # This spec doesn't pass on CI. I can't figure out why.
  # describe '#promise' do
    # When 'no driver is specified' do
      # subject { MrDarcy.promise {} }
      # it 'uses whichever driver is the default' do
        # expect(MrDarcy).to receive(:driver).and_return(:thread)
        # subject
      # end
    # end
  # end
end
Disable failing spec on CI.
Disable failing spec on CI.
Ruby
mit
jamesotron/MrDarcy
ruby
## Code Before:
require 'spec_helper'

describe MrDarcy do
  it { should be }
  it { should respond_to :driver }
  it { should respond_to :driver= }

  describe '#promise' do
    When 'no driver is specified' do
      subject { MrDarcy.promise {} }
      it 'uses whichever driver is the default' do
        expect(MrDarcy).to receive(:driver).and_return(:thread)
        subject
      end
    end
  end
end

## Instruction:
Disable failing spec on CI.

## Code After:
require 'spec_helper'

describe MrDarcy do
  it { should be }
  it { should respond_to :driver }
  it { should respond_to :driver= }

  # This spec doesn't pass on CI. I can't figure out why.
  # describe '#promise' do
    # When 'no driver is specified' do
      # subject { MrDarcy.promise {} }
      # it 'uses whichever driver is the default' do
        # expect(MrDarcy).to receive(:driver).and_return(:thread)
        # subject
      # end
    # end
  # end
end
require 'spec_helper'

describe MrDarcy do
  it { should be }
  it { should respond_to :driver }
  it { should respond_to :driver= }

+   # This spec doesn't pass on CI. I can't figure out why.
-   describe '#promise' do
+   # describe '#promise' do
?   ++
-     When 'no driver is specified' do
+     # When 'no driver is specified' do
?     ++
-       subject { MrDarcy.promise {} }
+       # subject { MrDarcy.promise {} }
?       ++
-       it 'uses whichever driver is the default' do
+       # it 'uses whichever driver is the default' do
?       ++
-         expect(MrDarcy).to receive(:driver).and_return(:thread)
+         # expect(MrDarcy).to receive(:driver).and_return(:thread)
?         ++
-         subject
+         # subject
?         ++
-       end
+       # end
?       ++
-     end
+     # end
?     ++
-   end
+   # end
?   ++
end
19
1.055556
10
9
d3457c409049ad5dd2789b345a452dd7d844fbaa
lib/db.js
lib/db.js
exports.get = function getMany(model, params) {

}

exports.add = function add(model, data) {

}

exports.update = function update(model, id, data) {

}

exports.delete = function delete(model, id, params) {

}
function get(model, params, cb) {

}

function add(model, data, cb) {

}

function update(model, id, data, cb) {

}

function del(model, id, params, cb) {

}

function getNameFromModel(model) {
  return new model().tableName;
}

exports.get = get;
exports.add = add;
exports.update = update;
exports.delete = del;
exports.getNameFromModel = getNameFromModel;
Add function to get table name from model
Add function to get table name from model
JavaScript
mit
ganemone/bookshelf-restful
javascript
## Code Before:
exports.get = function getMany(model, params) {

}

exports.add = function add(model, data) {

}

exports.update = function update(model, id, data) {

}

exports.delete = function delete(model, id, params) {

}

## Instruction:
Add function to get table name from model

## Code After:
function get(model, params, cb) {

}

function add(model, data, cb) {

}

function update(model, id, data, cb) {

}

function del(model, id, params, cb) {

}

function getNameFromModel(model) {
  return new model().tableName;
}

exports.get = get;
exports.add = add;
exports.update = update;
exports.delete = del;
exports.getNameFromModel = getNameFromModel;
- exports.get = function getMany(model, params) {
+ function get(model, params, cb) {

}

- exports.add = function add(model, data) {
? --------------
+ function add(model, data, cb) {
?                          ++++

}

- exports.update = function update(model, id, data) {
? -----------------
+ function update(model, id, data, cb) {
?                                ++++

}

- exports.delete = function delete(model, id, params) {
+ function del(model, id, params, cb) {

}
+
+ function getNameFromModel(model) {
+   return new model().tableName;
+ }
+
+ exports.get = get;
+ exports.add = add;
+ exports.update = update;
+ exports.delete = del;
+ exports.getNameFromModel = getNameFromModel;
18
1.5
14
4
1b9d453f6fe0d2128849f98922f082d6ccfbee69
channelfilter.py
channelfilter.py
import os

import yaml


class ChannelFilter(object):
    def __init__(self, path=None):
        if path is None:
            path = os.path.join(os.path.dirname(__file__), 'channels.yaml')
        with open(path) as f:
            self.config = yaml.load(f)
        print(self.config)

    @property
    def firehose_channel(self):
        return self.config['firehose-channel']

    @property
    def default_channel(self):
        return self.config['default-channel']

    def all_channels(self):
        channels = [self.default_channel, self.firehose_channel] + list(self.config['channels'])
        return list(set(channels))

    def channels_for(self, project):
        """
        :param project: Get all channels to spam for the given project
        :type project: basestring
        """
        channels = set()
        for channel in self.config['channels']:
            if project in self.config['channels'][channel]:
                channels.add(channel)
                continue
        if not channels:
            channels.add(self.default_channel)
        channels.add(self.firehose_channel)
        print(channels)
        return channels
import os

import yaml


class ChannelFilter(object):
    def __init__(self, path=None):
        if path is None:
            path = os.path.join(os.path.dirname(__file__), 'channels.yaml')
        with open(path) as f:
            self.config = yaml.load(f)
        print(self.config)

    @property
    def firehose_channel(self):
        return self.config['firehose-channel']

    @property
    def default_channel(self):
        return self.config['default-channel']

    def all_channels(self):
        channels = [self.default_channel, self.firehose_channel] + list(self.config['channels'])
        return list(set(channels))

    def channels_for(self, projects):
        """
        :param project: Get all channels to spam for given projects
        :type project: list
        """
        channels = set()
        for channel in self.config['channels']:
            for project in projects:
                if project in self.config['channels'][channel]:
                    channels.add(channel)
                    break
        if not channels:
            channels.add(self.default_channel)
        channels.add(self.firehose_channel)
        return channels
Fix channel filtering to work properly
Fix channel filtering to work properly
Python
mit
wikimedia/labs-tools-wikibugs2,wikimedia/labs-tools-wikibugs2
python
## Code Before: import os import yaml class ChannelFilter(object): def __init__(self, path=None): if path is None: path = os.path.join(os.path.dirname(__file__), 'channels.yaml') with open(path) as f: self.config = yaml.load(f) print(self.config) @property def firehose_channel(self): return self.config['firehose-channel'] @property def default_channel(self): return self.config['default-channel'] def all_channels(self): channels = [self.default_channel, self.firehose_channel] + list(self.config['channels']) return list(set(channels)) def channels_for(self, project): """ :param project: Get all channels to spam for the given project :type project: basestring """ channels = set() for channel in self.config['channels']: if project in self.config['channels'][channel]: channels.add(channel) continue if not channels: channels.add(self.default_channel) channels.add(self.firehose_channel) print(channels) return channels ## Instruction: Fix channel filtering to work properly ## Code After: import os import yaml class ChannelFilter(object): def __init__(self, path=None): if path is None: path = os.path.join(os.path.dirname(__file__), 'channels.yaml') with open(path) as f: self.config = yaml.load(f) print(self.config) @property def firehose_channel(self): return self.config['firehose-channel'] @property def default_channel(self): return self.config['default-channel'] def all_channels(self): channels = [self.default_channel, self.firehose_channel] + list(self.config['channels']) return list(set(channels)) def channels_for(self, projects): """ :param project: Get all channels to spam for given projects :type project: list """ channels = set() for channel in self.config['channels']: for project in projects: if project in self.config['channels'][channel]: channels.add(channel) break if not channels: channels.add(self.default_channel) channels.add(self.firehose_channel) return channels
import os import yaml class ChannelFilter(object): def __init__(self, path=None): if path is None: path = os.path.join(os.path.dirname(__file__), 'channels.yaml') with open(path) as f: self.config = yaml.load(f) print(self.config) @property def firehose_channel(self): return self.config['firehose-channel'] @property def default_channel(self): return self.config['default-channel'] def all_channels(self): channels = [self.default_channel, self.firehose_channel] + list(self.config['channels']) return list(set(channels)) - def channels_for(self, project): + def channels_for(self, projects): ? + """ - :param project: Get all channels to spam for the given project ? ---- + :param project: Get all channels to spam for given projects ? + - :type project: basestring ? ^^^^ ---- + :type project: list ? ^^ """ channels = set() for channel in self.config['channels']: + for project in projects: - if project in self.config['channels'][channel]: + if project in self.config['channels'][channel]: ? ++++ - channels.add(channel) + channels.add(channel) ? ++++ - continue + break if not channels: channels.add(self.default_channel) channels.add(self.firehose_channel) - print(channels) return channels
14
0.341463
7
7
09ebeb873c83d51053ef6aa2d7c6ce47b4be5070
ckanext/archiver/helpers.py
ckanext/archiver/helpers.py
from ckan.plugins import toolkit as tk def archiver_resource_show(resource_id): data_dict = {'id': resource_id} return tk.get_action('archiver_resource_show')(data_dict) def archiver_is_resource_broken_html(resource): archival = resource.get('archiver') if not archival: return '<!-- No archival info for this resource -->' extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_broken.html', extra_vars=extra_vars)) def archiver_is_resource_cached_html(resource): archival = resource.get('archiver') if not archival: return '<!-- No archival info for this resource -->' extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_cached.html', extra_vars=extra_vars)) # Replacement for the core ckan helper 'format_resource_items' # but with our own blacklist def archiver_format_resource_items(items): blacklist = ['archiver', 'qa'] items_ = [item for item in items if item[0] not in blacklist] import ckan.lib.helpers as ckan_helpers return ckan_helpers.format_resource_items(items_)
from ckan.plugins import toolkit as tk def archiver_resource_show(resource_id): data_dict = {'id': resource_id} return tk.get_action('archiver_resource_show')(data_dict) def archiver_is_resource_broken_html(resource): archival = resource.get('archiver') if not archival: return tk.literal('<!-- No archival info for this resource -->') extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_broken.html', extra_vars=extra_vars)) def archiver_is_resource_cached_html(resource): archival = resource.get('archiver') if not archival: return tk.literal('<!-- No archival info for this resource -->') extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_cached.html', extra_vars=extra_vars)) # Replacement for the core ckan helper 'format_resource_items' # but with our own blacklist def archiver_format_resource_items(items): blacklist = ['archiver', 'qa'] items_ = [item for item in items if item[0] not in blacklist] import ckan.lib.helpers as ckan_helpers return ckan_helpers.format_resource_items(items_)
Hide comments meant as unseen
Hide comments meant as unseen
Python
mit
ckan/ckanext-archiver,datagovuk/ckanext-archiver,datagovuk/ckanext-archiver,ckan/ckanext-archiver,datagovuk/ckanext-archiver,ckan/ckanext-archiver,DanePubliczneGovPl/ckanext-archiver,DanePubliczneGovPl/ckanext-archiver,DanePubliczneGovPl/ckanext-archiver
python
## Code Before: from ckan.plugins import toolkit as tk def archiver_resource_show(resource_id): data_dict = {'id': resource_id} return tk.get_action('archiver_resource_show')(data_dict) def archiver_is_resource_broken_html(resource): archival = resource.get('archiver') if not archival: return '<!-- No archival info for this resource -->' extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_broken.html', extra_vars=extra_vars)) def archiver_is_resource_cached_html(resource): archival = resource.get('archiver') if not archival: return '<!-- No archival info for this resource -->' extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_cached.html', extra_vars=extra_vars)) # Replacement for the core ckan helper 'format_resource_items' # but with our own blacklist def archiver_format_resource_items(items): blacklist = ['archiver', 'qa'] items_ = [item for item in items if item[0] not in blacklist] import ckan.lib.helpers as ckan_helpers return ckan_helpers.format_resource_items(items_) ## Instruction: Hide comments meant as unseen ## Code After: from ckan.plugins import toolkit as tk def archiver_resource_show(resource_id): data_dict = {'id': resource_id} return tk.get_action('archiver_resource_show')(data_dict) def archiver_is_resource_broken_html(resource): archival = resource.get('archiver') if not archival: return tk.literal('<!-- No archival info for this resource -->') extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_broken.html', extra_vars=extra_vars)) def archiver_is_resource_cached_html(resource): archival = resource.get('archiver') if not archival: return tk.literal('<!-- No archival info for this resource -->') extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_cached.html', extra_vars=extra_vars)) # Replacement for the core ckan helper 'format_resource_items' # but with our own blacklist def archiver_format_resource_items(items): blacklist = ['archiver', 'qa'] items_ = [item for item in items if item[0] not in blacklist] import ckan.lib.helpers as ckan_helpers return ckan_helpers.format_resource_items(items_)
from ckan.plugins import toolkit as tk def archiver_resource_show(resource_id): data_dict = {'id': resource_id} return tk.get_action('archiver_resource_show')(data_dict) def archiver_is_resource_broken_html(resource): archival = resource.get('archiver') if not archival: - return '<!-- No archival info for this resource -->' + return tk.literal('<!-- No archival info for this resource -->') ? +++++++++++ + extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_broken.html', extra_vars=extra_vars)) def archiver_is_resource_cached_html(resource): archival = resource.get('archiver') if not archival: - return '<!-- No archival info for this resource -->' + return tk.literal('<!-- No archival info for this resource -->') ? +++++++++++ + extra_vars = {'resource': resource} extra_vars.update(archival) return tk.literal( tk.render('archiver/is_resource_cached.html', extra_vars=extra_vars)) # Replacement for the core ckan helper 'format_resource_items' # but with our own blacklist def archiver_format_resource_items(items): blacklist = ['archiver', 'qa'] items_ = [item for item in items if item[0] not in blacklist] import ckan.lib.helpers as ckan_helpers return ckan_helpers.format_resource_items(items_)
4
0.105263
2
2
036347b2eb779f87f88b44c34b8ed537786f9c90
src/mixins/_clearfix.scss
src/mixins/_clearfix.scss
/// Clears floats in IE 8+ /// @group Main /// @link http://nicolasgallagher.com/micro-clearfix-hack/ /// @link https://css-tricks.com/snippets/css/clear-fix/ @mixin fs-clearfix { &:after { clear: both; content: ''; display: table; } }
/// Clears floats in IE 8+ /// @group Main /// @link http://nicolasgallagher.com/micro-clearfix-hack/ /// @link https://css-tricks.com/snippets/css/clear-fix/ /// @ignore Consider replacing with “display: flow-root” once support is broad enough. /// @ignore https://caniuse.com/#feat=flow-root @mixin fs-clearfix { &:after { clear: both; content: ''; display: table; } }
Add comment to clearfix mixin
Add comment to clearfix mixin
SCSS
mit
Threespot/frontline-sass
scss
## Code Before: /// Clears floats in IE 8+ /// @group Main /// @link http://nicolasgallagher.com/micro-clearfix-hack/ /// @link https://css-tricks.com/snippets/css/clear-fix/ @mixin fs-clearfix { &:after { clear: both; content: ''; display: table; } } ## Instruction: Add comment to clearfix mixin ## Code After: /// Clears floats in IE 8+ /// @group Main /// @link http://nicolasgallagher.com/micro-clearfix-hack/ /// @link https://css-tricks.com/snippets/css/clear-fix/ /// @ignore Consider replacing with “display: flow-root” once support is broad enough. /// @ignore https://caniuse.com/#feat=flow-root @mixin fs-clearfix { &:after { clear: both; content: ''; display: table; } }
/// Clears floats in IE 8+ /// @group Main /// @link http://nicolasgallagher.com/micro-clearfix-hack/ /// @link https://css-tricks.com/snippets/css/clear-fix/ + /// @ignore Consider replacing with “display: flow-root” once support is broad enough. + /// @ignore https://caniuse.com/#feat=flow-root @mixin fs-clearfix { &:after { clear: both; content: ''; display: table; } }
2
0.181818
2
0
47c4d6337e4471c3b0aed40272117a7483bc3fbc
README.md
README.md
BidWire notifies you when it finds new bids.
BidWire notifies you when it finds new bids on https://www.commbuys.com/bso/ Public Pivotal Tracker with planned work: https://www.pivotaltracker.com/n/projects/1996883
Add link to Tracker project.
Add link to Tracker project.
Markdown
mit
RagtagOpen/bidwire,RagtagOpen/bidwire,RagtagOpen/bidwire
markdown
## Code Before: BidWire notifies you when it finds new bids. ## Instruction: Add link to Tracker project. ## Code After: BidWire notifies you when it finds new bids on https://www.commbuys.com/bso/ Public Pivotal Tracker with planned work: https://www.pivotaltracker.com/n/projects/1996883
- BidWire notifies you when it finds new bids. + BidWire notifies you when it finds new bids on https://www.commbuys.com/bso/ + + Public Pivotal Tracker with planned work: https://www.pivotaltracker.com/n/projects/1996883
4
4
3
1
5ac11988155b796e5ace9e2b6ae4ecd09f779344
copy/opt/core/bin/ssl-selfsigned.sh
copy/opt/core/bin/ssl-selfsigned.sh
CN=$(hostname) FILENAME='server' # Help function function help() { echo "${0} -d <DESTINATION> [-c common name] [-f filename]" exit 1 } # Option parameters if (( ${#} < 1 )); then help; fi while getopts ":d:c:f:" opt; do case "${opt}" in d) DESTINATION=${OPTARG} ;; c) CN=${OPTARG} ;; f) FILENAME=${OPTARG} ;; *) help ;; esac done # Verify if folder exists if [[ ! -d "$DESTINATION" ]]; then echo "Error: The ${DESTINATION} doesn't exists, please create!" exit 2 fi # Generate key and csr via OpenSSL openssl req -newkey rsa:2048 -keyout ${DESTINATION}/${FILENAME}.key \ -out ${DESTINATION}/${FILENAME}.csr -nodes \ -subj "/C=DE/L=Raindbow City/O=Aperture Science/OU=Please use valid ssl certificate/CN=${CN}" # Generate self signed ssl certificate from csr via OpenSSL openssl x509 -in ${DESTINATION}/${FILENAME}.csr -out ${DESTINATION}/${FILENAME}.crt -req \ -signkey ${DESTINATION}/${FILENAME}.key -days 128 # Create one PEM file which contains certificate and key cat ${DESTINATION}/${FILENAME}.crt ${DESTINATION}/${FILENAME}.key > ${DESTINATION}/${FILENAME}.pem
CN=$(hostname) FILENAME='server' # Help function function help() { echo "${0} -d <DESTINATION> [-c common name] [-f filename]" exit 1 } # Option parameters if (( ${#} < 1 )); then help; fi while getopts ":d:c:f:" opt; do case "${opt}" in d) DESTINATION=${OPTARG} ;; c) CN=${OPTARG} ;; f) FILENAME=${OPTARG} ;; *) help ;; esac done # Verify if folder exists if [[ ! -d "$DESTINATION" ]]; then echo "Error: The ${DESTINATION} doesn't exists, please create!" exit 2 fi # Generate key and csr via OpenSSL openssl req -newkey rsa:2048 -keyout ${DESTINATION}/${FILENAME}.key \ -out ${DESTINATION}/${FILENAME}.csr -nodes \ -subj "/C=DE/L=Raindbow City/O=Aperture Science/OU=Please use valid ssl certificate/CN=${CN}" # Generate self signed ssl certificate from csr via OpenSSL openssl x509 -in ${DESTINATION}/${FILENAME}.csr -out ${DESTINATION}/${FILENAME}.crt -req \ -signkey ${DESTINATION}/${FILENAME}.key -days 128
Remove support for one PEM file which contains cert and key
Remove support for one PEM file which contains cert and key
Shell
mit
skylime/mi-core-base
shell
## Code Before: CN=$(hostname) FILENAME='server' # Help function function help() { echo "${0} -d <DESTINATION> [-c common name] [-f filename]" exit 1 } # Option parameters if (( ${#} < 1 )); then help; fi while getopts ":d:c:f:" opt; do case "${opt}" in d) DESTINATION=${OPTARG} ;; c) CN=${OPTARG} ;; f) FILENAME=${OPTARG} ;; *) help ;; esac done # Verify if folder exists if [[ ! -d "$DESTINATION" ]]; then echo "Error: The ${DESTINATION} doesn't exists, please create!" exit 2 fi # Generate key and csr via OpenSSL openssl req -newkey rsa:2048 -keyout ${DESTINATION}/${FILENAME}.key \ -out ${DESTINATION}/${FILENAME}.csr -nodes \ -subj "/C=DE/L=Raindbow City/O=Aperture Science/OU=Please use valid ssl certificate/CN=${CN}" # Generate self signed ssl certificate from csr via OpenSSL openssl x509 -in ${DESTINATION}/${FILENAME}.csr -out ${DESTINATION}/${FILENAME}.crt -req \ -signkey ${DESTINATION}/${FILENAME}.key -days 128 # Create one PEM file which contains certificate and key cat ${DESTINATION}/${FILENAME}.crt ${DESTINATION}/${FILENAME}.key > ${DESTINATION}/${FILENAME}.pem ## Instruction: Remove support for one PEM file which contains cert and key ## Code After: CN=$(hostname) FILENAME='server' # Help function function help() { echo "${0} -d <DESTINATION> [-c common name] [-f filename]" exit 1 } # Option parameters if (( ${#} < 1 )); then help; fi while getopts ":d:c:f:" opt; do case "${opt}" in d) DESTINATION=${OPTARG} ;; c) CN=${OPTARG} ;; f) FILENAME=${OPTARG} ;; *) help ;; esac done # Verify if folder exists if [[ ! -d "$DESTINATION" ]]; then echo "Error: The ${DESTINATION} doesn't exists, please create!" exit 2 fi # Generate key and csr via OpenSSL openssl req -newkey rsa:2048 -keyout ${DESTINATION}/${FILENAME}.key \ -out ${DESTINATION}/${FILENAME}.csr -nodes \ -subj "/C=DE/L=Raindbow City/O=Aperture Science/OU=Please use valid ssl certificate/CN=${CN}" # Generate self signed ssl certificate from csr via OpenSSL openssl x509 -in ${DESTINATION}/${FILENAME}.csr -out ${DESTINATION}/${FILENAME}.crt -req \ -signkey ${DESTINATION}/${FILENAME}.key -days 128
CN=$(hostname) FILENAME='server' # Help function function help() { echo "${0} -d <DESTINATION> [-c common name] [-f filename]" exit 1 } # Option parameters if (( ${#} < 1 )); then help; fi while getopts ":d:c:f:" opt; do case "${opt}" in d) DESTINATION=${OPTARG} ;; c) CN=${OPTARG} ;; f) FILENAME=${OPTARG} ;; *) help ;; esac done # Verify if folder exists if [[ ! -d "$DESTINATION" ]]; then echo "Error: The ${DESTINATION} doesn't exists, please create!" exit 2 fi # Generate key and csr via OpenSSL openssl req -newkey rsa:2048 -keyout ${DESTINATION}/${FILENAME}.key \ -out ${DESTINATION}/${FILENAME}.csr -nodes \ -subj "/C=DE/L=Raindbow City/O=Aperture Science/OU=Please use valid ssl certificate/CN=${CN}" # Generate self signed ssl certificate from csr via OpenSSL openssl x509 -in ${DESTINATION}/${FILENAME}.csr -out ${DESTINATION}/${FILENAME}.crt -req \ -signkey ${DESTINATION}/${FILENAME}.key -days 128 - - # Create one PEM file which contains certificate and key - cat ${DESTINATION}/${FILENAME}.crt ${DESTINATION}/${FILENAME}.key > ${DESTINATION}/${FILENAME}.pem
3
0.078947
0
3
96ba5d539892ed3436cf667d396b10458f95b6d0
library/etc/nprocessors_spec.rb
library/etc/nprocessors_spec.rb
require File.expand_path('../../../spec_helper', __FILE__) require 'etc' describe "Etc.nprocessors" do it "returns the number of online processors" do Etc.nprocessors.should be_kind_of(Integer) Etc.nprocessors.should >= 1 end end
require File.expand_path('../../../spec_helper', __FILE__) require 'etc' ruby_version_is "2.2" do describe "Etc.nprocessors" do it "returns the number of online processors" do Etc.nprocessors.should be_kind_of(Integer) Etc.nprocessors.should >= 1 end end end
Add 2.2 guard for Etc.nprocessors, 2.1 is not yet removed from CI
Add 2.2 guard for Etc.nprocessors, 2.1 is not yet removed from CI
Ruby
mit
nobu/rubyspec,eregon/rubyspec,nobu/rubyspec,sgarciac/spec,sgarciac/spec,eregon/rubyspec,eregon/rubyspec,nobu/rubyspec,ruby/rubyspec,sgarciac/spec,ruby/spec,ruby/spec,ruby/spec,ruby/rubyspec
ruby
## Code Before: require File.expand_path('../../../spec_helper', __FILE__) require 'etc' describe "Etc.nprocessors" do it "returns the number of online processors" do Etc.nprocessors.should be_kind_of(Integer) Etc.nprocessors.should >= 1 end end ## Instruction: Add 2.2 guard for Etc.nprocessors, 2.1 is not yet removed from CI ## Code After: require File.expand_path('../../../spec_helper', __FILE__) require 'etc' ruby_version_is "2.2" do describe "Etc.nprocessors" do it "returns the number of online processors" do Etc.nprocessors.should be_kind_of(Integer) Etc.nprocessors.should >= 1 end end end
require File.expand_path('../../../spec_helper', __FILE__) require 'etc' + ruby_version_is "2.2" do - describe "Etc.nprocessors" do + describe "Etc.nprocessors" do ? ++ - it "returns the number of online processors" do + it "returns the number of online processors" do ? ++ - Etc.nprocessors.should be_kind_of(Integer) + Etc.nprocessors.should be_kind_of(Integer) ? ++ - Etc.nprocessors.should >= 1 + Etc.nprocessors.should >= 1 ? ++ + end end end
10
1.111111
6
4
fd71cf3cf18c33d80db9c95431e140ee0a8533ea
scripts/reset_db.sh
scripts/reset_db.sh
sudo -u postgres dropdb pytask sudo -u postgres createdb -O pytask pytask ./bin/django syncdb ./bin/django migrate profile ./bin/django migrate taskapp ./bin/django loaddata sites_fixture.json
sudo -u postgres dropdb pytask sudo -u postgres createdb -O pytask pytask rm -r pytask/profile/migrations/ rm -r pytask/taskapp/migrations/ ./bin/django schemamigration profile --initial ./bin/django schemamigration taskapp --initial ./bin/django syncdb ./bin/django migrate profile ./bin/django migrate taskapp ./bin/django loaddata sites_fixture.json
Remove initial south migrations also during reset of the database.
Remove initial south migrations also during reset of the database.
Shell
agpl-3.0
madhusudancs/pytask,madhusudancs/pytask,madhusudancs/pytask
shell
## Code Before: sudo -u postgres dropdb pytask sudo -u postgres createdb -O pytask pytask ./bin/django syncdb ./bin/django migrate profile ./bin/django migrate taskapp ./bin/django loaddata sites_fixture.json ## Instruction: Remove initial south migrations also during reset of the database. ## Code After: sudo -u postgres dropdb pytask sudo -u postgres createdb -O pytask pytask rm -r pytask/profile/migrations/ rm -r pytask/taskapp/migrations/ ./bin/django schemamigration profile --initial ./bin/django schemamigration taskapp --initial ./bin/django syncdb ./bin/django migrate profile ./bin/django migrate taskapp ./bin/django loaddata sites_fixture.json
sudo -u postgres dropdb pytask sudo -u postgres createdb -O pytask pytask + rm -r pytask/profile/migrations/ + rm -r pytask/taskapp/migrations/ + ./bin/django schemamigration profile --initial + ./bin/django schemamigration taskapp --initial ./bin/django syncdb ./bin/django migrate profile ./bin/django migrate taskapp ./bin/django loaddata sites_fixture.json
4
0.5
4
0
c8d02a61b06b5ade9976b3bac006be6b2545f619
tox.ini
tox.ini
[tox] envlist = py27,py35,py36,py37,py38,py39, flake [testenv] basepython = py27: {env:TOX_PY27_BASEPYTHON:python2.7} py35: {env:TOX_PY35_BASEPYTHON:python3.5} py36: {env:TOX_PY36_BASEPYTHON:python3.6} py37: {env:TOX_PY37_BASEPYTHON:python3.7} py38: {env:TOX_PY38_BASEPYTHON:python3.8} py39: {env:TOX_PY39_BASEPYTHON:python3.9} deps=pytest commands=pytest [testenv:flake8] basepython=python extras=lint deps=flake8 commands=flake8 {toxinidir}
[tox] envlist = py27,py35,py36,py37,py38,py39, flake [testenv] deps=pytest commands=pytest [testenv:flake8] basepython=python extras=lint deps=flake8 commands=flake8 {toxinidir}
Remove reading basepython from env
Remove reading basepython from env
INI
bsd-3-clause
Perlence/rpp
ini
## Code Before: [tox] envlist = py27,py35,py36,py37,py38,py39, flake [testenv] basepython = py27: {env:TOX_PY27_BASEPYTHON:python2.7} py35: {env:TOX_PY35_BASEPYTHON:python3.5} py36: {env:TOX_PY36_BASEPYTHON:python3.6} py37: {env:TOX_PY37_BASEPYTHON:python3.7} py38: {env:TOX_PY38_BASEPYTHON:python3.8} py39: {env:TOX_PY39_BASEPYTHON:python3.9} deps=pytest commands=pytest [testenv:flake8] basepython=python extras=lint deps=flake8 commands=flake8 {toxinidir} ## Instruction: Remove reading basepython from env ## Code After: [tox] envlist = py27,py35,py36,py37,py38,py39, flake [testenv] deps=pytest commands=pytest [testenv:flake8] basepython=python extras=lint deps=flake8 commands=flake8 {toxinidir}
[tox] envlist = py27,py35,py36,py37,py38,py39, flake [testenv] - basepython = - py27: {env:TOX_PY27_BASEPYTHON:python2.7} - py35: {env:TOX_PY35_BASEPYTHON:python3.5} - py36: {env:TOX_PY36_BASEPYTHON:python3.6} - py37: {env:TOX_PY37_BASEPYTHON:python3.7} - py38: {env:TOX_PY38_BASEPYTHON:python3.8} - py39: {env:TOX_PY39_BASEPYTHON:python3.9} deps=pytest commands=pytest [testenv:flake8] basepython=python extras=lint deps=flake8 commands=flake8 {toxinidir}
7
0.333333
0
7
1e07e9424a1ac69e1e660e6a6f1e58bba15472c1
make_spectra.py
make_spectra.py
import halospectra as hs import randspectra as rs import sys snapnum=sys.argv[1] sim=sys.argv[2] #base="/n/hernquistfs1/mvogelsberger/projects/GFM/Production/Cosmo/Cosmo"+str(sim)+"_V6/L25n512/output/" #savedir="/n/home11/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6_512/snapdir_"+str(snapnum).rjust(3,'0') base="/home/spb/data/Cosmo/Cosmo"+str(sim)+"_V6/L25n256" savedir="/home/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6/snapdir_"+str(snapnum).rjust(3,'0') #halo = hs.HaloSpectra(snapnum, base,3, savefile="halo_spectra_DLA.hdf5", savedir=savedir) halo = rs.RandSpectra(snapnum, base,numlos=3000,savedir=savedir, savefile="rand_spectra_DLA.hdf5") halo.get_tau("Si",2,2) halo.get_tau("H",1,1) halo.get_col_density("Z",-1) halo.get_col_density("H",-1) halo.save_file()
import halospectra as hs import randspectra as rs import sys snapnum=sys.argv[1] sim=sys.argv[2] #base="/n/hernquistfs1/mvogelsberger/projects/GFM/Production/Cosmo/Cosmo"+str(sim)+"_V6/L25n512/output/" #savedir="/n/home11/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6_512/snapdir_"+str(snapnum).rjust(3,'0') base="/home/spb/data/Cosmo/Cosmo"+str(sim)+"_V6/L25n256" savedir="/home/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6/snapdir_"+str(snapnum).rjust(3,'0') #halo = hs.HaloSpectra(snapnum, base,3, savefile="halo_spectra_DLA.hdf5", savedir=savedir) halo = rs.RandSpectra(snapnum, base,numlos=10000,savedir=savedir, savefile="rand_spectra.hdf5") #halo.get_observer_tau("Si",2) halo.get_tau("H",1,1) #halo.get_col_density("Z",-1) #halo.get_col_density("H",-1) halo.save_file()
Implement saving and loading the observer tau
Implement saving and loading the observer tau
Python
mit
sbird/vw_spectra
python
## Code Before: import halospectra as hs import randspectra as rs import sys snapnum=sys.argv[1] sim=sys.argv[2] #base="/n/hernquistfs1/mvogelsberger/projects/GFM/Production/Cosmo/Cosmo"+str(sim)+"_V6/L25n512/output/" #savedir="/n/home11/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6_512/snapdir_"+str(snapnum).rjust(3,'0') base="/home/spb/data/Cosmo/Cosmo"+str(sim)+"_V6/L25n256" savedir="/home/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6/snapdir_"+str(snapnum).rjust(3,'0') #halo = hs.HaloSpectra(snapnum, base,3, savefile="halo_spectra_DLA.hdf5", savedir=savedir) halo = rs.RandSpectra(snapnum, base,numlos=3000,savedir=savedir, savefile="rand_spectra_DLA.hdf5") halo.get_tau("Si",2,2) halo.get_tau("H",1,1) halo.get_col_density("Z",-1) halo.get_col_density("H",-1) halo.save_file() ## Instruction: Implement saving and loading the observer tau ## Code After: import halospectra as hs import randspectra as rs import sys snapnum=sys.argv[1] sim=sys.argv[2] #base="/n/hernquistfs1/mvogelsberger/projects/GFM/Production/Cosmo/Cosmo"+str(sim)+"_V6/L25n512/output/" #savedir="/n/home11/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6_512/snapdir_"+str(snapnum).rjust(3,'0') base="/home/spb/data/Cosmo/Cosmo"+str(sim)+"_V6/L25n256" savedir="/home/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6/snapdir_"+str(snapnum).rjust(3,'0') #halo = hs.HaloSpectra(snapnum, base,3, savefile="halo_spectra_DLA.hdf5", savedir=savedir) halo = rs.RandSpectra(snapnum, base,numlos=10000,savedir=savedir, savefile="rand_spectra.hdf5") #halo.get_observer_tau("Si",2) halo.get_tau("H",1,1) #halo.get_col_density("Z",-1) #halo.get_col_density("H",-1) halo.save_file()
import halospectra as hs import randspectra as rs import sys snapnum=sys.argv[1] sim=sys.argv[2] #base="/n/hernquistfs1/mvogelsberger/projects/GFM/Production/Cosmo/Cosmo"+str(sim)+"_V6/L25n512/output/" #savedir="/n/home11/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6_512/snapdir_"+str(snapnum).rjust(3,'0') base="/home/spb/data/Cosmo/Cosmo"+str(sim)+"_V6/L25n256" savedir="/home/spb/scratch/Cosmo/Cosmo"+str(sim)+"_V6/snapdir_"+str(snapnum).rjust(3,'0') #halo = hs.HaloSpectra(snapnum, base,3, savefile="halo_spectra_DLA.hdf5", savedir=savedir) - halo = rs.RandSpectra(snapnum, base,numlos=3000,savedir=savedir, savefile="rand_spectra_DLA.hdf5") ? ^ ---- + halo = rs.RandSpectra(snapnum, base,numlos=10000,savedir=savedir, savefile="rand_spectra.hdf5") ? ^^ - halo.get_tau("Si",2,2) ? -- + #halo.get_observer_tau("Si",2) ? + +++++++++ halo.get_tau("H",1,1) - halo.get_col_density("Z",-1) + #halo.get_col_density("Z",-1) ? + - halo.get_col_density("H",-1) + #halo.get_col_density("H",-1) ? + halo.save_file()
8
0.444444
4
4
d8502569f0e4c562294415242f096eefdf361ca0
pyonep/__init__.py
pyonep/__init__.py
__version__ = '0.12.4'
"""An API library with Python bindings for Exosite One Platform APIs.""" __version__ = '0.12.4' from .onep import OnepV1, DeferredRequests from .provision import Provision
Add docstring and imports to package.
Add docstring and imports to package.
Python
bsd-3-clause
gavinsunde/pyonep,exosite-labs/pyonep,asolz/pyonep,asolz/pyonep,gavinsunde/pyonep,exosite-labs/pyonep
python
## Code Before: __version__ = '0.12.4' ## Instruction: Add docstring and imports to package. ## Code After: """An API library with Python bindings for Exosite One Platform APIs.""" __version__ = '0.12.4' from .onep import OnepV1, DeferredRequests from .provision import Provision
+ """An API library with Python bindings for Exosite One Platform APIs.""" + __version__ = '0.12.4' + + from .onep import OnepV1, DeferredRequests + from .provision import Provision
5
5
5
0
ae30ec80f775b27fd7b1c669c9aaa076e2ec7fdb
control-users/src/main/java/fi/nls/oskari/control/users/UserController.java
control-users/src/main/java/fi/nls/oskari/control/users/UserController.java
package fi.nls.oskari.control.users; import org.springframework.stereotype.Controller; import org.springframework.ui.Model; import org.springframework.web.bind.annotation.ModelAttribute; import org.springframework.web.bind.annotation.RequestMapping; import fi.nls.oskari.log.LogFactory; import fi.nls.oskari.log.Logger; /** * Handles user's password reseting */ @Controller public class UserController { private final static Logger log = LogFactory.getLogger(UserController.class); public UserController() { } @RequestMapping("/resetPassword/{uuid}") public String getMap(Model model, @ModelAttribute String uuid) { System.out.println("Test2"); return null; } }
package fi.nls.oskari.control.users; import org.springframework.stereotype.Controller; import org.springframework.ui.Model; import org.springframework.web.bind.annotation.ModelAttribute; import org.springframework.web.bind.annotation.RequestMapping; import fi.nls.oskari.log.LogFactory; import fi.nls.oskari.log.Logger; /** * Handles user's password reseting */ @Controller public class UserController { private final static Logger log = LogFactory.getLogger(UserController.class); public UserController() { } /** * "passwordReset" jsp view should ALWAYS be used by user to reset password * @param model * @param uuid * @return */ @RequestMapping("/resetPassword/{uuid}") public String resetPassword( Model model, @ModelAttribute String uuid) { return "passwordReset"; } }
Return view is set to 'resetPassword'.
Return view is set to 'resetPassword'.
Java
mit
nls-oskari/oskari-server,nls-oskari/oskari-server,nls-oskari/oskari-server
java
## Code Before: package fi.nls.oskari.control.users; import org.springframework.stereotype.Controller; import org.springframework.ui.Model; import org.springframework.web.bind.annotation.ModelAttribute; import org.springframework.web.bind.annotation.RequestMapping; import fi.nls.oskari.log.LogFactory; import fi.nls.oskari.log.Logger; /** * Handles user's password reseting */ @Controller public class UserController { private final static Logger log = LogFactory.getLogger(UserController.class); public UserController() { } @RequestMapping("/resetPassword/{uuid}") public String getMap(Model model, @ModelAttribute String uuid) { System.out.println("Test2"); return null; } } ## Instruction: Return view is set to 'resetPassword'. ## Code After: package fi.nls.oskari.control.users; import org.springframework.stereotype.Controller; import org.springframework.ui.Model; import org.springframework.web.bind.annotation.ModelAttribute; import org.springframework.web.bind.annotation.RequestMapping; import fi.nls.oskari.log.LogFactory; import fi.nls.oskari.log.Logger; /** * Handles user's password reseting */ @Controller public class UserController { private final static Logger log = LogFactory.getLogger(UserController.class); public UserController() { } /** * "passwordReset" jsp view should ALWAYS be used by user to reset password * @param model * @param uuid * @return */ @RequestMapping("/resetPassword/{uuid}") public String resetPassword( Model model, @ModelAttribute String uuid) { return "passwordReset"; } }
package fi.nls.oskari.control.users; import org.springframework.stereotype.Controller; import org.springframework.ui.Model; import org.springframework.web.bind.annotation.ModelAttribute; import org.springframework.web.bind.annotation.RequestMapping; import fi.nls.oskari.log.LogFactory; import fi.nls.oskari.log.Logger; /** * Handles user's password reseting */ @Controller public class UserController { private final static Logger log = LogFactory.getLogger(UserController.class); - + ? + public UserController() { } + /** + * "passwordReset" jsp view should ALWAYS be used by user to reset password + * @param model + * @param uuid + * @return + */ @RequestMapping("/resetPassword/{uuid}") - public String getMap(Model model, @ModelAttribute String uuid) { - System.out.println("Test2"); - return null; + public String resetPassword( + Model model, + @ModelAttribute String uuid) { + + return "passwordReset"; } - ? - + }
18
0.6
13
5
ff3d7a46e5146d73d145d08e73069944e2b7c5cb
tests/strophe.html
tests/strophe.html
<!DOCTYPE html> <html> <head> <script> <!-- // remove Function.prototype.bind so we can test Strophe's Function.prototype.bind = undefined; // --> </script> <script src="http://code.jquery.com/jquery-latest.js"></script> <link rel="stylesheet" href="https://github.com/jquery/qunit/raw/master/qunit/qunit.css" type="text/css"> <script src="https://github.com/jquery/qunit/raw/master/qunit/qunit.js"></script> <script src="http://cjohansen.no/sinon/releases/sinon-0.8.0.js"></script> <script src="http://cjohansen.no/sinon/releases/sinon-qunit-0.8.0.js"></script> <script src='../strophe.js'></script> <script src='tests.js'></script> <title>Strophe.js Tests</title> </head> <body> <h1 id="qunit-header">Strophe.js Tests</h1> <h2 id="qunit-banner"></h2> <h2 id="qunit-userAgent"></h2> <ol id="qunit-tests"></ol> <div id="main"></div> </body> </html>
<!DOCTYPE html> <html> <head> <script> <!-- // remove Function.prototype.bind so we can test Strophe's Function.prototype.bind = undefined; // --> </script> <script src="http://code.jquery.com/jquery-latest.js"></script> <link rel="stylesheet" href="http://code.jquery.com/qunit/qunit-git.css" type="text/css"> <script src="http://code.jquery.com/qunit/qunit-git.js"></script> <script src="http://cjohansen.no/sinon/releases/sinon-0.8.0.js"></script> <script src="http://cjohansen.no/sinon/releases/sinon-qunit-0.8.0.js"></script> <script src='../strophe.js'></script> <script src='tests.js'></script> <title>Strophe.js Tests</title> </head> <body> <h1 id="qunit-header">Strophe.js Tests</h1> <h2 id="qunit-banner"></h2> <h2 id="qunit-userAgent"></h2> <ol id="qunit-tests"></ol> <div id="main"></div> </body> </html>
Use jquery's CDN for hotlinking qunit JS/CSS instead of Github.
Use jquery's CDN for hotlinking qunit JS/CSS instead of Github. Github now sends everything with mime type of text/plain, which means hotlinking CSS no longer works.
HTML
mit
strophe/strophejs,damencho/strophejs,processone/strophejs,diegocr/strophejs,Rom4eg/strophejs,teambox/strophejs,Rom4eg/strophejs,adozenlines/strophejs,gabrielfalcao/strophejs,m800-limited/strophejs,adozenlines/strophejs,strophe/strophejs,ggozad/strophejs,sualko/strophejs,teambox/strophejs,m800-limited/strophejs,minddistrict/strophejs,processone/strophejs,Voiceworks/strophejs,Rom4eg/strophejs,Voiceworks/strophejs,gabrielfalcao/strophejs,damencho/strophejs,joachimlindborg/strophejs,teambox/strophejs,ggozad/strophejs,damencho/strophejs,strophe/strophejs,minddistrict/strophejs,processone/strophejs,diegocr/strophejs,Voiceworks/strophejs,metajack/strophejs,joachimlindborg/strophejs,sualko/strophejs,diegocr/strophejs,minddistrict/strophejs,ggozad/strophejs,sualko/strophejs,adozenlines/strophejs,gabrielfalcao/strophejs,joachimlindborg/strophejs,m800-limited/strophejs,metajack/strophejs
html
## Code Before: <!DOCTYPE html> <html> <head> <script> <!-- // remove Function.prototype.bind so we can test Strophe's Function.prototype.bind = undefined; // --> </script> <script src="http://code.jquery.com/jquery-latest.js"></script> <link rel="stylesheet" href="https://github.com/jquery/qunit/raw/master/qunit/qunit.css" type="text/css"> <script src="https://github.com/jquery/qunit/raw/master/qunit/qunit.js"></script> <script src="http://cjohansen.no/sinon/releases/sinon-0.8.0.js"></script> <script src="http://cjohansen.no/sinon/releases/sinon-qunit-0.8.0.js"></script> <script src='../strophe.js'></script> <script src='tests.js'></script> <title>Strophe.js Tests</title> </head> <body> <h1 id="qunit-header">Strophe.js Tests</h1> <h2 id="qunit-banner"></h2> <h2 id="qunit-userAgent"></h2> <ol id="qunit-tests"></ol> <div id="main"></div> </body> </html> ## Instruction: Use jquery's CDN for hotlinking qunit JS/CSS instead of Github. Github now sends everything with mime type of text/plain, which means hotlinking CSS no longer works. ## Code After: <!DOCTYPE html> <html> <head> <script> <!-- // remove Function.prototype.bind so we can test Strophe's Function.prototype.bind = undefined; // --> </script> <script src="http://code.jquery.com/jquery-latest.js"></script> <link rel="stylesheet" href="http://code.jquery.com/qunit/qunit-git.css" type="text/css"> <script src="http://code.jquery.com/qunit/qunit-git.js"></script> <script src="http://cjohansen.no/sinon/releases/sinon-0.8.0.js"></script> <script src="http://cjohansen.no/sinon/releases/sinon-qunit-0.8.0.js"></script> <script src='../strophe.js'></script> <script src='tests.js'></script> <title>Strophe.js Tests</title> </head> <body> <h1 id="qunit-header">Strophe.js Tests</h1> <h2 id="qunit-banner"></h2> <h2 id="qunit-userAgent"></h2> <ol id="qunit-tests"></ol> <div id="main"></div> </body> </html>
<!DOCTYPE html> <html> <head> <script> <!-- // remove Function.prototype.bind so we can test Strophe's Function.prototype.bind = undefined; // --> </script> <script src="http://code.jquery.com/jquery-latest.js"></script> <link rel="stylesheet" - href="https://github.com/jquery/qunit/raw/master/qunit/qunit.css" + href="http://code.jquery.com/qunit/qunit-git.css" type="text/css"> - <script src="https://github.com/jquery/qunit/raw/master/qunit/qunit.js"></script> ? - ------- ^^ ^^^^^^^^^^^ ----- + <script src="http://code.jquery.com/qunit/qunit-git.js"></script> ? ^^^ ^^^ ++++ <script src="http://cjohansen.no/sinon/releases/sinon-0.8.0.js"></script> <script src="http://cjohansen.no/sinon/releases/sinon-qunit-0.8.0.js"></script> <script src='../strophe.js'></script> <script src='tests.js'></script> <title>Strophe.js Tests</title> </head> <body> <h1 id="qunit-header">Strophe.js Tests</h1> <h2 id="qunit-banner"></h2> <h2 id="qunit-userAgent"></h2> <ol id="qunit-tests"></ol> <div id="main"></div> </body> </html>
4
0.133333
2
2
425c3f8269b9cf07d709dfc2602f938aed7b4df3
.travis.yml
.travis.yml
language: node_js node_js: - "0.12" - "4.1" install: npm install script: - npm run lint - npm test - npm run build notifications: email: on_success: never on_failure: change slack: rooms: - 'uwdub:Ry6mwlUX1aZevqiqmYLiA3N1' on_success: never on_failure: change
language: node_js node_js: - "0.12" - "4.1" install: npm install script: - npm run lint - npm test - npm run clean && npm run build && npm run schema:only notifications: email: on_success: never on_failure: change slack: rooms: - 'uwdub:Ry6mwlUX1aZevqiqmYLiA3N1' on_success: never on_failure: change
Test building schema as well
Test building schema as well
YAML
bsd-3-clause
vega/vega-lite,uwdata/vega-lite,vega/vega-lite,vega/vega-lite,uwdata/vega-lite,uwdata/vega-lite,vega/vega-lite,uwdata/vega-lite,vega/vega-lite,uwdata/vega-lite
yaml
## Code Before: language: node_js node_js: - "0.12" - "4.1" install: npm install script: - npm run lint - npm test - npm run build notifications: email: on_success: never on_failure: change slack: rooms: - 'uwdub:Ry6mwlUX1aZevqiqmYLiA3N1' on_success: never on_failure: change ## Instruction: Test building schema as well ## Code After: language: node_js node_js: - "0.12" - "4.1" install: npm install script: - npm run lint - npm test - npm run clean && npm run build && npm run schema:only notifications: email: on_success: never on_failure: change slack: rooms: - 'uwdub:Ry6mwlUX1aZevqiqmYLiA3N1' on_success: never on_failure: change
language: node_js node_js: - "0.12" - "4.1" install: npm install script: - npm run lint - npm test - - npm run build + - npm run clean && npm run build && npm run schema:only notifications: email: on_success: never on_failure: change slack: rooms: - 'uwdub:Ry6mwlUX1aZevqiqmYLiA3N1' on_success: never on_failure: change
2
0.111111
1
1
ed97a1f811f04693203f6d1c0e9b64649a3da152
coney/exceptions.py
coney/exceptions.py
class ConeyException(Exception): def __repr__(self): return 'An unspecified error has occurred' class CallTimeoutException(ConeyException): def __repr__(self): return 'An RPC call did not return before the time out period' class MalformedRequestException(ConeyException): def __init__(self, serializer_name, request): self._serializer_name = serializer_name self._request = request def __repr__(self): return '{} failed to create a Request from string: {}'.format(self._serialier_name, self._request) class RemoteExecErrorException(ConeyException): def __init__(self, value, details): self._value = value self._details = details def __repr__(self): return 'An error occurred during remote execution: ({}) {}'.format(self._value, self._details) @property def value(self): return self._value @property def details(self): return self._details class RemoteUnhandledExceptionException(ConeyException): def __init__(self, details): self._details = details def __repr__(self): return 'An unhandled exception was raised during remote execution: {}'.format(self._details) class DispatchHandlerException(ConeyException): def __init__(self, code): self.code = code def __repr__(self): return 'Error {} occurred during message dispatch'.format(self.code)
class ConeyException(Exception): def __repr__(self): return 'An unspecified error has occurred' class CallTimeoutException(ConeyException): def __repr__(self): return 'An RPC call did not return before the time out period' class MalformedRequestException(ConeyException): def __init__(self, serializer_name, request): self._serializer_name = serializer_name self._request = request def __repr__(self): return '{} failed to create a Request from string: {}'.format(self._serialier_name, self._request) class RemoteExecErrorException(ConeyException): def __init__(self, value, details): self._value = value self._details = details def __repr__(self): return 'An error occurred during remote execution: ({}) {}'.format(self._value, self._details) @property def value(self): return self._value @property def details(self): return self._details class RemoteUnhandledExceptionException(ConeyException): def __init__(self, details): self._details = details def __repr__(self): return 'An unhandled exception was raised during remote execution: {}'.format(self._details) class DispatchHandlerException(ConeyException): def __init__(self, code): self.code = code def __repr__(self): return 'Error {} occurred during message dispatch'.format(self.code) class HandlerNotCallableException(ConeyException): def __repr__(self): return 'Handler provided a non-callable object'
Add a new exception to handle a non-callable handler.
Add a new exception to handle a non-callable handler.
Python
mit
cbigler/jackrabbit
python
## Code Before: class ConeyException(Exception): def __repr__(self): return 'An unspecified error has occurred' class CallTimeoutException(ConeyException): def __repr__(self): return 'An RPC call did not return before the time out period' class MalformedRequestException(ConeyException): def __init__(self, serializer_name, request): self._serializer_name = serializer_name self._request = request def __repr__(self): return '{} failed to create a Request from string: {}'.format(self._serialier_name, self._request) class RemoteExecErrorException(ConeyException): def __init__(self, value, details): self._value = value self._details = details def __repr__(self): return 'An error occurred during remote execution: ({}) {}'.format(self._value, self._details) @property def value(self): return self._value @property def details(self): return self._details class RemoteUnhandledExceptionException(ConeyException): def __init__(self, details): self._details = details def __repr__(self): return 'An unhandled exception was raised during remote execution: {}'.format(self._details) class DispatchHandlerException(ConeyException): def __init__(self, code): self.code = code def __repr__(self): return 'Error {} occurred during message dispatch'.format(self.code) ## Instruction: Add a new exception to handle a non-callable handler. ## Code After: class ConeyException(Exception): def __repr__(self): return 'An unspecified error has occurred' class CallTimeoutException(ConeyException): def __repr__(self): return 'An RPC call did not return before the time out period' class MalformedRequestException(ConeyException): def __init__(self, serializer_name, request): self._serializer_name = serializer_name self._request = request def __repr__(self): return '{} failed to create a Request from string: {}'.format(self._serialier_name, self._request) class RemoteExecErrorException(ConeyException): def __init__(self, value, details): self._value = value self._details = details def __repr__(self): return 'An error occurred during remote execution: ({}) {}'.format(self._value, self._details) @property def value(self): return self._value @property def details(self): return self._details class RemoteUnhandledExceptionException(ConeyException): def __init__(self, details): self._details = details def __repr__(self): return 'An unhandled exception was raised during remote execution: {}'.format(self._details) class DispatchHandlerException(ConeyException): def __init__(self, code): self.code = code def __repr__(self): return 'Error {} occurred during message dispatch'.format(self.code) class HandlerNotCallableException(ConeyException): def __repr__(self): return 'Handler provided a non-callable object'
class ConeyException(Exception): def __repr__(self): return 'An unspecified error has occurred' class CallTimeoutException(ConeyException): def __repr__(self): return 'An RPC call did not return before the time out period' class MalformedRequestException(ConeyException): def __init__(self, serializer_name, request): self._serializer_name = serializer_name self._request = request def __repr__(self): return '{} failed to create a Request from string: {}'.format(self._serialier_name, self._request) class RemoteExecErrorException(ConeyException): def __init__(self, value, details): self._value = value self._details = details def __repr__(self): return 'An error occurred during remote execution: ({}) {}'.format(self._value, self._details) @property def value(self): return self._value @property def details(self): return self._details class RemoteUnhandledExceptionException(ConeyException): def __init__(self, details): self._details = details def __repr__(self): return 'An unhandled exception was raised during remote execution: {}'.format(self._details) class DispatchHandlerException(ConeyException): def __init__(self, code): self.code = code def __repr__(self): return 'Error {} occurred during message dispatch'.format(self.code) + + + class HandlerNotCallableException(ConeyException): + def __repr__(self): + return 'Handler provided a non-callable object'
5
0.096154
5
0
5b1db6c408d0386f3c6926b0b65e16d3aae274e0
images/bazel/variants.yaml
images/bazel/variants.yaml
variants: kubernetes: CONFIG: kubernetes-master # Used by the kubernetes repo master branch NEW_VERSION: 2.2.0 OLD_VERSION: 0.25.2 test-infra: CONFIG: test-infra # Used by test-infra repo NEW_VERSION: 3.0.0 OLD_VERSION: 2.2.0
variants: kubernetes: CONFIG: kubernetes-master # Used by the kubernetes repo master branch NEW_VERSION: 2.2.0 OLD_VERSION: 0.25.2 org: CONFIG: org # Used by org repo NEW_VERSION: 3.0.0 OLD_VERSION: 0.29.1 test-infra: CONFIG: test-infra # Used by test-infra repo NEW_VERSION: 3.0.0 OLD_VERSION: 2.2.0
Create an org variant of the bazel image.
Create an org variant of the bazel image. This will install the two versions needed by this repo.
YAML
apache-2.0
cblecker/test-infra,BenTheElder/test-infra,brahmaroutu/test-infra,fejta/test-infra,michelle192837/test-infra,monopole/test-infra,pwittrock/test-infra,BenTheElder/test-infra,dims/test-infra,pwittrock/test-infra,fejta/test-infra,monopole/test-infra,jessfraz/test-infra,jessfraz/test-infra,dims/test-infra,cblecker/test-infra,monopole/test-infra,cjwagner/test-infra,fejta/test-infra,jessfraz/test-infra,michelle192837/test-infra,kubernetes/test-infra,kubernetes/test-infra,jessfraz/test-infra,michelle192837/test-infra,brahmaroutu/test-infra,pwittrock/test-infra,jessfraz/test-infra,monopole/test-infra,pwittrock/test-infra,michelle192837/test-infra,monopole/test-infra,dims/test-infra,michelle192837/test-infra,brahmaroutu/test-infra,cblecker/test-infra,kubernetes/test-infra,cjwagner/test-infra,cjwagner/test-infra,dims/test-infra,fejta/test-infra,brahmaroutu/test-infra,cblecker/test-infra,BenTheElder/test-infra,brahmaroutu/test-infra,dims/test-infra,michelle192837/test-infra,monopole/test-infra,BenTheElder/test-infra,cblecker/test-infra,brahmaroutu/test-infra,cjwagner/test-infra,kubernetes/test-infra,fejta/test-infra,pwittrock/test-infra,BenTheElder/test-infra,dims/test-infra,BenTheElder/test-infra,kubernetes/test-infra,cjwagner/test-infra,jessfraz/test-infra,cjwagner/test-infra,kubernetes/test-infra,fejta/test-infra,cblecker/test-infra
yaml
## Code Before: variants: kubernetes: CONFIG: kubernetes-master # Used by the kubernetes repo master branch NEW_VERSION: 2.2.0 OLD_VERSION: 0.25.2 test-infra: CONFIG: test-infra # Used by test-infra repo NEW_VERSION: 3.0.0 OLD_VERSION: 2.2.0 ## Instruction: Create an org variant of the bazel image. This will install the two versions needed by this repo. ## Code After: variants: kubernetes: CONFIG: kubernetes-master # Used by the kubernetes repo master branch NEW_VERSION: 2.2.0 OLD_VERSION: 0.25.2 org: CONFIG: org # Used by org repo NEW_VERSION: 3.0.0 OLD_VERSION: 0.29.1 test-infra: CONFIG: test-infra # Used by test-infra repo NEW_VERSION: 3.0.0 OLD_VERSION: 2.2.0
variants: kubernetes: CONFIG: kubernetes-master # Used by the kubernetes repo master branch NEW_VERSION: 2.2.0 OLD_VERSION: 0.25.2 + org: + CONFIG: org # Used by org repo + NEW_VERSION: 3.0.0 + OLD_VERSION: 0.29.1 test-infra: CONFIG: test-infra # Used by test-infra repo NEW_VERSION: 3.0.0 OLD_VERSION: 2.2.0
4
0.444444
4
0
ce75ba9a99e5fc3012822d6159beac34f6fbb86d
generators/app/steps/install.js
generators/app/steps/install.js
/** * Step 7 * Where installation are run (npm, bower) */ module.exports = { /** * Install npm dependencies */ installNpmDependencies: function () { if (!(this.options['skip-project-install'] || this.options["skip-all"])) { this.log(chalk.yellow("Start installing npm dependencies, please wait...")); this.npmInstall(); } } };
/** * Step 7 * Where installation are run (npm, bower) */ var chalk = require('chalk'); module.exports = { /** * Install npm dependencies */ installNpmDependencies: function () { if (!(this.options['skip-project-install'] || this.options["skip-all"])) { this.log(chalk.yellow("Start installing npm dependencies, please wait...")); this.npmInstall(); } } };
Fix bug with chalk undefined
Fix bug with chalk undefined
JavaScript
mit
ghaiklor/generator-sails-rest-api,konstantinzolotarev/generator-trails,jaumard/generator-trails,synergycns/generator-sails-rest-api,italoag/generator-sails-rest-api,italoag/generator-sails-rest-api,ghaiklor/generator-sails-rest-api,tnunes/generator-trails,eithewliter5518/generator-sails-rest-api,mhipo1364/generator-sails-rest-api,IncoCode/generator-sails-rest-api
javascript
## Code Before: /** * Step 7 * Where installation are run (npm, bower) */ module.exports = { /** * Install npm dependencies */ installNpmDependencies: function () { if (!(this.options['skip-project-install'] || this.options["skip-all"])) { this.log(chalk.yellow("Start installing npm dependencies, please wait...")); this.npmInstall(); } } }; ## Instruction: Fix bug with chalk undefined ## Code After: /** * Step 7 * Where installation are run (npm, bower) */ var chalk = require('chalk'); module.exports = { /** * Install npm dependencies */ installNpmDependencies: function () { if (!(this.options['skip-project-install'] || this.options["skip-all"])) { this.log(chalk.yellow("Start installing npm dependencies, please wait...")); this.npmInstall(); } } };
/** * Step 7 * Where installation are run (npm, bower) */ + + var chalk = require('chalk'); module.exports = { /** * Install npm dependencies */ installNpmDependencies: function () { if (!(this.options['skip-project-install'] || this.options["skip-all"])) { this.log(chalk.yellow("Start installing npm dependencies, please wait...")); this.npmInstall(); } } };
2
0.125
2
0
bdfd53b26cbebee65472d0de83d41410c593f867
README.md
README.md
_This is not the documentation you are looking for... it is a pointer to the real documentation._ ## Looking for Crowbar Resources? [The Crowbar website](https://crowbar.github.io) has links to all information and is our recommended starting place. ## Specific Crowbar Documentation We track Crowbar documentation with the code so that we can track versions of documentation with the code. Here are commonly requested references: * [Getting Started Guide](https://github.com/crowbar/crowbar/blob/master/doc/gettingstarted.md) > You may need to look in subdirectories under the links above for additional details. ## Background Crowbar documentation is distributed into multiple places under the /doc directory of each Crowbar module (aka "barclamps"). When the modules are installed, Crowbar combines all the /doc directories into a master documentation set. These directories are structured into subdirectories for general topics. This structure is common across all barclamps in the [Crowbar project](https://github.com/crowbar/) > Please, do NOT add documentation in locations besides /doc! If necessary, expand this README to include pointers to important /doc information.
[![Reviewed by Hound](https://img.shields.io/badge/Reviewed_by-Hound-8E64B0.svg)](https://houndci.com) _This is not the documentation you are looking for... it is a pointer to the real documentation._ ## Looking for Crowbar Resources? [The Crowbar website](https://crowbar.github.io) has links to all information and is our recommended starting place. ## Specific Crowbar Documentation We track Crowbar documentation with the code so that we can track versions of documentation with the code. Here are commonly requested references: * [Getting Started Guide](https://github.com/crowbar/crowbar/blob/master/doc/gettingstarted.md) > You may need to look in subdirectories under the links above for additional details. ## Background Crowbar documentation is distributed into multiple places under the /doc directory of each Crowbar module (aka "barclamps"). When the modules are installed, Crowbar combines all the /doc directories into a master documentation set. These directories are structured into subdirectories for general topics. This structure is common across all barclamps in the [Crowbar project](https://github.com/crowbar/) > Please, do NOT add documentation in locations besides /doc! If necessary, expand this README to include pointers to important /doc information.
Add a "Reviewed by Hound" badge
Add a "Reviewed by Hound" badge What do you think about adding a Hound badge?
Markdown
apache-2.0
rsalevsky/crowbar,crowbar/crowbar,dirkmueller/crowbar,dirkmueller/crowbar,crowbar/crowbar,rsalevsky/crowbar
markdown
## Code Before: _This is not the documentation you are looking for... it is a pointer to the real documentation._ ## Looking for Crowbar Resources? [The Crowbar website](https://crowbar.github.io) has links to all information and is our recommended starting place. ## Specific Crowbar Documentation We track Crowbar documentation with the code so that we can track versions of documentation with the code. Here are commonly requested references: * [Getting Started Guide](https://github.com/crowbar/crowbar/blob/master/doc/gettingstarted.md) > You may need to look in subdirectories under the links above for additional details. ## Background Crowbar documentation is distributed into multiple places under the /doc directory of each Crowbar module (aka "barclamps"). When the modules are installed, Crowbar combines all the /doc directories into a master documentation set. These directories are structured into subdirectories for general topics. This structure is common across all barclamps in the [Crowbar project](https://github.com/crowbar/) > Please, do NOT add documentation in locations besides /doc! If necessary, expand this README to include pointers to important /doc information. ## Instruction: Add a "Reviewed by Hound" badge What do you think about adding a Hound badge? ## Code After: [![Reviewed by Hound](https://img.shields.io/badge/Reviewed_by-Hound-8E64B0.svg)](https://houndci.com) _This is not the documentation you are looking for... it is a pointer to the real documentation._ ## Looking for Crowbar Resources? [The Crowbar website](https://crowbar.github.io) has links to all information and is our recommended starting place. ## Specific Crowbar Documentation We track Crowbar documentation with the code so that we can track versions of documentation with the code. Here are commonly requested references: * [Getting Started Guide](https://github.com/crowbar/crowbar/blob/master/doc/gettingstarted.md) > You may need to look in subdirectories under the links above for additional details. ## Background Crowbar documentation is distributed into multiple places under the /doc directory of each Crowbar module (aka "barclamps"). When the modules are installed, Crowbar combines all the /doc directories into a master documentation set. These directories are structured into subdirectories for general topics. This structure is common across all barclamps in the [Crowbar project](https://github.com/crowbar/) > Please, do NOT add documentation in locations besides /doc! If necessary, expand this README to include pointers to important /doc information.
+ + [![Reviewed by Hound](https://img.shields.io/badge/Reviewed_by-Hound-8E64B0.svg)](https://houndci.com) _This is not the documentation you are looking for... it is a pointer to the real documentation._ ## Looking for Crowbar Resources? [The Crowbar website](https://crowbar.github.io) has links to all information and is our recommended starting place. ## Specific Crowbar Documentation We track Crowbar documentation with the code so that we can track versions of documentation with the code. Here are commonly requested references: * [Getting Started Guide](https://github.com/crowbar/crowbar/blob/master/doc/gettingstarted.md) > You may need to look in subdirectories under the links above for additional details. ## Background Crowbar documentation is distributed into multiple places under the /doc directory of each Crowbar module (aka "barclamps"). When the modules are installed, Crowbar combines all the /doc directories into a master documentation set. These directories are structured into subdirectories for general topics. This structure is common across all barclamps in the [Crowbar project](https://github.com/crowbar/) > Please, do NOT add documentation in locations besides /doc! If necessary, expand this README to include pointers to important /doc information.
2
0.090909
2
0
208ea975a4896e2377fb491c5339722b1bb0179e
config/config-defaults.json
config/config-defaults.json
{ "http_port": 2000, "server_address": "localhost", "upstream_host": "registry.npmjs.org", "upstream_port": 80, "cache_dir": "/tmp/npm-lazy", "cache_mem": 200 }
{ "http_port": 2000, "server_address": "localhost", "upstream_host": "registry.npmjs.org", "upstream_port": 443, "upstream_use_https": true, "upstream_verify_ssl": true, "cache_dir": "/tmp/npm-lazy", "cache_mem": 200 }
Set default access to registry server in https
Set default access to registry server in https
JSON
mit
hpcloud/npm-lazy-mirror,ActiveState/npm-lazy-mirror,hpcloud/npm-lazy-mirror,ActiveState/npm-lazy-mirror
json
## Code Before: { "http_port": 2000, "server_address": "localhost", "upstream_host": "registry.npmjs.org", "upstream_port": 80, "cache_dir": "/tmp/npm-lazy", "cache_mem": 200 } ## Instruction: Set default access to registry server in https ## Code After: { "http_port": 2000, "server_address": "localhost", "upstream_host": "registry.npmjs.org", "upstream_port": 443, "upstream_use_https": true, "upstream_verify_ssl": true, "cache_dir": "/tmp/npm-lazy", "cache_mem": 200 }
{ "http_port": 2000, "server_address": "localhost", "upstream_host": "registry.npmjs.org", - "upstream_port": 80, ? ^^ + "upstream_port": 443, ? ^^^ + "upstream_use_https": true, + "upstream_verify_ssl": true, "cache_dir": "/tmp/npm-lazy", "cache_mem": 200 }
4
0.363636
3
1
c316a016db74779d3c473df23e2992c56a30266d
static/js/highlight.js
static/js/highlight.js
$(function () { var highlight = null; var match = /#line\-(\d+)/g.exec(window.location.hash); if (match) { highlight = parseInt(match[1], 10); } SyntaxHighlighter.all({toolbar: false, highlight:highlight}); setTimeout(function () { $('.gutter .line').each(function () { var div = $(this); if (div.hasClass('highlighted')) { scrollTo(div); } var text = div.text(); var id = 'line-' + text; var a = $('<a href="#' + id + '" id="' + id + '">' + text + '</a>'); div.empty(); div.append(a); a.click(function () { $('.syntaxhighlighter .highlighted').removeClass('highlighted'); $('.syntaxhighlighter .number' + $(this).text()).addClass('highlighted'); var pos = scrollPosition(); window.location.hash = '#' + $(this).attr('id'); scrollPosition(pos); scrollTo($(this), true); return false; }); }); }, 100); });
function addLineLinks() { var lines = $('.gutter .line'); if (!lines.length) { return false; } lines.each(function () { var div = $(this); if (div.hasClass('highlighted')) { scrollTo(div); } var text = div.text(); var id = 'line-' + text; var a = $('<a href="#' + id + '" id="' + id + '">' + text + '</a>'); div.empty(); div.append(a); a.click(function () { $('.syntaxhighlighter .highlighted').removeClass('highlighted'); $('.syntaxhighlighter .number' + $(this).text()).addClass('highlighted'); var pos = scrollPosition(); window.location.hash = '#' + $(this).attr('id'); scrollPosition(pos); scrollTo($(this), true); return false; }); }); return true; } $(function () { var highlight = null; var match = /#line\-(\d+)/g.exec(window.location.hash); if (match) { highlight = parseInt(match[1], 10); } SyntaxHighlighter.all({toolbar: false, highlight:highlight}); var ts = setInterval(function () { if (addLineLinks()) { clearInterval(ts); } }, 10); });
Use an interval rather than a timeout to add the line number links
Use an interval rather than a timeout to add the line number links
JavaScript
apache-2.0
rainycape/gondolaweb,rainycape/gondolaweb,rainycape/gondolaweb,rainycape/gondolaweb
javascript
## Code Before: $(function () { var highlight = null; var match = /#line\-(\d+)/g.exec(window.location.hash); if (match) { highlight = parseInt(match[1], 10); } SyntaxHighlighter.all({toolbar: false, highlight:highlight}); setTimeout(function () { $('.gutter .line').each(function () { var div = $(this); if (div.hasClass('highlighted')) { scrollTo(div); } var text = div.text(); var id = 'line-' + text; var a = $('<a href="#' + id + '" id="' + id + '">' + text + '</a>'); div.empty(); div.append(a); a.click(function () { $('.syntaxhighlighter .highlighted').removeClass('highlighted'); $('.syntaxhighlighter .number' + $(this).text()).addClass('highlighted'); var pos = scrollPosition(); window.location.hash = '#' + $(this).attr('id'); scrollPosition(pos); scrollTo($(this), true); return false; }); }); }, 100); }); ## Instruction: Use an interval rather than a timeout to add the line number links ## Code After: function addLineLinks() { var lines = $('.gutter .line'); if (!lines.length) { return false; } lines.each(function () { var div = $(this); if (div.hasClass('highlighted')) { scrollTo(div); } var text = div.text(); var id = 'line-' + text; var a = $('<a href="#' + id + '" id="' + id + '">' + text + '</a>'); div.empty(); div.append(a); a.click(function () { $('.syntaxhighlighter .highlighted').removeClass('highlighted'); $('.syntaxhighlighter .number' + $(this).text()).addClass('highlighted'); var pos = scrollPosition(); window.location.hash = '#' + $(this).attr('id'); scrollPosition(pos); scrollTo($(this), true); return false; }); }); return true; } $(function () { var highlight = null; var match = /#line\-(\d+)/g.exec(window.location.hash); if (match) { highlight = parseInt(match[1], 10); } SyntaxHighlighter.all({toolbar: false, highlight:highlight}); var ts = setInterval(function () { if (addLineLinks()) { clearInterval(ts); } }, 10); });
+ function addLineLinks() { + var lines = $('.gutter .line'); + if (!lines.length) { + return false; - $(function () { - var highlight = null; - var match = /#line\-(\d+)/g.exec(window.location.hash); - if (match) { - highlight = parseInt(match[1], 10); } - SyntaxHighlighter.all({toolbar: false, highlight:highlight}); - setTimeout(function () { - $('.gutter .line').each(function () { ? ------------ ^^ + lines.each(function () { ? ^ var div = $(this); if (div.hasClass('highlighted')) { scrollTo(div); } var text = div.text(); var id = 'line-' + text; var a = $('<a href="#' + id + '" id="' + id + '">' + text + '</a>'); div.empty(); div.append(a); a.click(function () { $('.syntaxhighlighter .highlighted').removeClass('highlighted'); $('.syntaxhighlighter .number' + $(this).text()).addClass('highlighted'); var pos = scrollPosition(); window.location.hash = '#' + $(this).attr('id'); scrollPosition(pos); scrollTo($(this), true); return false; }); }); + return true; + } + + $(function () { + var highlight = null; + var match = /#line\-(\d+)/g.exec(window.location.hash); + if (match) { + highlight = parseInt(match[1], 10); + } + SyntaxHighlighter.all({toolbar: false, highlight:highlight}); + var ts = setInterval(function () { + if (addLineLinks()) { + clearInterval(ts); + } - }, 100); ? - + }, 10); });
29
0.966667
20
9
a57adad2bbe30d7bc8b236a579f7fa7e620e4a2b
manifest.json
manifest.json
{ "manifest_version": 2, "name": "Purge Old History", "version": "1.0.1", "description": "Removes history entries older than a specified age.", "applications": { "gecko": { "id":"purge-old-hist@monochrome101.addons.mozilla.org", "strict_min_version": "49.*" } }, "permissions": [ "alarms", "history", "storage" ], "background": { "scripts": ["purge-daemon.js"] }, "options_ui": { "page": "options.html" } }
{ "manifest_version": 2, "name": "Purge Old History", "version": "1.0.2", "description": "Periodically removes history entries older than a specified age.", "applications": { "gecko": { "id":"purge-old-hist@monochrome101.addons.mozilla.org", "strict_min_version": "49.0a2" } }, "permissions": [ "alarms", "history", "storage" ], "background": { "scripts": ["purge-daemon.js"] }, "options_ui": { "page": "options.html" } }
Update description, min. FF version, and overall version bump
Update description, min. FF version, and overall version bump
JSON
mit
DmitriK/ff-purge-old-history,DmitriK/ff-purge-old-history
json
## Code Before: { "manifest_version": 2, "name": "Purge Old History", "version": "1.0.1", "description": "Removes history entries older than a specified age.", "applications": { "gecko": { "id":"purge-old-hist@monochrome101.addons.mozilla.org", "strict_min_version": "49.*" } }, "permissions": [ "alarms", "history", "storage" ], "background": { "scripts": ["purge-daemon.js"] }, "options_ui": { "page": "options.html" } } ## Instruction: Update description, min. FF version, and overall version bump ## Code After: { "manifest_version": 2, "name": "Purge Old History", "version": "1.0.2", "description": "Periodically removes history entries older than a specified age.", "applications": { "gecko": { "id":"purge-old-hist@monochrome101.addons.mozilla.org", "strict_min_version": "49.0a2" } }, "permissions": [ "alarms", "history", "storage" ], "background": { "scripts": ["purge-daemon.js"] }, "options_ui": { "page": "options.html" } }
{ "manifest_version": 2, "name": "Purge Old History", - "version": "1.0.1", ? ^ + "version": "1.0.2", ? ^ - "description": "Removes history entries older than a specified age.", ? ^ + "description": "Periodically removes history entries older than a specified age.", ? ^^^^^^^^^^^^^^ "applications": { "gecko": { "id":"purge-old-hist@monochrome101.addons.mozilla.org", - "strict_min_version": "49.*" ? ^ + "strict_min_version": "49.0a2" ? ^^^ } }, "permissions": [ "alarms", "history", "storage" ], "background": { "scripts": ["purge-daemon.js"] }, "options_ui": { "page": "options.html" } }
6
0.25
3
3
632bccba2075272e00da55f30342fa3944f22f49
gulpfile.js
gulpfile.js
const gulp = require('gulp'), rename = require('gulp-rename'), less = require('gulp-less'), pug = require('gulp-pug') argv = require('yargs').argv /** * Compile less to css and minify */ gulp.task('less', function () { gulp.src('./src/less/**/*.less') .pipe(less({ compress:true })) .pipe(gulp.dest('./src/css/')); }); // Build template gulp.task('export', function () { if (!argv.template) return console.log("Please enter template name --template <templatename>"); const templateName = argv.template; return gulp.src('./templates/' + templateName + '/index.pug') .pipe(pug( )) .pipe(rename( templateName + '.xml')) .pipe(gulp.dest('./build')) }) // Auto compile gulp.task('autocompile', function () { gulp.watch('./src/less/**/*.less', ['less']); });
const gulp = require('gulp'), rename = require('gulp-rename'), less = require('gulp-less'), pug = require('gulp-pug') argv = require('yargs').argv /** * Compile less to css and minify */ gulp.task('less', function () { gulp.src('./src/less/**/*.less') .pipe(less({ compress:true })) .pipe(gulp.dest('./src/css/')); }); // Build and export template gulp.task('export', function () { // Get argument if (!argv.template && !argv.t) return console.log("Please enter template name \"gulp export --template <templatename>\" "); // Set template name and file path const templateName = argv.template || argv.t; const mainFile = './templates/' + templateName + '/index.pug'; const minify = (argv.minify)? false : true; console.log("Get main file : " + mainFile); // Start gulp return gulp.src(mainFile) .pipe(pug({ pretty: minify })) .pipe(rename( templateName + '.xml')) .pipe(gulp.dest('./build')) }) // Auto compile gulp.task('autocompile', function () { gulp.watch('./src/less/**/*.less', ['less']); });
Update gulp export support pretty
Update gulp export support pretty
JavaScript
mit
nitpum/blogger-template-builder,nitpum/blogger-template-builder
javascript
## Code Before: const gulp = require('gulp'), rename = require('gulp-rename'), less = require('gulp-less'), pug = require('gulp-pug') argv = require('yargs').argv /** * Compile less to css and minify */ gulp.task('less', function () { gulp.src('./src/less/**/*.less') .pipe(less({ compress:true })) .pipe(gulp.dest('./src/css/')); }); // Build template gulp.task('export', function () { if (!argv.template) return console.log("Please enter template name --template <templatename>"); const templateName = argv.template; return gulp.src('./templates/' + templateName + '/index.pug') .pipe(pug( )) .pipe(rename( templateName + '.xml')) .pipe(gulp.dest('./build')) }) // Auto compile gulp.task('autocompile', function () { gulp.watch('./src/less/**/*.less', ['less']); }); ## Instruction: Update gulp export support pretty ## Code After: const gulp = require('gulp'), rename = require('gulp-rename'), less = require('gulp-less'), pug = require('gulp-pug') argv = require('yargs').argv /** * Compile less to css and minify */ gulp.task('less', function () { gulp.src('./src/less/**/*.less') .pipe(less({ compress:true })) .pipe(gulp.dest('./src/css/')); }); // Build and export template gulp.task('export', function () { // Get argument if (!argv.template && !argv.t) return console.log("Please enter template name \"gulp export --template <templatename>\" "); // Set template name and file path const templateName = argv.template || argv.t; const mainFile = './templates/' + templateName + '/index.pug'; const minify = (argv.minify)? false : true; console.log("Get main file : " + mainFile); // Start gulp return gulp.src(mainFile) .pipe(pug({ pretty: minify })) .pipe(rename( templateName + '.xml')) .pipe(gulp.dest('./build')) }) // Auto compile gulp.task('autocompile', function () { gulp.watch('./src/less/**/*.less', ['less']); });
const gulp = require('gulp'), rename = require('gulp-rename'), less = require('gulp-less'), pug = require('gulp-pug') argv = require('yargs').argv /** * Compile less to css and minify */ gulp.task('less', function () { gulp.src('./src/less/**/*.less') .pipe(less({ compress:true })) .pipe(gulp.dest('./src/css/')); }); - // Build template + // Build and export template ? +++++++++++ gulp.task('export', function () { - if (!argv.template) - return console.log("Please enter template name --template <templatename>"); - const templateName = argv.template; + // Get argument + if (!argv.template && !argv.t) + return console.log("Please enter template name \"gulp export --template <templatename>\" "); - return gulp.src('./templates/' + templateName + '/index.pug') - .pipe(pug( + // Set template name and file path + const templateName = argv.template || argv.t; + const mainFile = './templates/' + templateName + '/index.pug'; + const minify = (argv.minify)? false : true; + + console.log("Get main file : " + mainFile); + // Start gulp + return gulp.src(mainFile) + .pipe(pug({ + pretty: minify - )) + })) ? + .pipe(rename( templateName + '.xml')) .pipe(gulp.dest('./build')) }) // Auto compile gulp.task('autocompile', function () { gulp.watch('./src/less/**/*.less', ['less']); });
22
0.564103
15
7
0da19042c74d2a85ef4652b36186a1ee6c4fc247
tilequeue/format/mvt.py
tilequeue/format/mvt.py
from mapbox_vector_tile.encoder import on_invalid_geometry_make_valid from mapbox_vector_tile import encode as mvt_encode def encode(fp, feature_layers, coord, bounds_merc): tile = mvt_encode(feature_layers, quantize_bounds=bounds_merc, on_invalid_geometry=on_invalid_geometry_make_valid) fp.write(tile)
from mapbox_vector_tile.encoder import on_invalid_geometry_make_valid from mapbox_vector_tile import encode as mvt_encode def encode(fp, feature_layers, coord, bounds_merc): tile = mvt_encode( feature_layers, quantize_bounds=bounds_merc, on_invalid_geometry=on_invalid_geometry_make_valid, round_fn=round, ) fp.write(tile)
Use round_fn to specify built-in round function
Use round_fn to specify built-in round function
Python
mit
mapzen/tilequeue,tilezen/tilequeue
python
## Code Before: from mapbox_vector_tile.encoder import on_invalid_geometry_make_valid from mapbox_vector_tile import encode as mvt_encode def encode(fp, feature_layers, coord, bounds_merc): tile = mvt_encode(feature_layers, quantize_bounds=bounds_merc, on_invalid_geometry=on_invalid_geometry_make_valid) fp.write(tile) ## Instruction: Use round_fn to specify built-in round function ## Code After: from mapbox_vector_tile.encoder import on_invalid_geometry_make_valid from mapbox_vector_tile import encode as mvt_encode def encode(fp, feature_layers, coord, bounds_merc): tile = mvt_encode( feature_layers, quantize_bounds=bounds_merc, on_invalid_geometry=on_invalid_geometry_make_valid, round_fn=round, ) fp.write(tile)
from mapbox_vector_tile.encoder import on_invalid_geometry_make_valid from mapbox_vector_tile import encode as mvt_encode def encode(fp, feature_layers, coord, bounds_merc): - tile = mvt_encode(feature_layers, quantize_bounds=bounds_merc, + tile = mvt_encode( + feature_layers, + quantize_bounds=bounds_merc, - on_invalid_geometry=on_invalid_geometry_make_valid) ? -------------- ^ + on_invalid_geometry=on_invalid_geometry_make_valid, ? ^ + round_fn=round, + ) fp.write(tile)
8
1
6
2
f38d6ed5142c4e0d964882bc4cc63e10cf9395a5
doc/requirements/README.md
doc/requirements/README.md
Currently just Twinkles 1. Parts of this will be included into a DC1 requirements document, assembled by the SSim working group. As a result, of this, and in anticipation of Twinkles 2, the Twinkles 1 source files are stored in their own sub-directory, and compiled into a master document called `TwinklesRQ.tex`. ### Compiling The documents use the Science Roadmap macros and settings. To link the required files to this directory, do ``` setenv SRM_DIR <path_to_Science_Roadmap_repo> make links ``` Then, compile with `make`.
Currently just Twinkles 1. Parts of this will be included into a DC1 requirements document, assembled by the SSim working group. As a result, of this, and in anticipation of Twinkles 2, the Twinkles 1 source files are stored in their own sub-directory, and compiled into a master document called `TwinklesRQ.tex`. ### Compiling Compile from the `doc/requirements` directory. The documents use the Science Roadmap macros and settings. To link the required files to this directory, do (in csh): ``` setenv SRM_DIR <path_to_Science_Roadmap_repo> make links ``` or in [ba]sh ``` export SRM_DIR=<path_to_Science_Roadmap_repo> make links ``` Then, compile with `make`.
Add bash instructions for setting SRM_DIR
Add bash instructions for setting SRM_DIR
Markdown
mit
DarkEnergyScienceCollaboration/Twinkles,rbiswas4/Twinkles,DarkEnergyScienceCollaboration/Twinkles,rbiswas4/Twinkles,LSSTDESC/Twinkles,LSSTDESC/Twinkles
markdown
## Code Before: Currently just Twinkles 1. Parts of this will be included into a DC1 requirements document, assembled by the SSim working group. As a result, of this, and in anticipation of Twinkles 2, the Twinkles 1 source files are stored in their own sub-directory, and compiled into a master document called `TwinklesRQ.tex`. ### Compiling The documents use the Science Roadmap macros and settings. To link the required files to this directory, do ``` setenv SRM_DIR <path_to_Science_Roadmap_repo> make links ``` Then, compile with `make`. ## Instruction: Add bash instructions for setting SRM_DIR ## Code After: Currently just Twinkles 1. Parts of this will be included into a DC1 requirements document, assembled by the SSim working group. As a result, of this, and in anticipation of Twinkles 2, the Twinkles 1 source files are stored in their own sub-directory, and compiled into a master document called `TwinklesRQ.tex`. ### Compiling Compile from the `doc/requirements` directory. The documents use the Science Roadmap macros and settings. To link the required files to this directory, do (in csh): ``` setenv SRM_DIR <path_to_Science_Roadmap_repo> make links ``` or in [ba]sh ``` export SRM_DIR=<path_to_Science_Roadmap_repo> make links ``` Then, compile with `make`.
Currently just Twinkles 1. Parts of this will be included into a DC1 requirements document, assembled by the SSim working group. As a result, of this, and in anticipation of Twinkles 2, the Twinkles 1 source files are stored in their own sub-directory, and compiled into a master document called `TwinklesRQ.tex`. ### Compiling + Compile from the `doc/requirements` directory. + The documents use the Science Roadmap macros and settings. To link the - required files to this directory, do + required files to this directory, do (in csh): ? ++++++++++ ``` setenv SRM_DIR <path_to_Science_Roadmap_repo> make links ``` + or in [ba]sh + ``` + export SRM_DIR=<path_to_Science_Roadmap_repo> + make links + ``` + Then, compile with `make`.
10
0.588235
9
1
c10b5348f2e5fdde22bb99f2afa48435a5445f03
.travis.yml
.travis.yml
--- language: ruby sudo: false rvm: - 1.9.3 - 2.0 - 2.1 - 2.2 notifications: email: false env: - PUPPET_GEM_VERSION="~> 3.7.0" - PUPPET_GEM_VERSION="~> 3.8.0" - PUPPET_GEM_VERSION="~> 4.1.0" - PUPPET_GEM_VERSION="~> 4.2.0" script: - puppet --version - bundle exec rake validate lint spec matrix: include: - rvm: 1.8.7 env: PUPPET_GEM_VERSION="~> 3.7.0" RSPEC_GEM_VERSION="< 3.2.0" - rvm: 1.8.7 env: PUPPET_GEM_VERSION="~> 3.8.0" RSPEC_GEM_VERSION="< 3.2.0" exclude: - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.7.0" - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.8.0"
--- language: ruby sudo: false rvm: - 1.9.3 - 2.0 - 2.1 - 2.2 notifications: email: false env: - PUPPET_GEM_VERSION="~> 3.7.0" - PUPPET_GEM_VERSION="~> 3.8.0" - PUPPET_GEM_VERSION="~> 4.1.0" - PUPPET_GEM_VERSION="~> 4.2.0" script: - puppet --version - bundle exec rake validate lint spec matrix: exclude: - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.7.0" - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.8.0"
Drop support for ruby 1.8.7
Drop support for ruby 1.8.7 rake (11.0.1) does not working with ruby-versions older than 1.9.3.
YAML
apache-2.0
gnubila-france/puppet-check_mk,gnubila-france/puppet-check_mk
yaml
## Code Before: --- language: ruby sudo: false rvm: - 1.9.3 - 2.0 - 2.1 - 2.2 notifications: email: false env: - PUPPET_GEM_VERSION="~> 3.7.0" - PUPPET_GEM_VERSION="~> 3.8.0" - PUPPET_GEM_VERSION="~> 4.1.0" - PUPPET_GEM_VERSION="~> 4.2.0" script: - puppet --version - bundle exec rake validate lint spec matrix: include: - rvm: 1.8.7 env: PUPPET_GEM_VERSION="~> 3.7.0" RSPEC_GEM_VERSION="< 3.2.0" - rvm: 1.8.7 env: PUPPET_GEM_VERSION="~> 3.8.0" RSPEC_GEM_VERSION="< 3.2.0" exclude: - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.7.0" - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.8.0" ## Instruction: Drop support for ruby 1.8.7 rake (11.0.1) does not working with ruby-versions older than 1.9.3. ## Code After: --- language: ruby sudo: false rvm: - 1.9.3 - 2.0 - 2.1 - 2.2 notifications: email: false env: - PUPPET_GEM_VERSION="~> 3.7.0" - PUPPET_GEM_VERSION="~> 3.8.0" - PUPPET_GEM_VERSION="~> 4.1.0" - PUPPET_GEM_VERSION="~> 4.2.0" script: - puppet --version - bundle exec rake validate lint spec matrix: exclude: - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.7.0" - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.8.0"
--- language: ruby sudo: false rvm: - 1.9.3 - 2.0 - 2.1 - 2.2 notifications: email: false env: - PUPPET_GEM_VERSION="~> 3.7.0" - PUPPET_GEM_VERSION="~> 3.8.0" - PUPPET_GEM_VERSION="~> 4.1.0" - PUPPET_GEM_VERSION="~> 4.2.0" script: - puppet --version - bundle exec rake validate lint spec matrix: - include: - - rvm: 1.8.7 - env: PUPPET_GEM_VERSION="~> 3.7.0" RSPEC_GEM_VERSION="< 3.2.0" - - rvm: 1.8.7 - env: PUPPET_GEM_VERSION="~> 3.8.0" RSPEC_GEM_VERSION="< 3.2.0" exclude: - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.7.0" - rvm: 2.2 env: PUPPET_GEM_VERSION="~> 3.8.0"
5
0.172414
0
5
ac23b4b1f15df46aaf3127e51b88ae2c85acdd36
lib/npmrc-switcher.js
lib/npmrc-switcher.js
'use strict'; var findup = require('findup-sync'); var fs = require('fs'); var shell = require('child_process').execFile; // Only check the value once var baseDirectory = process.env.HOME; var baseFile = baseDirectory + '/.npmrc'; var baseBackup = baseDirectory + '/.npmrc.bak'; var nearestFile; function isAlreadyInUse(path) { if (path && path !== baseFile) { return true; } } function getPath() { var npmrcPath = findup('.npmrc'); if (isAlreadyInUse(npmrcPath)) { return npmrcPath; } return null; } function renameFile(currentFileName, newFileName) { if (fs.existsSync(currentFileName)) { fs.renameSync(currentFileName, newFileName); if (fs.existsSync(currentFileName)) { throw new Error('Could not rename file'); } } } function copySourceToDestination(source, destination) { shell('/bin/cp', [source, destination]); } function returnToBackedUpFile() { renameFile(baseBackup, baseFile); } function backupFileCurrentlyInUse() { renameFile(baseFile, baseBackup); } nearestFile = getPath(); if (nearestFile) { backupFileCurrentlyInUse(); copySourceToDestination(nearestFile, baseFile); } else { returnToBackedUpFile(); }
'use strict'; var findup = require('findup-sync'); var fs = require('fs'); var shell = require('child_process').execFile; // Only check the value once var baseDirectory = process.env.HOME; var baseFile = baseDirectory + '/.npmrc'; var defaultFile = baseDirectory + '/.npmrc.default'; var nearestFile; function isAlreadyInUse(path) { if (path && path !== baseFile) { return true; } } function getPath() { var npmrcPath = findup('.npmrc'); if (isAlreadyInUse(npmrcPath)) { return npmrcPath; } return null; } function renameFile(currentFileName, newFileName) { if (fs.existsSync(currentFileName)) { fs.renameSync(currentFileName, newFileName); if (fs.existsSync(currentFileName)) { throw new Error('Could not rename file'); } } } function copySourceToDestination(source, destination) { shell('/bin/cp', [source, destination]); } function defaultExists() { return fs.existsSync(defaultFile); } function returnToDefaultFile() { if (fs.existsSync(baseFile)) { fs.unlinkSync(baseFile); } if (defaultExists) { renameFile(defaultFile, baseFile); } } nearestFile = getPath(); if (nearestFile) { copySourceToDestination(nearestFile, baseFile); } else { returnToDefaultFile(); }
Use a default file if it exists, otherwise fallback to nom's default
Use a default file if it exists, otherwise fallback to nom's default
JavaScript
apache-2.0
BBC-News/npmrc-switcher,BBC-News/npmrc-switcher
javascript
## Code Before: 'use strict'; var findup = require('findup-sync'); var fs = require('fs'); var shell = require('child_process').execFile; // Only check the value once var baseDirectory = process.env.HOME; var baseFile = baseDirectory + '/.npmrc'; var baseBackup = baseDirectory + '/.npmrc.bak'; var nearestFile; function isAlreadyInUse(path) { if (path && path !== baseFile) { return true; } } function getPath() { var npmrcPath = findup('.npmrc'); if (isAlreadyInUse(npmrcPath)) { return npmrcPath; } return null; } function renameFile(currentFileName, newFileName) { if (fs.existsSync(currentFileName)) { fs.renameSync(currentFileName, newFileName); if (fs.existsSync(currentFileName)) { throw new Error('Could not rename file'); } } } function copySourceToDestination(source, destination) { shell('/bin/cp', [source, destination]); } function returnToBackedUpFile() { renameFile(baseBackup, baseFile); } function backupFileCurrentlyInUse() { renameFile(baseFile, baseBackup); } nearestFile = getPath(); if (nearestFile) { backupFileCurrentlyInUse(); copySourceToDestination(nearestFile, baseFile); } else { returnToBackedUpFile(); } ## Instruction: Use a default file if it exists, otherwise fallback to nom's default ## Code After: 'use strict'; var findup = require('findup-sync'); var fs = require('fs'); var shell = require('child_process').execFile; // Only check the value once var baseDirectory = process.env.HOME; var baseFile = baseDirectory + '/.npmrc'; var defaultFile = baseDirectory + '/.npmrc.default'; var nearestFile; function isAlreadyInUse(path) { if (path && path !== baseFile) { return true; } } function getPath() { var npmrcPath = findup('.npmrc'); if (isAlreadyInUse(npmrcPath)) { return npmrcPath; } return null; } function renameFile(currentFileName, newFileName) { if (fs.existsSync(currentFileName)) { fs.renameSync(currentFileName, newFileName); if (fs.existsSync(currentFileName)) { throw new Error('Could not rename file'); } } } function copySourceToDestination(source, destination) { shell('/bin/cp', [source, destination]); } function defaultExists() { return fs.existsSync(defaultFile); } function returnToDefaultFile() { if (fs.existsSync(baseFile)) { fs.unlinkSync(baseFile); } if (defaultExists) { renameFile(defaultFile, baseFile); } } nearestFile = getPath(); if (nearestFile) { copySourceToDestination(nearestFile, baseFile); } else { returnToDefaultFile(); }
'use strict'; var findup = require('findup-sync'); var fs = require('fs'); var shell = require('child_process').execFile; // Only check the value once var baseDirectory = process.env.HOME; var baseFile = baseDirectory + '/.npmrc'; - var baseBackup = baseDirectory + '/.npmrc.bak'; + var defaultFile = baseDirectory + '/.npmrc.default'; var nearestFile; function isAlreadyInUse(path) { if (path && path !== baseFile) { return true; } } function getPath() { var npmrcPath = findup('.npmrc'); if (isAlreadyInUse(npmrcPath)) { return npmrcPath; } return null; } function renameFile(currentFileName, newFileName) { if (fs.existsSync(currentFileName)) { fs.renameSync(currentFileName, newFileName); if (fs.existsSync(currentFileName)) { throw new Error('Could not rename file'); } } } function copySourceToDestination(source, destination) { shell('/bin/cp', [source, destination]); } - function returnToBackedUpFile() { - renameFile(baseBackup, baseFile); + function defaultExists() { + return fs.existsSync(defaultFile); } - function backupFileCurrentlyInUse() { - renameFile(baseFile, baseBackup); + function returnToDefaultFile() { + if (fs.existsSync(baseFile)) { + fs.unlinkSync(baseFile); + } + + if (defaultExists) { + renameFile(defaultFile, baseFile); + } } nearestFile = getPath(); if (nearestFile) { - backupFileCurrentlyInUse(); copySourceToDestination(nearestFile, baseFile); } else { - returnToBackedUpFile(); ? ^ ^^^^^^ + returnToDefaultFile(); ? ^^^ ^^^ }
19
0.327586
12
7
f592a1b47fcb407d2ae4543a3e06609252e90a42
config/initializers/raven.rb
config/initializers/raven.rb
require 'raven' Raven.configure do |config| config.dsn = 'https://c22240edb1ba4972bf5007bd1cbf1aa3:707c4fc8872046738fa86eeaf60d5938@app.getsentry.com/11938' end
require 'raven' Raven.configure do |config| config.dsn = ENV["SENTRY_DSN"] end
Store Sentry API key in ENV var
Store Sentry API key in ENV var
Ruby
bsd-3-clause
smcgov/ohana-api-smc,smcgov/ohana-api-smc,smcgov/ohana-api-smc,smcgov/ohana-api-smc
ruby
## Code Before: require 'raven' Raven.configure do |config| config.dsn = 'https://c22240edb1ba4972bf5007bd1cbf1aa3:707c4fc8872046738fa86eeaf60d5938@app.getsentry.com/11938' end ## Instruction: Store Sentry API key in ENV var ## Code After: require 'raven' Raven.configure do |config| config.dsn = ENV["SENTRY_DSN"] end
require 'raven' Raven.configure do |config| - config.dsn = 'https://c22240edb1ba4972bf5007bd1cbf1aa3:707c4fc8872046738fa86eeaf60d5938@app.getsentry.com/11938' + config.dsn = ENV["SENTRY_DSN"] end
2
0.4
1
1
6cf484c9a3ce5aa141363ae67c58fa94d055a105
app/assets/javascripts/app/views/bookmarklet_view.js
app/assets/javascripts/app/views/bookmarklet_view.js
app.views.Bookmarklet = Backbone.View.extend({ separator: ' - ', initialize: function(opts) { // init a standalone publisher app.publisher = new app.views.Publisher({standalone: true}); app.publisher.on('publisher:add', this._postSubmit, this); app.publisher.on('publisher:sync', this._postSuccess, this); app.publisher.on('publisher:error', this._postError, this); this.param_contents = opts; }, render: function() { app.publisher.open(); app.publisher.setText(this._publisherContent()); return this; }, _publisherContent: function() { var p = this.param_contents; if( p.content ) return p.content; var contents = p.title + this.separator + p.url; if( p.notes ) contents += this.separator + p.notes; return contents; }, _postSubmit: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_submit')); }, _postSuccess: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_success')); app.publisher.close(); _.delay(window.close, 2000); }, _postError: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_error')); } });
app.views.Bookmarklet = Backbone.View.extend({ separator: ' - ', initialize: function(opts) { // init a standalone publisher app.publisher = new app.views.Publisher({standalone: true}); app.publisher.on('publisher:add', this._postSubmit, this); app.publisher.on('publisher:sync', this._postSuccess, this); app.publisher.on('publisher:error', this._postError, this); this.param_contents = opts; }, render: function() { app.publisher.open(); app.publisher.setText(this._publisherContent()); return this; }, _publisherContent: function() { var p = this.param_contents; if( p.content ) return p.content; var contents = p.title + this.separator + p.url; if( p.notes ) contents += this.separator + p.notes; return contents; }, _postSubmit: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_submit')); }, _postSuccess: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_success')); app.publisher.close(); this.$("#publisher").addClass("hidden"); _.delay(window.close, 2000); }, _postError: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_error')); } });
Make sure publisher is totally hidden in bookmarklet after post success
Make sure publisher is totally hidden in bookmarklet after post success
JavaScript
agpl-3.0
geraspora/diaspora,jhass/diaspora,Flaburgan/diaspora,diaspora/diaspora,Amadren/diaspora,diaspora/diaspora,geraspora/diaspora,despora/diaspora,Amadren/diaspora,Amadren/diaspora,Muhannes/diaspora,SuperTux88/diaspora,jhass/diaspora,KentShikama/diaspora,spixi/diaspora,jhass/diaspora,SuperTux88/diaspora,SuperTux88/diaspora,SuperTux88/diaspora,Muhannes/diaspora,Muhannes/diaspora,KentShikama/diaspora,geraspora/diaspora,despora/diaspora,spixi/diaspora,geraspora/diaspora,Muhannes/diaspora,Amadren/diaspora,despora/diaspora,diaspora/diaspora,diaspora/diaspora,Flaburgan/diaspora,KentShikama/diaspora,KentShikama/diaspora,despora/diaspora,spixi/diaspora,jhass/diaspora,spixi/diaspora,Flaburgan/diaspora,Flaburgan/diaspora
javascript
## Code Before: app.views.Bookmarklet = Backbone.View.extend({ separator: ' - ', initialize: function(opts) { // init a standalone publisher app.publisher = new app.views.Publisher({standalone: true}); app.publisher.on('publisher:add', this._postSubmit, this); app.publisher.on('publisher:sync', this._postSuccess, this); app.publisher.on('publisher:error', this._postError, this); this.param_contents = opts; }, render: function() { app.publisher.open(); app.publisher.setText(this._publisherContent()); return this; }, _publisherContent: function() { var p = this.param_contents; if( p.content ) return p.content; var contents = p.title + this.separator + p.url; if( p.notes ) contents += this.separator + p.notes; return contents; }, _postSubmit: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_submit')); }, _postSuccess: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_success')); app.publisher.close(); _.delay(window.close, 2000); }, _postError: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_error')); } }); ## Instruction: Make sure publisher is totally hidden in bookmarklet after post success ## Code After: app.views.Bookmarklet = Backbone.View.extend({ separator: ' - ', initialize: function(opts) { // init a standalone publisher app.publisher = new app.views.Publisher({standalone: true}); app.publisher.on('publisher:add', this._postSubmit, this); app.publisher.on('publisher:sync', this._postSuccess, this); app.publisher.on('publisher:error', this._postError, this); this.param_contents = opts; }, render: function() { app.publisher.open(); app.publisher.setText(this._publisherContent()); return this; }, _publisherContent: function() { var p = this.param_contents; if( p.content ) return p.content; var contents = p.title + this.separator + p.url; if( p.notes ) contents += this.separator + p.notes; return contents; }, _postSubmit: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_submit')); }, _postSuccess: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_success')); app.publisher.close(); this.$("#publisher").addClass("hidden"); _.delay(window.close, 2000); }, _postError: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_error')); } });
app.views.Bookmarklet = Backbone.View.extend({ separator: ' - ', initialize: function(opts) { // init a standalone publisher app.publisher = new app.views.Publisher({standalone: true}); app.publisher.on('publisher:add', this._postSubmit, this); app.publisher.on('publisher:sync', this._postSuccess, this); app.publisher.on('publisher:error', this._postError, this); this.param_contents = opts; }, render: function() { app.publisher.open(); app.publisher.setText(this._publisherContent()); return this; }, _publisherContent: function() { var p = this.param_contents; if( p.content ) return p.content; var contents = p.title + this.separator + p.url; if( p.notes ) contents += this.separator + p.notes; return contents; }, _postSubmit: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_submit')); }, _postSuccess: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_success')); app.publisher.close(); + this.$("#publisher").addClass("hidden"); _.delay(window.close, 2000); }, _postError: function(evt) { this.$('h4').text(Diaspora.I18n.t('bookmarklet.post_error')); } });
1
0.022727
1
0
82d08ac8d443374f1f22bcdc7313d09b8ca6a3e3
srv/assistants/onCredentialsChangedAssistant.js
srv/assistants/onCredentialsChangedAssistant.js
var onCredentialsChangedAssistant = function (future) {}; onCredentialsChangedAssistant.prototype.run = function (outerFuture) { log("============== onCredentialsChangedAssistant"); log("Params: " + JSON.stringify(this.controller.args)); //may contain password. log("Future: " + JSON.stringify(outerFuture.result)); if (locked === true) { log("Locked... already running?"); previousOperationFuture.then(this, function (f) { log("PreviousOperation finished " + JSON.stringify(f.result) + " , starting onCredentialsChangedAssistant"); this.run(outerFuture); }); return; } //this assistant is not helpfull.. it only get's the accountId, but no new credentials??? Why??? :( finishAssistant(outerFuture, { returnValue: false, success: false }); };
var onCredentialsChangedAssistant = function (future) {}; onCredentialsChangedAssistant.prototype.run = function (outerFuture) { log("============== onCredentialsChangedAssistant"); log("Params: " + JSON.stringify(this.controller.args)); //may contain password. log("Future: " + JSON.stringify(outerFuture.result)); // if (locked === true) { // log("Locked... already running?"); // previousOperationFuture.then(this, function (f) { // log("PreviousOperation finished " + JSON.stringify(f.result) + " , starting onCredentialsChangedAssistant"); // this.run(outerFuture); // }); // return; // } //this assistant is not helpfull.. it only get's the accountId, but no new credentials??? Why??? :( finishAssistant(outerFuture, { returnValue: false, success: false }); };
Change was necessary to allow modifyAccount calls in storeAccounts to return. We don't really use the webOS stored credentials anyway... no idea how to read them.
Change was necessary to allow modifyAccount calls in storeAccounts to return. We don't really use the webOS stored credentials anyway... no idea how to read them.
JavaScript
mit
Garfonso/SyncML,Garfonso/SyncML
javascript
## Code Before: var onCredentialsChangedAssistant = function (future) {}; onCredentialsChangedAssistant.prototype.run = function (outerFuture) { log("============== onCredentialsChangedAssistant"); log("Params: " + JSON.stringify(this.controller.args)); //may contain password. log("Future: " + JSON.stringify(outerFuture.result)); if (locked === true) { log("Locked... already running?"); previousOperationFuture.then(this, function (f) { log("PreviousOperation finished " + JSON.stringify(f.result) + " , starting onCredentialsChangedAssistant"); this.run(outerFuture); }); return; } //this assistant is not helpfull.. it only get's the accountId, but no new credentials??? Why??? :( finishAssistant(outerFuture, { returnValue: false, success: false }); }; ## Instruction: Change was necessary to allow modifyAccount calls in storeAccounts to return. We don't really use the webOS stored credentials anyway... no idea how to read them. ## Code After: var onCredentialsChangedAssistant = function (future) {}; onCredentialsChangedAssistant.prototype.run = function (outerFuture) { log("============== onCredentialsChangedAssistant"); log("Params: " + JSON.stringify(this.controller.args)); //may contain password. log("Future: " + JSON.stringify(outerFuture.result)); // if (locked === true) { // log("Locked... already running?"); // previousOperationFuture.then(this, function (f) { // log("PreviousOperation finished " + JSON.stringify(f.result) + " , starting onCredentialsChangedAssistant"); // this.run(outerFuture); // }); // return; // } //this assistant is not helpfull.. it only get's the accountId, but no new credentials??? Why??? :( finishAssistant(outerFuture, { returnValue: false, success: false }); };
var onCredentialsChangedAssistant = function (future) {}; onCredentialsChangedAssistant.prototype.run = function (outerFuture) { log("============== onCredentialsChangedAssistant"); log("Params: " + JSON.stringify(this.controller.args)); //may contain password. log("Future: " + JSON.stringify(outerFuture.result)); - if (locked === true) { + // if (locked === true) { ? ++ - log("Locked... already running?"); + // log("Locked... already running?"); ? ++ - previousOperationFuture.then(this, function (f) { + // previousOperationFuture.then(this, function (f) { ? ++ - log("PreviousOperation finished " + JSON.stringify(f.result) + " , starting onCredentialsChangedAssistant"); + // log("PreviousOperation finished " + JSON.stringify(f.result) + " , starting onCredentialsChangedAssistant"); ? ++ - this.run(outerFuture); + // this.run(outerFuture); ? ++ - }); + // }); ? ++ - return; + // return; ? ++ - } + // } ? ++ //this assistant is not helpfull.. it only get's the accountId, but no new credentials??? Why??? :( finishAssistant(outerFuture, { returnValue: false, success: false }); };
16
0.842105
8
8
f450382be489559431876ae3fad9ae8b7cf1820b
models/events.js
models/events.js
var mongoose = require('mongoose'); var Schema = mongoose.Schema; var eventSchema = new Schema({ title: String, description: String, startDate: Date, endDate: Date, deadline: Date, duration: String, participants: [String] }); module.exports = mongoose.model('events', eventSchema);
var mongoose = require('mongoose'); var Schema = mongoose.Schema; var slotSchema = new Schema({ startDate: { type: String, required: true }, startTime: { type: String, required: true }, duration: { type: String, required: true } }); var participantSchema = new Schema({ name: { type: String, required: true }, facebookId: { type: String, required: true }, slot: [slotSchema] }); var eventSchema = new Schema({ title: { type: String, trim: true, required: true }, description: { type: String, trim: true }, startDate: { type: String, required: true }, endDate: { type: String, required: true }, deadline: { type: String, required: true }, duration: { type: String, required: true }, participants: { type: [participantSchema], required: true } }); module.exports = mongoose.model('events', eventSchema);
Change the schema of model.
Change the schema of model. An example is given below: var data = { title: 'event 1', description: 'the first event', startDate: '2014-01-02', endDate: '2014-01-05', deadline: '2014-01-01', duration: '2', participants: [ { name: 'John', facebookId: '234236367634', slot: [ { startDate: '2014-01-03', startTime: '4', duration: '2' }, ... ] }, ... ] }; Then post it with AJAX: $.post('/newevent', data);
JavaScript
mit
Cobra-Kao/synckr,Cobra-Kao/synckr
javascript
## Code Before: var mongoose = require('mongoose'); var Schema = mongoose.Schema; var eventSchema = new Schema({ title: String, description: String, startDate: Date, endDate: Date, deadline: Date, duration: String, participants: [String] }); module.exports = mongoose.model('events', eventSchema); ## Instruction: Change the schema of model. An example is given below: var data = { title: 'event 1', description: 'the first event', startDate: '2014-01-02', endDate: '2014-01-05', deadline: '2014-01-01', duration: '2', participants: [ { name: 'John', facebookId: '234236367634', slot: [ { startDate: '2014-01-03', startTime: '4', duration: '2' }, ... ] }, ... ] }; Then post it with AJAX: $.post('/newevent', data); ## Code After: var mongoose = require('mongoose'); var Schema = mongoose.Schema; var slotSchema = new Schema({ startDate: { type: String, required: true }, startTime: { type: String, required: true }, duration: { type: String, required: true } }); var participantSchema = new Schema({ name: { type: String, required: true }, facebookId: { type: String, required: true }, slot: [slotSchema] }); var eventSchema = new Schema({ title: { type: String, trim: true, required: true }, description: { type: String, trim: true }, startDate: { type: String, required: true }, endDate: { type: String, required: true }, deadline: { type: String, required: true }, duration: { type: String, required: true }, participants: { type: [participantSchema], required: true } }); module.exports = mongoose.model('events', eventSchema);
var mongoose = require('mongoose'); var Schema = mongoose.Schema; + var slotSchema = new Schema({ + startDate: { type: String, required: true }, + startTime: { type: String, required: true }, + duration: { type: String, required: true } + }); + + var participantSchema = new Schema({ + name: { type: String, required: true }, + facebookId: { type: String, required: true }, + slot: [slotSchema] + }); + var eventSchema = new Schema({ - title: String, - description: String, - startDate: Date, - endDate: Date, - deadline: Date, - duration: String, - participants: [String] + title: { type: String, trim: true, required: true }, + description: { type: String, trim: true }, + startDate: { type: String, required: true }, + endDate: { type: String, required: true }, + deadline: { type: String, required: true }, + duration: { type: String, required: true }, + participants: { type: [participantSchema], required: true } }); module.exports = mongoose.model('events', eventSchema);
26
1.857143
19
7
933f4a2a74e92d72ccf1b11f87d43e66eb829694
README.rst
README.rst
pytest-dependency - Manage dependencies of tests ================================================ This pytest plugin manages dependencies of tests. It allows to mark some tests as dependent from other tests. These tests will then be skipped if any of the dependencies did fail or has been skipped. System requirements ------------------- + Python 2.6, 2.7, or 3.1 and newer. + `pytest`_ 2.8.0 or newer. Copyright and License --------------------- - Copyright 2013-2015 Helmholtz-Zentrum Berlin für Materialien und Energie GmbH - Copyright 2016 Rolf Krahl Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. .. _pytest: http://pytest.org/
.. note:: This is the ``tests`` branch of the code. It is an attempt to implement Issue `#1`__. .. __: https://github.com/RKrahl/pytest-dependency/issues/1 pytest-dependency - Manage dependencies of tests ================================================ This pytest plugin manages dependencies of tests. It allows to mark some tests as dependent from other tests. These tests will then be skipped if any of the dependencies did fail or has been skipped. System requirements ------------------- + Python 2.6, 2.7, or 3.1 and newer. + `pytest`_ 2.8.0 or newer. Copyright and License --------------------- - Copyright 2013-2015 Helmholtz-Zentrum Berlin für Materialien und Energie GmbH - Copyright 2016 Rolf Krahl Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. .. _pytest: http://pytest.org/
Add a note on the status of the branch.
Add a note on the status of the branch.
reStructuredText
apache-2.0
RKrahl/pytest-dependency
restructuredtext
## Code Before: pytest-dependency - Manage dependencies of tests ================================================ This pytest plugin manages dependencies of tests. It allows to mark some tests as dependent from other tests. These tests will then be skipped if any of the dependencies did fail or has been skipped. System requirements ------------------- + Python 2.6, 2.7, or 3.1 and newer. + `pytest`_ 2.8.0 or newer. Copyright and License --------------------- - Copyright 2013-2015 Helmholtz-Zentrum Berlin für Materialien und Energie GmbH - Copyright 2016 Rolf Krahl Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. .. _pytest: http://pytest.org/ ## Instruction: Add a note on the status of the branch. ## Code After: .. note:: This is the ``tests`` branch of the code. It is an attempt to implement Issue `#1`__. .. __: https://github.com/RKrahl/pytest-dependency/issues/1 pytest-dependency - Manage dependencies of tests ================================================ This pytest plugin manages dependencies of tests. It allows to mark some tests as dependent from other tests. These tests will then be skipped if any of the dependencies did fail or has been skipped. System requirements ------------------- + Python 2.6, 2.7, or 3.1 and newer. + `pytest`_ 2.8.0 or newer. Copyright and License --------------------- - Copyright 2013-2015 Helmholtz-Zentrum Berlin für Materialien und Energie GmbH - Copyright 2016 Rolf Krahl Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. .. _pytest: http://pytest.org/
+ .. note:: This is the ``tests`` branch of the code. + It is an attempt to implement Issue `#1`__. + + .. __: https://github.com/RKrahl/pytest-dependency/issues/1 + + pytest-dependency - Manage dependencies of tests ================================================ This pytest plugin manages dependencies of tests. It allows to mark some tests as dependent from other tests. These tests will then be skipped if any of the dependencies did fail or has been skipped. System requirements ------------------- + Python 2.6, 2.7, or 3.1 and newer. + `pytest`_ 2.8.0 or newer. Copyright and License --------------------- - Copyright 2013-2015 Helmholtz-Zentrum Berlin für Materialien und Energie GmbH - Copyright 2016 Rolf Krahl Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. .. _pytest: http://pytest.org/
6
0.166667
6
0
a61c597d13f6adb08860b4a24b8f4045de5451b5
app/models/obs_factory/distribution_strategy_opensuse.rb
app/models/obs_factory/distribution_strategy_opensuse.rb
module ObsFactory # this class tracks the differences between Factory and the upcoming release class DistributionStrategyOpenSUSE < DistributionStrategyFactory def opensuse_version # Remove the "openSUSE:" part project.name[9..-1] end def opensuse_leap_version # Remove the "openSUSE:Leap:" part project.name[14..-1] end def openqa_version opensuse_leap_version end def openqa_group "openSUSE Leap #{opensuse_leap_version}" end def repo_url "http://download.opensuse.org/distribution/leap/#{opensuse_leap_version}/repo/oss/media.1/build" end def url_suffix "distribution/leap/#{opensuse_leap_version}/iso" end def openqa_iso_prefix "openSUSE-#{opensuse_version}-Staging" end # URL parameter for 42 def openqa_filter(project) return "match=42:S:#{project.letter}" end end end
module ObsFactory # this class tracks the differences between Factory and the upcoming release class DistributionStrategyOpenSUSE < DistributionStrategyFactory def opensuse_version # Remove the "openSUSE:" part project.name[9..-1] end def opensuse_leap_version # Remove the "openSUSE:Leap:" part project.name[14..-1] end def openqa_version opensuse_leap_version end def openqa_group "openSUSE Leap #{opensuse_leap_version}" end def repo_url "http://download.opensuse.org/distribution/leap/#{opensuse_leap_version}/repo/oss/media.1/build" end def url_suffix "distribution/leap/#{opensuse_leap_version}/iso" end def openqa_iso_prefix "openSUSE-#{opensuse_version}-Staging" end def published_arch 'x86_64' end # Version of the published distribution # # @return [String] version string def published_version begin f = open(repo_url) rescue OpenURI::HTTPError => e return 'unknown' end matchdata = %r{openSUSE-#{opensuse_leap_version}-#{published_arch}-Build(.*)}.match(f.read) matchdata[1] end # URL parameter for 42 def openqa_filter(project) return "match=42:S:#{project.letter}" end end end
Fix published_version return nil for Leap
Fix published_version return nil for Leap Build name in 42.3 is openSUSE-42.3-x86_64-Build0144 what matchdata regex will return nil for matchdata[1]. Update regex for Leap.
Ruby
mit
openSUSE-Team/obs_factory,openSUSE-Team/obs_factory,openSUSE-Team/obs_factory
ruby
## Code Before: module ObsFactory # this class tracks the differences between Factory and the upcoming release class DistributionStrategyOpenSUSE < DistributionStrategyFactory def opensuse_version # Remove the "openSUSE:" part project.name[9..-1] end def opensuse_leap_version # Remove the "openSUSE:Leap:" part project.name[14..-1] end def openqa_version opensuse_leap_version end def openqa_group "openSUSE Leap #{opensuse_leap_version}" end def repo_url "http://download.opensuse.org/distribution/leap/#{opensuse_leap_version}/repo/oss/media.1/build" end def url_suffix "distribution/leap/#{opensuse_leap_version}/iso" end def openqa_iso_prefix "openSUSE-#{opensuse_version}-Staging" end # URL parameter for 42 def openqa_filter(project) return "match=42:S:#{project.letter}" end end end ## Instruction: Fix published_version return nil for Leap Build name in 42.3 is openSUSE-42.3-x86_64-Build0144 what matchdata regex will return nil for matchdata[1]. Update regex for Leap. ## Code After: module ObsFactory # this class tracks the differences between Factory and the upcoming release class DistributionStrategyOpenSUSE < DistributionStrategyFactory def opensuse_version # Remove the "openSUSE:" part project.name[9..-1] end def opensuse_leap_version # Remove the "openSUSE:Leap:" part project.name[14..-1] end def openqa_version opensuse_leap_version end def openqa_group "openSUSE Leap #{opensuse_leap_version}" end def repo_url "http://download.opensuse.org/distribution/leap/#{opensuse_leap_version}/repo/oss/media.1/build" end def url_suffix "distribution/leap/#{opensuse_leap_version}/iso" end def openqa_iso_prefix "openSUSE-#{opensuse_version}-Staging" end def published_arch 'x86_64' end # Version of the published distribution # # @return [String] version string def published_version begin f = open(repo_url) rescue OpenURI::HTTPError => e return 'unknown' end matchdata = %r{openSUSE-#{opensuse_leap_version}-#{published_arch}-Build(.*)}.match(f.read) matchdata[1] end # URL parameter for 42 def openqa_filter(project) return "match=42:S:#{project.letter}" end end end
module ObsFactory # this class tracks the differences between Factory and the upcoming release class DistributionStrategyOpenSUSE < DistributionStrategyFactory def opensuse_version # Remove the "openSUSE:" part project.name[9..-1] end def opensuse_leap_version # Remove the "openSUSE:Leap:" part project.name[14..-1] end def openqa_version opensuse_leap_version end def openqa_group "openSUSE Leap #{opensuse_leap_version}" end def repo_url "http://download.opensuse.org/distribution/leap/#{opensuse_leap_version}/repo/oss/media.1/build" end def url_suffix "distribution/leap/#{opensuse_leap_version}/iso" end def openqa_iso_prefix "openSUSE-#{opensuse_version}-Staging" end + def published_arch + 'x86_64' + end + + # Version of the published distribution + # + # @return [String] version string + def published_version + begin + f = open(repo_url) + rescue OpenURI::HTTPError => e + return 'unknown' + end + matchdata = %r{openSUSE-#{opensuse_leap_version}-#{published_arch}-Build(.*)}.match(f.read) + matchdata[1] + end + # URL parameter for 42 def openqa_filter(project) return "match=42:S:#{project.letter}" end end end
17
0.414634
17
0
c260570c49fedb77b30b9948b798ac0e5046b63d
homebrew/chruby.rb
homebrew/chruby.rb
require 'formula' class Chruby < Formula homepage 'https://github.com/postmodern/chruby#readme' url 'https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz' sha1 '64365226210f82b58092ed01a3fb57d379b99c80' head 'https://github.com/postmodern/chruby.git' def install system 'make', 'install', "PREFIX=#{prefix}" end def caveats; <<-EOS.undent Add the following to the ~/.bashrc or ~/.zshrc file: source #{opt_prefix}/share/chruby/chruby.sh By default chruby will search for Rubies installed into /opt/rubies/ or ~/.rubies/. For non-standard installation locations, simply set the RUBIES variable after loading chruby.sh: RUBIES=( /opt/jruby-1.7.0 $HOME/src/rubinius ) If you are migrating from another Ruby manager, set `RUBIES` accordingly: RVM: RUBIES+=(~/.rvm/rubies/*) rbenv: RUBIES+=(~/.rbenv/versions/*) rbfu: RUBIES+=(~/.rbfu/rubies/*) To enable auto-switching of Rubies specified by .ruby-version files, add the following to ~/.bashrc or ~/.zshrc: source #{opt_prefix}/share/chruby/auto.sh EOS end end
require 'formula' class Chruby < Formula homepage 'https://github.com/postmodern/chruby#readme' url 'https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz' sha1 '64365226210f82b58092ed01a3fb57d379b99c80' head 'https://github.com/postmodern/chruby.git' def install system 'make', 'install', "PREFIX=#{prefix}" end def caveats; <<-EOS.undent Add the following to the ~/.bashrc or ~/.zshrc file: source #{opt_share}/chruby/chruby.sh By default chruby will search for Rubies installed into /opt/rubies/ or ~/.rubies/. For non-standard installation locations, simply set the RUBIES variable after loading chruby.sh: RUBIES=( /opt/jruby-1.7.0 $HOME/src/rubinius ) If you are migrating from another Ruby manager, set `RUBIES` accordingly: RVM: RUBIES+=(~/.rvm/rubies/*) rbenv: RUBIES+=(~/.rbenv/versions/*) rbfu: RUBIES+=(~/.rbfu/rubies/*) To enable auto-switching of Rubies specified by .ruby-version files, add the following to ~/.bashrc or ~/.zshrc: source #{opt_share}/chruby/auto.sh EOS end end
Update brew formula opt shortcut.
Update brew formula opt shortcut.
Ruby
mit
ngpestelos/chruby,kbrock/chruby,ilikepi/chruby,regularfry/chruby,ilikepi/chruby,postmodern/chruby,ngpestelos/chruby,kbrock/chruby,postmodern/chruby,regularfry/chruby
ruby
## Code Before: require 'formula' class Chruby < Formula homepage 'https://github.com/postmodern/chruby#readme' url 'https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz' sha1 '64365226210f82b58092ed01a3fb57d379b99c80' head 'https://github.com/postmodern/chruby.git' def install system 'make', 'install', "PREFIX=#{prefix}" end def caveats; <<-EOS.undent Add the following to the ~/.bashrc or ~/.zshrc file: source #{opt_prefix}/share/chruby/chruby.sh By default chruby will search for Rubies installed into /opt/rubies/ or ~/.rubies/. For non-standard installation locations, simply set the RUBIES variable after loading chruby.sh: RUBIES=( /opt/jruby-1.7.0 $HOME/src/rubinius ) If you are migrating from another Ruby manager, set `RUBIES` accordingly: RVM: RUBIES+=(~/.rvm/rubies/*) rbenv: RUBIES+=(~/.rbenv/versions/*) rbfu: RUBIES+=(~/.rbfu/rubies/*) To enable auto-switching of Rubies specified by .ruby-version files, add the following to ~/.bashrc or ~/.zshrc: source #{opt_prefix}/share/chruby/auto.sh EOS end end ## Instruction: Update brew formula opt shortcut. ## Code After: require 'formula' class Chruby < Formula homepage 'https://github.com/postmodern/chruby#readme' url 'https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz' sha1 '64365226210f82b58092ed01a3fb57d379b99c80' head 'https://github.com/postmodern/chruby.git' def install system 'make', 'install', "PREFIX=#{prefix}" end def caveats; <<-EOS.undent Add the following to the ~/.bashrc or ~/.zshrc file: source #{opt_share}/chruby/chruby.sh By default chruby will search for Rubies installed into /opt/rubies/ or ~/.rubies/. For non-standard installation locations, simply set the RUBIES variable after loading chruby.sh: RUBIES=( /opt/jruby-1.7.0 $HOME/src/rubinius ) If you are migrating from another Ruby manager, set `RUBIES` accordingly: RVM: RUBIES+=(~/.rvm/rubies/*) rbenv: RUBIES+=(~/.rbenv/versions/*) rbfu: RUBIES+=(~/.rbfu/rubies/*) To enable auto-switching of Rubies specified by .ruby-version files, add the following to ~/.bashrc or ~/.zshrc: source #{opt_share}/chruby/auto.sh EOS end end
require 'formula' class Chruby < Formula homepage 'https://github.com/postmodern/chruby#readme' url 'https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz' sha1 '64365226210f82b58092ed01a3fb57d379b99c80' head 'https://github.com/postmodern/chruby.git' def install system 'make', 'install', "PREFIX=#{prefix}" end def caveats; <<-EOS.undent Add the following to the ~/.bashrc or ~/.zshrc file: - source #{opt_prefix}/share/chruby/chruby.sh ? -------- + source #{opt_share}/chruby/chruby.sh ? + By default chruby will search for Rubies installed into /opt/rubies/ or ~/.rubies/. For non-standard installation locations, simply set the RUBIES variable after loading chruby.sh: RUBIES=( /opt/jruby-1.7.0 $HOME/src/rubinius ) If you are migrating from another Ruby manager, set `RUBIES` accordingly: RVM: RUBIES+=(~/.rvm/rubies/*) rbenv: RUBIES+=(~/.rbenv/versions/*) rbfu: RUBIES+=(~/.rbfu/rubies/*) To enable auto-switching of Rubies specified by .ruby-version files, add the following to ~/.bashrc or ~/.zshrc: - source #{opt_prefix}/share/chruby/auto.sh ? -------- + source #{opt_share}/chruby/auto.sh ? + EOS end end
4
0.1
2
2
5487664f1f8f47355e406b60ad1a6cb89c85e55c
tests/parse.js
tests/parse.js
const parse = require('../lib/parse'); const cases = require('postcss-parser-tests'); const expect = require('chai').expect; describe('parse', () => { cases.each((name, css, json) => { it('should parse ' + name, () => { let parsed = cases.jsonify(parse(css, { from: name })); expect(parsed).to.equal(json); }); }); });
const parse = require('../lib/parse'); const cases = require('postcss-parser-tests'); const expect = require('chai').expect; describe('parse', () => { cases.each((name, css, json) => { it('should parse ' + name, () => { let parsed = cases.jsonify(parse(css, { from: name })); expect(parsed).to.equal(json); }); }); }); describe('parse mixins', () => { it('should parse mixins', () => { let root = parse('a { custom(); }'), node = root.first.first; expect(node.type).to.equal('mixin'); expect(node.name).to.equal('custom'); }); it('should parse comma separated values as arguments', () => { let root = parse(".block { mixin(1, bold, url('test.png'), #000, rgb(0, 0, 0)); }"), node = root.first.first; expect(JSON.stringify(node.arguments)).to.equal('["1","bold","url(\'test.png\')","#000","rgb(0, 0, 0)"]'); }); it('should parse key: value pairs as arguments', () => { let root = parse(".block { mixin(padding: 1, weight: bold, background: url('test.png')); }"), node = root.first.first; expect(JSON.stringify(node.arguments[0])).to.equal('{"padding":"1","weight":"bold","background":"url(\'test.png\')"}'); }); });
Add tests for basic mixin syntax
Add tests for basic mixin syntax
JavaScript
mit
weepower/postcss-wee-syntax
javascript
## Code Before: const parse = require('../lib/parse'); const cases = require('postcss-parser-tests'); const expect = require('chai').expect; describe('parse', () => { cases.each((name, css, json) => { it('should parse ' + name, () => { let parsed = cases.jsonify(parse(css, { from: name })); expect(parsed).to.equal(json); }); }); }); ## Instruction: Add tests for basic mixin syntax ## Code After: const parse = require('../lib/parse'); const cases = require('postcss-parser-tests'); const expect = require('chai').expect; describe('parse', () => { cases.each((name, css, json) => { it('should parse ' + name, () => { let parsed = cases.jsonify(parse(css, { from: name })); expect(parsed).to.equal(json); }); }); }); describe('parse mixins', () => { it('should parse mixins', () => { let root = parse('a { custom(); }'), node = root.first.first; expect(node.type).to.equal('mixin'); expect(node.name).to.equal('custom'); }); it('should parse comma separated values as arguments', () => { let root = parse(".block { mixin(1, bold, url('test.png'), #000, rgb(0, 0, 0)); }"), node = root.first.first; expect(JSON.stringify(node.arguments)).to.equal('["1","bold","url(\'test.png\')","#000","rgb(0, 0, 0)"]'); }); it('should parse key: value pairs as arguments', () => { let root = parse(".block { mixin(padding: 1, weight: bold, background: url('test.png')); }"), node = root.first.first; expect(JSON.stringify(node.arguments[0])).to.equal('{"padding":"1","weight":"bold","background":"url(\'test.png\')"}'); }); });
const parse = require('../lib/parse'); const cases = require('postcss-parser-tests'); const expect = require('chai').expect; describe('parse', () => { cases.each((name, css, json) => { it('should parse ' + name, () => { let parsed = cases.jsonify(parse(css, { from: name })); expect(parsed).to.equal(json); }); }); }); + + describe('parse mixins', () => { + it('should parse mixins', () => { + let root = parse('a { custom(); }'), + node = root.first.first; + + expect(node.type).to.equal('mixin'); + expect(node.name).to.equal('custom'); + }); + + it('should parse comma separated values as arguments', () => { + let root = parse(".block { mixin(1, bold, url('test.png'), #000, rgb(0, 0, 0)); }"), + node = root.first.first; + + expect(JSON.stringify(node.arguments)).to.equal('["1","bold","url(\'test.png\')","#000","rgb(0, 0, 0)"]'); + }); + + it('should parse key: value pairs as arguments', () => { + let root = parse(".block { mixin(padding: 1, weight: bold, background: url('test.png')); }"), + node = root.first.first; + + expect(JSON.stringify(node.arguments[0])).to.equal('{"padding":"1","weight":"bold","background":"url(\'test.png\')"}'); + }); + });
24
1.846154
24
0
f4baf713c5c9f8b03f5a76ec4650f6b13b0d632a
README.md
README.md
[![Go Report Card](https://goreportcard.com/badge/github.com/monder/route53-etcd)](https://goreportcard.com/report/github.com/monder/route53-etcd) [![license](https://img.shields.io/github/license/monder/route53-etcd.svg?maxAge=2592000&style=flat-square)]() [![GitHub tag](https://img.shields.io/github/tag/monder/route53-etcd.svg?style=flat-square)]() Exposing IPs registred in etcd to route53 #Running ``` docker run monder/route53-etcd --etcd-endpoints=http://10.0.1.10:4001 ``` Will read the configuration in etcd path `/hosts` #Example ``` etcdctl set /hosts/AAAAAAAAAA/test.domain.lan "/units/test-app/*/ip" ``` Will read all keys in etcd matching pattern `/units/test-app/*/ip` and will create/update route53 recordsets for ZoneID `AAAAAAAAAA` and domain `test.domain.lan` If multiple keys match the pattern - route53 will have multiple addresses for the same domain.
[![Go Report Card](https://goreportcard.com/badge/github.com/monder/route53-etcd)](https://goreportcard.com/report/github.com/monder/route53-etcd) [![license](https://img.shields.io/github/license/monder/route53-etcd.svg?maxAge=2592000&style=flat-square)]() [![GitHub tag](https://img.shields.io/github/tag/monder/route53-etcd.svg?style=flat-square)]() Exposing IPs registred in etcd to route53 #Running Application is avalable as both - docker image and signed rkt image: ``` sudo docker monder/route53-etcd:0.3.1 --etcd-endpoints=http://10.0.1.10:2379 # or sudo rkt monder.cc/route53-etcd:v0.3.1 ``` Will read the configuration in etcd path `/hosts` #Example ``` etcdctl set /hosts/AAAAAAAAAA/test.domain.lan "/units/test-app/*/ip" ``` Will read all keys in etcd matching pattern `/units/test-app/*/ip` and will create/update route53 recordsets for ZoneID `AAAAAAAAAA` and domain `test.domain.lan` If multiple keys match the pattern - route53 will have multiple addresses for the same domain.
Update readme with rkt usage
Update readme with rkt usage
Markdown
mit
monder/route53-etcd,monder/route53-etcd
markdown
## Code Before: [![Go Report Card](https://goreportcard.com/badge/github.com/monder/route53-etcd)](https://goreportcard.com/report/github.com/monder/route53-etcd) [![license](https://img.shields.io/github/license/monder/route53-etcd.svg?maxAge=2592000&style=flat-square)]() [![GitHub tag](https://img.shields.io/github/tag/monder/route53-etcd.svg?style=flat-square)]() Exposing IPs registred in etcd to route53 #Running ``` docker run monder/route53-etcd --etcd-endpoints=http://10.0.1.10:4001 ``` Will read the configuration in etcd path `/hosts` #Example ``` etcdctl set /hosts/AAAAAAAAAA/test.domain.lan "/units/test-app/*/ip" ``` Will read all keys in etcd matching pattern `/units/test-app/*/ip` and will create/update route53 recordsets for ZoneID `AAAAAAAAAA` and domain `test.domain.lan` If multiple keys match the pattern - route53 will have multiple addresses for the same domain. ## Instruction: Update readme with rkt usage ## Code After: [![Go Report Card](https://goreportcard.com/badge/github.com/monder/route53-etcd)](https://goreportcard.com/report/github.com/monder/route53-etcd) [![license](https://img.shields.io/github/license/monder/route53-etcd.svg?maxAge=2592000&style=flat-square)]() [![GitHub tag](https://img.shields.io/github/tag/monder/route53-etcd.svg?style=flat-square)]() Exposing IPs registred in etcd to route53 #Running Application is avalable as both - docker image and signed rkt image: ``` sudo docker monder/route53-etcd:0.3.1 --etcd-endpoints=http://10.0.1.10:2379 # or sudo rkt monder.cc/route53-etcd:v0.3.1 ``` Will read the configuration in etcd path `/hosts` #Example ``` etcdctl set /hosts/AAAAAAAAAA/test.domain.lan "/units/test-app/*/ip" ``` Will read all keys in etcd matching pattern `/units/test-app/*/ip` and will create/update route53 recordsets for ZoneID `AAAAAAAAAA` and domain `test.domain.lan` If multiple keys match the pattern - route53 will have multiple addresses for the same domain.
[![Go Report Card](https://goreportcard.com/badge/github.com/monder/route53-etcd)](https://goreportcard.com/report/github.com/monder/route53-etcd) [![license](https://img.shields.io/github/license/monder/route53-etcd.svg?maxAge=2592000&style=flat-square)]() [![GitHub tag](https://img.shields.io/github/tag/monder/route53-etcd.svg?style=flat-square)]() Exposing IPs registred in etcd to route53 #Running + Application is avalable as both - docker image and signed rkt image: + ``` - docker run monder/route53-etcd --etcd-endpoints=http://10.0.1.10:4001 ? ---- ^^^^ + sudo docker monder/route53-etcd:0.3.1 --etcd-endpoints=http://10.0.1.10:2379 ? +++++ ++++++ ^^^^ + # or + sudo rkt monder.cc/route53-etcd:v0.3.1 ``` Will read the configuration in etcd path `/hosts` #Example ``` etcdctl set /hosts/AAAAAAAAAA/test.domain.lan "/units/test-app/*/ip" ``` Will read all keys in etcd matching pattern `/units/test-app/*/ip` and will create/update route53 recordsets for ZoneID `AAAAAAAAAA` and domain `test.domain.lan` If multiple keys match the pattern - route53 will have multiple addresses for the same domain.
6
0.25
5
1
5c373c9baf6a0abfab09025d35d4b2deeb37baca
src/app/utilities/download-helper.ts
src/app/utilities/download-helper.ts
/** * Copyright 2017 - 2018 The Hyve B.V. * * This Source Code Form is subject to the terms of the Mozilla Public * License, v. 2.0. If a copy of the MPL was not distributed with this * file, You can obtain one at http://mozilla.org/MPL/2.0/. */ import * as sanitize from 'sanitize-filename'; export class DownloadHelper { public static downloadJSON(dataObject: object, fileName: string) { if (dataObject === null) { throw new Error('No data to download.'); } if (fileName === null) { throw new Error('Missing file name.'); } fileName = sanitize(fileName.trim()); if (fileName.length === 0) { throw new Error('Empty file name.'); } let data = 'data:text/json;charset=utf-8,' + encodeURIComponent(JSON.stringify(dataObject)); let el = document.createElement('a'); el.setAttribute('href', data); el.setAttribute('download', fileName + '.json'); el.style.display = 'none'; document.body.appendChild(el); el.click(); document.body.removeChild(el); } }
/** * Copyright 2017 - 2018 The Hyve B.V. * * This Source Code Form is subject to the terms of the Mozilla Public * License, v. 2.0. If a copy of the MPL was not distributed with this * file, You can obtain one at http://mozilla.org/MPL/2.0/. */ import * as sanitize from 'sanitize-filename'; import {saveAs} from 'file-saver'; export class DownloadHelper { public static downloadJSON(dataObject: object, fileName: string) { if (dataObject === null) { throw new Error('No data to download.'); } if (fileName === null) { throw new Error('Missing file name.'); } fileName = sanitize(fileName.trim()); if (fileName.length === 0) { throw new Error('Empty file name.'); } const blob = new Blob([JSON.stringify(dataObject)], {type: 'text/json'}); saveAs(blob, `${fileName}.json`, true); } }
Fix cohort download in Edge.
Fix cohort download in Edge.
TypeScript
mpl-2.0
thehyve/glowing-bear,thehyve/glowing-bear,thehyve/glowing-bear,thehyve/glowing-bear
typescript
## Code Before: /** * Copyright 2017 - 2018 The Hyve B.V. * * This Source Code Form is subject to the terms of the Mozilla Public * License, v. 2.0. If a copy of the MPL was not distributed with this * file, You can obtain one at http://mozilla.org/MPL/2.0/. */ import * as sanitize from 'sanitize-filename'; export class DownloadHelper { public static downloadJSON(dataObject: object, fileName: string) { if (dataObject === null) { throw new Error('No data to download.'); } if (fileName === null) { throw new Error('Missing file name.'); } fileName = sanitize(fileName.trim()); if (fileName.length === 0) { throw new Error('Empty file name.'); } let data = 'data:text/json;charset=utf-8,' + encodeURIComponent(JSON.stringify(dataObject)); let el = document.createElement('a'); el.setAttribute('href', data); el.setAttribute('download', fileName + '.json'); el.style.display = 'none'; document.body.appendChild(el); el.click(); document.body.removeChild(el); } } ## Instruction: Fix cohort download in Edge. ## Code After: /** * Copyright 2017 - 2018 The Hyve B.V. * * This Source Code Form is subject to the terms of the Mozilla Public * License, v. 2.0. If a copy of the MPL was not distributed with this * file, You can obtain one at http://mozilla.org/MPL/2.0/. */ import * as sanitize from 'sanitize-filename'; import {saveAs} from 'file-saver'; export class DownloadHelper { public static downloadJSON(dataObject: object, fileName: string) { if (dataObject === null) { throw new Error('No data to download.'); } if (fileName === null) { throw new Error('Missing file name.'); } fileName = sanitize(fileName.trim()); if (fileName.length === 0) { throw new Error('Empty file name.'); } const blob = new Blob([JSON.stringify(dataObject)], {type: 'text/json'}); saveAs(blob, `${fileName}.json`, true); } }
/** * Copyright 2017 - 2018 The Hyve B.V. * * This Source Code Form is subject to the terms of the Mozilla Public * License, v. 2.0. If a copy of the MPL was not distributed with this * file, You can obtain one at http://mozilla.org/MPL/2.0/. */ import * as sanitize from 'sanitize-filename'; + import {saveAs} from 'file-saver'; export class DownloadHelper { public static downloadJSON(dataObject: object, fileName: string) { if (dataObject === null) { throw new Error('No data to download.'); } if (fileName === null) { throw new Error('Missing file name.'); } fileName = sanitize(fileName.trim()); if (fileName.length === 0) { throw new Error('Empty file name.'); } + const blob = new Blob([JSON.stringify(dataObject)], {type: 'text/json'}); + saveAs(blob, `${fileName}.json`, true); - let data = 'data:text/json;charset=utf-8,' + encodeURIComponent(JSON.stringify(dataObject)); - let el = document.createElement('a'); - el.setAttribute('href', data); - el.setAttribute('download', fileName + '.json'); - el.style.display = 'none'; - document.body.appendChild(el); - el.click(); - document.body.removeChild(el); } }
11
0.323529
3
8
45e5264bed1ae6490f07cbff0b537fd9a9fe42fb
app/models/renalware/letters/letter_query.rb
app/models/renalware/letters/letter_query.rb
module Renalware module Letters class LetterQuery def initialize(q: nil) @q = q || {} @q[:s] ||= ["patient_family_name, patient_given_name"] end def call search.result end def search @search ||= QueryableLetter.search(@q) end class QueryableLetter < ActiveType::Record[Letter] def self.finder_needs_type_condition?; false; end scope :state_eq, -> (state = :draft) { where(type: Letter.state_class_name(state)) } private_class_method :ransackable_scopes def self.ransackable_scopes(_auth_object = nil) %i(state_eq) end end end end end
module Renalware module Letters class LetterQuery def initialize(q: nil) @q = q || {} @q[:s] ||= ["issued_on desc"] end def call search.result end def search @search ||= QueryableLetter.search(@q) end class QueryableLetter < ActiveType::Record[Letter] def self.finder_needs_type_condition?; false; end scope :state_eq, -> (state = :draft) { where(type: Letter.state_class_name(state)) } private_class_method :ransackable_scopes def self.ransackable_scopes(_auth_object = nil) %i(state_eq) end end end end end
Sort by issued_on date instead of name
Sort by issued_on date instead of name
Ruby
mit
airslie/renalware-core,airslie/renalware-core,airslie/renalware-core,airslie/renalware-core
ruby
## Code Before: module Renalware module Letters class LetterQuery def initialize(q: nil) @q = q || {} @q[:s] ||= ["patient_family_name, patient_given_name"] end def call search.result end def search @search ||= QueryableLetter.search(@q) end class QueryableLetter < ActiveType::Record[Letter] def self.finder_needs_type_condition?; false; end scope :state_eq, -> (state = :draft) { where(type: Letter.state_class_name(state)) } private_class_method :ransackable_scopes def self.ransackable_scopes(_auth_object = nil) %i(state_eq) end end end end end ## Instruction: Sort by issued_on date instead of name ## Code After: module Renalware module Letters class LetterQuery def initialize(q: nil) @q = q || {} @q[:s] ||= ["issued_on desc"] end def call search.result end def search @search ||= QueryableLetter.search(@q) end class QueryableLetter < ActiveType::Record[Letter] def self.finder_needs_type_condition?; false; end scope :state_eq, -> (state = :draft) { where(type: Letter.state_class_name(state)) } private_class_method :ransackable_scopes def self.ransackable_scopes(_auth_object = nil) %i(state_eq) end end end end end
module Renalware module Letters class LetterQuery def initialize(q: nil) @q = q || {} - @q[:s] ||= ["patient_family_name, patient_given_name"] + @q[:s] ||= ["issued_on desc"] end def call search.result end def search @search ||= QueryableLetter.search(@q) end class QueryableLetter < ActiveType::Record[Letter] def self.finder_needs_type_condition?; false; end scope :state_eq, -> (state = :draft) { where(type: Letter.state_class_name(state)) } private_class_method :ransackable_scopes def self.ransackable_scopes(_auth_object = nil) %i(state_eq) end end end end end
2
0.066667
1
1
d2f0ab2529594c1b248f10c5a1871611616bfcf5
slave/run_slave.bat
slave/run_slave.bat
@echo off setlocal title Build slave if not exist %~dp0..\..\depot_tools\. ( echo You must put a copy of depot_tools in %~dp0..\depot_tools echo Did you read the instructions carefully?? goto :EOF ) set PATH=%~dp0..\..\depot_tools;%PATH% :: Running it once will make sure svn and python were downloaded. call gclient.bat > nul :: run_slave.py will overwrite the PATH and PYTHONPATH environment variables. python %~dp0\run_slave.py --no_save -y buildbot.tac %*
@echo off setlocal title Build slave if not exist %~dp0..\..\depot_tools\. ( echo You must put a copy of depot_tools in %~dp0..\depot_tools echo Did you read the instructions carefully?? goto :EOF ) set PATH=%~dp0..\..\depot_tools;%PATH% cd /d %~dp0 :: Running it once will make sure svn and python were downloaded. call gclient.bat > nul :: run_slave.py will overwrite the PATH and PYTHONPATH environment variables. python %~dp0\run_slave.py --no_save -y buildbot.tac %*
Fix auto-sync on start for Windows.
Fix auto-sync on start for Windows. Otherwise it is run from the script's shortcut directory. A good example is when a shortcut is used in the Startup Menu. So "gclient sync" fails. R=nsylvain@chromium.org BUG= TEST= Review URL: https://chromiumcodereview.appspot.com/10783034 git-svn-id: 239fca9b83025a0b6f823aeeca02ba5be3d9fd76@147030 0039d316-1c4b-4281-b951-d872f2087c98
Batchfile
bsd-3-clause
eunchong/build,eunchong/build,eunchong/build,eunchong/build
batchfile
## Code Before: @echo off setlocal title Build slave if not exist %~dp0..\..\depot_tools\. ( echo You must put a copy of depot_tools in %~dp0..\depot_tools echo Did you read the instructions carefully?? goto :EOF ) set PATH=%~dp0..\..\depot_tools;%PATH% :: Running it once will make sure svn and python were downloaded. call gclient.bat > nul :: run_slave.py will overwrite the PATH and PYTHONPATH environment variables. python %~dp0\run_slave.py --no_save -y buildbot.tac %* ## Instruction: Fix auto-sync on start for Windows. Otherwise it is run from the script's shortcut directory. A good example is when a shortcut is used in the Startup Menu. So "gclient sync" fails. R=nsylvain@chromium.org BUG= TEST= Review URL: https://chromiumcodereview.appspot.com/10783034 git-svn-id: 239fca9b83025a0b6f823aeeca02ba5be3d9fd76@147030 0039d316-1c4b-4281-b951-d872f2087c98 ## Code After: @echo off setlocal title Build slave if not exist %~dp0..\..\depot_tools\. ( echo You must put a copy of depot_tools in %~dp0..\depot_tools echo Did you read the instructions carefully?? goto :EOF ) set PATH=%~dp0..\..\depot_tools;%PATH% cd /d %~dp0 :: Running it once will make sure svn and python were downloaded. call gclient.bat > nul :: run_slave.py will overwrite the PATH and PYTHONPATH environment variables. python %~dp0\run_slave.py --no_save -y buildbot.tac %*
@echo off setlocal title Build slave if not exist %~dp0..\..\depot_tools\. ( echo You must put a copy of depot_tools in %~dp0..\depot_tools echo Did you read the instructions carefully?? goto :EOF ) set PATH=%~dp0..\..\depot_tools;%PATH% + cd /d %~dp0 :: Running it once will make sure svn and python were downloaded. call gclient.bat > nul :: run_slave.py will overwrite the PATH and PYTHONPATH environment variables. python %~dp0\run_slave.py --no_save -y buildbot.tac %*
1
0.076923
1
0
1c2373ed82555bf5b705e2d6d0f5db452cfb8846
recipes/default.rb
recipes/default.rb
include_recipe "homebrew::default" # Install Homebrew Cask tap include_recipe "homebrew::cask" #Install formulae/packages (list is defined in attributes) include_recipe "homebrew::install_formulas" #Install Casks (list is defined in attributes) include_recipe "homebrew::install_casks"
include_recipe "homebrew::default" # Install Homebrew Cask tap include_recipe "homebrew::cask" # Install Homebrew Cask Versions tap homebrew_tap 'caskroom/versions' #Install formulae/packages (list is defined in attributes) include_recipe "homebrew::install_formulas" #Install Casks (list is defined in attributes) include_recipe "homebrew::install_casks"
Add Homebrew Cask Versions tap
Add Homebrew Cask Versions tap The Versions tap has casks for some newer software (e.g. Sublime Text 3). Install this so we have access to these versions.
Ruby
mit
mikemoate/mac_bootstrap,mikemoate/mac_bootstrap
ruby
## Code Before: include_recipe "homebrew::default" # Install Homebrew Cask tap include_recipe "homebrew::cask" #Install formulae/packages (list is defined in attributes) include_recipe "homebrew::install_formulas" #Install Casks (list is defined in attributes) include_recipe "homebrew::install_casks" ## Instruction: Add Homebrew Cask Versions tap The Versions tap has casks for some newer software (e.g. Sublime Text 3). Install this so we have access to these versions. ## Code After: include_recipe "homebrew::default" # Install Homebrew Cask tap include_recipe "homebrew::cask" # Install Homebrew Cask Versions tap homebrew_tap 'caskroom/versions' #Install formulae/packages (list is defined in attributes) include_recipe "homebrew::install_formulas" #Install Casks (list is defined in attributes) include_recipe "homebrew::install_casks"
include_recipe "homebrew::default" # Install Homebrew Cask tap include_recipe "homebrew::cask" + + # Install Homebrew Cask Versions tap + homebrew_tap 'caskroom/versions' #Install formulae/packages (list is defined in attributes) include_recipe "homebrew::install_formulas" #Install Casks (list is defined in attributes) include_recipe "homebrew::install_casks"
3
0.3
3
0
97d3b3cf2a7b220c34f7b467129fe9de8558e2be
utils/metatag.c
utils/metatag.c
/* metatag.c: Program for adding metadata to a file * By: Forrest Kerslager, Nick Noto, David Taylor, Kevin Yeap, * Connie Yu * * 2014/06/06 * */ #include <stdio.h> #include <stdlib.h> #include <string.h> int main(int argc, char** argv){ char* buffer; int fd, i, len; if(argc < 3){ fprintf(stderr, "Usage: metatag FILE TAG\n"); exit(1); } fd=open(argv[1], O_RDWR); if(fd == -1){ fprintf(stderr, "Error, file not found\n"); exit(1); } buffer = ""; for(i=2; i<argc; i++){ strcat(buffer, argv[i]); strcat(buffer, " "); } len = strlen(buffer); metawrite(fd,&buffer,len); close(fd); return 0; }
/* metatag.c: Program for adding metadata to a file * By: Forrest Kerslager, Nick Noto, David Taylor, Kevin Yeap, * Connie Yu * * 2014/06/06 * */ #include <stdio.h> #include <stdlib.h> #include <string.h> #define MAX_LEN 1024 int main(int argc, char** argv){ char* buffer; int fd, i, len; if(argc < 3){ fprintf(stderr, "Usage: metatag FILE TAG\n"); exit(1); } fd=open(argv[1], O_RDWR); if(fd == -1){ fprintf(stderr, "Error, file not found\n"); exit(1); } buffer = ""; for(i=2; i<argc; i++){ strcat(buffer, argv[i]); strcat(buffer, " "); } len = strlen(buffer); if(len > MAX_LEN-1){ fprintf(stderr, "Input stream exceeds max length\n"); exit(1); } metawrite(fd,&buffer,len); close(fd); return 0; }
Add max length error check
Add max length error check
C
mit
dmtaylor/cmps111-proj4,dmtaylor/cmps111-proj4
c
## Code Before: /* metatag.c: Program for adding metadata to a file * By: Forrest Kerslager, Nick Noto, David Taylor, Kevin Yeap, * Connie Yu * * 2014/06/06 * */ #include <stdio.h> #include <stdlib.h> #include <string.h> int main(int argc, char** argv){ char* buffer; int fd, i, len; if(argc < 3){ fprintf(stderr, "Usage: metatag FILE TAG\n"); exit(1); } fd=open(argv[1], O_RDWR); if(fd == -1){ fprintf(stderr, "Error, file not found\n"); exit(1); } buffer = ""; for(i=2; i<argc; i++){ strcat(buffer, argv[i]); strcat(buffer, " "); } len = strlen(buffer); metawrite(fd,&buffer,len); close(fd); return 0; } ## Instruction: Add max length error check ## Code After: /* metatag.c: Program for adding metadata to a file * By: Forrest Kerslager, Nick Noto, David Taylor, Kevin Yeap, * Connie Yu * * 2014/06/06 * */ #include <stdio.h> #include <stdlib.h> #include <string.h> #define MAX_LEN 1024 int main(int argc, char** argv){ char* buffer; int fd, i, len; if(argc < 3){ fprintf(stderr, "Usage: metatag FILE TAG\n"); exit(1); } fd=open(argv[1], O_RDWR); if(fd == -1){ fprintf(stderr, "Error, file not found\n"); exit(1); } buffer = ""; for(i=2; i<argc; i++){ strcat(buffer, argv[i]); strcat(buffer, " "); } len = strlen(buffer); if(len > MAX_LEN-1){ fprintf(stderr, "Input stream exceeds max length\n"); exit(1); } metawrite(fd,&buffer,len); close(fd); return 0; }
/* metatag.c: Program for adding metadata to a file * By: Forrest Kerslager, Nick Noto, David Taylor, Kevin Yeap, * Connie Yu * * 2014/06/06 * */ #include <stdio.h> #include <stdlib.h> #include <string.h> + #define MAX_LEN 1024 int main(int argc, char** argv){ char* buffer; int fd, i, len; if(argc < 3){ fprintf(stderr, "Usage: metatag FILE TAG\n"); exit(1); } fd=open(argv[1], O_RDWR); if(fd == -1){ fprintf(stderr, "Error, file not found\n"); exit(1); } buffer = ""; for(i=2; i<argc; i++){ strcat(buffer, argv[i]); strcat(buffer, " "); } len = strlen(buffer); + if(len > MAX_LEN-1){ + fprintf(stderr, "Input stream exceeds max length\n"); + exit(1); + } metawrite(fd,&buffer,len); close(fd); return 0; }
5
0.138889
5
0
45315d2a4e5136a52867b06ba846d0bbec292952
README.md
README.md
[//] # (This is comment for github markdown) My Config ============================================== ##How to install? git clone https://github.com/BellScurry/config ~/config/install.sh ##This repository consists of * Shell Settings * .profile * .bashrc * Vim Settings * .vimrc * .vim * Git Settings * .gitconfig ###Vim Plugins * Vundle * NerdTree * Several Color Schemes ###Vim Color Scheme * molokai
<!-- This is comment for github markdown. --> My Config ============================================== ##How to install? git clone https://github.com/BellScurry/config ~/config/install.sh ##This repository consists of * Shell Settings * .profile * .bashrc * Vim Settings * .vimrc * .vim * Git Settings * .gitconfig ###Vim Plugins * Vundle * NerdTree * Several Color Schemes ###Vim Color Scheme * molokai
Test github markdown comment (2nd)
Test github markdown comment (2nd)
Markdown
mit
BellScurry/config
markdown
## Code Before: [//] # (This is comment for github markdown) My Config ============================================== ##How to install? git clone https://github.com/BellScurry/config ~/config/install.sh ##This repository consists of * Shell Settings * .profile * .bashrc * Vim Settings * .vimrc * .vim * Git Settings * .gitconfig ###Vim Plugins * Vundle * NerdTree * Several Color Schemes ###Vim Color Scheme * molokai ## Instruction: Test github markdown comment (2nd) ## Code After: <!-- This is comment for github markdown. --> My Config ============================================== ##How to install? git clone https://github.com/BellScurry/config ~/config/install.sh ##This repository consists of * Shell Settings * .profile * .bashrc * Vim Settings * .vimrc * .vim * Git Settings * .gitconfig ###Vim Plugins * Vundle * NerdTree * Several Color Schemes ###Vim Color Scheme * molokai
- [//] # (This is comment for github markdown) ? ^^^^ - ^ ^ + <!-- This is comment for github markdown. --> ? ^^^^ ^^ ^^^^^ My Config ============================================== ##How to install? git clone https://github.com/BellScurry/config ~/config/install.sh ##This repository consists of * Shell Settings * .profile * .bashrc * Vim Settings * .vimrc * .vim * Git Settings * .gitconfig ###Vim Plugins * Vundle * NerdTree * Several Color Schemes ###Vim Color Scheme * molokai
2
0.074074
1
1
50f0a4e56d6aa01b9ca75acf6a74711d481d2490
tzinfo.gemspec
tzinfo.gemspec
Gem::Specification.new do |s| s.name = 'tzinfo' s.version = '1.0.0.pre1' s.summary = 'Daylight-savings aware timezone library' s.description = 'TZInfo is a Ruby library that provides daylight savings aware transformations between times in different time zones.' s.author = 'Philip Ross' s.email = 'phil.ross@gmail.com' s.homepage = 'http://tzinfo.rubyforge.org' s.license = 'MIT' s.files = ['CHANGES', 'LICENSE', 'Rakefile', 'README', 'lib', 'test'] + Dir['lib/**/*.rb'].delete_if {|f| f.include?('.svn')} + Dir['test/**/*'].delete_if {|f| f.include?('.svn')} s.platform = Gem::Platform::RUBY s.require_path = 'lib' s.extra_rdoc_files = ['README', 'CHANGES', 'LICENSE'] s.required_ruby_version = '>= 1.8.6' end
Gem::Specification.new do |s| s.name = 'tzinfo' s.version = '1.0.0.pre1' s.summary = 'Daylight-savings aware timezone library' s.description = 'TZInfo is a Ruby library that provides daylight savings aware transformations between times in different time zones.' s.author = 'Philip Ross' s.email = 'phil.ross@gmail.com' s.homepage = 'http://tzinfo.rubyforge.org' s.license = 'MIT' s.files = ['CHANGES', 'LICENSE', 'Rakefile', 'README', 'lib', 'test'] + Dir['lib/**/*.rb'].delete_if {|f| f.include?('.svn')} + Dir['test/**/*'].delete_if {|f| f.include?('.svn')} s.platform = Gem::Platform::RUBY s.require_path = 'lib' s.extra_rdoc_files = ['README', 'CHANGES', 'LICENSE'] s.required_ruby_version = '>= 1.8.6' s.post_install_message = <<END TZInfo Timezone Data has been Moved =================================== The timezone data previously included with TZInfo as Ruby modules has now been moved to a separate tzinfo-data gem. TZInfo also now supports using the system zoneinfo files on Linux, Mac OS X and other Unix operating systems. If you want to continue using the Ruby timezone modules, or you are using an operating system that does not include zoneinfo files (such as Microsoft Windows), you will need to install tzinfo-data by running: gem install tzinfo-data If tzinfo-data is installed, TZInfo will use the Ruby timezone modules. Otherwise, it will attempt to find the system zoneinfo files. Please refer to the TZInfo documentation (available from http://tzinfo.rubyforge.org) for further information. END end
Add a post_install_message to the gem to inform users that the Ruby data modules have been moved.
Add a post_install_message to the gem to inform users that the Ruby data modules have been moved.
Ruby
mit
tzinfo/tzinfo,eadz/tzinfo
ruby
## Code Before: Gem::Specification.new do |s| s.name = 'tzinfo' s.version = '1.0.0.pre1' s.summary = 'Daylight-savings aware timezone library' s.description = 'TZInfo is a Ruby library that provides daylight savings aware transformations between times in different time zones.' s.author = 'Philip Ross' s.email = 'phil.ross@gmail.com' s.homepage = 'http://tzinfo.rubyforge.org' s.license = 'MIT' s.files = ['CHANGES', 'LICENSE', 'Rakefile', 'README', 'lib', 'test'] + Dir['lib/**/*.rb'].delete_if {|f| f.include?('.svn')} + Dir['test/**/*'].delete_if {|f| f.include?('.svn')} s.platform = Gem::Platform::RUBY s.require_path = 'lib' s.extra_rdoc_files = ['README', 'CHANGES', 'LICENSE'] s.required_ruby_version = '>= 1.8.6' end ## Instruction: Add a post_install_message to the gem to inform users that the Ruby data modules have been moved. ## Code After: Gem::Specification.new do |s| s.name = 'tzinfo' s.version = '1.0.0.pre1' s.summary = 'Daylight-savings aware timezone library' s.description = 'TZInfo is a Ruby library that provides daylight savings aware transformations between times in different time zones.' s.author = 'Philip Ross' s.email = 'phil.ross@gmail.com' s.homepage = 'http://tzinfo.rubyforge.org' s.license = 'MIT' s.files = ['CHANGES', 'LICENSE', 'Rakefile', 'README', 'lib', 'test'] + Dir['lib/**/*.rb'].delete_if {|f| f.include?('.svn')} + Dir['test/**/*'].delete_if {|f| f.include?('.svn')} s.platform = Gem::Platform::RUBY s.require_path = 'lib' s.extra_rdoc_files = ['README', 'CHANGES', 'LICENSE'] s.required_ruby_version = '>= 1.8.6' s.post_install_message = <<END TZInfo Timezone Data has been Moved =================================== The timezone data previously included with TZInfo as Ruby modules has now been moved to a separate tzinfo-data gem. TZInfo also now supports using the system zoneinfo files on Linux, Mac OS X and other Unix operating systems. If you want to continue using the Ruby timezone modules, or you are using an operating system that does not include zoneinfo files (such as Microsoft Windows), you will need to install tzinfo-data by running: gem install tzinfo-data If tzinfo-data is installed, TZInfo will use the Ruby timezone modules. Otherwise, it will attempt to find the system zoneinfo files. Please refer to the TZInfo documentation (available from http://tzinfo.rubyforge.org) for further information. END end
Gem::Specification.new do |s| s.name = 'tzinfo' s.version = '1.0.0.pre1' s.summary = 'Daylight-savings aware timezone library' s.description = 'TZInfo is a Ruby library that provides daylight savings aware transformations between times in different time zones.' s.author = 'Philip Ross' s.email = 'phil.ross@gmail.com' s.homepage = 'http://tzinfo.rubyforge.org' s.license = 'MIT' s.files = ['CHANGES', 'LICENSE', 'Rakefile', 'README', 'lib', 'test'] + Dir['lib/**/*.rb'].delete_if {|f| f.include?('.svn')} + Dir['test/**/*'].delete_if {|f| f.include?('.svn')} s.platform = Gem::Platform::RUBY s.require_path = 'lib' s.extra_rdoc_files = ['README', 'CHANGES', 'LICENSE'] s.required_ruby_version = '>= 1.8.6' + + s.post_install_message = <<END + + TZInfo Timezone Data has been Moved + =================================== + + The timezone data previously included with TZInfo as Ruby modules has now been + moved to a separate tzinfo-data gem. TZInfo also now supports using the system + zoneinfo files on Linux, Mac OS X and other Unix operating systems. + + If you want to continue using the Ruby timezone modules, or you are using an + operating system that does not include zoneinfo files (such as + Microsoft Windows), you will need to install tzinfo-data by running: + + gem install tzinfo-data + + If tzinfo-data is installed, TZInfo will use the Ruby timezone modules. + Otherwise, it will attempt to find the system zoneinfo files. Please refer to + the TZInfo documentation (available from http://tzinfo.rubyforge.org) for + further information. + + END + end
23
1.352941
23
0
b0e19929d06af6ea0f71a6cf40ac83bf43c9e061
docs/contributing.rst
docs/contributing.rst
************ Contributing ************ Thanks for your interest in contributing to the Python Record Linkage Toolkit. There is a lot of work to do. The workflow for contributing: - clone https://github.com/J535D165/recordlinkage.git - Make a branch with your modifications/contributions - Write tests - Run all tests - Do a pull request Testing ======= Install Nose: .. code:: sh pip install nose nose-parameterized Run the following command to test the package .. code:: sh nosetests Performance =========== Performance is very important in record linkage. The performance is monitored for all serious modifications of the core API. The performance monitoring is performed with `Airspeed Velocity <http://github.com/spacetelescope/asv/>`_ (asv). Install Airspeed Velocity: .. code:: sh pip install asv Run the following command from the root of the repository to test the performance of the current version of the package: .. code:: sh asv run Run the following command to test all versions since tag v0.6.0 .. code:: sh asv run --skip-existing-commits v0.6.0..master
************ Contributing ************ Thanks for your interest in contributing to the Python Record Linkage Toolkit. There is a lot of work to do. See `Github <https://github.com/J535D165/recordlinkage/graphs/contributors>`_ for the contributors to this package. The workflow for contributing is as follows: - clone https://github.com/J535D165/recordlinkage.git - Make a branch with your modifications/contributions - Write tests - Run all tests - Do a pull request Testing ======= Install Nose: .. code:: sh pip install nose nose-parameterized Run the following command to test the package .. code:: sh nosetests Performance =========== Performance is very important in record linkage. The performance is monitored for all serious modifications of the core API. The performance monitoring is performed with `Airspeed Velocity <http://github.com/spacetelescope/asv/>`_ (asv). Install Airspeed Velocity: .. code:: sh pip install asv Run the following command from the root of the repository to test the performance of the current version of the package: .. code:: sh asv run Run the following command to test all versions since tag v0.6.0 .. code:: sh asv run --skip-existing-commits v0.6.0..master
Add link to contributors to docs
Add link to contributors to docs
reStructuredText
bsd-3-clause
J535D165/recordlinkage,J535D165/recordlinkage
restructuredtext
## Code Before: ************ Contributing ************ Thanks for your interest in contributing to the Python Record Linkage Toolkit. There is a lot of work to do. The workflow for contributing: - clone https://github.com/J535D165/recordlinkage.git - Make a branch with your modifications/contributions - Write tests - Run all tests - Do a pull request Testing ======= Install Nose: .. code:: sh pip install nose nose-parameterized Run the following command to test the package .. code:: sh nosetests Performance =========== Performance is very important in record linkage. The performance is monitored for all serious modifications of the core API. The performance monitoring is performed with `Airspeed Velocity <http://github.com/spacetelescope/asv/>`_ (asv). Install Airspeed Velocity: .. code:: sh pip install asv Run the following command from the root of the repository to test the performance of the current version of the package: .. code:: sh asv run Run the following command to test all versions since tag v0.6.0 .. code:: sh asv run --skip-existing-commits v0.6.0..master ## Instruction: Add link to contributors to docs ## Code After: ************ Contributing ************ Thanks for your interest in contributing to the Python Record Linkage Toolkit. There is a lot of work to do. See `Github <https://github.com/J535D165/recordlinkage/graphs/contributors>`_ for the contributors to this package. The workflow for contributing is as follows: - clone https://github.com/J535D165/recordlinkage.git - Make a branch with your modifications/contributions - Write tests - Run all tests - Do a pull request Testing ======= Install Nose: .. code:: sh pip install nose nose-parameterized Run the following command to test the package .. code:: sh nosetests Performance =========== Performance is very important in record linkage. The performance is monitored for all serious modifications of the core API. The performance monitoring is performed with `Airspeed Velocity <http://github.com/spacetelescope/asv/>`_ (asv). Install Airspeed Velocity: .. code:: sh pip install asv Run the following command from the root of the repository to test the performance of the current version of the package: .. code:: sh asv run Run the following command to test all versions since tag v0.6.0 .. code:: sh asv run --skip-existing-commits v0.6.0..master
************ Contributing ************ Thanks for your interest in contributing to the Python Record Linkage Toolkit. - There is a lot of work to do. The workflow for contributing: + There is a lot of work to do. See `Github <https://github.com/J535D165/recordlinkage/graphs/contributors>`_ + for the contributors to this package. + + The workflow for contributing is as follows: - clone https://github.com/J535D165/recordlinkage.git - Make a branch with your modifications/contributions - Write tests - Run all tests - Do a pull request Testing ======= Install Nose: .. code:: sh pip install nose nose-parameterized Run the following command to test the package .. code:: sh nosetests Performance =========== Performance is very important in record linkage. The performance is monitored for all serious modifications of the core API. The performance monitoring is performed with `Airspeed Velocity <http://github.com/spacetelescope/asv/>`_ (asv). Install Airspeed Velocity: .. code:: sh pip install asv Run the following command from the root of the repository to test the performance of the current version of the package: .. code:: sh asv run Run the following command to test all versions since tag v0.6.0 .. code:: sh asv run --skip-existing-commits v0.6.0..master
5
0.083333
4
1
ee4a3eaaa4b80f4a5957f2fdbc9f58fb4065bad1
README.md
README.md
> Strategy driven, segmented feature toggles for Javascript
> Strategy driven, segmented feature toggles for Javascript This is a companion javascript project to the main PHP library: https://github.com/zumba/swivel
Add a link to the PHP repo
Add a link to the PHP repo closes #9
Markdown
mit
zumba/swiveljs
markdown
## Code Before: > Strategy driven, segmented feature toggles for Javascript ## Instruction: Add a link to the PHP repo closes #9 ## Code After: > Strategy driven, segmented feature toggles for Javascript This is a companion javascript project to the main PHP library: https://github.com/zumba/swivel
> Strategy driven, segmented feature toggles for Javascript + + This is a companion javascript project to the main PHP library: https://github.com/zumba/swivel
2
1
2
0
5e04250723437f4e765c207472b47ae92ee59d02
bin/install-stunnel.sh
bin/install-stunnel.sh
set -o errexit set -o nounset STUNNEL_VERSION="5.36" STUNNEL_SHA1SUM="60f7c761214f1959f7d52f0164f77d8d2a9328e6" STUNNEL_NAME="stunnel-${STUNNEL_VERSION}" STUNNEL_ARCHIVE="${STUNNEL_NAME}.tar.gz" STUNNEL_URL="https://www.stunnel.org/downloads/${STUNNEL_ARCHIVE}" STUNNEL_BUILD_DEPS=(build-base linux-headers wget openssl-dev) apk-install libssl1.0 libcrypto1.0 "${STUNNEL_BUILD_DEPS[@]}" BUILD_DIR="$(mktemp -d)" pushd "$BUILD_DIR" wget "$STUNNEL_URL" echo "${STUNNEL_SHA1SUM} ${STUNNEL_ARCHIVE}" | sha1sum -c - tar -xzf "$STUNNEL_ARCHIVE" pushd "$STUNNEL_NAME" ./configure --disable-fips make install popd popd rm -rf "$BUILD_DIR" apk del "${STUNNEL_BUILD_DEPS[@]}"
set -o errexit set -o nounset STUNNEL_VERSION="5.37" STUNNEL_SHA1SUM="9ec0c64838b3013b38e2cac8e4500219a027831c" STUNNEL_NAME="stunnel-${STUNNEL_VERSION}" STUNNEL_ARCHIVE="${STUNNEL_NAME}.tar.gz" STUNNEL_URL="https://s3.amazonaws.com/aptible-source-archives/${STUNNEL_ARCHIVE}" STUNNEL_BUILD_DEPS=(build-base linux-headers wget openssl-dev) apk-install libssl1.0 libcrypto1.0 "${STUNNEL_BUILD_DEPS[@]}" BUILD_DIR="$(mktemp -d)" pushd "$BUILD_DIR" wget "$STUNNEL_URL" echo "${STUNNEL_SHA1SUM} ${STUNNEL_ARCHIVE}" | sha1sum -c - tar -xzf "$STUNNEL_ARCHIVE" pushd "$STUNNEL_NAME" ./configure --disable-fips make install popd popd rm -rf "$BUILD_DIR" apk del "${STUNNEL_BUILD_DEPS[@]}"
Use Aptible mirror of stunnel
Use Aptible mirror of stunnel stunnel doesn't keep old archives up on their site, so let's use our own mirror for this.
Shell
mit
aptible/docker-redis,aptible/docker-redis,aptible/docker-redis
shell
## Code Before: set -o errexit set -o nounset STUNNEL_VERSION="5.36" STUNNEL_SHA1SUM="60f7c761214f1959f7d52f0164f77d8d2a9328e6" STUNNEL_NAME="stunnel-${STUNNEL_VERSION}" STUNNEL_ARCHIVE="${STUNNEL_NAME}.tar.gz" STUNNEL_URL="https://www.stunnel.org/downloads/${STUNNEL_ARCHIVE}" STUNNEL_BUILD_DEPS=(build-base linux-headers wget openssl-dev) apk-install libssl1.0 libcrypto1.0 "${STUNNEL_BUILD_DEPS[@]}" BUILD_DIR="$(mktemp -d)" pushd "$BUILD_DIR" wget "$STUNNEL_URL" echo "${STUNNEL_SHA1SUM} ${STUNNEL_ARCHIVE}" | sha1sum -c - tar -xzf "$STUNNEL_ARCHIVE" pushd "$STUNNEL_NAME" ./configure --disable-fips make install popd popd rm -rf "$BUILD_DIR" apk del "${STUNNEL_BUILD_DEPS[@]}" ## Instruction: Use Aptible mirror of stunnel stunnel doesn't keep old archives up on their site, so let's use our own mirror for this. ## Code After: set -o errexit set -o nounset STUNNEL_VERSION="5.37" STUNNEL_SHA1SUM="9ec0c64838b3013b38e2cac8e4500219a027831c" STUNNEL_NAME="stunnel-${STUNNEL_VERSION}" STUNNEL_ARCHIVE="${STUNNEL_NAME}.tar.gz" STUNNEL_URL="https://s3.amazonaws.com/aptible-source-archives/${STUNNEL_ARCHIVE}" STUNNEL_BUILD_DEPS=(build-base linux-headers wget openssl-dev) apk-install libssl1.0 libcrypto1.0 "${STUNNEL_BUILD_DEPS[@]}" BUILD_DIR="$(mktemp -d)" pushd "$BUILD_DIR" wget "$STUNNEL_URL" echo "${STUNNEL_SHA1SUM} ${STUNNEL_ARCHIVE}" | sha1sum -c - tar -xzf "$STUNNEL_ARCHIVE" pushd "$STUNNEL_NAME" ./configure --disable-fips make install popd popd rm -rf "$BUILD_DIR" apk del "${STUNNEL_BUILD_DEPS[@]}"
set -o errexit set -o nounset - STUNNEL_VERSION="5.36" ? ^ + STUNNEL_VERSION="5.37" ? ^ - STUNNEL_SHA1SUM="60f7c761214f1959f7d52f0164f77d8d2a9328e6" + STUNNEL_SHA1SUM="9ec0c64838b3013b38e2cac8e4500219a027831c" STUNNEL_NAME="stunnel-${STUNNEL_VERSION}" STUNNEL_ARCHIVE="${STUNNEL_NAME}.tar.gz" - STUNNEL_URL="https://www.stunnel.org/downloads/${STUNNEL_ARCHIVE}" + STUNNEL_URL="https://s3.amazonaws.com/aptible-source-archives/${STUNNEL_ARCHIVE}" STUNNEL_BUILD_DEPS=(build-base linux-headers wget openssl-dev) apk-install libssl1.0 libcrypto1.0 "${STUNNEL_BUILD_DEPS[@]}" BUILD_DIR="$(mktemp -d)" pushd "$BUILD_DIR" wget "$STUNNEL_URL" echo "${STUNNEL_SHA1SUM} ${STUNNEL_ARCHIVE}" | sha1sum -c - tar -xzf "$STUNNEL_ARCHIVE" pushd "$STUNNEL_NAME" ./configure --disable-fips make install popd popd rm -rf "$BUILD_DIR" apk del "${STUNNEL_BUILD_DEPS[@]}"
6
0.206897
3
3
011c3cc28b63a220308f00ac2a061eded5b346f5
.travis.yml
.travis.yml
language: ruby script: bundle exec rake test rvm: - 2.2.1 - 2.2.0 - 2.1.0 - 2.0.0 - 1.9.3 - 1.8.7 notifications: slack: secure: GVD9d+kwR5hzab5ZnWugbCkp9QSYyheSrABWkD+LmpMcWcx7jijajSn4LLvDi/zHYn1MdOBcPe08hSygmpm7ViUApp0EJcSzE4BLU/5oAs+ANV0Qq6jsssMlyo3v8eRAqHNiLxAiAsz+lc0EZWfQnSW8kHzzbO3NeYq1NRL5CgQ=
language: ruby script: bundle exec rake test rvm: - 2.2.1 - 2.2.0 - 2.1.0 - 2.0.0 - 1.9.3 - 1.8.7 notifications: slack: secure: CUYqzHxxls0Kc+s/MiKtjJvxQltXu/BpJYKooxQ3dGYKB3USnQCdOx0roE4rjUlClJwicZJxd4nBeiG66zRnVfoOC317qt9Y9vcFuxTjEzG/Okzez4RdfB6UGsqqiJifu7bf/GJMHIQYcgc59508SngCWDYcceKhr962Gz+M/ls=
Add an updated slack notification
Add an updated slack notification
YAML
mit
tmtmtmtm/csv_to_popolo
yaml
## Code Before: language: ruby script: bundle exec rake test rvm: - 2.2.1 - 2.2.0 - 2.1.0 - 2.0.0 - 1.9.3 - 1.8.7 notifications: slack: secure: GVD9d+kwR5hzab5ZnWugbCkp9QSYyheSrABWkD+LmpMcWcx7jijajSn4LLvDi/zHYn1MdOBcPe08hSygmpm7ViUApp0EJcSzE4BLU/5oAs+ANV0Qq6jsssMlyo3v8eRAqHNiLxAiAsz+lc0EZWfQnSW8kHzzbO3NeYq1NRL5CgQ= ## Instruction: Add an updated slack notification ## Code After: language: ruby script: bundle exec rake test rvm: - 2.2.1 - 2.2.0 - 2.1.0 - 2.0.0 - 1.9.3 - 1.8.7 notifications: slack: secure: CUYqzHxxls0Kc+s/MiKtjJvxQltXu/BpJYKooxQ3dGYKB3USnQCdOx0roE4rjUlClJwicZJxd4nBeiG66zRnVfoOC317qt9Y9vcFuxTjEzG/Okzez4RdfB6UGsqqiJifu7bf/GJMHIQYcgc59508SngCWDYcceKhr962Gz+M/ls=
language: ruby - script: bundle exec rake test rvm: - 2.2.1 - 2.2.0 - 2.1.0 - 2.0.0 - 1.9.3 - 1.8.7 - notifications: slack: - secure: GVD9d+kwR5hzab5ZnWugbCkp9QSYyheSrABWkD+LmpMcWcx7jijajSn4LLvDi/zHYn1MdOBcPe08hSygmpm7ViUApp0EJcSzE4BLU/5oAs+ANV0Qq6jsssMlyo3v8eRAqHNiLxAiAsz+lc0EZWfQnSW8kHzzbO3NeYq1NRL5CgQ= + secure: CUYqzHxxls0Kc+s/MiKtjJvxQltXu/BpJYKooxQ3dGYKB3USnQCdOx0roE4rjUlClJwicZJxd4nBeiG66zRnVfoOC317qt9Y9vcFuxTjEzG/Okzez4RdfB6UGsqqiJifu7bf/GJMHIQYcgc59508SngCWDYcceKhr962Gz+M/ls=
4
0.266667
1
3
642743f6f4e687da99e4ac1a23fa8ff2bdc9dd37
scripts/ruby-install.sh
scripts/ruby-install.sh
echo "Installing chruby" cd ~ wget -O chruby-0.3.9.tar.gz https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz tar -xzvf chruby-0.3.9.tar.gz cd chruby-0.3.9/ sudo make install echo "Installing ruby-installer" cd ~ wget -O ruby-install-0.5.0.tar.gz https://github.com/postmodern/ruby-install/archive/v0.5.0.tar.gz tar -xzvf ruby-install-0.5.0.tar.gz cd ruby-install-0.5.0/ sudo make install echo "Installing ruby 2.2.3" cd ~ ruby-install ruby 2.2.3 echo "Adding chruby alias' to .bashrc" "source /usr/local/share/chruby/chruby.sh" >> ~/.bashrc "source /usr/local/share/chruby/auto.sh" >> ~/.bashrc
echo "Updating packages" sudo apt-get -y update # Fetches the list of available updates sudo apt-get -y upgrade # Strictly upgrades the current packages sudo apt-get -y dist-upgrade # Installs updates (new ones) sudo apt-get install make sudo apt-get -y update # Fetches the list of available updates sudo apt-get -y upgrade # Strictly upgrades the current packages sudo apt-get -y dist-upgrade # Installs updates (new ones) echo "Installing Git..." sudo apt-get -y install git-core echo "Removing system vim and replacing it with gtk" sudo apt-get -y purge vim gvim vim-gtk sudo apt-get -y install vim-gtk echo "Installing tmux..." sudo apt-get -y install tmux echo "Installing silver searcher" sudo apt-get -y install silversearcher-ag # install chruby echo "Installing chruby" cd ~ wget -O chruby-0.3.9.tar.gz https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz tar -xzvf chruby-0.3.9.tar.gz cd chruby-0.3.9/ sudo make install echo "Installing ruby-installer" cd ~ wget -O ruby-install-0.5.0.tar.gz https://github.com/postmodern/ruby-install/archive/v0.5.0.tar.gz tar -xzvf ruby-install-0.5.0.tar.gz cd ruby-install-0.5.0/ sudo make install echo "Installing ruby 2.2.3" cd ~ ruby-install ruby 2.2.3 echo "Adding chruby alias' to .bashrc" "source /usr/local/share/chruby/chruby.sh" >> ~/.bashrc "source /usr/local/share/chruby/auto.sh" >> ~/.bashrc
Update to install missing packages
Update to install missing packages
Shell
mit
allcentury/dotfiles,allcentury/dotfiles
shell
## Code Before: echo "Installing chruby" cd ~ wget -O chruby-0.3.9.tar.gz https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz tar -xzvf chruby-0.3.9.tar.gz cd chruby-0.3.9/ sudo make install echo "Installing ruby-installer" cd ~ wget -O ruby-install-0.5.0.tar.gz https://github.com/postmodern/ruby-install/archive/v0.5.0.tar.gz tar -xzvf ruby-install-0.5.0.tar.gz cd ruby-install-0.5.0/ sudo make install echo "Installing ruby 2.2.3" cd ~ ruby-install ruby 2.2.3 echo "Adding chruby alias' to .bashrc" "source /usr/local/share/chruby/chruby.sh" >> ~/.bashrc "source /usr/local/share/chruby/auto.sh" >> ~/.bashrc ## Instruction: Update to install missing packages ## Code After: echo "Updating packages" sudo apt-get -y update # Fetches the list of available updates sudo apt-get -y upgrade # Strictly upgrades the current packages sudo apt-get -y dist-upgrade # Installs updates (new ones) sudo apt-get install make sudo apt-get -y update # Fetches the list of available updates sudo apt-get -y upgrade # Strictly upgrades the current packages sudo apt-get -y dist-upgrade # Installs updates (new ones) echo "Installing Git..." sudo apt-get -y install git-core echo "Removing system vim and replacing it with gtk" sudo apt-get -y purge vim gvim vim-gtk sudo apt-get -y install vim-gtk echo "Installing tmux..." 
sudo apt-get -y install tmux echo "Installing silver searcher" sudo apt-get -y install silversearcher-ag # install chruby echo "Installing chruby" cd ~ wget -O chruby-0.3.9.tar.gz https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz tar -xzvf chruby-0.3.9.tar.gz cd chruby-0.3.9/ sudo make install echo "Installing ruby-installer" cd ~ wget -O ruby-install-0.5.0.tar.gz https://github.com/postmodern/ruby-install/archive/v0.5.0.tar.gz tar -xzvf ruby-install-0.5.0.tar.gz cd ruby-install-0.5.0/ sudo make install echo "Installing ruby 2.2.3" cd ~ ruby-install ruby 2.2.3 echo "Adding chruby alias' to .bashrc" "source /usr/local/share/chruby/chruby.sh" >> ~/.bashrc "source /usr/local/share/chruby/auto.sh" >> ~/.bashrc
+ + echo "Updating packages" + sudo apt-get -y update # Fetches the list of available updates + sudo apt-get -y upgrade # Strictly upgrades the current packages + sudo apt-get -y dist-upgrade # Installs updates (new ones) + sudo apt-get install make + sudo apt-get -y update # Fetches the list of available updates + sudo apt-get -y upgrade # Strictly upgrades the current packages + sudo apt-get -y dist-upgrade # Installs updates (new ones) + echo "Installing Git..." + sudo apt-get -y install git-core + echo "Removing system vim and replacing it with gtk" + sudo apt-get -y purge vim gvim vim-gtk + sudo apt-get -y install vim-gtk + echo "Installing tmux..." + sudo apt-get -y install tmux + echo "Installing silver searcher" + sudo apt-get -y install silversearcher-ag + + # install chruby echo "Installing chruby" cd ~ wget -O chruby-0.3.9.tar.gz https://github.com/postmodern/chruby/archive/v0.3.9.tar.gz tar -xzvf chruby-0.3.9.tar.gz cd chruby-0.3.9/ sudo make install echo "Installing ruby-installer" cd ~ wget -O ruby-install-0.5.0.tar.gz https://github.com/postmodern/ruby-install/archive/v0.5.0.tar.gz tar -xzvf ruby-install-0.5.0.tar.gz cd ruby-install-0.5.0/ sudo make install echo "Installing ruby 2.2.3" cd ~ ruby-install ruby 2.2.3 echo "Adding chruby alias' to .bashrc" "source /usr/local/share/chruby/chruby.sh" >> ~/.bashrc "source /usr/local/share/chruby/auto.sh" >> ~/.bashrc
20
0.909091
20
0
bdbd9fdd71556e5d46497b9295c18e8be4b99f02
modules/transactions/src/main/resources/arjuna-properties.xml
modules/transactions/src/main/resources/arjuna-properties.xml
<?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <entry key="CoreEnvironmentBean.nodeIdentifier">1</entry> </properties>
<?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <entry key="CoordinatorEnvironmentBean.commitOnePhase">YES</entry> <entry key="ObjectStoreEnvironmentBean.objectStoreDir">PutObjectStoreDirHere</entry> <entry key="ObjectStoreEnvironmentBean.transactionSync">ON</entry> <entry key="CoreEnvironmentBean.nodeIdentifier">1</entry> <entry key="JTAEnvironmentBean.xaRecoveryNodes">1</entry> <entry key="JTAEnvironmentBean.xaResourceOrphanFilterClassNames"> com.arjuna.ats.internal.jta.recovery.arjunacore.JTATransactionLogXAResourceOrphanFilter com.arjuna.ats.internal.jta.recovery.arjunacore.JTANodeNameXAResourceOrphanFilter </entry> <entry key="CoreEnvironmentBean.socketProcessIdPort">0</entry> <entry key="RecoveryEnvironmentBean.recoveryModuleClassNames"> com.arjuna.ats.internal.arjuna.recovery.AtomicActionRecoveryModule com.arjuna.ats.internal.txoj.recovery.TORecoveryModule com.arjuna.ats.internal.jta.recovery.arjunacore.XARecoveryModule </entry> <entry key="RecoveryEnvironmentBean.expiryScannerClassNames"> com.arjuna.ats.internal.arjuna.recovery.ExpiredTransactionStatusManagerScanner </entry> <entry key="RecoveryEnvironmentBean.recoveryPort">4712</entry> <entry key="RecoveryEnvironmentBean.recoveryAddress"></entry> <entry key="RecoveryEnvironmentBean.transactionStatusManagerPort">0</entry> <entry key="RecoveryEnvironmentBean.transactionStatusManagerAddress"></entry> <entry key="RecoveryEnvironmentBean.recoveryListener">NO</entry> </properties>
Use a properties file that is functionally equivalent to the default.
Use a properties file that is functionally equivalent to the default.
XML
apache-2.0
projectodd/wunderboss,projectodd/wunderboss-release,projectodd/wunderboss,projectodd/wunderboss,projectodd/wunderboss-release,projectodd/wunderboss-release
xml
## Code Before: <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <entry key="CoreEnvironmentBean.nodeIdentifier">1</entry> </properties> ## Instruction: Use a properties file that is functionally equivalent to the default. ## Code After: <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> <entry key="CoordinatorEnvironmentBean.commitOnePhase">YES</entry> <entry key="ObjectStoreEnvironmentBean.objectStoreDir">PutObjectStoreDirHere</entry> <entry key="ObjectStoreEnvironmentBean.transactionSync">ON</entry> <entry key="CoreEnvironmentBean.nodeIdentifier">1</entry> <entry key="JTAEnvironmentBean.xaRecoveryNodes">1</entry> <entry key="JTAEnvironmentBean.xaResourceOrphanFilterClassNames"> com.arjuna.ats.internal.jta.recovery.arjunacore.JTATransactionLogXAResourceOrphanFilter com.arjuna.ats.internal.jta.recovery.arjunacore.JTANodeNameXAResourceOrphanFilter </entry> <entry key="CoreEnvironmentBean.socketProcessIdPort">0</entry> <entry key="RecoveryEnvironmentBean.recoveryModuleClassNames"> com.arjuna.ats.internal.arjuna.recovery.AtomicActionRecoveryModule com.arjuna.ats.internal.txoj.recovery.TORecoveryModule com.arjuna.ats.internal.jta.recovery.arjunacore.XARecoveryModule </entry> <entry key="RecoveryEnvironmentBean.expiryScannerClassNames"> com.arjuna.ats.internal.arjuna.recovery.ExpiredTransactionStatusManagerScanner </entry> <entry key="RecoveryEnvironmentBean.recoveryPort">4712</entry> <entry key="RecoveryEnvironmentBean.recoveryAddress"></entry> <entry key="RecoveryEnvironmentBean.transactionStatusManagerPort">0</entry> <entry key="RecoveryEnvironmentBean.transactionStatusManagerAddress"></entry> <entry key="RecoveryEnvironmentBean.recoveryListener">NO</entry> </properties>
<?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd"> <properties> + <entry key="CoordinatorEnvironmentBean.commitOnePhase">YES</entry> + <entry key="ObjectStoreEnvironmentBean.objectStoreDir">PutObjectStoreDirHere</entry> + <entry key="ObjectStoreEnvironmentBean.transactionSync">ON</entry> <entry key="CoreEnvironmentBean.nodeIdentifier">1</entry> + <entry key="JTAEnvironmentBean.xaRecoveryNodes">1</entry> + <entry key="JTAEnvironmentBean.xaResourceOrphanFilterClassNames"> + com.arjuna.ats.internal.jta.recovery.arjunacore.JTATransactionLogXAResourceOrphanFilter + com.arjuna.ats.internal.jta.recovery.arjunacore.JTANodeNameXAResourceOrphanFilter + </entry> + <entry key="CoreEnvironmentBean.socketProcessIdPort">0</entry> + <entry key="RecoveryEnvironmentBean.recoveryModuleClassNames"> + com.arjuna.ats.internal.arjuna.recovery.AtomicActionRecoveryModule + com.arjuna.ats.internal.txoj.recovery.TORecoveryModule + com.arjuna.ats.internal.jta.recovery.arjunacore.XARecoveryModule + </entry> + <entry key="RecoveryEnvironmentBean.expiryScannerClassNames"> + com.arjuna.ats.internal.arjuna.recovery.ExpiredTransactionStatusManagerScanner + </entry> + <entry key="RecoveryEnvironmentBean.recoveryPort">4712</entry> + <entry key="RecoveryEnvironmentBean.recoveryAddress"></entry> + <entry key="RecoveryEnvironmentBean.transactionStatusManagerPort">0</entry> + <entry key="RecoveryEnvironmentBean.transactionStatusManagerAddress"></entry> + <entry key="RecoveryEnvironmentBean.recoveryListener">NO</entry> </properties>
22
4.4
22
0
458b683a66b98ae454eac27e87a0d3951ffa9a44
src/redux/action-types.js
src/redux/action-types.js
export const RUNNING = 'RUNNING'; export const START_LOAD_RUNTIME = 'START_LOAD_RUNTIME'; export const FINISH_LOAD_RUNTIME = 'FINISH_LOAD_RUNTIME'; export const FAIL_LOAD_RUNTIME = 'FAIL_LOAD_RUNTIME';
export const START_LOAD_RUNTIME = 'START_LOAD_RUNTIME'; export const FINISH_LOAD_RUNTIME = 'FINISH_LOAD_RUNTIME'; export const FAIL_LOAD_RUNTIME = 'FAIL_LOAD_RUNTIME'; export const START_PARSE = 'START_PARSE'; export const FINISH_PARSE = 'FINISH_PARSE'; export const FAIL_PARSE = 'FAIL_PARSE'; export const START_COMPILE = 'START_COMPILE'; export const FINISH_COMPILE = 'FINISH_COMPILE'; export const FAIL_COMPILE = 'FAIL_COMPILE'; export const STOP_COMPILE = "STOP_COMPILE"; export const START_EXECUTE = 'START_EXECUTE'; export const FINISH_EXECUTE = 'FINISH_EXECUTE'; export const FAIL_EXECUTE = 'FAIL_EXECUTE'; export const STOP_EXECUTE = "STOP_EXECUTE"; export const PAUSE_RUN = 'PAUSE_RUN'; export const CLEAR_STATE = 'CLEAR_STATE';
Add action types for running code
Add action types for running code
JavaScript
apache-2.0
pcardune/pyret-ide,pcardune/pyret-ide
javascript
## Code Before: export const RUNNING = 'RUNNING'; export const START_LOAD_RUNTIME = 'START_LOAD_RUNTIME'; export const FINISH_LOAD_RUNTIME = 'FINISH_LOAD_RUNTIME'; export const FAIL_LOAD_RUNTIME = 'FAIL_LOAD_RUNTIME'; ## Instruction: Add action types for running code ## Code After: export const START_LOAD_RUNTIME = 'START_LOAD_RUNTIME'; export const FINISH_LOAD_RUNTIME = 'FINISH_LOAD_RUNTIME'; export const FAIL_LOAD_RUNTIME = 'FAIL_LOAD_RUNTIME'; export const START_PARSE = 'START_PARSE'; export const FINISH_PARSE = 'FINISH_PARSE'; export const FAIL_PARSE = 'FAIL_PARSE'; export const START_COMPILE = 'START_COMPILE'; export const FINISH_COMPILE = 'FINISH_COMPILE'; export const FAIL_COMPILE = 'FAIL_COMPILE'; export const STOP_COMPILE = "STOP_COMPILE"; export const START_EXECUTE = 'START_EXECUTE'; export const FINISH_EXECUTE = 'FINISH_EXECUTE'; export const FAIL_EXECUTE = 'FAIL_EXECUTE'; export const STOP_EXECUTE = "STOP_EXECUTE"; export const PAUSE_RUN = 'PAUSE_RUN'; export const CLEAR_STATE = 'CLEAR_STATE';
- export const RUNNING = 'RUNNING'; - export const START_LOAD_RUNTIME = 'START_LOAD_RUNTIME'; export const FINISH_LOAD_RUNTIME = 'FINISH_LOAD_RUNTIME'; export const FAIL_LOAD_RUNTIME = 'FAIL_LOAD_RUNTIME'; + + export const START_PARSE = 'START_PARSE'; + export const FINISH_PARSE = 'FINISH_PARSE'; + export const FAIL_PARSE = 'FAIL_PARSE'; + + export const START_COMPILE = 'START_COMPILE'; + export const FINISH_COMPILE = 'FINISH_COMPILE'; + export const FAIL_COMPILE = 'FAIL_COMPILE'; + export const STOP_COMPILE = "STOP_COMPILE"; + + export const START_EXECUTE = 'START_EXECUTE'; + export const FINISH_EXECUTE = 'FINISH_EXECUTE'; + export const FAIL_EXECUTE = 'FAIL_EXECUTE'; + export const STOP_EXECUTE = "STOP_EXECUTE"; + + export const PAUSE_RUN = 'PAUSE_RUN'; + export const CLEAR_STATE = 'CLEAR_STATE';
19
3.8
17
2
b9dfbb17512b270103444d972af17c43ddbba26b
ibmcnx/doc/DataSources.py
ibmcnx/doc/DataSources.py
import ibmcnx.functions cell = AdminControl.getCell() cellname = "/Cell:" + cell + "/" # Get a list of all databases except DefaultEJBTimerDataSource and OTiSDataSource dbs = AdminConfig.list('DataSource',AdminConfig.getid(cellname)).splitlines() dsidlist = [] # remove unwanted databases for db in dbs: dbname = db.split('(') n = 0 for i in dbname: # i is only the name of the DataSource, db is DataSource ID! if n == 0 and i != "DefaultEJBTimerDataSource" and i != 'OTiSDataSource': dsidlist.append(str(db).replace('"','')) n += 1 dsidlist.sort() for dsid in dsidlist: propertySet = AdminConfig.showAttribute(dsid,"propertySet") propertyList = AdminConfig.list("J2EEResourceProperty", propertySet).splitlines()
import ibmcnx.functions cell = AdminControl.getCell() cellname = "/Cell:" + cell + "/" # Get a list of all databases except DefaultEJBTimerDataSource and OTiSDataSource dbs = AdminConfig.list('DataSource',AdminConfig.getid(cellname)).splitlines() dsidlist = [] # remove unwanted databases for db in dbs: dbname = db.split('(') n = 0 for i in dbname: # i is only the name of the DataSource, db is DataSource ID! if n == 0 and i != "DefaultEJBTimerDataSource" and i != 'OTiSDataSource': dsidlist.append(str(db).replace('"','')) n += 1 dsidlist.sort() for dsid in dsidlist: propertySet = AdminConfig.showAttribute(dsid,"propertySet") propertyList = AdminConfig.list("J2EEResourceProperty", propertySet).splitlines() print propertyList
Create documentation of DataSource Settings
8: Create documentation of DataSource Settings Task-Url: http://github.com/stoeps13/ibmcnx2/issues/issue/8
Python
apache-2.0
stoeps13/ibmcnx2,stoeps13/ibmcnx2
python
## Code Before: import ibmcnx.functions cell = AdminControl.getCell() cellname = "/Cell:" + cell + "/" # Get a list of all databases except DefaultEJBTimerDataSource and OTiSDataSource dbs = AdminConfig.list('DataSource',AdminConfig.getid(cellname)).splitlines() dsidlist = [] # remove unwanted databases for db in dbs: dbname = db.split('(') n = 0 for i in dbname: # i is only the name of the DataSource, db is DataSource ID! if n == 0 and i != "DefaultEJBTimerDataSource" and i != 'OTiSDataSource': dsidlist.append(str(db).replace('"','')) n += 1 dsidlist.sort() for dsid in dsidlist: propertySet = AdminConfig.showAttribute(dsid,"propertySet") propertyList = AdminConfig.list("J2EEResourceProperty", propertySet).splitlines() ## Instruction: 8: Create documentation of DataSource Settings Task-Url: http://github.com/stoeps13/ibmcnx2/issues/issue/8 ## Code After: import ibmcnx.functions cell = AdminControl.getCell() cellname = "/Cell:" + cell + "/" # Get a list of all databases except DefaultEJBTimerDataSource and OTiSDataSource dbs = AdminConfig.list('DataSource',AdminConfig.getid(cellname)).splitlines() dsidlist = [] # remove unwanted databases for db in dbs: dbname = db.split('(') n = 0 for i in dbname: # i is only the name of the DataSource, db is DataSource ID! if n == 0 and i != "DefaultEJBTimerDataSource" and i != 'OTiSDataSource': dsidlist.append(str(db).replace('"','')) n += 1 dsidlist.sort() for dsid in dsidlist: propertySet = AdminConfig.showAttribute(dsid,"propertySet") propertyList = AdminConfig.list("J2EEResourceProperty", propertySet).splitlines() print propertyList
import ibmcnx.functions cell = AdminControl.getCell() cellname = "/Cell:" + cell + "/" # Get a list of all databases except DefaultEJBTimerDataSource and OTiSDataSource dbs = AdminConfig.list('DataSource',AdminConfig.getid(cellname)).splitlines() dsidlist = [] # remove unwanted databases for db in dbs: dbname = db.split('(') n = 0 for i in dbname: # i is only the name of the DataSource, db is DataSource ID! if n == 0 and i != "DefaultEJBTimerDataSource" and i != 'OTiSDataSource': dsidlist.append(str(db).replace('"','')) n += 1 dsidlist.sort() for dsid in dsidlist: propertySet = AdminConfig.showAttribute(dsid,"propertySet") propertyList = AdminConfig.list("J2EEResourceProperty", propertySet).splitlines() + print propertyList
1
0.041667
1
0
e531cb6b4b9938d2dabf2c803bab0234d995807f
functions.php
functions.php
<?php /** * Load admin dependencies. */ $tempdir = get_template_directory(); require_once($tempdir.'/admin/init.php'); /** * Theme set up settings. */ add_action('after_setup_theme', function() { // Configure WP 2.9+ Thumbnails. add_theme_support('post-thumbnails'); set_post_thumbnail_size(50, 50, true); add_image_size('thumbnail-large', 500, '', false); // Add support for post formats. add_theme_support('post-formats', ['aside', 'gallery', 'image', 'link', 'quote', 'video', 'audio']); }); /** * Configure Default Title. */ add_filter('wp_title', function($title) { $name = get_bloginfo('name'); $description = get_bloginfo('description'); if (is_front_page()) { return "$name - $description"; } return trim($title).' - '.$name; }); /** * Configure Excerpt String. */ add_filter('wp_trim_excerpt', function($excerpt) { return str_replace('[...]', '…', $excerpt); }); /** * Change Default Excerpt Length (55). */ add_filter('excerpt_length', function($length) { return 55; });
<?php /** * Load admin dependencies. */ $tempdir = get_template_directory(); require_once($tempdir.'/admin/init.php'); /** * Theme set up settings. */ add_action('after_setup_theme', function() { // Configure WP 2.9+ Thumbnails. add_theme_support('post-thumbnails'); set_post_thumbnail_size(50, 50, true); add_image_size('thumbnail-large', 500, '', false); // Add support for post formats. add_theme_support('post-formats', ['aside', 'gallery', 'image', 'link', 'quote', 'video', 'audio']); }); /** * Configure Default Title. */ add_filter('wp_title', function($title) { $name = get_bloginfo('name'); $description = get_bloginfo('description'); if (is_front_page()) { return "$name - $description"; } return trim($title).' - '.$name; }); /** * Configure Excerpt String. */ add_filter('excerpt_more', function($excerpt) { return '…'; }); /** * Change Default Excerpt Length (55). */ add_filter('excerpt_length', function($length) { return 55; });
Change wp_trim_excerpt to excerpt_more and update return.
Change wp_trim_excerpt to excerpt_more and update return.
PHP
mit
fieleman/wordplate,mikaelmattsson/wordplate,fieleman/wordplate
php
## Code Before: <?php /** * Load admin dependencies. */ $tempdir = get_template_directory(); require_once($tempdir.'/admin/init.php'); /** * Theme set up settings. */ add_action('after_setup_theme', function() { // Configure WP 2.9+ Thumbnails. add_theme_support('post-thumbnails'); set_post_thumbnail_size(50, 50, true); add_image_size('thumbnail-large', 500, '', false); // Add support for post formats. add_theme_support('post-formats', ['aside', 'gallery', 'image', 'link', 'quote', 'video', 'audio']); }); /** * Configure Default Title. */ add_filter('wp_title', function($title) { $name = get_bloginfo('name'); $description = get_bloginfo('description'); if (is_front_page()) { return "$name - $description"; } return trim($title).' - '.$name; }); /** * Configure Excerpt String. */ add_filter('wp_trim_excerpt', function($excerpt) { return str_replace('[...]', '…', $excerpt); }); /** * Change Default Excerpt Length (55). */ add_filter('excerpt_length', function($length) { return 55; }); ## Instruction: Change wp_trim_excerpt to excerpt_more and update return. ## Code After: <?php /** * Load admin dependencies. */ $tempdir = get_template_directory(); require_once($tempdir.'/admin/init.php'); /** * Theme set up settings. */ add_action('after_setup_theme', function() { // Configure WP 2.9+ Thumbnails. add_theme_support('post-thumbnails'); set_post_thumbnail_size(50, 50, true); add_image_size('thumbnail-large', 500, '', false); // Add support for post formats. add_theme_support('post-formats', ['aside', 'gallery', 'image', 'link', 'quote', 'video', 'audio']); }); /** * Configure Default Title. */ add_filter('wp_title', function($title) { $name = get_bloginfo('name'); $description = get_bloginfo('description'); if (is_front_page()) { return "$name - $description"; } return trim($title).' - '.$name; }); /** * Configure Excerpt String. */ add_filter('excerpt_more', function($excerpt) { return '…'; }); /** * Change Default Excerpt Length (55). 
*/ add_filter('excerpt_length', function($length) { return 55; });
<?php /** * Load admin dependencies. */ $tempdir = get_template_directory(); require_once($tempdir.'/admin/init.php'); /** * Theme set up settings. */ add_action('after_setup_theme', function() { // Configure WP 2.9+ Thumbnails. add_theme_support('post-thumbnails'); set_post_thumbnail_size(50, 50, true); add_image_size('thumbnail-large', 500, '', false); // Add support for post formats. add_theme_support('post-formats', ['aside', 'gallery', 'image', 'link', 'quote', 'video', 'audio']); }); /** * Configure Default Title. */ add_filter('wp_title', function($title) { $name = get_bloginfo('name'); $description = get_bloginfo('description'); if (is_front_page()) { return "$name - $description"; } return trim($title).' - '.$name; }); /** * Configure Excerpt String. */ - add_filter('wp_trim_excerpt', function($excerpt) ? -------- + add_filter('excerpt_more', function($excerpt) ? +++++ { - return str_replace('[...]', '…', $excerpt); + return '…'; }); /** * Change Default Excerpt Length (55). */ add_filter('excerpt_length', function($length) { return 55; });
4
0.08
2
2
4913a056cf2d677fdf7c2e3e50793e749130bddc
src/Http/Request.php
src/Http/Request.php
<?php namespace Laravel\Lumen\Http; use Illuminate\Support\Arr; use Illuminate\Http\Request as BaseRequest; class Request extends BaseRequest { /** * Get the route handling the request. * * @param string|null $param * @param mixed $default * * @return array|string */ public function route($param = null, $default = null) { $route = call_user_func($this->getRouteResolver()); if (is_null($route) || is_null($param)) { return $route; } return Arr::get($route[2], $param, $default); } }
<?php namespace Laravel\Lumen\Http; use RuntimeException; use Illuminate\Support\Arr; use Illuminate\Http\Request as BaseRequest; class Request extends BaseRequest { /** * Get the route handling the request. * * @param string|null $param * @param mixed $default * * @return array|string */ public function route($param = null, $default = null) { $route = call_user_func($this->getRouteResolver()); if (is_null($route) || is_null($param)) { return $route; } return Arr::get($route[2], $param, $default); } /** * Get a unique fingerprint for the request / route / IP address. * * @return string * * @throws \RuntimeException */ public function fingerprint() { if (! $route = $this->route()) { throw new RuntimeException('Unable to generate fingerprint. Route unavailable.'); } return sha1(implode('|', [ $this->getMethod(), $this->root(), $this->path(), $this->ip() ])); } }
Make the fingerprint method compatible with Lumen.
Make the fingerprint method compatible with Lumen.
PHP
mit
orchestral/lumen,laravel/lumen-framework,crynobone/lumen-framework
php
## Code Before:
<?php

namespace Laravel\Lumen\Http;

use Illuminate\Support\Arr;
use Illuminate\Http\Request as BaseRequest;

class Request extends BaseRequest
{
    /**
     * Get the route handling the request.
     *
     * @param  string|null  $param
     * @param  mixed  $default
     *
     * @return array|string
     */
    public function route($param = null, $default = null)
    {
        $route = call_user_func($this->getRouteResolver());

        if (is_null($route) || is_null($param)) {
            return $route;
        }

        return Arr::get($route[2], $param, $default);
    }
}

## Instruction:
Make the fingerprint method compatible with Lumen.

## Code After:
<?php

namespace Laravel\Lumen\Http;

use RuntimeException;
use Illuminate\Support\Arr;
use Illuminate\Http\Request as BaseRequest;

class Request extends BaseRequest
{
    /**
     * Get the route handling the request.
     *
     * @param  string|null  $param
     * @param  mixed  $default
     *
     * @return array|string
     */
    public function route($param = null, $default = null)
    {
        $route = call_user_func($this->getRouteResolver());

        if (is_null($route) || is_null($param)) {
            return $route;
        }

        return Arr::get($route[2], $param, $default);
    }

    /**
     * Get a unique fingerprint for the request / route / IP address.
     *
     * @return string
     *
     * @throws \RuntimeException
     */
    public function fingerprint()
    {
        if (! $route = $this->route()) {
            throw new RuntimeException('Unable to generate fingerprint. Route unavailable.');
        }

        return sha1(implode('|', [
            $this->getMethod(), $this->root(), $this->path(), $this->ip()
        ]));
    }
}
<?php

namespace Laravel\Lumen\Http;

+ use RuntimeException;
use Illuminate\Support\Arr;
use Illuminate\Http\Request as BaseRequest;

class Request extends BaseRequest
{
    /**
     * Get the route handling the request.
     *
     * @param  string|null  $param
     * @param  mixed  $default
     *
     * @return array|string
     */
    public function route($param = null, $default = null)
    {
        $route = call_user_func($this->getRouteResolver());

        if (is_null($route) || is_null($param)) {
            return $route;
        }

        return Arr::get($route[2], $param, $default);
    }
+
+     /**
+      * Get a unique fingerprint for the request / route / IP address.
+      *
+      * @return string
+      *
+      * @throws \RuntimeException
+      */
+     public function fingerprint()
+     {
+         if (! $route = $this->route()) {
+             throw new RuntimeException('Unable to generate fingerprint. Route unavailable.');
+         }
+
+         return sha1(implode('|', [
+             $this->getMethod(), $this->root(), $this->path(), $this->ip()
+         ]));
+     }
}
19
0.678571
19
0
a3b47d5bb732e920cba5f7bcddf3f15eab60c337
src/api-components/Sort.js
src/api-components/Sort.js
import React from 'react';
import { sort } from 'sajari';

import Base from './Base.js';
import Components from '../constants/QueryComponentConstants.js';

const Sort = props => {
  const { field, order, ...others } = props;
  return (
    <Base
      {...others}
      runDefault='update'
      componentName={Components.SORT}
      data={sort(field, order)}
    />
  );
};

Sort.propTypes = {
  field: React.PropTypes.string.isRequired,
  order: React.PropTypes.string.isRequired,
};

export default Sort;
import React from 'react';
import { sort } from 'sajari';

import Base from './Base.js';
import Components from '../constants/QueryComponentConstants.js';

const Sort = props => {
  const { field, order, ...others } = props;
  return (
    <Base
      {...others}
      runDefault='update'
      componentName={Components.SORT}
      data={sort(field)}
    />
  );
};

Sort.propTypes = {
  field: React.PropTypes.string.isRequired,
};

export default Sort;
Remove order prop from sort
Remove order prop from sort
JavaScript
mit
sajari/sajari-sdk-react,sajari/sajari-sdk-react
javascript
## Code Before:
import React from 'react';
import { sort } from 'sajari';

import Base from './Base.js';
import Components from '../constants/QueryComponentConstants.js';

const Sort = props => {
  const { field, order, ...others } = props;
  return (
    <Base
      {...others}
      runDefault='update'
      componentName={Components.SORT}
      data={sort(field, order)}
    />
  );
};

Sort.propTypes = {
  field: React.PropTypes.string.isRequired,
  order: React.PropTypes.string.isRequired,
};

export default Sort;

## Instruction:
Remove order prop from sort

## Code After:
import React from 'react';
import { sort } from 'sajari';

import Base from './Base.js';
import Components from '../constants/QueryComponentConstants.js';

const Sort = props => {
  const { field, order, ...others } = props;
  return (
    <Base
      {...others}
      runDefault='update'
      componentName={Components.SORT}
      data={sort(field)}
    />
  );
};

Sort.propTypes = {
  field: React.PropTypes.string.isRequired,
};

export default Sort;
import React from 'react';
import { sort } from 'sajari';

import Base from './Base.js';
import Components from '../constants/QueryComponentConstants.js';

const Sort = props => {
  const { field, order, ...others } = props;
  return (
    <Base
      {...others}
      runDefault='update'
      componentName={Components.SORT}
-       data={sort(field, order)}
?                      -------
+       data={sort(field)}
    />
  );
};

Sort.propTypes = {
  field: React.PropTypes.string.isRequired,
-   order: React.PropTypes.string.isRequired,
};

export default Sort;
3
0.125
1
2
1a79fb6032f5a6638b97772124ffaf233fd6e44c
docker/rally/extend_start.sh
docker/rally/extend_start.sh
if [[ "${!KOLLA_BOOTSTRAP[@]}" ]]; then
    rally-manage db create || rally-manage db upgrade
    exit 0
fi

if [[ ! -d "/var/log/kolla/rally" ]]; then
    mkdir -p /var/log/kolla/rally
fi
if [[ $(stat -c %a /var/log/kolla/rally) != "755" ]]; then
    chmod 755 /var/log/kolla/rally
fi
if [[ ! -d "/var/log/kolla/rally" ]]; then
    mkdir -p /var/log/kolla/rally
fi
if [[ $(stat -c %a /var/log/kolla/rally) != "755" ]]; then
    chmod 755 /var/log/kolla/rally
fi

# Bootstrap and exit if KOLLA_BOOTSTRAP variable is set. This catches all cases
# of the KOLLA_BOOTSTRAP variable being set, including empty.
if [[ "${!KOLLA_BOOTSTRAP[@]}" ]]; then
    rally-manage db create || rally-manage db upgrade
    exit 0
fi
Create /var/log/kolla/rally before running rally-manage db create/upgrade
Create /var/log/kolla/rally before running rally-manage db create/upgrade

Change-Id: I6f88a4ecb4ba980836b4778252bb49731463f4eb
Closes-bug: #1630377
Shell
apache-2.0
dardelean/kolla-ansible,GalenMa/kolla,intel-onp/kolla,rahulunair/kolla,openstack/kolla,mrangana/kolla,intel-onp/kolla,mrangana/kolla,coolsvap/kolla,stackforge/kolla,mandre/kolla,openstack/kolla,mandre/kolla,dardelean/kolla-ansible,GalenMa/kolla,stackforge/kolla,mandre/kolla,coolsvap/kolla,rahulunair/kolla,dardelean/kolla-ansible,nihilifer/kolla,coolsvap/kolla,nihilifer/kolla,stackforge/kolla
shell
## Code Before:
if [[ "${!KOLLA_BOOTSTRAP[@]}" ]]; then
    rally-manage db create || rally-manage db upgrade
    exit 0
fi

if [[ ! -d "/var/log/kolla/rally" ]]; then
    mkdir -p /var/log/kolla/rally
fi
if [[ $(stat -c %a /var/log/kolla/rally) != "755" ]]; then
    chmod 755 /var/log/kolla/rally
fi

## Instruction:
Create /var/log/kolla/rally before running rally-manage db create/upgrade

Change-Id: I6f88a4ecb4ba980836b4778252bb49731463f4eb
Closes-bug: #1630377

## Code After:
if [[ ! -d "/var/log/kolla/rally" ]]; then
    mkdir -p /var/log/kolla/rally
fi
if [[ $(stat -c %a /var/log/kolla/rally) != "755" ]]; then
    chmod 755 /var/log/kolla/rally
fi

# Bootstrap and exit if KOLLA_BOOTSTRAP variable is set. This catches all cases
# of the KOLLA_BOOTSTRAP variable being set, including empty.
if [[ "${!KOLLA_BOOTSTRAP[@]}" ]]; then
    rally-manage db create || rally-manage db upgrade
    exit 0
fi
- if [[ "${!KOLLA_BOOTSTRAP[@]}" ]]; then
-     rally-manage db create || rally-manage db upgrade
-     exit 0
- fi
if [[ ! -d "/var/log/kolla/rally" ]]; then
    mkdir -p /var/log/kolla/rally
fi
if [[ $(stat -c %a /var/log/kolla/rally) != "755" ]]; then
    chmod 755 /var/log/kolla/rally
fi
+
+ # Bootstrap and exit if KOLLA_BOOTSTRAP variable is set. This catches all cases
+ # of the KOLLA_BOOTSTRAP variable being set, including empty.
+ if [[ "${!KOLLA_BOOTSTRAP[@]}" ]]; then
+     rally-manage db create || rally-manage db upgrade
+     exit 0
+ fi
11
1
7
4
6ca27fba516ddc63ad6bae98b20e5f9a42b37451
examples/plotting/file/image.py
examples/plotting/file/image.py
import numpy as np

from bokeh.plotting import *
from bokeh.objects import Range1d

N = 1000

x = np.linspace(0, 10, N)
y = np.linspace(0, 10, N)
xx, yy = np.meshgrid(x, y)
d = np.sin(xx)*np.cos(yy)

output_file("image.html", title="image.py example")

image(
    image=[d], x=[0], y=[0], dw=[10], dh=[10], palette=["Spectral-11"],
    x_range=[0, 10], y_range=[0, 10],
    tools="pan,wheel_zoom,box_zoom,reset,previewsave", name="image_example"
)

curplot().x_range = [5, 10]

show()  # open a browser
import numpy as np

from bokeh.plotting import *

N = 1000

x = np.linspace(0, 10, N)
y = np.linspace(0, 10, N)
xx, yy = np.meshgrid(x, y)
d = np.sin(xx)*np.cos(yy)

output_file("image.html", title="image.py example")

image(
    image=[d], x=[0], y=[0], dw=[10], dh=[10], palette=["Spectral-11"],
    x_range=[0, 10], y_range=[0, 10],
    tools="pan,wheel_zoom,box_zoom,reset,previewsave", name="image_example"
)

show()  # open a browser
Fix example and remove extraneous import.
Fix example and remove extraneous import.
Python
bsd-3-clause
birdsarah/bokeh,srinathv/bokeh,justacec/bokeh,eteq/bokeh,saifrahmed/bokeh,eteq/bokeh,rothnic/bokeh,dennisobrien/bokeh,deeplook/bokeh,draperjames/bokeh,tacaswell/bokeh,daodaoliang/bokeh,abele/bokeh,abele/bokeh,phobson/bokeh,Karel-van-de-Plassche/bokeh,percyfal/bokeh,ericdill/bokeh,timsnyder/bokeh,CrazyGuo/bokeh,ptitjano/bokeh,quasiben/bokeh,rothnic/bokeh,ericmjl/bokeh,rs2/bokeh,philippjfr/bokeh,ericmjl/bokeh,rhiever/bokeh,stonebig/bokeh,timothydmorton/bokeh,rs2/bokeh,azjps/bokeh,roxyboy/bokeh,aiguofer/bokeh,justacec/bokeh,draperjames/bokeh,stuart-knock/bokeh,paultcochrane/bokeh,aiguofer/bokeh,birdsarah/bokeh,awanke/bokeh,ChristosChristofidis/bokeh,roxyboy/bokeh,laurent-george/bokeh,ahmadia/bokeh,saifrahmed/bokeh,birdsarah/bokeh,mutirri/bokeh,jplourenco/bokeh,htygithub/bokeh,laurent-george/bokeh,canavandl/bokeh,daodaoliang/bokeh,bokeh/bokeh,ahmadia/bokeh,schoolie/bokeh,schoolie/bokeh,evidation-health/bokeh,maxalbert/bokeh,dennisobrien/bokeh,ChristosChristofidis/bokeh,saifrahmed/bokeh,roxyboy/bokeh,muku42/bokeh,phobson/bokeh,jakirkham/bokeh,josherick/bokeh,ericmjl/bokeh,eteq/bokeh,ptitjano/bokeh,mutirri/bokeh,muku42/bokeh,timsnyder/bokeh,lukebarnard1/bokeh,DuCorey/bokeh,percyfal/bokeh,phobson/bokeh,laurent-george/bokeh,bokeh/bokeh,draperjames/bokeh,canavandl/bokeh,carlvlewis/bokeh,xguse/bokeh,carlvlewis/bokeh,mutirri/bokeh,DuCorey/bokeh,lukebarnard1/bokeh,muku42/bokeh,Karel-van-de-Plassche/bokeh,ptitjano/bokeh,tacaswell/bokeh,xguse/bokeh,akloster/bokeh,PythonCharmers/bokeh,aavanian/bokeh,roxyboy/bokeh,evidation-health/bokeh,alan-unravel/bokeh,mindriot101/bokeh,aavanian/bokeh,msarahan/bokeh,aiguofer/bokeh,bsipocz/bokeh,caseyclements/bokeh,percyfal/bokeh,tacaswell/bokeh,PythonCharmers/bokeh,ChristosChristofidis/bokeh,ChinaQuants/bokeh,timothydmorton/bokeh,mutirri/bokeh,ericdill/bokeh,timsnyder/bokeh,KasperPRasmussen/bokeh,matbra/bokeh,aavanian/bokeh,KasperPRasmussen/bokeh,satishgoda/bokeh,josherick/bokeh,srinathv/bokeh,ahmadia/bokeh,caseyclements/bokeh,draperjames/bokeh,
stonebig/bokeh,clairetang6/bokeh,almarklein/bokeh,carlvlewis/bokeh,almarklein/bokeh,laurent-george/bokeh,satishgoda/bokeh,ericdill/bokeh,srinathv/bokeh,philippjfr/bokeh,deeplook/bokeh,msarahan/bokeh,timsnyder/bokeh,justacec/bokeh,rothnic/bokeh,aiguofer/bokeh,azjps/bokeh,DuCorey/bokeh,azjps/bokeh,PythonCharmers/bokeh,percyfal/bokeh,awanke/bokeh,bsipocz/bokeh,maxalbert/bokeh,almarklein/bokeh,dennisobrien/bokeh,bokeh/bokeh,DuCorey/bokeh,CrazyGuo/bokeh,matbra/bokeh,rs2/bokeh,josherick/bokeh,ericdill/bokeh,dennisobrien/bokeh,ChinaQuants/bokeh,clairetang6/bokeh,muku42/bokeh,stuart-knock/bokeh,Karel-van-de-Plassche/bokeh,bokeh/bokeh,khkaminska/bokeh,paultcochrane/bokeh,khkaminska/bokeh,xguse/bokeh,ChristosChristofidis/bokeh,jplourenco/bokeh,KasperPRasmussen/bokeh,ChinaQuants/bokeh,azjps/bokeh,abele/bokeh,msarahan/bokeh,maxalbert/bokeh,rs2/bokeh,alan-unravel/bokeh,awanke/bokeh,paultcochrane/bokeh,philippjfr/bokeh,timothydmorton/bokeh,htygithub/bokeh,quasiben/bokeh,rhiever/bokeh,evidation-health/bokeh,rhiever/bokeh,ericmjl/bokeh,msarahan/bokeh,timsnyder/bokeh,deeplook/bokeh,saifrahmed/bokeh,philippjfr/bokeh,xguse/bokeh,rhiever/bokeh,KasperPRasmussen/bokeh,tacaswell/bokeh,jplourenco/bokeh,phobson/bokeh,clairetang6/bokeh,ptitjano/bokeh,aavanian/bokeh,matbra/bokeh,Karel-van-de-Plassche/bokeh,bsipocz/bokeh,KasperPRasmussen/bokeh,Karel-van-de-Plassche/bokeh,ericmjl/bokeh,draperjames/bokeh,DuCorey/bokeh,justacec/bokeh,ChinaQuants/bokeh,eteq/bokeh,jakirkham/bokeh,PythonCharmers/bokeh,schoolie/bokeh,akloster/bokeh,htygithub/bokeh,lukebarnard1/bokeh,bsipocz/bokeh,schoolie/bokeh,azjps/bokeh,caseyclements/bokeh,akloster/bokeh,stuart-knock/bokeh,stuart-knock/bokeh,mindriot101/bokeh,CrazyGuo/bokeh,abele/bokeh,khkaminska/bokeh,maxalbert/bokeh,rs2/bokeh,CrazyGuo/bokeh,jakirkham/bokeh,bokeh/bokeh,khkaminska/bokeh,quasiben/bokeh,satishgoda/bokeh,clairetang6/bokeh,canavandl/bokeh,daodaoliang/bokeh,ahmadia/bokeh,gpfreitas/bokeh,josherick/bokeh,philippjfr/bokeh,phobson/bokeh,dennisobrien/bokeh,
matbra/bokeh,satishgoda/bokeh,ptitjano/bokeh,akloster/bokeh,percyfal/bokeh,caseyclements/bokeh,htygithub/bokeh,canavandl/bokeh,carlvlewis/bokeh,paultcochrane/bokeh,mindriot101/bokeh,schoolie/bokeh,lukebarnard1/bokeh,gpfreitas/bokeh,gpfreitas/bokeh,alan-unravel/bokeh,evidation-health/bokeh,timothydmorton/bokeh,aavanian/bokeh,gpfreitas/bokeh,jakirkham/bokeh,awanke/bokeh,rothnic/bokeh,aiguofer/bokeh,stonebig/bokeh,jplourenco/bokeh,stonebig/bokeh,deeplook/bokeh,jakirkham/bokeh,alan-unravel/bokeh,mindriot101/bokeh,birdsarah/bokeh,srinathv/bokeh,daodaoliang/bokeh
python
## Code Before:
import numpy as np

from bokeh.plotting import *
from bokeh.objects import Range1d

N = 1000

x = np.linspace(0, 10, N)
y = np.linspace(0, 10, N)
xx, yy = np.meshgrid(x, y)
d = np.sin(xx)*np.cos(yy)

output_file("image.html", title="image.py example")

image(
    image=[d], x=[0], y=[0], dw=[10], dh=[10], palette=["Spectral-11"],
    x_range=[0, 10], y_range=[0, 10],
    tools="pan,wheel_zoom,box_zoom,reset,previewsave", name="image_example"
)

curplot().x_range = [5, 10]

show()  # open a browser

## Instruction:
Fix example and remove extraneous import.

## Code After:
import numpy as np

from bokeh.plotting import *

N = 1000

x = np.linspace(0, 10, N)
y = np.linspace(0, 10, N)
xx, yy = np.meshgrid(x, y)
d = np.sin(xx)*np.cos(yy)

output_file("image.html", title="image.py example")

image(
    image=[d], x=[0], y=[0], dw=[10], dh=[10], palette=["Spectral-11"],
    x_range=[0, 10], y_range=[0, 10],
    tools="pan,wheel_zoom,box_zoom,reset,previewsave", name="image_example"
)

show()  # open a browser
import numpy as np

from bokeh.plotting import *
- from bokeh.objects import Range1d

N = 1000

x = np.linspace(0, 10, N)
y = np.linspace(0, 10, N)
xx, yy = np.meshgrid(x, y)
d = np.sin(xx)*np.cos(yy)

output_file("image.html", title="image.py example")

image(
    image=[d], x=[0], y=[0], dw=[10], dh=[10], palette=["Spectral-11"],
    x_range=[0, 10], y_range=[0, 10],
    tools="pan,wheel_zoom,box_zoom,reset,previewsave", name="image_example"
)

- curplot().x_range = [5, 10]
-
show()  # open a browser
3
0.130435
0
3
c554bd76e8caa659b9fc5684925bf8bb52c45c1f
app/views/bank_accounts/_form.html.haml
app/views/bank_accounts/_form.html.haml
= simple_form_for @bank_account do |f|
  = f.input :code
  = f.input :pc_id
  = f.input :esr_id
  = f.input :title
  = f.association :account_type, :label_method => :title
  = f.association :bank, :as => :combobox

  .form-actions
    = f.button :submit
= simple_form_for @bank_account do |f|
  .row-fluid
    .span12= f.input :title
  .row-fluid
    .span6= f.input :code
    .span6= f.association :account_type, :label_method => :title
  .row-fluid
    .span12= f.input :tag_list, :input_html => {:class => 'select2-tags span', 'data-tags' => Account.tag_collection.to_json}
  .row-fluid
    .span6= f.input :pc_id, :as => :string
    .span6= f.input :esr_id, :as => :string
  .row-fluid
    .span12= f.association :bank, :as => :combobox

  .form-actions
    = f.button :submit
Use proper bootstrap markup for bank account forms.
Use proper bootstrap markup for bank account forms.
Haml
agpl-3.0
hauledev/bookyt,silvermind/bookyt,silvermind/bookyt,xuewenfei/bookyt,hauledev/bookyt,gaapt/bookyt,huerlisi/bookyt,wtag/bookyt,hauledev/bookyt,gaapt/bookyt,gaapt/bookyt,xuewenfei/bookyt,huerlisi/bookyt,hauledev/bookyt,gaapt/bookyt,wtag/bookyt,silvermind/bookyt,huerlisi/bookyt,silvermind/bookyt,xuewenfei/bookyt,wtag/bookyt
haml
## Code Before:
= simple_form_for @bank_account do |f|
  = f.input :code
  = f.input :pc_id
  = f.input :esr_id
  = f.input :title
  = f.association :account_type, :label_method => :title
  = f.association :bank, :as => :combobox

  .form-actions
    = f.button :submit

## Instruction:
Use proper bootstrap markup for bank account forms.

## Code After:
= simple_form_for @bank_account do |f|
  .row-fluid
    .span12= f.input :title
  .row-fluid
    .span6= f.input :code
    .span6= f.association :account_type, :label_method => :title
  .row-fluid
    .span12= f.input :tag_list, :input_html => {:class => 'select2-tags span', 'data-tags' => Account.tag_collection.to_json}
  .row-fluid
    .span6= f.input :pc_id, :as => :string
    .span6= f.input :esr_id, :as => :string
  .row-fluid
    .span12= f.association :bank, :as => :combobox

  .form-actions
    = f.button :submit
= simple_form_for @bank_account do |f|
+   .row-fluid
+     .span12= f.input :title
+   .row-fluid
-   = f.input :code
+     .span6= f.input :code
?   ++++++++
-   = f.input :pc_id
-   = f.input :esr_id
-   = f.input :title
-   = f.association :account_type, :label_method => :title
+     .span6= f.association :account_type, :label_method => :title
?   ++++++++
+   .row-fluid
+     .span12= f.input :tag_list, :input_html => {:class => 'select2-tags span', 'data-tags' => Account.tag_collection.to_json}
+   .row-fluid
+     .span6= f.input :pc_id, :as => :string
+     .span6= f.input :esr_id, :as => :string
+   .row-fluid
-   = f.association :bank, :as => :combobox
+     .span12= f.association :bank, :as => :combobox
?   +++++++++

  .form-actions
    = f.button :submit
18
1.8
12
6