Dataset schema (column: dtype, min–max string length as reported by the dataset viewer):

commit: stringlengths 40–40
old_file: stringlengths 4–237
new_file: stringlengths 4–237
old_contents: stringlengths 1–4.24k
new_contents: stringlengths 5–4.84k
subject: stringlengths 15–778
message: stringlengths 16–6.86k
lang: stringlengths 1–30
license: stringclasses (13 values)
repos: stringlengths 5–116k
config: stringlengths 1–30
content: stringlengths 105–8.72k
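The schema above can be checked programmatically. The sketch below is illustrative only: the dataset's published name is not given in this dump, so instead of downloading anything it hand-builds a record mirroring the first row and validates it against the column list; the `COLUMNS` constant and `validate_record` helper are assumptions introduced here, not part of the dataset's tooling.

```python
import re

# Columns listed in the schema above; every row is a flat string-valued record.
COLUMNS = [
    "commit", "old_file", "new_file", "old_contents", "new_contents",
    "subject", "message", "lang", "license", "repos", "config", "content",
]

def validate_record(record):
    """Return a list of problems found in a single row (empty list = valid)."""
    problems = []
    for col in COLUMNS:
        if col not in record:
            problems.append(f"missing column: {col}")
        elif not isinstance(record[col], str):
            problems.append(f"column {col} is not a string")
    # The commit column holds a full 40-character lowercase git SHA-1.
    sha = record.get("commit", "")
    if not re.fullmatch(r"[0-9a-f]{40}", sha):
        problems.append("commit is not a 40-char lowercase hex SHA")
    return problems

# Hand-built row mirroring the first record in the dump
# (long text fields elided with "..." for brevity).
row = {
    "commit": "1c70c77f000a7b1f822b3970036c537be96fbbf8",
    "old_file": "app/javascript/analytics.js",
    "new_file": "app/javascript/analytics.js",
    "old_contents": "...",
    "new_contents": "...",
    "subject": "Remove them cookies and maybe disable them?",
    "message": "task: Remove them cookies and maybe disable them?",
    "lang": "JavaScript",
    "license": "mpl-2.0",
    "repos": "brave/publishers,brave/publishers,brave/publishers",
    "config": "javascript",
    "content": "## Code Before: ... ## Instruction: ... ## Code After: ...",
}

print(validate_record(row))  # prints []
```

A check like this catches rows that lost a column or whose SHA was truncated during export, before they are fed to any downstream training or evaluation code.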
1c70c77f000a7b1f822b3970036c537be96fbbf8
app/javascript/analytics.js
app/javascript/analytics.js
if (document.head.dataset.piwikHost) { window._paq = window._paq || []; /* tracker methods like "setCustomDimension" should be called before "trackPageView" */ window._paq.push(['trackPageView']); window._paq.push(['enableLinkTracking']); (function() { var u=document.head.dataset.piwikHost; window._paq.push(['setTrackerUrl', u+'piwik.php']); window._paq.push(['setSiteId', '6']); var d=document, g=d.createElement('script'), s=d.getElementsByTagName('script')[0]; g.type='text/javascript'; g.async=true; g.defer=true; g.src=u+'piwik.js'; s.parentNode.insertBefore(g,s); })(); document.addEventListener('click', function (e) { e = e || window.event; var target = e.target || e.srcElement; if (target.dataset.piwikAction) { window._paq.push(['trackEvent', target.dataset.piwikAction, target.dataset.piwikName, target.dataset.piwikValue]); } }, false); }
if (document.head.dataset.piwikHost) { window._paq = window._paq || []; window._paq.push(['disableCookies']); }
Remove them cookies and maybe disable them?
task: Remove them cookies and maybe disable them?
JavaScript
mpl-2.0
brave/publishers,brave/publishers,brave/publishers
javascript
## Code Before: if (document.head.dataset.piwikHost) { window._paq = window._paq || []; /* tracker methods like "setCustomDimension" should be called before "trackPageView" */ window._paq.push(['trackPageView']); window._paq.push(['enableLinkTracking']); (function() { var u=document.head.dataset.piwikHost; window._paq.push(['setTrackerUrl', u+'piwik.php']); window._paq.push(['setSiteId', '6']); var d=document, g=d.createElement('script'), s=d.getElementsByTagName('script')[0]; g.type='text/javascript'; g.async=true; g.defer=true; g.src=u+'piwik.js'; s.parentNode.insertBefore(g,s); })(); document.addEventListener('click', function (e) { e = e || window.event; var target = e.target || e.srcElement; if (target.dataset.piwikAction) { window._paq.push(['trackEvent', target.dataset.piwikAction, target.dataset.piwikName, target.dataset.piwikValue]); } }, false); } ## Instruction: task: Remove them cookies and maybe disable them? ## Code After: if (document.head.dataset.piwikHost) { window._paq = window._paq || []; window._paq.push(['disableCookies']); }
aa687edd00199a52a27fc818e306ff37edfc0009
lib/node_modules/@stdlib/math/generics/statistics/incrmidrange/docs/repl.txt
lib/node_modules/@stdlib/math/generics/statistics/incrmidrange/docs/repl.txt
{{alias}}() Returns an accumulator function which incrementally computes a mid-range. The mid-range is the arithmetic mean of maximum and minimum values. Accordingly, the mid-range is the midpoint of the range and a measure of central tendency. The accumulator function returns an updated mid-range if provided a value. If not provided a value, the accumulator function returns the current mid-range. Returns ------- acc: Function Accumulator function. Examples -------- > var accumulator = {{alias}}(); > var midrange = accumulator() null > midrange = accumulator( 3.14 ) 3.14 > midrange = accumulator( -5.0 ) ~-0.93 > midrange = accumulator( 10.1 ) 2.55 > midrange = accumulator() 2.55 See Also --------
{{alias}}() Returns an accumulator function which incrementally computes a mid-range. The mid-range is the arithmetic mean of maximum and minimum values. Accordingly, the mid-range is the midpoint of the range and a measure of central tendency. If provided a value, the accumulator function returns an updated mid-range. If not provided a value, the accumulator function returns the current mid- range. Returns ------- acc: Function Accumulator function. Examples -------- > var accumulator = {{alias}}(); > var v = accumulator() null > v = accumulator( 3.14 ) 3.14 > v = accumulator( -5.0 ) ~-0.93 > v = accumulator( 10.1 ) 2.55 > v = accumulator() 2.55 See Also --------
Update note and rename variable
Update note and rename variable
Text
apache-2.0
stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib,stdlib-js/stdlib
text
## Code Before: {{alias}}() Returns an accumulator function which incrementally computes a mid-range. The mid-range is the arithmetic mean of maximum and minimum values. Accordingly, the mid-range is the midpoint of the range and a measure of central tendency. The accumulator function returns an updated mid-range if provided a value. If not provided a value, the accumulator function returns the current mid-range. Returns ------- acc: Function Accumulator function. Examples -------- > var accumulator = {{alias}}(); > var midrange = accumulator() null > midrange = accumulator( 3.14 ) 3.14 > midrange = accumulator( -5.0 ) ~-0.93 > midrange = accumulator( 10.1 ) 2.55 > midrange = accumulator() 2.55 See Also -------- ## Instruction: Update note and rename variable ## Code After: {{alias}}() Returns an accumulator function which incrementally computes a mid-range. The mid-range is the arithmetic mean of maximum and minimum values. Accordingly, the mid-range is the midpoint of the range and a measure of central tendency. If provided a value, the accumulator function returns an updated mid-range. If not provided a value, the accumulator function returns the current mid- range. Returns ------- acc: Function Accumulator function. Examples -------- > var accumulator = {{alias}}(); > var v = accumulator() null > v = accumulator( 3.14 ) 3.14 > v = accumulator( -5.0 ) ~-0.93 > v = accumulator( 10.1 ) 2.55 > v = accumulator() 2.55 See Also --------
ecc2eb694a8770d98bdd53a4bc611cd41fb23b5e
Utilities/InstallTest/CMakeLists.txt
Utilities/InstallTest/CMakeLists.txt
if("x${CMAKE_INSTALL_PREFIX}" MATCHES "^x${ITK_BINARY_DIR}/InstallTest$") itk_add_test(NAME Install COMMAND ${CMAKE_COMMAND} -DCONFIGURATION=$<CONFIGURATION> -DITK_BINARY_DIR=${ITK_BINARY_DIR} -P ${CMAKE_CURRENT_SOURCE_DIR}/InstallTest.cmake ) endif()
if("x${CMAKE_INSTALL_PREFIX}" STREQUAL "x${ITK_BINARY_DIR}/InstallTest") itk_add_test(NAME Install COMMAND ${CMAKE_COMMAND} -DCONFIGURATION=$<CONFIGURATION> -DITK_BINARY_DIR=${ITK_BINARY_DIR} -P ${CMAKE_CURRENT_SOURCE_DIR}/InstallTest.cmake ) endif()
Use STREQUAL instead of MATCH
COMP: Use STREQUAL instead of MATCH When the path has items that could be interpreted as regular expression directives, the intended string matching causes cmake configuration failures. ================================== CMake Error at Utilities/InstallTest/CMakeLists.txt:2 (if): if given arguments: "x/usr/local" "MATCHES" "^x/bld/Dashboard/src/ITK-clang++11/InstallTest\$" Regular expression "^x/bld/Dashboard/src/ITK-clang++11/InstallTest$" cannot compile ================================== Solution provided by Brad King. Change-Id: Ib45fc733d35e01fc95340cc4f45df9c26f44db3b
Text
apache-2.0
hjmjohnson/ITK,atsnyder/ITK,hjmjohnson/ITK,LucHermitte/ITK,Kitware/ITK,vfonov/ITK,fbudin69500/ITK,stnava/ITK,richardbeare/ITK,InsightSoftwareConsortium/ITK,thewtex/ITK,biotrump/ITK,vfonov/ITK,jcfr/ITK,LucasGandel/ITK,jmerkow/ITK,BRAINSia/ITK,jcfr/ITK,thewtex/ITK,PlutoniumHeart/ITK,BRAINSia/ITK,spinicist/ITK,spinicist/ITK,ajjl/ITK,LucHermitte/ITK,fedral/ITK,thewtex/ITK,malaterre/ITK,biotrump/ITK,blowekamp/ITK,malaterre/ITK,Kitware/ITK,zachary-williamson/ITK,blowekamp/ITK,LucasGandel/ITK,blowekamp/ITK,richardbeare/ITK,Kitware/ITK,vfonov/ITK,stnava/ITK,atsnyder/ITK,fedral/ITK,richardbeare/ITK,stnava/ITK,BRAINSia/ITK,zachary-williamson/ITK,zachary-williamson/ITK,richardbeare/ITK,hjmjohnson/ITK,blowekamp/ITK,atsnyder/ITK,InsightSoftwareConsortium/ITK,msmolens/ITK,jmerkow/ITK,vfonov/ITK,atsnyder/ITK,BlueBrain/ITK,malaterre/ITK,Kitware/ITK,BlueBrain/ITK,Kitware/ITK,biotrump/ITK,stnava/ITK,Kitware/ITK,LucasGandel/ITK,fbudin69500/ITK,msmolens/ITK,atsnyder/ITK,PlutoniumHeart/ITK,msmolens/ITK,LucasGandel/ITK,LucasGandel/ITK,vfonov/ITK,fbudin69500/ITK,jmerkow/ITK,jmerkow/ITK,ajjl/ITK,atsnyder/ITK,hjmjohnson/ITK,LucasGandel/ITK,blowekamp/ITK,PlutoniumHeart/ITK,BRAINSia/ITK,spinicist/ITK,BlueBrain/ITK,atsnyder/ITK,PlutoniumHeart/ITK,vfonov/ITK,atsnyder/ITK,jcfr/ITK,LucasGandel/ITK,BlueBrain/ITK,jcfr/ITK,hjmjohnson/ITK,spinicist/ITK,malaterre/ITK,jcfr/ITK,jmerkow/ITK,thewtex/ITK,ajjl/ITK,stnava/ITK,InsightSoftwareConsortium/ITK,BlueBrain/ITK,thewtex/ITK,LucHermitte/ITK,ajjl/ITK,msmolens/ITK,jcfr/ITK,BlueBrain/ITK,fedral/ITK,spinicist/ITK,LucasGandel/ITK,thewtex/ITK,PlutoniumHeart/ITK,malaterre/ITK,vfonov/ITK,zachary-williamson/ITK,jcfr/ITK,fbudin69500/ITK,jmerkow/ITK,zachary-williamson/ITK,msmolens/ITK,ajjl/ITK,ajjl/ITK,LucHermitte/ITK,LucHermitte/ITK,LucHermitte/ITK,stnava/ITK,zachary-williamson/ITK,jcfr/ITK,msmolens/ITK,jmerkow/ITK,fedral/ITK,fbudin69500/ITK,zachary-williamson/ITK,malaterre/ITK,BRAINSia/ITK,malaterre/ITK,fedral/ITK,fedral/ITK,BlueBrain/ITK,InsightSoftwareConsort
ium/ITK,LucHermitte/ITK,fbudin69500/ITK,richardbeare/ITK,stnava/ITK,hjmjohnson/ITK,PlutoniumHeart/ITK,zachary-williamson/ITK,biotrump/ITK,LucHermitte/ITK,biotrump/ITK,richardbeare/ITK,vfonov/ITK,InsightSoftwareConsortium/ITK,fbudin69500/ITK,InsightSoftwareConsortium/ITK,spinicist/ITK,biotrump/ITK,malaterre/ITK,richardbeare/ITK,hjmjohnson/ITK,ajjl/ITK,msmolens/ITK,PlutoniumHeart/ITK,malaterre/ITK,atsnyder/ITK,BlueBrain/ITK,biotrump/ITK,ajjl/ITK,stnava/ITK,blowekamp/ITK,stnava/ITK,BRAINSia/ITK,PlutoniumHeart/ITK,fbudin69500/ITK,biotrump/ITK,vfonov/ITK,thewtex/ITK,blowekamp/ITK,Kitware/ITK,spinicist/ITK,msmolens/ITK,BRAINSia/ITK,zachary-williamson/ITK,blowekamp/ITK,jmerkow/ITK,spinicist/ITK,fedral/ITK,spinicist/ITK,fedral/ITK,InsightSoftwareConsortium/ITK
text
## Code Before: if("x${CMAKE_INSTALL_PREFIX}" MATCHES "^x${ITK_BINARY_DIR}/InstallTest$") itk_add_test(NAME Install COMMAND ${CMAKE_COMMAND} -DCONFIGURATION=$<CONFIGURATION> -DITK_BINARY_DIR=${ITK_BINARY_DIR} -P ${CMAKE_CURRENT_SOURCE_DIR}/InstallTest.cmake ) endif() ## Instruction: COMP: Use STREQUAL instead of MATCH When the path has items that could be interpreted as regular expression directives, the intended string matching causes cmake configuration failures. ================================== CMake Error at Utilities/InstallTest/CMakeLists.txt:2 (if): if given arguments: "x/usr/local" "MATCHES" "^x/bld/Dashboard/src/ITK-clang++11/InstallTest\$" Regular expression "^x/bld/Dashboard/src/ITK-clang++11/InstallTest$" cannot compile ================================== Solution provided by Brad King. Change-Id: Ib45fc733d35e01fc95340cc4f45df9c26f44db3b ## Code After: if("x${CMAKE_INSTALL_PREFIX}" STREQUAL "x${ITK_BINARY_DIR}/InstallTest") itk_add_test(NAME Install COMMAND ${CMAKE_COMMAND} -DCONFIGURATION=$<CONFIGURATION> -DITK_BINARY_DIR=${ITK_BINARY_DIR} -P ${CMAKE_CURRENT_SOURCE_DIR}/InstallTest.cmake ) endif()
c4f4b43e26617a5ada893d1add993c99f0235b00
bokehjs/src/less/font-awesome.less
bokehjs/src/less/font-awesome.less
@font-awesome: "../vendor/font-awesome-4.2.0/less"; @import "@{font-awesome}/variables.less"; @fa-css-prefix: bk-fa; @import "@{font-awesome}/mixins.less"; @import "@{font-awesome}/path.less"; @import "@{font-awesome}/core.less"; @import "@{font-awesome}/larger.less"; @import "@{font-awesome}/fixed-width.less"; @import "@{font-awesome}/list.less"; @import "@{font-awesome}/bordered-pulled.less"; @import "@{font-awesome}/spinning.less"; @import "@{font-awesome}/rotated-flipped.less"; @import "@{font-awesome}/stacked.less"; @import "@{font-awesome}/icons.less";
@font-awesome: "../vendor/font-awesome-4.2.0/less"; @import "@{font-awesome}/variables.less"; @fa-css-prefix: bk-fa; @fa-font-path: "/bokehjs/static/js/vendor/font-awesome-4.2.0/fonts"; @import "@{font-awesome}/mixins.less"; @import "@{font-awesome}/path.less"; @import "@{font-awesome}/core.less"; @import "@{font-awesome}/larger.less"; @import "@{font-awesome}/fixed-width.less"; @import "@{font-awesome}/list.less"; @import "@{font-awesome}/bordered-pulled.less"; @import "@{font-awesome}/spinning.less"; @import "@{font-awesome}/rotated-flipped.less"; @import "@{font-awesome}/stacked.less"; @import "@{font-awesome}/icons.less";
Make icons work on the server (no static pages yet)
Make icons work on the server (no static pages yet)
Less
bsd-3-clause
DuCorey/bokeh,stuart-knock/bokeh,phobson/bokeh,philippjfr/bokeh,azjps/bokeh,schoolie/bokeh,justacec/bokeh,philippjfr/bokeh,ptitjano/bokeh,KasperPRasmussen/bokeh,abele/bokeh,mutirri/bokeh,draperjames/bokeh,dennisobrien/bokeh,srinathv/bokeh,timsnyder/bokeh,gpfreitas/bokeh,aiguofer/bokeh,rhiever/bokeh,ptitjano/bokeh,maxalbert/bokeh,aavanian/bokeh,srinathv/bokeh,rs2/bokeh,quasiben/bokeh,draperjames/bokeh,almarklein/bokeh,jakirkham/bokeh,canavandl/bokeh,tacaswell/bokeh,satishgoda/bokeh,akloster/bokeh,aavanian/bokeh,laurent-george/bokeh,xguse/bokeh,carlvlewis/bokeh,tacaswell/bokeh,eteq/bokeh,mutirri/bokeh,josherick/bokeh,josherick/bokeh,ericmjl/bokeh,almarklein/bokeh,stuart-knock/bokeh,rothnic/bokeh,rothnic/bokeh,canavandl/bokeh,awanke/bokeh,stonebig/bokeh,abele/bokeh,aavanian/bokeh,azjps/bokeh,paultcochrane/bokeh,mindriot101/bokeh,dennisobrien/bokeh,DuCorey/bokeh,muku42/bokeh,matbra/bokeh,lukebarnard1/bokeh,khkaminska/bokeh,daodaoliang/bokeh,timsnyder/bokeh,satishgoda/bokeh,matbra/bokeh,timothydmorton/bokeh,evidation-health/bokeh,eteq/bokeh,abele/bokeh,deeplook/bokeh,bsipocz/bokeh,laurent-george/bokeh,jplourenco/bokeh,timothydmorton/bokeh,schoolie/bokeh,draperjames/bokeh,rs2/bokeh,bsipocz/bokeh,ericmjl/bokeh,stuart-knock/bokeh,tacaswell/bokeh,alan-unravel/bokeh,KasperPRasmussen/bokeh,aiguofer/bokeh,josherick/bokeh,mutirri/bokeh,daodaoliang/bokeh,carlvlewis/bokeh,philippjfr/bokeh,ahmadia/bokeh,josherick/bokeh,philippjfr/bokeh,lukebarnard1/bokeh,ChristosChristofidis/bokeh,timsnyder/bokeh,muku42/bokeh,stonebig/bokeh,aiguofer/bokeh,jplourenco/bokeh,CrazyGuo/bokeh,Karel-van-de-Plassche/bokeh,stonebig/bokeh,maxalbert/bokeh,ChinaQuants/bokeh,ahmadia/bokeh,rhiever/bokeh,xguse/bokeh,quasiben/bokeh,roxyboy/bokeh,draperjames/bokeh,daodaoliang/bokeh,ptitjano/bokeh,CrazyGuo/bokeh,KasperPRasmussen/bokeh,bsipocz/bokeh,laurent-george/bokeh,DuCorey/bokeh,philippjfr/bokeh,akloster/bokeh,dennisobrien/bokeh,caseyclements/bokeh,roxyboy/bokeh,abele/bokeh,alan-unravel/bokeh,evidation-health/bo
keh,rs2/bokeh,saifrahmed/bokeh,phobson/bokeh,lukebarnard1/bokeh,roxyboy/bokeh,rs2/bokeh,htygithub/bokeh,deeplook/bokeh,khkaminska/bokeh,bokeh/bokeh,azjps/bokeh,awanke/bokeh,percyfal/bokeh,birdsarah/bokeh,azjps/bokeh,phobson/bokeh,ericdill/bokeh,caseyclements/bokeh,jakirkham/bokeh,rhiever/bokeh,daodaoliang/bokeh,ChinaQuants/bokeh,schoolie/bokeh,ChristosChristofidis/bokeh,saifrahmed/bokeh,quasiben/bokeh,awanke/bokeh,aavanian/bokeh,ericdill/bokeh,PythonCharmers/bokeh,deeplook/bokeh,htygithub/bokeh,paultcochrane/bokeh,paultcochrane/bokeh,canavandl/bokeh,gpfreitas/bokeh,matbra/bokeh,schoolie/bokeh,KasperPRasmussen/bokeh,rs2/bokeh,PythonCharmers/bokeh,gpfreitas/bokeh,percyfal/bokeh,clairetang6/bokeh,KasperPRasmussen/bokeh,percyfal/bokeh,srinathv/bokeh,birdsarah/bokeh,ahmadia/bokeh,tacaswell/bokeh,jakirkham/bokeh,bokeh/bokeh,ericmjl/bokeh,akloster/bokeh,jakirkham/bokeh,mindriot101/bokeh,xguse/bokeh,stuart-knock/bokeh,mindriot101/bokeh,evidation-health/bokeh,carlvlewis/bokeh,ChristosChristofidis/bokeh,mutirri/bokeh,msarahan/bokeh,timsnyder/bokeh,timothydmorton/bokeh,justacec/bokeh,stonebig/bokeh,bokeh/bokeh,PythonCharmers/bokeh,ericmjl/bokeh,evidation-health/bokeh,awanke/bokeh,maxalbert/bokeh,msarahan/bokeh,xguse/bokeh,ChristosChristofidis/bokeh,Karel-van-de-Plassche/bokeh,draperjames/bokeh,DuCorey/bokeh,ChinaQuants/bokeh,percyfal/bokeh,clairetang6/bokeh,dennisobrien/bokeh,carlvlewis/bokeh,khkaminska/bokeh,eteq/bokeh,paultcochrane/bokeh,roxyboy/bokeh,saifrahmed/bokeh,alan-unravel/bokeh,msarahan/bokeh,timsnyder/bokeh,rhiever/bokeh,bokeh/bokeh,almarklein/bokeh,caseyclements/bokeh,ericdill/bokeh,schoolie/bokeh,dennisobrien/bokeh,CrazyGuo/bokeh,clairetang6/bokeh,clairetang6/bokeh,aavanian/bokeh,justacec/bokeh,phobson/bokeh,percyfal/bokeh,mindriot101/bokeh,muku42/bokeh,ptitjano/bokeh,laurent-george/bokeh,caseyclements/bokeh,CrazyGuo/bokeh,rothnic/bokeh,justacec/bokeh,muku42/bokeh,timothydmorton/bokeh,phobson/bokeh,DuCorey/bokeh,eteq/bokeh,Karel-van-de-Plassche/bokeh,jplourenco/b
okeh,canavandl/bokeh,saifrahmed/bokeh,jakirkham/bokeh,PythonCharmers/bokeh,maxalbert/bokeh,azjps/bokeh,akloster/bokeh,msarahan/bokeh,htygithub/bokeh,ericmjl/bokeh,lukebarnard1/bokeh,ChinaQuants/bokeh,ahmadia/bokeh,matbra/bokeh,satishgoda/bokeh,aiguofer/bokeh,srinathv/bokeh,birdsarah/bokeh,aiguofer/bokeh,ericdill/bokeh,ptitjano/bokeh,khkaminska/bokeh,Karel-van-de-Plassche/bokeh,jplourenco/bokeh,gpfreitas/bokeh,alan-unravel/bokeh,bsipocz/bokeh,rothnic/bokeh,bokeh/bokeh,deeplook/bokeh,birdsarah/bokeh,satishgoda/bokeh,Karel-van-de-Plassche/bokeh,htygithub/bokeh
less
## Code Before: @font-awesome: "../vendor/font-awesome-4.2.0/less"; @import "@{font-awesome}/variables.less"; @fa-css-prefix: bk-fa; @import "@{font-awesome}/mixins.less"; @import "@{font-awesome}/path.less"; @import "@{font-awesome}/core.less"; @import "@{font-awesome}/larger.less"; @import "@{font-awesome}/fixed-width.less"; @import "@{font-awesome}/list.less"; @import "@{font-awesome}/bordered-pulled.less"; @import "@{font-awesome}/spinning.less"; @import "@{font-awesome}/rotated-flipped.less"; @import "@{font-awesome}/stacked.less"; @import "@{font-awesome}/icons.less"; ## Instruction: Make icons work on the server (no static pages yet) ## Code After: @font-awesome: "../vendor/font-awesome-4.2.0/less"; @import "@{font-awesome}/variables.less"; @fa-css-prefix: bk-fa; @fa-font-path: "/bokehjs/static/js/vendor/font-awesome-4.2.0/fonts"; @import "@{font-awesome}/mixins.less"; @import "@{font-awesome}/path.less"; @import "@{font-awesome}/core.less"; @import "@{font-awesome}/larger.less"; @import "@{font-awesome}/fixed-width.less"; @import "@{font-awesome}/list.less"; @import "@{font-awesome}/bordered-pulled.less"; @import "@{font-awesome}/spinning.less"; @import "@{font-awesome}/rotated-flipped.less"; @import "@{font-awesome}/stacked.less"; @import "@{font-awesome}/icons.less";
2678c5e999269033399121989cfb6f613e0a6b76
tests/CMakeLists.txt
tests/CMakeLists.txt
add_library(gtest ../external_libraries/gtest/gtest-all.cc ) target_include_directories(gtest PUBLIC ../external_libraries/gtest ) add_executable(methcla_tests test_runner_console.cpp methcla_tests.cpp ) target_include_directories(methcla_tests PRIVATE ../src ) target_link_libraries(methcla_tests PRIVATE gtest methcla sine ) add_test(methcla_tests methcla_tests) add_executable(methcla_engine_tests test_runner_console.cpp ../external_libraries/gtest/gtest-all.cc methcla_engine_tests.cpp ) target_include_directories(methcla_engine_tests PRIVATE ../src ../external_libraries/gtest ) target_link_libraries(methcla_engine_tests PRIVATE gtest methcla node-control sine ) add_test(methcla_engine_tests methcla_engine_tests)
add_library(gtest ../external_libraries/gtest/gtest-all.cc ) target_include_directories(gtest PUBLIC ../external_libraries/gtest ) add_executable(methcla_tests test_runner_console.cpp methcla_tests.cpp ) target_include_directories(methcla_tests PRIVATE ../src ../external_libraries/boost ) target_link_libraries(methcla_tests PRIVATE gtest methcla sine ) add_test(methcla_tests methcla_tests) add_executable(methcla_engine_tests test_runner_console.cpp ../external_libraries/gtest/gtest-all.cc methcla_engine_tests.cpp ) target_include_directories(methcla_engine_tests PRIVATE ../src # For Methcla/Utility/Macros.h ../external_libraries/boost ../external_libraries/gtest ) target_link_libraries(methcla_engine_tests PRIVATE gtest methcla node-control sine ) add_test(methcla_engine_tests methcla_engine_tests)
Add boost to include path for tests
Add boost to include path for tests
Text
apache-2.0
samplecount/methcla,samplecount/methcla,samplecount/methcla,samplecount/methcla,samplecount/methcla,samplecount/methcla
text
## Code Before: add_library(gtest ../external_libraries/gtest/gtest-all.cc ) target_include_directories(gtest PUBLIC ../external_libraries/gtest ) add_executable(methcla_tests test_runner_console.cpp methcla_tests.cpp ) target_include_directories(methcla_tests PRIVATE ../src ) target_link_libraries(methcla_tests PRIVATE gtest methcla sine ) add_test(methcla_tests methcla_tests) add_executable(methcla_engine_tests test_runner_console.cpp ../external_libraries/gtest/gtest-all.cc methcla_engine_tests.cpp ) target_include_directories(methcla_engine_tests PRIVATE ../src ../external_libraries/gtest ) target_link_libraries(methcla_engine_tests PRIVATE gtest methcla node-control sine ) add_test(methcla_engine_tests methcla_engine_tests) ## Instruction: Add boost to include path for tests ## Code After: add_library(gtest ../external_libraries/gtest/gtest-all.cc ) target_include_directories(gtest PUBLIC ../external_libraries/gtest ) add_executable(methcla_tests test_runner_console.cpp methcla_tests.cpp ) target_include_directories(methcla_tests PRIVATE ../src ../external_libraries/boost ) target_link_libraries(methcla_tests PRIVATE gtest methcla sine ) add_test(methcla_tests methcla_tests) add_executable(methcla_engine_tests test_runner_console.cpp ../external_libraries/gtest/gtest-all.cc methcla_engine_tests.cpp ) target_include_directories(methcla_engine_tests PRIVATE ../src # For Methcla/Utility/Macros.h ../external_libraries/boost ../external_libraries/gtest ) target_link_libraries(methcla_engine_tests PRIVATE gtest methcla node-control sine ) add_test(methcla_engine_tests methcla_engine_tests)
36c611e4cd250a14423cdb1289c83cd0e05bd388
git/aliases.zsh
git/aliases.zsh
alias glg="git log --graph --pretty=format:'%Cred%h%Creset %an: %s - %Creset %C(yellow)%d%Creset %Cgreen(%cr)%Creset' --abbrev-commit --date=relative" alias gp='git push' alias gd='git diff' alias gdc="git diff --cached" alias gc='git commit --verbose' alias gca='git commit --verbose -a' alias gco='git checkout' alias gb='git branch' alias gs='git status' alias g='git status -sb' alias gu="git-up" # gem install git-up alias ga="git add" alias gaa="git add --all" # live git diff; requires kicker gem gdl() { kicker -c -e "git diff --color" . } # shortcut to push to heroku staging stage() { BRANCH_NAME=$(git rev-parse --abbrev-ref HEAD) git push --force staging $BRANCH_NAME:master }
alias glg="git log --graph --pretty=format:'%Cred%h%Creset %an: %s - %Creset %C(yellow)%d%Creset %Cgreen(%cr)%Creset' --abbrev-commit --date=relative" alias gp='git push' alias gd='git diff' alias gdc="git diff --cached" alias gc='git commit --verbose' alias gca='git commit --verbose -a' alias gco='git checkout' alias gb='git branch' alias gs='git status' alias g='git status -sb' alias gu="git-up" # gem install git-up alias ga="git add" alias gaa="git add --all" # live git diff; requires kicker gem gdl() { kicker -c -e "git diff --color" . } # shortcut to push to heroku staging stage() { BRANCH_NAME=$(git rev-parse --abbrev-ref HEAD) git push --force staging $BRANCH_NAME:master } # Remove a specific history from the history # # Usage: # # git-rm <commit SHA> # git-rm() { git rebase --onto $1^ $1 }
Add git-rm() function to remove a commit from the history
Add git-rm() function to remove a commit from the history
Shell
mit
hale/dotfiles,hale/dotfiles,hale/dotfiles,hale/dotfiles
shell
## Code Before: alias glg="git log --graph --pretty=format:'%Cred%h%Creset %an: %s - %Creset %C(yellow)%d%Creset %Cgreen(%cr)%Creset' --abbrev-commit --date=relative" alias gp='git push' alias gd='git diff' alias gdc="git diff --cached" alias gc='git commit --verbose' alias gca='git commit --verbose -a' alias gco='git checkout' alias gb='git branch' alias gs='git status' alias g='git status -sb' alias gu="git-up" # gem install git-up alias ga="git add" alias gaa="git add --all" # live git diff; requires kicker gem gdl() { kicker -c -e "git diff --color" . } # shortcut to push to heroku staging stage() { BRANCH_NAME=$(git rev-parse --abbrev-ref HEAD) git push --force staging $BRANCH_NAME:master } ## Instruction: Add git-rm() function to remove a commit from the history ## Code After: alias glg="git log --graph --pretty=format:'%Cred%h%Creset %an: %s - %Creset %C(yellow)%d%Creset %Cgreen(%cr)%Creset' --abbrev-commit --date=relative" alias gp='git push' alias gd='git diff' alias gdc="git diff --cached" alias gc='git commit --verbose' alias gca='git commit --verbose -a' alias gco='git checkout' alias gb='git branch' alias gs='git status' alias g='git status -sb' alias gu="git-up" # gem install git-up alias ga="git add" alias gaa="git add --all" # live git diff; requires kicker gem gdl() { kicker -c -e "git diff --color" . } # shortcut to push to heroku staging stage() { BRANCH_NAME=$(git rev-parse --abbrev-ref HEAD) git push --force staging $BRANCH_NAME:master } # Remove a specific history from the history # # Usage: # # git-rm <commit SHA> # git-rm() { git rebase --onto $1^ $1 }
0468f22e18fa851c5ad42bc4e8db228df2301558
CHANGELOG.md
CHANGELOG.md
- Initial release - TODO : Re-introduce snapshot releases to build.gradle files and REPO.md - TODO : Release new v1.0.0
- Improve object builders (`class>Builder`) to support realm models with private fields and getter/setter methods ### 1.0.0 - 03-01-2017 - Initial release [Unreleased]: https://github.com/buchandersenn/android-realm-builders/compare/v1.0.0...HEAD
Prepare changelog for v1.0.0 release
Prepare changelog for v1.0.0 release
Markdown
apache-2.0
buchandersenn/android-realm-builders
markdown
## Code Before: - Initial release - TODO : Re-introduce snapshot releases to build.gradle files and REPO.md - TODO : Release new v1.0.0 ## Instruction: Prepare changelog for v1.0.0 release ## Code After: - Improve object builders (`class>Builder`) to support realm models with private fields and getter/setter methods ### 1.0.0 - 03-01-2017 - Initial release [Unreleased]: https://github.com/buchandersenn/android-realm-builders/compare/v1.0.0...HEAD
f91e0a3d0eeb03722235601a943c777e8e429347
app/models/data_source.rb
app/models/data_source.rb
class DataSource < ActiveRecord::Base module DynamicTable end def source_table_class_name(name) "#{dbname.classify}_#{name.classify}" end def source_base_class base_class_name = source_table_class_name("Base") return DynamicTable.const_get(base_class_name) if DynamicTable.const_defined?(base_class_name) base_class = Class.new(ActiveRecord::Base) DynamicTable.const_set(base_class_name, base_class) base_class.establish_connection( adapter: adapter, host: host, port: port, username: user, password: password, database: dbname, encoding: encoding, ) base_class end def source_table_class(table_name) table_class_name = source_table_class_name(table_name) return DynamicTable.const_get(table_class_name) if DynamicTable.const_defined?(table_class_name) table_class = Class.new(source_base_class) table_class.table_name = table_name DynamicTable.const_set(table_class_name, table_class) end def source_table_classes source_base_class.connection.tables.map do |table_name| source_table_class(table_name) end end end
class DataSource < ActiveRecord::Base module DynamicTable end def source_table_class_name(table_name) "#{name.classify}_#{table_name.classify}" end def source_base_class base_class_name = source_table_class_name("Base") return DynamicTable.const_get(base_class_name) if DynamicTable.const_defined?(base_class_name) base_class = Class.new(ActiveRecord::Base) DynamicTable.const_set(base_class_name, base_class) base_class.abstract_class = true base_class.establish_connection( adapter: adapter, host: host, port: port, username: user, password: password, database: dbname, encoding: encoding, ) base_class end def source_table_class(table_name) table_class_name = source_table_class_name(table_name) return DynamicTable.const_get(table_class_name) if DynamicTable.const_defined?(table_class_name) table_class = Class.new(source_base_class) table_class.table_name = table_name DynamicTable.const_set(table_class_name, table_class) end def source_table_classes source_base_class.connection.tables.map do |table_name| source_table_class(table_name) end end end
Enable access from dynamic table classes
Enable access from dynamic table classes
Ruby
mit
hogelog/dmemo,hogelog/dmemo,hogelog/dmemo,hogelog/dmemo
ruby
## Code Before: class DataSource < ActiveRecord::Base module DynamicTable end def source_table_class_name(name) "#{dbname.classify}_#{name.classify}" end def source_base_class base_class_name = source_table_class_name("Base") return DynamicTable.const_get(base_class_name) if DynamicTable.const_defined?(base_class_name) base_class = Class.new(ActiveRecord::Base) DynamicTable.const_set(base_class_name, base_class) base_class.establish_connection( adapter: adapter, host: host, port: port, username: user, password: password, database: dbname, encoding: encoding, ) base_class end def source_table_class(table_name) table_class_name = source_table_class_name(table_name) return DynamicTable.const_get(table_class_name) if DynamicTable.const_defined?(table_class_name) table_class = Class.new(source_base_class) table_class.table_name = table_name DynamicTable.const_set(table_class_name, table_class) end def source_table_classes source_base_class.connection.tables.map do |table_name| source_table_class(table_name) end end end ## Instruction: Enable access from dynamic table classes ## Code After: class DataSource < ActiveRecord::Base module DynamicTable end def source_table_class_name(table_name) "#{name.classify}_#{table_name.classify}" end def source_base_class base_class_name = source_table_class_name("Base") return DynamicTable.const_get(base_class_name) if DynamicTable.const_defined?(base_class_name) base_class = Class.new(ActiveRecord::Base) DynamicTable.const_set(base_class_name, base_class) base_class.abstract_class = true base_class.establish_connection( adapter: adapter, host: host, port: port, username: user, password: password, database: dbname, encoding: encoding, ) base_class end def source_table_class(table_name) table_class_name = source_table_class_name(table_name) return DynamicTable.const_get(table_class_name) if DynamicTable.const_defined?(table_class_name) table_class = Class.new(source_base_class) table_class.table_name = table_name DynamicTable.const_set(table_class_name, table_class) end def source_table_classes source_base_class.connection.tables.map do |table_name| source_table_class(table_name) end end end
a8a5a3767267370c4a6479bea627ae36837245b5
.eslintrc.js
.eslintrc.js
module.exports = { root: true, extends: ['eslint:recommended', 'prettier'], plugins: ['prettier'], parserOptions: { ecmaVersion: 2017, sourceType: 'module', }, env: { browser: true, }, rules: { 'prettier/prettier': ['error', { singleQuote: true, trailingComma: 'es5', printWidth: 100, }], }, overrides: [ { files: ['index.js'], excludedFiles: ['addon-test-support/**', 'tests/**'], parserOptions: { sourceType: 'script', }, env: { browser: false, node: true, }, plugins: ['node'], rules: Object.assign({}, require('eslint-plugin-node').configs.recommended.rules, { // add your custom rules and overrides for node files here }), }, { files: ['tests/**/*.js'], env: { qunit: true } }, { files: ['index.js', 'addon-test-support/**/*.js', 'config/**/*.js'], plugins: [ 'disable-features', ], rules: { 'disable-features/disable-async-await': 'error', 'disable-features/disable-generator-functions': 'error', } } ] };
module.exports = { root: true, extends: ['eslint:recommended', 'prettier'], plugins: ['prettier'], parserOptions: { ecmaVersion: 2017, sourceType: 'module', }, env: { browser: true, }, rules: { 'prettier/prettier': ['error', { singleQuote: true, trailingComma: 'es5', printWidth: 100, }], }, overrides: [ { files: ['index.js'], excludedFiles: ['addon-test-support/**', 'tests/**'], parserOptions: { sourceType: 'script', }, env: { browser: false, node: true, }, plugins: ['node'], rules: Object.assign({}, require('eslint-plugin-node').configs.recommended.rules, { // add your custom rules and overrides for node files here }), }, { files: ['tests/**/*.js'], env: { qunit: true } }, { files: ['index.js', 'addon-test-support/**/*.js', 'config/**/*.js'], plugins: [ 'disable-features', ], rules: { 'disable-features/disable-async-await': 'error', 'disable-features/disable-generator-functions': 'error', } }, { files: ['addon-test-support/**/*.js'], rules: { 'valid-jsdoc': 'warn', } }, ] };
Add jsdoc validation to eslint config.
Add jsdoc validation to eslint config. Add as a warning (documentation updates will be done in a followup).
JavaScript
apache-2.0
switchfly/ember-test-helpers,switchfly/ember-test-helpers
javascript
## Code Before: module.exports = { root: true, extends: ['eslint:recommended', 'prettier'], plugins: ['prettier'], parserOptions: { ecmaVersion: 2017, sourceType: 'module', }, env: { browser: true, }, rules: { 'prettier/prettier': ['error', { singleQuote: true, trailingComma: 'es5', printWidth: 100, }], }, overrides: [ { files: ['index.js'], excludedFiles: ['addon-test-support/**', 'tests/**'], parserOptions: { sourceType: 'script', }, env: { browser: false, node: true, }, plugins: ['node'], rules: Object.assign({}, require('eslint-plugin-node').configs.recommended.rules, { // add your custom rules and overrides for node files here }), }, { files: ['tests/**/*.js'], env: { qunit: true } }, { files: ['index.js', 'addon-test-support/**/*.js', 'config/**/*.js'], plugins: [ 'disable-features', ], rules: { 'disable-features/disable-async-await': 'error', 'disable-features/disable-generator-functions': 'error', } } ] }; ## Instruction: Add jsdoc validation to eslint config. Add as a warning (documentation updates will be done in a followup). 
## Code After: module.exports = { root: true, extends: ['eslint:recommended', 'prettier'], plugins: ['prettier'], parserOptions: { ecmaVersion: 2017, sourceType: 'module', }, env: { browser: true, }, rules: { 'prettier/prettier': ['error', { singleQuote: true, trailingComma: 'es5', printWidth: 100, }], }, overrides: [ { files: ['index.js'], excludedFiles: ['addon-test-support/**', 'tests/**'], parserOptions: { sourceType: 'script', }, env: { browser: false, node: true, }, plugins: ['node'], rules: Object.assign({}, require('eslint-plugin-node').configs.recommended.rules, { // add your custom rules and overrides for node files here }), }, { files: ['tests/**/*.js'], env: { qunit: true } }, { files: ['index.js', 'addon-test-support/**/*.js', 'config/**/*.js'], plugins: [ 'disable-features', ], rules: { 'disable-features/disable-async-await': 'error', 'disable-features/disable-generator-functions': 'error', } }, { files: ['addon-test-support/**/*.js'], rules: { 'valid-jsdoc': 'warn', } }, ] };
d3f33af2fa7d4e7bf9969752e696aaf8120642bc
panoptes/environment/weather_station.py
panoptes/environment/weather_station.py
import datetime import zmq from . import monitor from panoptes.utils import logger, config, messaging, threads @logger.has_logger @config.has_config class WeatherStation(monitor.EnvironmentalMonitor): """ This object is used to determine the weather safe/unsafe condition. It inherits from the monitor.EnvironmentalMonitor base class. It listens on the 'weather' channel of the messaging system. Config: weather_station.port (int): Port to publish to. Defaults to 6500 weather_station.channel (str): the channel topic to publish on. Defaults to 'weather' Args: messaging (panoptes.messaging.Messaging): A messaging Object for creating new sockets. """ def __init__(self, messaging=None, connect_on_startup=False): super().__init__(messaging=messaging, name='WeatherStation') # Get the messaging information self.port = self.config.get('messaging').get('messaging_port', 6500) self.channel = self.config.get('messaging').get('channel', 'weather') # Create our Publishing socket self.socket = self.messaging.create_subscriber(port=self.port, channel=self.channel) if connect_on_startup: self.start_monitoring() def monitor(self): """ Reads serial information off the attached weather station and publishes message with status """ self.send_message('UNSAFE')
import datetime import zmq from . import monitor from panoptes.utils import logger, config, messaging, threads @logger.has_logger @config.has_config class WeatherStation(monitor.EnvironmentalMonitor): """ This object is used to determine the weather safe/unsafe condition. It inherits from the monitor.EnvironmentalMonitor base class. It listens on the 'weather' channel of the messaging system. Config: weather_station.port (int): Port to publish to. Defaults to 6500 weather_station.channel (str): the channel topic to publish on. Defaults to 'weather' Args: messaging (panoptes.messaging.Messaging): A messaging Object for creating new sockets. """ def __init__(self, messaging=None, connect_on_startup=False): super().__init__(messaging=messaging, name='WeatherStation') # Get the messaging information self.port = self.config.get('messaging').get('messaging_port', 6500) self.channel = self.config.get('messaging').get('channel', 'weather') # Create our Publishing socket self.socket = self.messaging.create_subscriber(port=self.port, channel=self.channel) if connect_on_startup: self.start_monitoring() def monitor(self): """ Reads serial information off the attached weather station """ msg = self.socket.recv_json()
Set up weather station to receive updates
Set up weather station to receive updates
Python
mit
Guokr1991/POCS,fmin2958/POCS,AstroHuntsman/POCS,panoptes/POCS,panoptes/POCS,fmin2958/POCS,Guokr1991/POCS,AstroHuntsman/POCS,AstroHuntsman/POCS,panoptes/POCS,panoptes/POCS,joshwalawender/POCS,Guokr1991/POCS,AstroHuntsman/POCS,fmin2958/POCS,joshwalawender/POCS,joshwalawender/POCS,Guokr1991/POCS
python
## Code Before: import datetime import zmq from . import monitor from panoptes.utils import logger, config, messaging, threads @logger.has_logger @config.has_config class WeatherStation(monitor.EnvironmentalMonitor): """ This object is used to determine the weather safe/unsafe condition. It inherits from the monitor.EnvironmentalMonitor base class. It listens on the 'weather' channel of the messaging system. Config: weather_station.port (int): Port to publish to. Defaults to 6500 weather_station.channel (str): the channel topic to publish on. Defaults to 'weather' Args: messaging (panoptes.messaging.Messaging): A messaging Object for creating new sockets. """ def __init__(self, messaging=None, connect_on_startup=False): super().__init__(messaging=messaging, name='WeatherStation') # Get the messaging information self.port = self.config.get('messaging').get('messaging_port', 6500) self.channel = self.config.get('messaging').get('channel', 'weather') # Create our Publishing socket self.socket = self.messaging.create_subscriber(port=self.port, channel=self.channel) if connect_on_startup: self.start_monitoring() def monitor(self): """ Reads serial information off the attached weather station and publishes message with status """ self.send_message('UNSAFE') ## Instruction: Set up weather station to receive updates ## Code After: import datetime import zmq from . import monitor from panoptes.utils import logger, config, messaging, threads @logger.has_logger @config.has_config class WeatherStation(monitor.EnvironmentalMonitor): """ This object is used to determine the weather safe/unsafe condition. It inherits from the monitor.EnvironmentalMonitor base class. It listens on the 'weather' channel of the messaging system. Config: weather_station.port (int): Port to publish to. Defaults to 6500 weather_station.channel (str): the channel topic to publish on. Defaults to 'weather' Args: messaging (panoptes.messaging.Messaging): A messaging Object for creating new sockets. 
""" def __init__(self, messaging=None, connect_on_startup=False): super().__init__(messaging=messaging, name='WeatherStation') # Get the messaging information self.port = self.config.get('messaging').get('messaging_port', 6500) self.channel = self.config.get('messaging').get('channel', 'weather') # Create our Publishing socket self.socket = self.messaging.create_subscriber(port=self.port, channel=self.channel) if connect_on_startup: self.start_monitoring() def monitor(self): """ Reads serial information off the attached weather station """ msg = self.socket.recv_json()
6923ed54ce033722f20e3fafab67258b5a412d96
src/components/Navigation/ReactovaNavigator.js
src/components/Navigation/ReactovaNavigator.js
import React from 'react'; import { TabNavigator } from 'react-navigation'; import BaseNavigator from './BaseNavigator' const ReactovaNavigator = (navigationSchema,LoadingScreen) => { const routerRouteConfig = { Loading: { screen: LoadingScreen, }, BaseNavigator: { path: 'base', screen: BaseNavigator(navigationSchema), }, } const routerNavigationConfig = { initialRouteName: 'Loading', navigationOptions:{ tabBarVisible: false } } return TabNavigator(routerRouteConfig,routerNavigationConfig); }; export default ReactovaNavigator;
import React from 'react'; import { TabNavigator } from 'react-navigation'; import BaseNavigator from './BaseNavigator' const ReactovaNavigator = (navigationSchema, LoadingScreen = null, ladingScreenIsDefaultRoute = true) => { let loading = {} const baseNavigatorName = 'BaseNavigator' let initialRouteName = baseNavigatorName if(LoadingScreen) { loading = { Loading: { screen: LoadingScreen } } initialRouteName = ladingScreenIsDefaultRoute ? 'Loading' : initialRouteName } const routerRouteConfig = { ...loading, [baseNavigatorName]: { path: 'base', screen: BaseNavigator(navigationSchema), }, } const routerNavigationConfig = { initialRouteName, navigationOptions:{ tabBarVisible: false } } return TabNavigator(routerRouteConfig,routerNavigationConfig); }; export default ReactovaNavigator;
Allow LoadingScreen to be optional and not default
Allow LoadingScreen to be optional and not default
JavaScript
mit
Digitova/reactova-framework
javascript
## Code Before: import React from 'react'; import { TabNavigator } from 'react-navigation'; import BaseNavigator from './BaseNavigator' const ReactovaNavigator = (navigationSchema,LoadingScreen) => { const routerRouteConfig = { Loading: { screen: LoadingScreen, }, BaseNavigator: { path: 'base', screen: BaseNavigator(navigationSchema), }, } const routerNavigationConfig = { initialRouteName: 'Loading', navigationOptions:{ tabBarVisible: false } } return TabNavigator(routerRouteConfig,routerNavigationConfig); }; export default ReactovaNavigator; ## Instruction: Allow LoadingScreen to be optional and not default ## Code After: import React from 'react'; import { TabNavigator } from 'react-navigation'; import BaseNavigator from './BaseNavigator' const ReactovaNavigator = (navigationSchema, LoadingScreen = null, ladingScreenIsDefaultRoute = true) => { let loading = {} const baseNavigatorName = 'BaseNavigator' let initialRouteName = baseNavigatorName if(LoadingScreen) { loading = { Loading: { screen: LoadingScreen } } initialRouteName = ladingScreenIsDefaultRoute ? 'Loading' : initialRouteName } const routerRouteConfig = { ...loading, [baseNavigatorName]: { path: 'base', screen: BaseNavigator(navigationSchema), }, } const routerNavigationConfig = { initialRouteName, navigationOptions:{ tabBarVisible: false } } return TabNavigator(routerRouteConfig,routerNavigationConfig); }; export default ReactovaNavigator;
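The JavaScript rewrite relies on a default parameter, a conditional object spread, and a computed key to make the loading screen optional. The same route-selection logic, sketched in Python purely for illustration (the names mirror the JS but nothing here is the library's API):

```python
def build_router_config(navigation_schema, loading_screen=None, loading_is_default=True):
    # Illustrative translation of ReactovaNavigator's decision logic:
    # the Loading route only exists when a screen is supplied, and only
    # becomes the initial route when the flag also asks for it.
    base_name = "BaseNavigator"
    routes = {}
    if loading_screen is not None:
        routes["Loading"] = {"screen": loading_screen}
    routes[base_name] = {"path": "base", "screen": navigation_schema}
    use_loading = loading_screen is not None and loading_is_default
    return {
        "routes": routes,
        "initialRouteName": "Loading" if use_loading else base_name,
    }

print(build_router_config({}, None)["initialRouteName"])       # BaseNavigator
print(build_router_config({}, "Spinner")["initialRouteName"])  # Loading
```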
11458af89032ca7085de1ab6e39b486962140532
locales/kab/messages.properties
locales/kab/messages.properties
one_time=Deg akud monthly=S waggur first_name=Isem amezwaru last_name=Isem aneggaru address=Tansa city=Aɣrem # some of these strings are not or will on the page, or some of them will be use for email/snippets only # These are the variants: # this is for directory tiles # popup further monthly ask # privacy policy varients
one_time=Deg akud monthly=S waggur first_name=Isem amezwaru last_name=Isem aneggaru address=Tansa city=Aɣrem MM=GG YY=SS CVC=CVC Mission=Tuɣdaṭ About=Ɣef privacyPolicyFooter=Tasertit n tbaḍnit legalNotices=Alɣu ɣef usaḍuf Contact=Anermis sign_up_now=Jerred tura your_email=youremail@example.com amount=Tallayt support_mozilla=Mudd afus i Mozilla email=Imayl payment=Asellek choose_payment=Fren asellek country=Tamurt none=Ula yiwen submitting=Tuzna… # some of these strings are not or will on the page, or some of them will be use for email/snippets only # These are the variants: # this is for directory tiles # popup further monthly ask # privacy policy varients
Update Kabyle (kab) localization of Fundraising
Pontoon: Update Kabyle (kab) localization of Fundraising Localization authors: - belkacem77 <belkacem77@gmail.com>
INI
mpl-2.0
mozilla/donate.mozilla.org,ScottDowne/donate.mozilla.org
ini
## Code Before: one_time=Deg akud monthly=S waggur first_name=Isem amezwaru last_name=Isem aneggaru address=Tansa city=Aɣrem # some of these strings are not or will on the page, or some of them will be use for email/snippets only # These are the variants: # this is for directory tiles # popup further monthly ask # privacy policy varients ## Instruction: Pontoon: Update Kabyle (kab) localization of Fundraising Localization authors: - belkacem77 <belkacem77@gmail.com> ## Code After: one_time=Deg akud monthly=S waggur first_name=Isem amezwaru last_name=Isem aneggaru address=Tansa city=Aɣrem MM=GG YY=SS CVC=CVC Mission=Tuɣdaṭ About=Ɣef privacyPolicyFooter=Tasertit n tbaḍnit legalNotices=Alɣu ɣef usaḍuf Contact=Anermis sign_up_now=Jerred tura your_email=youremail@example.com amount=Tallayt support_mozilla=Mudd afus i Mozilla email=Imayl payment=Asellek choose_payment=Fren asellek country=Tamurt none=Ula yiwen submitting=Tuzna… # some of these strings are not or will on the page, or some of them will be use for email/snippets only # These are the variants: # this is for directory tiles # popup further monthly ask # privacy policy varients
44c532694435b8e8f7b68fc10c60875892826ba8
app.rb
app.rb
require 'sinatra/base' require 'sinatra/json' require 'sidekiq' require 'sidekiq/web' require './lib/downloader' require './lib/reply' require './lib/feed_builder' require './lib/settings' require './workers/null_worker' require './workers/download_worker' class MovMeApp < Sinatra::Base get '/' do 'Mov Me' end get '/test' do json frontend: :ok, backend: check_backend, config: check_config end post '/' do if Settings.sms_enabled? puts "SMS Incomming: #{params[:Body]} - received" else puts "SMS Incomming: #{params[:Body]}" end reply_message = Reply.new.run("Downloading #{params[:Body]}") DownloadWorker.perform_async(params[:Body], params[:From]) # Downloader.new.run(url: params[:Body], to: params[:From]) # Downloader.new(settings).background_run(params[:Body]) reply_message end private def check_backend NullWorker.perform_async(rand(30)) :ok rescue :fail end def check_config Settings.loaded? ? :ok : :fail rescue :fail end end
require 'sinatra/base' require 'sinatra/json' require 'sidekiq' require 'sidekiq/web' require './lib/downloader' require './lib/reply' require './lib/feed_builder' require './lib/settings' require './workers/null_worker' require './workers/download_worker' class MovMeApp < Sinatra::Base get '/' do 'Mov Me' end get '/test' do json frontend: :ok, backend: check_backend, config: check_config end post '/' do if Settings.sms_enabled? puts "SMS Incomming: #{params[:Body]} - received" else puts "SMS Incomming: #{params[:Body]}" end reply_message = Reply.new.run("Downloading #{params[:Body]}") DownloadWorker.perform_async(params[:Body], params[:From]) # Downloader.new.run(url: params[:Body], to: params[:From]) # Downloader.new(settings).background_run(params[:Body]) reply_message end private def check_backend NullWorker.perform_async(rand(30)) :ok rescue => e puts e :fail end def check_config Settings.loaded? ? :ok : :fail rescue :fail end end
Print error message for NullWorker
Print error message for NullWorker
Ruby
mit
X0nic/mov-me,X0nic/mov-me
ruby
## Code Before: require 'sinatra/base' require 'sinatra/json' require 'sidekiq' require 'sidekiq/web' require './lib/downloader' require './lib/reply' require './lib/feed_builder' require './lib/settings' require './workers/null_worker' require './workers/download_worker' class MovMeApp < Sinatra::Base get '/' do 'Mov Me' end get '/test' do json frontend: :ok, backend: check_backend, config: check_config end post '/' do if Settings.sms_enabled? puts "SMS Incomming: #{params[:Body]} - received" else puts "SMS Incomming: #{params[:Body]}" end reply_message = Reply.new.run("Downloading #{params[:Body]}") DownloadWorker.perform_async(params[:Body], params[:From]) # Downloader.new.run(url: params[:Body], to: params[:From]) # Downloader.new(settings).background_run(params[:Body]) reply_message end private def check_backend NullWorker.perform_async(rand(30)) :ok rescue :fail end def check_config Settings.loaded? ? :ok : :fail rescue :fail end end ## Instruction: Print error message for NullWorker ## Code After: require 'sinatra/base' require 'sinatra/json' require 'sidekiq' require 'sidekiq/web' require './lib/downloader' require './lib/reply' require './lib/feed_builder' require './lib/settings' require './workers/null_worker' require './workers/download_worker' class MovMeApp < Sinatra::Base get '/' do 'Mov Me' end get '/test' do json frontend: :ok, backend: check_backend, config: check_config end post '/' do if Settings.sms_enabled? puts "SMS Incomming: #{params[:Body]} - received" else puts "SMS Incomming: #{params[:Body]}" end reply_message = Reply.new.run("Downloading #{params[:Body]}") DownloadWorker.perform_async(params[:Body], params[:From]) # Downloader.new.run(url: params[:Body], to: params[:From]) # Downloader.new(settings).background_run(params[:Body]) reply_message end private def check_backend NullWorker.perform_async(rand(30)) :ok rescue => e puts e :fail end def check_config Settings.loaded? ? :ok : :fail rescue :fail end end
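The one-line Ruby change (`rescue` to `rescue => e; puts e`) is the difference between a silent health-check failure and a diagnosable one. An analogous pattern, sketched in Python under the assumption that the enqueue call may raise anything:

```python
def check_backend(enqueue_job):
    """Analogue of the fixed check_backend: surface the error, then report failure."""
    try:
        enqueue_job()
        return "ok"
    except Exception as e:
        print(e)  # the point of the commit: don't swallow the error silently
        return "fail"

assert check_backend(lambda: None) == "ok"
assert check_backend(lambda: 1 / 0) == "fail"
```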
6021c4c54cb0a437878553a1e23f8d433476ff2d
main.py
main.py
import json from kivy.app import App from kivy.uix.boxlayout import BoxLayout from kivy.properties import ObjectProperty from kivy.network.urlrequest import UrlRequest class AddLocationForm(BoxLayout): search_input = ObjectProperty() search_results = ObjectProperty() def search_location(self): search_template = ("http://api.openweathermap.org/data/2.5/" + "find?q={}&type=like") search_url = search_template.format(self.search_input.text) request = UrlRequest(search_url, self.found_location) def found_location(self, request, data): data = json.loads(data.decode()) if not isinstance(data, dict) else data cities = ["{} ({})".format(d['name'], d['sys']['country']) for d in data['list']] self.search_results.item_strings = cities class WeatherApp(App): pass if __name__ == '__main__': WeatherApp().run()
import json from kivy.app import App from kivy.uix.boxlayout import BoxLayout from kivy.properties import ObjectProperty from kivy.network.urlrequest import UrlRequest class AddLocationForm(BoxLayout): search_input = ObjectProperty() search_results = ObjectProperty() def search_location(self): search_template = ("http://api.openweathermap.org/data/2.5/" + "find?q={}&type=like") search_url = search_template.format(self.search_input.text) request = UrlRequest(search_url, self.found_location) def found_location(self, request, data): data = json.loads(data.decode()) if not isinstance(data, dict) else data if 'list' in data: cities = ["{} ({})".format(d['name'], d['sys']['country']) for d in data['list']] else: cities = [] self.search_results.item_strings = cities class WeatherApp(App): pass if __name__ == '__main__': WeatherApp().run()
Stop crashing when search doesn't have any matches
Stop crashing when search doesn't have any matches
Python
mit
ciappi/Weather
python
## Code Before: import json from kivy.app import App from kivy.uix.boxlayout import BoxLayout from kivy.properties import ObjectProperty from kivy.network.urlrequest import UrlRequest class AddLocationForm(BoxLayout): search_input = ObjectProperty() search_results = ObjectProperty() def search_location(self): search_template = ("http://api.openweathermap.org/data/2.5/" + "find?q={}&type=like") search_url = search_template.format(self.search_input.text) request = UrlRequest(search_url, self.found_location) def found_location(self, request, data): data = json.loads(data.decode()) if not isinstance(data, dict) else data cities = ["{} ({})".format(d['name'], d['sys']['country']) for d in data['list']] self.search_results.item_strings = cities class WeatherApp(App): pass if __name__ == '__main__': WeatherApp().run() ## Instruction: Stop crashing when search doesn't have any matches ## Code After: import json from kivy.app import App from kivy.uix.boxlayout import BoxLayout from kivy.properties import ObjectProperty from kivy.network.urlrequest import UrlRequest class AddLocationForm(BoxLayout): search_input = ObjectProperty() search_results = ObjectProperty() def search_location(self): search_template = ("http://api.openweathermap.org/data/2.5/" + "find?q={}&type=like") search_url = search_template.format(self.search_input.text) request = UrlRequest(search_url, self.found_location) def found_location(self, request, data): data = json.loads(data.decode()) if not isinstance(data, dict) else data if 'list' in data: cities = ["{} ({})".format(d['name'], d['sys']['country']) for d in data['list']] else: cities = [] self.search_results.item_strings = cities class WeatherApp(App): pass if __name__ == '__main__': WeatherApp().run()
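The guard added to `found_location` handles the no-match case, where the OpenWeatherMap payload presumably arrives without a `'list'` key. The same logic as a standalone function (the exact shape of the no-match payload is an assumption):

```python
def cities_from_response(data):
    # Same guard as the fixed found_location: no 'list' key means no matches,
    # so return an empty result instead of raising KeyError.
    if 'list' in data:
        return ["{} ({})".format(d['name'], d['sys']['country'])
                for d in data['list']]
    return []

assert cities_from_response({"list": [{"name": "Turin", "sys": {"country": "IT"}}]}) == ["Turin (IT)"]
assert cities_from_response({"message": "like", "count": 0}) == []
```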
1c3ccd789bb5d13852e42278dbbc337a44ec5205
openstack/nova/templates/console-ingress.yaml
openstack/nova/templates/console-ingress.yaml
kind: Ingress {{- if $.Capabilities.APIVersions.Has "networking.k8s.io/v1beta1" }} apiVersion: networking.k8s.io/v1beta1 {{- else }} apiVersion: extensions/v1beta1 {{- end }} metadata: name: nova-console labels: system: openstack type: backend component: nova annotations: ingress.kubernetes.io/rewrite-target: / {{- if .Values.vice_president }} vice-president: "true" {{- end }} spec: tls: - secretName: tls-{{include "nova_console_endpoint_host_public" . | replace "." "-" }} hosts: [{{include "nova_console_endpoint_host_public" .}}] rules: - host: {{include "nova_console_endpoint_host_public" .}} http: paths: {{- range $name, $config := .Values.consoles }} - path: /{{ $name }} backend: serviceName: nova-console-{{ $name }} servicePort: {{ $name }} {{- end }} {{ $envAll := . }} {{- range $name, $config := .Values.consoles }} --- {{ tuple $envAll $name $config | include "nova.console_service" }} {{- end }}
kind: Ingress {{- if $.Capabilities.APIVersions.Has "networking.k8s.io/v1beta1" }} apiVersion: networking.k8s.io/v1beta1 {{- else }} apiVersion: extensions/v1beta1 {{- end }} metadata: name: nova-console labels: system: openstack type: backend component: nova annotations: ingress.kubernetes.io/use-regex: "true" ingress.kubernetes.io/configuration-snippet: | {{- range $name, $config := .Values.consoles }} rewrite "(?i)/{{ $name }}/(.*)" /$1 break; rewrite "(?i)/{{ $name }}$" / break; {{- end }} {{- if .Values.vice_president }} vice-president: "true" {{- end }} spec: tls: - secretName: tls-{{include "nova_console_endpoint_host_public" . | replace "." "-" }} hosts: [{{include "nova_console_endpoint_host_public" .}}] rules: - host: {{include "nova_console_endpoint_host_public" .}} http: paths: {{- range $name, $config := .Values.consoles }} - path: /{{ $name }}/?(.*) backend: serviceName: nova-console-{{ $name }} servicePort: {{ $name }} {{- end }} {{ $envAll := . }} {{- range $name, $config := .Values.consoles }} --- {{ tuple $envAll $name $config | include "nova.console_service" }} {{- end }}
Use Nginx ingress 0.22 compatible syntax
[nova] Use Nginx ingress 0.22 compatible syntax Nginx ingress 0.21 is the last Nginx ingress version supporting the old way of rewriting URLs for nova-console. With all regions already on 0.21, we can start using the syntax available in both 0.21 and 0.22 so we can update to Nginx ingress 0.22. The plan is to afterwards change to the new syntax without using `configuration-snippet`, which is only supported starting with 0.22.
YAML
apache-2.0
sapcc/helm-charts,sapcc/helm-charts,sapcc/helm-charts,sapcc/helm-charts
yaml
## Code Before: kind: Ingress {{- if $.Capabilities.APIVersions.Has "networking.k8s.io/v1beta1" }} apiVersion: networking.k8s.io/v1beta1 {{- else }} apiVersion: extensions/v1beta1 {{- end }} metadata: name: nova-console labels: system: openstack type: backend component: nova annotations: ingress.kubernetes.io/rewrite-target: / {{- if .Values.vice_president }} vice-president: "true" {{- end }} spec: tls: - secretName: tls-{{include "nova_console_endpoint_host_public" . | replace "." "-" }} hosts: [{{include "nova_console_endpoint_host_public" .}}] rules: - host: {{include "nova_console_endpoint_host_public" .}} http: paths: {{- range $name, $config := .Values.consoles }} - path: /{{ $name }} backend: serviceName: nova-console-{{ $name }} servicePort: {{ $name }} {{- end }} {{ $envAll := . }} {{- range $name, $config := .Values.consoles }} --- {{ tuple $envAll $name $config | include "nova.console_service" }} {{- end }} ## Instruction: [nova] Use Nginx ingress 0.22 compatible syntax Nginx ingress 0.21 is the last Nginx ingress version supporting the old way of rewriting URLs for nova-console. With all regions already on 0.21, we can start using the syntax available in both 0.21 and 0.22 so we can update to Nginx ingress 0.22. The plan is to afterwards change to the new syntax without using `configuration-snippet`, which is only supported starting with 0.22. 
## Code After: kind: Ingress {{- if $.Capabilities.APIVersions.Has "networking.k8s.io/v1beta1" }} apiVersion: networking.k8s.io/v1beta1 {{- else }} apiVersion: extensions/v1beta1 {{- end }} metadata: name: nova-console labels: system: openstack type: backend component: nova annotations: ingress.kubernetes.io/use-regex: "true" ingress.kubernetes.io/configuration-snippet: | {{- range $name, $config := .Values.consoles }} rewrite "(?i)/{{ $name }}/(.*)" /$1 break; rewrite "(?i)/{{ $name }}$" / break; {{- end }} {{- if .Values.vice_president }} vice-president: "true" {{- end }} spec: tls: - secretName: tls-{{include "nova_console_endpoint_host_public" . | replace "." "-" }} hosts: [{{include "nova_console_endpoint_host_public" .}}] rules: - host: {{include "nova_console_endpoint_host_public" .}} http: paths: {{- range $name, $config := .Values.consoles }} - path: /{{ $name }}/?(.*) backend: serviceName: nova-console-{{ $name }} servicePort: {{ $name }} {{- end }} {{ $envAll := . }} {{- range $name, $config := .Values.consoles }} --- {{ tuple $envAll $name $config | include "nova.console_service" }} {{- end }}
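The new annotations replace `rewrite-target` with per-console regex rewrites emitted into a `configuration-snippet`. One way to sanity-check what those rules do to a request path is to translate them into Python regexes (the `novnc` console name is an assumption about `.Values.consoles`; nginx's `$1` becomes `\1` here, and `break` semantics are elided):

```python
import re

def rewrite(path, console="novnc"):
    # Python translation of the two rewrite rules the chart emits per console:
    # strip the console prefix, case-insensitively, before proxying upstream.
    path = re.sub(r"(?i)/%s/(.*)" % console, r"/\1", path)
    return re.sub(r"(?i)/%s$" % console, "/", path)

assert rewrite("/novnc/vnc.html?token=abc") == "/vnc.html?token=abc"
assert rewrite("/NoVNC") == "/"
assert rewrite("/unrelated") == "/unrelated"
```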
b86cb23478990f348747b326ad7ba1455fd485a1
CHANGELOG.md
CHANGELOG.md
* Rewrite for Rails 3 * Implement new Rails 3/ARel finder syntax * Double Rainbows 🌈🌈 ## 0.2.1 / 2008-08-08 * Make results will_paginate-compatible ## 0.2.0 / 2007-10-27 * Add `validates_as_geocodable` ([Mark Van Holstyn](https://github.com/mvanholstyn)) * Allow address mapping to be a single field ([Mark Van Holstyn](https://github.com/mvanholstyn)) ## 0.1.0 * Add `remote_location` to get a user's location based on his or her `remote_ip` * Rename `:city` to `:locality` in address mapping to be consistent with Graticule 0.2 Create a migration with: ```ruby rename_column :geocodes, :city, :locality ``` * Replace `#full_address` with `#to_location`
* Rewrite for Rails 3 * Implement new Rails 3/ARel finder syntax * Double Rainbows 🌈🌈 ## 0.2.1 / 2008-08-08 * Make results will_paginate-compatible ## 0.2.0 / 2007-10-27 * Add `validates_as_geocodable` ([Mark Van Holstyn](https://github.com/mvanholstyn)) * Allow address mapping to be a single field ([Mark Van Holstyn](https://github.com/mvanholstyn)) ## 0.1.0 / 2007-03-20 * Add `remote_location` to get a user's location based on his or her `remote_ip` * Rename `:city` to `:locality` in address mapping to be consistent with Graticule 0.2 Create a migration with: ```ruby rename_column :geocodes, :city, :locality ``` * Replace `#full_address` with `#to_location`
Add a release data for version 0.1.0
Add a release data for version 0.1.0 This is when acts_as_geocodable was still a plugin rather than a proper gem.
Markdown
mit
collectiveidea/acts_as_geocodable,kornhammer/acts_as_geocodable
markdown
## Code Before: * Rewrite for Rails 3 * Implement new Rails 3/ARel finder syntax * Double Rainbows 🌈🌈 ## 0.2.1 / 2008-08-08 * Make results will_paginate-compatible ## 0.2.0 / 2007-10-27 * Add `validates_as_geocodable` ([Mark Van Holstyn](https://github.com/mvanholstyn)) * Allow address mapping to be a single field ([Mark Van Holstyn](https://github.com/mvanholstyn)) ## 0.1.0 * Add `remote_location` to get a user's location based on his or her `remote_ip` * Rename `:city` to `:locality` in address mapping to be consistent with Graticule 0.2 Create a migration with: ```ruby rename_column :geocodes, :city, :locality ``` * Replace `#full_address` with `#to_location` ## Instruction: Add a release data for version 0.1.0 This is when acts_as_geocodable was still a plugin rather than a proper gem. ## Code After: * Rewrite for Rails 3 * Implement new Rails 3/ARel finder syntax * Double Rainbows 🌈🌈 ## 0.2.1 / 2008-08-08 * Make results will_paginate-compatible ## 0.2.0 / 2007-10-27 * Add `validates_as_geocodable` ([Mark Van Holstyn](https://github.com/mvanholstyn)) * Allow address mapping to be a single field ([Mark Van Holstyn](https://github.com/mvanholstyn)) ## 0.1.0 / 2007-03-20 * Add `remote_location` to get a user's location based on his or her `remote_ip` * Rename `:city` to `:locality` in address mapping to be consistent with Graticule 0.2 Create a migration with: ```ruby rename_column :geocodes, :city, :locality ``` * Replace `#full_address` with `#to_location`
b635eddbe3ad344b02ecae47333a4ddf4b17cd18
bin/remotePush.py
bin/remotePush.py
import json,httplib config_file = open('conf/net/ext_service/parse.json') silent_push_msg = { "where": { "deviceType": "ios" }, "data": { # "alert": "The Mets scored! The game is now tied 1-1.", "content-available": 1, "sound": "", } } parse_headers = { "X-Parse-Application-Id": config_file["emission_id"], "X-Parse-REST-API-Key": config_file["emission_key"], "Content-Type": "application/json" } connection = httplib.HTTPSConnection('api.parse.com', 443) connection.connect() connection.request('POST', '/1/push', json.dumps(silent_push_msg), parse_headers) result = json.loads(connection.getresponse().read()) print result
import json,httplib config_data = json.load(open('conf/net/ext_service/parse.json')) silent_push_msg = { "where": { "deviceType": "ios" }, "data": { # "alert": "The Mets scored! The game is now tied 1-1.", "content-available": 1, "sound": "", } } parse_headers = { "X-Parse-Application-Id": config_data["emission_id"], "X-Parse-REST-API-Key": config_data["emission_key"], "Content-Type": "application/json" } connection = httplib.HTTPSConnection('api.parse.com', 443) connection.connect() connection.request('POST', '/1/push', json.dumps(silent_push_msg), parse_headers) result = json.loads(connection.getresponse().read()) print result
Fix minor issue in remote push
Fix minor issue in remote push We need to open the file and then parse it as json
Python
bsd-3-clause
joshzarrabi/e-mission-server,sunil07t/e-mission-server,joshzarrabi/e-mission-server,yw374cornell/e-mission-server,e-mission/e-mission-server,joshzarrabi/e-mission-server,joshzarrabi/e-mission-server,shankari/e-mission-server,yw374cornell/e-mission-server,e-mission/e-mission-server,e-mission/e-mission-server,sunil07t/e-mission-server,sunil07t/e-mission-server,sunil07t/e-mission-server,yw374cornell/e-mission-server,shankari/e-mission-server,e-mission/e-mission-server,yw374cornell/e-mission-server,shankari/e-mission-server,shankari/e-mission-server
python
## Code Before: import json,httplib config_file = open('conf/net/ext_service/parse.json') silent_push_msg = { "where": { "deviceType": "ios" }, "data": { # "alert": "The Mets scored! The game is now tied 1-1.", "content-available": 1, "sound": "", } } parse_headers = { "X-Parse-Application-Id": config_file["emission_id"], "X-Parse-REST-API-Key": config_file["emission_key"], "Content-Type": "application/json" } connection = httplib.HTTPSConnection('api.parse.com', 443) connection.connect() connection.request('POST', '/1/push', json.dumps(silent_push_msg), parse_headers) result = json.loads(connection.getresponse().read()) print result ## Instruction: Fix minor issue in remote push We need to open the file and then parse it as json ## Code After: import json,httplib config_data = json.load(open('conf/net/ext_service/parse.json')) silent_push_msg = { "where": { "deviceType": "ios" }, "data": { # "alert": "The Mets scored! The game is now tied 1-1.", "content-available": 1, "sound": "", } } parse_headers = { "X-Parse-Application-Id": config_data["emission_id"], "X-Parse-REST-API-Key": config_data["emission_key"], "Content-Type": "application/json" } connection = httplib.HTTPSConnection('api.parse.com', 443) connection.connect() connection.request('POST', '/1/push', json.dumps(silent_push_msg), parse_headers) result = json.loads(connection.getresponse().read()) print result
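The bug in the record above is that `open()` returns a file object, while the code goes on to subscript it like a dict; `json.load` has to run first. A minimal reproduction, with an in-memory stream standing in for the config file:

```python
import json, io

raw = '{"emission_id": "app-id", "emission_key": "rest-key"}'

config_file = io.StringIO(raw)      # what open(...) alone gives you: a stream
try:
    config_file["emission_id"]      # the broken line, in miniature
except TypeError:
    print("a file object is not subscriptable")

config_data = json.load(io.StringIO(raw))   # the fix: parse the stream first
assert config_data["emission_id"] == "app-id"
```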
5467e6d7580f3a84ae71eb5a60c271a1f9c96e0d
CHANGELOG.md
CHANGELOG.md
This project adheres to [Semantic Versioning](http://semver.org/). ## Upcoming ### Fix * If the `jsdoc`-helper did not create any docs, the whole output file was not written. ## v0.2.0 - 2015-08-04 ### Change * `rendertree` is now a block-helper, the label of each node is determined by the block-contents * `example`-helper now also converts `require('../file')` into `require('package-name/file') * `include`-helper: Determines fence-type from file-extension if not explicitly provided * In order to run thought, `thought run` must be called. ... and many more changes ### Add * Additional helper `jsdoc` to generate documentation for javscript-files. `multilang-apidocs` is by far not as powerful as `jsdoc-to-markdown` * `npm`-helper to link to the npm-page of a page ## v0.0.1 - 2015-07-02 ### Initial version
This project adheres to [Semantic Versioning](http://semver.org/). ## v0.2.1 - 2015-08-06 ### Fix * If the `jsdoc`-helper did not create any docs, the whole output file was not written. ## v0.2.0 - 2015-08-04 ### Change * `rendertree` is now a block-helper, the label of each node is determined by the block-contents * `example`-helper now also converts `require('../file')` into `require('package-name/file') * `include`-helper: Determines fence-type from file-extension if not explicitly provided * In order to run thought, `thought run` must be called. ... and many more changes ### Add * Additional helper `jsdoc` to generate documentation for javscript-files. `multilang-apidocs` is by far not as powerful as `jsdoc-to-markdown` * `npm`-helper to link to the npm-page of a page ## v0.0.1 - 2015-07-02 ### Initial version
Add changes in version: v0.2.1
Add changes in version: v0.2.1
Markdown
mit
nknapp/thought,nknapp/thought
markdown
## Code Before: This project adheres to [Semantic Versioning](http://semver.org/). ## Upcoming ### Fix * If the `jsdoc`-helper did not create any docs, the whole output file was not written. ## v0.2.0 - 2015-08-04 ### Change * `rendertree` is now a block-helper, the label of each node is determined by the block-contents * `example`-helper now also converts `require('../file')` into `require('package-name/file') * `include`-helper: Determines fence-type from file-extension if not explicitly provided * In order to run thought, `thought run` must be called. ... and many more changes ### Add * Additional helper `jsdoc` to generate documentation for javscript-files. `multilang-apidocs` is by far not as powerful as `jsdoc-to-markdown` * `npm`-helper to link to the npm-page of a page ## v0.0.1 - 2015-07-02 ### Initial version ## Instruction: Add changes in version: v0.2.1 ## Code After: This project adheres to [Semantic Versioning](http://semver.org/). ## v0.2.1 - 2015-08-06 ### Fix * If the `jsdoc`-helper did not create any docs, the whole output file was not written. ## v0.2.0 - 2015-08-04 ### Change * `rendertree` is now a block-helper, the label of each node is determined by the block-contents * `example`-helper now also converts `require('../file')` into `require('package-name/file') * `include`-helper: Determines fence-type from file-extension if not explicitly provided * In order to run thought, `thought run` must be called. ... and many more changes ### Add * Additional helper `jsdoc` to generate documentation for javscript-files. `multilang-apidocs` is by far not as powerful as `jsdoc-to-markdown` * `npm`-helper to link to the npm-page of a page ## v0.0.1 - 2015-07-02 ### Initial version
104d37d81c2d8e39722eab95584aa5339ba94e4b
package.json
package.json
{ "name": "deferreds", "description": "Functional utility library for working with Deferred objects", "version": "1.1.0", "homepage": "https://github.com/zship/deferreds.js", "author": "Zach Shipley <zach@zachshipley.com>", "repository": { "type": "git", "url": "https://github.com/zship/deferreds.js" }, "license": "MIT", "dependencies": { "mout": "0.6.x", "signals": "1.x", "setimmediate": "1.x" }, "devDependencies": { "amdefine": "0.x", "uglify-js": "2.x", "almond": "0.2.x", "promises-aplus-tests": "1.3.x", "requirejs": "2.1.x", "mocha": "1.12.x", "glob": "3.x", "umdify": "0.x", "node-amd-require": "0.x", "jquery": "1.x", "q": "0.x", "rsvp": "2.x", "when": "2.x" }, "keywords": [ "promises-aplus" ] }
{ "name": "deferreds", "description": "Functional utility library for working with Deferred objects", "version": "1.1.0", "homepage": "https://github.com/zship/deferreds.js", "author": "Zach Shipley <zach@zachshipley.com>", "repository": { "type": "git", "url": "https://github.com/zship/deferreds.js" }, "license": "MIT", "dependencies": { "mout": "0.6.x", "signals": "1.x", "setimmediate": "1.x" }, "devDependencies": { "amd-tools": "1.x", "amdefine": "0.x", "uglify-js": "2.x", "almond": "0.2.x", "promises-aplus-tests": "1.3.x", "requirejs": "2.1.x", "mocha": "1.12.x", "glob": "3.x", "umdify": "0.x", "node-amd-require": "0.x", "jquery": "1.x", "q": "0.x", "rsvp": "2.x", "when": "2.x" }, "keywords": [ "promises-aplus" ] }
Include amd-tools dependency (required by _make/optimize)
Include amd-tools dependency (required by _make/optimize)
JSON
mit
zship/deferreds.js
json
## Code Before: { "name": "deferreds", "description": "Functional utility library for working with Deferred objects", "version": "1.1.0", "homepage": "https://github.com/zship/deferreds.js", "author": "Zach Shipley <zach@zachshipley.com>", "repository": { "type": "git", "url": "https://github.com/zship/deferreds.js" }, "license": "MIT", "dependencies": { "mout": "0.6.x", "signals": "1.x", "setimmediate": "1.x" }, "devDependencies": { "amdefine": "0.x", "uglify-js": "2.x", "almond": "0.2.x", "promises-aplus-tests": "1.3.x", "requirejs": "2.1.x", "mocha": "1.12.x", "glob": "3.x", "umdify": "0.x", "node-amd-require": "0.x", "jquery": "1.x", "q": "0.x", "rsvp": "2.x", "when": "2.x" }, "keywords": [ "promises-aplus" ] } ## Instruction: Include amd-tools dependency (required by _make/optimize) ## Code After: { "name": "deferreds", "description": "Functional utility library for working with Deferred objects", "version": "1.1.0", "homepage": "https://github.com/zship/deferreds.js", "author": "Zach Shipley <zach@zachshipley.com>", "repository": { "type": "git", "url": "https://github.com/zship/deferreds.js" }, "license": "MIT", "dependencies": { "mout": "0.6.x", "signals": "1.x", "setimmediate": "1.x" }, "devDependencies": { "amd-tools": "1.x", "amdefine": "0.x", "uglify-js": "2.x", "almond": "0.2.x", "promises-aplus-tests": "1.3.x", "requirejs": "2.1.x", "mocha": "1.12.x", "glob": "3.x", "umdify": "0.x", "node-amd-require": "0.x", "jquery": "1.x", "q": "0.x", "rsvp": "2.x", "when": "2.x" }, "keywords": [ "promises-aplus" ] }
16de7005e5694637d20d8ad51cae9a5d14210c8b
style.css
style.css
body { background-color: #64ceaa; color: white; font-family: sans-serif; } kbd { border: 1px solid white; margin: 2px; padding: 2px; } .header, .app, #bpm { text-align: center; } .header { margin-top: 4em; } .output { margin: 2em; } input#bpm { font-size: 1em; font-weight: bold; padding: 1em; width: 5.25em; } button#toggle { background-color: white; font-size: 1em; font-weight: bold; padding: 1em; } #flash { fill: white; stroke-width: 10; } #flash.active { stroke: white; }
body { background-color: #64ceaa; color: white; font-family: sans-serif; } kbd { border: 1px solid white; margin: 2px; padding: 2px; } .header, .app, #bpm { text-align: center; } .header { margin-top: 4em; } .output { margin: 2em; } input#bpm { font-size: 1em; font-weight: bold; margin: .3em; padding: 1.5em; width: 5.25em; } button#toggle { background-color: white; font-size: 1em; font-weight: bold; margin: .3em; padding: 1.5em; } #flash { fill: white; stroke-width: 10; } #flash.active { stroke: white; }
Increase input / button size
Increase input / button size
CSS
mit
SimpleMetronome/simplemetronome.github.io
css
## Code Before: body { background-color: #64ceaa; color: white; font-family: sans-serif; } kbd { border: 1px solid white; margin: 2px; padding: 2px; } .header, .app, #bpm { text-align: center; } .header { margin-top: 4em; } .output { margin: 2em; } input#bpm { font-size: 1em; font-weight: bold; padding: 1em; width: 5.25em; } button#toggle { background-color: white; font-size: 1em; font-weight: bold; padding: 1em; } #flash { fill: white; stroke-width: 10; } #flash.active { stroke: white; } ## Instruction: Increase input / button size ## Code After: body { background-color: #64ceaa; color: white; font-family: sans-serif; } kbd { border: 1px solid white; margin: 2px; padding: 2px; } .header, .app, #bpm { text-align: center; } .header { margin-top: 4em; } .output { margin: 2em; } input#bpm { font-size: 1em; font-weight: bold; margin: .3em; padding: 1.5em; width: 5.25em; } button#toggle { background-color: white; font-size: 1em; font-weight: bold; margin: .3em; padding: 1.5em; } #flash { fill: white; stroke-width: 10; } #flash.active { stroke: white; }
114e2e877898f351bbb388cac7df5811b322c48f
setup.py
setup.py
from setuptools import find_packages, setup from shorty.version import __VERSION__ dependencies=[ 'django', 'django-autoconfig', 'django-nuit', ] test_dependencies=[ 'django-setuptest', ] setup( name='djshorty', version=__VERSION__, description='A Django URL shortening app', author='Ben Cardy', author_email='ben.cardy@ocado.com', packages=find_packages(), install_requires=dependencies, # To run tests via python setup.py test tests_require=test_dependencies, test_suite='setuptest.setuptest.SetupTestSuite', include_package_data=True, )
from setuptools import find_packages, setup from shorty.version import __VERSION__ dependencies=[ 'django', 'django-autoconfig', 'django-nuit >= 1.0.0, < 2.0.0', ] test_dependencies=[ 'django-setuptest', ] setup( name='djshorty', version=__VERSION__, description='A Django URL shortening app', author='Ben Cardy', author_email='ben.cardy@ocado.com', packages=find_packages(), install_requires=dependencies, # To run tests via python setup.py test tests_require=test_dependencies, test_suite='setuptest.setuptest.SetupTestSuite', include_package_data=True, )
Fix version dependency on nuit
Fix version dependency on nuit
Python
apache-2.0
benbacardi/djshorty,benbacardi/djshorty,ocadotechnology/djshorty,ocadotechnology/djshorty,ocadotechnology/djshorty,benbacardi/djshorty
python
## Code Before: from setuptools import find_packages, setup from shorty.version import __VERSION__ dependencies=[ 'django', 'django-autoconfig', 'django-nuit', ] test_dependencies=[ 'django-setuptest', ] setup( name='djshorty', version=__VERSION__, description='A Django URL shortening app', author='Ben Cardy', author_email='ben.cardy@ocado.com', packages=find_packages(), install_requires=dependencies, # To run tests via python setup.py test tests_require=test_dependencies, test_suite='setuptest.setuptest.SetupTestSuite', include_package_data=True, ) ## Instruction: Fix version dependency on nuit ## Code After: from setuptools import find_packages, setup from shorty.version import __VERSION__ dependencies=[ 'django', 'django-autoconfig', 'django-nuit >= 1.0.0, < 2.0.0', ] test_dependencies=[ 'django-setuptest', ] setup( name='djshorty', version=__VERSION__, description='A Django URL shortening app', author='Ben Cardy', author_email='ben.cardy@ocado.com', packages=find_packages(), install_requires=dependencies, # To run tests via python setup.py test tests_require=test_dependencies, test_suite='setuptest.setuptest.SetupTestSuite', include_package_data=True, )
e75e741770d1735c52770900b1cf59f207f2264e
asp/__init__.py
asp/__init__.py
__version__ = '0.1' __version_info__ = tuple([ int(num) for num in __version__.split('.')])
__version__ = '0.1' __version_info__ = tuple([ int(num) for num in __version__.split('.')]) class SpecializationError(Exception): """ Exception that caused specialization not to occur. Attributes: msg -- the message/explanation to the user phase -- which phase of specialization caused the error """ def __init__(self, msg, phase="Unknown phase"): self.value = value
Add asp.SpecializationError class for specialization-related exceptions.
Add asp.SpecializationError class for specialization-related exceptions.
Python
bsd-3-clause
shoaibkamil/asp,mbdriscoll/asp-old,mbdriscoll/asp-old,shoaibkamil/asp,richardxia/asp-multilevel-debug,richardxia/asp-multilevel-debug,pbirsinger/aspNew,pbirsinger/aspNew,mbdriscoll/asp-old,mbdriscoll/asp-old,shoaibkamil/asp,richardxia/asp-multilevel-debug,pbirsinger/aspNew
python
## Code Before: __version__ = '0.1' __version_info__ = tuple([ int(num) for num in __version__.split('.')]) ## Instruction: Add asp.SpecializationError class for specialization-related exceptions. ## Code After: __version__ = '0.1' __version_info__ = tuple([ int(num) for num in __version__.split('.')]) class SpecializationError(Exception): """ Exception that caused specialization not to occur. Attributes: msg -- the message/explanation to the user phase -- which phase of specialization caused the error """ def __init__(self, msg, phase="Unknown phase"): self.value = value
dbbd0ac4425ac8b2d5d80fd6f363217c5675aea3
test/angular_lib/gapi-mocks.js
test/angular_lib/gapi-mocks.js
// GAPI Hangouts Mocks var gapi = function(gapi){ // Create the Hangout API gapi.hangout = function(hangout){ hangout.localParticipant = { id: '123456', displayIndex: 0, person: { id: '123456', displayName: 'Test' } }; // OnApiReady Mocks hangout.onApiReady = { add: function(callback){ // Let's just go ahead and call it. // No sense wasting time. callback({ isApiReady: true }); } }; // Data Mocks var _dataChangedCallbacks = []; hangout.data = { currentState: {}, getState: function(){ return hangout.data.currentState; }, setValue: function(key, value){ hangout.data.currentState[key] = value; }, onStateChanged: { add: function(callback){ _dataChangedCallbacks.push(callback); } } }; hangout.getLocalParticipant = function(){ return hangout.localParticipant; }; return hangout; }(gapi.hangout || {}); return gapi; }(gapi || {});
// GAPI Hangouts Mocks var gapi = function(gapi){ // Create the Hangout API gapi.hangout = function(hangout){ hangout.localParticipant = { id: '123456', displayIndex: 0, person: { id: '123456', displayName: 'Test' } }; // OnApiReady Mocks hangout.onApiReady = { add: function(callback){ // Let's just go ahead and call it. // No sense wasting time. callback({ isApiReady: true }); } }; // Data Mocks var _dataChangedCallbacks = []; hangout.data = { currentState: {}, getState: function(){ return hangout.data.currentState; }, setValue: function(key, value){ hangout.data.currentState[key] = value; }, submitDelta: function(delta){ console.log(delta); }, onStateChanged: { add: function(callback){ _dataChangedCallbacks.push(callback); } } }; hangout.getLocalParticipant = function(){ return hangout.localParticipant; }; return hangout; }(gapi.hangout || {}); return gapi; }(gapi || {});
Improve the GAPI mocks for local development.
Improve the GAPI mocks for local development.
JavaScript
mit
DeBTech/HangoutRundown,DeBTech/HangoutRundown
javascript
## Code Before: // GAPI Hangouts Mocks var gapi = function(gapi){ // Create the Hangout API gapi.hangout = function(hangout){ hangout.localParticipant = { id: '123456', displayIndex: 0, person: { id: '123456', displayName: 'Test' } }; // OnApiReady Mocks hangout.onApiReady = { add: function(callback){ // Let's just go ahead and call it. // No sense wasting time. callback({ isApiReady: true }); } }; // Data Mocks var _dataChangedCallbacks = []; hangout.data = { currentState: {}, getState: function(){ return hangout.data.currentState; }, setValue: function(key, value){ hangout.data.currentState[key] = value; }, onStateChanged: { add: function(callback){ _dataChangedCallbacks.push(callback); } } }; hangout.getLocalParticipant = function(){ return hangout.localParticipant; }; return hangout; }(gapi.hangout || {}); return gapi; }(gapi || {}); ## Instruction: Improve the GAPI mocks for local development. ## Code After: // GAPI Hangouts Mocks var gapi = function(gapi){ // Create the Hangout API gapi.hangout = function(hangout){ hangout.localParticipant = { id: '123456', displayIndex: 0, person: { id: '123456', displayName: 'Test' } }; // OnApiReady Mocks hangout.onApiReady = { add: function(callback){ // Let's just go ahead and call it. // No sense wasting time. callback({ isApiReady: true }); } }; // Data Mocks var _dataChangedCallbacks = []; hangout.data = { currentState: {}, getState: function(){ return hangout.data.currentState; }, setValue: function(key, value){ hangout.data.currentState[key] = value; }, submitDelta: function(delta){ console.log(delta); }, onStateChanged: { add: function(callback){ _dataChangedCallbacks.push(callback); } } }; hangout.getLocalParticipant = function(){ return hangout.localParticipant; }; return hangout; }(gapi.hangout || {}); return gapi; }(gapi || {});
aef21fac420e7715fc569826abe84060ecb8483a
test/tc_live_cmn.rb
test/tc_live_cmn.rb
require 'rbconfig' module TestLiveCMN unless Config::CONFIG['host_os'] =~ /mswin|mingw/ def test_live_cmn flat_dct = IO.read('data/noyes/dct.dat').unpack 'g*' dct =[] 0.step flat_dct.size-13, 13 do |i| dct << flat_dct[i, 13] end ex_cmn = IO.read('data/noyes/cmn.dat').unpack 'g*' live_cmn = LiveCMN.new cmn = live_cmn << dct cmn_flat = cmn.flatten assert_m ex_cmn, cmn_flat, 5 end end end
require 'rbconfig' module TestLiveCMN def test_live_cmn # This test fails on windows because there is too much accumulated # precision error from summing floats. A more sophisticated accumulation # routine may solve this problem, however, from a speech recognition point # of view it isn't a problem. Mean normalization needs to be done quickly # and precision is probably of little benefit. unless Config::CONFIG['host_os'] =~ /mswin|mingw/ flat_dct = IO.read('data/noyes/dct.dat').unpack 'g*' dct =[] 0.step flat_dct.size-13, 13 do |i| dct << flat_dct[i, 13] end ex_cmn = IO.read('data/noyes/cmn.dat').unpack 'g*' live_cmn = LiveCMN.new cmn = live_cmn << dct cmn_flat = cmn.flatten assert_m ex_cmn, cmn_flat, 5 end end end
Stop CMN test from failing on Windows.
Stop CMN test from failing on Windows.
Ruby
bsd-2-clause
talkhouse/noyes,talkhouse/noyes,talkhouse/noyes
ruby
## Code Before: require 'rbconfig' module TestLiveCMN unless Config::CONFIG['host_os'] =~ /mswin|mingw/ def test_live_cmn flat_dct = IO.read('data/noyes/dct.dat').unpack 'g*' dct =[] 0.step flat_dct.size-13, 13 do |i| dct << flat_dct[i, 13] end ex_cmn = IO.read('data/noyes/cmn.dat').unpack 'g*' live_cmn = LiveCMN.new cmn = live_cmn << dct cmn_flat = cmn.flatten assert_m ex_cmn, cmn_flat, 5 end end end ## Instruction: Stop CMN test from failing on Windows. ## Code After: require 'rbconfig' module TestLiveCMN def test_live_cmn # This test fails on windows because there is too much accumulated # precision error from summing floats. A more sophisticated accumulation # routine may solve this problem, however, from a speech recognition point # of view it isn't a problem. Mean normalization needs to be done quickly # and precision is probably of little benefit. unless Config::CONFIG['host_os'] =~ /mswin|mingw/ flat_dct = IO.read('data/noyes/dct.dat').unpack 'g*' dct =[] 0.step flat_dct.size-13, 13 do |i| dct << flat_dct[i, 13] end ex_cmn = IO.read('data/noyes/cmn.dat').unpack 'g*' live_cmn = LiveCMN.new cmn = live_cmn << dct cmn_flat = cmn.flatten assert_m ex_cmn, cmn_flat, 5 end end end
6f6b47de2986fb9192f97a31564563785b38f58a
lib/Tooling/CMakeLists.txt
lib/Tooling/CMakeLists.txt
set(LLVM_LINK_COMPONENTS support) add_subdirectory(Core) add_clang_library(clangTooling ArgumentsAdjusters.cpp CommonOptionsParser.cpp CompilationDatabase.cpp FileMatchTrie.cpp FixIt.cpp JSONCompilationDatabase.cpp Refactoring.cpp RefactoringCallbacks.cpp Tooling.cpp DEPENDS ClangDriverOptions LINK_LIBS clangAST clangASTMatchers clangBasic clangDriver clangFormat clangFrontend clangLex clangRewrite clangToolingCore )
set(LLVM_LINK_COMPONENTS Option Support ) add_subdirectory(Core) add_clang_library(clangTooling ArgumentsAdjusters.cpp CommonOptionsParser.cpp CompilationDatabase.cpp FileMatchTrie.cpp FixIt.cpp JSONCompilationDatabase.cpp Refactoring.cpp RefactoringCallbacks.cpp Tooling.cpp DEPENDS ClangDriverOptions LINK_LIBS clangAST clangASTMatchers clangBasic clangDriver clangFormat clangFrontend clangLex clangRewrite clangToolingCore )
Update libdeps: LLVMOptions, since r280118.
clangTooling: Update libdeps: LLVMOptions, since r280118. git-svn-id: ffe668792ed300d6c2daa1f6eba2e0aa28d7ec6c@280187 91177308-0d34-0410-b5e6-96231b3b80d8
Text
apache-2.0
apple/swift-clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang,apple/swift-clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,llvm-mirror/clang,llvm-mirror/clang
text
## Code Before: set(LLVM_LINK_COMPONENTS support) add_subdirectory(Core) add_clang_library(clangTooling ArgumentsAdjusters.cpp CommonOptionsParser.cpp CompilationDatabase.cpp FileMatchTrie.cpp FixIt.cpp JSONCompilationDatabase.cpp Refactoring.cpp RefactoringCallbacks.cpp Tooling.cpp DEPENDS ClangDriverOptions LINK_LIBS clangAST clangASTMatchers clangBasic clangDriver clangFormat clangFrontend clangLex clangRewrite clangToolingCore ) ## Instruction: clangTooling: Update libdeps: LLVMOptions, since r280118. git-svn-id: ffe668792ed300d6c2daa1f6eba2e0aa28d7ec6c@280187 91177308-0d34-0410-b5e6-96231b3b80d8 ## Code After: set(LLVM_LINK_COMPONENTS Option Support ) add_subdirectory(Core) add_clang_library(clangTooling ArgumentsAdjusters.cpp CommonOptionsParser.cpp CompilationDatabase.cpp FileMatchTrie.cpp FixIt.cpp JSONCompilationDatabase.cpp Refactoring.cpp RefactoringCallbacks.cpp Tooling.cpp DEPENDS ClangDriverOptions LINK_LIBS clangAST clangASTMatchers clangBasic clangDriver clangFormat clangFrontend clangLex clangRewrite clangToolingCore )
6176f4281399e3aeb3f80973d0d4e7187d4715ea
CMakeLists.txt
CMakeLists.txt
cmake_minimum_required(VERSION 3.8) include(version.cmake) include("standard/Standard.cmake") standard_project(Autowiring VERSION ${autowiring_VERSION}) list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake-modules") include(AddPCH) include(ConditionalSources) set_property(GLOBAL PROPERTY USE_FOLDERS ON) # We have unit test projects via googletest, they're added in the places where they are defined enable_testing() # Autoboost headers required everywhere but on MSVC 2015+, which doesn't rely on filesystem if(NOT MSVC OR MSVC_VERSION LESS 1900) install( DIRECTORY ${PROJECT_SOURCE_DIR}/contrib/autoboost/autoboost DESTINATION include COMPONENT autowiring ) endif() add_definitions(-DGTEST_HAS_TR1_TUPLE=0) # Recurse through source directories include_directories( contrib contrib/websocketpp ) add_subdirectory(contrib) add_subdirectory(src) # Now we can generate the version and install stuff generate_version() combined_installer( VENDOR "Leap Motion" CONTACT "cmercenary@gmail.com" )
cmake_minimum_required(VERSION 3.8) include(version.cmake) include("standard/Standard.cmake") standard_project(Autowiring VERSION ${autowiring_VERSION}) list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake-modules") include(AddPCH) include(ConditionalSources) set_property(GLOBAL PROPERTY USE_FOLDERS ON) # We have unit test projects via googletest, they're added in the places where they are defined enable_testing() # Autoboost headers required everywhere but on MSVC 2015+, which doesn't rely on filesystem if(NOT MSVC OR MSVC_VERSION LESS 1900) install( DIRECTORY ${PROJECT_SOURCE_DIR}/contrib/autoboost/autoboost DESTINATION include COMPONENT autowiring ) endif() # Enable conformance mode for newer versions of MSVC if(MSVC_VERSION GREATER 1910) string(APPEND CMAKE_CXX_FLAGS " /d1parsePackExpressions-") endif() add_definitions(-DGTEST_HAS_TR1_TUPLE=0) # Recurse through source directories include_directories( contrib contrib/websocketpp ) add_subdirectory(contrib) add_subdirectory(src) # Now we can generate the version and install stuff generate_version() combined_installer( VENDOR "Leap Motion" CONTACT "cmercenary@gmail.com" )
Enable conformance mode for newer versions of visual studio - required for tuple due to changes in parameter expansion logic
Enable conformance mode for newer versions of visual studio - required for tuple due to changes in parameter expansion logic
Text
apache-2.0
leapmotion/autowiring,leapmotion/autowiring,leapmotion/autowiring,leapmotion/autowiring,leapmotion/autowiring,leapmotion/autowiring
text
## Code Before: cmake_minimum_required(VERSION 3.8) include(version.cmake) include("standard/Standard.cmake") standard_project(Autowiring VERSION ${autowiring_VERSION}) list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake-modules") include(AddPCH) include(ConditionalSources) set_property(GLOBAL PROPERTY USE_FOLDERS ON) # We have unit test projects via googletest, they're added in the places where they are defined enable_testing() # Autoboost headers required everywhere but on MSVC 2015+, which doesn't rely on filesystem if(NOT MSVC OR MSVC_VERSION LESS 1900) install( DIRECTORY ${PROJECT_SOURCE_DIR}/contrib/autoboost/autoboost DESTINATION include COMPONENT autowiring ) endif() add_definitions(-DGTEST_HAS_TR1_TUPLE=0) # Recurse through source directories include_directories( contrib contrib/websocketpp ) add_subdirectory(contrib) add_subdirectory(src) # Now we can generate the version and install stuff generate_version() combined_installer( VENDOR "Leap Motion" CONTACT "cmercenary@gmail.com" ) ## Instruction: Enable conformance mode for newer versions of visual studio - required for tuple due to changes in parameter expansion logic ## Code After: cmake_minimum_required(VERSION 3.8) include(version.cmake) include("standard/Standard.cmake") standard_project(Autowiring VERSION ${autowiring_VERSION}) list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake-modules") include(AddPCH) include(ConditionalSources) set_property(GLOBAL PROPERTY USE_FOLDERS ON) # We have unit test projects via googletest, they're added in the places where they are defined enable_testing() # Autoboost headers required everywhere but on MSVC 2015+, which doesn't rely on filesystem if(NOT MSVC OR MSVC_VERSION LESS 1900) install( DIRECTORY ${PROJECT_SOURCE_DIR}/contrib/autoboost/autoboost DESTINATION include COMPONENT autowiring ) endif() # Enable conformance mode for newer versions of MSVC if(MSVC_VERSION GREATER 1910) string(APPEND CMAKE_CXX_FLAGS " /d1parsePackExpressions-") endif() add_definitions(-DGTEST_HAS_TR1_TUPLE=0) # Recurse through source directories include_directories( contrib contrib/websocketpp ) add_subdirectory(contrib) add_subdirectory(src) # Now we can generate the version and install stuff generate_version() combined_installer( VENDOR "Leap Motion" CONTACT "cmercenary@gmail.com" )
36ce794e52268fa61b2b794b673b7adb9b381889
utils/C++Tests/LLVM-Syntax/lit.local.cfg
utils/C++Tests/LLVM-Syntax/lit.local.cfg
def getRoot(config): if not config.parent: return config return getRoot(config.parent) root = getRoot(config) # testFormat: The test format to use to interpret tests. config.test_format = lit.formats.SyntaxCheckTest(compiler=root.clang, dir='%s/include/llvm' % root.llvm_src_root, recursive=False, pattern='^(.*\\.h|[^.]*)$', excludes=['DAGISelHeader.h', 'AIXDataTypesFix.h', 'Solaris.h'], extra_cxx_args=['-D__STDC_LIMIT_MACROS', '-D__STDC_CONSTANT_MACROS', '-Wno-sign-compare', '-I%s/include' % root.llvm_src_root, '-I%s/include' % root.llvm_obj_root]) config.excludes = ['AbstractTypeUser.h']
def getRoot(config): if not config.parent: return config return getRoot(config.parent) root = getRoot(config) # testFormat: The test format to use to interpret tests. config.test_format = lit.formats.SyntaxCheckTest(compiler=root.clang, dir='%s/include/llvm' % root.llvm_src_root, recursive=False, pattern='^(.*\\.h|[^.]*)$', extra_cxx_args=['-D__STDC_LIMIT_MACROS', '-D__STDC_CONSTANT_MACROS', '-Wno-sign-compare', '-I%s/include' % root.llvm_src_root, '-I%s/include' % root.llvm_obj_root]) config.excludes = ['AbstractTypeUser.h', 'DAGISelHeader.h', 'AIXDataTypesFix.h', 'Solaris.h']
Use the other excludes syntax.
Use the other excludes syntax. git-svn-id: ffe668792ed300d6c2daa1f6eba2e0aa28d7ec6c@88836 91177308-0d34-0410-b5e6-96231b3b80d8
INI
apache-2.0
llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang,apple/swift-clang,apple/swift-clang,apple/swift-clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,apple/swift-clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang,apple/swift-clang,llvm-mirror/clang,llvm-mirror/clang
ini
## Code Before: def getRoot(config): if not config.parent: return config return getRoot(config.parent) root = getRoot(config) # testFormat: The test format to use to interpret tests. config.test_format = lit.formats.SyntaxCheckTest(compiler=root.clang, dir='%s/include/llvm' % root.llvm_src_root, recursive=False, pattern='^(.*\\.h|[^.]*)$', excludes=['DAGISelHeader.h', 'AIXDataTypesFix.h', 'Solaris.h'], extra_cxx_args=['-D__STDC_LIMIT_MACROS', '-D__STDC_CONSTANT_MACROS', '-Wno-sign-compare', '-I%s/include' % root.llvm_src_root, '-I%s/include' % root.llvm_obj_root]) config.excludes = ['AbstractTypeUser.h'] ## Instruction: Use the other excludes syntax. git-svn-id: ffe668792ed300d6c2daa1f6eba2e0aa28d7ec6c@88836 91177308-0d34-0410-b5e6-96231b3b80d8 ## Code After: def getRoot(config): if not config.parent: return config return getRoot(config.parent) root = getRoot(config) # testFormat: The test format to use to interpret tests. config.test_format = lit.formats.SyntaxCheckTest(compiler=root.clang, dir='%s/include/llvm' % root.llvm_src_root, recursive=False, pattern='^(.*\\.h|[^.]*)$', extra_cxx_args=['-D__STDC_LIMIT_MACROS', '-D__STDC_CONSTANT_MACROS', '-Wno-sign-compare', '-I%s/include' % root.llvm_src_root, '-I%s/include' % root.llvm_obj_root]) config.excludes = ['AbstractTypeUser.h', 'DAGISelHeader.h', 'AIXDataTypesFix.h', 'Solaris.h']
99c4c301710daa555bb95e4aa278d6a6a294678a
lib/AppleClient.php
lib/AppleClient.php
<?php namespace Busuu\IosReceiptsApi; use GuzzleHttp\Client; class AppleClient { /** @var Client $client */ private $client; /** @var string */ private $password; public function __construct($password) { $this->client = new Client(); $this->password = $password; } /** * Fetch the receipt from apple * * @param $receiptData * @param $endpoint * @return array */ public function fetchReceipt($receiptData, $endpoint) { try { $data = [ 'password' => $this->password, 'receipt-data' => $receiptData ]; $response = $this->client->post($endpoint, ['body' => json_encode($data)]); return json_decode($response->getBody(), true); } catch (\Exception $e) { throw new \InvalidArgumentException( sprintf('Error in the communication with Apple - %s', $e->getMessage()) ); } } }
<?php namespace Busuu\IosReceiptsApi; use Exception; use GuzzleHttp\Client; class AppleClient { /** @var Client $client */ private $client; /** @var string */ private $password; public function __construct($password) { $this->client = new Client(); $this->password = $password; } /** * Fetch the receipt from apple * * @param $receiptData * @param $endpoint * * @return array * @throws Exception */ public function fetchReceipt($receiptData, $endpoint) { $data = [ 'password' => $this->password, 'receipt-data' => $receiptData ]; $response = $this->client->post($endpoint, [ 'body' => json_encode($data), 'timeout' => 10 ]); $jsonResponse = json_decode($response->getBody(), true); if (null !== $jsonResponse) { return $jsonResponse; } throw new Exception(sprintf('Invalid Response from Apple Server: %s', $response)); } }
Add error checking on invalid json
Add error checking on invalid json
PHP
mit
Busuu/ios-receipts-api
php
## Code Before: <?php namespace Busuu\IosReceiptsApi; use GuzzleHttp\Client; class AppleClient { /** @var Client $client */ private $client; /** @var string */ private $password; public function __construct($password) { $this->client = new Client(); $this->password = $password; } /** * Fetch the receipt from apple * * @param $receiptData * @param $endpoint * @return array */ public function fetchReceipt($receiptData, $endpoint) { try { $data = [ 'password' => $this->password, 'receipt-data' => $receiptData ]; $response = $this->client->post($endpoint, ['body' => json_encode($data)]); return json_decode($response->getBody(), true); } catch (\Exception $e) { throw new \InvalidArgumentException( sprintf('Error in the communication with Apple - %s', $e->getMessage()) ); } } } ## Instruction: Add error checking on invalid json ## Code After: <?php namespace Busuu\IosReceiptsApi; use Exception; use GuzzleHttp\Client; class AppleClient { /** @var Client $client */ private $client; /** @var string */ private $password; public function __construct($password) { $this->client = new Client(); $this->password = $password; } /** * Fetch the receipt from apple * * @param $receiptData * @param $endpoint * * @return array * @throws Exception */ public function fetchReceipt($receiptData, $endpoint) { $data = [ 'password' => $this->password, 'receipt-data' => $receiptData ]; $response = $this->client->post($endpoint, [ 'body' => json_encode($data), 'timeout' => 10 ]); $jsonResponse = json_decode($response->getBody(), true); if (null !== $jsonResponse) { return $jsonResponse; } throw new Exception(sprintf('Invalid Response from Apple Server: %s', $response)); } }
39f98999d18d28ba9dec77aac6786f9da2293287
src/pz_discover/util.clj
src/pz_discover/util.clj
(ns pz-discover.util
  (:require [clojure.tools.logging :as log]
            [zookeeper :as zk]
            [pz-discover.models.services :as sm]))

(defn setup-zk-env!
  [client chroot]
  (let [names-node (format "%s/%s" chroot "names")
        types-node (format "%s/%s" chroot "types")]
    (when-not (zk/exists client chroot)
      (zk/create client chroot :persistent? true))
    (when-not (zk/exists client names-node)
      (zk/create client names-node :persistent? true))
    (when-not (zk/exists client types-node)
      (zk/create client types-node :persistent? true))))

(defn register-kafka!
  [client kafka-config]
  (let [kafka-data {:type "infrastructure"
                    :brokers (get-in kafka-config [:producer "bootstrap.servers"])}]
    (sm/register-by-name client "kafka" kafka-data)))

(defn register-zookeeper!
  [client zookeeper-config]
  (let [zk-data (assoc zookeeper-config :type "infrastructure")]
    (sm/register-by-name client "zookeeper" zk-data)))
(ns pz-discover.util
  (:require [clojure.tools.logging :as log]
            [zookeeper :as zk]
            [pz-discover.models.services :as sm]))

(defn setup-zk-env!
  [client chroot]
  (let [names-node (format "%s/%s" chroot "names")
        types-node (format "%s/%s" chroot "types")]
    (when-not (zk/exists client chroot)
      (zk/create client chroot :persistent? true))
    (when-not (zk/exists client names-node)
      (zk/create client names-node :persistent? true))
    (when-not (zk/exists client types-node)
      (zk/create client types-node :persistent? true))))

(defn register-kafka!
  [client kafka-config]
  (let [kafka-data {:type "infrastructure"
                    :host (get-in kafka-config [:producer "bootstrap.servers"])}]
    (when-not (sm/register-by-name client "kafka" kafka-data)
      (sm/update-by-name client "kafka" kafka-data))))

(defn register-zookeeper!
  [client zookeeper-config]
  (let [zk-normalized {:host (format "%s:%s%s"
                                     (:host zookeeper-config)
                                     (:port zookeeper-config)
                                     (:chroot zookeeper-config))}
        zk-data (assoc zk-normalized :type "infrastructure")]
    (when-not (sm/register-by-name client "zookeeper" zk-data)
      (sm/update-by-name client "zookeeper" zk-data))))
Change format of infrastructure data, update if already registered.
Change format of infrastructure data, update if already registered.
Clojure
epl-1.0
venicegeo/pz-discover
clojure
## Code Before:
(ns pz-discover.util
  (:require [clojure.tools.logging :as log]
            [zookeeper :as zk]
            [pz-discover.models.services :as sm]))

(defn setup-zk-env!
  [client chroot]
  (let [names-node (format "%s/%s" chroot "names")
        types-node (format "%s/%s" chroot "types")]
    (when-not (zk/exists client chroot)
      (zk/create client chroot :persistent? true))
    (when-not (zk/exists client names-node)
      (zk/create client names-node :persistent? true))
    (when-not (zk/exists client types-node)
      (zk/create client types-node :persistent? true))))

(defn register-kafka!
  [client kafka-config]
  (let [kafka-data {:type "infrastructure"
                    :brokers (get-in kafka-config [:producer "bootstrap.servers"])}]
    (sm/register-by-name client "kafka" kafka-data)))

(defn register-zookeeper!
  [client zookeeper-config]
  (let [zk-data (assoc zookeeper-config :type "infrastructure")]
    (sm/register-by-name client "zookeeper" zk-data)))

## Instruction:
Change format of infrastructure data, update if already registered.

## Code After:
(ns pz-discover.util
  (:require [clojure.tools.logging :as log]
            [zookeeper :as zk]
            [pz-discover.models.services :as sm]))

(defn setup-zk-env!
  [client chroot]
  (let [names-node (format "%s/%s" chroot "names")
        types-node (format "%s/%s" chroot "types")]
    (when-not (zk/exists client chroot)
      (zk/create client chroot :persistent? true))
    (when-not (zk/exists client names-node)
      (zk/create client names-node :persistent? true))
    (when-not (zk/exists client types-node)
      (zk/create client types-node :persistent? true))))

(defn register-kafka!
  [client kafka-config]
  (let [kafka-data {:type "infrastructure"
                    :host (get-in kafka-config [:producer "bootstrap.servers"])}]
    (when-not (sm/register-by-name client "kafka" kafka-data)
      (sm/update-by-name client "kafka" kafka-data))))

(defn register-zookeeper!
  [client zookeeper-config]
  (let [zk-normalized {:host (format "%s:%s%s"
                                     (:host zookeeper-config)
                                     (:port zookeeper-config)
                                     (:chroot zookeeper-config))}
        zk-data (assoc zk-normalized :type "infrastructure")]
    (when-not (sm/register-by-name client "zookeeper" zk-data)
      (sm/update-by-name client "zookeeper" zk-data))))
d639144288cc60d91d9ed0de76458be0df1fa860
vagrant/provision.sh
vagrant/provision.sh
echo "root:vagrant" | chpasswd

# Update package cache
apt-get update

# Install packages
apt-get install -y python-setuptools
apt-get install -y python-pip
apt-get install -y libpq-dev
apt-get install -y python-dev
pip install -r /rgserver/requirements.txt
apt-get install -y postgresql
apt-get install -y postgresql-client
apt-get install -y nginx

cat /rgserver/vagrant/pg_hba.conf > /etc/postgresql/9.4/main/pg_hba.conf
/etc/init.d/postgresql restart
sudo -u postgres createuser --createdb robot
sudo -u postgres /usr/lib/postgresql/9.4/bin/createdb robot
sudo -u postgres /usr/lib/postgresql/9.4/bin/createdb robotgame
psql -U robot -h localhost < /rgserver/config/db_schema.sql

cp /rgserver/vagrant/dbcon_vagrant.py /rgserver/dbcon.py

cp /rgserver/vagrant/rgserver.conf /etc/nginx/sites-available/rgserver.conf
rm /etc/nginx/sites-enabled/default
ln -s /etc/nginx/sites-available/rgserver.conf /etc/nginx/sites-enabled/rgserver.conf

cp /rgserver/vagrant/rg /usr/bin/rg

rg restart
echo "root:vagrant" | chpasswd

# Update package cache
apt-get update

# Install packages
apt-get install -y python-setuptools
apt-get install -y python-pip
apt-get install -y libpq-dev
apt-get install -y python-dev
pip install -r /rgserver/requirements.txt
apt-get install -y postgresql
apt-get install -y postgresql-client
apt-get install -y nginx
apt-get install -y dos2unix

cat /rgserver/vagrant/pg_hba.conf > /etc/postgresql/9.4/main/pg_hba.conf
/etc/init.d/postgresql restart
sudo -u postgres createuser --createdb robot
sudo -u postgres /usr/lib/postgresql/9.4/bin/createdb robot
sudo -u postgres /usr/lib/postgresql/9.4/bin/createdb robotgame
psql -U robot -h localhost < /rgserver/config/db_schema.sql

cp /rgserver/vagrant/dbcon_vagrant.py /rgserver/dbcon.py

cp /rgserver/vagrant/rgserver.conf /etc/nginx/sites-available/rgserver.conf
rm /etc/nginx/sites-enabled/default
ln -s /etc/nginx/sites-available/rgserver.conf /etc/nginx/sites-enabled/rgserver.conf

cp /rgserver/vagrant/rg /usr/bin/rg
dos2unix /usr/bin/rg

rg restart
Add dos2unix conversion for Windows vagrant.
Add dos2unix conversion for Windows vagrant.
Shell
mit
koduj-z-klasa/rgserver,RobotGame/rgserver,RobotGame/rgserver,RobotGame/rgserver,koduj-z-klasa/rgserver,RobotGame/rgserver,koduj-z-klasa/rgserver,koduj-z-klasa/rgserver
shell
## Code Before:
echo "root:vagrant" | chpasswd

# Update package cache
apt-get update

# Install packages
apt-get install -y python-setuptools
apt-get install -y python-pip
apt-get install -y libpq-dev
apt-get install -y python-dev
pip install -r /rgserver/requirements.txt
apt-get install -y postgresql
apt-get install -y postgresql-client
apt-get install -y nginx

cat /rgserver/vagrant/pg_hba.conf > /etc/postgresql/9.4/main/pg_hba.conf
/etc/init.d/postgresql restart
sudo -u postgres createuser --createdb robot
sudo -u postgres /usr/lib/postgresql/9.4/bin/createdb robot
sudo -u postgres /usr/lib/postgresql/9.4/bin/createdb robotgame
psql -U robot -h localhost < /rgserver/config/db_schema.sql

cp /rgserver/vagrant/dbcon_vagrant.py /rgserver/dbcon.py

cp /rgserver/vagrant/rgserver.conf /etc/nginx/sites-available/rgserver.conf
rm /etc/nginx/sites-enabled/default
ln -s /etc/nginx/sites-available/rgserver.conf /etc/nginx/sites-enabled/rgserver.conf

cp /rgserver/vagrant/rg /usr/bin/rg

rg restart

## Instruction:
Add dos2unix conversion for Windows vagrant.

## Code After:
echo "root:vagrant" | chpasswd

# Update package cache
apt-get update

# Install packages
apt-get install -y python-setuptools
apt-get install -y python-pip
apt-get install -y libpq-dev
apt-get install -y python-dev
pip install -r /rgserver/requirements.txt
apt-get install -y postgresql
apt-get install -y postgresql-client
apt-get install -y nginx
apt-get install -y dos2unix

cat /rgserver/vagrant/pg_hba.conf > /etc/postgresql/9.4/main/pg_hba.conf
/etc/init.d/postgresql restart
sudo -u postgres createuser --createdb robot
sudo -u postgres /usr/lib/postgresql/9.4/bin/createdb robot
sudo -u postgres /usr/lib/postgresql/9.4/bin/createdb robotgame
psql -U robot -h localhost < /rgserver/config/db_schema.sql

cp /rgserver/vagrant/dbcon_vagrant.py /rgserver/dbcon.py

cp /rgserver/vagrant/rgserver.conf /etc/nginx/sites-available/rgserver.conf
rm /etc/nginx/sites-enabled/default
ln -s /etc/nginx/sites-available/rgserver.conf /etc/nginx/sites-enabled/rgserver.conf

cp /rgserver/vagrant/rg /usr/bin/rg
dos2unix /usr/bin/rg

rg restart
3e15fb7770c1bacca0f7feadb48c7df64acd85e2
examples/amazon/dynamodb/list-tables.js
examples/amazon/dynamodb/list-tables.js
var fmt = require('fmt');
var awssum = require('awssum');
var amazon = awssum.load('amazon/amazon');
var DynamoDB = awssum.load('amazon/dynamodb').DynamoDB;

var env = process.env;
var accessKeyId = env.ACCESS_KEY_ID;
var secretAccessKey = env.SECRET_ACCESS_KEY;
var awsAccountId = env.AWS_ACCOUNT_ID;

var ddb = new DynamoDB({
    'accessKeyId' : accessKeyId,
    'secretAccessKey' : secretAccessKey,
    'awsAccountId' : awsAccountId,
    'region' : amazon.US_EAST_1
});

fmt.field('Region', ddb.region() );
fmt.field('EndPoint', ddb.host() );
fmt.field('AccessKeyId', ddb.accessKeyId() );
fmt.field('SecretAccessKey', ddb.secretAccessKey().substr(0, 3) + '...' );
fmt.field('AwsAccountId', ddb.awsAccountId() );

ddb.ListTables(function(err, data) {
    fmt.msg("listing all the tables - expecting success");
    fmt.dump(err, 'Error');
    fmt.dump(data, 'Data');
});
var fmt = require('fmt');
var awssum = require('awssum');
var amazon = awssum.load('amazon/amazon');
var DynamoDB = awssum.load('amazon/dynamodb').DynamoDB;

var env = process.env;
var accessKeyId = env.ACCESS_KEY_ID;
var secretAccessKey = env.SECRET_ACCESS_KEY;
var awsAccountId = env.AWS_ACCOUNT_ID;

var ddb = new DynamoDB({
    'accessKeyId' : accessKeyId,
    'secretAccessKey' : secretAccessKey,
    'awsAccountId' : awsAccountId,
    'region' : amazon.US_EAST_1
});

var ddbWest = new DynamoDB({
    'accessKeyId' : accessKeyId,
    'secretAccessKey' : secretAccessKey,
    'awsAccountId' : awsAccountId,
    'region' : amazon.US_WEST_2
});

fmt.field('Region', ddb.region() );
fmt.field('EndPoint', ddb.host() );
fmt.field('AccessKeyId', ddb.accessKeyId() );
fmt.field('SecretAccessKey', ddb.secretAccessKey().substr(0, 3) + '...' );
fmt.field('AwsAccountId', ddb.awsAccountId() );

ddb.ListTables(function(err, data) {
    fmt.msg("listing all the tables in us-east-1 - expecting success");
    fmt.dump(err, 'Error');
    fmt.dump(data, 'Data');
});

ddbWest.ListTables(function(err, data) {
    fmt.msg("listing all the tables in us-west-1 - expecting success");
    fmt.dump(err, 'Error');
    fmt.dump(data, 'Data');
});
Add a test so that other regions are fine with DynamoDB using the STS service
Add a test so that other regions are fine with DynamoDB using the STS service
JavaScript
mit
chilts/awssum
javascript
## Code Before:
var fmt = require('fmt');
var awssum = require('awssum');
var amazon = awssum.load('amazon/amazon');
var DynamoDB = awssum.load('amazon/dynamodb').DynamoDB;

var env = process.env;
var accessKeyId = env.ACCESS_KEY_ID;
var secretAccessKey = env.SECRET_ACCESS_KEY;
var awsAccountId = env.AWS_ACCOUNT_ID;

var ddb = new DynamoDB({
    'accessKeyId' : accessKeyId,
    'secretAccessKey' : secretAccessKey,
    'awsAccountId' : awsAccountId,
    'region' : amazon.US_EAST_1
});

fmt.field('Region', ddb.region() );
fmt.field('EndPoint', ddb.host() );
fmt.field('AccessKeyId', ddb.accessKeyId() );
fmt.field('SecretAccessKey', ddb.secretAccessKey().substr(0, 3) + '...' );
fmt.field('AwsAccountId', ddb.awsAccountId() );

ddb.ListTables(function(err, data) {
    fmt.msg("listing all the tables - expecting success");
    fmt.dump(err, 'Error');
    fmt.dump(data, 'Data');
});

## Instruction:
Add a test so that other regions are fine with DynamoDB using the STS service

## Code After:
var fmt = require('fmt');
var awssum = require('awssum');
var amazon = awssum.load('amazon/amazon');
var DynamoDB = awssum.load('amazon/dynamodb').DynamoDB;

var env = process.env;
var accessKeyId = env.ACCESS_KEY_ID;
var secretAccessKey = env.SECRET_ACCESS_KEY;
var awsAccountId = env.AWS_ACCOUNT_ID;

var ddb = new DynamoDB({
    'accessKeyId' : accessKeyId,
    'secretAccessKey' : secretAccessKey,
    'awsAccountId' : awsAccountId,
    'region' : amazon.US_EAST_1
});

var ddbWest = new DynamoDB({
    'accessKeyId' : accessKeyId,
    'secretAccessKey' : secretAccessKey,
    'awsAccountId' : awsAccountId,
    'region' : amazon.US_WEST_2
});

fmt.field('Region', ddb.region() );
fmt.field('EndPoint', ddb.host() );
fmt.field('AccessKeyId', ddb.accessKeyId() );
fmt.field('SecretAccessKey', ddb.secretAccessKey().substr(0, 3) + '...' );
fmt.field('AwsAccountId', ddb.awsAccountId() );

ddb.ListTables(function(err, data) {
    fmt.msg("listing all the tables in us-east-1 - expecting success");
    fmt.dump(err, 'Error');
    fmt.dump(data, 'Data');
});

ddbWest.ListTables(function(err, data) {
    fmt.msg("listing all the tables in us-west-1 - expecting success");
    fmt.dump(err, 'Error');
    fmt.dump(data, 'Data');
});
5c6dbfd8077038727c1379d727ed71a3de507013
spec/plantuml-viewer-view-spec.js
spec/plantuml-viewer-view-spec.js
/* global jasmine atom beforeEach waitsForPromise waitsFor runs describe it expect */
'use strict'

var PlantumlViewerEditor = require('../lib/plantuml-viewer-editor')
var PlantumlViewerView = require('../lib/plantuml-viewer-view')

describe('PlantumlViewerView', function () {
  var editor
  var view

  beforeEach(function () {
    jasmine.useRealClock()
    waitsForPromise(function () {
      return atom.workspace.open('file.puml')
    })
    runs(function () {
      editor = atom.workspace.getActiveTextEditor()
    })
    waitsFor(function (done) {
      editor.onDidStopChanging(done)
    })
    runs(function () {
      var viewerEditor = new PlantumlViewerEditor('uri', editor.id)
      view = new PlantumlViewerView(viewerEditor)
      jasmine.attachToDOM(view.element)
    })
    waitsFor(function () {
      return view.html().indexOf('svg') !== -1
    })
  })

  it('should contain svg generated from text editor', function () {
    runs(function () {
      expect(view.html()).toContain('svg')
    })
  })

  describe('when the editor text is modified', function () {
    it('should display an updated image', function () {
      var previousHtml
      runs(function () {
        previousHtml = view.html()
        editor.getBuffer().setText('A -> C')
      })
      waitsFor(function () {
        return view.html() !== previousHtml
      })
      runs(function () {
        expect(view.html()).not.toBe(previousHtml)
      })
    })
  })
})
/* global jasmine atom beforeEach waitsForPromise waitsFor runs describe it expect */
'use strict'

var PlantumlViewerEditor = require('../lib/plantuml-viewer-editor')
var PlantumlViewerView = require('../lib/plantuml-viewer-view')

describe('PlantumlViewerView', function () {
  var editor
  var view

  beforeEach(function () {
    jasmine.useRealClock()
    atom.config.set('plantuml-viewer.liveUpdate', true)
    waitsForPromise(function () {
      return atom.workspace.open('file.puml')
    })
    runs(function () {
      editor = atom.workspace.getActiveTextEditor()
    })
    waitsFor(function (done) {
      editor.onDidStopChanging(done)
    })
    runs(function () {
      var viewerEditor = new PlantumlViewerEditor('uri', editor.id)
      view = new PlantumlViewerView(viewerEditor)
      jasmine.attachToDOM(view.element)
    })
    waitsFor(function () {
      return view.html().indexOf('svg') !== -1
    })
  })

  it('should contain svg generated from text editor', function () {
    runs(function () {
      expect(view.html()).toContain('svg')
    })
  })

  describe('when the editor text is modified', function () {
    it('should display an updated image', function () {
      var previousHtml
      runs(function () {
        previousHtml = view.html()
        editor.getBuffer().setText('A -> C')
      })
      waitsFor(function () {
        return view.html() !== previousHtml
      })
      runs(function () {
        expect(view.html()).not.toBe(previousHtml)
      })
    })
  })
})
Fix failing update on change test
Fix failing update on change test

Regression was not run before the last commit. When testing the view standalone, the liveUpdate configuration had to be set.
JavaScript
mit
markushedvall/plantuml-viewer
javascript
## Code Before:
/* global jasmine atom beforeEach waitsForPromise waitsFor runs describe it expect */
'use strict'

var PlantumlViewerEditor = require('../lib/plantuml-viewer-editor')
var PlantumlViewerView = require('../lib/plantuml-viewer-view')

describe('PlantumlViewerView', function () {
  var editor
  var view

  beforeEach(function () {
    jasmine.useRealClock()
    waitsForPromise(function () {
      return atom.workspace.open('file.puml')
    })
    runs(function () {
      editor = atom.workspace.getActiveTextEditor()
    })
    waitsFor(function (done) {
      editor.onDidStopChanging(done)
    })
    runs(function () {
      var viewerEditor = new PlantumlViewerEditor('uri', editor.id)
      view = new PlantumlViewerView(viewerEditor)
      jasmine.attachToDOM(view.element)
    })
    waitsFor(function () {
      return view.html().indexOf('svg') !== -1
    })
  })

  it('should contain svg generated from text editor', function () {
    runs(function () {
      expect(view.html()).toContain('svg')
    })
  })

  describe('when the editor text is modified', function () {
    it('should display an updated image', function () {
      var previousHtml
      runs(function () {
        previousHtml = view.html()
        editor.getBuffer().setText('A -> C')
      })
      waitsFor(function () {
        return view.html() !== previousHtml
      })
      runs(function () {
        expect(view.html()).not.toBe(previousHtml)
      })
    })
  })
})

## Instruction:
Fix failing update on change test

Regression was not run before the last commit. When testing the view standalone, the liveUpdate configuration had to be set.

## Code After:
/* global jasmine atom beforeEach waitsForPromise waitsFor runs describe it expect */
'use strict'

var PlantumlViewerEditor = require('../lib/plantuml-viewer-editor')
var PlantumlViewerView = require('../lib/plantuml-viewer-view')

describe('PlantumlViewerView', function () {
  var editor
  var view

  beforeEach(function () {
    jasmine.useRealClock()
    atom.config.set('plantuml-viewer.liveUpdate', true)
    waitsForPromise(function () {
      return atom.workspace.open('file.puml')
    })
    runs(function () {
      editor = atom.workspace.getActiveTextEditor()
    })
    waitsFor(function (done) {
      editor.onDidStopChanging(done)
    })
    runs(function () {
      var viewerEditor = new PlantumlViewerEditor('uri', editor.id)
      view = new PlantumlViewerView(viewerEditor)
      jasmine.attachToDOM(view.element)
    })
    waitsFor(function () {
      return view.html().indexOf('svg') !== -1
    })
  })

  it('should contain svg generated from text editor', function () {
    runs(function () {
      expect(view.html()).toContain('svg')
    })
  })

  describe('when the editor text is modified', function () {
    it('should display an updated image', function () {
      var previousHtml
      runs(function () {
        previousHtml = view.html()
        editor.getBuffer().setText('A -> C')
      })
      waitsFor(function () {
        return view.html() !== previousHtml
      })
      runs(function () {
        expect(view.html()).not.toBe(previousHtml)
      })
    })
  })
})
b76d33fb38afaa63008cd2e4177fe7ebfbd15d54
metadata/de.wellenvogel.bonjourbrowser.yml
metadata/de.wellenvogel.bonjourbrowser.yml
Categories:
  - Internet
License: MIT
AuthorName: Andreas Vogel
AuthorEmail: andreas@wellenvogel.net
AuthorWebSite: https://www.wellenvogel.de/
SourceCode: https://github.com/wellenvogel/BonjourBrowser
IssueTracker: https://github.com/wellenvogel/BonjourBrowser/issues

AutoName: BonjourBrowser

RepoType: git
Repo: https://github.com/wellenvogel/BonjourBrowser

Builds:
  - versionName: '1.6'
    versionCode: 106
    commit: release-106
    subdir: app
    gradle:
      - yes

AutoUpdateMode: Version release-%c
UpdateCheckMode: Tags
CurrentVersion: '1.6'
CurrentVersionCode: 106
Categories:
  - Internet
License: MIT
AuthorName: Andreas Vogel
AuthorEmail: andreas@wellenvogel.net
AuthorWebSite: https://www.wellenvogel.de/
SourceCode: https://github.com/wellenvogel/BonjourBrowser
IssueTracker: https://github.com/wellenvogel/BonjourBrowser/issues

AutoName: BonjourBrowser

RepoType: git
Repo: https://github.com/wellenvogel/BonjourBrowser

Builds:
  - versionName: '1.6'
    versionCode: 106
    commit: release-106
    subdir: app
    gradle:
      - yes
  - versionName: '1.7'
    versionCode: 107
    commit: release-107
    subdir: app
    gradle:
      - yes

AutoUpdateMode: Version release-%c
UpdateCheckMode: Tags
CurrentVersion: '1.7'
CurrentVersionCode: 107
Update BonjourBrowser to 1.7 (107)
Update BonjourBrowser to 1.7 (107)
YAML
agpl-3.0
f-droid/fdroiddata,f-droid/fdroiddata
yaml
## Code Before:
Categories:
  - Internet
License: MIT
AuthorName: Andreas Vogel
AuthorEmail: andreas@wellenvogel.net
AuthorWebSite: https://www.wellenvogel.de/
SourceCode: https://github.com/wellenvogel/BonjourBrowser
IssueTracker: https://github.com/wellenvogel/BonjourBrowser/issues

AutoName: BonjourBrowser

RepoType: git
Repo: https://github.com/wellenvogel/BonjourBrowser

Builds:
  - versionName: '1.6'
    versionCode: 106
    commit: release-106
    subdir: app
    gradle:
      - yes

AutoUpdateMode: Version release-%c
UpdateCheckMode: Tags
CurrentVersion: '1.6'
CurrentVersionCode: 106

## Instruction:
Update BonjourBrowser to 1.7 (107)

## Code After:
Categories:
  - Internet
License: MIT
AuthorName: Andreas Vogel
AuthorEmail: andreas@wellenvogel.net
AuthorWebSite: https://www.wellenvogel.de/
SourceCode: https://github.com/wellenvogel/BonjourBrowser
IssueTracker: https://github.com/wellenvogel/BonjourBrowser/issues

AutoName: BonjourBrowser

RepoType: git
Repo: https://github.com/wellenvogel/BonjourBrowser

Builds:
  - versionName: '1.6'
    versionCode: 106
    commit: release-106
    subdir: app
    gradle:
      - yes
  - versionName: '1.7'
    versionCode: 107
    commit: release-107
    subdir: app
    gradle:
      - yes

AutoUpdateMode: Version release-%c
UpdateCheckMode: Tags
CurrentVersion: '1.7'
CurrentVersionCode: 107
e5c7baaef9d5ca2bcd8c9214510b8cefb295b3bd
install.sh
install.sh
if [ "$EUID" -ne 0 ]
then
  echo "[Error] Please run as root"
  exit
fi

rm -r /usr/share/tweelectron
cp -r ./dist/Tweelectron-linux-x64/ /usr/share/tweelectron
echo "Done"
if [ "$EUID" -ne 0 ]
then
  echo "[Error] Please run as root"
  exit
fi

if [ -d "/usr/share/tweelectron" ]
then
  echo "Old version exists. Removing..."
  rm -r /usr/share/tweelectron
fi

echo "Copying new files"
cp -r ./dist/Tweelectron-linux-x64/ /usr/share/tweelectron
echo "Done"
Check if dir exists before removing
Check if dir exists before removing
Shell
mit
Plastikmensch/Tweelectron,Plastikmensch/Tweelectron,Plastikmensch/Tweelectron
shell
## Code Before:
if [ "$EUID" -ne 0 ]
then
  echo "[Error] Please run as root"
  exit
fi

rm -r /usr/share/tweelectron
cp -r ./dist/Tweelectron-linux-x64/ /usr/share/tweelectron
echo "Done"

## Instruction:
Check if dir exists before removing

## Code After:
if [ "$EUID" -ne 0 ]
then
  echo "[Error] Please run as root"
  exit
fi

if [ -d "/usr/share/tweelectron" ]
then
  echo "Old version exists. Removing..."
  rm -r /usr/share/tweelectron
fi

echo "Copying new files"
cp -r ./dist/Tweelectron-linux-x64/ /usr/share/tweelectron
echo "Done"
67277e936c14cf6465d23488d4e02c1759c7f04a
img_create.sh
img_create.sh
if [ "x$1" == "x" ]; then
	echo "Please provide an argument or two"
	return 1;
fi

TARGET=$2
if [ "x$TARGET" == "x" ]; then
	TARGET=`basename "$1"`.binhex
fi

TMPFILE=`mktemp`

echo "Converting $1 to $TARGET"
objcopy -j.text -O binary "$1" "$TMPFILE"
hexdump -v -e '"0x%08x\n"' "$TMPFILE" > "$TARGET"
if [ "x$1" == "x" ]; then
	echo "Please provide an argument or two"
	return 1;
fi

TARGET=$2
if [ "x$TARGET" == "x" ]; then
	TARGET=`basename "$1"`.binhex
fi

TMPFILE=`mktemp`

echo "Converting $1 to $TARGET"
objcopy -j.text -O binary "$1" "$TMPFILE"
readelf -h "$1" | grep "Entry point address:"| awk '{printf "0x%08x\n", strtonum($NF)}' > "$TARGET"
hexdump -v -e '"0x%08x\n"' "$TMPFILE" >> "$TARGET"
Put Entry point address at the beginning of binhex
Put Entry point address at the beginning of binhex

Signed-off-by: Zi Yan <a95843713135ee2454bfe7dac9a681dcd8c52c48@cs.rutgers.edu>
Shell
bsd-3-clause
jvesely/verilog_ALU
shell
## Code Before:
if [ "x$1" == "x" ]; then
	echo "Please provide an argument or two"
	return 1;
fi

TARGET=$2
if [ "x$TARGET" == "x" ]; then
	TARGET=`basename "$1"`.binhex
fi

TMPFILE=`mktemp`

echo "Converting $1 to $TARGET"
objcopy -j.text -O binary "$1" "$TMPFILE"
hexdump -v -e '"0x%08x\n"' "$TMPFILE" > "$TARGET"

## Instruction:
Put Entry point address at the beginning of binhex

Signed-off-by: Zi Yan <a95843713135ee2454bfe7dac9a681dcd8c52c48@cs.rutgers.edu>

## Code After:
if [ "x$1" == "x" ]; then
	echo "Please provide an argument or two"
	return 1;
fi

TARGET=$2
if [ "x$TARGET" == "x" ]; then
	TARGET=`basename "$1"`.binhex
fi

TMPFILE=`mktemp`

echo "Converting $1 to $TARGET"
objcopy -j.text -O binary "$1" "$TMPFILE"
readelf -h "$1" | grep "Entry point address:"| awk '{printf "0x%08x\n", strtonum($NF)}' > "$TARGET"
hexdump -v -e '"0x%08x\n"' "$TMPFILE" >> "$TARGET"
f207a90c1ff48f37a9871211bf5951c96b0f8f44
src/Framework/Http/Middleware/Factory/RequestHandlerFactory.php
src/Framework/Http/Middleware/Factory/RequestHandlerFactory.php
<?php declare(strict_types=1);

namespace Onion\Framework\Http\Middleware\Factory;

use Onion\Framework\Router\Route;
use Psr\Container\ContainerInterface;
use Onion\Framework\Router\Interfaces\RouterInterface;
use Onion\Framework\Dependency\Interfaces\FactoryInterface;
use Onion\Framework\Http\Middleware\RequestHandler;

class RequestHandlerFactory implements FactoryInterface
{
    /** @var Router */
    private $router;

    public function __construct(RouterInterface $router)
    {
        $this->router = $router;
    }

    public function build(ContainerInterface $container)
    {
        if (!$container->has('middleware')) {
            throw new \RuntimeException(
                'Unable to initialize RequestHandler without defined middleware'
            );
        }

        $middlewareGenerator = function () use ($container) {
            $middleware = $container->get('middleware');
            foreach ($middleware as $identifier) {
                $instance = $container->get($identifier);
                assert(
                    is_object($instance) && $instance instanceof MiddlewareInterface,
                    new \TypeError("'{$identifier}' must implement MiddlewareInterface")
                );

                yield $instance;
            }
        };

        return new RequestHandler($middlewareGenerator());
    }
}
<?php declare(strict_types=1);

namespace Onion\Framework\Http\Middleware\Factory;

use Onion\Framework\Router\Route;
use Psr\Container\ContainerInterface;
use Onion\Framework\Dependency\Interfaces\FactoryInterface;
use Onion\Framework\Http\Middleware\RequestHandler;

class RequestHandlerFactory implements FactoryInterface
{
    public function build(ContainerInterface $container)
    {
        if (!$container->has('middleware')) {
            throw new \RuntimeException(
                'Unable to initialize RequestHandler without defined middleware'
            );
        }

        $middlewareGenerator = function () use ($container) {
            $middleware = $container->get('middleware');
            foreach ($middleware as $identifier) {
                $instance = $container->get($identifier);
                assert(
                    is_object($instance) && $instance instanceof MiddlewareInterface,
                    new \TypeError("'{$identifier}' must implement MiddlewareInterface")
                );

                yield $instance;
            }
        };

        return new RequestHandler($middlewareGenerator());
    }
}
Remove obsolete reference to RouterInterface
Remove obsolete reference to RouterInterface
PHP
mit
phOnion/framework
php
## Code Before:
<?php declare(strict_types=1);

namespace Onion\Framework\Http\Middleware\Factory;

use Onion\Framework\Router\Route;
use Psr\Container\ContainerInterface;
use Onion\Framework\Router\Interfaces\RouterInterface;
use Onion\Framework\Dependency\Interfaces\FactoryInterface;
use Onion\Framework\Http\Middleware\RequestHandler;

class RequestHandlerFactory implements FactoryInterface
{
    /** @var Router */
    private $router;

    public function __construct(RouterInterface $router)
    {
        $this->router = $router;
    }

    public function build(ContainerInterface $container)
    {
        if (!$container->has('middleware')) {
            throw new \RuntimeException(
                'Unable to initialize RequestHandler without defined middleware'
            );
        }

        $middlewareGenerator = function () use ($container) {
            $middleware = $container->get('middleware');
            foreach ($middleware as $identifier) {
                $instance = $container->get($identifier);
                assert(
                    is_object($instance) && $instance instanceof MiddlewareInterface,
                    new \TypeError("'{$identifier}' must implement MiddlewareInterface")
                );

                yield $instance;
            }
        };

        return new RequestHandler($middlewareGenerator());
    }
}

## Instruction:
Remove obsolete reference to RouterInterface

## Code After:
<?php declare(strict_types=1);

namespace Onion\Framework\Http\Middleware\Factory;

use Onion\Framework\Router\Route;
use Psr\Container\ContainerInterface;
use Onion\Framework\Dependency\Interfaces\FactoryInterface;
use Onion\Framework\Http\Middleware\RequestHandler;

class RequestHandlerFactory implements FactoryInterface
{
    public function build(ContainerInterface $container)
    {
        if (!$container->has('middleware')) {
            throw new \RuntimeException(
                'Unable to initialize RequestHandler without defined middleware'
            );
        }

        $middlewareGenerator = function () use ($container) {
            $middleware = $container->get('middleware');
            foreach ($middleware as $identifier) {
                $instance = $container->get($identifier);
                assert(
                    is_object($instance) && $instance instanceof MiddlewareInterface,
                    new \TypeError("'{$identifier}' must implement MiddlewareInterface")
                );

                yield $instance;
            }
        };

        return new RequestHandler($middlewareGenerator());
    }
}
b870028ce8edcb5001f1a4823517d866db0324a8
pyglab/apirequest.py
pyglab/apirequest.py
import enum
import json
from pyglab.exceptions import RequestError
import requests


@enum.unique
class RequestType(enum.Enum):
    GET = 1
    POST = 2
    PUT = 3
    DELETE = 4


class ApiRequest:

    _request_creators = {
        RequestType.GET: requests.get,
        RequestType.POST: requests.post,
        RequestType.PUT: requests.put,
        RequestType.DELETE: requests.delete,
    }

    def __init__(self, request_type, url, token, params={}, sudo=None,
                 page=None, per_page=None):
        # Build header
        header = {'PRIVATE-TOKEN': token}
        if sudo is not None:
            header['SUDO', sudo]
        # Build parameters
        if page is not None:
            params['page'] = page
        if per_page is not None:
            params['per_page'] = per_page
        r = self._request_creators[request_type](url, params=params,
                                                 headers=header)
        content = json.loads(r.text)
        if RequestError.is_error(r.status_code):
            raise RequestError.error_class(r.status_code)(content)
        self._content = content

    @property
    def content(self):
        return self._content
import json
from pyglab.exceptions import RequestError
import requests


class RequestType(object):
    GET = 1
    POST = 2
    PUT = 3
    DELETE = 4


class ApiRequest:

    _request_creators = {
        RequestType.GET: requests.get,
        RequestType.POST: requests.post,
        RequestType.PUT: requests.put,
        RequestType.DELETE: requests.delete,
    }

    def __init__(self, request_type, url, token, params={}, sudo=None,
                 page=None, per_page=None):
        # Build header
        header = {'PRIVATE-TOKEN': token}
        if sudo is not None:
            header['SUDO', sudo]
        # Build parameters
        if page is not None:
            params['page'] = page
        if per_page is not None:
            params['per_page'] = per_page
        r = self._request_creators[request_type](url, params=params,
                                                 headers=header)
        content = json.loads(r.text)
        if RequestError.is_error(r.status_code):
            raise RequestError.error_class(r.status_code)(content)
        self._content = content

    @property
    def content(self):
        return self._content
Make RequestType a normal class, not an enum.
Make RequestType a normal class, not an enum.

This removes the restriction of needing Python >= 3.4. RequestType is now a normal class with class variables (fixes #19).
Python
mit
sloede/pyglab,sloede/pyglab
python
## Code Before:
import enum
import json
from pyglab.exceptions import RequestError
import requests


@enum.unique
class RequestType(enum.Enum):
    GET = 1
    POST = 2
    PUT = 3
    DELETE = 4


class ApiRequest:

    _request_creators = {
        RequestType.GET: requests.get,
        RequestType.POST: requests.post,
        RequestType.PUT: requests.put,
        RequestType.DELETE: requests.delete,
    }

    def __init__(self, request_type, url, token, params={}, sudo=None,
                 page=None, per_page=None):
        # Build header
        header = {'PRIVATE-TOKEN': token}
        if sudo is not None:
            header['SUDO', sudo]
        # Build parameters
        if page is not None:
            params['page'] = page
        if per_page is not None:
            params['per_page'] = per_page
        r = self._request_creators[request_type](url, params=params,
                                                 headers=header)
        content = json.loads(r.text)
        if RequestError.is_error(r.status_code):
            raise RequestError.error_class(r.status_code)(content)
        self._content = content

    @property
    def content(self):
        return self._content

## Instruction:
Make RequestType a normal class, not an enum.

This removes the restriction of needing Python >= 3.4. RequestType is now a normal class with class variables (fixes #19).

## Code After:
import json
from pyglab.exceptions import RequestError
import requests


class RequestType(object):
    GET = 1
    POST = 2
    PUT = 3
    DELETE = 4


class ApiRequest:

    _request_creators = {
        RequestType.GET: requests.get,
        RequestType.POST: requests.post,
        RequestType.PUT: requests.put,
        RequestType.DELETE: requests.delete,
    }

    def __init__(self, request_type, url, token, params={}, sudo=None,
                 page=None, per_page=None):
        # Build header
        header = {'PRIVATE-TOKEN': token}
        if sudo is not None:
            header['SUDO', sudo]
        # Build parameters
        if page is not None:
            params['page'] = page
        if per_page is not None:
            params['per_page'] = per_page
        r = self._request_creators[request_type](url, params=params,
                                                 headers=header)
        content = json.loads(r.text)
        if RequestError.is_error(r.status_code):
            raise RequestError.error_class(r.status_code)(content)
        self._content = content

    @property
    def content(self):
        return self._content
7a89ff5c87d47424c65942740dcb2e1f542d28e4
lib/fulcrum/media_resource.rb
lib/fulcrum/media_resource.rb
require 'open-uri' module Fulcrum class MediaResource < Resource include Actions::List include Actions::Find include Actions::Create def default_content_type raise NotImplementedError, 'default_content_type must be implemented in derived classes' end def attributes_for_upload(file, id = new_id, content_type = default_content_type, attrs = {}) file = Faraday::UploadIO.new(file, content_type) resource_attributes = { file: file, access_key: id } resource_attributes.merge!(attrs) attributes = {} attributes[resource_name] = resource_attributes attributes end def create(file, id = new_id, content_type = default_content_type, attrs = {}) call(:post, create_action, attributes_for_upload(file, id, content_type, attrs)) end def download(url, &blk) open(url, "rb", &blk) end def download_version(id, version, &blk) download(find(id)[version], &blk) end def original(id, &blk) download_version(id, 'original', &blk) end def new_id SecureRandom.uuid end end end
require 'open-uri' module Fulcrum class MediaResource < Resource include Actions::List include Actions::Find include Actions::Create def default_content_type raise NotImplementedError, 'default_content_type must be implemented in derived classes' end def attributes_for_upload(file, id = new_id, content_type = nil, attrs = {}) file = Faraday::UploadIO.new(file, content_type || default_content_type) resource_attributes = { file: file, access_key: id } resource_attributes.merge!(attrs) attributes = {} attributes[resource_name] = resource_attributes attributes end def create(file, id = new_id, content_type = nil, attrs = {}) call(:post, create_action, attributes_for_upload(file, id, content_type, attrs)) end def download(url, &blk) open(url, "rb", &blk) end def download_version(id, version, &blk) download(find(id)[version], &blk) end def original(id, &blk) download_version(id, 'original', &blk) end def new_id SecureRandom.uuid end end end
Use the default a little later.
Use the default a little later.
Ruby
mit
fulcrumapp/fulcrum-ruby
ruby
## Code Before: require 'open-uri' module Fulcrum class MediaResource < Resource include Actions::List include Actions::Find include Actions::Create def default_content_type raise NotImplementedError, 'default_content_type must be implemented in derived classes' end def attributes_for_upload(file, id = new_id, content_type = default_content_type, attrs = {}) file = Faraday::UploadIO.new(file, content_type) resource_attributes = { file: file, access_key: id } resource_attributes.merge!(attrs) attributes = {} attributes[resource_name] = resource_attributes attributes end def create(file, id = new_id, content_type = default_content_type, attrs = {}) call(:post, create_action, attributes_for_upload(file, id, content_type, attrs)) end def download(url, &blk) open(url, "rb", &blk) end def download_version(id, version, &blk) download(find(id)[version], &blk) end def original(id, &blk) download_version(id, 'original', &blk) end def new_id SecureRandom.uuid end end end ## Instruction: Use the default a little later. ## Code After: require 'open-uri' module Fulcrum class MediaResource < Resource include Actions::List include Actions::Find include Actions::Create def default_content_type raise NotImplementedError, 'default_content_type must be implemented in derived classes' end def attributes_for_upload(file, id = new_id, content_type = nil, attrs = {}) file = Faraday::UploadIO.new(file, content_type || default_content_type) resource_attributes = { file: file, access_key: id } resource_attributes.merge!(attrs) attributes = {} attributes[resource_name] = resource_attributes attributes end def create(file, id = new_id, content_type = nil, attrs = {}) call(:post, create_action, attributes_for_upload(file, id, content_type, attrs)) end def download(url, &blk) open(url, "rb", &blk) end def download_version(id, version, &blk) download(find(id)[version], &blk) end def original(id, &blk) download_version(id, 'original', &blk) end def new_id SecureRandom.uuid end end end
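The Ruby change in this record defaults `content_type` to `nil` and resolves it inside the body with `content_type || default_content_type`, so the subclass-provided default is applied "a little later". The same late-resolution idiom in Python looks like the following hypothetical sketch (names echo the record, but nothing here is from the fulcrum-ruby codebase); in Python it is the standard approach, since parameter defaults bind at definition time.

```python
class MediaResource:
    def default_content_type(self):
        raise NotImplementedError('must be implemented in derived classes')

    def attributes_for_upload(self, file, content_type=None):
        # Resolve the default late, inside the body, so a subclass override
        # of default_content_type is honored when the caller passes nothing.
        content_type = content_type or self.default_content_type()
        return {'file': file, 'content_type': content_type}


class PhotoResource(MediaResource):
    def default_content_type(self):
        return 'image/jpeg'


print(PhotoResource().attributes_for_upload('photo.jpg')['content_type'])
# → image/jpeg
```

An explicit argument still wins: passing `content_type='image/png'` bypasses the default entirely.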
024e041bf22510b3a7adb2a3e3b113662acc51d7
.travis.yml
.travis.yml
language: c # Ubuntu 14.04 Trusty support sudo: required dist: trusty compiler: - clang - gcc before_install: - sudo apt-get -qq update - sudo apt-get install -y libeigen3-dev - sudo apt-get install -y libopenmpi-dev - sudo apt-get install -y libboost-all-dev script: - mkdir build - cd build - cmake -DTesting=ON -DProgs=ON -Dprogs_list="anderson;hubbard2d" .. - make - make test matrix: include: - compiler: gcc addons: apt: sources: - ubuntu-toolchain-r-test packages: - g++-4.9 env: CXX=g++-4.9 - compiler: gcc addons: apt: sources: - ubuntu-toolchain-r-test packages: - g++-5 env: CXX=g++-5 - compiler: clang addons: apt: sources: - ubuntu-toolchain-r-test - llvm-toolchain-precise-3.6 packages: - clang-3.6 env: CXX=clang++-3.6 - compiler: clang addons: apt: sources: - ubuntu-toolchain-r-test - llvm-toolchain-precise-3.7 packages: - clang-3.7 env: CXX=clang++-3.7
language: cpp # Ubuntu 16.04 Xenial support os: linux dist: xenial compiler: - clang - gcc before_install: - sudo apt-get -qq update - sudo apt-get install -y libeigen3-dev - sudo apt-get install -y libopenmpi-dev - sudo apt-get install -y libboost-all-dev script: # Stop on first error - set -e - mkdir build - cd build - cmake -DTesting=ON -DProgs=ON -Dprogs_list="anderson;hubbard2d" .. - make - make test jobs: include: - compiler: gcc addons: apt: sources: - ubuntu-toolchain-r-test packages: - g++-4.9 env: CXX=g++-4.9 - compiler: clang addons: apt: sources: - ubuntu-toolchain-r-test packages: - clang-3.6 env: CXX=clang++-3.6 - compiler: clang addons: apt: sources: - ubuntu-toolchain-r-test packages: - clang-3.7 env: CXX=clang++-3.7
Upgrade Travis CI configuration to Ubuntu Xenial
Upgrade Travis CI configuration to Ubuntu Xenial The dedicated g++-5 build has been excluded as it is the default compiler in Ubuntu Xenial anyway.
YAML
mpl-2.0
aeantipov/pomerol,aeantipov/pomerol,aeantipov/pomerol
yaml
## Code Before: language: c # Ubuntu 14.04 Trusty support sudo: required dist: trusty compiler: - clang - gcc before_install: - sudo apt-get -qq update - sudo apt-get install -y libeigen3-dev - sudo apt-get install -y libopenmpi-dev - sudo apt-get install -y libboost-all-dev script: - mkdir build - cd build - cmake -DTesting=ON -DProgs=ON -Dprogs_list="anderson;hubbard2d" .. - make - make test matrix: include: - compiler: gcc addons: apt: sources: - ubuntu-toolchain-r-test packages: - g++-4.9 env: CXX=g++-4.9 - compiler: gcc addons: apt: sources: - ubuntu-toolchain-r-test packages: - g++-5 env: CXX=g++-5 - compiler: clang addons: apt: sources: - ubuntu-toolchain-r-test - llvm-toolchain-precise-3.6 packages: - clang-3.6 env: CXX=clang++-3.6 - compiler: clang addons: apt: sources: - ubuntu-toolchain-r-test - llvm-toolchain-precise-3.7 packages: - clang-3.7 env: CXX=clang++-3.7 ## Instruction: Upgrade Travis CI configuration to Ubuntu Xenial The dedicated g++-5 build has been excluded as it is the default compiler in Ubuntu Xenial anyway. ## Code After: language: cpp # Ubuntu 16.04 Xenial support os: linux dist: xenial compiler: - clang - gcc before_install: - sudo apt-get -qq update - sudo apt-get install -y libeigen3-dev - sudo apt-get install -y libopenmpi-dev - sudo apt-get install -y libboost-all-dev script: # Stop on first error - set -e - mkdir build - cd build - cmake -DTesting=ON -DProgs=ON -Dprogs_list="anderson;hubbard2d" .. - make - make test jobs: include: - compiler: gcc addons: apt: sources: - ubuntu-toolchain-r-test packages: - g++-4.9 env: CXX=g++-4.9 - compiler: clang addons: apt: sources: - ubuntu-toolchain-r-test packages: - clang-3.6 env: CXX=clang++-3.6 - compiler: clang addons: apt: sources: - ubuntu-toolchain-r-test packages: - clang-3.7 env: CXX=clang++-3.7
274884f867adc0fb547f717312620f2a4ab0bd95
tests/test-setup.yml
tests/test-setup.yml
--- - hosts: all become: yes tasks: - name: Update apt cache. apt: update_cache=yes when: ansible_os_family == 'Debian' - name: Install test dependencies (RedHat). package: name=logrotate state=present when: ansible_os_family == 'RedHat' - name: Copy initctl_faker into place for Ubuntu 14.04. copy: src: initctl_faker dest: /sbin/initctl mode: 0755 force: yes when: ansible_distribution == 'Ubuntu' and ansible_distribution_version == '14.04' changed_when: false
--- - hosts: all become: yes tasks: - name: Update apt cache. apt: update_cache=yes when: ansible_os_family == 'Debian' - name: Install test dependencies (RedHat). package: name=logrotate state=present when: ansible_os_family == 'RedHat' - name: Override postfix_inet_protocols (RHEL). set_fact: postfix_inet_protocols: ipv4 when: ansible_os_family == 'RedHat' - name: Copy initctl_faker into place for Ubuntu 14.04. copy: src: initctl_faker dest: /sbin/initctl mode: 0755 force: yes when: ansible_distribution == 'Ubuntu' and ansible_distribution_version == '14.04' changed_when: false
Fix breaking CentOS 7 test.
Fix breaking CentOS 7 test.
YAML
mit
debugacademy/drupal-vm,debugacademy/drupal-vm,debugacademy/drupal-vm,debugacademy/drupal-vm
yaml
## Code Before: --- - hosts: all become: yes tasks: - name: Update apt cache. apt: update_cache=yes when: ansible_os_family == 'Debian' - name: Install test dependencies (RedHat). package: name=logrotate state=present when: ansible_os_family == 'RedHat' - name: Copy initctl_faker into place for Ubuntu 14.04. copy: src: initctl_faker dest: /sbin/initctl mode: 0755 force: yes when: ansible_distribution == 'Ubuntu' and ansible_distribution_version == '14.04' changed_when: false ## Instruction: Fix breaking CentOS 7 test. ## Code After: --- - hosts: all become: yes tasks: - name: Update apt cache. apt: update_cache=yes when: ansible_os_family == 'Debian' - name: Install test dependencies (RedHat). package: name=logrotate state=present when: ansible_os_family == 'RedHat' - name: Override postfix_inet_protocols (RHEL). set_fact: postfix_inet_protocols: ipv4 when: ansible_os_family == 'RedHat' - name: Copy initctl_faker into place for Ubuntu 14.04. copy: src: initctl_faker dest: /sbin/initctl mode: 0755 force: yes when: ansible_distribution == 'Ubuntu' and ansible_distribution_version == '14.04' changed_when: false
876de49a9c5d6e2d75714a606238e9041ed49baf
sample/views/booking_day.html
sample/views/booking_day.html
<table class="table"> <thead> <tr> <th>Id</th> <th>Room</th> <th></th> </tr> </thead> <tbody> <tr ng-repeat="reservation in between(reservationDate)"> <td>{{reservation.reservationId}}</td> <td><a ui-sref="room.detail({roomId: reservation.roomId, from: 'booking.day({year:\'2014\', month: \'10\', day: \'14\'})|{{reservationDate.getTime()}}'})">{{getRoom(reservation.roomId).roomNumber}}</a></td> <td><a ui-sref=".detail({reservationId: reservation.reservationId})" class="btn">View</a></td> </tr> </tbody> </table>
<table class="table"> <thead> <tr> <th>Id</th> <th>Room</th> <th></th> </tr> </thead> <tbody> <tr ng-repeat="reservation in between(reservationDate)"> <td>{{reservation.reservationId}}</td> <td><a ui-sref="room.detail({roomId: reservation.roomId, from: 'booking.day({year:\'{{reservationDate.getFullYear()}}\', month: \'{{reservationDate.getMonth() + 1}}\', day: \'{{reservationDate.getDate()}}\'})|{{reservationDate.getTime()}}'})">{{getRoom(reservation.roomId).roomNumber}}</a></td> <td><a ui-sref=".detail({reservationId: reservation.reservationId})" class="btn">View</a></td> </tr> </tbody> </table>
Send correct url params for the room link in booking view
fix(sample): Send correct url params for the room link in booking view
HTML
mit
thebigredgeek/angular-breadcrumb,cuiliang/angular-breadcrumb,allwebsites/angular-breadcrumb,allwebsites/angular-breadcrumb,ansgarkroger/angular-breadcrumb,ansgarkroger/angular-breadcrumb,ncuillery/angular-breadcrumb,cuiliang/angular-breadcrumb,zpzgone/angular-breadcrumb,zpzgone/angular-breadcrumb,kazinov/angular-breadcrumb,kazinov/angular-breadcrumb,ncuillery/angular-breadcrumb,thebigredgeek/angular-breadcrumb
html
## Code Before: <table class="table"> <thead> <tr> <th>Id</th> <th>Room</th> <th></th> </tr> </thead> <tbody> <tr ng-repeat="reservation in between(reservationDate)"> <td>{{reservation.reservationId}}</td> <td><a ui-sref="room.detail({roomId: reservation.roomId, from: 'booking.day({year:\'2014\', month: \'10\', day: \'14\'})|{{reservationDate.getTime()}}'})">{{getRoom(reservation.roomId).roomNumber}}</a></td> <td><a ui-sref=".detail({reservationId: reservation.reservationId})" class="btn">View</a></td> </tr> </tbody> </table> ## Instruction: fix(sample): Send correct url params for the room link in booking view ## Code After: <table class="table"> <thead> <tr> <th>Id</th> <th>Room</th> <th></th> </tr> </thead> <tbody> <tr ng-repeat="reservation in between(reservationDate)"> <td>{{reservation.reservationId}}</td> <td><a ui-sref="room.detail({roomId: reservation.roomId, from: 'booking.day({year:\'{{reservationDate.getFullYear()}}\', month: \'{{reservationDate.getMonth() + 1}}\', day: \'{{reservationDate.getDate()}}\'})|{{reservationDate.getTime()}}'})">{{getRoom(reservation.roomId).roomNumber}}</a></td> <td><a ui-sref=".detail({reservationId: reservation.reservationId})" class="btn">View</a></td> </tr> </tbody> </table>
7ab7642aa1583aeae0a2188b36c2dded038a7628
models/index.js
models/index.js
'use strict'; var fs = require('fs'); var path = require('path'); var Sequelize = require('sequelize'); var basename = path.basename(module.filename); var env = process.env.NODE_ENV || 'development'; if(env !== 'production') { var config = require(__dirname + '/../config/development.json')[env]; } else{ config = require(__dirname + '/../config/production.json')[env]; } var db = {}; if (config.use_env_variable) { var sequelize = new Sequelize(process.env[config.use_env_variable]); } else { config.define = {underscored: true}; var sequelize = new Sequelize(config.database, config.username, config.password, config); } fs .readdirSync(__dirname) .filter(function(file) { return (file.indexOf('.') !== 0) && (file !== basename) && (file.slice(-3) === '.js'); }) .forEach(function(file) { var model = sequelize['import'](path.join(__dirname, file)); db[model.name] = model; }); Object.keys(db).forEach(function(modelName) { if (db[modelName].associate) { db[modelName].associate(db); } }); db.sequelize = sequelize; db.Sequelize = Sequelize; module.exports = db;
'use strict'; var fs = require('fs'); var path = require('path'); var Sequelize = require('sequelize'); var basename = path.basename(module.filename); var env = process.env.NODE_ENV || 'development'; if(env !== 'production') { var config = require(__dirname + '/../config/development.json')[env]; } else{ config = require(__dirname + '/../config/production.json')[env]; } var db = {}; if (config.use_env_variable) { config = {}; config.define = {underscored: true}; var sequelize = new Sequelize(process.env[config.use_env_variable]); } else { config.define = {underscored: true}; var sequelize = new Sequelize(config.database, config.username, config.password, config); } fs .readdirSync(__dirname) .filter(function(file) { return (file.indexOf('.') !== 0) && (file !== basename) && (file.slice(-3) === '.js'); }) .forEach(function(file) { var model = sequelize['import'](path.join(__dirname, file)); db[model.name] = model; }); Object.keys(db).forEach(function(modelName) { if (db[modelName].associate) { db[modelName].associate(db); } }); db.sequelize = sequelize; db.Sequelize = Sequelize; module.exports = db;
Fix association model with user and email
Fix association model with user and email
JavaScript
mit
watchiot/watchiot-api,watchiot/watchiot-api
javascript
## Code Before: 'use strict'; var fs = require('fs'); var path = require('path'); var Sequelize = require('sequelize'); var basename = path.basename(module.filename); var env = process.env.NODE_ENV || 'development'; if(env !== 'production') { var config = require(__dirname + '/../config/development.json')[env]; } else{ config = require(__dirname + '/../config/production.json')[env]; } var db = {}; if (config.use_env_variable) { var sequelize = new Sequelize(process.env[config.use_env_variable]); } else { config.define = {underscored: true}; var sequelize = new Sequelize(config.database, config.username, config.password, config); } fs .readdirSync(__dirname) .filter(function(file) { return (file.indexOf('.') !== 0) && (file !== basename) && (file.slice(-3) === '.js'); }) .forEach(function(file) { var model = sequelize['import'](path.join(__dirname, file)); db[model.name] = model; }); Object.keys(db).forEach(function(modelName) { if (db[modelName].associate) { db[modelName].associate(db); } }); db.sequelize = sequelize; db.Sequelize = Sequelize; module.exports = db; ## Instruction: Fix association model with user and email ## Code After: 'use strict'; var fs = require('fs'); var path = require('path'); var Sequelize = require('sequelize'); var basename = path.basename(module.filename); var env = process.env.NODE_ENV || 'development'; if(env !== 'production') { var config = require(__dirname + '/../config/development.json')[env]; } else{ config = require(__dirname + '/../config/production.json')[env]; } var db = {}; if (config.use_env_variable) { config = {}; config.define = {underscored: true}; var sequelize = new Sequelize(process.env[config.use_env_variable]); } else { config.define = {underscored: true}; var sequelize = new Sequelize(config.database, config.username, config.password, config); } fs .readdirSync(__dirname) .filter(function(file) { return (file.indexOf('.') !== 0) && (file !== basename) && (file.slice(-3) === '.js'); }) .forEach(function(file) { var model = sequelize['import'](path.join(__dirname, file)); db[model.name] = model; }); Object.keys(db).forEach(function(modelName) { if (db[modelName].associate) { db[modelName].associate(db); } }); db.sequelize = sequelize; db.Sequelize = Sequelize; module.exports = db;
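The fix in this record makes the environment-variable branch reset `config` to a fresh object and apply `define: {underscored: true}` there too, so model defaults (and hence associations) behave the same regardless of how the connection is configured. A language-neutral restatement of that logic, as a hypothetical Python sketch (`DATABASE_URL` is an assumed stand-in for `config.use_env_variable`):

```python
def build_db_config(environ, file_config):
    """Choose a base config, then apply model defaults in both branches."""
    if 'DATABASE_URL' in environ:
        # Fresh config when the connection comes from the environment,
        # mirroring the `config = {}` reset added by the commit.
        config = {'url': environ['DATABASE_URL']}
    else:
        config = dict(file_config)
    # Applied unconditionally — the point of the fix.
    config['define'] = {'underscored': True}
    return config


print(build_db_config({'DATABASE_URL': 'postgres://example'}, {'database': 'dev'}))
# → {'url': 'postgres://example', 'define': {'underscored': True}}
```

Before the fix, only the file-config branch set `define`, so the two environments produced differently-named columns.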
1cd5ee98b5806082322a5887cb1cd4a7b20da5a7
.github/mergify.yml
.github/mergify.yml
pull_request_rules: - name: Automatic approve on dependabot PR conditions: - author~=^dependabot(|-preview)\[bot\]$ actions: review: type: APPROVE - name: Automatic merge on dependabot PR after success CI conditions: - author~=^dependabot(|-preview)\[bot\]$ - check-success=DotNet Format - check-success=Release - check-success=Unit Test code (windows-latest, netcoreapp3.1) - check-success=Unit Test code (ubuntu-latest, netcoreapp3.1) - check-success=Unit Test code (macos-latest, netcoreapp3.1) - check-success=Unit Test code (windows-latest, net5.0) - check-success=Unit Test code (windows-latest, net5.0) - check-success=Unit Test code (windows-latest, net5.0) actions: merge: method: rebase strict: smart - name: Thank contributor conditions: - merged - -author~=^dependabot(|-preview)\[bot\]$ actions: comment: message: "Thank you @{{author}} for your contribution!"
pull_request_rules: - name: Automatic approve on dependabot PR conditions: - author~=^dependabot(|-preview)\[bot\]$ actions: review: type: APPROVE - name: Automatic merge on dependabot PR after success CI conditions: - author~=^dependabot(|-preview)\[bot\]$ - '#commits-behind=0' # Only merge up to date pull requests - check-success=DotNet Format - check-success=Release - check-success=Unit Test code (windows-latest, netcoreapp3.1) - check-success=Unit Test code (ubuntu-latest, netcoreapp3.1) - check-success=Unit Test code (macos-latest, netcoreapp3.1) - check-success=Unit Test code (windows-latest, net5.0) - check-success=Unit Test code (windows-latest, net5.0) - check-success=Unit Test code (windows-latest, net5.0) actions: merge: - name: Thank contributor conditions: - merged - -author~=^dependabot(|-preview)\[bot\]$ actions: comment: message: "Thank you @{{author}} for your contribution!"
Remove deprecated `merge: strict` from Mergify.yml
Remove deprecated `merge: strict` from Mergify.yml Remove `merge: strict` from the Mergify configuration because it is deprecated: > The configuration uses the deprecated strict mode of the merge action. > A brownout is planned for the whole December 6th, 2021 day. > This option will be removed on January 10th, 2022. > For more information: https://blog.mergify.com/strict-mode-deprecation/
YAML
mit
GitTools/GitVersion,GitTools/GitVersion,asbjornu/GitVersion,gep13/GitVersion,gep13/GitVersion,asbjornu/GitVersion
yaml
## Code Before: pull_request_rules: - name: Automatic approve on dependabot PR conditions: - author~=^dependabot(|-preview)\[bot\]$ actions: review: type: APPROVE - name: Automatic merge on dependabot PR after success CI conditions: - author~=^dependabot(|-preview)\[bot\]$ - check-success=DotNet Format - check-success=Release - check-success=Unit Test code (windows-latest, netcoreapp3.1) - check-success=Unit Test code (ubuntu-latest, netcoreapp3.1) - check-success=Unit Test code (macos-latest, netcoreapp3.1) - check-success=Unit Test code (windows-latest, net5.0) - check-success=Unit Test code (windows-latest, net5.0) - check-success=Unit Test code (windows-latest, net5.0) actions: merge: method: rebase strict: smart - name: Thank contributor conditions: - merged - -author~=^dependabot(|-preview)\[bot\]$ actions: comment: message: "Thank you @{{author}} for your contribution!" ## Instruction: Remove deprecated `merge: strict` from Mergify.yml Remove `merge: strict` from the Mergify configuration because it is deprecated: > The configuration uses the deprecated strict mode of the merge action. > A brownout is planned for the whole December 6th, 2021 day. > This option will be removed on January 10th, 2022. > For more information: https://blog.mergify.com/strict-mode-deprecation/ ## Code After: pull_request_rules: - name: Automatic approve on dependabot PR conditions: - author~=^dependabot(|-preview)\[bot\]$ actions: review: type: APPROVE - name: Automatic merge on dependabot PR after success CI conditions: - author~=^dependabot(|-preview)\[bot\]$ - '#commits-behind=0' # Only merge up to date pull requests - check-success=DotNet Format - check-success=Release - check-success=Unit Test code (windows-latest, netcoreapp3.1) - check-success=Unit Test code (ubuntu-latest, netcoreapp3.1) - check-success=Unit Test code (macos-latest, netcoreapp3.1) - check-success=Unit Test code (windows-latest, net5.0) - check-success=Unit Test code (windows-latest, net5.0) - check-success=Unit Test code (windows-latest, net5.0) actions: merge: - name: Thank contributor conditions: - merged - -author~=^dependabot(|-preview)\[bot\]$ actions: comment: message: "Thank you @{{author}} for your contribution!"
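The rule semantics after this commit — replacing strict-mode queueing with an explicit `'#commits-behind=0'` condition — can be restated as a predicate. The following Python function is purely illustrative (Mergify evaluates conditions itself; this is not its API):

```python
def should_automerge(author, commits_behind, checks_passed):
    """Hypothetical restatement of the updated auto-merge rule:
    dependabot PRs merge only when the branch is fully up to date
    and every required check succeeded."""
    is_dependabot = author.startswith('dependabot')
    return is_dependabot and commits_behind == 0 and checks_passed
```

A PR that is even one commit behind its base no longer merges automatically; Mergify's removed strict mode used to rebase such PRs instead.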
49bed20629d4b2ef50026700b98694da4c2ce224
tasks.py
tasks.py
"""Useful task commands for development and maintenance.""" from invoke import run, task @task def clean(): """Clean the project directory of unwanted files and directories.""" run('rm -rf gmusicapi_scripts.egg-info') run('rm -rf .coverage') run('rm -rf .tox') run('rm -rf .cache') run('rm -rf build/') run('rm -rf dist/') run('rm -rf site/') run('find . -name *.pyc -delete') run('find . -name *.pyo -delete') run('find . -name __pycache__ -delete -depth') run('find . -name *~ -delete') @task(clean) def build(): """Build sdist and bdist_wheel distributions.""" run('python setup.py sdist bdist_wheel') @task(build) def deploy(): """Build and upload gmusicapi_scripts distributions.""" upload() @task def upload(): """Upload gmusicapi_scripts distributions using twine.""" run('twine upload dist/*')
"""Useful task commands for development and maintenance.""" from invoke import run, task @task def clean(): """Clean the project directory of unwanted files and directories.""" run('rm -rf gmusicapi_scripts.egg-info') run('rm -rf .coverage') run('rm -rf .tox') run('rm -rf .cache') run('rm -rf build/') run('rm -rf dist/') run('rm -rf site/') run('find . -name *.pyc -delete') run('find . -name *.pyo -delete') run('find . -name __pycache__ -delete -depth') run('find . -name *~ -delete') @task(clean) def build(): """Build sdist and bdist_wheel distributions.""" run('python setup.py sdist bdist_wheel') @task(build) def deploy(): """Build and upload gmusicapi_scripts distributions.""" upload() @task def docs(test=False): """"Build the gmusicapi_scripts docs.""" if test: run('mkdocs serve') else: run('mkdocs gh-deploy --clean') @task def upload(): """Upload gmusicapi_scripts distributions using twine.""" run('twine upload dist/*')
Add task for building docs
Add task for building docs
Python
mit
thebigmunch/gmusicapi-scripts
python
## Code Before: """Useful task commands for development and maintenance.""" from invoke import run, task @task def clean(): """Clean the project directory of unwanted files and directories.""" run('rm -rf gmusicapi_scripts.egg-info') run('rm -rf .coverage') run('rm -rf .tox') run('rm -rf .cache') run('rm -rf build/') run('rm -rf dist/') run('rm -rf site/') run('find . -name *.pyc -delete') run('find . -name *.pyo -delete') run('find . -name __pycache__ -delete -depth') run('find . -name *~ -delete') @task(clean) def build(): """Build sdist and bdist_wheel distributions.""" run('python setup.py sdist bdist_wheel') @task(build) def deploy(): """Build and upload gmusicapi_scripts distributions.""" upload() @task def upload(): """Upload gmusicapi_scripts distributions using twine.""" run('twine upload dist/*') ## Instruction: Add task for building docs ## Code After: """Useful task commands for development and maintenance.""" from invoke import run, task @task def clean(): """Clean the project directory of unwanted files and directories.""" run('rm -rf gmusicapi_scripts.egg-info') run('rm -rf .coverage') run('rm -rf .tox') run('rm -rf .cache') run('rm -rf build/') run('rm -rf dist/') run('rm -rf site/') run('find . -name *.pyc -delete') run('find . -name *.pyo -delete') run('find . -name __pycache__ -delete -depth') run('find . -name *~ -delete') @task(clean) def build(): """Build sdist and bdist_wheel distributions.""" run('python setup.py sdist bdist_wheel') @task(build) def deploy(): """Build and upload gmusicapi_scripts distributions.""" upload() @task def docs(test=False): """"Build the gmusicapi_scripts docs.""" if test: run('mkdocs serve') else: run('mkdocs gh-deploy --clean') @task def upload(): """Upload gmusicapi_scripts distributions using twine.""" run('twine upload dist/*')
4205b262d3e26d836e20be1506e3d9a4a9b04f9e
scss/bootstrap.scss
scss/bootstrap.scss
/*! * Bootstrap v4.0.0-beta.2 (https://getbootstrap.com) * Copyright 2011-2017 The Bootstrap Authors * Copyright 2011-2017 Twitter, Inc. * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE) */ @import "functions"; @import "variables"; @import "mixins"; @import "root"; @import "print"; @import "reboot"; @import "type"; @import "images"; @import "code"; @import "grid"; @import "tables"; @import "forms"; @import "buttons"; @import "transitions"; @import "dropdown"; @import "button-group"; @import "input-group"; @import "custom-forms"; @import "nav"; @import "navbar"; @import "card"; @import "breadcrumb"; @import "pagination"; @import "badge"; @import "jumbotron"; @import "alert"; @import "progress"; @import "media"; @import "list-group"; @import "close"; @import "modal"; @import "tooltip"; @import "popover"; @import "carousel"; @import "utilities";
/*! * Bootstrap v4.0.0-beta.2 (https://getbootstrap.com) * Copyright 2011-2017 The Bootstrap Authors * Copyright 2011-2017 Twitter, Inc. * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE) */ @import "functions"; @import "variables"; @import "mixins"; @import "root"; @import "reboot"; @import "type"; @import "images"; @import "code"; @import "grid"; @import "tables"; @import "forms"; @import "buttons"; @import "transitions"; @import "dropdown"; @import "button-group"; @import "input-group"; @import "custom-forms"; @import "nav"; @import "navbar"; @import "card"; @import "breadcrumb"; @import "pagination"; @import "badge"; @import "jumbotron"; @import "alert"; @import "progress"; @import "media"; @import "list-group"; @import "close"; @import "modal"; @import "tooltip"; @import "popover"; @import "carousel"; @import "utilities"; @import "print";
Update SCSS import order to have print styles last
Update SCSS import order to have print styles last
SCSS
mit
tagliala/bootstrap,9-chirno/chirno,peterblazejewicz/bootstrap,m5o/bootstrap,Lyricalz/bootstrap,bardiharborow/bootstrap,peterblazejewicz/bootstrap,janseliger/bootstrap,tjkohli/bootstrap,gijsbotje/bootstrap,joblocal/bootstrap,joblocal/bootstrap,supergibbs/bootstrap,GerHobbelt/bootstrap,supergibbs/bootstrap,stanwmusic/bootstrap,m5o/bootstrap,zalog/bootstrap,vsn4ik/bootstrap,janseliger/bootstrap,tjkohli/bootstrap,yuyokk/bootstrap,GerHobbelt/bootstrap,stanwmusic/bootstrap,kvlsrg/bootstrap,gijsbotje/bootstrap,gijsbotje/bootstrap,supergibbs/bootstrap,Lyricalz/bootstrap,fschumann1211/bootstrap,bootstrapbrasil/bootstrap,stanwmusic/bootstrap,Hemphill/bootstrap-docs,joblocal/bootstrap,Hemphill/bootstrap-docs,creativewebjp/bootstrap,seanwu99/bootstrap,Lyricalz/bootstrap,fschumann1211/bootstrap,fschumann1211/bootstrap,twbs/bootstrap,SweetProcess/bootstrap,SweetProcess/bootstrap,inway/bootstrap,joblocal/bootstrap,supergibbs/bootstrap,creativewebjp/bootstrap,zalog/bootstrap,9-chirno/chirno,fschumann1211/bootstrap,zalog/bootstrap,9-chirno/chirno,coliff/bootstrap,inway/bootstrap,twbs/bootstrap,Lyricalz/bootstrap,seanwu99/bootstrap,tagliala/bootstrap,bardiharborow/bootstrap,peterblazejewicz/bootstrap,janseliger/bootstrap,kvlsrg/bootstrap,SweetProcess/bootstrap,inway/bootstrap,creativewebjp/bootstrap,bootstrapbrasil/bootstrap,nice-fungal/bootstrap,nice-fungal/bootstrap,Hemphill/bootstrap-docs,vsn4ik/bootstrap,coliff/bootstrap,yuyokk/bootstrap,SweetProcess/bootstrap,stanwmusic/bootstrap,nice-fungal/bootstrap,seanwu99/bootstrap,kvlsrg/bootstrap,yuyokk/bootstrap,tagliala/bootstrap,bootstrapbrasil/bootstrap,seanwu99/bootstrap,creativewebjp/bootstrap,janseliger/bootstrap,vsn4ik/bootstrap,yuyokk/bootstrap,gijsbotje/bootstrap,bardiharborow/bootstrap,vsn4ik/bootstrap,GerHobbelt/bootstrap,bardiharborow/bootstrap,Hemphill/bootstrap-docs,9-chirno/chirno
scss
## Code Before: /*! * Bootstrap v4.0.0-beta.2 (https://getbootstrap.com) * Copyright 2011-2017 The Bootstrap Authors * Copyright 2011-2017 Twitter, Inc. * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE) */ @import "functions"; @import "variables"; @import "mixins"; @import "root"; @import "print"; @import "reboot"; @import "type"; @import "images"; @import "code"; @import "grid"; @import "tables"; @import "forms"; @import "buttons"; @import "transitions"; @import "dropdown"; @import "button-group"; @import "input-group"; @import "custom-forms"; @import "nav"; @import "navbar"; @import "card"; @import "breadcrumb"; @import "pagination"; @import "badge"; @import "jumbotron"; @import "alert"; @import "progress"; @import "media"; @import "list-group"; @import "close"; @import "modal"; @import "tooltip"; @import "popover"; @import "carousel"; @import "utilities"; ## Instruction: Update SCSS import order to have print styles last ## Code After: /*! * Bootstrap v4.0.0-beta.2 (https://getbootstrap.com) * Copyright 2011-2017 The Bootstrap Authors * Copyright 2011-2017 Twitter, Inc. * Licensed under MIT (https://github.com/twbs/bootstrap/blob/master/LICENSE) */ @import "functions"; @import "variables"; @import "mixins"; @import "root"; @import "reboot"; @import "type"; @import "images"; @import "code"; @import "grid"; @import "tables"; @import "forms"; @import "buttons"; @import "transitions"; @import "dropdown"; @import "button-group"; @import "input-group"; @import "custom-forms"; @import "nav"; @import "navbar"; @import "card"; @import "breadcrumb"; @import "pagination"; @import "badge"; @import "jumbotron"; @import "alert"; @import "progress"; @import "media"; @import "list-group"; @import "close"; @import "modal"; @import "tooltip"; @import "popover"; @import "carousel"; @import "utilities"; @import "print";
0b5e16ec9806dc947e5d3a90b620a1aadeec2772
test/helpers/before.js
test/helpers/before.js
/* * Suite-global initialization that should occur before any other files are required. It's probably * a code smell that I have to do this. */ var config = require('../../src/config'); function reconfigure () { if (process.env.INTEGRATION) { console.log('Integration test mode active.'); config.configure(process.env); console.log('NOTE: This will leave files uploaded in Cloud Files containers.'); console.log('Be sure to clear these containers after:'); console.log('[' + config.contentContainer() + '] and [' + config.assetContainer() + ']'); } else { config.configure({ STORAGE: 'memory', RACKSPACE_USERNAME: 'me', RACKSPACE_APIKEY: '12345', RACKSPACE_REGION: 'space', ADMIN_APIKEY: '12345', CONTENT_CONTAINER: 'the-content-container', ASSET_CONTAINER: 'the-asset-container', MONGODB_URL: 'mongodb-url', CONTENT_LOG_LEVEL: process.env.CONTENT_LOG_LEVEL || 'error' }); } } reconfigure(); exports.reconfigure = reconfigure;
/* * Suite-global initialization that should occur before any other files are required. It's probably * a code smell that I have to do this. */ var config = require('../../src/config'); function reconfigure () { if (process.env.INTEGRATION) { console.log('Integration test mode active.'); config.configure(process.env); console.log('NOTE: This will leave files uploaded in Cloud Files containers.'); console.log('Be sure to clear these containers after:'); console.log('[' + config.contentContainer() + '] and [' + config.assetContainer() + ']'); } else { config.configure({ STORAGE: 'memory', RACKSPACE_USERNAME: 'me', RACKSPACE_APIKEY: '12345', RACKSPACE_REGION: 'space', ADMIN_APIKEY: process.env.ADMIN_APIKEY || '12345', CONTENT_CONTAINER: 'the-content-container', ASSET_CONTAINER: 'the-asset-container', MONGODB_URL: 'mongodb-url', CONTENT_LOG_LEVEL: process.env.CONTENT_LOG_LEVEL || 'error' }); } } reconfigure(); exports.reconfigure = reconfigure;
Use ADMIN_APIKEY from the env in tests.
Use ADMIN_APIKEY from the env in tests.
JavaScript
mit
deconst/content-service,deconst/content-service
javascript
## Code Before: /* * Suite-global initialization that should occur before any other files are required. It's probably * a code smell that I have to do this. */ var config = require('../../src/config'); function reconfigure () { if (process.env.INTEGRATION) { console.log('Integration test mode active.'); config.configure(process.env); console.log('NOTE: This will leave files uploaded in Cloud Files containers.'); console.log('Be sure to clear these containers after:'); console.log('[' + config.contentContainer() + '] and [' + config.assetContainer() + ']'); } else { config.configure({ STORAGE: 'memory', RACKSPACE_USERNAME: 'me', RACKSPACE_APIKEY: '12345', RACKSPACE_REGION: 'space', ADMIN_APIKEY: '12345', CONTENT_CONTAINER: 'the-content-container', ASSET_CONTAINER: 'the-asset-container', MONGODB_URL: 'mongodb-url', CONTENT_LOG_LEVEL: process.env.CONTENT_LOG_LEVEL || 'error' }); } } reconfigure(); exports.reconfigure = reconfigure; ## Instruction: Use ADMIN_APIKEY from the env in tests. ## Code After: /* * Suite-global initialization that should occur before any other files are required. It's probably * a code smell that I have to do this. */ var config = require('../../src/config'); function reconfigure () { if (process.env.INTEGRATION) { console.log('Integration test mode active.'); config.configure(process.env); console.log('NOTE: This will leave files uploaded in Cloud Files containers.'); console.log('Be sure to clear these containers after:'); console.log('[' + config.contentContainer() + '] and [' + config.assetContainer() + ']'); } else { config.configure({ STORAGE: 'memory', RACKSPACE_USERNAME: 'me', RACKSPACE_APIKEY: '12345', RACKSPACE_REGION: 'space', ADMIN_APIKEY: process.env.ADMIN_APIKEY || '12345', CONTENT_CONTAINER: 'the-content-container', ASSET_CONTAINER: 'the-asset-container', MONGODB_URL: 'mongodb-url', CONTENT_LOG_LEVEL: process.env.CONTENT_LOG_LEVEL || 'error' }); } } reconfigure(); exports.reconfigure = reconfigure;
8367b7b5e992be73768a9371262b13a95618d759
.travis.yml
.travis.yml
language: php php: - 5.3 - 5.4 - 5.5 before_script: - cd tests script: phpunit
language: php php: - 5.3 - 5.4 - 5.5 - 5.6 before_install: - composer self-update install: - composer update -o before_script: - cd tests script: phpunit
Fix Travis config for composer usage
Fix Travis config for composer usage
YAML
mit
MAXakaWIZARD/GettextParser,MAXakaWIZARD/GettextParser
yaml
## Code Before: language: php php: - 5.3 - 5.4 - 5.5 before_script: - cd tests script: phpunit ## Instruction: Fix Travis config for composer usage ## Code After: language: php php: - 5.3 - 5.4 - 5.5 - 5.6 before_install: - composer self-update install: - composer update -o before_script: - cd tests script: phpunit
1e1c8b7a4e34c1a0b71501a29273db8597f33169
templates/editor/modularity-enabled-modules.php
templates/editor/modularity-enabled-modules.php
<div class="modularity-modules"> <?php foreach ($modules as $moduleId => $module) : ?> <div class="modularity-module modularity-js-draggable" data-module-id="<?php echo $moduleId; ?>" data-sidebar-incompability='<?php echo is_array($module['sidebar_incompability']) ? json_encode($module['sidebar_incompability']) : ''; ?>'> <span class="modularity-module-icon"> <?php echo modularity_decode_icon($module); ?> </span> <span class="modularity-module-name"><?php echo $module['labels']['name']; ?></span> </div> <?php endforeach; ?> </div>
<div class="modularity-modules"> <?php foreach ($modules as $moduleId => $module) : ?> <div class="modularity-module modularity-js-draggable" data-module-id="<?php echo $moduleId; ?>" data-sidebar-incompability='<?php echo (isset($module['sidebar_incompability']) && is_array($module['sidebar_incompability'])) ? json_encode($module['sidebar_incompability']) : ''; ?>'> <span class="modularity-module-icon"> <?php echo modularity_decode_icon($module); ?> </span> <span class="modularity-module-name"><?php echo $module['labels']['name']; ?></span> </div> <?php endforeach; ?> </div>
Check if module array has property set before using it. Cleaning up messy row to multiple.
Check if module array has property set before using it. Cleaning up messy row to multiple.
PHP
mit
helsingborg-stad/Modularity,helsingborg-stad/Modularity
php
## Code Before: <div class="modularity-modules"> <?php foreach ($modules as $moduleId => $module) : ?> <div class="modularity-module modularity-js-draggable" data-module-id="<?php echo $moduleId; ?>" data-sidebar-incompability='<?php echo is_array($module['sidebar_incompability']) ? json_encode($module['sidebar_incompability']) : ''; ?>'> <span class="modularity-module-icon"> <?php echo modularity_decode_icon($module); ?> </span> <span class="modularity-module-name"><?php echo $module['labels']['name']; ?></span> </div> <?php endforeach; ?> </div> ## Instruction: Check if module array has property set before using it. Cleaning up messy row to multiple. ## Code After: <div class="modularity-modules"> <?php foreach ($modules as $moduleId => $module) : ?> <div class="modularity-module modularity-js-draggable" data-module-id="<?php echo $moduleId; ?>" data-sidebar-incompability='<?php echo (isset($module['sidebar_incompability']) && is_array($module['sidebar_incompability'])) ? json_encode($module['sidebar_incompability']) : ''; ?>'> <span class="modularity-module-icon"> <?php echo modularity_decode_icon($module); ?> </span> <span class="modularity-module-name"><?php echo $module['labels']['name']; ?></span> </div> <?php endforeach; ?> </div>
ba905a202f5d9fd85d529ff77d58a9ad81d21f89
doc/source/index.rst
doc/source/index.rst
==================================== pySTEPS -- The nowcasting initiative ==================================== pySTEPS is a community-driven initiative for developing and maintaining an easy to use, modular, free and open source Python framework for short-term ensemble prediction systems. The focus is on probabilistic nowcasting of radar precipitation fields, but pySTEPS is designed to allow a wider range of uses. Documentation ============= The documentation is separated in three big branches, intended for different audiences. :ref:`user-guide` ~~~~~~~~~~~~~~~~~ This section is intended for new pySTEPS users. It provides an introductory overview to the pySTEPS package, explains how to install it and make use of the most important features. :ref:`pysteps-reference` ~~~~~~~~~~~~~~~~~~~~~~~~ Comprehensive description of all the modules and functions available in pySTEPS. :ref:`contributors-guides` ~~~~~~~~~~~~~~~~~~~~~~~~~~ Resources and guidelines for pySTEPS developers and contributors. Contents ======== .. toctree:: :maxdepth: 1 user_guide/index pysteps_reference/index developer_guide/index zz_bibliography
==================================== pySTEPS -- The nowcasting initiative ==================================== Pysteps is a community-driven initiative for developing and maintaining an easy to use, modular, free and open source Python framework for short-term ensemble prediction systems. The focus is on probabilistic nowcasting of radar precipitation fields, but pysteps is designed to allow a wider range of uses. Pysteps is actively developed on GitHub__, while a more thorough description of pysteps is available in the pysteps reference publication: Pulkkinen, S., D. Nerini, A. Perez Hortal, C. Velasco-Forero, U. Germann, A. Seed, and L. Foresti, 2019: Pysteps: an open-source Python library for probabilistic precipitation nowcasting (v1.0). *Geosci. Model Dev. Discuss.*, doi:10.5194/gmd-2019-94, **in review**. [source__] __ https://github.com/pySTEPS/pysteps __ https://www.geosci-model-dev-discuss.net/gmd-2019-94/ Documentation ============= The documentation is separated in three big branches, intended for different audiences. :ref:`user-guide` ~~~~~~~~~~~~~~~~~ This section is intended for new pysteps users. It provides an introductory overview to the pysteps package, explains how to install it and make use of the most important features. :ref:`pysteps-reference` ~~~~~~~~~~~~~~~~~~~~~~~~ Comprehensive description of all the modules and functions available in pysteps. :ref:`contributors-guides` ~~~~~~~~~~~~~~~~~~~~~~~~~~ Resources and guidelines for pysteps developers and contributors. Contents ======== .. toctree:: :maxdepth: 1 user_guide/index pysteps_reference/index developer_guide/index zz_bibliography
Add link to github page and reference publication
Add link to github page and reference publication
reStructuredText
bsd-3-clause
pySTEPS/pysteps
restructuredtext
## Code Before: ==================================== pySTEPS -- The nowcasting initiative ==================================== pySTEPS is a community-driven initiative for developing and maintaining an easy to use, modular, free and open source Python framework for short-term ensemble prediction systems. The focus is on probabilistic nowcasting of radar precipitation fields, but pySTEPS is designed to allow a wider range of uses. Documentation ============= The documentation is separated in three big branches, intended for different audiences. :ref:`user-guide` ~~~~~~~~~~~~~~~~~ This section is intended for new pySTEPS users. It provides an introductory overview to the pySTEPS package, explains how to install it and make use of the most important features. :ref:`pysteps-reference` ~~~~~~~~~~~~~~~~~~~~~~~~ Comprehensive description of all the modules and functions available in pySTEPS. :ref:`contributors-guides` ~~~~~~~~~~~~~~~~~~~~~~~~~~ Resources and guidelines for pySTEPS developers and contributors. Contents ======== .. toctree:: :maxdepth: 1 user_guide/index pysteps_reference/index developer_guide/index zz_bibliography ## Instruction: Add link to github page and reference publication ## Code After: ==================================== pySTEPS -- The nowcasting initiative ==================================== Pysteps is a community-driven initiative for developing and maintaining an easy to use, modular, free and open source Python framework for short-term ensemble prediction systems. The focus is on probabilistic nowcasting of radar precipitation fields, but pysteps is designed to allow a wider range of uses. Pysteps is actively developed on GitHub__, while a more thorough description of pysteps is available in the pysteps reference publication: Pulkkinen, S., D. Nerini, A. Perez Hortal, C. Velasco-Forero, U. Germann, A. Seed, and L. Foresti, 2019: Pysteps: an open-source Python library for probabilistic precipitation nowcasting (v1.0). *Geosci. Model Dev. 
Discuss.*, doi:10.5194/gmd-2019-94, **in review**. [source__] __ https://github.com/pySTEPS/pysteps __ https://www.geosci-model-dev-discuss.net/gmd-2019-94/ Documentation ============= The documentation is separated in three big branches, intended for different audiences. :ref:`user-guide` ~~~~~~~~~~~~~~~~~ This section is intended for new pysteps users. It provides an introductory overview to the pysteps package, explains how to install it and make use of the most important features. :ref:`pysteps-reference` ~~~~~~~~~~~~~~~~~~~~~~~~ Comprehensive description of all the modules and functions available in pysteps. :ref:`contributors-guides` ~~~~~~~~~~~~~~~~~~~~~~~~~~ Resources and guidelines for pysteps developers and contributors. Contents ======== .. toctree:: :maxdepth: 1 user_guide/index pysteps_reference/index developer_guide/index zz_bibliography
b03a6179e48a2b12e47ba4882f13264b687ffd6a
.travis.yml
.travis.yml
language: ruby cache: bundler script: "bundle exec rake" rvm: - 1.9.3 - 2.0.0 env: - DB=sqlite - DB=mysql - DB=postgresql before_script: - cd spec/dummy; bundle exec rake db:setup; cd ../.. - cd spec/dummy; bundle rake db:test:prepare; cd ../.. bundler_args: "--binstubs --without development" # IRC notification notifications: irc: "irc.freenode.org#cyt"
language: ruby cache: bundler script: "bundle exec rake" rvm: - 1.9.3 - 2.0.0 env: - DB=sqlite - DB=mysql - DB=postgresql before_script: - bundle exec rake app:db:setup bundler_args: "--binstubs --without development" # IRC notification notifications: irc: "irc.freenode.org#cyt"
Change before_script to use app:db:setup.
Change before_script to use app:db:setup.
YAML
mit
hauledev/has_vcards,huerlisi/has_vcards,hauledev/has_vcards,huerlisi/has_vcards,hauledev/has_vcards,huerlisi/has_vcards
yaml
## Code Before: language: ruby cache: bundler script: "bundle exec rake" rvm: - 1.9.3 - 2.0.0 env: - DB=sqlite - DB=mysql - DB=postgresql before_script: - cd spec/dummy; bundle exec rake db:setup; cd ../.. - cd spec/dummy; bundle rake db:test:prepare; cd ../.. bundler_args: "--binstubs --without development" # IRC notification notifications: irc: "irc.freenode.org#cyt" ## Instruction: Change before_script to use app:db:setup. ## Code After: language: ruby cache: bundler script: "bundle exec rake" rvm: - 1.9.3 - 2.0.0 env: - DB=sqlite - DB=mysql - DB=postgresql before_script: - bundle exec rake app:db:setup bundler_args: "--binstubs --without development" # IRC notification notifications: irc: "irc.freenode.org#cyt"
b1225a46c82ada350a82fbe7c406cb66548318b1
watir/lib/watir/loader.rb
watir/lib/watir/loader.rb
module Watir class Browser class << self def new(browser=nil, *args) if browser && browser.to_sym != :ie && Watir.driver == :classic Watir.driver = :webdriver end Watir.load_driver if Watir.driver == :webdriver # remove this class method for WebDriver to avoid endless loop singleton_class = class << self; self end singleton_class.send :remove_method, :new end new browser, *args end end end class << self def load_driver require "watir-#{driver}" rescue LoadError puts "watir-#{driver} gem is missing. Install it with the following command:" puts " gem install watir-#{driver}" end def driver @driver || ENV["WATIR_DRIVER"] || default_driver end def driver=(driver) allowed_drivers = %w[webdriver classic] unless allowed_drivers.map(&:to_sym).include?(driver.to_sym) raise "Supported drivers are #{allowed_drivers.join(", ")}." end @driver = driver end def default_driver if ENV['OS'] == 'Windows_NT' :classic else :webdriver end end end end
module Watir class Browser class << self def new(browser=nil, *args) if browser && browser.to_sym != :ie && Watir.driver == :classic Watir.driver = :webdriver end Watir.load_driver if Watir.driver == :webdriver # remove this class method for WebDriver to avoid endless loop singleton_class = class << self; self end singleton_class.send :remove_method, :new end new browser, *args end end end class << self def load_driver require "watir-#{driver}" rescue LoadError warn "watir-#{driver} gem is missing. Install it with the following command:" warn " gem install watir-#{driver}" exit 1 end def driver @driver || ENV["WATIR_DRIVER"] || default_driver end def driver=(driver) allowed_drivers = %w[webdriver classic] unless allowed_drivers.map(&:to_sym).include?(driver.to_sym) raise "Supported drivers are #{allowed_drivers.join(", ")}." end @driver = driver end def default_driver if ENV['OS'] == 'Windows_NT' :classic else :webdriver end end end end
Exit when gem is missing.
Exit when gem is missing.
Ruby
bsd-3-clause
kawakami-o3/watir
ruby
## Code Before: module Watir class Browser class << self def new(browser=nil, *args) if browser && browser.to_sym != :ie && Watir.driver == :classic Watir.driver = :webdriver end Watir.load_driver if Watir.driver == :webdriver # remove this class method for WebDriver to avoid endless loop singleton_class = class << self; self end singleton_class.send :remove_method, :new end new browser, *args end end end class << self def load_driver require "watir-#{driver}" rescue LoadError puts "watir-#{driver} gem is missing. Install it with the following command:" puts " gem install watir-#{driver}" end def driver @driver || ENV["WATIR_DRIVER"] || default_driver end def driver=(driver) allowed_drivers = %w[webdriver classic] unless allowed_drivers.map(&:to_sym).include?(driver.to_sym) raise "Supported drivers are #{allowed_drivers.join(", ")}." end @driver = driver end def default_driver if ENV['OS'] == 'Windows_NT' :classic else :webdriver end end end end ## Instruction: Exit when gem is missing. ## Code After: module Watir class Browser class << self def new(browser=nil, *args) if browser && browser.to_sym != :ie && Watir.driver == :classic Watir.driver = :webdriver end Watir.load_driver if Watir.driver == :webdriver # remove this class method for WebDriver to avoid endless loop singleton_class = class << self; self end singleton_class.send :remove_method, :new end new browser, *args end end end class << self def load_driver require "watir-#{driver}" rescue LoadError warn "watir-#{driver} gem is missing. Install it with the following command:" warn " gem install watir-#{driver}" exit 1 end def driver @driver || ENV["WATIR_DRIVER"] || default_driver end def driver=(driver) allowed_drivers = %w[webdriver classic] unless allowed_drivers.map(&:to_sym).include?(driver.to_sym) raise "Supported drivers are #{allowed_drivers.join(", ")}." end @driver = driver end def default_driver if ENV['OS'] == 'Windows_NT' :classic else :webdriver end end end end
b272412370d68915fad534cf8a6ec3ae9dbae0b9
source-optional-requirements.txt
source-optional-requirements.txt
psutil==2.1.1 pycurl==7.19.5 pymongo==2.6.3
kazoo==1.3.1 psutil==2.1.1 pycurl==7.19.5 pymongo==2.6.3
Add kazoo as an optional requirement
Add kazoo as an optional requirement
Text
bsd-3-clause
brettlangdon/dd-agent,jyogi/purvar-agent,jraede/dd-agent,zendesk/dd-agent,zendesk/dd-agent,gphat/dd-agent,cberry777/dd-agent,JohnLZeller/dd-agent,Mashape/dd-agent,huhongbo/dd-agent,AniruddhaSAtre/dd-agent,citrusleaf/dd-agent,jraede/dd-agent,zendesk/dd-agent,truthbk/dd-agent,JohnLZeller/dd-agent,jvassev/dd-agent,PagerDuty/dd-agent,remh/dd-agent,amalakar/dd-agent,jshum/dd-agent,jshum/dd-agent,pfmooney/dd-agent,Mashape/dd-agent,PagerDuty/dd-agent,brettlangdon/dd-agent,AniruddhaSAtre/dd-agent,joelvanvelden/dd-agent,eeroniemi/dd-agent,truthbk/dd-agent,pmav99/praktoras,c960657/dd-agent,brettlangdon/dd-agent,pfmooney/dd-agent,manolama/dd-agent,Mashape/dd-agent,joelvanvelden/dd-agent,guruxu/dd-agent,zendesk/dd-agent,relateiq/dd-agent,huhongbo/dd-agent,yuecong/dd-agent,GabrielNicolasAvellaneda/dd-agent,takus/dd-agent,lookout/dd-agent,a20012251/dd-agent,manolama/dd-agent,mderomph-coolblue/dd-agent,ess/dd-agent,packetloop/dd-agent,citrusleaf/dd-agent,jamesandariese/dd-agent,oneandoneis2/dd-agent,joelvanvelden/dd-agent,jshum/dd-agent,pmav99/praktoras,gphat/dd-agent,benmccann/dd-agent,huhongbo/dd-agent,guruxu/dd-agent,urosgruber/dd-agent,huhongbo/dd-agent,jamesandariese/dd-agent,mderomph-coolblue/dd-agent,Shopify/dd-agent,jamesandariese/dd-agent,mderomph-coolblue/dd-agent,jvassev/dd-agent,remh/dd-agent,jamesandariese/dd-agent,joelvanvelden/dd-agent,Shopify/dd-agent,jraede/dd-agent,packetloop/dd-agent,JohnLZeller/dd-agent,Shopify/dd-agent,lookout/dd-agent,takus/dd-agent,a20012251/dd-agent,polynomial/dd-agent,takus/dd-agent,mderomph-coolblue/dd-agent,gphat/dd-agent,a20012251/dd-agent,tebriel/dd-agent,polynomial/dd-agent,urosgruber/dd-agent,polynomial/dd-agent,jyogi/purvar-agent,GabrielNicolasAvellaneda/dd-agent,jyogi/purvar-agent,eeroniemi/dd-agent,ess/dd-agent,indeedops/dd-agent,mderomph-coolblue/dd-agent,brettlangdon/dd-agent,gphat/dd-agent,Wattpad/dd-agent,ess/dd-agent,amalakar/dd-agent,amalakar/dd-agent,brettlangdon/dd-agent,c960657/dd-agent,Wattpad/dd-agent,amalakar/dd-agent,
darron/dd-agent,JohnLZeller/dd-agent,amalakar/dd-agent,guruxu/dd-agent,indeedops/dd-agent,c960657/dd-agent,pmav99/praktoras,lookout/dd-agent,jraede/dd-agent,PagerDuty/dd-agent,lookout/dd-agent,tebriel/dd-agent,citrusleaf/dd-agent,urosgruber/dd-agent,polynomial/dd-agent,GabrielNicolasAvellaneda/dd-agent,AntoCard/powerdns-recursor_check,tebriel/dd-agent,AniruddhaSAtre/dd-agent,pfmooney/dd-agent,manolama/dd-agent,eeroniemi/dd-agent,AntoCard/powerdns-recursor_check,zendesk/dd-agent,benmccann/dd-agent,truthbk/dd-agent,jamesandariese/dd-agent,cberry777/dd-agent,ess/dd-agent,jvassev/dd-agent,ess/dd-agent,guruxu/dd-agent,remh/dd-agent,benmccann/dd-agent,urosgruber/dd-agent,GabrielNicolasAvellaneda/dd-agent,eeroniemi/dd-agent,oneandoneis2/dd-agent,pmav99/praktoras,oneandoneis2/dd-agent,polynomial/dd-agent,jshum/dd-agent,Wattpad/dd-agent,relateiq/dd-agent,truthbk/dd-agent,jyogi/purvar-agent,yuecong/dd-agent,jvassev/dd-agent,takus/dd-agent,AntoCard/powerdns-recursor_check,packetloop/dd-agent,yuecong/dd-agent,jyogi/purvar-agent,AntoCard/powerdns-recursor_check,jshum/dd-agent,remh/dd-agent,a20012251/dd-agent,packetloop/dd-agent,manolama/dd-agent,relateiq/dd-agent,indeedops/dd-agent,darron/dd-agent,pfmooney/dd-agent,packetloop/dd-agent,indeedops/dd-agent,darron/dd-agent,GabrielNicolasAvellaneda/dd-agent,indeedops/dd-agent,cberry777/dd-agent,benmccann/dd-agent,gphat/dd-agent,a20012251/dd-agent,remh/dd-agent,oneandoneis2/dd-agent,Shopify/dd-agent,cberry777/dd-agent,tebriel/dd-agent,takus/dd-agent,Wattpad/dd-agent,relateiq/dd-agent,relateiq/dd-agent,guruxu/dd-agent,manolama/dd-agent,joelvanvelden/dd-agent,darron/dd-agent,AniruddhaSAtre/dd-agent,lookout/dd-agent,yuecong/dd-agent,truthbk/dd-agent,pmav99/praktoras,darron/dd-agent,urosgruber/dd-agent,cberry777/dd-agent,citrusleaf/dd-agent,AntoCard/powerdns-recursor_check,pfmooney/dd-agent,PagerDuty/dd-agent,tebriel/dd-agent,JohnLZeller/dd-agent,AniruddhaSAtre/dd-agent,citrusleaf/dd-agent,Mashape/dd-agent,c960657/dd-agent,yuecong/dd-agen
t,jraede/dd-agent,oneandoneis2/dd-agent,Shopify/dd-agent,Wattpad/dd-agent,huhongbo/dd-agent,eeroniemi/dd-agent,jvassev/dd-agent,Mashape/dd-agent,PagerDuty/dd-agent,benmccann/dd-agent,c960657/dd-agent
text
## Code Before: psutil==2.1.1 pycurl==7.19.5 pymongo==2.6.3 ## Instruction: Add kazoo as an optional requirement ## Code After: kazoo==1.3.1 psutil==2.1.1 pycurl==7.19.5 pymongo==2.6.3
828585e51bd57e94296b063345070dead4d1717f
src/client/app/graph/canvas/canvas.component.html
src/client/app/graph/canvas/canvas.component.html
<svg id="svg" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" (mousedown)="this.onMousedown($event)" (mousewheel)="this.onMousewheel($event)"> <g uxg-arrow *ngFor="let arrow of (this.canvasService.arrows | toIterable)" [arrow]="arrow"></g> <g uxg-card *ngFor="let card of (this.canvasService.cards | toIterable)" [card]="card" [canvasBoundsGetter]="getBounds"></g> </svg>
<svg id="svg" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" (mousedown)="this.onMousedown($event)" (mousewheel)="this.onMousewheel($event)"> <g uxg-arrow *ngFor="let arrow of (this.canvasService.arrows | toIterable)" [arrow]="arrow"></g> <g uxg-card *ngFor="let card of (this.canvasService.cards | toIterable)" [card]="card" [attr.card-id]="card.id" [canvasBoundsGetter]="getBounds"></g> </svg>
Add the card id as an attribute on the card's HTML element
Add the card id as an attribute on the card's HTML element
HTML
mit
tessa3/uxgraph,tessa3/uxgraph,tessa3/uxgraph,tessa3/uxgraph
html
## Code Before: <svg id="svg" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" (mousedown)="this.onMousedown($event)" (mousewheel)="this.onMousewheel($event)"> <g uxg-arrow *ngFor="let arrow of (this.canvasService.arrows | toIterable)" [arrow]="arrow"></g> <g uxg-card *ngFor="let card of (this.canvasService.cards | toIterable)" [card]="card" [canvasBoundsGetter]="getBounds"></g> </svg> ## Instruction: Add the card id as an attribute on the card's HTML element ## Code After: <svg id="svg" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" (mousedown)="this.onMousedown($event)" (mousewheel)="this.onMousewheel($event)"> <g uxg-arrow *ngFor="let arrow of (this.canvasService.arrows | toIterable)" [arrow]="arrow"></g> <g uxg-card *ngFor="let card of (this.canvasService.cards | toIterable)" [card]="card" [attr.card-id]="card.id" [canvasBoundsGetter]="getBounds"></g> </svg>
fc775adb3bde69e432502bef304715390ef64688
requirements.txt
requirements.txt
codecov==2.0.11 docutils==0.14 flake8==3.5.0 Pygments==2.2.0 pyobjc-core==4.1 ; sys_platform == 'darwin' pytest==5.4.2 pytest-cov==2.5.1 PyYAML==5.3.1
codecov==2.0.11 docutils==0.14 flake8==3.5.0 Pygments==2.2.0 pyobjc-core==4.1 ; sys_platform == 'darwin' pytest-cov==2.5.1 pytest==5.4.2 ; python_version > "3.4" PyYAML==5.3.1 ; python_version > "3.4" pytest==3.3.1 ; python_version <= "3.4" pyyaml==3.12 ; python_version <= "3.4"
Add environment markers to aid python3.4 support
Add environment markers to aid python3.4 support
Text
mit
GreatFruitOmsk/nativeconfig
text
## Code Before: codecov==2.0.11 docutils==0.14 flake8==3.5.0 Pygments==2.2.0 pyobjc-core==4.1 ; sys_platform == 'darwin' pytest==5.4.2 pytest-cov==2.5.1 PyYAML==5.3.1 ## Instruction: Add environment markers to aid python3.4 support ## Code After: codecov==2.0.11 docutils==0.14 flake8==3.5.0 Pygments==2.2.0 pyobjc-core==4.1 ; sys_platform == 'darwin' pytest-cov==2.5.1 pytest==5.4.2 ; python_version > "3.4" PyYAML==5.3.1 ; python_version > "3.4" pytest==3.3.1 ; python_version <= "3.4" pyyaml==3.12 ; python_version <= "3.4"
cc4f209f011639d4f6888a39e71b1145e58cf5c5
src/route/changelogs.js
src/route/changelogs.js
const router = require('express').Router(); const scraper = require('../service/scraper'); router.get('/', (req, res, next) => { const packageName = req.query.package; if (!packageName) { const err = new Error('Undefined query'); err.satus = 404; next(err); } scraper .scan(packageName) .then(result => { const changes = result.changes; res.json({ packageName, changes }); }) .catch(error => { const err = new Error('Could not request info'); err.status = 404; next(err); }); }); module.exports = router;
const router = require('express').Router(); const scraper = require('../service/scraper'); router.get('/:package/latest', (req, res, next) => { const packageName = req.params.package; if (!packageName) { const err = new Error('Undefined query'); err.satus = 404; next(err); } scraper .scan(packageName) .then(result => { const changes = result.changes; res.json({ packageName, changes }); }) .catch(error => { const err = new Error('Could not request info'); err.status = 404; next(err); }); }); module.exports = router;
Change package query from query param to path variable
Change package query from query param to path variable
JavaScript
mit
enric-sinh/android-changelog-api
javascript
## Code Before: const router = require('express').Router(); const scraper = require('../service/scraper'); router.get('/', (req, res, next) => { const packageName = req.query.package; if (!packageName) { const err = new Error('Undefined query'); err.satus = 404; next(err); } scraper .scan(packageName) .then(result => { const changes = result.changes; res.json({ packageName, changes }); }) .catch(error => { const err = new Error('Could not request info'); err.status = 404; next(err); }); }); module.exports = router; ## Instruction: Change package query from query param to path variable ## Code After: const router = require('express').Router(); const scraper = require('../service/scraper'); router.get('/:package/latest', (req, res, next) => { const packageName = req.params.package; if (!packageName) { const err = new Error('Undefined query'); err.satus = 404; next(err); } scraper .scan(packageName) .then(result => { const changes = result.changes; res.json({ packageName, changes }); }) .catch(error => { const err = new Error('Could not request info'); err.status = 404; next(err); }); }); module.exports = router;
062b5bc427c39067628bd0b68ba03d30f654e9e0
lib/api-wmata.js
lib/api-wmata.js
var request = require('request'); var wmataReq = request.defaults({ baseUrl: 'https://api.wmata.com', qs: { api_key: process.env.WMATA_API_KEY } }); exports.get = function (endpoint, successCallback, errorCallback) { wmataReq(endpoint, function (error, res, body) { if (!error && res.statusCode === 200) { successCallback(JSON.parse(body)); } else { errorCallback(error || res); } }); };
var request = require('request'); var wmataReq = request.defaults({ baseUrl: 'https://api.wmata.com', qs: { api_key: process.env.WMATA_API_KEY } }); exports.get = function (endpoint, successCallback, errorCallback) { wmataReq(endpoint, function (error, res, body) { if (!error && res.statusCode === 200) { successCallback(JSON.parse(body)); } else { error = error || res; console.log(JSON.stringify(errorCallback)); if (errorCallback) { errorCallback(error); } else { throw new Error(error); } } }); };
Throw error when errorCallback isn't defined
Throw error when errorCallback isn't defined
JavaScript
mit
pmyers88/dc-metro-echo
javascript
## Code Before: var request = require('request'); var wmataReq = request.defaults({ baseUrl: 'https://api.wmata.com', qs: { api_key: process.env.WMATA_API_KEY } }); exports.get = function (endpoint, successCallback, errorCallback) { wmataReq(endpoint, function (error, res, body) { if (!error && res.statusCode === 200) { successCallback(JSON.parse(body)); } else { errorCallback(error || res); } }); }; ## Instruction: Throw error when errorCallback isn't defined ## Code After: var request = require('request'); var wmataReq = request.defaults({ baseUrl: 'https://api.wmata.com', qs: { api_key: process.env.WMATA_API_KEY } }); exports.get = function (endpoint, successCallback, errorCallback) { wmataReq(endpoint, function (error, res, body) { if (!error && res.statusCode === 200) { successCallback(JSON.parse(body)); } else { error = error || res; console.log(JSON.stringify(errorCallback)); if (errorCallback) { errorCallback(error); } else { throw new Error(error); } } }); };
88dba9b99bcd7090657ba78ec42cb10ac786ad8e
tests/PDOTest.php
tests/PDOTest.php
<?php use PreSQL\PDO; class PDOTest extends PHPUnit_Framework_TestCase { public function testIsPreSQLPDOVersionReturnedCorrectly() { $this->assertEquals(PDO::getVersion(), PDO::$version); } }
<?php use PreSQL\PDO; class PDOTest extends PHPUnit_Framework_TestCase { /** * @covers \PreSQL\PDO */ public function testIsPreSQLPDOVersionReturnedCorrectly() { $this->assertEquals(PDO::getVersion(), PDO::$version); } }
Add @covers annotation to test how coverage report changes.
Add @covers annotation to test how coverage report changes.
PHP
bsd-3-clause
PreSQL/PreSQL-PDO
php
## Code Before: <?php use PreSQL\PDO; class PDOTest extends PHPUnit_Framework_TestCase { public function testIsPreSQLPDOVersionReturnedCorrectly() { $this->assertEquals(PDO::getVersion(), PDO::$version); } } ## Instruction: Add @covers annotation to test how coverage report changes. ## Code After: <?php use PreSQL\PDO; class PDOTest extends PHPUnit_Framework_TestCase { /** * @covers \PreSQL\PDO */ public function testIsPreSQLPDOVersionReturnedCorrectly() { $this->assertEquals(PDO::getVersion(), PDO::$version); } }
e34bafc84fd5a61c8bb9aeba69d7b1ebf671f99b
ppapi/native_client/src/shared/ppapi_proxy/plugin_ppb_core.h
ppapi/native_client/src/shared/ppapi_proxy/plugin_ppb_core.h
// Copyright 2011 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can // be found in the LICENSE file. #ifndef NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_ #define NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_ #include "native_client/src/include/nacl_macros.h" class PPB_Core; namespace ppapi_proxy { // Implements the untrusted side of the PPB_Core interface. // We will also need an rpc service to implement the trusted side, which is a // very thin wrapper around the PPB_Core interface returned from the browser. class PluginCore { public: // Return an interface pointer usable by PPAPI plugins. static const PPB_Core* GetInterface(); // Mark the calling thread as the main thread for IsMainThread. static void MarkMainThread(); private: NACL_DISALLOW_COPY_AND_ASSIGN(PluginCore); }; } // namespace ppapi_proxy #endif // NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_
// Copyright (c) 2011 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #ifndef NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_ #define NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_ #include "native_client/src/include/nacl_macros.h" struct PPB_Core; namespace ppapi_proxy { // Implements the untrusted side of the PPB_Core interface. // We will also need an rpc service to implement the trusted side, which is a // very thin wrapper around the PPB_Core interface returned from the browser. class PluginCore { public: // Return an interface pointer usable by PPAPI plugins. static const PPB_Core* GetInterface(); // Mark the calling thread as the main thread for IsMainThread. static void MarkMainThread(); private: NACL_DISALLOW_COPY_AND_ASSIGN(PluginCore); }; } // namespace ppapi_proxy #endif // NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_
Declare PPB_Core as a struct, not a class. (Prevents Clang warning)
Declare PPB_Core as a struct, not a class. (Prevents Clang warning) Review URL: http://codereview.chromium.org/8002017 git-svn-id: de016e52bd170d2d4f2344f9bf92d50478b649e0@102565 0039d316-1c4b-4281-b951-d872f2087c98
C
bsd-3-clause
adobe/chromium,gavinp/chromium,yitian134/chromium,yitian134/chromium,gavinp/chromium,gavinp/chromium,ropik/chromium,ropik/chromium,ropik/chromium,yitian134/chromium,adobe/chromium,yitian134/chromium,gavinp/chromium,gavinp/chromium,yitian134/chromium,adobe/chromium,yitian134/chromium,yitian134/chromium,adobe/chromium,adobe/chromium,gavinp/chromium,adobe/chromium,ropik/chromium,yitian134/chromium,adobe/chromium,yitian134/chromium,ropik/chromium,adobe/chromium,adobe/chromium,gavinp/chromium,ropik/chromium,adobe/chromium,ropik/chromium,ropik/chromium,yitian134/chromium,gavinp/chromium,gavinp/chromium,gavinp/chromium,adobe/chromium,ropik/chromium
c
## Code Before: // Copyright 2011 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can // be found in the LICENSE file. #ifndef NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_ #define NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_ #include "native_client/src/include/nacl_macros.h" class PPB_Core; namespace ppapi_proxy { // Implements the untrusted side of the PPB_Core interface. // We will also need an rpc service to implement the trusted side, which is a // very thin wrapper around the PPB_Core interface returned from the browser. class PluginCore { public: // Return an interface pointer usable by PPAPI plugins. static const PPB_Core* GetInterface(); // Mark the calling thread as the main thread for IsMainThread. static void MarkMainThread(); private: NACL_DISALLOW_COPY_AND_ASSIGN(PluginCore); }; } // namespace ppapi_proxy #endif // NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_ ## Instruction: Declare PPB_Core as a struct, not a class. (Prevents Clang warning) Review URL: http://codereview.chromium.org/8002017 git-svn-id: de016e52bd170d2d4f2344f9bf92d50478b649e0@102565 0039d316-1c4b-4281-b951-d872f2087c98 ## Code After: // Copyright (c) 2011 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #ifndef NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_ #define NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_ #include "native_client/src/include/nacl_macros.h" struct PPB_Core; namespace ppapi_proxy { // Implements the untrusted side of the PPB_Core interface. // We will also need an rpc service to implement the trusted side, which is a // very thin wrapper around the PPB_Core interface returned from the browser. class PluginCore { public: // Return an interface pointer usable by PPAPI plugins. static const PPB_Core* GetInterface(); // Mark the calling thread as the main thread for IsMainThread. 
static void MarkMainThread(); private: NACL_DISALLOW_COPY_AND_ASSIGN(PluginCore); }; } // namespace ppapi_proxy #endif // NATIVE_CLIENT_SRC_SHARED_PPAPI_PROXY_PLUGIN_CORE_H_
3bca2987aa8e822da860f4993b383172ba31b424
app/views/team_name/sign.html.erb
app/views/team_name/sign.html.erb
<% content_for :body_class, "#{ @colour }" %> <h1> <%= (@team_name.upcase == @team_name || @team_name.downcase == @team_name) ? @team_name.upcase : @team_name %> </h1>
<% content_for :body_class, "#{ @colour }" %> <h1> <% @team_name = @team_name.gsub(/^(GOV.UK)/i, "\\1<br>") %> <%= (@team_name.upcase == @team_name || @team_name.downcase == @team_name) ? sanitize(@team_name.upcase) : sanitize(@team_name) %> </h1>
Insert a line break after GOV.UK
Insert a line break after GOV.UK Matches the design of the physical signs
HTML+ERB
mit
govuklaurence/tv-team-name,govuklaurence/tv-team-name,govuklaurence/tv-team-name
html+erb
## Code Before: <% content_for :body_class, "#{ @colour }" %> <h1> <%= (@team_name.upcase == @team_name || @team_name.downcase == @team_name) ? @team_name.upcase : @team_name %> </h1> ## Instruction: Insert a line break after GOV.UK Matches the design of the physical signs ## Code After: <% content_for :body_class, "#{ @colour }" %> <h1> <% @team_name = @team_name.gsub(/^(GOV.UK)/i, "\\1<br>") %> <%= (@team_name.upcase == @team_name || @team_name.downcase == @team_name) ? sanitize(@team_name.upcase) : sanitize(@team_name) %> </h1>
7a1bdf74c98feb61db8db4803443a7a32b8e58f2
cookbooks/omnibus-supermarket/.kitchen.yml
cookbooks/omnibus-supermarket/.kitchen.yml
--- driver: name: vagrant network: - ['private_network', { ip: '192.168.33.33' }] synced_folders: - ['../../pkg', '/tmp/packages', 'create: true, type: :rsync'] provisioner: name: chef_solo solo_rb: ssl_verify_mode: verify_peer platforms: - name: ubuntu-10.04 - name: ubuntu-12.04 - name: centos-5.11 - name: centos-6.7 - name: centos-7.1 suites: - name: default run_list: - recipe[omnibus-supermarket::cookbook_test] attributes: supermarket: test: deb_package_path: <%= ENV['SUPERMARKET_DEB'] || false %> rpm_package_path: <%= ENV['SUPERMARKET_RPM'] || false %>
--- driver: name: vagrant network: - ['private_network', { ip: '192.168.33.33' }] synced_folders: - ['../../pkg', '/tmp/packages', 'create: true, type: :rsync'] provisioner: name: chef_solo solo_rb: ssl_verify_mode: verify_peer platforms: - name: ubuntu-10.04 - name: ubuntu-12.04 - name: ubuntu-14.04 - name: centos-5.11 - name: centos-6.7 - name: centos-7.1 suites: - name: default run_list: - recipe[omnibus-supermarket::cookbook_test] attributes: supermarket: test: deb_package_path: <%= ENV['SUPERMARKET_DEB'] || false %> rpm_package_path: <%= ENV['SUPERMARKET_RPM'] || false %>
Add Ubuntu 14.04 to TK test matrix
Add Ubuntu 14.04 to TK test matrix
YAML
apache-2.0
chef/supermarket,chef/supermarket,chef/omnibus-supermarket,nellshamrell/supermarket,nellshamrell/supermarket,robbkidd/supermarket,juliandunn/supermarket,robbkidd/supermarket,chef/supermarket,nellshamrell/supermarket,chef/supermarket,juliandunn/supermarket,tas50/supermarket,tas50/supermarket,chef/omnibus-supermarket,tas50/supermarket,tas50/supermarket,robbkidd/supermarket,nellshamrell/supermarket,chef/supermarket,juliandunn/supermarket,juliandunn/supermarket,chef/omnibus-supermarket,robbkidd/supermarket
yaml
## Code Before: --- driver: name: vagrant network: - ['private_network', { ip: '192.168.33.33' }] synced_folders: - ['../../pkg', '/tmp/packages', 'create: true, type: :rsync'] provisioner: name: chef_solo solo_rb: ssl_verify_mode: verify_peer platforms: - name: ubuntu-10.04 - name: ubuntu-12.04 - name: centos-5.11 - name: centos-6.7 - name: centos-7.1 suites: - name: default run_list: - recipe[omnibus-supermarket::cookbook_test] attributes: supermarket: test: deb_package_path: <%= ENV['SUPERMARKET_DEB'] || false %> rpm_package_path: <%= ENV['SUPERMARKET_RPM'] || false %> ## Instruction: Add Ubuntu 14.04 to TK test matrix ## Code After: --- driver: name: vagrant network: - ['private_network', { ip: '192.168.33.33' }] synced_folders: - ['../../pkg', '/tmp/packages', 'create: true, type: :rsync'] provisioner: name: chef_solo solo_rb: ssl_verify_mode: verify_peer platforms: - name: ubuntu-10.04 - name: ubuntu-12.04 - name: ubuntu-14.04 - name: centos-5.11 - name: centos-6.7 - name: centos-7.1 suites: - name: default run_list: - recipe[omnibus-supermarket::cookbook_test] attributes: supermarket: test: deb_package_path: <%= ENV['SUPERMARKET_DEB'] || false %> rpm_package_path: <%= ENV['SUPERMARKET_RPM'] || false %>
8472cdf128c3b08252d73cae39a3ee12a1e13ed8
Sources/com/kaviju/accesscontrol/authentication/RandomPasswordGenerator.java
Sources/com/kaviju/accesscontrol/authentication/RandomPasswordGenerator.java
package com.kaviju.accesscontrol.authentication; import java.util.Random; public class RandomPasswordGenerator { static final String possibleChars = "23456789abcdefghjklmnpqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ@#$%?&*()-+"; static Random rnd = new Random(); static public String newPassword(int len) { { StringBuilder password = new StringBuilder( len ); for( int i = 0; i < len; i++ ) { int nextCharIndex = rnd.nextInt(possibleChars.length()); password.append(possibleChars.charAt(nextCharIndex)); } return password.toString(); } } }
package com.kaviju.accesscontrol.authentication; import java.util.Random; public class RandomPasswordGenerator { static final String possibleChars = "23456789abcdefghjklmnpqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ@#$%?&*()-+"; static final String possibleDigits = "0123456789"; static Random rnd = new Random(); static public String newPassword(int len) { { StringBuilder password = new StringBuilder( len ); for( int i = 0; i < len; i++ ) { int nextCharIndex = rnd.nextInt(possibleChars.length()); password.append(possibleChars.charAt(nextCharIndex)); } return password.toString(); } } static public String newDigitsPassword(int len) { { StringBuilder password = new StringBuilder( len ); for( int i = 0; i < len; i++ ) { int nextCharIndex = rnd.nextInt(possibleDigits.length()); password.append(possibleDigits.charAt(nextCharIndex)); } return password.toString(); } } }
Add a digits only password generator.
Add a digits only password generator.
Java
apache-2.0
Kaviju/KAAccessControl,Kaviju/KAAccessControl
java
## Code Before: package com.kaviju.accesscontrol.authentication; import java.util.Random; public class RandomPasswordGenerator { static final String possibleChars = "23456789abcdefghjklmnpqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ@#$%?&*()-+"; static Random rnd = new Random(); static public String newPassword(int len) { { StringBuilder password = new StringBuilder( len ); for( int i = 0; i < len; i++ ) { int nextCharIndex = rnd.nextInt(possibleChars.length()); password.append(possibleChars.charAt(nextCharIndex)); } return password.toString(); } } } ## Instruction: Add a digits only password generator. ## Code After: package com.kaviju.accesscontrol.authentication; import java.util.Random; public class RandomPasswordGenerator { static final String possibleChars = "23456789abcdefghjklmnpqrstuvwxyzABCDEFGHJKLMNPQRSTUVWXYZ@#$%?&*()-+"; static final String possibleDigits = "0123456789"; static Random rnd = new Random(); static public String newPassword(int len) { { StringBuilder password = new StringBuilder( len ); for( int i = 0; i < len; i++ ) { int nextCharIndex = rnd.nextInt(possibleChars.length()); password.append(possibleChars.charAt(nextCharIndex)); } return password.toString(); } } static public String newDigitsPassword(int len) { { StringBuilder password = new StringBuilder( len ); for( int i = 0; i < len; i++ ) { int nextCharIndex = rnd.nextInt(possibleDigits.length()); password.append(possibleDigits.charAt(nextCharIndex)); } return password.toString(); } } }
6a692e20176a5514b4a63a45f6bb90a079b48cff
spec/controllers/users_controller_spec.rb
spec/controllers/users_controller_spec.rb
require 'spec_helper' describe UsersController do before :all do @model = User end before :all do @user = Factory :user end context "as an admin" do end #TODO lots of work here end
require 'spec_helper' describe UsersController do before :all do @model = User end context "as an admin" do before do login_as_admin end it_should_behave_like "standard GET index" it_should_behave_like "standard GET show" end context "as a player" do before do login end it_should_behave_like "unauthorized GET index" it_should_behave_like "unauthorized GET show" end context "as an anonymous user" do it_should_behave_like "unauthorized GET index" it_should_behave_like "unauthorized GET show" end end
Test admin functions for viewing users.
Test admin functions for viewing users.
Ruby
mit
peplin/threephase,peplin/threephase
ruby
## Code Before: require 'spec_helper' describe UsersController do before :all do @model = User end before :all do @user = Factory :user end context "as an admin" do end #TODO lots of work here end ## Instruction: Test admin functions for viewing users. ## Code After: require 'spec_helper' describe UsersController do before :all do @model = User end context "as an admin" do before do login_as_admin end it_should_behave_like "standard GET index" it_should_behave_like "standard GET show" end context "as a player" do before do login end it_should_behave_like "unauthorized GET index" it_should_behave_like "unauthorized GET show" end context "as an anonymous user" do it_should_behave_like "unauthorized GET index" it_should_behave_like "unauthorized GET show" end end
730c2dc11a7854543416aa399b527b22d9a9c334
assemblies/docker/docker-entrypoint.sh
assemblies/docker/docker-entrypoint.sh
echo "Starting Apache Karaf" exec $KARAF_HOME/bin/karaf exec "$@"
if [ "$1" = "server" ]; then echo "Starting Apache Karaf (server mode)" exec $KARAF_HOME/bin/karaf server else echo "Starting Apache Karaf" exec $KARAF_HOME/bin/karaf fi exec "$@"
Add support of server mode in the Docker entrypoint
[KARAF-5982] Add support of server mode in the Docker entrypoint
Shell
apache-2.0
grgrzybek/karaf,grgrzybek/karaf,grgrzybek/karaf
shell
## Code Before: echo "Starting Apache Karaf" exec $KARAF_HOME/bin/karaf exec "$@" ## Instruction: [KARAF-5982] Add support of server mode in the Docker entrypoint ## Code After: if [ "$1" = "server" ]; then echo "Starting Apache Karaf (server mode)" exec $KARAF_HOME/bin/karaf server else echo "Starting Apache Karaf" exec $KARAF_HOME/bin/karaf fi exec "$@"
5016d4facd131380871288df0456910d29c39c73
lib/retina_rails/extensions/carrierwave.rb
lib/retina_rails/extensions/carrierwave.rb
module RetinaRails module Extensions module CarrierWave module Mount def self.included base base.module_eval do alias_method :original_mount_uploader, :mount_uploader ## # Serialize retina_dimensions # if mounted to class has a retina_dimensions column # def mount_uploader(*args) original_mount_uploader(*args) serialize :retina_dimensions if columns_hash.has_key?('retina_dimensions') end end end end # Mount end # CarrierWave end # Extensions end # RetinaRails
module RetinaRails module Extensions module CarrierWave module Mount def self.included base base.module_eval do alias_method :original_mount_uploader, :mount_uploader ## # Serialize retina_dimensions # if mounted to class has a retina_dimensions column # def mount_uploader(*args) original_mount_uploader(*args) serialize :retina_dimensions if table_exists? && columns_hash.has_key?('retina_dimensions') end end end end # Mount end # CarrierWave end # Extensions end # RetinaRails
Check if table exists before calling columns_hash
Check if table exists before calling columns_hash
Ruby
mit
jhnvz/retina_rails,testzugang/retina_rails
ruby
## Code Before: module RetinaRails module Extensions module CarrierWave module Mount def self.included base base.module_eval do alias_method :original_mount_uploader, :mount_uploader ## # Serialize retina_dimensions # if mounted to class has a retina_dimensions column # def mount_uploader(*args) original_mount_uploader(*args) serialize :retina_dimensions if columns_hash.has_key?('retina_dimensions') end end end end # Mount end # CarrierWave end # Extensions end # RetinaRails ## Instruction: Check if table exists before calling columns_hash ## Code After: module RetinaRails module Extensions module CarrierWave module Mount def self.included base base.module_eval do alias_method :original_mount_uploader, :mount_uploader ## # Serialize retina_dimensions # if mounted to class has a retina_dimensions column # def mount_uploader(*args) original_mount_uploader(*args) serialize :retina_dimensions if table_exists? && columns_hash.has_key?('retina_dimensions') end end end end # Mount end # CarrierWave end # Extensions end # RetinaRails
284aff32125618a99015da958004f5bc1c5c8c1e
src/main/java/session/XaJpaDemoBean.java
src/main/java/session/XaJpaDemoBean.java
package session; import java.util.List; import javax.persistence.EntityManager; import javax.persistence.EntityTransaction; import javax.persistence.PersistenceContext; import javax.persistence.Query; import domain.Customer; import domain.Order; public class XaJpaDemoBean { @PersistenceContext(name="customer") EntityManager customerEntityManager; @PersistenceContext(name="orders") EntityManager orderEntityManager; public void saveCustomerOrder(Customer c, Order o, Boolean succeeds) { try { EntityTransaction customerTransaction = customerEntityManager.getTransaction(); customerTransaction.begin(); // Update the customer entity in the database. customerEntityManager.merge(c); c.setNumberOfOrders(c.getNumberOfOrders() + 1); customerTransaction.commit(); int cid = c.getId(); System.out.println("Updated Customer with Id " + cid); EntityTransaction orderTransaction = orderEntityManager.getTransaction(); orderTransaction.begin(); // Insert the order entity in the database. orderEntityManager.persist(c); orderTransaction.commit(); int oid = o.getId(); System.out.println("Created order with Id " + oid); Query query = customerEntityManager.createQuery("select c from Customer c order by c.name"); List<Customer> list = query.getResultList(); System.out.println("There are " + list.size() + " persons:"); for (Customer p : list) { System.out.println(p.getName()); } System.out.println(); } finally { if (!succeeds) { throw new RuntimeException("Simulated failure!"); } } } }
package session; import javax.ejb.Stateless; import javax.persistence.EntityManager; import javax.persistence.EntityTransaction; import javax.persistence.PersistenceContext; import domain.Customer; import domain.Order; @Stateless public class XaJpaDemoBean { @PersistenceContext(name="customer") EntityManager customerEntityManager; @PersistenceContext(name="orders") EntityManager orderEntityManager; public void saveCustomerOrder(Customer c, Order o, Boolean succeeds) { try { EntityTransaction customerTransaction = customerEntityManager.getTransaction(); customerTransaction.begin(); // Update the customer entity in the database. customerEntityManager.merge(c); c.setNumberOfOrders(c.getNumberOfOrders() + 1); customerTransaction.commit(); int cid = c.getId(); System.out.println("XaJpaDemoBean.saveCustomerOrder(): Updated Customer with Id " + cid); EntityTransaction orderTransaction = orderEntityManager.getTransaction(); orderTransaction.begin(); // Insert the order entity in the database. orderEntityManager.persist(c); orderTransaction.commit(); int oid = o.getId(); System.out.println("XaJpaDemoBean.saveCustomerOrder(): Created order with Id " + oid); } finally { if (!succeeds) { throw new RuntimeException("XaJpaDemoBean.saveCustomerOrder(): Simulated failure!"); } } } }
Annotate correctly (@Stateless). Correct several printouts.
Annotate correctly (@Stateless). Correct several printouts.
Java
bsd-2-clause
IanDarwin/eedemo-xa,IanDarwin/eedemo-xa
java
## Code Before: package session; import java.util.List; import javax.persistence.EntityManager; import javax.persistence.EntityTransaction; import javax.persistence.PersistenceContext; import javax.persistence.Query; import domain.Customer; import domain.Order; public class XaJpaDemoBean { @PersistenceContext(name="customer") EntityManager customerEntityManager; @PersistenceContext(name="orders") EntityManager orderEntityManager; public void saveCustomerOrder(Customer c, Order o, Boolean succeeds) { try { EntityTransaction customerTransaction = customerEntityManager.getTransaction(); customerTransaction.begin(); // Update the customer entity in the database. customerEntityManager.merge(c); c.setNumberOfOrders(c.getNumberOfOrders() + 1); customerTransaction.commit(); int cid = c.getId(); System.out.println("Updated Customer with Id " + cid); EntityTransaction orderTransaction = orderEntityManager.getTransaction(); orderTransaction.begin(); // Insert the order entity in the database. orderEntityManager.persist(c); orderTransaction.commit(); int oid = o.getId(); System.out.println("Created order with Id " + oid); Query query = customerEntityManager.createQuery("select c from Customer c order by c.name"); List<Customer> list = query.getResultList(); System.out.println("There are " + list.size() + " persons:"); for (Customer p : list) { System.out.println(p.getName()); } System.out.println(); } finally { if (!succeeds) { throw new RuntimeException("Simulated failure!"); } } } } ## Instruction: Annotate correctly (@Stateless). Correct several printouts. 
## Code After: package session; import javax.ejb.Stateless; import javax.persistence.EntityManager; import javax.persistence.EntityTransaction; import javax.persistence.PersistenceContext; import domain.Customer; import domain.Order; @Stateless public class XaJpaDemoBean { @PersistenceContext(name="customer") EntityManager customerEntityManager; @PersistenceContext(name="orders") EntityManager orderEntityManager; public void saveCustomerOrder(Customer c, Order o, Boolean succeeds) { try { EntityTransaction customerTransaction = customerEntityManager.getTransaction(); customerTransaction.begin(); // Update the customer entity in the database. customerEntityManager.merge(c); c.setNumberOfOrders(c.getNumberOfOrders() + 1); customerTransaction.commit(); int cid = c.getId(); System.out.println("XaJpaDemoBean.saveCustomerOrder(): Updated Customer with Id " + cid); EntityTransaction orderTransaction = orderEntityManager.getTransaction(); orderTransaction.begin(); // Insert the order entity in the database. orderEntityManager.persist(c); orderTransaction.commit(); int oid = o.getId(); System.out.println("XaJpaDemoBean.saveCustomerOrder(): Created order with Id " + oid); } finally { if (!succeeds) { throw new RuntimeException("XaJpaDemoBean.saveCustomerOrder(): Simulated failure!"); } } } }
ff65853def5bf1044fe457362f85b8aecca66152
tests/laser/transaction/create.py
tests/laser/transaction/create.py
import mythril.laser.ethereum.transaction as transaction from mythril.ether import util import mythril.laser.ethereum.svm as svm from mythril.disassembler.disassembly import Disassembly from datetime import datetime from mythril.ether.soliditycontract import SolidityContract import tests from mythril.analysis.security import fire_lasers from mythril.analysis.symbolic import SymExecWrapper def test_create(): contract = SolidityContract(str(tests.TESTDATA_INPUTS_CONTRACTS / 'calls.sol')) laser_evm = svm.LaserEVM({}) laser_evm.time = datetime.now() laser_evm.execute_contract_creation(contract.creation_code) resulting_final_state = laser_evm.open_states[0] for address, created_account in resulting_final_state.accounts.items(): created_account_code = created_account.code actual_code = Disassembly(contract.code) for i in range(len(created_account_code.instruction_list)): found_instruction = created_account_code.instruction_list[i] actual_instruction = actual_code.instruction_list[i] assert found_instruction['opcode'] == actual_instruction['opcode'] def test_sym_exec(): contract = SolidityContract(str(tests.TESTDATA_INPUTS_CONTRACTS / 'calls.sol')) sym = SymExecWrapper(contract, address=(util.get_indexed_address(0)), strategy="dfs") issues = fire_lasers(sym) assert len(issues) != 0
from mythril.laser.ethereum.transaction import execute_contract_creation from mythril.ether import util import mythril.laser.ethereum.svm as svm from mythril.disassembler.disassembly import Disassembly from datetime import datetime from mythril.ether.soliditycontract import SolidityContract import tests from mythril.analysis.security import fire_lasers from mythril.analysis.symbolic import SymExecWrapper def test_create(): contract = SolidityContract(str(tests.TESTDATA_INPUTS_CONTRACTS / 'calls.sol')) laser_evm = svm.LaserEVM({}) laser_evm.time = datetime.now() execute_contract_creation(laser_evm, contract.creation_code) resulting_final_state = laser_evm.open_states[0] for address, created_account in resulting_final_state.accounts.items(): created_account_code = created_account.code actual_code = Disassembly(contract.code) for i in range(len(created_account_code.instruction_list)): found_instruction = created_account_code.instruction_list[i] actual_instruction = actual_code.instruction_list[i] assert found_instruction['opcode'] == actual_instruction['opcode'] def test_sym_exec(): contract = SolidityContract(str(tests.TESTDATA_INPUTS_CONTRACTS / 'calls.sol')) sym = SymExecWrapper(contract, address=(util.get_indexed_address(0)), strategy="dfs") issues = fire_lasers(sym) assert len(issues) != 0
Update test to reflect the refactor
Update test to reflect the refactor
Python
mit
b-mueller/mythril,b-mueller/mythril,b-mueller/mythril,b-mueller/mythril
python
## Code Before: import mythril.laser.ethereum.transaction as transaction from mythril.ether import util import mythril.laser.ethereum.svm as svm from mythril.disassembler.disassembly import Disassembly from datetime import datetime from mythril.ether.soliditycontract import SolidityContract import tests from mythril.analysis.security import fire_lasers from mythril.analysis.symbolic import SymExecWrapper def test_create(): contract = SolidityContract(str(tests.TESTDATA_INPUTS_CONTRACTS / 'calls.sol')) laser_evm = svm.LaserEVM({}) laser_evm.time = datetime.now() laser_evm.execute_contract_creation(contract.creation_code) resulting_final_state = laser_evm.open_states[0] for address, created_account in resulting_final_state.accounts.items(): created_account_code = created_account.code actual_code = Disassembly(contract.code) for i in range(len(created_account_code.instruction_list)): found_instruction = created_account_code.instruction_list[i] actual_instruction = actual_code.instruction_list[i] assert found_instruction['opcode'] == actual_instruction['opcode'] def test_sym_exec(): contract = SolidityContract(str(tests.TESTDATA_INPUTS_CONTRACTS / 'calls.sol')) sym = SymExecWrapper(contract, address=(util.get_indexed_address(0)), strategy="dfs") issues = fire_lasers(sym) assert len(issues) != 0 ## Instruction: Update test to reflect the refactor ## Code After: from mythril.laser.ethereum.transaction import execute_contract_creation from mythril.ether import util import mythril.laser.ethereum.svm as svm from mythril.disassembler.disassembly import Disassembly from datetime import datetime from mythril.ether.soliditycontract import SolidityContract import tests from mythril.analysis.security import fire_lasers from mythril.analysis.symbolic import SymExecWrapper def test_create(): contract = SolidityContract(str(tests.TESTDATA_INPUTS_CONTRACTS / 'calls.sol')) laser_evm = svm.LaserEVM({}) laser_evm.time = datetime.now() execute_contract_creation(laser_evm, 
contract.creation_code) resulting_final_state = laser_evm.open_states[0] for address, created_account in resulting_final_state.accounts.items(): created_account_code = created_account.code actual_code = Disassembly(contract.code) for i in range(len(created_account_code.instruction_list)): found_instruction = created_account_code.instruction_list[i] actual_instruction = actual_code.instruction_list[i] assert found_instruction['opcode'] == actual_instruction['opcode'] def test_sym_exec(): contract = SolidityContract(str(tests.TESTDATA_INPUTS_CONTRACTS / 'calls.sol')) sym = SymExecWrapper(contract, address=(util.get_indexed_address(0)), strategy="dfs") issues = fire_lasers(sym) assert len(issues) != 0
594b03ceb00f0ac76ad015a7380a4c9f23903f50
lib/__tests__/ActionCable-test.js
lib/__tests__/ActionCable-test.js
import React from 'react'; import { shallow } from 'enzyme'; import { expect } from 'chai'; import { ActionCableProvider, ActionCable } from '../index'; test('ActionCable render without children', () => { const node = shallow( <ActionCableProvider> <ActionCable /> </ActionCableProvider> ); expect(node.find(ActionCable)).to.have.length(1); }); test('ActionCable render with children', () => { const node = shallow( <ActionCableProvider> <ActionCable> <div>Hello</div> </ActionCable> </ActionCableProvider> ); expect(node.find(ActionCable)).to.have.length(1); const div = node.find('div') expect(div).to.have.length(1); });
import React from 'react'; import { shallow } from 'enzyme'; import { expect } from 'chai'; import Provider, { ActionCableProvider, ActionCable } from '../index'; test('ActionCable render without children', () => { const node = shallow( <ActionCableProvider> <ActionCable /> </ActionCableProvider> ); expect(node.find(ActionCable)).to.have.length(1); }); test('ActionCable render with children', () => { const node = shallow( <ActionCableProvider> <ActionCable> <div>Hello</div> </ActionCable> </ActionCableProvider> ); expect(node.find(ActionCable)).to.have.length(1); const div = node.find('div') expect(div).to.have.length(1); }); test('Default exporting ActionCableProvider works', () => { const node = shallow( <Provider> <ActionCable /> </Provider> ); expect(node.find(ActionCable)).to.have.length(1); });
Add test for default exporting ActionCableProvider
Add test for default exporting ActionCableProvider
JavaScript
mit
cpunion/react-actioncable-provider
javascript
## Code Before: import React from 'react'; import { shallow } from 'enzyme'; import { expect } from 'chai'; import { ActionCableProvider, ActionCable } from '../index'; test('ActionCable render without children', () => { const node = shallow( <ActionCableProvider> <ActionCable /> </ActionCableProvider> ); expect(node.find(ActionCable)).to.have.length(1); }); test('ActionCable render with children', () => { const node = shallow( <ActionCableProvider> <ActionCable> <div>Hello</div> </ActionCable> </ActionCableProvider> ); expect(node.find(ActionCable)).to.have.length(1); const div = node.find('div') expect(div).to.have.length(1); }); ## Instruction: Add test for default exporting ActionCableProvider ## Code After: import React from 'react'; import { shallow } from 'enzyme'; import { expect } from 'chai'; import Provider, { ActionCableProvider, ActionCable } from '../index'; test('ActionCable render without children', () => { const node = shallow( <ActionCableProvider> <ActionCable /> </ActionCableProvider> ); expect(node.find(ActionCable)).to.have.length(1); }); test('ActionCable render with children', () => { const node = shallow( <ActionCableProvider> <ActionCable> <div>Hello</div> </ActionCable> </ActionCableProvider> ); expect(node.find(ActionCable)).to.have.length(1); const div = node.find('div') expect(div).to.have.length(1); }); test('Default exporting ActionCableProvider works', () => { const node = shallow( <Provider> <ActionCable /> </Provider> ); expect(node.find(ActionCable)).to.have.length(1); });
d0e6353434fe7d47462c43a17a48723835e5a5ae
cbv/static/permalinks.js
cbv/static/permalinks.js
// Auto-expand the accordion and jump to the target if url contains a valid hash (function(doc){ "use strict"; $(function(){ var $section, hash = window.location.hash; if(hash){ $section = $(hash); if($section){ $(doc).one('hidden.bs.collapse', hash, function(){ $section.parent().find('h3').click(); }); $('html, body').animate({ scrollTop: $(hash).offset().top }, 500); } } }) })(document);
// Auto-expand the accordion and jump to the target if url contains a valid hash (function(doc){ "use strict"; // On DOM Load... $(function(){ var hash = window.location.hash; var headerHeight = $('.navbar').height(); var methods = $('.accordion-group .accordion-heading'); var methodCount = methods.length; // Sets-up an event handler that will expand and scroll to the desired // (hash) method. if(hash){ // We need to know that all the panes have (at least initially) // collapsed as a result of loading the page. Unfortunately, we // only get events for each collapsed item, so we count... $(doc).on('hidden.bs.collapse', function(evt) { var methodTop, $hdr = $(hash).parent('.accordion-group'); methodCount--; if(methodCount === 0){ // OK, they've /all/ collapsed, now we expand the one we're // interested in the scroll to it. // First, remove this very event handler $(doc).off('hidden.bs.collapse'); // Now wait just a beat more to allow the last collapse // animation to complete... setTimeout(function(){ // Open the desired method $hdr.find('h3').click(); // Take into account the fixed header and the margin methodTop = $hdr.offset().top - headerHeight - 8; // Scroll it into view $('html, body').animate({scrollTop: methodTop}, 250); }, 250); } }); } // Delegated event handler to prevent the collapse/expand function of // a method's header when clicking the Pilcrow (permalink) $('.accordion-group').on('click', '.permalink', function(evt){ evt.preventDefault(); evt.stopImmediatePropagation(); window.location = $(this).attr('href'); }); }) })(document);
Work around for lack of signal for collapsing of all methods
Work around for lack of signal for collapsing of all methods
JavaScript
bsd-2-clause
refreshoxford/django-cbv-inspector,refreshoxford/django-cbv-inspector,refreshoxford/django-cbv-inspector,refreshoxford/django-cbv-inspector
javascript
## Code Before: // Auto-expand the accordion and jump to the target if url contains a valid hash (function(doc){ "use strict"; $(function(){ var $section, hash = window.location.hash; if(hash){ $section = $(hash); if($section){ $(doc).one('hidden.bs.collapse', hash, function(){ $section.parent().find('h3').click(); }); $('html, body').animate({ scrollTop: $(hash).offset().top }, 500); } } }) })(document); ## Instruction: Work around for lack of signal for collapsing of all methods ## Code After: // Auto-expand the accordion and jump to the target if url contains a valid hash (function(doc){ "use strict"; // On DOM Load... $(function(){ var hash = window.location.hash; var headerHeight = $('.navbar').height(); var methods = $('.accordion-group .accordion-heading'); var methodCount = methods.length; // Sets-up an event handler that will expand and scroll to the desired // (hash) method. if(hash){ // We need to know that all the panes have (at least initially) // collapsed as a result of loading the page. Unfortunately, we // only get events for each collapsed item, so we count... $(doc).on('hidden.bs.collapse', function(evt) { var methodTop, $hdr = $(hash).parent('.accordion-group'); methodCount--; if(methodCount === 0){ // OK, they've /all/ collapsed, now we expand the one we're // interested in the scroll to it. // First, remove this very event handler $(doc).off('hidden.bs.collapse'); // Now wait just a beat more to allow the last collapse // animation to complete... setTimeout(function(){ // Open the desired method $hdr.find('h3').click(); // Take into account the fixed header and the margin methodTop = $hdr.offset().top - headerHeight - 8; // Scroll it into view $('html, body').animate({scrollTop: methodTop}, 250); }, 250); } }); } // Delegated event handler to prevent the collapse/expand function of // a method's header when clicking the Pilcrow (permalink) $('.accordion-group').on('click', '.permalink', function(evt){ evt.preventDefault(); evt.stopImmediatePropagation(); window.location = $(this).attr('href'); }); }) })(document);
1be067ae18f7bcd7be30856dd1149d58d1cea8bb
x/dclupgrade/types/proposed_upgrade.go
x/dclupgrade/types/proposed_upgrade.go
package types import sdk "github.com/cosmos/cosmos-sdk/types" func (upgrade ProposedUpgrade) HasApprovalFrom(address sdk.AccAddress) bool { addrStr := address.String() for _, approval := range upgrade.Approvals { if approval.Address == addrStr { return true } } return false }
package types import sdk "github.com/cosmos/cosmos-sdk/types" func (upgrade ProposedUpgrade) HasApprovalFrom(address sdk.AccAddress) bool { addrStr := address.String() for _, approval := range upgrade.Approvals { if approval.Address == addrStr { return true } } return false } func (upgrade ProposedUpgrade) HasRejectFrom(address sdk.AccAddress) bool { addrStr := address.String() for _, reject := range upgrade.Rejects { if reject.Address == addrStr { return true } } return false }
Add function for checking has reject from trustee
Add function for checking has reject from trustee
Go
apache-2.0
zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger,zigbee-alliance/distributed-compliance-ledger
go
## Code Before: package types import sdk "github.com/cosmos/cosmos-sdk/types" func (upgrade ProposedUpgrade) HasApprovalFrom(address sdk.AccAddress) bool { addrStr := address.String() for _, approval := range upgrade.Approvals { if approval.Address == addrStr { return true } } return false } ## Instruction: Add function for checking has reject from trustee ## Code After: package types import sdk "github.com/cosmos/cosmos-sdk/types" func (upgrade ProposedUpgrade) HasApprovalFrom(address sdk.AccAddress) bool { addrStr := address.String() for _, approval := range upgrade.Approvals { if approval.Address == addrStr { return true } } return false } func (upgrade ProposedUpgrade) HasRejectFrom(address sdk.AccAddress) bool { addrStr := address.String() for _, reject := range upgrade.Rejects { if reject.Address == addrStr { return true } } return false }
b7551cc66f3deee2e10b85ee76f55efa133c959a
lib/net/ptth/incoming_request.rb
lib/net/ptth/incoming_request.rb
class Net::PTTH IncomingRequest = Struct.new(:method, :path, :headers, :body) do def to_env env = { "PATH_INFO" => path, "SCRIPT_NAME" => "", "rack.input" => StringIO.new(body || ""), "REQUEST_METHOD" => method, } env["CONTENT_LENGTH"] = body.length unless body.nil? env.merge!(headers) if headers end end end
class Net::PTTH IncomingRequest = Struct.new(:method, :path, :headers, :body) do def to_env env = { "PATH_INFO" => path, "SCRIPT_NAME" => "", "rack.input" => StringIO.new(body || ""), "REQUEST_METHOD" => method, } env["CONTENT_LENGTH"] = body.length unless body.nil? env.merge!(headers) if headers env end end end
Make sure env is always returned
Make sure env is always returned If headers is nil here, the method previously returned nil.
Ruby
mit
elcuervo/net-ptth
ruby
## Code Before: class Net::PTTH IncomingRequest = Struct.new(:method, :path, :headers, :body) do def to_env env = { "PATH_INFO" => path, "SCRIPT_NAME" => "", "rack.input" => StringIO.new(body || ""), "REQUEST_METHOD" => method, } env["CONTENT_LENGTH"] = body.length unless body.nil? env.merge!(headers) if headers end end end ## Instruction: Make sure env is always returned If headers is nil here, the method previously returned nil. ## Code After: class Net::PTTH IncomingRequest = Struct.new(:method, :path, :headers, :body) do def to_env env = { "PATH_INFO" => path, "SCRIPT_NAME" => "", "rack.input" => StringIO.new(body || ""), "REQUEST_METHOD" => method, } env["CONTENT_LENGTH"] = body.length unless body.nil? env.merge!(headers) if headers env end end end
36924cd5fc958ccf266ccfe1388cb5f0fa877bcf
src/beamjs_mod_repl.erl
src/beamjs_mod_repl.erl
-module(beamjs_mod_repl). -export([exports/1,init/1]). -behaviour(erlv8_module). -include_lib("erlv8/include/erlv8.hrl"). init(_VM) -> ok. exports(_VM) -> erlv8_object:new([{"start", fun start/2}]). start(#erlv8_fun_invocation{} = Invocation, []) -> start(Invocation,["beam.js> "]); start(#erlv8_fun_invocation{ vm = VM } = _Invocation, [Prompt]) -> supervisor:start_child(beamjs_repl_sup,[Prompt, beamjs_repl_console, VM]), receive X -> X end.
-module(beamjs_mod_repl). -export([exports/1,init/1]). -behaviour(erlv8_module). -include_lib("erlv8/include/erlv8.hrl"). init(_VM) -> ok. exports(_VM) -> erlv8_object:new([{"start", fun start/2}]). start(#erlv8_fun_invocation{} = Invocation, []) -> case node() of nonode@nohost -> start(Invocation,["beam.js> "]); Node -> start(Invocation,[lists:flatten(io_lib:format("(~p)beam.js> ",[Node]))]) end; start(#erlv8_fun_invocation{ vm = VM } = _Invocation, [Prompt]) -> supervisor:start_child(beamjs_repl_sup,[Prompt, beamjs_repl_console, VM]), receive X -> X end.
Make REPL show node name in the prompt if it is a distributed node.
Make REPL show node name in the prompt if it is a distributed node.
Erlang
bsd-2-clause
beamjs/beamjs,beamjs/beamjs
erlang
## Code Before: -module(beamjs_mod_repl). -export([exports/1,init/1]). -behaviour(erlv8_module). -include_lib("erlv8/include/erlv8.hrl"). init(_VM) -> ok. exports(_VM) -> erlv8_object:new([{"start", fun start/2}]). start(#erlv8_fun_invocation{} = Invocation, []) -> start(Invocation,["beam.js> "]); start(#erlv8_fun_invocation{ vm = VM } = _Invocation, [Prompt]) -> supervisor:start_child(beamjs_repl_sup,[Prompt, beamjs_repl_console, VM]), receive X -> X end. ## Instruction: Make REPL show node name in the prompt if it is a distributed node. ## Code After: -module(beamjs_mod_repl). -export([exports/1,init/1]). -behaviour(erlv8_module). -include_lib("erlv8/include/erlv8.hrl"). init(_VM) -> ok. exports(_VM) -> erlv8_object:new([{"start", fun start/2}]). start(#erlv8_fun_invocation{} = Invocation, []) -> case node() of nonode@nohost -> start(Invocation,["beam.js> "]); Node -> start(Invocation,[lists:flatten(io_lib:format("(~p)beam.js> ",[Node]))]) end; start(#erlv8_fun_invocation{ vm = VM } = _Invocation, [Prompt]) -> supervisor:start_child(beamjs_repl_sup,[Prompt, beamjs_repl_console, VM]), receive X -> X end.
14683ab2bdd010ff050ee6cfacd5475dc59244a6
public/css/style.css
public/css/style.css
body { font-family: 'Slabo 27px', serif; } a { color: #00B7FF; } hr { -moz-box-sizing: content-box; box-sizing: content-box; height: 0; } input[type="text"] { box-sizing: border-box; height: 48px; } .container { position: relative; width: 80%; max-width: 960px; margin: 0 auto; padding: 0; } .main-content { margin: 15% auto 0 auto; max-width: 500px; } .btn-toolbar{ text-align: center; display: inline-block; }​ .form-group { text-align:center; } .drawing { outline: 1px solid LightGrey; width: 100%; } .gamecode { font-family: monospace; box-shadow: 0 0 10pt 1pt LightGrey; padding: 5px; } .gamecode-entry { font-family: monospace; } .user { border: 1px solid LightGrey; padding: 8px; } .user-container { padding: 3px; } .disconnected { background-color: LightGrey; } .drawphone-info { text-align:left; font-size: 125%; } .alert { text-align: left; }
body { font-family: 'Slabo 27px', serif; } a { color: #00B7FF; } hr { -moz-box-sizing: content-box; box-sizing: content-box; height: 0; } input[type="text"] { box-sizing: border-box; height: 48px; } .container { position: relative; max-width: 500px; margin: 0 auto; padding: 28px; } .btn-toolbar{ text-align: center; display: inline-block; }​ .form-group { text-align:center; } .drawing { outline: 1px solid LightGrey; width: 100%; } .gamecode { font-family: monospace; box-shadow: 0 0 10pt 1pt LightGrey; padding: 5px; } .gamecode-entry { font-family: monospace; } .user { border: 1px solid LightGrey; padding: 8px; } .user-container { padding: 3px; } .disconnected { background-color: LightGrey; } .drawphone-info { text-align:left; font-size: 125%; } .alert { text-align: left; }
Fix vertical centering on large displays
Fix vertical centering on large displays
CSS
mit
tannerkrewson/drawphone
css
## Code Before: body { font-family: 'Slabo 27px', serif; } a { color: #00B7FF; } hr { -moz-box-sizing: content-box; box-sizing: content-box; height: 0; } input[type="text"] { box-sizing: border-box; height: 48px; } .container { position: relative; width: 80%; max-width: 960px; margin: 0 auto; padding: 0; } .main-content { margin: 15% auto 0 auto; max-width: 500px; } .btn-toolbar{ text-align: center; display: inline-block; }​ .form-group { text-align:center; } .drawing { outline: 1px solid LightGrey; width: 100%; } .gamecode { font-family: monospace; box-shadow: 0 0 10pt 1pt LightGrey; padding: 5px; } .gamecode-entry { font-family: monospace; } .user { border: 1px solid LightGrey; padding: 8px; } .user-container { padding: 3px; } .disconnected { background-color: LightGrey; } .drawphone-info { text-align:left; font-size: 125%; } .alert { text-align: left; } ## Instruction: Fix vertical centering on large displays ## Code After: body { font-family: 'Slabo 27px', serif; } a { color: #00B7FF; } hr { -moz-box-sizing: content-box; box-sizing: content-box; height: 0; } input[type="text"] { box-sizing: border-box; height: 48px; } .container { position: relative; max-width: 500px; margin: 0 auto; padding: 28px; } .btn-toolbar{ text-align: center; display: inline-block; }​ .form-group { text-align:center; } .drawing { outline: 1px solid LightGrey; width: 100%; } .gamecode { font-family: monospace; box-shadow: 0 0 10pt 1pt LightGrey; padding: 5px; } .gamecode-entry { font-family: monospace; } .user { border: 1px solid LightGrey; padding: 8px; } .user-container { padding: 3px; } .disconnected { background-color: LightGrey; } .drawphone-info { text-align:left; font-size: 125%; } .alert { text-align: left; }
e98f2ae9c70feec4c0b3f64123174fe0f4f3a8de
.travis.yml
.travis.yml
sudo: false language: go go: - "1.9.x" - "1.10.x" script: - make
sudo: false language: go go: - "1.9.x" - "1.10.x" - "1.11.x" script: - make
Add Go 1.11 to Travis.
Add Go 1.11 to Travis.
YAML
apache-2.0
mesosphere/mesos_exporter
yaml
## Code Before: sudo: false language: go go: - "1.9.x" - "1.10.x" script: - make ## Instruction: Add Go 1.11 to Travis. ## Code After: sudo: false language: go go: - "1.9.x" - "1.10.x" - "1.11.x" script: - make
2b6239abfc638d684e0a268bce43a0e88a5f3fd7
_posts/2016-05-10-maultaschen-37.md
_posts/2016-05-10-maultaschen-37.md
--- title: Mobile Maultaschen 37 date: 2016-05-10 layout: post --- Das nächste Treffen findet am **10. Mai, 19:00 Uhr** statt. Manuel und Boris von [fortyone](https://fortyone.io) berichten über ihre Erfahrungen bei der Entwicklung eines Cross-Plattform Chats. Wir treffen uns bei moovel im [neuen Büro am Marienplatz](https://www.google.de/maps/place/Hauptst%C3%A4tter+Str.+149,+70180+Stuttgart/@48.7644413,9.1677958,17z/data=!3m1!4b1!4m2!3m1!1s0x4799db510949fc7b:0xdb8ea86fe4718662?hl=en). Vielen lieben Dank für die Unterstützung!
--- title: Mobile Maultaschen 37 date: 2016-05-10 layout: post --- Das nächste Treffen findet am **10. Mai, 19:00 Uhr** statt. Manuel und Boris von [fortyone](https://fortyone.io) berichten über ihre Erfahrungen bei der Entwicklung eines Cross-Plattform Chats und [Andy](https://twitter.com/andreasg) erklärt das `Model-View-Presenter` pattern. Wir treffen uns bei moovel im [neuen Büro am Marienplatz](https://www.google.de/maps/place/Hauptst%C3%A4tter+Str.+149,+70180+Stuttgart/@48.7644413,9.1677958,17z/data=!3m1!4b1!4m2!3m1!1s0x4799db510949fc7b:0xdb8ea86fe4718662?hl=en). Vielen lieben Dank für die Unterstützung!
Add info about Andys topic
Add info about Andys topic
Markdown
mit
mobilemaultaschen/mobilemaultaschen.github.io,mobilemaultaschen/mobilemaultaschen.github.io,mobilemaultaschen/mobilemaultaschen.github.io
markdown
## Code Before: --- title: Mobile Maultaschen 37 date: 2016-05-10 layout: post --- Das nächste Treffen findet am **10. Mai, 19:00 Uhr** statt. Manuel und Boris von [fortyone](https://fortyone.io) berichten über ihre Erfahrungen bei der Entwicklung eines Cross-Plattform Chats. Wir treffen uns bei moovel im [neuen Büro am Marienplatz](https://www.google.de/maps/place/Hauptst%C3%A4tter+Str.+149,+70180+Stuttgart/@48.7644413,9.1677958,17z/data=!3m1!4b1!4m2!3m1!1s0x4799db510949fc7b:0xdb8ea86fe4718662?hl=en). Vielen lieben Dank für die Unterstützung! ## Instruction: Add info about Andys topic ## Code After: --- title: Mobile Maultaschen 37 date: 2016-05-10 layout: post --- Das nächste Treffen findet am **10. Mai, 19:00 Uhr** statt. Manuel und Boris von [fortyone](https://fortyone.io) berichten über ihre Erfahrungen bei der Entwicklung eines Cross-Plattform Chats und [Andy](https://twitter.com/andreasg) erklärt das `Model-View-Presenter` pattern. Wir treffen uns bei moovel im [neuen Büro am Marienplatz](https://www.google.de/maps/place/Hauptst%C3%A4tter+Str.+149,+70180+Stuttgart/@48.7644413,9.1677958,17z/data=!3m1!4b1!4m2!3m1!1s0x4799db510949fc7b:0xdb8ea86fe4718662?hl=en). Vielen lieben Dank für die Unterstützung!
be8b06349e4242ade1d1b4d5d30a609cb3c47fc5
README.md
README.md
[![travis](https://travis-ci.org/kevincennis/Mix.js.png)](https://travis-ci.org/kevincennis/Mix.js) Multitrack mixing with the Web Audio API. Documentation (and lots of cleanup) forthcoming. ### Demo [kevvv.in/mix](http://kevvv.in/mix) ### Getting started ##### Install Grunt `npm install -g grunt-cli` (may require `sudo`) ##### Install Node dependencies `npm install` ##### Build & Test `npm test` or `grunt` ##### Start a local webserver at `http://localhost:8888` `npm start` ### Usage * Make sure you have git and Node.js installed (obvs) * Clone the repo * Put your own audio (mono mp3 or wav) in the `/public/sounds` directory * Edit `public/mix.json` to reflect your track names and audio URLs * From the terminal, run `npm install -g grunt-cli` * Run `npm install` * Run `npm test` * Copy the `public` directory to your webserver To save a mix, open the dev tools in your browser and enter `JSON.stringify(App.mix.toJSON(), null, ' ')` and copy the output into `public/mix.json`.
[![travis](https://travis-ci.org/kevincennis/Mix.js.png)](https://travis-ci.org/kevincennis/Mix.js) Multitrack mixing with the Web Audio API. Documentation (and lots of cleanup) forthcoming. ### Demo [kevvv.in/mix](http://kevvv.in/mix) ### Getting started (for Developers) ##### Install Grunt `npm install -g grunt-cli` (may require `sudo`) ##### Install Node dependencies `npm install` ##### Build & Test `npm test` or `grunt` ##### Start a local webserver at `http://localhost:8888` `npm start` ### Usage (for... Users) * Download `mix.js.zip` from the [Releases](https://github.com/kevincennis/Mix.js/releases) page and unzip it * Put your own audio (mono mp3 or wav) in the `/sounds` directory * Edit `mix.json` to reflect your track names and audio URLs * Copy the directory to your webserver To save a mix, open the dev tools in your browser and enter `JSON.stringify(App.mix.toJSON(), null, ' ')` and copy the output into `mix.json`.
Simplify usage instructions for non-developer types
Simplify usage instructions for non-developer types
Markdown
mit
mcanthony/Mix.js,kevincennis/Mix.js,kevincennis/Mix.js,mcanthony/Mix.js
markdown
## Code Before: [![travis](https://travis-ci.org/kevincennis/Mix.js.png)](https://travis-ci.org/kevincennis/Mix.js) Multitrack mixing with the Web Audio API. Documentation (and lots of cleanup) forthcoming. ### Demo [kevvv.in/mix](http://kevvv.in/mix) ### Getting started ##### Install Grunt `npm install -g grunt-cli` (may require `sudo`) ##### Install Node dependencies `npm install` ##### Build & Test `npm test` or `grunt` ##### Start a local webserver at `http://localhost:8888` `npm start` ### Usage * Make sure you have git and Node.js installed (obvs) * Clone the repo * Put your own audio (mono mp3 or wav) in the `/public/sounds` directory * Edit `public/mix.json` to reflect your track names and audio URLs * From the terminal, run `npm install -g grunt-cli` * Run `npm install` * Run `npm test` * Copy the `public` directory to your webserver To save a mix, open the dev tools in your browser and enter `JSON.stringify(App.mix.toJSON(), null, ' ')` and copy the output into `public/mix.json`. ## Instruction: Simplify usage instructions for non-developer types ## Code After: [![travis](https://travis-ci.org/kevincennis/Mix.js.png)](https://travis-ci.org/kevincennis/Mix.js) Multitrack mixing with the Web Audio API. Documentation (and lots of cleanup) forthcoming. ### Demo [kevvv.in/mix](http://kevvv.in/mix) ### Getting started (for Developers) ##### Install Grunt `npm install -g grunt-cli` (may require `sudo`) ##### Install Node dependencies `npm install` ##### Build & Test `npm test` or `grunt` ##### Start a local webserver at `http://localhost:8888` `npm start` ### Usage (for... Users) * Download `mix.js.zip` from the [Releases](https://github.com/kevincennis/Mix.js/releases) page and unzip it * Put your own audio (mono mp3 or wav) in the `/sounds` directory * Edit `mix.json` to reflect your track names and audio URLs * Copy the directory to your webserver To save a mix, open the dev tools in your browser and enter `JSON.stringify(App.mix.toJSON(), null, ' ')` and copy the output into `mix.json`.
e3d1c8bbf238516d7a10e03aea0fbd378c4a4f6f
profile_collection/startup/99-bluesky.py
profile_collection/startup/99-bluesky.py
def detselect(detector_object, suffix="_stats_total1"): """Switch the active detector and set some internal state""" gs.DETS =[detector_object] gs.PLOT_Y = detector_object.name + suffix gs.TABLE_COLS = [gs.PLOT_Y]
def detselect(detector_object, suffix="_stats_total1"): """Switch the active detector and set some internal state""" gs.DETS =[detector_object] gs.PLOT_Y = detector_object.name + suffix gs.TABLE_COLS = [gs.PLOT_Y] def chx_plot_motor(scan): fig = None if gs.PLOTMODE == 1: fig = plt.gcf() elif gs.PLOTMODE == 2: fig = plt.gcf() fig.clear() elif gs.PLOTMODE == 3: fig = plt.figure() return LivePlot(gs.PLOT_Y, scan.motor._name, fig=fig) dscan.default_sub_factories['all'][1] = chx_plot_motor gs.PLOTMODE = 1 from bluesky.global_state import resume, abort, stop, panic, all_is_well, state
Add 'better' plotting control for live plots
ENH: Add 'better' plotting control for live plots
Python
bsd-2-clause
NSLS-II-CHX/ipython_ophyd,NSLS-II-CHX/ipython_ophyd
python
## Code Before: def detselect(detector_object, suffix="_stats_total1"): """Switch the active detector and set some internal state""" gs.DETS =[detector_object] gs.PLOT_Y = detector_object.name + suffix gs.TABLE_COLS = [gs.PLOT_Y] ## Instruction: ENH: Add 'better' plotting control for live plots ## Code After: def detselect(detector_object, suffix="_stats_total1"): """Switch the active detector and set some internal state""" gs.DETS =[detector_object] gs.PLOT_Y = detector_object.name + suffix gs.TABLE_COLS = [gs.PLOT_Y] def chx_plot_motor(scan): fig = None if gs.PLOTMODE == 1: fig = plt.gcf() elif gs.PLOTMODE == 2: fig = plt.gcf() fig.clear() elif gs.PLOTMODE == 3: fig = plt.figure() return LivePlot(gs.PLOT_Y, scan.motor._name, fig=fig) dscan.default_sub_factories['all'][1] = chx_plot_motor gs.PLOTMODE = 1 from bluesky.global_state import resume, abort, stop, panic, all_is_well, state
715790511819ab5810659b8f8f5d06891135d6a0
.travis.yml
.travis.yml
language: python python: - "2.7" - "3.3" - "3.4" - "3.5" addons: postgresql: "9.3" env: - DJANGO="Django>=1.6.0,<1.7.0" PSQL=0 - DJANGO="Django>=1.6.0,<1.7.0" PSQL=1 - DJANGO="Django>=1.7.0,<1.8.0" PSQL=0 - DJANGO="Django>=1.7.0,<1.8.0" PSQL=1 - DJANGO="Django>=1.8.0,<1.9.0" PSQL=0 - DJANGO="Django>=1.8.0,<1.9.0" PSQL=1 before_script: - psql -c 'create database travisci;' -U postgres install: - pip install $DJANGO coverage six psycopg2 --quiet script: - export DJANGO_SETTINGS_MODULE=settings - export PYTHONPATH=..:$PYTHONPATH - cd test_project - python manage.py test interval test_app - coverage run --source=../interval manage.py test interval test_app after_success: coveralls
language: python python: - "2.7" - "3.3" - "3.4" - "3.5" addons: postgresql: "9.3" env: - DJANGO="Django>=1.6.0,<1.7.0" PSQL=0 - DJANGO="Django>=1.6.0,<1.7.0" PSQL=1 - DJANGO="Django>=1.7.0,<1.8.0" PSQL=0 - DJANGO="Django>=1.7.0,<1.8.0" PSQL=1 - DJANGO="Django>=1.8.0,<1.9.0" PSQL=0 - DJANGO="Django>=1.8.0,<1.9.0" PSQL=1 before_script: - psql -c 'create database travisci;' -U postgres install: - pip install $DJANGO coverage six psycopg2 --quiet script: - export DJANGO_SETTINGS_MODULE=settings - export PYTHONPATH=..:$PYTHONPATH - cd test_project - python manage.py test interval test_app - coverage run --source=../interval manage.py test interval test_app after_success: coveralls matrix: exclude: - python: "3.5" env: DJANGO="Django>=1.6.0,<1.7.0" - python: "3.5" env: DJANGO="Django>=1.7.0,<1.8.0"
Exclude unsupported Python 3.5 + Django combinations
Exclude unsupported Python 3.5 + Django combinations
YAML
mit
mpasternak/django-interval-field,mpasternak/django-interval-field
yaml
## Code Before: language: python python: - "2.7" - "3.3" - "3.4" - "3.5" addons: postgresql: "9.3" env: - DJANGO="Django>=1.6.0,<1.7.0" PSQL=0 - DJANGO="Django>=1.6.0,<1.7.0" PSQL=1 - DJANGO="Django>=1.7.0,<1.8.0" PSQL=0 - DJANGO="Django>=1.7.0,<1.8.0" PSQL=1 - DJANGO="Django>=1.8.0,<1.9.0" PSQL=0 - DJANGO="Django>=1.8.0,<1.9.0" PSQL=1 before_script: - psql -c 'create database travisci;' -U postgres install: - pip install $DJANGO coverage six psycopg2 --quiet script: - export DJANGO_SETTINGS_MODULE=settings - export PYTHONPATH=..:$PYTHONPATH - cd test_project - python manage.py test interval test_app - coverage run --source=../interval manage.py test interval test_app after_success: coveralls ## Instruction: Exclude unsupported Python 3.5 + Django combinations ## Code After: language: python python: - "2.7" - "3.3" - "3.4" - "3.5" addons: postgresql: "9.3" env: - DJANGO="Django>=1.6.0,<1.7.0" PSQL=0 - DJANGO="Django>=1.6.0,<1.7.0" PSQL=1 - DJANGO="Django>=1.7.0,<1.8.0" PSQL=0 - DJANGO="Django>=1.7.0,<1.8.0" PSQL=1 - DJANGO="Django>=1.8.0,<1.9.0" PSQL=0 - DJANGO="Django>=1.8.0,<1.9.0" PSQL=1 before_script: - psql -c 'create database travisci;' -U postgres install: - pip install $DJANGO coverage six psycopg2 --quiet script: - export DJANGO_SETTINGS_MODULE=settings - export PYTHONPATH=..:$PYTHONPATH - cd test_project - python manage.py test interval test_app - coverage run --source=../interval manage.py test interval test_app after_success: coveralls matrix: exclude: - python: "3.5" env: DJANGO="Django>=1.6.0,<1.7.0" - python: "3.5" env: DJANGO="Django>=1.7.0,<1.8.0"
d87cb6b401e38a06c5d594e40ad813a9db0738e6
taca/analysis/cli.py
taca/analysis/cli.py
import click from taca.analysis import analysis as an @click.group() def analysis(): """ Analysis methods entry point """ pass # analysis subcommands @analysis.command() @click.option('-r', '--run', type=click.Path(exists=True), default=None, help='Demultiplex only a particular run') def demultiplex(run): """ Demultiplex all runs present in the data directories """ an.run_preprocessing(run) @analysis.command() @click.argument('rundir') def transfer(rundir): """Transfers the run without qc""" an.transfer_run(rundir)
import click from taca.analysis import analysis as an @click.group() def analysis(): """ Analysis methods entry point """ pass # analysis subcommands @analysis.command() @click.option('-r', '--run', type=click.Path(exists=True), default=None, help='Demultiplex only a particular run') def demultiplex(run): """ Demultiplex all runs present in the data directories """ an.run_preprocessing(run) @analysis.command() @click.option('-a','--analysis', is_flag=True, help='Trigger the analysis for the transferred flowcell') @click.argument('rundir') def transfer(rundir, analysis): """Transfers the run without qc""" an.transfer_run(rundir, analysis=analysis)
Add option for triggering or not the analysis
Add option for triggering or not the analysis
Python
mit
senthil10/TACA,kate-v-stepanova/TACA,SciLifeLab/TACA,SciLifeLab/TACA,vezzi/TACA,guillermo-carrasco/TACA,senthil10/TACA,b97pla/TACA,kate-v-stepanova/TACA,SciLifeLab/TACA,b97pla/TACA,guillermo-carrasco/TACA,vezzi/TACA
python
## Code Before: import click from taca.analysis import analysis as an @click.group() def analysis(): """ Analysis methods entry point """ pass # analysis subcommands @analysis.command() @click.option('-r', '--run', type=click.Path(exists=True), default=None, help='Demultiplex only a particular run') def demultiplex(run): """ Demultiplex all runs present in the data directories """ an.run_preprocessing(run) @analysis.command() @click.argument('rundir') def transfer(rundir): """Transfers the run without qc""" an.transfer_run(rundir) ## Instruction: Add option for triggering or not the analysis ## Code After: import click from taca.analysis import analysis as an @click.group() def analysis(): """ Analysis methods entry point """ pass # analysis subcommands @analysis.command() @click.option('-r', '--run', type=click.Path(exists=True), default=None, help='Demultiplex only a particular run') def demultiplex(run): """ Demultiplex all runs present in the data directories """ an.run_preprocessing(run) @analysis.command() @click.option('-a','--analysis', is_flag=True, help='Trigger the analysis for the transferred flowcell') @click.argument('rundir') def transfer(rundir, analysis): """Transfers the run without qc""" an.transfer_run(rundir, analysis=analysis)
1cd638361efedbfc83b10c7f698a796e8e9ebaca
.vscode/c_cpp_properties.json
.vscode/c_cpp_properties.json
{ "configurations": [ { "name": "Mac", "includePath": [ "/Applications/Arduino.app/Contents/Java/hardware/tools/avr/include", "/Applications/Arduino.app/Contents/Java/hardware/arduino/avr/cores/arduino" ], "browse" : { "limitSymbolsToIncludedHeaders" : true, "databaseFilename" : "" } }, { "name": "Linux", "includePath": ["/usr/include"], "browse" : { "limitSymbolsToIncludedHeaders" : true, "databaseFilename" : "" } }, { "name": "Win32", "includePath": ["c:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/include"], "browse" : { "limitSymbolsToIncludedHeaders" : true, "databaseFilename" : "" } } ] }
{ "configurations": [ { "name": "Mac", "includePath": [ "/Applications/Arduino.app/Contents/Java/hardware/tools/avr/include", "/Applications/Arduino.app/Contents/Java/hardware/arduino/avr/cores/arduino" ], "browse": { "limitSymbolsToIncludedHeaders": true, "databaseFilename": "", "path": [ "/Applications/Arduino.app/Contents/Java/hardware/tools/avr/include", "/Applications/Arduino.app/Contents/Java/hardware/arduino/avr/cores/arduino" ] }, "intelliSenseMode": "clang-x64", "macFrameworkPath": [ "/System/Library/Frameworks", "/Library/Frameworks" ] }, { "name": "Linux", "includePath": [ "/usr/include" ], "browse": { "limitSymbolsToIncludedHeaders": true, "databaseFilename": "", "path": [ "/usr/include" ] }, "intelliSenseMode": "clang-x64" }, { "name": "Win32", "includePath": [ "c:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/include" ], "browse": { "limitSymbolsToIncludedHeaders": true, "databaseFilename": "", "path": [ "c:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/include" ] }, "intelliSenseMode": "msvc-x64" } ], "version": 3 }
Update VSCode C++ helper file
chore: Update VSCode C++ helper file
JSON
mit
FortySevenEffects/arduino_midi_library,FortySevenEffects/arduino_midi_library,FortySevenEffects/arduino_midi_library,FortySevenEffects/arduino_midi_library
json
## Code Before: { "configurations": [ { "name": "Mac", "includePath": [ "/Applications/Arduino.app/Contents/Java/hardware/tools/avr/include", "/Applications/Arduino.app/Contents/Java/hardware/arduino/avr/cores/arduino" ], "browse" : { "limitSymbolsToIncludedHeaders" : true, "databaseFilename" : "" } }, { "name": "Linux", "includePath": ["/usr/include"], "browse" : { "limitSymbolsToIncludedHeaders" : true, "databaseFilename" : "" } }, { "name": "Win32", "includePath": ["c:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/include"], "browse" : { "limitSymbolsToIncludedHeaders" : true, "databaseFilename" : "" } } ] } ## Instruction: chore: Update VSCode C++ helper file ## Code After: { "configurations": [ { "name": "Mac", "includePath": [ "/Applications/Arduino.app/Contents/Java/hardware/tools/avr/include", "/Applications/Arduino.app/Contents/Java/hardware/arduino/avr/cores/arduino" ], "browse": { "limitSymbolsToIncludedHeaders": true, "databaseFilename": "", "path": [ "/Applications/Arduino.app/Contents/Java/hardware/tools/avr/include", "/Applications/Arduino.app/Contents/Java/hardware/arduino/avr/cores/arduino" ] }, "intelliSenseMode": "clang-x64", "macFrameworkPath": [ "/System/Library/Frameworks", "/Library/Frameworks" ] }, { "name": "Linux", "includePath": [ "/usr/include" ], "browse": { "limitSymbolsToIncludedHeaders": true, "databaseFilename": "", "path": [ "/usr/include" ] }, "intelliSenseMode": "clang-x64" }, { "name": "Win32", "includePath": [ "c:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/include" ], "browse": { "limitSymbolsToIncludedHeaders": true, "databaseFilename": "", "path": [ "c:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/include" ] }, "intelliSenseMode": "msvc-x64" } ], "version": 3 }
eff8f70a9074b1748ee6cf6522f0182d47a7d0ce
docs/bmemcached.client.rst
docs/bmemcached.client.rst
bmemcached\.client package ========================== Submodules ---------- bmemcached\.client\.constants module ------------------------------------ .. automodule:: bmemcached.client.constants :members: :undoc-members: :show-inheritance: bmemcached\.client\.replicating module -------------------------------------- .. automodule:: bmemcached.client.replicating :members: :undoc-members: :show-inheritance: Module contents --------------- .. automodule:: bmemcached.client :members: :undoc-members: :show-inheritance:
bmemcached\.client package ========================== Submodules ---------- bmemcached\.client\.constants module ------------------------------------ .. automodule:: bmemcached.client.constants :members: :undoc-members: :show-inheritance: bmemcached\.client\.mixin module -------------------------------- .. automodule:: bmemcached.client.mixin :members: :undoc-members: :show-inheritance: bmemcached\.client\.replicating module -------------------------------------- .. automodule:: bmemcached.client.replicating :members: :undoc-members: :show-inheritance: bmemcached\.client\.distributed module -------------------------------------- .. automodule:: bmemcached.client.distributed :members: :undoc-members: :show-inheritance: Module contents --------------- .. automodule:: bmemcached.client :members: :undoc-members: :show-inheritance:
Add missing files to docs.
Add missing files to docs. Adding the mixin and the distributed modules to the docs. Signed-off-by: Moisés Guimarães de Medeiros <38931a700a64030c6e1883420cbba039aadcb37a@redhat.com>
reStructuredText
mit
jaysonsantos/python-binary-memcached,jaysonsantos/python-binary-memcached
restructuredtext
## Code Before: bmemcached\.client package ========================== Submodules ---------- bmemcached\.client\.constants module ------------------------------------ .. automodule:: bmemcached.client.constants :members: :undoc-members: :show-inheritance: bmemcached\.client\.replicating module -------------------------------------- .. automodule:: bmemcached.client.replicating :members: :undoc-members: :show-inheritance: Module contents --------------- .. automodule:: bmemcached.client :members: :undoc-members: :show-inheritance: ## Instruction: Add missing files to docs. Adding the mixin and the distributed modules to the docs. Signed-off-by: Moisés Guimarães de Medeiros <38931a700a64030c6e1883420cbba039aadcb37a@redhat.com> ## Code After: bmemcached\.client package ========================== Submodules ---------- bmemcached\.client\.constants module ------------------------------------ .. automodule:: bmemcached.client.constants :members: :undoc-members: :show-inheritance: bmemcached\.client\.mixin module -------------------------------- .. automodule:: bmemcached.client.mixin :members: :undoc-members: :show-inheritance: bmemcached\.client\.replicating module -------------------------------------- .. automodule:: bmemcached.client.replicating :members: :undoc-members: :show-inheritance: bmemcached\.client\.distributed module -------------------------------------- .. automodule:: bmemcached.client.distributed :members: :undoc-members: :show-inheritance: Module contents --------------- .. automodule:: bmemcached.client :members: :undoc-members: :show-inheritance:
21efe6bece9ca87d16cdce0714b03ca3b676ec23
create_artifacts.ps1
create_artifacts.ps1
[CmdletBinding()] Param( [Parameter(Mandatory=$True)] [string]$Platform, [Parameter(Mandatory=$True)] [string]$Configuration ) $binStub = $Configuration $zip = "colore.zip" If ($Platform -eq "Any CPU") { $zip = "colore_anycpu.zip" } Else { $binStub = "$Platform\$binStub" If ($Platform -eq "x86") { $zip = "colore_x86.zip" } Else { $zip = "colore_x64.zip" } } Write-Host "Zipping artifacts for $Platform - $Configuration" 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.dll 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.xml 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.pdb # Create NuGet package if Release and Any CPU if ($Configuration -eq "Release" -And $Platform -eq "Any CPU") { Write-Host "Creating NuGet package" nuget.exe --% pack Corale.Colore\Corale.Colore.csproj -Prop Configuration=Release -Prop Platform=AnyCPU -Version "%APPVEYOR_BUILD_VERSION%" $name = "Corale.Colore.$Env:APPVEYOR_BUILD_VERSION" $file = "$name.nupkg" Push-AppveyorArtifact $file -FileName $name -DeploymentName "nuget_package" }
[CmdletBinding()] Param( [Parameter(Mandatory=$True)] [string]$Platform, [Parameter(Mandatory=$True)] [string]$Configuration ) $binStub = $Configuration $zip = "colore.zip" If ($Platform -eq "Any CPU") { $zip = "colore_anycpu.zip" } Else { $binStub = "$Platform\$binStub" If ($Platform -eq "x86") { $zip = "colore_x86.zip" } Else { $zip = "colore_x64.zip" } } Write-Host "Zipping artifacts for $Platform - $Configuration" 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.dll 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.xml 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.pdb # Create NuGet package if Release and Any CPU if ($Configuration -eq "Release" -And $Platform -eq "Any CPU") { Write-Host "Creating NuGet package" nuget.exe --% pack Corale.Colore\Corale.Colore.csproj -Prop Configuration=Release -Prop Platform=AnyCPU -Version "%APPVEYOR_BUILD_VERSION%" $name = "Corale.Colore.$Env:APPVEYOR_BUILD_VERSION" $file = "$name.nupkg" Push-AppveyorArtifact $file -DeploymentName "nuget_package" }
Fix incorrect file ending for nuget package.
Fix incorrect file ending for nuget package.
PowerShell
mit
danpierce1/Colore,WolfspiritM/Colore,Sharparam/Colore,CoraleStudios/Colore
powershell
## Code Before: [CmdletBinding()] Param( [Parameter(Mandatory=$True)] [string]$Platform, [Parameter(Mandatory=$True)] [string]$Configuration ) $binStub = $Configuration $zip = "colore.zip" If ($Platform -eq "Any CPU") { $zip = "colore_anycpu.zip" } Else { $binStub = "$Platform\$binStub" If ($Platform -eq "x86") { $zip = "colore_x86.zip" } Else { $zip = "colore_x64.zip" } } Write-Host "Zipping artifacts for $Platform - $Configuration" 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.dll 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.xml 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.pdb # Create NuGet package if Release and Any CPU if ($Configuration -eq "Release" -And $Platform -eq "Any CPU") { Write-Host "Creating NuGet package" nuget.exe --% pack Corale.Colore\Corale.Colore.csproj -Prop Configuration=Release -Prop Platform=AnyCPU -Version "%APPVEYOR_BUILD_VERSION%" $name = "Corale.Colore.$Env:APPVEYOR_BUILD_VERSION" $file = "$name.nupkg" Push-AppveyorArtifact $file -FileName $name -DeploymentName "nuget_package" } ## Instruction: Fix incorrect file ending for nuget package. ## Code After: [CmdletBinding()] Param( [Parameter(Mandatory=$True)] [string]$Platform, [Parameter(Mandatory=$True)] [string]$Configuration ) $binStub = $Configuration $zip = "colore.zip" If ($Platform -eq "Any CPU") { $zip = "colore_anycpu.zip" } Else { $binStub = "$Platform\$binStub" If ($Platform -eq "x86") { $zip = "colore_x86.zip" } Else { $zip = "colore_x64.zip" } } Write-Host "Zipping artifacts for $Platform - $Configuration" 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.dll 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.xml 7z.exe a $zip .\Corale.Colore\bin\$binStub\*.pdb # Create NuGet package if Release and Any CPU if ($Configuration -eq "Release" -And $Platform -eq "Any CPU") { Write-Host "Creating NuGet package" nuget.exe --% pack Corale.Colore\Corale.Colore.csproj -Prop Configuration=Release -Prop Platform=AnyCPU -Version "%APPVEYOR_BUILD_VERSION%" $name = "Corale.Colore.$Env:APPVEYOR_BUILD_VERSION" $file = "$name.nupkg" Push-AppveyorArtifact $file -DeploymentName "nuget_package" }
5d5e47bc5aa15e8f499bd35b033ec297dabda48b
types/netease-captcha/netease-captcha-tests.ts
types/netease-captcha/netease-captcha-tests.ts
const config: NeteaseCaptcha.Config = { captchaId: 'FAKE ID', element: '#captcha', mode: 'popup', protocol: 'https', width: '200px', lang: 'en', onVerify: (error: any, data?: NeteaseCaptcha.Data) => { console.log(error, data); } }; const onLoad: NeteaseCaptcha.onLoad = (instance: NeteaseCaptcha.Instance) => { instance.refresh(); instance.destroy(); }; function init(initNECaptcha: NeteaseCaptcha.InitFunction): void { initNECaptcha(config, onLoad); }
const config: NeteaseCaptcha.Config = { captchaId: 'FAKE ID', element: '#captcha', mode: 'popup', protocol: 'https', width: '200px', lang: 'en', onVerify: (error: any, data?: NeteaseCaptcha.Data) => { console.log(error, data); } }; const onLoad: NeteaseCaptcha.onLoad = (instance: NeteaseCaptcha.Instance) => { instance.refresh(); instance.destroy(); if (instance.popUp) { instance.popUp(); } }; if (window.initNECaptcha) { window.initNECaptcha(config, onLoad); }
Add test for window attribute
Add test for window attribute
TypeScript
mit
mcliment/DefinitelyTyped,georgemarshall/DefinitelyTyped,dsebastien/DefinitelyTyped,borisyankov/DefinitelyTyped,georgemarshall/DefinitelyTyped,georgemarshall/DefinitelyTyped,magny/DefinitelyTyped,magny/DefinitelyTyped,markogresak/DefinitelyTyped,dsebastien/DefinitelyTyped,borisyankov/DefinitelyTyped,georgemarshall/DefinitelyTyped
typescript
## Code Before: const config: NeteaseCaptcha.Config = { captchaId: 'FAKE ID', element: '#captcha', mode: 'popup', protocol: 'https', width: '200px', lang: 'en', onVerify: (error: any, data?: NeteaseCaptcha.Data) => { console.log(error, data); } }; const onLoad: NeteaseCaptcha.onLoad = (instance: NeteaseCaptcha.Instance) => { instance.refresh(); instance.destroy(); }; function init(initNECaptcha: NeteaseCaptcha.InitFunction): void { initNECaptcha(config, onLoad); } ## Instruction: Add test for window attribute ## Code After: const config: NeteaseCaptcha.Config = { captchaId: 'FAKE ID', element: '#captcha', mode: 'popup', protocol: 'https', width: '200px', lang: 'en', onVerify: (error: any, data?: NeteaseCaptcha.Data) => { console.log(error, data); } }; const onLoad: NeteaseCaptcha.onLoad = (instance: NeteaseCaptcha.Instance) => { instance.refresh(); instance.destroy(); if (instance.popUp) { instance.popUp(); } }; if (window.initNECaptcha) { window.initNECaptcha(config, onLoad); }
38174f09b32189c45761f41d819b74fa41c15346
Quicklook/quicklooklib/src/main/java/cl/uchile/ing/adi/quicklooklib/fragments/WebFragment.java
Quicklook/quicklooklib/src/main/java/cl/uchile/ing/adi/quicklooklib/fragments/WebFragment.java
package cl.uchile.ing.adi.quicklooklib.fragments; import android.os.Bundle; import android.view.LayoutInflater; import android.view.View; import android.view.ViewGroup; import android.webkit.WebView; import cl.uchile.ing.adi.quicklooklib.R; /** * Created by dudu on 04-01-2016. */ public class WebFragment extends QuicklookFragment { @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) { View v = inflater.inflate(R.layout.fragment_web, container, false); WebView web = (WebView) v.findViewById(R.id.web_fragment); web.loadUrl("file://"+this.item.getPath()); return v; } }
package cl.uchile.ing.adi.quicklooklib.fragments; import android.os.Bundle; import android.view.LayoutInflater; import android.view.View; import android.view.ViewGroup; import android.webkit.WebView; import cl.uchile.ing.adi.quicklooklib.R; /** * Created by dudu on 04-01-2016. */ public class WebFragment extends QuicklookFragment { @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) { View v = inflater.inflate(R.layout.fragment_web, container, false); WebView web = (WebView) v.findViewById(R.id.web_fragment); web.loadUrl("file://"+this.item.getPath()); web.getSettings().setBuiltInZoomControls(true); web.getSettings().setDisplayZoomControls(false); return v; } }
Hide zoom buttons fixed (Only on android >3.0)
Hide zoom buttons fixed (Only on android >3.0)
Java
mit
Ucampus/quicklook,FCFM-ADI/quicklook,FCFM-ADI/quicklook,Ucampus/quicklook
java
## Code Before: package cl.uchile.ing.adi.quicklooklib.fragments; import android.os.Bundle; import android.view.LayoutInflater; import android.view.View; import android.view.ViewGroup; import android.webkit.WebView; import cl.uchile.ing.adi.quicklooklib.R; /** * Created by dudu on 04-01-2016. */ public class WebFragment extends QuicklookFragment { @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) { View v = inflater.inflate(R.layout.fragment_web, container, false); WebView web = (WebView) v.findViewById(R.id.web_fragment); web.loadUrl("file://"+this.item.getPath()); return v; } } ## Instruction: Hide zoom buttons fixed (Only on android >3.0) ## Code After: package cl.uchile.ing.adi.quicklooklib.fragments; import android.os.Bundle; import android.view.LayoutInflater; import android.view.View; import android.view.ViewGroup; import android.webkit.WebView; import cl.uchile.ing.adi.quicklooklib.R; /** * Created by dudu on 04-01-2016. */ public class WebFragment extends QuicklookFragment { @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) { View v = inflater.inflate(R.layout.fragment_web, container, false); WebView web = (WebView) v.findViewById(R.id.web_fragment); web.loadUrl("file://"+this.item.getPath()); web.getSettings().setBuiltInZoomControls(true); web.getSettings().setDisplayZoomControls(false); return v; } }
e7b52c81b6f036cf6350f67f9b84e90b5dd88f33
chapters/npm-workshop-install/package.json
chapters/npm-workshop-install/package.json
{ "name": "npm-workshop-install", "version": "0.0.0", "description": "npm install: learn about the npm install command", "main": "index.js", "bin": { "help": "help.js", "verify": "verify.js", "chapter-reset": "commands/restart", "current-dir": "commands/current-dir", "chapter-test": "commands/test" }, "exercises": [ "exercises/install", "exercises/install-save" ], "dependencies": { "workshop-exercises": "*", "workshop-assertion-message": "*" }, "scripts": { "postinstall": "node ./help.js" }, "author": "Tim Oxley", "license": "ISC" }
{ "name": "npm-workshop-install", "version": "0.0.0", "description": "npm install: learn about the npm install command", "main": "index.js", "bin": { "help": "help.js", "verify": "verify.js", "chapter-reset": "commands/restart", "current-dir": "commands/current-dir", "chapter-test": "commands/test" }, "exercises": [ "exercises/install", "exercises/install-save", "exercises/install-save-dev" ], "dependencies": { "workshop-exercises": "*", "workshop-assertion-message": "*" }, "scripts": { "postinstall": "node ./help.js" }, "author": "Tim Oxley", "license": "ISC" }
Add install-save-dev to exercises list.
Add install-save-dev to exercises list.
JSON
mit
timoxley/npm-tutor,timoxley/npm-tutor,npm/npm-tutor,npm/npm-tutor
json
## Code Before: { "name": "npm-workshop-install", "version": "0.0.0", "description": "npm install: learn about the npm install command", "main": "index.js", "bin": { "help": "help.js", "verify": "verify.js", "chapter-reset": "commands/restart", "current-dir": "commands/current-dir", "chapter-test": "commands/test" }, "exercises": [ "exercises/install", "exercises/install-save" ], "dependencies": { "workshop-exercises": "*", "workshop-assertion-message": "*" }, "scripts": { "postinstall": "node ./help.js" }, "author": "Tim Oxley", "license": "ISC" } ## Instruction: Add install-save-dev to exercises list. ## Code After: { "name": "npm-workshop-install", "version": "0.0.0", "description": "npm install: learn about the npm install command", "main": "index.js", "bin": { "help": "help.js", "verify": "verify.js", "chapter-reset": "commands/restart", "current-dir": "commands/current-dir", "chapter-test": "commands/test" }, "exercises": [ "exercises/install", "exercises/install-save", "exercises/install-save-dev" ], "dependencies": { "workshop-exercises": "*", "workshop-assertion-message": "*" }, "scripts": { "postinstall": "node ./help.js" }, "author": "Tim Oxley", "license": "ISC" }
2dc19a90070dd0d50ecc9f37cf9cef57a4ddfa1c
jenkins-slave-startup.sh
jenkins-slave-startup.sh
set -ex # start the ssh daemon /usr/sbin/sshd -D # start the docker daemon /usr/local/bin/wrapdocker &
set -ex # start the docker daemon /usr/local/bin/wrapdocker & # start the ssh daemon /usr/sbin/sshd -D
Revert "Start sshd before docker"
Revert "Start sshd before docker" This reverts commit 3e747297f1d5a0206a205067804f68109f688ca4.
Shell
mit
kitpages/dind-jenkins-slave
shell
## Code Before: set -ex # start the ssh daemon /usr/sbin/sshd -D # start the docker daemon /usr/local/bin/wrapdocker & ## Instruction: Revert "Start sshd before docker" This reverts commit 3e747297f1d5a0206a205067804f68109f688ca4. ## Code After: set -ex # start the docker daemon /usr/local/bin/wrapdocker & # start the ssh daemon /usr/sbin/sshd -D
fad95a6071d39e649916f534f8561bd1fadd8a4a
packages/fela-bindings/src/ThemeProviderFactory.js
packages/fela-bindings/src/ThemeProviderFactory.js
/* @flow */ import { shallowEqual } from 'recompose' import { objectEach } from 'fela-utils' import createTheme from './createTheme' export default function ThemeProviderFactory( BaseComponent: any, renderChildren: Function, statics?: Object ): any { class ThemeProvider extends BaseComponent { theme: Object constructor(props: Object, context: Object) { super(props, context) const previousTheme = !props.overwrite && this.context.theme this.theme = createTheme(props.theme, previousTheme) } componentWillReceiveProps(nextProps: Object): void { if (this.props.theme !== nextProps.theme) { this.theme.update(nextProps.theme) } } shouldComponentUpdate(nextProps: Object): boolean { return shallowEqual(this.props.theme, nextProps.theme) === false } getChildContext(): Object { return { theme: this.theme } } render(): Object { return renderChildren(this.props.children) } } if (statics) { objectEach(statics, (value, key) => { ThemeProvider[key] = value }) } return ThemeProvider }
/* @flow */ import { shallowEqual } from 'recompose' import { objectEach } from 'fela-utils' import createTheme from './createTheme' export default function ThemeProviderFactory( BaseComponent: any, renderChildren: Function, statics?: Object ): any { class ThemeProvider extends BaseComponent { theme: Object constructor(props: Object, context: Object) { super(props, context) const previousTheme = !props.overwrite && this.context.theme this.theme = createTheme(props.theme, previousTheme) } componentWillReceiveProps(nextProps: Object): void { if (shallowEqual(this.props.theme, nextProps.theme) === false) { this.theme.update(nextProps.theme) } } shouldComponentUpdate(nextProps: Object): boolean { return shallowEqual(this.props.theme, nextProps.theme) === false } getChildContext(): Object { return { theme: this.theme } } render(): Object { return renderChildren(this.props.children) } } if (statics) { objectEach(statics, (value, key) => { ThemeProvider[key] = value }) } return ThemeProvider }
Use shallowEqual to prevent false-positive values
Use shallowEqual to prevent false-positive values
JavaScript
mit
derek-duncan/fela,rofrischmann/fela,risetechnologies/fela,risetechnologies/fela,derek-duncan/fela,derek-duncan/fela,derek-duncan/fela,derek-duncan/fela,derek-duncan/fela,risetechnologies/fela,rofrischmann/fela,rofrischmann/fela
javascript
## Code Before: /* @flow */ import { shallowEqual } from 'recompose' import { objectEach } from 'fela-utils' import createTheme from './createTheme' export default function ThemeProviderFactory( BaseComponent: any, renderChildren: Function, statics?: Object ): any { class ThemeProvider extends BaseComponent { theme: Object constructor(props: Object, context: Object) { super(props, context) const previousTheme = !props.overwrite && this.context.theme this.theme = createTheme(props.theme, previousTheme) } componentWillReceiveProps(nextProps: Object): void { if (this.props.theme !== nextProps.theme) { this.theme.update(nextProps.theme) } } shouldComponentUpdate(nextProps: Object): boolean { return shallowEqual(this.props.theme, nextProps.theme) === false } getChildContext(): Object { return { theme: this.theme } } render(): Object { return renderChildren(this.props.children) } } if (statics) { objectEach(statics, (value, key) => { ThemeProvider[key] = value }) } return ThemeProvider } ## Instruction: Use shallowEqual to prevent false-positive values ## Code After: /* @flow */ import { shallowEqual } from 'recompose' import { objectEach } from 'fela-utils' import createTheme from './createTheme' export default function ThemeProviderFactory( BaseComponent: any, renderChildren: Function, statics?: Object ): any { class ThemeProvider extends BaseComponent { theme: Object constructor(props: Object, context: Object) { super(props, context) const previousTheme = !props.overwrite && this.context.theme this.theme = createTheme(props.theme, previousTheme) } componentWillReceiveProps(nextProps: Object): void { if (shallowEqual(this.props.theme, nextProps.theme) === false) { this.theme.update(nextProps.theme) } } shouldComponentUpdate(nextProps: Object): boolean { return shallowEqual(this.props.theme, nextProps.theme) === false } getChildContext(): Object { return { theme: this.theme } } render(): Object { return renderChildren(this.props.children) } } if (statics) { objectEach(statics, (value, key) => { ThemeProvider[key] = value }) } return ThemeProvider }
7280360cd9f5f71c98266a86070fb7477bf73c77
pyproject.toml
pyproject.toml
[build-system] requires = ["poetry>=0.12"] build-backend = "poetry.masonry.api" [tool.poetry] name = "pkgconfig" version = "1.5.1" license = "MIT" description = "Interface Python with pkg-config" authors = ["Matthias Vogelgesang <matthias.vogelgesang@gmail.com>"] readme = "README.rst" homepage = "https://github.com/matze/pkgconfig" classifiers = [ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "Operating System :: OS Independent", "Topic :: Software Development :: Build Tools", ] [tool.poetry.dependencies] python = "~2.7 || ^3.3" [tool.poetry.dev-dependencies] pytest = "^3.8.2"
[build-system] requires = ["poetry_core>=1.0.0"] build-backend = "poetry.core.masonry.api" [tool.poetry] name = "pkgconfig" version = "1.5.1" license = "MIT" description = "Interface Python with pkg-config" authors = ["Matthias Vogelgesang <matthias.vogelgesang@gmail.com>"] readme = "README.rst" homepage = "https://github.com/matze/pkgconfig" classifiers = [ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "Operating System :: OS Independent", "Topic :: Software Development :: Build Tools", ] [tool.poetry.dependencies] python = "~2.7 || ^3.3" [tool.poetry.dev-dependencies] pytest = "^3.8.2"
Use `poetry_core` as build backend
Use `poetry_core` as build backend According to https://python-poetry.org/docs/pyproject/#poetry-and-pep-517 > If your `pyproject.toml` file still references `poetry` directly as a build backend, > you should update it to reference poetry_core instead.
TOML
mit
matze/pkgconfig
toml
## Code Before: [build-system] requires = ["poetry>=0.12"] build-backend = "poetry.masonry.api" [tool.poetry] name = "pkgconfig" version = "1.5.1" license = "MIT" description = "Interface Python with pkg-config" authors = ["Matthias Vogelgesang <matthias.vogelgesang@gmail.com>"] readme = "README.rst" homepage = "https://github.com/matze/pkgconfig" classifiers = [ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "Operating System :: OS Independent", "Topic :: Software Development :: Build Tools", ] [tool.poetry.dependencies] python = "~2.7 || ^3.3" [tool.poetry.dev-dependencies] pytest = "^3.8.2" ## Instruction: Use `poetry_core` as build backend According to https://python-poetry.org/docs/pyproject/#poetry-and-pep-517 > If your `pyproject.toml` file still references `poetry` directly as a build backend, > you should update it to reference poetry_core instead. ## Code After: [build-system] requires = ["poetry_core>=1.0.0"] build-backend = "poetry.core.masonry.api" [tool.poetry] name = "pkgconfig" version = "1.5.1" license = "MIT" description = "Interface Python with pkg-config" authors = ["Matthias Vogelgesang <matthias.vogelgesang@gmail.com>"] readme = "README.rst" homepage = "https://github.com/matze/pkgconfig" classifiers = [ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "Operating System :: OS Independent", "Topic :: Software Development :: Build Tools", ] [tool.poetry.dependencies] python = "~2.7 || ^3.3" [tool.poetry.dev-dependencies] pytest = "^3.8.2"
e3523cfdb0eefe4af476ac8c4e9f6affcebdacc6
test/units/pox_request_test.rb
test/units/pox_request_test.rb
require "test_helper" require "ostruct" describe Autodiscover::PoxRequest do let(:_class) {Autodiscover::PoxRequest } let(:http) { mock("http") } let(:client) { OpenStruct.new({http: http, domain: "example.local", email: "test@example.local"}) } it "returns a PoxResponse if the autodiscover is successful" do request_body = <<-EOF.gsub(/^ /,"") <?xml version="1.0"?> <Autodiscover xmlns="http://schemas.microsoft.com/exchange/autodiscover/outlook/requestschema/2006"> <Request> <EMailAddress>test@example.local</EMailAddress> <AcceptableResponseSchema>http://schemas.microsoft.com/exchange/autodiscover/outlook/responseschema/2006a</AcceptableResponseSchema> </Request> </Autodiscover> EOF http.expects(:post).with( "https://example.local/autodiscover/autodiscover.xml", request_body, {'Content-Type' => 'text/xml; charset=utf-8'} ).returns(OpenStruct.new({status: 200, body: "<test></test>"})) inst = _class.new(client) _(inst.autodiscover).must_be_instance_of(Autodiscover::PoxResponse) end end
require "test_helper" require "ostruct" describe Autodiscover::PoxRequest do let(:_class) {Autodiscover::PoxRequest } let(:http) { mock("http") } let(:client) { OpenStruct.new({http: http, domain: "example.local", email: "test@example.local"}) } it "returns a PoxResponse if the autodiscover is successful" do request_body = <<-EOF.gsub(/^ /,"") <?xml version="1.0"?> <Autodiscover xmlns="http://schemas.microsoft.com/exchange/autodiscover/outlook/requestschema/2006"> <Request> <EMailAddress>test@example.local</EMailAddress> <AcceptableResponseSchema>http://schemas.microsoft.com/exchange/autodiscover/outlook/responseschema/2006a</AcceptableResponseSchema> </Request> </Autodiscover> EOF http.expects(:post).with( "https://example.local/autodiscover/autodiscover.xml", request_body, {'Content-Type' => 'text/xml; charset=utf-8'} ).returns(OpenStruct.new({status: 200, body: "<Autodiscover><Response><test></test></Response></Autodiscover>"})) inst = _class.new(client) _(inst.autodiscover).must_be_instance_of(Autodiscover::PoxResponse) end end
Fix test for Nori integration
Fix test for Nori integration
Ruby
mit
WinRb/autodiscover
ruby
## Code Before: require "test_helper" require "ostruct" describe Autodiscover::PoxRequest do let(:_class) {Autodiscover::PoxRequest } let(:http) { mock("http") } let(:client) { OpenStruct.new({http: http, domain: "example.local", email: "test@example.local"}) } it "returns a PoxResponse if the autodiscover is successful" do request_body = <<-EOF.gsub(/^ /,"") <?xml version="1.0"?> <Autodiscover xmlns="http://schemas.microsoft.com/exchange/autodiscover/outlook/requestschema/2006"> <Request> <EMailAddress>test@example.local</EMailAddress> <AcceptableResponseSchema>http://schemas.microsoft.com/exchange/autodiscover/outlook/responseschema/2006a</AcceptableResponseSchema> </Request> </Autodiscover> EOF http.expects(:post).with( "https://example.local/autodiscover/autodiscover.xml", request_body, {'Content-Type' => 'text/xml; charset=utf-8'} ).returns(OpenStruct.new({status: 200, body: "<test></test>"})) inst = _class.new(client) _(inst.autodiscover).must_be_instance_of(Autodiscover::PoxResponse) end end ## Instruction: Fix test for Nori integration ## Code After: require "test_helper" require "ostruct" describe Autodiscover::PoxRequest do let(:_class) {Autodiscover::PoxRequest } let(:http) { mock("http") } let(:client) { OpenStruct.new({http: http, domain: "example.local", email: "test@example.local"}) } it "returns a PoxResponse if the autodiscover is successful" do request_body = <<-EOF.gsub(/^ /,"") <?xml version="1.0"?> <Autodiscover xmlns="http://schemas.microsoft.com/exchange/autodiscover/outlook/requestschema/2006"> <Request> <EMailAddress>test@example.local</EMailAddress> <AcceptableResponseSchema>http://schemas.microsoft.com/exchange/autodiscover/outlook/responseschema/2006a</AcceptableResponseSchema> </Request> </Autodiscover> EOF http.expects(:post).with( "https://example.local/autodiscover/autodiscover.xml", request_body, {'Content-Type' => 'text/xml; charset=utf-8'} ).returns(OpenStruct.new({status: 200, body: "<Autodiscover><Response><test></test></Response></Autodiscover>"})) inst = _class.new(client) _(inst.autodiscover).must_be_instance_of(Autodiscover::PoxResponse) end end
a8f44e52c36e7dee41d2e6a913e83e1d9bb9c1cc
sbr/folder_free.c
sbr/folder_free.c
/* * folder_free.c -- free a folder/message structure * * This code is Copyright (c) 2002, by the authors of nmh. See the * COPYRIGHT file in the root directory of the nmh distribution for * complete copyright information. */ #include <h/mh.h> void folder_free (struct msgs *mp) { size_t i; bvector_t *v; if (!mp) return; if (mp->foldpath) free (mp->foldpath); /* free the sequence names */ for (i = 0; i < svector_size (mp->msgattrs); i++) free (svector_at (mp->msgattrs, i)); svector_free (mp->msgattrs); for (i = 0, v = mp->msgstats; i < mp->num_msgstats; ++i, ++v) { bvector_free (*v); } free (mp->msgstats); /* Close/free the sequence file if it is open */ if (mp->seqhandle) lkfclosedata (mp->seqhandle, mp->seqname); if (mp->seqname) free (mp->seqname); bvector_free (mp->attrstats); free (mp); /* free main folder structure */ }
/* * folder_free.c -- free a folder/message structure * * This code is Copyright (c) 2002, by the authors of nmh. See the * COPYRIGHT file in the root directory of the nmh distribution for * complete copyright information. */ #include <h/mh.h> #include <h/utils.h> void folder_free (struct msgs *mp) { size_t i; bvector_t *v; if (!mp) return; mh_xfree(mp->foldpath); /* free the sequence names */ for (i = 0; i < svector_size (mp->msgattrs); i++) free (svector_at (mp->msgattrs, i)); svector_free (mp->msgattrs); for (i = 0, v = mp->msgstats; i < mp->num_msgstats; ++i, ++v) { bvector_free (*v); } free (mp->msgstats); /* Close/free the sequence file if it is open */ if (mp->seqhandle) lkfclosedata (mp->seqhandle, mp->seqname); mh_xfree(mp->seqname); bvector_free (mp->attrstats); free (mp); /* free main folder structure */ }
Replace `if (p) free(p)' with `mh_xfree(p)'.
Replace `if (p) free(p)' with `mh_xfree(p)'.
C
bsd-3-clause
mcr/nmh,mcr/nmh
c
## Code Before: /* * folder_free.c -- free a folder/message structure * * This code is Copyright (c) 2002, by the authors of nmh. See the * COPYRIGHT file in the root directory of the nmh distribution for * complete copyright information. */ #include <h/mh.h> void folder_free (struct msgs *mp) { size_t i; bvector_t *v; if (!mp) return; if (mp->foldpath) free (mp->foldpath); /* free the sequence names */ for (i = 0; i < svector_size (mp->msgattrs); i++) free (svector_at (mp->msgattrs, i)); svector_free (mp->msgattrs); for (i = 0, v = mp->msgstats; i < mp->num_msgstats; ++i, ++v) { bvector_free (*v); } free (mp->msgstats); /* Close/free the sequence file if it is open */ if (mp->seqhandle) lkfclosedata (mp->seqhandle, mp->seqname); if (mp->seqname) free (mp->seqname); bvector_free (mp->attrstats); free (mp); /* free main folder structure */ } ## Instruction: Replace `if (p) free(p)' with `mh_xfree(p)'. ## Code After: /* * folder_free.c -- free a folder/message structure * * This code is Copyright (c) 2002, by the authors of nmh. See the * COPYRIGHT file in the root directory of the nmh distribution for * complete copyright information. */ #include <h/mh.h> #include <h/utils.h> void folder_free (struct msgs *mp) { size_t i; bvector_t *v; if (!mp) return; mh_xfree(mp->foldpath); /* free the sequence names */ for (i = 0; i < svector_size (mp->msgattrs); i++) free (svector_at (mp->msgattrs, i)); svector_free (mp->msgattrs); for (i = 0, v = mp->msgstats; i < mp->num_msgstats; ++i, ++v) { bvector_free (*v); } free (mp->msgstats); /* Close/free the sequence file if it is open */ if (mp->seqhandle) lkfclosedata (mp->seqhandle, mp->seqname); mh_xfree(mp->seqname); bvector_free (mp->attrstats); free (mp); /* free main folder structure */ }
02118d275df889f586ea555fba7ad308b20a217f
package.json
package.json
{ "name": "svg-labels", "version": "1.1.1", "license": "ISC", "description": "Easy SVG Labels", "repository": "github:bhousel/svg-labels", "main": "index.js", "contributors": [ "Bryan Housel <bhousel@gmail.com> (https://github.com/bhousel)" ], "keywords": [ "svg", "github", "labels" ], "dependencies": { "koa": "2.13.0", "koa-qs": "2.0.0", "koa-router": "7.1.1", "koa-send": "4.1.0", "string-pixel-width": "1.10.0", "xml-escape": "1.1.0" }, "devDependencies": { "coveralls": "^3.1.0", "tap": "^14.10.08" }, "engines": { "node": ">=10" }, "scripts": { "start": "node server.js", "test": "tap --cov test/*.js" } }
{ "name": "svg-labels", "version": "1.1.1", "license": "ISC", "description": "Easy SVG Labels", "repository": "github:bhousel/svg-labels", "main": "index.js", "contributors": [ "Bryan Housel <bhousel@gmail.com> (https://github.com/bhousel)" ], "keywords": [ "svg", "github", "labels" ], "dependencies": { "koa": "~2.13.0", "koa-qs": "~3.0.0", "koa-router": "~10.0.0", "koa-send": "~5.0.1", "string-pixel-width": "~1.10.0", "xml-escape": "~1.1.0" }, "devDependencies": { "coveralls": "^3.1.0", "tap": "^14.11.0" }, "engines": { "node": ">=10" }, "scripts": { "start": "node server.js", "test": "tap --cov test/*.js" } }
Update the rest of the koa dependencies
Update the rest of the koa dependencies
JSON
isc
bhousel/svg-labels
json
## Code Before: { "name": "svg-labels", "version": "1.1.1", "license": "ISC", "description": "Easy SVG Labels", "repository": "github:bhousel/svg-labels", "main": "index.js", "contributors": [ "Bryan Housel <bhousel@gmail.com> (https://github.com/bhousel)" ], "keywords": [ "svg", "github", "labels" ], "dependencies": { "koa": "2.13.0", "koa-qs": "2.0.0", "koa-router": "7.1.1", "koa-send": "4.1.0", "string-pixel-width": "1.10.0", "xml-escape": "1.1.0" }, "devDependencies": { "coveralls": "^3.1.0", "tap": "^14.10.08" }, "engines": { "node": ">=10" }, "scripts": { "start": "node server.js", "test": "tap --cov test/*.js" } } ## Instruction: Update the rest of the koa dependencies ## Code After: { "name": "svg-labels", "version": "1.1.1", "license": "ISC", "description": "Easy SVG Labels", "repository": "github:bhousel/svg-labels", "main": "index.js", "contributors": [ "Bryan Housel <bhousel@gmail.com> (https://github.com/bhousel)" ], "keywords": [ "svg", "github", "labels" ], "dependencies": { "koa": "~2.13.0", "koa-qs": "~3.0.0", "koa-router": "~10.0.0", "koa-send": "~5.0.1", "string-pixel-width": "~1.10.0", "xml-escape": "~1.1.0" }, "devDependencies": { "coveralls": "^3.1.0", "tap": "^14.11.0" }, "engines": { "node": ">=10" }, "scripts": { "start": "node server.js", "test": "tap --cov test/*.js" } }
80dfc99f3b8df476030ce32891c3d7d7f97ac758
README.md
README.md
Example of how to setup continuous integration for Python.
[![Build Status](https://travis-ci.org/scottclowe/python-ci.svg?branch=master)](https://travis-ci.org/scottclowe/python-ci) [![Shippable branch](https://img.shields.io/shippable/5674d4821895ca447466a204/master.svg)](https://app.shippable.com/projects/5674d4821895ca447466a204) # python-ci Example of how to setup continuous integration for Python.
Add Travis and Shippable badges
Add Travis and Shippable badges
Markdown
mit
scottclowe/python-ci,scottclowe/python-continuous-integration,scottclowe/python-continuous-integration,scottclowe/python-ci
markdown
## Code Before: Example of how to setup continuous integration for Python. ## Instruction: Add Travis and Shippable badges ## Code After: [![Build Status](https://travis-ci.org/scottclowe/python-ci.svg?branch=master)](https://travis-ci.org/scottclowe/python-ci) [![Shippable branch](https://img.shields.io/shippable/5674d4821895ca447466a204/master.svg)](https://app.shippable.com/projects/5674d4821895ca447466a204) # python-ci Example of how to setup continuous integration for Python.
50fd818e6f0cb4f1114fd5855c24ba5974ebf081
Manifest.txt
Manifest.txt
History.txt License.txt Manifest.txt README.txt Rakefile ext/extconf.rb ext/rb_oniguruma.c ext/rb_oniguruma.h ext/rb_oniguruma_ext.h ext/rb_oniguruma_ext_match.c ext/rb_oniguruma_ext_string.c ext/rb_oniguruma_match.c ext/rb_oniguruma_match.h ext/rb_oniguruma_oregexp.c ext/rb_oniguruma_struct_args.h ext/rb_oniguruma_version.h scripts/txt2html setup.rb spec/match_ext_spec.rb spec/oniguruma_spec.rb spec/oregexp_spec.rb spec/spec.opts spec/spec_helper.rb spec/string_ext_spec.rb website/index.html website/index.txt website/javascripts/rounded_corners_lite.inc.js website/stylesheets/screen.css website/template.rhtml
History.txt License.txt Manifest.txt README.txt Rakefile ext/depend ext/extconf.rb ext/rb_oniguruma.c ext/rb_oniguruma.h ext/rb_oniguruma_ext.h ext/rb_oniguruma_ext_match.c ext/rb_oniguruma_ext_string.c ext/rb_oniguruma_match.c ext/rb_oniguruma_match.h ext/rb_oniguruma_oregexp.c ext/rb_oniguruma_struct_args.h ext/rb_oniguruma_version.h scripts/txt2html setup.rb spec/match_ext_spec.rb spec/oniguruma_spec.rb spec/oregexp_spec.rb spec/spec.opts spec/spec_helper.rb spec/string_ext_spec.rb website/index.html website/index.txt website/javascripts/rounded_corners_lite.inc.js website/stylesheets/screen.css website/template.rhtml
Include depends file in gem
Include depends file in gem
Text
mit
geoffgarside/oniguruma,geoffgarside/oniguruma
text
## Code Before: History.txt License.txt Manifest.txt README.txt Rakefile ext/extconf.rb ext/rb_oniguruma.c ext/rb_oniguruma.h ext/rb_oniguruma_ext.h ext/rb_oniguruma_ext_match.c ext/rb_oniguruma_ext_string.c ext/rb_oniguruma_match.c ext/rb_oniguruma_match.h ext/rb_oniguruma_oregexp.c ext/rb_oniguruma_struct_args.h ext/rb_oniguruma_version.h scripts/txt2html setup.rb spec/match_ext_spec.rb spec/oniguruma_spec.rb spec/oregexp_spec.rb spec/spec.opts spec/spec_helper.rb spec/string_ext_spec.rb website/index.html website/index.txt website/javascripts/rounded_corners_lite.inc.js website/stylesheets/screen.css website/template.rhtml ## Instruction: Include depends file in gem ## Code After: History.txt License.txt Manifest.txt README.txt Rakefile ext/depend ext/extconf.rb ext/rb_oniguruma.c ext/rb_oniguruma.h ext/rb_oniguruma_ext.h ext/rb_oniguruma_ext_match.c ext/rb_oniguruma_ext_string.c ext/rb_oniguruma_match.c ext/rb_oniguruma_match.h ext/rb_oniguruma_oregexp.c ext/rb_oniguruma_struct_args.h ext/rb_oniguruma_version.h scripts/txt2html setup.rb spec/match_ext_spec.rb spec/oniguruma_spec.rb spec/oregexp_spec.rb spec/spec.opts spec/spec_helper.rb spec/string_ext_spec.rb website/index.html website/index.txt website/javascripts/rounded_corners_lite.inc.js website/stylesheets/screen.css website/template.rhtml
072d3d17ca443a5bf36a1d68836c98896b10d2d7
conda-recipes/geomdl/meta.yaml
conda-recipes/geomdl/meta.yaml
{% set setup_data = load_setup_py_data() %} package: name: geomdl version: {{ setup_data['version'] }} source: path: ../../ build: noarch: python string: {{ GIT_DESCRIBE_HASH[1:] }} script: - python setup.py install --single-version-externally-managed --record=record.txt requirements: host: - python - setuptools - six #- enum34 # required for py27 run: - python - six #- enum34 # required for py27 - numpy - matplotlib test: imports: - geomdl source_files: - tests/ requires: - pytest commands: - python -c "import geomdl; print(geomdl.__version__)" - pytest about: home: https://onurraufbingol.com/NURBS-Python/ license: MIT license_file: ../../LICENSE summary: Object-oriented NURBS curve and surface evaluation library description: "Self-contained, object-oriented, pure Python B-Spline and NURBS evaluation library with knot vector and surface grid generators." doc_url: "http://nurbs-python.readthedocs.io/" dev_url: "https://github.com/orbingol/NURBS-Python" extra: recipe-maintainers: - orbingol
{% set setup_data = load_setup_py_data() %} {% set build_number = 0 %} package: name: geomdl version: {{ setup_data['version'] }} source: path: ../../ build: noarch: python string: {{ GIT_DESCRIBE_HASH[1:] }}_{{ build_number }} script: - python setup.py sdist - pip install dist/geomdl-{{ setup_data['version'] }}.tar.gz --no-deps #- python setup.py install --single-version-externally-managed --record=record.txt requirements: host: - python - setuptools - six #- enum34 # required for py27 run: - python - six #- enum34 # required for py27 - numpy - matplotlib test: imports: - geomdl source_files: - tests/ requires: - pytest commands: - python -c "import geomdl; print(geomdl.__version__)" - pytest about: home: https://onurraufbingol.com/NURBS-Python/ license: MIT license_file: ../../LICENSE summary: Object-oriented NURBS curve and surface evaluation library description: "Self-contained, object-oriented, pure Python B-Spline and NURBS evaluation library with knot vector and surface grid generators." doc_url: "http://nurbs-python.readthedocs.io/" dev_url: "https://github.com/orbingol/NURBS-Python" extra: recipe-maintainers: - orbingol
Update conda-build recipe to remove egg-info dir from site-packages
Update conda-build recipe to remove egg-info dir from site-packages
YAML
mit
orbingol/NURBS-Python,orbingol/NURBS-Python
yaml
## Code Before: {% set setup_data = load_setup_py_data() %} package: name: geomdl version: {{ setup_data['version'] }} source: path: ../../ build: noarch: python string: {{ GIT_DESCRIBE_HASH[1:] }} script: - python setup.py install --single-version-externally-managed --record=record.txt requirements: host: - python - setuptools - six #- enum34 # required for py27 run: - python - six #- enum34 # required for py27 - numpy - matplotlib test: imports: - geomdl source_files: - tests/ requires: - pytest commands: - python -c "import geomdl; print(geomdl.__version__)" - pytest about: home: https://onurraufbingol.com/NURBS-Python/ license: MIT license_file: ../../LICENSE summary: Object-oriented NURBS curve and surface evaluation library description: "Self-contained, object-oriented, pure Python B-Spline and NURBS evaluation library with knot vector and surface grid generators." doc_url: "http://nurbs-python.readthedocs.io/" dev_url: "https://github.com/orbingol/NURBS-Python" extra: recipe-maintainers: - orbingol ## Instruction: Update conda-build recipe to remove egg-info dir from site-packages ## Code After: {% set setup_data = load_setup_py_data() %} {% set build_number = 0 %} package: name: geomdl version: {{ setup_data['version'] }} source: path: ../../ build: noarch: python string: {{ GIT_DESCRIBE_HASH[1:] }}_{{ build_number }} script: - python setup.py sdist - pip install dist/geomdl-{{ setup_data['version'] }}.tar.gz --no-deps #- python setup.py install --single-version-externally-managed --record=record.txt requirements: host: - python - setuptools - six #- enum34 # required for py27 run: - python - six #- enum34 # required for py27 - numpy - matplotlib test: imports: - geomdl source_files: - tests/ requires: - pytest commands: - python -c "import geomdl; print(geomdl.__version__)" - pytest about: home: https://onurraufbingol.com/NURBS-Python/ license: MIT license_file: ../../LICENSE summary: Object-oriented NURBS curve and surface evaluation library description: "Self-contained, object-oriented, pure Python B-Spline and NURBS evaluation library with knot vector and surface grid generators." doc_url: "http://nurbs-python.readthedocs.io/" dev_url: "https://github.com/orbingol/NURBS-Python" extra: recipe-maintainers: - orbingol
f12dcb2976e29b270e6a9b3235a705fda653c129
app/views/comments/_comment.html.erb
app/views/comments/_comment.html.erb
<li class="comment-list-item" id="comment-<%= comment.id %>"> <h4 class="comment-list-item__headline"><%= h(comment.author) %> / <span class="comment-list-item__headline-date"><%= display_comment_timestamp(comment) %></span></h4> <%= comment.html %> <div class="comment-list-item__report-link"><%= report_comment_link(comment) %></div> <%- unless comment.published %> <div class="spamwarning"><%= t(".this_comment_has_been_flagged_for_moderator_approval") %></div> <%- end %> </li>
<li class="comment-list-item" id="comment-<%= comment.id %>"> <h4 class="comment-list-item__headline"><%= h(comment.author) %> / <span class="comment-list-item__headline-date"><%= display_comment_timestamp(comment) %></span></h4> <%= simple_format(comment.html) %> <div class="comment-list-item__report-link"><%= report_comment_link(comment) %></div> <%- unless comment.published %> <div class="spamwarning"><%= t(".this_comment_has_been_flagged_for_moderator_approval") %></div> <%- end %> </li>
Use simple_format on comments to turn line breaks to new lines / paragraphs.
Use simple_format on comments to turn line breaks to new lines / paragraphs.
HTML+ERB
mit
moneyadviceservice/publify,moneyadviceservice/publify,moneyadviceservice/publify,moneyadviceservice/publify
html+erb
## Code Before: <li class="comment-list-item" id="comment-<%= comment.id %>"> <h4 class="comment-list-item__headline"><%= h(comment.author) %> / <span class="comment-list-item__headline-date"><%= display_comment_timestamp(comment) %></span></h4> <%= comment.html %> <div class="comment-list-item__report-link"><%= report_comment_link(comment) %></div> <%- unless comment.published %> <div class="spamwarning"><%= t(".this_comment_has_been_flagged_for_moderator_approval") %></div> <%- end %> </li> ## Instruction: Use simple_format on comments to turn line breaks to new lines / paragraphs. ## Code After: <li class="comment-list-item" id="comment-<%= comment.id %>"> <h4 class="comment-list-item__headline"><%= h(comment.author) %> / <span class="comment-list-item__headline-date"><%= display_comment_timestamp(comment) %></span></h4> <%= simple_format(comment.html) %> <div class="comment-list-item__report-link"><%= report_comment_link(comment) %></div> <%- unless comment.published %> <div class="spamwarning"><%= t(".this_comment_has_been_flagged_for_moderator_approval") %></div> <%- end %> </li>
83eec6a141879ac6756ff6d94dfa7e8312caacae
index.html
index.html
<!DOCTYPE html> <html> <head> <script src="shared/jquery-1.3.2.js"></script> <script src="jquery.ba-starwipe.js"></script> <style> body { background-color: gray; color: white; } </style> </head> <body> <h1>Star Wipe</h1> <ul> <li><a class="demo-link" href="http://www.wikipedia.org">Wikipedia</a></li> <li><a class="demo-link" href="http://developer.yahoo.com/">Yahoo Developer</a></li> </ul> <script> $('.demo-link').click(function(event) { $.starwipe($(event.target).attr('href')); event.preventDefault(); }); </script> </body> </html>
<!DOCTYPE html> <html> <head> <script src="shared/jquery-1.3.2.js"></script> <script src="jquery.ba-starwipe.js"></script> <style> html { font-family: "Helvetica Neue", Helvetica; } </style> </head> <body> <h1>jQuery Star Wipe</h1> <p> Clicking the links in the list below will load them in an <code>&lt;iframe&gt;</code> and then use a star wipe transition to show them in your browser. </p> <ul> <li><a class="demo-link" href="http://www.nytimes.com/">New York Times</a></li> <li><a class="demo-link" href="http://www.wikipedia.org">Wikipedia</a></li> </ul> <hr> <footer> <dl> <dt>Maintainer</dt> <dd><a href="https://twitter.com/ssorallen">@ssorallen</a></dd> <dt>Code</dt> <dd><a href="https://github.com/ssorallen/jquery-starwipe">ssorallen/jquery-starwipe</a></dd> </dl> </footer> <script> $('.demo-link').click(function(event) { $.starwipe($(event.target).attr('href')); event.preventDefault(); }); </script> </body> </html>
Replace disallowed Yahoo Dev with NYTimes
Replace disallowed Yahoo Dev with NYTimes Yahoo Developer is disallowed in iframes with the `X-Frame-Options` header and is therefore a bad example site for the star wipe. NYTimes does not set the header, use it instead. * Add attribution and links to the GitHub repo for promotion
HTML
mit
ssorallen/jquery-starwipe,ssorallen/jquery-starwipe
html
## Code Before: <!DOCTYPE html> <html> <head> <script src="shared/jquery-1.3.2.js"></script> <script src="jquery.ba-starwipe.js"></script> <style> body { background-color: gray; color: white; } </style> </head> <body> <h1>Star Wipe</h1> <ul> <li><a class="demo-link" href="http://www.wikipedia.org">Wikipedia</a></li> <li><a class="demo-link" href="http://developer.yahoo.com/">Yahoo Developer</a></li> </ul> <script> $('.demo-link').click(function(event) { $.starwipe($(event.target).attr('href')); event.preventDefault(); }); </script> </body> </html> ## Instruction: Replace disallowed Yahoo Dev with NYTimes Yahoo Developer is disallowed in iframes with the `X-Frame-Options` header and is therefore a bad example site for the star wipe. NYTimes does not set the header, use it instead. * Add attribution and links to the GitHub repo for promotion ## Code After: <!DOCTYPE html> <html> <head> <script src="shared/jquery-1.3.2.js"></script> <script src="jquery.ba-starwipe.js"></script> <style> html { font-family: "Helvetica Neue", Helvetica; } </style> </head> <body> <h1>jQuery Star Wipe</h1> <p> Clicking the links in the list below will load them in an <code>&lt;iframe&gt;</code> and then use a star wipe transition to show them in your browser. </p> <ul> <li><a class="demo-link" href="http://www.nytimes.com/">New York Times</a></li> <li><a class="demo-link" href="http://www.wikipedia.org">Wikipedia</a></li> </ul> <hr> <footer> <dl> <dt>Maintainer</dt> <dd><a href="https://twitter.com/ssorallen">@ssorallen</a></dd> <dt>Code</dt> <dd><a href="https://github.com/ssorallen/jquery-starwipe">ssorallen/jquery-starwipe</a></dd> </dl> </footer> <script> $('.demo-link').click(function(event) { $.starwipe($(event.target).attr('href')); event.preventDefault(); }); </script> </body> </html>
daf6972826fbb939a46a4b10fe8c6732b4744af6
src/shared/components/donate/donate.js
src/shared/components/donate/donate.js
import React from 'react'; // import PropTypes from 'prop-types'; import Section from 'shared/components/section/section'; import LinkButton from 'shared/components/linkButton/linkButton'; import styles from './donate.css'; const Donate = (props) => { const { ...otherProps } = props; return ( <Section title="Donate" headingLines={false} {...otherProps} className={styles.donateSection} headingTheme="white" > <div className={styles.donate} > <p> Our mission and veteran programs are all maintained through the efforts of our all volunteer staff. Your thoughtful contribution to our fund allows us to expand our reach and help more veterans attend developer conferences. </p> <p>Thank you for helping us to get veterans coding!</p> <LinkButton text="Donate Now" link="#" /> </div> </Section> ); }; Donate.propTypes = {}; export default Donate;
import React from 'react'; // import PropTypes from 'prop-types'; import Section from 'shared/components/section/section'; import LinkButton from 'shared/components/linkButton/linkButton'; import styles from './donate.css'; const Donate = (props) => { const { ...otherProps } = props; return ( <Section title="Donate" headingLines={false} {...otherProps} className={styles.donateSection} headingTheme="white" > <div className={styles.donate} > <p> Our mission and veteran programs are all maintained through the efforts of our all volunteer staff. Your thoughtful contribution to our fund allows us to expand our reach and help more veterans attend developer conferences. </p> <p>Thank you for helping us to get veterans coding!</p> <LinkButton text="Donate Now" link="https://donorbox.org/operationcode" /> </div> </Section> ); }; Donate.propTypes = {}; export default Donate;
Change button link to point to Donorbox page
Change button link to point to Donorbox page
JavaScript
mit
alexspence/operationcode_frontend,hollomancer/operationcode_frontend,tskuse/operationcode_frontend,OperationCode/operationcode_frontend,tal87/operationcode_frontend,hollomancer/operationcode_frontend,tal87/operationcode_frontend,NestorSegura/operationcode_frontend,hollomancer/operationcode_frontend,sethbergman/operationcode_frontend,sethbergman/operationcode_frontend,miaket/operationcode_frontend,miaket/operationcode_frontend,tskuse/operationcode_frontend,NestorSegura/operationcode_frontend,miaket/operationcode_frontend,tskuse/operationcode_frontend,sethbergman/operationcode_frontend,alexspence/operationcode_frontend,OperationCode/operationcode_frontend,alexspence/operationcode_frontend,OperationCode/operationcode_frontend,NestorSegura/operationcode_frontend,tal87/operationcode_frontend
javascript
## Code Before: import React from 'react'; // import PropTypes from 'prop-types'; import Section from 'shared/components/section/section'; import LinkButton from 'shared/components/linkButton/linkButton'; import styles from './donate.css'; const Donate = (props) => { const { ...otherProps } = props; return ( <Section title="Donate" headingLines={false} {...otherProps} className={styles.donateSection} headingTheme="white" > <div className={styles.donate} > <p> Our mission and veteran programs are all maintained through the efforts of our all volunteer staff. Your thoughtful contribution to our fund allows us to expand our reach and help more veterans attend developer conferences. </p> <p>Thank you for helping us to get veterans coding!</p> <LinkButton text="Donate Now" link="#" /> </div> </Section> ); }; Donate.propTypes = {}; export default Donate; ## Instruction: Change button link to point to Donorbox page ## Code After: import React from 'react'; // import PropTypes from 'prop-types'; import Section from 'shared/components/section/section'; import LinkButton from 'shared/components/linkButton/linkButton'; import styles from './donate.css'; const Donate = (props) => { const { ...otherProps } = props; return ( <Section title="Donate" headingLines={false} {...otherProps} className={styles.donateSection} headingTheme="white" > <div className={styles.donate} > <p> Our mission and veteran programs are all maintained through the efforts of our all volunteer staff. Your thoughtful contribution to our fund allows us to expand our reach and help more veterans attend developer conferences. </p> <p>Thank you for helping us to get veterans coding!</p> <LinkButton text="Donate Now" link="https://donorbox.org/operationcode" /> </div> </Section> ); }; Donate.propTypes = {}; export default Donate;
ff52533c93391efe9ea49d36e024c259d5ec7d12
src/IdFixture.php
src/IdFixture.php
<?php namespace OpenConext\Component\EngineBlockFixtures; use OpenConext\Component\EngineBlockFixtures\DataStore\AbstractDataStore; /** * Ids * @package OpenConext\Component\EngineBlockFixtures */ class IdFixture { protected $dataStore; protected $frames = array(); /** * @param AbstractDataStore $dataStore */ function __construct(AbstractDataStore $dataStore) { $this->dataStore = $dataStore; $this->frames = $this->dataStore->load(); } /** * Get the top frame off the queue for use. */ public function shiftFrame() { return array_shift($this->frames); } /** * Queue up another set of ids to use. * * @param IdFrame $frame */ public function addFrame(IdFrame $frame) { $this->frames[] = $frame; return $this; } /** * Remove all frames. */ public function clear() { $this->frames = array(); return $this; } /** * On destroy write out the current state. */ public function __destruct() { $this->dataStore->save($this->frames); } }
<?php namespace OpenConext\Component\EngineBlockFixtures; use OpenConext\Component\EngineBlockFixtures\DataStore\AbstractDataStore; /** * Ids * @package OpenConext\Component\EngineBlockFixtures */ class IdFixture { protected $dataStore; protected $frames = array(); /** * @param AbstractDataStore $dataStore */ function __construct(AbstractDataStore $dataStore) { $this->dataStore = $dataStore; $this->frames = $this->dataStore->load(); } /** * Get the top frame off the queue for use. */ public function shiftFrame() { if (empty($this->frames)) { throw new \RuntimeException('No more IdFrames?'); } return array_shift($this->frames); } /** * Queue up another set of ids to use. * * @param IdFrame $frame */ public function addFrame(IdFrame $frame) { $this->frames[] = $frame; return $this; } /** * Remove all frames. */ public function clear() { $this->frames = array(); return $this; } /** * On destroy write out the current state. */ public function __destruct() { $this->dataStore->save($this->frames); } }
Throw an exception if there are no more IdFrames
Throw an exception if there are no more IdFrames
PHP
apache-2.0
OpenConext/OpenConext-engineblock-fixtures
php
## Code Before: <?php namespace OpenConext\Component\EngineBlockFixtures; use OpenConext\Component\EngineBlockFixtures\DataStore\AbstractDataStore; /** * Ids * @package OpenConext\Component\EngineBlockFixtures */ class IdFixture { protected $dataStore; protected $frames = array(); /** * @param AbstractDataStore $dataStore */ function __construct(AbstractDataStore $dataStore) { $this->dataStore = $dataStore; $this->frames = $this->dataStore->load(); } /** * Get the top frame off the queue for use. */ public function shiftFrame() { return array_shift($this->frames); } /** * Queue up another set of ids to use. * * @param IdFrame $frame */ public function addFrame(IdFrame $frame) { $this->frames[] = $frame; return $this; } /** * Remove all frames. */ public function clear() { $this->frames = array(); return $this; } /** * On destroy write out the current state. */ public function __destruct() { $this->dataStore->save($this->frames); } } ## Instruction: Throw an exception if there are no more IdFrames ## Code After: <?php namespace OpenConext\Component\EngineBlockFixtures; use OpenConext\Component\EngineBlockFixtures\DataStore\AbstractDataStore; /** * Ids * @package OpenConext\Component\EngineBlockFixtures */ class IdFixture { protected $dataStore; protected $frames = array(); /** * @param AbstractDataStore $dataStore */ function __construct(AbstractDataStore $dataStore) { $this->dataStore = $dataStore; $this->frames = $this->dataStore->load(); } /** * Get the top frame off the queue for use. */ public function shiftFrame() { if (empty($this->frames)) { throw new \RuntimeException('No more IdFrames?'); } return array_shift($this->frames); } /** * Queue up another set of ids to use. * * @param IdFrame $frame */ public function addFrame(IdFrame $frame) { $this->frames[] = $frame; return $this; } /** * Remove all frames. */ public function clear() { $this->frames = array(); return $this; } /** * On destroy write out the current state. */ public function __destruct() { $this->dataStore->save($this->frames); } }
120a2172ff1416c05a2e51d2f9d12ce0438726aa
NoteWrangler/package.json
NoteWrangler/package.json
{ "name": "note-wrangler", "version": "1.0.0", "description": "", "main": "app.js", "scripts": { "test": "echo \"Error: no test specified\" && exit 1", "start": "node server.js" }, "author": "Vitalii Rybka", "license": "MIT", "devDependencies": { "babel-core": "^6.23.1", "babel-preset-env": "^1.1.9", "babel-preset-es2015": "^6.22.0", "bower": "^1.8.0", "crypto-js": "^3.1.9-1", "gulp": "^3.9.1", "gulp-babel": "^6.1.2", "gulp-concat-css": "^2.3.0", "gulp-csso": "^2.0.0", "gulp-flatten": "^0.3.1", "gulp-htmlmin": "^3.0.0", "gulp-inject": "^4.2.0", "gulp-rename": "^1.2.2", "gulp-replace": "^0.5.4", "gulp-uglify": "^2.0.1", "json-server": "^0.9.5" } }
{ "name": "note-wrangler", "version": "1.0.0", "description": "", "main": "app.js", "scripts": { "test": "echo \"Error: no test specified\" && exit 1", "start": "node server.js" }, "author": "Vitalii Rybka", "license": "MIT", "devDependencies": { "babel-core": "^6.23.1", "babel-preset-env": "^1.1.9", "babel-preset-es2015": "^6.22.0", "babelify": "^7.3.0", "bower": "^1.8.0", "browserify": "^14.1.0", "crypto-js": "^3.1.9-1", "gulp": "^3.9.1", "gulp-babel": "^6.1.2", "gulp-buffer": "0.0.2", "gulp-cli": "^1.2.2", "gulp-concat": "^2.6.1", "gulp-concat-css": "^2.3.0", "gulp-csso": "^2.0.0", "gulp-flatten": "^0.3.1", "gulp-htmlmin": "^3.0.0", "gulp-inject": "^4.2.0", "gulp-rename": "^1.2.2", "gulp-replace": "^0.5.4", "gulp-sourcemaps": "^2.4.1", "gulp-tap": "^0.1.3", "gulp-uglify": "^2.0.1", "gulp-util": "^3.0.8", "json-server": "^0.9.5", "vinyl-source-stream": "^1.1.0" } }
Add babelify, browserify, gulp-buffer, gulp-cli, gulp-concat, gulp-sourcemaps, gulp-tap, gulp-util, vinyl-source-stream, json-server
Add babelify, browserify, gulp-buffer, gulp-cli, gulp-concat, gulp-sourcemaps, gulp-tap, gulp-util, vinyl-source-stream, json-server
JSON
mit
var-bin/angularjs-training,var-bin/angularjs-training,var-bin/angularjs-training,var-bin/angularjs-training
json
## Code Before: { "name": "note-wrangler", "version": "1.0.0", "description": "", "main": "app.js", "scripts": { "test": "echo \"Error: no test specified\" && exit 1", "start": "node server.js" }, "author": "Vitalii Rybka", "license": "MIT", "devDependencies": { "babel-core": "^6.23.1", "babel-preset-env": "^1.1.9", "babel-preset-es2015": "^6.22.0", "bower": "^1.8.0", "crypto-js": "^3.1.9-1", "gulp": "^3.9.1", "gulp-babel": "^6.1.2", "gulp-concat-css": "^2.3.0", "gulp-csso": "^2.0.0", "gulp-flatten": "^0.3.1", "gulp-htmlmin": "^3.0.0", "gulp-inject": "^4.2.0", "gulp-rename": "^1.2.2", "gulp-replace": "^0.5.4", "gulp-uglify": "^2.0.1", "json-server": "^0.9.5" } } ## Instruction: Add babelify, browserify, gulp-buffer, gulp-cli, gulp-concat, gulp-sourcemaps, gulp-tap, gulp-util, vinyl-source-stream, json-server ## Code After: { "name": "note-wrangler", "version": "1.0.0", "description": "", "main": "app.js", "scripts": { "test": "echo \"Error: no test specified\" && exit 1", "start": "node server.js" }, "author": "Vitalii Rybka", "license": "MIT", "devDependencies": { "babel-core": "^6.23.1", "babel-preset-env": "^1.1.9", "babel-preset-es2015": "^6.22.0", "babelify": "^7.3.0", "bower": "^1.8.0", "browserify": "^14.1.0", "crypto-js": "^3.1.9-1", "gulp": "^3.9.1", "gulp-babel": "^6.1.2", "gulp-buffer": "0.0.2", "gulp-cli": "^1.2.2", "gulp-concat": "^2.6.1", "gulp-concat-css": "^2.3.0", "gulp-csso": "^2.0.0", "gulp-flatten": "^0.3.1", "gulp-htmlmin": "^3.0.0", "gulp-inject": "^4.2.0", "gulp-rename": "^1.2.2", "gulp-replace": "^0.5.4", "gulp-sourcemaps": "^2.4.1", "gulp-tap": "^0.1.3", "gulp-uglify": "^2.0.1", "gulp-util": "^3.0.8", "json-server": "^0.9.5", "vinyl-source-stream": "^1.1.0" } }
6089e699151a5b51b70deedf6a893665258e6b8b
README.md
README.md
[travis-img]: https://travis-ci.org/bodoni/svg.svg?branch=master [travis-url]: https://travis-ci.org/bodoni/svg [docs]: https://bodoni.github.io/svg
1. Fork the project. 2. Implement your idea. 3. Create a pull request. [travis-img]: https://travis-ci.org/bodoni/svg.svg?branch=master [travis-url]: https://travis-ci.org/bodoni/svg [docs]: https://bodoni.github.io/svg
Add an invitation to contribute
Add an invitation to contribute
Markdown
mit
stainless-steel/svg
markdown
## Code Before: [travis-img]: https://travis-ci.org/bodoni/svg.svg?branch=master [travis-url]: https://travis-ci.org/bodoni/svg [docs]: https://bodoni.github.io/svg ## Instruction: Add an invitation to contribute ## Code After: 1. Fork the project. 2. Implement your idea. 3. Create a pull request. [travis-img]: https://travis-ci.org/bodoni/svg.svg?branch=master [travis-url]: https://travis-ci.org/bodoni/svg [docs]: https://bodoni.github.io/svg
803450c4cdd7f47261ca57ba56b453008aa2787d
spec/spec_helper.rb
spec/spec_helper.rb
require 'simplecov' require 'support/logistic_reverse_helper' SimpleCov.start do add_filter "/spec/" end require 'correios_sigep' RSpec.configure do |config| config.expect_with :rspec do |expectations| expectations.include_chain_clauses_in_custom_matcher_descriptions = true end config.mock_with :rspec do |mocks| mocks.verify_partial_doubles = true end config.include LogisticReverseHelper config.before do CorreiosSigep.configure do |config| config.user = 'user' config.password = 'password' config.administrative_code = '12345' config.card = 'card' config.contract = '67890' config.service_code = 'service_code' end end end
ENV['GEM_ENV'] = 'test' require 'simplecov' require 'support/logistic_reverse_helper' SimpleCov.start do add_filter "/spec/" end require 'correios_sigep' RSpec.configure do |config| config.expect_with :rspec do |expectations| expectations.include_chain_clauses_in_custom_matcher_descriptions = true end config.mock_with :rspec do |mocks| mocks.verify_partial_doubles = true end config.include LogisticReverseHelper config.before do CorreiosSigep.configure do |config| config.user = '60618043' config.password = '8o8otn' config.administrative_code = '08082650' config.card = '0057018901' config.contract = '9912208555' config.service_code = '41076' end end end
Add an Environment to identify when is a test and the correct homolog correios config
Add an Environment to identify when is a test and the correct homolog correios config
Ruby
mit
cfcosta/correios_sigep,duduribeiro/correios_sigep,rinaldifonseca/correios_sigep,downgba/correios_sigep
ruby
## Code Before: require 'simplecov' require 'support/logistic_reverse_helper' SimpleCov.start do add_filter "/spec/" end require 'correios_sigep' RSpec.configure do |config| config.expect_with :rspec do |expectations| expectations.include_chain_clauses_in_custom_matcher_descriptions = true end config.mock_with :rspec do |mocks| mocks.verify_partial_doubles = true end config.include LogisticReverseHelper config.before do CorreiosSigep.configure do |config| config.user = 'user' config.password = 'password' config.administrative_code = '12345' config.card = 'card' config.contract = '67890' config.service_code = 'service_code' end end end ## Instruction: Add an Environment to identify when is a test and the correct homolog correios config ## Code After: ENV['GEM_ENV'] = 'test' require 'simplecov' require 'support/logistic_reverse_helper' SimpleCov.start do add_filter "/spec/" end require 'correios_sigep' RSpec.configure do |config| config.expect_with :rspec do |expectations| expectations.include_chain_clauses_in_custom_matcher_descriptions = true end config.mock_with :rspec do |mocks| mocks.verify_partial_doubles = true end config.include LogisticReverseHelper config.before do CorreiosSigep.configure do |config| config.user = '60618043' config.password = '8o8otn' config.administrative_code = '08082650' config.card = '0057018901' config.contract = '9912208555' config.service_code = '41076' end end end