commit stringlengths 40 40 | old_file stringlengths 4 237 | new_file stringlengths 4 237 | old_contents stringlengths 1 4.24k | new_contents stringlengths 5 4.84k | subject stringlengths 15 778 | message stringlengths 16 6.86k | lang stringlengths 1 30 | license stringclasses 13 values | repos stringlengths 5 116k | config stringlengths 1 30 | content stringlengths 105 8.72k |
|---|---|---|---|---|---|---|---|---|---|---|---|
0a7961a157f1c0f4f16b887debb1f3b51c3aede2 | tests/tests_cats.cljx | tests/tests_cats.cljx | (ns tests-cats
#+cljs
(:require-macros [cemerick.cljs.test
:refer (is deftest with-test run-tests testing test-var)])
#+cljs
(:require [cemerick.cljs.test :as ts]
[cats.core :as m]
[cats.types :as t])
#+clj
(:require [clojure.test :refer :all]
[cats.core :as m]
[cats.types :as t]))
(deftest test-maybe
(testing "Test predicates"
(let [m1 (t/just 1)]
(is (t/maybe? m1))
(is (t/just? m1))))
#+clj
(testing "Test fmap"
(let [m1 (t/just 1)
m2 (t/nothing)]
(is (= (m/fmap inc m1) (t/just 2)))
(is (= (m/fmap inc m2) (t/nothing))))))
| (ns tests-cats
#+cljs
(:require-macros [cemerick.cljs.test
:refer (is deftest with-test run-tests testing test-var)])
#+cljs
(:require [cemerick.cljs.test :as ts]
[cats.core :as m]
[cats.types :as t])
#+clj
(:require [clojure.test :refer :all]
[cats.core :as m]
[cats.types :as t]))
(deftest test-maybe
(testing "Test predicates"
(let [m1 (t/just 1)]
(is (t/maybe? m1))
(is (t/just? m1))))
#+clj
(testing "Test fmap"
(let [m1 (t/just 1)
m2 (t/nothing)]
(is (= (m/fmap inc m1) (t/just 2)))
(is (= (m/fmap inc m2) (t/nothing)))))
#+clj
(testing "sequence-m"
(is (= (m/sequence-m [(t/just 2) (t/just 3)])
(t/just [2 3])))
(is (= (m/sequence-m [(t/just 2) (t/nothing)])
(t/nothing)))))
| Test sequence-m for the Maybe monad | Test sequence-m for the Maybe monad
| Clojure | bsd-2-clause | yurrriq/cats,OlegTheCat/cats,funcool/cats,tcsavage/cats,mccraigmccraig/cats,alesguzik/cats | clojure | ## Code Before:
(ns tests-cats
#+cljs
(:require-macros [cemerick.cljs.test
:refer (is deftest with-test run-tests testing test-var)])
#+cljs
(:require [cemerick.cljs.test :as ts]
[cats.core :as m]
[cats.types :as t])
#+clj
(:require [clojure.test :refer :all]
[cats.core :as m]
[cats.types :as t]))
(deftest test-maybe
(testing "Test predicates"
(let [m1 (t/just 1)]
(is (t/maybe? m1))
(is (t/just? m1))))
#+clj
(testing "Test fmap"
(let [m1 (t/just 1)
m2 (t/nothing)]
(is (= (m/fmap inc m1) (t/just 2)))
(is (= (m/fmap inc m2) (t/nothing))))))
## Instruction:
Test sequence-m for the Maybe monad
## Code After:
(ns tests-cats
#+cljs
(:require-macros [cemerick.cljs.test
:refer (is deftest with-test run-tests testing test-var)])
#+cljs
(:require [cemerick.cljs.test :as ts]
[cats.core :as m]
[cats.types :as t])
#+clj
(:require [clojure.test :refer :all]
[cats.core :as m]
[cats.types :as t]))
(deftest test-maybe
(testing "Test predicates"
(let [m1 (t/just 1)]
(is (t/maybe? m1))
(is (t/just? m1))))
#+clj
(testing "Test fmap"
(let [m1 (t/just 1)
m2 (t/nothing)]
(is (= (m/fmap inc m1) (t/just 2)))
(is (= (m/fmap inc m2) (t/nothing)))))
#+clj
(testing "sequence-m"
(is (= (m/sequence-m [(t/just 2) (t/just 3)])
(t/just [2 3])))
(is (= (m/sequence-m [(t/just 2) (t/nothing)])
(t/nothing)))))
|
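A note on the example above: the new `sequence-m` assertions encode the usual behavior of sequencing Maybe values — a list of `just` values collapses into a `just` of a list, and a single `nothing` collapses the whole result. The same behavior can be sketched in Python (illustrative names only, not the cats API):

```python
class Maybe:
    """Minimal Maybe: Just wraps a value, NOTHING is the empty case."""
    def __init__(self, value=None, present=False):
        self.value = value
        self.present = present

    def __eq__(self, other):
        return (self.present == other.present
                and (not self.present or self.value == other.value))

    def __repr__(self):
        return "Just(%r)" % self.value if self.present else "Nothing"

def just(value):
    return Maybe(value, present=True)

NOTHING = Maybe()

def sequence_m(maybes):
    """Turn [Just 2, Just 3] into Just [2, 3]; any Nothing short-circuits."""
    values = []
    for m in maybes:
        if not m.present:
            return NOTHING       # mirrors the (t/nothing) assertion above
        values.append(m.value)
    return just(values)

print(sequence_m([just(2), just(3)]))   # Just([2, 3])
print(sequence_m([just(2), NOTHING]))   # Nothing
```

This matches the two new test cases: `[(t/just 2) (t/just 3)]` sequences to `(t/just [2 3])`, while any `(t/nothing)` in the input yields `(t/nothing)`.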
6551b81b2dc0963a9244de8d75a3bff2feba2ae8 | CHANGELOG.md | CHANGELOG.md | * Replace ShipmentMailer with CartonMailer
IMPORTANT: Application and extension code targeting ShipmentMailer needs to
be updated to target CartonMailer instead.
Issue https://github.com/bonobos/spree/pull/299
* Add Carton concept to Spree
Cartons represent containers of inventory units that have been shipped. See
carton.rb for details.
* Remove Promotion::Actions::CreateLineItems
They were broken in a couple ways.
Issue https://github.com/bonobos/spree/pull/259
*Phillip Birtcher* *Jordan Brough*
* Remove Api::CheckoutsController
Issue https://github.com/bonobos/spree/pull/229
*Jordan Brough*
* Remove the Spree::Alert system
Issue https://github.com/bonobos/spree/pull/222
*Jordan Brough*
## Spree 2.2.2 (May 15, 2014) ##
|
* Replace ShipmentMailer with CartonMailer
IMPORTANT: Application and extension code targeting ShipmentMailer needs to
be updated to target CartonMailer instead.
Issue https://github.com/bonobos/spree/pull/299
* Add Carton concept to Spree
Cartons represent containers of inventory units that have been shipped. See
carton.rb for details.
* Remove Promotion::Actions::CreateLineItems
They were broken in a couple ways.
Issue https://github.com/bonobos/spree/pull/259
*Phillip Birtcher* *Jordan Brough*
* Remove Api::CheckoutsController
Issue https://github.com/bonobos/spree/pull/229
*Jordan Brough*
* Remove the Spree::Alert system
Issue https://github.com/bonobos/spree/pull/222
*Jordan Brough*
* Remove Spree::Money preferences
Removes Spree::Config's `symbol_position`, `no_cents`, `decimal_mark`, and
`thousands_separator`. This allows us to use the better defaults provided
by RubyMoney. For the same functionality of the existing preferences,
`Spree::Money.default_formatting_rules` can be used.
https://github.com/solidusio/solidus/pull/47
*John Hawthorn*
| Add changelog entry for Money preference removal | Add changelog entry for Money preference removal
| Markdown | bsd-3-clause | pervino/solidus,ckk-scratch/solidus,lsirivong/solidus,forkata/solidus,devilcoders/solidus,athal7/solidus,devilcoders/solidus,Arpsara/solidus,jsurdilla/solidus,ckk-scratch/solidus,ckk-scratch/solidus,grzlus/solidus,bonobos/solidus,xuewenfei/solidus,pervino/solidus,jsurdilla/solidus,Senjai/solidus,grzlus/solidus,jordan-brough/solidus,scottcrawford03/solidus,scottcrawford03/solidus,bonobos/solidus,lsirivong/solidus,athal7/solidus,pervino/solidus,grzlus/solidus,richardnuno/solidus,lsirivong/solidus,richardnuno/solidus,Arpsara/solidus,scottcrawford03/solidus,xuewenfei/solidus,xuewenfei/solidus,bonobos/solidus,richardnuno/solidus,Arpsara/solidus,ckk-scratch/solidus,richardnuno/solidus,grzlus/solidus,forkata/solidus,jsurdilla/solidus,Senjai/solidus,xuewenfei/solidus,bonobos/solidus,forkata/solidus,jordan-brough/solidus,jsurdilla/solidus,devilcoders/solidus,Arpsara/solidus,jordan-brough/solidus,scottcrawford03/solidus,forkata/solidus,devilcoders/solidus,athal7/solidus,Senjai/solidus,Senjai/solidus,athal7/solidus,pervino/solidus,jordan-brough/solidus,lsirivong/solidus | markdown | ## Code Before:
* Replace ShipmentMailer with CartonMailer
IMPORTANT: Application and extension code targeting ShipmentMailer needs to
be updated to target CartonMailer instead.
Issue https://github.com/bonobos/spree/pull/299
* Add Carton concept to Spree
Cartons represent containers of inventory units that have been shipped. See
carton.rb for details.
* Remove Promotion::Actions::CreateLineItems
They were broken in a couple ways.
Issue https://github.com/bonobos/spree/pull/259
*Phillip Birtcher* *Jordan Brough*
* Remove Api::CheckoutsController
Issue https://github.com/bonobos/spree/pull/229
*Jordan Brough*
* Remove the Spree::Alert system
Issue https://github.com/bonobos/spree/pull/222
*Jordan Brough*
## Spree 2.2.2 (May 15, 2014) ##
## Instruction:
Add changelog entry for Money preference removal
## Code After:
* Replace ShipmentMailer with CartonMailer
IMPORTANT: Application and extension code targeting ShipmentMailer needs to
be updated to target CartonMailer instead.
Issue https://github.com/bonobos/spree/pull/299
* Add Carton concept to Spree
Cartons represent containers of inventory units that have been shipped. See
carton.rb for details.
* Remove Promotion::Actions::CreateLineItems
They were broken in a couple ways.
Issue https://github.com/bonobos/spree/pull/259
*Phillip Birtcher* *Jordan Brough*
* Remove Api::CheckoutsController
Issue https://github.com/bonobos/spree/pull/229
*Jordan Brough*
* Remove the Spree::Alert system
Issue https://github.com/bonobos/spree/pull/222
*Jordan Brough*
* Remove Spree::Money preferences
Removes Spree::Config's `symbol_position`, `no_cents`, `decimal_mark`, and
`thousands_separator`. This allows us to use the better defaults provided
by RubyMoney. For the same functionality of the existing preferences,
`Spree::Money.default_formatting_rules` can be used.
https://github.com/solidusio/solidus/pull/47
*John Hawthorn*
|
9c525a287afa617d59665eddeb19787255fa56dc | README.md | README.md |
This is a Ruby on Rails based web application. It's a Create, Read, Update, Destroy application. Users can create an account and login. User can also comment on other users profile. | It's an application built for people to rent out the extra space they have in their space/apartment.
#What can I do with this web application?
This is a Ruby on Rails based web application. It's a Create, Read, Update, Destroy application. Users can create an account and login. User can also comment on other users profile. They can also create their own posts along with edit and deleting their own posts.
#Technology used
Ruby on Rails, HTML/CSS, Bootstrap
This was a solo project. | Update readme with more information | Update readme with more information
| Markdown | mit | jhack32/rent-a-storage,jhack32/rent-a-storage,jhack32/rent-a-storage | markdown | ## Code Before:
This is a Ruby on Rails based web application. It's a Create, Read, Update, Destroy application. Users can create an account and login. User can also comment on other users profile.
## Instruction:
Update readme with more information
## Code After:
It's an application built for people to rent out the extra space they have in their space/apartment.
#What can I do with this web application?
This is a Ruby on Rails based web application. It's a Create, Read, Update, Destroy application. Users can create an account and login. User can also comment on other users profile. They can also create their own posts along with edit and deleting their own posts.
#Technology used
Ruby on Rails, HTML/CSS, Bootstrap
This was a solo project. |
c30821c7ef6cd8268ac7e66fa862e64590218bf5 | Configuration/TypoScript/Includes/Page.ts | Configuration/TypoScript/Includes/Page.ts | page.meta {
# tell mobile devices to initially scale the page to fit the devices screen width
# viewport = width=device-width, initial-scale=1.0
} | config {
# set the 'no-js' html-class for Modernizr
htmlTag_setParams = class="no-js"
}
page.meta {
# tell mobile devices to initially scale the page to fit the devices screen width
# viewport = width=device-width, initial-scale=1.0
} | Set the 'no-js' class on the html-element by default | [FEATURE] Set the 'no-js' class on the html-element by default
| TypeScript | mit | t3b/t3b_template,t3b/t3b_template,t3b/t3b_template,t3b/t3b_template,t3b/t3b_template | typescript | ## Code Before:
page.meta {
# tell mobile devices to initially scale the page to fit the devices screen width
# viewport = width=device-width, initial-scale=1.0
}
## Instruction:
[FEATURE] Set the 'no-js' class on the html-element by default
## Code After:
config {
# set the 'no-js' html-class for Modernizr
htmlTag_setParams = class="no-js"
}
page.meta {
# tell mobile devices to initially scale the page to fit the devices screen width
# viewport = width=device-width, initial-scale=1.0
} |
a174fbd637bf9ccc7b8a97a251c016495f92f6a9 | eliot/__init__.py | eliot/__init__.py |
from ._version import __version__
# Expose the public API:
from ._message import Message
from ._action import startAction, startTask, Action
from ._output import ILogger, Logger, MemoryLogger
from ._validation import Field, MessageType, ActionType
from ._traceback import writeTraceback, writeFailure
addDestination = Logger._destinations.add
removeDestination = Logger._destinations.remove
__all__ = ["Message", "writeTraceback", "writeFailure",
"startAction", "startTask", "Action",
"Field", "MessageType", "ActionType",
"ILogger", "Logger", "MemoryLogger", "addDestination",
"removeDestination",
"__version__",
]
|
from ._version import __version__
# Expose the public API:
from ._message import Message
from ._action import startAction, startTask, Action
from ._output import ILogger, Logger, MemoryLogger
from ._validation import Field, fields, MessageType, ActionType
from ._traceback import writeTraceback, writeFailure
addDestination = Logger._destinations.add
removeDestination = Logger._destinations.remove
__all__ = ["Message", "writeTraceback", "writeFailure",
"startAction", "startTask", "Action",
"Field", "fields", "MessageType", "ActionType",
"ILogger", "Logger", "MemoryLogger", "addDestination",
"removeDestination",
"__version__",
]
| Add fields to the public API. | Add fields to the public API.
| Python | apache-2.0 | ClusterHQ/eliot,ScatterHQ/eliot,iffy/eliot,ScatterHQ/eliot,ScatterHQ/eliot | python | ## Code Before:
from ._version import __version__
# Expose the public API:
from ._message import Message
from ._action import startAction, startTask, Action
from ._output import ILogger, Logger, MemoryLogger
from ._validation import Field, MessageType, ActionType
from ._traceback import writeTraceback, writeFailure
addDestination = Logger._destinations.add
removeDestination = Logger._destinations.remove
__all__ = ["Message", "writeTraceback", "writeFailure",
"startAction", "startTask", "Action",
"Field", "MessageType", "ActionType",
"ILogger", "Logger", "MemoryLogger", "addDestination",
"removeDestination",
"__version__",
]
## Instruction:
Add fields to the public API.
## Code After:
from ._version import __version__
# Expose the public API:
from ._message import Message
from ._action import startAction, startTask, Action
from ._output import ILogger, Logger, MemoryLogger
from ._validation import Field, fields, MessageType, ActionType
from ._traceback import writeTraceback, writeFailure
addDestination = Logger._destinations.add
removeDestination = Logger._destinations.remove
__all__ = ["Message", "writeTraceback", "writeFailure",
"startAction", "startTask", "Action",
"Field", "fields", "MessageType", "ActionType",
"ILogger", "Logger", "MemoryLogger", "addDestination",
"removeDestination",
"__version__",
]
|
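The diff above is the classic "export a new name" dance: import `fields` and add it to `__all__`. A cheap way to catch those two drifting apart is a check that every `__all__` entry actually exists on the module — a generic sketch, not part of Eliot itself:

```python
import types

def check_all_exports(module):
    """Return names listed in __all__ that the module does not define."""
    return [name for name in getattr(module, "__all__", [])
            if not hasattr(module, name)]

# Build a stand-in module the way eliot/__init__.py builds its API.
fake = types.ModuleType("fake_eliot")
fake.Field = object()
fake.fields = lambda **kw: kw
fake.__all__ = ["Field", "fields", "MessageType"]  # MessageType forgotten

print(check_all_exports(fake))  # ['MessageType']
```

Running something like this in a test suite turns a silently broken `from eliot import fields` into an explicit failure.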
e200b324ad063d7576d306b1f82841ddd8d637f2 | config/application.rb | config/application.rb | require_relative './boot'
require 'action_controller/railtie'
require 'action_view/railtie'
require 'sprockets/railtie'
Bundler.require(*Rails.groups)
module PensionGuidance
class Application < Rails::Application
config.action_dispatch.rescue_responses.merge! 'GuideRepository::GuideNotFound' => :not_found
config.autoload_paths << Rails.root.join('lib')
config.cache_max_age = ENV['CACHE_MAX_AGE'] || 10.seconds
config.middleware.use Rack::BounceFavicon
config.mount_javascript_test_routes = false
end
end
| require_relative './boot'
require 'action_controller/railtie'
require 'action_view/railtie'
require 'sprockets/railtie'
Bundler.require(*Rails.groups)
module PensionGuidance
class Application < Rails::Application
config.action_dispatch.rescue_responses.merge! 'GuideRepository::GuideNotFound' => :not_found
config.autoload_paths << Rails.root.join('lib')
config.cache_max_age = ENV['CACHE_MAX_AGE'] || 10.seconds
config.middleware.use Rack::BounceFavicon
require 'bounce_browserconfig'
config.middleware.use BounceBrowserconfig
config.mount_javascript_test_routes = false
end
end
| Use middleware to 404 requests to `/browserconfig.xml` | Use middleware to 404 requests to `/browserconfig.xml`
| Ruby | mit | guidance-guarantee-programme/pension_guidance,guidance-guarantee-programme/pension_guidance,guidance-guarantee-programme/pension_guidance | ruby | ## Code Before:
require_relative './boot'
require 'action_controller/railtie'
require 'action_view/railtie'
require 'sprockets/railtie'
Bundler.require(*Rails.groups)
module PensionGuidance
class Application < Rails::Application
config.action_dispatch.rescue_responses.merge! 'GuideRepository::GuideNotFound' => :not_found
config.autoload_paths << Rails.root.join('lib')
config.cache_max_age = ENV['CACHE_MAX_AGE'] || 10.seconds
config.middleware.use Rack::BounceFavicon
config.mount_javascript_test_routes = false
end
end
## Instruction:
Use middleware to 404 requests to `/browserconfig.xml`
## Code After:
require_relative './boot'
require 'action_controller/railtie'
require 'action_view/railtie'
require 'sprockets/railtie'
Bundler.require(*Rails.groups)
module PensionGuidance
class Application < Rails::Application
config.action_dispatch.rescue_responses.merge! 'GuideRepository::GuideNotFound' => :not_found
config.autoload_paths << Rails.root.join('lib')
config.cache_max_age = ENV['CACHE_MAX_AGE'] || 10.seconds
config.middleware.use Rack::BounceFavicon
require 'bounce_browserconfig'
config.middleware.use BounceBrowserconfig
config.mount_javascript_test_routes = false
end
end
|
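The `BounceBrowserconfig` middleware added above presumably mirrors `Rack::BounceFavicon`: intercept one well-known path and answer 404 before the request ever reaches the Rails app. The same idea as a generic WSGI sketch (the class and path handling here are illustrative, not the actual gem's code):

```python
class BouncePath:
    """WSGI middleware that answers 404 for one well-known path."""
    def __init__(self, app, path="/browserconfig.xml"):
        self.app = app
        self.path = path

    def __call__(self, environ, start_response):
        if environ.get("PATH_INFO") == self.path:
            # Short-circuit: the inner app never sees this request.
            start_response("404 Not Found", [("Content-Type", "text/plain")])
            return [b""]
        return self.app(environ, start_response)

def inner_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello"]

app = BouncePath(inner_app)
```

Handling the path in middleware keeps noise requests from IE/Edge tile probes out of the routing layer and out of the error logs.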
2a2eb23c78c959ab5f4c3b0a5927eb6ec693a018 | services/web/test/unit_frontend/coffee/test-main.coffee | services/web/test/unit_frontend/coffee/test-main.coffee | tests = []
for file of window.__karma__.files
if window.__karma__.files.hasOwnProperty(file)
if /Tests\.js$/.test(file)
tests.push(file)
requirejs.config
baseUrl: '/base/public/js'
paths:
"moment": "libs/moment-2.9.0"
deps: tests
callback: window.__karma__.start
| tests = []
for file of window.__karma__.files
if window.__karma__.files.hasOwnProperty(file)
if /test\/unit_frontend\/js.+Tests.js$/.test(file)
tests.push(file)
requirejs.config
baseUrl: '/base/public/js'
paths:
"moment": "libs/moment-2.9.0"
deps: tests
callback: window.__karma__.start
| Fix bug where tests from new ES code being included in requirejs wrapped code | Fix bug where tests from new ES code being included in requirejs wrapped code
| CoffeeScript | agpl-3.0 | sharelatex/sharelatex | coffeescript | ## Code Before:
tests = []
for file of window.__karma__.files
if window.__karma__.files.hasOwnProperty(file)
if /Tests\.js$/.test(file)
tests.push(file)
requirejs.config
baseUrl: '/base/public/js'
paths:
"moment": "libs/moment-2.9.0"
deps: tests
callback: window.__karma__.start
## Instruction:
Fix bug where tests from new ES code being included in requirejs wrapped code
## Code After:
tests = []
for file of window.__karma__.files
if window.__karma__.files.hasOwnProperty(file)
if /test\/unit_frontend\/js.+Tests.js$/.test(file)
tests.push(file)
requirejs.config
baseUrl: '/base/public/js'
paths:
"moment": "libs/moment-2.9.0"
deps: tests
callback: window.__karma__.start
|
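The fix above narrows the Karma file filter from "any file ending in `Tests.js`" to "`Tests.js` files under `test/unit_frontend/js`", so compiled ES modules that happen to end in `Tests.js` are no longer pulled into the RequireJS test run. Translating the two JS regexes to Python makes the before/after behavior easy to check (the file paths are illustrative):

```python
import re

# The two filters, before and after the fix (dots kept as in the source).
old = re.compile(r"Tests\.js$")
new = re.compile(r"test/unit_frontend/js.+Tests.js$")

files = [
    "/base/test/unit_frontend/js/ide/FooTests.js",  # a real unit test
    "/base/public/es/build/BarTests.js",            # illustrative ES build artifact
]

print([f for f in files if old.search(f)])  # both match -> the bug
print([f for f in files if new.search(f)])  # only the unit_frontend test
```

Note the new pattern leaves its dots unescaped (`Tests.js$`); since `.` matches any character, that is effectively harmless here, just slightly looser than the original `Tests\.js$`.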
786d84819a5b4fdd105692e84ba2e75379d29bb5 | extras/languages/_export/ui.csv | extras/languages/_export/ui.csv | "ID","SOURCE","TRANSLATION","NOTES"
"1","The Map of Sigil, City of Doors","","Page title."
"2","Reference index of 260 Sigil venues culled from official Planescape material.","","Page description."
"3","Sigil: The City of Doors","","Map header."
"4","Nether Whisper","","Upstream website name."
| "ID","SOURCE","TRANSLATION","NOTES"
"1","The Map of Sigil, City of Doors","","Page title."
"2","Reference index of 260 Sigil venues culled from official Planescape material.","","Page description."
"3","Sigil: The City of Doors","","Map header."
"4","Nether Whisper","","Upstream website name."
"5","Search...","",""
| Add “Search…” string to UI translation file | :gift: Add “Search…” string to UI translation file
| CSV | mit | amargon/city-of-doors,amargon/city-of-doors,amargon/city-of-doors | csv | ## Code Before:
"ID","SOURCE","TRANSLATION","NOTES"
"1","The Map of Sigil, City of Doors","","Page title."
"2","Reference index of 260 Sigil venues culled from official Planescape material.","","Page description."
"3","Sigil: The City of Doors","","Map header."
"4","Nether Whisper","","Upstream website name."
## Instruction:
:gift: Add “Search…” string to UI translation file
## Code After:
"ID","SOURCE","TRANSLATION","NOTES"
"1","The Map of Sigil, City of Doors","","Page title."
"2","Reference index of 260 Sigil venues culled from official Planescape material.","","Page description."
"3","Sigil: The City of Doors","","Map header."
"4","Nether Whisper","","Upstream website name."
"5","Search...","",""
|
ac67275a527da0370c68fb5f66f1b9fa0b999fce | requirements/base.txt | requirements/base.txt | Django==1.7.3
dj-database-url==0.3.0
django-cache-url==0.8.0
django-configurations==0.8
gunicorn==19.1.1
psycopg2==2.5.4
| Django==1.7.3
dj-database-url==0.3.0
django-cache-url==0.8.0
django-configurations==0.8
gunicorn==19.1.1
psycopg2==2.5.4
requests==2.5.1
| Add requests dependency to requirements | Add requests dependency to requirements
| Text | bsd-3-clause | CorbanU/corban-shopify,CorbanU/corban-shopify | text | ## Code Before:
Django==1.7.3
dj-database-url==0.3.0
django-cache-url==0.8.0
django-configurations==0.8
gunicorn==19.1.1
psycopg2==2.5.4
## Instruction:
Add requests dependency to requirements
## Code After:
Django==1.7.3
dj-database-url==0.3.0
django-cache-url==0.8.0
django-configurations==0.8
gunicorn==19.1.1
psycopg2==2.5.4
requests==2.5.1
|
450d5932d4131b44a5e78566db70720192f533ee | spec/team_spec.rb | spec/team_spec.rb | require 'spec_helper'
require_relative '../src/team.rb'
describe Team do
describe 'Team#name' do
it "Should return team's name." do
t = Team.new 'Devils'
expect(t.name).to eql 'Devils'
end
end
end
| require 'spec_helper'
require_relative '../src/team.rb'
describe Team do
describe 'Team#name' do
it "Should return team's name." do
t = Team.new 'Devils'
expect(t.name).to eql 'Devils'
end
it "Should allow team's name to be set." do
t = Team.new 'Devils'
t.name = 'Angels'
expect(t.name).to eql 'Angels'
end
end
end
| Add should allow team's name to be set test | Add should allow team's name to be set test
| Ruby | mit | francisbrito/INS353-2014-2-c2 | ruby | ## Code Before:
require 'spec_helper'
require_relative '../src/team.rb'
describe Team do
describe 'Team#name' do
it "Should return team's name." do
t = Team.new 'Devils'
expect(t.name).to eql 'Devils'
end
end
end
## Instruction:
Add should allow team's name to be set test
## Code After:
require 'spec_helper'
require_relative '../src/team.rb'
describe Team do
describe 'Team#name' do
it "Should return team's name." do
t = Team.new 'Devils'
expect(t.name).to eql 'Devils'
end
it "Should allow team's name to be set." do
t = Team.new 'Devils'
t.name = 'Angels'
expect(t.name).to eql 'Angels'
end
end
end
|
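The Ruby source for `Team` is not shown in this record, but the new example only passes if `name` is writable as well as readable (e.g. `attr_accessor :name`). The same pair of assertions against a minimal Python analogue:

```python
class Team:
    """Python analogue of the class the RSpec examples exercise."""
    def __init__(self, name):
        self.name = name  # plain attribute: both readable and writable

t = Team("Devils")
assert t.name == "Devils"   # "Should return team's name."
t.name = "Angels"
assert t.name == "Angels"   # "Should allow team's name to be set."
print(t.name)  # Angels
```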
bf1117effbc1fc4769e10b1fa00f6116202d0c7f | build/configure.sh | build/configure.sh | if [ -z "$MYSQL_DATABASE" ]; then
echo "MYSQL_DATABASE environment variable not specified"
exit 1
fi
if [ -z "$MYSQL_USER" ]; then
echo "MYSQL_USER environment variable not specified"
exit 1
fi
if [ -z "$MYSQL_PASSWORD" ]; then
echo "MYSQL_PASSWORD environment variable not specified"
exit 1
fi
MYSQL_COMMAND="mysql -h mariadb -u $MYSQL_USER --password=$MYSQL_PASSWORD"
until eval $MYSQL_COMMAND -e ";" ; do
echo "MariaDB is unavailable - sleeping"
sleep 1
done
NUM_TABLES=$(eval "$MYSQL_COMMAND -B --disable-column-names \"SELECT count(*) FROM information_schema.tables WHERE table_schema='$MYSQL_DATABASE';\"")
# Check if database is empty
if [ "$NUM_TABLES" -eq 0 ]; then
eval $MYSQL_COMMAND "$MYSQL_DATABASE" < ./database/schema.sql
eval $MYSQL_COMMAND "$MYSQL_DATABASE" < ./database/test.sql
fi
| if [ -z "$MYSQL_DATABASE" ]; then
echo "MYSQL_DATABASE environment variable not specified"
exit 1
fi
if [ -z "$MYSQL_USER" ]; then
echo "MYSQL_USER environment variable not specified"
exit 1
fi
if [ -z "$MYSQL_PASSWORD" ]; then
echo "MYSQL_PASSWORD environment variable not specified"
exit 1
fi
MYSQL_COMMAND="mysql -h mariadb -u $MYSQL_USER --password=$MYSQL_PASSWORD"
until eval $MYSQL_COMMAND -e "use opendc;" ; do
echo "MariaDB is unavailable - sleeping"
sleep 1
done
NUM_TABLES=$(eval "$MYSQL_COMMAND -B --disable-column-names -e \"SELECT count(*) FROM information_schema.tables WHERE table_schema='$MYSQL_DATABASE';\"")
# Check if database is empty
if [ "$NUM_TABLES" -eq 0 ]; then
eval $MYSQL_COMMAND "$MYSQL_DATABASE" < ./database/schema.sql
eval $MYSQL_COMMAND "$MYSQL_DATABASE" < ./database/test.sql
fi
# Writing database config values to keys.json
cat keys.json | python -c "import os, sys, json; ks = json.load(sys.stdin); \
ks['MYSQL_HOST'] = 'mariadb'; \
ks['MYSQL_PORT'] = '3306'; \
ks['MYSQL_DATABASE'] = os.environ['MYSQL_DATABASE']; \
ks['MYSQL_USER'] = os.environ['MYSQL_USER']; \
ks['MYSQL_PASSWORD'] = os.environ['MYSQL_PASSWORD']; \
print json.dumps(ks, indent=4)" > new_keys.json
mv new_keys.json keys.json
| Copy mysql environment variables into keys.json | Copy mysql environment variables into keys.json
| Shell | mit | tudelft-atlarge/opendc,tudelft-atlarge/opendc | shell | ## Code Before:
if [ -z "$MYSQL_DATABASE" ]; then
echo "MYSQL_DATABASE environment variable not specified"
exit 1
fi
if [ -z "$MYSQL_USER" ]; then
echo "MYSQL_USER environment variable not specified"
exit 1
fi
if [ -z "$MYSQL_PASSWORD" ]; then
echo "MYSQL_PASSWORD environment variable not specified"
exit 1
fi
MYSQL_COMMAND="mysql -h mariadb -u $MYSQL_USER --password=$MYSQL_PASSWORD"
until eval $MYSQL_COMMAND -e ";" ; do
echo "MariaDB is unavailable - sleeping"
sleep 1
done
NUM_TABLES=$(eval "$MYSQL_COMMAND -B --disable-column-names \"SELECT count(*) FROM information_schema.tables WHERE table_schema='$MYSQL_DATABASE';\"")
# Check if database is empty
if [ "$NUM_TABLES" -eq 0 ]; then
eval $MYSQL_COMMAND "$MYSQL_DATABASE" < ./database/schema.sql
eval $MYSQL_COMMAND "$MYSQL_DATABASE" < ./database/test.sql
fi
## Instruction:
Copy mysql environment variables into keys.json
## Code After:
if [ -z "$MYSQL_DATABASE" ]; then
echo "MYSQL_DATABASE environment variable not specified"
exit 1
fi
if [ -z "$MYSQL_USER" ]; then
echo "MYSQL_USER environment variable not specified"
exit 1
fi
if [ -z "$MYSQL_PASSWORD" ]; then
echo "MYSQL_PASSWORD environment variable not specified"
exit 1
fi
MYSQL_COMMAND="mysql -h mariadb -u $MYSQL_USER --password=$MYSQL_PASSWORD"
until eval $MYSQL_COMMAND -e "use opendc;" ; do
echo "MariaDB is unavailable - sleeping"
sleep 1
done
NUM_TABLES=$(eval "$MYSQL_COMMAND -B --disable-column-names -e \"SELECT count(*) FROM information_schema.tables WHERE table_schema='$MYSQL_DATABASE';\"")
# Check if database is empty
if [ "$NUM_TABLES" -eq 0 ]; then
eval $MYSQL_COMMAND "$MYSQL_DATABASE" < ./database/schema.sql
eval $MYSQL_COMMAND "$MYSQL_DATABASE" < ./database/test.sql
fi
# Writing database config values to keys.json
cat keys.json | python -c "import os, sys, json; ks = json.load(sys.stdin); \
ks['MYSQL_HOST'] = 'mariadb'; \
ks['MYSQL_PORT'] = '3306'; \
ks['MYSQL_DATABASE'] = os.environ['MYSQL_DATABASE']; \
ks['MYSQL_USER'] = os.environ['MYSQL_USER']; \
ks['MYSQL_PASSWORD'] = os.environ['MYSQL_PASSWORD']; \
print json.dumps(ks, indent=4)" > new_keys.json
mv new_keys.json keys.json
|
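The inline `python -c` added at the end of the new script reads `keys.json` on stdin, overlays the MySQL connection settings from the environment, and prints the merged result. Unrolled into a readable form (same logic; Python 3 `print()` instead of the one-liner's Python 2 statement, with the environment passed in explicitly):

```python
import json

def merge_mysql_keys(keys_json, env):
    """Overlay MySQL connection settings onto an existing keys.json payload."""
    ks = json.loads(keys_json)
    ks["MYSQL_HOST"] = "mariadb"   # service name on the Docker network
    ks["MYSQL_PORT"] = "3306"
    for name in ("MYSQL_DATABASE", "MYSQL_USER", "MYSQL_PASSWORD"):
        ks[name] = env[name]       # the script reads these from os.environ
    return json.dumps(ks, indent=4)

env = {"MYSQL_DATABASE": "opendc", "MYSQL_USER": "opendc", "MYSQL_PASSWORD": "secret"}
print(merge_mysql_keys('{"OTHER_KEY": "kept"}', env))
```

Writing to `new_keys.json` and then `mv`-ing it over `keys.json`, as the shell script does, avoids truncating the very file the pipeline is still reading from.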
b4d3f68753a62faf64fecb9bb2aaa39ded57e174 | app/views/batches/_batches.html.erb | app/views/batches/_batches.html.erb | <% if @batches.length > 0 %>
<table class="table table-striped table-bordered">
<thead>
<tr>
<th>Name</th>
<th>Created</th>
<th>Total Contributions</th>
</tr>
</thead>
<tbody>
<% @batches.each do |batch| %>
<tr>
<td><%= link_to batch.title, batch_path(batch) %></td>
<td><%= batch.created_at_string %></td>
<td><%= number_to_currency(batch.total_contributions) %></td>
</tr>
<% end %>
</tbody>
</table>
<%= will_paginate @batches %>
<% end %>
| <% if @batches.length > 0 %>
<table class="table table-striped table-bordered">
<thead>
<tr>
<th>Name</th>
<th>Created</th>
<th>Total Contributions</th>
</tr>
</thead>
<tbody>
<% @batches.each do |batch| %>
<tr>
<td><%= link_to batch.title, batch_contributions_path(batch) %></td>
<td><%= batch.created_at_string %></td>
<td><%= number_to_currency(batch.total_contributions) %></td>
</tr>
<% end %>
</tbody>
</table>
<%= will_paginate @batches %>
<% end %>
| Update batches/_batches partial to reflect batch contributions change | Update batches/_batches partial to reflect batch contributions change
| HTML+ERB | mit | drueck/giving,drueck/giving | html+erb | ## Code Before:
<% if @batches.length > 0 %>
<table class="table table-striped table-bordered">
<thead>
<tr>
<th>Name</th>
<th>Created</th>
<th>Total Contributions</th>
</tr>
</thead>
<tbody>
<% @batches.each do |batch| %>
<tr>
<td><%= link_to batch.title, batch_path(batch) %></td>
<td><%= batch.created_at_string %></td>
<td><%= number_to_currency(batch.total_contributions) %></td>
</tr>
<% end %>
</tbody>
</table>
<%= will_paginate @batches %>
<% end %>
## Instruction:
Update batches/_batches partial to reflect batch contributions change
## Code After:
<% if @batches.length > 0 %>
<table class="table table-striped table-bordered">
<thead>
<tr>
<th>Name</th>
<th>Created</th>
<th>Total Contributions</th>
</tr>
</thead>
<tbody>
<% @batches.each do |batch| %>
<tr>
<td><%= link_to batch.title, batch_contributions_path(batch) %></td>
<td><%= batch.created_at_string %></td>
<td><%= number_to_currency(batch.total_contributions) %></td>
</tr>
<% end %>
</tbody>
</table>
<%= will_paginate @batches %>
<% end %>
|
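The change above swaps `batch_path(batch)` for `batch_contributions_path(batch)`, so the link now targets the contributions collection nested under a batch rather than the batch itself. Assuming the standard Rails nested-resource convention (`resources :batches do resources :contributions end`), the two helpers differ only in the URL they generate — an illustrative sketch:

```python
def batch_path(batch_id):
    """Old link target: the batch itself."""
    return f"/batches/{batch_id}"

def batch_contributions_path(batch_id):
    """New link target: the contributions collection nested under the batch."""
    return f"/batches/{batch_id}/contributions"

print(batch_path(7))                # /batches/7
print(batch_contributions_path(7))  # /batches/7/contributions
```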
cfda6cfe0f916e43d7ccb973b0df6c4c72363f37 | appveyor.yml | appveyor.yml | build: false
environment:
matrix:
- PYTHON: "C:\\Python27"
PYTHON_VERSION: "2.7.10"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda
- PYTHON: "C:\\Python33"
PYTHON_VERSION: "3.3.5"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda3
- PYTHON: "C:\\Python34"
PYTHON_VERSION: "3.4.3"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda3
- PYTHON: "C:\\Python35"
PYTHON_VERSION: "3.5.0"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda35
init:
- "ECHO %PYTHON% %PYTHON_VERSION% %PYTHON_ARCH% %MINICONDA%"
- systeminfo
install:
- "set PATH=%MINICONDA%;%MINICONDA%\\Scripts;%PATH%"
- conda config --set always_yes yes --set changeps1 no
- conda update -q conda
- conda info -a
- "conda create -q -n test-environment python=%PYTHON_VERSION% cython numpy scipy matplotlib h5py scikit-learn nose"
- activate test-environment
- pip install coverage nibabel
- python setup.py build_ext --inplace
test_script:
- nosetests | build: false
environment:
matrix:
- PYTHON: "C:\\Python27"
PYTHON_VERSION: "2.7.11"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda
- PYTHON: "C:\\Python36"
PYTHON_VERSION: "3.6.0"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda36
init:
- "ECHO %PYTHON% %PYTHON_VERSION% %PYTHON_ARCH% %MINICONDA%"
- systeminfo
install:
- "set PATH=%MINICONDA%;%MINICONDA%\\Scripts;%PATH%"
- conda config --set always_yes yes --set changeps1 no
- conda update -q conda
- conda info -a
- "conda create -q -n test-environment python=%PYTHON_VERSION% cython numpy scipy matplotlib h5py scikit-learn nose"
- activate test-environment
- pip install coverage nibabel
- python setup.py build_ext --inplace
test_script:
- nosetests
| Test only on 2.7.11 and 3.6.0 | Test only on 2.7.11 and 3.6.0
| YAML | bsd-3-clause | FrancoisRheaultUS/dipy,FrancoisRheaultUS/dipy | yaml | ## Code Before:
build: false
environment:
matrix:
- PYTHON: "C:\\Python27"
PYTHON_VERSION: "2.7.10"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda
- PYTHON: "C:\\Python33"
PYTHON_VERSION: "3.3.5"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda3
- PYTHON: "C:\\Python34"
PYTHON_VERSION: "3.4.3"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda3
- PYTHON: "C:\\Python35"
PYTHON_VERSION: "3.5.0"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda35
init:
- "ECHO %PYTHON% %PYTHON_VERSION% %PYTHON_ARCH% %MINICONDA%"
- systeminfo
install:
- "set PATH=%MINICONDA%;%MINICONDA%\\Scripts;%PATH%"
- conda config --set always_yes yes --set changeps1 no
- conda update -q conda
- conda info -a
- "conda create -q -n test-environment python=%PYTHON_VERSION% cython numpy scipy matplotlib h5py scikit-learn nose"
- activate test-environment
- pip install coverage nibabel
- python setup.py build_ext --inplace
test_script:
- nosetests
## Instruction:
Test only on 2.7.11 and 3.6.0
## Code After:
build: false
environment:
matrix:
- PYTHON: "C:\\Python27"
PYTHON_VERSION: "2.7.11"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda
- PYTHON: "C:\\Python36"
PYTHON_VERSION: "3.6.0"
PYTHON_ARCH: "64"
MINICONDA: C:\Miniconda36
init:
- "ECHO %PYTHON% %PYTHON_VERSION% %PYTHON_ARCH% %MINICONDA%"
- systeminfo
install:
- "set PATH=%MINICONDA%;%MINICONDA%\\Scripts;%PATH%"
- conda config --set always_yes yes --set changeps1 no
- conda update -q conda
- conda info -a
- "conda create -q -n test-environment python=%PYTHON_VERSION% cython numpy scipy matplotlib h5py scikit-learn nose"
- activate test-environment
- pip install coverage nibabel
- python setup.py build_ext --inplace
test_script:
- nosetests
|
17aaecf89dd3e9065d8b528a87ec97be511281f9 | idsBuild.sh | idsBuild.sh | echo Informing Slack...
curl -X 'POST' --silent --data-binary '{"text":"A new build for the player service has started."}' $WEBHOOK > /dev/null
mkdir dockercfg ; cd dockercfg
echo Downloading Docker requirements..
wget http://game-on.org:8081/dockerneeds.tar -q
echo Setting up Docker...
tar xzf dockerneeds.tar
cd ..
echo Downloading Java 8...
wget http://game-on.org:8081/jdk-8u65-x64.tar.gz -q
echo Extracting Java 8...
tar xzf jdk-8u65-x64.tar.gz
echo And removing the tarball...
rm jdk-8u65-x64.tar.gz
export JAVA_HOME=$(pwd)/jdk1.8.0_65
echo Building projects using gradle...
./gradlew build
echo Building and Starting Concierge Docker Image...
cd player-wlpcfg
../gradlew buildDockerImage
../gradlew stopCurrentContainer
../gradlew removeCurrentContainer
../gradlew startNewContainer
echo Removing JDK, cause there's no reason that's an artifact...
cd ..
rm -rf jdk1.8.0_65 | echo Informing Slack...
curl -X 'POST' --silent --data-binary '{"text":"A new build for the player service has started."}' $WEBHOOK > /dev/null
mkdir dockercfg ; cd dockercfg
echo Downloading Docker requirements..
wget --user=admin --password=$ADMIN_PASSWORD https://game-on.org/repository/dockerneeds.tar -q
echo Setting up Docker...
tar xzf dockerneeds.tar
cd ..
echo Downloading Java 8...
wget http://game-on.org:8081/jdk-8u65-x64.tar.gz -q
echo Extracting Java 8...
tar xzf jdk-8u65-x64.tar.gz
echo And removing the tarball...
rm jdk-8u65-x64.tar.gz
export JAVA_HOME=$(pwd)/jdk1.8.0_65
echo Building projects using gradle...
./gradlew build
echo Building and Starting Concierge Docker Image...
cd player-wlpcfg
../gradlew buildDockerImage
../gradlew stopCurrentContainer
../gradlew removeCurrentContainer
../gradlew startNewContainer
echo Removing JDK, cause there's no reason that's an artifact...
cd ..
rm -rf jdk1.8.0_65 | Access build requirements through proxy and use HTTP AUTH. | Access build requirements through proxy and use HTTP AUTH. | Shell | apache-2.0 | gameontext/gameon-parser,gameontext/gameon-player,gameontext/gameon-parser,gameontext/gameon-player,gameontext/gameon-mediator,gameontext/gameon-mediator,gameontext/gameon-parser,gameontext/gameon-mediator | shell | ## Code Before:
echo Informing Slack...
curl -X 'POST' --silent --data-binary '{"text":"A new build for the player service has started."}' $WEBHOOK > /dev/null
mkdir dockercfg ; cd dockercfg
echo Downloading Docker requirements..
wget http://game-on.org:8081/dockerneeds.tar -q
echo Setting up Docker...
tar xzf dockerneeds.tar
cd ..
echo Downloading Java 8...
wget http://game-on.org:8081/jdk-8u65-x64.tar.gz -q
echo Extracting Java 8...
tar xzf jdk-8u65-x64.tar.gz
echo And removing the tarball...
rm jdk-8u65-x64.tar.gz
export JAVA_HOME=$(pwd)/jdk1.8.0_65
echo Building projects using gradle...
./gradlew build
echo Building and Starting Concierge Docker Image...
cd player-wlpcfg
../gradlew buildDockerImage
../gradlew stopCurrentContainer
../gradlew removeCurrentContainer
../gradlew startNewContainer
echo Removing JDK, cause there's no reason that's an artifact...
cd ..
rm -rf jdk1.8.0_65
## Instruction:
Access build requirements through proxy and use HTTP AUTH.
## Code After:
echo Informing Slack...
curl -X 'POST' --silent --data-binary '{"text":"A new build for the player service has started."}' $WEBHOOK > /dev/null
mkdir dockercfg ; cd dockercfg
echo Downloading Docker requirements..
wget --user=admin --password=$ADMIN_PASSWORD https://game-on.org/repository/dockerneeds.tar -q
echo Setting up Docker...
tar xzf dockerneeds.tar
cd ..
echo Downloading Java 8...
wget http://game-on.org:8081/jdk-8u65-x64.tar.gz -q
echo Extracting Java 8...
tar xzf jdk-8u65-x64.tar.gz
echo And removing the tarball...
rm jdk-8u65-x64.tar.gz
export JAVA_HOME=$(pwd)/jdk1.8.0_65
echo Building projects using gradle...
./gradlew build
echo Building and Starting Concierge Docker Image...
cd player-wlpcfg
../gradlew buildDockerImage
../gradlew stopCurrentContainer
../gradlew removeCurrentContainer
../gradlew startNewContainer
echo Removing JDK, cause there's no reason that's an artifact...
cd ..
rm -rf jdk1.8.0_65 |
05de158f44e72207193c4fb86952a74b6ab3cebe | app/assets/javascripts/resolve/load_permalink.js.coffee | app/assets/javascripts/resolve/load_permalink.js.coffee | $ ->
$("*[data-umlaut-toggle-permalink]").click (event) ->
originalLink = $(this)
valueContainer = $("#umlaut-permalink-container")
loadedLink = $(".umlaut-permalink-content a")
url = loadedLink.attr("href")
if url == undefined
loadedLink = $("#umlaut-permalink-content")
url = loadedLink.attr("href")
input = $("<input />")
input.attr("value", url)
input.addClass("form-control")
valueContainer.html("Cut and paste the following URL:")
valueContainer.append(input)
valueContainer.append(loadedLink.attr("id","umlaut-permalink-content"))
loadedLink.hide()
input.select()
| $ ->
$("*[data-umlaut-toggle-permalink]").click (event) ->
if $(".umlaut-permalink-container input").is(":visible")
$(".umlaut-permalink-container input").select()
permalinkLoaded = setInterval ->
if $(".umlaut-permalink-container a").is(":visible")
reformatPermalink(event)
clearInterval(permalinkLoaded)
, 0
reformatPermalink = (event) ->
originalLink = $(this)
valueContainer = $("#umlaut-permalink-container")
loadedLink = $(".umlaut-permalink-container a")
url = loadedLink.attr("href")
input = $("<input />")
input.attr("value", url)
input.attr("readonly", "true")
input.addClass("form-control")
valueContainer.html("Cut and paste the following URL:")
valueContainer.append(input)
input.select()
| Fix permalink reformat to wait for link to load if its the first time the page opened | Fix permalink reformat to wait for link to load if its the first time the page opened
| CoffeeScript | mit | NYULibraries/getit,NYULibraries/getit,NYULibraries/getit,NYULibraries/getit | coffeescript | ## Code Before:
$ ->
$("*[data-umlaut-toggle-permalink]").click (event) ->
originalLink = $(this)
valueContainer = $("#umlaut-permalink-container")
loadedLink = $(".umlaut-permalink-content a")
url = loadedLink.attr("href")
if url == undefined
loadedLink = $("#umlaut-permalink-content")
url = loadedLink.attr("href")
input = $("<input />")
input.attr("value", url)
input.addClass("form-control")
valueContainer.html("Cut and paste the following URL:")
valueContainer.append(input)
valueContainer.append(loadedLink.attr("id","umlaut-permalink-content"))
loadedLink.hide()
input.select()
## Instruction:
Fix permalink reformat to wait for link to load if its the first time the page opened
## Code After:
$ ->
$("*[data-umlaut-toggle-permalink]").click (event) ->
if $(".umlaut-permalink-container input").is(":visible")
$(".umlaut-permalink-container input").select()
permalinkLoaded = setInterval ->
if $(".umlaut-permalink-container a").is(":visible")
reformatPermalink(event)
clearInterval(permalinkLoaded)
, 0
reformatPermalink = (event) ->
originalLink = $(this)
valueContainer = $("#umlaut-permalink-container")
loadedLink = $(".umlaut-permalink-container a")
url = loadedLink.attr("href")
input = $("<input />")
input.attr("value", url)
input.attr("readonly", "true")
input.addClass("form-control")
valueContainer.html("Cut and paste the following URL:")
valueContainer.append(input)
input.select()
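As an aside, the CoffeeScript fix above polls with `setInterval` until the permalink element is visible before reformatting it. The same poll-until-ready idea can be sketched generically in Python; this is an illustrative sketch only, not part of the commit, and all names (`wait_until`, `loaded`, `finish_loading`) are invented for the example:

```python
import time
import threading

def wait_until(predicate, action, timeout=1.0, interval=0.01):
    """Poll predicate; run action once it holds. Returns True on success."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            action()
            return True
        time.sleep(interval)
    return False

loaded = {"ready": False, "formatted": False}

def finish_loading():
    loaded["ready"] = True

# Simulate the link becoming available shortly after polling starts,
# like the permalink finishing its load on first page open.
threading.Timer(0.05, finish_loading).start()

ok = wait_until(lambda: loaded["ready"],
                lambda: loaded.__setitem__("formatted", True))
print(ok, loaded["formatted"])  # True True
```

Like the `clearInterval` call in the commit, the polling loop stops as soon as the condition holds, so the reformatting step runs exactly once.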
|
2ba932afefd135c0afe50dad3450049918bbcdf8 | app/assets/stylesheets/_app/_community/_application.sass | app/assets/stylesheets/_app/_community/_application.sass | @import '_partials/card_forum'
@import '_partials/forms'
@import '_partials/post'
@import '_partials/user'
.btn--plus
padding-left: 40px
position: relative
&:before
+font-size(35)
left: 15px
position: absolute
top: 4px
.dropdown__value
padding: 6px 40px 6px 10px
&:after
right: 10px
top: 40%
.inverted &
+font-size(14)
min-width: 125px
padding-bottom: 2px
padding-top: 2px
&:after
top: 13px
.thorntree-search
position: relative
top: -3px
.input--text
background: #fff
.search__button
position: absolute
right: 4px
top: 4px
&:hover
background: none
.row--compose
padding-top: 17px
.copy--h3
margin-bottom: 12px
text-align: center
.thorntree
// Hijacking this here where it'll be scoped to thorntree. The idea is to eventually move this out and make copy--body be 16px everywhere.
.copy--body
+font-size(16)
.row__inner--secondary
.place-title
margin-right: 5px
.row__title--secondary + .badge
margin-right: 5px
margin-top: 5px
| @import '_partials/card_forum'
@import '_partials/forms'
@import '_partials/post'
@import '_partials/user'
.btn--plus
padding-left: 40px
position: relative
&:before
+font-size(35)
left: 15px
position: absolute
top: 4px
.dropdown__value
padding: 6px 40px 6px 10px
&:after
right: 10px
top: 40%
.inverted &
+font-size(14)
min-width: 125px
padding-bottom: 2px
padding-top: 2px
&:after
top: 13px
.thorntree-search
position: relative
top: -3px
.input--text
background: #fff
.search__button
position: absolute
right: 4px
top: 4px
&:hover
background: none
.row--compose
padding-top: 17px
.copy--h3
margin-bottom: 12px
text-align: center
.row__inner--secondary
.place-title
margin-right: 5px
.row__title--secondary + .badge
margin-right: 5px
margin-top: 5px
| Remove redundant thorn tree .copy--body override. | Remove redundant thorn tree .copy--body override.
| Sass | mit | lonelyplanet/rizzo,Lostmyname/rizzo,lonelyplanet/rizzo,lonelyplanet/rizzo,Lostmyname/rizzo,Lostmyname/rizzo,lonelyplanet/rizzo,lonelyplanet/rizzo,Lostmyname/rizzo,Lostmyname/rizzo | sass | ## Code Before:
@import '_partials/card_forum'
@import '_partials/forms'
@import '_partials/post'
@import '_partials/user'
.btn--plus
padding-left: 40px
position: relative
&:before
+font-size(35)
left: 15px
position: absolute
top: 4px
.dropdown__value
padding: 6px 40px 6px 10px
&:after
right: 10px
top: 40%
.inverted &
+font-size(14)
min-width: 125px
padding-bottom: 2px
padding-top: 2px
&:after
top: 13px
.thorntree-search
position: relative
top: -3px
.input--text
background: #fff
.search__button
position: absolute
right: 4px
top: 4px
&:hover
background: none
.row--compose
padding-top: 17px
.copy--h3
margin-bottom: 12px
text-align: center
.thorntree
// Hijacking this here where it'll be scoped to thorntree. The idea is to eventually move this out and make copy--body be 16px everywhere.
.copy--body
+font-size(16)
.row__inner--secondary
.place-title
margin-right: 5px
.row__title--secondary + .badge
margin-right: 5px
margin-top: 5px
## Instruction:
Remove redundant thorn tree .copy--body override.
## Code After:
@import '_partials/card_forum'
@import '_partials/forms'
@import '_partials/post'
@import '_partials/user'
.btn--plus
padding-left: 40px
position: relative
&:before
+font-size(35)
left: 15px
position: absolute
top: 4px
.dropdown__value
padding: 6px 40px 6px 10px
&:after
right: 10px
top: 40%
.inverted &
+font-size(14)
min-width: 125px
padding-bottom: 2px
padding-top: 2px
&:after
top: 13px
.thorntree-search
position: relative
top: -3px
.input--text
background: #fff
.search__button
position: absolute
right: 4px
top: 4px
&:hover
background: none
.row--compose
padding-top: 17px
.copy--h3
margin-bottom: 12px
text-align: center
.row__inner--secondary
.place-title
margin-right: 5px
.row__title--secondary + .badge
margin-right: 5px
margin-top: 5px
|
41647ab4cd56fc6d5ce58659f574186f293ca9f8 | bio/sources/uniprot/resources/uniprot-human_keys.properties | bio/sources/uniprot/resources/uniprot-human_keys.properties | Protein.key_md5=md5checksum, organism
Organism.key_taxonid=taxonId
Publication.key_pubmed=pubMedId
DataSet.key_name=name
DataSource.key_name=name
Gene.key_symbol=symbol, organism
OntologyTerm.key_name=name, ontology
Ontology.key_name=name
ProteinDomain.key_identifier=primaryIdentifier
| Protein.key_md5=md5checksum, organism
Organism.key_taxonid=taxonId
Publication.key_pubmed=pubMedId
DataSet.key_name=name
DataSource.key_name=name
Gene.key_symbol=symbol, organism
OntologyTerm.key_name=name, ontology
Ontology.key_name=name
ProteinDomain.key_identifier=primaryIdentifier
Sequence.key = md5checksum
| Add key for protein sequences to merge. | Add key for protein sequences to merge.
Former-commit-id: 9322b409d5cade6a26db62a0c30d31da81461ec6 | INI | lgpl-2.1 | julie-sullivan/phytomine,julie-sullivan/phytomine,julie-sullivan/phytomine,julie-sullivan/phytomine,julie-sullivan/phytomine,julie-sullivan/phytomine,julie-sullivan/phytomine | ini | ## Code Before:
Protein.key_md5=md5checksum, organism
Organism.key_taxonid=taxonId
Publication.key_pubmed=pubMedId
DataSet.key_name=name
DataSource.key_name=name
Gene.key_symbol=symbol, organism
OntologyTerm.key_name=name, ontology
Ontology.key_name=name
ProteinDomain.key_identifier=primaryIdentifier
## Instruction:
Add key for protein sequences to merge.
Former-commit-id: 9322b409d5cade6a26db62a0c30d31da81461ec6
## Code After:
Protein.key_md5=md5checksum, organism
Organism.key_taxonid=taxonId
Publication.key_pubmed=pubMedId
DataSet.key_name=name
DataSource.key_name=name
Gene.key_symbol=symbol, organism
OntologyTerm.key_name=name, ontology
Ontology.key_name=name
ProteinDomain.key_identifier=primaryIdentifier
Sequence.key = md5checksum
|
45a48898129c0e487d4ba2230d97fffc1e94e078 | .travis.yml | .travis.yml | sudo: false
language: ruby
rvm:
- ree
- 1.9.3
- 2.5.1
notifications:
email: false
| sudo: false
language: ruby
rvm:
- ree
- 1.9.3
- 2.5.1
notifications:
email: false
before_install:
- gem uninstall -v '>= 2' -i $(rvm gemdir)@global -ax bundler || true
- gem install bundler -v '< 2'
| Use bundler < 2 for Travis CI | Use bundler < 2 for Travis CI
Otherwise it fails to work with this gem
Travis CI docs on this:
https://docs.travis-ci.com/user/languages/ruby/#bundler-20
| YAML | mit | bigcartel/bigcartel-theme-fonts | yaml | ## Code Before:
sudo: false
language: ruby
rvm:
- ree
- 1.9.3
- 2.5.1
notifications:
email: false
## Instruction:
Use bundler < 2 for Travis CI
Otherwise it fails to work with this gem
Travis CI docs on this:
https://docs.travis-ci.com/user/languages/ruby/#bundler-20
## Code After:
sudo: false
language: ruby
rvm:
- ree
- 1.9.3
- 2.5.1
notifications:
email: false
before_install:
- gem uninstall -v '>= 2' -i $(rvm gemdir)@global -ax bundler || true
- gem install bundler -v '< 2'
|
23aae398b791db53e8f69ed53a078f983c575fb0 | .travis.yml | .travis.yml | language: php
matrix:
allow_failures:
- php: 5.4
- php: 5.5
- php: 5.6
- php: hhvm
- php: hhvm-nightly
- php: nightly
include:
- php: 5.3
dist: precise
- php: 5.4
dist: trusty
- php: 5.5
dist: trusty
- php: 5.6
dist: trusty
- php: 7.0
dist: trusty
- php: 7.1
dist: trusty
- php: 7.2
dist: trusty
- php: 7.3
dist: trusty
- php: 7.4
dist: xenial
- php: hhvm
dist: trusty
- php: hhvm-nightly
dist: trusty
- php: nightly
dist: trusty
before_script:
- bash tests/before_script.sh
script:
- bash tests/script.sh
| language: php
matrix:
allow_failures:
- php: 5.4
- php: 5.5
- php: 5.6
- php: hhvm
- php: hhvm-nightly
- php: nightly
include:
- php: 5.3
dist: precise
- php: 5.4
dist: trusty
- php: 5.5
dist: trusty
- php: 5.6
dist: trusty
- php: 7.0
dist: trusty
- php: 7.1
dist: trusty
- php: 7.2
dist: trusty
- php: 7.3
dist: trusty
- php: 7.4
dist: xenial
- php: 8.0
dist: xenial
- php: hhvm
dist: trusty
- php: hhvm-nightly
dist: trusty
- php: nightly
dist: trusty
before_script:
- bash tests/before_script.sh
script:
- bash tests/script.sh
| Add PHP 8.0 to continuous integration tests | Add PHP 8.0 to continuous integration tests
| YAML | unlicense | zachborboa/php-curl-class,zachborboa/php-curl-class,php-curl-class/php-curl-class,php-curl-class/php-curl-class,php-curl-class/php-curl-class,zachborboa/php-curl-class | yaml | ## Code Before:
language: php
matrix:
allow_failures:
- php: 5.4
- php: 5.5
- php: 5.6
- php: hhvm
- php: hhvm-nightly
- php: nightly
include:
- php: 5.3
dist: precise
- php: 5.4
dist: trusty
- php: 5.5
dist: trusty
- php: 5.6
dist: trusty
- php: 7.0
dist: trusty
- php: 7.1
dist: trusty
- php: 7.2
dist: trusty
- php: 7.3
dist: trusty
- php: 7.4
dist: xenial
- php: hhvm
dist: trusty
- php: hhvm-nightly
dist: trusty
- php: nightly
dist: trusty
before_script:
- bash tests/before_script.sh
script:
- bash tests/script.sh
## Instruction:
Add PHP 8.0 to continuous integration tests
## Code After:
language: php
matrix:
allow_failures:
- php: 5.4
- php: 5.5
- php: 5.6
- php: hhvm
- php: hhvm-nightly
- php: nightly
include:
- php: 5.3
dist: precise
- php: 5.4
dist: trusty
- php: 5.5
dist: trusty
- php: 5.6
dist: trusty
- php: 7.0
dist: trusty
- php: 7.1
dist: trusty
- php: 7.2
dist: trusty
- php: 7.3
dist: trusty
- php: 7.4
dist: xenial
- php: 8.0
dist: xenial
- php: hhvm
dist: trusty
- php: hhvm-nightly
dist: trusty
- php: nightly
dist: trusty
before_script:
- bash tests/before_script.sh
script:
- bash tests/script.sh
|
764b21e02d1f37ed451b3cafb96371cfb1e97b4b | app/views/questions/index.html.erb | app/views/questions/index.html.erb | <body>
<div id='all_questions'>
<p id='ui_title'>All Questions</p>
<% @questions.each do |question| %>
<div class='ind_ques'>
<div class="ind_ques_contents">
<p ><%= question.title %></p>
<p><%= question.content %></p>
<p> <img class='ui_avatar_question' src='<%= question.creator.avatar %>'/> </p>
<p><%= question.creator.name %></p>
</div>
</div>
<% end %>
<% if false %>
<p><%= link_to 'sign up', signup_path %></p>
<p><%= link_to 'login', login_path %></p>
<% end %>
<div>
</body>
| <body>
<div id='all_questions'>
<p id='ui_title'>All Questions</p>
<% @questions.each do |question| %>
<div class='ind_ques'>
<div class="ind_ques_contents">
<li><%= link_to "#{question.title}", question_path(question.id) %></li>
<li><%= question.content %></li>
<li> <img class='ui_avatar_question' src='<%= question.creator.avatar %>'/> </li>
<li><%= link_to "#{question.creator.name}", user_path(question.creator_id) %></li>
</div>
</div>
<% end %>
<% if false %>
<li><%= link_to 'sign up', signup_path %></li>
<li><%= link_to 'login', login_path %></li>
<% end %>
<div>
</body>
| Add links for user and question | Add links for user and question
| HTML+ERB | mit | kevalwell/StackOverflow,kevalwell/StackOverflow,kevalwell/StackOverflow | html+erb | ## Code Before:
<body>
<div id='all_questions'>
<p id='ui_title'>All Questions</p>
<% @questions.each do |question| %>
<div class='ind_ques'>
<div class="ind_ques_contents">
<p ><%= question.title %></p>
<p><%= question.content %></p>
<p> <img class='ui_avatar_question' src='<%= question.creator.avatar %>'/> </p>
<p><%= question.creator.name %></p>
</div>
</div>
<% end %>
<% if false %>
<p><%= link_to 'sign up', signup_path %></p>
<p><%= link_to 'login', login_path %></p>
<% end %>
<div>
</body>
## Instruction:
Add links for user and question
## Code After:
<body>
<div id='all_questions'>
<p id='ui_title'>All Questions</p>
<% @questions.each do |question| %>
<div class='ind_ques'>
<div class="ind_ques_contents">
<li><%= link_to "#{question.title}", question_path(question.id) %></li>
<li><%= question.content %></li>
<li> <img class='ui_avatar_question' src='<%= question.creator.avatar %>'/> </li>
<li><%= link_to "#{question.creator.name}", user_path(question.creator_id) %></li>
</div>
</div>
<% end %>
<% if false %>
<li><%= link_to 'sign up', signup_path %></li>
<li><%= link_to 'login', login_path %></li>
<% end %>
<div>
</body>
|
c1e6d14ee6e4c46b92fa326c81e3644e4626cacf | doc/AboutUs.md | doc/AboutUs.md |
* [Role](#role)
* [Kai Ling](#kai-ling)
* [Yi Hang](#yi-hang)
* [Rong Hua](#rong-hua)
* [Charlton](#charlton)
* [Contact](#contact)
## Role
#### Kai Ling
1. Team Leader
2. Documentation
#### Yi Hang
1. Code Quality
2. Integration
#### Rong Hua
1. Test
2. Intern
#### Charlton
1. Scheduling and Tracking
2. Deliverables and Deadlines
## Contact
* Kai Ling (https://github.com/Kaiiiii)
* Yi Hang (https://github.com/yihangho)
* Rong Hua (https://github.com/Roahhh)
* Charlton (https://github.com/cadmusthefounder)
|
* [Role](#role)
* [Kai Ling](#kai-ling)
* [Yi Hang](#yi-hang)
* [Rong Hua](#rong-hua)
* [Charlton](#charlton)
* [Contact](#contact)
## Role
#### Kai Ling
1. Team Leader - In charge of overall project coordination.
2. Documentation - Ensures that project documentations are in order.
#### Yi Hang
1. Code Quality - Ensures adherence to coding standards.
2. Integration - In charge of versioning and code maintainence.
3. Git Expert - Advisor for other members with regards to Git.
#### Rong Hua
1. Test - Ensures testing of the project is done properly and on time.
2. Eclipse Expert - Advisor for other members with regards to Eclipse.
3. Intern - Helps out everyone for free.
#### Charlton
1. Scheduling and Tracking - In charge of defining, assigning, and tracking project tasks.
2. Deliverables and Deadlines - Ensure project deliverables are done on time and in the right format.
## Contact
* Kai Ling (https://github.com/Kaiiiii)
* Yi Hang (https://github.com/yihangho)
* Rong Hua (https://github.com/Roahhh)
* Charlton (https://github.com/cadmusthefounder)
| Add responsibilities in About Us | Add responsibilities in About Us
| Markdown | mit | CS2103AUG2016-W11-C1/main | markdown | ## Code Before:
* [Role](#role)
* [Kai Ling](#kai-ling)
* [Yi Hang](#yi-hang)
* [Rong Hua](#rong-hua)
* [Charlton](#charlton)
* [Contact](#contact)
## Role
#### Kai Ling
1. Team Leader
2. Documentation
#### Yi Hang
1. Code Quality
2. Integration
#### Rong Hua
1. Test
2. Intern
#### Charlton
1. Scheduling and Tracking
2. Deliverables and Deadlines
## Contact
* Kai Ling (https://github.com/Kaiiiii)
* Yi Hang (https://github.com/yihangho)
* Rong Hua (https://github.com/Roahhh)
* Charlton (https://github.com/cadmusthefounder)
## Instruction:
Add responsibilities in About Us
## Code After:
* [Role](#role)
* [Kai Ling](#kai-ling)
* [Yi Hang](#yi-hang)
* [Rong Hua](#rong-hua)
* [Charlton](#charlton)
* [Contact](#contact)
## Role
#### Kai Ling
1. Team Leader - In charge of overall project coordination.
2. Documentation - Ensures that project documentations are in order.
#### Yi Hang
1. Code Quality - Ensures adherence to coding standards.
2. Integration - In charge of versioning and code maintainence.
3. Git Expert - Advisor for other members with regards to Git.
#### Rong Hua
1. Test - Ensures testing of the project is done properly and on time.
2. Eclipse Expert - Advisor for other members with regards to Eclipse.
3. Intern - Helps out everyone for free.
#### Charlton
1. Scheduling and Tracking - In charge of defining, assigning, and tracking project tasks.
2. Deliverables and Deadlines - Ensure project deliverables are done on time and in the right format.
## Contact
* Kai Ling (https://github.com/Kaiiiii)
* Yi Hang (https://github.com/yihangho)
* Rong Hua (https://github.com/Roahhh)
* Charlton (https://github.com/cadmusthefounder)
|
9e13c395c57629c592c8592dc6fd86403264c701 | .github/pull_request_template.md | .github/pull_request_template.md | ↖ First, change the base branch from "master" to "dev".
↑ Next, briefly describe your proposal in the title.
Fixes: #
Finally, tell us more about the change here, in the description.
/cc @primer/ds-core
| - [ ] First, change the base branch from "master" to "dev".
- [ ] Next, briefly describe your proposal in the title.
- [ ] Fixes: # (type an issue number after the # if applicable)
Finally, tell us more about the change here, in the description.
/cc @primer/ds-core
| Make a checklist out of the 3 sentences | Make a checklist out of the 3 sentences
Also fixed the naming of the `Fixes: #` with the help of @shawnbot.
| Markdown | mit | primer/primer,primer/primer-css,primer/primer-css,primer/primer-css,primer/primer,primer/primer | markdown | ## Code Before:
↖ First, change the base branch from "master" to "dev".
↑ Next, briefly describe your proposal in the title.
Fixes: #
Finally, tell us more about the change here, in the description.
/cc @primer/ds-core
## Instruction:
Make a checklist out of the 3 sentences
Also fixed the naming of the `Fixes: #` with the help of @shawnbot.
## Code After:
- [ ] First, change the base branch from "master" to "dev".
- [ ] Next, briefly describe your proposal in the title.
- [ ] Fixes: # (type an issue number after the # if applicable)
Finally, tell us more about the change here, in the description.
/cc @primer/ds-core
|
58e4a92835061216155926cf1dcf43a20142181f | upload/library/SV/WordCountSearch/XenForo/DataWriter/DiscussionMessage/Post.php | upload/library/SV/WordCountSearch/XenForo/DataWriter/DiscussionMessage/Post.php | <?php
class SV_WordCountSearch_XenForo_DataWriter_DiscussionMessage_Post extends XFCP_SV_WordCountSearch_XenForo_DataWriter_DiscussionMessage_Post
{
protected function _getFields()
{
$fields = parent::_getFields();
$fields['xf_post_words'] = array
(
'post_id' => array('type' => self::TYPE_UINT, 'default' => array('xf_post', 'post_id'), 'required' => true),
'word_count' => array('type' => self::TYPE_UINT, 'default' => 0)
);
return $fields;
}
protected function _messagePreSave()
{
parent::_messagePreSave();
if ($this->isChanged('message'))
{
$this->set('word_count', $this->_getSearchModel()->getTextWordCount($this->get('message')));
}
}
protected function _getSearchModel()
{
return $this->getModelFromCache('XenForo_Model_Search');
}
}
| <?php
class SV_WordCountSearch_XenForo_DataWriter_DiscussionMessage_Post extends XFCP_SV_WordCountSearch_XenForo_DataWriter_DiscussionMessage_Post
{
protected function _getFields()
{
$fields = parent::_getFields();
$fields['xf_post_words'] = array
(
'post_id' => array('type' => self::TYPE_UINT, 'default' => array('xf_post', 'post_id'), 'required' => true),
'word_count' => array('type' => self::TYPE_UINT, 'default' => 0)
);
return $fields;
}
protected function _getExistingData($data)
{
$postData = parent::_getExistingData($data);
if (isset($postData['xf_post']['word_count']))
{
$postData['xf_post_words'] = array('post_id' => $postData['xf_post']['post_id'], 'word_count' => $postData['xf_post']['word_count']);
unset($postData['xf_post']['word_count']);
}
return $postData;
}
protected function _messagePostDelete()
{
parent::_messagePostDelete();
$db = $this->_db;
$post_id = $this->get('post_id');
$postIdQuoted = $db->quote($post_id);
$db->delete('xf_post_words', "post_id = $postIdQuoted");
}
protected function _messagePreSave()
{
parent::_messagePreSave();
if ($this->isChanged('message'))
{
$this->set('word_count', $this->_getSearchModel()->getTextWordCount($this->get('message')));
}
}
protected function _getSearchModel()
{
return $this->getModelFromCache('XenForo_Model_Search');
}
}
| Handle updates/deletes of linked tables correctly | Handle updates/deletes of linked tables correctly
| PHP | mit | Xon/XenForo-WordCountSearch | php | ## Code Before:
<?php
class SV_WordCountSearch_XenForo_DataWriter_DiscussionMessage_Post extends XFCP_SV_WordCountSearch_XenForo_DataWriter_DiscussionMessage_Post
{
protected function _getFields()
{
$fields = parent::_getFields();
$fields['xf_post_words'] = array
(
'post_id' => array('type' => self::TYPE_UINT, 'default' => array('xf_post', 'post_id'), 'required' => true),
'word_count' => array('type' => self::TYPE_UINT, 'default' => 0)
);
return $fields;
}
protected function _messagePreSave()
{
parent::_messagePreSave();
if ($this->isChanged('message'))
{
$this->set('word_count', $this->_getSearchModel()->getTextWordCount($this->get('message')));
}
}
protected function _getSearchModel()
{
return $this->getModelFromCache('XenForo_Model_Search');
}
}
## Instruction:
Handle updates/deletes of linked tables correctly
## Code After:
<?php
class SV_WordCountSearch_XenForo_DataWriter_DiscussionMessage_Post extends XFCP_SV_WordCountSearch_XenForo_DataWriter_DiscussionMessage_Post
{
protected function _getFields()
{
$fields = parent::_getFields();
$fields['xf_post_words'] = array
(
'post_id' => array('type' => self::TYPE_UINT, 'default' => array('xf_post', 'post_id'), 'required' => true),
'word_count' => array('type' => self::TYPE_UINT, 'default' => 0)
);
return $fields;
}
protected function _getExistingData($data)
{
$postData = parent::_getExistingData($data);
if (isset($postData['xf_post']['word_count']))
{
$postData['xf_post_words'] = array('post_id' => $postData['xf_post']['post_id'], 'word_count' => $postData['xf_post']['word_count']);
unset($postData['xf_post']['word_count']);
}
return $postData;
}
protected function _messagePostDelete()
{
parent::_messagePostDelete();
$db = $this->_db;
$post_id = $this->get('post_id');
$postIdQuoted = $db->quote($post_id);
$db->delete('xf_post_words', "post_id = $postIdQuoted");
}
protected function _messagePreSave()
{
parent::_messagePreSave();
if ($this->isChanged('message'))
{
$this->set('word_count', $this->_getSearchModel()->getTextWordCount($this->get('message')));
}
}
protected function _getSearchModel()
{
return $this->getModelFromCache('XenForo_Model_Search');
}
}
|
c7030e461026e718c46b86dadecc9681d226c27c | cupy/util.py | cupy/util.py | import atexit
import functools
from cupy import cuda
_memoized_funcs = []
def memoize(for_each_device=False):
"""Makes a function memoizing the result for each argument and device.
This decorator provides automatic memoization of the function result.
Args:
for_each_device (bool): If True, it memoizes the results for each
device. Otherwise, it memoizes the results only based on the
arguments.
"""
def decorator(f):
global _memoized_funcs
f._cupy_memo = {}
_memoized_funcs.append(f)
@functools.wraps(f)
def ret(*args, **kwargs):
arg_key = (args, frozenset(kwargs.items()))
if for_each_device:
arg_key = (cuda.Device().id, arg_key)
memo = f._cupy_memo
result = memo.get(arg_key, None)
if result is None:
result = f(*args, **kwargs)
memo[arg_key] = result
return result
return ret
return decorator
@atexit.register
def clear_memo():
"""Clears the memoized results for all functions decorated by memoize."""
global _memoized_funcs
for f in _memoized_funcs:
del f._cupy_memo
_memoized_funcs = []
| import atexit
import functools
from cupy import cuda
_memos = []
def memoize(for_each_device=False):
"""Makes a function memoizing the result for each argument and device.
This decorator provides automatic memoization of the function result.
Args:
for_each_device (bool): If True, it memoizes the results for each
device. Otherwise, it memoizes the results only based on the
arguments.
"""
def decorator(f):
memo = {}
_memos.append(memo)
@functools.wraps(f)
def ret(*args, **kwargs):
arg_key = (args, frozenset(kwargs.items()))
if for_each_device:
arg_key = (cuda.Device().id, arg_key)
result = memo.get(arg_key, None)
if result is None:
result = f(*args, **kwargs)
memo[arg_key] = result
return result
return ret
return decorator
@atexit.register
def clear_memo():
"""Clears the memoized results for all functions decorated by memoize."""
for memo in _memos:
memo.clear()
| Fix unintended late finalization of memoized functions | Fix unintended late finalization of memoized functions
| Python | mit | ktnyt/chainer,niboshi/chainer,niboshi/chainer,laysakura/chainer,tscohen/chainer,benob/chainer,chainer/chainer,cupy/cupy,aonotas/chainer,cupy/cupy,jnishi/chainer,cupy/cupy,tkerola/chainer,cemoody/chainer,ktnyt/chainer,chainer/chainer,jnishi/chainer,jnishi/chainer,keisuke-umezawa/chainer,truongdq/chainer,kashif/chainer,wkentaro/chainer,ronekko/chainer,pfnet/chainer,delta2323/chainer,okuta/chainer,cupy/cupy,muupan/chainer,okuta/chainer,ysekky/chainer,hvy/chainer,chainer/chainer,truongdq/chainer,jnishi/chainer,keisuke-umezawa/chainer,t-abe/chainer,chainer/chainer,wkentaro/chainer,tigerneil/chainer,AlpacaDB/chainer,okuta/chainer,wkentaro/chainer,hvy/chainer,niboshi/chainer,ytoyama/yans_chainer_hackathon,sinhrks/chainer,keisuke-umezawa/chainer,ktnyt/chainer,kikusu/chainer,niboshi/chainer,rezoo/chainer,hvy/chainer,benob/chainer,sinhrks/chainer,anaruse/chainer,1986ks/chainer,ktnyt/chainer,sou81821/chainer,kikusu/chainer,okuta/chainer,kiyukuta/chainer,wkentaro/chainer,muupan/chainer,t-abe/chainer,AlpacaDB/chainer,Kaisuke5/chainer,minhpqn/chainer,hvy/chainer,keisuke-umezawa/chainer | python | ## Code Before:
import atexit
import functools
from cupy import cuda
_memoized_funcs = []
def memoize(for_each_device=False):
"""Makes a function memoizing the result for each argument and device.
This decorator provides automatic memoization of the function result.
Args:
for_each_device (bool): If True, it memoizes the results for each
device. Otherwise, it memoizes the results only based on the
arguments.
"""
def decorator(f):
global _memoized_funcs
f._cupy_memo = {}
_memoized_funcs.append(f)
@functools.wraps(f)
def ret(*args, **kwargs):
arg_key = (args, frozenset(kwargs.items()))
if for_each_device:
arg_key = (cuda.Device().id, arg_key)
memo = f._cupy_memo
result = memo.get(arg_key, None)
if result is None:
result = f(*args, **kwargs)
memo[arg_key] = result
return result
return ret
return decorator
@atexit.register
def clear_memo():
"""Clears the memoized results for all functions decorated by memoize."""
global _memoized_funcs
for f in _memoized_funcs:
del f._cupy_memo
_memoized_funcs = []
## Instruction:
Fix unintended late finalization of memoized functions
## Code After:
import atexit
import functools
from cupy import cuda
_memos = []
def memoize(for_each_device=False):
"""Makes a function memoizing the result for each argument and device.
This decorator provides automatic memoization of the function result.
Args:
for_each_device (bool): If True, it memoizes the results for each
device. Otherwise, it memoizes the results only based on the
arguments.
"""
def decorator(f):
memo = {}
_memos.append(memo)
@functools.wraps(f)
def ret(*args, **kwargs):
arg_key = (args, frozenset(kwargs.items()))
if for_each_device:
arg_key = (cuda.Device().id, arg_key)
result = memo.get(arg_key, None)
if result is None:
result = f(*args, **kwargs)
memo[arg_key] = result
return result
return ret
return decorator
@atexit.register
def clear_memo():
"""Clears the memoized results for all functions decorated by memoize."""
for memo in _memos:
memo.clear()
|
a87a0f30d6325705f200c2137dbb2d5a9b0222b9 | pkg/term/tc_linux_cgo.go | pkg/term/tc_linux_cgo.go | // +build linux,cgo
package term
import (
"syscall"
"unsafe"
)
// #include <termios.h>
import "C"
type Termios syscall.Termios
// MakeRaw put the terminal connected to the given file descriptor into raw
// mode and returns the previous state of the terminal so that it can be
// restored.
func MakeRaw(fd uintptr) (*State, error) {
var oldState State
if err := tcget(fd, &oldState.termios); err != 0 {
return nil, err
}
newState := oldState.termios
C.cfmakeraw((*C.struct_termios)(unsafe.Pointer(&newState)))
if err := tcset(fd, &newState); err != 0 {
return nil, err
}
return &oldState, nil
}
func tcget(fd uintptr, p *Termios) syscall.Errno {
ret, err := C.tcgetattr(C.int(fd), (*C.struct_termios)(unsafe.Pointer(p)))
if ret != 0 {
return err.(syscall.Errno)
}
return 0
}
func tcset(fd uintptr, p *Termios) syscall.Errno {
ret, err := C.tcsetattr(C.int(fd), C.TCSANOW, (*C.struct_termios)(unsafe.Pointer(p)))
if ret != 0 {
return err.(syscall.Errno)
}
return 0
}
| // +build linux,cgo
package term
import (
"syscall"
"unsafe"
)
// #include <termios.h>
import "C"
type Termios syscall.Termios
// MakeRaw put the terminal connected to the given file descriptor into raw
// mode and returns the previous state of the terminal so that it can be
// restored.
func MakeRaw(fd uintptr) (*State, error) {
var oldState State
if err := tcget(fd, &oldState.termios); err != 0 {
return nil, err
}
newState := oldState.termios
C.cfmakeraw((*C.struct_termios)(unsafe.Pointer(&newState)))
newState.Oflag = newState.Oflag | C.OPOST
if err := tcset(fd, &newState); err != 0 {
return nil, err
}
return &oldState, nil
}
func tcget(fd uintptr, p *Termios) syscall.Errno {
ret, err := C.tcgetattr(C.int(fd), (*C.struct_termios)(unsafe.Pointer(p)))
if ret != 0 {
return err.(syscall.Errno)
}
return 0
}
func tcset(fd uintptr, p *Termios) syscall.Errno {
ret, err := C.tcsetattr(C.int(fd), C.TCSANOW, (*C.struct_termios)(unsafe.Pointer(p)))
if ret != 0 {
return err.(syscall.Errno)
}
return 0
}
| Fix weird terminal output format | Fix weird terminal output format
Signed-off-by: Lei Jitang <9ac444d2b5df3db1f31aa1c6462ac8e9e2bde241@huawei.com>
| Go | apache-2.0 | moby/term | go | ## Code Before:
// +build linux,cgo
package term
import (
"syscall"
"unsafe"
)
// #include <termios.h>
import "C"
type Termios syscall.Termios
// MakeRaw put the terminal connected to the given file descriptor into raw
// mode and returns the previous state of the terminal so that it can be
// restored.
func MakeRaw(fd uintptr) (*State, error) {
var oldState State
if err := tcget(fd, &oldState.termios); err != 0 {
return nil, err
}
newState := oldState.termios
C.cfmakeraw((*C.struct_termios)(unsafe.Pointer(&newState)))
if err := tcset(fd, &newState); err != 0 {
return nil, err
}
return &oldState, nil
}
func tcget(fd uintptr, p *Termios) syscall.Errno {
ret, err := C.tcgetattr(C.int(fd), (*C.struct_termios)(unsafe.Pointer(p)))
if ret != 0 {
return err.(syscall.Errno)
}
return 0
}
func tcset(fd uintptr, p *Termios) syscall.Errno {
ret, err := C.tcsetattr(C.int(fd), C.TCSANOW, (*C.struct_termios)(unsafe.Pointer(p)))
if ret != 0 {
return err.(syscall.Errno)
}
return 0
}
## Instruction:
Fix weird terminal output format
Signed-off-by: Lei Jitang <9ac444d2b5df3db1f31aa1c6462ac8e9e2bde241@huawei.com>
## Code After:
// +build linux,cgo
package term
import (
"syscall"
"unsafe"
)
// #include <termios.h>
import "C"
type Termios syscall.Termios
// MakeRaw put the terminal connected to the given file descriptor into raw
// mode and returns the previous state of the terminal so that it can be
// restored.
func MakeRaw(fd uintptr) (*State, error) {
var oldState State
if err := tcget(fd, &oldState.termios); err != 0 {
return nil, err
}
newState := oldState.termios
C.cfmakeraw((*C.struct_termios)(unsafe.Pointer(&newState)))
newState.Oflag = newState.Oflag | C.OPOST
if err := tcset(fd, &newState); err != 0 {
return nil, err
}
return &oldState, nil
}
func tcget(fd uintptr, p *Termios) syscall.Errno {
ret, err := C.tcgetattr(C.int(fd), (*C.struct_termios)(unsafe.Pointer(p)))
if ret != 0 {
return err.(syscall.Errno)
}
return 0
}
func tcset(fd uintptr, p *Termios) syscall.Errno {
ret, err := C.tcsetattr(C.int(fd), C.TCSANOW, (*C.struct_termios)(unsafe.Pointer(p)))
if ret != 0 {
return err.(syscall.Errno)
}
return 0
}
|
ad931adc08eec181ea2e6af6ab6b067acec10e3f | render_script/main.js | render_script/main.js | function RenderClass(element) {
var _private = this;
_private.el = null;
this.update = function (data) {
if (_private.el === null) { return; }
for (var container of data) {
newLine = document.getElementById('template').cloneNode(true);
newLine.innerHTML = newLine.innerHTML
.replace(/{{name}}/g, container.name)
.replace(/{{path}}/g, container.path)
.replace(/{{ischecked}}/g, container.state === 'Up' ? 'checked' : '')
if (document.getElementById('container-' + container.name)) {
var toDelete = document.getElementById('container-' + container.name)
toDelete.insertAdjacentHTML('afterend', newLine.innerHTML)
toDelete.parentElement.removeChild(toDelete)
} else {
_private.el.insertAdjacentHTML('afterend', newLine.innerHTML)
}
_private.bindSwitchs()
}
}
_private.bindSwitchs = function () {
var switchs = _private.el.querySelectorAll('#content .switch')
for (var state_switch of switchs) {
var label = state_switch.querySelector('.state-switch')
var input = state_switch.querySelector('input')
label.onclick = function () {
var name = label.getAttribute('for').replace(/state-/, '')
ipc.send('switch-state', {
name: name,
value: !input.checked
})
return false
}
}
}
_private.init = function (el) {
_private.el = el;
}
_private.init(element);
} | function RenderClass(element) {
var _private = this;
_private.el = null;
this.update = function (data) {
if (_private.el === null) { return; }
for (var container of data) {
var isChecked = (container.state === 'Up')
var newLine = document.getElementById('template').cloneNode(true);
var lineIfExist = document.getElementById('container-' + container.name)
newLine.innerHTML = newLine.innerHTML
.replace(/{{name}}/g, container.name)
.replace(/{{path}}/g, container.path)
.replace(/{{ischecked}}/g, isChecked ? 'checked' : '')
if (lineIfExist && lineIfExist.querySelector('input').checked !== isChecked) {
lineIfExist.querySelector('input').checked = !lineIfExist.querySelector('input').checked
} else if (!lineIfExist) {
_private.el.insertAdjacentHTML('beforeend', newLine.innerHTML)
_private.bindSwitch('container-' + container.name)
}
}
}
_private.bindSwitch = function (container_id) {
var container = document.getElementById(container_id)
var label = container.querySelector('.state-switch')
var input = container.querySelector('input')
label.onclick = function () {
var name = this.getAttribute('for').replace(/state-/, '')
ipc.send('switch-state', {
name: name,
value: !input.checked
})
return false
}
}
_private.init = function (el) {
_private.el = el;
}
_private.init(element);
} | Fix bind of the switches and increase perf with the DOM | Fix bind of the switches and increase perf with the DOM
| JavaScript | mit | FlorianDewulf/docker-containers-manager,FlorianDewulf/docker-containers-manager | javascript | ## Code Before:
function RenderClass(element) {
var _private = this;
_private.el = null;
this.update = function (data) {
if (_private.el === null) { return; }
for (var container of data) {
newLine = document.getElementById('template').cloneNode(true);
newLine.innerHTML = newLine.innerHTML
.replace(/{{name}}/g, container.name)
.replace(/{{path}}/g, container.path)
.replace(/{{ischecked}}/g, container.state === 'Up' ? 'checked' : '')
if (document.getElementById('container-' + container.name)) {
var toDelete = document.getElementById('container-' + container.name)
toDelete.insertAdjacentHTML('afterend', newLine.innerHTML)
toDelete.parentElement.removeChild(toDelete)
} else {
_private.el.insertAdjacentHTML('afterend', newLine.innerHTML)
}
_private.bindSwitchs()
}
}
_private.bindSwitchs = function () {
var switchs = _private.el.querySelectorAll('#content .switch')
for (var state_switch of switchs) {
var label = state_switch.querySelector('.state-switch')
var input = state_switch.querySelector('input')
label.onclick = function () {
var name = label.getAttribute('for').replace(/state-/, '')
ipc.send('switch-state', {
name: name,
value: !input.checked
})
return false
}
}
}
_private.init = function (el) {
_private.el = el;
}
_private.init(element);
}
## Instruction:
Fix bind of the switches and increase perf with the DOM
## Code After:
function RenderClass(element) {
var _private = this;
_private.el = null;
this.update = function (data) {
if (_private.el === null) { return; }
for (var container of data) {
var isChecked = (container.state === 'Up')
var newLine = document.getElementById('template').cloneNode(true);
var lineIfExist = document.getElementById('container-' + container.name)
newLine.innerHTML = newLine.innerHTML
.replace(/{{name}}/g, container.name)
.replace(/{{path}}/g, container.path)
.replace(/{{ischecked}}/g, isChecked ? 'checked' : '')
if (lineIfExist && lineIfExist.querySelector('input').checked !== isChecked) {
lineIfExist.querySelector('input').checked = !lineIfExist.querySelector('input').checked
} else if (!lineIfExist) {
_private.el.insertAdjacentHTML('beforeend', newLine.innerHTML)
_private.bindSwitch('container-' + container.name)
}
}
}
_private.bindSwitch = function (container_id) {
var container = document.getElementById(container_id)
var label = container.querySelector('.state-switch')
var input = container.querySelector('input')
label.onclick = function () {
var name = this.getAttribute('for').replace(/state-/, '')
ipc.send('switch-state', {
name: name,
value: !input.checked
})
return false
}
}
_private.init = function (el) {
_private.el = el;
}
_private.init(element);
} |
221c6bcc31147ad23ff856ebc3a58f07a59d7966 | spec/arethusa.core/key_capture_spec.js | spec/arethusa.core/key_capture_spec.js | "use strict";
describe('keyCapture', function() {
beforeEach(module('arethusa.core'));
describe('isCtrlActive', function() {
it('return false before any event', inject(function(keyCapture) {
expect(keyCapture.isCtrlActive()).toBe(false);
}));
it('return true if keydown for ctrl was received', inject(function(keyCapture) {
keyCapture.keydown({ keyCode : keyCapture.keyCodes.ctrl });
expect(keyCapture.isCtrlActive()).toBe(true);
}));
it('return false if keydown for ctrl was received', inject(function(keyCapture) {
keyCapture.keydown({ keyCode : keyCapture.keyCodes.ctrl });
keyCapture.keyup({ keyCode : keyCapture.keyCodes.ctrl });
expect(keyCapture.isCtrlActive()).toBe(false);
}));
});
describe('onKeyPressed', function() {
it('calls the given callback', inject(function(keyCapture) {
var callbackCalled = false;
var callback = function() { callbackCalled = true; };
keyCapture.onKeyPressed(keyCapture.keyCodes.esc, callback);
keyCapture.keydown({ keyCode : keyCapture.keyCodes.esc });
keyCapture.keyup({ keyCode : keyCapture.keyCodes.esc });
expect(callbackCalled).toBe(true);
}));
});
});
| "use strict";
describe('keyCapture', function() {
beforeEach(module('arethusa.core'));
describe('onKeyPressed', function() {
it('calls the given callback', inject(function(keyCapture) {
var event = {
keyCode: 27,
target: { tagname: '' }
};
var callbackCalled = false;
var callback = function() { callbackCalled = true; };
keyCapture.onKeyPressed('esc', callback);
keyCapture.keyup(event);
expect(callbackCalled).toBe(true);
}));
});
});
| Remove obsolete specs for keyCapture service | Remove obsolete specs for keyCapture service
| JavaScript | mit | PonteIneptique/arethusa,alpheios-project/arethusa,latin-language-toolkit/arethusa,Masoumeh/arethusa,alpheios-project/arethusa,PonteIneptique/arethusa,alpheios-project/arethusa,fbaumgardt/arethusa,latin-language-toolkit/arethusa,fbaumgardt/arethusa,fbaumgardt/arethusa,Masoumeh/arethusa | javascript | ## Code Before:
"use strict";
describe('keyCapture', function() {
beforeEach(module('arethusa.core'));
describe('isCtrlActive', function() {
it('return false before any event', inject(function(keyCapture) {
expect(keyCapture.isCtrlActive()).toBe(false);
}));
it('return true if keydown for ctrl was received', inject(function(keyCapture) {
keyCapture.keydown({ keyCode : keyCapture.keyCodes.ctrl });
expect(keyCapture.isCtrlActive()).toBe(true);
}));
it('return false if keydown for ctrl was received', inject(function(keyCapture) {
keyCapture.keydown({ keyCode : keyCapture.keyCodes.ctrl });
keyCapture.keyup({ keyCode : keyCapture.keyCodes.ctrl });
expect(keyCapture.isCtrlActive()).toBe(false);
}));
});
describe('onKeyPressed', function() {
it('calls the given callback', inject(function(keyCapture) {
var callbackCalled = false;
var callback = function() { callbackCalled = true; };
keyCapture.onKeyPressed(keyCapture.keyCodes.esc, callback);
keyCapture.keydown({ keyCode : keyCapture.keyCodes.esc });
keyCapture.keyup({ keyCode : keyCapture.keyCodes.esc });
expect(callbackCalled).toBe(true);
}));
});
});
## Instruction:
Remove obsolete specs for keyCapture service
## Code After:
"use strict";
describe('keyCapture', function() {
beforeEach(module('arethusa.core'));
describe('onKeyPressed', function() {
it('calls the given callback', inject(function(keyCapture) {
var event = {
keyCode: 27,
target: { tagname: '' }
};
var callbackCalled = false;
var callback = function() { callbackCalled = true; };
keyCapture.onKeyPressed('esc', callback);
keyCapture.keyup(event);
expect(callbackCalled).toBe(true);
}));
});
});
|
0d8582c7952856e50e0389086d89d35f155674de | .github/workflows/check-cppcheck-llpc.yml | .github/workflows/check-cppcheck-llpc.yml | name: Static code check
on:
pull_request:
jobs:
cppcheck:
name: cppcheck
runs-on: "ubuntu-20.04"
steps:
- name: Setup environment
run: |
sudo apt-get install -yqq cppcheck
- name: Checkout LLPC
run: |
git clone https://github.com/${GITHUB_REPOSITORY}.git .
git fetch origin +${GITHUB_SHA}:${GITHUB_REF} --update-head-ok
git checkout ${GITHUB_SHA}
- name: Run cppcheck
run: cppcheck -q -j$(( $(nproc) * 4 )) --error-exitcode=1 .
| name: Static code check
on:
pull_request:
jobs:
cppcheck:
name: cppcheck
runs-on: "ubuntu-20.04"
steps:
- name: Setup environment
run: |
sudo apt-get install -yqq cppcheck
- name: Checkout LLPC
run: |
git clone https://github.com/${GITHUB_REPOSITORY}.git .
git fetch origin +${GITHUB_SHA}:${GITHUB_REF} --update-head-ok
git checkout ${GITHUB_SHA}
- name: Run cppcheck
run: cppcheck -q -j$(( $(nproc) * 4 )) --error-exitcode=1 --std=c++14 --inline-suppr .
| Enable inline suppressions for cppcheck | [CI] Enable inline suppressions for cppcheck
With this flags, it's possible to suppress false positives directly in the source code, e.g.:
```
// cppcheck-suppress syntaxError
TEST(InputUtilsTests, PlaceholderPass) {
```
Required for https://github.com/GPUOpen-Drivers/llpc/pull/1372.
| YAML | mit | GPUOpen-Drivers/llpc,GPUOpen-Drivers/llpc,GPUOpen-Drivers/llpc,GPUOpen-Drivers/llpc,GPUOpen-Drivers/llpc | yaml | ## Code Before:
name: Static code check
on:
pull_request:
jobs:
cppcheck:
name: cppcheck
runs-on: "ubuntu-20.04"
steps:
- name: Setup environment
run: |
sudo apt-get install -yqq cppcheck
- name: Checkout LLPC
run: |
git clone https://github.com/${GITHUB_REPOSITORY}.git .
git fetch origin +${GITHUB_SHA}:${GITHUB_REF} --update-head-ok
git checkout ${GITHUB_SHA}
- name: Run cppcheck
run: cppcheck -q -j$(( $(nproc) * 4 )) --error-exitcode=1 .
## Instruction:
[CI] Enable inline suppressions for cppcheck
With this flags, it's possible to suppress false positives directly in the source code, e.g.:
```
// cppcheck-suppress syntaxError
TEST(InputUtilsTests, PlaceholderPass) {
```
Required for https://github.com/GPUOpen-Drivers/llpc/pull/1372.
## Code After:
name: Static code check
on:
pull_request:
jobs:
cppcheck:
name: cppcheck
runs-on: "ubuntu-20.04"
steps:
- name: Setup environment
run: |
sudo apt-get install -yqq cppcheck
- name: Checkout LLPC
run: |
git clone https://github.com/${GITHUB_REPOSITORY}.git .
git fetch origin +${GITHUB_SHA}:${GITHUB_REF} --update-head-ok
git checkout ${GITHUB_SHA}
- name: Run cppcheck
run: cppcheck -q -j$(( $(nproc) * 4 )) --error-exitcode=1 --std=c++14 --inline-suppr .
|
82e65db8147241fb997b5890d6d0778bd69cb633 | dist/vellum/_lists.scss | dist/vellum/_lists.scss | // Lists
// -----
//
// We default to unstyled lists because they seem to be a more common usecase.
// Use the extensions to re-add the defaults back in.
ul,
ol {
margin: 0;
padding: 0;
list-style-type: none;
}
// Use this extension to turn default styling for unordered lists back on
%default-ul {
list-style-type: disc;
margin-bottom: $base-margin;
padding-left: $base-line-height;
}
// Use this extension to turn default styling for ordered lists back on
%default-ol {
list-style-type: decimal;
margin-bottom: $base-margin;
padding-left: $base-line-height;
}
dl {
line-height: $base-line-height;
margin-bottom: $base-margin;
}
dt {
font-weight: bold;
margin-top: $base-margin;
}
dd {
margin: 0;
} | // Lists
// -----
//
// We default to unstyled lists because they seem to be a more common usecase.
// Use the extensions to re-add the defaults back in.
ul,
ol {
margin: 0;
padding: 0;
list-style-type: none;
}
// Use this extension to turn default styling for unordered lists back on
%bullet-list {
list-style-type: disc;
margin-bottom: $base-margin;
padding-left: $base-line-height;
}
// Use this extension to turn default styling for ordered lists back on
%decimal-list {
list-style-type: decimal;
margin-bottom: $base-margin;
padding-left: $base-line-height;
}
dl {
line-height: $base-line-height;
margin-bottom: $base-margin;
}
dt {
font-weight: bold;
margin-top: $base-margin;
}
dd {
margin: 0;
} | Rename extends which bring back standard list styles | Rename extends which bring back standard list styles
| SCSS | mit | mobify/vellum,mobify/vellum | scss | ## Code Before:
// Lists
// -----
//
// We default to unstyled lists because they seem to be a more common usecase.
// Use the extensions to re-add the defaults back in.
ul,
ol {
margin: 0;
padding: 0;
list-style-type: none;
}
// Use this extension to turn default styling for unordered lists back on
%default-ul {
list-style-type: disc;
margin-bottom: $base-margin;
padding-left: $base-line-height;
}
// Use this extension to turn default styling for ordered lists back on
%default-ol {
list-style-type: decimal;
margin-bottom: $base-margin;
padding-left: $base-line-height;
}
dl {
line-height: $base-line-height;
margin-bottom: $base-margin;
}
dt {
font-weight: bold;
margin-top: $base-margin;
}
dd {
margin: 0;
}
## Instruction:
Rename extends which bring back standard list styles
## Code After:
// Lists
// -----
//
// We default to unstyled lists because they seem to be a more common usecase.
// Use the extensions to re-add the defaults back in.
ul,
ol {
margin: 0;
padding: 0;
list-style-type: none;
}
// Use this extension to turn default styling for unordered lists back on
%bullet-list {
list-style-type: disc;
margin-bottom: $base-margin;
padding-left: $base-line-height;
}
// Use this extension to turn default styling for ordered lists back on
%decimal-list {
list-style-type: decimal;
margin-bottom: $base-margin;
padding-left: $base-line-height;
}
dl {
line-height: $base-line-height;
margin-bottom: $base-margin;
}
dt {
font-weight: bold;
margin-top: $base-margin;
}
dd {
margin: 0;
} |
8bcdb52f3f2cf70a95829eed5c1b33fc329187b1 | .travis.yml | .travis.yml | language: ruby
script: "bundle exec rspec"
rvm:
- 2.2.3
before_install: gem install bundler -v 1.10.6
gemfile:
- Gemfile
notifications:
email: false
| language: ruby
script: "bundle exec rspec"
rvm:
- 2.3.0
- 2.2.3
before_install: gem install bundler -v 1.10.6
gemfile:
- Gemfile
notifications:
email: false
| Add Ruby 2.3.0 for CI | Add Ruby 2.3.0 for CI
| YAML | mit | bsodmike/easy_mailchimp,bsodmike/easy_mailchimp | yaml | ## Code Before:
language: ruby
script: "bundle exec rspec"
rvm:
- 2.2.3
before_install: gem install bundler -v 1.10.6
gemfile:
- Gemfile
notifications:
email: false
## Instruction:
Add Ruby 2.3.0 for CI
## Code After:
language: ruby
script: "bundle exec rspec"
rvm:
- 2.3.0
- 2.2.3
before_install: gem install bundler -v 1.10.6
gemfile:
- Gemfile
notifications:
email: false
|
8a35aa3f86a75348c31799a4939429a3f924392d | tasks/bnd-ext.rake | tasks/bnd-ext.rake | require 'buildr'
# Helper methods for building bundles using buildr-bnd.
class Buildr::Project
def java_packages
@java_packages ||= Dir[ File.join( _(:source, :main, :java), "**/*.java" ) ].collect { |f|
File.read(f).scan(/package\s+(\S+);/).flatten.first
}.compact.uniq
end
def bnd_export_package
@bnd_export_package ||= java_packages.collect { |p| "#{p};version=#{version}" }.join(',')
end
def bnd_import_package
"gov.nih.nci.*, *;resolution:=optional"
end
def configure_bundle(bundle)
bundle["Export-Package"] = bnd_export_package
bundle["Import-Package"] = bnd_import_package
bundle["Include-Resource"] = _(:target, :resources) if File.exist?(_(:target, :resources))
end
end
| require 'buildr'
# Helper methods for building bundles using buildr-bnd.
class Buildr::Project
def java_packages
@java_packages ||= Dir[ File.join( _(:source, :main, :java), "**/*.java" ) ].collect { |f|
File.read(f).scan(/package\s+(\S+);/).flatten.first
}.compact.uniq
end
def bnd_export_package
@bnd_export_package ||= java_packages.collect { |p| "#{p};version=#{version}" }.join(',')
end
def bnd_import_package
"gov.nih.nci.*, *;resolution:=optional"
end
def configure_bundle(bundle, options = {})
unless options.has_key?(:resources)
options[:resources] = File.exist?(_(:source, :main, :resources))
end
bundle["Export-Package"] = bnd_export_package
bundle["Import-Package"] = bnd_import_package
bundle["Include-Resource"] = _(:target, :resources) if options[:resources]
end
end
| Use a more sensible mechanism to determine whether to bundle resources. | Use a more sensible mechanism to determine whether to bundle resources.
(The previous one only worked by accident.)
SVN-Revision: 343
| Ruby | bsd-3-clause | NCIP/ctms-commons,NCIP/ctms-commons,NCIP/ctms-commons,NCIP/ctms-commons | ruby | ## Code Before:
require 'buildr'
# Helper methods for building bundles using buildr-bnd.
class Buildr::Project
def java_packages
@java_packages ||= Dir[ File.join( _(:source, :main, :java), "**/*.java" ) ].collect { |f|
File.read(f).scan(/package\s+(\S+);/).flatten.first
}.compact.uniq
end
def bnd_export_package
@bnd_export_package ||= java_packages.collect { |p| "#{p};version=#{version}" }.join(',')
end
def bnd_import_package
"gov.nih.nci.*, *;resolution:=optional"
end
def configure_bundle(bundle)
bundle["Export-Package"] = bnd_export_package
bundle["Import-Package"] = bnd_import_package
bundle["Include-Resource"] = _(:target, :resources) if File.exist?(_(:target, :resources))
end
end
## Instruction:
Use a more sensible mechanism to determine whether to bundle resources.
(The previous one only worked by accident.)
SVN-Revision: 343
## Code After:
require 'buildr'
# Helper methods for building bundles using buildr-bnd.
class Buildr::Project
def java_packages
@java_packages ||= Dir[ File.join( _(:source, :main, :java), "**/*.java" ) ].collect { |f|
File.read(f).scan(/package\s+(\S+);/).flatten.first
}.compact.uniq
end
def bnd_export_package
@bnd_export_package ||= java_packages.collect { |p| "#{p};version=#{version}" }.join(',')
end
def bnd_import_package
"gov.nih.nci.*, *;resolution:=optional"
end
def configure_bundle(bundle, options = {})
unless options.has_key?(:resources)
options[:resources] = File.exist?(_(:source, :main, :resources))
end
bundle["Export-Package"] = bnd_export_package
bundle["Import-Package"] = bnd_import_package
bundle["Include-Resource"] = _(:target, :resources) if options[:resources]
end
end
|
22243b70c2b27cef78cf38d5f3bd9dc420e21ba3 | yard_extensions.rb | yard_extensions.rb | require 'pp'
class MyModuleHandler < YARD::Handlers::Ruby::Base
handles method_call(:property)
def process
name = statement.parameters.first.jump(:tstring_content, :ident).source
object = YARD::CodeObjects::MethodObject.new(namespace, name)
register(object)
# modify the object
object.dynamic = true
object.scope = :class
object.name
end
end |
class MyModuleHandler < YARD::Handlers::Ruby::Base
handles method_call(:property)
def process
name = statement.parameters.first.jump(:tstring_content, :ident).source
object = YARD::CodeObjects::MethodObject.new(namespace, name)
register(object)
# modify the object
object.dynamic = true
object.scope = :class
object.name
end
end | Remove white space, remove require 'pp' | Remove white space, remove require 'pp'
| Ruby | mit | rapid7/convection,rapid7/convection | ruby | ## Code Before:
require 'pp'
class MyModuleHandler < YARD::Handlers::Ruby::Base
handles method_call(:property)
def process
name = statement.parameters.first.jump(:tstring_content, :ident).source
object = YARD::CodeObjects::MethodObject.new(namespace, name)
register(object)
# modify the object
object.dynamic = true
object.scope = :class
object.name
end
end
## Instruction:
Remove white space, remove require 'pp'
## Code After:
class MyModuleHandler < YARD::Handlers::Ruby::Base
handles method_call(:property)
def process
name = statement.parameters.first.jump(:tstring_content, :ident).source
object = YARD::CodeObjects::MethodObject.new(namespace, name)
register(object)
# modify the object
object.dynamic = true
object.scope = :class
object.name
end
end |
5a60adf7298781972388a6743a6a0fcd74c2821f | Resources/translations/validators.es.yml | Resources/translations/validators.es.yml | fos_user:
username:
already_used: El nombre de usuario ya está en uso
blank: Por favor, ingrese un nombre de usuario
short: "[-Inf,Inf]El nombre de usuario es demasiado corto"
long: "[-Inf,Inf]El nombre de usuario es demasiado largo"
email:
already_used: La dirección de correo ya está en uso
blank: Por favor, ingrese una dirección de correo
short: "[-Inf,Inf]La dirección de correo es demasiado corta"
long: "[-Inf,Inf]La dirección de correo es demasiado larga"
invalid: La dirección de correo no es válida
password:
blank: Por favor, ingrese una contraseña
short: "[-Inf,Inf]La contraseña es demasiado corta"
mismatch: Las dos contraseñas no coinciden
new_password:
blank: Por favor, ingrese una nueva contraseña
short: "[-Inf,Inf]La nueva contraseña es demasiado corta"
current_password:
invalid: La contraseña ingresada no es válida
group:
already_used: El nombre de grupo ya está en uso
blank: Por favor, ingrese un nombre
short: "[-Inf,Inf]El nombre es demasiado corto"
long: "[-Inf,Inf]El nombre es demasiado largo"
| fos_user:
username:
already_used: El nombre de usuario ya está en uso
blank: Por favor, ingrese un nombre de usuario
short: "[-Inf,Inf]El nombre de usuario es demasiado corto"
long: "[-Inf,Inf]El nombre de usuario es demasiado largo"
email:
already_used: La dirección de correo ya está en uso
blank: Por favor, ingrese una dirección de correo
short: "[-Inf,Inf]La dirección de correo es demasiado corta"
long: "[-Inf,Inf]La dirección de correo es demasiado larga"
invalid: La dirección de correo no es válida
password:
blank: Por favor, ingrese una contraseña
short: "[-Inf,Inf]La contraseña es demasiado corta"
mismatch: Las dos contraseñas no coinciden
new_password:
blank: Por favor, ingrese una nueva contraseña
short: "[-Inf,Inf]La nueva contraseña es demasiado corta"
current_password:
invalid: La contraseña ingresada no es válida
group:
blank: Por favor, ingrese un nombre
short: "[-Inf,Inf]El nombre es demasiado corto"
long: "[-Inf,Inf]El nombre es demasiado largo"
| Revert "Added Integrity constraint violation for group name" | Revert "Added Integrity constraint violation for group name"
This reverts commit 4cb6a565115e084fe2a7d9cbb9bfabbf2de057ac.
| YAML | mit | XWB/FOSUserBundle,XWB/FOSUserBundle,FriendsOfSymfony/FOSUserBundle | yaml | ## Code Before:
fos_user:
username:
already_used: El nombre de usuario ya está en uso
blank: Por favor, ingrese un nombre de usuario
short: "[-Inf,Inf]El nombre de usuario es demasiado corto"
long: "[-Inf,Inf]El nombre de usuario es demasiado largo"
email:
already_used: La dirección de correo ya está en uso
blank: Por favor, ingrese una dirección de correo
short: "[-Inf,Inf]La dirección de correo es demasiado corta"
long: "[-Inf,Inf]La dirección de correo es demasiado larga"
invalid: La dirección de correo no es válida
password:
blank: Por favor, ingrese una contraseña
short: "[-Inf,Inf]La contraseña es demasiado corta"
mismatch: Las dos contraseñas no coinciden
new_password:
blank: Por favor, ingrese una nueva contraseña
short: "[-Inf,Inf]La nueva contraseña es demasiado corta"
current_password:
invalid: La contraseña ingresada no es válida
group:
already_used: El nombre de grupo ya está en uso
blank: Por favor, ingrese un nombre
short: "[-Inf,Inf]El nombre es demasiado corto"
long: "[-Inf,Inf]El nombre es demasiado largo"
## Instruction:
Revert "Added Integrity constraint violation for group name"
This reverts commit 4cb6a565115e084fe2a7d9cbb9bfabbf2de057ac.
## Code After:
fos_user:
username:
already_used: El nombre de usuario ya está en uso
blank: Por favor, ingrese un nombre de usuario
short: "[-Inf,Inf]El nombre de usuario es demasiado corto"
long: "[-Inf,Inf]El nombre de usuario es demasiado largo"
email:
already_used: La dirección de correo ya está en uso
blank: Por favor, ingrese una dirección de correo
short: "[-Inf,Inf]La dirección de correo es demasiado corta"
long: "[-Inf,Inf]La dirección de correo es demasiado larga"
invalid: La dirección de correo no es válida
password:
blank: Por favor, ingrese una contraseña
short: "[-Inf,Inf]La contraseña es demasiado corta"
mismatch: Las dos contraseñas no coinciden
new_password:
blank: Por favor, ingrese una nueva contraseña
short: "[-Inf,Inf]La nueva contraseña es demasiado corta"
current_password:
invalid: La contraseña ingresada no es válida
group:
blank: Por favor, ingrese un nombre
short: "[-Inf,Inf]El nombre es demasiado corto"
long: "[-Inf,Inf]El nombre es demasiado largo"
|
e6ebfd125152fd052311e64a793adffb0e31438e | .github/ISSUE_TEMPLATE.md | .github/ISSUE_TEMPLATE.md | [Read our guide on bug reports here](./ISSUES_GUIDE.md)
**Actual Behaviour**
Explain what is currently happening.
**Expected Behaviour**
Explain what you expected (unnecessary if it's obviouxly broken, e.g. a crash).
**Would you like to work on the issue?**
Let us know if this issue should be assigned to you.
| **Note**: This repository is for the 2014-era Zulip Android app; you
probably are you using
[zulip-mobile](https://github.com/zulip/zulip-mobile) and want to
[report an issue here](https://github.com/zulip/zulip-mobile/issues).
| Rewrite issue template to discourage confused posts. | docs: Rewrite issue template to discourage confused posts.
| Markdown | apache-2.0 | zulip/zulip-android | markdown | ## Code Before:
[Read our guide on bug reports here](./ISSUES_GUIDE.md)
**Actual Behaviour**
Explain what is currently happening.
**Expected Behaviour**
Explain what you expected (unnecessary if it's obviouxly broken, e.g. a crash).
**Would you like to work on the issue?**
Let us know if this issue should be assigned to you.
## Instruction:
docs: Rewrite issue template to discourage confused posts.
## Code After:
**Note**: This repository is for the 2014-era Zulip Android app; you
probably are you using
[zulip-mobile](https://github.com/zulip/zulip-mobile) and want to
[report an issue here](https://github.com/zulip/zulip-mobile/issues).
|
9c7118524c7194b9a343f43306895cb7f90873bc | requirements.txt | requirements.txt | hgtools==6.3
keyring==5.3
astropy-helpers==1.0.2
astroquery==0.2.4
sncosmo==1.0.0
pydl==0.3.0
montage-wrapper==0.9.8
ccdproc==0.3.1
wcsaxes==0.3
gammapy==0.2
pyvo==0.0beta2
photutils==0.1
spherical-geometry==1.0.4
APLpy==1.0
specutils==0.1
imexam==0.5
gwcs==0.1.dev44
pyregion==1.1.4
astroML==0.2
ginga==2.3.20150517013817
| hgtools==6.3
keyring==5.3
beautifulsoup4==4.3.2
astropy-helpers==1.0.2
astroquery==0.2.4
sncosmo==1.0.0
pydl==0.3.0
montage-wrapper==0.9.8
ccdproc==0.3.1
wcsaxes==0.3
gammapy==0.2
pyvo==0.0beta2
photutils==0.1
spherical-geometry==1.0.4
APLpy==1.0
specutils==0.1
imexam==0.5
gwcs==0.1.dev44
pyregion==1.1.4
astroML==0.2
ginga==2.3.20150517013817
| Add beautifulsoup4 with its PyPI name to the build list | Add beautifulsoup4 with its PyPI name to the build list
| Text | bsd-3-clause | Cadair/conda-builder-affiliated,astropy/conda-build-tools,Cadair/conda-builder-affiliated,astropy/conda-build-tools,kbarbary/conda-builder-affiliated,astropy/conda-builder-affiliated,zblz/conda-builder-affiliated,bmorris3/conda-builder-affiliated,cdeil/conda-builder-affiliated,cdeil/conda-builder-affiliated,bmorris3/conda-builder-affiliated,astropy/conda-builder-affiliated,mwcraig/conda-builder-affiliated,mwcraig/conda-builder-affiliated,zblz/conda-builder-affiliated,kbarbary/conda-builder-affiliated | text | ## Code Before:
hgtools==6.3
keyring==5.3
astropy-helpers==1.0.2
astroquery==0.2.4
sncosmo==1.0.0
pydl==0.3.0
montage-wrapper==0.9.8
ccdproc==0.3.1
wcsaxes==0.3
gammapy==0.2
pyvo==0.0beta2
photutils==0.1
spherical-geometry==1.0.4
APLpy==1.0
specutils==0.1
imexam==0.5
gwcs==0.1.dev44
pyregion==1.1.4
astroML==0.2
ginga==2.3.20150517013817
## Instruction:
Add beautifulsoup4 with its PyPI name to the build list
## Code After:
hgtools==6.3
keyring==5.3
beautifulsoup4==4.3.2
astropy-helpers==1.0.2
astroquery==0.2.4
sncosmo==1.0.0
pydl==0.3.0
montage-wrapper==0.9.8
ccdproc==0.3.1
wcsaxes==0.3
gammapy==0.2
pyvo==0.0beta2
photutils==0.1
spherical-geometry==1.0.4
APLpy==1.0
specutils==0.1
imexam==0.5
gwcs==0.1.dev44
pyregion==1.1.4
astroML==0.2
ginga==2.3.20150517013817
|
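As an aside on the record above: `beautifulsoup4` is the PyPI distribution name, while the import name is `bs4` — pinned `name==version` requirement lines always use the former, which is exactly what the commit subject points out. A minimal, hypothetical Python sketch (not from any repo in this dump) of parsing such pins:

```python
# Hypothetical helper (not from any repo in this dump): parse pinned
# "name==version" requirement lines like the ones listed above.
def parse_pin(line):
    """Split 'beautifulsoup4==4.3.2' into ('beautifulsoup4', '4.3.2')."""
    name, sep, version = line.strip().partition("==")
    if not sep:
        raise ValueError("not a pinned requirement: %r" % line)
    return name, version

pins = dict(parse_pin(line) for line in [
    "beautifulsoup4==4.3.2",   # install name; the import name is bs4
    "astropy-helpers==1.0.2",
])
print(pins["beautifulsoup4"])  # -> 4.3.2
```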
36581a6dc0d8e8acfbbf0d34be7b94bf2bd7e7c1 | src/console/DatabaseResetCommand.ts | src/console/DatabaseResetCommand.ts | import { Logger } from '../core/Logger';
import * as Knex from 'knex';
import { AbstractCommand } from './lib/AbstractCommand';
import * as options from './../../knexfile';
const log = new Logger(__filename);
/**
* DatabaseResetCommand rollback all current migrations and
* then migrate to the latest one.
*
* @export
* @class DatabaseResetCommand
*/
export class DatabaseResetCommand extends AbstractCommand {
public static command = 'db:reset';
public static description = 'Reverse all current migrations and migrate to latest.';
public async run(): Promise<void> {
const knex = Knex(options);
const migrate: any = knex.migrate;
// Force unlock in case of bad state
await migrate.forceFreeMigrationsLock();
// Get completed migrations
log.info('Get completed migrations');
const completedMigrations = await migrate._listCompleted();
// Rollback migrations
log.info('Rollback migrations');
await migrate._waterfallBatch(0, completedMigrations.reverse(), 'down');
// Migrate to latest
log.info('Migrate to latest');
await migrate.latest();
// Close connection to the database
await knex.destroy();
log.info('Done');
}
}
| import { Logger } from '../core/Logger';
import * as Knex from 'knex';
import { AbstractCommand } from './lib/AbstractCommand';
import * as options from './../../knexfile';
const log = new Logger(__filename);
/**
* DatabaseResetCommand rollback all current migrations and
* then migrate to the latest one.
*
* @export
* @class DatabaseResetCommand
*/
export class DatabaseResetCommand extends AbstractCommand {
public static command = 'db:reset';
public static description = 'Reverse all current migrations and migrate to latest.';
public async run(): Promise<void> {
const knex = Knex(options as Knex.Config);
const migrate: any = knex.migrate;
// Force unlock in case of bad state
await migrate.forceFreeMigrationsLock();
// Get completed migrations
log.info('Get completed migrations');
const completedMigrations = await migrate._listCompleted();
// Rollback migrations
log.info('Rollback migrations');
await migrate._waterfallBatch(0, completedMigrations.reverse(), 'down');
// Migrate to latest
log.info('Migrate to latest');
await migrate.latest();
// Close connection to the database
await knex.destroy();
log.info('Done');
}
}
| Add type to reset database command | Add type to reset database command
| TypeScript | mit | w3tecch/express-typescript-boilerplate,w3tecch/express-typescript-boilerplate,w3tecch/express-typescript-boilerplate | typescript | ## Code Before:
import { Logger } from '../core/Logger';
import * as Knex from 'knex';
import { AbstractCommand } from './lib/AbstractCommand';
import * as options from './../../knexfile';
const log = new Logger(__filename);
/**
* DatabaseResetCommand rollback all current migrations and
* then migrate to the latest one.
*
* @export
* @class DatabaseResetCommand
*/
export class DatabaseResetCommand extends AbstractCommand {
public static command = 'db:reset';
public static description = 'Reverse all current migrations and migrate to latest.';
public async run(): Promise<void> {
const knex = Knex(options);
const migrate: any = knex.migrate;
// Force unlock in case of bad state
await migrate.forceFreeMigrationsLock();
// Get completed migrations
log.info('Get completed migrations');
const completedMigrations = await migrate._listCompleted();
// Rollback migrations
log.info('Rollback migrations');
await migrate._waterfallBatch(0, completedMigrations.reverse(), 'down');
// Migrate to latest
log.info('Migrate to latest');
await migrate.latest();
// Close connection to the database
await knex.destroy();
log.info('Done');
}
}
## Instruction:
Add type to reset database command
## Code After:
import { Logger } from '../core/Logger';
import * as Knex from 'knex';
import { AbstractCommand } from './lib/AbstractCommand';
import * as options from './../../knexfile';
const log = new Logger(__filename);
/**
* DatabaseResetCommand rollback all current migrations and
* then migrate to the latest one.
*
* @export
* @class DatabaseResetCommand
*/
export class DatabaseResetCommand extends AbstractCommand {
public static command = 'db:reset';
public static description = 'Reverse all current migrations and migrate to latest.';
public async run(): Promise<void> {
const knex = Knex(options as Knex.Config);
const migrate: any = knex.migrate;
// Force unlock in case of bad state
await migrate.forceFreeMigrationsLock();
// Get completed migrations
log.info('Get completed migrations');
const completedMigrations = await migrate._listCompleted();
// Rollback migrations
log.info('Rollback migrations');
await migrate._waterfallBatch(0, completedMigrations.reverse(), 'down');
// Migrate to latest
log.info('Migrate to latest');
await migrate.latest();
// Close connection to the database
await knex.destroy();
log.info('Done');
}
}
|
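The fix in the record above silences a typing mismatch by asserting the untyped `knexfile` import to `Knex.Config` with `as`. A standalone sketch of the same `as`-assertion idea, using a made-up `AppConfig` interface rather than Knex's actual types:

```typescript
// Illustrative only: the "as" assertion idea, with a made-up AppConfig
// shape instead of Knex's real Config type.
interface AppConfig {
  client: string;
  connection: { filename: string };
}

// Stand-in for an untyped import (e.g. a plain-JS knexfile).
const raw: unknown = JSON.parse(
  '{"client":"sqlite3","connection":{"filename":"./dev.sqlite3"}}',
);

// A type assertion is a compile-time claim, not a runtime check:
// we tell the compiler the shape, and keeping it true is on us.
const config = raw as AppConfig;

console.log(config.client); // -> sqlite3
```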
210370997a9ae3f0a30d3cbccf3ec017e181db71 | autogen.sh | autogen.sh |
aclocal && \
libtoolize --automake --force && \
automake --foreign --add-missing --force && \
autoconf --force
|
aclocal && \
${LIBTOOLIZE:-libtoolize} --automake --force && \
automake --foreign --add-missing --force && \
autoconf --force
| Allow libtoolize to be specified via environment variable | Allow libtoolize to be specified via environment variable
On OS X, libtoolize is known as glibtoolize. Allow this to be
specified via an environment variable so that autogen.sh need
not be modified to build.
Signed-off-by: Jay Soffian <eccf10a4397cf08f0034a6cb5e4457bdd5e3783e@gmail.com>
Signed-off-by: David Woodhouse <b460d66aaf00c296a3db1c1d9eeafc081d5f7d70@intel.com>
| Shell | lgpl-2.1 | neshema/openconnect,chadcatlett/openconnect,mtmiller/openconnect,brandt/openconnect,don-johnny/openconnect,don-johnny/openconnect,mveplus/openconnect-client,mveplus/openconnect-client,lins05/openconnect,matlockx/openconnect,nmav/openconnect-mine,nmav/openconnect-mine,atyndall/openconnect,atyndall/openconnect,mtmiller/openconnect,mtmiller/openconnect,atyndall/openconnect,cernekee/openconnect,cernekee/openconnect,chadcatlett/openconnect,lins05/openconnect,chadcatlett/openconnect,atyndall/openconnect,neshema/openconnect,lins05/openconnect,cernekee/openconnect,brandt/openconnect,nmav/openconnect-mine,brandt/openconnect,mveplus/openconnect-client,neshema/openconnect,matlockx/openconnect,cernekee/openconnect,mtmiller/openconnect,brandt/openconnect,liiir1985/openconnect,neshema/openconnect,liiir1985/openconnect,matlockx/openconnect,liiir1985/openconnect,liiir1985/openconnect,chadcatlett/openconnect,matlockx/openconnect,nmav/openconnect-mine,mveplus/openconnect-client,don-johnny/openconnect,lins05/openconnect,don-johnny/openconnect | shell | ## Code Before:
aclocal && \
libtoolize --automake --force && \
automake --foreign --add-missing --force && \
autoconf --force
## Instruction:
Allow libtoolize to be specified via environment variable
On OS X, libtoolize is known as glibtoolize. Allow this to be
specified via an environment variable so that autogen.sh need
not be modified to build.
Signed-off-by: Jay Soffian <eccf10a4397cf08f0034a6cb5e4457bdd5e3783e@gmail.com>
Signed-off-by: David Woodhouse <b460d66aaf00c296a3db1c1d9eeafc081d5f7d70@intel.com>
## Code After:
aclocal && \
${LIBTOOLIZE:-libtoolize} --automake --force && \
automake --foreign --add-missing --force && \
autoconf --force
|
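The change above relies on POSIX parameter expansion: `${LIBTOOLIZE:-libtoolize}` expands to `$LIBTOOLIZE` when that variable is set and non-empty, and to the literal `libtoolize` otherwise — which is how a macOS user can point the script at `glibtoolize` without editing it. A minimal demonstration:

```shell
# Demo of ${VAR:-default}: the env var wins when set, else the default.
unset LIBTOOLIZE
first="${LIBTOOLIZE:-libtoolize}"     # unset -> "libtoolize"
LIBTOOLIZE=glibtoolize                # what a macOS user might export
second="${LIBTOOLIZE:-libtoolize}"    # set -> "glibtoolize"
echo "$first $second"                 # -> libtoolize glibtoolize
```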
fc93261c1551b09932f8e24a705698a416d0e63d | test/Driver/linker-args-order-linux.swift | test/Driver/linker-args-order-linux.swift | // Statically link a "hello world" program
// REQUIRES: OS=linux-gnu
// REQUIRES: static_stdlib
print("hello world!")
// RUN: %empty-directory(%t)
// RUN: %target-swiftc_driver -v -static-stdlib -o %t/static-stdlib %s -Xlinker --no-allow-multiple-definition 2>&1| %FileCheck %s
// CHECK: Swift version
// CHECK: Target: x86_64-unknown-linux-gnu
// CHECK: {{.*}}/swift -frontend -c -primary-file {{.*}}/linker-args-order-linux.swift -target x86_64-unknown-linux-gnu -disable-objc-interop
// CHECK: {{.*}}/swift-autolink-extract{{.*}}
// CHECK: {{.*}}swift_begin.o /{{.*}}/linker-args-order-linux-{{[a-z0-9]+}}.o @/{{.*}}/linker-args-order-linux-{{[a-z0-9]+}}.autolink {{.*}} @{{.*}}/static-stdlib-args.lnk {{.*}} -Xlinker --no-allow-multiple-definition {{.*}}/swift_end.o
| // Statically link a "hello world" program
// REQUIRES: static_stdlib
print("hello world!")
// RUN: %empty-directory(%t)
// RUN: %target-swiftc_driver -driver-print-jobs -static-stdlib -o %t/static-stdlib %s -Xlinker --no-allow-multiple-definition 2>&1| %FileCheck %s
// CHECK: {{.*}}/swift -frontend -c -primary-file {{.*}}/linker-args-order-linux.swift -target x86_64-unknown-linux-gnu -disable-objc-interop
// CHECK: {{.*}}/swift-autolink-extract{{.*}}
// CHECK: {{.*}}swift_begin.o /{{.*}}/linker-args-order-linux-{{[a-z0-9]+}}.o @/{{.*}}/linker-args-order-linux-{{[a-z0-9]+}}.autolink {{.*}} @{{.*}}/static-stdlib-args.lnk {{.*}} -Xlinker --no-allow-multiple-definition {{.*}}/swift_end.o
| Update the linker -Xlinker argument reordering test. | Update the linker -Xlinker argument reordering test.
- Use -driver-print-jobs to avoid actually building the test file.
| Swift | apache-2.0 | apple/swift,zisko/swift,nathawes/swift,atrick/swift,rudkx/swift,harlanhaskins/swift,milseman/swift,milseman/swift,milseman/swift,ahoppen/swift,roambotics/swift,danielmartin/swift,austinzheng/swift,hooman/swift,devincoughlin/swift,stephentyrone/swift,glessard/swift,danielmartin/swift,alblue/swift,brentdax/swift,tkremenek/swift,deyton/swift,jckarter/swift,natecook1000/swift,lorentey/swift,jmgc/swift,gribozavr/swift,stephentyrone/swift,lorentey/swift,uasys/swift,atrick/swift,ahoppen/swift,lorentey/swift,austinzheng/swift,nathawes/swift,harlanhaskins/swift,danielmartin/swift,JGiola/swift,amraboelela/swift,practicalswift/swift,shajrawi/swift,amraboelela/swift,tkremenek/swift,milseman/swift,airspeedswift/swift,shahmishal/swift,benlangmuir/swift,ahoppen/swift,shahmishal/swift,airspeedswift/swift,benlangmuir/swift,danielmartin/swift,karwa/swift,gribozavr/swift,xwu/swift,xedin/swift,sschiau/swift,jmgc/swift,CodaFi/swift,amraboelela/swift,return/swift,tjw/swift,karwa/swift,jopamer/swift,shahmishal/swift,CodaFi/swift,roambotics/swift,nathawes/swift,shajrawi/swift,xwu/swift,parkera/swift,lorentey/swift,frootloops/swift,hooman/swift,JGiola/swift,glessard/swift,airspeedswift/swift,gregomni/swift,swiftix/swift,huonw/swift,deyton/swift,gregomni/swift,jckarter/swift,alblue/swift,return/swift,shahmishal/swift,gribozavr/swift,aschwaighofer/swift,return/swift,shahmishal/swift,glessard/swift,danielmartin/swift,huonw/swift,austinzheng/swift,zisko/swift,xwu/swift,shahmishal/swift,practicalswift/swift,deyton/swift,tkremenek/swift,airspeedswift/swift,shajrawi/swift,parkera/swift,glessard/swift,practicalswift/swift,CodaFi/swift,stephentyrone/swift,uasys/swift,harlanhaskins/swift,tjw/swift,brentdax/swift,CodaFi/swift,danielmartin/swift,nathawes/swift,frootloops/swift,zisko/swift,tjw/swift,stephentyrone/swift,sschiau/swift,swiftix/swift,sschiau/swift,hooman/swift,jmgc/swift,jckarter/swift,tjw/swift,parkera/swift,tkremenek/swift,brentdax/swift,return/swift,jopamer/swift,h
arlanhaskins/swift,jckarter/swift,parkera/swift,atrick/swift,uasys/swift,OscarSwanros/swift,shajrawi/swift,return/swift,alblue/swift,alblue/swift,amraboelela/swift,roambotics/swift,karwa/swift,jmgc/swift,tkremenek/swift,natecook1000/swift,shajrawi/swift,roambotics/swift,devincoughlin/swift,JGiola/swift,jmgc/swift,harlanhaskins/swift,deyton/swift,shahmishal/swift,benlangmuir/swift,ahoppen/swift,apple/swift,glessard/swift,allevato/swift,hooman/swift,xedin/swift,allevato/swift,jopamer/swift,gregomni/swift,OscarSwanros/swift,shajrawi/swift,tjw/swift,tkremenek/swift,benlangmuir/swift,rudkx/swift,austinzheng/swift,JGiola/swift,huonw/swift,lorentey/swift,brentdax/swift,karwa/swift,tjw/swift,jopamer/swift,xwu/swift,natecook1000/swift,OscarSwanros/swift,amraboelela/swift,hooman/swift,return/swift,devincoughlin/swift,allevato/swift,frootloops/swift,JGiola/swift,apple/swift,gribozavr/swift,jopamer/swift,benlangmuir/swift,karwa/swift,swiftix/swift,nathawes/swift,gregomni/swift,gregomni/swift,apple/swift,natecook1000/swift,xwu/swift,lorentey/swift,xedin/swift,huonw/swift,gribozavr/swift,devincoughlin/swift,roambotics/swift,huonw/swift,uasys/swift,OscarSwanros/swift,aschwaighofer/swift,benlangmuir/swift,practicalswift/swift,austinzheng/swift,gribozavr/swift,nathawes/swift,uasys/swift,airspeedswift/swift,sschiau/swift,harlanhaskins/swift,swiftix/swift,deyton/swift,devincoughlin/swift,rudkx/swift,natecook1000/swift,aschwaighofer/swift,devincoughlin/swift,nathawes/swift,parkera/swift,deyton/swift,OscarSwanros/swift,rudkx/swift,huonw/swift,jckarter/swift,practicalswift/swift,devincoughlin/swift,alblue/swift,gregomni/swift,aschwaighofer/swift,natecook1000/swift,aschwaighofer/swift,stephentyrone/swift,jckarter/swift,ahoppen/swift,zisko/swift,jopamer/swift,xedin/swift,OscarSwanros/swift,karwa/swift,zisko/swift,glessard/swift,zisko/swift,brentdax/swift,frootloops/swift,rudkx/swift,stephentyrone/swift,milseman/swift,xedin/swift,sschiau/swift,atrick/swift,airspeedswift/swift,JGiola/swift,d
evincoughlin/swift,allevato/swift,jckarter/swift,rudkx/swift,milseman/swift,frootloops/swift,danielmartin/swift,aschwaighofer/swift,practicalswift/swift,hooman/swift,tkremenek/swift,apple/swift,sschiau/swift,shajrawi/swift,sschiau/swift,brentdax/swift,austinzheng/swift,frootloops/swift,lorentey/swift,huonw/swift,parkera/swift,brentdax/swift,karwa/swift,jopamer/swift,gribozavr/swift,swiftix/swift,practicalswift/swift,uasys/swift,amraboelela/swift,swiftix/swift,uasys/swift,stephentyrone/swift,airspeedswift/swift,karwa/swift,natecook1000/swift,sschiau/swift,return/swift,harlanhaskins/swift,roambotics/swift,shahmishal/swift,swiftix/swift,zisko/swift,xwu/swift,lorentey/swift,milseman/swift,practicalswift/swift,jmgc/swift,xedin/swift,shajrawi/swift,OscarSwanros/swift,xwu/swift,allevato/swift,xedin/swift,austinzheng/swift,atrick/swift,hooman/swift,amraboelela/swift,frootloops/swift,parkera/swift,jmgc/swift,alblue/swift,alblue/swift,aschwaighofer/swift,deyton/swift,ahoppen/swift,allevato/swift,CodaFi/swift,allevato/swift,parkera/swift,xedin/swift,CodaFi/swift,atrick/swift,apple/swift,CodaFi/swift,tjw/swift,gribozavr/swift | swift | ## Code Before:
// Statically link a "hello world" program
// REQUIRES: OS=linux-gnu
// REQUIRES: static_stdlib
print("hello world!")
// RUN: %empty-directory(%t)
// RUN: %target-swiftc_driver -v -static-stdlib -o %t/static-stdlib %s -Xlinker --no-allow-multiple-definition 2>&1| %FileCheck %s
// CHECK: Swift version
// CHECK: Target: x86_64-unknown-linux-gnu
// CHECK: {{.*}}/swift -frontend -c -primary-file {{.*}}/linker-args-order-linux.swift -target x86_64-unknown-linux-gnu -disable-objc-interop
// CHECK: {{.*}}/swift-autolink-extract{{.*}}
// CHECK: {{.*}}swift_begin.o /{{.*}}/linker-args-order-linux-{{[a-z0-9]+}}.o @/{{.*}}/linker-args-order-linux-{{[a-z0-9]+}}.autolink {{.*}} @{{.*}}/static-stdlib-args.lnk {{.*}} -Xlinker --no-allow-multiple-definition {{.*}}/swift_end.o
## Instruction:
Update the linker -Xlinker argument reordering test.
- Use -driver-print-jobs to avoid actually building the test file.
## Code After:
// Statically link a "hello world" program
// REQUIRES: static_stdlib
print("hello world!")
// RUN: %empty-directory(%t)
// RUN: %target-swiftc_driver -driver-print-jobs -static-stdlib -o %t/static-stdlib %s -Xlinker --no-allow-multiple-definition 2>&1| %FileCheck %s
// CHECK: {{.*}}/swift -frontend -c -primary-file {{.*}}/linker-args-order-linux.swift -target x86_64-unknown-linux-gnu -disable-objc-interop
// CHECK: {{.*}}/swift-autolink-extract{{.*}}
// CHECK: {{.*}}swift_begin.o /{{.*}}/linker-args-order-linux-{{[a-z0-9]+}}.o @/{{.*}}/linker-args-order-linux-{{[a-z0-9]+}}.autolink {{.*}} @{{.*}}/static-stdlib-args.lnk {{.*}} -Xlinker --no-allow-multiple-definition {{.*}}/swift_end.o
|
2546e8b169378c698fc81b20030a899693a27363 | lib/podio/areas/oauth_client.rb | lib/podio/areas/oauth_client.rb | module Podio
module OAuthClient
include Podio::ResponseWrapper
extend self
def update(id, attributes)
response = Podio.connection.put do |req|
req.url "/oauth/client/#{id}"
req.body = attributes
end
response.status
end
def delete(id)
response = Podio.connection.delete("/oauth/client/#{id}")
response.status
end
response.status
end
def find_all_for_user(user_id)
list Podio.connection.get("oauth/client/user/#{user_id}/").body
end
def find(client_id)
member Podio.connection.get("oauth/client/#{client_id}").body
end
end
end | module Podio
module OAuthClient
include Podio::ResponseWrapper
extend self
def update(id, attributes)
puts attributes
response = Podio.connection.put do |req|
req.url "/oauth/client/#{id}"
req.body = attributes
end
response.status
end
def delete(id)
response = Podio.connection.delete("/oauth/client/#{id}")
response.status
end
def find_all_for_user(user_id)
list Podio.connection.get("oauth/client/user/#{user_id}/").body
end
def find(client_id)
member Podio.connection.get("oauth/client/#{client_id}").body
end
end
end | Fix bad syntax for oauth client | Fix bad syntax for oauth client
| Ruby | mit | podio/podio-rb,cocktail-io/podio-rb,jutonz/podio-rb,pjmuller/podio-rb | ruby | ## Code Before:
module Podio
module OAuthClient
include Podio::ResponseWrapper
extend self
def update(id, attributes)
response = Podio.connection.put do |req|
req.url "/oauth/client/#{id}"
req.body = attributes
end
response.status
end
def delete(id)
response = Podio.connection.delete("/oauth/client/#{id}")
response.status
end
response.status
end
def find_all_for_user(user_id)
list Podio.connection.get("oauth/client/user/#{user_id}/").body
end
def find(client_id)
member Podio.connection.get("oauth/client/#{client_id}").body
end
end
end
## Instruction:
Fix bad syntax for oauth client
## Code After:
module Podio
module OAuthClient
include Podio::ResponseWrapper
extend self
def update(id, attributes)
puts attributes
response = Podio.connection.put do |req|
req.url "/oauth/client/#{id}"
req.body = attributes
end
response.status
end
def delete(id)
response = Podio.connection.delete("/oauth/client/#{id}")
response.status
end
def find_all_for_user(user_id)
list Podio.connection.get("oauth/client/user/#{user_id}/").body
end
def find(client_id)
member Podio.connection.get("oauth/client/#{client_id}").body
end
end
end |
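The `Podio::OAuthClient` module in the record above uses the `extend self` idiom: the module is mixed into itself, so methods written as instance methods become callable directly on the module without any instantiation. A tiny sketch with made-up names:

```ruby
# Sketch of the `extend self` idiom (made-up names): instance methods
# become module-level methods because the module is mixed into itself.
module Pinger
  extend self

  def ping(host)
    "pong from #{host}"
  end
end

puts Pinger.ping("example.test")  # -> pong from example.test
```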
2376316b36f4d6bdbdb82bea519b6296763bb2f2 | MORK/ORKTaskResult+MORK.h | MORK/ORKTaskResult+MORK.h | //
// ORKCollectionResult+MORK.h
// MORK
//
// Created by Nolan Carroll on 4/23/15.
// Copyright (c) 2015 Medidata Solutions. All rights reserved.
//
#import "ORKResult.h"
@interface ORKTaskResult (MORK)
@property (readonly) NSArray *mork_fieldDataFromResults;
@end
| //
// ORKCollectionResult+MORK.h
// MORK
//
// Created by Nolan Carroll on 4/23/15.
// Copyright (c) 2015 Medidata Solutions. All rights reserved.
//
#import "ORKResult.h"
@interface ORKTaskResult (MORK)
- (NSArray *)mork_getFieldDataFromResults;
@end
| Put method declaration back into category header | Put method declaration back into category header
| C | mit | mdsol/MORK,mdsol/MORK | c | ## Code Before:
//
// ORKCollectionResult+MORK.h
// MORK
//
// Created by Nolan Carroll on 4/23/15.
// Copyright (c) 2015 Medidata Solutions. All rights reserved.
//
#import "ORKResult.h"
@interface ORKTaskResult (MORK)
@property (readonly) NSArray *mork_fieldDataFromResults;
@end
## Instruction:
Put method declaration back into category header
## Code After:
//
// ORKCollectionResult+MORK.h
// MORK
//
// Created by Nolan Carroll on 4/23/15.
// Copyright (c) 2015 Medidata Solutions. All rights reserved.
//
#import "ORKResult.h"
@interface ORKTaskResult (MORK)
- (NSArray *)mork_getFieldDataFromResults;
@end
|
45b38fc0ba8c2d64fb6ac869c0cc6438d865439f | responsive-design-lab/app/styles/main.css | responsive-design-lab/app/styles/main.css | /* main.css */
/* Add Pregressive Enhancement CSS if Flexbox not supported */
.no-flexbox .container {
background: #eee;
overflow: auto;
}
.no-flexbox .container .col {
width: 27%;
padding: 30px 3.15% 0;
float: left;
}
@media screen and (max-width: 48rem) {
.no-flexbox .container .col {
width: 95%;
}
}
/* Add Pregressive Enhancement CSS if Flexbox not supported */
.container {
display: -webkit-box; /* OLD - iOS 6-, Safari 3.1-6 */
display: -ms-flexbox; /* TWEENER - IE 10 */
display: flex; /* NEW, Spec - Firefox, Chrome, Opera */
background: #eee;
overflow: auto;
}
.container .col {
flex: 1;
padding: 1rem;
}
@media screen and (max-width: 48rem) {
.container {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
flex-flow: column;
}
}
| /* main.css */
/* Add Pregressive Enhancement CSS if Flexbox not supported */
.no-flexbox .container {
background: #eee;
overflow: auto;
}
.no-flexbox .container .col {
width: 27%;
padding: 30px 3.15% 0;
float: left;
}
@media screen and (max-width: 48rem) {
.no-flexbox .container .col {
width: 95%;
}
}
/* Add Pregressive Enhancement CSS if Flexbox not supported */
/* Add Pregressive Enhancement CSS if Flexbox is supported */
.flexbox .container {
display: -webkit-box; /* OLD - iOS 6-, Safari 3.1-6 */
display: -ms-flexbox; /* TWEENER - IE 10 */
display: flex; /* NEW, Spec - Firefox, Chrome, Opera */
background: #eee;
overflow: auto;
}
.flexbox .container .col {
flex: 1;
padding: 1rem;
}
@media screen and (max-width: 48rem) {
.flexbox .container {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
flex-flow: column;
}
}
/* Add Pregressive Enhancement CSS if Flexbox is supported */
| Add Pregressive Enhancement CSS if Flexbox is supported | Add Pregressive Enhancement CSS if Flexbox is supported
| CSS | apache-2.0 | verioussmith/pwa-training-labs,verioussmith/pwa-training-labs | css | ## Code Before:
/* main.css */
/* Add Pregressive Enhancement CSS if Flexbox not supported */
.no-flexbox .container {
background: #eee;
overflow: auto;
}
.no-flexbox .container .col {
width: 27%;
padding: 30px 3.15% 0;
float: left;
}
@media screen and (max-width: 48rem) {
.no-flexbox .container .col {
width: 95%;
}
}
/* Add Pregressive Enhancement CSS if Flexbox not supported */
.container {
display: -webkit-box; /* OLD - iOS 6-, Safari 3.1-6 */
display: -ms-flexbox; /* TWEENER - IE 10 */
display: flex; /* NEW, Spec - Firefox, Chrome, Opera */
background: #eee;
overflow: auto;
}
.container .col {
flex: 1;
padding: 1rem;
}
@media screen and (max-width: 48rem) {
.container {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
flex-flow: column;
}
}
## Instruction:
Add Pregressive Enhancement CSS if Flexbox is supported
## Code After:
/* main.css */
/* Add Pregressive Enhancement CSS if Flexbox not supported */
.no-flexbox .container {
background: #eee;
overflow: auto;
}
.no-flexbox .container .col {
width: 27%;
padding: 30px 3.15% 0;
float: left;
}
@media screen and (max-width: 48rem) {
.no-flexbox .container .col {
width: 95%;
}
}
/* Add Pregressive Enhancement CSS if Flexbox not supported */
/* Add Pregressive Enhancement CSS if Flexbox is supported */
.flexbox .container {
display: -webkit-box; /* OLD - iOS 6-, Safari 3.1-6 */
display: -ms-flexbox; /* TWEENER - IE 10 */
display: flex; /* NEW, Spec - Firefox, Chrome, Opera */
background: #eee;
overflow: auto;
}
.flexbox .container .col {
flex: 1;
padding: 1rem;
}
@media screen and (max-width: 48rem) {
.flexbox .container {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
flex-flow: column;
}
}
/* Add Pregressive Enhancement CSS if Flexbox is supported */
|
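The stylesheet in the record above keys off `.flexbox` / `.no-flexbox` ancestor classes — the convention Modernizr uses when it annotates `<html>` after feature detection. The detection itself is outside the record; a small illustrative sketch of the class-choosing step:

```javascript
// Illustrative sketch: picking the Modernizr-style root class the
// stylesheet keys off. In a browser the flag would come from real
// detection, e.g. CSS.supports('display', 'flex'), and be applied with
// document.documentElement.classList.add(...).
function flexClass(supportsFlexbox) {
  return supportsFlexbox ? 'flexbox' : 'no-flexbox';
}

console.log(flexClass(true));   // -> flexbox
console.log(flexClass(false));  // -> no-flexbox
```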
0fa19ad5a0a79c76dbb8deb674bb5cbfc86d5c8c | Scripts/script-SplunkSearch.yml | Scripts/script-SplunkSearch.yml | commonfields:
id: SplunkSearch
version: 1
name: SplunkSearch
script: |-
var rows = args.rows ? args.rows : 30;
var query = (args.query.indexOf('|') > 0) ? args.query : args.query + ' | head ' + rows;
var res = executeCommand('search', {'using-brand': 'splunk', query: query});
var table = {
Type: 1,
ContentsFormat: 'table',
Contents: []
};
for (var i=0; i<res[0].Contents.length; i++) {
data = res[0].Contents[i].result['_raw'];
table.Contents.push({Time: res[0].Contents[i].result['_time'], Host: res[0].Contents[i].result['host'], Source: res[0].Contents[i].result['source'], Data: data});
}
return table;
type: javascript
tags:
- enhancement
- splunk
comment: Run a query through Splunk and format the results as a table
system: true
args:
- name: query
required: true
default: true
description: Splunk query to execute
- name: rows
description: Return up to X first rows. If omitted, defaults to 30.
scripttarget: 0
dependson:
must:
- search
timeout: 0s
| commonfields:
id: SplunkSearch
version: 1
name: SplunkSearch
script: |-
var rows = args.rows ? args.rows : 30;
var query = (args.query.indexOf('|') > 0) ? args.query : args.query + ' | head ' + rows;
var res = executeCommand('search', {'using-brand': 'splunk', query: query});
var table = {
Type: 1,
ContentsFormat: 'table',
Contents: []
};
for (var i=0; i<res[0].Contents.length; i++) {
data = res[0].Contents[i].result['_raw'];
table.Contents.push({Time: res[0].Contents[i].result['_time'], Host: res[0].Contents[i].result['host'], Source: res[0].Contents[i].result['source'], Data: data});
}
if (table.Contents.length > 0) {
return table;
} else {
return 'No results.'
}
type: javascript
tags:
- enhancement
- splunk
comment: Run a query through Splunk and format the results as a table
system: true
args:
- name: query
required: true
default: true
description: Splunk query to execute
- name: rows
description: Return up to X first rows. If omitted, defaults to 30.
scripttarget: 0
dependson:
must:
- search
timeout: 0s
| Change SplunkSearch response when no results | Change SplunkSearch response when no results
| YAML | mit | VirusTotal/content,demisto/content,VirusTotal/content,demisto/content,VirusTotal/content,demisto/content,VirusTotal/content,demisto/content | yaml | ## Code Before:
commonfields:
id: SplunkSearch
version: 1
name: SplunkSearch
script: |-
var rows = args.rows ? args.rows : 30;
var query = (args.query.indexOf('|') > 0) ? args.query : args.query + ' | head ' + rows;
var res = executeCommand('search', {'using-brand': 'splunk', query: query});
var table = {
Type: 1,
ContentsFormat: 'table',
Contents: []
};
for (var i=0; i<res[0].Contents.length; i++) {
data = res[0].Contents[i].result['_raw'];
table.Contents.push({Time: res[0].Contents[i].result['_time'], Host: res[0].Contents[i].result['host'], Source: res[0].Contents[i].result['source'], Data: data});
}
return table;
type: javascript
tags:
- enhancement
- splunk
comment: Run a query through Splunk and format the results as a table
system: true
args:
- name: query
required: true
default: true
description: Splunk query to execute
- name: rows
description: Return up to X first rows. If omitted, defaults to 30.
scripttarget: 0
dependson:
must:
- search
timeout: 0s
## Instruction:
Change SplunkSearch response when no results
## Code After:
commonfields:
id: SplunkSearch
version: 1
name: SplunkSearch
script: |-
var rows = args.rows ? args.rows : 30;
var query = (args.query.indexOf('|') > 0) ? args.query : args.query + ' | head ' + rows;
var res = executeCommand('search', {'using-brand': 'splunk', query: query});
var table = {
Type: 1,
ContentsFormat: 'table',
Contents: []
};
for (var i=0; i<res[0].Contents.length; i++) {
data = res[0].Contents[i].result['_raw'];
table.Contents.push({Time: res[0].Contents[i].result['_time'], Host: res[0].Contents[i].result['host'], Source: res[0].Contents[i].result['source'], Data: data});
}
if (table.Contents.length > 0) {
return table;
} else {
return 'No results.'
}
type: javascript
tags:
- enhancement
- splunk
comment: Run a query through Splunk and format the results as a table
system: true
args:
- name: query
required: true
default: true
description: Splunk query to execute
- name: rows
description: Return up to X first rows. If omitted, defaults to 30.
scripttarget: 0
dependson:
must:
- search
timeout: 0s
|
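The fix in the Splunk record above returns the table only when it actually has rows, and the plain string `'No results.'` otherwise. A standalone sketch of that guard, with plain data instead of the Demisto `executeCommand` API:

```javascript
// Standalone sketch of the empty-result guard added above (plain data,
// no Demisto/Splunk APIs).
function formatResults(rows) {
  var table = { Type: 1, ContentsFormat: 'table', Contents: [] };
  for (var i = 0; i < rows.length; i++) {
    table.Contents.push({ Time: rows[i]._time, Host: rows[i].host, Data: rows[i]._raw });
  }
  return table.Contents.length > 0 ? table : 'No results.';
}

console.log(formatResults([]));  // -> No results.
```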
5ba8294d890feef82af4aa01caaeaf648cfa5bab | contributor_docs/navbar.md | contributor_docs/navbar.md | <!-- _navbar.md -->
* [En](/)
* [Es](/es/)
* [Pt-Br](/pt-br/)
| <!-- _navbar.md -->
* [En](/)
* [Es](/es/)
* [Pt-Br](/pt-br/)
* [Ko](/ko/)
* [Sk](/sk/)
* [Zh](/zh/)
* [Hi](/hi/)
| Add links to new translations contrib-docs | Add links to new translations contrib-docs | Markdown | lgpl-2.1 | limzykenneth/p5.js,limzykenneth/p5.js,mlarghydracept/p5.js,processing/p5.js,processing/p5.js,dhowe/p5.js,mlarghydracept/p5.js,dhowe/p5.js | markdown | ## Code Before:
<!-- _navbar.md -->
* [En](/)
* [Es](/es/)
* [Pt-Br](/pt-br/)
## Instruction:
Add links to new translations contrib-docs
## Code After:
<!-- _navbar.md -->
* [En](/)
* [Es](/es/)
* [Pt-Br](/pt-br/)
* [Ko](/ko/)
* [Sk](/sk/)
* [Zh](/zh/)
* [Hi](/hi/)
|
06d8408172e37a2e951f7d6251c04ea95c721eab | .travis.yml | .travis.yml | language: objective-c
cache: cocoapods
#before_install:
#- pod repo --silent update
xcode_workspace: Swindler.xcworkspace
xcode_scheme: Swindler
script: xcodebuild -workspace Swindler.xcworkspace -scheme Swindler build test
| language: objective-c
cache: cocoapods
install: pod install || pod install --repo-update
xcode_workspace: Swindler.xcworkspace
xcode_scheme: Swindler
script: xcodebuild -workspace Swindler.xcworkspace -scheme Swindler build test
| Update repo if initial pod install fails | Travis: Update repo if initial pod install fails
| YAML | mit | tmandry/Swindler,tmandry/Swindler,tmandry/Swindler | yaml | ## Code Before:
language: objective-c
cache: cocoapods
#before_install:
#- pod repo --silent update
xcode_workspace: Swindler.xcworkspace
xcode_scheme: Swindler
script: xcodebuild -workspace Swindler.xcworkspace -scheme Swindler build test
## Instruction:
Travis: Update repo if initial pod install fails
## Code After:
language: objective-c
cache: cocoapods
install: pod install || pod install --repo-update
xcode_workspace: Swindler.xcworkspace
xcode_scheme: Swindler
script: xcodebuild -workspace Swindler.xcworkspace -scheme Swindler build test
|
91beee19bd8fa5d0f9d1f3179847a44caa72b63f | composer.json | composer.json | {
"authors": [
{
"name": "Khalifah Shabazz",
"homepage": "http://www.kshabazz.net",
"role": "Developer"
}
],
"autoload": {
"psr-4": {
"Kshabazz\\BattleNet\\D3\\": "src/"
}
},
"autoload-dev": {
"psr-4": {
"Kshabazz\\Tests\\BattleNet\\D3\\": "tests/D3/"
}
},
"description": "A PHP library for accessing Battle.net D3 API.",
"keywords": ["diablo 3"],
"license": "MIT",
"name": "kshabazz/battlenet-d3",
"repositories": [
{
"type": "vcs",
"url": "https://github.com/b01/slib"
},
{
"type": "vcs",
"url": "https://github.com/b01/interception"
}
],
"require": {
"php": "~5.6",
"kshabazz/slib": "~1.1"
},
"require-dev": {
"phpunit/phpunit": "~4.1",
"kshabazz/interception": "dev-master"
},
"type": "library"
} | {
"authors": [
{
"name": "Khalifah Shabazz",
"homepage": "http://www.kshabazz.net",
"role": "Developer"
}
],
"autoload": {
"psr-4": {
"Kshabazz\\BattleNet\\D3\\": "src/"
}
},
"autoload-dev": {
"psr-4": {
"Kshabazz\\Tests\\BattleNet\\D3\\": "tests/D3/"
}
},
"description": "A PHP library for accessing Battle.net D3 API.",
"keywords": ["diablo 3"],
"license": "MIT",
"name": "kshabazz/battlenet-d3",
"repositories": [
{
"type": "vcs",
"url": "https://github.com/b01/slib"
},
{
"type": "vcs",
"url": "https://github.com/b01/interception"
}
],
"require": {
"php": "~5.6",
"kshabazz/slib": "~1.1"
},
"require-dev": {
"phpunit/phpunit": "~4.1",
"kshabazz/interception": "dev-master",
"sphpdox/sphpdox": "dev-master"
},
"type": "library"
} | Add Sphpdox library for converting docblocks to documentation. | Add Sphpdox library for converting docblocks to documentation.
| JSON | mit | b01/d3 | json | ## Code Before:
{
"authors": [
{
"name": "Khalifah Shabazz",
"homepage": "http://www.kshabazz.net",
"role": "Developer"
}
],
"autoload": {
"psr-4": {
"Kshabazz\\BattleNet\\D3\\": "src/"
}
},
"autoload-dev": {
"psr-4": {
"Kshabazz\\Tests\\BattleNet\\D3\\": "tests/D3/"
}
},
"description": "A PHP library for accessing Battle.net D3 API.",
"keywords": ["diablo 3"],
"license": "MIT",
"name": "kshabazz/battlenet-d3",
"repositories": [
{
"type": "vcs",
"url": "https://github.com/b01/slib"
},
{
"type": "vcs",
"url": "https://github.com/b01/interception"
}
],
"require": {
"php": "~5.6",
"kshabazz/slib": "~1.1"
},
"require-dev": {
"phpunit/phpunit": "~4.1",
"kshabazz/interception": "dev-master"
},
"type": "library"
}
## Instruction:
Add Sphpdox library for converting docblocks to documentation.
## Code After:
{
"authors": [
{
"name": "Khalifah Shabazz",
"homepage": "http://www.kshabazz.net",
"role": "Developer"
}
],
"autoload": {
"psr-4": {
"Kshabazz\\BattleNet\\D3\\": "src/"
}
},
"autoload-dev": {
"psr-4": {
"Kshabazz\\Tests\\BattleNet\\D3\\": "tests/D3/"
}
},
"description": "A PHP library for accessing Battle.net D3 API.",
"keywords": ["diablo 3"],
"license": "MIT",
"name": "kshabazz/battlenet-d3",
"repositories": [
{
"type": "vcs",
"url": "https://github.com/b01/slib"
},
{
"type": "vcs",
"url": "https://github.com/b01/interception"
}
],
"require": {
"php": "~5.6",
"kshabazz/slib": "~1.1"
},
"require-dev": {
"phpunit/phpunit": "~4.1",
"kshabazz/interception": "dev-master",
"sphpdox/sphpdox": "dev-master"
},
"type": "library"
} |
355c678409152d4d039956b483d5f33ab5a91b1e | app/views/news/list.html.erb | app/views/news/list.html.erb | <div class="section">
<% @news.each do |n| %>
<div class="section-wrap" id="news-<%= n.id %>">
<div class="section-item">
<div class="section-title">
[<%= n.created_at.strftime("%Y-%m-%d") %>]
<%= n.title %>
</div>
<div class="section-content">
<%= n.content %>
</div>
</div>
</div>
<% end %>
</div>
<div class="section">
<div class="section-wrap info">
<div class="section-item">
<div class="section-title">
总共: <%= @page[:total] %>条
当前页: <%= @page[:cur_page] %>
<% if @page[:cur_page] > 1 %>
<%= link_to "上一页", controller: "news", action: "list", p: @page[:cur_page] - 1 %>
<% else %>
上一页
<% end %>
<% if @page[:cur_page] < @page[:total_page] %>
<%= link_to "下一页", controller: "news", action: "list", p: @page[:cur_page] + 1 %>
<% else %>
下一页
<% end %>
<%= link_to "订阅 RSS", controller: "news", action: "feed" %>
</div>
</div>
</div>
</div>
| <div class="section">
<% @news.each do |n| %>
<div class="section-wrap" id="news-<%= n.id %>">
<div class="section-item">
<div class="section-title">
[<%= n.created_at.strftime("%Y-%m-%d") %>]
<%= n.title %>
</div>
<div class="section-content">
<%= n.content %>
</div>
</div>
</div>
<% end %>
</div>
<div class="section">
<div class="section-wrap info">
<div class="section-item">
<div class="section-title">
总共: <%= @page[:total] %>条
当前页: <%= @page[:cur_page] %>
<% if @page[:cur_page] > 1 %>
<%= link_to "上一页", controller: "news", action: "list", p: @page[:cur_page] - 1 %>
<% else %>
上一页
<% end %>
<% if @page[:cur_page] < @page[:total_page] %>
<%= link_to "下一页", controller: "news", action: "list", p: @page[:cur_page] + 1 %>
<% else %>
下一页
<% end %>
<a href="/news/feed.xml">订阅 RSS</a>
</div>
</div>
</div>
</div>
| Use normal link to rss | Use normal link to rss
| HTML+ERB | mit | crispgm/rugby-board,crispgm/rugby-board,crispgm/rugby-board | html+erb | ## Code Before:
<div class="section">
<% @news.each do |n| %>
<div class="section-wrap" id="news-<%= n.id %>">
<div class="section-item">
<div class="section-title">
[<%= n.created_at.strftime("%Y-%m-%d") %>]
<%= n.title %>
</div>
<div class="section-content">
<%= n.content %>
</div>
</div>
</div>
<% end %>
</div>
<div class="section">
<div class="section-wrap info">
<div class="section-item">
<div class="section-title">
总共: <%= @page[:total] %>条
当前页: <%= @page[:cur_page] %>
<% if @page[:cur_page] > 1 %>
<%= link_to "上一页", controller: "news", action: "list", p: @page[:cur_page] - 1 %>
<% else %>
上一页
<% end %>
<% if @page[:cur_page] < @page[:total_page] %>
<%= link_to "下一页", controller: "news", action: "list", p: @page[:cur_page] + 1 %>
<% else %>
下一页
<% end %>
<%= link_to "订阅 RSS", controller: "news", action: "feed" %>
</div>
</div>
</div>
</div>
## Instruction:
Use normal link to rss
## Code After:
<div class="section">
<% @news.each do |n| %>
<div class="section-wrap" id="news-<%= n.id %>">
<div class="section-item">
<div class="section-title">
[<%= n.created_at.strftime("%Y-%m-%d") %>]
<%= n.title %>
</div>
<div class="section-content">
<%= n.content %>
</div>
</div>
</div>
<% end %>
</div>
<div class="section">
<div class="section-wrap info">
<div class="section-item">
<div class="section-title">
总共: <%= @page[:total] %>条
当前页: <%= @page[:cur_page] %>
<% if @page[:cur_page] > 1 %>
<%= link_to "上一页", controller: "news", action: "list", p: @page[:cur_page] - 1 %>
<% else %>
上一页
<% end %>
<% if @page[:cur_page] < @page[:total_page] %>
<%= link_to "下一页", controller: "news", action: "list", p: @page[:cur_page] + 1 %>
<% else %>
下一页
<% end %>
<a href="/news/feed.xml">订阅 RSS</a>
</div>
</div>
</div>
</div>
|
041e6bf2579b964e57f3b0a80dc56d2aa9fb8d80 | src/routes/file.ts | src/routes/file.ts | import * as fs from 'fs';
import {ReadStream} from 'fs';
export default {
test: (uri: string): boolean => /^\.|^\//.test(uri) || /^[a-zA-Z]:\\[\\\S|*\S]?.*$/g.test(uri),
read: (path: string): Promise<ReadStream> => {
return new Promise((resolve, reject) => {
let readStream = null;
try {
readStream = fs.createReadStream(path);
} catch (e) {
reject(e);
}
resolve(readStream);
});
}
};
| import * as fs from 'fs';
import {ReadStream} from 'fs';
export default {
test: (uri: string): boolean => {
if (process.platform === 'win32') {
return /^[a-zA-Z]:\\[\\\S|*\S]?.*$/g.test(uri);
}
return /^\.|^\//.test(uri)
},
read: (path: string): Promise<ReadStream> => {
return new Promise((resolve, reject) => {
let readStream = null;
try {
readStream = fs.createReadStream(path);
} catch (e) {
reject(e);
}
resolve(readStream);
});
}
};
| Add path check for windows | Add path check for windows
| TypeScript | mit | kicumkicum/stupid-player,kicumkicum/stupid-player | typescript | ## Code Before:
import * as fs from 'fs';
import {ReadStream} from 'fs';
export default {
test: (uri: string): boolean => /^\.|^\//.test(uri) || /^[a-zA-Z]:\\[\\\S|*\S]?.*$/g.test(uri),
read: (path: string): Promise<ReadStream> => {
return new Promise((resolve, reject) => {
let readStream = null;
try {
readStream = fs.createReadStream(path);
} catch (e) {
reject(e);
}
resolve(readStream);
});
}
};
## Instruction:
Add path check for windows
## Code After:
import * as fs from 'fs';
import {ReadStream} from 'fs';
export default {
test: (uri: string): boolean => {
if (process.platform === 'win32') {
return /^[a-zA-Z]:\\[\\\S|*\S]?.*$/g.test(uri);
}
return /^\.|^\//.test(uri)
},
read: (path: string): Promise<ReadStream> => {
return new Promise((resolve, reject) => {
let readStream = null;
try {
readStream = fs.createReadStream(path);
} catch (e) {
reject(e);
}
resolve(readStream);
});
}
};
|
ff163bfb68caf00240a345426be033e86b60da92 | app/users/users.service.js | app/users/users.service.js | {
class UsersService {
create(user) {
console.log('CREATED!');
console.log(user);
}
}
angular.module('meganote.users')
.service('UsersService', UsersService);
}
| {
angular.module('meganote.users')
.service('UsersService', [
'$http',
'API_BASE',
($http, API_BASE) => {
class UsersService {
create(user) {
return $http.post(`${API_BASE}users`, {
user,
})
.then(
res => {
console.log(res.data);
}
);
}
}
return new UsersService();
}
]);
}
| Make a POST request to create a user. | Make a POST request to create a user.
| JavaScript | mit | xternbootcamp16/meganote,xternbootcamp16/meganote | javascript | ## Code Before:
{
class UsersService {
create(user) {
console.log('CREATED!');
console.log(user);
}
}
angular.module('meganote.users')
.service('UsersService', UsersService);
}
## Instruction:
Make a POST request to create a user.
## Code After:
{
angular.module('meganote.users')
.service('UsersService', [
'$http',
'API_BASE',
($http, API_BASE) => {
class UsersService {
create(user) {
return $http.post(`${API_BASE}users`, {
user,
})
.then(
res => {
console.log(res.data);
}
);
}
}
return new UsersService();
}
]);
}
|
129ac478286f506543644b288ddffb9dddb2dc0e | src/js/background.js | src/js/background.js |
// Update tick
function update() {
setTimeout(update, REFRESH_AMOUNT * 1000);
Twitch.getStatus(function(err, status) {
if (!status.authenticated) {
Twitch.login({
popup: true,
scope: ['user_read'],
redirect_uri: chrome.extension.getURL('verify.html')
});
}
})
}
// Login event
Twitch.events.addListener('auth.login', function() {
console.log("Authenticated: " + Twitch.getToken());
});
// Initialize Twitch SDK
Twitch.init({clientId: CLIENT_ID}, function(err, status) {
console.log("Twitch SDK Initialized");
update();
}); |
// Variables
var authenticated = false;
var streamersCount = 0;
// Update popup window and icon
function updateStatus() {
var popup = "";
var img = 'icon-19-off';
if(authenticated) {
if(streamersCount > 5){
img = 'icon-19-on';
} else {
img = 'icon-19-' + streamersCount;
}
if(streamersCount > 0){
popup = "popup.html";
}
}
chrome.browserAction.setPopup({ popup: popup });
chrome.browserAction.setIcon({ 'path': 'img/' + img + '.png' });
}
// Update streams
function updateStreams(data) {
}
// Update tick
function update() {
setTimeout(update, REFRESH_AMOUNT * 1000);
Twitch.getStatus(function(err, status) {
authenticated = status.authenticated;
if (authenticated) {
Twitch.api({url: 'streams/followed'}, function(err, data) {
if(err){
console.error(err);
return;
}
updateStreams(data);
updateStatus();
});
} else {
Twitch.login({
popup: true,
scope: ['user_read'],
redirect_uri: chrome.extension.getURL('verify.html')
});
updateStatus();
}
})
}
// Login event
Twitch.events.addListener('auth.login', function() {
authenticated = true;
console.log("Authenticated: " + Twitch.getToken());
});
// Initialize Twitch SDK
Twitch.init({clientId: CLIENT_ID}, function(err, status) {
console.log("Twitch SDK Initialized");
update();
}); | Update popup window / icon | Update popup window / icon
| JavaScript | apache-2.0 | TechGuard/StreamAlert,TechGuard/StreamAlert,TechGuard/TwitchNotify,TechGuard/TwitchNotify | javascript | ## Code Before:
// Update tick
function update() {
setTimeout(update, REFRESH_AMOUNT * 1000);
Twitch.getStatus(function(err, status) {
if (!status.authenticated) {
Twitch.login({
popup: true,
scope: ['user_read'],
redirect_uri: chrome.extension.getURL('verify.html')
});
}
})
}
// Login event
Twitch.events.addListener('auth.login', function() {
console.log("Authenticated: " + Twitch.getToken());
});
// Initialize Twitch SDK
Twitch.init({clientId: CLIENT_ID}, function(err, status) {
console.log("Twitch SDK Initialized");
update();
});
## Instruction:
Update popup window / icon
## Code After:
// Variables
var authenticated = false;
var streamersCount = 0;
// Update popup window and icon
function updateStatus() {
var popup = "";
var img = 'icon-19-off';
if(authenticated) {
if(streamersCount > 5){
img = 'icon-19-on';
} else {
img = 'icon-19-' + streamersCount;
}
if(streamersCount > 0){
popup = "popup.html";
}
}
chrome.browserAction.setPopup({ popup: popup });
chrome.browserAction.setIcon({ 'path': 'img/' + img + '.png' });
}
// Update streams
function updateStreams(data) {
}
// Update tick
function update() {
setTimeout(update, REFRESH_AMOUNT * 1000);
Twitch.getStatus(function(err, status) {
authenticated = status.authenticated;
if (authenticated) {
Twitch.api({url: 'streams/followed'}, function(err, data) {
if(err){
console.error(err);
return;
}
updateStreams(data);
updateStatus();
});
} else {
Twitch.login({
popup: true,
scope: ['user_read'],
redirect_uri: chrome.extension.getURL('verify.html')
});
updateStatus();
}
})
}
// Login event
Twitch.events.addListener('auth.login', function() {
authenticated = true;
console.log("Authenticated: " + Twitch.getToken());
});
// Initialize Twitch SDK
Twitch.init({clientId: CLIENT_ID}, function(err, status) {
console.log("Twitch SDK Initialized");
update();
}); |
02f1f500e29927a811c37a2bcc99bc545c1f08d5 | README.md | README.md |
A [Polymer](http://polymer-project.org) element for querying results from the [Strava API](http://strava.github.io/api/)
> Maintained by [Brent Vatne](https://github.com/brentvatne).
## Install
Using [Bower](http://bower.io), run:
```bash
$ bower install --save strava-card
```
## Usage
1. Import Web Components' polyfill:
```html
<script src="//cdnjs.cloudflare.com/ajax/libs/polymer/0.1.4/platform.js"></script>
```
2. Import Custom Element:
```html
<link rel="import" href="src/strava-card.html">
```
3. Start using it!
```html
<strava-card athlete-id="yourathleteidinteger"></strava-card>
```
## Setup
In order to run it locally you'll need a basic server setup.
1. Install [Node.js](http://nodejs.org/download/)
2. Install [Grunt](http://gruntjs.com/):
```sh
$ npm install -g grunt-cli
```
3. Install local dependencies:
```sh
$ npm install
```
4. Run a local server and open `http://localhost:8000`.
```sh
$ grunt connect
```
## License
[MIT License](http://opensource.org/licenses/MIT)
|
A [Polymer](http://polymer-project.org) element for querying results from the [Strava API](http://strava.github.io/api/)
> Maintained by [Brent Vatne](https://github.com/brentvatne).
## Install
Using [Bower](http://bower.io), run:
```bash
$ bower install --save strava-card
```
## Demo
> [See it in action](http://brentvatne.github.io/strava-card/)
## Usage
1. Import Web Components' polyfill:
```html
<script src="//cdnjs.cloudflare.com/ajax/libs/polymer/0.1.4/platform.js"></script>
```
2. Import Custom Element:
```html
<link rel="import" href="src/strava-card.html">
```
3. Start using it!
```html
<strava-card athlete-id="yourathleteidinteger"></strava-card>
```
## Setup
In order to run it locally you'll need a basic server setup.
1. Install [Node.js](http://nodejs.org/download/)
2. Install [Grunt](http://gruntjs.com/):
```sh
$ npm install -g grunt-cli
```
3. Install local dependencies:
```sh
$ npm install
```
4. Run a local server and open `http://localhost:8000`.
```sh
$ grunt connect
```
## License
[MIT License](http://opensource.org/licenses/MIT)
| Add link to demo page | Add link to demo page
| Markdown | mit | brentvatne/strava-card | markdown | ## Code Before:
A [Polymer](http://polymer-project.org) element for querying results from the [Strava API](http://strava.github.io/api/)
> Maintained by [Brent Vatne](https://github.com/brentvatne).
## Install
Using [Bower](http://bower.io), run:
```bash
$ bower install --save strava-card
```
## Usage
1. Import Web Components' polyfill:
```html
<script src="//cdnjs.cloudflare.com/ajax/libs/polymer/0.1.4/platform.js"></script>
```
2. Import Custom Element:
```html
<link rel="import" href="src/strava-card.html">
```
3. Start using it!
```html
<strava-card athlete-id="yourathleteidinteger"></strava-card>
```
## Setup
In order to run it locally you'll need a basic server setup.
1. Install [Node.js](http://nodejs.org/download/)
2. Install [Grunt](http://gruntjs.com/):
```sh
$ npm install -g grunt-cli
```
3. Install local dependencies:
```sh
$ npm install
```
4. Run a local server and open `http://localhost:8000`.
```sh
$ grunt connect
```
## License
[MIT License](http://opensource.org/licenses/MIT)
## Instruction:
Add link to demo page
## Code After:
A [Polymer](http://polymer-project.org) element for querying results from the [Strava API](http://strava.github.io/api/)
> Maintained by [Brent Vatne](https://github.com/brentvatne).
## Install
Using [Bower](http://bower.io), run:
```bash
$ bower install --save strava-card
```
## Demo
> [See it in action](http://brentvatne.github.io/strava-card/)
## Usage
1. Import Web Components' polyfill:
```html
<script src="//cdnjs.cloudflare.com/ajax/libs/polymer/0.1.4/platform.js"></script>
```
2. Import Custom Element:
```html
<link rel="import" href="src/strava-card.html">
```
3. Start using it!
```html
<strava-card athlete-id="yourathleteidinteger"></strava-card>
```
## Setup
In order to run it locally you'll need a basic server setup.
1. Install [Node.js](http://nodejs.org/download/)
2. Install [Grunt](http://gruntjs.com/):
```sh
$ npm install -g grunt-cli
```
3. Install local dependencies:
```sh
$ npm install
```
4. Run a local server and open `http://localhost:8000`.
```sh
$ grunt connect
```
## License
[MIT License](http://opensource.org/licenses/MIT)
|
21d5bc00a373c25dcaa0645e60ea38cb3e9b5a59 | src/main/java/com/ezardlabs/dethsquare/multiplayer/NetworkAnimator.java | src/main/java/com/ezardlabs/dethsquare/multiplayer/NetworkAnimator.java | package com.ezardlabs.dethsquare.multiplayer;
import java.nio.ByteBuffer;
public class NetworkAnimator extends NetworkBehaviour {
@Override
public void start() {
super.start();
assert gameObject.animator != null;
gameObject.animator.shouldUpdate = getPlayerId() == Network.getPlayerId();
}
@Override
protected ByteBuffer onSend() {
data.position(0);
data.putInt(0, gameObject.animator.getCurrentAnimationId()); // 0 - 3
data.putInt(4, gameObject.animator.getCurrentAnimationFrame()); // 4 - 7
return data;
}
@Override
protected void onReceive(ByteBuffer data, int index) {
gameObject.animator.setCurrentAnimationId(data.getInt(index));
gameObject.animator.setCurrentAnimationFrame(data.getInt(index + 4));
}
@Override
public short getSize() {
return 8;
}
}
| package com.ezardlabs.dethsquare.multiplayer;
import java.nio.ByteBuffer;
public class NetworkAnimator extends NetworkBehaviour {
@Override
public void start() {
super.start();
assert gameObject.animator != null;
gameObject.animator.shouldUpdate = getPlayerId() == Network.getPlayerId();
}
@Override
protected ByteBuffer onSend() {
data.position(0);
data.putShort(0, (short) gameObject.animator.getCurrentAnimationId()); // 0 - 1
data.putShort(2, (short) gameObject.animator.getCurrentAnimationFrame()); // 2 - 3
return data;
}
@Override
protected void onReceive(ByteBuffer data, int index) {
gameObject.animator.setCurrentAnimationId(data.getShort(index));
gameObject.animator.setCurrentAnimationFrame(data.getShort(index + 2));
}
@Override
public short getSize() {
return 4;
}
}
| Send animation data across the network as shorts instead of ints, to save some bytes | Send animation data across the network as shorts instead of ints, to save some bytes
| Java | mit | 8-Bit-Warframe/Dethsquare-Engine-core | java | ## Code Before:
package com.ezardlabs.dethsquare.multiplayer;
import java.nio.ByteBuffer;
public class NetworkAnimator extends NetworkBehaviour {
@Override
public void start() {
super.start();
assert gameObject.animator != null;
gameObject.animator.shouldUpdate = getPlayerId() == Network.getPlayerId();
}
@Override
protected ByteBuffer onSend() {
data.position(0);
data.putInt(0, gameObject.animator.getCurrentAnimationId()); // 0 - 3
data.putInt(4, gameObject.animator.getCurrentAnimationFrame()); // 4 - 7
return data;
}
@Override
protected void onReceive(ByteBuffer data, int index) {
gameObject.animator.setCurrentAnimationId(data.getInt(index));
gameObject.animator.setCurrentAnimationFrame(data.getInt(index + 4));
}
@Override
public short getSize() {
return 8;
}
}
## Instruction:
Send animation data across the network as shorts instead of ints, to save some bytes
## Code After:
package com.ezardlabs.dethsquare.multiplayer;
import java.nio.ByteBuffer;
public class NetworkAnimator extends NetworkBehaviour {
@Override
public void start() {
super.start();
assert gameObject.animator != null;
gameObject.animator.shouldUpdate = getPlayerId() == Network.getPlayerId();
}
@Override
protected ByteBuffer onSend() {
data.position(0);
data.putShort(0, (short) gameObject.animator.getCurrentAnimationId()); // 0 - 1
data.putShort(2, (short) gameObject.animator.getCurrentAnimationFrame()); // 2 - 3
return data;
}
@Override
protected void onReceive(ByteBuffer data, int index) {
gameObject.animator.setCurrentAnimationId(data.getShort(index));
gameObject.animator.setCurrentAnimationFrame(data.getShort(index + 2));
}
@Override
public short getSize() {
return 4;
}
}
|
412e195da29b4f4fc7b72967c192714a6f5eaeb5 | setup.py | setup.py | import os
from setuptools import setup, find_packages
from version import get_git_version
VERSION, SOURCE_LABEL = get_git_version()
PROJECT = 'yakonfig'
AUTHOR = 'Diffeo, Inc.'
AUTHOR_EMAIL = 'support@diffeo.com'
URL = 'http://github.com/diffeo/yakonfig'
DESC = 'load a configuration dictionary for a large application'
def read_file(file_name):
with open(os.path.join(os.path.dirname(__file__), file_name), 'r') as f:
return f.read()
setup(
name=PROJECT,
version=VERSION,
description=DESC,
license=read_file('LICENSE.txt'),
long_description=read_file('README.md'),
# source_label=SOURCE_LABEL,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
url=URL,
packages=find_packages(),
classifiers=[
'Programming Language :: Python :: 2.7',
'Development Status :: 3 - Alpha',
'Topic :: Utilities',
# MIT/X11 license http://opensource.org/licenses/MIT
'License :: OSI Approved :: MIT License',
],
tests_require=[
'pexpect',
],
install_requires=[
'importlib',
'pyyaml',
'six',
],
)
| import os
from setuptools import setup, find_packages
from version import get_git_version
VERSION, SOURCE_LABEL = get_git_version()
PROJECT = 'yakonfig'
AUTHOR = 'Diffeo, Inc.'
AUTHOR_EMAIL = 'support@diffeo.com'
URL = 'http://github.com/diffeo/yakonfig'
DESC = 'load a configuration dictionary for a large application'
def read_file(file_name):
with open(os.path.join(os.path.dirname(__file__), file_name), 'r') as f:
return f.read()
setup(
name=PROJECT,
version=VERSION,
description=DESC,
license=read_file('LICENSE.txt'),
long_description=read_file('README.md'),
# source_label=SOURCE_LABEL,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
url=URL,
packages=find_packages(),
classifiers=[
'Programming Language :: Python :: 2.7',
'Development Status :: 3 - Alpha',
'Topic :: Utilities',
# MIT/X11 license http://opensource.org/licenses/MIT
'License :: OSI Approved :: MIT License',
],
install_requires=[
'pyyaml',
'six',
],
)
| Clean up dependencies (no importlib) | Clean up dependencies (no importlib)
| Python | mit | diffeo/yakonfig | python | ## Code Before:
import os
from setuptools import setup, find_packages
from version import get_git_version
VERSION, SOURCE_LABEL = get_git_version()
PROJECT = 'yakonfig'
AUTHOR = 'Diffeo, Inc.'
AUTHOR_EMAIL = 'support@diffeo.com'
URL = 'http://github.com/diffeo/yakonfig'
DESC = 'load a configuration dictionary for a large application'
def read_file(file_name):
with open(os.path.join(os.path.dirname(__file__), file_name), 'r') as f:
return f.read()
setup(
name=PROJECT,
version=VERSION,
description=DESC,
license=read_file('LICENSE.txt'),
long_description=read_file('README.md'),
# source_label=SOURCE_LABEL,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
url=URL,
packages=find_packages(),
classifiers=[
'Programming Language :: Python :: 2.7',
'Development Status :: 3 - Alpha',
'Topic :: Utilities',
# MIT/X11 license http://opensource.org/licenses/MIT
'License :: OSI Approved :: MIT License',
],
tests_require=[
'pexpect',
],
install_requires=[
'importlib',
'pyyaml',
'six',
],
)
## Instruction:
Clean up dependencies (no importlib)
## Code After:
import os
from setuptools import setup, find_packages
from version import get_git_version
VERSION, SOURCE_LABEL = get_git_version()
PROJECT = 'yakonfig'
AUTHOR = 'Diffeo, Inc.'
AUTHOR_EMAIL = 'support@diffeo.com'
URL = 'http://github.com/diffeo/yakonfig'
DESC = 'load a configuration dictionary for a large application'
def read_file(file_name):
with open(os.path.join(os.path.dirname(__file__), file_name), 'r') as f:
return f.read()
setup(
name=PROJECT,
version=VERSION,
description=DESC,
license=read_file('LICENSE.txt'),
long_description=read_file('README.md'),
# source_label=SOURCE_LABEL,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
url=URL,
packages=find_packages(),
classifiers=[
'Programming Language :: Python :: 2.7',
'Development Status :: 3 - Alpha',
'Topic :: Utilities',
# MIT/X11 license http://opensource.org/licenses/MIT
'License :: OSI Approved :: MIT License',
],
install_requires=[
'pyyaml',
'six',
],
)
|
77bb6f36e9651c846f6b0c6148f40b96e2f7598f | src/vmware/VMwareForm.tsx | src/vmware/VMwareForm.tsx | import * as React from 'react';
import { required } from '@waldur/core/validators';
import { FormContainer, StringField, NumberField, SecretField } from '@waldur/form-react';
export const VMwareForm = ({ translate, container }) => (
<FormContainer {...container}>
<StringField
name="backend_url"
label={translate('Hostname')}
required={true}
validate={required}
/>
<StringField
name="username"
label={translate('Username')}
required={true}
validate={required}
/>
<SecretField
name="password"
label={translate('Password')}
required={true}
validate={required}
/>
<StringField
name="default_cluster_id"
label={translate('Default cluster ID')}
required={true}
validate={required}
/>
<NumberField
name="max_cpu"
label={translate('Maximum vCPU for each VM')}
/>
<NumberField
name="max_ram"
label={translate('Maximum RAM for each VM')}
unit="GB"
format={v => v ? v / 1024 : ''}
normalize={v => Number(v) * 1024}
/>
<NumberField
name="max_disk"
label={translate('Maximum capacity for each disk')}
unit="GB"
format={v => v ? v / 1024 : ''}
normalize={v => Number(v) * 1024}
/>
</FormContainer>
);
| import * as React from 'react';
import { required } from '@waldur/core/validators';
import { FormContainer, StringField, NumberField, SecretField } from '@waldur/form-react';
export const VMwareForm = ({ translate, container }) => (
<FormContainer {...container}>
<StringField
name="backend_url"
label={translate('Hostname')}
required={true}
validate={required}
/>
<StringField
name="username"
label={translate('Username')}
required={true}
validate={required}
/>
<SecretField
name="password"
label={translate('Password')}
required={true}
validate={required}
/>
<StringField
name="default_cluster_label"
label={translate('Default cluster label')}
required={true}
validate={required}
/>
<NumberField
name="max_cpu"
label={translate('Maximum vCPU for each VM')}
/>
<NumberField
name="max_ram"
label={translate('Maximum RAM for each VM')}
unit="GB"
format={v => v ? v / 1024 : ''}
normalize={v => Number(v) * 1024}
/>
<NumberField
name="max_disk"
label={translate('Maximum capacity for each disk')}
unit="GB"
format={v => v ? v / 1024 : ''}
normalize={v => Number(v) * 1024}
/>
</FormContainer>
);
| Allow setting cluster label instead of ID when creating a VMware offering | Allow setting cluster label instead of ID when creating a VMware offering [WAL-2536]
| TypeScript | mit | opennode/waldur-homeport,opennode/waldur-homeport,opennode/waldur-homeport,opennode/waldur-homeport | typescript | ## Code Before:
import * as React from 'react';
import { required } from '@waldur/core/validators';
import { FormContainer, StringField, NumberField, SecretField } from '@waldur/form-react';
export const VMwareForm = ({ translate, container }) => (
<FormContainer {...container}>
<StringField
name="backend_url"
label={translate('Hostname')}
required={true}
validate={required}
/>
<StringField
name="username"
label={translate('Username')}
required={true}
validate={required}
/>
<SecretField
name="password"
label={translate('Password')}
required={true}
validate={required}
/>
<StringField
name="default_cluster_id"
label={translate('Default cluster ID')}
required={true}
validate={required}
/>
<NumberField
name="max_cpu"
label={translate('Maximum vCPU for each VM')}
/>
<NumberField
name="max_ram"
label={translate('Maximum RAM for each VM')}
unit="GB"
format={v => v ? v / 1024 : ''}
normalize={v => Number(v) * 1024}
/>
<NumberField
name="max_disk"
label={translate('Maximum capacity for each disk')}
unit="GB"
format={v => v ? v / 1024 : ''}
normalize={v => Number(v) * 1024}
/>
</FormContainer>
);
## Instruction:
Allow setting cluster label instead of ID when creating a VMware offering [WAL-2536]
## Code After:
import * as React from 'react';
import { required } from '@waldur/core/validators';
import { FormContainer, StringField, NumberField, SecretField } from '@waldur/form-react';
export const VMwareForm = ({ translate, container }) => (
<FormContainer {...container}>
<StringField
name="backend_url"
label={translate('Hostname')}
required={true}
validate={required}
/>
<StringField
name="username"
label={translate('Username')}
required={true}
validate={required}
/>
<SecretField
name="password"
label={translate('Password')}
required={true}
validate={required}
/>
<StringField
name="default_cluster_label"
label={translate('Default cluster label')}
required={true}
validate={required}
/>
<NumberField
name="max_cpu"
label={translate('Maximum vCPU for each VM')}
/>
<NumberField
name="max_ram"
label={translate('Maximum RAM for each VM')}
unit="GB"
format={v => v ? v / 1024 : ''}
normalize={v => Number(v) * 1024}
/>
<NumberField
name="max_disk"
label={translate('Maximum capacity for each disk')}
unit="GB"
format={v => v ? v / 1024 : ''}
normalize={v => Number(v) * 1024}
/>
</FormContainer>
);
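
The RAM and disk fields in the form above store megabytes but display gigabytes through the paired `format`/`normalize` callbacks. A minimal sketch of that round-trip, written here in Python for illustration (the function names are made up, not part of the form library):

```python
def format_mb(value_mb):
    """Render a stored MB value as GB for display (empty string when unset),
    mirroring the JS `v => v ? v / 1024 : ''`."""
    return value_mb / 1024 if value_mb else ''

def normalize_gb(display_value):
    """Convert the GB value typed by the user back to MB for storage,
    mirroring the JS `v => Number(v) * 1024`."""
    return float(display_value) * 1024

stored = normalize_gb('4')       # user types 4 (GB)
assert stored == 4096            # stored internally as MB
assert format_mb(stored) == 4.0  # shown back to the user as 4 GB
assert format_mb(0) == ''        # unset field renders as an empty input
```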
|
0b550701ced860c91efc929a3738d4a6c7434c90 | config.toml | config.toml | baseurl = "http://yourSiteHere"
languageCode = "en-us"
title = "my new hugo site"
publishdir = "./"
theme = "hyde"
| baseurl = "http://joshuamckenty.github.io"
languageCode = "en-us"
title = "my new hugo site"
publishdir = "./"
theme = "hyde"
| Fix baseurl until DNS roll. | Fix baseurl until DNS roll.
| TOML | unlicense | joshuamckenty/joshuamckenty.github.io | toml | ## Code Before:
baseurl = "http://yourSiteHere"
languageCode = "en-us"
title = "my new hugo site"
publishdir = "./"
theme = "hyde"
## Instruction:
Fix baseurl until DNS roll.
## Code After:
baseurl = "http://joshuamckenty.github.io"
languageCode = "en-us"
title = "my new hugo site"
publishdir = "./"
theme = "hyde"
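
The chosen value follows the usual GitHub Pages convention: a repository named `owner.github.io` is served at that hostname directly, while other repositories are served under a subpath. That convention can be sketched in Python (the helper name is invented for illustration):

```python
def pages_baseurl(repo):
    """Derive the GitHub Pages base URL for an 'owner/name' repository."""
    owner, name = repo.split('/')
    if name == owner + '.github.io':
        # User/organization site: served at the bare hostname.
        return 'http://{}.github.io'.format(owner)
    # Project site: served under a repository subpath.
    return 'http://{}.github.io/{}'.format(owner, name)

assert pages_baseurl('joshuamckenty/joshuamckenty.github.io') == \
    'http://joshuamckenty.github.io'
```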
|
1dbf9fb767d1a8919e50f56185b63daade5cb199 | lib/web.rb | lib/web.rb | require 'sinatra'
require_relative 'sonatype'
require_relative 'shields'
DEFAULT_SUBJECT = 'maven central'
PROJECT_SITE = 'https://github.com/jirutka/maven-badges'
configure :production do
disable :static
before { cache_control :public, :max_age => 3600 }
end
get '/' do
content_type :text
"Nothing is here, see #{PROJECT_SITE}."
end
get '/maven-central/:group/:artifact/badge.?:format?' do |group, artifact, format|
halt 415 unless ['svg', 'png'].include? format
content_type format
subject = params['subject'] || DEFAULT_SUBJECT
begin
version = Sonatype.last_artifact_version(group, artifact)
color = :brightgreen
rescue NotFoundError
version = 'unknown'
color = :lightgray
end
Shields.badge_image subject, version, color, format
end
error do
content_type :text
halt 500, "Something went wrong, please open an issue on #{PROJECT_SITE}/issues"
end | require 'sinatra'
require_relative 'sonatype'
require_relative 'shields'
DEFAULT_SUBJECT = 'maven central'
MAVEN_SEARCH_URI = 'http://search.maven.org'
PROJECT_SITE = 'https://github.com/jirutka/maven-badges'
configure :production do
disable :static
before { cache_control :public, :max_age => 3600 }
end
get '/' do
content_type :text
"Nothing is here, see #{PROJECT_SITE}."
end
# Returns badge image
get '/maven-central/:group/:artifact/badge.?:format?' do |group, artifact, format|
halt 415 unless ['svg', 'png'].include? format
content_type format
subject = params['subject'] || DEFAULT_SUBJECT
begin
version = Sonatype.last_artifact_version(group, artifact)
color = :brightgreen
rescue NotFoundError
version = 'unknown'
color = :lightgray
end
Shields.badge_image subject, version, color, format
end
# Redirects to artifact's page on maven.org
get '/maven-central/:group/:artifact/?' do |group, artifact|
begin
version = Sonatype.last_artifact_version(group, artifact)
redirect to "#{MAVEN_SEARCH_URI}/#artifactdetails|#{group}|#{artifact}|#{version}|"
rescue NotFoundError
redirect to "#{MAVEN_SEARCH_URI}/#search|ga|1|g:\"#{group}\" AND a:\"#{artifact}\""
end
end
error do
content_type :text
halt 500, "Something went wrong, please open an issue on #{PROJECT_SITE}/issues"
end | Add resource for redirect to search.maven.org | Add resource for redirect to search.maven.org
| Ruby | mit | jirutka/maven-badges | ruby | ## Code Before:
require 'sinatra'
require_relative 'sonatype'
require_relative 'shields'
DEFAULT_SUBJECT = 'maven central'
PROJECT_SITE = 'https://github.com/jirutka/maven-badges'
configure :production do
disable :static
before { cache_control :public, :max_age => 3600 }
end
get '/' do
content_type :text
"Nothing is here, see #{PROJECT_SITE}."
end
get '/maven-central/:group/:artifact/badge.?:format?' do |group, artifact, format|
halt 415 unless ['svg', 'png'].include? format
content_type format
subject = params['subject'] || DEFAULT_SUBJECT
begin
version = Sonatype.last_artifact_version(group, artifact)
color = :brightgreen
rescue NotFoundError
version = 'unknown'
color = :lightgray
end
Shields.badge_image subject, version, color, format
end
error do
content_type :text
halt 500, "Something went wrong, please open an issue on #{PROJECT_SITE}/issues"
end
## Instruction:
Add resource for redirect to search.maven.org
## Code After:
require 'sinatra'
require_relative 'sonatype'
require_relative 'shields'
DEFAULT_SUBJECT = 'maven central'
MAVEN_SEARCH_URI = 'http://search.maven.org'
PROJECT_SITE = 'https://github.com/jirutka/maven-badges'
configure :production do
disable :static
before { cache_control :public, :max_age => 3600 }
end
get '/' do
content_type :text
"Nothing is here, see #{PROJECT_SITE}."
end
# Returns badge image
get '/maven-central/:group/:artifact/badge.?:format?' do |group, artifact, format|
halt 415 unless ['svg', 'png'].include? format
content_type format
subject = params['subject'] || DEFAULT_SUBJECT
begin
version = Sonatype.last_artifact_version(group, artifact)
color = :brightgreen
rescue NotFoundError
version = 'unknown'
color = :lightgray
end
Shields.badge_image subject, version, color, format
end
# Redirects to artifact's page on maven.org
get '/maven-central/:group/:artifact/?' do |group, artifact|
begin
version = Sonatype.last_artifact_version(group, artifact)
redirect to "#{MAVEN_SEARCH_URI}/#artifactdetails|#{group}|#{artifact}|#{version}|"
rescue NotFoundError
redirect to "#{MAVEN_SEARCH_URI}/#search|ga|1|g:\"#{group}\" AND a:\"#{artifact}\""
end
end
error do
content_type :text
halt 500, "Something went wrong, please open an issue on #{PROJECT_SITE}/issues"
end |
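
The new redirect route builds one of two `search.maven.org` fragment URLs, depending on whether a release version was resolved. The URL construction itself can be sketched outside Sinatra, in Python:

```python
MAVEN_SEARCH_URI = 'http://search.maven.org'

def artifact_redirect(group, artifact, version=None):
    """Mirror the route's logic: deep-link to the artifact page when the
    version is known, otherwise fall back to a group+artifact search."""
    if version is not None:
        return "{}/#artifactdetails|{}|{}|{}|".format(
            MAVEN_SEARCH_URI, group, artifact, version)
    return '{}/#search|ga|1|g:"{}" AND a:"{}"'.format(
        MAVEN_SEARCH_URI, group, artifact)

assert artifact_redirect('junit', 'junit', '4.12') == \
    'http://search.maven.org/#artifactdetails|junit|junit|4.12|'
assert 'g:"junit" AND a:"junit"' in artifact_redirect('junit', 'junit')
```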
6a8cbb53dad28508f0fbb893ce37defbbec11a49 | src/bin/e_log.c | src/bin/e_log.c |
EINTERN int e_log_dom = -1;
EINTERN int
e_log_init(void)
{
e_log_dom = eina_log_domain_register("e", EINA_COLOR_WHITE);
return e_log_dom != -1;
}
EINTERN int
e_log_shutdown(void)
{
eina_log_domain_unregister(e_log_dom);
e_log_dom = -1;
return 0;
}
|
EINTERN int e_log_dom = -1;
static const char *_names[] = {
"CRI",
"ERR",
"WRN",
"INF",
"DBG",
};
static void
_e_log_cb(const Eina_Log_Domain *d, Eina_Log_Level level, const char *file, const char *fnc EINA_UNUSED, int line, const char *fmt, void *data EINA_UNUSED, va_list args)
{
const char *color;
color = eina_log_level_color_get(level);
fprintf(stdout,
"%s%s<" EINA_COLOR_RESET "%s%s>" EINA_COLOR_RESET "%s:%d" EINA_COLOR_RESET " ",
color, _names[level > EINA_LOG_LEVEL_DBG ? EINA_LOG_LEVEL_DBG : level],
d->domain_str, color, file, line);
vfprintf(stdout, fmt, args);
putc('\n', stdout);
}
EINTERN int
e_log_init(void)
{
e_log_dom = eina_log_domain_register("e", EINA_COLOR_WHITE);
eina_log_print_cb_set(_e_log_cb, NULL);
return e_log_dom != -1;
}
EINTERN int
e_log_shutdown(void)
{
eina_log_domain_unregister(e_log_dom);
e_log_dom = -1;
return 0;
}
| Revert "e logs - the custom e log func breaks eina backtraces, so don't use it" | Revert "e logs - the custom e log func breaks eina backtraces, so don't use it"
This reverts commit 2df04042269f3b5604c719844eac372fa5fcddd2.
let's not do this in all cases
| C | bsd-2-clause | tasn/enlightenment,rvandegrift/e,rvandegrift/e,rvandegrift/e,tasn/enlightenment,tasn/enlightenment | c | ## Code Before:
EINTERN int e_log_dom = -1;
EINTERN int
e_log_init(void)
{
e_log_dom = eina_log_domain_register("e", EINA_COLOR_WHITE);
return e_log_dom != -1;
}
EINTERN int
e_log_shutdown(void)
{
eina_log_domain_unregister(e_log_dom);
e_log_dom = -1;
return 0;
}
## Instruction:
Revert "e logs - the custom e log func breaks eina backtraces, so don't use it"
This reverts commit 2df04042269f3b5604c719844eac372fa5fcddd2.
let's not do this in all cases
## Code After:
EINTERN int e_log_dom = -1;
static const char *_names[] = {
"CRI",
"ERR",
"WRN",
"INF",
"DBG",
};
static void
_e_log_cb(const Eina_Log_Domain *d, Eina_Log_Level level, const char *file, const char *fnc EINA_UNUSED, int line, const char *fmt, void *data EINA_UNUSED, va_list args)
{
const char *color;
color = eina_log_level_color_get(level);
fprintf(stdout,
"%s%s<" EINA_COLOR_RESET "%s%s>" EINA_COLOR_RESET "%s:%d" EINA_COLOR_RESET " ",
color, _names[level > EINA_LOG_LEVEL_DBG ? EINA_LOG_LEVEL_DBG : level],
d->domain_str, color, file, line);
vfprintf(stdout, fmt, args);
putc('\n', stdout);
}
EINTERN int
e_log_init(void)
{
e_log_dom = eina_log_domain_register("e", EINA_COLOR_WHITE);
eina_log_print_cb_set(_e_log_cb, NULL);
return e_log_dom != -1;
}
EINTERN int
e_log_shutdown(void)
{
eina_log_domain_unregister(e_log_dom);
e_log_dom = -1;
return 0;
}
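
The restored callback clamps out-of-range verbosity levels to `DBG` before indexing `_names` (`level > EINA_LOG_LEVEL_DBG ? EINA_LOG_LEVEL_DBG : level`). The same clamp-and-format logic, sketched in Python with the color escapes omitted:

```python
LEVEL_NAMES = ["CRI", "ERR", "WRN", "INF", "DBG"]

def level_name(level):
    """Clamp unknown (higher) levels to DBG, as the C callback does."""
    return LEVEL_NAMES[min(level, len(LEVEL_NAMES) - 1)]

def format_line(level, domain, file, line, message):
    """Plain-text analogue of the callback's '<NAME><domain> file:line msg'."""
    return "{}<{}> {}:{} {}".format(level_name(level), domain, file, line, message)

assert level_name(1) == "ERR"
assert level_name(9) == "DBG"   # out-of-range levels are clamped
assert format_line(3, "e", "e_main.c", 42, "starting") == \
    "INF<e> e_main.c:42 starting"
```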
|
0035b81b380f6dba8dbedc0088e2490f15c2830b | README.md | README.md | Example files and documentation for implementing immersive scrolling pages in NationBuilder.
|
Example files and documentation for implementing immersive scrolling pages in NationBuilder.
Once you've put the files in place, and modified _nav.html and layout.html in your template according to the instructions, you can create your very own immersive scrolling page.
## Creating a scrolling page in NationBuilder
It works by basically pulling in subpages as sections of the parent page. So first you make the parent page and assign a special tag to it. Then you add additional sections as subpages.
1. Create a new page with a scrolling layout, go to the dashboard and click “+ New page," and choose "Basic page."
2. Once the page is created, click Settings in the horizontal tabs for the page (next to Content), and add the tag “Layout: Scrolling”
3. Next click the Subpages tab and create a new subpage for each section that you want, and always leave “Include in top nav” checked
4. Add a background. You add a background by selecting the “Settings” tab for the subpage you’re editing, then the "Social media" tab under that. Upload the image where it says “Thumbnail image for Facebook, etc.” and save.
5. Once you’ve got all your sections, it’s time to order them. Edit the parent page that you created in step 1, and choose “Subpages” and then choose “Subnav pages.” Drag them into the order you want. It should save the order automatically.
| Add instructions for how to create a page in NationBuilder itself | Add instructions for how to create a page in NationBuilder itself | Markdown | agpl-3.0 | heybenji/nationbuilder-scrolling-sections,heybenji/nationbuilder-scrolling-sections | markdown | ## Code Before:
Example files and documentation for implementing immersive scrolling pages in NationBuilder.
## Instruction:
Add instructions for how to create a page in NationBuilder itself
## Code After:
Example files and documentation for implementing immersive scrolling pages in NationBuilder.
Once you've put the files in place, and modified _nav.html and layout.html in your template according to the instructions, you can create your very own immersive scrolling page.
## Creating a scrolling page in NationBuilder
It works by basically pulling in subpages as sections of the parent page. So first you make the parent page and assign a special tag to it. Then you add additional sections as subpages.
1. To create a new page with a scrolling layout, go to the dashboard, click “+ New page," and choose "Basic page."
2. Once the page is created, click Settings in the horizontal tabs for the page (next to Content), and add the tag “Layout: Scrolling”
3. Next click the Subpages tab and create a new subpage for each section that you want, and always leave “Include in top nav” checked
4. Add a background. You add a background by selecting the “Settings” tab for the subpage you’re editing, then the "Social media" tab under that. Upload the image where it says “Thumbnail image for Facebook, etc.” and save.
5. Once you’ve got all your sections, it’s time to order them. Edit the parent page that you created in step 1, and choose “Subpages” and then choose “Subnav pages.” Drag them into the order you want. It should save the order automatically.
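
The workflow above boils down to: a parent page tagged “Layout: Scrolling” renders its top-nav subpages, in order, as sections. That selection logic can be sketched in Python, assuming a made-up dict shape for pages (NationBuilder's real data model differs):

```python
def scrolling_sections(page):
    """Return the ordered section names for a scrolling page.
    Pages without the special tag render no sections; subpages excluded
    from the top nav are skipped."""
    if "Layout: Scrolling" not in page.get("tags", []):
        return []
    return [sub["name"]
            for sub in page.get("subpages", [])
            if sub.get("in_top_nav")]

page = {
    "tags": ["Layout: Scrolling"],
    "subpages": [
        {"name": "Intro", "in_top_nav": True},
        {"name": "Hidden", "in_top_nav": False},
        {"name": "Join", "in_top_nav": True},
    ],
}
assert scrolling_sections(page) == ["Intro", "Join"]
assert scrolling_sections({"tags": []}) == []
```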
|
0ca34fb8a218b8c9338eadf7f730bb8662eb5a7e | recipes-xfce/fontconfig-overrides/fontconfig-overrides_1.0.bb | recipes-xfce/fontconfig-overrides/fontconfig-overrides_1.0.bb | DESCRIPTION = "Customized settings to use the right fonts in XFCE."
SECTION = "x11"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://${COREBASE}/LICENSE;md5=3f40d7994397109285ec7b81fdeb3b58 \
file://${COREBASE}/meta/COPYING.MIT;md5=3da9cfbcb788c80a0384361b4de20420"
RDEPENDS_${PN} = "fontconfig"
PR = "r1"
S = "${WORKDIR}"
SRC_URI = "file://48-nilrt-override.conf"
FILES_${PN} = "${sysconfdir}/fonts"
do_install () {
install -d ${D}/${sysconfdir}/fonts/conf.d
install -d ${D}/${sysconfdir}/fonts/conf.avail
install -m 0644 48-nilrt-override.conf ${D}/${sysconfdir}/fonts/conf.avail/48-nilrt-override.conf
ln -sf ${sysconfdir}/fonts/conf.avail/48-nilrt-override.conf ${D}/${sysconfdir}/fonts/conf.d/48-nilrt-override.conf
}
| DESCRIPTION = "Customized settings to use the right fonts in XFCE."
SECTION = "x11"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://${COREBASE}/LICENSE;md5=4d92cd373abda3937c2bc47fbc49d690 \
file://${COREBASE}/meta/COPYING.MIT;md5=3da9cfbcb788c80a0384361b4de20420"
RDEPENDS_${PN} = "fontconfig"
PR = "r1"
S = "${WORKDIR}"
SRC_URI = "file://48-nilrt-override.conf"
FILES_${PN} = "${sysconfdir}/fonts"
do_install () {
install -d ${D}/${sysconfdir}/fonts/conf.d
install -d ${D}/${sysconfdir}/fonts/conf.avail
install -m 0644 48-nilrt-override.conf ${D}/${sysconfdir}/fonts/conf.avail/48-nilrt-override.conf
ln -sf ${sysconfdir}/fonts/conf.avail/48-nilrt-override.conf ${D}/${sysconfdir}/fonts/conf.d/48-nilrt-override.conf
}
| Update after toplevl LICENSE file checksum change | fontconfig-overrrides: Update after toplevl LICENSE file checksum change
Acked-by: Ken Sharp <e6c8ad8f54d15df4645e8aaf3f7a209bc3688570@ni.com>
Acked-by: Rich Tollerton <67c688e7f7482ba5c4ebe3d70bc77594748182a6@ni.com>
Acked-by: Ben Shelton <873d2944fc7a954ed322314738197137e10de387@ni.com>
Signed-off-by: Alejandro del Castillo <c09cbcd73cd5840fe009d82a3ad13af11b4d7b46@ni.com>
| BitBake | mit | ni/meta-nilrt,ni/meta-nilrt,ni/meta-nilrt,ni/meta-nilrt,ni/meta-nilrt | bitbake | ## Code Before:
DESCRIPTION = "Customized settings to use the right fonts in XFCE."
SECTION = "x11"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://${COREBASE}/LICENSE;md5=3f40d7994397109285ec7b81fdeb3b58 \
file://${COREBASE}/meta/COPYING.MIT;md5=3da9cfbcb788c80a0384361b4de20420"
RDEPENDS_${PN} = "fontconfig"
PR = "r1"
S = "${WORKDIR}"
SRC_URI = "file://48-nilrt-override.conf"
FILES_${PN} = "${sysconfdir}/fonts"
do_install () {
install -d ${D}/${sysconfdir}/fonts/conf.d
install -d ${D}/${sysconfdir}/fonts/conf.avail
install -m 0644 48-nilrt-override.conf ${D}/${sysconfdir}/fonts/conf.avail/48-nilrt-override.conf
ln -sf ${sysconfdir}/fonts/conf.avail/48-nilrt-override.conf ${D}/${sysconfdir}/fonts/conf.d/48-nilrt-override.conf
}
## Instruction:
fontconfig-overrrides: Update after toplevl LICENSE file checksum change
Acked-by: Ken Sharp <e6c8ad8f54d15df4645e8aaf3f7a209bc3688570@ni.com>
Acked-by: Rich Tollerton <67c688e7f7482ba5c4ebe3d70bc77594748182a6@ni.com>
Acked-by: Ben Shelton <873d2944fc7a954ed322314738197137e10de387@ni.com>
Signed-off-by: Alejandro del Castillo <c09cbcd73cd5840fe009d82a3ad13af11b4d7b46@ni.com>
## Code After:
DESCRIPTION = "Customized settings to use the right fonts in XFCE."
SECTION = "x11"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://${COREBASE}/LICENSE;md5=4d92cd373abda3937c2bc47fbc49d690 \
file://${COREBASE}/meta/COPYING.MIT;md5=3da9cfbcb788c80a0384361b4de20420"
RDEPENDS_${PN} = "fontconfig"
PR = "r1"
S = "${WORKDIR}"
SRC_URI = "file://48-nilrt-override.conf"
FILES_${PN} = "${sysconfdir}/fonts"
do_install () {
install -d ${D}/${sysconfdir}/fonts/conf.d
install -d ${D}/${sysconfdir}/fonts/conf.avail
install -m 0644 48-nilrt-override.conf ${D}/${sysconfdir}/fonts/conf.avail/48-nilrt-override.conf
ln -sf ${sysconfdir}/fonts/conf.avail/48-nilrt-override.conf ${D}/${sysconfdir}/fonts/conf.d/48-nilrt-override.conf
}
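
`do_install` places the file under `conf.avail` and exposes it through a symlink in `conf.d`, the standard fontconfig pattern for snippets that can be enabled or disabled without deleting them. The resulting layout can be sketched in Python against a temporary root (assuming a POSIX filesystem with symlink support):

```python
import os
import tempfile

def install_override(root, name="48-nilrt-override.conf", body="<fontconfig/>\n"):
    """Mimic the recipe: drop the file in conf.avail, symlink it from conf.d."""
    avail = os.path.join(root, "etc/fonts/conf.avail")
    confd = os.path.join(root, "etc/fonts/conf.d")
    os.makedirs(avail)
    os.makedirs(confd)
    with open(os.path.join(avail, name), "w") as f:
        f.write(body)
    link = os.path.join(confd, name)
    # The recipe links against the absolute target path, as ln -sf does.
    os.symlink(os.path.join("/etc/fonts/conf.avail", name), link)
    return link

root = tempfile.mkdtemp()
link = install_override(root)
assert os.path.islink(link)
assert os.readlink(link).endswith("conf.avail/48-nilrt-override.conf")
```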
|
4e5d3973da8efaa3246c5131979ef4d0673faac9 | TODO.md | TODO.md |
- Add the ability to include a cover image.
- Add the ability to include images in the finished book formats. |
- [ ] Add the ability to include a cover image.
- [ ] Add the ability to include images in the finished book formats.
- [ ] Add additional themes
- [ ] Find fonts to include with book
- [ ] Update book.css to fix spacing between back-to-back codeblocks
- [ ] Add a plugin that will allow for Leanpub style aside boxes
| Markdown | mit | tschmidt/author | markdown | ## Code Before:
- Add the ability to include a cover image.
- Add the ability to include images in the finished book formats.
## Instruction:
Add more things to do
## Code After:
- [ ] Add the ability to include a cover image.
- [ ] Add the ability to include images in the finished book formats.
- [ ] Add additional themes
- [ ] Find fonts to include with book
- [ ] Update book.css to fix spacing between back-to-back codeblocks
- [ ] Add a plugin that will allow for Leanpub style aside boxes
ee2a6bb99d2a8fadecddd58366c20202c863c97c | geocoder/__init__.py | geocoder/__init__.py | from .api import get, yahoo, bing, geonames, google, mapquest # noqa
from .api import nokia, osm, tomtom, geolytica, arcgis, opencage # noqa
from .api import maxmind, freegeoip, ottawa, here, baidu, w3w, yandex # noqa
# EXTRAS
from .api import timezone, elevation, ip, canadapost, reverse # noqa
# CLI
from .cli import cli # noqa
"""
Geocoder
~~~~~~~~
Geocoder is a geocoding library, written in python, simple and consistent.
Many online providers such as Google & Bing have geocoding services,
these providers do not include Python libraries and have different
JSON responses between each other.
Consistent JSON responses from various providers.
>>> g = geocoder.google('New York City')
>>> g.latlng
[40.7127837, -74.0059413]
>>> g.state
'New York'
>>> g.json
...
"""
__title__ = 'geocoder'
__author__ = 'Denis Carriere'
__author_email__ = 'carriere.denis@gmail.com'
__version__ = '1.2.2'
__license__ = 'MIT'
__copyright__ = 'Copyright (c) 2013-2015 Denis Carriere'
|
__title__ = 'geocoder'
__author__ = 'Denis Carriere'
__author_email__ = 'carriere.denis@gmail.com'
__version__ = '1.2.2'
__license__ = 'MIT'
__copyright__ = 'Copyright (c) 2013-2015 Denis Carriere'
# CORE
from .api import get, yahoo, bing, geonames, mapquest # noqa
from .api import nokia, osm, tomtom, geolytica, arcgis, opencage # noqa
from .api import maxmind, freegeoip, ottawa, here, baidu, w3w, yandex # noqa
# EXTRAS
from .api import timezone, elevation, ip, canadapost, reverse # noqa
# CLI
from .cli import cli # noqa
| Add imports to init with noqa comments | Add imports to init with noqa comments
| Python | mit | epyatopal/geocoder-1,akittas/geocoder,minimedj/geocoder,DenisCarriere/geocoder,ahlusar1989/geocoder,miraculixx/geocoder | python | ## Code Before:
from .api import get, yahoo, bing, geonames, google, mapquest # noqa
from .api import nokia, osm, tomtom, geolytica, arcgis, opencage # noqa
from .api import maxmind, freegeoip, ottawa, here, baidu, w3w, yandex # noqa
# EXTRAS
from .api import timezone, elevation, ip, canadapost, reverse # noqa
# CLI
from .cli import cli # noqa
"""
Geocoder
~~~~~~~~
Geocoder is a geocoding library, written in python, simple and consistent.
Many online providers such as Google & Bing have geocoding services,
these providers do not include Python libraries and have different
JSON responses between each other.
Consistent JSON responses from various providers.
>>> g = geocoder.google('New York City')
>>> g.latlng
[40.7127837, -74.0059413]
>>> g.state
'New York'
>>> g.json
...
"""
__title__ = 'geocoder'
__author__ = 'Denis Carriere'
__author_email__ = 'carriere.denis@gmail.com'
__version__ = '1.2.2'
__license__ = 'MIT'
__copyright__ = 'Copyright (c) 2013-2015 Denis Carriere'
## Instruction:
Add imports to init with noqa comments
## Code After:
__title__ = 'geocoder'
__author__ = 'Denis Carriere'
__author_email__ = 'carriere.denis@gmail.com'
__version__ = '1.2.2'
__license__ = 'MIT'
__copyright__ = 'Copyright (c) 2013-2015 Denis Carriere'
# CORE
from .api import get, yahoo, bing, geonames, mapquest # noqa
from .api import nokia, osm, tomtom, geolytica, arcgis, opencage # noqa
from .api import maxmind, freegeoip, ottawa, here, baidu, w3w, yandex # noqa
# EXTRAS
from .api import timezone, elevation, ip, canadapost, reverse # noqa
# CLI
from .cli import cli # noqa
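
The `# noqa` markers exist because these imports are re-exports: the names are imported only so callers can reach them as `geocoder.get`, `geocoder.google`, and so on, which linters would otherwise flag as unused. The re-export pattern can be sketched in Python with throwaway module names (the `demo_api` module below is invented for illustration):

```python
import sys
import types

# Build a tiny "provider" module standing in for geocoder.api.
api = types.ModuleType("demo_api")
exec("def get(q):\n    return ('geocoded', q)", api.__dict__)
sys.modules["demo_api"] = api

# The package front module imports the name purely to re-export it.
front = types.ModuleType("demo")
exec("from demo_api import get  # noqa", front.__dict__)

# Users can now reach the provider function through the front module.
assert front.get("Ottawa") == ("geocoded", "Ottawa")
```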
|
2e136f9fd28b947cfecd721fb94cd95f8488b2cb | modules/index-checker/src/main/java/jorgediazest/indexchecker/util/PortletPropsValues.java | modules/index-checker/src/main/java/jorgediazest/indexchecker/util/PortletPropsValues.java | /**
* Copyright (c) 2015-present Jorge Díaz All rights reserved.
*
* This library is free software; you can redistribute it and/or modify it under
* the terms of the GNU Lesser General Public License as published by the Free
* Software Foundation; either version 2.1 of the License, or (at your option)
* any later version.
*
* This library is distributed in the hope that it will be useful, but WITHOUT
* ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
* FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
* details.
*/
package jorgediazest.indexchecker.util;
import com.liferay.portal.kernel.configuration.Configuration;
import com.liferay.portal.kernel.configuration.ConfigurationFactoryUtil;
import com.liferay.portal.kernel.util.GetterUtil;
/**
* @author Jorge Díaz
*/
public class PortletPropsValues {
public static final int INDEX_SEARCH_LIMIT = GetterUtil.getInteger(
PortletPropsValues._configuration.get(
PortletPropsKeys.INDEX_SEARCH_LIMIT),
10000);
public static final int NUMBER_THREADS = GetterUtil.getInteger(
PortletPropsValues._configuration.get(PortletPropsKeys.NUMBER_THREADS),
1);
private static final Configuration _configuration =
ConfigurationFactoryUtil.getConfiguration(
PortletPropsValues.class.getClassLoader(), "portlet");
} | /**
* Copyright (c) 2015-present Jorge Díaz All rights reserved.
*
* This library is free software; you can redistribute it and/or modify it under
* the terms of the GNU Lesser General Public License as published by the Free
* Software Foundation; either version 2.1 of the License, or (at your option)
* any later version.
*
* This library is distributed in the hope that it will be useful, but WITHOUT
* ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
* FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
* details.
*/
package jorgediazest.indexchecker.util;
import com.liferay.portal.kernel.configuration.Configuration;
import com.liferay.portal.kernel.configuration.ConfigurationFactoryUtil;
import com.liferay.portal.kernel.util.GetterUtil;
/**
* @author Jorge Díaz
*/
public class PortletPropsValues {
public static final int INDEX_SEARCH_LIMIT;
public static final int NUMBER_THREADS;
private static final Configuration _configuration;
static {
_configuration = ConfigurationFactoryUtil.getConfiguration(
PortletPropsValues.class.getClassLoader(), "portlet");
INDEX_SEARCH_LIMIT = GetterUtil.getInteger(
_configuration.get(PortletPropsKeys.INDEX_SEARCH_LIMIT), 10000);
NUMBER_THREADS = GetterUtil.getInteger(
PortletPropsValues._configuration.get(
PortletPropsKeys.NUMBER_THREADS),
1);
}
} | Fix NPE caused by automatic SF | Fix NPE caused by automatic SF
| Java | lgpl-2.1 | jorgediaz-lr/index-checker | java | ## Code Before:
/**
* Copyright (c) 2015-present Jorge Díaz All rights reserved.
*
* This library is free software; you can redistribute it and/or modify it under
* the terms of the GNU Lesser General Public License as published by the Free
* Software Foundation; either version 2.1 of the License, or (at your option)
* any later version.
*
* This library is distributed in the hope that it will be useful, but WITHOUT
* ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
* FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
* details.
*/
package jorgediazest.indexchecker.util;
import com.liferay.portal.kernel.configuration.Configuration;
import com.liferay.portal.kernel.configuration.ConfigurationFactoryUtil;
import com.liferay.portal.kernel.util.GetterUtil;
/**
* @author Jorge Díaz
*/
public class PortletPropsValues {
public static final int INDEX_SEARCH_LIMIT = GetterUtil.getInteger(
PortletPropsValues._configuration.get(
PortletPropsKeys.INDEX_SEARCH_LIMIT),
10000);
public static final int NUMBER_THREADS = GetterUtil.getInteger(
PortletPropsValues._configuration.get(PortletPropsKeys.NUMBER_THREADS),
1);
private static final Configuration _configuration =
ConfigurationFactoryUtil.getConfiguration(
PortletPropsValues.class.getClassLoader(), "portlet");
}
## Instruction:
Fix NPE caused by automatic SF
## Code After:
/**
* Copyright (c) 2015-present Jorge Díaz All rights reserved.
*
* This library is free software; you can redistribute it and/or modify it under
* the terms of the GNU Lesser General Public License as published by the Free
* Software Foundation; either version 2.1 of the License, or (at your option)
* any later version.
*
* This library is distributed in the hope that it will be useful, but WITHOUT
* ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
* FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
* details.
*/
package jorgediazest.indexchecker.util;
import com.liferay.portal.kernel.configuration.Configuration;
import com.liferay.portal.kernel.configuration.ConfigurationFactoryUtil;
import com.liferay.portal.kernel.util.GetterUtil;
/**
* @author Jorge Díaz
*/
public class PortletPropsValues {
public static final int INDEX_SEARCH_LIMIT;
public static final int NUMBER_THREADS;
private static final Configuration _configuration;
static {
_configuration = ConfigurationFactoryUtil.getConfiguration(
PortletPropsValues.class.getClassLoader(), "portlet");
INDEX_SEARCH_LIMIT = GetterUtil.getInteger(
_configuration.get(PortletPropsKeys.INDEX_SEARCH_LIMIT), 10000);
NUMBER_THREADS = GetterUtil.getInteger(
PortletPropsValues._configuration.get(
PortletPropsKeys.NUMBER_THREADS),
1);
}
} |
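
The NPE arose because Java static initializers run in textual order: the fields declared above `_configuration` dereferenced it before it was assigned. The fix declares blank finals and assigns everything inside one static block, dependency first. Python class bodies also execute top to bottom, so the same ordering rule can be sketched there:

```python
broken = False
try:
    class Broken:
        # Reads _CONF before the class body has defined it -> NameError,
        # analogous to the Java fields reading a still-null _configuration.
        LIMIT = int(_CONF["limit"])
        _CONF = {"limit": "10000"}
except NameError:
    broken = True

class Fixed:
    _CONF = {"limit": "10000"}   # dependency assigned first...
    LIMIT = int(_CONF["limit"])  # ...then the field that reads it

assert broken is True
assert Fixed.LIMIT == 10000
```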
9b512127c068cb1fcde6e564a0f0cfde3e02493b | templates/app/index.html | templates/app/index.html | <html>
<head>
<title>Title</title>
<script src="bower_components/react/react.js"></script>
<script src="bower_components/react/JSXTransformer.js"></script>
</head>
<body>
<div id="app-container"></div>
<script type="text/jsx">
var App = React.createClass({
getInitialState: function() {
return { message: "" };
},
updateMessage: function(e) {
this.setState({ message: e.target.value });
},
render: function() {
return (
<div>
<input type="text" onChange={this.updateMessage} />
<Message message={this.state.message} />
</div>
);
}
});
var Message = React.createClass({
render: function() {
return <p>{this.props.message}</p>;
}
});
React.render(
<App />,
document.getElementById('app-container')
);
</script>
</body>
</html>
| <html>
<head>
<title>Title</title>
<script src="bower_components/react/react.js"></script>
<script src="bower_components/react/JSXTransformer.js"></script>
</head>
<body>
<div id="app-container"></div>
<script type="text/jsx">
var App = React.createClass({
getInitialState: function() {
return { message: "" };
},
updateMessage: function(message) {
this.setState({ message: message });
},
render: function() {
return (
<div>
<MessageInput onChange={this.updateMessage} />
<Message message={this.state.message} />
</div>
);
}
});
var MessageInput = React.createClass({
_onChange: function(e) {
this.props.onChange(e.target.value);
},
render: function() {
return <input type="text" onChange={this._onChange} />
}
});
var Message = React.createClass({
render: function() {
return <p>{this.props.message}</p>;
}
});
React.render(
<App />,
document.getElementById('app-container')
);
</script>
</body>
</html>
| Make message input part independent component | Make message input part independent component
| HTML | mit | zaki-yama/react-sample,zaki-yama/react-sample,zaki-yama/react-sample | html | ## Code Before:
<html>
<head>
<title>Title</title>
<script src="bower_components/react/react.js"></script>
<script src="bower_components/react/JSXTransformer.js"></script>
</head>
<body>
<div id="app-container"></div>
<script type="text/jsx">
var App = React.createClass({
getInitialState: function() {
return { message: "" };
},
updateMessage: function(e) {
this.setState({ message: e.target.value });
},
render: function() {
return (
<div>
<input type="text" onChange={this.updateMessage} />
<Message message={this.state.message} />
</div>
);
}
});
var Message = React.createClass({
render: function() {
return <p>{this.props.message}</p>;
}
});
React.render(
<App />,
document.getElementById('app-container')
);
</script>
</body>
</html>
## Instruction:
Make message input part independent component
## Code After:
<html>
<head>
<title>Title</title>
<script src="bower_components/react/react.js"></script>
<script src="bower_components/react/JSXTransformer.js"></script>
</head>
<body>
<div id="app-container"></div>
<script type="text/jsx">
var App = React.createClass({
getInitialState: function() {
return { message: "" };
},
updateMessage: function(message) {
this.setState({ message: message });
},
render: function() {
return (
<div>
<MessageInput onChange={this.updateMessage} />
<Message message={this.state.message} />
</div>
);
}
});
var MessageInput = React.createClass({
_onChange: function(e) {
this.props.onChange(e.target.value);
},
render: function() {
return <input type="text" onChange={this._onChange} />
}
});
var Message = React.createClass({
render: function() {
return <p>{this.props.message}</p>;
}
});
React.render(
<App />,
document.getElementById('app-container')
);
</script>
</body>
</html>
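
The refactor lifts the state into `App` and hands `MessageInput` a callback prop, so the child reports changes without owning any state itself. The same parent-owns-state, child-calls-back shape can be sketched in Python:

```python
class Message:
    """Display-only component: renders whatever text it currently holds."""
    def __init__(self):
        self.text = ""

    def render(self):
        return "<p>{}</p>".format(self.text)

class MessageInput:
    """Input component: forwards raw values through the callback it was given,
    like the onChange prop in the JSX above."""
    def __init__(self, on_change):
        self.on_change = on_change

    def type(self, value):
        self.on_change(value)

class App:
    """Parent: owns the state and passes a bound callback to the child."""
    def __init__(self):
        self.message = Message()
        self.input = MessageInput(on_change=self.update_message)

    def update_message(self, value):
        self.message.text = value

app = App()
app.input.type("hello")
assert app.message.render() == "<p>hello</p>"
```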
|
2896d1d0507ac312ab6246c3ccb33bbb6bc6d331 | bluebottle/common/management/commands/makemessages.py | bluebottle/common/management/commands/makemessages.py | import json
import codecs
import tempfile
from django.core.management.commands.makemessages import Command as BaseCommand
class Command(BaseCommand):
""" Extend the makemessages to include some of the fixtures """
fixtures = [
('bb_projects', 'project_data.json'),
('bb_tasks', 'skills.json'),
('geo', 'geo_data.json'),
]
def handle(self, *args, **kwargs):
strings = []
for app, file in self.fixtures:
with open('bluebottle/{}/fixtures/{}'.format(app, file)) as fixture_file:
strings += [fixture['fields']['name'].encode('utf-8') for fixture in json.load(fixture_file)]
with tempfile.NamedTemporaryFile(dir='bluebottle', suffix='.py') as temp:
temp.write('\n'.join(['gettext("{}")'.format(string) for string in strings]))
temp.flush()
return super(Command, self).handle(*args, **kwargs)
| import json
import codecs
import tempfile
from django.core.management.commands.makemessages import Command as BaseCommand
class Command(BaseCommand):
""" Extend the makemessages to include some of the fixtures """
fixtures = [
('bb_projects', 'project_data.json'),
('bb_tasks', 'skills.json'),
('geo', 'geo_data.json'),
]
def handle(self, *args, **kwargs):
with tempfile.NamedTemporaryFile(dir='bluebottle', suffix='.py') as temp:
for app, file in self.fixtures:
with open('bluebottle/{}/fixtures/{}'.format(app, file)) as fixture_file:
for string in [
fixture['fields']['name'].encode('utf-8')
for fixture
in json.load(fixture_file)]:
temp.write('pgettext("{}-fixtures", "{}")\n'.format(app, string))
temp.flush()
return super(Command, self).handle(*args, **kwargs)
| Add a context to the fixture translations | Add a context to the fixture translations
| Python | bsd-3-clause | onepercentclub/bluebottle,onepercentclub/bluebottle,onepercentclub/bluebottle,onepercentclub/bluebottle,onepercentclub/bluebottle | python | ## Code Before:
import json
import codecs
import tempfile
from django.core.management.commands.makemessages import Command as BaseCommand
class Command(BaseCommand):
""" Extend the makemessages to include some of the fixtures """
fixtures = [
('bb_projects', 'project_data.json'),
('bb_tasks', 'skills.json'),
('geo', 'geo_data.json'),
]
def handle(self, *args, **kwargs):
strings = []
for app, file in self.fixtures:
with open('bluebottle/{}/fixtures/{}'.format(app, file)) as fixture_file:
strings += [fixture['fields']['name'].encode('utf-8') for fixture in json.load(fixture_file)]
with tempfile.NamedTemporaryFile(dir='bluebottle', suffix='.py') as temp:
temp.write('\n'.join(['gettext("{}")'.format(string) for string in strings]))
temp.flush()
return super(Command, self).handle(*args, **kwargs)
## Instruction:
Add a context to the fixture translations
## Code After:
import json
import codecs
import tempfile
from django.core.management.commands.makemessages import Command as BaseCommand
class Command(BaseCommand):
""" Extend the makemessages to include some of the fixtures """
fixtures = [
('bb_projects', 'project_data.json'),
('bb_tasks', 'skills.json'),
('geo', 'geo_data.json'),
]
def handle(self, *args, **kwargs):
with tempfile.NamedTemporaryFile(dir='bluebottle', suffix='.py') as temp:
for app, file in self.fixtures:
with open('bluebottle/{}/fixtures/{}'.format(app, file)) as fixture_file:
for string in [
fixture['fields']['name'].encode('utf-8')
for fixture
in json.load(fixture_file)]:
temp.write('pgettext("{}-fixtures", "{}")\n'.format(app, string))
temp.flush()
return super(Command, self).handle(*args, **kwargs)
|
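The change above swaps `gettext` for `pgettext` so each app's fixture strings get their own translation context. A minimal plain-Python sketch (no Django involved; the catalog entries and translations are invented for illustration) of why a `(context, msgid)` key lets two identical source strings translate differently:

```python
# Hypothetical message catalog: pgettext looks translations up by
# (context, msgid), so identical msgids from different apps stay distinct.
CATALOG = {
    ("bb_projects-fixtures", "Education"): "Bildung",
    ("geo-fixtures", "Education"): "Ausbildung",
}

def pgettext(context, msgid):
    # Like gettext, fall back to the untranslated string when no entry exists.
    return CATALOG.get((context, msgid), msgid)

print(pgettext("bb_projects-fixtures", "Education"))  # Bildung
print(pgettext("geo-fixtures", "Education"))          # Ausbildung
print(pgettext("geo-fixtures", "Water"))              # Water
```

With a plain `gettext("Education")` call, both apps would be forced to share a single translation.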
f423e245b623b2fed76ecfc28afbb99a2a42188c | python/exports.bash | python/exports.bash | export PIP_REQUIRE_VIRTUALENV=true
export WORKON_HOME=~/Dropbox/Virtualenvs
export PROJECT_HOME=~/Dropbox/projects
export VIRTUALENVWRAPPER_VIRTUALENV_ARGS='--no-site-packages'
| export PIP_REQUIRE_VIRTUALENV=true
export WORKON_HOME=~/Dropbox/Virtualenvs
export PROJECT_HOME=~/Dropbox/projects
export VIRTUALENVWRAPPER_VIRTUALENV_ARGS='--no-site-packages'
export PYTHONDONTWRITEBYTECODE=1
| Fix python byte code problem | Fix python byte code problem
| Shell | mit | sandertammesoo/dotfiles,sandertammesoo/dotfiles | shell | ## Code Before:
export PIP_REQUIRE_VIRTUALENV=true
export WORKON_HOME=~/Dropbox/Virtualenvs
export PROJECT_HOME=~/Dropbox/projects
export VIRTUALENVWRAPPER_VIRTUALENV_ARGS='--no-site-packages'
## Instruction:
Fix python byte code problem
## Code After:
export PIP_REQUIRE_VIRTUALENV=true
export WORKON_HOME=~/Dropbox/Virtualenvs
export PROJECT_HOME=~/Dropbox/projects
export VIRTUALENVWRAPPER_VIRTUALENV_ARGS='--no-site-packages'
export PYTHONDONTWRITEBYTECODE=1
|
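`PYTHONDONTWRITEBYTECODE=1` tells CPython not to drop `.pyc` files next to the sources; stale bytecode shadowing edited files is a common cause of the "byte code problem" the commit mentions. The same switch is visible from inside Python as `sys.dont_write_bytecode` — a small sketch (the module name `scratch_mod` is invented for the demo):

```python
import importlib
import pathlib
import sys
import tempfile

# Equivalent of exporting PYTHONDONTWRITEBYTECODE=1 before starting Python.
sys.dont_write_bytecode = True

with tempfile.TemporaryDirectory() as tmp:
    pathlib.Path(tmp, "scratch_mod.py").write_text("VALUE = 42\n")
    sys.path.insert(0, tmp)
    try:
        mod = importlib.import_module("scratch_mod")
        # With bytecode writing disabled, no __pycache__ dir is created.
        wrote_cache = pathlib.Path(tmp, "__pycache__").exists()
    finally:
        sys.path.remove(tmp)

print(mod.VALUE, wrote_cache)  # 42 False
```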
09813139d3a77e4bebcbdd8cf6ef7192c9faaf33 | lib/wheaties/concerns/messaging.rb | lib/wheaties/concerns/messaging.rb | module Wheaties
module Concerns
module Messaging
def privmsg(message, *recipients)
broadcast(:privmsg, recipients.join(" "), :text => message)
end
def notice(message, *recipients)
broadcast(:notice, recipients.join(" "), :text => message)
end
def action(message, recipient)
broadcast_ctcp(recipient, :action, message)
end
protected
def broadcast(command, *args)
Connection.broadcast(command, *args)
end
def broadcast_ctcp(recipient, command, *args)
broadcast(:privmsg, recipient, :text => "\001#{command.to_s.upcase} #{args.join(" ")}\001")
end
end
end
end
| module Wheaties
module Concerns
module Messaging
def privmsg(message, *recipients)
ircify(message, *recipients) do |message, recipients|
broadcast(:privmsg, recipients.join(" "), :text => message)
end
end
def notice(message, *recipients)
ircify(message, *recipients) do |message, recipients|
broadcast(:notice, recipients.join(" "), :text => message)
end
end
def action(message, recipient)
broadcast_ctcp(recipient, :action, message)
end
protected
def broadcast(command, *args)
Connection.broadcast(command, *args)
end
def broadcast_ctcp(recipient, command, *args)
broadcast(:privmsg, recipient, :text => "\001#{command.to_s.upcase} #{args.join(" ")}\001")
end
def ircify(message, *recipients, &block)
case message
when String
message = message.split(/[\r\n]+/)
when Array
message = message.map do |line|
line.split(/[\r\n]+/)
end.flatten
else
return
end
message.each do |line|
yield line, *recipients
end
end
end
end
end
| Add special handling for splitting strings and arrays before sending messages | Add special handling for splitting strings and arrays before sending messages
| Ruby | mit | raws/wheaties | ruby | ## Code Before:
module Wheaties
module Concerns
module Messaging
def privmsg(message, *recipients)
broadcast(:privmsg, recipients.join(" "), :text => message)
end
def notice(message, *recipients)
broadcast(:notice, recipients.join(" "), :text => message)
end
def action(message, recipient)
broadcast_ctcp(recipient, :action, message)
end
protected
def broadcast(command, *args)
Connection.broadcast(command, *args)
end
def broadcast_ctcp(recipient, command, *args)
broadcast(:privmsg, recipient, :text => "\001#{command.to_s.upcase} #{args.join(" ")}\001")
end
end
end
end
## Instruction:
Add special handling for splitting strings and arrays before sending messages
## Code After:
module Wheaties
module Concerns
module Messaging
def privmsg(message, *recipients)
ircify(message, *recipients) do |message, recipients|
broadcast(:privmsg, recipients.join(" "), :text => message)
end
end
def notice(message, *recipients)
ircify(message, *recipients) do |message, recipients|
broadcast(:notice, recipients.join(" "), :text => message)
end
end
def action(message, recipient)
broadcast_ctcp(recipient, :action, message)
end
protected
def broadcast(command, *args)
Connection.broadcast(command, *args)
end
def broadcast_ctcp(recipient, command, *args)
broadcast(:privmsg, recipient, :text => "\001#{command.to_s.upcase} #{args.join(" ")}\001")
end
def ircify(message, *recipients, &block)
case message
when String
message = message.split(/[\r\n]+/)
when Array
message = message.map do |line|
line.split(/[\r\n]+/)
end.flatten
else
return
end
message.each do |line|
yield line, *recipients
end
end
end
end
end
|
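The new `ircify` helper above splits a string (or flattens an array of strings) on CR/LF runs, because a single IRC message must not contain raw line breaks. A rough Python equivalent of that normalisation (a sketch, not the library's API; unlike the Ruby version it also drops leading empty segments):

```python
import re

def ircify(message):
    """Yield one IRC-safe line per CR/LF-separated chunk of `message`."""
    if isinstance(message, str):
        parts = [message]
    elif isinstance(message, list):
        parts = message
    else:
        return []  # unsupported types are silently ignored, as in the Ruby
    return [line
            for part in parts
            for line in re.split(r"[\r\n]+", part)
            if line]

print(ircify("one\r\ntwo\nthree"))  # ['one', 'two', 'three']
print(ircify(["a\nb", "c"]))        # ['a', 'b', 'c']
print(ircify(42))                   # []
```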
51647df4a4ac48de433ee5c4dac7116b0f4ab80d | lib/gemrat/arguments.rb | lib/gemrat/arguments.rb | module Gemrat
class Arguments
class UnableToParse < StandardError; end
ATTRIBUTES = [:gems, :gemfile, :options]
ATTRIBUTES.each { |arg| attr_accessor arg }
def initialize(*args)
self.arguments = *args
validate
extract_options
end
def gems
gem_names.map do |name|
gem = Gem.new
gem.name = name
gem
end
end
private
attr_accessor :arguments
def validate
raise UnableToParse if invalid?
end
def invalid?
gem_names.empty? || gem_names.first =~ /-h|--help/ || gem_names.first.nil?
end
def parse_options
self.options = OpenStruct.new
end
def extract_options
options = arguments - gem_names
opts = Hash[*options]
self.gemfile = Gemfile.new(opts.delete("-g") || opts.delete("--gemfile") || "Gemfile")
rescue ArgumentError
raise UnableToParse
# unable to extract options, leave them nil
end
def gem_names
arguments.take_while { |arg| arg !~ /^-|^--/}
end
end
end
| module Gemrat
class Arguments
class UnableToParse < StandardError; end
ATTRIBUTES = [:gems, :gemfile, :options]
ATTRIBUTES.each { |arg| attr_accessor arg }
def initialize(*args)
self.arguments = *args
validate
parse_options
#extract_options
end
def gems
gem_names.map do |name|
gem = Gem.new
gem.name = name
gem
end
end
private
attr_accessor :arguments
def validate
raise UnableToParse if invalid?
end
def invalid?
gem_names.empty? || gem_names.first =~ /-h|--help/ || gem_names.first.nil?
end
def parse_options
self.options = OpenStruct.new
options.gemfile = "Gemfile"
OptionParser.new do |opts|
opts.on("-g", "--gemfile GEMFILE", "Specify the Gemfile to be used, defaults to 'Gemfile'") do |gemfile|
options.gemfile = gemfile
end
end.parse!(arguments)
self.gemfile = Gemfile.new(options.gemfile)
end
def extract_options
options = arguments - gem_names
opts = Hash[*options]
self.gemfile = Gemfile.new(opts.delete("-g") || opts.delete("--gemfile") || "Gemfile")
rescue ArgumentError
raise UnableToParse
# unable to extract options, leave them nil
end
def gem_names
arguments.take_while { |arg| arg !~ /^-|^--/}
end
end
end
| Implement first OptionParser flag, -g | Implement first OptionParser flag, -g
| Ruby | mit | DruRly/gemrat | ruby | ## Code Before:
module Gemrat
class Arguments
class UnableToParse < StandardError; end
ATTRIBUTES = [:gems, :gemfile, :options]
ATTRIBUTES.each { |arg| attr_accessor arg }
def initialize(*args)
self.arguments = *args
validate
extract_options
end
def gems
gem_names.map do |name|
gem = Gem.new
gem.name = name
gem
end
end
private
attr_accessor :arguments
def validate
raise UnableToParse if invalid?
end
def invalid?
gem_names.empty? || gem_names.first =~ /-h|--help/ || gem_names.first.nil?
end
def parse_options
self.options = OpenStruct.new
end
def extract_options
options = arguments - gem_names
opts = Hash[*options]
self.gemfile = Gemfile.new(opts.delete("-g") || opts.delete("--gemfile") || "Gemfile")
rescue ArgumentError
raise UnableToParse
# unable to extract options, leave them nil
end
def gem_names
arguments.take_while { |arg| arg !~ /^-|^--/}
end
end
end
## Instruction:
Implement first OptionParser flag, -g
## Code After:
module Gemrat
class Arguments
class UnableToParse < StandardError; end
ATTRIBUTES = [:gems, :gemfile, :options]
ATTRIBUTES.each { |arg| attr_accessor arg }
def initialize(*args)
self.arguments = *args
validate
parse_options
#extract_options
end
def gems
gem_names.map do |name|
gem = Gem.new
gem.name = name
gem
end
end
private
attr_accessor :arguments
def validate
raise UnableToParse if invalid?
end
def invalid?
gem_names.empty? || gem_names.first =~ /-h|--help/ || gem_names.first.nil?
end
def parse_options
self.options = OpenStruct.new
options.gemfile = "Gemfile"
OptionParser.new do |opts|
opts.on("-g", "--gemfile GEMFILE", "Specify the Gemfile to be used, defaults to 'Gemfile'") do |gemfile|
options.gemfile = gemfile
end
end.parse!(arguments)
self.gemfile = Gemfile.new(options.gemfile)
end
def extract_options
options = arguments - gem_names
opts = Hash[*options]
self.gemfile = Gemfile.new(opts.delete("-g") || opts.delete("--gemfile") || "Gemfile")
rescue ArgumentError
raise UnableToParse
# unable to extract options, leave them nil
end
def gem_names
arguments.take_while { |arg| arg !~ /^-|^--/}
end
end
end
|
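The Ruby change above wires up a first `OptionParser` flag: gem names stay positional, and `-g`/`--gemfile` overrides the default `Gemfile`. For comparison, the same shape in Python's `argparse` (program name and defaults mirror the snippet; this is a sketch, not part of gemrat):

```python
import argparse

parser = argparse.ArgumentParser(prog="gemrat")
parser.add_argument("gems", nargs="+", help="gems to add")
parser.add_argument("-g", "--gemfile", default="Gemfile",
                    help="Gemfile to be used, defaults to 'Gemfile'")

explicit = parser.parse_args(["sinatra", "rake", "-g", "Gemfile.dev"])
implicit = parser.parse_args(["rails"])

print(explicit.gems, explicit.gemfile)  # ['sinatra', 'rake'] Gemfile.dev
print(implicit.gems, implicit.gemfile)  # ['rails'] Gemfile
```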
407954191669dca72f8c8a8c230162613be8d523 | app/views/companies/_company_card.html.erb | app/views/companies/_company_card.html.erb | <% @companies.each do |company| %>
<div class="company-card" onclick="flipCard()">
<div class="flipper">
<div class="front">
<div class="company-image"></div>
<h2 class="company-name-front"><%= company.name %></h2>
</div>
<div class="back">
<h2 class="company-name"><%= company.name %></h2>
<h3 id="employees-title">Boots working here</h3>
<% company.users.each do |employee| %>
<p class="employee-name"><%= link_to employee.name, user_path(employee)%></p>
<% end %>
</div>
</div>
</div>
<% end %> | <% @companies.each do |company| %>
<% if company.users.length >= 1 %>
<div class="company-card" onclick="flipCard()">
<div class="flipper">
<div class="front">
<div class="company-image"></div>
<h2 class="company-name-front"><%= company.name %></h2>
</div>
<div class="back">
<h2 class="company-name"><%= company.name %></h2>
<h3 id="employees-title">Boots working here</h3>
<% company.users.each do |employee| %>
<p class="employee-name"><%= link_to employee.name, user_path(employee)%></p>
<% end %>
</div>
</div>
</div>
<% end %>
<% end %> | Add conditional to only show companies with users | Add conditional to only show companies with users
| HTML+ERB | mit | iand11/boot-on-the-ground,iand11/boot-on-the-ground,iand11/boot-on-the-ground | html+erb | ## Code Before:
<% @companies.each do |company| %>
<div class="company-card" onclick="flipCard()">
<div class="flipper">
<div class="front">
<div class="company-image"></div>
<h2 class="company-name-front"><%= company.name %></h2>
</div>
<div class="back">
<h2 class="company-name"><%= company.name %></h2>
<h3 id="employees-title">Boots working here</h3>
<% company.users.each do |employee| %>
<p class="employee-name"><%= link_to employee.name, user_path(employee)%></p>
<% end %>
</div>
</div>
</div>
<% end %>
## Instruction:
Add conditional to only show companies with users
## Code After:
<% @companies.each do |company| %>
<% if company.users.length >= 1 %>
<div class="company-card" onclick="flipCard()">
<div class="flipper">
<div class="front">
<div class="company-image"></div>
<h2 class="company-name-front"><%= company.name %></h2>
</div>
<div class="back">
<h2 class="company-name"><%= company.name %></h2>
<h3 id="employees-title">Boots working here</h3>
<% company.users.each do |employee| %>
<p class="employee-name"><%= link_to employee.name, user_path(employee)%></p>
<% end %>
</div>
</div>
</div>
<% end %>
<% end %> |
783b53f45a0d6923f2401674e91b2c6f599bddb8 | lib/spontaneous/facet.rb | lib/spontaneous/facet.rb |
module Spontaneous
class Facet
def initialize(root)
@root = root
paths.add :lib, ["lib", "**/*.rb"]
paths.add :schema, ["schema", "**/*.rb"]
paths.add :templates, "templates"
paths.add :config, "config"
paths.add :tasks, ["lib/tasks", "**/*.rake"]
paths.add :migrations, ["db/migrations", "**/*.rake"]
paths.add :plugins, ["plugins", "*"]
paths.add :features, "features"
end
def paths
@paths ||= Spontaneous::Paths.new(@root)
end
def config
Site.instance.config
end
def load_config!
paths.expanded(:config).each do |config_path|
Site.config.load(config_path)
end
end
def load_indexes!
paths.expanded(:config).each do |config_path|
load(config_path / "indexes.rb" )
end
end
def init!
end
def load!
Spontaneous::Loader.load_classes(load_paths)
end
def load_paths
load_paths = []
[:lib, :schema].each do |category|
load_paths += paths.expanded(category)
end
load_paths
end
end # Facet
end # Spontaneous
|
module Spontaneous
class Facet
def initialize(root)
@root = root
paths.add :lib, ["lib", "**/*.rb"]
paths.add :schema, ["schema", "**/*.rb"]
paths.add :templates, "templates"
paths.add :config, "config"
paths.add :tasks, ["lib/tasks", "**/*.rake"]
paths.add :migrations, ["db/migrations", "**/*.rake"]
paths.add :plugins, ["plugins", "*"]
paths.add :features, "features"
end
def paths
@paths ||= Spontaneous::Paths.new(@root)
end
def config
Site.instance.config
end
def load_config!
paths.expanded(:config).each do |config_path|
Site.config.load(config_path)
end
end
def load_indexes!
paths.expanded(:config).each do |config_path|
index_file = config_path / "indexes.rb"
load(index_file) if File.exists?(index_file)
end
end
def init!
end
def load!
Spontaneous::Loader.load_classes(load_paths)
end
def load_paths
load_paths = []
[:lib, :schema].each do |category|
load_paths += paths.expanded(category)
end
load_paths
end
end # Facet
end # Spontaneous
| Make loading of indexes.rb dependent on its existance | Make loading of indexes.rb dependent on its existance
| Ruby | mit | magnetised/spontaneous,opendesk/spontaneous,mediagreenhouse/spontaneous,magnetised/spontaneous,mediagreenhouse/spontaneous,SpontaneousCMS/spontaneous,mediagreenhouse/spontaneous,magnetised/spontaneous,opendesk/spontaneous,SpontaneousCMS/spontaneous,opendesk/spontaneous,SpontaneousCMS/spontaneous | ruby | ## Code Before:
module Spontaneous
class Facet
def initialize(root)
@root = root
paths.add :lib, ["lib", "**/*.rb"]
paths.add :schema, ["schema", "**/*.rb"]
paths.add :templates, "templates"
paths.add :config, "config"
paths.add :tasks, ["lib/tasks", "**/*.rake"]
paths.add :migrations, ["db/migrations", "**/*.rake"]
paths.add :plugins, ["plugins", "*"]
paths.add :features, "features"
end
def paths
@paths ||= Spontaneous::Paths.new(@root)
end
def config
Site.instance.config
end
def load_config!
paths.expanded(:config).each do |config_path|
Site.config.load(config_path)
end
end
def load_indexes!
paths.expanded(:config).each do |config_path|
load(config_path / "indexes.rb" )
end
end
def init!
end
def load!
Spontaneous::Loader.load_classes(load_paths)
end
def load_paths
load_paths = []
[:lib, :schema].each do |category|
load_paths += paths.expanded(category)
end
load_paths
end
end # Facet
end # Spontaneous
## Instruction:
Make loading of indexes.rb dependent on its existence
## Code After:
module Spontaneous
class Facet
def initialize(root)
@root = root
paths.add :lib, ["lib", "**/*.rb"]
paths.add :schema, ["schema", "**/*.rb"]
paths.add :templates, "templates"
paths.add :config, "config"
paths.add :tasks, ["lib/tasks", "**/*.rake"]
paths.add :migrations, ["db/migrations", "**/*.rake"]
paths.add :plugins, ["plugins", "*"]
paths.add :features, "features"
end
def paths
@paths ||= Spontaneous::Paths.new(@root)
end
def config
Site.instance.config
end
def load_config!
paths.expanded(:config).each do |config_path|
Site.config.load(config_path)
end
end
def load_indexes!
paths.expanded(:config).each do |config_path|
index_file = config_path / "indexes.rb"
load(index_file) if File.exists?(index_file)
end
end
def init!
end
def load!
Spontaneous::Loader.load_classes(load_paths)
end
def load_paths
load_paths = []
[:lib, :schema].each do |category|
load_paths += paths.expanded(category)
end
load_paths
end
end # Facet
end # Spontaneous
|
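The fix above guards `load` behind `File.exists?` so that sites without an `indexes.rb` no longer crash at startup. The same guard-before-load pattern, sketched in Python (file name and loader are illustrative, not from Spontaneous):

```python
import pathlib
import tempfile

def load_indexes(config_paths, loader):
    """Call `loader` for each config dir's optional index file, skipping absentees."""
    loaded = []
    for config_path in config_paths:
        index_file = pathlib.Path(config_path, "indexes.py")
        if index_file.exists():          # the guard added by the commit
            loader(index_file)
            loaded.append(index_file)
    return loaded

with tempfile.TemporaryDirectory() as has_it, tempfile.TemporaryDirectory() as lacks_it:
    pathlib.Path(has_it, "indexes.py").write_text("# present\n")
    seen = load_indexes([has_it, lacks_it], loader=lambda path: None)

print(len(seen))  # 1
```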
61d7c9e99398874745d11720cd8d985bdc3d7514 | demoapp/views.py | demoapp/views.py | from demoapp.forms import DemoLoginForm
from django.shortcuts import render_to_response
from django.shortcuts import redirect
from demoapp import app_settings
from demoapp.utils import get_salt
def login_view(request):
if request.method == 'POST':
form = DemoLoginForm(request.POST)
if form.is_valid():
response = redirect('/')
response.set_signed_cookie(app_settings.COOKIE_NAME, 'demo access granted', salt=get_salt(request))
return response
else:
form = DemoLoginForm()
return render_to_response('demoapp/login.html', {'form': form})
| from demoapp.forms import DemoLoginForm
from django.shortcuts import render
from django.shortcuts import redirect
from demoapp import app_settings
from demoapp.utils import get_salt
def login_view(request):
if request.method == 'POST':
form = DemoLoginForm(request.POST)
if form.is_valid():
response = redirect('/')
response.set_signed_cookie(app_settings.COOKIE_NAME, 'demo access granted', salt=get_salt(request))
return response
else:
form = DemoLoginForm()
return render(request, 'demoapp/login.html', {'form': form})
| Add context to login view | Add context to login view
| Python | unlicense | dboczek/django-demo,dboczek/django-demo | python | ## Code Before:
from demoapp.forms import DemoLoginForm
from django.shortcuts import render_to_response
from django.shortcuts import redirect
from demoapp import app_settings
from demoapp.utils import get_salt
def login_view(request):
if request.method == 'POST':
form = DemoLoginForm(request.POST)
if form.is_valid():
response = redirect('/')
response.set_signed_cookie(app_settings.COOKIE_NAME, 'demo access granted', salt=get_salt(request))
return response
else:
form = DemoLoginForm()
return render_to_response('demoapp/login.html', {'form': form})
## Instruction:
Add context to login view
## Code After:
from demoapp.forms import DemoLoginForm
from django.shortcuts import render
from django.shortcuts import redirect
from demoapp import app_settings
from demoapp.utils import get_salt
def login_view(request):
if request.method == 'POST':
form = DemoLoginForm(request.POST)
if form.is_valid():
response = redirect('/')
response.set_signed_cookie(app_settings.COOKIE_NAME, 'demo access granted', salt=get_salt(request))
return response
else:
form = DemoLoginForm()
return render(request, 'demoapp/login.html', {'form': form})
|
3ddeea297cfb0140c6157b5410e448f1a7982bf3 | app/helpers/user_helper.rb | app/helpers/user_helper.rb | module UserHelper
def guider_options
groups = Group.includes(:users)
{
'Groups': groups.map do |group|
[group.name, group.name, data: { icon: 'glyphicon-tag', children_to_select: group.users.pluck(:id) }]
end,
'Users': User.guiders.active.map do |guider|
[guider.name, guider.id, data: { icon: 'glyphicon-user' }]
end
}
end
end
| module UserHelper
def guider_options
groups = Group.includes(:users)
{
'Groups': groups.map do |group|
[group.name, group.name, data: { icon: 'glyphicon-tag', children_to_select: group.user_ids }]
end,
'Users': User.guiders.active.map do |guider|
[guider.name, guider.id, data: { icon: 'glyphicon-user' }]
end
}
end
end
| Reduce huge N+1 querying for associated user IDs | Reduce huge N+1 querying for associated user IDs
This was causing a huge N+1 query through groups / assignments / users.
| Ruby | mit | guidance-guarantee-programme/telephone_appointment_planner,guidance-guarantee-programme/telephone_appointment_planner,guidance-guarantee-programme/telephone_appointment_planner | ruby | ## Code Before:
module UserHelper
def guider_options
groups = Group.includes(:users)
{
'Groups': groups.map do |group|
[group.name, group.name, data: { icon: 'glyphicon-tag', children_to_select: group.users.pluck(:id) }]
end,
'Users': User.guiders.active.map do |guider|
[guider.name, guider.id, data: { icon: 'glyphicon-user' }]
end
}
end
end
## Instruction:
Reduce huge N+1 querying for associated user IDs
This was causing a huge N+1 query through groups / assignments / users.
## Code After:
module UserHelper
def guider_options
groups = Group.includes(:users)
{
'Groups': groups.map do |group|
[group.name, group.name, data: { icon: 'glyphicon-tag', children_to_select: group.user_ids }]
end,
'Users': User.guiders.active.map do |guider|
[guider.name, guider.id, data: { icon: 'glyphicon-user' }]
end
}
end
end
|
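The one-line change above matters because `pluck(:id)` issues a fresh query per group even when `includes(:users)` has already loaded the association, while `user_ids` can read the loaded records. A toy Python model of that N+1 shape (the classes and query log are stand-ins, not ActiveRecord):

```python
QUERY_LOG = []

class Group:
    def __init__(self, name, user_ids):
        self.name = name
        self.user_ids = user_ids           # preloaded alongside the group

    def pluck_user_ids(self):
        # Models pluck(:id): hits the database on every call.
        QUERY_LOG.append(f"SELECT id FROM users WHERE group = {self.name}")
        return list(self.user_ids)

groups = [Group(f"g{i}", [2 * i, 2 * i + 1]) for i in range(3)]

QUERY_LOG.clear()
before = [g.pluck_user_ids() for g in groups]   # before: one query per group
n_plus_one = len(QUERY_LOG)

QUERY_LOG.clear()
after = [g.user_ids for g in groups]            # after: zero extra queries
print(n_plus_one, len(QUERY_LOG))  # 3 0
```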
7289d253a3f92775b8295515117038928b170b0e | src/test/resources/gvm/aeroplane_mode_steps.groovy | src/test/resources/gvm/aeroplane_mode_steps.groovy | package gvm
import static cucumber.api.groovy.EN.*
Given(~'^the internet is not reachable$') {->
bash = new BashEnv(baseDir, [GVM_DIR: gvmDirEnv, GVM_SERVICE: "http://localhost:0"])
bash.start()
bash.execute("source $binDir/gvm-init.sh")
}
And(~'^the internet is reachable$') {->
bash = new BashEnv(baseDir, [GVM_DIR: gvmDirEnv, GVM_SERVICE: serviceUrlEnv])
bash.start()
bash.execute("source $binDir/gvm-init.sh")
} | package gvm
import static cucumber.api.groovy.EN.*
Given(~'^the internet is not reachable$') {->
bash = new BashEnv(gvmBaseEnv, [GVM_DIR: gvmDirEnv, GVM_SERVICE: "http://localhost:0"])
bash.start()
bash.execute("source $binDir/gvm-init.sh")
}
And(~'^the internet is reachable$') {->
bash = new BashEnv(gvmBaseEnv, [GVM_DIR: gvmDirEnv, GVM_SERVICE: serviceUrlEnv])
bash.start()
bash.execute("source $binDir/gvm-init.sh")
} | Rename base directory to gvm base environment variable in test environment. | Rename base directory to gvm base environment variable in test environment.
| Groovy | apache-2.0 | DealerDotCom/gvm-cli,skpal/sdkman-cli,nobeans/gvm-cli,busches/gvm-cli,shanman190/sdkman-cli,DealerDotCom/gvm-cli,skpal/sdkman-cli,GsusRecovery/sdkman-cli,GsusRecovery/sdkman-cli,nobeans/gvm-cli,jbovet/gvm,sdkman/sdkman-cli,gvmtool/gvm-cli | groovy | ## Code Before:
package gvm
import static cucumber.api.groovy.EN.*
Given(~'^the internet is not reachable$') {->
bash = new BashEnv(baseDir, [GVM_DIR: gvmDirEnv, GVM_SERVICE: "http://localhost:0"])
bash.start()
bash.execute("source $binDir/gvm-init.sh")
}
And(~'^the internet is reachable$') {->
bash = new BashEnv(baseDir, [GVM_DIR: gvmDirEnv, GVM_SERVICE: serviceUrlEnv])
bash.start()
bash.execute("source $binDir/gvm-init.sh")
}
## Instruction:
Rename base directory to gvm base environment variable in test environment.
## Code After:
package gvm
import static cucumber.api.groovy.EN.*
Given(~'^the internet is not reachable$') {->
bash = new BashEnv(gvmBaseEnv, [GVM_DIR: gvmDirEnv, GVM_SERVICE: "http://localhost:0"])
bash.start()
bash.execute("source $binDir/gvm-init.sh")
}
And(~'^the internet is reachable$') {->
bash = new BashEnv(gvmBaseEnv, [GVM_DIR: gvmDirEnv, GVM_SERVICE: serviceUrlEnv])
bash.start()
bash.execute("source $binDir/gvm-init.sh")
} |
806501fd9c8587248ac20f8282bb432b37b54ab0 | README.md | README.md |
Basic tools to generate an HTML visualisation of Tinc Networks.
##Example
Here is an example of a Tinc View page with 3 networks :

##Installation
Prerequisites and dependencies :
sudo aptitude install graphviz
Get project :
git clone https://github.com/24eme/tincview.git
Html and Graph generation :
make
Install crontab to update html :
* * * * * cd path/to/tincview ; make > /dev/null
##Configuration
It exploits the tinc host configuration files and the native *DumpGraph* option (by default the graph file should be *network.graph*).
To add services and VPN Ip address of the host use extra configuration directives *NodeIP* and *HostedServices* in the *hosts* files. Here is a host example :
#NodeIP=10.1.1.100
#HostedServices=[http://10.1.1.100:8080/root/app|Tomcat]
-----BEGIN RSA PUBLIC KEY-----
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
-----END RSA PUBLIC KEY-----
|
Basic tools to generate an HTML visualisation of Tinc Networks.
##Example
Here is an example of a Tinc View page with 3 networks :

##Installation
Prerequisites and dependencies :
sudo aptitude install graphviz
Get project :
git clone https://github.com/24eme/tincview.git
Html and Graph generation :
make
You can configure an Apache vhost :
https://github.com/24eme/tincview/blob/master/webviz/conf/tinc.conf.example
Install crontab to update html :
* * * * * cd path/to/tincview ; make > /dev/null
##Configuration
It exploits the tinc host configuration files and the native *DumpGraph* option (by default the graph file should be *network.graph*).
To add services and VPN Ip address of the host use extra configuration directives *NodeIP* and *HostedServices* in the *hosts* files. Here is a host example :
#NodeIP=10.1.1.100
#HostedServices=[http://10.1.1.100:8080/root/app|Tomcat]
-----BEGIN RSA PUBLIC KEY-----
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
-----END RSA PUBLIC KEY-----
| Add doc to configure apache | Add doc to configure apache | Markdown | mit | 24eme/tincview,24eme/tincview | markdown | ## Code Before:
Basic tools to generate an HTML visualisation of Tinc Networks.
##Example
Here is an example of a Tinc View page with 3 networks :

##Installation
Prerequisites and dependencies :
sudo aptitude install graphviz
Get project :
git clone https://github.com/24eme/tincview.git
Html and Graph generation :
make
Install crontab to update html :
* * * * * cd path/to/tincview ; make > /dev/null
##Configuration
It exploits the tinc host configuration files and the native *DumpGraph* option (by default the graph file should be *network.graph*).
To add services and VPN Ip address of the host use extra configuration directives *NodeIP* and *HostedServices* in the *hosts* files. Here is a host example :
#NodeIP=10.1.1.100
#HostedServices=[http://10.1.1.100:8080/root/app|Tomcat]
-----BEGIN RSA PUBLIC KEY-----
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
-----END RSA PUBLIC KEY-----
## Instruction:
Add doc to configure apache
## Code After:
Basic tools to generate an HTML visualisation of Tinc Networks.
##Example
Here is an example of a Tinc View page with 3 networks :

##Installation
Prerequisites and dependencies :
sudo aptitude install graphviz
Get project :
git clone https://github.com/24eme/tincview.git
Html and Graph generation :
make
You can configure an Apache vhost :
https://github.com/24eme/tincview/blob/master/webviz/conf/tinc.conf.example
Install crontab to update html :
* * * * * cd path/to/tincview ; make > /dev/null
##Configuration
It exploits the tinc host configuration files and the native *DumpGraph* option (by default the graph file should be *network.graph*).
To add services and VPN Ip address of the host use extra configuration directives *NodeIP* and *HostedServices* in the *hosts* files. Here is a host example :
#NodeIP=10.1.1.100
#HostedServices=[http://10.1.1.100:8080/root/app|Tomcat]
-----BEGIN RSA PUBLIC KEY-----
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
-----END RSA PUBLIC KEY-----
|
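The README above defines `#NodeIP=` and `#HostedServices=[url|label]` directives that ride along as comments in the tinc hosts files. A sketch of how a generator script might extract them (the regex and function are ours for illustration, not part of tincview):

```python
import re

HOST_FILE = """\
#NodeIP=10.1.1.100
#HostedServices=[http://10.1.1.100:8080/root/app|Tomcat]
-----BEGIN RSA PUBLIC KEY-----
-----END RSA PUBLIC KEY-----
"""

def parse_host(text):
    """Extract the tincview extras from a tinc hosts file."""
    node_ip = None
    services = []
    for line in text.splitlines():
        if line.startswith("#NodeIP="):
            node_ip = line.split("=", 1)[1].strip()
        elif line.startswith("#HostedServices="):
            # Each service is written as [url|label].
            services += re.findall(r"\[([^|\]]+)\|([^\]]+)\]", line)
    return node_ip, services

node_ip, services = parse_host(HOST_FILE)
print(node_ip, services)  # 10.1.1.100 [('http://10.1.1.100:8080/root/app', 'Tomcat')]
```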
d13f04072e3a66005885132f551702efa1d18a75 | CMakeLists.txt | CMakeLists.txt | cmake_minimum_required(VERSION 3.0)
project(pybinding)
if(NOT PYTHON_VERSION)
set(PYTHON_VERSION 3.4) # minimum required version
endif()
add_subdirectory(cppwrapper)
if(NOT CMAKE_LIBRARY_OUTPUT_DIRECTORY)
set_target_properties(_pybinding PROPERTIES LIBRARY_OUTPUT_DIRECTORY ${CMAKE_CURRENT_LIST_DIR})
foreach(config ${CMAKE_CONFIGURATION_TYPES})
string(TOUPPER ${config} config)
set_target_properties(_pybinding PROPERTIES
LIBRARY_OUTPUT_DIRECTORY_${config} ${CMAKE_CURRENT_LIST_DIR})
endforeach()
endif()
if(NOT CMAKE_BUILD_TYPE)
set(CMAKE_BUILD_TYPE "Release" CACHE STRING "Choose the type of build" FORCE)
endif()
if(EXISTS ${CMAKE_SOURCE_DIR}/docs)
add_subdirectory(docs EXCLUDE_FROM_ALL)
endif()
| cmake_minimum_required(VERSION 3.0)
project(pybinding)
if(NOT PYTHON_VERSION)
set(PYTHON_VERSION 3.4) # minimum required version
endif()
add_subdirectory(cppwrapper)
if(NOT CMAKE_LIBRARY_OUTPUT_DIRECTORY)
set_target_properties(_pybinding PROPERTIES LIBRARY_OUTPUT_DIRECTORY ${CMAKE_CURRENT_LIST_DIR})
foreach(config ${CMAKE_CONFIGURATION_TYPES})
string(TOUPPER ${config} config)
set_target_properties(_pybinding PROPERTIES
LIBRARY_OUTPUT_DIRECTORY_${config} ${CMAKE_CURRENT_LIST_DIR})
endforeach()
endif()
if(NOT CMAKE_BUILD_TYPE)
set(CMAKE_BUILD_TYPE "Release" CACHE STRING "Choose the type of build" FORCE)
endif()
if(EXISTS ${CMAKE_SOURCE_DIR}/docs)
add_subdirectory(docs EXCLUDE_FROM_ALL)
endif()
add_custom_target(pytest COMMAND py.test \${ARGS} WORKING_DIRECTORY ${CMAKE_SOURCE_DIR})
| Add pytest target to CMake | Add pytest target to CMake
| Text | bsd-2-clause | dean0x7d/pybinding,dean0x7d/pybinding,MAndelkovic/pybinding,dean0x7d/pybinding,MAndelkovic/pybinding,MAndelkovic/pybinding | text | ## Code Before:
cmake_minimum_required(VERSION 3.0)
project(pybinding)
if(NOT PYTHON_VERSION)
set(PYTHON_VERSION 3.4) # minimum required version
endif()
add_subdirectory(cppwrapper)
if(NOT CMAKE_LIBRARY_OUTPUT_DIRECTORY)
set_target_properties(_pybinding PROPERTIES LIBRARY_OUTPUT_DIRECTORY ${CMAKE_CURRENT_LIST_DIR})
foreach(config ${CMAKE_CONFIGURATION_TYPES})
string(TOUPPER ${config} config)
set_target_properties(_pybinding PROPERTIES
LIBRARY_OUTPUT_DIRECTORY_${config} ${CMAKE_CURRENT_LIST_DIR})
endforeach()
endif()
if(NOT CMAKE_BUILD_TYPE)
set(CMAKE_BUILD_TYPE "Release" CACHE STRING "Choose the type of build" FORCE)
endif()
if(EXISTS ${CMAKE_SOURCE_DIR}/docs)
add_subdirectory(docs EXCLUDE_FROM_ALL)
endif()
## Instruction:
Add pytest target to CMake
## Code After:
cmake_minimum_required(VERSION 3.0)
project(pybinding)
if(NOT PYTHON_VERSION)
set(PYTHON_VERSION 3.4) # minimum required version
endif()
add_subdirectory(cppwrapper)
if(NOT CMAKE_LIBRARY_OUTPUT_DIRECTORY)
set_target_properties(_pybinding PROPERTIES LIBRARY_OUTPUT_DIRECTORY ${CMAKE_CURRENT_LIST_DIR})
foreach(config ${CMAKE_CONFIGURATION_TYPES})
string(TOUPPER ${config} config)
set_target_properties(_pybinding PROPERTIES
LIBRARY_OUTPUT_DIRECTORY_${config} ${CMAKE_CURRENT_LIST_DIR})
endforeach()
endif()
if(NOT CMAKE_BUILD_TYPE)
set(CMAKE_BUILD_TYPE "Release" CACHE STRING "Choose the type of build" FORCE)
endif()
if(EXISTS ${CMAKE_SOURCE_DIR}/docs)
add_subdirectory(docs EXCLUDE_FROM_ALL)
endif()
add_custom_target(pytest COMMAND py.test \${ARGS} WORKING_DIRECTORY ${CMAKE_SOURCE_DIR})
|
c0a7a727bfc4afb63f62c961c92746e41a923779 | src/js/containers/Device.js | src/js/containers/Device.js | import React, { Component } from 'react';
import { connect } from 'react-redux';
import { bindActionCreators } from 'redux';
import { Box, Tabs, Tab } from 'grommet';
import { Map } from 'react-d3-map';
import { Table } from '../components';
import { deviceActions } from '../actions';
const mapConf = {
width: 1400,
height: 600,
scale: 4000,
scaleExtent: [1 << 12, 1 << 13],
center: [111, 36]
};
class Device extends Component {
componentDidMount() {
this.props.actions.loadDevices();
}
render() {
const { records } = this.props;
return (
<Box pad="small">
<Tabs>
<Tab title="Location">
<Box size="large">
<Map {...mapConf}/>
</Box>
<Table data={records}/>
</Tab>
<Tab title="Model" />
<Tab title="Customer" />
</Tabs>
</Box>
);
}
}
let mapStateToProps = (state) => {
return {
records: state.device.records
};
};
let mapDispatchProps = (dispatch) => ({
actions: bindActionCreators(deviceActions, dispatch)
});
export default connect(mapStateToProps, mapDispatchProps)(Device);
| import React, { Component } from 'react';
import { connect } from 'react-redux';
import { bindActionCreators } from 'redux';
import { Box, Tabs, Tab } from 'grommet';
import { Map, MarkerGroup } from 'react-d3-map';
import { Table } from '../components';
import { deviceActions } from '../actions';
const mapConf = {
width: 1400,
height: 600,
scale: 4000,
scaleExtent: [1 << 12, 1 << 13],
center: [111, 36]
};
class Device extends Component {
componentDidMount() {
this.props.actions.loadDevices();
}
renderMarkerGroup(records) {
return records.map((record) => {
var data = {
"type": "Feature",
"properties": {
"text": record.name
},
"geometry": {
"type": "Point",
"coordinates": [record.longitude, record.latitude]
}
}
return <MarkerGroup data={data}/>
});
}
render() {
const { records } = this.props;
return (
<Box pad="small">
<Tabs>
<Tab title="Location">
<Box size="large">
<Map {...mapConf}>
{ this.renderMarkerGroup(records) }
</Map>
</Box>
<Table data={records}/>
</Tab>
<Tab title="Model" />
<Tab title="Customer" />
</Tabs>
</Box>
);
}
}
let mapStateToProps = (state) => {
return {
records: state.device.records
};
};
let mapDispatchProps = (dispatch) => ({
actions: bindActionCreators(deviceActions, dispatch)
});
export default connect(mapStateToProps, mapDispatchProps)(Device);
| Add MarkerGroup in Map by device geo info | Add MarkerGroup in Map by device geo info
| JavaScript | mit | masogit/dashboard,masogit/dashboard | javascript | ## Code Before:
import React, { Component } from 'react';
import { connect } from 'react-redux';
import { bindActionCreators } from 'redux';
import { Box, Tabs, Tab } from 'grommet';
import { Map } from 'react-d3-map';
import { Table } from '../components';
import { deviceActions } from '../actions';
const mapConf = {
width: 1400,
height: 600,
scale: 4000,
scaleExtent: [1 << 12, 1 << 13],
center: [111, 36]
};
class Device extends Component {
componentDidMount() {
this.props.actions.loadDevices();
}
render() {
const { records } = this.props;
return (
<Box pad="small">
<Tabs>
<Tab title="Location">
<Box size="large">
<Map {...mapConf}/>
</Box>
<Table data={records}/>
</Tab>
<Tab title="Model" />
<Tab title="Customer" />
</Tabs>
</Box>
);
}
}
let mapStateToProps = (state) => {
return {
records: state.device.records
};
};
let mapDispatchProps = (dispatch) => ({
actions: bindActionCreators(deviceActions, dispatch)
});
export default connect(mapStateToProps, mapDispatchProps)(Device);
## Instruction:
Add MarkerGroup in Map by device geo info
## Code After:
import React, { Component } from 'react';
import { connect } from 'react-redux';
import { bindActionCreators } from 'redux';
import { Box, Tabs, Tab } from 'grommet';
import { Map, MarkerGroup } from 'react-d3-map';
import { Table } from '../components';
import { deviceActions } from '../actions';
const mapConf = {
width: 1400,
height: 600,
scale: 4000,
scaleExtent: [1 << 12, 1 << 13],
center: [111, 36]
};
class Device extends Component {
componentDidMount() {
this.props.actions.loadDevices();
}
renderMarkerGroup(records) {
return records.map((record) => {
var data = {
"type": "Feature",
"properties": {
"text": record.name
},
"geometry": {
"type": "Point",
"coordinates": [record.longitude, record.latitude]
}
}
return <MarkerGroup data={data}/>
});
}
render() {
const { records } = this.props;
return (
<Box pad="small">
<Tabs>
<Tab title="Location">
<Box size="large">
<Map {...mapConf}>
{ this.renderMarkerGroup(records) }
</Map>
</Box>
<Table data={records}/>
</Tab>
<Tab title="Model" />
<Tab title="Customer" />
</Tabs>
</Box>
);
}
}
let mapStateToProps = (state) => {
return {
records: state.device.records
};
};
let mapDispatchProps = (dispatch) => ({
actions: bindActionCreators(deviceActions, dispatch)
});
export default connect(mapStateToProps, mapDispatchProps)(Device);
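
Editorial aside on the record above: `renderMarkerGroup` wraps each device record in a GeoJSON `Feature` with a `Point` geometry, and GeoJSON puts longitude before latitude. A minimal sketch of that same structure (Python here, to keep one language across these asides; the sample device values are invented for illustration):

```python
def device_to_feature(record):
    """Build a GeoJSON Feature dict for one device record.

    Mirrors the shape constructed in renderMarkerGroup above; the field
    names (name, longitude, latitude) come from that record.
    """
    return {
        "type": "Feature",
        "properties": {"text": record["name"]},
        "geometry": {
            "type": "Point",
            # GeoJSON coordinate order is [longitude, latitude].
            "coordinates": [record["longitude"], record["latitude"]],
        },
    }

sample = {"name": "sensor-1", "longitude": 111.0, "latitude": 36.0}
feature = device_to_feature(sample)
print(feature["geometry"]["coordinates"])  # [111.0, 36.0]
```

The `[lon, lat]` ordering is presumably also what the map's `center: [111, 36]` option in the record relies on.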
|
9f5280521cb3d5937b18d1a40ea6ba0a80fcd1c2 | test/chpldoc/classes/methodOutsideClassDef.chpl | test/chpldoc/classes/methodOutsideClassDef.chpl | /* This class will declare a method inside itself, but will have a
method declared outside it as well */
class Foo {
proc internalMeth() {
}
}
// We expect these two methods to be printed outside of the class indentation
// level
proc Foo.externalMeth1() {
}
/* This method has a comment attached to it */
proc Foo.externalMeth2() {
}
| /* This class will declare a method inside itself, but will have a
method declared outside it as well */
class Foo {
proc internalMeth() {
}
}
// We expect these two methods to be printed outside of the class indentation
// level
proc Foo.externalMeth1() {
}
/* This method has a comment attached to it */
proc Foo.externalMeth2() {
}
/* Declares one primary and one secondary method... */
record Bar {
/* A primary method declaration. */
proc internal() {}
}
/* A secondary method declaration. */
proc Bar.external() {}
| Add record with secondary method to chpldoc/ tests. | Add record with secondary method to chpldoc/ tests.
| Chapel | apache-2.0 | hildeth/chapel,chizarlicious/chapel,CoryMcCartan/chapel,hildeth/chapel,chizarlicious/chapel,hildeth/chapel,hildeth/chapel,chizarlicious/chapel,CoryMcCartan/chapel,chizarlicious/chapel,CoryMcCartan/chapel,chizarlicious/chapel,CoryMcCartan/chapel,chizarlicious/chapel,hildeth/chapel,hildeth/chapel,CoryMcCartan/chapel,chizarlicious/chapel,CoryMcCartan/chapel,CoryMcCartan/chapel,hildeth/chapel | chapel | ## Code Before:
/* This class will declare a method inside itself, but will have a
method declared outside it as well */
class Foo {
proc internalMeth() {
}
}
// We expect these two methods to be printed outside of the class indentation
// level
proc Foo.externalMeth1() {
}
/* This method has a comment attached to it */
proc Foo.externalMeth2() {
}
## Instruction:
Add record with secondary method to chpldoc/ tests.
## Code After:
/* This class will declare a method inside itself, but will have a
method declared outside it as well */
class Foo {
proc internalMeth() {
}
}
// We expect these two methods to be printed outside of the class indentation
// level
proc Foo.externalMeth1() {
}
/* This method has a comment attached to it */
proc Foo.externalMeth2() {
}
/* Declares one primary and one secondary method... */
record Bar {
/* A primary method declaration. */
proc internal() {}
}
/* A secondary method declaration. */
proc Bar.external() {}
|
00a464ad7cf9b235d07fe7f060909acbbdd16418 | app/views/admin/index.html.haml | app/views/admin/index.html.haml | .container
%h1
%a{href: '/requested_accounts'}
Requested Accounts:
= RequestedAccount.count
.container
%h1
%a{href: '/tickets/dashboard'}
Open Tickets:
= TicketDispenser::Ticket.where(status: TicketDispenser::Ticket::Statuses::OPEN).count
| .container
%h1
%a{href: '/requested_accounts'}
Requested Accounts:
= RequestedAccount.count
.container
%h1
%a{href: '/tickets/dashboard'}
Open Tickets:
= TicketDispenser::Ticket.where(status: TicketDispenser::Ticket::Statuses::OPEN).count
.container
%h2 Useful links
%ul
%li
%a{href: '/article_finder'} Article Finder
%li
%a{href: '/usage'} Usage
%li
%a{href: '/sidekiq'} Job queue
| Add some extra links to admin index | Add some extra links to admin index
| Haml | mit | WikiEducationFoundation/WikiEduDashboard,WikiEducationFoundation/WikiEduDashboard,WikiEducationFoundation/WikiEduDashboard,WikiEducationFoundation/WikiEduDashboard,WikiEducationFoundation/WikiEduDashboard | haml | ## Code Before:
.container
%h1
%a{href: '/requested_accounts'}
Requested Accounts:
= RequestedAccount.count
.container
%h1
%a{href: '/tickets/dashboard'}
Open Tickets:
= TicketDispenser::Ticket.where(status: TicketDispenser::Ticket::Statuses::OPEN).count
## Instruction:
Add some extra links to admin index
## Code After:
.container
%h1
%a{href: '/requested_accounts'}
Requested Accounts:
= RequestedAccount.count
.container
%h1
%a{href: '/tickets/dashboard'}
Open Tickets:
= TicketDispenser::Ticket.where(status: TicketDispenser::Ticket::Statuses::OPEN).count
.container
%h2 Useful links
%ul
%li
%a{href: '/article_finder'} Article Finder
%li
%a{href: '/usage'} Usage
%li
%a{href: '/sidekiq'} Job queue
|
431ec68fe478d1b8d212c87590e5b614a4d16fe9 | app/src/storage/template.html | app/src/storage/template.html | <div class="gridster-element" gridster="gridsterStorageOptions">
<div gridster-item="map" ng-repeat="map in maps" class="grid-map">
<div ol3-map="map" styles="styles" static-map="true" class="map-box" gridster-item-element="gridsterItem">
<div class="map-title">
<h4>{{ map.title }}</h4>
</div>
<div class="map-control-btns pull-right">
<a title="Load Map to dashboard" ng-click="restore(map)">
<i class="glyphicon glyphicon-upload"></i>
</a>
<a title="Remove Map from storage" ng-click="remove(map)">
<i class="glyphicon glyphicon-trash"></i>
</a>
</div>
</div>
</div>
</div>
| <div class="gridster-element" gridster="gridsterStorageOptions">
<div gridster-item="map" ng-repeat="map in maps" class="grid-map">
<div ol3-map="map" styles="styles" static-map="true" class="map-box" gridster-item-element="gridsterItem">
<div class="map-title">
<h4 ng-click="editMode=true;" ng-hide="editMode">{{ map.title }}</h4>
<div class="input-group" ng-show="editMode">
<input name="editTitle" type="text" ng-model="map.title" ng-blur="editMode=false" focus-me="{{editMode}}" class="form-control"/>
<span class="input-group-btn">
<button ng-click="editMode=false" class="btn btn-success">
<i class="glyphicon glyphicon-ok"></i>
</button>
</span>
</div>
</div>
<div class="map-control-btns pull-right">
<a title="Load Map to dashboard" ng-click="restore(map)">
<i class="glyphicon glyphicon-upload"></i>
</a>
<a title="Remove Map from storage" ng-click="remove(map)">
<i class="glyphicon glyphicon-trash"></i>
</a>
</div>
</div>
</div>
</div>
| Make stored map title editable | Make stored map title editable
| HTML | apache-2.0 | nebulon42/magnacarto,omniscale/magnacarto,omniscale/magnacarto,nebulon42/magnacarto,omniscale/magnacarto,omniscale/magnacarto,nebulon42/magnacarto,nebulon42/magnacarto | html | ## Code Before:
<div class="gridster-element" gridster="gridsterStorageOptions">
<div gridster-item="map" ng-repeat="map in maps" class="grid-map">
<div ol3-map="map" styles="styles" static-map="true" class="map-box" gridster-item-element="gridsterItem">
<div class="map-title">
<h4>{{ map.title }}</h4>
</div>
<div class="map-control-btns pull-right">
<a title="Load Map to dashboard" ng-click="restore(map)">
<i class="glyphicon glyphicon-upload"></i>
</a>
<a title="Remove Map from storage" ng-click="remove(map)">
<i class="glyphicon glyphicon-trash"></i>
</a>
</div>
</div>
</div>
</div>
## Instruction:
Make stored map title editable
## Code After:
<div class="gridster-element" gridster="gridsterStorageOptions">
<div gridster-item="map" ng-repeat="map in maps" class="grid-map">
<div ol3-map="map" styles="styles" static-map="true" class="map-box" gridster-item-element="gridsterItem">
<div class="map-title">
<h4 ng-click="editMode=true;" ng-hide="editMode">{{ map.title }}</h4>
<div class="input-group" ng-show="editMode">
<input name="editTitle" type="text" ng-model="map.title" ng-blur="editMode=false" focus-me="{{editMode}}" class="form-control"/>
<span class="input-group-btn">
<button ng-click="editMode=false" class="btn btn-success">
<i class="glyphicon glyphicon-ok"></i>
</button>
</span>
</div>
</div>
<div class="map-control-btns pull-right">
<a title="Load Map to dashboard" ng-click="restore(map)">
<i class="glyphicon glyphicon-upload"></i>
</a>
<a title="Remove Map from storage" ng-click="remove(map)">
<i class="glyphicon glyphicon-trash"></i>
</a>
</div>
</div>
</div>
</div>
|
02f5db5fdb46684b60a9b5e9125da228a927c2c3 | mrbelvedereci/build/cumulusci/config.py | mrbelvedereci/build/cumulusci/config.py | from cumulusci.core.config import YamlGlobalConfig
from cumulusci.core.config import YamlProjectConfig
class MrbelvedereProjectConfig(YamlProjectConfig):
def __init__(self, global_config_obj, build_flow):
super(MrbelvedereProjectConfig, self).__init__(global_config_obj)
self.build_flow = build_flow
@property
def config_project_local_path(self):
""" mrbelvedere never uses the local path """
return
@property
def repo_root(self):
return self.build_flow.build_dir
@property
def repo_name(self):
return self.build_flow.build.repo.name
@property
def repo_url(self):
return self.build_flow.build.repo.url
@property
def repo_owner(self):
return self.build_flow.build.repo.url.split('/')[-2]
@property
def repo_branch(self):
return self.build_flow.build.branch.name
@property
def repo_commit(self):
return self.build_flow.build.commit
class MrbelvedereGlobalConfig(YamlGlobalConfig):
project_config_class = MrbelvedereProjectConfig
def get_project_config(self, build_flow):
return self.project_config_class(self, build_flow)
| from cumulusci.core.config import YamlGlobalConfig
from cumulusci.core.config import YamlProjectConfig
class MrbelvedereProjectConfig(YamlProjectConfig):
def __init__(self, global_config_obj, build_flow):
self.build_flow = build_flow
super(MrbelvedereProjectConfig, self).__init__(global_config_obj)
@property
def config_project_local_path(self):
""" mrbelvedere never uses the local path """
return
@property
def repo_root(self):
return self.build_flow.build_dir
@property
def repo_name(self):
return self.build_flow.build.repo.name
@property
def repo_url(self):
return self.build_flow.build.repo.url
@property
def repo_owner(self):
return self.build_flow.build.repo.url.split('/')[-2]
@property
def repo_branch(self):
return self.build_flow.build.branch.name
@property
def repo_commit(self):
return self.build_flow.build.commit
class MrbelvedereGlobalConfig(YamlGlobalConfig):
project_config_class = MrbelvedereProjectConfig
def get_project_config(self, build_flow):
return self.project_config_class(self, build_flow)
| Set self.build_flow before calling the super __init__ method | Set self.build_flow before calling the super __init__ method
| Python | bsd-3-clause | SalesforceFoundation/mrbelvedereci,SalesforceFoundation/mrbelvedereci,SalesforceFoundation/mrbelvedereci,SalesforceFoundation/mrbelvedereci | python | ## Code Before:
from cumulusci.core.config import YamlGlobalConfig
from cumulusci.core.config import YamlProjectConfig
class MrbelvedereProjectConfig(YamlProjectConfig):
def __init__(self, global_config_obj, build_flow):
super(MrbelvedereProjectConfig, self).__init__(global_config_obj)
self.build_flow = build_flow
@property
def config_project_local_path(self):
""" mrbelvedere never uses the local path """
return
@property
def repo_root(self):
return self.build_flow.build_dir
@property
def repo_name(self):
return self.build_flow.build.repo.name
@property
def repo_url(self):
return self.build_flow.build.repo.url
@property
def repo_owner(self):
return self.build_flow.build.repo.url.split('/')[-2]
@property
def repo_branch(self):
return self.build_flow.build.branch.name
@property
def repo_commit(self):
return self.build_flow.build.commit
class MrbelvedereGlobalConfig(YamlGlobalConfig):
project_config_class = MrbelvedereProjectConfig
def get_project_config(self, build_flow):
return self.project_config_class(self, build_flow)
## Instruction:
Set self.build_flow before calling the super __init__ method
## Code After:
from cumulusci.core.config import YamlGlobalConfig
from cumulusci.core.config import YamlProjectConfig
class MrbelvedereProjectConfig(YamlProjectConfig):
def __init__(self, global_config_obj, build_flow):
self.build_flow = build_flow
super(MrbelvedereProjectConfig, self).__init__(global_config_obj)
@property
def config_project_local_path(self):
""" mrbelvedere never uses the local path """
return
@property
def repo_root(self):
return self.build_flow.build_dir
@property
def repo_name(self):
return self.build_flow.build.repo.name
@property
def repo_url(self):
return self.build_flow.build.repo.url
@property
def repo_owner(self):
return self.build_flow.build.repo.url.split('/')[-2]
@property
def repo_branch(self):
return self.build_flow.build.branch.name
@property
def repo_commit(self):
return self.build_flow.build.commit
class MrbelvedereGlobalConfig(YamlGlobalConfig):
project_config_class = MrbelvedereProjectConfig
def get_project_config(self, build_flow):
return self.project_config_class(self, build_flow)
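
Editorial aside: the patch above is a classic initialization-order fix — the parent `__init__` (cumulusci's `YamlProjectConfig`, per the record) evidently touches properties such as `repo_root` that the subclass computes from `build_flow`, so that attribute must exist before the parent constructor runs. A minimal sketch with hypothetical `Parent`/child classes (not the real cumulusci API) showing the failure mode and the fix:

```python
class Parent:
    def __init__(self):
        # The parent constructor reads a property that subclasses override.
        self.root = self.repo_root

    @property
    def repo_root(self):
        return None

class BrokenChild(Parent):
    def __init__(self, build_flow):
        super().__init__()            # repo_root fires before build_flow exists
        self.build_flow = build_flow

    @property
    def repo_root(self):
        return self.build_flow["build_dir"]

class FixedChild(Parent):
    def __init__(self, build_flow):
        self.build_flow = build_flow  # set first, as in the patch above
        super().__init__()

    @property
    def repo_root(self):
        return self.build_flow["build_dir"]

try:
    BrokenChild({"build_dir": "/tmp/build"})
except AttributeError as exc:
    print("broken:", exc)

print("fixed:", FixedChild({"build_dir": "/tmp/build"}).root)
```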
|
66e8e572fa16042ef4fae95680aefefe63fc1ac7 | src/constants/enums.ts | src/constants/enums.ts | export enum AudioSource {
NEW_SEARCH_RESULT = 'https://k003.kiwi6.com/hotlink/bbgk5w2fhz/ping.ogg',
SUCCESSFUL_WATCHER_ACCEPT = 'https://k003.kiwi6.com/hotlink/sd6quj2ecs/horn.ogg'
}
export enum TabIndex {
SEARCH = 0,
QUEUE = 1,
WATCHERS = 2,
BLOCKLIST = 3,
ACCOUNT = 4,
SETTINGS = 5
}
export enum Duration {
SECONDS = 'seconds',
HOURS = 'hours',
DAYS = 'days',
WEEKS = 'weeks'
}
| export enum AudioSource {
NEW_SEARCH_RESULT = 'https://k003.kiwi6.com/hotlink/bbgk5w2fhz/ping.ogg',
SUCCESSFUL_WATCHER_ACCEPT = 'https://k003.kiwi6.com/hotlink/sd6quj2ecs/horn.ogg'
}
export enum TabIndex {
SEARCH = 0,
QUEUE = 1,
WATCHERS = 2,
BLOCKLIST = 3,
ACCOUNT = 4,
SETTINGS = 5
}
export enum Duration {
SECONDS = 'seconds',
HOURS = 'hours',
DAYS = 'days',
WEEKS = 'weeks',
MONTHS = 'months'
}
| Add MONTHS to Duration enum. | Add MONTHS to Duration enum.
| TypeScript | mit | Anveio/mturk-engine,Anveio/mturk-engine,Anveio/mturk-engine | typescript | ## Code Before:
export enum AudioSource {
NEW_SEARCH_RESULT = 'https://k003.kiwi6.com/hotlink/bbgk5w2fhz/ping.ogg',
SUCCESSFUL_WATCHER_ACCEPT = 'https://k003.kiwi6.com/hotlink/sd6quj2ecs/horn.ogg'
}
export enum TabIndex {
SEARCH = 0,
QUEUE = 1,
WATCHERS = 2,
BLOCKLIST = 3,
ACCOUNT = 4,
SETTINGS = 5
}
export enum Duration {
SECONDS = 'seconds',
HOURS = 'hours',
DAYS = 'days',
WEEKS = 'weeks'
}
## Instruction:
Add MONTHS to Duration enum.
## Code After:
export enum AudioSource {
NEW_SEARCH_RESULT = 'https://k003.kiwi6.com/hotlink/bbgk5w2fhz/ping.ogg',
SUCCESSFUL_WATCHER_ACCEPT = 'https://k003.kiwi6.com/hotlink/sd6quj2ecs/horn.ogg'
}
export enum TabIndex {
SEARCH = 0,
QUEUE = 1,
WATCHERS = 2,
BLOCKLIST = 3,
ACCOUNT = 4,
SETTINGS = 5
}
export enum Duration {
SECONDS = 'seconds',
HOURS = 'hours',
DAYS = 'days',
WEEKS = 'weeks',
MONTHS = 'months'
}
|
7a496bed78710457c4377f30f70398745eeeb963 | usr.sbin/lpr/Makefile | usr.sbin/lpr/Makefile |
SUBDIR= common_source chkprintcap lp lpc lpd lpq lpr lprm lptest pac \
filters filters.ru SMM.doc
MAINTAINER= wollman@FreeBSD.org
MAINTAINER+= gad@FreeBSD.org
.include <bsd.subdir.mk>
|
SUBDIR= common_source chkprintcap lp lpc lpd lpq lpr lprm lptest pac \
filters filters.ru SMM.doc
MAINTAINER= wollman@FreeBSD.org
MAINTAINER+= gad@FreeBSD.org
# Questions/ideas for lpr & friends could also be sent to:
# freebsd-print@bostonradio.org
.include <bsd.subdir.mk>
| Add a comment pointing to the freebsd-print@bostonradio.org mailing list. | Add a comment pointing to the freebsd-print@bostonradio.org mailing list.
| unknown | bsd-3-clause | jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase,jrobhoward/SCADAbase | unknown | ## Code Before:
SUBDIR= common_source chkprintcap lp lpc lpd lpq lpr lprm lptest pac \
filters filters.ru SMM.doc
MAINTAINER= wollman@FreeBSD.org
MAINTAINER+= gad@FreeBSD.org
.include <bsd.subdir.mk>
## Instruction:
Add a comment pointing to the freebsd-print@bostonradio.org mailing list.
## Code After:
SUBDIR= common_source chkprintcap lp lpc lpd lpq lpr lprm lptest pac \
filters filters.ru SMM.doc
MAINTAINER= wollman@FreeBSD.org
MAINTAINER+= gad@FreeBSD.org
# Questions/ideas for lpr & friends could also be sent to:
# freebsd-print@bostonradio.org
.include <bsd.subdir.mk>
|
220538bf13233e97e57f49054ace5b7392f8f4d3 | .travis.yml | .travis.yml | branches:
only:
- master
- /release-.*/
language: cpp
os:
- linux
- osx
compiler:
- gcc
- clang
dist: trusty
sudo: false
addons:
apt:
packages:
- automake
- autoconf
- libtool
- pkg-config
before_install:
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then brew update; fi
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then brew install automake autoconf libtool; fi
script: ./super-test.sh -j2 quick # limit parallelism due to limited memory on Travis
| branches:
only:
- master
- /release-.*/
language: cpp
os:
- linux
- osx
compiler:
- gcc
- clang
dist: trusty
sudo: false
addons:
apt:
packages:
- automake
- autoconf
- libtool
- pkg-config
script: ./super-test.sh -j2 quick # limit parallelism due to limited memory on Travis
| Fix Travis build: Homebrew now complains that these are already installed. | Fix Travis build: Homebrew now complains that these are already installed.
| YAML | mit | mologie/capnproto,mologie/capnproto,mologie/capnproto | yaml | ## Code Before:
branches:
only:
- master
- /release-.*/
language: cpp
os:
- linux
- osx
compiler:
- gcc
- clang
dist: trusty
sudo: false
addons:
apt:
packages:
- automake
- autoconf
- libtool
- pkg-config
before_install:
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then brew update; fi
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then brew install automake autoconf libtool; fi
script: ./super-test.sh -j2 quick # limit parallelism due to limited memory on Travis
## Instruction:
Fix Travis build: Homebrew now complains that these are already installed.
## Code After:
branches:
only:
- master
- /release-.*/
language: cpp
os:
- linux
- osx
compiler:
- gcc
- clang
dist: trusty
sudo: false
addons:
apt:
packages:
- automake
- autoconf
- libtool
- pkg-config
script: ./super-test.sh -j2 quick # limit parallelism due to limited memory on Travis
|
9c225d0f19b19657b86a153f93562881048510b4 | lib/xpi.js | lib/xpi.js | var join = require("path").join;
var console = require("./utils").console;
var fs = require("fs-promise");
var zip = require("./zip");
var all = require("when").all;
var tmp = require('./tmp');
var bootstrapSrc = join(__dirname, "../data/bootstrap.js");
/**
* Takes a manifest object (from package.json) and build options
* object and zipz up the current directory into a XPI, copying
* over default `install.rdf` and `bootstrap.js` if needed
* and not yet defined. Returns a promise that resolves
* upon completion.
*
* @param {Object} manifest
* @param {Object} options
* @return {Promise}
*/
function xpi (manifest, options) {
var cwd = process.cwd();
var xpiName = (manifest.name || "jetpack") + ".xpi";
var xpiPath = join(cwd, xpiName);
var buildOptions = createBuildOptions(options);
return doFinalZip(cwd, xpiPath);
}
module.exports = xpi;
function doFinalZip(cwd, xpiPath) {
return zip(cwd, xpiPath).then(function () {
return xpiPath;
});
}
| var join = require("path").join;
var console = require("./utils").console;
var fs = require("fs-promise");
var zip = require("./zip");
var all = require("when").all;
var tmp = require('./tmp');
var bootstrapSrc = join(__dirname, "../data/bootstrap.js");
/**
* Takes a manifest object (from package.json) and build options
* object and zipz up the current directory into a XPI, copying
* over default `install.rdf` and `bootstrap.js` if needed
* and not yet defined. Returns a promise that resolves
* upon completion.
*
* @param {Object} manifest
* @param {Object} options
* @return {Promise}
*/
function xpi (manifest, options) {
var cwd = process.cwd();
var xpiName = (manifest.name || "jetpack") + ".xpi";
var xpiPath = join(cwd, xpiName);
return doFinalZip(cwd, xpiPath);
}
module.exports = xpi;
function doFinalZip(cwd, xpiPath) {
return zip(cwd, xpiPath).then(function () {
return xpiPath;
});
}
| Remove extra createBuildOptions for now | Remove extra createBuildOptions for now
| JavaScript | mpl-2.0 | Medeah/jpm,rpl/jpm,guyzmuch/jpm,wagnerand/jpm,guyzmuch/jpm,Medeah/jpm,matraska23/jpm,nikolas/jpm,freaktechnik/jpm,matraska23/jpm,alexduch/jpm,mozilla-jetpack/jpm,jsantell/jpm,Gozala/jpm,freaktechnik/jpm,rpl/jpm,nikolas/jpm,mozilla-jetpack/jpm,kkapsner/jpm,mozilla/jpm,mozilla-jetpack/jpm,jsantell/jpm,johannhof/jpm,alexduch/jpm,benbasson/jpm,guyzmuch/jpm,mozilla/jpm,freaktechnik/jpm,Medeah/jpm,johannhof/jpm,johannhof/jpm,benbasson/jpm,Gozala/jpm,benbasson/jpm,jsantell/jpm,alexduch/jpm,wagnerand/jpm,rpl/jpm,wagnerand/jpm,kkapsner/jpm,kkapsner/jpm,mozilla/jpm,nikolas/jpm,matraska23/jpm | javascript | ## Code Before:
var join = require("path").join;
var console = require("./utils").console;
var fs = require("fs-promise");
var zip = require("./zip");
var all = require("when").all;
var tmp = require('./tmp');
var bootstrapSrc = join(__dirname, "../data/bootstrap.js");
/**
* Takes a manifest object (from package.json) and build options
* object and zipz up the current directory into a XPI, copying
* over default `install.rdf` and `bootstrap.js` if needed
* and not yet defined. Returns a promise that resolves
* upon completion.
*
* @param {Object} manifest
* @param {Object} options
* @return {Promise}
*/
function xpi (manifest, options) {
var cwd = process.cwd();
var xpiName = (manifest.name || "jetpack") + ".xpi";
var xpiPath = join(cwd, xpiName);
var buildOptions = createBuildOptions(options);
return doFinalZip(cwd, xpiPath);
}
module.exports = xpi;
function doFinalZip(cwd, xpiPath) {
return zip(cwd, xpiPath).then(function () {
return xpiPath;
});
}
## Instruction:
Remove extra createBuildOptions for now
## Code After:
var join = require("path").join;
var console = require("./utils").console;
var fs = require("fs-promise");
var zip = require("./zip");
var all = require("when").all;
var tmp = require('./tmp');
var bootstrapSrc = join(__dirname, "../data/bootstrap.js");
/**
* Takes a manifest object (from package.json) and build options
* object and zipz up the current directory into a XPI, copying
* over default `install.rdf` and `bootstrap.js` if needed
* and not yet defined. Returns a promise that resolves
* upon completion.
*
* @param {Object} manifest
* @param {Object} options
* @return {Promise}
*/
function xpi (manifest, options) {
var cwd = process.cwd();
var xpiName = (manifest.name || "jetpack") + ".xpi";
var xpiPath = join(cwd, xpiName);
return doFinalZip(cwd, xpiPath);
}
module.exports = xpi;
function doFinalZip(cwd, xpiPath) {
return zip(cwd, xpiPath).then(function () {
return xpiPath;
});
}
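
Editorial aside: `doFinalZip` above amounts to "zip the working directory into the .xpi path" (an XPI is an ordinary zip archive). A rough stdlib-Python analogue of that step — the helper name and paths are invented for illustration, and this is a sketch, not the jpm implementation:

```python
import os
import shutil
import tempfile
import zipfile

def do_final_zip(cwd, xpi_path):
    """Zip the contents of cwd into the archive at xpi_path."""
    base, _ = os.path.splitext(xpi_path)
    # shutil.make_archive always appends ".zip"; rename to the target name.
    archive = shutil.make_archive(base, "zip", root_dir=cwd)
    os.replace(archive, xpi_path)
    return xpi_path

src = tempfile.mkdtemp()
with open(os.path.join(src, "bootstrap.js"), "w") as fh:
    fh.write("// stub\n")

out = do_final_zip(src, os.path.join(tempfile.mkdtemp(), "jetpack.xpi"))
print(sorted(zipfile.ZipFile(out).namelist()))  # ['bootstrap.js']
```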
|
3bcb8f0843b7798bb02f7cf5652ec200293f0264 | cookbooks/planet/templates/default/users-deleted.erb | cookbooks/planet/templates/default/users-deleted.erb |
T=$(mktemp -d -t -p /var/tmp users.XXXXXXXXXX)
# use the same as for the users-agreed template
export PGPASSFILE=/etc/replication/users-agreed.conf
echo "# user IDs of deleted users. " > $T/users_deleted
psql -h <%= node[:web][:readonly_database_host] %> -U planetdiff -t -c "select id from users where status='deleted' order by id asc" openstreetmap >> $T/users_deleted
if cmp -s "${T}/users_deleted" "/store/planet/users_deleted.txt"; then
: # do nothing
else
cp $T/users_deleted /store/planet/users_deleted.txt
fi
rm -rf $T
|
T=$(mktemp -d -t -p /var/tmp users.XXXXXXXXXX)
# use the same as for the users-agreed template
export PGPASSFILE=/etc/replication/users-agreed.conf
echo "# user IDs of deleted users. " > $T/users_deleted
psql -h <%= node[:web][:readonly_database_host] %> -U planetdiff -t -c "select id from users where status='deleted' order by id asc" openstreetmap >> $T/users_deleted
if cmp -s "${T}/users_deleted" "/store/planet/users_deleted/users_deleted.txt"; then
: # do nothing
else
cp $T/users_deleted /store/planet/users_deleted/users_deleted.txt
fi
rm -rf $T
| Move output down a level | Move output down a level
| HTML+ERB | apache-2.0 | Firefishy/chef,zerebubuth/openstreetmap-chef,tomhughes/openstreetmap-chef,openstreetmap/chef,tomhughes/openstreetmap-chef,Firefishy/chef,openstreetmap/chef,zerebubuth/openstreetmap-chef,zerebubuth/openstreetmap-chef,zerebubuth/openstreetmap-chef,zerebubuth/openstreetmap-chef,Firefishy/chef,tomhughes/openstreetmap-chef,tomhughes/openstreetmap-chef,Firefishy/chef,Firefishy/chef,openstreetmap/chef,zerebubuth/openstreetmap-chef,Firefishy/chef,tomhughes/openstreetmap-chef,openstreetmap/chef,openstreetmap/chef,Firefishy/chef,openstreetmap/chef,Firefishy/chef,openstreetmap/chef,tomhughes/openstreetmap-chef,zerebubuth/openstreetmap-chef,openstreetmap/chef,tomhughes/openstreetmap-chef,zerebubuth/openstreetmap-chef,tomhughes/openstreetmap-chef | html+erb | ## Code Before:
T=$(mktemp -d -t -p /var/tmp users.XXXXXXXXXX)
# use the same as for the users-agreed template
export PGPASSFILE=/etc/replication/users-agreed.conf
echo "# user IDs of deleted users. " > $T/users_deleted
psql -h <%= node[:web][:readonly_database_host] %> -U planetdiff -t -c "select id from users where status='deleted' order by id asc" openstreetmap >> $T/users_deleted
if cmp -s "${T}/users_deleted" "/store/planet/users_deleted.txt"; then
: # do nothing
else
cp $T/users_deleted /store/planet/users_deleted.txt
fi
rm -rf $T
## Instruction:
Move output down a level
## Code After:
T=$(mktemp -d -t -p /var/tmp users.XXXXXXXXXX)
# use the same as for the users-agreed template
export PGPASSFILE=/etc/replication/users-agreed.conf
echo "# user IDs of deleted users. " > $T/users_deleted
psql -h <%= node[:web][:readonly_database_host] %> -U planetdiff -t -c "select id from users where status='deleted' order by id asc" openstreetmap >> $T/users_deleted
if cmp -s "${T}/users_deleted" "/store/planet/users_deleted/users_deleted.txt"; then
: # do nothing
else
cp $T/users_deleted /store/planet/users_deleted/users_deleted.txt
fi
rm -rf $T
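
Editorial aside: the `cmp -s … || cp` pattern in the script above — overwrite the published file only when its content actually changed, so consumers don't see spurious modification times — has a direct stdlib-Python equivalent. A hedged sketch (helper name invented for illustration):

```python
import filecmp
import os
import shutil
import tempfile

def install_if_changed(new_path, dest_path):
    """Copy new_path over dest_path only if the contents differ.

    Mirrors the `cmp -s new dest || cp new dest` idiom above, leaving
    dest (and its timestamp) untouched when nothing changed.
    """
    if os.path.exists(dest_path) and filecmp.cmp(new_path, dest_path, shallow=False):
        return False  # identical content: leave dest alone
    shutil.copyfile(new_path, dest_path)
    return True

workdir = tempfile.mkdtemp()
new = os.path.join(workdir, "users_deleted.new")
dest = os.path.join(workdir, "users_deleted.txt")
with open(new, "w") as fh:
    fh.write("# user IDs of deleted users.\n1\n2\n")

print(install_if_changed(new, dest))  # True  (dest did not exist yet)
print(install_if_changed(new, dest))  # False (contents now identical)
```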
|
622ad90f13100b5d5296af662552de2960afb7a6 | src/Field/Scalar/BooleanField.php | src/Field/Scalar/BooleanField.php | <?php
/*
* This file is part of the Active Collab DatabaseStructure project.
*
* (c) A51 doo <info@activecollab.com>. All rights reserved.
*/
namespace ActiveCollab\DatabaseStructure\Field\Scalar;
use LogicException;
/**
* @package ActiveCollab\DatabaseStructure\Field\Scalar
*/
class BooleanField extends Field
{
/**
* @param string $name
* @param bool $default_value
*/
public function __construct($name, $default_value = false)
{
parent::__construct($name, $default_value);
}
/**
* Value of this column needs to be unique (in the given context).
*
* @param string[] $context
* @return $this
*/
public function &unique(...$context)
{
throw new LogicException('Boolean columns cant be made unique');
}
/**
* Return value casting code.
*
* @param string $variable_name
* @return string
*/
public function getCastingCode($variable_name)
{
return '(bool) $' . $variable_name;
}
}
| <?php
/*
* This file is part of the Active Collab DatabaseStructure project.
*
* (c) A51 doo <info@activecollab.com>. All rights reserved.
*/
namespace ActiveCollab\DatabaseStructure\Field\Scalar;
use LogicException;
/**
* @package ActiveCollab\DatabaseStructure\Field\Scalar
*/
class BooleanField extends Field
{
/**
* @param string $name
* @param bool $default_value
*/
public function __construct($name, $default_value = false)
{
parent::__construct($name, $default_value);
}
/**
* {@inheritdoc}
*/
public function &unique(...$context)
{
throw new LogicException('Boolean columns cant be made unique');
}
/**
* {@inheritdoc}
*/
public function getNativeType()
{
return 'bool';
}
/**
* {@inheritdoc}
*/
public function getCastingCode($variable_name)
{
return '(bool) $' . $variable_name;
}
}
| Add native type to boolean fields | Add native type to boolean fields
| PHP | mit | activecollab/databasestructure | php | ## Code Before:
<?php
/*
* This file is part of the Active Collab DatabaseStructure project.
*
* (c) A51 doo <info@activecollab.com>. All rights reserved.
*/
namespace ActiveCollab\DatabaseStructure\Field\Scalar;
use LogicException;
/**
* @package ActiveCollab\DatabaseStructure\Field\Scalar
*/
class BooleanField extends Field
{
/**
* @param string $name
* @param bool $default_value
*/
public function __construct($name, $default_value = false)
{
parent::__construct($name, $default_value);
}
/**
* Value of this column needs to be unique (in the given context).
*
* @param string[] $context
* @return $this
*/
public function &unique(...$context)
{
throw new LogicException('Boolean columns cant be made unique');
}
/**
* Return value casting code.
*
* @param string $variable_name
* @return string
*/
public function getCastingCode($variable_name)
{
return '(bool) $' . $variable_name;
}
}
## Instruction:
Add native type to boolean fields
## Code After:
<?php
/*
* This file is part of the Active Collab DatabaseStructure project.
*
* (c) A51 doo <info@activecollab.com>. All rights reserved.
*/
namespace ActiveCollab\DatabaseStructure\Field\Scalar;
use LogicException;
/**
* @package ActiveCollab\DatabaseStructure\Field\Scalar
*/
class BooleanField extends Field
{
/**
* @param string $name
* @param bool $default_value
*/
public function __construct($name, $default_value = false)
{
parent::__construct($name, $default_value);
}
/**
* {@inheritdoc}
*/
public function &unique(...$context)
{
throw new LogicException('Boolean columns cant be made unique');
}
/**
* {@inheritdoc}
*/
public function getNativeType()
{
return 'bool';
}
/**
* {@inheritdoc}
*/
public function getCastingCode($variable_name)
{
return '(bool) $' . $variable_name;
}
}
|
29e895fab31148e0bc542c16612a1546894ead5f | spec/support/helpers/request_spec_helper.rb | spec/support/helpers/request_spec_helper.rb | module RequestSpecHelper
def i_should_see(content)
page.should have_content(content)
end
def i_should_not_see(content)
page.should have_no_content(content)
end
def i_should_be_on(path)
current_path.should eq(path)
end
def url_should_have_param(param, value)
current_params[param].should == value
end
def url_should_not_have_param(param)
current_params.should_not have_key(param)
end
def current_params
Rack::Utils.parse_query(current_uri.query)
end
def current_uri
URI.parse(page.current_url)
end
def should_have_header(header, value)
headers[header].should == value
end
end
RSpec.configuration.send :include, RequestSpecHelper, :type => :request
| module RequestSpecHelper
def i_should_see(content)
page.should have_content(content)
end
def i_should_not_see(content)
page.should have_no_content(content)
end
def i_should_be_on(path)
current_path.should eq(path)
end
def url_should_have_param(param, value)
current_params[param].should == value
end
def url_should_not_have_param(param)
current_params.should_not have_key(param)
end
def current_params
Rack::Utils.parse_query(current_uri.query)
end
def current_uri
URI.parse(page.current_url)
end
def should_have_header(header, value)
headers[header].should == value
end
def sign_in
visit '/'
click_on "Sign in"
end
def i_should_see_translated_error_message(key)
i_should_see translated_error_message(key)
end
def translated_error_message(key)
I18n.translate key, :scope => [:doorkeeper, :errors, :messages]
end
end
RSpec.configuration.send :include, RequestSpecHelper, :type => :request
| Add few more request helpers | Add few more request helpers
| Ruby | mit | doorkeeper-gem/doorkeeper,outstand/doorkeeper,stormz/doorkeeper,Tout/doorkeeper,jasl/doorkeeper,telekomatrix/doorkeeper,ezilocchi/doorkeeper,dk1234567/doorkeeper,ukasiu/doorkeeper,ministryofjustice/doorkeeper,vanboom/doorkeeper,phillbaker/doorkeeper,mavenlink/doorkeeper,Tout/doorkeeper,CloudTags/doorkeeper,ezilocchi/doorkeeper,lalithr95/doorkeeper,ValMilkevich/doorkeeper,6fusion/doorkeeper,identification-io/doorkeeper,GeekOnCoffee/doorkeeper,doorkeeper-gem/doorkeeper,Uysim/doorkeeper,jasl/doorkeeper,daande/doorkeeper,AdStage/doorkeeper,ValMilkevich/doorkeeper,vovimayhem/doorkeeper,stefania11/doorkeeper,ifeelgoods/doorkeeper,identification-io/doorkeeper,6fusion/doorkeeper,coinbase/modified-doorkeeper,Uysim/doorkeeper,dollarshaveclub/doorkeeper,mavenlink/doorkeeper,moneytree/doorkeeper,outstand/doorkeeper,lalithr95/doorkeeper,moneytree/doorkeeper,calfzhou/doorkeeper,vovimayhem/doorkeeper,telekomatrix/doorkeeper,daande/doorkeeper,doorkeeper-gem/doorkeeper,dollarshaveclub/doorkeeper,GeekOnCoffee/doorkeeper,mavenlink/doorkeeper,ngpestelos/doorkeeper,coinbase/modified-doorkeeper,pmdeazeta/doorkeeper,CloudTags/doorkeeper,ministryofjustice/doorkeeper,coinbase/modified-doorkeeper,stormz/doorkeeper,outstand/doorkeeper,ngpestelos/doorkeeper,ifeelgoods/doorkeeper,kolorahl/doorkeeper,shivakumaarmgs/doorkeeper,Tout/doorkeeper,shivakumaarmgs/doorkeeper,nbulaj/doorkeeper,CloudTags/doorkeeper,kolorahl/doorkeeper,EasterAndJay/doorkeeper,jasl/doorkeeper,ifeelgoods/doorkeeper,pmdeazeta/doorkeeper,nbulaj/doorkeeper,AICIDNN/doorkeeper,AdStage/doorkeeper,stefania11/doorkeeper,AICIDNN/doorkeeper,calfzhou/doorkeeper,EasterAndJay/doorkeeper,CloudTags/doorkeeper,vanboom/doorkeeper,dk1234567/doorkeeper,ukasiu/doorkeeper | ruby | ## Code Before:
module RequestSpecHelper
def i_should_see(content)
page.should have_content(content)
end
def i_should_not_see(content)
page.should have_no_content(content)
end
def i_should_be_on(path)
current_path.should eq(path)
end
def url_should_have_param(param, value)
current_params[param].should == value
end
def url_should_not_have_param(param)
current_params.should_not have_key(param)
end
def current_params
Rack::Utils.parse_query(current_uri.query)
end
def current_uri
URI.parse(page.current_url)
end
def should_have_header(header, value)
headers[header].should == value
end
end
RSpec.configuration.send :include, RequestSpecHelper, :type => :request
## Instruction:
Add few more request helpers
## Code After:
module RequestSpecHelper
def i_should_see(content)
page.should have_content(content)
end
def i_should_not_see(content)
page.should have_no_content(content)
end
def i_should_be_on(path)
current_path.should eq(path)
end
def url_should_have_param(param, value)
current_params[param].should == value
end
def url_should_not_have_param(param)
current_params.should_not have_key(param)
end
def current_params
Rack::Utils.parse_query(current_uri.query)
end
def current_uri
URI.parse(page.current_url)
end
def should_have_header(header, value)
headers[header].should == value
end
def sign_in
visit '/'
click_on "Sign in"
end
def i_should_see_translated_error_message(key)
i_should_see translated_error_message(key)
end
def translated_error_message(key)
I18n.translate key, :scope => [:doorkeeper, :errors, :messages]
end
end
RSpec.configuration.send :include, RequestSpecHelper, :type => :request
|
d6147ef7ba68c89656a721844ea1de6b696c3f83 | IRC/plugins/Git/Git.rb | IRC/plugins/Git/Git.rb |
require_relative '../../IRCPlugin'
class Git < IRCPlugin
Description = "Plugin to run git pull."
Commands = {
:pull => "runs git pull"
}
Dependencies = [ :Statistics ]
def afterLoad
@l = @bot.pluginManager.plugins[:Statistics]
end
def beforeUnload
@l = nil
end
def on_privmsg(msg)
case msg.botcommand
when :pull
versionBefore = @l.versionString
gitPull
versionAfter = @l.versionString
if versionBefore == versionAfter
msg.reply('Already up-to-date.')
else
msg.reply("Updating #{versionBefore} -> #{versionAfter}")
end
end
end
def gitPull
`pushd #{File.dirname($0)} && $(which git) fetch && $(which git) reset --hard @{upstream} && popd`
end
end
|
require_relative '../../IRCPlugin'
class Git < IRCPlugin
Description = "Plugin to run git pull."
Commands = {
:pull => "fetches changes from upstream and resets current branch and working directory to reflect it"
}
Dependencies = [ :Statistics ]
def afterLoad
@l = @bot.pluginManager.plugins[:Statistics]
end
def beforeUnload
@l = nil
end
def on_privmsg(msg)
case msg.botcommand
when :pull
versionBefore = @l.versionString
gitPull
versionAfter = @l.versionString
if versionBefore == versionAfter
msg.reply('Already up-to-date.')
else
msg.reply("Updating #{versionBefore} -> #{versionAfter}")
end
end
end
def gitPull
`pushd #{File.dirname($0)} && $(which git) fetch && $(which git) reset --hard @{upstream} && popd`
end
end
| Update command description for .pull | Update command description for .pull
| Ruby | agpl-3.0 | k5bot/k5bot,k5bot/k5bot,k5bot/k5bot | ruby | ## Code Before:
require_relative '../../IRCPlugin'
class Git < IRCPlugin
Description = "Plugin to run git pull."
Commands = {
:pull => "runs git pull"
}
Dependencies = [ :Statistics ]
def afterLoad
@l = @bot.pluginManager.plugins[:Statistics]
end
def beforeUnload
@l = nil
end
def on_privmsg(msg)
case msg.botcommand
when :pull
versionBefore = @l.versionString
gitPull
versionAfter = @l.versionString
if versionBefore == versionAfter
msg.reply('Already up-to-date.')
else
msg.reply("Updating #{versionBefore} -> #{versionAfter}")
end
end
end
def gitPull
`pushd #{File.dirname($0)} && $(which git) fetch && $(which git) reset --hard @{upstream} && popd`
end
end
## Instruction:
Update command description for .pull
## Code After:
require_relative '../../IRCPlugin'
class Git < IRCPlugin
Description = "Plugin to run git pull."
Commands = {
:pull => "fetches changes from upstream and resets current branch and working directory to reflect it"
}
Dependencies = [ :Statistics ]
def afterLoad
@l = @bot.pluginManager.plugins[:Statistics]
end
def beforeUnload
@l = nil
end
def on_privmsg(msg)
case msg.botcommand
when :pull
versionBefore = @l.versionString
gitPull
versionAfter = @l.versionString
if versionBefore == versionAfter
msg.reply('Already up-to-date.')
else
msg.reply("Updating #{versionBefore} -> #{versionAfter}")
end
end
end
def gitPull
`pushd #{File.dirname($0)} && $(which git) fetch && $(which git) reset --hard @{upstream} && popd`
end
end
|
075343bbeba8bc3488cd03c6b78eeb6b5dfb91f9 | spec/flatfile_spec.rb | spec/flatfile_spec.rb | require 'spec_helper'
require './generate_flatfile'
describe "Flatfiles backend", :js => true do
type = "Flatfile"
name = "Metric"
metric = "#{type}~#{name}"
before :each do
add_config type_config("Flatfile", {file_name: 'public/flatfile_15s.csv', metric: name})
test_config type
end
context "refresh metrics" do
include_examples "refresh metrics", type
end
context "graphs" do
it_behaves_like "a graph", metric
end
end
describe "Broken Filefiles Backend", :js => true do
it "test fallback functionality" do
bad_backend = "Flatfile~Potato"
file = 'this_does_not_exist/nope.csv'
add_config type_config("Flatfile", {file_name: file, metric: bad_backend})
expect_json_error "/metric/?metric=#{bad_backend}"
expect_page_error "/refresh"
end
end
| require 'spec_helper'
require './generate_flatfile'
describe "Flatfiles backend", :js => true do
type = "Flatfile"
name = "Metric"
metric = "#{type}~#{name}"
before :each do
add_config type_config("Flatfile", {file_name: 'public/flatfile_15s.csv', metric: name, interpolate: true})
test_config type
end
context "refresh metrics" do
include_examples "refresh metrics", type
end
context "graphs" do
it_behaves_like "a graph", metric
end
end
describe "Broken Filefiles Backend", :js => true do
it "test fallback functionality" do
bad_backend = "Flatfile~Potato"
file = 'this_does_not_exist/nope.csv'
add_config type_config("Flatfile", {file_name: file, metric: bad_backend})
expect_json_error "/metric/?metric=#{bad_backend}"
expect_page_error "/refresh"
end
end
| Use interpolation for flatfile test | Use interpolation for flatfile test
| Ruby | bsd-3-clause | glasnt/machiavelli,machiavellian/machiavelli,machiavellian/machiavelli,glasnt/machiavelli,glasnt/machiavelli,machiavellian/machiavelli,glasnt/machiavelli,machiavellian/machiavelli | ruby | ## Code Before:
require 'spec_helper'
require './generate_flatfile'
describe "Flatfiles backend", :js => true do
type = "Flatfile"
name = "Metric"
metric = "#{type}~#{name}"
before :each do
add_config type_config("Flatfile", {file_name: 'public/flatfile_15s.csv', metric: name})
test_config type
end
context "refresh metrics" do
include_examples "refresh metrics", type
end
context "graphs" do
it_behaves_like "a graph", metric
end
end
describe "Broken Filefiles Backend", :js => true do
it "test fallback functionality" do
bad_backend = "Flatfile~Potato"
file = 'this_does_not_exist/nope.csv'
add_config type_config("Flatfile", {file_name: file, metric: bad_backend})
expect_json_error "/metric/?metric=#{bad_backend}"
expect_page_error "/refresh"
end
end
## Instruction:
Use interpolation for flatfile test
## Code After:
require 'spec_helper'
require './generate_flatfile'
describe "Flatfiles backend", :js => true do
type = "Flatfile"
name = "Metric"
metric = "#{type}~#{name}"
before :each do
add_config type_config("Flatfile", {file_name: 'public/flatfile_15s.csv', metric: name, interpolate: true})
test_config type
end
context "refresh metrics" do
include_examples "refresh metrics", type
end
context "graphs" do
it_behaves_like "a graph", metric
end
end
describe "Broken Filefiles Backend", :js => true do
it "test fallback functionality" do
bad_backend = "Flatfile~Potato"
file = 'this_does_not_exist/nope.csv'
add_config type_config("Flatfile", {file_name: file, metric: bad_backend})
expect_json_error "/metric/?metric=#{bad_backend}"
expect_page_error "/refresh"
end
end
|
0fcb89b672f19f3df41796aff561e7f22425692f | share/spice/thumbtack/thumbtack.js | share/spice/thumbtack/thumbtack.js | (function (env) {
"use strict";
env.ddg_spice_thumbtack = function(api_result){
// Don't show anything if we weren't able to return 2 or more services.
// In future iterations, we should support a different view in the case
// of a single result.
if (!api_result || api_result.error) {
return Spice.failed('thumbtack');
}
// Render the response
DDG.require('maps', function() {
Spice.add({
id: "thumbtack",
name: "Professionals",
data: api_result.data,
model: 'Place',
view: 'Places',
normalize: function(item) {
return {
// Street-level Addresses are private to most professionals,
// so we instead show a description of the service.
address: item.description,
image: DDG.toHTTP(item.image)
};
},
meta: {
sourceName: "Thumbtack",
sourceUrl: 'https://thumbtack.com/'+ api_result.landing_page_endpoint,
itemType: "Showing " + api_result.data.length + " " + api_result.plural_taxonym
},
templates: {
group: 'places'
}
});
});
};
}(this));
| (function (env) {
"use strict";
env.ddg_spice_thumbtack = function(api_result){
if (!api_result || api_result.error || !api_result.data || !api_result.data.length) {
return Spice.failed('thumbtack');
}
// Render the response
DDG.require('maps', function() {
Spice.add({
id: "thumbtack",
name: "Professionals",
data: api_result.data,
model: 'Place',
view: 'Places',
normalize: function(item) {
return {
// Street-level Addresses are private to most professionals,
// so we instead show a description of the service.
address: item.description,
image: DDG.toHTTP(item.image)
};
},
meta: {
sourceName: "Thumbtack",
sourceUrl: 'https://thumbtack.com/'+ api_result.landing_page_endpoint,
itemType: "Showing " + api_result.data.length + " " + api_result.plural_taxonym
},
templates: {
group: 'places'
}
});
});
};
}(this));
| Check for an empty result | Check for an empty result
JavaScript | apache-2.0 | MoriTanosuke/zeroclickinfo-spice,levaly/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,deserted/zeroclickinfo-spice,soleo/zeroclickinfo-spice,levaly/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice,TomBebbington/zeroclickinfo-spice,soleo/zeroclickinfo-spice,whalenrp/zeroclickinfo-spice,MoriTanosuke/zeroclickinfo-spice,TomBebbington/zeroclickinfo-spice,whalenrp/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice,dogomedia/zeroclickinfo-spice,lerna/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,lernae/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,soleo/zeroclickinfo-spice,deserted/zeroclickinfo-spice,dogomedia/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,imwally/zeroclickinfo-spice,lerna/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,deserted/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,lerna/zeroclickinfo-spice,TomBebbington/zeroclickinfo-spice,imwally/zeroclickinfo-spice,imwally/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,lernae/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,dogomedia/zeroclickinfo-spice,imwally/zeroclickinfo-spice,bigcurl/zeroclickinfo-spice,TomBebbington/zeroclickinfo-spice,whalenrp/zeroclickinfo-spice,deserted/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,soleo/zeroclickinfo-spice,soleo/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,P71/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,xaviervalarino/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,bigcurl/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice,P71/zeroclickinfo-spice,MoriTanosuke/zeroclickinfo-spice,whalenrp/zeroclickinfo-spice,cylgom/zeroclickinfo-spice,deserted/zeroclickinfo-spice,P71/zeroclickinfo-spice,lernae/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,lernae/zeroclickinfo-spice,levaly/zeroclickinfo-spice,lernae/zeroclickinfo-spice,levaly/zeroclickinfo-spice,whalenrp/zeroclickinfo-spice,dogomedia/zeroclickinfo-spice,lerna/zeroclickinfo-spice,bigcurl/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,marianosimone/zeroclickinfo-spice,iambibhas/zeroclickinfo-spice,levaly/zeroclickinfo-spice,kevintab95/zeroclickinfo-spice | javascript | ## Code Before:
(function (env) {
"use strict";
env.ddg_spice_thumbtack = function(api_result){
// Don't show anything if we weren't able to return 2 or more services.
// In future iterations, we should support a different view in the case
// of a single result.
if (!api_result || api_result.error) {
return Spice.failed('thumbtack');
}
// Render the response
DDG.require('maps', function() {
Spice.add({
id: "thumbtack",
name: "Professionals",
data: api_result.data,
model: 'Place',
view: 'Places',
normalize: function(item) {
return {
// Street-level Addresses are private to most professionals,
// so we instead show a description of the service.
address: item.description,
image: DDG.toHTTP(item.image)
};
},
meta: {
sourceName: "Thumbtack",
sourceUrl: 'https://thumbtack.com/'+ api_result.landing_page_endpoint,
itemType: "Showing " + api_result.data.length + " " + api_result.plural_taxonym
},
templates: {
group: 'places'
}
});
});
};
}(this));
## Instruction:
Check for an empty result
## Code After:
(function (env) {
"use strict";
env.ddg_spice_thumbtack = function(api_result){
if (!api_result || api_result.error || !api_result.data || !api_result.data.length) {
return Spice.failed('thumbtack');
}
// Render the response
DDG.require('maps', function() {
Spice.add({
id: "thumbtack",
name: "Professionals",
data: api_result.data,
model: 'Place',
view: 'Places',
normalize: function(item) {
return {
// Street-level Addresses are private to most professionals,
// so we instead show a description of the service.
address: item.description,
image: DDG.toHTTP(item.image)
};
},
meta: {
sourceName: "Thumbtack",
sourceUrl: 'https://thumbtack.com/'+ api_result.landing_page_endpoint,
itemType: "Showing " + api_result.data.length + " " + api_result.plural_taxonym
},
templates: {
group: 'places'
}
});
});
};
}(this));
|
1988901d7b174d041ae1418179d63b7b7941a4e3 | src/Euler/Problem016.hs | src/Euler/Problem016.hs | -- 2^15 = 32768 and the sum of its digits is 3 + 2 + 7 + 6 + 8 = 26.
-- What is the sum of the digits of the number 2^1000?
module Euler.Problem016
( solution
) where
solution :: Integer -> Integer
solution = undefined
| -- 2^15 = 32768 and the sum of its digits is 3 + 2 + 7 + 6 + 8 = 26.
-- What is the sum of the digits of the number 2^1000?
module Euler.Problem016
( solution
) where
solution :: Integer -> Integer
solution e = sum . digits $ 2^e
digits :: Integer -> [Integer]
digits = map (read . return) . show
| Add solution to problem 16. | Add solution to problem 16.
| Haskell | mit | whittle/euler | haskell | ## Code Before:
-- 2^15 = 32768 and the sum of its digits is 3 + 2 + 7 + 6 + 8 = 26.
-- What is the sum of the digits of the number 2^1000?
module Euler.Problem016
( solution
) where
solution :: Integer -> Integer
solution = undefined
## Instruction:
Add solution to problem 16.
## Code After:
-- 2^15 = 32768 and the sum of its digits is 3 + 2 + 7 + 6 + 8 = 26.
-- What is the sum of the digits of the number 2^1000?
module Euler.Problem016
( solution
) where
solution :: Integer -> Integer
solution e = sum . digits $ 2^e
digits :: Integer -> [Integer]
digits = map (read . return) . show
|
74e0422096d02d56a4cbb5088631a45b0d11b8f1 | ipmi/test_ipmi_resets.robot | ipmi/test_ipmi_resets.robot | *** Settings ***
Documentation Module to test IPMI cold and warm reset functionalities.
Resource ../lib/ipmi_client.robot
Resource ../lib/openbmc_ffdc.robot
Test Teardown FFDC On Test Case Fail
Suite Teardown Redfish Power Off
*** Variables ***
# User may pass LOOP_COUNT.
${LOOP_COUNT} ${1}
*** Test Cases ***
Test IPMI Warm Reset
[Documentation] Check IPMI warm reset and wait for BMC to become online.
[Tags] Test_IPMI_Warm_Reset
Repeat Keyword ${LOOP_COUNT} times IPMI MC Reset Warm (off)
Test IPMI Cold Reset
[Documentation] Check IPMI cold reset and wait for BMC to become online.
[Tags] Test_IPMI_Cold_Reset
Repeat Keyword ${LOOP_COUNT} times IPMI MC Reset Cold (off)
Verify BMC Power Cycle via IPMI
[Documentation] Verify IPMI power cycle command works fine.
[Tags] Verify_BMC_Power_Cycle_via_IPMI
Redfish Power On stack_mode=skip quiet=1
Run IPMI Standard Command chassis power cycle
Wait Until Keyword Succeeds 3 min 10 sec Is IPMI Chassis Off
Wait Until Keyword Succeeds 3 min 10 sec Is IPMI Chassis On
*** Keywords ***
Is IPMI Chassis Off
[Documentation] Check if chassis state is "Off" via IPMI.
${power_state}= Get Chassis Power State
Should Be Equal ${power_state} Off
Is IPMI Chassis On
[Documentation] Check if chassis state is "On" via IPMI.
${power_state}= Get Chassis Power State
Should Be Equal ${power_state} On
| *** Settings ***
Documentation Module to test IPMI cold and warm reset functionalities.
Resource ../lib/ipmi_client.robot
Resource ../lib/openbmc_ffdc.robot
Test Teardown FFDC On Test Case Fail
*** Variables ***
# User may pass LOOP_COUNT.
${LOOP_COUNT} ${1}
*** Test Cases ***
Test IPMI Warm Reset
[Documentation] Check IPMI warm reset and wait for BMC to become online.
[Tags] Test_IPMI_Warm_Reset
Repeat Keyword ${LOOP_COUNT} times IPMI MC Reset Warm (off)
Test IPMI Cold Reset
[Documentation] Check IPMI cold reset and wait for BMC to become online.
[Tags] Test_IPMI_Cold_Reset
Repeat Keyword ${LOOP_COUNT} times IPMI MC Reset Cold (off)
Verify BMC Power Cycle via IPMI
[Documentation] Verify IPMI power cycle command works fine.
[Tags] Verify_BMC_Power_Cycle_via_IPMI
Repeat Keyword ${LOOP_COUNT} times IPMI Power Cycle
| Update IPMI power cycle test | Update IPMI power cycle test
Change-Id: I2c7461377fe928f937be511ff542b90f4ae56e2e
Signed-off-by: George Keishing <bef0a9ecac45fb57611777c8270153994e13fd2e@in.ibm.com>
| RobotFramework | apache-2.0 | openbmc/openbmc-test-automation,openbmc/openbmc-test-automation | robotframework | ## Code Before:
*** Settings ***
Documentation Module to test IPMI cold and warm reset functionalities.
Resource ../lib/ipmi_client.robot
Resource ../lib/openbmc_ffdc.robot
Test Teardown FFDC On Test Case Fail
Suite Teardown Redfish Power Off
*** Variables ***
# User may pass LOOP_COUNT.
${LOOP_COUNT} ${1}
*** Test Cases ***
Test IPMI Warm Reset
[Documentation] Check IPMI warm reset and wait for BMC to become online.
[Tags] Test_IPMI_Warm_Reset
Repeat Keyword ${LOOP_COUNT} times IPMI MC Reset Warm (off)
Test IPMI Cold Reset
[Documentation] Check IPMI cold reset and wait for BMC to become online.
[Tags] Test_IPMI_Cold_Reset
Repeat Keyword ${LOOP_COUNT} times IPMI MC Reset Cold (off)
Verify BMC Power Cycle via IPMI
[Documentation] Verify IPMI power cycle command works fine.
[Tags] Verify_BMC_Power_Cycle_via_IPMI
Redfish Power On stack_mode=skip quiet=1
Run IPMI Standard Command chassis power cycle
Wait Until Keyword Succeeds 3 min 10 sec Is IPMI Chassis Off
Wait Until Keyword Succeeds 3 min 10 sec Is IPMI Chassis On
*** Keywords ***
Is IPMI Chassis Off
[Documentation] Check if chassis state is "Off" via IPMI.
${power_state}= Get Chassis Power State
Should Be Equal ${power_state} Off
Is IPMI Chassis On
[Documentation] Check if chassis state is "On" via IPMI.
${power_state}= Get Chassis Power State
Should Be Equal ${power_state} On
## Instruction:
Update IPMI power cycle test
Change-Id: I2c7461377fe928f937be511ff542b90f4ae56e2e
Signed-off-by: George Keishing <bef0a9ecac45fb57611777c8270153994e13fd2e@in.ibm.com>
## Code After:
*** Settings ***
Documentation Module to test IPMI cold and warm reset functionalities.
Resource ../lib/ipmi_client.robot
Resource ../lib/openbmc_ffdc.robot
Test Teardown FFDC On Test Case Fail
*** Variables ***
# User may pass LOOP_COUNT.
${LOOP_COUNT} ${1}
*** Test Cases ***
Test IPMI Warm Reset
[Documentation] Check IPMI warm reset and wait for BMC to become online.
[Tags] Test_IPMI_Warm_Reset
Repeat Keyword ${LOOP_COUNT} times IPMI MC Reset Warm (off)
Test IPMI Cold Reset
[Documentation] Check IPMI cold reset and wait for BMC to become online.
[Tags] Test_IPMI_Cold_Reset
Repeat Keyword ${LOOP_COUNT} times IPMI MC Reset Cold (off)
Verify BMC Power Cycle via IPMI
[Documentation] Verify IPMI power cycle command works fine.
[Tags] Verify_BMC_Power_Cycle_via_IPMI
Repeat Keyword ${LOOP_COUNT} times IPMI Power Cycle
|
b1fe3ab9fab78737d4a2b2a40c74bbfdd22bd6c8 | README.md | README.md |
Chrysalide is a clean, minimalistic, responsive, and Lynx friendly theme. It is the default theme for [Motyl](https://github.com/fcambus/motyl), an opinionated blog-aware static site generator written in Lua.
## Syntax highlighting
Chrysalide uses [Prism](http://prismjs.com/) for client-side Syntax highlighting. The bundled style sheet has support for the following languages: Markup, CSS, C-like, JavaScript.
## License
Chrysalide is released under the BSD 2-Clause license. See `LICENSE` file for details.
The theme bundles third party fonts and scripts released under the following licenses:
- Open Sans fonts are licensed under the Apache license Version 2.0
- Prism is licensed under the MIT license
## Author
Chrysalide is developed by Frederic Cambus.
- Site: https://www.cambus.net
## Resources
GitHub: https://github.com/fcambus/chrysalide
|
Chrysalide is a clean, minimalistic, responsive, and Lynx friendly theme. It is the default theme for [Motyl](https://github.com/fcambus/motyl), an opinionated blog-aware static site generator written in Lua.
## Syntax highlighting
Chrysalide uses [Prism](http://prismjs.com/) for client-side Syntax highlighting. The bundled style sheet has support for the following languages: Markup, CSS, C-like, JavaScript.
## License
Chrysalide is released under the BSD 2-Clause license. See `LICENSE` file for details.
The theme bundles third party fonts and scripts released under the following licenses:
- Work Sans fonts are licensed under the SIL Open Font License v1.1
- Prism is licensed under the MIT license
## Author
Chrysalide is developed by Frederic Cambus.
- Site: https://www.cambus.net
## Resources
GitHub: https://github.com/fcambus/chrysalide
| Update license information regarding fonts | Update license information regarding fonts
| Markdown | bsd-2-clause | fcambus/chrysalide | markdown | ## Code Before:
Chrysalide is a clean, minimalistic, responsive, and Lynx friendly theme. It is the default theme for [Motyl](https://github.com/fcambus/motyl), an opinionated blog-aware static site generator written in Lua.
## Syntax highlighting
Chrysalide uses [Prism](http://prismjs.com/) for client-side Syntax highlighting. The bundled style sheet has support for the following languages: Markup, CSS, C-like, JavaScript.
## License
Chrysalide is released under the BSD 2-Clause license. See `LICENSE` file for details.
The theme bundles third party fonts and scripts released under the following licenses:
- Open Sans fonts are licensed under the Apache license Version 2.0
- Prism is licensed under the MIT license
## Author
Chrysalide is developed by Frederic Cambus.
- Site: https://www.cambus.net
## Resources
GitHub: https://github.com/fcambus/chrysalide
## Instruction:
Update license information regarding fonts
## Code After:
Chrysalide is a clean, minimalistic, responsive, and Lynx friendly theme. It is the default theme for [Motyl](https://github.com/fcambus/motyl), an opinionated blog-aware static site generator written in Lua.
## Syntax highlighting
Chrysalide uses [Prism](http://prismjs.com/) for client-side Syntax highlighting. The bundled style sheet has support for the following languages: Markup, CSS, C-like, JavaScript.
## License
Chrysalide is released under the BSD 2-Clause license. See `LICENSE` file for details.
The theme bundles third party fonts and scripts released under the following licenses:
- Work Sans fonts are licensed under the SIL Open Font License v1.1
- Prism is licensed under the MIT license
## Author
Chrysalide is developed by Frederic Cambus.
- Site: https://www.cambus.net
## Resources
GitHub: https://github.com/fcambus/chrysalide
|
08297f1f75e37a65d89afd5bdd6ed14aa8bae321 | src/base-query/bootstrap.js | src/base-query/bootstrap.js | import { version, homepage } from '../../package.json'
import Logger from 'js-logger'
import './polyfills.js'
// Bootstrap logger
Logger.useDefaults()
// print header to console
console.log('base3d.js v'+version+'\n'+homepage) | import { version, homepage } from '../../package.json'
import Logger from 'js-logger'
import './polyfills.js'
// Bootstrap logger
Logger.useDefaults()
// print header to console in browser environment
var isBrowser = typeof window !== 'undefined' && Object.prototype.toString.call(window) === '[object Window]'
if (isBrowser) {
console.log('bq v'+version+'\n'+homepage)
} | Print banner to console in browser environment only | Print banner to console in browser environment only
| JavaScript | mit | archilogic-com/3dio-js,archilogic-com/3dio-js | javascript | ## Code Before:
import { version, homepage } from '../../package.json'
import Logger from 'js-logger'
import './polyfills.js'
// Bootstrap logger
Logger.useDefaults()
// print header to console
console.log('base3d.js v'+version+'\n'+homepage)
## Instruction:
Print banner to console in browser environment only
## Code After:
import { version, homepage } from '../../package.json'
import Logger from 'js-logger'
import './polyfills.js'
// Bootstrap logger
Logger.useDefaults()
// print header to console in browser environment
var isBrowser = typeof window !== 'undefined' && Object.prototype.toString.call(window) === '[object Window]'
if (isBrowser) {
console.log('bq v'+version+'\n'+homepage)
} |
7891cf254bb98b65503675a20ed6b013385328cf | setup.py | setup.py | import setuptools
def package_data_dirs(source, sub_folders):
import os
dirs = []
for d in sub_folders:
for dirname, _, files in os.walk(os.path.join(source, d)):
dirname = os.path.relpath(dirname, source)
for f in files:
dirs.append(os.path.join(dirname, f))
return dirs
def params():
name = "OctoPrint-Netconnectd"
version = "0.1"
description = "Client for netconnectd that allows configuration of netconnectd through OctoPrint's settings dialog. It's only available for Linux right now."
author = "Gina Häußge"
author_email = "osd@foosel.net"
url = "http://octoprint.org"
license = "AGPLv3"
packages = ["octoprint_netconnectd"]
package_data = {"octoprint": package_data_dirs('octoprint_netconnectd', ['static', 'templates'])}
include_package_data = True
zip_safe = False
install_requires = open("requirements.txt").read().split("\n")
entry_points = {
"octoprint.plugin": [
"netconnectd = octoprint_netconnectd"
]
}
return locals()
setuptools.setup(**params())
| import setuptools
def package_data_dirs(source, sub_folders):
import os
dirs = []
for d in sub_folders:
for dirname, _, files in os.walk(os.path.join(source, d)):
dirname = os.path.relpath(dirname, source)
for f in files:
dirs.append(os.path.join(dirname, f))
return dirs
def params():
name = "OctoPrint-Netconnectd"
version = "0.1"
description = "Client for netconnectd that allows configuration of netconnectd through OctoPrint's settings dialog. It's only available for Linux right now."
author = "Gina Häußge"
author_email = "osd@foosel.net"
url = "http://octoprint.org"
license = "AGPLv3"
packages = ["octoprint_netconnectd"]
package_data = {"octoprint_netconnectd": package_data_dirs('octoprint_netconnectd', ['static', 'templates'])}
include_package_data = True
zip_safe = False
install_requires = open("requirements.txt").read().split("\n")
entry_points = {
"octoprint.plugin": [
"netconnectd = octoprint_netconnectd"
]
}
return locals()
setuptools.setup(**params())
| Copy paste error leading to static and template folders not being properly installed along side the package | Copy paste error leading to static and template folders not being properly installed along side the package
| Python | agpl-3.0 | OctoPrint/OctoPrint-Netconnectd,mrbeam/OctoPrint-Netconnectd,mrbeam/OctoPrint-Netconnectd,OctoPrint/OctoPrint-Netconnectd,mrbeam/OctoPrint-Netconnectd | python | ## Code Before:
import setuptools
def package_data_dirs(source, sub_folders):
import os
dirs = []
for d in sub_folders:
for dirname, _, files in os.walk(os.path.join(source, d)):
dirname = os.path.relpath(dirname, source)
for f in files:
dirs.append(os.path.join(dirname, f))
return dirs
def params():
name = "OctoPrint-Netconnectd"
version = "0.1"
description = "Client for netconnectd that allows configuration of netconnectd through OctoPrint's settings dialog. It's only available for Linux right now."
author = "Gina Häußge"
author_email = "osd@foosel.net"
url = "http://octoprint.org"
license = "AGPLv3"
packages = ["octoprint_netconnectd"]
package_data = {"octoprint": package_data_dirs('octoprint_netconnectd', ['static', 'templates'])}
include_package_data = True
zip_safe = False
install_requires = open("requirements.txt").read().split("\n")
entry_points = {
"octoprint.plugin": [
"netconnectd = octoprint_netconnectd"
]
}
return locals()
setuptools.setup(**params())
## Instruction:
Copy paste error leading to static and template folders not being properly installed along side the package
## Code After:
import setuptools
def package_data_dirs(source, sub_folders):
import os
dirs = []
for d in sub_folders:
for dirname, _, files in os.walk(os.path.join(source, d)):
dirname = os.path.relpath(dirname, source)
for f in files:
dirs.append(os.path.join(dirname, f))
return dirs
def params():
name = "OctoPrint-Netconnectd"
version = "0.1"
description = "Client for netconnectd that allows configuration of netconnectd through OctoPrint's settings dialog. It's only available for Linux right now."
author = "Gina Häußge"
author_email = "osd@foosel.net"
url = "http://octoprint.org"
license = "AGPLv3"
packages = ["octoprint_netconnectd"]
package_data = {"octoprint_netconnectd": package_data_dirs('octoprint_netconnectd', ['static', 'templates'])}
include_package_data = True
zip_safe = False
install_requires = open("requirements.txt").read().split("\n")
entry_points = {
"octoprint.plugin": [
"netconnectd = octoprint_netconnectd"
]
}
return locals()
setuptools.setup(**params())
|
38ff21bbc0f64e805b75aa954fe2a748ad32e76f | src/log/index.ts | src/log/index.ts | // Copyright 2016 Joe Duffy. All rights reserved.
"use strict";
import * as contract from '../contract';
export interface ILogger {
infof(msg: string, ...args: any[]): void;
errorf(msg: string, ...args: any[]): void;
fatalf(msg: string, ...args: any[]): void;
}
let consoleLogger: ILogger = {
infof: (msg: string, ...args: any[]) => {
console.log(msg, ...args);
},
errorf: (msg: string, ...args: any[]) => {
console.error(msg, ...args);
},
fatalf: (msg: string, ...args: any[]) => {
contract.failf(msg, ...args);
},
};
let ignoreLogger: ILogger = {
infof: (msg: string, ...args: any[]) => {},
errorf: (msg: string, ...args: any[]) => {},
fatalf: (msg: string, ...args: any[]) => {},
};
export function Get(t?: number): ILogger {
if (!!t || t >= v) {
return consoleLogger;
}
return ignoreLogger;
}
let v: number = 0;
export function Set(t: number): void {
v = t;
}
export function V(t: number): boolean {
return (v >= t);
}
| // Copyright 2016 Joe Duffy. All rights reserved.
"use strict";
import * as contract from '../contract';
export interface ILogger {
infof(msg: string, ...args: any[]): void;
errorf(msg: string, ...args: any[]): void;
fatalf(msg: string, ...args: any[]): void;
}
let consoleLogger: ILogger = {
infof: (msg: string, ...args: any[]) => {
console.log(msg, ...args);
},
errorf: (msg: string, ...args: any[]) => {
console.error(msg, ...args);
},
fatalf: (msg: string, ...args: any[]) => {
contract.failf(msg, ...args);
},
};
let ignoreLogger: ILogger = {
infof: (msg: string, ...args: any[]) => {},
errorf: (msg: string, ...args: any[]) => {},
fatalf: (msg: string, ...args: any[]) => {},
};
let loglevel: number = 0;
export function configure(threshold: number): void {
loglevel = threshold;
}
export function out(target?: number): ILogger {
if (target === undefined || v(target)) {
return consoleLogger;
}
return ignoreLogger;
}
export function v(target: number): boolean {
return (target <= loglevel);
}
| Rename functions to be less obscure (and cased properly) | Rename functions to be less obscure (and cased properly)
| TypeScript | mit | joeduffy/nodets | typescript | ## Code Before:
// Copyright 2016 Joe Duffy. All rights reserved.
"use strict";
import * as contract from '../contract';
export interface ILogger {
infof(msg: string, ...args: any[]): void;
errorf(msg: string, ...args: any[]): void;
fatalf(msg: string, ...args: any[]): void;
}
let consoleLogger: ILogger = {
infof: (msg: string, ...args: any[]) => {
console.log(msg, ...args);
},
errorf: (msg: string, ...args: any[]) => {
console.error(msg, ...args);
},
fatalf: (msg: string, ...args: any[]) => {
contract.failf(msg, ...args);
},
};
let ignoreLogger: ILogger = {
infof: (msg: string, ...args: any[]) => {},
errorf: (msg: string, ...args: any[]) => {},
fatalf: (msg: string, ...args: any[]) => {},
};
export function Get(t?: number): ILogger {
if (!!t || t >= v) {
return consoleLogger;
}
return ignoreLogger;
}
let v: number = 0;
export function Set(t: number): void {
v = t;
}
export function V(t: number): boolean {
return (v >= t);
}
## Instruction:
Rename functions to be less obscure (and cased properly)
## Code After:
// Copyright 2016 Joe Duffy. All rights reserved.
"use strict";
import * as contract from '../contract';
export interface ILogger {
infof(msg: string, ...args: any[]): void;
errorf(msg: string, ...args: any[]): void;
fatalf(msg: string, ...args: any[]): void;
}
let consoleLogger: ILogger = {
infof: (msg: string, ...args: any[]) => {
console.log(msg, ...args);
},
errorf: (msg: string, ...args: any[]) => {
console.error(msg, ...args);
},
fatalf: (msg: string, ...args: any[]) => {
contract.failf(msg, ...args);
},
};
let ignoreLogger: ILogger = {
infof: (msg: string, ...args: any[]) => {},
errorf: (msg: string, ...args: any[]) => {},
fatalf: (msg: string, ...args: any[]) => {},
};
let loglevel: number = 0;
export function configure(threshold: number): void {
loglevel = threshold;
}
export function out(target?: number): ILogger {
if (target === undefined || v(target)) {
return consoleLogger;
}
return ignoreLogger;
}
export function v(target: number): boolean {
return (target <= loglevel);
}
|
1996083e58f308dfa42ff315ea15b7dba8326679 | apina-core/src/main/java/fi/evident/apina/model/settings/TranslationSettings.java | apina-core/src/main/java/fi/evident/apina/model/settings/TranslationSettings.java | package fi.evident.apina.model.settings;
import fi.evident.apina.utils.PatternSet;
import java.util.Collection;
import java.util.Set;
import java.util.TreeMap;
import java.util.TreeSet;
import static java.util.Collections.unmodifiableCollection;
/**
* Various settings guiding the translation.
*/
public final class TranslationSettings {
public final PatternSet blackBoxClasses = new PatternSet();
private final TreeMap<String, ImportDefinition> importsByModule = new TreeMap<>();
private final Set<String> importedTypes = new TreeSet<>();
public boolean isBlackBoxClass(String name) {
return blackBoxClasses.test(name);
}
public void addImport(String typeName, String moduleName) {
if (!importedTypes.add(typeName))
throw new IllegalArgumentException("type " + typeName + " is already imported");
importsByModule.computeIfAbsent(moduleName, ImportDefinition::new).addType(typeName);
}
public Collection<ImportDefinition> getImports() {
return unmodifiableCollection(importsByModule.values());
}
public boolean isImported(String typeName) {
return importedTypes.contains(typeName);
}
}
| package fi.evident.apina.model.settings;
import fi.evident.apina.utils.PatternSet;
import java.util.Collection;
import java.util.Set;
import java.util.TreeMap;
import java.util.TreeSet;
import static java.util.Collections.unmodifiableCollection;
/**
* Various settings guiding the translation.
*/
public final class TranslationSettings {
public final PatternSet blackBoxClasses = new PatternSet();
private final TreeMap<String, ImportDefinition> importsByModule = new TreeMap<>();
private final Set<String> importedTypes = new TreeSet<>();
public boolean isBlackBoxClass(String name) {
return blackBoxClasses.test(name);
}
public void addImport(String moduleName, Collection<String> types) {
ImportDefinition importDefinition = importsByModule.computeIfAbsent(moduleName, ImportDefinition::new);
for (String type : types) {
if (!importedTypes.add(type))
throw new IllegalArgumentException("type " + type + " is already imported");
importDefinition.addType(type);
}
}
public Collection<ImportDefinition> getImports() {
return unmodifiableCollection(importsByModule.values());
}
public boolean isImported(String typeName) {
return importedTypes.contains(typeName);
}
}
| Allow multiple types to be imported with single call | Allow multiple types to be imported with single call
Since both the command line client and the Gradle task support
specifying multiple imports from a module with a shorthand syntax, it
makes things simpler for callers to allow passing multiple types in a
single addImport-call.
| Java | mit | EvidentSolutions/apina,EvidentSolutions/apina,EvidentSolutions/apina | java | ## Code Before:
package fi.evident.apina.model.settings;
import fi.evident.apina.utils.PatternSet;
import java.util.Collection;
import java.util.Set;
import java.util.TreeMap;
import java.util.TreeSet;
import static java.util.Collections.unmodifiableCollection;
/**
* Various settings guiding the translation.
*/
public final class TranslationSettings {
public final PatternSet blackBoxClasses = new PatternSet();
private final TreeMap<String, ImportDefinition> importsByModule = new TreeMap<>();
private final Set<String> importedTypes = new TreeSet<>();
public boolean isBlackBoxClass(String name) {
return blackBoxClasses.test(name);
}
public void addImport(String typeName, String moduleName) {
if (!importedTypes.add(typeName))
throw new IllegalArgumentException("type " + typeName + " is already imported");
importsByModule.computeIfAbsent(moduleName, ImportDefinition::new).addType(typeName);
}
public Collection<ImportDefinition> getImports() {
return unmodifiableCollection(importsByModule.values());
}
public boolean isImported(String typeName) {
return importedTypes.contains(typeName);
}
}
## Instruction:
Allow multiple types to be imported with single call
Since both the command line client and the Gradle task support
specifying multiple imports from a module with a shorthand syntax, it
makes things simpler for callers to allow passing multiple types in a
single addImport-call.
## Code After:
package fi.evident.apina.model.settings;
import fi.evident.apina.utils.PatternSet;
import java.util.Collection;
import java.util.Set;
import java.util.TreeMap;
import java.util.TreeSet;
import static java.util.Collections.unmodifiableCollection;
/**
* Various settings guiding the translation.
*/
public final class TranslationSettings {
public final PatternSet blackBoxClasses = new PatternSet();
private final TreeMap<String, ImportDefinition> importsByModule = new TreeMap<>();
private final Set<String> importedTypes = new TreeSet<>();
public boolean isBlackBoxClass(String name) {
return blackBoxClasses.test(name);
}
public void addImport(String moduleName, Collection<String> types) {
ImportDefinition importDefinition = importsByModule.computeIfAbsent(moduleName, ImportDefinition::new);
for (String type : types) {
if (!importedTypes.add(type))
throw new IllegalArgumentException("type " + type + " is already imported");
importDefinition.addType(type);
}
}
public Collection<ImportDefinition> getImports() {
return unmodifiableCollection(importsByModule.values());
}
public boolean isImported(String typeName) {
return importedTypes.contains(typeName);
}
}
|
1bd52c2e40222a3031b839b2f829d8933f3504fe | includes/registrars.inc.php | includes/registrars.inc.php | <?php
/*
* Get the list of registrars.
*/
$registrars = json_decode(file_get_contents('../includes/registrars.json'));
if ($registrars === FALSE)
{
header('HTTP/1.0 500 Internal Server Error');
echo '500 Internal Server Error';
exit();
}
| <?php
/*
* Get the list of registrars.
*/
$registrars = json_decode(file_get_contents('../includes/registrars.json'));
if ($registrars === FALSE)
{
header('HTTP/1.0 500 Internal Server Error');
echo '500 Internal Server Error';
exit();
}
/*
* If a parameter has been passed, that's a GNIS ID, so display that record.
*/
if (isset($parameter))
{
if (isset($registrars->$parameter))
{
echo json_encode($registrars->$parameter);
}
}
/*
* If no GNIS ID has been passed, list all of the registrars' records.
*/
else
{
/*
* The key is the GNIS ID for the locality. Make this an explicit element, and turn the object
* into an array.
*/
$registrars_new = array();
$i=0;
foreach ($registrars as $gnis_id => $registrar)
{
$registrar->gnis_id = $gnis_id;
$registrars_new[$i] = $registrar;
$i++;
}
$registrars = $registrars_new;
unset($registrars_new);
echo json_encode($registrars);
}
| Build out the rest of the registrar method | Build out the rest of the registrar method
| PHP | mit | TrustTheVote-Project/horatio-server,TrustTheVote-Project/horatio-server | php | ## Code Before:
<?php
/*
* Get the list of registrars.
*/
$registrars = json_decode(file_get_contents('../includes/registrars.json'));
if ($registrars === FALSE)
{
header('HTTP/1.0 500 Internal Server Error');
echo '500 Internal Server Error';
exit();
}
## Instruction:
Build out the rest of the registrar method
## Code After:
<?php
/*
* Get the list of registrars.
*/
$registrars = json_decode(file_get_contents('../includes/registrars.json'));
if ($registrars === FALSE)
{
header('HTTP/1.0 500 Internal Server Error');
echo '500 Internal Server Error';
exit();
}
/*
* If a parameter has been passed, that's a GNIS ID, so display that record.
*/
if (isset($parameter))
{
if (isset($registrars->$parameter))
{
echo json_encode($registrars->$parameter);
}
}
/*
* If no GNIS ID has been passed, list all of the registrars' records.
*/
else
{
/*
* The key is the GNIS ID for the locality. Make this an explicit element, and turn the object
* into an array.
*/
$registrars_new = array();
$i=0;
foreach ($registrars as $gnis_id => $registrar)
{
$registrar->gnis_id = $gnis_id;
$registrars_new[$i] = $registrar;
$i++;
}
$registrars = $registrars_new;
unset($registrars_new);
echo json_encode($registrars);
}
|
728a9cda2e8c14aeacbf57189e757496b9f35a2b | app/views/apps/edit.html.haml | app/views/apps/edit.html.haml | .page-header
%h1 Edit App
= render "form"
- if policy(@app).destroy?
%hr
%h3 Remove this App
%p
Here lies danger. Removing the app from Cuttlefish will permanently delete the app, the associated emails and can not be undone!
%p
= button_to "Remove App", @app, confirm: "Are you sure?", method: :delete, class: "btn btn-danger"
| .page-header
%h1 Edit App
= render "form"
- if policy(@app).destroy?
.alert.alert-block
%h2 Remove this App
%p
Here lies danger. Removing the app from Cuttlefish will permanently delete the app, the associated emails and can not be undone!
%p
= button_to "Remove App", @app, confirm: "Are you sure?", method: :delete, class: "btn btn-danger"
| Make the dangerous bit more obviously dangerous | Make the dangerous bit more obviously dangerous
| Haml | agpl-3.0 | idlweb/cuttlefish,pratyushmittal/cuttlefish,idlweb/cuttlefish,idlweb/cuttlefish,pratyushmittal/cuttlefish,idlweb/cuttlefish,pratyushmittal/cuttlefish,pratyushmittal/cuttlefish | haml | ## Code Before:
.page-header
%h1 Edit App
= render "form"
- if policy(@app).destroy?
%hr
%h3 Remove this App
%p
Here lies danger. Removing the app from Cuttlefish will permanently delete the app, the associated emails and can not be undone!
%p
= button_to "Remove App", @app, confirm: "Are you sure?", method: :delete, class: "btn btn-danger"
## Instruction:
Make the dangerous bit more obviously dangerous
## Code After:
.page-header
%h1 Edit App
= render "form"
- if policy(@app).destroy?
.alert.alert-block
%h2 Remove this App
%p
Here lies danger. Removing the app from Cuttlefish will permanently delete the app, the associated emails and can not be undone!
%p
= button_to "Remove App", @app, confirm: "Are you sure?", method: :delete, class: "btn btn-danger"
|
44656033eb8c2caf2d90439baf661abb44257184 | run.sh | run.sh |
vertx run app.coffee -cp "lib/ehcache-core-2.6.0.jar:lib/slf4j-api-1.7.0.jar:lib/slf4j-simple-1.7.0.jar:lib/handlebars-0.5.3.jar:lib/commons-lang3-3.1.jar:lib/parboiled-java-1.0.2.jar:lib/parboiled-core-1.0.2.jar:lib/asm-3.3.1.jar"
|
vertx run app.coffee -cp "lib/ehcache-core-2.6.0.jar:lib/slf4j-api-1.7.0.jar:lib/slf4j-simple-1.7.0.jar:lib/handlebars-0.5.3.jar:lib/commons-lang3-3.1.jar:lib/parboiled-java-1.0.2.jar:lib/parboiled-core-1.0.2.jar:lib/asm-3.3.1.jar" -Xmx 512M
| Set max heap size to 512M to prevent OOM errors. | Set max heap size to 512M to prevent OOM errors.
| Shell | mit | reitti/reittiopas,reitti/reittiopas | shell | ## Code Before:
vertx run app.coffee -cp "lib/ehcache-core-2.6.0.jar:lib/slf4j-api-1.7.0.jar:lib/slf4j-simple-1.7.0.jar:lib/handlebars-0.5.3.jar:lib/commons-lang3-3.1.jar:lib/parboiled-java-1.0.2.jar:lib/parboiled-core-1.0.2.jar:lib/asm-3.3.1.jar"
## Instruction:
Set max heap size to 512M to prevent OOM errors.
## Code After:
vertx run app.coffee -cp "lib/ehcache-core-2.6.0.jar:lib/slf4j-api-1.7.0.jar:lib/slf4j-simple-1.7.0.jar:lib/handlebars-0.5.3.jar:lib/commons-lang3-3.1.jar:lib/parboiled-java-1.0.2.jar:lib/parboiled-core-1.0.2.jar:lib/asm-3.3.1.jar" -Xmx 512M
|
9faf9362fabd9bac37991986617ee7bf51ce0642 | application/views/messages/passwordResetCompleteMessage.php | application/views/messages/passwordResetCompleteMessage.php |
<div id="message" style='width:450px;'>
<h1>Password Reset Complete</h1>
<div id="authBox">
<p>Your account's password has been reset. An email has been dispatched with a temporary password you may use to login and then change it to anything you wish.</p>
</div>
</div> | <div class="container">
<div class="row">
<h2>Password Reset Complete</h2>
<p>Your account's password has been reset. An email has been dispatched with a temporary password you may use to login and then change it to anything you wish.</p>
</div>
</div> | Update password reset complete page to latest bootstrap. | Update password reset complete page to latest bootstrap.
| PHP | mit | marekr/siggy,marekr/siggy,marekr/siggy,marekr/siggy | php | ## Code Before:
<div id="message" style='width:450px;'>
<h1>Password Reset Complete</h1>
<div id="authBox">
<p>Your account's password has been reset. An email has been dispatched with a temporary password you may use to login and then change it to anything you wish.</p>
</div>
</div>
## Instruction:
Update password reset complete page to latest bootstrap.
## Code After:
<div class="container">
<div class="row">
<h2>Password Reset Complete</h2>
<p>Your account's password has been reset. An email has been dispatched with a temporary password you may use to login and then change it to anything you wish.</p>
</div>
</div> |
7f8c1f8560f58bf7466f09abcfd5de3b6f32e9d6 | shippable.yml | shippable.yml | language: none
build:
pre_ci_boot:
image_name: tomekw/ada-gnat
image_tag: 7.3.0
pull: true
ci:
- apt update
- apt install -y --no-install-recommends gprbuild make libahven6-dev lcov git ca-certificates libegl1-mesa-dev
- git clone https://github.com/onox/json-ada.git
- cd json-ada; make install; cd ..
- make test
- make coverage
integrations:
notifications:
- integrationName: email
type: email
on_success: never
on_failure: never
on_cancel: never
on_pull_request: never
| language: none
build:
pre_ci_boot:
image_name: tomekw/ada-gnat
image_tag: 7.3.0
pull: true
ci:
- apt update
- apt install -y --no-install-recommends gprbuild make libahven6-dev lcov git ca-certificates libegl1-mesa-dev
- git clone https://github.com/onox/json-ada.git
- git clone https://github.com/onox/dcf-ada.git
- cd json-ada; make; make install; cd ..
- cd dcf-ada; make tools; make install; cd ..
- make test
- make coverage
integrations:
notifications:
- integrationName: email
type: email
on_success: never
on_failure: never
on_cancel: never
on_pull_request: never
| Fix CI after adding dcf-ada project as a dependency | Fix CI after adding dcf-ada project as a dependency
Signed-off-by: onox <34c1d3d2d1347bea60a26f46574de21a7a79b6d5@gmail.com>
| YAML | apache-2.0 | onox/orka | yaml | ## Code Before:
language: none
build:
pre_ci_boot:
image_name: tomekw/ada-gnat
image_tag: 7.3.0
pull: true
ci:
- apt update
- apt install -y --no-install-recommends gprbuild make libahven6-dev lcov git ca-certificates libegl1-mesa-dev
- git clone https://github.com/onox/json-ada.git
- cd json-ada; make install; cd ..
- make test
- make coverage
integrations:
notifications:
- integrationName: email
type: email
on_success: never
on_failure: never
on_cancel: never
on_pull_request: never
## Instruction:
Fix CI after adding dcf-ada project as a dependency
Signed-off-by: onox <34c1d3d2d1347bea60a26f46574de21a7a79b6d5@gmail.com>
## Code After:
language: none
build:
pre_ci_boot:
image_name: tomekw/ada-gnat
image_tag: 7.3.0
pull: true
ci:
- apt update
- apt install -y --no-install-recommends gprbuild make libahven6-dev lcov git ca-certificates libegl1-mesa-dev
- git clone https://github.com/onox/json-ada.git
- git clone https://github.com/onox/dcf-ada.git
- cd json-ada; make; make install; cd ..
- cd dcf-ada; make tools; make install; cd ..
- make test
- make coverage
integrations:
notifications:
- integrationName: email
type: email
on_success: never
on_failure: never
on_cancel: never
on_pull_request: never
|