commit
stringlengths
40
40
old_file
stringlengths
4
184
new_file
stringlengths
4
184
old_contents
stringlengths
1
3.6k
new_contents
stringlengths
5
3.38k
subject
stringlengths
15
778
message
stringlengths
16
6.74k
lang
stringclasses
201 values
license
stringclasses
13 values
repos
stringlengths
6
116k
config
stringclasses
201 values
content
stringlengths
137
7.24k
diff
stringlengths
26
5.55k
diff_length
int64
1
123
relative_diff_length
float64
0.01
89
n_lines_added
int64
0
108
n_lines_deleted
int64
0
106
99805f36e88a432b05bd24c2a02fd02ed786d76f
.travis.yml
.travis.yml
language: ruby rvm: - 2.2.2 before_install: gem install bundler -v 1.10.4
language: ruby rvm: - 2.2.4 - 2.2.3 - 2.2.2 - 2.1.8 before_install: gem install bundler -v 1.10.4
Add additional ruby versions to CI matrix
Add additional ruby versions to CI matrix Support ruby 2.1+ ----------------- Required keyword arguments weren't introduced until ruby-2.1. The forking_resolution_strategy.rb makes use of required keyword arguments to ensure that the caller passes a 'promise' for resolution.
YAML
mit
indiebrain/futurist,indiebrain/futurist
yaml
## Code Before: language: ruby rvm: - 2.2.2 before_install: gem install bundler -v 1.10.4 ## Instruction: Add additional ruby versions to CI matrix Support ruby 2.1+ ----------------- Required keyword arguments weren't introduced until ruby-2.1. The forking_resolution_strategy.rb makes use of required keyword arguments to ensure that the caller passes a 'promise' for resolution. ## Code After: language: ruby rvm: - 2.2.4 - 2.2.3 - 2.2.2 - 2.1.8 before_install: gem install bundler -v 1.10.4
language: ruby rvm: + - 2.2.4 + - 2.2.3 - 2.2.2 + - 2.1.8 before_install: gem install bundler -v 1.10.4
3
0.75
3
0
48ddbb49a511b74c965a57771b59ab964151254a
app/views/nodes/_node_map.html.haml
app/views/nodes/_node_map.html.haml
%section#node-map %h2=t('.map') .map =link_to root_path(anchor: "/?" + {lat: @node.lat, lon: @node.lon, node_id: @node.osm_id, zoom: cookies[:last_zoom]}.to_param), title: '' do = image_tag("http://api.tiles.mapbox.com/v3/sozialhelden.map-iqt6py1k/#{node_map.lon},#{node_map.lat},17/359x239.png64", class: 'img-polaroid', size: '359x239') %div{class: "leaflet-marker-icon public_transfer #{@node.wheelchair}"} .icon{id: @node.icon} %small.pull-right.bottom= link_to "© OpenStreetMap contributors", I18n.locale == :de ? "http://www.openstreetmap.org/copyright" : "http://www.openstreetmap.org/copyright/en"
%section#node-map %h2=t('.map') .map =link_to root_path(anchor: "/?" + {lat: @node.lat, lon: @node.lon, node_id: @node.osm_id, zoom: cookies[:last_zoom]}.to_param), title: '' do = image_tag("http://api.tiles.mapbox.com/v3/sozialhelden.map-iqt6py1k/#{node_map.lon},#{node_map.lat},17/359x239.png64", class: 'img-polaroid', size: '359x239') %div{class: "leaflet-marker-icon public_transfer #{@node.wheelchair}"} .marker-icon{class: "marker-icon-#{@node.icon}"} %small.pull-right.bottom= link_to "© OpenStreetMap contributors", I18n.locale == :de ? "http://www.openstreetmap.org/copyright" : "http://www.openstreetmap.org/copyright/en"
Fix missing icon in detail page map.
Fix missing icon in detail page map.
Haml
agpl-3.0
sozialhelden/wheelmap,sozialhelden/wheelmap,sozialhelden/wheelmap,sozialhelden/wheelmap,sozialhelden/wheelmap
haml
## Code Before: %section#node-map %h2=t('.map') .map =link_to root_path(anchor: "/?" + {lat: @node.lat, lon: @node.lon, node_id: @node.osm_id, zoom: cookies[:last_zoom]}.to_param), title: '' do = image_tag("http://api.tiles.mapbox.com/v3/sozialhelden.map-iqt6py1k/#{node_map.lon},#{node_map.lat},17/359x239.png64", class: 'img-polaroid', size: '359x239') %div{class: "leaflet-marker-icon public_transfer #{@node.wheelchair}"} .icon{id: @node.icon} %small.pull-right.bottom= link_to "© OpenStreetMap contributors", I18n.locale == :de ? "http://www.openstreetmap.org/copyright" : "http://www.openstreetmap.org/copyright/en" ## Instruction: Fix missing icon in detail page map. ## Code After: %section#node-map %h2=t('.map') .map =link_to root_path(anchor: "/?" + {lat: @node.lat, lon: @node.lon, node_id: @node.osm_id, zoom: cookies[:last_zoom]}.to_param), title: '' do = image_tag("http://api.tiles.mapbox.com/v3/sozialhelden.map-iqt6py1k/#{node_map.lon},#{node_map.lat},17/359x239.png64", class: 'img-polaroid', size: '359x239') %div{class: "leaflet-marker-icon public_transfer #{@node.wheelchair}"} .marker-icon{class: "marker-icon-#{@node.icon}"} %small.pull-right.bottom= link_to "© OpenStreetMap contributors", I18n.locale == :de ? "http://www.openstreetmap.org/copyright" : "http://www.openstreetmap.org/copyright/en"
%section#node-map %h2=t('.map') .map =link_to root_path(anchor: "/?" + {lat: @node.lat, lon: @node.lon, node_id: @node.osm_id, zoom: cookies[:last_zoom]}.to_param), title: '' do = image_tag("http://api.tiles.mapbox.com/v3/sozialhelden.map-iqt6py1k/#{node_map.lon},#{node_map.lat},17/359x239.png64", class: 'img-polaroid', size: '359x239') %div{class: "leaflet-marker-icon public_transfer #{@node.wheelchair}"} - .icon{id: @node.icon} + .marker-icon{class: "marker-icon-#{@node.icon}"} %small.pull-right.bottom= link_to "© OpenStreetMap contributors", I18n.locale == :de ? "http://www.openstreetmap.org/copyright" : "http://www.openstreetmap.org/copyright/en"
2
0.25
1
1
3dad35ee43394404ae0f1926d754e7b7820da8e4
CHANGELOG.md
CHANGELOG.md
- **[BC]** Drop support for PHP 7.1 ### 2.0.1 (2019-10-23) - Convert curly-brace string offsets to use square brackets (php 7.4 compatibility) ### 2.0.0 (2019-05-28) - **[BC]** Drop support for PHP 5.x and PHP 7.0 - **[BC]** Remove `PackageInfo` type ### 1.0.1 (2014-07-25) - **[IMPROVED]** Updated autoloader to [PSR-4](http://www.php-fig.org/psr/psr-4/) ### 1.0.0 (2013-01-10) - Initial release
- **[BC]** Add parameter and return type hints to all types ### 3.0.0 (2020-08-25) - **[BC]** Drop support for PHP 7.1 ### 2.0.1 (2019-10-23) - Convert curly-brace string offsets to use square brackets (php 7.4 compatibility) ### 2.0.0 (2019-05-28) - **[BC]** Drop support for PHP 5.x and PHP 7.0 - **[BC]** Remove `PackageInfo` type ### 1.0.1 (2014-07-25) - **[IMPROVED]** Updated autoloader to [PSR-4](http://www.php-fig.org/psr/psr-4/) ### 1.0.0 (2013-01-10) - Initial release
Add v4.0.0 release to changelog.
Add v4.0.0 release to changelog.
Markdown
mit
IcecaveStudios/repr
markdown
## Code Before: - **[BC]** Drop support for PHP 7.1 ### 2.0.1 (2019-10-23) - Convert curly-brace string offsets to use square brackets (php 7.4 compatibility) ### 2.0.0 (2019-05-28) - **[BC]** Drop support for PHP 5.x and PHP 7.0 - **[BC]** Remove `PackageInfo` type ### 1.0.1 (2014-07-25) - **[IMPROVED]** Updated autoloader to [PSR-4](http://www.php-fig.org/psr/psr-4/) ### 1.0.0 (2013-01-10) - Initial release ## Instruction: Add v4.0.0 release to changelog. ## Code After: - **[BC]** Add parameter and return type hints to all types ### 3.0.0 (2020-08-25) - **[BC]** Drop support for PHP 7.1 ### 2.0.1 (2019-10-23) - Convert curly-brace string offsets to use square brackets (php 7.4 compatibility) ### 2.0.0 (2019-05-28) - **[BC]** Drop support for PHP 5.x and PHP 7.0 - **[BC]** Remove `PackageInfo` type ### 1.0.1 (2014-07-25) - **[IMPROVED]** Updated autoloader to [PSR-4](http://www.php-fig.org/psr/psr-4/) ### 1.0.0 (2013-01-10) - Initial release
+ + - **[BC]** Add parameter and return type hints to all types + + ### 3.0.0 (2020-08-25) - **[BC]** Drop support for PHP 7.1 ### 2.0.1 (2019-10-23) - Convert curly-brace string offsets to use square brackets (php 7.4 compatibility) ### 2.0.0 (2019-05-28) - **[BC]** Drop support for PHP 5.x and PHP 7.0 - **[BC]** Remove `PackageInfo` type ### 1.0.1 (2014-07-25) - **[IMPROVED]** Updated autoloader to [PSR-4](http://www.php-fig.org/psr/psr-4/) ### 1.0.0 (2013-01-10) - Initial release
4
0.210526
4
0
80f58b5f9bb2edc0a2e230bc0fb4179fb0f7a707
lib/slippery/assets.rb
lib/slippery/assets.rb
require 'fileutils' module Slippery module Assets ASSETS_PATH = '../../../assets/' def self.embed_locally FileUtils.cp_r(File.expand_path(ASSETS_PATH, __FILE__), './') end def self.path_composer(local) if local ->(path) { File.join('assets', path) } else ->(path) { "file://#{File.expand_path(File.join(ASSETS_PATH, path), __FILE__)}" } end end end end
require 'fileutils' # Manage the assets and their URI/path module Slippery module Assets ASSETS_PATH = '../../../assets/' # Copies the assets locally def self.embed_locally FileUtils.cp_r(File.expand_path(ASSETS_PATH, __FILE__), './') end # returns a composer returning a URI for a given relative file path # considering if the asset is local or not def self.path_composer(local) if local ->(path) { File.join('assets', path) } else ->(path) { "file://#{File.expand_path(File.join(ASSETS_PATH, path), __FILE__)}" } end end end end
Add documentation to Assets module
Add documentation to Assets module
Ruby
mit
myabc/slippery,plexus/slippery,myabc/slippery,myabc/slippery,plexus/slippery,plexus/slippery
ruby
## Code Before: require 'fileutils' module Slippery module Assets ASSETS_PATH = '../../../assets/' def self.embed_locally FileUtils.cp_r(File.expand_path(ASSETS_PATH, __FILE__), './') end def self.path_composer(local) if local ->(path) { File.join('assets', path) } else ->(path) { "file://#{File.expand_path(File.join(ASSETS_PATH, path), __FILE__)}" } end end end end ## Instruction: Add documentation to Assets module ## Code After: require 'fileutils' # Manage the assets and their URI/path module Slippery module Assets ASSETS_PATH = '../../../assets/' # Copies the assets locally def self.embed_locally FileUtils.cp_r(File.expand_path(ASSETS_PATH, __FILE__), './') end # returns a composer returning a URI for a given relative file path # considering if the asset is local or not def self.path_composer(local) if local ->(path) { File.join('assets', path) } else ->(path) { "file://#{File.expand_path(File.join(ASSETS_PATH, path), __FILE__)}" } end end end end
require 'fileutils' + # Manage the assets and their URI/path module Slippery module Assets ASSETS_PATH = '../../../assets/' + # Copies the assets locally def self.embed_locally FileUtils.cp_r(File.expand_path(ASSETS_PATH, __FILE__), './') end + # returns a composer returning a URI for a given relative file path + # considering if the asset is local or not def self.path_composer(local) if local ->(path) { File.join('assets', path) } else ->(path) { "file://#{File.expand_path(File.join(ASSETS_PATH, path), __FILE__)}" } end end end end
4
0.210526
4
0
90148a42f1249a668d3096be4b52721095892487
tox.ini
tox.ini
[tox] envlist = py26_dj14, py27_dj14, py26_dj15, py27_dj15 [testenv:py26_dj14] basepython = python2.6 commands = pip install django==1.4.6 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj14] basepython = python2.7 commands = pip install django==1.4.6 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py26_dj15] basepython = python2.6 commands = pip install django==1.5.2 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj15] basepython = python2.7 commands = pip install django==1.5.2 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger
[tox] envlist = py26_dj14, py27_dj14, py26_dj15, py27_dj15 [testenv:py26_dj14] basepython = python2.6 commands = pip install django<1.5 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj14] basepython = python2.7 commands = pip install django<1.5 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py26_dj15] basepython = python2.6 commands = pip install django<1.6 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj15] basepython = python2.7 commands = pip install django<1.6 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger
Tweak versions so they always use latest Django
Tweak versions so they always use latest Django
INI
bsd-3-clause
mozilla/django-badger,donberna/django-badger,Nuevosmedios/django-badger,hackultura/django-badger,mozilla/django-badger,Nuevosmedios/django-badger,mozilla/django-badgekit,hackultura/django-badger,donberna/django-badger
ini
## Code Before: [tox] envlist = py26_dj14, py27_dj14, py26_dj15, py27_dj15 [testenv:py26_dj14] basepython = python2.6 commands = pip install django==1.4.6 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj14] basepython = python2.7 commands = pip install django==1.4.6 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py26_dj15] basepython = python2.6 commands = pip install django==1.5.2 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj15] basepython = python2.7 commands = pip install django==1.5.2 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger ## Instruction: Tweak versions so they always use latest Django ## Code After: [tox] envlist = py26_dj14, py27_dj14, py26_dj15, py27_dj15 [testenv:py26_dj14] basepython = python2.6 commands = pip install django<1.5 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj14] basepython = python2.7 commands = pip install django<1.5 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py26_dj15] basepython = python2.6 commands = pip install django<1.6 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj15] basepython = python2.7 commands = pip install django<1.6 pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger
[tox] envlist = py26_dj14, py27_dj14, py26_dj15, py27_dj15 [testenv:py26_dj14] basepython = python2.6 - commands = pip install django==1.4.6 ? ^^ ^^^ + commands = pip install django<1.5 ? ^ ^ pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj14] basepython = python2.7 - commands = pip install django==1.4.6 ? ^^ ^^^ + commands = pip install django<1.5 ? ^ ^ pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py26_dj15] basepython = python2.6 - commands = pip install django==1.5.2 ? ^^ ^^^ + commands = pip install django<1.6 ? ^ ^ pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger [testenv:py27_dj15] basepython = python2.7 - commands = pip install django==1.5.2 ? ^^ ^^^ + commands = pip install django<1.6 ? ^ ^ pip install -r requirements/dev.txt {envpython} manage.py test -v2 badger
8
0.296296
4
4
da4f236b3dcdfb75c160eee395fa1bb327d80ade
lib/mips/assembler.rb
lib/mips/assembler.rb
module MIPS class MIPSSyntaxError < StandardError def initialize(msg) @msg = msg end def to_s msg end end # Assembly MIPS codes into machine codes. class Assembler attr_reader :symbol_table, :current_addr def initialize @symbol_table = {} @current_addr = 0x0 end def assembly(line) line.sub!(/#*/, "") # Remove comments. line.strip! read_tag line return if line.empty? # Tag line. end def self.assembly(line) new.assembly(line) end private def read_tag(line) return unless /^(?<tag>[a-zA-Z]\w*)\s*:\s*(?<rest>.*)/ =~ line if @symbol_table.include? tag fail MIPSSyntaxError, "Redeclaration of tag `#{tag}`" else @symbol_table[tag] = @current_addr end end end
module MIPS class MIPSSyntaxError < StandardError def initialize(msg) @msg = msg end def to_s msg end end # Assembly MIPS codes into machine codes. class Assembler attr_reader :symbol_table, :current_addr def initialize @symbol_table = {} @current_addr = 0x0 end def assembly(line) line.sub!(/#*/, "") # Remove comments. line.strip! line = read_tag(line) return if line.empty? # Tag line, @current_addr stays the same. end def self.assembly(line) new.assembly(line) end private def read_tag(line) return unless /^(?<tag>[a-zA-Z]\w*)\s*:\s*(?<rest>.*)/ =~ line if @symbol_table.include? tag fail MIPSSyntaxError, "Redeclaration of tag `#{tag}`" else @symbol_table[tag] = @current_addr rest end end end end
Fix small bug in read_tag.
Fix small bug in read_tag.
Ruby
mit
DigitalLogicSummerTerm2015/mips-assembler,DigitalLogicSummerTerm2015/mips-assembler
ruby
## Code Before: module MIPS class MIPSSyntaxError < StandardError def initialize(msg) @msg = msg end def to_s msg end end # Assembly MIPS codes into machine codes. class Assembler attr_reader :symbol_table, :current_addr def initialize @symbol_table = {} @current_addr = 0x0 end def assembly(line) line.sub!(/#*/, "") # Remove comments. line.strip! read_tag line return if line.empty? # Tag line. end def self.assembly(line) new.assembly(line) end private def read_tag(line) return unless /^(?<tag>[a-zA-Z]\w*)\s*:\s*(?<rest>.*)/ =~ line if @symbol_table.include? tag fail MIPSSyntaxError, "Redeclaration of tag `#{tag}`" else @symbol_table[tag] = @current_addr end end end ## Instruction: Fix small bug in read_tag. ## Code After: module MIPS class MIPSSyntaxError < StandardError def initialize(msg) @msg = msg end def to_s msg end end # Assembly MIPS codes into machine codes. class Assembler attr_reader :symbol_table, :current_addr def initialize @symbol_table = {} @current_addr = 0x0 end def assembly(line) line.sub!(/#*/, "") # Remove comments. line.strip! line = read_tag(line) return if line.empty? # Tag line, @current_addr stays the same. end def self.assembly(line) new.assembly(line) end private def read_tag(line) return unless /^(?<tag>[a-zA-Z]\w*)\s*:\s*(?<rest>.*)/ =~ line if @symbol_table.include? tag fail MIPSSyntaxError, "Redeclaration of tag `#{tag}`" else @symbol_table[tag] = @current_addr rest end end end end
module MIPS class MIPSSyntaxError < StandardError def initialize(msg) @msg = msg end def to_s msg end end # Assembly MIPS codes into machine codes. class Assembler attr_reader :symbol_table, :current_addr def initialize @symbol_table = {} @current_addr = 0x0 end def assembly(line) line.sub!(/#*/, "") # Remove comments. line.strip! - read_tag line ? ^ + line = read_tag(line) ? +++++++ ^ + + return if line.empty? # Tag line, @current_addr stays the same. - return if line.empty? # Tag line. - end def self.assembly(line) new.assembly(line) end private def read_tag(line) return unless /^(?<tag>[a-zA-Z]\w*)\s*:\s*(?<rest>.*)/ =~ line if @symbol_table.include? tag fail MIPSSyntaxError, "Redeclaration of tag `#{tag}`" else @symbol_table[tag] = @current_addr + rest end + end end end
7
0.152174
4
3
ffada8a0286c97887c2670ec6fdcda66b4782d3f
src/Loader/CachingLoader.php
src/Loader/CachingLoader.php
<?php declare(strict_types=1); namespace Imjoehaines\Flowder\Loader; final class CachingLoader implements LoaderInterface { /** * The array of cached data * * @var array */ private $cache = []; /** * A "real" loader instance that actually does the loading * * @var LoaderInterface */ private $loader; /** * @param LoaderInterface $loader */ public function __construct(LoaderInterface $loader) { $this->loader = $loader; } /** * Load the given thing and cache the results for repeated calls * * @param mixed $thingToLoad * @return iterable */ public function load($thingToLoad): iterable { if (!array_key_exists($thingToLoad, $this->cache)) { foreach ($this->loader->load($thingToLoad) as $table => $data) { $this->cache[$thingToLoad][$table] = $data; yield $table => $data; } } else { yield from $this->cache[$thingToLoad]; } } }
<?php declare(strict_types=1); namespace Imjoehaines\Flowder\Loader; final class CachingLoader implements LoaderInterface { /** * The array of cached data * * @var array<string, array<string, iterable>> */ private $cache = []; /** * A "real" loader instance that actually does the loading * * @var LoaderInterface */ private $loader; /** * @param LoaderInterface $loader */ public function __construct(LoaderInterface $loader) { $this->loader = $loader; } /** * Load the given thing and cache the results for repeated calls * * @param mixed $thingToLoad * @return iterable */ public function load($thingToLoad): iterable { if (!array_key_exists($thingToLoad, $this->cache)) { foreach ($this->loader->load($thingToLoad) as $table => $data) { $this->cache[$thingToLoad][$table] = $data; yield $table => $data; } } else { yield from $this->cache[$thingToLoad]; } } }
Add more precise type annotation
Add more precise type annotation
PHP
unlicense
imjoehaines/flowder
php
## Code Before: <?php declare(strict_types=1); namespace Imjoehaines\Flowder\Loader; final class CachingLoader implements LoaderInterface { /** * The array of cached data * * @var array */ private $cache = []; /** * A "real" loader instance that actually does the loading * * @var LoaderInterface */ private $loader; /** * @param LoaderInterface $loader */ public function __construct(LoaderInterface $loader) { $this->loader = $loader; } /** * Load the given thing and cache the results for repeated calls * * @param mixed $thingToLoad * @return iterable */ public function load($thingToLoad): iterable { if (!array_key_exists($thingToLoad, $this->cache)) { foreach ($this->loader->load($thingToLoad) as $table => $data) { $this->cache[$thingToLoad][$table] = $data; yield $table => $data; } } else { yield from $this->cache[$thingToLoad]; } } } ## Instruction: Add more precise type annotation ## Code After: <?php declare(strict_types=1); namespace Imjoehaines\Flowder\Loader; final class CachingLoader implements LoaderInterface { /** * The array of cached data * * @var array<string, array<string, iterable>> */ private $cache = []; /** * A "real" loader instance that actually does the loading * * @var LoaderInterface */ private $loader; /** * @param LoaderInterface $loader */ public function __construct(LoaderInterface $loader) { $this->loader = $loader; } /** * Load the given thing and cache the results for repeated calls * * @param mixed $thingToLoad * @return iterable */ public function load($thingToLoad): iterable { if (!array_key_exists($thingToLoad, $this->cache)) { foreach ($this->loader->load($thingToLoad) as $table => $data) { $this->cache[$thingToLoad][$table] = $data; yield $table => $data; } } else { yield from $this->cache[$thingToLoad]; } } }
<?php declare(strict_types=1); namespace Imjoehaines\Flowder\Loader; final class CachingLoader implements LoaderInterface { /** * The array of cached data * - * @var array + * @var array<string, array<string, iterable>> */ private $cache = []; /** * A "real" loader instance that actually does the loading * * @var LoaderInterface */ private $loader; /** * @param LoaderInterface $loader */ public function __construct(LoaderInterface $loader) { $this->loader = $loader; } /** * Load the given thing and cache the results for repeated calls * * @param mixed $thingToLoad * @return iterable */ public function load($thingToLoad): iterable { if (!array_key_exists($thingToLoad, $this->cache)) { foreach ($this->loader->load($thingToLoad) as $table => $data) { $this->cache[$thingToLoad][$table] = $data; yield $table => $data; } } else { yield from $this->cache[$thingToLoad]; } } }
2
0.042553
1
1
004b60d7ee160e9d801aa8e661044cf01722839c
lib/cms/attribute/association.rb
lib/cms/attribute/association.rb
module Cms module Attribute module Association attr_reader :options_for_select def options_for_select(entry) self.klass.all.reject{ |e| e == entry }.collect do |e| [e.try(:name) || e.try(:title) || e.id, e.id] end end end end end
module Cms module Attribute module Association attr_reader :options_for_select def options_for_select(entry) self.klass.all.reject{ |e| e == entry }.collect do |e| [e.try(:email) || e.try(:login) || e.try(:name) || e.try(:title) || e.id, e.id] end end end end end
Add email and login to options_for_select method for assotiations
Add email and login to options_for_select method for assotiations
Ruby
mit
droptheplot/adminable,droptheplot/adminable,droptheplot/adminable
ruby
## Code Before: module Cms module Attribute module Association attr_reader :options_for_select def options_for_select(entry) self.klass.all.reject{ |e| e == entry }.collect do |e| [e.try(:name) || e.try(:title) || e.id, e.id] end end end end end ## Instruction: Add email and login to options_for_select method for assotiations ## Code After: module Cms module Attribute module Association attr_reader :options_for_select def options_for_select(entry) self.klass.all.reject{ |e| e == entry }.collect do |e| [e.try(:email) || e.try(:login) || e.try(:name) || e.try(:title) || e.id, e.id] end end end end end
module Cms module Attribute module Association attr_reader :options_for_select def options_for_select(entry) self.klass.all.reject{ |e| e == entry }.collect do |e| - [e.try(:name) || e.try(:title) || e.id, e.id] + [e.try(:email) || e.try(:login) || e.try(:name) || e.try(:title) || e.id, e.id] ? ++++++++++++++++++++++++++++++++++ end end end end end
2
0.153846
1
1
ea710b64689159b70fb4cd479e6449b1c893bf09
packages/kf/kf_0.5.4.1.bb
packages/kf/kf_0.5.4.1.bb
LICENSE = "GPL" DEPENDS = "libxml2 glib-2.0 gtk+ loudmouth" MAINTAINER = "Chris Lord <chris@openedhand.com>" DESCRIPTION = "Kf is a GTK+ instant messaging client." PR = "r2" SRC_URI = "http://jabberstudio.2nw.net/${PN}/${PN}-${PV}.tar.gz \ file://fix-configure.patch;patch=1 \ file://fix-desktop-file.patch;patch=0" inherit autotools pkgconfig EXTRA_OECONF = "--disable-binreloc"
LICENSE = "GPL" DEPENDS = "libxml2 glib-2.0 gtk+ loudmouth" MAINTAINER = "Chris Lord <chris@openedhand.com>" DESCRIPTION = "Kf is a GTK+ instant messaging client." PR = "r2" SRC_URI = "http://jabberstudio.2nw.net/${PN}/${PN}-${PV}.tar.gz \ file://fix-configure.patch;patch=1 \ file://fix-desktop-file.patch;patch=0" inherit autotools pkgconfig export PKG_CONFIG="${STAGING_BINDIR}/pkg-config" EXTRA_OECONF = "--disable-binreloc"
Fix pkgconfig related build issue (from poky)
kf: Fix pkgconfig related build issue (from poky)
BitBake
mit
demsey/openembedded,anguslees/openembedded-android,thebohemian/openembedded,Martix/Eonos,openpli-arm/openembedded,giobauermeister/openembedded,trini/openembedded,John-NY/overo-oe,demsey/openembedded,demsey/openenigma2,JrCs/opendreambox,John-NY/overo-oe,nlebedenco/mini2440,BlackPole/bp-openembedded,nx111/openembeded_openpli2.1_nx111,giobauermeister/openembedded,popazerty/openembedded-cuberevo,mrchapp/arago-oe-dev,xifengchuo/openembedded,anguslees/openembedded-android,dave-billin/overo-ui-moos-auv,JamesAng/oe,dellysunnymtech/sakoman-oe,demsey/openembedded,SIFTeam/openembedded,yyli/overo-oe,trini/openembedded,sledz/oe,troth/oe-ts7xxx,dellysunnymtech/sakoman-oe,demsey/openenigma2,thebohemian/openembedded,dave-billin/overo-ui-moos-auv,trini/openembedded,bticino/openembedded,giobauermeister/openembedded,nlebedenco/mini2440,JamesAng/oe,nzjrs/overo-openembedded,openpli-arm/openembedded,mrchapp/arago-oe-dev,philb/pbcl-oe-2010,YtvwlD/od-oe,KDAB/OpenEmbedded-Archos,nzjrs/overo-openembedded,JamesAng/goe,trini/openembedded,yyli/overo-oe,BlackPole/bp-openembedded,SIFTeam/openembedded,BlackPole/bp-openembedded,popazerty/openembedded-cuberevo,popazerty/openembedded-cuberevo,yyli/overo-oe,sledz/oe,sutajiokousagi/openembedded,demsey/openembedded,demsey/openenigma2,sledz/oe,openembedded/openembedded,sentient-energy/emsw-oe-mirror,thebohemian/openembedded,hulifox008/openembedded,philb/pbcl-oe-2010,giobauermeister/openembedded,xifengchuo/openembedded,nvl1109/openembeded,thebohemian/openembedded,YtvwlD/od-oe,dave-billin/overo-ui-moos-auv,philb/pbcl-oe-2010,John-NY/overo-oe,anguslees/openembedded-android,thebohemian/openembedded,JamesAng/goe,demsey/openembedded,openembedded/openembedded,sentient-energy/emsw-oe-mirror,rascalmicro/openembedded-rascal,nvl1109/openembeded,KDAB/OpenEmbedded-Archos,rascalmicro/openembedded-rascal,giobauermeister/openembedded,hulifox008/openembedded,sledz/oe,sentient-energy/emsw-oe-mirror,scottellis/overo-oe,dellysunnymtech/sakoman-oe,openembedded/openembedded,nvl1109/openembeded,libo/openembedded,scottellis/overo-oe,openembedded/openembedded,bticino/openembedded,xifengchuo/openembedded,popazerty/openembedded-cuberevo,YtvwlD/od-oe,JrCs/opendreambox,John-NY/overo-oe,philb/pbcl-oe-2010,nlebedenco/mini2440,popazerty/openembedded-cuberevo,philb/pbcl-oe-2010,dellysunnymtech/sakoman-oe,nzjrs/overo-openembedded,popazerty/openembedded-cuberevo,bticino/openembedded,crystalfontz/openembedded,openembedded/openembedded,JamesAng/oe,JamesAng/oe,rascalmicro/openembedded-rascal,BlackPole/bp-openembedded,sutajiokousagi/openembedded,demsey/openenigma2,JrCs/opendreambox,xifengchuo/openembedded,yyli/overo-oe,openembedded/openembedded,SIFTeam/openembedded,yyli/overo-oe,openembedded/openembedded,SIFTeam/openembedded,crystalfontz/openembedded,nvl1109/openembeded,JamesAng/goe,yyli/overo-oe,xifengchuo/openembedded,sampov2/audio-openembedded,KDAB/OpenEmbedded-Archos,BlackPole/bp-openembedded,anguslees/openembedded-android,Martix/Eonos,troth/oe-ts7xxx,sentient-energy/emsw-oe-mirror,sutajiokousagi/openembedded,trini/openembedded,nx111/openembeded_openpli2.1_nx111,openembedded/openembedded,thebohemian/openembedded,openpli-arm/openembedded,sentient-energy/emsw-oe-mirror,libo/openembedded,crystalfontz/openembedded,KDAB/OpenEmbedded-Archos,libo/openembedded,nzjrs/overo-openembedded,sutajiokousagi/openembedded,buglabs/oe-buglabs,libo/openembedded,sampov2/audio-openembedded,dave-billin/overo-ui-moos-auv,sampov2/audio-openembedded,SIFTeam/openembedded,nlebedenco/mini2440,rascalmicro/openembedded-rascal,hulifox008/openembedded,buglabs/oe-buglabs,YtvwlD/od-oe,nzjrs/overo-openembedded,openpli-arm/openembedded,sentient-energy/emsw-oe-mirror,dave-billin/overo-ui-moos-auv,scottellis/overo-oe,sutajiokousagi/openembedded,mrchapp/arago-oe-dev,anguslees/openembedded-android,openpli-arm/openembedded,giobauermeister/openembedded,dellysunnymtech/sakoman-oe,buglabs/oe-buglabs,sampov2/audio-openembedded,sampov2/audio-openembedded,crystalfontz/openembedded,xifengchuo/openembedded,SIFTeam/openembedded,buglabs/oe-buglabs,yyli/overo-oe,bticino/openembedded,sampov2/audio-openembedded,buglabs/oe-buglabs,philb/pbcl-oe-2010,dellysunnymtech/sakoman-oe,buglabs/oe-buglabs,libo/openembedded,scottellis/overo-oe,crystalfontz/openembedded,openpli-arm/openembedded,dave-billin/overo-ui-moos-auv,YtvwlD/od-oe,rascalmicro/openembedded-rascal,Martix/Eonos,rascalmicro/openembedded-rascal,John-NY/overo-oe,dave-billin/overo-ui-moos-auv,bticino/openembedded,KDAB/OpenEmbedded-Archos,nlebedenco/mini2440,popazerty/openembedded-cuberevo,hulifox008/openembedded,giobauermeister/openembedded,openembedded/openembedded,demsey/openenigma2,JamesAng/goe,nx111/openembeded_openpli2.1_nx111,nzjrs/overo-openembedded,yyli/overo-oe,Martix/Eonos,mrchapp/arago-oe-dev,nx111/openembeded_openpli2.1_nx111,troth/oe-ts7xxx,hulifox008/openembedded,nx111/openembeded_openpli2.1_nx111,scottellis/overo-oe,anguslees/openembedded-android,troth/oe-ts7xxx,sutajiokousagi/openembedded,JamesAng/oe,popazerty/openembedded-cuberevo,thebohemian/openembedded,openpli-arm/openembedded,xifengchuo/openembedded,Martix/Eonos,demsey/openembedded,dellysunnymtech/sakoman-oe,bticino/openembedded,demsey/openembedded,sledz/oe,rascalmicro/openembedded-rascal,nzjrs/overo-openembedded,xifengchuo/openembedded,JrCs/opendreambox,JamesAng/goe,crystalfontz/openembedded,dellysunnymtech/sakoman-oe,openembedded/openembedded,hulifox008/openembedded,nvl1109/openembeded,JamesAng/oe,SIFTeam/openembedded,troth/oe-ts7xxx,mrchapp/arago-oe-dev,openembedded/openembedded,troth/oe-ts7xxx,JrCs/opendreambox,YtvwlD/od-oe,sledz/oe,trini/openembedded,troth/oe-ts7xxx,JrCs/opendreambox,xifengchuo/openembedded,scottellis/overo-oe,anguslees/openembedded-android,demsey/openenigma2,libo/openembedded,BlackPole/bp-openembedded,trini/openembedded,crystalfontz/openembedded,scottellis/overo-oe,Martix/Eonos,philb/pbcl-oe-2010,JrCs/opendreambox,JamesAng/goe,mrchapp/arago-oe-dev,nx111/openembeded_openpli2.1_nx111,nx111/openembeded_openpli2.1_nx111,nlebedenco/mini2440,Martix/Eonos,sledz/oe,demsey/openenigma2,KDAB/OpenEmbedded-Archos,sampov2/audio-openembedded,John-NY/overo-oe,sutajiokousagi/openembedded,KDAB/OpenEmbedded-Archos,John-NY/overo-oe,giobauermeister/openembedded,YtvwlD/od-oe,nvl1109/openembeded,rascalmicro/openembedded-rascal,hulifox008/openembedded,BlackPole/bp-openembedded,sentient-energy/emsw-oe-mirror,YtvwlD/od-oe,nlebedenco/mini2440,JamesAng/oe,dellysunnymtech/sakoman-oe,JamesAng/goe,nvl1109/openembeded,JrCs/opendreambox,giobauermeister/openembedded,bticino/openembedded,nx111/openembeded_openpli2.1_nx111,buglabs/oe-buglabs,buglabs/oe-buglabs,mrchapp/arago-oe-dev,libo/openembedded,JrCs/opendreambox
bitbake
## Code Before: LICENSE = "GPL" DEPENDS = "libxml2 glib-2.0 gtk+ loudmouth" MAINTAINER = "Chris Lord <chris@openedhand.com>" DESCRIPTION = "Kf is a GTK+ instant messaging client." PR = "r2" SRC_URI = "http://jabberstudio.2nw.net/${PN}/${PN}-${PV}.tar.gz \ file://fix-configure.patch;patch=1 \ file://fix-desktop-file.patch;patch=0" inherit autotools pkgconfig EXTRA_OECONF = "--disable-binreloc" ## Instruction: kf: Fix pkgconfig related build issue (from poky) ## Code After: LICENSE = "GPL" DEPENDS = "libxml2 glib-2.0 gtk+ loudmouth" MAINTAINER = "Chris Lord <chris@openedhand.com>" DESCRIPTION = "Kf is a GTK+ instant messaging client." PR = "r2" SRC_URI = "http://jabberstudio.2nw.net/${PN}/${PN}-${PV}.tar.gz \ file://fix-configure.patch;patch=1 \ file://fix-desktop-file.patch;patch=0" inherit autotools pkgconfig export PKG_CONFIG="${STAGING_BINDIR}/pkg-config" EXTRA_OECONF = "--disable-binreloc"
LICENSE = "GPL" DEPENDS = "libxml2 glib-2.0 gtk+ loudmouth" MAINTAINER = "Chris Lord <chris@openedhand.com>" DESCRIPTION = "Kf is a GTK+ instant messaging client." PR = "r2" SRC_URI = "http://jabberstudio.2nw.net/${PN}/${PN}-${PV}.tar.gz \ file://fix-configure.patch;patch=1 \ file://fix-desktop-file.patch;patch=0" inherit autotools pkgconfig + export PKG_CONFIG="${STAGING_BINDIR}/pkg-config" + EXTRA_OECONF = "--disable-binreloc"
2
0.142857
2
0
65c5ef7982dd07fe68edd483be7f5271f5749baf
bigquery-policy-tags-admin.yaml
bigquery-policy-tags-admin.yaml
title: "BigQuery Policy Tags Admin" description: "Allow users to read and update Column Policy Tags" stage: "BETA" includedPermissions: - bigquery.tables.setCategory
title: "BigQuery Policy Tags Admin" description: "Allow users to read and update Column Policy Tags" stage: "BETA" includedPermissions: - bigquery.tables.setCategory # [END policy_permissions_info]
Add section names for github includes
Add section names for github includes
YAML
apache-2.0
GoogleCloudPlatform/bigquery-data-lineage,GoogleCloudPlatform/bigquery-data-lineage,GoogleCloudPlatform/bigquery-data-lineage
yaml
## Code Before: title: "BigQuery Policy Tags Admin" description: "Allow users to read and update Column Policy Tags" stage: "BETA" includedPermissions: - bigquery.tables.setCategory ## Instruction: Add section names for github includes ## Code After: title: "BigQuery Policy Tags Admin" description: "Allow users to read and update Column Policy Tags" stage: "BETA" includedPermissions: - bigquery.tables.setCategory # [END policy_permissions_info]
- title: "BigQuery Policy Tags Admin" description: "Allow users to read and update Column Policy Tags" stage: "BETA" includedPermissions: - bigquery.tables.setCategory - + # [END policy_permissions_info]
3
0.428571
1
2
9479558782384c397b5622752825a5b4d63feee2
spec/models/team_spec.rb
spec/models/team_spec.rb
require 'spec_helper' describe Team do describe "validations" do let(:course) { create :course } it "requires that the team name be unique per course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: course.id, name: "zeppelin" expect(team).to_not be_valid expect(team.errors[:name]).to include "has already been taken" end it "can have the same name if it's for a different course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: create(:course).id, name: "Zeppelin" expect(team).to be_valid end end describe ".find_by_course_and_name" do let(:team) { create :team } it "returns the team for the specific course id and name" do result = Team.find_by_course_and_name team.course_id, team.name.upcase expect(result).to eq team end end describe "challenge_grade_score" do end end
require "active_record_spec_helper" describe Team do describe "validations" do let(:course) { create :course } it "requires that the team name be unique per course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: course.id, name: "zeppelin" expect(team).to_not be_valid expect(team.errors[:name]).to include "has already been taken" end it "can have the same name if it's for a different course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: create(:course).id, name: "Zeppelin" expect(team).to be_valid end end describe ".find_by_course_and_name" do let(:team) { create :team } it "returns the team for the specific course id and name" do result = Team.find_by_course_and_name team.course_id, team.name.upcase expect(result).to eq team end end describe "challenge_grade_score" do end end
Add AR spec helper to team spec
Add AR spec helper to team spec
Ruby
agpl-3.0
UM-USElab/gradecraft-development,UM-USElab/gradecraft-development,mkoon/gradecraft-development,mkoon/gradecraft-development,UM-USElab/gradecraft-development,mkoon/gradecraft-development,UM-USElab/gradecraft-development
ruby
## Code Before: require 'spec_helper' describe Team do describe "validations" do let(:course) { create :course } it "requires that the team name be unique per course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: course.id, name: "zeppelin" expect(team).to_not be_valid expect(team.errors[:name]).to include "has already been taken" end it "can have the same name if it's for a different course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: create(:course).id, name: "Zeppelin" expect(team).to be_valid end end describe ".find_by_course_and_name" do let(:team) { create :team } it "returns the team for the specific course id and name" do result = Team.find_by_course_and_name team.course_id, team.name.upcase expect(result).to eq team end end describe "challenge_grade_score" do end end ## Instruction: Add AR spec helper to team spec ## Code After: require "active_record_spec_helper" describe Team do describe "validations" do let(:course) { create :course } it "requires that the team name be unique per course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: course.id, name: "zeppelin" expect(team).to_not be_valid expect(team.errors[:name]).to include "has already been taken" end it "can have the same name if it's for a different course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: create(:course).id, name: "Zeppelin" expect(team).to be_valid end end describe ".find_by_course_and_name" do let(:team) { create :team } it "returns the team for the specific course id and name" do result = Team.find_by_course_and_name team.course_id, team.name.upcase expect(result).to eq team end end describe "challenge_grade_score" do end end
- require 'spec_helper' + require "active_record_spec_helper" describe Team do describe "validations" do let(:course) { create :course } it "requires that the team name be unique per course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: course.id, name: "zeppelin" expect(team).to_not be_valid expect(team.errors[:name]).to include "has already been taken" end it "can have the same name if it's for a different course" do create :team, course_id: course.id, name: "Zeppelin" team = Team.new course_id: create(:course).id, name: "Zeppelin" expect(team).to be_valid end end describe ".find_by_course_and_name" do let(:team) { create :team } it "returns the team for the specific course id and name" do result = Team.find_by_course_and_name team.course_id, team.name.upcase expect(result).to eq team end end describe "challenge_grade_score" do end end
2
0.0625
1
1
0506a56af72d45a8de17f6c730448ca5952eca9d
.travis.yml
.travis.yml
rvm: - 1.9.3 - 2.0.0 # - ruby-head env: - RAILS=4-0-stable
rvm: - 1.9.3 - 2.0.0 # - ruby-head env: - RAILS=4-0-stable DB=sqlite3 - RAILS=4-0-stable DB=mysql - RAILS=4-0-stable DB=postgres
Support 3 different databases on Travis
Support 3 different databases on Travis
YAML
mit
vaski/ransack,reinteractive/ransack,guilleva/ransack,roman-franko/ransack,Eric-Guo/ransack,mariorcardoso/ransack,praktijkindex/ransack,martndemus/ransack,avit/ransack,outstand/ransack,amboxer21/ransack,AndrewSwerlick/ransack,Mario245/ransack,laserlemon/ransack,johan--/ransack,nagyt234/ransack,activerecord-hackery/ransack
yaml
## Code Before: rvm: - 1.9.3 - 2.0.0 # - ruby-head env: - RAILS=4-0-stable ## Instruction: Support 3 different databases on Travis ## Code After: rvm: - 1.9.3 - 2.0.0 # - ruby-head env: - RAILS=4-0-stable DB=sqlite3 - RAILS=4-0-stable DB=mysql - RAILS=4-0-stable DB=postgres
rvm: - 1.9.3 - 2.0.0 # - ruby-head env: + - RAILS=4-0-stable DB=sqlite3 - - RAILS=4-0-stable + - RAILS=4-0-stable DB=mysql ? +++++++++ + - RAILS=4-0-stable DB=postgres + +
6
0.857143
5
1
51bc8619546d5deb61e73f428f42e4dfdb09ec60
challenge32.go
challenge32.go
// Challenge 32 - Break HMAC-SHA1 with a slightly less artificial timing leak // http://cryptopals.com/sets/4/challenges/32 package cryptopals import ( "crypto/sha1" "net/http" "time" ) type challenge32 struct { } func (challenge32) ForgeHmacSHA1SignaturePrecise(addr, file string) []byte { sig := make([]byte, sha1.Size) x := challenge31{} for i := 0; i < len(sig); i++ { var valBest byte var timeBest time.Duration for j := 0; j < 256; j++ { sig[i] = byte(j) url := x.buildURL(addr, file, sig) start := time.Now() for k := 0; k < 15; k++ { resp, _ := http.Get(url) resp.Body.Close() } elapsed := time.Since(start) if elapsed > timeBest { valBest = byte(j) timeBest = elapsed } } sig[i] = valBest } return sig }
// Challenge 32 - Break HMAC-SHA1 with a slightly less artificial timing leak // http://cryptopals.com/sets/4/challenges/32 package cryptopals import ( "crypto/sha1" "net/http" "time" ) type challenge32 struct { } func (challenge32) ForgeHmacSHA1SignaturePrecise(addr, file string) []byte { sig := make([]byte, sha1.Size) x := challenge31{} for i := 0; i < len(sig); i++ { var valBest byte var timeBest time.Duration for j := 0; j < 256; j++ { sig[i] = byte(j) url := x.buildURL(addr, file, sig) fastest := time.Hour for k := 0; k < 10; k++ { start := time.Now() resp, _ := http.Get(url) elapsed := time.Since(start) resp.Body.Close() if elapsed < fastest { fastest = elapsed } } if fastest > timeBest { valBest = byte(j) timeBest = fastest } } sig[i] = valBest } return sig }
Fix solution for challenge 32
Fix solution for challenge 32
Go
mit
Metalnem/cryptopals-go,Metalnem/cryptopals-go,Metalnem/cryptopals-go
go
## Code Before: // Challenge 32 - Break HMAC-SHA1 with a slightly less artificial timing leak // http://cryptopals.com/sets/4/challenges/32 package cryptopals import ( "crypto/sha1" "net/http" "time" ) type challenge32 struct { } func (challenge32) ForgeHmacSHA1SignaturePrecise(addr, file string) []byte { sig := make([]byte, sha1.Size) x := challenge31{} for i := 0; i < len(sig); i++ { var valBest byte var timeBest time.Duration for j := 0; j < 256; j++ { sig[i] = byte(j) url := x.buildURL(addr, file, sig) start := time.Now() for k := 0; k < 15; k++ { resp, _ := http.Get(url) resp.Body.Close() } elapsed := time.Since(start) if elapsed > timeBest { valBest = byte(j) timeBest = elapsed } } sig[i] = valBest } return sig } ## Instruction: Fix solution for challenge 32 ## Code After: // Challenge 32 - Break HMAC-SHA1 with a slightly less artificial timing leak // http://cryptopals.com/sets/4/challenges/32 package cryptopals import ( "crypto/sha1" "net/http" "time" ) type challenge32 struct { } func (challenge32) ForgeHmacSHA1SignaturePrecise(addr, file string) []byte { sig := make([]byte, sha1.Size) x := challenge31{} for i := 0; i < len(sig); i++ { var valBest byte var timeBest time.Duration for j := 0; j < 256; j++ { sig[i] = byte(j) url := x.buildURL(addr, file, sig) fastest := time.Hour for k := 0; k < 10; k++ { start := time.Now() resp, _ := http.Get(url) elapsed := time.Since(start) resp.Body.Close() if elapsed < fastest { fastest = elapsed } } if fastest > timeBest { valBest = byte(j) timeBest = fastest } } sig[i] = valBest } return sig }
// Challenge 32 - Break HMAC-SHA1 with a slightly less artificial timing leak // http://cryptopals.com/sets/4/challenges/32 package cryptopals import ( "crypto/sha1" "net/http" "time" ) type challenge32 struct { } func (challenge32) ForgeHmacSHA1SignaturePrecise(addr, file string) []byte { sig := make([]byte, sha1.Size) x := challenge31{} for i := 0; i < len(sig); i++ { var valBest byte var timeBest time.Duration for j := 0; j < 256; j++ { sig[i] = byte(j) url := x.buildURL(addr, file, sig) - start := time.Now() + fastest := time.Hour - for k := 0; k < 15; k++ { ? ^ + for k := 0; k < 10; k++ { ? ^ + start := time.Now() resp, _ := http.Get(url) + elapsed := time.Since(start) resp.Body.Close() + + if elapsed < fastest { + fastest = elapsed + } } - elapsed := time.Since(start) - - if elapsed > timeBest { ? --- ^^ + if fastest > timeBest { ? ++++ ^ valBest = byte(j) - timeBest = elapsed ? --- ^^ + timeBest = fastest ? ++++ ^ } } sig[i] = valBest } return sig }
16
0.355556
10
6
96265dd6ca62791bb4387daf7b5ac8a49ecaefc8
PEInfo/GenerateExe.bat
PEInfo/GenerateExe.bat
@ECHO OFF python3 setup.py py2exe -d "%~dp0\" PAUSE
@ECHO OFF CD /D "%~dp0" SET "OutDir=Output" IF NOT EXIST "%OutDir%" ( MKDIR "%OutDir" ) python3 setup.py py2exe -d "%~dp0\%OutDir%" PAUSE
Update script to generate standalone exe
[PEInfo] Update script to generate standalone exe
Batchfile
bsd-3-clause
winest/PEInfo
batchfile
## Code Before: @ECHO OFF python3 setup.py py2exe -d "%~dp0\" PAUSE ## Instruction: [PEInfo] Update script to generate standalone exe ## Code After: @ECHO OFF CD /D "%~dp0" SET "OutDir=Output" IF NOT EXIST "%OutDir%" ( MKDIR "%OutDir" ) python3 setup.py py2exe -d "%~dp0\%OutDir%" PAUSE
@ECHO OFF + CD /D "%~dp0" + SET "OutDir=Output" + + IF NOT EXIST "%OutDir%" ( MKDIR "%OutDir" ) - python3 setup.py py2exe -d "%~dp0\" + python3 setup.py py2exe -d "%~dp0\%OutDir%" ? ++++++++ PAUSE
6
1.2
5
1
ba7ecab7f19e26aca4e1097e1afc3b0d1aab44c9
src/parsers/expression/window.js
src/parsers/expression/window.js
var _window = (typeof window !== 'undefined' && window) || null; export function open(uri, name) { var df = this.context.dataflow; if (_window && _window.open) { df.loader().sanitize(uri, {context:'open', name:name}) .then(function(url) { _window.open(url, name); }) .catch(function(e) { df.warn('Open url failed: ' + e); }); } else { df.warn('Open function can only be invoked in a browser.'); } } export function screen() { return _window ? _window.screen : {}; } export function windowsize() { return _window ? [_window.innerWidth, _window.innerHeight] : [undefined, undefined]; }
var _window = (typeof window !== 'undefined' && window) || null; export function open(uri, name) { var df = this.context.dataflow; if (_window && _window.open) { df.loader().sanitize(uri, {context:'open', name:name}) .then(function(opt) { _window.open(opt.href, name); }) .catch(function(e) { df.warn('Open url failed: ' + e); }); } else { df.warn('Open function can only be invoked in a browser.'); } } export function screen() { return _window ? _window.screen : {}; } export function windowsize() { return _window ? [_window.innerWidth, _window.innerHeight] : [undefined, undefined]; }
Modify `open`function to use sanitize hash
Modify `open`function to use sanitize hash
JavaScript
bsd-3-clause
vega/vega-parser
javascript
## Code Before: var _window = (typeof window !== 'undefined' && window) || null; export function open(uri, name) { var df = this.context.dataflow; if (_window && _window.open) { df.loader().sanitize(uri, {context:'open', name:name}) .then(function(url) { _window.open(url, name); }) .catch(function(e) { df.warn('Open url failed: ' + e); }); } else { df.warn('Open function can only be invoked in a browser.'); } } export function screen() { return _window ? _window.screen : {}; } export function windowsize() { return _window ? [_window.innerWidth, _window.innerHeight] : [undefined, undefined]; } ## Instruction: Modify `open`function to use sanitize hash ## Code After: var _window = (typeof window !== 'undefined' && window) || null; export function open(uri, name) { var df = this.context.dataflow; if (_window && _window.open) { df.loader().sanitize(uri, {context:'open', name:name}) .then(function(opt) { _window.open(opt.href, name); }) .catch(function(e) { df.warn('Open url failed: ' + e); }); } else { df.warn('Open function can only be invoked in a browser.'); } } export function screen() { return _window ? _window.screen : {}; } export function windowsize() { return _window ? [_window.innerWidth, _window.innerHeight] : [undefined, undefined]; }
var _window = (typeof window !== 'undefined' && window) || null; export function open(uri, name) { var df = this.context.dataflow; if (_window && _window.open) { df.loader().sanitize(uri, {context:'open', name:name}) - .then(function(url) { _window.open(url, name); }) ? ^^^ ^ ^ + .then(function(opt) { _window.open(opt.href, name); }) ? ^^^ ^^^^^ ^^ .catch(function(e) { df.warn('Open url failed: ' + e); }); } else { df.warn('Open function can only be invoked in a browser.'); } } export function screen() { return _window ? _window.screen : {}; } export function windowsize() { return _window ? [_window.innerWidth, _window.innerHeight] : [undefined, undefined]; }
2
0.090909
1
1
d755e7ddf05ea60670517ad0f612e2d38de31c35
.travis.yml
.travis.yml
language: php dist: trusty sudo: false cache: directories: - vendor/ php: - 5.4 - 5.5 - 5.6 - 7.0 - 7.1 - nightly - hhvm notifications: email: false irc: "irc.iiens.net#Erebot" before_script: - rm composer.lock - composer self-update -n - composer install -n script: - vendor/bin/phpunit --coverage-clover clover.xml - vendor/bin/phpcs after_success: - composer require --dev satooshi/php-coveralls - travis_retry vendor/bin/coveralls -n -v
language: php dist: trusty sudo: false cache: directories: - vendor/ php: - 5.4 - 5.5 - 5.6 - 7.0 - 7.1 - 7.2 - nightly notifications: email: false irc: "irc.iiens.net#Erebot" before_script: - rm composer.lock - composer self-update -n - composer install -n script: - vendor/bin/phpunit --coverage-clover clover.xml - vendor/bin/phpcs after_success: - composer require --dev satooshi/php-coveralls - travis_retry vendor/bin/coveralls -n -v
Add support for PHP 7.2 & remove HHVM
Add support for PHP 7.2 & remove HHVM
YAML
mit
Erebot/Plop
yaml
## Code Before: language: php dist: trusty sudo: false cache: directories: - vendor/ php: - 5.4 - 5.5 - 5.6 - 7.0 - 7.1 - nightly - hhvm notifications: email: false irc: "irc.iiens.net#Erebot" before_script: - rm composer.lock - composer self-update -n - composer install -n script: - vendor/bin/phpunit --coverage-clover clover.xml - vendor/bin/phpcs after_success: - composer require --dev satooshi/php-coveralls - travis_retry vendor/bin/coveralls -n -v ## Instruction: Add support for PHP 7.2 & remove HHVM ## Code After: language: php dist: trusty sudo: false cache: directories: - vendor/ php: - 5.4 - 5.5 - 5.6 - 7.0 - 7.1 - 7.2 - nightly notifications: email: false irc: "irc.iiens.net#Erebot" before_script: - rm composer.lock - composer self-update -n - composer install -n script: - vendor/bin/phpunit --coverage-clover clover.xml - vendor/bin/phpcs after_success: - composer require --dev satooshi/php-coveralls - travis_retry vendor/bin/coveralls -n -v
language: php dist: trusty sudo: false cache: directories: - vendor/ php: - 5.4 - 5.5 - 5.6 - 7.0 - 7.1 + - 7.2 - nightly - - hhvm notifications: email: false irc: "irc.iiens.net#Erebot" before_script: - rm composer.lock - composer self-update -n - composer install -n script: - vendor/bin/phpunit --coverage-clover clover.xml - vendor/bin/phpcs after_success: - composer require --dev satooshi/php-coveralls - travis_retry vendor/bin/coveralls -n -v
2
0.057143
1
1
b431122b6279b08e224b143a2ad408acbe49eb46
spec/letsencrypt/configuration_spec.rb
spec/letsencrypt/configuration_spec.rb
require 'rails_helper' RSpec.describe LetsEncrypt::Configuration do describe '#use_redis?' do it 'reture is user enable save to redis value' do subject.save_to_redis = true expect(subject.use_redis?).to be_truthy end end describe '#use_staging?' do it 'return same value as #use_staging config attribute' do subject.use_staging = true expect(subject.use_staging?).to eq(subject.use_staging) end end end
require 'rails_helper' RSpec.describe LetsEncrypt::Configuration do describe '#use_redis?' do it 'reture is user enable save to redis value' do subject.save_to_redis = true expect(subject.use_redis?).to be_truthy end end describe '#use_staging?' do it 'return same value as #use_staging config attribute' do subject.use_staging = true expect(subject.use_staging?).to eq(subject.use_staging) end end describe 'customize certificate model' do class OtherModel < LetsEncrypt::Certificate after_update :success def success 'success' end end before(:each) do LetsEncrypt.config.certificate_model = 'OtherModel' LetsEncrypt.certificate_model.create( domain: 'example.com', verification_path: '.well-known/acme-challenge/valid_path', verification_string: 'verification' ) end it 'update data' do expect_any_instance_of(OtherModel).to receive(:success) LetsEncrypt.certificate_model.first.update(renew_after: 3.days.ago) end end end
Add test to check callback will work in customize model
Add test to check callback will work in customize model
Ruby
mit
elct9620/rails-letsencrypt,elct9620/rails-letsencrypt,elct9620/rails-letsencrypt
ruby
## Code Before: require 'rails_helper' RSpec.describe LetsEncrypt::Configuration do describe '#use_redis?' do it 'reture is user enable save to redis value' do subject.save_to_redis = true expect(subject.use_redis?).to be_truthy end end describe '#use_staging?' do it 'return same value as #use_staging config attribute' do subject.use_staging = true expect(subject.use_staging?).to eq(subject.use_staging) end end end ## Instruction: Add test to check callback will work in customize model ## Code After: require 'rails_helper' RSpec.describe LetsEncrypt::Configuration do describe '#use_redis?' do it 'reture is user enable save to redis value' do subject.save_to_redis = true expect(subject.use_redis?).to be_truthy end end describe '#use_staging?' do it 'return same value as #use_staging config attribute' do subject.use_staging = true expect(subject.use_staging?).to eq(subject.use_staging) end end describe 'customize certificate model' do class OtherModel < LetsEncrypt::Certificate after_update :success def success 'success' end end before(:each) do LetsEncrypt.config.certificate_model = 'OtherModel' LetsEncrypt.certificate_model.create( domain: 'example.com', verification_path: '.well-known/acme-challenge/valid_path', verification_string: 'verification' ) end it 'update data' do expect_any_instance_of(OtherModel).to receive(:success) LetsEncrypt.certificate_model.first.update(renew_after: 3.days.ago) end end end
require 'rails_helper' RSpec.describe LetsEncrypt::Configuration do describe '#use_redis?' do it 'reture is user enable save to redis value' do subject.save_to_redis = true expect(subject.use_redis?).to be_truthy end end describe '#use_staging?' do it 'return same value as #use_staging config attribute' do subject.use_staging = true expect(subject.use_staging?).to eq(subject.use_staging) end end + + describe 'customize certificate model' do + class OtherModel < LetsEncrypt::Certificate + after_update :success + + def success + 'success' + end + end + + before(:each) do + LetsEncrypt.config.certificate_model = 'OtherModel' + LetsEncrypt.certificate_model.create( + domain: 'example.com', + verification_path: '.well-known/acme-challenge/valid_path', + verification_string: 'verification' + ) + end + + it 'update data' do + expect_any_instance_of(OtherModel).to receive(:success) + LetsEncrypt.certificate_model.first.update(renew_after: 3.days.ago) + end + end end
24
1.333333
24
0
358f9689dc209798c273bc3f1343d57e577d6428
library/setup.cfg
library/setup.cfg
[metadata] name = pibrella version = 1.4.0 author = Philip Howard author_email = phil@pimoroni.com description = A module to control the Pibrella Raspberry Pi Addon Board long_description = file: README.md long_description_content_type = text/markdown url = http://www.pibrella.com project_urls = GitHub=https://www.github.com/pimoroni/bme280-python license = MIT # This includes the license file(s) in the wheel. # https://wheel.readthedocs.io/en/stable/user_guide.html#including-license-files-in-the-generated-wheel-file license_files = LICENSE.txt classifiers = Development Status :: 4 - Beta Operating System :: POSIX :: Linux License :: OSI Approved :: MIT License Intended Audience :: Developers Programming Language :: Python :: 2.6 Programming Language :: Python :: 2.7 Programming Language :: Python :: 3 Topic :: Software Development Topic :: Software Development :: Libraries Topic :: System :: Hardware [options] packages = pibrella install_requires = RPi.GPIO [flake8] exclude = .tox, .eggs, .git, __pycache__, build, dist ignore = E501 [pimoroni] py2deps = python-pip py3deps = python-pip configtxt = commands =
[metadata] name = pibrella version = 1.4.1 author = Philip Howard author_email = phil@pimoroni.com description = A module to control the Pibrella Raspberry Pi Addon Board long_description = file: README.md long_description_content_type = text/markdown url = http://www.pibrella.com project_urls = GitHub=https://www.github.com/pimoroni/pibrella license = MIT # This includes the license file(s) in the wheel. # https://wheel.readthedocs.io/en/stable/user_guide.html#including-license-files-in-the-generated-wheel-file license_files = LICENSE.txt classifiers = Development Status :: 4 - Beta Operating System :: POSIX :: Linux License :: OSI Approved :: MIT License Intended Audience :: Developers Programming Language :: Python :: 2.6 Programming Language :: Python :: 2.7 Programming Language :: Python :: 3 Topic :: Software Development Topic :: Software Development :: Libraries Topic :: System :: Hardware [options] packages = pibrella install_requires = RPi.GPIO [flake8] exclude = .tox, .eggs, .git, __pycache__, build, dist ignore = E501 [pimoroni] py2deps = python-pip py3deps = python-pip configtxt = commands =
Fix github URL, bump version
Fix github URL, bump version
INI
mit
pimoroni/pibrella,pimoroni/pibrella
ini
## Code Before: [metadata] name = pibrella version = 1.4.0 author = Philip Howard author_email = phil@pimoroni.com description = A module to control the Pibrella Raspberry Pi Addon Board long_description = file: README.md long_description_content_type = text/markdown url = http://www.pibrella.com project_urls = GitHub=https://www.github.com/pimoroni/bme280-python license = MIT # This includes the license file(s) in the wheel. # https://wheel.readthedocs.io/en/stable/user_guide.html#including-license-files-in-the-generated-wheel-file license_files = LICENSE.txt classifiers = Development Status :: 4 - Beta Operating System :: POSIX :: Linux License :: OSI Approved :: MIT License Intended Audience :: Developers Programming Language :: Python :: 2.6 Programming Language :: Python :: 2.7 Programming Language :: Python :: 3 Topic :: Software Development Topic :: Software Development :: Libraries Topic :: System :: Hardware [options] packages = pibrella install_requires = RPi.GPIO [flake8] exclude = .tox, .eggs, .git, __pycache__, build, dist ignore = E501 [pimoroni] py2deps = python-pip py3deps = python-pip configtxt = commands = ## Instruction: Fix github URL, bump version ## Code After: [metadata] name = pibrella version = 1.4.1 author = Philip Howard author_email = phil@pimoroni.com description = A module to control the Pibrella Raspberry Pi Addon Board long_description = file: README.md long_description_content_type = text/markdown url = http://www.pibrella.com project_urls = GitHub=https://www.github.com/pimoroni/pibrella license = MIT # This includes the license file(s) in the wheel. # https://wheel.readthedocs.io/en/stable/user_guide.html#including-license-files-in-the-generated-wheel-file license_files = LICENSE.txt classifiers = Development Status :: 4 - Beta Operating System :: POSIX :: Linux License :: OSI Approved :: MIT License Intended Audience :: Developers Programming Language :: Python :: 2.6 Programming Language :: Python :: 2.7 Programming Language :: Python :: 3 Topic :: Software Development Topic :: Software Development :: Libraries Topic :: System :: Hardware [options] packages = pibrella install_requires = RPi.GPIO [flake8] exclude = .tox, .eggs, .git, __pycache__, build, dist ignore = E501 [pimoroni] py2deps = python-pip py3deps = python-pip configtxt = commands =
[metadata] name = pibrella - version = 1.4.0 ? ^ + version = 1.4.1 ? ^ author = Philip Howard author_email = phil@pimoroni.com description = A module to control the Pibrella Raspberry Pi Addon Board long_description = file: README.md long_description_content_type = text/markdown url = http://www.pibrella.com project_urls = - GitHub=https://www.github.com/pimoroni/bme280-python ? ^ ^^^^^^^^^^ + GitHub=https://www.github.com/pimoroni/pibrella ? ++ ^ ^^^ license = MIT # This includes the license file(s) in the wheel. # https://wheel.readthedocs.io/en/stable/user_guide.html#including-license-files-in-the-generated-wheel-file license_files = LICENSE.txt classifiers = Development Status :: 4 - Beta Operating System :: POSIX :: Linux License :: OSI Approved :: MIT License Intended Audience :: Developers Programming Language :: Python :: 2.6 Programming Language :: Python :: 2.7 Programming Language :: Python :: 3 Topic :: Software Development Topic :: Software Development :: Libraries Topic :: System :: Hardware [options] packages = pibrella install_requires = RPi.GPIO [flake8] exclude = .tox, .eggs, .git, __pycache__, build, dist ignore = E501 [pimoroni] py2deps = python-pip py3deps = python-pip configtxt = commands =
4
0.08
2
2
dcac37aa43d3ff1412680ab8ccb5068bc18dac21
.travis.yml
.travis.yml
language: r sudo: required dist: trusty r_build_args: --no-build-vignettes r_check_args: --no-build-vignettes addons: apt: packages: - libgdal-dev - libproj-dev r: - release - devel - oldrel notifications: email: on_success: change on_failure: change
language: r sudo: required dist: trusty r_build_args: --no-build-vignettes r_check_args: --no-build-vignettes addons: apt: sources: - sourceline: 'ppa:ubuntugis/ubuntugis-unstable' packages: - libgdal-dev - libproj-dev r: - release - devel - oldrel notifications: email: on_success: change on_failure: change
Add ubuntugis-unstable to apt sources
Travis: Add ubuntugis-unstable to apt sources
YAML
apache-2.0
bcgov/bcgroundwater
yaml
## Code Before: language: r sudo: required dist: trusty r_build_args: --no-build-vignettes r_check_args: --no-build-vignettes addons: apt: packages: - libgdal-dev - libproj-dev r: - release - devel - oldrel notifications: email: on_success: change on_failure: change ## Instruction: Travis: Add ubuntugis-unstable to apt sources ## Code After: language: r sudo: required dist: trusty r_build_args: --no-build-vignettes r_check_args: --no-build-vignettes addons: apt: sources: - sourceline: 'ppa:ubuntugis/ubuntugis-unstable' packages: - libgdal-dev - libproj-dev r: - release - devel - oldrel notifications: email: on_success: change on_failure: change
language: r sudo: required dist: trusty r_build_args: --no-build-vignettes r_check_args: --no-build-vignettes addons: apt: + sources: + - sourceline: 'ppa:ubuntugis/ubuntugis-unstable' packages: - libgdal-dev - libproj-dev r: - release - devel - oldrel notifications: email: on_success: change on_failure: change
2
0.090909
2
0
3480438a6698868c7cdcd743489d5e597e318cbe
app/main.js
app/main.js
// Install the configuration interface and set default values require('configuration.js'); // Use DIM styling overrides and our own custom styling require('beautification.js'); // Stores weapons pulled from our custom database require('weaponDatabase.js'); // Pulls down weapon data and broadcasts updates require('weaponDataRefresher.js'); // Removes DIM's native tagging elements require('dimTagRemover.js'); // Store original details about the weapon in 'data-fate' attributes require('weaponDecorator.js'); // Update weapon comments from our database require('commentDecorator.js'); // Rejigger how weapons with legendary mods are displayed require('modIndicator.js'); // Show higher/lower dupes require('dupeIndicator.js'); /* The nicest change-refresh flow means loading the development version of the script from Tampermonkey while editing. This lets us skip kicking off the app when running under Karma. */ if (!window.navigator.userAgent.includes('HeadlessChrome')) { // const logger = require('logger'); // const postal = require('postal'); // logger.log('main.js: Initializing'); // postal.publish({topic:'fate.init'}); // // setInterval(function() { // postal.publish({topic:'fate.refresh'}); // }, 5000); // // setInterval(function() { // postal.publish({topic:'fate.weaponDataStale'}); // }, 30000); }
// Install the configuration interface and set default values require('configuration.js'); // Use DIM styling overrides and our own custom styling require('beautification.js'); // Stores weapons pulled from our custom database require('weaponDatabase.js'); // Pulls down weapon data and broadcasts updates require('weaponDataRefresher.js'); // Removes DIM's native tagging elements require('dimTagRemover.js'); // Store original details about the weapon in 'data-fate' attributes require('weaponDecorator.js'); // Update weapon comments from our database require('commentDecorator.js'); // Rejigger how weapons with legendary mods are displayed require('modIndicator.js'); // Show higher/lower dupes require('dupeIndicator.js'); /* The nicest change-refresh flow means loading the development version of the script from Tampermonkey while editing. This lets us skip kicking off the app when running under Karma. */ if (!window.navigator.userAgent.includes('HeadlessChrome')) { const logger = require('logger'); logger.log('main.js: Initializing'); fateBus.publish(module, 'fate.init'); setInterval(function() { fateBus.publish(module, 'fate.refresh'); }, 5000); setInterval(function() { fateBus.publish(module, 'fate.weaponDataStale'); }, 30000); }
Swap over to using fateBus.
Swap over to using fateBus.
JavaScript
mit
rslifka/fate_of_all_fools,rslifka/fate_of_all_fools
javascript
## Code Before: // Install the configuration interface and set default values require('configuration.js'); // Use DIM styling overrides and our own custom styling require('beautification.js'); // Stores weapons pulled from our custom database require('weaponDatabase.js'); // Pulls down weapon data and broadcasts updates require('weaponDataRefresher.js'); // Removes DIM's native tagging elements require('dimTagRemover.js'); // Store original details about the weapon in 'data-fate' attributes require('weaponDecorator.js'); // Update weapon comments from our database require('commentDecorator.js'); // Rejigger how weapons with legendary mods are displayed require('modIndicator.js'); // Show higher/lower dupes require('dupeIndicator.js'); /* The nicest change-refresh flow means loading the development version of the script from Tampermonkey while editing. This lets us skip kicking off the app when running under Karma. */ if (!window.navigator.userAgent.includes('HeadlessChrome')) { // const logger = require('logger'); // const postal = require('postal'); // logger.log('main.js: Initializing'); // postal.publish({topic:'fate.init'}); // // setInterval(function() { // postal.publish({topic:'fate.refresh'}); // }, 5000); // // setInterval(function() { // postal.publish({topic:'fate.weaponDataStale'}); // }, 30000); } ## Instruction: Swap over to using fateBus. ## Code After: // Install the configuration interface and set default values require('configuration.js'); // Use DIM styling overrides and our own custom styling require('beautification.js'); // Stores weapons pulled from our custom database require('weaponDatabase.js'); // Pulls down weapon data and broadcasts updates require('weaponDataRefresher.js'); // Removes DIM's native tagging elements require('dimTagRemover.js'); // Store original details about the weapon in 'data-fate' attributes require('weaponDecorator.js'); // Update weapon comments from our database require('commentDecorator.js'); // Rejigger how weapons with legendary mods are displayed require('modIndicator.js'); // Show higher/lower dupes require('dupeIndicator.js'); /* The nicest change-refresh flow means loading the development version of the script from Tampermonkey while editing. This lets us skip kicking off the app when running under Karma. */ if (!window.navigator.userAgent.includes('HeadlessChrome')) { const logger = require('logger'); logger.log('main.js: Initializing'); fateBus.publish(module, 'fate.init'); setInterval(function() { fateBus.publish(module, 'fate.refresh'); }, 5000); setInterval(function() { fateBus.publish(module, 'fate.weaponDataStale'); }, 30000); }
// Install the configuration interface and set default values require('configuration.js'); // Use DIM styling overrides and our own custom styling require('beautification.js'); // Stores weapons pulled from our custom database require('weaponDatabase.js'); // Pulls down weapon data and broadcasts updates require('weaponDataRefresher.js'); // Removes DIM's native tagging elements require('dimTagRemover.js'); // Store original details about the weapon in 'data-fate' attributes require('weaponDecorator.js'); // Update weapon comments from our database require('commentDecorator.js'); // Rejigger how weapons with legendary mods are displayed require('modIndicator.js'); // Show higher/lower dupes require('dupeIndicator.js'); /* The nicest change-refresh flow means loading the development version of the script from Tampermonkey while editing. This lets us skip kicking off the app when running under Karma. */ if (!window.navigator.userAgent.includes('HeadlessChrome')) { - // const logger = require('logger'); ? --- + const logger = require('logger'); - // const postal = require('postal'); - // logger.log('main.js: Initializing'); ? --- + logger.log('main.js: Initializing'); - // postal.publish({topic:'fate.init'}); - // + fateBus.publish(module, 'fate.init'); + - // setInterval(function() { ? --- + setInterval(function() { - // postal.publish({topic:'fate.refresh'}); + fateBus.publish(module, 'fate.refresh'); - // }, 5000); ? --- + }, 5000); - // + - // setInterval(function() { ? --- + setInterval(function() { - // postal.publish({topic:'fate.weaponDataStale'}); + fateBus.publish(module, 'fate.weaponDataStale'); - // }, 30000); ? --- + }, 30000); }
23
0.5
11
12
23b3039c2684b50d5ac28122c35f2ccf8b629236
README.md
README.md
A tool for observing file directories and reacting to changes. Also includes `reaktd` for working with long-running processes. ### Usage ``` $ reakt [options] <command> $ reaktd [options] <command> ``` ### Examples ``` $ reakt say "files updated" $ reakt coffee -c src/foo.coffee -o lib $ reaktd ./start_server.sh ```
A tool for observing file directories and reacting to changes. Also includes `reaktd` for working with long-running processes. ### Usage ``` $ reakt [options] <command> $ reaktd [options] <command> Options: -h, --help output usage information -V, --version output the version number -g, --grep [pattern] run <command> when files matching [pattern] change -v, --invert [pattern] do not run <command> if files matching [pattern] change -i, --interval [milliseconds] polling interval in ms - defaults to 1000ms ``` ### Examples ``` $ reakt say "files updated" $ reakt coffee -c src/foo.coffee -o lib $ reakt -g "^\/src" make test $ reaktd ./start_server.sh ```
Add all available options to readme and an extra example
Add all available options to readme and an extra example
Markdown
mit
thetristan/reakt
markdown
## Code Before: A tool for observing file directories and reacting to changes. Also includes `reaktd` for working with long-running processes. ### Usage ``` $ reakt [options] <command> $ reaktd [options] <command> ``` ### Examples ``` $ reakt say "files updated" $ reakt coffee -c src/foo.coffee -o lib $ reaktd ./start_server.sh ``` ## Instruction: Add all available options to readme and an extra example ## Code After: A tool for observing file directories and reacting to changes. Also includes `reaktd` for working with long-running processes. ### Usage ``` $ reakt [options] <command> $ reaktd [options] <command> Options: -h, --help output usage information -V, --version output the version number -g, --grep [pattern] run <command> when files matching [pattern] change -v, --invert [pattern] do not run <command> if files matching [pattern] change -i, --interval [milliseconds] polling interval in ms - defaults to 1000ms ``` ### Examples ``` $ reakt say "files updated" $ reakt coffee -c src/foo.coffee -o lib $ reakt -g "^\/src" make test $ reaktd ./start_server.sh ```
A tool for observing file directories and reacting to changes. Also includes `reaktd` for working with long-running processes. ### Usage ``` $ reakt [options] <command> $ reaktd [options] <command> + + Options: + + -h, --help output usage information + -V, --version output the version number + -g, --grep [pattern] run <command> when files matching [pattern] change + -v, --invert [pattern] do not run <command> if files matching [pattern] change + -i, --interval [milliseconds] polling interval in ms - defaults to 1000ms ``` ### Examples ``` $ reakt say "files updated" $ reakt coffee -c src/foo.coffee -o lib + $ reakt -g "^\/src" make test $ reaktd ./start_server.sh ```
9
0.529412
9
0
43f8cc3d1d0b4cabf39aa474a61720b1cc4c21e0
test/src/xquery/xquery3/braced-uri-literal.xml
test/src/xquery/xquery3/braced-uri-literal.xml
<?xml version="1.0" encoding="UTF-8"?> <TestSet> <testName>braced uri literal</testName> <description> <p>Test xquery3 braced uri literals</p> <author>Adam Retter</author> </description> <test> <task>Test URIBracedLiteral in element name test</task> <code><![CDATA[ xquery version "3.0"; declare namespace s = "https://stuff"; typeswitch(<s:create-collection/>) case element(Q{https://stuff}create-collection) return <create/> default return <default/> ]]> </code> <expected><create/></expected> </test> <test> <task>Test URIBracedLiteral as function name</task> <code><![CDATA[ xquery version "3.0"; declare namespace o = "https://other"; declare function Q{https://other}hello() { <hello/> }; o:hello() ]]> </code> <expected><hello/></expected> </test> </TestSet>
<?xml version="1.0" encoding="UTF-8"?> <TestSet> <testName>braced uri literal</testName> <description> <p>Test xquery3 braced uri literals</p> <author>Adam Retter</author> </description> <test> <task>Test URIBracedLiteral in element name test</task> <code><![CDATA[ xquery version "3.0"; declare namespace s = "https://stuff"; typeswitch(<s:create-collection/>) case element(Q{https://stuff}create-collection) return <create/> default return <default/> ]]> </code> <expected><create/></expected> </test> <test> <task>Test URIBracedLiteral as function name</task> <code><![CDATA[ xquery version "3.0"; declare namespace o = "https://other"; declare function Q{https://other}hello() { <hello/> }; o:hello() ]]> </code> <expected><hello/></expected> </test> <test output="text"> <task>Test URIBracedLiteral as function call</task> <code><![CDATA[ xquery version "3.0"; declare namespace o = "https://other"; declare function o:hello() { <hello/> }; fn:count(Q{https://other}hello()) ]]> </code> <expected>1</expected> </test> </TestSet>
Check URIBracedLiteral as function call works
[test] Check URIBracedLiteral as function call works
XML
lgpl-2.1
eXist-db/exist,eXist-db/exist,MjAbuz/exist,adamretter/exist,MjAbuz/exist,wshager/exist,kohsah/exist,ljo/exist,ljo/exist,olvidalo/exist,jensopetersen/exist,ambs/exist,ljo/exist,opax/exist,zwobit/exist,RemiKoutcherawy/exist,olvidalo/exist,MjAbuz/exist,windauer/exist,adamretter/exist,patczar/exist,joewiz/exist,wshager/exist,zwobit/exist,lcahlander/exist,zwobit/exist,olvidalo/exist,MjAbuz/exist,eXist-db/exist,RemiKoutcherawy/exist,joewiz/exist,MjAbuz/exist,kohsah/exist,eXist-db/exist,dizzzz/exist,ljo/exist,windauer/exist,olvidalo/exist,dizzzz/exist,hungerburg/exist,jensopetersen/exist,jensopetersen/exist,RemiKoutcherawy/exist,patczar/exist,jensopetersen/exist,jessealama/exist,adamretter/exist,kohsah/exist,opax/exist,lcahlander/exist,hungerburg/exist,eXist-db/exist,hungerburg/exist,ljo/exist,adamretter/exist,jensopetersen/exist,lcahlander/exist,opax/exist,jessealama/exist,RemiKoutcherawy/exist,windauer/exist,RemiKoutcherawy/exist,jensopetersen/exist,wshager/exist,lcahlander/exist,ambs/exist,kohsah/exist,dizzzz/exist,jessealama/exist,dizzzz/exist,wolfgangmm/exist,wshager/exist,hungerburg/exist,lcahlander/exist,jessealama/exist,joewiz/exist,RemiKoutcherawy/exist,windauer/exist,wolfgangmm/exist,wshager/exist,patczar/exist,opax/exist,opax/exist,joewiz/exist,kohsah/exist,joewiz/exist,jessealama/exist,adamretter/exist,eXist-db/exist,MjAbuz/exist,zwobit/exist,dizzzz/exist,wolfgangmm/exist,olvidalo/exist,zwobit/exist,ambs/exist,wolfgangmm/exist,wolfgangmm/exist,lcahlander/exist,adamretter/exist,kohsah/exist,ambs/exist,patczar/exist,windauer/exist,hungerburg/exist,wolfgangmm/exist,ambs/exist,patczar/exist,joewiz/exist,ljo/exist,windauer/exist,wshager/exist,jessealama/exist,ambs/exist,dizzzz/exist,patczar/exist,zwobit/exist
xml
## Code Before: <?xml version="1.0" encoding="UTF-8"?> <TestSet> <testName>braced uri literal</testName> <description> <p>Test xquery3 braced uri literals</p> <author>Adam Retter</author> </description> <test> <task>Test URIBracedLiteral in element name test</task> <code><![CDATA[ xquery version "3.0"; declare namespace s = "https://stuff"; typeswitch(<s:create-collection/>) case element(Q{https://stuff}create-collection) return <create/> default return <default/> ]]> </code> <expected><create/></expected> </test> <test> <task>Test URIBracedLiteral as function name</task> <code><![CDATA[ xquery version "3.0"; declare namespace o = "https://other"; declare function Q{https://other}hello() { <hello/> }; o:hello() ]]> </code> <expected><hello/></expected> </test> </TestSet> ## Instruction: [test] Check URIBracedLiteral as function call works ## Code After: <?xml version="1.0" encoding="UTF-8"?> <TestSet> <testName>braced uri literal</testName> <description> <p>Test xquery3 braced uri literals</p> <author>Adam Retter</author> </description> <test> <task>Test URIBracedLiteral in element name test</task> <code><![CDATA[ xquery version "3.0"; declare namespace s = "https://stuff"; typeswitch(<s:create-collection/>) case element(Q{https://stuff}create-collection) return <create/> default return <default/> ]]> </code> <expected><create/></expected> </test> <test> <task>Test URIBracedLiteral as function name</task> <code><![CDATA[ xquery version "3.0"; declare namespace o = "https://other"; declare function Q{https://other}hello() { <hello/> }; o:hello() ]]> </code> <expected><hello/></expected> </test> <test output="text"> <task>Test URIBracedLiteral as function call</task> <code><![CDATA[ xquery version "3.0"; declare namespace o = "https://other"; declare function o:hello() { <hello/> }; fn:count(Q{https://other}hello()) ]]> </code> <expected>1</expected> </test> </TestSet>
<?xml version="1.0" encoding="UTF-8"?> <TestSet> <testName>braced uri literal</testName> <description> <p>Test xquery3 braced uri literals</p> <author>Adam Retter</author> </description> <test> <task>Test URIBracedLiteral in element name test</task> <code><![CDATA[ xquery version "3.0"; declare namespace s = "https://stuff"; typeswitch(<s:create-collection/>) case element(Q{https://stuff}create-collection) return <create/> default return <default/> ]]> </code> <expected><create/></expected> </test> <test> <task>Test URIBracedLiteral as function name</task> <code><![CDATA[ xquery version "3.0"; declare namespace o = "https://other"; declare function Q{https://other}hello() { <hello/> }; o:hello() ]]> </code> <expected><hello/></expected> </test> + <test output="text"> + <task>Test URIBracedLiteral as function call</task> + <code><![CDATA[ + xquery version "3.0"; + declare namespace o = "https://other"; + + declare function o:hello() { + <hello/> + }; + + fn:count(Q{https://other}hello()) + ]]> + </code> + <expected>1</expected> + </test> </TestSet>
15
0.384615
15
0
32894839a289ae83337a49a223c19225f0ad07e8
ci/tasks/test-docker-boshrelease.yml
ci/tasks/test-docker-boshrelease.yml
platform: linux image_resource: type: docker-image source: repository: pcfkubo/kubo-ci tag: latest inputs: - name: git-docker-boshrelease params: AWS_ACCESS_KEY: AWS_SECRET_KEY: PROXY_PRIVATE_KEY: PROXY_USERNAME: PROXY_IP: BOSH_CA_CERT: BOSH_CLIENT: BOSH_CLIENT_SECRET: BOSH_DEPLOYMENT: BOSH_ENVIRONMENT: MANIFEST_PATH: run: path: ./git-docker-boshrelease/ci/scripts/test-docker-boshrelease
platform: linux image_resource: type: docker-image source: repository: pcfkubo/kubo-ci tag: latest inputs: - name: git-docker-boshrelease - name: stemcell params: PROXY_PRIVATE_KEY: PROXY_USERNAME: PROXY_IP: BOSH_CA_CERT: BOSH_CLIENT: BOSH_CLIENT_SECRET: BOSH_DEPLOYMENT: BOSH_ENVIRONMENT: MANIFEST_PATH: run: path: ./git-docker-boshrelease/ci/scripts/test-docker-boshrelease
Add stemcell input to test task
Add stemcell input to test task [#154767498] Signed-off-by: Christian Ang <15ae83feccf4d84083a62aba1cdf46bd8ce2e33b@pivotal.io>
YAML
apache-2.0
cf-platform-eng/docker-boshrelease,cloudfoundry-community/docker-boshrelease,cf-platform-eng/docker-boshrelease,cloudfoundry-community/docker-boshrelease
yaml
## Code Before: platform: linux image_resource: type: docker-image source: repository: pcfkubo/kubo-ci tag: latest inputs: - name: git-docker-boshrelease params: AWS_ACCESS_KEY: AWS_SECRET_KEY: PROXY_PRIVATE_KEY: PROXY_USERNAME: PROXY_IP: BOSH_CA_CERT: BOSH_CLIENT: BOSH_CLIENT_SECRET: BOSH_DEPLOYMENT: BOSH_ENVIRONMENT: MANIFEST_PATH: run: path: ./git-docker-boshrelease/ci/scripts/test-docker-boshrelease ## Instruction: Add stemcell input to test task [#154767498] Signed-off-by: Christian Ang <15ae83feccf4d84083a62aba1cdf46bd8ce2e33b@pivotal.io> ## Code After: platform: linux image_resource: type: docker-image source: repository: pcfkubo/kubo-ci tag: latest inputs: - name: git-docker-boshrelease - name: stemcell params: PROXY_PRIVATE_KEY: PROXY_USERNAME: PROXY_IP: BOSH_CA_CERT: BOSH_CLIENT: BOSH_CLIENT_SECRET: BOSH_DEPLOYMENT: BOSH_ENVIRONMENT: MANIFEST_PATH: run: path: ./git-docker-boshrelease/ci/scripts/test-docker-boshrelease
platform: linux image_resource: type: docker-image source: repository: pcfkubo/kubo-ci tag: latest inputs: - name: git-docker-boshrelease + - name: stemcell params: - AWS_ACCESS_KEY: - AWS_SECRET_KEY: PROXY_PRIVATE_KEY: PROXY_USERNAME: PROXY_IP: BOSH_CA_CERT: BOSH_CLIENT: BOSH_CLIENT_SECRET: BOSH_DEPLOYMENT: BOSH_ENVIRONMENT: MANIFEST_PATH: run: path: ./git-docker-boshrelease/ci/scripts/test-docker-boshrelease
3
0.115385
1
2
ab5ebb50019add34333edb04cc96f7f55fce8d1c
src/toil/utils/__init__.py
src/toil/utils/__init__.py
from __future__ import absolute_import from toil import version import logging logger = logging.getLogger(__name__) def addBasicProvisionerOptions(parser): parser.add_argument("--version", action='version', version=version) parser.add_argument('-p', "--provisioner", dest='provisioner', choices=['aws', 'azure', 'gce'], required=False, default="aws", help="The provisioner for cluster auto-scaling. Only aws is currently " "supported") try: from toil.provisioners.aws import getCurrentAWSZone currentZone = getCurrentAWSZone() except ImportError: currentZone = None zoneString = currentZone if currentZone else 'No zone could be determined' parser.add_argument('-z', '--zone', dest='zone', required=False, default=currentZone, help="The AWS availability zone of the master. This parameter can also be " "set via the TOIL_AWS_ZONE environment variable, or by the ec2_region_name " "parameter in your .boto file, or derived from the instance metadata if " "using this utility on an existing EC2 instance. " "Currently: %s" % zoneString) parser.add_argument("clusterName", help="The name that the cluster will be identifiable by. " "Must be lowercase and may not contain the '_' " "character.") return parser
from __future__ import absolute_import from toil import version import logging import os logger = logging.getLogger(__name__) def addBasicProvisionerOptions(parser): parser.add_argument("--version", action='version', version=version) parser.add_argument('-p', "--provisioner", dest='provisioner', choices=['aws', 'azure', 'gce'], required=False, default="aws", help="The provisioner for cluster auto-scaling. Only aws is currently " "supported") parser.add_argument('-z', '--zone', dest='zone', required=False, default=None, help="The availability zone of the master. This parameter can also be set via the 'TOIL_X_ZONE' " "environment variable, where X is AWS, GCE, or AZURE, or by the ec2_region_name parameter " "in your .boto file, or derived from the instance metadata if using this utility on an " "existing EC2 instance.") parser.add_argument("clusterName", help="The name that the cluster will be identifiable by. " "Must be lowercase and may not contain the '_' " "character.") return parser def getZoneFromEnv(provisioner): """ Find the zone specified in an environment variable. The user can specify zones in environment variables in leiu of writing them at the commandline every time. Given a provisioner, this method will look for the stored value and return it. :param str provisioner: One of the supported provisioners ('azure', 'aws', 'gce') :rtype: str :return: None or the value stored in a 'TOIL_X_ZONE' environment variable. """ return os.environ.get('TOIL_' + provisioner.upper() + '_ZONE')
Remove default for zone, add method for searching for specified zone in environ vars.
Remove default for zone, add method for searching for specified zone in environ vars.
Python
apache-2.0
BD2KGenomics/slugflow,BD2KGenomics/slugflow
python
## Code Before: from __future__ import absolute_import from toil import version import logging logger = logging.getLogger(__name__) def addBasicProvisionerOptions(parser): parser.add_argument("--version", action='version', version=version) parser.add_argument('-p', "--provisioner", dest='provisioner', choices=['aws', 'azure', 'gce'], required=False, default="aws", help="The provisioner for cluster auto-scaling. Only aws is currently " "supported") try: from toil.provisioners.aws import getCurrentAWSZone currentZone = getCurrentAWSZone() except ImportError: currentZone = None zoneString = currentZone if currentZone else 'No zone could be determined' parser.add_argument('-z', '--zone', dest='zone', required=False, default=currentZone, help="The AWS availability zone of the master. This parameter can also be " "set via the TOIL_AWS_ZONE environment variable, or by the ec2_region_name " "parameter in your .boto file, or derived from the instance metadata if " "using this utility on an existing EC2 instance. " "Currently: %s" % zoneString) parser.add_argument("clusterName", help="The name that the cluster will be identifiable by. " "Must be lowercase and may not contain the '_' " "character.") return parser ## Instruction: Remove default for zone, add method for searching for specified zone in environ vars. ## Code After: from __future__ import absolute_import from toil import version import logging import os logger = logging.getLogger(__name__) def addBasicProvisionerOptions(parser): parser.add_argument("--version", action='version', version=version) parser.add_argument('-p', "--provisioner", dest='provisioner', choices=['aws', 'azure', 'gce'], required=False, default="aws", help="The provisioner for cluster auto-scaling. Only aws is currently " "supported") parser.add_argument('-z', '--zone', dest='zone', required=False, default=None, help="The availability zone of the master. This parameter can also be set via the 'TOIL_X_ZONE' " "environment variable, where X is AWS, GCE, or AZURE, or by the ec2_region_name parameter " "in your .boto file, or derived from the instance metadata if using this utility on an " "existing EC2 instance.") parser.add_argument("clusterName", help="The name that the cluster will be identifiable by. " "Must be lowercase and may not contain the '_' " "character.") return parser def getZoneFromEnv(provisioner): """ Find the zone specified in an environment variable. The user can specify zones in environment variables in leiu of writing them at the commandline every time. Given a provisioner, this method will look for the stored value and return it. :param str provisioner: One of the supported provisioners ('azure', 'aws', 'gce') :rtype: str :return: None or the value stored in a 'TOIL_X_ZONE' environment variable. """ return os.environ.get('TOIL_' + provisioner.upper() + '_ZONE')
from __future__ import absolute_import from toil import version import logging - + import os logger = logging.getLogger(__name__) def addBasicProvisionerOptions(parser): parser.add_argument("--version", action='version', version=version) parser.add_argument('-p', "--provisioner", dest='provisioner', choices=['aws', 'azure', 'gce'], required=False, default="aws", help="The provisioner for cluster auto-scaling. Only aws is currently " "supported") - try: - from toil.provisioners.aws import getCurrentAWSZone - currentZone = getCurrentAWSZone() - except ImportError: - currentZone = None - zoneString = currentZone if currentZone else 'No zone could be determined' - parser.add_argument('-z', '--zone', dest='zone', required=False, default=currentZone, ? ^^^^^^^^ + parser.add_argument('-z', '--zone', dest='zone', required=False, default=None, ? ^ - help="The AWS availability zone of the master. This parameter can also be " ? ---- + help="The availability zone of the master. This parameter can also be set via the 'TOIL_X_ZONE' " ? ++++++++++++++++++++++++++ - "set via the TOIL_AWS_ZONE environment variable, or by the ec2_region_name " + "environment variable, where X is AWS, GCE, or AZURE, or by the ec2_region_name parameter " - "parameter in your .boto file, or derived from the instance metadata if " ? ---------- + "in your .boto file, or derived from the instance metadata if using this utility on an " ? +++++++++++++++++++++++++ - "using this utility on an existing EC2 instance. " ? ------------------------- - + "existing EC2 instance.") ? + - "Currently: %s" % zoneString) parser.add_argument("clusterName", help="The name that the cluster will be identifiable by. " "Must be lowercase and may not contain the '_' " "character.") return parser + + + def getZoneFromEnv(provisioner): + """ + Find the zone specified in an environment variable. + + The user can specify zones in environment variables in leiu of writing them at the commandline every time. + Given a provisioner, this method will look for the stored value and return it. + :param str provisioner: One of the supported provisioners ('azure', 'aws', 'gce') + :rtype: str + :return: None or the value stored in a 'TOIL_X_ZONE' environment variable. + """ + + return os.environ.get('TOIL_' + provisioner.upper() + '_ZONE')
33
1.137931
20
13
b681326861199ebbffa3d1c71ba059ca38ca95bf
client/ruby/flare/app/views/document/_document_delicious.rhtml
client/ruby/flare/app/views/document/_document_delicious.rhtml
<tr valign="top"> <td> <%=image_tag "http://images.amazon.com/images/P/#{doc['asin_text']}.01.MZZZZZZZ" %> </td> <td> <table class="entry"> <tr> <td class="title" colspan="2"><%=link_to doc['title_text'], "http://www.amazon.com/exec/obidos/ASIN/#{doc['asin_text']}"%></td> </tr> <% doc.each do |k,v|; highlighting = response.highlighted(doc['id'], k) %> <tr><td class="field"><%=k%>:</td><td><%= highlighting ? "...#{highlighting}..." : (v.respond_to?('join') ? v.join(',') : v.to_s)%></td></tr> <% end %> </table> </td> </tr>
<tr valign="top"> <td> <%=image_tag "http://images.amazon.com/images/P/#{doc['asin_display']}.01.MZZZZZZZ" %> </td> <td> <table class="entry"> <tr> <td class="title" colspan="2"><%=link_to doc['title_text'], "http://www.amazon.com/exec/obidos/ASIN/#{doc['asin_display']}"%></td> </tr> <% doc.each do |k,v|; highlighting = response.highlighted(doc['id'], k) %> <tr><td class="field"><%=k%>:</td><td><%= highlighting ? "...#{highlighting}..." : (v.respond_to?('join') ? v.join(',') : v.to_s)%></td></tr> <% end %> </table> </td> </tr>
Adjust delicious library view for change in asin field name
Adjust delicious library view for change in asin field name git-svn-id: 3b1ff1236863b4d63a22e4dae568675c2e247730@540681 13f79535-47bb-0310-9956-ffa450edef68
RHTML
apache-2.0
apache/solr,apache/solr,apache/solr,apache/solr,apache/solr
rhtml
## Code Before: <tr valign="top"> <td> <%=image_tag "http://images.amazon.com/images/P/#{doc['asin_text']}.01.MZZZZZZZ" %> </td> <td> <table class="entry"> <tr> <td class="title" colspan="2"><%=link_to doc['title_text'], "http://www.amazon.com/exec/obidos/ASIN/#{doc['asin_text']}"%></td> </tr> <% doc.each do |k,v|; highlighting = response.highlighted(doc['id'], k) %> <tr><td class="field"><%=k%>:</td><td><%= highlighting ? "...#{highlighting}..." : (v.respond_to?('join') ? v.join(',') : v.to_s)%></td></tr> <% end %> </table> </td> </tr> ## Instruction: Adjust delicious library view for change in asin field name git-svn-id: 3b1ff1236863b4d63a22e4dae568675c2e247730@540681 13f79535-47bb-0310-9956-ffa450edef68 ## Code After: <tr valign="top"> <td> <%=image_tag "http://images.amazon.com/images/P/#{doc['asin_display']}.01.MZZZZZZZ" %> </td> <td> <table class="entry"> <tr> <td class="title" colspan="2"><%=link_to doc['title_text'], "http://www.amazon.com/exec/obidos/ASIN/#{doc['asin_display']}"%></td> </tr> <% doc.each do |k,v|; highlighting = response.highlighted(doc['id'], k) %> <tr><td class="field"><%=k%>:</td><td><%= highlighting ? "...#{highlighting}..." : (v.respond_to?('join') ? v.join(',') : v.to_s)%></td></tr> <% end %> </table> </td> </tr>
<tr valign="top"> <td> - <%=image_tag "http://images.amazon.com/images/P/#{doc['asin_text']}.01.MZZZZZZZ" %> ? ^^^^ + <%=image_tag "http://images.amazon.com/images/P/#{doc['asin_display']}.01.MZZZZZZZ" %> ? ^^^^^^^ </td> <td> <table class="entry"> <tr> - <td class="title" colspan="2"><%=link_to doc['title_text'], "http://www.amazon.com/exec/obidos/ASIN/#{doc['asin_text']}"%></td> ? ^^^^ + <td class="title" colspan="2"><%=link_to doc['title_text'], "http://www.amazon.com/exec/obidos/ASIN/#{doc['asin_display']}"%></td> ? ^^^^^^^ </tr> <% doc.each do |k,v|; highlighting = response.highlighted(doc['id'], k) %> <tr><td class="field"><%=k%>:</td><td><%= highlighting ? "...#{highlighting}..." : (v.respond_to?('join') ? v.join(',') : v.to_s)%></td></tr> <% end %> </table> </td> </tr>
4
0.235294
2
2
a692c92bfa2db2b4ef1672ddaf5940dfe703fcc3
.travis.yml
.travis.yml
language: python python: - "2.7" - "3.3" - "3.4" - "3.5" - "3.6" - "nightly" script: - if [[ "$TRAVIS_PYTHON_VERSION" == "2.6" ]]; then pip install unittest2 ; fi # Flake8 requires setuptools > 30, but some preinstalled are to old for that - pip install --upgrade setuptools - python setup.py test - if [[ "$TRAVIS_PYTHON_VERSION" != "nightly" ]]; then coverage run setup.py test ; fi notifications: email: false before_install: - pip install codecov - mkdir bad_tests - wget https://hg.python.org/cpython/archive/3.6.tar.bz2/Lib/test/ -O bad_tests/test.tar.bz2 - tar -xjf bad_tests/test.tar.bz2 -C bad_tests/ - mv bad_tests/cpython-*/Lib/test/badsyntax_future* . -v - rm -r bad_tests/ after_success: - codecov
language: python python: - "2.7" - "3.3" - "3.4" - "3.5" - "3.6" - "nightly" script: # Flake8 requires setuptools > 30, but some preinstalled are to old for that - pip install --upgrade setuptools - python setup.py test - if [[ "$TRAVIS_PYTHON_VERSION" != "nightly" ]]; then coverage run setup.py test ; fi notifications: email: false before_install: - pip install codecov - mkdir bad_tests - wget https://hg.python.org/cpython/archive/3.6.tar.bz2/Lib/test/ -O bad_tests/test.tar.bz2 - tar -xjf bad_tests/test.tar.bz2 -C bad_tests/ - mv bad_tests/cpython-*/Lib/test/badsyntax_future* . -v - rm -r bad_tests/ after_success: - codecov
Remove special case for Python 2.6
Remove special case for Python 2.6 As Flake8 does not support Python 2.6, and there hasn't been a test on Python 2.6 with this plugin, the special case can be removed.
YAML
mit
xZise/flake8-future-import
yaml
## Code Before: language: python python: - "2.7" - "3.3" - "3.4" - "3.5" - "3.6" - "nightly" script: - if [[ "$TRAVIS_PYTHON_VERSION" == "2.6" ]]; then pip install unittest2 ; fi # Flake8 requires setuptools > 30, but some preinstalled are to old for that - pip install --upgrade setuptools - python setup.py test - if [[ "$TRAVIS_PYTHON_VERSION" != "nightly" ]]; then coverage run setup.py test ; fi notifications: email: false before_install: - pip install codecov - mkdir bad_tests - wget https://hg.python.org/cpython/archive/3.6.tar.bz2/Lib/test/ -O bad_tests/test.tar.bz2 - tar -xjf bad_tests/test.tar.bz2 -C bad_tests/ - mv bad_tests/cpython-*/Lib/test/badsyntax_future* . -v - rm -r bad_tests/ after_success: - codecov ## Instruction: Remove special case for Python 2.6 As Flake8 does not support Python 2.6, and there hasn't been a test on Python 2.6 with this plugin, the special case can be removed. ## Code After: language: python python: - "2.7" - "3.3" - "3.4" - "3.5" - "3.6" - "nightly" script: # Flake8 requires setuptools > 30, but some preinstalled are to old for that - pip install --upgrade setuptools - python setup.py test - if [[ "$TRAVIS_PYTHON_VERSION" != "nightly" ]]; then coverage run setup.py test ; fi notifications: email: false before_install: - pip install codecov - mkdir bad_tests - wget https://hg.python.org/cpython/archive/3.6.tar.bz2/Lib/test/ -O bad_tests/test.tar.bz2 - tar -xjf bad_tests/test.tar.bz2 -C bad_tests/ - mv bad_tests/cpython-*/Lib/test/badsyntax_future* . -v - rm -r bad_tests/ after_success: - codecov
language: python python: - "2.7" - "3.3" - "3.4" - "3.5" - "3.6" - "nightly" script: - - if [[ "$TRAVIS_PYTHON_VERSION" == "2.6" ]]; then pip install unittest2 ; fi # Flake8 requires setuptools > 30, but some preinstalled are to old for that - pip install --upgrade setuptools - python setup.py test - if [[ "$TRAVIS_PYTHON_VERSION" != "nightly" ]]; then coverage run setup.py test ; fi notifications: email: false before_install: - pip install codecov - mkdir bad_tests - wget https://hg.python.org/cpython/archive/3.6.tar.bz2/Lib/test/ -O bad_tests/test.tar.bz2 - tar -xjf bad_tests/test.tar.bz2 -C bad_tests/ - mv bad_tests/cpython-*/Lib/test/badsyntax_future* . -v - rm -r bad_tests/ after_success: - codecov
1
0.037037
0
1
f40dd24af6788e7de7d06254850b83edb179b423
bootcamp/lesson4.py
bootcamp/lesson4.py
import datetime import math from core import test_helper # Question 1 # ---------- # Using the datetime module return a datetime object with the year of 2015, the month of June, and the day of 1 def playing_with_dt(): return datetime.datetime(2015, 06, 01) # Question 2 # ---------- # Using the math module return pi def playing_with_math(): return math.pi def main(): print "\nRunning playing_with_dt_one function..." test_helper(playing_with_dt(), datetime.datetime(2015, 06, 01)) print "\nRunning playing_with_dt_one function..." test_helper(playing_with_math(), math.pi) if __name__ == '__main__': main()
import datetime import math from core import test_helper # Question 1 # ---------- # Using the datetime module return a datetime object with the year of 2015, the month of June, and the day of 1 def playing_with_dt(): # Write code here pass # Question 2 # ---------- # Using the math module return pi def playing_with_math(): # Write code here pass def main(): print "\nRunning playing_with_dt_one function..." test_helper(playing_with_dt(), datetime.datetime(2015, 06, 01)) print "\nRunning playing_with_dt_one function..." test_helper(playing_with_math(), math.pi) if __name__ == '__main__': main()
Revert "Added solutions for lesson 4"
Revert "Added solutions for lesson 4" This reverts commit 58d049c78b16ec5b61f9681b605dc4e937ea7e3e.
Python
mit
infoscout/python-bootcamp-pv
python
## Code Before: import datetime import math from core import test_helper # Question 1 # ---------- # Using the datetime module return a datetime object with the year of 2015, the month of June, and the day of 1 def playing_with_dt(): return datetime.datetime(2015, 06, 01) # Question 2 # ---------- # Using the math module return pi def playing_with_math(): return math.pi def main(): print "\nRunning playing_with_dt_one function..." test_helper(playing_with_dt(), datetime.datetime(2015, 06, 01)) print "\nRunning playing_with_dt_one function..." test_helper(playing_with_math(), math.pi) if __name__ == '__main__': main() ## Instruction: Revert "Added solutions for lesson 4" This reverts commit 58d049c78b16ec5b61f9681b605dc4e937ea7e3e. ## Code After: import datetime import math from core import test_helper # Question 1 # ---------- # Using the datetime module return a datetime object with the year of 2015, the month of June, and the day of 1 def playing_with_dt(): # Write code here pass # Question 2 # ---------- # Using the math module return pi def playing_with_math(): # Write code here pass def main(): print "\nRunning playing_with_dt_one function..." test_helper(playing_with_dt(), datetime.datetime(2015, 06, 01)) print "\nRunning playing_with_dt_one function..." test_helper(playing_with_math(), math.pi) if __name__ == '__main__': main()
import datetime import math from core import test_helper # Question 1 # ---------- # Using the datetime module return a datetime object with the year of 2015, the month of June, and the day of 1 def playing_with_dt(): - return datetime.datetime(2015, 06, 01) + # Write code here + pass # Question 2 # ---------- # Using the math module return pi def playing_with_math(): - return math.pi + # Write code here + pass def main(): print "\nRunning playing_with_dt_one function..." test_helper(playing_with_dt(), datetime.datetime(2015, 06, 01)) print "\nRunning playing_with_dt_one function..." test_helper(playing_with_math(), math.pi) if __name__ == '__main__': main()
6
0.206897
4
2
1a39d90ecc82a09e00c63a2b773e430180227f44
config/schedule.rb
config/schedule.rb
every 1.minute do system 'thor sw:start' end
every 1.minute do command "cd #{`pwd`.strip} && #{`which thor`.strip} sw:start" end
Fix bug: crontab not updated
Fix bug: crontab not updated
Ruby
mit
datdojp/server-watcher,datdojp/server-watcher
ruby
## Code Before: every 1.minute do system 'thor sw:start' end ## Instruction: Fix bug: crontab not updated ## Code After: every 1.minute do command "cd #{`pwd`.strip} && #{`which thor`.strip} sw:start" end
every 1.minute do - system 'thor sw:start' + command "cd #{`pwd`.strip} && #{`which thor`.strip} sw:start" end
2
0.666667
1
1
774c14ca99a330be990dc4489dc15fdb174a4a1c
lib/elocal_api_support/actions/index.rb
lib/elocal_api_support/actions/index.rb
module ElocalApiSupport module Actions module Index def index render json: { current_page: current_page, per_page: per_page, total_entries: filtered_objects.total_count, total_pages: (params[:page].present? || params[:per_page].present?) ? filtered_objects.total_pages : 1, records: filtered_objects_for_json } end end end end
module ElocalApiSupport module Actions module Index def index render json: { current_page: current_page, per_page: filtered_objects.respond_to?(:per_page) ? filtered_objects.per_page : per_page, total_entries: filtered_objects.total_count, total_pages: filtered_objects.respond_to?(:total_pages) ? filtered_objects.total_pages : 1, records: filtered_objects_for_json } end end end end
Update per_page/total_pages to pull from filtered object
Update per_page/total_pages to pull from filtered object
Ruby
mit
eLocal/elocal_api_support
ruby
## Code Before: module ElocalApiSupport module Actions module Index def index render json: { current_page: current_page, per_page: per_page, total_entries: filtered_objects.total_count, total_pages: (params[:page].present? || params[:per_page].present?) ? filtered_objects.total_pages : 1, records: filtered_objects_for_json } end end end end ## Instruction: Update per_page/total_pages to pull from filtered object ## Code After: module ElocalApiSupport module Actions module Index def index render json: { current_page: current_page, per_page: filtered_objects.respond_to?(:per_page) ? filtered_objects.per_page : per_page, total_entries: filtered_objects.total_count, total_pages: filtered_objects.respond_to?(:total_pages) ? filtered_objects.total_pages : 1, records: filtered_objects_for_json } end end end end
module ElocalApiSupport module Actions module Index def index render json: { current_page: current_page, - per_page: per_page, + per_page: filtered_objects.respond_to?(:per_page) ? filtered_objects.per_page : per_page, total_entries: filtered_objects.total_count, - total_pages: (params[:page].present? || params[:per_page].present?) ? filtered_objects.total_pages : 1, + total_pages: filtered_objects.respond_to?(:total_pages) ? filtered_objects.total_pages : 1, records: filtered_objects_for_json } end end end end
4
0.25
2
2
91ef80fd2acf8f424e61ca66e05807eaa92d5e08
tools/sql-migrate-20111130.sql
tools/sql-migrate-20111130.sql
begin; alter table journals drop constraint "$1"; alter table journals add foreign key (submitted_by) references users (name) on update cascade; end; begin; alter table roles drop constraint "$1"; alter table roles add foreign key (user_name) references users (name) on update cascade; end; begin; alter table cookies drop constraint cookies_name_fkey; alter table cookies add foreign key (name) references users (name) on update cascade; end;
begin; alter table journals drop constraint "$1"; alter table journals add foreign key (submitted_by) references users (name) on update cascade; end; begin; alter table roles drop constraint "$1"; alter table roles add foreign key (user_name) references users (name) on update cascade; end; begin; alter table cookies drop constraint cookies_name_fkey; alter table cookies add foreign key (name) references users (name) on update cascade on delete cascade; end; begin; alter table openids drop constraint openids_name_fkey; alter table openids add foreign key (name) references users (name) on update cascade on delete cascade; end; begin; alter table sshkeys drop constraint sshkeys_name_fkey; alter table sshkeys add foreign key (name) references users (name) on update cascade on delete cascade; end; begin; alter table csrf_tokens drop constraint csrf_tokens_name_fkey; alter table csrf_tokens add foreign key (name) references users (name) on update cascade on delete cascade; end;
Change more user name foreign keys to be updatable.
Change more user name foreign keys to be updatable.
SQL
bsd-3-clause
pydotorg/pypi,pydotorg/pypi,pydotorg/pypi,pydotorg/pypi
sql
## Code Before: begin; alter table journals drop constraint "$1"; alter table journals add foreign key (submitted_by) references users (name) on update cascade; end; begin; alter table roles drop constraint "$1"; alter table roles add foreign key (user_name) references users (name) on update cascade; end; begin; alter table cookies drop constraint cookies_name_fkey; alter table cookies add foreign key (name) references users (name) on update cascade; end; ## Instruction: Change more user name foreign keys to be updatable. ## Code After: begin; alter table journals drop constraint "$1"; alter table journals add foreign key (submitted_by) references users (name) on update cascade; end; begin; alter table roles drop constraint "$1"; alter table roles add foreign key (user_name) references users (name) on update cascade; end; begin; alter table cookies drop constraint cookies_name_fkey; alter table cookies add foreign key (name) references users (name) on update cascade on delete cascade; end; begin; alter table openids drop constraint openids_name_fkey; alter table openids add foreign key (name) references users (name) on update cascade on delete cascade; end; begin; alter table sshkeys drop constraint sshkeys_name_fkey; alter table sshkeys add foreign key (name) references users (name) on update cascade on delete cascade; end; begin; alter table csrf_tokens drop constraint csrf_tokens_name_fkey; alter table csrf_tokens add foreign key (name) references users (name) on update cascade on delete cascade; end;
begin; alter table journals drop constraint "$1"; alter table journals add foreign key (submitted_by) references users (name) on update cascade; end; begin; alter table roles drop constraint "$1"; alter table roles add foreign key (user_name) references users (name) on update cascade; end; begin; alter table cookies drop constraint cookies_name_fkey; - alter table cookies add foreign key (name) references users (name) on update cascade; + alter table cookies add foreign key (name) references users (name) on update cascade on delete cascade; ? ++++++++++++++++++ end; + begin; + alter table openids drop constraint openids_name_fkey; + alter table openids add foreign key (name) references users (name) on update cascade on delete cascade; + end; + begin; + alter table sshkeys drop constraint sshkeys_name_fkey; + alter table sshkeys add foreign key (name) references users (name) on update cascade on delete cascade; + end; + begin; + alter table csrf_tokens drop constraint csrf_tokens_name_fkey; + alter table csrf_tokens add foreign key (name) references users (name) on update cascade on delete cascade; + end;
14
1.166667
13
1
71faadc4400ef7474f9ad113003b046cadf2c7c8
.travis.yml
.travis.yml
language: cpp sudo: required compiler: - gcc - clang services: - docker install: # Pull Docker image - docker pull tatsy/ubuntu-cxx - docker build --tag=spica-env --quiet=true . script: # Run test on Docker container - docker run --env="CC=$CC" --env="CXX=$CXX" spica-env /bin/bash -c "lcov --directory . --zerocounters && CTEST_OUTOUT_ON_FAILURE=TRUE make check && cat coverage.info" > coverage.info after_success: # Send coverage report to Coveralls - if [ "$CXX" = "g++" ]; then lcov --directory . --capture --output-file coverage.info; fi - if [ "$CXX" = "clang++" ]; then lcov --directory . --capture --output-file coverage.info --gcov-tool llvm-cov-${LLVM_VER}; fi - lcov --remove coverage.info '3rdparty/*' 'src/renderer/*' 'src/viewer/*' 'test/*' 'example/*' '/usr/*' 'CMakeFiles/*' --output-file coverage.info - lcov --list coverage.info - coveralls-lcov --repo-token RiYcPJSCbPZoogMd1PE10696EAqG8sl5q coverage.info branches: only: - master - develop notifications: email: recipients: tatsy.mail@gmail.com on_success: change on_failure: always env: global: - LLVM_VER='3.7'
language: cpp sudo: required compiler: - gcc - clang services: - docker install: # Pull Docker image - docker pull tatsy/ubuntu-cxx - docker build --tag=spica-env --quiet=true . script: # Run test on Docker container - docker run --env="CC=$CC" --env="CXX=$CXX" spica-env /bin/bash -c "lcov --directory . --zerocounters && CTEST_OUTOUT_ON_FAILURE=TRUE make check && if [ $CXX = \"g++\" ]; then lcov --directory . --capture --output-file coverage.info; fi && if [ $CXX = \"clang++\" ]; then lcov --directory . --capture --output-file coverage.info --gcov-tool llvm-cov; fi && cat coverage.info" > coverage.info" after_success: # Send coverage report to Coveralls - lcov --remove coverage.info '3rdparty/*' 'src/renderer/*' 'src/viewer/*' 'test/*' 'example/*' '/usr/*' 'CMakeFiles/*' --output-file coverage.info - lcov --list coverage.info - coveralls-lcov --repo-token RiYcPJSCbPZoogMd1PE10696EAqG8sl5q coverage.info branches: only: - master - develop notifications: email: recipients: tatsy.mail@gmail.com on_success: change on_failure: always env: global: - LLVM_VER='3.7'
Fix the build error on Docker.
Fix the build error on Docker.
YAML
mit
tatsy/spica,tatsy/spica,tatsy/spica
yaml
## Code Before: language: cpp sudo: required compiler: - gcc - clang services: - docker install: # Pull Docker image - docker pull tatsy/ubuntu-cxx - docker build --tag=spica-env --quiet=true . script: # Run test on Docker container - docker run --env="CC=$CC" --env="CXX=$CXX" spica-env /bin/bash -c "lcov --directory . --zerocounters && CTEST_OUTOUT_ON_FAILURE=TRUE make check && cat coverage.info" > coverage.info after_success: # Send coverage report to Coveralls - if [ "$CXX" = "g++" ]; then lcov --directory . --capture --output-file coverage.info; fi - if [ "$CXX" = "clang++" ]; then lcov --directory . --capture --output-file coverage.info --gcov-tool llvm-cov-${LLVM_VER}; fi - lcov --remove coverage.info '3rdparty/*' 'src/renderer/*' 'src/viewer/*' 'test/*' 'example/*' '/usr/*' 'CMakeFiles/*' --output-file coverage.info - lcov --list coverage.info - coveralls-lcov --repo-token RiYcPJSCbPZoogMd1PE10696EAqG8sl5q coverage.info branches: only: - master - develop notifications: email: recipients: tatsy.mail@gmail.com on_success: change on_failure: always env: global: - LLVM_VER='3.7' ## Instruction: Fix the build error on Docker. ## Code After: language: cpp sudo: required compiler: - gcc - clang services: - docker install: # Pull Docker image - docker pull tatsy/ubuntu-cxx - docker build --tag=spica-env --quiet=true . script: # Run test on Docker container - docker run --env="CC=$CC" --env="CXX=$CXX" spica-env /bin/bash -c "lcov --directory . --zerocounters && CTEST_OUTOUT_ON_FAILURE=TRUE make check && if [ $CXX = \"g++\" ]; then lcov --directory . --capture --output-file coverage.info; fi && if [ $CXX = \"clang++\" ]; then lcov --directory . --capture --output-file coverage.info --gcov-tool llvm-cov; fi && cat coverage.info" > coverage.info" after_success: # Send coverage report to Coveralls - lcov --remove coverage.info '3rdparty/*' 'src/renderer/*' 'src/viewer/*' 'test/*' 'example/*' '/usr/*' 'CMakeFiles/*' --output-file coverage.info - lcov --list coverage.info - coveralls-lcov --repo-token RiYcPJSCbPZoogMd1PE10696EAqG8sl5q coverage.info branches: only: - master - develop notifications: email: recipients: tatsy.mail@gmail.com on_success: change on_failure: always env: global: - LLVM_VER='3.7'
language: cpp sudo: required compiler: - gcc - clang services: - docker install: # Pull Docker image - docker pull tatsy/ubuntu-cxx - docker build --tag=spica-env --quiet=true . script: # Run test on Docker container - - docker run --env="CC=$CC" --env="CXX=$CXX" spica-env /bin/bash -c "lcov --directory . --zerocounters && CTEST_OUTOUT_ON_FAILURE=TRUE make check && cat coverage.info" > coverage.info + - docker run --env="CC=$CC" --env="CXX=$CXX" spica-env /bin/bash -c "lcov --directory . --zerocounters && CTEST_OUTOUT_ON_FAILURE=TRUE make check && if [ $CXX = \"g++\" ]; then lcov --directory . --capture --output-file coverage.info; fi && if [ $CXX = \"clang++\" ]; then lcov --directory . --capture --output-file coverage.info --gcov-tool llvm-cov; fi && cat coverage.info" > coverage.info" after_success: # Send coverage report to Coveralls - - if [ "$CXX" = "g++" ]; then lcov --directory . --capture --output-file coverage.info; fi - - if [ "$CXX" = "clang++" ]; then lcov --directory . --capture --output-file coverage.info --gcov-tool llvm-cov-${LLVM_VER}; fi - lcov --remove coverage.info '3rdparty/*' 'src/renderer/*' 'src/viewer/*' 'test/*' 'example/*' '/usr/*' 'CMakeFiles/*' --output-file coverage.info - lcov --list coverage.info - coveralls-lcov --repo-token RiYcPJSCbPZoogMd1PE10696EAqG8sl5q coverage.info branches: only: - master - develop notifications: email: recipients: tatsy.mail@gmail.com on_success: change on_failure: always env: global: - LLVM_VER='3.7'
4
0.1
1
3
2a45343769a7c2cfcc5408736e47670b0b07c082
ckanext/showcase/public/ckanext_showcase.css
ckanext/showcase/public/ckanext_showcase.css
/* Custom style rules for ckanext-showcase */ .context-info .module-content .smallest { font-size: 13px; } .context-info .module-content .info .btn { margin-top: 18px; } .actions { top: 36px; } .ckanext-showcase-image { margin-bottom: 25px; } .ckanext-showcase-notes { margin-bottom: 25px; } .ckanext-showcase-launch { margin-bottom: 25px; }
/* Custom style rules for ckanext-showcase */ .context-info .module-content .smallest { font-size: 13px; } .context-info .module-content .info .btn { margin-top: 18px; } .actions { top: 36px; right: 25px; } .ckanext-showcase-image { margin-bottom: 25px; } .ckanext-showcase-notes { margin-bottom: 25px; } .ckanext-showcase-launch { margin-bottom: 25px; }
Move Manage button to right to align with image
Move Manage button to right to align with image
CSS
agpl-3.0
ckan/ckanext-showcase,deniszgonjanin/ckanext-showcase,ckan/ckanext-showcase,deniszgonjanin/ckanext-showcase,deniszgonjanin/ckanext-showcase,ckan/ckanext-showcase
css
## Code Before: /* Custom style rules for ckanext-showcase */ .context-info .module-content .smallest { font-size: 13px; } .context-info .module-content .info .btn { margin-top: 18px; } .actions { top: 36px; } .ckanext-showcase-image { margin-bottom: 25px; } .ckanext-showcase-notes { margin-bottom: 25px; } .ckanext-showcase-launch { margin-bottom: 25px; } ## Instruction: Move Manage button to right to align with image ## Code After: /* Custom style rules for ckanext-showcase */ .context-info .module-content .smallest { font-size: 13px; } .context-info .module-content .info .btn { margin-top: 18px; } .actions { top: 36px; right: 25px; } .ckanext-showcase-image { margin-bottom: 25px; } .ckanext-showcase-notes { margin-bottom: 25px; } .ckanext-showcase-launch { margin-bottom: 25px; }
/* Custom style rules for ckanext-showcase */ .context-info .module-content .smallest { font-size: 13px; } .context-info .module-content .info .btn { margin-top: 18px; } .actions { top: 36px; + right: 25px; } .ckanext-showcase-image { margin-bottom: 25px; } .ckanext-showcase-notes { margin-bottom: 25px; } .ckanext-showcase-launch { margin-bottom: 25px; }
1
0.04
1
0
7fdb9ed571c85987b75a3c496386af135a39266b
app/services/pdf_writer.rb
app/services/pdf_writer.rb
class PDFWriter attr_reader :user, :content_type, :pdf HEADERS = ["TIME", "DATE", "TEMP INSIDE", "TEMP OUTSIDE", "TEMP OF HOT WATER", "NOTES"] HEADER_OPTIONS = {width: 90, background_color: "AFAFAF", align: :center, font_style: :bold, border_width: 0.5} FONT = "Times-Roman" FONT_OPTIONS = {size: 7, align: :center} TABLE_OPTIONS = {width: 90, align: :center, border_width: 0.5} def initialize(user) @user = user @content_type = "application/pdf" @pdf = Prawn::Document.new end def self.new_from_user_id(user_id) new(User.find(user_id)) end def generate_pdf populate_pdf pdf.render end def filename "#{user.last_name}.pdf" end def populate_pdf image_url = Rails.root.join("app","assets","images","pdf_header.png") pdf.image image_url, width: 550 pdf.move_down 5 pdf.font FONT, FONT_OPTIONS pdf.table [HEADERS], cell_style: HEADER_OPTIONS pdf.table self.user.table_array, cell_style: TABLE_OPTIONS end end
class PDFWriter attr_reader :user, :content_type, :pdf HEADERS = ["TIME", "DATE", "TEMP INSIDE", "TEMP OUTSIDE", "TEMP OF HOT WATER", "NOTES"] HEADER_OPTIONS = {width: 90, background_color: "AFAFAF", align: :center, font_style: :bold, border_width: 0.5} FONT = "Times-Roman" FONT_OPTIONS = {size: 7, align: :center} TABLE_OPTIONS = {width: 90, align: :center, border_width: 0.5} def initialize(user) @user = user @content_type = "application/pdf" @pdf = Prawn::Document.new end def self.new_from_user_id(user_id) new(User.find(user_id)) end def generate_pdf populate_pdf pdf.render end def filename "#{user.last_name}.pdf" end def populate_pdf image_url = Rails.root.join("app","assets","images","pdf_header.png") pdf.text "Tenant: #{self.user.name}" pdf.image image_url, width: 550 pdf.move_down 5 pdf.font FONT, FONT_OPTIONS pdf.table [HEADERS], cell_style: HEADER_OPTIONS pdf.table self.user.table_array, cell_style: TABLE_OPTIONS end end
Add tenant name to report
Add tenant name to report
Ruby
mit
ramillim/heatseeknyc,heatseeknyc/heatseeknyc,heatseeknyc/heatseeknyc,ramillim/heatseeknyc,heatseeknyc/heatseeknyc,ramillim/heatseeknyc
ruby
## Code Before: class PDFWriter attr_reader :user, :content_type, :pdf HEADERS = ["TIME", "DATE", "TEMP INSIDE", "TEMP OUTSIDE", "TEMP OF HOT WATER", "NOTES"] HEADER_OPTIONS = {width: 90, background_color: "AFAFAF", align: :center, font_style: :bold, border_width: 0.5} FONT = "Times-Roman" FONT_OPTIONS = {size: 7, align: :center} TABLE_OPTIONS = {width: 90, align: :center, border_width: 0.5} def initialize(user) @user = user @content_type = "application/pdf" @pdf = Prawn::Document.new end def self.new_from_user_id(user_id) new(User.find(user_id)) end def generate_pdf populate_pdf pdf.render end def filename "#{user.last_name}.pdf" end def populate_pdf image_url = Rails.root.join("app","assets","images","pdf_header.png") pdf.image image_url, width: 550 pdf.move_down 5 pdf.font FONT, FONT_OPTIONS pdf.table [HEADERS], cell_style: HEADER_OPTIONS pdf.table self.user.table_array, cell_style: TABLE_OPTIONS end end ## Instruction: Add tenant name to report ## Code After: class PDFWriter attr_reader :user, :content_type, :pdf HEADERS = ["TIME", "DATE", "TEMP INSIDE", "TEMP OUTSIDE", "TEMP OF HOT WATER", "NOTES"] HEADER_OPTIONS = {width: 90, background_color: "AFAFAF", align: :center, font_style: :bold, border_width: 0.5} FONT = "Times-Roman" FONT_OPTIONS = {size: 7, align: :center} TABLE_OPTIONS = {width: 90, align: :center, border_width: 0.5} def initialize(user) @user = user @content_type = "application/pdf" @pdf = Prawn::Document.new end def self.new_from_user_id(user_id) new(User.find(user_id)) end def generate_pdf populate_pdf pdf.render end def filename "#{user.last_name}.pdf" end def populate_pdf image_url = Rails.root.join("app","assets","images","pdf_header.png") pdf.text "Tenant: #{self.user.name}" pdf.image image_url, width: 550 pdf.move_down 5 pdf.font FONT, FONT_OPTIONS pdf.table [HEADERS], cell_style: HEADER_OPTIONS pdf.table self.user.table_array, cell_style: TABLE_OPTIONS end end
class PDFWriter attr_reader :user, :content_type, :pdf HEADERS = ["TIME", "DATE", "TEMP INSIDE", "TEMP OUTSIDE", "TEMP OF HOT WATER", "NOTES"] HEADER_OPTIONS = {width: 90, background_color: "AFAFAF", align: :center, font_style: :bold, border_width: 0.5} FONT = "Times-Roman" FONT_OPTIONS = {size: 7, align: :center} TABLE_OPTIONS = {width: 90, align: :center, border_width: 0.5} - + def initialize(user) @user = user @content_type = "application/pdf" @pdf = Prawn::Document.new end def self.new_from_user_id(user_id) new(User.find(user_id)) end def generate_pdf populate_pdf pdf.render end def filename "#{user.last_name}.pdf" end def populate_pdf image_url = Rails.root.join("app","assets","images","pdf_header.png") + pdf.text "Tenant: #{self.user.name}" pdf.image image_url, width: 550 pdf.move_down 5 pdf.font FONT, FONT_OPTIONS pdf.table [HEADERS], cell_style: HEADER_OPTIONS pdf.table self.user.table_array, cell_style: TABLE_OPTIONS end end
3
0.078947
2
1
6f78cd0ee4708ca4d4f3751748d658e7be800152
pages/sv/om.md
pages/sv/om.md
--- title: Om nav: about lang: sv permalink: sv/om/ --- ![avatar]({{ site.github.owner_gravatar_url }}){: .about-avatar} Det här är en personlig blogg om saker och ting som jag jobbar på eller på annat sätt finner intressant. Huvudmålet med bloggen är att förbättra min Svenska, Engelska, och i framtiden även Japanska. Bloggen i sig är genererad av-, och hostad av [GitHub] [github]. Detta görs med hjälp av statiska sidgeneratorn [Jekyll] [jekyll]. Du kan hitta källkoden på [GitHubrepositorit] [self_repo], och om inte anat angetts, så är allt innehåll licenserat under [MIT Licensen] [self_license] [github]: https://github.com [jekyll]: http://jekyllrb.com [self_repo]: https://github.com/Hexagenic/Hexagenic.github.io [self_license]: https://github.com/Hexagenic/hexagenic.github.io/blob/master/LICENSE
--- title: Om nav: about lang: sv permalink: sv/om/ --- <img src="{{ site.github.owner_gravatar_url }}" alt="avatar" class="about-avatar"> Det här är en personlig blogg om saker och ting som jag jobbar på eller på annat sätt finner intressant. Huvudmålet med bloggen är att förbättra min Svenska, Engelska, och i framtiden även Japanska. Bloggen i sig är genererad av-, och hostad av [GitHub] [github]. Detta görs med hjälp av statiska sidgeneratorn [Jekyll] [jekyll]. Du kan hitta källkoden på [GitHubrepositorit] [self_repo], och om inte anat angetts, så är allt innehåll licenserat under [MIT Licensen] [self_license] [github]: https://github.com [jekyll]: http://jekyllrb.com [self_repo]: https://github.com/Hexagenic/Hexagenic.github.io [self_license]: https://github.com/Hexagenic/hexagenic.github.io/blob/master/LICENSE
Fix avatar in swedish about page
Fix avatar in swedish about page
Markdown
mit
Hexagenic/hexagenic.github.io,Hexagenic/hexagenic.github.io
markdown
## Code Before: --- title: Om nav: about lang: sv permalink: sv/om/ --- ![avatar]({{ site.github.owner_gravatar_url }}){: .about-avatar} Det här är en personlig blogg om saker och ting som jag jobbar på eller på annat sätt finner intressant. Huvudmålet med bloggen är att förbättra min Svenska, Engelska, och i framtiden även Japanska. Bloggen i sig är genererad av-, och hostad av [GitHub] [github]. Detta görs med hjälp av statiska sidgeneratorn [Jekyll] [jekyll]. Du kan hitta källkoden på [GitHubrepositorit] [self_repo], och om inte anat angetts, så är allt innehåll licenserat under [MIT Licensen] [self_license] [github]: https://github.com [jekyll]: http://jekyllrb.com [self_repo]: https://github.com/Hexagenic/Hexagenic.github.io [self_license]: https://github.com/Hexagenic/hexagenic.github.io/blob/master/LICENSE ## Instruction: Fix avatar in swedish about page ## Code After: --- title: Om nav: about lang: sv permalink: sv/om/ --- <img src="{{ site.github.owner_gravatar_url }}" alt="avatar" class="about-avatar"> Det här är en personlig blogg om saker och ting som jag jobbar på eller på annat sätt finner intressant. Huvudmålet med bloggen är att förbättra min Svenska, Engelska, och i framtiden även Japanska. Bloggen i sig är genererad av-, och hostad av [GitHub] [github]. Detta görs med hjälp av statiska sidgeneratorn [Jekyll] [jekyll]. Du kan hitta källkoden på [GitHubrepositorit] [self_repo], och om inte anat angetts, så är allt innehåll licenserat under [MIT Licensen] [self_license] [github]: https://github.com [jekyll]: http://jekyllrb.com [self_repo]: https://github.com/Hexagenic/Hexagenic.github.io [self_license]: https://github.com/Hexagenic/hexagenic.github.io/blob/master/LICENSE
--- title: Om nav: about lang: sv permalink: sv/om/ --- - ![avatar]({{ site.github.owner_gravatar_url }}){: .about-avatar} + <img src="{{ site.github.owner_gravatar_url }}" alt="avatar" class="about-avatar"> Det här är en personlig blogg om saker och ting som jag jobbar på eller på annat sätt finner intressant. Huvudmålet med bloggen är att förbättra min Svenska, Engelska, och i framtiden även Japanska. Bloggen i sig är genererad av-, och hostad av [GitHub] [github]. Detta görs med hjälp av statiska sidgeneratorn [Jekyll] [jekyll]. Du kan hitta källkoden på [GitHubrepositorit] [self_repo], och om inte anat angetts, så är allt innehåll licenserat under [MIT Licensen] [self_license] [github]: https://github.com [jekyll]: http://jekyllrb.com [self_repo]: https://github.com/Hexagenic/Hexagenic.github.io [self_license]: https://github.com/Hexagenic/hexagenic.github.io/blob/master/LICENSE
2
0.105263
1
1
435fce76241d41eaffaf63bbd948eb306806d8f0
microdash/settings/production.py
microdash/settings/production.py
import os import dj_database_url from microdash.settings.base import * env = os.getenv PROJECT_ROOT = os.path.dirname(os.path.realpath(__file__)) # settings is one directory up now here = lambda *x: os.path.join(PROJECT_ROOT, '..', *x) SECRET_KEY = env('SECRET_KEY') DATABASES = {'default': dj_database_url.config(default='postgres://localhost')}
import os import dj_database_url from microdash.settings.base import * env = os.getenv PROJECT_ROOT = os.path.dirname(os.path.realpath(__file__)) # settings is one directory up now here = lambda *x: os.path.join(PROJECT_ROOT, '..', *x) SECRET_KEY = env('SECRET_KEY') DATABASES = {'default': dj_database_url.config(default='postgres://localhost')} TWITTER_KEY = env('TWITTER_KEY') TWITTER_SECRET = env('TWITTER_KEY')
Read settings from the environment.
Read settings from the environment.
Python
bsd-3-clause
alfredo/microdash,alfredo/microdash
python
## Code Before: import os import dj_database_url from microdash.settings.base import * env = os.getenv PROJECT_ROOT = os.path.dirname(os.path.realpath(__file__)) # settings is one directory up now here = lambda *x: os.path.join(PROJECT_ROOT, '..', *x) SECRET_KEY = env('SECRET_KEY') DATABASES = {'default': dj_database_url.config(default='postgres://localhost')} ## Instruction: Read settings from the environment. ## Code After: import os import dj_database_url from microdash.settings.base import * env = os.getenv PROJECT_ROOT = os.path.dirname(os.path.realpath(__file__)) # settings is one directory up now here = lambda *x: os.path.join(PROJECT_ROOT, '..', *x) SECRET_KEY = env('SECRET_KEY') DATABASES = {'default': dj_database_url.config(default='postgres://localhost')} TWITTER_KEY = env('TWITTER_KEY') TWITTER_SECRET = env('TWITTER_KEY')
import os import dj_database_url from microdash.settings.base import * env = os.getenv PROJECT_ROOT = os.path.dirname(os.path.realpath(__file__)) # settings is one directory up now here = lambda *x: os.path.join(PROJECT_ROOT, '..', *x) SECRET_KEY = env('SECRET_KEY') DATABASES = {'default': dj_database_url.config(default='postgres://localhost')} + + TWITTER_KEY = env('TWITTER_KEY') + TWITTER_SECRET = env('TWITTER_KEY')
3
0.230769
3
0
18e1e0a1c1b4492e623d5b86d7a23fff00d5fa72
pysingcells/__main__.py
pysingcells/__main__.py
import os import sys import configparser from subprocess import call # project import from . import logger from .mapper import hisat2 def main(config_path): """ Main function of pro'gramme read configuration and run enable step """ config = configparser.ConfigParser() logger.setup_logging(**config) config.read(config_path) print(config.sections()) for key in config['paths']: print(config['paths'][key]) def trimming(files_dir, rep_out , paired=1) : file_list = os.listdir(files_dir) for fastq in file_list : call(['cmd', 'options...']) if __name__ == "__main__": main(sys.argv[1])
import os import sys import configparser from subprocess import call # project import from . import logger from .mapper import hisat2 def main(config_path): """ Main function of pro'gramme read configuration and run enable step """ config = configparser.ConfigParser() config.read(config_path) print(config.sections()) logger.setup_logging(**config) for key in config['paths']: print(config['paths'][key]) mapper = hisat2.Hisat2() mapper.read_configuration(**config) if mapper.check_configuration() : mapper.run() def trimming(files_dir, rep_out , paired=1) : file_list = os.listdir(files_dir) for fastq in file_list : call(['cmd', 'options...']) if __name__ == "__main__": main(sys.argv[1])
Add test of hisat2 object
Add test of hisat2 object
Python
mit
Fougere87/pysingcells
python
## Code Before: import os import sys import configparser from subprocess import call # project import from . import logger from .mapper import hisat2 def main(config_path): """ Main function of pro'gramme read configuration and run enable step """ config = configparser.ConfigParser() logger.setup_logging(**config) config.read(config_path) print(config.sections()) for key in config['paths']: print(config['paths'][key]) def trimming(files_dir, rep_out , paired=1) : file_list = os.listdir(files_dir) for fastq in file_list : call(['cmd', 'options...']) if __name__ == "__main__": main(sys.argv[1]) ## Instruction: Add test of hisat2 object ## Code After: import os import sys import configparser from subprocess import call # project import from . import logger from .mapper import hisat2 def main(config_path): """ Main function of pro'gramme read configuration and run enable step """ config = configparser.ConfigParser() config.read(config_path) print(config.sections()) logger.setup_logging(**config) for key in config['paths']: print(config['paths'][key]) mapper = hisat2.Hisat2() mapper.read_configuration(**config) if mapper.check_configuration() : mapper.run() def trimming(files_dir, rep_out , paired=1) : file_list = os.listdir(files_dir) for fastq in file_list : call(['cmd', 'options...']) if __name__ == "__main__": main(sys.argv[1])
import os import sys import configparser from subprocess import call # project import from . import logger from .mapper import hisat2 def main(config_path): """ Main function of pro'gramme read configuration and run enable step """ config = configparser.ConfigParser() - logger.setup_logging(**config) - config.read(config_path) print(config.sections()) + logger.setup_logging(**config) for key in config['paths']: print(config['paths'][key]) + mapper = hisat2.Hisat2() + mapper.read_configuration(**config) + if mapper.check_configuration() : + mapper.run() def trimming(files_dir, rep_out , paired=1) : file_list = os.listdir(files_dir) for fastq in file_list : call(['cmd', 'options...']) if __name__ == "__main__": main(sys.argv[1])
7
0.233333
5
2
1186cf3127bf9eae333a9f87f912b981b10631cd
ansible/roles/mon-if/tasks/main.yml
ansible/roles/mon-if/tasks/main.yml
- file: path=/srv/mon-if state=directory owner=root group=root mode=0755 - copy: src=srv/mon-if/mon-if.sh dest=/srv/mon-if/mon-if.sh owner=root group=root mode=0700 notify: mon-if up - copy: src=etc/systemd/system/mon-if.service dest=/etc/systemd/system/mon-if.service owner=root group=root mode=0644 notify: mon-if up
- file: path: /srv/mon-if state: directory owner: root group: root mode: 0755 - copy: src: srv/mon-if/mon-if.sh dest: /srv/mon-if/mon-if.sh owner: root group: root mode: 0700 notify: mon-if up - copy: src: etc/systemd/system/mon-if.service dest: /etc/systemd/system/mon-if.service owner: root group: root mode: 0644 notify: mon-if up - systemd: name: mon-if enabled: yes
Add enabled parameter and refactor.
Add enabled parameter and refactor.
YAML
mit
groovenauts/raspi-ops,groovenauts/raspi-ops,groovenauts/raspi-ops,groovenauts/raspi-ops
yaml
## Code Before: - file: path=/srv/mon-if state=directory owner=root group=root mode=0755 - copy: src=srv/mon-if/mon-if.sh dest=/srv/mon-if/mon-if.sh owner=root group=root mode=0700 notify: mon-if up - copy: src=etc/systemd/system/mon-if.service dest=/etc/systemd/system/mon-if.service owner=root group=root mode=0644 notify: mon-if up ## Instruction: Add enabled parameter and refactor. ## Code After: - file: path: /srv/mon-if state: directory owner: root group: root mode: 0755 - copy: src: srv/mon-if/mon-if.sh dest: /srv/mon-if/mon-if.sh owner: root group: root mode: 0700 notify: mon-if up - copy: src: etc/systemd/system/mon-if.service dest: /etc/systemd/system/mon-if.service owner: root group: root mode: 0644 notify: mon-if up - systemd: name: mon-if enabled: yes
- - file: path=/srv/mon-if state=directory owner=root group=root mode=0755 - - copy: src=srv/mon-if/mon-if.sh dest=/srv/mon-if/mon-if.sh owner=root group=root mode=0700 + - file: + path: /srv/mon-if + state: directory + owner: root + group: root + mode: 0755 + - copy: + src: srv/mon-if/mon-if.sh + dest: /srv/mon-if/mon-if.sh + owner: root + group: root + mode: 0700 notify: mon-if up - - copy: src=etc/systemd/system/mon-if.service dest=/etc/systemd/system/mon-if.service owner=root group=root mode=0644 + - copy: + src: etc/systemd/system/mon-if.service + dest: /etc/systemd/system/mon-if.service + owner: root + group: root + mode: 0644 notify: mon-if up + - systemd: + name: mon-if + enabled: yes
24
4.8
21
3
547ac88131d6041667fb4bf07b217cc249408c94
ptn/templates/host.html
ptn/templates/host.html
{% extends "base.html" %} {% block data %} <script type="text/javascript"> function toggle(id) { var e = document.getElementById(id); if ( e.style.display == 'block' ) e.style.display = 'none'; else e.style.display = 'block'; } </script> <a href="/project/{{ pid }}">Return to project page.</a> <h2>{{ host }}</h2> <h3>Notes</h3> <form method='POST' action="/project/{{ pid }}/host/{{ host }}"> <textarea name="note">{{ note }}</textarea><br /> <input type="submit" value="Update Notes" /> </form> {% if details != {} %} <p>Click on the heading to see the associated data.</p> {% for k in keys %} <h3 onclick="toggle('{{ k }}')">{{ k }}</h3> <section class="note" id="{{ k }}" style="display: none;"> {% for n in details[k] %} <pre>{{ n }}</pre> {% endfor %} </section> {% endfor %} {% endif %} {% endblock %}
{% extends "base.html" %} {% block data %} <script type="text/javascript"> function toggle(id) { var e = document.getElementById(id); if ( e.style.display == 'block' ) e.style.display = 'none'; else e.style.display = 'block'; } </script> <a href="/project/{{ pid }}">Return to project page.</a> <h2>{{ host }}</h2> <h3>Notes</h3> <form method='POST' action="/project/{{ pid }}/host/{{ host }}"> <textarea name="note">{{ note }}</textarea><br /> <input type="submit" value="Update Notes" /> </form> {% if details != {} %} <h3>Host Details</h3> <p>Click on each port heading to see the associated data.</p> {% for k in keys %} <h4 onclick="toggle('{{ k }}')">{{ k }}</h4> <section class="note" id="{{ k }}" style="display: none;"> {% for n in details[k] %} <pre>{{ n }}</pre> {% endfor %} </section> {% endfor %} {% endif %} {% endblock %}
Use h4 for each port heading.
Use h4 for each port heading.
HTML
bsd-3-clause
averagesecurityguy/ptnotes,averagesecurityguy/ptnotes,averagesecurityguy/ptnotes
html
## Code Before: {% extends "base.html" %} {% block data %} <script type="text/javascript"> function toggle(id) { var e = document.getElementById(id); if ( e.style.display == 'block' ) e.style.display = 'none'; else e.style.display = 'block'; } </script> <a href="/project/{{ pid }}">Return to project page.</a> <h2>{{ host }}</h2> <h3>Notes</h3> <form method='POST' action="/project/{{ pid }}/host/{{ host }}"> <textarea name="note">{{ note }}</textarea><br /> <input type="submit" value="Update Notes" /> </form> {% if details != {} %} <p>Click on the heading to see the associated data.</p> {% for k in keys %} <h3 onclick="toggle('{{ k }}')">{{ k }}</h3> <section class="note" id="{{ k }}" style="display: none;"> {% for n in details[k] %} <pre>{{ n }}</pre> {% endfor %} </section> {% endfor %} {% endif %} {% endblock %} ## Instruction: Use h4 for each port heading. ## Code After: {% extends "base.html" %} {% block data %} <script type="text/javascript"> function toggle(id) { var e = document.getElementById(id); if ( e.style.display == 'block' ) e.style.display = 'none'; else e.style.display = 'block'; } </script> <a href="/project/{{ pid }}">Return to project page.</a> <h2>{{ host }}</h2> <h3>Notes</h3> <form method='POST' action="/project/{{ pid }}/host/{{ host }}"> <textarea name="note">{{ note }}</textarea><br /> <input type="submit" value="Update Notes" /> </form> {% if details != {} %} <h3>Host Details</h3> <p>Click on each port heading to see the associated data.</p> {% for k in keys %} <h4 onclick="toggle('{{ k }}')">{{ k }}</h4> <section class="note" id="{{ k }}" style="display: none;"> {% for n in details[k] %} <pre>{{ n }}</pre> {% endfor %} </section> {% endfor %} {% endif %} {% endblock %}
{% extends "base.html" %} {% block data %} <script type="text/javascript"> function toggle(id) { var e = document.getElementById(id); if ( e.style.display == 'block' ) e.style.display = 'none'; else e.style.display = 'block'; } </script> <a href="/project/{{ pid }}">Return to project page.</a> <h2>{{ host }}</h2> <h3>Notes</h3> <form method='POST' action="/project/{{ pid }}/host/{{ host }}"> <textarea name="note">{{ note }}</textarea><br /> <input type="submit" value="Update Notes" /> </form> {% if details != {} %} + <h3>Host Details</h3> - <p>Click on the heading to see the associated data.</p> ? -- + <p>Click on each port heading to see the associated data.</p> ? ++++++++ {% for k in keys %} - <h3 onclick="toggle('{{ k }}')">{{ k }}</h3> ? ^ ^ + <h4 onclick="toggle('{{ k }}')">{{ k }}</h4> ? ^ ^ <section class="note" id="{{ k }}" style="display: none;"> {% for n in details[k] %} <pre>{{ n }}</pre> {% endfor %} </section> {% endfor %} {% endif %} {% endblock %}
5
0.138889
3
2
008ccaf4f5ea94e1ee3a17419ad209b883d55fdd
dgc/dgc.go
dgc/dgc.go
package main import ( "fmt" "os" "os/exec" "github.com/codegangsta/cli" ) func runDgc(c *cli.Context) { fmt.println("Hello Test") } func main() { app := cli.NewApp() dgc.Name = "dgc" dgc.Usage = "A minimal docker garbage collector" dgc.Version = "0.1.0" dgc.Author = "David J Felix <davidjfelix@davidjfelix.com>" dgc.Action = runDgc app.Flags = []cli.Flag { } app.Run(os.Args) }
package main import ( "fmt" "os" "os/exec" "github.com/codegangsta/cli" ) func runDgc(c *cli.Context) { fmt.println("Hello Test") } func main() { app := cli.NewApp() dgc.Name = "dgc" dgc.Usage = "A minimal docker garbage collector" dgc.Version = "0.1.0" dgc.Author = "David J Felix <davidjfelix@davidjfelix.com>" dgc.Action = runDgc app.Flags = []cli.Flag { cli.StringFlag { Name: "grace, g", Value: "3600", Usage: "the grace period for a container, defualt time unit is seconds", EnvVar: "GRACE_PERIOD_SECONDS,GRACE_PERIOD", }, cli.StringFlag { Name: "time-unit, t", Value: "s", Usage: "the time unit used for the grace period", EnvVar: "GRACE_PERIOD_TIME_UNIT,TIME_UNIT", }, cli.StringFlag { Name: "docker, d", Value: "docker", Usage: "the docker executable", EnvVar: "DOCKER", }, cli.StringFlag { Name: "exclude, e", Value: "/etc/docker-gc-exclude", Usage: "the directory of the list of containers to exclude from garbage collection", EnvVar: "EXCLUDE_FROM_GC", } } app.Run(os.Args) }
Create command line args to simulate old script
Create command line args to simulate old script
Go
mit
hatchery/dgc
go
## Code Before: package main import ( "fmt" "os" "os/exec" "github.com/codegangsta/cli" ) func runDgc(c *cli.Context) { fmt.println("Hello Test") } func main() { app := cli.NewApp() dgc.Name = "dgc" dgc.Usage = "A minimal docker garbage collector" dgc.Version = "0.1.0" dgc.Author = "David J Felix <davidjfelix@davidjfelix.com>" dgc.Action = runDgc app.Flags = []cli.Flag { } app.Run(os.Args) } ## Instruction: Create command line args to simulate old script ## Code After: package main import ( "fmt" "os" "os/exec" "github.com/codegangsta/cli" ) func runDgc(c *cli.Context) { fmt.println("Hello Test") } func main() { app := cli.NewApp() dgc.Name = "dgc" dgc.Usage = "A minimal docker garbage collector" dgc.Version = "0.1.0" dgc.Author = "David J Felix <davidjfelix@davidjfelix.com>" dgc.Action = runDgc app.Flags = []cli.Flag { cli.StringFlag { Name: "grace, g", Value: "3600", Usage: "the grace period for a container, defualt time unit is seconds", EnvVar: "GRACE_PERIOD_SECONDS,GRACE_PERIOD", }, cli.StringFlag { Name: "time-unit, t", Value: "s", Usage: "the time unit used for the grace period", EnvVar: "GRACE_PERIOD_TIME_UNIT,TIME_UNIT", }, cli.StringFlag { Name: "docker, d", Value: "docker", Usage: "the docker executable", EnvVar: "DOCKER", }, cli.StringFlag { Name: "exclude, e", Value: "/etc/docker-gc-exclude", Usage: "the directory of the list of containers to exclude from garbage collection", EnvVar: "EXCLUDE_FROM_GC", } } app.Run(os.Args) }
package main import ( "fmt" "os" "os/exec" "github.com/codegangsta/cli" ) func runDgc(c *cli.Context) { fmt.println("Hello Test") } func main() { app := cli.NewApp() dgc.Name = "dgc" dgc.Usage = "A minimal docker garbage collector" dgc.Version = "0.1.0" dgc.Author = "David J Felix <davidjfelix@davidjfelix.com>" dgc.Action = runDgc app.Flags = []cli.Flag { + cli.StringFlag { + Name: "grace, g", + Value: "3600", + Usage: "the grace period for a container, defualt time unit is seconds", + EnvVar: "GRACE_PERIOD_SECONDS,GRACE_PERIOD", + }, + cli.StringFlag { + Name: "time-unit, t", + Value: "s", + Usage: "the time unit used for the grace period", + EnvVar: "GRACE_PERIOD_TIME_UNIT,TIME_UNIT", + }, + cli.StringFlag { + Name: "docker, d", + Value: "docker", + Usage: "the docker executable", + EnvVar: "DOCKER", + }, + cli.StringFlag { + Name: "exclude, e", + Value: "/etc/docker-gc-exclude", + Usage: "the directory of the list of containers to exclude from garbage collection", + EnvVar: "EXCLUDE_FROM_GC", + } } app.Run(os.Args) }
24
1
24
0
022e60b442134e0dc0085c936f8f53edb505020a
tox.ini
tox.ini
[tox] envlist = py25,py26,py27,pypy [testenv] ; simplify numpy installation setenv = LAPACK= ATLAS=None deps = ; epydoc numpy nose ; pysvmlight requires mercurial to install (and even if Mercurial is installed ; globally it may not be available for a given python interpreter) so it is disabled now ; ; hg+https://bitbucket.org/wcauchois/pysvmlight changedir = nltk/test commands = ; scipy and scikit-learn requires numpy even to run setup.py so ; they can't be installed in one command pip install --download-cache={toxworkdir}/_download scipy scikit-learn python runtests.py [testenv:pypy] ; numpy and pysvmlight don't work with pypy deps = epydoc nose commands = python runtests.py
[tox] envlist = py25,py26,py27,pypy [testenv] ; simplify numpy installation setenv = LAPACK= ATLAS=None deps = ; epydoc numpy nose svmlight changedir = nltk/test commands = ; scipy and scikit-learn requires numpy even to run setup.py so ; they can't be installed in one command pip install --download-cache={toxworkdir}/_download scipy scikit-learn python runtests.py [testenv:pypy] ; pysvmlight don't work with pypy; numpy is bundled with pypy. deps = epydoc nose commands = python runtests.py
Add svmlight to box tests (it was released to pypi).
Add svmlight to box tests (it was released to pypi).
INI
apache-2.0
nltk/nltk,nltk/nltk,nltk/nltk
ini
## Code Before: [tox] envlist = py25,py26,py27,pypy [testenv] ; simplify numpy installation setenv = LAPACK= ATLAS=None deps = ; epydoc numpy nose ; pysvmlight requires mercurial to install (and even if Mercurial is installed ; globally it may not be available for a given python interpreter) so it is disabled now ; ; hg+https://bitbucket.org/wcauchois/pysvmlight changedir = nltk/test commands = ; scipy and scikit-learn requires numpy even to run setup.py so ; they can't be installed in one command pip install --download-cache={toxworkdir}/_download scipy scikit-learn python runtests.py [testenv:pypy] ; numpy and pysvmlight don't work with pypy deps = epydoc nose commands = python runtests.py ## Instruction: Add svmlight to box tests (it was released to pypi). ## Code After: [tox] envlist = py25,py26,py27,pypy [testenv] ; simplify numpy installation setenv = LAPACK= ATLAS=None deps = ; epydoc numpy nose svmlight changedir = nltk/test commands = ; scipy and scikit-learn requires numpy even to run setup.py so ; they can't be installed in one command pip install --download-cache={toxworkdir}/_download scipy scikit-learn python runtests.py [testenv:pypy] ; pysvmlight don't work with pypy; numpy is bundled with pypy. deps = epydoc nose commands = python runtests.py
[tox] envlist = py25,py26,py27,pypy [testenv] ; simplify numpy installation setenv = LAPACK= ATLAS=None deps = ; epydoc numpy nose + svmlight - - - ; pysvmlight requires mercurial to install (and even if Mercurial is installed - ; globally it may not be available for a given python interpreter) so it is disabled now - ; - ; hg+https://bitbucket.org/wcauchois/pysvmlight changedir = nltk/test commands = ; scipy and scikit-learn requires numpy even to run setup.py so ; they can't be installed in one command pip install --download-cache={toxworkdir}/_download scipy scikit-learn python runtests.py [testenv:pypy] - ; numpy and pysvmlight don't work with pypy + ; pysvmlight don't work with pypy; numpy is bundled with pypy. deps = epydoc nose commands = python runtests.py
9
0.236842
2
7
4d6e9c70457d21bb3935baf613ff7b2a5cfebcc9
README.md
README.md
This is the main git repository for the OSGi specifications, reference implementations and compliance tests. It is a Bnd Workspace model build. ## Build See [CONTRIBUTING](CONTRIBUTING.md) for information on checking out the repo and building.
This is the main git repository for the OSGi specifications, reference implementations and compliance tests. It is a Bnd Workspace model build. ## Build See [CONTRIBUTING](CONTRIBUTING.md) for information on checking out the repo and building. ## Draft Specifications The latest draft specifications can be viewed at: - [Core](https://osgi.github.io/draft/core/) - [Compendium](https://osgi.github.io/draft/cmpn/)
Add links to draft specifications
readme: Add links to draft specifications
Markdown
apache-2.0
osgi/osgi,osgi/osgi,osgi/osgi,osgi/osgi,osgi/osgi,osgi/osgi,osgi/osgi,osgi/osgi
markdown
## Code Before: This is the main git repository for the OSGi specifications, reference implementations and compliance tests. It is a Bnd Workspace model build. ## Build See [CONTRIBUTING](CONTRIBUTING.md) for information on checking out the repo and building. ## Instruction: readme: Add links to draft specifications ## Code After: This is the main git repository for the OSGi specifications, reference implementations and compliance tests. It is a Bnd Workspace model build. ## Build See [CONTRIBUTING](CONTRIBUTING.md) for information on checking out the repo and building. ## Draft Specifications The latest draft specifications can be viewed at: - [Core](https://osgi.github.io/draft/core/) - [Compendium](https://osgi.github.io/draft/cmpn/)
This is the main git repository for the OSGi specifications, reference implementations and compliance tests. It is a Bnd Workspace model build. ## Build See [CONTRIBUTING](CONTRIBUTING.md) for information on checking out the repo and building. + + ## Draft Specifications + + The latest draft specifications can be viewed at: + - [Core](https://osgi.github.io/draft/core/) + - [Compendium](https://osgi.github.io/draft/cmpn/)
6
1
6
0
8a3a3f43651eade0dabfe326efe9dd4eeca9ba0c
db/migrate/20080916153239_resize_photos.rb
db/migrate/20080916153239_resize_photos.rb
class ResizePhotos < ActiveRecord::Migration def self.up require 'mini_magick' %w(families groups people pictures recipes).each do |kind| Dir["#{Rails.root}/db/photos/#{kind}/*.jpg"].each do |pic| next if pic =~ /large|medium|small|tn|full/ img = MiniMagick::Image.from_file(pic) img.thumbnail(PHOTO_SIZES[:full]) new_path = pic.sub(/\.jpg$/, '.full.jpg') img.write(new_path) File.chmod(0644, new_path) File.delete(pic) end end end def self.down raise ActiveRecord::IrreversibleMigration end end
class ResizePhotos < ActiveRecord::Migration def self.up require 'mini_magick' %w(families groups people pictures recipes).each do |kind| Dir["#{DB_PHOTO_PATH}/#{kind}/*.jpg"].each do |pic| next if pic =~ /large|medium|small|tn|full/ img = MiniMagick::Image.from_file(pic) img.thumbnail(PHOTO_SIZES[:full]) new_path = pic.sub(/\.jpg$/, '.full.jpg') img.write(new_path) File.chmod(0644, new_path) File.delete(pic) end end end def self.down raise ActiveRecord::IrreversibleMigration end end
Use proper path for photo resize migration.
Use proper path for photo resize migration.
Ruby
agpl-3.0
ferdinandrosario/onebody,hooray4me/onebody,hooray4me/onebody2,brunoocasali/onebody,acbilimoria/onebody,ferdinandrosario/onebody,powerchurch/onebody,nerdnorth/remote-workers-app,klarkc/onebody,kevinjqiu/onebody,seethemhigh/onebody,seethemhigh/onebody,tochman/onebody,hooray4me/onebody,davidleach/onebody,ebennaga/onebody,hooray4me/onebody,Capriatto/onebody,0612800232/sns_shop,klarkc/onebody,tmecklem/onebody,powerchurch/onebody,hooray4me/onebody2,moss-zc/sns_shop,fadiwissa/onebody,pgmcgee/onebody,dorman/onebody,cessien/onebody,ferdinandrosario/onebody,calsaviour/onebody,Capriatto/onebody,mattraykowski/onebody,mattraykowski/onebody,moss-zc/sns_shop,tmecklem/onebody,acbilimoria/onebody,fadiwissa/onebody,Capriatto/onebody,powerchurch/onebody,tochman/onebody,moss-zc/sns_shop,moss-zc/sns_shop,samuels410/church-portal,hschin/onebody,cessien/onebody,hschin/onebody,hschin/onebody,calsaviour/onebody,brunoocasali/onebody,acbilimoria/onebody,brunoocasali/onebody,acbilimoria/onebody,powerchurch/onebody,davidleach/onebody,seethemhigh/onebody,moss-zc/sns_shop,fadiwissa/onebody,kevinjqiu/onebody,hschin/onebody,brunoocasali/onebody,tochman/onebody,mattraykowski/onebody,ebennaga/onebody,tmecklem/onebody,hooray4me/onebody2,nerdnorth/remote-workers-app,calsaviour/onebody,AVee/onebody,cessien/onebody,pgmcgee/onebody,ebennaga/onebody,fadiwissa/onebody,hooray4me/onebody2,kevinjqiu/onebody,samuels410/church-portal,dorman/onebody,cessien/onebody,dorman/onebody,klarkc/onebody,hooray4me/onebody,nerdnorth/remote-workers-app,AVee/onebody,mattraykowski/onebody,tochman/onebody,ebennaga/onebody,seethemhigh/onebody,pgmcgee/onebody,calsaviour/onebody,klarkc/onebody,dorman/onebody,kevinjqiu/onebody,tmecklem/onebody,AVee/onebody,ferdinandrosario/onebody,davidleach/onebody,0612800232/sns_shop,nerdnorth/remote-workers-app,Capriatto/onebody,davidleach/onebody,pgmcgee/onebody,AVee/onebody,0612800232/sns_shop
ruby
## Code Before: class ResizePhotos < ActiveRecord::Migration def self.up require 'mini_magick' %w(families groups people pictures recipes).each do |kind| Dir["#{Rails.root}/db/photos/#{kind}/*.jpg"].each do |pic| next if pic =~ /large|medium|small|tn|full/ img = MiniMagick::Image.from_file(pic) img.thumbnail(PHOTO_SIZES[:full]) new_path = pic.sub(/\.jpg$/, '.full.jpg') img.write(new_path) File.chmod(0644, new_path) File.delete(pic) end end end def self.down raise ActiveRecord::IrreversibleMigration end end ## Instruction: Use proper path for photo resize migration. ## Code After: class ResizePhotos < ActiveRecord::Migration def self.up require 'mini_magick' %w(families groups people pictures recipes).each do |kind| Dir["#{DB_PHOTO_PATH}/#{kind}/*.jpg"].each do |pic| next if pic =~ /large|medium|small|tn|full/ img = MiniMagick::Image.from_file(pic) img.thumbnail(PHOTO_SIZES[:full]) new_path = pic.sub(/\.jpg$/, '.full.jpg') img.write(new_path) File.chmod(0644, new_path) File.delete(pic) end end end def self.down raise ActiveRecord::IrreversibleMigration end end
class ResizePhotos < ActiveRecord::Migration def self.up require 'mini_magick' %w(families groups people pictures recipes).each do |kind| - Dir["#{Rails.root}/db/photos/#{kind}/*.jpg"].each do |pic| + Dir["#{DB_PHOTO_PATH}/#{kind}/*.jpg"].each do |pic| next if pic =~ /large|medium|small|tn|full/ img = MiniMagick::Image.from_file(pic) img.thumbnail(PHOTO_SIZES[:full]) new_path = pic.sub(/\.jpg$/, '.full.jpg') img.write(new_path) File.chmod(0644, new_path) File.delete(pic) end end end def self.down raise ActiveRecord::IrreversibleMigration end end
2
0.1
1
1
27c127110965db6e4f90d232623aab367183d590
app/views/layouts/layout.haml
app/views/layouts/layout.haml
!!! 5 %html{:lang => "en"} %head %title Fishnet %meta(content="IE=edge,chrome=1" http-equiv="X-UA-Compatible") %meta(content="text/html; charset=UTF-8" http-equiv="Content-Type") %meta(name="description" content="Grid system based on the semantic.gs") %meta(name="author" content="Matthew Kitt") %meta(name="imagetoolbar" content="no") %meta(name="viewport" content="width=device-width, initial-scale=1.0") %meta(name="apple-touch-fullscreen" content="YES") %meta(name="apple-mobile-web-app-capable" content="YES") %meta(name="apple-mobile-web-app-status-bar-style" content="black") %link(href="/assets/application.css" rel="stylesheet" type="text/css") %body(role="document") %section.page-info.fixed %span#page_dimensions Dimensions %section.container(role="main" aria-live="polite" aria-atomic="true") = yield %script(src="/assets/prettify.js" type="text/javascript" charset="utf-8") %script(src="/assets/application.js" type="text/javascript" charset="utf-8")
!!! 5 %html{:lang => "en"} %head %title Fishnet %meta(content="IE=edge,chrome=1" http-equiv="X-UA-Compatible") %meta(content="text/html; charset=UTF-8" http-equiv="Content-Type") %meta(name="description" content="Grid system based on the semantic.gs") %meta(name="author" content="Matthew Kitt") %meta(name="imagetoolbar" content="no") %meta(name="viewport" content="width=device-width, initial-scale=1.0") %meta(name="apple-touch-fullscreen" content="YES") %meta(name="apple-mobile-web-app-capable" content="YES") %meta(name="apple-mobile-web-app-status-bar-style" content="black") <!--[if lt IE 9]> <script src="//html5shiv.googlecode.com/svn/trunk/html5.js"></script> <![endif]--> %link(href="/assets/application.css" rel="stylesheet" type="text/css") %body(role="document") %section.page-info.fixed %span#page_dimensions Dimensions %section.container(role="main" aria-live="polite" aria-atomic="true") = yield %script(src="/assets/prettify.js" type="text/javascript" charset="utf-8") %script(src="/assets/application.js" type="text/javascript" charset="utf-8")
Add html5 shim for IE
Add html5 shim for IE
Haml
mit
modeset/fishnet,modeset/fishnet
haml
## Code Before: !!! 5 %html{:lang => "en"} %head %title Fishnet %meta(content="IE=edge,chrome=1" http-equiv="X-UA-Compatible") %meta(content="text/html; charset=UTF-8" http-equiv="Content-Type") %meta(name="description" content="Grid system based on the semantic.gs") %meta(name="author" content="Matthew Kitt") %meta(name="imagetoolbar" content="no") %meta(name="viewport" content="width=device-width, initial-scale=1.0") %meta(name="apple-touch-fullscreen" content="YES") %meta(name="apple-mobile-web-app-capable" content="YES") %meta(name="apple-mobile-web-app-status-bar-style" content="black") %link(href="/assets/application.css" rel="stylesheet" type="text/css") %body(role="document") %section.page-info.fixed %span#page_dimensions Dimensions %section.container(role="main" aria-live="polite" aria-atomic="true") = yield %script(src="/assets/prettify.js" type="text/javascript" charset="utf-8") %script(src="/assets/application.js" type="text/javascript" charset="utf-8") ## Instruction: Add html5 shim for IE ## Code After: !!! 5 %html{:lang => "en"} %head %title Fishnet %meta(content="IE=edge,chrome=1" http-equiv="X-UA-Compatible") %meta(content="text/html; charset=UTF-8" http-equiv="Content-Type") %meta(name="description" content="Grid system based on the semantic.gs") %meta(name="author" content="Matthew Kitt") %meta(name="imagetoolbar" content="no") %meta(name="viewport" content="width=device-width, initial-scale=1.0") %meta(name="apple-touch-fullscreen" content="YES") %meta(name="apple-mobile-web-app-capable" content="YES") %meta(name="apple-mobile-web-app-status-bar-style" content="black") <!--[if lt IE 9]> <script src="//html5shiv.googlecode.com/svn/trunk/html5.js"></script> <![endif]--> %link(href="/assets/application.css" rel="stylesheet" type="text/css") %body(role="document") %section.page-info.fixed %span#page_dimensions Dimensions %section.container(role="main" aria-live="polite" aria-atomic="true") = yield %script(src="/assets/prettify.js" type="text/javascript" charset="utf-8") %script(src="/assets/application.js" type="text/javascript" charset="utf-8")
!!! 5 %html{:lang => "en"} %head %title Fishnet %meta(content="IE=edge,chrome=1" http-equiv="X-UA-Compatible") %meta(content="text/html; charset=UTF-8" http-equiv="Content-Type") %meta(name="description" content="Grid system based on the semantic.gs") %meta(name="author" content="Matthew Kitt") %meta(name="imagetoolbar" content="no") %meta(name="viewport" content="width=device-width, initial-scale=1.0") %meta(name="apple-touch-fullscreen" content="YES") %meta(name="apple-mobile-web-app-capable" content="YES") %meta(name="apple-mobile-web-app-status-bar-style" content="black") + <!--[if lt IE 9]> + <script src="//html5shiv.googlecode.com/svn/trunk/html5.js"></script> + <![endif]--> + %link(href="/assets/application.css" rel="stylesheet" type="text/css") %body(role="document") %section.page-info.fixed %span#page_dimensions Dimensions %section.container(role="main" aria-live="polite" aria-atomic="true") = yield %script(src="/assets/prettify.js" type="text/javascript" charset="utf-8") %script(src="/assets/application.js" type="text/javascript" charset="utf-8")
4
0.148148
4
0
e92b45ad68b665095cfce5daea7ff82550fcbfb1
psqtraviscontainer/printer.py
psqtraviscontainer/printer.py
"""Utility functions for printing unicode text.""" import sys def unicode_safe(text): """Print text to standard output, handle unicode.""" # Don't trust non-file like replacements of sys.stdout, assume # that they can only handle ascii. if sys.stdout.__class__ is not file or not sys.stdout.isatty(): text = "".join([c for c in text if ord(c) < 128]) sys.stdout.write(text)
"""Utility functions for printing unicode text.""" import sys def unicode_safe(text): """Print text to standard output, handle unicode.""" # If a replacement of sys.stdout doesn't have isatty, don't trust it. if not getattr(sys.stdout, "isatty", None) or not sys.stdout.isatty(): text = "".join([c for c in text if ord(c) < 128]) sys.stdout.write(text)
Check for the isatty property on sys.stdout.
Check for the isatty property on sys.stdout. Previously we were checking to see if it was of type "file", but the type changed between python 2 and 3. Really, all we want to do is check if it is a tty and if we can't be sure of that, don't enable utf8 output.
Python
mit
polysquare/polysquare-travis-container
python
## Code Before: """Utility functions for printing unicode text.""" import sys def unicode_safe(text): """Print text to standard output, handle unicode.""" # Don't trust non-file like replacements of sys.stdout, assume # that they can only handle ascii. if sys.stdout.__class__ is not file or not sys.stdout.isatty(): text = "".join([c for c in text if ord(c) < 128]) sys.stdout.write(text) ## Instruction: Check for the isatty property on sys.stdout. Previously we were checking to see if it was of type "file", but the type changed between python 2 and 3. Really, all we want to do is check if it is a tty and if we can't be sure of that, don't enable utf8 output. ## Code After: """Utility functions for printing unicode text.""" import sys def unicode_safe(text): """Print text to standard output, handle unicode.""" # If a replacement of sys.stdout doesn't have isatty, don't trust it. if not getattr(sys.stdout, "isatty", None) or not sys.stdout.isatty(): text = "".join([c for c in text if ord(c) < 128]) sys.stdout.write(text)
"""Utility functions for printing unicode text.""" import sys def unicode_safe(text): """Print text to standard output, handle unicode.""" + # If a replacement of sys.stdout doesn't have isatty, don't trust it. + if not getattr(sys.stdout, "isatty", None) or not sys.stdout.isatty(): - # Don't trust non-file like replacements of sys.stdout, assume - # that they can only handle ascii. - if sys.stdout.__class__ is not file or not sys.stdout.isatty(): text = "".join([c for c in text if ord(c) < 128]) sys.stdout.write(text)
5
0.384615
2
3
46d4039601a0ca40d085fc88d30529660dd03b93
tbsetup/snmpit_doc.txt
tbsetup/snmpit_doc.txt
Documentation for snmpit ------------------------ snmpit uses modules that implement the interface to a given switch model (ie Cisco Catalyst 6509, Intel EhterExpress 510T, etc). They are responsable for all communication (typically over SNMP) to the switch. These modules should support these actions: new(ip,args) Initializes snmp session to IP, "arg=val" sets module parameters, ignores unsupported args portControl(cmd,ports) cmd = {enable, disable, 10mbit, 100mbit, full, half, auto} ports = list of "pcX:Y" showPorts print settings of all ports getStats print various stats vlanLock Get a vlan editing lock/token/buffer, and vlanUnlock Release it when done setupVlan (name,ports) - takes a list of ports, and vlans them removeVlan (vlan) - takes vlan number listVlans returns a list of entries, with each entry having an ID, a name, and a list of members (pcX:Y), tab delimited Notes: showPorts and getStats currently print their results directly. This makes sense, since the output could be quite different for different switch types. This also prevents snmpit itself from using these functions to determine certain things about the state. In the future, it would be good to have them return data structures, like listVlans does, and have snmpit be able to use this info. Features supported in snmpit itself: - auto vlan config from file or from db - device type discovery and command dispatch Power control has also been moved into its own module, snmpit_apc.pm. It supports new(ip,args) as above, and power(cmd,port) where cmd is "on", "off", or "cyc[le]" and port is 1-8. snmpit library interface ------------------------ (see comment at top of snmpit_lib.pm) The snmpit library handles all the database transactions. It makes available all the translations that a module might need to do.
Documentation for snmpit ------------------------ snmpit uses modules that implement the interface to a given switch model (ie Cisco Catalyst 6509, Intel EhterExpress 510T, etc). They are responsable for all communication (typically over SNMP) to the switch. The organization is basically that snmpit itself deals with stack objects, which deal with (possibly multiple) switch objects on the backend. So, snmpit makes a snmpit_cisco_stack object, and gives it a list of switches, which snmpit_cisco_stack uses to create a snmpit_cisco object for each switch. The stack objects basically just know how to do things like the the VLAN lists from all of their switches and collate them into one big list that they can return to the caller. The API for the stack objects is currently not documented - look at one of the existing ones.
Bring up to date - there is a lot less content in this document now, though.
Bring up to date - there is a lot less content in this document now, though.
Text
agpl-3.0
nmc-probe/emulab-nome,nmc-probe/emulab-nome,nmc-probe/emulab-nome,nmc-probe/emulab-nome,nmc-probe/emulab-nome,nmc-probe/emulab-nome,nmc-probe/emulab-nome,nmc-probe/emulab-nome,nmc-probe/emulab-nome,nmc-probe/emulab-nome,nmc-probe/emulab-nome
text
## Code Before: Documentation for snmpit ------------------------ snmpit uses modules that implement the interface to a given switch model (ie Cisco Catalyst 6509, Intel EhterExpress 510T, etc). They are responsable for all communication (typically over SNMP) to the switch. These modules should support these actions: new(ip,args) Initializes snmp session to IP, "arg=val" sets module parameters, ignores unsupported args portControl(cmd,ports) cmd = {enable, disable, 10mbit, 100mbit, full, half, auto} ports = list of "pcX:Y" showPorts print settings of all ports getStats print various stats vlanLock Get a vlan editing lock/token/buffer, and vlanUnlock Release it when done setupVlan (name,ports) - takes a list of ports, and vlans them removeVlan (vlan) - takes vlan number listVlans returns a list of entries, with each entry having an ID, a name, and a list of members (pcX:Y), tab delimited Notes: showPorts and getStats currently print their results directly. This makes sense, since the output could be quite different for different switch types. This also prevents snmpit itself from using these functions to determine certain things about the state. In the future, it would be good to have them return data structures, like listVlans does, and have snmpit be able to use this info. Features supported in snmpit itself: - auto vlan config from file or from db - device type discovery and command dispatch Power control has also been moved into its own module, snmpit_apc.pm. It supports new(ip,args) as above, and power(cmd,port) where cmd is "on", "off", or "cyc[le]" and port is 1-8. snmpit library interface ------------------------ (see comment at top of snmpit_lib.pm) The snmpit library handles all the database transactions. It makes available all the translations that a module might need to do. ## Instruction: Bring up to date - there is a lot less content in this document now, though. ## Code After: Documentation for snmpit ------------------------ snmpit uses modules that implement the interface to a given switch model (ie Cisco Catalyst 6509, Intel EhterExpress 510T, etc). They are responsable for all communication (typically over SNMP) to the switch. The organization is basically that snmpit itself deals with stack objects, which deal with (possibly multiple) switch objects on the backend. So, snmpit makes a snmpit_cisco_stack object, and gives it a list of switches, which snmpit_cisco_stack uses to create a snmpit_cisco object for each switch. The stack objects basically just know how to do things like the the VLAN lists from all of their switches and collate them into one big list that they can return to the caller. The API for the stack objects is currently not documented - look at one of the existing ones.
Documentation for snmpit ------------------------ snmpit uses modules that implement the interface to a given switch model (ie Cisco Catalyst 6509, Intel EhterExpress 510T, etc). They are responsable for all communication (typically over SNMP) to the switch. - These modules should support these actions: + The organization is basically that snmpit itself deals with stack objects, + which deal with (possibly multiple) switch objects on the backend. So, snmpit + makes a snmpit_cisco_stack object, and gives it a list of switches, which + snmpit_cisco_stack uses to create a snmpit_cisco object for each switch. The + stack objects basically just know how to do things like the the VLAN lists from + all of their switches and collate them into one big list that they can return + to the caller. - new(ip,args) Initializes snmp session to IP, "arg=val" sets - module parameters, ignores unsupported args - portControl(cmd,ports) - cmd = {enable, disable, 10mbit, 100mbit, full, half, auto} - ports = list of "pcX:Y" - showPorts print settings of all ports - getStats print various stats - vlanLock Get a vlan editing lock/token/buffer, and - vlanUnlock Release it when done - setupVlan (name,ports) - takes a list of ports, and vlans them - removeVlan (vlan) - takes vlan number - listVlans returns a list of entries, with each entry having an ID, - a name, and a list of members (pcX:Y), tab delimited + The API for the stack objects is currently not documented - look at one of the + existing ones. - Notes: - showPorts and getStats currently print their results directly. This - makes sense, since the output could be quite different for different - switch types. This also prevents snmpit itself from using these - functions to determine certain things about the state. In the future, - it would be good to have them return data structures, like listVlans - does, and have snmpit be able to use this info. - - Features supported in snmpit itself: - - auto vlan config from file or from db - - device type discovery and command dispatch - - Power control has also been moved into its own module, snmpit_apc.pm. - It supports new(ip,args) as above, and power(cmd,port) where cmd is - "on", "off", or "cyc[le]" and port is 1-8. - - snmpit library interface - ------------------------ - (see comment at top of snmpit_lib.pm) - - The snmpit library handles all the database transactions. It makes - available all the translations that a module might need to do.
45
1.022727
9
36
4221f84b6e8b12d1b069a5165374fad584c8df08
lib/jwt/algos/ps.rb
lib/jwt/algos/ps.rb
module JWT module Algos module Ps # RSASSA-PSS signing algorithms module_function SUPPORTED = %w[PS256 PS384 PS512].freeze def sign(to_sign) require_openssl! algorithm, msg, key = to_sign.values key_class = key.class raise EncodeError, "The given key is a #{key_class}. It has to be an OpenSSL::PKey::RSA instance." if key_class == String translated_algorithm = algorithm.sub('PS', 'sha') key.sign_pss(translated_algorithm, msg, salt_length: :digest, mgf1_hash: translated_algorithm) end def verify(to_verify) require_openssl! SecurityUtils.verify_ps(to_verify.algorithm, to_verify.public_key, to_verify.signing_input, to_verify.signature) end def require_openssl! if Object.const_defined?('OpenSSL') major, minor = OpenSSL::VERSION.split('.').first(2) unless major.to_i >= 2 && minor.to_i >= 1 raise JWT::RequiredDependencyError, "You currently have OpenSSL #{OpenSSL::VERSION}. PS support requires >= 2.1" end else raise JWT::RequiredDependencyError, 'PS signing requires OpenSSL +2.1' end end end end end
module JWT module Algos module Ps # RSASSA-PSS signing algorithms module_function SUPPORTED = %w[PS256 PS384 PS512].freeze def sign(to_sign) require_openssl! algorithm, msg, key = to_sign.values key_class = key.class raise EncodeError, "The given key is a #{key_class}. It has to be an OpenSSL::PKey::RSA instance." if key_class == String translated_algorithm = algorithm.sub('PS', 'sha') key.sign_pss(translated_algorithm, msg, salt_length: :digest, mgf1_hash: translated_algorithm) end def verify(to_verify) require_openssl! SecurityUtils.verify_ps(to_verify.algorithm, to_verify.public_key, to_verify.signing_input, to_verify.signature) end def require_openssl! if Object.const_defined?('OpenSSL') if ::Gem::Version.new(OpenSSL::VERSION) < ::Gem::Version.new('2.1') raise JWT::RequiredDependencyError, "You currently have OpenSSL #{OpenSSL::VERSION}. PS support requires >= 2.1" end else raise JWT::RequiredDependencyError, 'PS signing requires OpenSSL +2.1' end end end end end
Fix openssl gem version check to support versons greater than 3
Fix openssl gem version check to support versons greater than 3
Ruby
mit
jwt/ruby-jwt,excpt/ruby-jwt
ruby
## Code Before: module JWT module Algos module Ps # RSASSA-PSS signing algorithms module_function SUPPORTED = %w[PS256 PS384 PS512].freeze def sign(to_sign) require_openssl! algorithm, msg, key = to_sign.values key_class = key.class raise EncodeError, "The given key is a #{key_class}. It has to be an OpenSSL::PKey::RSA instance." if key_class == String translated_algorithm = algorithm.sub('PS', 'sha') key.sign_pss(translated_algorithm, msg, salt_length: :digest, mgf1_hash: translated_algorithm) end def verify(to_verify) require_openssl! SecurityUtils.verify_ps(to_verify.algorithm, to_verify.public_key, to_verify.signing_input, to_verify.signature) end def require_openssl! if Object.const_defined?('OpenSSL') major, minor = OpenSSL::VERSION.split('.').first(2) unless major.to_i >= 2 && minor.to_i >= 1 raise JWT::RequiredDependencyError, "You currently have OpenSSL #{OpenSSL::VERSION}. PS support requires >= 2.1" end else raise JWT::RequiredDependencyError, 'PS signing requires OpenSSL +2.1' end end end end end ## Instruction: Fix openssl gem version check to support versons greater than 3 ## Code After: module JWT module Algos module Ps # RSASSA-PSS signing algorithms module_function SUPPORTED = %w[PS256 PS384 PS512].freeze def sign(to_sign) require_openssl! algorithm, msg, key = to_sign.values key_class = key.class raise EncodeError, "The given key is a #{key_class}. It has to be an OpenSSL::PKey::RSA instance." if key_class == String translated_algorithm = algorithm.sub('PS', 'sha') key.sign_pss(translated_algorithm, msg, salt_length: :digest, mgf1_hash: translated_algorithm) end def verify(to_verify) require_openssl! SecurityUtils.verify_ps(to_verify.algorithm, to_verify.public_key, to_verify.signing_input, to_verify.signature) end def require_openssl! if Object.const_defined?('OpenSSL') if ::Gem::Version.new(OpenSSL::VERSION) < ::Gem::Version.new('2.1') raise JWT::RequiredDependencyError, "You currently have OpenSSL #{OpenSSL::VERSION}. PS support requires >= 2.1" end else raise JWT::RequiredDependencyError, 'PS signing requires OpenSSL +2.1' end end end end end
module JWT module Algos module Ps # RSASSA-PSS signing algorithms module_function SUPPORTED = %w[PS256 PS384 PS512].freeze def sign(to_sign) require_openssl! algorithm, msg, key = to_sign.values key_class = key.class raise EncodeError, "The given key is a #{key_class}. It has to be an OpenSSL::PKey::RSA instance." if key_class == String translated_algorithm = algorithm.sub('PS', 'sha') key.sign_pss(translated_algorithm, msg, salt_length: :digest, mgf1_hash: translated_algorithm) end def verify(to_verify) require_openssl! SecurityUtils.verify_ps(to_verify.algorithm, to_verify.public_key, to_verify.signing_input, to_verify.signature) end def require_openssl! if Object.const_defined?('OpenSSL') + if ::Gem::Version.new(OpenSSL::VERSION) < ::Gem::Version.new('2.1') - major, minor = OpenSSL::VERSION.split('.').first(2) - - unless major.to_i >= 2 && minor.to_i >= 1 raise JWT::RequiredDependencyError, "You currently have OpenSSL #{OpenSSL::VERSION}. PS support requires >= 2.1" end else raise JWT::RequiredDependencyError, 'PS signing requires OpenSSL +2.1' end end end end end
4
0.093023
1
3
79b57cb598034e5857a5983c508332b893a87a4a
spec/unit/in_spec.rb
spec/unit/in_spec.rb
require 'spec_helper' describe Concourse::Resource::RSS::In do let(:input) { { 'source' => { 'foo' => 'bar' }, 'version' => { 'ref' => '61cebf' }, } } context 'without destination directory' do let(:destination_directory) { nil } it 'raises an error' do expect { subject.call(input, destination_directory) }.to raise_error(/destination directory/) end end context 'with a proper destination directory' do let(:destination_directory) { Dir.mktmpdir } after do FileUtils.remove_entry(destination_directory) end it 'fetches the resource and responds with the fetched version and its metadata' do output = subject.call(input, destination_directory) expect(output).to eq({ 'version' => { 'ref' => '61cebf' }, 'metadata' => [ { 'name' => 'commit', 'value' => '61cebf' }, { 'name' => 'author', 'value' => 'Hulk Hogan' }, ] }) end it 'fetches the resource and places it in the given directory' do fail 'TBD' end end end __END__ * The script must fetch the resource and place it in the given directory. * If the desired resource version is unavailable (for example, if it was deleted), the script must error. * The script must emit the fetched version, and may emit metadata as a list of key-value pairs. * params is an arbitrary JSON object passed along verbatim from params on a get.
require 'spec_helper' describe Concourse::Resource::RSS::In do let(:input) { { 'source' => { 'foo' => 'bar' }, 'version' => { 'ref' => '61cebf' }, } } let(:destination_directory) { Dir.mktmpdir } after do FileUtils.remove_entry(destination_directory) if destination_directory end it 'fetches the resource and responds with the fetched version and its metadata' do output = subject.call(input, destination_directory) expect(output).to eq({ 'version' => { 'ref' => '61cebf' }, 'metadata' => [ { 'name' => 'commit', 'value' => '61cebf' }, { 'name' => 'author', 'value' => 'Hulk Hogan' }, ] }) end xit 'fetches the resource and places it in the given directory' do end xit 'emits the fetched version' do end xit 'emits metadata as a list of key-value pairs' do end xit 'accepts params passed as an arbitrary JSON object' do end context 'without destination directory' do let(:destination_directory) { nil } it 'raises an error' do expect { subject.call(input, destination_directory) }.to raise_error(/destination directory/) end end context 'the desired resource version is unavailable' do xit 'raises an error' do end end end
Add new test cases (still skipping)
Add new test cases (still skipping)
Ruby
mit
suhlig/concourse-rss-resource,suhlig/concourse-rss-resource
ruby
## Code Before: require 'spec_helper' describe Concourse::Resource::RSS::In do let(:input) { { 'source' => { 'foo' => 'bar' }, 'version' => { 'ref' => '61cebf' }, } } context 'without destination directory' do let(:destination_directory) { nil } it 'raises an error' do expect { subject.call(input, destination_directory) }.to raise_error(/destination directory/) end end context 'with a proper destination directory' do let(:destination_directory) { Dir.mktmpdir } after do FileUtils.remove_entry(destination_directory) end it 'fetches the resource and responds with the fetched version and its metadata' do output = subject.call(input, destination_directory) expect(output).to eq({ 'version' => { 'ref' => '61cebf' }, 'metadata' => [ { 'name' => 'commit', 'value' => '61cebf' }, { 'name' => 'author', 'value' => 'Hulk Hogan' }, ] }) end it 'fetches the resource and places it in the given directory' do fail 'TBD' end end end __END__ * The script must fetch the resource and place it in the given directory. * If the desired resource version is unavailable (for example, if it was deleted), the script must error. * The script must emit the fetched version, and may emit metadata as a list of key-value pairs. * params is an arbitrary JSON object passed along verbatim from params on a get. ## Instruction: Add new test cases (still skipping) ## Code After: require 'spec_helper' describe Concourse::Resource::RSS::In do let(:input) { { 'source' => { 'foo' => 'bar' }, 'version' => { 'ref' => '61cebf' }, } } let(:destination_directory) { Dir.mktmpdir } after do FileUtils.remove_entry(destination_directory) if destination_directory end it 'fetches the resource and responds with the fetched version and its metadata' do output = subject.call(input, destination_directory) expect(output).to eq({ 'version' => { 'ref' => '61cebf' }, 'metadata' => [ { 'name' => 'commit', 'value' => '61cebf' }, { 'name' => 'author', 'value' => 'Hulk Hogan' }, ] }) end xit 'fetches the resource and places it in the given directory' do end xit 'emits the fetched version' do end xit 'emits metadata as a list of key-value pairs' do end xit 'accepts params passed as an arbitrary JSON object' do end context 'without destination directory' do let(:destination_directory) { nil } it 'raises an error' do expect { subject.call(input, destination_directory) }.to raise_error(/destination directory/) end end context 'the desired resource version is unavailable' do xit 'raises an error' do end end end
require 'spec_helper' describe Concourse::Resource::RSS::In do let(:input) { { 'source' => { 'foo' => 'bar' }, 'version' => { 'ref' => '61cebf' }, } } + + let(:destination_directory) { Dir.mktmpdir } + + after do + FileUtils.remove_entry(destination_directory) if destination_directory + end + + it 'fetches the resource and responds with the fetched version and its metadata' do + output = subject.call(input, destination_directory) + + expect(output).to eq({ + 'version' => { 'ref' => '61cebf' }, + 'metadata' => [ + { 'name' => 'commit', 'value' => '61cebf' }, + { 'name' => 'author', 'value' => 'Hulk Hogan' }, + ] + }) + end + + xit 'fetches the resource and places it in the given directory' do + end + + xit 'emits the fetched version' do + end + + xit 'emits metadata as a list of key-value pairs' do + end + + xit 'accepts params passed as an arbitrary JSON object' do + end context 'without destination directory' do let(:destination_directory) { nil } it 'raises an error' do expect { subject.call(input, destination_directory) }.to raise_error(/destination directory/) end end + context 'the desired resource version is unavailable' do + xit 'raises an error' do - context 'with a proper destination directory' do - let(:destination_directory) { Dir.mktmpdir } - - after do - FileUtils.remove_entry(destination_directory) - end - - it 'fetches the resource and responds with the fetched version and its metadata' do - output = subject.call(input, destination_directory) - - expect(output).to eq({ - 'version' => { 'ref' => '61cebf' }, - 'metadata' => [ - { 'name' => 'commit', 'value' => '61cebf' }, - { 'name' => 'author', 'value' => 'Hulk Hogan' }, - ] - }) - end - - it 'fetches the resource and places it in the given directory' do - fail 'TBD' end end end - - __END__ - - * The script must fetch the resource and place it in the given directory. - * If the desired resource version is unavailable (for example, if it was deleted), the script must error. - * The script must emit the fetched version, and may emit metadata as a list of key-value pairs. - * params is an arbitrary JSON object passed along verbatim from params on a get.
60
1.22449
32
28
e92339c7b8fdf9d8a2b1d90d0f59216c2b09db05
.travis.yml
.travis.yml
language: ruby cache: bundler rvm: - ruby-1.9.3 - ruby-2.0 - ruby-2.1 - rbx - jruby-19mode - jruby-20mode - jruby-21mode
language: ruby cache: bundler rvm: - ruby-1.9.3 - ruby-2.0 - ruby-2.1 - rbx - jruby
Change JRuby version on Travis
Change JRuby version on Travis
YAML
mit
michaelbaudino/addic7ed-ruby,Pmaene/addic7ed-ruby
yaml
## Code Before: language: ruby cache: bundler rvm: - ruby-1.9.3 - ruby-2.0 - ruby-2.1 - rbx - jruby-19mode - jruby-20mode - jruby-21mode ## Instruction: Change JRuby version on Travis ## Code After: language: ruby cache: bundler rvm: - ruby-1.9.3 - ruby-2.0 - ruby-2.1 - rbx - jruby
language: ruby cache: bundler rvm: - ruby-1.9.3 - ruby-2.0 - ruby-2.1 - rbx + - jruby - - jruby-19mode - - jruby-20mode - - jruby-21mode
4
0.4
1
3
24b14938a21dc0fd3cee71b14492916fddedd64f
week-2/wireframe-reflection.md
week-2/wireframe-reflection.md
![Wireframe blog index](../week-2/imgs/wireframe-blog-index.png "Wireframe blog index") ![Wireframe index](../week-2/imgs/wireframe-index.png "Wireframe index")
![Wireframe blog index](../week-2/imgs/wireframe-blog-index.png "Wireframe blog index") ![Wireframe index](../week-2/imgs/wireframe-index.png "Wireframe index") #What is a wireframe? #What are the benefits of wireframing? #Did you enjoy wireframing your site? #Did you revise your wireframe or stick with your first idea? #What questions did you ask during this challenge? What resources did you find to help you answer them? #Which parts of the challenge did you enjoy and which parts did you find tedious?
Add questions to wireframe reflection
Add questions to wireframe reflection
Markdown
mit
mirascarvalone/phase-0,mirascarvalone/phase-0,mirascarvalone/phase-0
markdown
## Code Before: ![Wireframe blog index](../week-2/imgs/wireframe-blog-index.png "Wireframe blog index") ![Wireframe index](../week-2/imgs/wireframe-index.png "Wireframe index") ## Instruction: Add questions to wireframe reflection ## Code After: ![Wireframe blog index](../week-2/imgs/wireframe-blog-index.png "Wireframe blog index") ![Wireframe index](../week-2/imgs/wireframe-index.png "Wireframe index") #What is a wireframe? #What are the benefits of wireframing? #Did you enjoy wireframing your site? #Did you revise your wireframe or stick with your first idea? #What questions did you ask during this challenge? What resources did you find to help you answer them? #Which parts of the challenge did you enjoy and which parts did you find tedious?
![Wireframe blog index](../week-2/imgs/wireframe-blog-index.png "Wireframe blog index") ![Wireframe index](../week-2/imgs/wireframe-index.png "Wireframe index") + + #What is a wireframe? + #What are the benefits of wireframing? + #Did you enjoy wireframing your site? + #Did you revise your wireframe or stick with your first idea? + #What questions did you ask during this challenge? What resources did you find to help you answer them? + #Which parts of the challenge did you enjoy and which parts did you find tedious?
7
3.5
7
0
9d3e42abfa4afa45d2cbf691fa7efa3acb70fd5c
zsh/path.zsh
zsh/path.zsh
if [[ -e "/Applications/Postgres.app" ]]; then export PATH="/Applications/Postgres.app/Contents/Versions/9.3/bin:$PATH"; fi ## # if go is installed export GOPATH if [[ ( -e `which go` && -e "$HOME/.go" ) ]]; then export GOPATH=$HOME/.go && export PATH="$GOPATH/bin:$PATH"; fi ## # search for rbenv. if which rbenv > /dev/null; then eval "$(rbenv init -)"; fi ## # add coreutils. # add homebrew installed packages. # last export will be the first in path! export PATH="$(brew --prefix coreutils)/libexec/gnubin:/usr/local/bin:$PATH" ## # if exists, add path for homebrew installed python. if [[ -e "/usr/local/share/python" ]]; then export PATH="/usr/local/share/python:$PATH"; fi ## # remove duplicates from PATH. typeset -U PATH
if [[ -e "/Applications/Postgres.app" ]]; then export PATH="/Applications/Postgres.app/Contents/Versions/9.3/bin:$PATH"; fi ## # if go is installed export GOPATH if [[ ( -e `which go` && -e "$HOME/.go" ) ]]; then export GOPATH=$HOME/.go && export PATH="$GOPATH/bin:$PATH"; fi ## # search for rbenv. if which rbenv > /dev/null; then eval "$(rbenv init -)"; fi ## # add coreutils. # add coreutils manuals. # add homebrew installed packages. # last export will be the first in path! export PATH="$(brew --prefix coreutils)/libexec/gnubin:/usr/local/bin:$PATH" export MANPATH="$(brew --prefix coreutils)/libexec/gnuman:$MANPATH" ## # if exists, add path for homebrew installed python. if [[ -e "/usr/local/share/python" ]]; then export PATH="/usr/local/share/python:$PATH"; fi ## # remove duplicates from PATH. typeset -U PATH
Add the correct man for the coreutils commands
Add the correct man for the coreutils commands
Shell
mit
albertogg/dotfiles,albertogg/dotfiles
shell
## Code Before: if [[ -e "/Applications/Postgres.app" ]]; then export PATH="/Applications/Postgres.app/Contents/Versions/9.3/bin:$PATH"; fi ## # if go is installed export GOPATH if [[ ( -e `which go` && -e "$HOME/.go" ) ]]; then export GOPATH=$HOME/.go && export PATH="$GOPATH/bin:$PATH"; fi ## # search for rbenv. if which rbenv > /dev/null; then eval "$(rbenv init -)"; fi ## # add coreutils. # add homebrew installed packages. # last export will be the first in path! export PATH="$(brew --prefix coreutils)/libexec/gnubin:/usr/local/bin:$PATH" ## # if exists, add path for homebrew installed python. if [[ -e "/usr/local/share/python" ]]; then export PATH="/usr/local/share/python:$PATH"; fi ## # remove duplicates from PATH. typeset -U PATH ## Instruction: Add the correct man for the coreutils commands ## Code After: if [[ -e "/Applications/Postgres.app" ]]; then export PATH="/Applications/Postgres.app/Contents/Versions/9.3/bin:$PATH"; fi ## # if go is installed export GOPATH if [[ ( -e `which go` && -e "$HOME/.go" ) ]]; then export GOPATH=$HOME/.go && export PATH="$GOPATH/bin:$PATH"; fi ## # search for rbenv. if which rbenv > /dev/null; then eval "$(rbenv init -)"; fi ## # add coreutils. # add coreutils manuals. # add homebrew installed packages. # last export will be the first in path! export PATH="$(brew --prefix coreutils)/libexec/gnubin:/usr/local/bin:$PATH" export MANPATH="$(brew --prefix coreutils)/libexec/gnuman:$MANPATH" ## # if exists, add path for homebrew installed python. if [[ -e "/usr/local/share/python" ]]; then export PATH="/usr/local/share/python:$PATH"; fi ## # remove duplicates from PATH. typeset -U PATH
if [[ -e "/Applications/Postgres.app" ]]; then export PATH="/Applications/Postgres.app/Contents/Versions/9.3/bin:$PATH"; fi ## # if go is installed export GOPATH if [[ ( -e `which go` && -e "$HOME/.go" ) ]]; then export GOPATH=$HOME/.go && export PATH="$GOPATH/bin:$PATH"; fi ## # search for rbenv. if which rbenv > /dev/null; then eval "$(rbenv init -)"; fi ## # add coreutils. + # add coreutils manuals. # add homebrew installed packages. # last export will be the first in path! export PATH="$(brew --prefix coreutils)/libexec/gnubin:/usr/local/bin:$PATH" + export MANPATH="$(brew --prefix coreutils)/libexec/gnuman:$MANPATH" ## # if exists, add path for homebrew installed python. if [[ -e "/usr/local/share/python" ]]; then export PATH="/usr/local/share/python:$PATH"; fi ## # remove duplicates from PATH. typeset -U PATH
2
0.086957
2
0
0d33a5e7551f968a81ed21d1229dd4f95b2f0e26
package.json
package.json
{ "name": "fluid-express", "version": "1.0.17", "description": "Fluid components to model an express server and associated router modules.", "main": "index.js", "scripts": { "lint": "fluid-lint-all", "pretest": "npx rimraf coverage/* reports/*", "test": "nyc node tests/all-tests.js" }, "author": "Tony Atkins <tony@raisingthefloor.org>", "license": "BSD-3-Clause", "repository": "https://github.com/fluid-project/fluid-express", "dependencies": { "body-parser": "1.19.0", "cookie-parser": "1.4.5", "express": "4.17.1", "express-session": "1.17.1", "infusion": "3.0.0-dev.20201113T153152Z.32176dcbe.FLUID-6145", "serve-index": "1.9.1" }, "devDependencies": { "eslint": "7.17.0", "eslint-config-fluid": "2.0.0", "fluid-lint-all": "1.0.0-dev.20210111T144240Z.ee70ce3.GH-1", "kettle": "1.16.0", "nyc": "12.0.2", "node-jqunit": "1.1.8" } }
{ "name": "fluid-express", "version": "1.0.18", "description": "Fluid components to model an express server and associated router modules.", "main": "index.js", "scripts": { "lint": "fluid-lint-all", "pretest": "npx rimraf coverage/* reports/*", "test": "nyc node tests/all-tests.js" }, "author": "Tony Atkins <tony@raisingthefloor.org>", "license": "BSD-3-Clause", "repository": "https://github.com/fluid-project/fluid-express", "dependencies": { "body-parser": "1.19.0", "cookie-parser": "1.4.5", "express": "4.17.1", "express-session": "1.17.1", "infusion": "3.0.0-dev.20201113T153152Z.32176dcbe.FLUID-6145", "serve-index": "1.9.1" }, "devDependencies": { "eslint": "7.17.0", "eslint-config-fluid": "2.0.0", "fluid-lint-all": "1.0.0", "kettle": "1.16.0", "nyc": "12.0.2", "node-jqunit": "1.1.8" } }
Update forward-facing version following 1.0.17 release.
Update forward-facing version following 1.0.17 release.
JSON
bsd-3-clause
GPII/gpii-express,GPII/gpii-express
json
## Code Before: { "name": "fluid-express", "version": "1.0.17", "description": "Fluid components to model an express server and associated router modules.", "main": "index.js", "scripts": { "lint": "fluid-lint-all", "pretest": "npx rimraf coverage/* reports/*", "test": "nyc node tests/all-tests.js" }, "author": "Tony Atkins <tony@raisingthefloor.org>", "license": "BSD-3-Clause", "repository": "https://github.com/fluid-project/fluid-express", "dependencies": { "body-parser": "1.19.0", "cookie-parser": "1.4.5", "express": "4.17.1", "express-session": "1.17.1", "infusion": "3.0.0-dev.20201113T153152Z.32176dcbe.FLUID-6145", "serve-index": "1.9.1" }, "devDependencies": { "eslint": "7.17.0", "eslint-config-fluid": "2.0.0", "fluid-lint-all": "1.0.0-dev.20210111T144240Z.ee70ce3.GH-1", "kettle": "1.16.0", "nyc": "12.0.2", "node-jqunit": "1.1.8" } } ## Instruction: Update forward-facing version following 1.0.17 release. ## Code After: { "name": "fluid-express", "version": "1.0.18", "description": "Fluid components to model an express server and associated router modules.", "main": "index.js", "scripts": { "lint": "fluid-lint-all", "pretest": "npx rimraf coverage/* reports/*", "test": "nyc node tests/all-tests.js" }, "author": "Tony Atkins <tony@raisingthefloor.org>", "license": "BSD-3-Clause", "repository": "https://github.com/fluid-project/fluid-express", "dependencies": { "body-parser": "1.19.0", "cookie-parser": "1.4.5", "express": "4.17.1", "express-session": "1.17.1", "infusion": "3.0.0-dev.20201113T153152Z.32176dcbe.FLUID-6145", "serve-index": "1.9.1" }, "devDependencies": { "eslint": "7.17.0", "eslint-config-fluid": "2.0.0", "fluid-lint-all": "1.0.0", "kettle": "1.16.0", "nyc": "12.0.2", "node-jqunit": "1.1.8" } }
{ "name": "fluid-express", - "version": "1.0.17", ? ^ + "version": "1.0.18", ? ^ "description": "Fluid components to model an express server and associated router modules.", "main": "index.js", "scripts": { "lint": "fluid-lint-all", "pretest": "npx rimraf coverage/* reports/*", "test": "nyc node tests/all-tests.js" }, "author": "Tony Atkins <tony@raisingthefloor.org>", "license": "BSD-3-Clause", "repository": "https://github.com/fluid-project/fluid-express", "dependencies": { "body-parser": "1.19.0", "cookie-parser": "1.4.5", "express": "4.17.1", "express-session": "1.17.1", "infusion": "3.0.0-dev.20201113T153152Z.32176dcbe.FLUID-6145", "serve-index": "1.9.1" }, "devDependencies": { "eslint": "7.17.0", "eslint-config-fluid": "2.0.0", - "fluid-lint-all": "1.0.0-dev.20210111T144240Z.ee70ce3.GH-1", + "fluid-lint-all": "1.0.0", "kettle": "1.16.0", "nyc": "12.0.2", "node-jqunit": "1.1.8" } }
4
0.133333
2
2
4ec761fd674aabe7d766a3017c03a328d1164777
acceptance/fips/test/integration/fips/serverspec/fips_spec.rb
acceptance/fips/test/integration/fips/serverspec/fips_spec.rb
require "mixlib/shellout" require "bundler" describe "Chef Fips Specs" do def windows? if RUBY_PLATFORM =~ /mswin|mingw|windows/ true else false end end let(:chef_dir) do if windows? Dir.glob("c:/opscode/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last else Dir.glob("/opt/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last end end let(:path) do if windows? 'C:\opscode\chef\embedded\bin' else "/opt/chef/embedded/bin" end end it "passes the unit and functional specs" do Bundler.with_clean_env do ruby_cmd = Mixlib::ShellOut.new( "bundle exec rspec -t ~requires_git spec/unit spec/functional", :env => { "PATH" => "#{ENV['PATH']}:#{path}", "GEM_PATH" => nil, "GEM_CACHE" => nil, "GEM_HOME" => nil, "CHEF_FIPS" => "1" }, :live_stream => STDOUT, :cwd => chef_dir, :timeout => 3600) expect { ruby_cmd.run_command.error! }.not_to raise_exception end end end
require "mixlib/shellout" require "bundler" describe "Chef Fips Specs" do def windows? if RUBY_PLATFORM =~ /mswin|mingw|windows/ true else false end end let(:chef_dir) do if windows? Dir.glob("c:/opscode/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last else Dir.glob("/opt/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last end end let(:path) do if windows? 'C:\opscode\chef\embedded\bin' else "/opt/chef/embedded/bin" end end it "passes the unit and functional specs" do Bundler.with_clean_env do ruby_cmd = Mixlib::ShellOut.new( "bundle exec rspec -t ~requires_git spec/unit spec/functional", :env => { "PATH" => [ENV['PATH'], path].join(File::PATH_SEPARATOR), "GEM_PATH" => nil, "GEM_CACHE" => nil, "GEM_HOME" => nil, "CHEF_FIPS" => "1"}, :live_stream => STDOUT, :cwd => chef_dir, :timeout => 3600) expect { ruby_cmd.run_command.error! }.not_to raise_exception end end end
Use the correct path separator
Use the correct path separator
Ruby
apache-2.0
b002368/chef-repo,onlyhavecans/chef,robmul/chef,jonlives/chef,jkerry/chef,martinisoft/chef,juliandunn/chef,natewalck/chef,onlyhavecans/chef,tbunnyman/chef,tomdoherty/chef,oclaussen/chef,martinisoft/chef,ChaosCloud/chef,MsysTechnologiesllc/chef,sanditiffin/chef,Tensibai/chef,mikedodge04/chef,mikedodge04/chef,criteo-forks/chef,higanworks/chef,mikedodge04/chef,webframp/chef,someara/chef,sanditiffin/chef,nathwill/chef,robmul/chef,youngjl1/chef,Kast0rTr0y/chef,jkerry/chef,tbunnyman/chef,evan2645/chef,juliandunn/chef,sanditiffin/chef,jkerry/chef,chef/chef,juliandunn/chef,Kast0rTr0y/chef,brettcave/chef,docwhat/chef,gene1wood/chef,nathwill/chef,evan2645/chef,mwrock/chef,nvwls/chef,ranjib/chef,mattray/chef,Ppjet6/chef,mal/chef,juliandunn/chef,chef/chef,chef/chef,strangelittlemonkey/chef,mikedodge04/chef,jonlives/chef,onlyhavecans/chef,b002368/chef-repo,gene1wood/chef,brettcave/chef,tomdoherty/chef,nathwill/chef,sekrett/chef,strangelittlemonkey/chef,oclaussen/chef,nvwls/chef,mikedodge04/chef,Tensibai/chef,ChaosCloud/chef,jaymzh/chef,evan2645/chef,strangelittlemonkey/chef,sekrett/chef,ChaosCloud/chef,ranjib/chef,jaymzh/chef,3dinfluence/chef,sanditiffin/chef,docwhat/chef,webframp/chef,natewalck/chef,criteo-forks/chef,webframp/chef,b002368/chef-repo,tas50/chef-1,sekrett/chef,tbunnyman/chef,Ppjet6/chef,MsysTechnologiesllc/chef,chef/chef,webframp/chef,ChaosCloud/chef,webframp/chef,youngjl1/chef,youngjl1/chef,mattray/chef,docwhat/chef,tbunnyman/chef,mattray/chef,MsysTechnologiesllc/chef,jaymzh/chef,docwhat/chef,brettcave/chef,ranjib/chef,oclaussen/chef,3dinfluence/chef,Tensibai/chef,3dinfluence/chef,nvwls/chef,robmul/chef,3dinfluence/chef,Kast0rTr0y/chef,martinisoft/chef,3dinfluence/chef,nathwill/chef,natewalck/chef,evan2645/chef,someara/chef,b002368/chef-repo,oclaussen/chef,sanditiffin/chef,onlyhavecans/chef,strangelittlemonkey/chef,jkerry/chef,MsysTechnologiesllc/chef,higanworks/chef,someara/chef,nvwls/chef,martinisoft/chef,mattray/chef,natewalck/chef,ranjib/chef,nvwls/chef,mwrock/chef,mal/chef,onlyhavecans/chef,tas50/chef-1,ranjib/chef,strangelittlemonkey/chef,3dinfluence/chef,criteo-forks/chef,mikedodge04/chef,Ppjet6/chef,tomdoherty/chef,mal/chef,tbunnyman/chef,Ppjet6/chef,youngjl1/chef,sekrett/chef,evan2645/chef,jkerry/chef,jonlives/chef,jaymzh/chef,criteo-forks/chef,sanditiffin/chef,robmul/chef,oclaussen/chef,natewalck/chef,gene1wood/chef,gene1wood/chef,sekrett/chef,nathwill/chef,someara/chef,webframp/chef,docwhat/chef,jonlives/chef,oclaussen/chef,brettcave/chef,ChaosCloud/chef,natewalck/chef,robmul/chef,mal/chef,higanworks/chef,jonlives/chef,Tensibai/chef,martinisoft/chef,evan2645/chef,tbunnyman/chef,tas50/chef-1,b002368/chef-repo,sekrett/chef,tas50/chef-1,onlyhavecans/chef,Kast0rTr0y/chef,strangelittlemonkey/chef,mattray/chef,higanworks/chef,someara/chef,mwrock/chef,mal/chef,docwhat/chef,mwrock/chef,higanworks/chef,nvwls/chef,juliandunn/chef,brettcave/chef,someara/chef,tomdoherty/chef,jonlives/chef,youngjl1/chef,brettcave/chef,youngjl1/chef,Tensibai/chef,Kast0rTr0y/chef,tomdoherty/chef,mal/chef,ChaosCloud/chef,Ppjet6/chef,b002368/chef-repo,nathwill/chef,criteo-forks/chef,criteo-forks/chef,robmul/chef,Ppjet6/chef,ranjib/chef,Kast0rTr0y/chef,jkerry/chef
ruby
## Code Before: require "mixlib/shellout" require "bundler" describe "Chef Fips Specs" do def windows? if RUBY_PLATFORM =~ /mswin|mingw|windows/ true else false end end let(:chef_dir) do if windows? Dir.glob("c:/opscode/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last else Dir.glob("/opt/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last end end let(:path) do if windows? 'C:\opscode\chef\embedded\bin' else "/opt/chef/embedded/bin" end end it "passes the unit and functional specs" do Bundler.with_clean_env do ruby_cmd = Mixlib::ShellOut.new( "bundle exec rspec -t ~requires_git spec/unit spec/functional", :env => { "PATH" => "#{ENV['PATH']}:#{path}", "GEM_PATH" => nil, "GEM_CACHE" => nil, "GEM_HOME" => nil, "CHEF_FIPS" => "1" }, :live_stream => STDOUT, :cwd => chef_dir, :timeout => 3600) expect { ruby_cmd.run_command.error! }.not_to raise_exception end end end ## Instruction: Use the correct path separator ## Code After: require "mixlib/shellout" require "bundler" describe "Chef Fips Specs" do def windows? if RUBY_PLATFORM =~ /mswin|mingw|windows/ true else false end end let(:chef_dir) do if windows? Dir.glob("c:/opscode/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last else Dir.glob("/opt/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last end end let(:path) do if windows? 'C:\opscode\chef\embedded\bin' else "/opt/chef/embedded/bin" end end it "passes the unit and functional specs" do Bundler.with_clean_env do ruby_cmd = Mixlib::ShellOut.new( "bundle exec rspec -t ~requires_git spec/unit spec/functional", :env => { "PATH" => [ENV['PATH'], path].join(File::PATH_SEPARATOR), "GEM_PATH" => nil, "GEM_CACHE" => nil, "GEM_HOME" => nil, "CHEF_FIPS" => "1"}, :live_stream => STDOUT, :cwd => chef_dir, :timeout => 3600) expect { ruby_cmd.run_command.error! }.not_to raise_exception end end end
require "mixlib/shellout" require "bundler" describe "Chef Fips Specs" do def windows? if RUBY_PLATFORM =~ /mswin|mingw|windows/ true else false end end let(:chef_dir) do if windows? Dir.glob("c:/opscode/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last else Dir.glob("/opt/chef/embedded/lib/ruby/gems/*/gems/chef-[0-9]*").last end end let(:path) do if windows? 'C:\opscode\chef\embedded\bin' else "/opt/chef/embedded/bin" end end it "passes the unit and functional specs" do Bundler.with_clean_env do ruby_cmd = Mixlib::ShellOut.new( - "bundle exec rspec -t ~requires_git spec/unit spec/functional", :env => { "PATH" => "#{ENV['PATH']}:#{path}", ? ^^^ ^^^^ ^^ + "bundle exec rspec -t ~requires_git spec/unit spec/functional", :env => { "PATH" => [ENV['PATH'], path].join(File::PATH_SEPARATOR), ? ^ ^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ "GEM_PATH" => nil, "GEM_CACHE" => nil, "GEM_HOME" => nil, - "CHEF_FIPS" => "1" }, ? - + "CHEF_FIPS" => "1"}, :live_stream => STDOUT, :cwd => chef_dir, :timeout => 3600) expect { ruby_cmd.run_command.error! }.not_to raise_exception end end end
4
0.102564
2
2
ea7fe7b3bfcd444bf4e36893e6cb002f8e0ae32d
zipkin-hbase/src/test/guava-hack/src/main/java/com/google/common/base/Stopwatch.java
zipkin-hbase/src/test/guava-hack/src/main/java/com/google/common/base/Stopwatch.java
package com.google.common.base; public final class Stopwatch{ private long start; private long duration; public Stopwatch() { } public Stopwatch start() { start = System.nanoTime(); duration = 0; return this; } public Stopwatch stop() { duration = System.nanoTime() - start; start = 0; return this; } public Stopwatch reset() { start = 0; duration = 0; return this; } public long elapsedNanos() { return (start == 0) ? duration : (System.nanoTime() - start); } public long elapsedMillis() { return elapsedNanos() / 1000000; } }
package com.google.common.base; //HBase requires an old version of Guava, and Twitter requires a new one. //HBase only needs the Stopwatch class from guava, so it's simplest to add this class before the actual Guava in the classpath, so HBase sees this old shim. public final class Stopwatch{ private long start; private long duration; public Stopwatch() { } public Stopwatch start() { start = System.nanoTime(); duration = 0; return this; } public Stopwatch stop() { duration = System.nanoTime() - start; start = 0; return this; } public Stopwatch reset() { start = 0; duration = 0; return this; } public long elapsedNanos() { return (start == 0) ? duration : (System.nanoTime() - start); } public long elapsedMillis() { return elapsedNanos() / 1000000; } }
Add comment explaining why there is a shim
Add comment explaining why there is a shim Signed-off-by: Jeff Smick <f3e731dfa293c7a83119d8aacfa41b5d2d780be9@twitter.com>
Java
apache-2.0
bbc/zipkin,betable/zipkin,coursera/zipkin,zerdliu/zipkin,coursera/zipkin,dsyer/zipkin,jamescway/zipkin,rocwzp/zipkin,dsyer/zipkin,wyzssw/zipkin,coursera/zipkin,drax68/zipkin,betable/zipkin,cogitate/twitter-zipkin-uuid,c3p0hz/zipkin,mt0803/zipkin,lookout/zipkin,fengshao0907/zipkin,rjbhewei/zipkin,wyzssw/zipkin,gneokleo/zipkin,lqjack/zipkin,dsyer/zipkin,srijs/zipkin,prat0318/zipkin,jamescway/zipkin,travisbrown/zipkin,xujianhai/zipkin,cogitate/twitter-zipkin-uuid,baloo/zipkin,prat0318/zipkin,wyzssw/zipkin,hydro2k/zipkin,lqjack/zipkin,mzagar/zipkin,xujianhai/zipkin,yshaojie/zipkin,thelastpickle/zipkin,rjbhewei/zipkin,rjbhewei/zipkin,baloo/zipkin,antmat/zipkin,drax68/zipkin,cogitate/twitter-zipkin-uuid,dsyer/zipkin,lookout/zipkin,zhoffice/zipkin,bbc/zipkin,mzagar/zipkin,eirslett/zipkin,lookout/zipkin,hydro2k/zipkin,bbc/zipkin,srijs/zipkin,thelastpickle/zipkin,dsyer/zipkin,c3p0hz/zipkin,rocwzp/zipkin,thelastpickle/zipkin,drax68/zipkin,lqjack/zipkin,Yelp/zipkin,gneokleo/zipkin,rocwzp/zipkin,antmat/zipkin,wuqiangxjtu/zipkin,coursera/zipkin,yshaojie/zipkin,Flipkart/zipkin,dsyer/zipkin,jstanier/zipkin,xujianhai/zipkin,eirslett/zipkin,travisbrown/zipkin,drax68/zipkin,jamescway/zipkin,wuqiangxjtu/zipkin,drax68/zipkin,antmat/zipkin,yshaojie/zipkin,rocwzp/zipkin,mt0803/zipkin,wuqiangxjtu/zipkin,srijs/zipkin,travisbrown/zipkin,c3p0hz/zipkin,zhoffice/zipkin,wuqiangxjtu/zipkin,eirslett/zipkin,cogitate/twitter-zipkin-uuid,cogitate/twitter-zipkin-uuid,xujianhai/zipkin,cogitate/twitter-zipkin-uuid,rjbhewei/zipkin,c3p0hz/zipkin,zhoffice/zipkin,antmat/zipkin,jfeltesse-mdsol/zipkin,jfeltesse-mdsol/zipkin,wyzssw/zipkin,eirslett/zipkin,baloo/zipkin,bbc/zipkin,srijs/zipkin,lookout/zipkin,Yelp/zipkin,thelastpickle/zipkin,fengshao0907/zipkin,travisbrown/zipkin,zerdliu/zipkin,baloo/zipkin,jfeltesse-mdsol/zipkin,zhoffice/zipkin,antmat/zipkin,antmat/zipkin,prat0318/zipkin,jstanier/zipkin,mzagar/zipkin,Flipkart/zipkin,lookout/zipkin,gneokleo/zipkin,yshaojie/zipkin,chang2394/zipkin,jstanier/zipkin,zerdliu/zipkin,jstanier/zipkin,mt0803/zipkin,prat0318/zipkin,lqjack/zipkin,chang2394/zipkin,hydro2k/zipkin,fengshao0907/zipkin,zerdliu/zipkin,Yelp/zipkin,mzagar/zipkin,baloo/zipkin,Flipkart/zipkin,Flipkart/zipkin,travisbrown/zipkin,zerdliu/zipkin,mt0803/zipkin,travisbrown/zipkin,wyzssw/zipkin,betable/zipkin,chang2394/zipkin,hydro2k/zipkin,zerdliu/zipkin,jamescway/zipkin,jfeltesse-mdsol/zipkin,bbc/zipkin,betable/zipkin,lookout/zipkin,gneokleo/zipkin,drax68/zipkin,bbc/zipkin,Yelp/zipkin,chang2394/zipkin,fengshao0907/zipkin,wyzssw/zipkin
java
## Code Before: package com.google.common.base; public final class Stopwatch{ private long start; private long duration; public Stopwatch() { } public Stopwatch start() { start = System.nanoTime(); duration = 0; return this; } public Stopwatch stop() { duration = System.nanoTime() - start; start = 0; return this; } public Stopwatch reset() { start = 0; duration = 0; return this; } public long elapsedNanos() { return (start == 0) ? duration : (System.nanoTime() - start); } public long elapsedMillis() { return elapsedNanos() / 1000000; } } ## Instruction: Add comment explaining why there is a shim Signed-off-by: Jeff Smick <f3e731dfa293c7a83119d8aacfa41b5d2d780be9@twitter.com> ## Code After: package com.google.common.base; //HBase requires an old version of Guava, and Twitter requires a new one. //HBase only needs the Stopwatch class from guava, so it's simplest to add this class before the actual Guava in the classpath, so HBase sees this old shim. public final class Stopwatch{ private long start; private long duration; public Stopwatch() { } public Stopwatch start() { start = System.nanoTime(); duration = 0; return this; } public Stopwatch stop() { duration = System.nanoTime() - start; start = 0; return this; } public Stopwatch reset() { start = 0; duration = 0; return this; } public long elapsedNanos() { return (start == 0) ? duration : (System.nanoTime() - start); } public long elapsedMillis() { return elapsedNanos() / 1000000; } }
package com.google.common.base; + //HBase requires an old version of Guava, and Twitter requires a new one. + //HBase only needs the Stopwatch class from guava, so it's simplest to add this class before the actual Guava in the classpath, so HBase sees this old shim. public final class Stopwatch{ private long start; private long duration; public Stopwatch() { } public Stopwatch start() { start = System.nanoTime(); duration = 0; return this; } public Stopwatch stop() { duration = System.nanoTime() - start; start = 0; return this; } public Stopwatch reset() { start = 0; duration = 0; return this; } public long elapsedNanos() { return (start == 0) ? duration : (System.nanoTime() - start); } public long elapsedMillis() { return elapsedNanos() / 1000000; } }
2
0.057143
2
0
f6b5547b377f87469ff178e3b9f5e9d2aea3f3e1
.travis.yml
.travis.yml
language: go go: - 1.3 - 1.4.3 - 1.5.1 - tip install: - go get -d -v ./... script: - go build -v ./... - go tool vet -all -v *.go
language: go go: - 1.3 - 1.4.3 - 1.5.1 - tip install: - go get -v golang.org/x/tools/cmd/vet - go get -d -v ./... script: - go build -v ./... - go tool vet -all -v *.go
Make sure we have go vet
Make sure we have go vet
YAML
mit
tehmaze/netflow
yaml
## Code Before: language: go go: - 1.3 - 1.4.3 - 1.5.1 - tip install: - go get -d -v ./... script: - go build -v ./... - go tool vet -all -v *.go ## Instruction: Make sure we have go vet ## Code After: language: go go: - 1.3 - 1.4.3 - 1.5.1 - tip install: - go get -v golang.org/x/tools/cmd/vet - go get -d -v ./... script: - go build -v ./... - go tool vet -all -v *.go
language: go go: - 1.3 - 1.4.3 - 1.5.1 - tip install: + - go get -v golang.org/x/tools/cmd/vet - go get -d -v ./... script: - go build -v ./... - go tool vet -all -v *.go
1
0.071429
1
0
f835ce3509af8f719c4412e18ad85623571258d6
Package.swift
Package.swift
import PackageDescription let package = Package( name: "langserver-swift", targets: [ Target(name: "JSONRPC"), Target(name: "LanguageServerProtocol", dependencies: ["JSONRPC"]), Target(name: "LanguageServer", dependencies: ["LanguageServerProtocol", "JSONRPC"]) ], dependencies: [ .Package(url: "https://github.com/IBM-Swift/BlueSocket.git", majorVersion: 0, minor: 11), .Package(url: "https://github.com/RLovelett/SourceKitten.git", majorVersion: 0), .Package(url: "https://github.com/thoughtbot/Argo.git", majorVersion: 4), .Package(url: "https://github.com/thoughtbot/Curry.git", majorVersion: 3) ] )
import PackageDescription let package = Package( name: "langserver-swift", targets: [ Target(name: "JSONRPC"), Target(name: "LanguageServerProtocol", dependencies: ["JSONRPC"]), Target(name: "LanguageServer", dependencies: ["LanguageServerProtocol", "JSONRPC"]) ], dependencies: [ .Package(url: "https://github.com/RLovelett/SourceKitten.git", majorVersion: 0), .Package(url: "https://github.com/thoughtbot/Argo.git", majorVersion: 4), .Package(url: "https://github.com/thoughtbot/Curry.git", majorVersion: 3) ] )
Remove BlueSocket as a dependency (no longer support TCP)
Remove BlueSocket as a dependency (no longer support TCP)
Swift
apache-2.0
RLovelett/langserver-swift
swift
## Code Before: import PackageDescription let package = Package( name: "langserver-swift", targets: [ Target(name: "JSONRPC"), Target(name: "LanguageServerProtocol", dependencies: ["JSONRPC"]), Target(name: "LanguageServer", dependencies: ["LanguageServerProtocol", "JSONRPC"]) ], dependencies: [ .Package(url: "https://github.com/IBM-Swift/BlueSocket.git", majorVersion: 0, minor: 11), .Package(url: "https://github.com/RLovelett/SourceKitten.git", majorVersion: 0), .Package(url: "https://github.com/thoughtbot/Argo.git", majorVersion: 4), .Package(url: "https://github.com/thoughtbot/Curry.git", majorVersion: 3) ] ) ## Instruction: Remove BlueSocket as a dependency (no longer support TCP) ## Code After: import PackageDescription let package = Package( name: "langserver-swift", targets: [ Target(name: "JSONRPC"), Target(name: "LanguageServerProtocol", dependencies: ["JSONRPC"]), Target(name: "LanguageServer", dependencies: ["LanguageServerProtocol", "JSONRPC"]) ], dependencies: [ .Package(url: "https://github.com/RLovelett/SourceKitten.git", majorVersion: 0), .Package(url: "https://github.com/thoughtbot/Argo.git", majorVersion: 4), .Package(url: "https://github.com/thoughtbot/Curry.git", majorVersion: 3) ] )
import PackageDescription let package = Package( name: "langserver-swift", targets: [ Target(name: "JSONRPC"), Target(name: "LanguageServerProtocol", dependencies: ["JSONRPC"]), Target(name: "LanguageServer", dependencies: ["LanguageServerProtocol", "JSONRPC"]) ], dependencies: [ - .Package(url: "https://github.com/IBM-Swift/BlueSocket.git", majorVersion: 0, minor: 11), .Package(url: "https://github.com/RLovelett/SourceKitten.git", majorVersion: 0), .Package(url: "https://github.com/thoughtbot/Argo.git", majorVersion: 4), .Package(url: "https://github.com/thoughtbot/Curry.git", majorVersion: 3) ] )
1
0.0625
0
1
ca33d994ef98f9200ca7c011db3ab97d0f3dce06
vendor/assets/stylesheets/_variables.scss
vendor/assets/stylesheets/_variables.scss
$panelBorder: #babdc5 !default; $skinActiveColor: #e73c3c !default; $bodyColor: rgba(236, 240, 241, 0.48) !default; $skinBlack: rgb(55, 71, 79); $panelHeaderBck: $skinBlack; $skinBlackLight: rgb(69, 90, 100); $textGray: rgb(174, 179, 181); $skinTextColor: #1e2a33 !default; $skinTextActiveColor: white !default; $skinTextTable: #7f8c8d !default; $skinHeaderBck: #1e2a33 !default; // $panelHeaderBck: #2c3e50 !default;
$panelBorder: #babdc5 !default; $skinActiveColor: #e73c3c !default; $bodyColor: rgba(236, 240, 241, 0.48) !default; $skinTextColor: #1e2a33 !default; $skinTextActiveColor: white !default; $skinTextTable: #7f8c8d !default; $skinHeaderBck: #1e2a33 !default; $skinBlack: rgb(55, 71, 79) !default; $panelHeaderBck: $skinBlack !default; $skinBlackLight: rgb(69, 90, 100) !default; $textGray: rgb(174, 179, 181) !default; // $panelHeaderBck: #2c3e50 !default;
Define default value for variables
Define default value for variables
SCSS
mit
ayann/active_admin_flat_skin,ayann/active_admin_flat_skin
scss
## Code Before: $panelBorder: #babdc5 !default; $skinActiveColor: #e73c3c !default; $bodyColor: rgba(236, 240, 241, 0.48) !default; $skinBlack: rgb(55, 71, 79); $panelHeaderBck: $skinBlack; $skinBlackLight: rgb(69, 90, 100); $textGray: rgb(174, 179, 181); $skinTextColor: #1e2a33 !default; $skinTextActiveColor: white !default; $skinTextTable: #7f8c8d !default; $skinHeaderBck: #1e2a33 !default; // $panelHeaderBck: #2c3e50 !default; ## Instruction: Define default value for variables ## Code After: $panelBorder: #babdc5 !default; $skinActiveColor: #e73c3c !default; $bodyColor: rgba(236, 240, 241, 0.48) !default; $skinTextColor: #1e2a33 !default; $skinTextActiveColor: white !default; $skinTextTable: #7f8c8d !default; $skinHeaderBck: #1e2a33 !default; $skinBlack: rgb(55, 71, 79) !default; $panelHeaderBck: $skinBlack !default; $skinBlackLight: rgb(69, 90, 100) !default; $textGray: rgb(174, 179, 181) !default; // $panelHeaderBck: #2c3e50 !default;
$panelBorder: #babdc5 !default; $skinActiveColor: #e73c3c !default; $bodyColor: rgba(236, 240, 241, 0.48) !default; - $skinBlack: rgb(55, 71, 79); - $panelHeaderBck: $skinBlack; - $skinBlackLight: rgb(69, 90, 100); - $textGray: rgb(174, 179, 181); $skinTextColor: #1e2a33 !default; $skinTextActiveColor: white !default; $skinTextTable: #7f8c8d !default; $skinHeaderBck: #1e2a33 !default; + $skinBlack: rgb(55, 71, 79) !default; + $panelHeaderBck: $skinBlack !default; + $skinBlackLight: rgb(69, 90, 100) !default; + $textGray: rgb(174, 179, 181) !default; // $panelHeaderBck: #2c3e50 !default;
8
0.666667
4
4
70f80dc34f28a68e64b3452fd5e6e504fa08a084
scripts/version.sh
scripts/version.sh
VERSION=$(git rev-parse --short HEAD) echo $VERSION
VERSION=$(git rev-parse --short HEAD) if [ "$VERSION" != "" ]; then echo $VERSION else echo "1" fi
Return always 1 if no git
Return always 1 if no git
Shell
mit
vanng822/appskeleton,vanng822/appskeleton
shell
## Code Before: VERSION=$(git rev-parse --short HEAD) echo $VERSION ## Instruction: Return always 1 if no git ## Code After: VERSION=$(git rev-parse --short HEAD) if [ "$VERSION" != "" ]; then echo $VERSION else echo "1" fi
VERSION=$(git rev-parse --short HEAD) + if [ "$VERSION" != "" ]; then - echo $VERSION + echo $VERSION ? + + else + echo "1" + fi
6
3
5
1
01b33af6f4d570b34ad791cd5ccaa3ea7f77dcb9
src/daemon/PhutilDaemonOverseerModule.php
src/daemon/PhutilDaemonOverseerModule.php
<?php /** * Overseer modules allow daemons to be externally influenced. * * See @{class:PhabricatorDaemonOverseerModule} for a concrete example. */ abstract class PhutilDaemonOverseerModule extends Phobject { /** * This method is used to indicate to the overseer that daemons should reload. * * @return bool True if the daemons should reload, otherwise false. */ abstract public function shouldReloadDaemons(); /** * Should a hibernating daemon pool be awoken immediately? * * @return bool True to awaken the pool immediately. */ public function shouldWakePool(PhutilDaemonPool $pool) { return false; } public static function getAllModules() { return id(new PhutilClassMapQuery()) ->setAncestorClass(__CLASS__) ->execute(); } }
<?php /** * Overseer modules allow daemons to be externally influenced. * * See @{class:PhabricatorDaemonOverseerModule} for a concrete example. */ abstract class PhutilDaemonOverseerModule extends Phobject { private $throttles = array(); /** * This method is used to indicate to the overseer that daemons should reload. * * @return bool True if the daemons should reload, otherwise false. */ public function shouldReloadDaemons() { return false; } /** * Should a hibernating daemon pool be awoken immediately? * * @return bool True to awaken the pool immediately. */ public function shouldWakePool(PhutilDaemonPool $pool) { return false; } public static function getAllModules() { return id(new PhutilClassMapQuery()) ->setAncestorClass(__CLASS__) ->execute(); } /** * Throttle checks from executing too often. * * If you throttle a check like this, it will only execute once every 2.5 * seconds: * * if ($this->shouldThrottle('some.check', 2.5)) { * return; * } * * @param string Throttle key. * @param float Duration in seconds. * @return bool True to throttle the check. */ protected function shouldThrottle($name, $duration) { $throttle = idx($this->throttles, $name, 0); $now = microtime(true); // If not enough time has elapsed, throttle the check. $elapsed = ($now - $throttle); if ($elapsed < $duration) { return true; } // Otherwise, mark the current time as the last time we ran the check, // then let it continue. $this->throttles[$name] = $now; return false; } }
Clean up overseer modules slightly and provide a throttling support method
Clean up overseer modules slightly and provide a throttling support method Summary: Ref T12298. Make it slightly easier to write overseer modules: - You don't need to implement shouldReloadDaemons(), since you can now just implement shouldWakePool() and get useful behavior. - It looks like most modules want to do checks only every X seconds, where X is at least somewhat module/use-case dependent, so add a helper for that. Test Plan: Ran `bin/phd debug pull --trace`, saw config check fire properly every 10 seconds after updating it to use `shouldThrottle()` in the next change. Reviewers: chad Reviewed By: chad Maniphest Tasks: T12298 Differential Revision: https://secure.phabricator.com/D17539
PHP
apache-2.0
aik099/libphutil,aik099/libphutil,r4nt/libphutil,Khan/libphutil,codelabs-ch/libphutil,uhd-urz/libphutil,uber/libphutil,wikimedia/phabricator-libphutil,devurandom/libphutil,codelabs-ch/libphutil,r4nt/libphutil,wikimedia/phabricator-libphutil,uhd-urz/libphutil,Khan/libphutil,codelabs-ch/libphutil,uhd-urz/libphutil,r4nt/libphutil,Khan/libphutil,devurandom/libphutil,devurandom/libphutil,aik099/libphutil,wikimedia/phabricator-libphutil,uber/libphutil,uber/libphutil
php
## Code Before: <?php /** * Overseer modules allow daemons to be externally influenced. * * See @{class:PhabricatorDaemonOverseerModule} for a concrete example. */ abstract class PhutilDaemonOverseerModule extends Phobject { /** * This method is used to indicate to the overseer that daemons should reload. * * @return bool True if the daemons should reload, otherwise false. */ abstract public function shouldReloadDaemons(); /** * Should a hibernating daemon pool be awoken immediately? * * @return bool True to awaken the pool immediately. */ public function shouldWakePool(PhutilDaemonPool $pool) { return false; } public static function getAllModules() { return id(new PhutilClassMapQuery()) ->setAncestorClass(__CLASS__) ->execute(); } } ## Instruction: Clean up overseer modules slightly and provide a throttling support method Summary: Ref T12298. Make it slightly easier to write overseer modules: - You don't need to implement shouldReloadDaemons(), since you can now just implement shouldWakePool() and get useful behavior. - It looks like most modules want to do checks only every X seconds, where X is at least somewhat module/use-case dependent, so add a helper for that. Test Plan: Ran `bin/phd debug pull --trace`, saw config check fire properly every 10 seconds after updating it to use `shouldThrottle()` in the next change. Reviewers: chad Reviewed By: chad Maniphest Tasks: T12298 Differential Revision: https://secure.phabricator.com/D17539 ## Code After: <?php /** * Overseer modules allow daemons to be externally influenced. * * See @{class:PhabricatorDaemonOverseerModule} for a concrete example. */ abstract class PhutilDaemonOverseerModule extends Phobject { private $throttles = array(); /** * This method is used to indicate to the overseer that daemons should reload. * * @return bool True if the daemons should reload, otherwise false. */ public function shouldReloadDaemons() { return false; } /** * Should a hibernating daemon pool be awoken immediately? * * @return bool True to awaken the pool immediately. */ public function shouldWakePool(PhutilDaemonPool $pool) { return false; } public static function getAllModules() { return id(new PhutilClassMapQuery()) ->setAncestorClass(__CLASS__) ->execute(); } /** * Throttle checks from executing too often. * * If you throttle a check like this, it will only execute once every 2.5 * seconds: * * if ($this->shouldThrottle('some.check', 2.5)) { * return; * } * * @param string Throttle key. * @param float Duration in seconds. * @return bool True to throttle the check. */ protected function shouldThrottle($name, $duration) { $throttle = idx($this->throttles, $name, 0); $now = microtime(true); // If not enough time has elapsed, throttle the check. $elapsed = ($now - $throttle); if ($elapsed < $duration) { return true; } // Otherwise, mark the current time as the last time we ran the check, // then let it continue. $this->throttles[$name] = $now; return false; } }
<?php /** * Overseer modules allow daemons to be externally influenced. * * See @{class:PhabricatorDaemonOverseerModule} for a concrete example. */ abstract class PhutilDaemonOverseerModule extends Phobject { + private $throttles = array(); + /** * This method is used to indicate to the overseer that daemons should reload. * * @return bool True if the daemons should reload, otherwise false. */ - abstract public function shouldReloadDaemons(); ? --------- ^ + public function shouldReloadDaemons() { ? ^^ + return false; + } /** * Should a hibernating daemon pool be awoken immediately? * * @return bool True to awaken the pool immediately. */ public function shouldWakePool(PhutilDaemonPool $pool) { return false; } public static function getAllModules() { return id(new PhutilClassMapQuery()) ->setAncestorClass(__CLASS__) ->execute(); } + + /** + * Throttle checks from executing too often. + * + * If you throttle a check like this, it will only execute once every 2.5 + * seconds: + * + * if ($this->shouldThrottle('some.check', 2.5)) { + * return; + * } + * + * @param string Throttle key. + * @param float Duration in seconds. + * @return bool True to throttle the check. + */ + protected function shouldThrottle($name, $duration) { + $throttle = idx($this->throttles, $name, 0); + $now = microtime(true); + + // If not enough time has elapsed, throttle the check. + $elapsed = ($now - $throttle); + if ($elapsed < $duration) { + return true; + } + + // Otherwise, mark the current time as the last time we ran the check, + // then let it continue. + $this->throttles[$name] = $now; + + return false; + } + }
38
1.085714
37
1
3f52933f6a8a5bac24275b5f1a540cab98d0529f
packages/strapi-plugin-i18n/admin/src/utils/localizedFields.js
packages/strapi-plugin-i18n/admin/src/utils/localizedFields.js
const LOCALIZED_FIELDS = [ 'text', 'string', 'richtext', 'email', 'password', 'media', 'number', 'date', 'json', 'uid', 'component', 'dynamiczone', 'enumeration', ]; export default LOCALIZED_FIELDS;
const LOCALIZED_FIELDS = [ 'biginteger', 'boolean', 'component', 'date', 'datetime', 'decimal', 'dynamiczone', 'email', 'enumeration', 'float', 'integer', 'json', 'media', 'number', 'password', 'richtext', 'string', 'text', 'time', 'uid', ]; export default LOCALIZED_FIELDS;
Add more types to the localisation
Add more types to the localisation Signed-off-by: soupette <0a59f0508aa203bc732745954131d022d9f538a9@gmail.com>
JavaScript
mit
wistityhq/strapi,wistityhq/strapi
javascript
## Code Before: const LOCALIZED_FIELDS = [ 'text', 'string', 'richtext', 'email', 'password', 'media', 'number', 'date', 'json', 'uid', 'component', 'dynamiczone', 'enumeration', ]; export default LOCALIZED_FIELDS; ## Instruction: Add more types to the localisation Signed-off-by: soupette <0a59f0508aa203bc732745954131d022d9f538a9@gmail.com> ## Code After: const LOCALIZED_FIELDS = [ 'biginteger', 'boolean', 'component', 'date', 'datetime', 'decimal', 'dynamiczone', 'email', 'enumeration', 'float', 'integer', 'json', 'media', 'number', 'password', 'richtext', 'string', 'text', 'time', 'uid', ]; export default LOCALIZED_FIELDS;
const LOCALIZED_FIELDS = [ + 'biginteger', + 'boolean', + 'component', - 'text', ? -- + 'date', ? ++ - 'string', - 'richtext', + 'datetime', + 'decimal', + 'dynamiczone', 'email', - 'password', + 'enumeration', + 'float', + 'integer', + 'json', 'media', 'number', + 'password', + 'richtext', + 'string', - 'date', ? -- + 'text', ? ++ - 'json', + 'time', 'uid', - 'component', - 'dynamiczone', - 'enumeration', ]; export default LOCALIZED_FIELDS;
25
1.470588
16
9
101135ac1c72daa4a02a663c7b874ac4b313e450
Modules/Remote/VariationalRegistration.remote.cmake
Modules/Remote/VariationalRegistration.remote.cmake
itk_fetch_module(VariationalRegistration "A module to perform variational image registration. https://hdl.handle.net/10380/3460" GIT_REPOSITORY ${git_protocol}://github.com/InsightSoftwareConsortium/ITKVariationalRegistration.git GIT_TAG da4acc1804e36b8f5b0fcf7179bddcf98679c6d3 )
itk_fetch_module(VariationalRegistration "A module to perform variational image registration. https://hdl.handle.net/10380/3460" GIT_REPOSITORY ${git_protocol}://github.com/InsightSoftwareConsortium/ITKVariationalRegistration.git GIT_TAG 68698e7558761846ae0a9b48c20f9f09f41698ca )
Update VariationalRegistration to remove warnings during compilation
ENH: Update VariationalRegistration to remove warnings during compilation Alexander Schmidt-Richberg (1): BUG: Small bugs fixed to avoid compiler warnings (unsigned int and unused variable). Francois Budin (6): ENH: Replacing test data with MD5 STYLE: Improve style to pass KWStyle test DOC: Missing backslash in documentation COMP: Remove compilation warnings from Visual Studio COMP: Correct warning due to locally defined symbol Merge pull request #9 from fbudin69500/locally_defined_symbol aschmiri (5): Merge pull request #3 from thewtex/doxygen-warnings Merge pull request #4 from fbudin69500/DOC_missing_backslash_in_documentation Merge pull request #5 from fbudin69500/ENH_replace_test_data_with_MD5 Merge pull request #6 from fbudin69500/ImproveStyle Merge pull request #7 from fbudin69500/remove_VS_warning_cherry_picked Change-Id: Ibb959bc84e01edb6c3b78e0984d9f029992ff5af
CMake
apache-2.0
blowekamp/ITK,Kitware/ITK,spinicist/ITK,PlutoniumHeart/ITK,richardbeare/ITK,LucasGandel/ITK,fbudin69500/ITK,LucasGandel/ITK,stnava/ITK,blowekamp/ITK,hjmjohnson/ITK,LucasGandel/ITK,PlutoniumHeart/ITK,LucasGandel/ITK,thewtex/ITK,thewtex/ITK,PlutoniumHeart/ITK,hjmjohnson/ITK,thewtex/ITK,vfonov/ITK,malaterre/ITK,InsightSoftwareConsortium/ITK,fbudin69500/ITK,spinicist/ITK,fbudin69500/ITK,vfonov/ITK,InsightSoftwareConsortium/ITK,Kitware/ITK,LucasGandel/ITK,stnava/ITK,PlutoniumHeart/ITK,stnava/ITK,BRAINSia/ITK,richardbeare/ITK,fbudin69500/ITK,blowekamp/ITK,Kitware/ITK,LucasGandel/ITK,PlutoniumHeart/ITK,richardbeare/ITK,thewtex/ITK,stnava/ITK,vfonov/ITK,malaterre/ITK,vfonov/ITK,vfonov/ITK,fbudin69500/ITK,thewtex/ITK,vfonov/ITK,fbudin69500/ITK,BRAINSia/ITK,malaterre/ITK,hjmjohnson/ITK,InsightSoftwareConsortium/ITK,malaterre/ITK,InsightSoftwareConsortium/ITK,thewtex/ITK,spinicist/ITK,Kitware/ITK,BRAINSia/ITK,Kitware/ITK,stnava/ITK,malaterre/ITK,vfonov/ITK,spinicist/ITK,richardbeare/ITK,vfonov/ITK,BRAINSia/ITK,hjmjohnson/ITK,spinicist/ITK,stnava/ITK,InsightSoftwareConsortium/ITK,stnava/ITK,blowekamp/ITK,hjmjohnson/ITK,richardbeare/ITK,richardbeare/ITK,blowekamp/ITK,stnava/ITK,richardbeare/ITK,spinicist/ITK,malaterre/ITK,malaterre/ITK,thewtex/ITK,PlutoniumHeart/ITK,malaterre/ITK,stnava/ITK,spinicist/ITK,malaterre/ITK,LucasGandel/ITK,PlutoniumHeart/ITK,hjmjohnson/ITK,LucasGandel/ITK,spinicist/ITK,BRAINSia/ITK,BRAINSia/ITK,blowekamp/ITK,Kitware/ITK,Kitware/ITK,InsightSoftwareConsortium/ITK,vfonov/ITK,PlutoniumHeart/ITK,blowekamp/ITK,fbudin69500/ITK,blowekamp/ITK,spinicist/ITK,hjmjohnson/ITK,fbudin69500/ITK,BRAINSia/ITK,InsightSoftwareConsortium/ITK
cmake
## Code Before: itk_fetch_module(VariationalRegistration "A module to perform variational image registration. https://hdl.handle.net/10380/3460" GIT_REPOSITORY ${git_protocol}://github.com/InsightSoftwareConsortium/ITKVariationalRegistration.git GIT_TAG da4acc1804e36b8f5b0fcf7179bddcf98679c6d3 ) ## Instruction: ENH: Update VariationalRegistration to remove warnings during compilation Alexander Schmidt-Richberg (1): BUG: Small bugs fixed to avoid compiler warnings (unsigned int and unused variable). Francois Budin (6): ENH: Replacing test data with MD5 STYLE: Improve style to pass KWStyle test DOC: Missing backslash in documentation COMP: Remove compilation warnings from Visual Studio COMP: Correct warning due to locally defined symbol Merge pull request #9 from fbudin69500/locally_defined_symbol aschmiri (5): Merge pull request #3 from thewtex/doxygen-warnings Merge pull request #4 from fbudin69500/DOC_missing_backslash_in_documentation Merge pull request #5 from fbudin69500/ENH_replace_test_data_with_MD5 Merge pull request #6 from fbudin69500/ImproveStyle Merge pull request #7 from fbudin69500/remove_VS_warning_cherry_picked Change-Id: Ibb959bc84e01edb6c3b78e0984d9f029992ff5af ## Code After: itk_fetch_module(VariationalRegistration "A module to perform variational image registration. https://hdl.handle.net/10380/3460" GIT_REPOSITORY ${git_protocol}://github.com/InsightSoftwareConsortium/ITKVariationalRegistration.git GIT_TAG 68698e7558761846ae0a9b48c20f9f09f41698ca )
itk_fetch_module(VariationalRegistration "A module to perform variational image registration. https://hdl.handle.net/10380/3460" GIT_REPOSITORY ${git_protocol}://github.com/InsightSoftwareConsortium/ITKVariationalRegistration.git - GIT_TAG da4acc1804e36b8f5b0fcf7179bddcf98679c6d3 + GIT_TAG 68698e7558761846ae0a9b48c20f9f09f41698ca )
2
0.4
1
1
3973b0ca27e1aaf6d31ea1c257f2c6b27cf2ba72
Authentication/NoopAuthenticationManager.php
Authentication/NoopAuthenticationManager.php
<?php /* * This file is part of the Symfony package. * * (c) Fabien Potencier <fabien@symfony.com> * * For the full copyright and license information, please view the LICENSE * file that was distributed with this source code. */ namespace Symfony\Component\Security\Http\Authentication; use Symfony\Component\Security\Core\Authentication\AuthenticationManagerInterface; use Symfony\Component\Security\Core\Authentication\Token\TokenInterface; /** * This class is used when the authenticator system is activated. * * This is used to not break AuthenticationChecker and ContextListener when * using the authenticator system. Once the authenticator system is no longer * experimental, this class can be used to trigger deprecation notices. * * @internal * * @author Wouter de Jong <wouter@wouterj.nl> */ class NoopAuthenticationManager implements AuthenticationManagerInterface { public function authenticate(TokenInterface $token) { return $token; } }
<?php /* * This file is part of the Symfony package. * * (c) Fabien Potencier <fabien@symfony.com> * * For the full copyright and license information, please view the LICENSE * file that was distributed with this source code. */ namespace Symfony\Component\Security\Http\Authentication; use Symfony\Component\Security\Core\Authentication\AuthenticationManagerInterface; use Symfony\Component\Security\Core\Authentication\Token\TokenInterface; /** * This class is used when the authenticator system is activated. * * This is used to not break AuthenticationChecker and ContextListener when * using the authenticator system. Once the authenticator system is no longer * experimental, this class can be used to trigger deprecation notices. * * @author Wouter de Jong <wouter@wouterj.nl> * * @internal */ class NoopAuthenticationManager implements AuthenticationManagerInterface { public function authenticate(TokenInterface $token): TokenInterface { return $token; } }
Add some missing return types to internal/final classes
Add some missing return types to internal/final classes
PHP
mit
symfony/security-http
php
## Code Before: <?php /* * This file is part of the Symfony package. * * (c) Fabien Potencier <fabien@symfony.com> * * For the full copyright and license information, please view the LICENSE * file that was distributed with this source code. */ namespace Symfony\Component\Security\Http\Authentication; use Symfony\Component\Security\Core\Authentication\AuthenticationManagerInterface; use Symfony\Component\Security\Core\Authentication\Token\TokenInterface; /** * This class is used when the authenticator system is activated. * * This is used to not break AuthenticationChecker and ContextListener when * using the authenticator system. Once the authenticator system is no longer * experimental, this class can be used to trigger deprecation notices. * * @internal * * @author Wouter de Jong <wouter@wouterj.nl> */ class NoopAuthenticationManager implements AuthenticationManagerInterface { public function authenticate(TokenInterface $token) { return $token; } } ## Instruction: Add some missing return types to internal/final classes ## Code After: <?php /* * This file is part of the Symfony package. * * (c) Fabien Potencier <fabien@symfony.com> * * For the full copyright and license information, please view the LICENSE * file that was distributed with this source code. */ namespace Symfony\Component\Security\Http\Authentication; use Symfony\Component\Security\Core\Authentication\AuthenticationManagerInterface; use Symfony\Component\Security\Core\Authentication\Token\TokenInterface; /** * This class is used when the authenticator system is activated. * * This is used to not break AuthenticationChecker and ContextListener when * using the authenticator system. Once the authenticator system is no longer * experimental, this class can be used to trigger deprecation notices. * * @author Wouter de Jong <wouter@wouterj.nl> * * @internal */ class NoopAuthenticationManager implements AuthenticationManagerInterface { public function authenticate(TokenInterface $token): TokenInterface { return $token; } }
<?php /* * This file is part of the Symfony package. * * (c) Fabien Potencier <fabien@symfony.com> * * For the full copyright and license information, please view the LICENSE * file that was distributed with this source code. */ namespace Symfony\Component\Security\Http\Authentication; use Symfony\Component\Security\Core\Authentication\AuthenticationManagerInterface; use Symfony\Component\Security\Core\Authentication\Token\TokenInterface; /** * This class is used when the authenticator system is activated. * * This is used to not break AuthenticationChecker and ContextListener when * using the authenticator system. Once the authenticator system is no longer * experimental, this class can be used to trigger deprecation notices. * + * @author Wouter de Jong <wouter@wouterj.nl> + * * @internal - * - * @author Wouter de Jong <wouter@wouterj.nl> */ class NoopAuthenticationManager implements AuthenticationManagerInterface { - public function authenticate(TokenInterface $token) + public function authenticate(TokenInterface $token): TokenInterface ? ++++++++++++++++ { return $token; } }
6
0.176471
3
3
48a16b1c53139bccce4c5d7cb62a9fc72f580380
config/prisons/SHI-stoke-heath.yml
config/prisons/SHI-stoke-heath.yml
--- name: Stoke Heath nomis_id: SHI address: - Market Drayton - Shropshire - TF9 2JL adult_age: 10 email: visitsbooking.westmidlands@noms.gsi.gov.uk enabled: true estate: Stoke Heath phone: 0300 060 6506 slots: mon: - 1400-1600 tue: - 1400-1600 wed: - 1400-1600 thu: - 1400-1600 sat: - 1400-1600 sun: - 1400-1600 unbookable: - 2014-04-18 - 2014-04-21 - 2014-05-05 - 2014-12-25 - 2014-12-26 - 2015-01-01
--- name: Stoke Heath nomis_id: SHI address: - Market Drayton - Shropshire - TF9 2JL adult_age: 10 email: visitsbooking.westmidlands@noms.gsi.gov.uk enabled: true estate: Stoke Heath phone: 0300 060 6506 slots: mon: - 1400-1600 tue: - 1400-1600 wed: - 1400-1600 thu: - 1400-1600 sat: - 1400-1600 sun: - 1400-1600 unbookable: - 2014-04-18 - 2014-04-21 - 2014-05-05 - 2014-12-25 - 2014-12-26 - 2015-01-01 - 2015-12-25 - 2015-12-26 - 2015-12-28 - 2016-01-01
Update Stoke Heath Christmas visit slots
Update Stoke Heath Christmas visit slots Unbookable: - Christmas Day - Boxing Day - 28th December - New Year's Day
YAML
mit
ministryofjustice/prison-visits,ministryofjustice/prison-visits,ministryofjustice/prison-visits
yaml
## Code Before: --- name: Stoke Heath nomis_id: SHI address: - Market Drayton - Shropshire - TF9 2JL adult_age: 10 email: visitsbooking.westmidlands@noms.gsi.gov.uk enabled: true estate: Stoke Heath phone: 0300 060 6506 slots: mon: - 1400-1600 tue: - 1400-1600 wed: - 1400-1600 thu: - 1400-1600 sat: - 1400-1600 sun: - 1400-1600 unbookable: - 2014-04-18 - 2014-04-21 - 2014-05-05 - 2014-12-25 - 2014-12-26 - 2015-01-01 ## Instruction: Update Stoke Heath Christmas visit slots Unbookable: - Christmas Day - Boxing Day - 28th December - New Year's Day ## Code After: --- name: Stoke Heath nomis_id: SHI address: - Market Drayton - Shropshire - TF9 2JL adult_age: 10 email: visitsbooking.westmidlands@noms.gsi.gov.uk enabled: true estate: Stoke Heath phone: 0300 060 6506 slots: mon: - 1400-1600 tue: - 1400-1600 wed: - 1400-1600 thu: - 1400-1600 sat: - 1400-1600 sun: - 1400-1600 unbookable: - 2014-04-18 - 2014-04-21 - 2014-05-05 - 2014-12-25 - 2014-12-26 - 2015-01-01 - 2015-12-25 - 2015-12-26 - 2015-12-28 - 2016-01-01
--- name: Stoke Heath nomis_id: SHI address: - Market Drayton - Shropshire - TF9 2JL adult_age: 10 email: visitsbooking.westmidlands@noms.gsi.gov.uk enabled: true estate: Stoke Heath phone: 0300 060 6506 slots: mon: - 1400-1600 tue: - 1400-1600 wed: - 1400-1600 thu: - 1400-1600 sat: - 1400-1600 sun: - 1400-1600 unbookable: - 2014-04-18 - 2014-04-21 - 2014-05-05 - 2014-12-25 - 2014-12-26 - 2015-01-01 + - 2015-12-25 + - 2015-12-26 + - 2015-12-28 + - 2016-01-01
4
0.125
4
0
ee441bad57effc7d5c774d29945567cc5beade1e
lib/logster/configuration.rb
lib/logster/configuration.rb
module Logster class Configuration attr_accessor :current_context, :allow_grouping, :environments, :application_version, :web_title attr_writer :subdirectory def initialize # lambda |env,block| @current_context = lambda{ |_, &block| block.call } @environments = [:development, :production] @subdirectory = nil @allow_grouping = false if defined?(::Rails) && ::Rails.env.production? @allow_grouping = true end end def subdirectory @subdirectory || '/logs' end end end
module Logster class Configuration attr_accessor :current_context, :allow_grouping, :environments, :application_version, :web_title attr_writer :subdirectory def initialize # lambda |env,block| @current_context = lambda{ |_, &block| block.call } @environments = [:development, :production] @subdirectory = nil @allow_grouping = false if defined?(::Rails) && defined?(::Rails.env) && ::Rails.env.production? @allow_grouping = true end end def subdirectory @subdirectory || '/logs' end end end
Check that `Rails.env` is defined as well.
FIX: Check that `Rails.env` is defined as well.
Ruby
mit
discourse/logster,discourse/logster,discourse/logster,discourse/logster
ruby
## Code Before: module Logster class Configuration attr_accessor :current_context, :allow_grouping, :environments, :application_version, :web_title attr_writer :subdirectory def initialize # lambda |env,block| @current_context = lambda{ |_, &block| block.call } @environments = [:development, :production] @subdirectory = nil @allow_grouping = false if defined?(::Rails) && ::Rails.env.production? @allow_grouping = true end end def subdirectory @subdirectory || '/logs' end end end ## Instruction: FIX: Check that `Rails.env` is defined as well. ## Code After: module Logster class Configuration attr_accessor :current_context, :allow_grouping, :environments, :application_version, :web_title attr_writer :subdirectory def initialize # lambda |env,block| @current_context = lambda{ |_, &block| block.call } @environments = [:development, :production] @subdirectory = nil @allow_grouping = false if defined?(::Rails) && defined?(::Rails.env) && ::Rails.env.production? @allow_grouping = true end end def subdirectory @subdirectory || '/logs' end end end
module Logster class Configuration attr_accessor :current_context, :allow_grouping, :environments, :application_version, :web_title attr_writer :subdirectory def initialize # lambda |env,block| @current_context = lambda{ |_, &block| block.call } @environments = [:development, :production] @subdirectory = nil @allow_grouping = false + - if defined?(::Rails) && ::Rails.env.production? + if defined?(::Rails) && defined?(::Rails.env) && ::Rails.env.production? ? +++++++++++++++++++++++++ @allow_grouping = true end end def subdirectory @subdirectory || '/logs' end end end
3
0.125
2
1
448525e3dd70ec160d671e5ee499ccc286bce3ed
json.php
json.php
<?php error_reporting(E_ALL & ~E_NOTICE); foreach (glob("functions/*.php") as $filename) { include $filename; } if ( isset($_GET['host']) && !empty($_GET['host'])) { $data = []; $hostname = mb_strtolower(get($_GET['host'])); $host = parse_hostname($hostname); if ($host['port']) { $port = $host['port']; } else { $port = get($_GET['port'], '443'); } $host = $host['hostname']; if ( !is_numeric($port) ) { $port = 443; } $data["data"] = check_json($host,$port); } elseif(isset($_GET['csr']) && !empty($_GET['csr'])) { $data["data"]["chain"]["1"] = csr_parse_json($_GET['csr']); } else { $data["error"] = ["Host is required"]; } $data = utf8encodeNestedArray($data); if(isset($data["data"]["error"])) { $data["error"] = $data["data"]["error"]; unset($data["data"]); } if ($_GET["type"] == "pretty") { header('Content-Type: text/html'); echo "<pre>"; echo htmlspecialchars(json_encode($data,JSON_PRETTY_PRINT)); echo "</pre>"; ?> <? } else { header('Content-Type: application/json'); echo json_encode($data); } ?>
<?php error_reporting(E_ALL & ~E_NOTICE); foreach (glob("functions/*.php") as $filename) { include $filename; } if ( isset($_GET['host']) && !empty($_GET['host'])) { $data = []; $hostname = mb_strtolower(get($_GET['host'])); $host = parse_hostname($hostname); if ($host['port']) { $port = $host['port']; } else { $port = get($_GET['port'], '443'); } $host = $host['hostname']; if ( !is_numeric($port) ) { $port = 443; } $data["data"] = check_json($host,$port); } elseif(isset($_GET['csr']) && !empty($_GET['csr'])) { $data["data"]["chain"]["1"] = csr_parse_json($_GET['csr']); } else { $data["error"] = ["Host is required"]; } $data = utf8encodeNestedArray($data); if(isset($data["data"]["error"])) { $data["error"] = $data["data"]["error"]; unset($data["data"]); } if ($_GET["type"] == "pretty") { header('Content-Type: text/html'); echo "<pre>"; echo htmlspecialchars(json_encode($data,JSON_PRETTY_PRINT)); echo "</pre>"; } else { header('Content-Type: application/json'); echo json_encode($data); } ?>
Remove unused php open and close tags
Remove unused php open and close tags
PHP
agpl-3.0
tomwijnroks/ssl-decoder,tomwijnroks/ssl-decoder,tomwijnroks/ssl-decoder,tomwijnroks/ssl-decoder,tomwijnroks/ssl-decoder
php
## Code Before: <?php error_reporting(E_ALL & ~E_NOTICE); foreach (glob("functions/*.php") as $filename) { include $filename; } if ( isset($_GET['host']) && !empty($_GET['host'])) { $data = []; $hostname = mb_strtolower(get($_GET['host'])); $host = parse_hostname($hostname); if ($host['port']) { $port = $host['port']; } else { $port = get($_GET['port'], '443'); } $host = $host['hostname']; if ( !is_numeric($port) ) { $port = 443; } $data["data"] = check_json($host,$port); } elseif(isset($_GET['csr']) && !empty($_GET['csr'])) { $data["data"]["chain"]["1"] = csr_parse_json($_GET['csr']); } else { $data["error"] = ["Host is required"]; } $data = utf8encodeNestedArray($data); if(isset($data["data"]["error"])) { $data["error"] = $data["data"]["error"]; unset($data["data"]); } if ($_GET["type"] == "pretty") { header('Content-Type: text/html'); echo "<pre>"; echo htmlspecialchars(json_encode($data,JSON_PRETTY_PRINT)); echo "</pre>"; ?> <? } else { header('Content-Type: application/json'); echo json_encode($data); } ?> ## Instruction: Remove unused php open and close tags ## Code After: <?php error_reporting(E_ALL & ~E_NOTICE); foreach (glob("functions/*.php") as $filename) { include $filename; } if ( isset($_GET['host']) && !empty($_GET['host'])) { $data = []; $hostname = mb_strtolower(get($_GET['host'])); $host = parse_hostname($hostname); if ($host['port']) { $port = $host['port']; } else { $port = get($_GET['port'], '443'); } $host = $host['hostname']; if ( !is_numeric($port) ) { $port = 443; } $data["data"] = check_json($host,$port); } elseif(isset($_GET['csr']) && !empty($_GET['csr'])) { $data["data"]["chain"]["1"] = csr_parse_json($_GET['csr']); } else { $data["error"] = ["Host is required"]; } $data = utf8encodeNestedArray($data); if(isset($data["data"]["error"])) { $data["error"] = $data["data"]["error"]; unset($data["data"]); } if ($_GET["type"] == "pretty") { header('Content-Type: text/html'); echo "<pre>"; echo htmlspecialchars(json_encode($data,JSON_PRETTY_PRINT)); echo "</pre>"; } else { header('Content-Type: application/json'); echo json_encode($data); } ?>
<?php error_reporting(E_ALL & ~E_NOTICE); foreach (glob("functions/*.php") as $filename) { include $filename; } if ( isset($_GET['host']) && !empty($_GET['host'])) { $data = []; $hostname = mb_strtolower(get($_GET['host'])); $host = parse_hostname($hostname); if ($host['port']) { $port = $host['port']; } else { $port = get($_GET['port'], '443'); } $host = $host['hostname']; if ( !is_numeric($port) ) { $port = 443; } $data["data"] = check_json($host,$port); } elseif(isset($_GET['csr']) && !empty($_GET['csr'])) { $data["data"]["chain"]["1"] = csr_parse_json($_GET['csr']); } else { $data["error"] = ["Host is required"]; } $data = utf8encodeNestedArray($data); if(isset($data["data"]["error"])) { $data["error"] = $data["data"]["error"]; unset($data["data"]); } if ($_GET["type"] == "pretty") { header('Content-Type: text/html'); echo "<pre>"; echo htmlspecialchars(json_encode($data,JSON_PRETTY_PRINT)); echo "</pre>"; - ?> - - <? } else { header('Content-Type: application/json'); echo json_encode($data); } ?>
3
0.058824
0
3
6b338b1a04ef2b88fd356cc6605c722c2400b6d5
osx.md
osx.md
--- layout: default title: Mac OS X Installation --- # Mac OS X Installation Procedure ## getpapers 1. Download npm and node using brew. See: [brew](http://brew.sh/) `brew install node` 1. run `npm install --global getpapers` either as root or with sudo enabled ## norma {% include norma-from-zip.md %} --- # Mac OS X Operating Procedure ## getpapers {% include run-getpapers.md %} ## norma {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-norma.md %} ## ami {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-ami.md %}
--- layout: default title: Mac OS X Installation --- # Mac OS X Installation Procedure ## getpapers 1. Download npm and node using brew. See: [brew](http://brew.sh/) `brew install node` 1. run `npm install --global getpapers` either as root or with sudo enabled ## norma {% include norma-from-zip.md %} ## ami {% include ami-from-zip.md %} --- # Mac OS X Operating Procedure ## getpapers {% include run-getpapers.md %} ## norma {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-norma.md %} ## ami {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-ami.md %}
Include missing OSX ami installation
Include missing OSX ami installation
Markdown
mit
tarrow/CMInstall,ContentMine/contentmine.github.io
markdown
## Code Before: --- layout: default title: Mac OS X Installation --- # Mac OS X Installation Procedure ## getpapers 1. Download npm and node using brew. See: [brew](http://brew.sh/) `brew install node` 1. run `npm install --global getpapers` either as root or with sudo enabled ## norma {% include norma-from-zip.md %} --- # Mac OS X Operating Procedure ## getpapers {% include run-getpapers.md %} ## norma {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-norma.md %} ## ami {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-ami.md %} ## Instruction: Include missing OSX ami installation ## Code After: --- layout: default title: Mac OS X Installation --- # Mac OS X Installation Procedure ## getpapers 1. Download npm and node using brew. See: [brew](http://brew.sh/) `brew install node` 1. run `npm install --global getpapers` either as root or with sudo enabled ## norma {% include norma-from-zip.md %} ## ami {% include ami-from-zip.md %} --- # Mac OS X Operating Procedure ## getpapers {% include run-getpapers.md %} ## norma {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-norma.md %} ## ami {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-ami.md %}
--- layout: default title: Mac OS X Installation --- # Mac OS X Installation Procedure ## getpapers 1. Download npm and node using brew. See: [brew](http://brew.sh/) `brew install node` 1. run `npm install --global getpapers` either as root or with sudo enabled ## norma {% include norma-from-zip.md %} + + ## ami + {% include ami-from-zip.md %} --- # Mac OS X Operating Procedure ## getpapers {% include run-getpapers.md %} ## norma {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-norma.md %} ## ami {% include check-java-generic.md %} {% include mac-install-java.md %} {% include run-ami.md %}
3
0.1
3
0
b51e0ff9407f8a609be580d8fcb9cad6cfd267d8
setup.py
setup.py
try: from setuptools.core import setup except ImportError: from distutils.core import setup PACKAGE = 'django-render-as' VERSION = '1.1' package_data = { 'render_as': [ 'templates/avoid_clash_with_real_app/*.html', 'templates/render_as/*.html', ], } setup( name=PACKAGE, version=VERSION, description="Template rendering indirector based on object class", packages=[ 'render_as', 'render_as/templatetags', ], package_data=package_data, license='MIT', author='James Aylett', author_email='james@tartarus.org', install_requires=[ 'Django~=1.10', ], classifiers=[ 'Intended Audience :: Developers', 'Framework :: Django', 'License :: OSI Approved :: MIT License', 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 3', ], )
try: from setuptools.core import setup except ImportError: from distutils.core import setup PACKAGE = 'django-render-as' VERSION = '1.2' package_data = { 'render_as': [ 'test_templates/avoid_clash_with_real_app/*.html', 'test_templates/render_as/*.html', ], } setup( name=PACKAGE, version=VERSION, description="Template rendering indirector based on object class", packages=[ 'render_as', 'render_as/templatetags', ], package_data=package_data, license='MIT', author='James Aylett', author_email='james@tartarus.org', install_requires=[ 'Django~=1.10', ], classifiers=[ 'Intended Audience :: Developers', 'Framework :: Django', 'License :: OSI Approved :: MIT License', 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 3', ], )
Include test templates in distributions.
Include test templates in distributions. This probably wasn't working before, although apparently I didn't notice. But then no one really runs tests for their 3PA, do they? This is v1.2.
Python
mit
jaylett/django-render-as,jaylett/django-render-as
python
## Code Before: try: from setuptools.core import setup except ImportError: from distutils.core import setup PACKAGE = 'django-render-as' VERSION = '1.1' package_data = { 'render_as': [ 'templates/avoid_clash_with_real_app/*.html', 'templates/render_as/*.html', ], } setup( name=PACKAGE, version=VERSION, description="Template rendering indirector based on object class", packages=[ 'render_as', 'render_as/templatetags', ], package_data=package_data, license='MIT', author='James Aylett', author_email='james@tartarus.org', install_requires=[ 'Django~=1.10', ], classifiers=[ 'Intended Audience :: Developers', 'Framework :: Django', 'License :: OSI Approved :: MIT License', 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 3', ], ) ## Instruction: Include test templates in distributions. This probably wasn't working before, although apparently I didn't notice. But then no one really runs tests for their 3PA, do they? This is v1.2. ## Code After: try: from setuptools.core import setup except ImportError: from distutils.core import setup PACKAGE = 'django-render-as' VERSION = '1.2' package_data = { 'render_as': [ 'test_templates/avoid_clash_with_real_app/*.html', 'test_templates/render_as/*.html', ], } setup( name=PACKAGE, version=VERSION, description="Template rendering indirector based on object class", packages=[ 'render_as', 'render_as/templatetags', ], package_data=package_data, license='MIT', author='James Aylett', author_email='james@tartarus.org', install_requires=[ 'Django~=1.10', ], classifiers=[ 'Intended Audience :: Developers', 'Framework :: Django', 'License :: OSI Approved :: MIT License', 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 3', ], )
try: from setuptools.core import setup except ImportError: from distutils.core import setup PACKAGE = 'django-render-as' - VERSION = '1.1' ? ^ + VERSION = '1.2' ? ^ package_data = { 'render_as': [ - 'templates/avoid_clash_with_real_app/*.html', + 'test_templates/avoid_clash_with_real_app/*.html', ? +++++ - 'templates/render_as/*.html', + 'test_templates/render_as/*.html', ? +++++ ], } setup( name=PACKAGE, version=VERSION, description="Template rendering indirector based on object class", packages=[ 'render_as', 'render_as/templatetags', ], package_data=package_data, license='MIT', author='James Aylett', author_email='james@tartarus.org', install_requires=[ 'Django~=1.10', ], classifiers=[ 'Intended Audience :: Developers', 'Framework :: Django', 'License :: OSI Approved :: MIT License', 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 3', ], )
6
0.162162
3
3
b18bdf11141cf47319eed9ba2b861ebc287cf5ff
pyqs/utils.py
pyqs/utils.py
import base64 import json import pickle def decode_message(message): message_body = message.get_body() json_body = json.loads(message_body) if 'task' in message_body: return json_body else: # Fallback to processing celery messages return decode_celery_message(json_body) def decode_celery_message(json_task): message = base64.decodestring(json_task['body']) return pickle.loads(message) def function_to_import_path(function): return "{}.{}".format(function.__module__, function.func_name)
import base64 import json import pickle def decode_message(message): message_body = message.get_body() json_body = json.loads(message_body) if 'task' in message_body: return json_body else: # Fallback to processing celery messages return decode_celery_message(json_body) def decode_celery_message(json_task): message = base64.decodestring(json_task['body']) try: return json.loads(message) except ValueError: pass return pickle.loads(message) def function_to_import_path(function): return "{}.{}".format(function.__module__, function.func_name)
Add fallback for loading json encoded celery messages
Add fallback for loading json encoded celery messages
Python
mit
spulec/PyQS
python
## Code Before: import base64 import json import pickle def decode_message(message): message_body = message.get_body() json_body = json.loads(message_body) if 'task' in message_body: return json_body else: # Fallback to processing celery messages return decode_celery_message(json_body) def decode_celery_message(json_task): message = base64.decodestring(json_task['body']) return pickle.loads(message) def function_to_import_path(function): return "{}.{}".format(function.__module__, function.func_name) ## Instruction: Add fallback for loading json encoded celery messages ## Code After: import base64 import json import pickle def decode_message(message): message_body = message.get_body() json_body = json.loads(message_body) if 'task' in message_body: return json_body else: # Fallback to processing celery messages return decode_celery_message(json_body) def decode_celery_message(json_task): message = base64.decodestring(json_task['body']) try: return json.loads(message) except ValueError: pass return pickle.loads(message) def function_to_import_path(function): return "{}.{}".format(function.__module__, function.func_name)
import base64 import json import pickle def decode_message(message): message_body = message.get_body() json_body = json.loads(message_body) if 'task' in message_body: return json_body else: # Fallback to processing celery messages return decode_celery_message(json_body) def decode_celery_message(json_task): message = base64.decodestring(json_task['body']) + try: + return json.loads(message) + except ValueError: + pass return pickle.loads(message) def function_to_import_path(function): return "{}.{}".format(function.__module__, function.func_name)
4
0.181818
4
0
ee3c50d017fd5eaf1082ba4c23649a14a09d4f35
Sources/App/Setup/Droplet+Setup.swift
Sources/App/Setup/Droplet+Setup.swift
// // Copyright © 2017 Patrick Balestra. All rights reserved. // import Vapor public extension Droplet { public func setUp() throws { try setUpRoutes() } /// Configure all routes private func setUpRoutes() throws { // /home let homeController = HomeController() homeController.addRoutes(droplet: self) // /explore let exploreController = ExploreController() exploreController.addRoutes(droplet: self) // /contribute let contributeController = ContributeController() contributeController.addRoutes(droplet: self) // /about let aboutController = AboutController() aboutController.addRoutes(droplet: self) // /manufacturer let manufacturerController = ManufacturerController() manufacturerController.addRoutes(droplet: self) // /donation let donationController = DonationController() donationController.addRoutes(droplet: self) // /report let reportController = ReportController() reportController.addRoutes(droplet: self) // /rss.xml let feedController = FeedController() feedController.addRoutes(droplet: self) } }
// // Copyright © 2017 Patrick Balestra. All rights reserved. // import Vapor public extension Droplet { public func setUp() throws { try setUpRoutes() } /// Configure all routes private func setUpRoutes() throws { // /home let homeController = HomeController() homeController.addRoutes(droplet: self) // /explore let exploreController = ExploreController() exploreController.addRoutes(droplet: self) // /contribute let contributeController = ContributeController() contributeController.addRoutes(droplet: self) // /about let aboutController = AboutController() aboutController.addRoutes(droplet: self) // /manufacturer let manufacturerController = ManufacturerController() manufacturerController.addRoutes(droplet: self) // /donation let donationController = DonationController() donationController.addRoutes(droplet: self) // /report let reportController = ReportController() reportController.addRoutes(droplet: self) // /rss.xml let feedController = FeedController() feedController.addRoutes(droplet: self) // /rss.xml let accessoryController = AccessoryController() accessoryController.addRoutes(droplet: self) } }
Add Accessory controller to droplet
Add Accessory controller to droplet
Swift
mit
BalestraPatrick/HomeKitty,BalestraPatrick/HomeKitty,BalestraPatrick/HomeKitty
swift
## Code Before: // // Copyright © 2017 Patrick Balestra. All rights reserved. // import Vapor public extension Droplet { public func setUp() throws { try setUpRoutes() } /// Configure all routes private func setUpRoutes() throws { // /home let homeController = HomeController() homeController.addRoutes(droplet: self) // /explore let exploreController = ExploreController() exploreController.addRoutes(droplet: self) // /contribute let contributeController = ContributeController() contributeController.addRoutes(droplet: self) // /about let aboutController = AboutController() aboutController.addRoutes(droplet: self) // /manufacturer let manufacturerController = ManufacturerController() manufacturerController.addRoutes(droplet: self) // /donation let donationController = DonationController() donationController.addRoutes(droplet: self) // /report let reportController = ReportController() reportController.addRoutes(droplet: self) // /rss.xml let feedController = FeedController() feedController.addRoutes(droplet: self) } } ## Instruction: Add Accessory controller to droplet ## Code After: // // Copyright © 2017 Patrick Balestra. All rights reserved. // import Vapor public extension Droplet { public func setUp() throws { try setUpRoutes() } /// Configure all routes private func setUpRoutes() throws { // /home let homeController = HomeController() homeController.addRoutes(droplet: self) // /explore let exploreController = ExploreController() exploreController.addRoutes(droplet: self) // /contribute let contributeController = ContributeController() contributeController.addRoutes(droplet: self) // /about let aboutController = AboutController() aboutController.addRoutes(droplet: self) // /manufacturer let manufacturerController = ManufacturerController() manufacturerController.addRoutes(droplet: self) // /donation let donationController = DonationController() donationController.addRoutes(droplet: self) // /report let reportController = ReportController() reportController.addRoutes(droplet: self) // /rss.xml let feedController = FeedController() feedController.addRoutes(droplet: self) // /rss.xml let accessoryController = AccessoryController() accessoryController.addRoutes(droplet: self) } }
// // Copyright © 2017 Patrick Balestra. All rights reserved. // import Vapor public extension Droplet { public func setUp() throws { try setUpRoutes() } /// Configure all routes private func setUpRoutes() throws { // /home let homeController = HomeController() homeController.addRoutes(droplet: self) // /explore let exploreController = ExploreController() exploreController.addRoutes(droplet: self) // /contribute let contributeController = ContributeController() contributeController.addRoutes(droplet: self) // /about let aboutController = AboutController() aboutController.addRoutes(droplet: self) // /manufacturer let manufacturerController = ManufacturerController() manufacturerController.addRoutes(droplet: self) // /donation let donationController = DonationController() donationController.addRoutes(droplet: self) // /report let reportController = ReportController() reportController.addRoutes(droplet: self) // /rss.xml let feedController = FeedController() feedController.addRoutes(droplet: self) + + // /rss.xml + let accessoryController = AccessoryController() + accessoryController.addRoutes(droplet: self) } }
4
0.085106
4
0
1ec67da563f9f3237794776e2c17795481b99260
README.md
README.md
About ===== cc-cli is Command Line Interface(CLI) for [CloudConductor](https://github.com/cloudconductor/cloud_conductor). For more information, please visit [official web site](http://cloudconductor.org/). Requirements ============ Prerequisites ------------- - git - ruby (>= 2.0.0) - rubygems - bundler Quick Start =========== ### Clone github repository and install ```bash git clone https://github.com/cloudconductor/cloud_conductor.git bundle install bundle exec rake install ``` ### Show usage ```bash cc-cli help ``` Copyright and License ===================== Copyright 2014 TIS inc. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. Contact ======== For more information: <http://cloudconductor.org/> Report issues and requests: <https://github.com/cloudconductor/cloud_conductor_cli/issues> Send feedback to: <ccndctr@gmail.com>
About ===== cc-cli is Command Line Interface(CLI) for [CloudConductor](https://github.com/cloudconductor/cloud_conductor). Requirements ============ Prerequisites ------------- - git - ruby (>= 2.0.0) - rubygems - bundler Quick Start =========== ### Clone github repository and install ```bash git clone https://github.com/cloudconductor/cloud_conductor.git bundle install bundle exec rake install ``` ### Show usage ```bash cc-cli help ``` For more information, please visit [official user manual](http://cloudconductor.org/documents/user-manual/conductor-cli). (Japanese only) Copyright and License ===================== Copyright 2014 TIS inc. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. Contact ======== For more information: <http://cloudconductor.org/> Report issues and requests: <https://github.com/cloudconductor/cloud_conductor_cli/issues> Send feedback to: <ccndctr@gmail.com>
Add reference to official user manual
Add reference to official user manual
Markdown
apache-2.0
cloudconductor/cloud_conductor_cli
markdown
## Code Before: About ===== cc-cli is Command Line Interface(CLI) for [CloudConductor](https://github.com/cloudconductor/cloud_conductor). For more information, please visit [official web site](http://cloudconductor.org/). Requirements ============ Prerequisites ------------- - git - ruby (>= 2.0.0) - rubygems - bundler Quick Start =========== ### Clone github repository and install ```bash git clone https://github.com/cloudconductor/cloud_conductor.git bundle install bundle exec rake install ``` ### Show usage ```bash cc-cli help ``` Copyright and License ===================== Copyright 2014 TIS inc. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. Contact ======== For more information: <http://cloudconductor.org/> Report issues and requests: <https://github.com/cloudconductor/cloud_conductor_cli/issues> Send feedback to: <ccndctr@gmail.com> ## Instruction: Add reference to official user manual ## Code After: About ===== cc-cli is Command Line Interface(CLI) for [CloudConductor](https://github.com/cloudconductor/cloud_conductor). Requirements ============ Prerequisites ------------- - git - ruby (>= 2.0.0) - rubygems - bundler Quick Start =========== ### Clone github repository and install ```bash git clone https://github.com/cloudconductor/cloud_conductor.git bundle install bundle exec rake install ``` ### Show usage ```bash cc-cli help ``` For more information, please visit [official user manual](http://cloudconductor.org/documents/user-manual/conductor-cli). (Japanese only) Copyright and License ===================== Copyright 2014 TIS inc. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. Contact ======== For more information: <http://cloudconductor.org/> Report issues and requests: <https://github.com/cloudconductor/cloud_conductor_cli/issues> Send feedback to: <ccndctr@gmail.com>
About ===== cc-cli is Command Line Interface(CLI) for [CloudConductor](https://github.com/cloudconductor/cloud_conductor). - For more information, please visit [official web site](http://cloudconductor.org/). Requirements ============ Prerequisites ------------- - git - ruby (>= 2.0.0) - rubygems - bundler Quick Start =========== ### Clone github repository and install ```bash git clone https://github.com/cloudconductor/cloud_conductor.git bundle install bundle exec rake install ``` ### Show usage ```bash cc-cli help ``` + For more information, please visit [official user manual](http://cloudconductor.org/documents/user-manual/conductor-cli). (Japanese only) + Copyright and License ===================== Copyright 2014 TIS inc. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. Contact ======== For more information: <http://cloudconductor.org/> Report issues and requests: <https://github.com/cloudconductor/cloud_conductor_cli/issues> Send feedback to: <ccndctr@gmail.com>
3
0.04918
2
1
dd29a2cb595ed8097131d8ab1561bb999e95955f
scalejs.visualization-d3/scalejs.visualization-d3/json-helper.js
scalejs.visualization-d3/scalejs.visualization-d3/json-helper.js

/*global define*/ /*jslint devel: true */ /*jslint browser: true */ define([ 'knockout', 'd3', 'd3.colorbrewer' ], function ( ko, d3, colorbrewer ) { "use strict"; });
Add template to jsonhelper file
Add template to jsonhelper file
JavaScript
mit
EikosPartners/scalejs.visualization-d3,EikosPartners/scalejs.visualization-d3,lisovin/scalejs.visualization-d3
javascript
## Code Before:  ## Instruction: Add template to jsonhelper file ## Code After: /*global define*/ /*jslint devel: true */ /*jslint browser: true */ define([ 'knockout', 'd3', 'd3.colorbrewer' ], function ( ko, d3, colorbrewer ) { "use strict"; });
-  + /*global define*/ + /*jslint devel: true */ + /*jslint browser: true */ + define([ + 'knockout', + 'd3', + 'd3.colorbrewer' + ], function ( + ko, + d3, + colorbrewer + ) { + "use strict"; + + });
16
16
15
1
c135a200fc817c2ca7a33aaa799251c4370ae56f
circle.yml
circle.yml
machine: services: - docker timezone: America/Los_Angeles # Override /etc/hosts hosts: circlehost: 127.0.0.1 # Add some environment variables environment: GOPATH: $HOME/go PATH: $GOPATH/bin:$PATH CIRCLE_ENV: test DOCKER_ACCOUNT: infradash DOCKER_EMAIL: docker@infradash.com DOCKER_AUTH: aW5mcmFkYXNoOnd1YzR5NmxiUFpHNA== BUILD_LABEL: $CIRCLE_BUILD_NUM BUILD_DIR: build/bin ## Customize dependencies dependencies: pre: - go version # Set up authentication to Docker Registry - sed "s/<EMAIL>/$DOCKER_EMAIL/;s/<AUTH>/$DOCKER_AUTH/" < ./docker/dockercfg.template > ~/.dockercfg override: - source ./hack/env.sh post: - source ./hack/env.sh && make GODEP=godep all ## Customize test commands test: override: - echo "Running tests." - godep go test ./pkg/... -v -check.vv -logtostderr ## Customize deployment commands deployment: docker: branch: /release\/.*/ commands: - cp $BUILD_DIR/dash $CIRCLE_ARTIFACTS - cp $BUILD_DIR/dash docker/dash - cd docker/dash && make push && cd .. - make deploy-git
machine: services: - docker timezone: America/Los_Angeles # Override /etc/hosts hosts: circlehost: 127.0.0.1 # Add some environment variables environment: GOPATH: $HOME/go PATH: $GOPATH/bin:$PATH CIRCLE_ENV: test DOCKER_ACCOUNT: infradash DOCKER_EMAIL: docker@infradash.com DOCKER_AUTH: aW5mcmFkYXNoOnd1YzR5NmxiUFpHNA== BUILD_LABEL: $CIRCLE_BUILD_NUM BUILD_DIR: build/bin ## Customize dependencies dependencies: pre: - go version # Set up authentication to Docker Registry - sed "s/<EMAIL>/$DOCKER_EMAIL/;s/<AUTH>/$DOCKER_AUTH/" < ./docker/dockercfg.template > ~/.dockercfg override: - source ./hack/env.sh post: - source ./hack/env.sh && make GODEP=godep all - cp $BUILD_DIR/dash $CIRCLE_ARTIFACTS - source ./hack/env.sh && make deploy-git ## Customize test commands test: override: - echo "Running tests." - godep go test ./pkg/... -v -check.vv -logtostderr ## Customize deployment commands deployment: docker: branch: /v[0-9]+(\\.[0-9]+)*/ commands: - cp $BUILD_DIR/dash docker/dash - cd docker/dash && make push && cd ..
Build docker image only in v* branches
Build docker image only in v* branches
YAML
apache-2.0
infradash/dash,infradash/dash,infradash/dash,infradash/dash
yaml
## Code Before: machine: services: - docker timezone: America/Los_Angeles # Override /etc/hosts hosts: circlehost: 127.0.0.1 # Add some environment variables environment: GOPATH: $HOME/go PATH: $GOPATH/bin:$PATH CIRCLE_ENV: test DOCKER_ACCOUNT: infradash DOCKER_EMAIL: docker@infradash.com DOCKER_AUTH: aW5mcmFkYXNoOnd1YzR5NmxiUFpHNA== BUILD_LABEL: $CIRCLE_BUILD_NUM BUILD_DIR: build/bin ## Customize dependencies dependencies: pre: - go version # Set up authentication to Docker Registry - sed "s/<EMAIL>/$DOCKER_EMAIL/;s/<AUTH>/$DOCKER_AUTH/" < ./docker/dockercfg.template > ~/.dockercfg override: - source ./hack/env.sh post: - source ./hack/env.sh && make GODEP=godep all ## Customize test commands test: override: - echo "Running tests." - godep go test ./pkg/... -v -check.vv -logtostderr ## Customize deployment commands deployment: docker: branch: /release\/.*/ commands: - cp $BUILD_DIR/dash $CIRCLE_ARTIFACTS - cp $BUILD_DIR/dash docker/dash - cd docker/dash && make push && cd .. - make deploy-git ## Instruction: Build docker image only in v* branches ## Code After: machine: services: - docker timezone: America/Los_Angeles # Override /etc/hosts hosts: circlehost: 127.0.0.1 # Add some environment variables environment: GOPATH: $HOME/go PATH: $GOPATH/bin:$PATH CIRCLE_ENV: test DOCKER_ACCOUNT: infradash DOCKER_EMAIL: docker@infradash.com DOCKER_AUTH: aW5mcmFkYXNoOnd1YzR5NmxiUFpHNA== BUILD_LABEL: $CIRCLE_BUILD_NUM BUILD_DIR: build/bin ## Customize dependencies dependencies: pre: - go version # Set up authentication to Docker Registry - sed "s/<EMAIL>/$DOCKER_EMAIL/;s/<AUTH>/$DOCKER_AUTH/" < ./docker/dockercfg.template > ~/.dockercfg override: - source ./hack/env.sh post: - source ./hack/env.sh && make GODEP=godep all - cp $BUILD_DIR/dash $CIRCLE_ARTIFACTS - source ./hack/env.sh && make deploy-git ## Customize test commands test: override: - echo "Running tests." - godep go test ./pkg/... -v -check.vv -logtostderr ## Customize deployment commands deployment: docker: branch: /v[0-9]+(\\.[0-9]+)*/ commands: - cp $BUILD_DIR/dash docker/dash - cd docker/dash && make push && cd ..
machine: services: - docker timezone: America/Los_Angeles # Override /etc/hosts hosts: circlehost: 127.0.0.1 # Add some environment variables environment: GOPATH: $HOME/go PATH: $GOPATH/bin:$PATH CIRCLE_ENV: test DOCKER_ACCOUNT: infradash DOCKER_EMAIL: docker@infradash.com DOCKER_AUTH: aW5mcmFkYXNoOnd1YzR5NmxiUFpHNA== BUILD_LABEL: $CIRCLE_BUILD_NUM BUILD_DIR: build/bin ## Customize dependencies dependencies: pre: - go version # Set up authentication to Docker Registry - sed "s/<EMAIL>/$DOCKER_EMAIL/;s/<AUTH>/$DOCKER_AUTH/" < ./docker/dockercfg.template > ~/.dockercfg override: - source ./hack/env.sh post: - source ./hack/env.sh && make GODEP=godep all + - cp $BUILD_DIR/dash $CIRCLE_ARTIFACTS + - source ./hack/env.sh && make deploy-git ## Customize test commands test: override: - echo "Running tests." - godep go test ./pkg/... -v -check.vv -logtostderr - ## Customize deployment commands deployment: docker: - branch: /release\/.*/ + branch: /v[0-9]+(\\.[0-9]+)*/ commands: - - cp $BUILD_DIR/dash $CIRCLE_ARTIFACTS - cp $BUILD_DIR/dash docker/dash - cd docker/dash && make push && cd .. - - make deploy-git
7
0.14
3
4
1c75bc92a8078af405cccae3644553e1220db10b
tests/NamingStrategy/CallbackNamingStrategyTest.php
tests/NamingStrategy/CallbackNamingStrategyTest.php
<?php namespace Tests\FileNamingResolver\NamingStrategy; use FileNamingResolver\FileInfo; use FileNamingResolver\NamingStrategy\CallbackNamingStrategy; /** * @author Victor Bocharsky <bocharsky.bw@gmail.com> */ class CallbackNamingStrategyTest extends \PHPUnit_Framework_TestCase { public function testGetFunc() { $func = function(FileInfo $fileInfo) { return $fileInfo->toString(); }; $strategy = new CallbackNamingStrategy($func); $this->assertSame($func, $strategy->getCallback()); } }
<?php namespace Tests\FileNamingResolver\NamingStrategy; use FileNamingResolver\FileInfo; use FileNamingResolver\NamingStrategy\CallbackNamingStrategy; /** * @author Victor Bocharsky <bocharsky.bw@gmail.com> */ class CallbackNamingStrategyTest extends \PHPUnit_Framework_TestCase { public function testProvideName() { $strategy = new CallbackNamingStrategy(function(FileInfo $srcFileInfo) { return $srcFileInfo ->changePath($srcFileInfo->getPath().'/dir1') ->changeBasename('file1') ; }); $srcFileInfo = new FileInfo('/dir/file.ext'); $dstFileInfo = $strategy->provideName($srcFileInfo); $this->assertInstanceOf('FileNamingResolver\FileInfo', $dstFileInfo); $this->assertEquals('/dir/dir1/file1.ext', $dstFileInfo->toString()); } /** * @expectedException \RuntimeException */ public function testProvideNameException() { $strategy = new CallbackNamingStrategy(function(FileInfo $fileInfo) { return $fileInfo->toString(); }); $srcFileInfo = new FileInfo(__FILE__); $strategy->provideName($srcFileInfo); } public function testGetFunc() { $func = function(FileInfo $fileInfo) { return $fileInfo->toString(); }; $strategy = new CallbackNamingStrategy($func); $this->assertSame($func, $strategy->getCallback()); } }
Cover exception throwing in the CallbackNamingStrategy
Cover exception throwing in the CallbackNamingStrategy
PHP
mit
bocharsky-bw/FileNamingResolver
php
## Code Before: <?php namespace Tests\FileNamingResolver\NamingStrategy; use FileNamingResolver\FileInfo; use FileNamingResolver\NamingStrategy\CallbackNamingStrategy; /** * @author Victor Bocharsky <bocharsky.bw@gmail.com> */ class CallbackNamingStrategyTest extends \PHPUnit_Framework_TestCase { public function testGetFunc() { $func = function(FileInfo $fileInfo) { return $fileInfo->toString(); }; $strategy = new CallbackNamingStrategy($func); $this->assertSame($func, $strategy->getCallback()); } } ## Instruction: Cover exception throwing in the CallbackNamingStrategy ## Code After: <?php namespace Tests\FileNamingResolver\NamingStrategy; use FileNamingResolver\FileInfo; use FileNamingResolver\NamingStrategy\CallbackNamingStrategy; /** * @author Victor Bocharsky <bocharsky.bw@gmail.com> */ class CallbackNamingStrategyTest extends \PHPUnit_Framework_TestCase { public function testProvideName() { $strategy = new CallbackNamingStrategy(function(FileInfo $srcFileInfo) { return $srcFileInfo ->changePath($srcFileInfo->getPath().'/dir1') ->changeBasename('file1') ; }); $srcFileInfo = new FileInfo('/dir/file.ext'); $dstFileInfo = $strategy->provideName($srcFileInfo); $this->assertInstanceOf('FileNamingResolver\FileInfo', $dstFileInfo); $this->assertEquals('/dir/dir1/file1.ext', $dstFileInfo->toString()); } /** * @expectedException \RuntimeException */ public function testProvideNameException() { $strategy = new CallbackNamingStrategy(function(FileInfo $fileInfo) { return $fileInfo->toString(); }); $srcFileInfo = new FileInfo(__FILE__); $strategy->provideName($srcFileInfo); } public function testGetFunc() { $func = function(FileInfo $fileInfo) { return $fileInfo->toString(); }; $strategy = new CallbackNamingStrategy($func); $this->assertSame($func, $strategy->getCallback()); } }
<?php namespace Tests\FileNamingResolver\NamingStrategy; use FileNamingResolver\FileInfo; use FileNamingResolver\NamingStrategy\CallbackNamingStrategy; /** * @author Victor Bocharsky <bocharsky.bw@gmail.com> */ class CallbackNamingStrategyTest extends \PHPUnit_Framework_TestCase { + public function testProvideName() + { + $strategy = new CallbackNamingStrategy(function(FileInfo $srcFileInfo) { + return $srcFileInfo + ->changePath($srcFileInfo->getPath().'/dir1') + ->changeBasename('file1') + ; + }); + $srcFileInfo = new FileInfo('/dir/file.ext'); + $dstFileInfo = $strategy->provideName($srcFileInfo); + $this->assertInstanceOf('FileNamingResolver\FileInfo', $dstFileInfo); + $this->assertEquals('/dir/dir1/file1.ext', $dstFileInfo->toString()); + } + + /** + * @expectedException \RuntimeException + */ + public function testProvideNameException() + { + $strategy = new CallbackNamingStrategy(function(FileInfo $fileInfo) { + return $fileInfo->toString(); + }); + $srcFileInfo = new FileInfo(__FILE__); + $strategy->provideName($srcFileInfo); + } + public function testGetFunc() { $func = function(FileInfo $fileInfo) { return $fileInfo->toString(); }; $strategy = new CallbackNamingStrategy($func); $this->assertSame($func, $strategy->getCallback()); } }
26
1.181818
26
0
7a30c071923c5c54582a534046a1afc6350b2905
index.js
index.js
/* jshint node: true */ 'use strict'; var path = require('path'); var fs = require('fs'); var VersionChecker = require('ember-cli-version-checker'); module.exports = { name: 'ember-modal-dialog', init: function() { var checker = new VersionChecker(this); if (!checker.for('ember-cli', 'npm').isAbove('0.2.6')) { console.warn("Warning: ember-modal-dialog requires ember-cli >= 0.2.6 " + "for support for the addon-templates tree, which allows " + "us to support various ember versions. Use an older " + "version of ember-modal-dialog if you are stuck on an " + "older ember-cli."); } }, treeForAddonTemplates: function treeForAddonTemplates (tree) { var checker = new VersionChecker(this); var dep = checker.for('ember', 'bower'); var baseTemplatesPath = path.join(this.root, 'addon/templates'); if (dep.lt('1.13.0-beta.1')) { return this.treeGenerator(path.join(baseTemplatesPath, 'lt-1-13')); } else { return this.treeGenerator(path.join(baseTemplatesPath, 'current')); } } };
/* jshint node: true */ 'use strict'; var path = require('path'); var fs = require('fs'); var VersionChecker = require('ember-cli-version-checker'); module.exports = { name: 'ember-modal-dialog', init: function() { this._super.init && this._super.init.apply(this, arguments); var checker = new VersionChecker(this); if (!checker.for('ember-cli', 'npm').isAbove('0.2.6')) { console.warn("Warning: ember-modal-dialog requires ember-cli >= 0.2.6 " + "for support for the addon-templates tree, which allows " + "us to support various ember versions. Use an older " + "version of ember-modal-dialog if you are stuck on an " + "older ember-cli."); } }, treeForAddonTemplates: function treeForAddonTemplates (tree) { var checker = new VersionChecker(this); var dep = checker.for('ember', 'bower'); var baseTemplatesPath = path.join(this.root, 'addon/templates'); if (dep.lt('1.13.0-beta.1')) { return this.treeGenerator(path.join(baseTemplatesPath, 'lt-1-13')); } else { return this.treeGenerator(path.join(baseTemplatesPath, 'current')); } } };
Call `this._super.init` to avoid ember-cli deprecation
Call `this._super.init` to avoid ember-cli deprecation This avoids the following deprecation warning: DEPRECATION: Overriding init without calling this._super is deprecated. Please call this._super.init && this._super.init.apply(this, arguments);
JavaScript
mit
yapplabs/ember-modal-dialog,oscarni/ember-modal-dialog,oscarni/ember-modal-dialog,samselikoff/ember-modal-dialog,samselikoff/ember-modal-dialog,yapplabs/ember-modal-dialog
javascript
## Code Before: /* jshint node: true */ 'use strict'; var path = require('path'); var fs = require('fs'); var VersionChecker = require('ember-cli-version-checker'); module.exports = { name: 'ember-modal-dialog', init: function() { var checker = new VersionChecker(this); if (!checker.for('ember-cli', 'npm').isAbove('0.2.6')) { console.warn("Warning: ember-modal-dialog requires ember-cli >= 0.2.6 " + "for support for the addon-templates tree, which allows " + "us to support various ember versions. Use an older " + "version of ember-modal-dialog if you are stuck on an " + "older ember-cli."); } }, treeForAddonTemplates: function treeForAddonTemplates (tree) { var checker = new VersionChecker(this); var dep = checker.for('ember', 'bower'); var baseTemplatesPath = path.join(this.root, 'addon/templates'); if (dep.lt('1.13.0-beta.1')) { return this.treeGenerator(path.join(baseTemplatesPath, 'lt-1-13')); } else { return this.treeGenerator(path.join(baseTemplatesPath, 'current')); } } }; ## Instruction: Call `this._super.init` to avoid ember-cli deprecation This avoids the following deprecation warning: DEPRECATION: Overriding init without calling this._super is deprecated. Please call this._super.init && this._super.init.apply(this, arguments); ## Code After: /* jshint node: true */ 'use strict'; var path = require('path'); var fs = require('fs'); var VersionChecker = require('ember-cli-version-checker'); module.exports = { name: 'ember-modal-dialog', init: function() { this._super.init && this._super.init.apply(this, arguments); var checker = new VersionChecker(this); if (!checker.for('ember-cli', 'npm').isAbove('0.2.6')) { console.warn("Warning: ember-modal-dialog requires ember-cli >= 0.2.6 " + "for support for the addon-templates tree, which allows " + "us to support various ember versions. Use an older " + "version of ember-modal-dialog if you are stuck on an " + "older ember-cli."); } }, treeForAddonTemplates: function treeForAddonTemplates (tree) { var checker = new VersionChecker(this); var dep = checker.for('ember', 'bower'); var baseTemplatesPath = path.join(this.root, 'addon/templates'); if (dep.lt('1.13.0-beta.1')) { return this.treeGenerator(path.join(baseTemplatesPath, 'lt-1-13')); } else { return this.treeGenerator(path.join(baseTemplatesPath, 'current')); } } };
/* jshint node: true */ 'use strict'; var path = require('path'); var fs = require('fs'); var VersionChecker = require('ember-cli-version-checker'); module.exports = { name: 'ember-modal-dialog', init: function() { + this._super.init && this._super.init.apply(this, arguments); var checker = new VersionChecker(this); if (!checker.for('ember-cli', 'npm').isAbove('0.2.6')) { console.warn("Warning: ember-modal-dialog requires ember-cli >= 0.2.6 " + "for support for the addon-templates tree, which allows " + "us to support various ember versions. Use an older " + "version of ember-modal-dialog if you are stuck on an " + "older ember-cli."); } }, treeForAddonTemplates: function treeForAddonTemplates (tree) { var checker = new VersionChecker(this); var dep = checker.for('ember', 'bower'); var baseTemplatesPath = path.join(this.root, 'addon/templates'); if (dep.lt('1.13.0-beta.1')) { return this.treeGenerator(path.join(baseTemplatesPath, 'lt-1-13')); } else { return this.treeGenerator(path.join(baseTemplatesPath, 'current')); } } };
1
0.029412
1
0
a15b16336b7bd95f0f10362fe9c144629fcbee9a
src/test/resources/testData/clearAll.xml
src/test/resources/testData/clearAll.xml
<?xml version='1.0' encoding='UTF-8'?> <dataset> <project_dim/> <station/> <station_sum/> <activity/> <act_metric/> <result/> <result_sum/> <web_service_log/> </dataset>
<?xml version='1.0' encoding='UTF-8'?> <dataset> <project_dim/> <station/> <station_sum/> <activity/> <act_metric/> <result/> <result_sum/> <web_service_log/> <prj_ml_weighting/> </dataset>
Clear the prj ml weighting data when using xml test data
WQP-1171: Clear the prj ml weighting data when using xml test data
XML
unlicense
USGS-CIDA/WQP-WQX-Services,dsteinich/WQP-WQX-Services,mbucknell/WQP-WQX-Services
xml
## Code Before: <?xml version='1.0' encoding='UTF-8'?> <dataset> <project_dim/> <station/> <station_sum/> <activity/> <act_metric/> <result/> <result_sum/> <web_service_log/> </dataset> ## Instruction: WQP-1171: Clear the prj ml weighting data when using xml test data ## Code After: <?xml version='1.0' encoding='UTF-8'?> <dataset> <project_dim/> <station/> <station_sum/> <activity/> <act_metric/> <result/> <result_sum/> <web_service_log/> <prj_ml_weighting/> </dataset>
<?xml version='1.0' encoding='UTF-8'?> <dataset> <project_dim/> <station/> <station_sum/> <activity/> <act_metric/> <result/> <result_sum/> <web_service_log/> + <prj_ml_weighting/> </dataset>
1
0.090909
1
0
538175116b7c552c6fd42ef003f6e1cbeb576507
packages/server-render/package.js
packages/server-render/package.js
Package.describe({ name: "server-render", version: "0.3.0", summary: "Generic support for server-side rendering in Meteor apps", documentation: "README.md" }); Npm.depends({ "combined-stream2": "1.1.2", "magic-string": "0.21.3", "stream-to-string": "1.1.0", "parse5": "3.0.2" }); Package.onUse(function(api) { api.use("ecmascript"); api.use("webapp"); api.mainModule("client.js", "client"); api.mainModule("server.js", "server"); }); Package.onTest(function(api) { api.use("ecmascript"); api.use("tinytest"); api.use("server-render"); api.mainModule("server-render-tests.js", "server"); });
Package.describe({ name: "server-render", version: "0.3.1", summary: "Generic support for server-side rendering in Meteor apps", documentation: "README.md" }); Npm.depends({ "combined-stream2": "1.1.2", "magic-string": "0.21.3", "stream-to-string": "1.1.0", "parse5": "3.0.2" }); Package.onUse(function(api) { api.use("ecmascript"); api.use("webapp"); api.mainModule("client.js", "client", { lazy: true }); api.mainModule("server.js", "server"); }); Package.onTest(function(api) { api.use("ecmascript"); api.use("tinytest"); api.use("server-render"); api.mainModule("server-render-tests.js", "server"); });
Make meteor/server-render lazy on the client.
Make meteor/server-render lazy on the client.
JavaScript
mit
Hansoft/meteor,Hansoft/meteor,Hansoft/meteor,Hansoft/meteor,Hansoft/meteor,Hansoft/meteor,Hansoft/meteor
javascript
## Code Before: Package.describe({ name: "server-render", version: "0.3.0", summary: "Generic support for server-side rendering in Meteor apps", documentation: "README.md" }); Npm.depends({ "combined-stream2": "1.1.2", "magic-string": "0.21.3", "stream-to-string": "1.1.0", "parse5": "3.0.2" }); Package.onUse(function(api) { api.use("ecmascript"); api.use("webapp"); api.mainModule("client.js", "client"); api.mainModule("server.js", "server"); }); Package.onTest(function(api) { api.use("ecmascript"); api.use("tinytest"); api.use("server-render"); api.mainModule("server-render-tests.js", "server"); }); ## Instruction: Make meteor/server-render lazy on the client. ## Code After: Package.describe({ name: "server-render", version: "0.3.1", summary: "Generic support for server-side rendering in Meteor apps", documentation: "README.md" }); Npm.depends({ "combined-stream2": "1.1.2", "magic-string": "0.21.3", "stream-to-string": "1.1.0", "parse5": "3.0.2" }); Package.onUse(function(api) { api.use("ecmascript"); api.use("webapp"); api.mainModule("client.js", "client", { lazy: true }); api.mainModule("server.js", "server"); }); Package.onTest(function(api) { api.use("ecmascript"); api.use("tinytest"); api.use("server-render"); api.mainModule("server-render-tests.js", "server"); });
Package.describe({ name: "server-render", - version: "0.3.0", ? ^ + version: "0.3.1", ? ^ summary: "Generic support for server-side rendering in Meteor apps", documentation: "README.md" }); Npm.depends({ "combined-stream2": "1.1.2", "magic-string": "0.21.3", "stream-to-string": "1.1.0", "parse5": "3.0.2" }); Package.onUse(function(api) { api.use("ecmascript"); api.use("webapp"); - api.mainModule("client.js", "client"); + api.mainModule("client.js", "client", { lazy: true }); ? ++++++++++++++++ api.mainModule("server.js", "server"); }); Package.onTest(function(api) { api.use("ecmascript"); api.use("tinytest"); api.use("server-render"); api.mainModule("server-render-tests.js", "server"); });
4
0.148148
2
2
20d355c52a73e38ae421aa3e4227c2c60d6ae2ff
ckanext/inventory/logic/schema.py
ckanext/inventory/logic/schema.py
from ckan.lib.navl.validators import ignore_empty, not_empty from ckan.logic.validators import ( name_validator, boolean_validator, is_positive_integer, isodate, group_id_exists) def default_inventory_entry_schema(): schema = { 'id': [unicode, ignore_empty], 'title': [unicode, not_empty], 'group_id': [group_id_exists], 'is_recurring': [boolean_validator], 'recurring_interval': [is_positive_integer], 'last_added_dataset_timestamp': [isodate], } return schema def default_inventory_entry_schema_create(): schema = { 'title': [unicode, not_empty], 'recurring_interval': [is_positive_integer], } return schema
from ckan.lib.navl.validators import ignore_empty, not_empty from ckan.logic.validators import ( name_validator, boolean_validator, natural_number_validator, isodate, group_id_exists) def default_inventory_entry_schema(): schema = { 'id': [unicode, ignore_empty], 'title': [unicode, not_empty], 'group_id': [group_id_exists], 'is_recurring': [boolean_validator], 'recurring_interval': [natural_number_validator], 'last_added_dataset_timestamp': [isodate], } return schema def default_inventory_entry_schema_create(): schema = { 'title': [unicode, not_empty], 'recurring_interval': [natural_number_validator], } return schema
Allow zeros for recurring interval
Allow zeros for recurring interval
Python
apache-2.0
govro/ckanext-inventory,govro/ckanext-inventory,govro/ckanext-inventory,govro/ckanext-inventory
python
## Code Before: from ckan.lib.navl.validators import ignore_empty, not_empty from ckan.logic.validators import ( name_validator, boolean_validator, is_positive_integer, isodate, group_id_exists) def default_inventory_entry_schema(): schema = { 'id': [unicode, ignore_empty], 'title': [unicode, not_empty], 'group_id': [group_id_exists], 'is_recurring': [boolean_validator], 'recurring_interval': [is_positive_integer], 'last_added_dataset_timestamp': [isodate], } return schema def default_inventory_entry_schema_create(): schema = { 'title': [unicode, not_empty], 'recurring_interval': [is_positive_integer], } return schema ## Instruction: Allow zeros for recurring interval ## Code After: from ckan.lib.navl.validators import ignore_empty, not_empty from ckan.logic.validators import ( name_validator, boolean_validator, natural_number_validator, isodate, group_id_exists) def default_inventory_entry_schema(): schema = { 'id': [unicode, ignore_empty], 'title': [unicode, not_empty], 'group_id': [group_id_exists], 'is_recurring': [boolean_validator], 'recurring_interval': [natural_number_validator], 'last_added_dataset_timestamp': [isodate], } return schema def default_inventory_entry_schema_create(): schema = { 'title': [unicode, not_empty], 'recurring_interval': [natural_number_validator], } return schema
from ckan.lib.navl.validators import ignore_empty, not_empty from ckan.logic.validators import ( - name_validator, boolean_validator, is_positive_integer, isodate, + name_validator, boolean_validator, natural_number_validator, isodate, group_id_exists) def default_inventory_entry_schema(): schema = { 'id': [unicode, ignore_empty], 'title': [unicode, not_empty], 'group_id': [group_id_exists], 'is_recurring': [boolean_validator], - 'recurring_interval': [is_positive_integer], + 'recurring_interval': [natural_number_validator], 'last_added_dataset_timestamp': [isodate], } return schema def default_inventory_entry_schema_create(): schema = { 'title': [unicode, not_empty], - 'recurring_interval': [is_positive_integer], + 'recurring_interval': [natural_number_validator], } return schema
6
0.25
3
3
4ffb74d7df3b612ad49625e8b65423afd2ed95c8
bootstrap/mac.sh
bootstrap/mac.sh
brew install augeas
if hash brew 2>/dev/null; then echo "Homebrew Installed" else echo "Homebrew Not Installed\nDownloading..." ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" fi brew install augeas brew install dialog
Add dialog dependency and homebrew installation
Add dialog dependency and homebrew installation
Shell
apache-2.0
twstrike/le_for_patching,Sveder/letsencrypt,kuba/letsencrypt,twstrike/le_for_patching,goofwear/letsencrypt,mrb/letsencrypt,kuba/letsencrypt,jsha/letsencrypt,wteiken/letsencrypt,tyagi-prashant/letsencrypt,sjerdo/letsencrypt,DavidGarciaCat/letsencrypt,deserted/letsencrypt,mitnk/letsencrypt,BKreisel/letsencrypt,Bachmann1234/letsencrypt,BillKeenan/lets-encrypt-preview,goofwear/letsencrypt,ghyde/letsencrypt,stweil/letsencrypt,Sveder/letsencrypt,lmcro/letsencrypt,letsencrypt/letsencrypt,letsencrypt/letsencrypt,tyagi-prashant/letsencrypt,mitnk/letsencrypt,BillKeenan/lets-encrypt-preview,piru/letsencrypt,xgin/letsencrypt,mrb/letsencrypt,ghyde/letsencrypt,VladimirTyrin/letsencrypt,dietsche/letsencrypt,DavidGarciaCat/letsencrypt,thanatos/lets-encrypt-preview,Bachmann1234/letsencrypt,g1franc/lets-encrypt-preview,brentdax/letsencrypt,BKreisel/letsencrypt,riseofthetigers/letsencrypt,dietsche/letsencrypt,martindale/letsencrypt,lmcro/letsencrypt,rlustin/letsencrypt,jsha/letsencrypt,piru/letsencrypt,martindale/letsencrypt,sjerdo/letsencrypt,riseofthetigers/letsencrypt,jtl999/certbot,wteiken/letsencrypt,brentdax/letsencrypt,TheBoegl/letsencrypt,stweil/letsencrypt,Jadaw1n/letsencrypt,bsmr-misc-forks/letsencrypt,lbeltrame/letsencrypt,hsduk/lets-encrypt-preview,bsmr-misc-forks/letsencrypt,hsduk/lets-encrypt-preview,deserted/letsencrypt,thanatos/lets-encrypt-preview,TheBoegl/letsencrypt,g1franc/lets-encrypt-preview,jtl999/certbot,xgin/letsencrypt,rlustin/letsencrypt,VladimirTyrin/letsencrypt,lbeltrame/letsencrypt,Jadaw1n/letsencrypt
shell
## Code Before: brew install augeas ## Instruction: Add dialog dependency and homebrew installation ## Code After: if hash brew 2>/dev/null; then echo "Homebrew Installed" else echo "Homebrew Not Installed\nDownloading..." ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" fi brew install augeas brew install dialog
+ if hash brew 2>/dev/null; then + echo "Homebrew Installed" + else + echo "Homebrew Not Installed\nDownloading..." + ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" + fi + brew install augeas + brew install dialog
8
8
8
0
db24470b16a94d6785f67e90eda09eb63990830b
spec/dummy/config/database.yml
spec/dummy/config/database.yml
sqlite: &sqlite adapter: sqlite3 database: db/<%= Rails.env %>.sqlite3 mysql: &mysql adapter: mysql2 username: root password: database: alchemy_cms_dummy_<%= Rails.env %> postgresql: &postgresql adapter: postgresql <% if ENV['CI'] %> username: postgres <% end %> database: alchemy_cms_dummy_<%= Rails.env %> min_messages: ERROR defaults: &defaults pool: 5 timeout: 5000 host: localhost <<: *<%= ENV['DB'] || "sqlite" %> development: <<: *defaults test: <<: *defaults production: <<: *defaults
sqlite: &sqlite adapter: sqlite3 database: db/<%= Rails.env %>.sqlite3 mysql: &mysql adapter: mysql2 encoding: utf8mb4 username: root password: database: alchemy_cms_dummy_<%= Rails.env %> postgresql: &postgresql adapter: postgresql <% if ENV['CI'] %> username: postgres <% end %> database: alchemy_cms_dummy_<%= Rails.env %> min_messages: ERROR defaults: &defaults pool: 5 timeout: 5000 host: localhost <<: *<%= ENV['DB'] || "sqlite" %> development: <<: *defaults test: <<: *defaults production: <<: *defaults
Use correct encoding for mysql builds
Use correct encoding for mysql builds
YAML
bsd-3-clause
AlchemyCMS/alchemy_cms,mamhoff/alchemy_cms,mamhoff/alchemy_cms,robinboening/alchemy_cms,robinboening/alchemy_cms,robinboening/alchemy_cms,AlchemyCMS/alchemy_cms,AlchemyCMS/alchemy_cms,mamhoff/alchemy_cms
yaml
## Code Before: sqlite: &sqlite adapter: sqlite3 database: db/<%= Rails.env %>.sqlite3 mysql: &mysql adapter: mysql2 username: root password: database: alchemy_cms_dummy_<%= Rails.env %> postgresql: &postgresql adapter: postgresql <% if ENV['CI'] %> username: postgres <% end %> database: alchemy_cms_dummy_<%= Rails.env %> min_messages: ERROR defaults: &defaults pool: 5 timeout: 5000 host: localhost <<: *<%= ENV['DB'] || "sqlite" %> development: <<: *defaults test: <<: *defaults production: <<: *defaults ## Instruction: Use correct encoding for mysql builds ## Code After: sqlite: &sqlite adapter: sqlite3 database: db/<%= Rails.env %>.sqlite3 mysql: &mysql adapter: mysql2 encoding: utf8mb4 username: root password: database: alchemy_cms_dummy_<%= Rails.env %> postgresql: &postgresql adapter: postgresql <% if ENV['CI'] %> username: postgres <% end %> database: alchemy_cms_dummy_<%= Rails.env %> min_messages: ERROR defaults: &defaults pool: 5 timeout: 5000 host: localhost <<: *<%= ENV['DB'] || "sqlite" %> development: <<: *defaults test: <<: *defaults production: <<: *defaults
sqlite: &sqlite adapter: sqlite3 database: db/<%= Rails.env %>.sqlite3 mysql: &mysql adapter: mysql2 + encoding: utf8mb4 username: root password: database: alchemy_cms_dummy_<%= Rails.env %> postgresql: &postgresql adapter: postgresql <% if ENV['CI'] %> username: postgres <% end %> database: alchemy_cms_dummy_<%= Rails.env %> min_messages: ERROR defaults: &defaults pool: 5 timeout: 5000 host: localhost <<: *<%= ENV['DB'] || "sqlite" %> development: <<: *defaults test: <<: *defaults production: <<: *defaults
1
0.03125
1
0
20c18150e0f18b9ac327687fa626d10b4fa4111b
tests/Kwf/User/AuthPassword/FnF.php
tests/Kwf/User/AuthPassword/FnF.php
<?php class Kwf_User_AuthPassword_FnF extends Kwf_Model_FnF { protected $_toStringField = 'email'; protected $_hasDeletedFlag = true; protected function _init() { $this->_data = array( array('id'=>1, 'email' => 'test@vivid.com', 'password' => md5('foo'.'123'), 'password_salt' => '123', 'deleted'=>false), array('id'=>2, 'email' => 'testdel@vivid.com', 'password' => md5('bar'.'1234'), 'password_salt' => '1234', 'deleted'=>true), ); parent::_init(); } public function getAuthMethods() { return array( 'password' => new Kwf_User_Auth_PasswordFields($this) ); } }
<?php class Kwf_User_AuthPassword_FnF extends Kwf_Model_FnF { protected $_toStringField = 'email'; protected $_hasDeletedFlag = true; protected function _init() { $this->_data = array( array('id'=>1, 'email' => 'test@vivid.com', 'password' => md5('foo'.'123'), 'password_salt' => '123', 'deleted'=>false), array('id'=>2, 'email' => 'testdel@vivid.com', 'password' => md5('bar'.'1234'), 'password_salt' => '1234', 'deleted'=>true), ); parent::_init(); } public function getAuthMethods() { return array( 'password' => new Kwf_User_Auth_PasswordFields($this), 'activation' => new Kwf_User_Auth_ActivationFields($this), ); } }
Add activation auth method, needed for lost password in tests
Add activation auth method, needed for lost password in tests
PHP
bsd-2-clause
kaufmo/koala-framework,kaufmo/koala-framework,kaufmo/koala-framework,nsams/koala-framework,koala-framework/koala-framework,nsams/koala-framework,nsams/koala-framework,koala-framework/koala-framework
php
## Code Before: <?php class Kwf_User_AuthPassword_FnF extends Kwf_Model_FnF { protected $_toStringField = 'email'; protected $_hasDeletedFlag = true; protected function _init() { $this->_data = array( array('id'=>1, 'email' => 'test@vivid.com', 'password' => md5('foo'.'123'), 'password_salt' => '123', 'deleted'=>false), array('id'=>2, 'email' => 'testdel@vivid.com', 'password' => md5('bar'.'1234'), 'password_salt' => '1234', 'deleted'=>true), ); parent::_init(); } public function getAuthMethods() { return array( 'password' => new Kwf_User_Auth_PasswordFields($this) ); } } ## Instruction: Add activation auth method, needed for lost password in tests ## Code After: <?php class Kwf_User_AuthPassword_FnF extends Kwf_Model_FnF { protected $_toStringField = 'email'; protected $_hasDeletedFlag = true; protected function _init() { $this->_data = array( array('id'=>1, 'email' => 'test@vivid.com', 'password' => md5('foo'.'123'), 'password_salt' => '123', 'deleted'=>false), array('id'=>2, 'email' => 'testdel@vivid.com', 'password' => md5('bar'.'1234'), 'password_salt' => '1234', 'deleted'=>true), ); parent::_init(); } public function getAuthMethods() { return array( 'password' => new Kwf_User_Auth_PasswordFields($this), 'activation' => new Kwf_User_Auth_ActivationFields($this), ); } }
<?php class Kwf_User_AuthPassword_FnF extends Kwf_Model_FnF { protected $_toStringField = 'email'; protected $_hasDeletedFlag = true; protected function _init() { $this->_data = array( array('id'=>1, 'email' => 'test@vivid.com', 'password' => md5('foo'.'123'), 'password_salt' => '123', 'deleted'=>false), array('id'=>2, 'email' => 'testdel@vivid.com', 'password' => md5('bar'.'1234'), 'password_salt' => '1234', 'deleted'=>true), ); parent::_init(); } public function getAuthMethods() { return array( - 'password' => new Kwf_User_Auth_PasswordFields($this) + 'password' => new Kwf_User_Auth_PasswordFields($this), ? + + 'activation' => new Kwf_User_Auth_ActivationFields($this), ); } }
3
0.136364
2
1
d6603bc17177da6b980dd30d4cf37b711316e7db
app/views/lotteries/_results.csv.ruby
app/views/lotteries/_results.csv.ruby
if lottery.entrants.exists? ::CSV.generate do |csv| csv << ["Results for #{lottery.name}"] lottery.divisions.each do |division| csv << ["\n"] csv << [division.name] csv << ["Accepted"] division.winning_entrants.each do |entrant| csv << [ entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end csv << ["Wait List"] division.wait_list_entrants.each do |entrant| csv << [ entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end end end else "No records to export." end
if lottery.entrants.exists? ::CSV.generate do |csv| csv << ["Results for #{lottery.name}"] lottery.divisions.each do |division| csv << ["\n"] csv << [division.name] csv << ["Accepted"] division.winning_entrants.each.with_index(1) do |entrant, i| csv << [ i, entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end csv << ["Wait List"] division.wait_list_entrants.each.with_index(1) do |entrant, i| csv << [ i, entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end end end else "No records to export." end
Add row index for lottery results export
Add row index for lottery results export
Ruby
mit
SplitTime/OpenSplitTime,SplitTime/OpenSplitTime,SplitTime/OpenSplitTime
ruby
## Code Before: if lottery.entrants.exists? ::CSV.generate do |csv| csv << ["Results for #{lottery.name}"] lottery.divisions.each do |division| csv << ["\n"] csv << [division.name] csv << ["Accepted"] division.winning_entrants.each do |entrant| csv << [ entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end csv << ["Wait List"] division.wait_list_entrants.each do |entrant| csv << [ entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end end end else "No records to export." end ## Instruction: Add row index for lottery results export ## Code After: if lottery.entrants.exists? ::CSV.generate do |csv| csv << ["Results for #{lottery.name}"] lottery.divisions.each do |division| csv << ["\n"] csv << [division.name] csv << ["Accepted"] division.winning_entrants.each.with_index(1) do |entrant, i| csv << [ i, entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end csv << ["Wait List"] division.wait_list_entrants.each.with_index(1) do |entrant, i| csv << [ i, entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end end end else "No records to export." end
if lottery.entrants.exists? ::CSV.generate do |csv| csv << ["Results for #{lottery.name}"] lottery.divisions.each do |division| csv << ["\n"] csv << [division.name] csv << ["Accepted"] - division.winning_entrants.each do |entrant| + division.winning_entrants.each.with_index(1) do |entrant, i| ? ++++++++++++++ +++ csv << [ + i, entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end csv << ["Wait List"] - division.wait_list_entrants.each do |entrant| + division.wait_list_entrants.each.with_index(1) do |entrant, i| ? ++++++++++++++ +++ csv << [ + i, entrant.first_name, entrant.last_name, entrant.gender, entrant.city, entrant.state_name, entrant.country_name, ] end end end else "No records to export." end
6
0.162162
4
2
b75466868e2171dcc67b1db2666d186bf0afca89
test/junit/scala/collection/immutable/PagedSeqTest.scala
test/junit/scala/collection/immutable/PagedSeqTest.scala
package scala.collection.immutable import org.junit.runner.RunWith import org.junit.runners.JUnit4 import org.junit.Test import org.junit.Assert._ @RunWith(classOf[JUnit4]) class PagedSeqTest { // should not NPE, and should equal the given Seq @Test def test_SI6615(): Unit = { assertEquals(Seq('a'), PagedSeq.fromStrings(List.fill(5000)("a")).slice(4096, 4097)) } // should not NPE, and should be empty @Test def test_SI9480(): Unit = { assertEquals(Seq(), PagedSeq.fromStrings(List("a")).slice(1)) } // Slices shouldn't read outside where they belong @Test def test_SI6519 { var readAttempt = 0 val sideEffectingIterator = new Iterator[Int] { def hasNext = readAttempt < 65536 def next = { readAttempt += 1; readAttempt } } val s = PagedSeq.fromIterator(sideEffectingIterator).slice(0,2).mkString assertEquals(s, "12") assert(readAttempt <= 4096) } }
package scala.collection.immutable import org.junit.runner.RunWith import org.junit.runners.JUnit4 import org.junit.{Ignore, Test} import org.junit.Assert._ @RunWith(classOf[JUnit4]) class PagedSeqTest { // should not NPE, and should equal the given Seq @Test @Ignore("This tests a non-stack safe method in a deprecated class that requires ~1.5M stack, disabling") def test_SI6615(): Unit = { assertEquals(Seq('a'), PagedSeq.fromStrings(List.fill(5000)("a")).slice(4096, 4097)) } // should not NPE, and should be empty @Test def test_SI9480(): Unit = { assertEquals(Seq(), PagedSeq.fromStrings(List("a")).slice(1)) } // Slices shouldn't read outside where they belong @Test def test_SI6519 { var readAttempt = 0 val sideEffectingIterator = new Iterator[Int] { def hasNext = readAttempt < 65536 def next = { readAttempt += 1; readAttempt } } val s = PagedSeq.fromIterator(sideEffectingIterator).slice(0,2).mkString assertEquals(s, "12") assert(readAttempt <= 4096) } }
Disable stack hungry test of deprecated PagedSeq
Disable stack hungry test of deprecated PagedSeq
Scala
apache-2.0
felixmulder/scala,martijnhoekstra/scala,jvican/scala,scala/scala,jvican/scala,lrytz/scala,shimib/scala,felixmulder/scala,shimib/scala,martijnhoekstra/scala,lrytz/scala,scala/scala,martijnhoekstra/scala,lrytz/scala,slothspot/scala,jvican/scala,scala/scala,jvican/scala,martijnhoekstra/scala,shimib/scala,slothspot/scala,felixmulder/scala,slothspot/scala,scala/scala,scala/scala,slothspot/scala,felixmulder/scala,jvican/scala,shimib/scala,lrytz/scala,scala/scala,felixmulder/scala,felixmulder/scala,martijnhoekstra/scala,slothspot/scala,shimib/scala,martijnhoekstra/scala,lrytz/scala,lrytz/scala,jvican/scala,slothspot/scala,shimib/scala,felixmulder/scala,slothspot/scala,jvican/scala
scala
## Code Before: package scala.collection.immutable import org.junit.runner.RunWith import org.junit.runners.JUnit4 import org.junit.Test import org.junit.Assert._ @RunWith(classOf[JUnit4]) class PagedSeqTest { // should not NPE, and should equal the given Seq @Test def test_SI6615(): Unit = { assertEquals(Seq('a'), PagedSeq.fromStrings(List.fill(5000)("a")).slice(4096, 4097)) } // should not NPE, and should be empty @Test def test_SI9480(): Unit = { assertEquals(Seq(), PagedSeq.fromStrings(List("a")).slice(1)) } // Slices shouldn't read outside where they belong @Test def test_SI6519 { var readAttempt = 0 val sideEffectingIterator = new Iterator[Int] { def hasNext = readAttempt < 65536 def next = { readAttempt += 1; readAttempt } } val s = PagedSeq.fromIterator(sideEffectingIterator).slice(0,2).mkString assertEquals(s, "12") assert(readAttempt <= 4096) } } ## Instruction: Disable stack hungry test of deprecated PagedSeq ## Code After: package scala.collection.immutable import org.junit.runner.RunWith import org.junit.runners.JUnit4 import org.junit.{Ignore, Test} import org.junit.Assert._ @RunWith(classOf[JUnit4]) class PagedSeqTest { // should not NPE, and should equal the given Seq @Test @Ignore("This tests a non-stack safe method in a deprecated class that requires ~1.5M stack, disabling") def test_SI6615(): Unit = { assertEquals(Seq('a'), PagedSeq.fromStrings(List.fill(5000)("a")).slice(4096, 4097)) } // should not NPE, and should be empty @Test def test_SI9480(): Unit = { assertEquals(Seq(), PagedSeq.fromStrings(List("a")).slice(1)) } // Slices shouldn't read outside where they belong @Test def test_SI6519 { var readAttempt = 0 val sideEffectingIterator = new Iterator[Int] { def hasNext = readAttempt < 65536 def next = { readAttempt += 1; readAttempt } } val s = PagedSeq.fromIterator(sideEffectingIterator).slice(0,2).mkString assertEquals(s, "12") assert(readAttempt <= 4096) } }
package scala.collection.immutable import org.junit.runner.RunWith import org.junit.runners.JUnit4 - import org.junit.Test + import org.junit.{Ignore, Test} ? +++++++++ + import org.junit.Assert._ @RunWith(classOf[JUnit4]) class PagedSeqTest { // should not NPE, and should equal the given Seq @Test + @Ignore("This tests a non-stack safe method in a deprecated class that requires ~1.5M stack, disabling") def test_SI6615(): Unit = { assertEquals(Seq('a'), PagedSeq.fromStrings(List.fill(5000)("a")).slice(4096, 4097)) } // should not NPE, and should be empty @Test def test_SI9480(): Unit = { assertEquals(Seq(), PagedSeq.fromStrings(List("a")).slice(1)) } // Slices shouldn't read outside where they belong @Test def test_SI6519 { var readAttempt = 0 val sideEffectingIterator = new Iterator[Int] { def hasNext = readAttempt < 65536 def next = { readAttempt += 1; readAttempt } } val s = PagedSeq.fromIterator(sideEffectingIterator).slice(0,2).mkString assertEquals(s, "12") assert(readAttempt <= 4096) } }
3
0.088235
2
1
b8096e69fa183c6facc254b7c7b78cb5099dd770
base/test_file_util_mac.cc
base/test_file_util_mac.cc
// Copyright (c) 2008 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #include "base/test_file_util.h" #include "base/logging.h" namespace file_util { bool EvictFileFromSystemCache(const FilePath& file) { // TODO(port): Implement. NOTIMPLEMENTED(); return false; } } // namespace file_util
// Copyright (c) 2008 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #include "base/test_file_util.h" #include "base/logging.h" namespace file_util { bool EvictFileFromSystemCache(const FilePath& file) { // There is no way that we can think of to dump something from the UBC. You // can add something so when you open it, it doesn't go in, but that won't // work here. return true; } } // namespace file_util
Change the stub for file_util::EvictFileFromSystemCache
Change the stub for file_util::EvictFileFromSystemCache I don't think there is a way to get something out of the UBC once it's in, so for now I'm just making this api return true so it won't fail unittests. I have email out to Amit incase he has any ideas. Review URL: http://codereview.chromium.org/42333 git-svn-id: de016e52bd170d2d4f2344f9bf92d50478b649e0@11961 0039d316-1c4b-4281-b951-d872f2087c98
C++
bsd-3-clause
Jonekee/chromium.src,ChromiumWebApps/chromium,keishi/chromium,ltilve/chromium,hgl888/chromium-crosswalk,dushu1203/chromium.src,krieger-od/nwjs_chromium.src,bright-sparks/chromium-spacewalk,zcbenz/cefode-chromium,TheTypoMaster/chromium-crosswalk,dednal/chromium.src,ltilve/chromium,pozdnyakov/chromium-crosswalk,M4sse/chromium.src,keishi/chromium,dushu1203/chromium.src,patrickm/chromium.src,pozdnyakov/chromium-crosswalk,Jonekee/chromium.src,Chilledheart/chromium,krieger-od/nwjs_chromium.src,PeterWangIntel/chromium-crosswalk,keishi/chromium,TheTypoMaster/chromium-crosswalk,crosswalk-project/chromium-crosswalk-efl,timopulkkinen/BubbleFish,dednal/chromium.src,markYoungH/chromium.src,littlstar/chromium.src,junmin-zhu/chromium-rivertrail,Fireblend/chromium-crosswalk,zcbenz/cefode-chromium,keishi/chromium,timopulkkinen/BubbleFish,zcbenz/cefode-chromium,nacl-webkit/chrome_deps,ondra-novak/chromium.src,rogerwang/chromium,patrickm/chromium.src,Jonekee/chromium.src,hgl888/chromium-crosswalk,PeterWangIntel/chromium-crosswalk,ltilve/chromium,markYoungH/chromium.src,jaruba/chromium.src,mogoweb/chromium-crosswalk,hujiajie/pa-chromium,bright-sparks/chromium-spacewalk,robclark/chromium,axinging/chromium-crosswalk,pozdnyakov/chromium-crosswalk,rogerwang/chromium,Just-D/chromium-1,junmin-zhu/chromium-rivertrail,TheTypoMaster/chromium-crosswalk,ondra-novak/chromium.src,keishi/chromium,dushu1203/chromium.src,pozdnyakov/chromium-crosswalk,markYoungH/chromium.src,fujunwei/chromium-crosswalk,Just-D/chromium-1,anirudhSK/chromium,ondra-novak/chromium.src,zcbenz/cefode-chromium,ondra-novak/chromium.src,M4sse/chromium.src,keishi/chromium,rogerwang/chromium,Jonekee/chromium.src,axinging/chromium-crosswalk,hgl888/chromium-crosswalk-efl,pozdnyakov/chromium-crosswalk,hujiajie/pa-chromium,nacl-webkit/chrome_deps,markYoungH/chromium.src,chuan9/chromium-crosswalk,hgl888/chromium-crosswalk,Pluto-tv/chromium-crosswalk,jaruba/chromium.src,mohamed--abdel-maksoud/chromium.src,chuan9/chromium-crosswalk,dushu1203/chromium.src,markYoungH/chromium.src,krieger-od/nwjs_chromium.src,littlstar/chromium.src,rogerwang/chromium,hgl888/chromium-crosswalk-efl,axinging/chromium-crosswalk,Jonekee/chromium.src,hgl888/chromium-crosswalk,ChromiumWebApps/chromium,hgl888/chromium-crosswalk-efl,Jonekee/chromium.src,crosswalk-project/chromium-crosswalk-efl,PeterWangIntel/chromium-crosswalk,nacl-webkit/chrome_deps,ChromiumWebApps/chromium,robclark/chromium,littlstar/chromium.src,jaruba/chromium.src,robclark/chromium,chuan9/chromium-crosswalk,Just-D/chromium-1,krieger-od/nwjs_chromium.src,jaruba/chromium.src,mogoweb/chromium-crosswalk,ltilve/chromium,hujiajie/pa-chromium,robclark/chromium,jaruba/chromium.src,crosswalk-project/chromium-crosswalk-efl,pozdnyakov/chromium-crosswalk,patrickm/chromium.src,ondra-novak/chromium.src,patrickm/chromium.src,dednal/chromium.src,dednal/chromium.src,mohamed--abdel-maksoud/chromium.src,PeterWangIntel/chromium-crosswalk,ChromiumWebApps/chromium,Jonekee/chromium.src,dushu1203/chromium.src,fujunwei/chromium-crosswalk,hujiajie/pa-chromium,hgl888/chromium-crosswalk,robclark/chromium,patrickm/chromium.src,keishi/chromium,hgl888/chromium-crosswalk,dushu1203/chromium.src,mohamed--abdel-maksoud/chromium.src,bright-sparks/chromium-spacewalk,Pluto-tv/chromium-crosswalk,mogoweb/chromium-crosswalk,timopulkkinen/BubbleFish,Just-D/chromium-1,ChromiumWebApps/chromium,dednal/chromium.src,Pluto-tv/chromium-crosswalk,anirudhSK/chromium,keishi/chromium,bright-sparks/chromium-spacewalk,dushu1203/chromium.src,jaruba/chromium.src,zcbenz/cefode-chromium,mohamed--abdel-maksoud/chromium.src,TheTypoMaster/chromium-crosswalk,anirudhSK/chromium,mogoweb/chromium-crosswalk,keishi/chromium,timopulkkinen/BubbleFish,ltilve/chromium,patrickm/chromium.src,Jonekee/chromium.src,anirudhSK/chromium,Fireblend/chromium-crosswalk,anirudhSK/chromium,zcbenz/cefode-chromium,mohamed--abdel-maksoud/chromium.src,PeterWangIntel/chromium-crosswalk,dednal/chromium.src,mogoweb/chromium-crosswalk,mogoweb/chromium-crosswalk,ltilve/chromium,patrickm/chromium.src,TheTypoMaster/chromium-crosswalk,hgl888/chromium-crosswalk,markYoungH/chromium.src,hujiajie/pa-chromium,dednal/chromium.src,axinging/chromium-crosswalk,pozdnyakov/chromium-crosswalk,littlstar/chromium.src,M4sse/chromium.src,dushu1203/chromium.src,ChromiumWebApps/chromium,bright-sparks/chromium-spacewalk,Fireblend/chromium-crosswalk,krieger-od/nwjs_chromium.src,markYoungH/chromium.src,littlstar/chromium.src,mogoweb/chromium-crosswalk,axinging/chromium-crosswalk,hgl888/chromium-crosswalk-efl,axinging/chromium-crosswalk,Chilledheart/chromium,nacl-webkit/chrome_deps,jaruba/chromium.src,ChromiumWebApps/chromium,Chilledheart/chromium,junmin-zhu/chromium-rivertrail,pozdnyakov/chromium-crosswalk,timopulkkinen/BubbleFish,Chilledheart/chromium,ltilve/chromium,littlstar/chromium.src,keishi/chromium,littlstar/chromium.src,mogoweb/chromium-crosswalk,mogoweb/chromium-crosswalk,zcbenz/cefode-chromium,ChromiumWebApps/chromium,markYoungH/chromium.src,nacl-webkit/chrome_deps,mohamed--abdel-maksoud/chromium.src,hgl888/chromium-crosswalk,dednal/chromium.src,axinging/chromium-crosswalk,chuan9/chromium-crosswalk,chuan9/chromium-crosswalk,M4sse/chromium.src,dushu1203/chromium.src,chuan9/chromium-crosswalk,Fireblend/chromium-crosswalk,fujunwei/chromium-crosswalk,krieger-od/nwjs_chromium.src,junmin-zhu/chromium-rivertrail,zcbenz/cefode-chromium,Pluto-tv/chromium-crosswalk,junmin-zhu/chromium-rivertrail,mogoweb/chromium-crosswalk,littlstar/chromium.src,pozdnyakov/chromium-crosswalk,rogerwang/chromium,axinging/chromium-crosswalk,Fireblend/chromium-crosswalk,rogerwang/chromium,junmin-zhu/chromium-rivertrail,anirudhSK/chromium,hujiajie/pa-chromium,fujunwei/chromium-crosswalk,robclark/chromium,krieger-od/nwjs_chromium.src,robclark/chromium,timopulkkinen/BubbleFish,M4sse/chromium.src,jaruba/chromium.src,keishi/chromium,anirudhSK/chromium,TheTypoMaster/chromium-crosswalk,nacl-webkit/chrome_deps,ChromiumWebApps/chromium,zcbenz/cefode-chromium,patrickm/chromium.src,mohamed--abdel-maksoud/chromium.src,nacl-webkit/chrome_deps,PeterWangIntel/chromium-crosswalk,nacl-webkit/chrome_deps,Pluto-tv/chromium-crosswalk,chuan9/chromium-crosswalk,patrickm/chromium.src,Just-D/chromium-1,dushu1203/chromium.src,PeterWangIntel/chromium-crosswalk,hgl888/chromium-crosswalk-efl,axinging/chromium-crosswalk,Pluto-tv/chromium-crosswalk,jaruba/chromium.src,hgl888/chromium-crosswalk-efl,Just-D/chromium-1,ltilve/chromium,markYoungH/chromium.src,markYoungH/chromium.src,ChromiumWebApps/chromium,M4sse/chromium.src,M4sse/chromium.src,dednal/chromium.src,Pluto-tv/chromium-crosswalk,zcbenz/cefode-chromium,TheTypoMaster/chromium-crosswalk,Chilledheart/chromium,jaruba/chromium.src,anirudhSK/chromium,hujiajie/pa-chromium,TheTypoMaster/chromium-crosswalk,junmin-zhu/chromium-rivertrail,hujiajie/pa-chromium,anirudhSK/chromium,M4sse/chromium.src,jaruba/chromium.src,dednal/chromium.src,PeterWangIntel/chromium-crosswalk,ltilve/chromium,crosswalk-project/chromium-crosswalk-efl,Just-D/chromium-1,Pluto-tv/chromium-crosswalk,krieger-od/nwjs_chromium.src,fujunwei/chromium-crosswalk,ondra-novak/chromium.src,M4sse/chromium.src,M4sse/chromium.src,robclark/chromium,hgl888/chromium-crosswalk-efl,chuan9/chromium-crosswalk,Chilledheart/chromium,timopulkkinen/BubbleFish,mohamed--abdel-maksoud/chromium.src,fujunwei/chromium-crosswalk,ondra-novak/chromium.src,fujunwei/chromium-crosswalk,mohamed--abdel-maksoud/chromium.src,M4sse/chromium.src,Fireblend/chromium-crosswalk,Fireblend/chromium-crosswalk,robclark/chromium,pozdnyakov/chromium-crosswalk,axinging/chromium-crosswalk,hgl888/chromium-crosswalk-efl,PeterWangIntel/chromium-crosswalk,rogerwang/chromium,nacl-webkit/chrome_deps,TheTypoMaster/chromium-crosswalk,bright-sparks/chromium-spacewalk,timopulkkinen/BubbleFish,crosswalk-project/chromium-crosswalk-efl,hujiajie/pa-chromium,hujiajie/pa-chromium,Chilledheart/chromium,fujunwei/chromium-crosswalk,hgl888/chromium-crosswalk-efl,ChromiumWebApps/chromium,crosswalk-project/chromium-crosswalk-efl,ondra-novak/chromium.src,nacl-webkit/chrome_deps,Chilledheart/chromium,krieger-od/nwjs_chromium.src,junmin-zhu/chromium-rivertrail,axinging/chromium-crosswalk,hujiajie/pa-chromium,hgl888/chromium-crosswalk-efl,mohamed--abdel-maksoud/chromium.src,anirudhSK/chromium,rogerwang/chromium,robclark/chromium,dednal/chromium.src,junmin-zhu/chromium-rivertrail,ondra-novak/chromium.src,Pluto-tv/chromium-crosswalk,zcbenz/cefode-chromium,rogerwang/chromium,timopulkkinen/BubbleFish,bright-sparks/chromium-spacewalk,bright-sparks/chromium-spacewalk,timopulkkinen/BubbleFish,ChromiumWebApps/chromium,anirudhSK/chromium,nacl-webkit/chrome_deps,krieger-od/nwjs_chromium.src,anirudhSK/chromium,fujunwei/chromium-crosswalk,krieger-od/nwjs_chromium.src,Fireblend/chromium-crosswalk,Chilledheart/chromium,rogerwang/chromium,pozdnyakov/chromium-crosswalk,markYoungH/chromium.src,junmin-zhu/chromium-rivertrail,bright-sparks/chromium-spacewalk,crosswalk-project/chromium-crosswalk-efl,Jonekee/chromium.src,Just-D/chromium-1,Jonekee/chromium.src,chuan9/chromium-crosswalk,Fireblend/chromium-crosswalk,junmin-zhu/chromium-rivertrail,crosswalk-project/chromium-crosswalk-efl,timopulkkinen/BubbleFish,Jonekee/chromium.src,crosswalk-project/chromium-crosswalk-efl,hgl888/chromium-crosswalk,Just-D/chromium-1,mohamed--abdel-maksoud/chromium.src,dushu1203/chromium.src
c++
## Code Before: // Copyright (c) 2008 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #include "base/test_file_util.h" #include "base/logging.h" namespace file_util { bool EvictFileFromSystemCache(const FilePath& file) { // TODO(port): Implement. NOTIMPLEMENTED(); return false; } } // namespace file_util ## Instruction: Change the stub for file_util::EvictFileFromSystemCache I don't think there is a way to get something out of the UBC once it's in, so for now I'm just making this api return true so it won't fail unittests. I have email out to Amit incase he has any ideas. Review URL: http://codereview.chromium.org/42333 git-svn-id: de016e52bd170d2d4f2344f9bf92d50478b649e0@11961 0039d316-1c4b-4281-b951-d872f2087c98 ## Code After: // Copyright (c) 2008 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #include "base/test_file_util.h" #include "base/logging.h" namespace file_util { bool EvictFileFromSystemCache(const FilePath& file) { // There is no way that we can think of to dump something from the UBC. You // can add something so when you open it, it doesn't go in, but that won't // work here. return true; } } // namespace file_util
// Copyright (c) 2008 The Chromium Authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #include "base/test_file_util.h" #include "base/logging.h" namespace file_util { bool EvictFileFromSystemCache(const FilePath& file) { - // TODO(port): Implement. - NOTIMPLEMENTED(); + // There is no way that we can think of to dump something from the UBC. You + // can add something so when you open it, it doesn't go in, but that won't + // work here. - return false; ? ^^^^ + return true; ? ^^^ } } // namespace file_util
7
0.411765
4
3
3d40b9975c8284a71cf75b722a42839390816b95
.github/workflows/main.yml
.github/workflows/main.yml
name: CI on: [push] jobs: deploy: name: Deploy runs-on: ubuntu-latest steps: - name: Checkout Repo uses: actions/checkout@master with: token: ${{ secrets.GITHUB_ACCESS_TOKEN }} submodules: true - name: Set up Ruby 2.5 uses: actions/setup-ruby@v1 - name: Build run: | gem install bundler bundle install --jobs 4 --retry 3 make clean build - name: Deploy to Firebase uses: w9jds/firebase-action@master with: args: deploy --only hosting env: PROJECT_ID: blog-70591 FIREBASE_TOKEN: ${{ secrets.FIREBASE_TOKEN }} - name: Publish run: make publish
name: CI on: [push] jobs: deploy: name: Deploy runs-on: ubuntu-latest steps: - name: Checkout Source Code uses: actions/checkout@v2 - name: Checkout rticles uses: actions/checkout@v2 with: repository: horimislime/articles token: ${{ secrets.GITHUB_ACCESS_TOKEN }} path: _posts/articles - name: Set up Ruby 2.5 uses: actions/setup-ruby@v1 - name: Build run: | gem install bundler bundle install --jobs 4 --retry 3 make clean build - name: Deploy to Firebase uses: w9jds/firebase-action@master with: args: deploy --only hosting env: PROJECT_ID: blog-70591 FIREBASE_TOKEN: ${{ secrets.FIREBASE_TOKEN }} - name: Publish run: make publish
Update to checkout v2 syntax
Update to checkout v2 syntax
YAML
mit
horimislime/horimisli.me,horimislime/horimisli.me,horimislime/horimisli.me
yaml
## Code Before: name: CI on: [push] jobs: deploy: name: Deploy runs-on: ubuntu-latest steps: - name: Checkout Repo uses: actions/checkout@master with: token: ${{ secrets.GITHUB_ACCESS_TOKEN }} submodules: true - name: Set up Ruby 2.5 uses: actions/setup-ruby@v1 - name: Build run: | gem install bundler bundle install --jobs 4 --retry 3 make clean build - name: Deploy to Firebase uses: w9jds/firebase-action@master with: args: deploy --only hosting env: PROJECT_ID: blog-70591 FIREBASE_TOKEN: ${{ secrets.FIREBASE_TOKEN }} - name: Publish run: make publish ## Instruction: Update to checkout v2 syntax ## Code After: name: CI on: [push] jobs: deploy: name: Deploy runs-on: ubuntu-latest steps: - name: Checkout Source Code uses: actions/checkout@v2 - name: Checkout rticles uses: actions/checkout@v2 with: repository: horimislime/articles token: ${{ secrets.GITHUB_ACCESS_TOKEN }} path: _posts/articles - name: Set up Ruby 2.5 uses: actions/setup-ruby@v1 - name: Build run: | gem install bundler bundle install --jobs 4 --retry 3 make clean build - name: Deploy to Firebase uses: w9jds/firebase-action@master with: args: deploy --only hosting env: PROJECT_ID: blog-70591 FIREBASE_TOKEN: ${{ secrets.FIREBASE_TOKEN }} - name: Publish run: make publish
name: CI on: [push] jobs: deploy: name: Deploy runs-on: ubuntu-latest steps: - - name: Checkout Repo ? ^ ^ + - name: Checkout Source Code ? ^^^^^ ^^ ++ - uses: actions/checkout@master ? ^^^^^^ + uses: actions/checkout@v2 ? ^^ + - name: Checkout rticles + uses: actions/checkout@v2 with: + repository: horimislime/articles token: ${{ secrets.GITHUB_ACCESS_TOKEN }} - submodules: true + path: _posts/articles - name: Set up Ruby 2.5 uses: actions/setup-ruby@v1 - name: Build run: | gem install bundler bundle install --jobs 4 --retry 3 make clean build - name: Deploy to Firebase uses: w9jds/firebase-action@master with: args: deploy --only hosting env: PROJECT_ID: blog-70591 FIREBASE_TOKEN: ${{ secrets.FIREBASE_TOKEN }} - name: Publish run: make publish
9
0.3
6
3
2635979fcc68f39fd81742fa151a2dd8aab87306
test/progress/index.spec.coffee
test/progress/index.spec.coffee
React = require 'react' {Testing, expect, sinon, _} = require 'openstax-react-components/test/helpers' Collection = require 'task/collection' {Progress} = require 'progress' TASK = require 'cc/tasks/C_UUID/m_uuid/GET' Wrapper = React.createClass childContextTypes: moduleUUID: React.PropTypes.string getChildContext: -> _.pick @props, 'moduleUUID', 'collectionUUID' render: -> React.createElement(Progress, @props) describe 'Progress Component', -> beforeEach -> @props = moduleUUID: 'm_uuid' Collection.load("#{@props.collectionUUID}/#{@props.moduleUUID}", _.extend({}, TASK, @props)) # Note. somethings wrong with the format of the task. It doesn't render the title text from it it 'renders status', -> Testing.renderComponent(Wrapper, props: @props).then ({dom, element}) -> expect(dom.querySelector('.concept-coach-progress-page-title')).not.to.be.null
React = require 'react' {Testing, expect, sinon, _} = require 'openstax-react-components/test/helpers' Collection = require 'task/collection' {Progress} = require 'progress' TASK = require 'cc/tasks/C_UUID/m_uuid/GET' Wrapper = React.createClass childContextTypes: moduleUUID: React.PropTypes.string collectionUUID: React.PropTypes.string getChildContext: -> _.pick @props, 'moduleUUID', 'collectionUUID' render: -> React.createElement(Progress, @props) describe 'Progress Component', -> beforeEach -> @props = moduleUUID: 'm_uuid' collectionUUID: 'C_UUID' Collection.load("#{@props.collectionUUID}/#{@props.moduleUUID}", _.extend({}, TASK, @props)) # Note. somethings wrong with the format of the task. It doesn't render the title text from it it 'renders status', -> Testing.renderComponent(Wrapper, props: @props).then ({dom, element}) -> expect(dom.querySelector('.concept-coach-progress-page-title')).not.to.be.null
Add collectionUUID to context & props
Add collectionUUID to context & props
CoffeeScript
agpl-3.0
openstax/tutor-js,openstax/tutor-js,openstax/tutor-js,openstax/tutor-js,openstax/tutor-js
coffeescript
## Code Before: React = require 'react' {Testing, expect, sinon, _} = require 'openstax-react-components/test/helpers' Collection = require 'task/collection' {Progress} = require 'progress' TASK = require 'cc/tasks/C_UUID/m_uuid/GET' Wrapper = React.createClass childContextTypes: moduleUUID: React.PropTypes.string getChildContext: -> _.pick @props, 'moduleUUID', 'collectionUUID' render: -> React.createElement(Progress, @props) describe 'Progress Component', -> beforeEach -> @props = moduleUUID: 'm_uuid' Collection.load("#{@props.collectionUUID}/#{@props.moduleUUID}", _.extend({}, TASK, @props)) # Note. somethings wrong with the format of the task. It doesn't render the title text from it it 'renders status', -> Testing.renderComponent(Wrapper, props: @props).then ({dom, element}) -> expect(dom.querySelector('.concept-coach-progress-page-title')).not.to.be.null ## Instruction: Add collectionUUID to context & props ## Code After: React = require 'react' {Testing, expect, sinon, _} = require 'openstax-react-components/test/helpers' Collection = require 'task/collection' {Progress} = require 'progress' TASK = require 'cc/tasks/C_UUID/m_uuid/GET' Wrapper = React.createClass childContextTypes: moduleUUID: React.PropTypes.string collectionUUID: React.PropTypes.string getChildContext: -> _.pick @props, 'moduleUUID', 'collectionUUID' render: -> React.createElement(Progress, @props) describe 'Progress Component', -> beforeEach -> @props = moduleUUID: 'm_uuid' collectionUUID: 'C_UUID' Collection.load("#{@props.collectionUUID}/#{@props.moduleUUID}", _.extend({}, TASK, @props)) # Note. somethings wrong with the format of the task. It doesn't render the title text from it it 'renders status', -> Testing.renderComponent(Wrapper, props: @props).then ({dom, element}) -> expect(dom.querySelector('.concept-coach-progress-page-title')).not.to.be.null
React = require 'react' {Testing, expect, sinon, _} = require 'openstax-react-components/test/helpers' Collection = require 'task/collection' {Progress} = require 'progress' TASK = require 'cc/tasks/C_UUID/m_uuid/GET' Wrapper = React.createClass childContextTypes: moduleUUID: React.PropTypes.string + collectionUUID: React.PropTypes.string getChildContext: -> _.pick @props, 'moduleUUID', 'collectionUUID' render: -> React.createElement(Progress, @props) describe 'Progress Component', -> beforeEach -> @props = moduleUUID: 'm_uuid' + collectionUUID: 'C_UUID' Collection.load("#{@props.collectionUUID}/#{@props.moduleUUID}", _.extend({}, TASK, @props)) # Note. somethings wrong with the format of the task. It doesn't render the title text from it it 'renders status', -> Testing.renderComponent(Wrapper, props: @props).then ({dom, element}) -> expect(dom.querySelector('.concept-coach-progress-page-title')).not.to.be.null
2
0.068966
2
0
fe5c42e6f6c6fd39a399a3626096a7a8c5cf9349
public/templates/mmcFE/global/header.tpl
public/templates/mmcFE/global/header.tpl
<div id="siteinfo">{$GLOBAL.websitename}<br/> <span class="slogan">{$GLOBAL.slogan}</span> </div> <div id="ministats"> <table border="0"> <tr> <td><li>{$GLOBAL.config.currency}/{$GLOBAL.config.price.currency}: {$GLOBAL.price|default:"n/a"|number_format:"4"}&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Hashrate: {($GLOBAL.hashrate / 1000)|number_format:"3"} MH/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Sharerate: {$GLOBAL.sharerate|number_format:"2"} Shares/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Workers: {$GLOBAL.workers}&nbsp;&nbsp;&nbsp;&nbsp;</li></td> </tr> </table> </div>
<div id="siteinfo">{$GLOBAL.websitename}<br/> <span class="slogan">{$GLOBAL.slogan}</span> </div> <div id="ministats"> <table border="0"> <tr> {if $GLOBAL.config.price.currency}<td><li>{$GLOBAL.config.currency}/{$GLOBAL.config.price.currency}: {$GLOBAL.price|default:"n/a"|number_format:"4"}&nbsp;&nbsp;&nbsp;&nbsp;</li></td>{/if} <td><li>Pool Hashrate: {($GLOBAL.hashrate / 1000)|number_format:"3"} MH/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Sharerate: {$GLOBAL.sharerate|number_format:"2"} Shares/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Workers: {$GLOBAL.workers}&nbsp;&nbsp;&nbsp;&nbsp;</li></td> </tr> </table> </div>
Disable trade currency if no price currency defined
Disable trade currency if no price currency defined This will disable the trace exchange display in the header if no currency is defined in the configuration. Fixes #298
Smarty
apache-2.0
mculp/chunky-mpos,evgenyponomarev/pool,mmitech/php-mpos,xisi/php-mpos,mmitech/php-mpos,UNOMP/chunky-mpos,nimblecoin/web,sixxkilur/php-mpos,xisi/php-mpos,UNOMP/chunky-mpos,nimblecoin/web,evgenyponomarev/pool,BlueDragon747/php-mpos,shreeneve/test,CoinHashMe/MPOS-SSO,sixxkilur/php-mpos,CoinHashMe/MPOS-SSO,mculp/chunky-mpos,mculp/chunky-mpos,ychaim/php-mpos,ychaim/php-mpos,machinecoin-project/php-mpos,MPOS/php-mpos,xisi/php-mpos,shreeneve/test,MPOS/php-mpos,evgenyponomarev/pool,CoinHashMe/MPOS-SSO,studio666/php-mpos,BlueDragon747/php-mpos,machinecoin-project/php-mpos,bankonme/php-mpos,machinecoin-project/php-mpos,studio666/php-mpos,CoinHashMe/MPOS-SSO,UNOMP/chunky-mpos,sasselin/xcoin-mpos,nimblecoin/web,MPOS/php-mpos,CoinHashMe/MPOS-SSO,UNOMP/chunky-mpos,ychaim/php-mpos,shreeneve/test,sasselin/xcoin-mpos,BlueDragon747/php-mpos,bankonme/php-mpos,mculp/chunky-mpos,sixxkilur/php-mpos,studio666/php-mpos,mmitech/php-mpos,sasselin/xcoin-mpos,bankonme/php-mpos
smarty
## Code Before: <div id="siteinfo">{$GLOBAL.websitename}<br/> <span class="slogan">{$GLOBAL.slogan}</span> </div> <div id="ministats"> <table border="0"> <tr> <td><li>{$GLOBAL.config.currency}/{$GLOBAL.config.price.currency}: {$GLOBAL.price|default:"n/a"|number_format:"4"}&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Hashrate: {($GLOBAL.hashrate / 1000)|number_format:"3"} MH/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Sharerate: {$GLOBAL.sharerate|number_format:"2"} Shares/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Workers: {$GLOBAL.workers}&nbsp;&nbsp;&nbsp;&nbsp;</li></td> </tr> </table> </div> ## Instruction: Disable trade currency if no price currency defined This will disable the trace exchange display in the header if no currency is defined in the configuration. Fixes #298 ## Code After: <div id="siteinfo">{$GLOBAL.websitename}<br/> <span class="slogan">{$GLOBAL.slogan}</span> </div> <div id="ministats"> <table border="0"> <tr> {if $GLOBAL.config.price.currency}<td><li>{$GLOBAL.config.currency}/{$GLOBAL.config.price.currency}: {$GLOBAL.price|default:"n/a"|number_format:"4"}&nbsp;&nbsp;&nbsp;&nbsp;</li></td>{/if} <td><li>Pool Hashrate: {($GLOBAL.hashrate / 1000)|number_format:"3"} MH/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Sharerate: {$GLOBAL.sharerate|number_format:"2"} Shares/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Workers: {$GLOBAL.workers}&nbsp;&nbsp;&nbsp;&nbsp;</li></td> </tr> </table> </div>
<div id="siteinfo">{$GLOBAL.websitename}<br/> <span class="slogan">{$GLOBAL.slogan}</span> </div> <div id="ministats"> <table border="0"> <tr> - <td><li>{$GLOBAL.config.currency}/{$GLOBAL.config.price.currency}: {$GLOBAL.price|default:"n/a"|number_format:"4"}&nbsp;&nbsp;&nbsp;&nbsp;</li></td> + {if $GLOBAL.config.price.currency}<td><li>{$GLOBAL.config.currency}/{$GLOBAL.config.price.currency}: {$GLOBAL.price|default:"n/a"|number_format:"4"}&nbsp;&nbsp;&nbsp;&nbsp;</li></td>{/if} ? ++++++++++++++++++++++++++++++++++ +++++ <td><li>Pool Hashrate: {($GLOBAL.hashrate / 1000)|number_format:"3"} MH/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Sharerate: {$GLOBAL.sharerate|number_format:"2"} Shares/s&nbsp;&nbsp;&nbsp;&nbsp;</li></td> <td><li>Pool Workers: {$GLOBAL.workers}&nbsp;&nbsp;&nbsp;&nbsp;</li></td> </tr> </table> </div>
2
0.153846
1
1
d0c89e4a14d418c19cf42ddc7e1e29426eb6d7fe
lib/filters/filter.rb
lib/filters/filter.rb
module Filters class Filter include Enumerable attr_reader :name, :selection_policy, :selected_values private :selection_policy, :selected_values def initialize(name, param_value, selection_policy) @name = name @selection_policy = selection_policy @selected_values = selection_policy.selected_values_from(param_value) @filter_options = [] end def add_option(name, value) @filter_options << FilterOption.new(self, name, value, selected_values.include?(value)) end def each(&block) @filter_options.each(&block) end def selected_options select(&:selected?) end def current_param param_string(selected_options.map(&:value)) end def param_after_toggle_for(filter_option) param_string(param_value_after_toggling(filter_option)) end private def base_for_new_values selection_policy.base_for_new_values(selected_values) end def param_value_after_toggling(filter_option) filter_option.selected? ? base_for_new_values - [filter_option.value] : base_for_new_values + [filter_option.value] end def param_string(values) values.empty? ? "" : "#{name}:#{values.join(",")}" end end end
module Filters class Filter include Enumerable attr_reader :name, :selection_policy, :selected_values private :selection_policy, :selected_values def initialize(name, current_param_value, selection_policy) @name = name @selection_policy = selection_policy @selected_values = selection_policy.selected_values_from(current_param_value) @filter_options = [] end def add_option(name, value) @filter_options << FilterOption.new(self, name, value, selected_values.include?(value)) end def each(&block) @filter_options.each(&block) end def selected_options select(&:selected?) end def current_param param_string(selected_values) end def param_after_toggle_for(filter_option) param_string(param_value_after_toggling(filter_option)) end private def base_for_new_values selection_policy.base_for_new_values(selected_values) end def param_value_after_toggling(filter_option) filter_option.selected? ? base_for_new_values - [filter_option.value] : base_for_new_values + [filter_option.value] end def param_string(values) values.empty? ? "" : "#{name}:#{values.join(",")}" end end end
Change the way current param is calculated
Change the way current param is calculated
Ruby
mit
ohthatjames/filters
ruby
## Code Before: module Filters class Filter include Enumerable attr_reader :name, :selection_policy, :selected_values private :selection_policy, :selected_values def initialize(name, param_value, selection_policy) @name = name @selection_policy = selection_policy @selected_values = selection_policy.selected_values_from(param_value) @filter_options = [] end def add_option(name, value) @filter_options << FilterOption.new(self, name, value, selected_values.include?(value)) end def each(&block) @filter_options.each(&block) end def selected_options select(&:selected?) end def current_param param_string(selected_options.map(&:value)) end def param_after_toggle_for(filter_option) param_string(param_value_after_toggling(filter_option)) end private def base_for_new_values selection_policy.base_for_new_values(selected_values) end def param_value_after_toggling(filter_option) filter_option.selected? ? base_for_new_values - [filter_option.value] : base_for_new_values + [filter_option.value] end def param_string(values) values.empty? ? "" : "#{name}:#{values.join(",")}" end end end ## Instruction: Change the way current param is calculated ## Code After: module Filters class Filter include Enumerable attr_reader :name, :selection_policy, :selected_values private :selection_policy, :selected_values def initialize(name, current_param_value, selection_policy) @name = name @selection_policy = selection_policy @selected_values = selection_policy.selected_values_from(current_param_value) @filter_options = [] end def add_option(name, value) @filter_options << FilterOption.new(self, name, value, selected_values.include?(value)) end def each(&block) @filter_options.each(&block) end def selected_options select(&:selected?) end def current_param param_string(selected_values) end def param_after_toggle_for(filter_option) param_string(param_value_after_toggling(filter_option)) end private def base_for_new_values selection_policy.base_for_new_values(selected_values) end def param_value_after_toggling(filter_option) filter_option.selected? ? base_for_new_values - [filter_option.value] : base_for_new_values + [filter_option.value] end def param_string(values) values.empty? ? "" : "#{name}:#{values.join(",")}" end end end
module Filters class Filter include Enumerable attr_reader :name, :selection_policy, :selected_values private :selection_policy, :selected_values - def initialize(name, param_value, selection_policy) + def initialize(name, current_param_value, selection_policy) ? ++++++++ @name = name @selection_policy = selection_policy - @selected_values = selection_policy.selected_values_from(param_value) + @selected_values = selection_policy.selected_values_from(current_param_value) ? ++++++++ @filter_options = [] end def add_option(name, value) @filter_options << FilterOption.new(self, name, value, selected_values.include?(value)) end def each(&block) @filter_options.each(&block) end def selected_options select(&:selected?) end def current_param - param_string(selected_options.map(&:value)) ? -------------- - + param_string(selected_values) ? + end def param_after_toggle_for(filter_option) param_string(param_value_after_toggling(filter_option)) end private def base_for_new_values selection_policy.base_for_new_values(selected_values) end def param_value_after_toggling(filter_option) filter_option.selected? ? base_for_new_values - [filter_option.value] : base_for_new_values + [filter_option.value] end def param_string(values) values.empty? ? "" : "#{name}:#{values.join(",")}" end end end
6
0.125
3
3
90ee255c5475659e68e0fac637bc8b7ebb1890c7
Casks/rapidweaver.rb
Casks/rapidweaver.rb
class Rapidweaver < Cask version :latest sha256 :no_check url 'https://realmacsoftware.com/redirects/rapidweaver/direct' appcast 'http://www.realmacsoftware.com/stats/rapidweaver5.php' homepage 'http://realmacsoftware.com/rapidweaver' license :unknown app 'RapidWeaver.app' end
class Rapidweaver < Cask version '6' sha256 'e7b72daffa9c7809b713e05a1b518873a71a73afdd6a97f300b8bb9ab35a9361' url "http://realmacsoftware.com/redirects/rapidweaver#{version}/direct" appcast "http://www.realmacsoftware.com/stats/rapidweaver#{version}.php" homepage 'http://realmacsoftware.com/rapidweaver' license :unknown app 'RapidWeaver.app' end
Upgrade RapidWeaver.app to version 6
Upgrade RapidWeaver.app to version 6
Ruby
bsd-2-clause
hellosky806/homebrew-cask,singingwolfboy/homebrew-cask,csmith-palantir/homebrew-cask,ctrevino/homebrew-cask,inta/homebrew-cask,bdhess/homebrew-cask,bkono/homebrew-cask,morsdyce/homebrew-cask,otaran/homebrew-cask,nickpellant/homebrew-cask,xyb/homebrew-cask,jiashuw/homebrew-cask,corbt/homebrew-cask,muan/homebrew-cask,cliffcotino/homebrew-cask,samdoran/homebrew-cask,royalwang/homebrew-cask,hristozov/homebrew-cask,mahori/homebrew-cask,nelsonjchen/homebrew-cask,barravi/homebrew-cask,af/homebrew-cask,xcezx/homebrew-cask,sosedoff/homebrew-cask,sideci-sample/sideci-sample-homebrew-cask,Fedalto/homebrew-cask,perfide/homebrew-cask,malford/homebrew-cask,rogeriopradoj/homebrew-cask,blogabe/homebrew-cask,wolflee/homebrew-cask,kostasdizas/homebrew-cask,xtian/homebrew-cask,victorpopkov/homebrew-cask,mathbunnyru/homebrew-cask,jeroenj/homebrew-cask,scottsuch/homebrew-cask,underyx/homebrew-cask,xight/homebrew-cask,bosr/homebrew-cask,stevehedrick/homebrew-cask,bcomnes/homebrew-cask,stevenmaguire/homebrew-cask,MoOx/homebrew-cask,cohei/homebrew-cask,boydj/homebrew-cask,Ngrd/homebrew-cask,wesen/homebrew-cask,MichaelPei/homebrew-cask,pacav69/homebrew-cask,bkono/homebrew-cask,jamesmlees/homebrew-cask,alexg0/homebrew-cask,linc01n/homebrew-cask,toonetown/homebrew-cask,howie/homebrew-cask,helloIAmPau/homebrew-cask,zmwangx/homebrew-cask,riyad/homebrew-cask,mwilmer/homebrew-cask,MichaelPei/homebrew-cask,morganestes/homebrew-cask,pablote/homebrew-cask,bcaceiro/homebrew-cask,forevergenin/homebrew-cask,decrement/homebrew-cask,wesen/homebrew-cask,stevenmaguire/homebrew-cask,j13k/homebrew-cask,jangalinski/homebrew-cask,mAAdhaTTah/homebrew-cask,wizonesolutions/homebrew-cask,RJHsiao/homebrew-cask,tangestani/homebrew-cask,joshka/homebrew-cask,larseggert/homebrew-cask,johntrandall/homebrew-cask,dictcp/homebrew-cask,adelinofaria/homebrew-cask,shonjir/homebrew-cask,jen20/homebrew-cask,psibre/homebrew-cask,akiomik/homebrew-cask,xakraz/homebrew-cask,jbeagley52/homebrew-cask,janlugt/homebrew-cask,Gasol/homebrew-cask,Whoaa512/homebrew-cask,stigkj/homebrew-caskroom-cask,ahundt/homebrew-cask,julienlavergne/homebrew-cask,pablote/homebrew-cask,ohammersmith/homebrew-cask,genewoo/homebrew-cask,mjgardner/homebrew-cask,petmoo/homebrew-cask,dspeckhard/homebrew-cask,andersonba/homebrew-cask,gmkey/homebrew-cask,RickWong/homebrew-cask,kpearson/homebrew-cask,ptb/homebrew-cask,kostasdizas/homebrew-cask,lalyos/homebrew-cask,crmne/homebrew-cask,dustinblackman/homebrew-cask,Ketouem/homebrew-cask,guerrero/homebrew-cask,deanmorin/homebrew-cask,aguynamedryan/homebrew-cask,astorije/homebrew-cask,Hywan/homebrew-cask,ingorichter/homebrew-cask,gyugyu/homebrew-cask,hvisage/homebrew-cask,kievechua/homebrew-cask,delphinus35/homebrew-cask,stephenwade/homebrew-cask,miccal/homebrew-cask,jpodlech/homebrew-cask,seanzxx/homebrew-cask,inz/homebrew-cask,neverfox/homebrew-cask,nathanielvarona/homebrew-cask,rhendric/homebrew-cask,shorshe/homebrew-cask,samshadwell/homebrew-cask,neverfox/homebrew-cask,cobyism/homebrew-cask,gerrymiller/homebrew-cask,norio-nomura/homebrew-cask,MircoT/homebrew-cask,jen20/homebrew-cask,ahvigil/homebrew-cask,rickychilcott/homebrew-cask,carlmod/homebrew-cask,corbt/homebrew-cask,mfpierre/homebrew-cask,xyb/homebrew-cask,jmeridth/homebrew-cask,crzrcn/homebrew-cask,carlmod/homebrew-cask,giannitm/homebrew-cask,goxberry/homebrew-cask,gabrielizaias/homebrew-cask,aktau/homebrew-cask,artdevjs/homebrew-cask,bsiddiqui/homebrew-cask,flada-auxv/homebrew-cask,prime8/homebrew-cask,tedbundyjr/homebrew-cask,psibre/homebrew-cask,sebcode/homebrew-cask,Saklad5/homebrew-cask,AndreTheHunter/homebrew-cask,renard/homebrew-cask,freeslugs/homebrew-cask,scribblemaniac/homebrew-cask,forevergenin/homebrew-cask,gmkey/homebrew-cask,catap/homebrew-cask,troyxmccall/homebrew-cask,mjgardner/homebrew-cask,jonathanwiesel/homebrew-cask,rubenerd/homebrew-cask,kolomiichenko/homebrew-cask,elnappo/homebrew-cask,qbmiller/homebrew-cask,andrewdisley/homebrew-cask,jhowtan/homebrew-cask,mwilmer/homebrew-cask,renard/homebrew-cask,chrisRidgers/homebrew-cask,bchatard/homebrew-cask,xalep/homebrew-cask,stevehedrick/homebrew-cask,drostron/homebrew-cask,yurikoles/homebrew-cask,thomanq/homebrew-cask,jrwesolo/homebrew-cask,moimikey/homebrew-cask,dvdoliveira/homebrew-cask,mchlrmrz/homebrew-cask,0xadada/homebrew-cask,jalaziz/homebrew-cask,wickles/homebrew-cask,jamesmlees/homebrew-cask,johnste/homebrew-cask,maxnordlund/homebrew-cask,patresi/homebrew-cask,sparrc/homebrew-cask,coneman/homebrew-cask,robbiethegeek/homebrew-cask,fkrone/homebrew-cask,xcezx/homebrew-cask,bosr/homebrew-cask,decrement/homebrew-cask,jspahrsummers/homebrew-cask,unasuke/homebrew-cask,danielbayley/homebrew-cask,prime8/homebrew-cask,j13k/homebrew-cask,perfide/homebrew-cask,kevyau/homebrew-cask,winkelsdorf/homebrew-cask,kamilboratynski/homebrew-cask,williamboman/homebrew-cask,kTitan/homebrew-cask,segiddins/homebrew-cask,crzrcn/homebrew-cask,jasmas/homebrew-cask,christer155/homebrew-cask,sscotth/homebrew-cask,Dremora/homebrew-cask,tangestani/homebrew-cask,joshka/homebrew-cask,mhubig/homebrew-cask,sanchezm/homebrew-cask,jpmat296/homebrew-cask,hovancik/homebrew-cask,moogar0880/homebrew-cask,iamso/homebrew-cask,pgr0ss/homebrew-cask,rogeriopradoj/homebrew-cask,koenrh/homebrew-cask,xight/homebrew-cask,mazehall/homebrew-cask,gguillotte/homebrew-cask,feniix/homebrew-cask,dieterdemeyer/homebrew-cask,tan9/homebrew-cask,ksylvan/homebrew-cask,freeslugs/homebrew-cask,jasmas/homebrew-cask,paulbreslin/homebrew-cask,hackhandslabs/homebrew-cask,ldong/homebrew-cask,mchlrmrz/homebrew-cask,0rax/homebrew-cask,arronmabrey/homebrew-cask,xtian/homebrew-cask,ohammersmith/homebrew-cask,scottsuch/homebrew-cask,lukeadams/homebrew-cask,dcondrey/homebrew-cask,markthetech/homebrew-cask,vin047/homebrew-cask,mikem/homebrew-cask,Bombenleger/homebrew-cask,tyage/homebrew-cask,mwek/homebrew-cask,robertgzr/homebrew-cask,chrisRidgers/homebrew-cask,nightscape/homebrew-cask,okket/homebrew-cask,ksato9700/homebrew-cask,otzy007/homebrew-cask,mahori/homebrew-cask,englishm/homebrew-cask,bsiddiqui/homebrew-cask,amatos/homebrew-cask,jhowtan/homebrew-cask,SentinelWarren/homebrew-cask,adrianchia/homebrew-cask,FredLackeyOfficial/homebrew-cask,RogerThiede/homebrew-cask,lukasbestle/homebrew-cask,ch3n2k/homebrew-cask,larseggert/homebrew-cask,claui/homebrew-cask,Ephemera/homebrew-cask,danielgomezrico/homebrew-cask,nicolas-brousse/homebrew-cask,adriweb/homebrew-cask,guylabs/homebrew-cask,albertico/homebrew-cask,fharbe/homebrew-cask,neverfox/homebrew-cask,inta/homebrew-cask,yutarody/homebrew-cask,CameronGarrett/homebrew-cask,unasuke/homebrew-cask,stonehippo/homebrew-cask,arronmabrey/homebrew-cask,hristozov/homebrew-cask,wayou/homebrew-cask,AdamCmiel/homebrew-cask,johntrandall/homebrew-cask,diogodamiani/homebrew-cask,brianshumate/homebrew-cask,cclauss/homebrew-cask,kpearson/homebrew-cask,bric3/homebrew-cask,rajiv/homebrew-cask,gilesdring/homebrew-cask,taherio/homebrew-cask,mahori/homebrew-cask,lieuwex/homebrew-cask,moonboots/homebrew-cask,doits/homebrew-cask,a1russell/homebrew-cask,cprecioso/homebrew-cask,gurghet/homebrew-cask,gyndav/homebrew-cask,tranc99/homebrew-cask,adrianchia/homebrew-cask,kirikiriyamama/homebrew-cask,shorshe/homebrew-cask,illusionfield/homebrew-cask,Hywan/homebrew-cask,joaoponceleao/homebrew-cask,gerrymiller/homebrew-cask,flaviocamilo/homebrew-cask,dunn/homebrew-cask,bchatard/homebrew-cask,guylabs/homebrew-cask,mrmachine/homebrew-cask,optikfluffel/homebrew-cask,gyugyu/homebrew-cask,joschi/homebrew-cask,singingwolfboy/homebrew-cask,blainesch/homebrew-cask,ch3n2k/homebrew-cask,asins/homebrew-cask,seanzxx/homebrew-cask,JacopKane/homebrew-cask,malob/homebrew-cask,colindean/homebrew-cask,lvicentesanchez/homebrew-cask,fazo96/homebrew-cask,wKovacs64/homebrew-cask,andyshinn/homebrew-cask,kevyau/homebrew-cask,Keloran/homebrew-cask,winkelsdorf/homebrew-cask,nathanielvarona/homebrew-cask,elseym/homebrew-cask,xakraz/homebrew-cask,elseym/homebrew-cask,nelsonjchen/homebrew-cask,coeligena/homebrew-customized,faun/homebrew-cask,atsuyim/homebrew-cask,wickles/homebrew-cask,cobyism/homebrew-cask,remko/homebrew-cask,mAAdhaTTah/homebrew-cask,cprecioso/homebrew-cask,mjdescy/homebrew-cask,qbmiller/homebrew-cask,patresi/homebrew-cask,timsutton/homebrew-cask,johan/homebrew-cask,samnung/homebrew-cask,JoelLarson/homebrew-cask,klane/homebrew-cask,franklouwers/homebrew-cask,uetchy/homebrew-cask,aki77/homebrew-cask,kiliankoe/homebrew-cask,sanyer/homebrew-cask,johnjelinek/homebrew-cask,axodys/homebrew-cask,fly19890211/homebrew-cask,jeroenseegers/homebrew-cask,shanonvl/homebrew-cask,mariusbutuc/homebrew-cask,rajiv/homebrew-cask,ptb/homebrew-cask,paour/homebrew-cask,winkelsdorf/homebrew-cask,optikfluffel/homebrew-cask,vigosan/homebrew-cask,cedwardsmedia/homebrew-cask,reitermarkus/homebrew-cask,shoichiaizawa/homebrew-cask,a1russell/homebrew-cask,BahtiyarB/homebrew-cask,sachin21/homebrew-cask,cblecker/homebrew-cask,jacobbednarz/homebrew-cask,kuno/homebrew-cask,askl56/homebrew-cask,paulbreslin/homebrew-cask,zeusdeux/homebrew-cask,janlugt/homebrew-cask,pinut/homebrew-cask,paulombcosta/homebrew-cask,riyad/homebrew-cask,hovancik/homebrew-cask,alexg0/homebrew-cask,Ephemera/homebrew-cask,bcomnes/homebrew-cask,pacav69/homebrew-cask,esebastian/homebrew-cask,bric3/homebrew-cask,okket/homebrew-cask,thii/homebrew-cask,hackhandslabs/homebrew-cask,reitermarkus/homebrew-cask,andyli/homebrew-cask,jgarber623/homebrew-cask,nathanielvarona/homebrew-cask,mwek/homebrew-cask,colindunn/homebrew-cask,johan/homebrew-cask,chino/homebrew-cask,schneidmaster/homebrew-cask,ywfwj2008/homebrew-cask,thii/homebrew-cask,yutarody/homebrew-cask,alexg0/homebrew-cask,boydj/homebrew-cask,n0ts/homebrew-cask,deiga/homebrew-cask,reitermarkus/homebrew-cask,ksato9700/homebrew-cask,kirikiriyamama/homebrew-cask,dezon/homebrew-cask,kuno/homebrew-cask,scribblemaniac/homebrew-cask,FredLackeyOfficial/homebrew-cask,garborg/homebrew-cask,adrianchia/homebrew-cask,n8henrie/homebrew-cask,nicolas-brousse/homebrew-cask,nrlquaker/homebrew-cask,victorpopkov/homebrew-cask,crmne/homebrew-cask,nivanchikov/homebrew-cask,mjgardner/homebrew-cask,danielbayley/homebrew-cask,Keloran/homebrew-cask,squid314/homebrew-cask,13k/homebrew-cask,zerrot/homebrew-cask,lifepillar/homebrew-cask,FinalDes/homebrew-cask,athrunsun/homebrew-cask,Amorymeltzer/homebrew-cask,lauantai/homebrew-cask,wayou/homebrew-cask,maxnordlund/homebrew-cask,ctrevino/homebrew-cask,retbrown/homebrew-cask,morganestes/homebrew-cask,iAmGhost/homebrew-cask,diguage/homebrew-cask,BenjaminHCCarr/homebrew-cask,a-x-/homebrew-cask,jeroenseegers/homebrew-cask,bendoerr/homebrew-cask,gord1anknot/homebrew-cask,chadcatlett/caskroom-homebrew-cask,afdnlw/homebrew-cask,buo/homebrew-cask,Bombenleger/homebrew-cask,mingzhi22/homebrew-cask,christophermanning/homebrew-cask,drostron/homebrew-cask,mokagio/homebrew-cask,hellosky806/homebrew-cask,Fedalto/homebrew-cask,paour/homebrew-cask,rickychilcott/homebrew-cask,MatzFan/homebrew-cask,jawshooah/homebrew-cask,mindriot101/homebrew-cask,MoOx/homebrew-cask,albertico/homebrew-cask,jgarber623/homebrew-cask,sohtsuka/homebrew-cask,lukeadams/homebrew-cask,johnste/homebrew-cask,doits/homebrew-cask,hyuna917/homebrew-cask,SamiHiltunen/homebrew-cask,tranc99/homebrew-cask,lantrix/homebrew-cask,gibsjose/homebrew-cask,MicTech/homebrew-cask,bcaceiro/homebrew-cask,Saklad5/homebrew-cask,tarwich/homebrew-cask,m3nu/homebrew-cask,inz/homebrew-cask,6uclz1/homebrew-cask,jspahrsummers/homebrew-cask,tmoreira2020/homebrew,andrewdisley/homebrew-cask,deizel/homebrew-cask,rcuza/homebrew-cask,JacopKane/homebrew-cask,fly19890211/homebrew-cask,guerrero/homebrew-cask,kesara/homebrew-cask,asins/homebrew-cask,cblecker/homebrew-cask,robertgzr/homebrew-cask,adelinofaria/homebrew-cask,samnung/homebrew-cask,Labutin/homebrew-cask,opsdev-ws/homebrew-cask,vitorgalvao/homebrew-cask,julienlavergne/homebrew-cask,scottsuch/homebrew-cask,diguage/homebrew-cask,djmonta/homebrew-cask,cedwardsmedia/homebrew-cask,feigaochn/homebrew-cask,MisumiRize/homebrew-cask,linc01n/homebrew-cask,puffdad/homebrew-cask,jellyfishcoder/homebrew-cask,kongslund/homebrew-cask,wizonesolutions/homebrew-cask,esebastian/homebrew-cask,wickedsp1d3r/homebrew-cask,af/homebrew-cask,ericbn/homebrew-cask,ingorichter/homebrew-cask,d/homebrew-cask,dwkns/homebrew-cask,kingthorin/homebrew-cask,kievechua/homebrew-cask,mazehall/homebrew-cask,robbiethegeek/homebrew-cask,hyuna917/homebrew-cask,Ibuprofen/homebrew-cask,feigaochn/homebrew-cask,tjnycum/homebrew-cask,santoshsahoo/homebrew-cask,AndreTheHunter/homebrew-cask,pinut/homebrew-cask,xalep/homebrew-cask,epmatsw/homebrew-cask,dwihn0r/homebrew-cask,reelsense/homebrew-cask,sebcode/homebrew-cask,joaocc/homebrew-cask,uetchy/homebrew-cask,yuhki50/homebrew-cask,underyx/homebrew-cask,zchee/homebrew-cask,usami-k/homebrew-cask,rogeriopradoj/homebrew-cask,markhuber/homebrew-cask,sjackman/homebrew-cask,dlovitch/homebrew-cask,kei-yamazaki/homebrew-cask,skatsuta/homebrew-cask,sysbot/homebrew-cask,gyndav/homebrew-cask,mathbunnyru/homebrew-cask,johnjelinek/homebrew-cask,mgryszko/homebrew-cask,usami-k/homebrew-cask,joaoponceleao/homebrew-cask,deizel/homebrew-cask,frapposelli/homebrew-cask,valepert/homebrew-cask,faun/homebrew-cask,mfpierre/homebrew-cask,timsutton/homebrew-cask,scribblemaniac/homebrew-cask,epardee/homebrew-cask,gilesdring/homebrew-cask,zorosteven/homebrew-cask,fharbe/homebrew-cask,kkdd/homebrew-cask,slack4u/homebrew-cask,neil-ca-moore/homebrew-cask,stephenwade/homebrew-cask,RJHsiao/homebrew-cask,johndbritton/homebrew-cask,epmatsw/homebrew-cask,frapposelli/homebrew-cask,jedahan/homebrew-cask,daften/homebrew-cask,shonjir/homebrew-cask,katoquro/homebrew-cask,squid314/homebrew-cask,jconley/homebrew-cask,markhuber/homebrew-cask,howie/homebrew-cask,Dremora/homebrew-cask,andrewdisley/homebrew-cask,afh/homebrew-cask,flaviocamilo/homebrew-cask,diogodamiani/homebrew-cask,mattrobenolt/homebrew-cask,fkrone/homebrew-cask,sgnh/homebrew-cask,thehunmonkgroup/homebrew-cask,cfillion/homebrew-cask,ajbw/homebrew-cask,mwean/homebrew-cask,ldong/homebrew-cask,tangestani/homebrew-cask,zhuzihhhh/homebrew-cask,zmwangx/homebrew-cask,scw/homebrew-cask,chuanxd/homebrew-cask,otzy007/homebrew-cask,lantrix/homebrew-cask,Whoaa512/homebrew-cask,ywfwj2008/homebrew-cask,delphinus35/homebrew-cask,phpwutz/homebrew-cask,ajbw/homebrew-cask,nathancahill/homebrew-cask,jalaziz/homebrew-cask,miguelfrde/homebrew-cask,kesara/homebrew-cask,enriclluelles/homebrew-cask,athrunsun/homebrew-cask,kassi/homebrew-cask,jeanregisser/homebrew-cask,kamilboratynski/homebrew-cask,deanmorin/homebrew-cask,garborg/homebrew-cask,arranubels/homebrew-cask,feniix/homebrew-cask,wmorin/homebrew-cask,huanzhang/homebrew-cask,csmith-palantir/homebrew-cask,coeligena/homebrew-customized,flada-auxv/homebrew-cask,gwaldo/homebrew-cask,asbachb/homebrew-cask,anbotero/homebrew-cask,ahundt/homebrew-cask,napaxton/homebrew-cask,leipert/homebrew-cask,tedski/homebrew-cask,moonboots/homebrew-cask,m3nu/homebrew-cask,nshemonsky/homebrew-cask,toonetown/homebrew-cask,SentinelWarren/homebrew-cask,vin047/homebrew-cask,astorije/homebrew-cask,rajiv/homebrew-cask,vuquoctuan/homebrew-cask,exherb/homebrew-cask,colindean/homebrew-cask,cobyism/homebrew-cask,lauantai/homebrew-cask,My2ndAngelic/homebrew-cask,djakarta-trap/homebrew-myCask,kteru/homebrew-cask,malford/homebrew-cask,mattfelsen/homebrew-cask,spruceb/homebrew-cask,lcasey001/homebrew-cask,leonmachadowilcox/homebrew-cask,lcasey001/homebrew-cask,LaurentFough/homebrew-cask,uetchy/homebrew-cask,lukasbestle/homebrew-cask,kteru/homebrew-cask,andyli/homebrew-cask,miku/homebrew-cask,exherb/homebrew-cask,mattrobenolt/homebrew-cask,jppelteret/homebrew-cask,jonathanwiesel/homebrew-cask,deiga/homebrew-cask,kongslund/homebrew-cask,nshemonsky/homebrew-cask,joschi/homebrew-cask,xiongchiamiov/homebrew-cask,alloy/homebrew-cask,zerrot/homebrew-cask,mhubig/homebrew-cask,jalaziz/homebrew-cask,caskroom/homebrew-cask,yumitsu/homebrew-cask,coneman/homebrew-cask,Ibuprofen/homebrew-cask,qnm/homebrew-cask,My2ndAngelic/homebrew-cask,mingzhi22/homebrew-cask,lieuwex/homebrew-cask,mishari/homebrew-cask,dunn/homebrew-cask,ky0615/homebrew-cask-1,qnm/homebrew-cask,mauricerkelly/homebrew-cask,wolflee/homebrew-cask,gustavoavellar/homebrew-cask,pkq/homebrew-cask,barravi/homebrew-cask,shoichiaizawa/homebrew-cask,sirodoht/homebrew-cask,y00rb/homebrew-cask,colindunn/homebrew-cask,yuhki50/homebrew-cask,bgandon/homebrew-cask,BahtiyarB/homebrew-cask,sscotth/homebrew-cask,danielgomezrico/homebrew-cask,chuanxd/homebrew-cask,dictcp/homebrew-cask,xiongchiamiov/homebrew-cask,gerrypower/homebrew-cask,greg5green/homebrew-cask,cohei/homebrew-cask,stephenwade/homebrew-cask,schneidmaster/homebrew-cask,remko/homebrew-cask,MircoT/homebrew-cask,mikem/homebrew-cask,tdsmith/homebrew-cask,jbeagley52/homebrew-cask,Nitecon/homebrew-cask,ponychicken/homebrew-customcask,zorosteven/homebrew-cask,phpwutz/homebrew-cask,skyyuan/homebrew-cask,onlynone/homebrew-cask,sysbot/homebrew-cask,lvicentesanchez/homebrew-cask,josa42/homebrew-cask,JacopKane/homebrew-cask,Cottser/homebrew-cask,andrewschleifer/homebrew-cask,casidiablo/homebrew-cask,opsdev-ws/homebrew-cask,elyscape/homebrew-cask,muan/homebrew-cask,lumaxis/homebrew-cask,samdoran/homebrew-cask,moogar0880/homebrew-cask,antogg/homebrew-cask,jawshooah/homebrew-cask,RickWong/homebrew-cask,mgryszko/homebrew-cask,nathancahill/homebrew-cask,jacobbednarz/homebrew-cask,gerrypower/homebrew-cask,ninjahoahong/homebrew-cask,artdevjs/homebrew-cask,joschi/homebrew-cask,L2G/homebrew-cask,gord1anknot/homebrew-cask,mchlrmrz/homebrew-cask,kronicd/homebrew-cask,coeligena/homebrew-customized,ftiff/homebrew-cask,mkozjak/homebrew-cask,ebraminio/homebrew-cask,epardee/homebrew-cask,mlocher/homebrew-cask,kingthorin/homebrew-cask,scw/homebrew-cask,anbotero/homebrew-cask,leonmachadowilcox/homebrew-cask,askl56/homebrew-cask,ninjahoahong/homebrew-cask,farmerchris/homebrew-cask,andyshinn/homebrew-cask,kryhear/homebrew-cask,0xadada/homebrew-cask,blogabe/homebrew-cask,elyscape/homebrew-cask,sirodoht/homebrew-cask,jmeridth/homebrew-cask,shanonvl/homebrew-cask,devmynd/homebrew-cask,tolbkni/homebrew-cask,kingthorin/homebrew-cask,githubutilities/homebrew-cask,tolbkni/homebrew-cask,syscrusher/homebrew-cask,thehunmonkgroup/homebrew-cask,andrewschleifer/homebrew-cask,gurghet/homebrew-cask,josa42/homebrew-cask,jeroenj/homebrew-cask,iAmGhost/homebrew-cask,alebcay/homebrew-cask,jiashuw/homebrew-cask,akiomik/homebrew-cask,napaxton/homebrew-cask,danielbayley/homebrew-cask,Amorymeltzer/homebrew-cask,gyndav/homebrew-cask,vmrob/homebrew-cask,nysthee/homebrew-cask,koenrh/homebrew-cask,deiga/homebrew-cask,tdsmith/homebrew-cask,miku/homebrew-cask,syscrusher/homebrew-cask,slnovak/homebrew-cask,donbobka/homebrew-cask,dieterdemeyer/homebrew-cask,ianyh/homebrew-cask,atsuyim/homebrew-cask,otaran/homebrew-cask,MisumiRize/homebrew-cask,dwihn0r/homebrew-cask,Ephemera/homebrew-cask,wuman/homebrew-cask,JosephViolago/homebrew-cask,petmoo/homebrew-cask,stonehippo/homebrew-cask,blainesch/homebrew-cask,mishari/homebrew-cask,santoshsahoo/homebrew-cask,sanchezm/homebrew-cask,enriclluelles/homebrew-cask,giannitm/homebrew-cask,JosephViolago/homebrew-cask,aguynamedryan/homebrew-cask,adriweb/homebrew-cask,malob/homebrew-cask,singingwolfboy/homebrew-cask,djmonta/homebrew-cask,Cottser/homebrew-cask,skatsuta/homebrew-cask,rkJun/homebrew-cask,genewoo/homebrew-cask,zchee/homebrew-cask,dezon/homebrew-cask,githubutilities/homebrew-cask,6uclz1/homebrew-cask,wastrachan/homebrew-cask,moimikey/homebrew-cask,JosephViolago/homebrew-cask,wuman/homebrew-cask,gustavoavellar/homebrew-cask,boecko/homebrew-cask,sachin21/homebrew-cask,afh/homebrew-cask,caskroom/homebrew-cask,tedbundyjr/homebrew-cask,ahvigil/homebrew-cask,julionc/homebrew-cask,stonehippo/homebrew-cask,0rax/homebrew-cask,williamboman/homebrew-cask,sanyer/homebrew-cask,miccal/homebrew-cask,mkozjak/homebrew-cask,antogg/homebrew-cask,jtriley/homebrew-cask,leipert/homebrew-cask,samshadwell/homebrew-cask,a-x-/homebrew-cask,sjackman/homebrew-cask,mokagio/homebrew-cask,ayohrling/homebrew-cask,jaredsampson/homebrew-cask,fwiesel/homebrew-cask,jaredsampson/homebrew-cask,thomanq/homebrew-cask,MatzFan/homebrew-cask,morsdyce/homebrew-cask,markthetech/homebrew-cask,KosherBacon/homebrew-cask,yurrriq/homebrew-cask,axodys/homebrew-cask,dictcp/homebrew-cask,alebcay/homebrew-cask,fanquake/homebrew-cask,vmrob/homebrew-cask,haha1903/homebrew-cask,ddm/homebrew-cask,miccal/homebrew-cask,vigosan/homebrew-cask,theoriginalgri/homebrew-cask,jconley/homebrew-cask,renaudguerin/homebrew-cask,pgr0ss/homebrew-cask,a1russell/homebrew-cask,ebraminio/homebrew-cask,kronicd/homebrew-cask,timsutton/homebrew-cask,wastrachan/homebrew-cask,LaurentFough/homebrew-cask,jacobdam/homebrew-cask,moimikey/homebrew-cask,paour/homebrew-cask,hakamadare/homebrew-cask,chrisfinazzo/homebrew-cask,BenjaminHCCarr/homebrew-cask,zeusdeux/homebrew-cask,mrmachine/homebrew-cask,seanorama/homebrew-cask,AnastasiaSulyagina/homebrew-cask,rhendric/homebrew-cask,kryhear/homebrew-cask,mauricerkelly/homebrew-cask,valepert/homebrew-cask,tan9/homebrew-cask,cclauss/homebrew-cask,rcuza/homebrew-cask,gibsjose/homebrew-cask,imgarylai/homebrew-cask,jtriley/homebrew-cask,casidiablo/homebrew-cask,donbobka/homebrew-cask,kei-yamazaki/homebrew-cask,lumaxis/homebrew-cask,hanxue/caskroom,joaocc/homebrew-cask,tarwich/homebrew-cask,13k/homebrew-cask,FinalDes/homebrew-cask,mariusbutuc/homebrew-cask,gwaldo/homebrew-cask,dlovitch/homebrew-cask,neil-ca-moore/homebrew-cask,stigkj/homebrew-caskroom-cask,jayshao/homebrew-cask,ky0615/homebrew-cask-1,norio-nomura/homebrew-cask,devmynd/homebrew-cask,tedski/homebrew-cask,dspeckhard/homebrew-cask,SamiHiltunen/homebrew-cask,slack4u/homebrew-cask,hanxue/caskroom,BenjaminHCCarr/homebrew-cask,Amorymeltzer/homebrew-cask,fazo96/homebrew-cask,johndbritton/homebrew-cask,xyb/homebrew-cask,shishi/homebrew-cask,chadcatlett/caskroom-homebrew-cask,retbrown/homebrew-cask,esebastian/homebrew-cask,nivanchikov/homebrew-cask,slnovak/homebrew-cask,spruceb/homebrew-cask,yurikoles/homebrew-cask,kassi/homebrew-cask,retrography/homebrew-cask,joshka/homebrew-cask,L2G/homebrew-cask,arranubels/homebrew-cask,onlynone/homebrew-cask,cfillion/homebrew-cask,nathansgreen/homebrew-cask,asbachb/homebrew-cask,retrography/homebrew-cask,miguelfrde/homebrew-cask,vuquoctuan/homebrew-cask,n8henrie/homebrew-cask,dwkns/homebrew-cask,MerelyAPseudonym/homebrew-cask,fwiesel/homebrew-cask,supriyantomaftuh/homebrew-cask,jrwesolo/homebrew-cask,iamso/homebrew-cask,yutarody/homebrew-cask,ksylvan/homebrew-cask,jangalinski/homebrew-cask,bendoerr/homebrew-cask,cliffcotino/homebrew-cask,bdhess/homebrew-cask,lolgear/homebrew-cask,Ketouem/homebrew-cask,lucasmezencio/homebrew-cask,tsparber/homebrew-cask,FranklinChen/homebrew-cask,kesara/homebrew-cask,michelegera/homebrew-cask,yurrriq/homebrew-cask,AdamCmiel/homebrew-cask,ianyh/homebrew-cask,boecko/homebrew-cask,lalyos/homebrew-cask,Labutin/homebrew-cask,sgnh/homebrew-cask,ericbn/homebrew-cask,Ngrd/homebrew-cask,mindriot101/homebrew-cask,buo/homebrew-cask,imgarylai/homebrew-cask,d/homebrew-cask,greg5green/homebrew-cask,wickedsp1d3r/homebrew-cask,claui/homebrew-cask,bric3/homebrew-cask,catap/homebrew-cask,fanquake/homebrew-cask,wKovacs64/homebrew-cask,shonjir/homebrew-cask,alloy/homebrew-cask,christer155/homebrew-cask,amatos/homebrew-cask,tyage/homebrew-cask,nathansgreen/homebrew-cask,chrisfinazzo/homebrew-cask,mattrobenolt/homebrew-cask,yumitsu/homebrew-cask,renaudguerin/homebrew-cask,codeurge/homebrew-cask,jayshao/homebrew-cask,afdnlw/homebrew-cask,zhuzihhhh/homebrew-cask,nightscape/homebrew-cask,kolomiichenko/homebrew-cask,3van/homebrew-cask,MicTech/homebrew-cask,segiddins/homebrew-cask,FranklinChen/homebrew-cask,hakamadare/homebrew-cask,jpodlech/homebrew-cask,julionc/homebrew-cask,nysthee/homebrew-cask,sscotth/homebrew-cask,Philosoft/homebrew-cask,wmorin/homebrew-cask,AnastasiaSulyagina/homebrew-cask,kiliankoe/homebrew-cask,JoelLarson/homebrew-cask,ftiff/homebrew-cask,rubenerd/homebrew-cask,y00rb/homebrew-cask,julionc/homebrew-cask,vitorgalvao/homebrew-cask,sanyer/homebrew-cask,helloIAmPau/homebrew-cask,bgandon/homebrew-cask,reelsense/homebrew-cask,aki77/homebrew-cask,KosherBacon/homebrew-cask,ponychicken/homebrew-customcask,chrisfinazzo/homebrew-cask,tmoreira2020/homebrew,sosedoff/homebrew-cask,Philosoft/homebrew-cask,supriyantomaftuh/homebrew-cask,mlocher/homebrew-cask,paulombcosta/homebrew-cask,jeanregisser/homebrew-cask,hanxue/caskroom,CameronGarrett/homebrew-cask,farmerchris/homebrew-cask,blogabe/homebrew-cask,brianshumate/homebrew-cask,royalwang/homebrew-cask,cblecker/homebrew-cask,sideci-sample/sideci-sample-homebrew-cask,djakarta-trap/homebrew-myCask,antogg/homebrew-cask,franklouwers/homebrew-cask,elnappo/homebrew-cask,skyyuan/homebrew-cask,klane/homebrew-cask,seanorama/homebrew-cask,mwean/homebrew-cask,nickpellant/homebrew-cask,goxberry/homebrew-cask,shoichiaizawa/homebrew-cask,mattfelsen/homebrew-cask,MerelyAPseudonym/homebrew-cask,jellyfishcoder/homebrew-cask,sparrc/homebrew-cask,puffdad/homebrew-cask,3van/homebrew-cask,mathbunnyru/homebrew-cask,dcondrey/homebrew-cask,taherio/homebrew-cask,rkJun/homebrew-cask,mjdescy/homebrew-cask,kkdd/homebrew-cask,josa42/homebrew-cask,tjnycum/homebrew-cask,chino/homebrew-cask,englishm/homebrew-cask,hvisage/homebrew-cask,imgarylai/homebrew-cask,sohtsuka/homebrew-cask,lolgear/homebrew-cask,dustinblackman/homebrew-cask,michelegera/homebrew-cask,JikkuJose/homebrew-cask,andersonba/homebrew-cask,kTitan/homebrew-cask,RogerThiede/homebrew-cask,JikkuJose/homebrew-cask,lucasmezencio/homebrew-cask,nrlquaker/homebrew-cask,theoriginalgri/homebrew-cask,Nitecon/homebrew-cask,ddm/homebrew-cask,gabrielizaias/homebrew-cask,ayohrling/homebrew-cask,tjnycum/homebrew-cask,jedahan/homebrew-cask,aktau/homebrew-cask,pkq/homebrew-cask,claui/homebrew-cask,jgarber623/homebrew-cask,pkq/homebrew-cask,jacobdam/homebrew-cask,xight/homebrew-cask,alebcay/homebrew-cask,jppelteret/homebrew-cask,illusionfield/homebrew-cask,jpmat296/homebrew-cask,tjt263/homebrew-cask,yurikoles/homebrew-cask,huanzhang/homebrew-cask,daften/homebrew-cask,codeurge/homebrew-cask,christophermanning/homebrew-cask,wmorin/homebrew-cask,dvdoliveira/homebrew-cask,m3nu/homebrew-cask,tjt263/homebrew-cask,ericbn/homebrew-cask,n0ts/homebrew-cask,lifepillar/homebrew-cask,optikfluffel/homebrew-cask,gguillotte/homebrew-cask,nrlquaker/homebrew-cask,haha1903/homebrew-cask,shishi/homebrew-cask,tsparber/homebrew-cask,katoquro/homebrew-cask,malob/homebrew-cask,troyxmccall/homebrew-cask,Gasol/homebrew-cask
ruby
## Code Before: class Rapidweaver < Cask version :latest sha256 :no_check url 'https://realmacsoftware.com/redirects/rapidweaver/direct' appcast 'http://www.realmacsoftware.com/stats/rapidweaver5.php' homepage 'http://realmacsoftware.com/rapidweaver' license :unknown app 'RapidWeaver.app' end ## Instruction: Upgrade RapidWeaver.app to version 6 ## Code After: class Rapidweaver < Cask version '6' sha256 'e7b72daffa9c7809b713e05a1b518873a71a73afdd6a97f300b8bb9ab35a9361' url "http://realmacsoftware.com/redirects/rapidweaver#{version}/direct" appcast "http://www.realmacsoftware.com/stats/rapidweaver#{version}.php" homepage 'http://realmacsoftware.com/rapidweaver' license :unknown app 'RapidWeaver.app' end
class Rapidweaver < Cask - version :latest - sha256 :no_check + version '6' + sha256 'e7b72daffa9c7809b713e05a1b518873a71a73afdd6a97f300b8bb9ab35a9361' - url 'https://realmacsoftware.com/redirects/rapidweaver/direct' ? ^ - ^ + url "http://realmacsoftware.com/redirects/rapidweaver#{version}/direct" ? ^ ++++++++++ ^ - appcast 'http://www.realmacsoftware.com/stats/rapidweaver5.php' ? ^ ^ ^ + appcast "http://www.realmacsoftware.com/stats/rapidweaver#{version}.php" ? ^ ^^^^^^^^^^ ^ homepage 'http://realmacsoftware.com/rapidweaver' license :unknown app 'RapidWeaver.app' end
8
0.727273
4
4
909b06e805e2f64686dfc411b49bae948f196399
lib/Predis/Commands/ServerEval.php
lib/Predis/Commands/ServerEval.php
<?php namespace Predis\Commands; class ServerEval extends Command { public function getId() { return 'EVAL'; } protected function canBeHashed() { return false; } }
<?php namespace Predis\Commands; class ServerEval extends Command { public function getId() { return 'EVAL'; } protected function onPrefixKeys(Array $arguments, $prefix) { $arguments = $this->getArguments(); for ($i = 2; $i < $arguments[1] + 2; $i++) { $arguments[$i] = "$prefix{$arguments[$i]}"; } return $arguments; } protected function canBeHashed() { return false; } }
Add support for transparent key prefixing to EVAL and the Predis\Commands\ScriptedCommand class.
Add support for transparent key prefixing to EVAL and the Predis\Commands\ScriptedCommand class.
PHP
mit
CloudSide/predis,johnhelmuth/predis,vend/predis,zhangyancoder/predis,moria/predis,RedisLabs/predis,RudyJessop/predis,SwelenFrance/predis,gopalindians/predis,channelgrabber/predis,stokes3452/predis,oswaldderiemaecker/predis,mrkeng/predis,quangnguyen90/predis,dzung2t/predis,lvbaosong/predis,gencer/predis,nguyenthaihan/predis,protomouse/predis,WalterShe/predis,nrk/predis,SecureCloud-biz/predis,dominics/predis
php
## Code Before: <?php namespace Predis\Commands; class ServerEval extends Command { public function getId() { return 'EVAL'; } protected function canBeHashed() { return false; } } ## Instruction: Add support for transparent key prefixing to EVAL and the Predis\Commands\ScriptedCommand class. ## Code After: <?php namespace Predis\Commands; class ServerEval extends Command { public function getId() { return 'EVAL'; } protected function onPrefixKeys(Array $arguments, $prefix) { $arguments = $this->getArguments(); for ($i = 2; $i < $arguments[1] + 2; $i++) { $arguments[$i] = "$prefix{$arguments[$i]}"; } return $arguments; } protected function canBeHashed() { return false; } }
<?php namespace Predis\Commands; class ServerEval extends Command { public function getId() { return 'EVAL'; } + protected function onPrefixKeys(Array $arguments, $prefix) { + $arguments = $this->getArguments(); + for ($i = 2; $i < $arguments[1] + 2; $i++) { + $arguments[$i] = "$prefix{$arguments[$i]}"; + } + return $arguments; + } + protected function canBeHashed() { return false; } }
8
0.615385
8
0
e8d71b25e9e19d1cb26c93ec8dd640e06c6bd3ce
config/initializers/aws.rb
config/initializers/aws.rb
AssetManager.aws_s3_bucket_name = if Rails.env.production? ENV.fetch('AWS_S3_BUCKET_NAME') else ENV['AWS_S3_BUCKET_NAME'] end AssetManager.aws_s3_use_virtual_host = ENV['AWS_S3_USE_VIRTUAL_HOST'].present? Aws.config.update( logger: Rails.logger )
AssetManager.aws_s3_bucket_name = if Rails.env.production? ENV.fetch('AWS_S3_BUCKET_NAME') else ENV['AWS_S3_BUCKET_NAME'] end AssetManager.aws_s3_use_virtual_host = ENV['AWS_S3_USE_VIRTUAL_HOST'].present? Aws.config.update( logger: Rails.logger )
Fix some Rubocop offences & warnings in AWS config
Fix some Rubocop offences & warnings in AWS config I was seeing the following locally: * Layout/IndentationWidth * Layout/ElseAlignment * Lint/EndAlignment For some reason these were not picked up by the Jenkins CI build of the master branch. I plan to investigate why this didn't happen separately.
Ruby
mit
alphagov/asset-manager,alphagov/asset-manager,alphagov/asset-manager
ruby
## Code Before: AssetManager.aws_s3_bucket_name = if Rails.env.production? ENV.fetch('AWS_S3_BUCKET_NAME') else ENV['AWS_S3_BUCKET_NAME'] end AssetManager.aws_s3_use_virtual_host = ENV['AWS_S3_USE_VIRTUAL_HOST'].present? Aws.config.update( logger: Rails.logger ) ## Instruction: Fix some Rubocop offences & warnings in AWS config I was seeing the following locally: * Layout/IndentationWidth * Layout/ElseAlignment * Lint/EndAlignment For some reason these were not picked up by the Jenkins CI build of the master branch. I plan to investigate why this didn't happen separately. ## Code After: AssetManager.aws_s3_bucket_name = if Rails.env.production? ENV.fetch('AWS_S3_BUCKET_NAME') else ENV['AWS_S3_BUCKET_NAME'] end AssetManager.aws_s3_use_virtual_host = ENV['AWS_S3_USE_VIRTUAL_HOST'].present? Aws.config.update( logger: Rails.logger )
AssetManager.aws_s3_bucket_name = if Rails.env.production? - ENV.fetch('AWS_S3_BUCKET_NAME') - else - ENV['AWS_S3_BUCKET_NAME'] - end + ENV.fetch('AWS_S3_BUCKET_NAME') + else + ENV['AWS_S3_BUCKET_NAME'] + end AssetManager.aws_s3_use_virtual_host = ENV['AWS_S3_USE_VIRTUAL_HOST'].present? Aws.config.update( logger: Rails.logger )
8
0.727273
4
4
192ba8b2bfa138662cd25f3502fd376187ca3e73
pagarme/__init__.py
pagarme/__init__.py
from .pagarme import Pagarme from .exceptions import *
__version__ = '2.0.0-dev' __description__ = 'Pagar.me Python library' __long_description__ = '' from .pagarme import Pagarme from .exceptions import *
Set global stats for pagarme-python
Set global stats for pagarme-python
Python
mit
pbassut/pagarme-python,mbodock/pagarme-python,pagarme/pagarme-python,aroncds/pagarme-python,reginaldojunior/pagarme-python
python
## Code Before: from .pagarme import Pagarme from .exceptions import * ## Instruction: Set global stats for pagarme-python ## Code After: __version__ = '2.0.0-dev' __description__ = 'Pagar.me Python library' __long_description__ = '' from .pagarme import Pagarme from .exceptions import *
+ + __version__ = '2.0.0-dev' + __description__ = 'Pagar.me Python library' + __long_description__ = '' from .pagarme import Pagarme from .exceptions import *
4
1.333333
4
0
bc13c658cd1aba7d2ea751ecf25186cb22fa1e70
README.md
README.md
[![Build Status](https://travis-ci.org/doctrine/annotations.svg?branch=master)](https://travis-ci.org/doctrine/annotations) [![Dependency Status](https://www.versioneye.com/package/php--doctrine--annotations/badge.png)](https://www.versioneye.com/package/php--doctrine--annotations) [![Reference Status](https://www.versioneye.com/php/doctrine:annotations/reference_badge.svg)](https://www.versioneye.com/php/doctrine:annotations/references) [![Total Downloads](https://poser.pugx.org/doctrine/annotations/downloads.png)](https://packagist.org/packages/doctrine/annotations) [![Latest Stable Version](https://poser.pugx.org/doctrine/annotations/v/stable.png)](https://packagist.org/packages/doctrine/annotations) Docblock Annotations Parser library (extracted from [Doctrine Common](https://github.com/doctrine/common)). ## Documentation See the [doctrine-project website](https://www.doctrine-project.org/projects/doctrine-annotations/en/latest/index.html). ## Contributing When making a pull request, make sure your changes follow the [Coding Standard Guidelines](https://www.doctrine-project.org/projects/doctrine-coding-standard/en/latest/reference/index.html#introduction). ## Changelog See [CHANGELOG.md](CHANGELOG.md).
[![Dependency Status](https://www.versioneye.com/package/php--doctrine--annotations/badge.png)](https://www.versioneye.com/package/php--doctrine--annotations) [![Reference Status](https://www.versioneye.com/php/doctrine:annotations/reference_badge.svg)](https://www.versioneye.com/php/doctrine:annotations/references) [![Total Downloads](https://poser.pugx.org/doctrine/annotations/downloads.png)](https://packagist.org/packages/doctrine/annotations) [![Latest Stable Version](https://poser.pugx.org/doctrine/annotations/v/stable.png)](https://packagist.org/packages/doctrine/annotations) Docblock Annotations Parser library (extracted from [Doctrine Common](https://github.com/doctrine/common)). ## Documentation See the [doctrine-project website](https://www.doctrine-project.org/projects/doctrine-annotations/en/latest/index.html). ## Contributing When making a pull request, make sure your changes follow the [Coding Standard Guidelines](https://www.doctrine-project.org/projects/doctrine-coding-standard/en/latest/reference/index.html#introduction). ## Changelog See [CHANGELOG.md](CHANGELOG.md).
Remove Travis Build status badge
Remove Travis Build status badge [ci-skip]
Markdown
mit
doctrine/annotations
markdown
## Code Before: [![Build Status](https://travis-ci.org/doctrine/annotations.svg?branch=master)](https://travis-ci.org/doctrine/annotations) [![Dependency Status](https://www.versioneye.com/package/php--doctrine--annotations/badge.png)](https://www.versioneye.com/package/php--doctrine--annotations) [![Reference Status](https://www.versioneye.com/php/doctrine:annotations/reference_badge.svg)](https://www.versioneye.com/php/doctrine:annotations/references) [![Total Downloads](https://poser.pugx.org/doctrine/annotations/downloads.png)](https://packagist.org/packages/doctrine/annotations) [![Latest Stable Version](https://poser.pugx.org/doctrine/annotations/v/stable.png)](https://packagist.org/packages/doctrine/annotations) Docblock Annotations Parser library (extracted from [Doctrine Common](https://github.com/doctrine/common)). ## Documentation See the [doctrine-project website](https://www.doctrine-project.org/projects/doctrine-annotations/en/latest/index.html). ## Contributing When making a pull request, make sure your changes follow the [Coding Standard Guidelines](https://www.doctrine-project.org/projects/doctrine-coding-standard/en/latest/reference/index.html#introduction). ## Changelog See [CHANGELOG.md](CHANGELOG.md). ## Instruction: Remove Travis Build status badge [ci-skip] ## Code After: [![Dependency Status](https://www.versioneye.com/package/php--doctrine--annotations/badge.png)](https://www.versioneye.com/package/php--doctrine--annotations) [![Reference Status](https://www.versioneye.com/php/doctrine:annotations/reference_badge.svg)](https://www.versioneye.com/php/doctrine:annotations/references) [![Total Downloads](https://poser.pugx.org/doctrine/annotations/downloads.png)](https://packagist.org/packages/doctrine/annotations) [![Latest Stable Version](https://poser.pugx.org/doctrine/annotations/v/stable.png)](https://packagist.org/packages/doctrine/annotations) Docblock Annotations Parser library (extracted from [Doctrine Common](https://github.com/doctrine/common)). ## Documentation See the [doctrine-project website](https://www.doctrine-project.org/projects/doctrine-annotations/en/latest/index.html). ## Contributing When making a pull request, make sure your changes follow the [Coding Standard Guidelines](https://www.doctrine-project.org/projects/doctrine-coding-standard/en/latest/reference/index.html#introduction). ## Changelog See [CHANGELOG.md](CHANGELOG.md).
- [![Build Status](https://travis-ci.org/doctrine/annotations.svg?branch=master)](https://travis-ci.org/doctrine/annotations) [![Dependency Status](https://www.versioneye.com/package/php--doctrine--annotations/badge.png)](https://www.versioneye.com/package/php--doctrine--annotations) [![Reference Status](https://www.versioneye.com/php/doctrine:annotations/reference_badge.svg)](https://www.versioneye.com/php/doctrine:annotations/references) [![Total Downloads](https://poser.pugx.org/doctrine/annotations/downloads.png)](https://packagist.org/packages/doctrine/annotations) [![Latest Stable Version](https://poser.pugx.org/doctrine/annotations/v/stable.png)](https://packagist.org/packages/doctrine/annotations) Docblock Annotations Parser library (extracted from [Doctrine Common](https://github.com/doctrine/common)). ## Documentation See the [doctrine-project website](https://www.doctrine-project.org/projects/doctrine-annotations/en/latest/index.html). ## Contributing When making a pull request, make sure your changes follow the [Coding Standard Guidelines](https://www.doctrine-project.org/projects/doctrine-coding-standard/en/latest/reference/index.html#introduction). ## Changelog See [CHANGELOG.md](CHANGELOG.md).
1
0.047619
0
1
c6a07d1b6012355ad30185a76cf32d65fe7c597f
app/models/wisdom/question.rb
app/models/wisdom/question.rb
module Wisdom class Question < ActiveRecord::Base attr_accessible :row_order, :text, :title belongs_to :topic, inverse_of: :questions include RankedModel ranks :row_order validates :text, :presence => true validates :title, :presence => true before_save do self.slug = slug.downcase.parameterize end end end
module Wisdom class Question < ActiveRecord::Base attr_accessible :row_order, :text, :title belongs_to :topic, inverse_of: :questions include RankedModel ranks :row_order validates :text, :presence => true validates :title, :presence => true end end
Fix broken model, Question slug is not a thing.
Fix broken model, Question slug is not a thing.
Ruby
mit
maxwell/wisdom,maxwell/wisdom
ruby
## Code Before: module Wisdom class Question < ActiveRecord::Base attr_accessible :row_order, :text, :title belongs_to :topic, inverse_of: :questions include RankedModel ranks :row_order validates :text, :presence => true validates :title, :presence => true before_save do self.slug = slug.downcase.parameterize end end end ## Instruction: Fix broken model, Question slug is not a thing. ## Code After: module Wisdom class Question < ActiveRecord::Base attr_accessible :row_order, :text, :title belongs_to :topic, inverse_of: :questions include RankedModel ranks :row_order validates :text, :presence => true validates :title, :presence => true end end
module Wisdom class Question < ActiveRecord::Base attr_accessible :row_order, :text, :title belongs_to :topic, inverse_of: :questions include RankedModel ranks :row_order validates :text, :presence => true validates :title, :presence => true - before_save do - self.slug = slug.downcase.parameterize - end - end end
4
0.25
0
4
07d5079fff9bf09a3b042e7bf9eb0dee78dea85d
locales/ach/email.properties
locales/ach/email.properties
second_paragraph_email=Wa yee ni intanet tye gin ma kato diro; obedo jami tic pa lwak. Obedo kero ma ki nywako pi ber ne. third_paragraph_email=Pwoc odoco pi gin ducu ma itimo me cwako Mozilla. Kacel, wa bigwoko intanet ayaba, ma nonge, ma leng ki maber. here_is_record=HERE IS A TRANSACTION RECEIPT FOR YOUR RECORDS: more_info_about_page=Pi ngec mapol nen potbuk me ikom wa. (http://www.mozilla.org/about/) cancel_recurring_donation_email=Kacce imito juko miyo kony mamegi me kare ki kare ni, cwal kwac mamegi ii email bot donate@mozilla.org. your_gift_help_us=Mic mamegi konyo wa me yubo intanet ma wilobo mito. we_are_global_community=Wabedo lwak me wilobo ma tiyo me gwoko intanet ma kwoo ki ma nonge
first_paragraph_email=Pwoc tutwal pi mic mamegi bot Mozilla. Ki mic mamegi, wa bimede ki miti wa me yubo ki gwoko kakube ayaba. second_paragraph_email=Wa yee ni intanet tye gin ma kato diro; obedo jami tic pa lwak. Obedo kero ma ki nywako pi ber ne. third_paragraph_email=Pwoc odoco pi gin ducu ma itimo me cwako Mozilla. Kacel, wa bigwoko intanet ayaba, ma nonge, ma leng ki maber. here_is_record=HERE IS A TRANSACTION RECEIPT FOR YOUR RECORDS: more_info_about_page=Pi ngec mapol nen potbuk me ikom wa. (http://www.mozilla.org/about/) cancel_recurring_donation_email=Kacce imito juko miyo kony mamegi me kare ki kare ni, cwal kwac mamegi ii email bot donate@mozilla.org. your_gift_help_us=Mic mamegi konyo wa me yubo intanet ma wilobo mito. we_are_global_community=Wabedo lwak me wilobo ma tiyo me gwoko intanet ma kwoo ki ma nonge
Update Acholi (ach) localization of Fundraising
Pontoon: Update Acholi (ach) localization of Fundraising Localization authors: - denish <denish@mozilla-uganda.org>
INI
mpl-2.0
mozilla/donate.mozilla.org
ini
## Code Before: second_paragraph_email=Wa yee ni intanet tye gin ma kato diro; obedo jami tic pa lwak. Obedo kero ma ki nywako pi ber ne. third_paragraph_email=Pwoc odoco pi gin ducu ma itimo me cwako Mozilla. Kacel, wa bigwoko intanet ayaba, ma nonge, ma leng ki maber. here_is_record=HERE IS A TRANSACTION RECEIPT FOR YOUR RECORDS: more_info_about_page=Pi ngec mapol nen potbuk me ikom wa. (http://www.mozilla.org/about/) cancel_recurring_donation_email=Kacce imito juko miyo kony mamegi me kare ki kare ni, cwal kwac mamegi ii email bot donate@mozilla.org. your_gift_help_us=Mic mamegi konyo wa me yubo intanet ma wilobo mito. we_are_global_community=Wabedo lwak me wilobo ma tiyo me gwoko intanet ma kwoo ki ma nonge ## Instruction: Pontoon: Update Acholi (ach) localization of Fundraising Localization authors: - denish <denish@mozilla-uganda.org> ## Code After: first_paragraph_email=Pwoc tutwal pi mic mamegi bot Mozilla. Ki mic mamegi, wa bimede ki miti wa me yubo ki gwoko kakube ayaba. second_paragraph_email=Wa yee ni intanet tye gin ma kato diro; obedo jami tic pa lwak. Obedo kero ma ki nywako pi ber ne. third_paragraph_email=Pwoc odoco pi gin ducu ma itimo me cwako Mozilla. Kacel, wa bigwoko intanet ayaba, ma nonge, ma leng ki maber. here_is_record=HERE IS A TRANSACTION RECEIPT FOR YOUR RECORDS: more_info_about_page=Pi ngec mapol nen potbuk me ikom wa. (http://www.mozilla.org/about/) cancel_recurring_donation_email=Kacce imito juko miyo kony mamegi me kare ki kare ni, cwal kwac mamegi ii email bot donate@mozilla.org. your_gift_help_us=Mic mamegi konyo wa me yubo intanet ma wilobo mito. we_are_global_community=Wabedo lwak me wilobo ma tiyo me gwoko intanet ma kwoo ki ma nonge
+ first_paragraph_email=Pwoc tutwal pi mic mamegi bot Mozilla. Ki mic mamegi, wa bimede ki miti wa me yubo ki gwoko kakube ayaba. second_paragraph_email=Wa yee ni intanet tye gin ma kato diro; obedo jami tic pa lwak. Obedo kero ma ki nywako pi ber ne. third_paragraph_email=Pwoc odoco pi gin ducu ma itimo me cwako Mozilla. Kacel, wa bigwoko intanet ayaba, ma nonge, ma leng ki maber. here_is_record=HERE IS A TRANSACTION RECEIPT FOR YOUR RECORDS: more_info_about_page=Pi ngec mapol nen potbuk me ikom wa. (http://www.mozilla.org/about/) cancel_recurring_donation_email=Kacce imito juko miyo kony mamegi me kare ki kare ni, cwal kwac mamegi ii email bot donate@mozilla.org. your_gift_help_us=Mic mamegi konyo wa me yubo intanet ma wilobo mito. we_are_global_community=Wabedo lwak me wilobo ma tiyo me gwoko intanet ma kwoo ki ma nonge
1
0.142857
1
0
cabb50b76926e1da2a525101b4fe599e9f0fcaa0
src/containers/app.js
src/containers/app.js
import React, { Component, PropTypes } from 'react'; import { connect } from 'react-redux'; import Heading from '../components/heading' import Counter from '../components/counter' import { increment, decrement } from '../actions' class App extends Component { render() { const { dispatch, counter, children } = this.props return ( <div> <Heading>Counter</Heading> <Counter counter = {counter} onIncrement = { () => dispatch(increment()) } onDecrement = { () => dispatch(decrement()) } /> {children} </div> ); } } function mapStateToProps (state) { return { counter: state.counter } } export default connect(mapStateToProps)(App);
import React, { Component, PropTypes } from 'react' import { connect } from 'react-redux' import { bindActionCreators } from 'redux' import Heading from '../components/heading' import Counter from '../components/counter' import * as Actions from '../actions' class App extends Component { render() { const { increment, decrement, counter, children } = this.props return ( <div> <Heading>Counter</Heading> <Counter counter = {counter} onIncrement = { () => increment() } onDecrement = { () => decrement() } /> {children} </div> ); } } function mapStateToProps (state) { return { counter: state.counter } } function mapDispatchToProps (dispatch) { return bindActionCreators(Actions, dispatch); } export default connect(mapStateToProps, mapDispatchToProps)(App);
Use mapDispatchToProps and bindActionCreators to streamline the dispatch process.
Use mapDispatchToProps and bindActionCreators to streamline the dispatch process.
JavaScript
mit
edwardmsmith/react-starter,edwardmsmith/react-starter
javascript
## Code Before: import React, { Component, PropTypes } from 'react'; import { connect } from 'react-redux'; import Heading from '../components/heading' import Counter from '../components/counter' import { increment, decrement } from '../actions' class App extends Component { render() { const { dispatch, counter, children } = this.props return ( <div> <Heading>Counter</Heading> <Counter counter = {counter} onIncrement = { () => dispatch(increment()) } onDecrement = { () => dispatch(decrement()) } /> {children} </div> ); } } function mapStateToProps (state) { return { counter: state.counter } } export default connect(mapStateToProps)(App); ## Instruction: Use mapDispatchToProps and bindActionCreators to streamline the dispatch process. ## Code After: import React, { Component, PropTypes } from 'react' import { connect } from 'react-redux' import { bindActionCreators } from 'redux' import Heading from '../components/heading' import Counter from '../components/counter' import * as Actions from '../actions' class App extends Component { render() { const { increment, decrement, counter, children } = this.props return ( <div> <Heading>Counter</Heading> <Counter counter = {counter} onIncrement = { () => increment() } onDecrement = { () => decrement() } /> {children} </div> ); } } function mapStateToProps (state) { return { counter: state.counter } } function mapDispatchToProps (dispatch) { return bindActionCreators(Actions, dispatch); } export default connect(mapStateToProps, mapDispatchToProps)(App);
- import React, { Component, PropTypes } from 'react'; ? - + import React, { Component, PropTypes } from 'react' - import { connect } from 'react-redux'; ? - + import { connect } from 'react-redux' + import { bindActionCreators } from 'redux' import Heading from '../components/heading' import Counter from '../components/counter' - import { increment, decrement } from '../actions' + import * as Actions from '../actions' class App extends Component { - render() { - const { dispatch, counter, children } = this.props ? ^^^^ -- + const { increment, decrement, counter, children } = this.props ? +++++++++++ ^^^^^^^ return ( <div> <Heading>Counter</Heading> <Counter counter = {counter} onIncrement = { () => - dispatch(increment()) } ? --------- - + increment() } onDecrement = { () => - dispatch(decrement()) } ? --------- - + decrement() } /> ? +++ - /> {children} </div> ); } } function mapStateToProps (state) { return { counter: state.counter } } + function mapDispatchToProps (dispatch) { + return bindActionCreators(Actions, dispatch); + } - export default connect(mapStateToProps)(App); + export default connect(mapStateToProps, mapDispatchToProps)(App); ? ++++++++++++++++++++
20
0.606061
11
9
e52071f4e29669022e2503c1b9f8c8d38cee01be
resque-lock-timeout.gemspec
resque-lock-timeout.gemspec
Gem::Specification.new do |s| s.name = 'resque-lock-timeout' s.version = '0.3.1' s.date = Time.now.strftime('%Y-%m-%d') s.summary = 'A Resque plugin adding locking, with optional timeout/deadlock handling to resque jobs.' s.homepage = 'http://github.com/lantins/resque-lock-timeout' s.email = 'luke@lividpenguin.com' s.authors = ['Luke Antins', 'Ryan Carver', 'Chris Wanstrath'] s.has_rdoc = false s.files = %w(README.md Rakefile LICENSE HISTORY.md) s.files += Dir.glob('lib/**/*') s.files += Dir.glob('test/**/*') s.add_dependency('resque', '>= 1.8.0') s.add_development_dependency('turn') s.add_development_dependency('yard') s.add_development_dependency('rdiscount') s.description = <<desc A Resque plugin. Adds locking, with optional timeout/deadlock handling to resque jobs. Using a `lock_timeout` allows you to re-aquire the lock should your worker fail, crash, or is otherwise unable to relase the lock. i.e. Your server unexpectedly looses power. Very handy for jobs that are recurring or may be retried. desc end
Gem::Specification.new do |s| s.name = 'resque-lock-timeout' s.version = '0.3.1' s.date = Time.now.strftime('%Y-%m-%d') s.summary = 'A Resque plugin adding locking, with optional timeout/deadlock handling to resque jobs.' s.homepage = 'http://github.com/lantins/resque-lock-timeout' s.email = 'luke@lividpenguin.com' s.authors = ['Luke Antins', 'Ryan Carver', 'Chris Wanstrath'] s.has_rdoc = false s.files = %w(README.md Rakefile LICENSE HISTORY.md) s.files += Dir.glob('lib/**/*') s.files += Dir.glob('test/**/*') s.add_dependency('resque', '>= 1.8.0') s.add_development_dependency('rake') s.add_development_dependency('turn') s.add_development_dependency('yard') s.add_development_dependency('rdiscount') s.description = <<desc A Resque plugin. Adds locking, with optional timeout/deadlock handling to resque jobs. Using a `lock_timeout` allows you to re-aquire the lock should your worker fail, crash, or is otherwise unable to relase the lock. i.e. Your server unexpectedly looses power. Very handy for jobs that are recurring or may be retried. desc end
Add `rake` gem as development dependency.
Add `rake` gem as development dependency.
Ruby
mit
lantins/resque-lock-timeout,Talkdesk/resque-lock-timeout,lantins/resque-lock-timeout,Talkdesk/resque-lock-timeout
ruby
## Code Before: Gem::Specification.new do |s| s.name = 'resque-lock-timeout' s.version = '0.3.1' s.date = Time.now.strftime('%Y-%m-%d') s.summary = 'A Resque plugin adding locking, with optional timeout/deadlock handling to resque jobs.' s.homepage = 'http://github.com/lantins/resque-lock-timeout' s.email = 'luke@lividpenguin.com' s.authors = ['Luke Antins', 'Ryan Carver', 'Chris Wanstrath'] s.has_rdoc = false s.files = %w(README.md Rakefile LICENSE HISTORY.md) s.files += Dir.glob('lib/**/*') s.files += Dir.glob('test/**/*') s.add_dependency('resque', '>= 1.8.0') s.add_development_dependency('turn') s.add_development_dependency('yard') s.add_development_dependency('rdiscount') s.description = <<desc A Resque plugin. Adds locking, with optional timeout/deadlock handling to resque jobs. Using a `lock_timeout` allows you to re-aquire the lock should your worker fail, crash, or is otherwise unable to relase the lock. i.e. Your server unexpectedly looses power. Very handy for jobs that are recurring or may be retried. desc end ## Instruction: Add `rake` gem as development dependency. ## Code After: Gem::Specification.new do |s| s.name = 'resque-lock-timeout' s.version = '0.3.1' s.date = Time.now.strftime('%Y-%m-%d') s.summary = 'A Resque plugin adding locking, with optional timeout/deadlock handling to resque jobs.' s.homepage = 'http://github.com/lantins/resque-lock-timeout' s.email = 'luke@lividpenguin.com' s.authors = ['Luke Antins', 'Ryan Carver', 'Chris Wanstrath'] s.has_rdoc = false s.files = %w(README.md Rakefile LICENSE HISTORY.md) s.files += Dir.glob('lib/**/*') s.files += Dir.glob('test/**/*') s.add_dependency('resque', '>= 1.8.0') s.add_development_dependency('rake') s.add_development_dependency('turn') s.add_development_dependency('yard') s.add_development_dependency('rdiscount') s.description = <<desc A Resque plugin. Adds locking, with optional timeout/deadlock handling to resque jobs. Using a `lock_timeout` allows you to re-aquire the lock should your worker fail, crash, or is otherwise unable to relase the lock. i.e. Your server unexpectedly looses power. Very handy for jobs that are recurring or may be retried. desc end
Gem::Specification.new do |s| s.name = 'resque-lock-timeout' s.version = '0.3.1' s.date = Time.now.strftime('%Y-%m-%d') s.summary = 'A Resque plugin adding locking, with optional timeout/deadlock handling to resque jobs.' s.homepage = 'http://github.com/lantins/resque-lock-timeout' s.email = 'luke@lividpenguin.com' s.authors = ['Luke Antins', 'Ryan Carver', 'Chris Wanstrath'] s.has_rdoc = false s.files = %w(README.md Rakefile LICENSE HISTORY.md) s.files += Dir.glob('lib/**/*') s.files += Dir.glob('test/**/*') s.add_dependency('resque', '>= 1.8.0') + s.add_development_dependency('rake') s.add_development_dependency('turn') s.add_development_dependency('yard') s.add_development_dependency('rdiscount') s.description = <<desc A Resque plugin. Adds locking, with optional timeout/deadlock handling to resque jobs. Using a `lock_timeout` allows you to re-aquire the lock should your worker fail, crash, or is otherwise unable to relase the lock. i.e. Your server unexpectedly looses power. Very handy for jobs that are recurring or may be retried. desc end
1
0.033333
1
0
2e5bec360e5d48a1565f18320bf9b5ab3776664f
src/test/java/net/simplestorage/storage/impl/indexed/IndexedStorageFileTest.java
src/test/java/net/simplestorage/storage/impl/indexed/IndexedStorageFileTest.java
package net.simplestorage.storage.impl.indexed; import org.junit.Before; import org.junit.Test; import java.io.File; import java.io.FileNotFoundException; import java.net.URISyntaxException; import static org.junit.Assert.*; public class IndexedStorageFileTest { private IndexedStorageFile storageFile; @Before public void setUp() throws URISyntaxException, FileNotFoundException { File testFile = new File(ExistingIndexedStorageIntegrationTest.class.getResource("testIndexFile.txt").toURI()); storageFile = new IndexedStorageFile(testFile.getAbsolutePath()); } @Test public void testRead() throws Exception { String data = storageFile.read(0,7); assertEquals("test001", data); data = storageFile.read(7,7); assertEquals("test002", data); } @Test public void testWrite() throws Exception { assertEquals(14,storageFile.length()); storageFile.write("test003", 14); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } @Test public void testAppend() throws Exception { assertEquals(14,storageFile.length()); storageFile.append("test003"); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } @Test public void testClear() throws Exception { } }
package net.simplestorage.storage.impl.indexed; import org.junit.After; import org.junit.Before; import org.junit.Test; import java.io.BufferedOutputStream; import java.io.File; import java.io.FileOutputStream; import java.io.IOException; import java.net.URISyntaxException; import static org.junit.Assert.assertEquals; public class IndexedStorageFileTest { private IndexedStorageFile storageFile; private File testFile; @Before public void setUp() throws URISyntaxException, IOException { testFile = new File(ExistingIndexedStorageIntegrationTest.class.getResource("testIndexFile.txt").toURI()); BufferedOutputStream stream = new BufferedOutputStream(new FileOutputStream(testFile)); stream.write("test001test002".toString().getBytes("UTF-8")); stream.close(); storageFile = new IndexedStorageFile(testFile.getAbsolutePath()); } @After public void tearDown() { testFile.delete(); } @Test public void testRead() throws Exception { String data = storageFile.read(0,7); assertEquals("test001", data); data = storageFile.read(7,7); assertEquals("test002", data); } @Test public void testWrite() throws Exception { assertEquals(14,storageFile.length()); storageFile.write("test003", 14); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } @Test public void testAppend() throws Exception { assertEquals(14,storageFile.length()); storageFile.append("test003"); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } }
Fix length problem in test.
Fix length problem in test.
Java
mit
flesire/simple-storage
java
## Code Before: package net.simplestorage.storage.impl.indexed; import org.junit.Before; import org.junit.Test; import java.io.File; import java.io.FileNotFoundException; import java.net.URISyntaxException; import static org.junit.Assert.*; public class IndexedStorageFileTest { private IndexedStorageFile storageFile; @Before public void setUp() throws URISyntaxException, FileNotFoundException { File testFile = new File(ExistingIndexedStorageIntegrationTest.class.getResource("testIndexFile.txt").toURI()); storageFile = new IndexedStorageFile(testFile.getAbsolutePath()); } @Test public void testRead() throws Exception { String data = storageFile.read(0,7); assertEquals("test001", data); data = storageFile.read(7,7); assertEquals("test002", data); } @Test public void testWrite() throws Exception { assertEquals(14,storageFile.length()); storageFile.write("test003", 14); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } @Test public void testAppend() throws Exception { assertEquals(14,storageFile.length()); storageFile.append("test003"); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } @Test public void testClear() throws Exception { } } ## Instruction: Fix length problem in test. ## Code After: package net.simplestorage.storage.impl.indexed; import org.junit.After; import org.junit.Before; import org.junit.Test; import java.io.BufferedOutputStream; import java.io.File; import java.io.FileOutputStream; import java.io.IOException; import java.net.URISyntaxException; import static org.junit.Assert.assertEquals; public class IndexedStorageFileTest { private IndexedStorageFile storageFile; private File testFile; @Before public void setUp() throws URISyntaxException, IOException { testFile = new File(ExistingIndexedStorageIntegrationTest.class.getResource("testIndexFile.txt").toURI()); BufferedOutputStream stream = new BufferedOutputStream(new FileOutputStream(testFile)); stream.write("test001test002".toString().getBytes("UTF-8")); stream.close(); storageFile = new IndexedStorageFile(testFile.getAbsolutePath()); } @After public void tearDown() { testFile.delete(); } @Test public void testRead() throws Exception { String data = storageFile.read(0,7); assertEquals("test001", data); data = storageFile.read(7,7); assertEquals("test002", data); } @Test public void testWrite() throws Exception { assertEquals(14,storageFile.length()); storageFile.write("test003", 14); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } @Test public void testAppend() throws Exception { assertEquals(14,storageFile.length()); storageFile.append("test003"); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } }
package net.simplestorage.storage.impl.indexed; + import org.junit.After; import org.junit.Before; import org.junit.Test; + import java.io.BufferedOutputStream; import java.io.File; + import java.io.FileOutputStream; - import java.io.FileNotFoundException; ? ^^^^^^^^^^^^ + import java.io.IOException; ? ^^ import java.net.URISyntaxException; - import static org.junit.Assert.*; ? ^ + import static org.junit.Assert.assertEquals; ? ^^^^^^^^^^^^ public class IndexedStorageFileTest { private IndexedStorageFile storageFile; + private File testFile; + @Before - public void setUp() throws URISyntaxException, FileNotFoundException { ? ^^^^^^^^^^^^ + public void setUp() throws URISyntaxException, IOException { ? ^^ - File testFile = new File(ExistingIndexedStorageIntegrationTest.class.getResource("testIndexFile.txt").toURI()); ? ----- + testFile = new File(ExistingIndexedStorageIntegrationTest.class.getResource("testIndexFile.txt").toURI()); + BufferedOutputStream stream = new BufferedOutputStream(new FileOutputStream(testFile)); + stream.write("test001test002".toString().getBytes("UTF-8")); + stream.close(); storageFile = new IndexedStorageFile(testFile.getAbsolutePath()); + } + + @After + public void tearDown() { + testFile.delete(); } @Test public void testRead() throws Exception { String data = storageFile.read(0,7); assertEquals("test001", data); data = storageFile.read(7,7); assertEquals("test002", data); } @Test public void testWrite() throws Exception { assertEquals(14,storageFile.length()); storageFile.write("test003", 14); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } @Test public void testAppend() throws Exception { assertEquals(14,storageFile.length()); storageFile.append("test003"); assertEquals(21,storageFile.length()); storageFile.clear(14,7); } - - @Test - public void testClear() throws Exception { - - } }
26
0.509804
17
9
f4242d8f5071510b2aa4a547552c0af1eb552caa
lib/expando/api_ai/system_entity_examples.rb
lib/expando/api_ai/system_entity_examples.rb
module Expando module ApiAi module SystemEntityExamples # Example system entity values to use when replacing/annotating entity values. # # @see https://dialogflow.com/docs/reference/system-entities VALUES = { 'sys.unit-currency' => [ '5 dollars', '25 pounds' ], 'sys.date-time' => [ 'Tomorrow', '5:30 pm', 'Today at 4 pm', 'Last morning', '1st to 3rd of January', 'January 1st at 3 pm' ], 'sys.date' => [ 'January 1', 'Tomorrow', 'January first' ], 'sys.date-period' => [ 'April', 'weekend', 'from 1 till 3 of May', 'in 2 days' ], 'sys.given-name' => [ 'Matthew', 'Jim', 'Robert', 'Lauren' ], 'sys.number-integer' => [ '1', '34', '5' ], 'sys.number' => [ 'ten', 'twenty', 'tenth', 'third' ], 'sys.time' => [ '1 pm', '20:30', 'half past four', 'in 2 minutes' ], 'sys.any' => [ 'anything', 'this is many words' ] } end end end
module Expando module ApiAi module SystemEntityExamples # Example system entity values to use when replacing/annotating entity values. # # @see https://dialogflow.com/docs/reference/system-entities VALUES = { 'sys.unit-currency' => [ '5 dollars', '25 pounds' ], 'sys.date-time' => [ 'Tomorrow', '5:30 pm', 'Today at 4 pm', 'Last morning', '1st to 3rd of January', 'January 1st at 3 pm' ], 'sys.date' => [ 'January 1', 'Tomorrow', 'January first' ], 'sys.date-period' => [ 'April', 'weekend', 'from 1 till 3 of May', 'in 2 days' ], 'sys.given-name' => [ 'Matthew', 'Jim', 'Robert', 'Lauren' ], 'sys.number-integer' => [ '1', '34', '5' ], 'sys.number' => [ 'ten', 'twenty', 'tenth', 'third' ], 'sys.time' => [ '1 pm', '20:30', 'half past four', 'in 2 minutes' ], 'sys.any' => [ 'anything', 'this is many words' ], 'sys.time-period' => [ 'morning', 'in the afternoon', 'evening', 'at night' ] } end end end
Add time-period to system entity examples
Add time-period to system entity examples
Ruby
mit
voxable-labs/expando,expando-lang/expando,expando-lang/expando,voxable-labs/expando
ruby
## Code Before: module Expando module ApiAi module SystemEntityExamples # Example system entity values to use when replacing/annotating entity values. # # @see https://dialogflow.com/docs/reference/system-entities VALUES = { 'sys.unit-currency' => [ '5 dollars', '25 pounds' ], 'sys.date-time' => [ 'Tomorrow', '5:30 pm', 'Today at 4 pm', 'Last morning', '1st to 3rd of January', 'January 1st at 3 pm' ], 'sys.date' => [ 'January 1', 'Tomorrow', 'January first' ], 'sys.date-period' => [ 'April', 'weekend', 'from 1 till 3 of May', 'in 2 days' ], 'sys.given-name' => [ 'Matthew', 'Jim', 'Robert', 'Lauren' ], 'sys.number-integer' => [ '1', '34', '5' ], 'sys.number' => [ 'ten', 'twenty', 'tenth', 'third' ], 'sys.time' => [ '1 pm', '20:30', 'half past four', 'in 2 minutes' ], 'sys.any' => [ 'anything', 'this is many words' ] } end end end ## Instruction: Add time-period to system entity examples ## Code After: module Expando module ApiAi module SystemEntityExamples # Example system entity values to use when replacing/annotating entity values. # # @see https://dialogflow.com/docs/reference/system-entities VALUES = { 'sys.unit-currency' => [ '5 dollars', '25 pounds' ], 'sys.date-time' => [ 'Tomorrow', '5:30 pm', 'Today at 4 pm', 'Last morning', '1st to 3rd of January', 'January 1st at 3 pm' ], 'sys.date' => [ 'January 1', 'Tomorrow', 'January first' ], 'sys.date-period' => [ 'April', 'weekend', 'from 1 till 3 of May', 'in 2 days' ], 'sys.given-name' => [ 'Matthew', 'Jim', 'Robert', 'Lauren' ], 'sys.number-integer' => [ '1', '34', '5' ], 'sys.number' => [ 'ten', 'twenty', 'tenth', 'third' ], 'sys.time' => [ '1 pm', '20:30', 'half past four', 'in 2 minutes' ], 'sys.any' => [ 'anything', 'this is many words' ], 'sys.time-period' => [ 'morning', 'in the afternoon', 'evening', 'at night' ] } end end end
module Expando module ApiAi module SystemEntityExamples # Example system entity values to use when replacing/annotating entity values. # # @see https://dialogflow.com/docs/reference/system-entities VALUES = { 'sys.unit-currency' => [ '5 dollars', '25 pounds' ], 'sys.date-time' => [ 'Tomorrow', '5:30 pm', 'Today at 4 pm', 'Last morning', '1st to 3rd of January', 'January 1st at 3 pm' ], 'sys.date' => [ 'January 1', 'Tomorrow', 'January first' ], 'sys.date-period' => [ 'April', 'weekend', 'from 1 till 3 of May', 'in 2 days' ], 'sys.given-name' => [ 'Matthew', 'Jim', 'Robert', 'Lauren' ], 'sys.number-integer' => [ '1', '34', '5' ], 'sys.number' => [ 'ten', 'twenty', 'tenth', 'third' ], 'sys.time' => [ '1 pm', '20:30', 'half past four', 'in 2 minutes' ], 'sys.any' => [ 'anything', 'this is many words' + ], + 'sys.time-period' => [ + 'morning', + 'in the afternoon', + 'evening', + 'at night' ] } end end end
6
0.098361
6
0
cf75f884bacee1aee38e98c5077f358507460ba2
docker-compose-notification-service.yml
docker-compose-notification-service.yml
version: '3' services: notification-service: image: schulcloud/node-notification-service:latest # build: # context: . # dockerfile: Dockerfile container_name: notification-service deploy: replicas: 1 restart_policy: condition: any environment: - MONGO_HOST=notification-mongo - REDIS_HOST=notification-redis volumes: - notification-logs:/usr/src/app/logs ports: - 3031:3031 depends_on: - notification-mongo - notification-redis restart: always notification-mongo: image: mongo:3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-mongo:/data/db restart: always notification-redis: image: redis:5.0.3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-redis:/data restart: always notification-redis-commander: container_name: redis-commander hostname: redis-commander image: rediscommander/redis-commander:latest deploy: replicas: 1 restart_policy: condition: any environment: - REDIS_HOSTS=notification-redis:6379 ports: - "8081:8081" depends_on: - notification-redis restart: always volumes: notification-mongo: notification-redis: notification-logs:
version: '3' services: notification-service: image: schulcloud/node-notification-service:latest # build: # context: . # dockerfile: Dockerfile container_name: notification-service deploy: replicas: 1 restart_policy: condition: any environment: - MONGO_HOST=notification-mongo/notification-service - REDIS_HOST=notification-redis volumes: - notification-logs:/usr/src/app/logs ports: - 3031:3031 depends_on: - notification-mongo - notification-redis restart: always notification-mongo: image: mongo:3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-mongo:/data/db restart: always notification-redis: image: redis:5.0.3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-redis:/data restart: always notification-redis-commander: container_name: redis-commander hostname: redis-commander image: rediscommander/redis-commander:latest deploy: replicas: 1 restart_policy: condition: any environment: - REDIS_HOSTS=notification-redis:6379 ports: - "8081:8081" depends_on: - notification-redis restart: always volumes: notification-mongo: notification-redis: notification-logs:
Change of mongo host string
Change of mongo host string
YAML
mit
schulcloud/node-notification-service,schulcloud/node-notification-service,schulcloud/node-notification-service
yaml
## Code Before: version: '3' services: notification-service: image: schulcloud/node-notification-service:latest # build: # context: . # dockerfile: Dockerfile container_name: notification-service deploy: replicas: 1 restart_policy: condition: any environment: - MONGO_HOST=notification-mongo - REDIS_HOST=notification-redis volumes: - notification-logs:/usr/src/app/logs ports: - 3031:3031 depends_on: - notification-mongo - notification-redis restart: always notification-mongo: image: mongo:3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-mongo:/data/db restart: always notification-redis: image: redis:5.0.3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-redis:/data restart: always notification-redis-commander: container_name: redis-commander hostname: redis-commander image: rediscommander/redis-commander:latest deploy: replicas: 1 restart_policy: condition: any environment: - REDIS_HOSTS=notification-redis:6379 ports: - "8081:8081" depends_on: - notification-redis restart: always volumes: notification-mongo: notification-redis: notification-logs: ## Instruction: Change of mongo host string ## Code After: version: '3' services: notification-service: image: schulcloud/node-notification-service:latest # build: # context: . # dockerfile: Dockerfile container_name: notification-service deploy: replicas: 1 restart_policy: condition: any environment: - MONGO_HOST=notification-mongo/notification-service - REDIS_HOST=notification-redis volumes: - notification-logs:/usr/src/app/logs ports: - 3031:3031 depends_on: - notification-mongo - notification-redis restart: always notification-mongo: image: mongo:3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-mongo:/data/db restart: always notification-redis: image: redis:5.0.3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-redis:/data restart: always notification-redis-commander: container_name: redis-commander hostname: redis-commander image: rediscommander/redis-commander:latest deploy: replicas: 1 restart_policy: condition: any environment: - REDIS_HOSTS=notification-redis:6379 ports: - "8081:8081" depends_on: - notification-redis restart: always volumes: notification-mongo: notification-redis: notification-logs:
version: '3' services: notification-service: image: schulcloud/node-notification-service:latest # build: # context: . # dockerfile: Dockerfile container_name: notification-service deploy: replicas: 1 restart_policy: condition: any environment: - - MONGO_HOST=notification-mongo + - MONGO_HOST=notification-mongo/notification-service ? +++++++++++++++++++++ - REDIS_HOST=notification-redis volumes: - notification-logs:/usr/src/app/logs ports: - 3031:3031 depends_on: - notification-mongo - notification-redis restart: always notification-mongo: image: mongo:3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-mongo:/data/db restart: always notification-redis: image: redis:5.0.3 deploy: replicas: 1 restart_policy: condition: any volumes: - notification-redis:/data restart: always notification-redis-commander: container_name: redis-commander hostname: redis-commander image: rediscommander/redis-commander:latest deploy: replicas: 1 restart_policy: condition: any environment: - REDIS_HOSTS=notification-redis:6379 ports: - "8081:8081" depends_on: - notification-redis restart: always volumes: notification-mongo: notification-redis: notification-logs:
2
0.03125
1
1
1a7328f31090cf6383106188a83bbcd138135d65
recipes-core/packagegroups/packagegroup-ni-safemode.bb
recipes-core/packagegroups/packagegroup-ni-safemode.bb
SUMMARY = "Safemode specific packages for NI Linux Realtime distribution" LICENSE = "MIT" PACKAGE_ARCH = "${MACHINE_ARCH}" inherit packagegroup RDEPENDS_${PN} = " \ initscripts-nilrt-safemode \ "
SUMMARY = "Safemode specific packages for NI Linux Realtime distribution" LICENSE = "MIT" PACKAGE_ARCH = "${MACHINE_ARCH}" inherit packagegroup RDEPENDS_${PN} = " \ initscripts-nilrt-safemode \ e2fsprogs \ e2fsprogs-e2fsck \ e2fsprogs-mke2fs \ e2fsprogs-tune2fs \ "
Add e2fsprogs utilities to support formatting ext4 filesystem
pkggrp-ni-safemode: Add e2fsprogs utilities to support formatting ext4 filesystem The Artemis project introduces the use of SD card as the main disk in a zynq target. In order to allow NIMAX to format the disk (which is in ext4 filesystem format), the e2fsprogs utilities are required. Signed-off-by: wkoe <bb883267201df8d0a4dea54699e313a899b20526@ni.com>
BitBake
mit
ni/meta-nilrt,ni/meta-nilrt,ni/meta-nilrt,ni/meta-nilrt,ni/meta-nilrt
bitbake
## Code Before: SUMMARY = "Safemode specific packages for NI Linux Realtime distribution" LICENSE = "MIT" PACKAGE_ARCH = "${MACHINE_ARCH}" inherit packagegroup RDEPENDS_${PN} = " \ initscripts-nilrt-safemode \ " ## Instruction: pkggrp-ni-safemode: Add e2fsprogs utilities to support formatting ext4 filesystem The Artemis project introduces the use of SD card as the main disk in a zynq target. In order to allow NIMAX to format the disk (which is in ext4 filesystem format), the e2fsprogs utilities are required. Signed-off-by: wkoe <bb883267201df8d0a4dea54699e313a899b20526@ni.com> ## Code After: SUMMARY = "Safemode specific packages for NI Linux Realtime distribution" LICENSE = "MIT" PACKAGE_ARCH = "${MACHINE_ARCH}" inherit packagegroup RDEPENDS_${PN} = " \ initscripts-nilrt-safemode \ e2fsprogs \ e2fsprogs-e2fsck \ e2fsprogs-mke2fs \ e2fsprogs-tune2fs \ "
SUMMARY = "Safemode specific packages for NI Linux Realtime distribution" LICENSE = "MIT" PACKAGE_ARCH = "${MACHINE_ARCH}" inherit packagegroup RDEPENDS_${PN} = " \ initscripts-nilrt-safemode \ + e2fsprogs \ + e2fsprogs-e2fsck \ + e2fsprogs-mke2fs \ + e2fsprogs-tune2fs \ "
4
0.363636
4
0
9bade5e7b36fa2f244f553945e80638c4aae7464
cookbooks/planet/templates/default/replication.auth.erb
cookbooks/planet/templates/default/replication.auth.erb
host=db # The database instance database=openstreetmap # The database user user=planetdiff # The database password password=<%= @password =%> # The database type dbType=postgresql
host=katla.bm.openstreetmap.org # The database instance database=openstreetmap # The database user user=planetdiff # The database password password=<%= @password =%> # The database type dbType=postgresql
Update database address for replication
Update database address for replication
HTML+ERB
apache-2.0
zerebubuth/openstreetmap-chef,openstreetmap/chef,openstreetmap/chef,Firefishy/chef,tomhughes/openstreetmap-chef,Firefishy/chef,Firefishy/chef,Firefishy/chef,Firefishy/chef,openstreetmap/chef,openstreetmap/chef,gravitystorm/chef,zerebubuth/openstreetmap-chef,zerebubuth/openstreetmap-chef,tomhughes/openstreetmap-chef,openstreetmap/chef,Firefishy/chef,gravitystorm/chef,tomhughes/openstreetmap-chef,gravitystorm/chef,gravitystorm/chef,gravitystorm/chef,openstreetmap/chef,gravitystorm/chef,openstreetmap/chef,zerebubuth/openstreetmap-chef,zerebubuth/openstreetmap-chef,tomhughes/openstreetmap-chef,gravitystorm/chef,tomhughes/openstreetmap-chef,zerebubuth/openstreetmap-chef,openstreetmap/chef,zerebubuth/openstreetmap-chef,tomhughes/openstreetmap-chef,tomhughes/openstreetmap-chef,Firefishy/chef,tomhughes/openstreetmap-chef,gravitystorm/chef,zerebubuth/openstreetmap-chef,Firefishy/chef
html+erb
## Code Before: host=db # The database instance database=openstreetmap # The database user user=planetdiff # The database password password=<%= @password =%> # The database type dbType=postgresql ## Instruction: Update database address for replication ## Code After: host=katla.bm.openstreetmap.org # The database instance database=openstreetmap # The database user user=planetdiff # The database password password=<%= @password =%> # The database type dbType=postgresql
- host=db + host=katla.bm.openstreetmap.org # The database instance database=openstreetmap # The database user user=planetdiff # The database password password=<%= @password =%> # The database type dbType=postgresql
2
0.153846
1
1