| commit | old_file | new_file | old_contents | new_contents | subject | message | lang | license | repos |
|---|---|---|---|---|---|---|---|---|---|
23738564ef8ed4e45eed13301ba66a6351128d1e | recipes/fastx_toolkit/meta.yaml | recipes/fastx_toolkit/meta.yaml | package:
name: fastx_toolkit
version: 0.0.14
source:
url: https://github.com/agordon/fastx_toolkit/releases/download/0.0.14/fastx_toolkit-0.0.14.tar.bz2
fn: fastx_toolkit-0.0.14.tar.bz2
build:
preserve_egg_dir: True
number: 2
skip: False
requirements:
build:
- gcc [not osx]
- llvm [osx]
- cython
- nose
- libgtextutils
run:
- libgcc [not osx]
- cython
- nose
- libgtextutils
- gnuplot
#- pkg-config
test:
commands:
#- fastx_quality_stats -h # This fails for some unknown reason, even though it prints the output successfully...
about:
home: https://github.com/agordon/fastx_toolkit
license: AGPL
| package:
name: fastx_toolkit
version: 0.0.14
source:
url: https://github.com/agordon/fastx_toolkit/releases/download/0.0.14/fastx_toolkit-0.0.14.tar.bz2
fn: fastx_toolkit-0.0.14.tar.bz2
build:
preserve_egg_dir: True
number: 3
skip: False
requirements:
build:
- gcc [not osx]
- llvm [osx]
- cython
- nose
- libgtextutils
run:
- libgcc [not osx]
- cython
- nose
- libgtextutils
- gnuplot
- libgd >=2.1.1post 1
- perl-threaded
- perl-perlio-gzip
- perl-gd >=2.5.6 2
- perl-gdgraph-histogram
#- pkg-config
test:
commands:
- SCRIPT=$(which fasta_clipping_histogram.pl) && perl5.22.0 "$SCRIPT"
- 'fastx_trimmer -h | grep "Part of FASTX Toolkit 0.0.14 by A. Gordon (assafgordon@gmail.com)"'
about:
home: https://github.com/agordon/fastx_toolkit
license: AGPL
| Add perl dependencies to fastx_toolkit | Add perl dependencies to fastx_toolkit
runtime dependencies. This is required for fasta_clipping_histogram.pl.
Also fixes tests (all fastx_* -h commands exit with exit-code 1) by
grepping for `Part of FASTX Toolkit 0.0.14 by A. Gordon
(assafgordon@gmail.com)`.
YAML | mit | dmaticzka/bioconda-recipes,zwanli/bioconda-recipes,CGATOxford/bioconda-recipes,matthdsm/bioconda-recipes,abims-sbr/bioconda-recipes,bioconda/bioconda-recipes,saketkc/bioconda-recipes,daler/bioconda-recipes,ostrokach/bioconda-recipes,bioconda/recipes,gvlproject/bioconda-recipes,bow/bioconda-recipes,phac-nml/bioconda-recipes,xguse/bioconda-recipes,zachcp/bioconda-recipes,HassanAmr/bioconda-recipes,peterjc/bioconda-recipes,ivirshup/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,saketkc/bioconda-recipes,pinguinkiste/bioconda-recipes,pinguinkiste/bioconda-recipes,colinbrislawn/bioconda-recipes,lpantano/recipes,hardingnj/bioconda-recipes,yesimon/bioconda-recipes,bioconda/bioconda-recipes,mdehollander/bioconda-recipes,omicsnut/bioconda-recipes,dkoppstein/recipes,shenwei356/bioconda-recipes,chapmanb/bioconda-recipes,gvlproject/bioconda-recipes,jfallmann/bioconda-recipes,oena/bioconda-recipes,abims-sbr/bioconda-recipes,shenwei356/bioconda-recipes,ivirshup/bioconda-recipes,rob-p/bioconda-recipes,chapmanb/bioconda-recipes,hardingnj/bioconda-recipes,HassanAmr/bioconda-recipes,keuv-grvl/bioconda-recipes,hardingnj/bioconda-recipes,bioconda/recipes,keuv-grvl/bioconda-recipes,lpantano/recipes,oena/bioconda-recipes,zachcp/bioconda-recipes,rob-p/bioconda-recipes,yesimon/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,CGATOxford/bioconda-recipes,pinguinkiste/bioconda-recipes,daler/bioconda-recipes,phac-nml/bioconda-recipes,martin-mann/bioconda-recipes,ostrokach/bioconda-recipes,jfallmann/bioconda-recipes,JenCabral/bioconda-recipes,gvlproject/bioconda-recipes,xguse/bioconda-recipes,roryk/recipes,guowei-he/bioconda-recipes,guowei-he/bioconda-recipes,shenwei356/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,npavlovikj/bioconda-recipes,saketkc/bioconda-recipes,gvlproject/bioconda-recipes,abims-sbr/bioconda-recipes,bebatut/bioconda-recipes,phac-nml/bioconda-recipes,zachcp/bioconda-recipes,colinbrislawn/bioconda-recipes,mcornwell1957/bioconda-recipes,rvalieris/bioconda-recipes,bow/bioconda-recipes,cokelaer/bioconda-recipes,peterjc/bioconda-recipes,yesimon/bioconda-recipes,abims-sbr/bioconda-recipes,ostrokach/bioconda-recipes,ThomasWollmann/bioconda-recipes,gvlproject/bioconda-recipes,rob-p/bioconda-recipes,peterjc/bioconda-recipes,rob-p/bioconda-recipes,yesimon/bioconda-recipes,ThomasWollmann/bioconda-recipes,ThomasWollmann/bioconda-recipes,hardingnj/bioconda-recipes,mdehollander/bioconda-recipes,ostrokach/bioconda-recipes,jasper1918/bioconda-recipes,daler/bioconda-recipes,zwanli/bioconda-recipes,Luobiny/bioconda-recipes,matthdsm/bioconda-recipes,martin-mann/bioconda-recipes,guowei-he/bioconda-recipes,gregvonkuster/bioconda-recipes,zwanli/bioconda-recipes,bow/bioconda-recipes,ivirshup/bioconda-recipes,omicsnut/bioconda-recipes,joachimwolff/bioconda-recipes,npavlovikj/bioconda-recipes,rvalieris/bioconda-recipes,HassanAmr/bioconda-recipes,acaprez/recipes,pinguinkiste/bioconda-recipes,joachimwolff/bioconda-recipes,ostrokach/bioconda-recipes,dmaticzka/bioconda-recipes,blankenberg/bioconda-recipes,phac-nml/bioconda-recipes,mcornwell1957/bioconda-recipes,martin-mann/bioconda-recipes,jasper1918/bioconda-recipes,npavlovikj/bioconda-recipes,roryk/recipes,peterjc/bioconda-recipes,npavlovikj/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,daler/bioconda-recipes,JenCabral/bioconda-recipes,chapmanb/bioconda-recipes,oena/bioconda-recipes,mdehollander/bioconda-recipes,bioconda/bioconda-recipes,daler/bioconda-recipes,colinbrislawn/bioconda-recipes,bow/bioconda-recipes,CGATOxford/bioconda-recipes,shenwei356/bioconda-recipes,jasper1918/bioconda-recipes,Luobiny/bioconda-recipes,zwanli/bioconda-recipes,guowei-he/bioconda-recipes,JenCabral/bioconda-recipes,ivirshup/bioconda-recipes,roryk/recipes,matthdsm/bioconda-recipes,cokelaer/bioconda-recipes,colinbrislawn/bioconda-recipes,keuv-grvl/bioconda-recipes,oena/bioconda-recipes,bioconda/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,dmaticzka/bioconda-recipes,gregvonkuster/bioconda-recipes,omicsnut/bioconda-recipes,blankenberg/bioconda-recipes,pinguinkiste/bioconda-recipes,mcornwell1957/bioconda-recipes,abims-sbr/bioconda-recipes,peterjc/bioconda-recipes,CGATOxford/bioconda-recipes,peterjc/bioconda-recipes,chapmanb/bioconda-recipes,JenCabral/bioconda-recipes,HassanAmr/bioconda-recipes,blankenberg/bioconda-recipes,jasper1918/bioconda-recipes,colinbrislawn/bioconda-recipes,gregvonkuster/bioconda-recipes,dkoppstein/recipes,acaprez/recipes,dmaticzka/bioconda-recipes,zwanli/bioconda-recipes,rvalieris/bioconda-recipes,jfallmann/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,matthdsm/bioconda-recipes,dmaticzka/bioconda-recipes,lpantano/recipes,JenCabral/bioconda-recipes,pinguinkiste/bioconda-recipes,bebatut/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,mdehollander/bioconda-recipes,rvalieris/bioconda-recipes,gregvonkuster/bioconda-recipes,dkoppstein/recipes,bebatut/bioconda-recipes,Luobiny/bioconda-recipes,oena/bioconda-recipes,saketkc/bioconda-recipes,xguse/bioconda-recipes,matthdsm/bioconda-recipes,joachimwolff/bioconda-recipes,joachimwolff/bioconda-recipes,saketkc/bioconda-recipes,martin-mann/bioconda-recipes,phac-nml/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,zwanli/bioconda-recipes,chapmanb/bioconda-recipes,dmaticzka/bioconda-recipes,HassanAmr/bioconda-recipes,ThomasWollmann/bioconda-recipes,omicsnut/bioconda-recipes,Luobiny/bioconda-recipes,cokelaer/bioconda-recipes,CGATOxford/bioconda-recipes,abims-sbr/bioconda-recipes,mcornwell1957/bioconda-recipes,mcornwell1957/bioconda-recipes,colinbrislawn/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,acaprez/recipes,joachimwolff/bioconda-recipes,JenCabral/bioconda-recipes,rvalieris/bioconda-recipes,keuv-grvl/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,keuv-grvl/bioconda-recipes,ThomasWollmann/bioconda-recipes,mdehollander/bioconda-recipes,rvalieris/bioconda-recipes,martin-mann/bioconda-recipes,omicsnut/bioconda-recipes,daler/bioconda-recipes,keuv-grvl/bioconda-recipes,xguse/bioconda-recipes,lpantano/recipes,xguse/bioconda-recipes,mdehollander/bioconda-recipes,acaprez/recipes,bebatut/bioconda-recipes,zachcp/bioconda-recipes,jfallmann/bioconda-recipes,cokelaer/bioconda-recipes,bow/bioconda-recipes,gvlproject/bioconda-recipes,CGATOxford/bioconda-recipes,blankenberg/bioconda-recipes,HassanAmr/bioconda-recipes,bow/bioconda-recipes,ivirshup/bioconda-recipes,bioconda/recipes,matthdsm/bioconda-recipes,ivirshup/bioconda-recipes,jasper1918/bioconda-recipes,joachimwolff/bioconda-recipes,guowei-he/bioconda-recipes,hardingnj/bioconda-recipes,ThomasWollmann/bioconda-recipes,saketkc/bioconda-recipes,ostrokach/bioconda-recipes |
950b013c0acb1ccfe04c5ac0af6f5ac0978caa51 | azure-pipelines.yml | azure-pipelines.yml | # Node.js
# Build a general Node.js project with npm.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/javascript
jobs:
- job: rolling_VS2017_build
displayName: 'Build Job for JS (VS2017)'
pool:
name: Hosted VS2017
steps:
- template: build/sdl-tasks.yml
| # Node.js
# Build a general Node.js project with npm.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/javascript
jobs:
- job: windows-2019
displayName: 'Build Job for JS (VS2019)'
pool:
name: Hosted VS2019
steps:
- template: build/sdl-tasks.yml
| Use VS 2019 for Azure Pipelines | Use VS 2019 for Azure Pipelines
| YAML | mit | AzureAD/microsoft-authentication-library-for-js,AzureAD/microsoft-authentication-library-for-js,AzureAD/microsoft-authentication-library-for-js,AzureAD/microsoft-authentication-library-for-js,AzureAD/microsoft-authentication-library-for-js,AzureAD/microsoft-authentication-library-for-js,AzureAD/microsoft-authentication-library-for-js,AzureAD/microsoft-authentication-library-for-js |
d5b7710c7233b99044c2028e2094a7b5ce970d69 | azure-pipelines.yml | azure-pipelines.yml | pool:
vmImage: 'windows-latest'
steps:
- task: CmdLine@2
inputs:
script: 'build'
env:
SYSTEM_ACCESSTOKEN: $(System.AccessToken) | pool:
vmImage: windows-latest
steps:
- script: build
env:
SYSTEM_ACCESSTOKEN: $(System.AccessToken) | Apply 'script' shorthand to Azure Pipelines definition. | Apply 'script' shorthand to Azure Pipelines definition. | YAML | mit | fixie/fixie |
b02a8a5689e6f6d1f1a920484bd11fd845ef3eaf | azure-pipelines.yml | azure-pipelines.yml | # ASP.NET Core
# Build and test ASP.NET Core web applications targeting .NET Core.
# Add steps that run tests, create a NuGet package, deploy, and more:
# https://docs.microsoft.com/vsts/pipelines/languages/dotnet-core
variables:
BuildConfiguration: 'Release'
steps:
- task: DotNetCoreInstaller@0
displayName: 'Use .NET Core sdk 3.0.100'
inputs:
version: 3.0.100
- task: DotNetCoreCLI@2
displayName: Test
inputs:
command: test
projects: '*[Tt]ests/*.csproj'
arguments: '-c $(BuildConfiguration)'
- task: DotNetCoreCLI@2
displayName: 'dotnet build'
inputs:
projects: VkNet/VkNet.csproj
arguments: '-o $(Build.ArtifactStagingDirectory) -c $(BuildConfiguration) --version-suffix $(Build.BuildId)'
- script: 'dotnet nuget push *.nupkg -k $(Parameters.MyGetKey) -s $(Parameters.MyGetFeed)'
workingDirectory: '$(Build.ArtifactStagingDirectory)'
displayName: 'Command Line Script'
- task: EvgenyChernyi.telegram-notification-extension.sample-telegram-task.SendTelegramNotification@0
displayName: 'Send Telegram Notification'
inputs:
botToken: '$(Parameters.botToken)'
chats: '$(Parameters.chats)'
message: '<code>Reason: $(Build.Reason)
Build №: $(Build.BuildNumber)
VkNet был успешно собран!</code>'
buildQueuedBy: true
| # ASP.NET Core
# Build and test ASP.NET Core web applications targeting .NET Core.
# Add steps that run tests, create a NuGet package, deploy, and more:
# https://docs.microsoft.com/vsts/pipelines/languages/dotnet-core
variables:
BuildConfiguration: 'Release'
steps:
- task: DotNetCoreInstaller@0
displayName: 'Use .NET Core sdk 2.1.402'
inputs:
version: 2.1.402
- task: DotNetCoreCLI@2
displayName: Test
inputs:
command: test
projects: '*[Tt]ests/*.csproj'
arguments: '-c $(BuildConfiguration) -f netcoreapp2.0'
- task: DotNetCoreCLI@2
displayName: 'dotnet build'
inputs:
projects: VkNet/VkNet.csproj
arguments: '-o $(Build.ArtifactStagingDirectory) -c $(BuildConfiguration) --version-suffix $(Build.BuildId)'
- script: 'dotnet nuget push *.nupkg -k $(Parameters.MyGetKey) -s $(Parameters.MyGetFeed)'
workingDirectory: '$(Build.ArtifactStagingDirectory)'
displayName: 'Command Line Script'
- task: EvgenyChernyi.telegram-notification-extension.sample-telegram-task.SendTelegramNotification@0
displayName: 'Send Telegram Notification'
inputs:
botToken: '$(Parameters.botToken)'
chats: '$(Parameters.chats)'
message: '<code>Reason: $(Build.Reason)
Build №: $(Build.BuildNumber)
VkNet был успешно собран!</code>'
buildQueuedBy: true
| Use .NET Core sdk 2.1.402 | Use .NET Core sdk 2.1.402
| YAML | mit | vknet/vk,vknet/vk |
54803a26ac6b3ab7d2e61e3425017c4fae3385c8 | .circleci/config.yml | .circleci/config.yml | version: 2
references:
test-steps: &test-steps
steps:
- checkout
- restore_cache:
name: Restore module caches
keys:
- go-modules-v1-{{ .Branch }}-{{ checksum "go.sum" }}
- go-modules-v1-{{ .Branch }}-
- go-modules-v1-
- run:
name: Install lint tools
command: make -s install-tools
- run:
name: Check Gofmt diff
command: make -s fmt-diff
- run:
name: Run GolangCI-Lint
command: make -s ci-lint
- run:
name: Run Golint
command: make -s lint
- run:
name: Run Vet
command: make -s vet
- run:
name: Test
command: make -s test
- save_cache:
name: Save module caches
key: go-modules-v1-{{ .Branch }}-{{ checksum "go.sum" }}
paths:
- /go/pkg/mod
jobs:
test-1.13:
docker:
- image: circleci/golang:1.13
<<: *test-steps
test-1.14:
docker:
- image: circleci/golang:1.14
<<: *test-steps
workflows:
version: 2
test:
jobs:
- test-1.13
- test-1.14
| version: 2
references:
test-steps: &test-steps
steps:
- checkout
- restore_cache:
name: Restore module caches
keys:
- go-modules-v1-{{ .Branch }}-{{ checksum "go.sum" }}
- go-modules-v1-{{ .Branch }}-
- go-modules-v1-
- run:
name: Install lint tools
command: make -s install-tools
- run:
name: Check Gofmt diff
command: make -s fmt-diff
- run:
name: Run GolangCI-Lint
command: make -s ci-lint
- run:
name: Run Golint
command: make -s lint
- run:
name: Run Vet
command: make -s vet
- run:
name: Test
command: make -s test
- save_cache:
name: Save module caches
key: go-modules-v1-{{ .Branch }}-{{ checksum "go.sum" }}
paths:
- /go/pkg/mod
jobs:
test-1.14:
docker:
- image: circleci/golang:1.14
<<: *test-steps
test-1.15:
docker:
- image: circleci/golang:1.15
<<: *test-steps
workflows:
version: 2
test:
jobs:
- test-1.14
- test-1.15
| Update go version in CircleCI | Update go version in CircleCI
| YAML | mit | 178inaba/third_test |
e24094bd35196d46eaeea873c4a35382f610bca0 | .circleci/config.yml | .circleci/config.yml | version: 2
jobs:
build:
docker:
- image: mikeastock/rust:0.3
environment:
- DATABASE_URL=postgres://localhost/mob_test
- image: circleci/postgres:9.6
environment:
- POSTGRES_USER=root
- POSTGRES_DB=mob_test
working_directory: /mob
steps:
- checkout
- restore_cache:
key: warehouse-v1-{{ checksum "Cargo.lock" }}
- run:
name: Install Diesel CLI
command: |
if ! type diesel > /dev/null; then
cargo install diesel_cli --no-default-features --features postgres
fi
- run:
name: Setup Database
command: diesel setup
- run:
name: Build
command: cargo build --all
- run:
name: Run Tests
command: RUST_TEST_THREADS=1 cargo test -all
- save_cache:
key: mob-v1-{{ checksum "Cargo.lock" }}
paths:
- ~/.cargo
| version: 2
jobs:
build:
docker:
- image: mikeastock/rust:nightly
environment:
- DATABASE_URL=postgres://localhost/mob_test
- image: circleci/postgres:9.6
environment:
- POSTGRES_USER=root
- POSTGRES_DB=mob_test
working_directory: /mob
steps:
- checkout
- restore_cache:
key: warehouse-v1-{{ checksum "Cargo.lock" }}
- run:
name: Install Diesel CLI
command: |
if ! type diesel > /dev/null; then
cargo install diesel_cli --no-default-features --features postgres
fi
- run:
name: Setup Database
command: diesel setup
- run:
name: Build
command: cargo build --all
- run:
name: Run Tests
command: RUST_TEST_THREADS=1 cargo test -all
- save_cache:
key: mob-v1-{{ checksum "Cargo.lock" }}
paths:
- ~/.cargo
| Use Rust nightly on Circle | chore: Use Rust nightly on Circle
| YAML | mit | mob-rs/mob |
7d9f5322c7ba7089e77f5f278738fccc399021aa | .circleci/config.yml | .circleci/config.yml | version: 2.1
orbs:
silta: silta/silta@0.1
workflows:
version: 2
commit:
jobs:
- silta/drupal-validate:
name: validate
drupal-root: drupal
post-validation:
- run: echo "You can add additional validation here!"
- silta/drupal-build-deploy: &build-deploy
name: build-deploy
drupal-root: drupal
codebase-build:
- silta/drupal-composer-install
- silta/npm-install-build:
path: . # Adjust to the location of your package.json
context: global_nonprod
filters:
branches:
ignore: production
- silta/drupal-build-deploy:
# Extend the build-deploy configuration for the production environment.
<<: *build-deploy
name: build-deploy-prod
silta_config: silta/silta.yml,silta/silta-prod.yml
context: global_nonprod
filters:
branches:
only: production
| version: 2.1
orbs:
silta: silta/silta@0.1
workflows:
version: 2
commit:
jobs:
- silta/drupal-validate:
name: validate
drupal-root: drupal
post-validation:
- run: echo "You can add additional validation here!"
- silta/drupal-build-deploy: &build-deploy
name: build-deploy
drupal-root: drupal
codebase-build:
- silta/drupal-composer-install
- silta/npm-install-build:
path: . # Adjust to the location of your package.json
context: silta_dev
filters:
branches:
ignore: production
- silta/drupal-build-deploy:
# Extend the build-deploy configuration for the production environment.
<<: *build-deploy
name: build-deploy-prod
silta_config: silta/silta.yml,silta/silta-prod.yml
context: silta_finland
filters:
branches:
only: production
| Use the new dev cluster. | Use the new dev cluster. | YAML | mit | wunderkraut/WunderTools,wunderkraut/WunderTools,wunderkraut/WunderTools |
021a4f777f679069e75764dcf285f542459eecdc | .circleci/config.yml | .circleci/config.yml | version: 2
jobs:
build:
docker:
- image: circleci/ruby:2.2.0-node
steps:
- checkout
# Restore bundle cache
- restore_cache:
key: gems-{{ checksum "Gemfile.lock" }}
# copy env
- run: cp .env.example .env
# Bundle install dependencies
- run: bundle install --path /tmp/vendor/bundle
# Store bundle cache
- save_cache:
key: gems-{{ checksum "Gemfile.lock" }}
paths:
- /tmp/vendor/bundle
# run rubocop
- type: shell
command: |
bundle exec rubocop
# Run rspec in parallel
- type: shell
command: |
bundle exec rspec --profile 10 \
--format RspecJunitFormatter \
--out test_results/rspec.xml \
--format documentation \
--order rand \
$(circleci tests glob "spec/**/*_spec.rb" | circleci tests split --split-by=timings)
# Save test results for timing analysis
- store_test_results:
path: test_results
| version: 2
jobs:
build:
docker:
- image: circleci/ruby:2.3.3-node
steps:
- checkout
# Restore bundle cache
- restore_cache:
key: gems-{{ checksum "Gemfile.lock" }}
# copy env
- run: cp .env.example .env
# Bundle install dependencies
- run: bundle install --path /tmp/vendor/bundle
# Store bundle cache
- save_cache:
key: gems-{{ checksum "Gemfile.lock" }}
paths:
- /tmp/vendor/bundle
# run rubocop
- type: shell
command: |
bundle exec rubocop
# Run rspec in parallel
- type: shell
command: |
bundle exec rspec --profile 10 \
--format RspecJunitFormatter \
--out test_results/rspec.xml \
--format documentation \
--order rand \
$(circleci tests glob "spec/**/*_spec.rb" | circleci tests split --split-by=timings)
# Save test results for timing analysis
- store_test_results:
path: test_results
| Revert "scale back ruby version" | Revert "scale back ruby version"
This reverts commit 23f55afab24be22471a6ebae26028d4bf4469379.
| YAML | mit | watermarkchurch/wcc-api |
95f1a51e1d0f0aa97c167a8ed19618d5923e4316 | .circleci/config.yml | .circleci/config.yml | version: 2.1
workflows:
workflow:
jobs:
- test:
matrix:
parameters:
python_version: ["2.7", "3.4", "3.5", "3.6", "3.7", "3.8", "3.9", "3.10"]
- test_pypy:
matrix:
parameters:
python_version: ["2.7", "3.7"]
- lint-rst
jobs:
test:
parameters:
python_version:
type: string
steps:
- checkout
- run:
name: Test
command: python setup.py test
docker:
- image: circleci/python:<<parameters.python_version>>
test_pypy:
parameters:
python_version:
type: string
steps:
- checkout
- run:
name: Test
command: pypy setup.py test
docker:
- image: pypy:<<parameters.python_version>>
lint-rst:
working_directory: ~/code
steps:
- checkout
- run:
name: Install lint tools
command: |
python3 -m venv venv
. venv/bin/activate
pip install Pygments restructuredtext-lint
- run:
name: Lint
command: |
. venv/bin/activate
rst-lint --encoding=utf-8 README.rst
docker:
- image: circleci/python:3.10
| version: 2.1
workflows:
workflow:
jobs:
- test:
matrix:
parameters:
python_version: ["2.7", "3.4", "3.5", "3.6", "3.7", "3.8", "3.9", "3.10"]
- test_pypy:
matrix:
parameters:
python_version: ["2.7", "3.7", "3.8"]
- lint-rst
jobs:
test:
parameters:
python_version:
type: string
steps:
- checkout
- run:
name: Test
command: python setup.py test
docker:
- image: circleci/python:<<parameters.python_version>>
test_pypy:
parameters:
python_version:
type: string
steps:
- checkout
- run:
name: Test
command: pypy setup.py test
docker:
- image: pypy:<<parameters.python_version>>
lint-rst:
working_directory: ~/code
steps:
- checkout
- run:
name: Install lint tools
command: |
python3 -m venv venv
. venv/bin/activate
pip install Pygments restructuredtext-lint
- run:
name: Lint
command: |
. venv/bin/activate
rst-lint --encoding=utf-8 README.rst
docker:
- image: circleci/python:3.10
| Add PyPy 3.8 to the CI tests | Add PyPy 3.8 to the CI tests
| YAML | mit | elasticsales/ciso8601,closeio/ciso8601,closeio/ciso8601,elasticsales/ciso8601,closeio/ciso8601 |
dc14fc35ffcddfcf1f026593678d2d3588bd508a | .circleci/config.yml | .circleci/config.yml | version: 2
jobs:
build:
docker:
- image: cahirwpz/demoscene:latest
working_directory: ~/demoscene
steps:
- checkout
- run:
name: Git LFS (install Git Large File Storage)
command: |
apt-get install -y --no-install-recommends openssh-client
mkdir -p ~/.ssh
ssh-keyscan -H github.com >> ~/.ssh/known_hosts
ssh git@github.com git-lfs-authenticate "${CIRCLE_PROJECT_USERNAME}/${CIRCLE_PROJECT_REPONAME}" download
git lfs pull
- run:
name: Amiga 500 (compile code base)
command: 'cd a500 && make'
| version: 2
jobs:
build:
docker:
- image: cahirwpz/demoscene:latest
working_directory: ~/demoscene
steps:
- checkout
- run:
name: Git LFS (install Git Large File Storage)
command: |
apt-get -q update && apt-get upgrade -y
apt-get install -y --no-install-recommends openssh-client
mkdir -p ~/.ssh
ssh-keyscan -H github.com >> ~/.ssh/known_hosts
ssh git@github.com git-lfs-authenticate "${CIRCLE_PROJECT_USERNAME}/${CIRCLE_PROJECT_REPONAME}" download
git lfs pull
- run:
name: Amiga 500 (compile code base)
command: 'cd a500 && make'
| Update package list - fixes missing openssh-client. | Update package list - fixes missing openssh-client.
| YAML | artistic-2.0 | cahirwpz/demoscene,cahirwpz/demoscene,cahirwpz/demoscene,cahirwpz/demoscene |
3ea861fb3eaa243a9eb8e4aad5088a99c1eab1af | .circleci/config.yml | .circleci/config.yml | # Javascript Node CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-javascript/ for more details
#
version: 2
jobs:
build:
docker:
# specify the version you desire here
- image: circleci/node:latest
# Specify service dependencies here if necessary
# CircleCI maintains a library of pre-built images
# documented at https://circleci.com/docs/2.0/circleci-images/
# - image: circleci/mongo:3.4.4
working_directory: ~/repo
steps:
- checkout
# Download and cache dependencies
- restore_cache:
keys:
- v1-dependencies-{{ checksum "package.json" }}
# fallback to using the latest cache if no exact match is found
- v1-dependencies-
- run: yarn install
- save_cache:
paths:
- node_modules
key: v1-dependencies-{{ checksum "package.json" }}
- run:
name: Run test
command: yarn test -- --coverage
- run:
name: Upload coverage to codecov
command: bash <(curl -s https://codecov.io/bash) -t $CODECOV_TOKEN
| # Javascript Node CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-javascript/ for more details
#
version: 2
jobs:
build:
docker:
# specify the version you desire here
- image: circleci/node:latest
# Specify service dependencies here if necessary
# CircleCI maintains a library of pre-built images
# documented at https://circleci.com/docs/2.0/circleci-images/
# - image: circleci/mongo:3.4.4
working_directory: ~/repo
steps:
- checkout
# Download and cache dependencies
- restore_cache:
keys:
- v1-dependencies-{{ checksum "package.json" }}
# fallback to using the latest cache if no exact match is found
- v1-dependencies-
- run: yarn install
- save_cache:
paths:
- node_modules
key: v1-dependencies-{{ checksum "package.json" }}
- run:
name: Run test
command: yarn test --coverage
- run:
name: Upload coverage to codecov
command: bash <(curl -s https://codecov.io/bash) -t $CODECOV_TOKEN
| Remove -- from yarn test -- --coverage to follow the latest yarn version | Remove -- from yarn test -- --coverage to follow the latest yarn version
| YAML | mit | shirasudon/chat,shirasudon/chat |
c7646fa447fc70bd45ee6de1aa9395e19de250b5 | .circleci/config.yml | .circleci/config.yml | version: 2
jobs:
build:
working_directory: ~/tilex
docker:
- image: circleci/elixir:1.8.1-node-browsers
- image: postgres:9.6.5
environment:
POSTGRES_USER: postgres
ASYNC_FEATURE_TEST: yes
DATE_DISPLAY_TZ: America/Chicago
steps:
- checkout
- run: sudo apt-get update
- run: sudo apt-get install build-essential
- run: sudo apt-get install libnss3 libgconf-2-4
- run: sudo apt-get install libxss1 libappindicator1 libindicator7
- run: sudo apt-get install libasound2 libgtk-3-0 libxtst6 fonts-liberation lsb-release xdg-utils libappindicator3-1
- run: sudo apt-get install -f
- run: elixir -v
- run: mix local.hex --force
- run: mix local.rebar --force
- run: mix deps.get
- run: MIX_ENV=test mix ecto.create
- run: MIX_ENV=test mix ecto.migrate
- run: cd assets && npm install
- run: cd assets && node node_modules/webpack/bin/webpack.js --mode production
- run: mix phx.digest
- run: mix format --check-formatted
- run: mix credo
- run: mix test
- store_artifacts:
path: /home/circleci/tilex/screenshots
| version: 2
jobs:
build:
working_directory: ~/tilex
docker:
- image: circleci/elixir:1.8.1-node-browsers
- image: postgres:9.6.5
environment:
POSTGRES_USER: postgres
ASYNC_FEATURE_TEST: yes
DATE_DISPLAY_TZ: America/Chicago
steps:
- checkout
- run: sudo apt-get update
- run: sudo apt-get install build-essential
- run: sudo apt-get install libnss3 libgconf-2-4
- run: sudo apt-get install libxss1 libappindicator1 libindicator7
- run: sudo apt-get install libasound2 libgtk-3-0 libxtst6 fonts-liberation lsb-release xdg-utils libappindicator3-1
- run: sudo apt-get install -f
- run: mix local.hex --force
- run: mix local.rebar --force
- run: mix deps.get
- run: MIX_ENV=test mix ecto.create
- run: MIX_ENV=test mix ecto.migrate
- run: cd assets && npm install
- run: cd assets && node node_modules/webpack/bin/webpack.js --mode production
- run: mix phx.digest
- run: mix format --check-formatted
- run: mix credo
- run: mix test
- store_artifacts:
path: /home/circleci/tilex/screenshots
| Remove redundant Elixir version check | Remove redundant Elixir version check
| YAML | mit | hashrocket/tilex,hashrocket/tilex,hashrocket/tilex |
f7504f417b3ddb15ed4e3ac593ded8e6051014f7 | config/database.yml | config/database.yml | default: &default
adapter: mysql2
encoding: utf8mb4
pool: 8
username: root
password:
development:
<<: *default
database: github_ranking
test:
<<: *default
database: github_ranking_test
production:
adapter: mysql2
username: <%= ENV['GITHUBRANKING_DATABASE_USER'] %>
password: <%= ENV['GITHUBRANKING_DATABASE_PASSWORD'] %>
socket: <%= ENV['GITHUBRANKING_DATABASE_SOCKET'] %>
pool: 8
database: github_ranking
| default: &default
adapter: mysql2
encoding: utf8mb4
pool: 8
username: root
password:
development:
<<: *default
database: github_ranking
test:
<<: *default
database: github_ranking_test
production:
adapter: mysql2
username: <%= ENV['DATABASE_USER'] %>
password: <%= ENV['DATABASE_PASSWORD'] %>
host: <%= ENV['DATABASE_HOST'] %>
port: <%= ENV['DATABASE_PORT'] %>
pool: 8
database: github_ranking
| Allow specifying some db configs | Allow specifying some db configs
| YAML | mit | k0kubun/github-ranking,k0kubun/github-ranking,k0kubun/githubranking,k0kubun/githubranking,k0kubun/github-ranking,k0kubun/githubranking,k0kubun/github-ranking,k0kubun/github-ranking,k0kubun/githubranking,k0kubun/githubranking |
f15d039bccceefcdc21099f948d0c0c5d8a94e52 | .github/workflows/changeset.yml | .github/workflows/changeset.yml | name: New changeset
on: [push, pull_request]
jobs:
test:
name: Run tests
strategy:
matrix:
os: [ubuntu-18.04, ubuntu-20.04, macos-latest, windows-latest]
node: [10, 12, 14, 15, 16]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v2
with:
node-version: ${{ matrix.node }}
- run: yarn install
- run: yarn test
| name: New changeset
on: [push, pull_request]
jobs:
test:
name: Run tests
strategy:
matrix:
os: [ubuntu-18.04, ubuntu-20.04, macos-latest, windows-latest]
node: [12, 14, 15, 16]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v2
with:
node-version: ${{ matrix.node }}
- run: yarn install
- run: yarn test
| Remove support for node v10 | Remove support for node v10
| YAML | mit | sth/karma-summary-reporter,sth/karma-summary-reporter |
7de6ae08097b017b79fb9673d97be983dd46666a | .github/workflows/gh-pages.yaml | .github/workflows/gh-pages.yaml | name: Publish Documentation
on:
push:
branches:
- develop
paths:
- 'CMSIS/DoxyGen/**'
jobs:
docs:
name: Build develop documentation
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v1
- name: Install Doxygen 1.8.6
run: |
wget http://archive.ubuntu.com/ubuntu/pool/main/d/doxygen/doxygen_1.8.6-2_amd64.deb
sudo dpkg -i doxygen_1.8.6-2_amd64.deb
- name: Install mscgen 0.20
run: |
sudo apt-get update
sudo apt-get install --no-install-recommends -y mscgen=0.20-12
- name: Generate doxygen
run: CMSIS/DoxyGen/gen_doc.sh
- name: Archive documentation
run: |
cd CMSIS/Documentation
tar -cvjf /tmp/doc.tbz2 .
- uses: actions/checkout@v2
with:
ref: gh-pages
- name: Publish documentation
run: |
rm -r develop
mkdir develop
cd develop
tar -xvjf /tmp/doc.tbz2
git config user.name github-actions
git config user.email github-actions@github.com
git add .
git commit -m "Update develop documentation"
git push
| name: Publish Documentation
on:
workflow_dispatch:
push:
branches:
- develop
paths:
- 'CMSIS/DoxyGen/**'
jobs:
docs:
name: Build develop documentation
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v1
- name: Install Doxygen 1.8.6
run: |
wget http://archive.ubuntu.com/ubuntu/pool/main/d/doxygen/doxygen_1.8.6-2_amd64.deb
sudo dpkg -i doxygen_1.8.6-2_amd64.deb
- name: Install mscgen 0.20
run: |
sudo apt-get update
sudo apt-get install --no-install-recommends -y mscgen=0.20-12
- name: Generate doxygen
run: CMSIS/DoxyGen/gen_doc.sh
- name: Archive documentation
run: |
cd CMSIS/Documentation
tar -cvjf /tmp/doc.tbz2 .
- uses: actions/checkout@v2
with:
ref: gh-pages
- name: Publish documentation
run: |
rm -r develop
mkdir develop
cd develop
tar -xvjf /tmp/doc.tbz2
git config user.name github-actions
git config user.email github-actions@github.com
git add .
git commit -m "Update develop documentation"
git push
| Allow manual develop docs update. | GitHub: Allow manual develop docs update.
| YAML | apache-2.0 | ARM-software/CMSIS_5,ARM-software/CMSIS_5,JonatanAntoni/CMSIS_5,JonatanAntoni/CMSIS_5,ARM-software/CMSIS_5,ARM-software/CMSIS_5,JonatanAntoni/CMSIS_5,JonatanAntoni/CMSIS_5,JonatanAntoni/CMSIS_5,ARM-software/CMSIS_5,JonatanAntoni/CMSIS_5,ARM-software/CMSIS_5 |
2a9eb20c31ae045ec592469e4d95cd46fb05f917 | .github/workflows/run-tests.yml | .github/workflows/run-tests.yml | ---
name: Run tests
on: [push, pull_request, workflow_dispatch]
jobs:
tests:
runs-on: ubuntu-latest
timeout-minutes: 5
steps:
- name: Clone code
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v3
with:
python-version: '3.10'
- name: Install build dependencies
run: |
python -m pip install -r requirements.txt
- name: Fetch latest packages data and build the website
make generate
| ---
name: Run tests
on: [push, pull_request, workflow_dispatch]
jobs:
tests:
runs-on: ubuntu-latest
timeout-minutes: 5
steps:
- name: Clone code
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v3
with:
python-version: '3.10'
- name: Install build dependencies
run: |
python -m pip install -r requirements.txt
- name: Fetch latest packages data and build the website
run: |
make generate
| Add missing line to CI workflow | Add missing line to CI workflow
| YAML | bsd-2-clause | meshy/pythonwheels,meshy/pythonwheels |
d5dba1baeed1dc243e3c617fabefb99768e0db99 | config/cucumber.yml | config/cucumber.yml | <%
rerun = File.file?('rerun.txt') ? IO.read('rerun.txt') : ""
rerun_opts = rerun.to_s.strip.empty? ? "--format #{ENV['CUCUMBER_FORMAT'] || 'progress'} features" : "--format #{ENV['CUCUMBER_FORMAT'] || 'pretty'} #{rerun}"
std_opts = "--format #{ENV['CUCUMBER_FORMAT'] || 'pretty'} --strict --tags ~@wip"
%>
default: <%= std_opts %> features
wip: --tags @wip:3 --wip features
rerun: <%= rerun_opts %> --format rerun --out rerun.txt --strict --tags ~@wip
| <%
rerun = File.file?('rerun.txt') ? IO.read('rerun.txt') : ""
rerun_opts = rerun.to_s.strip.empty? ? "--format #{ENV['CUCUMBER_FORMAT'] || 'progress'} features" : "--format #{ENV['CUCUMBER_FORMAT'] || 'pretty'} #{rerun}"
std_opts = "--format html --out tmp/features.html --format #{ENV['CUCUMBER_FORMAT'] || 'pretty'} --strict --tags ~@wip"
%>
default: <%= std_opts %> features
wip: --tags @wip:3 --wip features
rerun: <%= rerun_opts %> --format rerun --out rerun.txt --strict --tags ~@wip
| Add Cucumber HTML Formatter for CI | Add Cucumber HTML Formatter for CI | YAML | mit | moneyadviceservice/frontend,moneyadviceservice/frontend,moneyadviceservice/frontend,moneyadviceservice/frontend |
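The `rerun_opts` ternary in the ERB above (re-run only the failed scenarios listed in rerun.txt when it is non-empty, otherwise run everything) can be sketched outside ERB. This is an illustrative Python translation for clarity — the function name and default filename simply mirror the ERB; it is not part of the repository:

```python
import os

# Illustrative translation of the rerun_opts ERB logic: if rerun.txt exists
# and is non-empty, Cucumber re-runs only the failed scenario locations it
# lists; otherwise the whole features directory is run. CUCUMBER_FORMAT
# overrides the default formatter in either branch, as in the ERB.
def rerun_opts(rerun_file="rerun.txt"):
    rerun = open(rerun_file).read() if os.path.isfile(rerun_file) else ""
    fmt = os.environ.get("CUCUMBER_FORMAT")
    if not rerun.strip():
        return "--format %s features" % (fmt or "progress")
    return "--format %s %s" % (fmt or "pretty", rerun)
```

Note the asymmetric defaults: a full run uses the terse `progress` formatter, while a rerun of a handful of failed scenarios defaults to the verbose `pretty` formatter.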
4618450391c6e9a960390fba459129a322c2fbb6 | task.yml | task.yml | variables:
HOME: /root
PATH: $PATH:/opt/chefdk/embedded/bin:/opt/chef/embedded/bin:/usr/bin
ROLE: '{{config.role}}'
CONSUL_SECRET_KEY: '{{config.token}}'
default:
timeout: 1800
chef:
attribute_keys:
- cloudconductor
events:
setup:
description: Execute setup
task: setup
configure:
priority: 99
description: Execute configure chef
task: configure
deploy:
description: Execute deploy
task: deploy
backup:
description: Execute backup
priority: 80
task: backup
restore:
description: Execute restore
priority: 20
task: restore
spec:
description: Execute spec
task: spec
tasks:
setup:
description: Execute setup chef
operations:
- execute:
file: prepare.sh
- chef:
run_list:
- role[{{role}}_setup]
configure:
description: Execute configure chef
operations:
- chef:
run_list:
- role[{{role}}_configure]
deploy:
description: Execute deploy chef
operations:
- chef:
run_list:
- role[{{role}}_deploy]
backup:
description: Execute backup chef
operations:
- chef:
run_list:
- role[{{role}}_backup]
restore:
description: Execute restore chef
operations:
- chef:
run_list:
- role[{{role}}_restore]
spec:
description: Execute spec chef
operations:
- execute:
script: python lib/serverspec.py {{role}}
| variables:
HOME: /root
PATH: $PATH:/usr/local/bin:/usr/bin
ROLE: '{{config.role}}'
CONSUL_SECRET_KEY: '{{config.token}}'
default:
timeout: 1800
chef:
attribute_keys:
- cloudconductor
events:
setup:
description: Execute setup
task: setup
configure:
priority: 99
description: Execute configure chef
task: configure
deploy:
description: Execute deploy
task: deploy
backup:
description: Execute backup
priority: 80
task: backup
restore:
description: Execute restore
priority: 20
task: restore
spec:
description: Execute spec
task: spec
tasks:
setup:
description: Execute setup chef
operations:
- execute:
file: prepare.sh
- chef:
run_list:
- role[{{role}}_setup]
configure:
description: Execute configure chef
operations:
- chef:
run_list:
- role[{{role}}_configure]
deploy:
description: Execute deploy chef
operations:
- chef:
run_list:
- role[{{role}}_deploy]
backup:
description: Execute backup chef
operations:
- chef:
run_list:
- role[{{role}}_backup]
restore:
description: Execute restore chef
operations:
- chef:
run_list:
- role[{{role}}_restore]
spec:
description: Execute spec chef
operations:
- execute:
script: python lib/serverspec.py {{role}}
| Change PATH environment to execute chef/berkshelf without chefdk | Change PATH environment to execute chef/berkshelf without chefdk
| YAML | apache-2.0 | cloudconductor-patterns/amanda_pattern,cloudconductor-patterns/amanda_pattern,cloudconductor-patterns/amanda_pattern,cloudconductor-patterns/amanda_pattern |
50ed5ee208626628300a7e73f8285f1c945b4d28 | task.yml | task.yml | environments:
HOME: /root
PATH: $PATH:/usr/local/bin:/usr/bin
ROLE: '{{config.role}}'
CONSUL_SECRET_KEY: '{{config.token}}'
default:
timeout: 1800
chef:
attribute_keys:
- cloudconductor
events:
setup:
description: Execute setup
task: setup
configure:
description: Execute configure chef
task: configure
spec:
description: Execute spec
task: spec
deploy:
description: Execute deploy
task: deploy
tasks:
setup:
description: Execute setup chef
operations:
- execute:
file: prepare.sh
- chef:
run_list:
- role[{{role}}_setup]
configure:
description: Execute configure chef
operations:
- chef:
run_list:
- role[{{role}}_configure]
spec:
description: Execute serverspec
operations:
- execute:
script: |
gem install activesupport
python lib/serverspec.py {{role}}
deploy:
description: Execute deploy chef
operations:
- chef:
run_list:
- role[{{role}}_deploy]
| environments:
HOME: /root
PATH: $PATH:/usr/local/bin:/usr/bin
ROLE: '{{config.role}}'
CONSUL_SECRET_KEY: '{{config.token}}'
default:
timeout: 1800
chef:
attribute_keys:
- cloudconductor
events:
setup:
description: Execute setup
task: setup
configure:
description: Execute configure chef
task: configure
spec:
description: Execute spec
task: spec
deploy:
description: Execute deploy
task: deploy
tasks:
setup:
description: Execute setup chef
operations:
- execute:
script: |
gem install activesupport
- execute:
file: prepare.sh
- chef:
run_list:
- role[{{role}}_setup]
configure:
description: Execute configure chef
operations:
- chef:
run_list:
- role[{{role}}_configure]
spec:
description: Execute serverspec
operations:
- execute:
script: |
gem install activesupport
python lib/serverspec.py {{role}}
deploy:
description: Execute deploy chef
operations:
- chef:
run_list:
- role[{{role}}_deploy]
| Install active_support gem which is used in setup | Install active_support gem which is used in setup
| YAML | apache-2.0 | cloudconductor-patterns/fluentd_pattern,cloudconductor-patterns/fluentd_pattern,cloudconductor-patterns/fluentd_pattern,cloudconductor-patterns/fluentd_pattern |
9c757973462d5db9ad7eaabc41d9f1a531ed32d2 | .asf.yaml | .asf.yaml | notifications:
commits: commits@thrift.apache.org
issues: dev@thrift.apache.org
pullrequests_status: dev@thrift.apache.org
pullrequests_comment: github@thrift.apache.org
jira_options: link label worklog
| notifications:
commits: commits@thrift.apache.org
issues: dev@thrift.apache.org
pullrequests_status: dev@thrift.apache.org
pullrequests_comment: notifications@thrift.apache.org
jira_options: link label worklog
| Configure Gitbox: It's notifications, not github | Configure Gitbox: It's notifications, not github [skip ci]
| YAML | apache-2.0 | jeking3/thrift,yongju-hong/thrift,jeking3/thrift,strava/thrift,nsuke/thrift,apache/thrift,Jens-G/thrift,strava/thrift,nsuke/thrift,bforbis/thrift,bforbis/thrift,dcelasun/thrift,Jens-G/thrift,Jens-G/thrift,yongju-hong/thrift,apache/thrift,strava/thrift,jeking3/thrift,nsuke/thrift,strava/thrift,Jens-G/thrift,dcelasun/thrift,nsuke/thrift,dcelasun/thrift,apache/thrift,bforbis/thrift,nsuke/thrift,dcelasun/thrift,bgould/thrift,creker/thrift,nsuke/thrift,jeking3/thrift,Jens-G/thrift,strava/thrift,dcelasun/thrift,Jens-G/thrift,creker/thrift,Jens-G/thrift,dcelasun/thrift,dcelasun/thrift,bforbis/thrift,creker/thrift,bforbis/thrift,creker/thrift,dcelasun/thrift,nsuke/thrift,bgould/thrift,yongju-hong/thrift,bgould/thrift,Jens-G/thrift,Jens-G/thrift,jeking3/thrift,nsuke/thrift,Jens-G/thrift,bgould/thrift,Jens-G/thrift,nsuke/thrift,yongju-hong/thrift,bgould/thrift,apache/thrift,strava/thrift,strava/thrift,yongju-hong/thrift,strava/thrift,apache/thrift,apache/thrift,jeking3/thrift,Jens-G/thrift,jeking3/thrift,bforbis/thrift,yongju-hong/thrift,jeking3/thrift,yongju-hong/thrift,bgould/thrift,dcelasun/thrift,dcelasun/thrift,nsuke/thrift,strava/thrift,bforbis/thrift,yongju-hong/thrift,Jens-G/thrift,bgould/thrift,strava/thrift,bforbis/thrift,bgould/thrift,strava/thrift,yongju-hong/thrift,apache/thrift,jeking3/thrift,bforbis/thrift,bgould/thrift,jeking3/thrift,jeking3/thrift,yongju-hong/thrift,Jens-G/thrift,bforbis/thrift,bforbis/thrift,dcelasun/thrift,bforbis/thrift,strava/thrift,apache/thrift,Jens-G/thrift,nsuke/thrift,strava/thrift,yongju-hong/thrift,dcelasun/thrift,jeking3/thrift,nsuke/thrift,Jens-G/thrift,bgould/thrift,jeking3/thrift,bgould/thrift,strava/thrift,strava/thrift,nsuke/thrift,jeking3/thrift,creker/thrift,dcelasun/thrift,apache/thrift,bgould/thrift,yongju-hong/thrift,jeking3/thrift,creker/thrift,bforbis/thrift,creker/thrift,yongju-hong/thrift,apache/thrift,yongju-hong/thrift,strava/thrift,bgould/thrift,bforbis/thrift,apache/thrift,yongju-hong/thrift
,nsuke/thrift,apache/thrift,apache/thrift,apache/thrift,Jens-G/thrift,nsuke/thrift,creker/thrift,apache/thrift,bforbis/thrift,jeking3/thrift,nsuke/thrift,dcelasun/thrift,creker/thrift,apache/thrift,nsuke/thrift,creker/thrift,creker/thrift,creker/thrift,strava/thrift,bforbis/thrift,dcelasun/thrift,dcelasun/thrift,apache/thrift,creker/thrift,yongju-hong/thrift,bgould/thrift,dcelasun/thrift,bforbis/thrift,yongju-hong/thrift,bgould/thrift,Jens-G/thrift,apache/thrift,creker/thrift |
9a19a21f10402aeb981807b6f14816edad1ced92 | .zuul.yml | .zuul.yml | ui: tape
browsers:
- name: chrome
version: latest
- name: firefox
version: latest
- name: safari
version: latest
- name: microsoftedge
version: latest
- name: iphone
version: latest
- name: ipad
version: latest
- name: android
version: latest
| ui: tape
browsers:
- name: chrome
version: latest
- name: firefox
version: latest
- name: safari
version: latest
- name: microsoftedge
version: latest
- name: ie
version: latest
- name: iphone
version: latest
- name: ipad
version: latest
- name: android
version: latest
| Revert "Disable Internet explorer testing for now..." | Revert "Disable Internet explorer testing for now..."
This reverts commit e65dfd4724b585285fed19fb3097b2ec7dd15c5a.
| YAML | mit | pgaubatz/node-uuid |
112d646bfa5f0ea8155ba184cb087048d50ec8c0 | .zuul.yml | .zuul.yml | ui: tape
browsers:
- name: chrome
version: 27..latest
- name: ie
version: 8..latest
- name: iphone
version: 6.0..latest
- name: ipad
version: 6.0..latest
- name: safari
version: 5..latest
- name: android
version: 4.0..latest
| ui: tape
browsers:
- name: chrome
version: 30..latest
- name: ie
version: 8..latest
- name: iphone
version: 6.0..latest
- name: ipad
version: 6.0..latest
- name: safari
version: 5..latest
- name: android
version: 4.0..latest
- name: firefox
version: 30..latest
| Add firefox to browser test list | Add firefox to browser test list
| YAML | mit | ajoslin/just-storage |
32320c6decff87f815b62c65bd78bb10a911fb42 | metadata/com.marcdonald.hibi.yml | metadata/com.marcdonald.hibi.yml | Categories:
- Science & Education
- Writing
License: Apache-2.0
AuthorName: Marc Donald
AuthorWebSite: https://marcdonald.com/
WebSite: https://marcdonald.com/hibi
SourceCode: https://github.com/marcdonald/hibi
IssueTracker: https://github.com/marcdonald/hibi/issues
Changelog: https://github.com/MarcDonald/Hibi/releases
AutoName: Hibi
RepoType: git
Repo: https://github.com/marcdonald/hibi
Builds:
- versionName: 1.3.2
versionCode: 34
commit: v1.3.2;34
subdir: app
gradle:
- yes
- versionName: 1.3.3
versionCode: 35
commit: v1.3.3;35
subdir: app
gradle:
- yes
- versionName: 1.4.0
versionCode: 36
commit: v1.4.0;36
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v;%c
UpdateCheckMode: Tags
CurrentVersion: 1.4.0
CurrentVersionCode: 36
| Categories:
- Science & Education
- Writing
License: Apache-2.0
AuthorName: Marc Donald
AuthorWebSite: https://marcdonald.com/
WebSite: https://marcdonald.com/hibi
SourceCode: https://github.com/marcdonald/hibi
IssueTracker: https://github.com/marcdonald/hibi/issues
Changelog: https://github.com/MarcDonald/Hibi/releases
AutoName: Hibi
RepoType: git
Repo: https://github.com/marcdonald/hibi
Builds:
- versionName: 1.3.2
versionCode: 34
commit: v1.3.2;34
subdir: app
gradle:
- yes
- versionName: 1.3.3
versionCode: 35
commit: v1.3.3;35
subdir: app
gradle:
- yes
- versionName: 1.4.0
versionCode: 36
commit: v1.4.0;36
subdir: app
gradle:
- yes
- versionName: 1.4.1
versionCode: 37
commit: v1.4.1;37
subdir: app
gradle:
- yes
AutoUpdateMode: Version v%v;%c
UpdateCheckMode: Tags
CurrentVersion: 1.4.1
CurrentVersionCode: 37
| Update Hibi to 1.4.1 (37) | Update Hibi to 1.4.1 (37)
| YAML | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroiddata |
ef2144446381d161a3ef764522bc8dac44713a4a | packages/la/lazy-io-streams.yaml | packages/la/lazy-io-streams.yaml | homepage: ''
changelog-type: ''
hash: 55cbf42566b4666c2b53477492cd5df6f7c21d4a2bd6f4448258d9429e21c45e
test-bench-deps: {}
maintainer: ltclifton@gmail.com
synopsis: Get lazy with your io-streams
changelog: ''
basic-deps:
bytestring: -any
base: ! '>=4.8 && <4.13'
io-streams: -any
all-versions:
- 0.1.0.0
author: Luke Clifton
latest: 0.1.0.0
description-type: markdown
description: |
A library to do naughty things with
[io-streams](https://hackage.haskell.org/package/io-streams).
It lets you:
- convert `InputStream`s to lazy lists
- build lazy lists by feeding an `OutputStream`
- convert `InputStream ByteString`s to lazy `ByteStrings`
This can be handy when you are interfacing with other libraries that do
the right thing in the presence of lazy IO, but don't provide an
io-streams interface. Of course, the correct solution is to contribute
the io-streams interface.
license-name: BSD-3-Clause
| homepage: ''
changelog-type: ''
hash: d4de643ff1d5d500de8bd86910ae5a0cee363e41d2410fcab3d9d49e038a2607
test-bench-deps: {}
maintainer: ltclifton@gmail.com
synopsis: Get lazy with your io-streams
changelog: ''
basic-deps:
bytestring: -any
base: '>=4.8 && <5.0'
io-streams: -any
all-versions:
- 0.1.0.0
author: Luke Clifton
latest: 0.1.0.0
description-type: markdown
description: |
A library to do naughty things with
[io-streams](https://hackage.haskell.org/package/io-streams).
It lets you:
- convert `InputStream`s to lazy lists
- build lazy lists by feeding an `OutputStream`
- convert `InputStream ByteString`s to lazy `ByteStrings`
This can be handy when you are interfacing with other libraries that do
the right thing in the presence of lazy IO, but don't provide an
io-streams interface. Of course, the correct solution is to contribute
the io-streams interface.
license-name: BSD-3-Clause
| Update from Hackage at 2021-02-05T02:00:07Z | Update from Hackage at 2021-02-05T02:00:07Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
ecf04892332f85bc6f7bc184eee381fc4aadc804 | roles/keitaro.php/tasks/main.yml | roles/keitaro.php/tasks/main.yml | - name: Install php extensions
yum:
name:
- "{{php_version}}"
- "{{php_version}}-php-bcmath"
- "{{php_version}}-php-devel"
- "{{php_version}}-php-mysqlnd"
- "{{php_version}}-php-opcache"
- "{{php_version}}-php-pecl-redis"
- "{{php_version}}-php-mbstring"
- "{{php_version}}-php-pear"
- "{{php_version}}-php-xml"
- "{{php_version}}-php-pecl-zip"
- "{{php_version}}-php-ioncube-loader"
state: installed
- name: Link php directories and php binary
file:
src: "{{item.from}}"
dest: "{{item.to}}"
state: link
with_items:
- {from: "/usr/bin/{{php_version}}", to: '/usr/bin/php'}
- {from: "/opt/remi/{{php_version}}/root/bin/php-config", to: '/usr/bin/php-config'}
- {from: "/etc/opt/remi/{{php_version}}", to: '/etc/php'}
- {from: "/var/opt/remi/{{php_version}}/log/php-fpm", to: '/var/log/php-fpm'}
- name: Tune php
include: tune-php.yml | - name: Install php extensions
yum:
name:
- "{{php_version}}"
- "{{php_version}}-php-bcmath"
- "{{php_version}}-php-devel"
- "{{php_version}}-php-mysqlnd"
- "{{php_version}}-php-opcache"
- "{{php_version}}-php-pecl-redis"
- "{{php_version}}-php-mbstring"
- "{{php_version}}-php-pear"
- "{{php_version}}-php-xml"
- "{{php_version}}-php-pecl-zip"
- "{{php_version}}-php-ioncube-loader"
state: installed
- name: Link php directories and php binary
file:
src: "{{item.from}}"
dest: "{{item.to}}"
state: link
with_items:
- {from: "/usr/bin/{{php_version}}", to: '/usr/bin/php'}
- {from: "/opt/remi/{{php_version}}/root/bin/php-config", to: '/usr/bin/php-config'}
- {from: "/etc/opt/remi/{{php_version}}", to: '/etc/php'}
- name: Tune php
include: tune-php.yml | Make symlink to php-fpm log after installing php-fpm | Make symlink to php-fpm log after installing php-fpm
| YAML | mit | keitarocorp/centos_provision,keitarocorp/centos_provision |
169140d1bd0d9a836953966ade8b89bcd9c17027 | packages/ja/jailbreak-cabal.yaml | packages/ja/jailbreak-cabal.yaml | homepage: https://github.com/peti/jailbreak-cabal
changelog-type: ''
hash: c645cb878c42b14dca1ef811403b939546f39284b0965b65dea4d93042d5da70
test-bench-deps: {}
maintainer: simons@cryp.to
synopsis: Strip version restrictions from Cabal files
changelog: ''
basic-deps:
Cabal: ==3.*
base: <5
all-versions:
- '1.0'
- '1.1'
- '1.2'
- '1.3'
- 1.3.1
- 1.3.2
- 1.3.3
- 1.3.4
- 1.3.5
author: Peter Simons, Jeremy Shaw, Joel Taylor, Kosyrev Serge, Nikolay Amiantov, aszlig
latest: 1.3.5
description-type: haddock
description: Strip version restrictions from build dependencies in Cabal files.
license-name: BSD-3-Clause
| homepage: https://github.com/NixOS/jailbreak-cabal
changelog-type: ''
hash: 624f4d427b7d9deea4ab69316b7ed0d835ab329f8f441fbdeef92b13c13bc0bf
test-bench-deps: {}
maintainer: sternenseemann@systemli.org
synopsis: Strip version restrictions from Cabal files
changelog: ''
basic-deps:
Cabal: ==3.*
base: <5
Cabal-syntax: -any
all-versions:
- '1.0'
- '1.1'
- '1.2'
- '1.3'
- 1.3.1
- 1.3.2
- 1.3.3
- 1.3.4
- 1.3.5
- 1.3.6
author: Peter Simons, Jeremy Shaw, Joel Taylor, Kosyrev Serge, Nikolay Amiantov, aszlig
latest: 1.3.6
description-type: haddock
description: Strip version restrictions from build dependencies in Cabal files.
license-name: BSD-3-Clause
| Update from Hackage at 2022-04-17T16:38:27Z | Update from Hackage at 2022-04-17T16:38:27Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
ffbe4d1c3ecf603636d97fd8dfcd53e541cde288 | bddtests/docker-compose-1-empty.yml | bddtests/docker-compose-1-empty.yml | empty:
image: hyperledger/fabric-src
# TCP Listen on a port to satisfy the container 'ready' condition
command: nc -k -l 50000
| empty:
image: hyperledger/fabric-src
# TCP Listen on a port to satisfy the container 'ready' condition
# If we don't fork, trap and wait, nc (and by extension, the container)
# won't respond to the SIGTERM that docker sends when it wants to stop.
command: |
sh -c '
nc -k -l 50000 &
pid=$$!
trap "kill $$pid; exit 0" TERM
wait $$pid
'
| Handle SIGTERM in empty docker-compose container | Handle SIGTERM in empty docker-compose container
In docker-compose-1-empty.yml nc was being invoked directly which led
to SIGTERM being ignored by the container. This added an unnecessary
10-second wait after tests that used it while docker timed out and
sent a SIGKILL. This fixes that by forking nc, trapping SIGTERM,
and killing nc directly.
Change-Id: Ifc5072b33c45f9fa90868a87ca69982c051e10e1
Signed-off-by: Julian Carrivick <689bb502006712bd9777fa7f2dbac6f1e92d8c51@au1.ibm.com>
| YAML | apache-2.0 | rameshthoomu/fabric,thakkarparth007/fabric,ibmmark/fabric,hyperledger/fabric,lukehuangch/fabric,lukehuangch/fabric,bcbrock/fabric,binhn/fabric,rameshbabu79/fabric,stonejiang208/fabric,stemlending/fabric,magg/fabric,ckeyer/fabric,manish-sethi/fabric,lukehuangch/fabric,hyperledgerchina/fabric_zh_CN,tkuhrt/fabric,akihikot/fabric,vpaprots/fabric,mqshen/fabric,thakkarparth007/fabric,muralisrini/fabric,muralisrini/fabric,magg/fabric,cophey/fabric,ibmmark/fabric,rameshthoomu/fabric,ibmmark/fabric,manish-sethi/fabric-sidedb,ckeyer/fabric,lukehuangch/fabric,linchaoyang/fabric,ckeyer/fabric,masterDev1985/fabric,stonejiang208/fabric,thakkarparth007/fabric,hyperledgerchina/fabric_zh_CN,blockc/fabric,lukehuangch/fabric,stonejiang208/fabric,mffrench/fabric,king3000/fabric,bcbrock/fabric,akihikot/fabric,binhn/fabric,mffrench/fabric,phariel/fabric,magg/fabric,vpaprots/fabric,rameshthoomu/fabric,masterDev1985/fabric,vdods/fabric,thakkarparth007/fabric,tkuhrt/fabric,chrisguoado/fabric,ckeyer/fabric,kchristidis/fabric,ibmmark/fabric,kchristidis/fabric,mqshen/fabric,Ryan--Yang/fabric,mqshen/fabric,binhn/fabric,mffrench/fabric,linchaoyang/fabric,bmos299/fabric,vdods/fabric,tkuhrt/fabric,tkuhrt/fabric,hyperledger/fabric,akihikot/fabric,rameshbabu79/fabric,hyperledgerchina/fabric_zh_CN,masterDev1985/fabric,cophey/fabric,christo4ferris/fabric-docs,masterDev1985/fabric,chrisguoado/fabric,mqshen/fabric,rameshbabu79/fabric,bcbrock/fabric,tkuhrt/fabric,rameshthoomu/fabric,vpaprots/fabric,king3000/fabric,cophey/fabric,cophey/fabric,bmos299/fabric,ibmmark/fabric,bmos299/fabric,chrisguoado/fabric,king3000/fabric,lukehuangch/fabric,xixuejia/fabric,hyperledgerchina/fabric_zh_CN,blockc/fabric,linchaoyang/fabric,manish-sethi/fabric-sidedb,manish-sethi/fabric,rameshthoomu/fabric,tkuhrt/fabric,Ryan--Yang/fabric,Ryan--Yang/fabric,rameshbabu79/fabric,tkuhrt/fabric,mqshen/fabric,jimthematrix/fabric,mffrench/fabric,chrisguoado/fabric,magg/fabric,hyperledgerchina/fabric_zh_CN,kchristid
is/fabric,jimthematrix/fabric,rameshbabu79/fabric,masterDev1985/fabric,christo4ferris/fabric-docs,manish-sethi/fabric-sidedb,christo4ferris/fabric-docs,phariel/fabric,christo4ferris/fabric-docs,bmos299/fabric,akihikot/fabric,akihikot/fabric,rameshthoomu/fabric,mffrench/fabric,bcbrock/fabric,christo4ferris/fabric-docs,magg/fabric,bmos299/fabric,akihikot/fabric,ckeyer/fabric,kchristidis/fabric,mqshen/fabric,stonejiang208/fabric,chrisguoado/fabric,xixuejia/fabric,king3000/fabric,linchaoyang/fabric,hyperledgerchina/fabric_zh_CN,Ryan--Yang/fabric,phariel/fabric,vdods/fabric,blockc/fabric,bcbrock/fabric,linchaoyang/fabric,stemlending/fabric,bcbrock/fabric,Ryan--Yang/fabric,bmos299/fabric,blockc/fabric,blockc/fabric,akihikot/fabric,bmos299/fabric,phariel/fabric,vdods/fabric,kchristidis/fabric,xixuejia/fabric,linchaoyang/fabric,stemlending/fabric,ibmmark/fabric,Ryan--Yang/fabric,hyperledgerchina/fabric_zh_CN,thakkarparth007/fabric,chrisguoado/fabric,mffrench/fabric,phariel/fabric,magg/fabric,ckeyer/fabric,stemlending/fabric,linchaoyang/fabric,muralisrini/fabric,phariel/fabric,vpaprots/fabric,bcbrock/fabric,lukehuangch/fabric,thakkarparth007/fabric,thakkarparth007/fabric,manish-sethi/fabric-sidedb,kchristidis/fabric,manish-sethi/fabric-sidedb,stonejiang208/fabric,king3000/fabric,cophey/fabric,manish-sethi/fabric-sidedb,kchristidis/fabric,king3000/fabric,vdods/fabric,cophey/fabric,ibmmark/fabric,blockc/fabric,mffrench/fabric,king3000/fabric,chrisguoado/fabric,christo4ferris/fabric-docs,mqshen/fabric,magg/fabric,Ryan--Yang/fabric,manish-sethi/fabric-sidedb,ckeyer/fabric,rameshthoomu/fabric,phariel/fabric,hyperledgerchina/fabric_zh_CN,vpaprots/fabric,blockc/fabric,cophey/fabric |
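The fork/trap/wait wrapper from the diff above can be exercised outside docker-compose. This standalone sketch substitutes `sleep` for `nc -k -l 50000` (and plain `$` for the `$$` compose-escaping, since no compose interpolation is involved); the 0.5 s pause is an assumption giving the shell time to install the trap:

```python
import signal
import subprocess
import time

# Same shape as the compose command: fork the worker, trap TERM, kill the
# worker and exit 0 when the signal arrives. Without the trap, the
# foreground process ignores docker's SIGTERM and the container only dies
# on the follow-up SIGKILL after the stop timeout.
script = """
sleep 30 &
pid=$!
trap "kill $pid; exit 0" TERM
wait $pid
"""
proc = subprocess.Popen(["sh", "-c", script])
time.sleep(0.5)                     # let the shell install the trap
proc.send_signal(signal.SIGTERM)    # what `docker stop` sends first
print("exit status:", proc.wait())  # 0: clean shutdown, no SIGKILL wait
```

The trap handler runs while the shell is blocked in `wait`, so the container's main process exits promptly with status 0 instead of being force-killed.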
9ee2d6b60987eccab495e854e40e99d73811492e | postgresql-server/tasks/main.yml | postgresql-server/tasks/main.yml | ---
- name: Add the PostgreSQL APT key
apt_key: url=https://www.postgresql.org/media/keys/ACCC4CF8.asc state=present
- name: Add the PostgreSQL APT repository
apt_repository: repo='deb http://apt.postgresql.org/pub/repos/apt/ precise-pgdg main' state=present
- name: Install the PostgreSQL packages
apt: pkg={{ item }} state=present
with_items:
- postgresql-9.3
- libpq-dev
- python-psycopg2
- name: Ensure logrotate is uninstalled
apt: pkg=logrotate state=absent
when: ansible_distribution_release != "precise"
| ---
- name: Add the PostgreSQL APT key
apt_key: url=https://www.postgresql.org/media/keys/ACCC4CF8.asc state=present
- name: Add the PostgreSQL APT repository
apt_repository: repo='deb http://apt.postgresql.org/pub/repos/apt/ precise-pgdg main' state=present
- name: Install the PostgreSQL packages
apt: pkg={{ item }} state=present
with_items:
- postgresql-9.3
- libpq-dev
- python-psycopg2
- name: Ensure logrotate is uninstalled
apt: pkg=logrotate state=absent
when: ansible_distribution_release != "precise" and ansible_distribution_release != "trusty"
| Add Trusty to the logrotate whitelist for Postgres. | Add Trusty to the logrotate whitelist for Postgres.
| YAML | mit | tomku/ansible-roles |
56092e2c92fcb612d2f3184750c0d011e6550b8c | recipes/perl-role-tiny/meta.yaml | recipes/perl-role-tiny/meta.yaml | package:
name: perl-role-tiny
version: "2.000001"
source:
fn: Role-Tiny-2.000001.tar.gz
url: https://cpan.metacpan.org/authors/id/H/HA/HAARG/Role-Tiny-2.000001.tar.gz
md5: f350f1f8c13652bf85da172380b39ec8
build:
number: 0
skip: True # [osx]
requirements:
build:
- perl-threaded
- perl-test-fatal
run:
- perl-threaded
test:
# Perl 'use' tests
imports:
- Role::Tiny
- Role::Tiny::With
about:
home: http://metacpan.org/pod/Role-Tiny
license: perl_5
summary: 'Roles. Like a nouvelle cuisine portion size slice of Moose.'
| package:
name: perl-role-tiny
version: "2.000001"
source:
fn: Role-Tiny-2.000001.tar.gz
url: https://cpan.metacpan.org/authors/id/H/HA/HAARG/Role-Tiny-2.000001.tar.gz
md5: f350f1f8c13652bf85da172380b39ec8
build:
number: 1
requirements:
build:
- perl-threaded
- perl-test-fatal
run:
- perl-threaded
test:
# Perl 'use' tests
imports:
- Role::Tiny
- Role::Tiny::With
about:
home: http://metacpan.org/pod/Role-Tiny
license: perl_5
summary: 'Roles. Like a nouvelle cuisine portion size slice of Moose.'
| Add OSX build for Perl Role::Tiny | Add OSX build for Perl Role::Tiny
| YAML | mit | rob-p/bioconda-recipes,HassanAmr/bioconda-recipes,gvlproject/bioconda-recipes,keuv-grvl/bioconda-recipes,mdehollander/bioconda-recipes,bow/bioconda-recipes,keuv-grvl/bioconda-recipes,acaprez/recipes,matthdsm/bioconda-recipes,jasper1918/bioconda-recipes,zwanli/bioconda-recipes,cokelaer/bioconda-recipes,rvalieris/bioconda-recipes,bioconda/bioconda-recipes,jfallmann/bioconda-recipes,jfallmann/bioconda-recipes,daler/bioconda-recipes,abims-sbr/bioconda-recipes,keuv-grvl/bioconda-recipes,pinguinkiste/bioconda-recipes,rvalieris/bioconda-recipes,pinguinkiste/bioconda-recipes,HassanAmr/bioconda-recipes,ThomasWollmann/bioconda-recipes,dkoppstein/recipes,bebatut/bioconda-recipes,ivirshup/bioconda-recipes,saketkc/bioconda-recipes,mdehollander/bioconda-recipes,jasper1918/bioconda-recipes,shenwei356/bioconda-recipes,daler/bioconda-recipes,hardingnj/bioconda-recipes,guowei-he/bioconda-recipes,mcornwell1957/bioconda-recipes,ostrokach/bioconda-recipes,HassanAmr/bioconda-recipes,keuv-grvl/bioconda-recipes,mcornwell1957/bioconda-recipes,lpantano/recipes,oena/bioconda-recipes,gregvonkuster/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,colinbrislawn/bioconda-recipes,abims-sbr/bioconda-recipes,ostrokach/bioconda-recipes,Luobiny/bioconda-recipes,mdehollander/bioconda-recipes,dkoppstein/recipes,ivirshup/bioconda-recipes,gregvonkuster/bioconda-recipes,roryk/recipes,instituteofpathologyheidelberg/bioconda-recipes,lpantano/recipes,peterjc/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,CGATOxford/bioconda-recipes,Luobiny/bioconda-recipes,npavlovikj/bioconda-recipes,JenCabral/bioconda-recipes,keuv-grvl/bioconda-recipes,ivirshup/bioconda-recipes,xguse/bioconda-recipes,CGATOxford/bioconda-recipes,martin-mann/bioconda-recipes,chapmanb/bioconda-recipes,acaprez/recipes,phac-nml/bioconda-recipes,oena/bioconda-recipes,gregvonkuster/bioconda-recipes,zwanli/bioconda-recipes,pinguinkiste/bioconda-recipes,hardingnj/bioconda-recipes,xguse/bioconda-recipes,blankenberg/bioco
nda-recipes,zachcp/bioconda-recipes,phac-nml/bioconda-recipes,cokelaer/bioconda-recipes,daler/bioconda-recipes,blankenberg/bioconda-recipes,jasper1918/bioconda-recipes,JenCabral/bioconda-recipes,ThomasWollmann/bioconda-recipes,zwanli/bioconda-recipes,abims-sbr/bioconda-recipes,ostrokach/bioconda-recipes,bioconda/recipes,peterjc/bioconda-recipes,peterjc/bioconda-recipes,chapmanb/bioconda-recipes,ivirshup/bioconda-recipes,guowei-he/bioconda-recipes,zwanli/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,mdehollander/bioconda-recipes,matthdsm/bioconda-recipes,guowei-he/bioconda-recipes,dmaticzka/bioconda-recipes,zwanli/bioconda-recipes,bow/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,roryk/recipes,bioconda/bioconda-recipes,bioconda/bioconda-recipes,phac-nml/bioconda-recipes,zachcp/bioconda-recipes,dmaticzka/bioconda-recipes,guowei-he/bioconda-recipes,npavlovikj/bioconda-recipes,shenwei356/bioconda-recipes,bioconda/bioconda-recipes,pinguinkiste/bioconda-recipes,ostrokach/bioconda-recipes,colinbrislawn/bioconda-recipes,abims-sbr/bioconda-recipes,hardingnj/bioconda-recipes,daler/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,omicsnut/bioconda-recipes,jfallmann/bioconda-recipes,peterjc/bioconda-recipes,ThomasWollmann/bioconda-recipes,npavlovikj/bioconda-recipes,pinguinkiste/bioconda-recipes,JenCabral/bioconda-recipes,gvlproject/bioconda-recipes,joachimwolff/bioconda-recipes,ostrokach/bioconda-recipes,chapmanb/bioconda-recipes,blankenberg/bioconda-recipes,joachimwolff/bioconda-recipes,matthdsm/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,cokelaer/bioconda-recipes,bow/bioconda-recipes,roryk/recipes,yesimon/bioconda-recipes,colinbrislawn/bioconda-recipes,zachcp/bioconda-recipes,zwanli/bioconda-recipes,HassanAmr/bioconda-recipes,matthdsm/bioconda-recipes,zachcp/bioconda-recipes,joachimwolff/bioconda-recipes,mdehollander/bioconda-recipes,HassanAmr/bioconda-recipes,JenCabral/bioconda-recipes,jfallmann/bioconda-r
ecipes,daler/bioconda-recipes,cokelaer/bioconda-recipes,shenwei356/bioconda-recipes,rob-p/bioconda-recipes,chapmanb/bioconda-recipes,phac-nml/bioconda-recipes,saketkc/bioconda-recipes,martin-mann/bioconda-recipes,colinbrislawn/bioconda-recipes,CGATOxford/bioconda-recipes,colinbrislawn/bioconda-recipes,jasper1918/bioconda-recipes,rvalieris/bioconda-recipes,hardingnj/bioconda-recipes,peterjc/bioconda-recipes,peterjc/bioconda-recipes,martin-mann/bioconda-recipes,bebatut/bioconda-recipes,ostrokach/bioconda-recipes,xguse/bioconda-recipes,CGATOxford/bioconda-recipes,jasper1918/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,acaprez/recipes,yesimon/bioconda-recipes,mcornwell1957/bioconda-recipes,HassanAmr/bioconda-recipes,phac-nml/bioconda-recipes,npavlovikj/bioconda-recipes,dmaticzka/bioconda-recipes,bow/bioconda-recipes,oena/bioconda-recipes,dkoppstein/recipes,omicsnut/bioconda-recipes,shenwei356/bioconda-recipes,instituteofpathologyheidelberg/bioconda-recipes,rvalieris/bioconda-recipes,joachimwolff/bioconda-recipes,hardingnj/bioconda-recipes,dmaticzka/bioconda-recipes,bebatut/bioconda-recipes,martin-mann/bioconda-recipes,ThomasWollmann/bioconda-recipes,ThomasWollmann/bioconda-recipes,lpantano/recipes,JenCabral/bioconda-recipes,colinbrislawn/bioconda-recipes,abims-sbr/bioconda-recipes,joachimwolff/bioconda-recipes,oena/bioconda-recipes,bioconda/recipes,mdehollander/bioconda-recipes,rob-p/bioconda-recipes,daler/bioconda-recipes,guowei-he/bioconda-recipes,Luobiny/bioconda-recipes,gvlproject/bioconda-recipes,pinguinkiste/bioconda-recipes,ivirshup/bioconda-recipes,saketkc/bioconda-recipes,martin-mann/bioconda-recipes,ivirshup/bioconda-recipes,CGATOxford/bioconda-recipes,bow/bioconda-recipes,omicsnut/bioconda-recipes,matthdsm/bioconda-recipes,gvlproject/bioconda-recipes,rob-p/bioconda-recipes,dmaticzka/bioconda-recipes,omicsnut/bioconda-recipes,abims-sbr/bioconda-recipes,bow/bioconda-recipes,ThomasWollmann/bioconda-recipes,oena/bioconda-recipes,saketkc/bioconda-recipes,acaprez
/recipes,bebatut/bioconda-recipes,chapmanb/bioconda-recipes,BIMSBbioinfo/bioconda-recipes,bioconda/recipes,lpantano/recipes,yesimon/bioconda-recipes,joachimwolff/bioconda-recipes,yesimon/bioconda-recipes,gvlproject/bioconda-recipes,JenCabral/bioconda-recipes,saketkc/bioconda-recipes,xguse/bioconda-recipes,mcornwell1957/bioconda-recipes,gregvonkuster/bioconda-recipes,rvalieris/bioconda-recipes,dmaticzka/bioconda-recipes,Luobiny/bioconda-recipes,gvlproject/bioconda-recipes,rvalieris/bioconda-recipes,xguse/bioconda-recipes,CGATOxford/bioconda-recipes,saketkc/bioconda-recipes,omicsnut/bioconda-recipes,keuv-grvl/bioconda-recipes,blankenberg/bioconda-recipes,matthdsm/bioconda-recipes,mcornwell1957/bioconda-recipes |
b814325cf7e459ce542f62c8b0bbd25390bee0b7 | conandata.yml | conandata.yml | requirements:
"5.1.0-beta":
- "clipper/6.4.2"
- "boost/1.78.0"
- "rapidjson/1.1.0"
- "stb/20200203"
"5.1.0-cura_9365":
- "clipper/6.4.2"
- "boost/1.78.0"
- "rapidjson/1.1.0"
- "stb/20200203"
requirements_arcus:
"5.1.0-beta":
- "protobuf/3.17.1"
- "arcus/5.1.0-beta@ultimaker/stable"
"5.1.0-cura_9365":
- "protobuf/3.17.1"
- "arcus/5.1.0-CURA-9365@ultimaker/testing"
| requirements:
"5.1.0-beta":
- "clipper/6.4.2"
- "boost/1.78.0"
- "rapidjson/1.1.0"
- "stb/20200203"
"5.1.0-cura_9365":
- "clipper/6.4.2"
- "boost/1.78.0"
- "rapidjson/1.1.0"
- "stb/20200203"
requirements_arcus:
"5.1.0-beta":
- "protobuf/3.17.1"
- "arcus/[~5.1.0-beta]@ultimaker/stable"
"5.1.0-cura_9365":
- "protobuf/3.17.1"
- "arcus/[~5.1.0-cura_9365]@ultimaker/testing"
| Use semver compatible range modifier | Use semver compatible range modifier
https://docs.conan.io/en/latest/versioning/version_ranges.html
It won't match build metadata otherwise.
The order of search for matching versions is as follows:
First, the local conan storage is searched for matching versions, unless the --update flag is provided to conan install.
If a matching version is found, it is used in the dependency graph as a solution.
If no matching version is locally found, it starts to search in the remotes, in order. If some remote is specified with -r=remote, then only that remote will be used.
If the --update parameter is used, then the existing packages in the local conan cache will not be used, and the same search of the previous steps is carried out in the remotes. If new matching versions are found, they will be retrieved, so subsequent calls to install will find them locally and use them.
Contributes to CURA-9365
| YAML | agpl-3.0 | Ultimaker/CuraEngine,Ultimaker/CuraEngine |
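The record above swaps exact `arcus` pins for tilde version ranges so that sibling builds of the same base version resolve. As a rough illustration of why that matters, here is a simplified Python sketch of tilde matching (`matches_tilde` is a made-up helper, not Conan's actual resolver, which applies fuller semver and pre-release/build-metadata rules this sketch ignores):

```python
import re

def matches_tilde(spec, version):
    """Toy check for a tilde range like "~5.1.0": accept any version
    whose numeric core shares the same major.minor pair. Unlike an
    exact pin such as "5.1.0-beta", this also covers sibling builds
    like "5.1.0-cura_9365"."""
    def core(v):
        # Strip a leading "~" and keep only the major.minor digits.
        return re.match(r"(\d+)\.(\d+)", v.lstrip("~")).groups()
    return core(spec) == core(version)

# The exact pin "arcus/5.1.0-beta" only ever matched that one build;
# under this toy model the range also resolves the CURA-9365 variant.
print(matches_tilde("~5.1.0-beta", "5.1.0-cura_9365"))  # True
```

The same idea explains the two conandata.yml entries: one tilde range per build variant, each resolving within its own 5.1.x line.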
45db74b586ac9215939f61b97a1e67979ca95221 | roles/consul/handlers/main.yml | roles/consul/handlers/main.yml | ---
- name: reload consul
sudo: yes
command: systemctl reload consul
notify:
- wait for consul to listen
- name: restart consul
sudo: yes
command: systemctl restart consul
notify:
- wait for consul to listen
- name: wait for consul to listen
wait_for:
port: 8500
delay: 10
- name: restart nginx-consul
sudo: yes
command: systemctl restart nginx-consul
| ---
- name: restart consul
sudo: yes
command: systemctl restart consul
notify:
- wait for consul to listen
- name: restart nginx-consul
sudo: yes
command: systemctl restart nginx-consul
| Remove handler tasks that are duplicates of the handlers/ role | Remove handler tasks that are duplicates of the handlers/ role
| YAML | apache-2.0 | ianscrivener/microservices-infrastructure,sehqlr/mantl,revpoint/microservices-infrastructure,ludovicc/microservices-infrastructure,sudhirpandey/microservices-infrastructure,linearregression/microservices-infrastructure,noelbk/microservices-infrastructure,cmgc/microservices-infrastructure,TeaBough/microservices-infrastructure,mehulsbhatt/microservices-infrastructure,CiscoCloud/mantl,z00223295/microservices-infrastructure,mantl/mantl,kenjones-cisco/microservices-infrastructure,huodon/microservices-infrastructure,CiscoCloud/microservices-infrastructure,futuro/microservices-infrastructure,z00223295/microservices-infrastructure,benschumacher/microservices-infrastructure,linearregression/microservices-infrastructure,noelbk/microservices-infrastructure,SillyMoo/microservices-infrastructure,huodon/microservices-infrastructure,revpoint/microservices-infrastructure,Parkayun/microservices-infrastructure,huodon/microservices-infrastructure,datascienceinc/mantl,ddONGzaru/microservices-infrastructure,sudhirpandey/microservices-infrastructure,chrislovecnm/microservices-infrastructure,benschumacher/microservices-infrastructure,futuro/microservices-infrastructure,arminc/microservices-infrastructure,ianscrivener/microservices-infrastructure,ContainerSolutions/microservices-infrastructure,pinterb/microservices-infrastructure,cmgc/microservices-infrastructure,CiscoCloud/mantl,mehulsbhatt/microservices-infrastructure,remmelt/microservices-infrastructure,benschumacher/microservices-infrastructure,KaGeN101/mantl,pinterb/microservices-infrastructure,benschumacher/microservices-infrastructure,eirslett/microservices-infrastructure,chrislovecnm/microservices-infrastructure,pinterb/microservices-infrastructure,ContainerSolutions/microservices-infrastructure,eirslett/microservices-infrastructure,benschumacher/microservices-infrastructure,ianscrivener/microservices-infrastructure,gtcno/microservices-infrastructure,gtcno/microservices-infrastructure,mantl/mantl,SillyMoo/micr
oservices-infrastructure,arminc/microservices-infrastructure,ilboud/microservices-infrastructure,datascienceinc/mantl,kindlyops/microservices-infrastructure,remmelt/microservices-infrastructure,revpoint/microservices-infrastructure,chrislovecnm/microservices-infrastructure,eirslett/microservices-infrastructure,bitium/mantl,liangyali/microservices-infrastructure,CiscoCloud/microservices-infrastructure,chrislovecnm/microservices-infrastructure,arminc/microservices-infrastructure,phnmnl/mantl,heww/microservices-infrastructure,abn/microservices-infrastructure,ianscrivener/microservices-infrastructure,phnmnl/mantl,ianscrivener/microservices-infrastructure,linearregression/microservices-infrastructure,ddONGzaru/microservices-infrastructure,ludovicc/microservices-infrastructure,kindlyops/microservices-infrastructure,heww/microservices-infrastructure,kenjones-cisco/microservices-infrastructure,huodon/microservices-infrastructure,ilboud/microservices-infrastructure,TeaBough/microservices-infrastructure,abn/microservices-infrastructure,chrislovecnm/microservices-infrastructure,sehqlr/mantl,bitium/mantl,cmgc/microservices-infrastructure,KaGeN101/mantl,mehulsbhatt/microservices-infrastructure,z00223295/microservices-infrastructure,ilboud/microservices-infrastructure,abn/microservices-infrastructure,Parkayun/microservices-infrastructure,liangyali/microservices-infrastructure,huodon/microservices-infrastructure |
faba59c54d17e74f673c3d17b14652f640bf8612 | roles/glance/defaults/main.yml | roles/glance/defaults/main.yml | ---
glance:
rev: def65d3a891a15a671f0713d3dbc9563228af9a7
api_workers: 5
registry_workers: 5
sync:
enabled: true
listening_port: 25307
download_limit: 0
upload_limit: 0
folder_rescan_interval: 600
storage_path: /var/lib/btsync
device_name: image-cache
dir: /var/lib/glance/images
store_swift: False
logs:
- paths:
- /var/log/glance/glance-api.log
fields:
type: openstack
tags: glance,glance-api
- paths:
- /var/log/glance/glance-manage.log
fields:
type: openstack
tags: glance,glance-manage
- paths:
- /var/log/glance/glance-registry.log
fields:
type: openstack
tags: glance,glance-registry
logging:
debug: False
verbose: True
| ---
glance:
rev: 42564be7ea133313a52639c27172d035390da1f2
api_workers: 5
registry_workers: 5
sync:
enabled: true
listening_port: 25307
download_limit: 0
upload_limit: 0
folder_rescan_interval: 600
storage_path: /var/lib/btsync
device_name: image-cache
dir: /var/lib/glance/images
store_swift: False
logs:
- paths:
- /var/log/glance/glance-api.log
fields:
type: openstack
tags: glance,glance-api
- paths:
- /var/log/glance/glance-manage.log
fields:
type: openstack
tags: glance,glance-manage
- paths:
- /var/log/glance/glance-registry.log
fields:
type: openstack
tags: glance,glance-registry
logging:
debug: False
verbose: True
| Update glance to fix traversal bug | Update glance to fix traversal bug
https://bugs.launchpad.net/glance/+bug/1400966
| YAML | mit | edtubillara/ursula,fancyhe/ursula,fancyhe/ursula,narengan/ursula,msambol/ursula,pgraziano/ursula,aldevigi/ursula,zrs233/ursula,twaldrop/ursula,nirajdp76/ursula,davidcusatis/ursula,MaheshIBM/ursula,blueboxgroup/ursula,masteinhauser/ursula,j2sol/ursula,EricCrosson/ursula,kennjason/ursula,blueboxgroup/ursula,j2sol/ursula,allomov/ursula,narengan/ursula,fancyhe/ursula,fancyhe/ursula,jwaibel/ursula,lihkin213/ursula,wupeiran/ursula,paulczar/ursula,pgraziano/ursula,edtubillara/ursula,wupeiran/ursula,ddaskal/ursula,ryshah/ursula,knandya/ursula,persistent-ursula/ursula,dlundquist/ursula,pbannister/ursula,panxia6679/ursula,kennjason/ursula,rongzhus/ursula,masteinhauser/ursula,ddaskal/ursula,panxia6679/ursula,channus/ursula,j2sol/ursula,persistent-ursula/ursula,nirajdp76/ursula,MaheshIBM/ursula,pgraziano/ursula,andrewrothstein/ursula,zrs233/ursula,panxia6679/ursula,MaheshIBM/ursula,channus/ursula,pbannister/ursula,ryshah/ursula,ryshah/ursula,rongzhus/ursula,jwaibel/ursula,nirajdp76/ursula,ryshah/ursula,jwaibel/ursula,mjbrewer/ursula,twaldrop/ursula,lihkin213/ursula,channus/ursula,knandya/ursula,allomov/ursula,masteinhauser/ursula,allomov/ursula,dlundquist/ursula,greghaynes/ursula,kennjason/ursula,nirajdp76/ursula,pgraziano/ursula,zrs233/ursula,narengan/ursula,edtubillara/ursula,edtubillara/ursula,EricCrosson/ursula,rongzhus/ursula,rongzhus/ursula,knandya/ursula,ddaskal/ursula,twaldrop/ursula,andrewrothstein/ursula,sivakom/ursula,lihkin213/ursula,aldevigi/ursula,pbannister/ursula,channus/ursula,j2sol/ursula,narengan/ursula,wupeiran/ursula,zrs233/ursula,msambol/ursula,panxia6679/ursula,sivakom/ursula,masteinhauser/ursula,persistent-ursula/ursula,persistent-ursula/ursula,wupeiran/ursula,dlundquist/ursula,greghaynes/ursula,aldevigi/ursula,paulczar/ursula,EricCrosson/ursula,twaldrop/ursula,davidcusatis/ursula,mjbrewer/ursula,lihkin213/ursula,msambol/ursula,andrewrothstein/ursula,ddaskal/ursula,davidcusatis/ursula,greghaynes/ursula,blueboxgroup/ursula,paulczar/ursula,m
jbrewer/ursula,blueboxgroup/ursula,knandya/ursula,sivakom/ursula |
c706abcf1923aa8235572ce1303e5f096ec421f7 | packages/tr/tree-sitter-ruby.yaml | packages/tr/tree-sitter-ruby.yaml | homepage: https://github.com/tree-sitter/tree-sitter-ruby#readme
changelog-type: ''
hash: e7aa06c8fc5e48714e82d30c64ceda20bbefb2120dd0e6d3e8bcda9b1a3fb86b
test-bench-deps: {}
maintainer: tclem@github.com
synopsis: Tree-sitter grammar/parser for Ruby
changelog: ''
basic-deps:
base: ! '>=4.12 && <5'
tree-sitter-ruby-internal: -any
tree-sitter: ^>=0.1.0.0
template-haskell: ! '>=2.12.0.0 && <2.15.0.0'
all-versions:
- 0.1.0.0
author: Max Brunsfeld, Tim Clem, Rob Rix, Josh Vera, Rick Winfrey, Ayman Nadeem, Patrick
Thomson
latest: 0.1.0.0
description-type: haddock
description: This package provides a parser for Ruby suitable for use with the tree-sitter
package.
license-name: BSD-3-Clause
| homepage: https://github.com/tree-sitter/tree-sitter-ruby#readme
changelog-type: ''
hash: 1fdc2c5c06c58dd967ef224150ed11e541b25887db69ea3a071ee8739e69906f
test-bench-deps: {}
maintainer: tclem@github.com
synopsis: Tree-sitter grammar/parser for Ruby
changelog: ''
basic-deps:
base: ! '>=4.12 && <5'
tree-sitter-ruby-internal: -any
tree-sitter: ! '>=0.1 && <0.3'
template-haskell: ! '>=2.12.0.0 && <2.15.0.0'
all-versions:
- 0.1.0.0
author: Max Brunsfeld, Tim Clem, Rob Rix, Josh Vera, Rick Winfrey, Ayman Nadeem, Patrick
Thomson
latest: 0.1.0.0
description-type: haddock
description: This package provides a parser for Ruby suitable for use with the tree-sitter
package.
license-name: BSD-3-Clause
| Update from Hackage at 2019-09-09T15:18:52Z | Update from Hackage at 2019-09-09T15:18:52Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
fab3dfda8329984d74cefd816a61d8f4431ba3d8 | roles/irc/tasks/notifications.yml | roles/irc/tasks/notifications.yml | ---
- get_url: url=https://raw.githubusercontent.com/m4v/inotify-daemon/stable/inotify-daemon
dest=~/.bin/inotify-daemon
- file: path=~/.bin/inotify-daemon
mode=0755
- copy: dest="~/.config/autostart/IRC Notifier.desktop"
src="IRC Notifier.desktop"
| ---
- get_url: url=https://raw.githubusercontent.com/m4v/inotify-daemon/stable/inotify-daemon
dest=~/.bin/inotify-daemon
- file: path=~/.bin/inotify-daemon
mode=0755
- file: path="~/.config/autostart"
state=directory
- copy: dest="~/.config/autostart/IRC Notifier.desktop"
src="IRC Notifier.desktop"
| Create autostart directory for Gnome | Create autostart directory for Gnome
| YAML | mit | ryansb/workstation,ryansb/workstation,ryansb/workstation |
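The fix in this record is purely about ordering: ensure the target directory exists before copying a file into it. A minimal Python sketch of the same pattern (illustrative only, not Ansible; `install_autostart_entry` is an invented name):

```python
import os
import shutil
import tempfile

def install_autostart_entry(src, home):
    """Sketch of the fixed playbook order: create ~/.config/autostart
    first, then copy the .desktop file into it. Without the makedirs
    step (the `state=directory` task the diff adds), the copy fails on
    a fresh GNOME account where the directory does not exist yet."""
    autostart = os.path.join(home, ".config", "autostart")
    os.makedirs(autostart, exist_ok=True)
    return shutil.copy(src, autostart)

# Demo against a throwaway "home" directory:
home = tempfile.mkdtemp()
src = os.path.join(home, "IRC Notifier.desktop")
with open(src, "w") as f:
    f.write("[Desktop Entry]\n")
dest = install_autostart_entry(src, home)
print(os.path.exists(dest))  # True
```

`exist_ok=True` mirrors Ansible's idempotent `state=directory`: re-running the task when the directory already exists is a no-op rather than an error.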
635b0e115da9428e6ad1a749c6f0fd108a8cafbd | packages/zi/zipper-extra.yaml | packages/zi/zipper-extra.yaml | homepage: ''
changelog-type: markdown
hash: 5628e172543a45def5106158b138d5e8ff7cf4c91b50dad71098cf17d0b486cf
test-bench-deps: {}
maintainer: dan.firth@homotopic.tech
synopsis: Zipper utils that weren't in Control.Comonad.Store.Zipper
changelog: |
# Changelog for zipper-extra
## v0.1.0.0
Reexport `Control.Comonad.Store.Zipper` with some utility functions.
basic-deps:
split: -any
base: '>=4.7 && <5'
comonad: -any
comonad-extras: -any
all-versions:
- 0.1.0.0
- 0.1.0.1
author: Daniel Firth
latest: 0.1.0.1
description-type: markdown
description: |
# zipper-extra
Zipper utils that weren't in `Control.Comonad.Store.Zipper`.
license-name: MIT
| homepage: ''
changelog-type: markdown
hash: 41bf31d05f8f647fd2a6d7d9b13018829ce9145a6ec03a474d264023d9d372b4
test-bench-deps: {}
maintainer: dan.firth@homotopic.tech
synopsis: Zipper utils that weren't in Control.Comonad.Store.Zipper
changelog: |
# Changelog for zipper-extra
## v0.1.1.0
Add paginate' and exception handling for pagination failure.
## v0.1.0.0
Reexport `Control.Comonad.Store.Zipper` with some utility functions.
basic-deps:
exceptions: -any
split: -any
base: '>=4.7 && <5'
comonad: -any
comonad-extras: -any
all-versions:
- 0.1.0.0
- 0.1.0.1
- 0.1.1.0
author: Daniel Firth
latest: 0.1.1.0
description-type: markdown
description: |
# zipper-extra
Zipper utils that weren't in `Control.Comonad.Store.Zipper`.
license-name: MIT
| Update from Hackage at 2020-05-14T13:18:17Z | Update from Hackage at 2020-05-14T13:18:17Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
e3311fcb1fe97870e0daea6b912b6059c0de97ff | metadata/awais.instagrabber.yml | metadata/awais.instagrabber.yml | AntiFeatures:
- NonFreeNet
Categories:
- Internet
License: GPL-3.0-or-later
WebSite: https://gitlab.com/AwaisKing/Instagrabber
SourceCode: https://gitlab.com/AwaisKing/Instagrabber/tree/HEAD
IssueTracker: https://gitlab.com/AwaisKing/Instagrabber/issues
AutoName: InstaGrabber
RepoType: git
Repo: https://gitlab.com/AwaisKing/Instagrabber.git
Builds:
- versionName: 4.0-fdroid
versionCode: 4
commit: v4.0
subdir: app
gradle:
- fdroid
- versionName: 6.0-fdroid
versionCode: 6
commit: v6.0
subdir: app
gradle:
- fdroid
- versionName: 7.0-fdroid
versionCode: 7
commit: v7.0
subdir: app
gradle:
- fdroid
- versionName: 8.0-fdroid
versionCode: 8
commit: v8.0
subdir: app
gradle:
- fdroid
- versionName: 10.0-fdroid
versionCode: 10
commit: v10.0-fdroid
subdir: app
gradle:
- fdroid
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: 10.0-fdroid
CurrentVersionCode: 10
| AntiFeatures:
- NonFreeNet
Categories:
- Internet
License: GPL-3.0-or-later
WebSite: https://gitlab.com/AwaisKing/Instagrabber
SourceCode: https://gitlab.com/AwaisKing/Instagrabber/tree/HEAD
IssueTracker: https://gitlab.com/AwaisKing/Instagrabber/issues
AutoName: InstaGrabber
RepoType: git
Repo: https://gitlab.com/AwaisKing/Instagrabber.git
Builds:
- versionName: 4.0-fdroid
versionCode: 4
commit: v4.0
subdir: app
gradle:
- fdroid
- versionName: 6.0-fdroid
versionCode: 6
commit: v6.0
subdir: app
gradle:
- fdroid
- versionName: 7.0-fdroid
versionCode: 7
commit: v7.0
subdir: app
gradle:
- fdroid
- versionName: 8.0-fdroid
versionCode: 8
commit: v8.0
subdir: app
gradle:
- fdroid
- versionName: 10.0-fdroid
versionCode: 10
commit: v10.0-fdroid
subdir: app
gradle:
- fdroid
- versionName: 11.0-fdroid
versionCode: 11
commit: v11.0-fdroid
subdir: app
gradle:
- fdroid
AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
CurrentVersion: 11.0-fdroid
CurrentVersionCode: 11
| Update InstaGrabber to 11.0-fdroid (11) | Update InstaGrabber to 11.0-fdroid (11)
| YAML | agpl-3.0 | f-droid/fdroiddata,f-droid/fdroiddata |
e98a24e5a9cb2d6983fa1bd6164c166db9139fea | recipes/firefox/meta.yaml | recipes/firefox/meta.yaml | {% set version = "68.0.2" %}
package:
name: firefox
version: {{ version }}
source:
url: https://ftp.mozilla.org/pub/firefox/releases/{{ version }}/linux-x86_64/en-US/firefox-{{ version }}.tar.bz2 # [linux64]
sha256: 284f58b5ee75daec5eaf8c994fe2c8b14aff6c65331e5deeaed6ba650673357c # [linux64]
build:
number: 0
ignore_prefix_files: True
binary_relocation: False
about:
home: https://www.mozilla.org/en-US/firefox/releases
license: MPL-2.0
license_family: OTHER
license_file: LICENSE
summary: 'Firefox web browser'
extra:
recipe-maintainers:
- birdsarah
- ocefpaf
| {% set version = "68.0.2" %}
package:
name: firefox
version: {{ version }}
source:
url: https://ftp.mozilla.org/pub/firefox/releases/{{ version }}/linux-x86_64/en-US/firefox-{{ version }}.tar.bz2 # [linux64]
sha256: 284f58b5ee75daec5eaf8c994fe2c8b14aff6c65331e5deeaed6ba650673357c # [linux64]
build:
number: 0
ignore_prefix_files: True
binary_relocation: False
test:
commands:
- test -f $PREFIX/bin/firefox # [not win]
about:
home: https://www.mozilla.org/en-US/firefox/releases
license: MPL-2.0
license_family: OTHER
license_file: LICENSE
summary: 'Firefox web browser'
extra:
recipe-maintainers:
- birdsarah
- ocefpaf
| Add test for existence of bin | Add test for existence of bin
| YAML | bsd-3-clause | conda-forge/staged-recipes,jakirkham/staged-recipes,dschreij/staged-recipes,jochym/staged-recipes,conda-forge/staged-recipes,ocefpaf/staged-recipes,petrushy/staged-recipes,SylvainCorlay/staged-recipes,mariusvniekerk/staged-recipes,kwilcox/staged-recipes,goanpeca/staged-recipes,patricksnape/staged-recipes,jakirkham/staged-recipes,stuertz/staged-recipes,asmeurer/staged-recipes,birdsarah/staged-recipes,mcs07/staged-recipes,birdsarah/staged-recipes,kwilcox/staged-recipes,hadim/staged-recipes,igortg/staged-recipes,scopatz/staged-recipes,dschreij/staged-recipes,synapticarbors/staged-recipes,petrushy/staged-recipes,asmeurer/staged-recipes,ReimarBauer/staged-recipes,chrisburr/staged-recipes,jochym/staged-recipes,hadim/staged-recipes,mcs07/staged-recipes,stuertz/staged-recipes,SylvainCorlay/staged-recipes,Juanlu001/staged-recipes,ReimarBauer/staged-recipes,chrisburr/staged-recipes,johanneskoester/staged-recipes,ocefpaf/staged-recipes,synapticarbors/staged-recipes,mariusvniekerk/staged-recipes,scopatz/staged-recipes,isuruf/staged-recipes,johanneskoester/staged-recipes,goanpeca/staged-recipes,igortg/staged-recipes,Juanlu001/staged-recipes,patricksnape/staged-recipes,isuruf/staged-recipes |
7500175edd082c0733d4c66d0109cca7207aa058 | packages/al/alg.yaml | packages/al/alg.yaml | homepage: ''
changelog-type: ''
hash: cb277a81edb0626c252ec333a4e8c55567329e3f949eef24e95b34e9944a6aeb
test-bench-deps: {}
maintainer: strake888@gmail.com
synopsis: Algebraic structures
changelog: ''
basic-deps:
base: ! '>=4.9 && <5'
util: ! '>=0.1.5 && <0.2'
all-versions:
- '0.2.0.0'
- '0.2.1.0'
- '0.2.2.0'
author: M Farkas-Dyck
latest: '0.2.2.0'
description-type: haddock
description: ''
license-name: BSD3
| homepage: ''
changelog-type: ''
hash: 026d651cf1986446406ae52498ee59ec1f66cf52f00e13a7ac3fb25aac6ce926
test-bench-deps: {}
maintainer: strake888@gmail.com
synopsis: Algebraic structures
changelog: ''
basic-deps:
base: ! '>=4.9 && <5'
util: ! '>=0.1.9 && <0.2'
all-versions:
- '0.2.0.0'
- '0.2.1.0'
- '0.2.2.0'
- '0.2.2.1'
author: M Farkas-Dyck
latest: '0.2.2.1'
description-type: haddock
description: ''
license-name: BSD3
| Update from Hackage at 2018-05-07T06:33:18Z | Update from Hackage at 2018-05-07T06:33:18Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
049f90a51a6e5c97072a5319f9aa3d067e1193d4 | behat.yml | behat.yml | default:
extensions:
MvLabs\Zf2Extension\Zf2Extension: ~
Behat\MinkExtension:
base_url: 'http://localhost:8085'
sessions:
default:
zombie:
node_modules_path: /Users/Christopher/.npm-packages/lib/node_modules/
| default:
extensions:
MvLabs\Zf2Extension\Zf2Extension: ~
Behat\MinkExtension:
base_url: 'http://localhost:8085'
sessions:
default:
zombie: ~
| Change node module path to env variable | Change node module path to env variable
| YAML | bsd-3-clause | chris-fa/zend-blog-tutorial,chris-fa/zend-blog-tutorial,chris-fa/zend-blog-tutorial |
f1479a52f9efe87310d26155096cf2370eb628ce | packages/gi/gi-cairo-render.yaml | packages/gi/gi-cairo-render.yaml | homepage: https://github.com/cohomology/gi-cairo-render
changelog-type: ''
hash: a56ff8a2b30f60b62113fe9a9c68cda583d3da4d75f13c8781d4aa211fc99410
test-bench-deps: {}
maintainer: Kilian Kilger (kkilger@gmail.com)
synopsis: GI friendly Binding to the Cairo library.
changelog: ''
basic-deps:
haskell-gi-base: ! '>=0.21.0 && <0.22'
bytestring: -any
base: ! '>=4 && <5'
text: ! '>=1.0.0.0 && <1.3'
array: -any
utf8-string: ! '>=0.2 && <1.1'
mtl: -any
all-versions:
- 0.0.1
author: Axel Simon, Duncan Coutts
latest: 0.0.1
description-type: haddock
description: |-
Cairo is a library to render high quality vector graphics. There
exist various backends that allows rendering to Gtk windows, PDF,
PS, PNG and SVG documents, amongst others.
license-name: BSD-3-Clause
| homepage: https://github.com/cohomology/gi-cairo-render
changelog-type: ''
hash: ff2ccc309c021c2c023fa0d380375ef36cff2df93e0c78ed733f052dd1aa9782
test-bench-deps: {}
maintainer: Kilian Kilger (kkilger@gmail.com)
synopsis: GI friendly Binding to the Cairo library.
changelog: ''
basic-deps:
haskell-gi-base: ! '>=0.21.0 && <1'
bytestring: -any
base: ! '>=4 && <5'
text: ! '>=1.0.0.0 && <1.3'
array: -any
utf8-string: ! '>=0.2 && <1.1'
mtl: -any
all-versions:
- 0.0.1
author: Axel Simon, Duncan Coutts
latest: 0.0.1
description-type: haddock
description: |-
Cairo is a library to render high quality vector graphics. There
exist various backends that allows rendering to Gtk windows, PDF,
PS, PNG and SVG documents, amongst others.
license-name: BSD-3-Clause
| Update from Hackage at 2019-06-16T23:58:20Z | Update from Hackage at 2019-06-16T23:58:20Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
966f5c68bd6dd4e782e4d80a0c59704166a57fd7 | packages/gn/gnss-converters.yaml | packages/gn/gnss-converters.yaml | homepage: http://github.com/swift-nav/gnss-converters
changelog-type: ''
hash: c0fc72a6ab1948f25b3315f27cec2292ea5814e5460d21590706e71b189d6450
test-bench-deps:
base: -any
basic-prelude: -any
criterion: -any
gnss-converters: -any
tasty-hunit: -any
tasty: -any
maintainer: Mark Fine <dev@swiftnav.com>
synopsis: GNSS Converters.
changelog: ''
basic-deps:
base: ! '>=4.7 && <5'
time: -any
basic-prelude: -any
rtcm: -any
conduit: -any
conduit-extra: -any
sbp: -any
lens: -any
binary-conduit: -any
gnss-converters: -any
resourcet: -any
all-versions:
- '0.1.0'
- '0.1.1'
- '0.1.2'
- '0.1.3'
- '0.1.4'
- '0.1.5'
- '0.1.6'
author: Swift Navigation Inc.
latest: '0.1.6'
description-type: haddock
description: Haskell bindings for GNSS converters.
license-name: BSD3
| homepage: http://github.com/swift-nav/gnss-converters
changelog-type: ''
hash: 3e129ab23dbb8b7d113319f1ae1fc838fb40468fdf9574d1a1a1c511725bfebd
test-bench-deps:
base: -any
basic-prelude: -any
criterion: -any
gnss-converters: -any
tasty-hunit: -any
tasty: -any
maintainer: Mark Fine <dev@swiftnav.com>
synopsis: GNSS Converters.
changelog: ''
basic-deps:
base: ! '>=4.7 && <5'
time: -any
basic-prelude: -any
rtcm: -any
conduit: -any
conduit-extra: -any
sbp: -any
lens: -any
binary-conduit: -any
gnss-converters: -any
resourcet: -any
all-versions:
- '0.1.0'
- '0.1.1'
- '0.1.2'
- '0.1.3'
- '0.1.4'
- '0.1.5'
- '0.1.6'
- '0.1.7'
author: Swift Navigation Inc.
latest: '0.1.7'
description-type: haddock
description: Haskell bindings for GNSS converters.
license-name: BSD3
| Update from Hackage at 2016-04-21T00:19:56+0000 | Update from Hackage at 2016-04-21T00:19:56+0000
| YAML | mit | commercialhaskell/all-cabal-metadata |
0e07d84d0a6e9c76d687e5d2e47b71d6805d3f10 | stable/ipfs/Chart.yaml | stable/ipfs/Chart.yaml | apiVersion: v1
description: A Helm chart for the Interplanetary File System
name: ipfs
version: 0.1.0
icon: https://raw.githubusercontent.com/ipfs/logo/master/raster-generated/ipfs-logo-128-ice-text.png
home: https://ipfs.io/
keywords:
- ipfs
- distributed-web
sources:
- https://github.com/ipfs/go-ipfs
maintainers:
- name: Yuvi Panda
email: yuvipanda@gmail.com
| apiVersion: v1
description: A Helm chart for the Interplanetary File System
name: ipfs
version: 0.1.0
icon: https://raw.githubusercontent.com/ipfs/logo/master/raster-generated/ipfs-logo-128-ice-text.png
home: https://ipfs.io/
keywords:
- ipfs
- distributed-web
sources:
- https://github.com/ipfs/go-ipfs
maintainers:
- name: YuviPanda
email: yuvipanda@gmail.com
| Make maintainer name a valid GitHub handle | [stable/ipfs] Make maintainer name a valid GitHub handle
| YAML | apache-2.0 | mlaccetti/charts,grugnog/charts,mlaccetti/charts,trobert2/charts-1,smelchior/charts,jainishshah17/charts,ledor473/charts,rimusz/helm-charts,smileisak/charts,jainishshah17/charts,oleh-ozimok/charts,rimusz/helm-charts,helm/charts,trobert2/charts-1,samisms/charts,mluedke/charts,mluedke/charts,sstarcher/charts,caiwenhao/charts,GaneshSPatil/charts,helm/charts,GaneshSPatil/charts,helm/charts,jainishshah17/charts,sstarcher/charts,lachie83/charts,smileisak/charts,viglesiasce/charts,kachkaev/charts,lachie83/charts,escardin/charts,sd2k/charts,smelchior/charts,GaneshSPatil/charts,viglesiasce/charts,oleh-ozimok/charts,smileisak/charts,oleh-ozimok/charts,grugnog/charts,caiwenhao/charts,sstarcher/charts,escardin/charts,smelchior/charts,lachie83/charts,rimusz/helm-charts,kachkaev/charts,viglesiasce/charts,lachie83/charts,samisms/charts,sd2k/charts,samisms/charts,rimusz/helm-charts,kachkaev/charts,ledor473/charts,mluedke/charts,helm/charts,GaneshSPatil/charts,caiwenhao/charts,grugnog/charts,sd2k/charts,ledor473/charts,sd2k/charts,trobert2/charts-1,escardin/charts,mlaccetti/charts |
2ebd1fa495cef35c3730e7e7281f01f7db20351e | packages/ar/arbor-postgres.yaml | packages/ar/arbor-postgres.yaml | homepage: https://github.com/arbor/arbor-postgres#readme
changelog-type: markdown
hash: 9c7b667f39876746ef4ea55c70ec047840e158a2bd02c12c749e34b12f3c20eb
test-bench-deps: {}
maintainer: mayhem@arbor.net
synopsis: Convenience types and functions for postgresql-simple.
changelog: |
# Changelog for arbor-postgres
## Unreleased changes
basic-deps:
bytestring: ! '>=0.10 && <0.11'
base: ! '>=4.7 && <5'
text: ! '>=1.2 && <1.3'
lens: ! '>=4.16 && <5'
postgresql-simple: ! '>=0.5 && <0.7'
network-uri: ! '>=2.6 && <3'
generic-lens: ! '>=1.2.0.1 && <2'
optparse-applicative: ! '>=0.14 && <0.16'
all-versions:
- 0.0.1
- 0.0.2
- 0.0.3
- 0.0.4
- 0.0.5
author: Arbor Networks
latest: 0.0.5
description-type: markdown
description: |
# arbor-postgres
Convenience types and functions to make `postgresql-simple` easier and
less error-prone to use.
license-name: MIT
| homepage: https://github.com/arbor/arbor-postgres#readme
changelog-type: markdown
hash: 69221fe6326e2ce95fcdd98a4d400e18dfd51509a6dd2819f752d98ca0fba424
test-bench-deps: {}
maintainer: mayhem@arbor.net
synopsis: Convenience types and functions for postgresql-simple.
changelog: |
# Changelog for arbor-postgres
## Unreleased changes
basic-deps:
bytestring: '>=0.10 && <0.11'
base: '>=4.7 && <5'
text: '>=1.2 && <1.3'
lens: '>=4.16 && <5'
postgresql-simple: '>=0.5 && <0.7'
network-uri: '>=2.6 && <3'
generic-lens: '>=1.2.0.1 && <3'
optparse-applicative: '>=0.14 && <0.16'
all-versions:
- 0.0.1
- 0.0.2
- 0.0.3
- 0.0.4
- 0.0.5
author: Arbor Networks
latest: 0.0.5
description-type: markdown
description: |
# arbor-postgres
Convenience types and functions to make `postgresql-simple` easier and
less error-prone to use.
license-name: MIT
| Update from Hackage at 2020-10-17T10:11:34Z | Update from Hackage at 2020-10-17T10:11:34Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
f462b6c2c03704a01360cc26e34931e177d75771 | .github/ISSUE_TEMPLATE/config.yml | .github/ISSUE_TEMPLATE/config.yml | blank_issues_enabled: false
contact_links:
- name: Discord Support
url: https://biblebot.xyz/discord
about: Join our support Discord for issues regarding how to use the bot and the service itself.
- name: Email
url: mailto:team@kerygma.digital
about: Send us an email! | blank_issues_enabled: true
contact_links:
- name: Discord Support
url: https://biblebot.xyz/discord
about: Join our support Discord for issues regarding how to use the bot and the service itself.
- name: Email
url: mailto:team@kerygma.digital
about: Send us an email! | Allow for blank issues. | Allow for blank issues. [ci skip]
| YAML | mpl-2.0 | BibleBot/BibleBot,BibleBot/BibleBot,BibleBot/BibleBot |
61e334be764dec12ac6a2e95959b2d31460bb562 | .github/workflows/build-test.yaml | .github/workflows/build-test.yaml | name: Node.js CI
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
node-version: [16.x, 18.x, 19.x]
steps:
- uses: actions/checkout@v3
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v3
with:
node-version: ${{ matrix.node-version }}
- name: Install dependencies
run: npm install
- run: npm run build --if-present
- run: npm test | name: Node.js CI
on:
push:
branches: [ main ]
pull_request:
branches: [ main ]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
node-version: [16.x, 18.x, 19.x]
steps:
- uses: actions/checkout@v3
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v3
with:
node-version: ${{ matrix.node-version }}
- name: Install dependencies
run: npm install
- run: npm run build --if-present
- run: npm test | Update workflow after renaming master -> main | Update workflow after renaming master -> main
| YAML | mit | jaakkos/winston-logstash,jaakkos/winston-logstash |
bd9b2ad786a4a95fb92b9c6ac614a2c68279ad68 | .github/workflows/deploy-data.yml | .github/workflows/deploy-data.yml | name: Validate/Deploy Data
on:
push: { branches: [master] }
jobs:
validate-then-deploy:
name: Validate, then deploy data
runs-on: ubuntu-latest
env:
task: GH_PAGES
steps:
- uses: actions/setup-node@master
with: {node-version: ^12.0}
- name: Check out the code
uses: actions/checkout@master
- name: Unshallow the history
run: git fetch --prune --unshallow
- name: Install dependencies
run: yarn install --frozen-lockfile
- name: Validate the data
run: |
npm run validate-bus-data
npm run validate-data
# Having validated the data, we now prepare a bundle. This script creates
# files in a docs/ directory.
- name: Bundle the data
run: npm run bundle-data
# Notes: actions/checkout@v2 no longer fetches entire history nor enters
# detached HEAD state. We really just need to check out an (orphaned)
# branch and then add, commit, and push the appropriate directory.
- name: Commit the data
run: |
git checkout --orphan "gh-pages-$GITHUB_SHA"
git --work-tree=docs add .
git commit --author="Data Deployment Bot <>" -m "Automated data deployment at $(date -Is)"
git show --stat HEAD
# If the previous commit successfully happened
- name: Deploy the data
run:
git push --force-with-lease origin "gh-pages-$GITHUB_SHA:gh-pages"
| name: Validate/Deploy Data
on:
push: { branches: [master] }
jobs:
validate-then-deploy:
name: Validate, then deploy data
runs-on: ubuntu-latest
env:
task: GH_PAGES
steps:
- uses: actions/setup-node@master
with: {node-version: ^12.0}
- name: Check out the code
uses: actions/checkout@master
- name: Install dependencies
run: yarn install --frozen-lockfile
- name: Validate the data
run: |
npm run validate-bus-data
npm run validate-data
# Having validated the data, we now prepare a bundle. This script creates
# files in a docs/ directory.
- name: Bundle the data
run: npm run bundle-data
# Notes: actions/checkout@v2 no longer fetches entire history nor enters
# detached HEAD state. We really just need to check out an (orphaned)
# branch and then add, commit, and push the appropriate directory.
- name: Commit the data
run: |
git checkout --orphan "gh-pages-$GITHUB_SHA"
git --work-tree=docs add .
git commit --author="Data Deployment Bot <>" -m "Automated data deployment at $(date -Is)"
git show --stat HEAD
# If the previous commit successfully happened, download the latest state
# of the remote branch gh-pages.
- name: Deploy the data
run: |
git fetch --prune --unshallow origin gh-pages
git push --force-with-lease origin "gh-pages-$GITHUB_SHA:gh-pages"
| Move unshallowing to right before the push | ci: Move unshallowing to right before the push
Signed-off-by: Kristofer Rye <1ed31cfd0b53bc3d1689a6fee6dbfc9507dffd22@gmail.com>
| YAML | agpl-3.0 | StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native,StoDevX/AAO-React-Native |
82f35f236a015f9b57920bcb3a238b14f241a33c | .github/workflows/depsreview.yaml | .github/workflows/depsreview.yaml | name: 'Dependency Review'
on:
push:
branches: [ trunk ]
pull_request:
branches: [ trunk ]
schedule:
- cron: '0 3 * * *'
permissions:
contents: read
jobs:
# Special job which skips duplicate jobs
pre_job:
name: Skip Duplicate Jobs Pre Job
runs-on: ubuntu-latest
# Map a step output to a job output
outputs:
should_skip: ${{ steps.skip_check.outputs.should_skip }}
steps:
- name: Checkout code
uses: actions/checkout@master
with:
persist-credentials: false
submodules: recursive
- id: skip_check
# NOTE: We store action as submodule since ASF doesn't allow directly referencing external
# actions
uses: ./.github/actions/skip-duplicate-actions # v3.4.1
with:
github_token: ${{ github.token }}
dependency-review:
runs-on: ubuntu-latest
needs: pre_job
if: ${{ needs.pre_job.outputs.should_skip == 'false' || github.ref == 'refs/heads/trunk' }}
steps:
- name: 'Checkout Repository'
uses: actions/checkout@v3
- name: 'Dependency Review'
uses: actions/dependency-review-action@v1
| name: 'Dependency Review'
on:
pull_request:
branches: [ trunk ]
permissions:
contents: read
jobs:
# Special job which skips duplicate jobs
pre_job:
name: Skip Duplicate Jobs Pre Job
runs-on: ubuntu-latest
# Map a step output to a job output
outputs:
should_skip: ${{ steps.skip_check.outputs.should_skip }}
steps:
- name: Checkout code
uses: actions/checkout@master
with:
persist-credentials: false
submodules: recursive
- id: skip_check
# NOTE: We store action as submodule since ASF doesn't allow directly referencing external
# actions
uses: ./.github/actions/skip-duplicate-actions # v3.4.1
with:
github_token: ${{ github.token }}
dependency-review:
runs-on: ubuntu-latest
needs: pre_job
if: ${{ needs.pre_job.outputs.should_skip == 'false' || github.ref == 'refs/heads/trunk' }}
steps:
- name: 'Checkout Repository'
uses: actions/checkout@v3
- name: 'Dependency Review'
uses: actions/dependency-review-action@v1
| Fix GHA event trigger for dependency review workflow. | Fix GHA event trigger for dependency review workflow.
| YAML | apache-2.0 | apache/libcloud,apache/libcloud,apache/libcloud |
a1645dfb17acf4118dc4fbed8dc5d41997d775aa | .github/workflows/development.yml | .github/workflows/development.yml | name: Development
on: [push, pull_request]
jobs:
test:
name: ${{matrix.ruby}} on ${{matrix.os}}
runs-on: ${{matrix.os}}-latest
continue-on-error: ${{matrix.experimental}}
strategy:
matrix:
os:
- ubuntu
ruby:
- "2.6"
- "2.7"
- "3.0"
experimental: [false]
env: [""]
include:
- os: ubuntu
ruby: truffleruby
experimental: true
- os: ubuntu
ruby: jruby
experimental: true
- os: ubuntu
ruby: head
experimental: true
steps:
- uses: actions/checkout@v2
- name: Install dependencies
run: sudo apt-get install libcurl4-openssl-dev
- uses: ruby/setup-ruby@v1
with:
ruby-version: ${{matrix.ruby}}
bundler-cache: true
- name: Run tests
timeout-minutes: 5
run: ${{matrix.env}} bundle exec rspec
| name: Development
on: [push, pull_request]
jobs:
test:
name: ${{matrix.ruby}} on ${{matrix.os}}
runs-on: ${{matrix.os}}-latest
continue-on-error: ${{matrix.experimental}}
strategy:
matrix:
os:
- ubuntu
ruby:
- "2.6"
- "2.7"
- "3.0"
- "3.1"
experimental: [false]
env: [""]
include:
- os: ubuntu
ruby: truffleruby
experimental: true
- os: ubuntu
ruby: jruby
experimental: true
- os: ubuntu
ruby: head
experimental: true
steps:
- uses: actions/checkout@v2
- name: Install dependencies
run: sudo apt-get install libcurl4-openssl-dev
- uses: ruby/setup-ruby@v1
with:
ruby-version: ${{matrix.ruby}}
bundler-cache: true
- name: Run tests
timeout-minutes: 5
run: ${{matrix.env}} bundle exec rspec
| Add Ruby 3.1 to build matrix | CI: Add Ruby 3.1 to build matrix | YAML | mit | savonrb/httpi |
8adab8db7f108e08c3ee2e63d496fc1c696b5a81 | packages/hw/hw-diagnostics.yaml | packages/hw/hw-diagnostics.yaml | homepage: http://github.com/haskell-works/hw-diagnostics#readme
changelog-type: ''
hash: 51da20353dba1c73180b4b7d494c61b44c6522eccfccef1a4c694d4e4a9e1713
test-bench-deps:
base: '>=4 && <5'
doctest: '>=0.16.2 && <0.19'
hw-diagnostics: -any
doctest-discover: '>=0.2 && <0.3'
maintainer: newhoggy@gmail.com
synopsis: Diagnostics library
changelog: ''
basic-deps:
base: '>=4 && <5'
all-versions:
- 0.0.0.1
- 0.0.0.2
- 0.0.0.3
- 0.0.0.4
- 0.0.0.5
- 0.0.0.7
- 0.0.1.0
author: John Ky
latest: 0.0.1.0
description-type: markdown
description: |
# hw-diagnostics
[](https://circleci.com/gh/haskell-works/hw-diagnostics)
Simple facilities for debugging code
license-name: BSD-3-Clause
| homepage: http://github.com/haskell-works/hw-diagnostics#readme
changelog-type: ''
hash: a94620b6bae1824eacf9e5adcf501e4f4f17b31015d19d3e12a58bf8efedd155
test-bench-deps:
base: '>=4 && <5'
doctest: '>=0.16.2 && <0.21'
hw-diagnostics: -any
doctest-discover: '>=0.2 && <0.3'
maintainer: newhoggy@gmail.com
synopsis: Diagnostics library
changelog: ''
basic-deps:
base: '>=4 && <5'
all-versions:
- 0.0.0.1
- 0.0.0.2
- 0.0.0.3
- 0.0.0.4
- 0.0.0.5
- 0.0.0.7
- 0.0.1.0
author: John Ky
latest: 0.0.1.0
description-type: markdown
description: |
# hw-diagnostics
[](https://circleci.com/gh/haskell-works/hw-diagnostics)
Simple facilities for debugging code
license-name: BSD-3-Clause
| Update from Hackage at 2022-03-26T01:58:02Z | Update from Hackage at 2022-03-26T01:58:02Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
5868958bfa25ef8a13a293edc250d0439b21a4f2 | pubspec.yaml | pubspec.yaml | name: app
version: 0.1.0
environment:
sdk: '>=1.12.0 <2.0.0'
dependencies:
bridge: '^1.0.0-beta.1'
dev_dependencies:
test: any
grinder: any
| name: app
version: 0.1.0
environment:
sdk: '>=1.12.0 <2.0.0'
dependencies:
bridge:
git:
url: https://github.com/dart-bridge/framework
ref: develop
dev_dependencies:
test: any
grinder: any
| Develop branch should use dev branch of framework | Develop branch should use dev branch of framework
| YAML | mit | dart-bridge/bridge,dart-bridge/bridge |
657bfbe1cf18d17903862e7417cf577a459e92a7 | pubspec.yaml | pubspec.yaml | name: package_config
version: 1.0.1
description: Support for working with Package Resolution config files.
author: Dart Team <misc@dartlang.org>
homepage: https://github.com/dart-lang/package_config
environment:
sdk: '>=1.11.0-dev.0.0 <2.0.0'
dependencies:
charcode: ^1.1.0
path: ^1.0.0
dev_dependencies:
test: '>=0.12.0 <0.13.0'
| name: package_config
version: 1.0.2
description: Support for working with Package Resolution config files.
author: Dart Team <misc@dartlang.org>
homepage: https://github.com/dart-lang/package_config
environment:
sdk: '>=1.11.0 <2.0.0-dev.infinity'
dependencies:
charcode: ^1.1.0
path: ^1.0.0
dev_dependencies:
test: ^0.12.0
| Update SDK constraint to be 2.0.0 dev friendly. | Update SDK constraint to be 2.0.0 dev friendly. | YAML | bsd-3-clause | dart-lang/package_config |
17a2ebf9971c94ca80b429e7eee15c6fd6893791 | packages/qu/quantification.yaml | packages/qu/quantification.yaml | homepage: https://github.com/andrewthad/quantification#readme
changelog-type: ''
hash: 42909352b2c00a2838883bac1c15a7cf6210fbbaaa301fb6471eca7e4cc7c30c
test-bench-deps: {}
maintainer: andrew.thaddeus@gmail.com
synopsis: Rage against the quantification
changelog: ''
basic-deps:
base: ! '>=4.9 && <5'
unordered-containers: ! '>=0.2 && <0.3'
text: ! '>=1.0 && <2.0'
path-pieces: ! '>=0.2 && <0.3'
containers: ! '>=0.5 && <0.6'
ghc-prim: ! '>=0.5 && <0.6'
hashable: ! '>=1.2 && <1.3'
aeson: ! '>=0.11 && <1.3'
vector: ! '>=0.11 && <0.13'
all-versions:
- '0.1'
- '0.1.1'
- '0.1.2'
- '0.2'
- '0.3'
author: Andrew Martin
latest: '0.3'
description-type: haddock
description: Data types and typeclasses to deal with universally and existentially
quantified types
license-name: BSD3
| homepage: https://github.com/andrewthad/quantification#readme
changelog-type: ''
hash: f6d563e72bfbcae250998567394cf059dc90677176ce409b9f7f5ddecd279dcd
test-bench-deps: {}
maintainer: andrew.thaddeus@gmail.com
synopsis: Rage against the quantification
changelog: ''
basic-deps:
base: ! '>=4.9 && <5'
unordered-containers: ! '>=0.2 && <0.3'
text: ! '>=1.0 && <2.0'
path-pieces: ! '>=0.2 && <0.3'
containers: ! '>=0.5 && <0.6'
ghc-prim: ! '>=0.5 && <0.6'
hashable: ! '>=1.2 && <1.3'
aeson: ! '>=0.11 && <1.4'
vector: ! '>=0.11 && <0.13'
all-versions:
- '0.1'
- '0.1.1'
- '0.1.2'
- '0.2'
- '0.3'
author: Andrew Martin
latest: '0.3'
description-type: haddock
description: Data types and typeclasses to deal with universally and existentially
quantified types
license-name: BSD3
| Update from Hackage at 2018-05-07T17:32:33Z | Update from Hackage at 2018-05-07T17:32:33Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
c64c39e55431e9eca4c2e24ed86574f2f10cd259 | docker-compose.yaml | docker-compose.yaml | version: "3"
services:
mongo:
image: mongo:latest
ports:
- "27017"
volumes:
- ./docker/mongo/data:/data/db
node-server:
build:
context: .
dockerfile: "./docker/nodejs.dockerfile"
volumes:
- "./app:/src/app"
ports:
- "3031:3001"
depends_on:
- mongo
links:
- mongo
command: bash -c "npm install && nodemon --legacy-watch ./bin/www ../server 3001"
node-client:
build:
context: .
dockerfile: "./docker/nodejs.dockerfile"
volumes:
- "./app:/src/app"
ports:
- "3030:3000"
depends_on:
- node-server
links:
- node-server
command: bash -c "./node_modules/.bin/grunt build && node ./bin/www ../client 3000"
| version: "3"
services:
mongo:
image: mongo:latest
ports:
- "27017"
volumes:
- ./docker/mongo/data:/data/db
node-server:
build:
context: .
dockerfile: "./docker/nodejs.dockerfile"
volumes:
- "./app:/src/app"
ports:
- "3031:3001"
depends_on:
- mongo
links:
- mongo
command: bash -c "npm install && nodemon --legacy-watch ./bin/www ../server 3001"
node-client:
build:
context: .
dockerfile: "./docker/nodejs.dockerfile"
volumes:
- "./app:/src/app"
ports:
- "3030:3000"
depends_on:
- node-server
links:
- node-server
command: bash -c "./node_modules/.bin/grunt build && nodemon --legacy-watch ./bin/www ../client 3000"
| Use nodemon for client app. | Use nodemon for client app.
| YAML | mit | highpine/highpine,highpine/highpine,highpine/highpine |
1215b67b4c39c39137fbdf9dfba24ab44faeb905 | packages/ae/aeson-flowtyped.yaml | packages/ae/aeson-flowtyped.yaml | homepage: https://github.com/mikeplus64/aeson-flowtyped#readme
changelog-type: ''
hash: 7f408c75580e37fc39cbe1deb7a760d85f1a4bf69c4bdf0e74a706a6fb5a932c
test-bench-deps:
base: ! '>=4.9 && <4.11'
unordered-containers: -any
text: -any
aeson-flowtyped: -any
tasty-hunit: -any
tasty: -any
recursion-schemes: -any
aeson: ! '>=0.8'
vector: -any
maintainer: mike@quasimal.com
synopsis: Create Flow type definitions from Haskell data types.
changelog: ''
basic-deps:
free: -any
wl-pprint: -any
reflection: -any
base: ! '>=4.9 && <4.11'
time: -any
unordered-containers: -any
text: -any
containers: -any
recursion-schemes: -any
scientific: -any
aeson: ! '>=0.8'
vector: -any
all-versions:
- '0.7.0'
- '0.7.1'
- '0.7.2'
- '0.7.4'
author: Mike Ledger <mike@quasimal.com>
latest: '0.7.4'
description-type: haddock
description: Create Flow type definitions from Haskell data types.
license-name: BSD3
| homepage: https://github.com/mikeplus64/aeson-flowtyped#readme
changelog-type: ''
hash: 61bd1c7b0c55c9c0068cf017d1bbc573ef66fdd23a61da85a75b8d3be1bf4ed3
test-bench-deps:
base: ! '>=4.9 && <4.11'
unordered-containers: -any
text: -any
aeson-flowtyped: -any
tasty-hunit: -any
tasty: -any
recursion-schemes: -any
aeson: ! '>=0.8'
vector: -any
maintainer: mike@quasimal.com
synopsis: Create Flow type definitions from Haskell data types.
changelog: ''
basic-deps:
free: -any
wl-pprint: -any
reflection: -any
base: ! '>=4.9 && <4.11'
time: -any
unordered-containers: -any
text: -any
containers: -any
recursion-schemes: -any
scientific: -any
aeson: ! '>=0.8'
vector: -any
all-versions:
- '0.7.0'
- '0.7.1'
- '0.7.2'
- '0.7.4'
- '0.7.5'
author: Mike Ledger <mike@quasimal.com>
latest: '0.7.5'
description-type: haddock
description: Create Flow type definitions from Haskell data types.
license-name: BSD3
| Update from Hackage at 2017-07-28T06:40:15Z | Update from Hackage at 2017-07-28T06:40:15Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
83daf2914c71f6d0bf52318249e9183878652455 | _data/progresses.yml | _data/progresses.yml | - name: Alojamiento
percentage: 100
- name: Transporte
percentage: 25
- name: Preboda
percentage: 20
- name: Traje feo
percentage: 85
- name: Vestido princesa
percentage: 40
- name: Fotógrafo
percentage: 25
- name: Ceremonia
percentage: 50
- name: Menús
percentage: 25
- name: Música
percentage: 30
| - name: Alojamiento
percentage: 100
- name: Transporte
percentage: 25
- name: Preboda
percentage: 20
- name: Traje feo
percentage: 85
- name: Vestido princesa
percentage: 40
- name: Vestido cosita
percentage: 0
- name: Fotógrafo
percentage: 25
- name: Ceremonia
percentage: 50
- name: Menús
percentage: 25
- name: Música
percentage: 30
| Add vestido cosita to progress | Add vestido cosita to progress
| YAML | mit | pablerass/boda-feoyprincesa,pablerass/boda-feoyprincesa |
d488b7d8ae0a95e170a8827f4c8d80ac86e7e604 | concourse/tasks/remove-db.yml | concourse/tasks/remove-db.yml | ---
platform: linux
inputs:
- name: paas-cf
- name: bosh-CA
- name: config
image_resource:
type: docker-image
source:
repository: governmentpaas/cf-cli
tag: 895cf6752c8ec64af05a3a735186b90acd3db65a
run:
path: sh
args:
- -e
- -u
- -c
- |
./paas-cf/concourse/scripts/import_bosh_ca.sh
. ./config/config.sh
if ! curl -I -f "${API_ENDPOINT}/info"; then
echo "CF API unavailable. Skipping..."
exit 0
fi
if ! echo | cf login -a "${API_ENDPOINT}" -u "${CF_ADMIN}" -p "${CF_PASS}"; then
echo "Login failed. Skipping..."
exit 0
fi
cf target -o "${ORG}" -s "${SPACE}"
if cf services | grep -q "${DB_NAME}"; then
cf unbind-service "${BOUND_APP}" "${DB_NAME}" || true
cf delete-service "${DB_NAME}" -f
fi
| ---
platform: linux
inputs:
- name: paas-cf
- name: bosh-CA
- name: config
image_resource:
type: docker-image
source:
repository: governmentpaas/cf-cli
tag: 895cf6752c8ec64af05a3a735186b90acd3db65a
run:
path: sh
args:
- -e
- -u
- -c
- |
./paas-cf/concourse/scripts/import_bosh_ca.sh
. ./config/config.sh
if ! curl -I -f "${API_ENDPOINT}/info"; then
echo "CF API unavailable. Skipping..."
exit 0
fi
if ! echo | cf login -a "${API_ENDPOINT}" -u "${CF_ADMIN}" -p "${CF_PASS}"; then
echo "Login failed. Skipping..."
exit 0
fi
if ! cf target -o "${ORG}" -s "${SPACE}"; then
echo "${ORG}/${SPACE} space doesn't seem to exist. Skipping..."
exit 0
fi
if cf services | grep -q "${DB_NAME}"; then
cf unbind-service "${BOUND_APP}" "${DB_NAME}" || true
cf delete-service "${DB_NAME}" -f
fi
| Handle space not existing in the database removal script. | Handle space not existing in the database removal script.
Otherwise, this script can sometimes erroneously fail if the space
doesn't exist or has already been deleted.
| YAML | mit | alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf,alphagov/paas-cf |
e594a483b446d19bfa1cc30503e9cb3b844aa094 | backend/dispatch.yaml | backend/dispatch.yaml | application: serfr0-fdb
dispatch:
- url: "airspace.serfr1.org/"
module: frontend
- url: "fdb.serfr1.org/"
module: frontend
- url: "fdb.serfr1.org/comp"
module: frontend
- url: "fdb.serfr1.org/fdb/*"
module: ui
- url: "fdb.serfr1.org/api/*"
module: ui
- url: "*/static/*"
module: ui
- url: "*/status"
module: consolidator
- url: "fdb.serfr1.org/reset"
module: consolidator
- url: "fdb.serfr1.org/report"
module: backend
- url: "fdb.serfr1.org/batch/*"
module: backend
| application: serfr0-fdb
dispatch:
# All front doors (to stop stackdriver starting up default module)
- url: "*/"
module: frontend
#- url: "airspace.serfr1.org/"
# module: frontend
#- url: "fdb.serfr1.org/"
# module: frontend
- url: "fdb.serfr1.org/fdb/*"
module: ui
- url: "fdb.serfr1.org/api/*"
module: ui
- url: "*/static/*"
module: ui
- url: "*/status"
module: consolidator
- url: "fdb.serfr1.org/reset"
module: consolidator
- url: "fdb.serfr1.org/report"
module: backend
- url: "fdb.serfr1.org/batch/*"
module: backend
| Stop the bleeding from the default module (should move ui to that maybe ?) | Stop the bleeding from the default module (should move ui to that maybe ?)
| YAML | apache-2.0 | skypies/flightdb2,skypies/flightdb2,skypies/flightdb,skypies/flightdb,skypies/flightdb,skypies/flightdb,skypies/flightdb2 |
7cceb1a124c90e1ca018de75eef9a5953c6d72c9 | roles/linuxfr/tasks/linuxfr.yml | roles/linuxfr/tasks/linuxfr.yml | - name: Create the LinuxFR user
user: name=linuxfr home=/data/dev/linuxfr state=present
- name: Clone the LinuxFR repository
git: repo=git://github.com/nono/linuxfr.org.git dest=/data/dev/linuxfr/source
- name: Configure the Rails apps's database
copy: src=linuxfr/database.yml dest=/data/dev/linuxfr/source/config/database.yml
- name: Configure the Rails apps's secrets
copy: src=linuxfr/secret.yml dest=/data/dev/linuxfr/source/config/secret.yml
- name: Install the Rails app using Bundler
command: bundle install chdir=/data/dev/linuxfr/source
- name: Setup ElasticSearch
command: desi install chdir=/data/dev/linuxfr/source
- name: Setup the database
command: rake db:setup chdir=/data/dev/linuxfr/source
- name: Flush the Redis database
command: redis-cli flushdb
# This doesn't work because undefined method `tire' for #<Class:0x00000005cdedf8> (NoMethodError)
#- name: Initialize indexes
# command: ./script/rails r '[Diary, News, Page, Poll, Post, Tracker, WikiPage].each {|m| m.tire.index.refresh }'
# chdir=/data/dev/linuxfr
| - name: Create the LinuxFR user
user: name=linuxfr home=/data/dev/linuxfr state=present
- name: Clone the LinuxFR repository
git: repo=git://github.com/nono/linuxfr.org.git dest=/data/dev/linuxfr/source
- name: Configure the Rails apps's database
copy: src=linuxfr/database.yml dest=/data/dev/linuxfr/source/config/database.yml
- name: Configure the Rails apps's secrets
copy: src=linuxfr/secret.yml dest=/data/dev/linuxfr/source/config/secret.yml
- name: Install the Rails app using Bundler
command: bundle install chdir=/data/dev/linuxfr/source
- name: Setup ElasticSearch
command: desi install chdir=/data/dev/linuxfr/source
- name: Check if the database already exists
command: rake db:version chdir=/data/dev/linuxfr/source
ignore_errors: True
register: db_version
- name: Setup the database
command: rake db:setup chdir=/data/dev/linuxfr/source
when: db_version|failed
- name: Migrate the database
command: rake db:migrate chdir=/data/dev/linuxfr/source
when: db_version|success
- name: Flush the Redis database
command: redis-cli flushdb
# This doesn't work because undefined method `tire' for #<Class:0x00000005cdedf8> (NoMethodError)
#- name: Initialize indexes
# command: ./script/rails r '[Diary, News, Page, Poll, Post, Tracker, WikiPage].each {|m| m.tire.index.refresh }'
# chdir=/data/dev/linuxfr
| Add support for migrating the database if it is already set up. | Add support for migrating the database if it is already set up.
http://mxey.net/deploying-rails-with-ansible/
| YAML | agpl-3.0 | nud/ansible-linuxfr,nud/ansible-linuxfr |
5f0eb5f2c5d0cff52f3cafa41be347f6f6dc3603 | config/fragments.yml | config/fragments.yml | head:
keys:
- :site_updated_at
- :last_signature_at
- :petition_page
- :archived_petition_page
- :page_title
- :petition
options:
expires_in: 300
header:
options:
expires_in: 300
footer:
keys:
- :create_petition_page
- :home_page
options:
expires_in: 300
home_page:
keys:
- :last_signature_at
options:
expires_in: 300
petition:
keys:
- :petition
options:
expires_in: 300
| head:
keys:
- :site_updated_at
- :last_signature_at
- :petition_page
- :archived_petition_page
- :page_title
- :petition
options:
expires_in: 300
header:
keys:
- :home_page
options:
expires_in: 300
footer:
keys:
- :create_petition_page
options:
expires_in: 300
home_page:
keys:
- :last_signature_at
options:
expires_in: 300
petition:
keys:
- :petition
options:
expires_in: 300
| Fix header and footer caching | Fix header and footer caching
Due to erroneous information from @pixeltrix the `:home_page` key was
removed from the `:header` fragment and not the `:footer` fragment as
it should've been.
| YAML | mit | ansonK/e-petitions,joelanman/e-petitions,joelanman/e-petitions,joelanman/e-petitions,StatesOfJersey/e-petitions,unboxed/e-petitions,oskarpearson/e-petitions,ansonK/e-petitions,unboxed/e-petitions,alphagov/e-petitions,oskarpearson/e-petitions,unboxed/e-petitions,alphagov/e-petitions,telekomatrix/e-petitions,StatesOfJersey/e-petitions,telekomatrix/e-petitions,StatesOfJersey/e-petitions,ansonK/e-petitions,StatesOfJersey/e-petitions,alphagov/e-petitions,oskarpearson/e-petitions,telekomatrix/e-petitions,alphagov/e-petitions,unboxed/e-petitions |
51a65c2fa6f44d4c82a1befb8c9a905c233a58ac | xe-guest-utils.yml | xe-guest-utils.yml | xe-guest-utils:
image: klowner/rancher-xe-guest-utils:6.6.80
privileged: true
labels:
- io.rancher.os.scope=system
volumes:
- /dev:/dev:ro
- /lib/modules:/lib/modules:ro
- /etc/udev/rules.d/:/host/etc/udev/rules.d/
restart: always
net: host
ipc: host
pid: host
| xe-guest-utils:
image: klowner/rancher-xe-guest-utils:6.6.80
privileged: true
labels:
- io.rancher.os.scope=system
volumes:
- /dev:/dev:ro
- /lib/modules:/lib/modules:ro
- /etc/udev/rules.d/:/host/etc/udev/rules.d/
- /etc/lsb-release:/etc/lsb-release
restart: always
net: host
ipc: host
pid: host
| Include lsb-release in volume mounts | Include lsb-release in volume mounts
| YAML | mit | Klowner/rancher-xe-guest-utils |
fca371bb5812619cde6a387362ab3d545f4fd3d4 | travis/.travis.yml | travis/.travis.yml | before_install: "script/clone_all_rspec_repos"
bundler_args: "--binstubs --standalone --without documentation --path ../bundle"
script: "script/run_build"
rvm:
- 1.8.7
- 1.9.2
- 1.9.3
- 2.0.0
- 2.1.0
- ruby-head
- ree
- jruby-18mode
- jruby
- jruby-head
- rbx
matrix:
allow_failures:
- rvm: jruby-18mode
- rvm: jruby-head
- rvm: ruby-head
fast_finish: true
| before_install: "script/clone_all_rspec_repos"
bundler_args: "--binstubs --standalone --without documentation --path ../bundle"
script: "script/run_build"
rvm:
- 1.8.7
- 1.9.2
- 1.9.3
- 2.0.0
- 2.1.0
- ruby-head
- ree
- jruby-18mode
- jruby
- jruby-head
- rbx
matrix:
allow_failures:
- rvm: jruby-18mode
- rvm: jruby-head
- rvm: ruby-head
- rvm: rbx
fast_finish: true
| Add rbx to list of allowed failures. | Add rbx to list of allowed failures.
We're getting odd compile errors from rubinius:
https://github.com/rubinius/rubinius/issues/2934 | YAML | mit | rspec/rspec-dev,rspec/rspec-dev,rspec/rspec-dev |
0a48f397cc736d7f48bf31c4f9c9856a11440333 | config/plugins.yml | config/plugins.yml | loomio_org_plugin:
repo: https://github.com/loomio/loomio_org_plugin.git
branch: slack-app-distribution
loomio_tags:
repo: https://github.com/loomio/loomio_tags.git
loomio_truncate_comment:
repo: https://github.com/loomio/loomio_truncate_comment.git
loomio_content_preview:
repo: https://github.com/loomio/loomio_content_preview.git
loomio_onboarding:
repo: https://github.com/loomio/loomio_onboarding.git
| loomio_org_plugin:
repo: https://github.com/loomio/loomio_org_plugin.git
loomio_tags:
repo: https://github.com/loomio/loomio_tags.git
loomio_truncate_comment:
repo: https://github.com/loomio/loomio_truncate_comment.git
loomio_content_preview:
repo: https://github.com/loomio/loomio_content_preview.git
loomio_onboarding:
repo: https://github.com/loomio/loomio_onboarding.git
| Use master branch for loomio_org plugin | Use master branch for loomio_org plugin
| YAML | agpl-3.0 | piratas-ar/loomio,loomio/loomio,piratas-ar/loomio,piratas-ar/loomio,piratas-ar/loomio,loomio/loomio,loomio/loomio,loomio/loomio |
4cdae4170cb8e15c195e9e0cbe0b03f9a81ebcb0 | travis/configuration.yaml | travis/configuration.yaml | emg:
databases:
default:
ENGINE: 'django.db.backends.mysql'
NAME: 'emg'
USER: 'root'
# PASSWORD: 'secret'
HOST: 127.0.0.1
PORT: 3306
session_engine: 'django.contrib.sessions.backends.cache'
# caches:
# default:
# BACKEND: "django_redis.cache.RedisCache"
# LOCATION: "redis://127.0.0.1:6379/0"
# KEY_PREFIX: "emg"
emg_backend_auth: "https://backend"
mongodb:
db: 'emg'
host: '127.0.0.1'
| emg:
databases:
default:
ENGINE: 'django.db.backends.mysql'
NAME: 'emg'
USER: 'root'
# PASSWORD: 'secret'
HOST: 127.0.0.1
PORT: 3306
session_engine: 'django.contrib.sessions.backends.cache'
# caches:
# default:
# BACKEND: "django_redis.cache.RedisCache"
# LOCATION: "redis://127.0.0.1:6379/0"
# KEY_PREFIX: "emg"
emg_backend_auth: "https://backend"
mongodb:
db: 'testdb'
host: '127.0.0.1'
| Use testdb mongo db on tests (Travis) | Use testdb mongo db on tests (Travis)
| YAML | apache-2.0 | EBI-Metagenomics/emgapi,EBI-Metagenomics/emgapi,EBI-Metagenomics/emgapi,EBI-Metagenomics/emgapi,EBI-Metagenomics/emgapi |
413fa961c306b628d92d3c3b7cb6c4002fd31cfd | handlers/main.yml | handlers/main.yml | ---
- name: openvpn restart
service: name=openvpn state=restarted
- name: openvpn pack clients
command: tar cvzf {{openvpn_keydir}}/{{item.item}}.tar.gz -C {{openvpn_keydir}} {{item.item}}.crt {{item.item}}.key {{item.item}}.ovpn ca.crt
when: item.changed
with_items: openvpn_clients_changed.results
| ---
- name: openvpn restart
service: name=openvpn state=restarted
- name: openvpn pack clients
command: tar cvzf {{openvpn_keydir}}/{{item.item}}.tar.gz -C {{openvpn_keydir}} {{item.item}}.crt {{item.item}}.key {{item.item}}.ovpn ca.crt
when: item.changed
with_items: "{{openvpn_clients_changed.results}}"
| Update for deprecation-related variable quoting | Update for deprecation-related variable quoting
See https://docs.ansible.com/ansible/porting_guide_2.0.html#deprecated
for more details
| YAML | mit | Stouts/Stouts.openvpn,teadur/Stouts.openvpn,Cobliteam/Stouts.openvpn,Stouts/Stouts.openvpn |
2faac1ce0fdef835112e01db691faabcddc4b932 | git/cloudbuild.yaml | git/cloudbuild.yaml | # In this directory, run the following command to build this builder.
# $ gcloud alpha container builds create . --config=cloudbuild.yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
args: ['build', '--tag=gcr.io/$PROJECT_ID/git', '.']
# Clone a public repo and write its revision to a VERSION file.
- name: 'gcr.io/$PROJECT_ID/git'
args: ['clone', 'https://github.com/GoogleCloudPlatform/cloud-builders.git']
dir: 'examples/version-file'
# The Dockerfile here uses this git builder as its base image.
- name: 'gcr.io/cloud-builders/docker'
args: ['build', '--tag=version_file', '.']
dir: 'examples/version-file'
- name: 'version_file'
args: ['VERSION']
dir: 'examples/version-file/cloud-builders'
# Prove that the file exists.
- name: 'alpine'
args: ['ash', '-c', 'echo "Version: $(cat VERSION)"']
dir: 'examples/version-file/cloud-builders'
# Clone a private repo belonging to the project.
- name: 'gcr.io/$PROJECT_ID/git'
args: ['clone', 'https://source.developers.google.com/p/$PROJECT_ID/r/default']
images: ['gcr.io/$PROJECT_ID/git']
| # In this directory, run the following command to build this builder.
# $ gcloud alpha container builds create . --config=cloudbuild.yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
args: ['build', '--tag=gcr.io/$PROJECT_ID/git', '.']
# Clone a public repo and write its revision to a VERSION file.
- name: 'gcr.io/$PROJECT_ID/git'
args: ['clone', 'https://github.com/GoogleCloudPlatform/cloud-builders.git']
dir: 'examples/version-file'
# The Dockerfile here uses this git builder as its base image.
- name: 'gcr.io/cloud-builders/docker'
args: ['build', '--tag=version_file', '.']
dir: 'examples/version-file'
- name: 'version_file'
args: ['VERSION']
dir: 'examples/version-file/cloud-builders'
# Prove that the file exists.
- name: 'alpine'
args: ['ash', '-c', 'echo "Version: $(cat VERSION)"']
dir: 'examples/version-file/cloud-builders'
images: ['gcr.io/$PROJECT_ID/git']
| Remove a sample that breaks our release | Remove a sample that breaks our release
-------------
MOE_MIGRATED_REVID=139902326
APPROVED=philmod@google.com
| YAML | apache-2.0 | GoogleCloudPlatform/cloud-builders,google-octo/cloud-builders,google-octo/cloud-builders,GoogleCloudPlatform/cloud-builders,google-octo/cloud-builders,GoogleCloudPlatform/cloud-builders,GoogleCloudPlatform/cloud-builders,GoogleCloudPlatform/cloud-builders,GoogleCloudPlatform/cloud-builders,GoogleCloudPlatform/cloud-builders,google-octo/cloud-builders,google-octo/cloud-builders,google-octo/cloud-builders,google-octo/cloud-builders |
42d322b50e5f79eeb0d32f5cf7d1c937612d69da | packages/ch/chiphunk.yaml | packages/ch/chiphunk.yaml | homepage: https://github.com/CthulhuDen/chiphunk#readme
changelog-type: ''
hash: 5029535b952c907efefcddcf3c7291ec96f12df8dca28349dfe510194fe85978
test-bench-deps: {}
maintainer: cthulhu.den@gmail.com
synopsis: Haskell bindings for Chipmunk2D physics engine
changelog: ''
basic-deps:
chiphunk: -any
base: ! '>=4.7 && <5'
vector-space: ! '>=0.13 && <0.16'
safe-exceptions: ! '>=0.1.7.0 && <0.2'
async: ! '>=2.2.1 && <2.3'
nanovg-simple: ! '>=0.4.0.0 && <0.5'
nanovg: ! '>=0.6.0.0 && <0.7'
StateVar: ! '>=1.1.1.1 && <1.2'
all-versions:
- '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
author: Cthulhu
latest: '0.1.0.2'
description-type: markdown
description: ! '# chiphunk
Chiphunk is a Haskell bindings for Chipmunk2D physics library. See `Chiphunk.Low`
module for documentation.
'
license-name: BSD3
| homepage: https://github.com/CthulhuDen/chiphunk#readme
changelog-type: ''
hash: 0c88d66e5b66ffd0b3bd70fc63ff00be0ab0757c26f1315a1b6f788852e14a6c
test-bench-deps: {}
maintainer: cthulhu.den@gmail.com
synopsis: Haskell bindings for Chipmunk2D physics engine
changelog: ''
basic-deps:
chiphunk: -any
base: ! '>=4.7 && <5'
vector-space: ! '>=0.13 && <0.16'
safe-exceptions: ! '>=0.1.7.0 && <0.2'
async: ! '>=2.2.1 && <2.3'
nanovg-simple: ! '>=0.4.0.0 && <0.5'
nanovg: ! '>=0.6.0.0 && <0.7'
StateVar: ! '>=1.1.1.1 && <1.2'
all-versions:
- '0.1.0.0'
- '0.1.0.1'
- '0.1.0.2'
- '0.1.0.3'
author: Cthulhu
latest: '0.1.0.3'
description-type: markdown
description: ! '# chiphunk
Chiphunk is a Haskell bindings for Chipmunk2D physics library. See `Chiphunk.Low`
module for documentation.
'
license-name: BSD3
| Update from Hackage at 2018-11-18T11:45:56Z | Update from Hackage at 2018-11-18T11:45:56Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
f068060c09130842b0f8080824e3be5de9090105 | packages/li/ListTree.yaml | packages/li/ListTree.yaml | homepage: http://github.com/yairchu/generator/tree
changelog-type: ''
hash: e4fcac1bf2880b1fdde46395e927c4b590db8626348547fa8e7745e93bf926df
test-bench-deps: {}
maintainer: yairchu@gmail.com
synopsis: Trees and monadic trees expressed as monadic lists where the underlying
monad is a list
changelog: ''
basic-deps:
List: ! '>=0.4.0 && <0.6.0'
base: ! '>=3 && <5'
filepath: ! '>=1.1 && <2.0'
transformers: ! '>=0.2'
directory: ! '>=1.0 && <2.0'
all-versions:
- '0.1'
- '0.2.0'
- '0.2.1'
author: Yair Chuchem
latest: '0.2.1'
description-type: haddock
description: ! 'Directory tree structure expressed as a monadic tree.
Searching, pruning, iterating, and processing trees.'
license-name: BSD3
| homepage: http://github.com/yairchu/generator/tree
changelog-type: ''
hash: a4d4f0b740f6bc5b90ee833f9fe8af06ceeba3f91c4aa7370f87c398691c937d
test-bench-deps: {}
maintainer: yairchu@gmail.com
synopsis: Trees and monadic trees expressed as monadic lists where the underlying
monad is a list
changelog: ''
basic-deps:
List: ! '>=0.4.0 && <0.6.0'
base: ! '>=3 && <5'
filepath: ! '>=1.1 && <2.0'
transformers: ! '>=0.2'
directory: ! '>=1.0 && <2.0'
all-versions:
- '0.1'
- '0.2.0'
- '0.2.1'
- '0.2.2'
author: Yair Chuchem
latest: '0.2.2'
description-type: haddock
description: ! 'Directory tree structure expressed as a monadic tree.
Searching, pruning, iterating, and processing trees.'
license-name: BSD3
| Update from Hackage at 2015-08-13T12:18:44+0000 | Update from Hackage at 2015-08-13T12:18:44+0000
| YAML | mit | commercialhaskell/all-cabal-metadata |
12b0e0375ab5202b17a6b8913f2b454b4b1c03d7 | packages/pa/pangraph.yaml | packages/pa/pangraph.yaml | homepage: https://github.com/tuura/pangraph#readme
changelog-type: ''
hash: e61d7728f2e5d180b57f41cfb77c03a1417629a2a7a1d44ab7bfabb51dbbc18e
test-bench-deps:
bytestring: -any
base: ! '>=4.8 && <5'
HUnit: -any
containers: -any
pangraph: -any
maintainer: joseph-scott@hotmail.co.uk
synopsis: ! 'A set of parsers for graph languages and conversions to
graph libaries.'
changelog: ''
basic-deps:
bytestring: -any
base: ! '>=4.8 && <5'
containers: -any
hexml: -any
algebraic-graphs: ==0.1.1.*
all-versions:
- '0.1.1.5'
- '0.1.2'
author: Joe Scott
latest: '0.1.2'
description-type: haddock
description: ! 'A package allowing parsing of graph files into graph
library datatypes. With aim the cope with large networks
and provide translations between graph libraries. Like a
pandoc but for graphs. This is my first library so any
feedback and help is appreicated. For example use please
see the homepage.'
license-name: BSD3
| homepage: https://github.com/tuura/pangraph#readme
changelog-type: ''
hash: b218092873fc33f317dad33abba86581c181a4fa15ddb92b708fecc5d750bf69
test-bench-deps:
bytestring: -any
base: ! '>=4.8 && <5'
HUnit: -any
containers: -any
pangraph: -any
maintainer: joseph-scott@hotmail.co.uk
synopsis: ! 'A set of parsers for graph languages and conversions to
graph libaries.'
changelog: ''
basic-deps:
bytestring: -any
base: ! '>=4.8 && <5'
html-entities: -any
text: -any
containers: -any
hexml: -any
fgl: -any
attoparsec: -any
algebraic-graphs: -any
all-versions:
- '0.1.1.5'
- '0.1.2'
- '0.2.0'
author: Joe Scott
latest: '0.2.0'
description-type: haddock
description: ! 'A package allowing parsing of graph files into graph
library datatypes. With aim the cope with large networks
and provide translations between graph libraries. Like a
pandoc but for graphs. This is my first library so any
feedback and help is appreicated. For example use please
see the homepage.'
license-name: BSD3
| Update from Hackage at 2018-09-22T10:39:32Z | Update from Hackage at 2018-09-22T10:39:32Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
f8fd715f30c225bb4f1e2bf19a6368040dd63900 | packages/re/regex-do.yaml | packages/re/regex-do.yaml | homepage: https://github.com/ciez/regex-do
changelog-type: ''
hash: c28172ed9e823b26a33cbe22957e0a5fdae1741d863e60fdb73b8ce464ba49c3
test-bench-deps:
regex-pcre: -any
bytestring: -any
base: <=5.0
hspec: -any
text: -any
stringsearch: -any
array: -any
regex-base: -any
regex-do: -any
QuickCheck: -any
maintainer: Imants Cekusins
synopsis: PCRE wrapper
changelog: ''
basic-deps:
regex-pcre: -any
bytestring: -any
base: <=5.0
text: -any
stringsearch: -any
array: -any
regex-base: -any
all-versions:
- '1.4'
- '1.5'
- '1.6'
author: Imants Cekusins
latest: '1.6'
description-type: haddock
description: format, search, replace (String | ByteString) with PCRE regex. Utf8-safe
license-name: PublicDomain
| homepage: https://github.com/ciez/regex-do
changelog-type: ''
hash: f984be298a794a1c68abbaa919104253d7b02581876fd696fb5a6d41b327f731
test-bench-deps:
regex-pcre: -any
bytestring: -any
base: <=5.0
hspec: -any
text: -any
stringsearch: -any
array: -any
regex-base: -any
regex-do: -any
QuickCheck: -any
maintainer: Imants Cekusins
synopsis: PCRE wrapper
changelog: ''
basic-deps:
regex-pcre: -any
bytestring: -any
base: <=5.0
text: -any
stringsearch: -any
array: -any
regex-base: -any
all-versions:
- '1.4'
- '1.5'
- '1.6'
- '1.7'
author: Imants Cekusins
latest: '1.7'
description-type: haddock
description: format, search, replace (String | ByteString) with PCRE regex. Utf8-safe
license-name: PublicDomain
| Update from Hackage at 2016-11-09T11:17:12Z | Update from Hackage at 2016-11-09T11:17:12Z | YAML | mit | commercialhaskell/all-cabal-metadata |
fdf1897dc9f06800525ff7c3fa8ea5b008b6e3fe | packages/sy/sym-plot.yaml | packages/sy/sym-plot.yaml | homepage: http://github.com/akc/sym-plot
changelog-type: ''
hash: 5c3bfbfdcb1756eee01680f14b2f5fbf51f7e7ee6336b6a123a7792d2f13137f
test-bench-deps: {}
maintainer: anders.claesson@gmail.com
synopsis: Plot permutations; an addition to the sym package
changelog: ''
basic-deps:
diagrams-lib: -any
base: ! '>=3 && <=4.7'
diagrams-cairo: -any
sym: ! '>=0.9'
all-versions:
- '0.1.0'
- '0.2.0'
author: Anders Claesson
latest: '0.2.0'
description-type: haddock
description: ! 'This packade adds plots to the sym package. It has been factored out
of that package because diagrams is a rather heavy dependency.'
license-name: BSD3
| homepage: http://github.com/akc/sym-plot
changelog-type: ''
hash: efcea5f5b603b3ccfd943815d415492bd27baba36215be9c8c9f7552dc56989b
test-bench-deps: {}
maintainer: anders.claesson@gmail.com
synopsis: Plot permutations; an addition to the sym package
changelog: ''
basic-deps:
diagrams-lib: ! '>=1.4'
base: ! '>=4.7 && <=5'
diagrams-cairo: ! '>=1.4'
sym: ! '>=0.12'
all-versions:
- '0.1.0'
- '0.2.0'
- '0.3.0'
author: Anders Claesson
latest: '0.3.0'
description-type: haddock
description: ! 'This packade adds plots to the sym package. It has been factored out
of that package because diagrams is a rather heavy dependency.'
license-name: BSD3
| Update from Hackage at 2017-06-10T12:42:18Z | Update from Hackage at 2017-06-10T12:42:18Z
| YAML | mit | commercialhaskell/all-cabal-metadata |
f42e5745891f06c5cb9f809ed144c8753a29cfde | recipes/py/meta.yaml | recipes/py/meta.yaml | {% set name = "py" %}
{% set version = "1.4.31" %}
{% set hash_type = "sha256" %}
{% set hash = "a6501963c725fc2554dabfece8ae9a8fb5e149c0ac0a42fd2b02c5c1c57fc114" %}
package:
name: {{ name }}
version: {{ version }}
source:
fn: {{ name }}-{{ version }}.tar.gz
url: https://pypi.io/packages/source/p/py/py-{{ version }}.tar.gz
{{ hash_type }}: {{ hash }}
build:
number: 0
script: python setup.py install --single-version-externally-managed --record=record.txt
requirements:
build:
- python
- setuptools
run:
- python
test:
imports:
- py
- py.code
- py.io
- py.log
- py.path
- py.process
about:
home: https://github.com/pytest-dev/py
license: MIT
license_file: LICENSE
summary: "library with cross-python path, ini-parsing, io, code, log facilities"
| {% set name = "py" %}
{% set version = "1.4.32" %}
{% set hash_type = "sha256" %}
{% set hash = "c4b89fd1ff1162375115608d01f77c38cca1d0f28f37fd718005e19b28be41a7" %}
package:
name: {{ name }}
version: {{ version }}
source:
fn: {{ name }}-{{ version }}.tar.gz
url: https://pypi.io/packages/source/p/py/py-{{ version }}.tar.gz
{{ hash_type }}: {{ hash }}
build:
number: 0
script: python setup.py install --single-version-externally-managed --record=record.txt
requirements:
build:
- python
- setuptools
run:
- python
test:
imports:
- py
- py.code
- py.io
- py.log
- py.path
- py.process
about:
home: https://github.com/pytest-dev/py
license: MIT
license_file: LICENSE
summary: "library with cross-python path, ini-parsing, io, code, log facilities"
| Update py recipe to version 1.4.32 | Update py recipe to version 1.4.32
| YAML | bsd-3-clause | jjhelmus/berryconda,jjhelmus/berryconda,jjhelmus/berryconda,jjhelmus/berryconda,jjhelmus/berryconda |
943ecaa69665f3c736b1ff4021a7e825ecf38c43 | .github/workflows/main.yml | .github/workflows/main.yml | name: CI
# Allows you to run this workflow manually from the Actions tab
on: [push, workflow_dispatch]
# Controls when the action will run.
jobs:
# Allows you to run this workflow manually from the Actions tab
build:
runs-on: ubuntu-latest
strategy:
max-parallel: 5
matrix:
python-version: [2.7, 3.6, 3.7, 3.8, 3.9]
steps:
- uses: actions/checkout@v1
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox tox-gh-actions
- name: Test with tox
run: tox
| name: CI
# Allows you to run this workflow manually from the Actions tab
on: [push, workflow_dispatch]
# Controls when the action will run.
jobs:
# Allows you to run this workflow manually from the Actions tab
build:
runs-on: ubuntu-latest
strategy:
max-parallel: 5
matrix:
python-version: [3.6, 3.7, 3.8, 3.9]
steps:
- uses: actions/checkout@v1
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox tox-gh-actions
- name: Test with tox
run: tox
| Remove py27 from GH CI | Remove py27 from GH CI | YAML | apache-2.0 | alanjds/drf-nested-routers |
07d2e28554255449d695266be4701229b72fcb8a | .github/workflows/main.yml | .github/workflows/main.yml | name: build
on: [push]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
java-version: [ 8, 11, 12, 13, 14, 15, 16, 17 ]
steps:
- uses: actions/checkout@v2
- uses: actions/setup-java@v2
with:
distribution: adopt
java-version: ${{ matrix.java-version }}
# delete gradle.properties in Java 8 because it contains a then-unsupported JVM argument: --add-exports
- run: rm gradle.properties
if: ${{ matrix.java-version == 8 }}
- run: ./gradlew assemble check --info
| name: build
on: [push]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
java-version: [ 8, 11, 12, 13, 14, 15, 16, 17 ]
steps:
- uses: actions/checkout@v2
- uses: actions/setup-java@v3
with:
distribution: adopt
java-version: ${{ matrix.java-version }}
# delete gradle.properties in Java 8 because it contains a then-unsupported JVM argument: --add-exports
- run: rm gradle.properties
if: ${{ matrix.java-version == 8 }}
- run: ./gradlew assemble check --info
| Update actions/setup-java action to v3 | Update actions/setup-java action to v3
| YAML | bsd-2-clause | marianobarrios/linked-blocking-multi-queue |
16e33125daa97f23a32c80cf26d9fd478ff00549 | .github/workflows/main.yml | .github/workflows/main.yml | name: Testing
on: [push, pull_request]
jobs:
laravel:
name: (PHP ${{ matrix.php-versions }} on ${{ matrix.operating-system }})
runs-on: ${{ matrix.operating-system }}
strategy:
fail-fast: false
matrix:
operating-system: [ubuntu-latest, windows-latest, macos-latest]
php-versions: ['7.2', '7.3', '7.4']
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Setup PHP, with composer and extensions
uses: shivammathur/setup-php@v2 #https://github.com/shivammathur/setup-php
with:
php-version: ${{ matrix.php-versions }}
extensions: mbstring, dom, fileinfo
coverage: xdebug #optional
- name: Get composer cache directory
id: composer-cache
run: echo "::set-output name=dir::$(composer config cache-files-dir)"
- name: Cache composer dependencies
uses: actions/cache@v2
with:
path: ${{ steps.composer-cache.outputs.dir }}
# Use composer.json for key, if composer.lock is not committed.
# key: ${{ runner.os }}-composer-${{ hashFiles('**/composer.json') }}
key: ${{ runner.os }}-composer-${{ hashFiles('**/composer.lock') }}
restore-keys: ${{ runner.os }}-composer-
- name: Install Composer dependencies
run: composer install --no-progress --prefer-dist --optimize-autoloader
- name: Test with phpunit
run: vendor/bin/phpunit --coverage-text
| name: Testing
on: [push, pull_request]
jobs:
laravel:
name: (PHP ${{ matrix.php-versions }} on ${{ matrix.operating-system }})
runs-on: ${{ matrix.operating-system }}
strategy:
fail-fast: false
matrix:
operating-system: [ubuntu-latest, windows-latest, macos-latest]
php-versions: ['7.2', '7.3', '7.4']
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Setup PHP, with composer and extensions
uses: shivammathur/setup-php@v2 #https://github.com/shivammathur/setup-php
with:
php-version: ${{ matrix.php-versions }}
extensions: mbstring, dom, fileinfo, pdo_sqlite
coverage: xdebug #optional
- name: Get composer cache directory
id: composer-cache
run: echo "::set-output name=dir::$(composer config cache-files-dir)"
- name: Cache composer dependencies
uses: actions/cache@v2
with:
path: ${{ steps.composer-cache.outputs.dir }}
# Use composer.json for key, if composer.lock is not committed.
# key: ${{ runner.os }}-composer-${{ hashFiles('**/composer.json') }}
key: ${{ runner.os }}-composer-${{ hashFiles('**/composer.lock') }}
restore-keys: ${{ runner.os }}-composer-
- name: Install Composer dependencies
run: composer install --no-progress --prefer-dist --optimize-autoloader
- name: Test with phpunit
run: vendor/bin/phpunit --coverage-text
| Add SQLite support (needed for testing) | Add SQLite support (needed for testing)
| YAML | mit | crodas/SQLParser,crodas/SQLParser |
0c3c74f27881e468af367a0d8202ca3aa22aec89 | .github/workflows/main.yml | .github/workflows/main.yml | name: Python application
on: [push]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [3.5, 3.6, 3.7, 3.8]
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
if [ -f requirements_test.txt ]; then pip install -r requirements_test.txt; fi
- name: Run Tox
# Run tox using the version of Python in `PATH`
run: tox -e py
| name: Python application
on: [push]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [3.6, 3.7, 3.8]
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
if [ -f requirements_test.txt ]; then pip install -r requirements_test.txt; fi
- name: Run Tox
# Run tox using the version of Python in `PATH`
run: tox -e py
| Exclude python 3.5 from github actions | Exclude python 3.5 from github actions | YAML | mit | rfverbruggen/rachiopy |
e0edc9824d91763c29907ac555cfeffc5ab3fd62 | .github/workflows/main.yml | .github/workflows/main.yml | name: CI
on:
push:
branches: [ feature/github-actions-for-windows-compilers ]
pull_request:
branches: [ master ]
jobs:
build-windows-vs2019:
name: Windows VS2019 Debug
runs-on: [windows-latest]
steps:
- name: Checkout
uses: actions/checkout@v2
with:
submodules: recursive
- name: Add msbuild to PATH
uses: microsoft/setup-msbuild@v1.0.1
- name: Build
run: |
cmake -G "Visual Studio 16 2019" -D BUILD_TESTS=ON ./
MSBuild.exe etl.sln
- name: Run tests
run: |
ls *.exe
etl_tests.exe
| name: CI
on:
push:
branches: [ feature/github-actions-for-windows-compilers ]
pull_request:
branches: [ master ]
jobs:
build-windows-vs2019:
name: Windows VS2019 Debug
runs-on: [windows-latest]
steps:
- name: Checkout
uses: actions/checkout@v2
with:
submodules: recursive
- name: Add msbuild to PATH
uses: microsoft/setup-msbuild@v1.0.1
- name: Build
run: |
cmake -G "Visual Studio 16 2019" -D BUILD_TESTS=ON ./
MSBuild.exe etl.sln
- name: Run tests
run: |
ls
etl_tests.exe
| Add VS2019 configuration to Github CI | Add VS2019 configuration to Github CI
| YAML | mit | ETLCPP/etl,ETLCPP/etl,ETLCPP/etl,ETLCPP/etl |
8b57f6558e89b4fe3b2ddcca2ae112853df5f692 | .github/workflows/main.yml | .github/workflows/main.yml | name: Python tests
on: [push]
jobs:
build:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
python-version: ["3.7", "3.8", "3.9", "3.10"]
django: ["3.2", "4.0"]
exclude:
- python-version: "3.7"
django: "4.0"
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -q Django==${{ matrix.django }} requests mock hypernova
- name: Test
run: |
python runtests.py
- name: "Run isort checks for ${{ matrix.python-version }}"
run: |
isort . --check-only
| name: Run tests and lint
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
python-version: ["3.7", "3.8", "3.9", "3.10"]
django: ["3.2", "4.0"]
exclude:
- python-version: "3.7"
django: "4.0"
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -q Django==${{ matrix.django }} requests mock hypernova
- name: Test
run: |
python runtests.py
lint-black:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: psf/black@stable
with:
options: "--check --verbose"
src: "."
version: "21.5b1"
lint-isort:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install isort
isort . --check-only
| Add black and isort to ci | Add black and isort to ci
| YAML | mit | Frojd/django-react-templatetags,Frojd/django-react-templatetags,Frojd/django-react-templatetags |
1c9adf2b452cab144dd8d89e024ddbb0c19c1b88 | .github/workflows/main.yml | .github/workflows/main.yml | name: CI
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Run a one-line script
run: echo Hello, world!
- name: Checkout the tools
run: |
pwd
mkdir ~/esp
cd ~/esp
git clone --recursive https://github.com/espressif/esp-idf.git
cd ~/esp/esp-idf
./install.sh
- name: Compile example 01_max_malloc
run: |
. $HOME/esp/esp-idf/export.sh
export BATCH_BUILD=1
export V=0
cd /home/runner/work/qemu_esp32/qemu_esp32/examples/01_max_malloc
idf.py build
exit 0
| name: CI
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Run a one-line script
run: echo Hello, world!
- name: Checkout the tools
run: |
pwd
mkdir ~/esp
cd ~/esp
#git clone -b branch --single-branch git://github/repository.git
#git clone --recursive https://github.com/espressif/esp-idf.git
git clone --recursive -b release/v4.0 --single-branch https://github.com/espressif/esp-idf.git
cd ~/esp/esp-idf
./install.sh
- name: Compile example 01_max_malloc
run: |
. $HOME/esp/esp-idf/export.sh
export BATCH_BUILD=1
export V=0
cd /home/runner/work/qemu_esp32/qemu_esp32/examples/01_max_malloc
idf.py build
exit 0
| Use v 4.0 for ci | Use v 4.0 for ci
| YAML | apache-2.0 | Ebiroll/qemu_esp32,Ebiroll/qemu_esp32,Ebiroll/qemu_esp32 |
c2d465add08a88ff923f54fef68b72172bd5e193 | .github/workflows/main.yml | .github/workflows/main.yml | name: pystan
on: [push, pull_request]
jobs:
tests:
name: pystan tests
runs-on: ${{ matrix.os }}
timeout-minutes: 20
strategy:
matrix:
os: [ubuntu-20.04, macos-10.15]
python-version: [3.7, 3.8, 3.9]
steps:
- name: Check out repository
uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install poetry
run: pip install "poetry~=1.1.4"
# export test dependencies from pyproject.toml, install them
- name: Install dependencies
run: |
poetry export -f requirements.txt --without-hashes --dev -o requirements.txt \
&& pip install -r requirements.txt
- name: Build and Install wheel
run: |
poetry build -v
python -m pip install dist/*.whl
- name: Check code
if: matrix.python-version == '3.8'
run: scripts/check
- name: Run tests
run: scripts/test
| name: pystan
on: [push, pull_request]
jobs:
tests:
name: pystan tests
timeout-minutes: 20
runs-on: ${{ matrix.runs-on }}
strategy:
matrix:
include:
- {runs-on: ubuntu-20.04, python-version: 3.7}
- {runs-on: ubuntu-20.04, python-version: 3.8}
- {runs-on: ubuntu-20.04, python-version: 3.9}
- {runs-on: macos-10.15, python-version: 3.7}
- {runs-on: macos-10.15, python-version: 3.8}
- {runs-on: macos-10.15, python-version: 3.9}
- {runs-on: ubuntu-20.04, container: "ubuntu:latest", python-version: 3.8}
- {runs-on: ubuntu-20.04, container: "fedora:latest", python-version: 3.8}
steps:
- name: Check out repository
uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install poetry
run: pip install "poetry~=1.1.4"
# export test dependencies from pyproject.toml, install them
- name: Install dependencies
run: |
poetry export -f requirements.txt --without-hashes --dev -o requirements.txt \
&& pip install -r requirements.txt
- name: Build and Install wheel
run: |
poetry build -v
python -m pip install dist/*.whl
- name: Check code
if: matrix.python-version == '3.8'
run: scripts/check
- name: Run tests
run: scripts/test
| Test on fedora latest and ubuntu latest | ci: Test on fedora latest and ubuntu latest
Closes #295
| YAML | isc | stan-dev/pystan,stan-dev/pystan |
cec75e9152f612579ad95f094ee825df85a3977a | .github/workflows/ruby.yml | .github/workflows/ruby.yml | name: Ruby
on:
pull_request:
branches:
- 'master'
push:
branches:
- 'master'
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
ruby: [ '2.5.4', '2.6.5', 'ruby-head', 'jruby-9.2.9.0', 'jruby-head' ]
steps:
- uses: actions/checkout@v1
- name: Set up RVM
run: |
curl -sSL https://get.rvm.io | bash
- name: Set up Ruby
run: |
source $HOME/.rvm/scripts/rvm
rvm install ${{ matrix.ruby }} --binary
rvm --default use ${{ matrix.ruby }}
- name: Build and test with Rake
run: |
source $HOME/.rvm/scripts/rvm
gem i test-unit
rake
benchmark:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Set up Ruby 2.6
uses: actions/setup-ruby@v1
with:
ruby-version: 2.6.x
- name: Test performance and accuracy
run: |
gem install bundler
bundle install --jobs 4 --retry 2
gem uni did_you_mean
bundle exec rake test:accuracy
bundle exec rake test:explore
bundle exec rake benchmark:memory
| name: Ruby
on:
pull_request:
branches:
- 'master'
push:
branches:
- 'master'
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
ruby: [ '2.5.8', '2.6.6', '2.7.1', 'ruby-head', 'jruby-9.2.11.1', 'jruby-head' ]
steps:
- uses: actions/checkout@v1
- uses: ruby/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby }}
- name: Build and test with Rake
run: |
gem i test-unit
rake
benchmark:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- uses: ruby/setup-ruby@v1
with:
ruby-version: 2.7.1
- name: Test performance and accuracy
run: |
gem install bundler
bundle install --jobs 4 --retry 2
gem uni did_you_mean
bundle exec rake test:accuracy
bundle exec rake test:explore
bundle exec rake benchmark:memory
| Use GH action box maintained by Ruby core | Use GH action box maintained by Ruby core
| YAML | mit | yuki24/did_you_mean |
0c6b6401314464230b25c72b7233f604eea9098b | .github/workflows/ruby.yml | .github/workflows/ruby.yml | name: Ruby
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Set up Ruby 2.6
uses: actions/setup-ruby@v1
with:
ruby-version: 2.6.x
- name: Build and test with Rake
run: |
gem install bundler
bundle install --jobs 4 --retry 3
bundle exec rake
| name: Ruby
on: [push]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Set up Ruby 2.6
uses: actions/setup-ruby@v1
with:
ruby-version: 2.6.4
- name: Build and test with Rake
run: |
gem install bundler
bundle install --jobs 4 --retry 3
bundle exec rake
| Use Ruby 2.6.4 in GitHub Actions CI | Use Ruby 2.6.4 in GitHub Actions CI
| YAML | mit | levups/http-tester |
79a0e93b6fb873456020e62552f4a9489f452bfd | .github/workflows/test.yml | .github/workflows/test.yml | name: test
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: test
run: |
export DEBIAN_FRONTEND=noninteractive && \
echo 'debconf debconf/frontend select Noninteractive' | sudo debconf-set-selections && \
sudo apt-get update && \
sudo apt-get install -yq --no-install-recommends libev-dev python3.7 python3-dev python3-setuptools dialog apt-utils tshark libpcap0.8 libpcap-dev p0f tcpdump && \
pip3 install -U pip && \
pip3 install codecov pytype pytest-cov && \
find . -name requirements.txt -type f -exec pip3 install -r {} \; && \
export PATH=/home/runner/.local/bin:$PATH && \
make test && \
coverage report && coverage xml
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v2.1.0
if: github.repository == 'iqtlabs/network-tools'
| name: test
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: test
run: |
export DEBIAN_FRONTEND=noninteractive && \
echo 'debconf debconf/frontend select Noninteractive' | sudo debconf-set-selections && \
sudo apt-get update && \
sudo apt-get install -yq --no-install-recommends libev-dev python3.7 python3-dev python3-setuptools dialog apt-utils tshark libpcap0.8 libpcap-dev p0f tcpdump && \
pip3 install -U pip && \
pip3 install codecov pytype pytest-cov && \
find . -name requirements.txt -type f -exec pip3 install -r {} \; && \
export PATH=/home/runner/.local/bin:$PATH && \
make test && \
coverage report && coverage xml
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v3.0.0
if: github.repository == 'iqtlabs/network-tools'
| Update codecov/codecov-action action to v3 | Update codecov/codecov-action action to v3
| YAML | apache-2.0 | CyberReboot/vent-plugins,CyberReboot/vent-plugins,CyberReboot/vent-plugins,CyberReboot/vent-plugins |
b17afe362b4db0e5988792d0b340d5460ecb3546 | .github/workflows/test.yml | .github/workflows/test.yml | name: Test
on:
pull_request:
push:
branches:
- master
- 'renovate/*'
jobs:
build:
runs-on: ubuntu-18.04
if: github.event_name == 'push' || (github.event_name == 'pull_request' && !startsWith(github.head_ref, 'renovate/'))
strategy:
matrix:
node: [ 10, 12, 14 ]
env:
FORCE_COLOR: 1
name: Node ${{ matrix.node }}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v1
with:
node-version: ${{ matrix.node }}
- run: yarn
- run: yarn test
| name: Test
on:
pull_request:
push:
branches:
- master
- 'renovate/*'
jobs:
build:
runs-on: ubuntu-18.04
if: github.event_name == 'push' || (github.event_name == 'pull_request' && !startsWith(github.head_ref, 'renovate/'))
strategy:
matrix:
node: [ 10, 12, 14 ]
env:
FORCE_COLOR: 1
name: Node ${{ matrix.node }}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-node@v2
with:
node-version: ${{ matrix.node }}
- run: yarn
- run: yarn test
| Update actions/setup-node action to v2 | Update actions/setup-node action to v2
| YAML | mit | TryGhost/Ignition |
038ee14d150afedfcfab61c69bb99d0cd00a2fae | .github/workflows/test.yml | .github/workflows/test.yml | name: Test
on: [push, pull_request]
jobs:
test:
name: Test on ruby ${{ matrix.ruby_version }} and prawn ${{ matrix.prawn_version }}
runs-on: ubuntu-latest
strategy:
matrix:
ruby_version:
- 2.5
- 2.6
- 2.7
- 3.0
prawn_version:
- 2.3
- 2.4
exclude:
- ruby_version: 3.0
prawn_version: 2.3
steps:
- uses: actions/checkout@v2.3.4
- name: Set up diff-pdf
run: |
sudo apt-get update
sudo apt-get install make automake g++ libpoppler-glib-dev poppler-utils libwxgtk3.0-dev
git clone https://github.com/vslavik/diff-pdf.git -b v0.4.1 --depth 1 /tmp/diff-pdf-src
cd /tmp/diff-pdf-src
./bootstrap && ./configure && make && sudo make install
- name: Set up Ruby ${{ matrix.ruby_version }}
uses: ruby/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby_version }}
- name: Install dependencies
run: |
gem install bundler
bundle install --gemfile gemfiles/prawn-${{ matrix.prawn_version }}.gemfile --jobs 4 --retry 3
- name: Run Tests
run: bundle exec rake test
| name: Test
on: [push, pull_request]
jobs:
test:
name: Test on ruby ${{ matrix.ruby_version }} and prawn ${{ matrix.prawn_version }}
runs-on: ubuntu-latest
strategy:
matrix:
ruby_version:
- 2.5
- 2.6
- 2.7
- 3.0
prawn_version:
- 2.3
- 2.4
exclude:
- ruby_version: 3.0
prawn_version: 2.3
steps:
- uses: actions/checkout@v2
- name: Set up diff-pdf
run: |
sudo apt-get update
sudo apt-get install make automake g++ libpoppler-glib-dev poppler-utils libwxgtk3.0-dev
git clone https://github.com/vslavik/diff-pdf.git -b v0.4.1 --depth 1 /tmp/diff-pdf-src
cd /tmp/diff-pdf-src
./bootstrap && ./configure && make && sudo make install
- name: Set up Ruby ${{ matrix.ruby_version }}
uses: ruby/setup-ruby@v1
with:
ruby-version: ${{ matrix.ruby_version }}
- name: Install dependencies
run: |
gem install bundler
bundle install --gemfile gemfiles/prawn-${{ matrix.prawn_version }}.gemfile --jobs 4 --retry 3
- name: Run Tests
run: bundle exec rake test
| Allow to use the major version of the v2 | Allow to use the major version of the v2
| YAML | mit | hidakatsuya/prawn-emoji |
104bc2b15ec0d608aa97dd56f9416e9eafcc26a4 | skeleton/.travis.yml | skeleton/.travis.yml | ---
# Verify this with: http://lint.travis-ci.org/
language: ruby
before_install: rm Gemfile.lock
rvm:
- 1.8.7
- 1.9.3
script: bundle exec rake spec
env:
matrix:
- PUPPET_VERSION=2.7.23
- PUPPET_VERSION=3.1.1
| ---
# Verify this with: http://lint.travis-ci.org/
language: ruby
before_install: rm Gemfile.lock
rvm:
- 1.8.7
- 1.9.3
script: bundle exec rake spec
env:
matrix:
- PUPPET_VERSION="~> 2.7.0"
- PUPPET_VERSION="~> 3.1.0"
- PUPPET_VERSION="~> 3.2.0"
| Test against each Puppet series | Test against each Puppet series
Will pick the latest version from 2.7.x, 3.1.x and 3.2.x
I'm skipping 3.0.x as it was the bratty child of the 3.x series and doesn't
appear to have received as much adoption. We also experienced some real
problems with testing frameworks on 3.x at the time.
| YAML | mit | gds-operations/puppet-module-skeleton,gds-operations/puppet-module-skeleton,gds-operations/puppet-module-skeleton |
440137dbd8ee171766cef4d11333068114515fda | install.conf.yaml | install.conf.yaml | - clean: ['~']
- link:
~/.vim:
path: 'files/.vim/'
force: true
~/.vimrc:
path: 'files/.vimrc'
force: true
~/.gvimrc:
path: 'files/.gvimrc'
force: true
~/.bash_profile:
path: 'files/.bash_profile'
force: true
~/.gitconfig:
path: 'files/.gitconfig'
force: true
~/.gitignore_global:
path: 'files/.gitignore_global'
force: true
~/.zshrc:
path: 'files/.zshrc'
force: true
~/.npmrc:
path: 'files/.npmrc'
force: true
~/.gnupg/gpg.conf:
path: 'files/.gnupg/gpg.conf'
force: true
~/.tmux.conf:
path: 'files/.tmux.conf'
force: true
~/.oh-my-zsh/custom:
path: 'files/.oh-my-zsh/custom'
force: true
~/.XCompose:
path: 'files/.XCompose'
force: true
- shell:
- [git submodule update --init --recursive, Installing submodules]
- [bash scripts/osx.sh]
- [pip install --user powerline-status]
- [bash scripts/install_powerline_fonts.sh]
- [mkdir ~/npm]
- [mkdir ~/bin]
- [mkdir ~/.gnupg]
- [bash scripts/install_ack.sh]
| - clean: ['~']
- link:
~/.vim:
path: 'files/.vim/'
force: true
~/.vimrc:
path: 'files/.vimrc'
force: true
~/.gvimrc:
path: 'files/.gvimrc'
force: true
~/.bash_profile:
path: 'files/.bash_profile'
force: true
~/.gitconfig:
path: 'files/.gitconfig'
force: true
~/.gitignore_global:
path: 'files/.gitignore_global'
force: true
~/.zshrc:
path: 'files/.zshrc'
force: true
~/.npmrc:
path: 'files/.npmrc'
force: true
~/.gnupg/gpg.conf:
path: 'files/.gnupg/gpg.conf'
force: true
~/.tmux.conf:
path: 'files/.tmux.conf'
force: true
~/.XCompose:
path: 'files/.XCompose'
force: true
~/.oh-my-zsh/custom/themes/:
glob: true
path: files/.oh-my-zsh/custom/themes/*
force: true
- shell:
- [git submodule update --init --recursive, Installing submodules]
- [bash scripts/osx.sh]
- [pip install --user powerline-status]
- [bash scripts/install_powerline_fonts.sh]
- [mkdir ~/npm]
- [mkdir ~/bin]
- [mkdir ~/.gnupg]
- [bash scripts/install_ack.sh]
| Change how the custom oh-my-zsh themes are installed | Change how the custom oh-my-zsh themes are installed
Before this, the installation would remove the entire custom/themes folder in oh-my-zsh and this would make the update process fail (because git would say it can't update due to changes in the local repo). Now the installation creates individual links to any custom theme without removing the existing folder
| YAML | mit | davialexandre/dotfiles,davialexandre/dotfiles,davialexandre/dotfiles,davialexandre/dotfiles |
1e563459288936dd3e81f31bc80614c152e91b48 | .circleci/config.yml | .circleci/config.yml | version: 2
jobs:
build:
docker:
- image: circleci/node:8@sha256:ea6a1dfa308a6af75b880f1db86da4cf78955e68d1a56dd2e64280fc546665a0
- image: circleci/mongo:3@sha256:04a81e1cd52345ebf5886874621d47f32662810213ef327532bf2aa63ae86dc6
steps:
- checkout
- run: yarn install --frozen-lockfile
- run: yarn test
- run: yarn codecov
| version: 2
jobs:
build:
docker:
- image: circleci/node:8@sha256:ea6a1dfa308a6af75b880f1db86da4cf78955e68d1a56dd2e64280fc546665a0
- image: circleci/mongo:3@sha256:1b30c31e8c3932064af462d101b5c114d94a2f9e3341391a681f5f1d1aa301c6
steps:
- checkout
- run: yarn install --frozen-lockfile
- run: yarn test
- run: yarn codecov
| Update circleci/mongo:3 Docker digest to 1b30c3 | Update circleci/mongo:3 Docker digest to 1b30c3 | YAML | mit | js-accounts/accounts |
aa4ea7b3e9950816c5515b3f82fa55550ae2b779 | wolfia.example.yaml | wolfia.example.yaml | ---
wolfia:
debug: true # Some things are different. Set to false to properly run the bot.
discordToken: "" # Discord bot token
logChannelId: 0 # Id of a channel where to post general bot activity like games starting and ending.
database:
jdbcUrl: "" # Postgres database. When running with the bundled docker-compose file, set to jdbc:postgresql://db:5432/wolfia?user=wolfia
server:
port: 4567 # Port of Wolfia's API.
sentry:
dsn: "" # Error aggregation service. See https://sentry.io
spring:
output:
ansi:
enabled: always
logging:
file: './logs/wolfia.log'
file.max-history: 30
file.max-size: 1GB
level:
root: INFO
space.npstr: DEBUG
| ---
wolfia:
debug: true # Some things are different. Set to false to properly run the bot.
discordToken: "" # Discord bot token
logChannelId: 0 # Id of a channel where to post general bot activity like games starting and ending.
database:
jdbcUrl: "" # Postgres database. When running with the bundled docker-compose file, set to jdbc:postgresql://db:5432/wolfia?user=wolfia
server:
port: 4567 # Port of Wolfia's API.
sentry:
dsn: "" # Error aggregation service. See https://sentry.io
spring:
output:
ansi:
enabled: always
logging:
file:
name: './logs/wolfia.log'
max-history: 30
max-size: 1GB
level:
root: INFO
space.npstr: DEBUG
| Update log file name config | Update log file name config
| YAML | agpl-3.0 | napstr/wolfia,napstr/wolfia,napstr/wolfia |
f6ff962da03be424095ba1ddc5182772a51e6146 | hieradata/nodes/bgo/bgo-mgmt-01.yaml | hieradata/nodes/bgo/bgo-mgmt-01.yaml | ---
network::interfaces_hash:
'eth0':
'ipaddress': '172.16.0.5/21'
'post_up': [ '/etc/network/if-up.d/z90-route-eth0', ]
'post_down': [ '/etc/network/if-down.d/z90-route-eth0', ]
# Add default route for management VRF
profile::base::network::routes:
'eth0':
'ipaddress': [ '0.0.0.0', ]
'netmask': [ '0.0.0.0', ]
'gateway': [ "%{hiera('netcfg_mgmt_gateway')}", ]
'table': [ 'mgmt', ]
| ---
network::interfaces_hash:
'eth0':
'ipaddress': '172.16.0.5/21'
'gateway': "%{hiera('netcfg_mgmt_gateway')}"
'vrf': 'mgmt'
'mgmt':
'ipaddress': '127.0.0.1/8'
'vrf_table': 'auto'
| Reconfigure network for mgmt node | Reconfigure network for mgmt node
| YAML | apache-2.0 | norcams/himlar,mikaeld66/himlar,raykrist/himlar,tanzr/himlar,raykrist/himlar,norcams/himlar,tanzr/himlar,raykrist/himlar,TorLdre/himlar,raykrist/himlar,norcams/himlar,mikaeld66/himlar,TorLdre/himlar,tanzr/himlar,mikaeld66/himlar,tanzr/himlar,raykrist/himlar,mikaeld66/himlar,TorLdre/himlar,tanzr/himlar,TorLdre/himlar,TorLdre/himlar,norcams/himlar,norcams/himlar,mikaeld66/himlar |
fb78695e46b022e104bbb9d02fa25444db35a04a | .pre-commit-config.yaml | .pre-commit-config.yaml | repos:
- repo: https://github.com/psf/black
rev: 20.8b1
hooks:
- id: black
language_version: python3
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.790
hooks:
- id: mypy
- repo: https://gitlab.com/pycqa/flake8
rev: 3.8.4
hooks:
- id: flake8
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.3.0
hooks:
- id: check-toml
- id: check-yaml
- repo: https://github.com/PyCQA/isort
rev: 5.6.4
hooks:
- id: isort
additional_dependencies: [toml]
- repo: https://github.com/myint/docformatter
rev: v1.3.1
hooks:
- id: docformatter
args: [--in-place]
- repo: git://github.com/detailyang/pre-commit-shell
rev: 1.0.5
hooks:
- id: shell-lint
args: [-x]
| repos:
- repo: https://github.com/psf/black
rev: 20.8b1
hooks:
- id: black
language_version: python3
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.790
hooks:
- id: mypy
- repo: https://gitlab.com/pycqa/flake8
rev: 3.8.4
hooks:
- id: flake8
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.3.0
hooks:
- id: check-toml
- id: check-yaml
- repo: https://github.com/PyCQA/isort
rev: 5.6.4
hooks:
- id: isort
additional_dependencies: [toml]
- repo: https://github.com/myint/docformatter
rev: v1.3.1
hooks:
- id: docformatter
args: [--in-place]
- repo: https://github.com/shellcheck-py/shellcheck-py
rev: v0.7.1.1
hooks:
- id: shellcheck
| Replace shell-lint with shellcheck for pre-commit-ci | Replace shell-lint with shellcheck for pre-commit-ci
Signed-off-by: Dan Yeaw <2591e5f46f28d303f9dc027d475a5c60d8dea17a@yeaw.me>
| YAML | lgpl-2.1 | amolenaar/gaphor,amolenaar/gaphor |
001672806acedc1301e489154d50e9ed56fb297f | .circleci/config.yml | .circleci/config.yml | # Javascript Node CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-javascript/ for more details
#
version: 2
jobs:
test:
docker:
# specify the version you desire here
- image: circleci/node:12.13
# Specify service dependencies here if necessary
# CircleCI maintains a library of pre-built images
# documented at https://circleci.com/docs/2.0/circleci-images/
# - image: circleci/mongo:3.4.4
working_directory: ~/repo
steps:
- checkout
# Download and cache dependencies
- restore_cache:
keys:
- dependencies-{{ checksum "package-lock.json" }}
# fallback to using the latest cache if no exact match is found
- dependencies-
- run:
name: Install
command: npm ci
- save_cache:
paths:
- node_modules
key: dependencies-{{ checksum "package-lock.json" }}
# run tests!
- run:
name: Test
command: npm test
deploy:
docker:
- image: circleci/node:12.13
steps:
- checkout
- run:
name: Build
command: npm run build
- run:
name: Deploy
command: ./scripts/deploy
workflows:
version: 2
test-build-deploy:
jobs:
- test:
filters:
branches:
ignore:
- gh-pages
- deploy:
requires:
- test
filters:
branches:
only: master
| # Javascript Node CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-javascript/ for more details
#
version: 2
jobs:
test:
docker:
# specify the version you desire here
- image: circleci/node:12.13
# Specify service dependencies here if necessary
# CircleCI maintains a library of pre-built images
# documented at https://circleci.com/docs/2.0/circleci-images/
# - image: circleci/mongo:3.4.4
working_directory: ~/repo
steps:
- checkout
# Download and cache dependencies
- restore_cache:
keys:
- dependencies-{{ checksum "package-lock.json" }}
# fallback to using the latest cache if no exact match is found
- dependencies-
- run:
name: Install
command: npm ci
- save_cache:
paths:
- node_modules
key: dependencies-{{ checksum "package-lock.json" }}
# run tests!
- run:
name: Test
command: npm test
- persist_to_workspace:
root: .
paths:
- node_modules
deploy:
docker:
- image: circleci/node:12.13
steps:
- checkout
- attach_workspace:
at: .
- run:
name: Build
command: npm run build
- run:
name: Deploy
command: ./scripts/deploy
workflows:
version: 2
test-build-deploy:
jobs:
- test:
filters:
branches:
ignore:
- gh-pages
- deploy:
requires:
- test
filters:
branches:
only: master
Attach node_modules to workspace for deployment | Attach node_modules to workspace for deployment
| YAML | mit | JakeSidSmith/slik,JakeSidSmith/flo-js,JakeSidSmith/slik,JakeSidSmith/flo-js,JakeSidSmith/flo-js |
dbe29f18259aa32f6c562fee8f21136b3eafeffd | .pre-commit-config.yaml | .pre-commit-config.yaml | # See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.2.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-added-large-files
- repo: https://github.com/digitalpulp/pre-commit-php.git
rev: 1.4.0
hooks:
- id: php-lint
files: \.(php)$
exclude: |
(?x)^(
(.*blade\.php)|
(.*public\/index.php)|
(.*resources\/lang.*)
)$
- id: php-cbf
files: \.(php)$
exclude: |
(?x)^(
(.*blade\.php)|
(.*public\/index.php)|
(.*resources\/lang.*)
)$
args: [--standard=PSR12 -p]
- id: php-cs
files: \.(php)$
exclude: |
(?x)^(
(.*blade\.php)|
(.*public\/index.php)|
(.*resources\/lang.*)
)$
args: [--standard=PSR12 -p]
| # See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.2.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-added-large-files
- repo: https://github.com/digitalpulp/pre-commit-php.git
rev: 1.4.0
hooks:
- id: php-lint
files: \.(php)$
exclude: |
(?x)^(
(.*blade\.php)|
(.*public\/index.php)|
(.*resources\/lang.*)
)$
- id: php-cbf
files: \.(php)$
exclude: |
(?x)^(
(.*blade\.php)|
(.*public\/index.php)|
(.*resources\/lang.*)
)$
args: [--standard=PSR12 -p]
- id: php-cs
files: \.(php)$
exclude: |
(?x)^(
(.*blade\.php)|
(.*public\/index.php)|
(.*resources\/lang.*)
)$
args: [--standard=PSR12 -p]
- repo: https://github.com/hadolint/hadolint
rev: master
hooks:
- id: hadolint-docker
| Add hadolint to lint Dockerfile | Add hadolint to lint Dockerfile
| YAML | agpl-3.0 | zeropingheroes/lanager,zeropingheroes/lanager |
cd703247adcf2be82a7a3d2c3a6f824312fb9afe | config/locales/en.yml | config/locales/en.yml | en:
xmpp_label_settings: XMPP Notifications settings
xmpp_label_jid: Jabber ID (jid@server.tld/resource)
xmpp_label_password: Password
xmpp_issue_created: Issue was created:
xmpp_issue_updated: Issue was updated:
xmpp_update_author: Author of changes
| en:
xmpp_label_settings: XMPP Notifications settings
xmpp_label_jid: Jabber ID (jid@server.tld/resource)
xmpp_label_password: Password
xmpp_issue_created: "Issue was created:"
xmpp_issue_updated: "Issue was updated:"
xmpp_update_author: Author of changes
| Fix translation braking (appears, at least, in Redmine 2.3) | Fix translation braking (appears, at least, in Redmine 2.3)
Signed-off-by: Vadim A. Misbakh-Soloviov <0317e6896132b03afd1b7513bbabbb42f0b8d888@mva.name>
| YAML | mit | iprok/redmine_xmpp_notifications,redmine-xmpp/notifications,redmine-xmpp/notifications,iprok/redmine_xmpp_notifications |
cc7337b61afd6b2a211a40729b8e927c27ef6eb2 | roles/docker_host/tasks/main.yml | roles/docker_host/tasks/main.yml | - name: install docker
apt: name=docker.io state=latest
notify: docker from experimental
- name: create docker fs
filesystem:
dev: "{{ docker_dev }}"
fstype: btrfs
- name: create directory for ansible custom facts
file: state=directory recurse=yes path=/etc/ansible/facts.d
- name: create fact for docker fs blkid
template:
src: etc/ansible/facts.d/docker_blkid.fact
dest: /etc/ansible/facts.d/docker_blkid.fact
mode: 500
- name: re-read facts after adding custom fact
setup: filter=ansible_local
- name: mount docker fs
mount:
src: UUID={{ ansible_local.docker_blkid }}
name: /var/lib/docker
fstype: btrfs
state: mounted
notify: restart docker
| - name: install docker
apt: name=docker.io state=latest
notify: docker from experimental
- name: create docker fs
filesystem:
dev: "{{ docker_dev }}"
fstype: btrfs
- name: create directory for ansible custom facts
file: state=directory recurse=yes path=/etc/ansible/facts.d
- name: create fact for docker fs blkid
template:
src: etc/ansible/facts.d/docker_blkid.fact
dest: /etc/ansible/facts.d/docker_blkid.fact
mode: 500
- name: re-read facts after adding custom fact
setup: filter=ansible_local
- name: mount docker fs
mount:
src: UUID={{ ansible_local.docker_blkid }}
name: /var/lib/docker
fstype: btrfs
state: mounted
notify: restart docker
- name: install python-docker
apt: name=python-docker state=latest
- name: generate container config dir
file: state=directory recurse=yes path=/etc/container_environment
| Prepare docker installation for further uses | Prepare docker installation for further uses
Signed-off-by: Jan Losinski <577c4104c61edf9f052c616c0c23e67bef4a9955@wh2.tu-dresden.de>
| YAML | bsd-3-clause | janLo/ansible-playground,janLo/ansible-playground,janLo/ansible-playground |
07990457993ca3c1dcf39351940c6067021be3de | tasks/install-apt.yml | tasks/install-apt.yml | ---
# Copyright 2016, Rackspace US, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#TODO(evrardjp): Replace the next 2 tasks by a standard apt with cache
#when https://github.com/ansible/ansible-modules-core/pull/1517 is merged
#in 1.9.x or we move to 2.0 (if tested working)
- name: Check apt last update file
stat:
path: /var/cache/apt
register: apt_cache_stat
- name: Update apt if needed
apt:
update_cache: yes
when: "ansible_date_time.epoch|float - apt_cache_stat.stat.mtime > {{cache_timeout}}"
- name: Install apt packages
apt:
pkg: "{{ item }}"
state: "{{ memcached_package_state }}"
register: install_packages
until: install_packages|success
retries: 5
delay: 2
with_items: "{{ memcached_distro_packages }}"
- name: Install apt packages for testing
apt:
pkg: "{{ item }}"
state: "{{ memcached_package_state }}"
register: install_test_packages
until: install_test_packages|success
retries: 5
delay: 2
with_items: "{{ memcached_test_distro_packages }}"
when: install_test_packages|bool
| ---
# Copyright 2016, Rackspace US, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
- name: Install apt packages
apt:
pkg: "{{ item }}"
state: "{{ memcached_package_state }}"
update_cache: yes
cache_valid_time: "{{ cache_timeout }}"
register: install_packages
until: install_packages|success
retries: 5
delay: 2
with_items: "{{ memcached_distro_packages }}"
- name: Install apt packages for testing
apt:
pkg: "{{ item }}"
state: "{{ memcached_package_state }}"
register: install_test_packages
until: install_test_packages|success
retries: 5
delay: 2
with_items: "{{ memcached_test_distro_packages }}"
when: install_test_packages|bool
| Remove ansible<2.2 apt cache hack | Remove ansible<2.2 apt cache hack
Now ansible apt module correctly behaves, so it's time
to deprecate these cruft tasks for apt.
Change-Id: I59c36e11fafa4842e1dde78c0afa0f757d5ba670
| YAML | apache-2.0 | openstack/openstack-ansible-memcached_server,openstack/openstack-ansible-memcached_server |
16bb633da12f4f301ece6b667a024fb0ab33b5a3 | etc/log4j2_java.yaml | etc/log4j2_java.yaml | configuration:
status: warn
name: Odenos Logging
packages: com.datastax.logging.appender
appenders:
Console:
name: Console
PatternLayout:
Pattern: "%d{HH:mm:ss.SSS} %-5level %logger{2} %X - %msg%n"
ThresholdFilter:
level: error
File:
name: File
fileName: "${sys:log4j2_app.log}"
PatternLayout:
Pattern: "%d %-5level [%t] %logger{36}.%M %X - %msg%n"
Syslog:
name: Syslog
Host: "localhost"
Protocol: UDP
Port: 514
Facility: "LOCAL1"
Format: "RFC5424"
mdcId: "mdc"
includeMDC: "true"
LoggerFields:
KeyValuePair: !!pairs
- key: "thread"
value: "%t"
- key: "method"
value: "%c.%M"
# CassandraAppender:
# name: CassandraAppender
# ThresholdFilter:
# level: info
loggers:
root:
level: info
AppenderRef:
# - ref: Console
- ref: File
# - ref: Syslog
# - ref: CassandraAppender
| configuration:
status: warn
name: Odenos Logging
packages: com.datastax.logging.appender
appenders:
Console:
name: Console
PatternLayout:
Pattern: "%d{HH:mm:ss.SSS} %-5level %logger{2} txid: %replace{%X{txid}}{^$}{-} - %msg%n"
ThresholdFilter:
level: error
File:
name: File
fileName: "${sys:log4j2_app.log}"
PatternLayout:
Pattern: "%d %-5level [%t] %logger{36}.%M txid: %replace{%X{txid}}{^$}{-} - %msg%n"
Syslog:
name: Syslog
Host: "localhost"
Protocol: UDP
Port: 514
Facility: "LOCAL1"
Format: "RFC5424"
mdcId: "mdc"
includeMDC: "true"
LoggerFields:
KeyValuePair: !!pairs
- key: "thread"
value: "%t"
- key: "method"
value: "%c.%M"
# CassandraAppender:
# name: CassandraAppender
# ThresholdFilter:
# level: info
loggers:
root:
level: info
AppenderRef:
# - ref: Console
- ref: File
# - ref: Syslog
# - ref: CassandraAppender
| Change PatternLayout in stdout, file. | Change PatternLayout in stdout, file.
| YAML | apache-2.0 | nis-sdn/odenos,haizawa/odenos,haizawa/odenos,o3project/odenos,nis-sdn/odenos,o3project/odenos,nis-sdn/odenos,haizawa/odenos,nis-sdn/odenos,nis-sdn/odenos,o3project/odenos,o3project/odenos,haizawa/odenos,o3project/odenos,haizawa/odenos |
44206fc0b1356317413b6dc6256934945482ee88 | stanford/lagunita/worker1.yml | stanford/lagunita/worker1.yml | #!/usr/bin/env ansible-playbook
---
- include: ../playbooks/worker.yml
vars:
CLUSTER_INSTANCE_TYPE: r4.xlarge
CLUSTER_NUMBER: 1
COMMON_DEPLOYMENT: lagunita
edx_platform_version: master
EDXAPP_COMPREHENSIVE_THEME_VERSION: master
| #!/usr/bin/env ansible-playbook
---
- include: ../playbooks/worker.yml
vars:
CLUSTER_INSTANCE_TYPE: r4.xlarge
CLUSTER_NUMBER: 1
COMMON_DEPLOYMENT: lagunita
| Remove lagunita worker version overrides | Remove lagunita worker version overrides
| YAML | agpl-3.0 | Stanford-Online/configuration,Stanford-Online/configuration,Stanford-Online/configuration,Stanford-Online/configuration,Stanford-Online/configuration |
9fe21020e037a72bf9d1ed2133e84a81b0893053 | contrib/flavor/ngs_pipeline_minimal/packages-homebrew.yaml | contrib/flavor/ngs_pipeline_minimal/packages-homebrew.yaml | # Packages available in the Homebrew and Linuxbrew package manager
---
minimal:
- cmake
- s3gof3r
perl:
- cpanminus
bio_nextgen:
alignment:
- bwa
- bwakit
- bowtie2
- novoalign
- rna-star
# - snap-aligner # Needs sync with homebrew-cbl/homebrew-science and compile on older gcc
- tophat-binary
utilities:
- bamtools
- bedtools
- cramtools
- biobambam
- express-binary
- fastqc
- grabix
- picard-tools
- qualimap
- kraken
- sambamba-binary
- samblaster
- seqtk
- speedseq
- staden_io_lib
- oncofuse
- sickle
analysis:
- cufflinks-binary
- samtools;--without-curses
- htslib
- bcftools
variant:
- bcbio-variation-recall
- delly;--with-binary
- freebayes
- gatk-framework
#- glia # Needs updates to fsom
- hall-lab-sv-tools
- lofreq
- lumpy-sv
- pindel
- platypus-variant
- rtg-tools
- scalpel
- snpeff
- theta2
- vardict
- vardict-java
- vcflib
- vep
- wham;--with-binary
- qsignature
- qsnp
# - vcftools # Build problems with default libz on Ubuntu 12.04
- vt;--with-binary
| # Packages available in the Homebrew and Linuxbrew package manager
---
minimal:
- cmake
- s3gof3r
perl:
- cpanminus
bio_nextgen:
alignment:
- bwa
- bwakit
- bowtie2
- novoalign
- rna-star
# - snap-aligner # Needs sync with homebrew-cbl/homebrew-science and compile on older gcc
- tophat-binary
utilities:
- bamtools-nodeps
- bedtools
- cramtools
- biobambam
- express-binary
- fastqc
- grabix
- picard-tools
- qualimap
- kraken
- sambamba-binary
- samblaster
- seqtk
- speedseq
- staden_io_lib
- oncofuse
- sickle
analysis:
- cufflinks-binary
- samtools;--without-curses
- htslib
- bcftools
variant:
- bcbio-variation-recall
- delly;--with-binary
- freebayes
- gatk-framework
#- glia # Needs updates to fsom
- hall-lab-sv-tools
- lofreq
- lumpy-sv
- pindel
- platypus-variant
- rtg-tools
- scalpel
- snpeff
- theta2
- vardict
- vardict-java
- vcflib
- vep
- wham;--with-binary
- qsignature
- qsnp
# - vcftools # Build problems with default libz on Ubuntu 12.04
- vt;--with-binary
| Use bamtools build without explicit cmake dependency causing build issues | Use bamtools build without explicit cmake dependency causing build issues
| YAML | mit | heuermh/cloudbiolinux,AICIDNN/cloudbiolinux,chapmanb/cloudbiolinux,joemphilips/cloudbiolinux,heuermh/cloudbiolinux,pjotrp/cloudbiolinux,kdaily/cloudbiolinux,averagehat/cloudbiolinux,elkingtonmcb/cloudbiolinux,heuermh/cloudbiolinux,averagehat/cloudbiolinux,elkingtonmcb/cloudbiolinux,AICIDNN/cloudbiolinux,joemphilips/cloudbiolinux,chapmanb/cloudbiolinux,rchekaluk/cloudbiolinux,averagehat/cloudbiolinux,kdaily/cloudbiolinux,elkingtonmcb/cloudbiolinux,kdaily/cloudbiolinux,elkingtonmcb/cloudbiolinux,averagehat/cloudbiolinux,chapmanb/cloudbiolinux,joemphilips/cloudbiolinux,heuermh/cloudbiolinux,rchekaluk/cloudbiolinux,kdaily/cloudbiolinux,pjotrp/cloudbiolinux,pjotrp/cloudbiolinux,rchekaluk/cloudbiolinux,joemphilips/cloudbiolinux,rchekaluk/cloudbiolinux,pjotrp/cloudbiolinux,AICIDNN/cloudbiolinux,AICIDNN/cloudbiolinux,chapmanb/cloudbiolinux |
df5db9528327e2b93fb37fc1ce12d851fb0f1585 | config/govuk_index/migrated_formats.yaml | config/govuk_index/migrated_formats.yaml | migrated:
# Contacts
- contact
# Publisher
- answer
- guide
- help_page
- licence
- local_transaction
- place
- transaction
- simple_smart_answer
# Specialist Publisher
- aaib_report
- asylum_support_decision
- business_finance_support_scheme
- cma_case
- countryside_stewardship_grant
- dfid_research_output
- drug_safety_update
- employment_appeal_tribunal_decision
- employment_tribunal_decision
- european_structural_investment_fund # use Rummager name as mapping occurs before validation check
- international_development_fund
- maib_report
- medical_safety_alert
- raib_report
- service_standard_report
- tax_tribunal_decision
- utaac_decision
# Other
- hmrc_manual
- hmrc_manual_section
- manual
- manual_section
- task_list
indexable:
- service_manual_guide
- service_manual_topic
- travel_advice
| migrated:
# Contacts
- contact
# Publisher
- answer
- guide
- help_page
- licence
- local_transaction
- place
- transaction
- simple_smart_answer
# Specialist Publisher
- aaib_report
- asylum_support_decision
- business_finance_support_scheme
- cma_case
- countryside_stewardship_grant
- dfid_research_output
- drug_safety_update
- employment_appeal_tribunal_decision
- employment_tribunal_decision
- european_structural_investment_fund # use Rummager name as mapping occurs before validation check
- international_development_fund
- maib_report
- medical_safety_alert
- raib_report
- service_standard_report
- tax_tribunal_decision
- utaac_decision
# Other
- hmrc_manual
- hmrc_manual_section
- manual
- manual_section
- task_list
indexable:
- service_manual_guide
- service_manual_homepage
- service_manual_service_standard
- service_manual_topic
- travel_advice
| Index additional service manual formats | Index additional service manual formats
These two formats should also have been part of the previous [PR](https://github.com/alphagov/rummager/pull/1024)
| YAML | mit | alphagov/rummager,alphagov/rummager |
70cd4b2ea3c6c380ed700c587fe6bde8a3008dc7 | .github/workflows/run-tests.yml | .github/workflows/run-tests.yml | name: Run Tests
on:
push:
pull_request:
jobs:
phpunit:
runs-on: ubuntu-latest
name: PHPUnit test suite
services:
postgres:
image: postgres:12
env:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
POSTGRES_DB: jbuktest
ports:
- 5432:5432
steps:
- uses: actions/checkout@v2
- name: Copy .env
run: php -r "file_exists('.env') || copy('.env.github', '.env');"
- name: Install dependencies
run: composer install -q --no-ansi --no-interaction --no-progress
- name: Generate key
run: php artisan key:generate
- name: Setup directory permissions
run: chmod -R 777 storage bootstrap/cache
- name: Setup test database
run: |
php artisan migrate
php artisan db:seed
- name: Execute tests (Unit and Feature tests) via PHPUnit
run: vendor/bin/phpunit
| name: Run Tests
on:
push:
pull_request:
jobs:
phpunit:
runs-on: ubuntu-latest
name: PHPUnit test suite
services:
postgres:
image: postgres:12
env:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
POSTGRES_DB: jbuktest
ports:
- 5432:5432
steps:
- uses: actions/checkout@v2
- name: Cache node modules
uses: actions/cache@v2
with:
path: ~/.npm
key: ${{ runner.os }}-${{ hashFiles('**/package.json') }}
- name: Install npm dependencies
run: npm install
- name: Copy .env
run: php -r "file_exists('.env') || copy('.env.github', '.env');"
- name: Install dependencies
run: composer install -q --no-ansi --no-interaction --no-progress
- name: Generate key
run: php artisan key:generate
- name: Setup directory permissions
run: chmod -R 777 storage bootstrap/cache
- name: Setup test database
run: |
php artisan migrate
php artisan db:seed
- name: Execute tests (Unit and Feature tests) via PHPUnit
run: vendor/bin/phpunit
| Make sure puppeteer is installed for actions | Make sure puppeteer is installed for actions
| YAML | cc0-1.0 | jonnybarnes/jonnybarnes.uk,jonnybarnes/jonnybarnes.uk,jonnybarnes/jonnybarnes.uk |