Columns: Instruction (string, 14–778 chars) · input_code (string, 0–4.24k chars) · output_code (string, 1–5.44k chars)
Update from Hackage at 2018-09-15T10:13:05Z
homepage: '' changelog-type: markdown hash: 1cafbf3f1c63a62fc42e3283ec08c89a10988be61f2bcce9e9b6458a0276e502 test-bench-deps: {} maintainer: fresheyeball@protonmail.com synopsis: Sets of fixed size, with typelits changelog: ! '# Revision history for set-of ## 0.1.0.0 -- YYYY-mm-dd * First version. Released on an unsuspecting world. ' basic-deps: base: ! '>=4.9 && <4.12' containers: -any all-versions: - '0.1.0.0' author: Isaac Shapira latest: '0.1.0.0' description-type: haddock description: There are use cases for sets of a fixed size. This package aims to make that ergonomic. license-name: BSD3
Check in lobby register schema.
# EC/EP register of interests - name: representative label: Representative in the EC/EP register of interests obj: entity attributes: - name: reg_identifier label: Identifier in the register - name: reg_legal_status label: Legal status - name: reg_entry_status label: Status of registration - name: reg_activities label: Lobbying activities - name: reg_goals label: Goals - name: reg_networking label: Networking activities - name: registration_date label: Registration date - name: main_category label: Main category - name: sub_category label: Subcategory - name: reg_role label: A person's role obj: relation attributes: - name: role label: Role - name: position label: Position - name: role_start_date label: Start date - name: role_end_date label: End date - name: reg_membership label: An organisation's involvement obj: relation attributes: [] - name: reg_turnover label: Turnover for Lobbying obj: relation attributes: - name: turnover_min label: Minimal amount - name: turnover_max label: Maximal amount
Set up CI with Azure Pipelines
# .NET Desktop # Build and run tests for .NET Desktop or Windows classic desktop solutions. # Add steps that publish symbols, save build artifacts, and more: # https://docs.microsoft.com/azure/devops/pipelines/apps/windows/dot-net trigger: - master pool: vmImage: 'default' variables: solution: '**/*.sln' buildPlatform: 'Any CPU' buildConfiguration: 'Release' steps: - task: NuGetToolInstaller@0 - task: NuGetCommand@2 inputs: restoreSolution: '$(solution)' - task: VSBuild@1 inputs: solution: '$(solution)' platform: '$(buildPlatform)' configuration: '$(buildConfiguration)' - task: VSTest@2 inputs: platform: '$(buildPlatform)' configuration: '$(buildConfiguration)'
Add greeting action for first interactions
name: Greetings on: [pull_request, issues] jobs: greeting: runs-on: ubuntu-latest steps: - uses: actions/first-interaction@v1 with: repo-token: ${{ secrets.GITHUB_TOKEN }} issue-message: 'Hello! The CSSBot project is only intended for use in the CSS Discord. Please keep that scope in mind when reporting bugs or requesting new features, as the bot is very much purpose-built for only this one server.' pr-message: 'Hello! Thank you for your contribution. Please remember that this project is only intended for use in the CSS Discord, and isn''t meant to be an all-purpose bot for outside of this context.'
Set up CI with Azure Pipelines
# ASP.NET # Build and test ASP.NET projects. # Add steps that publish symbols, save build artifacts, deploy, and more: # https://docs.microsoft.com/azure/devops/pipelines/apps/aspnet/build-aspnet-4 pool: vmImage: 'VS2017-Win2016' variables: solution: '**/*.sln' buildPlatform: 'Any CPU' buildConfiguration: 'Release' steps: - task: NuGetToolInstaller@0 - task: NuGetCommand@2 inputs: restoreSolution: '$(solution)' - task: VSBuild@1 inputs: solution: '$(solution)' msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactStagingDirectory)"' platform: '$(buildPlatform)' configuration: '$(buildConfiguration)' - task: VSTest@2 inputs: platform: '$(buildPlatform)' configuration: '$(buildConfiguration)'
Add initial version for jaeger-browser
{% set name = "jaeger-browser" %} {% set name_ = "jaeger_browser" %} {% set version = "1.0.1" %} package: name: {{ name|lower }} version: {{ version }} # https://files.pythonhosted.org/packages/f4/dc/76d789d1f3ab38dda369ad41242aa61cd139ebc94b89bc160e6474f1d057/jaeger_browser-1.0.1.tar.gz source: url: https://pypi.io/packages/source/{{ name_[0] }}/{{ name_ }}/{{ name_ }}-{{ version }}.tar.gz sha256: 75b019c8a3f66254fece0517ddae17064d1e14b90716270eac6039dd23af41d4 build: noarch: python number: 0 script: "{{ PYTHON }} -m pip install . -vv" requirements: host: - python - pip - flit run: - python - jaeger - yarn - opentracing - mypy_extensions - typing-extensions - jaeger-client - uvicorn - starlette test: commands: - test -x jaeger-all-in-one about: home: https://github.com/Quansight/jaeger-browser license: MIT license_family: MIT license_file: LICENSE summary: 'This repo is to help you submit Jaeger traces from your browser.' description: | This repo is to help you submit Jaeger traces from your browser. If you want to use this alongside your Jupyter server, check out jupyter-jaeger. There is an example in that repo of starting a span in a kernel server side and then continuing it on the client side. doc_url: https://github.com/Quansight/jaeger-browser dev_url: https://github.com/Quansight/jaeger-browser extra: recipe-maintainers: - xmnlab
Add conda forge recipe for torchgeo
{% set name = "torchgeo" %} {% set version = "0.1.0" %} package: name: {{ name|lower }} version: {{ version }} source: url: https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/torchgeo-{{ version }}.tar.gz sha256: 44eb3cf10ab2ac63ff95e92fcd3807096bac3dcb9bdfe15a8edac9d440d2f323 build: number: 0 noarch: python script: {{ PYTHON }} -m pip install . -vv requirements: host: - pip - python >=3.6 - setuptools >=42 run: - einops - fiona >=1.5 - kornia >=0.5.4 - matplotlib-base - numpy - omegaconf >=2.1 - pillow >=2.9 - pyproj >=2.2 - python >=3.6 - pytorch >=1.7 - pytorch-lightning >=1.3 - rasterio >=1.0.16 - rtree >=0.5 - scikit-learn >=0.18 - segmentation-models-pytorch >=0.2 - shapely >=1.3 - timm >=0.2.1 - torchmetrics - torchvision >=0.3 test: imports: - torchgeo - torchgeo.datasets - torchgeo.models - torchgeo.samplers - torchgeo.trainers - torchgeo.transforms commands: - pip check requires: - pip about: home: https://github.com/microsoft/torchgeo doc_url: https://torchgeo.readthedocs.io/ dev_url: https://github.com/microsoft/torchgeo license: MIT license_file: LICENSE summary: 'TorchGeo: datasets, transforms, and models for geospatial data' description: | TorchGeo is a PyTorch domain library, similar to torchvision, that provides datasets, transforms, samplers, and pre-trained models specific to geospatial data. The goal of this library is to make it simple 1. for machine learning experts to use geospatial data in their workflows, and 2. for remote sensing experts to use their data in machine learning workflows. extra: recipe-maintainers: - adamjstewart - calebrob6
Monitor system of another host
monit_services: - name: blog.entwicklerbier.org type: system rules: - "if memory usage > 90% for 3 cycles then alert" - "if swap usage > 70% for 3 cycles then alert" - "if cpu usage (user) > 70% then alert" - "if cpu usage (system) > 30% then alert" - "if cpu usage (wait) > 20% then alert" - name: / target: "{{ ansible_mounts[0].device }}" type: filesystem rules: - "if space usage > 80% for 8 cycles then alert"
Add Action to clean up tests
name: Clean PR checks on: workflow_dispatch: inputs: pr: description: PR to be cleaned required: true checks: description: Checks to be cleaned required: true default: 'build/O2/o2,build/AliceO2/O2/o2/macOS,build/O2/fullCI' owner: description: Organization required: true default: 'alisw' repo: description: Repository required: true default: 'AliPhysics' jobs: cleanup_pr_checks: runs-on: ubuntu-latest steps: - name: Set up Python 3.7 uses: actions/setup-python@v1 with: python-version: 3.7 - name: Install ali-bot run: | sudo apt-get update -y sudo apt-get install -y libsasl2-dev python-dev libldap2-dev libssl-dev python -m pip install --upgrade pip pip install git+https://github.com/alisw/ali-bot@master - uses: octokit/graphql-action@v2.x id: get_last_commit_for_pr with: query: | { repository(owner: "${{ github.event.inputs.owner }}", name: "${{ github.event.inputs.repo }}") { url pullRequest(number:${{ github.event.inputs.pr }}) { commits(last: 1) { nodes { commit { oid } } } } } } env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - name: Cleanup tests run: |- set -x cat <<\EOF > results.json ${{ steps.get_last_commit_for_pr.outputs.data }} EOF COMMIT=$(jq -r '.repository.pullRequest.commits.nodes[].commit.oid' results.json) echo $COMMIT for check in `echo ${{ github.event.inputs.checks }} | tr , \\\\n`; do set-github-status -c ${{ github.event.inputs.owner }}/${{ github.event.inputs.repo }}@$COMMIT -s $check/pending done env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
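The workflow above pipes a GraphQL response into `jq -r '.repository.pullRequest.commits.nodes[].commit.oid'` to pull out the last commit on the PR. A minimal Python sketch of that extraction, using an invented response (the repository URL and `oid` value are made up for illustration, not taken from the source):

```python
import json

# Hypothetical GraphQL response shaped like the query in the workflow;
# the oid here is invented for illustration.
data = json.loads("""
{
  "repository": {
    "url": "https://github.com/alisw/AliPhysics",
    "pullRequest": {
      "commits": {
        "nodes": [
          {"commit": {"oid": "abc123def456"}}
        ]
      }
    }
  }
}
""")

# Python equivalent of the workflow's
# jq -r '.repository.pullRequest.commits.nodes[].commit.oid'
oids = [node["commit"]["oid"]
        for node in data["repository"]["pullRequest"]["commits"]["nodes"]]
print(oids[0])  # the single (last) commit on the PR
```

Because the query asks for `commits(last: 1)`, the `nodes` list holds exactly one entry, so the jq stream (and this list) yields a single oid.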
Add basic profile post schema
title: Basic Profile type: object properties: name: description: Name to be displayed publicly. type: string website: description: The entity's primary website. type: string format: uri birthdate: description: Date of birth in `YYYY-MM-DD` format with omitted information set to all zero. type: string location: description: Physical location. type: string gender: description: Self-described gender. type: string bio: description: Biography/self-description. type: string
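The `birthdate` field above uses `YYYY-MM-DD` with omitted components set to all zeros. A small sketch of how a consumer might check that convention; the function name and range logic are assumptions for illustration, not part of the schema:

```python
import re

# Matches YYYY-MM-DD; a zeroed component means "omitted" per the schema's
# birthdate description. The validation rules below are an assumption.
BIRTHDATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def valid_birthdate(value: str) -> bool:
    if not BIRTHDATE_RE.match(value):
        return False
    _year, month, day = (int(part) for part in value.split("-"))
    # Zero means omitted; otherwise the component must be in range.
    return (month == 0 or 1 <= month <= 12) and (day == 0 or 1 <= day <= 31)

print(valid_birthdate("1990-00-00"))  # year only: True
print(valid_birthdate("1990-13-01"))  # invalid month: False
```

For instance, a user sharing only a birth month and day would supply `0000-06-15`.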
Add playbook to install flatpak from PPA
--- - hosts: all become: yes become_user: root become_method: sudo gather_facts: yes tasks: - name: Add flatpak PPA apt_repository: repo: "ppa:alexlarsson/flatpak" state: present - name: Install flatpak and flatpak-builder apt: package: [flatpak, flatpak-builder] update_cache: yes state: latest
Add swig back into the openmm conda build requirements.
package: name: openmm version: 6.2 source: url: https://github.com/pandegroup/openmm/archive/6.2.tar.gz fn: 6.2.tar.gz patches: - plugin-dir.patch build: number: 1 preserve_egg_dir: yes requirements: build: # on windows, need to install cmake manually - cmake # [not win] - python - fftw3f # [not win] - sphinx - sphinxcontrib-bibtex # - swig # leave out unless we use VM with workaround because conda swig is broken run: - python - fftw3f # [not win] test: imports: - simtk - simtk.openmm commands: - python -m simtk.testInstallation about: home: http://openmm.org license: LGPL and MIT summary: A high performance toolkit for molecular simulation.
package: name: openmm version: 6.2 source: url: https://github.com/pandegroup/openmm/archive/6.2.tar.gz fn: 6.2.tar.gz patches: - plugin-dir.patch build: number: 1 preserve_egg_dir: yes requirements: build: # on windows, need to install cmake manually - cmake # [not win] - python - fftw3f # [not win] - sphinx - sphinxcontrib-bibtex - swig run: - python - fftw3f # [not win] test: imports: - simtk - simtk.openmm commands: - python -m simtk.testInstallation about: home: http://openmm.org license: LGPL and MIT summary: A high performance toolkit for molecular simulation.
Set up CI with Azure Pipelines
# Starter pipeline # Start with a minimal pipeline that you can customize to build and deploy your code. # Add steps that build, run tests, deploy, and more: # https://aka.ms/yaml trigger: - master pool: vmImage: 'ubuntu-latest' steps: - script: echo Hello, world! displayName: 'Run a one-line script' - script: | echo Add other tasks to build, test, and deploy your project. echo See https://aka.ms/yaml displayName: 'Run a multi-line script'
Add CI with GitHub Actions
name: CI on: [push, pull_request] jobs: test: runs-on: ubuntu-latest strategy: matrix: ruby-version: ['2.6', '2.7', '3.0', '3.1'] steps: - uses: actions/checkout@v2 - name: Set up Ruby ${{ matrix.ruby-version }} uses: ruby/setup-ruby@v1 with: ruby-version: ${{ matrix.ruby-version }} bundler-cache: true # 'bundle install' and cache - name: Run specs run: | bundle exec rspec
Move CI configuration from Travis to GitHub actions
on: [push, pull_request] name: Build env: GO111MODULE: on jobs: test: strategy: matrix: go-version: [1.13.x, 1.14.x, 1.15.x, 1.16.x, 1.17.x] os: [ubuntu-latest, macos-latest] runs-on: ${{ matrix.os }} steps: - name: Install Go uses: actions/setup-go@v2 with: go-version: ${{ matrix.go-version }} - name: Checkout code uses: actions/checkout@v2 - name: Test run: go test ./... test-cache: runs-on: ubuntu-latest steps: - name: Install Go uses: actions/setup-go@v2 with: go-version: 1.17.x - name: Checkout code uses: actions/checkout@v2 - uses: actions/cache@v2 with: path: | ~/go/pkg/mod # Module download cache ~/.cache/go-build # Build cache (Linux) ~/Library/Caches/go-build # Build cache (Mac) '%LocalAppData%\go-build' # Build cache (Windows) key: ${{ runner.os }}-go-${{ hashFiles('**/go.sum') }} restore-keys: | ${{ runner.os }}-go- - name: Test run: go test ./...
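The `actions/cache` key above, `${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}`, is content-addressed: any change to `go.sum` produces a new key, so a dependency bump invalidates the cache while the `restore-keys` prefix still allows a partial restore. A rough sketch of that behavior (the hashing shown is a simplification of `hashFiles`, and the file contents are invented):

```python
import hashlib

def cache_key(os_name: str, go_sum_contents: bytes) -> str:
    # Simplified stand-in for hashFiles('**/go.sum'): hash the lockfile
    # contents so the key changes whenever dependencies change.
    digest = hashlib.sha256(go_sum_contents).hexdigest()
    return f"{os_name}-go-{digest}"

key_a = cache_key("Linux", b"example.com/mod v1.0.0 h1:aaa\n")
key_b = cache_key("Linux", b"example.com/mod v1.1.0 h1:bbb\n")
print(key_a != key_b)  # a changed go.sum yields a different key
```

Both keys share the `Linux-go-` prefix, which is why the workflow's `restore-keys: ${{ runner.os }}-go-` can fall back to the most recent cache for the same OS when no exact match exists.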
Add GitHub Action config file for CI
name: Tests on: pull_request: branches: - master paths-ignore: - 'README.md' push: branches: - master paths-ignore: - 'README.md' jobs: unit_tests: name: Unit Tests if: "contains(github.event.commits[0].message, '[ci skip]') == false" strategy: fail-fast: false matrix: os: - ubuntu ruby: - jruby-9.1.13.0 - 2.3 - 2.4 - 2.5 - 2.6 - 2.7 - ruby-head - rbx-3 allow_failures: - false # include: # - os: ubuntu # ruby: ruby-head # gemfile: gemfiles/contracts_16_0.gemfile # allow_failures: true env: BUNDLE_PATH: "./vendor/bundle" ALLOW_FAILURES: "${{ matrix.allow_failures }}" runs-on: ${{ matrix.os }}-latest steps: - name: Checkout uses: actions/checkout@v2 - name: Setup Ruby uses: ruby/setup-ruby@v1 with: ruby-version: ${{ matrix.ruby }} - uses: actions/cache@v2 with: path: vendor/bundle key: ${{ runner.os }}-gems-${{ matrix.ruby }}-${{ github.ref }}-${{ github.sha }} restore-keys: | ${{ runner.os }}-gems--${{ matrix.ruby }}-${{ github.ref }}- ${{ runner.os }}-gems--${{ matrix.ruby }}- - name: Bundle Install run: | bundle install --jobs 4 --retry 3 - name: Test run: bundle exec rspec || $ALLOW_FAILURES
Update from Hackage at 2021-12-20T13:27:03Z
homepage: '' changelog-type: markdown hash: d2ee6a1b06c1ba5cf4258cf3a32ef5fc34c3c2702db9836f2c64b060cb7c3d4a test-bench-deps: {} maintainer: divipp@gmail.com synopsis: K_M,N quadratic programming changelog: |+ # Revision history for kmn-programming ## 0.9.0.1 -- 2020-03-06 * Add Wmat44.txt * Add SplitProblem.hs * Add documentation in README.md ## 0.9.0.0 -- 2016-12-12 * First version. basic-deps: base: '>=4.9 && <4.17' time: -any random-shuffle: -any optparse-applicative: '>=0.13 && <0.17' random: -any x86-64bit: '>=0.4 && <0.5' all-versions: - 0.9.1 author: Péter Diviánszky latest: 0.9.1 description-type: haddock description: '' license-name: BSD-3-Clause
Add recipe for splink library
# Note: there are many handy hints in comments in this example -- remove them when you've finalized your recipe # Jinja variables help maintain the recipe as you'll update the version only here. # Using the name variable with the URL in line 14 is convenient # when copying and pasting from another recipe, but not really needed. {% set name = "splink" %} {% set version = "0.2.4" %} package: name: {{ name|lower }} version: {{ version }} source: url: https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/{{ name }}-{{ version }}.tar.gz # If getting the source from GitHub, remove the line above, # uncomment the line below, and modify as needed. Use releases if available: # url: https://github.com/simplejson/simplejson/releases/download/{{ version }}/simplejson-{{ version }}.tar.gz # and otherwise fall back to archive: # url: https://github.com/simplejson/simplejson/archive/v{{ version }}.tar.gz sha256: 2d597bee5ff921658867ce87581629c6cc861711cb7ff85ed07778c2e927b2f7 # sha256 is the preferred checksum -- you can get it for a file with: # `openssl sha256 <file name>`. # You may need the openssl package, available on conda-forge: # `conda install openssl -c conda-forge` build: # Uncomment the following line if the package is pure Python and the recipe is exactly the same for all platforms. # It is okay if the dependencies are not built for all platforms/versions, although selectors are still not allowed. # See https://conda-forge.org/docs/maintainer/knowledge_base.html#noarch-python for more details. # noarch: python number: 0 # If the installation is complex, or different between Unix and Windows, use separate bld.bat and build.sh files instead of this key. # By default, the package will be built for the Python versions supported by conda-forge and for all major OSs. # Add the line "skip: True # [py<35]" (for example) to limit to Python 3.5 and newer, or "skip: True # [not win]" to limit to Windows. script: "{{ PYTHON }} -m pip install . -vv" requirements: host: - python - pip - poetry run: - python - jsonschema - pyspark test: imports: - splink about: home: https://github.com/moj-analytical-services/splink license: MIT license_family: MIT license_file: LICENSE summary: "Implements Fellegi-Sunter's canonical model of record linkage in Apache Spark, including EM algorithm to estimate parameters of the model" extra: recipe-maintainers: - cdesouza21
Add a GH action for PHP & Composer
# This is a basic workflow to help you get started with Actions name: CI # Controls when the workflow will run on: # Triggers the workflow on push or pull request events but only for the develop branch push: branches: [ develop ] pull_request: branches: [ develop ] # Allows you to run this workflow manually from the Actions tab workflow_dispatch: # A workflow run is made up of one or more jobs that can run sequentially or in parallel jobs: # This workflow contains a single job called "build" build: # The type of runner that the job will run on runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - name: Validate composer.json and composer.lock run: composer validate --strict - name: Cache Composer packages id: composer-cache uses: actions/cache@v2 with: path: vendor key: ${{ runner.os }}-php-${{ hashFiles('**/composer.lock') }} restore-keys: | ${{ runner.os }}-php- - name: Install dependencies run: composer install --prefer-dist --no-progress # Add a test script to composer.json, for instance: "test": "vendor/bin/phpunit" # Docs: https://getcomposer.org/doc/articles/scripts.md # - name: Run test suite # run: composer run-script test
Update from Hackage at 2015-05-14T18:41:06+0000
homepage: https://github.com/ion1/acme-memorandom changelog-type: markdown hash: 5ee761691aaad8805126b2fc38c8e188ba784c191435ae1b4a3a2ce35fbc006e test-bench-deps: {} maintainer: Johan Kiviniemi <devel@johan.kiviniemi.name> synopsis: Memoized random number generation changelog: ! '# 0.0.1 (2015-05-14) * Initial release. ' basic-deps: base: ==4.* MemoTrie: ==0.6.* random: ==1.* all-versions: - '0.0.1' author: Johan Kiviniemi <devel@johan.kiviniemi.name> latest: '0.0.1' description-type: markdown description: ! '# `acme-memorandom` [![Hackage](https://budueba.com/hackage/acme-memorandom)](https://hackage.haskell.org/package/acme-memorandom) A library for generating random numbers in a memoized manner. Implemented as a lazy table indexed by serialized [`StdGen`][StdGen]. Monomorphism is used to facilitate memoization, users should adapt their design to work with random [`Int`][Int] values only. [StdGen]: http://hackage.haskell.org/package/random/docs/System-Random.html#t:StdGen [Int]: https://hackage.haskell.org/package/base/docs/Prelude.html#t:Int ' license-name: MIT
Add composition - towards 1.0.0
### # Copyright (c) 2015-2017 Mainflux # # The Mainflux server is licensed under the Apache License, version 2.0. # All rights not explicitly granted in the Apache license, version 2.0 are reserved. # See the included LICENSE file for more details. ### version: "2" services: ### # NATS ### nats: image: nats:latest container_name: mainflux-nats ports: - "4222:4222" - "8222:8222" ### # Manager ### manager: image: mainflux/manager:latest container_name: mainflux-manager ports: - "9090:9090" ### # Message Writer ### message-writer: image: mainflux/message-writer:latest container_name: mainflux-message-writer ### # MQTT Broker ### mqtt-adapter: image: mainflux/mqtt-adapter:latest container_name: mainflux-mqtt ports: - "1883:1883" - "8883:8883" ### # CoAP Server ### mainflux-coap: image: mainflux/coap-adapter:latest container_name: mainflux-coap ports: - "5683:5683" ### # HTTP Server ### http-adapter: image: mainflux/http-adapter:latest container_name: mainflux-http ports: - "7070:7070"
Update from Hackage at 2016-09-22T21:34:29+00:00
homepage: https://github.com/pbrisbin/gh-pocket-knife#readme changelog-type: '' hash: 690c5f4bf33af132e6b9359fd240bcecbb9785e581e5b5475b6449f1aa6729f7 test-bench-deps: base: -any hspec: -any gh-pocket-knife: -any QuickCheck: -any maintainer: pbrisbin@gmail.com synopsis: Script helpers for interacting with GitHub changelog: '' basic-deps: bytestring: -any base: ! '>=4.7 && <5' gh-pocket-knife: -any http-conduit: -any resourcet: -any aeson: -any all-versions: - '0.1.0.0' author: Pat Brisbin latest: '0.1.0.0' description-type: haddock description: Please see README.md license-name: BSD3
Set up pr label size action
name: size-label on: pull_request jobs: size-label: runs-on: ubuntu-latest steps: - name: size-label uses: "pascalgn/size-label-action@d909487e1a0057d85c638f1ddefdb315a63d2e98" env: GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}"
Update from Hackage at 2016-03-20T06:20:33+0000
homepage: '' changelog-type: '' hash: 6341ce0b176121b900f2c4f60068a0fe6ec00e2561ff16a7c4260a1dea3daf15 test-bench-deps: {} maintainer: hawk.alan@gmail.com synopsis: zZzzZz changelog: '' basic-deps: base: <10000 time: <1000 all-versions: - '0.1.0.0' author: Alan Hawkins latest: '0.1.0.0' description-type: haddock description: '' license-name: GPL-2
Update from Hackage at 2018-08-20T06:09:26Z
homepage: https://github.com/frasertweedale/hs-tax changelog-type: '' hash: 70712a80e5e066bfe4789b11564ed694b9bd31e9a6aefc328dffe7f87903909c test-bench-deps: {} maintainer: frase@frase.id.au synopsis: Types and combinators for taxes changelog: '' basic-deps: base: ! '>=4.8 && <5' semigroups: ! '>=0.16' dollaridoos: ! '>=0.1' all-versions: - '0.1.0.0' author: Fraser Tweedale latest: '0.1.0.0' description-type: haddock description: ! 'This library provides combinators for constructing taxes. It is based on the <https://hackage.haskell.org/package/dollaridoos dollaridoos> library.' license-name: AGPL-3
Disable Travis running 'go get' three times then failing the build >:|
language: go go: - 1.4 script: - goad validate - goad test - goad install
language: go go: - 1.4 # I know I like my dependencies specified by custom meta tags in HTML! # Oh wait, no. No I don't. install: true script: - goad init - goad validate - goad test - goad install
Move Trusty e2e testing jobs to job-configs
- job-template: name: 'kubernetes-e2e-gce-trusty-{suffix}' description: '{description} Test owner: {test-owner}.' logrotate: daysToKeep: 7 builders: - shell: | curl -fsS --retry 3 "https://raw.githubusercontent.com/kubernetes/kubernetes/{branch}/hack/jenkins/e2e.sh" | bash - properties: - mail-watcher publishers: - claim-build - junit-publisher - log-parser - email-ext: recipients: "{emails}" - description-setter: regexp: KUBE_GCE_MINION_IMAGE=(.*) - groovy-postbuild: script: | def trustyImageMatcher = manager.getLogMatcher("KUBE_GCE_MINION_IMAGE=(.*)") if(trustyImageMatcher?.matches()) manager.addShortText("<b>Trusty Image: " + trustyImageMatcher.group(1) + "</b>", "grey", "white", "0px", "white") def k8sVersionMatcher = manager.getLogMatcher("Using\\spublished\\sversion\\s(.*)\\s\\(from.*") if(k8sVersionMatcher?.matches()) manager.addShortText("<br><b>Kubernetes version: " + k8sVersionMatcher.group(1) + "</b>", "grey", "white", "0px", "white") triggers: - timed: 'H H/8 * * *' wrappers: - ansicolor: colormap: xterm - timeout: timeout: '{timeout}' fail: true - timestamps - workspace-cleanup - project: name: kubernetes-e2e-gce-trusty test-owner: 'wonderfly@google.com' branch: 'release-1.1' emails: adityakali@google.com,ameyd@google.com,andryeu@google.com,gwells@google.com,qzheng@google.com,saied@google.com,wonderfly@google.com,yinghan@google.com suffix: - 'head-release': description: 'Continuously test Trusty build against latest k8s release.' timeout: 150 - 'dev-release': description: 'Continuously test Trusty dev build against latest k8s release.' timeout: 150 - 'beta-release': description: 'Continuously test Trusty beta build against latest k8s release.' timeout: 150 - 'stable-release': description: 'Continuously test Trusty stable build against latest k8s release.' timeout: 150 - 'head-slow': description: 'Run slow E2E tests on latest Trusty build.' timeout: 270 - 'dev-slow': description: 'Run slow E2E tests on latest Trusty dev build.' timeout: 270 - 'beta-slow': description: 'Run slow E2E tests on latest Trusty beta build.' timeout: 270 - 'stable-slow': description: 'Run slow E2E tests on latest Trusty stable build.' timeout: 270 jobs: - 'kubernetes-e2e-gce-trusty-{suffix}'
Update from Hackage at 2019-12-17T23:44:56Z
homepage: https://github.com/maquinitas/maquinitas-tidal changelog-type: markdown hash: 15817cc10ba8ab59ed99d5fa24f17e0fe5fc0700be3c48c3e652673c0e2e0919 test-bench-deps: {} maintainer: montoyamoraga@gmail.com synopsis: library for MIDI control of hardware changelog: | # Revision history for maquinitas ## 0.1.0.0 -- 2019-12-17 * First version. Testing build of package, add first four instruments. basic-deps: base: ! '>=4.13 && <4.14' all-versions: - 0.1.0 author: montoyamoraga latest: 0.1.0 description-type: markdown description: | # maquinitas-tidal ## About maquinitas-tidal is a project by [Aarón Montoya-Moraga](https://montoyamoraga.io/). maquinitas-tidal is a flavor of the maquinitas library, intended to be used in conjunction with [TidalCycles](https://github.com/tidalcycles/). maquinitas-tidal is built using the programming language Haskell, and with the cabal package manager. ## Developing ```bash cabal configure cabal build cabal install ``` ## Deploying ```bash cabal sdist ``` ## Installing ```bash cabal update cabal install maquinitas ``` ## Testing ## Releases None so far ## License MIT license-name: MIT
Add or update the Azure App Service build and deployment workflow config
# Docs for the Azure Web Apps Deploy action: https://github.com/Azure/webapps-deploy # More GitHub Actions for Azure: https://github.com/Azure/actions # More info on Python, GitHub Actions, and Azure App Service: https://aka.ms/python-webapps-actions name: Build and deploy Python app to Azure Web App - propalyzer-new on: push: branches: - new_data_source workflow_dispatch: jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - name: Set up Python version uses: actions/setup-python@v1 with: python-version: '3.8' - name: Create and start virtual environment run: | python -m venv venv source venv/bin/activate - name: Install dependencies run: pip install -r requirements.txt # Optional: Add step to run tests here (PyTest, Django test suites, etc.) - name: Upload artifact for deployment jobs uses: actions/upload-artifact@v2 with: name: python-app path: | . !venv/ deploy: runs-on: ubuntu-latest needs: build environment: name: 'Production' url: ${{ steps.deploy-to-webapp.outputs.webapp-url }} steps: - name: Download artifact from build job uses: actions/download-artifact@v2 with: name: python-app path: . - name: 'Deploy to Azure Web App' uses: azure/webapps-deploy@v2 id: deploy-to-webapp with: app-name: 'propalyzer-new' slot-name: 'Production' publish-profile: ${{ secrets.AZUREAPPSERVICE_PUBLISHPROFILE_B225243A4B2943CA8C250B7F1054449E }}
Update from Hackage at 2021-06-26T07:33:38Z
homepage: https://github.com/lehmacdj/polysemy-readline#readme changelog-type: markdown hash: bd049f9e5252973b8584be2bdae95e7facb3356f834b4f7b613c33a171168944 test-bench-deps: polysemy-plugin: '>=0.3.0 && <0.4' exceptions: '>=0.10.4 && <0.11' haskeline: '>=0.8.1 && <0.9.0' base: '>=4.12 && <4.15' polysemy-readline: -any polysemy: '>=1.5.0 && <1.6' maintainer: Devin Lehmacher synopsis: Readline effect for polysemy. changelog: | # Changelog for polysemy-readline ## 0.1.0.0 — June 24th 2021 - Release candidate for Hackage. basic-deps: polysemy-plugin: '>=0.3.0 && <0.4' exceptions: '>=0.10.4 && <0.11' haskeline: '>=0.8.1 && <0.9.0' base: '>=4.12 && <4.15' polysemy: '>=1.5.0 && <1.6' all-versions: - 0.1.0.0 author: Devin Lehmacher latest: 0.1.0.0 description-type: markdown description: "# polysemy-readline\n[![GitHub Actions](https://github.com/lehmacdj/polysemy-readline/actions/workflows/ci.yml/badge.svg)](https://github.com/lehmacdj/polysemy-readline/actions/workflows/ci.yml)\n[![Hackage](http://img.shields.io/hackage/v/polsyemy-readline.svg)](http://img.shields.io/hackage/v/polsyemy-readline.svg)\n\nThis package provides a [polysemy](https://github.com/polysemy-research/polysemy#readme) effect that provides most of the functionality of [haskeline](https://github.com/judah/haskeline#readme). See Haskeline's documentation for usage information.\n\n## Contributions\nIssues or PRs are welcome. In particular there are a number of things that I don't use frequently enough from Haskeline to justify working on or just haven't gotten around to implementing yet:\n- interrupt handling: `withInterrupt`, `handleInterrupt`\n- pure interpreter for using in tests\n- additional interpreters matching the `run*` functions for `InputT`\n- support for older versions of Haskeline (currently only 0.8.1+ is supported)\n- version bumps for things that already compile but aren't allowed\nPRs for any of these things would be greatly appreciated or I might get around to implementing them myself later \U0001F642.\n" license-name: BSD-2-Clause
Add support for AppVeyor (ugexe++)
os: Visual Studio 2015 platform: x64 install: - '"C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\SetEnv.cmd" /x64' - choco install strawberryperl - SET PATH=C:\strawberry\c\bin;C:\strawberry\perl\site\bin;C:\strawberry\perl\bin;%PATH% - git clone https://github.com/rakudo/rakudo.git %APPVEYOR_BUILD_FOLDER%\..\rakudo - cd %APPVEYOR_BUILD_FOLDER%\..\rakudo - perl Configure.pl --gen-moar=HEAD --gen-nqp - nmake install - SET PATH=%APPVEYOR_BUILD_FOLDER%\..\rakudo\install\bin;%PATH% - cd %APPVEYOR_BUILD_FOLDER% build: off test_script: - prove -v -e "perl6 -Ilib" t/ - perl6 -Ilib bin/zef --verbose install . - SET PATH=%APPVEYOR_BUILD_FOLDER%\..\rakudo\install\share\perl6\site\bin;%PATH% - cd %APPVEYOR_BUILD_FOLDER%\.. - zef --verbose --force install Zef shallow_clone: true
Update from Hackage at 2022-08-19T10:44:59Z
homepage: http://github.com/lyokha/nginx-log-plugin changelog-type: markdown hash: 2658a5694aa44bcc36b29f42db0115edf9776eb2a036319dfb00dcc567cff114 test-bench-deps: {} maintainer: Alexey Radkov <alexey.radkov@gmail.com> synopsis: Native Nginx logging from configuration files and Haskell handlers changelog: |+ ### 1.5 - Auto-generate haddocks for generated functions. basic-deps: bytestring: -any base: '>=4.8 && <5' ngx-export: '>=1.7.1' ngx-export-tools: '>=0.4.9.0' template-haskell: '>=2.11.0' all-versions: - '1.5' author: Alexey Radkov <alexey.radkov@gmail.com> latest: '1.5' description-type: haddock description: |- Native Nginx logging from configuration files and Haskell handlers. This is a part of <https://github.com/lyokha/nginx-log-plugin>. Custom libraries are required to be linked against C module /ngx_log_plugin/. license-name: BSD-3-Clause
Update from Hackage at 2017-07-06T12:48:10Z
homepage: https://github.com/louispan/data-diverse-lens#readme changelog-type: '' hash: 1dde6cced9d30a42df36728ba3d3a4225616f0a72cd7a0b4b6e65cf0166a15e0 test-bench-deps: data-diverse: ! '>=0.6 && <1' base: -any hspec: ! '>=2 && <3' tagged: ! '>=0.8.5 && <1' lens: ! '>=4 && <5' data-diverse-lens: -any maintainer: louis@pan.me synopsis: Isos & Lens for Data.Diverse.Many and Prisms for Data.Diverse.Which changelog: '' basic-deps: data-diverse: ! '>=0.6 && <1' base: ! '>=4.7 && <5' tagged: ! '>=0.8.5 && <1' lens: ! '>=4 && <5' all-versions: - '0.1.0.0' author: Louis Pan latest: '0.1.0.0' description-type: markdown description: ! '[![Hackage](https://img.shields.io/hackage/v/data-diverse-lens.svg)](https://hackage.haskell.org/package/data-diverse-lens) [![Build Status](https://secure.travis-ci.org/louispan/data-diverse-lens.png?branch=master)](http://travis-ci.org/louispan/data-diverse-lens) Provides "Iso"s & ''Len''s for "Data.Diverse.Many" and ''Prism''s for "Data.Diverse.Which". Refer to [ManySpec.hs](https://github.com/louispan/data-diverse-lens/blob/master/test/Data/Diverse/Lens/ManySpec.hs) and [WhichSpec.hs](https://github.com/louispan/data-diverse/blob/master/test/Data/Diverse/Lens/WhichSpec.hs) for example usages. ' license-name: BSD3
Set up CI with Azure Pipelines
strategy: matrix: linux: imageName: 'ubuntu-16.04' mac: imageName: 'macos-10.13' windows: imageName: 'vs2017-win2016' # windows-2019 pool: vmImage: $(imageName) steps: - task: UseDotNet@2 displayName: 'Install .net core 3.0' inputs: packageType: sdk version: '3.0.100' installationPath: $(Agent.ToolsDirectory)/dotnet - bash: ./build.sh condition: or( eq( variables['Agent.OS'], 'Darwin' ), eq( variables['Agent.OS'], 'Linux' )) displayName: 'build.sh' - powershell: .\build.cmd condition: eq( variables['Agent.OS'], 'Windows_NT' ) displayName: 'build.cmd'
Add first GitHub Actions pipeline
name: Test on: push: branches: [ master ] pull_request: branches: [ master ] jobs: build: runs-on: ubuntu-latest strategy: matrix: node-version: [14.x] steps: - uses: actions/checkout@v2 - name: Use Node.js ${{ matrix.node-version }} uses: actions/setup-node@v2 with: node-version: ${{ matrix.node-version }} cache: 'yarn' - run: yarn install --frozen-lockfile - run: yarn run lint - run: xvfb-run yarn run test
Update from Hackage at 2017-12-14T09:25:30Z
homepage: https://github.com/mniip/singleton-typelits changelog-type: '' hash: 30aafede59fe7f481abe41ced9cdfded3b9640b984e12fb48db88714756b1687 test-bench-deps: {} maintainer: mniip@mniip.com synopsis: Singletons and induction over GHC TypeLits changelog: '' basic-deps: base: ! '>=4.9 && <4.11' all-versions: - '0.0.0.0' author: mniip latest: '0.0.0.0' description-type: haddock description: Singletons and induction schemes over 'GHC.TypeLits.Nat' license-name: BSD3
Use GitHub Actions for CI testing.
name: CI on: push: branches: [ master ] pull_request: branches: [ master ] jobs: test: name: Test on node ${{ matrix.node_version }} and ${{ matrix.os }} runs-on: ${{ matrix.os }} strategy: matrix: node_version: ['10', '12', '14'] os: [ubuntu-latest] steps: - uses: actions/checkout@v2 - name: Use Node.js ${{ matrix.node_version }} uses: actions/setup-node@v1 with: node-version: ${{ matrix.node_version }} - name: npm install, build and test run: | npm install npm run build --if-present npm test
Move ci to GH Actions
name: Pylint on: [push] jobs: build: runs-on: ubuntu-latest strategy: matrix: python-version: ["3.9", "3.10.0-beta.4"] steps: - uses: actions/checkout@v2 - name: Set up Python ${{ matrix.python-version }} uses: actions/setup-python@v2 with: python-version: ${{ matrix.python-version }} - name: Cache pip uses: actions/cache@v2 with: # This path is specific to Ubuntu path: ~/.cache/pip # Look to see if there is a cache hit for the corresponding requirements file key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }} restore-keys: | ${{ runner.os }}-pip- ${{ runner.os }}- - name: Install dependencies run: | python -m pip install --upgrade pip pip install -r requirements.txt pip install coverage codacy-coverage - run: | touch userdb.json echo '{"##wolfy1339":{},"#zirc":{}}' > userdb.json - run: | coverage run --include="test.py,plugins/*.py,utils/*.py,ansi.py,config.py,handlers/*.py,log.py" --omit="utils/tasks.py,utils/web.py,handlers/server.py" test.py - if: ${{ success() }} run: | coverage report --include="test.py,plugins/*.py,utils/*.py,ansi.py,config.py,handlers/*.py,log.py" --omit="utils/tasks.py,utils/web.py,handlers/server.py" coverage xml --include="test.py,plugins/*.py,utils/*.py,ansi.py,config.py,handlers/*.py,log.py" --omit="utils/tasks.py,utils/web.py,handlers/server.py" python-codacy-coverage -r coverage.xml
Use ruby 2.5.7 in CI
version: 2 jobs: build: docker: - image: circleci/ruby:2.4.6 working_directory: ~/intercom-rails steps: - checkout - run: bundle install - run: bundle exec rake
version: 2 jobs: build: docker: - image: circleci/ruby:2.5.7 working_directory: ~/intercom-rails steps: - checkout - run: bundle install - run: bundle exec rake
Update Travis to test on the latest ruby releases
language: ruby cache: bundler dist: xenial rvm: - 2.4.5 - 2.5.3 - 2.6 - ruby-head matrix: allow_failures: - rvm: ruby-head branches: only: - master - omnibus/3.2-stable bundler_args: --jobs 7 before_install: gem install bundler script: bundle exec rake travis:ci
language: ruby cache: bundler dist: xenial before_install: - gem install bundler || true - bundle --version - gem update --system - gem --version matrix: include: - rvm: 2.4.5 - rvm: 2.5.5 - rvm: 2.6.3 - rvm: ruby-head allow_failures: - rvm: ruby-head branches: only: - master bundler_args: --jobs 7 script: bundle exec rake travis:ci
Test installation with Travis CI
# Config file for automatic testing at travis-ci.org language: python python: - "2.7" install: # Install the cloned dragonn package from source - pip install . # Do not test the package functionality, simply run an arbitrary tutorial function script: python -c 'from dragonn.tutorial_utils import print_available_simulations; print_available_simulations()'
Update from Hackage at 2020-01-13T09:59:04Z
homepage: https://higherkindness.io/mu-haskell/ changelog-type: markdown hash: 2c85ed4ea340a592e5265e5c6d3937fdceda2fc3b3398fdeea7deaf3442a9362 test-bench-deps: {} maintainer: alejandro.serrano@47deg.com synopsis: Protocol-independent declaration of services and servers changelog: | # Revision history for mu-haskell ## 0.1.0.0 -- YYYY-mm-dd * First version. Released on an unsuspecting world. basic-deps: mu-schema: -any sop-core: -any base: ! '>=4.12 && <5' text: -any conduit: -any mtl: -any template-haskell: -any all-versions: - 0.1.0.0 author: Alejandro Serrano, Flavio Corpa latest: 0.1.0.0 description-type: haddock description: Protocol-independent declaration of services and servers for mu-haskell license-name: Apache-2.0
Configure legacy domain with global (nuclear) option to organisational homepage
--- site: mcga whitehall_slug: maritime-and-coastguard-agency title: Maritime and Coastguard Agency redirection_date: 17th June 2014 homepage: https://www.gov.uk/government/organisations/maritime-and-coastguard-agency tna_timestamp: 20140305221616 host: www.mcga.gov.uk furl: www.gov.uk/mca global: =301 https://www.gov.uk/government/organisations/maritime-and-coastguard-agency
Update from Hackage at 2016-02-09T23:22:42+0000
homepage: '' changelog-type: '' hash: f0164cc7b5a6fcdc714afa794ef07d5768e85d095d07cdf2e005009803010894 test-bench-deps: {} maintainer: JDawson@ku.edu synopsis: Web server wrapper for remote-json changelog: '' basic-deps: warp: ! '>=3.2 && <3.3' base: ! '>=4 && <5' data-default-class: ==0.0.1 text: ! '>=1.2 && <1.3' remote-json: ! '>=0.2 && <0.3' natural-transformation: ! '>=0.3.1 && <0.4' transformers: ! '>=0.4 && <0.6' scotty: ! '>=0.10' aeson: ! '>=0.8 && <0.12' all-versions: - '0.2' author: Justin Dawson and Andy Gill latest: '0.2' description-type: markdown description: ! 'A sub-package of remote-json that provides the ReceiveAPI capabilities. ' license-name: BSD3
Fix typo in doc region tag
# Copyright 2016 Google, Inc # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. runtime: ruby vm: true entrypoint: bundle exec ruby app.rb -p 8080 # [START env_variables] env_variables: TWILIO_ACCOUNT_SID: <your-account-sid> TWILIO_AUTH_TOKEN: <your-auth-token> TWILIO_NUMBER: <your-twilio-number> # [ENV env_variables]
# Copyright 2016 Google, Inc # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. runtime: ruby vm: true entrypoint: bundle exec ruby app.rb -p 8080 # [START env_variables] env_variables: TWILIO_ACCOUNT_SID: <your-account-sid> TWILIO_AUTH_TOKEN: <your-auth-token> TWILIO_NUMBER: <your-twilio-number> # [END env_variables]
Add first version of a GitHub Actions based CI
name: CI on: pull_request: push: tags-ignore: - '*' paths-ignore: - README.md - CHANGELOG.md schedule: - cron: '0 7 * * SUN' jobs: rspec: strategy: fail-fast: false matrix: os: [ubuntu, macos, windows] ruby: - ruby-2.7 - ruby-2.6 - ruby-2.5 - ruby-2.4 - ruby-2.3 - ruby-2.2 - ruby-2.1 - ruby-head - jruby-9.1 - jruby-9.2 - jruby-head - truffleruby - truffleruby-head include: - ruby: ruby-2.7 os: ubuntu env: COVERAGE: 'true' exclude: # Truffleruby is currently not built on Windows - ruby: truffleruby os: windows - ruby: truffleruby-head os: windows runs-on: ${{ matrix.os }}-latest continue-on-error: ${{ endsWith(matrix.ruby, 'head') }} steps: - uses: actions/checkout@v2 - uses: ruby/setup-ruby@v1 with: ruby-version: ${{ matrix.ruby }} bundler-cache: true - name: Run rspec run: bundle exec rspec --format progress spec env: COVERAGE: ${{ matrix.env.COVERAGE }} - name: Coveralls uses: coverallsapp/github-action@master with: github-token: ${{ secrets.GITHUB_TOKEN }} path-to-lcov: coverage/lcov/rackstash.lcov if: matrix.env.COVERAGE == 'true' rubocop: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - uses: ruby/setup-ruby@v1 with: ruby-version: 2.7 - name: Run RuboCop run: | gem install rubocop rubocop --parallel
Update from Hackage at 2016-10-28T21:23:58Z
homepage: '' changelog-type: '' hash: 7e2091c6715de3ef5d5422e22d61d42b86c0b0a0aa005685cebda58b7805abd7 test-bench-deps: {} maintainer: simon@cse.yorku.ca synopsis: Specify axioms for type classes and quickCheck all available instances changelog: '' basic-deps: base: ! '>=4.8 && <5' monad-loops: -any th-printf: -any semigroups: -any control-invariants: -any containers: ! '>=0.5 && <0.6' lens: ! '>=4.12 && <4.15' mtl: -any quickcheck-report: -any transformers: ! '>=0.4 && <0.6' QuickCheck: ! '>=2.8.1 && <2.10' portable-template-haskell-lens: -any template-haskell: ! '>=2.10 && <2.12' all-versions: - '0.1.0.0' author: Simon Hudon latest: '0.1.0.0' description-type: haddock description: ! 'Provides a way to specify axioms for type classes and to quickCheck all available instances against them' license-name: MIT
Add docker provisioner yaml file Debian 8 AArch64
# Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. docker: memory_limit: "4g" image: "bigtop/puppet:debian-8-aarch64" repo: "http://bigtop-repos.s3.amazonaws.com/releases/1.2.0/debian/8-aarch64/aarch64" distro: debian components: [hdfs, yarn, mapreduce] enable_local_repo: false smoke_test_components: [hdfs, yarn, mapreduce]
Create workflow which manages GitHub labels
name: Manage Labels on: push jobs: labeler: runs-on: ubuntu-latest steps: - name: Checkout uses: actions/checkout@v2 - name: Run Labeler if: success() uses: crazy-max/ghaction-github-labeler@v2 with: yaml_file: .github/labels.yml skip_delete: true dry_run: true env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
Add Senlin-dashboard Train release notes
--- upgrade: - > Switched nodejs4-jobs to nodejs10. - > Dropped the py35 testing. - > Switched to the new canonical constraints URL on master. - > Added Python3 Train unit tests.
Update from Hackage at 2018-11-01T03:25:41Z
homepage: https://github.com/xafizoff/n2o#readme changelog-type: '' hash: 69920bb046db179ceeb16f6c4eb4dac803e2fa8cbfc58f2defb2daf2cd641981 test-bench-deps: base: ! '>=4.7 && <5' hspec: -any bert: -any n2o: -any maintainer: xafizoff@gmail.com synopsis: Abstract Protocol Loop changelog: '' basic-deps: bytestring: ! '>=0.9' base: ! '>=4.7 && <5' text: ! '>=1.2' containers: ! '>=0.5' binary: ! '>=0.5' all-versions: - '0.11.0' author: Marat Khafizov latest: '0.11.0' description-type: markdown description: '' license-name: BSD3
Update from Hackage at 2017-07-22T18:28:23Z
homepage: https://github.com/caneroj1/mbtiles#readme changelog-type: '' hash: ba753b02570d697232ab6029421fcbcdc744ff8a5ac283d3dc4d764238951f59 test-bench-deps: base: -any HUnit: -any mbtiles: -any maintainer: jmc41493@gmail.com synopsis: Haskell MBTiles client changelog: '' basic-deps: bytestring: -any base: ! '>=4.7 && <5' sqlite-simple: -any unordered-containers: -any text: -any mtl: -any transformers: -any directory: -any all-versions: - '0.1.0.0' author: Joe Canero latest: '0.1.0.0' description-type: markdown description: ! "# mbtiles\n\nHaskell library for interfacing with MapBox [MBTiles](https://github.com/mapbox/mbtiles-spec) files.\n\n## Functionality\n* Getting tiles by zoom, x, and y.\n* Writing new tiles by zoom, x, and y.\n* Updating existing tiles by zoom, x, and y.\n* Accessing metadata from the mbtiles file.\n\n## Basic Usage\n\nReading, writing, and updating tiles:\n\n```haskell\n{-# LANGUAGE OverloadedStrings #-}\n\nimport qualified Data.ByteString.Lazy as BL\nimport \ Database.Mbtiles\n\nmain = do\n let myData = \"myTileData\" :: BL.ByteString\n \ runMbtiles \"my/path/to/file.mbtiles\" $ do\n maybeTileData <- getTile (Z 0) (X 0) (Y 0)\n case maybeTileData of\n Nothing -> writeTile (Z 0) (X 0) (Y 0) myData\n (Just d) -> updateTile (Z 0) (X 0) (Y 0) $ BL.init d\n```\n\nGetting metadata:\n\n```haskell\n\nimport Control.Monad.IO.Class\nimport Database.Mbtiles\n\nmain = do\n runMbtiles \"my/path/to/file.mbtiles\" $ do\n liftIO . print =<< getName\n \ liftIO . print =<< getType\n liftIO . print =<< getFormat\n\n```\n\n## Future Work\n* Improve database error handling.\n* Investigate usage as a performant tile server.\n* Add tests." license-name: BSD3
Update from Hackage at 2022-10-11T02:44:40Z
homepage: https://github.com/owensmurray/om-show changelog-type: '' hash: 879a12ea8cfb6af5fdcbb21bf6e8d7b8c3fef64c03af418e96e54761e0dc5093 test-bench-deps: {} maintainer: rick@owensmurray.com synopsis: Utilities for showing string-like things. changelog: '' basic-deps: base: '>=4.15.0.0 && <4.16' text: '>=1.2.5.0 && <1.3' aeson: '>=2.0.3.0 && <2.1' all-versions: - 0.1.2.6 author: Rick Owens latest: 0.1.2.6 description-type: markdown description: | # om-show Haskell utilities for turning values into string representations. It is mainly just a couple of very useful functions that eliminate a lot of conversion to from text types. It's so small that I can just post the entire source code in this README: ```haskell {-# LANGUAGE DerivingStrategies #-} {-# LANGUAGE GeneralizedNewtypeDeriving #-} {- | Utilities for showing string-like things. -} module OM.Show ( showt, showj, ShowJ(..), ) where import Data.Aeson (encode, ToJSON) import Data.String (IsString, fromString) import qualified Data.Text.Lazy as TL import qualified Data.Text.Lazy.Encoding as TLE {- | Like 'show', but for any string-like thing. -} showt :: (Show a, IsString b) => a -> b showt = fromString . show {- | Show the JSON representation as any kind of string-like thing. Primarily useful for dumping JSON values into log messages without having to jump through too many hoops. -} showj :: (ToJSON a, IsString b) => a -> b showj = fromString . TL.unpack . TLE.decodeUtf8 . encode {- | Wrapper whose 'Show' instance outputs JSON. Especially useful with `-XDerivingVia` e.g. > newtype Foo = Foo SomeType > deriving Show via (ShowJ SomeType) This will cause @show (foo :: Foo) to output the JSON representation of SomeType. -} newtype ShowJ a = ShowJ a deriving stock (Eq, Ord) deriving newtype (ToJSON) instance (ToJSON a) => Show (ShowJ a) where show = showj ``` license-name: MIT
Update from Hackage at 2018-10-06T07:31:55Z
homepage: https://github.com/NorfairKing/cursor changelog-type: '' hash: 6e1a678ccf3dccde203c5ea0d5739803af5cbe75d71d7d74b23af1de8130c807 test-bench-deps: cursor: -any base: -any hspec: -any text: -any cursor-gen: -any genvalidity-hspec: -any genvalidity-hspec-optics: -any containers: -any pretty-show: -any QuickCheck: -any microlens: -any maintainer: syd@cs-syd.eu synopsis: Generators for Purely Functional Cursors changelog: '' basic-deps: cursor: -any base: <5 text: -any containers: -any genvalidity-containers: -any QuickCheck: -any genvalidity-text: -any genvalidity: -any all-versions: - '0.0.0.0' author: Tom Sydney Kerckhove latest: '0.0.0.0' description-type: haddock description: Generators for Purely Functional Cursors for common data structures license-name: MIT
Add bash shell to deploy user
--- - name: Setup deploy group group: name={{ deploy_group }} state=present - name: Setup deploy user user: name={{ deploy_user }} group={{ deploy_group }} groups={{ deploy_groups }} state=present - name: Adding public key to server authorized_key: user={{ deploy_user }} key='{{ item }}' with_file: deploy_keys
--- - name: Setup deploy group group: name={{ deploy_group }} state=present - name: Setup deploy user user: name={{ deploy_user }} group={{ deploy_group }} groups={{ deploy_groups }} state=present shell=/bin/bash - name: Adding public key to server authorized_key: user={{ deploy_user }} key='{{ item }}' with_file: deploy_keys
Update from Hackage at 2019-05-16T21:16:40Z
homepage: https://github.com/chessai/primitive-stablename changelog-type: markdown hash: dc9aeb27861a17726faa3a9eff1cbba64d2b558ab09799524063b4e25607f1dc test-bench-deps: {} maintainer: chessai <chessai1996@gmail.com> synopsis: primitive operations on StableNames changelog: |- # Changelog `primitive-stablename` uses [PVP Versioning][1]. The changelog is available [on GitHub][2]. 0.0.0 ===== * Initially created. [1]: https://pvp.haskell.org [2]: https://github.com/chessai/primitive-stablename/releases basic-deps: base: ! '>=4.10.1.0 && <4.13' primitive: ! '>=0.6.4 && <0.8' all-versions: - '0.1' author: chessai latest: '0.1' description-type: markdown description: |- # primitive-stablename [![Hackage](https://img.shields.io/hackage/v/primitive-stablename.svg)](https://hackage.haskell.org/package/primitive-stablename) [![BSD3 license](https://img.shields.io/badge/license-BSD3-blue.svg)](LICENSE) primitive operations on StableNames license-name: BSD-3-Clause
Set up CI with Azure Pipelines for Extensions
# Node.js # Build a general Node.js project with npm. # Add steps that analyze code, save build artifacts, deploy, and more: # https://docs.microsoft.com/azure/devops/pipelines/languages/javascript jobs: - job: rolling_VS2017_build displayName: 'Extensions build' pool: name: Hosted VS2017 steps: - template: build/sdl-tasks.yml - template: build/extensions-npm-build-steps.yml
Update from Hackage at 2019-04-17T15:01:09Z
homepage: https://github.com/chessai/hedgehog-generic changelog-type: markdown hash: 379138b084fdcd1bd4d0321cf1735c20fa50c911572a9e44fffbf73a543eb87a test-bench-deps: {} maintainer: chessai <chessai1996@gmail.com> synopsis: GHC Generics automatically derived hedgehog generators changelog: |- # Changelog `hedgehog-generic` uses [PVP Versioning][1]. The changelog is available [on GitHub][2]. 0.0.0 ===== * Initially created. [1]: https://pvp.haskell.org [2]: https://github.com/chessai/hedgehog-generic/releases basic-deps: base: ! '>=4.10.1.0 && <4.13' hedgehog: ! '>=0.1 && <0.7' all-versions: - '0.1' author: chessai latest: '0.1' description-type: markdown description: |- # hedgehog-generic [![Hackage](https://img.shields.io/hackage/v/hedgehog-generic.svg)](https://hackage.haskell.org/package/hedgehog-generic) [![BSD3 license](https://img.shields.io/badge/license-BSD3-blue.svg)](LICENSE) GHC Generics automatically derived hedgehog generators license-name: BSD-3-Clause
Add bob.ip.optflow.liu recipe [skip appveyor]
{% set version = "2.0.5" %} package: name: bob.ip.optflow.liu version: {{ version }} source: fn: bob.ip.optflow.liu-{{ version }}.zip md5: 4fe7ce56f749b7a19e683cb16a946b94 url: https://pypi.python.org/packages/source/b/bob.ip.optflow.liu/bob.ip.optflow.liu-{{ version }}.zip build: entry_points: - bob_of_liu.py = bob.ip.optflow.liu.script.flow:main number: 0 skip: true # [not linux] script: python -B setup.py install --single-version-externally-managed --record record.txt requirements: build: - python - setuptools - bob.extension - bob.blitz - bob.io.base - bob.io.image - bob.io.video - bob.ip.color - gcc # [linux] run: - python - bob.extension - bob.blitz - bob.io.base - bob.io.image - bob.io.video - bob.ip.color - libgcc # [linux] test: commands: - bob_of_liu.py --help - nosetests -sv bob.ip.optflow.liu imports: - bob - bob.ip - bob.ip.optflow - bob.ip.optflow.liu - bob.ip.optflow.liu.script requires: - nose about: home: https://github.com/bioidiap/bob.ip.optflow.liu license: GNU General Public License v3 (GPLv3) summary: Python bindings to the optical flow framework by C. Liu extra: recipe-maintainers: - 183amir
Add example to down and remove a profile
# SPDX-License-Identifier: BSD-3-Clause --- - name: Set {{ profile }} down hosts: all vars: network_connections: - name: "{{ profile }}" persistent_state: absent state: down roles: - linux-system-roles.network ...
Add missing module to make it work
--- classes: - apache - collectd::plugin::postfix - mx_asf - pflogsumm - ssl::name::wildcard_apache_org postfix::server::inet_interfaces: 'all' postfix::server::postscreen: 'true' postfix::server::relay_domains: 'apache.org,incubator.apache.org,apachecon.com,apachecon.eu,subversion.com,subversion.net,subversion.org' postfix::server::recipient_canonical_maps: 'regexp:/etc/postfix/recipient_canonical_maps' postfix::server::mynetworks: 'hash:/etc/postfix/network_map' postfix::server::asf_mx_enabled: 'true' postfix::server::asf_mx_content_filter: 'smtp-amavis:[spamc1-us-west.apache.org]:10024' postfix::server::max_postfix_amavis_procs: '25' postfix::server::max_use_postfix_amavis: '25' postfix::dbfile: network_map: content: '127.0.0.0/8 %{network_eth0}/%{netmask} [::1]/128 [fe80::]/64 209.188.14.139/32'
--- classes: - apache - collectd::plugin::postfix - mx_asf - pflogsumm - postfix_asf - ssl::name::wildcard_apache_org postfix::server::inet_interfaces: 'all' postfix::server::postscreen: 'true' postfix::server::relay_domains: 'apache.org,incubator.apache.org,apachecon.com,apachecon.eu,subversion.com,subversion.net,subversion.org' postfix::server::recipient_canonical_maps: 'regexp:/etc/postfix/recipient_canonical_maps' postfix::server::mynetworks: 'hash:/etc/postfix/network_map' postfix::server::asf_mx_enabled: 'true' postfix::server::asf_mx_content_filter: 'smtp-amavis:[spamc1-us-west.apache.org]:10024' postfix::server::max_postfix_amavis_procs: '25' postfix::server::max_use_postfix_amavis: '25' postfix::dbfile: network_map: content: '127.0.0.0/8 %{network_eth0}/%{netmask} [::1]/128 [fe80::]/64 209.188.14.139/32'
Update from Hackage at 2015-11-27T11:16:41+0000
homepage: https://github.com/atzedijkstra/delimiter-separated changelog-type: '' hash: 85d9b153e18011c857629c5d689ead637073739700cb174aa4dae2c42f0ab92f test-bench-deps: {} maintainer: atze@uu.nl synopsis: Library for dealing with tab and/or comma (or other) separated files changelog: '' basic-deps: uulib: ! '>=0.9' uhc-util: ! '>=0.1.1.0' base: ! '>=4 && <5' all-versions: - '0.1.0.0' author: atze@uu.nl latest: '0.1.0.0' description-type: markdown description: ! '# delimiter-separated Haskell library for dealing with tab and/or comma (or other) separated files ' license-name: BSD3
Add GitHub action for syncing main branch
# Synchronize all pushes to 'master' branch with 'main' branch to facilitate migration name: "Sync main branch" on: push: branches: - master jobs: sync_latest_from_upstream: runs-on: ubuntu-latest name: Sync latest commits from master branch steps: - name: Checkout target repo uses: actions/checkout@v2 with: ref: main - name: Sync upstream changes id: sync uses: aormsby/Fork-Sync-With-Upstream-action@v3.0 with: target_sync_branch: main target_repo_token: ${{ secrets.GITHUB_TOKEN }} upstream_sync_branch: master upstream_sync_repo: elastic/elasticsearch-hadoop
Update from Hackage at 2017-03-21T16:14:59Z
homepage: https://github.com/qoelet/storeviva-login#readme changelog-type: '' hash: ed7f0d41d50fde9ce37c1beb878391af90f0a95f979d9b437977d000ecaf4621 test-bench-deps: bytestring: -any base: ! '>=4.7 && <5' base64-bytestring: -any hspec: ==2.* SimpleAES: -any QuickCheck: -any string-conversions: -any maintainer: Kenny Shen <kenny@machinesung.com> synopsis: Helper functions for setting up Double Submit Cookie defense for forms changelog: '' basic-deps: bytestring: -any base: ! '>=4.7 && <5' base64-bytestring: -any SimpleAES: -any string-conversions: -any all-versions: - '0.1.0' author: '' latest: '0.1.0' description-type: haddock description: See README at <https://github.com/qoelet/dsc#readme> license-name: MIT
Update from Hackage at 2018-12-30T17:34:55Z
homepage: https://github.com/athanclark/z85#readme changelog-type: markdown hash: cf84acb9d9418c876c7d383a43b42aa77be41e6dee84cc6c7746c53771c459bc test-bench-deps: z85: -any bytestring: -any pipes-text: -any attoparsec-binary: -any base: ! '>=4.7 && <5' pipes-bytestring: -any vector-sized: -any text: -any quickcheck-instances: -any pipes: -any tasty-quickcheck: -any tasty-hunit: -any attoparsec: -any tasty: -any QuickCheck: -any maintainer: athan.clark@localcooking.com synopsis: Implementation of the z85 binary codec changelog: | # Changelog for z85-bytestring ## Unreleased changes basic-deps: bytestring: -any pipes-text: -any attoparsec-binary: -any base: ! '>=4.7 && <5' pipes-bytestring: -any vector-sized: -any text: -any pipes: -any attoparsec: -any QuickCheck: -any all-versions: - 0.0.0 author: Athan Clark latest: 0.0.0 description-type: markdown description: | # z85 [z85](https://rfc.zeromq.org/spec:32/Z85/) is a binary string codec, like hexadecimal or base64, but has a higher density of compression than the former, due to its use of a higher base value of 85 than base 64. ByteStrings just need to be a length of a multiple of 4 (a Word32String might be a better name). There are multiple layers of exposed implementation in this package - `Word32 <-> Vector 4 Z85Char` for low level work - Attoparsec `Parser ByteString <-> Parser Text` for slightly higher level parsing of strict data - Pipes `Pipe ByteString Text <-> Pipe Text ByteString` for encoding / decoding streams of strict data - Casual `Lazy.ByteString ~ Lazy.Text` functions for encoding / decoding lazy data. license-name: BSD-3-Clause
Add the adjudicators office to list of transitioned sites
--- site: adjudicatorsoffice whitehall_slug: the-adjudicator-s-office homepage: https://www.gov.uk/government/organisations/the-adjudicator-s-office tna_timestamp: 20190501125725 host: www.adjudicatorsoffice.gov.uk aliases: - adjudicatorsoffice.gov.uk
Check things in the CI (node16 and node18 only)
name: ci on: push: pull_request: jobs: build: runs-on: macos-latest strategy: matrix: node-version: [ '16', '18' ] steps: - uses: actions/checkout@v2 - name: Use Node.js ${{ matrix.node-version }} uses: actions/setup-node@v1 with: node-version: ${{ matrix.node-version }} - run: npm install - run: npm test - run: npm run semi - run: npm run dist
Add github action for testing
name: Node CI on: [push] jobs: build: runs-on: ubuntu-latest strategy: matrix: node-version: [6.x, 8.x, 10.x, 12.x] steps: - name: Use Node.js ${{ matrix.node-version }} uses: actions/setup-node@v1 with: node-version: ${{ matrix.node-version }} - name: install rabbitmq run: | curl -s https://packagecloud.io/install/repositories/rabbitmq/rabbitmq-server/script.deb.sh | sudo bash sudo apt update sudo apt install rabbitmq-server - name: install pre-commit run: | sudo pip install pre-commit - uses: actions/checkout@v1 - name: npm install, build, and test run: | npm install npm run build --if-present npm test env: CI: true
Test against node 4 instead of iojs
language: node_js node_js: - "0.10" - "0.12" - "iojs" sudo: false script: - npm run lint - npm test matrix: allow_failures: - node_js: iojs
language: node_js node_js: - "0.10" - "0.12" - "4" sudo: false script: - npm run lint - npm test matrix: allow_failures: - node_js: "4"
Add the Travis CI configuration
# Use Python language: python # Run the test runner using the same version of Python we use. python: 3.5 # Install the test runner. install: pip install tox # Run each environment separately so we get errors back from all of them. env: - TOX_ENV=py35 - TOX_ENV=pep8 - TOX_ENV=docs script: - tox -e $TOX_ENV
Add Travis CI configuration file
before_install: - sudo apt-get update -qq - sudo apt-get install -qq libboost-regex-dev libyaml-cpp-dev cmake libgtest-dev - cd /usr/src/gtest - sudo cmake CMakeLists.txt - sudo make - sudo cp *.a /usr/lib - cd - script: - make test language: cpp
Build Newsboat on FreeBSD with Cirrus CI
freebsd_instance: image: freebsd-11-2-release-amd64 task: install_script: pkg install -y rust gmake asciidoc pkgconf stfl curl json-c ncurses openssl sqlite3 gettext-tools script: RUST_BACKTRACE=1 gmake
Add workflow to trigger image rebuild
--- name: CRON Trigger testing images on: schedule: # 7am UTC, 12am PDT - cron: '0 7 * * *' push: branches: - main paths: - 'ros2/testing/**' jobs: testing_build: name: Trigger testing image runs-on: ubuntu-latest container: image: osrf/ros2:testing steps: - name: "Check apt updates" if: ${{ github.event_name == 'schedule' }} env: SOURCELIST: sources.list.d/ros2-testing.list run: | apt-get update \ -o Dir::Etc::sourcelist="${SOURCELIST}" apt-get --simulate upgrade \ -o Dir::Etc::sourcelist="${SOURCELIST}" \ | grep "0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded." \ && echo "No apt updates" || echo "TRIGGER=true" >> $GITHUB_ENV - name: "Check file updates" if: ${{ github.event_name == 'push' }} run: | echo "TRIGGER=true" >> $GITHUB_ENV - name: "Trigger Dockerhub URL" if: ${{ fromJSON(env.TRIGGER) }} env: DATA: | { "docker_tag": "testing" } run: | echo ${DATA} \ | curl -H "Content-Type: application/json" \ --data @- \ -X POST ${{ secrets.DOCKER_ROS2_POST_PUSH_TRIGGER }}
Set up basic Travis CI, including rustdoc builds
language: rust sudo: required rust: - stable - beta - nightly before_script: - | pip install 'travis-cargo<0.2' --user && export PATH=$HOME/.local/bin:$PATH script: - | travis-cargo build && travis-cargo test && travis-cargo bench && travis-cargo doc after_success: - travis-cargo --only nightly doc-upload - travis-cargo coveralls env: global: secure: tal0xa4ro9Qd50osPTRELV46VMj3h0kdCFdgxMPoQt2PTYBOPYxRUP8XvgzVcg6PPEkSvtwsrCNDPNNQVctmCIC9QKtFiuznGRPVEz9Da5KfFweddJTCH8Mz9F57EPigA0vdNSLmu+N1pJaDl9AQYXO9cpAiHA4p9sPr7fnQsp6cn5byngJJJYN95Owfs7ahjJrigQYfikq+beqFQnngQBBq6CL0GQkYHp+dv2Z2mrS3+Wm+rnQtlDwdfoCyXAd+u5czMwOiMZNUqffZPpo5new4dmvaXlgMygsZtsKjzTZEJnuA3Kp0DfAA275crkfM7Su2j04RV7W9svwbqxZcqp6kAwOc5lpijrk/DM3KlJOZ/Bwa7CK8+oHmhFXZJvIOiczfkTtCOt0JOIotwoWlzvLFg0YhVzGdcwfwSpNL7cDCZpwsiHyMvZuxaX5ZaCbyLKAKONnBlaNFyumo4Vcah2LE8TU+QUW7dkx75oUfDeBt/NGbgniTwsc2cIkNWl340aCPxTjBU7yO7wofqHLpRrIyJzsC5Vy0tgAA6aDwKFH49OtXorUrqpG+BF2nb+APXvXvIEwXPpNhfpZjMADXxWjfMAmemniR5CQjJl+o0QJURoKUjNvxS6VQW4I5FrTJERPgRowkhLMC9ejYLgaQX4FCMbQZq34he3dIF4ansao=
Add github actions to build library and build docs...
name: build-and-docs on: push: branches: [ develop ] jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2.3.1 - run: | mkdir build && cd build cmake .. && make docs: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2.3.1 - name: Doxygen Action uses: mattnotmitt/doxygen-action@v1.1.0 with: working-directory: 'docs/' doxyfile-path: './' - name: Deploy Docs uses: peaceiris/actions-gh-pages@v3.6.4 with: github_token: ${{ secrets.GITHUB_TOKEN }} publish_branch: gh-pages publish_dir: ./docs/html
Update from Hackage at 2017-05-04T03:23:44Z
homepage: https://github.com/glguy/toml-parser changelog-type: markdown hash: fd731b1a7f192710d3ffd37eab0695a05bd1d1d20a0c377dcac61e2bb78d1f8b test-bench-deps: {} maintainer: emertens@gmail.com synopsis: Parser for the TOML configuration language changelog: ! '# Revision history for toml-parser ## 0.1.0.0 -- YYYY-mm-dd * First version. Released on an unsuspecting world. ' basic-deps: base: ! '>=4.9 && <4.11' time: ! '>=1.6 && <1.9' text: ! '>=1.2 && <1.3' array: ! '>=0.5 && <0.6' all-versions: - '0.1.0.0' author: Eric Mertens latest: '0.1.0.0' description-type: haddock description: ! 'Parser for the TOML configuration language. TOML is specified by <https://github.com/toml-lang/toml>. This language is designed to be easy to understand and unambiguous. This implementation uses Alex and Happy to generate an efficient lexer and parser. It aims to have minimal library dependencies.' license-name: ISC
Set `page.title` instead of `site.title`
# visit https://github.com/mojombo/jekyll/wiki/Configuration for more settings paginate: 10 # pagination based on number of posts paginate_path: "page:num" exclude: ["README.md"] # files to exclude pygments: true markdown: kramdown title: dbyll description: Stylish Jekyll Theme author: name: dbyll email: dbyll@ismaildemirbilek.com github: dbtek twitter: dbtek pinterest: asd123 linkedin: asd123 bio: Your stylish, minimalist theme! email_md5: 726351295ec82e145928582f595aa3aa rss_path: feed.xml categories_path: categories.html tags_path: tags.html BASE_PATH:
# visit https://github.com/mojombo/jekyll/wiki/Configuration for more settings paginate: 10 # pagination based on number of posts paginate_path: "page:num" exclude: ["README.md"] # files to exclude pygments: true markdown: kramdown defaults: - scope: path: "" # empty string for all files values: title: dbyll description: Stylish Jekyll Theme author: name: dbyll email: dbyll@ismaildemirbilek.com github: dbtek twitter: dbtek pinterest: asd123 linkedin: asd123 bio: Your stylish, minimalist theme! email_md5: 726351295ec82e145928582f595aa3aa rss_path: feed.xml categories_path: categories.html tags_path: tags.html BASE_PATH:
Add bob.db.biosecurid.face recipe [skip appveyor]
{% set version = "2.0.5" %} package: name: bob.db.biosecurid.face version: {{ version }} source: fn: bob.db.biosecurid.face-{{ version }}.zip md5: f70ebc28151001459ee83c33af20af1d url: https://pypi.python.org/packages/source/b/bob.db.biosecurid.face/bob.db.biosecurid.face-{{ version }}.zip build: number: 0 skip: true # [not linux] script: python -B setup.py install --single-version-externally-managed --record record.txt requirements: build: - python - setuptools - six - bob.io.base - bob.db.base - bob.db.verification.utils run: - python - six - bob.io.base - bob.db.base - bob.db.verification.utils test: commands: - nosetests -sv bob.db.biosecurid.face imports: - bob - bob.db - bob.db.biosecurid - bob.db.biosecurid.face requires: - nose about: home: https://github.com/bioidiap/bob.db.biosecurid.face license: GNU General Public License v3 (GPLv3) summary: BiosecurID Database Access API for Bob extra: recipe-maintainers: - 183amir
Add a basic Travis config file
language: objective-c osx_image: xcode8.2 script: - xcodebuild -project Movies.xcodeproj -scheme Movies -sdk iphonesimulator -destination 'platform=iOS Simulator,name=iPhone 6S,OS=10.1' build test
Update from Hackage at 2017-03-12T09:39:13Z
homepage: https://github.com/agrafix/powerqueue#readme changelog-type: '' hash: dc9951c1c4f4fbf9b211733d480f8b8206d8fe36c6da905d4d66b6049a18c1bc test-bench-deps: stm: ! '>=2.4' base: -any hspec: ! '>=2.2' powerqueue-distributed: -any async: ! '>=2.1' timespan: -any powerqueue: -any maintainer: mail@athiemann.net synopsis: A distributed worker backend for powerqueue changelog: '' basic-deps: cereal: ! '>=0.5' bytestring: ! '>=0.10' cereal-conduit: -any base: ! '>=4.7 && <5' text: ! '>=1.2' conduit: -any conduit-extra: ! '>=1.1' timespan: -any mtl: -any powerqueue: ! '>=0.1.1' all-versions: - '0.1.0.0' author: Alexander Thiemann latest: '0.1.0.0' description-type: haddock description: A distributed worker backend for powerqueue license-name: BSD3
Add CI (via GitHub Actions)
name: ci on: pull_request: push: branches: [master] jobs: stack: name: ${{ matrix.os }}, ${{ matrix.resolver }} runs-on: ${{ matrix.os }} strategy: fail-fast: false matrix: os: [macOS-latest, ubuntu-latest, windows-latest] resolver: ['lts-17.12'] # resolver: ['lts-17.12', 'nightly-2021-05-23'] stack: ['2.7.1'] env: STACK : stack build --resolver ${{ matrix.resolver }} CACHE_DIR : ${{ matrix.os == 'windows-latest' && '~/AppData/Roaming/stack' || '~/.stack' }} BINARY : ./dist/adventofcode-exe${{ matrix.os == 'windows-latest' && '.exe' || '' }} steps: - uses: actions/checkout@v2 - name: Install stack uses: actions/setup-haskell@v1.1.4 with: stack-version: ${{ matrix.stack }} enable-stack: true stack-no-global: true - name: Cache ${{ env.CACHE_DIR }} uses: actions/cache@v2.1.3 with: path: ${{ env.CACHE_DIR }} key: ${{ runner.os }}-${{ matrix.resolver }}-stack-cache-5 - name: Clean run: stack clean - name: Clean ~/.stack/setup-exe-* (skip for Windows) if: ${{ runner.os != 'Windows' }} run: rm -rf ~/.stack/setup-exe-cache && rm -rf ~/.stack/setup-exe-src - name: Install GHC and deps run: ${{ env.STACK }} --only-dependencies - name: Build run: | mkdir ./dist ${{ env.STACK }} --test --bench --no-run-tests --no-run-benchmarks --copy-bins --local-bin-path ./dist - name: Test run: ${{ env.STACK }} --test - name: Run run: ${{ env.BINARY }} # - name: Compress # uses: svenstaro/upx-action@2.0.1 # with: # file: ${{ env.BINARY }} # strip: true # args: --ultra-brute
Add Travis CI config file
language: python python: - "2.6" - "2.7" # command to install dependencies install: - pip install . - pip install -r requirements.txt # command to run tests script: make check
Disable some of the Rubocop rules
Style/CommentedKeyword: Enabled: false Lint/AmbiguousRegexpLiteral: Enabled: false Lint/UselessAssignment: Enabled: false Lint/HandleExceptions: Enabled: false Lint/AssignmentInCondition: Enabled: false
Add an AppVeyor file for Windows CI builds
environment: MVN_VERSION: 3.3.3 install: - cinst maven --version=%MVN_VERSION% -y - set PATH=C:\tools\apache-maven-%MVN_VERSION%\bin;%PATH% cache: - C:\Users\appveyor\.m2\repository -> pom.xml build_script: - mvn install -DskipTests=true -Dmaven.javadoc.skip=true -B -V # This is what Travis CI does by default. test_script: - mvn test -B # This is what Travis CI does by default. after_test: - ps: | $url = "https://ci.appveyor.com/api/testresults/junit/$($env:APPVEYOR_JOB_ID)" $files = Get-ChildItem '.\target\surefire-reports\junitreports\TEST-*.xml' ForEach ($file in $files) { (New-Object 'System.Net.WebClient').UploadFile($url, (Resolve-Path $file)) } artifacts: - path: target\*.jar name: Java archive
Add custom spell icons in the spellbook gui and memory gui
spellNameFormat: "&5&ka&r&bMagic Scroll&5&ka&r {NAME}" memorizationEnabled: true spellbookEnabled: true spellbookName: "&bSpellbook" spellpageNameFormat: "&5&ka&r&bSpell Page&5&ka&r {NAME}" manaPerHalfHeart: 1 Mysql: username: user password: pass database: civspellapi hostname: localhost port: 3306 spells: Do Nothing: spell: nop # Defaults to "nop" if not present. manaCost: 1 # Defaults to 0 scrollCastable: true spellbookCastable: true memoryCastable: true # The above three default to false # config: # nop does not take a config, however other spells do.
Add Gitlab CI startup testing
before_script: stages: - test - stop test_job: stage: test script: - ./start.sh - sleep 5 - ./stop.sh only: - master stop: stage: stop script: - ./stop.sh only: - master when: always
Set up CI with Azure Pipelines
# Starter pipeline # Start with a minimal pipeline that you can customize to build and deploy your code. # Add steps that build, run tests, deploy, and more: # https://aka.ms/yaml trigger: - master pool: vmImage: 'ubuntu-latest' steps: - script: echo Hello, world! displayName: 'Run a one-line script' - script: | echo Add other tasks to build, test, and deploy your project. echo See https://aka.ms/yaml displayName: 'Run a multi-line script'
Add a GitHub funding config
# These are supported funding model platforms github: # Replace with up to 4 GitHub Sponsors-enabled usernames e.g., [user1, user2] - jaraco - webknjaz # patreon: # Replace with a single Patreon username # open_collective: # Replace with a single Open Collective username # ko_fi: # Replace with a single Ko-fi username tidelift: pypi/Cheroot # A single Tidelift platform-name/package-name e.g., npm/babel # community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry # liberapay: # Replace with a single Liberapay username # issuehunt: # Replace with a single IssueHunt username # otechie: # Replace with a single Otechie username # custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']
Add node to ring role
--- # The MIT License (MIT) # # Copyright (c) 2015 Taio Jia (jiasir) <jiasir@icloud.com> # # Permission is hereby granted, free of charge, to any person obtaining a copy # of this software and associated documentation files (the "Software"), to deal # in the Software without restriction, including without limitation the rights # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell # copies of the Software, and to permit persons to whom the Software is # furnished to do so, subject to the following conditions: # # The above copyright notice and this permission notice shall be included in all # copies or substantial portions of the Software. # # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE # SOFTWARE. - name: Add each storage node to the account ring shell: swift-ring-builder account.builder add {{ swift_storage_mgmt_ip }}:6002/{{ device_name }} {{ device_weight }} chdir=/etc/swift - name: Add each storage node to the container ring shell: swift-ring-builder container.builder add {{ swift_storage_mgmt_ip }}:6001/{{ device_name }} {{ device_weight }} chdir=/etc/swift - name: Add each storage node to the object ring shell: swift-ring-builder object.builder add {{ swift_storage_mgmt_ip }}:6000/{{ device_name }} {{ device_weight }} chdir=/etc/swift
Add optking recipe. Generated with grayskull
{% set name = "optking" %} {% set version = "0.1.1" %} package: name: {{ name|lower }} version: {{ version }} source: url: https://github.com/psi-rking/optking/archive/{{ version }}.tar.gz sha256: 57c5b15fc160a6ed7cf260f669f332d97fd74ff5f0eb6cad9cf27d65d049d24a build: noarch: python script: {{ PYTHON }} -m pip install . -vv number: 0 requirements: host: - python - pip run: - python >=3.7 - numpy >=1.21 - qcelemental >=0.21.0 - qcengine >=0.21.0 - msgpack-python >=1.0 test: imports: - optking commands: - pip check - pytest -k test_lj_external_gradient source_files: - optking/tests/test_opthelper.py requires: - pip - pytest >=4.0.0 about: home: https://github.com/psi-rking/optking license: BSD-3-Clause license_file: LICENSE summary: A molecular optimizer for Quantum Chemistry calculations. dev_url: https://github.com/psi-rking/optking.git doc_url: https://optking.readthedocs.io/en/latest/ doc_source_url: https://github.com/psi-rking/optking/blob/master/docs/source/index.rst extra: recipe-maintainers: - AlexHeide
Set up marker sync for @muriloricci
name: 'Import markers from Ricci' on: workflow_dispatch jobs: import-markers: runs-on: ubuntu-latest steps: - name: Checkout uses: actions/checkout@v2 with: ref: ${{ github.head_ref }} token: ${{ secrets.PERSONAL_ACCESS_TOKEN }} - name: Set up Node.js uses: actions/setup-node@v2 with: node-version-file: '.nvmrc' - name: Install dependencies run: npm install - name: Import markers env: URL: ${{secrets.MARKERS_URL_RICCI}} run: | mkdir -p minimap curl --silent --location --output minimap/minimapmarkers.bin "${URL}" node_modules/.bin/tibia-maps --from-minimap=minimap --output-dir=data --markers-only git config user.name 'Mathias Bynens' git config user.email 'mathias@qiwi.be' git commit data/markers.json -m 'Import marker changes from Ricci’s client' git push
Add config file for production server
# Build settings # Gems gems: - jekyll-paginate - jekyll-feed # url is the full website URL # baseurl is the website's URL without the hostname # If you are hosting it on Github for some project with name 'projectname', then the url and baseurl should look like: #url: "http://username.github.io/projectname" #baseurl: "/projectname" url: "https://uhlibraries-digital.github.io/bcdams-map" enforce_ssl: "" baseurl: "/bcdams-map" # Title of website title: BCDAMS MAP #Default keywords (used if no keywords are specified on a page basis) keywords: digital asset management, university of houston, metadata application profile # Short description of your site desc: "This is the metadata application profile for the University of Houston Library's digital asset management system based on Hydra in a Box: the Bayou City DAMS." # --- Navigation bar options --- # # Image to show in the navigation bar - image must be a square (width = height) # Remove this parameter if you don't want an image in the navbar avatar: "/img/uh.jpg" # List of links in the navigation bar nav-links: Home: "" Guidelines: "guidelines" About: "about" Contact: "contact" # --- Footer options --- # # If the values are empty, they are ignored profile: name: Andrew Weidner email: ajweidner@uh.edu # facebook: your-fb-id github: uhlibraries-digital # twitter: your-twitter-id # linkedin: your-linkedin-id # stackoverflow: your-stackoverflow-id # To display link in the footer section # pretty-url: "bchetty.com" # --- Misc --- # # Your Disqus profile (shortname) settings # disqus: "Your-Disqus-Id" # Google Analytics Settings # google_analytics: "Your-GA-Id" # Set these options as you need (For more information, check Jekyll's site) timezone: "America/Chicago" markdown: kramdown highlighter: rouge permalink: /blog/:title paginate: 5 # Default YAML values (more information on Jekyll's site) defaults: - scope: path: "" type: "posts" values: comments: false # add comments to all blog posts - scope: path: "" # all files 
values: layout: "default" show-avatar: true # Exclude these files from production site exclude: - Gemfile - Gemfile.lock - LICENSE - README.md - CNAME created-by: Weidner, Andrew creator-url: https://id.lib.uh.edu/ark:/84475/au2798md59p
Set up CI with Azure Pipelines
# .NET Desktop # Build and run tests for .NET Desktop or Windows classic desktop solutions. # Add steps that publish symbols, save build artifacts, and more: # https://docs.microsoft.com/azure/devops/pipelines/apps/windows/dot-net trigger: - main pool: vmImage: 'windows-latest' variables: solution: '**/*.sln' buildPlatform: 'Any CPU' buildConfiguration: 'Release' steps: - task: NuGetToolInstaller@1 - task: NuGetCommand@2 inputs: restoreSolution: '$(solution)' - task: VSBuild@1 inputs: solution: '$(solution)' platform: '$(buildPlatform)' configuration: '$(buildConfiguration)' - task: VSTest@2 inputs: platform: '$(buildPlatform)' configuration: '$(buildConfiguration)'
Configure Government IT Profession site
--- site: civilservice_it whitehall_slug: civil-service homepage_title: 'Civil Service IT Profession' homepage: https://www.gov.uk/government/organisations/civil-service-government-it-profession/about tna_timestamp: 20140618001128 host: it.civilservice.gov.uk global: =301 https://www.gov.uk/government/organisations/civil-service-government-it-profession/about
Define development default for Let's Encrypt variable
env: development ferm_enabled: false mysql_root_password: "{{ vault_mysql_root_password }}" # Define this variable in group_vars/development/vault.yml web_user: vagrant
acme_tiny_challenges_directory: "{{ www_root }}/letsencrypt" env: development ferm_enabled: false mysql_root_password: "{{ vault_mysql_root_password }}" # Define this variable in group_vars/development/vault.yml web_user: vagrant
Update from Hackage at 2018-09-12T12:47:46Z
homepage: http://github.com/eltix/quantities changelog-type: markdown hash: a715b6d0467038237c9d20f0bfb88815a1563e08380607dec5be5d31942c1b42 test-bench-deps: hlint: -any base: ==4.* hspec: -any process: -any parsec: -any doctest: -any containers: -any regex-compat: -any mtl: -any quantities: -any Glob: -any maintainer: Eliott Tixier <eliott.tixier@novadiscovery.com> synopsis: Unit conversion and manipulation library. changelog: ! ' # jinquantities changelog --- ## 0.1.0 (2018-09-07) - Fork from https://github.com/jdreaver/quantities ' basic-deps: base: ==4.* parsec: -any containers: -any mtl: -any quantities: -any all-versions: - '0.1.0' author: Eliott Tixier <eliott.tixier@novadiscovery.com> latest: '0.1.0' description-type: haddock description: ! 'Credit goes to https://github.com/jdreaver/quantities from which this package is initially a fork. A library for creating and manipulating physical quantities, which are a numerical value associated with a unit of measurement. Included is an expression parser and a huge list of predefined quantities with which to parse strings into a Quantity datatype. Once created, a quantity can be converted to different units or queried for its dimensionality. A user can also operate on quantities arithmetically, and doing so uses automatic unit conversion and simplification.' license-name: BSD3
Add release notes for Immutable Parameters feature
--- features: - Adds a new "immutable" boolean field to the parameters section in a HOT template. This gives template authors the ability to mark template parameters as immutable to restrict updating parameters which have destructive effects on the application. A value of True results in the engine rejecting stack-updates that include changes to that parameter. When not specified in the template, "immutable" defaults to False to ensure backwards compatibility with old templates.
Update from Hackage at 2021-04-21T21:05:55Z
homepage: https://github.com/parsonsmatt/lift-type#readme changelog-type: markdown hash: ee0e3a3da22c8de7c4de1addaff85a42deb0d01497d0458fb72f285a631e8da3 test-bench-deps: lift-type: -any base: '>=4.7 && <5' template-haskell: -any maintainer: parsonsmatt@gmail.com synopsis: '' changelog: | # Changelog for lift-typeable ## Unreleased changes basic-deps: base: '>=4.7 && <5' template-haskell: -any all-versions: - 0.1.0.0 author: Matt Parsons latest: 0.1.0.0 description-type: markdown description: | # lift-type This library provides a utility function `liftType` which accepts a type application argument and returns the `Template Haskell` `Type` representation of it. license-name: BSD-3-Clause
Update from Hackage at 2016-09-15T14:47:04+0000
homepage: https://github.com/RyanGlScott/code-page changelog-type: markdown hash: a541192b317c29962c8f98e4613c6311032699bfce26a8483f7102fc469ca74e test-bench-deps: base: ! '>=4.3 && <5' code-page: ==0.1 maintainer: Ryan Scott <ryan.gl.scott@gmail.com> synopsis: Windows code page library for Haskell changelog: ! "# 0.1 [YYYY.MM.DD]\r\n\r\n* Initial commit.\r\n" basic-deps: base: ! '>=4.3 && <5' all-versions: - '0.1' author: Ryan Scott latest: '0.1' description-type: markdown description: ! "# `code-page`\n[![Hackage](https://img.shields.io/hackage/v/code-page.svg)][Hackage: code-page]\n[![Hackage Dependencies](https://img.shields.io/hackage-deps/v/code-page.svg)](http://packdeps.haskellers.com/reverse/code-page)\n[![Haskell Programming Language](https://img.shields.io/badge/language-Haskell-blue.svg)][Haskell.org]\n[![BSD3 License](http://img.shields.io/badge/license-BSD3-brightgreen.svg)][tl;dr Legal: BSD3]\n[![Linux build](https://img.shields.io/travis/RyanGlScott/code-page.svg)](https://travis-ci.org/RyanGlScott/code-page)\n[![Windows build](https://ci.appveyor.com/api/projects/status/kaxqsgm2xx66l2q5?svg=true)](https://ci.appveyor.com/project/RyanGlScott/code-page)\n\n[Hackage: code-page]:\n http://hackage.haskell.org/package/code-page\n \"code-page package on Hackage\"\n[Haskell.org]:\n http://www.haskell.org\n \"The Haskell Programming Language\"\n[tl;dr Legal: BSD3]:\n https://tldrlegal.com/license/bsd-3-clause-license-%28revised%29\n \ \"BSD 3-Clause License (Revised)\"\n\nWindows code page library for Haskell\n" license-name: BSD3
Use Ansible 2.x feature in "cikit-jenkins"
--- - include: repo.yml - name: Download Jenkins get_url: url: "{{ jenkins.package }}" dest: ~/jenkins.deb mode: 440 - name: Create default config template: src: jenkins.j2 dest: /etc/default/jenkins - name: Install Jenkins # Do not override previously created Jenkins config during installation. shell: "yes N | dpkg -i ~/jenkins.deb" - name: Restart service service: name: jenkins state: restarted - include: cli.yml - name: Copy configs template: src: "configs/{{ item | basename }}" dest: '{{ jenkins_lib }}/{{ item | basename | splitext | first }}' owner: "{{ jenkins_data.user }}" group: "{{ jenkins_data.group }}" force: yes with_fileglob: ../templates/configs/*.j2 - name: Copy user content copy: src: "../files/userContent" dest: "{{ jenkins_lib }}" owner: "{{ jenkins_data.user }}" group: "{{ jenkins_data.group }}" - name: Restart service service: name: jenkins state: restarted - name: Install/update plugins # Firstly: plugin downloading. # Secondly: install plugin dependencies. shell: > java -jar {{ jenkins.cli }} -s {{ jenkins_host }} install-plugin http://mirrors.jenkins-ci.org/plugins/{{ item.key }}/{{ item.value }}/{{ item.key }}.hpi && curl -X POST -d "<jenkins><install plugin='{{ item.key }}@{{ item.value }}' /></jenkins>" -H 'Content-Type: text/xml' {{ jenkins_host }}/pluginManager/installNecessaryPlugins with_dict: "{{ plugins }}" notify: Restart Jenkins - name: Restart Nginx service: name: nginx state: restarted ignore_errors: yes - name: Adding Jenkins user to permitted groups user: name: "{{ jenkins_data.user }}" groups: shadow,adm append: yes - name: Adding Jenkins user to nopasswd sudoers lineinfile: dest: /etc/sudoers line: "{{ jenkins_data.user }} ALL=(ALL) NOPASSWD:ALL"