Update from Hackage at 2018-08-14T20:10:26Z
homepage: ''
changelog-type: ''
hash: c3cc910c168f5cc41ed075e787c8d259a4d1d055ee90146024fc514d62245098
test-bench-deps: {}
maintainer: strake888@gmail.com
synopsis: Foldable types with at least 1 element
changelog: ''
basic-deps:
  base: ! '>=4.7 && <5'
  util: -any
  transformers: ! '>=0.3 && <0.6'
all-versions:
- '0.1.0.0'
author: M Farkas-Dyck
latest: '0.1.0.0'
description-type: markdown
description: ! '# foldable1'
license-name: BSD3
Update from Hackage at 2017-08-13T22:07:00Z
homepage: https://github.com/jez/pandoc-sidenote#readme
changelog-type: ''
hash: f5a5fdab1900da7b26ca673d14302b0a27b56024ca811babcacab79ba9614e90
test-bench-deps: {}
maintainer: zimmerman.jake@gmail.com
synopsis: Convert Pandoc Markdown-style footnotes into sidenotes
changelog: ''
basic-deps:
  base: <5
  pandoc-types: -any
  pandoc: -any
  monad-gen: -any
  pandoc-sidenote: -any
all-versions:
- '0.19.0.0'
author: Jake Zimmerman
latest: '0.19.0.0'
description-type: haddock
description: Convert Pandoc Markdown-style footnotes into sidenotes
license-name: MIT
Fix the skipping stuff and make this work on OSX too
build:
  number: 1
about:
  home: http://ccb.jhu.edu/software/tophat/index.shtml
  license: Boost Software License
  summary: A spliced read mapper for RNA-Seq
package:
  name: tophat
  version: 2.1.1
requirements:
  build:
  - python # [not py3k]
  run:
  - python # [not py3k]
  - bowtie2 <=2.2.5
test:
  commands:
  - (tophat --version 2>&1) > /dev/null
source:
  fn: tophat-2.1.1.Linux_x86_64.tar.gz
  url: http://ccb.jhu.edu/software/tophat/downloads/tophat-2.1.1.Linux_x86_64.tar.gz
  md5: 97fe58465a01cb0a860328fdb1993660
build:
  number: 1
about:
  home: http://ccb.jhu.edu/software/tophat/index.shtml
  license: Boost Software License
  summary: A spliced read mapper for RNA-Seq
package:
  name: tophat
  version: 2.1.1
build:
  skip: True # [py3k]
requirements:
  build:
  - python
  run:
  - python
  - bowtie2 <=2.2.5
test:
  commands:
  - (tophat --version 2>&1) > /dev/null
source:
  fn: tophat-2.1.1.Linux_x86_64.tar.gz
  url: http://ccb.jhu.edu/software/tophat/downloads/tophat-2.1.1.Linux_x86_64.tar.gz # [linux]
  md5: 97fe58465a01cb0a860328fdb1993660 # [linux]
  url: https://ccb.jhu.edu/software/tophat/downloads/tophat-2.1.1.OSX_x86_64.tar.gz # [osx]
  md5: 1e4e7f8d08f182d2db3202975f284294 # [osx]
Add workflow to validate Gradle wrapper
name: "Validate Gradle Wrapper" on: [push, pull_request] jobs: validation: name: "Validation" runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - uses: gradle/wrapper-validation-action@v1
Update from Hackage at 2020-08-19T14:17:29Z
homepage: https://www.manuelbaerenz.de/#computerscience
changelog-type: markdown
hash: 0d62625e019e26d5e088ef06e4aa2d2c11578fc80e3beadcb86ee4368ba4f3b1
test-bench-deps:
  http-client: '>=0.7.1'
  base: '>=4.11 && <5'
  essence-of-live-coding: '>=0.2.1'
  essence-of-live-coding-warp: -any
maintainer: programming@manuelbaerenz.de
synopsis: General purpose live coding framework
changelog: |
  # Revision history for essence-of-live-coding-warp

  ## 0.2.1

  * Simple single-threaded version
basic-deps:
  warp: '>=3.3.13'
  wai: '>=3.2.2.1'
  base: '>=4.11 && <5'
  essence-of-live-coding: '>=0.2.1'
  http-types: '>=0.12.3'
all-versions:
- 0.2.1
author: Manuel Bärenz
latest: 0.2.1
description-type: haddock
description: |-
  essence-of-live-coding is a general purpose and type safe live coding framework.
  You can run programs in it, and edit, recompile and reload them while they're running.
  Internally, the state of the live program is automatically migrated when performing hot code swap.

  The library also offers an easy to use FRP interface. It is parametrized by its side effects,
  separates data flow cleanly from control flow, and allows to develop live programs from
  reusable, modular components. There are also useful utilities for debugging and quickchecking.

  This library contains a single-threaded interface to the WARP web server.
  WAI applications can be run this way.
license-name: BSD-3-Clause
Add Travis file for inline source map
# ----------------------------------------------------------------------------
#
# Package : inline-source-map
# Source Repo : https://github.com/thlorenz/inline-source-map.git
# Travis Job Link : https://travis-ci.com/github/nagesh4193/inline-source-map/builds/212785838
# Created travis.yml : No
# Maintainer : Nageswara Rao K<nagesh4193@gmail.com>/Priya Seth<sethp@us.ibm.com>
#
# Script License : Apache License, Version 2 or later
#
# ----------------------------------------------------------------------------
sudo: false
language: node_js
node_js:
- 6
- 8
arch:
- amd64
- ppc64le
before_install:
- npm install --global npm
Switch Travis to use postgres
services: docker
env:
  global:
  - DOCKER_REGISTRY=registry.heroku.com
  - HEROKU_PROD_APP=standups
  - HEROKU_STAGE_APP=standupstage
  - HEROKU_PROC_TYPE=web
  - DOCKER_CACHE_FILE=/home/travis/docker/cache.tar.gz
cache:
  directories:
  - /home/travis/docker/
before_install:
- docker -v
- docker-compose -v
- echo "ENV GIT_SHA ${TRAVIS_COMMIT}" >> docker/standup_base
- bin/travis-docker-cache.sh load
install: make build
script:
- make test-image
- bin/travis-docker-cache.sh save
deploy:
- provider: script
  script: bin/deploy.sh stage
  on:
    branch: master
    repo: mozilla/standup
- provider: script
  script: bin/deploy.sh prod
  on:
    tags: true
    repo: mozilla/standup
notifications:
  irc: "irc.mozilla.org#standup"
services:
- docker
- postgres
env:
  global:
  - DOCKER_REGISTRY=registry.heroku.com
  - HEROKU_PROD_APP=standups
  - HEROKU_STAGE_APP=standupstage
  - HEROKU_PROC_TYPE=web
  - DOCKER_CACHE_FILE=/home/travis/docker/cache.tar.gz
  - DATABASE_URL=postgres://postgres:@localhost/standup
cache:
  directories:
  - /home/travis/docker/
before_install:
- docker -v
- docker-compose -v
- echo "ENV GIT_SHA ${TRAVIS_COMMIT}" >> docker/standup_base
- bin/travis-docker-cache.sh load
install: make build
before_script:
- psql -c 'create database standup;' -U postgres
script:
- make test-image
- bin/travis-docker-cache.sh save
deploy:
- provider: script
  script: bin/deploy.sh stage
  on:
    branch: master
    repo: mozilla/standup
- provider: script
  script: bin/deploy.sh prod
  on:
    tags: true
    repo: mozilla/standup
notifications:
  irc: "irc.mozilla.org#standup"
Switch to YAML config file
Butler-SOS:
  # Possible log levels are silly, debug, verbose, info, warn, error
  logLevel: verbose

  # Qlik Sense logging db config parameters
  logdb:
    pollingInterval: 15000
    host: <IP or FQDN of Qlik Sense logging db>
    port: 4432
    qlogsReaderUser: qlogs_reader
    qlogsReaderPwd: <pwd>

  # Certificates to use when querying Sense for healthcheck data
  cert:
    clientCert: <path/to/cert/client.pem>
    clientCertKey: <path/to/cert/client_key.pem>
    clientCertCA: <path/to/cert/root.pem>

  # MQTT config parameters
  mqttConfig:
    enableMQTT: true
    brokerIP: <IP of MQTT server>
    baseTopic: butler-sos/

  # Influx db config parameters
  influxdbConfig:
    enableInfluxdb: true
    hostIP: <IP or FQDN of Influxdb server>
    dbName: SenseOps

  serversToMonitor:
    # How often (milliseconds) should the healthcheck API be polled?
    pollingInterval: 5000

    # Sense Servers that should be queried for healthcheck data
    servers:
    - host: <server1.my.domain>
      serverName: <server1>
      availableRAM: 32000
    - host: <server2.my.domain>
      serverName: <server2>
      availableRAM: 24000
Update from Hackage at 2022-06-22T18:01:09Z
homepage: https://github.com/shapr/takedouble
changelog-type: markdown
hash: a134d5db1b70382f8406020c1a6c76227c99e7f0805d5d9b1982f0479f733cdc
test-bench-deps:
  extra: -any
  unix: -any
  base: '>=4.11 && <5'
  filepath: -any
  hedgehog: -any
  takedouble: -any
  temporary: -any
  directory: -any
maintainer: Shae Erisson
synopsis: duplicate file finder
changelog: |
  # Revision history for takedouble

  ## 0.1.0.0 -- 2022-06-22

  * First version. Released on an unsuspecting world. In this first episode,
    files are lazily compared by filesize, then by first and last 4 kilobytes,
    and then by entire file contents. Duplicates are reported as a list of results.
basic-deps:
  bytestring: -any
  extra: -any
  unix: -any
  base: '>=4.11 && <5'
  filepath: -any
  takedouble: -any
  directory: -any
all-versions:
- 0.0.1.1
author: Shae Erisson
latest: 0.0.1.1
description-type: markdown
description: |
  # takedouble

  TakeDouble is a duplicate file finder that reads and checks the filesize and
  first 4k and last 4k of a file and only then checks the full file to find duplicates.

  # How do I make it go?

  You can use nix or cabal to build this. `cabal build` should produce a binary.
  (use [ghcup](https://www.haskell.org/ghcup/) to install cabal and the latest GHC version).

  After that, `takedouble <dirname>` so you could use `takedouble ~/` for example.

  # Is it Fast?

  On my ThinkPad with six Xeon cores, 128GB RAM, and a 1TB Samsung 970 Pro NVMe
  (via PCIe 3.0), I can check 34393 uncached files in 6.4 seconds. A second run on
  the same directory takes 2.8 seconds due to file metainfo cached in memory.
license-name: BSD-3-Clause
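The three-stage comparison the takedouble record describes (group by filesize, then by first and last 4 KiB, then by full contents) can be sketched in a few lines of Python. This is an illustrative sketch only, not takedouble's actual implementation; `find_duplicates` is a hypothetical helper name:

```python
import os
from collections import defaultdict

def find_duplicates(paths, chunk=4096):
    """Report groups of duplicate files using the three-stage strategy:
    size, then first/last `chunk` bytes, then full contents."""
    # Stage 1: group candidate files by size.
    by_size = defaultdict(list)
    for p in paths:
        by_size[os.path.getsize(p)].append(p)

    def edges(p):
        # Read the first and last `chunk` bytes of the file.
        size = os.path.getsize(p)
        with open(p, "rb") as f:
            head = f.read(chunk)
            f.seek(max(0, size - chunk))
            tail = f.read(chunk)
        return head, tail

    duplicates = []
    for same_size in by_size.values():
        if len(same_size) < 2:
            continue
        # Stage 2: group by (head, tail) within each size class.
        by_edges = defaultdict(list)
        for p in same_size:
            by_edges[edges(p)].append(p)
        for candidates in by_edges.values():
            if len(candidates) < 2:
                continue
            # Stage 3: only now compare full contents.
            by_content = defaultdict(list)
            for p in candidates:
                with open(p, "rb") as f:
                    by_content[f.read()].append(p)
            duplicates += [sorted(g) for g in by_content.values() if len(g) > 1]
    return duplicates
```

The point of the staging is that the cheap checks (a stat call, two small reads) eliminate almost all non-duplicates before any full file is read.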
Add test for réduction générale with part-time work (temps partiel)
- period: "2016-01" name: Base relative_error_margin: 0.001 input_variables: allegement_fillon_mode_recouvrement: 1 effectif_entreprise: 1 heures_remunerees_volume: 130 # 130 heures par mois = 30h par semaine = 6/7 salaire_de_base: 1257.4 # ~ équivalent SMIC contrat_de_travail: 1 # temps partiel output_variables: allegement_fillon: > 1257.4 * ( (.2802 / .6) * (1.6 * ( (17599.4 * (6/7)) / (1257.4 * 12)) - 1) )
Update from Hackage at 2018-06-05T15:43:02Z
homepage: https://github.com/patrickt/fastsum#readme
changelog-type: ''
hash: a84b132ca3fc9d8c688910def166a4c440a687179b8abf7242611a2c9d5feaff
test-bench-deps: {}
maintainer: patrickt@github.com
synopsis: A fast open-union type suitable for 100+ contained alternatives
changelog: ''
basic-deps:
  base: ! '>=4.7 && <5'
  fastsum: -any
  ghc-prim: -any
  hashable: -any
  template-haskell: -any
all-versions:
- '0.1.0.0'
author: Rob Rix, Josh Vera, Allele Dev, Patrick Thomson
latest: '0.1.0.0'
description-type: markdown
description: |
  # fastsum

  This package provides `Data.Sum`, an open-union type, similar to the `Union` type
  that powers the implementation of [Oleg Kiselyov's extensible-effects library](http://okmij.org/ftp/Haskell/extensible/).

  Unlike most open-union implementations, this type is very fast to compile, even
  when the type-level list of alternatives contains hundreds of entries. Membership
  queries are constant-time, compiling to a single type-level natural lookup in a
  closed type family, unlike the traditional encoding of `Union`, which relies on
  recursive typeclass lookups. As such, this type lends itself to representing
  abstract syntax trees or other rich data structures.

  GHC 8's support for custom type errors provides readable type errors should
  membership constraints not be satisfied.

  In order to achieve speed, this package makes fewer guarantees about what can be
  proven given a `Member` instance. If you require a richer vocabulary to describe
  the implications of membership, you should use the traditional implementation of
  open-unions.

  # Credits

  This library is built on the work of Oleg Kiselyov, which was then modified by
  Allele Dev. It was extracted from Josh Vera's
  [effects](https://github.com/joshvera/effects/) library. Rob Rix implemented the
  `ElemIndex` type family and the `Apply` typeclass.
license-name: BSD3
Add skeleton for publish workflow with inputs
#/
# @license Apache-2.0
#
# Copyright (c) 2022 The Stdlib Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#/

# Workflow name:
name: standalone_publish

# Workflow triggers:
on:
  # Allow the workflow to be manually run:
  workflow_dispatch:
    inputs:
      skip:
        type: choice
        description: Whether to skip individual or toplevel package publishing
        options:
        -
        - --skip-toplevel
        - --skip-individual
      packages:
        description: Space-separated list of packages to publish
      skip-upload:
        type: boolean
        description: Boolean flag indicating whether to skip uploading the packages (dry run)
      patch:
        type: boolean
        description: Boolean flag indicating whether to create a patch release for the packages

# Workflow concurrency group:
concurrency:
  # Specify a group name:
  group: ${{ github.workflow }}
  # Specify whether to cancel any currently running workflow in the same concurrency group:
  cancel-in-progress: false

# Workflow jobs:
jobs:
  # Define a job for publishing standalone packages...
  publish:
    # Define a display name:
    name: 'Publish'
    # Define the type of virtual host machine:
    runs-on: ubuntu-latest
    # Define the sequence of job steps...
    steps:
      # Checkout the repository:
      - name: 'Checkout repository'
        uses: actions/checkout@v2
        with:
          # Specify whether to remove untracked files before checking out the repository:
          clean: false
          # Limit clone depth to the most recent commit:
          fetch-depth: 1
          # Specify whether to download Git-LFS files:
          lfs: false
        timeout-minutes: 10
      # Install Node.js:
      - name: 'Install Node.js'
        uses: actions/setup-node@v2
        with:
          node-version: '16' # 'lts/*'
        timeout-minutes: 5
      # Install dependencies (accounting for possible network failures, etc, when installing node module dependencies):
      - name: 'Install dependencies'
        run: |
          make install-node-modules || make install-node-modules || make install-node-modules
        timeout-minutes: 15
      # Publish standalone packages:
      - name: 'Publish packages'
        env:
          GITHUB_TOKEN: ${{ secrets.REPO_GITHUB_TOKEN }}
        run: |
Move to Travis CI Container builds
language: node_js
node_js:
- "0.12"
- "0.10"
- "0.8"
- "iojs"
env:
- TEST="all"
- TEST="node"
matrix:
  exclude:
  - node_js: "iojs"
    env: TEST="all"
  - node_js: "0.12"
    env: TEST="all"
  - node_js: "0.10"
    env: TEST="node"
  - node_js: "0.8"
    env: TEST="all"
notifications:
  email:
  - jchan@linkedin.com
  - skinast@linkedin.com
before_install:
- npm install -g npm@1.4
- npm --version
- npm install -g grunt-cli
script:
- "[ $TEST = 'all' ] && grunt test || grunt testNode"
language: node_js
node_js:
- "0.12"
- "0.10"
- "0.8"
- "iojs"
env:
- TEST="all"
- TEST="node"
matrix:
  exclude:
  - node_js: "iojs"
    env: TEST="all"
  - node_js: "0.12"
    env: TEST="all"
  - node_js: "0.10"
    env: TEST="node"
  - node_js: "0.8"
    env: TEST="all"
notifications:
  email:
  - jchan@linkedin.com
  - skinast@linkedin.com
before_install:
- npm install -g npm@1.4
- npm --version
- npm install -g grunt-cli
script:
- "[ $TEST = 'all' ] && grunt test || grunt testNode"
sudo: false
Add Travis YAML configuration for CI.
language: java
jdk:
- oraclejdk7
- openjdk6
before_install:
- sudo apt-get update -qq
- sudo apt-get install -qq mercurial libsqlite3-dev python-dev
Add AppVeyor as Windows Continuous Integration
environment:
  JAVA_HOME: C:\Program Files\Java\jdk1.8.0
os: Visual Studio 2017 # Windows Server 2016
install:
- java -version
- mvn --version
build_script:
- mvn install -DskipTests=true -Dmaven.javadoc.skip=true -B -V
test_script:
- mvn test --batch-mode
- mvn checkstyle:check --batch-mode --fail-never
cache:
- C:\Users\appveyor\.m2\ -> pom.xml
Update from Hackage at 2018-05-11T22:49:37Z
homepage: ''
changelog-type: markdown
hash: f83a169be832e51d14b5b7e8c7dca17e057fad8049c4349ed28b6cac41f509e9
test-bench-deps: {}
maintainer: eben.cowley42@gmail.com
synopsis: Quickly detect clusters and holes in data.
changelog: |
  # Revision history for Persistence

  ## 1.0 -- YYYY-mm-dd

  * First version. Released on an unsuspecting world.
basic-deps:
  base: ! '>=4.10 && <4.11'
  containers: ! '>=0.5 && <0.6'
  maximal-cliques: ! '>=0.1 && <0.2'
  parallel: ! '>=3.2 && <3.3'
  vector: ! '>=0.12 && <0.13'
all-versions:
- '1.0'
author: Eben Cowley
latest: '1.0'
description-type: haddock
description: >-
  Persistence is a topological data analysis library motivated by flexibility
  when it comes to the type of data being analyzed. If you have data that comes
  with a meaningful function into something of the Ord typeclass, you can use
  Persistence to detect clusters and holes in the data. You can also use the
  library to analyze undirected graphs, and interesting features for directed
  graphs will be added soon.
license-name: BSD3
Update Travis to include 2.1.1
language: ruby
rvm:
- 1.9.2
- 1.9.3
- 2.0.0
- jruby-19mode
- rbx-19mode
before_install:
- git submodule update --init --recursive
language: ruby
rvm:
- 1.9.2
- 1.9.3
- 2.0.0
- 2.1.1
- jruby-19mode
- rbx-19mode
before_install:
- git submodule update --init --recursive
Update from Hackage at 2018-08-02T09:07:28Z
homepage: https://github.com/metrix-ai/cereal-uuid
changelog-type: ''
hash: 2c8002f5090ba7b9c1092055d35208f91b15c18a8a9c75e3c7a5850a10eafe35
test-bench-deps: {}
maintainer: Metrix.AI Ninjas <ninjas@metrix.ai>
synopsis: Integration of "cereal" and "uuid"
changelog: ''
basic-deps:
  cereal: ! '>=0.5.5 && <0.6'
  bytestring: ! '>=0.10 && <0.11'
  base: <5
  uuid: ! '>=1.3 && <2'
all-versions:
- '0.1'
author: Nikita Volkov <nikita.y.volkov@mail.ru>
latest: '0.1'
description-type: haddock
description: ''
license-name: MIT
Update from Hackage at 2020-07-11T20:46:30Z
homepage: https://github.com/swamp-agr/servant-seo#readme
changelog-type: markdown
hash: 2b7262f4fd8f2fd0978c4c9be7c19803afd4e5e261c5b67dc01b7ae1a64208e7
test-bench-deps:
  base: -any
  filepath: -any
  doctest: '>=0.11.1 && <0.18'
  servant-seo: -any
  QuickCheck: -any
  directory: '>=1.0'
maintainer: persiantiger@yandex.ru
synopsis: Generate Robots.txt and Sitemap.xml specification for your servant API.
changelog: |
  # Changelog for servant-sitemap

  ## 0.1.0 -- 2020-07-11

  - /robots.txt deriving from API: Disallow, Sitemap, User-agent commands.
  - /sitemap.xml deriving from API: Frequency, Priority optional tags included.
  - sitemap index support.
basic-deps:
  warp: -any
  bytestring: -any
  xml-conduit: -any
  base: '>=4.7 && <5'
  blaze-markup: -any
  text: -any
  servant-server: -any
  servant: -any
  containers: -any
  servant-blaze: -any
  lens: '>=4.18.1'
  binary: -any
  aeson: -any
  http-media: -any
all-versions:
- 0.1.0
author: Andrey Prokopenko
latest: 0.1.0
description-type: markdown
description: |
  # servant-seo

  ## Installation

  - Add `servant-seo` to project dependencies.

  ## Usage

  1. Restrict API with `Disallow` combinator to prevent robots making requests.

     ```haskell
     type ProtectedAPI = Disallow "admin" :> Get '[HTML] AdminPage

     type StaticAPI = "cdn" :> Disallow "static" :> Raw
     ```

  2. Add `Priority` and `Frequency` (optional step).

     ```haskell
     type BlogAPI = "blog" :> Frequency 'Always :> Get '[HTML] BlogPage
     ```

  3. Extend your API with something like:

     ```haskell
     Warp.runSettings Warp.defaultSettings $ serveWithSeo website api server
       where
         website = "https://example.com"
     ```

  You will have `/robots.txt` and `/sitemap.xml` automatically generated and ready
  to be served with your API.

  See the `Servant.Seo` module for a detailed description and `example/Example.hs`
  for a showcase.

  ## Acknowledgements

  This library would not have happened without these people. Thank you!

  - @fizruk and **Servant Contributors**: as a source of inspiration.
  - @isovector: for the `Thinking with Types` book.
license-name: BSD-3-Clause
Add configuration for testing the build on Windows
name: Java CI
on: [push, pull_request]
jobs:
  build:
    runs-on: windows-latest
    steps:
    - uses: actions/checkout@v1
    - name: Set up JDK 1.8
      uses: actions/setup-java@v1
      with:
        java-version: 1.8
    - name: Build with Gradle
      run: ./gradlew clean build
Add testing with Github Actions
name: delivery
on: [push, pull_request]
jobs:
  delivery:
    runs-on: ubuntu-latest
    steps:
    - name: Check out code
      uses: actions/checkout@master
    - name: Run Chef Delivery
      uses: actionshub/chef-delivery@master
      env:
        CHEF_LICENSE: accept-no-persist
Add Appveyor CI build configuration
version: 0.1.{build}
branches:
  except:
  - gh-pages
skip_tags: true
os: Visual Studio 2015
init:
- git config --global core.autocrlf input
clone_folder: C:\projects\crole
platform:
- x86
- x64
configuration:
- Debug
- Release
init:
- cmd: cmake --version
- cmd: msbuild /version
before_build:
- cd C:\projects\crole
- mkdir build
- cd build
- if "%platform%"=="Win32" set CMAKE_GENERATOR_NAME=Visual Studio 14 2015
- if "%platform%"=="x64" set CMAKE_GENERATOR_NAME=Visual Studio 14 2015 Win64
- cmake -G "%CMAKE_GENERATOR_NAME%" -DCMAKE_BUILD_TYPE=%configuration%
build:
  project: C:\projects\crole\build\crole.sln
  parallel: true
  verbosity: quiet
test_script:
- C:\projects\crole\build\test\crole-test.exe
Add advisories for Thunderbird 102.4
## mfsa2022-46.yml
announced: October 18, 2022
impact: high
fixed_in:
- Thunderbird 102.4
title: Security Vulnerabilities fixed in Thunderbird 102.4
description: |
  *In general, these flaws cannot be exploited through email in the Thunderbird
  product because scripting is disabled when reading mail, but are potentially
  risks in browser or browser-like contexts.*
advisories:
  CVE-2022-42927:
    title: Same-origin policy violation could have leaked cross-origin URLs
    impact: high
    reporter: James Lee
    description: |
      A same-origin policy violation could have allowed the theft of
      cross-origin URL entries, leaking the result of a redirect, via
      <code>performance.getEntries()</code>.
    bugs:
    - url: 1789128
  CVE-2022-42928:
    title: Memory Corruption in JS Engine
    impact: high
    reporter: Samuel Groß
    description: |
      Certain types of allocations were missing annotations that, if the
      Garbage Collector was in a specific state, could have led to memory
      corruption and a potentially exploitable crash.
    bugs:
    - url: 1791520
  CVE-2022-42929:
    title: Denial of Service via window.print
    impact: moderate
    reporter: Andrei Enache
    description: |
      If a website called <code>window.print()</code> in a particular way, it
      could cause a denial of service of the browser, which may persist beyond
      browser restart depending on the user's session restore settings.
    bugs:
    - url: 1789439
  CVE-2022-42932:
    title: Memory safety bugs fixed in Thunderbird 102.4
    impact: moderate
    reporter: Mozilla developers and community
    description: |
      Mozilla developers Ashley Hale and the Mozilla Fuzzing Team reported
      memory safety bugs present in Thunderbird 102.3. Some of these bugs
      showed evidence of memory corruption and we presume that with enough
      effort some of these could have been exploited to run arbitrary code.
    bugs:
    - url: 1789729, 1791363, 1792041
      desc: Memory safety bugs fixed in Thunderbird 102.4
Add a GitHub Action to build the demo site
name: Build Demo
on: [push]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - uses: actions/setup-node@v2
      with:
        node-version: '16'
    - run: npm install
      working-directory: ./javascript
    - run: npm install
      working-directory: ./demo
    - run: npm run build
      working-directory: ./demo
    - name: Publish Demo
      uses: peaceiris/actions-gh-pages@v3
      if: ${{ github.ref == 'refs/heads/main' }}
      with:
        github_token: ${{ secrets.GITHUB_TOKEN }}
        publish_dir: ./demo/dist
Add Github action for testing
# This workflow will do a clean install of node dependencies, build the source code and run tests across different versions of node
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-nodejs-with-github-actions

name: Node.js CI

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - run: npm ci
    - run: npm test
Add CI config for formatting, tests and coverage.
name: Rust
on:
  push:
    branches: [main]
  pull_request:
env:
  CARGO_TERM_COLOR: always
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Build
      run: cargo build
    - name: Run tests
      run: cargo test
    - name: Run clippy
      uses: actions-rs/clippy-check@v1
      with:
        token: ${{ secrets.GITHUB_TOKEN }}
        args: --all-features
  format:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Format Rust code
      run: cargo fmt --all -- --check
  coverage:
    runs-on: ubuntu-latest
    env:
      RUSTC_BOOTSTRAP: 1
    steps:
    - uses: actions/checkout@v2
    - name: Install dependencies
      run: sudo apt-get install libdbus-1-dev
    - name: Install grcov
      run: curl -L https://github.com/mozilla/grcov/releases/latest/download/grcov-linux-x86_64.tar.bz2 | tar jxf -
    - name: Install llvm-tools
      run: rustup component add llvm-tools-preview
    - name: Build for coverage
      run: cargo build --all-features
      env:
        RUSTFLAGS: "-Zinstrument-coverage"
    - name: Run tests with coverage
      run: cargo test --all-features
      env:
        RUSTFLAGS: "-Zinstrument-coverage"
        LLVM_PROFILE_FILE: "test-coverage-%p-%m.profraw"
    - name: Convert coverage
      run: ./grcov . -s . --binary-path target/debug/ -t lcov --branch --ignore-not-existing -o target/debug/lcov.info
    - name: Upload coverage to codecov.io
      uses: codecov/codecov-action@v1
      with:
        directory: ./target/debug
        fail_ci_if_error: true
Update from Hackage at 2018-11-17T16:25:46Z
homepage: https://github.com/CthulhuDen/nanovg-simple#readme
changelog-type: ''
hash: a3b7f89fc44d02954ca0843e6ba987c5659ba3cd63377b37a484c1e5a91a9748
test-bench-deps: {}
maintainer: cthulhu.den@gmail.com
synopsis: Simple interface to rendering with NanoVG
changelog: ''
basic-deps:
  OpenGL: ! '>=3.0.2.2 && <3.1'
  base: ! '>=4.7 && <5'
  time: ! '>=1.8.0.2 && <1.10'
  text: ! '>=1.2.3.0 && <1.3'
  safe-exceptions: ! '>=0.1.7.0 && <0.2'
  monad-loops: ! '>=0.4.3 && <0.5'
  nanovg-simple: -any
  nanovg: ! '>=0.6.0.0 && <0.7'
  GLFW-b: ! '>=3.2.1.0 && <3.3'
all-versions:
- '0.4.0.0'
author: Cthulhu
latest: '0.4.0.0'
description-type: markdown
description: |
  # nanovg-simple

  Simple interface to creating window with associated NanoVG context. See
  [nanovg.h](https://github.com/memononen/nanovg/blob/master/src/nanovg.h) for a
  comprehensive listing of methods.

  Refer to the `Graphics.NanoVG.Simple` module for utilities to create a NanoVG
  window. Simple example:

  ```haskell
  import Graphics.NanoVG.Simple
  import qualified NanoVG as NVG

  main :: IO ()
  main = run 800 600 "Simple app" $ simpleWindow $
      NVG.circle ctx 10 10 10 *> NVG.fill ctx
  ```

  Also provided is a wrapper for rendering a combination of composable picture
  pieces: see `Graphics.NanoVG.Picture`.

  ```haskell
  import Graphics.NanoVG.Picture
  import Graphics.NanoVG.Simple
  import qualified NanoVG as NVG

  main :: IO ()
  main = run 800 600 "Simple app" $ asWindow $
      pure $ translateP 50 0 $ pictures
        [ fill (NVG.Color 1 1 1 1) $ circle (10, 10) 10
        , stroke (NVG.Color 1 1 1 1) $ circle (10, 10) 15
        ]
  ```
license-name: BSD3
Replace legacy tips jobs with shiny new versions
- job:
    name: cliff-tox-py27-neutronclient-tip
    parent: openstack-tox-py27
    description: |
      Run unit tests for neutronclient with master branch of cliff

      Uses tox with the ``unit-tips`` environment and master branch
      of the required-projects below.
    branches: ^master$
    irrelevant-files:
    - ^.*\.rst$
    - ^doc/.*$
    - ^releasenotes/.*$
    required-projects:
    - openstack/cliff
    - openstack/python-neutronclient
    vars:
      # Set work dir to neutronclient so that if it's triggered by one of the
      # other repos the tests will run in the same place
      zuul_work_dir: src/git.openstack.org/openstack/python-neutronclient

- project:
    name: openstack/cliff
    check:
      jobs:
      - cliff-tox-py27-neutronclient-tip
      - osc-tox-unit-tips:
          branches: ^master$
    gate:
      jobs:
      - cliff-tox-py27-neutronclient-tip
      - osc-tox-unit-tips:
          branches: ^master$
Update from Hackage at 2016-03-31T05:21:02+0000
homepage: ''
changelog-type: ''
hash: b85e20bc26a6b151b3e6fbb8650401acb5787e3c30ff82dbc1e17b5dce11a109
test-bench-deps:
  base: -any
  hspec: ! '>=2.1 && <2.3'
  quiver: -any
  transformers: ! '>=0.4 && <0.6'
  QuickCheck: ! '>=2.5 && <2.9'
  quiver-binary: -any
maintainer: Ivan.Miljenovic@gmail.com
synopsis: Binary serialisation support for Quivers
changelog: ''
basic-deps:
  bytestring: -any
  base: ! '>=4 && <5'
  quiver-bytestring: -any
  quiver: ==1.1.*
  binary: ! '>=0.6 && <0.9'
all-versions:
- '0.1.0.0'
author: Ivan Lazar Miljenovic
latest: '0.1.0.0'
description-type: markdown
description: |
  quiver-binary
  =============

  [![Hackage](https://img.shields.io/hackage/v/quiver-binary.svg)](https://hackage.haskell.org/package/quiver-binary)
  [![Build Status](https://travis-ci.org/ivan-m/quiver-binary.svg)](https://travis-ci.org/ivan-m/quiver-binary)

  This package provides inter-operability between the _[quiver]_
  stream-processing library and the _[binary]_ serialisation library.

  Using this you can efficiently encode/decode native Haskell values in a
  streaming fashion. This is especially useful when combined with the
  _[quiver-bytestring]_ library for I/O.

  [quiver]: http://hackage.haskell.org/package/quiver
  [binary]: https://hackage.haskell.org/package/binary
  [quiver-bytestring]: http://hackage.haskell.org/package/quiver-bytestring
license-name: MIT
Update from Hackage at 2015-11-15T14:36:38+0000
homepage: https://bitbucket.org/bwbush/handa-opengl
changelog-type: ''
hash: 843ee89737f92a9ce83ac95b2141bbd74fd95bb3574bdad41847207189e7a0a4
test-bench-deps: {}
maintainer: Brian W Bush <consult@brianwbush.info>
synopsis: Utility functions for OpenGL and GLUT
changelog: ''
basic-deps:
  OpenGL: ! '>=2.8'
  GLUT: ! '>=2.4'
  base: ! '>=4.6 && <5'
  data-default: ! '>=0.5.3'
  array: ! '>=0.4.0.1'
  opengl-dlp-stereo: ! '>=0.1.4.1'
all-versions:
- '0.1.6.1'
author: Brian W Bush <consult@brianwbush.info>
latest: '0.1.6.1'
description-type: markdown
description: |
  This is a collection of miscellaneous utility functions for OpenGL and GLUT.
  It currently includes support for the following:

  * Managing vertex buffer objects (VBOs).
  * Creating shapes and wireframes.
  * Managing perspectives and frusta.
  * Using the keyboard to navigate scenes.
  * Setting up and initializing GLUT windows using command-line arguments.

  Please report issues at <https://bwbush.atlassian.net/projects/HOGL/issues/>.
license-name: MIT
Add thirdpartyresources create capability to cilium role
kind: ClusterRole
apiVersion: rbac.authorization.k8s.io/v1beta1
metadata:
  name: cilium
rules:
- apiGroups:
  - ""
  resources:
  - pods
  - namespaces
  - nodes
  - ingress
  - endpoints
  verbs:
  - get
  - list
  - watch
- apiGroups:
  - extensions
  resources:
  - networkpolicies
  - thirdpartyresources
  verbs:
  - get
  - list
  - watch
- apiGroups:
  - cilium.io
  resources:
  - ciliumnetworkpolicies
  verbs:
  - "*"
---
apiVersion: v1
kind: ServiceAccount
metadata:
  name: cilium
  namespace: kube-system
---
kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1beta1
metadata:
  name: cilium
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: cilium
subjects:
- kind: ServiceAccount
  name: cilium
  namespace: kube-system
kind: ClusterRole
apiVersion: rbac.authorization.k8s.io/v1beta1
metadata:
  name: cilium
rules:
- apiGroups:
  - ""
  resources:
  - pods
  - namespaces
  - nodes
  - ingress
  - endpoints
  verbs:
  - get
  - list
  - watch
- apiGroups:
  - extensions
  resources:
  - networkpolicies
  - thirdpartyresources
  verbs:
  - create
  - get
  - list
  - watch
- apiGroups:
  - cilium.io
  resources:
  - ciliumnetworkpolicies
  verbs:
  - "*"
---
apiVersion: v1
kind: ServiceAccount
metadata:
  name: cilium
  namespace: kube-system
---
kind: ClusterRoleBinding
apiVersion: rbac.authorization.k8s.io/v1beta1
metadata:
  name: cilium
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: cilium
subjects:
- kind: ServiceAccount
  name: cilium
  namespace: kube-system
Set up CI with Azure Pipelines
# Node.js
# Build a general Node.js project with npm.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/javascript

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm run build
  displayName: 'npm install and build'
Update from Hackage at 2018-07-15T08:11:16Z
homepage: ''
changelog-type: ''
hash: 33a3db3e1bb48bbf83ed9e61bf395fc054939982b3b89dc0025ada1a7c99349c
test-bench-deps:
  base: ! '>=4.9 && <5'
  hspec: ==2.*
  text: -any
  text-metrics: ! '>=0.3.0 && <0.4'
  infer-license: -any
maintainer: Simon Hengel <sol@typeful.net>
synopsis: Infer software license from a given license file
changelog: ''
basic-deps:
  base: ! '>=4.9 && <5'
  text: -any
  text-metrics: ! '>=0.3.0 && <0.4'
all-versions:
- '0.1.0'
author: Simon Hengel <sol@typeful.net>
latest: '0.1.0'
description-type: haddock
description: ''
license-name: MIT
Add scriptengine (a lightweight framework for executing yaml scripts)
{% set name = "scriptengine" %}
{% set version = "0.12.5" %}

package:
  name: {{ name|lower }}
  version: {{ version }}

source:
  url: https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/scriptengine-{{ version }}.tar.gz
  sha256: 800a9a083ed295f72589c0ade7c8f74220e13f7b1bf8e10f69f1bbca6f5d208b

build:
  entry_points:
    - se = scriptengine.cli.se:main
  noarch: python
  script: {{ PYTHON }} -m pip install . -vv
  number: 0

requirements:
  host:
    - pip
    - python >=3.6
  run:
    - deepdiff >=5.7.0
    - deepmerge
    - jinja2
    - python >=3.6
    - python-dateutil
    - pyyaml

test:
  imports:
    - scriptengine
  commands:
    - pip check
    - se --help
  requires:
    - pip

about:
  home: https://github.com/uwefladrich/scriptengine
  summary: A lightweight and extensible framework for executing scripts written in YAML
  license: GPL-3.0
  license_file: LICENSE

extra:
  recipe-maintainers:
    - zklaus
Prepare release notes for release 16.0.0
---
prelude: >
  This release indicates end of support for Mitaka in Tempest.
other:
  - |
    OpenStack Releases Supported after this release are **Newton**
    and **Ocata**

    The release under current development as of this tag is Pike,
    meaning that every Tempest commit is also tested against master
    branch during the Pike cycle. However, this does not necessarily
    mean that using Tempest as of this tag will work against Pike
    (or future releases) cloud.
Build against released version of Chef 0.10.4.
rvm:
  - 1.8.7
  - 1.9.2
env:
  - CHEF_VERSION=0.9.18
  - CHEF_VERSION=0.10.2
  - CHEF_VERSION=0.10.4.rc.5
rvm:
  - 1.8.7
  - 1.9.2
env:
  - CHEF_VERSION=0.9.18
  - CHEF_VERSION=0.10.2
  - CHEF_VERSION=0.10.4
Create github workflow to run Gradle tests
name: Tests

on: [ push, pull_request ]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: true
      matrix:
        os: [ ubuntu-latest ]
        java: [ 8, 9, 10, 11 ]
    name: Java v${{ matrix.java }} - ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
      - name: Set up Java
        uses: actions/setup-java@v1
        with:
          java-version: ${{ matrix.java }}
          architecture: x64
      - name: Cache Gradle packages
        uses: actions/cache@v2
        with:
          path: |
            ~/.gradle/caches
            ~/.gradle/wrapper
          key: ${{ runner.os }}-gradle-${{ hashFiles('**/*.gradle*', '**/gradle-wrapper.properties') }}
          restore-keys: |
            ${{ runner.os }}-gradle-
      - name: Run tests
        run: ./gradlew test
      - name: Cleanup Gradle Cache
        # Remove some files from the Gradle cache, so they aren't cached by GitHub Actions.
        # Restoring these files from a GitHub Actions cache might cause problems for future builds.
        run: |
          rm -f ~/.gradle/caches/modules-2/modules-2.lock
          rm -f ~/.gradle/caches/modules-2/gc.properties
Set up CI with Azure Pipelines
# Node.js
# Build a general Node.js project with npm.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/javascript

pool:
  vmImage: 'Ubuntu 16.04'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '8.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm run build
  displayName: 'npm install and build'
Set up CI with Azure Pipelines
# ASP.NET
# Build and test ASP.NET projects.
# Add steps that publish symbols, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/apps/aspnet/build-aspnet-4

trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@0

- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'

- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactStagingDirectory)"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
Update from Forestry.io - Updated Forestry configuration
---
label: Webinar Template
hide_body: false
fields:
- type: text
  name: layout
  label: layout
- type: text
  name: title
  label: title
- type: file
  name: img-url
  label: img-url
- type: datetime
  name: date
  label: date
- type: boolean
  name: enable-header
  label: enable-header
- type: text
  name: event-type
  label: event-type
Add config for Stale app
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 60
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 7
# Issues with these labels will never be considered stale
exemptLabels:
  - pinned
  - security
# Label to use when marking an issue as stale
staleLabel: stale
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
  This issue has been automatically marked as stale because it has not had
  recent activity. It will be closed if no further activity occurs. Thank you
  for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false
Make GitHub Actions run the existing tests
# Copyright (C) 2021 Sebastian Pipping <sebastian@pipping.org>
# Licensed under the MIT License

name: Run the test suite

on:
  - pull_request
  - push

jobs:
  run-tests:
    name: Run the test suite
    strategy:
      matrix:
        python-version: [2.7, 3.5, 3.9]  # no explicit need for 3.6, 3.7, 3.8
        runs-on: [macos-latest, ubuntu-latest]
    runs-on: ${{ matrix.runs-on }}
    steps:
      - uses: actions/checkout@v2.3.4
      - uses: actions/setup-python@v2.1.4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Run the test suite
        run: |
          python --version
          pip install nose
          pip install -e .
          python -m nose -v
Update from Hackage at 2020-08-31T07:04:27Z
homepage: https://github.com/Lupino/micro-gateway#readme
changelog-type: ''
hash: bc51ddec4f5c6c144a1b3e3297f01b8b0c8574991c4e47a33e99faac2bca52db
test-bench-deps: {}
maintainer: lmjubuntu@gmail.com
synopsis: A Micro service gateway.
changelog: ''
basic-deps:
  warp: -any
  http-client: -any
  signature: -any
  cookie: -any
  wai-websockets: -any
  streaming-commons: -any
  bytestring: -any
  micro-gateway: -any
  wai-cors: -any
  wai: -any
  case-insensitive: -any
  unix-time: -any
  stm: -any
  base: '>=4.7 && <5'
  data-default-class: -any
  time: -any
  unordered-containers: -any
  text: -any
  websockets: -any
  containers: -any
  binary: -any
  hslogger: -any
  network-uri: -any
  optparse-applicative: -any
  scotty: -any
  http-types: -any
  aeson: -any
  yaml: -any
all-versions:
- 1.1.0.0
author: Li Meng Jun
latest: 1.1.0.0
description-type: markdown
description: |
  # micro-gateway

  A Micro service gateway.

  Support http, and websockets reverse proxy.
license-name: BSD-3-Clause
Add grafana config for production deployment in AWS
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: monitoring-grafana
  namespace: kube-system
spec:
  replicas: 1
  template:
    metadata:
      labels:
        task: monitoring
        k8s-app: grafana
    spec:
      containers:
      - name: grafana
        image: gcr.io/google_containers/heapster-grafana-amd64:v4.0.2
        ports:
        - containerPort: 3000
          protocol: TCP
        volumeMounts:
        - mountPath: /var
          name: grafana-storage
        env:
        - name: INFLUXDB_HOST
          value: monitoring-influxdb
        - name: GRAFANA_PORT
          value: "3000"
        # The following env variables are required to make Grafana accessible via
        # the kubernetes api-server proxy. On production clusters, we recommend
        # removing these env variables, setup auth for grafana, and expose the grafana
        # service using a LoadBalancer or a public IP.
        - name: GF_AUTH_BASIC_ENABLED
          value: "false"
        - name: GF_AUTH_ANONYMOUS_ENABLED
          value: "true"
        - name: GF_AUTH_ANONYMOUS_ORG_ROLE
          value: Admin
        - name: GF_SERVER_ROOT_URL
          # If you're only using the API Server proxy, set this value instead:
          # value: /api/v1/proxy/namespaces/kube-system/services/monitoring-grafana/
          value: /
      volumes:
      - name: grafana-storage
        persistentVolumeClaim:
          claimName: grafana-storage-pvc
---
apiVersion: storage.k8s.io/v1beta1
kind: StorageClass
metadata:
  name: grafana-storage-class
  labels:
    task: monitoring
    k8s-app: grafana
  namespace: kube-system
parameters:
  type: gp2
provisioner: kubernetes.io/aws-ebs
---
kind: PersistentVolume
apiVersion: v1
metadata:
  name: grafana-storage
  labels:
    task: monitoring
    k8s-app: grafana
  namespace: kube-system
spec:
  storageClassName: grafana-storage-class
  capacity:
    storage: 100Gi
  accessModes:
    - ReadWriteOnce
  hostPath:
    path: "/mnt/data"
---
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: grafana-storage-pvc
  namespace: kube-system
spec:
  storageClassName: grafana-storage-class
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 100Gi
Add galaxy dependency for dein role
---
galaxy_info:
  author: aglorei
  description: Set up dotfiles playlist as an ansible-pull cron
  license: MIT
  min_ansible_version: 2.10.3
  platforms:
    - name: Archlinux
      versions:
        - any
    - name: MacOSX
      versions:
        - 10.15
    - name: Ubuntu
      versions:
        - focal

dependencies:
  - role: packages
    vars:
      apt_packages:
        - neovim
      homebrew_brews:
        - neovim
      pacman_packages:
        - neovim
Add CodeSee architecture diagram workflow to repository
on:
  push:
    branches:
      - master
  pull_request_target:
    types: [opened, synchronize, reopened]

name: CodeSee Map

jobs:
  test_map_action:
    runs-on: ubuntu-latest
    continue-on-error: true
    name: Run CodeSee Map Analysis
    steps:
      - name: checkout
        id: checkout
        uses: actions/checkout@v2
        with:
          repository: ${{ github.event.pull_request.head.repo.full_name }}
          ref: ${{ github.event.pull_request.head.ref }}
          fetch-depth: 0

      # codesee-detect-languages has an output with id languages.
      - name: Detect Languages
        id: detect-languages
        uses: Codesee-io/codesee-detect-languages-action@latest

      - name: Configure JDK 16
        uses: actions/setup-java@v2
        if: ${{ fromJSON(steps.detect-languages.outputs.languages).java }}
        with:
          java-version: '16'
          distribution: 'zulu'

      # CodeSee Maps Go support uses a static binary so there's no setup step required.

      - name: Configure Node.js 14
        uses: actions/setup-node@v2
        if: ${{ fromJSON(steps.detect-languages.outputs.languages).javascript }}
        with:
          node-version: '14'

      - name: Configure Python 3.x
        uses: actions/setup-python@v2
        if: ${{ fromJSON(steps.detect-languages.outputs.languages).python }}
        with:
          python-version: '3.x'
          architecture: 'x64'

      - name: Configure Ruby '3.x'
        uses: ruby/setup-ruby@v1
        if: ${{ fromJSON(steps.detect-languages.outputs.languages).ruby }}
        with:
          ruby-version: '3.0'

      # CodeSee Maps Rust support uses a static binary so there's no setup step required.

      - name: Generate Map
        id: generate-map
        uses: Codesee-io/codesee-map-action@latest
        with:
          step: map
          github_ref: ${{ github.ref }}
          languages: ${{ steps.detect-languages.outputs.languages }}

      - name: Upload Map
        id: upload-map
        uses: Codesee-io/codesee-map-action@latest
        with:
          step: mapUpload
          api_token: ${{ secrets.CODESEE_ARCH_DIAG_API_TOKEN }}
          github_ref: ${{ github.ref }}

      - name: Insights
        id: insights
        uses: Codesee-io/codesee-map-action@latest
        with:
          step: insights
          api_token: ${{ secrets.CODESEE_ARCH_DIAG_API_TOKEN }}
          github_ref: ${{ github.ref }}
Add manual workflow to create Releases.
name: Release

on:
  workflow_dispatch:
    inputs:
      candidateName:
        description: 'Name of the candidate'
        type: string
        required: true
      branchCommit:
        description: 'Commit to check out. If it is the most recent commit then leave blank.'
        type: string
        required: false
        default: ''
      cherrypickCommits:
        description: 'Comma separated commits to cherry pick'
        type: string
        required: false

permissions:
  contents: write

jobs:
  release:
    name: Create Release
    runs-on: ubuntu-latest
    steps:
      - name: Get releaser identity
        run: |
          git config --global user.name '${{github.actor}}'
          git config --global user.email '${{github.actor}}@users.noreply.github.com'
      - name: Declare release branch name and tag name
        id: variables
        run: |
          echo "::set-output name=releaseBranchName::release_${CANDIDATE_NAME,,}"
          echo "::set-output name=tagName::${CANDIDATE_NAME^^}"
        env:
          CANDIDATE_NAME: ${{ inputs.candidateName }}
      - name: Checkout code
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: Create release branch
        run: git checkout -b $RELEASE_BRANCH_NAME $BRANCH_COMMIT
        env:
          RELEASE_BRANCH_NAME: ${{ steps.variables.outputs.releaseBranchName }}
          BRANCH_COMMIT: ${{ inputs.branchCommit }}
      - name: Cherry pick commits
        run: |
          commits=$(echo $CHERRYPICK_COMMITS | tr "," "\n")
          for commit in $commits
          do
            echo "Cherry picking $commit."
            git cherry-pick $commit
          done
        env:
          CHERRYPICK_COMMITS: ${{ inputs.cherrypickCommits }}
      - name: Add tag to most recent commit
        run: |
          DATE=$(date -d"next-monday - 1week" +'%Y-%m-%d')
          T_COMMIT=$(git log -n 1 $RELEASE_BRANCH_NAME --pretty=format:'%H')
          git tag -a $TAG_NAME -m "Release week of $DATE" $T_COMMIT
        env:
          RELEASE_BRANCH_NAME: ${{ steps.variables.outputs.releaseBranchName }}
          TAG_NAME: ${{ steps.variables.outputs.tagName }}
      - name: Push changes
        run: |
          git push -u origin --all
          git push -u origin --tags
      - name: Release
        run: |
          gh release create $TAG_NAME --title "Dataflow Templates $TAG_NAME" --notes ""
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          TAG_NAME: ${{ steps.variables.outputs.tagName }}
Add GitHub action to run tests
name: Run tests

on:
  pull_request:
    branches:
      - "main"

jobs:
  tests:
    runs-on: ubuntu-latest
    timeout-minutes: 15
    strategy:
      matrix:
        python-version: [2.7, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 3.10]
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install "pytest==4.6.9"  # pytest 4.6.X works best for older Python versions
          pip install -U codecov 'coverage==4.5.4' pytest-cov
      - run: py.test tests
        env:
          CI: 1
          PYTHONDONTWRITEBYTECODE: 1
Use Appraisal w/ Rails 2.3 to execute tests on Travis CI.
language: ruby
rvm:
  - 1.8.7
  - 1.9.2
  - 1.9.3
script: bundle exec rake test
language: ruby
rvm:
  - 1.8.7
  - 1.9.2
  - 1.9.3
script: bundle exec appraisal:rails2.3 test
Revert "Use the pre-release builds of chefdk"
# Use Travis's cointainer based infrastructure
sudo: false

addons:
  apt:
    sources:
      - chef-current-precise
    packages:
      - chefdk

# Don't `bundle install`
install: echo "skip bundle install"

branches:
  only:
    - master

# Ensure we make ChefDK's Ruby the default
before_script:
  - eval "$(/opt/chefdk/bin/chef shell-init bash)"
  # We have to install chef-sugar for ChefSpec
  - /opt/chefdk/embedded/bin/chef gem install chef-sugar

script:
  - /opt/chefdk/embedded/bin/chef --version
  - /opt/chefdk/embedded/bin/rubocop --version
  - /opt/chefdk/embedded/bin/rubocop
  - /opt/chefdk/embedded/bin/foodcritic --version
  - /opt/chefdk/embedded/bin/foodcritic . --exclude spec
  - /opt/chefdk/embedded/bin/rspec spec
# Use Travis's cointainer based infrastructure
sudo: false

addons:
  apt:
    sources:
      - chef-stable-precise
    packages:
      - chefdk

# Don't `bundle install`
install: echo "skip bundle install"

branches:
  only:
    - master

# Ensure we make ChefDK's Ruby the default
before_script:
  - eval "$(/opt/chefdk/bin/chef shell-init bash)"
  # We have to install chef-sugar for ChefSpec
  - /opt/chefdk/embedded/bin/chef gem install chef-sugar

script:
  - /opt/chefdk/embedded/bin/chef --version
  - /opt/chefdk/embedded/bin/rubocop --version
  - /opt/chefdk/embedded/bin/rubocop
  - /opt/chefdk/embedded/bin/foodcritic --version
  - /opt/chefdk/embedded/bin/foodcritic . --exclude spec
  - /opt/chefdk/embedded/bin/rspec spec
Update from Hackage at 2019-09-09T13:50:01Z
homepage: https://github.com/symbiont-sam-halliday/secp256k1-haskell
changelog-type: ''
hash: dcf5caac6a59a1db89312cf9e735ff6a0fe0bccdb69e3a471f3eb1bc9317849c
test-bench-deps:
  test-framework-hunit: ^>=0.3.0.2
  cereal: -any
  bytestring: -any
  test-framework: ^>=0.8.2.0
  base: -any
  entropy: -any
  test-framework-quickcheck2: ^>=0.3.0.5
  secp256k1-legacy: -any
  HUnit: -any
  mtl: -any
  base16-bytestring: -any
  QuickCheck: -any
  string-conversions: -any
  cryptohash: -any
maintainer: sam.halliday@symbiont.io
synopsis: fork of secp256k1
changelog: ''
basic-deps:
  cereal: ^>=0.5.8.1
  bytestring: -any
  base: ! '>=4.8 && <5'
  entropy: ^>=0.4.1.4
  mtl: -any
  base16-bytestring: ^>=0.1.1.6
  QuickCheck: ^>=2.12.6.1
  string-conversions: ^>=0.4.0.1
all-versions:
- 0.5.4
author: Jean-Pierre Rupp
latest: 0.5.4
description-type: markdown
description: |
  # Haskell bindings for secp256k1

  This project contains Haskell bindings for the
  [secp256k1](https://github.com/bitcoin/secp256k1) library from the Bitcoin
  Core project.

  ## Installing

  Although it is more common that you’ll want to just use this library as part
  of a project, here are the stand-alone installation instructions:

  ```sh
  git clone --recursive https://github.com/haskoin/secp256k1-haskell.git
  cd secp256k1
  stack install
  ```

  This library contains a submodule that points to the latest supported version
  of [secp256k1](https://github.com/bitcoin/secp256k1). It is not necessary to
  install the secp256k1 library in your system beforehand. This package will
  automatically compile and link the C code from upstream. It will not attempt
  to link against the secp256k1 library in your system if you already have it.
license-name: LicenseRef-PublicDomain
Test bosh high CPU usage Prometheus alert
---
rule_files:
  # See alerts_validation_spec.rb for details of how stdin gets set:
  - /dev/stdin

evaluation_interval: 1m

tests:
  - interval: 1h
    input_series:
      - series: "bosh_job_cpu_sys{bosh_job_name='test',bosh_job_index='0'}"
        values: 60 80 95
      - series: "bosh_job_cpu_user{bosh_job_name='test',bosh_job_index='0'}"
        values: 0 0 0 0 0
      - series: "bosh_job_cpu_wait{bosh_job_name='test',bosh_job_index='0'}"
        values: 0 0 0 0 0
    alert_rule_test:
      - eval_time: 30m
        alertname: BoshHighCPUUtilisation_Warning
      - eval_time: 61m
        alertname: BoshHighCPUUtilisation_Warning
        exp_alerts:
          - exp_labels:
              severity: warning
              bosh_job_name: 'test'
              bosh_job_index: '0'
            exp_annotations:
              summary: "High cpu utilisation on test/0"
              description: "test/0 CPU utilisation was over 80% in the last hour on average"
      - eval_time: 121m
        alertname: BoshHighCPUUtilisation_Critical
        exp_alerts:
          - exp_labels:
              severity: critical
              bosh_job_name: 'test'
              bosh_job_index: '0'
            exp_annotations:
              summary: "High cpu utilisation on test/0"
              description: "test/0 CPU utilisation was over 95% in the last hour on average"
Replace Coveralls config with Codecov
coverage:
  status:
    project: yes
    patch: off

comment:
  layout: "diff"
  behavior: once
  require_changes: true
  require_base: no
  require_head: yes
  branches: null
Add configuration for Travis CI
sudo: false
language: ruby
rvm:
  - 1.8.7
  - 1.9.2
  - 1.9.3
  - 2.0.0
  - 2.1
  - 2.2
  - 2.3
  - 2.4
  - jruby-9.1.13.0
  - jruby-head
  - ruby-head
  - rubinius-3
branches:
  only:
    - master
Test only on modern Node.
language: node_js
node_js:
  - "0.10"
  - "0.8"
language: node_js
node_js:
  - "0.11"
  - "0.10"
Add an example minimal composition
version: '3'

services:
  master:
    image: itzg/elasticsearch
    environment:
      UNICAST_HOSTS: master
      MIN_MASTERS: 1
    ports:
      - "9200:9200"
      - "9300:9300"
    deploy:
      replicas: 1
      update_config:
        parallelism: 1

  kibana:
    image: kibana
    ports:
      - "5601:5601"
    environment:
      ELASTICSEARCH_URL: http://master:9200
Add a basic quickstarts (link to docs) under "Native" application type
title: Device Authorization Flow
author:
  name: Tami Goodall
  email: tami.goodall@auth0.com
  community: false
hybrid: false
image: /media/platforms/device.svg
topics:
  - quickstart
contentType: tutorial
useCase: quickstart
snippets:
alias:
  - device
  - device-auth
  - device-flow
  - device-authorization-flow
seo_alias: device
default_article: ../../../flows/guides/device-auth/call-api-device-auth
articles:
  - ../../../flows/guides/device-auth/call-api-device-auth
show_steps: false
Use GitHub actions for tests
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.
# This workflow will download a prebuilt Ruby version, install dependencies and run tests with Rake
# For more information see: https://github.com/marketplace/actions/setup-ruby-jruby-and-truffleruby

name: tests

on:
  push:
    branches: [ "master" ]
  pull_request:
    branches: [ "master" ]

permissions:
  contents: read

jobs:
  test:
    runs-on: ubuntu-latest
    continue-on-error: ${{ matrix.ruby == 'head' }}
    strategy:
      matrix:
        ruby-version: ['2.6', '2.7', '3.0', '3.1', 'ruby-head']
    steps:
      - uses: actions/checkout@v3
      - name: Set up Ruby
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: ${{ matrix.ruby-version }}
          bundler-cache: true # runs 'bundle install' and caches installed gems automatically
      - name: Run tests
        run: bundle exec rake
Add rule for invoking build on github push
---
name: "cicd-docker.github_incoming"
pack: "cicd-docker"
enabled: true
description: "Webhook listening for pushes to our CI/CD Pipeline from GitHub"
trigger:
  type: "core.st2.webhook"
  parameters:
    url: "cicd-docker/github"
criteria:
  trigger.headers['X-Github-Event']:
    pattern: "push"
    type: "eq"
action:
  ref: "cicd-docker.docker_image_build_and_push"
  parameters:
    project: "{{trigger.body.repository.name}}"
    branch: "{{trigger.body.ref}}"
    user: "{{trigger.body.pusher.name}}"
    commit: "{{trigger.body.after}}"
    detail_url: "{{trigger.body.compare}}"
Add releasenote for removal of overcloud remote execute command
---
upgrade:
  - |
    ``openstack overcloud remote execute`` command has been removed.
    It depended on os-collect-config service on the overcloud nodes,
    that had been disabled by default since rocky. Please use ansible
    playbooks to make necessary configuration changes in place of this
    command.
Add basic Code Climate config to kick off Brakeman security checks. (And other automated checks.)
engines:
  brakeman:
    enabled: true
  # Fails
  eslint:
    enabled: false
  rubocop:
    enabled: true
ratings:
  paths:
    - "**.rb"
    - "**.coffee"
    - "**.js"
    - "**.jsx"
    - "**.css"
    - Gemfile.lock
exclude_paths:
  - .codeclimate.yml
  - .eslintrc
  - bin/**/*
  - client/**/*
  - config/**/*
  - db/**/*
  - node_modules/**/*
  - script/**/*
  - test/**/*
  - vendor/**/*
Update from Hackage at 2019-11-12T04:16:26Z
homepage: https://github.com/cmk/profunctor-extras
changelog-type: ''
hash: 8995c5dc427c68682fb968a241445033211dcb6dd55d44979f7935d645c85129
test-bench-deps: {}
maintainer: Chris McKinlay
synopsis: Profunctor arrows
changelog: ''
basic-deps:
  base: ! '>=4.9 && <5.0'
  comonad: ! '>=4 && <6'
  profunctors: ! '>=5.3 && <6'
all-versions:
- 0.0.0.1
author: Chris McKinlay
latest: 0.0.0.1
description-type: haddock
description: Free prearrows and arrows for profunctors.
license-name: BSD-3-Clause
Update from Hackage at 2016-01-14T03:38:22+0000
homepage: ''
changelog-type: ''
hash: ac3a970ee0cc3f85ccd783368f8cb855837f74fdee2816197905724a89dd1eb0
test-bench-deps: {}
maintainer: ncaq@ncaq.net
synopsis: compile es2015
changelog: ''
basic-deps:
  shakespeare: ! '>=2.0 && <2.1'
  base: ! '>=4.8 && <4.9'
  classy-prelude: ! '>=0.12 && <0.13'
  template-haskell: ! '>=2.10 && <2.11'
all-versions:
- '0.1.0.0'
author: ncaq
latest: '0.1.0.0'
description-type: haddock
description: ''
license-name: PublicDomain
Add workflow to auto-merge PRs
name: automerge
on:
  pull_request:
    types:
      - edited
      - labeled
      - opened
      - ready_for_review
      - reopened
      - synchronize
      - unlabeled
      - unlocked
  pull_request_review:
    types:
      - submitted
  check_suite:
    types:
      - completed
  status: {}
jobs:
  automerge:
    runs-on: ubuntu-latest
    steps:
      - name: automerge
        uses: "pascalgn/automerge-action@f81beb99aef41bb55ad072857d43073fba833a98"
        env:
          GITHUB_TOKEN: "${{ secrets.ACCESS_TOKEN }}"
          MERGE_LABELS: "automerge,!blocked"
Add a manual/API-triggered workflow to update the Hypothesis client
name: Update client
on:
  workflow_dispatch:
jobs:
  ci:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Cache the node_modules dir
        uses: actions/cache@v2
        with:
          path: node_modules
          key: ${{ runner.os }}-node_modules-${{ hashFiles('yarn.lock') }}
      - name: Install
        run: yarn install --frozen-lockfile
      - name: Update client
        run: tools/update-client
Add book deploy in CI.
name: Deploy Docs

on:
  push:
    branches:
      - master

jobs:
  doc:
    name: Documentation
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install mdbook
        run: |
          mkdir mdbook
          curl -Lf https://github.com/rust-lang/mdBook/releases/download/v0.4.1/mdbook-v0.4.1-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=./mdbook
          echo ::add-path::`pwd`/mdbook
      - name: Build book
        run: cd doc && mdbook build
      - name: Deploy to GitHub
        run: |
          cd doc/book
          git init
          git config user.name "Deploy from CI"
          git config user.email ""
          git add . .nojekyll
          git commit -m "Deploy $GITHUB_SHA to gh-pages"
          git remote add origin https://x-access-token:${{ secrets.GITHUB_TOKEN }}@github.com/$GITHUB_REPOSITORY
          git push --force --set-upstream origin master:gh-pages
Update from Hackage at 2018-05-02T18:57:00Z
homepage: https://github.com/alanz/haskell-lsp
changelog-type: markdown
hash: 465eeb2ae643bd572e9e15969e32def50dc832a4fdccc77ec7495e8d13caca43
test-bench-deps: {}
maintainer: alan.zimm@gmail.com
synopsis: Haskell library for the Microsoft Language Server Protocol, data types
changelog: ! "# Revision history for haskell-lsp\n\n## 0.2.1.0 -- 2018-05-02\n\n* Broken out from the haskell-lsp package, to simplify development\n by not having to run massive TH processes when working on the\n framework.\n\n"
basic-deps:
  bytestring: -any
  base: ! '>=4.9 && <4.12'
  unordered-containers: -any
  text: -any
  filepath: -any
  data-default: -any
  lens: ! '>=4.15.2'
  network-uri: -any
  hashable: -any
  aeson: ! '>=1.0.0.0'
all-versions:
- '0.2.1.0'
author: Alan Zimmerman
latest: '0.2.1.0'
description-type: markdown
description: ! '[![Hackage](https://img.shields.io/hackage/v/haskell-lsp-types.svg)](https://hackage.haskell.org/package/haskell-lsp-types)


  # haskell-lsp-types

  Haskell library for the data types for Microsoft Language Server Protocol


  ## Useful links

  - https://github.com/Microsoft/language-server-protocol/blob/master/protocol.md


  ## Other resource

  See #haskell-ide-engine on IRC freenode

  '
license-name: MIT
Add github action for multi test checking.
name: Chinese Pinyin

on:
  pull_request:
    branches:
      - 'master'

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        ruby: ['2.1', '2.5', '2.6', '2.7', '3.0']
    steps:
      - uses: actions/checkout@v1
      - uses: ruby/setup-ruby@v1
        with:
          ruby-version: ${{ matrix.ruby }}
      - name: Build and test with Rake
        run: |
          rake test
Test conda installation on win
name: Python Package using Conda

on:
  workflow_dispatch:

jobs:
  build:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [windows-latest]
        python-version: [3.8]
    name: OS ${{ matrix.os }} Python ${{ matrix.python-version }}
    defaults:
      run:
        shell: bash -l {0}
    steps:
      - uses: actions/checkout@v3
      - name: Setup conda
        uses: conda-incubator/setup-miniconda@v2
        with:
          activate-environment: alm
          python-version: ${{ matrix.python-version }}
          auto-update-conda: true
          channels: conda-forge
      - run: |
          conda info
          conda list
          conda config --show-sources
          conda config --show
      - name: Install conda libraries
        run: conda install --yes numpy scipy h5py m2w64-gcc spglib boost eigen cmake
      - run: echo ${CC} ${CXX}
      - name: Build ALM library
        working-directory: ./python
        run: python setup.py build
      - name: Place ALM library
        working-directory: ./python
        run: pip install -e .
      - name: Run test Si
        working-directory: ./test
        run: python Si_fitting.py
      - name: Run test SiC
        working-directory: ./test
        run: python SiC_fitting.py
Add build script for Travis-CI.
sudo: required
dist: trusty
language: c++
compiler:
  - clang
  - gcc
install:
  - sudo add-apt-repository -y ppa:beineri/opt-qt58-trusty
  - sudo apt-get update -qq
  - sudo apt-get install -qq cmake qt58base
before_script:
  - export PATH=$PATH:/opt/qt58/bin
script:
  - cmake .
  - make
Add support for Debian 11 bullseye
---
php_webserver: "apache2"
php_default_packages:
  - "libapache2-mod-php7.4"
php_default_conf_scan_dir: "/etc/php/7.4/apache2/conf.d"

# Only when php_alt_repo is set to true
php_apt_repositories:
  - "deb https://packages.sury.org/php/ bullseye main"
php_apt_key: "https://packages.sury.org/php/apt.gpg"
php_apt_packages:
  - apt-transport-https
  - ca-certificates
Add flask-rest-jsonapi 0.12.2 to conda-forge
{% set name = "flask-rest-jsonapi" %}
{% set version = "0.12.4" %}
{% set sha256 = "984534d61fc6f367b1c3e977ff7de4dc8fbf89017d41eac1a7b42fb648780717" %}

package:
  name: {{ name|lower }}
  version: {{ version }}

source:
  fn: {{ name }}-{{ version }}.tar.gz
  url: https://github.com/miLibris/{{ name }}/archive/{{ version }}.tar.gz
  sha256: {{ sha256 }}

build:
  number: 0
  script: python setup.py install --single-version-externally-managed --record record.txt

requirements:
  build:
    - python
    - setuptools
    - pytest-runner
  run:
    - python
    - six
    - flask >=0.11
    - marshmallow-jsonapi
    - sqlalchemy

test:
  # Some package might need a `test/commands` key to check CLI.
  # List all the packages/modules that `run_test.py` imports.
  imports:
    - flask_rest_jsonapi
    - flask_rest_jsonapi.data_layers
    - flask_rest_jsonapi.data_layers.filtering

about:
  home: http://github.com/miLibris/flask-rest-jsonapi
  license: MIT
  license_family: MIT
  license_file: LICENSE
  summary: 'Flask-REST-JSONAPI is a flask extension for building REST APIs'
  description: |
    Flask-REST-JSONAPI is a flask extension for building REST APIs. It combines
    the power of Flask-Restless and the flexibility of Flask-RESTful around a
    strong specification JSONAPI 1.0. This framework is designed to quickly
    build REST APIs and fit the complexity of real life projects with legacy
    data and multiple data storages
  doc_url: http://{{ name }}.readthedocs.io/
  dev_url: http://github.com/miLibris/flask-rest-jsonapi

extra:
  recipe-maintainers:
    - sodre
Add Cirrus-CI config for FreeBSD builds
env:
  CIRRUS_CLONE_DEPTH: 1
  ARCH: amd64

task:
  freebsd_instance:
    matrix:
      image: freebsd-12-0-release-amd64
      image: freebsd-11-2-release-amd64
  install_script:
    - sed -i.bak -e 's,pkg+http://pkg.FreeBSD.org/\${ABI}/quarterly,pkg+http://pkg.FreeBSD.org/\${ABI}/latest,' /etc/pkg/FreeBSD.conf
    - pkg upgrade -y
    - pkg install -y gmake
  script:
    - CFLAGS="-Werror" gmake -j all
Add ASDF schema for ICRS coordinate objects
%YAML 1.1
---
$schema: "http://stsci.edu/schemas/yaml-schema/draft-01"
id: "http://astropy.org/schemas/astropy/coords/icrs_coord-1.0.0"
tag: "tag:astropy.org:astropy/coords/icrs_coord-1.0.0"
title: |
  Represents an ICRS coordinate object from astropy
description:
  This object represents the right ascension (RA) and declination of an
  ICRS coordinate or frame. The ICRS class contains additional fields that
  may be useful to add here in the future.
type: object
properties:
  ra:
    type: object
    description: |
      A longitude representing the right ascension of the ICRS coordinate
    properties:
      value:
        type: number
      unit:
        $ref: "tag:stsci.edu:asdf/unit/unit-1.0.0"
        default: deg
      wrap_angle:
        $ref: "tag:stsci.edu:asdf/unit/quantity-1.1.0"
        default: "360 deg"
  dec:
    type: object
    description: |
      A latitude representing the declination of the ICRS coordinate
    properties:
      value:
        type: number
      unit:
        $ref: "tag:stsci.edu:asdf/unit/unit-1.0.0"
        default: deg
required: [ra, dec]
...
Add Clang static analysis build
name: Clang Static Analysis

on: [push, pull_request]

jobs:
  run:
    name: "Clang Static Analysis"
    runs-on: ubuntu-latest
    steps:
      - name: Setup PHP
        uses: shivammathur/setup-php@v2
        with:
          php-version: 8.0
          tools: composer, phpize
      - name: Checkout
        # We use v1 due to https://github.com/actions/checkout/issues/334
        uses: actions/checkout@v1
        with:
          submodules: true
      - name: Install clang-tools and libmaxminddb
        run: sudo apt-get install clang-tools libmaxminddb-dev
      - name: Build extension
        run: |
          export CFLAGS="-L$HOME/libmaxminddb/lib"
          export CPPFLAGS="-I$HOME/libmaxminddb/include"
          cd ext
          phpize
          scan-build --status-bugs ./configure --with-maxminddb --enable-maxminddb-debug
          make clean
          scan-build --status-bugs make
Update from Hackage at 2016-12-20T18:05:28Z
homepage: https://github.com/centromere/nfc#readme
changelog-type: markdown
hash: 90cd1b63cf6df4fcf5b302aa3fe772e75032ac2ce71d523d518b75daae86d6b8
test-bench-deps: {}
maintainer: John Galt <jgalt@centromere.net>
synopsis: libnfc bindings
changelog: ! '# 0.0.0


  * Initial release

  '
basic-deps:
  bytestring: -any
  base: <5
  nfc: -any
  base16-bytestring: -any
all-versions:
- '0.0.1'
author: ''
latest: '0.0.1'
description-type: markdown
description: ! '# nfc


  This library provides bindings to [libnfc](https://github.com/nfc-tools/libnfc).

  Only ISO14443A is supported at the moment, but adding additional bindings should

  be straightforward.

  '
license-name: PublicDomain
Set up CI with Azure Pipelines
# Node.js with React
# Build a Node.js project that uses React.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/javascript

trigger:
- master

pool:
  vmImage: 'Ubuntu-16.04'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '8.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm run build
  displayName: 'npm install and build'

- script: npm run lint
  displayName: 'running lint'

- script: npm run test
  displayName: 'running tests'
Set up CI with Azure Pipelines
# ASP.NET Core
# Build and test ASP.NET Core web applications targeting .NET Core.
# Add steps that run tests, create a NuGet package, deploy, and more:
# https://docs.microsoft.com/vsts/pipelines/languages/dotnet-core

pool:
  vmImage: 'Ubuntu 16.04'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'
Set up CI with Azure Pipelines
# Go
# Build your Go project.
# Add steps that test, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/go

trigger:
- master

pool:
  vmImage: 'Ubuntu-16.04'

variables:
  GOBIN:  '$(GOPATH)/bin' # Go binaries path
  GOROOT: '/usr/local/go1.11' # Go installation path
  GOPATH: '$(system.defaultWorkingDirectory)/gopath' # Go workspace path
  modulePath: '$(GOPATH)/src/github.com/$(build.repository.name)' # Path to the module's code

steps:
- script: |
    mkdir -p '$(GOBIN)'
    mkdir -p '$(GOPATH)/pkg'
    mkdir -p '$(modulePath)'
    shopt -s extglob
    shopt -s dotglob
    mv !(gopath) '$(modulePath)'
    echo '##vso[task.prependpath]$(GOBIN)'
    echo '##vso[task.prependpath]$(GOROOT)/bin'
  displayName: 'Set up the Go workspace'

- script: |
    go version
    go test -v ./...
  workingDirectory: '$(modulePath)'
  displayName: 'Test'

- script: go build -v .
  workingDirectory: '$(modulePath)'
  displayName: 'Build'
Add search2.hmrc.gov.uk to transition tool
---
site: hmrc_search2
whitehall_slug: hm-revenue-customs
redirection_date: 1st June 2014
homepage: https://www.gov.uk/government/organisations/hm-revenue-customs
tna_timestamp: 20140603092408
host: search2.hmrc.gov.uk
homepage_furl: www.gov.uk/hmrc
options: --query-string formid:record
special_redirect_strategy: supplier
Add the origami-labels GitHub Action
on: [issues, pull_request]
jobs:
  sync-labels:
    runs-on: ubuntu-latest
    name: Sync repository labels
    steps:
      - uses: Financial-Times/origami-labels@v1
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
Change order of config attributes
name: "Build Gradle project"

on: [ push, workflow_dispatch ]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        java-version: [ 8, 11, 16, 17 ]
        gradle-version: [ 6.9.3, 7.0.2, 7.2, 7.3.3, 7.5.1 ]
    name: Test JVM/Gradle (${{ matrix.java-version }}, ${{ matrix.gradle-version }})
    steps:
      - name: Check out project
        uses: actions/checkout@v2
      - name: Set up JDK 11
        uses: actions/setup-java@v1
        with:
          java-version: 11
      - name: Setup Gradle
        uses: gradle/gradle-build-action@v2
      - name: Build with Gradle
        run: ./gradlew clean build -PtestJavaRuntimeVersion=${{ matrix.java-version }} -PtestGradleVersion=${{ matrix.gradle-version }}
        env:
          GRADLE_ENTERPRISE_ACCESS_KEY: ${{ secrets.GE_ACCESS_TOKEN }}
name: "Build Gradle project"

on: [ push, workflow_dispatch ]

jobs:
  build:
    name: Test JVM/Gradle (${{ matrix.java-version }}, ${{ matrix.gradle-version }})
    runs-on: ubuntu-latest
    strategy:
      matrix:
        java-version: [ 8, 11, 16, 17 ]
        gradle-version: [ 6.9.3, 7.0.2, 7.2, 7.3.3, 7.5.1 ]
    steps:
      - name: Check out project
        uses: actions/checkout@v2
      - name: Set up JDK 11
        uses: actions/setup-java@v1
        with:
          java-version: 11
      - name: Setup Gradle
        uses: gradle/gradle-build-action@v2
      - name: Build with Gradle
        run: ./gradlew clean build -PtestJavaRuntimeVersion=${{ matrix.java-version }} -PtestGradleVersion=${{ matrix.gradle-version }}
        env:
          GRADLE_ENTERPRISE_ACCESS_KEY: ${{ secrets.GE_ACCESS_TOKEN }}
Use dev for npm install
name: Test Deployment

on:
  push:
    branches:
      - test

jobs:
  build:
    name: Build and Upload Webpack Bundle
    runs-on: ubuntu-latest
    env:
      NODE_ENV: production
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Install node
        uses: actions/setup-node@v1
      - run: npm install
        env:
          NODE_ENV: dev
      - run: npm run build-prod --if-present
Add example config file for test server.
#server and amqp stanzas are mandatory for a SimpleAmqpServer
server:
  name: test_server
  sleep_on_empty: 5
amqp:
  #queues for communication to and from caller. As expected, receive on incoming_queue and send result on outgoing_queue
  incoming_queue: in_to_test
  outgoing_queue: out_of_test
  #Connection information, passed directly to Bunny.new, whose defaults are used for anything missing
  #Note that Bunny will want these as symbols, so the YAML should reflect that
  connection:
    :port: 5672
Set up CI with Azure Pipelines
# Node.js
# Build a general Node.js project with npm.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/javascript

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm run build
  displayName: 'npm install and build'
Revert "go away circle ci"
# Javascript Node CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-javascript/ for more details
#
version: 2
jobs:
  build:
    docker:
      # specify the version you desire here
      - image: circleci/node:7.10

      # Specify service dependencies here if necessary
      # CircleCI maintains a library of pre-built images
      # documented at https://circleci.com/docs/2.0/circleci-images/
      # - image: circleci/mongo:3.4.4

    working_directory: ~/repo

    steps:
      - checkout

      # Download and cache dependencies
      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "package.json" }}
            # fallback to using the latest cache if no exact match is found
            - v1-dependencies-

      # - run: yarn install
      - run: npm install

      - save_cache:
          paths:
            - node_modules
          key: v1-dependencies-{{ checksum "package.json" }}

      # TODO: upgrade to using yarn instead
      - run: npm test
Fix AppVeyor build errors caused by restoring NuGet packages
version: 1.0.{build}
configuration: Release
platform: Any CPU
hosts:
  api.nuget.org: 93.184.221.200
cache:
  - InsireBot\packages -> InsireBot\**\packages.config
before_build:
  - cmd: nuget restore InsireBot\Maple.sln
build:
  project: InsireBot\Maple.sln
  verbosity: minimal
artifacts:
  - path: 'InsireBot/InsireBot/bin/$(configuration)'
    name: 'InsireBot-Binaries'
Add workflow updating Conda Lock
name: update_conda_lock

on:
  pull_request:
  push:
  schedule:
    - cron: '0 3 * * *'
  workflow_dispatch:

jobs:
  update-locks:
    runs-on: ubuntu-18.04
    strategy:
      fail-fast: false
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v2
        with:
          fetch-depth: 0
          submodules: 'recursive'
          ref: 'master' # It will be the PR's base

      - name: Update Lock and Issue a Pull Request
        uses: SymbiFlow/actions/update_conda_lock@7a34d4c1a27dc369cfc618ae0902fcfaf1a36c59
        with:
          branch_name_core: 'update_conda_lock'
          conda_lock_file: 'conda_lock.yml'
          environment_file: 'environment.yml'
          gh_access_token: ${{ secrets.GITHUB_TOKEN }}
Add INL MOOSE 1c benchmark
---
benchmark:
  id: 1c
  version: 1
metadata:
  summary: Benchmark with MOOSE on macPro, T-shaped no-flux domain
  author: Daniel Schwen
  email: daniel.schwen@inl.gov
  timestamp: 6/1/2017
  hardware:
    architecture: x86_64
    cores: 4
  software:
    name: moose
    version: CHiMaD_Hackathon
  implementation:
    repo:
      url: https://github.com/dschwen/CHiMaD_Hackathon
      version: f6ec84570
    end_condition: Time 10000
    details:
      - name: mesh
        values: uniform rectilinear 95*114 (with elements removed) QUAD4
      - name: numerical_method
        values: implicit finite elements
      - name: compiler
        values: clang++ 3.9.0
      - name: parallel_model
        values: MPI
      - name: time_stepper
        values: IterationAdaptive
      - name: time_integration
        values: second order backward euler
data:
  - name: run_time
    values: [ { "time": 231.957, "sim_time": 10000.0 } ]
  - name: memory_usage
    values: [ { "value_m": 277.19, "unit": MB } ]
  - name: free_energy
    url: https://gist.github.com/anonymous/898a197718cf02b643e78841e54dba41
    format:
      type: csv
      parse:
        F: number
        time: number
    transform:
      - type: formula
        field: free_energy
        expr: datum.F
      - type: filter
        test: "datum.time > 0.01"
Add a Travis config for automated testing
language: node_js
node_js:
  - "0.10"
services:
  - mysql
notifications:
  irc:
    channels:
      - "irc.mozilla.org#fxa"
    use_notice: false
    skip_join: false
before_install:
  - npm config set spin false
install:
  - npm install
before_script:
  - mysql -e 'DROP DATABASE IF EXISTS fxa'
  - npm i grunt-cli -g
  - npm run outdated
  - grunt validate-shrinkwrap --force
script:
  - npm test
Add experimental Travis CI config
matrix:
  include:
    - # see http://about.travis-ci.org/docs/user/languages/php/ for more hints
      stage: Tests
      language: php
      # list any PHP version you want to test against
      php:
        # main version should be kept as close as possible to supported OMV's PHP version
        # and dependencies should be updated with --prefer-lowest to prevent being ahead
        - "7.0"
      # install project dependencies
      install:
        - composer install
      script:
        - composer run-script ci-php-phpcs
    - # see https://docs.travis-ci.com/user/languages/javascript-with-nodejs/ for more hints
      stage: Tests
      language: node_js
      node_js:
        - "11.6"
      # install project dependencies
      install:
        - npm ci
      script:
        - npm run ci-js-eslint
        - npm run ci-css-stylelint
Enable debug in gradle build command
language: java
jdk:
  - oraclejdk7
services:
  - mongodb
before_script:
  - sleep 15
  - mongo gfk --eval 'db.addUser("gfk", "password");'
install:
  - gradle clean build --stacktrace
language: java
jdk:
  - oraclejdk7
services:
  - mongodb
before_script:
  - sleep 15
  - mongo gfk --eval 'db.addUser("gfk", "password");'
install:
  - gradle clean build --debug
Update from Hackage at 2017-03-20T07:49:27Z
homepage: https://github.com/karky7/hzenhan#readme
changelog-type: ''
hash: d20b61399d2ae4304adca7d8be2f938e79f130748e5095c3dddbc16d0619c1b1
test-bench-deps:
  base: ! '>=4.7 && <5'
  text: -any
  containers: -any
  QuickCheck: -any
  hzenhan: -any
maintainer: cantimerny.g@gmail.com
synopsis: Zenhan library for Haskell
changelog: ''
basic-deps:
  base: ! '>=4.7 && <5'
  text: -any
  containers: -any
all-versions:
- '0.0.1'
author: karky7
latest: '0.0.1'
description-type: markdown
description: ! "# hzenhan\n\nConverter between Full-width Japanese and Half-width Japanese \n \nThis module ported Python's zenhan library, similar to the \n\"zenhan\" library found in pypi:\n\n[<https://pypi.python.org/pypi/zenhan/>](<https://pypi.python.org/pypi/zenhan/> \"zenhan\")\n\n# Install\n\nInstalling from Hackage with stack is straightforward:\n\n \ > stack install\n\nor gentoo emerge from gentoo-haskell overlay\n\n > layman -a haskell\n > emerge dev-haskell/hzenhan\n\n# Usage\n\nLet's see an example.\n\n \ > {-# LANGUAGE OverloadedStrings #-}\n >\n > import Text.Zenhan\n > import Data.Text (pack, unpack)\n >\n > main :: IO ()\n > main = do\n > \ let h = h2z [Kana, Digit, Ascii] \\\"A\\\" \\\"ABCd\\\\\\\\「」アイウエオ123\\\"\n > \ z = z2h [Kana, Digit, Ascii] \\\"Bエ\\\" h\n > putStrLn $ toString h\n \ > putStrLn $ toString z\n"
license-name: BSD3
Change package loader to not generate snapshot for logical StructureDefinition resources.
---
type: fix
issue: 2535
title: "An issue with package installer involving logical StructureDefinition resources
  was fixed. Package registry will no longer attempt to generate a snapshot for logical
  StructureDefinition resources if one is not already provided in the resource definition."
Add Github actions testing of style/unit
name: delivery

on: [push, pull_request]

jobs:
  delivery:
    runs-on: ubuntu-latest
    steps:
      - name: Check out code
        uses: actions/checkout@master
      - name: Run Chef Delivery
        uses: actionshub/chef-delivery@master
        env:
          CHEF_LICENSE: accept-no-persist
Add Wercker for CI & Code coverage
box: golang

dev:
  steps:
    - internal/watch:
        code: |
          go build ./...
          ./source
        reload: true

# Build definition
build:
  # The steps that will be executed on build
  steps:
    - setup-go-workspace

    # golint step!
    - wercker/golint

    # Get dependencies for the project
    - script:
        name: go get
        code: |
          go get ./...

    # Build the project
    - script:
        name: go build
        code: |
          go build ./...

    # Test the project
    - script:
        name: go test
        code: |
          go test ./...

deploy:
  steps:
    - setup-go-workspace
    - script:
        name: go get
        code: |
          go get ./...
    - tcnksm/goveralls:
        token: $COVERALLS_TOKEN
Update from Hackage at 2019-03-31T06:33:40Z
homepage: https://github.com/cdepillabout/password/password#readme
changelog-type: markdown
hash: 92747d99a079c6144eb61a4ba5ad0619e33f509cd7bbc8112ad3130756f97697
test-bench-deps:
  base: ! '>=4.7 && <5'
  doctest: -any
  quickcheck-instances: -any
  password: -any
  QuickCheck: -any
maintainer: cdep.illabout@gmail.com
synopsis: plain-text password and hashed password datatypes and functions
changelog: |+
  # Changelog for password

  ## 0.1.0.0

  - Initial version.

basic-deps:
  scrypt: -any
  base: ! '>=4.7 && <5'
  text: -any
all-versions:
- 0.1.0.0
author: Dennis Gosnell
latest: 0.1.0.0
description-type: markdown
description: |
  # password

  [![Build Status](https://secure.travis-ci.org/cdepillabout/password.svg)](http://travis-ci.org/cdepillabout/password)
  [![Hackage](https://img.shields.io/hackage/v/password.svg)](https://hackage.haskell.org/package/password)
  [![Stackage LTS](http://stackage.org/package/password/badge/lts)](http://stackage.org/lts/package/password)
  [![Stackage Nightly](http://stackage.org/package/password/badge/nightly)](http://stackage.org/nightly/package/password)
  [![BSD3 license](https://img.shields.io/badge/license-BSD3-blue.svg)](./LICENSE)

  This library provides datatypes and functions for working with passwords and
  password hashes in Haskell.

  Also, see the [password-instances](../password-instances) package for instances
  for common typeclasses.
license-name: BSD-3-Clause
Add initial Travis CI config file
sudo: false
language: go
go:
  - 1.5.3
os:
  - linux
  - osx
install:
  - go get github.com/elastic/beats
  - go get github.com/garyburd/redigo/redis
  - go get github.com/stretchr/testify/assert
script: make testsuite
Add badge source to whitelist
language: ruby
rvm:
  - 2.2
before_script:
  - gem install awesome_bot
script:
  - awesome_bot README.md --allow-dupe --allow-redirect --white-list https://ipfs.io,slideshare
language: ruby
rvm:
  - 2.2
before_script:
  - gem install awesome_bot
script:
  - awesome_bot README.md --allow-dupe --allow-redirect --white-list https://ipfs.io,slideshare,https://img.shields.io
Update from Hackage at 2018-10-25T07:42:35Z
homepage: ''
changelog-type: ''
hash: b8302308879ba97d99e56ea7892a8b2c69c0f4154f91c45662d223457301be50
test-bench-deps: {}
maintainer: dev@capital-match.com
synopsis: Low-level bindings to the DocuSign API
changelog: ''
basic-deps:
  bytestring: -any
  base: <1000
  servant-client: -any
  text: -any
  data-default: -any
  servant: -any
  lens: -any
  aeson: -any
  http-media: -any
all-versions:
- '0.0.1'
author: Jonathan Knowles <mail@jonathanknowles.net>
latest: '0.0.1'
description-type: haddock
description: ! 'DocuSign is an electronic signature technology and digital transaction
  management. These are just low-level bindings to the API.'
license-name: BSD3